Posted to builds@lucene.apache.org by Policeman Jenkins Server <je...@thetaphi.de> on 2020/01/15 09:19:57 UTC

[JENKINS] Lucene-Solr-master-MacOSX (64bit/jdk-13.0.1) - Build # 5533 - Unstable!

Build: https://jenkins.thetaphi.de/job/Lucene-Solr-master-MacOSX/5533/
Java: 64bit/jdk-13.0.1 -XX:+UseCompressedOops -XX:+UseG1GC

2 tests failed.
FAILED:  org.apache.solr.cloud.TestCryptoKeys.test

Error Message:
{   "responseHeader":{     "status":500,     "QTime":30117},   "errorMessages":["4 out of 5 the property overlay to be of version 3 within 30 seconds! Failed cores: [http://127.0.0.1:54467/_/os/collection1_shard1_replica_n1/, http://127.0.0.1:54475/_/os/collection1_shard1_replica_n3/, http://127.0.0.1:54483/_/os/collection1_shard1_replica_n5/, http://127.0.0.1:54494/_/os/collection1_shard1_replica_n7/]\n"],   "WARNING":"This response format is experimental.  It is likely to change in the future.",   "error":{     "metadata":[       "error-class","org.apache.solr.common.SolrException",       "root-error-class","org.apache.solr.common.SolrException"],     "msg":"4 out of 5 the property overlay to be of version 3 within 30 seconds! Failed cores: [http://127.0.0.1:54467/_/os/collection1_shard1_replica_n1/, http://127.0.0.1:54475/_/os/collection1_shard1_replica_n3/, http://127.0.0.1:54483/_/os/collection1_shard1_replica_n5/, http://127.0.0.1:54494/_/os/collection1_shard1_replica_n7/]",     "trace":"org.apache.solr.common.SolrException: 4 out of 5 the property overlay to be of version 3 within 30 seconds! Failed cores: [http://127.0.0.1:54467/_/os/collection1_shard1_replica_n1/, http://127.0.0.1:54475/_/os/collection1_shard1_replica_n3/, http://127.0.0.1:54483/_/os/collection1_shard1_replica_n5/, http://127.0.0.1:54494/_/os/collection1_shard1_replica_n7/]\n\tat org.apache.solr.handler.SolrConfigHandler.waitForAllReplicasState(SolrConfigHandler.java:813)\n\tat org.apache.solr.handler.SolrConfigHandler$Command.handleCommands(SolrConfigHandler.java:522)\n\tat org.apache.solr.handler.SolrConfigHandler$Command.handlePOST(SolrConfigHandler.java:363)\n\tat org.apache.solr.handler.SolrConfigHandler.handleRequestBody(SolrConfigHandler.java:139)\n\tat org.apache.solr.handler.RequestHandlerBase.handleRequest(RequestHandlerBase.java:208)\n\tat org.apache.solr.core.SolrCore.execute(SolrCore.java:2582)\n\tat org.apache.solr.servlet.HttpSolrCall.execute(HttpSolrCall.java:799)\n\tat org.apache.solr.servlet.HttpSolrCall.call(HttpSolrCall.java:578)\n\tat org.apache.solr.servlet.SolrDispatchFilter.doFilter(SolrDispatchFilter.java:419)\n\tat org.apache.solr.servlet.SolrDispatchFilter.doFilter(SolrDispatchFilter.java:351)\n\tat org.eclipse.jetty.servlet.ServletHandler$CachedChain.doFilter(ServletHandler.java:1604)\n\tat org.apache.solr.client.solrj.embedded.JettySolrRunner$DebugFilter.doFilter(JettySolrRunner.java:166)\n\tat org.eclipse.jetty.servlet.ServletHandler$CachedChain.doFilter(ServletHandler.java:1604)\n\tat org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:545)\n\tat org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233)\n\tat org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1607)\n\tat org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233)\n\tat org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1297)\n\tat org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188)\n\tat org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:485)\n\tat org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1577)\n\tat org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186)\n\tat org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1212)\n\tat org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141)\n\tat 
org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)\n\tat org.eclipse.jetty.rewrite.handler.RewriteHandler.handle(RewriteHandler.java:322)\n\tat org.eclipse.jetty.server.handler.gzip.GzipHandler.handle(GzipHandler.java:717)\n\tat org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)\n\tat org.eclipse.jetty.server.Server.handle(Server.java:500)\n\tat org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:383)\n\tat org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:547)\n\tat org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:375)\n\tat org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:270)\n\tat org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311)\n\tat org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:103)\n\tat org.eclipse.jetty.io.ChannelEndPoint$2.run(ChannelEndPoint.java:117)\n\tat org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:806)\n\tat org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:938)\n\tat java.base/java.lang.Thread.run(Thread.java:830)\n",     "code":500}}  expected null, but was:<[4 out of 5 the property overlay to be of version 3 within 30 seconds! Failed cores: [http://127.0.0.1:54467/_/os/collection1_shard1_replica_n1/, http://127.0.0.1:54475/_/os/collection1_shard1_replica_n3/, http://127.0.0.1:54483/_/os/collection1_shard1_replica_n5/, http://127.0.0.1:54494/_/os/collection1_shard1_replica_n7/] ]>

Stack Trace:
java.lang.AssertionError: {
  "responseHeader":{
    "status":500,
    "QTime":30117},
  "errorMessages":["4 out of 5 the property overlay to be of version 3 within 30 seconds! Failed cores: [http://127.0.0.1:54467/_/os/collection1_shard1_replica_n1/, http://127.0.0.1:54475/_/os/collection1_shard1_replica_n3/, http://127.0.0.1:54483/_/os/collection1_shard1_replica_n5/, http://127.0.0.1:54494/_/os/collection1_shard1_replica_n7/]\n"],
  "WARNING":"This response format is experimental.  It is likely to change in the future.",
  "error":{
    "metadata":[
      "error-class","org.apache.solr.common.SolrException",
      "root-error-class","org.apache.solr.common.SolrException"],
    "msg":"4 out of 5 the property overlay to be of version 3 within 30 seconds! Failed cores: [http://127.0.0.1:54467/_/os/collection1_shard1_replica_n1/, http://127.0.0.1:54475/_/os/collection1_shard1_replica_n3/, http://127.0.0.1:54483/_/os/collection1_shard1_replica_n5/, http://127.0.0.1:54494/_/os/collection1_shard1_replica_n7/]",
    "trace":"org.apache.solr.common.SolrException: 4 out of 5 the property overlay to be of version 3 within 30 seconds! Failed cores: [http://127.0.0.1:54467/_/os/collection1_shard1_replica_n1/, http://127.0.0.1:54475/_/os/collection1_shard1_replica_n3/, http://127.0.0.1:54483/_/os/collection1_shard1_replica_n5/, http://127.0.0.1:54494/_/os/collection1_shard1_replica_n7/]\n\tat org.apache.solr.handler.SolrConfigHandler.waitForAllReplicasState(SolrConfigHandler.java:813)\n\tat org.apache.solr.handler.SolrConfigHandler$Command.handleCommands(SolrConfigHandler.java:522)\n\tat org.apache.solr.handler.SolrConfigHandler$Command.handlePOST(SolrConfigHandler.java:363)\n\tat org.apache.solr.handler.SolrConfigHandler.handleRequestBody(SolrConfigHandler.java:139)\n\tat org.apache.solr.handler.RequestHandlerBase.handleRequest(RequestHandlerBase.java:208)\n\tat org.apache.solr.core.SolrCore.execute(SolrCore.java:2582)\n\tat org.apache.solr.servlet.HttpSolrCall.execute(HttpSolrCall.java:799)\n\tat org.apache.solr.servlet.HttpSolrCall.call(HttpSolrCall.java:578)\n\tat org.apache.solr.servlet.SolrDispatchFilter.doFilter(SolrDispatchFilter.java:419)\n\tat org.apache.solr.servlet.SolrDispatchFilter.doFilter(SolrDispatchFilter.java:351)\n\tat org.eclipse.jetty.servlet.ServletHandler$CachedChain.doFilter(ServletHandler.java:1604)\n\tat org.apache.solr.client.solrj.embedded.JettySolrRunner$DebugFilter.doFilter(JettySolrRunner.java:166)\n\tat org.eclipse.jetty.servlet.ServletHandler$CachedChain.doFilter(ServletHandler.java:1604)\n\tat org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:545)\n\tat org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233)\n\tat org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1607)\n\tat org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233)\n\tat org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1297)\n\tat org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188)\n\tat org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:485)\n\tat org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1577)\n\tat org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186)\n\tat org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1212)\n\tat org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141)\n\tat org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)\n\tat org.eclipse.jetty.rewrite.handler.RewriteHandler.handle(RewriteHandler.java:322)\n\tat org.eclipse.jetty.server.handler.gzip.GzipHandler.handle(GzipHandler.java:717)\n\tat org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)\n\tat org.eclipse.jetty.server.Server.handle(Server.java:500)\n\tat org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:383)\n\tat org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:547)\n\tat org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:375)\n\tat org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:270)\n\tat org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311)\n\tat org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:103)\n\tat org.eclipse.jetty.io.ChannelEndPoint$2.run(ChannelEndPoint.java:117)\n\tat org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:806)\n\tat 
org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:938)\n\tat java.base/java.lang.Thread.run(Thread.java:830)\n",
    "code":500}}
 expected null, but was:<[4 out of 5 the property overlay to be of version 3 within 30 seconds! Failed cores: [http://127.0.0.1:54467/_/os/collection1_shard1_replica_n1/, http://127.0.0.1:54475/_/os/collection1_shard1_replica_n3/, http://127.0.0.1:54483/_/os/collection1_shard1_replica_n5/, http://127.0.0.1:54494/_/os/collection1_shard1_replica_n7/]
]>
	at __randomizedtesting.SeedInfo.seed([5F57E984AD45F55C:D703D65E03B998A4]:0)
	at org.junit.Assert.fail(Assert.java:88)
	at org.junit.Assert.failNotNull(Assert.java:755)
	at org.junit.Assert.assertNull(Assert.java:737)
	at org.apache.solr.core.TestSolrConfigHandler.runConfigCommand(TestSolrConfigHandler.java:179)
	at org.apache.solr.cloud.TestCryptoKeys.test(TestCryptoKeys.java:159)
	at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
	at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.base/java.lang.reflect.Method.invoke(Method.java:567)
	at com.carrotsearch.randomizedtesting.RandomizedRunner.invoke(RandomizedRunner.java:1754)
	at com.carrotsearch.randomizedtesting.RandomizedRunner$8.evaluate(RandomizedRunner.java:942)
	at com.carrotsearch.randomizedtesting.RandomizedRunner$9.evaluate(RandomizedRunner.java:978)
	at com.carrotsearch.randomizedtesting.RandomizedRunner$10.evaluate(RandomizedRunner.java:992)
	at org.apache.solr.BaseDistributedSearchTestCase$ShardsRepeatRule$ShardsFixedStatement.callStatement(BaseDistributedSearchTestCase.java:1082)
	at org.apache.solr.BaseDistributedSearchTestCase$ShardsRepeatRule$ShardsStatement.evaluate(BaseDistributedSearchTestCase.java:1054)
	at com.carrotsearch.randomizedtesting.rules.SystemPropertiesRestoreRule$1.evaluate(SystemPropertiesRestoreRule.java:57)
	at org.apache.lucene.util.TestRuleSetupTeardownChained$1.evaluate(TestRuleSetupTeardownChained.java:49)
	at org.apache.lucene.util.AbstractBeforeAfterRule$1.evaluate(AbstractBeforeAfterRule.java:45)
	at org.apache.lucene.util.TestRuleThreadAndTestName$1.evaluate(TestRuleThreadAndTestName.java:48)
	at org.apache.lucene.util.TestRuleIgnoreAfterMaxFailures$1.evaluate(TestRuleIgnoreAfterMaxFailures.java:64)
	at org.apache.lucene.util.TestRuleMarkFailure$1.evaluate(TestRuleMarkFailure.java:47)
	at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
	at com.carrotsearch.randomizedtesting.ThreadLeakControl$StatementRunner.run(ThreadLeakControl.java:370)
	at com.carrotsearch.randomizedtesting.ThreadLeakControl.forkTimeoutingTask(ThreadLeakControl.java:819)
	at com.carrotsearch.randomizedtesting.ThreadLeakControl$3.evaluate(ThreadLeakControl.java:470)
	at com.carrotsearch.randomizedtesting.RandomizedRunner.runSingleTest(RandomizedRunner.java:951)
	at com.carrotsearch.randomizedtesting.RandomizedRunner$5.evaluate(RandomizedRunner.java:836)
	at com.carrotsearch.randomizedtesting.RandomizedRunner$6.evaluate(RandomizedRunner.java:887)
	at com.carrotsearch.randomizedtesting.RandomizedRunner$7.evaluate(RandomizedRunner.java:898)
	at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
	at com.carrotsearch.randomizedtesting.rules.SystemPropertiesRestoreRule$1.evaluate(SystemPropertiesRestoreRule.java:57)
	at org.apache.lucene.util.AbstractBeforeAfterRule$1.evaluate(AbstractBeforeAfterRule.java:45)
	at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
	at org.apache.lucene.util.TestRuleStoreClassName$1.evaluate(TestRuleStoreClassName.java:41)
	at com.carrotsearch.randomizedtesting.rules.NoShadowingOrOverridesOnMethodsRule$1.evaluate(NoShadowingOrOverridesOnMethodsRule.java:40)
	at com.carrotsearch.randomizedtesting.rules.NoShadowingOrOverridesOnMethodsRule$1.evaluate(NoShadowingOrOverridesOnMethodsRule.java:40)
	at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
	at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
	at org.apache.lucene.util.TestRuleAssertionsRequired$1.evaluate(TestRuleAssertionsRequired.java:53)
	at org.apache.lucene.util.TestRuleMarkFailure$1.evaluate(TestRuleMarkFailure.java:47)
	at org.apache.lucene.util.TestRuleIgnoreAfterMaxFailures$1.evaluate(TestRuleIgnoreAfterMaxFailures.java:64)
	at org.apache.lucene.util.TestRuleIgnoreTestSuites$1.evaluate(TestRuleIgnoreTestSuites.java:54)
	at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
	at com.carrotsearch.randomizedtesting.ThreadLeakControl$StatementRunner.run(ThreadLeakControl.java:370)
	at com.carrotsearch.randomizedtesting.ThreadLeakControl.lambda$forkTimeoutingTask$0(ThreadLeakControl.java:826)
	at java.base/java.lang.Thread.run(Thread.java:830)
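
For context: TestSolrConfigHandler.runConfigCommand posts a config command and asserts that the response carries no errorMessages. Here the /config request itself came back with status 500 because SolrConfigHandler.waitForAllReplicasState gave up after 30 seconds with 4 of the 5 replicas still reporting an older config overlay version, so the assertNull at TestSolrConfigHandler.java:179 failed. The sketch below is a minimal, self-contained illustration of that poll-until-the-overlay-version-matches pattern; the class name and fetchOverlayVersion are hypothetical stand-ins, not the actual Solr code. Re-running the suite with the printed seed (tests.seed=5F57E984AD45F55C) is the usual way to retry this locally, though timing-dependent waits like this one often do not reproduce.

    import java.util.List;
    import java.util.Map;
    import java.util.concurrent.TimeUnit;
    import java.util.concurrent.TimeoutException;
    import java.util.function.Function;
    import java.util.stream.Collectors;

    public class WaitForOverlayVersionSketch {

        // Poll every core until all of them report at least expectedVersion, or give up
        // after timeoutSeconds. fetchOverlayVersion is a hypothetical stand-in for the
        // per-core HTTP call the real SolrConfigHandler makes.
        static void waitForOverlayVersion(List<String> coreUrls,
                                          int expectedVersion,
                                          long timeoutSeconds,
                                          Function<String, Integer> fetchOverlayVersion)
                throws InterruptedException, TimeoutException {
            long deadline = System.nanoTime() + TimeUnit.SECONDS.toNanos(timeoutSeconds);
            while (true) {
                List<String> lagging = coreUrls.stream()
                        .filter(url -> fetchOverlayVersion.apply(url) < expectedVersion)
                        .collect(Collectors.toList());
                if (lagging.isEmpty()) {
                    return; // every replica has picked up the new overlay
                }
                if (System.nanoTime() > deadline) {
                    // The condition behind the "N out of M ... within 30 seconds!" message above.
                    throw new TimeoutException(lagging.size() + " out of " + coreUrls.size()
                            + " cores not yet at overlay version " + expectedVersion + ": " + lagging);
                }
                Thread.sleep(100); // back off briefly before polling again
            }
        }

        public static void main(String[] args) throws Exception {
            // Toy run: one fake core lags behind forever, so the wait times out
            // just like the Jenkins failure did.
            List<String> cores = List.of("core_n1", "core_n3", "core_n5");
            Map<String, Integer> versions = Map.of("core_n1", 3, "core_n3", 2, "core_n5", 3);
            try {
                waitForOverlayVersion(cores, 3, 2, versions::get);
            } catch (TimeoutException e) {
                System.out.println("timed out: " + e.getMessage());
            }
        }
    }
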


FAILED:  org.apache.solr.cloud.autoscaling.TriggerCooldownIntegrationTest.testCooldown

Error Message:
The trigger did not fire at all

Stack Trace:
java.lang.AssertionError: The trigger did not fire at all
	at __randomizedtesting.SeedInfo.seed([5F57E984AD45F55C:6EE98460D3EF80AE]:0)
	at org.junit.Assert.fail(Assert.java:88)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.apache.solr.cloud.autoscaling.TriggerCooldownIntegrationTest.testCooldown(TriggerCooldownIntegrationTest.java:160)
	at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
	at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.base/java.lang.reflect.Method.invoke(Method.java:567)
	at com.carrotsearch.randomizedtesting.RandomizedRunner.invoke(RandomizedRunner.java:1754)
	at com.carrotsearch.randomizedtesting.RandomizedRunner$8.evaluate(RandomizedRunner.java:942)
	at com.carrotsearch.randomizedtesting.RandomizedRunner$9.evaluate(RandomizedRunner.java:978)
	at com.carrotsearch.randomizedtesting.RandomizedRunner$10.evaluate(RandomizedRunner.java:992)
	at com.carrotsearch.randomizedtesting.rules.SystemPropertiesRestoreRule$1.evaluate(SystemPropertiesRestoreRule.java:57)
	at org.apache.lucene.util.TestRuleSetupTeardownChained$1.evaluate(TestRuleSetupTeardownChained.java:49)
	at org.apache.lucene.util.AbstractBeforeAfterRule$1.evaluate(AbstractBeforeAfterRule.java:45)
	at org.apache.lucene.util.TestRuleThreadAndTestName$1.evaluate(TestRuleThreadAndTestName.java:48)
	at org.apache.lucene.util.TestRuleIgnoreAfterMaxFailures$1.evaluate(TestRuleIgnoreAfterMaxFailures.java:64)
	at org.apache.lucene.util.TestRuleMarkFailure$1.evaluate(TestRuleMarkFailure.java:47)
	at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
	at com.carrotsearch.randomizedtesting.ThreadLeakControl$StatementRunner.run(ThreadLeakControl.java:370)
	at com.carrotsearch.randomizedtesting.ThreadLeakControl.forkTimeoutingTask(ThreadLeakControl.java:819)
	at com.carrotsearch.randomizedtesting.ThreadLeakControl$3.evaluate(ThreadLeakControl.java:470)
	at com.carrotsearch.randomizedtesting.RandomizedRunner.runSingleTest(RandomizedRunner.java:951)
	at com.carrotsearch.randomizedtesting.RandomizedRunner$5.evaluate(RandomizedRunner.java:836)
	at com.carrotsearch.randomizedtesting.RandomizedRunner$6.evaluate(RandomizedRunner.java:887)
	at com.carrotsearch.randomizedtesting.RandomizedRunner$7.evaluate(RandomizedRunner.java:898)
	at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
	at com.carrotsearch.randomizedtesting.rules.SystemPropertiesRestoreRule$1.evaluate(SystemPropertiesRestoreRule.java:57)
	at org.apache.lucene.util.AbstractBeforeAfterRule$1.evaluate(AbstractBeforeAfterRule.java:45)
	at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
	at org.apache.lucene.util.TestRuleStoreClassName$1.evaluate(TestRuleStoreClassName.java:41)
	at com.carrotsearch.randomizedtesting.rules.NoShadowingOrOverridesOnMethodsRule$1.evaluate(NoShadowingOrOverridesOnMethodsRule.java:40)
	at com.carrotsearch.randomizedtesting.rules.NoShadowingOrOverridesOnMethodsRule$1.evaluate(NoShadowingOrOverridesOnMethodsRule.java:40)
	at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
	at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
	at org.apache.lucene.util.TestRuleAssertionsRequired$1.evaluate(TestRuleAssertionsRequired.java:53)
	at org.apache.lucene.util.TestRuleMarkFailure$1.evaluate(TestRuleMarkFailure.java:47)
	at org.apache.lucene.util.TestRuleIgnoreAfterMaxFailures$1.evaluate(TestRuleIgnoreAfterMaxFailures.java:64)
	at org.apache.lucene.util.TestRuleIgnoreTestSuites$1.evaluate(TestRuleIgnoreTestSuites.java:54)
	at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
	at com.carrotsearch.randomizedtesting.ThreadLeakControl$StatementRunner.run(ThreadLeakControl.java:370)
	at com.carrotsearch.randomizedtesting.ThreadLeakControl.lambda$forkTimeoutingTask$0(ThreadLeakControl.java:826)
	at java.base/java.lang.Thread.run(Thread.java:830)
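
For the second failure, TriggerCooldownIntegrationTest.testCooldown first asserts that the autoscaling trigger fired at least once before it checks cooldown behaviour; the assertTrue at TriggerCooldownIntegrationTest.java:160 failed because no trigger event arrived within the test's wait window. A rough sketch of that wait-for-the-trigger pattern, using a CountDownLatch and illustrative names rather than the real test or autoscaling classes:

    import java.util.concurrent.CountDownLatch;
    import java.util.concurrent.TimeUnit;

    public class TriggerFiredWaitSketch {

        // Stand-in for the test's trigger listener: the real listener is registered with
        // Solr's autoscaling framework and invoked when the trigger fires.
        static final CountDownLatch triggerFired = new CountDownLatch(1);

        static void onTriggerFired() {
            triggerFired.countDown();
        }

        public static void main(String[] args) throws InterruptedException {
            // In the real test a cluster event should cause the trigger to fire; here
            // nothing calls onTriggerFired(), so the wait times out and we hit the same
            // kind of failure as "The trigger did not fire at all".
            boolean fired = triggerFired.await(2, TimeUnit.SECONDS);
            if (!fired) {
                System.out.println("AssertionError: The trigger did not fire at all");
            }
        }
    }
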




Build Log:
[...truncated 13956 lines...]
   [junit4] Suite: org.apache.solr.cloud.TestCryptoKeys
   [junit4]   2> 752302 INFO  (SUITE-TestCryptoKeys-seed#[5F57E984AD45F55C]-worker) [     ] o.a.s.SolrTestCaseJ4 SecureRandom sanity checks: test.solr.allowed.securerandom=null & java.security.egd=file:/dev/./urandom
   [junit4]   2> 752302 INFO  (SUITE-TestCryptoKeys-seed#[5F57E984AD45F55C]-worker) [     ] o.a.s.SolrTestCaseJ4 Created dataDir: /Users/jenkins/workspace/Lucene-Solr-master-MacOSX/solr/build/solr-core/test/J3/temp/solr.cloud.TestCryptoKeys_5F57E984AD45F55C-001/data-dir-54-001
   [junit4]   2> 752302 INFO  (SUITE-TestCryptoKeys-seed#[5F57E984AD45F55C]-worker) [     ] o.a.s.SolrTestCaseJ4 Using PointFields (NUMERIC_POINTS_SYSPROP=true) w/NUMERIC_DOCVALUES_SYSPROP=false
   [junit4]   2> 752303 INFO  (SUITE-TestCryptoKeys-seed#[5F57E984AD45F55C]-worker) [     ] o.a.s.SolrTestCaseJ4 Randomized ssl (false) and clientAuth (false) via: @org.apache.solr.util.RandomizeSSL(reason="", value=0.0/0.0, ssl=0.0/0.0, clientAuth=0.0/0.0) w/ MAC_OS_X supressed clientAuth
   [junit4]   2> 752303 INFO  (SUITE-TestCryptoKeys-seed#[5F57E984AD45F55C]-worker) [     ] o.a.s.BaseDistributedSearchTestCase Setting hostContext system property: /_/os
   [junit4]   2> 752307 INFO  (TEST-TestCryptoKeys.test-seed#[5F57E984AD45F55C]) [     ] o.a.s.c.ZkTestServer STARTING ZK TEST SERVER
   [junit4]   2> 752307 INFO  (ZkTestServer Run Thread) [     ] o.a.s.c.ZkTestServer client port:0.0.0.0/0.0.0.0:0
   [junit4]   2> 752307 INFO  (ZkTestServer Run Thread) [     ] o.a.s.c.ZkTestServer Starting server
   [junit4]   2> 752410 INFO  (TEST-TestCryptoKeys.test-seed#[5F57E984AD45F55C]) [     ] o.a.s.c.ZkTestServer start zk server on port:54430
   [junit4]   2> 752410 INFO  (TEST-TestCryptoKeys.test-seed#[5F57E984AD45F55C]) [     ] o.a.s.c.ZkTestServer waitForServerUp: 127.0.0.1:54430
   [junit4]   2> 752410 INFO  (TEST-TestCryptoKeys.test-seed#[5F57E984AD45F55C]) [     ] o.a.s.c.ZkTestServer parse host and port list: 127.0.0.1:54430
   [junit4]   2> 752410 INFO  (TEST-TestCryptoKeys.test-seed#[5F57E984AD45F55C]) [     ] o.a.s.c.ZkTestServer connecting to 127.0.0.1 54430
   [junit4]   2> 752413 INFO  (TEST-TestCryptoKeys.test-seed#[5F57E984AD45F55C]) [     ] o.a.s.c.c.ConnectionManager Waiting for client to connect to ZooKeeper
   [junit4]   2> 752423 INFO  (zkConnectionManagerCallback-4260-thread-1) [     ] o.a.s.c.c.ConnectionManager zkClient has connected
   [junit4]   2> 752423 INFO  (TEST-TestCryptoKeys.test-seed#[5F57E984AD45F55C]) [     ] o.a.s.c.c.ConnectionManager Client is connected to ZooKeeper
   [junit4]   2> 752446 INFO  (TEST-TestCryptoKeys.test-seed#[5F57E984AD45F55C]) [     ] o.a.s.c.c.ConnectionManager Waiting for client to connect to ZooKeeper
   [junit4]   2> 752455 INFO  (zkConnectionManagerCallback-4262-thread-1) [     ] o.a.s.c.c.ConnectionManager zkClient has connected
   [junit4]   2> 752455 INFO  (TEST-TestCryptoKeys.test-seed#[5F57E984AD45F55C]) [     ] o.a.s.c.c.ConnectionManager Client is connected to ZooKeeper
   [junit4]   2> 752461 INFO  (TEST-TestCryptoKeys.test-seed#[5F57E984AD45F55C]) [     ] o.a.s.c.ZkTestServer put /Users/jenkins/workspace/Lucene-Solr-master-MacOSX/solr/core/src/test-files/solr/collection1/conf/solrconfig-tlog.xml to /configs/conf1/solrconfig.xml
   [junit4]   2> 752465 INFO  (TEST-TestCryptoKeys.test-seed#[5F57E984AD45F55C]) [     ] o.a.s.c.ZkTestServer put /Users/jenkins/workspace/Lucene-Solr-master-MacOSX/solr/core/src/test-files/solr/collection1/conf/schema.xml to /configs/conf1/schema.xml
   [junit4]   2> 752471 INFO  (TEST-TestCryptoKeys.test-seed#[5F57E984AD45F55C]) [     ] o.a.s.c.ZkTestServer put /Users/jenkins/workspace/Lucene-Solr-master-MacOSX/solr/core/src/test-files/solr/collection1/conf/solrconfig.snippet.randomindexconfig.xml to /configs/conf1/solrconfig.snippet.randomindexconfig.xml
   [junit4]   2> 752480 INFO  (TEST-TestCryptoKeys.test-seed#[5F57E984AD45F55C]) [     ] o.a.s.c.ZkTestServer put /Users/jenkins/workspace/Lucene-Solr-master-MacOSX/solr/core/src/test-files/solr/collection1/conf/stopwords.txt to /configs/conf1/stopwords.txt
   [junit4]   2> 752488 INFO  (TEST-TestCryptoKeys.test-seed#[5F57E984AD45F55C]) [     ] o.a.s.c.ZkTestServer put /Users/jenkins/workspace/Lucene-Solr-master-MacOSX/solr/core/src/test-files/solr/collection1/conf/protwords.txt to /configs/conf1/protwords.txt
   [junit4]   2> 752497 INFO  (TEST-TestCryptoKeys.test-seed#[5F57E984AD45F55C]) [     ] o.a.s.c.ZkTestServer put /Users/jenkins/workspace/Lucene-Solr-master-MacOSX/solr/core/src/test-files/solr/collection1/conf/currency.xml to /configs/conf1/currency.xml
   [junit4]   2> 752504 INFO  (TEST-TestCryptoKeys.test-seed#[5F57E984AD45F55C]) [     ] o.a.s.c.ZkTestServer put /Users/jenkins/workspace/Lucene-Solr-master-MacOSX/solr/core/src/test-files/solr/collection1/conf/enumsConfig.xml to /configs/conf1/enumsConfig.xml
   [junit4]   2> 752510 INFO  (TEST-TestCryptoKeys.test-seed#[5F57E984AD45F55C]) [     ] o.a.s.c.ZkTestServer put /Users/jenkins/workspace/Lucene-Solr-master-MacOSX/solr/core/src/test-files/solr/collection1/conf/open-exchange-rates.json to /configs/conf1/open-exchange-rates.json
   [junit4]   2> 752517 INFO  (TEST-TestCryptoKeys.test-seed#[5F57E984AD45F55C]) [     ] o.a.s.c.ZkTestServer put /Users/jenkins/workspace/Lucene-Solr-master-MacOSX/solr/core/src/test-files/solr/collection1/conf/mapping-ISOLatin1Accent.txt to /configs/conf1/mapping-ISOLatin1Accent.txt
   [junit4]   2> 752524 INFO  (TEST-TestCryptoKeys.test-seed#[5F57E984AD45F55C]) [     ] o.a.s.c.ZkTestServer put /Users/jenkins/workspace/Lucene-Solr-master-MacOSX/solr/core/src/test-files/solr/collection1/conf/old_synonyms.txt to /configs/conf1/old_synonyms.txt
   [junit4]   2> 752529 INFO  (TEST-TestCryptoKeys.test-seed#[5F57E984AD45F55C]) [     ] o.a.s.c.ZkTestServer put /Users/jenkins/workspace/Lucene-Solr-master-MacOSX/solr/core/src/test-files/solr/collection1/conf/synonyms.txt to /configs/conf1/synonyms.txt
   [junit4]   2> 752534 INFO  (TEST-TestCryptoKeys.test-seed#[5F57E984AD45F55C]) [     ] o.a.s.c.AbstractFullDistribZkTestBase Will use NRT replicas unless explicitly asked otherwise
   [junit4]   2> 753393 WARN  (TEST-TestCryptoKeys.test-seed#[5F57E984AD45F55C]) [     ] o.e.j.s.h.g.GzipHandler minGzipSize of 0 is inefficient for short content, break even is size 23
   [junit4]   2> 753393 INFO  (TEST-TestCryptoKeys.test-seed#[5F57E984AD45F55C]) [     ] o.a.s.c.s.e.JettySolrRunner Start Jetty (configured port=0, binding port=0)
   [junit4]   2> 753393 INFO  (TEST-TestCryptoKeys.test-seed#[5F57E984AD45F55C]) [     ] o.a.s.c.s.e.JettySolrRunner Trying to start Jetty on port 0 try number 1 ...
   [junit4]   2> 753393 INFO  (TEST-TestCryptoKeys.test-seed#[5F57E984AD45F55C]) [     ] o.e.j.s.Server jetty-9.4.24.v20191120; built: 2019-11-20T21:37:49.771Z; git: 363d5f2df3a8a28de40604320230664b9c793c16; jvm 13.0.1+9
   [junit4]   2> 753396 INFO  (TEST-TestCryptoKeys.test-seed#[5F57E984AD45F55C]) [     ] o.e.j.s.session DefaultSessionIdManager workerName=node0
   [junit4]   2> 753396 INFO  (TEST-TestCryptoKeys.test-seed#[5F57E984AD45F55C]) [     ] o.e.j.s.session No SessionScavenger set, using defaults
   [junit4]   2> 753396 INFO  (TEST-TestCryptoKeys.test-seed#[5F57E984AD45F55C]) [     ] o.e.j.s.session node0 Scavenging every 660000ms
   [junit4]   2> 753396 INFO  (TEST-TestCryptoKeys.test-seed#[5F57E984AD45F55C]) [     ] o.e.j.s.h.ContextHandler Started o.e.j.s.ServletContextHandler@7af3da75{/_/os,null,AVAILABLE}
   [junit4]   2> 753402 INFO  (TEST-TestCryptoKeys.test-seed#[5F57E984AD45F55C]) [     ] o.e.j.s.AbstractConnector Started ServerConnector@afae674{HTTP/1.1,[http/1.1, h2c]}{127.0.0.1:54439}
   [junit4]   2> 753402 INFO  (TEST-TestCryptoKeys.test-seed#[5F57E984AD45F55C]) [     ] o.e.j.s.Server Started @753460ms
   [junit4]   2> 753402 INFO  (TEST-TestCryptoKeys.test-seed#[5F57E984AD45F55C]) [     ] o.a.s.c.s.e.JettySolrRunner Jetty properties: {hostContext=/_/os, solr.data.dir=/Users/jenkins/workspace/Lucene-Solr-master-MacOSX/solr/build/solr-core/test/J3/temp/solr.cloud.TestCryptoKeys_5F57E984AD45F55C-001/tempDir-001/control/data, hostPort=54439, coreRootDirectory=/Users/jenkins/workspace/Lucene-Solr-master-MacOSX/solr/build/solr-core/test/J3/../../../../../../../../../Users/jenkins/workspace/Lucene-Solr-master-MacOSX/solr/build/solr-core/test/J3/temp/solr.cloud.TestCryptoKeys_5F57E984AD45F55C-001/control-001/cores}
   [junit4]   2> 753402 ERROR (TEST-TestCryptoKeys.test-seed#[5F57E984AD45F55C]) [     ] o.a.s.u.StartupLoggingUtils Missing Java Option solr.log.dir. Logging may be missing or incomplete.
   [junit4]   2> 753402 INFO  (TEST-TestCryptoKeys.test-seed#[5F57E984AD45F55C]) [     ] o.a.s.s.SolrDispatchFilter Using logger factory org.apache.logging.slf4j.Log4jLoggerFactory
   [junit4]   2> 753402 INFO  (TEST-TestCryptoKeys.test-seed#[5F57E984AD45F55C]) [     ] o.a.s.s.SolrDispatchFilter  ___      _       Welcome to Apache Solr™ version 9.0.0
   [junit4]   2> 753402 INFO  (TEST-TestCryptoKeys.test-seed#[5F57E984AD45F55C]) [     ] o.a.s.s.SolrDispatchFilter / __| ___| |_ _   Starting in cloud mode on port null
   [junit4]   2> 753402 INFO  (TEST-TestCryptoKeys.test-seed#[5F57E984AD45F55C]) [     ] o.a.s.s.SolrDispatchFilter \__ \/ _ \ | '_|  Install dir: null
   [junit4]   2> 753402 INFO  (TEST-TestCryptoKeys.test-seed#[5F57E984AD45F55C]) [     ] o.a.s.s.SolrDispatchFilter |___/\___/_|_|    Start time: 2020-01-15T08:29:31.944980Z
   [junit4]   2> 753405 INFO  (TEST-TestCryptoKeys.test-seed#[5F57E984AD45F55C]) [     ] o.a.s.c.c.ConnectionManager Waiting for client to connect to ZooKeeper
   [junit4]   2> 753410 INFO  (zkConnectionManagerCallback-4264-thread-1) [     ] o.a.s.c.c.ConnectionManager zkClient has connected
   [junit4]   2> 753410 INFO  (TEST-TestCryptoKeys.test-seed#[5F57E984AD45F55C]) [     ] o.a.s.c.c.ConnectionManager Client is connected to ZooKeeper
   [junit4]   2> 753517 INFO  (TEST-TestCryptoKeys.test-seed#[5F57E984AD45F55C]) [     ] o.a.s.s.SolrDispatchFilter Loading solr.xml from SolrHome (not found in ZooKeeper)
   [junit4]   2> 753517 INFO  (TEST-TestCryptoKeys.test-seed#[5F57E984AD45F55C]) [     ] o.a.s.c.SolrXmlConfig Loading container configuration from /Users/jenkins/workspace/Lucene-Solr-master-MacOSX/solr/build/solr-core/test/J3/temp/solr.cloud.TestCryptoKeys_5F57E984AD45F55C-001/control-001/solr.xml
   [junit4]   2> 753519 INFO  (TEST-TestCryptoKeys.test-seed#[5F57E984AD45F55C]) [     ] o.a.s.c.SolrXmlConfig Configuration parameter autoReplicaFailoverWorkLoopDelay is ignored
   [junit4]   2> 753519 INFO  (TEST-TestCryptoKeys.test-seed#[5F57E984AD45F55C]) [     ] o.a.s.c.SolrXmlConfig Configuration parameter autoReplicaFailoverBadNodeExpiration is ignored
   [junit4]   2> 753520 INFO  (TEST-TestCryptoKeys.test-seed#[5F57E984AD45F55C]) [     ] o.a.s.c.SolrXmlConfig MBean server found: com.sun.jmx.mbeanserver.JmxMBeanServer@2a9a8c3c, but no JMX reporters were configured - adding default JMX reporter.
   [junit4]   2> 753606 INFO  (TEST-TestCryptoKeys.test-seed#[5F57E984AD45F55C]) [     ] o.a.s.h.c.HttpShardHandlerFactory Host whitelist initialized: WhitelistHostChecker [whitelistHosts=null, whitelistHostCheckingEnabled=false]
   [junit4]   2> 753608 WARN  (TEST-TestCryptoKeys.test-seed#[5F57E984AD45F55C]) [     ] o.e.j.u.s.S.config Trusting all certificates configured for Client@5c9d3c48[provider=null,keyStore=null,trustStore=null]
   [junit4]   2> 753608 WARN  (TEST-TestCryptoKeys.test-seed#[5F57E984AD45F55C]) [     ] o.e.j.u.s.S.config No Client EndPointIdentificationAlgorithm configured for Client@5c9d3c48[provider=null,keyStore=null,trustStore=null]
   [junit4]   2> 753612 WARN  (TEST-TestCryptoKeys.test-seed#[5F57E984AD45F55C]) [     ] o.e.j.u.s.S.config Trusting all certificates configured for Client@15ebed58[provider=null,keyStore=null,trustStore=null]
   [junit4]   2> 753612 WARN  (TEST-TestCryptoKeys.test-seed#[5F57E984AD45F55C]) [     ] o.e.j.u.s.S.config No Client EndPointIdentificationAlgorithm configured for Client@15ebed58[provider=null,keyStore=null,trustStore=null]
   [junit4]   2> 753612 INFO  (TEST-TestCryptoKeys.test-seed#[5F57E984AD45F55C]) [     ] o.a.s.c.ZkContainer Zookeeper client=127.0.0.1:54430/solr
   [junit4]   2> 753612 INFO  (TEST-TestCryptoKeys.test-seed#[5F57E984AD45F55C]) [     ] o.a.s.c.c.ConnectionManager Waiting for client to connect to ZooKeeper
   [junit4]   2> 753618 INFO  (zkConnectionManagerCallback-4271-thread-1) [     ] o.a.s.c.c.ConnectionManager zkClient has connected
   [junit4]   2> 753618 INFO  (TEST-TestCryptoKeys.test-seed#[5F57E984AD45F55C]) [     ] o.a.s.c.c.ConnectionManager Client is connected to ZooKeeper
   [junit4]   2> 753731 INFO  (TEST-TestCryptoKeys.test-seed#[5F57E984AD45F55C]) [n:127.0.0.1:54439__%2Fos     ] o.a.s.c.c.ConnectionManager Waiting for client to connect to ZooKeeper
   [junit4]   2> 753734 INFO  (zkConnectionManagerCallback-4273-thread-1) [     ] o.a.s.c.c.ConnectionManager zkClient has connected
   [junit4]   2> 753735 INFO  (TEST-TestCryptoKeys.test-seed#[5F57E984AD45F55C]) [n:127.0.0.1:54439__%2Fos     ] o.a.s.c.c.ConnectionManager Client is connected to ZooKeeper
   [junit4]   2> 753993 INFO  (TEST-TestCryptoKeys.test-seed#[5F57E984AD45F55C]) [n:127.0.0.1:54439__%2Fos     ] o.a.s.c.OverseerElectionContext I am going to be the leader 127.0.0.1:54439__%2Fos
   [junit4]   2> 753997 INFO  (TEST-TestCryptoKeys.test-seed#[5F57E984AD45F55C]) [n:127.0.0.1:54439__%2Fos     ] o.a.s.c.Overseer Overseer (id=72058221813432324-127.0.0.1:54439__%2Fos-n_0000000000) starting
   [junit4]   2> 754031 INFO  (OverseerStateUpdate-72058221813432324-127.0.0.1:54439__%2Fos-n_0000000000) [n:127.0.0.1:54439__%2Fos     ] o.a.s.c.Overseer Starting to work on the main queue : 127.0.0.1:54439__%2Fos
   [junit4]   2> 754033 INFO  (TEST-TestCryptoKeys.test-seed#[5F57E984AD45F55C]) [n:127.0.0.1:54439__%2Fos     ] o.a.s.c.ZkController Register node as live in ZooKeeper:/live_nodes/127.0.0.1:54439__%2Fos
   [junit4]   2> 754042 INFO  (zkCallback-4272-thread-1) [     ] o.a.s.c.c.ZkStateReader Updated live nodes from ZooKeeper... (0) -> (1)
   [junit4]   2> 754052 INFO  (TEST-TestCryptoKeys.test-seed#[5F57E984AD45F55C]) [n:127.0.0.1:54439__%2Fos     ] o.a.s.p.PackageLoader /packages.json updated to version -1
   [junit4]   2> 754053 WARN  (TEST-TestCryptoKeys.test-seed#[5F57E984AD45F55C]) [n:127.0.0.1:54439__%2Fos     ] o.a.s.c.CoreContainer Not all security plugins configured!  authentication=disabled authorization=disabled.  Solr is only as secure as you make it. Consider configuring authentication/authorization before exposing Solr to users internal or external.  See https://s.apache.org/solrsecurity for more info
   [junit4]   2> 754080 INFO  (TEST-TestCryptoKeys.test-seed#[5F57E984AD45F55C]) [n:127.0.0.1:54439__%2Fos     ] o.a.s.h.a.MetricsHistoryHandler No .system collection, keeping metrics history in memory.
   [junit4]   2> 754114 INFO  (TEST-TestCryptoKeys.test-seed#[5F57E984AD45F55C]) [n:127.0.0.1:54439__%2Fos     ] o.a.s.m.r.SolrJmxReporter JMX monitoring for 'solr.node' (registry 'solr.node') enabled at server: com.sun.jmx.mbeanserver.JmxMBeanServer@2a9a8c3c
   [junit4]   2> 754123 INFO  (TEST-TestCryptoKeys.test-seed#[5F57E984AD45F55C]) [n:127.0.0.1:54439__%2Fos     ] o.a.s.m.r.SolrJmxReporter JMX monitoring for 'solr.jvm' (registry 'solr.jvm') enabled at server: com.sun.jmx.mbeanserver.JmxMBeanServer@2a9a8c3c
   [junit4]   2> 754123 INFO  (TEST-TestCryptoKeys.test-seed#[5F57E984AD45F55C]) [n:127.0.0.1:54439__%2Fos     ] o.a.s.m.r.SolrJmxReporter JMX monitoring for 'solr.jetty' (registry 'solr.jetty') enabled at server: com.sun.jmx.mbeanserver.JmxMBeanServer@2a9a8c3c
   [junit4]   2> 754125 INFO  (TEST-TestCryptoKeys.test-seed#[5F57E984AD45F55C]) [n:127.0.0.1:54439__%2Fos     ] o.a.s.c.CorePropertiesLocator Found 0 core definitions underneath /Users/jenkins/workspace/Lucene-Solr-master-MacOSX/solr/build/solr-core/test/J3/../../../../../../../../../Users/jenkins/workspace/Lucene-Solr-master-MacOSX/solr/build/solr-core/test/J3/temp/solr.cloud.TestCryptoKeys_5F57E984AD45F55C-001/control-001/cores
   [junit4]   2> 754147 INFO  (TEST-TestCryptoKeys.test-seed#[5F57E984AD45F55C]) [     ] o.a.s.c.c.ConnectionManager Waiting for client to connect to ZooKeeper
   [junit4]   2> 754152 INFO  (zkConnectionManagerCallback-4282-thread-1) [     ] o.a.s.c.c.ConnectionManager zkClient has connected
   [junit4]   2> 754152 INFO  (TEST-TestCryptoKeys.test-seed#[5F57E984AD45F55C]) [     ] o.a.s.c.c.ConnectionManager Client is connected to ZooKeeper
   [junit4]   2> 754161 INFO  (TEST-TestCryptoKeys.test-seed#[5F57E984AD45F55C]) [     ] o.a.s.c.c.ZkStateReader Updated live nodes from ZooKeeper... (0) -> (1)
   [junit4]   2> 754169 INFO  (TEST-TestCryptoKeys.test-seed#[5F57E984AD45F55C]) [     ] o.a.s.c.s.i.ZkClientClusterStateProvider Cluster at 127.0.0.1:54430/solr ready
   [junit4]   2> 754173 INFO  (qtp1893605684-10738) [n:127.0.0.1:54439__%2Fos     ] o.a.s.h.a.CollectionsHandler Invoked Collection Action :create with params collection.configName=conf1&name=control_collection&nrtReplicas=1&action=CREATE&numShards=1&createNodeSet=127.0.0.1:54439__%252Fos&wt=javabin&version=2 and sendToOCPQueue=true
   [junit4]   2> 754189 INFO  (OverseerThreadFactory-2477-thread-1-processing-n:127.0.0.1:54439__%2Fos) [n:127.0.0.1:54439__%2Fos     ] o.a.s.c.a.c.CreateCollectionCmd Create collection control_collection
   [junit4]   2> 754320 INFO  (qtp1893605684-10740) [n:127.0.0.1:54439__%2Fos    x:control_collection_shard1_replica_n1 ] o.a.s.h.a.CoreAdminOperation core create command qt=/admin/cores&collection.configName=conf1&newCollection=true&name=control_collection_shard1_replica_n1&action=CREATE&numShards=1&collection=control_collection&shard=shard1&wt=javabin&version=2&replicaType=NRT
   [junit4]   2> 754324 INFO  (qtp1893605684-10740) [n:127.0.0.1:54439__%2Fos    x:control_collection_shard1_replica_n1 ] o.a.s.c.TransientSolrCoreCacheDefault Allocating transient cache for 4 transient cores
   [junit4]   2> 755359 INFO  (qtp1893605684-10740) [n:127.0.0.1:54439__%2Fos c:control_collection s:shard1  x:control_collection_shard1_replica_n1 ] o.a.s.c.SolrConfig Using Lucene MatchVersion: 9.0.0
   [junit4]   2> 755374 INFO  (qtp1893605684-10740) [n:127.0.0.1:54439__%2Fos c:control_collection s:shard1  x:control_collection_shard1_replica_n1 ] o.a.s.s.IndexSchema [control_collection_shard1_replica_n1] Schema name=test
   [junit4]   2> 755455 INFO  (qtp1893605684-10740) [n:127.0.0.1:54439__%2Fos c:control_collection s:shard1  x:control_collection_shard1_replica_n1 ] o.a.s.s.IndexSchema Loaded schema test/1.0 with uniqueid field id
   [junit4]   2> 755642 INFO  (qtp1893605684-10740) [n:127.0.0.1:54439__%2Fos c:control_collection s:shard1  x:control_collection_shard1_replica_n1 ] o.a.s.c.CoreContainer Creating SolrCore 'control_collection_shard1_replica_n1' using configuration from collection control_collection, trusted=true
   [junit4]   2> 755644 INFO  (qtp1893605684-10740) [n:127.0.0.1:54439__%2Fos c:control_collection s:shard1  x:control_collection_shard1_replica_n1 ] o.a.s.m.r.SolrJmxReporter JMX monitoring for 'solr.core.control_collection.shard1.replica_n1' (registry 'solr.core.control_collection.shard1.replica_n1') enabled at server: com.sun.jmx.mbeanserver.JmxMBeanServer@2a9a8c3c
   [junit4]   2> 755644 INFO  (qtp1893605684-10740) [n:127.0.0.1:54439__%2Fos c:control_collection s:shard1  x:control_collection_shard1_replica_n1 ] o.a.s.c.SolrCore [[control_collection_shard1_replica_n1] ] Opening new SolrCore at [/Users/jenkins/workspace/Lucene-Solr-master-MacOSX/solr/build/solr-core/test/J3/temp/solr.cloud.TestCryptoKeys_5F57E984AD45F55C-001/control-001/cores/control_collection_shard1_replica_n1], dataDir=[/Users/jenkins/workspace/Lucene-Solr-master-MacOSX/solr/build/solr-core/test/J3/../../../../../../../../../Users/jenkins/workspace/Lucene-Solr-master-MacOSX/solr/build/solr-core/test/J3/temp/solr.cloud.TestCryptoKeys_5F57E984AD45F55C-001/control-001/cores/control_collection_shard1_replica_n1/data/]
   [junit4]   2> 755648 INFO  (qtp1893605684-10740) [n:127.0.0.1:54439__%2Fos c:control_collection s:shard1  x:control_collection_shard1_replica_n1 ] o.a.s.u.RandomMergePolicy RandomMergePolicy wrapping class org.apache.lucene.index.TieredMergePolicy: [TieredMergePolicy: maxMergeAtOnce=10, maxMergeAtOnceExplicit=45, maxMergedSegmentMB=1.8408203125, floorSegmentMB=0.6357421875, forceMergeDeletesPctAllowed=6.719986869380841, segmentsPerTier=21.0, maxCFSSegmentSizeMB=8.796093022207999E12, noCFSRatio=1.0, deletesPctAllowed=32.89640153268013
   [junit4]   2> 755652 WARN  (qtp1893605684-10740) [n:127.0.0.1:54439__%2Fos c:control_collection s:shard1  x:control_collection_shard1_replica_n1 ] o.a.s.c.RequestHandlers INVALID paramSet a in requestHandler {type = requestHandler,name = /dump,class = DumpRequestHandler,attributes = {initParams=a, name=/dump, class=DumpRequestHandler},args = {defaults={a=A, b=B}}}
   [junit4]   2> 755830 INFO  (qtp1893605684-10740) [n:127.0.0.1:54439__%2Fos c:control_collection s:shard1  x:control_collection_shard1_replica_n1 ] o.a.s.u.UpdateHandler Using UpdateLog implementation: org.apache.solr.update.UpdateLog
   [junit4]   2> 755830 INFO  (qtp1893605684-10740) [n:127.0.0.1:54439__%2Fos c:control_collection s:shard1  x:control_collection_shard1_replica_n1 ] o.a.s.u.UpdateLog Initializing UpdateLog: dataDir= defaultSyncLevel=FLUSH numRecordsToKeep=100 maxNumLogsToKeep=10 numVersionBuckets=65536
   [junit4]   2> 755832 INFO  (qtp1893605684-10740) [n:127.0.0.1:54439__%2Fos c:control_collection s:shard1  x:control_collection_shard1_replica_n1 ] o.a.s.u.CommitTracker Hard AutoCommit: disabled
   [junit4]   2> 755832 INFO  (qtp1893605684-10740) [n:127.0.0.1:54439__%2Fos c:control_collection s:shard1  x:control_collection_shard1_replica_n1 ] o.a.s.u.CommitTracker Soft AutoCommit: disabled
   [junit4]   2> 755833 INFO  (qtp1893605684-10740) [n:127.0.0.1:54439__%2Fos c:control_collection s:shard1  x:control_collection_shard1_replica_n1 ] o.a.s.u.RandomMergePolicy RandomMergePolicy wrapping class org.apache.lucene.index.TieredMergePolicy: [TieredMergePolicy: maxMergeAtOnce=46, maxMergeAtOnceExplicit=42, maxMergedSegmentMB=22.8896484375, floorSegmentMB=0.6337890625, forceMergeDeletesPctAllowed=11.343436892213774, segmentsPerTier=50.0, maxCFSSegmentSizeMB=8.796093022207999E12, noCFSRatio=0.7740097558278903, deletesPctAllowed=30.944794221630104
   [junit4]   2> 755834 INFO  (qtp1893605684-10740) [n:127.0.0.1:54439__%2Fos c:control_collection s:shard1  x:control_collection_shard1_replica_n1 ] o.a.s.s.SolrIndexSearcher Opening [Searcher@702288e[control_collection_shard1_replica_n1] main]
   [junit4]   2> 755838 INFO  (qtp1893605684-10740) [n:127.0.0.1:54439__%2Fos c:control_collection s:shard1  x:control_collection_shard1_replica_n1 ] o.a.s.r.ManagedResourceStorage Configured ZooKeeperStorageIO with znodeBase: /configs/conf1
   [junit4]   2> 755839 INFO  (qtp1893605684-10740) [n:127.0.0.1:54439__%2Fos c:control_collection s:shard1  x:control_collection_shard1_replica_n1 ] o.a.s.r.ManagedResourceStorage Loaded null at path _rest_managed.json using ZooKeeperStorageIO:path=/configs/conf1
   [junit4]   2> 755839 INFO  (qtp1893605684-10740) [n:127.0.0.1:54439__%2Fos c:control_collection s:shard1  x:control_collection_shard1_replica_n1 ] o.a.s.h.ReplicationHandler Commits will be reserved for 10000ms.
   [junit4]   2> 755839 INFO  (qtp1893605684-10740) [n:127.0.0.1:54439__%2Fos c:control_collection s:shard1  x:control_collection_shard1_replica_n1 ] o.a.s.u.UpdateLog Could not find max version in index or recent updates, using new clock 1655782217488531456
   [junit4]   2> 755845 INFO  (searcherExecutor-2482-thread-1-processing-n:127.0.0.1:54439__%2Fos x:control_collection_shard1_replica_n1 c:control_collection s:shard1) [n:127.0.0.1:54439__%2Fos c:control_collection s:shard1  x:control_collection_shard1_replica_n1 ] o.a.s.c.SolrCore [control_collection_shard1_replica_n1] Registered new searcher Searcher@702288e[control_collection_shard1_replica_n1] main{ExitableDirectoryReader(UninvertingDirectoryReader())}
   [junit4]   2> 755853 INFO  (qtp1893605684-10740) [n:127.0.0.1:54439__%2Fos c:control_collection s:shard1  x:control_collection_shard1_replica_n1 ] o.a.s.c.ZkShardTerms Successful update of terms at /collections/control_collection/terms/shard1 to Terms{values={core_node2=0}, version=0}
   [junit4]   2> 755853 INFO  (qtp1893605684-10740) [n:127.0.0.1:54439__%2Fos c:control_collection s:shard1  x:control_collection_shard1_replica_n1 ] o.a.s.c.ShardLeaderElectionContextBase make sure parent is created /collections/control_collection/leaders/shard1
   [junit4]   2> 755870 INFO  (qtp1893605684-10740) [n:127.0.0.1:54439__%2Fos c:control_collection s:shard1  x:control_collection_shard1_replica_n1 ] o.a.s.c.ShardLeaderElectionContext Enough replicas found to continue.
   [junit4]   2> 755870 INFO  (qtp1893605684-10740) [n:127.0.0.1:54439__%2Fos c:control_collection s:shard1  x:control_collection_shard1_replica_n1 ] o.a.s.c.ShardLeaderElectionContext I may be the new leader - try and sync
   [junit4]   2> 755870 INFO  (qtp1893605684-10740) [n:127.0.0.1:54439__%2Fos c:control_collection s:shard1  x:control_collection_shard1_replica_n1 ] o.a.s.c.SyncStrategy Sync replicas to http://127.0.0.1:54439/_/os/control_collection_shard1_replica_n1/
   [junit4]   2> 755871 INFO  (qtp1893605684-10740) [n:127.0.0.1:54439__%2Fos c:control_collection s:shard1  x:control_collection_shard1_replica_n1 ] o.a.s.c.SyncStrategy Sync Success - now sync replicas to me
   [junit4]   2> 755873 INFO  (qtp1893605684-10740) [n:127.0.0.1:54439__%2Fos c:control_collection s:shard1  x:control_collection_shard1_replica_n1 ] o.a.s.c.SyncStrategy http://127.0.0.1:54439/_/os/control_collection_shard1_replica_n1/ has no replicas
   [junit4]   2> 755873 INFO  (qtp1893605684-10740) [n:127.0.0.1:54439__%2Fos c:control_collection s:shard1  x:control_collection_shard1_replica_n1 ] o.a.s.c.ShardLeaderElectionContextBase Creating leader registration node /collections/control_collection/leaders/shard1/leader after winning as /collections/control_collection/leader_elect/shard1/election/72058221813432324-core_node2-n_0000000000
   [junit4]   2> 755888 INFO  (qtp1893605684-10740) [n:127.0.0.1:54439__%2Fos c:control_collection s:shard1  x:control_collection_shard1_replica_n1 ] o.a.s.c.ShardLeaderElectionContext I am the new leader: http://127.0.0.1:54439/_/os/control_collection_shard1_replica_n1/ shard1
   [junit4]   2> 756027 INFO  (zkCallback-4272-thread-1) [     ] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/control_collection/state.json] for collection [control_collection] has occurred - updating... (live nodes size: [1])
   [junit4]   2> 756029 INFO  (zkCallback-4272-thread-2) [     ] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/control_collection/state.json] for collection [control_collection] has occurred - updating... (live nodes size: [1])
   [junit4]   2> 756048 INFO  (qtp1893605684-10740) [n:127.0.0.1:54439__%2Fos c:control_collection s:shard1  x:control_collection_shard1_replica_n1 ] o.a.s.c.ZkController I am the leader, no recovery necessary
   [junit4]   2> 756062 INFO  (qtp1893605684-10740) [n:127.0.0.1:54439__%2Fos c:control_collection s:shard1  x:control_collection_shard1_replica_n1 ] o.a.s.s.HttpSolrCall [admin] webapp=null path=/admin/cores params={qt=/admin/cores&collection.configName=conf1&newCollection=true&name=control_collection_shard1_replica_n1&action=CREATE&numShards=1&collection=control_collection&shard=shard1&wt=javabin&version=2&replicaType=NRT} status=0 QTime=1742
   [junit4]   2> 756075 INFO  (qtp1893605684-10738) [n:127.0.0.1:54439__%2Fos     ] o.a.s.h.a.CollectionsHandler Wait for new collection to be active for at most 45 seconds. Check all shard replicas
   [junit4]   2> 756162 INFO  (zkCallback-4272-thread-2) [     ] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/control_collection/state.json] for collection [control_collection] has occurred - updating... (live nodes size: [1])
   [junit4]   2> 756163 INFO  (zkCallback-4272-thread-1) [     ] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/control_collection/state.json] for collection [control_collection] has occurred - updating... (live nodes size: [1])
   [junit4]   2> 756164 INFO  (zkCallback-4272-thread-3) [     ] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/control_collection/state.json] for collection [control_collection] has occurred - updating... (live nodes size: [1])
   [junit4]   2> 756168 INFO  (qtp1893605684-10738) [n:127.0.0.1:54439__%2Fos     ] o.a.s.s.HttpSolrCall [admin] webapp=null path=/admin/collections params={collection.configName=conf1&name=control_collection&nrtReplicas=1&action=CREATE&numShards=1&createNodeSet=127.0.0.1:54439__%252Fos&wt=javabin&version=2} status=0 QTime=1995
   [junit4]   2> 756169 INFO  (TEST-TestCryptoKeys.test-seed#[5F57E984AD45F55C]) [     ] o.a.s.c.AbstractFullDistribZkTestBase Waiting to see 1 active replicas in collection: control_collection
   [junit4]   2> 756287 INFO  (OverseerCollectionConfigSetProcessor-72058221813432324-127.0.0.1:54439__%2Fos-n_0000000000) [n:127.0.0.1:54439__%2Fos     ] o.a.s.c.OverseerTaskQueue Response ZK path: /overseer/collection-queue-work/qnr-0000000000 doesn't exist.  Requestor may have disconnected from ZooKeeper
   [junit4]   2> 756288 INFO  (TEST-TestCryptoKeys.test-seed#[5F57E984AD45F55C]) [     ] o.a.s.c.c.ConnectionManager Waiting for client to connect to ZooKeeper
   [junit4]   2> 756293 INFO  (zkConnectionManagerCallback-4288-thread-1) [     ] o.a.s.c.c.ConnectionManager zkClient has connected
   [junit4]   2> 756293 INFO  (TEST-TestCryptoKeys.test-seed#[5F57E984AD45F55C]) [     ] o.a.s.c.c.ConnectionManager Client is connected to ZooKeeper
   [junit4]   2> 756301 INFO  (TEST-TestCryptoKeys.test-seed#[5F57E984AD45F55C]) [     ] o.a.s.c.c.ZkStateReader Updated live nodes from ZooKeeper... (0) -> (1)
   [junit4]   2> 756310 INFO  (TEST-TestCryptoKeys.test-seed#[5F57E984AD45F55C]) [     ] o.a.s.c.s.i.ZkClientClusterStateProvider Cluster at 127.0.0.1:54430/solr ready
   [junit4]   2> 756310 INFO  (TEST-TestCryptoKeys.test-seed#[5F57E984AD45F55C]) [     ] o.a.s.c.ChaosMonkey monkey: init - expire sessions:false cause connection loss:false
   [junit4]   2> 756313 INFO  (qtp1893605684-10738) [n:127.0.0.1:54439__%2Fos     ] o.a.s.h.a.CollectionsHandler Invoked Collection Action :create with params collection.configName=conf1&name=collection1&nrtReplicas=1&action=CREATE&numShards=1&createNodeSet=&stateFormat=1&wt=javabin&version=2 and sendToOCPQueue=true
   [junit4]   2> 756338 INFO  (OverseerThreadFactory-2477-thread-2-processing-n:127.0.0.1:54439__%2Fos) [n:127.0.0.1:54439__%2Fos     ] o.a.s.c.a.c.CreateCollectionCmd Create collection collection1
   [junit4]   2> 756564 WARN  (OverseerThreadFactory-2477-thread-2-processing-n:127.0.0.1:54439__%2Fos) [n:127.0.0.1:54439__%2Fos     ] o.a.s.c.a.c.CreateCollectionCmd It is unusual to create a collection (collection1) without cores.
   [junit4]   2> 756583 INFO  (qtp1893605684-10738) [n:127.0.0.1:54439__%2Fos     ] o.a.s.h.a.CollectionsHandler Wait for new collection to be active for at most 45 seconds. Check all shard replicas
   [junit4]   2> 756588 INFO  (qtp1893605684-10738) [n:127.0.0.1:54439__%2Fos     ] o.a.s.s.HttpSolrCall [admin] webapp=null path=/admin/collections params={collection.configName=conf1&name=collection1&nrtReplicas=1&action=CREATE&numShards=1&createNodeSet=&stateFormat=1&wt=javabin&version=2} status=0 QTime=274
   [junit4]   2> 756596 INFO  (TEST-TestCryptoKeys.test-seed#[5F57E984AD45F55C]) [     ] o.a.s.c.SolrCloudTestCase active slice count: 1 expected:1
   [junit4]   2> 756596 INFO  (TEST-TestCryptoKeys.test-seed#[5F57E984AD45F55C]) [     ] o.a.s.c.SolrCloudTestCase active replica count: 0 expected replica count: 0
   [junit4]   2> 756596 INFO  (TEST-TestCryptoKeys.test-seed#[5F57E984AD45F55C]) [     ] o.a.s.c.SolrCloudTestCase active slice count: 1 expected:1
   [junit4]   2> 756596 INFO  (TEST-TestCryptoKeys.test-seed#[5F57E984AD45F55C]) [     ] o.a.s.c.SolrCloudTestCase active replica count: 0 expected replica count: 0
   [junit4]   2> 756596 INFO  (TEST-TestCryptoKeys.test-seed#[5F57E984AD45F55C]) [     ] o.a.s.c.SolrCloudTestCase active slice count: 1 expected:1
   [junit4]   2> 756596 INFO  (TEST-TestCryptoKeys.test-seed#[5F57E984AD45F55C]) [     ] o.a.s.c.SolrCloudTestCase active replica count: 0 expected replica count: 0
   [junit4]   2> 756596 INFO  (TEST-TestCryptoKeys.test-seed#[5F57E984AD45F55C]) [     ] o.a.s.c.AbstractFullDistribZkTestBase Creating jetty instances pullReplicaCount=0 numOtherReplicas=4
   [junit4]   2> 756933 INFO  (TEST-TestCryptoKeys.test-seed#[5F57E984AD45F55C]) [     ] o.a.s.c.AbstractFullDistribZkTestBase create jetty 1 in directory /Users/jenkins/workspace/Lucene-Solr-master-MacOSX/solr/build/solr-core/test/J3/temp/solr.cloud.TestCryptoKeys_5F57E984AD45F55C-001/shard-1-001 of type NRT
   [junit4]   2> 756934 WARN  (closeThreadPool-4289-thread-1) [     ] o.e.j.s.h.g.GzipHandler minGzipSize of 0 is inefficient for short content, break even is size 23
   [junit4]   2> 756934 INFO  (closeThreadPool-4289-thread-1) [     ] o.a.s.c.s.e.JettySolrRunner Start Jetty (configured port=0, binding port=0)
   [junit4]   2> 756934 INFO  (closeThreadPool-4289-thread-1) [     ] o.a.s.c.s.e.JettySolrRunner Trying to start Jetty on port 0 try number 1 ...
   [junit4]   2> 756934 INFO  (closeThreadPool-4289-thread-1) [     ] o.e.j.s.Server jetty-9.4.24.v20191120; built: 2019-11-20T21:37:49.771Z; git: 363d5f2df3a8a28de40604320230664b9c793c16; jvm 13.0.1+9
   [junit4]   2> 756935 INFO  (closeThreadPool-4289-thread-1) [     ] o.e.j.s.session DefaultSessionIdManager workerName=node0
   [junit4]   2> 756935 INFO  (closeThreadPool-4289-thread-1) [     ] o.e.j.s.session No SessionScavenger set, using defaults
   [junit4]   2> 756935 INFO  (closeThreadPool-4289-thread-1) [     ] o.e.j.s.session node0 Scavenging every 600000ms
   [junit4]   2> 756936 INFO  (closeThreadPool-4289-thread-1) [     ] o.e.j.s.h.ContextHandler Started o.e.j.s.ServletContextHandler@1db7adfa{/_/os,null,AVAILABLE}
   [junit4]   2> 756936 INFO  (closeThreadPool-4289-thread-1) [     ] o.e.j.s.AbstractConnector Started ServerConnector@69349a20{HTTP/1.1,[http/1.1, h2c]}{127.0.0.1:54467}
   [junit4]   2> 756936 INFO  (closeThreadPool-4289-thread-1) [     ] o.e.j.s.Server Started @756995ms
   [junit4]   2> 756936 INFO  (closeThreadPool-4289-thread-1) [     ] o.a.s.c.s.e.JettySolrRunner Jetty properties: {hostContext=/_/os, solrconfig=solrconfig.xml, solr.data.dir=/Users/jenkins/workspace/Lucene-Solr-master-MacOSX/solr/build/solr-core/test/J3/temp/solr.cloud.TestCryptoKeys_5F57E984AD45F55C-001/tempDir-001/jetty1, hostPort=54467, coreRootDirectory=/Users/jenkins/workspace/Lucene-Solr-master-MacOSX/solr/build/solr-core/test/J3/../../../../../../../../../Users/jenkins/workspace/Lucene-Solr-master-MacOSX/solr/build/solr-core/test/J3/temp/solr.cloud.TestCryptoKeys_5F57E984AD45F55C-001/shard-1-001/cores, replicaType=NRT}
   [junit4]   2> 756937 ERROR (closeThreadPool-4289-thread-1) [     ] o.a.s.u.StartupLoggingUtils Missing Java Option solr.log.dir. Logging may be missing or incomplete.
   [junit4]   2> 756937 INFO  (closeThreadPool-4289-thread-1) [     ] o.a.s.s.SolrDispatchFilter Using logger factory org.apache.logging.slf4j.Log4jLoggerFactory
   [junit4]   2> 756937 INFO  (closeThreadPool-4289-thread-1) [     ] o.a.s.s.SolrDispatchFilter  ___      _       Welcome to Apache Solr™ version 9.0.0
   [junit4]   2> 756937 INFO  (closeThreadPool-4289-thread-1) [     ] o.a.s.s.SolrDispatchFilter / __| ___| |_ _   Starting in cloud mode on port null
   [junit4]   2> 756937 INFO  (closeThreadPool-4289-thread-1) [     ] o.a.s.s.SolrDispatchFilter \__ \/ _ \ | '_|  Install dir: null
   [junit4]   2> 756937 INFO  (closeThreadPool-4289-thread-1) [     ] o.a.s.s.SolrDispatchFilter |___/\___/_|_|    Start time: 2020-01-15T08:29:35.479476Z
   [junit4]   2> 756939 INFO  (closeThreadPool-4289-thread-1) [     ] o.a.s.c.c.ConnectionManager Waiting for client to connect to ZooKeeper
   [junit4]   2> 756942 INFO  (zkConnectionManagerCallback-4291-thread-1) [     ] o.a.s.c.c.ConnectionManager zkClient has connected
   [junit4]   2> 756942 INFO  (closeThreadPool-4289-thread-1) [     ] o.a.s.c.c.ConnectionManager Client is connected to ZooKeeper
   [junit4]   2> 757045 INFO  (closeThreadPool-4289-thread-1) [     ] o.a.s.s.SolrDispatchFilter Loading solr.xml from SolrHome (not found in ZooKeeper)
   [junit4]   2> 757045 INFO  (closeThreadPool-4289-thread-1) [     ] o.a.s.c.SolrXmlConfig Loading container configuration from /Users/jenkins/workspace/Lucene-Solr-master-MacOSX/solr/build/solr-core/test/J3/temp/solr.cloud.TestCryptoKeys_5F57E984AD45F55C-001/shard-1-001/solr.xml
   [junit4]   2> 757049 INFO  (closeThreadPool-4289-thread-1) [     ] o.a.s.c.SolrXmlConfig Configuration parameter autoReplicaFailoverWorkLoopDelay is ignored
   [junit4]   2> 757049 INFO  (closeThreadPool-4289-thread-1) [     ] o.a.s.c.SolrXmlConfig Configuration parameter autoReplicaFailoverBadNodeExpiration is ignored
   [junit4]   2> 757050 INFO  (closeThreadPool-4289-thread-1) [     ] o.a.s.c.SolrXmlConfig MBean server found: com.sun.jmx.mbeanserver.JmxMBeanServer@2a9a8c3c, but no JMX reporters were configured - adding default JMX reporter.
   [junit4]   2> 757145 INFO  (closeThreadPool-4289-thread-1) [     ] o.a.s.h.c.HttpShardHandlerFactory Host whitelist initialized: WhitelistHostChecker [whitelistHosts=null, whitelistHostCheckingEnabled=false]
   [junit4]   2> 757146 WARN  (closeThreadPool-4289-thread-1) [     ] o.e.j.u.s.S.config Trusting all certificates configured for Client@67f0a7d6[provider=null,keyStore=null,trustStore=null]
   [junit4]   2> 757146 WARN  (closeThreadPool-4289-thread-1) [     ] o.e.j.u.s.S.config No Client EndPointIdentificationAlgorithm configured for Client@67f0a7d6[provider=null,keyStore=null,trustStore=null]
   [junit4]   2> 757150 WARN  (closeThreadPool-4289-thread-1) [     ] o.e.j.u.s.S.config Trusting all certificates configured for Client@709eaff9[provider=null,keyStore=null,trustStore=null]
   [junit4]   2> 757150 WARN  (closeThreadPool-4289-thread-1) [     ] o.e.j.u.s.S.config No Client EndPointIdentificationAlgorithm configured for Client@709eaff9[provider=null,keyStore=null,trustStore=null]
   [junit4]   2> 757151 INFO  (closeThreadPool-4289-thread-1) [     ] o.a.s.c.ZkContainer Zookeeper client=127.0.0.1:54430/solr
   [junit4]   2> 757152 INFO  (closeThreadPool-4289-thread-1) [     ] o.a.s.c.c.ConnectionManager Waiting for client to connect to ZooKeeper
   [junit4]   2> 757173 INFO  (zkConnectionManagerCallback-4298-thread-1) [     ] o.a.s.c.c.ConnectionManager zkClient has connected
   [junit4]   2> 757174 INFO  (closeThreadPool-4289-thread-1) [     ] o.a.s.c.c.ConnectionManager Client is connected to ZooKeeper
   [junit4]   2> 757272 INFO  (TEST-TestCryptoKeys.test-seed#[5F57E984AD45F55C]) [     ] o.a.s.c.AbstractFullDistribZkTestBase create jetty 2 in directory /Users/jenkins/workspace/Lucene-Solr-master-MacOSX/solr/build/solr-core/test/J3/temp/solr.cloud.TestCryptoKeys_5F57E984AD45F55C-001/shard-2-001 of type NRT
   [junit4]   2> 757273 WARN  (closeThreadPool-4289-thread-2) [     ] o.e.j.s.h.g.GzipHandler minGzipSize of 0 is inefficient for short content, break even is size 23
   [junit4]   2> 757273 INFO  (closeThreadPool-4289-thread-2) [     ] o.a.s.c.s.e.JettySolrRunner Start Jetty (configured port=0, binding port=0)
   [junit4]   2> 757273 INFO  (closeThreadPool-4289-thread-2) [     ] o.a.s.c.s.e.JettySolrRunner Trying to start Jetty on port 0 try number 1 ...
   [junit4]   2> 757274 INFO  (closeThreadPool-4289-thread-2) [     ] o.e.j.s.Server jetty-9.4.24.v20191120; built: 2019-11-20T21:37:49.771Z; git: 363d5f2df3a8a28de40604320230664b9c793c16; jvm 13.0.1+9
   [junit4]   2> 757275 INFO  (closeThreadPool-4289-thread-2) [     ] o.e.j.s.session DefaultSessionIdManager workerName=node0
   [junit4]   2> 757275 INFO  (closeThreadPool-4289-thread-2) [     ] o.e.j.s.session No SessionScavenger set, using defaults
   [junit4]   2> 757275 INFO  (closeThreadPool-4289-thread-2) [     ] o.e.j.s.session node0 Scavenging every 600000ms
   [junit4]   2> 757276 INFO  (closeThreadPool-4289-thread-2) [     ] o.e.j.s.h.ContextHandler Started o.e.j.s.ServletContextHandler@694fc8ec{/_/os,null,AVAILABLE}
   [junit4]   2> 757277 INFO  (closeThreadPool-4289-thread-2) [     ] o.e.j.s.AbstractConnector Started ServerConnector@62a1e07a{HTTP/1.1,[http/1.1, h2c]}{127.0.0.1:54475}
   [junit4]   2> 757277 INFO  (closeThreadPool-4289-thread-2) [     ] o.e.j.s.Server Started @757335ms
   [junit4]   2> 757277 INFO  (closeThreadPool-4289-thread-2) [     ] o.a.s.c.s.e.JettySolrRunner Jetty properties: {hostContext=/_/os, solrconfig=solrconfig.xml, solr.data.dir=/Users/jenkins/workspace/Lucene-Solr-master-MacOSX/solr/build/solr-core/test/J3/temp/solr.cloud.TestCryptoKeys_5F57E984AD45F55C-001/tempDir-001/jetty2, hostPort=54475, coreRootDirectory=/Users/jenkins/workspace/Lucene-Solr-master-MacOSX/solr/build/solr-core/test/J3/../../../../../../../../../Users/jenkins/workspace/Lucene-Solr-master-MacOSX/solr/build/solr-core/test/J3/temp/solr.cloud.TestCryptoKeys_5F57E984AD45F55C-001/shard-2-001/cores, replicaType=NRT}
   [junit4]   2> 757277 ERROR (closeThreadPool-4289-thread-2) [     ] o.a.s.u.StartupLoggingUtils Missing Java Option solr.log.dir. Logging may be missing or incomplete.
   [junit4]   2> 757277 INFO  (closeThreadPool-4289-thread-2) [     ] o.a.s.s.SolrDispatchFilter Using logger factory org.apache.logging.slf4j.Log4jLoggerFactory
   [junit4]   2> 757277 INFO  (closeThreadPool-4289-thread-2) [     ] o.a.s.s.SolrDispatchFilter  ___      _       Welcome to Apache Solr™ version 9.0.0
   [junit4]   2> 757277 INFO  (closeThreadPool-4289-thread-2) [     ] o.a.s.s.SolrDispatchFilter / __| ___| |_ _   Starting in cloud mode on port null
   [junit4]   2> 757277 INFO  (closeThreadPool-4289-thread-2) [     ] o.a.s.s.SolrDispatchFilter \__ \/ _ \ | '_|  Install dir: null
   [junit4]   2> 757277 INFO  (closeThreadPool-4289-thread-2) [     ] o.a.s.s.SolrDispatchFilter |___/\___/_|_|    Start time: 2020-01-15T08:29:35.819951Z
   [junit4]   2> 757278 INFO  (closeThreadPool-4289-thread-1) [n:127.0.0.1:54467__%2Fos     ] o.a.s.c.c.ConnectionManager Waiting for client to connect to ZooKeeper
   [junit4]   2> 757280 INFO  (closeThreadPool-4289-thread-2) [     ] o.a.s.c.c.ConnectionManager Waiting for client to connect to ZooKeeper
   [junit4]   2> 757282 INFO  (zkConnectionManagerCallback-4300-thread-1) [     ] o.a.s.c.c.ConnectionManager zkClient has connected
   [junit4]   2> 757282 INFO  (closeThreadPool-4289-thread-1) [n:127.0.0.1:54467__%2Fos     ] o.a.s.c.c.ConnectionManager Client is connected to ZooKeeper
   [junit4]   2> 757282 INFO  (zkConnectionManagerCallback-4302-thread-1) [     ] o.a.s.c.c.ConnectionManager zkClient has connected
   [junit4]   2> 757282 INFO  (closeThreadPool-4289-thread-2) [     ] o.a.s.c.c.ConnectionManager Client is connected to ZooKeeper
   [junit4]   2> 757298 INFO  (closeThreadPool-4289-thread-1) [n:127.0.0.1:54467__%2Fos     ] o.a.s.c.c.ZkStateReader Updated live nodes from ZooKeeper... (0) -> (1)
   [junit4]   2> 757308 INFO  (closeThreadPool-4289-thread-1) [n:127.0.0.1:54467__%2Fos     ] o.a.s.c.ZkController Publish node=127.0.0.1:54467__%2Fos as DOWN
   [junit4]   2> 757308 INFO  (closeThreadPool-4289-thread-1) [n:127.0.0.1:54467__%2Fos     ] o.a.s.c.TransientSolrCoreCacheDefault Allocating transient cache for 4 transient cores
   [junit4]   2> 757308 INFO  (closeThreadPool-4289-thread-1) [n:127.0.0.1:54467__%2Fos     ] o.a.s.c.ZkController Register node as live in ZooKeeper:/live_nodes/127.0.0.1:54467__%2Fos
   [junit4]   2> 757315 INFO  (zkCallback-4287-thread-1) [     ] o.a.s.c.c.ZkStateReader Updated live nodes from ZooKeeper... (1) -> (2)
   [junit4]   2> 757316 INFO  (zkCallback-4272-thread-3) [     ] o.a.s.c.c.ZkStateReader Updated live nodes from ZooKeeper... (1) -> (2)
   [junit4]   2> 757316 INFO  (zkCallback-4299-thread-1) [     ] o.a.s.c.c.ZkStateReader Updated live nodes from ZooKeeper... (1) -> (2)
   [junit4]   2> 757331 INFO  (closeThreadPool-4289-thread-1) [n:127.0.0.1:54467__%2Fos     ] o.a.s.p.PackageLoader /packages.json updated to version -1
   [junit4]   2> 757331 WARN  (closeThreadPool-4289-thread-1) [n:127.0.0.1:54467__%2Fos     ] o.a.s.c.CoreContainer Not all security plugins configured!  authentication=disabled authorization=disabled.  Solr is only as secure as you make it. Consider configuring authentication/authorization before exposing Solr to users internal or external.  See https://s.apache.org/solrsecurity for more info
   [junit4]   2> 757350 INFO  (closeThreadPool-4289-thread-1) [n:127.0.0.1:54467__%2Fos     ] o.a.s.h.a.MetricsHistoryHandler No .system collection, keeping metrics history in memory.
   [junit4]   2> 757372 INFO  (closeThreadPool-4289-thread-1) [n:127.0.0.1:54467__%2Fos     ] o.a.s.m.r.SolrJmxReporter JMX monitoring for 'solr.node' (registry 'solr.node') enabled at server: com.sun.jmx.mbeanserver.JmxMBeanServer@2a9a8c3c
   [junit4]   2> 757385 INFO  (closeThreadPool-4289-thread-2) [     ] o.a.s.s.SolrDispatchFilter Loading solr.xml from SolrHome (not found in ZooKeeper)
   [junit4]   2> 757385 INFO  (closeThreadPool-4289-thread-2) [     ] o.a.s.c.SolrXmlConfig Loading container configuration from /Users/jenkins/workspace/Lucene-Solr-master-MacOSX/solr/build/solr-core/test/J3/temp/solr.cloud.TestCryptoKeys_5F57E984AD45F55C-001/shard-2-001/solr.xml
   [junit4]   2> 757388 INFO  (closeThreadPool-4289-thread-2) [     ] o.a.s.c.SolrXmlConfig Configuration parameter autoReplicaFailoverWorkLoopDelay is ignored
   [junit4]   2> 757388 INFO  (closeThreadPool-4289-thread-2) [     ] o.a.s.c.SolrXmlConfig Configuration parameter autoReplicaFailoverBadNodeExpiration is ignored
   [junit4]   2> 757389 INFO  (closeThreadPool-4289-thread-2) [     ] o.a.s.c.SolrXmlConfig MBean server found: com.sun.jmx.mbeanserver.JmxMBeanServer@2a9a8c3c, but no JMX reporters were configured - adding default JMX reporter.
   [junit4]   2> 757392 INFO  (closeThreadPool-4289-thread-1) [n:127.0.0.1:54467__%2Fos     ] o.a.s.m.r.SolrJmxReporter JMX monitoring for 'solr.jvm' (registry 'solr.jvm') enabled at server: com.sun.jmx.mbeanserver.JmxMBeanServer@2a9a8c3c
   [junit4]   2> 757392 INFO  (closeThreadPool-4289-thread-1) [n:127.0.0.1:54467__%2Fos     ] o.a.s.m.r.SolrJmxReporter JMX monitoring for 'solr.jetty' (registry 'solr.jetty') enabled at server: com.sun.jmx.mbeanserver.JmxMBeanServer@2a9a8c3c
   [junit4]   2> 757393 INFO  (closeThreadPool-4289-thread-1) [n:127.0.0.1:54467__%2Fos     ] o.a.s.c.CorePropertiesLocator Found 0 core definitions underneath /Users/jenkins/workspace/Lucene-Solr-master-MacOSX/solr/build/solr-core/test/J3/../../../../../../../../../Users/jenkins/workspace/Lucene-Solr-master-MacOSX/solr/build/solr-core/test/J3/temp/solr.cloud.TestCryptoKeys_5F57E984AD45F55C-001/shard-1-001/cores
   [junit4]   2> 757416 INFO  (closeThreadPool-4289-thread-1) [     ] o.a.s.c.AbstractFullDistribZkTestBase waitForLiveNode: 127.0.0.1:54467__%2Fos
   [junit4]   2> 757481 INFO  (closeThreadPool-4289-thread-2) [     ] o.a.s.h.c.HttpShardHandlerFactory Host whitelist initialized: WhitelistHostChecker [whitelistHosts=null, whitelistHostCheckingEnabled=false]
   [junit4]   2> 757483 WARN  (closeThreadPool-4289-thread-2) [     ] o.e.j.u.s.S.config Trusting all certificates configured for Client@3a123fee[provider=null,keyStore=null,trustStore=null]
   [junit4]   2> 757483 WARN  (closeThreadPool-4289-thread-2) [     ] o.e.j.u.s.S.config No Client EndPointIdentificationAlgorithm configured for Client@3a123fee[provider=null,keyStore=null,trustStore=null]
   [junit4]   2> 757487 WARN  (closeThreadPool-4289-thread-2) [     ] o.e.j.u.s.S.config Trusting all certificates configured for Client@7cc06195[provider=null,keyStore=null,trustStore=null]
   [junit4]   2> 757487 WARN  (closeThreadPool-4289-thread-2) [     ] o.e.j.u.s.S.config No Client EndPointIdentificationAlgorithm configured for Client@7cc06195[provider=null,keyStore=null,trustStore=null]
   [junit4]   2> 757488 INFO  (closeThreadPool-4289-thread-2) [     ] o.a.s.c.ZkContainer Zookeeper client=127.0.0.1:54430/solr
   [junit4]   2> 757489 INFO  (closeThreadPool-4289-thread-2) [     ] o.a.s.c.c.ConnectionManager Waiting for client to connect to ZooKeeper
   [junit4]   2> 757493 INFO  (zkConnectionManagerCallback-4313-thread-1) [     ] o.a.s.c.c.ConnectionManager zkClient has connected
   [junit4]   2> 757493 INFO  (closeThreadPool-4289-thread-2) [     ] o.a.s.c.c.ConnectionManager Client is connected to ZooKeeper
   [junit4]   2> 757599 INFO  (closeThreadPool-4289-thread-2) [n:127.0.0.1:54475__%2Fos     ] o.a.s.c.c.ConnectionManager Waiting for client to connect to ZooKeeper
   [junit4]   2> 757603 INFO  (zkConnectionManagerCallback-4315-thread-1) [     ] o.a.s.c.c.ConnectionManager zkClient has connected
   [junit4]   2> 757604 INFO  (closeThreadPool-4289-thread-2) [n:127.0.0.1:54475__%2Fos     ] o.a.s.c.c.ConnectionManager Client is connected to ZooKeeper
   [junit4]   2> 757635 INFO  (closeThreadPool-4289-thread-2) [n:127.0.0.1:54475__%2Fos     ] o.a.s.c.c.ZkStateReader Updated live nodes from ZooKeeper... (0) -> (2)
   [junit4]   2> 757654 INFO  (closeThreadPool-4289-thread-2) [n:127.0.0.1:54475__%2Fos     ] o.a.s.c.ZkController Publish node=127.0.0.1:54475__%2Fos as DOWN
   [junit4]   2> 757657 INFO  (closeThreadPool-4289-thread-2) [n:127.0.0.1:54475__%2Fos     ] o.a.s.c.TransientSolrCoreCacheDefault Allocating transient cache for 4 transient cores
   [junit4]   2> 757657 INFO  (closeThreadPool-4289-thread-2) [n:127.0.0.1:54475__%2Fos     ] o.a.s.c.ZkController Register node as live in ZooKeeper:/live_nodes/127.0.0.1:54475__%2Fos
   [junit4]   2> 757662 INFO  (zkCallback-4299-thread-1) [     ] o.a.s.c.c.ZkStateReader Updated live nodes from ZooKeeper... (2) -> (3)
   [junit4]   2> 757662 INFO  (zkCallback-4287-thread-1) [     ] o.a.s.c.c.ZkStateReader Updated live nodes from ZooKeeper... (2) -> (3)
   [junit4]   2> 757663 INFO  (zkCallback-4272-thread-3) [     ] o.a.s.c.c.ZkStateReader Updated live nodes from ZooKeeper... (2) -> (3)
   [junit4]   2> 757663 INFO  (zkCallback-4314-thread-1) [     ] o.a.s.c.c.ZkStateReader Updated live nodes from ZooKeeper... (2) -> (3)
   [junit4]   2> 757667 INFO  (closeThreadPool-4289-thread-2) [n:127.0.0.1:54475__%2Fos     ] o.a.s.p.PackageLoader /packages.json updated to version -1
   [junit4]   2> 757667 WARN  (closeThreadPool-4289-thread-2) [n:127.0.0.1:54475__%2Fos     ] o.a.s.c.CoreContainer Not all security plugins configured!  authentication=disabled authorization=disabled.  Solr is only as secure as you make it. Consider configuring authentication/authorization before exposing Solr to users internal or external.  See https://s.apache.org/solrsecurity for more info
   [junit4]   2> 757689 INFO  (closeThreadPool-4289-thread-2) [n:127.0.0.1:54475__%2Fos     ] o.a.s.h.a.MetricsHistoryHandler No .system collection, keeping metrics history in memory.
   [junit4]   2> 757720 INFO  (closeThreadPool-4289-thread-2) [n:127.0.0.1:54475__%2Fos     ] o.a.s.m.r.SolrJmxReporter JMX monitoring for 'solr.node' (registry 'solr.node') enabled at server: com.sun.jmx.mbeanserver.JmxMBeanServer@2a9a8c3c
   [junit4]   2> 757728 INFO  (closeThreadPool-4289-thread-2) [n:127.0.0.1:54475__%2Fos     ] o.a.s.m.r.SolrJmxReporter JMX monitoring for 'solr.jvm' (registry 'solr.jvm') enabled at server: com.sun.jmx.mbeanserver.JmxMBeanServer@2a9a8c3c
   [junit4]   2> 757728 INFO  (closeThreadPool-4289-thread-2) [n:127.0.0.1:54475__%2Fos     ] o.a.s.m.r.SolrJmxReporter JMX monitoring for 'solr.jetty' (registry 'solr.jetty') enabled at server: com.sun.jmx.mbeanserver.JmxMBeanServer@2a9a8c3c
   [junit4]   2> 757732 INFO  (closeThreadPool-4289-thread-2) [n:127.0.0.1:54475__%2Fos     ] o.a.s.c.CorePropertiesLocator Found 0 core definitions underneath /Users/jenkins/workspace/Lucene-Solr-master-MacOSX/solr/build/solr-core/test/J3/../../../../../../../../../Users/jenkins/workspace/Lucene-Solr-master-MacOSX/solr/build/solr-core/test/J3/temp/solr.cloud.TestCryptoKeys_5F57E984AD45F55C-001/shard-2-001/cores
   [junit4]   2> 757769 INFO  (closeThreadPool-4289-thread-2) [     ] o.a.s.c.AbstractFullDistribZkTestBase waitForLiveNode: 127.0.0.1:54475__%2Fos
   [junit4]   2> 757784 INFO  (TEST-TestCryptoKeys.test-seed#[5F57E984AD45F55C]) [     ] o.a.s.c.AbstractFullDistribZkTestBase create jetty 3 in directory /Users/jenkins/workspace/Lucene-Solr-master-MacOSX/solr/build/solr-core/test/J3/temp/solr.cloud.TestCryptoKeys_5F57E984AD45F55C-001/shard-3-001 of type NRT
   [junit4]   2> 757785 WARN  (closeThreadPool-4289-thread-2) [     ] o.e.j.s.h.g.GzipHandler minGzipSize of 0 is inefficient for short content, break even is size 23
   [junit4]   2> 757785 INFO  (closeThreadPool-4289-thread-2) [     ] o.a.s.c.s.e.JettySolrRunner Start Jetty (configured port=0, binding port=0)
   [junit4]   2> 757785 INFO  (closeThreadPool-4289-thread-2) [     ] o.a.s.c.s.e.JettySolrRunner Trying to start Jetty on port 0 try number 1 ...
   [junit4]   2> 757785 INFO  (closeThreadPool-4289-thread-2) [     ] o.e.j.s.Server jetty-9.4.24.v20191120; built: 2019-11-20T21:37:49.771Z; git: 363d5f2df3a8a28de40604320230664b9c793c16; jvm 13.0.1+9
   [junit4]   2> 757787 INFO  (closeThreadPool-4289-thread-2) [     ] o.e.j.s.session DefaultSessionIdManager workerName=node0
   [junit4]   2> 757788 INFO  (closeThreadPool-4289-thread-2) [     ] o.e.j.s.session No SessionScavenger set, using defaults
   [junit4]   2> 757788 INFO  (closeThreadPool-4289-thread-2) [     ] o.e.j.s.session node0 Scavenging every 660000ms
   [junit4]   2> 757788 INFO  (closeThreadPool-4289-thread-2) [     ] o.e.j.s.h.ContextHandler Started o.e.j.s.ServletContextHandler@995485a{/_/os,null,AVAILABLE}
   [junit4]   2> 757789 INFO  (closeThreadPool-4289-thread-2) [     ] o.e.j.s.AbstractConnector Started ServerConnector@4d9c3002{HTTP/1.1,[http/1.1, h2c]}{127.0.0.1:54483}
   [junit4]   2> 757789 INFO  (closeThreadPool-4289-thread-2) [     ] o.e.j.s.Server Started @757848ms
   [junit4]   2> 757789 INFO  (closeThreadPool-4289-thread-2) [     ] o.a.s.c.s.e.JettySolrRunner Jetty properties: {hostContext=/_/os, solrconfig=solrconfig.xml, solr.data.dir=/Users/jenkins/workspace/Lucene-Solr-master-MacOSX/solr/build/solr-core/test/J3/temp/solr.cloud.TestCryptoKeys_5F57E984AD45F55C-001/tempDir-001/jetty3, hostPort=54483, coreRootDirectory=/Users/jenkins/workspace/Lucene-Solr-master-MacOSX/solr/build/solr-core/test/J3/temp/solr.cloud.TestCryptoKeys_5F57E984AD45F55C-001/shard-3-001/cores, replicaType=NRT}
   [junit4]   2> 757790 ERROR (closeThreadPool-4289-thread-2) [     ] o.a.s.u.StartupLoggingUtils Missing Java Option solr.log.dir. Logging may be missing or incomplete.
   [junit4]   2> 757790 INFO  (closeThreadPool-4289-thread-2) [     ] o.a.s.s.SolrDispatchFilter Using logger factory org.apache.logging.slf4j.Log4jLoggerFactory
   [junit4]   2> 757790 INFO  (closeThreadPool-4289-thread-2) [     ] o.a.s.s.SolrDispatchFilter  ___      _       Welcome to Apache Solr™ version 9.0.0
   [junit4]   2> 757790 INFO  (closeThreadPool-4289-thread-2) [     ] o.a.s.s.SolrDispatchFilter / __| ___| |_ _   Starting in cloud mode on port null
   [junit4]   2> 757790 INFO  (closeThreadPool-4289-thread-2) [     ] o.a.s.s.SolrDispatchFilter \__ \/ _ \ | '_|  Install dir: null
   [junit4]   2> 757791 INFO  (closeThreadPool-4289-thread-2) [     ] o.a.s.s.SolrDispatchFilter |___/\___/_|_|    Start time: 2020-01-15T08:29:36.333053Z
   [junit4]   2> 757792 INFO  (closeThreadPool-4289-thread-2) [     ] o.a.s.c.c.ConnectionManager Waiting for client to connect to ZooKeeper
   [junit4]   2> 757795 INFO  (zkConnectionManagerCallback-4321-thread-1) [     ] o.a.s.c.c.ConnectionManager zkClient has connected
   [junit4]   2> 757796 INFO  (closeThreadPool-4289-thread-2) [     ] o.a.s.c.c.ConnectionManager Client is connected to ZooKeeper
   [junit4]   2> 757906 INFO  (closeThreadPool-4289-thread-2) [     ] o.a.s.s.SolrDispatchFilter Loading solr.xml from SolrHome (not found in ZooKeeper)
   [junit4]   2> 757906 INFO  (closeThreadPool-4289-thread-2) [     ] o.a.s.c.SolrXmlConfig Loading container configuration from /Users/jenkins/workspace/Lucene-Solr-master-MacOSX/solr/build/solr-core/test/J3/temp/solr.cloud.TestCryptoKeys_5F57E984AD45F55C-001/shard-3-001/solr.xml
   [junit4]   2> 757914 INFO  (closeThreadPool-4289-thread-2) [     ] o.a.s.c.SolrXmlConfig Configuration parameter autoReplicaFailoverWorkLoopDelay is ignored
   [junit4]   2> 757914 INFO  (closeThreadPool-4289-thread-2) [     ] o.a.s.c.SolrXmlConfig Configuration parameter autoReplicaFailoverBadNodeExpiration is ignored
   [junit4]   2> 757915 INFO  (closeThreadPool-4289-thread-2) [     ] o.a.s.c.SolrXmlConfig MBean server found: com.sun.jmx.mbeanserver.JmxMBeanServer@2a9a8c3c, but no JMX reporters were configured - adding default JMX reporter.
   [junit4]   2> 758020 INFO  (closeThreadPool-4289-thread-2) [     ] o.a.s.h.c.HttpShardHandlerFactory Host whitelist initialized: WhitelistHostChecker [whitelistHosts=null, whitelistHostCheckingEnabled=false]
   [junit4]   2> 758022 WARN  (closeThreadPool-4289-thread-2) [     ] o.e.j.u.s.S.config Trusting all certificates configured for Client@178b032[provider=null,keyStore=null,trustStore=null]
   [junit4]   2> 758022 WARN  (closeThreadPool-4289-thread-2) [     ] o.e.j.u.s.S.config No Client EndPointIdentificationAlgorithm configured for Client@178b032[provider=null,keyStore=null,trustStore=null]
   [junit4]   2> 758026 WARN  (closeThreadPool-4289-thread-2) [     ] o.e.j.u.s.S.config Trusting all certificates configured for Client@37fe7c7e[provider=null,keyStore=null,trustStore=null]
   [junit4]   2> 758026 WARN  (closeThreadPool-4289-thread-2) [     ] o.e.j.u.s.S.config No Client EndPointIdentificationAlgorithm configured for Client@37fe7c7e[provider=null,keyStore=null,trustStore=null]
   [junit4]   2> 758027 INFO  (closeThreadPool-4289-thread-2) [     ] o.a.s.c.ZkContainer Zookeeper client=127.0.0.1:54430/solr
   [junit4]   2> 758029 INFO  (closeThreadPool-4289-thread-2) [     ] o.a.s.c.c.ConnectionManager Waiting for client to connect to ZooKeeper
   [junit4]   2> 758035 INFO  (zkConnectionManagerCallback-4328-thread-1) [     ] o.a.s.c.c.ConnectionManager zkClient has connected
   [junit4]   2> 758035 INFO  (closeThreadPool-4289-thread-2) [     ] o.a.s.c.c.ConnectionManager Client is connected to ZooKeeper
   [junit4]   2> 758146 INFO  (closeThreadPool-4289-thread-2) [     ] o.a.s.c.c.ConnectionManager Waiting for client to connect to ZooKeeper
   [junit4]   2> 758154 INFO  (zkConnectionManagerCallback-4330-thread-1) [     ] o.a.s.c.c.ConnectionManager zkClient has connected
   [junit4]   2> 758154 INFO  (closeThreadPool-4289-thread-2) [     ] o.a.s.c.c.ConnectionManager Client is connected to ZooKeeper
   [junit4]   2> 758180 INFO  (closeThreadPool-4289-thread-2) [     ] o.a.s.c.c.ZkStateReader Updated live nodes from ZooKeeper... (0) -> (3)
   [junit4]   2> 758227 INFO  (closeThreadPool-4289-thread-2) [     ] o.a.s.c.ZkController Publish node=127.0.0.1:54483__%2Fos as DOWN
   [junit4]   2> 758231 INFO  (closeThreadPool-4289-thread-2) [     ] o.a.s.c.TransientSolrCoreCacheDefault Allocating transient cache for 4 transient cores
   [junit4]   2> 758231 INFO  (closeThreadPool-4289-thread-2) [     ] o.a.s.c.ZkController Register node as live in ZooKeeper:/live_nodes/127.0.0.1:54483__%2Fos
   [junit4]   2> 758238 INFO  (zkCallback-4287-thread-1) [     ] o.a.s.c.c.ZkStateReader Updated live nodes from ZooKeeper... (3) -> (4)
   [junit4]   2> 758238 INFO  (zkCallback-4314-thread-1) [     ] o.a.s.c.c.ZkStateReader Updated live nodes from ZooKeeper... (3) -> (4)
   [junit4]   2> 758239 INFO  (zkCallback-4299-thread-1) [     ] o.a.s.c.c.ZkStateReader Updated live nodes from ZooKeeper... (3) -> (4)
   [junit4]   2> 758239 INFO  (zkCallback-4272-thread-3) [     ] o.a.s.c.c.ZkStateReader Updated live nodes from ZooKeeper... (3) -> (4)
   [junit4]   2> 758240 INFO  (zkCallback-4329-thread-1) [     ] o.a.s.c.c.ZkStateReader Updated live nodes from ZooKeeper... (3) -> (4)
   [junit4]   2> 758254 INFO  (closeThreadPool-4289-thread-2) [     ] o.a.s.p.PackageLoader /packages.json updated to version -1
   [junit4]   2> 758254 WARN  (closeThreadPool-4289-thread-2) [     ] o.a.s.c.CoreContainer Not all security plugins configured!  authentication=disabled authorization=disabled.  Solr is only as secure as you make it. Consider configuring authentication/authorization before exposing Solr to users internal or external.  See https://s.apache.org/solrsecurity for more info
   [junit4]   2> 758279 INFO  (closeThreadPool-4289-thread-2) [     ] o.a.s.h.a.MetricsHistoryHandler No .system collection, keeping metrics history in memory.
   [junit4]   2> 758299 INFO  (TEST-TestCryptoKeys.test-seed#[5F57E984AD45F55C]) [     ] o.a.s.c.AbstractFullDistribZkTestBase create jetty 4 in directory /Users/jenkins/workspace/Lucene-Solr-master-MacOSX/solr/build/solr-core/test/J3/temp/solr.cloud.TestCryptoKeys_5F57E984AD45F55C-001/shard-4-001 of type NRT
   [junit4]   2> 758300 WARN  (closeThreadPool-4289-thread-1) [     ] o.e.j.s.h.g.GzipHandler minGzipSize of 0 is inefficient for short content, break even is size 23
   [junit4]   2> 758300 INFO  (closeThreadPool-4289-thread-1) [     ] o.a.s.c.s.e.JettySolrRunner Start Jetty (configured port=0, binding port=0)
   [junit4]   2> 758300 INFO  (closeThreadPool-4289-thread-1) [     ] o.a.s.c.s.e.JettySolrRunner Trying to start Jetty on port 0 try number 1 ...
   [junit4]   2> 758300 INFO  (closeThreadPool-4289-thread-1) [     ] o.e.j.s.Server jetty-9.4.24.v20191120; built: 2019-11-20T21:37:49.771Z; git: 363d5f2df3a8a28de40604320230664b9c793c16; jvm 13.0.1+9
   [junit4]   2> 758301 INFO  (closeThreadPool-4289-thread-2) [     ] o.a.s.m.r.SolrJmxReporter JMX monitoring for 'solr.node' (registry 'solr.node') enabled at server: com.sun.jmx.mbeanserver.JmxMBeanServer@2a9a8c3c
   [junit4]   2> 758303 INFO  (closeThreadPool-4289-thread-1) [     ] o.e.j.s.session DefaultSessionIdManager workerName=node0
   [junit4]   2> 758303 INFO  (closeThreadPool-4289-thread-1) [     ] o.e.j.s.session No SessionScavenger set, using defaults
   [junit4]   2> 758303 INFO  (closeThreadPool-4289-thread-1) [     ] o.e.j.s.session node0 Scavenging every 600000ms
   [junit4]   2> 758304 INFO  (closeThreadPool-4289-thread-1) [     ] o.e.j.s.h.ContextHandler Started o.e.j.s.ServletContextHandler@2171eb9b{/_/os,null,AVAILABLE}
   [junit4]   2> 758305 INFO  (closeThreadPool-4289-thread-1) [     ] o.e.j.s.AbstractConnector Started ServerConnector@6b9840ae{HTTP/1.1,[http/1.1, h2c]}{127.0.0.1:54494}
   [junit4]   2> 758305 INFO  (closeThreadPool-4289-thread-1) [     ] o.e.j.s.Server Started @758363ms
   [junit4]   2> 758305 INFO  (closeThreadPool-4289-thread-1) [     ] o.a.s.c.s.e.JettySolrRunner Jetty properties: {hostContext=/_/os, solrconfig=solrconfig.xml, solr.data.dir=/Users/jenkins/workspace/Lucene-Solr-master-MacOSX/solr/build/solr-core/test/J3/temp/solr.cloud.TestCryptoKeys_5F57E984AD45F55C-001/tempDir-001/jetty4, hostPort=54494, coreRootDirectory=/Users/jenkins/workspace/Lucene-Solr-master-MacOSX/solr/build/solr-core/test/J3/temp/solr.cloud.TestCryptoKeys_5F57E984AD45F55C-001/shard-4-001/cores, replicaType=NRT}
   [junit4]   2> 758306 ERROR (closeThreadPool-4289-thread-1) [     ] o.a.s.u.StartupLoggingUtils Missing Java Option solr.log.dir. Logging may be missing or incomplete.
   [junit4]   2> 758306 INFO  (closeThreadPool-4289-thread-1) [     ] o.a.s.s.SolrDispatchFilter Using logger factory org.apache.logging.slf4j.Log4jLoggerFactory
   [junit4]   2> 758306 INFO  (closeThreadPool-4289-thread-1) [     ] o.a.s.s.SolrDispatchFilter  ___      _       Welcome to Apache Solr™ version 9.0.0
   [junit4]   2> 758306 INFO  (closeThreadPool-4289-thread-1) [     ] o.a.s.s.SolrDispatchFilter / __| ___| |_ _   Starting in cloud mode on port null
   [junit4]   2> 758306 INFO  (closeThreadPool-4289-thread-1) [     ] o.a.s.s.SolrDispatchFilter \__ \/ _ \ | '_|  Install dir: null
   [junit4]   2> 758306 INFO  (closeThreadPool-4289-thread-1) [     ] o.a.s.s.SolrDispatchFilter |___/\___/_|_|    Start time: 2020-01-15T08:29:36.848273Z
   [junit4]   2> 758308 INFO  (closeThreadPool-4289-thread-1) [     ] o.a.s.c.c.ConnectionManager Waiting for client to connect to ZooKeeper
   [junit4]   2> 758311 INFO  (zkConnectionManagerCallback-4336-thread-1) [     ] o.a.s.c.c.ConnectionManager zkClient has connected
   [junit4]   2> 758311 INFO  (closeThreadPool-4289-thread-1) [     ] o.a.s.c.c.ConnectionManager Client is connected to ZooKeeper
   [junit4]   2> 758317 INFO  (closeThreadPool-4289-thread-2) [     ] o.a.s.m.r.SolrJmxReporter JMX monitoring for 'solr.jvm' (registry 'solr.jvm') enabled at server: com.sun.jmx.mbeanserver.JmxMBeanServer@2a9a8c3c
   [junit4]   2> 758317 INFO  (closeThreadPool-4289-thread-2) [     ] o.a.s.m.r.SolrJmxReporter JMX monitoring for 'solr.jetty' (registry 'solr.jetty') enabled at server: com.sun.jmx.mbeanserver.JmxMBeanServer@2a9a8c3c
   [junit4]   2> 758319 INFO  (closeThreadPool-4289-thread-2) [     ] o.a.s.c.CorePropertiesLocator Found 0 core definitions underneath /Users/jenkins/workspace/Lucene-Solr-master-MacOSX/solr/build/solr-core/test/J3/temp/solr.cloud.TestCryptoKeys_5F57E984AD45F55C-001/shard-3-001/cores
   [junit4]   2> 758352 INFO  (OverseerCollectionConfigSetProcessor-72058221813432324-127.0.0.1:54439__%2Fos-n_0000000000) [n:127.0.0.1:54439__%2Fos     ] o.a.s.c.OverseerTaskQueue Response ZK path: /overseer/collection-queue-work/qnr-0000000002 doesn't exist.  Requestor may have disconnected from ZooKeeper
   [junit4]   2> 758368 INFO  (closeThreadPool-4289-thread-2) [     ] o.a.s.c.AbstractFullDistribZkTestBase waitForLiveNode: 127.0.0.1:54483__%2Fos
   [junit4]   2> 758417 INFO  (closeThreadPool-4289-thread-1) [     ] o.a.s.s.SolrDispatchFilter Loading solr.xml from SolrHome (not found in ZooKeeper)
   [junit4]   2> 758418 INFO  (closeThreadPool-4289-thread-1) [     ] o.a.s.c.SolrXmlConfig Loading container configuration from /Users/jenkins/workspace/Lucene-Solr-master-MacOSX/solr/build/solr-core/test/J3/temp/solr.cloud.TestCryptoKeys_5F57E984AD45F55C-001/shard-4-001/solr.xml
   [junit4]   2> 758420 INFO  (closeThreadPool-4289-thread-1) [     ] o.a.s.c.SolrXmlConfig Configuration parameter autoReplicaFailoverWorkLoopDelay is ignored
   [junit4]   2> 758420 INFO  (closeThreadPool-4289-thread-1) [     ] o.a.s.c.SolrXmlConfig Configuration parameter autoReplicaFailoverBadNodeExpiration is ignored
   [junit4]   2> 758421 INFO  (closeThreadPool-4289-thread-1) [     ] o.a.s.c.SolrXmlConfig MBean server found: com.sun.jmx.mbeanserver.JmxMBeanServer@2a9a8c3c, but no JMX reporters were configured - adding default JMX reporter.
   [junit4]   2> 758640 INFO  (closeThreadPool-4289-thread-1) [     ] o.a.s.h.c.HttpShardHandlerFactory Host whitelist initialized: WhitelistHostChecker [whitelistHosts=null, whitelistHostCheckingEnabled=false]
   [junit4]   2> 758642 WARN  (closeThreadPool-4289-thread-1) [     ] o.e.j.u.s.S.config Trusting all certificates configured for Client@735537e8[provider=null,keyStore=null,trustStore=null]
   [junit4]   2> 758642 WARN  (closeThreadPool-4289-thread-1) [     ] o.e.j.u.s.S.config No Client EndPointIdentificationAlgorithm configured for Client@735537e8[provider=null,keyStore=null,trustStore=null]
   [junit4]   2> 758646 WARN  (closeThreadPool-4289-thread-1) [     ] o.e.j.u.s.S.config Trusting all certificates configured for Client@1a45f586[provider=null,keyStore=null,trustStore=null]
   [junit4]   2> 758646 WARN  (closeThreadPool-4289-thread-1) [     ] o.e.j.u.s.S.config No Client EndPointIdentificationAlgorithm configured for Client@1a45f586[provider=null,keyStore=null,trustStore=null]
   [junit4]   2> 758648 INFO  (closeThreadPool-4289-thread-1) [     ] o.a.s.c.ZkContainer Zookeeper client=127.0.0.1:54430/solr
   [junit4]   2> 758650 INFO  (closeThreadPool-4289-thread-1) [     ] o.a.s.c.c.ConnectionManager Waiting for client to connect to ZooKeeper
   [junit4]   2> 758652 INFO  (zkConnectionManagerCallback-4343-thread-1) [     ] o.a.s.c.c.ConnectionManager zkClient has connected
   [junit4]   2> 758652 INFO  (closeThreadPool-4289-thread-1) [     ] o.a.s.c.c.ConnectionManager Client is connected to ZooKeeper
   [junit4]   2> 758756 INFO  (closeThreadPool-4289-thread-1) [     ] o.a.s.c.c.ConnectionManager Waiting for client to connect to ZooKeeper
   [junit4]   2> 758760 INFO  (zkConnectionManagerCallback-4345-thread-1) [     ] o.a.s.c.c.ConnectionManager zkClient has connected
   [junit4]   2> 758760 INFO  (closeThreadPool-4289-thread-1) [     ] o.a.s.c.c.ConnectionManager Client is connected to ZooKeeper
   [junit4]   2> 758776 INFO  (closeThreadPool-4289-thread-1) [     ] o.a.s.c.c.ZkStateReader Updated live nodes from ZooKeeper... (0) -> (4)
   [junit4]   2> 758784 INFO  (closeThreadPool-4289-thread-1) [     ] o.a.s.c.ZkController Publish node=127.0.0.1:54494__%2Fos as DOWN
   [junit4]   2> 758790 INFO  (closeThreadPool-4289-thread-1) [     ] o.a.s.c.TransientSolrCoreCacheDefault Allocating transient cache for 4 transient cores
   [junit4]   2> 758791 INFO  (closeThreadPool-4289-thread-1) [     ] o.a.s.c.ZkController Register node as live in ZooKeeper:/live_nodes/127.0.0.1:54494__%2Fos
   [junit4]   2> 758795 INFO  (zkCallback-4329-thread-1) [     ] o.a.s.c.c.ZkStateReader Updated live nodes from ZooKeeper... (4) -> (5)
   [junit4]   2> 758795 INFO  (zkCallback-4314-thread-1) [     ] o.a.s.c.c.ZkStateReader Updated live nodes from ZooKeeper... (4) -> (5)
   [junit4]   2> 758795 INFO  (zkCallback-4287-thread-1) [     ] o.a.s.c.c.ZkStateReader Updated live nodes from ZooKeeper... (4) -> (5)
   [junit4]   2> 758796 INFO  (zkCallback-4272-thread-3) [     ] o.a.s.c.c.ZkStateReader Updated live nodes from ZooKeeper... (4) -> (5)
   [junit4]   2> 758796 INFO  (zkCallback-4299-thread-1) [     ] o.a.s.c.c.ZkStateReader Updated live nodes from ZooKeeper... (4) -> (5)
   [junit4]

[...truncated too long message...]

 url = null, agent id = null] for registry solr.jetty / com.codahale.metrics.MetricRegistry@722d4070
   [junit4]   2> 2086463 INFO  (jetty-closer-7853-thread-4) [     ] o.a.s.m.SolrMetricManager Closing metric reporters for registry=solr.cluster, tag=null
   [junit4]   2> 2086565 INFO  (jetty-closer-7853-thread-1) [     ] o.a.s.c.Overseer Overseer (id=72058306360705032-127.0.0.1:50996_solr-n_0000000000) closing
   [junit4]   2> 2086568 INFO  (jetty-closer-7853-thread-1) [     ] o.e.j.s.AbstractConnector Stopped ServerConnector@2df2af31{HTTP/1.1,[http/1.1, h2c]}{127.0.0.1:0}
   [junit4]   2> 2086568 INFO  (jetty-closer-7853-thread-1) [     ] o.e.j.s.h.ContextHandler Stopped o.e.j.s.ServletContextHandler@2e18524a{/solr,null,UNAVAILABLE}
   [junit4]   2> 2086568 INFO  (jetty-closer-7853-thread-1) [     ] o.e.j.s.session node0 Stopped scavenging
   [junit4]   2> 2086585 INFO  (jetty-closer-7853-thread-3) [     ] o.e.j.s.AbstractConnector Stopped ServerConnector@2c0a78a1{HTTP/1.1,[http/1.1, h2c]}{127.0.0.1:0}
   [junit4]   2> 2086586 INFO  (jetty-closer-7853-thread-3) [     ] o.e.j.s.h.ContextHandler Stopped o.e.j.s.ServletContextHandler@349f7a2f{/solr,null,UNAVAILABLE}
   [junit4]   2> 2086586 INFO  (jetty-closer-7853-thread-3) [     ] o.e.j.s.session node0 Stopped scavenging
   [junit4]   2> 2086690 INFO  (jetty-closer-7853-thread-6) [     ] o.e.j.s.AbstractConnector Stopped ServerConnector@4a552ca6{HTTP/1.1,[http/1.1, h2c]}{127.0.0.1:0}
   [junit4]   2> 2086690 INFO  (jetty-closer-7853-thread-5) [     ] o.e.j.s.AbstractConnector Stopped ServerConnector@777982b5{HTTP/1.1,[http/1.1, h2c]}{127.0.0.1:0}
   [junit4]   2> 2086691 INFO  (jetty-closer-7853-thread-6) [     ] o.e.j.s.h.ContextHandler Stopped o.e.j.s.ServletContextHandler@3c86a29c{/solr,null,UNAVAILABLE}
   [junit4]   2> 2086692 INFO  (jetty-closer-7853-thread-6) [     ] o.e.j.s.session node0 Stopped scavenging
   [junit4]   2> 2086692 INFO  (jetty-closer-7853-thread-5) [     ] o.e.j.s.h.ContextHandler Stopped o.e.j.s.ServletContextHandler@3f226ab8{/solr,null,UNAVAILABLE}
   [junit4]   2> 2086693 INFO  (jetty-closer-7853-thread-5) [     ] o.e.j.s.session node0 Stopped scavenging
   [junit4]   2> 2086801 INFO  (jetty-closer-7853-thread-4) [     ] o.e.j.s.AbstractConnector Stopped ServerConnector@4968345a{HTTP/1.1,[http/1.1, h2c]}{127.0.0.1:0}
   [junit4]   2> 2086802 INFO  (jetty-closer-7853-thread-4) [     ] o.e.j.s.h.ContextHandler Stopped o.e.j.s.ServletContextHandler@29d9222d{/solr,null,UNAVAILABLE}
   [junit4]   2> 2086803 INFO  (jetty-closer-7853-thread-4) [     ] o.e.j.s.session node0 Stopped scavenging
   [junit4]   2> 2086804 INFO  (SUITE-TriggerCooldownIntegrationTest-seed#[5F57E984AD45F55C]-worker) [     ] o.a.s.c.ZkTestServer Shutting down ZkTestServer.
   [junit4]   2> 2087031 WARN  (ZkTestServer Run Thread) [     ] o.a.s.c.ZkTestServer Watch limit violations: 
   [junit4]   2> Maximum concurrent create/delete watches above limit:
   [junit4]   2> 
   [junit4]   2> 	7	/solr/aliases.json
   [junit4]   2> 	7	/solr/clusterprops.json
   [junit4]   2> 	6	/solr/packages.json
   [junit4]   2> 	6	/solr/security.json
   [junit4]   2> 
   [junit4]   2> Maximum concurrent data watches above limit:
   [junit4]   2> 
   [junit4]   2> 	7	/solr/clusterstate.json
   [junit4]   2> 	5	/solr/autoscaling.json
   [junit4]   2> 	2	/solr/overseer_elect/election/72058306360705032-127.0.0.1:50996_solr-n_0000000000
   [junit4]   2> 
   [junit4]   2> Maximum concurrent children watches above limit:
   [junit4]   2> 
   [junit4]   2> 	52	/solr/live_nodes
   [junit4]   2> 	14	/solr/overseer/queue
   [junit4]   2> 	7	/solr/collections
   [junit4]   2> 	4	/solr/autoscaling/events/node_added_cooldown_trigger
   [junit4]   2> 
   [junit4]   2> 2087037 INFO  (SUITE-TriggerCooldownIntegrationTest-seed#[5F57E984AD45F55C]-worker) [     ] o.a.s.c.ZkTestServer waitForServerDown: 127.0.0.1:50990
   [junit4]   2> 2087037 INFO  (SUITE-TriggerCooldownIntegrationTest-seed#[5F57E984AD45F55C]-worker) [     ] o.a.s.c.ZkTestServer parse host and port list: 127.0.0.1:50990
   [junit4]   2> 2087037 INFO  (SUITE-TriggerCooldownIntegrationTest-seed#[5F57E984AD45F55C]-worker) [     ] o.a.s.c.ZkTestServer connecting to 127.0.0.1 50990
   [junit4]   2> NOTE: leaving temporary files on disk at: /Users/jenkins/workspace/Lucene-Solr-master-MacOSX/solr/build/solr-core/test/J1/temp/solr.cloud.autoscaling.TriggerCooldownIntegrationTest_5F57E984AD45F55C-001
   [junit4]   2> Jan 15, 2020 8:51:45 AM com.carrotsearch.randomizedtesting.ThreadLeakControl checkThreadLeaks
   [junit4]   2> WARNING: Will linger awaiting termination of 1 leaked thread(s).
   [junit4]   2> NOTE: test params are: codec=Asserting(Lucene84): {}, docValues:{}, maxPointsInLeafNode=1802, maxMBSortInHeap=5.01666915155663, sim=Asserting(org.apache.lucene.search.similarities.AssertingSimilarity@314dbbff), locale=sr-Latn-BA, timezone=CAT
   [junit4]   2> NOTE: Mac OS X 10.14.6 x86_64/AdoptOpenJDK 13.0.1 (64-bit)/cpus=6,threads=1,free=113342424,total=437256192
   [junit4]   2> NOTE: All tests run in this JVM: [BlobRepositoryCloudTest, TestInPlaceUpdatesRequiredField, RankQueryTest, SliceStateTest, TestDynamicLoading, TestExactStatsCache, TestCSVLoader, ClusterStateTest, TestRecovery, TestPullReplica, SolrCloudReportersTest, AutoAddReplicasIntegrationTest, TestManagedSchemaThreadSafety, TestFieldTypeCollectionResource, TestCloudDeleteByQuery, SynonymTokenizerTest, TestDocTermOrds, DebugComponentTest, NoCacheHeaderTest, MetricTriggerIntegrationTest, TestSubQueryTransformerDistrib, TestXIncludeConfig, DistributedTermsComponentTest, TestHashQParserPlugin, TestMacros, TestStressUserVersions, TestRandomDVFaceting, DistributedQueryComponentCustomSortTest, DistanceUnitsTest, DeleteStatusTest, NodeLostTriggerTest, MetricUtilsTest, TestOrdValues, BasicFunctionalityTest, TestSimGenericDistributedQueue, TestDistributedMap, TestTolerantUpdateProcessorCloud, HdfsCollectionsAPIDistributedZkTest, TestFieldCacheSortRandom, TestRemoteStreaming, TemplateUpdateProcessorTest, SpellingQueryConverterTest, TestSQLHandlerNonCloud, TestSimComputePlanAction, BinaryUpdateRequestHandlerTest, TestSolrCloudWithHadoopAuthPlugin, TestMultiValuedNumericRangeQuery, BasicDistributedZk2Test, SolrCoreCheckLockOnStartupTest, TestSchemalessBufferedUpdates, TestAddFieldRealTimeGet, TestLuceneIndexBackCompat, OverseerCollectionConfigSetProcessorTest, TestConfigSetsAPIExclusivity, TestRTGBase, QueryElevationComponentTest, JvmMetricsTest, PolyFieldTest, DistributedFacetPivotLongTailTest, TestTolerantSearch, TestCopyFieldCollectionResource, HealthCheckHandlerTest, TestSolrJacksonAnnotation, TestCloudRecovery, SolrTestCaseJ4Test, TestInitParams, LeaderVoteWaitTimeoutTest, LeaderElectionIntegrationTest, TestPushWriter, DataDrivenBlockJoinTest, JavaBinAtomicUpdateMultivalueTest, CollectionsAPIAsyncDistributedZkTest, TestSizeLimitedDistributedMap, DocValuesMultiTest, ConnectionReuseTest, TestSolrConfigHandler, TestFieldSortValues, TestMaxTokenLenTokenizer, TestPayloadCheckQParserPlugin, RandomizedTaggerTest, TestManagedSynonymGraphFilterFactory, TestLMJelinekMercerSimilarityFactory, AutoscalingHistoryHandlerTest, StatsReloadRaceTest, SoftAutoCommitTest, TestAtomicUpdateErrorCases, MissingSegmentRecoveryTest, TestDocumentBuilder, TestGroupingSearch, ProtectedTermFilterFactoryTest, TestCharFilters, TestDeprecatedFilters, TestLuceneMatchVersion, TestReversedWildcardFilterFactory, TestWordDelimiterFilterFactory, AliasIntegrationTest, ChaosMonkeySafeLeaderTest, ConcurrentCreateRoutedAliasTest, ConfigSetsAPITest, ConnectionManagerTest, DeleteInactiveReplicaTest, DeleteNodeTest, DistributedQueueTest, DocValuesNotIndexedTest, ForceLeaderTest, ForceLeaderWithTlogReplicasTest, HttpPartitionWithTlogReplicasTest, RecoveryAfterSoftCommitTest, RemoteQueryErrorTest, ReplaceNodeNoTargetTest, ReplaceNodeTest, RestartWhileUpdatingTest, RoutingToNodesWithPropertiesTest, SSLMigrationTest, SaslZkACLProviderTest, ShardRoutingCustomTest, SyncSliceTest, TestLockTree, TestMiniSolrCloudClusterSSL, TlogReplayBufferedWhileIndexingTest, TriLevelCompositeIdRoutingTest, VMParamsZkACLAndCredentialsProvidersTest, ZkCLITest, TestCollectionAPI, IndexSizeTriggerSizeEstimationTest, TriggerCooldownIntegrationTest]
   [junit4] Completed [566/899 (2!)] on J1 in 44.90s, 1 test, 1 failure <<< FAILURES!

[...truncated 46171 lines...]
[repro] Jenkins log URL: https://jenkins.thetaphi.de/job/Lucene-Solr-master-MacOSX/5533/consoleText

[repro] Revision: 5cf1ffef321cdcd43677d7e4fc3363f73a4ed468

[repro] Ant options: "-Dargs=-XX:+UseCompressedOops -XX:+UseG1GC"
[repro] Repro line:  ant test  -Dtestcase=TestCryptoKeys -Dtests.method=test -Dtests.seed=5F57E984AD45F55C -Dtests.slow=true -Dtests.locale=shi-Latn -Dtests.timezone=Atlantic/Reykjavik -Dtests.asserts=true -Dtests.file.encoding=UTF-8

[repro] Repro line:  ant test  -Dtestcase=TriggerCooldownIntegrationTest -Dtests.method=testCooldown -Dtests.seed=5F57E984AD45F55C -Dtests.slow=true -Dtests.locale=sr-Latn-BA -Dtests.timezone=CAT -Dtests.asserts=true -Dtests.file.encoding=UTF-8

[repro] JUnit test result XML files will be moved to: ./repro-reports
[repro] ant clean

[...truncated 6 lines...]
[repro] Test suites by module:
[repro]    solr/core
[repro]       TestCryptoKeys
[repro]       TriggerCooldownIntegrationTest
[repro] ant compile-test

[...truncated 3410 lines...]
[repro] ant test-nocompile -Dtests.dups=5 -Dtests.maxfailures=10 -Dtests.class="*.TestCryptoKeys|*.TriggerCooldownIntegrationTest" -Dtests.showOutput=onerror "-Dargs=-XX:+UseCompressedOops -XX:+UseG1GC" -Dtests.seed=5F57E984AD45F55C -Dtests.slow=true -Dtests.locale=shi-Latn -Dtests.timezone=Atlantic/Reykjavik -Dtests.asserts=true -Dtests.file.encoding=UTF-8

[...truncated 106 lines...]
[repro] Failures w/original seeds:
[repro]   0/5 failed: org.apache.solr.cloud.TestCryptoKeys
[repro]   0/5 failed: org.apache.solr.cloud.autoscaling.TriggerCooldownIntegrationTest
[repro] Exiting with code 0
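
For reference, a minimal sketch of reproducing the first failure locally, assuming a lucene-solr checkout with the ant build available; the revision, ant options, and test arguments are copied verbatim from the [repro] lines above, and solr/core is the module listed by the repro script:

    # commands assembled from the [repro] output above
    git checkout 5cf1ffef321cdcd43677d7e4fc3363f73a4ed468
    cd solr/core
    ant test -Dtestcase=TestCryptoKeys -Dtests.method=test -Dtests.seed=5F57E984AD45F55C \
      -Dtests.slow=true -Dtests.locale=shi-Latn -Dtests.timezone=Atlantic/Reykjavik \
      -Dtests.asserts=true -Dtests.file.encoding=UTF-8 \
      "-Dargs=-XX:+UseCompressedOops -XX:+UseG1GC"

Since the repro run above (5 iterations per suite with the original seed) reported 0/5 failures for both suites, the failures look flaky rather than deterministically reproducible.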

[...truncated 73 lines...]

[JENKINS] Lucene-Solr-master-MacOSX (64bit/jdk-11.0.4) - Build # 5538 - Still Failing!

Posted by Policeman Jenkins Server <je...@thetaphi.de>.
Build: https://jenkins.thetaphi.de/job/Lucene-Solr-master-MacOSX/5538/
Java: 64bit/jdk-11.0.4 -XX:-UseCompressedOops -XX:+UseParallelGC

2 tests failed.
FAILED:  org.apache.solr.client.solrj.impl.CloudSolrClientTest.testRetryUpdatesWhenClusterStateIsStale

Error Message:
Test abandoned because suite timeout was reached.

Stack Trace:
java.lang.Exception: Test abandoned because suite timeout was reached.
	at __randomizedtesting.SeedInfo.seed([545782413F1883B0]:0)


FAILED:  junit.framework.TestSuite.org.apache.solr.client.solrj.impl.CloudSolrClientTest

Error Message:
Suite timeout exceeded (>= 7200000 msec).

Stack Trace:
java.lang.Exception: Suite timeout exceeded (>= 7200000 msec).
	at __randomizedtesting.SeedInfo.seed([545782413F1883B0]:0)




Build Log:
[...truncated 17203 lines...]
   [junit4] Suite: org.apache.solr.client.solrj.impl.CloudSolrClientTest
   [junit4]   2> 227514 INFO  (SUITE-CloudSolrClientTest-seed#[545782413F1883B0]-worker) [     ] o.a.s.SolrTestCaseJ4 Created dataDir: /Users/jenkins/workspace/Lucene-Solr-master-MacOSX/solr/build/solr-solrj/test/J3/temp/solr.client.solrj.impl.CloudSolrClientTest_545782413F1883B0-001/data-dir-19-001
   [junit4]   2> 227514 WARN  (SUITE-CloudSolrClientTest-seed#[545782413F1883B0]-worker) [     ] o.a.s.SolrTestCaseJ4 startTrackingSearchers: numOpens=133 numCloses=133
   [junit4]   2> 227514 INFO  (SUITE-CloudSolrClientTest-seed#[545782413F1883B0]-worker) [     ] o.a.s.SolrTestCaseJ4 Using PointFields (NUMERIC_POINTS_SYSPROP=true) w/NUMERIC_DOCVALUES_SYSPROP=true
   [junit4]   2> 227515 INFO  (SUITE-CloudSolrClientTest-seed#[545782413F1883B0]-worker) [     ] o.a.s.SolrTestCaseJ4 Randomized ssl (false) and clientAuth (false) via: @org.apache.solr.util.RandomizeSSL(reason="", value=0.0/0.0, ssl=0.0/0.0, clientAuth=0.0/0.0) w/ MAC_OS_X suppressed clientAuth
   [junit4]   2> 227515 INFO  (SUITE-CloudSolrClientTest-seed#[545782413F1883B0]-worker) [     ] o.a.s.SolrTestCaseJ4 SecureRandom sanity checks: test.solr.allowed.securerandom=null & java.security.egd=file:/dev/./urandom
   [junit4]   2> 227518 INFO  (TEST-CloudSolrClientTest.preferLocalShardsTest-seed#[545782413F1883B0]) [     ] o.a.s.SolrTestCaseJ4 ###Starting preferLocalShardsTest
   [junit4]   2> 227519 INFO  (TEST-CloudSolrClientTest.preferLocalShardsTest-seed#[545782413F1883B0]) [     ] o.a.s.c.MiniSolrCloudCluster Starting cluster of 3 servers in /Users/jenkins/workspace/Lucene-Solr-master-MacOSX/solr/build/solr-solrj/test/J3/temp/solr.client.solrj.impl.CloudSolrClientTest_545782413F1883B0-001/tempDir-001
   [junit4]   2> 227520 INFO  (TEST-CloudSolrClientTest.preferLocalShardsTest-seed#[545782413F1883B0]) [     ] o.a.s.c.ZkTestServer STARTING ZK TEST SERVER
   [junit4]   2> 227520 INFO  (ZkTestServer Run Thread) [     ] o.a.s.c.ZkTestServer client port:0.0.0.0/0.0.0.0:0
   [junit4]   2> 227520 INFO  (ZkTestServer Run Thread) [     ] o.a.s.c.ZkTestServer Starting server
   [junit4]   2> 227625 INFO  (TEST-CloudSolrClientTest.preferLocalShardsTest-seed#[545782413F1883B0]) [     ] o.a.s.c.ZkTestServer start zk server on port:51187
   [junit4]   2> 227626 INFO  (TEST-CloudSolrClientTest.preferLocalShardsTest-seed#[545782413F1883B0]) [     ] o.a.s.c.ZkTestServer waitForServerUp: 127.0.0.1:51187
   [junit4]   2> 227626 INFO  (TEST-CloudSolrClientTest.preferLocalShardsTest-seed#[545782413F1883B0]) [     ] o.a.s.c.ZkTestServer parse host and port list: 127.0.0.1:51187
   [junit4]   2> 227626 INFO  (TEST-CloudSolrClientTest.preferLocalShardsTest-seed#[545782413F1883B0]) [     ] o.a.s.c.ZkTestServer connecting to 127.0.0.1 51187
   [junit4]   2> 227635 INFO  (TEST-CloudSolrClientTest.preferLocalShardsTest-seed#[545782413F1883B0]) [     ] o.a.s.c.c.ConnectionManager Waiting for client to connect to ZooKeeper
   [junit4]   2> 227653 INFO  (zkConnectionManagerCallback-1097-thread-1) [     ] o.a.s.c.c.ConnectionManager zkClient has connected
   [junit4]   2> 227653 INFO  (TEST-CloudSolrClientTest.preferLocalShardsTest-seed#[545782413F1883B0]) [     ] o.a.s.c.c.ConnectionManager Client is connected to ZooKeeper
   [junit4]   2> 227664 INFO  (TEST-CloudSolrClientTest.preferLocalShardsTest-seed#[545782413F1883B0]) [     ] o.a.s.c.c.ConnectionManager Waiting for client to connect to ZooKeeper
   [junit4]   2> 227679 INFO  (zkConnectionManagerCallback-1099-thread-1) [     ] o.a.s.c.c.ConnectionManager zkClient has connected
   [junit4]   2> 227679 INFO  (TEST-CloudSolrClientTest.preferLocalShardsTest-seed#[545782413F1883B0]) [     ] o.a.s.c.c.ConnectionManager Client is connected to ZooKeeper
   [junit4]   2> 227681 INFO  (TEST-CloudSolrClientTest.preferLocalShardsTest-seed#[545782413F1883B0]) [     ] o.a.s.c.c.ConnectionManager Waiting for client to connect to ZooKeeper
   [junit4]   2> 227685 INFO  (zkConnectionManagerCallback-1101-thread-1) [     ] o.a.s.c.c.ConnectionManager zkClient has connected
   [junit4]   2> 227685 INFO  (TEST-CloudSolrClientTest.preferLocalShardsTest-seed#[545782413F1883B0]) [     ] o.a.s.c.c.ConnectionManager Client is connected to ZooKeeper
   [junit4]   2> 227820 WARN  (jetty-launcher-1102-thread-2) [     ] o.e.j.s.h.g.GzipHandler minGzipSize of 0 is inefficient for short content, break even is size 23
   [junit4]   2> 227820 WARN  (jetty-launcher-1102-thread-1) [     ] o.e.j.s.h.g.GzipHandler minGzipSize of 0 is inefficient for short content, break even is size 23
   [junit4]   2> 227820 WARN  (jetty-launcher-1102-thread-3) [     ] o.e.j.s.h.g.GzipHandler minGzipSize of 0 is inefficient for short content, break even is size 23
   [junit4]   2> 227820 INFO  (jetty-launcher-1102-thread-2) [     ] o.a.s.c.s.e.JettySolrRunner Start Jetty (configured port=0, binding port=0)
   [junit4]   2> 227820 INFO  (jetty-launcher-1102-thread-2) [     ] o.a.s.c.s.e.JettySolrRunner Trying to start Jetty on port 0 try number 1 ...
   [junit4]   2> 227820 INFO  (jetty-launcher-1102-thread-3) [     ] o.a.s.c.s.e.JettySolrRunner Start Jetty (configured port=0, binding port=0)
   [junit4]   2> 227820 INFO  (jetty-launcher-1102-thread-3) [     ] o.a.s.c.s.e.JettySolrRunner Trying to start Jetty on port 0 try number 1 ...
   [junit4]   2> 227820 INFO  (jetty-launcher-1102-thread-1) [     ] o.a.s.c.s.e.JettySolrRunner Start Jetty (configured port=0, binding port=0)
   [junit4]   2> 227820 INFO  (jetty-launcher-1102-thread-2) [     ] o.e.j.s.Server jetty-9.4.24.v20191120; built: 2019-11-20T21:37:49.771Z; git: 363d5f2df3a8a28de40604320230664b9c793c16; jvm 11.0.4+11
   [junit4]   2> 227820 INFO  (jetty-launcher-1102-thread-1) [     ] o.a.s.c.s.e.JettySolrRunner Trying to start Jetty on port 0 try number 1 ...
   [junit4]   2> 227820 INFO  (jetty-launcher-1102-thread-3) [     ] o.e.j.s.Server jetty-9.4.24.v20191120; built: 2019-11-20T21:37:49.771Z; git: 363d5f2df3a8a28de40604320230664b9c793c16; jvm 11.0.4+11
   [junit4]   2> 227820 INFO  (jetty-launcher-1102-thread-1) [     ] o.e.j.s.Server jetty-9.4.24.v20191120; built: 2019-11-20T21:37:49.771Z; git: 363d5f2df3a8a28de40604320230664b9c793c16; jvm 11.0.4+11
   [junit4]   2> 227837 INFO  (jetty-launcher-1102-thread-3) [     ] o.e.j.s.session DefaultSessionIdManager workerName=node0
   [junit4]   2> 227837 INFO  (jetty-launcher-1102-thread-3) [     ] o.e.j.s.session No SessionScavenger set, using defaults
   [junit4]   2> 227837 INFO  (jetty-launcher-1102-thread-3) [     ] o.e.j.s.session node0 Scavenging every 600000ms
   [junit4]   2> 227840 INFO  (jetty-launcher-1102-thread-3) [     ] o.e.j.s.h.ContextHandler Started o.e.j.s.ServletContextHandler@12dde1f8{/solr,null,AVAILABLE}
   [junit4]   2> 227841 INFO  (jetty-launcher-1102-thread-2) [     ] o.e.j.s.session DefaultSessionIdManager workerName=node0
   [junit4]   2> 227841 INFO  (jetty-launcher-1102-thread-2) [     ] o.e.j.s.session No SessionScavenger set, using defaults
   [junit4]   2> 227841 INFO  (jetty-launcher-1102-thread-2) [     ] o.e.j.s.session node0 Scavenging every 600000ms
   [junit4]   2> 227842 INFO  (jetty-launcher-1102-thread-3) [     ] o.e.j.s.AbstractConnector Started ServerConnector@7f6f322{HTTP/1.1,[http/1.1, h2c]}{127.0.0.1:51194}
   [junit4]   2> 227842 INFO  (jetty-launcher-1102-thread-3) [     ] o.e.j.s.Server Started @227915ms
   [junit4]   2> 227842 INFO  (jetty-launcher-1102-thread-3) [     ] o.a.s.c.s.e.JettySolrRunner Jetty properties: {hostContext=/solr, hostPort=51194}
   [junit4]   2> 227843 INFO  (jetty-launcher-1102-thread-1) [     ] o.e.j.s.session DefaultSessionIdManager workerName=node0
   [junit4]   2> 227843 ERROR (jetty-launcher-1102-thread-3) [     ] o.a.s.u.StartupLoggingUtils Missing Java Option solr.log.dir. Logging may be missing or incomplete.
   [junit4]   2> 227843 INFO  (jetty-launcher-1102-thread-1) [     ] o.e.j.s.session No SessionScavenger set, using defaults
   [junit4]   2> 227843 INFO  (jetty-launcher-1102-thread-3) [     ] o.a.s.s.SolrDispatchFilter Using logger factory org.apache.logging.slf4j.Log4jLoggerFactory
   [junit4]   2> 227843 INFO  (jetty-launcher-1102-thread-1) [     ] o.e.j.s.session node0 Scavenging every 600000ms
   [junit4]   2> 227843 INFO  (jetty-launcher-1102-thread-3) [     ] o.a.s.s.SolrDispatchFilter  ___      _       Welcome to Apache Solr™ version 9.0.0
   [junit4]   2> 227843 INFO  (jetty-launcher-1102-thread-3) [     ] o.a.s.s.SolrDispatchFilter / __| ___| |_ _   Starting in cloud mode on port null
   [junit4]   2> 227843 INFO  (jetty-launcher-1102-thread-3) [     ] o.a.s.s.SolrDispatchFilter \__ \/ _ \ | '_|  Install dir: null
   [junit4]   2> 227843 INFO  (jetty-launcher-1102-thread-3) [     ] o.a.s.s.SolrDispatchFilter |___/\___/_|_|    Start time: 2020-01-16T16:53:21.837287Z
   [junit4]   2> 227844 INFO  (jetty-launcher-1102-thread-2) [     ] o.e.j.s.h.ContextHandler Started o.e.j.s.ServletContextHandler@78c07b44{/solr,null,AVAILABLE}
   [junit4]   2> 227844 INFO  (jetty-launcher-1102-thread-1) [     ] o.e.j.s.h.ContextHandler Started o.e.j.s.ServletContextHandler@7e229427{/solr,null,AVAILABLE}
   [junit4]   2> 227845 INFO  (jetty-launcher-1102-thread-3) [     ] o.a.s.c.c.ConnectionManager Waiting for client to connect to ZooKeeper
   [junit4]   2> 227847 INFO  (jetty-launcher-1102-thread-1) [     ] o.e.j.s.AbstractConnector Started ServerConnector@186bbffc{HTTP/1.1,[http/1.1, h2c]}{127.0.0.1:51196}
   [junit4]   2> 227847 INFO  (jetty-launcher-1102-thread-2) [     ] o.e.j.s.AbstractConnector Started ServerConnector@b532240{HTTP/1.1,[http/1.1, h2c]}{127.0.0.1:51195}
   [junit4]   2> 227847 INFO  (jetty-launcher-1102-thread-2) [     ] o.e.j.s.Server Started @227920ms
   [junit4]   2> 227847 INFO  (jetty-launcher-1102-thread-2) [     ] o.a.s.c.s.e.JettySolrRunner Jetty properties: {hostContext=/solr, hostPort=51195}
   [junit4]   2> 227847 INFO  (jetty-launcher-1102-thread-1) [     ] o.e.j.s.Server Started @227920ms
   [junit4]   2> 227847 INFO  (jetty-launcher-1102-thread-1) [     ] o.a.s.c.s.e.JettySolrRunner Jetty properties: {hostContext=/solr, hostPort=51196}
   [junit4]   2> 227847 ERROR (jetty-launcher-1102-thread-2) [     ] o.a.s.u.StartupLoggingUtils Missing Java Option solr.log.dir. Logging may be missing or incomplete.
   [junit4]   2> 227847 INFO  (jetty-launcher-1102-thread-2) [     ] o.a.s.s.SolrDispatchFilter Using logger factory org.apache.logging.slf4j.Log4jLoggerFactory
   [junit4]   2> 227848 ERROR (jetty-launcher-1102-thread-1) [     ] o.a.s.u.StartupLoggingUtils Missing Java Option solr.log.dir. Logging may be missing or incomplete.
   [junit4]   2> 227848 INFO  (jetty-launcher-1102-thread-1) [     ] o.a.s.s.SolrDispatchFilter Using logger factory org.apache.logging.slf4j.Log4jLoggerFactory
   [junit4]   2> 227848 INFO  (jetty-launcher-1102-thread-2) [     ] o.a.s.s.SolrDispatchFilter  ___      _       Welcome to Apache Solr™ version 9.0.0
   [junit4]   2> 227848 INFO  (jetty-launcher-1102-thread-2) [     ] o.a.s.s.SolrDispatchFilter / __| ___| |_ _   Starting in cloud mode on port null
   [junit4]   2> 227848 INFO  (jetty-launcher-1102-thread-2) [     ] o.a.s.s.SolrDispatchFilter \__ \/ _ \ | '_|  Install dir: null
   [junit4]   2> 227848 INFO  (jetty-launcher-1102-thread-2) [     ] o.a.s.s.SolrDispatchFilter |___/\___/_|_|    Start time: 2020-01-16T16:53:21.842261Z
   [junit4]   2> 227848 INFO  (jetty-launcher-1102-thread-1) [     ] o.a.s.s.SolrDispatchFilter  ___      _       Welcome to Apache Solr™ version 9.0.0
   [junit4]   2> 227848 INFO  (jetty-launcher-1102-thread-1) [     ] o.a.s.s.SolrDispatchFilter / __| ___| |_ _   Starting in cloud mode on port null
   [junit4]   2> 227848 INFO  (jetty-launcher-1102-thread-1) [     ] o.a.s.s.SolrDispatchFilter \__ \/ _ \ | '_|  Install dir: null
   [junit4]   2> 227848 INFO  (jetty-launcher-1102-thread-1) [     ] o.a.s.s.SolrDispatchFilter |___/\___/_|_|    Start time: 2020-01-16T16:53:21.842377Z
   [junit4]   2> 227856 INFO  (jetty-launcher-1102-thread-2) [     ] o.a.s.c.c.ConnectionManager Waiting for client to connect to ZooKeeper
   [junit4]   2> 227858 INFO  (jetty-launcher-1102-thread-1) [     ] o.a.s.c.c.ConnectionManager Waiting for client to connect to ZooKeeper
   [junit4]   2> 227864 INFO  (zkConnectionManagerCallback-1104-thread-1) [     ] o.a.s.c.c.ConnectionManager zkClient has connected
   [junit4]   2> 227865 INFO  (jetty-launcher-1102-thread-3) [     ] o.a.s.c.c.ConnectionManager Client is connected to ZooKeeper
   [junit4]   2> 227866 INFO  (zkConnectionManagerCallback-1108-thread-1) [     ] o.a.s.c.c.ConnectionManager zkClient has connected
   [junit4]   2> 227867 INFO  (jetty-launcher-1102-thread-1) [     ] o.a.s.c.c.ConnectionManager Client is connected to ZooKeeper
   [junit4]   2> 227867 INFO  (jetty-launcher-1102-thread-3) [     ] o.a.s.s.SolrDispatchFilter solr.xml found in ZooKeeper. Loading...
   [junit4]   2> 227868 INFO  (zkConnectionManagerCallback-1106-thread-1) [     ] o.a.s.c.c.ConnectionManager zkClient has connected
   [junit4]   2> 227868 INFO  (jetty-launcher-1102-thread-2) [     ] o.a.s.c.c.ConnectionManager Client is connected to ZooKeeper
   [junit4]   2> 227869 INFO  (jetty-launcher-1102-thread-1) [     ] o.a.s.s.SolrDispatchFilter solr.xml found in ZooKeeper. Loading...
   [junit4]   2> 227871 INFO  (jetty-launcher-1102-thread-2) [     ] o.a.s.s.SolrDispatchFilter solr.xml found in ZooKeeper. Loading...
   [junit4]   2> 228162 INFO  (jetty-launcher-1102-thread-3) [     ] o.a.s.h.c.HttpShardHandlerFactory Host whitelist initialized: WhitelistHostChecker [whitelistHosts=null, whitelistHostCheckingEnabled=true]
   [junit4]   2> 228162 INFO  (jetty-launcher-1102-thread-1) [     ] o.a.s.h.c.HttpShardHandlerFactory Host whitelist initialized: WhitelistHostChecker [whitelistHosts=null, whitelistHostCheckingEnabled=true]
   [junit4]   2> 228166 WARN  (jetty-launcher-1102-thread-1) [     ] o.e.j.u.s.S.config Trusting all certificates configured for Client@39e35a29[provider=null,keyStore=null,trustStore=null]
   [junit4]   2> 228166 WARN  (jetty-launcher-1102-thread-1) [     ] o.e.j.u.s.S.config No Client EndPointIdentificationAlgorithm configured for Client@39e35a29[provider=null,keyStore=null,trustStore=null]
   [junit4]   2> 228168 WARN  (jetty-launcher-1102-thread-3) [     ] o.e.j.u.s.S.config Trusting all certificates configured for Client@66b28529[provider=null,keyStore=null,trustStore=null]
   [junit4]   2> 228168 WARN  (jetty-launcher-1102-thread-3) [     ] o.e.j.u.s.S.config No Client EndPointIdentificationAlgorithm configured for Client@66b28529[provider=null,keyStore=null,trustStore=null]
   [junit4]   2> 228171 WARN  (jetty-launcher-1102-thread-1) [     ] o.e.j.u.s.S.config Trusting all certificates configured for Client@413a7d33[provider=null,keyStore=null,trustStore=null]
   [junit4]   2> 228171 WARN  (jetty-launcher-1102-thread-1) [     ] o.e.j.u.s.S.config No Client EndPointIdentificationAlgorithm configured for Client@413a7d33[provider=null,keyStore=null,trustStore=null]
   [junit4]   2> 228175 INFO  (jetty-launcher-1102-thread-1) [     ] o.a.s.c.ZkContainer Zookeeper client=127.0.0.1:51187/solr
   [junit4]   2> 228184 WARN  (jetty-launcher-1102-thread-3) [     ] o.e.j.u.s.S.config Trusting all certificates configured for Client@4cb2bd7f[provider=null,keyStore=null,trustStore=null]
   [junit4]   2> 228185 WARN  (jetty-launcher-1102-thread-3) [     ] o.e.j.u.s.S.config No Client EndPointIdentificationAlgorithm configured for Client@4cb2bd7f[provider=null,keyStore=null,trustStore=null]
   [junit4]   2> 228187 INFO  (jetty-launcher-1102-thread-3) [     ] o.a.s.c.ZkContainer Zookeeper client=127.0.0.1:51187/solr
   [junit4]   2> 228192 INFO  (jetty-launcher-1102-thread-1) [     ] o.a.s.c.c.ConnectionManager Waiting for client to connect to ZooKeeper
   [junit4]   2> 228200 INFO  (jetty-launcher-1102-thread-3) [     ] o.a.s.c.c.ConnectionManager Waiting for client to connect to ZooKeeper
   [junit4]   2> 228203 INFO  (zkConnectionManagerCallback-1118-thread-1) [     ] o.a.s.c.c.ConnectionManager zkClient has connected
   [junit4]   2> 228203 INFO  (jetty-launcher-1102-thread-1) [     ] o.a.s.c.c.ConnectionManager Client is connected to ZooKeeper
   [junit4]   2> 228206 INFO  (zkConnectionManagerCallback-1122-thread-1) [     ] o.a.s.c.c.ConnectionManager zkClient has connected
   [junit4]   2> 228207 INFO  (jetty-launcher-1102-thread-3) [     ] o.a.s.c.c.ConnectionManager Client is connected to ZooKeeper
   [junit4]   2> 228315 INFO  (jetty-launcher-1102-thread-1) [n:127.0.0.1:51196_solr     ] o.a.s.c.c.ConnectionManager Waiting for client to connect to ZooKeeper
   [junit4]   2> 228315 INFO  (zkConnectionManagerCallback-1125-thread-1) [     ] o.a.s.c.c.ConnectionManager zkClient has connected
   [junit4]   2> 228318 INFO  (jetty-launcher-1102-thread-1) [n:127.0.0.1:51196_solr     ] o.a.s.c.c.ConnectionManager Client is connected to ZooKeeper
   [junit4]   2> 228411 INFO  (jetty-launcher-1102-thread-2) [     ] o.a.s.h.c.HttpShardHandlerFactory Host whitelist initialized: WhitelistHostChecker [whitelistHosts=null, whitelistHostCheckingEnabled=true]
   [junit4]   2> 228417 WARN  (jetty-launcher-1102-thread-2) [     ] o.e.j.u.s.S.config Trusting all certificates configured for Client@72008fa5[provider=null,keyStore=null,trustStore=null]
   [junit4]   2> 228418 WARN  (jetty-launcher-1102-thread-2) [     ] o.e.j.u.s.S.config No Client EndPointIdentificationAlgorithm configured for Client@72008fa5[provider=null,keyStore=null,trustStore=null]
   [junit4]   2> 228421 INFO  (jetty-launcher-1102-thread-3) [n:127.0.0.1:51194_solr     ] o.a.s.c.c.ConnectionManager Waiting for client to connect to ZooKeeper
   [junit4]   2> 228431 INFO  (zkConnectionManagerCallback-1128-thread-1) [     ] o.a.s.c.c.ConnectionManager zkClient has connected
   [junit4]   2> 228432 INFO  (jetty-launcher-1102-thread-3) [n:127.0.0.1:51194_solr     ] o.a.s.c.c.ConnectionManager Client is connected to ZooKeeper
   [junit4]   2> 228432 WARN  (jetty-launcher-1102-thread-2) [     ] o.e.j.u.s.S.config Trusting all certificates configured for Client@43f9acd2[provider=null,keyStore=null,trustStore=null]
   [junit4]   2> 228432 WARN  (jetty-launcher-1102-thread-2) [     ] o.e.j.u.s.S.config No Client EndPointIdentificationAlgorithm configured for Client@43f9acd2[provider=null,keyStore=null,trustStore=null]
   [junit4]   2> 228434 INFO  (jetty-launcher-1102-thread-2) [     ] o.a.s.c.ZkContainer Zookeeper client=127.0.0.1:51187/solr
   [junit4]   2> 228436 INFO  (jetty-launcher-1102-thread-2) [     ] o.a.s.c.c.ConnectionManager Waiting for client to connect to ZooKeeper
   [junit4]   2> 228443 INFO  (zkConnectionManagerCallback-1133-thread-1) [     ] o.a.s.c.c.ConnectionManager zkClient has connected
   [junit4]   2> 228443 INFO  (jetty-launcher-1102-thread-2) [     ] o.a.s.c.c.ConnectionManager Client is connected to ZooKeeper
   [junit4]   2> 228555 INFO  (jetty-launcher-1102-thread-2) [n:127.0.0.1:51195_solr     ] o.a.s.c.c.ConnectionManager Waiting for client to connect to ZooKeeper
   [junit4]   2> 228565 INFO  (zkConnectionManagerCallback-1139-thread-1) [     ] o.a.s.c.c.ConnectionManager zkClient has connected
   [junit4]   2> 228566 INFO  (jetty-launcher-1102-thread-2) [n:127.0.0.1:51195_solr     ] o.a.s.c.c.ConnectionManager Client is connected to ZooKeeper
   [junit4]   2> 228596 WARN  (jetty-launcher-1102-thread-3) [n:127.0.0.1:51194_solr     ] o.a.s.c.ZkController The _default configset could not be uploaded. Please provide 'solr.default.confdir' parameter that points to a configset intended to be the default. Current 'solr.default.confdir' value: null
   [junit4]   2> 228596 WARN  (jetty-launcher-1102-thread-2) [n:127.0.0.1:51195_solr     ] o.a.s.c.ZkController The _default configset could not be uploaded. Please provide 'solr.default.confdir' parameter that points to a configset intended to be the default. Current 'solr.default.confdir' value: null
   [junit4]   2> 228598 WARN  (jetty-launcher-1102-thread-1) [n:127.0.0.1:51196_solr     ] o.a.s.c.ZkController The _default configset could not be uploaded. Please provide 'solr.default.confdir' parameter that points to a configset intended to be the default. Current 'solr.default.confdir' value: null
   [junit4]   2> 228629 INFO  (jetty-launcher-1102-thread-3) [n:127.0.0.1:51194_solr     ] o.a.s.c.OverseerElectionContext I am going to be the leader 127.0.0.1:51194_solr
   [junit4]   2> 228644 INFO  (jetty-launcher-1102-thread-1) [n:127.0.0.1:51196_solr     ] o.a.s.c.ZkController Register node as live in ZooKeeper:/live_nodes/127.0.0.1:51196_solr
   [junit4]   2> 228644 INFO  (jetty-launcher-1102-thread-3) [n:127.0.0.1:51194_solr     ] o.a.s.c.Overseer Overseer (id=72059403128340489-127.0.0.1:51194_solr-n_0000000000) starting
   [junit4]   2> 228647 INFO  (jetty-launcher-1102-thread-2) [n:127.0.0.1:51195_solr     ] o.a.s.c.ZkController Register node as live in ZooKeeper:/live_nodes/127.0.0.1:51195_solr
   [junit4]   2> 228651 INFO  (zkCallback-1138-thread-1) [     ] o.a.s.c.c.ZkStateReader Updated live nodes from ZooKeeper... (0) -> (1)
   [junit4]   2> 228656 INFO  (zkCallback-1127-thread-1) [     ] o.a.s.c.c.ZkStateReader Updated live nodes from ZooKeeper... (0) -> (1)
   [junit4]   2> 228662 INFO  (zkCallback-1124-thread-1) [     ] o.a.s.c.c.ZkStateReader Updated live nodes from ZooKeeper... (0) -> (2)
   [junit4]   2> 228667 INFO  (zkCallback-1127-thread-1) [     ] o.a.s.c.c.ZkStateReader Updated live nodes from ZooKeeper... (1) -> (2)
   [junit4]   2> 228667 INFO  (zkCallback-1138-thread-1) [     ] o.a.s.c.c.ZkStateReader Updated live nodes from ZooKeeper... (1) -> (2)
   [junit4]   2> 228679 INFO  (jetty-launcher-1102-thread-2) [n:127.0.0.1:51195_solr     ] o.a.s.p.PackageLoader /packages.json updated to version -1
   [junit4]   2> 228680 INFO  (jetty-launcher-1102-thread-1) [n:127.0.0.1:51196_solr     ] o.a.s.p.PackageLoader /packages.json updated to version -1
   [junit4]   2> 228681 WARN  (jetty-launcher-1102-thread-1) [n:127.0.0.1:51196_solr     ] o.a.s.c.CoreContainer Not all security plugins configured!  authentication=disabled authorization=disabled.  Solr is only as secure as you make it. Consider configuring authentication/authorization before exposing Solr to users internal or external.  See https://s.apache.org/solrsecurity for more info
   [junit4]   2> 228684 WARN  (jetty-launcher-1102-thread-2) [n:127.0.0.1:51195_solr     ] o.a.s.c.CoreContainer Not all security plugins configured!  authentication=disabled authorization=disabled.  Solr is only as secure as you make it. Consider configuring authentication/authorization before exposing Solr to users internal or external.  See https://s.apache.org/solrsecurity for more info
   [junit4]   2> 228702 INFO  (OverseerStateUpdate-72059403128340489-127.0.0.1:51194_solr-n_0000000000) [n:127.0.0.1:51194_solr     ] o.a.s.c.Overseer Starting to work on the main queue : 127.0.0.1:51194_solr
   [junit4]   2> 228707 INFO  (jetty-launcher-1102-thread-3) [n:127.0.0.1:51194_solr     ] o.a.s.c.ZkController Publish node=127.0.0.1:51194_solr as DOWN
   [junit4]   2> 228718 INFO  (jetty-launcher-1102-thread-1) [n:127.0.0.1:51196_solr     ] o.a.s.h.a.MetricsHistoryHandler No .system collection, keeping metrics history in memory.
   [junit4]   2> 228723 INFO  (jetty-launcher-1102-thread-3) [n:127.0.0.1:51194_solr     ] o.a.s.c.TransientSolrCoreCacheDefault Allocating transient cache for 2147483647 transient cores
   [junit4]   2> 228727 INFO  (jetty-launcher-1102-thread-3) [n:127.0.0.1:51194_solr     ] o.a.s.c.ZkController Register node as live in ZooKeeper:/live_nodes/127.0.0.1:51194_solr
   [junit4]   2> 228737 INFO  (zkCallback-1138-thread-1) [     ] o.a.s.c.c.ZkStateReader Updated live nodes from ZooKeeper... (2) -> (3)
   [junit4]   2> 228738 INFO  (zkCallback-1127-thread-1) [     ] o.a.s.c.c.ZkStateReader Updated live nodes from ZooKeeper... (2) -> (3)
   [junit4]   2> 228740 INFO  (zkCallback-1124-thread-1) [     ] o.a.s.c.c.ZkStateReader Updated live nodes from ZooKeeper... (2) -> (3)
   [junit4]   2> 228759 INFO  (jetty-launcher-1102-thread-1) [n:127.0.0.1:51196_solr     ] o.a.s.m.r.SolrJmxReporter JMX monitoring for 'solr_51196.solr.node' (registry 'solr.node') enabled at server: com.sun.jmx.mbeanserver.JmxMBeanServer@42dbfcf0
   [junit4]   2> 228771 INFO  (jetty-launcher-1102-thread-3) [n:127.0.0.1:51194_solr     ] o.a.s.p.PackageLoader /packages.json updated to version -1
   [junit4]   2> 228776 WARN  (jetty-launcher-1102-thread-3) [n:127.0.0.1:51194_solr     ] o.a.s.c.CoreContainer Not all security plugins configured!  authentication=disabled authorization=disabled.  Solr is only as secure as you make it. Consider configuring authentication/authorization before exposing Solr to users internal or external.  See https://s.apache.org/solrsecurity for more info
   [junit4]   2> 228788 INFO  (jetty-launcher-1102-thread-1) [n:127.0.0.1:51196_solr     ] o.a.s.m.r.SolrJmxReporter JMX monitoring for 'solr_51196.solr.jvm' (registry 'solr.jvm') enabled at server: com.sun.jmx.mbeanserver.JmxMBeanServer@42dbfcf0
   [junit4]   2> 228796 INFO  (jetty-launcher-1102-thread-1) [n:127.0.0.1:51196_solr     ] o.a.s.m.r.SolrJmxReporter JMX monitoring for 'solr_51196.solr.jetty' (registry 'solr.jetty') enabled at server: com.sun.jmx.mbeanserver.JmxMBeanServer@42dbfcf0
   [junit4]   2> 228800 INFO  (jetty-launcher-1102-thread-1) [n:127.0.0.1:51196_solr     ] o.a.s.c.CorePropertiesLocator Found 0 core definitions underneath /Users/jenkins/workspace/Lucene-Solr-master-MacOSX/solr/build/solr-solrj/test/J3/temp/solr.client.solrj.impl.CloudSolrClientTest_545782413F1883B0-001/tempDir-001/node1/.
   [junit4]   2> 228804 INFO  (jetty-launcher-1102-thread-2) [n:127.0.0.1:51195_solr     ] o.a.s.h.a.MetricsHistoryHandler No .system collection, keeping metrics history in memory.
   [junit4]   2> 228807 INFO  (jetty-launcher-1102-thread-3) [n:127.0.0.1:51194_solr     ] o.a.s.h.a.MetricsHistoryHandler No .system collection, keeping metrics history in memory.
   [junit4]   2> 228939 INFO  (jetty-launcher-1102-thread-3) [n:127.0.0.1:51194_solr     ] o.a.s.m.r.SolrJmxReporter JMX monitoring for 'solr_51194.solr.node' (registry 'solr.node') enabled at server: com.sun.jmx.mbeanserver.JmxMBeanServer@42dbfcf0
   [junit4]   2> 228948 INFO  (jetty-launcher-1102-thread-2) [n:127.0.0.1:51195_solr     ] o.a.s.m.r.SolrJmxReporter JMX monitoring for 'solr_51195.solr.node' (registry 'solr.node') enabled at server: com.sun.jmx.mbeanserver.JmxMBeanServer@42dbfcf0
   [junit4]   2> 228972 INFO  (jetty-launcher-1102-thread-3) [n:127.0.0.1:51194_solr     ] o.a.s.m.r.SolrJmxReporter JMX monitoring for 'solr_51194.solr.jvm' (registry 'solr.jvm') enabled at server: com.sun.jmx.mbeanserver.JmxMBeanServer@42dbfcf0
   [junit4]   2> 228973 INFO  (jetty-launcher-1102-thread-3) [n:127.0.0.1:51194_solr     ] o.a.s.m.r.SolrJmxReporter JMX monitoring for 'solr_51194.solr.jetty' (registry 'solr.jetty') enabled at server: com.sun.jmx.mbeanserver.JmxMBeanServer@42dbfcf0
   [junit4]   2> 228980 INFO  (jetty-launcher-1102-thread-3) [n:127.0.0.1:51194_solr     ] o.a.s.c.CorePropertiesLocator Found 0 core definitions underneath /Users/jenkins/workspace/Lucene-Solr-master-MacOSX/solr/build/solr-solrj/test/J3/temp/solr.client.solrj.impl.CloudSolrClientTest_545782413F1883B0-001/tempDir-001/node2/.
   [junit4]   2> 228984 INFO  (jetty-launcher-1102-thread-2) [n:127.0.0.1:51195_solr     ] o.a.s.m.r.SolrJmxReporter JMX monitoring for 'solr_51195.solr.jvm' (registry 'solr.jvm') enabled at server: com.sun.jmx.mbeanserver.JmxMBeanServer@42dbfcf0
   [junit4]   2> 228984 INFO  (jetty-launcher-1102-thread-2) [n:127.0.0.1:51195_solr     ] o.a.s.m.r.SolrJmxReporter JMX monitoring for 'solr_51195.solr.jetty' (registry 'solr.jetty') enabled at server: com.sun.jmx.mbeanserver.JmxMBeanServer@42dbfcf0
   [junit4]   2> 228985 INFO  (jetty-launcher-1102-thread-2) [n:127.0.0.1:51195_solr     ] o.a.s.c.CorePropertiesLocator Found 0 core definitions underneath /Users/jenkins/workspace/Lucene-Solr-master-MacOSX/solr/build/solr-solrj/test/J3/temp/solr.client.solrj.impl.CloudSolrClientTest_545782413F1883B0-001/tempDir-001/node3/.
   [junit4]   2> 229154 INFO  (TEST-CloudSolrClientTest.preferLocalShardsTest-seed#[545782413F1883B0]) [     ] o.a.s.c.MiniSolrCloudCluster waitForAllNodes: numServers=3
   [junit4]   2> 229155 INFO  (TEST-CloudSolrClientTest.preferLocalShardsTest-seed#[545782413F1883B0]) [     ] o.a.s.c.c.ConnectionManager Waiting for client to connect to ZooKeeper
   [junit4]   2> 229159 INFO  (zkConnectionManagerCallback-1152-thread-1) [     ] o.a.s.c.c.ConnectionManager zkClient has connected
   [junit4]   2> 229160 INFO  (TEST-CloudSolrClientTest.preferLocalShardsTest-seed#[545782413F1883B0]) [     ] o.a.s.c.c.ConnectionManager Client is connected to ZooKeeper
   [junit4]   2> 229167 INFO  (TEST-CloudSolrClientTest.preferLocalShardsTest-seed#[545782413F1883B0]) [     ] o.a.s.c.c.ZkStateReader Updated live nodes from ZooKeeper... (0) -> (3)
   [junit4]   2> 229176 INFO  (TEST-CloudSolrClientTest.preferLocalShardsTest-seed#[545782413F1883B0]) [     ] o.a.s.c.s.i.ZkClientClusterStateProvider Cluster at 127.0.0.1:51187/solr ready
   [junit4]   2> 229194 INFO  (qtp380388185-2776) [n:127.0.0.1:51196_solr     ] o.a.s.h.a.CollectionsHandler Invoked Collection Action :clusterstatus with params action=CLUSTERSTATUS&wt=javabin&version=2 and sendToOCPQueue=true
   [junit4]   2> 229198 INFO  (qtp380388185-2776) [n:127.0.0.1:51196_solr     ] o.a.s.s.HttpSolrCall [admin] webapp=null path=/admin/collections params={action=CLUSTERSTATUS&wt=javabin&version=2} status=0 QTime=4
   [junit4]   2> 229202 INFO  (qtp154209396-2772) [n:127.0.0.1:51194_solr     ] o.a.s.h.a.CollectionsHandler Invoked Collection Action :create with params async=2a6f772d-abb1-4d38-9a9f-4dcd689c248e&collection.configName=conf&maxShardsPerNode=9&name=localShardsTestColl&nrtReplicas=3&action=CREATE&numShards=3&wt=javabin&version=2 and sendToOCPQueue=true
   [junit4]   2> 229213 INFO  (qtp154209396-2772) [n:127.0.0.1:51194_solr     ] o.a.s.s.HttpSolrCall [admin] webapp=null path=/admin/collections params={async=2a6f772d-abb1-4d38-9a9f-4dcd689c248e&collection.configName=conf&maxShardsPerNode=9&name=localShardsTestColl&nrtReplicas=3&action=CREATE&numShards=3&wt=javabin&version=2} status=0 QTime=10
   [junit4]   2> 229215 INFO  (qtp154209396-2773) [n:127.0.0.1:51194_solr     ] o.a.s.h.a.CollectionsHandler Invoked Collection Action :requeststatus with params requestid=2a6f772d-abb1-4d38-9a9f-4dcd689c248e&action=REQUESTSTATUS&wt=javabin&version=2 and sendToOCPQueue=true
   [junit4]   2> 229222 INFO  (OverseerThreadFactory-595-thread-1-processing-n:127.0.0.1:51194_solr) [n:127.0.0.1:51194_solr     ] o.a.s.c.a.c.CreateCollectionCmd Create collection localShardsTestColl
   [junit4]   2> 229222 INFO  (qtp154209396-2773) [n:127.0.0.1:51194_solr     ] o.a.s.s.HttpSolrCall [admin] webapp=null path=/admin/collections params={requestid=2a6f772d-abb1-4d38-9a9f-4dcd689c248e&action=REQUESTSTATUS&wt=javabin&version=2} status=0 QTime=7
   [junit4]   2> 229351 INFO  (OverseerStateUpdate-72059403128340489-127.0.0.1:51194_solr-n_0000000000) [n:127.0.0.1:51194_solr     ] o.a.s.c.o.SliceMutator createReplica() {
   [junit4]   2>   "operation":"ADDREPLICA",
   [junit4]   2>   "collection":"localShardsTestColl",
   [junit4]   2>   "shard":"shard1",
   [junit4]   2>   "core":"localShardsTestColl_shard1_replica_n1",
   [junit4]   2>   "state":"down",
   [junit4]   2>   "base_url":"http://127.0.0.1:51196/solr",
   [junit4]   2>   "type":"NRT",
   [junit4]   2>   "waitForFinalState":"false"} 
   [junit4]   2> 229360 INFO  (OverseerStateUpdate-72059403128340489-127.0.0.1:51194_solr-n_0000000000) [n:127.0.0.1:51194_solr     ] o.a.s.c.o.SliceMutator createReplica() {
   [junit4]   2>   "operation":"ADDREPLICA",
   [junit4]   2>   "collection":"localShardsTestColl",
   [junit4]   2>   "shard":"shard1",
   [junit4]   2>   "core":"localShardsTestColl_shard1_replica_n2",
   [junit4]   2>   "state":"down",
   [junit4]   2>   "base_url":"http://127.0.0.1:51194/solr",
   [junit4]   2>   "type":"NRT",
   [junit4]   2>   "waitForFinalState":"false"} 
   [junit4]   2> 229370 INFO  (OverseerStateUpdate-72059403128340489-127.0.0.1:51194_solr-n_0000000000) [n:127.0.0.1:51194_solr     ] o.a.s.c.o.SliceMutator createReplica() {
   [junit4]   2>   "operation":"ADDREPLICA",
   [junit4]   2>   "collection":"localShardsTestColl",
   [junit4]   2>   "shard":"shard1",
   [junit4]   2>   "core":"localShardsTestColl_shard1_replica_n4",
   [junit4]   2>   "state":"down",
   [junit4]   2>   "base_url":"http://127.0.0.1:51195/solr",
   [junit4]   2>   "type":"NRT",
   [junit4]   2>   "waitForFinalState":"false"} 
   [junit4]   2> 229381 INFO  (OverseerStateUpdate-72059403128340489-127.0.0.1:51194_solr-n_0000000000) [n:127.0.0.1:51194_solr     ] o.a.s.c.o.SliceMutator createReplica() {
   [junit4]   2>   "operation":"ADDREPLICA",
   [junit4]   2>   "collection":"localShardsTestColl",
   [junit4]   2>   "shard":"shard2",
   [junit4]   2>   "core":"localShardsTestColl_shard2_replica_n6",
   [junit4]   2>   "state":"down",
   [junit4]   2>   "base_url":"http://127.0.0.1:51196/solr",
   [junit4]   2>   "type":"NRT",
   [junit4]   2>   "waitForFinalState":"false"} 
   [junit4]   2> 229390 INFO  (OverseerStateUpdate-72059403128340489-127.0.0.1:51194_solr-n_0000000000) [n:127.0.0.1:51194_solr     ] o.a.s.c.o.SliceMutator createReplica() {
   [junit4]   2>   "operation":"ADDREPLICA",
   [junit4]   2>   "collection":"localShardsTestColl",
   [junit4]   2>   "shard":"shard2",
   [junit4]   2>   "core":"localShardsTestColl_shard2_replica_n8",
   [junit4]   2>   "state":"down",
   [junit4]   2>   "base_url":"http://127.0.0.1:51194/solr",
   [junit4]   2>   "type":"NRT",
   [junit4]   2>   "waitForFinalState":"false"} 
   [junit4]   2> 229401 INFO  (OverseerStateUpdate-72059403128340489-127.0.0.1:51194_solr-n_0000000000) [n:127.0.0.1:51194_solr     ] o.a.s.c.o.SliceMutator createReplica() {
   [junit4]   2>   "operation":"ADDREPLICA",
   [junit4]   2>   "collection":"localShardsTestColl",
   [junit4]   2>   "shard":"shard2",
   [junit4]   2>   "core":"localShardsTestColl_shard2_replica_n10",
   [junit4]   2>   "state":"down",
   [junit4]   2>   "base_url":"http://127.0.0.1:51195/solr",
   [junit4]   2>   "type":"NRT",
   [junit4]   2>   "waitForFinalState":"false"} 
   [junit4]   2> 229410 INFO  (OverseerStateUpdate-72059403128340489-127.0.0.1:51194_solr-n_0000000000) [n:127.0.0.1:51194_solr     ] o.a.s.c.o.SliceMutator createReplica() {
   [junit4]   2>   "operation":"ADDREPLICA",
   [junit4]   2>   "collection":"localShardsTestColl",
   [junit4]   2>   "shard":"shard3",
   [junit4]   2>   "core":"localShardsTestColl_shard3_replica_n12",
   [junit4]   2>   "state":"down",
   [junit4]   2>   "base_url":"http://127.0.0.1:51196/solr",
   [junit4]   2>   "type":"NRT",
   [junit4]   2>   "waitForFinalState":"false"} 
   [junit4]   2> 229416 INFO  (OverseerStateUpdate-72059403128340489-127.0.0.1:51194_solr-n_0000000000) [n:127.0.0.1:51194_solr     ] o.a.s.c.o.SliceMutator createReplica() {
   [junit4]   2>   "operation":"ADDREPLICA",
   [junit4]   2>   "collection":"localShardsTestColl",
   [junit4]   2>   "shard":"shard3",
   [junit4]   2>   "core":"localShardsTestColl_shard3_replica_n14",
   [junit4]   2>   "state":"down",
   [junit4]   2>   "base_url":"http://127.0.0.1:51194/solr",
   [junit4]   2>   "type":"NRT",
   [junit4]   2>   "waitForFinalState":"false"} 
   [junit4]   2> 229425 INFO  (OverseerStateUpdate-72059403128340489-127.0.0.1:51194_solr-n_0000000000) [n:127.0.0.1:51194_solr     ] o.a.s.c.o.SliceMutator createReplica() {
   [junit4]   2>   "operation":"ADDREPLICA",
   [junit4]   2>   "collection":"localShardsTestColl",
   [junit4]   2>   "shard":"shard3",
   [junit4]   2>   "core":"localShardsTestColl_shard3_replica_n16",
   [junit4]   2>   "state":"down",
   [junit4]   2>   "base_url":"http://127.0.0.1:51195/solr",
   [junit4]   2>   "type":"NRT",
   [junit4]   2>   "waitForFinalState":"false"} 
   [junit4]   2> 229647 INFO  (qtp380388185-2780) [n:127.0.0.1:51196_solr    x:localShardsTestColl_shard1_replica_n1 ] o.a.s.s.HttpSolrCall [admin] webapp=null path=/admin/cores params={qt=/admin/cores&collection.configName=conf&newCollection=true&collection=localShardsTestColl&version=2&replicaType=NRT&async=2a6f772d-abb1-4d38-9a9f-4dcd689c248e27606355158026&coreNodeName=core_node3&name=localShardsTestColl_shard1_replica_n1&action=CREATE&numShards=3&shard=shard1&wt=javabin} status=0 QTime=0
   [junit4]   2> 229647 INFO  (parallelCoreAdminExecutor-596-thread-1-processing-n:127.0.0.1:51196_solr x:localShardsTestColl_shard1_replica_n1 2a6f772d-abb1-4d38-9a9f-4dcd689c248e27606355158026 CREATE) [n:127.0.0.1:51196_solr    x:localShardsTestColl_shard1_replica_n1 ] o.a.s.h.a.CoreAdminOperation core create command qt=/admin/cores&collection.configName=conf&newCollection=true&collection=localShardsTestColl&version=2&replicaType=NRT&async=2a6f772d-abb1-4d38-9a9f-4dcd689c248e27606355158026&coreNodeName=core_node3&name=localShardsTestColl_shard1_replica_n1&action=CREATE&numShards=3&shard=shard1&wt=javabin
   [junit4]   2> 229647 INFO  (parallelCoreAdminExecutor-596-thread-1-processing-n:127.0.0.1:51196_solr x:localShardsTestColl_shard1_replica_n1 2a6f772d-abb1-4d38-9a9f-4dcd689c248e27606355158026 CREATE) [n:127.0.0.1:51196_solr    x:localShardsTestColl_shard1_replica_n1 ] o.a.s.c.TransientSolrCoreCacheDefault Allocating transient cache for 2147483647 transient cores
   [junit4]   2> 229652 INFO  (qtp154209396-2771) [n:127.0.0.1:51194_solr    x:localShardsTestColl_shard1_replica_n2 ] o.a.s.s.HttpSolrCall [admin] webapp=null path=/admin/cores params={qt=/admin/cores&collection.configName=conf&newCollection=true&collection=localShardsTestColl&version=2&replicaType=NRT&async=2a6f772d-abb1-4d38-9a9f-4dcd689c248e27606364152994&coreNodeName=core_node5&name=localShardsTestColl_shard1_replica_n2&action=CREATE&numShards=3&shard=shard1&wt=javabin} status=0 QTime=5
   [junit4]   2> 229653 INFO  (parallelCoreAdminExecutor-600-thread-1-processing-n:127.0.0.1:51194_solr x:localShardsTestColl_shard1_replica_n2 2a6f772d-abb1-4d38-9a9f-4dcd689c248e27606364152994 CREATE) [n:127.0.0.1:51194_solr    x:localShardsTestColl_shard1_replica_n2 ] o.a.s.h.a.CoreAdminOperation core create command qt=/admin/cores&collection.configName=conf&newCollection=true&collection=localShardsTestColl&version=2&replicaType=NRT&async=2a6f772d-abb1-4d38-9a9f-4dcd689c248e27606364152994&coreNodeName=core_node5&name=localShardsTestColl_shard1_replica_n2&action=CREATE&numShards=3&shard=shard1&wt=javabin
   [junit4]   2> 229653 INFO  (qtp93641858-2769) [n:127.0.0.1:51195_solr    x:localShardsTestColl_shard1_replica_n4 ] o.a.s.s.HttpSolrCall [admin] webapp=null path=/admin/cores params={qt=/admin/cores&collection.configName=conf&newCollection=true&collection=localShardsTestColl&version=2&replicaType=NRT&async=2a6f772d-abb1-4d38-9a9f-4dcd689c248e27606373211321&coreNodeName=core_node7&name=localShardsTestColl_shard1_replica_n4&action=CREATE&numShards=3&shard=shard1&wt=javabin} status=0 QTime=1
   [junit4]   2> 229657 INFO  (parallelCoreAdminExecutor-599-thread-1-processing-n:127.0.0.1:51195_solr x:localShardsTestColl_shard1_replica_n4 2a6f772d-abb1-4d38-9a9f-4dcd689c248e27606373211321 CREATE) [n:127.0.0.1:51195_solr    x:localShardsTestColl_shard1_replica_n4 ] o.a.s.h.a.CoreAdminOperation core create command qt=/admin/cores&collection.configName=conf&newCollection=true&collection=localShardsTestColl&version=2&replicaType=NRT&async=2a6f772d-abb1-4d38-9a9f-4dcd689c248e27606373211321&coreNodeName=core_node7&name=localShardsTestColl_shard1_replica_n4&action=CREATE&numShards=3&shard=shard1&wt=javabin
   [junit4]   2> 229659 INFO  (parallelCoreAdminExecutor-599-thread-1-processing-n:127.0.0.1:51195_solr x:localShardsTestColl_shard1_replica_n4 2a6f772d-abb1-4d38-9a9f-4dcd689c248e27606373211321 CREATE) [n:127.0.0.1:51195_solr    x:localShardsTestColl_shard1_replica_n4 ] o.a.s.c.TransientSolrCoreCacheDefault Allocating transient cache for 2147483647 transient cores
   [junit4]   2> 229660 INFO  (qtp154209396-2773) [n:127.0.0.1:51194_solr    x:localShardsTestColl_shard2_replica_n8 ] o.a.s.s.HttpSolrCall [admin] webapp=null path=/admin/cores params={qt=/admin/cores&collection.configName=conf&newCollection=true&collection=localShardsTestColl&version=2&replicaType=NRT&async=2a6f772d-abb1-4d38-9a9f-4dcd689c248e27606393944042&coreNodeName=core_node11&name=localShardsTestColl_shard2_replica_n8&action=CREATE&numShards=3&shard=shard2&wt=javabin} status=0 QTime=13
   [junit4]   2> 229660 INFO  (parallelCoreAdminExecutor-600-thread-2-processing-n:127.0.0.1:51194_solr x:localShardsTestColl_shard2_replica_n8 2a6f772d-abb1-4d38-9a9f-4dcd689c248e27606393944042 CREATE) [n:127.0.0.1:51194_solr    x:localShardsTestColl_shard2_replica_n8 ] o.a.s.h.a.CoreAdminOperation core create command qt=/admin/cores&collection.configName=conf&newCollection=true&collection=localShardsTestColl&version=2&replicaType=NRT&async=2a6f772d-abb1-4d38-9a9f-4dcd689c248e27606393944042&coreNodeName=core_node11&name=localShardsTestColl_shard2_replica_n8&action=CREATE&numShards=3&shard=shard2&wt=javabin
   [junit4]   2> 229661 INFO  (qtp154209396-2774) [n:127.0.0.1:51194_solr    x:localShardsTestColl_shard3_replica_n14 ] o.a.s.s.HttpSolrCall [admin] webapp=null path=/admin/cores params={qt=/admin/cores&collection.configName=conf&newCollection=true&collection=localShardsTestColl&version=2&replicaType=NRT&async=2a6f772d-abb1-4d38-9a9f-4dcd689c248e27606421399309&coreNodeName=core_node17&name=localShardsTestColl_shard3_replica_n14&action=CREATE&numShards=3&shard=shard3&wt=javabin} status=0 QTime=0
   [junit4]   2> 229661 INFO  (parallelCoreAdminExecutor-600-thread-3-processing-n:127.0.0.1:51194_solr x:localShardsTestColl_shard3_replica_n14 2a6f772d-abb1-4d38-9a9f-4dcd689c248e27606421399309 CREATE) [n:127.0.0.1:51194_solr    x:localShardsTestColl_shard3_replica_n14 ] o.a.s.h.a.CoreAdminOperation core create command qt=/admin/cores&collection.configName=conf&newCollection=true&collection=localShardsTestColl&version=2&replicaType=NRT&async=2a6f772d-abb1-4d38-9a9f-4dcd689c248e27606421399309&coreNodeName=core_node17&name=localShardsTestColl_shard3_replica_n14&action=CREATE&numShards=3&shard=shard3&wt=javabin
   [junit4]   2> 229661 INFO  (qtp93641858-2768) [n:127.0.0.1:51195_solr    x:localShardsTestColl_shard3_replica_n16 ] o.a.s.s.HttpSolrCall [admin] webapp=null path=/admin/cores params={qt=/admin/cores&collection.configName=conf&newCollection=true&collection=localShardsTestColl&version=2&replicaType=NRT&async=2a6f772d-abb1-4d38-9a9f-4dcd689c248e27606425863766&coreNodeName=core_node18&name=localShardsTestColl_shard3_replica_n16&action=CREATE&numShards=3&shard=shard3&wt=javabin} status=0 QTime=1
   [junit4]   2> 229662 INFO  (parallelCoreAdminExecutor-599-thread-2-processing-n:127.0.0.1:51195_solr x:localShardsTestColl_shard3_replica_n16 2a6f772d-abb1-4d38-9a9f-4dcd689c248e27606425863766 CREATE) [n:127.0.0.1:51195_solr    x:localShardsTestColl_shard3_replica_n16 ] o.a.s.h.a.CoreAdminOperation core create command qt=/admin/cores&collection.configName=conf&newCollection=true&collection=localShardsTestColl&version=2&replicaType=NRT&async=2a6f772d-abb1-4d38-9a9f-4dcd689c248e27606425863766&coreNodeName=core_node18&name=localShardsTestColl_shard3_replica_n16&action=CREATE&numShards=3&shard=shard3&wt=javabin
   [junit4]   2> 229662 INFO  (qtp380388185-2763) [n:127.0.0.1:51196_solr    x:localShardsTestColl_shard2_replica_n6 ] o.a.s.s.HttpSolrCall [admin] webapp=null path=/admin/cores params={qt=/admin/cores&collection.configName=conf&newCollection=true&collection=localShardsTestColl&version=2&replicaType=NRT&async=2a6f772d-abb1-4d38-9a9f-4dcd689c248e27606384546223&coreNodeName=core_node9&name=localShardsTestColl_shard2_replica_n6&action=CREATE&numShards=3&shard=shard2&wt=javabin} status=0 QTime=14
   [junit4]   2> 229663 INFO  (qtp93641858-2777) [n:127.0.0.1:51195_solr    x:localShardsTestColl_shard2_replica_n10 ] o.a.s.s.HttpSolrCall [admin] webapp=null path=/admin/cores params={qt=/admin/cores&collection.configName=conf&newCollection=true&collection=localShardsTestColl&version=2&replicaType=NRT&async=2a6f772d-abb1-4d38-9a9f-4dcd689c248e27606401745387&coreNodeName=core_node13&name=localShardsTestColl_shard2_replica_n10&action=CREATE&numShards=3&shard=shard2&wt=javabin} status=0 QTime=10
   [junit4]   2> 229663 INFO  (parallelCoreAdminExecutor-599-thread-3-processing-n:127.0.0.1:51195_solr x:localShardsTestColl_shard2_replica_n10 2a6f772d-abb1-4d38-9a9f-4dcd689c248e27606401745387 CREATE) [n:127.0.0.1:51195_solr    x:localShardsTestColl_shard2_replica_n10 ] o.a.s.h.a.CoreAdminOperation core create command qt=/admin/cores&collection.configName=conf&newCollection=true&collection=localShardsTestColl&version=2&replicaType=NRT&async=2a6f772d-abb1-4d38-9a9f-4dcd689c248e27606401745387&coreNodeName=core_node13&name=localShardsTestColl_shard2_replica_n10&action=CREATE&numShards=3&shard=shard2&wt=javabin
   [junit4]   2> 229663 INFO  (qtp380388185-2779) [n:127.0.0.1:51196_solr    x:localShardsTestColl_shard3_replica_n12 ] o.a.s.s.HttpSolrCall [admin] webapp=null path=/admin/cores params={qt=/admin/cores&collection.configName=conf&newCollection=true&collection=localShardsTestColl&version=2&replicaType=NRT&async=2a6f772d-abb1-4d38-9a9f-4dcd689c248e27606414237233&coreNodeName=core_node15&name=localShardsTestColl_shard3_replica_n12&action=CREATE&numShards=3&shard=shard3&wt=javabin} status=0 QTime=3
   [junit4]   2> 229667 INFO  (parallelCoreAdminExecutor-596-thread-2-processing-n:127.0.0.1:51196_solr x:localShardsTestColl_shard2_replica_n6 2a6f772d-abb1-4d38-9a9f-4dcd689c248e27606384546223 CREATE) [n:127.0.0.1:51196_solr    x:localShardsTestColl_shard2_replica_n6 ] o.a.s.h.a.CoreAdminOperation core create command qt=/admin/cores&collection.configName=conf&newCollection=true&collection=localShardsTestColl&version=2&replicaType=NRT&async=2a6f772d-abb1-4d38-9a9f-4dcd689c248e27606384546223&coreNodeName=core_node9&name=localShardsTestColl_shard2_replica_n6&action=CREATE&numShards=3&shard=shard2&wt=javabin
   [junit4]   2> 229671 INFO  (parallelCoreAdminExecutor-596-thread-3-processing-n:127.0.0.1:51196_solr x:localShardsTestColl_shard3_replica_n12 2a6f772d-abb1-4d38-9a9f-4dcd689c248e27606414237233 CREATE) [n:127.0.0.1:51196_solr    x:localShardsTestColl_shard3_replica_n12 ] o.a.s.h.a.CoreAdminOperation core create command qt=/admin/cores&collection.configName=conf&newCollection=true&collection=localShardsTestColl&version=2&replicaType=NRT&async=2a6f772d-abb1-4d38-9a9f-4dcd689c248e27606414237233&coreNodeName=core_node15&name=localShardsTestColl_shard3_replica_n12&action=CREATE&numShards=3&shard=shard3&wt=javabin
   [junit4]   2> 229690 INFO  (qtp380388185-2779) [n:127.0.0.1:51196_solr     ] o.a.s.h.a.CoreAdminOperation Checking request status for : 2a6f772d-abb1-4d38-9a9f-4dcd689c248e27606355158026
   [junit4]   2> 229690 INFO  (qtp380388185-2779) [n:127.0.0.1:51196_solr     ] o.a.s.s.HttpSolrCall [admin] webapp=null path=/admin/cores params={qt=/admin/cores&requestid=2a6f772d-abb1-4d38-9a9f-4dcd689c248e27606355158026&action=REQUESTSTATUS&wt=javabin&version=2} status=0 QTime=0
   [junit4]   2> 230224 INFO  (qtp154209396-2772) [n:127.0.0.1:51194_solr     ] o.a.s.h.a.CollectionsHandler Invoked Collection Action :requeststatus with params requestid=2a6f772d-abb1-4d38-9a9f-4dcd689c248e&action=REQUESTSTATUS&wt=javabin&version=2 and sendToOCPQueue=true
   [junit4]   2> 230226 INFO  (qtp154209396-2772) [n:127.0.0.1:51194_solr     ] o.a.s.s.HttpSolrCall [admin] webapp=null path=/admin/collections params={requestid=2a6f772d-abb1-4d38-9a9f-4dcd689c248e&action=REQUESTSTATUS&wt=javabin&version=2} status=0 QTime=2
   [junit4]   2> 230698 INFO  (qtp380388185-2763) [n:127.0.0.1:51196_solr     ] o.a.s.h.a.CoreAdminOperation Checking request status for : 2a6f772d-abb1-4d38-9a9f-4dcd689c248e27606355158026
   [junit4]   2> 230698 INFO  (qtp380388185-2763) [n:127.0.0.1:51196_solr     ] o.a.s.s.HttpSolrCall [admin] webapp=null path=/admin/cores params={qt=/admin/cores&requestid=2a6f772d-abb1-4d38-9a9f-4dcd689c248e27606355158026&action=REQUESTSTATUS&wt=javabin&version=2} status=0 QTime=0
   [junit4]   2> 230748 INFO  (parallelCoreAdminExecutor-596-thread-1-processing-n:127.0.0.1:51196_solr x:localShardsTestColl_shard1_replica_n1 2a6f772d-abb1-4d38-9a9f-4dcd689c248e27606355158026 CREATE) [n:127.0.0.1:51196_solr c:localShardsTestColl s:shard1 r:core_node3 x:localShardsTestColl_shard1_replica_n1 ] o.a.s.c.SolrConfig Using Lucene MatchVersion: 9.0.0
   [junit4]   2> 230748 INFO  (parallelCoreAdminExecutor-600-thread-1-processing-n:127.0.0.1:51194_solr x:localShardsTestColl_shard1_replica_n2 2a6f772d-abb1-4d38-9a9f-4dcd689c248e27606364152994 CREATE) [n:127.0.0.1:51194_solr c:localShardsTestColl s:shard1 r:core_node5 x:localShardsTestColl_shard1_replica_n2 ] o.a.s.c.SolrConfig Using Lucene MatchVersion: 9.0.0
   [junit4]   2> 230749 INFO  (parallelCoreAdminExecutor-596-thread-3-processing-n:127.0.0.1:51196_solr x:localShardsTestColl_shard3_replica_n12 2a6f772d-abb1-4d38-9a9f-4dcd689c248e27606414237233 CREATE) [n:127.0.0.1:51196_solr c:localShardsTestColl s:shard3 r:core_node15 x:localShardsTestColl_shard3_replica_n12 ] o.a.s.c.SolrConfig Using Lucene MatchVersion: 9.0.0
   [junit4]   2> 230768 INFO  (parallelCoreAdminExecutor-599-thread-1-processing-n:127.0.0.1:51195_solr x:localShardsTestColl_shard1_replica_n4 2a6f772d-abb1-4d38-9a9f-4dcd689c248e27606373211321 CREATE) [n:127.0.0.1:51195_solr c:localShardsTestColl s:shard1 r:core_node7 x:localShardsTestColl_shard1_replica_n4 ] o.a.s.c.SolrConfig Using Lucene MatchVersion: 9.0.0
   [junit4]   2> 230770 INFO  (parallelCoreAdminExecutor-599-thread-3-processing-n:127.0.0.1:51195_solr x:localShardsTestColl_shard2_replica_n10 2a6f772d-abb1-4d38-9a9f-4dcd689c248e27606401745387 CREATE) [n:127.0.0.1:51195_solr c:localShardsTestColl s:shard2 r:core_node13 x:localShardsTestColl_shard2_replica_n10 ] o.a.s.c.SolrConfig Using Lucene MatchVersion: 9.0.0
   [junit4]   2> 230780 INFO  (parallelCoreAdminExecutor-596-thread-1-processing-n:127.0.0.1:51196_solr x:localShardsTestColl_shard1_replica_n1 2a6f772d-abb1-4d38-9a9f-4dcd689c248e27606355158026 CREATE) [n:127.0.0.1:51196_solr c:localShardsTestColl s:shard1 r:core_node3 x:localShardsTestColl_shard1_replica_n1 ] o.a.s.s.IndexSchema [localShardsTestColl_shard1_replica_n1] Schema name=test
   [junit4]   2> 230781 INFO  (parallelCoreAdminExecutor-600-thread-1-processing-n:127.0.0.1:51194_solr x:localShardsTestColl_shard1_replica_n2 2a6f772d-abb1-4d38-9a9f-4dcd689c248e27606364152994 CREATE) [n:127.0.0.1:51194_solr c:localShardsTestColl s:shard1 r:core_node5 x:localShardsTestColl_shard1_replica_n2 ] o.a.s.s.IndexSchema [localShardsTestColl_shard1_replica_n2] Schema name=test
   [junit4]   2> 230781 INFO  (parallelCoreAdminExecutor-596-thread-3-processing-n:127.0.0.1:51196_solr x:localShardsTestColl_shard3_replica_n12 2a6f772d-abb1-4d38-9a9f-4dcd689c248e27606414237233 CREATE) [n:127.0.0.1:51196_solr c:localShardsTestColl s:shard3 r:core_node15 x:localShardsTestColl_shard3_replica_n12 ] o.a.s.s.IndexSchema [localShardsTestColl_shard3_replica_n12] Schema name=test
   [junit4]   2> 230928 INFO  (parallelCoreAdminExecutor-599-thread-2-processing-n:127.0.0.1:51195_solr x:localShardsTestColl_shard3_replica_n16 2a6f772d-abb1-4d38-9a9f-4dcd689c248e27606425863766 CREATE) [n:127.0.0.1:51195_solr c:localShardsTestColl s:shard3 r:core_node18 x:localShardsTestColl_shard3_replica_n16 ] o.a.s.c.SolrConfig Using Lucene MatchVersion: 9.0.0
   [junit4]   2> 230937 INFO  (parallelCoreAdminExecutor-596-thread-2-processing-n:127.0.0.1:51196_solr x:localShardsTestColl_shard2_replica_n6 2a6f772d-abb1-4d38-9a9f-4dcd689c248e27606384546223 CREATE) [n:127.0.0.1:51196_solr c:localShardsTestColl s:shard2 r:core_node9 x:localShardsTestColl_shard2_replica_n6 ] o.a.s.c.SolrConfig Using Lucene MatchVersion: 9.0.0
   [junit4]   2> 230939 INFO  (parallelCoreAdminExecutor-600-thread-3-processing-n:127.0.0.1:51194_solr x:localShardsTestColl_shard3_replica_n14 2a6f772d-abb1-4d38-9a9f-4dcd689c248e27606421399309 CREATE) [n:127.0.0.1:51194_solr c:localShardsTestColl s:shard3 r:core_node17 x:localShardsTestColl_shard3_replica_n14 ] o.a.s.c.SolrConfig Using Lucene MatchVersion: 9.0.0
   [junit4]   2> 230942 INFO  (parallelCoreAdminExecutor-599-thread-1-processing-n:127.0.0.1:51195_solr x:localShardsTestColl_shard1_replica_n4 2a6f772d-abb1-4d38-9a9f-4dcd689c248e27606373211321 CREATE) [n:127.0.0.1:51195_solr c:localShardsTestColl s:shard1 r:core_node7 x:localShardsTestColl_shard1_replica_n4 ] o.a.s.s.IndexSchema [localShardsTestColl_shard1_replica_n4] Schema name=test
   [junit4]   2> 230952 INFO  (parallelCoreAdminExecutor-599-thread-3-processing-n:127.0.0.1:51195_solr x:localShardsTestColl_shard2_replica_n10 2a6f772d-abb1-4d38-9a9f-4dcd689c248e27606401745387 CREATE) [n:127.0.0.1:51195_solr c:localShardsTestColl s:shard2 r:core_node13 x:localShardsTestColl_shard2_replica_n10 ] o.a.s.s.IndexSchema [localShardsTestColl_shard2_replica_n10] Schema name=test
   [junit4]   2> 230972 INFO  (parallelCoreAdminExecutor-600-thread-2-processing-n:127.0.0.1:51194_solr x:localShardsTestColl_shard2_replica_n8 2a6f772d-abb1-4d38-9a9f-4dcd689c248e27606393944042 CREATE) [n:127.0.0.1:51194_solr c:localShardsTestColl s:shard2 r:core_node11 x:localShardsTestColl_shard2_replica_n8 ] o.a.s.c.SolrConfig Using Lucene MatchVersion: 9.0.0
   [junit4]   2> 230999 INFO  (parallelCoreAdminExecutor-600-thread-3-processing-n:127.0.0.1:51194_solr x:localShardsTestColl_shard3_replica_n14 2a6f772d-abb1-4d38-9a9f-4dcd689c248e27606421399309 CREATE) [n:127.0.0.1:51194_solr c:localShardsTestColl s:shard3 r:core_node17 x:localShardsTestColl_shard3_replica_n14 ] o.a.s.s.IndexSchema [localShardsTestColl_shard3_replica_n14] Schema name=test
   [junit4]   2> 231016 INFO  (parallelCoreAdminExecutor-599-thread-2-processing-n:127.0.0.1:51195_solr x:localShardsTestColl_shard3_replica_n16 2a6f772d-abb1-4d38-9a9f-4dcd689c248e27606425863766 CREATE) [n:127.0.0.1:51195_solr c:localShardsTestColl s:shard3 r:core_node18 x:localShardsTestColl_shard3_replica_n16 ] o.a.s.s.IndexSchema [localShardsTestColl_shard3_replica_n16] Schema name=test
   [junit4]   2> 231028 INFO  (parallelCoreAdminExecutor-596-thread-2-processing-n:127.0.0.1:51196_solr x:localShardsTestColl_shard2_replica_n6 2a6f772d-abb1-4d38-9a9f-4dcd689c248e27606384546223 CREATE) [n:127.0.0.1:51196_solr c:localShardsTestColl s:shard2 r:core_node9 x:localShardsTestColl_shard2_replica_n6 ] o.a.s.s.IndexSchema [localShardsTestColl_shard2_replica_n6] Schema name=test
   [junit4]   2> 231044 INFO  (parallelCoreAdminExecutor-600-thread-2-processing-n:127.0.0.1:51194_solr x:localShardsTestColl_shard2_replica_n8 2a6f772d-abb1-4d38-9a9f-4dcd689c248e27606393944042 CREATE) [n:127.0.0.1:51194_solr c:localShardsTestColl s:shard2 r:core_node11 x:localShardsTestColl_shard2_replica_n8 ] o.a.s.s.IndexSchema [localShardsTestColl_shard2_replica_n8] Schema name=test
   [junit4]   2> 231089 WARN  (parallelCoreAdminExecutor-596-thread-1-processing-n:127.0.0.1:51196_solr x:localShardsTestColl_shard1_replica_n1 2a6f772d-abb1-4d38-9a9f-4dcd689c248e27606355158026 CREATE) [n:127.0.0.1:51196_solr c:localShardsTestColl s:shard1 r:core_node3 x:localShardsTestColl_shard1_replica_n1 ] o.a.s.s.IndexSchema Field lowerfilt1and2 is not multivalued and destination for multiple copyFields (2)
   [junit4]   2> 231090 WARN  (parallelCoreAdminExecutor-596-thread-1-processing-n:127.0.0.1:51196_solr x:localShardsTestColl_shard1_replica_n1 2a6f772d-abb1-4d38-9a9f-4dcd689c248e27606355158026 CREATE) [n:127.0.0.1:51196_solr c:localShardsTestColl s:shard1 r:core_node3 x:localShardsTestColl_shard1_replica_n1 ] o.a.s.s.IndexSchema Field text is not multivalued and destination for multiple copyFields (3)
   [junit4]   2> 231090 INFO  (parallelCoreAdminExecutor-596-thread-1-processing-n:127.0.0.1:51196_solr x:localShardsTestColl_shard1_replica_n1 2a6f772d-abb1-4d38-9a9f-4dcd689c248e27606355158026 CREATE) [n:127.0.0.1:51196_solr c:localShardsTestColl s:shard1 r:core_node3 x:localShardsTestColl_shard1_replica_n1 ] o.a.s.s.IndexSchema Loaded schema test/1.6 with uniqueid field id
   [junit4]   2> 231091 INFO  (parallelCoreAdminExecutor-596-thread-1-processing-n:127.0.0.1:51196_solr x:localShardsTestColl_shard1_replica_n1 2a6f772d-abb1-4d38-9a9f-4dcd689c248e27606355158026 CREATE) [n:127.0.0.1:51196_solr c:localShardsTestColl s:shard1 r:core_node3 x:localShardsTestColl_shard1_replica_n1 ] o.a.s.c.CoreContainer Creating SolrCore 'localShardsTestColl_shard1_replica_n1' using configuration from collection localShardsTestColl, trusted=true
   [junit4]   2> 231091 INFO  (parallelCoreAdminExecutor-596-thread-1-processing-n:127.0.0.1:51196_solr x:localShardsTestColl_shard1_replica_n1 2a6f772d-abb1-4d38-9a9f-4dcd689c248e27606355158026 CREATE) [n:127.0.0.1:51196_solr c:localShardsTestColl s:shard1 r:core_node3 x:localShardsTestColl_shard1_replica_n1 ] o.a.s.m.r.SolrJmxReporter JMX monitoring for 'solr_51196.solr.core.localShardsTestColl.shard1.replica_n1' (registry 'solr.core.localShardsTestColl.shard1.replica_n1') enabled at server: com.sun.jmx.mbeanserver.JmxMBeanServer@42dbfcf0
   [junit4]   2> 231092 INFO  (parallelCoreAdminExecutor-596-thread-1-processing-n:127.0.0.1:51196_solr x:localShardsTestColl_shard1_replica_n1 2a6f772d-abb1-4d38-9a9f-4dcd689c248e27606355158026 CREATE) [n:127.0.0.1:51196_solr c:localShardsTestColl s:shard1 r:core_node3 x:localShardsTestColl_shard1_replica_n1 ] o.a.s.c.SolrCore [[localShardsTestColl_shard1_replica_n1] ] Opening new SolrCore at [/Users/jenkins/workspace/Lucene-Solr-master-MacOSX/solr/build/solr-solrj/test/J3/temp/solr.client.solrj.impl.CloudSolrClientTest_545782413F1883B0-001/tempDir-001/node1/localShardsTestColl_shard1_replica_n1], dataDir=[/Users/jenkins/workspace/Lucene-Solr-master-MacOSX/solr/build/solr-solrj/test/J3/temp/solr.client.solrj.impl.CloudSolrClientTest_545782413F1883B0-001/tempDir-001/node1/./localShardsTestColl_shard1_replica_n1/data/]
   [junit4]   2> 231231 INFO  (qtp154209396-2771) [n:127.0.0.1:51194_solr     ] o.a.s.h.a.CollectionsHandler Invoked Collection Action :requeststatus with params requestid=2a6f772d-abb1-4d38-9a9f-4dcd689c248e&action=REQUESTSTATUS&wt=javabin&version=2 and sendToOCPQueue=true
   [junit4]   2> 231232 WARN  (parallelCoreAdminExecutor-596-thread-3-processing-n:127.0.0.1:51196_solr x:localShardsTestColl_shard3_replica_n12 2a6f772d-abb1-4d38-9a9f-4dcd689c248e27606414237233 CREATE) [n:127.0.0.1:51196_solr c:localShardsTestColl s:shard3 r:core_node15 x:localShardsTestColl_shard3_replica_n12 ] o.a.s.s.IndexSchema Field lowerfilt1and2 is not multivalued and destination for multiple copyFields (2)
   [junit4]   2> 231232 WARN  (parallelCoreAdminExecutor-596-thread-3-processing-n:127.0.0.1:51196_solr x:localShardsTestColl_shard3_replica_n12 2a6f772d-abb1-4d38-9a9f-4dcd689c248e27606414237233 CREATE) [n:127.0.0.1:51196_solr c:localShardsTestColl s:shard3 r:core_node15 x:localShardsTestColl_shard3_replica_n12 ] o.a.s.s.IndexSchema Field text is not multivalued and destination for multiple copyFields (3)
   [junit4]   2> 231234 INFO  (parallelCoreAdminExecutor-596-thread-3-processing-n:127.0.0.1:51196_solr x:localShardsTestColl_shard3_replica_n12 2a6f772d-abb1-4d38-9a9f-4dcd689c248e27606414237233 CREATE) [n:127.0.0.1:51196_solr c:localShardsTestColl s:shard3 r:core_node15 x:localShardsTestColl_shard3_replica_n12 ] o.a.s.s.IndexSchema Loaded schema test/1.6 with uniqueid field id
   [junit4]   2> 231234 INFO  (parallelCoreAdminExecutor-596-thread-3-processing-n:127.0.0.1:51196_solr x:localShardsTestColl_shard3_replica_n12 2a6f772d-abb1-4d38-9a9f-4dcd689c248e27606414237233 CREATE) [n:127.0.0.1:51196_solr c:localShardsTestColl s:shard3 r:core_node15 x:localShardsTestColl_shard3_replica_n12 ] o.a.s.c.CoreContainer Creating SolrCore 'localShardsTestColl_shard3_replica_n12' using configuration from collection localShardsTestColl, trusted=true
   [junit4]   2> 231234 INFO  (qtp154209396-2771) [n:127.0.0.1:51194_solr     ] o.a.s.s.HttpSolrCall [admin] webapp=null path=/admin/collections params={requestid=2a6f772d-abb1-4d38-9a9f-4dcd689c248e&action=REQUESTSTATUS&wt=javabin&version=2} status=0 QTime=3
   [junit4]   2> 231236 WARN  (parallelCoreAdminExecutor-600-thread-1-processing-n:127.0.0.1:51194_solr x:localShardsTestColl_shard1_replica_n2 2a6f772d-abb1-4d38-9a9f-4dcd689c248e27606364152994 CREATE) [n:127.0.0.1:51194_solr c:localShardsTestColl s:shard1 r:core_node5 x:localShardsTestColl_shard1_replica_n2 ] o.a.s.s.IndexSchema Field lowerfilt1and2 is not multivalued and destination for multiple copyFields (2)
   [junit4]   2> 231241 WARN  (parallelCoreAdminExecutor-600-thread-1-processing-n:127.0.0.1:51194_solr x:localShardsTestColl_shard1_replica_n2 2a6f772d-abb1-4d38-9a9f-4dcd689c248e27606364152994 CREATE) [n:127.0.0.1:51194_solr c:localShardsTestColl s:shard1 r:core_node5 x:localShardsTestColl_shard1_replica_n2 ] o.a.s.s.IndexSchema Field text is not multivalued and destination for multiple copyFields (3)
   [junit4]   2> 231241 INFO  (parallelCoreAdminExecutor-600-thread-1-processing-n:127.0.0.1:51194_solr x:localShardsTestColl_shard1_replica_n2 2a6f772d-abb1-4d38-9a9f-4dcd689c248e27606364152994 CREATE) [n:127.0.0.1:51194_solr c:localShardsTestColl s:shard1 r:core_node5 x:localShardsTestColl_shard1_replica_n2 ] o.a.s.s.IndexSchema Loaded schema test/1.6 with uniqueid field id
   [junit4]   2> 231241 INFO  (parallelCoreAdminExecutor-600-thread-1-processing-n:127.0.0.1:51194_solr x:localShardsTestColl_shard1_replica_n2 2a6f772d-abb1-4d38-9a9f-4dcd689c248e27606364152994 CREATE) [n:127.0.0.1:51194_solr c:localShardsTestColl s:shard1 r:core_node5 x:localShardsTestColl_shard1_replica_n2 ] o.a.s.c.CoreContainer Creating SolrCore 'localShardsTestColl_shard1_replica_n2' using configuration from collection localShardsTestColl, trusted=true
   [junit4]   2> 231242 INFO  (parallelCoreAdminExecutor-600-thread-1-processing-n:127.0.0.1:51194_solr x:localShardsTestColl_shard1_replica_n2 2a6f772d-abb1-4d38-9a9f-4dcd689c248e27606364152994 CREATE) [n:127.0.0.1:51194_solr c:localShardsTestColl s:shard1 r:core_node5 x:localShardsTestColl_shard1_replica_n2 ] o.a.s.m.r.SolrJmxReporter JMX monitoring for 'solr_51194.solr.core.localShardsTestColl.shard1.replica_n2' (registry 'solr.core.localShardsTestColl.shard1.replica_n2') enabled at server: com.sun.jmx.mbeanserver.JmxMBeanServer@42dbfcf0
   [junit4]   2> 231242 INFO  (parallelCoreAdminExecutor-600-thread-1-processing-n:127.0.0.1:51194_solr x:localShardsTestColl_shard1_replica_n2 2a6f772d-abb1-4d38-9a9f-4dcd689c248e27606364152994 CREATE) [n:127.0.0.1:51194_solr c:localShardsTestColl s:shard1 r:core_node5 x:localShardsTestColl_shard1_replica_n2 ] o.a.s.c.SolrCore [[localShardsTestColl_shard1_replica_n2] ] Opening new SolrCore at [/Users/jenkins/workspace/Lucene-Solr-master-MacOSX/solr/build/solr-solrj/test/J3/temp/solr.client.solrj.impl.CloudSolrClientTest_545782413F1883B0-001/tempDir-001/node2/localShardsTestColl_shard1_replica_n2], dataDir=[/Users/jenkins/workspace/Lucene-Solr-master-MacOSX/solr/build/solr-solrj/test/J3/temp/solr.client.solrj.impl.CloudSolrClientTest_545782413F1883B0-001/tempDir-001/node2/./localShardsTestColl_shard1_replica_n2/data/]
   [junit4]   2> 231245 INFO  (parallelCoreAdminExecutor-596-thread-3-processing-n:127.0.0.1:51196_solr x:localShardsTestColl_shard3_replica_n12 2a6f772d-abb1-4d38-9a9f-4dcd689c248e27606414237233 CREATE) [n:127.0.0.1:51196_solr c:localShardsTestColl s:shard3 r:core_node15 x:localShardsTestColl_shard3_replica_n12 ] o.a.s.m.r.SolrJmxReporter JMX monitoring for 'solr_51196.solr.core.localShardsTestColl.shard3.replica_n12' (registry 'solr.core.localShardsTestColl.shard3.replica_n12') enabled at server: com.sun.jmx.mbeanserver.JmxMBeanServer@42dbfcf0
   [junit4]   2> 231245 INFO  (parallelCoreAdminExecutor-596-thread-3-processing-n:127.0.0.1:51196_solr x:localShardsTestColl_shard3_replica_n12 2a6f772d-abb1-4d38-9a9f-4dcd689c248e27606414237233 CREATE) [n:127.0.0.1:51196_solr c:localShardsTestColl s:shard3 r:core_node15 x:localShardsTestColl_shard3_replica_n12 ] o.a.s.c.SolrCore [[localShardsTestColl_shard3_replica_n12] ] Opening new SolrCore at [/Users/jenkins/workspace/Lucene-Solr-master-MacOSX/solr/build/solr-solrj/test/J3/temp/solr.client.solrj.impl.CloudSolrClientTest_545782413F1883B0-001/tempDir-001/node1/localShardsTestColl_shard3_replica_n12], dataDir=[/Users/jenkins/workspace/Lucene-Solr-master-MacOSX/solr/build/solr-solrj/test/J3/temp/solr.client.solrj.impl.CloudSolrClientTest_545782413F1883B0-001/tempDir-001/node1/./localShardsTestColl_shard3_replica_n12/data/]
   [junit4]   2> 231248 WARN  (parallelCoreAdminExecutor-596-thread-2-processing-n:127.0.0.1:51196_solr x:localShardsTestColl_shard2_replica_n6 2a6f772d-abb1-4d38-9a9f-4dcd689c248e27606384546223 CREATE) [n:127.0.0.1:51196_solr c:localShardsTestColl s:shard2 r:core_node9 x:localShardsTestColl_shard2_replica_n6 ] o.a.s.s.IndexSchema Field lowerfilt1and2 is not multivalued and destination for multiple copyFields (2)
   [junit4]   2> 231248 WARN  (parallelCoreAdminExecutor-596-thread-2-processing-n:127.0.0.1:51196_solr x:localShardsTestColl_shard2_replica_n6 2a6f772d-abb1-4d38-9a9f-4dcd689c248e27606384546223 CREATE) [n:127.0.0.1:51196_solr c:localShardsTestColl s:shard2 r:core_node9 x:localShardsTestColl_shard2_replica_n6 ] o.a.s.s.IndexSchema Field text is not multivalued and destination for multiple copyFields (3)
   [junit4]   2> 231248 WARN  (parallelCoreAdminExecutor-599-thread-1-processing-n:127.0.0.1:51195_solr x:localShardsTestColl_shard1_replica_n4 2a6f772d-abb1-4d38-9a9f-4dcd689c248e27606373211321 CREATE) [n:127.0.0.1:51195_solr c:localShardsTestColl s:shard1 r:core_node7 x:localShardsTestColl_shard1_replica_n4 ] o.a.s.s.IndexSchema Field lowerfilt1and2 is not multivalued and destination for multiple copyFields (2)
   [junit4]   2> 231248 WARN  (parallelCoreAdminExecutor-599-thread-1-processing-n:127.0.0.1:51195_solr x:localShardsTestColl_shard1_replica_n4 2a6f772d-abb1-4d38-9a9f-4dcd689c248e27606373211321 CREATE) [n:127.0.0.1:51195_solr c:localShardsTestColl s:shard1 r:core_node7 x:localShardsTestColl_shard1_replica_n4 ] o.a.s.s.IndexSchema Field text is not multivalued and destination for multiple copyFields (3)
   [junit4]   2> 231249 INFO  (parallelCoreAdminExecutor-596-thread-2-processing-n:127.0.0.1:51196_solr x:localShardsTestColl_shard2_replica_n6 2a6f772d-abb1-4d38-9a9f-4dcd689c248e27606384546223 CREATE) [n:127.0.0.1:51196_solr c:localShardsTestColl s:shard2 r:core_node9 x:localShardsTestColl_shard2_replica_n6 ] o.a.s.s.IndexSchema Loaded schema test/1.6 with uniqueid field id
   [junit4]   2> 231249 INFO  (parallelCoreAdminExecutor-596-thread-2-processing-n:127.0.0.1:51196_solr x:localShardsTestColl_shard2_replica_n6 2a6f772d-abb1-4d38-9a9f-4dcd689c248e27606384546223 CREATE) [n:127.0.0.1:51196_solr c:localShardsTestColl s:shard2 r:core_node9 x:localShardsTestColl_shard2_replica_n6 ] o.a.s.c.CoreContainer Creating SolrCore 'localShardsTestColl_shard2_replica_n6' using configuration from collection localShardsTestColl, trusted=true
   [junit4]   2> 231249 INFO  (parallelCoreAdminExecutor-599-thread-1-processing-n:127.0.0.1:51195_solr x:localShardsTestColl_shard1_replica_n4 2a6f772d-abb1-4d38-9a9f-4dcd689c248e27606373211321 CREATE) [n:127.0.0.1:51195_solr c:localShardsTestColl s:shard1 r:core_node7 x:localShardsTestColl_shard1_replica_n4 ] o.a.s.s.IndexSchema Loaded schema test/1.6 with uniqueid field id
   [junit4]   2> 231252 INFO  (parallelCoreAdminExecutor-599-thread-1-processing-n:127.0.0.1:51195_solr x:localShardsTestColl_shard1_replica_n4 2a6f772d-abb1-4d38-9a9f-4dcd689c248e27606373211321 CREATE) [n:127.0.0.1:51195_solr c:localShardsTestColl s:shard1 r:core_node7 x:localShardsTestColl_shard1_replica_n4 ] o.a.s.c.CoreContainer Creating SolrCore 'localShardsTestColl_shard1_replica_n4' using configuration from collection localShardsTestColl, trusted=true
   [junit4]   2> 231253 INFO  (parallelCoreAdminExecutor-599-thread-1-processing-n:127.0.0.1:51195_solr x:localShardsTestColl_shard1_replica_n4 2a6f772d-abb1-4d38-9a9f-4dcd689c248e27606373211321 CREATE) [n:127.0.0.1:51195_solr c:localShardsTestColl s:shard1 r:core_node7 x:localShardsTestColl_shard1_replica_n4 ] o.a.s.m.r.SolrJmxReporter JMX monitoring for 'solr_51195.solr.core.localShardsTestColl.shard1.replica_n4' (registry 'solr.core.localShardsTestColl.shard1.replica_n4') enabled at server: com.sun.jmx.mbeanserver.JmxMBeanServer@42dbfcf0
   [junit4]   2> 231253 INFO  (parallelCoreAdminExecutor-599-thread-1-processing-n:127.0.0.1:51195_solr x:localShardsTestColl_shard1_replica_n4 2a6f772d-abb1-4d38-9a9f-4dcd689c248e27606373211321 CREATE) [n:127.0.0.1:51195_solr c:localShardsTestColl s:shard1 r:core_node7 x:localShardsTestColl_shard1_replica_n4 ] o.a.s.c.SolrCore [[localShardsTestColl_shard1_replica_n4] ] Opening new SolrCore at [/Users/jenkins/workspace/Lucene-Solr-master-MacOSX/solr/build/solr-solrj/test/J3/temp/solr.client.solrj.impl.CloudSolrClientTest_545782413F1883B0-001/tempDir-001/node3/localShardsTestColl_shard1_replica_n4], dataDir=[/Users/jenkins/workspace/Lucene-Solr-master-MacOSX/solr/build/solr-solrj/test/J3/temp/solr.client.solrj.impl.CloudSolrClientTest_545782413F1883B0-001/tempDir-001/node3/./localShardsTestColl_shard1_replica_n4/data/]
   [junit4]   2> 231267 INFO  (parallelCoreAdminExecutor-596-thread-2-processing-n:127.0.0.1:51196_solr x:localShardsTestColl_shard2_replica_n6 2a6f772d-abb1-4d38-9a9f-4dcd689c248e27606384546223 CREATE) [n:127.0.0.1:51196_solr c:localShardsTestColl s:shard2 r:core_node9 x:localShardsTestColl_shard2_replica_n6 ] o.a.s.m.r.SolrJmxReporter JMX monitoring for 'solr_51196.solr.core.localShardsTestColl.shard2.replica_n6' (registry 'solr.core.localShardsTestColl.shard2.replica_n6') enabled at server: com.sun.jmx.mbeanserver.JmxMBeanServer@42dbfcf0
   [junit4]   2> 231267 INFO  (parallelCoreAdminExecutor-596-thread-2-processing-n:127.0.0.1:51196_solr x:localShardsTestColl_shard2_replica_n6 2a6f772d-abb1-4d38-9a9f-4dcd689c248e27606384546223 CREATE) [n:127.0.0.1:51196_solr c:localShardsTestColl s:shard2 r:core_node9 x:localShardsTestColl_shard2_replica_n6 ] o.a.s.c.SolrCore [[localShardsTestColl_shard2_replica_n6] ] Opening new SolrCore at [/Users/jenkins/workspace/Lucene-Solr-master-MacOSX/solr/build/solr-solrj/test/J3/temp/solr.client.solrj.impl.CloudSolrClientTest_545782413F1883B0-001/tempDir-001/node1/localShardsTestColl_shard2_replica_n6], dataDir=[/Users/jenkins/workspace/Lucene-Solr-master-MacOSX/solr/build/solr-solrj/test/J3/temp/solr.client.solrj.impl.CloudSolrClientTest_545782413F1883B0-001/tempDir-001/node1/./localShardsTestColl_shard2_replica_n6/data/]
   [junit4]   2> 231267 WARN  (parallelCoreAdminExecutor-599-thread-3-processing-n:127.0.0.1:51195_solr x:localShardsTestColl_shard2_replica_n10 2a6f772d-abb1-4d38-9a9f-4dcd689c248e27606401745387 CREATE) [n:127.0.0.1:51195_solr c:localShardsTestColl s:shard2 r:core_node13 x:localShardsTestColl_shard2_replica_n10 ] o.a.s.s.IndexSchema Field lowerfilt1and2 is not multivalued and destination for multiple copyFields (2)
   [junit4]   2> 231267 WARN  (parallelCoreAdminExecutor-599-thread-3-processing-n:127.0.0.1:51195_solr x:localShardsTestColl_shard2_replica_n10 2a6f772d-abb1-4d38-9a9f-4dcd689c248e27606401745387 CREATE) [n:127.0.0.1:51195_solr c:localShardsTestColl s:shard2 r:core_node13 x:localShardsTestColl_shard2_replica_n10 ] o.a.s.s.IndexSchema Field text is not multivalued and destination for multiple copyFields (3)
   [junit4]   2> 231268 INFO  (parallelCoreAdminExecutor-599-thread-3-processing-n:127.0.0.1:51195_solr x:localShardsTestColl_shard2_replica_n10 2a6f772d-abb1-4d38-9a9f-4dcd689c248e27606401745387 CREATE) [n:127.0.0.1:51195_solr c:localShardsTestColl s:shard2 r:core_node13 x:localShardsTestColl_shard2_replica_n10 ] o.a.s.s.IndexSchema Loaded schema test/1.6 with uniqueid field id
   [junit4]   2> 231268 INFO  (parallelCoreAdminExecutor-599-thread-3-processing-n:127.0.0.1:51195_solr x:localShardsTestColl_shard2_replica_n10 2a6f772d-abb1-4d38-9a9f-4dcd689c248e27606401745387 CREATE) [n:127.0.0.1:51195_solr c:localShardsTestColl s:shard2 r:core_node13 x:localShardsTestColl_shard2_replica_n10 ] o.a.s.c.CoreContainer Creating SolrCore 'localShardsTestColl_shard2_replica_n10' using configuration from collection localShardsTestColl, trusted=true
   [junit4]   2> 231271 INFO  (parallelCoreAdminExecutor-599-thread-3-processing-n:127.0.0.1:51195_solr x:localShardsTestColl_shard2_replica_n10 2a6f772d-abb1-4d38-9a9f-4dcd689c248e27606401745387 CREATE) [n:127.0.0.1:51195_solr c:localShardsTestColl s:shard2 r:core_node13 x:localShardsTestColl_shard2_replica_n10 ] o.a.s.m.r.SolrJmxReporter JMX monitoring for 'solr_51195.solr.core.localShardsTestColl.shard2.replica_n10' (registry 'solr.core.localShardsTestColl.shard2.replica_n10') enabled at server: com.sun.jmx.mbeanserver.JmxMBeanServer@42dbfcf0
   [junit4]   2> 231281 WARN  (parallelCoreAdminExecutor-600-thread-2-processing-n:127.0.0.1:51194_solr x:localShardsTestColl_shard2_replica_n8 2a6f772d-abb1-4d38-9a9f-4dcd689c248e27606393944042 CREATE) [n:127.0.0.1:51194_solr c:localShardsTestColl s:shard2 r:core_node11 x:localShardsTestColl_shard2_replica_n8 ] o.a.s.s.IndexSchema Field lowerfilt1and2 is not multivalued and destination for multiple copyFields (2)
   [junit4]   2> 231281 WARN  (parallelCoreAdminExecutor-600-thread-2-processing-n:127.0.0.1:51194_solr x:localShardsTestColl_shard2_replica_n8 2a6f772d-abb1-4d38-9a9f-4dcd689c248e27606393944042 CREATE) [n:127.0.0.1:51194_solr c:localShardsTestColl s:shard2 r:core_node11 x:localShardsTestColl_shard2_replica_n8 ] o.a.s.s.IndexSchema Field text is not multivalued and destination for multiple copyFields (3)
   [junit4]   2> 231281 WARN  (parallelCoreAdminExecutor-600-thread-3-processing-n:127.0.0.1:51194_solr x:localShardsTestColl_shard3_replica_n14 2a6f772d-abb1-4d38-9a9f-4dcd689c248e27606421399309 CREATE) [n:127.0.0.1:51194_solr c:localShardsTestColl s:shard3 r:core_node17 x:localShardsTestColl_shard3_replica_n14 ] o.a.s.s.IndexSchema Field lowerfilt1and2 is not multivalued and destination for multiple copyFields (2)
   [junit4]   2> 231281 WARN  (parallelCoreAdminExecutor-600-thread-3-processing-n:127.0.0.1:51194_solr x:localShardsTestColl_shard3_replica_n14 2a6f772d-abb1-4d38-9a9f-4dcd689c248e27606421399309 CREATE) [n:127.0.0.1:51194_solr c:localShardsTestColl s:shard3 r:core_node17 x:localShardsTestColl_shard3_replica_n14 ] o.a.s.s.IndexSchema Field text is not multivalued and destination for multiple copyFields (3)
   [junit4]   2> 231281 INFO  (parallelCoreAdminExecutor-600-thread-2-processing-n:127.0.0.1:51194_solr x:localShardsTestColl_shard2_replica_n8 2a6f772d-abb1-4d38-9a9f-4dcd689c248e27606393944042 CREATE) [n:127.0.0.1:51194_solr c:localShardsTestColl s:shard2 r:core_node11 x:localShardsTestColl_shard2_replica_n8 ] o.a.s.s.IndexSchema Loaded schema test/1.6 with uniqueid field id
   [junit4]   2> 231281 INFO  (parallelCoreAdminExecutor-600-thread-2-processing-n:127.0.0.1:51194_solr x:localShardsTestColl_shard2_replica_n8 2a6f772d-abb1-4d38-9a9f-4dcd689c248e27606393944042 CREATE) [n:127.0.0.1:51194_solr c:localShardsTestColl s:shard2 r:core_node11 x:localShardsTestColl_shard2_replica_n8 ] o.a.s.c.CoreContainer Creating SolrCore 'localShardsTestColl_shard2_replica_n8' using configuration from collection localShardsTestColl, trusted=true
   [junit4]   2> 231281 INFO  (parallelCoreAdminExecutor-600-thread-3-processing-n:127.0.0.1:51194_solr x:localShardsTestColl_shard3_replica_n14 2a6f772d-abb1-4d38-9a9f-4dcd689c248e27606421399309 CREATE) [n:127.0.0.1:51194_solr c:localShardsTestColl s:shard3 r:core_node17 x:localShardsTestColl_shard3_replica_n14 ] o.a.s.s.IndexSchema Loaded schema test/1.6 with uniqueid field id
   [junit4]   2> 231281 INFO  (parallelCoreAdminExecutor-600-thread-3-processing-n:127.0.0.1:51194_solr x:localShardsTestColl_shard3_replica_n14 2a6f772d-abb1-4d38-9a9f-4dcd689c248e27606421399309 CREATE) [n:127.0.0.1:51194_solr c:localShardsTestColl s:shard3 r:core_node17 x:localShardsTestColl_shard3_replica_n14 ] o.a.s.c.CoreContainer Creating SolrCore 'localShardsTestColl_shard3_replica_n14' using configuration from collection localShardsTestColl, trusted=true
   [junit4]   2> 231282 INFO  (parallelCoreAdminExecutor-600-thread-3-processing-n:127.0.0.1:51194_solr x:localShardsTestColl_shard3_replica_n14 2a6f772d-abb1-4d38-9a9f-4dcd689c248e27606421399309 CREATE) [n:127.0.0.1:51194_solr c:localShardsTestColl s:shard3 r:core_node17 x:localShardsTestColl_shard3_replica_n14 ] o.a.s.m.r.SolrJmxReporter JMX monitoring for 'solr_51194.solr.core.localShardsTestColl.shard3.replica_n14' (registry 'solr.core.localShardsTestColl.shard3.replica_n14') enabled at server: com.sun.jmx.mbeanserver.JmxMBeanServer@42dbfcf0
   [junit4]   2> 231282 INFO  (parallelCoreAdminExecutor-600-thread-3-processing-n:127.0.0.1:51194_solr x:localShardsTestColl_shard3_replica_n14 2a6f772d-abb1-4d38-9a9f-4dcd689c248e27606421399309 CREATE) [n:127.0.0.1:51194_solr c:localShardsTestColl s:shard3 r:core_node17 x:localShardsTestColl_shard3_replica_n14 ] o.a.s.c.SolrCore [[localShardsTestColl_shard3_replica_n14] ] Opening new SolrCore at [/Users/jenkins/workspace/Lucene-Solr-master-MacOSX/solr/build/solr-solrj/test/J3/temp/solr.client.solrj.impl.CloudSolrClientTest_545782413F1883B0-001/tempDir-001/node2/localShardsTestColl_shard3_replica_n14], dataDir=[/Users/jenkins/workspace/Lucene-Solr-master-MacOSX/solr/build/solr-solrj/test/J3/temp/solr.client.solrj.impl.CloudSolrClientTest_545782413F1883B0-001/tempDir-001/node2/./localShardsTestColl_shard3_replica_n14/data/]
   [junit4]   2> 231283 INFO  (parallelCoreAdminExecutor-599-thread-3-processing-n:127.0.0.1:51195_solr x:localShardsTestColl_shard2_replica_n10 2a6f772d-abb1-4d38-9a9f-4dcd689c248e27606401745387 CREATE) [n:127.0.0.1:51195_solr c:localShardsTestColl s:shard2 r:core_node13 x:localShardsTestColl_shard2_replica_n10 ] o.a.s.c.SolrCore [[localShardsTestColl_shard2_replica_n10] ] Opening new SolrCore at [/Users/jenkins/workspace/Lucene-Solr-master-MacOSX/solr/build/solr-solrj/test/J3/temp/solr.client.solrj.impl.CloudSolrClientTest_545782413F1883B0-001/tempDir-001/node3/localShardsTestColl_shard2_replica_n10], dataDir=[/Users/jenkins/workspace/Lucene-Solr-master-MacOSX/solr/build/solr-solrj/test/J3/temp/solr.client.solrj.impl.CloudSolrClientTest_545782413F1883B0-001/tempDir-001/node3/./localShardsTestColl_shard2_replica_n10/data/]
   [junit4]   2> 231288 INFO  (parallelCoreAdminExecutor-600-thread-2-processing-n:127.0.0.1:51194_solr x:localShardsTestColl_shard2_replica_n8 2a6f772d-abb1-4d38-9a9f-4dcd689c248e27606393944042 CREATE) [n:127.0.0.1:51194_solr c:localShardsTestColl s:shard2 r:core_node11 x:localShardsTestColl_shard2_replica_n8 ] o.a.s.m.r.SolrJmxReporter JMX monitoring for 'solr_51194.solr.core.localShardsTestColl.shard2.replica_n8' (registry 'solr.core.localShardsTestColl.shard2.replica_n8') enabled at server: com.sun.jmx.mbeanserver.JmxMBeanServer@42dbfcf0
   [junit4]   2> 231289 INFO  (parallelCoreAdminExecutor-600-thread-2-processing-n:127.0.0.1:51194_solr x:localShardsTestColl_shard2_replica_n8 2a6f772d-abb1-4d38-9a9f-4dcd689c248e27606393944042 CREATE) [n:127.0.0.1:51194_solr c:localShardsTestColl s:shard2 r:core_node11 x:localShardsTestColl_shard2_replica_n8 ] o.a.s.c.SolrCore [[localShardsTestColl_shard2_replica_n8] ] Opening new SolrCore at [/Users/jenkins/workspace/Lucene-Solr-master-MacOSX/solr/build/solr-solrj/test/J3/temp/solr.client.solrj.impl.CloudSolrClientTest_545782413F1883B0-001/tempDir-001/node2/localShardsTestColl_shard2_replica_n8], dataDir=[/Users/jenkins/workspace/Lucene-Solr-master-MacOSX/solr/build/solr-solrj/test/J3/temp/solr.client.solrj.impl.CloudSolrClientTest_545782413F1883B0-001/tempDir-001/node2/./localShardsTestColl_shard2_replica_n8/data/]
   [junit4]   2> 231291 WARN  (parallelCoreAdminExecutor-599-thread-2-processing-n:127.0.0.1:51195_solr x:localShardsTestColl_shard3_replica_n16 2a6f772d-abb1-4d38-9a9f-4dcd689c248e27606425863766 CREATE) [n:127.0.0.1:51195_solr c:localShardsTestColl s:shard3 r:core_node18 x:localShardsTestColl_shard3_replica_n16 ] o.a.s.s.IndexSchema Field lowerfilt1and2 is not multivalued and destination for multiple copyFields (2)
   [junit4]   2> 231291 WARN  (parallelCoreAdminExecutor-599-thread-2-processing-n:127.0.0.1:51195_solr x:localShardsTestColl_shard3_replica_n16 2a6f772d-abb1-4d38-9a9f-4dcd689c248e27606425863766 CREATE) [n:127.0.0.1:51195_solr c:localShardsTestColl s:shard3 r:core_node18 x:localShardsTestColl_shard3_replica_n16 ] o.a.s.s.IndexSchema Field text is not multivalued and destination for multiple copyFields (3)
   [junit4]   2> 231291 INFO  (parallelCoreAdminExecutor-599-thread-2-processing-n:127.0.0.1:51195_solr x:localShardsTestColl_shard3_replica_n16 2a6f772d-abb1-4d38-9a9f-4dcd689c248e27606425863766 CREATE) [n:127.0.0.1:51195_solr c:localShardsTestColl s:shard3 r:core_node18 x:localShardsTestColl_shard3_replica_n16 ] o.a.s.s.IndexSchema Loaded schema test/1.6 with uniqueid field id
   [junit4]   2> 231291 INFO  (parallelCoreAdminExecutor-599-thread-2-processing-n:127.0.0.1:51195_solr x:localShardsTestColl_shard3_replica_n16 2a6f772d-abb1-4d38-9a9f-4dcd689c248e27606425863766 CREATE) [n:127.0.0.1:51195_solr c:localShardsTestColl s:shard3 r:core_node18 x:localShardsTestColl_shard3_replica_n16 ] o.a.s.c.CoreContainer Creating SolrCore 'localShardsTestColl_shard3_replica_n16' using configuration from collection localShardsTestColl, trusted=true
   [junit4]   2> 231291 INFO  (parallelCoreAdminExecutor-599-thread-2-processing-n:127.0.0.1:51195_solr x:localShardsTestColl_shard3_replica_n16 2a6f772d-abb1-4d38-9a9f-4dcd689c248e27606425863766 CREATE) [n:127.0.0.1:51195_solr c:localShardsTestColl s:shard3 r:core_node18 x:localShardsTestColl_shard3_replica_n16 ] o.a.s.m.r.SolrJmxReporter JMX monitoring for 'solr_51195.solr.core.localShardsTestColl.shard3.replica_n16' (registry 'solr.core.localShardsTestColl.shard3.replica_n16') enabled at server: com.sun.jmx.mbeanserver.JmxMBeanServer@42dbfcf0
   [junit4]   2> 231291 INFO  (parallelCoreAdminExecutor-599-thread-2-processing-n:127.0.0.1:51195_solr x:localShardsTestColl_shard3_replica_n16 2a6f772d-abb1-4d38-9a9f-4dcd689c248e27606425863766 CREATE) [n:127.0.0.1:51195_solr c:localShardsTestColl s:shard3 r:core_node18 x:localShardsTestColl_shard3_replica_n16 ] o.a.s.c.SolrCore [[localShardsTestColl_shard3_replica_n16] ] Opening new SolrCore at [/Users/jenkins/workspace/Lucene-Solr-master-MacOSX/solr/build/solr-solrj/test/J3/temp/solr.client.solrj.impl.CloudSolrClientTest_545782413F1883B0-001/tempDir-001/node3/localShardsTestColl_shard3_replica_n16], dataDir=[/Users/jenkins/workspace/Lucene-Solr-master-MacOSX/solr/build/solr-solrj/test/J3/temp/solr.client.solrj.impl.CloudSolrClientTest_545782413F1883B0-001/tempDir-001/node3/./localShardsTestColl_shard3_replica_n16/data/]
   [junit4]   2> 231703 INFO  (qtp380388185-2776) [n:127.0.0.1:51196_solr     ] o.a.s.h.a.CoreAdminOperation Checking request status for : 2a6f772d-abb1-4d38-9a9f-4dcd689c248e27606355158026
   [junit4]   2> 231703 INFO  (qtp380388185-2776) [n:127.0.0.1:51196_solr     ] o.a.s.s.HttpSolrCall [admin] webapp=null path=/admin/cores params={qt=/admin/cores&requestid=2a6f772d-abb1-4d38-9a9f-4dcd689c248e27606355158026&action=REQUESTSTATUS&wt=javabin&version=2} status=0 QTime=0
   [junit4]   2> 232236 INFO  (parallelCoreAdminExecutor-596-thread-1-processing-n:127.0.0.1:51196_solr x:localShardsTestColl_shard1_replica_n1 2a6f772d-abb1-4d38-9a9f-4dcd689c248e27606355158026 CREATE) [n:127.0.0.1:51196_solr c:localShardsTestColl s:shard1 r:core_node3 x:localShardsTestColl_shard1_replica_n1 ] o.a.s.u.UpdateHandler Using UpdateLog implementation: org.apache.solr.update.UpdateLog
   [junit4]   2> 232236 INFO  (parallelCoreAdminExecutor-596-thread-1-processing-n:127.0.0.1:51196_solr x:localShardsTestColl_shard1_replica_n1 2a6f772d-abb1-4d38-9a9f-4dcd689c248e27606355158026 CREATE) [n:127.0.0.1:51196_solr c:localShardsTestColl s:shard1 r:core_node3 x:localShardsTestColl_shard1_replica_n1 ] o.a.s.u.UpdateLog Initializing UpdateLog: dataDir= defaultSyncLevel=FLUSH numRecordsToKeep=100 maxNumLogsToKeep=10 numVersionBuckets=65536
   [junit4]   2> 232245 INFO  (parallelCoreAdminExecutor-596-thread-1-processing-n:127.0.0.1:51196_solr x:localShardsTestColl_shard1_replica_n1 2a6f772d-abb1-4d38-9a9f-4dcd689c248e27606355158026 CREATE) [n:127.0.0.1:51196_solr c:localShardsTestColl s:shard1 r:core_node3 x:localShardsTestColl_shard1_replica_n1 ] o.a.s.u.CommitTracker Hard AutoCommit: disabled
   [junit4]   2> 232245 INFO  (parallelCoreAdminExecutor-596-thread-1-processing-n:127.0.0.1:51196_solr x:localShardsTestColl_shard1_replica_n1 2a6f772d-abb1-4d38-9a9f-4dcd689c248e27606355158026 CREATE) [n:127.0.0.1:51196_solr c:localShardsTestColl s:shard1 r:core_node3 x:localShardsTestColl_shard1_replica_n1 ] o.a.s.u.CommitTracker Soft AutoCommit: disabled
   [junit4]   2> 232246 INFO  (qtp154209396-2773) [n:127.0.0.1:51194_solr     ] o.a.s.h.a.CollectionsHandler Invoked Collection Action :requeststatus with params requestid=2a6f772d-abb1-4d38-9a9f-4dcd689c248e&action=REQUESTSTATUS&wt=javabin&version=2 and sendToOCPQueue=true
   [junit4]   2> 232250 INFO  (qtp154209396-2773) [n:127.0.0.1:51194_solr     ] o.a.s.s.HttpSolrCall [admin] webapp=null path=/admin/collections params={requestid=2a6f772d-abb1-4d38-9a9f-4dcd689c248e&action=REQUESTSTATUS&wt=javabin&version=2} status=0 QTime=3
   [junit4]   2> 232259 INFO  (parallelCoreAdminExecutor-600-thread-1-processing-n:127.0.0.1:51194_solr x:localShardsTestColl_shard1_replica_n2 2a6f772d-abb1-4d38-9a9f-4dcd689c248e27606364152994 CREATE) [n:127.0.0.1:51194_solr c:localShardsTestColl s:shard1 r:core_node5 x:localShardsTestColl_shard1_replica_n2 ] o.a.s.u.UpdateHandler Using UpdateLog implementation: org.apache.solr.update.UpdateLog
   [junit4]   2> 232259 INFO  (parallelCoreAdminExecutor-600-thread-1-processing-n:127.0.0.1:51194_solr x:localShardsTestColl_shard1_replica_n2 2a6f772d-abb1-4d38-9a9f-4dcd689c248e27606364152994 CREATE) [n:127.0.0.1:51194_solr c:localShardsTestColl s:shard1 r:core_node5 x:localShardsTestColl_shard1_replica_n2 ] o.a.s.u.UpdateLog Initializing UpdateLog: dataDir= defaultSyncLevel=FLUSH numRecordsToKeep=100 maxNumLogsToKeep=10 numVersionBuckets=65536
   [junit4]   2> 232268 INFO  (parallelCoreAdminExecutor-596-thread-1-processing-n:127.0.0.1:51196_solr x:localShardsTestColl_shard1_replica_n1 2a6f772d-abb1-4d38-9a9f-4dcd689c248e27606355158026 CREATE) [n:127.0.0.1:51196_solr c:localShardsTestColl s:shard1 r:core_node3 x:localShardsTestColl_shard1_replica_n1 ] o.a.s.s.SolrIndexSearcher Opening [Searcher@56009ea2[localShardsTestColl_shard1_replica_n1] main]
   [junit4]   2> 232273 INFO  (parallelCoreAdminExecutor-596-thread-1-processing-n:127.0.0.1:51196_solr x:localShardsTestColl_shard1_replica_n1 2a6f772d-abb1-4d38-9a9f-4dcd689c248e27606355158026 CREATE) [n:127.0.0.1:51196_solr c:localShardsTestColl s:shard1 r:core_node3 x:localShardsTestColl_shard1_replica_n1 ] o.a.s.r.ManagedResourceStorage Configured ZooKeeperStorageIO with znodeBase: /configs/conf
   [junit4]   2> 232274 INFO  (parallelCoreAdminExecutor-596-thread-1-processing-n:127.0.0.1:51196_solr x:localShardsTestColl_shard1_replica_n1 2a6f772d-abb1-4d38-9a9f-4dcd689c248e27606355158026 CREATE) [n:127.0.0.1:51196_solr c:localShardsTestColl s:shard1 r:core_node3 x:localShardsTestColl_shard1_replica_n1 ] o.a.s.r.ManagedResourceStorage Loaded null at path _rest_managed.json using ZooKeeperStorageIO:path=/configs/conf
   [junit4]   2> 232275 INFO  (parallelCoreAdminExecutor-600-thread-1-processing-n:127.0.0.1:51194_solr x:localShardsTestColl_shard1_replica_n2 2a6f772d-abb1-4d38-9a9f-4dcd689c248e27606364152994 CREATE) [n:127.0.0.1:51194_solr c:localShardsTestColl s:shard1 r:core_node5 x:localShardsTestColl_shard1_replica_n2 ] o.a.s.u.CommitTracker Hard AutoCommit: disabled
   [junit4]   2> 232275 INFO  (parallelCoreAdminExecutor-600-thread-1-processing-n:127.0.0.1:51194_solr x:localShardsTestColl_shard1_replica_n2 2a6f772d-abb1-4d38-9a9f-4dcd689c248e27606364152994 CREATE) [n:127.0.0.1:51194_solr c:localShardsTestColl s:shard1 r:core_node5 x:localShardsTestColl_shard1_replica_n2 ] o.a.s.u.CommitTracker Soft AutoCommit: disabled
   [junit4]   2> 232283 INFO  (parallelCoreAdminExecutor-596-thread-1-processing-n:127.0.0.1:51196_solr x:localShardsTestColl_shard1_replica_n1 2a6f772d-abb1-4d38-9a9f-4dcd689c248e27606355158026 CREATE) [n:127.0.0.1:51196_solr c:localShardsTestColl s:shard1 r:core_node3 x:localShardsTestColl_shard1_replica_n1 ] o.a.s.h.ReplicationHandler Commits will be reserved for 10000ms.
   [junit4]   2> 232283 INFO  (parallelCoreAdminExecutor-596-thread-1-processing-n:127.0.0.1:51196_solr x:localShardsTestColl_shard1_replica_n1 2

[...truncated too long message...]

or registry solr.jetty / com.codahale.metrics.MetricRegistry@2e0cabac
   [junit4]   2> 288711 INFO  (jetty-closer-1768-thread-3) [     ] o.a.s.m.SolrMetricManager Closing metric reporters for registry=solr.cluster, tag=null
   [junit4]   2> 288711 INFO  (coreCloseExecutor-1052-thread-1) [n:127.0.0.1:52343_solr c:stale_state_test_col s:shard1 r:core_node4 x:stale_state_test_col_shard1_replica_n3 ] o.a.s.m.SolrMetricManager Closing metric reporters for registry=solr.collection.stale_state_test_col.shard1.leader, tag=SolrCore@4edda032
   [junit4]   2> 288726 INFO  (jetty-closer-1768-thread-2) [     ] o.a.s.m.SolrMetricManager Closing metric reporters for registry=solr.node, tag=null
   [junit4]   2> 288726 INFO  (jetty-closer-1768-thread-2) [     ] o.a.s.m.r.SolrJmxReporter Closing reporter [org.apache.solr.metrics.reporters.SolrJmxReporter@5e697480: rootName = solr_52343, domain = solr.node, service url = null, agent id = null] for registry solr.node / com.codahale.metrics.MetricRegistry@6d586462
   [junit4]   2> 288731 INFO  (jetty-closer-1768-thread-2) [     ] o.a.s.m.SolrMetricManager Closing metric reporters for registry=solr.jvm, tag=null
   [junit4]   2> 288731 INFO  (jetty-closer-1768-thread-2) [     ] o.a.s.m.r.SolrJmxReporter Closing reporter [org.apache.solr.metrics.reporters.SolrJmxReporter@7c94af74: rootName = solr_52343, domain = solr.jvm, service url = null, agent id = null] for registry solr.jvm / com.codahale.metrics.MetricRegistry@13d99c5d
   [junit4]   2> 288733 INFO  (jetty-closer-1768-thread-2) [     ] o.a.s.m.SolrMetricManager Closing metric reporters for registry=solr.jetty, tag=null
   [junit4]   2> 288733 INFO  (jetty-closer-1768-thread-2) [     ] o.a.s.m.r.SolrJmxReporter Closing reporter [org.apache.solr.metrics.reporters.SolrJmxReporter@493bd534: rootName = solr_52343, domain = solr.jetty, service url = null, agent id = null] for registry solr.jetty / com.codahale.metrics.MetricRegistry@2e0cabac
   [junit4]   2> 288733 INFO  (jetty-closer-1768-thread-2) [     ] o.a.s.m.SolrMetricManager Closing metric reporters for registry=solr.cluster, tag=null
   [junit4]   2> 288736 INFO  (closeThreadPool-1780-thread-2) [     ] o.a.s.c.Overseer Overseer (id=72059406559281162-127.0.0.1:52343_solr-n_0000000000) closing
   [junit4]   2> 288736 INFO  (OverseerStateUpdate-72059406559281162-127.0.0.1:52343_solr-n_0000000000) [n:127.0.0.1:52343_solr     ] o.a.s.c.Overseer Overseer Loop exiting : 127.0.0.1:52343_solr
   [junit4]   2> 288744 INFO  (OverseerAutoScalingTriggerThread-72059406559281162-127.0.0.1:52343_solr-n_0000000000) [     ] o.a.s.c.a.OverseerTriggerThread OverseerTriggerThread woken up but we are closed, exiting.
   [junit4]   2> 288749 INFO  (closeThreadPool-1780-thread-1) [     ] o.a.s.c.Overseer Overseer (id=72059406559281162-127.0.0.1:52343_solr-n_0000000000) closing
   [junit4]   2> 288812 INFO  (jetty-closer-1768-thread-1) [     ] o.e.j.s.AbstractConnector Stopped ServerConnector@25714949{HTTP/1.1,[http/1.1, h2c]}{127.0.0.1:0}
   [junit4]   2> 288812 INFO  (jetty-closer-1768-thread-1) [     ] o.e.j.s.h.ContextHandler Stopped o.e.j.s.ServletContextHandler@22cd4f9a{/solr,null,UNAVAILABLE}
   [junit4]   2> 288813 INFO  (jetty-closer-1768-thread-1) [     ] o.e.j.s.session node0 Stopped scavenging
   [junit4]   2> 288827 INFO  (zkCallback-1738-thread-4) [     ] o.a.s.c.c.ZkStateReader Updated live nodes from ZooKeeper... (2) -> (1)
   [junit4]   2> 288831 INFO  (jetty-closer-1768-thread-3) [     ] o.e.j.s.AbstractConnector Stopped ServerConnector@4979bf43{HTTP/1.1,[http/1.1, h2c]}{127.0.0.1:0}
   [junit4]   2> 288831 INFO  (jetty-closer-1768-thread-3) [     ] o.e.j.s.h.ContextHandler Stopped o.e.j.s.ServletContextHandler@51a71d69{/solr,null,UNAVAILABLE}
   [junit4]   2> 288831 INFO  (jetty-closer-1768-thread-3) [     ] o.e.j.s.session node0 Stopped scavenging
   [junit4]   2> 288938 INFO  (jetty-closer-1768-thread-2) [     ] o.a.s.c.Overseer Overseer (id=72059406559281162-127.0.0.1:52343_solr-n_0000000000) closing
   [junit4]   2> 288942 INFO  (jetty-closer-1768-thread-2) [     ] o.e.j.s.AbstractConnector Stopped ServerConnector@6947f1d6{HTTP/1.1,[http/1.1, h2c]}{127.0.0.1:0}
   [junit4]   2> 288942 INFO  (jetty-closer-1768-thread-2) [     ] o.e.j.s.h.ContextHandler Stopped o.e.j.s.ServletContextHandler@70743781{/solr,null,UNAVAILABLE}
   [junit4]   2> 288942 INFO  (jetty-closer-1768-thread-2) [     ] o.e.j.s.session node0 Stopped scavenging
   [junit4]   2> 288944 INFO  (TEST-CloudSolrClientTest.testRetryUpdatesWhenClusterStateIsStale-seed#[545782413F1883B0]) [     ] o.a.s.c.ZkTestServer Shutting down ZkTestServer.
   [junit4]   2> 289159 WARN  (ZkTestServer Run Thread) [     ] o.a.s.c.ZkTestServer Watch limit violations: 
   [junit4]   2> Maximum concurrent create/delete watches above limit:
   [junit4]   2> 
   [junit4]   2> 	8	/solr/collections/stale_state_test_col/terms/shard1
   [junit4]   2> 	5	/solr/aliases.json
   [junit4]   2> 	5	/solr/clusterprops.json
   [junit4]   2> 	3	/solr/packages.json
   [junit4]   2> 	3	/solr/security.json
   [junit4]   2> 	2	/solr/collections/stale_state_test_col/collectionprops.json
   [junit4]   2> 	2	/solr/configs/conf
   [junit4]   2> 
   [junit4]   2> Maximum concurrent data watches above limit:
   [junit4]   2> 
   [junit4]   2> 	50	/solr/collections/stale_state_test_col/state.json
   [junit4]   2> 	5	/solr/clusterstate.json
   [junit4]   2> 
   [junit4]   2> Maximum concurrent children watches above limit:
   [junit4]   2> 
   [junit4]   2> 	27	/solr/overseer/queue
   [junit4]   2> 	14	/solr/live_nodes
   [junit4]   2> 	13	/solr/overseer/collection-queue-work
   [junit4]   2> 	9	/solr/collections
   [junit4]   2> 
   [junit4]   2> 289163 INFO  (TEST-CloudSolrClientTest.testRetryUpdatesWhenClusterStateIsStale-seed#[545782413F1883B0]) [     ] o.a.s.c.ZkTestServer waitForServerDown: 127.0.0.1:52322
   [junit4]   2> 289163 INFO  (TEST-CloudSolrClientTest.testRetryUpdatesWhenClusterStateIsStale-seed#[545782413F1883B0]) [     ] o.a.s.c.ZkTestServer parse host and port list: 127.0.0.1:52322
   [junit4]   2> 289163 INFO  (TEST-CloudSolrClientTest.testRetryUpdatesWhenClusterStateIsStale-seed#[545782413F1883B0]) [     ] o.a.s.c.ZkTestServer connecting to 127.0.0.1 52322
   [junit4]   2> 289164 INFO  (TEST-CloudSolrClientTest.testRetryUpdatesWhenClusterStateIsStale-seed#[545782413F1883B0]) [     ] o.a.s.SolrTestCaseJ4 ###Ending testRetryUpdatesWhenClusterStateIsStale
   [junit4]   2> 289165 INFO  (SUITE-CloudSolrClientTest-seed#[545782413F1883B0]-worker) [     ] o.a.s.SolrTestCaseJ4 ------------------------------------------------------- Done waiting for tracked resources to be released
   [junit4]   2> NOTE: test params are: codec=FastDecompressionCompressingStoredFields(storedFieldsFormat=CompressingStoredFieldsFormat(compressionMode=FAST_DECOMPRESSION, chunkSize=17577, maxDocsPerChunk=4, blockSize=9), termVectorsFormat=CompressingTermVectorsFormat(compressionMode=FAST_DECOMPRESSION, chunkSize=17577, blockSize=9)), sim=Asserting(org.apache.lucene.search.similarities.AssertingSimilarity@2739f8b9), locale=nl-SX, timezone=Etc/GMT+4
   [junit4]   2> NOTE: Mac OS X 10.14.6 x86_64/AdoptOpenJDK 11.0.4 (64-bit)/cpus=6,threads=1,free=89044768,total=330825728
   [junit4]   2> NOTE: All tests run in this JVM: [QueryFacetMapTest, TestFastInputStream, RawValueEvaluatorTest, OperationsTest, AscEvaluatorTest, GreaterThanEvaluatorTest, AppendEvaluatorTest, CloudSolrClientBuilderTest, TestJSONParser, SineEvaluatorTest, NormalDistributionEvaluatorTest, EqualToEvaluatorTest, TestTimeSource, NodePreferenceRulesComparatorTest, TestPolicy, TestDocCollectionWatcher, SolrDocumentTest, HttpSolrClientBadInputTest, ZkConfigFilesTest, LessThanEqualToEvaluatorTest, GraphExpressionTest, TestPolicy2Old, TestPolicyOld, JettyWebappTest, LargeVolumeBinaryJettyTest, LargeVolumeEmbeddedTest, LargeVolumeJettyTest, SolrExampleEmbeddedTest, SolrExampleStreamingBinaryTest, SolrExampleXMLHttp2Test, CloudSolrClientTest]
   [junit4]   2> NOTE: reproduce with: ant test  -Dtestcase=CloudSolrClientTest -Dtests.seed=545782413F1883B0 -Dtests.slow=true -Dtests.locale=nl-SX -Dtests.timezone=Etc/GMT+4 -Dtests.asserts=true -Dtests.file.encoding=UTF-8
   [junit4] ERROR   0.00s J3 | CloudSolrClientTest (suite) <<<
   [junit4]    > Throwable #1: java.lang.Exception: Suite timeout exceeded (>= 7200000 msec).
   [junit4]    > 	at __randomizedtesting.SeedInfo.seed([545782413F1883B0]:0)
   [junit4] Completed [174/206 (1!)] on J3 in 62.59s, 9 tests, 2 errors <<< FAILURES!

[...truncated 51841 lines...]
BUILD FAILED
/Users/jenkins/workspace/Lucene-Solr-master-MacOSX/build.xml:634: The following error occurred while executing this line:
/Users/jenkins/workspace/Lucene-Solr-master-MacOSX/build.xml:507: The following error occurred while executing this line:
/Users/jenkins/workspace/Lucene-Solr-master-MacOSX/build.xml:494: Source checkout is dirty (unversioned/missing files) after running tests!!! Offending files:
* solr/licenses/jetty-start-9.4.24.v20191120-shaded.jar.sha1

Total time: 78 minutes 34 seconds
Build step 'Invoke Ant' marked build as failure
Archiving artifacts
Setting ANT_1_8_2_HOME=/Users/jenkins/tools/hudson.tasks.Ant_AntInstallation/ANT_1.8.2
[WARNINGS] Skipping publisher since build result is FAILURE
Recording test results
Setting ANT_1_8_2_HOME=/Users/jenkins/tools/hudson.tasks.Ant_AntInstallation/ANT_1.8.2
Email was triggered for: Failure - Any
Sending email for trigger: Failure - Any
Setting ANT_1_8_2_HOME=/Users/jenkins/tools/hudson.tasks.Ant_AntInstallation/ANT_1.8.2

[JENKINS] Lucene-Solr-master-MacOSX (64bit/jdk-13.0.1) - Build # 5537 - Still Failing!

Posted by Policeman Jenkins Server <je...@thetaphi.de>.
Build: https://jenkins.thetaphi.de/job/Lucene-Solr-master-MacOSX/5537/
Java: 64bit/jdk-13.0.1 -XX:+UseCompressedOops -XX:+UseParallelGC

2 tests failed.
FAILED:  junit.framework.TestSuite.org.apache.solr.cloud.TestStressInPlaceUpdates

Error Message:
1 thread leaked from SUITE scope at org.apache.solr.cloud.TestStressInPlaceUpdates:     1) Thread[id=15072, name=SessionTracker, state=TIMED_WAITING, group=TGRP-TestStressInPlaceUpdates]         at java.base@13.0.1/java.lang.Thread.sleep(Native Method)         at app//org.apache.zookeeper.server.SessionTrackerImpl.run(SessionTrackerImpl.java:151)

Stack Trace:
com.carrotsearch.randomizedtesting.ThreadLeakError: 1 thread leaked from SUITE scope at org.apache.solr.cloud.TestStressInPlaceUpdates: 
   1) Thread[id=15072, name=SessionTracker, state=TIMED_WAITING, group=TGRP-TestStressInPlaceUpdates]
        at java.base@13.0.1/java.lang.Thread.sleep(Native Method)
        at app//org.apache.zookeeper.server.SessionTrackerImpl.run(SessionTrackerImpl.java:151)
	at __randomizedtesting.SeedInfo.seed([A2482538A05583EB]:0)


FAILED:  junit.framework.TestSuite.org.apache.solr.handler.TestConfigReload

Error Message:
1 thread leaked from SUITE scope at org.apache.solr.handler.TestConfigReload:     1) Thread[id=26717, name=SessionTracker, state=TIMED_WAITING, group=TGRP-TestConfigReload]         at java.base@13.0.1/java.lang.Thread.sleep(Native Method)         at app//org.apache.zookeeper.server.SessionTrackerImpl.run(SessionTrackerImpl.java:151)

Stack Trace:
com.carrotsearch.randomizedtesting.ThreadLeakError: 1 thread leaked from SUITE scope at org.apache.solr.handler.TestConfigReload: 
   1) Thread[id=26717, name=SessionTracker, state=TIMED_WAITING, group=TGRP-TestConfigReload]
        at java.base@13.0.1/java.lang.Thread.sleep(Native Method)
        at app//org.apache.zookeeper.server.SessionTrackerImpl.run(SessionTrackerImpl.java:151)
	at __randomizedtesting.SeedInfo.seed([A2482538A05583EB]:0)




Build Log:
[...truncated 15022 lines...]
   [junit4] Suite: org.apache.solr.cloud.TestStressInPlaceUpdates
   [junit4]   2> 1522720 INFO  (SUITE-TestStressInPlaceUpdates-seed#[A2482538A05583EB]-worker) [     ] o.a.s.SolrTestCaseJ4 Created dataDir: /Users/jenkins/workspace/Lucene-Solr-master-MacOSX/solr/build/solr-core/test/J1/temp/solr.cloud.TestStressInPlaceUpdates_A2482538A05583EB-001/data-dir-100-001
   [junit4]   2> 1522720 WARN  (SUITE-TestStressInPlaceUpdates-seed#[A2482538A05583EB]-worker) [     ] o.a.s.SolrTestCaseJ4 startTrackingSearchers: numOpens=1 numCloses=1
   [junit4]   2> 1522721 INFO  (SUITE-TestStressInPlaceUpdates-seed#[A2482538A05583EB]-worker) [     ] o.a.s.SolrTestCaseJ4 Using PointFields (NUMERIC_POINTS_SYSPROP=true) w/NUMERIC_DOCVALUES_SYSPROP=true
   [junit4]   2> 1522727 INFO  (SUITE-TestStressInPlaceUpdates-seed#[A2482538A05583EB]-worker) [     ] o.a.s.SolrTestCaseJ4 Randomized ssl (false) and clientAuth (false) via: @org.apache.solr.util.RandomizeSSL(reason="", value=0.0/0.0, ssl=0.0/0.0, clientAuth=0.0/0.0) w/ MAC_OS_X supressed clientAuth
   [junit4]   2> 1522730 INFO  (SUITE-TestStressInPlaceUpdates-seed#[A2482538A05583EB]-worker) [     ] o.a.s.SolrTestCaseJ4 SecureRandom sanity checks: test.solr.allowed.securerandom=null & java.security.egd=file:/dev/./urandom
   [junit4]   2> 1522730 INFO  (SUITE-TestStressInPlaceUpdates-seed#[A2482538A05583EB]-worker) [     ] o.a.s.BaseDistributedSearchTestCase Setting hostContext system property: /lft/
   [junit4]   2> 1522731 INFO  (SUITE-TestStressInPlaceUpdates-seed#[A2482538A05583EB]-worker) [     ] o.a.s.SolrTestCaseJ4 ####initCore
   [junit4]   2> 1522736 INFO  (SUITE-TestStressInPlaceUpdates-seed#[A2482538A05583EB]-worker) [     ] o.a.s.c.SolrResourceLoader [null] Added 2 libs to classloader, from paths: [/Users/jenkins/workspace/Lucene-Solr-master-MacOSX/solr/core/src/test-files/solr/collection1/lib, /Users/jenkins/workspace/Lucene-Solr-master-MacOSX/solr/core/src/test-files/solr/collection1/lib/classes]
   [junit4]   2> 1522819 INFO  (SUITE-TestStressInPlaceUpdates-seed#[A2482538A05583EB]-worker) [     ] o.a.s.c.SolrConfig Using Lucene MatchVersion: 9.0.0
   [junit4]   2> 1522830 INFO  (SUITE-TestStressInPlaceUpdates-seed#[A2482538A05583EB]-worker) [     ] o.a.s.s.IndexSchema [null] Schema name=inplace-updates
   [junit4]   2> 1522832 INFO  (SUITE-TestStressInPlaceUpdates-seed#[A2482538A05583EB]-worker) [     ] o.a.s.s.IndexSchema Loaded schema inplace-updates/1.6 with uniqueid field id
   [junit4]   2> 1523060 WARN  (SUITE-TestStressInPlaceUpdates-seed#[A2482538A05583EB]-worker) [     ] o.a.s.f.DistribPackageStore Unable to create [/Users/jenkins/workspace/Lucene-Solr-master-MacOSX/solr/core/src/test-files/solr/filestore] directory in SOLR_HOME [/Users/jenkins/workspace/Lucene-Solr-master-MacOSX/solr/core/src/test-files/solr].  Features requiring this directory may fail.
   [junit4]   2>           => java.security.AccessControlException: access denied ("java.io.FilePermission" "/Users/jenkins/workspace/Lucene-Solr-master-MacOSX/solr/core/src/test-files/solr/filestore" "write")
   [junit4]   2> 	at java.base/java.security.AccessControlContext.checkPermission(AccessControlContext.java:472)
   [junit4]   2> java.security.AccessControlException: access denied ("java.io.FilePermission" "/Users/jenkins/workspace/Lucene-Solr-master-MacOSX/solr/core/src/test-files/solr/filestore" "write")
   [junit4]   2> 	at java.security.AccessControlContext.checkPermission(AccessControlContext.java:472) ~[?:?]
   [junit4]   2> 	at java.security.AccessController.checkPermission(AccessController.java:1036) ~[?:?]
   [junit4]   2> 	at java.lang.SecurityManager.checkPermission(SecurityManager.java:408) ~[?:?]
   [junit4]   2> 	at java.lang.SecurityManager.checkWrite(SecurityManager.java:838) ~[?:?]
   [junit4]   2> 	at java.io.File.mkdir(File.java:1323) ~[?:?]
   [junit4]   2> 	at java.io.File.mkdirs(File.java:1355) ~[?:?]
   [junit4]   2> 	at org.apache.solr.filestore.DistribPackageStore.ensurePackageStoreDir(DistribPackageStore.java:520) ~[java/:?]
   [junit4]   2> 	at org.apache.solr.filestore.DistribPackageStore.<init>(DistribPackageStore.java:77) ~[java/:?]
   [junit4]   2> 	at org.apache.solr.filestore.PackageStoreAPI.<init>(PackageStoreAPI.java:79) ~[java/:?]
   [junit4]   2> 	at org.apache.solr.core.CoreContainer.load(CoreContainer.java:619) ~[java/:?]
   [junit4]   2> 	at org.apache.solr.util.TestHarness.<init>(TestHarness.java:180) ~[java/:?]
   [junit4]   2> 	at org.apache.solr.util.TestHarness.<init>(TestHarness.java:143) ~[java/:?]
   [junit4]   2> 	at org.apache.solr.util.TestHarness.<init>(TestHarness.java:149) ~[java/:?]
   [junit4]   2> 	at org.apache.solr.util.TestHarness.<init>(TestHarness.java:112) ~[java/:?]
   [junit4]   2> 	at org.apache.solr.SolrTestCaseJ4.createCore(SolrTestCaseJ4.java:814) ~[java/:?]
   [junit4]   2> 	at org.apache.solr.SolrTestCaseJ4.initCore(SolrTestCaseJ4.java:804) ~[java/:?]
   [junit4]   2> 	at org.apache.solr.SolrTestCaseJ4.initCore(SolrTestCaseJ4.java:665) ~[java/:?]
   [junit4]   2> 	at org.apache.solr.SolrTestCaseJ4.initCore(SolrTestCaseJ4.java:654) ~[java/:?]
   [junit4]   2> 	at org.apache.solr.cloud.TestStressInPlaceUpdates.beforeSuperClass(TestStressInPlaceUpdates.java:62) ~[test/:?]
   [junit4]   2> 	at jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method) ~[?:?]
   [junit4]   2> 	at jdk.internal.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) ~[?:?]
   [junit4]   2> 	at jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) ~[?:?]
   [junit4]   2> 	at java.lang.reflect.Method.invoke(Method.java:567) ~[?:?]
   [junit4]   2> 	at com.carrotsearch.randomizedtesting.RandomizedRunner.invoke(RandomizedRunner.java:1754) ~[randomizedtesting-runner-2.7.6.jar:?]
   [junit4]   2> 	at com.carrotsearch.randomizedtesting.RandomizedRunner$6.evaluate(RandomizedRunner.java:882) ~[randomizedtesting-runner-2.7.6.jar:?]
   [junit4]   2> 	at com.carrotsearch.randomizedtesting.RandomizedRunner$7.evaluate(RandomizedRunner.java:898) ~[randomizedtesting-runner-2.7.6.jar:?]
   [junit4]   2> 	at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36) ~[randomizedtesting-runner-2.7.6.jar:?]
   [junit4]   2> 	at com.carrotsearch.randomizedtesting.rules.SystemPropertiesRestoreRule$1.evaluate(SystemPropertiesRestoreRule.java:57) ~[randomizedtesting-runner-2.7.6.jar:?]
   [junit4]   2> 	at org.apache.lucene.util.AbstractBeforeAfterRule$1.evaluate(AbstractBeforeAfterRule.java:45) ~[java/:?]
   [junit4]   2> 	at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36) ~[randomizedtesting-runner-2.7.6.jar:?]
   [junit4]   2> 	at org.apache.lucene.util.TestRuleStoreClassName$1.evaluate(TestRuleStoreClassName.java:41) ~[java/:?]
   [junit4]   2> 	at com.carrotsearch.randomizedtesting.rules.NoShadowingOrOverridesOnMethodsRule$1.evaluate(NoShadowingOrOverridesOnMethodsRule.java:40) ~[randomizedtesting-runner-2.7.6.jar:?]
   [junit4]   2> 	at com.carrotsearch.randomizedtesting.rules.NoShadowingOrOverridesOnMethodsRule$1.evaluate(NoShadowingOrOverridesOnMethodsRule.java:40) ~[randomizedtesting-runner-2.7.6.jar:?]
   [junit4]   2> 	at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36) ~[randomizedtesting-runner-2.7.6.jar:?]
   [junit4]   2> 	at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36) ~[randomizedtesting-runner-2.7.6.jar:?]
   [junit4]   2> 	at org.apache.lucene.util.TestRuleAssertionsRequired$1.evaluate(TestRuleAssertionsRequired.java:53) ~[java/:?]
   [junit4]   2> 	at org.apache.lucene.util.TestRuleMarkFailure$1.evaluate(TestRuleMarkFailure.java:47) ~[java/:?]
   [junit4]   2> 	at org.apache.lucene.util.TestRuleIgnoreAfterMaxFailures$1.evaluate(TestRuleIgnoreAfterMaxFailures.java:64) ~[java/:?]
   [junit4]   2> 	at org.apache.lucene.util.TestRuleIgnoreTestSuites$1.evaluate(TestRuleIgnoreTestSuites.java:54) ~[java/:?]
   [junit4]   2> 	at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36) ~[randomizedtesting-runner-2.7.6.jar:?]
   [junit4]   2> 	at com.carrotsearch.randomizedtesting.ThreadLeakControl$StatementRunner.run(ThreadLeakControl.java:370) ~[randomizedtesting-runner-2.7.6.jar:?]
   [junit4]   2> 	at com.carrotsearch.randomizedtesting.ThreadLeakControl.lambda$forkTimeoutingTask$0(ThreadLeakControl.java:826) ~[randomizedtesting-runner-2.7.6.jar:?]
   [junit4]   2> 	at java.lang.Thread.run(Thread.java:830) [?:?]
   [junit4]   2> 1523060 INFO  (SUITE-TestStressInPlaceUpdates-seed#[A2482538A05583EB]-worker) [     ] o.a.s.h.c.HttpShardHandlerFactory Host whitelist initialized: WhitelistHostChecker [whitelistHosts=null, whitelistHostCheckingEnabled=false]
   [junit4]   2> 1523093 WARN  (SUITE-TestStressInPlaceUpdates-seed#[A2482538A05583EB]-worker) [     ] o.e.j.u.s.S.config Trusting all certificates configured for Client@4603b5b1[provider=null,keyStore=null,trustStore=null]
   [junit4]   2> 1523093 WARN  (SUITE-TestStressInPlaceUpdates-seed#[A2482538A05583EB]-worker) [     ] o.e.j.u.s.S.config No Client EndPointIdentificationAlgorithm configured for Client@4603b5b1[provider=null,keyStore=null,trustStore=null]
   [junit4]   2> 1523115 WARN  (SUITE-TestStressInPlaceUpdates-seed#[A2482538A05583EB]-worker) [     ] o.e.j.u.s.S.config Trusting all certificates configured for Client@1c17bd7a[provider=null,keyStore=null,trustStore=null]
   [junit4]   2> 1523116 WARN  (SUITE-TestStressInPlaceUpdates-seed#[A2482538A05583EB]-worker) [     ] o.e.j.u.s.S.config No Client EndPointIdentificationAlgorithm configured for Client@1c17bd7a[provider=null,keyStore=null,trustStore=null]
   [junit4]   2> 1523127 WARN  (SUITE-TestStressInPlaceUpdates-seed#[A2482538A05583EB]-worker) [     ] o.a.s.c.CoreContainer Not all security plugins configured!  authentication=disabled authorization=disabled.  Solr is only as secure as you make it. Consider configuring authentication/authorization before exposing Solr to users internal or external.  See https://s.apache.org/solrsecurity for more info
   [junit4]   2> 1523228 INFO  (SUITE-TestStressInPlaceUpdates-seed#[A2482538A05583EB]-worker) [     ] o.a.s.c.TransientSolrCoreCacheDefault Allocating transient cache for 2147483647 transient cores
   [junit4]   2> 1523228 INFO  (SUITE-TestStressInPlaceUpdates-seed#[A2482538A05583EB]-worker) [     ] o.a.s.h.a.MetricsHistoryHandler No .system collection, keeping metrics history in memory.
   [junit4]   2> 1523265 INFO  (SUITE-TestStressInPlaceUpdates-seed#[A2482538A05583EB]-worker) [     ] o.a.s.m.r.SolrJmxReporter JMX monitoring for 'solr.node' (registry 'solr.node') enabled at server: com.sun.jmx.mbeanserver.JmxMBeanServer@74ac98b3
   [junit4]   2> 1523278 INFO  (SUITE-TestStressInPlaceUpdates-seed#[A2482538A05583EB]-worker) [     ] o.a.s.m.r.SolrJmxReporter JMX monitoring for 'solr.jvm' (registry 'solr.jvm') enabled at server: com.sun.jmx.mbeanserver.JmxMBeanServer@74ac98b3
   [junit4]   2> 1523279 INFO  (SUITE-TestStressInPlaceUpdates-seed#[A2482538A05583EB]-worker) [     ] o.a.s.m.r.SolrJmxReporter JMX monitoring for 'solr.jetty' (registry 'solr.jetty') enabled at server: com.sun.jmx.mbeanserver.JmxMBeanServer@74ac98b3
   [junit4]   2> 1523283 INFO  (coreLoadExecutor-4735-thread-1) [    x:collection1 ] o.a.s.c.SolrResourceLoader [null] Added 2 libs to classloader, from paths: [/Users/jenkins/workspace/Lucene-Solr-master-MacOSX/solr/core/src/test-files/solr/collection1/lib, /Users/jenkins/workspace/Lucene-Solr-master-MacOSX/solr/core/src/test-files/solr/collection1/lib/classes]
   [junit4]   2> 1523309 INFO  (coreLoadExecutor-4735-thread-1) [    x:collection1 ] o.a.s.c.SolrConfig Using Lucene MatchVersion: 9.0.0
   [junit4]   2> 1523323 INFO  (coreLoadExecutor-4735-thread-1) [    x:collection1 ] o.a.s.s.IndexSchema [collection1] Schema name=inplace-updates
   [junit4]   2> 1523335 INFO  (coreLoadExecutor-4735-thread-1) [    x:collection1 ] o.a.s.s.IndexSchema Loaded schema inplace-updates/1.6 with uniqueid field id
   [junit4]   2> 1523335 INFO  (coreLoadExecutor-4735-thread-1) [    x:collection1 ] o.a.s.c.CoreContainer Creating SolrCore 'collection1' using configuration from instancedir /Users/jenkins/workspace/Lucene-Solr-master-MacOSX/solr/core/src/test-files/solr/collection1, trusted=true
   [junit4]   2> 1523336 INFO  (coreLoadExecutor-4735-thread-1) [    x:collection1 ] o.a.s.m.r.SolrJmxReporter JMX monitoring for 'solr.core.collection1' (registry 'solr.core.collection1') enabled at server: com.sun.jmx.mbeanserver.JmxMBeanServer@74ac98b3
   [junit4]   2> 1523336 INFO  (coreLoadExecutor-4735-thread-1) [    x:collection1 ] o.a.s.c.SolrCore [[collection1] ] Opening new SolrCore at [/Users/jenkins/workspace/Lucene-Solr-master-MacOSX/solr/core/src/test-files/solr/collection1], dataDir=[/Users/jenkins/workspace/Lucene-Solr-master-MacOSX/solr/build/solr-core/test/J1/temp/solr.cloud.TestStressInPlaceUpdates_A2482538A05583EB-001/data-dir-100-001/]
   [junit4]   2> 1523338 INFO  (coreLoadExecutor-4735-thread-1) [    x:collection1 ] o.a.s.u.RandomMergePolicy RandomMergePolicy wrapping class org.apache.lucene.index.TieredMergePolicy: [TieredMergePolicy: maxMergeAtOnce=45, maxMergeAtOnceExplicit=19, maxMergedSegmentMB=12.7734375, floorSegmentMB=1.966796875, forceMergeDeletesPctAllowed=2.8096685791294, segmentsPerTier=44.0, maxCFSSegmentSizeMB=8.796093022207999E12, noCFSRatio=1.0, deletesPctAllowed=22.563996868910873
   [junit4]   2> 1523342 WARN  (coreLoadExecutor-4735-thread-1) [    x:collection1 ] o.a.s.c.RequestHandlers INVALID paramSet a in requestHandler {type = requestHandler,name = /dump,class = DumpRequestHandler,attributes = {initParams=a, name=/dump, class=DumpRequestHandler},args = {defaults={a=A, b=B}}}
   [junit4]   2> 1523422 INFO  (coreLoadExecutor-4735-thread-1) [    x:collection1 ] o.a.s.u.UpdateHandler Using UpdateLog implementation: org.apache.solr.update.UpdateLog
   [junit4]   2> 1523422 INFO  (coreLoadExecutor-4735-thread-1) [    x:collection1 ] o.a.s.u.UpdateLog Initializing UpdateLog: dataDir= defaultSyncLevel=FLUSH numRecordsToKeep=100 maxNumLogsToKeep=10 numVersionBuckets=65536
   [junit4]   2> 1523424 INFO  (coreLoadExecutor-4735-thread-1) [    x:collection1 ] o.a.s.u.CommitTracker Hard AutoCommit: disabled
   [junit4]   2> 1523424 INFO  (coreLoadExecutor-4735-thread-1) [    x:collection1 ] o.a.s.u.CommitTracker Soft AutoCommit: disabled
   [junit4]   2> 1523425 INFO  (coreLoadExecutor-4735-thread-1) [    x:collection1 ] o.a.s.u.RandomMergePolicy RandomMergePolicy wrapping class org.apache.lucene.index.LogDocMergePolicy: [LogDocMergePolicy: minMergeSize=1000, mergeFactor=45, maxMergeSize=9223372036854775807, maxMergeSizeForForcedMerge=9223372036854775807, calibrateSizeByDeletes=true, maxMergeDocs=2147483647, maxCFSSegmentSizeMB=8.796093022207999E12, noCFSRatio=1.0]
   [junit4]   2> 1523425 INFO  (coreLoadExecutor-4735-thread-1) [    x:collection1 ] o.a.s.s.SolrIndexSearcher Opening [Searcher@427671d[collection1] main]
   [junit4]   2> 1523427 WARN  (coreLoadExecutor-4735-thread-1) [    x:collection1 ] o.a.s.r.ManagedResourceStorage Cannot write to config directory /Users/jenkins/workspace/Lucene-Solr-master-MacOSX/solr/core/src/test-files/solr/collection1/conf; switching to use InMemory storage instead.
   [junit4]   2> 1523427 INFO  (coreLoadExecutor-4735-thread-1) [    x:collection1 ] o.a.s.h.ReplicationHandler Commits will be reserved for 10000ms.
   [junit4]   2> 1523428 INFO  (coreLoadExecutor-4735-thread-1) [    x:collection1 ] o.a.s.u.UpdateLog Could not find max version in index or recent updates, using new clock 1655887044436557824
   [junit4]   2> 1523438 INFO  (searcherExecutor-4736-thread-1-processing-x:collection1) [    x:collection1 ] o.a.s.c.SolrCore [collection1] Registered new searcher Searcher@427671d[collection1] main{ExitableDirectoryReader(UninvertingDirectoryReader())}
   [junit4]   2> 1523440 INFO  (SUITE-TestStressInPlaceUpdates-seed#[A2482538A05583EB]-worker) [     ] o.a.s.SolrTestCaseJ4 ####initCore end
   [junit4]   2> 1523459 INFO  (TEST-TestStressInPlaceUpdates.stressTest-seed#[A2482538A05583EB]) [     ] o.a.s.c.ZkTestServer STARTING ZK TEST SERVER
   [junit4]   2> 1523461 INFO  (ZkTestServer Run Thread) [     ] o.a.s.c.ZkTestServer client port:0.0.0.0/0.0.0.0:0
   [junit4]   2> 1523461 INFO  (ZkTestServer Run Thread) [     ] o.a.s.c.ZkTestServer Starting server
   [junit4]   2> 1523569 INFO  (TEST-TestStressInPlaceUpdates.stressTest-seed#[A2482538A05583EB]) [     ] o.a.s.c.ZkTestServer start zk server on port:63237
   [junit4]   2> 1523569 INFO  (TEST-TestStressInPlaceUpdates.stressTest-seed#[A2482538A05583EB]) [     ] o.a.s.c.ZkTestServer waitForServerUp: 127.0.0.1:63237
   [junit4]   2> 1523569 INFO  (TEST-TestStressInPlaceUpdates.stressTest-seed#[A2482538A05583EB]) [     ] o.a.s.c.ZkTestServer parse host and port list: 127.0.0.1:63237
   [junit4]   2> 1523569 INFO  (TEST-TestStressInPlaceUpdates.stressTest-seed#[A2482538A05583EB]) [     ] o.a.s.c.ZkTestServer connecting to 127.0.0.1 63237
   [junit4]   2> 1523573 INFO  (TEST-TestStressInPlaceUpdates.stressTest-seed#[A2482538A05583EB]) [     ] o.a.s.c.c.ConnectionManager Waiting for client to connect to ZooKeeper
   [junit4]   2> 1523578 INFO  (zkConnectionManagerCallback-5398-thread-1) [     ] o.a.s.c.c.ConnectionManager zkClient has connected
   [junit4]   2> 1523578 INFO  (TEST-TestStressInPlaceUpdates.stressTest-seed#[A2482538A05583EB]) [     ] o.a.s.c.c.ConnectionManager Client is connected to ZooKeeper
   [junit4]   2> 1523582 INFO  (TEST-TestStressInPlaceUpdates.stressTest-seed#[A2482538A05583EB]) [     ] o.a.s.c.c.ConnectionManager Waiting for client to connect to ZooKeeper
   [junit4]   2> 1523585 INFO  (zkConnectionManagerCallback-5400-thread-1) [     ] o.a.s.c.c.ConnectionManager zkClient has connected
   [junit4]   2> 1523585 INFO  (TEST-TestStressInPlaceUpdates.stressTest-seed#[A2482538A05583EB]) [     ] o.a.s.c.c.ConnectionManager Client is connected to ZooKeeper
   [junit4]   2> 1523588 INFO  (TEST-TestStressInPlaceUpdates.stressTest-seed#[A2482538A05583EB]) [     ] o.a.s.c.ZkTestServer put /Users/jenkins/workspace/Lucene-Solr-master-MacOSX/solr/core/src/test-files/solr/collection1/conf/solrconfig-tlog.xml to /configs/conf1/solrconfig.xml
   [junit4]   2> 1523591 INFO  (TEST-TestStressInPlaceUpdates.stressTest-seed#[A2482538A05583EB]) [     ] o.a.s.c.ZkTestServer put /Users/jenkins/workspace/Lucene-Solr-master-MacOSX/solr/core/src/test-files/solr/collection1/conf/schema-inplace-updates.xml to /configs/conf1/schema.xml
   [junit4]   2> 1523594 INFO  (TEST-TestStressInPlaceUpdates.stressTest-seed#[A2482538A05583EB]) [     ] o.a.s.c.ZkTestServer put /Users/jenkins/workspace/Lucene-Solr-master-MacOSX/solr/core/src/test-files/solr/collection1/conf/solrconfig.snippet.randomindexconfig.xml to /configs/conf1/solrconfig.snippet.randomindexconfig.xml
   [junit4]   2> 1523596 INFO  (TEST-TestStressInPlaceUpdates.stressTest-seed#[A2482538A05583EB]) [     ] o.a.s.c.ZkTestServer put /Users/jenkins/workspace/Lucene-Solr-master-MacOSX/solr/core/src/test-files/solr/collection1/conf/stopwords.txt to /configs/conf1/stopwords.txt
   [junit4]   2> 1523602 INFO  (TEST-TestStressInPlaceUpdates.stressTest-seed#[A2482538A05583EB]) [     ] o.a.s.c.ZkTestServer put /Users/jenkins/workspace/Lucene-Solr-master-MacOSX/solr/core/src/test-files/solr/collection1/conf/protwords.txt to /configs/conf1/protwords.txt
   [junit4]   2> 1523606 INFO  (TEST-TestStressInPlaceUpdates.stressTest-seed#[A2482538A05583EB]) [     ] o.a.s.c.ZkTestServer put /Users/jenkins/workspace/Lucene-Solr-master-MacOSX/solr/core/src/test-files/solr/collection1/conf/currency.xml to /configs/conf1/currency.xml
   [junit4]   2> 1523609 INFO  (TEST-TestStressInPlaceUpdates.stressTest-seed#[A2482538A05583EB]) [     ] o.a.s.c.ZkTestServer put /Users/jenkins/workspace/Lucene-Solr-master-MacOSX/solr/core/src/test-files/solr/collection1/conf/enumsConfig.xml to /configs/conf1/enumsConfig.xml
   [junit4]   2> 1523628 INFO  (TEST-TestStressInPlaceUpdates.stressTest-seed#[A2482538A05583EB]) [     ] o.a.s.c.ZkTestServer put /Users/jenkins/workspace/Lucene-Solr-master-MacOSX/solr/core/src/test-files/solr/collection1/conf/open-exchange-rates.json to /configs/conf1/open-exchange-rates.json
   [junit4]   2> 1523651 INFO  (TEST-TestStressInPlaceUpdates.stressTest-seed#[A2482538A05583EB]) [     ] o.a.s.c.ZkTestServer put /Users/jenkins/workspace/Lucene-Solr-master-MacOSX/solr/core/src/test-files/solr/collection1/conf/mapping-ISOLatin1Accent.txt to /configs/conf1/mapping-ISOLatin1Accent.txt
   [junit4]   2> 1523665 INFO  (TEST-TestStressInPlaceUpdates.stressTest-seed#[A2482538A05583EB]) [     ] o.a.s.c.ZkTestServer put /Users/jenkins/workspace/Lucene-Solr-master-MacOSX/solr/core/src/test-files/solr/collection1/conf/old_synonyms.txt to /configs/conf1/old_synonyms.txt
   [junit4]   2> 1523679 INFO  (TEST-TestStressInPlaceUpdates.stressTest-seed#[A2482538A05583EB]) [     ] o.a.s.c.ZkTestServer put /Users/jenkins/workspace/Lucene-Solr-master-MacOSX/solr/core/src/test-files/solr/collection1/conf/synonyms.txt to /configs/conf1/synonyms.txt
   [junit4]   2> 1523704 INFO  (TEST-TestStressInPlaceUpdates.stressTest-seed#[A2482538A05583EB]) [     ] o.a.s.c.AbstractFullDistribZkTestBase Will use NRT replicas unless explicitly asked otherwise
   [junit4]   2> 1524739 WARN  (TEST-TestStressInPlaceUpdates.stressTest-seed#[A2482538A05583EB]) [     ] o.e.j.s.h.g.GzipHandler minGzipSize of 0 is inefficient for short content, break even is size 23
   [junit4]   2> 1524740 INFO  (TEST-TestStressInPlaceUpdates.stressTest-seed#[A2482538A05583EB]) [     ] o.a.s.c.s.e.JettySolrRunner Start Jetty (configured port=0, binding port=0)
   [junit4]   2> 1524741 INFO  (TEST-TestStressInPlaceUpdates.stressTest-seed#[A2482538A05583EB]) [     ] o.a.s.c.s.e.JettySolrRunner Trying to start Jetty on port 0 try number 1 ...
   [junit4]   2> 1524741 INFO  (TEST-TestStressInPlaceUpdates.stressTest-seed#[A2482538A05583EB]) [     ] o.e.j.s.Server jetty-9.4.24.v20191120; built: 2019-11-20T21:37:49.771Z; git: 363d5f2df3a8a28de40604320230664b9c793c16; jvm 13.0.1+9
   [junit4]   2> 1524767 INFO  (TEST-TestStressInPlaceUpdates.stressTest-seed#[A2482538A05583EB]) [     ] o.e.j.s.session DefaultSessionIdManager workerName=node0
   [junit4]   2> 1524768 INFO  (TEST-TestStressInPlaceUpdates.stressTest-seed#[A2482538A05583EB]) [     ] o.e.j.s.session No SessionScavenger set, using defaults
   [junit4]   2> 1524770 INFO  (TEST-TestStressInPlaceUpdates.stressTest-seed#[A2482538A05583EB]) [     ] o.e.j.s.session node0 Scavenging every 600000ms
   [junit4]   2> 1524784 INFO  (TEST-TestStressInPlaceUpdates.stressTest-seed#[A2482538A05583EB]) [     ] o.e.j.s.h.ContextHandler Started o.e.j.s.ServletContextHandler@727c3bb3{/lft,null,AVAILABLE}
   [junit4]   2> 1524795 INFO  (TEST-TestStressInPlaceUpdates.stressTest-seed#[A2482538A05583EB]) [     ] o.e.j.s.AbstractConnector Started ServerConnector@68fff600{HTTP/1.1,[http/1.1, h2c]}{127.0.0.1:63251}
   [junit4]   2> 1524795 INFO  (TEST-TestStressInPlaceUpdates.stressTest-seed#[A2482538A05583EB]) [     ] o.e.j.s.Server Started @1524841ms
   [junit4]   2> 1524796 INFO  (TEST-TestStressInPlaceUpdates.stressTest-seed#[A2482538A05583EB]) [     ] o.a.s.c.s.e.JettySolrRunner Jetty properties: {hostContext=/lft, solr.data.dir=/Users/jenkins/workspace/Lucene-Solr-master-MacOSX/solr/build/solr-core/test/J1/temp/solr.cloud.TestStressInPlaceUpdates_A2482538A05583EB-001/tempDir-001/control/data, hostPort=63251, coreRootDirectory=/Users/jenkins/workspace/Lucene-Solr-master-MacOSX/solr/build/solr-core/test/J1/../../../../../../../../../Users/jenkins/workspace/Lucene-Solr-master-MacOSX/solr/build/solr-core/test/J1/temp/solr.cloud.TestStressInPlaceUpdates_A2482538A05583EB-001/control-001/cores, replicaType=NRT}
   [junit4]   2> 1524802 ERROR (TEST-TestStressInPlaceUpdates.stressTest-seed#[A2482538A05583EB]) [     ] o.a.s.u.StartupLoggingUtils Missing Java Option solr.log.dir. Logging may be missing or incomplete.
   [junit4]   2> 1524802 INFO  (TEST-TestStressInPlaceUpdates.stressTest-seed#[A2482538A05583EB]) [     ] o.a.s.s.SolrDispatchFilter Using logger factory org.apache.logging.slf4j.Log4jLoggerFactory
   [junit4]   2> 1524802 INFO  (TEST-TestStressInPlaceUpdates.stressTest-seed#[A2482538A05583EB]) [     ] o.a.s.s.SolrDispatchFilter  ___      _       Welcome to Apache Solr™ version 9.0.0
   [junit4]   2> 1524802 INFO  (TEST-TestStressInPlaceUpdates.stressTest-seed#[A2482538A05583EB]) [     ] o.a.s.s.SolrDispatchFilter / __| ___| |_ _   Starting in cloud mode on port null
   [junit4]   2> 1524803 INFO  (TEST-TestStressInPlaceUpdates.stressTest-seed#[A2482538A05583EB]) [     ] o.a.s.s.SolrDispatchFilter \__ \/ _ \ | '_|  Install dir: null
   [junit4]   2> 1524803 INFO  (TEST-TestStressInPlaceUpdates.stressTest-seed#[A2482538A05583EB]) [     ] o.a.s.s.SolrDispatchFilter |___/\___/_|_|    Start time: 2020-01-16T12:15:46.524234Z
   [junit4]   2> 1524807 INFO  (TEST-TestStressInPlaceUpdates.stressTest-seed#[A2482538A05583EB]) [     ] o.a.s.c.c.ConnectionManager Waiting for client to connect to ZooKeeper
   [junit4]   2> 1524813 INFO  (zkConnectionManagerCallback-5402-thread-1) [     ] o.a.s.c.c.ConnectionManager zkClient has connected
   [junit4]   2> 1524813 INFO  (TEST-TestStressInPlaceUpdates.stressTest-seed#[A2482538A05583EB]) [     ] o.a.s.c.c.ConnectionManager Client is connected to ZooKeeper
   [junit4]   2> 1524923 INFO  (TEST-TestStressInPlaceUpdates.stressTest-seed#[A2482538A05583EB]) [     ] o.a.s.s.SolrDispatchFilter Loading solr.xml from SolrHome (not found in ZooKeeper)
   [junit4]   2> 1524923 INFO  (TEST-TestStressInPlaceUpdates.stressTest-seed#[A2482538A05583EB]) [     ] o.a.s.c.SolrXmlConfig Loading container configuration from /Users/jenkins/workspace/Lucene-Solr-master-MacOSX/solr/build/solr-core/test/J1/temp/solr.cloud.TestStressInPlaceUpdates_A2482538A05583EB-001/control-001/solr.xml
   [junit4]   2> 1524939 INFO  (TEST-TestStressInPlaceUpdates.stressTest-seed#[A2482538A05583EB]) [     ] o.a.s.c.SolrXmlConfig Configuration parameter autoReplicaFailoverWorkLoopDelay is ignored
   [junit4]   2> 1524939 INFO  (TEST-TestStressInPlaceUpdates.stressTest-seed#[A2482538A05583EB]) [     ] o.a.s.c.SolrXmlConfig Configuration parameter autoReplicaFailoverBadNodeExpiration is ignored
   [junit4]   2> 1524940 INFO  (TEST-TestStressInPlaceUpdates.stressTest-seed#[A2482538A05583EB]) [     ] o.a.s.c.SolrXmlConfig MBean server found: com.sun.jmx.mbeanserver.JmxMBeanServer@74ac98b3, but no JMX reporters were configured - adding default JMX reporter.
   [junit4]   2> 1525092 INFO  (TEST-TestStressInPlaceUpdates.stressTest-seed#[A2482538A05583EB]) [     ] o.a.s.h.c.HttpShardHandlerFactory Host whitelist initialized: WhitelistHostChecker [whitelistHosts=null, whitelistHostCheckingEnabled=false]
   [junit4]   2> 1525093 WARN  (TEST-TestStressInPlaceUpdates.stressTest-seed#[A2482538A05583EB]) [     ] o.e.j.u.s.S.config Trusting all certificates configured for Client@460c54b5[provider=null,keyStore=null,trustStore=null]
   [junit4]   2> 1525093 WARN  (TEST-TestStressInPlaceUpdates.stressTest-seed#[A2482538A05583EB]) [     ] o.e.j.u.s.S.config No Client EndPointIdentificationAlgorithm configured for Client@460c54b5[provider=null,keyStore=null,trustStore=null]
   [junit4]   2> 1525096 WARN  (TEST-TestStressInPlaceUpdates.stressTest-seed#[A2482538A05583EB]) [     ] o.e.j.u.s.S.config Trusting all certificates configured for Client@6ed9dc8f[provider=null,keyStore=null,trustStore=null]
   [junit4]   2> 1525096 WARN  (TEST-TestStressInPlaceUpdates.stressTest-seed#[A2482538A05583EB]) [     ] o.e.j.u.s.S.config No Client EndPointIdentificationAlgorithm configured for Client@6ed9dc8f[provider=null,keyStore=null,trustStore=null]
   [junit4]   2> 1525098 INFO  (TEST-TestStressInPlaceUpdates.stressTest-seed#[A2482538A05583EB]) [     ] o.a.s.c.ZkContainer Zookeeper client=127.0.0.1:63237/solr
   [junit4]   2> 1525100 INFO  (TEST-TestStressInPlaceUpdates.stressTest-seed#[A2482538A05583EB]) [     ] o.a.s.c.c.ConnectionManager Waiting for client to connect to ZooKeeper
   [junit4]   2> 1525108 INFO  (zkConnectionManagerCallback-5409-thread-1) [     ] o.a.s.c.c.ConnectionManager zkClient has connected
   [junit4]   2> 1525109 INFO  (TEST-TestStressInPlaceUpdates.stressTest-seed#[A2482538A05583EB]) [     ] o.a.s.c.c.ConnectionManager Client is connected to ZooKeeper
   [junit4]   2> 1525225 INFO  (TEST-TestStressInPlaceUpdates.stressTest-seed#[A2482538A05583EB]) [     ] o.a.s.c.c.ConnectionManager Waiting for client to connect to ZooKeeper
   [junit4]   2> 1525234 INFO  (zkConnectionManagerCallback-5411-thread-1) [     ] o.a.s.c.c.ConnectionManager zkClient has connected
   [junit4]   2> 1525234 INFO  (TEST-TestStressInPlaceUpdates.stressTest-seed#[A2482538A05583EB]) [     ] o.a.s.c.c.ConnectionManager Client is connected to ZooKeeper
   [junit4]   2> 1525687 INFO  (TEST-TestStressInPlaceUpdates.stressTest-seed#[A2482538A05583EB]) [     ] o.a.s.c.OverseerElectionContext I am going to be the leader 127.0.0.1:63251_lft
   [junit4]   2> 1525691 INFO  (TEST-TestStressInPlaceUpdates.stressTest-seed#[A2482538A05583EB]) [     ] o.a.s.c.Overseer Overseer (id=72058311538573316-127.0.0.1:63251_lft-n_0000000000) starting
   [junit4]   2> 1525717 INFO  (OverseerStateUpdate-72058311538573316-127.0.0.1:63251_lft-n_0000000000) [n:127.0.0.1:63251_lft     ] o.a.s.c.Overseer Starting to work on the main queue : 127.0.0.1:63251_lft
   [junit4]   2> 1525721 INFO  (TEST-TestStressInPlaceUpdates.stressTest-seed#[A2482538A05583EB]) [     ] o.a.s.c.ZkController Register node as live in ZooKeeper:/live_nodes/127.0.0.1:63251_lft
   [junit4]   2> 1525735 INFO  (OverseerStateUpdate-72058311538573316-127.0.0.1:63251_lft-n_0000000000) [n:127.0.0.1:63251_lft     ] o.a.s.c.c.ZkStateReader Updated live nodes from ZooKeeper... (0) -> (1)
   [junit4]   2> 1525771 INFO  (TEST-TestStressInPlaceUpdates.stressTest-seed#[A2482538A05583EB]) [     ] o.a.s.p.PackageLoader /packages.json updated to version -1
   [junit4]   2> 1525773 WARN  (TEST-TestStressInPlaceUpdates.stressTest-seed#[A2482538A05583EB]) [     ] o.a.s.c.CoreContainer Not all security plugins configured!  authentication=disabled authorization=disabled.  Solr is only as secure as you make it. Consider configuring authentication/authorization before exposing Solr to users internal or external.  See https://s.apache.org/solrsecurity for more info
   [junit4]   2> 1525846 INFO  (TEST-TestStressInPlaceUpdates.stressTest-seed#[A2482538A05583EB]) [     ] o.a.s.h.a.MetricsHistoryHandler No .system collection, keeping metrics history in memory.
   [junit4]   2> 1525884 INFO  (TEST-TestStressInPlaceUpdates.stressTest-seed#[A2482538A05583EB]) [     ] o.a.s.m.r.SolrJmxReporter JMX monitoring for 'solr.node' (registry 'solr.node') enabled at server: com.sun.jmx.mbeanserver.JmxMBeanServer@74ac98b3
   [junit4]   2> 1525899 INFO  (TEST-TestStressInPlaceUpdates.stressTest-seed#[A2482538A05583EB]) [     ] o.a.s.m.r.SolrJmxReporter JMX monitoring for 'solr.jvm' (registry 'solr.jvm') enabled at server: com.sun.jmx.mbeanserver.JmxMBeanServer@74ac98b3
   [junit4]   2> 1525899 INFO  (TEST-TestStressInPlaceUpdates.stressTest-seed#[A2482538A05583EB]) [     ] o.a.s.m.r.SolrJmxReporter JMX monitoring for 'solr.jetty' (registry 'solr.jetty') enabled at server: com.sun.jmx.mbeanserver.JmxMBeanServer@74ac98b3
   [junit4]   2> 1525900 INFO  (TEST-TestStressInPlaceUpdates.stressTest-seed#[A2482538A05583EB]) [     ] o.a.s.c.CorePropertiesLocator Found 0 core definitions underneath /Users/jenkins/workspace/Lucene-Solr-master-MacOSX/solr/build/solr-core/test/J1/../../../../../../../../../Users/jenkins/workspace/Lucene-Solr-master-MacOSX/solr/build/solr-core/test/J1/temp/solr.cloud.TestStressInPlaceUpdates_A2482538A05583EB-001/control-001/cores
   [junit4]   2> 1525932 INFO  (TEST-TestStressInPlaceUpdates.stressTest-seed#[A2482538A05583EB]) [     ] o.a.s.c.c.ConnectionManager Waiting for client to connect to ZooKeeper
   [junit4]   2> 1525936 INFO  (zkConnectionManagerCallback-5420-thread-1) [     ] o.a.s.c.c.ConnectionManager zkClient has connected
   [junit4]   2> 1525937 INFO  (TEST-TestStressInPlaceUpdates.stressTest-seed#[A2482538A05583EB]) [     ] o.a.s.c.c.ConnectionManager Client is connected to ZooKeeper
   [junit4]   2> 1525941 INFO  (TEST-TestStressInPlaceUpdates.stressTest-seed#[A2482538A05583EB]) [     ] o.a.s.c.c.ZkStateReader Updated live nodes from ZooKeeper... (0) -> (1)
   [junit4]   2> 1526001 INFO  (TEST-TestStressInPlaceUpdates.stressTest-seed#[A2482538A05583EB]) [     ] o.a.s.c.s.i.ZkClientClusterStateProvider Cluster at 127.0.0.1:63237/solr ready
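The line above is the test's own SolrJ client reporting the embedded cluster at 127.0.0.1:63237/solr as ready. Purely as a hedged illustration (not part of the test output), a SolrJ client for that ZooKeeper address could be built roughly as below; the host, chroot and collection name are taken from the log, the class and method names around them are an assumption for the sketch:

    import java.util.Collections;
    import java.util.Optional;
    import org.apache.solr.client.solrj.impl.CloudSolrClient;

    public class ZkClientSketch {                      // hypothetical helper, not from the test
        public static CloudSolrClient connect() {
            // Connect to the embedded test ZooKeeper (127.0.0.1:63237, chroot /solr, as logged).
            CloudSolrClient client = new CloudSolrClient.Builder(
                    Collections.singletonList("127.0.0.1:63237"), Optional.of("/solr")).build();
            client.connect();                          // block until the cluster state is readable
            client.setDefaultCollection("control_collection");
            return client;
        }
    }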
   [junit4]   2> 1526006 INFO  (qtp1473878813-15098) [n:127.0.0.1:63251_lft     ] o.a.s.h.a.CollectionsHandler Invoked Collection Action :create with params collection.configName=conf1&name=control_collection&nrtReplicas=1&action=CREATE&numShards=1&createNodeSet=127.0.0.1:63251_lft&wt=javabin&version=2 and sendToOCPQueue=true
   [junit4]   2> 1526031 INFO  (OverseerThreadFactory-4750-thread-1) [     ] o.a.s.c.a.c.CreateCollectionCmd Create collection control_collection
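The CollectionsHandler request logged just above is the Collections API CREATE action for control_collection. A rough SolrJ equivalent, with the parameters copied from the logged request and the client assumed to come from the sketch above, would look like this (an illustration, not the test's own code):

    import org.apache.solr.client.solrj.SolrClient;
    import org.apache.solr.client.solrj.request.CollectionAdminRequest;

    // Sketch of the logged CREATE call: 1 shard, 1 NRT replica, pinned to the control node.
    static void createControlCollection(SolrClient client) throws Exception {
        CollectionAdminRequest.Create create =
            CollectionAdminRequest.createCollection("control_collection", "conf1", 1, 1);
        create.setCreateNodeSet("127.0.0.1:63251_lft");   // value taken from the logged params
        create.process(client);
    }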
   [junit4]   2> 1526160 INFO  (qtp1473878813-15100) [n:127.0.0.1:63251_lft    x:control_collection_shard1_replica_n1 ] o.a.s.h.a.CoreAdminOperation core create command qt=/admin/cores&collection.configName=conf1&newCollection=true&name=control_collection_shard1_replica_n1&action=CREATE&numShards=1&collection=control_collection&shard=shard1&wt=javabin&version=2&replicaType=NRT
   [junit4]   2> 1526160 INFO  (qtp1473878813-15100) [n:127.0.0.1:63251_lft    x:control_collection_shard1_replica_n1 ] o.a.s.c.TransientSolrCoreCacheDefault Allocating transient cache for 4 transient cores
   [junit4]   2> 1527203 INFO  (qtp1473878813-15100) [n:127.0.0.1:63251_lft c:control_collection s:shard1  x:control_collection_shard1_replica_n1 ] o.a.s.c.SolrConfig Using Lucene MatchVersion: 9.0.0
   [junit4]   2> 1527227 INFO  (qtp1473878813-15100) [n:127.0.0.1:63251_lft c:control_collection s:shard1  x:control_collection_shard1_replica_n1 ] o.a.s.s.IndexSchema [control_collection_shard1_replica_n1] Schema name=inplace-updates
   [junit4]   2> 1527231 INFO  (qtp1473878813-15100) [n:127.0.0.1:63251_lft c:control_collection s:shard1  x:control_collection_shard1_replica_n1 ] o.a.s.s.IndexSchema Loaded schema inplace-updates/1.6 with uniqueid field id
   [junit4]   2> 1527231 INFO  (qtp1473878813-15100) [n:127.0.0.1:63251_lft c:control_collection s:shard1  x:control_collection_shard1_replica_n1 ] o.a.s.c.CoreContainer Creating SolrCore 'control_collection_shard1_replica_n1' using configuration from collection control_collection, trusted=true
   [junit4]   2> 1527231 INFO  (qtp1473878813-15100) [n:127.0.0.1:63251_lft c:control_collection s:shard1  x:control_collection_shard1_replica_n1 ] o.a.s.m.r.SolrJmxReporter JMX monitoring for 'solr.core.control_collection.shard1.replica_n1' (registry 'solr.core.control_collection.shard1.replica_n1') enabled at server: com.sun.jmx.mbeanserver.JmxMBeanServer@74ac98b3
   [junit4]   2> 1527232 INFO  (qtp1473878813-15100) [n:127.0.0.1:63251_lft c:control_collection s:shard1  x:control_collection_shard1_replica_n1 ] o.a.s.c.SolrCore [[control_collection_shard1_replica_n1] ] Opening new SolrCore at [/Users/jenkins/workspace/Lucene-Solr-master-MacOSX/solr/build/solr-core/test/J1/temp/solr.cloud.TestStressInPlaceUpdates_A2482538A05583EB-001/control-001/cores/control_collection_shard1_replica_n1], dataDir=[/Users/jenkins/workspace/Lucene-Solr-master-MacOSX/solr/build/solr-core/test/J1/../../../../../../../../../Users/jenkins/workspace/Lucene-Solr-master-MacOSX/solr/build/solr-core/test/J1/temp/solr.cloud.TestStressInPlaceUpdates_A2482538A05583EB-001/control-001/cores/control_collection_shard1_replica_n1/data/]
   [junit4]   2> 1527236 INFO  (qtp1473878813-15100) [n:127.0.0.1:63251_lft c:control_collection s:shard1  x:control_collection_shard1_replica_n1 ] o.a.s.u.RandomMergePolicy RandomMergePolicy wrapping class org.apache.lucene.index.TieredMergePolicy: [TieredMergePolicy: maxMergeAtOnce=45, maxMergeAtOnceExplicit=19, maxMergedSegmentMB=12.7734375, floorSegmentMB=1.966796875, forceMergeDeletesPctAllowed=2.8096685791294, segmentsPerTier=44.0, maxCFSSegmentSizeMB=8.796093022207999E12, noCFSRatio=1.0, deletesPctAllowed=22.563996868910873
   [junit4]   2> 1527241 WARN  (qtp1473878813-15100) [n:127.0.0.1:63251_lft c:control_collection s:shard1  x:control_collection_shard1_replica_n1 ] o.a.s.c.RequestHandlers INVALID paramSet a in requestHandler {type = requestHandler,name = /dump,class = DumpRequestHandler,attributes = {initParams=a, name=/dump, class=DumpRequestHandler},args = {defaults={a=A, b=B}}}
   [junit4]   2> 1527334 INFO  (qtp1473878813-15100) [n:127.0.0.1:63251_lft c:control_collection s:shard1  x:control_collection_shard1_replica_n1 ] o.a.s.u.UpdateHandler Using UpdateLog implementation: org.apache.solr.update.UpdateLog
   [junit4]   2> 1527334 INFO  (qtp1473878813-15100) [n:127.0.0.1:63251_lft c:control_collection s:shard1  x:control_collection_shard1_replica_n1 ] o.a.s.u.UpdateLog Initializing UpdateLog: dataDir= defaultSyncLevel=FLUSH numRecordsToKeep=100 maxNumLogsToKeep=10 numVersionBuckets=65536
   [junit4]   2> 1527336 INFO  (qtp1473878813-15100) [n:127.0.0.1:63251_lft c:control_collection s:shard1  x:control_collection_shard1_replica_n1 ] o.a.s.u.CommitTracker Hard AutoCommit: disabled
   [junit4]   2> 1527336 INFO  (qtp1473878813-15100) [n:127.0.0.1:63251_lft c:control_collection s:shard1  x:control_collection_shard1_replica_n1 ] o.a.s.u.CommitTracker Soft AutoCommit: disabled
   [junit4]   2> 1527337 INFO  (qtp1473878813-15100) [n:127.0.0.1:63251_lft c:control_collection s:shard1  x:control_collection_shard1_replica_n1 ] o.a.s.u.RandomMergePolicy RandomMergePolicy wrapping class org.apache.lucene.index.LogDocMergePolicy: [LogDocMergePolicy: minMergeSize=1000, mergeFactor=45, maxMergeSize=9223372036854775807, maxMergeSizeForForcedMerge=9223372036854775807, calibrateSizeByDeletes=true, maxMergeDocs=2147483647, maxCFSSegmentSizeMB=8.796093022207999E12, noCFSRatio=1.0]
   [junit4]   2> 1527337 INFO  (qtp1473878813-15100) [n:127.0.0.1:63251_lft c:control_collection s:shard1  x:control_collection_shard1_replica_n1 ] o.a.s.s.SolrIndexSearcher Opening [Searcher@7e230a6f[control_collection_shard1_replica_n1] main]
   [junit4]   2> 1527342 INFO  (qtp1473878813-15100) [n:127.0.0.1:63251_lft c:control_collection s:shard1  x:control_collection_shard1_replica_n1 ] o.a.s.r.ManagedResourceStorage Configured ZooKeeperStorageIO with znodeBase: /configs/conf1
   [junit4]   2> 1527344 INFO  (qtp1473878813-15100) [n:127.0.0.1:63251_lft c:control_collection s:shard1  x:control_collection_shard1_replica_n1 ] o.a.s.r.ManagedResourceStorage Loaded null at path _rest_managed.json using ZooKeeperStorageIO:path=/configs/conf1
   [junit4]   2> 1527344 INFO  (qtp1473878813-15100) [n:127.0.0.1:63251_lft c:control_collection s:shard1  x:control_collection_shard1_replica_n1 ] o.a.s.h.ReplicationHandler Commits will be reserved for 10000ms.
   [junit4]   2> 1527344 INFO  (qtp1473878813-15100) [n:127.0.0.1:63251_lft c:control_collection s:shard1  x:control_collection_shard1_replica_n1 ] o.a.s.u.UpdateLog Could not find max version in index or recent updates, using new clock 1655887048542781440
   [junit4]   2> 1527348 INFO  (searcherExecutor-4755-thread-1-processing-n:127.0.0.1:63251_lft x:control_collection_shard1_replica_n1 c:control_collection s:shard1) [n:127.0.0.1:63251_lft c:control_collection s:shard1  x:control_collection_shard1_replica_n1 ] o.a.s.c.SolrCore [control_collection_shard1_replica_n1] Registered new searcher Searcher@7e230a6f[control_collection_shard1_replica_n1] main{ExitableDirectoryReader(UninvertingDirectoryReader())}
   [junit4]   2> 1527372 INFO  (qtp1473878813-15100) [n:127.0.0.1:63251_lft c:control_collection s:shard1  x:control_collection_shard1_replica_n1 ] o.a.s.c.ZkShardTerms Successful update of terms at /collections/control_collection/terms/shard1 to Terms{values={core_node2=0}, version=0}
   [junit4]   2> 1527372 INFO  (qtp1473878813-15100) [n:127.0.0.1:63251_lft c:control_collection s:shard1  x:control_collection_shard1_replica_n1 ] o.a.s.c.ShardLeaderElectionContextBase make sure parent is created /collections/control_collection/leaders/shard1
   [junit4]   2> 1527409 INFO  (qtp1473878813-15100) [n:127.0.0.1:63251_lft c:control_collection s:shard1  x:control_collection_shard1_replica_n1 ] o.a.s.c.ShardLeaderElectionContext Enough replicas found to continue.
   [junit4]   2> 1527409 INFO  (qtp1473878813-15100) [n:127.0.0.1:63251_lft c:control_collection s:shard1  x:control_collection_shard1_replica_n1 ] o.a.s.c.ShardLeaderElectionContext I may be the new leader - try and sync
   [junit4]   2> 1527409 INFO  (qtp1473878813-15100) [n:127.0.0.1:63251_lft c:control_collection s:shard1  x:control_collection_shard1_replica_n1 ] o.a.s.c.SyncStrategy Sync replicas to http://127.0.0.1:63251/lft/control_collection_shard1_replica_n1/
   [junit4]   2> 1527410 INFO  (qtp1473878813-15100) [n:127.0.0.1:63251_lft c:control_collection s:shard1  x:control_collection_shard1_replica_n1 ] o.a.s.c.SyncStrategy Sync Success - now sync replicas to me
   [junit4]   2> 1527412 INFO  (qtp1473878813-15100) [n:127.0.0.1:63251_lft c:control_collection s:shard1  x:control_collection_shard1_replica_n1 ] o.a.s.c.SyncStrategy http://127.0.0.1:63251/lft/control_collection_shard1_replica_n1/ has no replicas
   [junit4]   2> 1527412 INFO  (qtp1473878813-15100) [n:127.0.0.1:63251_lft c:control_collection s:shard1  x:control_collection_shard1_replica_n1 ] o.a.s.c.ShardLeaderElectionContextBase Creating leader registration node /collections/control_collection/leaders/shard1/leader after winning as /collections/control_collection/leader_elect/shard1/election/72058311538573316-core_node2-n_0000000000
   [junit4]   2> 1527421 INFO  (qtp1473878813-15100) [n:127.0.0.1:63251_lft c:control_collection s:shard1  x:control_collection_shard1_replica_n1 ] o.a.s.c.ShardLeaderElectionContext I am the new leader: http://127.0.0.1:63251/lft/control_collection_shard1_replica_n1/ shard1
   [junit4]   2> 1527529 INFO  (zkCallback-5410-thread-1) [     ] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/control_collection/state.json] for collection [control_collection] has occurred - updating... (live nodes size: [1])
   [junit4]   2> 1527530 INFO  (zkCallback-5410-thread-2) [     ] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/control_collection/state.json] for collection [control_collection] has occurred - updating... (live nodes size: [1])
   [junit4]   2> 1527538 INFO  (qtp1473878813-15100) [n:127.0.0.1:63251_lft c:control_collection s:shard1  x:control_collection_shard1_replica_n1 ] o.a.s.c.ZkController I am the leader, no recovery necessary
   [junit4]   2> 1527546 INFO  (qtp1473878813-15100) [n:127.0.0.1:63251_lft c:control_collection s:shard1  x:control_collection_shard1_replica_n1 ] o.a.s.s.HttpSolrCall [admin] webapp=null path=/admin/cores params={qt=/admin/cores&collection.configName=conf1&newCollection=true&name=control_collection_shard1_replica_n1&action=CREATE&numShards=1&collection=control_collection&shard=shard1&wt=javabin&version=2&replicaType=NRT} status=0 QTime=1386
   [junit4]   2> 1527553 INFO  (qtp1473878813-15098) [n:127.0.0.1:63251_lft     ] o.a.s.h.a.CollectionsHandler Wait for new collection to be active for at most 45 seconds. Check all shard replicas
   [junit4]   2> 1527650 INFO  (zkCallback-5410-thread-2) [     ] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/control_collection/state.json] for collection [control_collection] has occurred - updating... (live nodes size: [1])
   [junit4]   2> 1527650 INFO  (zkCallback-5410-thread-1) [     ] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/control_collection/state.json] for collection [control_collection] has occurred - updating... (live nodes size: [1])
   [junit4]   2> 1527650 INFO  (zkCallback-5410-thread-3) [     ] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/control_collection/state.json] for collection [control_collection] has occurred - updating... (live nodes size: [1])
   [junit4]   2> 1527652 INFO  (qtp1473878813-15098) [n:127.0.0.1:63251_lft     ] o.a.s.s.HttpSolrCall [admin] webapp=null path=/admin/collections params={collection.configName=conf1&name=control_collection&nrtReplicas=1&action=CREATE&numShards=1&createNodeSet=127.0.0.1:63251_lft&wt=javabin&version=2} status=0 QTime=1646
   [junit4]   2> 1527654 INFO  (TEST-TestStressInPlaceUpdates.stressTest-seed#[A2482538A05583EB]) [     ] o.a.s.c.AbstractFullDistribZkTestBase Waiting to see 1 active replicas in collection: control_collection
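At this point the base test class waits for the collection to report one active replica. As a hedged sketch of how such a wait can be expressed against the cluster state (using ZkStateReader.waitForState; the predicate below is illustrative, not the test's own code):

    import java.util.concurrent.TimeUnit;
    import org.apache.solr.client.solrj.impl.CloudSolrClient;
    import org.apache.solr.common.cloud.Replica;

    // Wait until every replica of control_collection is ACTIVE on a live node.
    static void waitForActiveReplicas(CloudSolrClient client) throws Exception {
        client.getZkStateReader().waitForState("control_collection", 45, TimeUnit.SECONDS,
            (liveNodes, coll) -> coll != null && coll.getReplicas().stream()
                .allMatch(r -> r.getState() == Replica.State.ACTIVE
                            && liveNodes.contains(r.getNodeName())));
    }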
   [junit4]   2> 1527769 INFO  (TEST-TestStressInPlaceUpdates.stressTest-seed#[A2482538A05583EB]) [     ] o.a.s.c.c.ConnectionManager Waiting for client to connect to ZooKeeper
   [junit4]   2> 1527773 INFO  (zkConnectionManagerCallback-5426-thread-1) [     ] o.a.s.c.c.ConnectionManager zkClient has connected
   [junit4]   2> 1527773 INFO  (TEST-TestStressInPlaceUpdates.stressTest-seed#[A2482538A05583EB]) [     ] o.a.s.c.c.ConnectionManager Client is connected to ZooKeeper
   [junit4]   2> 1527777 INFO  (TEST-TestStressInPlaceUpdates.stressTest-seed#[A2482538A05583EB]) [     ] o.a.s.c.c.ZkStateReader Updated live nodes from ZooKeeper... (0) -> (1)
   [junit4]   2> 1527783 INFO  (TEST-TestStressInPlaceUpdates.stressTest-seed#[A2482538A05583EB]) [     ] o.a.s.c.s.i.ZkClientClusterStateProvider Cluster at 127.0.0.1:63237/solr ready
   [junit4]   2> 1527783 INFO  (TEST-TestStressInPlaceUpdates.stressTest-seed#[A2482538A05583EB]) [     ] o.a.s.c.ChaosMonkey monkey: init - expire sessions:false cause connection loss:false
   [junit4]   2> 1527785 INFO  (qtp1473878813-15098) [n:127.0.0.1:63251_lft     ] o.a.s.h.a.CollectionsHandler Invoked Collection Action :create with params collection.configName=conf1&name=collection1&nrtReplicas=1&action=CREATE&numShards=1&createNodeSet=&stateFormat=1&wt=javabin&version=2 and sendToOCPQueue=true
   [junit4]   2> 1527797 INFO  (OverseerThreadFactory-4750-thread-2) [     ] o.a.s.c.a.c.CreateCollectionCmd Create collection collection1
   [junit4]   2> 1527799 INFO  (OverseerCollectionConfigSetProcessor-72058311538573316-127.0.0.1:63251_lft-n_0000000000) [     ] o.a.s.c.OverseerTaskQueue Response ZK path: /overseer/collection-queue-work/qnr-0000000000 doesn't exist.  Requestor may have disconnected from ZooKeeper
   [junit4]   2> 1528018 WARN  (OverseerThreadFactory-4750-thread-2) [     ] o.a.s.c.a.c.CreateCollectionCmd It is unusual to create a collection (collection1) without cores.
   [junit4]   2> 1528031 INFO  (qtp1473878813-15098) [n:127.0.0.1:63251_lft     ] o.a.s.h.a.CollectionsHandler Wait for new collection to be active for at most 45 seconds. Check all shard replicas
   [junit4]   2> 1528034 INFO  (qtp1473878813-15098) [n:127.0.0.1:63251_lft     ] o.a.s.s.HttpSolrCall [admin] webapp=null path=/admin/collections params={collection.configName=conf1&name=collection1&nrtReplicas=1&action=CREATE&numShards=1&createNodeSet=&stateFormat=1&wt=javabin&version=2} status=0 QTime=249
   [junit4]   2> 1528041 INFO  (TEST-TestStressInPlaceUpdates.stressTest-seed#[A2482538A05583EB]) [     ] o.a.s.c.SolrCloudTestCase active slice count: 1 expected:1
   [junit4]   2> 1528041 INFO  (TEST-TestStressInPlaceUpdates.stressTest-seed#[A2482538A05583EB]) [     ] o.a.s.c.SolrCloudTestCase active replica count: 0 expected replica count: 0
   [junit4]   2> 1528041 INFO  (TEST-TestStressInPlaceUpdates.stressTest-seed#[A2482538A05583EB]) [     ] o.a.s.c.SolrCloudTestCase active slice count: 1 expected:1
   [junit4]   2> 1528041 INFO  (TEST-TestStressInPlaceUpdates.stressTest-seed#[A2482538A05583EB]) [     ] o.a.s.c.SolrCloudTestCase active replica count: 0 expected replica count: 0
   [junit4]   2> 1528041 INFO  (TEST-TestStressInPlaceUpdates.stressTest-seed#[A2482538A05583EB]) [     ] o.a.s.c.SolrCloudTestCase active slice count: 1 expected:1
   [junit4]   2> 1528041 INFO  (TEST-TestStressInPlaceUpdates.stressTest-seed#[A2482538A05583EB]) [     ] o.a.s.c.SolrCloudTestCase active replica count: 0 expected replica count: 0
   [junit4]   2> 1528041 INFO  (TEST-TestStressInPlaceUpdates.stressTest-seed#[A2482538A05583EB]) [     ] o.a.s.c.AbstractFullDistribZkTestBase Creating jetty instances pullReplicaCount=0 numOtherReplicas=3
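From here the base test class brings up three more Jetty nodes (jetty 1-3 below). As an illustration of what each of those startups involves, and assuming the solrj-embedded JettySolrRunner/JettyConfig API that the log lines themselves reference, starting one extra node on an ephemeral port under the /lft context looks roughly like:

    import org.apache.solr.client.solrj.embedded.JettyConfig;
    import org.apache.solr.client.solrj.embedded.JettySolrRunner;

    // Sketch: start one extra Solr node on a random port under the /lft context.
    static JettySolrRunner startJetty(String solrHome) throws Exception {
        JettyConfig config = JettyConfig.builder()
            .setContext("/lft")          // hostContext from the logged Jetty properties
            .setPort(0)                  // 0 = pick a free port, as in the log above
            .build();
        JettySolrRunner jetty = new JettySolrRunner(solrHome, config);
        jetty.start();
        return jetty;
    }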
   [junit4]   2> 1529333 INFO  (TEST-TestStressInPlaceUpdates.stressTest-seed#[A2482538A05583EB]) [     ] o.a.s.c.AbstractFullDistribZkTestBase create jetty 1 in directory /Users/jenkins/workspace/Lucene-Solr-master-MacOSX/solr/build/solr-core/test/J1/temp/solr.cloud.TestStressInPlaceUpdates_A2482538A05583EB-001/shard-1-001 of type NRT
   [junit4]   2> 1529338 WARN  (closeThreadPool-5427-thread-1) [     ] o.e.j.s.h.g.GzipHandler minGzipSize of 0 is inefficient for short content, break even is size 23
   [junit4]   2> 1529338 INFO  (closeThreadPool-5427-thread-1) [     ] o.a.s.c.s.e.JettySolrRunner Start Jetty (configured port=0, binding port=0)
   [junit4]   2> 1529338 INFO  (closeThreadPool-5427-thread-1) [     ] o.a.s.c.s.e.JettySolrRunner Trying to start Jetty on port 0 try number 1 ...
   [junit4]   2> 1529338 INFO  (closeThreadPool-5427-thread-1) [     ] o.e.j.s.Server jetty-9.4.24.v20191120; built: 2019-11-20T21:37:49.771Z; git: 363d5f2df3a8a28de40604320230664b9c793c16; jvm 13.0.1+9
   [junit4]   2> 1529341 INFO  (closeThreadPool-5427-thread-1) [     ] o.e.j.s.session DefaultSessionIdManager workerName=node0
   [junit4]   2> 1529341 INFO  (closeThreadPool-5427-thread-1) [     ] o.e.j.s.session No SessionScavenger set, using defaults
   [junit4]   2> 1529341 INFO  (closeThreadPool-5427-thread-1) [     ] o.e.j.s.session node0 Scavenging every 660000ms
   [junit4]   2> 1529342 INFO  (closeThreadPool-5427-thread-1) [     ] o.e.j.s.h.ContextHandler Started o.e.j.s.ServletContextHandler@194d5c69{/lft,null,AVAILABLE}
   [junit4]   2> 1529343 INFO  (closeThreadPool-5427-thread-1) [     ] o.e.j.s.AbstractConnector Started ServerConnector@6bddbb3e{HTTP/1.1,[http/1.1, h2c]}{127.0.0.1:63345}
   [junit4]   2> 1529344 INFO  (closeThreadPool-5427-thread-1) [     ] o.e.j.s.Server Started @1529390ms
   [junit4]   2> 1529344 INFO  (closeThreadPool-5427-thread-1) [     ] o.a.s.c.s.e.JettySolrRunner Jetty properties: {hostContext=/lft, solrconfig=solrconfig.xml, solr.data.dir=/Users/jenkins/workspace/Lucene-Solr-master-MacOSX/solr/build/solr-core/test/J1/temp/solr.cloud.TestStressInPlaceUpdates_A2482538A05583EB-001/tempDir-001/jetty1, hostPort=63345, coreRootDirectory=/Users/jenkins/workspace/Lucene-Solr-master-MacOSX/solr/build/solr-core/test/J1/temp/solr.cloud.TestStressInPlaceUpdates_A2482538A05583EB-001/shard-1-001/cores}
   [junit4]   2> 1529344 ERROR (closeThreadPool-5427-thread-1) [     ] o.a.s.u.StartupLoggingUtils Missing Java Option solr.log.dir. Logging may be missing or incomplete.
   [junit4]   2> 1529344 INFO  (closeThreadPool-5427-thread-1) [     ] o.a.s.s.SolrDispatchFilter Using logger factory org.apache.logging.slf4j.Log4jLoggerFactory
   [junit4]   2> 1529344 INFO  (closeThreadPool-5427-thread-1) [     ] o.a.s.s.SolrDispatchFilter  ___      _       Welcome to Apache Solr™ version 9.0.0
   [junit4]   2> 1529344 INFO  (closeThreadPool-5427-thread-1) [     ] o.a.s.s.SolrDispatchFilter / __| ___| |_ _   Starting in cloud mode on port null
   [junit4]   2> 1529344 INFO  (closeThreadPool-5427-thread-1) [     ] o.a.s.s.SolrDispatchFilter \__ \/ _ \ | '_|  Install dir: null
   [junit4]   2> 1529344 INFO  (closeThreadPool-5427-thread-1) [     ] o.a.s.s.SolrDispatchFilter |___/\___/_|_|    Start time: 2020-01-16T12:15:51.065622Z
   [junit4]   2> 1529346 INFO  (closeThreadPool-5427-thread-1) [     ] o.a.s.c.c.ConnectionManager Waiting for client to connect to ZooKeeper
   [junit4]   2> 1529354 INFO  (zkConnectionManagerCallback-5429-thread-1) [     ] o.a.s.c.c.ConnectionManager zkClient has connected
   [junit4]   2> 1529356 INFO  (closeThreadPool-5427-thread-1) [     ] o.a.s.c.c.ConnectionManager Client is connected to ZooKeeper
   [junit4]   2> 1529461 INFO  (closeThreadPool-5427-thread-1) [     ] o.a.s.s.SolrDispatchFilter Loading solr.xml from SolrHome (not found in ZooKeeper)
   [junit4]   2> 1529461 INFO  (closeThreadPool-5427-thread-1) [     ] o.a.s.c.SolrXmlConfig Loading container configuration from /Users/jenkins/workspace/Lucene-Solr-master-MacOSX/solr/build/solr-core/test/J1/temp/solr.cloud.TestStressInPlaceUpdates_A2482538A05583EB-001/shard-1-001/solr.xml
   [junit4]   2> 1529467 INFO  (closeThreadPool-5427-thread-1) [     ] o.a.s.c.SolrXmlConfig Configuration parameter autoReplicaFailoverWorkLoopDelay is ignored
   [junit4]   2> 1529467 INFO  (closeThreadPool-5427-thread-1) [     ] o.a.s.c.SolrXmlConfig Configuration parameter autoReplicaFailoverBadNodeExpiration is ignored
   [junit4]   2> 1529483 INFO  (closeThreadPool-5427-thread-1) [     ] o.a.s.c.SolrXmlConfig MBean server found: com.sun.jmx.mbeanserver.JmxMBeanServer@74ac98b3, but no JMX reporters were configured - adding default JMX reporter.
   [junit4]   2> 1529707 INFO  (closeThreadPool-5427-thread-1) [     ] o.a.s.h.c.HttpShardHandlerFactory Host whitelist initialized: WhitelistHostChecker [whitelistHosts=null, whitelistHostCheckingEnabled=false]
   [junit4]   2> 1529711 WARN  (closeThreadPool-5427-thread-1) [     ] o.e.j.u.s.S.config Trusting all certificates configured for Client@3e54dd6a[provider=null,keyStore=null,trustStore=null]
   [junit4]   2> 1529711 WARN  (closeThreadPool-5427-thread-1) [     ] o.e.j.u.s.S.config No Client EndPointIdentificationAlgorithm configured for Client@3e54dd6a[provider=null,keyStore=null,trustStore=null]
   [junit4]   2> 1529718 WARN  (closeThreadPool-5427-thread-1) [     ] o.e.j.u.s.S.config Trusting all certificates configured for Client@206bdc08[provider=null,keyStore=null,trustStore=null]
   [junit4]   2> 1529718 WARN  (closeThreadPool-5427-thread-1) [     ] o.e.j.u.s.S.config No Client EndPointIdentificationAlgorithm configured for Client@206bdc08[provider=null,keyStore=null,trustStore=null]
   [junit4]   2> 1529719 INFO  (closeThreadPool-5427-thread-1) [     ] o.a.s.c.ZkContainer Zookeeper client=127.0.0.1:63237/solr
   [junit4]   2> 1529721 INFO  (closeThreadPool-5427-thread-1) [     ] o.a.s.c.c.ConnectionManager Waiting for client to connect to ZooKeeper
   [junit4]   2> 1529727 INFO  (zkConnectionManagerCallback-5436-thread-1) [     ] o.a.s.c.c.ConnectionManager zkClient has connected
   [junit4]   2> 1529728 INFO  (closeThreadPool-5427-thread-1) [     ] o.a.s.c.c.ConnectionManager Client is connected to ZooKeeper
   [junit4]   2> 1529841 INFO  (OverseerCollectionConfigSetProcessor-72058311538573316-127.0.0.1:63251_lft-n_0000000000) [     ] o.a.s.c.OverseerTaskQueue Response ZK path: /overseer/collection-queue-work/qnr-0000000002 doesn't exist.  Requestor may have disconnected from ZooKeeper
   [junit4]   2> 1529846 INFO  (closeThreadPool-5427-thread-1) [n:127.0.0.1:63345_lft     ] o.a.s.c.c.ConnectionManager Waiting for client to connect to ZooKeeper
   [junit4]   2> 1529852 INFO  (zkConnectionManagerCallback-5438-thread-1) [     ] o.a.s.c.c.ConnectionManager zkClient has connected
   [junit4]   2> 1529852 INFO  (closeThreadPool-5427-thread-1) [n:127.0.0.1:63345_lft     ] o.a.s.c.c.ConnectionManager Client is connected to ZooKeeper
   [junit4]   2> 1529894 INFO  (closeThreadPool-5427-thread-1) [n:127.0.0.1:63345_lft     ] o.a.s.c.c.ZkStateReader Updated live nodes from ZooKeeper... (0) -> (1)
   [junit4]   2> 1529946 INFO  (closeThreadPool-5427-thread-1) [n:127.0.0.1:63345_lft     ] o.a.s.c.ZkController Publish node=127.0.0.1:63345_lft as DOWN
   [junit4]   2> 1529957 INFO  (closeThreadPool-5427-thread-1) [n:127.0.0.1:63345_lft     ] o.a.s.c.TransientSolrCoreCacheDefault Allocating transient cache for 4 transient cores
   [junit4]   2> 1529957 INFO  (closeThreadPool-5427-thread-1) [n:127.0.0.1:63345_lft     ] o.a.s.c.ZkController Register node as live in ZooKeeper:/live_nodes/127.0.0.1:63345_lft
   [junit4]   2> 1529964 INFO  (zkCallback-5425-thread-1) [     ] o.a.s.c.c.ZkStateReader Updated live nodes from ZooKeeper... (1) -> (2)
   [junit4]   2> 1529964 INFO  (zkCallback-5410-thread-3) [     ] o.a.s.c.c.ZkStateReader Updated live nodes from ZooKeeper... (1) -> (2)
   [junit4]   2> 1529967 INFO  (zkCallback-5437-thread-1) [     ] o.a.s.c.c.ZkStateReader Updated live nodes from ZooKeeper... (1) -> (2)
   [junit4]   2> 1529974 INFO  (closeThreadPool-5427-thread-1) [n:127.0.0.1:63345_lft     ] o.a.s.p.PackageLoader /packages.json updated to version -1
   [junit4]   2> 1529975 WARN  (closeThreadPool-5427-thread-1) [n:127.0.0.1:63345_lft     ] o.a.s.c.CoreContainer Not all security plugins configured!  authentication=disabled authorization=disabled.  Solr is only as secure as you make it. Consider configuring authentication/authorization before exposing Solr to users internal or external.  See https://s.apache.org/solrsecurity for more info
   [junit4]   2> 1530002 INFO  (closeThreadPool-5427-thread-1) [n:127.0.0.1:63345_lft     ] o.a.s.h.a.MetricsHistoryHandler No .system collection, keeping metrics history in memory.
   [junit4]   2> 1530037 INFO  (closeThreadPool-5427-thread-1) [n:127.0.0.1:63345_lft     ] o.a.s.m.r.SolrJmxReporter JMX monitoring for 'solr.node' (registry 'solr.node') enabled at server: com.sun.jmx.mbeanserver.JmxMBeanServer@74ac98b3
   [junit4]   2> 1530059 INFO  (closeThreadPool-5427-thread-1) [n:127.0.0.1:63345_lft     ] o.a.s.m.r.SolrJmxReporter JMX monitoring for 'solr.jvm' (registry 'solr.jvm') enabled at server: com.sun.jmx.mbeanserver.JmxMBeanServer@74ac98b3
   [junit4]   2> 1530060 INFO  (closeThreadPool-5427-thread-1) [n:127.0.0.1:63345_lft     ] o.a.s.m.r.SolrJmxReporter JMX monitoring for 'solr.jetty' (registry 'solr.jetty') enabled at server: com.sun.jmx.mbeanserver.JmxMBeanServer@74ac98b3
   [junit4]   2> 1530062 INFO  (closeThreadPool-5427-thread-1) [n:127.0.0.1:63345_lft     ] o.a.s.c.CorePropertiesLocator Found 0 core definitions underneath /Users/jenkins/workspace/Lucene-Solr-master-MacOSX/solr/build/solr-core/test/J1/temp/solr.cloud.TestStressInPlaceUpdates_A2482538A05583EB-001/shard-1-001/cores
   [junit4]   2> 1530190 INFO  (closeThreadPool-5427-thread-1) [     ] o.a.s.c.AbstractFullDistribZkTestBase waitForLiveNode: 127.0.0.1:63345_lft
   [junit4]   2> 1530192 INFO  (TEST-TestStressInPlaceUpdates.stressTest-seed#[A2482538A05583EB]) [     ] o.a.s.c.AbstractFullDistribZkTestBase create jetty 2 in directory /Users/jenkins/workspace/Lucene-Solr-master-MacOSX/solr/build/solr-core/test/J1/temp/solr.cloud.TestStressInPlaceUpdates_A2482538A05583EB-001/shard-2-001 of type NRT
   [junit4]   2> 1530194 WARN  (closeThreadPool-5427-thread-2) [     ] o.e.j.s.h.g.GzipHandler minGzipSize of 0 is inefficient for short content, break even is size 23
   [junit4]   2> 1530194 INFO  (closeThreadPool-5427-thread-2) [     ] o.a.s.c.s.e.JettySolrRunner Start Jetty (configured port=0, binding port=0)
   [junit4]   2> 1530194 INFO  (closeThreadPool-5427-thread-2) [     ] o.a.s.c.s.e.JettySolrRunner Trying to start Jetty on port 0 try number 1 ...
   [junit4]   2> 1530194 INFO  (closeThreadPool-5427-thread-2) [     ] o.e.j.s.Server jetty-9.4.24.v20191120; built: 2019-11-20T21:37:49.771Z; git: 363d5f2df3a8a28de40604320230664b9c793c16; jvm 13.0.1+9
   [junit4]   2> 1530203 INFO  (closeThreadPool-5427-thread-2) [     ] o.e.j.s.session DefaultSessionIdManager workerName=node0
   [junit4]   2> 1530203 INFO  (closeThreadPool-5427-thread-2) [     ] o.e.j.s.session No SessionScavenger set, using defaults
   [junit4]   2> 1530203 INFO  (closeThreadPool-5427-thread-2) [     ] o.e.j.s.session node0 Scavenging every 660000ms
   [junit4]   2> 1530203 INFO  (closeThreadPool-5427-thread-2) [     ] o.e.j.s.h.ContextHandler Started o.e.j.s.ServletContextHandler@7d24cd17{/lft,null,AVAILABLE}
   [junit4]   2> 1530205 INFO  (closeThreadPool-5427-thread-2) [     ] o.e.j.s.AbstractConnector Started ServerConnector@19ea9e76{HTTP/1.1,[http/1.1, h2c]}{127.0.0.1:63374}
   [junit4]   2> 1530205 INFO  (closeThreadPool-5427-thread-2) [     ] o.e.j.s.Server Started @1530251ms
   [junit4]   2> 1530205 INFO  (closeThreadPool-5427-thread-2) [     ] o.a.s.c.s.e.JettySolrRunner Jetty properties: {hostContext=/lft, solrconfig=solrconfig.xml, solr.data.dir=/Users/jenkins/workspace/Lucene-Solr-master-MacOSX/solr/build/solr-core/test/J1/temp/solr.cloud.TestStressInPlaceUpdates_A2482538A05583EB-001/tempDir-001/jetty2, hostPort=63374, coreRootDirectory=/Users/jenkins/workspace/Lucene-Solr-master-MacOSX/solr/build/solr-core/test/J1/temp/solr.cloud.TestStressInPlaceUpdates_A2482538A05583EB-001/shard-2-001/cores}
   [junit4]   2> 1530205 ERROR (closeThreadPool-5427-thread-2) [     ] o.a.s.u.StartupLoggingUtils Missing Java Option solr.log.dir. Logging may be missing or incomplete.
   [junit4]   2> 1530205 INFO  (closeThreadPool-5427-thread-2) [     ] o.a.s.s.SolrDispatchFilter Using logger factory org.apache.logging.slf4j.Log4jLoggerFactory
   [junit4]   2> 1530205 INFO  (closeThreadPool-5427-thread-2) [     ] o.a.s.s.SolrDispatchFilter  ___      _       Welcome to Apache Solr™ version 9.0.0
   [junit4]   2> 1530205 INFO  (closeThreadPool-5427-thread-2) [     ] o.a.s.s.SolrDispatchFilter / __| ___| |_ _   Starting in cloud mode on port null
   [junit4]   2> 1530205 INFO  (closeThreadPool-5427-thread-2) [     ] o.a.s.s.SolrDispatchFilter \__ \/ _ \ | '_|  Install dir: null
   [junit4]   2> 1530205 INFO  (closeThreadPool-5427-thread-2) [     ] o.a.s.s.SolrDispatchFilter |___/\___/_|_|    Start time: 2020-01-16T12:15:51.926872Z
   [junit4]   2> 1530216 INFO  (closeThreadPool-5427-thread-2) [     ] o.a.s.c.c.ConnectionManager Waiting for client to connect to ZooKeeper
   [junit4]   2> 1530220 INFO  (zkConnectionManagerCallback-5444-thread-1) [     ] o.a.s.c.c.ConnectionManager zkClient has connected
   [junit4]   2> 1530220 INFO  (closeThreadPool-5427-thread-2) [     ] o.a.s.c.c.ConnectionManager Client is connected to ZooKeeper
   [junit4]   2> 1530338 INFO  (closeThreadPool-5427-thread-2) [     ] o.a.s.s.SolrDispatchFilter Loading solr.xml from SolrHome (not found in ZooKeeper)
   [junit4]   2> 1530339 INFO  (closeThreadPool-5427-thread-2) [     ] o.a.s.c.SolrXmlConfig Loading container configuration from /Users/jenkins/workspace/Lucene-Solr-master-MacOSX/solr/build/solr-core/test/J1/temp/solr.cloud.TestStressInPlaceUpdates_A2482538A05583EB-001/shard-2-001/solr.xml
   [junit4]   2> 1530342 INFO  (closeThreadPool-5427-thread-2) [     ] o.a.s.c.SolrXmlConfig Configuration parameter autoReplicaFailoverWorkLoopDelay is ignored
   [junit4]   2> 1530342 INFO  (closeThreadPool-5427-thread-2) [     ] o.a.s.c.SolrXmlConfig Configuration parameter autoReplicaFailoverBadNodeExpiration is ignored
   [junit4]   2> 1530344 INFO  (closeThreadPool-5427-thread-2) [     ] o.a.s.c.SolrXmlConfig MBean server found: com.sun.jmx.mbeanserver.JmxMBeanServer@74ac98b3, but no JMX reporters were configured - adding default JMX reporter.
   [junit4]   2> 1530427 INFO  (closeThreadPool-5427-thread-2) [     ] o.a.s.h.c.HttpShardHandlerFactory Host whitelist initialized: WhitelistHostChecker [whitelistHosts=null, whitelistHostCheckingEnabled=false]
   [junit4]   2> 1530430 WARN  (closeThreadPool-5427-thread-2) [     ] o.e.j.u.s.S.config Trusting all certificates configured for Client@10e7826f[provider=null,keyStore=null,trustStore=null]
   [junit4]   2> 1530430 WARN  (closeThreadPool-5427-thread-2) [     ] o.e.j.u.s.S.config No Client EndPointIdentificationAlgorithm configured for Client@10e7826f[provider=null,keyStore=null,trustStore=null]
   [junit4]   2> 1530437 WARN  (closeThreadPool-5427-thread-2) [     ] o.e.j.u.s.S.config Trusting all certificates configured for Client@5d8dd8d1[provider=null,keyStore=null,trustStore=null]
   [junit4]   2> 1530437 WARN  (closeThreadPool-5427-thread-2) [     ] o.e.j.u.s.S.config No Client EndPointIdentificationAlgorithm configured for Client@5d8dd8d1[provider=null,keyStore=null,trustStore=null]
   [junit4]   2> 1530440 INFO  (closeThreadPool-5427-thread-2) [     ] o.a.s.c.ZkContainer Zookeeper client=127.0.0.1:63237/solr
   [junit4]   2> 1530447 INFO  (closeThreadPool-5427-thread-2) [     ] o.a.s.c.c.ConnectionManager Waiting for client to connect to ZooKeeper
   [junit4]   2> 1530453 INFO  (zkConnectionManagerCallback-5451-thread-1) [     ] o.a.s.c.c.ConnectionManager zkClient has connected
   [junit4]   2> 1530453 INFO  (closeThreadPool-5427-thread-2) [     ] o.a.s.c.c.ConnectionManager Client is connected to ZooKeeper
   [junit4]   2> 1530566 INFO  (closeThreadPool-5427-thread-2) [n:127.0.0.1:63374_lft     ] o.a.s.c.c.ConnectionManager Waiting for client to connect to ZooKeeper
   [junit4]   2> 1530571 INFO  (zkConnectionManagerCallback-5453-thread-1) [     ] o.a.s.c.c.ConnectionManager zkClient has connected
   [junit4]   2> 1530572 INFO  (closeThreadPool-5427-thread-2) [n:127.0.0.1:63374_lft     ] o.a.s.c.c.ConnectionManager Client is connected to ZooKeeper
   [junit4]   2> 1530621 INFO  (closeThreadPool-5427-thread-2) [n:127.0.0.1:63374_lft     ] o.a.s.c.c.ZkStateReader Updated live nodes from ZooKeeper... (0) -> (2)
   [junit4]   2> 1530676 INFO  (closeThreadPool-5427-thread-2) [n:127.0.0.1:63374_lft     ] o.a.s.c.ZkController Publish node=127.0.0.1:63374_lft as DOWN
   [junit4]   2> 1530687 INFO  (closeThreadPool-5427-thread-2) [n:127.0.0.1:63374_lft     ] o.a.s.c.TransientSolrCoreCacheDefault Allocating transient cache for 4 transient cores
   [junit4]   2> 1530687 INFO  (closeThreadPool-5427-thread-2) [n:127.0.0.1:63374_lft     ] o.a.s.c.ZkController Register node as live in ZooKeeper:/live_nodes/127.0.0.1:63374_lft
   [junit4]   2> 1530721 INFO  (zkCallback-5437-thread-1) [     ] o.a.s.c.c.ZkStateReader Updated live nodes from ZooKeeper... (2) -> (3)
   [junit4]   2> 1530729 INFO  (zkCallback-5410-thread-3) [     ] o.a.s.c.c.ZkStateReader Updated live nodes from ZooKeeper... (2) -> (3)
   [junit4]   2> 1530730 INFO  (zkCallback-5425-thread-1) [     ] o.a.s.c.c.ZkStateReader Updated live nodes from ZooKeeper... (2) -> (3)
   [junit4]   2> 1530732 INFO  (zkCallback-5452-thread-1) [     ] o.a.s.c.c.ZkStateReader Updated live nodes from ZooKeeper... (2) -> (3)
   [junit4]   2> 1530753 INFO  (closeThreadPool-5427-thread-2) [n:127.0.0.1:63374_lft     ] o.a.s.p.PackageLoader /packages.json updated to version -1
   [junit4]   2> 1530754 WARN  (closeThreadPool-5427-thread-2) [n:127.0.0.1:63374_lft     ] o.a.s.c.CoreContainer Not all security plugins configured!  authentication=disabled authorization=disabled.  Solr is only as secure as you make it. Consider configuring authentication/authorization before exposing Solr to users internal or external.  See https://s.apache.org/solrsecurity for more info
   [junit4]   2> 1530820 INFO  (closeThreadPool-5427-thread-2) [n:127.0.0.1:63374_lft     ] o.a.s.h.a.MetricsHistoryHandler No .system collection, keeping metrics history in memory.
   [junit4]   2> 1530925 INFO  (closeThreadPool-5427-thread-2) [n:127.0.0.1:63374_lft     ] o.a.s.m.r.SolrJmxReporter JMX monitoring for 'solr.node' (registry 'solr.node') enabled at server: com.sun.jmx.mbeanserver.JmxMBeanServer@74ac98b3
   [junit4]   2> 1530939 INFO  (closeThreadPool-5427-thread-2) [n:127.0.0.1:63374_lft     ] o.a.s.m.r.SolrJmxReporter JMX monitoring for 'solr.jvm' (registry 'solr.jvm') enabled at server: com.sun.jmx.mbeanserver.JmxMBeanServer@74ac98b3
   [junit4]   2> 1530939 INFO  (closeThreadPool-5427-thread-2) [n:127.0.0.1:63374_lft     ] o.a.s.m.r.SolrJmxReporter JMX monitoring for 'solr.jetty' (registry 'solr.jetty') enabled at server: com.sun.jmx.mbeanserver.JmxMBeanServer@74ac98b3
   [junit4]   2> 1530941 INFO  (closeThreadPool-5427-thread-2) [n:127.0.0.1:63374_lft     ] o.a.s.c.CorePropertiesLocator Found 0 core definitions underneath /Users/jenkins/workspace/Lucene-Solr-master-MacOSX/solr/build/solr-core/test/J1/temp/solr.cloud.TestStressInPlaceUpdates_A2482538A05583EB-001/shard-2-001/cores
   [junit4]   2> 1531015 INFO  (closeThreadPool-5427-thread-2) [     ] o.a.s.c.AbstractFullDistribZkTestBase waitForLiveNode: 127.0.0.1:63374_lft
   [junit4]   2> 1531046 INFO  (TEST-TestStressInPlaceUpdates.stressTest-seed#[A2482538A05583EB]) [     ] o.a.s.c.AbstractFullDistribZkTestBase create jetty 3 in directory /Users/jenkins/workspace/Lucene-Solr-master-MacOSX/solr/build/solr-core/test/J1/temp/solr.cloud.TestStressInPlaceUpdates_A2482538A05583EB-001/shard-3-001 of type NRT
   [junit4]   2> 1531047 WARN  (closeThreadPool-5427-thread-2) [     ] o.e.j.s.h.g.GzipHandler minGzipSize of 0 is inefficient for short content, break even is size 23
   [junit4]   2> 1531047 INFO  (closeThreadPool-5427-thread-2) [     ] o.a.s.c.s.e.JettySolrRunner Start Jetty (configured port=0, binding port=0)
   [junit4]   2> 1531047 INFO  (closeThreadPool-5427-thread-2) [     ] o.a.s.c.s.e.JettySolrRunner Trying to start Jetty on port 0 try number 1 ...
   [junit4]   2> 1531047 INFO  (closeThreadPool-5427-thread-2) [     ] o.e.j.s.Server jetty-9.4.24.v20191120; built: 2019-11-20T21:37:49.771Z; git: 363d5f2df3a8a28de40604320230664b9c793c16; jvm 13.0.1+9
   [junit4]   2> 1531051 INFO  (closeThreadPool-5427-thread-2) [     ] o.e.j.s.session DefaultSessionIdManager workerName=node0
   [junit4]   2> 1531051 INFO  (closeThreadPool-5427-thread-2) [     ] o.e.j.s.session No SessionScavenger set, using defaults
   [junit4]   2> 1531051 INFO  (closeThreadPool-5427-thread-2) [     ] o.e.j.s.session node0 Scavenging every 660000ms
   [junit4]   2> 1531052 INFO  (closeThreadPool-5427-thread-2) [     ] o.e.j.s.h.ContextHandler Started o.e.j.s.ServletContextHandler@48e6bdb0{/lft,null,AVAILABLE}
   [junit4]   2> 1531052 INFO  (closeThreadPool-5427-thread-2) [     ] o.e.j.s.AbstractConnector Started ServerConnector@1bd66851{HTTP/1.1,[http/1.1, h2c]}{127.0.0.1:63388}
   [junit4]   2> 1531053 INFO  (closeThreadPool-5427-thread-2) [     ] o.e.j.s.Server Started @1531099ms
   [junit4]   2> 1531053 INFO  (closeThreadPool-5427-thread-2) [     ] o.a.s.c.s.e.JettySolrRunner Jetty properties: {hostContext=/lft, solrconfig=solrconfig.xml, solr.data.dir=/Users/jenkins/workspace/Lucene-Solr-master-MacOSX/solr/build/solr-core/test/J1/temp/solr.cloud.TestStressInPlaceUpdates_A2482538A05583EB-001/tempDir-001/jetty3, hostPort=63388, coreRootDirectory=/Users/jenkins/workspace/Lucene-Solr-master-MacOSX/solr/build/solr-core/test/J1/temp/solr.cloud.TestStressInPlaceUpdates_A2482538A05583EB-001/shard-3-001/cores, replicaType=NRT}
   [junit4]   2> 1531053 ERROR (closeThreadPool-5427-thread-2) [     ] o.a.s.u.StartupLoggingUtils Missing Java Option solr.log.dir. Logging may be missing or incomplete.
   [junit4]   2> 1531053 INFO  (closeThreadPool-5427-thread-2) [     ] o.a.s.s.SolrDispatchFilter Using logger factory org.apache.logging.slf4j.Log4jLoggerFactory
   [junit4]   2> 1531053 INFO  (closeThreadPool-5427-thread-2) [     ] o.a.s.s.SolrDispatchFilter  ___      _       Welcome to Apache Solr™ version 9.0.0
   [junit4]   2> 1531053 INFO  (closeThreadPool-5427-thread-2) [     ] o.a.s.s.SolrDispatchFilter / __| ___| |_ _   Starting in cloud mode on port null
   [junit4]   2> 1531053 INFO  (closeThreadPool-5427-thread-2) [     ] o.a.s.s.SolrDispatchFilter \__ \/ _ \ | '_|  Install dir: null
   [junit4]   2> 1531053 INFO  (closeThreadPool-5427-thread-2) [     ] o.a.s.s.SolrDispatchFilter |___/\___/_|_|    Start time: 2020-01-16T12:15:52.774727Z
   [junit4]   2> 1531055 INFO  (closeThreadPool-5427-thread-2) [     ] o.a.s.c.c.ConnectionManager Waiting for client to connect to ZooKeeper
   [junit4]   2> 1531059 INFO  (zkConnectionManagerCallback-5459-thread-1) [     ] o.a.s.c.c.ConnectionManager zkClient has connected
   [junit4]   2> 1531059 INFO  (closeThreadPool-5427-thread-2) [     ] o.a.s.c.c.ConnectionManager Client is connected to ZooKeeper
   [junit4]   2> 1531163 INFO  (closeThreadPool-5427-thread-2) [     ] o.a.s.s.SolrDispatchFilter Loading solr.xml from SolrHome (not found in ZooKeeper)
   [junit4]   2> 1531163 INFO  (closeThreadPool-5427-thread-2) [     ] o.a.s.c.SolrXmlConfig Loading container configuration from /Users/jenkins/workspace/Lucene-Solr-master-MacOSX/solr/build/solr-core/test/J1/temp/solr.cloud.TestStressInPlaceUpdates_A2482538A05583EB-001/shard-3-001/solr.xml
   [junit4]   2> 1531168 INFO  (closeThreadPool-5427-thread-2) [     ] o.a.s.c.SolrXmlConfig Configuration parameter autoReplicaFailoverWorkLoopDelay is ignored
   [junit4]   2> 1531168 INFO  (closeThreadPool-5427-thread-2) [     ] o.a.s.c.SolrXmlConfig Configuration parameter autoReplicaFailoverBadNodeExpiration is ignored
   [junit4]   2> 1531169 INFO  (closeThreadPool-5427-thread-2) [     ] o.a.s.c.SolrXmlConfig MBean server found: com.sun.jmx.mbeanserver.JmxMBeanServer@74ac98b3, but no JMX reporters were configured - adding default JMX reporter.
   [junit4]   2> 1531349 INFO  (closeThreadPool-5427-thread-2) [     ] o.a.s.h.c.HttpShardHandlerFactory Host whitelist initialized: WhitelistHostChecker [whitelistHosts=null, whitelistHostCheckingEnabled=false]
   [junit4]   2> 1531353 WARN  (closeThreadPool-5427-thread-2) [     ] o.e.j.u.s.S.config Trusting all certificates configured for Client@1c3264ce[provider=null,keyStore=null,trustStore=null]
   [junit4]   2> 1531353 WARN  (closeThreadPool-5427-thread-2) [     ] o.e.j.u.s.S.config No Client EndPointIdentificationAlgorithm configured for Client@1c3264ce[provider=null,keyStore=null,trustStore=null]
   [junit4]   2> 1531360 WARN  (closeThreadPool-5427-thread-2) [     ] o.e.j.u.s.S.config Trusting all certificates configured for Client@3d075ef[provider=null,keyStore=null,trustStore=null]
   [junit4]   2> 1531360 WARN  (closeThreadPool-5427-thread-2) [     ] o.e.j.u.s.S.config No Client EndPointIdentificationAlgorithm configured for Client@3d075ef[provider=null,keyStore=null,trustStore=null]
   [junit4]   2> 1531362 INFO  (closeThreadPool-5427-thread-2) [     ] o.a.s.c.ZkContainer Zookeeper client=127.0.0.1:63237/solr
   [junit4]   2> 1531368 INFO  (closeThreadPool-5427-thread-2) [     ] o.a.s.c.c.ConnectionManager Waiting for client to connect to ZooKeeper
   [junit4]   2> 1531372 INFO  (zkConnectionManagerCallback-5466-thread-1) [     ] o.a.s.c.c.ConnectionManager zkClient has connected
   [junit4]   2> 1531372 INFO  (closeThreadPool-5427-thread-2) [     ] o.a.s.c.c.ConnectionManager Client is connected to ZooKeeper
   [junit4]   2> 1531496 INFO  (closeThreadPool-5427-thread-2) [     ] o.a.s.c.c.ConnectionManager Waiting for client to connect to ZooKeeper
   [junit4]   2> 1531501 INFO  (zkConnectionManagerCallback-5468-thread-1) [     ] o.a.s.c.c.ConnectionManager zkClient has connected
   [junit4]   2> 1531501 INFO  (closeThreadPool-5427-thread-2) [     ] o.a.s.c.c.ConnectionManager Client is connected to ZooKeeper
   [junit4]   2> 1531544 INFO  (closeThreadPool-5427-thread-2) [     ] o.a.s.c.c.ZkStateReader Updated live nodes from ZooKeeper... (0) -> (3)
   [junit4]   2> 1531594 INFO  (closeThreadPool-5427-thread-2) [     ] o.a.s.c.ZkController Publish node=127.0.0.1:63388_lft as DOWN
   [junit4]   2> 1531615 INFO  (closeThreadPool-5427-thread-2) [     ] o.a.s.c.TransientSolrCoreCacheDefault Allocating transient cache for 4 transient cores
   [junit4]   2> 1531615 INFO  (closeThreadPool-5427-thread-2) [     ] o.a.s.c.ZkController Register node as live in ZooKeeper:/live_nodes/127.0.0.1:63388_lft
   [junit4]   2> 1531646 INFO  (zkCallback-5410-thread-3) [     ] o.a.s.c.c.ZkStateReader Updated live nodes from ZooKeeper... (3) -> (4)
   [junit4]   2> 1531646 INFO  (zkCallback-5452-thread-1) [     ] o.a.s.c.c.ZkStateReader Updated live nodes from ZooKeeper... (3) -> (4)
   [junit4]   2> 1531646 INFO  (zkCallback-5437-thread-1) [     ] o.a.s.c.c.ZkStateReader Updated live nodes from ZooKeeper... (3) -> (4)
   [junit4]   2> 1531646 INFO  (zkCallback-5425-thread-1) [     ] o.a.s.c.c.ZkStateReader Updated live nodes from ZooKeeper... (3) -> (4)
   [junit4]   2> 1531647 INFO  (zkCallback-5467-thread-1) [     ] o.a.s.c.c.ZkStateReader Updated live nodes from ZooKeeper... (3) -> (4)
   [junit4]   2> 1531682 INFO  (closeThreadPool-5427-thread-2) [     ] o.a.s.p.PackageLoader /packages.json updated to version -1
   [junit4]   2> 1531684 WARN  (closeThreadPool-5427-thread-2) [     ] o.a.s.c.CoreContainer Not all security plugins configured!  authentication=disabled authorization=disabled.  Solr is only as secure as you make it. Consider configuring authentication/authorization before exposing Solr to users internal or external.  See https://s.apache.org/solrsecurity for more info
   [junit4]   2> 1531728 INFO  (closeThreadPool-5427-thread-2) [     ] o.a.s.h.a.MetricsHistoryHandler No .system collection, keeping metrics history in memory.
   [junit4]   2> 1531822 INFO  (closeThreadPool-5427-thread-2) [     ] o.a.s.m.r.SolrJmxReporter JMX monitoring for 'solr.node' (registry 'solr.node') enabled at server: com.sun.jmx.mbeanserver.JmxMBeanServer@74ac98b3
   [junit4]   2> 1531848 INFO  (closeThreadPool-5427-thread-2) [     ] o.a.s.m.r.SolrJmxReporter JMX monitoring for 'solr.jvm' (registry 'solr.jvm') enabled at server: com.sun.jmx.mbeanserver.JmxMBeanServer@74ac98b3
   [junit4]   2> 1531848 INFO  (closeThreadPool-5427-thread-2) [     ] o.a.s.m.r.SolrJmxReporter JMX monitoring for 'solr.jetty' (registry 'solr.jetty') enabled at server: com.sun.jmx.mbeanserver.JmxMBeanServer@74ac98b3
   [junit4]   2> 1531861 INFO  (closeThreadPool-5427-thread-2) [     ] o.a.s.c.CorePropertiesLocator Found 0 core definitions underneath /Users/jenkins/workspace/Lucene-Solr-master-MacOSX/solr/build/solr-core/test/J1/temp/solr.cloud.TestStressInPlaceUpdates_A2482538A05583EB-001/shard-3-001/cores
   [junit4]   2> 1532020 INFO  (closeThreadPool-5427-thread-2) [     ] o.a.s.c.AbstractFullDistribZkTestBase waitForLiveNode: 127.0.0.1:63388_lft
   [junit4]   2> 1532029 INFO  (qtp1473878813-15099) [n:127.0.0.1:63251_lft     ] o.a.s.h.a.CollectionsHandler Invoked Collection Action :addreplica with params node=127.0.0.1:63374_lft&action=ADDREPLICA&collection=collection1&shard=shard1&type=NRT&wt=javabin&version=2 and sendToOCPQueue=true
   [junit4]   2> 1532030 INFO  (qtp1473878813-15098) [n:127.0.0.1:63251_lft     ] o.a.s.h.a.CollectionsHandler Invoked Collection Action :addreplica with params node=127.0.0.1:63388_lft&action=ADDREPLICA&collection=collection1&shard=shard1&type=NRT&wt=javabin&version=2 and sendToOCPQueue=true
   [junit4]   2> 1532036 INFO  (qtp1473878813-15097) [n:127.0.0.1:63251_lft     ] o.a.s.h.a.CollectionsHandler Invoked Collection Action :addreplica with params node=127.0.0.1:63345_lft&action=ADDREPLICA&collection=collection1&shard=shard1&type=NRT&wt=javabin&version=2 and sendToOCPQueue=true
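The three ADDREPLICA requests just logged place one NRT replica of collection1/shard1 on each of the new nodes. A hedged SolrJ sketch of one such request (the collection, shard and node name are copied from the logged params; NRT is the default replica type, so it is not set explicitly here):

    import org.apache.solr.client.solrj.SolrClient;
    import org.apache.solr.client.solrj.request.CollectionAdminRequest;

    // Sketch of one of the logged ADDREPLICA calls.
    static void addReplica(SolrClient client) throws Exception {
        CollectionAdminRequest.AddReplica add =
            CollectionAdminRequest.addReplicaToShard("collection1", "shard1");
        add.setNode("127.0.0.1:63374_lft");   // target node taken from the logged params
        add.process(client);
    }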
   [junit4]   2> 1532048 INFO  (OverseerThreadFactory-4750-thread-3) [ c:collection1 s:shard1   ] o.a.s.c.a.c.AddReplicaCmd Node Identified 127.0.0.1:63374_lft for creating new replica of shard shard1 for collection collection1
   [junit4]   2> 1532063 INFO  (OverseerThreadFactory-4750-thread-3) [ c:collection1 s:shard1   ] o.a.s.c.a.c.AddReplicaCmd Returning CreateReplica command.
   [junit4]   2> 1532076 INFO  (qtp500504852-15187) [n:127.0.0.1:63374_lft    x:collection1_shard1_replica_n1 ] o.a.s.h.a.CoreAdminOperation core create command qt=/admin/cores&collection.configName=conf1&name=collection1_shard1_replica_n1&action=CREATE&collection=collection1&shard=shard1&wt=javabin&version=2&replicaType=NRT
   [junit4]   2> 1533119 INFO  (qtp500504852-15187) [n:127.0.0.1:63374_lft c:collection1 s:shard1  x:collection1_shard1_replica_n1 ] o.a.s.c.SolrConfig Using Lucene MatchVersion: 9.0.0
   [junit4]   2> 1533138 INFO  (qtp500504852-15187) [n:127.0.0.1:63374_lft c:collection1 s:shard1  x:collection1_shard1_replica_n1 ] o.a.s.s.IndexSchema [collection1_shard1_replica_n1] Schema name=inplace-updates
   [junit4]   2> 1533141 INFO  (qtp500504852-15187) [n:127.0.0.1:63374_lft c:collection1 s:shard1  x:collection1_shard1_replica_n1 ] o.a.s.s.IndexSchema Loaded schema inplace-updates/1.6 with uniqueid field id
   [junit4]   2> 1533141 INFO  (qtp500504852-15187) [n:127.0.0.1:63374_lft c:collection1 s:shard1  x:collection1_shard1_replica_n1 ] o.a.s.c.CoreContainer Creating SolrCore 'collection1_shard1_replica_n1' using configuration from collection collection1, trusted=true
   [junit4]   2> 1533142 INFO  (qtp500504852-15187) [n:127.0.0.1:63374_lft c:collection1 s:shard1  x:collection1_shard1_replica_n1 ] o.a.s.m.r.SolrJmxReporter JMX monitoring for 'solr.core.collection1.shard1.replica_n1' (registry 'solr.core.collection1.shard1.replica_n1') enabled at server: com.sun.jmx.mbeanserver.JmxMBeanServer@74ac98b3
   [junit4]   2> 1533142 INFO  (qtp500504852-15187) [n:127.0.0.1:63374_lft c:collection1 s:shard1  x:collection1_shard1_replica_n1 ] o.a.s.c.SolrCore [[collection1_shard1_replica_n1] ] Opening new SolrCore at [/Users/jenkins/workspace/Lucene-Solr-master-MacOSX/solr/build/solr-core/test/J1/temp/solr.cloud.TestStressInPlaceUpdates_A2482538A05583EB-001/shard-2-001/cores/collection1_shard1_replica_n1], dataDir=[/Users/jenkins/workspace/Lucene-Solr-master-MacOSX/solr/build/solr-core/test/J1/temp/solr.cloud.TestStressInPlaceUpdates_A2482538A05583EB-001/shard-2-001/cores/collection1_shard1_replica_n1/data/]
   [junit4]   2> 1533153 INFO  (qtp500504852-15187) [n:127.0.0.1:63374_lft c:collection1 s:shard1  x:collection1_shard1_replica_n1 ] o.a.s.u.RandomMergePolicy RandomMergePolicy wrapping class org.apache.lucene.index.TieredMergePolicy: [TieredMergePolicy: maxMergeAtOnce=45, maxMergeAtOnceExplicit=19, maxMergedSegmentMB=12.7734375, floorSegmentMB=1.966796875, forceMergeDeletesPctAllowed=2.8096685791294, segmentsPerTier=44.0, maxCFSSegmentSizeMB=8.796093022207999E12, noCFSRatio=1.0, deletesPctAllowed=22.563996868910873
   [junit4]   2> 1533189 WARN  (qtp500504852-15187) [n:127.0.0.1:63374_lft c:collection1 s:shard1  x:collection1_shard1_replica_n1 ] o.a.s.c.RequestHandlers INVALID paramSet a in requestHandler {type = requestHandler,name = /dump,class = DumpRequestHandler,attributes = {initParams=a, name=/dump, class=DumpRequestHandler},args = {defaults={a=A, b=B}}}
   [junit4]   2> 1533410 INFO  (qtp500504852-15187) [n:127.0.0.1:63374_lft c:collection1 s:shard1  x:collection1_shard1_replica_n1 ] o.a.s.u.UpdateHandler Using UpdateLog implementation: org.apache.solr.update.UpdateLog
   [junit4]   2> 1533412 INFO  (qtp500504852-15187) [n:127.0.0.1:63374_lft c:collection1 s:shard1  x:collection1_shard1_replica_n1 ] o.a.s.u.UpdateLog Initializing UpdateLog: dataDir= defaultSyncLevel=FLUSH numRecordsToKeep=100 maxNumLogsToKeep=10 numVersionBuckets=65536
   [junit4]   2> 1533415 INFO  (qtp500504852-15187) [n:127.0.0.1:63374_lft c:collection1 s:shard1  x:collection1_shard1_replica_n1 ] o.a.s.u.CommitTracker Hard AutoCommit: disabled
   [junit4]   2> 1533415 INFO  (qtp500504852-15187) [n:127.0.0.1:63374_lft c:collection1 s:shard1  x:collection1_shard1_replica_n1 ] o.a.s.u.CommitTracker Soft AutoCommit: disabled
   [junit4]   2> 1533417 INFO  (qtp500504852-15187) [n:127.0.0.1:63374_lft c:collection1 s:shard1  x:collection1_shard1_replica_n1 ] o.a.s.u.RandomMergePolicy RandomMergePolicy wrapping class org.apache.lucene.index.LogDocMergePolicy: [LogDocMergePolicy: minMergeSize=1000, mergeFactor=45, maxMergeSize=9223372036854775807, maxMergeSizeForForcedMerge=9223372036854775807, calibrateSizeByDeletes=true, maxMergeDocs=2147483647, maxCFSSegmentSizeMB=8.796093022207999E12, noCFSRatio=1.0]
   [junit4]   2> 1533417 INFO  (qtp500504852-15187) [n:127.0.0.1:63374_lft c:collection1 s:shard1  x:collection1_shard1_replica_n1 ] o.a.s.s.SolrIndexSearcher Opening [Searcher@3b3df4ad[collection1_shard1_replica_n1] main]
   [junit4]   2> 1533421 INFO  (qtp500504852-15187) [n:127.0.0.1:63374_lft c:collection1 s:shard1  x:collection1_shard1_replica_n1 ] o.a.s.r.ManagedResourceStorage Configured ZooKeeperStorageIO with znodeBase: /configs/conf1
   [junit4]   2> 1533422 INFO  (qtp500504852-15187) [n:127.0.0.1:63374_lft c:collection1 s:shard1  x:collection1_shard1_replica_n1 ] o.a.s.r.ManagedResourceStorage Loaded null at path _rest_managed.json using ZooKeeperStorageIO:path=/configs/conf1
   [junit4]   2> 1533423 INFO  (qtp500504852-15187) [n:127.0.0.1:63374_lft c:collection1 s:shard1  x:collection1_shard1_replica_n1 ] o.a.s.h.ReplicationHandler Commits will be reserved for 10000ms.
   [junit4]   2> 1533423 INFO  (qtp500504852-15187) [n:127.0.0.1:63374_lft c:collection1 s:shard1  x:collection1_shard1_replica_n1 ] o.a.s.u.UpdateLog Could not find max version in index or recent updates, using new clock 1655887054917074944
   [junit4]   2> 1533426 INFO  (searcherExecutor-4787-thread-1-processing-n:127.0.0.1:63374_lft x:collection1_shard1_replica_n1 c:collection1 s:shard1) [n:127.0.0.1:63374_lft c:collection1 s:shard1  x:collection1_shard1_replica_n1 ] o.a.s.c.SolrCore [collection1_shard1_replica_n1] Registered new searcher Searcher@3b3df4ad[collection1_shard1_replica_n1] main{ExitableDirectoryReader(UninvertingDirectoryReader())}
   [junit4]   2> 1533462 INFO  (qtp500504852-15187) [n:127.0.0.1:63374_lft c:collection1 s:shard1  x:collection1_shard1_replica_n1 ] o.a.s.c.ZkShardTerms Successful update of terms at /collections/collection1/terms/shard1 to Terms{values={core_node2=0}, version=0}
   [junit4]   2> 1533462 INFO  (qtp500504852-15187) [n:127.0.0.1:63374_lft c:collection1 s:shard1  x:collection1_shard1_replica_n1 ] o.a.s.c.ShardLeaderElectionContextBase make sure parent is created /collections/collection1/leaders/shard1
   [junit4]   2> 1533472 INFO  (qtp500504852-15187) [n:127.0.0.1:63374_lft c:collection1 s:shard1  x:collection1_shard1_replica_n1 ] o.a.s.c.ShardLeaderElectionContext Enough replicas found to continue.
   [junit4]   2> 1533472 INFO  (qtp500504852-15187) [n:127.0.0.1:63374_lft c:collection1 s:shard1  x:collection1_shard1_replica_n1 ] o.a.s.c.ShardLeaderElectionContext I may be the new leader - try and sync
   [junit4]   2> 1533472 INFO  (qtp500504852-15187) [n:127.0.0.1:63374_lft c:collection1 s:shard1  x:collection1_shard1_replica_n1 ] o.a.s.c.SyncStrategy Sync replicas to http://127.0.0.1:63374/lft/collection1_shard1_replica_n1/
   [junit4]   2> 1533472 INFO  (qtp500504852-15187) [n:127.0.0.1:63374_lft c:collection1 s:shard1  x:collection1_shard1_replica_n1 ] o.a.s.c.SyncStrategy Sync Success - now sync replicas to me
   [junit4]   2> 1533472 INFO  (qtp500504852-15187) [n:127.0.0.1:63374_lft c:collection1 s:shard1  x:collection1_shard1_replica_n1 ] o.a.s.c.SyncStrategy http://127.0.0.1:63374/lft/collection1_shard1_replica_n1/ has no replicas
   [junit4]   2> 1533472 INFO  (qtp500504852-15187) [n:127.0.0.1:63374_lft c:collection1 s:shard1  x:collection1_shard1_replica_n1 ] o.a.s.c.ShardLeaderElectionContextBase Creating leader registration node /collections/collection1/leaders/shard1/leader after winning as /collections/collection1/leader_elect/shard1/election/72058311538573324-core_node2-n_0000000000
   [junit4]   2> 1533478 INFO  (qtp500504852-15187) [n:127.0.0.1:63374_lft c:collection1 s:shard1  x:collection1_shard1_replica_n1 ] o.a.s.c.ShardLeaderElectionContext I am the new leader: http://127.0.0.1:63374/lft/collection1_shard1_replica_n1/ shard1
   [junit4]   2> 1533588 INFO  (qtp500504852-15187) [n:127.0.0.1:63374_lft c:collection1 s:shard1  x:collection1_shard1_replica_n1 ] 

[...truncated too long message...]

  (closeThreadPool-10013-thread-7) [     ] o.e.j.s.session node0 Stopped scavenging
   [junit4]   2> 1839304 INFO  (TEST-TestConfigReload.test-seed#[A2482538A05583EB]) [     ] o.a.s.c.ZkTestServer Shutting down ZkTestServer.
   [junit4]   2> 1839535 WARN  (ZkTestServer Run Thread) [     ] o.a.s.c.ZkTestServer Watch limit violations: 
   [junit4]   2> Maximum concurrent create/delete watches above limit:
   [junit4]   2> 
   [junit4]   2> 	10	/solr/configs/conf1
   [junit4]   2> 	7	/solr/aliases.json
   [junit4]   2> 	5	/solr/packages.json
   [junit4]   2> 	5	/solr/security.json
   [junit4]   2> 	5	/solr/collections/collection1/terms/shard1
   [junit4]   2> 	5	/solr/collections/collection1/terms/shard2
   [junit4]   2> 	4	/solr/collections/collection1/collectionprops.json
   [junit4]   2> 	2	/solr/collections/control_collection/terms/shard1
   [junit4]   2> 
   [junit4]   2> Maximum concurrent data watches above limit:
   [junit4]   2> 
   [junit4]   2> 	71	/solr/collections/collection1/state.json
   [junit4]   2> 	12	/solr/collections/control_collection/state.json
   [junit4]   2> 	7	/solr/clusterstate.json
   [junit4]   2> 	7	/solr/clusterprops.json
   [junit4]   2> 	3	/solr/overseer_elect/election/72058330640285700-127.0.0.1:50701_ulprv%2Fk-n_0000000000
   [junit4]   2> 	2	/solr/overseer_elect/election/72058330640285708-127.0.0.1:50731_ulprv%2Fk-n_0000000002
   [junit4]   2> 
   [junit4]   2> Maximum concurrent children watches above limit:
   [junit4]   2> 
   [junit4]   2> 	39	/solr/overseer/queue
   [junit4]   2> 	35	/solr/live_nodes
   [junit4]   2> 	15	/solr/overseer/collection-queue-work
   [junit4]   2> 	7	/solr/collections
   [junit4]   2> 
   [junit4]   2> 1839549 INFO  (TEST-TestConfigReload.test-seed#[A2482538A05583EB]) [     ] o.a.s.c.ZkTestServer waitForServerDown: 127.0.0.1:50695
   [junit4]   2> 1839549 INFO  (TEST-TestConfigReload.test-seed#[A2482538A05583EB]) [     ] o.a.s.c.ZkTestServer parse host and port list: 127.0.0.1:50695
   [junit4]   2> 1839549 INFO  (TEST-TestConfigReload.test-seed#[A2482538A05583EB]) [     ] o.a.s.c.ZkTestServer connecting to 127.0.0.1 50695
   [junit4]   2> 1839551 INFO  (SUITE-TestConfigReload-seed#[A2482538A05583EB]-worker) [     ] o.a.s.SolrTestCaseJ4 ------------------------------------------------------- Done waiting for tracked resources to be released
   [junit4]   2> Jan 16, 2020 12:21:02 PM com.carrotsearch.randomizedtesting.ThreadLeakControl checkThreadLeaks
   [junit4]   2> WARNING: Will linger awaiting termination of 1 leaked thread(s).
   [junit4]   2> Jan 16, 2020 12:21:12 PM com.carrotsearch.randomizedtesting.ThreadLeakControl checkThreadLeaks
   [junit4]   2> SEVERE: 1 thread leaked from SUITE scope at org.apache.solr.handler.TestConfigReload: 
   [junit4]   2>    1) Thread[id=26717, name=SessionTracker, state=TIMED_WAITING, group=TGRP-TestConfigReload]
   [junit4]   2>         at java.base@13.0.1/java.lang.Thread.sleep(Native Method)
   [junit4]   2>         at app//org.apache.zookeeper.server.SessionTrackerImpl.run(SessionTrackerImpl.java:151)
   [junit4]   2> Jan 16, 2020 12:21:12 PM com.carrotsearch.randomizedtesting.ThreadLeakControl tryToInterruptAll
   [junit4]   2> INFO: Starting to interrupt leaked threads:
   [junit4]   2>    1) Thread[id=26717, name=SessionTracker, state=TIMED_WAITING, group=TGRP-TestConfigReload]
   [junit4]   2> 1850803 ERROR (SessionTracker) [     ] o.a.z.s.ZooKeeperCriticalThread Severe unrecoverable error, from thread : SessionTracker
   [junit4]   2>           => java.lang.InterruptedException: sleep interrupted
   [junit4]   2> 	at java.base/java.lang.Thread.sleep(Native Method)
   [junit4]   2> java.lang.InterruptedException: sleep interrupted
   [junit4]   2> 	at java.lang.Thread.sleep(Native Method) [?:?]
   [junit4]   2> 	at org.apache.zookeeper.server.SessionTrackerImpl.run(SessionTrackerImpl.java:151) ~[zookeeper-3.5.5.jar:3.5.5]
   [junit4]   2> Jan 16, 2020 12:21:12 PM com.carrotsearch.randomizedtesting.ThreadLeakControl tryToInterruptAll
   [junit4]   2> INFO: All leaked threads terminated.
   [junit4]   2> NOTE: test params are: codec=Lucene84, sim=Asserting(org.apache.lucene.search.similarities.AssertingSimilarity@6f609c31), locale=kw, timezone=Africa/Bamako
   [junit4]   2> NOTE: Mac OS X 10.14.6 x86_64/AdoptOpenJDK 13.0.1 (64-bit)/cpus=6,threads=1,free=200113072,total=443547648
   [junit4]   2> NOTE: All tests run in this JVM: [TestSolrCoreSnapshots, OverseerSolrResponseTest, TestBadConfig, TestComplexPhraseLeadingWildcard, TriggerSetPropertiesIntegrationTest, TestBinaryResponseWriter, CollectionStateFormat2Test, TestOnReconnectListenerSupport, TestManagedStopFilterFactory, TestReRankQParserPlugin, CollectionReloadTest, TestElisionMultitermQuery, ProtectedTermFilterFactoryTest, DistributedFacetExistsSmallTest, TestSubQueryTransformer, ConjunctionSolrSpellCheckerTest, LeaderFailureAfterFreshStartTest, SolrJmxReporterTest, TestHighlightDedupGrouping, ShardRoutingTest, TestPerFieldSimilarity, TestFieldCacheReopen, IndexSizeTriggerSizeEstimationTest, JavabinLoaderTest, ForceLeaderWithTlogReplicasTest, SplitByPrefixTest, TestSimExecutePlanAction, TestSchemaResource, ScheduledMaintenanceTriggerTest, CategoryRoutedAliasUpdateProcessorTest, TestNumericTokenStream, TestDistributedGrouping, TestImplicitCoreProperties, MoveReplicaTest, SearchHandlerTest, PrimitiveFieldTypeTest, TestSolrCoreProperties, BJQParserTest, TestSimNodeLostTrigger, TestCloudPivotFacet, TestFieldCollectionResource, TestStressRecovery, WrapperMergePolicyFactoryTest, ResourceLoaderTest, TestWordDelimiterFilterFactory, TestInPlaceUpdateWithRouteField, TestPartialUpdateDeduplication, TestTermsQParserPlugin, TestCollectionAPIs, TestExactSharedStatsCacheCloud, TestDeleteCollectionOnDownNodes, NumericFieldsTest, TestReplicaProperties, DeleteReplicaTest, UniqFieldsUpdateProcessorFactoryTest, TestWriterPerf, SolrIndexMetricsTest, TestCustomStream, SolrLogAuditLoggerPluginTest, TestRetrieveFieldsOptimizer, TestLegacyFieldCache, CoreAdminOperationTest, TestUseDocValuesAsStored, HdfsBasicDistributedZk2Test, BooleanFieldTest, HdfsRecoverLeaseTest, DistributedFacetPivotWhiteBoxTest, AnalysisErrorHandlingTest, TestAnalyzedSuggestions, TestHighFrequencyDictionaryFactory, TestReloadDeadlock, DistributedFacetSimpleRefinementLongTailTest, TestSafeXMLParsing, TestShardHandlerFactory, TestManagedSchema, HdfsDirectoryTest, HdfsDirectoryFactoryTest, VersionInfoTest, TestTlogReplayVsRecovery, TestIBSimilarityFactory, JWTAuthPluginTest, ChaosMonkeyNothingIsSafeTest, OrderedExecutorTest, TestSegmentSorting, PeerSyncWithLeaderTest, TestSolrCloudWithKerberosAlt, AnalysisAfterCoreReloadTest, ConvertedLegacyTest, DisMaxRequestHandlerTest, SampleTest, TestSolrTestCaseJ4, TestTolerantSearch, TestTrie, PathHierarchyTokenizerFactoryTest, TestLuceneIndexBackCompat, ActionThrottleTest, AliasIntegrationTest, CollectionsAPISolrJTest, RecoveryAfterSoftCommitTest, RemoteQueryErrorTest, ReplicationFactorTest, TestCloudInspectUtil, TestCloudPhrasesIdentificationComponent, TestCloudRecovery, TestClusterProperties, TestConfigSetsAPI, TestExclusionRuleCollectionAccess, TestLRUStatsCacheCloud, TestPullReplicaErrorHandling, TestSSLRandomization, TestStressLiveNodes, CollectionsAPIAsyncDistributedZkTest, AutoAddReplicasPlanActionTest, ComputePlanActionTest, TestInfoStreamLogging, TestInitParams, TestJmxIntegration, TestMergePolicyConfig, TestNRTOpen, TestQuerySenderListener, TestQuerySenderNoQuery, TestReloadAndDeleteDocs, TestSolrDeletionPolicy1, TestSolrDeletionPolicy2, TestSolrIndexConfig, FieldAnalysisRequestHandlerTest, TestConfigReload]
   [junit4]   2> NOTE: reproduce with: ant test  -Dtestcase=TestConfigReload -Dtests.seed=A2482538A05583EB -Dtests.slow=true -Dtests.locale=kw -Dtests.timezone=Africa/Bamako -Dtests.asserts=true -Dtests.file.encoding=ISO-8859-1
   [junit4] ERROR   0.00s J4 | TestConfigReload (suite) <<<
   [junit4]    > Throwable #1: com.carrotsearch.randomizedtesting.ThreadLeakError: 1 thread leaked from SUITE scope at org.apache.solr.handler.TestConfigReload: 
   [junit4]    >    1) Thread[id=26717, name=SessionTracker, state=TIMED_WAITING, group=TGRP-TestConfigReload]
   [junit4]    >         at java.base@13.0.1/java.lang.Thread.sleep(Native Method)
   [junit4]    >         at app//org.apache.zookeeper.server.SessionTrackerImpl.run(SessionTrackerImpl.java:151)
   [junit4]    > 	at __randomizedtesting.SeedInfo.seed([A2482538A05583EB]:0)
   [junit4] Completed [654/899 (2!)] on J4 in 35.93s, 1 test, 1 error <<< FAILURES!

[...truncated 45844 lines...]
BUILD FAILED
/Users/jenkins/workspace/Lucene-Solr-master-MacOSX/build.xml:634: The following error occurred while executing this line:
/Users/jenkins/workspace/Lucene-Solr-master-MacOSX/build.xml:507: The following error occurred while executing this line:
/Users/jenkins/workspace/Lucene-Solr-master-MacOSX/build.xml:494: Source checkout is dirty (unversioned/missing files) after running tests!!! Offending files:
* solr/licenses/jetty-start-9.4.24.v20191120-shaded.jar.sha1

Total time: 71 minutes 43 seconds
Build step 'Invoke Ant' marked build as failure
Archiving artifacts
Setting ANT_1_8_2_HOME=/Users/jenkins/tools/hudson.tasks.Ant_AntInstallation/ANT_1.8.2
[WARNINGS] Skipping publisher since build result is FAILURE
Recording test results
Setting ANT_1_8_2_HOME=/Users/jenkins/tools/hudson.tasks.Ant_AntInstallation/ANT_1.8.2
Email was triggered for: Failure - Any
Sending email for trigger: Failure - Any
Setting ANT_1_8_2_HOME=/Users/jenkins/tools/hudson.tasks.Ant_AntInstallation/ANT_1.8.2

[JENKINS] Lucene-Solr-master-MacOSX (64bit/jdk-12.0.1) - Build # 5536 - Still Failing!

Posted by Policeman Jenkins Server <je...@thetaphi.de>.
Build: https://jenkins.thetaphi.de/job/Lucene-Solr-master-MacOSX/5536/
Java: 64bit/jdk-12.0.1 -XX:+UseCompressedOops -XX:+UseSerialGC

3 tests failed.
FAILED:  org.apache.solr.cloud.autoscaling.SearchRateTriggerTest.testWaitForElapsed

Error Message:
[]

Stack Trace:
java.lang.AssertionError: []
	at __randomizedtesting.SeedInfo.seed([9F32918D10A5218B:773EA39C81D91389]:0)
	at org.junit.Assert.fail(Assert.java:88)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.apache.solr.cloud.autoscaling.SearchRateTriggerTest.testWaitForElapsed(SearchRateTriggerTest.java:263)
	at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
	at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.base/java.lang.reflect.Method.invoke(Method.java:567)
	at com.carrotsearch.randomizedtesting.RandomizedRunner.invoke(RandomizedRunner.java:1754)
	at com.carrotsearch.randomizedtesting.RandomizedRunner$8.evaluate(RandomizedRunner.java:942)
	at com.carrotsearch.randomizedtesting.RandomizedRunner$9.evaluate(RandomizedRunner.java:978)
	at com.carrotsearch.randomizedtesting.RandomizedRunner$10.evaluate(RandomizedRunner.java:992)
	at com.carrotsearch.randomizedtesting.rules.SystemPropertiesRestoreRule$1.evaluate(SystemPropertiesRestoreRule.java:57)
	at org.apache.lucene.util.TestRuleSetupTeardownChained$1.evaluate(TestRuleSetupTeardownChained.java:49)
	at org.apache.lucene.util.AbstractBeforeAfterRule$1.evaluate(AbstractBeforeAfterRule.java:45)
	at org.apache.lucene.util.TestRuleThreadAndTestName$1.evaluate(TestRuleThreadAndTestName.java:48)
	at org.apache.lucene.util.TestRuleIgnoreAfterMaxFailures$1.evaluate(TestRuleIgnoreAfterMaxFailures.java:64)
	at org.apache.lucene.util.TestRuleMarkFailure$1.evaluate(TestRuleMarkFailure.java:47)
	at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
	at com.carrotsearch.randomizedtesting.ThreadLeakControl$StatementRunner.run(ThreadLeakControl.java:370)
	at com.carrotsearch.randomizedtesting.ThreadLeakControl.forkTimeoutingTask(ThreadLeakControl.java:819)
	at com.carrotsearch.randomizedtesting.ThreadLeakControl$3.evaluate(ThreadLeakControl.java:470)
	at com.carrotsearch.randomizedtesting.RandomizedRunner.runSingleTest(RandomizedRunner.java:951)
	at com.carrotsearch.randomizedtesting.RandomizedRunner$5.evaluate(RandomizedRunner.java:836)
	at com.carrotsearch.randomizedtesting.RandomizedRunner$6.evaluate(RandomizedRunner.java:887)
	at com.carrotsearch.randomizedtesting.RandomizedRunner$7.evaluate(RandomizedRunner.java:898)
	at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
	at com.carrotsearch.randomizedtesting.rules.SystemPropertiesRestoreRule$1.evaluate(SystemPropertiesRestoreRule.java:57)
	at org.apache.lucene.util.AbstractBeforeAfterRule$1.evaluate(AbstractBeforeAfterRule.java:45)
	at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
	at org.apache.lucene.util.TestRuleStoreClassName$1.evaluate(TestRuleStoreClassName.java:41)
	at com.carrotsearch.randomizedtesting.rules.NoShadowingOrOverridesOnMethodsRule$1.evaluate(NoShadowingOrOverridesOnMethodsRule.java:40)
	at com.carrotsearch.randomizedtesting.rules.NoShadowingOrOverridesOnMethodsRule$1.evaluate(NoShadowingOrOverridesOnMethodsRule.java:40)
	at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
	at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
	at org.apache.lucene.util.TestRuleAssertionsRequired$1.evaluate(TestRuleAssertionsRequired.java:53)
	at org.apache.lucene.util.TestRuleMarkFailure$1.evaluate(TestRuleMarkFailure.java:47)
	at org.apache.lucene.util.TestRuleIgnoreAfterMaxFailures$1.evaluate(TestRuleIgnoreAfterMaxFailures.java:64)
	at org.apache.lucene.util.TestRuleIgnoreTestSuites$1.evaluate(TestRuleIgnoreTestSuites.java:54)
	at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
	at com.carrotsearch.randomizedtesting.ThreadLeakControl$StatementRunner.run(ThreadLeakControl.java:370)
	at com.carrotsearch.randomizedtesting.ThreadLeakControl.lambda$forkTimeoutingTask$0(ThreadLeakControl.java:826)
	at java.base/java.lang.Thread.run(Thread.java:835)


FAILED:  junit.framework.TestSuite.org.apache.solr.cloud.hdfs.HDFSCollectionsAPITest

Error Message:
20 threads leaked from SUITE scope at org.apache.solr.cloud.hdfs.HDFSCollectionsAPITest:     1) Thread[id=3338, name=nioEventLoopGroup-2-10, state=RUNNABLE, group=TGRP-HDFSCollectionsAPITest]         at java.base@12.0.1/sun.nio.ch.KQueue.poll(Native Method)         at java.base@12.0.1/sun.nio.ch.KQueueSelectorImpl.doSelect(KQueueSelectorImpl.java:122)         at java.base@12.0.1/sun.nio.ch.SelectorImpl.lockAndDoSelect(SelectorImpl.java:124)         at java.base@12.0.1/sun.nio.ch.SelectorImpl.select(SelectorImpl.java:136)         at app//io.netty.channel.nio.NioEventLoop.select(NioEventLoop.java:737)         at app//io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:392)         at app//io.netty.util.concurrent.SingleThreadEventExecutor$5.run(SingleThreadEventExecutor.java:884)         at app//io.netty.util.concurrent.FastThreadLocalRunnable.run(FastThreadLocalRunnable.java:30)         at java.base@12.0.1/java.lang.Thread.run(Thread.java:835)    2) Thread[id=3345, name=nioEventLoopGroup-3-5, state=RUNNABLE, group=TGRP-HDFSCollectionsAPITest]         at java.base@12.0.1/sun.nio.ch.KQueue.poll(Native Method)         at java.base@12.0.1/sun.nio.ch.KQueueSelectorImpl.doSelect(KQueueSelectorImpl.java:122)         at java.base@12.0.1/sun.nio.ch.SelectorImpl.lockAndDoSelect(SelectorImpl.java:124)         at java.base@12.0.1/sun.nio.ch.SelectorImpl.select(SelectorImpl.java:136)         at app//io.netty.channel.nio.NioEventLoop.select(NioEventLoop.java:737)         at app//io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:392)         at app//io.netty.util.concurrent.SingleThreadEventExecutor$5.run(SingleThreadEventExecutor.java:884)         at app//io.netty.util.concurrent.FastThreadLocalRunnable.run(FastThreadLocalRunnable.java:30)         at java.base@12.0.1/java.lang.Thread.run(Thread.java:835)    3) Thread[id=3350, name=nioEventLoopGroup-3-10, state=RUNNABLE, group=TGRP-HDFSCollectionsAPITest]         at java.base@12.0.1/sun.nio.ch.KQueue.poll(Native Method)         at java.base@12.0.1/sun.nio.ch.KQueueSelectorImpl.doSelect(KQueueSelectorImpl.java:122)         at java.base@12.0.1/sun.nio.ch.SelectorImpl.lockAndDoSelect(SelectorImpl.java:124)         at java.base@12.0.1/sun.nio.ch.SelectorImpl.select(SelectorImpl.java:136)         at app//io.netty.channel.nio.NioEventLoop.select(NioEventLoop.java:737)         at app//io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:392)         at app//io.netty.util.concurrent.SingleThreadEventExecutor$5.run(SingleThreadEventExecutor.java:884)         at app//io.netty.util.concurrent.FastThreadLocalRunnable.run(FastThreadLocalRunnable.java:30)         at java.base@12.0.1/java.lang.Thread.run(Thread.java:835)    4) Thread[id=3333, name=nioEventLoopGroup-2-5, state=RUNNABLE, group=TGRP-HDFSCollectionsAPITest]         at java.base@12.0.1/sun.nio.ch.KQueue.poll(Native Method)         at java.base@12.0.1/sun.nio.ch.KQueueSelectorImpl.doSelect(KQueueSelectorImpl.java:122)         at java.base@12.0.1/sun.nio.ch.SelectorImpl.lockAndDoSelect(SelectorImpl.java:124)         at java.base@12.0.1/sun.nio.ch.SelectorImpl.select(SelectorImpl.java:136)         at app//io.netty.channel.nio.NioEventLoop.select(NioEventLoop.java:737)         at app//io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:392)         at app//io.netty.util.concurrent.SingleThreadEventExecutor$5.run(SingleThreadEventExecutor.java:884)         at app//io.netty.util.concurrent.FastThreadLocalRunnable.run(FastThreadLocalRunnable.java:30)         at 
java.base@12.0.1/java.lang.Thread.run(Thread.java:835)    5) Thread[id=3348, name=nioEventLoopGroup-3-8, state=RUNNABLE, group=TGRP-HDFSCollectionsAPITest]         at java.base@12.0.1/sun.nio.ch.KQueue.poll(Native Method)         at java.base@12.0.1/sun.nio.ch.KQueueSelectorImpl.doSelect(KQueueSelectorImpl.java:122)         at java.base@12.0.1/sun.nio.ch.SelectorImpl.lockAndDoSelect(SelectorImpl.java:124)         at java.base@12.0.1/sun.nio.ch.SelectorImpl.select(SelectorImpl.java:136)         at app//io.netty.channel.nio.NioEventLoop.select(NioEventLoop.java:737)         at app//io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:392)         at app//io.netty.util.concurrent.SingleThreadEventExecutor$5.run(SingleThreadEventExecutor.java:884)         at app//io.netty.util.concurrent.FastThreadLocalRunnable.run(FastThreadLocalRunnable.java:30)         at java.base@12.0.1/java.lang.Thread.run(Thread.java:835)    6) Thread[id=3351, name=nioEventLoopGroup-3-11, state=RUNNABLE, group=TGRP-HDFSCollectionsAPITest]         at java.base@12.0.1/sun.nio.ch.KQueue.poll(Native Method)         at java.base@12.0.1/sun.nio.ch.KQueueSelectorImpl.doSelect(KQueueSelectorImpl.java:122)         at java.base@12.0.1/sun.nio.ch.SelectorImpl.lockAndDoSelect(SelectorImpl.java:124)         at java.base@12.0.1/sun.nio.ch.SelectorImpl.select(SelectorImpl.java:136)         at app//io.netty.channel.nio.NioEventLoop.select(NioEventLoop.java:737)         at app//io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:392)         at app//io.netty.util.concurrent.SingleThreadEventExecutor$5.run(SingleThreadEventExecutor.java:884)         at app//io.netty.util.concurrent.FastThreadLocalRunnable.run(FastThreadLocalRunnable.java:30)         at java.base@12.0.1/java.lang.Thread.run(Thread.java:835)    7) Thread[id=3337, name=nioEventLoopGroup-2-9, state=RUNNABLE, group=TGRP-HDFSCollectionsAPITest]         at java.base@12.0.1/sun.nio.ch.KQueue.poll(Native Method)         at java.base@12.0.1/sun.nio.ch.KQueueSelectorImpl.doSelect(KQueueSelectorImpl.java:122)         at java.base@12.0.1/sun.nio.ch.SelectorImpl.lockAndDoSelect(SelectorImpl.java:124)         at java.base@12.0.1/sun.nio.ch.SelectorImpl.select(SelectorImpl.java:136)         at app//io.netty.channel.nio.NioEventLoop.select(NioEventLoop.java:737)         at app//io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:392)         at app//io.netty.util.concurrent.SingleThreadEventExecutor$5.run(SingleThreadEventExecutor.java:884)         at app//io.netty.util.concurrent.FastThreadLocalRunnable.run(FastThreadLocalRunnable.java:30)         at java.base@12.0.1/java.lang.Thread.run(Thread.java:835)    8) Thread[id=3347, name=nioEventLoopGroup-3-7, state=RUNNABLE, group=TGRP-HDFSCollectionsAPITest]         at java.base@12.0.1/sun.nio.ch.KQueue.poll(Native Method)         at java.base@12.0.1/sun.nio.ch.KQueueSelectorImpl.doSelect(KQueueSelectorImpl.java:122)         at java.base@12.0.1/sun.nio.ch.SelectorImpl.lockAndDoSelect(SelectorImpl.java:124)         at java.base@12.0.1/sun.nio.ch.SelectorImpl.select(SelectorImpl.java:136)         at app//io.netty.channel.nio.NioEventLoop.select(NioEventLoop.java:737)         at app//io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:392)         at app//io.netty.util.concurrent.SingleThreadEventExecutor$5.run(SingleThreadEventExecutor.java:884)         at app//io.netty.util.concurrent.FastThreadLocalRunnable.run(FastThreadLocalRunnable.java:30)         at java.base@12.0.1/java.lang.Thread.run(Thread.java:835)    9) 
Thread[id=3334, name=nioEventLoopGroup-2-6, state=RUNNABLE, group=TGRP-HDFSCollectionsAPITest]         at java.base@12.0.1/sun.nio.ch.KQueue.poll(Native Method)         at java.base@12.0.1/sun.nio.ch.KQueueSelectorImpl.doSelect(KQueueSelectorImpl.java:122)         at java.base@12.0.1/sun.nio.ch.SelectorImpl.lockAndDoSelect(SelectorImpl.java:124)         at java.base@12.0.1/sun.nio.ch.SelectorImpl.select(SelectorImpl.java:136)         at app//io.netty.channel.nio.NioEventLoop.select(NioEventLoop.java:737)         at app//io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:392)         at app//io.netty.util.concurrent.SingleThreadEventExecutor$5.run(SingleThreadEventExecutor.java:884)         at app//io.netty.util.concurrent.FastThreadLocalRunnable.run(FastThreadLocalRunnable.java:30)         at java.base@12.0.1/java.lang.Thread.run(Thread.java:835)   10) Thread[id=3343, name=nioEventLoopGroup-3-3, state=RUNNABLE, group=TGRP-HDFSCollectionsAPITest]         at java.base@12.0.1/sun.nio.ch.KQueue.poll(Native Method)         at java.base@12.0.1/sun.nio.ch.KQueueSelectorImpl.doSelect(KQueueSelectorImpl.java:122)         at java.base@12.0.1/sun.nio.ch.SelectorImpl.lockAndDoSelect(SelectorImpl.java:124)         at java.base@12.0.1/sun.nio.ch.SelectorImpl.select(SelectorImpl.java:136)         at app//io.netty.channel.nio.NioEventLoop.select(NioEventLoop.java:737)         at app//io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:392)         at app//io.netty.util.concurrent.SingleThreadEventExecutor$5.run(SingleThreadEventExecutor.java:884)         at app//io.netty.util.concurrent.FastThreadLocalRunnable.run(FastThreadLocalRunnable.java:30)         at java.base@12.0.1/java.lang.Thread.run(Thread.java:835)   11) Thread[id=3344, name=nioEventLoopGroup-3-4, state=RUNNABLE, group=TGRP-HDFSCollectionsAPITest]         at java.base@12.0.1/sun.nio.ch.KQueue.poll(Native Method)         at java.base@12.0.1/sun.nio.ch.KQueueSelectorImpl.doSelect(KQueueSelectorImpl.java:122)         at java.base@12.0.1/sun.nio.ch.SelectorImpl.lockAndDoSelect(SelectorImpl.java:124)         at java.base@12.0.1/sun.nio.ch.SelectorImpl.select(SelectorImpl.java:136)         at app//io.netty.channel.nio.NioEventLoop.select(NioEventLoop.java:737)         at app//io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:392)         at app//io.netty.util.concurrent.SingleThreadEventExecutor$5.run(SingleThreadEventExecutor.java:884)         at app//io.netty.util.concurrent.FastThreadLocalRunnable.run(FastThreadLocalRunnable.java:30)         at java.base@12.0.1/java.lang.Thread.run(Thread.java:835)   12) Thread[id=3341, name=nioEventLoopGroup-3-1, state=RUNNABLE, group=TGRP-HDFSCollectionsAPITest]         at java.base@12.0.1/sun.nio.ch.KQueue.poll(Native Method)         at java.base@12.0.1/sun.nio.ch.KQueueSelectorImpl.doSelect(KQueueSelectorImpl.java:122)         at java.base@12.0.1/sun.nio.ch.SelectorImpl.lockAndDoSelect(SelectorImpl.java:124)         at java.base@12.0.1/sun.nio.ch.SelectorImpl.select(SelectorImpl.java:136)         at app//io.netty.channel.nio.NioEventLoop.select(NioEventLoop.java:737)         at app//io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:392)         at app//io.netty.util.concurrent.SingleThreadEventExecutor$5.run(SingleThreadEventExecutor.java:884)         at app//io.netty.util.concurrent.FastThreadLocalRunnable.run(FastThreadLocalRunnable.java:30)         at java.base@12.0.1/java.lang.Thread.run(Thread.java:835)   13) Thread[id=3336, name=nioEventLoopGroup-2-8, state=RUNNABLE, 
group=TGRP-HDFSCollectionsAPITest]         at java.base@12.0.1/sun.nio.ch.KQueue.poll(Native Method)         at java.base@12.0.1/sun.nio.ch.KQueueSelectorImpl.doSelect(KQueueSelectorImpl.java:122)         at java.base@12.0.1/sun.nio.ch.SelectorImpl.lockAndDoSelect(SelectorImpl.java:124)         at java.base@12.0.1/sun.nio.ch.SelectorImpl.select(SelectorImpl.java:136)         at app//io.netty.channel.nio.NioEventLoop.select(NioEventLoop.java:737)         at app//io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:392)         at app//io.netty.util.concurrent.SingleThreadEventExecutor$5.run(SingleThreadEventExecutor.java:884)         at app//io.netty.util.concurrent.FastThreadLocalRunnable.run(FastThreadLocalRunnable.java:30)         at java.base@12.0.1/java.lang.Thread.run(Thread.java:835)   14) Thread[id=3331, name=nioEventLoopGroup-2-3, state=RUNNABLE, group=TGRP-HDFSCollectionsAPITest]         at java.base@12.0.1/sun.nio.ch.KQueue.poll(Native Method)         at java.base@12.0.1/sun.nio.ch.KQueueSelectorImpl.doSelect(KQueueSelectorImpl.java:122)         at java.base@12.0.1/sun.nio.ch.SelectorImpl.lockAndDoSelect(SelectorImpl.java:124)         at java.base@12.0.1/sun.nio.ch.SelectorImpl.select(SelectorImpl.java:136)         at app//io.netty.channel.nio.NioEventLoop.select(NioEventLoop.java:737)         at app//io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:392)         at app//io.netty.util.concurrent.SingleThreadEventExecutor$5.run(SingleThreadEventExecutor.java:884)         at app//io.netty.util.concurrent.FastThreadLocalRunnable.run(FastThreadLocalRunnable.java:30)         at java.base@12.0.1/java.lang.Thread.run(Thread.java:835)   15) Thread[id=3036, name=nioEventLoopGroup-2-1, state=RUNNABLE, group=TGRP-HDFSCollectionsAPITest]         at java.base@12.0.1/sun.nio.ch.KQueue.poll(Native Method)         at java.base@12.0.1/sun.nio.ch.KQueueSelectorImpl.doSelect(KQueueSelectorImpl.java:122)         at java.base@12.0.1/sun.nio.ch.SelectorImpl.lockAndDoSelect(SelectorImpl.java:124)         at java.base@12.0.1/sun.nio.ch.SelectorImpl.select(SelectorImpl.java:136)         at app//io.netty.channel.nio.NioEventLoop.select(NioEventLoop.java:737)         at app//io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:392)         at app//io.netty.util.concurrent.SingleThreadEventExecutor$5.run(SingleThreadEventExecutor.java:884)         at app//io.netty.util.concurrent.FastThreadLocalRunnable.run(FastThreadLocalRunnable.java:30)         at java.base@12.0.1/java.lang.Thread.run(Thread.java:835)   16) Thread[id=3335, name=nioEventLoopGroup-2-7, state=RUNNABLE, group=TGRP-HDFSCollectionsAPITest]         at java.base@12.0.1/sun.nio.ch.KQueue.poll(Native Method)         at java.base@12.0.1/sun.nio.ch.KQueueSelectorImpl.doSelect(KQueueSelectorImpl.java:122)         at java.base@12.0.1/sun.nio.ch.SelectorImpl.lockAndDoSelect(SelectorImpl.java:124)         at java.base@12.0.1/sun.nio.ch.SelectorImpl.select(SelectorImpl.java:136)         at app//io.netty.channel.nio.NioEventLoop.select(NioEventLoop.java:737)         at app//io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:392)         at app//io.netty.util.concurrent.SingleThreadEventExecutor$5.run(SingleThreadEventExecutor.java:884)         at app//io.netty.util.concurrent.FastThreadLocalRunnable.run(FastThreadLocalRunnable.java:30)         at java.base@12.0.1/java.lang.Thread.run(Thread.java:835)   17) Thread[id=3342, name=nioEventLoopGroup-3-2, state=RUNNABLE, group=TGRP-HDFSCollectionsAPITest]         at 
java.base@12.0.1/sun.nio.ch.KQueue.poll(Native Method)         at java.base@12.0.1/sun.nio.ch.KQueueSelectorImpl.doSelect(KQueueSelectorImpl.java:122)         at java.base@12.0.1/sun.nio.ch.SelectorImpl.lockAndDoSelect(SelectorImpl.java:124)         at java.base@12.0.1/sun.nio.ch.SelectorImpl.select(SelectorImpl.java:136)         at app//io.netty.channel.nio.NioEventLoop.select(NioEventLoop.java:737)         at app//io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:392)         at app//io.netty.util.concurrent.SingleThreadEventExecutor$5.run(SingleThreadEventExecutor.java:884)         at app//io.netty.util.concurrent.FastThreadLocalRunnable.run(FastThreadLocalRunnable.java:30)         at java.base@12.0.1/java.lang.Thread.run(Thread.java:835)   18) Thread[id=3332, name=nioEventLoopGroup-2-4, state=RUNNABLE, group=TGRP-HDFSCollectionsAPITest]         at java.base@12.0.1/sun.nio.ch.KQueue.poll(Native Method)         at java.base@12.0.1/sun.nio.ch.KQueueSelectorImpl.doSelect(KQueueSelectorImpl.java:122)         at java.base@12.0.1/sun.nio.ch.SelectorImpl.lockAndDoSelect(SelectorImpl.java:124)         at java.base@12.0.1/sun.nio.ch.SelectorImpl.select(SelectorImpl.java:136)         at app//io.netty.channel.nio.NioEventLoop.select(NioEventLoop.java:737)         at app//io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:392)         at app//io.netty.util.concurrent.SingleThreadEventExecutor$5.run(SingleThreadEventExecutor.java:884)         at app//io.netty.util.concurrent.FastThreadLocalRunnable.run(FastThreadLocalRunnable.java:30)         at java.base@12.0.1/java.lang.Thread.run(Thread.java:835)   19) Thread[id=3346, name=nioEventLoopGroup-3-6, state=RUNNABLE, group=TGRP-HDFSCollectionsAPITest]         at java.base@12.0.1/sun.nio.ch.KQueue.poll(Native Method)         at java.base@12.0.1/sun.nio.ch.KQueueSelectorImpl.doSelect(KQueueSelectorImpl.java:122)         at java.base@12.0.1/sun.nio.ch.SelectorImpl.lockAndDoSelect(SelectorImpl.java:124)         at java.base@12.0.1/sun.nio.ch.SelectorImpl.select(SelectorImpl.java:136)         at app//io.netty.channel.nio.NioEventLoop.select(NioEventLoop.java:737)         at app//io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:392)         at app//io.netty.util.concurrent.SingleThreadEventExecutor$5.run(SingleThreadEventExecutor.java:884)         at app//io.netty.util.concurrent.FastThreadLocalRunnable.run(FastThreadLocalRunnable.java:30)         at java.base@12.0.1/java.lang.Thread.run(Thread.java:835)   20) Thread[id=3340, name=nioEventLoopGroup-2-12, state=RUNNABLE, group=TGRP-HDFSCollectionsAPITest]         at java.base@12.0.1/sun.nio.ch.KQueue.poll(Native Method)         at java.base@12.0.1/sun.nio.ch.KQueueSelectorImpl.doSelect(KQueueSelectorImpl.java:122)         at java.base@12.0.1/sun.nio.ch.SelectorImpl.lockAndDoSelect(SelectorImpl.java:124)         at java.base@12.0.1/sun.nio.ch.SelectorImpl.select(SelectorImpl.java:136)         at app//io.netty.channel.nio.NioEventLoop.select(NioEventLoop.java:737)         at app//io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:392)         at app//io.netty.util.concurrent.SingleThreadEventExecutor$5.run(SingleThreadEventExecutor.java:884)         at app//io.netty.util.concurrent.FastThreadLocalRunnable.run(FastThreadLocalRunnable.java:30)         at java.base@12.0.1/java.lang.Thread.run(Thread.java:835)

Stack Trace:
com.carrotsearch.randomizedtesting.ThreadLeakError: 20 threads leaked from SUITE scope at org.apache.solr.cloud.hdfs.HDFSCollectionsAPITest: 
   1) Thread[id=3338, name=nioEventLoopGroup-2-10, state=RUNNABLE, group=TGRP-HDFSCollectionsAPITest]
        at java.base@12.0.1/sun.nio.ch.KQueue.poll(Native Method)
        at java.base@12.0.1/sun.nio.ch.KQueueSelectorImpl.doSelect(KQueueSelectorImpl.java:122)
        at java.base@12.0.1/sun.nio.ch.SelectorImpl.lockAndDoSelect(SelectorImpl.java:124)
        at java.base@12.0.1/sun.nio.ch.SelectorImpl.select(SelectorImpl.java:136)
        at app//io.netty.channel.nio.NioEventLoop.select(NioEventLoop.java:737)
        at app//io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:392)
        at app//io.netty.util.concurrent.SingleThreadEventExecutor$5.run(SingleThreadEventExecutor.java:884)
        at app//io.netty.util.concurrent.FastThreadLocalRunnable.run(FastThreadLocalRunnable.java:30)
        at java.base@12.0.1/java.lang.Thread.run(Thread.java:835)
   2) Thread[id=3345, name=nioEventLoopGroup-3-5, state=RUNNABLE, group=TGRP-HDFSCollectionsAPITest]
        at java.base@12.0.1/sun.nio.ch.KQueue.poll(Native Method)
        at java.base@12.0.1/sun.nio.ch.KQueueSelectorImpl.doSelect(KQueueSelectorImpl.java:122)
        at java.base@12.0.1/sun.nio.ch.SelectorImpl.lockAndDoSelect(SelectorImpl.java:124)
        at java.base@12.0.1/sun.nio.ch.SelectorImpl.select(SelectorImpl.java:136)
        at app//io.netty.channel.nio.NioEventLoop.select(NioEventLoop.java:737)
        at app//io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:392)
        at app//io.netty.util.concurrent.SingleThreadEventExecutor$5.run(SingleThreadEventExecutor.java:884)
        at app//io.netty.util.concurrent.FastThreadLocalRunnable.run(FastThreadLocalRunnable.java:30)
        at java.base@12.0.1/java.lang.Thread.run(Thread.java:835)
   3) Thread[id=3350, name=nioEventLoopGroup-3-10, state=RUNNABLE, group=TGRP-HDFSCollectionsAPITest]
        at java.base@12.0.1/sun.nio.ch.KQueue.poll(Native Method)
        at java.base@12.0.1/sun.nio.ch.KQueueSelectorImpl.doSelect(KQueueSelectorImpl.java:122)
        at java.base@12.0.1/sun.nio.ch.SelectorImpl.lockAndDoSelect(SelectorImpl.java:124)
        at java.base@12.0.1/sun.nio.ch.SelectorImpl.select(SelectorImpl.java:136)
        at app//io.netty.channel.nio.NioEventLoop.select(NioEventLoop.java:737)
        at app//io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:392)
        at app//io.netty.util.concurrent.SingleThreadEventExecutor$5.run(SingleThreadEventExecutor.java:884)
        at app//io.netty.util.concurrent.FastThreadLocalRunnable.run(FastThreadLocalRunnable.java:30)
        at java.base@12.0.1/java.lang.Thread.run(Thread.java:835)
   4) Thread[id=3333, name=nioEventLoopGroup-2-5, state=RUNNABLE, group=TGRP-HDFSCollectionsAPITest]
        at java.base@12.0.1/sun.nio.ch.KQueue.poll(Native Method)
        at java.base@12.0.1/sun.nio.ch.KQueueSelectorImpl.doSelect(KQueueSelectorImpl.java:122)
        at java.base@12.0.1/sun.nio.ch.SelectorImpl.lockAndDoSelect(SelectorImpl.java:124)
        at java.base@12.0.1/sun.nio.ch.SelectorImpl.select(SelectorImpl.java:136)
        at app//io.netty.channel.nio.NioEventLoop.select(NioEventLoop.java:737)
        at app//io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:392)
        at app//io.netty.util.concurrent.SingleThreadEventExecutor$5.run(SingleThreadEventExecutor.java:884)
        at app//io.netty.util.concurrent.FastThreadLocalRunnable.run(FastThreadLocalRunnable.java:30)
        at java.base@12.0.1/java.lang.Thread.run(Thread.java:835)
   5) Thread[id=3348, name=nioEventLoopGroup-3-8, state=RUNNABLE, group=TGRP-HDFSCollectionsAPITest]
        at java.base@12.0.1/sun.nio.ch.KQueue.poll(Native Method)
        at java.base@12.0.1/sun.nio.ch.KQueueSelectorImpl.doSelect(KQueueSelectorImpl.java:122)
        at java.base@12.0.1/sun.nio.ch.SelectorImpl.lockAndDoSelect(SelectorImpl.java:124)
        at java.base@12.0.1/sun.nio.ch.SelectorImpl.select(SelectorImpl.java:136)
        at app//io.netty.channel.nio.NioEventLoop.select(NioEventLoop.java:737)
        at app//io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:392)
        at app//io.netty.util.concurrent.SingleThreadEventExecutor$5.run(SingleThreadEventExecutor.java:884)
        at app//io.netty.util.concurrent.FastThreadLocalRunnable.run(FastThreadLocalRunnable.java:30)
        at java.base@12.0.1/java.lang.Thread.run(Thread.java:835)
   6) Thread[id=3351, name=nioEventLoopGroup-3-11, state=RUNNABLE, group=TGRP-HDFSCollectionsAPITest]
        at java.base@12.0.1/sun.nio.ch.KQueue.poll(Native Method)
        at java.base@12.0.1/sun.nio.ch.KQueueSelectorImpl.doSelect(KQueueSelectorImpl.java:122)
        at java.base@12.0.1/sun.nio.ch.SelectorImpl.lockAndDoSelect(SelectorImpl.java:124)
        at java.base@12.0.1/sun.nio.ch.SelectorImpl.select(SelectorImpl.java:136)
        at app//io.netty.channel.nio.NioEventLoop.select(NioEventLoop.java:737)
        at app//io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:392)
        at app//io.netty.util.concurrent.SingleThreadEventExecutor$5.run(SingleThreadEventExecutor.java:884)
        at app//io.netty.util.concurrent.FastThreadLocalRunnable.run(FastThreadLocalRunnable.java:30)
        at java.base@12.0.1/java.lang.Thread.run(Thread.java:835)
   7) Thread[id=3337, name=nioEventLoopGroup-2-9, state=RUNNABLE, group=TGRP-HDFSCollectionsAPITest]
        at java.base@12.0.1/sun.nio.ch.KQueue.poll(Native Method)
        at java.base@12.0.1/sun.nio.ch.KQueueSelectorImpl.doSelect(KQueueSelectorImpl.java:122)
        at java.base@12.0.1/sun.nio.ch.SelectorImpl.lockAndDoSelect(SelectorImpl.java:124)
        at java.base@12.0.1/sun.nio.ch.SelectorImpl.select(SelectorImpl.java:136)
        at app//io.netty.channel.nio.NioEventLoop.select(NioEventLoop.java:737)
        at app//io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:392)
        at app//io.netty.util.concurrent.SingleThreadEventExecutor$5.run(SingleThreadEventExecutor.java:884)
        at app//io.netty.util.concurrent.FastThreadLocalRunnable.run(FastThreadLocalRunnable.java:30)
        at java.base@12.0.1/java.lang.Thread.run(Thread.java:835)
   8) Thread[id=3347, name=nioEventLoopGroup-3-7, state=RUNNABLE, group=TGRP-HDFSCollectionsAPITest]
        at java.base@12.0.1/sun.nio.ch.KQueue.poll(Native Method)
        at java.base@12.0.1/sun.nio.ch.KQueueSelectorImpl.doSelect(KQueueSelectorImpl.java:122)
        at java.base@12.0.1/sun.nio.ch.SelectorImpl.lockAndDoSelect(SelectorImpl.java:124)
        at java.base@12.0.1/sun.nio.ch.SelectorImpl.select(SelectorImpl.java:136)
        at app//io.netty.channel.nio.NioEventLoop.select(NioEventLoop.java:737)
        at app//io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:392)
        at app//io.netty.util.concurrent.SingleThreadEventExecutor$5.run(SingleThreadEventExecutor.java:884)
        at app//io.netty.util.concurrent.FastThreadLocalRunnable.run(FastThreadLocalRunnable.java:30)
        at java.base@12.0.1/java.lang.Thread.run(Thread.java:835)
   9) Thread[id=3334, name=nioEventLoopGroup-2-6, state=RUNNABLE, group=TGRP-HDFSCollectionsAPITest]
        at java.base@12.0.1/sun.nio.ch.KQueue.poll(Native Method)
        at java.base@12.0.1/sun.nio.ch.KQueueSelectorImpl.doSelect(KQueueSelectorImpl.java:122)
        at java.base@12.0.1/sun.nio.ch.SelectorImpl.lockAndDoSelect(SelectorImpl.java:124)
        at java.base@12.0.1/sun.nio.ch.SelectorImpl.select(SelectorImpl.java:136)
        at app//io.netty.channel.nio.NioEventLoop.select(NioEventLoop.java:737)
        at app//io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:392)
        at app//io.netty.util.concurrent.SingleThreadEventExecutor$5.run(SingleThreadEventExecutor.java:884)
        at app//io.netty.util.concurrent.FastThreadLocalRunnable.run(FastThreadLocalRunnable.java:30)
        at java.base@12.0.1/java.lang.Thread.run(Thread.java:835)
  10) Thread[id=3343, name=nioEventLoopGroup-3-3, state=RUNNABLE, group=TGRP-HDFSCollectionsAPITest]
        at java.base@12.0.1/sun.nio.ch.KQueue.poll(Native Method)
        at java.base@12.0.1/sun.nio.ch.KQueueSelectorImpl.doSelect(KQueueSelectorImpl.java:122)
        at java.base@12.0.1/sun.nio.ch.SelectorImpl.lockAndDoSelect(SelectorImpl.java:124)
        at java.base@12.0.1/sun.nio.ch.SelectorImpl.select(SelectorImpl.java:136)
        at app//io.netty.channel.nio.NioEventLoop.select(NioEventLoop.java:737)
        at app//io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:392)
        at app//io.netty.util.concurrent.SingleThreadEventExecutor$5.run(SingleThreadEventExecutor.java:884)
        at app//io.netty.util.concurrent.FastThreadLocalRunnable.run(FastThreadLocalRunnable.java:30)
        at java.base@12.0.1/java.lang.Thread.run(Thread.java:835)
  11) Thread[id=3344, name=nioEventLoopGroup-3-4, state=RUNNABLE, group=TGRP-HDFSCollectionsAPITest]
        at java.base@12.0.1/sun.nio.ch.KQueue.poll(Native Method)
        at java.base@12.0.1/sun.nio.ch.KQueueSelectorImpl.doSelect(KQueueSelectorImpl.java:122)
        at java.base@12.0.1/sun.nio.ch.SelectorImpl.lockAndDoSelect(SelectorImpl.java:124)
        at java.base@12.0.1/sun.nio.ch.SelectorImpl.select(SelectorImpl.java:136)
        at app//io.netty.channel.nio.NioEventLoop.select(NioEventLoop.java:737)
        at app//io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:392)
        at app//io.netty.util.concurrent.SingleThreadEventExecutor$5.run(SingleThreadEventExecutor.java:884)
        at app//io.netty.util.concurrent.FastThreadLocalRunnable.run(FastThreadLocalRunnable.java:30)
        at java.base@12.0.1/java.lang.Thread.run(Thread.java:835)
  12) Thread[id=3341, name=nioEventLoopGroup-3-1, state=RUNNABLE, group=TGRP-HDFSCollectionsAPITest]
        at java.base@12.0.1/sun.nio.ch.KQueue.poll(Native Method)
        at java.base@12.0.1/sun.nio.ch.KQueueSelectorImpl.doSelect(KQueueSelectorImpl.java:122)
        at java.base@12.0.1/sun.nio.ch.SelectorImpl.lockAndDoSelect(SelectorImpl.java:124)
        at java.base@12.0.1/sun.nio.ch.SelectorImpl.select(SelectorImpl.java:136)
        at app//io.netty.channel.nio.NioEventLoop.select(NioEventLoop.java:737)
        at app//io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:392)
        at app//io.netty.util.concurrent.SingleThreadEventExecutor$5.run(SingleThreadEventExecutor.java:884)
        at app//io.netty.util.concurrent.FastThreadLocalRunnable.run(FastThreadLocalRunnable.java:30)
        at java.base@12.0.1/java.lang.Thread.run(Thread.java:835)
  13) Thread[id=3336, name=nioEventLoopGroup-2-8, state=RUNNABLE, group=TGRP-HDFSCollectionsAPITest]
        at java.base@12.0.1/sun.nio.ch.KQueue.poll(Native Method)
        at java.base@12.0.1/sun.nio.ch.KQueueSelectorImpl.doSelect(KQueueSelectorImpl.java:122)
        at java.base@12.0.1/sun.nio.ch.SelectorImpl.lockAndDoSelect(SelectorImpl.java:124)
        at java.base@12.0.1/sun.nio.ch.SelectorImpl.select(SelectorImpl.java:136)
        at app//io.netty.channel.nio.NioEventLoop.select(NioEventLoop.java:737)
        at app//io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:392)
        at app//io.netty.util.concurrent.SingleThreadEventExecutor$5.run(SingleThreadEventExecutor.java:884)
        at app//io.netty.util.concurrent.FastThreadLocalRunnable.run(FastThreadLocalRunnable.java:30)
        at java.base@12.0.1/java.lang.Thread.run(Thread.java:835)
  14) Thread[id=3331, name=nioEventLoopGroup-2-3, state=RUNNABLE, group=TGRP-HDFSCollectionsAPITest]
        at java.base@12.0.1/sun.nio.ch.KQueue.poll(Native Method)
        at java.base@12.0.1/sun.nio.ch.KQueueSelectorImpl.doSelect(KQueueSelectorImpl.java:122)
        at java.base@12.0.1/sun.nio.ch.SelectorImpl.lockAndDoSelect(SelectorImpl.java:124)
        at java.base@12.0.1/sun.nio.ch.SelectorImpl.select(SelectorImpl.java:136)
        at app//io.netty.channel.nio.NioEventLoop.select(NioEventLoop.java:737)
        at app//io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:392)
        at app//io.netty.util.concurrent.SingleThreadEventExecutor$5.run(SingleThreadEventExecutor.java:884)
        at app//io.netty.util.concurrent.FastThreadLocalRunnable.run(FastThreadLocalRunnable.java:30)
        at java.base@12.0.1/java.lang.Thread.run(Thread.java:835)
  15) Thread[id=3036, name=nioEventLoopGroup-2-1, state=RUNNABLE, group=TGRP-HDFSCollectionsAPITest]
        at java.base@12.0.1/sun.nio.ch.KQueue.poll(Native Method)
        at java.base@12.0.1/sun.nio.ch.KQueueSelectorImpl.doSelect(KQueueSelectorImpl.java:122)
        at java.base@12.0.1/sun.nio.ch.SelectorImpl.lockAndDoSelect(SelectorImpl.java:124)
        at java.base@12.0.1/sun.nio.ch.SelectorImpl.select(SelectorImpl.java:136)
        at app//io.netty.channel.nio.NioEventLoop.select(NioEventLoop.java:737)
        at app//io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:392)
        at app//io.netty.util.concurrent.SingleThreadEventExecutor$5.run(SingleThreadEventExecutor.java:884)
        at app//io.netty.util.concurrent.FastThreadLocalRunnable.run(FastThreadLocalRunnable.java:30)
        at java.base@12.0.1/java.lang.Thread.run(Thread.java:835)
  16) Thread[id=3335, name=nioEventLoopGroup-2-7, state=RUNNABLE, group=TGRP-HDFSCollectionsAPITest]
        at java.base@12.0.1/sun.nio.ch.KQueue.poll(Native Method)
        at java.base@12.0.1/sun.nio.ch.KQueueSelectorImpl.doSelect(KQueueSelectorImpl.java:122)
        at java.base@12.0.1/sun.nio.ch.SelectorImpl.lockAndDoSelect(SelectorImpl.java:124)
        at java.base@12.0.1/sun.nio.ch.SelectorImpl.select(SelectorImpl.java:136)
        at app//io.netty.channel.nio.NioEventLoop.select(NioEventLoop.java:737)
        at app//io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:392)
        at app//io.netty.util.concurrent.SingleThreadEventExecutor$5.run(SingleThreadEventExecutor.java:884)
        at app//io.netty.util.concurrent.FastThreadLocalRunnable.run(FastThreadLocalRunnable.java:30)
        at java.base@12.0.1/java.lang.Thread.run(Thread.java:835)
  17) Thread[id=3342, name=nioEventLoopGroup-3-2, state=RUNNABLE, group=TGRP-HDFSCollectionsAPITest]
        at java.base@12.0.1/sun.nio.ch.KQueue.poll(Native Method)
        at java.base@12.0.1/sun.nio.ch.KQueueSelectorImpl.doSelect(KQueueSelectorImpl.java:122)
        at java.base@12.0.1/sun.nio.ch.SelectorImpl.lockAndDoSelect(SelectorImpl.java:124)
        at java.base@12.0.1/sun.nio.ch.SelectorImpl.select(SelectorImpl.java:136)
        at app//io.netty.channel.nio.NioEventLoop.select(NioEventLoop.java:737)
        at app//io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:392)
        at app//io.netty.util.concurrent.SingleThreadEventExecutor$5.run(SingleThreadEventExecutor.java:884)
        at app//io.netty.util.concurrent.FastThreadLocalRunnable.run(FastThreadLocalRunnable.java:30)
        at java.base@12.0.1/java.lang.Thread.run(Thread.java:835)
  18) Thread[id=3332, name=nioEventLoopGroup-2-4, state=RUNNABLE, group=TGRP-HDFSCollectionsAPITest]
        at java.base@12.0.1/sun.nio.ch.KQueue.poll(Native Method)
        at java.base@12.0.1/sun.nio.ch.KQueueSelectorImpl.doSelect(KQueueSelectorImpl.java:122)
        at java.base@12.0.1/sun.nio.ch.SelectorImpl.lockAndDoSelect(SelectorImpl.java:124)
        at java.base@12.0.1/sun.nio.ch.SelectorImpl.select(SelectorImpl.java:136)
        at app//io.netty.channel.nio.NioEventLoop.select(NioEventLoop.java:737)
        at app//io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:392)
        at app//io.netty.util.concurrent.SingleThreadEventExecutor$5.run(SingleThreadEventExecutor.java:884)
        at app//io.netty.util.concurrent.FastThreadLocalRunnable.run(FastThreadLocalRunnable.java:30)
        at java.base@12.0.1/java.lang.Thread.run(Thread.java:835)
  19) Thread[id=3346, name=nioEventLoopGroup-3-6, state=RUNNABLE, group=TGRP-HDFSCollectionsAPITest]
        at java.base@12.0.1/sun.nio.ch.KQueue.poll(Native Method)
        at java.base@12.0.1/sun.nio.ch.KQueueSelectorImpl.doSelect(KQueueSelectorImpl.java:122)
        at java.base@12.0.1/sun.nio.ch.SelectorImpl.lockAndDoSelect(SelectorImpl.java:124)
        at java.base@12.0.1/sun.nio.ch.SelectorImpl.select(SelectorImpl.java:136)
        at app//io.netty.channel.nio.NioEventLoop.select(NioEventLoop.java:737)
        at app//io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:392)
        at app//io.netty.util.concurrent.SingleThreadEventExecutor$5.run(SingleThreadEventExecutor.java:884)
        at app//io.netty.util.concurrent.FastThreadLocalRunnable.run(FastThreadLocalRunnable.java:30)
        at java.base@12.0.1/java.lang.Thread.run(Thread.java:835)
  20) Thread[id=3340, name=nioEventLoopGroup-2-12, state=RUNNABLE, group=TGRP-HDFSCollectionsAPITest]
        at java.base@12.0.1/sun.nio.ch.KQueue.poll(Native Method)
        at java.base@12.0.1/sun.nio.ch.KQueueSelectorImpl.doSelect(KQueueSelectorImpl.java:122)
        at java.base@12.0.1/sun.nio.ch.SelectorImpl.lockAndDoSelect(SelectorImpl.java:124)
        at java.base@12.0.1/sun.nio.ch.SelectorImpl.select(SelectorImpl.java:136)
        at app//io.netty.channel.nio.NioEventLoop.select(NioEventLoop.java:737)
        at app//io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:392)
        at app//io.netty.util.concurrent.SingleThreadEventExecutor$5.run(SingleThreadEventExecutor.java:884)
        at app//io.netty.util.concurrent.FastThreadLocalRunnable.run(FastThreadLocalRunnable.java:30)
        at java.base@12.0.1/java.lang.Thread.run(Thread.java:835)
	at __randomizedtesting.SeedInfo.seed([9F32918D10A5218B]:0)


FAILED:  org.apache.solr.search.FuzzySearchTest.testTooComplex

Error Message:
Unexpected exception type, expected RemoteSolrException but got org.apache.solr.client.solrj.SolrServerException: No live SolrServers available to handle this request:[http://127.0.0.1:63338/solr/c1]

Stack Trace:
junit.framework.AssertionFailedError: Unexpected exception type, expected RemoteSolrException but got org.apache.solr.client.solrj.SolrServerException: No live SolrServers available to handle this request:[http://127.0.0.1:63338/solr/c1]
	at __randomizedtesting.SeedInfo.seed([9F32918D10A5218B:4990DBA90F7612C1]:0)
	at org.apache.lucene.util.LuceneTestCase.expectThrows(LuceneTestCase.java:2752)
	at org.apache.lucene.util.LuceneTestCase.expectThrows(LuceneTestCase.java:2740)
	at org.apache.solr.search.FuzzySearchTest.testTooComplex(FuzzySearchTest.java:64)
	at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
	at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.base/java.lang.reflect.Method.invoke(Method.java:567)
	at com.carrotsearch.randomizedtesting.RandomizedRunner.invoke(RandomizedRunner.java:1754)
	at com.carrotsearch.randomizedtesting.RandomizedRunner$8.evaluate(RandomizedRunner.java:942)
	at com.carrotsearch.randomizedtesting.RandomizedRunner$9.evaluate(RandomizedRunner.java:978)
	at com.carrotsearch.randomizedtesting.RandomizedRunner$10.evaluate(RandomizedRunner.java:992)
	at com.carrotsearch.randomizedtesting.rules.SystemPropertiesRestoreRule$1.evaluate(SystemPropertiesRestoreRule.java:57)
	at org.apache.lucene.util.TestRuleSetupTeardownChained$1.evaluate(TestRuleSetupTeardownChained.java:49)
	at org.apache.lucene.util.AbstractBeforeAfterRule$1.evaluate(AbstractBeforeAfterRule.java:45)
	at org.apache.lucene.util.TestRuleThreadAndTestName$1.evaluate(TestRuleThreadAndTestName.java:48)
	at org.apache.lucene.util.TestRuleIgnoreAfterMaxFailures$1.evaluate(TestRuleIgnoreAfterMaxFailures.java:64)
	at org.apache.lucene.util.TestRuleMarkFailure$1.evaluate(TestRuleMarkFailure.java:47)
	at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
	at com.carrotsearch.randomizedtesting.ThreadLeakControl$StatementRunner.run(ThreadLeakControl.java:370)
	at com.carrotsearch.randomizedtesting.ThreadLeakControl.forkTimeoutingTask(ThreadLeakControl.java:819)
	at com.carrotsearch.randomizedtesting.ThreadLeakControl$3.evaluate(ThreadLeakControl.java:470)
	at com.carrotsearch.randomizedtesting.RandomizedRunner.runSingleTest(RandomizedRunner.java:951)
	at com.carrotsearch.randomizedtesting.RandomizedRunner$5.evaluate(RandomizedRunner.java:836)
	at com.carrotsearch.randomizedtesting.RandomizedRunner$6.evaluate(RandomizedRunner.java:887)
	at com.carrotsearch.randomizedtesting.RandomizedRunner$7.evaluate(RandomizedRunner.java:898)
	at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
	at com.carrotsearch.randomizedtesting.rules.SystemPropertiesRestoreRule$1.evaluate(SystemPropertiesRestoreRule.java:57)
	at org.apache.lucene.util.AbstractBeforeAfterRule$1.evaluate(AbstractBeforeAfterRule.java:45)
	at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
	at org.apache.lucene.util.TestRuleStoreClassName$1.evaluate(TestRuleStoreClassName.java:41)
	at com.carrotsearch.randomizedtesting.rules.NoShadowingOrOverridesOnMethodsRule$1.evaluate(NoShadowingOrOverridesOnMethodsRule.java:40)
	at com.carrotsearch.randomizedtesting.rules.NoShadowingOrOverridesOnMethodsRule$1.evaluate(NoShadowingOrOverridesOnMethodsRule.java:40)
	at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
	at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
	at org.apache.lucene.util.TestRuleAssertionsRequired$1.evaluate(TestRuleAssertionsRequired.java:53)
	at org.apache.lucene.util.TestRuleMarkFailure$1.evaluate(TestRuleMarkFailure.java:47)
	at org.apache.lucene.util.TestRuleIgnoreAfterMaxFailures$1.evaluate(TestRuleIgnoreAfterMaxFailures.java:64)
	at org.apache.lucene.util.TestRuleIgnoreTestSuites$1.evaluate(TestRuleIgnoreTestSuites.java:54)
	at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
	at com.carrotsearch.randomizedtesting.ThreadLeakControl$StatementRunner.run(ThreadLeakControl.java:370)
	at com.carrotsearch.randomizedtesting.ThreadLeakControl.lambda$forkTimeoutingTask$0(ThreadLeakControl.java:826)
	at java.base/java.lang.Thread.run(Thread.java:835)
Caused by: org.apache.solr.client.solrj.SolrServerException: No live SolrServers available to handle this request:[http://127.0.0.1:63338/solr/c1]
	at org.apache.solr.client.solrj.impl.LBSolrClient.request(LBSolrClient.java:345)
	at org.apache.solr.client.solrj.impl.BaseCloudSolrClient.sendRequest(BaseCloudSolrClient.java:1143)
	at org.apache.solr.client.solrj.impl.BaseCloudSolrClient.requestWithRetryOnStaleState(BaseCloudSolrClient.java:906)
	at org.apache.solr.client.solrj.impl.BaseCloudSolrClient.request(BaseCloudSolrClient.java:838)
	at org.apache.solr.client.solrj.SolrRequest.process(SolrRequest.java:207)
	at org.apache.solr.client.solrj.SolrClient.query(SolrClient.java:1003)
	at org.apache.solr.client.solrj.SolrClient.query(SolrClient.java:1018)
	at org.apache.solr.search.FuzzySearchTest.lambda$testTooComplex$0(FuzzySearchTest.java:64)
	at org.apache.lucene.util.LuceneTestCase._expectThrows(LuceneTestCase.java:2870)
	at org.apache.lucene.util.LuceneTestCase.expectThrows(LuceneTestCase.java:2745)
	... 41 more
Caused by: org.apache.solr.client.solrj.impl.HttpSolrClient$RemoteSolrException: Error from server at http://127.0.0.1:63338/solr/c1: Term too complex: headquarters(在日米海軍横須賀基地司令部庁舎/旧横須賀鎮守府会議所・横須賀海軍艦船部)
	at org.apache.solr.client.solrj.impl.HttpSolrClient.executeMethod(HttpSolrClient.java:665)
	at org.apache.solr.client.solrj.impl.HttpSolrClient.request(HttpSolrClient.java:265)
	at org.apache.solr.client.solrj.impl.HttpSolrClient.request(HttpSolrClient.java:248)
	at org.apache.solr.client.solrj.impl.LBSolrClient.doRequest(LBSolrClient.java:368)
	at org.apache.solr.client.solrj.impl.LBSolrClient.request(LBSolrClient.java:296)
	... 50 more
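
The assertion at FuzzySearchTest.java:64 expects an HttpSolrClient.RemoteSolrException, but in this run the load-balanced cloud client surfaced the remote "Term too complex" error wrapped in a SolrServerException ("No live SolrServers available..."), so expectThrows reports an unexpected exception type. Below is a minimal sketch of an assertion that tolerates that wrapping; the class name, collection handling, and query string are illustrative placeholders, not the actual test code.

    import org.apache.solr.client.solrj.SolrQuery;
    import org.apache.solr.client.solrj.SolrServerException;
    import org.apache.solr.client.solrj.impl.CloudSolrClient;
    import org.apache.solr.client.solrj.impl.HttpSolrClient;

    // Hypothetical helper, not the real test: assert that a query is rejected remotely,
    // whether the RemoteSolrException arrives directly or wrapped by the LB/cloud client.
    public class FuzzyTermRejectionCheck {
      static void assertRejectedRemotely(CloudSolrClient client, String collection) throws Exception {
        try {
          client.query(collection, new SolrQuery("name:verylongfuzzyterm~2"));  // placeholder query
          throw new AssertionError("expected the server to reject the query");
        } catch (HttpSolrClient.RemoteSolrException expected) {
          // the remote error surfaced directly: this is what the test currently expects
        } catch (SolrServerException wrapped) {
          // walk the cause chain; the remote error may be hidden behind the wrapper
          Throwable t = wrapped.getCause();
          while (t != null && !(t instanceof HttpSolrClient.RemoteSolrException)) {
            t = t.getCause();
          }
          if (t == null) {
            throw wrapped;  // no remote error anywhere in the chain: rethrow
          }
        }
      }
    }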




Build Log:
[...truncated 13388 lines...]
   [junit4] Suite: org.apache.solr.cloud.hdfs.HDFSCollectionsAPITest
   [junit4]   2> 227421 INFO  (SUITE-HDFSCollectionsAPITest-seed#[9F32918D10A5218B]-worker) [     ] o.a.s.SolrTestCaseJ4 Created dataDir: /Users/jenkins/workspace/Lucene-Solr-master-MacOSX/solr/build/solr-core/test/J3/temp/solr.cloud.hdfs.HDFSCollectionsAPITest_9F32918D10A5218B-001/data-dir-11-001
   [junit4]   2> 227422 WARN  (SUITE-HDFSCollectionsAPITest-seed#[9F32918D10A5218B]-worker) [     ] o.a.s.SolrTestCaseJ4 startTrackingSearchers: numOpens=1 numCloses=1
   [junit4]   2> 227422 INFO  (SUITE-HDFSCollectionsAPITest-seed#[9F32918D10A5218B]-worker) [     ] o.a.s.SolrTestCaseJ4 Using TrieFields (NUMERIC_POINTS_SYSPROP=false) w/NUMERIC_DOCVALUES_SYSPROP=true
   [junit4]   2> 227423 INFO  (SUITE-HDFSCollectionsAPITest-seed#[9F32918D10A5218B]-worker) [     ] o.a.s.SolrTestCaseJ4 Randomized ssl (false) and clientAuth (false) via: @org.apache.solr.util.RandomizeSSL(reason="", value=0.0/0.0, ssl=0.0/0.0, clientAuth=0.0/0.0) w/ MAC_OS_X supressed clientAuth
   [junit4]   2> 227423 INFO  (SUITE-HDFSCollectionsAPITest-seed#[9F32918D10A5218B]-worker) [     ] o.a.s.SolrTestCaseJ4 SecureRandom sanity checks: test.solr.allowed.securerandom=null & java.security.egd=file:/dev/./urandom
   [junit4]   2> 227423 INFO  (SUITE-HDFSCollectionsAPITest-seed#[9F32918D10A5218B]-worker) [     ] o.a.s.c.MiniSolrCloudCluster Starting cluster of 2 servers in /Users/jenkins/workspace/Lucene-Solr-master-MacOSX/solr/build/solr-core/test/J3/temp/solr.cloud.hdfs.HDFSCollectionsAPITest_9F32918D10A5218B-001/tempDir-001
   [junit4]   2> 227424 INFO  (SUITE-HDFSCollectionsAPITest-seed#[9F32918D10A5218B]-worker) [     ] o.a.s.c.ZkTestServer STARTING ZK TEST SERVER
   [junit4]   2> 227424 INFO  (ZkTestServer Run Thread) [     ] o.a.s.c.ZkTestServer client port:0.0.0.0/0.0.0.0:0
   [junit4]   2> 227425 INFO  (ZkTestServer Run Thread) [     ] o.a.s.c.ZkTestServer Starting server
   [junit4]   2> 227525 INFO  (SUITE-HDFSCollectionsAPITest-seed#[9F32918D10A5218B]-worker) [     ] o.a.s.c.ZkTestServer start zk server on port:61225
   [junit4]   2> 227525 INFO  (SUITE-HDFSCollectionsAPITest-seed#[9F32918D10A5218B]-worker) [     ] o.a.s.c.ZkTestServer waitForServerUp: 127.0.0.1:61225
   [junit4]   2> 227525 INFO  (SUITE-HDFSCollectionsAPITest-seed#[9F32918D10A5218B]-worker) [     ] o.a.s.c.ZkTestServer parse host and port list: 127.0.0.1:61225
   [junit4]   2> 227525 INFO  (SUITE-HDFSCollectionsAPITest-seed#[9F32918D10A5218B]-worker) [     ] o.a.s.c.ZkTestServer connecting to 127.0.0.1 61225
   [junit4]   2> 227528 INFO  (SUITE-HDFSCollectionsAPITest-seed#[9F32918D10A5218B]-worker) [     ] o.a.s.c.c.ConnectionManager Waiting for client to connect to ZooKeeper
   [junit4]   2> 227538 INFO  (zkConnectionManagerCallback-991-thread-1) [     ] o.a.s.c.c.ConnectionManager zkClient has connected
   [junit4]   2> 227538 INFO  (SUITE-HDFSCollectionsAPITest-seed#[9F32918D10A5218B]-worker) [     ] o.a.s.c.c.ConnectionManager Client is connected to ZooKeeper
   [junit4]   2> 227547 INFO  (SUITE-HDFSCollectionsAPITest-seed#[9F32918D10A5218B]-worker) [     ] o.a.s.c.c.ConnectionManager Waiting for client to connect to ZooKeeper
   [junit4]   2> 227560 INFO  (zkConnectionManagerCallback-993-thread-1) [     ] o.a.s.c.c.ConnectionManager zkClient has connected
   [junit4]   2> 227560 INFO  (SUITE-HDFSCollectionsAPITest-seed#[9F32918D10A5218B]-worker) [     ] o.a.s.c.c.ConnectionManager Client is connected to ZooKeeper
   [junit4]   2> 227563 INFO  (SUITE-HDFSCollectionsAPITest-seed#[9F32918D10A5218B]-worker) [     ] o.a.s.c.c.ConnectionManager Waiting for client to connect to ZooKeeper
   [junit4]   2> 227578 INFO  (zkConnectionManagerCallback-995-thread-1) [     ] o.a.s.c.c.ConnectionManager zkClient has connected
   [junit4]   2> 227578 INFO  (SUITE-HDFSCollectionsAPITest-seed#[9F32918D10A5218B]-worker) [     ] o.a.s.c.c.ConnectionManager Client is connected to ZooKeeper
   [junit4]   2> 227701 WARN  (jetty-launcher-996-thread-1) [     ] o.e.j.s.h.g.GzipHandler minGzipSize of 0 is inefficient for short content, break even is size 23
   [junit4]   2> 227701 INFO  (jetty-launcher-996-thread-1) [     ] o.a.s.c.s.e.JettySolrRunner Start Jetty (configured port=0, binding port=0)
   [junit4]   2> 227701 INFO  (jetty-launcher-996-thread-1) [     ] o.a.s.c.s.e.JettySolrRunner Trying to start Jetty on port 0 try number 1 ...
   [junit4]   2> 227701 INFO  (jetty-launcher-996-thread-1) [     ] o.e.j.s.Server jetty-9.4.24.v20191120; built: 2019-11-20T21:37:49.771Z; git: 363d5f2df3a8a28de40604320230664b9c793c16; jvm 12.0.1+12
   [junit4]   2> 227702 WARN  (jetty-launcher-996-thread-2) [     ] o.e.j.s.h.g.GzipHandler minGzipSize of 0 is inefficient for short content, break even is size 23
   [junit4]   2> 227702 INFO  (jetty-launcher-996-thread-2) [     ] o.a.s.c.s.e.JettySolrRunner Start Jetty (configured port=0, binding port=0)
   [junit4]   2> 227702 INFO  (jetty-launcher-996-thread-2) [     ] o.a.s.c.s.e.JettySolrRunner Trying to start Jetty on port 0 try number 1 ...
   [junit4]   2> 227702 INFO  (jetty-launcher-996-thread-2) [     ] o.e.j.s.Server jetty-9.4.24.v20191120; built: 2019-11-20T21:37:49.771Z; git: 363d5f2df3a8a28de40604320230664b9c793c16; jvm 12.0.1+12
   [junit4]   2> 227707 INFO  (jetty-launcher-996-thread-2) [     ] o.e.j.s.session DefaultSessionIdManager workerName=node0
   [junit4]   2> 227707 INFO  (jetty-launcher-996-thread-2) [     ] o.e.j.s.session No SessionScavenger set, using defaults
   [junit4]   2> 227707 INFO  (jetty-launcher-996-thread-2) [     ] o.e.j.s.session node0 Scavenging every 600000ms
   [junit4]   2> 227707 INFO  (jetty-launcher-996-thread-2) [     ] o.e.j.s.h.ContextHandler Started o.e.j.s.ServletContextHandler@43eebf8b{/solr,null,AVAILABLE}
   [junit4]   2> 227712 INFO  (jetty-launcher-996-thread-1) [     ] o.e.j.s.session DefaultSessionIdManager workerName=node0
   [junit4]   2> 227712 INFO  (jetty-launcher-996-thread-1) [     ] o.e.j.s.session No SessionScavenger set, using defaults
   [junit4]   2> 227712 INFO  (jetty-launcher-996-thread-1) [     ] o.e.j.s.session node0 Scavenging every 660000ms
   [junit4]   2> 227714 INFO  (jetty-launcher-996-thread-2) [     ] o.e.j.s.AbstractConnector Started ServerConnector@7f63a639{HTTP/1.1,[http/1.1, h2c]}{127.0.0.1:61230}
   [junit4]   2> 227714 INFO  (jetty-launcher-996-thread-2) [     ] o.e.j.s.Server Started @227818ms
   [junit4]   2> 227714 INFO  (jetty-launcher-996-thread-2) [     ] o.a.s.c.s.e.JettySolrRunner Jetty properties: {hostContext=/solr, hostPort=61230}
   [junit4]   2> 227715 ERROR (jetty-launcher-996-thread-2) [     ] o.a.s.u.StartupLoggingUtils Missing Java Option solr.log.dir. Logging may be missing or incomplete.
   [junit4]   2> 227715 INFO  (jetty-launcher-996-thread-2) [     ] o.a.s.s.SolrDispatchFilter Using logger factory org.apache.logging.slf4j.Log4jLoggerFactory
   [junit4]   2> 227715 INFO  (jetty-launcher-996-thread-2) [     ] o.a.s.s.SolrDispatchFilter  ___      _       Welcome to Apache Solr™ version 9.0.0
   [junit4]   2> 227715 INFO  (jetty-launcher-996-thread-2) [     ] o.a.s.s.SolrDispatchFilter / __| ___| |_ _   Starting in cloud mode on port null
   [junit4]   2> 227715 INFO  (jetty-launcher-996-thread-2) [     ] o.a.s.s.SolrDispatchFilter \__ \/ _ \ | '_|  Install dir: null
   [junit4]   2> 227716 INFO  (jetty-launcher-996-thread-2) [     ] o.a.s.s.SolrDispatchFilter |___/\___/_|_|    Start time: 2020-01-16T06:58:30.727990Z
   [junit4]   2> 227722 INFO  (jetty-launcher-996-thread-1) [     ] o.e.j.s.h.ContextHandler Started o.e.j.s.ServletContextHandler@63e247bc{/solr,null,AVAILABLE}
   [junit4]   2> 227723 INFO  (jetty-launcher-996-thread-2) [     ] o.a.s.c.c.ConnectionManager Waiting for client to connect to ZooKeeper
   [junit4]   2> 227728 INFO  (jetty-launcher-996-thread-1) [     ] o.e.j.s.AbstractConnector Started ServerConnector@53839906{HTTP/1.1,[http/1.1, h2c]}{127.0.0.1:61231}
   [junit4]   2> 227728 INFO  (jetty-launcher-996-thread-1) [     ] o.e.j.s.Server Started @227832ms
   [junit4]   2> 227728 INFO  (jetty-launcher-996-thread-1) [     ] o.a.s.c.s.e.JettySolrRunner Jetty properties: {hostContext=/solr, hostPort=61231}
   [junit4]   2> 227730 ERROR (jetty-launcher-996-thread-1) [     ] o.a.s.u.StartupLoggingUtils Missing Java Option solr.log.dir. Logging may be missing or incomplete.
   [junit4]   2> 227730 INFO  (jetty-launcher-996-thread-1) [     ] o.a.s.s.SolrDispatchFilter Using logger factory org.apache.logging.slf4j.Log4jLoggerFactory
   [junit4]   2> 227730 INFO  (jetty-launcher-996-thread-1) [     ] o.a.s.s.SolrDispatchFilter  ___      _       Welcome to Apache Solr™ version 9.0.0
   [junit4]   2> 227731 INFO  (jetty-launcher-996-thread-1) [     ] o.a.s.s.SolrDispatchFilter / __| ___| |_ _   Starting in cloud mode on port null
   [junit4]   2> 227731 INFO  (jetty-launcher-996-thread-1) [     ] o.a.s.s.SolrDispatchFilter \__ \/ _ \ | '_|  Install dir: null
   [junit4]   2> 227731 INFO  (jetty-launcher-996-thread-1) [     ] o.a.s.s.SolrDispatchFilter |___/\___/_|_|    Start time: 2020-01-16T06:58:30.743144Z
   [junit4]   2> 227734 INFO  (zkConnectionManagerCallback-998-thread-1) [     ] o.a.s.c.c.ConnectionManager zkClient has connected
   [junit4]   2> 227734 INFO  (jetty-launcher-996-thread-2) [     ] o.a.s.c.c.ConnectionManager Client is connected to ZooKeeper
   [junit4]   2> 227736 INFO  (jetty-launcher-996-thread-2) [     ] o.a.s.s.SolrDispatchFilter solr.xml found in ZooKeeper. Loading...
   [junit4]   2> 227737 INFO  (jetty-launcher-996-thread-1) [     ] o.a.s.c.c.ConnectionManager Waiting for client to connect to ZooKeeper
   [junit4]   2> 227759 INFO  (zkConnectionManagerCallback-1000-thread-1) [     ] o.a.s.c.c.ConnectionManager zkClient has connected
   [junit4]   2> 227759 INFO  (jetty-launcher-996-thread-1) [     ] o.a.s.c.c.ConnectionManager Client is connected to ZooKeeper
   [junit4]   2> 227864 INFO  (jetty-launcher-996-thread-1) [     ] o.a.s.s.SolrDispatchFilter solr.xml found in ZooKeeper. Loading...
   [junit4]   2> 228008 INFO  (jetty-launcher-996-thread-2) [     ] o.a.s.h.c.HttpShardHandlerFactory Host whitelist initialized: WhitelistHostChecker [whitelistHosts=null, whitelistHostCheckingEnabled=true]
   [junit4]   2> 228014 WARN  (jetty-launcher-996-thread-2) [     ] o.e.j.u.s.S.config Trusting all certificates configured for Client@51a55594[provider=null,keyStore=null,trustStore=null]
   [junit4]   2> 228014 WARN  (jetty-launcher-996-thread-2) [     ] o.e.j.u.s.S.config No Client EndPointIdentificationAlgorithm configured for Client@51a55594[provider=null,keyStore=null,trustStore=null]
   [junit4]   2> 228025 WARN  (jetty-launcher-996-thread-2) [     ] o.e.j.u.s.S.config Trusting all certificates configured for Client@7fe24994[provider=null,keyStore=null,trustStore=null]
   [junit4]   2> 228025 WARN  (jetty-launcher-996-thread-2) [     ] o.e.j.u.s.S.config No Client EndPointIdentificationAlgorithm configured for Client@7fe24994[provider=null,keyStore=null,trustStore=null]
   [junit4]   2> 228029 INFO  (jetty-launcher-996-thread-2) [     ] o.a.s.c.ZkContainer Zookeeper client=127.0.0.1:61225/solr
   [junit4]   2> 228031 INFO  (jetty-launcher-996-thread-2) [     ] o.a.s.c.c.ConnectionManager Waiting for client to connect to ZooKeeper
   [junit4]   2> 228035 INFO  (zkConnectionManagerCallback-1008-thread-1) [     ] o.a.s.c.c.ConnectionManager zkClient has connected
   [junit4]   2> 228035 INFO  (jetty-launcher-996-thread-2) [     ] o.a.s.c.c.ConnectionManager Client is connected to ZooKeeper
   [junit4]   2> 228143 INFO  (jetty-launcher-996-thread-2) [n:127.0.0.1:61230_solr     ] o.a.s.c.c.ConnectionManager Waiting for client to connect to ZooKeeper
   [junit4]   2> 228148 INFO  (zkConnectionManagerCallback-1010-thread-1) [     ] o.a.s.c.c.ConnectionManager zkClient has connected
   [junit4]   2> 228148 INFO  (jetty-launcher-996-thread-2) [n:127.0.0.1:61230_solr     ] o.a.s.c.c.ConnectionManager Client is connected to ZooKeeper
   [junit4]   2> 228316 INFO  (jetty-launcher-996-thread-1) [     ] o.a.s.h.c.HttpShardHandlerFactory Host whitelist initialized: WhitelistHostChecker [whitelistHosts=null, whitelistHostCheckingEnabled=true]
   [junit4]   2> 228327 WARN  (jetty-launcher-996-thread-1) [     ] o.e.j.u.s.S.config Trusting all certificates configured for Client@1414bb42[provider=null,keyStore=null,trustStore=null]
   [junit4]   2> 228328 WARN  (jetty-launcher-996-thread-1) [     ] o.e.j.u.s.S.config No Client EndPointIdentificationAlgorithm configured for Client@1414bb42[provider=null,keyStore=null,trustStore=null]
   [junit4]   2> 228347 WARN  (jetty-launcher-996-thread-1) [     ] o.e.j.u.s.S.config Trusting all certificates configured for Client@2a9e6665[provider=null,keyStore=null,trustStore=null]
   [junit4]   2> 228348 WARN  (jetty-launcher-996-thread-1) [     ] o.e.j.u.s.S.config No Client EndPointIdentificationAlgorithm configured for Client@2a9e6665[provider=null,keyStore=null,trustStore=null]
   [junit4]   2> 228350 INFO  (jetty-launcher-996-thread-1) [     ] o.a.s.c.ZkContainer Zookeeper client=127.0.0.1:61225/solr
   [junit4]   2> 228352 INFO  (jetty-launcher-996-thread-1) [     ] o.a.s.c.c.ConnectionManager Waiting for client to connect to ZooKeeper
   [junit4]   2> 228362 INFO  (zkConnectionManagerCallback-1018-thread-1) [     ] o.a.s.c.c.ConnectionManager zkClient has connected
   [junit4]   2> 228363 INFO  (jetty-launcher-996-thread-1) [     ] o.a.s.c.c.ConnectionManager Client is connected to ZooKeeper
   [junit4]   2> 228479 INFO  (jetty-launcher-996-thread-1) [n:127.0.0.1:61231_solr     ] o.a.s.c.c.ConnectionManager Waiting for client to connect to ZooKeeper
   [junit4]   2> 228493 INFO  (zkConnectionManagerCallback-1020-thread-1) [     ] o.a.s.c.c.ConnectionManager zkClient has connected
   [junit4]   2> 228493 INFO  (jetty-launcher-996-thread-1) [n:127.0.0.1:61231_solr     ] o.a.s.c.c.ConnectionManager Client is connected to ZooKeeper
   [junit4]   2> 228599 INFO  (jetty-launcher-996-thread-1) [n:127.0.0.1:61231_solr     ] o.a.s.c.OverseerElectionContext I am going to be the leader 127.0.0.1:61231_solr
   [junit4]   2> 228605 INFO  (jetty-launcher-996-thread-1) [n:127.0.0.1:61231_solr     ] o.a.s.c.Overseer Overseer (id=72063526270992392-127.0.0.1:61231_solr-n_0000000000) starting
   [junit4]   2> 228652 INFO  (OverseerStateUpdate-72063526270992392-127.0.0.1:61231_solr-n_0000000000) [n:127.0.0.1:61231_solr     ] o.a.s.c.Overseer Starting to work on the main queue : 127.0.0.1:61231_solr
   [junit4]   2> 228659 INFO  (jetty-launcher-996-thread-1) [n:127.0.0.1:61231_solr     ] o.a.s.c.ZkController Register node as live in ZooKeeper:/live_nodes/127.0.0.1:61231_solr
   [junit4]   2> 228670 INFO  (zkCallback-1019-thread-1) [     ] o.a.s.c.c.ZkStateReader Updated live nodes from ZooKeeper... (0) -> (1)
   [junit4]   2> 228686 INFO  (jetty-launcher-996-thread-1) [n:127.0.0.1:61231_solr     ] o.a.s.p.PackageLoader /packages.json updated to version -1
   [junit4]   2> 228687 WARN  (jetty-launcher-996-thread-1) [n:127.0.0.1:61231_solr     ] o.a.s.c.CoreContainer Not all security plugins configured!  authentication=disabled authorization=disabled.  Solr is only as secure as you make it. Consider configuring authentication/authorization before exposing Solr to users internal or external.  See https://s.apache.org/solrsecurity for more info
   [junit4]   2> 228738 INFO  (jetty-launcher-996-thread-1) [n:127.0.0.1:61231_solr     ] o.a.s.h.a.MetricsHistoryHandler No .system collection, keeping metrics history in memory.
   [junit4]   2> 228821 INFO  (jetty-launcher-996-thread-1) [n:127.0.0.1:61231_solr     ] o.a.s.m.r.SolrJmxReporter JMX monitoring for 'solr_61231.solr.node' (registry 'solr.node') enabled at server: com.sun.jmx.mbeanserver.JmxMBeanServer@6e10eaa1
   [junit4]   2> 228887 INFO  (jetty-launcher-996-thread-1) [n:127.0.0.1:61231_solr     ] o.a.s.m.r.SolrJmxReporter JMX monitoring for 'solr_61231.solr.jvm' (registry 'solr.jvm') enabled at server: com.sun.jmx.mbeanserver.JmxMBeanServer@6e10eaa1
   [junit4]   2> 228887 INFO  (jetty-launcher-996-thread-1) [n:127.0.0.1:61231_solr     ] o.a.s.m.r.SolrJmxReporter JMX monitoring for 'solr_61231.solr.jetty' (registry 'solr.jetty') enabled at server: com.sun.jmx.mbeanserver.JmxMBeanServer@6e10eaa1
   [junit4]   2> 228893 INFO  (jetty-launcher-996-thread-1) [n:127.0.0.1:61231_solr     ] o.a.s.c.CorePropertiesLocator Found 0 core definitions underneath /Users/jenkins/workspace/Lucene-Solr-master-MacOSX/solr/build/solr-core/test/J3/temp/solr.cloud.hdfs.HDFSCollectionsAPITest_9F32918D10A5218B-001/tempDir-001/node1/.
   [junit4]   2> 228993 INFO  (jetty-launcher-996-thread-2) [n:127.0.0.1:61230_solr     ] o.a.s.c.c.ZkStateReader Updated live nodes from ZooKeeper... (0) -> (1)
   [junit4]   2> 229042 INFO  (jetty-launcher-996-thread-2) [n:127.0.0.1:61230_solr     ] o.a.s.c.ZkController Publish node=127.0.0.1:61230_solr as DOWN
   [junit4]   2> 229048 INFO  (jetty-launcher-996-thread-2) [n:127.0.0.1:61230_solr     ] o.a.s.c.TransientSolrCoreCacheDefault Allocating transient cache for 2147483647 transient cores
   [junit4]   2> 229048 INFO  (jetty-launcher-996-thread-2) [n:127.0.0.1:61230_solr     ] o.a.s.c.ZkController Register node as live in ZooKeeper:/live_nodes/127.0.0.1:61230_solr
   [junit4]   2> 229058 INFO  (zkCallback-1019-thread-1) [     ] o.a.s.c.c.ZkStateReader Updated live nodes from ZooKeeper... (1) -> (2)
   [junit4]   2> 229059 INFO  (zkCallback-1009-thread-1) [     ] o.a.s.c.c.ZkStateReader Updated live nodes from ZooKeeper... (1) -> (2)
   [junit4]   2> 229070 INFO  (jetty-launcher-996-thread-2) [n:127.0.0.1:61230_solr     ] o.a.s.p.PackageLoader /packages.json updated to version -1
   [junit4]   2> 229071 WARN  (jetty-launcher-996-thread-2) [n:127.0.0.1:61230_solr     ] o.a.s.c.CoreContainer Not all security plugins configured!  authentication=disabled authorization=disabled.  Solr is only as secure as you make it. Consider configuring authentication/authorization before exposing Solr to users internal or external.  See https://s.apache.org/solrsecurity for more info
   [junit4]   2> 229104 INFO  (jetty-launcher-996-thread-2) [n:127.0.0.1:61230_solr     ] o.a.s.h.a.MetricsHistoryHandler No .system collection, keeping metrics history in memory.
   [junit4]   2> 229169 INFO  (jetty-launcher-996-thread-2) [n:127.0.0.1:61230_solr     ] o.a.s.m.r.SolrJmxReporter JMX monitoring for 'solr_61230.solr.node' (registry 'solr.node') enabled at server: com.sun.jmx.mbeanserver.JmxMBeanServer@6e10eaa1
   [junit4]   2> 229210 INFO  (jetty-launcher-996-thread-2) [n:127.0.0.1:61230_solr     ] o.a.s.m.r.SolrJmxReporter JMX monitoring for 'solr_61230.solr.jvm' (registry 'solr.jvm') enabled at server: com.sun.jmx.mbeanserver.JmxMBeanServer@6e10eaa1
   [junit4]   2> 229211 INFO  (jetty-launcher-996-thread-2) [n:127.0.0.1:61230_solr     ] o.a.s.m.r.SolrJmxReporter JMX monitoring for 'solr_61230.solr.jetty' (registry 'solr.jetty') enabled at server: com.sun.jmx.mbeanserver.JmxMBeanServer@6e10eaa1
   [junit4]   2> 229212 INFO  (jetty-launcher-996-thread-2) [n:127.0.0.1:61230_solr     ] o.a.s.c.CorePropertiesLocator Found 0 core definitions underneath /Users/jenkins/workspace/Lucene-Solr-master-MacOSX/solr/build/solr-core/test/J3/temp/solr.cloud.hdfs.HDFSCollectionsAPITest_9F32918D10A5218B-001/tempDir-001/node2/.
   [junit4]   2> 229293 INFO  (SUITE-HDFSCollectionsAPITest-seed#[9F32918D10A5218B]-worker) [     ] o.a.s.c.MiniSolrCloudCluster waitForAllNodes: numServers=2
   [junit4]   2> 229296 INFO  (SUITE-HDFSCollectionsAPITest-seed#[9F32918D10A5218B]-worker) [     ] o.a.s.c.c.ConnectionManager Waiting for client to connect to ZooKeeper
   [junit4]   2> 229304 INFO  (zkConnectionManagerCallback-1031-thread-1) [     ] o.a.s.c.c.ConnectionManager zkClient has connected
   [junit4]   2> 229304 INFO  (SUITE-HDFSCollectionsAPITest-seed#[9F32918D10A5218B]-worker) [     ] o.a.s.c.c.ConnectionManager Client is connected to ZooKeeper
   [junit4]   2> 229314 INFO  (SUITE-HDFSCollectionsAPITest-seed#[9F32918D10A5218B]-worker) [     ] o.a.s.c.c.ZkStateReader Updated live nodes from ZooKeeper... (0) -> (2)
   [junit4]   2> 229325 INFO  (SUITE-HDFSCollectionsAPITest-seed#[9F32918D10A5218B]-worker) [     ] o.a.s.c.s.i.ZkClientClusterStateProvider Cluster at 127.0.0.1:61225/solr ready
   [junit4]   1> Formatting using clusterid: testClusterID
   [junit4]   2> 232476 WARN  (SUITE-HDFSCollectionsAPITest-seed#[9F32918D10A5218B]-worker) [     ] o.a.h.h.HttpRequestLog Jetty request log can only be enabled using Log4j
   [junit4]   2> 232546 INFO  (SUITE-HDFSCollectionsAPITest-seed#[9F32918D10A5218B]-worker) [     ] o.e.j.s.Server jetty-9.4.24.v20191120; built: 2019-11-20T21:37:49.771Z; git: 363d5f2df3a8a28de40604320230664b9c793c16; jvm 12.0.1+12
   [junit4]   2> 232558 INFO  (SUITE-HDFSCollectionsAPITest-seed#[9F32918D10A5218B]-worker) [     ] o.e.j.s.session DefaultSessionIdManager workerName=node0
   [junit4]   2> 232558 INFO  (SUITE-HDFSCollectionsAPITest-seed#[9F32918D10A5218B]-worker) [     ] o.e.j.s.session No SessionScavenger set, using defaults
   [junit4]   2> 232558 INFO  (SUITE-HDFSCollectionsAPITest-seed#[9F32918D10A5218B]-worker) [     ] o.e.j.s.session node0 Scavenging every 660000ms
   [junit4]   2> 232580 INFO  (SUITE-HDFSCollectionsAPITest-seed#[9F32918D10A5218B]-worker) [     ] o.e.j.s.h.ContextHandler Started o.e.j.s.ServletContextHandler@6eafe9cd{static,/static,jar:file:/Users/jenkins/.ivy2/cache/org.apache.hadoop/hadoop-hdfs/tests/hadoop-hdfs-3.2.0-tests.jar!/webapps/static,AVAILABLE}
   [junit4]   2> 233065 INFO  (SUITE-HDFSCollectionsAPITest-seed#[9F32918D10A5218B]-worker) [     ] o.e.j.s.h.ContextHandler Started o.e.j.w.WebAppContext@7b9acf91{hdfs,/,file:///Users/jenkins/workspace/Lucene-Solr-master-MacOSX/solr/build/solr-core/test/J3/temp/jetty-localhost-61252-hadoop-hdfs-3_2_0-tests_jar-_-any-1579350662329924737.dir/webapp/,AVAILABLE}{jar:file:/Users/jenkins/.ivy2/cache/org.apache.hadoop/hadoop-hdfs/tests/hadoop-hdfs-3.2.0-tests.jar!/webapps/hdfs}
   [junit4]   2> 233067 INFO  (SUITE-HDFSCollectionsAPITest-seed#[9F32918D10A5218B]-worker) [     ] o.e.j.s.AbstractConnector Started ServerConnector@35e1e6ab{HTTP/1.1,[http/1.1]}{localhost:61252}
   [junit4]   2> 233067 INFO  (SUITE-HDFSCollectionsAPITest-seed#[9F32918D10A5218B]-worker) [     ] o.e.j.s.Server Started @233171ms
   [junit4]   2> 235485 WARN  (StorageLocationChecker thread 0) [     ] o.a.h.u.NativeCodeLoader Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
   [junit4]   2> 235819 WARN  (SUITE-HDFSCollectionsAPITest-seed#[9F32918D10A5218B]-worker) [     ] o.a.h.h.HttpRequestLog Jetty request log can only be enabled using Log4j
   [junit4]   2> 235845 INFO  (SUITE-HDFSCollectionsAPITest-seed#[9F32918D10A5218B]-worker) [     ] o.e.j.s.Server jetty-9.4.24.v20191120; built: 2019-11-20T21:37:49.771Z; git: 363d5f2df3a8a28de40604320230664b9c793c16; jvm 12.0.1+12
   [junit4]   2> 235863 INFO  (SUITE-HDFSCollectionsAPITest-seed#[9F32918D10A5218B]-worker) [     ] o.e.j.s.session DefaultSessionIdManager workerName=node0
   [junit4]   2> 235865 INFO  (SUITE-HDFSCollectionsAPITest-seed#[9F32918D10A5218B]-worker) [     ] o.e.j.s.session No SessionScavenger set, using defaults
   [junit4]   2> 235865 INFO  (SUITE-HDFSCollectionsAPITest-seed#[9F32918D10A5218B]-worker) [     ] o.e.j.s.session node0 Scavenging every 600000ms
   [junit4]   2> 235867 INFO  (SUITE-HDFSCollectionsAPITest-seed#[9F32918D10A5218B]-worker) [     ] o.e.j.s.h.ContextHandler Started o.e.j.s.ServletContextHandler@244fcb38{static,/static,jar:file:/Users/jenkins/.ivy2/cache/org.apache.hadoop/hadoop-hdfs/tests/hadoop-hdfs-3.2.0-tests.jar!/webapps/static,AVAILABLE}
   [junit4]   2> 236226 INFO  (SUITE-HDFSCollectionsAPITest-seed#[9F32918D10A5218B]-worker) [     ] o.e.j.s.h.ContextHandler Started o.e.j.w.WebAppContext@6e648666{datanode,/,file:///Users/jenkins/workspace/Lucene-Solr-master-MacOSX/solr/build/solr-core/test/J3/temp/jetty-localhost-61278-hadoop-hdfs-3_2_0-tests_jar-_-any-9963638945085216981.dir/webapp/,AVAILABLE}{jar:file:/Users/jenkins/.ivy2/cache/org.apache.hadoop/hadoop-hdfs/tests/hadoop-hdfs-3.2.0-tests.jar!/webapps/datanode}
   [junit4]   2> 236231 INFO  (SUITE-HDFSCollectionsAPITest-seed#[9F32918D10A5218B]-worker) [     ] o.e.j.s.AbstractConnector Started ServerConnector@191e3eb7{HTTP/1.1,[http/1.1]}{localhost:61278}
   [junit4]   2> 236231 INFO  (SUITE-HDFSCollectionsAPITest-seed#[9F32918D10A5218B]-worker) [     ] o.e.j.s.Server Started @236335ms
   [junit4]   2> 241282 INFO  (Block report processor) [     ] BlockStateChange BLOCK* processReport 0x5282bae8635312e2: Processing first storage report for DS-6bbc2b75-9741-4095-a9ed-6de737e81c8b from datanode a963ffdc-3a36-4e13-88e1-3e919a88fca1
   [junit4]   2> 241287 INFO  (Block report processor) [     ] BlockStateChange BLOCK* processReport 0x5282bae8635312e2: from storage DS-6bbc2b75-9741-4095-a9ed-6de737e81c8b node DatanodeRegistration(127.0.0.1:61276, datanodeUuid=a963ffdc-3a36-4e13-88e1-3e919a88fca1, infoPort=61291, infoSecurePort=0, ipcPort=61294, storageInfo=lv=-57;cid=testClusterID;nsid=36111219;c=1579157914702), blocks: 0, hasStaleStorage: true, processing time: 4 msecs, invalidatedBlocks: 0
   [junit4]   2> 241288 INFO  (Block report processor) [     ] BlockStateChange BLOCK* processReport 0x5282bae8635312e2: Processing first storage report for DS-d843d8ab-ea1c-442a-8d1c-7dff92b1dfad from datanode a963ffdc-3a36-4e13-88e1-3e919a88fca1
   [junit4]   2> 241288 INFO  (Block report processor) [     ] BlockStateChange BLOCK* processReport 0x5282bae8635312e2: from storage DS-d843d8ab-ea1c-442a-8d1c-7dff92b1dfad node DatanodeRegistration(127.0.0.1:61276, datanodeUuid=a963ffdc-3a36-4e13-88e1-3e919a88fca1, infoPort=61291, infoSecurePort=0, ipcPort=61294, storageInfo=lv=-57;cid=testClusterID;nsid=36111219;c=1579157914702), blocks: 0, hasStaleStorage: false, processing time: 0 msecs, invalidatedBlocks: 0
   [junit4]   2> 241696 INFO  (TEST-HDFSCollectionsAPITest.testDataDirIsNotReused-seed#[9F32918D10A5218B]) [     ] o.a.s.SolrTestCaseJ4 ###Starting testDataDirIsNotReused
   [junit4]   2> 241702 INFO  (qtp1564765406-2926) [n:127.0.0.1:61231_solr     ] o.a.s.h.a.CollectionsHandler Invoked Collection Action :create with params collection.configName=conf1&name=test&nrtReplicas=1&action=CREATE&numShards=1&createNodeSet=127.0.0.1:61231_solr&wt=javabin&version=2 and sendToOCPQueue=true
   [junit4]   2> 241725 INFO  (OverseerThreadFactory-820-thread-1-processing-n:127.0.0.1:61231_solr) [n:127.0.0.1:61231_solr     ] o.a.s.c.a.c.CreateCollectionCmd Create collection test
   [junit4]   2> 241871 INFO  (OverseerStateUpdate-72063526270992392-127.0.0.1:61231_solr-n_0000000000) [n:127.0.0.1:61231_solr     ] o.a.s.c.o.SliceMutator createReplica() {
   [junit4]   2>   "operation":"ADDREPLICA",
   [junit4]   2>   "collection":"test",
   [junit4]   2>   "shard":"shard1",
   [junit4]   2>   "core":"test_shard1_replica_n1",
   [junit4]   2>   "state":"down",
   [junit4]   2>   "base_url":"http://127.0.0.1:61231/solr",
   [junit4]   2>   "type":"NRT",
   [junit4]   2>   "waitForFinalState":"false"} 
   [junit4]   2> 242089 INFO  (qtp1564765406-2928) [n:127.0.0.1:61231_solr    x:test_shard1_replica_n1 ] o.a.s.h.a.CoreAdminOperation core create command qt=/admin/cores&coreNodeName=core_node2&collection.configName=conf1&newCollection=true&name=test_shard1_replica_n1&action=CREATE&numShards=1&collection=test&shard=shard1&wt=javabin&version=2&replicaType=NRT
   [junit4]   2> 242090 INFO  (qtp1564765406-2928) [n:127.0.0.1:61231_solr    x:test_shard1_replica_n1 ] o.a.s.c.TransientSolrCoreCacheDefault Allocating transient cache for 2147483647 transient cores
   [junit4]   2> 243183 INFO  (qtp1564765406-2928) [n:127.0.0.1:61231_solr c:test s:shard1 r:core_node2 x:test_shard1_replica_n1 ] o.a.s.c.SolrConfig Using Lucene MatchVersion: 9.0.0
   [junit4]   2> 243200 INFO  (qtp1564765406-2928) [n:127.0.0.1:61231_solr c:test s:shard1 r:core_node2 x:test_shard1_replica_n1 ] o.a.s.s.IndexSchema [test_shard1_replica_n1] Schema name=minimal
   [junit4]   2> 243203 INFO  (qtp1564765406-2928) [n:127.0.0.1:61231_solr c:test s:shard1 r:core_node2 x:test_shard1_replica_n1 ] o.a.s.s.IndexSchema Loaded schema minimal/1.1 with uniqueid field id
   [junit4]   2> 243203 INFO  (qtp1564765406-2928) [n:127.0.0.1:61231_solr c:test s:shard1 r:core_node2 x:test_shard1_replica_n1 ] o.a.s.c.CoreContainer Creating SolrCore 'test_shard1_replica_n1' using configuration from collection test, trusted=true
   [junit4]   2> 243206 INFO  (qtp1564765406-2928) [n:127.0.0.1:61231_solr c:test s:shard1 r:core_node2 x:test_shard1_replica_n1 ] o.a.s.m.r.SolrJmxReporter JMX monitoring for 'solr_61231.solr.core.test.shard1.replica_n1' (registry 'solr.core.test.shard1.replica_n1') enabled at server: com.sun.jmx.mbeanserver.JmxMBeanServer@6e10eaa1
   [junit4]   2> 243234 INFO  (qtp1564765406-2928) [n:127.0.0.1:61231_solr c:test s:shard1 r:core_node2 x:test_shard1_replica_n1 ] o.a.s.c.HdfsDirectoryFactory solr.hdfs.home=hdfs://localhost:61271/solr_hdfs_home
   [junit4]   2> 243235 INFO  (qtp1564765406-2928) [n:127.0.0.1:61231_solr c:test s:shard1 r:core_node2 x:test_shard1_replica_n1 ] o.a.s.c.HdfsDirectoryFactory Solr Kerberos Authentication disabled
   [junit4]   2> 243235 INFO  (qtp1564765406-2928) [n:127.0.0.1:61231_solr c:test s:shard1 r:core_node2 x:test_shard1_replica_n1 ] o.a.s.c.SolrCore [[test_shard1_replica_n1] ] Opening new SolrCore at [/Users/jenkins/workspace/Lucene-Solr-master-MacOSX/solr/build/solr-core/test/J3/temp/solr.cloud.hdfs.HDFSCollectionsAPITest_9F32918D10A5218B-001/tempDir-001/node1/test_shard1_replica_n1], dataDir=[hdfs://localhost:61271/solr_hdfs_home/test/core_node2/data/]
   [junit4]   2> 243243 INFO  (qtp1564765406-2928) [n:127.0.0.1:61231_solr c:test s:shard1 r:core_node2 x:test_shard1_replica_n1 ] o.a.s.c.HdfsDirectoryFactory creating directory factory for path hdfs://localhost:61271/solr_hdfs_home/test/core_node2/data/snapshot_metadata
   [junit4]   2> 243273 INFO  (qtp1564765406-2928) [n:127.0.0.1:61231_solr c:test s:shard1 r:core_node2 x:test_shard1_replica_n1 ] o.a.s.c.HdfsDirectoryFactory Number of slabs of block cache [1] with direct memory allocation set to [true]
   [junit4]   2> 243273 INFO  (qtp1564765406-2928) [n:127.0.0.1:61231_solr c:test s:shard1 r:core_node2 x:test_shard1_replica_n1 ] o.a.s.c.HdfsDirectoryFactory Block cache target memory usage, slab size of [4194304] will allocate [1] slabs and use ~[4194304] bytes
   [junit4]   2> 243273 INFO  (qtp1564765406-2928) [n:127.0.0.1:61231_solr c:test s:shard1 r:core_node2 x:test_shard1_replica_n1 ] o.a.s.c.HdfsDirectoryFactory Creating new single instance HDFS BlockCache
   [junit4]   2> 243388 INFO  (qtp1564765406-2928) [n:127.0.0.1:61231_solr c:test s:shard1 r:core_node2 x:test_shard1_replica_n1 ] o.a.s.s.b.BlockDirectory Block cache on write is disabled
   [junit4]   2> 243401 INFO  (qtp1564765406-2928) [n:127.0.0.1:61231_solr c:test s:shard1 r:core_node2 x:test_shard1_replica_n1 ] o.a.s.c.HdfsDirectoryFactory creating directory factory for path hdfs://localhost:61271/solr_hdfs_home/test/core_node2/data
   [junit4]   2> 243467 INFO  (qtp1564765406-2928) [n:127.0.0.1:61231_solr c:test s:shard1 r:core_node2 x:test_shard1_replica_n1 ] o.a.s.c.HdfsDirectoryFactory creating directory factory for path hdfs://localhost:61271/solr_hdfs_home/test/core_node2/data/index
   [junit4]   2> 243482 INFO  (qtp1564765406-2928) [n:127.0.0.1:61231_solr c:test s:shard1 r:core_node2 x:test_shard1_replica_n1 ] o.a.s.c.HdfsDirectoryFactory Number of slabs of block cache [1] with direct memory allocation set to [true]
   [junit4]   2> 243482 INFO  (qtp1564765406-2928) [n:127.0.0.1:61231_solr c:test s:shard1 r:core_node2 x:test_shard1_replica_n1 ] o.a.s.c.HdfsDirectoryFactory Block cache target memory usage, slab size of [4194304] will allocate [1] slabs and use ~[4194304] bytes
   [junit4]   2> 243482 INFO  (qtp1564765406-2928) [n:127.0.0.1:61231_solr c:test s:shard1 r:core_node2 x:test_shard1_replica_n1 ] o.a.s.c.HdfsDirectoryFactory Creating new single instance HDFS BlockCache
   [junit4]   2> 243535 INFO  (qtp1564765406-2928) [n:127.0.0.1:61231_solr c:test s:shard1 r:core_node2 x:test_shard1_replica_n1 ] o.a.s.s.b.BlockDirectory Block cache on write is disabled
   [junit4]   2> 244546 INFO  (qtp1564765406-2928) [n:127.0.0.1:61231_solr c:test s:shard1 r:core_node2 x:test_shard1_replica_n1 ] o.a.s.u.UpdateHandler Using UpdateLog implementation: org.apache.solr.update.HdfsUpdateLog
   [junit4]   2> 244546 INFO  (qtp1564765406-2928) [n:127.0.0.1:61231_solr c:test s:shard1 r:core_node2 x:test_shard1_replica_n1 ] o.a.s.u.UpdateLog Initializing UpdateLog: dataDir=null defaultSyncLevel=FLUSH numRecordsToKeep=100 maxNumLogsToKeep=10 numVersionBuckets=65536
   [junit4]   2> 244546 INFO  (qtp1564765406-2928) [n:127.0.0.1:61231_solr c:test s:shard1 r:core_node2 x:test_shard1_replica_n1 ] o.a.s.u.HdfsUpdateLog Initializing HdfsUpdateLog: tlogDfsReplication=3
   [junit4]   2> 244577 INFO  (qtp1564765406-2928) [n:127.0.0.1:61231_solr c:test s:shard1 r:core_node2 x:test_shard1_replica_n1 ] o.a.s.u.CommitTracker Hard AutoCommit: disabled
   [junit4]   2> 244577 INFO  (qtp1564765406-2928) [n:127.0.0.1:61231_solr c:test s:shard1 r:core_node2 x:test_shard1_replica_n1 ] o.a.s.u.CommitTracker Soft AutoCommit: disabled
   [junit4]   2> 244739 INFO  (qtp1564765406-2928) [n:127.0.0.1:61231_solr c:test s:shard1 r:core_node2 x:test_shard1_replica_n1 ] o.a.s.s.SolrIndexSearcher Opening [Searcher@18501067[test_shard1_replica_n1] main]
   [junit4]   2> 244752 INFO  (qtp1564765406-2928) [n:127.0.0.1:61231_solr c:test s:shard1 r:core_node2 x:test_shard1_replica_n1 ] o.a.s.r.ManagedResourceStorage Configured ZooKeeperStorageIO with znodeBase: /configs/conf1
   [junit4]   2> 244754 INFO  (qtp1564765406-2928) [n:127.0.0.1:61231_solr c:test s:shard1 r:core_node2 x:test_shard1_replica_n1 ] o.a.s.r.ManagedResourceStorage Loaded null at path _rest_managed.json using ZooKeeperStorageIO:path=/configs/conf1
   [junit4]   2> 244767 INFO  (qtp1564765406-2928) [n:127.0.0.1:61231_solr c:test s:shard1 r:core_node2 x:test_shard1_replica_n1 ] o.a.s.h.ReplicationHandler Commits will be reserved for 10000ms.
   [junit4]   2> 244773 INFO  (qtp1564765406-2928) [n:127.0.0.1:61231_solr c:test s:shard1 r:core_node2 x:test_shard1_replica_n1 ] o.a.s.u.UpdateLog Could not find max version in index or recent updates, using new clock 1655867103285084160
   [junit4]   2> 244783 INFO  (searcherExecutor-829-thread-1-processing-n:127.0.0.1:61231_solr x:test_shard1_replica_n1 c:test s:shard1 r:core_node2) [n:127.0.0.1:61231_solr c:test s:shard1 r:core_node2 x:test_shard1_replica_n1 ] o.a.s.c.SolrCore [test_shard1_replica_n1] Registered new searcher Searcher@18501067[test_shard1_replica_n1] main{ExitableDirectoryReader(UninvertingDirectoryReader())}
   [junit4]   2> 244813 INFO  (qtp1564765406-2928) [n:127.0.0.1:61231_solr c:test s:shard1 r:core_node2 x:test_shard1_replica_n1 ] o.a.s.c.ZkShardTerms Successful update of terms at /collections/test/terms/shard1 to Terms{values={core_node2=0}, version=0}
   [junit4]   2> 244815 INFO  (qtp1564765406-2928) [n:127.0.0.1:61231_solr c:test s:shard1 r:core_node2 x:test_shard1_replica_n1 ] o.a.s.c.ShardLeaderElectionContextBase make sure parent is created /collections/test/leaders/shard1
   [junit4]   2> 244844 INFO  (qtp1564765406-2928) [n:127.0.0.1:61231_solr c:test s:shard1 r:core_node2 x:test_shard1_replica_n1 ] o.a.s.c.ShardLeaderElectionContext Enough replicas found to continue.
   [junit4]   2> 244844 INFO  (qtp1564765406-2928) [n:127.0.0.1:61231_solr c:test s:shard1 r:core_node2 x:test_shard1_replica_n1 ] o.a.s.c.ShardLeaderElectionContext I may be the new leader - try and sync
   [junit4]   2> 244844 INFO  (qtp1564765406-2928) [n:127.0.0.1:61231_solr c:test s:shard1 r:core_node2 x:test_shard1_replica_n1 ] o.a.s.c.SyncStrategy Sync replicas to http://127.0.0.1:61231/solr/test_shard1_replica_n1/
   [junit4]   2> 244847 INFO  (qtp1564765406-2928) [n:127.0.0.1:61231_solr c:test s:shard1 r:core_node2 x:test_shard1_replica_n1 ] o.a.s.c.SyncStrategy Sync Success - now sync replicas to me
   [junit4]   2> 244849 INFO  (qtp1564765406-2928) [n:127.0.0.1:61231_solr c:test s:shard1 r:core_node2 x:test_shard1_replica_n1 ] o.a.s.c.SyncStrategy http://127.0.0.1:61231/solr/test_shard1_replica_n1/ has no replicas
   [junit4]   2> 244849 INFO  (qtp1564765406-2928) [n:127.0.0.1:61231_solr c:test s:shard1 r:core_node2 x:test_shard1_replica_n1 ] o.a.s.c.ShardLeaderElectionContextBase Creating leader registration node /collections/test/leaders/shard1/leader after winning as /collections/test/leader_elect/shard1/election/72063526270992392-core_node2-n_0000000000
   [junit4]   2> 244861 INFO  (qtp1564765406-2928) [n:127.0.0.1:61231_solr c:test s:shard1 r:core_node2 x:test_shard1_replica_n1 ] o.a.s.c.ShardLeaderElectionContext I am the new leader: http://127.0.0.1:61231/solr/test_shard1_replica_n1/ shard1
   [junit4]   2> 244872 INFO  (zkCallback-1019-thread-1) [     ] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/test/state.json] for collection [test] has occurred - updating... (live nodes size: [2])
   [junit4]   2> 244872 INFO  (zkCallback-1019-thread-2) [     ] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/test/state.json] for collection [test] has occurred - updating... (live nodes size: [2])
   [junit4]   2> 244878 INFO  (qtp1564765406-2928) [n:127.0.0.1:61231_solr c:test s:shard1 r:core_node2 x:test_shard1_replica_n1 ] o.a.s.c.ZkController I am the leader, no recovery necessary
   [junit4]   2> 244900 INFO  (qtp1564765406-2928) [n:127.0.0.1:61231_solr c:test s:shard1 r:core_node2 x:test_shard1_replica_n1 ] o.a.s.s.HttpSolrCall [admin] webapp=null path=/admin/cores params={qt=/admin/cores&coreNodeName=core_node2&collection.configName=conf1&newCollection=true&name=test_shard1_replica_n1&action=CREATE&numShards=1&collection=test&shard=shard1&wt=javabin&version=2&replicaType=NRT} status=0 QTime=2813
   [junit4]   2> 244930 INFO  (qtp1564765406-2926) [n:127.0.0.1:61231_solr     ] o.a.s.h.a.CollectionsHandler Wait for new collection to be active for at most 45 seconds. Check all shard replicas
   [junit4]   2> 245000 INFO  (zkCallback-1019-thread-2) [     ] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/test/state.json] for collection [test] has occurred - updating... (live nodes size: [2])
   [junit4]   2> 245001 INFO  (zkCallback-1019-thread-1) [     ] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/test/state.json] for collection [test] has occurred - updating... (live nodes size: [2])
   [junit4]   2> 245002 INFO  (zkCallback-1019-thread-3) [     ] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/test/state.json] for collection [test] has occurred - updating... (live nodes size: [2])
   [junit4]   2> 245004 INFO  (qtp1564765406-2926) [n:127.0.0.1:61231_solr     ] o.a.s.s.HttpSolrCall [admin] webapp=null path=/admin/collections params={collection.configName=conf1&name=test&nrtReplicas=1&action=CREATE&numShards=1&createNodeSet=127.0.0.1:61231_solr&wt=javabin&version=2} status=0 QTime=3303
   [junit4]   2> 245005 INFO  (TEST-HDFSCollectionsAPITest.testDataDirIsNotReused-seed#[9F32918D10A5218B]) [     ] o.a.s.c.SolrCloudTestCase waitForState (test): 
   [junit4]   2> 245008 INFO  (TEST-HDFSCollectionsAPITest.testDataDirIsNotReused-seed#[9F32918D10A5218B]) [     ] o.a.s.c.SolrCloudTestCase active replica count: 1 expected replica count: 1
   [junit4]   2> 245010 INFO  (TEST-HDFSCollectionsAPITest.testDataDirIsNotReused-seed#[9F32918D10A5218B]) [     ] o.a.s.c.SolrCloudTestCase active replica count: 1 expected replica count: 1
   [junit4]   2> 245014 INFO  (TEST-HDFSCollectionsAPITest.testDataDirIsNotReused-seed#[9F32918D10A5218B]) [     ] o.a.s.c.SolrCloudTestCase active replica count: 1 expected replica count: 1
   [junit4]   2> 245075 INFO  (qtp1564765406-2925) [n:127.0.0.1:61231_solr c:test s:shard1 r:core_node2 x:test_shard1_replica_n1 ] o.a.s.c.ZkShardTerms Successful update of terms at /collections/test/terms/shard1 to Terms{values={core_node2=1}, version=1}
   [junit4]   2> 245108 INFO  (qtp1564765406-2925) [n:127.0.0.1:61231_solr c:test s:shard1 r:core_node2 x:test_shard1_replica_n1 ] o.a.s.u.p.LogUpdateProcessorFactory [test_shard1_replica_n1]  webapp=/solr path=/update params={wt=javabin&version=2}{add=[1 (1655867103559811072)]} 0 87
   [junit4]   2> 245118 INFO  (qtp1564765406-2927) [n:127.0.0.1:61231_solr c:test s:shard1 r:core_node2 x:test_shard1_replica_n1 ] o.a.s.u.p.LogUpdateProcessorFactory [test_shard1_replica_n1]  webapp=/solr path=/update params={wt=javabin&version=2}{add=[2 (1655867103641600000)]} 0 7
   [junit4]   2> 245131 INFO  (qtp1564765406-2928) [n:127.0.0.1:61231_solr c:test s:shard1 r:core_node2 x:test_shard1_replica_n1 ] o.a.s.u.DirectUpdateHandler2 start commit{_version_=1655867103658377216,optimize=false,openSearcher=true,waitSearcher=true,expungeDeletes=false,softCommit=false,prepareCommit=false}
   [junit4]   2> 245131 INFO  (qtp1564765406-2928) [n:127.0.0.1:61231_solr c:test s:shard1 r:core_node2 x:test_shard1_replica_n1 ] o.a.s.u.SolrIndexWriter Calling setCommitData with IW:org.apache.solr.update.SolrIndexWriter@455ad80e commitCommandVersion:1655867103658377216
   [junit4]   2> 245986 INFO  (OverseerCollectionConfigSetProcessor-72063526270992392-127.0.0.1:61231_solr-n_0000000000) [n:127.0.0.1:61231_solr     ] o.a.s.c.OverseerTaskQueue Response ZK path: /overseer/collection-queue-work/qnr-0000000000 doesn't exist.  Requestor may have disconnected from ZooKeeper
   [junit4]   2> 248276 INFO  (qtp1564765406-2928) [n:127.0.0.1:61231_solr c:test s:shard1 r:core_node2 x:test_shard1_replica_n1 ] o.a.s.s.SolrIndexSearcher Opening [Searcher@6f53a13f[test_shard1_replica_n1] main]
   [junit4]   2> 248283 INFO  (searcherExecutor-829-thread-1-processing-n:127.0.0.1:61231_solr x:test_shard1_replica_n1 c:test s:shard1 r:core_node2) [n:127.0.0.1:61231_solr c:test s:shard1 r:core_node2 x:test_shard1_replica_n1 ] o.a.s.c.SolrCore [test_shard1_replica_n1] Registered new searcher Searcher@6f53a13f[test_shard1_replica_n1] main{ExitableDirectoryReader(UninvertingDirectoryReader(Uninverting(_0(9.0.0):C2:[diagnostics={timestamp=1579157928753, java.vendor=AdoptOpenJDK, os=Mac OS X, os.version=10.14.6, java.runtime.version=12.0.1+12, os.arch=x86_64, source=flush, lucene.version=9.0.0, java.vm.version=12.0.1+12, java.version=12.0.1}]:[attributes={Lucene50StoredFieldsFormat.mode=BEST_SPEED}])))}
   [junit4]   2> 248291 INFO  (qtp1564765406-2928) [n:127.0.0.1:61231_solr c:test s:shard1 r:core_node2 x:test_shard1_replica_n1 ] o.a.s.u.DirectUpdateHandler2 end_commit_flush
   [junit4]   2> 248291 INFO  (qtp1564765406-2928) [n:127.0.0.1:61231_solr c:test s:shard1 r:core_node2 x:test_shard1_replica_n1 ] o.a.s.u.p.LogUpdateProcessorFactory [test_shard1_replica_n1]  webapp=/solr path=/update params={_stateVer_=test:4&waitSearcher=true&commit=true&softCommit=false&wt=javabin&version=2}{commit=} 0 3166
   [junit4]   2> 248332 INFO  (qtp1564765406-2926) [n:127.0.0.1:61231_solr c:test s:shard1 r:core_node2 x:test_shard1_replica_n1 ] o.a.s.u.p.LogUpdateProcessorFactory [test_shard1_replica_n1]  webapp=/solr path=/update params={wt=javabin&version=2}{add=[3 (1655867106977120256)]} 0 37
   [junit4]   2> 248333 INFO  (TEST-HDFSCollectionsAPITest.testDataDirIsNotReused-seed#[9F32918D10A5218B]) [     ] o.a.s.c.

[...truncated too long message...]

 (3)
   [junit4]   2> 561141 INFO  (jetty-closer-1969-thread-1) [     ] o.a.s.m.SolrMetricManager Closing metric reporters for registry=solr.jetty, tag=null
   [junit4]   2> 561142 INFO  (jetty-closer-1969-thread-1) [     ] o.a.s.m.r.SolrJmxReporter Closing reporter [org.apache.solr.metrics.reporters.SolrJmxReporter@7200a96b: rootName = solr_63404, domain = solr.jetty, service url = null, agent id = null] for registry solr.jetty / com.codahale.metrics.MetricRegistry@2a5ac250
   [junit4]   2> 561142 INFO  (jetty-closer-1969-thread-1) [     ] o.a.s.m.SolrMetricManager Closing metric reporters for registry=solr.cluster, tag=null
   [junit4]   2> 561147 INFO  (closeThreadPool-1983-thread-2) [     ] o.a.s.c.Overseer Overseer (id=72063547951087630-127.0.0.1:63404_solr-n_0000000000) closing
   [junit4]   2> 561147 INFO  (jetty-closer-1969-thread-2) [     ] o.a.s.m.SolrMetricManager Closing metric reporters for registry=solr.jetty, tag=null
   [junit4]   2> 561147 INFO  (jetty-closer-1969-thread-2) [     ] o.a.s.m.r.SolrJmxReporter Closing reporter [org.apache.solr.metrics.reporters.SolrJmxReporter@1e6e967: rootName = solr_63401, domain = solr.jetty, service url = null, agent id = null] for registry solr.jetty / com.codahale.metrics.MetricRegistry@2a5ac250
   [junit4]   2> 561147 INFO  (jetty-closer-1969-thread-2) [     ] o.a.s.m.SolrMetricManager Closing metric reporters for registry=solr.cluster, tag=null
   [junit4]   2> 561147 INFO  (OverseerStateUpdate-72063547951087630-127.0.0.1:63404_solr-n_0000000000) [n:127.0.0.1:63404_solr     ] o.a.s.c.Overseer Overseer Loop exiting : 127.0.0.1:63404_solr
   [junit4]   2> 561148 INFO  (OverseerAutoScalingTriggerThread-72063547951087630-127.0.0.1:63404_solr-n_0000000000) [     ] o.a.s.c.a.OverseerTriggerThread OverseerTriggerThread woken up but we are closed, exiting.
   [junit4]   2> 561148 INFO  (jetty-closer-1969-thread-4) [     ] o.a.s.m.SolrMetricManager Closing metric reporters for registry=solr.jetty, tag=null
   [junit4]   2> 561148 INFO  (jetty-closer-1969-thread-4) [     ] o.a.s.m.r.SolrJmxReporter Closing reporter [org.apache.solr.metrics.reporters.SolrJmxReporter@992fe64: rootName = solr_63402, domain = solr.jetty, service url = null, agent id = null] for registry solr.jetty / com.codahale.metrics.MetricRegistry@2a5ac250
   [junit4]   2> 561148 INFO  (jetty-closer-1969-thread-4) [     ] o.a.s.m.SolrMetricManager Closing metric reporters for registry=solr.cluster, tag=null
   [junit4]   2> 561179 INFO  (closeThreadPool-1983-thread-1) [     ] o.a.s.c.Overseer Overseer (id=72063547951087630-127.0.0.1:63404_solr-n_0000000000) closing
   [junit4]   2> 561207 INFO  (jetty-closer-1969-thread-3) [     ] o.e.j.s.AbstractConnector Stopped ServerConnector@269714ff{HTTP/1.1,[http/1.1, h2c]}{127.0.0.1:0}
   [junit4]   2> 561208 INFO  (jetty-closer-1969-thread-3) [     ] o.e.j.s.h.ContextHandler Stopped o.e.j.s.ServletContextHandler@6b2d328a{/solr,null,UNAVAILABLE}
   [junit4]   2> 561209 INFO  (jetty-closer-1969-thread-3) [     ] o.e.j.s.session node0 Stopped scavenging
   [junit4]   2> 561274 INFO  (zkCallback-1940-thread-2) [     ] o.a.s.c.c.ZkStateReader Updated live nodes from ZooKeeper... (3) -> (1)
   [junit4]   2> 561278 INFO  (jetty-closer-1969-thread-2) [     ] o.e.j.s.AbstractConnector Stopped ServerConnector@3524c126{HTTP/1.1,[http/1.1, h2c]}{127.0.0.1:0}
   [junit4]   2> 561279 INFO  (jetty-closer-1969-thread-2) [     ] o.e.j.s.h.ContextHandler Stopped o.e.j.s.ServletContextHandler@4bae8893{/solr,null,UNAVAILABLE}
   [junit4]   2> 561280 INFO  (jetty-closer-1969-thread-2) [     ] o.e.j.s.session node0 Stopped scavenging
   [junit4]   2> 561303 INFO  (jetty-closer-1969-thread-1) [     ] o.a.s.c.Overseer Overseer (id=72063547951087630-127.0.0.1:63404_solr-n_0000000000) closing
   [junit4]   2> 561309 INFO  (jetty-closer-1969-thread-1) [     ] o.e.j.s.AbstractConnector Stopped ServerConnector@799da3ac{HTTP/1.1,[http/1.1, h2c]}{127.0.0.1:0}
   [junit4]   2> 561310 INFO  (jetty-closer-1969-thread-1) [     ] o.e.j.s.h.ContextHandler Stopped o.e.j.s.ServletContextHandler@12ff1ede{/solr,null,UNAVAILABLE}
   [junit4]   2> 561310 INFO  (jetty-closer-1969-thread-1) [     ] o.e.j.s.session node0 Stopped scavenging
   [junit4]   2> 561383 INFO  (jetty-closer-1969-thread-4) [     ] o.e.j.s.AbstractConnector Stopped ServerConnector@6fe62a01{HTTP/1.1,[http/1.1, h2c]}{127.0.0.1:0}
   [junit4]   2> 561384 INFO  (jetty-closer-1969-thread-4) [     ] o.e.j.s.h.ContextHandler Stopped o.e.j.s.ServletContextHandler@22984c01{/solr,null,UNAVAILABLE}
   [junit4]   2> 561384 INFO  (jetty-closer-1969-thread-4) [     ] o.e.j.s.session node0 Stopped scavenging
   [junit4]   2> 561390 INFO  (TEST-SearchRateTriggerTest.testDefaultsAndBackcompat-seed#[9F32918D10A5218B]) [     ] o.a.s.c.ZkTestServer Shutting down ZkTestServer.
   [junit4]   2> 561621 WARN  (ZkTestServer Run Thread) [     ] o.a.s.c.ZkTestServer Watch limit violations: 
   [junit4]   2> Maximum concurrent create/delete watches above limit:
   [junit4]   2> 
   [junit4]   2> 	5	/solr/aliases.json
   [junit4]   2> 	5	/solr/clusterprops.json
   [junit4]   2> 	4	/solr/packages.json
   [junit4]   2> 	4	/solr/security.json
   [junit4]   2> 
   [junit4]   2> Maximum concurrent data watches above limit:
   [junit4]   2> 
   [junit4]   2> 	5	/solr/clusterstate.json
   [junit4]   2> 	2	/solr/overseer_elect/election/72063547951087629-127.0.0.1:63401_solr-n_0000000001
   [junit4]   2> 
   [junit4]   2> Maximum concurrent children watches above limit:
   [junit4]   2> 
   [junit4]   2> 	20	/solr/live_nodes
   [junit4]   2> 	9	/solr/overseer/queue
   [junit4]   2> 	5	/solr/collections
   [junit4]   2> 
   [junit4]   2> 561630 INFO  (TEST-SearchRateTriggerTest.testDefaultsAndBackcompat-seed#[9F32918D10A5218B]) [     ] o.a.s.c.ZkTestServer waitForServerDown: 127.0.0.1:63395
   [junit4]   2> 561630 INFO  (TEST-SearchRateTriggerTest.testDefaultsAndBackcompat-seed#[9F32918D10A5218B]) [     ] o.a.s.c.ZkTestServer parse host and port list: 127.0.0.1:63395
   [junit4]   2> 561630 INFO  (TEST-SearchRateTriggerTest.testDefaultsAndBackcompat-seed#[9F32918D10A5218B]) [     ] o.a.s.c.ZkTestServer connecting to 127.0.0.1 63395
   [junit4]   2> 561631 INFO  (TEST-SearchRateTriggerTest.testDefaultsAndBackcompat-seed#[9F32918D10A5218B]) [     ] o.a.s.SolrTestCaseJ4 ###Ending testDefaultsAndBackcompat
   [junit4] IGNOR/A 0.00s J3 | SearchRateTriggerTest.testTrigger
   [junit4]    > Assumption #1: 'awaitsfix' test group is disabled (@AwaitsFix(bugUrl="https://issues.apache.org/jira/browse/SOLR-12028"))
   [junit4]   2> NOTE: leaving temporary files on disk at: /Users/jenkins/workspace/Lucene-Solr-master-MacOSX/solr/build/solr-core/test/J3/temp/solr.cloud.autoscaling.SearchRateTriggerTest_9F32918D10A5218B-001
   [junit4]   2> Jan 16, 2020 7:04:04 AM com.carrotsearch.randomizedtesting.ThreadLeakControl checkThreadLeaks
   [junit4]   2> WARNING: Will linger awaiting termination of 1 leaked thread(s).
   [junit4]   2> NOTE: test params are: codec=Asserting(Lucene84): {}, docValues:{}, maxPointsInLeafNode=1604, maxMBSortInHeap=6.407041591261064, sim=Asserting(org.apache.lucene.search.similarities.AssertingSimilarity@1c53ffb7), locale=lrc-IQ, timezone=NZ-CHAT
   [junit4]   2> NOTE: Mac OS X 10.14.6 x86_64/AdoptOpenJDK 12.0.1 (64-bit)/cpus=6,threads=1,free=108386048,total=280064000
   [junit4]   2> NOTE: All tests run in this JVM: [TestMiniSolrCloudClusterSSL, CacheHeaderTest, TestPseudoReturnFields, ShowFileRequestHandlerTest, NodeAddedTriggerIntegrationTest, TestDynamicFieldNamesIndexCorrectly, HighlighterMaxOffsetTest, TestCloudSearcherWarming, SolrJmxReporterCloudTest, TestAnalyzeInfixSuggestions, HDFSCollectionsAPITest, TestGroupingSearch, AddBlockUpdateTest, HLLUtilTest, CoreAdminHandlerTest, SortByFunctionTest, TestCorePropertiesReload, TaggerTest, TestCloudJSONFacetSKG, TestBulkSchemaConcurrent, EnumFieldTest, FieldMutatingUpdateProcessorTest, BadIndexSchemaTest, PeerSyncWithLeaderAndIndexFingerprintCachingTest, TestMultiWordSynonyms, BufferStoreTest, ParsingFieldUpdateProcessorsTest, MergeStrategyTest, AutoScalingHandlerTest, SimpleCollectionCreateDeleteTest, TestCursorMarkWithoutUniqueKey, TestUnInvertedFieldException, TestXmlQParserPlugin, CollectionPropsTest, UUIDUpdateProcessorFallbackTest, TestScoreJoinQPNoScore, TriLevelCompositeIdRoutingTest, MetricsHandlerTest, PathHierarchyTokenizerFactoryTest, SearchRateTriggerTest]
   [junit4] Completed [156/899 (3!)] on J3 in 23.37s, 3 tests, 1 failure, 1 skipped <<< FAILURES!
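The testTrigger case above was skipped because it carries Lucene's @AwaitsFix annotation and the 'awaitsfix' test group is disabled by default. A minimal sketch of how such a method is marked (illustrative only, not the actual SearchRateTriggerTest source; the group is typically enabled with -Dtests.awaitsfix=true):

    import org.apache.lucene.util.LuceneTestCase;
    import org.apache.lucene.util.LuceneTestCase.AwaitsFix;
    import org.junit.Test;

    public class ExampleTriggerTest extends LuceneTestCase {
        @Test
        @AwaitsFix(bugUrl = "https://issues.apache.org/jira/browse/SOLR-12028")
        public void testTrigger() {
            // Body elided; the runner records the method as an assumption-ignored test
            // (IGNOR/A) instead of executing it while the referenced bug remains open.
        }
    }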

[...truncated 47534 lines...]
BUILD FAILED
/Users/jenkins/workspace/Lucene-Solr-master-MacOSX/build.xml:634: The following error occurred while executing this line:
/Users/jenkins/workspace/Lucene-Solr-master-MacOSX/build.xml:507: The following error occurred while executing this line:
/Users/jenkins/workspace/Lucene-Solr-master-MacOSX/build.xml:494: Source checkout is dirty (unversioned/missing files) after running tests!!! Offending files:
* solr/licenses/jetty-start-9.4.24.v20191120-shaded.jar.sha1
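The dirty-checkout failure above means a license checksum file was missing or unversioned after the test run. Each solr/licenses/*.sha1 file holds the hex SHA-1 of the matching jar; a minimal sketch of producing that digest for the offending jar (hypothetical local path, standard JDK APIs only, not part of the build itself):

    import java.nio.file.Files;
    import java.nio.file.Path;
    import java.security.MessageDigest;

    public class Sha1OfJar {
        public static void main(String[] args) throws Exception {
            // Hypothetical path; the real jar is resolved by the build, not checked in.
            Path jar = Path.of(args.length > 0 ? args[0]
                    : "jetty-start-9.4.24.v20191120-shaded.jar");
            byte[] digest = MessageDigest.getInstance("SHA-1").digest(Files.readAllBytes(jar));
            StringBuilder hex = new StringBuilder();
            for (byte b : digest) {
                hex.append(String.format("%02x", b));
            }
            System.out.println(hex);  // this hex string is the entire .sha1 file content
        }
    }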

Total time: 77 minutes 59 seconds
Build step 'Invoke Ant' marked build as failure
Archiving artifacts
Setting ANT_1_8_2_HOME=/Users/jenkins/tools/hudson.tasks.Ant_AntInstallation/ANT_1.8.2
[WARNINGS] Skipping publisher since build result is FAILURE
Recording test results
Setting ANT_1_8_2_HOME=/Users/jenkins/tools/hudson.tasks.Ant_AntInstallation/ANT_1.8.2
Email was triggered for: Failure - Any
Sending email for trigger: Failure - Any
Setting ANT_1_8_2_HOME=/Users/jenkins/tools/hudson.tasks.Ant_AntInstallation/ANT_1.8.2

[JENKINS] Lucene-Solr-master-MacOSX (64bit/jdk-13.0.1) - Build # 5535 - Still Failing!

Posted by Policeman Jenkins Server <je...@thetaphi.de>.
Build: https://jenkins.thetaphi.de/job/Lucene-Solr-master-MacOSX/5535/
Java: 64bit/jdk-13.0.1 -XX:-UseCompressedOops -XX:+UseConcMarkSweepGC

3 tests failed.
FAILED:  junit.framework.TestSuite.org.apache.solr.cloud.TestSSLRandomization

Error Message:
1 thread leaked from SUITE scope at org.apache.solr.cloud.TestSSLRandomization:     1) Thread[id=19678, name=SessionTracker, state=TIMED_WAITING, group=TGRP-TestSSLRandomization]         at java.base@13.0.1/java.lang.Thread.sleep(Native Method)         at app//org.apache.zookeeper.server.SessionTrackerImpl.run(SessionTrackerImpl.java:151)

Stack Trace:
com.carrotsearch.randomizedtesting.ThreadLeakError: 1 thread leaked from SUITE scope at org.apache.solr.cloud.TestSSLRandomization: 
   1) Thread[id=19678, name=SessionTracker, state=TIMED_WAITING, group=TGRP-TestSSLRandomization]
        at java.base@13.0.1/java.lang.Thread.sleep(Native Method)
        at app//org.apache.zookeeper.server.SessionTrackerImpl.run(SessionTrackerImpl.java:151)
	at __randomizedtesting.SeedInfo.seed([720DA66893576D9E]:0)
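This suite-scope leak is ZooKeeper's SessionTracker thread outliving the test; the usual fix is shutting the embedded server down rather than suppressing the check. Purely as an illustration of the knobs the randomizedtesting framework offers (a sketch with a hypothetical suite and filter class, not a change to TestSSLRandomization):

    import com.carrotsearch.randomizedtesting.ThreadFilter;
    import com.carrotsearch.randomizedtesting.annotations.ThreadLeakFilters;
    import com.carrotsearch.randomizedtesting.annotations.ThreadLeakLingers;
    import org.apache.lucene.util.LuceneTestCase;

    @ThreadLeakLingers(linger = 1000)  // wait up to 1s for straggler threads before failing
    @ThreadLeakFilters(defaultFilters = true,
            filters = { ExampleSuite.SessionTrackerFilter.class })
    public class ExampleSuite extends LuceneTestCase {
        // Hypothetical filter: tells ThreadLeakControl to ignore this known thread by name.
        public static class SessionTrackerFilter implements ThreadFilter {
            @Override
            public boolean reject(Thread t) {
                return "SessionTracker".equals(t.getName());
            }
        }

        public void testNothingLeaks() { /* ... */ }
    }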


FAILED:  org.apache.solr.search.FuzzySearchTest.testTooComplex

Error Message:
Unexpected exception type, expected RemoteSolrException but got org.apache.solr.client.solrj.SolrServerException: No live SolrServers available to handle this request:[http://127.0.0.1:54182/solr/c1]

Stack Trace:
junit.framework.AssertionFailedError: Unexpected exception type, expected RemoteSolrException but got org.apache.solr.client.solrj.SolrServerException: No live SolrServers available to handle this request:[http://127.0.0.1:54182/solr/c1]
	at __randomizedtesting.SeedInfo.seed([720DA66893576D9E:A4AFEC4C8C845ED4]:0)
	at org.apache.lucene.util.LuceneTestCase.expectThrows(LuceneTestCase.java:2752)
	at org.apache.lucene.util.LuceneTestCase.expectThrows(LuceneTestCase.java:2740)
	at org.apache.solr.search.FuzzySearchTest.testTooComplex(FuzzySearchTest.java:64)
	at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
	at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.base/java.lang.reflect.Method.invoke(Method.java:567)
	at com.carrotsearch.randomizedtesting.RandomizedRunner.invoke(RandomizedRunner.java:1754)
	at com.carrotsearch.randomizedtesting.RandomizedRunner$8.evaluate(RandomizedRunner.java:942)
	at com.carrotsearch.randomizedtesting.RandomizedRunner$9.evaluate(RandomizedRunner.java:978)
	at com.carrotsearch.randomizedtesting.RandomizedRunner$10.evaluate(RandomizedRunner.java:992)
	at com.carrotsearch.randomizedtesting.rules.SystemPropertiesRestoreRule$1.evaluate(SystemPropertiesRestoreRule.java:57)
	at org.apache.lucene.util.TestRuleSetupTeardownChained$1.evaluate(TestRuleSetupTeardownChained.java:49)
	at org.apache.lucene.util.AbstractBeforeAfterRule$1.evaluate(AbstractBeforeAfterRule.java:45)
	at org.apache.lucene.util.TestRuleThreadAndTestName$1.evaluate(TestRuleThreadAndTestName.java:48)
	at org.apache.lucene.util.TestRuleIgnoreAfterMaxFailures$1.evaluate(TestRuleIgnoreAfterMaxFailures.java:64)
	at org.apache.lucene.util.TestRuleMarkFailure$1.evaluate(TestRuleMarkFailure.java:47)
	at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
	at com.carrotsearch.randomizedtesting.ThreadLeakControl$StatementRunner.run(ThreadLeakControl.java:370)
	at com.carrotsearch.randomizedtesting.ThreadLeakControl.forkTimeoutingTask(ThreadLeakControl.java:819)
	at com.carrotsearch.randomizedtesting.ThreadLeakControl$3.evaluate(ThreadLeakControl.java:470)
	at com.carrotsearch.randomizedtesting.RandomizedRunner.runSingleTest(RandomizedRunner.java:951)
	at com.carrotsearch.randomizedtesting.RandomizedRunner$5.evaluate(RandomizedRunner.java:836)
	at com.carrotsearch.randomizedtesting.RandomizedRunner$6.evaluate(RandomizedRunner.java:887)
	at com.carrotsearch.randomizedtesting.RandomizedRunner$7.evaluate(RandomizedRunner.java:898)
	at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
	at com.carrotsearch.randomizedtesting.rules.SystemPropertiesRestoreRule$1.evaluate(SystemPropertiesRestoreRule.java:57)
	at org.apache.lucene.util.AbstractBeforeAfterRule$1.evaluate(AbstractBeforeAfterRule.java:45)
	at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
	at org.apache.lucene.util.TestRuleStoreClassName$1.evaluate(TestRuleStoreClassName.java:41)
	at com.carrotsearch.randomizedtesting.rules.NoShadowingOrOverridesOnMethodsRule$1.evaluate(NoShadowingOrOverridesOnMethodsRule.java:40)
	at com.carrotsearch.randomizedtesting.rules.NoShadowingOrOverridesOnMethodsRule$1.evaluate(NoShadowingOrOverridesOnMethodsRule.java:40)
	at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
	at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
	at org.apache.lucene.util.TestRuleAssertionsRequired$1.evaluate(TestRuleAssertionsRequired.java:53)
	at org.apache.lucene.util.TestRuleMarkFailure$1.evaluate(TestRuleMarkFailure.java:47)
	at org.apache.lucene.util.TestRuleIgnoreAfterMaxFailures$1.evaluate(TestRuleIgnoreAfterMaxFailures.java:64)
	at org.apache.lucene.util.TestRuleIgnoreTestSuites$1.evaluate(TestRuleIgnoreTestSuites.java:54)
	at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
	at com.carrotsearch.randomizedtesting.ThreadLeakControl$StatementRunner.run(ThreadLeakControl.java:370)
	at com.carrotsearch.randomizedtesting.ThreadLeakControl.lambda$forkTimeoutingTask$0(ThreadLeakControl.java:826)
	at java.base/java.lang.Thread.run(Thread.java:830)
Caused by: org.apache.solr.client.solrj.SolrServerException: No live SolrServers available to handle this request:[http://127.0.0.1:54182/solr/c1]
	at org.apache.solr.client.solrj.impl.LBSolrClient.request(LBSolrClient.java:345)
	at org.apache.solr.client.solrj.impl.BaseCloudSolrClient.sendRequest(BaseCloudSolrClient.java:1143)
	at org.apache.solr.client.solrj.impl.BaseCloudSolrClient.requestWithRetryOnStaleState(BaseCloudSolrClient.java:906)
	at org.apache.solr.client.solrj.impl.BaseCloudSolrClient.request(BaseCloudSolrClient.java:838)
	at org.apache.solr.client.solrj.SolrRequest.process(SolrRequest.java:207)
	at org.apache.solr.client.solrj.SolrClient.query(SolrClient.java:1003)
	at org.apache.solr.client.solrj.SolrClient.query(SolrClient.java:1018)
	at org.apache.solr.search.FuzzySearchTest.lambda$testTooComplex$0(FuzzySearchTest.java:64)
	at org.apache.lucene.util.LuceneTestCase._expectThrows(LuceneTestCase.java:2870)
	at org.apache.lucene.util.LuceneTestCase.expectThrows(LuceneTestCase.java:2745)
	... 41 more
Caused by: org.apache.solr.client.solrj.impl.HttpSolrClient$RemoteSolrException: Error from server at http://127.0.0.1:54182/solr/c1: Term too complex: headquarters(在日米海軍横須賀基地司令部庁舎/旧横須賀鎮守府会議所・横須賀海軍艦船部)
	at org.apache.solr.client.solrj.impl.HttpSolrClient.executeMethod(HttpSolrClient.java:665)
	at org.apache.solr.client.solrj.impl.HttpSolrClient.request(HttpSolrClient.java:265)
	at org.apache.solr.client.solrj.impl.HttpSolrClient.request(HttpSolrClient.java:248)
	at org.apache.solr.client.solrj.impl.LBSolrClient.doRequest(LBSolrClient.java:368)
	at org.apache.solr.client.solrj.impl.LBSolrClient.request(LBSolrClient.java:296)
	... 50 more
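The failure above is an assertion-level mismatch, not the "Term too complex" error itself: the test expects the fuzzy query to fail with RemoteSolrException, but the cloud client surfaced it wrapped in SolrServerException after its retry loop gave up. Roughly how such an expectThrows check is written (a minimal sketch with a hypothetical query string, not the actual FuzzySearchTest source):

    import org.apache.lucene.util.LuceneTestCase;
    import org.apache.solr.client.solrj.SolrClient;
    import org.apache.solr.client.solrj.SolrQuery;
    import org.apache.solr.client.solrj.impl.HttpSolrClient.RemoteSolrException;

    public class FuzzyAssertionSketch extends LuceneTestCase {
        private SolrClient client;  // assumed to be initialized against collection "c1" elsewhere

        public void testTooComplexSketch() {
            // expectThrows passes only if the lambda throws the expected type (or a subclass);
            // any other exception is reported as "Unexpected exception type", as seen above.
            RemoteSolrException e = expectThrows(RemoteSolrException.class,
                    () -> client.query("c1", new SolrQuery("name:headquarters~2")));
            assertTrue(e.getMessage().contains("Term too complex"));
        }
    }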


FAILED:  junit.framework.TestSuite.org.apache.solr.store.hdfs.HdfsDirectoryTest

Error Message:
48 threads leaked from SUITE scope at org.apache.solr.store.hdfs.HdfsDirectoryTest: all 48 are Netty "nioEventLoopGroup-*" selector threads (state=RUNNABLE, group=TGRP-HdfsDirectoryTest) blocked in the native sun.nio.ch.KQueue.poll call inside io.netty.channel.nio.NioEventLoop.run; the per-thread stacks are reproduced in the Stack Trace section below.

Stack Trace:
com.carrotsearch.randomizedtesting.ThreadLeakError: 48 threads leaked from SUITE scope at org.apache.solr.store.hdfs.HdfsDirectoryTest: 
   1) Thread[id=3075, name=nioEventLoopGroup-3-10, state=RUNNABLE, group=TGRP-HdfsDirectoryTest]
        at java.base@13.0.1/sun.nio.ch.KQueue.poll(Native Method)
        at java.base@13.0.1/sun.nio.ch.KQueueSelectorImpl.doSelect(KQueueSelectorImpl.java:122)
        at java.base@13.0.1/sun.nio.ch.SelectorImpl.lockAndDoSelect(SelectorImpl.java:124)
        at java.base@13.0.1/sun.nio.ch.SelectorImpl.select(SelectorImpl.java:136)
        at app//io.netty.channel.nio.NioEventLoop.select(NioEventLoop.java:737)
        at app//io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:392)
        at app//io.netty.util.concurrent.SingleThreadEventExecutor$5.run(SingleThreadEventExecutor.java:884)
        at app//io.netty.util.concurrent.FastThreadLocalRunnable.run(FastThreadLocalRunnable.java:30)
        at java.base@13.0.1/java.lang.Thread.run(Thread.java:830)
   2) Thread[id=3043, name=nioEventLoopGroup-5-1, state=RUNNABLE, group=TGRP-HdfsDirectoryTest]
        at java.base@13.0.1/sun.nio.ch.KQueue.poll(Native Method)
        at java.base@13.0.1/sun.nio.ch.KQueueSelectorImpl.doSelect(KQueueSelectorImpl.java:122)
        at java.base@13.0.1/sun.nio.ch.SelectorImpl.lockAndDoSelect(SelectorImpl.java:124)
        at java.base@13.0.1/sun.nio.ch.SelectorImpl.select(SelectorImpl.java:136)
        at app//io.netty.channel.nio.NioEventLoop.select(NioEventLoop.java:737)
        at app//io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:392)
        at app//io.netty.util.concurrent.SingleThreadEventExecutor$5.run(SingleThreadEventExecutor.java:884)
        at app//io.netty.util.concurrent.FastThreadLocalRunnable.run(FastThreadLocalRunnable.java:30)
        at java.base@13.0.1/java.lang.Thread.run(Thread.java:830)
   3) Thread[id=3047, name=nioEventLoopGroup-5-5, state=RUNNABLE, group=TGRP-HdfsDirectoryTest]
        at java.base@13.0.1/sun.nio.ch.KQueue.poll(Native Method)
        at java.base@13.0.1/sun.nio.ch.KQueueSelectorImpl.doSelect(KQueueSelectorImpl.java:122)
        at java.base@13.0.1/sun.nio.ch.SelectorImpl.lockAndDoSelect(SelectorImpl.java:124)
        at java.base@13.0.1/sun.nio.ch.SelectorImpl.select(SelectorImpl.java:136)
        at app//io.netty.channel.nio.NioEventLoop.select(NioEventLoop.java:737)
        at app//io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:392)
        at app//io.netty.util.concurrent.SingleThreadEventExecutor$5.run(SingleThreadEventExecutor.java:884)
        at app//io.netty.util.concurrent.FastThreadLocalRunnable.run(FastThreadLocalRunnable.java:30)
        at java.base@13.0.1/java.lang.Thread.run(Thread.java:830)
   4) Thread[id=3056, name=nioEventLoopGroup-2-3, state=RUNNABLE, group=TGRP-HdfsDirectoryTest]
        at java.base@13.0.1/sun.nio.ch.KQueue.poll(Native Method)
        at java.base@13.0.1/sun.nio.ch.KQueueSelectorImpl.doSelect(KQueueSelectorImpl.java:122)
        at java.base@13.0.1/sun.nio.ch.SelectorImpl.lockAndDoSelect(SelectorImpl.java:124)
        at java.base@13.0.1/sun.nio.ch.SelectorImpl.select(SelectorImpl.java:136)
        at app//io.netty.channel.nio.NioEventLoop.select(NioEventLoop.java:737)
        at app//io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:392)
        at app//io.netty.util.concurrent.SingleThreadEventExecutor$5.run(SingleThreadEventExecutor.java:884)
        at app//io.netty.util.concurrent.FastThreadLocalRunnable.run(FastThreadLocalRunnable.java:30)
        at java.base@13.0.1/java.lang.Thread.run(Thread.java:830)
   5) Thread[id=3059, name=nioEventLoopGroup-2-6, state=RUNNABLE, group=TGRP-HdfsDirectoryTest]
        at java.base@13.0.1/sun.nio.ch.KQueue.poll(Native Method)
        at java.base@13.0.1/sun.nio.ch.KQueueSelectorImpl.doSelect(KQueueSelectorImpl.java:122)
        at java.base@13.0.1/sun.nio.ch.SelectorImpl.lockAndDoSelect(SelectorImpl.java:124)
        at java.base@13.0.1/sun.nio.ch.SelectorImpl.select(SelectorImpl.java:136)
        at app//io.netty.channel.nio.NioEventLoop.select(NioEventLoop.java:737)
        at app//io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:392)
        at app//io.netty.util.concurrent.SingleThreadEventExecutor$5.run(SingleThreadEventExecutor.java:884)
        at app//io.netty.util.concurrent.FastThreadLocalRunnable.run(FastThreadLocalRunnable.java:30)
        at java.base@13.0.1/java.lang.Thread.run(Thread.java:830)
   6) Thread[id=3035, name=nioEventLoopGroup-4-5, state=RUNNABLE, group=TGRP-HdfsDirectoryTest]
        at java.base@13.0.1/sun.nio.ch.KQueue.poll(Native Method)
        at java.base@13.0.1/sun.nio.ch.KQueueSelectorImpl.doSelect(KQueueSelectorImpl.java:122)
        at java.base@13.0.1/sun.nio.ch.SelectorImpl.lockAndDoSelect(SelectorImpl.java:124)
        at java.base@13.0.1/sun.nio.ch.SelectorImpl.select(SelectorImpl.java:136)
        at app//io.netty.channel.nio.NioEventLoop.select(NioEventLoop.java:737)
        at app//io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:392)
        at app//io.netty.util.concurrent.SingleThreadEventExecutor$5.run(SingleThreadEventExecutor.java:884)
        at app//io.netty.util.concurrent.FastThreadLocalRunnable.run(FastThreadLocalRunnable.java:30)
        at java.base@13.0.1/java.lang.Thread.run(Thread.java:830)
   7) Thread[id=3033, name=nioEventLoopGroup-4-3, state=RUNNABLE, group=TGRP-HdfsDirectoryTest]
        at java.base@13.0.1/sun.nio.ch.KQueue.poll(Native Method)
        at java.base@13.0.1/sun.nio.ch.KQueueSelectorImpl.doSelect(KQueueSelectorImpl.java:122)
        at java.base@13.0.1/sun.nio.ch.SelectorImpl.lockAndDoSelect(SelectorImpl.java:124)
        at java.base@13.0.1/sun.nio.ch.SelectorImpl.select(SelectorImpl.java:136)
        at app//io.netty.channel.nio.NioEventLoop.select(NioEventLoop.java:737)
        at app//io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:392)
        at app//io.netty.util.concurrent.SingleThreadEventExecutor$5.run(SingleThreadEventExecutor.java:884)
        at app//io.netty.util.concurrent.FastThreadLocalRunnable.run(FastThreadLocalRunnable.java:30)
        at java.base@13.0.1/java.lang.Thread.run(Thread.java:830)
   8) Thread[id=3073, name=nioEventLoopGroup-3-8, state=RUNNABLE, group=TGRP-HdfsDirectoryTest]
        at java.base@13.0.1/sun.nio.ch.KQueue.poll(Native Method)
        at java.base@13.0.1/sun.nio.ch.KQueueSelectorImpl.doSelect(KQueueSelectorImpl.java:122)
        at java.base@13.0.1/sun.nio.ch.SelectorImpl.lockAndDoSelect(SelectorImpl.java:124)
        at java.base@13.0.1/sun.nio.ch.SelectorImpl.select(SelectorImpl.java:136)
        at app//io.netty.channel.nio.NioEventLoop.select(NioEventLoop.java:737)
        at app//io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:392)
        at app//io.netty.util.concurrent.SingleThreadEventExecutor$5.run(SingleThreadEventExecutor.java:884)
        at app//io.netty.util.concurrent.FastThreadLocalRunnable.run(FastThreadLocalRunnable.java:30)
        at java.base@13.0.1/java.lang.Thread.run(Thread.java:830)
   9) Thread[id=3060, name=nioEventLoopGroup-2-7, state=RUNNABLE, group=TGRP-HdfsDirectoryTest]
        at java.base@13.0.1/sun.nio.ch.KQueue.poll(Native Method)
        at java.base@13.0.1/sun.nio.ch.KQueueSelectorImpl.doSelect(KQueueSelectorImpl.java:122)
        at java.base@13.0.1/sun.nio.ch.SelectorImpl.lockAndDoSelect(SelectorImpl.java:124)
        at java.base@13.0.1/sun.nio.ch.SelectorImpl.select(SelectorImpl.java:136)
        at app//io.netty.channel.nio.NioEventLoop.select(NioEventLoop.java:737)
        at app//io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:392)
        at app//io.netty.util.concurrent.SingleThreadEventExecutor$5.run(SingleThreadEventExecutor.java:884)
        at app//io.netty.util.concurrent.FastThreadLocalRunnable.run(FastThreadLocalRunnable.java:30)
        at java.base@13.0.1/java.lang.Thread.run(Thread.java:830)
  10) Thread[id=3034, name=nioEventLoopGroup-4-4, state=RUNNABLE, group=TGRP-HdfsDirectoryTest]
        at java.base@13.0.1/sun.nio.ch.KQueue.poll(Native Method)
        at java.base@13.0.1/sun.nio.ch.KQueueSelectorImpl.doSelect(KQueueSelectorImpl.java:122)
        at java.base@13.0.1/sun.nio.ch.SelectorImpl.lockAndDoSelect(SelectorImpl.java:124)
        at java.base@13.0.1/sun.nio.ch.SelectorImpl.select(SelectorImpl.java:136)
        at app//io.netty.channel.nio.NioEventLoop.select(NioEventLoop.java:737)
        at app//io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:392)
        at app//io.netty.util.concurrent.SingleThreadEventExecutor$5.run(SingleThreadEventExecutor.java:884)
        at app//io.netty.util.concurrent.FastThreadLocalRunnable.run(FastThreadLocalRunnable.java:30)
        at java.base@13.0.1/java.lang.Thread.run(Thread.java:830)
  11) Thread[id=3048, name=nioEventLoopGroup-5-6, state=RUNNABLE, group=TGRP-HdfsDirectoryTest]
        at java.base@13.0.1/sun.nio.ch.KQueue.poll(Native Method)
        at java.base@13.0.1/sun.nio.ch.KQueueSelectorImpl.doSelect(KQueueSelectorImpl.java:122)
        at java.base@13.0.1/sun.nio.ch.SelectorImpl.lockAndDoSelect(SelectorImpl.java:124)
        at java.base@13.0.1/sun.nio.ch.SelectorImpl.select(SelectorImpl.java:136)
        at app//io.netty.channel.nio.NioEventLoop.select(NioEventLoop.java:737)
        at app//io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:392)
        at app//io.netty.util.concurrent.SingleThreadEventExecutor$5.run(SingleThreadEventExecutor.java:884)
        at app//io.netty.util.concurrent.FastThreadLocalRunnable.run(FastThreadLocalRunnable.java:30)
        at java.base@13.0.1/java.lang.Thread.run(Thread.java:830)
  12) Thread[id=3072, name=nioEventLoopGroup-3-7, state=RUNNABLE, group=TGRP-HdfsDirectoryTest]
        at java.base@13.0.1/sun.nio.ch.KQueue.poll(Native Method)
        at java.base@13.0.1/sun.nio.ch.KQueueSelectorImpl.doSelect(KQueueSelectorImpl.java:122)
        at java.base@13.0.1/sun.nio.ch.SelectorImpl.lockAndDoSelect(SelectorImpl.java:124)
        at java.base@13.0.1/sun.nio.ch.SelectorImpl.select(SelectorImpl.java:136)
        at app//io.netty.channel.nio.NioEventLoop.select(NioEventLoop.java:737)
        at app//io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:392)
        at app//io.netty.util.concurrent.SingleThreadEventExecutor$5.run(SingleThreadEventExecutor.java:884)
        at app//io.netty.util.concurrent.FastThreadLocalRunnable.run(FastThreadLocalRunnable.java:30)
        at java.base@13.0.1/java.lang.Thread.run(Thread.java:830)
  13) Thread[id=3039, name=nioEventLoopGroup-4-9, state=RUNNABLE, group=TGRP-HdfsDirectoryTest]
        at java.base@13.0.1/sun.nio.ch.KQueue.poll(Native Method)
        at java.base@13.0.1/sun.nio.ch.KQueueSelectorImpl.doSelect(KQueueSelectorImpl.java:122)
        at java.base@13.0.1/sun.nio.ch.SelectorImpl.lockAndDoSelect(SelectorImpl.java:124)
        at java.base@13.0.1/sun.nio.ch.SelectorImpl.select(SelectorImpl.java:136)
        at app//io.netty.channel.nio.NioEventLoop.select(NioEventLoop.java:737)
        at app//io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:392)
        at app//io.netty.util.concurrent.SingleThreadEventExecutor$5.run(SingleThreadEventExecutor.java:884)
        at app//io.netty.util.concurrent.FastThreadLocalRunnable.run(FastThreadLocalRunnable.java:30)
        at java.base@13.0.1/java.lang.Thread.run(Thread.java:830)
  14) Thread[id=3052, name=nioEventLoopGroup-5-10, state=RUNNABLE, group=TGRP-HdfsDirectoryTest]
        at java.base@13.0.1/sun.nio.ch.KQueue.poll(Native Method)
        at java.base@13.0.1/sun.nio.ch.KQueueSelectorImpl.doSelect(KQueueSelectorImpl.java:122)
        at java.base@13.0.1/sun.nio.ch.SelectorImpl.lockAndDoSelect(SelectorImpl.java:124)
        at java.base@13.0.1/sun.nio.ch.SelectorImpl.select(SelectorImpl.java:136)
        at app//io.netty.channel.nio.NioEventLoop.select(NioEventLoop.java:737)
        at app//io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:392)
        at app//io.netty.util.concurrent.SingleThreadEventExecutor$5.run(SingleThreadEventExecutor.java:884)
        at app//io.netty.util.concurrent.FastThreadLocalRunnable.run(FastThreadLocalRunnable.java:30)
        at java.base@13.0.1/java.lang.Thread.run(Thread.java:830)
  15) Thread[id=3068, name=nioEventLoopGroup-3-3, state=RUNNABLE, group=TGRP-HdfsDirectoryTest]
        at java.base@13.0.1/sun.nio.ch.KQueue.poll(Native Method)
        at java.base@13.0.1/sun.nio.ch.KQueueSelectorImpl.doSelect(KQueueSelectorImpl.java:122)
        at java.base@13.0.1/sun.nio.ch.SelectorImpl.lockAndDoSelect(SelectorImpl.java:124)
        at java.base@13.0.1/sun.nio.ch.SelectorImpl.select(SelectorImpl.java:136)
        at app//io.netty.channel.nio.NioEventLoop.select(NioEventLoop.java:737)
        at app//io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:392)
        at app//io.netty.util.concurrent.SingleThreadEventExecutor$5.run(SingleThreadEventExecutor.java:884)
        at app//io.netty.util.concurrent.FastThreadLocalRunnable.run(FastThreadLocalRunnable.java:30)
        at java.base@13.0.1/java.lang.Thread.run(Thread.java:830)
  16) Thread[id=3055, name=nioEventLoopGroup-2-2, state=RUNNABLE, group=TGRP-HdfsDirectoryTest]
        at java.base@13.0.1/sun.nio.ch.KQueue.poll(Native Method)
        at java.base@13.0.1/sun.nio.ch.KQueueSelectorImpl.doSelect(KQueueSelectorImpl.java:122)
        at java.base@13.0.1/sun.nio.ch.SelectorImpl.lockAndDoSelect(SelectorImpl.java:124)
        at java.base@13.0.1/sun.nio.ch.SelectorImpl.select(SelectorImpl.java:136)
        at app//io.netty.channel.nio.NioEventLoop.select(NioEventLoop.java:737)
        at app//io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:392)
        at app//io.netty.util.concurrent.SingleThreadEventExecutor$5.run(SingleThreadEventExecutor.java:884)
        at app//io.netty.util.concurrent.FastThreadLocalRunnable.run(FastThreadLocalRunnable.java:30)
        at java.base@13.0.1/java.lang.Thread.run(Thread.java:830)
  17) Thread[id=3049, name=nioEventLoopGroup-5-7, state=RUNNABLE, group=TGRP-HdfsDirectoryTest]
        at java.base@13.0.1/sun.nio.ch.KQueue.poll(Native Method)
        at java.base@13.0.1/sun.nio.ch.KQueueSelectorImpl.doSelect(KQueueSelectorImpl.java:122)
        at java.base@13.0.1/sun.nio.ch.SelectorImpl.lockAndDoSelect(SelectorImpl.java:124)
        at java.base@13.0.1/sun.nio.ch.SelectorImpl.select(SelectorImpl.java:136)
        at app//io.netty.channel.nio.NioEventLoop.select(NioEventLoop.java:737)
        at app//io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:392)
        at app//io.netty.util.concurrent.SingleThreadEventExecutor$5.run(SingleThreadEventExecutor.java:884)
        at app//io.netty.util.concurrent.FastThreadLocalRunnable.run(FastThreadLocalRunnable.java:30)
        at java.base@13.0.1/java.lang.Thread.run(Thread.java:830)
  18) Thread[id=3036, name=nioEventLoopGroup-4-6, state=RUNNABLE, group=TGRP-HdfsDirectoryTest]
        at java.base@13.0.1/sun.nio.ch.KQueue.poll(Native Method)
        at java.base@13.0.1/sun.nio.ch.KQueueSelectorImpl.doSelect(KQueueSelectorImpl.java:122)
        at java.base@13.0.1/sun.nio.ch.SelectorImpl.lockAndDoSelect(SelectorImpl.java:124)
        at java.base@13.0.1/sun.nio.ch.SelectorImpl.select(SelectorImpl.java:136)
        at app//io.netty.channel.nio.NioEventLoop.select(NioEventLoop.java:737)
        at app//io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:392)
        at app//io.netty.util.concurrent.SingleThreadEventExecutor$5.run(SingleThreadEventExecutor.java:884)
        at app//io.netty.util.concurrent.FastThreadLocalRunnable.run(FastThreadLocalRunnable.java:30)
        at java.base@13.0.1/java.lang.Thread.run(Thread.java:830)
  19) Thread[id=3067, name=nioEventLoopGroup-3-2, state=RUNNABLE, group=TGRP-HdfsDirectoryTest]
        at java.base@13.0.1/sun.nio.ch.KQueue.poll(Native Method)
        at java.base@13.0.1/sun.nio.ch.KQueueSelectorImpl.doSelect(KQueueSelectorImpl.java:122)
        at java.base@13.0.1/sun.nio.ch.SelectorImpl.lockAndDoSelect(SelectorImpl.java:124)
        at java.base@13.0.1/sun.nio.ch.SelectorImpl.select(SelectorImpl.java:136)
        at app//io.netty.channel.nio.NioEventLoop.select(NioEventLoop.java:737)
        at app//io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:392)
        at app//io.netty.util.concurrent.SingleThreadEventExecutor$5.run(SingleThreadEventExecutor.java:884)
        at app//io.netty.util.concurrent.FastThreadLocalRunnable.run(FastThreadLocalRunnable.java:30)
        at java.base@13.0.1/java.lang.Thread.run(Thread.java:830)
  20) Thread[id=3044, name=nioEventLoopGroup-5-2, state=RUNNABLE, group=TGRP-HdfsDirectoryTest]
        at java.base@13.0.1/sun.nio.ch.KQueue.poll(Native Method)
        at java.base@13.0.1/sun.nio.ch.KQueueSelectorImpl.doSelect(KQueueSelectorImpl.java:122)
        at java.base@13.0.1/sun.nio.ch.SelectorImpl.lockAndDoSelect(SelectorImpl.java:124)
        at java.base@13.0.1/sun.nio.ch.SelectorImpl.select(SelectorImpl.java:136)
        at app//io.netty.channel.nio.NioEventLoop.select(NioEventLoop.java:737)
        at app//io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:392)
        at app//io.netty.util.concurrent.SingleThreadEventExecutor$5.run(SingleThreadEventExecutor.java:884)
        at app//io.netty.util.concurrent.FastThreadLocalRunnable.run(FastThreadLocalRunnable.java:30)
        at java.base@13.0.1/java.lang.Thread.run(Thread.java:830)
  21) Thread[id=3063, name=nioEventLoopGroup-2-10, state=RUNNABLE, group=TGRP-HdfsDirectoryTest]
        at java.base@13.0.1/sun.nio.ch.KQueue.poll(Native Method)
        at java.base@13.0.1/sun.nio.ch.KQueueSelectorImpl.doSelect(KQueueSelectorImpl.java:122)
        at java.base@13.0.1/sun.nio.ch.SelectorImpl.lockAndDoSelect(SelectorImpl.java:124)
        at java.base@13.0.1/sun.nio.ch.SelectorImpl.select(SelectorImpl.java:136)
        at app//io.netty.channel.nio.NioEventLoop.select(NioEventLoop.java:737)
        at app//io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:392)
        at app//io.netty.util.concurrent.SingleThreadEventExecutor$5.run(SingleThreadEventExecutor.java:884)
        at app//io.netty.util.concurrent.FastThreadLocalRunnable.run(FastThreadLocalRunnable.java:30)
        at java.base@13.0.1/java.lang.Thread.run(Thread.java:830)
  22) Thread[id=3070, name=nioEventLoopGroup-3-5, state=RUNNABLE, group=TGRP-HdfsDirectoryTest]
        at java.base@13.0.1/sun.nio.ch.KQueue.poll(Native Method)
        at java.base@13.0.1/sun.nio.ch.KQueueSelectorImpl.doSelect(KQueueSelectorImpl.java:122)
        at java.base@13.0.1/sun.nio.ch.SelectorImpl.lockAndDoSelect(SelectorImpl.java:124)
        at java.base@13.0.1/sun.nio.ch.SelectorImpl.select(SelectorImpl.java:136)
        at app//io.netty.channel.nio.NioEventLoop.select(NioEventLoop.java:737)
        at app//io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:392)
        at app//io.netty.util.concurrent.SingleThreadEventExecutor$5.run(SingleThreadEventExecutor.java:884)
        at app//io.netty.util.concurrent.FastThreadLocalRunnable.run(FastThreadLocalRunnable.java:30)
        at java.base@13.0.1/java.lang.Thread.run(Thread.java:830)
  23) Thread[id=3051, name=nioEventLoopGroup-5-9, state=RUNNABLE, group=TGRP-HdfsDirectoryTest]
        at java.base@13.0.1/sun.nio.ch.KQueue.poll(Native Method)
        at java.base@13.0.1/sun.nio.ch.KQueueSelectorImpl.doSelect(KQueueSelectorImpl.java:122)
        at java.base@13.0.1/sun.nio.ch.SelectorImpl.lockAndDoSelect(SelectorImpl.java:124)
        at java.base@13.0.1/sun.nio.ch.SelectorImpl.select(SelectorImpl.java:136)
        at app//io.netty.channel.nio.NioEventLoop.select(NioEventLoop.java:737)
        at app//io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:392)
        at app//io.netty.util.concurrent.SingleThreadEventExecutor$5.run(SingleThreadEventExecutor.java:884)
        at app//io.netty.util.concurrent.FastThreadLocalRunnable.run(FastThreadLocalRunnable.java:30)
        at java.base@13.0.1/java.lang.Thread.run(Thread.java:830)
  24) Thread[id=3057, name=nioEventLoopGroup-2-4, state=RUNNABLE, group=TGRP-HdfsDirectoryTest]
        at java.base@13.0.1/sun.nio.ch.KQueue.poll(Native Method)
        at java.base@13.0.1/sun.nio.ch.KQueueSelectorImpl.doSelect(KQueueSelectorImpl.java:122)
        at java.base@13.0.1/sun.nio.ch.SelectorImpl.lockAndDoSelect(SelectorImpl.java:124)
        at java.base@13.0.1/sun.nio.ch.SelectorImpl.select(SelectorImpl.java:136)
        at app//io.netty.channel.nio.NioEventLoop.select(NioEventLoop.java:737)
        at app//io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:392)
        at app//io.netty.util.concurrent.SingleThreadEventExecutor$5.run(SingleThreadEventExecutor.java:884)
        at app//io.netty.util.concurrent.FastThreadLocalRunnable.run(FastThreadLocalRunnable.java:30)
        at java.base@13.0.1/java.lang.Thread.run(Thread.java:830)
  25) Thread[id=3071, name=nioEventLoopGroup-3-6, state=RUNNABLE, group=TGRP-HdfsDirectoryTest]
        at java.base@13.0.1/sun.nio.ch.KQueue.poll(Native Method)
        at java.base@13.0.1/sun.nio.ch.KQueueSelectorImpl.doSelect(KQueueSelectorImpl.java:122)
        at java.base@13.0.1/sun.nio.ch.SelectorImpl.lockAndDoSelect(SelectorImpl.java:124)
        at java.base@13.0.1/sun.nio.ch.SelectorImpl.select(SelectorImpl.java:136)
        at app//io.netty.channel.nio.NioEventLoop.select(NioEventLoop.java:737)
        at app//io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:392)
        at app//io.netty.util.concurrent.SingleThreadEventExecutor$5.run(SingleThreadEventExecutor.java:884)
        at app//io.netty.util.concurrent.FastThreadLocalRunnable.run(FastThreadLocalRunnable.java:30)
        at java.base@13.0.1/java.lang.Thread.run(Thread.java:830)
  26) Thread[id=3076, name=nioEventLoopGroup-3-11, state=RUNNABLE, group=TGRP-HdfsDirectoryTest]
        at java.base@13.0.1/sun.nio.ch.KQueue.poll(Native Method)
        at java.base@13.0.1/sun.nio.ch.KQueueSelectorImpl.doSelect(KQueueSelectorImpl.java:122)
        at java.base@13.0.1/sun.nio.ch.SelectorImpl.lockAndDoSelect(SelectorImpl.java:124)
        at java.base@13.0.1/sun.nio.ch.SelectorImpl.select(SelectorImpl.java:136)
        at app//io.netty.channel.nio.NioEventLoop.select(NioEventLoop.java:737)
        at app//io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:392)
        at app//io.netty.util.concurrent.SingleThreadEventExecutor$5.run(SingleThreadEventExecutor.java:884)
        at app//io.netty.util.concurrent.FastThreadLocalRunnable.run(FastThreadLocalRunnable.java:30)
        at java.base@13.0.1/java.lang.Thread.run(Thread.java:830)
  27) Thread[id=3046, name=nioEventLoopGroup-5-4, state=RUNNABLE, group=TGRP-HdfsDirectoryTest]
        at java.base@13.0.1/sun.nio.ch.KQueue.poll(Native Method)
        at java.base@13.0.1/sun.nio.ch.KQueueSelectorImpl.doSelect(KQueueSelectorImpl.java:122)
        at java.base@13.0.1/sun.nio.ch.SelectorImpl.lockAndDoSelect(SelectorImpl.java:124)
        at java.base@13.0.1/sun.nio.ch.SelectorImpl.select(SelectorImpl.java:136)
        at app//io.netty.channel.nio.NioEventLoop.select(NioEventLoop.java:737)
        at app//io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:392)
        at app//io.netty.util.concurrent.SingleThreadEventExecutor$5.run(SingleThreadEventExecutor.java:884)
        at app//io.netty.util.concurrent.FastThreadLocalRunnable.run(FastThreadLocalRunnable.java:30)
        at java.base@13.0.1/java.lang.Thread.run(Thread.java:830)
  28) Thread[id=3054, name=nioEventLoopGroup-5-12, state=RUNNABLE, group=TGRP-HdfsDirectoryTest]
        at java.base@13.0.1/sun.nio.ch.KQueue.poll(Native Method)
        at java.base@13.0.1/sun.nio.ch.KQueueSelectorImpl.doSelect(KQueueSelectorImpl.java:122)
        at java.base@13.0.1/sun.nio.ch.SelectorImpl.lockAndDoSelect(SelectorImpl.java:124)
        at java.base@13.0.1/sun.nio.ch.SelectorImpl.select(SelectorImpl.java:136)
        at app//io.netty.channel.nio.NioEventLoop.select(NioEventLoop.java:737)
        at app//io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:392)
        at app//io.netty.util.concurrent.SingleThreadEventExecutor$5.run(SingleThreadEventExecutor.java:884)
        at app//io.netty.util.concurrent.FastThreadLocalRunnable.run(FastThreadLocalRunnable.java:30)
        at java.base@13.0.1/java.lang.Thread.run(Thread.java:830)
  29) Thread[id=3065, name=nioEventLoopGroup-2-12, state=RUNNABLE, group=TGRP-HdfsDirectoryTest]
        at java.base@13.0.1/sun.nio.ch.KQueue.poll(Native Method)
        at java.base@13.0.1/sun.nio.ch.KQueueSelectorImpl.doSelect(KQueueSelectorImpl.java:122)
        at java.base@13.0.1/sun.nio.ch.SelectorImpl.lockAndDoSelect(SelectorImpl.java:124)
        at java.base@13.0.1/sun.nio.ch.SelectorImpl.select(SelectorImpl.java:136)
        at app//io.netty.channel.nio.NioEventLoop.select(NioEventLoop.java:737)
        at app//io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:392)
        at app//io.netty.util.concurrent.SingleThreadEventExecutor$5.run(SingleThreadEventExecutor.java:884)
        at app//io.netty.util.concurrent.FastThreadLocalRunnable.run(FastThreadLocalRunnable.java:30)
        at java.base@13.0.1/java.lang.Thread.run(Thread.java:830)
  30) Thread[id=3042, name=nioEventLoopGroup-4-12, state=RUNNABLE, group=TGRP-HdfsDirectoryTest]
        at java.base@13.0.1/sun.nio.ch.KQueue.poll(Native Method)
        at java.base@13.0.1/sun.nio.ch.KQueueSelectorImpl.doSelect(KQueueSelectorImpl.java:122)
        at java.base@13.0.1/sun.nio.ch.SelectorImpl.lockAndDoSelect(SelectorImpl.java:124)
        at java.base@13.0.1/sun.nio.ch.SelectorImpl.select(SelectorImpl.java:136)
        at app//io.netty.channel.nio.NioEventLoop.select(NioEventLoop.java:737)
        at app//io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:392)
        at app//io.netty.util.concurrent.SingleThreadEventExecutor$5.run(SingleThreadEventExecutor.java:884)
        at app//io.netty.util.concurrent.FastThreadLocalRunnable.run(FastThreadLocalRunnable.java:30)
        at java.base@13.0.1/java.lang.Thread.run(Thread.java:830)
  31) Thread[id=3032, name=nioEventLoopGroup-4-2, state=RUNNABLE, group=TGRP-HdfsDirectoryTest]
        at java.base@13.0.1/sun.nio.ch.KQueue.poll(Native Method)
        at java.base@13.0.1/sun.nio.ch.KQueueSelectorImpl.doSelect(KQueueSelectorImpl.java:122)
        at java.base@13.0.1/sun.nio.ch.SelectorImpl.lockAndDoSelect(SelectorImpl.java:124)
        at java.base@13.0.1/sun.nio.ch.SelectorImpl.select(SelectorImpl.java:136)
        at app//io.netty.channel.nio.NioEventLoop.select(NioEventLoop.java:737)
        at app//io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:392)
        at app//io.netty.util.concurrent.SingleThreadEventExecutor$5.run(SingleThreadEventExecutor.java:884)
        at app//io.netty.util.concurrent.FastThreadLocalRunnable.run(FastThreadLocalRunnable.java:30)
        at java.base@13.0.1/java.lang.Thread.run(Thread.java:830)
  32) Thread[id=3066, name=nioEventLoopGroup-3-1, state=RUNNABLE, group=TGRP-HdfsDirectoryTest]
        at java.base@13.0.1/sun.nio.ch.KQueue.poll(Native Method)
        at java.base@13.0.1/sun.nio.ch.KQueueSelectorImpl.doSelect(KQueueSelectorImpl.java:122)
        at java.base@13.0.1/sun.nio.ch.SelectorImpl.lockAndDoSelect(SelectorImpl.java:124)
        at java.base@13.0.1/sun.nio.ch.SelectorImpl.select(SelectorImpl.java:136)
        at app//io.netty.channel.nio.NioEventLoop.select(NioEventLoop.java:737)
        at app//io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:392)
        at app//io.netty.util.concurrent.SingleThreadEventExecutor$5.run(SingleThreadEventExecutor.java:884)
        at app//io.netty.util.concurrent.FastThreadLocalRunnable.run(FastThreadLocalRunnable.java:30)
        at java.base@13.0.1/java.lang.Thread.run(Thread.java:830)
  33) Thread[id=3050, name=nioEventLoopGroup-5-8, state=RUNNABLE, group=TGRP-HdfsDirectoryTest]
        at java.base@13.0.1/sun.nio.ch.KQueue.poll(Native Method)
        at java.base@13.0.1/sun.nio.ch.KQueueSelectorImpl.doSelect(KQueueSelectorImpl.java:122)
        at java.base@13.0.1/sun.nio.ch.SelectorImpl.lockAndDoSelect(SelectorImpl.java:124)
        at java.base@13.0.1/sun.nio.ch.SelectorImpl.select(SelectorImpl.java:136)
        at app//io.netty.channel.nio.NioEventLoop.select(NioEventLoop.java:737)
        at app//io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:392)
        at app//io.netty.util.concurrent.SingleThreadEventExecutor$5.run(SingleThreadEventExecutor.java:884)
        at app//io.netty.util.concurrent.FastThreadLocalRunnable.run(FastThreadLocalRunnable.java:30)
        at java.base@13.0.1/java.lang.Thread.run(Thread.java:830)
  34) Thread[id=3053, name=nioEventLoopGroup-5-11, state=RUNNABLE, group=TGRP-HdfsDirectoryTest]
        at java.base@13.0.1/sun.nio.ch.KQueue.poll(Native Method)
        at java.base@13.0.1/sun.nio.ch.KQueueSelectorImpl.doSelect(KQueueSelectorImpl.java:122)
        at java.base@13.0.1/sun.nio.ch.SelectorImpl.lockAndDoSelect(SelectorImpl.java:124)
        at java.base@13.0.1/sun.nio.ch.SelectorImpl.select(SelectorImpl.java:136)
        at app//io.netty.channel.nio.NioEventLoop.select(NioEventLoop.java:737)
        at app//io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:392)
        at app//io.netty.util.concurrent.SingleThreadEventExecutor$5.run(SingleThreadEventExecutor.java:884)
        at app//io.netty.util.concurrent.FastThreadLocalRunnable.run(FastThreadLocalRunnable.java:30)
        at java.base@13.0.1/java.lang.Thread.run(Thread.java:830)
  35) Thread[id=3074, name=nioEventLoopGroup-3-9, state=RUNNABLE, group=TGRP-HdfsDirectoryTest]
        at java.base@13.0.1/sun.nio.ch.KQueue.poll(Native Method)
        at java.base@13.0.1/sun.nio.ch.KQueueSelectorImpl.doSelect(KQueueSelectorImpl.java:122)
        at java.base@13.0.1/sun.nio.ch.SelectorImpl.lockAndDoSelect(SelectorImpl.java:124)
        at java.base@13.0.1/sun.nio.ch.SelectorImpl.select(SelectorImpl.java:136)
        at app//io.netty.channel.nio.NioEventLoop.select(NioEventLoop.java:737)
        at app//io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:392)
        at app//io.netty.util.concurrent.SingleThreadEventExecutor$5.run(SingleThreadEventExecutor.java:884)
        at app//io.netty.util.concurrent.FastThreadLocalRunnable.run(FastThreadLocalRunnable.java:30)
        at java.base@13.0.1/java.lang.Thread.run(Thread.java:830)
  36) Thread[id=3038, name=nioEventLoopGroup-4-8, state=RUNNABLE, group=TGRP-HdfsDirectoryTest]
        at java.base@13.0.1/sun.nio.ch.KQueue.poll(Native Method)
        at java.base@13.0.1/sun.nio.ch.KQueueSelectorImpl.doSelect(KQueueSelectorImpl.java:122)
        at java.base@13.0.1/sun.nio.ch.SelectorImpl.lockAndDoSelect(SelectorImpl.java:124)
        at java.base@13.0.1/sun.nio.ch.SelectorImpl.select(SelectorImpl.java:136)
        at app//io.netty.channel.nio.NioEventLoop.select(NioEventLoop.java:737)
        at app//io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:392)
        at app//io.netty.util.concurrent.SingleThreadEventExecutor$5.run(SingleThreadEventExecutor.java:884)
        at app//io.netty.util.concurrent.FastThreadLocalRunnable.run(FastThreadLocalRunnable.java:30)
        at java.base@13.0.1/java.lang.Thread.run(Thread.java:830)
  37) Thread[id=3062, name=nioEventLoopGroup-2-9, state=RUNNABLE, group=TGRP-HdfsDirectoryTest]
        at java.base@13.0.1/sun.nio.ch.KQueue.poll(Native Method)
        at java.base@13.0.1/sun.nio.ch.KQueueSelectorImpl.doSelect(KQueueSelectorImpl.java:122)
        at java.base@13.0.1/sun.nio.ch.SelectorImpl.lockAndDoSelect(SelectorImpl.java:124)
        at java.base@13.0.1/sun.nio.ch.SelectorImpl.select(SelectorImpl.java:136)
        at app//io.netty.channel.nio.NioEventLoop.select(NioEventLoop.java:737)
        at app//io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:392)
        at app//io.netty.util.concurrent.SingleThreadEventExecutor$5.run(SingleThreadEventExecutor.java:884)
        at app//io.netty.util.concurrent.FastThreadLocalRunnable.run(FastThreadLocalRunnable.java:30)
        at java.base@13.0.1/java.lang.Thread.run(Thread.java:830)
  38) Thread[id=3064, name=nioEventLoopGroup-2-11, state=RUNNABLE, group=TGRP-HdfsDirectoryTest]
        at java.base@13.0.1/sun.nio.ch.KQueue.poll(Native Method)
        at java.base@13.0.1/sun.nio.ch.KQueueSelectorImpl.doSelect(KQueueSelectorImpl.java:122)
        at java.base@13.0.1/sun.nio.ch.SelectorImpl.lockAndDoSelect(SelectorImpl.java:124)
        at java.base@13.0.1/sun.nio.ch.SelectorImpl.select(SelectorImpl.java:136)
        at app//io.netty.channel.nio.NioEventLoop.select(NioEventLoop.java:737)
        at app//io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:392)
        at app//io.netty.util.concurrent.SingleThreadEventExecutor$5.run(SingleThreadEventExecutor.java:884)
        at app//io.netty.util.concurrent.FastThreadLocalRunnable.run(FastThreadLocalRunnable.java:30)
        at java.base@13.0.1/java.lang.Thread.run(Thread.java:830)
  39) Thread[id=3037, name=nioEventLoopGroup-4-7, state=RUNNABLE, group=TGRP-HdfsDirectoryTest]
        at java.base@13.0.1/sun.nio.ch.KQueue.poll(Native Method)
        at java.base@13.0.1/sun.nio.ch.KQueueSelectorImpl.doSelect(KQueueSelectorImpl.java:122)
        at java.base@13.0.1/sun.nio.ch.SelectorImpl.lockAndDoSelect(SelectorImpl.java:124)
        at java.base@13.0.1/sun.nio.ch.SelectorImpl.select(SelectorImpl.java:136)
        at app//io.netty.channel.nio.NioEventLoop.select(NioEventLoop.java:737)
        at app//io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:392)
        at app//io.netty.util.concurrent.SingleThreadEventExecutor$5.run(SingleThreadEventExecutor.java:884)
        at app//io.netty.util.concurrent.FastThreadLocalRunnable.run(FastThreadLocalRunnable.java:30)
        at java.base@13.0.1/java.lang.Thread.run(Thread.java:830)
  40) Thread[id=3058, name=nioEventLoopGroup-2-5, state=RUNNABLE, group=TGRP-HdfsDirectoryTest]
        at java.base@13.0.1/sun.nio.ch.KQueue.poll(Native Method)
        at java.base@13.0.1/sun.nio.ch.KQueueSelectorImpl.doSelect(KQueueSelectorImpl.java:122)
        at java.base@13.0.1/sun.nio.ch.SelectorImpl.lockAndDoSelect(SelectorImpl.java:124)
        at java.base@13.0.1/sun.nio.ch.SelectorImpl.select(SelectorImpl.java:136)
        at app//io.netty.channel.nio.NioEventLoop.select(NioEventLoop.java:737)
        at app//io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:392)
        at app//io.netty.util.concurrent.SingleThreadEventExecutor$5.run(SingleThreadEventExecutor.java:884)
        at app//io.netty.util.concurrent.FastThreadLocalRunnable.run(FastThreadLocalRunnable.java:30)
        at java.base@13.0.1/java.lang.Thread.run(Thread.java:830)
  41) Thread[id=3077, name=nioEventLoopGroup-3-12, state=RUNNABLE, group=TGRP-HdfsDirectoryTest]
        at java.base@13.0.1/sun.nio.ch.KQueue.poll(Native Method)
        at java.base@13.0.1/sun.nio.ch.KQueueSelectorImpl.doSelect(KQueueSelectorImpl.java:122)
        at java.base@13.0.1/sun.nio.ch.SelectorImpl.lockAndDoSelect(SelectorImpl.java:124)
        at java.base@13.0.1/sun.nio.ch.SelectorImpl.select(SelectorImpl.java:136)
        at app//io.netty.channel.nio.NioEventLoop.select(NioEventLoop.java:737)
        at app//io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:392)
        at app//io.netty.util.concurrent.SingleThreadEventExecutor$5.run(SingleThreadEventExecutor.java:884)
        at app//io.netty.util.concurrent.FastThreadLocalRunnable.run(FastThreadLocalRunnable.java:30)
        at java.base@13.0.1/java.lang.Thread.run(Thread.java:830)
  42) Thread[id=2888, name=nioEventLoopGroup-4-1, state=RUNNABLE, group=TGRP-HdfsDirectoryTest]
        at java.base@13.0.1/sun.nio.ch.KQueue.poll(Native Method)
        at java.base@13.0.1/sun.nio.ch.KQueueSelectorImpl.doSelect(KQueueSelectorImpl.java:122)
        at java.base@13.0.1/sun.nio.ch.SelectorImpl.lockAndDoSelect(SelectorImpl.java:124)
        at java.base@13.0.1/sun.nio.ch.SelectorImpl.select(SelectorImpl.java:136)
        at app//io.netty.channel.nio.NioEventLoop.select(NioEventLoop.java:737)
        at app//io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:392)
        at app//io.netty.util.concurrent.SingleThreadEventExecutor$5.run(SingleThreadEventExecutor.java:884)
        at app//io.netty.util.concurrent.FastThreadLocalRunnable.run(FastThreadLocalRunnable.java:30)
        at java.base@13.0.1/java.lang.Thread.run(Thread.java:830)
  43) Thread[id=2858, name=nioEventLoopGroup-2-1, state=RUNNABLE, group=TGRP-HdfsDirectoryTest]
        at java.base@13.0.1/sun.nio.ch.KQueue.poll(Native Method)
        at java.base@13.0.1/sun.nio.ch.KQueueSelectorImpl.doSelect(KQueueSelectorImpl.java:122)
        at java.base@13.0.1/sun.nio.ch.SelectorImpl.lockAndDoSelect(SelectorImpl.java:124)
        at java.base@13.0.1/sun.nio.ch.SelectorImpl.select(SelectorImpl.java:136)
        at app//io.netty.channel.nio.NioEventLoop.select(NioEventLoop.java:737)
        at app//io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:392)
        at app//io.netty.util.concurrent.SingleThreadEventExecutor$5.run(SingleThreadEventExecutor.java:884)
        at app//io.netty.util.concurrent.FastThreadLocalRunnable.run(FastThreadLocalRunnable.java:30)
        at java.base@13.0.1/java.lang.Thread.run(Thread.java:830)
  44) Thread[id=3045, name=nioEventLoopGroup-5-3, state=RUNNABLE, group=TGRP-HdfsDirectoryTest]
        at java.base@13.0.1/sun.nio.ch.KQueue.poll(Native Method)
        at java.base@13.0.1/sun.nio.ch.KQueueSelectorImpl.doSelect(KQueueSelectorImpl.java:122)
        at java.base@13.0.1/sun.nio.ch.SelectorImpl.lockAndDoSelect(SelectorImpl.java:124)
        at java.base@13.0.1/sun.nio.ch.SelectorImpl.select(SelectorImpl.java:136)
        at app//io.netty.channel.nio.NioEventLoop.select(NioEventLoop.java:737)
        at app//io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:392)
        at app//io.netty.util.concurrent.SingleThreadEventExecutor$5.run(SingleThreadEventExecutor.java:884)
        at app//io.netty.util.concurrent.FastThreadLocalRunnable.run(FastThreadLocalRunnable.java:30)
        at java.base@13.0.1/java.lang.Thread.run(Thread.java:830)
  45) Thread[id=3040, name=nioEventLoopGroup-4-10, state=RUNNABLE, group=TGRP-HdfsDirectoryTest]
        at java.base@13.0.1/sun.nio.ch.KQueue.poll(Native Method)
        at java.base@13.0.1/sun.nio.ch.KQueueSelectorImpl.doSelect(KQueueSelectorImpl.java:122)
        at java.base@13.0.1/sun.nio.ch.SelectorImpl.lockAndDoSelect(SelectorImpl.java:124)
        at java.base@13.0.1/sun.nio.ch.SelectorImpl.select(SelectorImpl.java:136)
        at app//io.netty.channel.nio.NioEventLoop.select(NioEventLoop.java:737)
        at app//io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:392)
        at app//io.netty.util.concurrent.SingleThreadEventExecutor$5.run(SingleThreadEventExecutor.java:884)
        at app//io.netty.util.concurrent.FastThreadLocalRunnable.run(FastThreadLocalRunnable.java:30)
        at java.base@13.0.1/java.lang.Thread.run(Thread.java:830)
  46) Thread[id=3069, name=nioEventLoopGroup-3-4, state=RUNNABLE, group=TGRP-HdfsDirectoryTest]
        at java.base@13.0.1/sun.nio.ch.KQueue.poll(Native Method)
        at java.base@13.0.1/sun.nio.ch.KQueueSelectorImpl.doSelect(KQueueSelectorImpl.java:122)
        at java.base@13.0.1/sun.nio.ch.SelectorImpl.lockAndDoSelect(SelectorImpl.java:124)
        at java.base@13.0.1/sun.nio.ch.SelectorImpl.select(SelectorImpl.java:136)
        at app//io.netty.channel.nio.NioEventLoop.select(NioEventLoop.java:737)
        at app//io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:392)
        at app//io.netty.util.concurrent.SingleThreadEventExecutor$5.run(SingleThreadEventExecutor.java:884)
        at app//io.netty.util.concurrent.FastThreadLocalRunnable.run(FastThreadLocalRunnable.java:30)
        at java.base@13.0.1/java.lang.Thread.run(Thread.java:830)
  47) Thread[id=3061, name=nioEventLoopGroup-2-8, state=RUNNABLE, group=TGRP-HdfsDirectoryTest]
        at java.base@13.0.1/sun.nio.ch.KQueue.poll(Native Method)
        at java.base@13.0.1/sun.nio.ch.KQueueSelectorImpl.doSelect(KQueueSelectorImpl.java:122)
        at java.base@13.0.1/sun.nio.ch.SelectorImpl.lockAndDoSelect(SelectorImpl.java:124)
        at java.base@13.0.1/sun.nio.ch.SelectorImpl.select(SelectorImpl.java:136)
        at app//io.netty.channel.nio.NioEventLoop.select(NioEventLoop.java:737)
        at app//io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:392)
        at app//io.netty.util.concurrent.SingleThreadEventExecutor$5.run(SingleThreadEventExecutor.java:884)
        at app//io.netty.util.concurrent.FastThreadLocalRunnable.run(FastThreadLocalRunnable.java:30)
        at java.base@13.0.1/java.lang.Thread.run(Thread.java:830)
  48) Thread[id=

[...truncated too long message...]

.
   [junit4] <<< JVM J1: EOF ----
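
The leaked threads listed above are all Netty NIO event loop workers (nioEventLoopGroup-*) that were still parked in sun.nio.ch.KQueue.poll when the HdfsDirectoryTest suite finished. Purely as a hedged sketch of how such leaks are usually avoided in test teardown (this is not the test's actual code; the eventLoopGroup field and class name are assumptions), the groups can be shut down gracefully and awaited before the suite exits:

    import io.netty.channel.EventLoopGroup;
    import io.netty.channel.nio.NioEventLoopGroup;
    import java.util.concurrent.TimeUnit;

    public class NettyTeardownSketch {
        // Hypothetical group standing in for the leaked nioEventLoopGroup-* workers.
        private static final EventLoopGroup eventLoopGroup = new NioEventLoopGroup();

        // In a JUnit suite this would typically run from an @AfterClass hook.
        public static void shutdownEventLoops() throws InterruptedException {
            // shutdownGracefully() stops accepting new tasks and drains in-flight ones;
            // the await() gives the selector threads time to exit before the suite ends.
            eventLoopGroup.shutdownGracefully(0, 5, TimeUnit.SECONDS)
                          .await(10, TimeUnit.SECONDS);
        }
    }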

[...truncated 3 lines...]
   [junit4] JVM J0: stderr was not empty, see: /Users/jenkins/workspace/Lucene-Solr-master-MacOSX/solr/build/contrib/solr-cell/test/temp/junit4-J0-20200116_015427_38510245654676020634944.syserr
   [junit4] >>> JVM J0 emitted unexpected output (verbatim) ----
   [junit4] OpenJDK 64-Bit Server VM warning: Option UseConcMarkSweepGC was deprecated in version 9.0 and will likely be removed in a future release.
   [junit4] <<< JVM J0: EOF ----
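
The "Option UseConcMarkSweepGC was deprecated" warning above (repeated below for each forked test JVM) suggests those JVMs were launched with -XX:+UseConcMarkSweepGC on JDK 13, where CMS is deprecated. As a small, hedged sketch that is not part of the build itself, the collectors a JVM actually ends up running with can be confirmed from inside the process via the standard GC MXBeans:

    import java.lang.management.GarbageCollectorMXBean;
    import java.lang.management.ManagementFactory;

    public class PrintActiveCollectors {
        public static void main(String[] args) {
            // Prints the garbage collectors the running JVM selected (e.g. CMS, G1, Parallel).
            for (GarbageCollectorMXBean gc : ManagementFactory.getGarbageCollectorMXBeans()) {
                System.out.println(gc.getName() + " (collections so far: " + gc.getCollectionCount() + ")");
            }
        }
    }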

[...truncated 3 lines...]
   [junit4] JVM J2: stderr was not empty, see: /Users/jenkins/workspace/Lucene-Solr-master-MacOSX/solr/build/contrib/solr-cell/test/temp/junit4-J2-20200116_015427_38514651641036594561048.syserr
   [junit4] >>> JVM J2 emitted unexpected output (verbatim) ----
   [junit4] OpenJDK 64-Bit Server VM warning: Option UseConcMarkSweepGC was deprecated in version 9.0 and will likely be removed in a future release.
   [junit4] <<< JVM J2: EOF ----

[...truncated 566 lines...]
   [junit4] JVM J0: stderr was not empty, see: /Users/jenkins/workspace/Lucene-Solr-master-MacOSX/solr/build/contrib/solr-jaegertracer-configurator/test/temp/junit4-J0-20200116_015452_57411612191689792095382.syserr
   [junit4] >>> JVM J0 emitted unexpected output (verbatim) ----
   [junit4] OpenJDK 64-Bit Server VM warning: Option UseConcMarkSweepGC was deprecated in version 9.0 and will likely be removed in a future release.
   [junit4] <<< JVM J0: EOF ----

[...truncated 572 lines...]
   [junit4] JVM J2: stderr was not empty, see: /Users/jenkins/workspace/Lucene-Solr-master-MacOSX/solr/build/contrib/solr-langid/test/temp/junit4-J2-20200116_015503_8189506765271181801589.syserr
   [junit4] >>> JVM J2 emitted unexpected output (verbatim) ----
   [junit4] OpenJDK 64-Bit Server VM warning: Option UseConcMarkSweepGC was deprecated in version 9.0 and will likely be removed in a future release.
   [junit4] <<< JVM J2: EOF ----

[...truncated 9 lines...]
   [junit4] JVM J1: stderr was not empty, see: /Users/jenkins/workspace/Lucene-Solr-master-MacOSX/solr/build/contrib/solr-langid/test/temp/junit4-J1-20200116_015503_81817462915640820174872.syserr
   [junit4] >>> JVM J1 emitted unexpected output (verbatim) ----
   [junit4] OpenJDK 64-Bit Server VM warning: Option UseConcMarkSweepGC was deprecated in version 9.0 and will likely be removed in a future release.
   [junit4] <<< JVM J1: EOF ----

   [junit4] JVM J3: stderr was not empty, see: /Users/jenkins/workspace/Lucene-Solr-master-MacOSX/solr/build/contrib/solr-langid/test/temp/junit4-J3-20200116_015503_81917852401936199024043.syserr
   [junit4] >>> JVM J3 emitted unexpected output (verbatim) ----
   [junit4] OpenJDK 64-Bit Server VM warning: Option UseConcMarkSweepGC was deprecated in version 9.0 and will likely be removed in a future release.
   [junit4] <<< JVM J3: EOF ----

   [junit4] JVM J0: stderr was not empty, see: /Users/jenkins/workspace/Lucene-Solr-master-MacOSX/solr/build/contrib/solr-langid/test/temp/junit4-J0-20200116_015503_8184421136606338012824.syserr
   [junit4] >>> JVM J0 emitted unexpected output (verbatim) ----
   [junit4] OpenJDK 64-Bit Server VM warning: Option UseConcMarkSweepGC was deprecated in version 9.0 and will likely be removed in a future release.
   [junit4] <<< JVM J0: EOF ----

[...truncated 680 lines...]
   [junit4] JVM J2: stderr was not empty, see: /Users/jenkins/workspace/Lucene-Solr-master-MacOSX/solr/build/contrib/solr-ltr/test/temp/junit4-J2-20200116_015517_09517833053195933307232.syserr
   [junit4] >>> JVM J2 emitted unexpected output (verbatim) ----
   [junit4] OpenJDK 64-Bit Server VM warning: Option UseConcMarkSweepGC was deprecated in version 9.0 and will likely be removed in a future release.
   [junit4] <<< JVM J2: EOF ----

[...truncated 3 lines...]
   [junit4] JVM J4: stderr was not empty, see: /Users/jenkins/workspace/Lucene-Solr-master-MacOSX/solr/build/contrib/solr-ltr/test/temp/junit4-J4-20200116_015517_0965542588650647344768.syserr
   [junit4] >>> JVM J4 emitted unexpected output (verbatim) ----
   [junit4] OpenJDK 64-Bit Server VM warning: Option UseConcMarkSweepGC was deprecated in version 9.0 and will likely be removed in a future release.
   [junit4] <<< JVM J4: EOF ----

[...truncated 3 lines...]
   [junit4] JVM J0: stderr was not empty, see: /Users/jenkins/workspace/Lucene-Solr-master-MacOSX/solr/build/contrib/solr-ltr/test/temp/junit4-J0-20200116_015517_09410492332988076808181.syserr
   [junit4] >>> JVM J0 emitted unexpected output (verbatim) ----
   [junit4] OpenJDK 64-Bit Server VM warning: Option UseConcMarkSweepGC was deprecated in version 9.0 and will likely be removed in a future release.
   [junit4] <<< JVM J0: EOF ----

[...truncated 3 lines...]
   [junit4] JVM J1: stderr was not empty, see: /Users/jenkins/workspace/Lucene-Solr-master-MacOSX/solr/build/contrib/solr-ltr/test/temp/junit4-J1-20200116_015517_0947520605122693768035.syserr
   [junit4] >>> JVM J1 emitted unexpected output (verbatim) ----
   [junit4] OpenJDK 64-Bit Server VM warning: Option UseConcMarkSweepGC was deprecated in version 9.0 and will likely be removed in a future release.
   [junit4] <<< JVM J1: EOF ----

[...truncated 3 lines...]
   [junit4] JVM J3: stderr was not empty, see: /Users/jenkins/workspace/Lucene-Solr-master-MacOSX/solr/build/contrib/solr-ltr/test/temp/junit4-J3-20200116_015517_09512289981042044110885.syserr
   [junit4] >>> JVM J3 emitted unexpected output (verbatim) ----
   [junit4] OpenJDK 64-Bit Server VM warning: Option UseConcMarkSweepGC was deprecated in version 9.0 and will likely be removed in a future release.
   [junit4] <<< JVM J3: EOF ----

[...truncated 582 lines...]
   [junit4] JVM J0: stderr was not empty, see: /Users/jenkins/workspace/Lucene-Solr-master-MacOSX/solr/build/contrib/solr-prometheus-exporter/test/temp/junit4-J0-20200116_015623_9182996607878627663237.syserr
   [junit4] >>> JVM J0 emitted unexpected output (verbatim) ----
   [junit4] OpenJDK 64-Bit Server VM warning: Option UseConcMarkSweepGC was deprecated in version 9.0 and will likely be removed in a future release.
   [junit4] <<< JVM J0: EOF ----

[...truncated 3 lines...]
   [junit4] JVM J2: stderr was not empty, see: /Users/jenkins/workspace/Lucene-Solr-master-MacOSX/solr/build/contrib/solr-prometheus-exporter/test/temp/junit4-J2-20200116_015623_9181607766582095614068.syserr
   [junit4] >>> JVM J2 emitted unexpected output (verbatim) ----
   [junit4] OpenJDK 64-Bit Server VM warning: Option UseConcMarkSweepGC was deprecated in version 9.0 and will likely be removed in a future release.
   [junit4] <<< JVM J2: EOF ----

[...truncated 13 lines...]
   [junit4] JVM J1: stderr was not empty, see: /Users/jenkins/workspace/Lucene-Solr-master-MacOSX/solr/build/contrib/solr-prometheus-exporter/test/temp/junit4-J1-20200116_015623_9184607940980619758513.syserr
   [junit4] >>> JVM J1 emitted unexpected output (verbatim) ----
   [junit4] OpenJDK 64-Bit Server VM warning: Option UseConcMarkSweepGC was deprecated in version 9.0 and will likely be removed in a future release.
   [junit4] <<< JVM J1: EOF ----

[...truncated 3 lines...]
   [junit4] JVM J4: stderr was not empty, see: /Users/jenkins/workspace/Lucene-Solr-master-MacOSX/solr/build/contrib/solr-prometheus-exporter/test/temp/junit4-J4-20200116_015623_91915156505070799515623.syserr
   [junit4] >>> JVM J4 emitted unexpected output (verbatim) ----
   [junit4] OpenJDK 64-Bit Server VM warning: Option UseConcMarkSweepGC was deprecated in version 9.0 and will likely be removed in a future release.
   [junit4] <<< JVM J4: EOF ----

[...truncated 3 lines...]
   [junit4] JVM J3: stderr was not empty, see: /Users/jenkins/workspace/Lucene-Solr-master-MacOSX/solr/build/contrib/solr-prometheus-exporter/test/temp/junit4-J3-20200116_015623_9188549196561415240817.syserr
   [junit4] >>> JVM J3 emitted unexpected output (verbatim) ----
   [junit4] OpenJDK 64-Bit Server VM warning: Option UseConcMarkSweepGC was deprecated in version 9.0 and will likely be removed in a future release.
   [junit4] <<< JVM J3: EOF ----

[...truncated 569 lines...]
   [junit4] JVM J0: stderr was not empty, see: /Users/jenkins/workspace/Lucene-Solr-master-MacOSX/solr/build/contrib/solr-velocity/test/temp/junit4-J0-20200116_015647_81217948733903131427382.syserr
   [junit4] >>> JVM J0 emitted unexpected output (verbatim) ----
   [junit4] OpenJDK 64-Bit Server VM warning: Option UseConcMarkSweepGC was deprecated in version 9.0 and will likely be removed in a future release.
   [junit4] <<< JVM J0: EOF ----

[...truncated 35407 lines...]
BUILD FAILED
/Users/jenkins/workspace/Lucene-Solr-master-MacOSX/build.xml:634: The following error occurred while executing this line:
/Users/jenkins/workspace/Lucene-Solr-master-MacOSX/build.xml:507: The following error occurred while executing this line:
/Users/jenkins/workspace/Lucene-Solr-master-MacOSX/build.xml:494: Source checkout is dirty (unversioned/missing files) after running tests!!! Offending files:
* solr/licenses/jetty-start-9.4.24.v20191120-shaded.jar.sha1
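
The offending file is the SHA-1 checksum that the Lucene/Solr license checks keep next to each bundled jar. Purely as an illustrative sketch and not the project's actual build logic (the jar path argument is a placeholder), such a .sha1 file holds nothing more than the hex digest of the jar:

    import java.nio.file.Files;
    import java.nio.file.Path;
    import java.security.MessageDigest;

    public class Sha1OfJar {
        public static void main(String[] args) throws Exception {
            // Placeholder path; in the build this would be the shaded jetty-start jar.
            Path jar = Path.of(args[0]);
            byte[] digest = MessageDigest.getInstance("SHA-1").digest(Files.readAllBytes(jar));
            StringBuilder hex = new StringBuilder();
            for (byte b : digest) {
                hex.append(String.format("%02x", b));
            }
            System.out.println(hex); // the content written into the corresponding *.jar.sha1 file
        }
    }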

Total time: 78 minutes 29 seconds
Build step 'Invoke Ant' marked build as failure
Archiving artifacts
Setting ANT_1_8_2_HOME=/Users/jenkins/tools/hudson.tasks.Ant_AntInstallation/ANT_1.8.2
[WARNINGS] Skipping publisher since build result is FAILURE
Recording test results
Setting ANT_1_8_2_HOME=/Users/jenkins/tools/hudson.tasks.Ant_AntInstallation/ANT_1.8.2
Email was triggered for: Failure - Any
Sending email for trigger: Failure - Any
Setting ANT_1_8_2_HOME=/Users/jenkins/tools/hudson.tasks.Ant_AntInstallation/ANT_1.8.2
Setting ANT_1_8_2_HOME=/Users/jenkins/tools/hudson.tasks.Ant_AntInstallation/ANT_1.8.2
Setting ANT_1_8_2_HOME=/Users/jenkins/tools/hudson.tasks.Ant_AntInstallation/ANT_1.8.2
Setting ANT_1_8_2_HOME=/Users/jenkins/tools/hudson.tasks.Ant_AntInstallation/ANT_1.8.2
Setting ANT_1_8_2_HOME=/Users/jenkins/tools/hudson.tasks.Ant_AntInstallation/ANT_1.8.2
Setting ANT_1_8_2_HOME=/Users/jenkins/tools/hudson.tasks.Ant_AntInstallation/ANT_1.8.2

[JENKINS] Lucene-Solr-master-MacOSX (64bit/jdk-13.0.1) - Build # 5534 - Failure!

Posted by Policeman Jenkins Server <je...@thetaphi.de>.
Build: https://jenkins.thetaphi.de/job/Lucene-Solr-master-MacOSX/5534/
Java: 64bit/jdk-13.0.1 -XX:+UseCompressedOops -XX:+UseParallelGC

1 tests failed.
FAILED:  org.apache.solr.cloud.TestCryptoKeys.test

Error Message:
{   "responseHeader":{     "status":500,     "QTime":30077},   "errorMessages":["3 out of 5 the property overlay to be of version 4 within 30 seconds! Failed cores: [http://127.0.0.1:54186/collection1_shard1_replica_n3/, http://127.0.0.1:54200/collection1_shard1_replica_n5/, http://127.0.0.1:54204/collection1_shard1_replica_n7/]\n"],   "WARNING":"This response format is experimental.  It is likely to change in the future.",   "error":{     "metadata":[       "error-class","org.apache.solr.common.SolrException",       "root-error-class","org.apache.solr.common.SolrException"],     "msg":"3 out of 5 the property overlay to be of version 4 within 30 seconds! Failed cores: [http://127.0.0.1:54186/collection1_shard1_replica_n3/, http://127.0.0.1:54200/collection1_shard1_replica_n5/, http://127.0.0.1:54204/collection1_shard1_replica_n7/]",     "trace":"org.apache.solr.common.SolrException: 3 out of 5 the property overlay to be of version 4 within 30 seconds! Failed cores: [http://127.0.0.1:54186/collection1_shard1_replica_n3/, http://127.0.0.1:54200/collection1_shard1_replica_n5/, http://127.0.0.1:54204/collection1_shard1_replica_n7/]\n\tat org.apache.solr.handler.SolrConfigHandler.waitForAllReplicasState(SolrConfigHandler.java:813)\n\tat org.apache.solr.handler.SolrConfigHandler$Command.handleCommands(SolrConfigHandler.java:522)\n\tat org.apache.solr.handler.SolrConfigHandler$Command.handlePOST(SolrConfigHandler.java:363)\n\tat org.apache.solr.handler.SolrConfigHandler.handleRequestBody(SolrConfigHandler.java:139)\n\tat org.apache.solr.handler.RequestHandlerBase.handleRequest(RequestHandlerBase.java:208)\n\tat org.apache.solr.core.SolrCore.execute(SolrCore.java:2582)\n\tat org.apache.solr.servlet.HttpSolrCall.execute(HttpSolrCall.java:799)\n\tat org.apache.solr.servlet.HttpSolrCall.call(HttpSolrCall.java:578)\n\tat org.apache.solr.servlet.SolrDispatchFilter.doFilter(SolrDispatchFilter.java:419)\n\tat org.apache.solr.servlet.SolrDispatchFilter.doFilter(SolrDispatchFilter.java:351)\n\tat org.eclipse.jetty.servlet.ServletHandler$CachedChain.doFilter(ServletHandler.java:1604)\n\tat org.apache.solr.client.solrj.embedded.JettySolrRunner$DebugFilter.doFilter(JettySolrRunner.java:166)\n\tat org.eclipse.jetty.servlet.ServletHandler$CachedChain.doFilter(ServletHandler.java:1604)\n\tat org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:545)\n\tat org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233)\n\tat org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1607)\n\tat org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233)\n\tat org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1297)\n\tat org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188)\n\tat org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:485)\n\tat org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1577)\n\tat org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186)\n\tat org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1212)\n\tat org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141)\n\tat org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)\n\tat org.eclipse.jetty.rewrite.handler.RewriteHandler.handle(RewriteHandler.java:322)\n\tat org.eclipse.jetty.server.handler.gzip.GzipHandler.handle(GzipHandler.java:717)\n\tat 
org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)\n\tat org.eclipse.jetty.server.Server.handle(Server.java:500)\n\tat org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:383)\n\tat org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:547)\n\tat org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:375)\n\tat org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:270)\n\tat org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311)\n\tat org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:103)\n\tat org.eclipse.jetty.io.ChannelEndPoint$2.run(ChannelEndPoint.java:117)\n\tat org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:806)\n\tat org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:938)\n\tat java.base/java.lang.Thread.run(Thread.java:830)\n",     "code":500}}  expected null, but was:<[3 out of 5 the property overlay to be of version 4 within 30 seconds! Failed cores: [http://127.0.0.1:54186/collection1_shard1_replica_n3/, http://127.0.0.1:54200/collection1_shard1_replica_n5/, http://127.0.0.1:54204/collection1_shard1_replica_n7/] ]>

Stack Trace:
java.lang.AssertionError: {
  "responseHeader":{
    "status":500,
    "QTime":30077},
  "errorMessages":["3 out of 5 the property overlay to be of version 4 within 30 seconds! Failed cores: [http://127.0.0.1:54186/collection1_shard1_replica_n3/, http://127.0.0.1:54200/collection1_shard1_replica_n5/, http://127.0.0.1:54204/collection1_shard1_replica_n7/]\n"],
  "WARNING":"This response format is experimental.  It is likely to change in the future.",
  "error":{
    "metadata":[
      "error-class","org.apache.solr.common.SolrException",
      "root-error-class","org.apache.solr.common.SolrException"],
    "msg":"3 out of 5 the property overlay to be of version 4 within 30 seconds! Failed cores: [http://127.0.0.1:54186/collection1_shard1_replica_n3/, http://127.0.0.1:54200/collection1_shard1_replica_n5/, http://127.0.0.1:54204/collection1_shard1_replica_n7/]",
    "trace":"org.apache.solr.common.SolrException: 3 out of 5 the property overlay to be of version 4 within 30 seconds! Failed cores: [http://127.0.0.1:54186/collection1_shard1_replica_n3/, http://127.0.0.1:54200/collection1_shard1_replica_n5/, http://127.0.0.1:54204/collection1_shard1_replica_n7/]\n\tat org.apache.solr.handler.SolrConfigHandler.waitForAllReplicasState(SolrConfigHandler.java:813)\n\tat org.apache.solr.handler.SolrConfigHandler$Command.handleCommands(SolrConfigHandler.java:522)\n\tat org.apache.solr.handler.SolrConfigHandler$Command.handlePOST(SolrConfigHandler.java:363)\n\tat org.apache.solr.handler.SolrConfigHandler.handleRequestBody(SolrConfigHandler.java:139)\n\tat org.apache.solr.handler.RequestHandlerBase.handleRequest(RequestHandlerBase.java:208)\n\tat org.apache.solr.core.SolrCore.execute(SolrCore.java:2582)\n\tat org.apache.solr.servlet.HttpSolrCall.execute(HttpSolrCall.java:799)\n\tat org.apache.solr.servlet.HttpSolrCall.call(HttpSolrCall.java:578)\n\tat org.apache.solr.servlet.SolrDispatchFilter.doFilter(SolrDispatchFilter.java:419)\n\tat org.apache.solr.servlet.SolrDispatchFilter.doFilter(SolrDispatchFilter.java:351)\n\tat org.eclipse.jetty.servlet.ServletHandler$CachedChain.doFilter(ServletHandler.java:1604)\n\tat org.apache.solr.client.solrj.embedded.JettySolrRunner$DebugFilter.doFilter(JettySolrRunner.java:166)\n\tat org.eclipse.jetty.servlet.ServletHandler$CachedChain.doFilter(ServletHandler.java:1604)\n\tat org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:545)\n\tat org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233)\n\tat org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1607)\n\tat org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233)\n\tat org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1297)\n\tat org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188)\n\tat org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:485)\n\tat org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1577)\n\tat org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186)\n\tat org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1212)\n\tat org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141)\n\tat org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)\n\tat org.eclipse.jetty.rewrite.handler.RewriteHandler.handle(RewriteHandler.java:322)\n\tat org.eclipse.jetty.server.handler.gzip.GzipHandler.handle(GzipHandler.java:717)\n\tat org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)\n\tat org.eclipse.jetty.server.Server.handle(Server.java:500)\n\tat org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:383)\n\tat org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:547)\n\tat org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:375)\n\tat org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:270)\n\tat org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311)\n\tat org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:103)\n\tat org.eclipse.jetty.io.ChannelEndPoint$2.run(ChannelEndPoint.java:117)\n\tat org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:806)\n\tat org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:938)\n\tat 
java.base/java.lang.Thread.run(Thread.java:830)\n",
    "code":500}}
 expected null, but was:<[3 out of 5 the property overlay to be of version 4 within 30 seconds! Failed cores: [http://127.0.0.1:54186/collection1_shard1_replica_n3/, http://127.0.0.1:54200/collection1_shard1_replica_n5/, http://127.0.0.1:54204/collection1_shard1_replica_n7/]
]>
	at __randomizedtesting.SeedInfo.seed([3414475A001D7ABB:BC407880AEE11743]:0)
	at org.junit.Assert.fail(Assert.java:88)
	at org.junit.Assert.failNotNull(Assert.java:755)
	at org.junit.Assert.assertNull(Assert.java:737)
	at org.apache.solr.core.TestSolrConfigHandler.runConfigCommand(TestSolrConfigHandler.java:179)
	at org.apache.solr.cloud.TestCryptoKeys.test(TestCryptoKeys.java:180)
	at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
	at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.base/java.lang.reflect.Method.invoke(Method.java:567)
	at com.carrotsearch.randomizedtesting.RandomizedRunner.invoke(RandomizedRunner.java:1754)
	at com.carrotsearch.randomizedtesting.RandomizedRunner$8.evaluate(RandomizedRunner.java:942)
	at com.carrotsearch.randomizedtesting.RandomizedRunner$9.evaluate(RandomizedRunner.java:978)
	at com.carrotsearch.randomizedtesting.RandomizedRunner$10.evaluate(RandomizedRunner.java:992)
	at org.apache.solr.BaseDistributedSearchTestCase$ShardsRepeatRule$ShardsFixedStatement.callStatement(BaseDistributedSearchTestCase.java:1082)
	at org.apache.solr.BaseDistributedSearchTestCase$ShardsRepeatRule$ShardsStatement.evaluate(BaseDistributedSearchTestCase.java:1054)
	at com.carrotsearch.randomizedtesting.rules.SystemPropertiesRestoreRule$1.evaluate(SystemPropertiesRestoreRule.java:57)
	at org.apache.lucene.util.TestRuleSetupTeardownChained$1.evaluate(TestRuleSetupTeardownChained.java:49)
	at org.apache.lucene.util.AbstractBeforeAfterRule$1.evaluate(AbstractBeforeAfterRule.java:45)
	at org.apache.lucene.util.TestRuleThreadAndTestName$1.evaluate(TestRuleThreadAndTestName.java:48)
	at org.apache.lucene.util.TestRuleIgnoreAfterMaxFailures$1.evaluate(TestRuleIgnoreAfterMaxFailures.java:64)
	at org.apache.lucene.util.TestRuleMarkFailure$1.evaluate(TestRuleMarkFailure.java:47)
	at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
	at com.carrotsearch.randomizedtesting.ThreadLeakControl$StatementRunner.run(ThreadLeakControl.java:370)
	at com.carrotsearch.randomizedtesting.ThreadLeakControl.forkTimeoutingTask(ThreadLeakControl.java:819)
	at com.carrotsearch.randomizedtesting.ThreadLeakControl$3.evaluate(ThreadLeakControl.java:470)
	at com.carrotsearch.randomizedtesting.RandomizedRunner.runSingleTest(RandomizedRunner.java:951)
	at com.carrotsearch.randomizedtesting.RandomizedRunner$5.evaluate(RandomizedRunner.java:836)
	at com.carrotsearch.randomizedtesting.RandomizedRunner$6.evaluate(RandomizedRunner.java:887)
	at com.carrotsearch.randomizedtesting.RandomizedRunner$7.evaluate(RandomizedRunner.java:898)
	at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
	at com.carrotsearch.randomizedtesting.rules.SystemPropertiesRestoreRule$1.evaluate(SystemPropertiesRestoreRule.java:57)
	at org.apache.lucene.util.AbstractBeforeAfterRule$1.evaluate(AbstractBeforeAfterRule.java:45)
	at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
	at org.apache.lucene.util.TestRuleStoreClassName$1.evaluate(TestRuleStoreClassName.java:41)
	at com.carrotsearch.randomizedtesting.rules.NoShadowingOrOverridesOnMethodsRule$1.evaluate(NoShadowingOrOverridesOnMethodsRule.java:40)
	at com.carrotsearch.randomizedtesting.rules.NoShadowingOrOverridesOnMethodsRule$1.evaluate(NoShadowingOrOverridesOnMethodsRule.java:40)
	at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
	at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
	at org.apache.lucene.util.TestRuleAssertionsRequired$1.evaluate(TestRuleAssertionsRequired.java:53)
	at org.apache.lucene.util.TestRuleMarkFailure$1.evaluate(TestRuleMarkFailure.java:47)
	at org.apache.lucene.util.TestRuleIgnoreAfterMaxFailures$1.evaluate(TestRuleIgnoreAfterMaxFailures.java:64)
	at org.apache.lucene.util.TestRuleIgnoreTestSuites$1.evaluate(TestRuleIgnoreTestSuites.java:54)
	at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
	at com.carrotsearch.randomizedtesting.ThreadLeakControl$StatementRunner.run(ThreadLeakControl.java:370)
	at com.carrotsearch.randomizedtesting.ThreadLeakControl.lambda$forkTimeoutingTask$0(ThreadLeakControl.java:826)
	at java.base/java.lang.Thread.run(Thread.java:830)
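
Editor's note: the failure above is SolrConfigHandler.waitForAllReplicasState (SolrConfigHandler.java:813) timing out. After the test POSTs a config change, the handler waits for every replica of collection1 to report the expected configoverlay version (4 here); three of the five cores never did within 30 seconds, so the error message is non-null and the assertNull in TestSolrConfigHandler.runConfigCommand fails. Below is a minimal sketch of how one might poll the same state by hand when chasing such a timeout, assuming the standard /config/overlay endpoint and its znodeVersion field; the core URLs and version are copied from the message above, while the class name, timings, and the crude string check are purely illustrative and not taken from the test.

    import java.net.URI;
    import java.net.http.HttpClient;
    import java.net.http.HttpRequest;
    import java.net.http.HttpResponse;
    import java.time.Duration;
    import java.util.List;

    public class WaitForOverlayVersion {
      public static void main(String[] args) throws Exception {
        // Core URLs and expected overlay version taken from the failure message above.
        List<String> coreUrls = List.of(
            "http://127.0.0.1:54186/collection1_shard1_replica_n3",
            "http://127.0.0.1:54200/collection1_shard1_replica_n5",
            "http://127.0.0.1:54204/collection1_shard1_replica_n7");
        int expectedVersion = 4;
        long deadline = System.currentTimeMillis() + 30_000;

        HttpClient http = HttpClient.newBuilder()
            .connectTimeout(Duration.ofSeconds(5))
            .build();

        for (String core : coreUrls) {
          boolean reached = false;
          while (System.currentTimeMillis() < deadline) {
            // Ask the replica which configoverlay version it currently sees.
            HttpRequest req = HttpRequest.newBuilder(
                URI.create(core + "/config/overlay?wt=json")).GET().build();
            String body = http.send(req, HttpResponse.BodyHandlers.ofString()).body();
            // Crude string check; a real client would parse the JSON response.
            if (body.contains("\"znodeVersion\":" + expectedVersion)) {
              reached = true;
              break;
            }
            Thread.sleep(500);
          }
          System.out.println(core + " reached overlay version " + expectedVersion + ": " + reached);
        }
      }
    }

Since the overlay lives in ZooKeeper and each core picks it up asynchronously, this is the kind of load-dependent wait that can simply run out of time on a slow CI machine, which is consistent with the "Unstable" (flaky) classification of this build.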




Build Log:
[...truncated 14841 lines...]
   [junit4] Suite: org.apache.solr.cloud.TestCryptoKeys
   [junit4]   2> 1239148 INFO  (SUITE-TestCryptoKeys-seed#[3414475A001D7ABB]-worker) [     ] o.a.s.SolrTestCaseJ4 Created dataDir: /Users/jenkins/workspace/Lucene-Solr-master-MacOSX/solr/build/solr-core/test/J3/temp/solr.cloud.TestCryptoKeys_3414475A001D7ABB-001/data-dir-86-001
   [junit4]   2> 1239148 INFO  (SUITE-TestCryptoKeys-seed#[3414475A001D7ABB]-worker) [     ] o.a.s.SolrTestCaseJ4 Using PointFields (NUMERIC_POINTS_SYSPROP=true) w/NUMERIC_DOCVALUES_SYSPROP=false
   [junit4]   2> 1239150 INFO  (SUITE-TestCryptoKeys-seed#[3414475A001D7ABB]-worker) [     ] o.a.s.SolrTestCaseJ4 Randomized ssl (false) and clientAuth (false) via: @org.apache.solr.util.RandomizeSSL(reason="", value=0.0/0.0, ssl=0.0/0.0, clientAuth=0.0/0.0) w/ MAC_OS_X suppressed clientAuth
   [junit4]   2> 1239151 INFO  (SUITE-TestCryptoKeys-seed#[3414475A001D7ABB]-worker) [     ] o.a.s.SolrTestCaseJ4 SecureRandom sanity checks: test.solr.allowed.securerandom=null & java.security.egd=file:/dev/./urandom
   [junit4]   2> 1239152 INFO  (SUITE-TestCryptoKeys-seed#[3414475A001D7ABB]-worker) [     ] o.a.s.BaseDistributedSearchTestCase Setting hostContext system property: /
   [junit4]   2> 1239163 INFO  (TEST-TestCryptoKeys.test-seed#[3414475A001D7ABB]) [     ] o.a.s.c.ZkTestServer STARTING ZK TEST SERVER
   [junit4]   2> 1239164 INFO  (ZkTestServer Run Thread) [     ] o.a.s.c.ZkTestServer client port:0.0.0.0/0.0.0.0:0
   [junit4]   2> 1239164 INFO  (ZkTestServer Run Thread) [     ] o.a.s.c.ZkTestServer Starting server
   [junit4]   2> 1239265 INFO  (TEST-TestCryptoKeys.test-seed#[3414475A001D7ABB]) [     ] o.a.s.c.ZkTestServer start zk server on port:54145
   [junit4]   2> 1239265 INFO  (TEST-TestCryptoKeys.test-seed#[3414475A001D7ABB]) [     ] o.a.s.c.ZkTestServer waitForServerUp: 127.0.0.1:54145
   [junit4]   2> 1239265 INFO  (TEST-TestCryptoKeys.test-seed#[3414475A001D7ABB]) [     ] o.a.s.c.ZkTestServer parse host and port list: 127.0.0.1:54145
   [junit4]   2> 1239265 INFO  (TEST-TestCryptoKeys.test-seed#[3414475A001D7ABB]) [     ] o.a.s.c.ZkTestServer connecting to 127.0.0.1 54145
   [junit4]   2> 1239269 INFO  (TEST-TestCryptoKeys.test-seed#[3414475A001D7ABB]) [     ] o.a.s.c.c.ConnectionManager Waiting for client to connect to ZooKeeper
   [junit4]   2> 1239275 INFO  (zkConnectionManagerCallback-6485-thread-1) [     ] o.a.s.c.c.ConnectionManager zkClient has connected
   [junit4]   2> 1239276 INFO  (TEST-TestCryptoKeys.test-seed#[3414475A001D7ABB]) [     ] o.a.s.c.c.ConnectionManager Client is connected to ZooKeeper
   [junit4]   2> 1239282 INFO  (TEST-TestCryptoKeys.test-seed#[3414475A001D7ABB]) [     ] o.a.s.c.c.ConnectionManager Waiting for client to connect to ZooKeeper
   [junit4]   2> 1239287 INFO  (zkConnectionManagerCallback-6487-thread-1) [     ] o.a.s.c.c.ConnectionManager zkClient has connected
   [junit4]   2> 1239287 INFO  (TEST-TestCryptoKeys.test-seed#[3414475A001D7ABB]) [     ] o.a.s.c.c.ConnectionManager Client is connected to ZooKeeper
   [junit4]   2> 1239292 INFO  (TEST-TestCryptoKeys.test-seed#[3414475A001D7ABB]) [     ] o.a.s.c.ZkTestServer put /Users/jenkins/workspace/Lucene-Solr-master-MacOSX/solr/core/src/test-files/solr/collection1/conf/solrconfig-tlog.xml to /configs/conf1/solrconfig.xml
   [junit4]   2> 1239296 INFO  (TEST-TestCryptoKeys.test-seed#[3414475A001D7ABB]) [     ] o.a.s.c.ZkTestServer put /Users/jenkins/workspace/Lucene-Solr-master-MacOSX/solr/core/src/test-files/solr/collection1/conf/schema.xml to /configs/conf1/schema.xml
   [junit4]   2> 1239302 INFO  (TEST-TestCryptoKeys.test-seed#[3414475A001D7ABB]) [     ] o.a.s.c.ZkTestServer put /Users/jenkins/workspace/Lucene-Solr-master-MacOSX/solr/core/src/test-files/solr/collection1/conf/solrconfig.snippet.randomindexconfig.xml to /configs/conf1/solrconfig.snippet.randomindexconfig.xml
   [junit4]   2> 1239305 INFO  (TEST-TestCryptoKeys.test-seed#[3414475A001D7ABB]) [     ] o.a.s.c.ZkTestServer put /Users/jenkins/workspace/Lucene-Solr-master-MacOSX/solr/core/src/test-files/solr/collection1/conf/stopwords.txt to /configs/conf1/stopwords.txt
   [junit4]   2> 1239310 INFO  (TEST-TestCryptoKeys.test-seed#[3414475A001D7ABB]) [     ] o.a.s.c.ZkTestServer put /Users/jenkins/workspace/Lucene-Solr-master-MacOSX/solr/core/src/test-files/solr/collection1/conf/protwords.txt to /configs/conf1/protwords.txt
   [junit4]   2> 1239318 INFO  (TEST-TestCryptoKeys.test-seed#[3414475A001D7ABB]) [     ] o.a.s.c.ZkTestServer put /Users/jenkins/workspace/Lucene-Solr-master-MacOSX/solr/core/src/test-files/solr/collection1/conf/currency.xml to /configs/conf1/currency.xml
   [junit4]   2> 1239323 INFO  (TEST-TestCryptoKeys.test-seed#[3414475A001D7ABB]) [     ] o.a.s.c.ZkTestServer put /Users/jenkins/workspace/Lucene-Solr-master-MacOSX/solr/core/src/test-files/solr/collection1/conf/enumsConfig.xml to /configs/conf1/enumsConfig.xml
   [junit4]   2> 1239328 INFO  (TEST-TestCryptoKeys.test-seed#[3414475A001D7ABB]) [     ] o.a.s.c.ZkTestServer put /Users/jenkins/workspace/Lucene-Solr-master-MacOSX/solr/core/src/test-files/solr/collection1/conf/open-exchange-rates.json to /configs/conf1/open-exchange-rates.json
   [junit4]   2> 1239332 INFO  (TEST-TestCryptoKeys.test-seed#[3414475A001D7ABB]) [     ] o.a.s.c.ZkTestServer put /Users/jenkins/workspace/Lucene-Solr-master-MacOSX/solr/core/src/test-files/solr/collection1/conf/mapping-ISOLatin1Accent.txt to /configs/conf1/mapping-ISOLatin1Accent.txt
   [junit4]   2> 1239336 INFO  (TEST-TestCryptoKeys.test-seed#[3414475A001D7ABB]) [     ] o.a.s.c.ZkTestServer put /Users/jenkins/workspace/Lucene-Solr-master-MacOSX/solr/core/src/test-files/solr/collection1/conf/old_synonyms.txt to /configs/conf1/old_synonyms.txt
   [junit4]   2> 1239344 INFO  (TEST-TestCryptoKeys.test-seed#[3414475A001D7ABB]) [     ] o.a.s.c.ZkTestServer put /Users/jenkins/workspace/Lucene-Solr-master-MacOSX/solr/core/src/test-files/solr/collection1/conf/synonyms.txt to /configs/conf1/synonyms.txt
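
Editor's note: the run of ZkTestServer "put ... to /configs/conf1/..." lines above is the harness uploading the collection1 configset into ZooKeeper file by file. Outside the test framework the same thing is usually done in one step, e.g. with bin/solr zk upconfig, or programmatically roughly as in this sketch. This uses the SolrJ 8.x-era ZkConfigManager helper (later releases replaced it); the ZooKeeper address mirrors the "Zookeeper client=127.0.0.1:54145/solr" line later in this log, and the local path and config name are placeholders.

    import java.nio.file.Path;
    import org.apache.solr.common.cloud.SolrZkClient;
    import org.apache.solr.common.cloud.ZkConfigManager;

    public class UploadConfigSet {
      public static void main(String[] args) throws Exception {
        // Upload a local conf/ directory as the "conf1" configset under /configs in ZK,
        // the bulk equivalent of the file-by-file "put" lines above (sketch only).
        try (SolrZkClient zkClient = new SolrZkClient("127.0.0.1:54145/solr", 30_000)) {
          new ZkConfigManager(zkClient)
              .uploadConfigDir(Path.of("/path/to/collection1/conf"), "conf1");
        }
      }
    }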
   [junit4]   2> 1239348 INFO  (TEST-TestCryptoKeys.test-seed#[3414475A001D7ABB]) [     ] o.a.s.c.AbstractFullDistribZkTestBase Will use NRT replicas unless explicitly asked otherwise
   [junit4]   2> 1239884 WARN  (TEST-TestCryptoKeys.test-seed#[3414475A001D7ABB]) [     ] o.e.j.s.h.g.GzipHandler minGzipSize of 0 is inefficient for short content, break even is size 23
   [junit4]   2> 1239885 INFO  (TEST-TestCryptoKeys.test-seed#[3414475A001D7ABB]) [     ] o.a.s.c.s.e.JettySolrRunner Start Jetty (configured port=0, binding port=0)
   [junit4]   2> 1239885 INFO  (TEST-TestCryptoKeys.test-seed#[3414475A001D7ABB]) [     ] o.a.s.c.s.e.JettySolrRunner Trying to start Jetty on port 0 try number 1 ...
   [junit4]   2> 1239885 INFO  (TEST-TestCryptoKeys.test-seed#[3414475A001D7ABB]) [     ] o.e.j.s.Server jetty-9.4.24.v20191120; built: 2019-11-20T21:37:49.771Z; git: 363d5f2df3a8a28de40604320230664b9c793c16; jvm 13.0.1+9
   [junit4]   2> 1239888 INFO  (TEST-TestCryptoKeys.test-seed#[3414475A001D7ABB]) [     ] o.e.j.s.session DefaultSessionIdManager workerName=node0
   [junit4]   2> 1239888 INFO  (TEST-TestCryptoKeys.test-seed#[3414475A001D7ABB]) [     ] o.e.j.s.session No SessionScavenger set, using defaults
   [junit4]   2> 1239888 INFO  (TEST-TestCryptoKeys.test-seed#[3414475A001D7ABB]) [     ] o.e.j.s.session node0 Scavenging every 660000ms
   [junit4]   2> 1239889 INFO  (TEST-TestCryptoKeys.test-seed#[3414475A001D7ABB]) [     ] o.e.j.s.h.ContextHandler Started o.e.j.s.ServletContextHandler@734f48ea{/,null,AVAILABLE}
   [junit4]   2> 1239890 INFO  (TEST-TestCryptoKeys.test-seed#[3414475A001D7ABB]) [     ] o.e.j.s.AbstractConnector Started ServerConnector@53a22bd0{HTTP/1.1,[http/1.1, h2c]}{127.0.0.1:?????}
   [junit4]   2> 1239890 INFO  (TEST-TestCryptoKeys.test-seed#[3414475A001D7ABB]) [     ] o.e.j.s.Server Started @???????ms
   [junit4]   2> 1239890 INFO  (TEST-TestCryptoKeys.test-seed#[3414475A001D7ABB]) [     ] o.a.s.c.s.e.JettySolrRunner Jetty properties: {hostContext=/, solr.data.dir=/Users/jenkins/workspace/Lucene-Solr-master-MacOSX/solr/build/solr-core/test/J3/temp/solr.cloud.TestCryptoKeys_3414475A001D7ABB-001/tempDir-001/control/data, hostPort=54150, coreRootDirectory=/Users/jenkins/workspace/Lucene-Solr-master-MacOSX/solr/build/solr-core/test/J3/temp/solr.cloud.TestCryptoKeys_3414475A001D7ABB-001/control-001/cores, replicaType=NRT}
   [junit4]   2> 1239890 ERROR (TEST-TestCryptoKeys.test-seed#[3414475A001D7ABB]) [     ] o.a.s.u.StartupLoggingUtils Missing Java Option solr.log.dir. Logging may be missing or incomplete.
   [junit4]   2> 1239890 INFO  (TEST-TestCryptoKeys.test-seed#[3414475A001D7ABB]) [     ] o.a.s.s.SolrDispatchFilter Using logger factory org.apache.logging.slf4j.Log4jLoggerFactory
   [junit4]   2> 1239890 INFO  (TEST-TestCryptoKeys.test-seed#[3414475A001D7ABB]) [     ] o.a.s.s.SolrDispatchFilter  ___      _       Welcome to Apache Solr™ version 9.0.0
   [junit4]   2> 1239890 INFO  (TEST-TestCryptoKeys.test-seed#[3414475A001D7ABB]) [     ] o.a.s.s.SolrDispatchFilter / __| ___| |_ _   Starting in cloud mode on port null
   [junit4]   2> 1239890 INFO  (TEST-TestCryptoKeys.test-seed#[3414475A001D7ABB]) [     ] o.a.s.s.SolrDispatchFilter \__ \/ _ \ | '_|  Install dir: null
   [junit4]   2> 1239890 INFO  (TEST-TestCryptoKeys.test-seed#[3414475A001D7ABB]) [     ] o.a.s.s.SolrDispatchFilter |___/\___/_|_|    Start time: 2020-01-15T15:31:06.290855Z
   [junit4]   2> 1239892 INFO  (TEST-TestCryptoKeys.test-seed#[3414475A001D7ABB]) [     ] o.a.s.c.c.ConnectionManager Waiting for client to connect to ZooKeeper
   [junit4]   2> 1239895 INFO  (zkConnectionManagerCallback-6489-thread-1) [     ] o.a.s.c.c.ConnectionManager zkClient has connected
   [junit4]   2> 1239895 INFO  (TEST-TestCryptoKeys.test-seed#[3414475A001D7ABB]) [     ] o.a.s.c.c.ConnectionManager Client is connected to ZooKeeper
   [junit4]   2> 1239999 INFO  (TEST-TestCryptoKeys.test-seed#[3414475A001D7ABB]) [     ] o.a.s.s.SolrDispatchFilter Loading solr.xml from SolrHome (not found in ZooKeeper)
   [junit4]   2> 1239999 INFO  (TEST-TestCryptoKeys.test-seed#[3414475A001D7ABB]) [     ] o.a.s.c.SolrXmlConfig Loading container configuration from /Users/jenkins/workspace/Lucene-Solr-master-MacOSX/solr/build/solr-core/test/J3/temp/solr.cloud.TestCryptoKeys_3414475A001D7ABB-001/control-001/solr.xml
   [junit4]   2> 1240004 INFO  (TEST-TestCryptoKeys.test-seed#[3414475A001D7ABB]) [     ] o.a.s.c.SolrXmlConfig Configuration parameter autoReplicaFailoverWorkLoopDelay is ignored
   [junit4]   2> 1240004 INFO  (TEST-TestCryptoKeys.test-seed#[3414475A001D7ABB]) [     ] o.a.s.c.SolrXmlConfig Configuration parameter autoReplicaFailoverBadNodeExpiration is ignored
   [junit4]   2> 1240004 INFO  (TEST-TestCryptoKeys.test-seed#[3414475A001D7ABB]) [     ] o.a.s.c.SolrXmlConfig MBean server found: com.sun.jmx.mbeanserver.JmxMBeanServer@5096c33a, but no JMX reporters were configured - adding default JMX reporter.
   [junit4]   2> 1240078 INFO  (TEST-TestCryptoKeys.test-seed#[3414475A001D7ABB]) [     ] o.a.s.h.c.HttpShardHandlerFactory Host whitelist initialized: WhitelistHostChecker [whitelistHosts=null, whitelistHostCheckingEnabled=false]
   [junit4]   2> 1240080 WARN  (TEST-TestCryptoKeys.test-seed#[3414475A001D7ABB]) [     ] o.e.j.u.s.S.config Trusting all certificates configured for Client@f92b06f[provider=null,keyStore=null,trustStore=null]
   [junit4]   2> 1240080 WARN  (TEST-TestCryptoKeys.test-seed#[3414475A001D7ABB]) [     ] o.e.j.u.s.S.config No Client EndPointIdentificationAlgorithm configured for Client@f92b06f[provider=null,keyStore=null,trustStore=null]
   [junit4]   2> 1240087 WARN  (TEST-TestCryptoKeys.test-seed#[3414475A001D7ABB]) [     ] o.e.j.u.s.S.config Trusting all certificates configured for Client@e0e97b2[provider=null,keyStore=null,trustStore=null]
   [junit4]   2> 1240088 WARN  (TEST-TestCryptoKeys.test-seed#[3414475A001D7ABB]) [     ] o.e.j.u.s.S.config No Client EndPointIdentificationAlgorithm configured for Client@e0e97b2[provider=null,keyStore=null,trustStore=null]
   [junit4]   2> 1240090 INFO  (TEST-TestCryptoKeys.test-seed#[3414475A001D7ABB]) [     ] o.a.s.c.ZkContainer Zookeeper client=127.0.0.1:54145/solr
   [junit4]   2> 1240092 INFO  (TEST-TestCryptoKeys.test-seed#[3414475A001D7ABB]) [     ] o.a.s.c.c.ConnectionManager Waiting for client to connect to ZooKeeper
   [junit4]   2> 1240100 INFO  (zkConnectionManagerCallback-6496-thread-1) [     ] o.a.s.c.c.ConnectionManager zkClient has connected
   [junit4]   2> 1240100 INFO  (TEST-TestCryptoKeys.test-seed#[3414475A001D7ABB]) [     ] o.a.s.c.c.ConnectionManager Client is connected to ZooKeeper
   [junit4]   2> 1240208 INFO  (TEST-TestCryptoKeys.test-seed#[3414475A001D7ABB]) [n:127.0.0.1:54150_     ] o.a.s.c.c.ConnectionManager Waiting for client to connect to ZooKeeper
   [junit4]   2> 1240211 INFO  (zkConnectionManagerCallback-6498-thread-1) [     ] o.a.s.c.c.ConnectionManager zkClient has connected
   [junit4]   2> 1240211 INFO  (TEST-TestCryptoKeys.test-seed#[3414475A001D7ABB]) [n:127.0.0.1:54150_     ] o.a.s.c.c.ConnectionManager Client is connected to ZooKeeper
   [junit4]   2> 1240636 INFO  (TEST-TestCryptoKeys.test-seed#[3414475A001D7ABB]) [n:127.0.0.1:54150_     ] o.a.s.c.OverseerElectionContext I am going to be the leader 127.0.0.1:54150_
   [junit4]   2> 1240638 INFO  (TEST-TestCryptoKeys.test-seed#[3414475A001D7ABB]) [n:127.0.0.1:54150_     ] o.a.s.c.Overseer Overseer (id=72059879528792068-127.0.0.1:54150_-n_0000000000) starting
   [junit4]   2> 1240664 INFO  (OverseerStateUpdate-72059879528792068-127.0.0.1:54150_-n_0000000000) [n:127.0.0.1:54150_     ] o.a.s.c.Overseer Starting to work on the main queue : 127.0.0.1:54150_
   [junit4]   2> 1240666 INFO  (TEST-TestCryptoKeys.test-seed#[3414475A001D7ABB]) [n:127.0.0.1:54150_     ] o.a.s.c.ZkController Register node as live in ZooKeeper:/live_nodes/127.0.0.1:54150_
   [junit4]   2> 1240675 INFO  (OverseerStateUpdate-72059879528792068-127.0.0.1:54150_-n_0000000000) [n:127.0.0.1:54150_     ] o.a.s.c.c.ZkStateReader Updated live nodes from ZooKeeper... (0) -> (1)
   [junit4]   2> 1240684 INFO  (TEST-TestCryptoKeys.test-seed#[3414475A001D7ABB]) [n:127.0.0.1:54150_     ] o.a.s.p.PackageLoader /packages.json updated to version -1
   [junit4]   2> 1240685 WARN  (TEST-TestCryptoKeys.test-seed#[3414475A001D7ABB]) [n:127.0.0.1:54150_     ] o.a.s.c.CoreContainer Not all security plugins configured!  authentication=disabled authorization=disabled.  Solr is only as secure as you make it. Consider configuring authentication/authorization before exposing Solr to users internal or external.  See https://s.apache.org/solrsecurity for more info
   [junit4]   2> 1240720 INFO  (TEST-TestCryptoKeys.test-seed#[3414475A001D7ABB]) [n:127.0.0.1:54150_     ] o.a.s.h.a.MetricsHistoryHandler No .system collection, keeping metrics history in memory.
   [junit4]   2> 1240747 INFO  (TEST-TestCryptoKeys.test-seed#[3414475A001D7ABB]) [n:127.0.0.1:54150_     ] o.a.s.m.r.SolrJmxReporter JMX monitoring for 'solr.node' (registry 'solr.node') enabled at server: com.sun.jmx.mbeanserver.JmxMBeanServer@5096c33a
   [junit4]   2> 1240758 INFO  (TEST-TestCryptoKeys.test-seed#[3414475A001D7ABB]) [n:127.0.0.1:54150_     ] o.a.s.m.r.SolrJmxReporter JMX monitoring for 'solr.jvm' (registry 'solr.jvm') enabled at server: com.sun.jmx.mbeanserver.JmxMBeanServer@5096c33a
   [junit4]   2> 1240758 INFO  (TEST-TestCryptoKeys.test-seed#[3414475A001D7ABB]) [n:127.0.0.1:54150_     ] o.a.s.m.r.SolrJmxReporter JMX monitoring for 'solr.jetty' (registry 'solr.jetty') enabled at server: com.sun.jmx.mbeanserver.JmxMBeanServer@5096c33a
   [junit4]   2> 1240760 INFO  (TEST-TestCryptoKeys.test-seed#[3414475A001D7ABB]) [n:127.0.0.1:54150_     ] o.a.s.c.CorePropertiesLocator Found 0 core definitions underneath /Users/jenkins/workspace/Lucene-Solr-master-MacOSX/solr/build/solr-core/test/J3/temp/solr.cloud.TestCryptoKeys_3414475A001D7ABB-001/control-001/cores
   [junit4]   2> 1240784 INFO  (TEST-TestCryptoKeys.test-seed#[3414475A001D7ABB]) [     ] o.a.s.c.c.ConnectionManager Waiting for client to connect to ZooKeeper
   [junit4]   2> 1240788 INFO  (zkConnectionManagerCallback-6507-thread-1) [     ] o.a.s.c.c.ConnectionManager zkClient has connected
   [junit4]   2> 1240788 INFO  (TEST-TestCryptoKeys.test-seed#[3414475A001D7ABB]) [     ] o.a.s.c.c.ConnectionManager Client is connected to ZooKeeper
   [junit4]   2> 1240791 INFO  (TEST-TestCryptoKeys.test-seed#[3414475A001D7ABB]) [     ] o.a.s.c.c.ZkStateReader Updated live nodes from ZooKeeper... (0) -> (1)
   [junit4]   2> 1240794 INFO  (TEST-TestCryptoKeys.test-seed#[3414475A001D7ABB]) [     ] o.a.s.c.s.i.ZkClientClusterStateProvider Cluster at 127.0.0.1:54145/solr ready
   [junit4]   2> 1240797 INFO  (qtp1390150781-16962) [n:127.0.0.1:54150_     ] o.a.s.h.a.CollectionsHandler Invoked Collection Action :create with params collection.configName=conf1&name=control_collection&nrtReplicas=1&action=CREATE&numShards=1&createNodeSet=127.0.0.1:54150_&wt=javabin&version=2 and sendToOCPQueue=true
   [junit4]   2> 1240808 INFO  (OverseerThreadFactory-3832-thread-1-processing-n:127.0.0.1:54150_) [n:127.0.0.1:54150_     ] o.a.s.c.a.c.CreateCollectionCmd Create collection control_collection
   [junit4]   2> 1240932 INFO  (qtp1390150781-16964) [n:127.0.0.1:54150_    x:control_collection_shard1_replica_n1 ] o.a.s.h.a.CoreAdminOperation core create command qt=/admin/cores&collection.configName=conf1&newCollection=true&name=control_collection_shard1_replica_n1&action=CREATE&numShards=1&collection=control_collection&shard=shard1&wt=javabin&version=2&replicaType=NRT
   [junit4]   2> 1240932 INFO  (qtp1390150781-16964) [n:127.0.0.1:54150_    x:control_collection_shard1_replica_n1 ] o.a.s.c.TransientSolrCoreCacheDefault Allocating transient cache for 4 transient cores
   [junit4]   2> 1241979 INFO  (qtp1390150781-16964) [n:127.0.0.1:54150_ c:control_collection s:shard1  x:control_collection_shard1_replica_n1 ] o.a.s.c.SolrConfig Using Lucene MatchVersion: 9.0.0
   [junit4]   2> 1242004 INFO  (qtp1390150781-16964) [n:127.0.0.1:54150_ c:control_collection s:shard1  x:control_collection_shard1_replica_n1 ] o.a.s.s.IndexSchema [control_collection_shard1_replica_n1] Schema name=test
   [junit4]   2> 1242140 INFO  (qtp1390150781-16964) [n:127.0.0.1:54150_ c:control_collection s:shard1  x:control_collection_shard1_replica_n1 ] o.a.s.s.IndexSchema Loaded schema test/1.0 with uniqueid field id
   [junit4]   2> 1242234 INFO  (qtp1390150781-16964) [n:127.0.0.1:54150_ c:control_collection s:shard1  x:control_collection_shard1_replica_n1 ] o.a.s.c.CoreContainer Creating SolrCore 'control_collection_shard1_replica_n1' using configuration from collection control_collection, trusted=true
   [junit4]   2> 1242236 INFO  (qtp1390150781-16964) [n:127.0.0.1:54150_ c:control_collection s:shard1  x:control_collection_shard1_replica_n1 ] o.a.s.m.r.SolrJmxReporter JMX monitoring for 'solr.core.control_collection.shard1.replica_n1' (registry 'solr.core.control_collection.shard1.replica_n1') enabled at server: com.sun.jmx.mbeanserver.JmxMBeanServer@5096c33a
   [junit4]   2> 1242237 INFO  (qtp1390150781-16964) [n:127.0.0.1:54150_ c:control_collection s:shard1  x:control_collection_shard1_replica_n1 ] o.a.s.c.SolrCore [[control_collection_shard1_replica_n1] ] Opening new SolrCore at [/Users/jenkins/workspace/Lucene-Solr-master-MacOSX/solr/build/solr-core/test/J3/temp/solr.cloud.TestCryptoKeys_3414475A001D7ABB-001/control-001/cores/control_collection_shard1_replica_n1], dataDir=[/Users/jenkins/workspace/Lucene-Solr-master-MacOSX/solr/build/solr-core/test/J3/temp/solr.cloud.TestCryptoKeys_3414475A001D7ABB-001/control-001/cores/control_collection_shard1_replica_n1/data/]
   [junit4]   2> 1242241 INFO  (qtp1390150781-16964) [n:127.0.0.1:54150_ c:control_collection s:shard1  x:control_collection_shard1_replica_n1 ] o.a.s.u.RandomMergePolicy RandomMergePolicy wrapping class org.apache.lucene.index.TieredMergePolicy: [TieredMergePolicy: maxMergeAtOnce=42, maxMergeAtOnceExplicit=12, maxMergedSegmentMB=76.0166015625, floorSegmentMB=1.125, forceMergeDeletesPctAllowed=16.358676175507277, segmentsPerTier=47.0, maxCFSSegmentSizeMB=8.796093022207999E12, noCFSRatio=1.0, deletesPctAllowed=49.93528858612819
   [junit4]   2> 1242247 WARN  (qtp1390150781-16964) [n:127.0.0.1:54150_ c:control_collection s:shard1  x:control_collection_shard1_replica_n1 ] o.a.s.c.RequestHandlers INVALID paramSet a in requestHandler {type = requestHandler,name = /dump,class = DumpRequestHandler,attributes = {initParams=a, name=/dump, class=DumpRequestHandler},args = {defaults={a=A, b=B}}}
   [junit4]   2> 1242385 INFO  (qtp1390150781-16964) [n:127.0.0.1:54150_ c:control_collection s:shard1  x:control_collection_shard1_replica_n1 ] o.a.s.u.UpdateHandler Using UpdateLog implementation: org.apache.solr.update.UpdateLog
   [junit4]   2> 1242385 INFO  (qtp1390150781-16964) [n:127.0.0.1:54150_ c:control_collection s:shard1  x:control_collection_shard1_replica_n1 ] o.a.s.u.UpdateLog Initializing UpdateLog: dataDir= defaultSyncLevel=FLUSH numRecordsToKeep=100 maxNumLogsToKeep=10 numVersionBuckets=65536
   [junit4]   2> 1242389 INFO  (qtp1390150781-16964) [n:127.0.0.1:54150_ c:control_collection s:shard1  x:control_collection_shard1_replica_n1 ] o.a.s.u.CommitTracker Hard AutoCommit: disabled
   [junit4]   2> 1242389 INFO  (qtp1390150781-16964) [n:127.0.0.1:54150_ c:control_collection s:shard1  x:control_collection_shard1_replica_n1 ] o.a.s.u.CommitTracker Soft AutoCommit: disabled
   [junit4]   2> 1242390 INFO  (qtp1390150781-16964) [n:127.0.0.1:54150_ c:control_collection s:shard1  x:control_collection_shard1_replica_n1 ] o.a.s.u.RandomMergePolicy RandomMergePolicy wrapping class org.apache.lucene.index.TieredMergePolicy: [TieredMergePolicy: maxMergeAtOnce=37, maxMergeAtOnceExplicit=46, maxMergedSegmentMB=46.494140625, floorSegmentMB=1.25390625, forceMergeDeletesPctAllowed=23.97081426480858, segmentsPerTier=29.0, maxCFSSegmentSizeMB=8.796093022207999E12, noCFSRatio=1.0, deletesPctAllowed=49.9804872956426
   [junit4]   2> 1242390 INFO  (qtp1390150781-16964) [n:127.0.0.1:54150_ c:control_collection s:shard1  x:control_collection_shard1_replica_n1 ] o.a.s.s.SolrIndexSearcher Opening [Searcher@40ffe936[control_collection_shard1_replica_n1] main]
   [junit4]   2> 1242395 INFO  (qtp1390150781-16964) [n:127.0.0.1:54150_ c:control_collection s:shard1  x:control_collection_shard1_replica_n1 ] o.a.s.r.ManagedResourceStorage Configured ZooKeeperStorageIO with znodeBase: /configs/conf1
   [junit4]   2> 1242396 INFO  (qtp1390150781-16964) [n:127.0.0.1:54150_ c:control_collection s:shard1  x:control_collection_shard1_replica_n1 ] o.a.s.r.ManagedResourceStorage Loaded null at path _rest_managed.json using ZooKeeperStorageIO:path=/configs/conf1
   [junit4]   2> 1242397 INFO  (qtp1390150781-16964) [n:127.0.0.1:54150_ c:control_collection s:shard1  x:control_collection_shard1_replica_n1 ] o.a.s.h.ReplicationHandler Commits will be reserved for 10000ms.
   [junit4]   2> 1242397 INFO  (qtp1390150781-16964) [n:127.0.0.1:54150_ c:control_collection s:shard1  x:control_collection_shard1_replica_n1 ] o.a.s.u.UpdateLog Could not find max version in index or recent updates, using new clock 1655808740606083072
   [junit4]   2> 1242403 INFO  (searcherExecutor-3837-thread-1-processing-n:127.0.0.1:54150_ x:control_collection_shard1_replica_n1 c:control_collection s:shard1) [n:127.0.0.1:54150_ c:control_collection s:shard1  x:control_collection_shard1_replica_n1 ] o.a.s.c.SolrCore [control_collection_shard1_replica_n1] Registered new searcher Searcher@40ffe936[control_collection_shard1_replica_n1] main{ExitableDirectoryReader(UninvertingDirectoryReader())}
   [junit4]   2> 1242417 INFO  (qtp1390150781-16964) [n:127.0.0.1:54150_ c:control_collection s:shard1  x:control_collection_shard1_replica_n1 ] o.a.s.c.ZkShardTerms Successful update of terms at /collections/control_collection/terms/shard1 to Terms{values={core_node2=0}, version=0}
   [junit4]   2> 1242418 INFO  (qtp1390150781-16964) [n:127.0.0.1:54150_ c:control_collection s:shard1  x:control_collection_shard1_replica_n1 ] o.a.s.c.ShardLeaderElectionContextBase make sure parent is created /collections/control_collection/leaders/shard1
   [junit4]   2> 1242433 INFO  (qtp1390150781-16964) [n:127.0.0.1:54150_ c:control_collection s:shard1  x:control_collection_shard1_replica_n1 ] o.a.s.c.ShardLeaderElectionContext Enough replicas found to continue.
   [junit4]   2> 1242433 INFO  (qtp1390150781-16964) [n:127.0.0.1:54150_ c:control_collection s:shard1  x:control_collection_shard1_replica_n1 ] o.a.s.c.ShardLeaderElectionContext I may be the new leader - try and sync
   [junit4]   2> 1242433 INFO  (qtp1390150781-16964) [n:127.0.0.1:54150_ c:control_collection s:shard1  x:control_collection_shard1_replica_n1 ] o.a.s.c.SyncStrategy Sync replicas to http://127.0.0.1:54150/control_collection_shard1_replica_n1/
   [junit4]   2> 1242433 INFO  (qtp1390150781-16964) [n:127.0.0.1:54150_ c:control_collection s:shard1  x:control_collection_shard1_replica_n1 ] o.a.s.c.SyncStrategy Sync Success - now sync replicas to me
   [junit4]   2> 1242442 INFO  (qtp1390150781-16964) [n:127.0.0.1:54150_ c:control_collection s:shard1  x:control_collection_shard1_replica_n1 ] o.a.s.c.SyncStrategy http://127.0.0.1:54150/control_collection_shard1_replica_n1/ has no replicas
   [junit4]   2> 1242442 INFO  (qtp1390150781-16964) [n:127.0.0.1:54150_ c:control_collection s:shard1  x:control_collection_shard1_replica_n1 ] o.a.s.c.ShardLeaderElectionContextBase Creating leader registration node /collections/control_collection/leaders/shard1/leader after winning as /collections/control_collection/leader_elect/shard1/election/72059879528792068-core_node2-n_0000000000
   [junit4]   2> 1242450 INFO  (qtp1390150781-16964) [n:127.0.0.1:54150_ c:control_collection s:shard1  x:control_collection_shard1_replica_n1 ] o.a.s.c.ShardLeaderElectionContext I am the new leader: http://127.0.0.1:54150/control_collection_shard1_replica_n1/ shard1
   [junit4]   2> 1242564 INFO  (zkCallback-6497-thread-1) [     ] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/control_collection/state.json] for collection [control_collection] has occurred - updating... (live nodes size: [1])
   [junit4]   2> 1242565 INFO  (zkCallback-6497-thread-2) [     ] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/control_collection/state.json] for collection [control_collection] has occurred - updating... (live nodes size: [1])
   [junit4]   2> 1242574 INFO  (qtp1390150781-16964) [n:127.0.0.1:54150_ c:control_collection s:shard1  x:control_collection_shard1_replica_n1 ] o.a.s.c.ZkController I am the leader, no recovery necessary
   [junit4]   2> 1242589 INFO  (qtp1390150781-16964) [n:127.0.0.1:54150_ c:control_collection s:shard1  x:control_collection_shard1_replica_n1 ] o.a.s.s.HttpSolrCall [admin] webapp=null path=/admin/cores params={qt=/admin/cores&collection.configName=conf1&newCollection=true&name=control_collection_shard1_replica_n1&action=CREATE&numShards=1&collection=control_collection&shard=shard1&wt=javabin&version=2&replicaType=NRT} status=0 QTime=1657
   [junit4]   2> 1242604 INFO  (qtp1390150781-16962) [n:127.0.0.1:54150_     ] o.a.s.h.a.CollectionsHandler Wait for new collection to be active for at most 45 seconds. Check all shard replicas
   [junit4]   2> 1242692 INFO  (zkCallback-6497-thread-2) [     ] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/control_collection/state.json] for collection [control_collection] has occurred - updating... (live nodes size: [1])
   [junit4]   2> 1242692 INFO  (zkCallback-6497-thread-1) [     ] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/control_collection/state.json] for collection [control_collection] has occurred - updating... (live nodes size: [1])
   [junit4]   2> 1242692 INFO  (zkCallback-6497-thread-3) [     ] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/control_collection/state.json] for collection [control_collection] has occurred - updating... (live nodes size: [1])
   [junit4]   2> 1242696 INFO  (qtp1390150781-16962) [n:127.0.0.1:54150_     ] o.a.s.s.HttpSolrCall [admin] webapp=null path=/admin/collections params={collection.configName=conf1&name=control_collection&nrtReplicas=1&action=CREATE&numShards=1&createNodeSet=127.0.0.1:54150_&wt=javabin&version=2} status=0 QTime=1898
   [junit4]   2> 1242697 INFO  (TEST-TestCryptoKeys.test-seed#[3414475A001D7ABB]) [     ] o.a.s.c.AbstractFullDistribZkTestBase Waiting to see 1 active replicas in collection: control_collection
   [junit4]   2> 1242813 INFO  (TEST-TestCryptoKeys.test-seed#[3414475A001D7ABB]) [     ] o.a.s.c.c.ConnectionManager Waiting for client to connect to ZooKeeper
   [junit4]   2> 1242815 INFO  (OverseerCollectionConfigSetProcessor-72059879528792068-127.0.0.1:54150_-n_0000000000) [n:127.0.0.1:54150_     ] o.a.s.c.OverseerTaskQueue Response ZK path: /overseer/collection-queue-work/qnr-0000000000 doesn't exist.  Requestor may have disconnected from ZooKeeper
   [junit4]   2> 1242817 INFO  (zkConnectionManagerCallback-6513-thread-1) [     ] o.a.s.c.c.ConnectionManager zkClient has connected
   [junit4]   2> 1242817 INFO  (TEST-TestCryptoKeys.test-seed#[3414475A001D7ABB]) [     ] o.a.s.c.c.ConnectionManager Client is connected to ZooKeeper
   [junit4]   2> 1242822 INFO  (TEST-TestCryptoKeys.test-seed#[3414475A001D7ABB]) [     ] o.a.s.c.c.ZkStateReader Updated live nodes from ZooKeeper... (0) -> (1)
   [junit4]   2> 1242827 INFO  (TEST-TestCryptoKeys.test-seed#[3414475A001D7ABB]) [     ] o.a.s.c.s.i.ZkClientClusterStateProvider Cluster at 127.0.0.1:54145/solr ready
   [junit4]   2> 1242827 INFO  (TEST-TestCryptoKeys.test-seed#[3414475A001D7ABB]) [     ] o.a.s.c.ChaosMonkey monkey: init - expire sessions:false cause connection loss:false
   [junit4]   2> 1242829 INFO  (qtp1390150781-16962) [n:127.0.0.1:54150_     ] o.a.s.h.a.CollectionsHandler Invoked Collection Action :create with params collection.configName=conf1&name=collection1&nrtReplicas=1&action=CREATE&numShards=1&createNodeSet=&stateFormat=2&wt=javabin&version=2 and sendToOCPQueue=true
   [junit4]   2> 1242844 INFO  (OverseerThreadFactory-3832-thread-2-processing-n:127.0.0.1:54150_) [n:127.0.0.1:54150_     ] o.a.s.c.a.c.CreateCollectionCmd Create collection collection1
   [junit4]   2> 1243067 WARN  (OverseerThreadFactory-3832-thread-2-processing-n:127.0.0.1:54150_) [n:127.0.0.1:54150_     ] o.a.s.c.a.c.CreateCollectionCmd It is unusual to create a collection (collection1) without cores.
   [junit4]   2> 1243073 INFO  (qtp1390150781-16962) [n:127.0.0.1:54150_     ] o.a.s.h.a.CollectionsHandler Wait for new collection to be active for at most 45 seconds. Check all shard replicas
   [junit4]   2> 1243079 INFO  (qtp1390150781-16962) [n:127.0.0.1:54150_     ] o.a.s.s.HttpSolrCall [admin] webapp=null path=/admin/collections params={collection.configName=conf1&name=collection1&nrtReplicas=1&action=CREATE&numShards=1&createNodeSet=&stateFormat=2&wt=javabin&version=2} status=0 QTime=249
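
Editor's note: the two "Invoked Collection Action :create" requests above are plain HTTP Collections API calls issued by the test harness for control_collection and collection1. For reference, a hedged SolrJ sketch of the same create call, written against the SolrJ 8.x-era CloudSolrClient builder (the 9.x client classes were reorganized); the ZooKeeper address and /solr chroot come from this log's "Zookeeper client" line, the request parameters mirror the logged URL, and everything else is illustrative.

    import java.util.List;
    import java.util.Optional;
    import org.apache.solr.client.solrj.impl.CloudSolrClient;
    import org.apache.solr.client.solrj.request.CollectionAdminRequest;

    public class CreateCollectionExample {
      public static void main(String[] args) throws Exception {
        // SolrJ equivalent of the logged
        // action=CREATE&name=collection1&collection.configName=conf1&numShards=1&nrtReplicas=1
        try (CloudSolrClient client = new CloudSolrClient.Builder(
                List.of("127.0.0.1:54145"), Optional.of("/solr")).build()) {
          CollectionAdminRequest
              .createCollection("collection1", "conf1", /* numShards */ 1, /* nrtReplicas */ 1)
              .process(client);
        }
      }
    }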
   [junit4]   2> 1243082 INFO  (TEST-TestCryptoKeys.test-seed#[3414475A001D7ABB]) [     ] o.a.s.c.SolrCloudTestCase active slice count: 1 expected:1
   [junit4]   2> 1243082 INFO  (TEST-TestCryptoKeys.test-seed#[3414475A001D7ABB]) [     ] o.a.s.c.SolrCloudTestCase active replica count: 0 expected replica count: 0
   [junit4]   2> 1243082 INFO  (watches-6510-thread-1) [     ] o.a.s.c.SolrCloudTestCase active slice count: 1 expected:1
   [junit4]   2> 1243082 INFO  (watches-6510-thread-1) [     ] o.a.s.c.SolrCloudTestCase active replica count: 0 expected replica count: 0
   [junit4]   2> 1243084 INFO  (TEST-TestCryptoKeys.test-seed#[3414475A001D7ABB]) [     ] o.a.s.c.SolrCloudTestCase active slice count: 1 expected:1
   [junit4]   2> 1243084 INFO  (TEST-TestCryptoKeys.test-seed#[3414475A001D7ABB]) [     ] o.a.s.c.SolrCloudTestCase active replica count: 0 expected replica count: 0
   [junit4]   2> 1243086 INFO  (TEST-TestCryptoKeys.test-seed#[3414475A001D7ABB]) [     ] o.a.s.c.SolrCloudTestCase active slice count: 1 expected:1
   [junit4]   2> 1243086 INFO  (TEST-TestCryptoKeys.test-seed#[3414475A001D7ABB]) [     ] o.a.s.c.SolrCloudTestCase active replica count: 0 expected replica count: 0
   [junit4]   2> 1243087 INFO  (TEST-TestCryptoKeys.test-seed#[3414475A001D7ABB]) [     ] o.a.s.c.AbstractFullDistribZkTestBase Creating jetty instances pullReplicaCount=0 numOtherReplicas=4
   [junit4]   2> 1243533 INFO  (TEST-TestCryptoKeys.test-seed#[3414475A001D7ABB]) [     ] o.a.s.c.AbstractFullDistribZkTestBase create jetty 1 in directory /Users/jenkins/workspace/Lucene-Solr-master-MacOSX/solr/build/solr-core/test/J3/temp/solr.cloud.TestCryptoKeys_3414475A001D7ABB-001/shard-1-001 of type NRT
   [junit4]   2> 1243535 WARN  (closeThreadPool-6514-thread-1) [     ] o.e.j.s.h.g.GzipHandler minGzipSize of 0 is inefficient for short content, break even is size 23
   [junit4]   2> 1243535 INFO  (closeThreadPool-6514-thread-1) [     ] o.a.s.c.s.e.JettySolrRunner Start Jetty (configured port=0, binding port=0)
   [junit4]   2> 1243535 INFO  (closeThreadPool-6514-thread-1) [     ] o.a.s.c.s.e.JettySolrRunner Trying to start Jetty on port 0 try number 1 ...
   [junit4]   2> 1243535 INFO  (closeThreadPool-6514-thread-1) [     ] o.e.j.s.Server jetty-9.4.24.v20191120; built: 2019-11-20T21:37:49.771Z; git: 363d5f2df3a8a28de40604320230664b9c793c16; jvm 13.0.1+9
   [junit4]   2> 1243536 INFO  (closeThreadPool-6514-thread-1) [     ] o.e.j.s.session DefaultSessionIdManager workerName=node0
   [junit4]   2> 1243536 INFO  (closeThreadPool-6514-thread-1) [     ] o.e.j.s.session No SessionScavenger set, using defaults
   [junit4]   2> 1243536 INFO  (closeThreadPool-6514-thread-1) [     ] o.e.j.s.session node0 Scavenging every 660000ms
   [junit4]   2> 1243536 INFO  (closeThreadPool-6514-thread-1) [     ] o.e.j.s.h.ContextHandler Started o.e.j.s.ServletContextHandler@12402edd{/,null,AVAILABLE}
   [junit4]   2> 1243542 INFO  (closeThreadPool-6514-thread-1) [     ] o.e.j.s.AbstractConnector Started ServerConnector@6c5378ac{HTTP/1.1,[http/1.1, h2c]}{127.0.0.1:?????}
   [junit4]   2> 1243542 INFO  (closeThreadPool-6514-thread-1) [     ] o.e.j.s.Server Started @???????ms
   [junit4]   2> 1243542 INFO  (closeThreadPool-6514-thread-1) [     ] o.a.s.c.s.e.JettySolrRunner Jetty properties: {hostContext=/, solrconfig=solrconfig.xml, solr.data.dir=/Users/jenkins/workspace/Lucene-Solr-master-MacOSX/solr/build/solr-core/test/J3/temp/solr.cloud.TestCryptoKeys_3414475A001D7ABB-001/tempDir-001/jetty1, hostPort=54181, coreRootDirectory=/Users/jenkins/workspace/Lucene-Solr-master-MacOSX/solr/build/solr-core/test/J3/temp/solr.cloud.TestCryptoKeys_3414475A001D7ABB-001/shard-1-001/cores, replicaType=NRT}
   [junit4]   2> 1243542 ERROR (closeThreadPool-6514-thread-1) [     ] o.a.s.u.StartupLoggingUtils Missing Java Option solr.log.dir. Logging may be missing or incomplete.
   [junit4]   2> 1243542 INFO  (closeThreadPool-6514-thread-1) [     ] o.a.s.s.SolrDispatchFilter Using logger factory org.apache.logging.slf4j.Log4jLoggerFactory
   [junit4]   2> 1243542 INFO  (closeThreadPool-6514-thread-1) [     ] o.a.s.s.SolrDispatchFilter  ___      _       Welcome to Apache Solr™ version 9.0.0
   [junit4]   2> 1243542 INFO  (closeThreadPool-6514-thread-1) [     ] o.a.s.s.SolrDispatchFilter / __| ___| |_ _   Starting in cloud mode on port null
   [junit4]   2> 1243542 INFO  (closeThreadPool-6514-thread-1) [     ] o.a.s.s.SolrDispatchFilter \__ \/ _ \ | '_|  Install dir: null
   [junit4]   2> 1243542 INFO  (closeThreadPool-6514-thread-1) [     ] o.a.s.s.SolrDispatchFilter |___/\___/_|_|    Start time: 2020-01-15T15:31:09.942927Z
   [junit4]   2> 1243544 INFO  (closeThreadPool-6514-thread-1) [     ] o.a.s.c.c.ConnectionManager Waiting for client to connect to ZooKeeper
   [junit4]   2> 1243550 INFO  (zkConnectionManagerCallback-6516-thread-1) [     ] o.a.s.c.c.ConnectionManager zkClient has connected
   [junit4]   2> 1243550 INFO  (closeThreadPool-6514-thread-1) [     ] o.a.s.c.c.ConnectionManager Client is connected to ZooKeeper
   [junit4]   2> 1243657 INFO  (closeThreadPool-6514-thread-1) [     ] o.a.s.s.SolrDispatchFilter Loading solr.xml from SolrHome (not found in ZooKeeper)
   [junit4]   2> 1243657 INFO  (closeThreadPool-6514-thread-1) [     ] o.a.s.c.SolrXmlConfig Loading container configuration from /Users/jenkins/workspace/Lucene-Solr-master-MacOSX/solr/build/solr-core/test/J3/temp/solr.cloud.TestCryptoKeys_3414475A001D7ABB-001/shard-1-001/solr.xml
   [junit4]   2> 1243660 INFO  (closeThreadPool-6514-thread-1) [     ] o.a.s.c.SolrXmlConfig Configuration parameter autoReplicaFailoverWorkLoopDelay is ignored
   [junit4]   2> 1243660 INFO  (closeThreadPool-6514-thread-1) [     ] o.a.s.c.SolrXmlConfig Configuration parameter autoReplicaFailoverBadNodeExpiration is ignored
   [junit4]   2> 1243661 INFO  (closeThreadPool-6514-thread-1) [     ] o.a.s.c.SolrXmlConfig MBean server found: com.sun.jmx.mbeanserver.JmxMBeanServer@5096c33a, but no JMX reporters were configured - adding default JMX reporter.
   [junit4]   2> 1243929 INFO  (closeThreadPool-6514-thread-1) [     ] o.a.s.h.c.HttpShardHandlerFactory Host whitelist initialized: WhitelistHostChecker [whitelistHosts=null, whitelistHostCheckingEnabled=false]
   [junit4]   2> 1243931 WARN  (closeThreadPool-6514-thread-1) [     ] o.e.j.u.s.S.config Trusting all certificates configured for Client@63e84a77[provider=null,keyStore=null,trustStore=null]
   [junit4]   2> 1243931 WARN  (closeThreadPool-6514-thread-1) [     ] o.e.j.u.s.S.config No Client EndPointIdentificationAlgorithm configured for Client@63e84a77[provider=null,keyStore=null,trustStore=null]
   [junit4]   2> 1243935 WARN  (closeThreadPool-6514-thread-1) [     ] o.e.j.u.s.S.config Trusting all certificates configured for Client@545410c0[provider=null,keyStore=null,trustStore=null]
   [junit4]   2> 1243935 WARN  (closeThreadPool-6514-thread-1) [     ] o.e.j.u.s.S.config No Client EndPointIdentificationAlgorithm configured for Client@545410c0[provider=null,keyStore=null,trustStore=null]
   [junit4]   2> 1243936 INFO  (closeThreadPool-6514-thread-1) [     ] o.a.s.c.ZkContainer Zookeeper client=127.0.0.1:54145/solr
   [junit4]   2> 1243938 INFO  (closeThreadPool-6514-thread-1) [     ] o.a.s.c.c.ConnectionManager Waiting for client to connect to ZooKeeper
   [junit4]   2> 1243942 INFO  (zkConnectionManagerCallback-6523-thread-1) [     ] o.a.s.c.c.ConnectionManager zkClient has connected
   [junit4]   2> 1243942 INFO  (closeThreadPool-6514-thread-1) [     ] o.a.s.c.c.ConnectionManager Client is connected to ZooKeeper
   [junit4]   2> 1244051 INFO  (closeThreadPool-6514-thread-1) [n:127.0.0.1:54181_     ] o.a.s.c.c.ConnectionManager Waiting for client to connect to ZooKeeper
   [junit4]   2> 1244056 INFO  (zkConnectionManagerCallback-6525-thread-1) [     ] o.a.s.c.c.ConnectionManager zkClient has connected
   [junit4]   2> 1244056 INFO  (closeThreadPool-6514-thread-1) [n:127.0.0.1:54181_     ] o.a.s.c.c.ConnectionManager Client is connected to ZooKeeper
   [junit4]   2> 1244091 INFO  (closeThreadPool-6514-thread-1) [n:127.0.0.1:54181_     ] o.a.s.c.c.ZkStateReader Updated live nodes from ZooKeeper... (0) -> (1)
   [junit4]   2> 1244112 INFO  (closeThreadPool-6514-thread-1) [n:127.0.0.1:54181_     ] o.a.s.c.ZkController Publish node=127.0.0.1:54181_ as DOWN
   [junit4]   2> 1244119 INFO  (closeThreadPool-6514-thread-1) [n:127.0.0.1:54181_     ] o.a.s.c.TransientSolrCoreCacheDefault Allocating transient cache for 4 transient cores
   [junit4]   2> 1244119 INFO  (closeThreadPool-6514-thread-1) [n:127.0.0.1:54181_     ] o.a.s.c.ZkController Register node as live in ZooKeeper:/live_nodes/127.0.0.1:54181_
   [junit4]   2> 1244126 INFO  (zkCallback-6497-thread-3) [     ] o.a.s.c.c.ZkStateReader Updated live nodes from ZooKeeper... (1) -> (2)
   [junit4]   2> 1244127 INFO  (zkCallback-6512-thread-1) [     ] o.a.s.c.c.ZkStateReader Updated live nodes from ZooKeeper... (1) -> (2)
   [junit4]   2> 1244127 INFO  (zkCallback-6524-thread-1) [     ] o.a.s.c.c.ZkStateReader Updated live nodes from ZooKeeper... (1) -> (2)
   [junit4]   2> 1244135 INFO  (TEST-TestCryptoKeys.test-seed#[3414475A001D7ABB]) [     ] o.a.s.c.AbstractFullDistribZkTestBase create jetty 2 in directory /Users/jenkins/workspace/Lucene-Solr-master-MacOSX/solr/build/solr-core/test/J3/temp/solr.cloud.TestCryptoKeys_3414475A001D7ABB-001/shard-2-001 of type NRT
   [junit4]   2> 1244136 WARN  (closeThreadPool-6514-thread-2) [     ] o.e.j.s.h.g.GzipHandler minGzipSize of 0 is inefficient for short content, break even is size 23
   [junit4]   2> 1244136 INFO  (closeThreadPool-6514-thread-2) [     ] o.a.s.c.s.e.JettySolrRunner Start Jetty (configured port=0, binding port=0)
   [junit4]   2> 1244136 INFO  (closeThreadPool-6514-thread-2) [     ] o.a.s.c.s.e.JettySolrRunner Trying to start Jetty on port 0 try number 1 ...
   [junit4]   2> 1244137 INFO  (closeThreadPool-6514-thread-2) [     ] o.e.j.s.Server jetty-9.4.24.v20191120; built: 2019-11-20T21:37:49.771Z; git: 363d5f2df3a8a28de40604320230664b9c793c16; jvm 13.0.1+9
   [junit4]   2> 1244137 INFO  (closeThreadPool-6514-thread-1) [n:127.0.0.1:54181_     ] o.a.s.p.PackageLoader /packages.json updated to version -1
   [junit4]   2> 1244139 WARN  (closeThreadPool-6514-thread-1) [n:127.0.0.1:54181_     ] o.a.s.c.CoreContainer Not all security plugins configured!  authentication=disabled authorization=disabled.  Solr is only as secure as you make it. Consider configuring authentication/authorization before exposing Solr to users internal or external.  See https://s.apache.org/solrsecurity for more info
   [junit4]   2> 1244140 INFO  (closeThreadPool-6514-thread-2) [     ] o.e.j.s.session DefaultSessionIdManager workerName=node0
   [junit4]   2> 1244140 INFO  (closeThreadPool-6514-thread-2) [     ] o.e.j.s.session No SessionScavenger set, using defaults
   [junit4]   2> 1244140 INFO  (closeThreadPool-6514-thread-2) [     ] o.e.j.s.session node0 Scavenging every 660000ms
   [junit4]   2> 1244140 INFO  (closeThreadPool-6514-thread-2) [     ] o.e.j.s.h.ContextHandler Started o.e.j.s.ServletContextHandler@40600970{/,null,AVAILABLE}
   [junit4]   2> 1244142 INFO  (closeThreadPool-6514-thread-2) [     ] o.e.j.s.AbstractConnector Started ServerConnector@3e36169e{HTTP/1.1,[http/1.1, h2c]}{127.0.0.1:?????}
   [junit4]   2> 1244142 INFO  (closeThreadPool-6514-thread-2) [     ] o.e.j.s.Server Started @???????ms
   [junit4]   2> 1244142 INFO  (closeThreadPool-6514-thread-2) [     ] o.a.s.c.s.e.JettySolrRunner Jetty properties: {hostContext=/, solrconfig=solrconfig.xml, solr.data.dir=/Users/jenkins/workspace/Lucene-Solr-master-MacOSX/solr/build/solr-core/test/J3/temp/solr.cloud.TestCryptoKeys_3414475A001D7ABB-001/tempDir-001/jetty2, hostPort=54186, coreRootDirectory=/Users/jenkins/workspace/Lucene-Solr-master-MacOSX/solr/build/solr-core/test/J3/temp/solr.cloud.TestCryptoKeys_3414475A001D7ABB-001/shard-2-001/cores, replicaType=NRT}
   [junit4]   2> 1244142 ERROR (closeThreadPool-6514-thread-2) [     ] o.a.s.u.StartupLoggingUtils Missing Java Option solr.log.dir. Logging may be missing or incomplete.
   [junit4]   2> 1244142 INFO  (closeThreadPool-6514-thread-2) [     ] o.a.s.s.SolrDispatchFilter Using logger factory org.apache.logging.slf4j.Log4jLoggerFactory
   [junit4]   2> 1244143 INFO  (closeThreadPool-6514-thread-2) [     ] o.a.s.s.SolrDispatchFilter  ___      _       Welcome to Apache Solr™ version 9.0.0
   [junit4]   2> 1244143 INFO  (closeThreadPool-6514-thread-2) [     ] o.a.s.s.SolrDispatchFilter / __| ___| |_ _   Starting in cloud mode on port null
   [junit4]   2> 1244143 INFO  (closeThreadPool-6514-thread-2) [     ] o.a.s.s.SolrDispatchFilter \__ \/ _ \ | '_|  Install dir: null
   [junit4]   2> 1244143 INFO  (closeThreadPool-6514-thread-2) [     ] o.a.s.s.SolrDispatchFilter |___/\___/_|_|    Start time: 2020-01-15T15:31:10.543101Z
   [junit4]   2> 1244145 INFO  (closeThreadPool-6514-thread-2) [     ] o.a.s.c.c.ConnectionManager Waiting for client to connect to ZooKeeper
   [junit4]   2> 1244148 INFO  (zkConnectionManagerCallback-6530-thread-1) [     ] o.a.s.c.c.ConnectionManager zkClient has connected
   [junit4]   2> 1244148 INFO  (closeThreadPool-6514-thread-2) [     ] o.a.s.c.c.ConnectionManager Client is connected to ZooKeeper
   [junit4]   2> 1244192 INFO  (closeThreadPool-6514-thread-1) [n:127.0.0.1:54181_     ] o.a.s.h.a.MetricsHistoryHandler No .system collection, keeping metrics history in memory.
   [junit4]   2> 1244257 INFO  (closeThreadPool-6514-thread-1) [n:127.0.0.1:54181_     ] o.a.s.m.r.SolrJmxReporter JMX monitoring for 'solr.node' (registry 'solr.node') enabled at server: com.sun.jmx.mbeanserver.JmxMBeanServer@5096c33a
   [junit4]   2> 1244265 INFO  (closeThreadPool-6514-thread-2) [     ] o.a.s.s.SolrDispatchFilter Loading solr.xml from SolrHome (not found in ZooKeeper)
   [junit4]   2> 1244265 INFO  (closeThreadPool-6514-thread-2) [     ] o.a.s.c.SolrXmlConfig Loading container configuration from /Users/jenkins/workspace/Lucene-Solr-master-MacOSX/solr/build/solr-core/test/J3/temp/solr.cloud.TestCryptoKeys_3414475A001D7ABB-001/shard-2-001/solr.xml
   [junit4]   2> 1244268 INFO  (closeThreadPool-6514-thread-2) [     ] o.a.s.c.SolrXmlConfig Configuration parameter autoReplicaFailoverWorkLoopDelay is ignored
   [junit4]   2> 1244268 INFO  (closeThreadPool-6514-thread-2) [     ] o.a.s.c.SolrXmlConfig Configuration parameter autoReplicaFailoverBadNodeExpiration is ignored
   [junit4]   2> 1244269 INFO  (closeThreadPool-6514-thread-2) [     ] o.a.s.c.SolrXmlConfig MBean server found: com.sun.jmx.mbeanserver.JmxMBeanServer@5096c33a, but no JMX reporters were configured - adding default JMX reporter.
   [junit4]   2> 1244274 INFO  (closeThreadPool-6514-thread-1) [n:127.0.0.1:54181_     ] o.a.s.m.r.SolrJmxReporter JMX monitoring for 'solr.jvm' (registry 'solr.jvm') enabled at server: com.sun.jmx.mbeanserver.JmxMBeanServer@5096c33a
   [junit4]   2> 1244274 INFO  (closeThreadPool-6514-thread-1) [n:127.0.0.1:54181_     ] o.a.s.m.r.SolrJmxReporter JMX monitoring for 'solr.jetty' (registry 'solr.jetty') enabled at server: com.sun.jmx.mbeanserver.JmxMBeanServer@5096c33a
   [junit4]   2> 1244276 INFO  (closeThreadPool-6514-thread-1) [n:127.0.0.1:54181_     ] o.a.s.c.CorePropertiesLocator Found 0 core definitions underneath /Users/jenkins/workspace/Lucene-Solr-master-MacOSX/solr/build/solr-core/test/J3/temp/solr.cloud.TestCryptoKeys_3414475A001D7ABB-001/shard-1-001/cores
   [junit4]   2> 1244305 INFO  (closeThreadPool-6514-thread-1) [     ] o.a.s.c.AbstractFullDistribZkTestBase waitForLiveNode: 127.0.0.1:54181_
   [junit4]   2> 1244340 INFO  (closeThreadPool-6514-thread-2) [     ] o.a.s.h.c.HttpShardHandlerFactory Host whitelist initialized: WhitelistHostChecker [whitelistHosts=null, whitelistHostCheckingEnabled=false]
   [junit4]   2> 1244341 WARN  (closeThreadPool-6514-thread-2) [     ] o.e.j.u.s.S.config Trusting all certificates configured for Client@610a144f[provider=null,keyStore=null,trustStore=null]
   [junit4]   2> 1244342 WARN  (closeThreadPool-6514-thread-2) [     ] o.e.j.u.s.S.config No Client EndPointIdentificationAlgorithm configured for Client@610a144f[provider=null,keyStore=null,trustStore=null]
   [junit4]   2> 1244348 WARN  (closeThreadPool-6514-thread-2) [     ] o.e.j.u.s.S.config Trusting all certificates configured for Client@36624286[provider=null,keyStore=null,trustStore=null]
   [junit4]   2> 1244348 WARN  (closeThreadPool-6514-thread-2) [     ] o.e.j.u.s.S.config No Client EndPointIdentificationAlgorithm configured for Client@36624286[provider=null,keyStore=null,trustStore=null]
   [junit4]   2> 1244349 INFO  (closeThreadPool-6514-thread-2) [     ] o.a.s.c.ZkContainer Zookeeper client=127.0.0.1:54145/solr
   [junit4]   2> 1244351 INFO  (closeThreadPool-6514-thread-2) [     ] o.a.s.c.c.ConnectionManager Waiting for client to connect to ZooKeeper
   [junit4]   2> 1244357 INFO  (zkConnectionManagerCallback-6538-thread-1) [     ] o.a.s.c.c.ConnectionManager zkClient has connected
   [junit4]   2> 1244357 INFO  (closeThreadPool-6514-thread-2) [     ] o.a.s.c.c.ConnectionManager Client is connected to ZooKeeper
   [junit4]   2> 1244469 INFO  (closeThreadPool-6514-thread-2) [n:127.0.0.1:54186_     ] o.a.s.c.c.ConnectionManager Waiting for client to connect to ZooKeeper
   [junit4]   2> 1244474 INFO  (zkConnectionManagerCallback-6540-thread-1) [     ] o.a.s.c.c.ConnectionManager zkClient has connected
   [junit4]   2> 1244474 INFO  (closeThreadPool-6514-thread-2) [n:127.0.0.1:54186_     ] o.a.s.c.c.ConnectionManager Client is connected to ZooKeeper
   [junit4]   2> 1244505 INFO  (closeThreadPool-6514-thread-2) [n:127.0.0.1:54186_     ] o.a.s.c.c.ZkStateReader Updated live nodes from ZooKeeper... (0) -> (2)
   [junit4]   2> 1244539 INFO  (closeThreadPool-6514-thread-2) [n:127.0.0.1:54186_     ] o.a.s.c.ZkController Publish node=127.0.0.1:54186_ as DOWN
   [junit4]   2> 1244544 INFO  (closeThreadPool-6514-thread-2) [n:127.0.0.1:54186_     ] o.a.s.c.TransientSolrCoreCacheDefault Allocating transient cache for 4 transient cores
   [junit4]   2> 1244544 INFO  (closeThreadPool-6514-thread-2) [n:127.0.0.1:54186_     ] o.a.s.c.ZkController Register node as live in ZooKeeper:/live_nodes/127.0.0.1:54186_
   [junit4]   2> 1244552 INFO  (zkCallback-6497-thread-3) [     ] o.a.s.c.c.ZkStateReader Updated live nodes from ZooKeeper... (2) -> (3)
   [junit4]   2> 1244553 INFO  (zkCallback-6512-thread-1) [     ] o.a.s.c.c.ZkStateReader Updated live nodes from ZooKeeper... (2) -> (3)
   [junit4]   2> 1244553 INFO  (zkCallback-6524-thread-1) [     ] o.a.s.c.c.ZkStateReader Updated live nodes from ZooKeeper... (2) -> (3)
   [junit4]   2> 1244554 INFO  (zkCallback-6539-thread-1) [     ] o.a.s.c.c.ZkStateReader Updated live nodes from ZooKeeper... (2) -> (3)
   [junit4]   2> 1244562 INFO  (closeThreadPool-6514-thread-2) [n:127.0.0.1:54186_     ] o.a.s.p.PackageLoader /packages.json updated to version -1
   [junit4]   2> 1244564 WARN  (closeThreadPool-6514-thread-2) [n:127.0.0.1:54186_     ] o.a.s.c.CoreContainer Not all security plugins configured!  authentication=disabled authorization=disabled.  Solr is only as secure as you make it. Consider configuring authentication/authorization before exposing Solr to users internal or external.  See https://s.apache.org/solrsecurity for more info
   [junit4]   2> 1244595 INFO  (closeThreadPool-6514-thread-2) [n:127.0.0.1:54186_     ] o.a.s.h.a.MetricsHistoryHandler No .system collection, keeping metrics history in memory.
   [junit4]   2> 1244664 INFO  (closeThreadPool-6514-thread-2) [n:127.0.0.1:54186_     ] o.a.s.m.r.SolrJmxReporter JMX monitoring for 'solr.node' (registry 'solr.node') enabled at server: com.sun.jmx.mbeanserver.JmxMBeanServer@5096c33a
   [junit4]   2> 1244684 INFO  (closeThreadPool-6514-thread-2) [n:127.0.0.1:54186_     ] o.a.s.m.r.SolrJmxReporter JMX monitoring for 'solr.jvm' (registry 'solr.jvm') enabled at server: com.sun.jmx.mbeanserver.JmxMBeanServer@5096c33a
   [junit4]   2> 1244685 INFO  (closeThreadPool-6514-thread-2) [n:127.0.0.1:54186_     ] o.a.s.m.r.SolrJmxReporter JMX monitoring for 'solr.jetty' (registry 'solr.jetty') enabled at server: com.sun.jmx.mbeanserver.JmxMBeanServer@5096c33a
   [junit4]   2> 1244689 INFO  (closeThreadPool-6514-thread-2) [n:127.0.0.1:54186_     ] o.a.s.c.CorePropertiesLocator Found 0 core definitions underneath /Users/jenkins/workspace/Lucene-Solr-master-MacOSX/solr/build/solr-core/test/J3/temp/solr.cloud.TestCryptoKeys_3414475A001D7ABB-001/shard-2-001/cores
   [junit4]   2> 1244791 INFO  (closeThreadPool-6514-thread-2) [     ] o.a.s.c.AbstractFullDistribZkTestBase waitForLiveNode: 127.0.0.1:54186_
   [junit4]   2> 1244849 INFO  (OverseerCollectionConfigSetProcessor-72059879528792068-127.0.0.1:54150_-n_0000000000) [n:127.0.0.1:54150_     ] o.a.s.c.OverseerTaskQueue Response ZK path: /overseer/collection-queue-work/qnr-0000000002 doesn't exist.  Requestor may have disconnected from ZooKeeper
   [junit4]   2> 1244887 INFO  (TEST-TestCryptoKeys.test-seed#[3414475A001D7ABB]) [     ] o.a.s.c.AbstractFullDistribZkTestBase create jetty 3 in directory /Users/jenkins/workspace/Lucene-Solr-master-MacOSX/solr/build/solr-core/test/J3/temp/solr.cloud.TestCryptoKeys_3414475A001D7ABB-001/shard-3-001 of type NRT
   [junit4]   2> 1244888 WARN  (closeThreadPool-6514-thread-2) [     ] o.e.j.s.h.g.GzipHandler minGzipSize of 0 is inefficient for short content, break even is size 23
   [junit4]   2> 1244889 INFO  (closeThreadPool-6514-thread-2) [     ] o.a.s.c.s.e.JettySolrRunner Start Jetty (configured port=0, binding port=0)
   [junit4]   2> 1244889 INFO  (closeThreadPool-6514-thread-2) [     ] o.a.s.c.s.e.JettySolrRunner Trying to start Jetty on port 0 try number 1 ...
   [junit4]   2> 1244889 INFO  (closeThreadPool-6514-thread-2) [     ] o.e.j.s.Server jetty-9.4.24.v20191120; built: 2019-11-20T21:37:49.771Z; git: 363d5f2df3a8a28de40604320230664b9c793c16; jvm 13.0.1+9
   [junit4]   2> 1244897 INFO  (closeThreadPool-6514-thread-2) [     ] o.e.j.s.session DefaultSessionIdManager workerName=node0
   [junit4]   2> 1244897 INFO  (closeThreadPool-6514-thread-2) [     ] o.e.j.s.session No SessionScavenger set, using defaults
   [junit4]   2> 1244897 INFO  (closeThreadPool-6514-thread-2) [     ] o.e.j.s.session node0 Scavenging every 600000ms
   [junit4]   2> 1244898 INFO  (closeThreadPool-6514-thread-2) [     ] o.e.j.s.h.ContextHandler Started o.e.j.s.ServletContextHandler@36d72bfe{/,null,AVAILABLE}
   [junit4]   2> 1244900 INFO  (closeThreadPool-6514-thread-2) [     ] o.e.j.s.AbstractConnector Started ServerConnector@1bdfae4a{HTTP/1.1,[http/1.1, h2c]}{127.0.0.1:?????}
   [junit4]   2> 1244900 INFO  (closeThreadPool-6514-thread-2) [     ] o.e.j.s.Server Started @???????ms
   [junit4]   2> 1244900 INFO  (closeThreadPool-6514-thread-2) [     ] o.a.s.c.s.e.JettySolrRunner Jetty properties: {hostContext=/, solrconfig=solrconfig.xml, solr.data.dir=/Users/jenkins/workspace/Lucene-Solr-master-MacOSX/solr/build/solr-core/test/J3/temp/solr.cloud.TestCryptoKeys_3414475A001D7ABB-001/tempDir-001/jetty3, hostPort=54200, coreRootDirectory=/Users/jenkins/workspace/Lucene-Solr-master-MacOSX/solr/build/solr-core/test/J3/temp/solr.cloud.TestCryptoKeys_3414475A001D7ABB-001/shard-3-001/cores, replicaType=NRT}
   [junit4]   2> 1244902 ERROR (closeThreadPool-6514-thread-2) [     ] o.a.s.u.StartupLoggingUtils Missing Java Option solr.log.dir. Logging may be missing or incomplete.
   [junit4]   2> 1244902 INFO  (closeThreadPool-6514-thread-2) [     ] o.a.s.s.SolrDispatchFilter Using logger factory org.apache.logging.slf4j.Log4jLoggerFactory
   [junit4]   2> 1244903 INFO  (closeThreadPool-6514-thread-2) [     ] o.a.s.s.SolrDispatchFilter  ___      _       Welcome to Apache Solr™ version 9.0.0
   [junit4]   2> 1244903 INFO  (closeThreadPool-6514-thread-2) [     ] o.a.s.s.SolrDispatchFilter / __| ___| |_ _   Starting in cloud mode on port null
   [junit4]   2> 1244903 INFO  (closeThreadPool-6514-thread-2) [     ] o.a.s.s.SolrDispatchFilter \__ \/ _ \ | '_|  Install dir: null
   [junit4]   2> 1244903 INFO  (closeThreadPool-6514-thread-2) [     ] o.a.s.s.SolrDispatchFilter |___/\___/_|_|    Start time: 2020-01-15T15:31:11.303143Z
   [junit4]   2> 1244909 INFO  (closeThreadPool-6514-thread-2) [     ] o.a.s.c.c.ConnectionManager Waiting for client to connect to ZooKeeper
   [junit4]   2> 1244913 INFO  (zkConnectionManagerCallback-6546-thread-1) [     ] o.a.s.c.c.ConnectionManager zkClient has connected
   [junit4]   2> 1244913 INFO  (closeThreadPool-6514-thread-2) [     ] o.a.s.c.c.ConnectionManager Client is connected to ZooKeeper
   [junit4]   2> 1245023 INFO  (closeThreadPool-6514-thread-2) [     ] o.a.s.s.SolrDispatchFilter Loading solr.xml from SolrHome (not found in ZooKeeper)
   [junit4]   2> 1245023 INFO  (closeThreadPool-6514-thread-2) [     ] o.a.s.c.SolrXmlConfig Loading container configuration from /Users/jenkins/workspace/Lucene-Solr-master-MacOSX/solr/build/solr-core/test/J3/temp/solr.cloud.TestCryptoKeys_3414475A001D7ABB-001/shard-3-001/solr.xml
   [junit4]   2> 1245028 INFO  (closeThreadPool-6514-thread-2) [     ] o.a.s.c.SolrXmlConfig Configuration parameter autoReplicaFailoverWorkLoopDelay is ignored
   [junit4]   2> 1245028 INFO  (closeThreadPool-6514-thread-2) [     ] o.a.s.c.SolrXmlConfig Configuration parameter autoReplicaFailoverBadNodeExpiration is ignored
   [junit4]   2> 1245031 INFO  (closeThreadPool-6514-thread-2) [     ] o.a.s.c.SolrXmlConfig MBean server found: com.sun.jmx.mbeanserver.JmxMBeanServer@5096c33a, but no JMX reporters were configured - adding default JMX reporter.
   [junit4]   2> 1245267 INFO  (closeThreadPool-6514-thread-2) [     ] o.a.s.h.c.HttpShardHandlerFactory Host whitelist initialized: WhitelistHostChecker [whitelistHosts=null, whitelistHostCheckingEnabled=false]
   [junit4]   2> 1245272 WARN  (closeThreadPool-6514-thread-2) [     ] o.e.j.u.s.S.config Trusting all certificates configured for Client@1560fab9[provider=null,keyStore=null,trustStore=null]
   [junit4]   2> 1245272 WARN  (closeThreadPool-6514-thread-2) [     ] o.e.j.u.s.S.config No Client EndPointIdentificationAlgorithm configured for Client@1560fab9[provider=null,keyStore=null,trustStore=null]
   [junit4]   2> 1245280 WARN  (closeThreadPool-6514-thread-2) [     ] o.e.j.u.s.S.config Trusting all certificates configured for Client@5c819c53[provider=null,keyStore=null,trustStore=null]
   [junit4]   2> 1245280 WARN  (closeThreadPool-6514-thread-2) [     ] o.e.j.u.s.S.config No Client EndPointIdentificationAlgorithm configured for Client@5c819c53[provider=null,keyStore=null,trustStore=null]
   [junit4]   2> 1245282 INFO  (closeThreadPool-6514-thread-2) [     ] o.a.s.c.ZkContainer Zookeeper client=127.0.0.1:54145/solr
   [junit4]   2> 1245285 INFO  (closeThreadPool-6514-thread-2) [     ] o.a.s.c.c.ConnectionManager Waiting for client to connect to ZooKeeper
   [junit4]   2> 1245289 INFO  (zkConnectionManagerCallback-6553-thread-1) [     ] o.a.s.c.c.ConnectionManager zkClient has connected
   [junit4]   2> 1245289 INFO  (closeThreadPool-6514-thread-2) [     ] o.a.s.c.c.ConnectionManager Client is connected to ZooKeeper
   [junit4]   2> 1245422 INFO  (closeThreadPool-6514-thread-2) [     ] o.a.s.c.c.ConnectionManager Waiting for client to connect to ZooKeeper
   [junit4]   2> 1245426 INFO  (zkConnectionManagerCallback-6555-thread-1) [     ] o.a.s.c.c.ConnectionManager zkClient has connected
   [junit4]   2> 1245426 INFO  (closeThreadPool-6514-thread-2) [     ] o.a.s.c.c.ConnectionManager Client is connected to ZooKeeper
   [junit4]   2> 1245477 INFO  (closeThreadPool-6514-thread-2) [     ] o.a.s.c.c.ZkStateReader Updated live nodes from ZooKeeper... (0) -> (3)
   [junit4]   2> 1245510 INFO  (closeThreadPool-6514-thread-2) [     ] o.a.s.c.ZkController Publish node=127.0.0.1:54200_ as DOWN
   [junit4]   2> 1245516 INFO  (closeThreadPool-6514-thread-2) [     ] o.a.s.c.TransientSolrCoreCacheDefault Allocating transient cache for 4 transient cores
   [junit4]   2> 1245517 INFO  (closeThreadPool-6514-thread-2) [     ] o.a.s.c.ZkController Register node as live in ZooKeeper:/live_nodes/127.0.0.1:54200_
   [junit4]   2> 1245527 INFO  (zkCallback-6497-thread-3) [     ] o.a.s.c.c.ZkStateReader Updated live nodes from ZooKeeper... (3) -> (4)
   [junit4]   2> 1245527 INFO  (zkCallback-6524-thread-1) [     ] o.a.s.c.c.ZkStateReader Updated live nodes from ZooKeeper... (3) -> (4)
   [junit4]   2> 1245527 INFO  (zkCallback-6512-thread-1) [     ] o.a.s.c.c.ZkStateReader Updated live nodes from ZooKeeper... (3) -> (4)
   [junit4]   2> 1245527 INFO  (zkCallback-6539-thread-1) [     ] o.a.s.c.c.ZkStateReader Updated live nodes from ZooKeeper... (3) -> (4)
   [junit4]   2> 1245528 INFO  (zkCallback-6554-thread-1) [     ] o.a.s.c.c.ZkStateReader Updated live nodes from ZooKeeper... (3) -> (4)
   [junit4]   2> 1245547 INFO  (closeThreadPool-6514-thread-2) [     ] o.a.s.p.PackageLoader /packages.json updated to version -1
   [junit4]   2> 1245549 WARN  (closeThreadPool-6514-thread-2) [     ] o.a.s.c.CoreContainer Not all security plugins configured!  authentication=disabled authorization=disabled.  Solr is only as secure as you make it. Consider configuring authentication/authorization before exposing Solr to users internal or external.  See https://s.apache.org/solrsecurity for more info
   [junit4]   2> 1245594 INFO  (closeThreadPool-6514-thread-2) [     ] o.a.s.h.a.MetricsHistoryHandler No .system collection, keeping metrics history in memory.
   [junit4]   2> 1245639 INFO  (closeThreadPool-6514-thread-2) [     ] o.a.s.m.r.SolrJmxReporter JMX monitoring for 'solr.node' (registry 'solr.node') enabled at server: com.sun.jmx.mbeanserver.JmxMBeanServer@5096c33a
   [junit4]   2> 1245665 INFO  (closeThreadPool-6514-thread-2) [     ] o.a.s.m.r.SolrJmxReporter JMX monitoring for 'solr.jvm' (registry 'solr.jvm') enabled at server: com.sun.jmx.mbeanserver.JmxMBeanServer@5096c33a
   [junit4]   2> 1245665 INFO  (closeThreadPool-6514-thread-2) [     ] o.a.s.m.r.SolrJmxReporter JMX monitoring for 'solr.jetty' (registry 'solr.jetty') enabled at server: com.sun.jmx.mbeanserver.JmxMBeanServer@5096c33a
   [junit4]   2> 1245671 INFO  (closeThreadPool-6514-thread-2) [     ] o.a.s.c.CorePropertiesLocator Found 0 core definitions underneath /Users/jenkins/workspace/Lucene-Solr-master-MacOSX/solr/build/solr-core/test/J3/temp/solr.cloud.TestCryptoKeys_3414475A001D7ABB-001/shard-3-001/cores
   [junit4]   2> 1245688 INFO  (TEST-TestCryptoKeys.test-seed#[3414475A001D7ABB]) [     ] o.a.s.c.AbstractFullDistribZkTestBase create jetty 4 in directory /Users/jenkins/workspace/Lucene-Solr-master-MacOSX/solr/build/solr-core/test/J3/temp/solr.cloud.TestCryptoKeys_3414475A001D7ABB-001/shard-4-001 of type NRT
   [junit4]   2> 1245689 WARN  (closeThreadPool-6514-thread-1) [     ] o.e.j.s.h.g.GzipHandler minGzipSize of 0 is inefficient for short content, break even is size 23
   [junit4]   2> 1245689 INFO  (closeThreadPool-6514-thread-1) [     ] o.a.s.c.s.e.JettySolrRunner Start Jetty (configured port=0, binding port=0)
   [junit4]   2> 1245689 INFO  (closeThreadPool-6514-thread-1) [     ] o.a.s.c.s.e.JettySolrRunner Trying to start Jetty on port 0 try number 1 ...
   [junit4]   2> 1245689 INFO  (closeThreadPool-6514-thread-1) [     ] o.e.j.s.Server jetty-9.4.24.v20191120; built: 2019-11-20T21:37:49.771Z; git: 363d5f2df3a8a28de40604320230664b9c793c16; jvm 13.0.1+9
   [junit4]   2> 1245691 INFO  (closeThreadPool-6514-thread-1) [     ] o.e.j.s.session DefaultSessionIdManager workerName=node0
   [junit4]   2> 1245691 INFO  (closeThreadPool-6514-thread-1) [     ] o.e.j.s.session No SessionScavenger set, using defaults
   [junit4]   2> 1245691 INFO  (closeThreadPool-6514-thread-1) [     ] o.e.j.s.session node0 Scavenging every 600000ms
   [junit4]   2> 1245692 INFO  (closeThreadPool-6514-thread-1) [     ] o.e.j.s.h.ContextHandler Started o.e.j.s.ServletContextHandler@44095c2a{/,null,AVAILABLE}
   [junit4]   2> 1245693 INFO  (closeThreadPool-6514-thread-1) [     ] o.e.j.s.AbstractConnector Started ServerConnector@ec9c1ad{HTTP/1.1,[http/1.1, h2c]}{127.0.0.1:?????}
   [junit4]   2> 1245693 INFO  (closeThreadPool-6514-thread-1) [     ] o.e.j.s.Server Started @???????ms
   [junit4]   2> 1245693 INFO  (closeThreadPool-6514-thread-1) [     ] o.a.s.c.s.e.JettySolrRunner Jetty properties: {hostContext=/, solrconfig=solrconfig.xml, solr.data.dir=/Users/jenkins/workspace/Lucene-Solr-master-MacOSX/solr/build/solr-core/test/J3/temp/solr.cloud.TestCryptoKeys_3414475A001D7ABB-001/tempDir-001/jetty4, hostPort=54204, coreRootDirectory=/Users/jenkins/workspace/Lucene-Solr-master-MacOSX/solr/build/solr-core/test/J3/temp/solr.cloud.TestCryptoKeys_3414475A001D7ABB-001/shard-4-001/cores, replicaType=NRT}
   [junit4]   2> 1245694 ERROR (closeThreadPool-6514-thread-1) [     ] o.a.s.u.StartupLoggingUtils Missing Java Option solr.log.dir. Logging may be missing or incomplete.
   [junit4]   2> 1245694 INFO  (closeThreadPool-6514-thread-1) [     ] o.a.s.s.SolrDispatchFilter Using logger factory org.apache.logging.slf4j.Log4jLoggerFactory
   [junit4]   2> 1245694 INFO  (closeThreadPool-6514-thread-1) [     ] o.a.s.s.SolrDispatchFilter  ___      _       Welcome to Apache Solr™ version 9.0.0
   [junit4]   2> 1245694 INFO  (closeThreadPool-6514-thread-1) [     ] o.a.s.s.SolrDispatchFilter / __| ___| |_ _   Starting in cloud mode on port null
   [junit4]   2> 1245694 INFO  (closeThreadPool-6514-thread-1) [     ] o.a.s.s.SolrDispatchFilter \__ \/ _ \ | '_|  Install dir: null
   [junit4]   2> 1245694 INFO  (closeThreadPool-6514-thread-1) [     ] o.a.s.s.SolrDispatchFilter |___/\___/_|_|    Start time: 2020-01-15T15:31:12.094176Z
   [junit4]   2> 1245695 INFO  (closeThreadPool-6514-thread-1) [     ] o.a.s.c.c.ConnectionManager Waiting for client to connect to ZooKeeper
   [junit4]   2> 1245707 INFO  (zkConnectionManagerCallback-6561-thread-1) [     ] o.a.s.c.c.ConnectionManager zkClient has connected
   [junit4]   2> 1245707 INFO  (closeThreadPool-6514-thread-1) [     ] o.a.s.c.c.ConnectionManager Client is connected to ZooKeeper
   [junit4]   2> 1245745 INFO  (closeThreadPool-6514-thread-2) [     ] o.a.s.c.AbstractFullDistribZkTestBase waitForLiveNode: 127.0.0.1:54200_
   [junit4]   2> 1245811 INFO  (closeThreadPool-6514-thread-1) [     ] o.a.s.s.SolrDispatchFilter Loading solr.xml from SolrHome (not found in ZooKeeper)
   [junit4]   2> 1245811 INFO  (closeThreadPool-6514-thread-1) [     ] o.a.s.c.SolrXmlConfig Loading container configuration from /Users/jenkins/workspace/Lucene-Solr-master-MacOSX/solr/build/solr-core/test/J3/temp/solr.cloud.TestCryptoKeys_3414475A001D7ABB-001/shard-4-001/solr.xml
   [junit4]   2> 1245814 INFO  (closeThreadPool-6514-thread-1) [     ] o.a.s.c.SolrXmlConfig Configuration parameter autoReplicaFailoverWorkLoopDelay is ignored
   [junit4]   2> 1245814 INFO  (closeThreadPool-6514-thread-1) [     ] o.a.s.c.SolrXmlConfig Configuration parameter autoReplicaFailoverBadNodeExpiration is ignored
   [junit4]   2> 1245815 INFO  (closeThreadPool-6514-thread-1) [     ] o.a.s.c.SolrXmlConfig MBean server found: com.sun.jmx.mbeanserver.JmxMBeanServer@5096c33a, but no JMX reporters were configured - adding default JMX reporter.
   [junit4]   2> 1246019 INFO  (closeThreadPool-6514-thread-1) [     ] o.a.s.h.c.HttpShardHandlerFactory Host whitelist initialized: WhitelistHostChecker [whitelistHosts=null, whitelistHostCheckingEnabled=false]
   [junit4]   2> 1246020 WARN  (closeThreadPool-6514-thread-1) [     ] o.e.j.u.s.S.config Trusting all certificates configured for Client@2c43afd7[provider=null,keyStore=null,trustStore=null]
   [junit4]   2> 1246020 WARN  (closeThreadPool-6514-thread-1) [     ] o.e.j.u.s.S.config No Client EndPointIdentificationAlgorithm configured for Client@2c43afd7[provider=null,keyStore=null,trustStore=null]
   [junit4]   2> 1246024 WARN  (closeThreadPool-6514-thread-1) [     ] o.e.j.u.s.S.config Trusting all certificates configured for Client@3eac8af9[provider=null,keyStore=null,trustStore=null]
   [junit4]   2> 1246024 WARN  (closeThreadPool-6514-thread-1) [     ] o.e.j.u.s.S.config No Client EndPointIdentificationAlgorithm configured for Client@3eac8af9[provider=null,keyStore=null,trustStore=null]
   [junit4]   2> 1246026 INFO  (closeThreadPool-6514-thread-1) [     ] o.a.s.c.ZkContainer Zookeeper client=127.0.0.1:54145/solr
   [junit4]   2> 1246028 INFO  (closeThreadPool-6514-thread-1) [     ] o.a.s.c.c.ConnectionManager Waiting for client to connect to ZooKeeper
   [junit4]   2> 1246032 INFO  (zkConnectionManagerCallback-6568-thread-1) [     ] o.a.s.c.c.ConnectionManager zkClient has connected
   [junit4]   2> 1246032 INFO  (closeThreadPool-6514-thread-1) [     ] o.a.s.c.c.ConnectionManager Client is connected to ZooKeeper
   [junit4]   2> 1246143 INFO  (closeThreadPool-6514-thread-1) [     ] o.a.s.c.c.ConnectionManager Waiting for client to connect to ZooKeeper
   [junit4]   2> 1246147 INFO  (zkConnectionManagerCallback-6570-thread-1) [     ] o.a.s.c.c.ConnectionManager zkClient has connected
   [junit4]   2> 1246147 INFO  (closeThreadPool-6514-thread-1) [     ] o.a.s.c.c.ConnectionManager Client is connected to ZooKeeper
   [junit4]   2> 1246177 INFO  (closeThreadPool-6514-thread-1) [     ] o.a.s.c.c.ZkStateReader Updated live nodes from ZooKeeper... (0) -> (4)
   [junit4]   2> 1246211 INFO  (closeThreadPool-6514-thread-1) [     ] o.a.s.c.ZkController Publish node=127.0.0.1:54204_ as DOWN
   [junit4]   2> 1246218 INFO  (closeThreadPool-6514-thread-1) [     ] o.a.s.c.TransientSolrCoreCacheDefault Allocating transient cache for 4 transient cores
   [junit4]   2> 1246219 INFO  (closeThreadPool-6514-thread-1) [     ] o.a.s.c.ZkController Register node as live in ZooKeeper:/live_nodes/127.0.0.1:54204_
   [junit4]   2> 1246226 INFO  (zkCallback-6497-thread-3) [     ] o.a.s.c.c.ZkStateReader Updated live nodes from ZooKeeper... (4) -> (5)
   [junit4]   2> 1246227 INFO  (zkCallback-6512-thread-1) [     ] o.a.s.c.c.ZkStateReader Updated live nodes from ZooKeeper... (4) -> (5)
   [junit4]   2> 1246227 INFO  (zkCallback-6539-thread-1) [     ] o.a.s.c.c.ZkStateReader Updated live nodes from ZooKeeper... (4) -> (5)
   [junit4]   2> 1246227 INFO  (zkCallback-6554-thread-1) [     ] o.a.s.c.c.ZkStateReader Updated live nodes from ZooKeeper... (4) -> (5)
   [junit4]   2> 1246227 INFO  (zkCallback-6524-thread-1) [     ] o.a.s.c.c.ZkStateReader Updated live nodes from ZooKeeper... (4) -> (5)
   [junit4]   2> 1246228 INFO  (zkCallback-6569-thread-1) [     ] o.a.s.c.c.ZkStateReader Updated live nodes from ZooKeeper... (4) -> (5)
   [junit4]   2> 1246241 INFO  (closeThreadPool-6514-thread-1) [     ] o.a.s.p.PackageLoader /packages.json updated to version -1
   [junit4]   2> 1246242 WARN  (closeThreadPool-6514-thread-1) [     ] o.a.s.c.CoreContainer Not all security plugins configured!  authentication=disabled authorization=disabled.  Solr is only as secure as you make it. Consider configuring authentication/authorization before exposing Solr to users internal or external.  See https://s.apache.org/solrsecurity for more info
   [junit4]   2> 1246306 INFO  (closeThreadPool-6514-thread-1) [     ] o.a.s.h.a.MetricsHistoryHandler No .system collection, keeping metrics history in memory.
   [junit4]   2> 1246352 INFO  (closeThreadPool-6514-thread-1) [     ] o.a.s.m.r.SolrJmxReporter JMX monitoring for 'solr.node' (registry 'solr.node') enabled at server: com.sun.jmx.mbeanserver.JmxMBeanServer@5096c33a
   [junit4]   2> 1246362 INFO  (closeThreadPool-6514-thread-1) [     ] o.a.s.m.r.SolrJmxReporter JMX monitoring for 'solr.jvm' (registry 'solr.jvm') enabled at server: com.sun.jmx.mbeanserver.JmxMBeanServer@5096c33a
   [junit4]   2> 1246362 INFO  (closeThreadPool-6514-thread-1) [     ] o.a.s.m.r.SolrJmxReporter JMX monitoring for 'solr.jetty' (registry 'solr.jetty') enabled at server: com.sun.jmx.mbeanserver.JmxMBeanServer@5096c33a
   [junit4]   2> 1246364 INFO  (closeThreadPool-6514-thread-1) [     ] o.a.s.c.CorePropertiesLocator Found 0 core definitions underneath /Users/jenkins/workspace/Lucene-Solr-master-MacOSX/solr/build/solr-core/test/J3/temp/solr.cloud.TestCryptoKeys_3414475A001D7ABB-001/shard-4-001/cores
   [junit4]   2> 1246406 INFO  (closeThreadPool-6514-thread-1) [     ] o.a.s.c.AbstractFullDistribZkTestBase waitForLiveNode: 127.0.0.1:54204_
   [junit4]   2> 1246411 INFO  (qtp1390150781-16961) [n:127.0.0.1:54150_     ] o.a.s.h.a.CollectionsHandler Invoked Collection Action :addreplica with params node=127.0.0.1:54181_&action=ADDREPLICA&collection=collection1&shard=shard1&type=NRT&wt=javabin&version=2 and sendToOCPQueue=true
   [junit4]   2> 1246411 INFO  (qtp1390150781-16964) [n:127.0.0.1:54150_     ] o.a.s.h.a.CollectionsHandler Invoked Collection Action :addreplica with params node=127.0.0.1:54186_&action=ADDREPLICA&collection=collection1&shard=shard1&type=NRT&wt=javabin&version=2 and sendToOCPQueue=true
   [junit4]   2> 1246413 INFO  (qtp1390150781-16962) [n:127.0.0.1:54150_     ] o.a.s.h.a.CollectionsHandler Invoked Collection Action :addreplica with params node=127.0.0.1:54200_&action=ADDREPLICA&collection=collection1&shard=shard1&type=NRT&wt=javabin&version=2 and sendToOCPQueue=true
   [junit4]   2> 1246413 INFO  (qtp1390150781-16963) [n:127.0.0.1:54150_     ] o.a.s.h.a.CollectionsHandler Invoked Collection Action :addreplica with params node=127.0.0.1:54204_&action=ADDREPLICA&collection=collection1&shard=shard1&type=NRT&wt=javabin&version=2 and sendToOCPQueue=true
   [junit4]   2> 1246424 INFO  (OverseerThreadFactory-3832-thread-3-processing-n:127.0.0.1:54150_) [n:127.0.0.1:54150_ c:collection1 s:shard1   ] o.a.s.c.a.c.AddReplicaCmd Node Identified 127.0.0.1:54181_ for creating new replica of shard shard1 for collection collection1
   [junit4]   2> 1246427 INFO  (OverseerThreadFactory-3832-thread-3-processing-n:127.0.0.1:54150_) [n:127.0.0.1:54150_ c:collection1 s:shard1   ] o.a.s.c.a.c.AddReplicaCmd Returning CreateReplica command.
   [junit4]   2> 1246440 INFO  (qtp1751966073-17020) [n:127.0.0.1:54181_    x:collection1_shard1_replica_n1 ] o.a.s.h.a.CoreAdminOperation core create command qt=/admin/cores&collection.configName=conf1&name=collection1_shard1_replica_n1&action=CREATE&collection=collection1&shard=shard1&wt=javabin&version=2&replicaType=NRT
   [junit4]   2> 1247563 INFO  (qtp1751966073-17020) [n:127.0.0.1:54181_ c:collection1 s:shard1  x:collection1_shard1_replica_n1 ] o.a.s.c.SolrConfig Using Lucene MatchVersion: 9.0.0
   [junit4]   2> 1247576 INFO  (qtp1751966073-17020) [n:127.0.0.1:54181_ c:collection1 s:shard1  x:collection1_shard1_replica_n1 ] o.a.s.s.IndexSchema [collection1_shard1_replica_n1] Schema name=test
   [junit4]   2> 1247700 INFO  (qtp1751966073-17020) [n:127.0.0.1:54181_ c:collection1 s:shard1  x:collection1_shard1_replica_n1 ] o.a.s.s.IndexSchema Loaded schema test/1.0 with uniqueid field id
   [junit4]   2> 1247782 INFO  (qtp1751966073-17020) [n:127.0.0.1:54181_ c:collection1 s:shard1  x:collection1_shard1_replica_n1 ] o.a.s.c.CoreContainer Creating SolrCore 'collection1_shard1_replica_n1' using configuration from collection collection1, trusted=true
   [junit4]   2> 1247782 INFO  (qtp1751966073-17020) [n:127.0.0.1:54181_ c:collection1 s:shard1  x:collection1_shard1_replica_n1 ] o.a.s.m.r.SolrJmxReporter JMX monitoring for 'solr.core.collection1.shard1.replica_n1' (registry 'solr.core.collection1.shard1.replica_n1') enabled at server: com.sun.jmx.mbeanserver.JmxMBeanServer@5096c33a
   [junit4]   2> 1247782 INFO  (qtp1751966073-17020) [n:127.0.0.1:54181_ c:collection1 s:shard1  x:collection1_shard1_replica_n1 ] o.a.s.c.SolrCore [[collection1_shard1_replica_n1] ] Opening new SolrCore at [/Users/jenkins/workspace/Lucene-Solr-master-MacOSX/solr/build/solr-core/test/J3/temp/solr.cloud.TestCryptoKeys_3414475A001D7ABB-001/shard-1-001/cores/collection1_shard1_replica_n1], dataDir=[/Users/jenkins/workspace/Lucene-Solr-master-MacOSX/solr/build/solr-core/test/J3/temp/solr.cloud.TestCryptoKeys_3414475A001D7ABB-001/shard-1-001/cores/collection1_shard1_replica_n1/data/]
   [junit4]   2> 1247785 INFO  (qtp1751966073-17020) [n:127.0.0.1:54181_ c:collection1 s:shard1  x:collection1_shard1_r

[...truncated too long message...]

86/collection1_shard1_replica_n3/, http://127.0.0.1:54200/collection1_shard1_replica_n5/, http://127.0.0.1:54204/collection1_shard1_replica_n7/]",
   [junit4]    >     "trace":"org.apache.solr.common.SolrException: 3 out of 5 the property overlay to be of version 4 within 30 seconds! Failed cores: [http://127.0.0.1:54186/collection1_shard1_replica_n3/, http://127.0.0.1:54200/collection1_shard1_replica_n5/, http://127.0.0.1:54204/collection1_shard1_replica_n7/]\n\tat org.apache.solr.handler.SolrConfigHandler.waitForAllReplicasState(SolrConfigHandler.java:813)\n\tat org.apache.solr.handler.SolrConfigHandler$Command.handleCommands(SolrConfigHandler.java:522)\n\tat org.apache.solr.handler.SolrConfigHandler$Command.handlePOST(SolrConfigHandler.java:363)\n\tat org.apache.solr.handler.SolrConfigHandler.handleRequestBody(SolrConfigHandler.java:139)\n\tat org.apache.solr.handler.RequestHandlerBase.handleRequest(RequestHandlerBase.java:208)\n\tat org.apache.solr.core.SolrCore.execute(SolrCore.java:2582)\n\tat org.apache.solr.servlet.HttpSolrCall.execute(HttpSolrCall.java:799)\n\tat org.apache.solr.servlet.HttpSolrCall.call(HttpSolrCall.java:578)\n\tat org.apache.solr.servlet.SolrDispatchFilter.doFilter(SolrDispatchFilter.java:419)\n\tat org.apache.solr.servlet.SolrDispatchFilter.doFilter(SolrDispatchFilter.java:351)\n\tat org.eclipse.jetty.servlet.ServletHandler$CachedChain.doFilter(ServletHandler.java:1604)\n\tat org.apache.solr.client.solrj.embedded.JettySolrRunner$DebugFilter.doFilter(JettySolrRunner.java:166)\n\tat org.eclipse.jetty.servlet.ServletHandler$CachedChain.doFilter(ServletHandler.java:1604)\n\tat org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:545)\n\tat org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233)\n\tat org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1607)\n\tat org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233)\n\tat org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1297)\n\tat org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188)\n\tat org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:485)\n\tat org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1577)\n\tat org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186)\n\tat org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1212)\n\tat org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141)\n\tat org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)\n\tat org.eclipse.jetty.rewrite.handler.RewriteHandler.handle(RewriteHandler.java:322)\n\tat org.eclipse.jetty.server.handler.gzip.GzipHandler.handle(GzipHandler.java:717)\n\tat org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)\n\tat org.eclipse.jetty.server.Server.handle(Server.java:500)\n\tat org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:383)\n\tat org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:547)\n\tat org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:375)\n\tat org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:270)\n\tat org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311)\n\tat org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:103)\n\tat org.eclipse.jetty.io.ChannelEndPoint$2.run(ChannelEndPoint.java:117)\n\tat org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:806)\n\tat 
org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:938)\n\tat java.base/java.lang.Thread.run(Thread.java:830)\n",
   [junit4]    >     "code":500}}
   [junit4]    >  expected null, but was:<[3 out of 5 the property overlay to be of version 4 within 30 seconds! Failed cores: [http://127.0.0.1:54186/collection1_shard1_replica_n3/, http://127.0.0.1:54200/collection1_shard1_replica_n5/, http://127.0.0.1:54204/collection1_shard1_replica_n7/]
   [junit4]    > ]>
   [junit4]    > 	at __randomizedtesting.SeedInfo.seed([3414475A001D7ABB:BC407880AEE11743]:0)
   [junit4]    > 	at org.apache.solr.core.TestSolrConfigHandler.runConfigCommand(TestSolrConfigHandler.java:179)
   [junit4]    > 	at org.apache.solr.cloud.TestCryptoKeys.test(TestCryptoKeys.java:180)
   [junit4]    > 	at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
   [junit4]    > 	at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
   [junit4]    > 	at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
   [junit4]    > 	at java.base/java.lang.reflect.Method.invoke(Method.java:567)
   [junit4]    > 	at org.apache.solr.BaseDistributedSearchTestCase$ShardsRepeatRule$ShardsFixedStatement.callStatement(BaseDistributedSearchTestCase.java:1082)
   [junit4]    > 	at org.apache.solr.BaseDistributedSearchTestCase$ShardsRepeatRule$ShardsStatement.evaluate(BaseDistributedSearchTestCase.java:1054)
   [junit4]    > 	at java.base/java.lang.Thread.run(Thread.java:830)
   [junit4]   2> NOTE: leaving temporary files on disk at: /Users/jenkins/workspace/Lucene-Solr-master-MacOSX/solr/build/solr-core/test/J3/temp/solr.cloud.TestCryptoKeys_3414475A001D7ABB-001
   [junit4]   2> NOTE: test params are: codec=Asserting(Lucene84): {blobName=BlockTreeOrds(blocksize=128), id=PostingsFormat(name=LuceneVarGapDocFreqInterval), type=BlockTreeOrds(blocksize=128), md5=BlockTreeOrds(blocksize=128)}, docValues:{timestamp_l=DocValuesFormat(name=Lucene80), size=DocValuesFormat(name=Lucene80), _version_=DocValuesFormat(name=Asserting), version=DocValuesFormat(name=Asserting), timestamp=DocValuesFormat(name=Asserting)}, maxPointsInLeafNode=1689, maxMBSortInHeap=7.6867638805874305, sim=Asserting(org.apache.lucene.search.similarities.AssertingSimilarity@34832e6), locale=ar, timezone=Europe/San_Marino
   [junit4]   2> NOTE: Mac OS X 10.14.6 x86_64/AdoptOpenJDK 13.0.1 (64-bit)/cpus=6,threads=1,free=165320704,total=400556032
   [junit4]   2> NOTE: All tests run in this JVM: [SolrPluginUtilsTest, TestSolrConfigHandlerConcurrent, SpellCheckCollatorTest, CdcrOpsAndBoundariesTest, OverseerTest, TestFiltering, TestDefaultStatsCache, HttpPartitionTest, DocumentAnalysisRequestHandlerTest, TestPseudoReturnFields, DistributedFacetPivotSmallTest, TestSmileRequest, TestRangeQuery, SignatureUpdateProcessorFactoryTest, ResponseLogComponentTest, DistribDocExpirationUpdateProcessorTest, TestNonDefinedSimilarityFactory, TestUnInvertedFieldException, TestClusterStateMutator, TestDistributedStatsComponentCardinality, MetricsHistoryIntegrationTest, UninvertDocValuesMergePolicyTest, TestSimNodeAddedTrigger, TestSolrCloudWithDelegationTokens, TestSurroundQueryParser, UUIDUpdateProcessorFallbackTest, ParsingFieldUpdateProcessorsTest, OutputWriterTest, TestEmbeddedSolrServerConstructors, TestLRUStatsCacheCloud, TestCloudSearcherWarming, HighlighterMaxOffsetTest, SolrIndexConfigTest, BufferStoreTest, TestLocalFSCloudBackupRestore, TestReloadAndDeleteDocs, EnumFieldTest, CloudMLTQParserTest, TriLevelCompositeIdRoutingTest, TestBulkSchemaConcurrent, BasicZkTest, TestBackupRepositoryFactory, TestInitQParser, NodeMutatorTest, TestCursorMarkWithoutUniqueKey, TestQueryingOnDownCollection, SortByFunctionTest, PeerSyncWithLeaderAndIndexFingerprintCachingTest, ShowFileRequestHandlerTest, BigEndianAscendingWordSerializerTest, TestFieldCache, MoveReplicaHDFSTest, DeleteNodeTest, SearchRateTriggerTest, TestSchemaNameResource, TestSimDistributedQueue, CircularListTest, TriggerIntegrationTest, AsyncCallRequestStatusResponseTest, CollectionPropsTest, SimpleCollectionCreateDeleteTest, SolrCloudExampleTest, TestPointFields, TestApiFramework, HdfsThreadLeakTest, FieldMutatingUpdateProcessorTest, BadComponentTest, TestUniqueKeyFieldResource, TestHdfsUpdateLog, TestNamedUpdateProcessors, InfixSuggestersTest, AtomicUpdatesTest, TestIndexingPerformance, HdfsNNFailoverTest, SolrCoreMetricManagerTest, TestSimpleTrackingShardHandler, SuggestComponentContextFilterQueryTest, IndexSizeTriggerMixedBoundsTest, TestCloudJSONFacetSKG, LeaderFailoverAfterPartitionTest, HLLUtilTest, RequestHandlersTest, TestStressLiveNodes, TestGraphMLResponseWriter, CoreAdminHandlerTest, TestSha256AuthenticationProvider, TestCryptoKeys]
   [junit4] Completed [482/899 (1!)] on J3 in 161.47s, 1 test, 1 failure <<< FAILURES!

[...truncated 46391 lines...]
BUILD FAILED
/Users/jenkins/workspace/Lucene-Solr-master-MacOSX/build.xml:634: The following error occurred while executing this line:
/Users/jenkins/workspace/Lucene-Solr-master-MacOSX/build.xml:507: The following error occurred while executing this line:
/Users/jenkins/workspace/Lucene-Solr-master-MacOSX/build.xml:494: Source checkout is dirty (unversioned/missing files) after running tests!!! Offending files:
* solr/licenses/jetty-start-9.4.24.v20191120-shaded.jar.sha1

Total time: 75 minutes 15 seconds
Build step 'Invoke Ant' marked build as failure
Archiving artifacts
Setting ANT_1_8_2_HOME=/Users/jenkins/tools/hudson.tasks.Ant_AntInstallation/ANT_1.8.2
[WARNINGS] Skipping publisher since build result is FAILURE
Recording test results
Setting ANT_1_8_2_HOME=/Users/jenkins/tools/hudson.tasks.Ant_AntInstallation/ANT_1.8.2
Email was triggered for: Failure - Any
Sending email for trigger: Failure - Any
Setting ANT_1_8_2_HOME=/Users/jenkins/tools/hudson.tasks.Ant_AntInstallation/ANT_1.8.2
Setting ANT_1_8_2_HOME=/Users/jenkins/tools/hudson.tasks.Ant_AntInstallation/ANT_1.8.2
Setting ANT_1_8_2_HOME=/Users/jenkins/tools/hudson.tasks.Ant_AntInstallation/ANT_1.8.2
Setting ANT_1_8_2_HOME=/Users/jenkins/tools/hudson.tasks.Ant_AntInstallation/ANT_1.8.2
Setting ANT_1_8_2_HOME=/Users/jenkins/tools/hudson.tasks.Ant_AntInstallation/ANT_1.8.2
Setting ANT_1_8_2_HOME=/Users/jenkins/tools/hudson.tasks.Ant_AntInstallation/ANT_1.8.2