Posted to dev@lucene.apache.org by Policeman Jenkins Server <je...@thetaphi.de> on 2016/10/15 15:58:07 UTC
[JENKINS] Lucene-Solr-6.x-Linux (32bit/jdk1.8.0_102) - Build # 1955 - Unstable!
Build: https://jenkins.thetaphi.de/job/Lucene-Solr-6.x-Linux/1955/
Java: 32bit/jdk1.8.0_102 -server -XX:+UseConcMarkSweepGC
1 test failed.
FAILED: org.apache.solr.cloud.PeerSyncReplicationTest.test
Error Message:
expected:<204> but was:<187>
Stack Trace:
java.lang.AssertionError: expected:<204> but was:<187>
at __randomizedtesting.SeedInfo.seed([236A8C932ECFF850:AB3EB349803395A8]:0)
at org.junit.Assert.fail(Assert.java:93)
at org.junit.Assert.failNotEquals(Assert.java:647)
at org.junit.Assert.assertEquals(Assert.java:128)
at org.junit.Assert.assertEquals(Assert.java:472)
at org.junit.Assert.assertEquals(Assert.java:456)
at org.apache.solr.cloud.PeerSyncReplicationTest.bringUpDeadNodeAndEnsureNoReplication(PeerSyncReplicationTest.java:280)
at org.apache.solr.cloud.PeerSyncReplicationTest.test(PeerSyncReplicationTest.java:157)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at com.carrotsearch.randomizedtesting.RandomizedRunner.invoke(RandomizedRunner.java:1764)
at com.carrotsearch.randomizedtesting.RandomizedRunner$8.evaluate(RandomizedRunner.java:871)
at com.carrotsearch.randomizedtesting.RandomizedRunner$9.evaluate(RandomizedRunner.java:907)
at com.carrotsearch.randomizedtesting.RandomizedRunner$10.evaluate(RandomizedRunner.java:921)
at org.apache.solr.BaseDistributedSearchTestCase$ShardsRepeatRule$ShardsFixedStatement.callStatement(BaseDistributedSearchTestCase.java:992)
at org.apache.solr.BaseDistributedSearchTestCase$ShardsRepeatRule$ShardsStatement.evaluate(BaseDistributedSearchTestCase.java:967)
at com.carrotsearch.randomizedtesting.rules.SystemPropertiesRestoreRule$1.evaluate(SystemPropertiesRestoreRule.java:57)
at org.apache.lucene.util.TestRuleSetupTeardownChained$1.evaluate(TestRuleSetupTeardownChained.java:49)
at org.apache.lucene.util.AbstractBeforeAfterRule$1.evaluate(AbstractBeforeAfterRule.java:45)
at org.apache.lucene.util.TestRuleThreadAndTestName$1.evaluate(TestRuleThreadAndTestName.java:48)
at org.apache.lucene.util.TestRuleIgnoreAfterMaxFailures$1.evaluate(TestRuleIgnoreAfterMaxFailures.java:64)
at org.apache.lucene.util.TestRuleMarkFailure$1.evaluate(TestRuleMarkFailure.java:47)
at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
at com.carrotsearch.randomizedtesting.ThreadLeakControl$StatementRunner.run(ThreadLeakControl.java:367)
at com.carrotsearch.randomizedtesting.ThreadLeakControl.forkTimeoutingTask(ThreadLeakControl.java:809)
at com.carrotsearch.randomizedtesting.ThreadLeakControl$3.evaluate(ThreadLeakControl.java:460)
at com.carrotsearch.randomizedtesting.RandomizedRunner.runSingleTest(RandomizedRunner.java:880)
at com.carrotsearch.randomizedtesting.RandomizedRunner$5.evaluate(RandomizedRunner.java:781)
at com.carrotsearch.randomizedtesting.RandomizedRunner$6.evaluate(RandomizedRunner.java:816)
at com.carrotsearch.randomizedtesting.RandomizedRunner$7.evaluate(RandomizedRunner.java:827)
at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
at com.carrotsearch.randomizedtesting.rules.SystemPropertiesRestoreRule$1.evaluate(SystemPropertiesRestoreRule.java:57)
at org.apache.lucene.util.AbstractBeforeAfterRule$1.evaluate(AbstractBeforeAfterRule.java:45)
at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
at org.apache.lucene.util.TestRuleStoreClassName$1.evaluate(TestRuleStoreClassName.java:41)
at com.carrotsearch.randomizedtesting.rules.NoShadowingOrOverridesOnMethodsRule$1.evaluate(NoShadowingOrOverridesOnMethodsRule.java:40)
at com.carrotsearch.randomizedtesting.rules.NoShadowingOrOverridesOnMethodsRule$1.evaluate(NoShadowingOrOverridesOnMethodsRule.java:40)
at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
at org.apache.lucene.util.TestRuleAssertionsRequired$1.evaluate(TestRuleAssertionsRequired.java:53)
at org.apache.lucene.util.TestRuleMarkFailure$1.evaluate(TestRuleMarkFailure.java:47)
at org.apache.lucene.util.TestRuleIgnoreAfterMaxFailures$1.evaluate(TestRuleIgnoreAfterMaxFailures.java:64)
at org.apache.lucene.util.TestRuleIgnoreTestSuites$1.evaluate(TestRuleIgnoreTestSuites.java:54)
at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
at com.carrotsearch.randomizedtesting.ThreadLeakControl$StatementRunner.run(ThreadLeakControl.java:367)
at java.lang.Thread.run(Thread.java:745)
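The mismatch reported above comes from a JUnit 4 `assertEquals` at PeerSyncReplicationTest.java:280; given the enclosing method name (`bringUpDeadNodeAndEnsureNoReplication`), the two numbers are likely document counts that diverged between replicas. A minimal sketch of how JUnit builds that "expected:<...> but was:<...>" message (the `failureMessage` helper here is an illustrative stand-in, not JUnit's actual code):

```java
// Sketch of the JUnit 4 failure-message format seen in the report.
// Assumption: failureMessage mirrors what Assert.assertEquals emits
// when two values differ; it is not copied from JUnit source.
public class AssertMessageSketch {

    static String failureMessage(Object expected, Object actual) {
        return "expected:<" + expected + "> but was:<" + actual + ">";
    }

    public static void main(String[] args) {
        // The report's mismatch: 204 expected, 187 actually found.
        System.out.println(failureMessage(204, 187));
    }
}
```

Since the run is seeded randomized testing (note the `__randomizedtesting.SeedInfo.seed` frame), the seed shown there, 236A8C932ECFF850, can typically be passed back via the usual `-Dtests.seed=` property to attempt a deterministic reproduction of this run.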
Build Log:
[...truncated 12103 lines...]
[junit4] Suite: org.apache.solr.cloud.PeerSyncReplicationTest
[junit4] 2> Creating dataDir: /home/jenkins/workspace/Lucene-Solr-6.x-Linux/solr/build/solr-core/test/J2/temp/solr.cloud.PeerSyncReplicationTest_236A8C932ECFF850-001/init-core-data-001
[junit4] 2> 1518859 INFO (SUITE-PeerSyncReplicationTest-seed#[236A8C932ECFF850]-worker) [ ] o.a.s.SolrTestCaseJ4 Randomized ssl (true) and clientAuth (false) via: @org.apache.solr.util.RandomizeSSL(reason=, ssl=NaN, value=NaN, clientAuth=NaN)
[junit4] 2> 1518859 INFO (SUITE-PeerSyncReplicationTest-seed#[236A8C932ECFF850]-worker) [ ] o.a.s.BaseDistributedSearchTestCase Setting hostContext system property: /
[junit4] 2> 1518861 INFO (TEST-PeerSyncReplicationTest.test-seed#[236A8C932ECFF850]) [ ] o.a.s.c.ZkTestServer STARTING ZK TEST SERVER
[junit4] 2> 1518861 INFO (Thread-2627) [ ] o.a.s.c.ZkTestServer client port:0.0.0.0/0.0.0.0:0
[junit4] 2> 1518861 INFO (Thread-2627) [ ] o.a.s.c.ZkTestServer Starting server
[junit4] 2> 1518961 INFO (TEST-PeerSyncReplicationTest.test-seed#[236A8C932ECFF850]) [ ] o.a.s.c.ZkTestServer start zk server on port:45627
[junit4] 2> 1518966 INFO (TEST-PeerSyncReplicationTest.test-seed#[236A8C932ECFF850]) [ ] o.a.s.c.AbstractZkTestCase put /home/jenkins/workspace/Lucene-Solr-6.x-Linux/solr/core/src/test-files/solr/collection1/conf/solrconfig-tlog.xml to /configs/conf1/solrconfig.xml
[junit4] 2> 1518967 INFO (TEST-PeerSyncReplicationTest.test-seed#[236A8C932ECFF850]) [ ] o.a.s.c.AbstractZkTestCase put /home/jenkins/workspace/Lucene-Solr-6.x-Linux/solr/core/src/test-files/solr/collection1/conf/schema.xml to /configs/conf1/schema.xml
[junit4] 2> 1518968 INFO (TEST-PeerSyncReplicationTest.test-seed#[236A8C932ECFF850]) [ ] o.a.s.c.AbstractZkTestCase put /home/jenkins/workspace/Lucene-Solr-6.x-Linux/solr/core/src/test-files/solr/collection1/conf/solrconfig.snippet.randomindexconfig.xml to /configs/conf1/solrconfig.snippet.randomindexconfig.xml
[junit4] 2> 1518968 INFO (TEST-PeerSyncReplicationTest.test-seed#[236A8C932ECFF850]) [ ] o.a.s.c.AbstractZkTestCase put /home/jenkins/workspace/Lucene-Solr-6.x-Linux/solr/core/src/test-files/solr/collection1/conf/stopwords.txt to /configs/conf1/stopwords.txt
[junit4] 2> 1518969 INFO (TEST-PeerSyncReplicationTest.test-seed#[236A8C932ECFF850]) [ ] o.a.s.c.AbstractZkTestCase put /home/jenkins/workspace/Lucene-Solr-6.x-Linux/solr/core/src/test-files/solr/collection1/conf/protwords.txt to /configs/conf1/protwords.txt
[junit4] 2> 1518969 INFO (TEST-PeerSyncReplicationTest.test-seed#[236A8C932ECFF850]) [ ] o.a.s.c.AbstractZkTestCase put /home/jenkins/workspace/Lucene-Solr-6.x-Linux/solr/core/src/test-files/solr/collection1/conf/currency.xml to /configs/conf1/currency.xml
[junit4] 2> 1518970 INFO (TEST-PeerSyncReplicationTest.test-seed#[236A8C932ECFF850]) [ ] o.a.s.c.AbstractZkTestCase put /home/jenkins/workspace/Lucene-Solr-6.x-Linux/solr/core/src/test-files/solr/collection1/conf/enumsConfig.xml to /configs/conf1/enumsConfig.xml
[junit4] 2> 1518970 INFO (TEST-PeerSyncReplicationTest.test-seed#[236A8C932ECFF850]) [ ] o.a.s.c.AbstractZkTestCase put /home/jenkins/workspace/Lucene-Solr-6.x-Linux/solr/core/src/test-files/solr/collection1/conf/open-exchange-rates.json to /configs/conf1/open-exchange-rates.json
[junit4] 2> 1518971 INFO (TEST-PeerSyncReplicationTest.test-seed#[236A8C932ECFF850]) [ ] o.a.s.c.AbstractZkTestCase put /home/jenkins/workspace/Lucene-Solr-6.x-Linux/solr/core/src/test-files/solr/collection1/conf/mapping-ISOLatin1Accent.txt to /configs/conf1/mapping-ISOLatin1Accent.txt
[junit4] 2> 1518971 INFO (TEST-PeerSyncReplicationTest.test-seed#[236A8C932ECFF850]) [ ] o.a.s.c.AbstractZkTestCase put /home/jenkins/workspace/Lucene-Solr-6.x-Linux/solr/core/src/test-files/solr/collection1/conf/old_synonyms.txt to /configs/conf1/old_synonyms.txt
[junit4] 2> 1518972 INFO (TEST-PeerSyncReplicationTest.test-seed#[236A8C932ECFF850]) [ ] o.a.s.c.AbstractZkTestCase put /home/jenkins/workspace/Lucene-Solr-6.x-Linux/solr/core/src/test-files/solr/collection1/conf/synonyms.txt to /configs/conf1/synonyms.txt
[junit4] 2> 1519033 INFO (TEST-PeerSyncReplicationTest.test-seed#[236A8C932ECFF850]) [ ] o.a.s.SolrTestCaseJ4 Writing core.properties file to /home/jenkins/workspace/Lucene-Solr-6.x-Linux/solr/build/solr-core/test/J2/temp/solr.cloud.PeerSyncReplicationTest_236A8C932ECFF850-001/control-001/cores/collection1
[junit4] 2> 1519035 INFO (TEST-PeerSyncReplicationTest.test-seed#[236A8C932ECFF850]) [ ] o.e.j.s.Server jetty-9.3.8.v20160314
[junit4] 2> 1519036 INFO (TEST-PeerSyncReplicationTest.test-seed#[236A8C932ECFF850]) [ ] o.e.j.s.h.ContextHandler Started o.e.j.s.ServletContextHandler@19afbfa{/,null,AVAILABLE}
[junit4] 2> 1519037 INFO (TEST-PeerSyncReplicationTest.test-seed#[236A8C932ECFF850]) [ ] o.e.j.s.ServerConnector Started ServerConnector@1ff9044{SSL,[ssl, http/1.1]}{127.0.0.1:41344}
[junit4] 2> 1519038 INFO (TEST-PeerSyncReplicationTest.test-seed#[236A8C932ECFF850]) [ ] o.e.j.s.Server Started @1521074ms
[junit4] 2> 1519038 INFO (TEST-PeerSyncReplicationTest.test-seed#[236A8C932ECFF850]) [ ] o.a.s.c.s.e.JettySolrRunner Jetty properties: {solr.data.dir=/home/jenkins/workspace/Lucene-Solr-6.x-Linux/solr/build/solr-core/test/J2/temp/solr.cloud.PeerSyncReplicationTest_236A8C932ECFF850-001/tempDir-001/control/data, hostContext=/, hostPort=41344, coreRootDirectory=/home/jenkins/workspace/Lucene-Solr-6.x-Linux/solr/build/solr-core/test/J2/../../../../../../../../../home/jenkins/workspace/Lucene-Solr-6.x-Linux/solr/build/solr-core/test/J2/temp/solr.cloud.PeerSyncReplicationTest_236A8C932ECFF850-001/control-001/cores}
[junit4] 2> 1519038 ERROR (TEST-PeerSyncReplicationTest.test-seed#[236A8C932ECFF850]) [ ] o.a.s.s.StartupLoggingUtils Missing Java Option solr.log.dir. Logging may be missing or incomplete.
[junit4] 2> 1519038 INFO (TEST-PeerSyncReplicationTest.test-seed#[236A8C932ECFF850]) [ ] o.a.s.s.SolrDispatchFilter ___ _ Welcome to Apache Solr™ version 6.3.0
[junit4] 2> 1519038 INFO (TEST-PeerSyncReplicationTest.test-seed#[236A8C932ECFF850]) [ ] o.a.s.s.SolrDispatchFilter / __| ___| |_ _ Starting in cloud mode on port null
[junit4] 2> 1519038 INFO (TEST-PeerSyncReplicationTest.test-seed#[236A8C932ECFF850]) [ ] o.a.s.s.SolrDispatchFilter \__ \/ _ \ | '_| Install dir: null
[junit4] 2> 1519038 INFO (TEST-PeerSyncReplicationTest.test-seed#[236A8C932ECFF850]) [ ] o.a.s.s.SolrDispatchFilter |___/\___/_|_| Start time: 2016-10-15T15:41:23.562Z
[junit4] 2> 1519040 INFO (TEST-PeerSyncReplicationTest.test-seed#[236A8C932ECFF850]) [ ] o.a.s.s.SolrDispatchFilter Loading solr.xml from SolrHome (not found in ZooKeeper)
[junit4] 2> 1519040 INFO (TEST-PeerSyncReplicationTest.test-seed#[236A8C932ECFF850]) [ ] o.a.s.c.SolrXmlConfig Loading container configuration from /home/jenkins/workspace/Lucene-Solr-6.x-Linux/solr/build/solr-core/test/J2/temp/solr.cloud.PeerSyncReplicationTest_236A8C932ECFF850-001/control-001/solr.xml
[junit4] 2> 1519045 INFO (TEST-PeerSyncReplicationTest.test-seed#[236A8C932ECFF850]) [ ] o.a.s.u.UpdateShardHandler Creating UpdateShardHandler HTTP client with params: socketTimeout=340000&connTimeout=45000&retry=true
[junit4] 2> 1519045 INFO (TEST-PeerSyncReplicationTest.test-seed#[236A8C932ECFF850]) [ ] o.a.s.c.ZkContainer Zookeeper client=127.0.0.1:45627/solr
[junit4] 2> 1519053 INFO (TEST-PeerSyncReplicationTest.test-seed#[236A8C932ECFF850]) [n:127.0.0.1:41344_ ] o.a.s.c.OverseerElectionContext I am going to be the leader 127.0.0.1:41344_
[junit4] 2> 1519053 INFO (TEST-PeerSyncReplicationTest.test-seed#[236A8C932ECFF850]) [n:127.0.0.1:41344_ ] o.a.s.c.Overseer Overseer (id=96766924120784901-127.0.0.1:41344_-n_0000000000) starting
[junit4] 2> 1519057 INFO (TEST-PeerSyncReplicationTest.test-seed#[236A8C932ECFF850]) [n:127.0.0.1:41344_ ] o.a.s.c.ZkController Register node as live in ZooKeeper:/live_nodes/127.0.0.1:41344_
[junit4] 2> 1519057 INFO (zkCallback-24679-thread-1-processing-n:127.0.0.1:41344_) [n:127.0.0.1:41344_ ] o.a.s.c.c.ZkStateReader Updated live nodes from ZooKeeper... (0) -> (1)
[junit4] 2> 1519077 INFO (TEST-PeerSyncReplicationTest.test-seed#[236A8C932ECFF850]) [n:127.0.0.1:41344_ ] o.a.s.c.CorePropertiesLocator Found 1 core definitions underneath /home/jenkins/workspace/Lucene-Solr-6.x-Linux/solr/build/solr-core/test/J2/../../../../../../../../../home/jenkins/workspace/Lucene-Solr-6.x-Linux/solr/build/solr-core/test/J2/temp/solr.cloud.PeerSyncReplicationTest_236A8C932ECFF850-001/control-001/cores
[junit4] 2> 1519077 INFO (TEST-PeerSyncReplicationTest.test-seed#[236A8C932ECFF850]) [n:127.0.0.1:41344_ ] o.a.s.c.CorePropertiesLocator Cores are: [collection1]
[junit4] 2> 1519078 INFO (OverseerStateUpdate-96766924120784901-127.0.0.1:41344_-n_0000000000) [n:127.0.0.1:41344_ ] o.a.s.c.o.ReplicaMutator Assigning new node to shard shard=shard1
[junit4] 2> 1520086 WARN (coreLoadExecutor-9376-thread-1-processing-n:127.0.0.1:41344_) [n:127.0.0.1:41344_ c:control_collection x:collection1] o.a.s.c.Config Beginning with Solr 5.5, <mergePolicy> is deprecated, use <mergePolicyFactory> instead.
[junit4] 2> 1520086 INFO (coreLoadExecutor-9376-thread-1-processing-n:127.0.0.1:41344_) [n:127.0.0.1:41344_ c:control_collection x:collection1] o.a.s.c.SolrConfig Using Lucene MatchVersion: 6.3.0
[junit4] 2> 1520124 INFO (coreLoadExecutor-9376-thread-1-processing-n:127.0.0.1:41344_) [n:127.0.0.1:41344_ c:control_collection x:collection1] o.a.s.s.IndexSchema [collection1] Schema name=test
[junit4] 2> 1520201 WARN (coreLoadExecutor-9376-thread-1-processing-n:127.0.0.1:41344_) [n:127.0.0.1:41344_ c:control_collection x:collection1] o.a.s.s.IndexSchema [collection1] default search field in schema is text. WARNING: Deprecated, please use 'df' on request instead.
[junit4] 2> 1520203 INFO (coreLoadExecutor-9376-thread-1-processing-n:127.0.0.1:41344_) [n:127.0.0.1:41344_ c:control_collection x:collection1] o.a.s.s.IndexSchema Loaded schema test/1.0 with uniqueid field id
[junit4] 2> 1520209 INFO (coreLoadExecutor-9376-thread-1-processing-n:127.0.0.1:41344_) [n:127.0.0.1:41344_ c:control_collection x:collection1] o.a.s.c.CoreContainer Creating SolrCore 'collection1' using configuration from collection control_collection
[junit4] 2> 1520209 INFO (coreLoadExecutor-9376-thread-1-processing-n:127.0.0.1:41344_) [n:127.0.0.1:41344_ c:control_collection s:shard1 r:core_node1 x:collection1] o.a.s.c.SolrCore [[collection1] ] Opening new SolrCore at [/home/jenkins/workspace/Lucene-Solr-6.x-Linux/solr/build/solr-core/test/J2/temp/solr.cloud.PeerSyncReplicationTest_236A8C932ECFF850-001/control-001/cores/collection1], dataDir=[/home/jenkins/workspace/Lucene-Solr-6.x-Linux/solr/build/solr-core/test/J2/../../../../../../../../../home/jenkins/workspace/Lucene-Solr-6.x-Linux/solr/build/solr-core/test/J2/temp/solr.cloud.PeerSyncReplicationTest_236A8C932ECFF850-001/control-001/cores/collection1/data/]
[junit4] 2> 1520209 INFO (coreLoadExecutor-9376-thread-1-processing-n:127.0.0.1:41344_) [n:127.0.0.1:41344_ c:control_collection s:shard1 r:core_node1 x:collection1] o.a.s.c.JmxMonitoredMap JMX monitoring is enabled. Adding Solr mbeans to JMX Server: com.sun.jmx.mbeanserver.JmxMBeanServer@6578c9
[junit4] 2> 1520211 INFO (coreLoadExecutor-9376-thread-1-processing-n:127.0.0.1:41344_) [n:127.0.0.1:41344_ c:control_collection s:shard1 r:core_node1 x:collection1] o.a.s.u.RandomMergePolicy RandomMergePolicy wrapping class org.apache.lucene.index.TieredMergePolicy: [TieredMergePolicy: maxMergeAtOnce=47, maxMergeAtOnceExplicit=35, maxMergedSegmentMB=6.7490234375, floorSegmentMB=1.4541015625, forceMergeDeletesPctAllowed=28.12303288361992, segmentsPerTier=45.0, maxCFSSegmentSizeMB=8.796093022207999E12, noCFSRatio=0.0
[junit4] 2> 1520291 WARN (coreLoadExecutor-9376-thread-1-processing-n:127.0.0.1:41344_) [n:127.0.0.1:41344_ c:control_collection s:shard1 r:core_node1 x:collection1] o.a.s.c.RequestHandlers INVALID paramSet a in requestHandler {type = requestHandler,name = /dump,class = DumpRequestHandler,args = {defaults={a=A,b=B}}}
[junit4] 2> 1520296 INFO (coreLoadExecutor-9376-thread-1-processing-n:127.0.0.1:41344_) [n:127.0.0.1:41344_ c:control_collection s:shard1 r:core_node1 x:collection1] o.a.s.u.UpdateHandler Using UpdateLog implementation: org.apache.solr.update.UpdateLog
[junit4] 2> 1520296 INFO (coreLoadExecutor-9376-thread-1-processing-n:127.0.0.1:41344_) [n:127.0.0.1:41344_ c:control_collection s:shard1 r:core_node1 x:collection1] o.a.s.u.UpdateLog Initializing UpdateLog: dataDir= defaultSyncLevel=FLUSH numRecordsToKeep=1000 maxNumLogsToKeep=10 numVersionBuckets=65536
[junit4] 2> 1520297 INFO (coreLoadExecutor-9376-thread-1-processing-n:127.0.0.1:41344_) [n:127.0.0.1:41344_ c:control_collection s:shard1 r:core_node1 x:collection1] o.a.s.u.CommitTracker Hard AutoCommit: disabled
[junit4] 2> 1520297 INFO (coreLoadExecutor-9376-thread-1-processing-n:127.0.0.1:41344_) [n:127.0.0.1:41344_ c:control_collection s:shard1 r:core_node1 x:collection1] o.a.s.u.CommitTracker Soft AutoCommit: disabled
[junit4] 2> 1520297 INFO (coreLoadExecutor-9376-thread-1-processing-n:127.0.0.1:41344_) [n:127.0.0.1:41344_ c:control_collection s:shard1 r:core_node1 x:collection1] o.a.s.u.RandomMergePolicy RandomMergePolicy wrapping class org.apache.lucene.index.AlcoholicMergePolicy: [AlcoholicMergePolicy: minMergeSize=0, mergeFactor=10, maxMergeSize=730345928, maxMergeSizeForForcedMerge=9223372036854775807, calibrateSizeByDeletes=true, maxMergeDocs=2147483647, maxCFSSegmentSizeMB=8.796093022207999E12, noCFSRatio=0.1]
[junit4] 2> 1520298 INFO (coreLoadExecutor-9376-thread-1-processing-n:127.0.0.1:41344_) [n:127.0.0.1:41344_ c:control_collection s:shard1 r:core_node1 x:collection1] o.a.s.s.SolrIndexSearcher Opening [Searcher@2e5748[collection1] main]
[junit4] 2> 1520298 INFO (coreLoadExecutor-9376-thread-1-processing-n:127.0.0.1:41344_) [n:127.0.0.1:41344_ c:control_collection s:shard1 r:core_node1 x:collection1] o.a.s.r.ManagedResourceStorage Configured ZooKeeperStorageIO with znodeBase: /configs/conf1
[junit4] 2> 1520299 INFO (coreLoadExecutor-9376-thread-1-processing-n:127.0.0.1:41344_) [n:127.0.0.1:41344_ c:control_collection s:shard1 r:core_node1 x:collection1] o.a.s.r.ManagedResourceStorage Loaded null at path _rest_managed.json using ZooKeeperStorageIO:path=/configs/conf1
[junit4] 2> 1520299 INFO (coreLoadExecutor-9376-thread-1-processing-n:127.0.0.1:41344_) [n:127.0.0.1:41344_ c:control_collection s:shard1 r:core_node1 x:collection1] o.a.s.h.ReplicationHandler Commits will be reserved for 10000
[junit4] 2> 1520299 INFO (searcherExecutor-9377-thread-1-processing-n:127.0.0.1:41344_ x:collection1 s:shard1 c:control_collection r:core_node1) [n:127.0.0.1:41344_ c:control_collection s:shard1 r:core_node1 x:collection1] o.a.s.c.SolrCore [collection1] Registered new searcher Searcher@2e5748[collection1] main{ExitableDirectoryReader(UninvertingDirectoryReader())}
[junit4] 2> 1520299 INFO (coreLoadExecutor-9376-thread-1-processing-n:127.0.0.1:41344_) [n:127.0.0.1:41344_ c:control_collection s:shard1 r:core_node1 x:collection1] o.a.s.u.UpdateLog Could not find max version in index or recent updates, using new clock 1548270787439362048
[junit4] 2> 1520303 INFO (coreZkRegister-9369-thread-1-processing-n:127.0.0.1:41344_ x:collection1 s:shard1 c:control_collection r:core_node1) [n:127.0.0.1:41344_ c:control_collection s:shard1 r:core_node1 x:collection1] o.a.s.c.ShardLeaderElectionContext Enough replicas found to continue.
[junit4] 2> 1520303 INFO (coreZkRegister-9369-thread-1-processing-n:127.0.0.1:41344_ x:collection1 s:shard1 c:control_collection r:core_node1) [n:127.0.0.1:41344_ c:control_collection s:shard1 r:core_node1 x:collection1] o.a.s.c.ShardLeaderElectionContext I may be the new leader - try and sync
[junit4] 2> 1520303 INFO (coreZkRegister-9369-thread-1-processing-n:127.0.0.1:41344_ x:collection1 s:shard1 c:control_collection r:core_node1) [n:127.0.0.1:41344_ c:control_collection s:shard1 r:core_node1 x:collection1] o.a.s.c.SyncStrategy Sync replicas to https://127.0.0.1:41344/collection1/
[junit4] 2> 1520303 INFO (coreZkRegister-9369-thread-1-processing-n:127.0.0.1:41344_ x:collection1 s:shard1 c:control_collection r:core_node1) [n:127.0.0.1:41344_ c:control_collection s:shard1 r:core_node1 x:collection1] o.a.s.c.SyncStrategy Sync Success - now sync replicas to me
[junit4] 2> 1520303 INFO (coreZkRegister-9369-thread-1-processing-n:127.0.0.1:41344_ x:collection1 s:shard1 c:control_collection r:core_node1) [n:127.0.0.1:41344_ c:control_collection s:shard1 r:core_node1 x:collection1] o.a.s.c.SyncStrategy https://127.0.0.1:41344/collection1/ has no replicas
[junit4] 2> 1520305 INFO (coreZkRegister-9369-thread-1-processing-n:127.0.0.1:41344_ x:collection1 s:shard1 c:control_collection r:core_node1) [n:127.0.0.1:41344_ c:control_collection s:shard1 r:core_node1 x:collection1] o.a.s.c.ShardLeaderElectionContext I am the new leader: https://127.0.0.1:41344/collection1/ shard1
[junit4] 2> 1520456 INFO (coreZkRegister-9369-thread-1-processing-n:127.0.0.1:41344_ x:collection1 s:shard1 c:control_collection r:core_node1) [n:127.0.0.1:41344_ c:control_collection s:shard1 r:core_node1 x:collection1] o.a.s.c.ZkController I am the leader, no recovery necessary
[junit4] 2> 1520581 INFO (TEST-PeerSyncReplicationTest.test-seed#[236A8C932ECFF850]) [ ] o.a.s.c.c.ZkStateReader Updated live nodes from ZooKeeper... (0) -> (1)
[junit4] 2> 1520582 INFO (TEST-PeerSyncReplicationTest.test-seed#[236A8C932ECFF850]) [ ] o.a.s.c.ChaosMonkey monkey: init - expire sessions:false cause connection loss:false
[junit4] 2> 1520645 INFO (TEST-PeerSyncReplicationTest.test-seed#[236A8C932ECFF850]) [ ] o.a.s.SolrTestCaseJ4 Writing core.properties file to /home/jenkins/workspace/Lucene-Solr-6.x-Linux/solr/build/solr-core/test/J2/temp/solr.cloud.PeerSyncReplicationTest_236A8C932ECFF850-001/shard-1-001/cores/collection1
[junit4] 2> 1520646 INFO (TEST-PeerSyncReplicationTest.test-seed#[236A8C932ECFF850]) [ ] o.a.s.c.AbstractFullDistribZkTestBase create jetty 1 in directory /home/jenkins/workspace/Lucene-Solr-6.x-Linux/solr/build/solr-core/test/J2/temp/solr.cloud.PeerSyncReplicationTest_236A8C932ECFF850-001/shard-1-001
[junit4] 2> 1520647 INFO (TEST-PeerSyncReplicationTest.test-seed#[236A8C932ECFF850]) [ ] o.e.j.s.Server jetty-9.3.8.v20160314
[junit4] 2> 1520648 INFO (TEST-PeerSyncReplicationTest.test-seed#[236A8C932ECFF850]) [ ] o.e.j.s.h.ContextHandler Started o.e.j.s.ServletContextHandler@cd21c7{/,null,AVAILABLE}
[junit4] 2> 1520650 INFO (TEST-PeerSyncReplicationTest.test-seed#[236A8C932ECFF850]) [ ] o.e.j.s.ServerConnector Started ServerConnector@96121e{SSL,[ssl, http/1.1]}{127.0.0.1:39898}
[junit4] 2> 1520650 INFO (TEST-PeerSyncReplicationTest.test-seed#[236A8C932ECFF850]) [ ] o.e.j.s.Server Started @1522686ms
[junit4] 2> 1520650 INFO (TEST-PeerSyncReplicationTest.test-seed#[236A8C932ECFF850]) [ ] o.a.s.c.s.e.JettySolrRunner Jetty properties: {solr.data.dir=/home/jenkins/workspace/Lucene-Solr-6.x-Linux/solr/build/solr-core/test/J2/temp/solr.cloud.PeerSyncReplicationTest_236A8C932ECFF850-001/tempDir-001/jetty1, solrconfig=solrconfig.xml, hostContext=/, hostPort=39898, coreRootDirectory=/home/jenkins/workspace/Lucene-Solr-6.x-Linux/solr/build/solr-core/test/J2/temp/solr.cloud.PeerSyncReplicationTest_236A8C932ECFF850-001/shard-1-001/cores}
[junit4] 2> 1520650 ERROR (TEST-PeerSyncReplicationTest.test-seed#[236A8C932ECFF850]) [ ] o.a.s.s.StartupLoggingUtils Missing Java Option solr.log.dir. Logging may be missing or incomplete.
[junit4] 2> 1520651 INFO (TEST-PeerSyncReplicationTest.test-seed#[236A8C932ECFF850]) [ ] o.a.s.s.SolrDispatchFilter ___ _ Welcome to Apache Solr™ version 6.3.0
[junit4] 2> 1520651 INFO (TEST-PeerSyncReplicationTest.test-seed#[236A8C932ECFF850]) [ ] o.a.s.s.SolrDispatchFilter / __| ___| |_ _ Starting in cloud mode on port null
[junit4] 2> 1520651 INFO (TEST-PeerSyncReplicationTest.test-seed#[236A8C932ECFF850]) [ ] o.a.s.s.SolrDispatchFilter \__ \/ _ \ | '_| Install dir: null
[junit4] 2> 1520651 INFO (TEST-PeerSyncReplicationTest.test-seed#[236A8C932ECFF850]) [ ] o.a.s.s.SolrDispatchFilter |___/\___/_|_| Start time: 2016-10-15T15:41:25.175Z
[junit4] 2> 1520654 INFO (TEST-PeerSyncReplicationTest.test-seed#[236A8C932ECFF850]) [ ] o.a.s.s.SolrDispatchFilter Loading solr.xml from SolrHome (not found in ZooKeeper)
[junit4] 2> 1520654 INFO (TEST-PeerSyncReplicationTest.test-seed#[236A8C932ECFF850]) [ ] o.a.s.c.SolrXmlConfig Loading container configuration from /home/jenkins/workspace/Lucene-Solr-6.x-Linux/solr/build/solr-core/test/J2/temp/solr.cloud.PeerSyncReplicationTest_236A8C932ECFF850-001/shard-1-001/solr.xml
[junit4] 2> 1520658 INFO (TEST-PeerSyncReplicationTest.test-seed#[236A8C932ECFF850]) [ ] o.a.s.u.UpdateShardHandler Creating UpdateShardHandler HTTP client with params: socketTimeout=340000&connTimeout=45000&retry=true
[junit4] 2> 1520658 INFO (TEST-PeerSyncReplicationTest.test-seed#[236A8C932ECFF850]) [ ] o.a.s.c.ZkContainer Zookeeper client=127.0.0.1:45627/solr
[junit4] 2> 1520666 INFO (TEST-PeerSyncReplicationTest.test-seed#[236A8C932ECFF850]) [n:127.0.0.1:39898_ ] o.a.s.c.c.ZkStateReader Updated live nodes from ZooKeeper... (0) -> (1)
[junit4] 2> 1520668 INFO (TEST-PeerSyncReplicationTest.test-seed#[236A8C932ECFF850]) [n:127.0.0.1:39898_ ] o.a.s.c.ZkController Register node as live in ZooKeeper:/live_nodes/127.0.0.1:39898_
[junit4] 2> 1520668 INFO (zkCallback-24679-thread-1-processing-n:127.0.0.1:41344_) [n:127.0.0.1:41344_ ] o.a.s.c.c.ZkStateReader Updated live nodes from ZooKeeper... (1) -> (2)
[junit4] 2> 1520668 INFO (zkCallback-24688-thread-1-processing-n:127.0.0.1:39898_) [n:127.0.0.1:39898_ ] o.a.s.c.c.ZkStateReader Updated live nodes from ZooKeeper... (1) -> (2)
[junit4] 2> 1520668 INFO (zkCallback-24683-thread-1) [ ] o.a.s.c.c.ZkStateReader Updated live nodes from ZooKeeper... (1) -> (2)
[junit4] 2> 1520682 INFO (TEST-PeerSyncReplicationTest.test-seed#[236A8C932ECFF850]) [n:127.0.0.1:39898_ ] o.a.s.c.CorePropertiesLocator Found 1 core definitions underneath /home/jenkins/workspace/Lucene-Solr-6.x-Linux/solr/build/solr-core/test/J2/temp/solr.cloud.PeerSyncReplicationTest_236A8C932ECFF850-001/shard-1-001/cores
[junit4] 2> 1520682 INFO (TEST-PeerSyncReplicationTest.test-seed#[236A8C932ECFF850]) [n:127.0.0.1:39898_ ] o.a.s.c.CorePropertiesLocator Cores are: [collection1]
[junit4] 2> 1520683 INFO (OverseerStateUpdate-96766924120784901-127.0.0.1:41344_-n_0000000000) [n:127.0.0.1:41344_ ] o.a.s.c.o.ReplicaMutator Assigning new node to shard shard=shard1
[junit4] 2> 1521691 WARN (coreLoadExecutor-9387-thread-1-processing-n:127.0.0.1:39898_) [n:127.0.0.1:39898_ c:collection1 x:collection1] o.a.s.c.Config Beginning with Solr 5.5, <mergePolicy> is deprecated, use <mergePolicyFactory> instead.
[junit4] 2> 1521692 INFO (coreLoadExecutor-9387-thread-1-processing-n:127.0.0.1:39898_) [n:127.0.0.1:39898_ c:collection1 x:collection1] o.a.s.c.SolrConfig Using Lucene MatchVersion: 6.3.0
[junit4] 2> 1521703 INFO (coreLoadExecutor-9387-thread-1-processing-n:127.0.0.1:39898_) [n:127.0.0.1:39898_ c:collection1 x:collection1] o.a.s.s.IndexSchema [collection1] Schema name=test
[junit4] 2> 1521786 WARN (coreLoadExecutor-9387-thread-1-processing-n:127.0.0.1:39898_) [n:127.0.0.1:39898_ c:collection1 x:collection1] o.a.s.s.IndexSchema [collection1] default search field in schema is text. WARNING: Deprecated, please use 'df' on request instead.
[junit4] 2> 1521788 INFO (coreLoadExecutor-9387-thread-1-processing-n:127.0.0.1:39898_) [n:127.0.0.1:39898_ c:collection1 x:collection1] o.a.s.s.IndexSchema Loaded schema test/1.0 with uniqueid field id
[junit4] 2> 1521794 INFO (coreLoadExecutor-9387-thread-1-processing-n:127.0.0.1:39898_) [n:127.0.0.1:39898_ c:collection1 x:collection1] o.a.s.c.CoreContainer Creating SolrCore 'collection1' using configuration from collection collection1
[junit4] 2> 1521795 INFO (coreLoadExecutor-9387-thread-1-processing-n:127.0.0.1:39898_) [n:127.0.0.1:39898_ c:collection1 s:shard1 r:core_node1 x:collection1] o.a.s.c.SolrCore [[collection1] ] Opening new SolrCore at [/home/jenkins/workspace/Lucene-Solr-6.x-Linux/solr/build/solr-core/test/J2/temp/solr.cloud.PeerSyncReplicationTest_236A8C932ECFF850-001/shard-1-001/cores/collection1], dataDir=[/home/jenkins/workspace/Lucene-Solr-6.x-Linux/solr/build/solr-core/test/J2/temp/solr.cloud.PeerSyncReplicationTest_236A8C932ECFF850-001/shard-1-001/cores/collection1/data/]
[junit4] 2> 1521795 INFO (coreLoadExecutor-9387-thread-1-processing-n:127.0.0.1:39898_) [n:127.0.0.1:39898_ c:collection1 s:shard1 r:core_node1 x:collection1] o.a.s.c.JmxMonitoredMap JMX monitoring is enabled. Adding Solr mbeans to JMX Server: com.sun.jmx.mbeanserver.JmxMBeanServer@6578c9
[junit4] 2> 1521796 INFO (coreLoadExecutor-9387-thread-1-processing-n:127.0.0.1:39898_) [n:127.0.0.1:39898_ c:collection1 s:shard1 r:core_node1 x:collection1] o.a.s.u.RandomMergePolicy RandomMergePolicy wrapping class org.apache.lucene.index.TieredMergePolicy: [TieredMergePolicy: maxMergeAtOnce=47, maxMergeAtOnceExplicit=35, maxMergedSegmentMB=6.7490234375, floorSegmentMB=1.4541015625, forceMergeDeletesPctAllowed=28.12303288361992, segmentsPerTier=45.0, maxCFSSegmentSizeMB=8.796093022207999E12, noCFSRatio=0.0
[junit4] 2> 1521909 WARN (coreLoadExecutor-9387-thread-1-processing-n:127.0.0.1:39898_) [n:127.0.0.1:39898_ c:collection1 s:shard1 r:core_node1 x:collection1] o.a.s.c.RequestHandlers INVALID paramSet a in requestHandler {type = requestHandler,name = /dump,class = DumpRequestHandler,args = {defaults={a=A,b=B}}}
[junit4] 2> 1521915 INFO (coreLoadExecutor-9387-thread-1-processing-n:127.0.0.1:39898_) [n:127.0.0.1:39898_ c:collection1 s:shard1 r:core_node1 x:collection1] o.a.s.u.UpdateHandler Using UpdateLog implementation: org.apache.solr.update.UpdateLog
[junit4] 2> 1521915 INFO (coreLoadExecutor-9387-thread-1-processing-n:127.0.0.1:39898_) [n:127.0.0.1:39898_ c:collection1 s:shard1 r:core_node1 x:collection1] o.a.s.u.UpdateLog Initializing UpdateLog: dataDir= defaultSyncLevel=FLUSH numRecordsToKeep=1000 maxNumLogsToKeep=10 numVersionBuckets=65536
[junit4] 2> 1521916 INFO (coreLoadExecutor-9387-thread-1-processing-n:127.0.0.1:39898_) [n:127.0.0.1:39898_ c:collection1 s:shard1 r:core_node1 x:collection1] o.a.s.u.CommitTracker Hard AutoCommit: disabled
[junit4] 2> 1521916 INFO (coreLoadExecutor-9387-thread-1-processing-n:127.0.0.1:39898_) [n:127.0.0.1:39898_ c:collection1 s:shard1 r:core_node1 x:collection1] o.a.s.u.CommitTracker Soft AutoCommit: disabled
[junit4] 2> 1521916 INFO (coreLoadExecutor-9387-thread-1-processing-n:127.0.0.1:39898_) [n:127.0.0.1:39898_ c:collection1 s:shard1 r:core_node1 x:collection1] o.a.s.u.RandomMergePolicy RandomMergePolicy wrapping class org.apache.lucene.index.AlcoholicMergePolicy: [AlcoholicMergePolicy: minMergeSize=0, mergeFactor=10, maxMergeSize=730345928, maxMergeSizeForForcedMerge=9223372036854775807, calibrateSizeByDeletes=true, maxMergeDocs=2147483647, maxCFSSegmentSizeMB=8.796093022207999E12, noCFSRatio=0.1]
[junit4] 2> 1521917 INFO (coreLoadExecutor-9387-thread-1-processing-n:127.0.0.1:39898_) [n:127.0.0.1:39898_ c:collection1 s:shard1 r:core_node1 x:collection1] o.a.s.s.SolrIndexSearcher Opening [Searcher@119ae3e[collection1] main]
[junit4] 2> 1521917 INFO (coreLoadExecutor-9387-thread-1-processing-n:127.0.0.1:39898_) [n:127.0.0.1:39898_ c:collection1 s:shard1 r:core_node1 x:collection1] o.a.s.r.ManagedResourceStorage Configured ZooKeeperStorageIO with znodeBase: /configs/conf1
[junit4] 2> 1521917 INFO (coreLoadExecutor-9387-thread-1-processing-n:127.0.0.1:39898_) [n:127.0.0.1:39898_ c:collection1 s:shard1 r:core_node1 x:collection1] o.a.s.r.ManagedResourceStorage Loaded null at path _rest_managed.json using ZooKeeperStorageIO:path=/configs/conf1
[junit4] 2> 1521917 INFO (coreLoadExecutor-9387-thread-1-processing-n:127.0.0.1:39898_) [n:127.0.0.1:39898_ c:collection1 s:shard1 r:core_node1 x:collection1] o.a.s.h.ReplicationHandler Commits will be reserved for 10000
[junit4] 2> 1521918 INFO (searcherExecutor-9388-thread-1-processing-n:127.0.0.1:39898_ x:collection1 s:shard1 c:collection1 r:core_node1) [n:127.0.0.1:39898_ c:collection1 s:shard1 r:core_node1 x:collection1] o.a.s.c.SolrCore [collection1] Registered new searcher Searcher@119ae3e[collection1] main{ExitableDirectoryReader(UninvertingDirectoryReader())}
[junit4] 2> 1521918 INFO (coreLoadExecutor-9387-thread-1-processing-n:127.0.0.1:39898_) [n:127.0.0.1:39898_ c:collection1 s:shard1 r:core_node1 x:collection1] o.a.s.u.UpdateLog Could not find max version in index or recent updates, using new clock 1548270789137006592
[junit4] 2> 1521921 INFO (coreZkRegister-9382-thread-1-processing-n:127.0.0.1:39898_ x:collection1 s:shard1 c:collection1 r:core_node1) [n:127.0.0.1:39898_ c:collection1 s:shard1 r:core_node1 x:collection1] o.a.s.c.ShardLeaderElectionContext Enough replicas found to continue.
[junit4] 2> 1521921 INFO (coreZkRegister-9382-thread-1-processing-n:127.0.0.1:39898_ x:collection1 s:shard1 c:collection1 r:core_node1) [n:127.0.0.1:39898_ c:collection1 s:shard1 r:core_node1 x:collection1] o.a.s.c.ShardLeaderElectionContext I may be the new leader - try and sync
[junit4] 2> 1521921 INFO (coreZkRegister-9382-thread-1-processing-n:127.0.0.1:39898_ x:collection1 s:shard1 c:collection1 r:core_node1) [n:127.0.0.1:39898_ c:collection1 s:shard1 r:core_node1 x:collection1] o.a.s.c.SyncStrategy Sync replicas to https://127.0.0.1:39898/collection1/
[junit4] 2> 1521922 INFO (coreZkRegister-9382-thread-1-processing-n:127.0.0.1:39898_ x:collection1 s:shard1 c:collection1 r:core_node1) [n:127.0.0.1:39898_ c:collection1 s:shard1 r:core_node1 x:collection1] o.a.s.c.SyncStrategy Sync Success - now sync replicas to me
[junit4] 2> 1521922 INFO (coreZkRegister-9382-thread-1-processing-n:127.0.0.1:39898_ x:collection1 s:shard1 c:collection1 r:core_node1) [n:127.0.0.1:39898_ c:collection1 s:shard1 r:core_node1 x:collection1] o.a.s.c.SyncStrategy https://127.0.0.1:39898/collection1/ has no replicas
[junit4] 2> 1521923 INFO (coreZkRegister-9382-thread-1-processing-n:127.0.0.1:39898_ x:collection1 s:shard1 c:collection1 r:core_node1) [n:127.0.0.1:39898_ c:collection1 s:shard1 r:core_node1 x:collection1] o.a.s.c.ShardLeaderElectionContext I am the new leader: https://127.0.0.1:39898/collection1/ shard1
[junit4] 2> 1522073 INFO (coreZkRegister-9382-thread-1-processing-n:127.0.0.1:39898_ x:collection1 s:shard1 c:collection1 r:core_node1) [n:127.0.0.1:39898_ c:collection1 s:shard1 r:core_node1 x:collection1] o.a.s.c.ZkController I am the leader, no recovery necessary
[junit4] 2> 1522242 INFO (TEST-PeerSyncReplicationTest.test-seed#[236A8C932ECFF850]) [ ] o.a.s.SolrTestCaseJ4 Writing core.properties file to /home/jenkins/workspace/Lucene-Solr-6.x-Linux/solr/build/solr-core/test/J2/temp/solr.cloud.PeerSyncReplicationTest_236A8C932ECFF850-001/shard-2-001/cores/collection1
[junit4] 2> 1522242 INFO (TEST-PeerSyncReplicationTest.test-seed#[236A8C932ECFF850]) [ ] o.a.s.c.AbstractFullDistribZkTestBase create jetty 2 in directory /home/jenkins/workspace/Lucene-Solr-6.x-Linux/solr/build/solr-core/test/J2/temp/solr.cloud.PeerSyncReplicationTest_236A8C932ECFF850-001/shard-2-001
[junit4] 2> 1522244 INFO (TEST-PeerSyncReplicationTest.test-seed#[236A8C932ECFF850]) [ ] o.e.j.s.Server jetty-9.3.8.v20160314
[junit4] 2> 1522245 INFO (TEST-PeerSyncReplicationTest.test-seed#[236A8C932ECFF850]) [ ] o.e.j.s.h.ContextHandler Started o.e.j.s.ServletContextHandler@1978004{/,null,AVAILABLE}
[junit4] 2> 1522247 INFO (TEST-PeerSyncReplicationTest.test-seed#[236A8C932ECFF850]) [ ] o.e.j.s.ServerConnector Started ServerConnector@1376811{SSL,[ssl, http/1.1]}{127.0.0.1:35205}
[junit4] 2> 1522247 INFO (TEST-PeerSyncReplicationTest.test-seed#[236A8C932ECFF850]) [ ] o.e.j.s.Server Started @1524283ms
[junit4] 2> 1522247 INFO (TEST-PeerSyncReplicationTest.test-seed#[236A8C932ECFF850]) [ ] o.a.s.c.s.e.JettySolrRunner Jetty properties: {solr.data.dir=/home/jenkins/workspace/Lucene-Solr-6.x-Linux/solr/build/solr-core/test/J2/temp/solr.cloud.PeerSyncReplicationTest_236A8C932ECFF850-001/tempDir-001/jetty2, solrconfig=solrconfig.xml, hostContext=/, hostPort=35205, coreRootDirectory=/home/jenkins/workspace/Lucene-Solr-6.x-Linux/solr/build/solr-core/test/J2/../../../../../../../../../home/jenkins/workspace/Lucene-Solr-6.x-Linux/solr/build/solr-core/test/J2/temp/solr.cloud.PeerSyncReplicationTest_236A8C932ECFF850-001/shard-2-001/cores}
[junit4] 2> 1522247 ERROR (TEST-PeerSyncReplicationTest.test-seed#[236A8C932ECFF850]) [ ] o.a.s.s.StartupLoggingUtils Missing Java Option solr.log.dir. Logging may be missing or incomplete.
[junit4] 2> 1522248 INFO (TEST-PeerSyncReplicationTest.test-seed#[236A8C932ECFF850]) [ ] o.a.s.s.SolrDispatchFilter ___ _ Welcome to Apache Solr™ version 6.3.0
[junit4] 2> 1522248 INFO (TEST-PeerSyncReplicationTest.test-seed#[236A8C932ECFF850]) [ ] o.a.s.s.SolrDispatchFilter / __| ___| |_ _ Starting in cloud mode on port null
[junit4] 2> 1522249 INFO (TEST-PeerSyncReplicationTest.test-seed#[236A8C932ECFF850]) [ ] o.a.s.s.SolrDispatchFilter \__ \/ _ \ | '_| Install dir: null
[junit4] 2> 1522249 INFO (TEST-PeerSyncReplicationTest.test-seed#[236A8C932ECFF850]) [ ] o.a.s.s.SolrDispatchFilter |___/\___/_|_| Start time: 2016-10-15T15:41:26.773Z
[junit4] 2> 1522250 INFO (TEST-PeerSyncReplicationTest.test-seed#[236A8C932ECFF850]) [ ] o.a.s.s.SolrDispatchFilter Loading solr.xml from SolrHome (not found in ZooKeeper)
[junit4] 2> 1522250 INFO (TEST-PeerSyncReplicationTest.test-seed#[236A8C932ECFF850]) [ ] o.a.s.c.SolrXmlConfig Loading container configuration from /home/jenkins/workspace/Lucene-Solr-6.x-Linux/solr/build/solr-core/test/J2/temp/solr.cloud.PeerSyncReplicationTest_236A8C932ECFF850-001/shard-2-001/solr.xml
[junit4] 2> 1522255 INFO (TEST-PeerSyncReplicationTest.test-seed#[236A8C932ECFF850]) [ ] o.a.s.u.UpdateShardHandler Creating UpdateShardHandler HTTP client with params: socketTimeout=340000&connTimeout=45000&retry=true
[junit4] 2> 1522255 INFO (TEST-PeerSyncReplicationTest.test-seed#[236A8C932ECFF850]) [ ] o.a.s.c.ZkContainer Zookeeper client=127.0.0.1:45627/solr
[junit4] 2> 1522259 INFO (TEST-PeerSyncReplicationTest.test-seed#[236A8C932ECFF850]) [n:127.0.0.1:35205_ ] o.a.s.c.c.ZkStateReader Updated live nodes from ZooKeeper... (0) -> (2)
[junit4] 2> 1522262 INFO (TEST-PeerSyncReplicationTest.test-seed#[236A8C932ECFF850]) [n:127.0.0.1:35205_ ] o.a.s.c.ZkController Register node as live in ZooKeeper:/live_nodes/127.0.0.1:35205_
[junit4] 2> 1522263 INFO (zkCallback-24683-thread-1) [ ] o.a.s.c.c.ZkStateReader Updated live nodes from ZooKeeper... (2) -> (3)
[junit4] 2> 1522263 INFO (zkCallback-24694-thread-1-processing-n:127.0.0.1:35205_) [n:127.0.0.1:35205_ ] o.a.s.c.c.ZkStateReader Updated live nodes from ZooKeeper... (2) -> (3)
[junit4] 2> 1522263 INFO (zkCallback-24688-thread-1-processing-n:127.0.0.1:39898_) [n:127.0.0.1:39898_ ] o.a.s.c.c.ZkStateReader Updated live nodes from ZooKeeper... (2) -> (3)
[junit4] 2> 1522263 INFO (zkCallback-24679-thread-1-processing-n:127.0.0.1:41344_) [n:127.0.0.1:41344_ ] o.a.s.c.c.ZkStateReader Updated live nodes from ZooKeeper... (2) -> (3)
[junit4] 2> 1522275 INFO (TEST-PeerSyncReplicationTest.test-seed#[236A8C932ECFF850]) [n:127.0.0.1:35205_ ] o.a.s.c.CorePropertiesLocator Found 1 core definitions underneath /home/jenkins/workspace/Lucene-Solr-6.x-Linux/solr/build/solr-core/test/J2/../../../../../../../../../home/jenkins/workspace/Lucene-Solr-6.x-Linux/solr/build/solr-core/test/J2/temp/solr.cloud.PeerSyncReplicationTest_236A8C932ECFF850-001/shard-2-001/cores
[junit4] 2> 1522275 INFO (TEST-PeerSyncReplicationTest.test-seed#[236A8C932ECFF850]) [n:127.0.0.1:35205_ ] o.a.s.c.CorePropertiesLocator Cores are: [collection1]
[junit4] 2> 1522277 INFO (OverseerStateUpdate-96766924120784901-127.0.0.1:41344_-n_0000000000) [n:127.0.0.1:41344_ ] o.a.s.c.o.ReplicaMutator Assigning new node to shard shard=shard1
[junit4] 2> 1523283 WARN (coreLoadExecutor-9398-thread-1-processing-n:127.0.0.1:35205_) [n:127.0.0.1:35205_ c:collection1 x:collection1] o.a.s.c.Config Beginning with Solr 5.5, <mergePolicy> is deprecated, use <mergePolicyFactory> instead.
[junit4] 2> 1523285 INFO (coreLoadExecutor-9398-thread-1-processing-n:127.0.0.1:35205_) [n:127.0.0.1:35205_ c:collection1 x:collection1] o.a.s.c.SolrConfig Using Lucene MatchVersion: 6.3.0
[junit4] 2> 1523294 INFO (coreLoadExecutor-9398-thread-1-processing-n:127.0.0.1:35205_) [n:127.0.0.1:35205_ c:collection1 x:collection1] o.a.s.s.IndexSchema [collection1] Schema name=test
[junit4] 2> 1523369 WARN (coreLoadExecutor-9398-thread-1-processing-n:127.0.0.1:35205_) [n:127.0.0.1:35205_ c:collection1 x:collection1] o.a.s.s.IndexSchema [collection1] default search field in schema is text. WARNING: Deprecated, please use 'df' on request instead.
[junit4] 2> 1523371 INFO (coreLoadExecutor-9398-thread-1-processing-n:127.0.0.1:35205_) [n:127.0.0.1:35205_ c:collection1 x:collection1] o.a.s.s.IndexSchema Loaded schema test/1.0 with uniqueid field id
[junit4] 2> 1523377 INFO (coreLoadExecutor-9398-thread-1-processing-n:127.0.0.1:35205_) [n:127.0.0.1:35205_ c:collection1 x:collection1] o.a.s.c.CoreContainer Creating SolrCore 'collection1' using configuration from collection collection1
[junit4] 2> 1523377 INFO (coreLoadExecutor-9398-thread-1-processing-n:127.0.0.1:35205_) [n:127.0.0.1:35205_ c:collection1 s:shard1 r:core_node2 x:collection1] o.a.s.c.SolrCore [[collection1] ] Opening new SolrCore at [/home/jenkins/workspace/Lucene-Solr-6.x-Linux/solr/build/solr-core/test/J2/temp/solr.cloud.PeerSyncReplicationTest_236A8C932ECFF850-001/shard-2-001/cores/collection1], dataDir=[/home/jenkins/workspace/Lucene-Solr-6.x-Linux/solr/build/solr-core/test/J2/../../../../../../../../../home/jenkins/workspace/Lucene-Solr-6.x-Linux/solr/build/solr-core/test/J2/temp/solr.cloud.PeerSyncReplicationTest_236A8C932ECFF850-001/shard-2-001/cores/collection1/data/]
[junit4] 2> 1523377 INFO (coreLoadExecutor-9398-thread-1-processing-n:127.0.0.1:35205_) [n:127.0.0.1:35205_ c:collection1 s:shard1 r:core_node2 x:collection1] o.a.s.c.JmxMonitoredMap JMX monitoring is enabled. Adding Solr mbeans to JMX Server: com.sun.jmx.mbeanserver.JmxMBeanServer@6578c9
[junit4] 2> 1523379 INFO (coreLoadExecutor-9398-thread-1-processing-n:127.0.0.1:35205_) [n:127.0.0.1:35205_ c:collection1 s:shard1 r:core_node2 x:collection1] o.a.s.u.RandomMergePolicy RandomMergePolicy wrapping class org.apache.lucene.index.TieredMergePolicy: [TieredMergePolicy: maxMergeAtOnce=47, maxMergeAtOnceExplicit=35, maxMergedSegmentMB=6.7490234375, floorSegmentMB=1.4541015625, forceMergeDeletesPctAllowed=28.12303288361992, segmentsPerTier=45.0, maxCFSSegmentSizeMB=8.796093022207999E12, noCFSRatio=0.0
[junit4] 2> 1523421 WARN (coreLoadExecutor-9398-thread-1-processing-n:127.0.0.1:35205_) [n:127.0.0.1:35205_ c:collection1 s:shard1 r:core_node2 x:collection1] o.a.s.c.RequestHandlers INVALID paramSet a in requestHandler {type = requestHandler,name = /dump,class = DumpRequestHandler,args = {defaults={a=A,b=B}}}
[junit4] 2> 1523427 INFO (coreLoadExecutor-9398-thread-1-processing-n:127.0.0.1:35205_) [n:127.0.0.1:35205_ c:collection1 s:shard1 r:core_node2 x:collection1] o.a.s.u.UpdateHandler Using UpdateLog implementation: org.apache.solr.update.UpdateLog
[junit4] 2> 1523427 INFO (coreLoadExecutor-9398-thread-1-processing-n:127.0.0.1:35205_) [n:127.0.0.1:35205_ c:collection1 s:shard1 r:core_node2 x:collection1] o.a.s.u.UpdateLog Initializing UpdateLog: dataDir= defaultSyncLevel=FLUSH numRecordsToKeep=1000 maxNumLogsToKeep=10 numVersionBuckets=65536
[junit4] 2> 1523428 INFO (coreLoadExecutor-9398-thread-1-processing-n:127.0.0.1:35205_) [n:127.0.0.1:35205_ c:collection1 s:shard1 r:core_node2 x:collection1] o.a.s.u.CommitTracker Hard AutoCommit: disabled
[junit4] 2> 1523428 INFO (coreLoadExecutor-9398-thread-1-processing-n:127.0.0.1:35205_) [n:127.0.0.1:35205_ c:collection1 s:shard1 r:core_node2 x:collection1] o.a.s.u.CommitTracker Soft AutoCommit: disabled
[junit4] 2> 1523428 INFO (coreLoadExecutor-9398-thread-1-processing-n:127.0.0.1:35205_) [n:127.0.0.1:35205_ c:collection1 s:shard1 r:core_node2 x:collection1] o.a.s.u.RandomMergePolicy RandomMergePolicy wrapping class org.apache.lucene.index.AlcoholicMergePolicy: [AlcoholicMergePolicy: minMergeSize=0, mergeFactor=10, maxMergeSize=730345928, maxMergeSizeForForcedMerge=9223372036854775807, calibrateSizeByDeletes=true, maxMergeDocs=2147483647, maxCFSSegmentSizeMB=8.796093022207999E12, noCFSRatio=0.1]
[junit4] 2> 1523428 INFO (coreLoadExecutor-9398-thread-1-processing-n:127.0.0.1:35205_) [n:127.0.0.1:35205_ c:collection1 s:shard1 r:core_node2 x:collection1] o.a.s.s.SolrIndexSearcher Opening [Searcher@336f61[collection1] main]
[junit4] 2> 1523429 INFO (coreLoadExecutor-9398-thread-1-processing-n:127.0.0.1:35205_) [n:127.0.0.1:35205_ c:collection1 s:shard1 r:core_node2 x:collection1] o.a.s.r.ManagedResourceStorage Configured ZooKeeperStorageIO with znodeBase: /configs/conf1
[junit4] 2> 1523429 INFO (coreLoadExecutor-9398-thread-1-processing-n:127.0.0.1:35205_) [n:127.0.0.1:35205_ c:collection1 s:shard1 r:core_node2 x:collection1] o.a.s.r.ManagedResourceStorage Loaded null at path _rest_managed.json using ZooKeeperStorageIO:path=/configs/conf1
[junit4] 2> 1523429 INFO (coreLoadExecutor-9398-thread-1-processing-n:127.0.0.1:35205_) [n:127.0.0.1:35205_ c:collection1 s:shard1 r:core_node2 x:collection1] o.a.s.h.ReplicationHandler Commits will be reserved for 10000
[junit4] 2> 1523430 INFO (searcherExecutor-9399-thread-1-processing-n:127.0.0.1:35205_ x:collection1 s:shard1 c:collection1 r:core_node2) [n:127.0.0.1:35205_ c:collection1 s:shard1 r:core_node2 x:collection1] o.a.s.c.SolrCore [collection1] Registered new searcher Searcher@336f61[collection1] main{ExitableDirectoryReader(UninvertingDirectoryReader())}
[junit4] 2> 1523430 INFO (coreLoadExecutor-9398-thread-1-processing-n:127.0.0.1:35205_) [n:127.0.0.1:35205_ c:collection1 s:shard1 r:core_node2 x:collection1] o.a.s.u.UpdateLog Could not find max version in index or recent updates, using new clock 1548270790722453504
[junit4] 2> 1523433 INFO (coreZkRegister-9393-thread-1-processing-n:127.0.0.1:35205_ x:collection1 s:shard1 c:collection1 r:core_node2) [n:127.0.0.1:35205_ c:collection1 s:shard1 r:core_node2 x:collection1] o.a.s.c.ZkController Core needs to recover:collection1
[junit4] 2> 1523433 INFO (updateExecutor-24691-thread-1-processing-n:127.0.0.1:35205_ x:collection1 s:shard1 c:collection1 r:core_node2) [n:127.0.0.1:35205_ c:collection1 s:shard1 r:core_node2 x:collection1] o.a.s.u.DefaultSolrCoreState Running recovery
[junit4] 2> 1523433 INFO (recoveryExecutor-24692-thread-1-processing-n:127.0.0.1:35205_ x:collection1 s:shard1 c:collection1 r:core_node2) [n:127.0.0.1:35205_ c:collection1 s:shard1 r:core_node2 x:collection1] o.a.s.c.RecoveryStrategy Starting recovery process. recoveringAfterStartup=true
[junit4] 2> 1523433 INFO (recoveryExecutor-24692-thread-1-processing-n:127.0.0.1:35205_ x:collection1 s:shard1 c:collection1 r:core_node2) [n:127.0.0.1:35205_ c:collection1 s:shard1 r:core_node2 x:collection1] o.a.s.c.RecoveryStrategy ###### startupVersions=[[]]
[junit4] 2> 1523433 INFO (recoveryExecutor-24692-thread-1-processing-n:127.0.0.1:35205_ x:collection1 s:shard1 c:collection1 r:core_node2) [n:127.0.0.1:35205_ c:collection1 s:shard1 r:core_node2 x:collection1] o.a.s.c.RecoveryStrategy Begin buffering updates. core=[collection1]
[junit4] 2> 1523434 INFO (recoveryExecutor-24692-thread-1-processing-n:127.0.0.1:35205_ x:collection1 s:shard1 c:collection1 r:core_node2) [n:127.0.0.1:35205_ c:collection1 s:shard1 r:core_node2 x:collection1] o.a.s.u.UpdateLog Starting to buffer updates. FSUpdateLog{state=ACTIVE, tlog=null}
[junit4] 2> 1523434 INFO (recoveryExecutor-24692-thread-1-processing-n:127.0.0.1:35205_ x:collection1 s:shard1 c:collection1 r:core_node2) [n:127.0.0.1:35205_ c:collection1 s:shard1 r:core_node2 x:collection1] o.a.s.c.RecoveryStrategy Publishing state of core [collection1] as recovering, leader is [https://127.0.0.1:39898/collection1/] and I am [https://127.0.0.1:35205/collection1/]
[junit4] 2> 1523435 INFO (recoveryExecutor-24692-thread-1-processing-n:127.0.0.1:35205_ x:collection1 s:shard1 c:collection1 r:core_node2) [n:127.0.0.1:35205_ c:collection1 s:shard1 r:core_node2 x:collection1] o.a.s.c.RecoveryStrategy Sending prep recovery command to [https://127.0.0.1:39898]; [WaitForState: action=PREPRECOVERY&core=collection1&nodeName=127.0.0.1:35205_&coreNodeName=core_node2&state=recovering&checkLive=true&onlyIfLeader=true&onlyIfLeaderActive=true]
[junit4] 2> 1523501 INFO (qtp13018214-93853) [n:127.0.0.1:39898_ ] o.a.s.h.a.PrepRecoveryOp Going to wait for coreNodeName: core_node2, state: recovering, checkLive: true, onlyIfLeader: true, onlyIfLeaderActive: true
[junit4] 2> 1523502 INFO (qtp13018214-93853) [n:127.0.0.1:39898_ ] o.a.s.h.a.PrepRecoveryOp Will wait a max of 183 seconds to see collection1 (shard1 of collection1) have state: recovering
[junit4] 2> 1523502 INFO (qtp13018214-93853) [n:127.0.0.1:39898_ ] o.a.s.h.a.PrepRecoveryOp In WaitForState(recovering): collection=collection1, shard=shard1, thisCore=collection1, leaderDoesNotNeedRecovery=false, isLeader? true, live=true, checkLive=true, currentState=down, localState=active, nodeName=127.0.0.1:35205_, coreNodeName=core_node2, onlyIfActiveCheckResult=false, nodeProps: core_node2:{"core":"collection1","base_url":"https://127.0.0.1:35205","node_name":"127.0.0.1:35205_","state":"down"}
[junit4] 2> 1523833 INFO (TEST-PeerSyncReplicationTest.test-seed#[236A8C932ECFF850]) [ ] o.a.s.SolrTestCaseJ4 Writing core.properties file to /home/jenkins/workspace/Lucene-Solr-6.x-Linux/solr/build/solr-core/test/J2/temp/solr.cloud.PeerSyncReplicationTest_236A8C932ECFF850-001/shard-3-001/cores/collection1
[junit4] 2> 1523833 INFO (TEST-PeerSyncReplicationTest.test-seed#[236A8C932ECFF850]) [ ] o.a.s.c.AbstractFullDistribZkTestBase create jetty 3 in directory /home/jenkins/workspace/Lucene-Solr-6.x-Linux/solr/build/solr-core/test/J2/temp/solr.cloud.PeerSyncReplicationTest_236A8C932ECFF850-001/shard-3-001
[junit4] 2> 1523834 INFO (TEST-PeerSyncReplicationTest.test-seed#[236A8C932ECFF850]) [ ] o.e.j.s.Server jetty-9.3.8.v20160314
[junit4] 2> 1523835 INFO (TEST-PeerSyncReplicationTest.test-seed#[236A8C932ECFF850]) [ ] o.e.j.s.h.ContextHandler Started o.e.j.s.ServletContextHandler@de092f{/,null,AVAILABLE}
[junit4] 2> 1523837 INFO (TEST-PeerSyncReplicationTest.test-seed#[236A8C932ECFF850]) [ ] o.e.j.s.ServerConnector Started ServerConnector@160df77{SSL,[ssl, http/1.1]}{127.0.0.1:44182}
[junit4] 2> 1523837 INFO (TEST-PeerSyncReplicationTest.test-seed#[236A8C932ECFF850]) [ ] o.e.j.s.Server Started @1525873ms
[junit4] 2> 1523837 INFO (TEST-PeerSyncReplicationTest.test-seed#[236A8C932ECFF850]) [ ] o.a.s.c.s.e.JettySolrRunner Jetty properties: {solr.data.dir=/home/jenkins/workspace/Lucene-Solr-6.x-Linux/solr/build/solr-core/test/J2/temp/solr.cloud.PeerSyncReplicationTest_236A8C932ECFF850-001/tempDir-001/jetty3, solrconfig=solrconfig.xml, hostContext=/, hostPort=44182, coreRootDirectory=/home/jenkins/workspace/Lucene-Solr-6.x-Linux/solr/build/solr-core/test/J2/../../../../../../../../../home/jenkins/workspace/Lucene-Solr-6.x-Linux/solr/build/solr-core/test/J2/temp/solr.cloud.PeerSyncReplicationTest_236A8C932ECFF850-001/shard-3-001/cores}
[junit4] 2> 1523837 ERROR (TEST-PeerSyncReplicationTest.test-seed#[236A8C932ECFF850]) [ ] o.a.s.s.StartupLoggingUtils Missing Java Option solr.log.dir. Logging may be missing or incomplete.
[junit4] 2> 1523838 INFO (TEST-PeerSyncReplicationTest.test-seed#[236A8C932ECFF850]) [ ] o.a.s.s.SolrDispatchFilter ___ _ Welcome to Apache Solr™ version 6.3.0
[junit4] 2> 1523838 INFO (TEST-PeerSyncReplicationTest.test-seed#[236A8C932ECFF850]) [ ] o.a.s.s.SolrDispatchFilter / __| ___| |_ _ Starting in cloud mode on port null
[junit4] 2> 1523838 INFO (TEST-PeerSyncReplicationTest.test-seed#[236A8C932ECFF850]) [ ] o.a.s.s.SolrDispatchFilter \__ \/ _ \ | '_| Install dir: null
[junit4] 2> 1523838 INFO (TEST-PeerSyncReplicationTest.test-seed#[236A8C932ECFF850]) [ ] o.a.s.s.SolrDispatchFilter |___/\___/_|_| Start time: 2016-10-15T15:41:28.362Z
[junit4] 2> 1523840 INFO (TEST-PeerSyncReplicationTest.test-seed#[236A8C932ECFF850]) [ ] o.a.s.s.SolrDispatchFilter Loading solr.xml from SolrHome (not found in ZooKeeper)
[junit4] 2> 1523840 INFO (TEST-PeerSyncReplicationTest.test-seed#[236A8C932ECFF850]) [ ] o.a.s.c.SolrXmlConfig Loading container configuration from /home/jenkins/workspace/Lucene-Solr-6.x-Linux/solr/build/solr-core/test/J2/temp/solr.cloud.PeerSyncReplicationTest_236A8C932ECFF850-001/shard-3-001/solr.xml
[junit4] 2> 1523844 INFO (TEST-PeerSyncReplicationTest.test-seed#[236A8C932ECFF850]) [ ] o.a.s.u.UpdateShardHandler Creating UpdateShardHandler HTTP client with params: socketTimeout=340000&connTimeout=45000&retry=true
[junit4] 2> 1523845 INFO (TEST-PeerSyncReplicationTest.test-seed#[236A8C932ECFF850]) [ ] o.a.s.c.ZkContainer Zookeeper client=127.0.0.1:45627/solr
[junit4] 2> 1523856 INFO (TEST-PeerSyncReplicationTest.test-seed#[236A8C932ECFF850]) [n:127.0.0.1:44182_ ] o.a.s.c.c.ZkStateReader Updated live nodes from ZooKeeper... (0) -> (3)
[junit4] 2> 1523858 INFO (TEST-PeerSyncReplicationTest.test-seed#[236A8C932ECFF850]) [n:127.0.0.1:44182_ ] o.a.s.c.ZkController Register node as live in ZooKeeper:/live_nodes/127.0.0.1:44182_
[junit4] 2> 1523858 INFO (zkCallback-24701-thread-1-processing-n:127.0.0.1:44182_) [n:127.0.0.1:44182_ ] o.a.s.c.c.ZkStateReader Updated live nodes from ZooKeeper... (3) -> (4)
[junit4] 2> 1523859 INFO (zkCallback-24694-thread-1-processing-n:127.0.0.1:35205_) [n:127.0.0.1:35205_ ] o.a.s.c.c.ZkStateReader Updated live nodes from ZooKeeper... (3) -> (4)
[junit4] 2> 1523859 INFO (zkCallback-24679-thread-1-processing-n:127.0.0.1:41344_) [n:127.0.0.1:41344_ ] o.a.s.c.c.ZkStateReader Updated live nodes from ZooKeeper... (3) -> (4)
[junit4] 2> 1523859 INFO (zkCallback-24683-thread-1) [ ] o.a.s.c.c.ZkStateReader Updated live nodes from ZooKeeper... (3) -> (4)
[junit4] 2> 1523859 INFO (zkCallback-24688-thread-1-processing-n:127.0.0.1:39898_) [n:127.0.0.1:39898_ ] o.a.s.c.c.ZkStateReader Updated live nodes from ZooKeeper... (3) -> (4)
[junit4] 2> 1523871 INFO (TEST-PeerSyncReplicationTest.test-seed#[236A8C932ECFF850]) [n:127.0.0.1:44182_ ] o.a.s.c.CorePropertiesLocator Found 1 core definitions underneath /home/jenkins/workspace/Lucene-Solr-6.x-Linux/solr/build/solr-core/test/J2/../../../../../../../../../home/jenkins/workspace/Lucene-Solr-6.x-Linux/solr/build/solr-core/test/J2/temp/solr.cloud.PeerSyncReplicationTest_236A8C932ECFF850-001/shard-3-001/cores
[junit4] 2> 1523871 INFO (TEST-PeerSyncReplicationTest.test-seed#[236A8C932ECFF850]) [n:127.0.0.1:44182_ ] o.a.s.c.CorePropertiesLocator Cores are: [collection1]
[junit4] 2> 1523871 INFO (OverseerStateUpdate-96766924120784901-127.0.0.1:41344_-n_0000000000) [n:127.0.0.1:41344_ ] o.a.s.c.o.ReplicaMutator Assigning new node to shard shard=shard1
[junit4] 2> 1524502 INFO (qtp13018214-93853) [n:127.0.0.1:39898_ ] o.a.s.h.a.PrepRecoveryOp In WaitForState(recovering): collection=collection1, shard=shard1, thisCore=collection1, leaderDoesNotNeedRecovery=false, isLeader? true, live=true, checkLive=true, currentState=recovering, localState=active, nodeName=127.0.0.1:35205_, coreNodeName=core_node2, onlyIfActiveCheckResult=false, nodeProps: core_node2:{"core":"collection1","base_url":"https://127.0.0.1:35205","node_name":"127.0.0.1:35205_","state":"recovering"}
[junit4] 2> 1524502 INFO (qtp13018214-93853) [n:127.0.0.1:39898_ ] o.a.s.h.a.PrepRecoveryOp Waited coreNodeName: core_node2, state: recovering, checkLive: true, onlyIfLeader: true for: 1 seconds.
[junit4] 2> 1524502 INFO (qtp13018214-93853) [n:127.0.0.1:39898_ ] o.a.s.s.HttpSolrCall [admin] webapp=null path=/admin/cores params={nodeName=127.0.0.1:35205_&onlyIfLeaderActive=true&core=collection1&coreNodeName=core_node2&action=PREPRECOVERY&checkLive=true&state=recovering&onlyIfLeader=true&wt=javabin&version=2} status=0 QTime=1000
[junit4] 2> 1524881 WARN (coreLoadExecutor-9409-thread-1-processing-n:127.0.0.1:44182_) [n:127.0.0.1:44182_ c:collection1 x:collection1] o.a.s.c.Config Beginning with Solr 5.5, <mergePolicy> is deprecated, use <mergePolicyFactory> instead.
[junit4] 2> 1524882 INFO (coreLoadExecutor-9409-thread-1-processing-n:127.0.0.1:44182_) [n:127.0.0.1:44182_ c:collection1 x:collection1] o.a.s.c.SolrConfig Using Lucene MatchVersion: 6.3.0
[junit4] 2> 1524896 INFO (coreLoadExecutor-9409-thread-1-processing-n:127.0.0.1:44182_) [n:127.0.0.1:44182_ c:collection1 x:collection1] o.a.s.s.IndexSchema [collection1] Schema name=test
[junit4] 2> 1524996 WARN (coreLoadExecutor-9409-thread-1-processing-n:127.0.0.1:44182_) [n:127.0.0.1:44182_ c:collection1 x:collection1] o.a.s.s.IndexSchema [collection1] default search field in schema is text. WARNING: Deprecated, please use 'df' on request instead.
[junit4] 2> 1524998 INFO (coreLoadExecutor-9409-thread-1-processing-n:127.0.0.1:44182_) [n:127.0.0.1:44182_ c:collection1 x:collection1] o.a.s.s.IndexSchema Loaded schema test/1.0 with uniqueid field id
[junit4] 2> 1525003 INFO (recoveryExecutor-24692-thread-1-processing-n:127.0.0.1:35205_ x:collection1 s:shard1 c:collection1 r:core_node2) [n:127.0.0.1:35205_ c:collection1 s:shard1 r:core_node2 x:collection1] o.a.s.c.RecoveryStrategy Attempting to PeerSync from [https://127.0.0.1:39898/collection1/] - recoveringAfterStartup=[true]
[junit4] 2> 1525003 INFO (recoveryExecutor-24692-thread-1-processing-n:127.0.0.1:35205_ x:collection1 s:shard1 c:collection1 r:core_node2) [n:127.0.0.1:35205_ c:collection1 s:shard1 r:core_node2 x:collection1] o.a.s.u.PeerSync PeerSync: core=collection1 url=https://127.0.0.1:35205 START replicas=[https://127.0.0.1:39898/collection1/] nUpdates=1000
[junit4] 2> 1525004 INFO (coreLoadExecutor-9409-thread-1-processing-n:127.0.0.1:44182_) [n:127.0.0.1:44182_ c:collection1 x:collection1] o.a.s.c.CoreContainer Creating SolrCore 'collection1' using configuration from collection collection1
[junit4] 2> 1525005 INFO (coreLoadExecutor-9409-thread-1-processing-n:127.0.0.1:44182_) [n:127.0.0.1:44182_ c:collection1 s:shard1 r:core_node3 x:collection1] o.a.s.c.SolrCore [[collection1] ] Opening new SolrCore at [/home/jenkins/workspace/Lucene-Solr-6.x-Linux/solr/build/solr-core/test/J2/temp/solr.cloud.PeerSyncReplicationTest_236A8C932ECFF850-001/shard-3-001/cores/collection1], dataDir=[/home/jenkins/workspace/Lucene-Solr-6.x-Linux/solr/build/solr-core/test/J2/../../../../../../../../../home/jenkins/workspace/Lucene-Solr-6.x-Linux/solr/build/solr-core/test/J2/temp/solr.cloud.PeerSyncReplicationTest_236A8C932ECFF850-001/shard-3-001/cores/collection1/data/]
[junit4] 2> 1525005 INFO (coreLoadExecutor-9409-thread-1-processing-n:127.0.0.1:44182_) [n:127.0.0.1:44182_ c:collection1 s:shard1 r:core_node3 x:collection1] o.a.s.c.JmxMonitoredMap JMX monitoring is enabled. Adding Solr mbeans to JMX Server: com.sun.jmx.mbeanserver.JmxMBeanServer@6578c9
[junit4] 2> 1525007 INFO (coreLoadExecutor-9409-thread-1-processing-n:127.0.0.1:44182_) [n:127.0.0.1:44182_ c:collection1 s:shard1 r:core_node3 x:collection1] o.a.s.u.RandomMergePolicy RandomMergePolicy wrapping class org.apache.lucene.index.TieredMergePolicy: [TieredMergePolicy: maxMergeAtOnce=47, maxMergeAtOnceExplicit=35, maxMergedSegmentMB=6.7490234375, floorSegmentMB=1.4541015625, forceMergeDeletesPctAllowed=28.12303288361992, segmentsPerTier=45.0, maxCFSSegmentSizeMB=8.796093022207999E12, noCFSRatio=0.0
[junit4] 2> 1525008 INFO (qtp13018214-93854) [n:127.0.0.1:39898_ c:collection1 s:shard1 r:core_node1 x:collection1] o.a.s.u.IndexFingerprint IndexFingerprint millis:0.0 result:{maxVersionSpecified=9223372036854775807, maxVersionEncountered=0, maxInHash=0, versionsHash=0, numVersions=0, numDocs=0, maxDoc=0}
[junit4] 2> 1525009 INFO (qtp13018214-93854) [n:127.0.0.1:39898_ c:collection1 s:shard1 r:core_node1 x:collection1] o.a.s.c.S.Request [collection1] webapp= path=/get params={distrib=false&qt=/get&getFingerprint=9223372036854775807&wt=javabin&version=2} status=0 QTime=1
[junit4] 2> 1525009 INFO (recoveryExecutor-24692-thread-1-processing-n:127.0.0.1:35205_ x:collection1 s:shard1 c:collection1 r:core_node2) [n:127.0.0.1:35205_ c:collection1 s:shard1 r:core_node2 x:collection1] o.a.s.u.IndexFingerprint IndexFingerprint millis:0.0 result:{maxVersionSpecified=9223372036854775807, maxVersionEncountered=0, maxInHash=0, versionsHash=0, numVersions=0, numDocs=0, maxDoc=0}
[junit4] 2> 1525009 INFO (recoveryExecutor-24692-thread-1-processing-n:127.0.0.1:35205_ x:collection1 s:shard1 c:collection1 r:core_node2) [n:127.0.0.1:35205_ c:collection1 s:shard1 r:core_node2 x:collection1] o.a.s.u.PeerSync We are already in sync. No need to do a PeerSync
[junit4] 2> 1525009 INFO (recoveryExecutor-24692-thread-1-processing-n:127.0.0.1:35205_ x:collection1 s:shard1 c:collection1 r:core_node2) [n:127.0.0.1:35205_ c:collection1 s:shard1 r:core_node2 x:collection1] o.a.s.u.DirectUpdateHandler2 start commit{,optimize=false,openSearcher=true,waitSearcher=true,expungeDeletes=false,softCommit=false,prepareCommit=false}
[junit4] 2> 1525009 INFO (recoveryExecutor-24692-thread-1-processing-n:127.0.0.1:35205_ x:collection1 s:shard1 c:collection1 r:core_node2) [n:127.0.0.1:35205_ c:collection1 s:shard1 r:core_node2 x:collection1] o.a.s.u.DirectUpdateHandler2 No uncommitted changes. Skipping IW.commit.
[junit4] 2> 1525010 INFO (recoveryExecutor-24692-thread-1-processing-n:127.0.0.1:35205_ x:collection1 s:shard1 c:collection1 r:core_node2) [n:127.0.0.1:35205_ c:collection1 s:shard1 r:core_node2 x:collection1] o.a.s.u.DirectUpdateHandler2 end_commit_flush
[junit4] 2> 1525010 INFO (recoveryExecutor-24692-thread-1-processing-n:127.0.0.1:35205_ x:collection1 s:shard1 c:collection1 r:core_node2) [n:127.0.0.1:35205_ c:collection1 s:shard1 r:core_node2 x:collection1] o.a.s.c.RecoveryStrategy PeerSync stage of recovery was successful.
[junit4] 2> 1525010 INFO (recoveryExecutor-24692-thread-1-processing-n:127.0.0.1:35205_ x:collection1 s:shard1 c:collection1 r:core_node2) [n:127.0.0.1:35205_ c:collection1 s:shard1 r:core_node2 x:collection1] o.a.s.c.RecoveryStrategy Replaying updates buffered during PeerSync.
[junit4] 2> 1525010 INFO (recoveryExecutor-24692-thread-1-processing-n:127.0.0.1:35205_ x:collection1 s:shard1 c:collection1 r:core_node2) [n:127.0.0.1:35205_ c:collection1 s:shard1 r:core_node2 x:collection1] o.a.s.c.RecoveryStrategy No replay needed.
[junit4] 2> 1525010 INFO (recoveryExecutor-24692-thread-1-processing-n:127.0.0.1:35205_ x:collection1 s:shard1 c:collection1 r:core_node2) [n:127.0.0.1:35205_ c:collection1 s:shard1 r:core_node2 x:collection1] o.a.s.c.RecoveryStrategy Registering as Active after recovery.
[junit4] 2> 1525036 WARN (coreLoadExecutor-9409-thread-1-processing-n:127.0.0.1:44182_) [n:127.0.0.1:44182_ c:collection1 s:shard1 r:core_node3 x:collection1] o.a.s.c.RequestHandlers INVALID paramSet a in requestHandler {type = requestHandler,name = /dump,class = DumpRequestHandler,args = {defaults={a=A,b=B}}}
[junit4] 2> 1525042 INFO (coreLoadExecutor-9409-thread-1-processing-n:127.0.0.1:44182_) [n:127.0.0.1:44182_ c:collection1 s:shard1 r:core_node3 x:collection1] o.a.s.u.UpdateHandler Using UpdateLog implementation: org.apache.solr.update.UpdateLog
[junit4] 2> 1525042 INFO (coreLoadExecutor-9409-thread-1-processing-n:127.0.0.1:44182_) [n:127.0.0.1:44182_ c:collection1 s:shard1 r:core_node3 x:collection1] o.a.s.u.UpdateLog Initializing UpdateLog: dataDir= defaultSyncLevel=FLUSH numRecordsToKeep=1000 maxNumLogsToKeep=10 numVersionBuckets=65536
[junit4] 2> 1525042 INFO (coreLoadExecutor-9409-thread-1-processing-n:127.0.0.1:44182_) [n:127.0.0.1:44182_ c:collection1 s:shard1 r:core_node3 x:collection1] o.a.s.u.CommitTracker Hard AutoCommit: disabled
[junit4] 2> 1525043 INFO (coreLoadExecutor-9409-thread-1-processing-n:127.0.0.1:44182_) [n:127.0.0.1:44182_ c:collection1 s:shard1 r:core_node3 x:collection1] o.a.s.u.CommitTracker Soft AutoCommit: disabled
[junit4] 2> 1525043 INFO (coreLoadExecutor-9409-thread-1-processing-n:127.0.0.1:44182_) [n:127.0.0.1:44182_ c:collection1 s:shard1 r:core_node3 x:collection1] o.a.s.u.RandomMergePolicy RandomMergePolicy wrapping class org.apache.lucene.index.AlcoholicMergePolicy: [AlcoholicMergePolicy: minMergeSize=0, mergeFactor=10, maxMergeSize=730345928, maxMergeSizeForForcedMerge=9223372036854775807, calibrateSizeByDeletes=true, maxMergeDocs=2147483647, maxCFSSegmentSizeMB=8.796093022207999E12, noCFSRatio=0.1]
[junit4] 2> 1525043 INFO (coreLoadExecutor-9409-thread-1-processing-n:127.0.0.1:44182_) [n:127.0.0.1:44182_ c:collection1 s:shard1 r:core_node3 x:collection1] o.a.s.s.SolrIndexSearcher Opening [Searcher@c42b4a[collection1] main]
[junit4] 2> 1525044 INFO (coreLoadExecutor-9409-thread-1-processing-n:127.0.0.1:44182_) [n:127.0.0.1:44182_ c:collection1 s:shard1 r:core_node3 x:collection1] o.a.s.r.ManagedResourceStorage Configured ZooKeeperStorageIO with znodeBase: /configs/conf1
[junit4] 2> 1525044 INFO (coreLoadExecutor-9409-thread-1-processing-n:127.0.0.1:44182_) [n:127.0.0.1:44182_ c:collection1 s:shard1 r:core_node3 x:collection1] o.a.s.r.ManagedResourceStorage Loaded null at path _rest_managed.json using ZooKeeperStorageIO:path=/configs/conf1
[junit4] 2> 1525044 INFO (coreLoadExecutor-9409-thread-1-processing-n:127.0.0.1:44182_) [n:127.0.0.1:44182_ c:collection1 s:shard1 r:core_node3 x:collection1] o.a.s.h.ReplicationHandler Commits will be reserved for 10000
[junit4] 2> 1525045 INFO (searcherExecutor-9410-thread-1-processing-n:127.0.0.1:44182_ x:collection1 s:shard1 c:collection1 r:core_node3) [n:127.0.0.1:44182_ c:collection1 s:shard1 r:core_node3 x:collection1] o.a.s.c.SolrCore [collection1] Registered new searcher Searcher@c42b4a[collection1] main{ExitableDirectoryReader(UninvertingDirectoryReader())}
[junit4] 2> 1525045 INFO (coreLoadExecutor-9409-thread-1-processing-n:127.0.0.1:44182_) [n:127.0.0.1:44182_ c:collection1 s:shard1 r:core_node3 x:collection1] o.a.s.u.UpdateLog Could not find max version in index or recent updates, using new clock 1548270792415903744
[junit4] 2> 1525047 INFO (coreZkRegister-9404-thread-1-processing-n:127.0.0.1:44182_ x:collection1 s:shard1 c:collection1 r:core_node3) [n:127.0.0.1:44182_ c:collection1 s:shard1 r:core_node3 x:collection1] o.a.s.c.ZkController Core needs to recover:collection1
[junit4] 2> 1525048 INFO (updateExecutor-24698-thread-1-processing-n:127.0.0.1:44182_ x:collection1 s:shard1 c:collection1 r:core_node3) [n:127.0.0.1:44182_ c:collection1 s:shard1 r:core_node3 x:collection1] o.a.s.u.DefaultSolrCoreState Running recovery
[junit4] 2> 1525048 INFO (recoveryExecutor-24699-thread-1-processing-n:127.0.0.1:44182_ x:collection1 s:shard1 c:collection1 r:core_node3) [n:127.0.0.1:44182_ c:collection1 s:shard1 r:core_node3 x:collection1] o.a.s.c.RecoveryStrategy Starting recovery process. recoveringAfterStartup=true
[junit4] 2> 1525048 INFO (recoveryExecutor-24699-thread-1-processing-n:127.0.0.1:44182_ x:collection1 s:shard1 c:collection1 r:core_node3) [n:127.0.0.1:44182_ c:collection1 s:shard1 r:core_node3 x:collection1] o.a.s.c.RecoveryStrategy ###### startupVersions=[[]]
[junit4] 2> 1525048 INFO (recoveryExecutor-24699-thread-1-processing-n:127.0.0.1:44182_ x:collection1 s:shard1 c:collection1 r:core_node3) [n:127.0.0.1:44182_ c:collection1 s:shard1 r:core_node3 x:collection1] o.a.s.c.RecoveryStrategy Begin buffering updates. core=[collection1]
[junit4] 2> 1525048 INFO (recoveryExecutor-24699-thread-1-processing-n:127.0.0.1:44182_ x:collection1 s:shard1 c:collection1 r:core_node3) [n:127.0.0.1:44182_ c:collection1 s:shard1 r:core_node3 x:collection1] o.a.s.u.UpdateLog Starting to buffer updates. FSUpdateLog{state=ACTIVE, tlog=null}
[junit4] 2> 1525048 INFO (recoveryExecutor-24699-thread-1-processing-n:127.0.0.1:44182_ x:collection1 s:shard1 c:collection1 r:core_node3) [n:127.0.0.1:44182_ c:collection1 s:shard1 r:core_node3 x:collection1] o.a.s.c.RecoveryStrategy Publishing state of core [collection1] as recovering, leader is [https://127.0.0.1:39898/collection1/] and I am [https://127.0.0.1:44182/collection1/]
[junit4] 2> 1525049 INFO (recoveryExecutor-24699-thread-1-processing-n:127.0.0.1:44182_ x:collection1 s:shard1 c:collection1 r:core_node3) [n:127.0.0.1:44182_ c:collection1 s:shard1 r:core_node3 x:collection1] o.a.s.c.RecoveryStrategy Sending prep recovery command to [https://127.0.0.1:39898]; [WaitForState: action=PREPRECOVERY&core=collection1&nodeName=127.0.0.1:44182_&coreNodeName=core_node3&state=recovering&checkLive=true&onlyIfLeader=true&onlyIfLeaderActive=true]
[junit4] 2> 1525052 INFO (qtp13018214-93855) [n:127.0.0.1:39898_ ] o.a.s.h.a.PrepRecoveryOp Going to wait for coreNodeName: core_node3, state: recovering, checkLive: true, onlyIfLeader: true, onlyIfLeaderActive: true
[junit4] 2> 1525052 INFO (qtp13018214-93855) [n:127.0.0.1:39898_ ] o.a.s.h.a.PrepRecoveryOp Will wait a max of 183 seconds to see collection1 (shard1 of collection1) have state: recovering
[junit4] 2> 1525052 INFO (qtp13018214-93855) [n:127.0.0.1:39898_ ] o.a.s.h.a.PrepRecoveryOp In WaitForState(recovering): collection=collection1, shard=shard1, thisCore=collection1, leaderDoesNotNeedRecovery=false, isLeader? true, live=true, checkLive=true, currentState=down, localState=active, nodeName=127.0.0.1:44182_, coreNodeName=core_node3, onlyIfActiveCheckResult=false, nodeProps: core_node3:{"core":"collection1","base_url":"https://127.0.0.1:44182","node_name":"127.0.0.1:44182_","state":"down"}
[junit4] 2> 1525372 INFO (TEST-PeerSyncReplicationTest.test-seed#[236A8C932ECFF850]) [ ] o.a.s.SolrTestCaseJ4 ###Starting test
[junit4] 2> 1525373 INFO (TEST-PeerSyncReplicationTest.test-seed#[236A8C932ECFF850]) [ ] o.a.s.c.AbstractFullDistribZkTestBase Wait for recoveries to finish - wait 30 for each attempt
[junit4] 2> 1525373 INFO (TEST-PeerSyncReplicationTest.test-seed#[236A8C932ECFF850]) [ ] o.a.s.c.AbstractDistribZkTestBase Wait for recoveries to finish - collection: collection1 failOnTimeout:true timeout (sec):30
[junit4] 2> 1526052 INFO (qtp13018214-93855) [n:127.0.0.1:39898_ ] o.a.s.h.a.PrepRecoveryOp In WaitForState(recovering): collection=collection1, shard=shard1, thisCore=collection1, leaderDoesNotNeedRecovery=false, isLeader? true, live=true, checkLive=true, currentState=recovering, localState=active, nodeName=127.0.0.1:44182_, coreNodeName=core_node3, onlyIfActiveCheckResult=false, nodeProps: core_node3:{"core":"collection1","base_url":"https://127.0.0.1:44182","node_name":"127.0.0.1:44182_","state":"recovering"}
[junit4] 2> 1526053 INFO (qtp13018214-93855) [n:127.0.0.1:39898_ ] o.a.s.h.a.PrepRecoveryOp Waited coreNodeName: core_node3, state: recovering, checkLive: true, onlyIfLeader: true for: 1 seconds.
[junit4] 2> 1526053 INFO (qtp13018214-93855) [n:127.0.0.1:39898_ ] o.a.s.s.HttpSolrCall [admin] webapp=null path=/admin/cores params={nodeName=127.0.0.1:44182_&onlyIfLeaderActive=true&core=collection1&coreNodeName=core_node3&action=PREPRECOVERY&checkLive=true&state=recovering&onlyIfLeader=true&wt=javabin&version=2} status=0 QTime=1000
[junit4] 2> 1526553 INFO (recoveryExecutor-24699-thread-1-processing-n:127.0.0.1:44182_ x:collection1 s:shard1 c:collection1 r:core_node3) [n:127.0.0.1:44182_ c:collection1 s:shard1 r:core_node3 x:collection1] o.a.s.c.RecoveryStrategy Attempting to PeerSync from [https://127.0.0.1:39898/collection1/] - recoveringAfterStartup=[true]
[junit4] 2> 1526553 INFO (recoveryExecutor-24699-thread-1-processing-n:127.0.0.1:44182_ x:collection1 s:shard1 c:collection1 r:core_node3) [n:127.0.0.1:44182_ c:collection1 s:shard1 r:core_node3 x:collection1] o.a.s.u.PeerSync PeerSync: core=collection1 url=https://127.0.0.1:44182 START replicas=[https://127.0.0.1:39898/collection1/] nUpdates=1000
[junit4] 2> 1526557 INFO (qtp13018214-93854) [n:127.0.0.1:39898_ c:collection1 s:shard1 r:core_node1 x:collection1] o.a.s.c.S.Request [collection1] webapp= path=/get params={distrib=false&qt=/get&getFingerprint=9223372036854775807&wt=javabin&version=2} status=0 QTime=0
[junit4] 2> 1526557 INFO (recoveryExecutor-24699-thread-1-processing-n:127.0.0.1:44182_ x:collection1 s:shard1 c:collection1 r:core_node3) [n:127.0.0.1:44182_ c:collection1 s:shard1 r:core_node3 x:collection1] o.a.s.u.IndexFingerprint IndexFingerprint millis:0.0 result:{maxVersionSpecified=9223372036854775807, maxVersionEncountered=0, maxInHash=0, versionsHash=0, numVersions=0, numDocs=0, maxDoc=0}
[junit4] 2> 1526557 INFO (recoveryExecutor-24699-thread-1-processing-n:127.0.0.1:44182_ x:collection1 s:shard1 c:collection1 r:core_node3) [n:127.0.0.1:44182_ c:collection1 s:shard1 r:core_node3 x:collection1] o.a.s.u.PeerSync We are already in sync. No need to do a PeerSync
[junit4] 2> 1526558 INFO (recoveryExecutor-24699-thread-1-processing-n:127.0.0.1:44182_ x:collection1 s:shard1 c:collection1 r:core_node3) [n:127.0.0.1:44182_ c:collection1 s:shard1 r:core_node3 x:collection1] o.a.s.u.DirectUpdateHandler2 start commit{,optimize=false,openSearcher=true,waitSearcher=true,expungeDeletes=false,softCommit=false,prepareCommit=false}
[junit4] 2> 1526558 INFO (recoveryExecutor-24699-thread-1-processing-n:127.0.0.1:44182_ x:collection1 s:shard1 c:collection1 r:core_node3) [n:127.0.0.1:44182_ c:collection1 s:shard1 r:core_node3 x:collection1] o.a.s.u.DirectUpdateHandler2 No uncommitted changes. Skipping IW.commit.
[junit4] 2> 1526558 INFO (recoveryExecutor-24699-thread-1-processing-n:127.0.0.1:44182_ x:collection1 s:shard1 c:collection1 r:core_node3) [n:127.0.0.1:44182_ c:collection1 s:shard1 r:core_node3 x:collection1] o.a.s.u.DirectUpdateHandler2 end_commit_flush
[junit4] 2> 1526558 INFO (recoveryExecutor-24699-thread-1-processing-n:127.0.0.1:44182_ x:collection1 s:shard1 c:collection1 r:core_node3) [n:127.0.0.1:44182_ c:collection1 s:shard1 r:core_node3 x:collection1] o.a.s.c.RecoveryStrategy PeerSync stage of recovery was successful.
[junit4] 2> 1526558 INFO (recoveryExecutor-24699-thread-1-processing-n:127.0.0.1:44182_ x:collection1 s:shard1 c:collection1 r:core_node3) [n:127.0.0.1:44182_ c:collection1 s:shard1 r:core_node3 x:collection1] o.a.s.c.RecoveryStrategy Replaying updates buffered during PeerSync.
[junit4] 2> 1526558 INFO (recoveryExecutor-24699-thread-1-processing-n:127.0.0.1:44182_ x:collection1 s:shard1 c:collection1 r:core_node3) [n:127.0.0.1:44182_ c:collection1 s:shard1 r:core_node3 x:collection1] o.a.s.c.RecoveryStrategy No replay needed.
[junit4] 2> 1526558 INFO (recoveryExecutor-24699-thread-1-processing-n:127.0.0.1:44182_ x:collection1 s:shard1 c:collection1 r:core_node3) [n:127.0.0.1:44182_ c:collection1 s:shard1 r:core_node3 x:collection1] o.a.s.c.RecoveryStrategy Registering as Active after recovery.
[junit4] 2> 1527373 INFO (TEST-PeerSyncReplicationTest.test-seed#[236A8C932ECFF850]) [ ] o.a.s.c.AbstractDistribZkTestBase Recoveries finished - collection: collection1
[junit4] 2> 1527442 INFO (qtp956471-93818) [n:127.0.0.1:41344_ c:control_collection s:shard1 r:core_node1 x:collection1] o.a.s.u.DirectUpdateHandler2 start commit{,optimize=false,openSearcher=true,waitSearcher=true,expungeDeletes=false,softCommit=false,prepareCommit=false}
[junit4] 2> 1527442 INFO (qtp956471-93818) [n:127.0.0.1:41344_ c:control_collection s:shard1 r:core_node1 x:collection1] o.a.s.u.DirectUpdateHandler2 No uncommitted changes. Skipping IW.commit.
[junit4] 2> 1527442 INFO (qtp956471-93818) [n:127.0.0.1:41344_ c:control_collection s:shard1 r:core_node1 x:collection1] o.a.s.u.DirectUpdateHandler2 end_commit_flush
[junit4] 2> 1527442 INFO (qtp956471-93818) [n:127.0.0.1:41344_ c:control_collection s:shard1 r:core_node1 x:collection1] o.a.s.u.p.LogUpdateProcessorFactory [collection1] webapp= path=/update params={waitSearcher=true&commit=true&softCommit=false&wt=javabin&version=2}{commit=} 0 0
[junit4] 2> 1527449 INFO (qtp13018214-93855) [n:127.0.0.1:39898_ c:collection1 s:shard1 r:core_node1 x:collection1] o.a.s.u.DirectUpdateHandler2 start commit{,optimize=false,openSearcher=true,waitSearcher=true,expungeDeletes=false,softCommit=false,prepareCommit=false}
[junit4] 2> 1527449 INFO (qtp13018214-93855) [n:127.0.0.1:39898_ c:collection1 s:shard1 r:core_node1 x:collection1] o.a.s.u.DirectUpdateHandler2 No uncommitted changes. Skipping IW.commit.
[junit4] 2> 1527450 INFO (qtp13018214-93855) [n:127.0.0.1:39898_ c:collection1 s:shard1 r:core_node1 x:collection1] o.a.s.u.DirectUpdateHandler2 end_commit_flush
[junit4] 2> 1527450 INFO (qtp13018214-93855) [n:127.0.0.1:39898_ c:collection1 s:shard1 r:core_node1 x:collection1] o.a.s.u.p.LogUpdateProcessorFactory [collection1] webapp= path=/update params={update.distrib=FROMLEADER&waitSearcher=true&openSearcher=true&commit=true&softCommit=false&distrib.from=https://127.0.0.1:39898/collection1/&commit_end_point=true&wt=javabin&version=2&expungeDeletes=false}{commit=} 0 0
[junit4] 2> 1527515 INFO (qtp31764975-93915) [n:127.0.0.1:44182_ c:collection1 s:shard1 r:core_node3 x:collection1] o.a.s.u.DirectUpdateHandler2 start commit{,optimize=false,openSearcher=true,waitSearcher=true,expungeDeletes=false,softCommit=false,prepareCommit=false}
[junit4] 2> 1527515 INFO (qtp16758849-93881) [n:127.0.0.1:35205_ c:collection1 s:shard1 r:core_node2 x:collection1] o.a.s.u.DirectUpdateHandler2 start commit{,optimize=false,openSearcher=true,waitSearcher=true,expungeDeletes=false,softCommit=false,prepareCommit=false}
[junit4] 2> 1527515 INFO (qtp31764975-93915) [n:127.0.0.1:44182_ c:collection1 s:shard1 r:core_node3 x:collection1] o.a.s.u.DirectUpdateHandler2 No uncommitted changes. Skipping IW.commit.
[junit4] 2> 1527515 INFO (qtp16758849-93881) [n:127.0.0.1:35205_ c:collection1 s:shard1 r:core_node2 x:collection1] o.a.s.u.DirectUpdateHandler2 No uncommitted changes. Skipping IW.commit.
[junit4] 2> 1527515 INFO (qtp16758849-93881) [n:127.0.0.1:35205_ c:collection1 s:shard1 r:core_node2 x:collection1] o.a.s.u.DirectUpdateHandler2 end_commit_flush
[junit4] 2> 1527515 INFO (qtp31764975-93915) [n:127.0.0.1:44182_ c:collection1 s:shard1 r:core_node3 x:collection1] o.a.s.u.DirectUpdateHandler2 end_commit_flush
[junit4] 2> 1527516 INFO (qtp16758849-93881) [n:127.0.0.1:35205_ c:collection1 s:shard1 r:core_node2 x:collection1] o.a.s.u.p.LogUpdateProcessorFactory [collection1] webapp= path=/update params={update.distrib=FROMLEADER&waitSearcher=true&openSearcher=true&commit=true&softCommit=false&distrib.from=https://127.0.0.1:39898/collection1/&commit_end_point=true&wt=javabin&version=2&expungeDeletes=false}{commit=} 0 0
[junit4] 2> 1527516 INFO (qtp31764975-93915) [n:127.0.0.1:44182_ c:collection1 s:shard1 r:core_node3 x:collection1] o.a.s.u.p.LogUpdateProcessorFactory [collection1] webapp= path=/update params={update.distrib=FROMLEADER&waitSearcher=true&openSearcher=true&commit=true&softCommit=false&distrib.from=https://127.0.0.1:39898/collection1/&commit_end_point=true&wt=javabin&version=2&expungeDeletes=false}{commit=} 0 0
[junit4] 2> 1527516 INFO (qtp13018214-93854) [n:127.0.0.1:39898_ c:collection1 s:shard1 r:core_node1 x:collection1] o.a.s.u.p.LogUpdateProcessorFactory [collection1] webapp= path=/update params={waitSearcher=true&commit=true&softCommit=false&wt=javabin&version=2}{commit=} 0 70
[junit4] 2> 1527519 INFO (qtp13018214-93859) [n:127.0.0.1:39898_ c:collection1 s:shard1 r:core_node1 x:collection1] o.a.s.c.S.Request [collection1] webapp= path=/select params={q=*:*&distrib=false&tests=checkShardConsistency&rows=0&wt=javabin&version=2} hits=0 status=0 QTime=0
[junit4] 2> 1527522 INFO (qtp16758849-93881) [n:127.0.0.1:35205_ c:collection1 s:shard1 r:core_node2 x:collection1] o.a.s.c.S.Request [collection1] webapp= path=/select params={q=*:*&distrib=false&tests=checkShardConsistency&rows=0&wt=javabin&version=2} hits=0 status=0 QTime=0
[junit4] 2> 1527524 INFO (qtp31764975-93915) [n:127.0.0.1:44182_ c:collection1 s:shard1 r:core_node3 x:collection1] o.a.s.c.S.Request [collection1] webapp= path=/select params={q=*:*&distrib=false&tests=checkShardConsistency&rows=0&wt=javabin&version=2} hits=0 status=0 QTime=0
[junit4] 2> 1529528 INFO (qtp956471-93823) [n:127.0.0.1:41344_ c:control_collection s:shard1 r:core_node1 x:collection1] o.a.s.u.p.LogUpdateProcessorFactory [collection1] webapp= path=/update params={wt=javabin&version=2}{deleteByQuery=*:* (-1548270797115621376)} 0 1
[junit4] 2> 1529534 INFO (qtp16758849-93886) [n:127.0.0.1:35205_ c:collection1 s:shard1 r:core_node2 x:collection1] o.a.s.u.p.LogUpdateProcessorFactory [collection1] webapp= path=/update params={update.distrib=FROMLEADER&_version_=-1548270797118767104&distrib.from=https://127.0.0.1:39898/collection1/&wt=javabin&version=2}{deleteByQuery=*:* (-1548270797118767104)} 0 0
[junit4] 2> 1529534 INFO (qtp31764975-93920) [n:127.0.0.1:44182_ c:collection1 s:shard1 r:core_node3 x:collection1] o.a.s.u.p.LogUpdateProcessorFactory [collection1] webapp= path=/update params={update.distrib=FROMLEADER&_version_=-1548270797118767104&distrib.from=https://127.0.0.1:39898/collection1/&wt=javabin&version=2}{deleteByQuery=*:* (-1548270797118767104)} 0 0
[junit4] 2> 1529534 INFO (qtp13018214-93860) [n:127.0.0.1:39898_ c:collection1 s:shard1 r:core_node1 x:collection1] o.a.s.u.p.LogUpdateProcessorFactory [collection1] webapp= path=/update params={wt=javabin&version=2}{deleteByQuery=*:* (-1548270797118767104)} 0 4
[junit4] 2> 1529541 INFO (qtp16758849-93887) [n:127.0.0.1:35205_ c:collection1 s:shard1 r:core_node2 x:collection1] o.a.s.u.p.LogUpdateProcessorFactory [collection1] webapp= path=/update params={update.distrib=FROMLEADER&distrib.from=https://127.0.0.1:39898/collection1/&wt=javabin&version=2}{add=[0 (1548270797126107136)]} 0 1
[junit4] 2> 1529541 INFO (qtp31764975-93921) [n:127.0.0.1:44182_ c:collection1 s:shard1 r:core_node3 x:collection1] o.a.s.u.p.LogUpdateProcessorFactory [collection1] webapp= path=/update params={update.distrib=FROMLEADER&distrib.from=https://127.0.0.1:39898/collection1/&wt=javabin&version=2}{add=[0 (1548270797126107136)]} 0 1
[junit4] 2> 1529541 INFO (qtp13018214-93855) [n:127.0.0.1:39898_ c:collection1 s:shard1 r:core_node1 x:collection1] o.a.s.u.p.LogUpdateProcessorFactory [collection1] webapp= path=/update params={wt=javabin&version=2}{add=[0 (1548270797126107136)]} 0 3
[junit4] 2> 1529545 INFO (qtp31764975-93922) [n:127.0.0.1:44182_ c:collection1 s:shard1 r:core_node3 x:collection1] o.a.s.u.p.LogUpdateProcessorFactory [collection1] webapp= path=/update params={update.distrib=FROMLEADER&distrib.from=https://127.0.0.1:39898/collection1/&wt=javabin&version=2}{add=[1 (1548270797131350016)]} 0 1
[junit4] 2> 1529545 INFO (qtp16758849-93888) [n:127.0.0.1:35205_ c:collection1 s:shard1 r:core_node2 x:collection1] o.a.s.u.p.LogUpdateProcessorFactory [collection1] webapp= path=/update params={update.distrib=FROMLEADER&distrib.from=https://127.0.0.1:39898/collection1/&wt=javabin&version=2}{add=[1 (1548270797131350016)]} 0 1
[junit4] 2> 1529545 INFO (qtp13018214-93854) [n:127.0.0.1:39898_ c:collection1 s:shard1 r:core_node1 x:collection1] o.a.s.u.p.LogUpdateProcessorFactory [collection1] webapp= path=/update params={wt=javabin&version=2}{add=[1 (1548270797131350016)]} 0 3
[junit4] 2> 1529549 INFO (qtp31764975-93920) [n:127.0.0.1:44182_ c:collection1 s:shard1 r:core_node3 x:collection1] o.a.s.u.p.LogUpdateProcessorFactory [collection1] webapp= path=/update params={update.distrib=FROMLEADER&distrib.from=https://127.0.0.1:39898/collection1/&wt=javabin&version=2}{add=[2 (1548270797136592896)]} 0 0
[junit4] 2> 1529549 INFO (qtp16758849-93881) [n:127.0.0.1:35205_ c:collection1 s:shard1 r:core_node2 x:collection1] o.a.s.u.p.LogUpdateProcessorFactory [collection1] webapp= path=/update params={update.distrib=FROMLEADER&distrib.from=https://127.0.0.1:39898/collection1/&wt=javabin&version=2}{add=[2 (1548270797136592896)]} 0 0
[junit4] 2> 1529549 INFO (qtp13018214-93859) [n:127.0.0.1:39898_ c:collection1 s:shard1 r:core_node1 x:collection1] o.a.s.u.p.LogUpdateProcessorFactory [collection1] webapp= path=/update params={wt=javabin&version=2}{add=[2 (1548270797136592896)]} 0 2
[junit4] 2> 1529552 INFO (qtp16758849-93887) [n:127.0.0.1:35205_ c:collection1 s:shard1 r:core_node2 x:collection1] o.a.s.u.p.LogUpdateProcessorFactory [collection1] webapp= path=/update params={update.distrib=FROMLEADER&distrib.from=https://127.0.0.1:39898/collection1/&wt=javabin&version=2}{add=[3 (1548270797140787200)]} 0 0
[junit4] 2> 1529552 INFO (qtp31764975-93921) [n:127.0.0.1:44182_ c:collection1 s:shard1 r:core_node3 x:collection1] o.a.s.u.p.LogUpdateProcessorFactory [collection1] webapp= path=/update params={update.distrib=FROMLEADER&distrib.from=https://127.0.0.1:39898/collection1/&wt=javabin&version=2}{add=[3 (1548270797140787200)]} 0 0
[junit4] 2> 1529553 INFO (qtp13018214-93860) [n:127.0.0.1:39898_ c:collection1 s:shard1 r:core_node1 x:collection1] o.a.s.u.p.LogUpdateProcessorFactory [collection1] webapp= path=/update params={wt=javabin&version=2}{add=[3 (1548270797140787200)]} 0 2
[junit4] 2> 1529557 INFO (qtp16758849-93888) [n:127.0.0.1:35205_ c:collection1 s:shard1 r:core_node2 x:collection1] o.a.s.u.p.LogUpdateProcessorFactory [collection1] webapp= path=/update params={update.distrib=FROMLEADER&distrib.from=https://127.0.0.1:39898/collection1/&wt=javabin&version=2}{add=[4 (1548270797143932928)]} 0 1
[junit4] 2> 1529557 INFO (qtp31764975-93915) [n:127.0.0.1:44182_ c:collection1 s:shard1 r:core_node3 x:collection1] o.a.s.u.p.LogUpdateProcessorFactory [collection1] webapp= path=/update params={update.distrib=FROMLEADER&distrib.from=https://127.0.0.1:39898/collection1/&wt=javabin&version=2}{add=[4 (1548270797143932928)]} 0 1
[junit4] 2> 1529557 INFO (qtp13018214-93855) [n:127.0.0.1:39898_ c:collection1 s:shard1 r:core_node1 x:collection1] o.a.s.u.p.LogUpdateProcessorFactory [collection1] webapp= path=/update params={wt=javabin&version=2}{add=[4 (1548270797143932928)]} 0 3
[junit4] 2> 1529561 INFO (qtp16758849-93881) [n:127.0.0.1:35205_ c:collection1 s:shard1 r:core_node2 x:collection1] o.a.s.u.p.LogUpdateProcessorFactory [collection1] webapp= path=/update params={update.distrib=FROMLEADER&distrib.from=https://127.0.0.1:39898/collection1/&wt=javabin&version=2}{add=[5 (1548270797149175808)]} 0 0
[junit4] 2> 1529561 INFO (qtp31764975-93922) [n:127.0.0.1:44182_ c:collection1 s:shard1 r:core_node3 x:collection1] o.a.s.u.p.LogUpdateProcessorFactory [collection1] webapp= path=/update params={update.distrib=FROMLEADER&distrib.from=https://127.0.0.1:39898/collection1/&wt=javabin&version=2}{add=[5 (1548270797149175808)]} 0 0
[junit4] 2> 1529561 INFO (qtp13018214-93854) [n:127.0.0.1:39898_ c:collection1 s:shard1 r:core_node1 x:collection1] o.a.s.u.p.LogUpdatePr
[...truncated too long message...]
17-127.0.0.1:44182_-n_0000000005) am no longer a leader.
[junit4] 2> 1573501 WARN (zkCallback-24715-thread-1-processing-n:127.0.0.1:44182_) [n:127.0.0.1:44182_ ] o.a.s.c.c.ZkStateReader ZooKeeper watch triggered, but Solr cannot talk to ZK: [KeeperErrorCode = Session expired for /live_nodes]
[junit4] 2> 1574497 WARN (zkCallback-24715-thread-3-processing-n:127.0.0.1:44182_) [n:127.0.0.1:44182_ c:collection1 s:shard1 r:core_node3 x:collection1] o.a.s.c.SyncStrategy Closed, skipping sync up.
[junit4] 2> 1574497 INFO (zkCallback-24715-thread-3-processing-n:127.0.0.1:44182_) [n:127.0.0.1:44182_ c:collection1 s:shard1 r:core_node3 x:collection1] o.a.s.c.SolrCore [collection1] CLOSING SolrCore org.apache.solr.core.SolrCore@8763b7
[junit4] 2> 1574498 INFO (zkCallback-24715-thread-3-processing-n:127.0.0.1:44182_) [n:127.0.0.1:44182_ c:collection1 s:shard1 r:core_node3 x:collection1] o.a.s.u.DirectUpdateHandler2 Committing on IndexWriter close.
[junit4] 2> 1574498 INFO (zkCallback-24715-thread-3-processing-n:127.0.0.1:44182_) [n:127.0.0.1:44182_ c:collection1 s:shard1 r:core_node3 x:collection1] o.a.s.u.SolrIndexWriter Calling setCommitData with IW:org.apache.solr.update.SolrIndexWriter@e8a123
[junit4] 2> 1574661 INFO (TEST-PeerSyncReplicationTest.test-seed#[236A8C932ECFF850]) [ ] o.e.j.s.ServerConnector Stopped ServerConnector@29813a{SSL,[ssl, http/1.1]}{127.0.0.1:44182}
[junit4] 2> 1574661 INFO (TEST-PeerSyncReplicationTest.test-seed#[236A8C932ECFF850]) [ ] o.e.j.s.h.ContextHandler Stopped o.e.j.s.ServletContextHandler@792ed0{/,null,UNAVAILABLE}
[junit4] 2> 1574662 INFO (TEST-PeerSyncReplicationTest.test-seed#[236A8C932ECFF850]) [ ] o.a.s.c.ZkTestServer connecting to 127.0.0.1:45627 45627
[junit4] 2> 1574756 INFO (Thread-2627) [ ] o.a.s.c.ZkTestServer connecting to 127.0.0.1:45627 45627
[junit4] 2> 1574756 WARN (Thread-2627) [ ] o.a.s.c.ZkTestServer Watch limit violations:
[junit4] 2> Maximum concurrent create/delete watches above limit:
[junit4] 2>
[junit4] 2> 7 /solr/aliases.json
[junit4] 2> 6 /solr/security.json
[junit4] 2> 6 /solr/configs/conf1
[junit4] 2> 5 /solr/collections/collection1/state.json
[junit4] 2>
[junit4] 2> Maximum concurrent data watches above limit:
[junit4] 2>
[junit4] 2> 7 /solr/clusterstate.json
[junit4] 2> 7 /solr/clusterprops.json
[junit4] 2> 2 /solr/collections/collection1/leader_elect/shard1/election/96766924120784905-core_node1-n_0000000000
[junit4] 2> 2 /solr/overseer_elect/election/96766924120784901-127.0.0.1:41344_-n_0000000000
[junit4] 2> 2 /solr/overseer_elect/election/96766924120784905-127.0.0.1:39898_-n_0000000001
[junit4] 2>
[junit4] 2> Maximum concurrent children watches above limit:
[junit4] 2>
[junit4] 2> 52 /solr/overseer/collection-queue-work
[junit4] 2> 26 /solr/overseer/queue
[junit4] 2> 7 /solr/collections
[junit4] 2> 5 /solr/live_nodes
[junit4] 2> 5 /solr/overseer/queue-work
[junit4] 2>
[junit4] 2> NOTE: reproduce with: ant test -Dtestcase=PeerSyncReplicationTest -Dtests.method=test -Dtests.seed=236A8C932ECFF850 -Dtests.multiplier=3 -Dtests.slow=true -Dtests.locale=mt-MT -Dtests.timezone=America/Grenada -Dtests.asserts=true -Dtests.file.encoding=UTF-8
[junit4] FAILURE 55.9s J2 | PeerSyncReplicationTest.test <<<
[junit4] > Throwable #1: java.lang.AssertionError: expected:<204> but was:<187>
[junit4] > at __randomizedtesting.SeedInfo.seed([236A8C932ECFF850:AB3EB349803395A8]:0)
[junit4] > at org.apache.solr.cloud.PeerSyncReplicationTest.bringUpDeadNodeAndEnsureNoReplication(PeerSyncReplicationTest.java:280)
[junit4] > at org.apache.solr.cloud.PeerSyncReplicationTest.test(PeerSyncReplicationTest.java:157)
[junit4] > at org.apache.solr.BaseDistributedSearchTestCase$ShardsRepeatRule$ShardsFixedStatement.callStatement(BaseDistributedSearchTestCase.java:992)
[junit4] > at org.apache.solr.BaseDistributedSearchTestCase$ShardsRepeatRule$ShardsStatement.evaluate(BaseDistributedSearchTestCase.java:967)
[junit4] > at java.lang.Thread.run(Thread.java:745)
[junit4] 2> 1574761 INFO (SUITE-PeerSyncReplicationTest-seed#[236A8C932ECFF850]-worker) [ ] o.a.s.SolrTestCaseJ4 ###deleteCore
[junit4] 2> NOTE: leaving temporary files on disk at: /home/jenkins/workspace/Lucene-Solr-6.x-Linux/solr/build/solr-core/test/J2/temp/solr.cloud.PeerSyncReplicationTest_236A8C932ECFF850-001
[junit4] 2> Oct 15, 2016 3:42:19 PM com.carrotsearch.randomizedtesting.ThreadLeakControl checkThreadLeaks
[junit4] 2> WARNING: Will linger awaiting termination of 1 leaked thread(s).
[junit4] 2> NOTE: test params are: codec=Asserting(Lucene62): {other_tl1=PostingsFormat(name=MockRandom), range_facet_l_dv=TestBloomFilteredLucenePostings(BloomFilteringPostingsFormat(Lucene50(blocksize=128))), rnd_s=PostingsFormat(name=MockRandom), multiDefault=PostingsFormat(name=LuceneVarGapDocFreqInterval), intDefault=PostingsFormat(name=MockRandom), a_i1=PostingsFormat(name=LuceneVarGapDocFreqInterval), range_facet_l=PostingsFormat(name=LuceneVarGapDocFreqInterval), _version_=PostingsFormat(name=MockRandom), a_t=PostingsFormat(name=LuceneVarGapDocFreqInterval), id=TestBloomFilteredLucenePostings(BloomFilteringPostingsFormat(Lucene50(blocksize=128))), range_facet_i_dv=PostingsFormat(name=LuceneVarGapDocFreqInterval), text=Lucene50(blocksize=128), timestamp=PostingsFormat(name=LuceneVarGapDocFreqInterval)}, docValues:{range_facet_l_dv=DocValuesFormat(name=Lucene54), range_facet_i_dv=DocValuesFormat(name=Memory), timestamp=DocValuesFormat(name=Memory)}, maxPointsInLeafNode=806, maxMBSortInHeap=7.9213187940023175, sim=RandomSimilarity(queryNorm=false,coord=crazy): {}, locale=mt-MT, timezone=America/Grenada
[junit4] 2> NOTE: Linux 4.4.0-36-generic i386/Oracle Corporation 1.8.0_102 (32-bit)/cpus=12,threads=1,free=182503536,total=518979584
[junit4] 2> NOTE: All tests run in this JVM: [CoreAdminCreateDiscoverTest, DistributedQueryComponentCustomSortTest, QueryParsingTest, TlogReplayBufferedWhileIndexingTest, OverseerStatusTest, FieldMutatingUpdateProcessorTest, TestSmileRequest, FacetPivotSmallTest, TestUtils, RecoveryAfterSoftCommitTest, TestConfig, UnloadDistributedZkTest, TestWriterPerf, TestManagedResource, CollectionStateFormat2Test, TestScoreJoinQPNoScore, FieldAnalysisRequestHandlerTest, SuggesterTest, DistribCursorPagingTest, SuggesterTSTTest, CdcrRequestHandlerTest, TestClusterStateMutator, UpdateParamsTest, TestSha256AuthenticationProvider, StatsReloadRaceTest, TestConfigOverlay, TestStressLiveNodes, JSONWriterTest, TestMacros, LeaderInitiatedRecoveryOnCommitTest, TestLeaderElectionZkExpiry, OpenExchangeRatesOrgProviderTest, TestSort, OverseerCollectionConfigSetProcessorTest, HdfsUnloadDistributedZkTest, TestManagedStopFilterFactory, HdfsSyncSliceTest, DocValuesMissingTest, TestCryptoKeys, TestMaxScoreQueryParser, HdfsLockFactoryTest, TestStressReorder, TestSubQueryTransformer, TestFieldTypeResource, TestReload, TestSSLRandomization, TestAnalyzeInfixSuggestions, TestIndexingPerformance, TestZkChroot, TestOverriddenPrefixQueryForCustomFieldType, HighlighterMaxOffsetTest, AnalyticsQueryTest, TestReversedWildcardFilterFactory, TestRandomFlRTGCloud, TestSolrCloudWithDelegationTokens, TestMergePolicyConfig, TestLRUStatsCache, TestDFISimilarityFactory, RecoveryZkTest, ShowFileRequestHandlerTest, TestReqParamsAPI, BadComponentTest, TestGraphTermsQParserPlugin, TestPerFieldSimilarityWithDefaultOverride, HdfsBasicDistributedZk2Test, ResponseLogComponentTest, HLLSerializationTest, SSLMigrationTest, TestPostingsSolrHighlighter, BitVectorTest, TestImplicitCoreProperties, BasicDistributedZk2Test, TestSortByMinMaxFunction, TestObjectReleaseTracker, TestSolrQueryParserDefaultOperatorResource, TestManagedSchemaAPI, TestSolrQueryResponse, PrimitiveFieldTypeTest, ConnectionManagerTest, 
TestUseDocValuesAsStored, MinimalSchemaTest, HardAutoCommitTest, SharedFSAutoReplicaFailoverTest, DistributedFacetExistsSmallTest, TestFieldCollectionResource, SolrCLIZkUtilsTest, TermVectorComponentDistributedTest, TestCollationField, TestDFRSimilarityFactory, SuggestComponentTest, CleanupOldIndexTest, TestExtendedDismaxParser, TestStandardQParsers, LeaderElectionTest, TestConfigReload, SOLR749Test, TestExceedMaxTermLength, DeleteReplicaTest, TolerantUpdateProcessorTest, TestExclusionRuleCollectionAccess, TestReloadDeadlock, ChaosMonkeySafeLeaderTest, OverseerTest, ShardRoutingTest, FullSolrCloudDistribCmdsTest, TestRandomFaceting, ZkSolrClientTest, TestRandomDVFaceting, TestDistributedSearch, TestGroupingSearch, PeerSyncTest, BadIndexSchemaTest, DirectUpdateHandlerTest, SoftAutoCommitTest, SpellCheckCollatorTest, TestFoldingMultitermQuery, SuggesterWFSTTest, SolrCoreCheckLockOnStartupTest, DirectUpdateHandlerOptimizeTest, StatelessScriptUpdateProcessorFactoryTest, DebugComponentTest, IndexBasedSpellCheckerTest, StandardRequestHandlerTest, DocumentAnalysisRequestHandlerTest, TestOmitPositions, XmlUpdateRequestHandlerTest, DocumentBuilderTest, TestValueSourceCache, MoreLikeThisHandlerTest, TestSolrQueryParser, IndexSchemaRuntimeFieldTest, SolrPluginUtilsTest, TestCSVResponseWriter, TestAnalyzedSuggestions, TestPHPSerializedResponseWriter, TestComponentsName, SearchHandlerTest, TestLFUCache, ActionThrottleTest, AssignTest, AsyncCallRequestStatusResponseTest, CdcrReplicationDistributedZkTest, CollectionTooManyReplicasTest, CollectionsAPISolrJTest, ConfigSetsAPITest, CreateCollectionCleanupTest, DeleteInactiveReplicaTest, DeleteStatusTest, DistribDocExpirationUpdateProcessorTest, DistributedQueueTest, MigrateRouteKeyTest, MultiThreadedOCPTest, OutOfBoxZkACLAndCredentialsProvidersTest, OverseerModifyCollectionTest, OverseerRolesTest, PeerSyncReplicationTest]
[junit4] Completed [453/640 (1!)] on J2 in 56.67s, 1 test, 1 failure <<< FAILURES!
[...truncated 53296 lines...]
[JENKINS-EA] Lucene-Solr-6.x-Linux (32bit/jdk-9-ea+138) - Build # 1956 - Still Unstable!
Posted by Policeman Jenkins Server <je...@thetaphi.de>.
Build: https://jenkins.thetaphi.de/job/Lucene-Solr-6.x-Linux/1956/
Java: 32bit/jdk-9-ea+138 -server -XX:+UseSerialGC
1 tests failed.
FAILED: org.apache.solr.cloud.OverseerTaskQueueTest.testPeekElements
Error Message:
expected:<1> but was:<0>
Stack Trace:
java.lang.AssertionError: expected:<1> but was:<0>
at __randomizedtesting.SeedInfo.seed([533C15679CC8CFC3:AE12AF464CF19BDE]:0)
at org.junit.Assert.fail(Assert.java:93)
at org.junit.Assert.failNotEquals(Assert.java:647)
at org.junit.Assert.assertEquals(Assert.java:128)
at org.junit.Assert.assertEquals(Assert.java:472)
at org.junit.Assert.assertEquals(Assert.java:456)
at org.apache.solr.cloud.DistributedQueueTest.testPeekElements(DistributedQueueTest.java:177)
at jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(java.base@9-ea/Native Method)
at jdk.internal.reflect.NativeMethodAccessorImpl.invoke(java.base@9-ea/NativeMethodAccessorImpl.java:62)
at jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(java.base@9-ea/DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(java.base@9-ea/Method.java:535)
at com.carrotsearch.randomizedtesting.RandomizedRunner.invoke(RandomizedRunner.java:1764)
at com.carrotsearch.randomizedtesting.RandomizedRunner$8.evaluate(RandomizedRunner.java:871)
at com.carrotsearch.randomizedtesting.RandomizedRunner$9.evaluate(RandomizedRunner.java:907)
at com.carrotsearch.randomizedtesting.RandomizedRunner$10.evaluate(RandomizedRunner.java:921)
at com.carrotsearch.randomizedtesting.rules.SystemPropertiesRestoreRule$1.evaluate(SystemPropertiesRestoreRule.java:57)
at org.apache.lucene.util.TestRuleSetupTeardownChained$1.evaluate(TestRuleSetupTeardownChained.java:49)
at org.apache.lucene.util.AbstractBeforeAfterRule$1.evaluate(AbstractBeforeAfterRule.java:45)
at org.apache.lucene.util.TestRuleThreadAndTestName$1.evaluate(TestRuleThreadAndTestName.java:48)
at org.apache.lucene.util.TestRuleIgnoreAfterMaxFailures$1.evaluate(TestRuleIgnoreAfterMaxFailures.java:64)
at org.apache.lucene.util.TestRuleMarkFailure$1.evaluate(TestRuleMarkFailure.java:47)
at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
at com.carrotsearch.randomizedtesting.ThreadLeakControl$StatementRunner.run(ThreadLeakControl.java:367)
at com.carrotsearch.randomizedtesting.ThreadLeakControl.forkTimeoutingTask(ThreadLeakControl.java:809)
at com.carrotsearch.randomizedtesting.ThreadLeakControl$3.evaluate(ThreadLeakControl.java:460)
at com.carrotsearch.randomizedtesting.RandomizedRunner.runSingleTest(RandomizedRunner.java:880)
at com.carrotsearch.randomizedtesting.RandomizedRunner$5.evaluate(RandomizedRunner.java:781)
at com.carrotsearch.randomizedtesting.RandomizedRunner$6.evaluate(RandomizedRunner.java:816)
at com.carrotsearch.randomizedtesting.RandomizedRunner$7.evaluate(RandomizedRunner.java:827)
at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
at com.carrotsearch.randomizedtesting.rules.SystemPropertiesRestoreRule$1.evaluate(SystemPropertiesRestoreRule.java:57)
at org.apache.lucene.util.AbstractBeforeAfterRule$1.evaluate(AbstractBeforeAfterRule.java:45)
at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
at org.apache.lucene.util.TestRuleStoreClassName$1.evaluate(TestRuleStoreClassName.java:41)
at com.carrotsearch.randomizedtesting.rules.NoShadowingOrOverridesOnMethodsRule$1.evaluate(NoShadowingOrOverridesOnMethodsRule.java:40)
at com.carrotsearch.randomizedtesting.rules.NoShadowingOrOverridesOnMethodsRule$1.evaluate(NoShadowingOrOverridesOnMethodsRule.java:40)
at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
at org.apache.lucene.util.TestRuleAssertionsRequired$1.evaluate(TestRuleAssertionsRequired.java:53)
at org.apache.lucene.util.TestRuleMarkFailure$1.evaluate(TestRuleMarkFailure.java:47)
at org.apache.lucene.util.TestRuleIgnoreAfterMaxFailures$1.evaluate(TestRuleIgnoreAfterMaxFailures.java:64)
at org.apache.lucene.util.TestRuleIgnoreTestSuites$1.evaluate(TestRuleIgnoreTestSuites.java:54)
at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
at com.carrotsearch.randomizedtesting.ThreadLeakControl$StatementRunner.run(ThreadLeakControl.java:367)
at java.lang.Thread.run(java.base@9-ea/Thread.java:843)
Build Log:
[...truncated 11063 lines...]
[junit4] Suite: org.apache.solr.cloud.OverseerTaskQueueTest
[junit4] 2> Creating dataDir: /home/jenkins/workspace/Lucene-Solr-6.x-Linux/solr/build/solr-core/test/J0/temp/solr.cloud.OverseerTaskQueueTest_533C15679CC8CFC3-001/init-core-data-001
[junit4] 2> 304996 INFO (SUITE-OverseerTaskQueueTest-seed#[533C15679CC8CFC3]-worker) [ ] o.a.s.SolrTestCaseJ4 Randomized ssl (false) and clientAuth (true) via: @org.apache.solr.util.RandomizeSSL(reason="", ssl=0.0/0.0, value=0.0/0.0, clientAuth=0.0/0.0)
[junit4] 2> 304997 INFO (TEST-OverseerTaskQueueTest.testContainsTaskWithRequestId-seed#[533C15679CC8CFC3]) [ ] o.a.s.SolrTestCaseJ4 ###Starting testContainsTaskWithRequestId
[junit4] 2> 304997 INFO (TEST-OverseerTaskQueueTest.testContainsTaskWithRequestId-seed#[533C15679CC8CFC3]) [ ] o.a.s.c.ZkTestServer STARTING ZK TEST SERVER
[junit4] 2> 304997 INFO (Thread-630) [ ] o.a.s.c.ZkTestServer client port:0.0.0.0/0.0.0.0:0
[junit4] 2> 304997 INFO (Thread-630) [ ] o.a.s.c.ZkTestServer Starting server
[junit4] 2> 305097 INFO (TEST-OverseerTaskQueueTest.testContainsTaskWithRequestId-seed#[533C15679CC8CFC3]) [ ] o.a.s.c.ZkTestServer start zk server on port:42184
[junit4] 2> 305104 INFO (TEST-OverseerTaskQueueTest.testContainsTaskWithRequestId-seed#[533C15679CC8CFC3]) [ ] o.a.s.SolrTestCaseJ4 ###Ending testContainsTaskWithRequestId
[junit4] 2> 305105 INFO (TEST-OverseerTaskQueueTest.testContainsTaskWithRequestId-seed#[533C15679CC8CFC3]) [ ] o.a.s.c.ZkTestServer connecting to 127.0.0.1:42184 42184
[junit4] 2> 305151 INFO (Thread-630) [ ] o.a.s.c.ZkTestServer connecting to 127.0.0.1:42184 42184
[junit4] 2> 305159 INFO (TEST-OverseerTaskQueueTest.testDistributedQueueBlocking-seed#[533C15679CC8CFC3]) [ ] o.a.s.SolrTestCaseJ4 ###Starting testDistributedQueueBlocking
[junit4] 2> 305159 INFO (TEST-OverseerTaskQueueTest.testDistributedQueueBlocking-seed#[533C15679CC8CFC3]) [ ] o.a.s.c.ZkTestServer STARTING ZK TEST SERVER
[junit4] 2> 305159 INFO (Thread-631) [ ] o.a.s.c.ZkTestServer client port:0.0.0.0/0.0.0.0:0
[junit4] 2> 305159 INFO (Thread-631) [ ] o.a.s.c.ZkTestServer Starting server
[junit4] 2> 305259 INFO (TEST-OverseerTaskQueueTest.testDistributedQueueBlocking-seed#[533C15679CC8CFC3]) [ ] o.a.s.c.ZkTestServer start zk server on port:46725
[junit4] 2> 306470 WARN (zkCallback-771-thread-1) [ ] o.a.s.c.c.ConnectionManager Watcher org.apache.solr.common.cloud.ConnectionManager@83f85 name: ZooKeeperConnection Watcher:127.0.0.1:46725/solr got event WatchedEvent state:Disconnected type:None path:null path: null type: None
[junit4] 2> 306470 WARN (zkCallback-771-thread-1) [ ] o.a.s.c.c.ConnectionManager zkClient has disconnected
[junit4] 2> 308110 WARN (zkCallback-771-thread-3) [ ] o.a.s.c.c.ConnectionManager Watcher org.apache.solr.common.cloud.ConnectionManager@83f85 name: ZooKeeperConnection Watcher:127.0.0.1:46725/solr got event WatchedEvent state:Expired type:None path:null path: null type: None
[junit4] 2> 308110 WARN (zkCallback-771-thread-3) [ ] o.a.s.c.c.ConnectionManager Our previous ZooKeeper session was expired. Attempting to reconnect to recover relationship with ZooKeeper...
[junit4] 2> 308110 WARN (zkCallback-771-thread-3) [ ] o.a.s.c.c.DefaultConnectionStrategy Connection expired - starting a new one...
[junit4] 2> 308112 INFO (zkCallback-771-thread-3) [ ] o.a.s.c.c.ConnectionManager Connection with ZooKeeper reestablished.
[junit4] 2> 308112 INFO (zkCallback-771-thread-3) [ ] o.a.s.c.c.DefaultConnectionStrategy Reconnected to ZooKeeper
[junit4] 2> 308112 INFO (zkCallback-771-thread-3) [ ] o.a.s.c.c.ConnectionManager Connected:true
[junit4] 2> 309264 INFO (TEST-OverseerTaskQueueTest.testDistributedQueueBlocking-seed#[533C15679CC8CFC3]) [ ] o.a.s.SolrTestCaseJ4 ###Ending testDistributedQueueBlocking
[junit4] 2> 309264 INFO (TEST-OverseerTaskQueueTest.testDistributedQueueBlocking-seed#[533C15679CC8CFC3]) [ ] o.a.s.c.ZkTestServer connecting to 127.0.0.1:46725 46725
[junit4] 2> 309294 INFO (Thread-631) [ ] o.a.s.c.ZkTestServer connecting to 127.0.0.1:46725 46725
[junit4] 2> 309296 WARN (Thread-631) [ ] o.a.s.c.ZkTestServer Watch limit violations:
[junit4] 2> Maximum concurrent children watches above limit:
[junit4] 2>
[junit4] 2> 4 /solr/distqueue/test
[junit4] 2>
[junit4] 2> 309298 INFO (TEST-OverseerTaskQueueTest.testDistributedQueue-seed#[533C15679CC8CFC3]) [ ] o.a.s.SolrTestCaseJ4 ###Starting testDistributedQueue
[junit4] 2> 309298 INFO (TEST-OverseerTaskQueueTest.testDistributedQueue-seed#[533C15679CC8CFC3]) [ ] o.a.s.c.ZkTestServer STARTING ZK TEST SERVER
[junit4] 2> 309298 INFO (Thread-632) [ ] o.a.s.c.ZkTestServer client port:0.0.0.0/0.0.0.0:0
[junit4] 2> 309298 INFO (Thread-632) [ ] o.a.s.c.ZkTestServer Starting server
[junit4] 2> 309398 INFO (TEST-OverseerTaskQueueTest.testDistributedQueue-seed#[533C15679CC8CFC3]) [ ] o.a.s.c.ZkTestServer start zk server on port:35806
[junit4] 2> 316157 INFO (TEST-OverseerTaskQueueTest.testDistributedQueue-seed#[533C15679CC8CFC3]) [ ] o.a.s.SolrTestCaseJ4 ###Ending testDistributedQueue
[junit4] 2> 316158 INFO (TEST-OverseerTaskQueueTest.testDistributedQueue-seed#[533C15679CC8CFC3]) [ ] o.a.s.c.ZkTestServer connecting to 127.0.0.1:35806 35806
[junit4] 2> 316213 INFO (Thread-632) [ ] o.a.s.c.ZkTestServer connecting to 127.0.0.1:35806 35806
[junit4] 2> 316214 WARN (Thread-632) [ ] o.a.s.c.ZkTestServer Watch limit violations:
[junit4] 2> Maximum concurrent children watches above limit:
[junit4] 2>
[junit4] 2> 4 /solr/distqueue/test
[junit4] 2>
[junit4] 2> 316216 INFO (TEST-OverseerTaskQueueTest.testPeekElements-seed#[533C15679CC8CFC3]) [ ] o.a.s.SolrTestCaseJ4 ###Starting testPeekElements
[junit4] 2> 316217 INFO (TEST-OverseerTaskQueueTest.testPeekElements-seed#[533C15679CC8CFC3]) [ ] o.a.s.c.ZkTestServer STARTING ZK TEST SERVER
[junit4] 2> 316217 INFO (Thread-635) [ ] o.a.s.c.ZkTestServer client port:0.0.0.0/0.0.0.0:0
[junit4] 2> 316217 INFO (Thread-635) [ ] o.a.s.c.ZkTestServer Starting server
[junit4] 2> 316317 INFO (TEST-OverseerTaskQueueTest.testPeekElements-seed#[533C15679CC8CFC3]) [ ] o.a.s.c.ZkTestServer start zk server on port:37871
[junit4] 2> 319330 INFO (TEST-OverseerTaskQueueTest.testPeekElements-seed#[533C15679CC8CFC3]) [ ] o.a.s.SolrTestCaseJ4 ###Ending testPeekElements
[junit4] 2> 320770 WARN (NIOServerCxn.Factory:0.0.0.0/0.0.0.0:0) [ ] o.a.z.s.NIOServerCnxn caught end of stream exception
[junit4] 2> EndOfStreamException: Unable to read additional data from client sessionid 0x157c98c27470000, likely client has closed socket
[junit4] 2> at org.apache.zookeeper.server.NIOServerCnxn.doIO(NIOServerCnxn.java:228)
[junit4] 2> at org.apache.zookeeper.server.NIOServerCnxnFactory.run(NIOServerCnxnFactory.java:208)
[junit4] 2> at java.lang.Thread.run(java.base@9-ea/Thread.java:843)
[junit4] 2> 320771 INFO (TEST-OverseerTaskQueueTest.testPeekElements-seed#[533C15679CC8CFC3]) [ ] o.a.s.c.ZkTestServer connecting to 127.0.0.1:37871 37871
[junit4] 2> 320786 INFO (Thread-635) [ ] o.a.s.c.ZkTestServer connecting to 127.0.0.1:37871 37871
[junit4] 2> NOTE: reproduce with: ant test -Dtestcase=OverseerTaskQueueTest -Dtests.method=testPeekElements -Dtests.seed=533C15679CC8CFC3 -Dtests.multiplier=3 -Dtests.slow=true -Dtests.locale=sr-Cyrl-RS -Dtests.timezone=Australia/Canberra -Dtests.asserts=true -Dtests.file.encoding=ISO-8859-1
[junit4] FAILURE 4.57s J0 | OverseerTaskQueueTest.testPeekElements <<<
[junit4] > Throwable #1: java.lang.AssertionError: expected:<1> but was:<0>
[junit4] > at __randomizedtesting.SeedInfo.seed([533C15679CC8CFC3:AE12AF464CF19BDE]:0)
[junit4] > at org.apache.solr.cloud.DistributedQueueTest.testPeekElements(DistributedQueueTest.java:177)
[junit4] > at jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(java.base@9-ea/Native Method)
[junit4] > at jdk.internal.reflect.NativeMethodAccessorImpl.invoke(java.base@9-ea/NativeMethodAccessorImpl.java:62)
[junit4] > at jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(java.base@9-ea/DelegatingMethodAccessorImpl.java:43)
[junit4] > at java.lang.Thread.run(java.base@9-ea/Thread.java:843)
[junit4] 2> 320788 INFO (SUITE-OverseerTaskQueueTest-seed#[533C15679CC8CFC3]-worker) [ ] o.a.s.SolrTestCaseJ4 ###deleteCore
[junit4] 2> NOTE: leaving temporary files on disk at: /home/jenkins/workspace/Lucene-Solr-6.x-Linux/solr/build/solr-core/test/J0/temp/solr.cloud.OverseerTaskQueueTest_533C15679CC8CFC3-001
[junit4] 2> Oct 15, 2016 6:13:12 PM com.carrotsearch.randomizedtesting.ThreadLeakControl checkThreadLeaks
[junit4] 2> WARNING: Will linger awaiting termination of 2 leaked thread(s).
[junit4] 2> NOTE: test params are: codec=Asserting(Lucene62): {}, docValues:{}, maxPointsInLeafNode=297, maxMBSortInHeap=5.0462112515754916, sim=RandomSimilarity(queryNorm=true,coord=crazy): {}, locale=sr-Cyrl-RS, timezone=Australia/Canberra
[junit4] 2> NOTE: Linux 4.4.0-36-generic i386/Oracle Corporation 9-ea (32-bit)/cpus=12,threads=1,free=51327800,total=285560832
[junit4] 2> NOTE: All tests run in this JVM: [TestEmbeddedSolrServerConstructors, TestBinaryResponseWriter, DistributedQueryElevationComponentTest, DefaultValueUpdateProcessorTest, CoreAdminRequestStatusTest, TestSearcherReuse, TestCollapseQParserPlugin, ExternalFileFieldSortTest, TestReloadAndDeleteDocs, AtomicUpdatesTest, TestDocTermOrds, SyncSliceTest, XsltUpdateRequestHandlerTest, TriLevelCompositeIdRoutingTest, PathHierarchyTokenizerFactoryTest, TestFieldCacheVsDocValues, AlternateDirectoryTest, CdcrVersionReplicationTest, TestSolr4Spatial, BigEndianAscendingWordDeserializerTest, DistributedDebugComponentTest, AddBlockUpdateTest, TestDocSet, TestOmitPositions, PreAnalyzedUpdateProcessorTest, TestMinMaxOnMultiValuedField, CreateCollectionCleanupTest, TestNumericTerms32, TestSolrConfigHandlerCloud, DistribDocExpirationUpdateProcessorTest, BasicAuthIntegrationTest, TestCodecSupport, PolyFieldTest, TestManagedResourceStorage, TestManagedSchemaThreadSafety, OutOfBoxZkACLAndCredentialsProvidersTest, TestScoreJoinQPScore, QueryResultKeyTest, DateFieldTest, CacheHeaderTest, HdfsCollectionsAPIDistributedZkTest, UUIDFieldTest, TestLockTree, ExplicitHLLTest, HdfsWriteToMultipleCollectionsTest, TestTrackingShardHandlerFactory, SolrRequestParserTest, TestCSVLoader, HighlighterConfigTest, TestSortingResponseWriter, TestJmxIntegration, ConcurrentDeleteAndCreateCollectionTest, TestRawResponseWriter, OverseerTaskQueueTest]
[junit4] Completed [120/640 (1!)] on J0 in 16.31s, 4 tests, 1 failure <<< FAILURES!
[...truncated 51581 lines...]