Posted to dev@lucene.apache.org by Policeman Jenkins Server <je...@thetaphi.de> on 2018/05/16 03:03:33 UTC

[JENKINS] Lucene-Solr-BadApples-7.x-Linux (64bit/jdk-9.0.4) - Build # 37 - Failure!

Build: https://jenkins.thetaphi.de/job/Lucene-Solr-BadApples-7.x-Linux/37/
Java: 64bit/jdk-9.0.4 -XX:+UseCompressedOops -XX:+UseConcMarkSweepGC

All tests passed

Build Log:
[...truncated 1809 lines...]
   [junit4] JVM J2: stderr was not empty, see: /home/jenkins/workspace/Lucene-Solr-BadApples-7.x-Linux/lucene/build/core/test/temp/junit4-J2-20180516_025016_457709195438340551970.syserr
   [junit4] >>> JVM J2 emitted unexpected output (verbatim) ----
   [junit4] Java HotSpot(TM) 64-Bit Server VM warning: Option UseConcMarkSweepGC was deprecated in version 9.0 and will likely be removed in a future release.
   [junit4] <<< JVM J2: EOF ----
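Each of the non-empty stderr files flagged in this log carries the same single HotSpot warning: the -XX:+UseConcMarkSweepGC flag from the Java line above is deprecated as of JDK 9. A minimal, hypothetical check (not part of this build) that prints the collectors actually in use, handy for confirming whether a run is still on CMS or has moved to the JDK 9 default, G1:

    import java.lang.management.GarbageCollectorMXBean;
    import java.lang.management.ManagementFactory;

    public class GcReport {
        public static void main(String[] args) {
            // Prints the active collectors, e.g. "ParNew"/"ConcurrentMarkSweep" under CMS
            // or "G1 Young Generation"/"G1 Old Generation" under G1.
            for (GarbageCollectorMXBean gc : ManagementFactory.getGarbageCollectorMXBeans()) {
                System.out.println(gc.getName());
            }
        }
    }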

[...truncated 3 lines...]
   [junit4] JVM J0: stderr was not empty, see: /home/jenkins/workspace/Lucene-Solr-BadApples-7.x-Linux/lucene/build/core/test/temp/junit4-J0-20180516_025016_45515472862549252371370.syserr
   [junit4] >>> JVM J0 emitted unexpected output (verbatim) ----
   [junit4] Java HotSpot(TM) 64-Bit Server VM warning: Option UseConcMarkSweepGC was deprecated in version 9.0 and will likely be removed in a future release.
   [junit4] <<< JVM J0: EOF ----

[...truncated 3 lines...]
   [junit4] JVM J1: stderr was not empty, see: /home/jenkins/workspace/Lucene-Solr-BadApples-7.x-Linux/lucene/build/core/test/temp/junit4-J1-20180516_025016_45713014644300883597524.syserr
   [junit4] >>> JVM J1 emitted unexpected output (verbatim) ----
   [junit4] Java HotSpot(TM) 64-Bit Server VM warning: Option UseConcMarkSweepGC was deprecated in version 9.0 and will likely be removed in a future release.
   [junit4] <<< JVM J1: EOF ----

[...truncated 295 lines...]
   [junit4] JVM J0: stderr was not empty, see: /home/jenkins/workspace/Lucene-Solr-BadApples-7.x-Linux/lucene/build/test-framework/test/temp/junit4-J0-20180516_025533_4246660678585464796955.syserr
   [junit4] >>> JVM J0 emitted unexpected output (verbatim) ----
   [junit4] Java HotSpot(TM) 64-Bit Server VM warning: Option UseConcMarkSweepGC was deprecated in version 9.0 and will likely be removed in a future release.
   [junit4] <<< JVM J0: EOF ----

[...truncated 6 lines...]
   [junit4] JVM J1: stderr was not empty, see: /home/jenkins/workspace/Lucene-Solr-BadApples-7.x-Linux/lucene/build/test-framework/test/temp/junit4-J1-20180516_025533_4241363484868251727641.syserr
   [junit4] >>> JVM J1 emitted unexpected output (verbatim) ----
   [junit4] Java HotSpot(TM) 64-Bit Server VM warning: Option UseConcMarkSweepGC was deprecated in version 9.0 and will likely be removed in a future release.
   [junit4] <<< JVM J1: EOF ----

   [junit4] JVM J2: stderr was not empty, see: /home/jenkins/workspace/Lucene-Solr-BadApples-7.x-Linux/lucene/build/test-framework/test/temp/junit4-J2-20180516_025533_4242355200695445415624.syserr
   [junit4] >>> JVM J2 emitted unexpected output (verbatim) ----
   [junit4] Java HotSpot(TM) 64-Bit Server VM warning: Option UseConcMarkSweepGC was deprecated in version 9.0 and will likely be removed in a future release.
   [junit4] <<< JVM J2: EOF ----

[...truncated 1075 lines...]
   [junit4] JVM J2: stderr was not empty, see: /home/jenkins/workspace/Lucene-Solr-BadApples-7.x-Linux/lucene/build/analysis/common/test/temp/junit4-J2-20180516_025642_83511669479631096808731.syserr
   [junit4] >>> JVM J2 emitted unexpected output (verbatim) ----
   [junit4] Java HotSpot(TM) 64-Bit Server VM warning: Option UseConcMarkSweepGC was deprecated in version 9.0 and will likely be removed in a future release.
   [junit4] <<< JVM J2: EOF ----

[...truncated 3 lines...]
   [junit4] JVM J0: stderr was not empty, see: /home/jenkins/workspace/Lucene-Solr-BadApples-7.x-Linux/lucene/build/analysis/common/test/temp/junit4-J0-20180516_025642_83514473295086673092907.syserr
   [junit4] >>> JVM J0 emitted unexpected output (verbatim) ----
   [junit4] Java HotSpot(TM) 64-Bit Server VM warning: Option UseConcMarkSweepGC was deprecated in version 9.0 and will likely be removed in a future release.
   [junit4] <<< JVM J0: EOF ----

[...truncated 3 lines...]
   [junit4] JVM J1: stderr was not empty, see: /home/jenkins/workspace/Lucene-Solr-BadApples-7.x-Linux/lucene/build/analysis/common/test/temp/junit4-J1-20180516_025642_83518297475497425541755.syserr
   [junit4] >>> JVM J1 emitted unexpected output (verbatim) ----
   [junit4] Java HotSpot(TM) 64-Bit Server VM warning: Option UseConcMarkSweepGC was deprecated in version 9.0 and will likely be removed in a future release.
   [junit4] <<< JVM J1: EOF ----

[...truncated 261 lines...]
   [junit4] JVM J2: stderr was not empty, see: /home/jenkins/workspace/Lucene-Solr-BadApples-7.x-Linux/lucene/build/analysis/icu/test/temp/junit4-J2-20180516_025809_89715872878883994530572.syserr
   [junit4] >>> JVM J2 emitted unexpected output (verbatim) ----
   [junit4] Java HotSpot(TM) 64-Bit Server VM warning: Option UseConcMarkSweepGC was deprecated in version 9.0 and will likely be removed in a future release.
   [junit4] <<< JVM J2: EOF ----

   [junit4] JVM J1: stderr was not empty, see: /home/jenkins/workspace/Lucene-Solr-BadApples-7.x-Linux/lucene/build/analysis/icu/test/temp/junit4-J1-20180516_025809_8966057010044097455038.syserr
   [junit4] >>> JVM J1 emitted unexpected output (verbatim) ----
   [junit4] Java HotSpot(TM) 64-Bit Server VM warning: Option UseConcMarkSweepGC was deprecated in version 9.0 and will likely be removed in a future release.
   [junit4] <<< JVM J1: EOF ----

   [junit4] JVM J0: stderr was not empty, see: /home/jenkins/workspace/Lucene-Solr-BadApples-7.x-Linux/lucene/build/analysis/icu/test/temp/junit4-J0-20180516_025809_8969927938993738136981.syserr
   [junit4] >>> JVM J0 emitted unexpected output (verbatim) ----
   [junit4] Java HotSpot(TM) 64-Bit Server VM warning: Option UseConcMarkSweepGC was deprecated in version 9.0 and will likely be removed in a future release.
   [junit4] <<< JVM J0: EOF ----

[...truncated 250 lines...]
   [junit4] JVM J0: stderr was not empty, see: /home/jenkins/workspace/Lucene-Solr-BadApples-7.x-Linux/lucene/build/analysis/kuromoji/test/temp/junit4-J0-20180516_025821_60411225029471790689363.syserr
   [junit4] >>> JVM J0 emitted unexpected output (verbatim) ----
   [junit4] Java HotSpot(TM) 64-Bit Server VM warning: Option UseConcMarkSweepGC was deprecated in version 9.0 and will likely be removed in a future release.
   [junit4] <<< JVM J0: EOF ----

[...truncated 3 lines...]
   [junit4] JVM J2: stderr was not empty, see: /home/jenkins/workspace/Lucene-Solr-BadApples-7.x-Linux/lucene/build/analysis/kuromoji/test/temp/junit4-J2-20180516_025821_6047092970079643302687.syserr
   [junit4] >>> JVM J2 emitted unexpected output (verbatim) ----
   [junit4] Java HotSpot(TM) 64-Bit Server VM warning: Option UseConcMarkSweepGC was deprecated in version 9.0 and will likely be removed in a future release.
   [junit4] <<< JVM J2: EOF ----

[...truncated 6 lines...]
   [junit4] JVM J1: stderr was not empty, see: /home/jenkins/workspace/Lucene-Solr-BadApples-7.x-Linux/lucene/build/analysis/kuromoji/test/temp/junit4-J1-20180516_025821_6047164271500479275491.syserr
   [junit4] >>> JVM J1 emitted unexpected output (verbatim) ----
   [junit4] Java HotSpot(TM) 64-Bit Server VM warning: Option UseConcMarkSweepGC was deprecated in version 9.0 and will likely be removed in a future release.
   [junit4] <<< JVM J1: EOF ----

[...truncated 162 lines...]
   [junit4] JVM J1: stderr was not empty, see: /home/jenkins/workspace/Lucene-Solr-BadApples-7.x-Linux/lucene/build/analysis/morfologik/test/temp/junit4-J1-20180516_025845_2495471817170273303487.syserr
   [junit4] >>> JVM J1 emitted unexpected output (verbatim) ----
   [junit4] Java HotSpot(TM) 64-Bit Server VM warning: Option UseConcMarkSweepGC was deprecated in version 9.0 and will likely be removed in a future release.
   [junit4] <<< JVM J1: EOF ----

[...truncated 3 lines...]
   [junit4] JVM J0: stderr was not empty, see: /home/jenkins/workspace/Lucene-Solr-BadApples-7.x-Linux/lucene/build/analysis/morfologik/test/temp/junit4-J0-20180516_025845_24915621158852619008096.syserr
   [junit4] >>> JVM J0 emitted unexpected output (verbatim) ----
   [junit4] Java HotSpot(TM) 64-Bit Server VM warning: Option UseConcMarkSweepGC was deprecated in version 9.0 and will likely be removed in a future release.
   [junit4] <<< JVM J0: EOF ----

[...truncated 3 lines...]
   [junit4] JVM J2: stderr was not empty, see: /home/jenkins/workspace/Lucene-Solr-BadApples-7.x-Linux/lucene/build/analysis/morfologik/test/temp/junit4-J2-20180516_025845_25015239643798987324656.syserr
   [junit4] >>> JVM J2 emitted unexpected output (verbatim) ----
   [junit4] Java HotSpot(TM) 64-Bit Server VM warning: Option UseConcMarkSweepGC was deprecated in version 9.0 and will likely be removed in a future release.
   [junit4] <<< JVM J2: EOF ----

[...truncated 207 lines...]
   [junit4] JVM J2: stderr was not empty, see: /home/jenkins/workspace/Lucene-Solr-BadApples-7.x-Linux/lucene/build/analysis/nori/test/temp/junit4-J2-20180516_025848_9597788216630754664368.syserr
   [junit4] >>> JVM J2 emitted unexpected output (verbatim) ----
   [junit4] Java HotSpot(TM) 64-Bit Server VM warning: Option UseConcMarkSweepGC was deprecated in version 9.0 and will likely be removed in a future release.
   [junit4] <<< JVM J2: EOF ----

[...truncated 3 lines...]
   [junit4] JVM J0: stderr was not empty, see: /home/jenkins/workspace/Lucene-Solr-BadApples-7.x-Linux/lucene/build/analysis/nori/test/temp/junit4-J0-20180516_025848_95914088226523100589390.syserr
   [junit4] >>> JVM J0 emitted unexpected output (verbatim) ----
   [junit4] Java HotSpot(TM) 64-Bit Server VM warning: Option UseConcMarkSweepGC was deprecated in version 9.0 and will likely be removed in a future release.
   [junit4] <<< JVM J0: EOF ----

[...truncated 3 lines...]
   [junit4] JVM J1: stderr was not empty, see: /home/jenkins/workspace/Lucene-Solr-BadApples-7.x-Linux/lucene/build/analysis/nori/test/temp/junit4-J1-20180516_025848_9593189183945030124970.syserr
   [junit4] >>> JVM J1 emitted unexpected output (verbatim) ----
   [junit4] Java HotSpot(TM) 64-Bit Server VM warning: Option UseConcMarkSweepGC was deprecated in version 9.0 and will likely be removed in a future release.
   [junit4] <<< JVM J1: EOF ----

[...truncated 168 lines...]
   [junit4] JVM J2: stderr was not empty, see: /home/jenkins/workspace/Lucene-Solr-BadApples-7.x-Linux/lucene/build/analysis/opennlp/test/temp/junit4-J2-20180516_025856_32616252537816313124357.syserr
   [junit4] >>> JVM J2 emitted unexpected output (verbatim) ----
   [junit4] Java HotSpot(TM) 64-Bit Server VM warning: Option UseConcMarkSweepGC was deprecated in version 9.0 and will likely be removed in a future release.
   [junit4] <<< JVM J2: EOF ----

[...truncated 6 lines...]
   [junit4] JVM J0: stderr was not empty, see: /home/jenkins/workspace/Lucene-Solr-BadApples-7.x-Linux/lucene/build/analysis/opennlp/test/temp/junit4-J0-20180516_025856_32616163104560792174701.syserr
   [junit4] >>> JVM J0 emitted unexpected output (verbatim) ----
   [junit4] Java HotSpot(TM) 64-Bit Server VM warning: Option UseConcMarkSweepGC was deprecated in version 9.0 and will likely be removed in a future release.
   [junit4] <<< JVM J0: EOF ----

   [junit4] JVM J1: stderr was not empty, see: /home/jenkins/workspace/Lucene-Solr-BadApples-7.x-Linux/lucene/build/analysis/opennlp/test/temp/junit4-J1-20180516_025856_3265758152397752554818.syserr
   [junit4] >>> JVM J1 emitted unexpected output (verbatim) ----
   [junit4] Java HotSpot(TM) 64-Bit Server VM warning: Option UseConcMarkSweepGC was deprecated in version 9.0 and will likely be removed in a future release.
   [junit4] <<< JVM J1: EOF ----

[...truncated 176 lines...]
   [junit4] JVM J2: stderr was not empty, see: /home/jenkins/workspace/Lucene-Solr-BadApples-7.x-Linux/lucene/build/analysis/phonetic/test/temp/junit4-J2-20180516_025859_0243720708185807490868.syserr
   [junit4] >>> JVM J2 emitted unexpected output (verbatim) ----
   [junit4] Java HotSpot(TM) 64-Bit Server VM warning: Option UseConcMarkSweepGC was deprecated in version 9.0 and will likely be removed in a future release.
   [junit4] <<< JVM J2: EOF ----

[...truncated 3 lines...]
   [junit4] JVM J0: stderr was not empty, see: /home/jenkins/workspace/Lucene-Solr-BadApples-7.x-Linux/lucene/build/analysis/phonetic/test/temp/junit4-J0-20180516_025859_02315366388131051314876.syserr
   [junit4] >>> JVM J0 emitted unexpected output (verbatim) ----
   [junit4] Java HotSpot(TM) 64-Bit Server VM warning: Option UseConcMarkSweepGC was deprecated in version 9.0 and will likely be removed in a future release.
   [junit4] <<< JVM J0: EOF ----

[...truncated 3 lines...]
   [junit4] JVM J1: stderr was not empty, see: /home/jenkins/workspace/Lucene-Solr-BadApples-7.x-Linux/lucene/build/analysis/phonetic/test/temp/junit4-J1-20180516_025859_02415561237574402843802.syserr
   [junit4] >>> JVM J1 emitted unexpected output (verbatim) ----
   [junit4] Java HotSpot(TM) 64-Bit Server VM warning: Option UseConcMarkSweepGC was deprecated in version 9.0 and will likely be removed in a future release.
   [junit4] <<< JVM J1: EOF ----

[...truncated 161 lines...]
   [junit4] JVM J0: stderr was not empty, see: /home/jenkins/workspace/Lucene-Solr-BadApples-7.x-Linux/lucene/build/analysis/smartcn/test/temp/junit4-J0-20180516_025908_17812320116673289592023.syserr
   [junit4] >>> JVM J0 emitted unexpected output (verbatim) ----
   [junit4] Java HotSpot(TM) 64-Bit Server VM warning: Option UseConcMarkSweepGC was deprecated in version 9.0 and will likely be removed in a future release.
   [junit4] <<< JVM J0: EOF ----

[...truncated 3 lines...]
   [junit4] JVM J1: stderr was not empty, see: /home/jenkins/workspace/Lucene-Solr-BadApples-7.x-Linux/lucene/build/analysis/smartcn/test/temp/junit4-J1-20180516_025908_1785582305247668083563.syserr
   [junit4] >>> JVM J1 emitted unexpected output (verbatim) ----
   [junit4] Java HotSpot(TM) 64-Bit Server VM warning: Option UseConcMarkSweepGC was deprecated in version 9.0 and will likely be removed in a future release.
   [junit4] <<< JVM J1: EOF ----

[...truncated 162 lines...]
   [junit4] JVM J1: stderr was not empty, see: /home/jenkins/workspace/Lucene-Solr-BadApples-7.x-Linux/lucene/build/analysis/stempel/test/temp/junit4-J1-20180516_025916_0736395930571124595518.syserr
   [junit4] >>> JVM J1 emitted unexpected output (verbatim) ----
   [junit4] Java HotSpot(TM) 64-Bit Server VM warning: Option UseConcMarkSweepGC was deprecated in version 9.0 and will likely be removed in a future release.
   [junit4] <<< JVM J1: EOF ----

[...truncated 3 lines...]
   [junit4] JVM J2: stderr was not empty, see: /home/jenkins/workspace/Lucene-Solr-BadApples-7.x-Linux/lucene/build/analysis/stempel/test/temp/junit4-J2-20180516_025916_07318122900888187517522.syserr
   [junit4] >>> JVM J2 emitted unexpected output (verbatim) ----
   [junit4] Java HotSpot(TM) 64-Bit Server VM warning: Option UseConcMarkSweepGC was deprecated in version 9.0 and will likely be removed in a future release.
   [junit4] <<< JVM J2: EOF ----

[...truncated 3 lines...]
   [junit4] JVM J0: stderr was not empty, see: /home/jenkins/workspace/Lucene-Solr-BadApples-7.x-Linux/lucene/build/analysis/stempel/test/temp/junit4-J0-20180516_025916_07318027331600618653645.syserr
   [junit4] >>> JVM J0 emitted unexpected output (verbatim) ----
   [junit4] Java HotSpot(TM) 64-Bit Server VM warning: Option UseConcMarkSweepGC was deprecated in version 9.0 and will likely be removed in a future release.
   [junit4] <<< JVM J0: EOF ----

[...truncated 177 lines...]
   [junit4] JVM J0: stderr was not empty, see: /home/jenkins/workspace/Lucene-Solr-BadApples-7.x-Linux/lucene/build/analysis/uima/test/temp/junit4-J0-20180516_025918_7129011573089929313654.syserr
   [junit4] >>> JVM J0 emitted unexpected output (verbatim) ----
   [junit4] Java HotSpot(TM) 64-Bit Server VM warning: Option UseConcMarkSweepGC was deprecated in version 9.0 and will likely be removed in a future release.
   [junit4] <<< JVM J0: EOF ----

[...truncated 200 lines...]
   [junit4] JVM J1: stderr was not empty, see: /home/jenkins/workspace/Lucene-Solr-BadApples-7.x-Linux/lucene/build/backward-codecs/test/temp/junit4-J1-20180516_025926_38712237205567720520228.syserr
   [junit4] >>> JVM J1 emitted unexpected output (verbatim) ----
   [junit4] Java HotSpot(TM) 64-Bit Server VM warning: Option UseConcMarkSweepGC was deprecated in version 9.0 and will likely be removed in a future release.
   [junit4] <<< JVM J1: EOF ----

[...truncated 22 lines...]
   [junit4] JVM J0: stderr was not empty, see: /home/jenkins/workspace/Lucene-Solr-BadApples-7.x-Linux/lucene/build/backward-codecs/test/temp/junit4-J0-20180516_025926_3877864093716515401683.syserr
   [junit4] >>> JVM J0 emitted unexpected output (verbatim) ----
   [junit4] Java HotSpot(TM) 64-Bit Server VM warning: Option UseConcMarkSweepGC was deprecated in version 9.0 and will likely be removed in a future release.
   [junit4] <<< JVM J0: EOF ----

[...truncated 9 lines...]
   [junit4] JVM J2: stderr was not empty, see: /home/jenkins/workspace/Lucene-Solr-BadApples-7.x-Linux/lucene/build/backward-codecs/test/temp/junit4-J2-20180516_025926_38710118339882026916639.syserr
   [junit4] >>> JVM J2 emitted unexpected output (verbatim) ----
   [junit4] Java HotSpot(TM) 64-Bit Server VM warning: Option UseConcMarkSweepGC was deprecated in version 9.0 and will likely be removed in a future release.
   [junit4] <<< JVM J2: EOF ----

[...truncated 1404 lines...]
   [junit4] JVM J2: stderr was not empty, see: /home/jenkins/workspace/Lucene-Solr-BadApples-7.x-Linux/lucene/build/benchmark/test/temp/junit4-J2-20180516_030124_6086535291072341435619.syserr
   [junit4] >>> JVM J2 emitted unexpected output (verbatim) ----
   [junit4] Java HotSpot(TM) 64-Bit Server VM warning: Option UseConcMarkSweepGC was deprecated in version 9.0 and will likely be removed in a future release.
   [junit4] <<< JVM J2: EOF ----

   [junit4] JVM J1: stderr was not empty, see: /home/jenkins/workspace/Lucene-Solr-BadApples-7.x-Linux/lucene/build/benchmark/test/temp/junit4-J1-20180516_030124_60813669309484857003627.syserr
   [junit4] >>> JVM J1 emitted unexpected output (verbatim) ----
   [junit4] Java HotSpot(TM) 64-Bit Server VM warning: Option UseConcMarkSweepGC was deprecated in version 9.0 and will likely be removed in a future release.
   [junit4] <<< JVM J1: EOF ----

[...truncated 3 lines...]
   [junit4] JVM J0: stderr was not empty, see: /home/jenkins/workspace/Lucene-Solr-BadApples-7.x-Linux/lucene/build/benchmark/test/temp/junit4-J0-20180516_030124_60816529263409184874487.syserr
   [junit4] >>> JVM J0 emitted unexpected output (verbatim) ----
   [junit4] Java HotSpot(TM) 64-Bit Server VM warning: Option UseConcMarkSweepGC was deprecated in version 9.0 and will likely be removed in a future release.
   [junit4] <<< JVM J0: EOF ----

[...truncated 256 lines...]
   [junit4] JVM J1: stderr was not empty, see: /home/jenkins/workspace/Lucene-Solr-BadApples-7.x-Linux/lucene/build/classification/test/temp/junit4-J1-20180516_030132_02414297523684215256220.syserr
   [junit4] >>> JVM J1 emitted unexpected output (verbatim) ----
   [junit4] Java HotSpot(TM) 64-Bit Server VM warning: Option UseConcMarkSweepGC was deprecated in version 9.0 and will likely be removed in a future release.
   [junit4] <<< JVM J1: EOF ----

   [junit4] JVM J2: stderr was not empty, see: /home/jenkins/workspace/Lucene-Solr-BadApples-7.x-Linux/lucene/build/classification/test/temp/junit4-J2-20180516_030132_025215123410218429020.syserr
   [junit4] >>> JVM J2 emitted unexpected output (verbatim) ----
   [junit4] Java HotSpot(TM) 64-Bit Server VM warning: Option UseConcMarkSweepGC was deprecated in version 9.0 and will likely be removed in a future release.
   [junit4] <<< JVM J2: EOF ----

   [junit4] JVM J0: stderr was not empty, see: /home/jenkins/workspace/Lucene-Solr-BadApples-7.x-Linux/lucene/build/classification/test/temp/junit4-J0-20180516_030132_0244588165476808856153.syserr
   [junit4] >>> JVM J0 emitted unexpected output (verbatim) ----
   [junit4] Java HotSpot(TM) 64-Bit Server VM warning: Option UseConcMarkSweepGC was deprecated in version 9.0 and will likely be removed in a future release.
   [junit4] <<< JVM J0: EOF ----

[...truncated 267 lines...]
   [junit4] JVM J1: stderr was not empty, see: /home/jenkins/workspace/Lucene-Solr-BadApples-7.x-Linux/lucene/build/codecs/test/temp/junit4-J1-20180516_030142_66114608044005213838864.syserr
   [junit4] >>> JVM J1 emitted unexpected output (verbatim) ----
   [junit4] Java HotSpot(TM) 64-Bit Server VM warning: Option UseConcMarkSweepGC was deprecated in version 9.0 and will likely be removed in a future release.
   [junit4] <<< JVM J1: EOF ----

[...truncated 3 lines...]
   [junit4] JVM J2: stderr was not empty, see: /home/jenkins/workspace/Lucene-Solr-BadApples-7.x-Linux/lucene/build/codecs/test/temp/junit4-J2-20180516_030142_66116458048120691495543.syserr
   [junit4] >>> JVM J2 emitted unexpected output (verbatim) ----
   [junit4] Java HotSpot(TM) 64-Bit Server VM warning: Option UseConcMarkSweepGC was deprecated in version 9.0 and will likely be removed in a future release.
   [junit4] <<< JVM J2: EOF ----

[...truncated 3 lines...]
   [junit4] JVM J0: stderr was not empty, see: /home/jenkins/workspace/Lucene-Solr-BadApples-7.x-Linux/lucene/build/codecs/test/temp/junit4-J0-20180516_030142_661923306597398373202.syserr
   [junit4] >>> JVM J0 emitted unexpected output (verbatim) ----
   [junit4] Java HotSpot(TM) 64-Bit Server VM warning: Option UseConcMarkSweepGC was deprecated in version 9.0 and will likely be removed in a future release.
   [junit4] <<< JVM J0: EOF ----

[...truncated 129 lines...]
    [javac] Compiling 13 source files to /home/jenkins/workspace/Lucene-Solr-BadApples-7.x-Linux/lucene/build/demo/classes/java
    [javac] /home/jenkins/workspace/Lucene-Solr-BadApples-7.x-Linux/lucene/demo/src/java/org/apache/lucene/demo/xmlparser/FormBasedXmlQueryDemo.java:46: error: cannot find symbol
    [javac] import org.apache.lucene.queryparser.xml.QueryTemplateManager;
    [javac]                                         ^
    [javac]   symbol:   class QueryTemplateManager
    [javac]   location: package org.apache.lucene.queryparser.xml
    [javac] /home/jenkins/workspace/Lucene-Solr-BadApples-7.x-Linux/lucene/demo/src/java/org/apache/lucene/demo/xmlparser/FormBasedXmlQueryDemo.java:61: error: cannot find symbol
    [javac]   private QueryTemplateManager queryTemplateManager;
    [javac]           ^
    [javac]   symbol:   class QueryTemplateManager
    [javac]   location: class FormBasedXmlQueryDemo
    [javac] /home/jenkins/workspace/Lucene-Solr-BadApples-7.x-Linux/lucene/demo/src/java/org/apache/lucene/demo/xmlparser/FormBasedXmlQueryDemo.java:81: error: cannot find symbol
    [javac]       queryTemplateManager = new QueryTemplateManager(
    [javac]                                  ^
    [javac]   symbol:   class QueryTemplateManager
    [javac]   location: class FormBasedXmlQueryDemo
    [javac] 3 errors

BUILD FAILED
/home/jenkins/workspace/Lucene-Solr-BadApples-7.x-Linux/build.xml:642: The following error occurred while executing this line:
/home/jenkins/workspace/Lucene-Solr-BadApples-7.x-Linux/build.xml:577: The following error occurred while executing this line:
/home/jenkins/workspace/Lucene-Solr-BadApples-7.x-Linux/build.xml:59: The following error occurred while executing this line:
/home/jenkins/workspace/Lucene-Solr-BadApples-7.x-Linux/lucene/build.xml:493: The following error occurred while executing this line:
/home/jenkins/workspace/Lucene-Solr-BadApples-7.x-Linux/lucene/common-build.xml:2264: The following error occurred while executing this line:
/home/jenkins/workspace/Lucene-Solr-BadApples-7.x-Linux/lucene/module-build.xml:67: The following error occurred while executing this line:
/home/jenkins/workspace/Lucene-Solr-BadApples-7.x-Linux/lucene/module-build.xml:64: The following error occurred while executing this line:
/home/jenkins/workspace/Lucene-Solr-BadApples-7.x-Linux/lucene/common-build.xml:551: The following error occurred while executing this line:
/home/jenkins/workspace/Lucene-Solr-BadApples-7.x-Linux/lucene/common-build.xml:2052: Compile failed; see the compiler error output for details.
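The compile failure itself is in lucene/demo: javac cannot resolve org.apache.lucene.queryparser.xml.QueryTemplateManager at lines 46, 61 and 81 of FormBasedXmlQueryDemo.java, which suggests the queryparser xml classes were not visible on the demo module's compile classpath for this run. A minimal, hypothetical probe (not part of the Lucene build) for checking whether that class is reachable from a given classpath:

    public class ClasspathCheck {
        public static void main(String[] args) {
            // Succeeds only if the lucene-queryparser jar (which provides the
            // org.apache.lucene.queryparser.xml package) is on the classpath.
            try {
                Class.forName("org.apache.lucene.queryparser.xml.QueryTemplateManager");
                System.out.println("QueryTemplateManager found");
            } catch (ClassNotFoundException e) {
                System.out.println("QueryTemplateManager missing: " + e.getMessage());
            }
        }
    }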

Total time: 13 minutes 31 seconds
Build step 'Invoke Ant' marked build as failure
Archiving artifacts
Setting ANT_1_8_2_HOME=/var/lib/jenkins/tools/hudson.tasks.Ant_AntInstallation/ANT_1.8.2
[WARNINGS] Skipping publisher since build result is FAILURE
Recording test results
Setting ANT_1_8_2_HOME=/var/lib/jenkins/tools/hudson.tasks.Ant_AntInstallation/ANT_1.8.2
Email was triggered for: Failure - Any
Sending email for trigger: Failure - Any
Setting ANT_1_8_2_HOME=/var/lib/jenkins/tools/hudson.tasks.Ant_AntInstallation/ANT_1.8.2
Setting ANT_1_8_2_HOME=/var/lib/jenkins/tools/hudson.tasks.Ant_AntInstallation/ANT_1.8.2
Setting ANT_1_8_2_HOME=/var/lib/jenkins/tools/hudson.tasks.Ant_AntInstallation/ANT_1.8.2
Setting ANT_1_8_2_HOME=/var/lib/jenkins/tools/hudson.tasks.Ant_AntInstallation/ANT_1.8.2

[JENKINS] Lucene-Solr-BadApples-7.x-Linux (64bit/jdk1.8.0_162) - Build # 38 - Still unstable!

Posted by Policeman Jenkins Server <je...@thetaphi.de>.
Build: https://jenkins.thetaphi.de/job/Lucene-Solr-BadApples-7.x-Linux/38/
Java: 64bit/jdk1.8.0_162 -XX:-UseCompressedOops -XX:+UseParallelGC

1 tests failed.
FAILED:  org.apache.solr.cloud.SSLMigrationTest.test

Error Message:
Replica didn't have the proper urlScheme in the ClusterState

Stack Trace:
java.lang.AssertionError: Replica didn't have the proper urlScheme in the ClusterState
	at __randomizedtesting.SeedInfo.seed([D43070E746C4A5D:851738D4DA9027A5]:0)
	at org.junit.Assert.fail(Assert.java:93)
	at org.junit.Assert.assertTrue(Assert.java:43)
	at org.apache.solr.cloud.SSLMigrationTest.assertReplicaInformation(SSLMigrationTest.java:104)
	at org.apache.solr.cloud.SSLMigrationTest.testMigrateSSL(SSLMigrationTest.java:97)
	at org.apache.solr.cloud.SSLMigrationTest.test(SSLMigrationTest.java:61)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at com.carrotsearch.randomizedtesting.RandomizedRunner.invoke(RandomizedRunner.java:1737)
	at com.carrotsearch.randomizedtesting.RandomizedRunner$8.evaluate(RandomizedRunner.java:934)
	at com.carrotsearch.randomizedtesting.RandomizedRunner$9.evaluate(RandomizedRunner.java:970)
	at com.carrotsearch.randomizedtesting.RandomizedRunner$10.evaluate(RandomizedRunner.java:984)
	at org.apache.solr.BaseDistributedSearchTestCase$ShardsRepeatRule$ShardsFixedStatement.callStatement(BaseDistributedSearchTestCase.java:993)
	at org.apache.solr.BaseDistributedSearchTestCase$ShardsRepeatRule$ShardsStatement.evaluate(BaseDistributedSearchTestCase.java:968)
	at com.carrotsearch.randomizedtesting.rules.SystemPropertiesRestoreRule$1.evaluate(SystemPropertiesRestoreRule.java:57)
	at org.apache.lucene.util.TestRuleSetupTeardownChained$1.evaluate(TestRuleSetupTeardownChained.java:49)
	at org.apache.lucene.util.AbstractBeforeAfterRule$1.evaluate(AbstractBeforeAfterRule.java:45)
	at org.apache.lucene.util.TestRuleThreadAndTestName$1.evaluate(TestRuleThreadAndTestName.java:48)
	at org.apache.lucene.util.TestRuleIgnoreAfterMaxFailures$1.evaluate(TestRuleIgnoreAfterMaxFailures.java:64)
	at org.apache.lucene.util.TestRuleMarkFailure$1.evaluate(TestRuleMarkFailure.java:47)
	at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
	at com.carrotsearch.randomizedtesting.ThreadLeakControl$StatementRunner.run(ThreadLeakControl.java:368)
	at com.carrotsearch.randomizedtesting.ThreadLeakControl.forkTimeoutingTask(ThreadLeakControl.java:817)
	at com.carrotsearch.randomizedtesting.ThreadLeakControl$3.evaluate(ThreadLeakControl.java:468)
	at com.carrotsearch.randomizedtesting.RandomizedRunner.runSingleTest(RandomizedRunner.java:943)
	at com.carrotsearch.randomizedtesting.RandomizedRunner$5.evaluate(RandomizedRunner.java:829)
	at com.carrotsearch.randomizedtesting.RandomizedRunner$6.evaluate(RandomizedRunner.java:879)
	at com.carrotsearch.randomizedtesting.RandomizedRunner$7.evaluate(RandomizedRunner.java:890)
	at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
	at com.carrotsearch.randomizedtesting.rules.SystemPropertiesRestoreRule$1.evaluate(SystemPropertiesRestoreRule.java:57)
	at org.apache.lucene.util.AbstractBeforeAfterRule$1.evaluate(AbstractBeforeAfterRule.java:45)
	at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
	at org.apache.lucene.util.TestRuleStoreClassName$1.evaluate(TestRuleStoreClassName.java:41)
	at com.carrotsearch.randomizedtesting.rules.NoShadowingOrOverridesOnMethodsRule$1.evaluate(NoShadowingOrOverridesOnMethodsRule.java:40)
	at com.carrotsearch.randomizedtesting.rules.NoShadowingOrOverridesOnMethodsRule$1.evaluate(NoShadowingOrOverridesOnMethodsRule.java:40)
	at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
	at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
	at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
	at org.apache.lucene.util.TestRuleAssertionsRequired$1.evaluate(TestRuleAssertionsRequired.java:53)
	at org.apache.lucene.util.TestRuleMarkFailure$1.evaluate(TestRuleMarkFailure.java:47)
	at org.apache.lucene.util.TestRuleIgnoreAfterMaxFailures$1.evaluate(TestRuleIgnoreAfterMaxFailures.java:64)
	at org.apache.lucene.util.TestRuleIgnoreTestSuites$1.evaluate(TestRuleIgnoreTestSuites.java:54)
	at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
	at com.carrotsearch.randomizedtesting.ThreadLeakControl$StatementRunner.run(ThreadLeakControl.java:368)
	at java.lang.Thread.run(Thread.java:748)
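The assertion at SSLMigrationTest.assertReplicaInformation (SSLMigrationTest.java:104) fails because at least one replica's base_url in the ClusterState does not use the urlScheme the test just migrated the cluster to. A rough, hypothetical sketch of that kind of check, using SolrJ cloud types (this is not the actual test source; the real test's property handling may differ):

    import java.util.List;
    import org.apache.solr.common.cloud.ClusterState;
    import org.apache.solr.common.cloud.DocCollection;
    import org.apache.solr.common.cloud.Replica;
    import org.apache.solr.common.cloud.ZkStateReader;
    import static org.junit.Assert.assertTrue;

    class ReplicaUrlSchemeCheck {
        // Asserts every replica registered in the ClusterState advertises a base_url
        // on the expected scheme (http vs https).
        static void assertReplicasUseScheme(ClusterState clusterState, String urlScheme) {
            for (DocCollection collection : clusterState.getCollectionsMap().values()) {
                List<Replica> replicas = collection.getReplicas();
                for (Replica replica : replicas) {
                    String baseUrl = replica.getStr(ZkStateReader.BASE_URL_PROP);
                    assertTrue("Replica didn't have the proper urlScheme in the ClusterState: " + baseUrl,
                            baseUrl != null && baseUrl.startsWith(urlScheme + "://"));
                }
            }
        }
    }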

Build Log:
[...truncated 13248 lines...]
   [junit4] Suite: org.apache.solr.cloud.SSLMigrationTest
   [junit4]   2> Creating dataDir: /home/jenkins/workspace/Lucene-Solr-BadApples-7.x-Linux/solr/build/solr-core/test/J1/temp/solr.cloud.SSLMigrationTest_D43070E746C4A5D-001/init-core-data-001
   [junit4]   2> 969886 WARN  (SUITE-SSLMigrationTest-seed#[D43070E746C4A5D]-worker) [    ] o.a.s.SolrTestCaseJ4 startTrackingSearchers: numOpens=23 numCloses=23
   [junit4]   2> 969886 INFO  (SUITE-SSLMigrationTest-seed#[D43070E746C4A5D]-worker) [    ] o.a.s.SolrTestCaseJ4 Using PointFields (NUMERIC_POINTS_SYSPROP=true) w/NUMERIC_DOCVALUES_SYSPROP=true
   [junit4]   2> 969887 INFO  (SUITE-SSLMigrationTest-seed#[D43070E746C4A5D]-worker) [    ] o.a.s.SolrTestCaseJ4 Randomized ssl (false) and clientAuth (false) via: @org.apache.solr.SolrTestCaseJ4$SuppressSSL(bugUrl=None)
   [junit4]   2> 969887 INFO  (SUITE-SSLMigrationTest-seed#[D43070E746C4A5D]-worker) [    ] o.a.s.SolrTestCaseJ4 SecureRandom sanity checks: test.solr.allowed.securerandom=null & java.security.egd=file:/dev/./urandom
   [junit4]   2> 969887 INFO  (SUITE-SSLMigrationTest-seed#[D43070E746C4A5D]-worker) [    ] o.a.s.BaseDistributedSearchTestCase Setting hostContext system property: /
   [junit4]   2> 969888 INFO  (TEST-SSLMigrationTest.test-seed#[D43070E746C4A5D]) [    ] o.a.s.c.ZkTestServer STARTING ZK TEST SERVER
   [junit4]   2> 969888 INFO  (Thread-2183) [    ] o.a.s.c.ZkTestServer client port:0.0.0.0/0.0.0.0:0
   [junit4]   2> 969888 INFO  (Thread-2183) [    ] o.a.s.c.ZkTestServer Starting server
   [junit4]   2> 969889 ERROR (Thread-2183) [    ] o.a.z.s.ZooKeeperServer ZKShutdownHandler is not registered, so ZooKeeper server won't take any action on ERROR or SHUTDOWN server state changes
   [junit4]   2> 969988 INFO  (TEST-SSLMigrationTest.test-seed#[D43070E746C4A5D]) [    ] o.a.s.c.ZkTestServer start zk server on port:34819
   [junit4]   2> 969990 INFO  (zkConnectionManagerCallback-2262-thread-1) [    ] o.a.s.c.c.ConnectionManager zkClient has connected
   [junit4]   2> 969991 INFO  (zkConnectionManagerCallback-2264-thread-1) [    ] o.a.s.c.c.ConnectionManager zkClient has connected
   [junit4]   2> 969993 INFO  (TEST-SSLMigrationTest.test-seed#[D43070E746C4A5D]) [    ] o.a.s.c.AbstractZkTestCase put /home/jenkins/workspace/Lucene-Solr-BadApples-7.x-Linux/solr/core/src/test-files/solr/collection1/conf/solrconfig-tlog.xml to /configs/conf1/solrconfig.xml
   [junit4]   2> 969993 INFO  (TEST-SSLMigrationTest.test-seed#[D43070E746C4A5D]) [    ] o.a.s.c.AbstractZkTestCase put /home/jenkins/workspace/Lucene-Solr-BadApples-7.x-Linux/solr/core/src/test-files/solr/collection1/conf/schema.xml to /configs/conf1/schema.xml
   [junit4]   2> 969994 INFO  (TEST-SSLMigrationTest.test-seed#[D43070E746C4A5D]) [    ] o.a.s.c.AbstractZkTestCase put /home/jenkins/workspace/Lucene-Solr-BadApples-7.x-Linux/solr/core/src/test-files/solr/collection1/conf/solrconfig.snippet.randomindexconfig.xml to /configs/conf1/solrconfig.snippet.randomindexconfig.xml
   [junit4]   2> 969994 INFO  (TEST-SSLMigrationTest.test-seed#[D43070E746C4A5D]) [    ] o.a.s.c.AbstractZkTestCase put /home/jenkins/workspace/Lucene-Solr-BadApples-7.x-Linux/solr/core/src/test-files/solr/collection1/conf/stopwords.txt to /configs/conf1/stopwords.txt
   [junit4]   2> 969995 INFO  (TEST-SSLMigrationTest.test-seed#[D43070E746C4A5D]) [    ] o.a.s.c.AbstractZkTestCase put /home/jenkins/workspace/Lucene-Solr-BadApples-7.x-Linux/solr/core/src/test-files/solr/collection1/conf/protwords.txt to /configs/conf1/protwords.txt
   [junit4]   2> 969995 INFO  (TEST-SSLMigrationTest.test-seed#[D43070E746C4A5D]) [    ] o.a.s.c.AbstractZkTestCase put /home/jenkins/workspace/Lucene-Solr-BadApples-7.x-Linux/solr/core/src/test-files/solr/collection1/conf/currency.xml to /configs/conf1/currency.xml
   [junit4]   2> 969995 INFO  (TEST-SSLMigrationTest.test-seed#[D43070E746C4A5D]) [    ] o.a.s.c.AbstractZkTestCase put /home/jenkins/workspace/Lucene-Solr-BadApples-7.x-Linux/solr/core/src/test-files/solr/collection1/conf/enumsConfig.xml to /configs/conf1/enumsConfig.xml
   [junit4]   2> 969996 INFO  (TEST-SSLMigrationTest.test-seed#[D43070E746C4A5D]) [    ] o.a.s.c.AbstractZkTestCase put /home/jenkins/workspace/Lucene-Solr-BadApples-7.x-Linux/solr/core/src/test-files/solr/collection1/conf/open-exchange-rates.json to /configs/conf1/open-exchange-rates.json
   [junit4]   2> 969996 INFO  (TEST-SSLMigrationTest.test-seed#[D43070E746C4A5D]) [    ] o.a.s.c.AbstractZkTestCase put /home/jenkins/workspace/Lucene-Solr-BadApples-7.x-Linux/solr/core/src/test-files/solr/collection1/conf/mapping-ISOLatin1Accent.txt to /configs/conf1/mapping-ISOLatin1Accent.txt
   [junit4]   2> 969996 INFO  (TEST-SSLMigrationTest.test-seed#[D43070E746C4A5D]) [    ] o.a.s.c.AbstractZkTestCase put /home/jenkins/workspace/Lucene-Solr-BadApples-7.x-Linux/solr/core/src/test-files/solr/collection1/conf/old_synonyms.txt to /configs/conf1/old_synonyms.txt
   [junit4]   2> 969997 INFO  (TEST-SSLMigrationTest.test-seed#[D43070E746C4A5D]) [    ] o.a.s.c.AbstractZkTestCase put /home/jenkins/workspace/Lucene-Solr-BadApples-7.x-Linux/solr/core/src/test-files/solr/collection1/conf/synonyms.txt to /configs/conf1/synonyms.txt
   [junit4]   2> 969997 INFO  (TEST-SSLMigrationTest.test-seed#[D43070E746C4A5D]) [    ] o.a.s.c.AbstractFullDistribZkTestBase Will use NRT replicas unless explicitly asked otherwise
   [junit4]   2> 970069 INFO  (TEST-SSLMigrationTest.test-seed#[D43070E746C4A5D]) [    ] o.e.j.s.Server jetty-9.4.10.v20180503; built: 2018-05-03T15:56:21.710Z; git: daa59876e6f384329b122929e70a80934569428c; jvm 1.8.0_162-b12
   [junit4]   2> 970069 INFO  (TEST-SSLMigrationTest.test-seed#[D43070E746C4A5D]) [    ] o.e.j.s.session DefaultSessionIdManager workerName=node0
   [junit4]   2> 970069 INFO  (TEST-SSLMigrationTest.test-seed#[D43070E746C4A5D]) [    ] o.e.j.s.session No SessionScavenger set, using defaults
   [junit4]   2> 970069 INFO  (TEST-SSLMigrationTest.test-seed#[D43070E746C4A5D]) [    ] o.e.j.s.session node0 Scavenging every 660000ms
   [junit4]   2> 970069 INFO  (TEST-SSLMigrationTest.test-seed#[D43070E746C4A5D]) [    ] o.e.j.s.h.ContextHandler Started o.e.j.s.ServletContextHandler@7dde1350{/,null,AVAILABLE}
   [junit4]   2> 970070 INFO  (TEST-SSLMigrationTest.test-seed#[D43070E746C4A5D]) [    ] o.e.j.s.AbstractConnector Started ServerConnector@172e12cc{HTTP/1.1,[http/1.1]}{127.0.0.1:34077}
   [junit4]   2> 970070 INFO  (TEST-SSLMigrationTest.test-seed#[D43070E746C4A5D]) [    ] o.e.j.s.Server Started @970096ms
   [junit4]   2> 970070 INFO  (TEST-SSLMigrationTest.test-seed#[D43070E746C4A5D]) [    ] o.a.s.c.s.e.JettySolrRunner Jetty properties: {solr.data.dir=/home/jenkins/workspace/Lucene-Solr-BadApples-7.x-Linux/solr/build/solr-core/test/J1/temp/solr.cloud.SSLMigrationTest_D43070E746C4A5D-001/tempDir-001/control/data, hostContext=/, hostPort=34077, coreRootDirectory=/home/jenkins/workspace/Lucene-Solr-BadApples-7.x-Linux/solr/build/solr-core/test/J1/../../../../../../../../../home/jenkins/workspace/Lucene-Solr-BadApples-7.x-Linux/solr/build/solr-core/test/J1/temp/solr.cloud.SSLMigrationTest_D43070E746C4A5D-001/control-001/cores}
   [junit4]   2> 970070 ERROR (TEST-SSLMigrationTest.test-seed#[D43070E746C4A5D]) [    ] o.a.s.u.StartupLoggingUtils Missing Java Option solr.log.dir. Logging may be missing or incomplete.
   [junit4]   2> 970070 INFO  (TEST-SSLMigrationTest.test-seed#[D43070E746C4A5D]) [    ] o.a.s.s.SolrDispatchFilter Using logger factory org.apache.logging.slf4j.Log4jLoggerFactory
   [junit4]   2> 970070 INFO  (TEST-SSLMigrationTest.test-seed#[D43070E746C4A5D]) [    ] o.a.s.s.SolrDispatchFilter  ___      _       Welcome to Apache Solr™ version 7.4.0
   [junit4]   2> 970070 INFO  (TEST-SSLMigrationTest.test-seed#[D43070E746C4A5D]) [    ] o.a.s.s.SolrDispatchFilter / __| ___| |_ _   Starting in cloud mode on port null
   [junit4]   2> 970070 INFO  (TEST-SSLMigrationTest.test-seed#[D43070E746C4A5D]) [    ] o.a.s.s.SolrDispatchFilter \__ \/ _ \ | '_|  Install dir: null
   [junit4]   2> 970070 INFO  (TEST-SSLMigrationTest.test-seed#[D43070E746C4A5D]) [    ] o.a.s.s.SolrDispatchFilter |___/\___/_|_|    Start time: 2018-05-18T06:43:51.563Z
   [junit4]   2> 970071 INFO  (zkConnectionManagerCallback-2266-thread-1) [    ] o.a.s.c.c.ConnectionManager zkClient has connected
   [junit4]   2> 970072 INFO  (TEST-SSLMigrationTest.test-seed#[D43070E746C4A5D]) [    ] o.a.s.s.SolrDispatchFilter Loading solr.xml from SolrHome (not found in ZooKeeper)
   [junit4]   2> 970072 INFO  (TEST-SSLMigrationTest.test-seed#[D43070E746C4A5D]) [    ] o.a.s.c.SolrXmlConfig Loading container configuration from /home/jenkins/workspace/Lucene-Solr-BadApples-7.x-Linux/solr/build/solr-core/test/J1/temp/solr.cloud.SSLMigrationTest_D43070E746C4A5D-001/control-001/solr.xml
   [junit4]   2> 970074 INFO  (TEST-SSLMigrationTest.test-seed#[D43070E746C4A5D]) [    ] o.a.s.c.SolrXmlConfig Configuration parameter autoReplicaFailoverWorkLoopDelay is ignored
   [junit4]   2> 970074 INFO  (TEST-SSLMigrationTest.test-seed#[D43070E746C4A5D]) [    ] o.a.s.c.SolrXmlConfig Configuration parameter autoReplicaFailoverBadNodeExpiration is ignored
   [junit4]   2> 970075 INFO  (TEST-SSLMigrationTest.test-seed#[D43070E746C4A5D]) [    ] o.a.s.c.SolrXmlConfig MBean server found: com.sun.jmx.mbeanserver.JmxMBeanServer@6a58c411, but no JMX reporters were configured - adding default JMX reporter.
   [junit4]   2> 970077 INFO  (TEST-SSLMigrationTest.test-seed#[D43070E746C4A5D]) [    ] o.a.s.c.ZkContainer Zookeeper client=127.0.0.1:34819/solr
   [junit4]   2> 970078 INFO  (zkConnectionManagerCallback-2270-thread-1) [    ] o.a.s.c.c.ConnectionManager zkClient has connected
   [junit4]   2> 970083 INFO  (zkConnectionManagerCallback-2272-thread-1) [    ] o.a.s.c.c.ConnectionManager zkClient has connected
   [junit4]   2> 970124 INFO  (TEST-SSLMigrationTest.test-seed#[D43070E746C4A5D]) [n:127.0.0.1:34077_    ] o.a.s.c.Overseer Overseer (id=null) closing
   [junit4]   2> 970125 INFO  (TEST-SSLMigrationTest.test-seed#[D43070E746C4A5D]) [n:127.0.0.1:34077_    ] o.a.s.c.OverseerElectionContext I am going to be the leader 127.0.0.1:34077_
   [junit4]   2> 970125 INFO  (TEST-SSLMigrationTest.test-seed#[D43070E746C4A5D]) [n:127.0.0.1:34077_    ] o.a.s.c.Overseer Overseer (id=72181576636694532-127.0.0.1:34077_-n_0000000000) starting
   [junit4]   2> 970149 INFO  (zkConnectionManagerCallback-2279-thread-1) [    ] o.a.s.c.c.ConnectionManager zkClient has connected
   [junit4]   2> 970150 INFO  (TEST-SSLMigrationTest.test-seed#[D43070E746C4A5D]) [n:127.0.0.1:34077_    ] o.a.s.c.s.i.ZkClientClusterStateProvider Cluster at 127.0.0.1:34819/solr ready
   [junit4]   2> 970153 INFO  (TEST-SSLMigrationTest.test-seed#[D43070E746C4A5D]) [n:127.0.0.1:34077_    ] o.a.s.c.ZkController Register node as live in ZooKeeper:/live_nodes/127.0.0.1:34077_
   [junit4]   2> 970159 INFO  (zkCallback-2278-thread-1) [    ] o.a.s.c.c.ZkStateReader Updated live nodes from ZooKeeper... (0) -> (1)
   [junit4]   2> 970171 INFO  (zkCallback-2271-thread-1) [    ] o.a.s.c.c.ZkStateReader Updated live nodes from ZooKeeper... (0) -> (1)
   [junit4]   2> 970464 INFO  (TEST-SSLMigrationTest.test-seed#[D43070E746C4A5D]) [n:127.0.0.1:34077_    ] o.a.s.m.r.SolrJmxReporter JMX monitoring for 'solr.node' (registry 'solr.node') enabled at server: com.sun.jmx.mbeanserver.JmxMBeanServer@6a58c411
   [junit4]   2> 970470 INFO  (TEST-SSLMigrationTest.test-seed#[D43070E746C4A5D]) [n:127.0.0.1:34077_    ] o.a.s.m.r.SolrJmxReporter JMX monitoring for 'solr.jvm' (registry 'solr.jvm') enabled at server: com.sun.jmx.mbeanserver.JmxMBeanServer@6a58c411
   [junit4]   2> 970470 INFO  (TEST-SSLMigrationTest.test-seed#[D43070E746C4A5D]) [n:127.0.0.1:34077_    ] o.a.s.m.r.SolrJmxReporter JMX monitoring for 'solr.jetty' (registry 'solr.jetty') enabled at server: com.sun.jmx.mbeanserver.JmxMBeanServer@6a58c411
   [junit4]   2> 970471 INFO  (TEST-SSLMigrationTest.test-seed#[D43070E746C4A5D]) [n:127.0.0.1:34077_    ] o.a.s.c.CorePropertiesLocator Found 0 core definitions underneath /home/jenkins/workspace/Lucene-Solr-BadApples-7.x-Linux/solr/build/solr-core/test/J1/../../../../../../../../../home/jenkins/workspace/Lucene-Solr-BadApples-7.x-Linux/solr/build/solr-core/test/J1/temp/solr.cloud.SSLMigrationTest_D43070E746C4A5D-001/control-001/cores
   [junit4]   2> 970487 INFO  (zkConnectionManagerCallback-2284-thread-1) [    ] o.a.s.c.c.ConnectionManager zkClient has connected
   [junit4]   2> 970488 INFO  (TEST-SSLMigrationTest.test-seed#[D43070E746C4A5D]) [    ] o.a.s.c.c.ZkStateReader Updated live nodes from ZooKeeper... (0) -> (1)
   [junit4]   2> 970488 INFO  (TEST-SSLMigrationTest.test-seed#[D43070E746C4A5D]) [    ] o.a.s.c.s.i.ZkClientClusterStateProvider Cluster at 127.0.0.1:34819/solr ready
   [junit4]   2> 970489 INFO  (qtp1161571820-9610) [n:127.0.0.1:34077_    ] o.a.s.h.a.CollectionsHandler Invoked Collection Action :create with params replicationFactor=1&collection.configName=conf1&name=control_collection&nrtReplicas=1&action=CREATE&numShards=1&createNodeSet=127.0.0.1:34077_&wt=javabin&version=2 and sendToOCPQueue=true
   [junit4]   2> 970490 INFO  (OverseerThreadFactory-3063-thread-1) [    ] o.a.s.c.a.c.CreateCollectionCmd Create collection control_collection
   [junit4]   2> 970600 INFO  (qtp1161571820-9605) [n:127.0.0.1:34077_    x:control_collection_shard1_replica_n1] o.a.s.h.a.CoreAdminOperation core create command qt=/admin/cores&collection.configName=conf1&newCollection=true&name=control_collection_shard1_replica_n1&action=CREATE&numShards=1&collection=control_collection&shard=shard1&wt=javabin&version=2&replicaType=NRT
   [junit4]   2> 970601 INFO  (qtp1161571820-9605) [n:127.0.0.1:34077_    x:control_collection_shard1_replica_n1] o.a.s.c.TransientSolrCoreCacheDefault Allocating transient cache for 4 transient cores
   [junit4]   2> 971862 INFO  (qtp1161571820-9605) [n:127.0.0.1:34077_ c:control_collection s:shard1  x:control_collection_shard1_replica_n1] o.a.s.c.SolrConfig Using Lucene MatchVersion: 7.4.0
   [junit4]   2> 971873 INFO  (qtp1161571820-9605) [n:127.0.0.1:34077_ c:control_collection s:shard1  x:control_collection_shard1_replica_n1] o.a.s.s.IndexSchema [control_collection_shard1_replica_n1] Schema name=test
   [junit4]   2> 971949 INFO  (qtp1161571820-9605) [n:127.0.0.1:34077_ c:control_collection s:shard1  x:control_collection_shard1_replica_n1] o.a.s.s.IndexSchema Loaded schema test/1.0 with uniqueid field id
   [junit4]   2> 971965 INFO  (qtp1161571820-9605) [n:127.0.0.1:34077_ c:control_collection s:shard1  x:control_collection_shard1_replica_n1] o.a.s.c.CoreContainer Creating SolrCore 'control_collection_shard1_replica_n1' using configuration from collection control_collection, trusted=true
   [junit4]   2> 971966 INFO  (qtp1161571820-9605) [n:127.0.0.1:34077_ c:control_collection s:shard1  x:control_collection_shard1_replica_n1] o.a.s.m.r.SolrJmxReporter JMX monitoring for 'solr.core.control_collection.shard1.replica_n1' (registry 'solr.core.control_collection.shard1.replica_n1') enabled at server: com.sun.jmx.mbeanserver.JmxMBeanServer@6a58c411
   [junit4]   2> 971966 INFO  (qtp1161571820-9605) [n:127.0.0.1:34077_ c:control_collection s:shard1  x:control_collection_shard1_replica_n1] o.a.s.c.SolrCore solr.RecoveryStrategy.Builder
   [junit4]   2> 971966 INFO  (qtp1161571820-9605) [n:127.0.0.1:34077_ c:control_collection s:shard1  x:control_collection_shard1_replica_n1] o.a.s.c.SolrCore [[control_collection_shard1_replica_n1] ] Opening new SolrCore at [/home/jenkins/workspace/Lucene-Solr-BadApples-7.x-Linux/solr/build/solr-core/test/J1/temp/solr.cloud.SSLMigrationTest_D43070E746C4A5D-001/control-001/cores/control_collection_shard1_replica_n1], dataDir=[/home/jenkins/workspace/Lucene-Solr-BadApples-7.x-Linux/solr/build/solr-core/test/J1/../../../../../../../../../home/jenkins/workspace/Lucene-Solr-BadApples-7.x-Linux/solr/build/solr-core/test/J1/temp/solr.cloud.SSLMigrationTest_D43070E746C4A5D-001/control-001/cores/control_collection_shard1_replica_n1/data/]
   [junit4]   2> 971968 INFO  (qtp1161571820-9605) [n:127.0.0.1:34077_ c:control_collection s:shard1  x:control_collection_shard1_replica_n1] o.a.s.u.RandomMergePolicy RandomMergePolicy wrapping class org.apache.lucene.index.TieredMergePolicy: [TieredMergePolicy: maxMergeAtOnce=38, maxMergeAtOnceExplicit=25, maxMergedSegmentMB=36.111328125, floorSegmentMB=1.5859375, forceMergeDeletesPctAllowed=2.386041573014035, segmentsPerTier=22.0, maxCFSSegmentSizeMB=8.796093022207999E12, noCFSRatio=0.1390375990920447
   [junit4]   2> 971970 WARN  (qtp1161571820-9605) [n:127.0.0.1:34077_ c:control_collection s:shard1  x:control_collection_shard1_replica_n1] o.a.s.c.RequestHandlers INVALID paramSet a in requestHandler {type = requestHandler,name = /dump,class = DumpRequestHandler,attributes = {initParams=a, name=/dump, class=DumpRequestHandler},args = {defaults={a=A,b=B}}}
   [junit4]   2> 972036 INFO  (qtp1161571820-9605) [n:127.0.0.1:34077_ c:control_collection s:shard1  x:control_collection_shard1_replica_n1] o.a.s.u.UpdateHandler Using UpdateLog implementation: org.apache.solr.update.UpdateLog
   [junit4]   2> 972036 INFO  (qtp1161571820-9605) [n:127.0.0.1:34077_ c:control_collection s:shard1  x:control_collection_shard1_replica_n1] o.a.s.u.UpdateLog Initializing UpdateLog: dataDir= defaultSyncLevel=FLUSH numRecordsToKeep=100 maxNumLogsToKeep=10 numVersionBuckets=65536
   [junit4]   2> 972037 INFO  (qtp1161571820-9605) [n:127.0.0.1:34077_ c:control_collection s:shard1  x:control_collection_shard1_replica_n1] o.a.s.u.CommitTracker Hard AutoCommit: disabled
   [junit4]   2> 972037 INFO  (qtp1161571820-9605) [n:127.0.0.1:34077_ c:control_collection s:shard1  x:control_collection_shard1_replica_n1] o.a.s.u.CommitTracker Soft AutoCommit: disabled
   [junit4]   2> 972038 INFO  (qtp1161571820-9605) [n:127.0.0.1:34077_ c:control_collection s:shard1  x:control_collection_shard1_replica_n1] o.a.s.u.RandomMergePolicy RandomMergePolicy wrapping class org.apache.lucene.index.LogByteSizeMergePolicy: [LogByteSizeMergePolicy: minMergeSize=1677721, mergeFactor=16, maxMergeSize=2147483648, maxMergeSizeForForcedMerge=9223372036854775807, calibrateSizeByDeletes=true, maxMergeDocs=2147483647, maxCFSSegmentSizeMB=8.796093022207999E12, noCFSRatio=0.6546968515599456]
   [junit4]   2> 972038 INFO  (qtp1161571820-9605) [n:127.0.0.1:34077_ c:control_collection s:shard1  x:control_collection_shard1_replica_n1] o.a.s.s.SolrIndexSearcher Opening [Searcher@1605061a[control_collection_shard1_replica_n1] main]
   [junit4]   2> 972039 INFO  (qtp1161571820-9605) [n:127.0.0.1:34077_ c:control_collection s:shard1  x:control_collection_shard1_replica_n1] o.a.s.r.ManagedResourceStorage Configured ZooKeeperStorageIO with znodeBase: /configs/conf1
   [junit4]   2> 972039 INFO  (qtp1161571820-9605) [n:127.0.0.1:34077_ c:control_collection s:shard1  x:control_collection_shard1_replica_n1] o.a.s.r.ManagedResourceStorage Loaded null at path _rest_managed.json using ZooKeeperStorageIO:path=/configs/conf1
   [junit4]   2> 972039 INFO  (qtp1161571820-9605) [n:127.0.0.1:34077_ c:control_collection s:shard1  x:control_collection_shard1_replica_n1] o.a.s.h.ReplicationHandler Commits will be reserved for 10000ms.
   [junit4]   2> 972040 INFO  (qtp1161571820-9605) [n:127.0.0.1:34077_ c:control_collection s:shard1  x:control_collection_shard1_replica_n1] o.a.s.u.UpdateLog Could not find max version in index or recent updates, using new clock 1600783210022699008
   [junit4]   2> 972040 INFO  (searcherExecutor-3066-thread-1-processing-n:127.0.0.1:34077_ x:control_collection_shard1_replica_n1 c:control_collection s:shard1) [n:127.0.0.1:34077_ c:control_collection s:shard1  x:control_collection_shard1_replica_n1] o.a.s.c.SolrCore [control_collection_shard1_replica_n1] Registered new searcher Searcher@1605061a[control_collection_shard1_replica_n1] main{ExitableDirectoryReader(UninvertingDirectoryReader())}
   [junit4]   2> 972043 INFO  (qtp1161571820-9605) [n:127.0.0.1:34077_ c:control_collection s:shard1  x:control_collection_shard1_replica_n1] o.a.s.c.ZkShardTerms Successful update of terms at /collections/control_collection/terms/shard1 to Terms{values={core_node2=0}, version=0}
   [junit4]   2> 972044 INFO  (qtp1161571820-9605) [n:127.0.0.1:34077_ c:control_collection s:shard1  x:control_collection_shard1_replica_n1] o.a.s.c.ShardLeaderElectionContext Enough replicas found to continue.
   [junit4]   2> 972044 INFO  (qtp1161571820-9605) [n:127.0.0.1:34077_ c:control_collection s:shard1  x:control_collection_shard1_replica_n1] o.a.s.c.ShardLeaderElectionContext I may be the new leader - try and sync
   [junit4]   2> 972044 INFO  (qtp1161571820-9605) [n:127.0.0.1:34077_ c:control_collection s:shard1  x:control_collection_shard1_replica_n1] o.a.s.c.SyncStrategy Sync replicas to http://127.0.0.1:34077/control_collection_shard1_replica_n1/
   [junit4]   2> 972045 INFO  (qtp1161571820-9605) [n:127.0.0.1:34077_ c:control_collection s:shard1  x:control_collection_shard1_replica_n1] o.a.s.c.SyncStrategy Sync Success - now sync replicas to me
   [junit4]   2> 972045 INFO  (qtp1161571820-9605) [n:127.0.0.1:34077_ c:control_collection s:shard1  x:control_collection_shard1_replica_n1] o.a.s.c.SyncStrategy http://127.0.0.1:34077/control_collection_shard1_replica_n1/ has no replicas
   [junit4]   2> 972045 INFO  (qtp1161571820-9605) [n:127.0.0.1:34077_ c:control_collection s:shard1  x:control_collection_shard1_replica_n1] o.a.s.c.ShardLeaderElectionContext Found all replicas participating in election, clear LIR
   [junit4]   2> 972046 INFO  (qtp1161571820-9605) [n:127.0.0.1:34077_ c:control_collection s:shard1  x:control_collection_shard1_replica_n1] o.a.s.c.ShardLeaderElectionContext I am the new leader: http://127.0.0.1:34077/control_collection_shard1_replica_n1/ shard1
   [junit4]   2> 972148 INFO  (qtp1161571820-9605) [n:127.0.0.1:34077_ c:control_collection s:shard1  x:control_collection_shard1_replica_n1] o.a.s.c.ZkController I am the leader, no recovery necessary
   [junit4]   2> 972149 INFO  (qtp1161571820-9605) [n:127.0.0.1:34077_ c:control_collection s:shard1  x:control_collection_shard1_replica_n1] o.a.s.s.HttpSolrCall [admin] webapp=null path=/admin/cores params={qt=/admin/cores&collection.configName=conf1&newCollection=true&name=control_collection_shard1_replica_n1&action=CREATE&numShards=1&collection=control_collection&shard=shard1&wt=javabin&version=2&replicaType=NRT} status=0 QTime=1548
   [junit4]   2> 972150 INFO  (qtp1161571820-9610) [n:127.0.0.1:34077_    ] o.a.s.h.a.CollectionsHandler Wait for new collection to be active for at most 30 seconds. Check all shard replicas
   [junit4]   2> 972249 INFO  (zkCallback-2271-thread-1) [    ] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/control_collection/state.json] for collection [control_collection] has occurred - updating... (live nodes size: [1])
   [junit4]   2> 972491 INFO  (OverseerCollectionConfigSetProcessor-72181576636694532-127.0.0.1:34077_-n_0000000000) [    ] o.a.s.c.OverseerTaskQueue Response ZK path: /overseer/collection-queue-work/qnr-0000000000 doesn't exist.  Requestor may have disconnected from ZooKeeper
   [junit4]   2> 973150 INFO  (qtp1161571820-9610) [n:127.0.0.1:34077_    ] o.a.s.s.HttpSolrCall [admin] webapp=null path=/admin/collections params={replicationFactor=1&collection.configName=conf1&name=control_collection&nrtReplicas=1&action=CREATE&numShards=1&createNodeSet=127.0.0.1:34077_&wt=javabin&version=2} status=0 QTime=2661
   [junit4]   2> 973153 INFO  (zkConnectionManagerCallback-2289-thread-1) [    ] o.a.s.c.c.ConnectionManager zkClient has connected
   [junit4]   2> 973154 INFO  (TEST-SSLMigrationTest.test-seed#[D43070E746C4A5D]) [    ] o.a.s.c.c.ZkStateReader Updated live nodes from ZooKeeper... (0) -> (1)
   [junit4]   2> 973155 INFO  (TEST-SSLMigrationTest.test-seed#[D43070E746C4A5D]) [    ] o.a.s.c.s.i.ZkClientClusterStateProvider Cluster at 127.0.0.1:34819/solr ready
   [junit4]   2> 973155 INFO  (TEST-SSLMigrationTest.test-seed#[D43070E746C4A5D]) [    ] o.a.s.c.ChaosMonkey monkey: init - expire sessions:false cause connection loss:false
   [junit4]   2> 973155 INFO  (qtp1161571820-9606) [n:127.0.0.1:34077_    ] o.a.s.h.a.CollectionsHandler Invoked Collection Action :create with params replicationFactor=1&collection.configName=conf1&name=collection1&nrtReplicas=1&action=CREATE&numShards=2&createNodeSet=&stateFormat=2&wt=javabin&version=2 and sendToOCPQueue=true
   [junit4]   2> 973156 INFO  (OverseerThreadFactory-3063-thread-2) [    ] o.a.s.c.a.c.CreateCollectionCmd Create collection collection1
   [junit4]   2> 973157 WARN  (OverseerThreadFactory-3063-thread-2) [    ] o.a.s.c.a.c.CreateCollectionCmd It is unusual to create a collection (collection1) without cores.
   [junit4]   2> 973360 INFO  (qtp1161571820-9606) [n:127.0.0.1:34077_    ] o.a.s.h.a.CollectionsHandler Wait for new collection to be active for at most 30 seconds. Check all shard replicas
   [junit4]   2> 973361 INFO  (qtp1161571820-9606) [n:127.0.0.1:34077_    ] o.a.s.s.HttpSolrCall [admin] webapp=null path=/admin/collections params={replicationFactor=1&collection.configName=conf1&name=collection1&nrtReplicas=1&action=CREATE&numShards=2&createNodeSet=&stateFormat=2&wt=javabin&version=2} status=0 QTime=205
   [junit4]   2> 973422 INFO  (TEST-SSLMigrationTest.test-seed#[D43070E746C4A5D]) [    ] o.a.s.c.AbstractFullDistribZkTestBase create jetty 1 in directory /home/jenkins/workspace/Lucene-Solr-BadApples-7.x-Linux/solr/build/solr-core/test/J1/temp/solr.cloud.SSLMigrationTest_D43070E746C4A5D-001/shard-1-001 of type NRT
   [junit4]   2> 973422 INFO  (TEST-SSLMigrationTest.test-seed#[D43070E746C4A5D]) [    ] o.e.j.s.Server jetty-9.4.10.v20180503; built: 2018-05-03T15:56:21.710Z; git: daa59876e6f384329b122929e70a80934569428c; jvm 1.8.0_162-b12
   [junit4]   2> 973423 INFO  (TEST-SSLMigrationTest.test-seed#[D43070E746C4A5D]) [    ] o.e.j.s.session DefaultSessionIdManager workerName=node0
   [junit4]   2> 973423 INFO  (TEST-SSLMigrationTest.test-seed#[D43070E746C4A5D]) [    ] o.e.j.s.session No SessionScavenger set, using defaults
   [junit4]   2> 973423 INFO  (TEST-SSLMigrationTest.test-seed#[D43070E746C4A5D]) [    ] o.e.j.s.session node0 Scavenging every 660000ms
   [junit4]   2> 973423 INFO  (TEST-SSLMigrationTest.test-seed#[D43070E746C4A5D]) [    ] o.e.j.s.h.ContextHandler Started o.e.j.s.ServletContextHandler@473434ee{/,null,AVAILABLE}
   [junit4]   2> 973424 INFO  (TEST-SSLMigrationTest.test-seed#[D43070E746C4A5D]) [    ] o.e.j.s.AbstractConnector Started ServerConnector@53741d1a{HTTP/1.1,[http/1.1]}{127.0.0.1:41725}
   [junit4]   2> 973424 INFO  (TEST-SSLMigrationTest.test-seed#[D43070E746C4A5D]) [    ] o.e.j.s.Server Started @973450ms
   [junit4]   2> 973424 INFO  (TEST-SSLMigrationTest.test-seed#[D43070E746C4A5D]) [    ] o.a.s.c.s.e.JettySolrRunner Jetty properties: {solr.data.dir=/home/jenkins/workspace/Lucene-Solr-BadApples-7.x-Linux/solr/build/solr-core/test/J1/temp/solr.cloud.SSLMigrationTest_D43070E746C4A5D-001/tempDir-001/jetty1, solrconfig=solrconfig.xml, hostContext=/, hostPort=41725, coreRootDirectory=/home/jenkins/workspace/Lucene-Solr-BadApples-7.x-Linux/solr/build/solr-core/test/J1/../../../../../../../../../home/jenkins/workspace/Lucene-Solr-BadApples-7.x-Linux/solr/build/solr-core/test/J1/temp/solr.cloud.SSLMigrationTest_D43070E746C4A5D-001/shard-1-001/cores}
   [junit4]   2> 973424 ERROR (TEST-SSLMigrationTest.test-seed#[D43070E746C4A5D]) [    ] o.a.s.u.StartupLoggingUtils Missing Java Option solr.log.dir. Logging may be missing or incomplete.
   [junit4]   2> 973424 INFO  (TEST-SSLMigrationTest.test-seed#[D43070E746C4A5D]) [    ] o.a.s.s.SolrDispatchFilter Using logger factory org.apache.logging.slf4j.Log4jLoggerFactory
   [junit4]   2> 973424 INFO  (TEST-SSLMigrationTest.test-seed#[D43070E746C4A5D]) [    ] o.a.s.s.SolrDispatchFilter  ___      _       Welcome to Apache Solr™ version 7.4.0
   [junit4]   2> 973424 INFO  (TEST-SSLMigrationTest.test-seed#[D43070E746C4A5D]) [    ] o.a.s.s.SolrDispatchFilter / __| ___| |_ _   Starting in cloud mode on port null
   [junit4]   2> 973424 INFO  (TEST-SSLMigrationTest.test-seed#[D43070E746C4A5D]) [    ] o.a.s.s.SolrDispatchFilter \__ \/ _ \ | '_|  Install dir: null
   [junit4]   2> 973424 INFO  (TEST-SSLMigrationTest.test-seed#[D43070E746C4A5D]) [    ] o.a.s.s.SolrDispatchFilter |___/\___/_|_|    Start time: 2018-05-18T06:43:54.917Z
   [junit4]   2> 973425 INFO  (zkConnectionManagerCallback-2291-thread-1) [    ] o.a.s.c.c.ConnectionManager zkClient has connected
   [junit4]   2> 973426 INFO  (TEST-SSLMigrationTest.test-seed#[D43070E746C4A5D]) [    ] o.a.s.s.SolrDispatchFilter Loading solr.xml from SolrHome (not found in ZooKeeper)
   [junit4]   2> 973426 INFO  (TEST-SSLMigrationTest.test-seed#[D43070E746C4A5D]) [    ] o.a.s.c.SolrXmlConfig Loading container configuration from /home/jenkins/workspace/Lucene-Solr-BadApples-7.x-Linux/solr/build/solr-core/test/J1/temp/solr.cloud.SSLMigrationTest_D43070E746C4A5D-001/shard-1-001/solr.xml
   [junit4]   2> 973428 INFO  (TEST-SSLMigrationTest.test-seed#[D43070E746C4A5D]) [    ] o.a.s.c.SolrXmlConfig Configuration parameter autoReplicaFailoverWorkLoopDelay is ignored
   [junit4]   2> 973428 INFO  (TEST-SSLMigrationTest.test-seed#[D43070E746C4A5D]) [    ] o.a.s.c.SolrXmlConfig Configuration parameter autoReplicaFailoverBadNodeExpiration is ignored
   [junit4]   2> 973429 INFO  (TEST-SSLMigrationTest.test-seed#[D43070E746C4A5D]) [    ] o.a.s.c.SolrXmlConfig MBean server found: com.sun.jmx.mbeanserver.JmxMBeanServer@6a58c411, but no JMX reporters were configured - adding default JMX reporter.
   [junit4]   2> 973431 INFO  (TEST-SSLMigrationTest.test-seed#[D43070E746C4A5D]) [    ] o.a.s.c.ZkContainer Zookeeper client=127.0.0.1:34819/solr
   [junit4]   2> 973432 INFO  (zkConnectionManagerCallback-2295-thread-1) [    ] o.a.s.c.c.ConnectionManager zkClient has connected
   [junit4]   2> 973433 INFO  (zkConnectionManagerCallback-2297-thread-1) [    ] o.a.s.c.c.ConnectionManager zkClient has connected
   [junit4]   2> 973435 INFO  (TEST-SSLMigrationTest.test-seed#[D43070E746C4A5D]) [n:127.0.0.1:41725_    ] o.a.s.c.c.ZkStateReader Updated live nodes from ZooKeeper... (0) -> (1)
   [junit4]   2> 973436 INFO  (TEST-SSLMigrationTest.test-seed#[D43070E746C4A5D]) [n:127.0.0.1:41725_    ] o.a.s.c.Overseer Overseer (id=null) closing
   [junit4]   2> 973437 INFO  (TEST-SSLMigrationTest.test-seed#[D43070E746C4A5D]) [n:127.0.0.1:41725_    ] o.a.s.c.TransientSolrCoreCacheDefault Allocating transient cache for 4 transient cores
   [junit4]   2> 973437 INFO  (TEST-SSLMigrationTest.test-seed#[D43070E746C4A5D]) [n:127.0.0.1:41725_    ] o.a.s.c.ZkController Register node as live in ZooKeeper:/live_nodes/127.0.0.1:41725_
   [junit4]   2> 973438 INFO  (zkCallback-2271-thread-2) [    ] o.a.s.c.c.ZkStateReader Updated live nodes from ZooKeeper... (1) -> (2)
   [junit4]   2> 973438 INFO  (zkCallback-2278-thread-1) [    ] o.a.s.c.c.ZkStateReader Updated live nodes from ZooKeeper... (1) -> (2)
   [junit4]   2> 973438 INFO  (zkCallback-2296-thread-1) [    ] o.a.s.c.c.ZkStateReader Updated live nodes from ZooKeeper... (1) -> (2)
   [junit4]   2> 973438 INFO  (zkCallback-2288-thread-1) [    ] o.a.s.c.c.ZkStateReader Updated live nodes from ZooKeeper... (1) -> (2)
   [junit4]   2> 973532 INFO  (TEST-SSLMigrationTest.test-seed#[D43070E746C4A5D]) [n:127.0.0.1:41725_    ] o.a.s.m.r.SolrJmxReporter JMX monitoring for 'solr.node' (registry 'solr.node') enabled at server: com.sun.jmx.mbeanserver.JmxMBeanServer@6a58c411
   [junit4]   2> 973537 INFO  (TEST-SSLMigrationTest.test-seed#[D43070E746C4A5D]) [n:127.0.0.1:41725_    ] o.a.s.m.r.SolrJmxReporter JMX monitoring for 'solr.jvm' (registry 'solr.jvm') enabled at server: com.sun.jmx.mbeanserver.JmxMBeanServer@6a58c411
   [junit4]   2> 973538 INFO  (TEST-SSLMigrationTest.test-seed#[D43070E746C4A5D]) [n:127.0.0.1:41725_    ] o.a.s.m.r.SolrJmxReporter JMX monitoring for 'solr.jetty' (registry 'solr.jetty') enabled at server: com.sun.jmx.mbeanserver.JmxMBeanServer@6a58c411
   [junit4]   2> 973539 INFO  (TEST-SSLMigrationTest.test-seed#[D43070E746C4A5D]) [n:127.0.0.1:41725_    ] o.a.s.c.CorePropertiesLocator Found 0 core definitions underneath /home/jenkins/workspace/Lucene-Solr-BadApples-7.x-Linux/solr/build/solr-core/test/J1/../../../../../../../../../home/jenkins/workspace/Lucene-Solr-BadApples-7.x-Linux/solr/build/solr-core/test/J1/temp/solr.cloud.SSLMigrationTest_D43070E746C4A5D-001/shard-1-001/cores
   [junit4]   2> 973541 INFO  (zkConnectionManagerCallback-2304-thread-1) [    ] o.a.s.c.c.ConnectionManager zkClient has connected
   [junit4]   2> 973542 INFO  (TEST-SSLMigrationTest.test-seed#[D43070E746C4A5D]) [n:127.0.0.1:41725_    ] o.a.s.c.c.ZkStateReader Updated live nodes from ZooKeeper... (0) -> (2)
   [junit4]   2> 973542 INFO  (TEST-SSLMigrationTest.test-seed#[D43070E746C4A5D]) [n:127.0.0.1:41725_    ] o.a.s.c.s.i.ZkClientClusterStateProvider Cluster at 127.0.0.1:34819/solr ready
   [junit4]   2> 973556 INFO  (qtp1161571820-9648) [n:127.0.0.1:34077_    ] o.a.s.h.a.CollectionsHandler Invoked Collection Action :addreplica with params node=127.0.0.1:41725_&action=ADDREPLICA&collection=collection1&shard=shard2&type=NRT&wt=javabin&version=2 and sendToOCPQueue=true
   [junit4]   2> 973558 INFO  (OverseerCollectionConfigSetProcessor-72181576636694532-127.0.0.1:34077_-n_0000000000) [    ] o.a.s.c.OverseerTaskQueue Response ZK path: /overseer/collection-queue-work/qnr-0000000002 doesn't exist.  Requestor may have disconnected from ZooKeeper
   [junit4]   2> 973558 INFO  (OverseerThreadFactory-3063-thread-3) [ c:collection1 s:shard2  ] o.a.s.c.a.c.AddReplicaCmd Node Identified 127.0.0.1:41725_ for creating new replica
   [junit4]   2> 973560 INFO  (qtp440281205-9666) [n:127.0.0.1:41725_    x:collection1_shard2_replica_n41] o.a.s.h.a.CoreAdminOperation core create command qt=/admin/cores&collection.configName=conf1&name=collection1_shard2_replica_n41&action=CREATE&collection=collection1&shard=shard2&wt=javabin&version=2&replicaType=NRT
   [junit4]   2> 974573 INFO  (qtp440281205-9666) [n:127.0.0.1:41725_ c:collection1 s:shard2  x:collection1_shard2_replica_n41] o.a.s.c.SolrConfig Using Lucene MatchVersion: 7.4.0
   [junit4]   2> 974584 INFO  (qtp440281205-9666) [n:127.0.0.1:41725_ c:collection1 s:shard2  x:collection1_shard2_replica_n41] o.a.s.s.IndexSchema [collection1_shard2_replica_n41] Schema name=test
   [junit4]   2> 974717 INFO  (qtp440281205-9666) [n:127.0.0.1:41725_ c:collection1 s:shard2  x:collection1_shard2_replica_n41] o.a.s.s.IndexSchema Loaded schema test/1.0 with uniqueid field id
   [junit4]   2> 974728 INFO  (qtp440281205-9666) [n:127.0.0.1:41725_ c:collection1 s:shard2  x:collection1_shard2_replica_n41] o.a.s.c.CoreContainer Creating SolrCore 'collection1_shard2_replica_n41' using configuration from collection collection1, trusted=true
   [junit4]   2> 974729 INFO  (qtp440281205-9666) [n:127.0.0.1:41725_ c:collection1 s:shard2  x:collection1_shard2_replica_n41] o.a.s.m.r.SolrJmxReporter JMX monitoring for 'solr.core.collection1.shard2.replica_n41' (registry 'solr.core.collection1.shard2.replica_n41') enabled at server: com.sun.jmx.mbeanserver.JmxMBeanServer@6a58c411
   [junit4]   2> 974729 INFO  (qtp440281205-9666) [n:127.0.0.1:41725_ c:collection1 s:shard2  x:collection1_shard2_replica_n41] o.a.s.c.SolrCore solr.RecoveryStrategy.Builder
   [junit4]   2> 974729 INFO  (qtp440281205-9666) [n:127.0.0.1:41725_ c:collection1 s:shard2  x:collection1_shard2_replica_n41] o.a.s.c.SolrCore [[collection1_shard2_replica_n41] ] Opening new SolrCore at [/home/jenkins/workspace/Lucene-Solr-BadApples-7.x-Linux/solr/build/solr-core/test/J1/temp/solr.cloud.SSLMigrationTest_D43070E746C4A5D-001/shard-1-001/cores/collection1_shard2_replica_n41], dataDir=[/home/jenkins/workspace/Lucene-Solr-BadApples-7.x-Linux/solr/build/solr-core/test/J1/../../../../../../../../../home/jenkins/workspace/Lucene-Solr-BadApples-7.x-Linux/solr/build/solr-core/test/J1/temp/solr.cloud.SSLMigrationTest_D43070E746C4A5D-001/shard-1-001/cores/collection1_shard2_replica_n41/data/]
   [junit4]   2> 974731 INFO  (qtp440281205-9666) [n:127.0.0.1:41725_ c:collection1 s:shard2  x:collection1_shard2_replica_n41] o.a.s.u.RandomMergePolicy RandomMergePolicy wrapping class org.apache.lucene.index.TieredMergePolicy: [TieredMergePolicy: maxMergeAtOnce=38, maxMergeAtOnceExplicit=25, maxMergedSegmentMB=36.111328125, floorSegmentMB=1.5859375, forceMergeDeletesPctAllowed=2.386041573014035, segmentsPerTier=22.0, maxCFSSegmentSizeMB=8.796093022207999E12, noCFSRatio=0.1390375990920447
   [junit4]   2> 974733 WARN  (qtp440281205-9666) [n:127.0.0.1:41725_ c:collection1 s:shard2  x:collection1_shard2_replica_n41] o.a.s.c.RequestHandlers INVALID paramSet a in requestHandler {type = requestHandler,name = /dump,class = DumpRequestHandler,attributes = {initParams=a, name=/dump, class=DumpRequestHandler},args = {defaults={a=A,b=B}}}
   [junit4]   2> 974769 INFO  (qtp440281205-9666) [n:127.0.0.1:41725_ c:collection1 s:shard2  x:collection1_shard2_replica_n41] o.a.s.u.UpdateHandler Using UpdateLog implementation: org.apache.solr.update.UpdateLog
   [junit4]   2> 974770 INFO  (qtp440281205-9666) [n:127.0.0.1:41725_ c:collection1 s:shard2  x:collection1_shard2_replica_n41] o.a.s.u.UpdateLog Initializing UpdateLog: dataDir= defaultSyncLevel=FLUSH numRecordsToKeep=100 maxNumLogsToKeep=10 numVersionBuckets=65536
   [junit4]   2> 974771 INFO  (qtp440281205-9666) [n:127.0.0.1:41725_ c:collection1 s:shard2  x:collection1_shard2_replica_n41] o.a.s.u.CommitTracker Hard AutoCommit: disabled
   [junit4]   2> 974771 INFO  (qtp440281205-9666) [n:127.0.0.1:41725_ c:collection1 s:shard2  x:collection1_shard2_replica_n41] o.a.s.u.CommitTracker Soft AutoCommit: disabled
   [junit4]   2> 974772 INFO  (qtp440281205-9666) [n:127.0.0.1:41725_ c:collection1 s:shard2  x:collection1_shard2_replica_n41] o.a.s.u.RandomMergePolicy RandomMergePolicy wrapping class org.apache.lucene.index.LogByteSizeMergePolicy: [LogByteSizeMergePolicy: minMergeSize=1677721, mergeFactor=16, maxMergeSize=2147483648, maxMergeSizeForForcedMerge=9223372036854775807, calibrateSizeByDeletes=true, maxMergeDocs=2147483647, maxCFSSegmentSizeMB=8.796093022207999E12, noCFSRatio=0.6546968515599456]
   [junit4]   2> 974772 INFO  (qtp440281205-9666) [n:127.0.0.1:41725_ c:collection1 s:shard2  x:collection1_shard2_replica_n41] o.a.s.s.SolrIndexSearcher Opening [Searcher@66f35295[collection1_shard2_replica_n41] main]
   [junit4]   2> 974773 INFO  (qtp440281205-9666) [n:127.0.0.1:41725_ c:collection1 s:shard2  x:collection1_shard2_replica_n41] o.a.s.r.ManagedResourceStorage Configured ZooKeeperStorageIO with znodeBase: /configs/conf1
   [junit4]   2> 974773 INFO  (qtp440281205-9666) [n:127.0.0.1:41725_ c:collection1 s:shard2  x:collection1_shard2_replica_n41] o.a.s.r.ManagedResourceStorage Loaded null at path _rest_managed.json using ZooKeeperStorageIO:path=/configs/conf1
   [junit4]   2> 974774 INFO  (qtp440281205-9666) [n:127.0.0.1:41725_ c:collection1 s:shard2  x:collection1_shard2_replica_n41] o.a.s.h.ReplicationHandler Commits will be reserved for 10000ms.
   [junit4]   2> 974774 INFO  (qtp440281205-9666) [n:127.0.0.1:41725_ c:collection1 s:shard2  x:collection1_shard2_replica_n41] o.a.s.u.UpdateLog Could not find max version in index or recent updates, using new clock 1600783212889505792
   [junit4]   2> 974775 INFO  (searcherExecutor-3077-thread-1-processing-n:127.0.0.1:41725_ x:collection1_shard2_replica_n41 c:collection1 s:shard2) [n:127.0.0.1:41725_ c:collection1 s:shard2  x:collection1_shard2_replica_n41] o.a.s.c.SolrCore [collection1_shard2_replica_n41] Registered new searcher Searcher@66f35295[collection1_shard2_replica_n41] main{ExitableDirectoryReader(UninvertingDirectoryReader())}
   [junit4]   2> 974777 INFO  (qtp440281205-9666) [n:127.0.0.1:41725_ c:collection1 s:shard2  x:collection1_shard2_replica_n41] o.a.s.c.ZkShardTerms Successful update of terms at /collections/collection1/terms/shard2 to Terms{values={core_node42=0}, version=0}
   [junit4]   2> 974779 INFO  (qtp440281205-9666) [n:127.0.0.1:41725_ c:collection1 s:shard2  x:collection1_shard2_replica_n41] o.a.s.c.ShardLeaderElectionContext Enough replicas found to continue.
   [junit4]   2> 974779 INFO  (qtp440281205-9666) [n:127.0.0.1:41725_ c:collection1 s:shard2  x:collection1_shard2_replica_n41] o.a.s.c.ShardLeaderElectionContext I may be the new leader - try and sync
   [junit4]   2> 974779 INFO  (qtp440281205-9666) [n:127.0.0.1:41725_ c:collection1 s:shard2  x:collection1_shard2_replica_n41] o.a.s.c.SyncStrategy Sync replicas to http://127.0.0.1:41725/collection1_shard2_replica_n41/
   [junit4]   2> 974779 INFO  (qtp440281205-9666) [n:127.0.0.1:41725_ c:collection1 s:shard2  x:collection1_shard2_replica_n41] o.a.s.c.SyncStrategy Sync Success - now sync replicas to me
   [junit4]   2> 974779 INFO  (qtp440281205-9666) [n:127.0.0.1:41725_ c:collection1 s:shard2  x:collection1_shard2_replica_n41] o.a.s.c.SyncStrategy http://127.0.0.1:41725/collection1_shard2_replica_n41/ has no replicas
   [junit4]   2> 974779 INFO  (qtp440281205-9666) [n:127.0.0.1:41725_ c:collection1 s:shard2  x:collection1_shard2_replica_n41] o.a.s.c.ShardLeaderElectionContext Found all replicas participating in election, clear LIR
   [junit4]   2> 974781 INFO  (qtp440281205-9666) [n:127.0.0.1:41725_ c:collection1 s:shard2  x:collection1_shard2_replica_n41] o.a.s.c.ShardLeaderElectionContext I am the new leader: http://127.0.0.1:41725/collection1_shard2_replica_n41/ shard2
   [junit4]   2> 974882 INFO  (qtp440281205-9666) [n:127.0.0.1:41725_ c:collection1 s:shard2  x:collection1_shard2_replica_n41] o.a.s.c.ZkController I am the leader, no recovery necessary
   [junit4]   2> 974883 INFO  (qtp440281205-9666) [n:127.0.0.1:41725_ c:collection1 s:shard2  x:collection1_shard2_replica_n41] o.a.s.s.HttpSolrCall [admin] webapp=null path=/admin/cores params={qt=/admin/cores&collection.configName=conf1&name=collection1_shard2_replica_n41&action=CREATE&collection=collection1&shard=shard2&wt=javabin&version=2&replicaType=NRT} status=0 QTime=1323
   [junit4]   2> 974885 INFO  (qtp1161571820-9648) [n:127.0.0.1:34077_ c:collection1   ] o.a.s.s.HttpSolrCall [admin] webapp=null path=/admin/collections params={node=127.0.0.1:41725_&action=ADDREPLICA&collection=collection1&shard=shard2&type=NRT&wt=javabin&version=2} status=0 QTime=1328
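[Editor's note] The ADDREPLICA just logged placed collection1/shard2 on node 127.0.0.1:41725_ with type=NRT. A hedged SolrJ sketch of the same admin call follows; the client setup mirrors the earlier sketch, the node and shard names are copied from the log, and the setNode setter is assumed to be available on CollectionAdminRequest.AddReplica in this SolrJ version.

    import java.util.Collections;
    import java.util.Optional;

    import org.apache.solr.client.solrj.impl.CloudSolrClient;
    import org.apache.solr.client.solrj.request.CollectionAdminRequest;

    public class AddReplicaSketch {
      public static void main(String[] args) throws Exception {
        try (CloudSolrClient client = new CloudSolrClient.Builder(
            Collections.singletonList("127.0.0.1:34819"), Optional.of("/solr")).build()) {
          // ADDREPLICA for collection1/shard2 pinned to the node seen in the log;
          // the replica type defaults to NRT, matching type=NRT in the logged request.
          CollectionAdminRequest.AddReplica addReplica =
              CollectionAdminRequest.addReplicaToShard("collection1", "shard2");
          addReplica.setNode("127.0.0.1:41725_"); // assumed setter; node name from the log
          addReplica.process(client);
        }
      }
    }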
   [junit4]   2> 974948 INFO  (TEST-SSLMigrationTest.test-seed#[D43070E746C4A5D]) [    ] o.a.s.c.AbstractFullDistribZkTestBase create jetty 2 in directory /home/jenkins/workspace/Lucene-Solr-BadApples-7.x-Linux/solr/build/solr-core/test/J1/temp/solr.cloud.SSLMigrationTest_D43070E746C4A5D-001/shard-2-001 of type NRT
   [junit4]   2> 974949 INFO  (TEST-SSLMigrationTest.test-seed#[D43070E746C4A5D]) [    ] o.e.j.s.Server jetty-9.4.10.v20180503; built: 2018-05-03T15:56:21.710Z; git: daa59876e6f384329b122929e70a80934569428c; jvm 1.8.0_162-b12
   [junit4]   2> 974950 INFO  (TEST-SSLMigrationTest.test-seed#[D43070E746C4A5D]) [    ] o.e.j.s.session DefaultSessionIdManager workerName=node0
   [junit4]   2> 974950 INFO  (TEST-SSLMigrationTest.test-seed#[D43070E746C4A5D]) [    ] o.e.j.s.session No SessionScavenger set, using defaults
   [junit4]   2> 974950 INFO  (TEST-SSLMigrationTest.test-seed#[D43070E746C4A5D]) [    ] o.e.j.s.session node0 Scavenging every 600000ms
   [junit4]   2> 974950 INFO  (TEST-SSLMigrationTest.test-seed#[D43070E746C4A5D]) [    ] o.e.j.s.h.ContextHandler Started o.e.j.s.ServletContextHandler@68ee2d39{/,null,AVAILABLE}
   [junit4]   2> 974951 INFO  (TEST-SSLMigrationTest.test-seed#[D43070E746C4A5D]) [    ] o.e.j.s.AbstractConnector Started ServerConnector@25f9ab9{HTTP/1.1,[http/1.1]}{127.0.0.1:44721}
   [junit4]   2> 974951 INFO  (TEST-SSLMigrationTest.test-seed#[D43070E746C4A5D]) [    ] o.e.j.s.Server Started @974977ms
   [junit4]   2> 974951 INFO  (TEST-SSLMigrationTest.test-seed#[D43070E746C4A5D]) [    ] o.a.s.c.s.e.JettySolrRunner Jetty properties: {solr.data.dir=/home/jenkins/workspace/Lucene-Solr-BadApples-7.x-Linux/solr/build/solr-core/test/J1/temp/solr.cloud.SSLMigrationTest_D43070E746C4A5D-001/tempDir-001/jetty2, replicaType=NRT, solrconfig=solrconfig.xml, hostContext=/, hostPort=44721, coreRootDirectory=/home/jenkins/workspace/Lucene-Solr-BadApples-7.x-Linux/solr/build/solr-core/test/J1/temp/solr.cloud.SSLMigrationTest_D43070E746C4A5D-001/shard-2-001/cores}
   [junit4]   2> 974951 ERROR (TEST-SSLMigrationTest.test-seed#[D43070E746C4A5D]) [    ] o.a.s.u.StartupLoggingUtils Missing Java Option solr.log.dir. Logging may be missing or incomplete.
   [junit4]   2> 974951 INFO  (TEST-SSLMigrationTest.test-seed#[D43070E746C4A5D]) [    ] o.a.s.s.SolrDispatchFilter Using logger factory org.apache.logging.slf4j.Log4jLoggerFactory
   [junit4]   2> 974951 INFO  (TEST-SSLMigrationTest.test-seed#[D43070E746C4A5D]) [    ] o.a.s.s.SolrDispatchFilter  ___      _       Welcome to Apache Solr™ version 7.4.0
   [junit4]   2> 974951 INFO  (TEST-SSLMigrationTest.test-seed#[D43070E746C4A5D]) [    ] o.a.s.s.SolrDispatchFilter / __| ___| |_ _   Starting in cloud mode on port null
   [junit4]   2> 974951 INFO  (TEST-SSLMigrationTest.test-seed#[D43070E746C4A5D]) [    ] o.a.s.s.SolrDispatchFilter \__ \/ _ \ | '_|  Install dir: null
   [junit4]   2> 974951 INFO  (TEST-SSLMigrationTest.test-seed#[D43070E746C4A5D]) [    ] o.a.s.s.SolrDispatchFilter |___/\___/_|_|    Start time: 2018-05-18T06:43:56.444Z
   [junit4]   2> 974952 INFO  (zkConnectionManagerCallback-2306-thread-1) [    ] o.a.s.c.c.ConnectionManager zkClient has connected
   [junit4]   2> 974953 INFO  (TEST-SSLMigrationTest.test-seed#[D43070E746C4A5D]) [    ] o.a.s.s.SolrDispatchFilter Loading solr.xml from SolrHome (not found in ZooKeeper)
   [junit4]   2> 974953 INFO  (TEST-SSLMigrationTest.test-seed#[D43070E746C4A5D]) [    ] o.a.s.c.SolrXmlConfig Loading container configuration from /home/jenkins/workspace/Lucene-Solr-BadApples-7.x-Linux/solr/build/solr-core/test/J1/temp/solr.cloud.SSLMigrationTest_D43070E746C4A5D-001/shard-2-001/solr.xml
   [junit4]   2> 974955 INFO  (TEST-SSLMigrationTest.test-seed#[D43070E746C4A5D]) [    ] o.a.s.c.SolrXmlConfig Configuration parameter autoReplicaFailoverWorkLoopDelay is ignored
   [junit4]   2> 974955 INFO  (TEST-SSLMigrationTest.test-seed#[D43070E746C4A5D]) [    ] o.a.s.c.SolrXmlConfig Configuration parameter autoReplicaFailoverBadNodeExpiration is ignored
   [junit4]   2> 974956 INFO  (TEST-SSLMigrationTest.test-seed#[D43070E746C4A5D]) [    ] o.a.s.c.SolrXmlConfig MBean server found: com.sun.jmx.mbeanserver.JmxMBeanServer@6a58c411, but no JMX reporters were configured - adding default JMX reporter.
   [junit4]   2> 974958 INFO  (TEST-SSLMigrationTest.test-seed#[D43070E746C4A5D]) [    ] o.a.s.c.ZkContainer Zookeeper client=127.0.0.1:34819/solr
   [junit4]   2> 974959 INFO  (zkConnectionManagerCallback-2310-thread-1) [    ] o.a.s.c.c.ConnectionManager zkClient has connected
   [junit4]   2> 974960 INFO  (zkConnectionManagerCallback-2312-thread-1) [    ] o.a.s.c.c.ConnectionManager zkClient has connected
   [junit4]   2> 974963 INFO  (TEST-SSLMigrationTest.test-seed#[D43070E746C4A5D]) [n:127.0.0.1:44721_    ] o.a.s.c.c.ZkStateReader Updated live nodes from ZooKeeper... (0) -> (2)
   [junit4]   2> 974964 INFO  (TEST-SSLMigrationTest.test-seed#[D43070E746C4A5D]) [n:127.0.0.1:44721_    ] o.a.s.c.Overseer Overseer (id=null) closing
   [junit4]   2> 974965 INFO  (TEST-SSLMigrationTest.test-seed#[D43070E746C4A5D]) [n:127.0.0.1:44721_    ] o.a.s.c.TransientSolrCoreCacheDefault Allocating transient cache for 4 transient cores
   [junit4]   2> 974965 INFO  (TEST-SSLMigrationTest.test-seed#[D43070E746C4A5D]) [n:127.0.0.1:44721_    ] o.a.s.c.ZkController Register node as live in ZooKeeper:/live_nodes/127.0.0.1:44721_
   [junit4]   2> 974965 INFO  (zkCallback-2271-thread-2) [    ] o.a.s.c.c.ZkStateReader Updated live nodes from ZooKeeper... (2) -> (3)
   [junit4]   2> 974965 INFO  (zkCallback-2288-thread-1) [    ] o.a.s.c.c.ZkStateReader Updated live nodes from ZooKeeper... (2) -> (3)
   [junit4]   2> 974965 INFO  (zkCallback-2296-thread-1) [    ] o.a.s.c.c.ZkStateReader Updated live nodes from ZooKeeper... (2) -> (3)
   [junit4]   2> 974966 INFO  (zkCallback-2311-thread-1) [    ] o.a.s.c.c.ZkStateReader Updated live nodes from ZooKeeper... (2) -> (3)
   [junit4]   2> 974966 INFO  (zkCallback-2303-thread-1) [    ] o.a.s.c.c.ZkStateReader Updated live nodes from ZooKeeper... (2) -> (3)
   [junit4]   2> 974966 INFO  (zkCallback-2278-thread-1) [    ] o.a.s.c.c.ZkStateReader Updated live nodes from ZooKeeper... (2) -> (3)
   [junit4]   2> 975066 INFO  (zkCallback-2296-thread-1) [    ] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/collection1/state.json] for collection [collection1] has occurred - updating... (live nodes size: [3])
   [junit4]   2> 975083 INFO  (TEST-SSLMigrationTest.test-seed#[D43070E746C4A5D]) [n:127.0.0.1:44721_    ] o.a.s.m.r.SolrJmxReporter JMX monitoring for 'solr.node' (registry 'solr.node') enabled at server: com.sun.jmx.mbeanserver.JmxMBeanServer@6a58c411
   [junit4]   2> 975088 INFO  (TEST-SSLMigrationTest.test-seed#[D43070E746C4A5D]) [n:127.0.0.1:44721_    ] o.a.s.m.r.SolrJmxReporter JMX monitoring for 'solr.jvm' (registry 'solr.jvm') enabled at server: com.sun.jmx.mbeanserver.JmxMBeanServer@6a58c411
   [junit4]   2> 975088 INFO  (TEST-SSLMigrationTest.test-seed#[D43070E746C4A5D]) [n:127.0.0.1:44721_    ] o.a.s.m.r.SolrJmxReporter JMX monitoring for 'solr.jetty' (registry 'solr.jetty') enabled at server: com.sun.jmx.mbeanserver.JmxMBeanServer@6a58c411
   [junit4]   2> 975089 INFO  (TEST-SSLMigrationTest.test-seed#[D43070E746C4A5D]) [n:127.0.0.1:44721_    ] o.a.s.c.CorePropertiesLocator Found 0 core definitions underneath /home/jenkins/workspace/Lucene-Solr-BadApples-7.x-Linux/solr/build/solr-core/test/J1/temp/solr.cloud.SSLMigrationTest_D43070E746C4A5D-001/shard-2-001/cores
   [junit4]   2> 975091 INFO  (zkConnectionManagerCallback-2319-thread-1) [    ] o.a.s.c.c.ConnectionManager zkClient has connected
   [junit4]   2> 975092 INFO  (TEST-SSLMigrationTest.test-seed#[D43070E746C4A5D]) [n:127.0.0.1:44721_    ] o.a.s.c.c.ZkStateReader Updated live nodes from ZooKeeper... (0) -> (3)
   [junit4]   2> 975093 INFO  (TEST-SSLMigrationTest.test-seed#[D43070E746C4A5D]) [n:127.0.0.1:44721_    ] o.a.s.c.s.i.ZkClientClusterStateProvider Cluster at 127.0.0.1:34819/solr ready
   [junit4]   2> 975118 INFO  (qtp1161571820-9610) [n:127.0.0.1:34077_    ] o.a.s.h.a.CollectionsHandler Invoked Collection Action :addreplica with params node=127.0.0.1:44721_&action=ADDREPLICA&collection=collection1&shard=shard1&type=NRT&wt=javabin&version=2 and sendToOCPQueue=true
   [junit4]   2> 975119 INFO  (OverseerCollectionConfigSetProcessor-72181576636694532-127.0.0.1:34077_-n_0000000000) [    ] o.a.s.c.OverseerTaskQueue Response ZK path: /overseer/collection-queue-work/qnr-0000000004 doesn't exist.  Requestor may have disconnected from ZooKeeper
   [junit4]   2> 975119 INFO  (OverseerThreadFactory-3063-thread-4) [ c:collection1 s:shard1  ] o.a.s.c.a.c.AddReplicaCmd Node Identified 127.0.0.1:44721_ for creating new replica
   [junit4]   2> 975120 INFO  (qtp438135863-9702) [n:127.0.0.1:44721_    x:collection1_shard1_replica_n43] o.a.s.h.a.CoreAdminOperation core create command qt=/admin/cores&collection.configName=conf1&name=collection1_shard1_replica_n43&action=CREATE&collection=collection1&shard=shard1&wt=javabin&version=2&replicaType=NRT
   [junit4]   2> 975223 INFO  (zkCallback-2296-thread-1) [    ] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/collection1/state.json] for collection [collection1] has occurred - updating... (live nodes size: [3])
   [junit4]   2> 976172 INFO  (qtp438135863-9702) [n:127.0.0.1:44721_ c:collection1 s:shard1  x:collection1_shard1_replica_n43] o.a.s.c.SolrConfig Using Lucene MatchVersion: 7.4.0
   [junit4]   2> 976180 INFO  (qtp438135863-9702) [n:127.0.0.1:44721_ c:collection1 s:shard1  x:collection1_shard1_replica_n43] o.a.s.s.IndexSchema [collection1_shard1_replica_n43] Schema name=test
   [junit4]   2> 976252 INFO  (qtp438135863-9702) [n:127.0.0.1:44721_ c:collection1 s:shard1  x:collection1_shard1_replica_n43] o.a.s.s.IndexSchema Loaded schema test/1.0 with uniqueid field id
   [junit4]   2> 976259 INFO  (qtp438135863-9702) [n:127.0.0.1:44721_ c:collection1 s:shard1  x:collection1_shard1_replica_n43] o.a.s.c.CoreContainer Creating SolrCore 'collection1_shard1_replica_n43' using configuration from collection collection1, trusted=true
   [junit4]   2> 976260 INFO  (qtp438135863-9702) [n:127.0.0.1:44721_ c:collection1 s:shard1  x:collection1_shard1_replica_n43] o.a.s.m.r.SolrJmxReporter JMX monitoring for 'solr.core.collection1.shard1.replica_n43' (registry 'solr.core.collection1.shard1.replica_n43') enabled at server: com.sun.jmx.mbeanserver.JmxMBeanServer@6a58c411
   [junit4]   2> 976260 INFO  (qtp438135863-9702) [n:127.0.0.1:44721_ c:collection1 s:shard1  x:collection1_shard1_replica_n43] o.a.s.c.SolrCore solr.RecoveryStrategy.Builder
   [junit4]   2> 976260 INFO  (qtp438135863-9702) [n:127.0.0.1:44721_ c:collection1 s:shard1  x:collection1_shard1_replica_n43] o.a.s.c.SolrCore [[collection1_shard1_replica_n43] ] Opening new SolrCore at [/home/jenkins/workspace/Lucene-Solr-BadApples-7.x-Linux/solr/build/solr-core/test/J1/temp/solr.cloud.SSLMigrationTest_D43070E746C4A5D-001/shard-2-001/cores/collection1_shard1_replica_n43], dataDir=[/home/jenkins/workspace/Lucene-Solr-BadApples-7.x-Linux/solr/build/solr-core/test/J1/temp/solr.cloud.SSLMigrationTest_D43070E746C4A5D-001/shard-2-001/cores/collection1_shard1_replica_n43/data/]
   [junit4]   2> 976261 INFO  (qtp438135863-9702) [n:127.0.0.1:44721_ c:collection1 s:shard1  x:collection1_shard1_replica_n43] o.a.s.u.RandomMergePolicy RandomMergePolicy wrapping class org.apache.lucene.index.TieredMergePolicy: [TieredMergePolicy: maxMergeAtOnce=38, maxMergeAtOnceExplicit=25, maxMergedSegmentMB=36.111328125, floorSegmentMB=1.5859375, forceMergeDeletesPctAllowed=2.386041573014035, segmentsPerTier=22.0, maxCFSSegmentSizeMB=8.796093022207999E12, noCFSRatio=0.1390375990920447
   [junit4]   2> 976263 WARN  (qtp438135863-9702) [n:127.0.0.1:44721_ c:collection1 s:shard1  x:collection1_shard1_replica_n43] o.a.s.c.RequestHandlers INVALID paramSet a in requestHandler {type = requestHandler,name = /dump,class = DumpRequestHandler,attributes = {initParams=a, name=/dump, class=DumpRequestHandler},args = {defaults={a=A,b=B}}}
   [junit4]   2> 976288 INFO  (qtp438135863-9702) [n:127.0.0.1:44721_ c:collection1 s:shard1  x:collection1_shard1_replica_n43] o.a.s.u.UpdateHandler Using UpdateLog implementation: org.apache.solr.update.UpdateLog
   [junit4]   2> 976288 INFO  (qtp438135863-9702) [n:127.0.0.1:44721_ c:collection1 s:shard1  x:collection1_shard1_replica_n43] o.a.s.u.UpdateLog Initializing UpdateLog: dataDir= defaultSyncLevel=FLUSH numRecordsToKeep=100 maxNumLogsToKeep=10 numVersionBuckets=65536
   [junit4]   2> 976288 INFO  (qtp438135863-9702) [n:127.0.0.1:44721_ c:collection1 s:shard1  x:collection1_shard1_replica_n43] o.a.s.u.CommitTracker Hard AutoCommit: disabled
   [junit4]   2> 976288 INFO  (qtp438135863-9702) [n:127.0.0.1:44721_ c:collection1 s:shard1  x:collection1_shard1_replica_n43] o.a.s.u.CommitTracker Soft AutoCommit: disabled
   [junit4]   2> 976289 INFO  (qtp438135863-9702) [n:127.0.0.1:44721_ c:collection1 s:shard1  x:collection1_shard1_replica_n43] o.a.s.u.RandomMergePolicy RandomMergePolicy wrapping class org.apache.lucene.index.LogByteSizeMergePolicy: [LogByteSizeMergePolicy: minMergeSize=1677721, mergeFactor=16, maxMergeSize=2147483648, maxMergeSizeForForcedMerge=9223372036854775807, calibrateSizeByDeletes=true, maxMergeDocs=2147483647, maxCFSSegmentSizeMB=8.796093022207999E12, noCFSRatio=0.6546968515599456]
   [junit4]   2> 976289 INFO  (qtp438135863-9702) [n:127.0.0.1:44721_ c:collection1 s:shard1  x:collection1_shard1_replica_n43] o.a.s.s.SolrIndexSearcher Opening [Searcher@2480c638[collection1_shard1_replica_n43] main]
   [junit4]   2> 976290 INFO  (qtp438135863-9702) [n:127.0.0.1:44721_ c:collection1 s:shard1  x:collection1_shard1_replica_n43] o.a.s.r.ManagedResourceStorage Configured ZooKeeperStorageIO with znodeBase: /configs/conf1
   [junit4]   2> 976290 INFO  (qtp438135863-9702) [n:127.0.0.1:44721_ c:collection1 s:shard1  x:collection1_shard1_replica_n43] o.a.s.r.ManagedResourceStorage Loaded null at path _rest_managed.json using ZooKeeperStorageIO:path=/configs/conf1
   [junit4]   2> 976291 INFO  (qtp438135863-9702) [n:127.0.0.1:44721_ c:collection1 s:shard1  x:collection1_shard1_replica_n43] o.a.s.h.ReplicationHandler Commits will be reserved for 10000ms.
   [junit4]   2> 976291 INFO  (qtp438135863-9702) [n:127.0.0.1:44721_ c:collection1 s:shard1  x:collection1_shard1_replica_n43] o.a.s.u.UpdateLog Could not find max version in index or recent updates, using new clock 1600783214480195584
   [junit4]   2> 976291 INFO  (searcherExecutor-3088-thread-1-processing-n:127.0.0.1:44721_ x:collection1_shard1_replica_n43 c:collection1 s:shard1) [n:127.0.0.1:44721_ c:collection1 s:shard1  x:collection1_shard1_replica_n43] o.a.s.c.SolrCore [collection1_shard1_replica_n43] Registered new searcher Searcher@2480c638[collection1_shard1_replica_n43] main{ExitableDirectoryReader(UninvertingDirectoryReader())}
   [junit4]   2> 976293 INFO  (qtp438135863-9702) [n:127.0.0.1:44721_ c:collection1 s:shard1  x:collection1_shard1_replica_n43] o.a.s.c.ZkShardTerms Successful update of terms at /collections/collection1/terms/shard1 to Terms{values={core_node44=0}, version=0}
   [junit4]   2> 976299 INFO  (qtp438135863-9702) [n:127.0.0.1:44721_ c:collection1 s:shard1  x:collection1_shard1_replica_n43] o.a.s.c.ShardLeaderElectionContext Enough replicas found to continue.
   [junit4]   2> 976299 INFO  (qtp438135863-9702) [n:127.0.0.1:44721_ c:collection1 s:shard1  x:collection1_shard1_replica_n43] o.a.s.c.ShardLeaderElectionContext I may be the new leader - try and sync
   [junit4]   2> 976299 INFO  (qtp438135863-9702) [n:127.0.0.1:44721_ c:collection1 s:shard1  x:collection1_shard1_replica_n43] o.a.s.c.SyncStrategy Sync replicas to http://127.0.0.1:44721/collection1_shard1_replica_n43/
   [junit4]   2> 976299 INFO  (qtp438135863-9702) [n:127.0.0.1:44721_ c:collection1 s:shard1  x:collection1_shard1_replica_n43] o.a.s.c.SyncStrategy Sync Success - now sync replicas to me
   [junit4]   2> 976299 INFO  (qtp438135863-9702) [n:127.0.0.1:44721_ c:collection1 s:shard1  x:collection1_shard1_replica_n43] o.a.s.c.SyncStrategy http://127.0.0.1:44721/collection1_shard1_replica_n43/ has no replicas
   [junit4]   2> 976299 INFO  (qtp438135863-9702) [n:127.0.0.1:44721_ c:collection1 s:shard1  x:collection1_shard1_replica_n43] o.a.s.c.ShardLeaderElectionContext Found all replicas participating in election, clear LIR
   [junit4]   2> 976301 INFO  (qtp438135863-9702) [n:127.0.0.1:44721_ c:collection1 s:shard1  x:collection1_shard1_replica_n43] o.a.s.c.ShardLeaderElectionContext I am the new leader: http://127.0.0.1:44721/collection1_shard1_replica_n43/ shard1
   [junit4]   2> 976402 INFO  (zkCallback-2296-thread-1) [    ] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/collection1/state.json] for collection [collection1] has occurred - updating... (live nodes size: [3])
   [junit4]   2> 976453 INFO  (qtp438135863-9702) [n:127.0.0.1:44721_ c:collection1 s:shard1  x:collection1_shard1_replica_n43] o.a.s.c.ZkController I am the leader, no recovery necessary
   [junit4]   2> 976485 INFO  (qtp438135863-9702) [n:127.0.0.1:44721_ c:collection1 s:shard1  x:collection1_shard1_replica_n43] o.a.s.s.HttpSolrCall [admin] webapp=null path=/admin/cores params={qt=/admin/cores&collection.configName=conf1&name=collection1_shard1_replica_n43&action=CREATE&collection=collection1&shard=shard1&wt=javabin&version=2&replicaType=NRT} status=0 QTime=1364
   [junit4]   2> 976486 INFO  (qtp1161571820-9610) [n:127.0.0.1:34077_ c:collection1   ] o.a.s.s.HttpSolrCall [admin] webapp=null path=/admin/collections params={node=127.0.0.1:44721_&action=ADDREPLICA&collection=collection1&shard=shard1&type=NRT&wt=javabin&version=2} status=0 QTime=1368
   [junit4]   2> 976548 INFO  (TEST-SSLMigrationTest.test-seed#[D43070E746C4A5D]) [    ] o.a.s.c.AbstractFullDistribZkTestBase create jetty 3 in directory /home/jenkins/workspace/Lucene-Solr-BadApples-7.x-Linux/solr/build/solr-core/test/J1/temp/solr.cloud.SSLMigrationTest_D43070E746C4A5D-001/shard-3-001 of type NRT
   [junit4]   2> 976548 INFO  (TEST-SSLMigrationTest.test-seed#[D43070E746C4A5D]) [    ] o.e.j.s.Server jetty-9.4.10.v20180503; built: 2018-05-03T15:56:21.710Z; git: daa59876e6f384329b122929e70a80934569428c; jvm 1.8.0_162-b12
   [junit4]   2> 976549 INFO  (TEST-SSLMigrationTest.test-seed#[D43070E746C4A5D]) [    ] o.e.j.s.session DefaultSessionIdManager workerName=node0
   [junit4]   2> 976549 INFO  (TEST-SSLMigrationTest.test-seed#[D43070E746C4A5D]) [    ] o.e.j.s.session No SessionScavenger set, using defaults
   [junit4]   2> 976549 INFO  (TEST-SSLMigrationTest.test-seed#[D43070E746C4A5D]) [    ] o.e.j.s.session node0 Scavenging every 660000ms
   [junit4]   2> 976549 INFO  (TEST-SSLMigrationTest.test-seed#[D43070E746C4A5D]) [    ] o.e.j.s.h.ContextHandler Started o.e.j.s.ServletContextHandler@1966c91c{/,null,AVAILABLE}
   [junit4]   2> 976550 INFO  (TEST-SSLMigrationTest.test-seed#[D43070E746C4A5D]) [    ] o.e.j.s.AbstractConnector Started ServerConnector@5d2331{HTTP/1.1,[http/1.1]}{127.0.0.1:35003}
   [junit4]   2> 976550 INFO  (TEST-SSLMigrationTest.test-seed#[D43070E746C4A5D]) [    ] o.e.j.s.Server Started @976576ms
   [junit4]   2> 976550 INFO  (TEST-SSLMigrationTest.test-seed#[D43070E746C4A5D]) [    ] o.a.s.c.s.e.JettySolrRunner Jetty properties: {solr.data.dir=/home/jenkins/workspace/Lucene-Solr-BadApples-7.x-Linux/solr/build/solr-core/test/J1/temp/solr.cloud.SSLMigrationTest_D43070E746C4A5D-001/tempDir-001/jetty3, replicaType=NRT, solrconfig=solrconfig.xml, hostContext=/, hostPort=35003, coreRootDirectory=/home/jenkins/workspace/Lucene-Solr-BadApples-7.x-Linux/solr/build/solr-core/test/J1/../../../../../../../../../home/jenkins/workspace/Lucene-Solr-BadApples-7.x-Linux/solr/build/solr-core/test/J1/temp/solr.cloud.SSLMigrationTest_D43070E746C4A5D-001/shard-3-001/cores}
   [junit4]   2> 976550 ERROR (TEST-SSLMigrationTest.test-seed#[D43070E746C4A5D]) [    ] o.a.s.u.StartupLoggingUtils Missing Java Option solr.log.dir. Logging may be missing or incomplete.
   [junit4]   2> 976550 INFO  (TEST-SSLMigrationTest.test-seed#[D43070E746C4A5D]) [    ] o.a.s.s.SolrDispatchFilter Using logger factory org.apache.logging.slf4j.Log4jLoggerFactory
   [junit4]   2> 976550 INFO  (TEST-SSLMigrationTest.test-seed#[D43070E746C4A5D]) [    ] o.a.s.s.SolrDispatchFilter  ___      _       Welcome to Apache Solr™ version 7.4.0
   [junit4]   2> 976550 INFO  (TEST-SSLMigrationTest.test-seed#[D43070E746C4A5D]) [    ] o.a.s.s.SolrDispatchFilter / __| ___| |_ _   Starting in cloud mode on port null
   [junit4]   2> 976550 INFO  (TEST-SSLMigrationTest.test-seed#[D43070E746C4A5D]) [    ] o.a.s.s.SolrDispatchFilter \__ \/ _ \ | '_|  Install dir: null
   [junit4]   2> 976550 INFO  (TEST-SSLMigrationTest.test-seed#[D43070E746C4A5D]) [    ] o.a.s.s.SolrDispatchFilter |___/\___/_|_|    Start time: 2018-05-18T06:43:58.043Z
   [junit4]   2> 976551 INFO  (zkConnectionManagerCallback-2321-thread-1) [    ] o.a.s.c.c.ConnectionManager zkClient has connected
   [junit4]   2> 976552 INFO  (TEST-SSLMigrationTest.test-seed#[D43070E746C4A5D]) [    ] o.a.s.s.SolrDispatchFilter Loading solr.xml from SolrHome (not found in ZooKeeper)
   [junit4]   2> 976552 INFO  (TEST-SSLMigrationTest.test-seed#[D43070E746C4A5D]) [    ] o.a.s.c.SolrXmlConfig Loading container configuration from /home/jenkins/workspace/Lucene-Solr-BadApples-7.x-Linux/solr/build/solr-core/test/J1/temp/solr.cloud.SSLMigrationTest_D43070E746C4A5D-001/shard-3-001/solr.xml
   [junit4]   2> 976554 INFO  (zkCallback-2311-thread-1) [    ] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/collection1/state.json] for collection [collection1] has occurred - updating... (live nodes size: [3])
   [junit4]   2> 976554 INFO  (zkCallback-2296-thread-1) [    ] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/collection1/state.json] for collection [collection1] has occurred - updating... (live nodes size: [3])
   [junit4]   2> 976554 INFO  (TEST-SSLMigrationTest.test-seed#[D43070E746C4A5D]) [    ] o.a.s.c.SolrXmlConfig Configuration parameter autoReplicaFailoverWorkLoopDelay is ignored
   [junit4]   2> 976554 INFO  (TEST-SSLMigrationTest.test-seed#[D43070E746C4A5D]) [    ] o.a.s.c.SolrXmlConfig Configuration parameter autoReplicaFailoverBadNodeExpiration is ignored
   [junit4]   2> 976555 INFO  (TEST-SSLMigrationTest.test-seed#[D43070E746C4A5D]) [    ] o.a.s.c.SolrXmlConfig MBean server found: com.sun.jmx.mbeanserver.JmxMBeanServer@6a58c411, but no JMX reporters were configured - adding default JMX reporter.
   [junit4]   2> 976557 INFO  (TEST-SSLMigrationTest.test-seed#[D43070E746C4A5D]) [    ] o.a.s.c.ZkContainer Zookeeper client=127.0.0.1:34819/solr
   [junit4]   2> 976558 INFO  (zkConnectionManagerCallback-2325-thread-1) [    ] o.a.s.c.c.ConnectionManager zkClient has connected
   [junit4]   2> 976559 INFO  (zkConnectionManagerCallback-2327-thread-1) [    ] o.a.s.c.c.ConnectionManager zkClient has connected
   [junit4]   2> 976562 INFO  (TEST-SSLMigrationTest.test-seed#[D43070E746C4A5D]) [n:127.0.0.1:35003_    ] o.a.s.c.c.ZkStateReader Updated live nodes from ZooKeeper... (0) -> (3)
   [junit4]   2> 976562 INFO  (TEST-SSLMigrationTest.test-seed#[D43070E746C4A5D]) [n:127.0.0.1:35003_    ] o.a.s.c.Overseer Overseer (id=null) closing
   [junit4]   2> 976563 INFO  (TEST-SSLMigrationTest.test-seed#[D43070E746C4A5D]) [n:127.0.0.1:35003_    ] o.a.s.c.TransientSolrCoreCacheDefault Allocating transient cache for 4 transient cores
   [junit4]   2> 976563 INFO  (TEST-SSLMigrationTest.test-seed#[D43070E746C4A5D]) [n:127.0.0.1:35003_    ] o.a.s.c.ZkController Register node as live in ZooKeeper:/live_nodes/127.0.0.1:35003_
   [junit4]   2> 976564 INFO  (zkCallback-2271-thread-2) [    ] o.a.s.c.c.ZkStateReader Updated live nodes from ZooKeeper... (3) -> (4)
   [junit4]   2> 976564 INFO  (zkCallback-2311-thread-1) [    ] o.a.s.c.c.ZkStateReader Updated live nodes from ZooKeeper... (3) -> (4)
   [junit4]   2> 976564 INFO  (zkCallback-2278-thread-1) [    ] o.a.s.c.c.ZkStateReader Updated live nodes from ZooKeeper... (3) -> (4)
   [junit4]   2> 976564 INFO  (zkCallback-2303-thread-1) [    ] o.a.s.c.c.ZkStateReader Updated live nodes from ZooKeeper... (3) -> (4)
   [junit4]   2> 976564 INFO  (zkCallback-2296-thread-1) [    ] o.a.s.c.c.ZkStateReader Updated live nodes from ZooKeeper... (3) -> (4)
   [junit4]   2> 976564 INFO  (zkCallback-2288-thread-1) [    ] o.a.s.c.c.ZkStateReader Updated live nodes from ZooKeeper... (3) -> (4)
   [junit4]   2> 976564 INFO  (zkCallback-2326-thread-1) [    ] o.a.s.c.c.ZkStateReader Updated live nodes from ZooKeeper... (3) -> (4)
   [junit4]   2> 976564 INFO  (zkCallback-2318-thread-1) [    ] o.a.s.c.c.ZkStateReader Updated live nodes from ZooKeeper... (3) -> (4)
   [junit4]   2> 976713 INFO  (TEST-SSLMigrationTest.test-seed#[D43070E746C4A5D]) [n:127.0.0.1:35003_    ] o.a.s.m.r.SolrJmxReporter JMX monitoring for 'solr.node' (registry 'solr.node') enabled at server: com.sun.jmx.mbeanserver.JmxMBeanServer@6a58c411
   [junit4]   2> 976718 INFO  (TEST-SSLMigrationTest.test-seed#[D43070E746C4A5D]) [n:127.0.0.1:35003_    ] o.a.s.m.r.SolrJmxReporter JMX monitoring for 'solr.jvm' (registry 'solr.jvm') enabled at server: com.sun.jmx.mbeanserver.JmxMBeanServer@6a58c411
   [junit4]   2> 976718 INFO  (TEST-SSLMigrationTest.test-seed#[D43070E746C4A5D]) [n:127.0.0.1:35003_    ] o.a.s.m.r.SolrJmxReporter JMX monitoring for 'solr.jetty' (registry 'solr.jetty') enabled at server: com.sun.jmx.mbeanserver.JmxMBeanServer@6a58c411
   [junit4]   2> 976719 INFO  (TEST-SSLMigrationTest.test-seed#[D43070E746C4A5D]) [n:127.0.0.1:35003_    ] o.a.s.c.CorePropertiesLocator Found 0 core definitions underneath /home/jenkins/workspace/Lucene-Solr-BadApples-7.x-Linux/solr/build/solr-core/test/J1/../../../../../../../../../home/jenkins/workspace/Lucene-Solr-BadApples-7.x-Linux/solr/build/solr-core/test/J1/temp/solr.cloud.SSLMigrationTest_D43070E746C4A5D-001/shard-3-001/cores
   [junit4]   2> 976721 INFO  (zkConnectionManagerCallback-2334-thread-1) [    ] o.a.s.c.c.ConnectionManager zkClient has connected
   [junit4]   2> 976722 INFO  (TEST-SSLMigrationTest.test-seed#[D43070E746C4A5D]) [n:127.0.0.1:35003_    ] o.a.s.c.c.ZkStateReader Updated live nodes from ZooKeeper... (0) -> (4)
   [junit4]   2> 976723 INFO  (TEST-SSLMigrationTest.test-seed#[D43070E746C4A5D]) [n:127.0.0.1:35003_    ] o.a.s.c.s.i.ZkClientClusterStateProvider Cluster at 127.0.0.1:34819/solr ready
   [junit4]   2> 976745 INFO  (qtp1161571820-9648) [n:127.0.0.1:34077_    ] o.a.s.h.a.CollectionsHandler Invoked Collection Action :addreplica with params node=127.0.0.1:35003_&action=ADDREPLICA&collection=collection1&shard=shard2&type=NRT&wt=javabin&version=2 and sendToOCPQueue=true
   [junit4]   2> 976747 INFO  (OverseerCollectionConfigSetProcessor-72181576636694532-127.0.0.1:34077_-n_0000000000) [    ] o.a.s.c.OverseerTaskQueue Response ZK path: /overseer/collection-queue-work/qnr-0000000006 doesn't exist.  Requestor may have disconnected from ZooKeeper
   [junit4]   2> 976747 INFO  (OverseerThreadFactory-3063-thread-5) [ c:collection1 s:shard2  ] o.a.s.c.a.c.AddReplicaCmd Node Identified 127.0.0.1:35003_ for creating new replica
   [junit4]   2> 976748 INFO  (qtp579733820-9738) [n:127.0.0.1:35003_    x:collection1_shard2_replica_n45] o.a.s.h.a.CoreAdminOperation core create command qt=/admin/cores&collection.configName=conf1&name=collection1_shard2_replica_n45&action=CREATE&collection=collection1&shard=shard2&wt=javabin&version=2&replicaType=NRT
   [junit4]   2> 976851 INFO  (zkCallback-2296-thread-1) [    ] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/collection1/state.json] for collection [collection1] has occurred - updating... (live nodes size: [4])
   [junit4]   2> 976851 INFO  (zkCallback-2311-thread-1) [    ] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/collection1/state.json] for collection [collection1] has occurred - updating... (live nodes size: [4])
   [junit4]   2> 977841 INFO  (qtp579733820-9738) [n:127.0.0.1:35003_ c:collection1 s:shard2  x:collection1_shard2_replica_n45] o.a.s.c.SolrConfig Using Lucene MatchVersion: 7.4.0
   [junit4]   2> 977853 INFO  (qtp579733820-9738) [n:127.0.0.1:35003_ c:collection1 s:shard2  x:collection1_shard2_replica_n45] o.a.s.s.IndexSchema [collection1_shard2_replica_n45] Schema name=test
   [junit4]   2> 977938 INFO  (qtp579733820-9738) [n:127.0.0.1:35003_ c:collection1 s:shard2  x:collection1_shard2_replica_n45] o.a.s.s.IndexSchema Loaded schema test/1.0 with uniqueid field id
   [junit4]   2> 977946 INFO  (qtp579733820-9738) [n:127.0.0.1:35003_ c:collection1 s:shard2  x:collection1_shard2_replica_n45] o.a.s.c.CoreContainer Creating SolrCore 'collection1_shard2_replica_n45' using configuration from collection collection1, trusted=true
   [junit4]   2> 977947 INFO  (qtp579733820-9738) [n:127.0.0.1:35003_ c:collection1 s:shard2  x:collection1_shard2_replica_n45] o.a.s.m.r.SolrJmxReporter JMX monitoring for 'solr.core.collection1.shard2.replica_n45' (registry 'solr.core.collection1.shard2.replica_n45') enabled at server: com.sun.jmx.mbeanserver.JmxMBeanServer@6a58c411
   [junit4]   2> 977947 INFO  (qtp579733820-9738) [n:127.0.0.1:35003_ c:collection1 s:shard2  x:collection1_shard2_replica_n45] o.a.s.c.SolrCore solr.RecoveryStrategy.Builder
   [junit4]   2> 977947 INFO  (qtp579733820-9738) [n:127.0.0.1:35003_ c:collection1 s:shard2  x:collection1_shard2_replica_n45] o.a.s.c.SolrCore [[collection1_shard2_replica_n45] ] Opening new SolrCore at [/home/jenkins/workspace/Lucene-Solr-BadApples-7.x-Linux/solr/build/solr-core/test/J1/temp/solr.cloud.SSLMigrationTest_D43070E746C4A5D-001/shard-3-001/cores/collection1_shard2_replica_n45], dataDir=[/home/jenkins/workspace/Lucene-Solr-BadApples-7.x-Linux/solr/build/solr-core/test/J1/../../../../../../../../../home/jenkins/workspace/Lucene-Solr-BadApples-7.x-Linux/solr/build/solr-core/test/J1/temp/solr.cloud.SSLMigrationTest_D43070E746C4A5D-001/shard-3-001/cores/collection1_shard2_replica_n45/data/]
   [junit4]   2> 977949 INFO  (qtp579733820-9738) [n:127.0.0.1:35003_ c:collection1 s:shard2  x:collection1_shard2_replica_n45] o.a.s.u.RandomMergePolicy RandomMergePolicy wrapping class org.apache.lucene.index.TieredMergePolicy: [TieredMergePolicy: maxMergeAtOnce=38, maxMergeAtOnceExplicit=25, maxMergedSegmentMB=36.111328125, floorSegmentMB=1.5859375, forceMergeDeletesPctAllowed=2.386041573014035, segmentsPerTier=22.0, maxCFSSegmentSizeMB=8.796093022207999E12, noCFSRatio=0.1390375990920447
   [junit4]   2> 977951 WARN  (qtp579733820-9738) [n:127.0.0.1:35003_ c:collection1 s:shard2  x:collection1_shard2_replica_n45] o.a.s.c.RequestHandlers INVALID paramSet a in requestHandler {type = requestHandler,name = /dump,class = DumpRequestHandler,attributes = {initParams=a, name=/dump, class=DumpRequestHandler},args = {defaults={a=A,b=B}}}
   [junit4]   2> 977984 INFO  (qtp579733820-9738) [n:127.0.0.1:35003_ c:collection1 s:shard2  x:collection1_shard2_replica_n45] o.a.s.u.UpdateHandler Using UpdateLog implementation: org.apache.solr.update.UpdateLog
   [junit4]   2> 977984 INFO  (qtp579733820-9738) [n:127.0.0.1:35003_ c:collection1 s:shard2  x:collection1_shard2_replica_n45] o.a.s.u.UpdateLog Initializing UpdateLog: dataDir= defaultSyncLevel=FLUSH numRecordsToKeep=100 maxNumLogsToKeep=10 numVersionBuckets=65536
   [junit4]   2> 977985 INFO  (qtp579733820-9738) [n:127.0.0.1:35003_ c:collection1 s:shard2  x:collection1_shard2_replica_n45] o.a.s.u.CommitTracker Hard AutoCommit: disabled
   [junit4]   2> 977985 INFO  (qtp579733820-9738) [n:127.0.0.1:35003_ c:collection1 s:shard2  x:collection1_shard2_replica_n45] o.a.s.u.CommitTracker Soft AutoCommit: disabled
   [junit4]   2> 977986 INFO  (qtp579733820-9738) [n:127.0.0.1:35003_ c:collection1 s:shard2  x:collection1_shard2_replica_n45] o.a.s.u.RandomMergePolicy RandomMergePolicy wrapping class org.apache.lucene.index.LogByteSizeMergePolicy: [LogByteSizeMergePolicy: minMergeSize=1677721, mergeFactor=16, maxMergeSize=2147483648, maxMergeSizeForForcedMerge=9223372036854775807, calibrateSizeByDeletes=true, maxMergeDocs=2147483647, maxCFSSegmentSizeMB=8.796093022207999E12, noCFSRatio=0.6546968515599456]
   [junit4]   2> 977986 INFO  (qtp579733820-9738) [n:127.0.0.1:35003_ c:collection1 s:shard2  x:collection1_shard2_replica_n45] o.a.s.s.SolrIndexSearcher Opening [Searcher@4741a76[collection1_shard2_replica_n45] main]
   [junit4]   2> 977987 INFO  (qtp579733820-9738) [n:127.0.0.1:35003_ c:collection1 s:shard2  x:collection1_shard2_replica_n45] o.a.s.r.ManagedResourceStorage Configured ZooKeeperStorageIO with znodeBase: /configs/conf1
   [junit4]   2> 977987 INFO  (qtp579733820-9738) [n:127.0.0.1:35003_ c:collection1 s:shard2  x:collection1_shard2_replica_n45] o.a.s.r.ManagedResourceStorage Loaded null at path _rest_managed.json using ZooKeeperStorageIO:path=/configs/conf1
   [junit4]   2> 977988 INFO  (qtp579733820-9738) [n:127.0.0.1:35003_ c:collection1 s:shard2  x:collection1_shard2_replica_n45] o.a.s.h.ReplicationHandler Commits will be reserved for 10000ms.
   [junit4]   2> 977988 INFO  (qtp579733820-9738) [n:127.0.0.1:35003_ c:collection1 s:shard2  x:collection1_shard2_replica_n45] o.a.s.u.UpdateLog Could not find max version in index or recent updates, using new clock 1600783216259629056
   [junit4]   2> 977988 INFO  (searcherExecutor-3099-thread-1-processing-n:127.0.0.1:35003_ x:collection1_shard2_replica_n45 c:collection1 s:shard2) [n:127.0.0.1:35003_ c:collection1 s:shard2  x:collection1_shard2_replica_n45] o.a.s.c.SolrCore [collection1_shard2_replica_n45] Registered new searcher Searcher@4741a76[collection1_shard2_replica_n45] main{ExitableDirectoryReader(UninvertingDirectoryReader())}
   [junit4]   2> 977991 INFO  (qtp579733820-9738) [n:127.0.0.1:35003_ c:collection1 s:shard2  x:collection1_shard2_replica_n45] o.a.s.c.ZkShardTerms Successful update of terms at /collections/collection1/terms/shard2 to Terms{values={core_node42=0, core_node46=0}, version=1}
   [junit4]   2> 977991 INFO  (qtp579733820-9738) [n:127.0.0.1:35003_ c:collection1 s:shard2  x:collection1_shard2_replica_n45] o.a.s.c.ZkController Core needs to recover:collection1_shard2_replica_n45
   [junit4]   2> 977992 INFO  (updateExecutor-2322-thread-1-processing-n:127.0.0.1:35003_ x:collection1_shard2_replica_n45 c:collection1 s:shard2) [n:127.0.0.1:35003_ c:collection1 s:shard2 r:core_node46 x:collection1_shard2_replica_n45] o.a.s.u.DefaultSolrCoreState Running recovery
   [junit4]   2> 977992 INFO  (recoveryExecutor-2323-thread-1-processing-n:127.0.0.1:35003_ x:collection1_shard2_replica_n45 c:collection1 s:shard2 r:core_node46) [n:127.0.0.1:35003_ c:collection1 s:shard2 r:core_node46 x:collection1_shard2_replica_n45] o.a.s.c.RecoveryStrategy Starting recovery process. recoveringAfterStartup=true
   [junit4]   2> 977992 INFO  (recoveryExecutor-2323-thread-1-processing-n:127.0.0.1:35003_ x:collection1_shard2_replica_n45 c:collection1 s:shard2 r:core_node46) [n:127.0.0.1:35003_ c:collection1 s:shard2 r:core_node46 x:collection1_shard2_replica_n45] o.a.s.c.RecoveryStrategy ###### startupVersions=[[]]
   [junit4]   2> 977992 INFO  (qtp579733820-9738) [n:127.0.0.1:35003_ c:collection1 s:shard2  x:collection1_shard2_replica_n45] o.a.s.s.HttpSolrCall [admin] webapp=null path=/admin/cores params={qt=/admin/cores&collection.configName=conf1&name=collection1_shard2_replica_n45&action=CREATE&collection=collection1&shard=shard2&wt=javabin&version=2&replicaType=NRT} status=0 QTime=1243
   [junit4]   2> 977993 INFO  (qtp440281205-9661) [n:127.0.0.1:41725_ c:collection1 s:shard2 r:core_node42 x:collection1_shard2_replica_n41] o.a.s.c.S.Request [collection1_shard2_replica_n41]  webapp= path=/admin/ping params={wt=javabin&version=2} hits=0 status=0 QTime=0
   [junit4]   2> 977993 INFO  (qtp440281205-9661) [n:127.0.0.1:41725_ c:collection1 s:shard2 r:core_node42 x:collection1_shard2_replica_n41] o.a.s.c.S.Request [collection1_shard2_replica_n41]  webapp= path=/admin/ping params={wt=javabin&version=2} status=0 QTime=0
   [junit4]   2> 977994 INFO  (qtp1161571820-9648) [n:127.0.0.1:34077_ c:collection1   ] o.a.s.s.HttpSolrCall [admin] webapp=null path=/admin/collections params={node=127.0.0.1:35003_&action=ADDREPLICA&collection=collection1&shard=shard2&type=NRT&wt=javabin&version=2} status=0 QTime=1248
   [junit4]   2> 977994 INFO  (recoveryExecutor-2323-thread-1-processing-n:127.0.0.1:35003_ x:collection1_shard2_replica_n45 c:collection1 s:shard2 r:core_node46) [n:127.0.0.1:35003_ c:collection1 s:shard2 r:core_node46 x:collection1_shard2_replica_n45] o.a.s.c.RecoveryStrategy Begin buffering updates. core=[collection1_shard2_replica_n45]
   [junit4]   2> 977994 INFO  (recoveryExecutor-2323-thread-1-processing-n:127.0.0.1:35003_ x:collection1_shard2_replica_n45 c:collection1 s:shard2 r:core_node46) [n:127.0.0.1:35003_ c:collection1 s:shard2 r:core_node46 x:collection1_shard2_replica_n45] o.a.s.u.UpdateLog Starting to buffer updates. FSUpdateLog{state=ACTIVE, tlog=null}
   [junit4]   2> 977994 INFO  (recoveryExecutor-2323-thread-1-processing-n:127.0.0.1:35003_ x:collection1_shard2_replica_n45 c:collection1 s:shard2 r:core_node46) [n:127.0.0.1:35003_ c:collection1 s:shard2 r:core_node46 x:collection1_shard2_replica_n45] o.a.s.c.RecoveryStrategy Publishing state of core [collection1_shard2_replica_n45] as recovering, leader is [http://127.0.0.1:41725/collection1_shard2_replica_n41/] and I am [http://127.0.0.1:35003/collection1_shard2_replica_n45/]
   [junit4]   2> 977994 INFO  (recoveryExecutor-2323-thread-1-processing-n:127.0.0.1:35003_ x:collection1_shard2_replica_n45 c:collection1 s:shard2 r:core_node46) [n:127.0.0.1:35003_ c:collection1 s:shard2 r:core_node46 x:collection1_shard2_replica_n45] o.a.s.c.ZkShardTerms Successful update of terms at /collections/collection1/terms/shard2 to Terms{values={core_node42=0, core_node46=0, core_node46_recovering=0}, version=2}
   [junit4]   2> 977996 INFO  (recoveryExecutor-2323-thread-1-processing-n:127.0.0.1:35003_ x:collection1_shard2_replica_n45 c:collection1 s:shard2 r:core_node46) [n:127.0.0.1:35003_ c:collection1 s:shard2 r:core_node46 x:collection1_shard2_replica_n45] o.a.s.c.RecoveryStrategy Sending prep recovery command to [http://127.0.0.1:41725]; [WaitForState: action=PREPRECOVERY&core=collection1_shard2_replica_n41&nodeName=127.0.0.1:35003_&coreNodeName=core_node46&state=recovering&checkLive=true&onlyIfLeader=true&onlyIfLeaderActive=true]
   [junit4]   2> 977997 INFO  (qtp440281205-9662) [n:127.0.0.1:41725_    x:collection1_shard2_replica_n41] o.a.s.h.a.PrepRecoveryOp Going to wait for coreNodeName: core_node46, state: recovering, checkLive: true, onlyIfLeader: true, onlyIfLeaderActive: true, maxTime: 183 s
   [junit4]   2> 977997 INFO  (qtp440281205-9662) [n:127.0.0.1:41725_    x:collection1_shard2_replica_n41] o.a.s.h.a.PrepRecoveryOp In WaitForState(recovering): collection=collection1, shard=shard2, thisCore=collection1_shard2_replica_n41, leaderDoesNotNeedRecovery=false, isLeader? true, live=true, checkLive=true, currentState=down, localState=active, nodeName=127.0.0.1:35003_, coreNodeName=core_node46, onlyIfActiveCheckResult=false, nodeProps: core_node46:{"core":"collection1_shard2_replica_n45","base_url":"http://127.0.0.1:35003","node_name":"127.0.0.1:35003_","state":"down","type":"NRT"}
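[Editor's note] At this point core_node46 has published itself as "recovering" and the shard2 leader's PrepRecoveryOp is waiting for that state before recovery proceeds. A client or test that wants to block until the collection settles can watch cluster state for all replicas to come back ACTIVE; the sketch below does that with ZkStateReader.waitForState and a CollectionStatePredicate. Only the collection name and ZooKeeper address come from the log; the timeout and everything else are assumptions, and this is not code from the test itself.

    import java.util.Collections;
    import java.util.Optional;
    import java.util.concurrent.TimeUnit;

    import org.apache.solr.client.solrj.impl.CloudSolrClient;
    import org.apache.solr.common.cloud.Replica;
    import org.apache.solr.common.cloud.Slice;

    public class WaitForActiveReplicasSketch {
      public static void main(String[] args) throws Exception {
        try (CloudSolrClient client = new CloudSolrClient.Builder(
            Collections.singletonList("127.0.0.1:34819"), Optional.of("/solr")).build()) {
          client.connect();
          // Block (up to 30s, an assumed timeout) until every replica of collection1
          // reports ACTIVE and is hosted on a live node.
          client.getZkStateReader().waitForState("collection1", 30, TimeUnit.SECONDS,
              (liveNodes, state) -> {
                if (state == null) return false;
                for (Slice slice : state.getSlices()) {
                  for (Replica replica : slice.getReplicas()) {
                    if (replica.getState() != Replica.State.ACTIVE
                        || !liveNodes.contains(replica.getNodeName())) {
                      return false;
                    }
                  }
                }
                return true;
              });
        }
      }
    }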
   [junit4]   2> 978056 INFO  (TEST-SSLMigrationTest.test-seed#[D43070E746C4A5D]) [    ] o.a.s.c.AbstractFullDistribZkTestBase create jetty 4 in directory /home/jenkins/workspace/Lucene-Solr-BadApples-7.x-Linux/solr/build/solr-core/test/J1/temp/solr.cloud.SSLMigrationTest_D43070E746C4A5D-001/shard-4-001 of type NRT
   [junit4]   2> 978057 INFO  (TEST-SSLMigrationTest.test-seed#[D43070E746C4A5D]) [    ] o.e.j.s.Server jetty-9.4.10.v20180503; built: 2018-05-03T15:56:21.710Z; git: daa59876e6f384329b122929e70a80934569428c; jvm 1.8.0_162-b12
   [junit4]   2> 978057 INFO  (TEST-SSLMigrationTest.test-seed#[D43070E746C4A5D]) [    ] o.e.j.s.session DefaultSessionIdManager workerName=node0
   [junit4]   2> 978057 INFO  (TEST-SSLMigrationTest.test-seed#[D43070E746C4A5D]) [    ] o.e.j.s.session No SessionScavenger set, using defaults
   [junit4]   2> 978057 INFO  (TEST-SSLMigrationTest.test-seed#[D43070E746C4A5D]) [    ] o.e.j.s.session node0 Scavenging every 660000ms
   [junit4]   2> 978058 INFO  (TEST-SSLMigrationTest.test-seed#[D43070E746C4A5D]) [    ] o.e.j.s.h.ContextHandler Started o.e.j.s.ServletContextHandler@5037a793{/,null,AVAILABLE}
   [junit4]   2> 978058 INFO  (TEST-SSLMigrationTest.test-seed#[D43070E746C4A5D]) [    ] o.e.j.s.AbstractConnector Started ServerConnector@46a7249{HTTP/1.1,[http/1.1]}{127.0.0.1:34155}
   [junit4]   2> 978058 INFO  (TEST-SSLMigrationTest.test-seed#[D43070E746C4A5D]) [    ] o.e.j.s.Server Started @978085ms
   [junit4]   2> 978058 INFO  (TEST-SSLMigrationTest.test-seed#[D43070E746C4A5D]) [    ] o.a.s.c.s.e.JettySolrRunner Jetty properties: {solr.data.dir=/home/jenkins/workspace/Lucene-Solr-BadApples-7.x-Linux/solr/build/solr-core/test/J1/temp/solr.cloud.SSLMigrationTest_D43070E746C4A5D-001/tempDir-001/jetty4, replicaType=NRT, solrconfig=solrconfig.xml, hostContext=/, hostPort=34155, coreRootDirectory=/home/jenkins/workspace/Lucene-Solr-BadApples-7.x-Linux/solr/build/solr-core/test/J1/temp/solr.cloud.SSLMigrationTest_D43070E746C4A5D-001/shard-4-001/cores}
   [junit4]   2> 978058 ERROR (TEST-SSLMigrationTest.test-seed#[D43070E746C4A5D]) [    ] o.a.s.u.StartupLoggingUtils Missing Java Option solr.log.dir. Logging may be missing or incomplete.
   [junit4]   2> 978059 INFO  (TEST-SSLMigrationTest.test-seed#[D43070E746C4A5D]) [    ] o.a.s.s.SolrDispatchFilter Using logger factory org.apache.logging.slf4j.Log4jLoggerFactory
   [junit4]   2> 978059 INFO  (TEST-SSLMigrationTest.test-seed#[D43070E746C4A5D]) [    ] o.a.s.s.SolrDispatchFilter  ___      _       Welcome to Apache Solr™ version 7.4.0
   [junit4]   2> 978059 INFO  (TEST-SSLMigrationTest.test-seed#[D43070E746C4A5D]) [    ] o.a.s.s.SolrDis

[...truncated too long message...]

oller.java:1254) ~[java/:?]
   [junit4]   2> 	at org.apache.solr.cloud.ZkController.getLeader(ZkController.java:1210) ~[java/:?]
   [junit4]   2> 	at org.apache.solr.cloud.ZkController.register(ZkController.java:1094) ~[java/:?]
   [junit4]   2> 	at org.apache.solr.cloud.ZkController.register(ZkController.java:1025) ~[java/:?]
   [junit4]   2> 	at org.apache.solr.core.ZkContainer.lambda$registerInZk$0(ZkContainer.java:187) ~[java/:?]
   [junit4]   2> 	at org.apache.solr.common.util.ExecutorUtil$MDCAwareThreadPoolExecutor.lambda$execute$0(ExecutorUtil.java:192) ~[java/:?]
   [junit4]   2> 	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149) [?:1.8.0_162]
   [junit4]   2> 	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624) [?:1.8.0_162]
   [junit4]   2> 	at java.lang.Thread.run(Thread.java:748) [?:1.8.0_162]
   [junit4]   2> 35215 ERROR (coreZkRegister-101-thread-1-processing-n:127.0.0.1:33215_ x:collection1_shard1_replica_n47 c:collection1 s:shard1 r:core_node48) [n:127.0.0.1:33215_ c:collection1 s:shard1 r:core_node48 x:collection1_shard1_replica_n47] o.a.s.c.ZkContainer :org.apache.solr.common.SolrException: Error getting leader from zk for shard shard1
   [junit4]   2> 	at org.apache.solr.cloud.ZkController.getLeader(ZkController.java:1243)
   [junit4]   2> 	at org.apache.solr.cloud.ZkController.register(ZkController.java:1094)
   [junit4]   2> 	at org.apache.solr.cloud.ZkController.register(ZkController.java:1025)
   [junit4]   2> 	at org.apache.solr.core.ZkContainer.lambda$registerInZk$0(ZkContainer.java:187)
   [junit4]   2> 	at org.apache.solr.common.util.ExecutorUtil$MDCAwareThreadPoolExecutor.lambda$execute$0(ExecutorUtil.java:192)
   [junit4]   2> 	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
   [junit4]   2> 	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
   [junit4]   2> 	at java.lang.Thread.run(Thread.java:748)
   [junit4]   2> Caused by: org.apache.solr.common.SolrException: CoreContainer is closed
   [junit4]   2> 	at org.apache.solr.cloud.ZkController.getLeaderProps(ZkController.java:1287)
   [junit4]   2> 	at org.apache.solr.cloud.ZkController.getLeaderProps(ZkController.java:1254)
   [junit4]   2> 	at org.apache.solr.cloud.ZkController.getLeader(ZkController.java:1210)
   [junit4]   2> 	... 7 more
   [junit4]   2> 
   [junit4]   2> 35217 INFO  (TEST-SSLMigrationTest.test-seed#[D43070E746C4A5D]) [    ] o.e.j.s.AbstractConnector Stopped ServerConnector@52efbd41{SSL,[ssl, http/1.1]}{127.0.0.1:33215}
   [junit4]   2> 35217 INFO  (TEST-SSLMigrationTest.test-seed#[D43070E746C4A5D]) [    ] o.e.j.s.h.ContextHandler Stopped o.e.j.s.ServletContextHandler@22698877{/,null,UNAVAILABLE}
   [junit4]   2> 35217 INFO  (TEST-SSLMigrationTest.test-seed#[D43070E746C4A5D]) [    ] o.e.j.s.session node0 Stopped scavenging
   [junit4]   2> 35217 ERROR (TEST-SSLMigrationTest.test-seed#[D43070E746C4A5D]) [    ] o.a.z.s.ZooKeeperServer ZKShutdownHandler is not registered, so ZooKeeper server won't take any action on ERROR or SHUTDOWN server state changes
   [junit4]   2> 35218 INFO  (TEST-SSLMigrationTest.test-seed#[D43070E746C4A5D]) [    ] o.a.s.c.ZkTestServer connecting to 127.0.0.1:43367 43367
   [junit4]   2> 36768 INFO  (Thread-1) [    ] o.a.s.c.ZkTestServer connecting to 127.0.0.1:43367 43367
   [junit4]   2> 36769 WARN  (Thread-1) [    ] o.a.s.c.ZkTestServer Watch limit violations: 
   [junit4]   2> Maximum concurrent create/delete watches above limit:
   [junit4]   2> 
   [junit4]   2> 	20	/solr/aliases.json
   [junit4]   2> 	9	/solr/security.json
   [junit4]   2> 	9	/solr/configs/conf1
   [junit4]   2> 	9	/solr/collections/collection1/terms/shard1
   [junit4]   2> 	9	/solr/collections/collection1/terms/shard2
   [junit4]   2> 	2	/solr/collections/control_collection/terms/shard1
   [junit4]   2> 
   [junit4]   2> Maximum concurrent data watches above limit:
   [junit4]   2> 
   [junit4]   2> 	20	/solr/clusterprops.json
   [junit4]   2> 	20	/solr/clusterstate.json
   [junit4]   2> 	14	/solr/collections/collection1/state.json
   [junit4]   2> 	5	/solr/overseer_elect/election/72181778595250180-127.0.0.1:39687_-n_0000000000
   [junit4]   2> 	4	/solr/autoscaling.json
   [junit4]   2> 
   [junit4]   2> Maximum concurrent children watches above limit:
   [junit4]   2> 
   [junit4]   2> 	20	/solr/live_nodes
   [junit4]   2> 	20	/solr/collections
   [junit4]   2> 	4	/solr/autoscaling/events/.scheduled_maintenance
   [junit4]   2> 	4	/solr/overseer/collection-queue-work
   [junit4]   2> 	4	/solr/overseer/queue-work
   [junit4]   2> 	3	/solr/overseer/queue
   [junit4]   2> 	3	/solr/autoscaling/events/.auto_add_replicas
   [junit4]   2> 
   [junit4]   2> NOTE: reproduce with: ant test  -Dtestcase=SSLMigrationTest -Dtests.method=test -Dtests.seed=D43070E746C4A5D -Dtests.multiplier=3 -Dtests.slow=true -Dtests.badapples=true -Dtests.locale=th -Dtests.timezone=America/Tegucigalpa -Dtests.asserts=true -Dtests.file.encoding=UTF-8
   [junit4] FAILURE 34.2s J2 | SSLMigrationTest.test <<<
   [junit4]    > Throwable #1: java.lang.AssertionError: Replica didn't have the proper urlScheme in the ClusterState
   [junit4]    > 	at __randomizedtesting.SeedInfo.seed([D43070E746C4A5D:851738D4DA9027A5]:0)
   [junit4]    > 	at org.apache.solr.cloud.SSLMigrationTest.assertReplicaInformation(SSLMigrationTest.java:104)
   [junit4]    > 	at org.apache.solr.cloud.SSLMigrationTest.testMigrateSSL(SSLMigrationTest.java:97)
   [junit4]    > 	at org.apache.solr.cloud.SSLMigrationTest.test(SSLMigrationTest.java:61)
   [junit4]    > 	at org.apache.solr.BaseDistributedSearchTestCase$ShardsRepeatRule$ShardsFixedStatement.callStatement(BaseDistributedSearchTestCase.java:993)
   [junit4]    > 	at org.apache.solr.BaseDistributedSearchTestCase$ShardsRepeatRule$ShardsStatement.evaluate(BaseDistributedSearchTestCase.java:968)
   [junit4]    > 	at java.lang.Thread.run(Thread.java:748)
   [junit4]   2> NOTE: leaving temporary files on disk at: /home/jenkins/workspace/Lucene-Solr-BadApples-7.x-Linux/solr/build/solr-core/test/J2/temp/solr.cloud.SSLMigrationTest_D43070E746C4A5D-001
   [junit4]   2> NOTE: test params are: codec=Asserting(Lucene70): {}, docValues:{}, maxPointsInLeafNode=27, maxMBSortInHeap=6.3160092135698145, sim=RandomSimilarity(queryNorm=false): {}, locale=th, timezone=America/Tegucigalpa
   [junit4]   2> NOTE: Linux 4.13.0-39-generic amd64/Oracle Corporation 1.8.0_162 (64-bit)/cpus=8,threads=1,free=336497472,total=436207616
   [junit4]   2> NOTE: All tests run in this JVM: [SSLMigrationTest]
   [junit4] Completed [1/5 (1!)] on J2 in 35.89s, 1 test, 1 failure <<< FAILURES!
   [junit4] 
   [junit4] Suite: org.apache.solr.cloud.SSLMigrationTest
   [junit4] OK      36.5s J1 | SSLMigrationTest.test
   [junit4] Completed [2/5 (1!)] on J1 in 38.99s, 1 test
   [junit4] 
   [junit4] Duplicate suite name used with XML reports: org.apache.solr.cloud.SSLMigrationTest. This may confuse tools that process XML reports. Set 'ignoreDuplicateSuites' to true to skip this message.
   [junit4] Suite: org.apache.solr.cloud.SSLMigrationTest
   [junit4] OK      40.9s J0 | SSLMigrationTest.test
   [junit4] Completed [3/5 (1!)] on J0 in 42.80s, 1 test
   [junit4] 
   [junit4] Suite: org.apache.solr.cloud.SSLMigrationTest
   [junit4] OK      30.5s J2 | SSLMigrationTest.test
   [junit4] Completed [4/5 (1!)] on J2 in 30.90s, 1 test
   [junit4] 
   [junit4] Suite: org.apache.solr.cloud.SSLMigrationTest
   [junit4] OK      27.2s J1 | SSLMigrationTest.test
   [junit4] Completed [5/5 (1!)] on J1 in 28.04s, 1 test
   [junit4] 
   [junit4] 
   [junit4] Tests with failures [seed: D43070E746C4A5D]:
   [junit4]   - org.apache.solr.cloud.SSLMigrationTest.test
   [junit4] 
   [junit4] 
   [junit4] JVM J0:     0.46 ..    44.08 =    43.61s
   [junit4] JVM J1:     0.67 ..    68.60 =    67.93s
   [junit4] JVM J2:     0.67 ..    68.55 =    67.88s
   [junit4] Execution time total: 1 minute 8 seconds
   [junit4] Tests summary: 5 suites, 5 tests, 1 failure
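
The single failure above is the assertion at SSLMigrationTest.java:104: after the SSL migration, at least one replica's base_url in the ClusterState did not carry the expected scheme. Below is a minimal SolrJ sketch of that kind of check; it is illustrative only (not the test's actual code) and assumes an already-connected CloudSolrClient pointing at the cluster under test.

    import org.apache.solr.client.solrj.impl.CloudSolrClient;
    import org.apache.solr.common.cloud.DocCollection;
    import org.apache.solr.common.cloud.Replica;
    import org.apache.solr.common.cloud.ZkStateReader;

    public class UrlSchemeCheck {
      // Assert that every replica of a collection advertises the expected scheme
      // in its base_url, as published in ZooKeeper's cluster state. Illustrative
      // sketch of the failed check, not SSLMigrationTest's implementation.
      static void assertUrlScheme(CloudSolrClient client, String collection, String expectedScheme) {
        DocCollection coll = client.getZkStateReader().getClusterState().getCollection(collection);
        for (Replica replica : coll.getReplicas()) {
          String baseUrl = replica.getStr(ZkStateReader.BASE_URL_PROP); // e.g. http://127.0.0.1:35003
          if (baseUrl == null || !baseUrl.startsWith(expectedScheme + "://")) {
            throw new AssertionError("Replica " + replica.getName()
                + " didn't have the proper urlScheme in the ClusterState: " + baseUrl);
          }
        }
      }
    }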

BUILD FAILED
/home/jenkins/workspace/Lucene-Solr-BadApples-7.x-Linux/lucene/common-build.xml:1568: The following error occurred while executing this line:
/home/jenkins/workspace/Lucene-Solr-BadApples-7.x-Linux/lucene/common-build.xml:1092: There were test failures: 5 suites, 5 tests, 1 failure [seed: D43070E746C4A5D]

Total time: 1 minute 10 seconds

[repro] Setting last failure code to 256

[repro] Failures:
[repro]   1/5 failed: org.apache.solr.cloud.SSLMigrationTest
[repro] Exiting with code 256
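
The [repro] pass above re-ran the suite five times with the same seed and hit the failure only once (1/5), consistent with the BadApples (known-flaky) scope of this job. The test exercises migrating the cluster between http and https, which in SolrCloud is driven by the urlScheme cluster property; a sketch of setting that property through SolrJ is below (illustrative only, assuming a connected CloudSolrClient; nodes still have to be restarted with matching SSL settings before the new scheme shows up in base_url).

    import java.io.IOException;

    import org.apache.solr.client.solrj.SolrServerException;
    import org.apache.solr.client.solrj.impl.CloudSolrClient;
    import org.apache.solr.client.solrj.request.CollectionAdminRequest;
    import org.apache.solr.common.cloud.ZkStateReader;

    public class UrlSchemeSwitch {
      // Set the cluster-wide urlScheme property via the Collections API.
      // Illustrative sketch, not the test's code; replicas pick the scheme up in
      // base_url when their nodes are (re)started with the corresponding SSL config.
      static void setUrlScheme(CloudSolrClient client, boolean https)
          throws SolrServerException, IOException {
        CollectionAdminRequest.setClusterProperty(
            ZkStateReader.URL_SCHEME, https ? "https" : "http").process(client);
      }
    }
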
+ mv lucene/build lucene/build.repro
+ mv solr/build solr/build.repro
+ mv lucene/build.orig lucene/build
+ mv solr/build.orig solr/build
Archiving artifacts
Setting ANT_1_8_2_HOME=/var/lib/jenkins/tools/hudson.tasks.Ant_AntInstallation/ANT_1.8.2
[WARNINGS] Parsing warnings in console log with parser Java Compiler (javac)
Setting ANT_1_8_2_HOME=/var/lib/jenkins/tools/hudson.tasks.Ant_AntInstallation/ANT_1.8.2
Setting ANT_1_8_2_HOME=/var/lib/jenkins/tools/hudson.tasks.Ant_AntInstallation/ANT_1.8.2
<Git Blamer> Using GitBlamer to create author and commit information for all warnings.
<Git Blamer> GIT_COMMIT=96f6c65e43445bbb0b77604c8c6550ca654b5de8, workspace=/var/lib/jenkins/workspace/Lucene-Solr-BadApples-7.x-Linux
[WARNINGS] Computing warning deltas based on reference build #36
Recording test results
Setting ANT_1_8_2_HOME=/var/lib/jenkins/tools/hudson.tasks.Ant_AntInstallation/ANT_1.8.2
Build step 'Publish JUnit test result report' changed build result to UNSTABLE
Email was triggered for: Unstable (Test Failures)
Sending email for trigger: Unstable (Test Failures)
Setting ANT_1_8_2_HOME=/var/lib/jenkins/tools/hudson.tasks.Ant_AntInstallation/ANT_1.8.2