Posted to issues@lucene.apache.org by GitBox <gi...@apache.org> on 2021/06/01 00:55:19 UTC

[GitHub] [lucene] zacharymorn commented on a change in pull request #128: LUCENE-9662: CheckIndex should be concurrent - parallelizing index check across segments

zacharymorn commented on a change in pull request #128:
URL: https://github.com/apache/lucene/pull/128#discussion_r642719647



##########
File path: lucene/core/src/java/org/apache/lucene/index/CheckIndex.java
##########
@@ -605,209 +680,103 @@ public Status checkIndex(List<String> onlySegments) throws IOException {
     result.newSegments.clear();
     result.maxSegmentName = -1;
 
-    for (int i = 0; i < numSegments; i++) {
-      final SegmentCommitInfo info = sis.info(i);
-      long segmentName = Long.parseLong(info.info.name.substring(1), Character.MAX_RADIX);
-      if (segmentName > result.maxSegmentName) {
-        result.maxSegmentName = segmentName;
-      }
-      if (onlySegments != null && !onlySegments.contains(info.info.name)) {
-        continue;
-      }
-      Status.SegmentInfoStatus segInfoStat = new Status.SegmentInfoStatus();
-      result.segmentInfos.add(segInfoStat);
-      msg(
-          infoStream,
-          "  "
-              + (1 + i)
-              + " of "
-              + numSegments
-              + ": name="
-              + info.info.name
-              + " maxDoc="
-              + info.info.maxDoc());
-      segInfoStat.name = info.info.name;
-      segInfoStat.maxDoc = info.info.maxDoc();
-
-      final Version version = info.info.getVersion();
-      if (info.info.maxDoc() <= 0) {
-        throw new RuntimeException("illegal number of documents: maxDoc=" + info.info.maxDoc());
-      }
-
-      int toLoseDocCount = info.info.maxDoc();
-
-      SegmentReader reader = null;
-
-      try {
-        msg(infoStream, "    version=" + (version == null ? "3.0" : version));
-        msg(infoStream, "    id=" + StringHelper.idToString(info.info.getId()));
-        final Codec codec = info.info.getCodec();
-        msg(infoStream, "    codec=" + codec);
-        segInfoStat.codec = codec;
-        msg(infoStream, "    compound=" + info.info.getUseCompoundFile());
-        segInfoStat.compound = info.info.getUseCompoundFile();
-        msg(infoStream, "    numFiles=" + info.files().size());
-        Sort indexSort = info.info.getIndexSort();
-        if (indexSort != null) {
-          msg(infoStream, "    sort=" + indexSort);
-        }
-        segInfoStat.numFiles = info.files().size();
-        segInfoStat.sizeMB = info.sizeInBytes() / (1024. * 1024.);
-        msg(infoStream, "    size (MB)=" + nf.format(segInfoStat.sizeMB));
-        Map<String, String> diagnostics = info.info.getDiagnostics();
-        segInfoStat.diagnostics = diagnostics;
-        if (diagnostics.size() > 0) {
-          msg(infoStream, "    diagnostics = " + diagnostics);
+    // checks segments sequentially
+    if (executorService == null) {
+      for (int i = 0; i < numSegments; i++) {
+        final SegmentCommitInfo info = sis.info(i);
+        updateMaxSegmentName(result, info);
+        if (onlySegments != null && !onlySegments.contains(info.info.name)) {
+          continue;
         }
 
-        if (!info.hasDeletions()) {
-          msg(infoStream, "    no deletions");
-          segInfoStat.hasDeletions = false;
-        } else {
-          msg(infoStream, "    has deletions [delGen=" + info.getDelGen() + "]");
-          segInfoStat.hasDeletions = true;
-          segInfoStat.deletionsGen = info.getDelGen();
-        }
-
-        long startOpenReaderNS = System.nanoTime();
-        if (infoStream != null) infoStream.print("    test: open reader.........");
-        reader = new SegmentReader(info, sis.getIndexCreatedVersionMajor(), IOContext.DEFAULT);
         msg(
             infoStream,
-            String.format(
-                Locale.ROOT, "OK [took %.3f sec]", nsToSec(System.nanoTime() - startOpenReaderNS)));
+            (1 + i)
+                + " of "
+                + numSegments
+                + ": name="
+                + info.info.name
+                + " maxDoc="
+                + info.info.maxDoc());
+        Status.SegmentInfoStatus segmentInfoStatus = testSegment(sis, info, infoStream);
+
+        processSegmentInfoStatusResult(result, info, segmentInfoStatus);
+      }
+    } else {
+      ByteArrayOutputStream[] outputs = new ByteArrayOutputStream[numSegments];
+      @SuppressWarnings({"unchecked", "rawtypes"})
+      CompletableFuture<Status.SegmentInfoStatus>[] futures = new CompletableFuture[numSegments];
+
+      // checks segments concurrently
+      for (int i = 0; i < numSegments; i++) {
+        final SegmentCommitInfo info = sis.info(i);
+        updateMaxSegmentName(result, info);
+        if (onlySegments != null && !onlySegments.contains(info.info.name)) {
+          continue;
+        }
 
-        segInfoStat.openReaderPassed = true;
+        SegmentInfos finalSis = sis;
 
-        long startIntegrityNS = System.nanoTime();
-        if (infoStream != null) infoStream.print("    test: check integrity.....");
-        reader.checkIntegrity();
+        ByteArrayOutputStream output = new ByteArrayOutputStream();
+        PrintStream stream;
+        if (i > 0) {
+          // buffer the messages for segments starting from the 2nd one so that they can later be
+          // printed in order
+          stream = new PrintStream(output, true, IOUtils.UTF_8);
+        } else {
+          // optimize for first segment to print real-time

Review comment:
    I've implemented the above by sorting segments in increasing order of their total file size. Here are the test results:
   
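    The buffering pattern in the diff above can be sketched in isolation roughly as follows. This is a simplified, hypothetical stand-in, not Lucene API: `runChecks` and the segment names are illustrative, and the real code additionally streams the first segment's output in real time instead of buffering it. Each per-segment task writes to its own `ByteArrayOutputStream` on a worker thread, and the buffers are drained in submission order so the combined log still reads sequentially:
    
    ```java
    import java.io.ByteArrayOutputStream;
    import java.io.PrintStream;
    import java.nio.charset.StandardCharsets;
    import java.util.List;
    import java.util.concurrent.CompletableFuture;
    import java.util.concurrent.ExecutorService;
    import java.util.concurrent.Executors;
    
    public class OrderedConcurrentChecks {
    
      // Run one (mock) check per segment concurrently, but return the
      // buffered output in the original segment order.
      public static String runChecks(List<String> segments) {
        ExecutorService executor = Executors.newFixedThreadPool(4);
        ByteArrayOutputStream[] outputs = new ByteArrayOutputStream[segments.size()];
        @SuppressWarnings({"unchecked", "rawtypes"})
        CompletableFuture<Void>[] futures = new CompletableFuture[segments.size()];
    
        for (int i = 0; i < segments.size(); i++) {
          final String name = segments.get(i);
          outputs[i] = new ByteArrayOutputStream();
          // Each task gets a private stream backed by its own buffer, so
          // concurrent tasks never interleave their messages.
          final PrintStream stream =
              new PrintStream(outputs[i], true, StandardCharsets.UTF_8);
          futures[i] =
              CompletableFuture.runAsync(
                  () -> stream.println("checked segment " + name), executor);
        }
    
        // Wait for all checks, then concatenate buffers in submission order.
        CompletableFuture.allOf(futures).join();
        executor.shutdown();
        StringBuilder combined = new StringBuilder();
        for (ByteArrayOutputStream out : outputs) {
          combined.append(out.toString(StandardCharsets.UTF_8));
        }
        return combined.toString();
      }
    
      public static void main(String[] args) {
        System.out.print(runChecks(List.of("_h2", "_gx", "_65", "_32")));
      }
    }
    ```
    
    Sorting the work items by size before submission (as described above) only changes which tasks start first; the drain loop is what keeps the printed report deterministic.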
   ### Full check on good index
   ```
   5:12:03 PM: Executing task 'CheckIndex.main()'...
   
   > Task :buildSrc:compileJava UP-TO-DATE
   > Task :buildSrc:compileGroovy NO-SOURCE
   > Task :buildSrc:processResources NO-SOURCE
   > Task :buildSrc:classes UP-TO-DATE
   > Task :buildSrc:jar UP-TO-DATE
   > Task :buildSrc:assemble UP-TO-DATE
   > Task :buildSrc:compileTestJava NO-SOURCE
   > Task :buildSrc:compileTestGroovy NO-SOURCE
   > Task :buildSrc:processTestResources NO-SOURCE
   > Task :buildSrc:testClasses UP-TO-DATE
   > Task :buildSrc:test NO-SOURCE
   > Task :buildSrc:check UP-TO-DATE
   > Task :buildSrc:build UP-TO-DATE
   
   > Configure project :
   IntelliJ Idea IDE detected.
   
   > Task :errorProneSkipped
   WARNING: errorprone disabled (skipped on non-nightly runs)
   
   > Task :lucene:core:processResources UP-TO-DATE
   > Task :lucene:core:compileJava
   > Task :lucene:core:classes
   
   > Task :lucene:core:CheckIndex.main()
   
   NOTE: testing will be more thorough if you run java with '-ea:org.apache.lucene...', so assertions are enabled
   
   Opening index @ /Users/xichen/IdeaProjects/benchmarks/indices/wikibigall.lucene_baseline.facets.taxonomy:Date.taxonomy:Month.taxonomy:DayOfYear.sortedset:Month.sortedset:DayOfYear.Lucene90.Lucene90.nd6.64758M/index
   
   Checking index with async threadCount: 12
   0.00% total deletions; 6647577 documents; 0 deletions
   Segments file=segments_2 numSegments=15 version=9.0.0 id=59c6he3dhebad46x7proh30nq userData={userData=multi}
   1 of 15: name=_h2 maxDoc=11248
       version=9.0.0
       id=59c6he3dhebad46x7proh30nm
       codec=Lucene90
       compound=false
       numFiles=17
       size (MB)=10.617
       diagnostics = {os.version=10.15.5, java.runtime.version=11.0.9+11, os.arch=x86_64, source=flush, lucene.version=9.0.0, java.vm.version=11.0.9+11, java.version=11.0.9, timestamp=1622102791291, os=Mac OS X, java.vendor=AdoptOpenJDK}
       no deletions
       test: open reader.........OK [took 0.167 sec]
       test: check integrity.....OK [took 0.097 sec]
       test: check live docs.....OK [took 0.000 sec]
       test: field infos.........OK [17 fields] [took 0.000 sec]
       test: field norms.........OK [2 fields] [took 0.007 sec]
       test: terms, freq, prox...OK [253387 terms; 1570705 terms/docs pairs; 3390075 tokens] [took 0.771 sec]
       test: stored fields.......OK [33744 total field count; avg 3.0 fields per doc] [took 0.057 sec]
       test: term vectors........OK [0 total term vector count; avg 0.0 term/freq vector fields per doc] [took 0.000 sec]
       test: docvalues...........OK [10 docvalues fields; 3 BINARY; 0 NUMERIC; 5 SORTED; 0 SORTED_NUMERIC; 2 SORTED_SET] [took 0.111 sec]
       test: points..............OK [2 fields, 22496 points] [took 0.016 sec]
       test: vectors.............OK [0 fields, 0 vectors] [took 0.000 sec]
   
   2 of 15: name=_h1 maxDoc=11979
       version=9.0.0
       id=59c6he3dhebad46x7proh30nj
       codec=Lucene90
       compound=false
       numFiles=17
       size (MB)=12.824
       diagnostics = {os.version=10.15.5, java.runtime.version=11.0.9+11, os.arch=x86_64, source=flush, lucene.version=9.0.0, java.vm.version=11.0.9+11, java.version=11.0.9, timestamp=1622102788648, os=Mac OS X, java.vendor=AdoptOpenJDK}
       no deletions
       test: open reader.........OK [took 0.166 sec]
       test: check integrity.....OK [took 0.108 sec]
       test: check live docs.....OK [took 0.000 sec]
       test: field infos.........OK [17 fields] [took 0.000 sec]
       test: field norms.........OK [2 fields] [took 0.004 sec]
       test: terms, freq, prox...OK [290488 terms; 1843478 terms/docs pairs; 4383419 tokens] [took 0.880 sec]
       test: stored fields.......OK [35937 total field count; avg 3.0 fields per doc] [took 0.021 sec]
       test: term vectors........OK [0 total term vector count; avg 0.0 term/freq vector fields per doc] [took 0.000 sec]
       test: docvalues...........OK [10 docvalues fields; 3 BINARY; 0 NUMERIC; 5 SORTED; 0 SORTED_NUMERIC; 2 SORTED_SET] [took 0.083 sec]
       test: points..............OK [2 fields, 23958 points] [took 0.005 sec]
       test: vectors.............OK [0 fields, 0 vectors] [took 0.000 sec]
   
   ...
   ...
   
   14 of 15: name=_65 maxDoc=1197893
       version=9.0.0
       id=59c6he3dhebad46x7proh2zqv
       codec=Lucene90
       compound=false
       numFiles=17
       size (MB)=1,539.981
       diagnostics = {os.version=10.15.5, java.vendor=AdoptOpenJDK, source=merge, os.arch=x86_64, mergeFactor=10, java.runtime.version=11.0.9+11, os=Mac OS X, timestamp=1622100810971, lucene.version=9.0.0, java.vm.version=11.0.9+11, java.version=11.0.9, mergeMaxNumSegments=-1}
       no deletions
       test: open reader.........OK [took 0.018 sec]
       test: check integrity.....OK [took 14.172 sec]
       test: check live docs.....OK [took 0.000 sec]
       test: field infos.........OK [17 fields] [took 0.000 sec]
       test: field norms.........OK [2 fields] [took 0.045 sec]
       test: terms, freq, prox...OK [15042354 terms; 274837439 terms/docs pairs; 686566591 tokens] [took 74.763 sec]
       test: stored fields.......OK [3593679 total field count; avg 3.0 fields per doc] [took 0.910 sec]
       test: term vectors........OK [0 total term vector count; avg 0.0 term/freq vector fields per doc] [took 0.000 sec]
       test: docvalues...........OK [10 docvalues fields; 3 BINARY; 0 NUMERIC; 5 SORTED; 0 SORTED_NUMERIC; 2 SORTED_SET] [took 2.224 sec]
       test: points..............OK [2 fields, 2395786 points] [took 0.183 sec]
       test: vectors.............OK [0 fields, 0 vectors] [took 0.000 sec]
   
   15 of 15: name=_32 maxDoc=1197893
       version=9.0.0
       id=59c6he3dhebad46x7proh2zhm
       codec=Lucene90
       compound=false
       numFiles=17
       size (MB)=2,531.843
       diagnostics = {os.version=10.15.5, java.vendor=AdoptOpenJDK, source=merge, os.arch=x86_64, mergeFactor=10, java.runtime.version=11.0.9+11, os=Mac OS X, timestamp=1622100146526, lucene.version=9.0.0, java.vm.version=11.0.9+11, java.version=11.0.9, mergeMaxNumSegments=-1}
       no deletions
       test: open reader.........OK [took 0.020 sec]
       test: check integrity.....OK [took 21.075 sec]
       test: check live docs.....OK [took 0.014 sec]
       test: field infos.........OK [17 fields] [took 0.000 sec]
       test: field norms.........OK [2 fields] [took 0.041 sec]
       test: terms, freq, prox...OK [20065511 terms; 450728331 terms/docs pairs; 1175837878 tokens] [took 111.604 sec]
       test: stored fields.......OK [3593679 total field count; avg 3.0 fields per doc] [took 1.112 sec]
       test: term vectors........OK [0 total term vector count; avg 0.0 term/freq vector fields per doc] [took 0.000 sec]
       test: docvalues...........OK [10 docvalues fields; 3 BINARY; 0 NUMERIC; 5 SORTED; 0 SORTED_NUMERIC; 2 SORTED_SET] [took 2.648 sec]
       test: points..............OK [2 fields, 2395786 points] [took 0.207 sec]
       test: vectors.............OK [0 fields, 0 vectors] [took 0.000 sec]
   
   No problems were detected with this index.
   
   Took 138.332 sec total.
   
   
   BUILD SUCCESSFUL in 2m 22s
   4 actionable tasks: 3 executed, 1 up-to-date
   5:14:26 PM: Task execution finished 'CheckIndex.main()'.
   
   ```
   
   ### Full check on bad index
   ```
   > Task :lucene:core:CheckIndex.main()
   
   NOTE: testing will be more thorough if you run java with '-ea:org.apache.lucene...', so assertions are enabled
   
   Opening index @ /Users/xichen/IdeaProjects/benchmarks/indices/corrupted/index/
   
   Checking index with async threadCount: 12
   0.00% total deletions; 6647577 documents; 0 deletions
   Segments file=segments_2 numSegments=15 version=9.0.0 id=59c6he3dhebad46x7proh30nq userData={userData=multi}
   1 of 15: name=_h2 maxDoc=11248
       version=9.0.0
       id=59c6he3dhebad46x7proh30nm
       codec=Lucene90
       compound=false
       numFiles=17
       size (MB)=10.617
       diagnostics = {os.arch=x86_64, source=flush, lucene.version=9.0.0, java.vm.version=11.0.9+11, java.version=11.0.9, timestamp=1622102791291, os=Mac OS X, java.vendor=AdoptOpenJDK, os.version=10.15.5, java.runtime.version=11.0.9+11}
       no deletions
       test: open reader.........OK [took 0.101 sec]
       test: check integrity.....OK [took 0.047 sec]
       test: check live docs.....OK [took 0.000 sec]
       test: field infos.........OK [17 fields] [took 0.000 sec]
       test: field norms.........OK [2 fields] [took 0.005 sec]
       test: terms, freq, prox...OK [253387 terms; 1570705 terms/docs pairs; 3390075 tokens] [took 0.863 sec]
       test: stored fields.......OK [33744 total field count; avg 3.0 fields per doc] [took 0.059 sec]
       test: term vectors........OK [0 total term vector count; avg 0.0 term/freq vector fields per doc] [took 0.000 sec]
       test: docvalues...........OK [10 docvalues fields; 3 BINARY; 0 NUMERIC; 5 SORTED; 0 SORTED_NUMERIC; 2 SORTED_SET] [took 0.133 sec]
       test: points..............OK [2 fields, 22496 points] [took 0.015 sec]
       test: vectors.............OK [0 fields, 0 vectors] [took 0.000 sec]
   
   ...
   ...
   
   6 of 15: name=_gb maxDoc=119789
       version=9.0.0
       id=59c6he3dhebad46x7proh30ld
       codec=Lucene90
       compound=false
       numFiles=17
       size (MB)=125.605
       diagnostics = {os=Mac OS X, timestamp=1622102690942, lucene.version=9.0.0, java.vm.version=11.0.9+11, java.version=11.0.9, mergeMaxNumSegments=-1, os.version=10.15.5, java.vendor=AdoptOpenJDK, source=merge, os.arch=x86_64, mergeFactor=10, java.runtime.version=11.0.9+11}
       no deletions
       test: open reader.........OK [took 0.101 sec]
       test: check integrity.....OK [took 0.678 sec]
       test: check live docs.....OK [took 0.000 sec]
       test: field infos.........OK [17 fields] [took 0.000 sec]
       test: field norms.........OK [2 fields] [took 0.007 sec]
       test: terms, freq, prox...OK [1773712 terms; 20129621 terms/docs pairs; 51648295 tokens] [took 5.681 sec]
       test: stored fields.......OK [359367 total field count; avg 3.0 fields per doc] [took 0.168 sec]
       test: term vectors........OK [0 total term vector count; avg 0.0 term/freq vector fields per doc] [took 0.000 sec]
       test: docvalues...........OK [10 docvalues fields; 3 BINARY; 0 NUMERIC; 5 SORTED; 0 SORTED_NUMERIC; 2 SORTED_SET] [took 0.856 sec]
       test: points..............OK [2 fields, 239578 points] [took 0.037 sec]
       test: vectors.............OK [0 fields, 0 vectors] [took 0.000 sec]
   
   7 of 15: name=_gx maxDoc=119789
       version=9.0.0
       id=59c6he3dhebad46x7proh30n7
       codec=Lucene90
       compound=false
       numFiles=17
       size (MB)=129.046
       diagnostics = {os=Mac OS X, timestamp=1622102767300, lucene.version=9.0.0, java.vm.version=11.0.9+11, java.version=11.0.9, mergeMaxNumSegments=-1, os.version=10.15.5, java.vendor=AdoptOpenJDK, source=merge, os.arch=x86_64, mergeFactor=10, java.runtime.version=11.0.9+11}
       no deletions
       test: open reader.........OK [took 0.101 sec]
       test: check integrity.....FAILED
       WARNING: exorciseIndex() would remove reference to this segment; full exception:
   org.apache.lucene.index.CorruptIndexException: checksum failed (hardware problem?) : expected=87e2aa4 actual=7b3afcbd (resource=BufferedChecksumIndexInput(MMapIndexInput(path="/Users/xichen/IdeaProjects/benchmarks/indices/corrupted/index/_gx_Lucene90_0.dvd")))
   	at org.apache.lucene.codecs.CodecUtil.checkFooter(CodecUtil.java:440)
   	at org.apache.lucene.codecs.CodecUtil.checksumEntireFile(CodecUtil.java:614)
   	at org.apache.lucene.codecs.lucene90.Lucene90DocValuesProducer.checkIntegrity(Lucene90DocValuesProducer.java:1656)
   	at org.apache.lucene.codecs.perfield.PerFieldDocValuesFormat$FieldsReader.checkIntegrity(PerFieldDocValuesFormat.java:364)
   	at org.apache.lucene.index.CodecReader.checkIntegrity(CodecReader.java:252)
   	at org.apache.lucene.index.SegmentReader.checkIntegrity(SegmentReader.java:391)
   	at org.apache.lucene.index.CheckIndex.testSegment(CheckIndex.java:925)
   	at org.apache.lucene.index.CheckIndex.lambda$checkIndex$1(CheckIndex.java:756)
   	at org.apache.lucene.index.CheckIndex.lambda$callableToSupplier$2(CheckIndex.java:854)
   	at java.base/java.util.concurrent.CompletableFuture$AsyncSupply.run(CompletableFuture.java:1700)
   	at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128)
   	at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628)
   	at java.base/java.lang.Thread.run(Thread.java:834)
   
   14 of 15: name=_65 maxDoc=1197893
       version=9.0.0
       id=59c6he3dhebad46x7proh2zqv
       codec=Lucene90
       compound=false
       numFiles=17
       size (MB)=1,539.981
       diagnostics = {os=Mac OS X, timestamp=1622100810971, lucene.version=9.0.0, java.vm.version=11.0.9+11, java.version=11.0.9, mergeMaxNumSegments=-1, os.version=10.15.5, java.vendor=AdoptOpenJDK, source=merge, os.arch=x86_64, mergeFactor=10, java.runtime.version=11.0.9+11}
       no deletions
       test: open reader.........OK [took 0.011 sec]
       test: check integrity.....OK [took 11.858 sec]
       test: check live docs.....OK [took 0.000 sec]
       test: field infos.........OK [17 fields] [took 0.000 sec]
       test: field norms.........OK [2 fields] [took 0.044 sec]
       test: terms, freq, prox...OK [15042354 terms; 274837439 terms/docs pairs; 686566591 tokens] [took 83.292 sec]
       test: stored fields.......OK [3593679 total field count; avg 3.0 fields per doc] [took 0.903 sec]
       test: term vectors........OK [0 total term vector count; avg 0.0 term/freq vector fields per doc] [took 0.000 sec]
       test: docvalues...........OK [10 docvalues fields; 3 BINARY; 0 NUMERIC; 5 SORTED; 0 SORTED_NUMERIC; 2 SORTED_SET] [took 2.599 sec]
       test: points..............OK [2 fields, 2395786 points] [took 0.210 sec]
       test: vectors.............OK [0 fields, 0 vectors] [took 0.000 sec]
   
   15 of 15: name=_32 maxDoc=1197893
       version=9.0.0
       id=59c6he3dhebad46x7proh2zhm
       codec=Lucene90
       compound=false
       numFiles=17
       size (MB)=2,531.843
       diagnostics = {os=Mac OS X, timestamp=1622100146526, lucene.version=9.0.0, java.vm.version=11.0.9+11, java.version=11.0.9, mergeMaxNumSegments=-1, os.version=10.15.5, java.vendor=AdoptOpenJDK, source=merge, os.arch=x86_64, mergeFactor=10, java.runtime.version=11.0.9+11}
       no deletions
       test: open reader.........OK [took 0.016 sec]
       test: check integrity.....OK [took 19.048 sec]
       test: check live docs.....OK [took 0.000 sec]
       test: field infos.........OK [17 fields] [took 0.000 sec]
       test: field norms.........OK [2 fields] [took 0.047 sec]
       test: terms, freq, prox...OK [20065511 terms; 450728331 terms/docs pairs; 1175837878 tokens] [took 118.554 sec]
       test: stored fields.......OK [3593679 total field count; avg 3.0 fields per doc] [took 1.099 sec]
       test: term vectors........OK [0 total term vector count; avg 0.0 term/freq vector fields per doc] [took 0.000 sec]
       test: docvalues...........OK [10 docvalues fields; 3 BINARY; 0 NUMERIC; 5 SORTED; 0 SORTED_NUMERIC; 2 SORTED_SET] [took 2.914 sec]
       test: points..............OK [2 fields, 2395786 points] [took 0.214 sec]
       test: vectors.............OK [0 fields, 0 vectors] [took 0.000 sec]
   
   WARNING: 1 broken segments (containing 119789 documents) detected
   Took 143.432 sec total.
   WARNING: would write new segments file, and 119789 documents would be lost, if -exorcise were specified
   
   
   
   > Task :lucene:core:CheckIndex.main() FAILED
   
   Execution failed for task ':lucene:core:CheckIndex.main()'.
   > Process 'command '/Library/Java/JavaVirtualMachines/adoptopenjdk-11.jdk/Contents/Home/bin/java'' finished with non-zero exit value 1
   
   
   
   ```
   
   ### Check with `-segment _gx` flag
   ```
   > Task :lucene:core:CheckIndex.main() FAILED
   
   NOTE: testing will be more thorough if you run java with '-ea:org.apache.lucene...', so assertions are enabled
   
   Opening index @ /Users/xichen/IdeaProjects/benchmarks/indices/corrupted/index/
   
   Checking index with async threadCount: 12
   0.00% total deletions; 6647577 documents; 0 deletions
   Segments file=segments_2 numSegments=15 version=9.0.0 id=59c6he3dhebad46x7proh30nq userData={userData=multi}
   
   Checking only these segments: _gx:
   7 of 15: name=_gx maxDoc=119789
       version=9.0.0
       id=59c6he3dhebad46x7proh30n7
       codec=Lucene90
       compound=false
       numFiles=17
       size (MB)=129.046
       diagnostics = {os.arch=x86_64, mergeFactor=10, java.runtime.version=11.0.9+11, os=Mac OS X, timestamp=1622102767300, lucene.version=9.0.0, java.vm.version=11.0.9+11, java.version=11.0.9, mergeMaxNumSegments=-1, os.version=10.15.5, java.vendor=AdoptOpenJDK, source=merge}
       no deletions
       test: open reader.........OK [took 0.058 sec]
       test: check integrity.....FAILED
       WARNING: exorciseIndex() would remove reference to this segment; full exception:
   org.apache.lucene.index.CorruptIndexException: checksum failed (hardware problem?) : expected=87e2aa4 actual=7b3afcbd (resource=BufferedChecksumIndexInput(MMapIndexInput(path="/Users/xichen/IdeaProjects/benchmarks/indices/corrupted/index/_gx_Lucene90_0.dvd")))
   	at org.apache.lucene.codecs.CodecUtil.checkFooter(CodecUtil.java:440)
   	at org.apache.lucene.codecs.CodecUtil.checksumEntireFile(CodecUtil.java:614)
   	at org.apache.lucene.codecs.lucene90.Lucene90DocValuesProducer.checkIntegrity(Lucene90DocValuesProducer.java:1656)
   	at org.apache.lucene.codecs.perfield.PerFieldDocValuesFormat$FieldsReader.checkIntegrity(PerFieldDocValuesFormat.java:364)
   	at org.apache.lucene.index.CodecReader.checkIntegrity(CodecReader.java:252)
   	at org.apache.lucene.index.SegmentReader.checkIntegrity(SegmentReader.java:391)
   	at org.apache.lucene.index.CheckIndex.testSegment(CheckIndex.java:925)
   	at org.apache.lucene.index.CheckIndex.lambda$checkIndex$1(CheckIndex.java:756)
   	at org.apache.lucene.index.CheckIndex.lambda$callableToSupplier$2(CheckIndex.java:854)
   	at java.base/java.util.concurrent.CompletableFuture$AsyncSupply.run(CompletableFuture.java:1700)
   	at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128)
   	at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628)
   	at java.base/java.lang.Thread.run(Thread.java:834)
   
   WARNING: 1 broken segments (containing 119789 documents) detected
   Took 0.347 sec total.
   WARNING: would write new segments file, and 119789 documents would be lost, if -exorcise were specified
   
   
   
   Execution failed for task ':lucene:core:CheckIndex.main()'.
   > Process 'command '/Library/Java/JavaVirtualMachines/adoptopenjdk-11.jdk/Contents/Home/bin/java'' finished with non-zero exit value 1
   
   
   
   ```



