Posted to commits@lucene.apache.org by ab...@apache.org on 2017/03/01 09:27:09 UTC

[01/50] [abbrv] lucene-solr:jira/solr-9858: solr/CHANGES.txt: Re-order a couple entries in the 6.5.0 section to make it like the branch_6x version

Repository: lucene-solr
Updated Branches:
  refs/heads/jira/solr-9858 69187b7a4 -> d5bf3506d


solr/CHANGES.txt: Re-order a couple entries in the 6.5.0 section to make it like the branch_6x version


Project: http://git-wip-us.apache.org/repos/asf/lucene-solr/repo
Commit: http://git-wip-us.apache.org/repos/asf/lucene-solr/commit/6ddf3693
Tree: http://git-wip-us.apache.org/repos/asf/lucene-solr/tree/6ddf3693
Diff: http://git-wip-us.apache.org/repos/asf/lucene-solr/diff/6ddf3693

Branch: refs/heads/jira/solr-9858
Commit: 6ddf3693bb196ab761edbc6fb0e9a4d20ab1633f
Parents: 9bc3fa3
Author: Steve Rowe <sa...@apache.org>
Authored: Wed Feb 22 13:23:34 2017 -0500
Committer: Steve Rowe <sa...@apache.org>
Committed: Wed Feb 22 13:23:34 2017 -0500

----------------------------------------------------------------------
 solr/CHANGES.txt | 10 +++++-----
 1 file changed, 5 insertions(+), 5 deletions(-)
----------------------------------------------------------------------


http://git-wip-us.apache.org/repos/asf/lucene-solr/blob/6ddf3693/solr/CHANGES.txt
----------------------------------------------------------------------
diff --git a/solr/CHANGES.txt b/solr/CHANGES.txt
index 3924053..44e4fa9 100644
--- a/solr/CHANGES.txt
+++ b/solr/CHANGES.txt
@@ -162,21 +162,21 @@ Bug Fixes
 * SOLR-10141: Upgrade to Caffeine 2.4.0 since v1.0.1 contributed to BlockCache corruption because the
   removal listener was called more than once for some items and not at all for other items. (Ben Manes, yonik)
 
-* SOLR-9846: Overseer is not always closed after being started. (Mark Miller)
-
 * SOLR-10114: Reordered delete-by-query causes inconsistencies between shards that have
   child documents (Mano Kovacs, Mihaly Toth, yonik)
 
 * SOLR-10159: When a DBQ is reordered with an in-place update whose updated value the DBQ depends
   on, the DBQ fails due to excessive caching in DeleteByQueryWrapper (Ishan Chattopadhyaya)
 
-* SOLR-10168: ShardSplit can fail with NPE in OverseerCollectionMessageHandler#waitForCoreAdminAsyncCallToComplete. (Mark Miller)
+* SOLR-10020: CoreAdminHandler silently swallows some errors. (Mike Drob via Erick Erickson)
+
+* SOLR-10063: CoreContainer shutdown has a race condition that can cause a hang on shutdown. (Mark Miller)
 
 * SOLR-10170: ClassCastException in RecoveryStrategy. (Mark Miller)
 
-* SOLR-10020: CoreAdminHandler silently swallows some errors. (Mike Drob via Erick Erickson)
+* SOLR-9846: Overseer is not always closed after being started. (Mark Miller)
 
-* SOLR-10063: CoreContainer shutdown has a race condition that can cause a hang on shutdown. (Mark Miller)
+* SOLR-10168: ShardSplit can fail with NPE in OverseerCollectionMessageHandler#waitForCoreAdminAsyncCallToComplete. (Mark Miller)
 
 Optimizations
 ----------------------


[20/50] [abbrv] lucene-solr:jira/solr-9858: Remove outdated comment.

Posted by ab...@apache.org.
Remove outdated comment.


Project: http://git-wip-us.apache.org/repos/asf/lucene-solr/repo
Commit: http://git-wip-us.apache.org/repos/asf/lucene-solr/commit/471d8427
Tree: http://git-wip-us.apache.org/repos/asf/lucene-solr/tree/471d8427
Diff: http://git-wip-us.apache.org/repos/asf/lucene-solr/diff/471d8427

Branch: refs/heads/jira/solr-9858
Commit: 471d84274225c542e310f6cd8509702db4666ae0
Parents: 95d6fc2
Author: Christine Poerschke <cp...@apache.org>
Authored: Fri Feb 24 18:38:11 2017 +0000
Committer: Christine Poerschke <cp...@apache.org>
Committed: Fri Feb 24 18:38:11 2017 +0000

----------------------------------------------------------------------
 .../src/java/org/apache/solr/handler/component/QueryComponent.java  | 1 -
 1 file changed, 1 deletion(-)
----------------------------------------------------------------------


http://git-wip-us.apache.org/repos/asf/lucene-solr/blob/471d8427/solr/core/src/java/org/apache/solr/handler/component/QueryComponent.java
----------------------------------------------------------------------
diff --git a/solr/core/src/java/org/apache/solr/handler/component/QueryComponent.java b/solr/core/src/java/org/apache/solr/handler/component/QueryComponent.java
index c357202..deff25b 100644
--- a/solr/core/src/java/org/apache/solr/handler/component/QueryComponent.java
+++ b/solr/core/src/java/org/apache/solr/handler/component/QueryComponent.java
@@ -259,7 +259,6 @@ public class QueryComponent extends SearchComponent
     //TODO: move weighting of sort
     final SortSpec groupSortSpec = searcher.weightSortSpec(sortSpec, Sort.RELEVANCE);
 
-    // groupSort defaults to sort
     String withinGroupSortStr = params.get(GroupParams.GROUP_SORT);
     //TODO: move weighting of sort
     final SortSpec withinGroupSortSpec;


[33/50] [abbrv] lucene-solr:jira/solr-9858: tests: raise timeout

Posted by ab...@apache.org.
tests: raise timeout


Project: http://git-wip-us.apache.org/repos/asf/lucene-solr/repo
Commit: http://git-wip-us.apache.org/repos/asf/lucene-solr/commit/86b5b633
Tree: http://git-wip-us.apache.org/repos/asf/lucene-solr/tree/86b5b633
Diff: http://git-wip-us.apache.org/repos/asf/lucene-solr/diff/86b5b633

Branch: refs/heads/jira/solr-9858
Commit: 86b5b6330fda49f7dc6114dac03fef9fd0caea96
Parents: 0f5875b
Author: markrmiller <ma...@apache.org>
Authored: Mon Feb 27 22:51:02 2017 -0500
Committer: markrmiller <ma...@apache.org>
Committed: Mon Feb 27 22:55:29 2017 -0500

----------------------------------------------------------------------
 .../src/test/org/apache/solr/update/TestInPlaceUpdatesDistrib.java | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)
----------------------------------------------------------------------


http://git-wip-us.apache.org/repos/asf/lucene-solr/blob/86b5b633/solr/core/src/test/org/apache/solr/update/TestInPlaceUpdatesDistrib.java
----------------------------------------------------------------------
diff --git a/solr/core/src/test/org/apache/solr/update/TestInPlaceUpdatesDistrib.java b/solr/core/src/test/org/apache/solr/update/TestInPlaceUpdatesDistrib.java
index 9136d73..b107cbd 100644
--- a/solr/core/src/test/org/apache/solr/update/TestInPlaceUpdatesDistrib.java
+++ b/solr/core/src/test/org/apache/solr/update/TestInPlaceUpdatesDistrib.java
@@ -812,7 +812,7 @@ public class TestInPlaceUpdatesDistrib extends AbstractFullDistribZkTestBase {
     }
 
     threadpool.shutdown();
-    assertTrue("Thread pool didn't terminate within 10 secs", threadpool.awaitTermination(10, TimeUnit.SECONDS));
+    assertTrue("Thread pool didn't terminate within 10 secs", threadpool.awaitTermination(15, TimeUnit.SECONDS));
 
     commit();
 

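For context, the call being tuned here is the standard ExecutorService drain from java.util.concurrent. A defensive variant of the same idiom (a sketch, not what this test does - the test simply asserts termination within the bound so a slow machine fails fast rather than hanging) looks like:

  threadpool.shutdown();                                    // stop accepting new tasks
  if (!threadpool.awaitTermination(15, TimeUnit.SECONDS)) { // returns false on timeout
    threadpool.shutdownNow();                               // interrupt anything still running
  }
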

[15/50] [abbrv] lucene-solr:jira/solr-9858: SOLR-10156: Add significantTerms Streaming Expression

Posted by ab...@apache.org.
SOLR-10156: Add significantTerms Streaming Expression


Project: http://git-wip-us.apache.org/repos/asf/lucene-solr/repo
Commit: http://git-wip-us.apache.org/repos/asf/lucene-solr/commit/dba733e7
Tree: http://git-wip-us.apache.org/repos/asf/lucene-solr/tree/dba733e7
Diff: http://git-wip-us.apache.org/repos/asf/lucene-solr/diff/dba733e7

Branch: refs/heads/jira/solr-9858
Commit: dba733e7aa90bd607fdda0342b94bc17bb717c31
Parents: 894a43b
Author: Joel Bernstein <jb...@apache.org>
Authored: Thu Feb 23 14:12:00 2017 -0500
Committer: Joel Bernstein <jb...@apache.org>
Committed: Thu Feb 23 14:18:03 2017 -0500

----------------------------------------------------------------------
 .../org/apache/solr/handler/StreamHandler.java  |  38 +-
 .../org/apache/solr/search/QParserPlugin.java   |   2 +
 .../search/SignificantTermsQParserPlugin.java   | 260 +++++++++++
 .../apache/solr/search/QueryEqualityTest.java   |   9 +
 .../solrj/io/stream/SignificantTermsStream.java | 444 +++++++++++++++++++
 .../solrj/io/stream/StreamExpressionTest.java   | 135 ++++++
 6 files changed, 852 insertions(+), 36 deletions(-)
----------------------------------------------------------------------


http://git-wip-us.apache.org/repos/asf/lucene-solr/blob/dba733e7/solr/core/src/java/org/apache/solr/handler/StreamHandler.java
----------------------------------------------------------------------
diff --git a/solr/core/src/java/org/apache/solr/handler/StreamHandler.java b/solr/core/src/java/org/apache/solr/handler/StreamHandler.java
index bcb2faa..31b37e7 100644
--- a/solr/core/src/java/org/apache/solr/handler/StreamHandler.java
+++ b/solr/core/src/java/org/apache/solr/handler/StreamHandler.java
@@ -51,41 +51,7 @@ import org.apache.solr.client.solrj.io.ops.ConcatOperation;
 import org.apache.solr.client.solrj.io.ops.DistinctOperation;
 import org.apache.solr.client.solrj.io.ops.GroupOperation;
 import org.apache.solr.client.solrj.io.ops.ReplaceOperation;
-import org.apache.solr.client.solrj.io.stream.CloudSolrStream;
-import org.apache.solr.client.solrj.io.stream.CommitStream;
-import org.apache.solr.client.solrj.io.stream.ComplementStream;
-import org.apache.solr.client.solrj.io.stream.DaemonStream;
-import org.apache.solr.client.solrj.io.stream.ExceptionStream;
-import org.apache.solr.client.solrj.io.stream.ExecutorStream;
-import org.apache.solr.client.solrj.io.stream.FacetStream;
-import org.apache.solr.client.solrj.io.stream.FeaturesSelectionStream;
-import org.apache.solr.client.solrj.io.stream.FetchStream;
-import org.apache.solr.client.solrj.io.stream.HashJoinStream;
-import org.apache.solr.client.solrj.io.stream.HavingStream;
-import org.apache.solr.client.solrj.io.stream.InnerJoinStream;
-import org.apache.solr.client.solrj.io.stream.IntersectStream;
-import org.apache.solr.client.solrj.io.stream.JDBCStream;
-import org.apache.solr.client.solrj.io.stream.LeftOuterJoinStream;
-import org.apache.solr.client.solrj.io.stream.MergeStream;
-import org.apache.solr.client.solrj.io.stream.ModelStream;
-import org.apache.solr.client.solrj.io.stream.NullStream;
-import org.apache.solr.client.solrj.io.stream.OuterHashJoinStream;
-import org.apache.solr.client.solrj.io.stream.ParallelStream;
-import org.apache.solr.client.solrj.io.stream.PriorityStream;
-import org.apache.solr.client.solrj.io.stream.RandomStream;
-import org.apache.solr.client.solrj.io.stream.RankStream;
-import org.apache.solr.client.solrj.io.stream.ReducerStream;
-import org.apache.solr.client.solrj.io.stream.RollupStream;
-import org.apache.solr.client.solrj.io.stream.ScoreNodesStream;
-import org.apache.solr.client.solrj.io.stream.SelectStream;
-import org.apache.solr.client.solrj.io.stream.SortStream;
-import org.apache.solr.client.solrj.io.stream.StatsStream;
-import org.apache.solr.client.solrj.io.stream.StreamContext;
-import org.apache.solr.client.solrj.io.stream.TextLogitStream;
-import org.apache.solr.client.solrj.io.stream.TopicStream;
-import org.apache.solr.client.solrj.io.stream.TupleStream;
-import org.apache.solr.client.solrj.io.stream.UniqueStream;
-import org.apache.solr.client.solrj.io.stream.UpdateStream;
+import org.apache.solr.client.solrj.io.stream.*;
 import org.apache.solr.client.solrj.io.stream.expr.Explanation;
 import org.apache.solr.client.solrj.io.stream.expr.Explanation.ExpressionType;
 import org.apache.solr.client.solrj.io.stream.expr.Expressible;
@@ -193,7 +159,7 @@ public class StreamHandler extends RequestHandlerBase implements SolrCoreAware,
       .withFunctionName("executor", ExecutorStream.class)
       .withFunctionName("null", NullStream.class)
       .withFunctionName("priority", PriorityStream.class)
-      
+      .withFunctionName("significantTerms", SignificantTermsStream.class)
       // metrics
       .withFunctionName("min", MinMetric.class)
       .withFunctionName("max", MaxMetric.class)

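Once registered under the name "significantTerms" above, the function becomes usable in streaming expressions sent to the /stream handler. The new test further down exercises it with, for example:

  significantTerms(collection1, q="id:a*", field="test_t", limit=3, minTermLength=1, maxDocFreq=".5")

Here limit maps to the numTerms cap, and minDocFreq/maxDocFreq accept either absolute document counts or, when below 1.0, fractions of the corpus (see the collector logic in SignificantTermsQParserPlugin below).
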
http://git-wip-us.apache.org/repos/asf/lucene-solr/blob/dba733e7/solr/core/src/java/org/apache/solr/search/QParserPlugin.java
----------------------------------------------------------------------
diff --git a/solr/core/src/java/org/apache/solr/search/QParserPlugin.java b/solr/core/src/java/org/apache/solr/search/QParserPlugin.java
index 573286b..34089d2 100644
--- a/solr/core/src/java/org/apache/solr/search/QParserPlugin.java
+++ b/solr/core/src/java/org/apache/solr/search/QParserPlugin.java
@@ -79,6 +79,8 @@ public abstract class QParserPlugin implements NamedListInitializedPlugin, SolrI
     map.put(GraphTermsQParserPlugin.NAME, GraphTermsQParserPlugin.class);
     map.put(IGainTermsQParserPlugin.NAME, IGainTermsQParserPlugin.class);
     map.put(TextLogisticRegressionQParserPlugin.NAME, TextLogisticRegressionQParserPlugin.class);
+    map.put(SignificantTermsQParserPlugin.NAME, SignificantTermsQParserPlugin.class);
+
     standardPlugins = Collections.unmodifiableMap(map);
   }
 

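With the plugin registered, the parser can also be hit directly as an analytics query. A sketch of such a request (the field name text_t is a placeholder; the numeric values shown are the defaults parse() applies in the new plugin below):

  /select?q=*:*&fq={!significantTerms}&field=text_t&numTerms=20&minDocFreq=5&maxDocFreq=.3&minTermLength=4

This is essentially the per-shard request that SignificantTermsStream issues (with distrib=false) further down.
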
http://git-wip-us.apache.org/repos/asf/lucene-solr/blob/dba733e7/solr/core/src/java/org/apache/solr/search/SignificantTermsQParserPlugin.java
----------------------------------------------------------------------
diff --git a/solr/core/src/java/org/apache/solr/search/SignificantTermsQParserPlugin.java b/solr/core/src/java/org/apache/solr/search/SignificantTermsQParserPlugin.java
new file mode 100644
index 0000000..9b9a46b
--- /dev/null
+++ b/solr/core/src/java/org/apache/solr/search/SignificantTermsQParserPlugin.java
@@ -0,0 +1,260 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements.  See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License.  You may obtain a copy of the License at
+ *
+ *     http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package org.apache.solr.search;
+
+
+import java.io.IOException;
+import java.util.TreeSet;
+import java.util.List;
+import java.util.ArrayList;
+
+import org.apache.lucene.index.LeafReaderContext;
+import org.apache.lucene.index.MultiFields;
+import org.apache.lucene.index.PostingsEnum;
+import org.apache.lucene.index.Terms;
+import org.apache.lucene.index.TermsEnum;
+import org.apache.lucene.search.DocIdSetIterator;
+import org.apache.lucene.search.IndexSearcher;
+import org.apache.lucene.search.Query;
+import org.apache.lucene.util.BytesRef;
+import org.apache.lucene.util.SparseFixedBitSet;
+import org.apache.solr.common.params.SolrParams;
+import org.apache.solr.common.util.NamedList;
+import org.apache.solr.handler.component.ResponseBuilder;
+import org.apache.solr.request.SolrQueryRequest;
+
+public class SignificantTermsQParserPlugin extends QParserPlugin {
+
+  public static final String NAME = "sigificantTerms";
+
+  @Override
+  public QParser createParser(String qstr, SolrParams localParams, SolrParams params, SolrQueryRequest req) {
+    return new SignifcantTermsQParser(qstr, localParams, params, req);
+  }
+
+  private static class SignificantTermsQParser extends QParser {
+
+    public SignificantTermsQParser(String qstr, SolrParams localParams, SolrParams params, SolrQueryRequest req) {
+      super(qstr, localParams, params, req);
+    }
+
+    @Override
+    public Query parse() throws SyntaxError {
+      String field = getParam("field");
+      int numTerms = Integer.parseInt(params.get("numTerms", "20"));
+      float minDocs = Float.parseFloat(params.get("minDocFreq", "5"));
+      float maxDocs = Float.parseFloat(params.get("maxDocFreq", ".3"));
+      int minTermLength = Integer.parseInt(params.get("minTermLength", "4"));
+      return new SignificantTermsQuery(field, numTerms, minDocs, maxDocs, minTermLength);
+    }
+  }
+
+  private static class SignificantTermsQuery extends AnalyticsQuery {
+
+    private String field;
+    private int numTerms;
+    private float maxDocs;
+    private float minDocs;
+    private int minTermLength;
+
+    public SignificantTermsQuery(String field, int numTerms, float minDocs, float maxDocs, int minTermLength) {
+      this.field = field;
+      this.numTerms = numTerms;
+      this.minDocs = minDocs;
+      this.maxDocs = maxDocs;
+      this.minTermLength = minTermLength;
+
+    }
+
+    @Override
+    public DelegatingCollector getAnalyticsCollector(ResponseBuilder rb, IndexSearcher searcher) {
+      return new SignificantTermsCollector(rb, searcher, field, numTerms, minDocs, maxDocs, minTermLength);
+    }
+  }
+
+  private static class SignificantTermsCollector extends DelegatingCollector {
+
+    private String field;
+    private IndexSearcher searcher;
+    private ResponseBuilder rb;
+    private int numTerms;
+    private SparseFixedBitSet docs;
+    private int numDocs;
+    private float minDocs;
+    private float maxDocs;
+    private int count;
+    private int minTermLength;
+    private int highestCollected;
+
+    public SignificantTermsCollector(ResponseBuilder rb, IndexSearcher searcher, String field, int numTerms, float minDocs, float maxDocs, int minTermLength) {
+      this.rb = rb;
+      this.searcher = searcher;
+      this.field = field;
+      this.numTerms = numTerms;
+      this.docs = new SparseFixedBitSet(searcher.getIndexReader().maxDoc());
+      this.numDocs = searcher.getIndexReader().numDocs();
+      this.minDocs = minDocs;
+      this.maxDocs = maxDocs;
+      this.minTermLength = minTermLength;
+    }
+
+    @Override
+    protected void doSetNextReader(LeafReaderContext context) throws IOException {
+      super.doSetNextReader(context);
+    }
+
+    @Override
+    public void collect(int doc) throws IOException {
+      super.collect(doc);
+      highestCollected = context.docBase + doc;
+      docs.set(highestCollected);
+      ++count;
+    }
+
+    @Override
+    public void finish() throws IOException {
+      List<String> outTerms = new ArrayList();
+      List<Integer> outFreq = new ArrayList();
+      List<Integer> outQueryFreq = new ArrayList();
+      List<Double> scores = new ArrayList();
+
+      NamedList<Integer> allFreq = new NamedList();
+      NamedList<Integer> allQueryFreq = new NamedList();
+
+      rb.rsp.add("numDocs", numDocs);
+      rb.rsp.add("resultCount", count);
+      rb.rsp.add("sterms", outTerms);
+      rb.rsp.add("scores", scores);
+      rb.rsp.add("docFreq", outFreq);
+      rb.rsp.add("queryDocFreq", outQueryFreq);
+
+      //TODO: Use a priority queue
+      TreeSet<TermWithScore> topTerms = new TreeSet<>();
+
+      Terms terms = MultiFields.getFields(searcher.getIndexReader()).terms(field);
+      TermsEnum termsEnum = terms.iterator();
+      BytesRef term;
+      PostingsEnum postingsEnum = null;
+
+      while ((term = termsEnum.next()) != null) {
+        int docFreq = termsEnum.docFreq();
+        
+        if(minDocs < 1.0) {
+          if((float)docFreq/numDocs < minDocs) {
+            continue;
+          }
+        } else if(docFreq < minDocs) {
+          continue;
+        }
+
+        if(maxDocs < 1.0) {
+          if((float)docFreq/numDocs > maxDocs) {
+            continue;
+          }
+        } else if(docFreq > maxDocs) {
+          continue;
+        }
+
+        if(term.length < minTermLength) {
+          continue;
+        }
+
+        int tf = 0;
+        postingsEnum = termsEnum.postings(postingsEnum);
+
+        POSTINGS:
+        while (postingsEnum.nextDoc() != DocIdSetIterator.NO_MORE_DOCS) {
+          int docId = postingsEnum.docID();
+
+          if(docId > highestCollected) {
+            break POSTINGS;
+          }
+
+          if (docs.get(docId)) {
+            ++tf;
+          }
+        }
+
+        if(tf == 0) {
+          continue;
+        }
+
+        float score = (float)Math.log(tf) * (float) (Math.log(((float)(numDocs + 1)) / (docFreq + 1)) + 1.0);
+
+        String t = term.utf8ToString();
+        allFreq.add(t, docFreq);
+        allQueryFreq.add(t, tf);
+
+        if (topTerms.size() < numTerms) {
+          topTerms.add(new TermWithScore(term.utf8ToString(), score));
+        } else  {
+          if (topTerms.first().score < score) {
+            topTerms.pollFirst();
+            topTerms.add(new TermWithScore(term.utf8ToString(), score));
+          }
+        }
+      }
+
+      for (TermWithScore topTerm : topTerms) {
+        outTerms.add(topTerm.term);
+        scores.add(topTerm.score);
+        outFreq.add(allFreq.get(topTerm.term));
+        outQueryFreq.add(allQueryFreq.get(topTerm.term));
+      }
+
+      if (this.delegate instanceof DelegatingCollector) {
+        ((DelegatingCollector) this.delegate).finish();
+      }
+    }
+  }
+
+  private static class TermWithScore implements Comparable<TermWithScore>{
+    public final String term;
+    public final double score;
+
+    public TermWithScore(String term, double score) {
+      this.term = term;
+      this.score = score;
+    }
+
+    @Override
+    public int hashCode() {
+      return term.hashCode();
+    }
+
+    @Override
+    public boolean equals(Object obj) {
+      if (obj == null) return false;
+      if (obj.getClass() != getClass()) return false;
+      TermWithScore other = (TermWithScore) obj;
+      return other.term.equals(this.term);
+    }
+
+    @Override
+    public int compareTo(TermWithScore o) {
+      int cmp = Double.compare(this.score, o.score);
+      if (cmp == 0) {
+        return this.term.compareTo(o.term);
+      } else {
+        return cmp;
+      }
+    }
+  }
+}
+
+

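For reference, the per-term weight computed in finish() above is a TF-IDF-style score. Writing tf for a term's frequency within the collected result set (queryDocFreq), df for its corpus-wide document frequency (docFreq), and N for numDocs, the code evaluates:

  \mathrm{score}(t) = \log(tf_t) \cdot \left( \log\frac{N + 1}{df_t + 1} + 1 \right)

so terms that are common in the result set but rare in the corpus score highest. SignificantTermsStream below recomputes the same formula after summing the per-shard frequencies.
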
http://git-wip-us.apache.org/repos/asf/lucene-solr/blob/dba733e7/solr/core/src/test/org/apache/solr/search/QueryEqualityTest.java
----------------------------------------------------------------------
diff --git a/solr/core/src/test/org/apache/solr/search/QueryEqualityTest.java b/solr/core/src/test/org/apache/solr/search/QueryEqualityTest.java
index 86c7ee8..a9b5c6e 100644
--- a/solr/core/src/test/org/apache/solr/search/QueryEqualityTest.java
+++ b/solr/core/src/test/org/apache/solr/search/QueryEqualityTest.java
@@ -193,6 +193,15 @@ public class QueryEqualityTest extends SolrTestCaseJ4 {
     }
   }
 
+  public void testSignificantTermsQuery() throws Exception {
+    SolrQueryRequest req = req("q", "*:*");
+    try {
+      assertQueryEquals("sigificantTerms", req, "{!sigificantTerms}");
+    } finally {
+      req.close();
+    }
+  }
+
   public void testQuerySwitch() throws Exception {
     SolrQueryRequest req = req("myXXX", "XXX", 
                                "myField", "foo_s",

http://git-wip-us.apache.org/repos/asf/lucene-solr/blob/dba733e7/solr/solrj/src/java/org/apache/solr/client/solrj/io/stream/SignificantTermsStream.java
----------------------------------------------------------------------
diff --git a/solr/solrj/src/java/org/apache/solr/client/solrj/io/stream/SignificantTermsStream.java b/solr/solrj/src/java/org/apache/solr/client/solrj/io/stream/SignificantTermsStream.java
new file mode 100644
index 0000000..f077421
--- /dev/null
+++ b/solr/solrj/src/java/org/apache/solr/client/solrj/io/stream/SignificantTermsStream.java
@@ -0,0 +1,444 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements.  See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License.  You may obtain a copy of the License at
+ *
+ *     http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package org.apache.solr.client.solrj.io.stream;
+
+import java.io.IOException;
+import java.util.ArrayList;
+import java.util.Collection;
+import java.util.Collections;
+import java.util.Comparator;
+import java.util.HashMap;
+import java.util.Iterator;
+import java.util.List;
+import java.util.Locale;
+import java.util.Map;
+import java.util.Random;
+import java.util.Set;
+import java.util.concurrent.Callable;
+import java.util.concurrent.ExecutorService;
+import java.util.concurrent.Future;
+
+import org.apache.solr.client.solrj.impl.CloudSolrClient;
+import org.apache.solr.client.solrj.impl.HttpSolrClient;
+import org.apache.solr.client.solrj.io.SolrClientCache;
+import org.apache.solr.client.solrj.io.Tuple;
+import org.apache.solr.client.solrj.io.comp.StreamComparator;
+import org.apache.solr.client.solrj.io.stream.expr.Explanation;
+import org.apache.solr.client.solrj.io.stream.expr.Expressible;
+import org.apache.solr.client.solrj.io.stream.expr.StreamExplanation;
+import org.apache.solr.client.solrj.io.stream.expr.StreamExpression;
+import org.apache.solr.client.solrj.io.stream.expr.StreamExpressionNamedParameter;
+import org.apache.solr.client.solrj.io.stream.expr.StreamExpressionParameter;
+import org.apache.solr.client.solrj.io.stream.expr.StreamExpressionValue;
+import org.apache.solr.client.solrj.io.stream.expr.StreamFactory;
+import org.apache.solr.client.solrj.request.QueryRequest;
+import org.apache.solr.client.solrj.response.QueryResponse;
+import org.apache.solr.common.cloud.ClusterState;
+import org.apache.solr.common.cloud.Replica;
+import org.apache.solr.common.cloud.Slice;
+import org.apache.solr.common.cloud.ZkCoreNodeProps;
+import org.apache.solr.common.cloud.ZkStateReader;
+import org.apache.solr.common.params.ModifiableSolrParams;
+import org.apache.solr.common.util.ExecutorUtil;
+import org.apache.solr.common.util.NamedList;
+import org.apache.solr.common.util.SolrjNamedThreadFactory;
+
+public class SignificantTermsStream extends TupleStream implements Expressible {
+
+  private static final long serialVersionUID = 1;
+
+  protected String zkHost;
+  protected String collection;
+  protected Map<String,String> params;
+  protected Iterator<Tuple> tupleIterator;
+  protected String field;
+  protected int numTerms;
+  protected float minDocFreq;
+  protected float maxDocFreq;
+  protected int minTermLength;
+
+  protected transient SolrClientCache cache;
+  protected transient boolean isCloseCache;
+  protected transient CloudSolrClient cloudSolrClient;
+
+  protected transient StreamContext streamContext;
+  protected ExecutorService executorService;
+
+
+  public SignificantTermsStream(String zkHost,
+                                 String collectionName,
+                                 Map params,
+                                 String field,
+                                 float minDocFreq,
+                                 float maxDocFreq,
+                                 int minTermLength,
+                                 int numTerms) throws IOException {
+
+    init(collectionName,
+         zkHost,
+         params,
+         field,
+         minDocFreq,
+         maxDocFreq,
+         minTermLength,
+         numTerms);
+  }
+
+  public SignificantTermsStream(StreamExpression expression, StreamFactory factory) throws IOException{
+    // grab all parameters out
+    String collectionName = factory.getValueOperand(expression, 0);
+    List<StreamExpressionNamedParameter> namedParams = factory.getNamedOperands(expression);
+    StreamExpressionNamedParameter zkHostExpression = factory.getNamedOperand(expression, "zkHost");
+
+    // Validate there are no unknown parameters - zkHost is a named parameter, so we don't need to count it twice
+    if(expression.getParameters().size() != 1 + namedParams.size()){
+      throw new IOException(String.format(Locale.ROOT,"invalid expression %s - unknown operands found",expression));
+    }
+
+    // Collection Name
+    if(null == collectionName){
+      throw new IOException(String.format(Locale.ROOT,"invalid expression %s - collectionName expected as first operand",expression));
+    }
+
+    // Named parameters - passed directly to solr as solrparams
+    if(0 == namedParams.size()){
+      throw new IOException(String.format(Locale.ROOT,"invalid expression %s - at least one named parameter expected. eg. 'q=*:*'",expression));
+    }
+
+    Map<String,String> params = new HashMap<String,String>();
+    for(StreamExpressionNamedParameter namedParam : namedParams){
+      if(!namedParam.getName().equals("zkHost")) {
+        params.put(namedParam.getName(), namedParam.getParameter().toString().trim());
+      }
+    }
+
+    String fieldParam = params.get("field");
+    if(fieldParam != null) {
+      params.remove("field");
+    } else {
+      throw new IOException("field param cannot be null for SignificantTermsStream");
+    }
+
+    String numTermsParam = params.get("limit");
+    int numTerms = 20;
+    if(numTermsParam != null) {
+      numTerms = Integer.parseInt(numTermsParam);
+      params.remove("limit");
+    }
+
+    String minTermLengthParam = params.get("minTermLength");
+    int minTermLength = 4;
+    if(minTermLengthParam != null) {
+      minTermLength = Integer.parseInt(minTermLengthParam);
+      params.remove("minTermLength");
+    }
+
+
+    String minDocFreqParam = params.get("minDocFreq");
+    float minDocFreq = 5.0F;
+    if(minDocFreqParam != null) {
+      minDocFreq = Float.parseFloat(minDocFreqParam);
+      params.remove("minDocFreq");
+    }
+
+    String maxDocFreqParam = params.get("maxDocFreq");
+    float maxDocFreq = .3F;
+    if(maxDocFreqParam != null) {
+      maxDocFreq = Float.parseFloat(maxDocFreqParam);
+      params.remove("maxDocFreq");
+    }
+
+
+    // zkHost, optional - if not provided, look it up via the factory's collection->zkHost mapping
+    String zkHost = null;
+    if(null == zkHostExpression){
+      zkHost = factory.getCollectionZkHost(collectionName);
+    }
+    else if(zkHostExpression.getParameter() instanceof StreamExpressionValue){
+      zkHost = ((StreamExpressionValue)zkHostExpression.getParameter()).getValue();
+    }
+    if(null == zkHost){
+      throw new IOException(String.format(Locale.ROOT,"invalid expression %s - zkHost not found for collection '%s'",expression,collectionName));
+    }
+
+    // We've got all the required items
+    init(collectionName, zkHost, params, fieldParam, minDocFreq, maxDocFreq, minTermLength, numTerms);
+  }
+
+  @Override
+  public StreamExpressionParameter toExpression(StreamFactory factory) throws IOException {
+    // significantTerms(collectionName, param1, param2, ..., paramN, field="...", minDocFreq=..., maxDocFreq=..., numTerms=..., minTermLength=..., zkHost="...")
+
+    // function name
+    StreamExpression expression = new StreamExpression(factory.getFunctionName(this.getClass()));
+
+    // collection
+    expression.addParameter(collection);
+
+    // parameters
+    for(Map.Entry<String,String> param : params.entrySet()){
+      expression.addParameter(new StreamExpressionNamedParameter(param.getKey(), param.getValue()));
+    }
+
+    expression.addParameter(new StreamExpressionNamedParameter("field", field));
+    expression.addParameter(new StreamExpressionNamedParameter("minDocFreq", Float.toString(minDocFreq)));
+    expression.addParameter(new StreamExpressionNamedParameter("maxDocFreq", Float.toString(maxDocFreq)));
+    expression.addParameter(new StreamExpressionNamedParameter("numTerms", String.valueOf(numTerms)));
+    expression.addParameter(new StreamExpressionNamedParameter("minTermLength", String.valueOf(minTermLength)));
+
+    // zkHost
+    expression.addParameter(new StreamExpressionNamedParameter("zkHost", zkHost));
+
+    return expression;
+  }
+
+  private void init(String collectionName,
+                    String zkHost,
+                    Map params,
+                    String field,
+                    float minDocFreq,
+                    float maxDocFreq,
+                    int minTermLength,
+                    int numTerms) throws IOException {
+    this.zkHost = zkHost;
+    this.collection = collectionName;
+    this.params = params;
+    this.field = field;
+    this.minDocFreq = minDocFreq;
+    this.maxDocFreq = maxDocFreq;
+    this.numTerms = numTerms;
+    this.minTermLength = minTermLength;
+  }
+
+  public void setStreamContext(StreamContext context) {
+    this.cache = context.getSolrClientCache();
+    this.streamContext = context;
+  }
+
+  public void open() throws IOException {
+    if (cache == null) {
+      isCloseCache = true;
+      cache = new SolrClientCache();
+    } else {
+      isCloseCache = false;
+    }
+
+    this.cloudSolrClient = this.cache.getCloudSolrClient(zkHost);
+    this.executorService = ExecutorUtil.newMDCAwareCachedThreadPool(new SolrjNamedThreadFactory("FeaturesSelectionStream"));
+  }
+
+  public List<TupleStream> children() {
+    return null;
+  }
+
+  private List<String> getShardUrls() throws IOException {
+    try {
+      ZkStateReader zkStateReader = cloudSolrClient.getZkStateReader();
+
+      Collection<Slice> slices = CloudSolrStream.getSlices(this.collection, zkStateReader, false);
+
+      ClusterState clusterState = zkStateReader.getClusterState();
+      Set<String> liveNodes = clusterState.getLiveNodes();
+
+      List<String> baseUrls = new ArrayList<>();
+      for(Slice slice : slices) {
+        Collection<Replica> replicas = slice.getReplicas();
+        List<Replica> shuffler = new ArrayList<>();
+        for(Replica replica : replicas) {
+          if(replica.getState() == Replica.State.ACTIVE && liveNodes.contains(replica.getNodeName())) {
+            shuffler.add(replica);
+          }
+        }
+
+        Collections.shuffle(shuffler, new Random());
+        Replica rep = shuffler.get(0);
+        ZkCoreNodeProps zkProps = new ZkCoreNodeProps(rep);
+        String url = zkProps.getCoreUrl();
+        baseUrls.add(url);
+      }
+
+      return baseUrls;
+
+    } catch (Exception e) {
+      throw new IOException(e);
+    }
+  }
+
+  private List<Future<NamedList>> callShards(List<String> baseUrls) throws IOException {
+
+    List<Future<NamedList>> futures = new ArrayList<>();
+    for (String baseUrl : baseUrls) {
+      SignificantTermsCall lc = new SignificantTermsCall(baseUrl,
+          this.params,
+          this.field,
+          this.minDocFreq,
+          this.maxDocFreq,
+          this.minTermLength,
+          this.numTerms);
+
+      Future<NamedList> future = executorService.submit(lc);
+      futures.add(future);
+    }
+
+    return futures;
+  }
+
+  public void close() throws IOException {
+    if (isCloseCache) {
+      cache.close();
+    }
+
+    executorService.shutdown();
+  }
+
+  /** Return the stream sort - i.e., the order in which records are returned */
+  public StreamComparator getStreamSort(){
+    return null;
+  }
+
+  @Override
+  public Explanation toExplanation(StreamFactory factory) throws IOException {
+    return new StreamExplanation(getStreamNodeId().toString())
+        .withFunctionName(factory.getFunctionName(this.getClass()))
+        .withImplementingClass(this.getClass().getName())
+        .withExpressionType(Explanation.ExpressionType.STREAM_DECORATOR)
+        .withExpression(toExpression(factory).toString());
+  }
+
+  public Tuple read() throws IOException {
+    try {
+      if (tupleIterator == null) {
+        Map<String, int[]> mergeFreqs = new HashMap<>();
+        long numDocs = 0;
+        long resultCount = 0;
+        for (Future<NamedList> getTopTermsCall : callShards(getShardUrls())) {
+          NamedList resp = getTopTermsCall.get();
+
+          List<String> terms = (List<String>)resp.get("sterms");
+          List<Integer> docFreqs = (List<Integer>)resp.get("docFreq");
+          List<Integer> queryDocFreqs = (List<Integer>)resp.get("queryDocFreq");
+          numDocs += (Integer)resp.get("numDocs");
+          resultCount += (Integer)resp.get("resultCount");
+
+          for (int i = 0; i < terms.size(); i++) {
+            String term = terms.get(i);
+            int docFreq = docFreqs.get(i);
+            int queryDocFreq = queryDocFreqs.get(i);
+            if(!mergeFreqs.containsKey(term)) {
+              mergeFreqs.put(term, new int[2]);
+            }
+
+            int[] freqs = mergeFreqs.get(term);
+            freqs[0] += docFreq;
+            freqs[1] += queryDocFreq;
+          }
+        }
+
+        List<Map> maps = new ArrayList();
+
+        for(String term : mergeFreqs.keySet() ) {
+          int[] freqs = mergeFreqs.get(term);
+          Map map = new HashMap();
+          map.put("term", term);
+          map.put("background", freqs[0]);
+          map.put("foreground", freqs[1]);
+
+          float score = (float)Math.log(freqs[1]) * (float) (Math.log(((float)(numDocs + 1)) / (freqs[0] + 1)) + 1.0);
+
+          map.put("score", score);
+          maps.add(map);
+        }
+
+        Collections.sort(maps, new ScoreComp());
+        List<Tuple> tuples = new ArrayList();
+        for (Map map : maps) {
+          if (tuples.size() == numTerms) break;
+          tuples.add(new Tuple(map));
+        }
+
+        Map map = new HashMap();
+        map.put("EOF", true);
+        tuples.add(new Tuple(map));
+        tupleIterator = tuples.iterator();
+      }
+
+      return tupleIterator.next();
+    } catch(Exception e) {
+      throw new IOException(e);
+    }
+  }
+
+  private class ScoreComp implements Comparator<Map> {
+    public int compare(Map a, Map b) {
+      Float scorea = (Float)a.get("score");
+      Float scoreb = (Float)b.get("score");
+      return scoreb.compareTo(scorea);
+    }
+  }
+
+  protected class SignificantTermsCall implements Callable<NamedList> {
+
+    private String baseUrl;
+    private String field;
+    private float minDocFreq;
+    private float maxDocFreq;
+    private int numTerms;
+    private int minTermLength;
+    private Map<String, String> paramsMap;
+
+    public SignificantTermsCall(String baseUrl,
+                                 Map<String, String> paramsMap,
+                                 String field,
+                                 float minDocFreq,
+                                 float maxDocFreq,
+                                 int minTermLength,
+                                 int numTerms) {
+
+      this.baseUrl = baseUrl;
+      this.field = field;
+      this.minDocFreq = minDocFreq;
+      this.maxDocFreq = maxDocFreq;
+      this.paramsMap = paramsMap;
+      this.numTerms = numTerms;
+      this.minTermLength = minTermLength;
+    }
+
+    public NamedList<Double> call() throws Exception {
+      ModifiableSolrParams params = new ModifiableSolrParams();
+      HttpSolrClient solrClient = cache.getHttpSolrClient(baseUrl);
+
+      params.add("distrib", "false");
+      params.add("fq","{!sigificantTerms}");
+
+      for(String key : paramsMap.keySet()) {
+        params.add(key, paramsMap.get(key));
+      }
+
+      params.add("minDocFreq", Float.toString(minDocFreq));
+      params.add("maxDocFreq", Float.toString(maxDocFreq));
+      params.add("minTermLength", Integer.toString(minTermLength));
+      params.add("field", field);
+      params.add("numTerms", String.valueOf(numTerms*3));
+
+      QueryRequest request = new QueryRequest(params);
+      QueryResponse response = request.process(solrClient);
+      NamedList res = response.getResponse();
+      return res;
+    }
+  }
+}

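A minimal client-side driver for the stream, mirroring the getTuples() pattern the test below uses (a sketch: the zkHost value localhost:9983 and the collection/field names are placeholders):

  import java.io.IOException;
  import org.apache.solr.client.solrj.io.Tuple;
  import org.apache.solr.client.solrj.io.stream.SignificantTermsStream;
  import org.apache.solr.client.solrj.io.stream.TupleStream;
  import org.apache.solr.client.solrj.io.stream.expr.StreamFactory;

  public class SignificantTermsDemo {
    public static void main(String[] args) throws IOException {
      StreamFactory factory = new StreamFactory()
          .withCollectionZkHost("collection1", "localhost:9983") // placeholder zkHost
          .withFunctionName("significantTerms", SignificantTermsStream.class);

      TupleStream stream = factory.constructStream(
          "significantTerms(collection1, q=\"*:*\", field=\"test_t\", limit=5)");
      stream.open(); // no StreamContext set, so open() creates its own SolrClientCache
      try {
        for (Tuple t = stream.read(); !t.EOF; t = stream.read()) {
          System.out.println(t.get("term") + " score=" + t.get("score"));
        }
      } finally {
        stream.close(); // shuts down the stream's executor (and the cache it created)
      }
    }
  }
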
http://git-wip-us.apache.org/repos/asf/lucene-solr/blob/dba733e7/solr/solrj/src/test/org/apache/solr/client/solrj/io/stream/StreamExpressionTest.java
----------------------------------------------------------------------
diff --git a/solr/solrj/src/test/org/apache/solr/client/solrj/io/stream/StreamExpressionTest.java b/solr/solrj/src/test/org/apache/solr/client/solrj/io/stream/StreamExpressionTest.java
index 46446d7..30b7056 100644
--- a/solr/solrj/src/test/org/apache/solr/client/solrj/io/stream/StreamExpressionTest.java
+++ b/solr/solrj/src/test/org/apache/solr/client/solrj/io/stream/StreamExpressionTest.java
@@ -96,6 +96,7 @@ public class StreamExpressionTest extends SolrCloudTestCase {
     } else {
       collection = COLLECTIONORALIAS;
     }
+
     CollectionAdminRequest.createCollection(collection, "conf", 2, 1).process(cluster.getSolrClient());
     AbstractDistribZkTestBase.waitForRecoveriesToFinish(collection, cluster.getSolrClient().getZkStateReader(),
         false, true, TIMEOUT);
@@ -4707,6 +4708,140 @@ public class StreamExpressionTest extends SolrCloudTestCase {
     CollectionAdminRequest.deleteCollection("destinationCollection").process(cluster.getSolrClient());
   }
 
+
+  @Test
+  public void testSignificantTermsStream() throws Exception {
+
+    Assume.assumeTrue(!useAlias);
+
+    UpdateRequest updateRequest = new UpdateRequest();
+    for (int i = 0; i < 5000; i++) {
+      updateRequest.add(id, "a"+i, "test_t", "a b c d m l");
+    }
+
+    for (int i = 0; i < 5000; i++) {
+      updateRequest.add(id, "b"+i, "test_t", "a b e f");
+    }
+
+    for (int i = 0; i < 900; i++) {
+      updateRequest.add(id, "c"+i, "test_t", "c");
+    }
+
+    for (int i = 0; i < 600; i++) {
+      updateRequest.add(id, "d"+i, "test_t", "d");
+    }
+
+    for (int i = 0; i < 500; i++) {
+      updateRequest.add(id, "e"+i, "test_t", "m");
+    }
+
+    updateRequest.commit(cluster.getSolrClient(), COLLECTIONORALIAS);
+
+    TupleStream stream;
+    List<Tuple> tuples;
+
+    StreamFactory factory = new StreamFactory()
+        .withCollectionZkHost("collection1", cluster.getZkServer().getZkAddress())
+        .withFunctionName("significantTerms", SignificantTermsStream.class);
+
+    String significantTerms = "significantTerms(collection1, q=\"id:a*\",  field=\"test_t\", limit=3, minTermLength=1, maxDocFreq=\".5\")";
+    stream = factory.constructStream(significantTerms);
+    tuples = getTuples(stream);
+
+    assert(tuples.size() == 3);
+    assertTrue(tuples.get(0).get("term").equals("l"));
+    assertTrue(tuples.get(0).getLong("background") == 5000);
+    assertTrue(tuples.get(0).getLong("foreground") == 5000);
+
+
+    assertTrue(tuples.get(1).get("term").equals("m"));
+    assertTrue(tuples.get(1).getLong("background") == 5500);
+    assertTrue(tuples.get(1).getLong("foreground") == 5000);
+
+    assertTrue(tuples.get(2).get("term").equals("d"));
+    assertTrue(tuples.get(2).getLong("background") == 5600);
+    assertTrue(tuples.get(2).getLong("foreground") == 5000);
+
+    //Test maxDocFreq
+    significantTerms = "significantTerms(collection1, q=\"id:a*\",  field=\"test_t\", limit=3, maxDocFreq=2650, minTermLength=1)";
+    stream = factory.constructStream(significantTerms);
+    tuples = getTuples(stream);
+
+    assert(tuples.size() == 1);
+    assertTrue(tuples.get(0).get("term").equals("l"));
+    assertTrue(tuples.get(0).getLong("background") == 5000);
+    assertTrue(tuples.get(0).getLong("foreground") == 5000);
+
+    //Test maxDocFreq percentage
+
+    significantTerms = "significantTerms(collection1, q=\"id:a*\",  field=\"test_t\", limit=3, maxDocFreq=\".45\", minTermLength=1)";
+    stream = factory.constructStream(significantTerms);
+    tuples = getTuples(stream);
+    assert(tuples.size() == 1);
+    assertTrue(tuples.get(0).get("term").equals("l"));
+    assertTrue(tuples.get(0).getLong("background") == 5000);
+    assertTrue(tuples.get(0).getLong("foreground") == 5000);
+
+
+    //Test min doc freq
+    significantTerms = "significantTerms(collection1, q=\"id:a*\",  field=\"test_t\", limit=3, minDocFreq=\"2700\", minTermLength=1, maxDocFreq=\".5\")";
+    stream = factory.constructStream(significantTerms);
+    tuples = getTuples(stream);
+
+    assert(tuples.size() == 3);
+
+    assertTrue(tuples.get(0).get("term").equals("m"));
+    assertTrue(tuples.get(0).getLong("background") == 5500);
+    assertTrue(tuples.get(0).getLong("foreground") == 5000);
+
+    assertTrue(tuples.get(1).get("term").equals("d"));
+    assertTrue(tuples.get(1).getLong("background") == 5600);
+    assertTrue(tuples.get(1).getLong("foreground") == 5000);
+
+    assertTrue(tuples.get(2).get("term").equals("c"));
+    assertTrue(tuples.get(2).getLong("background") == 5900);
+    assertTrue(tuples.get(2).getLong("foreground") == 5000);
+
+
+    //Test min doc freq percent
+    significantTerms = "significantTerms(collection1, q=\"id:a*\",  field=\"test_t\", limit=3, minDocFreq=\".478\", minTermLength=1, maxDocFreq=\".5\")";
+    stream = factory.constructStream(significantTerms);
+    tuples = getTuples(stream);
+
+    assert(tuples.size() == 1);
+
+    assertTrue(tuples.get(0).get("term").equals("c"));
+    assertTrue(tuples.get(0).getLong("background") == 5900);
+    assertTrue(tuples.get(0).getLong("foreground") == 5000);
+
+
+    //Test limit
+
+    significantTerms = "significantTerms(collection1, q=\"id:a*\",  field=\"test_t\", limit=2, minDocFreq=\"2700\", minTermLength=1, maxDocFreq=\".5\")";
+    stream = factory.constructStream(significantTerms);
+    tuples = getTuples(stream);
+
+    assert(tuples.size() == 2);
+
+    assertTrue(tuples.get(0).get("term").equals("m"));
+    assertTrue(tuples.get(0).getLong("background") == 5500);
+    assertTrue(tuples.get(0).getLong("foreground") == 5000);
+
+    assertTrue(tuples.get(1).get("term").equals("d"));
+    assertTrue(tuples.get(1).getLong("background") == 5600);
+    assertTrue(tuples.get(1).getLong("foreground") == 5000);
+
+    //Test term length
+
+    significantTerms = "significantTerms(collection1, q=\"id:a*\",  field=\"test_t\", limit=2, minDocFreq=\"2700\", minTermLength=2)";
+    stream = factory.constructStream(significantTerms);
+    tuples = getTuples(stream);
+    assert(tuples.size() == 0);
+
+  }
+
+
+
   @Test
   public void testComplementStream() throws Exception {
 


[29/50] [abbrv] lucene-solr:jira/solr-9858: SOLR-10192: Fix copy/paste in solr-ltr pom.xml template.

Posted by ab...@apache.org.
SOLR-10192: Fix copy/paste in solr-ltr pom.xml template.


Project: http://git-wip-us.apache.org/repos/asf/lucene-solr/repo
Commit: http://git-wip-us.apache.org/repos/asf/lucene-solr/commit/048b24c6
Tree: http://git-wip-us.apache.org/repos/asf/lucene-solr/tree/048b24c6
Diff: http://git-wip-us.apache.org/repos/asf/lucene-solr/diff/048b24c6

Branch: refs/heads/jira/solr-9858
Commit: 048b24c64a9b1a41e2f7c2bd3ca1c818ddd916df
Parents: ea37b9a
Author: Christine Poerschke <cp...@apache.org>
Authored: Mon Feb 27 12:31:25 2017 +0000
Committer: Christine Poerschke <cp...@apache.org>
Committed: Mon Feb 27 12:31:25 2017 +0000

----------------------------------------------------------------------
 dev-tools/maven/solr/contrib/ltr/pom.xml.template | 10 +++++-----
 solr/CHANGES.txt                                  |  2 ++
 2 files changed, 7 insertions(+), 5 deletions(-)
----------------------------------------------------------------------


http://git-wip-us.apache.org/repos/asf/lucene-solr/blob/048b24c6/dev-tools/maven/solr/contrib/ltr/pom.xml.template
----------------------------------------------------------------------
diff --git a/dev-tools/maven/solr/contrib/ltr/pom.xml.template b/dev-tools/maven/solr/contrib/ltr/pom.xml.template
index 4de59a2..68b5eeb 100644
--- a/dev-tools/maven/solr/contrib/ltr/pom.xml.template
+++ b/dev-tools/maven/solr/contrib/ltr/pom.xml.template
@@ -29,7 +29,7 @@
   <groupId>org.apache.solr</groupId>
   <artifactId>solr-ltr</artifactId>
   <packaging>jar</packaging>
-  <name>Apache Solr Analytics Package</name>
+  <name>Apache Solr Learning to Rank Package</name>
   <description>
     Apache Solr Learning to Rank Package
   </description>
@@ -57,10 +57,10 @@
       <artifactId>solr-test-framework</artifactId>
       <scope>test</scope>
     </dependency>
-    @solr-analytics.internal.dependencies@
-    @solr-analytics.external.dependencies@
-    @solr-analytics.internal.test.dependencies@
-    @solr-analytics.external.test.dependencies@
+    @solr-ltr.internal.dependencies@
+    @solr-ltr.external.dependencies@
+    @solr-ltr.internal.test.dependencies@
+    @solr-ltr.external.test.dependencies@
   </dependencies>
   <build>
     <sourceDirectory>${module-path}/src/java</sourceDirectory>

http://git-wip-us.apache.org/repos/asf/lucene-solr/blob/048b24c6/solr/CHANGES.txt
----------------------------------------------------------------------
diff --git a/solr/CHANGES.txt b/solr/CHANGES.txt
index e06603c..546b484 100644
--- a/solr/CHANGES.txt
+++ b/solr/CHANGES.txt
@@ -280,6 +280,8 @@ Bug Fixes
 
 * SOLR-10190: Fix NPE in CloudSolrClient when reading stale alias (Janosch Woschitz via Tomás Fernández Lübbe)
 
+* SOLR-10192: Fix copy/paste in solr-ltr pom.xml template. (Christine Poerschke)
+
 ==================  6.4.1 ==================
 
 Consult the LUCENE_CHANGES.txt file for additional, low level, changes in this release.


[50/50] [abbrv] lucene-solr:jira/solr-9858: Merge branch 'master' into jira/solr-9858

Posted by ab...@apache.org.
Merge branch 'master' into jira/solr-9858


Project: http://git-wip-us.apache.org/repos/asf/lucene-solr/repo
Commit: http://git-wip-us.apache.org/repos/asf/lucene-solr/commit/d5bf3506
Tree: http://git-wip-us.apache.org/repos/asf/lucene-solr/tree/d5bf3506
Diff: http://git-wip-us.apache.org/repos/asf/lucene-solr/diff/d5bf3506

Branch: refs/heads/jira/solr-9858
Commit: d5bf3506d03db763f84ab3aa48c43e5baac84e5f
Parents: 69187b7 ec13032
Author: Andrzej Bialecki <ab...@apache.org>
Authored: Wed Mar 1 10:26:42 2017 +0100
Committer: Andrzej Bialecki <ab...@apache.org>
Committed: Wed Mar 1 10:26:42 2017 +0100

----------------------------------------------------------------------
 .../maven/solr/contrib/ltr/pom.xml.template     |  10 +-
 lucene/CHANGES.txt                              |  78 ++-
 lucene/MIGRATE.txt                              |   8 +
 .../commongrams/CommonGramsQueryFilter.java     |   6 +
 .../PatternReplaceCharFilterFactory.java        |   9 +-
 .../pattern/SimplePatternSplitTokenizer.java    |   9 +-
 .../pattern/SimplePatternTokenizer.java         |   2 +-
 .../lucene/analysis/shingle/ShingleFilter.java  |   7 +-
 .../analysis/synonym/SynonymGraphFilter.java    |   8 +
 .../TestCommonGramsQueryFilterFactory.java      |  10 +
 .../TestSimplePatternSplitTokenizer.java        |  10 +
 .../pattern/TestSimplePatternTokenizer.java     |  10 +
 .../analysis/shingle/ShingleFilterTest.java     |  94 +++-
 .../synonym/TestSynonymGraphFilter.java         |  34 ++
 .../apache/lucene/index/FixBrokenOffsets.java   |  10 +
 .../index/TestBackwardsCompatibility.java       |  49 +-
 .../lucene/index/TestFixBrokenOffsets.java      |   7 +-
 .../lucene/index/TestIndexWriterOnOldIndex.java |  55 ++
 .../lucene/index/index.single-empty-doc.630.zip | Bin 0 -> 1363 bytes
 .../apache/lucene/codecs/DocValuesConsumer.java |   4 +-
 .../codecs/blocktree/BlockTreeTermsReader.java  |  30 +-
 .../blocktree/IntersectTermsEnumFrame.java      |  70 +--
 .../codecs/blocktree/SegmentTermsEnumFrame.java | 154 ++----
 .../CompressingStoredFieldsReader.java          |  19 +-
 .../CompressingStoredFieldsWriter.java          |   5 +-
 .../CompressingTermVectorsReader.java           |  19 +-
 .../CompressingTermVectorsWriter.java           |   5 +-
 .../lucene/index/ExitableDirectoryReader.java   |  21 +-
 .../apache/lucene/index/FilterCodecReader.java  |  13 +-
 .../apache/lucene/index/FilterLeafReader.java   |  73 +--
 .../org/apache/lucene/index/IndexReader.java    | 115 ++--
 .../org/apache/lucene/index/IndexWriter.java    |  20 +-
 .../org/apache/lucene/index/LeafReader.java     |  84 +--
 .../apache/lucene/index/MergeReaderWrapper.java |  20 +-
 .../org/apache/lucene/index/MergeScheduler.java |   3 -
 .../org/apache/lucene/index/MultiDocValues.java |  22 +-
 .../org/apache/lucene/index/MultiReader.java    |  11 +
 .../index/OneMergeWrappingMergePolicy.java      |  72 +++
 .../lucene/index/ParallelCompositeReader.java   |  16 +-
 .../apache/lucene/index/ParallelLeafReader.java |  36 +-
 .../apache/lucene/index/SegmentCoreReaders.java |  39 +-
 .../org/apache/lucene/index/SegmentInfos.java   |  77 ++-
 .../org/apache/lucene/index/SegmentReader.java  |  57 +-
 .../lucene/index/SlowCodecReaderWrapper.java    |   8 +-
 .../apache/lucene/index/SortingLeafReader.java  |  12 +
 .../lucene/index/StandardDirectoryReader.java   |  42 ++
 .../apache/lucene/search/DisjunctionScorer.java |   4 +-
 .../java/org/apache/lucene/search/FieldDoc.java |   6 +-
 .../org/apache/lucene/search/GraphQuery.java    | 136 -----
 .../org/apache/lucene/search/IndexSearcher.java |   4 +-
 .../org/apache/lucene/search/LRUQueryCache.java |  39 +-
 .../lucene/search/MinShouldMatchSumScorer.java  |  16 +-
 .../lucene/search/MultiLeafFieldComparator.java |  92 ++++
 .../java/org/apache/lucene/search/ScoreDoc.java |   2 +-
 .../java/org/apache/lucene/search/Scorer.java   |  12 +-
 .../java/org/apache/lucene/search/TopDocs.java  | 124 +++--
 .../apache/lucene/search/TopFieldCollector.java | 212 ++------
 .../search/UsageTrackingQueryCachingPolicy.java |  90 +++-
 .../org/apache/lucene/util/QueryBuilder.java    | 126 ++++-
 .../java/org/apache/lucene/util/Version.java    |   7 +
 .../java/org/apache/lucene/util/fst/Util.java   |  80 +--
 .../lucene/util/packed/BlockPackedReader.java   |   3 +-
 .../index/TestDemoParallelLeafReader.java       |  11 +-
 .../lucene/index/TestDirectoryReader.java       |   8 +-
 .../lucene/index/TestDirectoryReaderReopen.java |  12 +-
 .../index/TestExitableDirectoryReader.java      |  10 +
 .../lucene/index/TestFilterDirectoryReader.java |   5 +
 .../lucene/index/TestFilterLeafReader.java      |  21 +-
 .../lucene/index/TestIndexReaderClose.java      |  62 ++-
 .../apache/lucene/index/TestIndexWriter.java    |  11 +-
 .../apache/lucene/index/TestMultiTermsEnum.java |  10 +
 .../index/TestOneMergeWrappingMergePolicy.java  | 146 +++++
 .../index/TestParallelCompositeReader.java      |  33 +-
 .../apache/lucene/index/TestReadOnlyIndex.java  |   2 +-
 .../apache/lucene/index/TestSegmentInfos.java   |  11 +-
 .../lucene/search/TermInSetQueryTest.java       |  17 +-
 .../search/TestBooleanQueryVisitSubscorers.java |  45 +-
 .../apache/lucene/search/TestFuzzyQuery.java    |   5 +-
 .../apache/lucene/search/TestGraphQuery.java    |  79 ---
 .../apache/lucene/search/TestLRUQueryCache.java |  61 ++-
 .../lucene/search/TestSearcherManager.java      |  15 +
 .../lucene/search/TestSubScorerFreqs.java       |   2 +-
 .../org/apache/lucene/search/TestTermQuery.java |  15 +
 .../apache/lucene/search/TestTermScorer.java    |  10 +
 .../apache/lucene/search/TestTopDocsMerge.java  |  81 ++-
 .../TestUsageTrackingFilterCachingPolicy.java   |  92 ++++
 .../apache/lucene/util/TestQueryBuilder.java    |  29 +-
 .../lucene/demo/facet/RangeFacetsExample.java   |  42 +-
 .../lucene/facet/range/RangeFacetCounts.java    |  16 +
 .../DefaultSortedSetDocValuesReaderState.java   |   3 +-
 .../facet/taxonomy/CachedOrdinalsReader.java    |   7 +-
 .../taxonomy/OrdinalMappingLeafReader.java      |  10 +
 .../facet/AssertingSubDocsAtOnceCollector.java  |   3 +-
 .../search/highlight/TermVectorLeafReader.java  |  20 +-
 .../highlight/WeightedSpanTermExtractor.java    |  10 +
 .../MultiTermHighlighting.java                  |  20 +-
 .../uhighlight/MultiTermHighlighting.java       |  20 +-
 .../lucene/search/uhighlight/PhraseHelper.java  |  10 +
 .../TermVectorFilteredLeafReader.java           |  10 +
 .../search/uhighlight/UnifiedHighlighter.java   |  15 +
 .../uhighlight/TestUnifiedHighlighterMTQ.java   |  30 +-
 .../TestUnifiedHighlighterTermVec.java          |  15 +
 lucene/ivy-versions.properties                  |   4 +-
 .../lucene/search/join/QueryBitSetProducer.java |  14 +-
 .../search/join/ToChildBlockJoinQuery.java      |  23 +-
 .../search/join/ToParentBlockJoinQuery.java     |  23 +-
 .../apache/lucene/search/join/TestJoinUtil.java |  10 +-
 .../search/join/TestQueryBitSetProducer.java    | 110 ++++
 .../apache/lucene/index/memory/MemoryIndex.java | 373 +++++++------
 .../org/apache/lucene/index/IndexSplitter.java  |   4 +-
 .../lucene/index/MultiPassIndexSplitter.java    |  15 +
 .../apache/lucene/index/PKIndexSplitter.java    |  10 +
 .../queryparser/classic/QueryParserBase.java    |  15 -
 .../classic/TestMultiFieldQueryParser.java      |   3 +-
 .../queryparser/classic/TestQueryParser.java    | 124 +++--
 .../lucene/replicator/nrt/ReplicaNode.java      |   3 +-
 .../nrt/SegmentInfosSearcherManager.java        |   8 +-
 .../lucene/document/DoubleRangeField.java       |  20 +
 .../apache/lucene/document/FloatRangeField.java |  20 +
 .../apache/lucene/document/IntRangeField.java   |  20 +
 .../apache/lucene/document/LongRangeField.java  |  20 +
 .../apache/lucene/document/RangeFieldQuery.java |  90 ++--
 .../search/BaseRangeFieldQueryTestCase.java     |  18 +-
 .../search/TestDoubleRangeFieldQueries.java     |   9 +-
 .../search/TestFloatRangeFieldQueries.java      |   9 +-
 .../lucene/search/TestIntRangeFieldQueries.java |   9 +-
 .../search/TestLongRangeFieldQueries.java       |   9 +-
 .../suggest/document/CompletionAnalyzer.java    |   2 +-
 .../suggest/document/CompletionQuery.java       |   2 +-
 .../search/suggest/document/NRTSuggester.java   |  89 ++-
 .../search/suggest/document/SuggestField.java   |   2 +-
 .../suggest/document/SuggestIndexSearcher.java  |   7 +-
 .../search/suggest/document/TopSuggestDocs.java |  19 +
 .../document/TopSuggestDocsCollector.java       |  83 ++-
 .../suggest/document/TestContextQuery.java      |  26 +-
 .../document/TestContextSuggestField.java       |   8 +-
 .../document/TestFuzzyCompletionQuery.java      |   6 +-
 .../document/TestPrefixCompletionQuery.java     |  28 +-
 .../document/TestRegexCompletionQuery.java      |   6 +-
 .../suggest/document/TestSuggestField.java      | 278 +++++++++-
 .../lucene/index/AllDeletedFilterReader.java    |  10 +
 .../lucene/index/AssertingDirectoryReader.java  |   9 +-
 .../lucene/index/AssertingLeafReader.java       |  30 +-
 .../index/BaseStoredFieldsFormatTestCase.java   |  15 +
 .../lucene/index/FieldFilterLeafReader.java     |  12 +-
 .../lucene/index/MismatchedDirectoryReader.java |   5 +
 .../lucene/index/MismatchedLeafReader.java      |  10 +
 .../lucene/index/MockRandomMergePolicy.java     |  13 +-
 .../org/apache/lucene/search/QueryUtils.java    |  43 +-
 .../org/apache/lucene/util/LuceneTestCase.java  |  30 +-
 solr/CHANGES.txt                                |  97 +++-
 solr/NOTICE.txt                                 |  27 -
 solr/README.txt                                 |   2 +-
 solr/bin/install_solr_service.sh                |   2 +
 solr/build.xml                                  |  21 +-
 solr/common-build.xml                           |  18 +-
 solr/contrib/dataimporthandler/ivy.xml          |   4 +-
 .../solr/handler/dataimport/SolrWriter.java     |   6 +
 .../dataimport/MockInitialContextFactory.java   |  18 +-
 .../handler/dataimport/TestJdbcDataSource.java  | 475 ++++++++--------
 .../org/apache/solr/ltr/LTRScoringQuery.java    |   2 +-
 .../solr/AbstractSolrMorphlineZkTestBase.java   |  12 +
 solr/core/ivy.xml                               |   4 -
 .../solrj/embedded/EmbeddedSolrServer.java      |  55 +-
 .../org/apache/solr/cloud/ElectionContext.java  |  19 +-
 .../org/apache/solr/cloud/LeaderElector.java    |   1 +
 .../cloud/OverseerCollectionMessageHandler.java |   6 +
 .../org/apache/solr/cloud/RecoveryStrategy.java |  12 +-
 .../org/apache/solr/cloud/ZkController.java     |   9 +-
 .../org/apache/solr/core/CoreContainer.java     |  44 +-
 .../org/apache/solr/core/DirectoryFactory.java  |   9 +-
 .../apache/solr/core/MMapDirectoryFactory.java  |   4 +
 .../solr/core/MetricsDirectoryFactory.java      | 537 -------------------
 .../src/java/org/apache/solr/core/SolrCore.java |  26 +-
 .../apache/solr/core/SolrDeletionPolicy.java    |   6 -
 .../org/apache/solr/handler/BlobHandler.java    |  17 +-
 .../solr/handler/ContentStreamHandlerBase.java  |   6 +-
 .../org/apache/solr/handler/StreamHandler.java  |  38 +-
 .../solr/handler/admin/CoreAdminOperation.java  |  30 +-
 .../solr/handler/component/ExpandComponent.java |  21 +-
 .../component/HttpShardHandlerFactory.java      |   4 +-
 .../solr/handler/component/QueryComponent.java  |   1 -
 .../component/ReplicaListTransformer.java       |   2 +-
 .../handler/loader/ContentStreamLoader.java     |   2 -
 .../solr/handler/loader/JavabinLoader.java      |   3 -
 .../solr/highlight/DefaultSolrHighlighter.java  |  10 +
 .../solr/index/SlowCompositeReaderWrapper.java  |  35 +-
 .../solr/request/SolrQueryRequestBase.java      |  17 +-
 .../java/org/apache/solr/schema/PointField.java |   9 +
 .../schema/RptWithGeometrySpatialField.java     |   7 +-
 .../solr/search/CollapsingQParserPlugin.java    |  19 +-
 .../java/org/apache/solr/search/Insanity.java   |   9 +-
 .../org/apache/solr/search/QParserPlugin.java   |   2 +
 .../search/SignificantTermsQParserPlugin.java   | 260 +++++++++
 .../solr/search/function/FileFloatSource.java   |  14 +-
 .../apache/solr/servlet/SolrRequestParsers.java |  11 +-
 .../solr/store/blockcache/BlockCache.java       |  10 +-
 .../store/blockcache/BlockDirectoryCache.java   |   5 -
 .../apache/solr/store/blockcache/Metrics.java   | 121 ++---
 .../org/apache/solr/uninverting/FieldCache.java |  18 +-
 .../apache/solr/uninverting/FieldCacheImpl.java |  70 +--
 .../uninverting/FieldCacheSanityChecker.java    | 426 ---------------
 .../solr/uninverting/UninvertingReader.java     |  21 +-
 .../apache/solr/update/AddUpdateCommand.java    |   2 -
 .../solr/update/DeleteByQueryWrapper.java       |   1 +
 .../solr/update/DirectUpdateHandler2.java       |   4 +-
 .../java/org/apache/solr/update/PeerSync.java   |   3 +
 .../apache/solr/update/SolrCmdDistributor.java  |  15 +-
 .../apache/solr/update/SolrIndexSplitter.java   |  10 +
 .../solr/update/StreamingSolrClients.java       |   2 +-
 .../java/org/apache/solr/update/UpdateLog.java  |   3 +
 .../processor/AtomicUpdateDocumentMerger.java   |   2 +-
 .../processor/DistributedUpdateProcessor.java   |  30 +-
 .../DocExpirationUpdateProcessorFactory.java    |   6 +-
 .../processor/UpdateRequestProcessor.java       |  28 +-
 .../org/apache/solr/util/TestInjection.java     |   8 +-
 .../test-files/solr/collection1/conf/schema.xml |   3 +-
 .../conf/solrconfig-analytics-query.xml         |   2 +-
 .../conf/solrconfig-indexmetrics.xml            |   2 -
 .../TestEmbeddedSolrServerSchemaAPI.java        | 111 ++++
 .../cloud/ChaosMonkeyNothingIsSafeTest.java     |   6 +-
 .../apache/solr/cloud/CleanupOldIndexTest.java  |  18 +-
 .../org/apache/solr/cloud/ClusterStateTest.java |  21 +-
 .../cloud/CollectionsAPIDistributedZkTest.java  |   2 +-
 .../solr/cloud/DistributedVersionInfoTest.java  |  11 +-
 .../apache/solr/cloud/HttpPartitionTest.java    |  12 +-
 ...aderInitiatedRecoveryOnShardRestartTest.java |   4 +-
 ...verseerCollectionConfigSetProcessorTest.java | 213 +++-----
 .../solr/cloud/PeerSyncReplicationTest.java     |  28 +-
 .../solr/cloud/RecoveryAfterSoftCommitTest.java |   3 -
 .../cloud/SharedFSAutoReplicaFailoverTest.java  |   1 +
 .../TlogReplayBufferedWhileIndexingTest.java    |   2 +-
 .../hdfs/HdfsChaosMonkeyNothingIsSafeTest.java  |   3 +
 .../hdfs/HdfsChaosMonkeySafeLeaderTest.java     |   1 +
 .../HdfsCollectionsAPIDistributedZkTest.java    |   3 +
 .../HdfsWriteToMultipleCollectionsTest.java     |   7 +-
 .../solr/core/BlobRepositoryMockingTest.java    |  69 +--
 .../org/apache/solr/core/CoreSorterTest.java    |  28 +-
 .../test/org/apache/solr/core/TestNRTOpen.java  |   2 +-
 .../solr/handler/TestReplicationHandler.java    |   9 +-
 .../apache/solr/handler/TestReqParamsAPI.java   |   2 +
 .../handler/admin/CoreAdminHandlerTest.java     |   5 +-
 .../admin/CoreMergeIndexesAdminHandlerTest.java |   8 +-
 .../solr/handler/admin/TestCoreAdminApis.java   |  58 +-
 .../component/ReplicaListTransformerTest.java   |   2 +-
 .../index/TestSlowCompositeReaderWrapper.java   |  53 +-
 .../rest/schema/TestClassNameShortening.java    |   6 +-
 .../org/apache/solr/schema/PolyFieldTest.java   |  14 +-
 .../org/apache/solr/schema/TestPointFields.java |  63 ++-
 .../solr/search/AnalyticsMergeStrategyTest.java |   2 +-
 .../solr/search/AnalyticsTestQParserPlugin.java | 171 ++++++
 .../apache/solr/search/QueryEqualityTest.java   |   9 +
 .../solr/search/TestAnalyticsQParserPlugin.java | 173 ------
 .../test/org/apache/solr/search/TestDocSet.java |  20 +-
 .../solr/search/TestMaxScoreQueryParser.java    |   3 +-
 .../org/apache/solr/search/TestRecovery.java    |  55 +-
 .../apache/solr/search/TestReloadDeadlock.java  |   2 +-
 .../org/apache/solr/search/TestSearchPerf.java  |   1 +
 .../apache/solr/search/TestSolr4Spatial2.java   |   2 +-
 .../security/TestPKIAuthenticationPlugin.java   |  29 +-
 .../solr/servlet/SolrRequestParserTest.java     | 104 ++--
 .../solr/store/blockcache/BlockCacheTest.java   | 128 ++++-
 .../solr/uninverting/TestDocTermOrds.java       |   4 +-
 .../apache/solr/uninverting/TestFieldCache.java |   4 +-
 .../TestFieldCacheSanityChecker.java            | 164 ------
 .../solr/uninverting/TestLegacyFieldCache.java  |  35 +-
 .../org/apache/solr/update/PeerSyncTest.java    | 119 ++--
 .../apache/solr/update/SoftAutoCommitTest.java  |   8 +-
 .../solr/update/SolrIndexMetricsTest.java       |  44 --
 .../solr/update/TestInPlaceUpdatesDistrib.java  |  81 ++-
 .../org/apache/solr/update/UpdateLogTest.java   |   7 +-
 .../processor/TolerantUpdateProcessorTest.java  |   7 +-
 .../processor/UpdateProcessorTestBase.java      |   2 +
 solr/licenses/caffeine-1.0.1.jar.sha1           |   1 -
 solr/licenses/caffeine-2.4.0.jar.sha1           |   1 +
 solr/licenses/cglib-nodep-2.2.jar.sha1          |   1 -
 solr/licenses/cglib-nodep-LICENSE-ASL.txt       | 201 -------
 solr/licenses/cglib-nodep-NOTICE.txt            |   2 -
 solr/licenses/easymock-3.0.jar.sha1             |   1 -
 solr/licenses/easymock-LICENSE-MIT.txt          |  17 -
 solr/scripts/README.txt                         |  13 -
 solr/scripts/abc                                | 159 ------
 solr/scripts/abo                                | 158 ------
 solr/scripts/backup                             | 109 ----
 solr/scripts/backupcleaner                      | 134 -----
 solr/scripts/commit                             | 109 ----
 solr/scripts/optimize                           | 109 ----
 solr/scripts/rsyncd-disable                     |  77 ---
 solr/scripts/rsyncd-enable                      |  76 ---
 solr/scripts/rsyncd-start                       | 147 -----
 solr/scripts/rsyncd-stop                        | 105 ----
 solr/scripts/scripts-util                       | 141 -----
 solr/scripts/snapcleaner                        | 146 -----
 solr/scripts/snapinstaller                      | 190 -------
 solr/scripts/snappuller                         | 261 ---------
 solr/scripts/snappuller-disable                 |  77 ---
 solr/scripts/snappuller-enable                  |  77 ---
 solr/scripts/snapshooter                        | 128 -----
 solr/site/online-link.xsl                       |  69 +++
 solr/solrj/ivy.xml                              |   5 +-
 .../solr/client/solrj/impl/CloudSolrClient.java |   3 +
 .../solrj/impl/ConcurrentUpdateSolrClient.java  | 317 ++++++++---
 .../solr/client/solrj/impl/HttpClientUtil.java  |   8 +-
 .../solrj/io/stream/ScoreNodesStream.java       |   2 +-
 .../solrj/io/stream/SignificantTermsStream.java | 444 +++++++++++++++
 .../org/apache/solr/common/util/NamedList.java  |   4 +-
 .../solr/common/util/ObjectReleaseTracker.java  |   6 +-
 .../solr/client/solrj/TestLBHttpSolrClient.java |   6 +-
 .../solrj/impl/CloudSolrClientCacheTest.java    |  24 +-
 .../client/solrj/impl/CloudSolrClientTest.java  |  25 +
 .../solrj/io/graph/GraphExpressionTest.java     |  11 +-
 .../solrj/io/stream/StreamExpressionTest.java   | 135 +++++
 .../client/solrj/request/TestCoreAdmin.java     |  49 ++
 .../java/org/apache/solr/SolrTestCaseJ4.java    |  18 +-
 .../cloud/AbstractFullDistribZkTestBase.java    |   9 +
 .../apache/solr/cloud/MiniSolrCloudCluster.java |  21 +-
 .../java/org/apache/solr/cloud/SocketProxy.java |   4 +
 .../apache/solr/cloud/SolrCloudTestCase.java    |   2 +-
 solr/webapp/web/js/angular/controllers/files.js |   4 +-
 solr/webapp/web/js/angular/controllers/query.js |   6 +-
 solr/webapp/web/partials/files.html             |   2 +-
 solr/webapp/web/partials/query.html             |   2 +-
 322 files changed, 6463 insertions(+), 6953 deletions(-)
----------------------------------------------------------------------


http://git-wip-us.apache.org/repos/asf/lucene-solr/blob/d5bf3506/solr/core/src/java/org/apache/solr/cloud/ElectionContext.java
----------------------------------------------------------------------

http://git-wip-us.apache.org/repos/asf/lucene-solr/blob/d5bf3506/solr/core/src/java/org/apache/solr/cloud/ZkController.java
----------------------------------------------------------------------

http://git-wip-us.apache.org/repos/asf/lucene-solr/blob/d5bf3506/solr/core/src/java/org/apache/solr/core/CoreContainer.java
----------------------------------------------------------------------

http://git-wip-us.apache.org/repos/asf/lucene-solr/blob/d5bf3506/solr/core/src/java/org/apache/solr/core/SolrCore.java
----------------------------------------------------------------------

http://git-wip-us.apache.org/repos/asf/lucene-solr/blob/d5bf3506/solr/core/src/java/org/apache/solr/update/PeerSync.java
----------------------------------------------------------------------

http://git-wip-us.apache.org/repos/asf/lucene-solr/blob/d5bf3506/solr/solrj/src/test/org/apache/solr/client/solrj/request/TestCoreAdmin.java
----------------------------------------------------------------------


[16/50] [abbrv] lucene-solr:jira/solr-9858: SOLR-10156: Increase the overfetch

Posted by ab...@apache.org.
SOLR-10156: Increase the overfetch


Project: http://git-wip-us.apache.org/repos/asf/lucene-solr/repo
Commit: http://git-wip-us.apache.org/repos/asf/lucene-solr/commit/a0aef2fa
Tree: http://git-wip-us.apache.org/repos/asf/lucene-solr/tree/a0aef2fa
Diff: http://git-wip-us.apache.org/repos/asf/lucene-solr/diff/a0aef2fa

Branch: refs/heads/jira/solr-9858
Commit: a0aef2faaf7da56efc8ac4b004e9d3b8dc401e81
Parents: dba733e
Author: Joel Bernstein <jb...@apache.org>
Authored: Thu Feb 23 15:12:08 2017 -0500
Committer: Joel Bernstein <jb...@apache.org>
Committed: Thu Feb 23 15:12:08 2017 -0500

----------------------------------------------------------------------
 .../apache/solr/client/solrj/io/stream/SignificantTermsStream.java | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)
----------------------------------------------------------------------


http://git-wip-us.apache.org/repos/asf/lucene-solr/blob/a0aef2fa/solr/solrj/src/java/org/apache/solr/client/solrj/io/stream/SignificantTermsStream.java
----------------------------------------------------------------------
diff --git a/solr/solrj/src/java/org/apache/solr/client/solrj/io/stream/SignificantTermsStream.java b/solr/solrj/src/java/org/apache/solr/client/solrj/io/stream/SignificantTermsStream.java
index f077421..f5f8a06 100644
--- a/solr/solrj/src/java/org/apache/solr/client/solrj/io/stream/SignificantTermsStream.java
+++ b/solr/solrj/src/java/org/apache/solr/client/solrj/io/stream/SignificantTermsStream.java
@@ -433,7 +433,7 @@ public class SignificantTermsStream extends TupleStream implements Expressible{
       params.add("maxDocFreq", Float.toString(maxDocFreq));
       params.add("minTermLength", Integer.toString(minTermLength));
       params.add("field", field);
-      params.add("numTerms", String.valueOf(numTerms*3));
+      params.add("numTerms", String.valueOf(numTerms*5));
 
       QueryRequest request= new QueryRequest(params);
       QueryResponse response = request.process(solrClient);

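For context on the one-line change above, here is a minimal, self-contained sketch (hypothetical class and constant names, not the Solr API) of the over-fetch pattern behind it. The likely rationale, inferred from how distributed top-K aggregations generally work: a term can be globally significant yet rank below position K on every individual shard, so each shard is asked for several times K candidates before the global merge; raising the factor from 3 to 5 trades extra per-shard work for a lower chance of missing such terms.

// Hypothetical illustration of distributed top-K over-fetch. A term can
// be globally significant yet rank below K on every individual shard, so
// each shard is asked for K * OVERFETCH candidates before the global
// merge ranks them and keeps the top K.
public class OverfetchSketch {
  static final int OVERFETCH = 5; // the factor this commit raises from 3

  public static void main(String[] args) {
    int numTerms = 20; // top-K the client asked for
    int perShardRequest = numTerms * OVERFETCH;
    System.out.println("request " + perShardRequest
        + " candidate terms from each shard, merge, keep top " + numTerms);
  }
}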

[44/50] [abbrv] lucene-solr:jira/solr-9858: tests: null out static

Posted by ab...@apache.org.
tests: null out static


Project: http://git-wip-us.apache.org/repos/asf/lucene-solr/repo
Commit: http://git-wip-us.apache.org/repos/asf/lucene-solr/commit/2adc11c7
Tree: http://git-wip-us.apache.org/repos/asf/lucene-solr/tree/2adc11c7
Diff: http://git-wip-us.apache.org/repos/asf/lucene-solr/diff/2adc11c7

Branch: refs/heads/jira/solr-9858
Commit: 2adc11c70af98feb8842f7349001374fb4785194
Parents: 0010867
Author: markrmiller <ma...@apache.org>
Authored: Tue Feb 28 12:12:34 2017 -0500
Committer: markrmiller <ma...@apache.org>
Committed: Tue Feb 28 12:12:34 2017 -0500

----------------------------------------------------------------------
 solr/core/src/test/org/apache/solr/update/UpdateLogTest.java | 1 +
 1 file changed, 1 insertion(+)
----------------------------------------------------------------------


http://git-wip-us.apache.org/repos/asf/lucene-solr/blob/2adc11c7/solr/core/src/test/org/apache/solr/update/UpdateLogTest.java
----------------------------------------------------------------------
diff --git a/solr/core/src/test/org/apache/solr/update/UpdateLogTest.java b/solr/core/src/test/org/apache/solr/update/UpdateLogTest.java
index 9b1d611..e9269b0 100644
--- a/solr/core/src/test/org/apache/solr/update/UpdateLogTest.java
+++ b/solr/core/src/test/org/apache/solr/update/UpdateLogTest.java
@@ -64,6 +64,7 @@ public class UpdateLogTest extends SolrTestCaseJ4 {
     System.clearProperty("solr.tests.longClassName");
     System.clearProperty("solr.tests.floatClassName");
     System.clearProperty("solr.tests.doubleClassName");
+    ulog = null;
   }
 
   @Test

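The one-line fix above follows a common JVM test-suite pattern; a minimal sketch, assuming JUnit 4 and a hypothetical test class: static fixtures are nulled in @AfterClass so the expensive object becomes unreachable once the suite finishes, instead of staying pinned for the rest of the JVM's life (test runners keep loaded test classes around).

import org.junit.AfterClass;
import org.junit.Assert;
import org.junit.BeforeClass;
import org.junit.Test;

public class StaticFixtureTest {
  private static byte[] bigFixture; // expensive state shared by the whole suite

  @BeforeClass
  public static void init() {
    bigFixture = new byte[64 * 1024 * 1024]; // built once for all tests
  }

  @Test
  public void usesFixture() {
    Assert.assertNotNull(bigFixture);
  }

  @AfterClass
  public static void cleanup() {
    bigFixture = null; // same idea as "ulog = null" above: unpin it for the GC
  }
}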

[38/50] [abbrv] lucene-solr:jira/solr-9858: LUCENE-7709: Remove unused backward compatibility logic.

Posted by ab...@apache.org.
LUCENE-7709: Remove unused backward compatibility logic.


Project: http://git-wip-us.apache.org/repos/asf/lucene-solr/repo
Commit: http://git-wip-us.apache.org/repos/asf/lucene-solr/commit/c7fd1437
Tree: http://git-wip-us.apache.org/repos/asf/lucene-solr/tree/c7fd1437
Diff: http://git-wip-us.apache.org/repos/asf/lucene-solr/diff/c7fd1437

Branch: refs/heads/jira/solr-9858
Commit: c7fd1437706a21d0571c5fced2e2e734563fa895
Parents: d9c0f25
Author: Adrien Grand <jp...@gmail.com>
Authored: Tue Feb 28 13:38:04 2017 +0100
Committer: Adrien Grand <jp...@gmail.com>
Committed: Tue Feb 28 13:38:04 2017 +0100

----------------------------------------------------------------------
 .../codecs/blocktree/BlockTreeTermsReader.java  |  30 +---
 .../blocktree/IntersectTermsEnumFrame.java      |  70 ++-------
 .../codecs/blocktree/SegmentTermsEnumFrame.java | 154 ++++---------------
 .../CompressingStoredFieldsReader.java          |  19 +--
 .../CompressingStoredFieldsWriter.java          |   5 +-
 .../CompressingTermVectorsReader.java           |  19 +--
 .../CompressingTermVectorsWriter.java           |   5 +-
 7 files changed, 67 insertions(+), 235 deletions(-)
----------------------------------------------------------------------


http://git-wip-us.apache.org/repos/asf/lucene-solr/blob/c7fd1437/lucene/core/src/java/org/apache/lucene/codecs/blocktree/BlockTreeTermsReader.java
----------------------------------------------------------------------
diff --git a/lucene/core/src/java/org/apache/lucene/codecs/blocktree/BlockTreeTermsReader.java b/lucene/core/src/java/org/apache/lucene/codecs/blocktree/BlockTreeTermsReader.java
index 6fc9a24..8d31f18 100644
--- a/lucene/core/src/java/org/apache/lucene/codecs/blocktree/BlockTreeTermsReader.java
+++ b/lucene/core/src/java/org/apache/lucene/codecs/blocktree/BlockTreeTermsReader.java
@@ -98,14 +98,7 @@ public final class BlockTreeTermsReader extends FieldsProducer {
   final static String TERMS_CODEC_NAME = "BlockTreeTermsDict";
 
   /** Initial terms format. */
-  public static final int VERSION_START = 0;
-
-  /** Auto-prefix terms. */
-  public static final int VERSION_AUTO_PREFIX_TERMS = 1;
-
-  /** Conditional auto-prefix terms: we record at write time whether
-   *  this field did write any auto-prefix terms. */
-  public static final int VERSION_AUTO_PREFIX_TERMS_COND = 2;
+  public static final int VERSION_START = 2;
 
   /** Auto-prefix terms have been superseded by points. */
   public static final int VERSION_AUTO_PREFIX_TERMS_REMOVED = 3;
@@ -138,8 +131,6 @@ public final class BlockTreeTermsReader extends FieldsProducer {
   
   final int version;
 
-  final boolean anyAutoPrefixTerms;
-
   /** Sole constructor. */
   public BlockTreeTermsReader(PostingsReaderBase postingsReader, SegmentReadState state) throws IOException {
     boolean success = false;
@@ -153,22 +144,11 @@ public final class BlockTreeTermsReader extends FieldsProducer {
       termsIn = state.directory.openInput(termsName, state.context);
       version = CodecUtil.checkIndexHeader(termsIn, TERMS_CODEC_NAME, VERSION_START, VERSION_CURRENT, state.segmentInfo.getId(), state.segmentSuffix);
 
-      if (version < VERSION_AUTO_PREFIX_TERMS || version >= VERSION_AUTO_PREFIX_TERMS_REMOVED) {
-        // Old (pre-5.2.0) or recent (6.2.0+) index, no auto-prefix terms:
-        this.anyAutoPrefixTerms = false;
-      } else if (version == VERSION_AUTO_PREFIX_TERMS) {
-        // 5.2.x index, might have auto-prefix terms:
-        this.anyAutoPrefixTerms = true;
-      } else {
-        // 5.3.x index, we record up front if we may have written any auto-prefix terms:
-        assert version == VERSION_AUTO_PREFIX_TERMS_COND;
+      if (version < VERSION_AUTO_PREFIX_TERMS_REMOVED) {
+        // pre-6.2 index, records whether auto-prefix terms are enabled in the header
         byte b = termsIn.readByte();
-        if (b == 0) {
-          this.anyAutoPrefixTerms = false;
-        } else if (b == 1) {
-          this.anyAutoPrefixTerms = true;
-        } else {
-          throw new CorruptIndexException("invalid anyAutoPrefixTerms: expected 0 or 1 but got " + b, termsIn);
+        if (b != 0) {
+          throw new CorruptIndexException("Index header pretends the index has auto-prefix terms: " + b, termsIn);
         }
       }
 

http://git-wip-us.apache.org/repos/asf/lucene-solr/blob/c7fd1437/lucene/core/src/java/org/apache/lucene/codecs/blocktree/IntersectTermsEnumFrame.java
----------------------------------------------------------------------
diff --git a/lucene/core/src/java/org/apache/lucene/codecs/blocktree/IntersectTermsEnumFrame.java b/lucene/core/src/java/org/apache/lucene/codecs/blocktree/IntersectTermsEnumFrame.java
index 3241075..578e145 100644
--- a/lucene/core/src/java/org/apache/lucene/codecs/blocktree/IntersectTermsEnumFrame.java
+++ b/lucene/core/src/java/org/apache/lucene/codecs/blocktree/IntersectTermsEnumFrame.java
@@ -77,8 +77,6 @@ final class IntersectTermsEnumFrame {
   int transitionIndex;
   int transitionCount;
 
-  final boolean versionAutoPrefix;
-
   FST.Arc<BytesRef> arc;
 
   final BlockTermState termState;
@@ -116,7 +114,6 @@ final class IntersectTermsEnumFrame {
     this.termState = ite.fr.parent.postingsReader.newTermState();
     this.termState.totalTermFreq = -1;
     this.longs = new long[ite.fr.longsSize];
-    this.versionAutoPrefix = ite.fr.parent.anyAutoPrefixTerms;
   }
 
   void loadNextFloorBlock() throws IOException {
@@ -252,64 +249,17 @@ final class IntersectTermsEnumFrame {
     assert nextEnt != -1 && nextEnt < entCount: "nextEnt=" + nextEnt + " entCount=" + entCount + " fp=" + fp;
     nextEnt++;
     final int code = suffixesReader.readVInt();
-    if (versionAutoPrefix == false) {
-      suffix = code >>> 1;
-      startBytePos = suffixesReader.getPosition();
-      suffixesReader.skipBytes(suffix);
-      if ((code & 1) == 0) {
-        // A normal term
-        termState.termBlockOrd++;
-        return false;
-      } else {
-        // A sub-block; make sub-FP absolute:
-        lastSubFP = fp - suffixesReader.readVLong();
-        return true;
-      }
+    suffix = code >>> 1;
+    startBytePos = suffixesReader.getPosition();
+    suffixesReader.skipBytes(suffix);
+    if ((code & 1) == 0) {
+      // A normal term
+      termState.termBlockOrd++;
+      return false;
     } else {
-      suffix = code >>> 2;
-      startBytePos = suffixesReader.getPosition();
-      suffixesReader.skipBytes(suffix);
-      switch (code & 3) {
-      case 0:
-        // A normal term
-        isAutoPrefixTerm = false;
-        termState.termBlockOrd++;
-        return false;
-      case 1:
-        // A sub-block; make sub-FP absolute:
-        isAutoPrefixTerm = false;
-        lastSubFP = fp - suffixesReader.readVLong();
-        return true;
-      case 2:
-        // A normal prefix term, suffix leads with empty string
-        floorSuffixLeadStart = -1;
-        termState.termBlockOrd++;
-        floorSuffixLeadEnd = suffixesReader.readByte() & 0xff;
-        if (floorSuffixLeadEnd == 0xff) {
-          floorSuffixLeadEnd = -1;
-        }
-        isAutoPrefixTerm = true;
-        return false;
-      case 3:
-        // A floor'd prefix term, suffix leads with real byte
-        if (suffix == 0) {
-          // TODO: this is messy, but necessary because we are an auto-prefix term, but our suffix is the empty string here, so we have to
-          // look at the parent block to get the lead suffix byte:
-          assert ord > 0;
-          IntersectTermsEnumFrame parent = ite.stack[ord-1];
-          floorSuffixLeadStart = parent.suffixBytes[parent.startBytePos+parent.suffix-1] & 0xff;
-        } else {
-          floorSuffixLeadStart = suffixBytes[startBytePos+suffix-1] & 0xff;
-        }
-        termState.termBlockOrd++;
-        isAutoPrefixTerm = true;
-        floorSuffixLeadEnd = suffixesReader.readByte() & 0xff;
-        return false;
-      default:
-        // Silly javac:
-        assert false;
-        return false;
-      }
+      // A sub-block; make sub-FP absolute:
+      lastSubFP = fp - suffixesReader.readVLong();
+      return true;
     }
   }
 

http://git-wip-us.apache.org/repos/asf/lucene-solr/blob/c7fd1437/lucene/core/src/java/org/apache/lucene/codecs/blocktree/SegmentTermsEnumFrame.java
----------------------------------------------------------------------
diff --git a/lucene/core/src/java/org/apache/lucene/codecs/blocktree/SegmentTermsEnumFrame.java b/lucene/core/src/java/org/apache/lucene/codecs/blocktree/SegmentTermsEnumFrame.java
index a2abbaf..0860b30 100644
--- a/lucene/core/src/java/org/apache/lucene/codecs/blocktree/SegmentTermsEnumFrame.java
+++ b/lucene/core/src/java/org/apache/lucene/codecs/blocktree/SegmentTermsEnumFrame.java
@@ -37,8 +37,6 @@ final class SegmentTermsEnumFrame {
 
   FST.Arc<BytesRef> arc;
 
-  final boolean versionAutoPrefix;
-
   //static boolean DEBUG = BlockTreeTermsWriter.DEBUG;
 
   // File pointer where this block was loaded from
@@ -100,7 +98,6 @@ final class SegmentTermsEnumFrame {
     this.state = ste.fr.parent.postingsReader.newTermState();
     this.state.totalTermFreq = -1;
     this.longs = new long[ste.fr.longsSize];
-    this.versionAutoPrefix = ste.fr.parent.anyAutoPrefixTerms;
   }
 
   public void setFloorData(ByteArrayDataInput in, BytesRef source) {
@@ -302,58 +299,26 @@ final class SegmentTermsEnumFrame {
       assert nextEnt != -1 && nextEnt < entCount: "nextEnt=" + nextEnt + " entCount=" + entCount + " fp=" + fp;
       nextEnt++;
       final int code = suffixesReader.readVInt();
-      if (versionAutoPrefix == false) {
-        suffix = code >>> 1;
-        startBytePos = suffixesReader.getPosition();
-        ste.term.setLength(prefix + suffix);
-        ste.term.grow(ste.term.length());
-        suffixesReader.readBytes(ste.term.bytes(), prefix, suffix);
-        if ((code & 1) == 0) {
-          // A normal term
-          ste.termExists = true;
-          subCode = 0;
-          state.termBlockOrd++;
-          return false;
-        } else {
-          // A sub-block; make sub-FP absolute:
-          ste.termExists = false;
-          subCode = suffixesReader.readVLong();
-          lastSubFP = fp - subCode;
-          //if (DEBUG) {
-          //System.out.println("    lastSubFP=" + lastSubFP);
-          //}
-          return true;
-        }
+      suffix = code >>> 1;
+      startBytePos = suffixesReader.getPosition();
+      ste.term.setLength(prefix + suffix);
+      ste.term.grow(ste.term.length());
+      suffixesReader.readBytes(ste.term.bytes(), prefix, suffix);
+      if ((code & 1) == 0) {
+        // A normal term
+        ste.termExists = true;
+        subCode = 0;
+        state.termBlockOrd++;
+        return false;
       } else {
-        suffix = code >>> 2;
-        startBytePos = suffixesReader.getPosition();
-        ste.term.setLength(prefix + suffix);
-        ste.term.grow(ste.term.length());
-        suffixesReader.readBytes(ste.term.bytes(), prefix, suffix);
-
-        switch(code & 3) {
-        case 0:
-          // A normal term
-          ste.termExists = true;
-          subCode = 0;
-          state.termBlockOrd++;
-          return false;
-        case 1:
-          // A sub-block; make sub-FP absolute:
-          ste.termExists = false;
-          subCode = suffixesReader.readVLong();
-          lastSubFP = fp - subCode;
-          //if (DEBUG) {
-          //System.out.println("    lastSubFP=" + lastSubFP);
-          //}
-          return true;
-        case 2:
-        case 3:
-          // A prefix term: skip it
-          state.termBlockOrd++;
-          suffixesReader.readByte();
-          continue;
-        }
+        // A sub-block; make sub-FP absolute:
+        ste.termExists = false;
+        subCode = suffixesReader.readVLong();
+        lastSubFP = fp - subCode;
+        //if (DEBUG) {
+        //System.out.println("    lastSubFP=" + lastSubFP);
+        //}
+        return true;
       }
     }
   }
@@ -497,38 +462,16 @@ final class SegmentTermsEnumFrame {
       assert nextEnt < entCount;
       nextEnt++;
       final int code = suffixesReader.readVInt();
-      if (versionAutoPrefix == false) {
-        suffixesReader.skipBytes(code >>> 1);
-        if ((code & 1) != 0) {
-          final long subCode = suffixesReader.readVLong();
-          if (targetSubCode == subCode) {
-            //if (DEBUG) System.out.println("        match!");
-            lastSubFP = subFP;
-            return;
-          }
-        } else {
-          state.termBlockOrd++;
+      suffixesReader.skipBytes(code >>> 1);
+      if ((code & 1) != 0) {
+        final long subCode = suffixesReader.readVLong();
+        if (targetSubCode == subCode) {
+          //if (DEBUG) System.out.println("        match!");
+          lastSubFP = subFP;
+          return;
         }
       } else {
-        int flag = code & 3;
-        suffixesReader.skipBytes(code >>> 2);
-        //if (DEBUG) System.out.println("    " + nextEnt + " (of " + entCount + ") ent isSubBlock=" + ((code&1)==1));
-        if (flag == 1) {
-          // Sub-block
-          final long subCode = suffixesReader.readVLong();
-          //if (DEBUG) System.out.println("      subCode=" + subCode);
-          if (targetSubCode == subCode) {
-            //if (DEBUG) System.out.println("        match!");
-            lastSubFP = subFP;
-            return;
-          }
-        } else {
-          state.termBlockOrd++;
-          if (flag == 2 || flag == 3) {
-            // Floor'd prefix term
-            suffixesReader.readByte();
-          }
-        }
+        state.termBlockOrd++;
       }
     }
   }
@@ -691,11 +634,7 @@ final class SegmentTermsEnumFrame {
       nextEnt++;
 
       final int code = suffixesReader.readVInt();
-      if (versionAutoPrefix == false) {
-        suffix = code >>> 1;
-      } else {
-        suffix = code >>> 2;
-      }
+      suffix = code >>> 1;
 
       //if (DEBUG) {
       //  BytesRef suffixBytesRef = new BytesRef();
@@ -708,38 +647,13 @@ final class SegmentTermsEnumFrame {
       final int termLen = prefix + suffix;
       startBytePos = suffixesReader.getPosition();
       suffixesReader.skipBytes(suffix);
-      if (versionAutoPrefix == false) {
-        ste.termExists = (code & 1) == 0;
-        if (ste.termExists) {
-          state.termBlockOrd++;
-          subCode = 0;
-        } else {
-          subCode = suffixesReader.readVLong();
-          lastSubFP = fp - subCode;
-        }
+      ste.termExists = (code & 1) == 0;
+      if (ste.termExists) {
+        state.termBlockOrd++;
+        subCode = 0;
       } else {
-        switch (code & 3) {
-        case 0:
-          // Normal term
-          ste.termExists = true;
-          state.termBlockOrd++;
-          subCode = 0;
-          break;
-        case 1:
-          // Sub-block
-          ste.termExists = false;
-          subCode = suffixesReader.readVLong();
-          lastSubFP = fp - subCode;
-          break;
-        case 2:
-        case 3:
-          // Floor prefix term: skip it
-          //if (DEBUG) System.out.println("        skip floor prefix term");
-          suffixesReader.readByte();
-          ste.termExists = false;
-          state.termBlockOrd++;
-          continue;
-        }
+        subCode = suffixesReader.readVLong();
+        lastSubFP = fp - subCode;
       }
 
       final int targetLimit = target.offset + (target.length < termLen ? target.length : termLen);

http://git-wip-us.apache.org/repos/asf/lucene-solr/blob/c7fd1437/lucene/core/src/java/org/apache/lucene/codecs/compressing/CompressingStoredFieldsReader.java
----------------------------------------------------------------------
diff --git a/lucene/core/src/java/org/apache/lucene/codecs/compressing/CompressingStoredFieldsReader.java b/lucene/core/src/java/org/apache/lucene/codecs/compressing/CompressingStoredFieldsReader.java
index f496928..62508f8 100644
--- a/lucene/core/src/java/org/apache/lucene/codecs/compressing/CompressingStoredFieldsReader.java
+++ b/lucene/core/src/java/org/apache/lucene/codecs/compressing/CompressingStoredFieldsReader.java
@@ -36,7 +36,6 @@ import static org.apache.lucene.codecs.compressing.CompressingStoredFieldsWriter
 import static org.apache.lucene.codecs.compressing.CompressingStoredFieldsWriter.TYPE_BITS;
 import static org.apache.lucene.codecs.compressing.CompressingStoredFieldsWriter.TYPE_MASK;
 import static org.apache.lucene.codecs.compressing.CompressingStoredFieldsWriter.VERSION_CURRENT;
-import static org.apache.lucene.codecs.compressing.CompressingStoredFieldsWriter.VERSION_CHUNK_STATS;
 import static org.apache.lucene.codecs.compressing.CompressingStoredFieldsWriter.VERSION_START;
 
 import java.io.EOFException;
@@ -161,18 +160,14 @@ public final class CompressingStoredFieldsReader extends StoredFieldsReader {
       decompressor = compressionMode.newDecompressor();
       this.merging = false;
       this.state = new BlockState();
-      
-      if (version >= VERSION_CHUNK_STATS) {
-        fieldsStream.seek(maxPointer);
-        numChunks = fieldsStream.readVLong();
-        numDirtyChunks = fieldsStream.readVLong();
-        if (numDirtyChunks > numChunks) {
-          throw new CorruptIndexException("invalid chunk counts: dirty=" + numDirtyChunks + ", total=" + numChunks, fieldsStream);
-        }
-      } else {
-        numChunks = numDirtyChunks = -1;
+
+      fieldsStream.seek(maxPointer);
+      numChunks = fieldsStream.readVLong();
+      numDirtyChunks = fieldsStream.readVLong();
+      if (numDirtyChunks > numChunks) {
+        throw new CorruptIndexException("invalid chunk counts: dirty=" + numDirtyChunks + ", total=" + numChunks, fieldsStream);
       }
-      
+
       // NOTE: data file is too costly to verify checksum against all the bytes on open,
       // but for now we at least verify proper structure of the checksum footer: which looks
       // for FOOTER_MAGIC + algorithmID. This is cheap and can detect some forms of corruption

http://git-wip-us.apache.org/repos/asf/lucene-solr/blob/c7fd1437/lucene/core/src/java/org/apache/lucene/codecs/compressing/CompressingStoredFieldsWriter.java
----------------------------------------------------------------------
diff --git a/lucene/core/src/java/org/apache/lucene/codecs/compressing/CompressingStoredFieldsWriter.java b/lucene/core/src/java/org/apache/lucene/codecs/compressing/CompressingStoredFieldsWriter.java
index 5b42870..8cd8ccb 100644
--- a/lucene/core/src/java/org/apache/lucene/codecs/compressing/CompressingStoredFieldsWriter.java
+++ b/lucene/core/src/java/org/apache/lucene/codecs/compressing/CompressingStoredFieldsWriter.java
@@ -73,9 +73,8 @@ public final class CompressingStoredFieldsWriter extends StoredFieldsWriter {
 
   static final String CODEC_SFX_IDX = "Index";
   static final String CODEC_SFX_DAT = "Data";
-  static final int VERSION_START = 0;
-  static final int VERSION_CHUNK_STATS = 1;
-  static final int VERSION_CURRENT = VERSION_CHUNK_STATS;
+  static final int VERSION_START = 1;
+  static final int VERSION_CURRENT = VERSION_START;
 
   private final String segment;
   private CompressingStoredFieldsIndexWriter indexWriter;

http://git-wip-us.apache.org/repos/asf/lucene-solr/blob/c7fd1437/lucene/core/src/java/org/apache/lucene/codecs/compressing/CompressingTermVectorsReader.java
----------------------------------------------------------------------
diff --git a/lucene/core/src/java/org/apache/lucene/codecs/compressing/CompressingTermVectorsReader.java b/lucene/core/src/java/org/apache/lucene/codecs/compressing/CompressingTermVectorsReader.java
index f0d1640..aa19f20 100644
--- a/lucene/core/src/java/org/apache/lucene/codecs/compressing/CompressingTermVectorsReader.java
+++ b/lucene/core/src/java/org/apache/lucene/codecs/compressing/CompressingTermVectorsReader.java
@@ -59,7 +59,6 @@ import static org.apache.lucene.codecs.compressing.CompressingTermVectorsWriter.
 import static org.apache.lucene.codecs.compressing.CompressingTermVectorsWriter.POSITIONS;
 import static org.apache.lucene.codecs.compressing.CompressingTermVectorsWriter.VECTORS_EXTENSION;
 import static org.apache.lucene.codecs.compressing.CompressingTermVectorsWriter.VECTORS_INDEX_EXTENSION;
-import static org.apache.lucene.codecs.compressing.CompressingTermVectorsWriter.VERSION_CHUNK_STATS;
 import static org.apache.lucene.codecs.compressing.CompressingTermVectorsWriter.VERSION_CURRENT;
 import static org.apache.lucene.codecs.compressing.CompressingTermVectorsWriter.VERSION_START;
 
@@ -148,18 +147,14 @@ public final class CompressingTermVectorsReader extends TermVectorsReader implem
       assert CodecUtil.indexHeaderLength(codecNameDat, segmentSuffix) == vectorsStream.getFilePointer();
       
       long pos = vectorsStream.getFilePointer();
-      
-      if (version >= VERSION_CHUNK_STATS) {
-        vectorsStream.seek(maxPointer);
-        numChunks = vectorsStream.readVLong();
-        numDirtyChunks = vectorsStream.readVLong();
-        if (numDirtyChunks > numChunks) {
-          throw new CorruptIndexException("invalid chunk counts: dirty=" + numDirtyChunks + ", total=" + numChunks, vectorsStream);
-        }
-      } else {
-        numChunks = numDirtyChunks = -1;
+
+      vectorsStream.seek(maxPointer);
+      numChunks = vectorsStream.readVLong();
+      numDirtyChunks = vectorsStream.readVLong();
+      if (numDirtyChunks > numChunks) {
+        throw new CorruptIndexException("invalid chunk counts: dirty=" + numDirtyChunks + ", total=" + numChunks, vectorsStream);
       }
-      
+
       // NOTE: data file is too costly to verify checksum against all the bytes on open,
       // but for now we at least verify proper structure of the checksum footer: which looks
       // for FOOTER_MAGIC + algorithmID. This is cheap and can detect some forms of corruption

http://git-wip-us.apache.org/repos/asf/lucene-solr/blob/c7fd1437/lucene/core/src/java/org/apache/lucene/codecs/compressing/CompressingTermVectorsWriter.java
----------------------------------------------------------------------
diff --git a/lucene/core/src/java/org/apache/lucene/codecs/compressing/CompressingTermVectorsWriter.java b/lucene/core/src/java/org/apache/lucene/codecs/compressing/CompressingTermVectorsWriter.java
index 9bd2483..26fe890 100644
--- a/lucene/core/src/java/org/apache/lucene/codecs/compressing/CompressingTermVectorsWriter.java
+++ b/lucene/core/src/java/org/apache/lucene/codecs/compressing/CompressingTermVectorsWriter.java
@@ -64,9 +64,8 @@ public final class CompressingTermVectorsWriter extends TermVectorsWriter {
   static final String CODEC_SFX_IDX = "Index";
   static final String CODEC_SFX_DAT = "Data";
 
-  static final int VERSION_START = 0;
-  static final int VERSION_CHUNK_STATS = 1;
-  static final int VERSION_CURRENT = VERSION_CHUNK_STATS;
+  static final int VERSION_START = 1;
+  static final int VERSION_CURRENT = VERSION_START;
 
   static final int PACKED_BLOCK_SIZE = 64;
 

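A minimal, self-contained sketch (hypothetical names, not the BlockTree or compressing-codec API) of the simplification applied throughout this commit: once the minimum supported on-disk version is raised, the version is validated once when the file is opened, and every per-entry "if (version < X)" branch can be deleted from the read path.

// Hypothetical illustration of dropping backward-compatibility branches
// by raising the minimum supported format version.
public class VersionGateSketch {
  static final int VERSION_START = 1;   // raised: the old layout is rejected at open
  static final int VERSION_CURRENT = 1;

  // Checked once when the file is opened, as CodecUtil.checkIndexHeader does above.
  static int checkHeader(int versionOnDisk) {
    if (versionOnDisk < VERSION_START || versionOnDisk > VERSION_CURRENT) {
      throw new IllegalStateException("unsupported format version: " + versionOnDisk);
    }
    return versionOnDisk;
  }

  public static void main(String[] args) {
    int version = checkHeader(1);
    // From here on the read path can assume the current layout and needs
    // no per-record "if (version >= ...)" branches.
    System.out.println("opened with version " + version);
  }
}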

[10/50] [abbrv] lucene-solr:jira/solr-9858: SOLR-10195: Harden AbstractSolrMorphlineZkTestBase based tests.

Posted by ab...@apache.org.
SOLR-10195: Harden AbstractSolrMorphlineZkTestBase based tests.


Project: http://git-wip-us.apache.org/repos/asf/lucene-solr/repo
Commit: http://git-wip-us.apache.org/repos/asf/lucene-solr/commit/c53b7c33
Tree: http://git-wip-us.apache.org/repos/asf/lucene-solr/tree/c53b7c33
Diff: http://git-wip-us.apache.org/repos/asf/lucene-solr/diff/c53b7c33

Branch: refs/heads/jira/solr-9858
Commit: c53b7c33b03aad3880b57a85d4402a31f3e0ea36
Parents: 1e206d8
Author: markrmiller <ma...@apache.org>
Authored: Wed Feb 22 19:50:19 2017 -0500
Committer: markrmiller <ma...@apache.org>
Committed: Wed Feb 22 19:50:19 2017 -0500

----------------------------------------------------------------------
 .../solr/AbstractSolrMorphlineZkTestBase.java           | 12 ++++++++++++
 1 file changed, 12 insertions(+)
----------------------------------------------------------------------


http://git-wip-us.apache.org/repos/asf/lucene-solr/blob/c53b7c33/solr/contrib/morphlines-core/src/test/org/apache/solr/morphlines/solr/AbstractSolrMorphlineZkTestBase.java
----------------------------------------------------------------------
diff --git a/solr/contrib/morphlines-core/src/test/org/apache/solr/morphlines/solr/AbstractSolrMorphlineZkTestBase.java b/solr/contrib/morphlines-core/src/test/org/apache/solr/morphlines/solr/AbstractSolrMorphlineZkTestBase.java
index 535fe9d..9aa27c4 100644
--- a/solr/contrib/morphlines-core/src/test/org/apache/solr/morphlines/solr/AbstractSolrMorphlineZkTestBase.java
+++ b/solr/contrib/morphlines-core/src/test/org/apache/solr/morphlines/solr/AbstractSolrMorphlineZkTestBase.java
@@ -30,6 +30,7 @@ import org.apache.solr.client.solrj.request.CollectionAdminRequest;
 import org.apache.solr.cloud.AbstractDistribZkTestBase;
 import org.apache.solr.cloud.SolrCloudTestCase;
 import org.apache.solr.common.SolrDocument;
+import org.junit.AfterClass;
 import org.junit.Before;
 import org.junit.BeforeClass;
 import org.kitesdk.morphline.api.Collector;
@@ -49,6 +50,10 @@ public abstract class AbstractSolrMorphlineZkTestBase extends SolrCloudTestCase
 
   @BeforeClass
   public static void setupCluster() throws Exception {
+    // set some system properties for use by tests
+    System.setProperty("solr.test.sys.prop1", "propone");
+    System.setProperty("solr.test.sys.prop2", "proptwo");
+    
     configureCluster(2)
         .addConfig("conf", SOLR_CONF_DIR.toPath())
         .configure();
@@ -58,6 +63,12 @@ public abstract class AbstractSolrMorphlineZkTestBase extends SolrCloudTestCase
     AbstractDistribZkTestBase.waitForRecoveriesToFinish(COLLECTION, cluster.getSolrClient().getZkStateReader(),
         false, true, TIMEOUT);
   }
+  
+  @AfterClass
+  public static void afterClass() {
+    System.clearProperty("solr.test.sys.prop1");
+    System.clearProperty("solr.test.sys.prop2");
+  }
 
   protected static final String RESOURCES_DIR = getFile("morphlines-core.marker").getParent();
   private static final File SOLR_CONF_DIR = new File(RESOURCES_DIR + "/solr/collection1/conf");
@@ -79,6 +90,7 @@ public abstract class AbstractSolrMorphlineZkTestBase extends SolrCloudTestCase
   @Before
   public void setup() throws Exception {
     collector = new Collector();
+    cluster.waitForAllNodes(DEFAULT_TIMEOUT);
   }
 
   protected void commit() throws Exception {

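Two hardening patterns are at work above. The first: any system property a @BeforeClass sets must be cleared symmetrically in @AfterClass, because JVM-wide properties otherwise leak into unrelated suites running in the same test JVM. The second, cluster.waitForAllNodes(...) in @Before, simply refuses to start a test until every cluster node is live. A minimal sketch of the first pattern, assuming JUnit 4 and a hypothetical property name:

import org.junit.AfterClass;
import org.junit.Assert;
import org.junit.BeforeClass;
import org.junit.Test;

public class SysPropHygieneTest {
  @BeforeClass
  public static void setProps() {
    System.setProperty("example.sys.prop", "value"); // hypothetical property
  }

  @Test
  public void readsProp() {
    Assert.assertEquals("value", System.getProperty("example.sys.prop"));
  }

  @AfterClass
  public static void clearProps() {
    System.clearProperty("example.sys.prop"); // mirror every set with a clear
  }
}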

[24/50] [abbrv] lucene-solr:jira/solr-9858: Revert "SOLR-9640: Support PKI authentication and SSL in standalone-mode master/slave auth with local security.json"

Posted by ab...@apache.org.
Revert "SOLR-9640: Support PKI authentication and SSL in standalone-mode master/slave auth with local security.json"

This reverts commit 95d6fc2512d6525b2354165553f0d6cc4d0d6310.


Project: http://git-wip-us.apache.org/repos/asf/lucene-solr/repo
Commit: http://git-wip-us.apache.org/repos/asf/lucene-solr/commit/30125f99
Tree: http://git-wip-us.apache.org/repos/asf/lucene-solr/tree/30125f99
Diff: http://git-wip-us.apache.org/repos/asf/lucene-solr/diff/30125f99

Branch: refs/heads/jira/solr-9858
Commit: 30125f99daf38c4788a9763a89fddb3730c709af
Parents: 57a42e4
Author: Jan Høydahl <ja...@apache.org>
Authored: Sat Feb 25 00:43:42 2017 +0100
Committer: Jan Høydahl <ja...@apache.org>
Committed: Sat Feb 25 00:43:42 2017 +0100

----------------------------------------------------------------------
 solr/CHANGES.txt                                |   2 -
 .../org/apache/solr/core/CoreContainer.java     |   9 +-
 .../solr/security/PKIAuthenticationPlugin.java  |  42 +-----
 .../org/apache/solr/servlet/HttpSolrCall.java   |   4 +-
 .../apache/solr/servlet/SolrDispatchFilter.java |  11 +-
 .../solr/security/BasicAuthDistributedTest.java | 136 -------------------
 .../security/TestPKIAuthenticationPlugin.java   |  38 +-----
 .../solr/BaseDistributedSearchTestCase.java     |  37 +----
 8 files changed, 19 insertions(+), 260 deletions(-)
----------------------------------------------------------------------
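
For readers following the revert below: the removed getBaseUrlForNodeNameLocal derived a base URL from Solr's node-name convention, in which a name like 127.0.0.1:8983_solr encodes host:port before the underscore and a URL-encoded context path after it. A minimal standalone sketch of that parsing (hypothetical class name, same convention as the reverted code):

import java.io.UnsupportedEncodingException;
import java.net.URLDecoder;

public class NodeNameSketch {
  // Turns "host:port_context" into "scheme://host:port/context".
  static String baseUrl(String scheme, String nodeName) {
    int sep = nodeName.indexOf('_');
    if (sep < 0) {
      throw new IllegalArgumentException("nodeName has no '_' separator: " + nodeName);
    }
    String hostAndPort = nodeName.substring(0, sep);
    try {
      String path = URLDecoder.decode(nodeName.substring(sep + 1), "UTF-8");
      return scheme + "://" + hostAndPort + (path.isEmpty() ? "" : "/" + path);
    } catch (UnsupportedEncodingException e) {
      throw new IllegalStateException("JVM does not support UTF-8", e); // cannot happen
    }
  }

  public static void main(String[] args) {
    System.out.println(baseUrl("http", "127.0.0.1:8983_solr")); // http://127.0.0.1:8983/solr
  }
}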


http://git-wip-us.apache.org/repos/asf/lucene-solr/blob/30125f99/solr/CHANGES.txt
----------------------------------------------------------------------
diff --git a/solr/CHANGES.txt b/solr/CHANGES.txt
index 2c5f0db..0302615 100644
--- a/solr/CHANGES.txt
+++ b/solr/CHANGES.txt
@@ -134,8 +134,6 @@ New Features
   field must both be stored=false, indexed=false, docValues=true. (Ishan Chattopadhyaya, hossman, noble,
   shalin, yonik)
 
-* SOLR-9640: Support PKI authentication and SSL in standalone-mode master/slave auth with local security.json (janhoy)
-
 Bug Fixes
 ----------------------
 

http://git-wip-us.apache.org/repos/asf/lucene-solr/blob/30125f99/solr/core/src/java/org/apache/solr/core/CoreContainer.java
----------------------------------------------------------------------
diff --git a/solr/core/src/java/org/apache/solr/core/CoreContainer.java b/solr/core/src/java/org/apache/solr/core/CoreContainer.java
index 6115562..e3977d7 100644
--- a/solr/core/src/java/org/apache/solr/core/CoreContainer.java
+++ b/solr/core/src/java/org/apache/solr/core/CoreContainer.java
@@ -497,9 +497,7 @@ public class CoreContainer {
     hostName = cfg.getNodeName();
 
     zkSys.initZooKeeper(this, solrHome, cfg.getCloudConfig());
-    pkiAuthenticationPlugin = isZooKeeperAware() ?
-        new PKIAuthenticationPlugin(this, zkSys.getZkController().getNodeName()) :
-        new PKIAuthenticationPlugin(this, getNodeNameLocal());
+    if(isZooKeeperAware())  pkiAuthenticationPlugin = new PKIAuthenticationPlugin(this, zkSys.getZkController().getNodeName());
 
     MDCLoggingContext.setNode(this);
 
@@ -620,11 +618,6 @@ public class CoreContainer {
     }
   }
 
-  // Builds a node name to be used with PKIAuth.
-  private String getNodeNameLocal() {
-    return getConfig().getCloudConfig().getHost()+":"+getConfig().getCloudConfig().getSolrHostPort()+"_solr";
-  }
-
   public void securityNodeChanged() {
     log.info("Security node changed, reloading security.json");
     reloadSecurityProperties();

http://git-wip-us.apache.org/repos/asf/lucene-solr/blob/30125f99/solr/core/src/java/org/apache/solr/security/PKIAuthenticationPlugin.java
----------------------------------------------------------------------
diff --git a/solr/core/src/java/org/apache/solr/security/PKIAuthenticationPlugin.java b/solr/core/src/java/org/apache/solr/security/PKIAuthenticationPlugin.java
index d185bc9..fdd4408 100644
--- a/solr/core/src/java/org/apache/solr/security/PKIAuthenticationPlugin.java
+++ b/solr/core/src/java/org/apache/solr/security/PKIAuthenticationPlugin.java
@@ -22,9 +22,7 @@ import javax.servlet.ServletResponse;
 import javax.servlet.http.HttpServletRequest;
 import javax.servlet.http.HttpServletRequestWrapper;
 import java.io.IOException;
-import java.io.UnsupportedEncodingException;
 import java.lang.invoke.MethodHandles;
-import java.net.URLDecoder;
 import java.nio.ByteBuffer;
 import java.security.Principal;
 import java.security.PublicKey;
@@ -195,14 +193,9 @@ public class PKIAuthenticationPlugin extends AuthenticationPlugin implements Htt
   }
 
   PublicKey getRemotePublicKey(String nodename) {
-    String url, uri = null;
-    if (cores.isZooKeeperAware()) {
-      url = cores.getZkController().getZkStateReader().getBaseUrlForNodeName(nodename);
-    } else {
-      url = getBaseUrlForNodeNameLocal(nodename);
-    }
+    String url = cores.getZkController().getZkStateReader().getBaseUrlForNodeName(nodename);
     try {
-      uri += PATH + "?wt=json&omitHeader=true";
+      String uri = url + PATH + "?wt=json&omitHeader=true";
       log.debug("Fetching fresh public key from : {}",uri);
       HttpResponse rsp = cores.getUpdateShardHandler().getHttpClient()
           .execute(new HttpGet(uri), HttpClientUtil.createNewHttpClientRequestContext());
@@ -219,41 +212,12 @@ public class PKIAuthenticationPlugin extends AuthenticationPlugin implements Htt
       keyCache.put(nodename, pubKey);
       return pubKey;
     } catch (Exception e) {
-      log.error("Exception trying to get public key from : " + uri, e);
+      log.error("Exception trying to get public key from : " + url, e);
       return null;
     }
 
   }
 
-  protected String getBaseUrlForNodeNameLocal(String nodeName) {
-    final int _offset = nodeName.indexOf("_");
-    if (_offset < 0) {
-      throw new IllegalArgumentException("nodeName does not contain expected '_' seperator: " + nodeName);
-    }
-    final String hostAndPort = nodeName.substring(0,_offset);
-    try {
-      final String path = URLDecoder.decode(nodeName.substring(1+_offset), "UTF-8");
-      // TODO: Find a better way of resolving urlScheme when not using ZK?
-      String urlScheme = resolveUrlScheme();
-      return urlScheme + "://" + hostAndPort + (path.isEmpty() ? "" : ("/" + path));
-    } catch (UnsupportedEncodingException e) {
-      throw new IllegalStateException("JVM Does not seem to support UTF-8", e);
-    }
-  }
-
-  /**
-   * Resolve urlScheme first from sysProp "urlScheme", if not set or invalid value, peek at ssl sysProps
-   * @return "https" if SSL is enabled, else "http"
-   */
-  protected static String resolveUrlScheme() {
-    String urlScheme = System.getProperty("urlScheme");
-    if (urlScheme != null && urlScheme.matches("https?")) {
-      return urlScheme;
-    } else {
-      return System.getProperty("solr.jetty.keystore") == null ? "http" : "https";
-    }
-  }
-
   @Override
   public SolrHttpClientBuilder getHttpClientBuilder(SolrHttpClientBuilder builder) {
     HttpClientUtil.addRequestInterceptor(interceptor);

http://git-wip-us.apache.org/repos/asf/lucene-solr/blob/30125f99/solr/core/src/java/org/apache/solr/servlet/HttpSolrCall.java
----------------------------------------------------------------------
diff --git a/solr/core/src/java/org/apache/solr/servlet/HttpSolrCall.java b/solr/core/src/java/org/apache/solr/servlet/HttpSolrCall.java
index 0dfb0ea..4f6bae0 100644
--- a/solr/core/src/java/org/apache/solr/servlet/HttpSolrCall.java
+++ b/solr/core/src/java/org/apache/solr/servlet/HttpSolrCall.java
@@ -560,7 +560,7 @@ public class HttpSolrCall {
   }
 
   private boolean shouldAuthorize() {
-    if(path != null && path.endsWith(PKIAuthenticationPlugin.PATH)) return false;
+    if(PKIAuthenticationPlugin.PATH.equals(path)) return false;
     //admin/info/key is the path where public key is exposed . it is always unsecured
     if (cores.getPkiAuthenticationPlugin() != null && req.getUserPrincipal() != null) {
       boolean b = cores.getPkiAuthenticationPlugin().needsAuthorization(req);
@@ -1081,7 +1081,7 @@ public class HttpSolrCall {
           response.delete(response.length() - 1, response.length());
         
         response.append("], Path: [").append(resource).append("]");
-        response.append(" path : ").append(path).append(" params :").append(solrReq == null ? null : solrReq.getParams());
+        response.append(" path : ").append(path).append(" params :").append(solrReq.getParams());
         return response.toString();
       }
 

http://git-wip-us.apache.org/repos/asf/lucene-solr/blob/30125f99/solr/core/src/java/org/apache/solr/servlet/SolrDispatchFilter.java
----------------------------------------------------------------------
diff --git a/solr/core/src/java/org/apache/solr/servlet/SolrDispatchFilter.java b/solr/core/src/java/org/apache/solr/servlet/SolrDispatchFilter.java
index 4ce57b0..ce65069 100644
--- a/solr/core/src/java/org/apache/solr/servlet/SolrDispatchFilter.java
+++ b/solr/core/src/java/org/apache/solr/servlet/SolrDispatchFilter.java
@@ -402,11 +402,11 @@ public class SolrDispatchFilter extends BaseSolrFilter {
     if (authenticationPlugin == null) {
       return true;
     } else {
-      String requestUri = ((HttpServletRequest) request).getRequestURI();
-      if (requestUri != null && requestUri.endsWith(PKIAuthenticationPlugin.PATH)) {
-        log.debug("Passthrough of pki URL " + requestUri);
-        return true;
-      }
+      // /admin/info/key must be always open. see SOLR-9188
+      // tests work only w/ getPathInfo
+      //otherwise it's just enough to have getServletPath()
+      if (PKIAuthenticationPlugin.PATH.equals(((HttpServletRequest) request).getServletPath()) ||
+          PKIAuthenticationPlugin.PATH.equals(((HttpServletRequest) request).getPathInfo())) return true;
       String header = ((HttpServletRequest) request).getHeader(PKIAuthenticationPlugin.HEADER);
       if (header != null && cores.getPkiAuthenticationPlugin() != null)
         authenticationPlugin = cores.getPkiAuthenticationPlugin();
@@ -418,6 +418,7 @@ public class SolrDispatchFilter extends BaseSolrFilter {
           wrappedRequest.set(req);
         });
       } catch (Exception e) {
+        log.info("Error authenticating", e);
         throw new SolrException(ErrorCode.SERVER_ERROR, "Error during request authentication, ", e);
       }
     }

http://git-wip-us.apache.org/repos/asf/lucene-solr/blob/30125f99/solr/core/src/test/org/apache/solr/security/BasicAuthDistributedTest.java
----------------------------------------------------------------------
diff --git a/solr/core/src/test/org/apache/solr/security/BasicAuthDistributedTest.java b/solr/core/src/test/org/apache/solr/security/BasicAuthDistributedTest.java
deleted file mode 100644
index e35e369..0000000
--- a/solr/core/src/test/org/apache/solr/security/BasicAuthDistributedTest.java
+++ /dev/null
@@ -1,136 +0,0 @@
-/*
- * Licensed to the Apache Software Foundation (ASF) under one or more
- * contributor license agreements.  See the NOTICE file distributed with
- * this work for additional information regarding copyright ownership.
- * The ASF licenses this file to You under the Apache License, Version 2.0
- * (the "License"); you may not use this file except in compliance with
- * the License.  You may obtain a copy of the License at
- *
- *     http://www.apache.org/licenses/LICENSE-2.0
- *
- * Unless required by applicable law or agreed to in writing, software
- * distributed under the License is distributed on an "AS IS" BASIS,
- * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
- * See the License for the specific language governing permissions and
- * limitations under the License.
- */
-package org.apache.solr.security;
-
-import java.io.IOException;
-import java.nio.file.Files;
-import java.nio.file.Paths;
-
-import org.apache.lucene.util.LuceneTestCase.Slow;
-import org.apache.solr.BaseDistributedSearchTestCase;
-import org.apache.solr.client.solrj.embedded.JettySolrRunner;
-import org.apache.solr.client.solrj.impl.HttpSolrClient;
-import org.apache.solr.client.solrj.request.QueryRequest;
-import org.apache.solr.client.solrj.response.QueryResponse;
-import org.apache.solr.common.params.ModifiableSolrParams;
-import org.apache.solr.common.util.Utils;
-import org.apache.solr.core.CoreContainer;
-import org.apache.solr.handler.admin.SecurityConfHandler;
-import org.apache.solr.handler.admin.SecurityConfHandlerLocalForTesting;
-import org.apache.solr.util.LogLevel;
-import org.junit.Test;
-
-/**
- * Tests basicAuth in a multi shard env
- */
-@Slow
-public class BasicAuthDistributedTest extends BaseDistributedSearchTestCase {
-  public BasicAuthDistributedTest() {
-    super();
-    schemaString = "schema.xml";
-  }
-
-  private SecurityConfHandlerLocalForTesting securityConfHandler;
-
-  @Test
-  @LogLevel("org.apache.solr=DEBUG")
-  public void test() throws Exception {
-    index();
-    testAuth();
-  }
-
-  private void index() throws Exception {
-    del("*:*");
-    indexr(id, "1", "text", "doc one");
-    indexr(id, "2", "text", "doc two");
-    indexr(id, "3", "text", "doc three");
-    indexr(id, "4", "text", "doc four");
-    indexr(id, "5", "text", "doc five");
-
-    commit();  // try to ensure there's more than one segment
-
-    indexr(id, "6", "text", "doc six");
-    indexr(id, "7", "text", "doc seven");
-    indexr(id, "8", "text", "doc eight");
-    indexr(id, "9", "text", "doc nine");
-    indexr(id, "10", "text", "doc ten");
-
-    commit();
-
-    handle.clear();
-    handle.put("QTime", SKIPVAL);
-    handle.put("timestamp", SKIPVAL);
-    handle.put("maxScore", SKIPVAL);
-    handle.put("_version_", SKIPVAL);
-  }
-
-  private void testAuth() throws Exception {
-    QueryResponse rsp = query("q","text:doc", "fl", "id,text", "sort", "id asc");
-    assertEquals(10, rsp.getResults().getNumFound());
-
-    // Enable authentication
-    for (JettySolrRunner j : jettys) {
-      writeSecurityJson(j.getCoreContainer());
-    }
-
-    HttpSolrClient.RemoteSolrException expected = expectThrows(HttpSolrClient.RemoteSolrException.class, () -> {
-      query("q","text:doc-fail", "fl", "id,text", "sort", "id asc");
-    });
-    assertEquals(401, expected.code());
-
-    // Add auth
-    ModifiableSolrParams params = new ModifiableSolrParams();
-    params.add("q", "text:doc").add("fl", "id,text").add("sort", "id asc");
-    QueryRequest req = new QueryRequest(params);
-    req.setBasicAuthCredentials("solr", "SolrRocks");
-    rsp = req.process(clients.get(0), null);
-    if (jettys.size() > 1) {
-      assertTrue(rsp.getResults().getNumFound() < 10);
-      rsp = query(true, params, "solr", "SolrRocks");
-    }
-    assertEquals(10, rsp.getResults().getNumFound());
-
-    // Disable auth
-    for (JettySolrRunner j : jettys) {
-      deleteSecurityJson(j.getCoreContainer());
-    }
-
-  }
-
-  private void deleteSecurityJson(CoreContainer coreContainer) throws IOException {
-    securityConfHandler = new SecurityConfHandlerLocalForTesting(coreContainer);
-    Files.delete(Paths.get(coreContainer.getSolrHome()).resolve("security.json"));
-    coreContainer.securityNodeChanged();
-  }
-
-  private void writeSecurityJson(CoreContainer coreContainer) throws IOException {
-    securityConfHandler = new SecurityConfHandlerLocalForTesting(coreContainer);
-    securityConfHandler.persistConf(new SecurityConfHandler.SecurityConfig()
-        .setData(Utils.fromJSONString(ALL_CONF.replaceAll("'", "\""))));
-    coreContainer.securityNodeChanged();
-  }
-
-  protected static final String ALL_CONF = "{\n" +
-      "  'authentication':{\n" +
-      "    'blockUnknown':true,\n" +
-      "    'class':'solr.BasicAuthPlugin',\n" +
-      "    'credentials':{'solr':'orwp2Ghgj39lmnrZOTm7Qtre1VqHFDfwAEzr0ApbN3Y= Ju5osoAqOX8iafhWpPP01E5P+sg8tK8tHON7rCYZRRw='}},\n" +
-      "  'authorization':{\n" +
-      "    'class':'solr.RuleBasedAuthorizationPlugin',\n" +
-      "    'user-role':{'solr':'admin'},\n" +
-      "    'permissions':[{'name':'all','role':'admin'}]}}";
-}
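
The deleted test above exercised SolrJ's per-request basic-auth support, which is independent of this test framework and remains available. A minimal sketch of that usage, assuming a locally running Solr core protected by BasicAuthPlugin (the URL, core name and credentials below are placeholders, not values taken from the commit):

    import org.apache.solr.client.solrj.impl.HttpSolrClient;
    import org.apache.solr.client.solrj.request.QueryRequest;
    import org.apache.solr.client.solrj.response.QueryResponse;
    import org.apache.solr.common.params.ModifiableSolrParams;

    public class BasicAuthQuerySketch {
      public static void main(String[] args) throws Exception {
        // Placeholder URL; point this at a core secured with basic auth.
        try (HttpSolrClient client = new HttpSolrClient.Builder("http://localhost:8983/solr/collection1").build()) {
          ModifiableSolrParams params = new ModifiableSolrParams();
          params.set("q", "text:doc");
          QueryRequest req = new QueryRequest(params);
          // Same SolrJ API the deleted test used to authenticate each request.
          req.setBasicAuthCredentials("solr", "SolrRocks");
          QueryResponse rsp = req.process(client);
          System.out.println("numFound=" + rsp.getResults().getNumFound());
        }
      }
    }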

http://git-wip-us.apache.org/repos/asf/lucene-solr/blob/30125f99/solr/core/src/test/org/apache/solr/security/TestPKIAuthenticationPlugin.java
----------------------------------------------------------------------
diff --git a/solr/core/src/test/org/apache/solr/security/TestPKIAuthenticationPlugin.java b/solr/core/src/test/org/apache/solr/security/TestPKIAuthenticationPlugin.java
index 90c5bd2..a5a279f 100644
--- a/solr/core/src/test/org/apache/solr/security/TestPKIAuthenticationPlugin.java
+++ b/solr/core/src/test/org/apache/solr/security/TestPKIAuthenticationPlugin.java
@@ -35,11 +35,7 @@ import org.apache.solr.request.LocalSolrQueryRequest;
 import org.apache.solr.request.SolrRequestInfo;
 import org.apache.solr.response.SolrQueryResponse;
 import org.apache.solr.util.CryptoKeys;
-import org.junit.Test;
-
-import static org.mockito.Mockito.any;
-import static org.mockito.Mockito.mock;
-import static org.mockito.Mockito.when;
+import static org.mockito.Mockito.*;
 
 public class TestPKIAuthenticationPlugin extends SolrTestCaseJ4 {
 
@@ -145,38 +141,10 @@ public class TestPKIAuthenticationPlugin extends SolrTestCaseJ4 {
     mock1.doAuthenticate(mockReq, null,filterChain );
     assertNotNull(wrappedRequestByFilter.get());
     assertEquals("$", ((HttpServletRequest) wrappedRequestByFilter.get()).getUserPrincipal().getName());
-  }
 
-  @Test
-  public void testGetBaseUrlForNodeNameLocal() {
-    synchronized (this) {
-      final MockPKIAuthenticationPlugin mock = new MockPKIAuthenticationPlugin(null, "myName");
-      System.clearProperty("solr.jetty.keystore");
-      assertEquals("http://my.host:9876/solr2", mock.getBaseUrlForNodeNameLocal("my.host:9876_solr2"));
-      System.setProperty("solr.jetty.keystore", "foo");
-      assertEquals("https://my.host:9876/solr2", mock.getBaseUrlForNodeNameLocal("my.host:9876_solr2"));
-      System.clearProperty("solr.jetty.keystore");
-    }
-  }
 
-  @Test
-  public void testResolveUrlScheme() {
-    synchronized (this) {
-      System.clearProperty("urlScheme");
-      System.clearProperty("solr.jetty.keystore");
-      assertEquals("http", MockPKIAuthenticationPlugin.resolveUrlScheme());
-      System.setProperty("urlScheme", "http");
-      assertEquals("http", MockPKIAuthenticationPlugin.resolveUrlScheme());
-      System.setProperty("urlScheme", "https");
-      assertEquals("https", MockPKIAuthenticationPlugin.resolveUrlScheme());
-      System.setProperty("urlScheme", "ftp");
-      System.clearProperty("solr.jetty.keystore");
-      assertEquals("http", MockPKIAuthenticationPlugin.resolveUrlScheme());
-      System.setProperty("solr.jetty.keystore", "foo");
-      assertEquals("https", MockPKIAuthenticationPlugin.resolveUrlScheme());
-      System.clearProperty("urlScheme");
-      System.clearProperty("solr.jetty.keystore");
-    }
+
+
   }
 
   private HttpServletRequest createMockRequest(final AtomicReference<Header> header) {

http://git-wip-us.apache.org/repos/asf/lucene-solr/blob/30125f99/solr/test-framework/src/java/org/apache/solr/BaseDistributedSearchTestCase.java
----------------------------------------------------------------------
diff --git a/solr/test-framework/src/java/org/apache/solr/BaseDistributedSearchTestCase.java b/solr/test-framework/src/java/org/apache/solr/BaseDistributedSearchTestCase.java
index bbfc048..8c6eb60 100644
--- a/solr/test-framework/src/java/org/apache/solr/BaseDistributedSearchTestCase.java
+++ b/solr/test-framework/src/java/org/apache/solr/BaseDistributedSearchTestCase.java
@@ -50,7 +50,6 @@ import org.apache.solr.client.solrj.SolrServerException;
 import org.apache.solr.client.solrj.embedded.JettyConfig;
 import org.apache.solr.client.solrj.embedded.JettySolrRunner;
 import org.apache.solr.client.solrj.impl.HttpSolrClient;
-import org.apache.solr.client.solrj.request.QueryRequest;
 import org.apache.solr.client.solrj.request.UpdateRequest;
 import org.apache.solr.client.solrj.response.QueryResponse;
 import org.apache.solr.client.solrj.response.UpdateResponse;
@@ -559,12 +558,6 @@ public abstract class BaseDistributedSearchTestCase extends SolrTestCaseJ4 {
     return rsp;
   }
 
-  protected QueryResponse queryServer(QueryRequest req) throws IOException, SolrServerException {
-    int which = r.nextInt(clients.size());
-    SolrClient client = clients.get(which);
-    return req.process(client, null);
-  }
-
   /**
    * Sets distributed params.
    * Returns the QueryResponse from {@link #queryServer},
@@ -598,31 +591,18 @@ public abstract class BaseDistributedSearchTestCase extends SolrTestCaseJ4 {
    * Returns the QueryResponse from {@link #queryServer}  
    */
   protected QueryResponse query(boolean setDistribParams, SolrParams p) throws Exception {
-    return query(setDistribParams, p, null, null);
-  }
-
-  /**
-   * Returns the QueryResponse from {@link #queryServer}
-   * @param setDistribParams whether to do a distributed request
-   * @param user basic auth username (set to null if not in use)
-   * @param pass basic auth password (set to null if not in use)
-   * @return the query response
-   */
-  protected QueryResponse query(boolean setDistribParams, SolrParams p, String user, String pass) throws Exception {
     
     final ModifiableSolrParams params = new ModifiableSolrParams(p);
 
     // TODO: look into why passing true causes fails
     params.set("distrib", "false");
-    QueryRequest req = generateQueryRequest(params, user, pass);
-    final QueryResponse controlRsp = req.process(controlClient, null);
+    final QueryResponse controlRsp = controlClient.query(params);
     validateControlData(controlRsp);
 
     params.remove("distrib");
     if (setDistribParams) setDistributedParams(params);
-    req = generateQueryRequest(params, user, pass);
 
-    QueryResponse rsp = queryServer(req);
+    QueryResponse rsp = queryServer(params);
 
     compareResponses(rsp, controlRsp);
 
@@ -637,8 +617,7 @@ public abstract class BaseDistributedSearchTestCase extends SolrTestCaseJ4 {
               int which = r.nextInt(clients.size());
               SolrClient client = clients.get(which);
               try {
-                QueryRequest qreq = generateQueryRequest(new ModifiableSolrParams(params), user, pass);
-                QueryResponse rsp = qreq.process(client, null);
+                QueryResponse rsp = client.query(new ModifiableSolrParams(params));
                 if (verifyStress) {
                   compareResponses(rsp, controlRsp);
                 }
@@ -657,15 +636,7 @@ public abstract class BaseDistributedSearchTestCase extends SolrTestCaseJ4 {
     }
     return rsp;
   }
-
-  private QueryRequest generateQueryRequest(ModifiableSolrParams params, String user, String pass) {
-    QueryRequest req = new QueryRequest(params);
-    if (user != null && pass != null) {
-      req.setBasicAuthCredentials(user, pass);
-    }
-    return req;
-  }
-
+  
   public QueryResponse queryAndCompare(SolrParams params, SolrClient... clients) throws SolrServerException, IOException {
     return queryAndCompare(params, Arrays.<SolrClient>asList(clients));
   }


[27/50] [abbrv] lucene-solr:jira/solr-9858: Fix Java 9 b158+ problem (no compatibility layer for non-expanded paths anymore)

Posted by ab...@apache.org.
Fix Java 9 b158+ problem (no compatibility layer for non-expanded paths anymore)


Project: http://git-wip-us.apache.org/repos/asf/lucene-solr/repo
Commit: http://git-wip-us.apache.org/repos/asf/lucene-solr/commit/6f3f6a2d
Tree: http://git-wip-us.apache.org/repos/asf/lucene-solr/tree/6f3f6a2d
Diff: http://git-wip-us.apache.org/repos/asf/lucene-solr/diff/6f3f6a2d

Branch: refs/heads/jira/solr-9858
Commit: 6f3f6a2d66d107e94d723a1f931da0b7bdb06928
Parents: 99e8ef2
Author: Uwe Schindler <us...@apache.org>
Authored: Sat Feb 25 20:59:26 2017 +0100
Committer: Uwe Schindler <us...@apache.org>
Committed: Sat Feb 25 20:59:26 2017 +0100

----------------------------------------------------------------------
 .../core/src/test/org/apache/lucene/index/TestReadOnlyIndex.java   | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)
----------------------------------------------------------------------


http://git-wip-us.apache.org/repos/asf/lucene-solr/blob/6f3f6a2d/lucene/core/src/test/org/apache/lucene/index/TestReadOnlyIndex.java
----------------------------------------------------------------------
diff --git a/lucene/core/src/test/org/apache/lucene/index/TestReadOnlyIndex.java b/lucene/core/src/test/org/apache/lucene/index/TestReadOnlyIndex.java
index 11e583e..9198002 100644
--- a/lucene/core/src/test/org/apache/lucene/index/TestReadOnlyIndex.java
+++ b/lucene/core/src/test/org/apache/lucene/index/TestReadOnlyIndex.java
@@ -46,7 +46,7 @@ public class TestReadOnlyIndex extends LuceneTestCase {
 
   @BeforeClass
   public static void buildIndex() throws Exception {
-    indexPath = Files.createTempDirectory("readonlyindex");
+    indexPath = Files.createTempDirectory("readonlyindex").toAbsolutePath();
     
     // borrows from TestDemo, but not important to keep in sync with demo
     Analyzer analyzer = new MockAnalyzer(random());
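
The one-line fix works because Java 9 b158+ dropped the JDK's internal compatibility layer that used to expand non-absolute paths lazily, so a temp directory created under a relative java.io.tmpdir (as the Lucene test runner can set it) must be resolved eagerly. A minimal sketch of the pattern outside the test harness:

    import java.nio.file.Files;
    import java.nio.file.Path;

    public class AbsoluteTempDirSketch {
      public static void main(String[] args) throws Exception {
        // If java.io.tmpdir happens to be relative, the returned path is relative too;
        // resolving it once up front avoids relying on the removed JDK behavior.
        Path indexPath = Files.createTempDirectory("readonlyindex").toAbsolutePath();
        System.out.println(indexPath);
      }
    }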


[21/50] [abbrv] lucene-solr:jira/solr-9858: LUCENE-7707: add explicit boolean to TopDocs.merge to govern whether incoming or implicit shard index should be used

Posted by ab...@apache.org.
LUCENE-7707: add explicit boolean to TopDocs.merge to govern whether incoming or implicit shard index should be used


Project: http://git-wip-us.apache.org/repos/asf/lucene-solr/repo
Commit: http://git-wip-us.apache.org/repos/asf/lucene-solr/commit/2e56c0e5
Tree: http://git-wip-us.apache.org/repos/asf/lucene-solr/tree/2e56c0e5
Diff: http://git-wip-us.apache.org/repos/asf/lucene-solr/diff/2e56c0e5

Branch: refs/heads/jira/solr-9858
Commit: 2e56c0e50564c8feeeb2831dd36cff1e9b23a00f
Parents: 471d842
Author: Mike McCandless <mi...@apache.org>
Authored: Fri Feb 24 17:00:45 2017 -0500
Committer: Mike McCandless <mi...@apache.org>
Committed: Fri Feb 24 17:01:19 2017 -0500

----------------------------------------------------------------------
 lucene/CHANGES.txt                              | 10 +--
 .../java/org/apache/lucene/search/FieldDoc.java |  6 +-
 .../org/apache/lucene/search/IndexSearcher.java |  4 +-
 .../java/org/apache/lucene/search/TopDocs.java  | 68 +++++++++-----------
 .../apache/lucene/search/TestTopDocsMerge.java  | 38 ++++++-----
 5 files changed, 62 insertions(+), 64 deletions(-)
----------------------------------------------------------------------


http://git-wip-us.apache.org/repos/asf/lucene-solr/blob/2e56c0e5/lucene/CHANGES.txt
----------------------------------------------------------------------
diff --git a/lucene/CHANGES.txt b/lucene/CHANGES.txt
index 741418a..5d3a077 100644
--- a/lucene/CHANGES.txt
+++ b/lucene/CHANGES.txt
@@ -97,6 +97,12 @@ API Changes
 
 * LUCENE-7702: Removed GraphQuery in favour of simple boolean query. (Matt Webber via Jim Ferenczi)
 
+* LUCENE-7707: TopDocs.merge now takes a boolean option telling it
+  when to use the incoming shard index versus when to assign the shard
+  index itself, allowing users to merge shard responses incrementally
+  instead of once all shard responses are present. (Simon Willnauer,
+  Mike McCandless)
+
 New Features
 
 * LUCENE-7449: Add CROSSES relation support to RangeFieldQuery. (Nick Knize)
@@ -172,10 +178,6 @@ Improvements
   earlier than regular queries in order to improve cache efficiency.
   (Adrien Grand)
 
-* LUCENE-7707: Use predefined shard index when mergeing top docs if present. This 
-  allows to use TopDoc#merge to merge shard responses incrementally instead of
-  once all shard responses are present. (Simon Willnauer)
-
 Optimizations
 
 * LUCENE-7641: Optimized point range queries to compute documents that do not

http://git-wip-us.apache.org/repos/asf/lucene-solr/blob/2e56c0e5/lucene/core/src/java/org/apache/lucene/search/FieldDoc.java
----------------------------------------------------------------------
diff --git a/lucene/core/src/java/org/apache/lucene/search/FieldDoc.java b/lucene/core/src/java/org/apache/lucene/search/FieldDoc.java
index 31f7175..6125404 100644
--- a/lucene/core/src/java/org/apache/lucene/search/FieldDoc.java
+++ b/lucene/core/src/java/org/apache/lucene/search/FieldDoc.java
@@ -52,18 +52,18 @@ public class FieldDoc extends ScoreDoc {
 
   /** Expert: Creates one of these objects with empty sort information. */
   public FieldDoc(int doc, float score) {
-    super (doc, score);
+    super(doc, score);
   }
 
   /** Expert: Creates one of these objects with the given sort information. */
   public FieldDoc(int doc, float score, Object[] fields) {
-    super (doc, score);
+    super(doc, score);
     this.fields = fields;
   }
   
   /** Expert: Creates one of these objects with the given sort information. */
   public FieldDoc(int doc, float score, Object[] fields, int shardIndex) {
-    super (doc, score, shardIndex);
+    super(doc, score, shardIndex);
     this.fields = fields;
   }
   

http://git-wip-us.apache.org/repos/asf/lucene-solr/blob/2e56c0e5/lucene/core/src/java/org/apache/lucene/search/IndexSearcher.java
----------------------------------------------------------------------
diff --git a/lucene/core/src/java/org/apache/lucene/search/IndexSearcher.java b/lucene/core/src/java/org/apache/lucene/search/IndexSearcher.java
index 5cae122..5ab16c2 100644
--- a/lucene/core/src/java/org/apache/lucene/search/IndexSearcher.java
+++ b/lucene/core/src/java/org/apache/lucene/search/IndexSearcher.java
@@ -432,7 +432,7 @@ public class IndexSearcher {
         for (TopScoreDocCollector collector : collectors) {
           topDocs[i++] = collector.topDocs();
         }
-        return TopDocs.merge(cappedNumHits, topDocs);
+        return TopDocs.merge(0, cappedNumHits, topDocs, true);
       }
 
     };
@@ -559,7 +559,7 @@ public class IndexSearcher {
         for (TopFieldCollector collector : collectors) {
           topDocs[i++] = collector.topDocs();
         }
-        return TopDocs.merge(sort, cappedNumHits, topDocs);
+        return TopDocs.merge(sort, 0, cappedNumHits, topDocs, true);
       }
 
     };

http://git-wip-us.apache.org/repos/asf/lucene-solr/blob/2e56c0e5/lucene/core/src/java/org/apache/lucene/search/TopDocs.java
----------------------------------------------------------------------
diff --git a/lucene/core/src/java/org/apache/lucene/search/TopDocs.java b/lucene/core/src/java/org/apache/lucene/search/TopDocs.java
index 2913cb2..3b66fca 100644
--- a/lucene/core/src/java/org/apache/lucene/search/TopDocs.java
+++ b/lucene/core/src/java/org/apache/lucene/search/TopDocs.java
@@ -16,7 +16,6 @@
  */
 package org.apache.lucene.search;
 
-
 import org.apache.lucene.util.PriorityQueue;
 
 /** Represents hits returned by {@link
@@ -60,6 +59,8 @@ public class TopDocs {
   private final static class ShardRef {
     // Which shard (index into shardHits[]):
     final int shardIndex;
+
+    // True if we should use the incoming ScoreDoc.shardIndex for sort order
     final boolean useScoreDocIndex;
 
     // Which hit within the shard:
@@ -77,10 +78,12 @@ public class TopDocs {
 
     int getShardIndex(ScoreDoc scoreDoc) {
       if (useScoreDocIndex) {
-        assert scoreDoc.shardIndex != -1 : "scoreDoc shardIndex must be predefined set but wasn't";
+        if (scoreDoc.shardIndex == -1) {
+          throw new IllegalArgumentException("setShardIndex is false but TopDocs[" + shardIndex + "].scoreDocs[" + hitIndex + "] is not set");
+        }
         return scoreDoc.shardIndex;
       } else {
-        assert scoreDoc.shardIndex == -1 : "scoreDoc shardIndex must be undefined but wasn't";
+        // NOTE: we don't assert that shardIndex is -1 here, because caller could in fact have set it but asked us to ignore it now
         return shardIndex;
       }
     }
@@ -201,23 +204,25 @@ public class TopDocs {
    *  the provided TopDocs, sorting by score. Each {@link TopDocs}
    *  instance must be sorted.
    *
-   *  @see #merge(int, int, TopDocs[])
+   *  @see #merge(int, int, TopDocs[], boolean)
    *  @lucene.experimental */
   public static TopDocs merge(int topN, TopDocs[] shardHits) {
-    return merge(0, topN, shardHits);
+    return merge(0, topN, shardHits, true);
   }
 
   /**
    * Same as {@link #merge(int, TopDocs[])} but also ignores the top
    * {@code start} top docs. This is typically useful for pagination.
    *
-   * Note: This method will fill the {@link ScoreDoc#shardIndex} on all score docs returned iff all ScoreDocs passed
-   * to this have it's shard index set to <tt>-1</tt>. Otherwise the shard index is not set. This allows to predefine
-   * the shard index in order to incrementally merge shard responses without losing the original shard index.
+   * Note: If {@code setShardIndex} is true, this method will assume the incoming order of {@code shardHits} reflects
+   * each shard's index and will fill the {@link ScoreDoc#shardIndex}, otherwise
+   * it must already be set for all incoming {@code ScoreDoc}s, which can be useful when doing multiple reductions
+   * (merges) of TopDocs.
+   *
    * @lucene.experimental
    */
-  public static TopDocs merge(int start, int topN, TopDocs[] shardHits) {
-    return mergeAux(null, start, topN, shardHits);
+  public static TopDocs merge(int start, int topN, TopDocs[] shardHits, boolean setShardIndex) {
+    return mergeAux(null, start, topN, shardHits, setShardIndex);
   }
 
   /** Returns a new TopFieldDocs, containing topN results across
@@ -226,31 +231,34 @@ public class TopDocs {
    *  the same Sort, and sort field values must have been
    *  filled (ie, <code>fillFields=true</code> must be
    *  passed to {@link TopFieldCollector#create}).
-   *  @see #merge(Sort, int, int, TopFieldDocs[])
+   *  @see #merge(Sort, int, int, TopFieldDocs[], boolean)
    * @lucene.experimental */
   public static TopFieldDocs merge(Sort sort, int topN, TopFieldDocs[] shardHits) {
-    return merge(sort, 0, topN, shardHits);
+    return merge(sort, 0, topN, shardHits, true);
   }
 
   /**
    * Same as {@link #merge(Sort, int, TopFieldDocs[])} but also ignores the top
    * {@code start} top docs. This is typically useful for pagination.
    *
-   * Note: This method will fill the {@link ScoreDoc#shardIndex} on all score docs returned iff all ScoreDocs passed
-   * to this have it's shard index set to <tt>-1</tt>. Otherwise the shard index is not set. This allows to predefine
-   * the shard index in order to incrementally merge shard responses without losing the original shard index.
+   * Note: If {@code setShardIndex} is true, this method will assume the incoming order of {@code shardHits} reflects
+   * each shard's index and will fill the {@link ScoreDoc#shardIndex}, otherwise
+   * it must already be set for all incoming {@code ScoreDoc}s, which can be useful when doing multiple reductions
+   * (merges) of TopDocs.
+   *
    * @lucene.experimental
    */
-  public static TopFieldDocs merge(Sort sort, int start, int topN, TopFieldDocs[] shardHits) {
+  public static TopFieldDocs merge(Sort sort, int start, int topN, TopFieldDocs[] shardHits, boolean setShardIndex) {
     if (sort == null) {
       throw new IllegalArgumentException("sort must be non-null when merging field-docs");
     }
-    return (TopFieldDocs) mergeAux(sort, start, topN, shardHits);
+    return (TopFieldDocs) mergeAux(sort, start, topN, shardHits, setShardIndex);
   }
 
   /** Auxiliary method used by the {@link #merge} impls. A sort value of null
    *  is used to indicate that docs should be sorted by score. */
-  private static TopDocs mergeAux(Sort sort, int start, int size, TopDocs[] shardHits) {
+  private static TopDocs mergeAux(Sort sort, int start, int size, TopDocs[] shardHits, boolean setShardIndex) {
+
     final PriorityQueue<ShardRef> queue;
     if (sort == null) {
       queue = new ScoreMergeSortQueue(shardHits);
@@ -261,28 +269,15 @@ public class TopDocs {
     int totalHitCount = 0;
     int availHitCount = 0;
     float maxScore = Float.MIN_VALUE;
-    Boolean setShardIndex = null;
     for(int shardIDX=0;shardIDX<shardHits.length;shardIDX++) {
       final TopDocs shard = shardHits[shardIDX];
       // totalHits can be non-zero even if no hits were
       // collected, when searchAfter was used:
       totalHitCount += shard.totalHits;
       if (shard.scoreDocs != null && shard.scoreDocs.length > 0) {
-        if (shard.scoreDocs[0].shardIndex == -1) {
-          if (setShardIndex != null && setShardIndex == false) {
-            throw new IllegalStateException("scoreDocs at index " + shardIDX + " has undefined shard indices but previous scoreDocs were predefined");
-          }
-          setShardIndex = true;
-        } else {
-          if (setShardIndex != null && setShardIndex) {
-            throw new IllegalStateException("scoreDocs at index " + shardIDX + " has predefined shard indices but previous scoreDocs were undefined");
-          }
-          setShardIndex = false;
-        }
         availHitCount += shard.scoreDocs.length;
         queue.add(new ShardRef(shardIDX, setShardIndex == false));
         maxScore = Math.max(maxScore, shard.getMaxScore());
-        //System.out.println("  maxScore now " + maxScore + " vs " + shard.getMaxScore());
       }
     }
 
@@ -303,19 +298,16 @@ public class TopDocs {
         ShardRef ref = queue.top();
         final ScoreDoc hit = shardHits[ref.shardIndex].scoreDocs[ref.hitIndex++];
         if (setShardIndex) {
-          // unless this index is already initialized potentially due to multiple merge phases, or explicitly by the user
-          // we set the shard index to the index of the TopDocs array this hit is coming from.
-          // this allows multiple merge phases if needed but requires extra accounting on the users end.
-          // at the same time this is fully backwards compatible since the value was initialize to -1 from the beginning
+          // caller asked us to record shardIndex (index of the TopDocs array) this hit is coming from:
           hit.shardIndex = ref.shardIndex;
+        } else if (hit.shardIndex == -1) {
+          throw new IllegalArgumentException("setShardIndex is false but TopDocs[" + ref.shardIndex + "].scoreDocs[" + (ref.hitIndex-1) + "] is not set");
         }
+          
         if (hitUpto >= start) {
           hits[hitUpto - start] = hit;
         }
 
-        //System.out.println("  hitUpto=" + hitUpto);
-        //System.out.println("    doc=" + hits[hitUpto].doc + " score=" + hits[hitUpto].score);
-
         hitUpto++;
 
         if (ref.hitIndex < shardHits[ref.shardIndex].scoreDocs.length) {
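
Taken together, the new signature supports incremental (multi-phase) merging: pass setShardIndex=false and keep each ScoreDoc's pre-assigned shardIndex across reductions. A minimal sketch of that flow, using only the constructors and the merge overload visible in this commit:

    import org.apache.lucene.search.ScoreDoc;
    import org.apache.lucene.search.TopDocs;

    public class IncrementalMergeSketch {
      public static void main(String[] args) {
        // Each shard response carries its shard's index on every ScoreDoc.
        TopDocs shard0 = new TopDocs(1, new ScoreDoc[] { new ScoreDoc(7, 2.0f, 0) });
        TopDocs shard1 = new TopDocs(1, new ScoreDoc[] { new ScoreDoc(3, 1.5f, 1) });

        // setShardIndex=false: merge trusts the pre-assigned shardIndex values
        // instead of overwriting them with positions in the input array.
        TopDocs partial = TopDocs.merge(0, 10, new TopDocs[] { shard0, shard1 }, false);

        // A later phase can fold in more shards without losing shard identity.
        TopDocs shard2 = new TopDocs(1, new ScoreDoc[] { new ScoreDoc(9, 3.0f, 2) });
        TopDocs merged = TopDocs.merge(0, 10, new TopDocs[] { partial, shard2 }, false);
        for (ScoreDoc sd : merged.scoreDocs) {
          System.out.println("doc=" + sd.doc + " shard=" + sd.shardIndex + " score=" + sd.score);
        }
      }
    }

Passing false while any incoming ScoreDoc.shardIndex is still -1 now fails fast with an IllegalArgumentException, as the updated test below demonstrates.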

http://git-wip-us.apache.org/repos/asf/lucene-solr/blob/2e56c0e5/lucene/core/src/test/org/apache/lucene/search/TestTopDocsMerge.java
----------------------------------------------------------------------
diff --git a/lucene/core/src/test/org/apache/lucene/search/TestTopDocsMerge.java b/lucene/core/src/test/org/apache/lucene/search/TestTopDocsMerge.java
index 37c61a4..0372c2a 100644
--- a/lucene/core/src/test/org/apache/lucene/search/TestTopDocsMerge.java
+++ b/lucene/core/src/test/org/apache/lucene/search/TestTopDocsMerge.java
@@ -17,15 +17,22 @@
 package org.apache.lucene.search;
 
 
+import java.io.IOException;
+import java.util.ArrayList;
+import java.util.Collections;
+import java.util.HashMap;
+import java.util.List;
+import java.util.Map;
+
 import org.apache.lucene.document.Document;
 import org.apache.lucene.document.Field;
 import org.apache.lucene.document.FloatDocValuesField;
 import org.apache.lucene.document.NumericDocValuesField;
 import org.apache.lucene.document.SortedDocValuesField;
-import org.apache.lucene.index.LeafReaderContext;
 import org.apache.lucene.index.CompositeReaderContext;
 import org.apache.lucene.index.IndexReader;
 import org.apache.lucene.index.IndexReaderContext;
+import org.apache.lucene.index.LeafReaderContext;
 import org.apache.lucene.index.RandomIndexWriter;
 import org.apache.lucene.index.ReaderUtil;
 import org.apache.lucene.index.Term;
@@ -35,13 +42,6 @@ import org.apache.lucene.util.BytesRef;
 import org.apache.lucene.util.LuceneTestCase;
 import org.apache.lucene.util.TestUtil;
 
-import java.io.IOException;
-import java.util.ArrayList;
-import java.util.Collections;
-import java.util.HashMap;
-import java.util.List;
-import java.util.Map;
-
 public class TestTopDocsMerge extends LuceneTestCase {
 
   private static class ShardSearcher extends IndexSearcher {
@@ -77,18 +77,18 @@ public class TestTopDocsMerge extends LuceneTestCase {
 
   public void testInconsistentTopDocsFail() {
     TopDocs[] topDocs = new TopDocs[] {
-        new TopDocs(1, new ScoreDoc[] { new ScoreDoc(1, 1.0f, 1) }),
+        new TopDocs(1, new ScoreDoc[] { new ScoreDoc(1, 1.0f) }),
         new TopDocs(1, new ScoreDoc[] { new ScoreDoc(1, 1.0f, -1) })
     };
     if (random().nextBoolean()) {
       ArrayUtil.swap(topDocs, 0, 1);
     }
-    expectThrows(IllegalStateException.class, () -> {
-      TopDocs.merge(0, 1, topDocs);
+    expectThrows(IllegalArgumentException.class, () -> {
+        TopDocs.merge(0, 1, topDocs, false);
     });
   }
 
-  public void testAssignShardIndex() {
+  public void testPreAssignedShardIndex() {
     boolean useConstantScore = random().nextBoolean();
     int numTopDocs = 2 + random().nextInt(10);
     ArrayList<TopDocs> topDocs = new ArrayList<>(numTopDocs);
@@ -100,8 +100,8 @@ public class TestTopDocsMerge extends LuceneTestCase {
       ScoreDoc[] scoreDocs = new ScoreDoc[numHits];
       for (int j = 0; j < scoreDocs.length; j++) {
         float score = useConstantScore ? 1.0f : random().nextFloat();
-        scoreDocs[j] = new ScoreDoc((100 * i) + j, score , i);
         // we set the shard index to index in the list here but shuffle the entire list below
+        scoreDocs[j] = new ScoreDoc((100 * i) + j, score , i);
       }
       topDocs.add(new TopDocs(numHits, scoreDocs));
       shardResultMapping.put(i, topDocs.get(i));
@@ -111,7 +111,11 @@ public class TestTopDocsMerge extends LuceneTestCase {
     Collections.shuffle(topDocs, random());
     final int from = random().nextInt(numHitsTotal-1);
     final int size = 1 + random().nextInt(numHitsTotal - from);
-    TopDocs merge = TopDocs.merge(from, size, topDocs.toArray(new TopDocs[0]));
+
+    // passing false here means TopDocs.merge uses the incoming ScoreDoc.shardIndex
+    // that we already set, instead of the position of that TopDocs in the array:
+    TopDocs merge = TopDocs.merge(from, size, topDocs.toArray(new TopDocs[0]), false);
+    
     assertTrue(merge.scoreDocs.length > 0);
     for (ScoreDoc scoreDoc : merge.scoreDocs) {
       assertTrue(scoreDoc.shardIndex != -1);
@@ -129,7 +133,7 @@ public class TestTopDocsMerge extends LuceneTestCase {
 
     // now ensure merge is stable even if we use our own shard IDs
     Collections.shuffle(topDocs, random());
-    TopDocs merge2 = TopDocs.merge(from, size, topDocs.toArray(new TopDocs[0]));
+    TopDocs merge2 = TopDocs.merge(from, size, topDocs.toArray(new TopDocs[0]), false);
     assertArrayEquals(merge.scoreDocs, merge2.scoreDocs);
   }
 
@@ -346,9 +350,9 @@ public class TestTopDocsMerge extends LuceneTestCase {
       final TopDocs mergedHits;
       if (useFrom) {
         if (sort == null) {
-          mergedHits = TopDocs.merge(from, size, shardHits);
+          mergedHits = TopDocs.merge(from, size, shardHits, true);
         } else {
-          mergedHits = TopDocs.merge(sort, from, size, (TopFieldDocs[]) shardHits);
+          mergedHits = TopDocs.merge(sort, from, size, (TopFieldDocs[]) shardHits, true);
         }
       } else {
         if (sort == null) {


[07/50] [abbrv] lucene-solr:jira/solr-9858: SOLR-10193: Improve MiniSolrCloudCluster#shutdown.

Posted by ab...@apache.org.
SOLR-10193: Improve MiniSolrCloudCluster#shutdown.


Project: http://git-wip-us.apache.org/repos/asf/lucene-solr/repo
Commit: http://git-wip-us.apache.org/repos/asf/lucene-solr/commit/29a5ea44
Tree: http://git-wip-us.apache.org/repos/asf/lucene-solr/tree/29a5ea44
Diff: http://git-wip-us.apache.org/repos/asf/lucene-solr/diff/29a5ea44

Branch: refs/heads/jira/solr-9858
Commit: 29a5ea44a7f010e27a8c8951d697fc0fbb8d5403
Parents: d6337ac
Author: markrmiller <ma...@apache.org>
Authored: Wed Feb 22 15:58:12 2017 -0500
Committer: markrmiller <ma...@apache.org>
Committed: Wed Feb 22 15:58:12 2017 -0500

----------------------------------------------------------------------
 .../apache/solr/cloud/MiniSolrCloudCluster.java | 21 +++++++++++---------
 1 file changed, 12 insertions(+), 9 deletions(-)
----------------------------------------------------------------------


http://git-wip-us.apache.org/repos/asf/lucene-solr/blob/29a5ea44/solr/test-framework/src/java/org/apache/solr/cloud/MiniSolrCloudCluster.java
----------------------------------------------------------------------
diff --git a/solr/test-framework/src/java/org/apache/solr/cloud/MiniSolrCloudCluster.java b/solr/test-framework/src/java/org/apache/solr/cloud/MiniSolrCloudCluster.java
index 7eb9b0d..e8a0c08 100644
--- a/solr/test-framework/src/java/org/apache/solr/cloud/MiniSolrCloudCluster.java
+++ b/solr/test-framework/src/java/org/apache/solr/cloud/MiniSolrCloudCluster.java
@@ -36,7 +36,6 @@ import java.util.concurrent.CopyOnWriteArrayList;
 import java.util.concurrent.ExecutionException;
 import java.util.concurrent.ExecutorService;
 import java.util.concurrent.Future;
-import java.util.concurrent.TimeUnit;
 import java.util.concurrent.atomic.AtomicInteger;
 
 import org.apache.solr.client.solrj.embedded.JettyConfig;
@@ -50,6 +49,7 @@ import org.apache.solr.common.cloud.SolrZkClient;
 import org.apache.solr.common.cloud.ZkConfigManager;
 import org.apache.solr.common.cloud.ZkStateReader;
 import org.apache.solr.common.util.ExecutorUtil;
+import org.apache.solr.common.util.IOUtils;
 import org.apache.solr.common.util.SolrjNamedThreadFactory;
 import org.apache.solr.core.CoreContainer;
 import org.apache.zookeeper.KeeperException;
@@ -97,8 +97,9 @@ public class MiniSolrCloudCluster {
   private final CloudSolrClient solrClient;
   private final JettyConfig jettyConfig;
 
-  private final ExecutorService executor = ExecutorUtil.newMDCAwareCachedThreadPool(new SolrjNamedThreadFactory("jetty-launcher"));
-
+  private final ExecutorService executorLauncher = ExecutorUtil.newMDCAwareCachedThreadPool(new SolrjNamedThreadFactory("jetty-launcher"));
+  private final ExecutorService executorCloser = ExecutorUtil.newMDCAwareCachedThreadPool(new SolrjNamedThreadFactory("jetty-closer"));
+  
   private final AtomicInteger nodeIds = new AtomicInteger();
 
   /**
@@ -239,7 +240,7 @@ public class MiniSolrCloudCluster {
       startups.add(() -> startJettySolrRunner(newNodeName(), jettyConfig.context, jettyConfig));
     }
 
-    Collection<Future<JettySolrRunner>> futures = executor.invokeAll(startups);
+    Collection<Future<JettySolrRunner>> futures = executorLauncher.invokeAll(startups);
     Exception startupError = checkForExceptions("Error starting up MiniSolrCloudCluster", futures);
     if (startupError != null) {
       try {
@@ -443,21 +444,23 @@ public class MiniSolrCloudCluster {
    */
   public void shutdown() throws Exception {
     try {
-      if (solrClient != null)
-        solrClient.close();
+    
+      IOUtils.closeQuietly(solrClient);
+      // accept no new tasks
+      executorLauncher.shutdown();
       List<Callable<JettySolrRunner>> shutdowns = new ArrayList<>(jettys.size());
       for (final JettySolrRunner jetty : jettys) {
         shutdowns.add(() -> stopJettySolrRunner(jetty));
       }
       jettys.clear();
-      Collection<Future<JettySolrRunner>> futures = executor.invokeAll(shutdowns);
+      Collection<Future<JettySolrRunner>> futures = executorCloser.invokeAll(shutdowns);
       Exception shutdownError = checkForExceptions("Error shutting down MiniSolrCloudCluster", futures);
       if (shutdownError != null) {
         throw shutdownError;
       }
     } finally {
-      executor.shutdown();
-      executor.awaitTermination(15, TimeUnit.SECONDS);
+      ExecutorUtil.shutdownAndAwaitTermination(executorLauncher);
+      ExecutorUtil.shutdownAndAwaitTermination(executorCloser);
       try {
         if (!externalZkServer) {
           zkServer.shutdown();
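
The shutdown ordering above is the crux of the change: the launcher pool stops accepting work first, close tasks run on a dedicated pool so they cannot queue behind in-flight launches, and both pools are drained in the finally block. A minimal sketch of the same two-pool pattern in plain java.util.concurrent (class and method names below are illustrative, not Solr's ExecutorUtil APIs):

    import java.util.List;
    import java.util.concurrent.Callable;
    import java.util.concurrent.ExecutorService;
    import java.util.concurrent.Executors;
    import java.util.concurrent.Future;
    import java.util.concurrent.TimeUnit;

    public class TwoPoolShutdownSketch {
      private final ExecutorService launcher = Executors.newCachedThreadPool();
      private final ExecutorService closer = Executors.newCachedThreadPool();

      public void shutdown(List<Callable<Void>> closeTasks) throws Exception {
        try {
          launcher.shutdown(); // accept no new startup tasks
          // Close work runs on its own pool, isolated from launch tasks.
          List<Future<Void>> futures = closer.invokeAll(closeTasks);
          for (Future<Void> f : futures) {
            f.get(); // surface the first shutdown failure
          }
        } finally {
          launcher.awaitTermination(15, TimeUnit.SECONDS);
          closer.shutdown();
          closer.awaitTermination(15, TimeUnit.SECONDS);
        }
      }
    }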


[30/50] [abbrv] lucene-solr:jira/solr-9858: SOLR-10182 Remove metrics collection at Directory level.

Posted by ab...@apache.org.
SOLR-10182 Remove metrics collection at Directory level.


Project: http://git-wip-us.apache.org/repos/asf/lucene-solr/repo
Commit: http://git-wip-us.apache.org/repos/asf/lucene-solr/commit/a248e6e3
Tree: http://git-wip-us.apache.org/repos/asf/lucene-solr/tree/a248e6e3
Diff: http://git-wip-us.apache.org/repos/asf/lucene-solr/diff/a248e6e3

Branch: refs/heads/jira/solr-9858
Commit: a248e6e3c080cfe6deb873d1ef114e4b9c1c043d
Parents: 048b24c
Author: Andrzej Bialecki <ab...@apache.org>
Authored: Mon Feb 27 14:39:13 2017 +0100
Committer: Andrzej Bialecki <ab...@apache.org>
Committed: Mon Feb 27 16:32:27 2017 +0100

----------------------------------------------------------------------
 solr/CHANGES.txt                                |   5 +-
 .../org/apache/solr/core/DirectoryFactory.java  |   9 +-
 .../solr/core/MetricsDirectoryFactory.java      | 537 -------------------
 .../apache/solr/core/SolrDeletionPolicy.java    |   6 -
 .../conf/solrconfig-indexmetrics.xml            |   2 -
 .../HdfsWriteToMultipleCollectionsTest.java     |   7 +-
 .../solr/handler/TestReplicationHandler.java    |   9 +-
 .../admin/CoreMergeIndexesAdminHandlerTest.java |   8 +-
 .../solr/update/SolrIndexMetricsTest.java       |  44 --
 9 files changed, 7 insertions(+), 620 deletions(-)
----------------------------------------------------------------------


http://git-wip-us.apache.org/repos/asf/lucene-solr/blob/a248e6e3/solr/CHANGES.txt
----------------------------------------------------------------------
diff --git a/solr/CHANGES.txt b/solr/CHANGES.txt
index 546b484..99a0a42 100644
--- a/solr/CHANGES.txt
+++ b/solr/CHANGES.txt
@@ -270,9 +270,10 @@ Detailed Change List
 
 Bug Fixes
 ----------------------
-* SOLR-10130: Serious performance degradation in Solr 6.4.1 due to the new metrics collection.
+* SOLR-10130, SOLR-10182: Serious performance degradation in Solr 6.4.1 due to the new metrics collection.
   Default settings in solrconfig.xml /config/indexConfig/metrics have been changed to turn off
-  IndexWriter and Directory level metrics collection. (ab, ishan)
+  IndexWriter metrics collection. Directory level metrics collection has been completely removed until
+  a better design is found. (ab, ishan)
 
 * SOLR-10138: Transaction log replay can hit an NPE due to new Metrics code. (ab)
 

http://git-wip-us.apache.org/repos/asf/lucene-solr/blob/a248e6e3/solr/core/src/java/org/apache/solr/core/DirectoryFactory.java
----------------------------------------------------------------------
diff --git a/solr/core/src/java/org/apache/solr/core/DirectoryFactory.java b/solr/core/src/java/org/apache/solr/core/DirectoryFactory.java
index e4f0c5e..cc24e6c 100644
--- a/solr/core/src/java/org/apache/solr/core/DirectoryFactory.java
+++ b/solr/core/src/java/org/apache/solr/core/DirectoryFactory.java
@@ -411,13 +411,6 @@ public abstract class DirectoryFactory implements NamedListInitializedPlugin,
       dirFactory = new NRTCachingDirectoryFactory();
       dirFactory.initCoreContainer(cc);
     }
-    if (config.indexConfig.metricsInfo != null && config.indexConfig.metricsInfo.isEnabled()) {
-      final DirectoryFactory factory = new MetricsDirectoryFactory(cc.getMetricManager(),
-          registryName, dirFactory);
-        factory.init(config.indexConfig.metricsInfo.initArgs);
-      return factory;
-    } else {
-      return dirFactory;
-    }
+    return dirFactory;
   }
 }

http://git-wip-us.apache.org/repos/asf/lucene-solr/blob/a248e6e3/solr/core/src/java/org/apache/solr/core/MetricsDirectoryFactory.java
----------------------------------------------------------------------
diff --git a/solr/core/src/java/org/apache/solr/core/MetricsDirectoryFactory.java b/solr/core/src/java/org/apache/solr/core/MetricsDirectoryFactory.java
deleted file mode 100644
index b38c86f..0000000
--- a/solr/core/src/java/org/apache/solr/core/MetricsDirectoryFactory.java
+++ /dev/null
@@ -1,537 +0,0 @@
-/*
- * Licensed to the Apache Software Foundation (ASF) under one or more
- * contributor license agreements.  See the NOTICE file distributed with
- * this work for additional information regarding copyright ownership.
- * The ASF licenses this file to You under the Apache License, Version 2.0
- * (the "License"); you may not use this file except in compliance with
- * the License.  You may obtain a copy of the License at
- *
- *     http://www.apache.org/licenses/LICENSE-2.0
- *
- * Unless required by applicable law or agreed to in writing, software
- * distributed under the License is distributed on an "AS IS" BASIS,
- * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
- * See the License for the specific language governing permissions and
- * limitations under the License.
- */
-package org.apache.solr.core;
-
-import java.io.IOException;
-import java.util.Collection;
-
-import com.codahale.metrics.Histogram;
-import com.codahale.metrics.Meter;
-import org.apache.lucene.store.Directory;
-import org.apache.lucene.store.FilterDirectory;
-import org.apache.lucene.store.IOContext;
-import org.apache.lucene.store.IndexInput;
-import org.apache.lucene.store.IndexOutput;
-import org.apache.lucene.store.LockFactory;
-import org.apache.solr.common.util.NamedList;
-import org.apache.solr.metrics.SolrMetricManager;
-import org.apache.solr.util.plugin.SolrCoreAware;
-
-/**
- * An implementation of {@link DirectoryFactory} that decorates provided factory by
- * adding metrics for directory IO operations.
- */
-public class MetricsDirectoryFactory extends DirectoryFactory implements SolrCoreAware {
-  private final SolrMetricManager metricManager;
-  private final String registry;
-  private final DirectoryFactory in;
-  private boolean directoryDetails = false;
-  private boolean directoryTotals = false;
-
-  public MetricsDirectoryFactory(SolrMetricManager metricManager, String registry, DirectoryFactory in) {
-    this.metricManager = metricManager;
-    this.registry = registry;
-    this.in = in;
-  }
-
-  public DirectoryFactory getDelegate() {
-    return in;
-  }
-
-  /**
-   * Currently the following arguments are supported:
-   * <ul>
-   *   <li><code>directory</code> - (optional bool, default false) when true then coarse-grained metrics will be collected.</li>
-   *   <li><code>directoryDetails</code> - (optional bool, default false) when true then additional detailed metrics
-   *   will be collected. These include eg. IO size histograms and per-file counters and histograms</li>
-   * </ul>
-   * NOTE: please be aware that collecting even coarse-grained metrics can have significant performance impact
-   * (see SOLR-10130).
-   * @param args init args
-   */
-  @Override
-  public void init(NamedList args) {
-    // should be already inited
-    // in.init(args);
-    if (args == null) {
-      return;
-    }
-    Boolean td = args.getBooleanArg("directory");
-    if (td != null) {
-      directoryTotals = td;
-    } else {
-      directoryTotals = false;
-    }
-    Boolean dd = args.getBooleanArg("directoryDetails");
-    if (dd != null) {
-      directoryDetails = dd;
-    } else {
-      directoryDetails = false;
-    }
-    if (directoryDetails) {
-      directoryTotals = true;
-    }
-  }
-
-  /**
-   * Unwrap just one level if the argument is a {@link MetricsDirectory}
-   * @param dir directory
-   * @return delegate if the instance was a {@link MetricsDirectory}, otherwise unchanged.
-   */
-  private static Directory unwrap(Directory dir) {
-    if (dir instanceof MetricsDirectory) {
-      return ((MetricsDirectory)dir).getDelegate();
-    } else {
-      return dir;
-    }
-  }
-
-  @Override
-  public void doneWithDirectory(Directory dir) throws IOException {
-    dir = unwrap(dir);
-    in.doneWithDirectory(dir);
-  }
-
-  @Override
-  public void addCloseListener(Directory dir, CachingDirectoryFactory.CloseListener closeListener) {
-    dir = unwrap(dir);
-    in.addCloseListener(dir, closeListener);
-  }
-
-  @Override
-  public void close() throws IOException {
-    in.close();
-  }
-
-  @Override
-  protected Directory create(String path, LockFactory lockFactory, DirContext dirContext) throws IOException {
-    Directory dir = in.create(path, lockFactory, dirContext);
-    return new MetricsDirectory(metricManager, registry, dir, directoryTotals, directoryDetails);
-  }
-
-  @Override
-  protected LockFactory createLockFactory(String rawLockType) throws IOException {
-    return in.createLockFactory(rawLockType);
-  }
-
-  @Override
-  public boolean exists(String path) throws IOException {
-    return in.exists(path);
-  }
-
-  @Override
-  public void remove(Directory dir) throws IOException {
-    dir = unwrap(dir);
-    in.remove(dir);
-  }
-
-  @Override
-  public void remove(Directory dir, boolean afterCoreClose) throws IOException {
-    dir = unwrap(dir);
-    in.remove(dir, afterCoreClose);
-  }
-
-  @Override
-  public boolean isSharedStorage() {
-    return in.isSharedStorage();
-  }
-
-  @Override
-  public boolean isAbsolute(String path) {
-    return in.isAbsolute(path);
-  }
-
-  @Override
-  public boolean searchersReserveCommitPoints() {
-    return in.searchersReserveCommitPoints();
-  }
-
-  @Override
-  public String getDataHome(CoreDescriptor cd) throws IOException {
-    return in.getDataHome(cd);
-  }
-
-  @Override
-  public long size(Directory dir) throws IOException {
-    dir = unwrap(dir);
-    return in.size(dir);
-  }
-
-  @Override
-  public long size(String path) throws IOException {
-    return in.size(path);
-  }
-
-  @Override
-  public Collection<SolrInfoMBean> offerMBeans() {
-    return in.offerMBeans();
-  }
-
-  @Override
-  public void cleanupOldIndexDirectories(String dataDirPath, String currentIndexDirPath, boolean reload) {
-    in.cleanupOldIndexDirectories(dataDirPath, currentIndexDirPath, reload);
-  }
-
-  @Override
-  public void remove(String path, boolean afterCoreClose) throws IOException {
-    in.remove(path, afterCoreClose);
-  }
-
-  @Override
-  public void remove(String path) throws IOException {
-    in.remove(path);
-  }
-
-  @Override
-  public void move(Directory fromDir, Directory toDir, String fileName, IOContext ioContext) throws IOException {
-    fromDir = unwrap(fromDir);
-    toDir = unwrap(toDir);
-    in.move(fromDir, toDir, fileName, ioContext);
-  }
-
-  @Override
-  public Directory get(String path, DirContext dirContext, String rawLockType) throws IOException {
-    Directory dir = in.get(path, dirContext, rawLockType);
-    if (dir instanceof MetricsDirectory) {
-      return dir;
-    } else {
-      return new MetricsDirectory(metricManager, registry, dir, directoryTotals, directoryDetails);
-    }
-  }
-
-  @Override
-  public void renameWithOverwrite(Directory dir, String fileName, String toName) throws IOException {
-    dir = unwrap(dir);
-    in.renameWithOverwrite(dir, fileName, toName);
-  }
-
-  @Override
-  public String normalize(String path) throws IOException {
-    return in.normalize(path);
-  }
-
-  @Override
-  protected boolean deleteOldIndexDirectory(String oldDirPath) throws IOException {
-    return in.deleteOldIndexDirectory(oldDirPath);
-  }
-
-  @Override
-  public void initCoreContainer(CoreContainer cc) {
-    in.initCoreContainer(cc);
-  }
-
-  @Override
-  public void incRef(Directory dir) {
-    dir = unwrap(dir);
-    in.incRef(dir);
-  }
-
-  @Override
-  public boolean isPersistent() {
-    return in.isPersistent();
-  }
-
-  @Override
-  public void inform(SolrCore core) {
-    if (in instanceof  SolrCoreAware) {
-      ((SolrCoreAware)in).inform(core);
-    }
-  }
-
-  @Override
-  public void release(Directory dir) throws IOException {
-    dir = unwrap(dir);
-    in.release(dir);
-  }
-
-
-
-  private static final String SEGMENTS = "segments";
-  private static final String SEGMENTS_PREFIX = "segments_";
-  private static final String PENDING_SEGMENTS_PREFIX = "pending_segments_";
-  private static final String TEMP = "temp";
-  private static final String OTHER = "other";
-
-  public static class MetricsDirectory extends FilterDirectory {
-
-    private final Directory in;
-    private final String registry;
-    private final SolrMetricManager metricManager;
-    private final Meter totalReads;
-    private final Histogram totalReadSizes;
-    private final Meter totalWrites;
-    private final Histogram totalWriteSizes;
-    private final boolean directoryDetails;
-    private final boolean directoryTotals;
-
-    private final String PREFIX = SolrInfoMBean.Category.DIRECTORY.toString() + ".";
-
-    public MetricsDirectory(SolrMetricManager metricManager, String registry, Directory in, boolean directoryTotals,
-                            boolean directoryDetails) throws IOException {
-      super(in);
-      this.metricManager = metricManager;
-      this.registry = registry;
-      this.in = in;
-      this.directoryDetails = directoryDetails;
-      this.directoryTotals = directoryTotals;
-      if (directoryTotals) {
-        this.totalReads = metricManager.meter(registry, "reads", SolrInfoMBean.Category.DIRECTORY.toString(), "total");
-        this.totalWrites = metricManager.meter(registry, "writes", SolrInfoMBean.Category.DIRECTORY.toString(), "total");
-        if (directoryDetails) {
-          this.totalReadSizes = metricManager.histogram(registry, "readSizes", SolrInfoMBean.Category.DIRECTORY.toString(), "total");
-          this.totalWriteSizes = metricManager.histogram(registry, "writeSizes", SolrInfoMBean.Category.DIRECTORY.toString(), "total");
-        } else {
-          this.totalReadSizes = null;
-          this.totalWriteSizes = null;
-        }
-      } else {
-        this.totalReads = null;
-        this.totalWrites = null;
-        this.totalReadSizes = null;
-        this.totalWriteSizes = null;
-      }
-    }
-
-    private String getMetricName(String name, boolean output) {
-      if (!directoryDetails) {
-        return null;
-      }
-      String lastName;
-      if (name.startsWith(SEGMENTS_PREFIX) || name.startsWith(PENDING_SEGMENTS_PREFIX)) {
-        lastName = SEGMENTS;
-      } else {
-        int pos = name.lastIndexOf('.');
-        if (pos != -1 && name.length() > pos + 1) {
-          lastName = name.substring(pos + 1);
-        } else {
-          lastName = OTHER;
-        }
-      }
-      StringBuilder sb = new StringBuilder(PREFIX);
-      sb.append(lastName);
-      sb.append('.');
-      if (output) {
-        sb.append("write");
-      } else {
-        sb.append("read");
-      }
-      return sb.toString();
-    }
-
-    @Override
-    public IndexOutput createOutput(String name, IOContext context) throws IOException {
-      IndexOutput output = in.createOutput(name, context);
-      if (!directoryTotals) {
-        return output;
-      }
-      if (output != null) {
-        return new MetricsOutput(totalWrites, totalWriteSizes, metricManager, registry, getMetricName(name, true), output);
-      } else {
-        return null;
-      }
-    }
-
-    @Override
-    public IndexOutput createTempOutput(String prefix, String suffix, IOContext context) throws IOException {
-      IndexOutput output = in.createTempOutput(prefix, suffix, context);
-      if (!directoryTotals) {
-        return output;
-      }
-      if (output != null) {
-        return new MetricsOutput(totalWrites, totalWriteSizes, metricManager, registry, getMetricName(TEMP, true), output);
-      } else {
-        return null;
-      }
-    }
-
-    @Override
-    public IndexInput openInput(String name, IOContext context) throws IOException {
-      IndexInput input = in.openInput(name, context);
-      if (!directoryTotals) {
-        return input;
-      }
-      if (input != null) {
-        return new MetricsInput(totalReads, totalReadSizes, metricManager, registry, getMetricName(name, false), input);
-      } else {
-        return null;
-      }
-    }
-  }
-
-  public static class MetricsOutput extends IndexOutput {
-    private final IndexOutput in;
-    private final Histogram histogram;
-    private final Meter meter;
-    private final Meter totalMeter;
-    private final Histogram totalHistogram;
-    private final boolean withDetails;
-
-    public MetricsOutput(Meter totalMeter, Histogram totalHistogram, SolrMetricManager metricManager,
-                         String registry, String metricName, IndexOutput in) {
-      super(in.toString(), in.getName());
-      this.in = in;
-      this.totalMeter = totalMeter;
-      this.totalHistogram = totalHistogram;
-      if (metricName != null && totalHistogram != null) {
-        withDetails = true;
-        String histName = metricName + "Sizes";
-        String meterName = metricName + "s";
-        this.histogram = metricManager.histogram(registry, histName);
-        this.meter = metricManager.meter(registry, meterName);
-      } else {
-        withDetails = false;
-        this.histogram = null;
-        this.meter = null;
-      }
-    }
-
-    @Override
-    public void writeByte(byte b) throws IOException {
-      in.writeByte(b);
-      totalMeter.mark();
-      if (withDetails) {
-        totalHistogram.update(1);
-        meter.mark();
-        histogram.update(1);
-      }
-    }
-
-    @Override
-    public void writeBytes(byte[] b, int offset, int length) throws IOException {
-      in.writeBytes(b, offset, length);
-      totalMeter.mark(length);
-      if (withDetails) {
-        totalHistogram.update(length);
-        meter.mark(length);
-        histogram.update(length);
-      }
-    }
-
-    @Override
-    public void close() throws IOException {
-      in.close();
-    }
-
-    @Override
-    public long getFilePointer() {
-      return in.getFilePointer();
-    }
-
-    @Override
-    public long getChecksum() throws IOException {
-      return in.getChecksum();
-    }
-  }
-
-  public static class MetricsInput extends IndexInput {
-    private final IndexInput in;
-    private final Meter totalMeter;
-    private final Histogram totalHistogram;
-    private final Histogram histogram;
-    private final Meter meter;
-    private final boolean withDetails;
-
-    public MetricsInput(Meter totalMeter, Histogram totalHistogram, SolrMetricManager metricManager, String registry, String metricName, IndexInput in) {
-      super(in.toString());
-      this.in = in;
-      this.totalMeter = totalMeter;
-      this.totalHistogram = totalHistogram;
-      if (metricName != null && totalHistogram != null) {
-        withDetails = true;
-        String histName = metricName + "Sizes";
-        String meterName = metricName + "s";
-        this.histogram = metricManager.histogram(registry, histName);
-        this.meter = metricManager.meter(registry, meterName);
-      } else {
-        withDetails = false;
-        this.histogram = null;
-        this.meter = null;
-      }
-    }
-
-    public MetricsInput(Meter totalMeter, Histogram totalHistogram, Histogram histogram, Meter meter, IndexInput in) {
-      super(in.toString());
-      this.in = in;
-      this.totalMeter = totalMeter;
-      this.totalHistogram  = totalHistogram;
-      this.histogram = histogram;
-      this.meter = meter;
-      if (totalHistogram != null && meter != null && histogram != null) {
-        withDetails = true;
-      } else {
-        withDetails = false;
-      }
-    }
-
-    @Override
-    public void close() throws IOException {
-      in.close();
-    }
-
-    @Override
-    public long getFilePointer() {
-      return in.getFilePointer();
-    }
-
-    @Override
-    public void seek(long pos) throws IOException {
-      in.seek(pos);
-    }
-
-    @Override
-    public long length() {
-      return in.length();
-    }
-
-    @Override
-    public IndexInput clone() {
-      return new MetricsInput(totalMeter, totalHistogram, histogram, meter, in.clone());
-    }
-
-    @Override
-    public IndexInput slice(String sliceDescription, long offset, long length) throws IOException {
-      IndexInput slice = in.slice(sliceDescription, offset, length);
-      if (slice != null) {
-        return new MetricsInput(totalMeter, totalHistogram, histogram, meter, slice);
-      } else {
-        return null;
-      }
-    }
-
-    @Override
-    public byte readByte() throws IOException {
-      totalMeter.mark();
-      if (withDetails) {
-        totalHistogram.update(1);
-        meter.mark();
-        histogram.update(1);
-      }
-      return in.readByte();
-    }
-
-    @Override
-    public void readBytes(byte[] b, int offset, int len) throws IOException {
-      totalMeter.mark(len);
-      if (withDetails) {
-        totalHistogram.update(len);
-        meter.mark(len);
-        histogram.update(len);
-      }
-      in.readBytes(b, offset, len);
-    }
-  }
-}

http://git-wip-us.apache.org/repos/asf/lucene-solr/blob/a248e6e3/solr/core/src/java/org/apache/solr/core/SolrDeletionPolicy.java
----------------------------------------------------------------------
diff --git a/solr/core/src/java/org/apache/solr/core/SolrDeletionPolicy.java b/solr/core/src/java/org/apache/solr/core/SolrDeletionPolicy.java
index eba2964..34482cd 100644
--- a/solr/core/src/java/org/apache/solr/core/SolrDeletionPolicy.java
+++ b/solr/core/src/java/org/apache/solr/core/SolrDeletionPolicy.java
@@ -114,9 +114,6 @@ public class SolrDeletionPolicy extends IndexDeletionPolicy implements NamedList
 
     protected void appendDetails(StringBuilder sb, IndexCommit c) {
       Directory dir = c.getDirectory();
-      if (dir instanceof MetricsDirectoryFactory.MetricsDirectory) { // unwrap
-        dir = ((MetricsDirectoryFactory.MetricsDirectory) dir).getDelegate();
-      }
       if (dir instanceof FSDirectory) {
         FSDirectory fsd = (FSDirectory) dir;
         sb.append("dir=").append(fsd.getDirectory());
@@ -197,9 +194,6 @@ public class SolrDeletionPolicy extends IndexDeletionPolicy implements NamedList
   private String getId(IndexCommit commit) {
     StringBuilder sb = new StringBuilder();
     Directory dir = commit.getDirectory();
-    if (dir instanceof MetricsDirectoryFactory.MetricsDirectory) { // unwrap
-      dir = ((MetricsDirectoryFactory.MetricsDirectory) dir).getDelegate();
-    }
 
     // For anything persistent, make something that will
     // be the same, regardless of the Directory instance.

http://git-wip-us.apache.org/repos/asf/lucene-solr/blob/a248e6e3/solr/core/src/test-files/solr/collection1/conf/solrconfig-indexmetrics.xml
----------------------------------------------------------------------
diff --git a/solr/core/src/test-files/solr/collection1/conf/solrconfig-indexmetrics.xml b/solr/core/src/test-files/solr/collection1/conf/solrconfig-indexmetrics.xml
index 188340d..6cf54d5 100644
--- a/solr/core/src/test-files/solr/collection1/conf/solrconfig-indexmetrics.xml
+++ b/solr/core/src/test-files/solr/collection1/conf/solrconfig-indexmetrics.xml
@@ -29,8 +29,6 @@
 
   <indexConfig>
     <metrics>
-      <bool name="directory">${solr.tests.metrics.directory:false}</bool>
-      <bool name="directoryDetails">${solr.tests.metrics.directoryDetails:false}</bool>
       <bool name="merge">${solr.tests.metrics.merge:false}</bool>
       <bool name="mergeDetails">${solr.tests.metrics.mergeDetails:false}</bool>
     </metrics>

http://git-wip-us.apache.org/repos/asf/lucene-solr/blob/a248e6e3/solr/core/src/test/org/apache/solr/cloud/hdfs/HdfsWriteToMultipleCollectionsTest.java
----------------------------------------------------------------------
diff --git a/solr/core/src/test/org/apache/solr/cloud/hdfs/HdfsWriteToMultipleCollectionsTest.java b/solr/core/src/test/org/apache/solr/cloud/hdfs/HdfsWriteToMultipleCollectionsTest.java
index b345342..ca85fe0 100644
--- a/solr/core/src/test/org/apache/solr/cloud/hdfs/HdfsWriteToMultipleCollectionsTest.java
+++ b/solr/core/src/test/org/apache/solr/cloud/hdfs/HdfsWriteToMultipleCollectionsTest.java
@@ -41,7 +41,6 @@ import org.apache.solr.cloud.StoppableIndexingThread;
 import org.apache.solr.core.CoreContainer;
 import org.apache.solr.core.DirectoryFactory;
 import org.apache.solr.core.HdfsDirectoryFactory;
-import org.apache.solr.core.MetricsDirectoryFactory;
 import org.apache.solr.core.SolrCore;
 import org.apache.solr.store.blockcache.BlockCache;
 import org.apache.solr.store.blockcache.BlockDirectory;
@@ -137,9 +136,6 @@ public class HdfsWriteToMultipleCollectionsTest extends BasicDistributedZkTest {
         if (core.getCoreDescriptor().getCloudDescriptor().getCollectionName()
             .startsWith(ACOLLECTION)) {
           DirectoryFactory factory = core.getDirectoryFactory();
-          if (factory instanceof MetricsDirectoryFactory) {
-            factory = ((MetricsDirectoryFactory) factory).getDelegate();
-          }
           assertTrue("Found: " + core.getDirectoryFactory().getClass().getName(), factory instanceof HdfsDirectoryFactory);
           Directory dir = factory.get(core.getDataDir(), null, null);
           try {
@@ -159,8 +155,7 @@ public class HdfsWriteToMultipleCollectionsTest extends BasicDistributedZkTest {
               .getSolrCoreState().getIndexWriter(core);
           try {
             IndexWriter iw = iwRef.get();
-            NRTCachingDirectory directory = (NRTCachingDirectory) ((MetricsDirectoryFactory.MetricsDirectory)iw
-                .getDirectory()).getDelegate();
+            NRTCachingDirectory directory = (NRTCachingDirectory) iw.getDirectory();
             BlockDirectory blockDirectory = (BlockDirectory) directory
                 .getDelegate();
             assertTrue(blockDirectory.isBlockCacheReadEnabled());

http://git-wip-us.apache.org/repos/asf/lucene-solr/blob/a248e6e3/solr/core/src/test/org/apache/solr/handler/TestReplicationHandler.java
----------------------------------------------------------------------
diff --git a/solr/core/src/test/org/apache/solr/handler/TestReplicationHandler.java b/solr/core/src/test/org/apache/solr/handler/TestReplicationHandler.java
index 9deb51d..8148b88 100644
--- a/solr/core/src/test/org/apache/solr/handler/TestReplicationHandler.java
+++ b/solr/core/src/test/org/apache/solr/handler/TestReplicationHandler.java
@@ -66,8 +66,6 @@ import org.apache.solr.common.params.SolrParams;
 import org.apache.solr.common.util.NamedList;
 import org.apache.solr.core.CachingDirectoryFactory;
 import org.apache.solr.core.CoreContainer;
-import org.apache.solr.core.DirectoryFactory;
-import org.apache.solr.core.MetricsDirectoryFactory;
 import org.apache.solr.core.SolrCore;
 import org.apache.solr.core.StandardDirectoryFactory;
 import org.apache.solr.core.snapshots.SolrSnapshotMetaDataManager;
@@ -923,12 +921,7 @@ public class TestReplicationHandler extends SolrTestCaseJ4 {
   }
 
   private CachingDirectoryFactory getCachingDirectoryFactory(SolrCore core) {
-    DirectoryFactory df = core.getDirectoryFactory();
-    if (df instanceof MetricsDirectoryFactory) {
-      return (CachingDirectoryFactory)((MetricsDirectoryFactory)df).getDelegate();
-    } else {
-      return (CachingDirectoryFactory)df;
-    }
+    return (CachingDirectoryFactory) core.getDirectoryFactory();
   }
 
   private void checkForSingleIndex(JettySolrRunner jetty) {

http://git-wip-us.apache.org/repos/asf/lucene-solr/blob/a248e6e3/solr/core/src/test/org/apache/solr/handler/admin/CoreMergeIndexesAdminHandlerTest.java
----------------------------------------------------------------------
diff --git a/solr/core/src/test/org/apache/solr/handler/admin/CoreMergeIndexesAdminHandlerTest.java b/solr/core/src/test/org/apache/solr/handler/admin/CoreMergeIndexesAdminHandlerTest.java
index 937cc86..d026ecd 100644
--- a/solr/core/src/test/org/apache/solr/handler/admin/CoreMergeIndexesAdminHandlerTest.java
+++ b/solr/core/src/test/org/apache/solr/handler/admin/CoreMergeIndexesAdminHandlerTest.java
@@ -25,7 +25,6 @@ import org.apache.solr.SolrTestCaseJ4;
 import org.apache.solr.common.params.CoreAdminParams;
 import org.apache.solr.core.CoreContainer;
 import org.apache.solr.core.DirectoryFactory;
-import org.apache.solr.core.MetricsDirectoryFactory;
 import org.apache.solr.core.MockFSDirectoryFactory;
 import org.apache.solr.core.SolrCore;
 import org.apache.solr.response.SolrQueryResponse;
@@ -78,12 +77,7 @@ public class CoreMergeIndexesAdminHandlerTest extends SolrTestCaseJ4 {
 
     try (SolrCore core = cores.getCore("collection1")) {
       DirectoryFactory df = core.getDirectoryFactory();
-      FailingDirectoryFactory dirFactory;
-      if (df instanceof MetricsDirectoryFactory) {
-        dirFactory = (FailingDirectoryFactory)((MetricsDirectoryFactory)df).getDelegate();
-      } else {
-        dirFactory = (FailingDirectoryFactory)df;
-      }
+      FailingDirectoryFactory dirFactory = (FailingDirectoryFactory) df;
 
       try {
         dirFactory.fail = true;

http://git-wip-us.apache.org/repos/asf/lucene-solr/blob/a248e6e3/solr/core/src/test/org/apache/solr/update/SolrIndexMetricsTest.java
----------------------------------------------------------------------
diff --git a/solr/core/src/test/org/apache/solr/update/SolrIndexMetricsTest.java b/solr/core/src/test/org/apache/solr/update/SolrIndexMetricsTest.java
index 39b511d..c9935bb 100644
--- a/solr/core/src/test/org/apache/solr/update/SolrIndexMetricsTest.java
+++ b/solr/core/src/test/org/apache/solr/update/SolrIndexMetricsTest.java
@@ -18,7 +18,6 @@ package org.apache.solr.update;
 
 import java.util.Map;
 
-import com.codahale.metrics.Histogram;
 import com.codahale.metrics.Meter;
 import com.codahale.metrics.Metric;
 import com.codahale.metrics.MetricRegistry;
@@ -59,8 +58,6 @@ public class SolrIndexMetricsTest extends SolrTestCaseJ4 {
   public void testIndexMetricsNoDetails() throws Exception {
     System.setProperty("solr.tests.metrics.merge", "true");
     System.setProperty("solr.tests.metrics.mergeDetails", "false");
-    System.setProperty("solr.tests.metrics.directory", "true");
-    System.setProperty("solr.tests.metrics.directoryDetails", "false");
     initCore("solrconfig-indexmetrics.xml", "schema.xml");
 
     addDocs();
@@ -71,7 +68,6 @@ public class SolrIndexMetricsTest extends SolrTestCaseJ4 {
     Map<String, Metric> metrics = registry.getMetrics();
 
     assertEquals(10, metrics.entrySet().stream().filter(e -> e.getKey().startsWith("INDEX")).count());
-    assertEquals(2, metrics.entrySet().stream().filter(e -> e.getKey().startsWith("DIRECTORY")).count());
 
     // check basic index meters
     Timer timer = (Timer)metrics.get("INDEX.merge.minor");
@@ -82,30 +78,12 @@ public class SolrIndexMetricsTest extends SolrTestCaseJ4 {
     assertNull((Meter)metrics.get("INDEX.merge.major.docs"));
     Meter meter = (Meter)metrics.get("INDEX.flush");
     assertTrue("flush: " + meter.getCount(), meter.getCount() > 10);
-
-    // check basic directory meters
-    meter = (Meter)metrics.get("DIRECTORY.total.reads");
-    assertTrue("totalReads", meter.getCount() > 0);
-    meter = (Meter)metrics.get("DIRECTORY.total.writes");
-    assertTrue("totalWrites", meter.getCount() > 0);
-    // check detailed meters
-    Histogram histogram = (Histogram)metrics.get("DIRECTORY.total.readSizes");
-    assertNull("readSizes", histogram);
-    histogram = (Histogram)metrics.get("DIRECTORY.total.writeSizes");
-    assertNull("writeSizes", histogram);
-    meter = (Meter)metrics.get("DIRECTORY.segments.writes");
-    assertNull("segmentsWrites", meter);
-    histogram = (Histogram)metrics.get("DIRECTORY.segments.writeSizes");
-    assertNull("segmentsWriteSizes", histogram);
-
   }
 
   @Test
   public void testIndexNoMetrics() throws Exception {
     System.setProperty("solr.tests.metrics.merge", "false");
     System.setProperty("solr.tests.metrics.mergeDetails", "false");
-    System.setProperty("solr.tests.metrics.directory", "false");
-    System.setProperty("solr.tests.metrics.directoryDetails", "false");
     initCore("solrconfig-indexmetrics.xml", "schema.xml");
 
     addDocs();
@@ -115,16 +93,12 @@ public class SolrIndexMetricsTest extends SolrTestCaseJ4 {
 
     Map<String, Metric> metrics = registry.getMetrics();
     assertEquals(0, metrics.entrySet().stream().filter(e -> e.getKey().startsWith("INDEX")).count());
-    // this is variable, depending on the codec and the number of created files
-    assertEquals(0, metrics.entrySet().stream().filter(e -> e.getKey().startsWith("DIRECTORY")).count());
   }
 
   @Test
   public void testIndexMetricsWithDetails() throws Exception {
     System.setProperty("solr.tests.metrics.merge", "false"); // test mergeDetails override too
     System.setProperty("solr.tests.metrics.mergeDetails", "true");
-    System.setProperty("solr.tests.metrics.directory", "false");
-    System.setProperty("solr.tests.metrics.directoryDetails", "true");
     initCore("solrconfig-indexmetrics.xml", "schema.xml");
 
     addDocs();
@@ -135,8 +109,6 @@ public class SolrIndexMetricsTest extends SolrTestCaseJ4 {
     Map<String, Metric> metrics = registry.getMetrics();
 
     assertTrue(metrics.entrySet().stream().filter(e -> e.getKey().startsWith("INDEX")).count() >= 12);
-    // this is variable, depending on the codec and the number of created files
-    assertTrue(metrics.entrySet().stream().filter(e -> e.getKey().startsWith("DIRECTORY")).count() > 20);
 
     // check basic index meters
     Timer timer = (Timer)metrics.get("INDEX.merge.minor");
@@ -149,21 +121,5 @@ public class SolrIndexMetricsTest extends SolrTestCaseJ4 {
 
     meter = (Meter)metrics.get("INDEX.flush");
     assertTrue("flush: " + meter.getCount(), meter.getCount() > 10);
-
-    // check basic directory meters
-    meter = (Meter)metrics.get("DIRECTORY.total.reads");
-    assertTrue("totalReads", meter.getCount() > 0);
-    meter = (Meter)metrics.get("DIRECTORY.total.writes");
-    assertTrue("totalWrites", meter.getCount() > 0);
-    // check detailed meters
-    Histogram histogram = (Histogram)metrics.get("DIRECTORY.total.readSizes");
-    assertTrue("readSizes", histogram.getCount() > 0);
-    histogram = (Histogram)metrics.get("DIRECTORY.total.writeSizes");
-    assertTrue("writeSizes", histogram.getCount() > 0);
-    meter = (Meter)metrics.get("DIRECTORY.segments.writes");
-    assertTrue("segmentsWrites", meter.getCount() > 0);
-    histogram = (Histogram)metrics.get("DIRECTORY.segments.writeSizes");
-    assertTrue("segmentsWriteSizes", histogram.getCount() > 0);
-
   }
 }


[19/50] [abbrv] lucene-solr:jira/solr-9858: SOLR-9640: Support PKI authentication and SSL in standalone-mode master/slave auth with local security.json

Posted by ab...@apache.org.
SOLR-9640: Support PKI authentication and SSL in standalone-mode master/slave auth with local security.json
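
In standalone mode the security configuration now comes from a security.json
file in SOLR_HOME rather than from ZooKeeper. A minimal sketch of enabling it
programmatically, assuming you hold a running CoreContainer (getSolrHome() and
securityNodeChanged() are taken from this commit's diffs; EnableLocalBasicAuth
is a hypothetical helper, not part of the patch):

    import java.nio.charset.StandardCharsets;
    import java.nio.file.Files;
    import java.nio.file.Path;
    import java.nio.file.Paths;
    import org.apache.solr.core.CoreContainer;

    class EnableLocalBasicAuth {
      // Persist security.json under SOLR_HOME, then ask Solr to re-read it.
      static void enable(CoreContainer cc, String securityJson) throws Exception {
        Path target = Paths.get(cc.getSolrHome()).resolve("security.json");
        Files.write(target, securityJson.getBytes(StandardCharsets.UTF_8));
        cc.securityNodeChanged(); // reloads security.json (see CoreContainer diff below)
      }
    }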


Project: http://git-wip-us.apache.org/repos/asf/lucene-solr/repo
Commit: http://git-wip-us.apache.org/repos/asf/lucene-solr/commit/95d6fc25
Tree: http://git-wip-us.apache.org/repos/asf/lucene-solr/tree/95d6fc25
Diff: http://git-wip-us.apache.org/repos/asf/lucene-solr/diff/95d6fc25

Branch: refs/heads/jira/solr-9858
Commit: 95d6fc2512d6525b2354165553f0d6cc4d0d6310
Parents: 5eeb813
Author: Jan Høydahl <ja...@apache.org>
Authored: Fri Feb 24 14:26:48 2017 +0100
Committer: Jan Høydahl <ja...@apache.org>
Committed: Fri Feb 24 14:30:42 2017 +0100

----------------------------------------------------------------------
 solr/CHANGES.txt                                |   2 +
 .../org/apache/solr/core/CoreContainer.java     |   9 +-
 .../solr/security/PKIAuthenticationPlugin.java  |  42 +++++-
 .../org/apache/solr/servlet/HttpSolrCall.java   |   4 +-
 .../apache/solr/servlet/SolrDispatchFilter.java |  11 +-
 .../solr/security/BasicAuthDistributedTest.java | 136 +++++++++++++++++++
 .../security/TestPKIAuthenticationPlugin.java   |  38 +++++-
 .../solr/BaseDistributedSearchTestCase.java     |  37 ++++-
 8 files changed, 260 insertions(+), 19 deletions(-)
----------------------------------------------------------------------


http://git-wip-us.apache.org/repos/asf/lucene-solr/blob/95d6fc25/solr/CHANGES.txt
----------------------------------------------------------------------
diff --git a/solr/CHANGES.txt b/solr/CHANGES.txt
index 0302615..2c5f0db 100644
--- a/solr/CHANGES.txt
+++ b/solr/CHANGES.txt
@@ -134,6 +134,8 @@ New Features
   field must both be stored=false, indexed=false, docValues=true. (Ishan Chattopadhyaya, hossman, noble,
   shalin, yonik)
 
+* SOLR-9640: Support PKI authentication and SSL in standalone-mode master/slave auth with local security.json (janhoy)
+
 Bug Fixes
 ----------------------
 

http://git-wip-us.apache.org/repos/asf/lucene-solr/blob/95d6fc25/solr/core/src/java/org/apache/solr/core/CoreContainer.java
----------------------------------------------------------------------
diff --git a/solr/core/src/java/org/apache/solr/core/CoreContainer.java b/solr/core/src/java/org/apache/solr/core/CoreContainer.java
index e3977d7..6115562 100644
--- a/solr/core/src/java/org/apache/solr/core/CoreContainer.java
+++ b/solr/core/src/java/org/apache/solr/core/CoreContainer.java
@@ -497,7 +497,9 @@ public class CoreContainer {
     hostName = cfg.getNodeName();
 
     zkSys.initZooKeeper(this, solrHome, cfg.getCloudConfig());
-    if(isZooKeeperAware())  pkiAuthenticationPlugin = new PKIAuthenticationPlugin(this, zkSys.getZkController().getNodeName());
+    pkiAuthenticationPlugin = isZooKeeperAware() ?
+        new PKIAuthenticationPlugin(this, zkSys.getZkController().getNodeName()) :
+        new PKIAuthenticationPlugin(this, getNodeNameLocal());
 
     MDCLoggingContext.setNode(this);
 
@@ -618,6 +620,11 @@ public class CoreContainer {
     }
   }
 
+  // Builds a node name to be used with PKIAuth.
+  private String getNodeNameLocal() {
+    return getConfig().getCloudConfig().getHost()+":"+getConfig().getCloudConfig().getSolrHostPort()+"_solr";
+  }
+
   public void securityNodeChanged() {
     log.info("Security node changed, reloading security.json");
     reloadSecurityProperties();
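
The local node name built by getNodeNameLocal() above follows the same
"host:port_context" convention that ZooKeeper-aware nodes use, so PKI peers
can be resolved identically in both modes. A hedged illustration of the round
trip (hosts and ports are made up; the mapping matches the tests later in
this commit):

    // node name             ->  base URL (scheme resolved separately)
    // "my.host:9876_solr"   ->  "http://my.host:9876/solr"
    // "my.host:9876_solr2"  ->  "https://my.host:9876/solr2"  (with SSL enabled)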

http://git-wip-us.apache.org/repos/asf/lucene-solr/blob/95d6fc25/solr/core/src/java/org/apache/solr/security/PKIAuthenticationPlugin.java
----------------------------------------------------------------------
diff --git a/solr/core/src/java/org/apache/solr/security/PKIAuthenticationPlugin.java b/solr/core/src/java/org/apache/solr/security/PKIAuthenticationPlugin.java
index fdd4408..d185bc9 100644
--- a/solr/core/src/java/org/apache/solr/security/PKIAuthenticationPlugin.java
+++ b/solr/core/src/java/org/apache/solr/security/PKIAuthenticationPlugin.java
@@ -22,7 +22,9 @@ import javax.servlet.ServletResponse;
 import javax.servlet.http.HttpServletRequest;
 import javax.servlet.http.HttpServletRequestWrapper;
 import java.io.IOException;
+import java.io.UnsupportedEncodingException;
 import java.lang.invoke.MethodHandles;
+import java.net.URLDecoder;
 import java.nio.ByteBuffer;
 import java.security.Principal;
 import java.security.PublicKey;
@@ -193,9 +195,14 @@ public class PKIAuthenticationPlugin extends AuthenticationPlugin implements Htt
   }
 
   PublicKey getRemotePublicKey(String nodename) {
-    String url = cores.getZkController().getZkStateReader().getBaseUrlForNodeName(nodename);
+    String url, uri = null;
+    if (cores.isZooKeeperAware()) {
+      url = cores.getZkController().getZkStateReader().getBaseUrlForNodeName(nodename);
+    } else {
+      url = getBaseUrlForNodeNameLocal(nodename);
+    }
     try {
-      String uri = url + PATH + "?wt=json&omitHeader=true";
+      uri = url + PATH + "?wt=json&omitHeader=true";
       log.debug("Fetching fresh public key from : {}",uri);
       HttpResponse rsp = cores.getUpdateShardHandler().getHttpClient()
           .execute(new HttpGet(uri), HttpClientUtil.createNewHttpClientRequestContext());
@@ -212,12 +219,41 @@ public class PKIAuthenticationPlugin extends AuthenticationPlugin implements Htt
       keyCache.put(nodename, pubKey);
       return pubKey;
     } catch (Exception e) {
-      log.error("Exception trying to get public key from : " + url, e);
+      log.error("Exception trying to get public key from : " + uri, e);
       return null;
     }
 
   }
 
+  protected String getBaseUrlForNodeNameLocal(String nodeName) {
+    final int _offset = nodeName.indexOf("_");
+    if (_offset < 0) {
+      throw new IllegalArgumentException("nodeName does not contain expected '_' separator: " + nodeName);
+    }
+    final String hostAndPort = nodeName.substring(0,_offset);
+    try {
+      final String path = URLDecoder.decode(nodeName.substring(1+_offset), "UTF-8");
+      // TODO: Find a better way of resolving urlScheme when not using ZK?
+      String urlScheme = resolveUrlScheme();
+      return urlScheme + "://" + hostAndPort + (path.isEmpty() ? "" : ("/" + path));
+    } catch (UnsupportedEncodingException e) {
+      throw new IllegalStateException("JVM Does not seem to support UTF-8", e);
+    }
+  }
+
+  /**
+   * Resolve urlScheme first from sysProp "urlScheme", if not set or invalid value, peek at ssl sysProps
+   * @return "https" if SSL is enabled, else "http"
+   */
+  protected static String resolveUrlScheme() {
+    String urlScheme = System.getProperty("urlScheme");
+    if (urlScheme != null && urlScheme.matches("https?")) {
+      return urlScheme;
+    } else {
+      return System.getProperty("solr.jetty.keystore") == null ? "http" : "https";
+    }
+  }
+
   @Override
   public SolrHttpClientBuilder getHttpClientBuilder(SolrHttpClientBuilder builder) {
     HttpClientUtil.addRequestInterceptor(interceptor);
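
resolveUrlScheme() gives a simple precedence: a valid "urlScheme" system
property wins, otherwise the presence of "solr.jetty.keystore" implies SSL.
A hedged summary of the cases, matching testResolveUrlScheme below:

    // -DurlScheme=http                      -> "http"
    // -DurlScheme=https                     -> "https"
    // -DurlScheme=ftp   (invalid, ignored)  -> falls through to keystore check
    // -Dsolr.jetty.keystore=<path>          -> "https"
    // neither property set                  -> "http"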

http://git-wip-us.apache.org/repos/asf/lucene-solr/blob/95d6fc25/solr/core/src/java/org/apache/solr/servlet/HttpSolrCall.java
----------------------------------------------------------------------
diff --git a/solr/core/src/java/org/apache/solr/servlet/HttpSolrCall.java b/solr/core/src/java/org/apache/solr/servlet/HttpSolrCall.java
index 4f6bae0..0dfb0ea 100644
--- a/solr/core/src/java/org/apache/solr/servlet/HttpSolrCall.java
+++ b/solr/core/src/java/org/apache/solr/servlet/HttpSolrCall.java
@@ -560,7 +560,7 @@ public class HttpSolrCall {
   }
 
   private boolean shouldAuthorize() {
-    if(PKIAuthenticationPlugin.PATH.equals(path)) return false;
+    if(path != null && path.endsWith(PKIAuthenticationPlugin.PATH)) return false;
     //admin/info/key is the path where public key is exposed . it is always unsecured
     if (cores.getPkiAuthenticationPlugin() != null && req.getUserPrincipal() != null) {
       boolean b = cores.getPkiAuthenticationPlugin().needsAuthorization(req);
@@ -1081,7 +1081,7 @@ public class HttpSolrCall {
           response.delete(response.length() - 1, response.length());
         
         response.append("], Path: [").append(resource).append("]");
-        response.append(" path : ").append(path).append(" params :").append(solrReq.getParams());
+        response.append(" path : ").append(path).append(" params :").append(solrReq == null ? null : solrReq.getParams());
         return response.toString();
       }
 

http://git-wip-us.apache.org/repos/asf/lucene-solr/blob/95d6fc25/solr/core/src/java/org/apache/solr/servlet/SolrDispatchFilter.java
----------------------------------------------------------------------
diff --git a/solr/core/src/java/org/apache/solr/servlet/SolrDispatchFilter.java b/solr/core/src/java/org/apache/solr/servlet/SolrDispatchFilter.java
index ce65069..4ce57b0 100644
--- a/solr/core/src/java/org/apache/solr/servlet/SolrDispatchFilter.java
+++ b/solr/core/src/java/org/apache/solr/servlet/SolrDispatchFilter.java
@@ -402,11 +402,11 @@ public class SolrDispatchFilter extends BaseSolrFilter {
     if (authenticationPlugin == null) {
       return true;
     } else {
-      // /admin/info/key must be always open. see SOLR-9188
-      // tests work only w/ getPathInfo
-      //otherwise it's just enough to have getServletPath()
-      if (PKIAuthenticationPlugin.PATH.equals(((HttpServletRequest) request).getServletPath()) ||
-          PKIAuthenticationPlugin.PATH.equals(((HttpServletRequest) request).getPathInfo())) return true;
+      String requestUri = ((HttpServletRequest) request).getRequestURI();
+      if (requestUri != null && requestUri.endsWith(PKIAuthenticationPlugin.PATH)) {
+        log.debug("Passthrough of pki URL " + requestUri);
+        return true;
+      }
       String header = ((HttpServletRequest) request).getHeader(PKIAuthenticationPlugin.HEADER);
       if (header != null && cores.getPkiAuthenticationPlugin() != null)
         authenticationPlugin = cores.getPkiAuthenticationPlugin();
@@ -418,7 +418,6 @@ public class SolrDispatchFilter extends BaseSolrFilter {
           wrappedRequest.set(req);
         });
       } catch (Exception e) {
-        log.info("Error authenticating", e);
         throw new SolrException(ErrorCode.SERVER_ERROR, "Error during request authentication, ", e);
       }
     }
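
The passthrough check moves from exact matching on getServletPath()/getPathInfo()
to an endsWith() test on the full request URI, which tolerates a servlet
context prefix. A hedged illustration (context path "/solr" assumed):

    // requestURI                   = "/solr/admin/info/key"
    // PKIAuthenticationPlugin.PATH = "/admin/info/key"
    // old: PATH.equals(getServletPath())  -> false under a context path
    // new: requestUri.endsWith(PATH)      -> true, public-key request passes through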

http://git-wip-us.apache.org/repos/asf/lucene-solr/blob/95d6fc25/solr/core/src/test/org/apache/solr/security/BasicAuthDistributedTest.java
----------------------------------------------------------------------
diff --git a/solr/core/src/test/org/apache/solr/security/BasicAuthDistributedTest.java b/solr/core/src/test/org/apache/solr/security/BasicAuthDistributedTest.java
new file mode 100644
index 0000000..e35e369
--- /dev/null
+++ b/solr/core/src/test/org/apache/solr/security/BasicAuthDistributedTest.java
@@ -0,0 +1,136 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements.  See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License.  You may obtain a copy of the License at
+ *
+ *     http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+package org.apache.solr.security;
+
+import java.io.IOException;
+import java.nio.file.Files;
+import java.nio.file.Paths;
+
+import org.apache.lucene.util.LuceneTestCase.Slow;
+import org.apache.solr.BaseDistributedSearchTestCase;
+import org.apache.solr.client.solrj.embedded.JettySolrRunner;
+import org.apache.solr.client.solrj.impl.HttpSolrClient;
+import org.apache.solr.client.solrj.request.QueryRequest;
+import org.apache.solr.client.solrj.response.QueryResponse;
+import org.apache.solr.common.params.ModifiableSolrParams;
+import org.apache.solr.common.util.Utils;
+import org.apache.solr.core.CoreContainer;
+import org.apache.solr.handler.admin.SecurityConfHandler;
+import org.apache.solr.handler.admin.SecurityConfHandlerLocalForTesting;
+import org.apache.solr.util.LogLevel;
+import org.junit.Test;
+
+/**
+ * Tests basicAuth in a multi shard env
+ */
+@Slow
+public class BasicAuthDistributedTest extends BaseDistributedSearchTestCase {
+  public BasicAuthDistributedTest() {
+    super();
+    schemaString = "schema.xml";
+  }
+
+  private SecurityConfHandlerLocalForTesting securityConfHandler;
+
+  @Test
+  @LogLevel("org.apache.solr=DEBUG")
+  public void test() throws Exception {
+    index();
+    testAuth();
+  }
+
+  private void index() throws Exception {
+    del("*:*");
+    indexr(id, "1", "text", "doc one");
+    indexr(id, "2", "text", "doc two");
+    indexr(id, "3", "text", "doc three");
+    indexr(id, "4", "text", "doc four");
+    indexr(id, "5", "text", "doc five");
+
+    commit();  // try to ensure there's more than one segment
+
+    indexr(id, "6", "text", "doc six");
+    indexr(id, "7", "text", "doc seven");
+    indexr(id, "8", "text", "doc eight");
+    indexr(id, "9", "text", "doc nine");
+    indexr(id, "10", "text", "doc ten");
+
+    commit();
+
+    handle.clear();
+    handle.put("QTime", SKIPVAL);
+    handle.put("timestamp", SKIPVAL);
+    handle.put("maxScore", SKIPVAL);
+    handle.put("_version_", SKIPVAL);
+  }
+
+  private void testAuth() throws Exception {
+    QueryResponse rsp = query("q","text:doc", "fl", "id,text", "sort", "id asc");
+    assertEquals(10, rsp.getResults().getNumFound());
+
+    // Enable authentication
+    for (JettySolrRunner j : jettys) {
+      writeSecurityJson(j.getCoreContainer());
+    }
+
+    HttpSolrClient.RemoteSolrException expected = expectThrows(HttpSolrClient.RemoteSolrException.class, () -> {
+      query("q","text:doc-fail", "fl", "id,text", "sort", "id asc");
+    });
+    assertEquals(401, expected.code());
+
+    // Add auth
+    ModifiableSolrParams params = new ModifiableSolrParams();
+    params.add("q", "text:doc").add("fl", "id,text").add("sort", "id asc");
+    QueryRequest req = new QueryRequest(params);
+    req.setBasicAuthCredentials("solr", "SolrRocks");
+    rsp = req.process(clients.get(0), null);
+    if (jettys.size() > 1) {
+      assertTrue(rsp.getResults().getNumFound() < 10);
+      rsp = query(true, params, "solr", "SolrRocks");
+    }
+    assertEquals(10, rsp.getResults().getNumFound());
+
+    // Disable auth
+    for (JettySolrRunner j : jettys) {
+      deleteSecurityJson(j.getCoreContainer());
+    }
+
+  }
+
+  private void deleteSecurityJson(CoreContainer coreContainer) throws IOException {
+    securityConfHandler = new SecurityConfHandlerLocalForTesting(coreContainer);
+    Files.delete(Paths.get(coreContainer.getSolrHome()).resolve("security.json"));
+    coreContainer.securityNodeChanged();
+  }
+
+  private void writeSecurityJson(CoreContainer coreContainer) throws IOException {
+    securityConfHandler = new SecurityConfHandlerLocalForTesting(coreContainer);
+    securityConfHandler.persistConf(new SecurityConfHandler.SecurityConfig()
+        .setData(Utils.fromJSONString(ALL_CONF.replaceAll("'", "\""))));
+    coreContainer.securityNodeChanged();
+  }
+
+  protected static final String ALL_CONF = "{\n" +
+      "  'authentication':{\n" +
+      "    'blockUnknown':true,\n" +
+      "    'class':'solr.BasicAuthPlugin',\n" +
+      "    'credentials':{'solr':'orwp2Ghgj39lmnrZOTm7Qtre1VqHFDfwAEzr0ApbN3Y= Ju5osoAqOX8iafhWpPP01E5P+sg8tK8tHON7rCYZRRw='}},\n" +
+      "  'authorization':{\n" +
+      "    'class':'solr.RuleBasedAuthorizationPlugin',\n" +
+      "    'user-role':{'solr':'admin'},\n" +
+      "    'permissions':[{'name':'all','role':'admin'}]}}";
+}
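
For readability, the single-quoted ALL_CONF string above corresponds to this
security.json; the credentials hash is the test's canned entry for user
"solr" with password "SolrRocks" (the pair the test passes to
setBasicAuthCredentials):

    {
      "authentication": {
        "blockUnknown": true,
        "class": "solr.BasicAuthPlugin",
        "credentials": {"solr": "orwp2Ghgj39lmnrZOTm7Qtre1VqHFDfwAEzr0ApbN3Y= Ju5osoAqOX8iafhWpPP01E5P+sg8tK8tHON7rCYZRRw="}},
      "authorization": {
        "class": "solr.RuleBasedAuthorizationPlugin",
        "user-role": {"solr": "admin"},
        "permissions": [{"name": "all", "role": "admin"}]}
    }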

http://git-wip-us.apache.org/repos/asf/lucene-solr/blob/95d6fc25/solr/core/src/test/org/apache/solr/security/TestPKIAuthenticationPlugin.java
----------------------------------------------------------------------
diff --git a/solr/core/src/test/org/apache/solr/security/TestPKIAuthenticationPlugin.java b/solr/core/src/test/org/apache/solr/security/TestPKIAuthenticationPlugin.java
index a5a279f..90c5bd2 100644
--- a/solr/core/src/test/org/apache/solr/security/TestPKIAuthenticationPlugin.java
+++ b/solr/core/src/test/org/apache/solr/security/TestPKIAuthenticationPlugin.java
@@ -35,7 +35,11 @@ import org.apache.solr.request.LocalSolrQueryRequest;
 import org.apache.solr.request.SolrRequestInfo;
 import org.apache.solr.response.SolrQueryResponse;
 import org.apache.solr.util.CryptoKeys;
-import static org.mockito.Mockito.*;
+import org.junit.Test;
+
+import static org.mockito.Mockito.any;
+import static org.mockito.Mockito.mock;
+import static org.mockito.Mockito.when;
 
 public class TestPKIAuthenticationPlugin extends SolrTestCaseJ4 {
 
@@ -141,10 +145,38 @@ public class TestPKIAuthenticationPlugin extends SolrTestCaseJ4 {
     mock1.doAuthenticate(mockReq, null,filterChain );
     assertNotNull(wrappedRequestByFilter.get());
     assertEquals("$", ((HttpServletRequest) wrappedRequestByFilter.get()).getUserPrincipal().getName());
+  }
 
+  @Test
+  public void testGetBaseUrlForNodeNameLocal() {
+    synchronized (this) {
+      final MockPKIAuthenticationPlugin mock = new MockPKIAuthenticationPlugin(null, "myName");
+      System.clearProperty("solr.jetty.keystore");
+      assertEquals("http://my.host:9876/solr2", mock.getBaseUrlForNodeNameLocal("my.host:9876_solr2"));
+      System.setProperty("solr.jetty.keystore", "foo");
+      assertEquals("https://my.host:9876/solr2", mock.getBaseUrlForNodeNameLocal("my.host:9876_solr2"));
+      System.clearProperty("solr.jetty.keystore");
+    }
+  }
 
-
-
+  @Test
+  public void testResolveUrlScheme() {
+    synchronized (this) {
+      System.clearProperty("urlScheme");
+      System.clearProperty("solr.jetty.keystore");
+      assertEquals("http", MockPKIAuthenticationPlugin.resolveUrlScheme());
+      System.setProperty("urlScheme", "http");
+      assertEquals("http", MockPKIAuthenticationPlugin.resolveUrlScheme());
+      System.setProperty("urlScheme", "https");
+      assertEquals("https", MockPKIAuthenticationPlugin.resolveUrlScheme());
+      System.setProperty("urlScheme", "ftp");
+      System.clearProperty("solr.jetty.keystore");
+      assertEquals("http", MockPKIAuthenticationPlugin.resolveUrlScheme());
+      System.setProperty("solr.jetty.keystore", "foo");
+      assertEquals("https", MockPKIAuthenticationPlugin.resolveUrlScheme());
+      System.clearProperty("urlScheme");
+      System.clearProperty("solr.jetty.keystore");
+    }
   }
 
   private HttpServletRequest createMockRequest(final AtomicReference<Header> header) {

http://git-wip-us.apache.org/repos/asf/lucene-solr/blob/95d6fc25/solr/test-framework/src/java/org/apache/solr/BaseDistributedSearchTestCase.java
----------------------------------------------------------------------
diff --git a/solr/test-framework/src/java/org/apache/solr/BaseDistributedSearchTestCase.java b/solr/test-framework/src/java/org/apache/solr/BaseDistributedSearchTestCase.java
index 8c6eb60..bbfc048 100644
--- a/solr/test-framework/src/java/org/apache/solr/BaseDistributedSearchTestCase.java
+++ b/solr/test-framework/src/java/org/apache/solr/BaseDistributedSearchTestCase.java
@@ -50,6 +50,7 @@ import org.apache.solr.client.solrj.SolrServerException;
 import org.apache.solr.client.solrj.embedded.JettyConfig;
 import org.apache.solr.client.solrj.embedded.JettySolrRunner;
 import org.apache.solr.client.solrj.impl.HttpSolrClient;
+import org.apache.solr.client.solrj.request.QueryRequest;
 import org.apache.solr.client.solrj.request.UpdateRequest;
 import org.apache.solr.client.solrj.response.QueryResponse;
 import org.apache.solr.client.solrj.response.UpdateResponse;
@@ -558,6 +559,12 @@ public abstract class BaseDistributedSearchTestCase extends SolrTestCaseJ4 {
     return rsp;
   }
 
+  protected QueryResponse queryServer(QueryRequest req) throws IOException, SolrServerException {
+    int which = r.nextInt(clients.size());
+    SolrClient client = clients.get(which);
+    return req.process(client, null);
+  }
+
   /**
    * Sets distributed params.
    * Returns the QueryResponse from {@link #queryServer},
@@ -591,18 +598,31 @@ public abstract class BaseDistributedSearchTestCase extends SolrTestCaseJ4 {
    * Returns the QueryResponse from {@link #queryServer}  
    */
   protected QueryResponse query(boolean setDistribParams, SolrParams p) throws Exception {
+    return query(setDistribParams, p, null, null);
+  }
+
+  /**
+   * Returns the QueryResponse from {@link #queryServer}
+   * @param setDistribParams whether to do a distributed request
+   * @param user basic auth username (set to null if not in use)
+   * @param pass basic auth password (set to null if not in use)
+   * @return the query response
+   */
+  protected QueryResponse query(boolean setDistribParams, SolrParams p, String user, String pass) throws Exception {
     
     final ModifiableSolrParams params = new ModifiableSolrParams(p);
 
     // TODO: look into why passing true causes failures
     params.set("distrib", "false");
-    final QueryResponse controlRsp = controlClient.query(params);
+    QueryRequest req = generateQueryRequest(params, user, pass);
+    final QueryResponse controlRsp = req.process(controlClient, null);
     validateControlData(controlRsp);
 
     params.remove("distrib");
     if (setDistribParams) setDistributedParams(params);
+    req = generateQueryRequest(params, user, pass);
 
-    QueryResponse rsp = queryServer(params);
+    QueryResponse rsp = queryServer(req);
 
     compareResponses(rsp, controlRsp);
 
@@ -617,7 +637,8 @@ public abstract class BaseDistributedSearchTestCase extends SolrTestCaseJ4 {
               int which = r.nextInt(clients.size());
               SolrClient client = clients.get(which);
               try {
-                QueryResponse rsp = client.query(new ModifiableSolrParams(params));
+                QueryRequest qreq = generateQueryRequest(new ModifiableSolrParams(params), user, pass);
+                QueryResponse rsp = qreq.process(client, null);
                 if (verifyStress) {
                   compareResponses(rsp, controlRsp);
                 }
@@ -636,7 +657,15 @@ public abstract class BaseDistributedSearchTestCase extends SolrTestCaseJ4 {
     }
     return rsp;
   }
-  
+
+  private QueryRequest generateQueryRequest(ModifiableSolrParams params, String user, String pass) {
+    QueryRequest req = new QueryRequest(params);
+    if (user != null && pass != null) {
+      req.setBasicAuthCredentials(user, pass);
+    }
+    return req;
+  }
+
   public QueryResponse queryAndCompare(SolrParams params, SolrClient... clients) throws SolrServerException, IOException {
     return queryAndCompare(params, Arrays.<SolrClient>asList(clients));
   }
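
With the new overload, distributed tests can run authenticated queries
through the base-class helper directly. A minimal usage sketch (values as in
BasicAuthDistributedTest above; passing null for both credentials keeps the
old unauthenticated behaviour):

    ModifiableSolrParams params = new ModifiableSolrParams();
    params.add("q", "text:doc").add("fl", "id,text").add("sort", "id asc");
    QueryResponse rsp = query(true, params, "solr", "SolrRocks");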


[32/50] [abbrv] lucene-solr:jira/solr-9858: SOLR-7453: Remove replication & backup scripts in the solr/scripts directory of the checkout

Posted by ab...@apache.org.
SOLR-7453: Remove replication & backup scripts in the solr/scripts directory of the checkout


Project: http://git-wip-us.apache.org/repos/asf/lucene-solr/repo
Commit: http://git-wip-us.apache.org/repos/asf/lucene-solr/commit/0f5875b7
Tree: http://git-wip-us.apache.org/repos/asf/lucene-solr/tree/0f5875b7
Diff: http://git-wip-us.apache.org/repos/asf/lucene-solr/diff/0f5875b7

Branch: refs/heads/jira/solr-9858
Commit: 0f5875b735d889ad41f22315b00ba5451ac9ad1a
Parents: 0c1fde6
Author: Varun Thacker <va...@apache.org>
Authored: Mon Feb 27 17:40:57 2017 -0800
Committer: Varun Thacker <va...@apache.org>
Committed: Mon Feb 27 17:40:57 2017 -0800

----------------------------------------------------------------------
 solr/CHANGES.txt                |   2 +
 solr/scripts/README.txt         |  13 --
 solr/scripts/abc                | 159 ---------------------
 solr/scripts/abo                | 158 ---------------------
 solr/scripts/backup             | 109 ---------------
 solr/scripts/backupcleaner      | 134 ------------------
 solr/scripts/commit             | 109 ---------------
 solr/scripts/optimize           | 109 ---------------
 solr/scripts/rsyncd-disable     |  77 -----------
 solr/scripts/rsyncd-enable      |  76 ----------
 solr/scripts/rsyncd-start       | 147 --------------------
 solr/scripts/rsyncd-stop        | 105 --------------
 solr/scripts/scripts-util       | 141 -------------------
 solr/scripts/snapcleaner        | 146 --------------------
 solr/scripts/snapinstaller      | 190 -------------------------
 solr/scripts/snappuller         | 261 -----------------------------------
 solr/scripts/snappuller-disable |  77 -----------
 solr/scripts/snappuller-enable  |  77 -----------
 solr/scripts/snapshooter        | 128 -----------------
 19 files changed, 2 insertions(+), 2216 deletions(-)
----------------------------------------------------------------------


http://git-wip-us.apache.org/repos/asf/lucene-solr/blob/0f5875b7/solr/CHANGES.txt
----------------------------------------------------------------------
diff --git a/solr/CHANGES.txt b/solr/CHANGES.txt
index 99a0a42..0f1cac5 100644
--- a/solr/CHANGES.txt
+++ b/solr/CHANGES.txt
@@ -252,6 +252,8 @@ Other Changes
 * SOLR-9450: The docs/ folder in the binary distribution now contains a single index.html file linking
   to the online documentation, reducing the size of the download (janhoy, Shawn Heisey, Uwe Schindler)
 
+* SOLR-7453: Remove replication & backup scripts in the solr/scripts directory of the checkout (Varun Thacker)
+
 ==================  6.4.2 ==================
 
 Consult the LUCENE_CHANGES.txt file for additional, low level, changes in this release.

http://git-wip-us.apache.org/repos/asf/lucene-solr/blob/0f5875b7/solr/scripts/README.txt
----------------------------------------------------------------------
diff --git a/solr/scripts/README.txt b/solr/scripts/README.txt
deleted file mode 100644
index fb61e7b..0000000
--- a/solr/scripts/README.txt
+++ /dev/null
@@ -1,13 +0,0 @@
-This directory contains shell scripts which provided the original
-replication & backup functionality dating back to Solr 1.1.  These
-scripts only work on systems that support removing open hard links and
-were superseded by the ReplicationHandler in Solr 1.4.
-
-These scripts are no longer actively maintained, improved, or tested,
-but they have been left in the source tree for use by legacy users who
-are satisfied with their basic functionality. 
-
-For more information on how these scripts can be used, please consult
-the wiki... 
-
-https://wiki.apache.org/solr/CollectionDistribution

http://git-wip-us.apache.org/repos/asf/lucene-solr/blob/0f5875b7/solr/scripts/abc
----------------------------------------------------------------------
diff --git a/solr/scripts/abc b/solr/scripts/abc
deleted file mode 100755
index 1031ef1..0000000
--- a/solr/scripts/abc
+++ /dev/null
@@ -1,159 +0,0 @@
-#!/bin/bash
-#
-# Licensed to the Apache Software Foundation (ASF) under one or more
-# contributor license agreements.  See the NOTICE file distributed with
-# this work for additional information regarding copyright ownership.
-# The ASF licenses this file to You under the Apache License, Version 2.0
-# (the "License"); you may not use this file except in compliance with
-# the License.  You may obtain a copy of the License at
-#
-#     http://www.apache.org/licenses/LICENSE-2.0
-#
-# Unless required by applicable law or agreed to in writing, software
-# distributed under the License is distributed on an "AS IS" BASIS,
-# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
-# See the License for the specific language governing permissions and
-# limitations under the License.
-#
-# Shell script to make an Atomic Backup after Commit of
-# a Solr Lucene collection.
-
-orig_dir=$(pwd)
-cd ${0%/*}/..
-solr_root=$(pwd)
-cd ${orig_dir}
-curl_url=""
-
-unset solr_hostname solr_port data_dir webapp_name user verbose debug solr_url
-. ${solr_root}/bin/scripts-util
-
-# set up variables
-prog=${0##*/}
-log=${solr_root}/logs/${prog}.log
-
-# define usage string
-USAGE="\
-usage: $prog [-h hostname] [-p port] [-d dir] [-w webapp_name] [-u username] [-U url] [-v] [-V]
-       -h          specify Solr hostname (defaults to localhost)
-       -p          specify Solr port number
-       -w          specify name of Solr webapp (defaults to solr)
-       -u          specify user to sudo to before running script
-       -U          specify full update url (overrides -h,-p,-w parameters)
-       -d          specify directory holding index data (defaults to data)
-       -v          increase verbosity
-       -V          output debugging info
-"
-
-# parse args
-while getopts h:p:d:w:u:U:vV OPTION
-do
-    case $OPTION in
-    h)
-        solr_hostname="$OPTARG"
-        ;;
-    p)
-        solr_port="$OPTARG"
-        ;;
-    d)
-        data_dir="$OPTARG"
-        ;;
-    w)
-        webapp_name="$OPTARG"
-        ;;
-    u)
-        user="$OPTARG"
-        ;;
-    U)
-        solr_url="$OPTARG"
-        ;;
-    v)
-        verbose="v"
-        ;;
-    V)
-        debug="V"
-        ;;
-    *)
-        echo "$USAGE"
-        exit 1
-    esac
-done
-
-[[ -n $debug ]] && set -x
-
-dataDir
-
-curlUrl
-
-fixUser "$@"
-
-setStartTime
-
-logMessage started by $oldwhoami
-logMessage command: $0 $@
-
-logMessage sending commit to Solr server at ${curl_url}
-rs=`curl ${curl_url} -s -H 'Content-type:text/xml; charset=utf-8' -d "<commit/>"`
-if [[ $? != 0 ]]
-then
-  logMessage failed to connect to Solr server at ${curl_url}
-  logMessage commit failed
-  logExit failed 1
-fi
-
-# check status of commit request - original format
-echo $rs | grep '<result.*status="0"' > /dev/null 2>&1
-if [[ $? != 0 ]]
-then
-# check status of commit request - new format
-  echo $rs | grep '<lst name="responseHeader"><int name="status">0</int>' > /dev/null 2>&1
-  if [[ $? != 0 ]]
-  then
-    logMessage commit request to Solr at ${curl_url} failed:
-    logMessage $rs
-    logExit failed 2
-  fi
-fi
-
-# successful commit creates a snapshot file synchronously
-lastsnap=`find ${data_dir} -type d -name 'snapshot.*' 2>/dev/null| sort -r | head -1`
-
-if [[ $lastsnap == "" ]]
-then
-  logMessage commit did not create snapshot at ${curl_url}, backup failed:
-  logExit failed 3
-fi
-
-name=backup.${lastsnap##*snapshot.}
-temp=temp-${name}
-
-if [[ -d ${data_dir}/${name} ]]
-then
-    logMessage backup directory ${data_dir}/${name} already exists
-    logExit aborted 1
-fi
-
-if [[ -d ${data_dir}/${temp} ]]
-then
-    logMessage backingup of ${data_dir}/${name} in progress
-    logExit aborted 1
-fi
-logMessage making backup ${data_dir}/${name}
-
-# clean up after INT/TERM
-trap 'echo cleaning up, please wait ...;/bin/rm -rf ${data_dir}/${name} ${data_dir}/${temp};logExit aborted 13' INT TERM
-
-# make a backup using hard links into temporary location
-# then move it into place atomically
-if [[ "${OS}" == "SunOS" || "${OS}" == "Darwin" || "${OS}" == "FreeBSD" ]]
-then
-  orig_dir=$(pwd)
-  mkdir ${data_dir}/${temp}
-  cd ${lastsnap}
-  find . -print|cpio -pdlmu ${data_dir}/${temp} 1>/dev/null 2>&1
-  cd ${orig_dir}
-else
-  cp -lr ${lastsnap} ${data_dir}/${temp}
-fi
-mv ${data_dir}/${temp} ${data_dir}/${name}
-
-logExit ended 0

http://git-wip-us.apache.org/repos/asf/lucene-solr/blob/0f5875b7/solr/scripts/abo
----------------------------------------------------------------------
diff --git a/solr/scripts/abo b/solr/scripts/abo
deleted file mode 100755
index 144ae8d..0000000
--- a/solr/scripts/abo
+++ /dev/null
@@ -1,158 +0,0 @@
-#!/bin/bash
-#
-# Licensed to the Apache Software Foundation (ASF) under one or more
-# contributor license agreements.  See the NOTICE file distributed with
-# this work for additional information regarding copyright ownership.
-# The ASF licenses this file to You under the Apache License, Version 2.0
-# (the "License"); you may not use this file except in compliance with
-# the License.  You may obtain a copy of the License at
-#
-#     http://www.apache.org/licenses/LICENSE-2.0
-#
-# Unless required by applicable law or agreed to in writing, software
-# distributed under the License is distributed on an "AS IS" BASIS,
-# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
-# See the License for the specific language governing permissions and
-# limitations under the License.
-#
-# Shell script to make an Atomic Backup after Optimize of
-# a Solr Lucene collection.
-
-orig_dir=$(pwd)
-cd ${0%/*}/..
-solr_root=$(pwd)
-cd ${orig_dir}
-
-unset solr_hostname solr_port data_dir webapp_name user verbose debug solr_url
-. ${solr_root}/bin/scripts-util
-
-# set up variables
-prog=${0##*/}
-log=${solr_root}/logs/${prog}.log
-
-# define usage string
-USAGE="\
-usage: $prog [-h hostname] [-p port] [-d dir] [-w webapp_name] [-u username] [-U url] [-v] [-V]
-       -h          specify Solr hostname (defaults to localhost)
-       -p          specify Solr port number
-       -w          specify name of Solr webapp (defaults to solr)
-       -u          specify user to sudo to before running script
-       -U          specify full update url (overrides -h,-p,-w parameters)
-       -d          specify directory holding index data (defaults to data)
-       -v          increase verbosity
-       -V          output debugging info
-"
-
-# parse args
-while getopts h:p:d:w:u:U:vV OPTION
-do
-    case $OPTION in
-    h)
-        solr_hostname="$OPTARG"
-        ;;
-    p)
-        solr_port="$OPTARG"
-        ;;
-    d)
-        data_dir="$OPTARG"
-        ;;
-    w)
-        webapp_name="$OPTARG"
-        ;;
-    u)
-        user="$OPTARG"
-        ;;
-    U)
-        solr_url="$OPTARG"
-        ;;
-    v)
-        verbose="v"
-        ;;
-    V)
-        debug="V"
-        ;;
-    *)
-        echo "$USAGE"
-        exit 1
-    esac
-done
-
-[[ -n $debug ]] && set -x
-
-dataDir
-
-curlUrl
-
-fixUser "$@"
-
-setStartTime
-
-logMessage started by $oldwhoami
-logMessage command: $0 $@
-
-logMessage sending optimize to Solr server at ${curl_url}
-rs=`curl ${curl_url} -s -H 'Content-type:text/xml; charset=utf-8' -d "<optimize/>"`
-if [[ $? != 0 ]]
-then
-  logMessage failed to connect to Solr server at ${curl_url}
-  logMessage optimize failed
-  logExit failed 1
-fi
-
-# check status of optimize request - original format
-echo $rs | grep '<result.*status="0"' > /dev/null 2>&1
-if [[ $? != 0 ]]
-then
-# check status of optimize request - new format
-  echo $rs | grep '<lst name="responseHeader"><int name="status">0</int>' > /dev/null 2>&1
-  if [[ $? != 0 ]]
-  then
-    logMessage optimize request to Solr at ${curl_url} failed:
-    logMessage $rs
-    logExit failed 2
-  fi
-fi
-
-# successful optimize creates a snapshot file synchronously
-lastsnap=`find ${data_dir} -type d -name 'snapshot.*' 2>/dev/null| sort -r | head -1`
-
-if [[ $lastsnap == "" ]]
-then
-  logMessage commit did not create snapshot at ${curl_url}, backup failed:
-  logExit failed 3
-fi
-
-name=backup.${lastsnap##*snapshot.}
-temp=temp-${name}
-
-if [[ -d ${data_dir}/${name} ]]
-then
-    logMessage backup directory ${data_dir}/${name} already exists
-    logExit aborted 1
-fi
-
-if [[ -d ${data_dir}/${temp} ]]
-then
-    logMessage backingup of ${data_dir}/${name} in progress
-    logExit aborted 1
-fi
-logMessage making backup ${data_dir}/${name}
-
-# clean up after INT/TERM
-trap 'echo cleaning up, please wait ...;/bin/rm -rf ${data_dir}/${name} ${data_dir}/${temp};logExit aborted 13' INT TERM
-
-# make a backup using hard links into temporary location
-# then move it into place atomically
-if [[ "${OS}" == "SunOS" || "${OS}" == "Darwin" || "${OS}" == "FreeBSD" ]]
-then
-  orig_dir=$(pwd)
-  mkdir ${data_dir}/${temp}
-  cd ${lastsnap}
-  find . -print|cpio -pdlmu ${data_dir}/${temp} 1>/dev/null 2>&1
-  cd ${orig_dir}
-else
-  cp -lr ${lastsnap} ${data_dir}/${temp}
-fi
-mv ${data_dir}/${temp} ${data_dir}/${name}
-
-logExit ended 0

http://git-wip-us.apache.org/repos/asf/lucene-solr/blob/0f5875b7/solr/scripts/backup
----------------------------------------------------------------------
diff --git a/solr/scripts/backup b/solr/scripts/backup
deleted file mode 100755
index 633d7ac..0000000
--- a/solr/scripts/backup
+++ /dev/null
@@ -1,109 +0,0 @@
-#!/bin/bash
-#
-# Licensed to the Apache Software Foundation (ASF) under one or more
-# contributor license agreements.  See the NOTICE file distributed with
-# this work for additional information regarding copyright ownership.
-# The ASF licenses this file to You under the Apache License, Version 2.0
-# (the "License"); you may not use this file except in compliance with
-# the License.  You may obtain a copy of the License at
-#
-#     http://www.apache.org/licenses/LICENSE-2.0
-#
-# Unless required by applicable law or agreed to in writing, software
-# distributed under the License is distributed on an "AS IS" BASIS,
-# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
-# See the License for the specific language governing permissions and
-# limitations under the License.
-#
-# Shell script to make a backup of a Solr Lucene collection.
-
-orig_dir=$(pwd)
-cd ${0%/*}/..
-solr_root=$(pwd)
-cd ${orig_dir}
-
-unset data_dir user verbose debug
-. ${solr_root}/bin/scripts-util
-
-# set up variables
-prog=${0##*/}
-log=${solr_root}/logs/${prog}.log
-
-# define usage string
-USAGE="\
-usage: $prog [-d dir] [-u username] [-v] [-V]
-       -d          specify directory holding index data
-       -u          specify user to sudo to before running script
-       -v          increase verbosity
-       -V          output debugging info
-"
-
-# parse args
-while getopts d:u:vV OPTION
-do
-    case $OPTION in
-    d)
-        data_dir="$OPTARG"
-        ;;
-    u)
-        user="$OPTARG"
-        ;;
-    v)
-        verbose="v"
-        ;;
-    V)
-        debug="V"
-        ;;
-    *)
-        echo "$USAGE"
-        exit 1
-    esac
-done
-
-[[ -n $debug ]] && set -x
-
-dataDir
-
-fixUser "$@"
-
-setStartTime
-
-logMessage started by $oldwhoami
-logMessage command: $0 $@
-
-name=backup.`date +"%Y%m%d%H%M%S"`
-temp=temp-${name}
-
-if [[ -d ${data_dir}/${name} ]]
-then
-    logMessage backup directory ${data_dir}/${name} already exists
-    logExit aborted 1
-fi
-
-if [[ -d ${data_dir}/${temp} ]]
-then
-    logMessage backingup of ${data_dir}/${name} in progress
-    logExit aborted 1
-fi
-
-# clean up after INT/TERM
-trap 'echo cleaning up, please wait ...;/bin/rm -rf ${data_dir}/${name} ${data_dir}/${temp};logExit aborted 13' INT TERM
-
-logMessage making backup ${data_dir}/${name}
-
-# make a backup using hard links into temporary location
-# then move it into place atomically
-if [[ "${OS}" == "SunOS" || "${OS}" == "Darwin" || "${OS}" == "FreeBSD" ]]
-then
-  orig_dir=$(pwd)
-  mkdir ${data_dir}/${temp}
-  cd ${data_dir}/index
-  find . -print|cpio -pdlmu ${data_dir}/${temp} 1>/dev/null 2>&1
-  cd ${orig_dir}
-else
-  cp -lr ${data_dir}/index ${data_dir}/${temp}
-fi
-mv ${data_dir}/${temp} ${data_dir}/${name}
-
-logExit ended 0
-

http://git-wip-us.apache.org/repos/asf/lucene-solr/blob/0f5875b7/solr/scripts/backupcleaner
----------------------------------------------------------------------
diff --git a/solr/scripts/backupcleaner b/solr/scripts/backupcleaner
deleted file mode 100755
index 360e734..0000000
--- a/solr/scripts/backupcleaner
+++ /dev/null
@@ -1,134 +0,0 @@
-#!/bin/bash
-#
-# Licensed to the Apache Software Foundation (ASF) under one or more
-# contributor license agreements.  See the NOTICE file distributed with
-# this work for additional information regarding copyright ownership.
-# The ASF licenses this file to You under the Apache License, Version 2.0
-# (the "License"); you may not use this file except in compliance with
-# the License.  You may obtain a copy of the License at
-#
-#     http://www.apache.org/licenses/LICENSE-2.0
-#
-# Unless required by applicable law or agreed to in writing, software
-# distributed under the License is distributed on an "AS IS" BASIS,
-# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
-# See the License for the specific language governing permissions and
-# limitations under the License.
-#
-# Shell script to clean up backups of a Solr Lucene collection.
-
-orig_dir=$(pwd)
-cd ${0%/*}/..
-solr_root=$(pwd)
-cd ${orig_dir}
-
-unset days num data_dir user verbose debug
-. ${solr_root}/bin/scripts-util
-
-# set up variables
-prog=${0##*/}
-log=${solr_root}/logs/${prog}.log
-
-# define usage string
-USAGE="\
-usage: $prog -D <days> | -N <num> [-d dir] [-u username] [-v] [-V]
-       -D <days>   cleanup backups more than <days> days old
-       -N <num>    keep the most recent <num> number of backups and
-                   cleanup up the remaining ones that are not being pulled
-       -d          specify directory holding index data
-       -u          specify user to sudo to before running script
-       -v          increase verbosity
-       -V          output debugging info
-"
-
-# parse args
-while getopts D:N:d:u:vV OPTION
-do
-    case $OPTION in
-    D)
-        days="$OPTARG"
-        ;;
-    N)
-        num="$OPTARG"
-        ;;
-    d)
-        data_dir="$OPTARG"
-        ;;
-    u)
-        user="$OPTARG"
-        ;;
-    v)
-        verbose="v"
-        ;;
-    V)
-        debug="V"
-        ;;
-    *)
-        echo "$USAGE"
-        exit 1
-    esac
-done
-
-[[ -n $debug ]] && set -x
-
-if [[ -z ${days} && -z ${num} ]]
-then
-    echo "$USAGE"
-    exit 1
-fi
-
-fixUser "$@"
-
-dataDir
-
-function remove
-{
-    logMessage removing backup $1
-    /bin/rm -rf $1
-}
-
-setStartTime
-
-logMessage started by $oldwhoami
-logMessage command: $0 $@
-
-# trap control-c
-trap 'echo "caught INT/TERM, exiting now but partial cleanup may have already occured";logExit aborted 13' INT TERM
-
-if [[ -n ${days} ]]
-then
-    #is maxdepth supported?
-    find ${data_dir} -maxdepth 0 -name foobar >/dev/null 2>&1
-    if [ $? = 0 ]; then
-      maxdepth="-maxdepth 1"
-    else
-      unset maxdepth
-    fi
-  
-    logMessage cleaning up backups more than ${days} days old
-    for i in `find ${data_dir} ${maxdepth} -name 'backup.*' -mtime +${days} -print`
-    do
-        remove $i
-    done
-elif [[ -n ${num} ]]
-then
-    logMessage cleaning up all backups except for the most recent ${num} ones
-    unset backups count
-    backups=`find ${data_dir} -type d -name 'backup.*' 2>/dev/null| sort -r`
-    if [[ $? == 0 ]]
-    then
-        count=`echo $backups|wc -w`
-        startpos=`expr $num + 1`
-        if [[ $count -gt $num ]]
-        then
-            for i in `echo $backups|cut -f${startpos}- -d" "`
-            do
-	        remove $i
-	    done
-        fi
-    fi
-fi
-
-logExit ended 0
-
-

http://git-wip-us.apache.org/repos/asf/lucene-solr/blob/0f5875b7/solr/scripts/commit
----------------------------------------------------------------------
diff --git a/solr/scripts/commit b/solr/scripts/commit
deleted file mode 100755
index c73c042..0000000
--- a/solr/scripts/commit
+++ /dev/null
@@ -1,109 +0,0 @@
-#!/bin/bash
-#
-# Licensed to the Apache Software Foundation (ASF) under one or more
-# contributor license agreements.  See the NOTICE file distributed with
-# this work for additional information regarding copyright ownership.
-# The ASF licenses this file to You under the Apache License, Version 2.0
-# (the "License"); you may not use this file except in compliance with
-# the License.  You may obtain a copy of the License at
-#
-#     http://www.apache.org/licenses/LICENSE-2.0
-#
-# Unless required by applicable law or agreed to in writing, software
-# distributed under the License is distributed on an "AS IS" BASIS,
-# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
-# See the License for the specific language governing permissions and
-# limitations under the License.
-#
-# Shell script to force a commit of all changes since last commit
-# for a Solr server
-
-orig_dir=$(pwd)
-cd ${0%/*}/..
-solr_root=$(pwd)
-cd ${orig_dir}
-
-unset solr_hostname solr_port webapp_name user verbose debug solr_url
-. ${solr_root}/bin/scripts-util
-
-# set up variables
-prog=${0##*/}
-log=${solr_root}/logs/${prog}.log
-
-# define usage string
-USAGE="\
-usage: $prog [-h hostname] [-p port] [-w webapp_name] [-u username] [-U url] [-v] [-V]
-       -h          specify Solr hostname (defaults to localhost)
-       -p          specify Solr port number
-       -w          specify name of Solr webapp (defaults to solr)
-       -u          specify user to sudo to before running script
-       -U          specify full update url (overrides -h,-p,-w parameters)
-       -v          increase verbosity
-       -V          output debugging info
-"
-
-# parse args
-while getopts h:p:w:u:U:vV OPTION
-do
-    case $OPTION in
-    h)
-        solr_hostname="$OPTARG"
-        ;;
-    p)
-        solr_port="$OPTARG"
-        ;;
-    w)
-        webapp_name="$OPTARG"
-        ;;
-    u)
-        user="$OPTARG"
-        ;;
-    U)
-        solr_url="$OPTARG"
-        ;;
-    v)
-        verbose="v"
-        ;;
-    V)
-        debug="V"
-        ;;
-    *)
-        echo "$USAGE"
-        exit 1
-    esac
-done
-
-[[ -n $debug ]] && set -x
-
-curlUrl
-
-fixUser "$@"
-
-setStartTime
-
-logMessage started by $oldwhoami
-logMessage command: $0 $@
-
-rs=`curl ${curl_url} -s -H 'Content-type:text/xml; charset=utf-8' -d "<commit/>"`
-if [[ $? != 0 ]]
-then
-  logMessage failed to connect to Solr server at ${curl_url}
-  logMessage commit failed
-  logExit failed 1
-fi
-
-# check status of commit request - original format
-echo $rs | grep '<result.*status="0"' > /dev/null 2>&1
-if [[ $? != 0 ]]
-then
-# check status of commit request - new format
-  echo $rs | grep '<lst name="responseHeader"><int name="status">0</int>' > /dev/null 2>&1
-  if [[ $? != 0 ]]
-  then
-    logMessage commit request to Solr at ${curl_url} failed:
-    logMessage $rs
-    logExit failed 2
-  fi
-fi
-
-logExit ended 0

http://git-wip-us.apache.org/repos/asf/lucene-solr/blob/0f5875b7/solr/scripts/optimize
----------------------------------------------------------------------
diff --git a/solr/scripts/optimize b/solr/scripts/optimize
deleted file mode 100755
index 6620607..0000000
--- a/solr/scripts/optimize
+++ /dev/null
@@ -1,109 +0,0 @@
-#!/bin/bash
-#
-# Licensed to the Apache Software Foundation (ASF) under one or more
-# contributor license agreements.  See the NOTICE file distributed with
-# this work for additional information regarding copyright ownership.
-# The ASF licenses this file to You under the Apache License, Version 2.0
-# (the "License"); you may not use this file except in compliance with
-# the License.  You may obtain a copy of the License at
-#
-#     http://www.apache.org/licenses/LICENSE-2.0
-#
-# Unless required by applicable law or agreed to in writing, software
-# distributed under the License is distributed on an "AS IS" BASIS,
-# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
-# See the License for the specific language governing permissions and
-# limitations under the License.
-#
-# Shell script to force an optimized commit of all changes since last commit
-# for a Solr server
-
-orig_dir=$(pwd)
-cd ${0%/*}/..
-solr_root=$(pwd)
-cd ${orig_dir}
-
-unset solr_hostname solr_port webapp_name user verbose debug solr_url
-. ${solr_root}/bin/scripts-util
-
-# set up variables
-prog=${0##*/}
-log=${solr_root}/logs/${prog}.log
-
-# define usage string
-USAGE="\
-usage: $prog [-h hostname] [-p port] [-w webapp_name] [-u username] [-U url] [-v] [-V]
-       -h          specify Solr hostname (defaults to localhost)
-       -p          specify Solr port number
-       -w          specify name of Solr webapp (defaults to solr)
-       -u          specify user to sudo to before running script
-       -U          specify full update url (overrides -h,-p,-w parameters)
-       -v          increase verbosity
-       -V          output debugging info
-"
-
-# parse args
-while getopts h:p:w:u:U:vV OPTION
-do
-    case $OPTION in
-    h)
-        solr_hostname="$OPTARG"
-        ;;
-    p)
-        solr_port="$OPTARG"
-        ;;
-    w)
-        webapp_name="$OPTARG"
-        ;;
-    u)
-        user="$OPTARG"
-        ;;
-    U)
-        solr_url="$OPTARG"
-        ;;
-    v)
-        verbose="v"
-        ;;
-    V)
-        debug="V"
-        ;;
-    *)
-        echo "$USAGE"
-        exit 1
-    esac
-done
-
-[[ -n $debug ]] && set -x
-
-curlUrl
-
-fixUser "$@"
-
-setStartTime
-
-logMessage started by $oldwhoami
-logMessage command: $0 $@
-
-rs=`curl ${curl_url} -s -H 'Content-type:text/xml; charset=utf-8' -d "<optimize/>"`
-if [[ $? != 0 ]]
-then
-  logMessage failed to connect to Solr server at ${curl_url}
-  logMessage optimize failed
-  logExit failed 1
-fi
-
-# check status of optimize request - original format
-rc=`echo $rs|cut -f2 -d'"'`
-if [[ $rc != 0 ]]
-then
-# check status of optimize request - new format
-  echo $rs | grep '<lst name="responseHeader"><int name="status">0</int>' > /dev/null 2>&1
-  if [[ $? != 0 ]]
-  then
-    logMessage optimize request to Solr at ${curl_url} failed:
-    logMessage $rs
-    logExit failed 2
-  fi
-fi
-
-logExit ended 0

http://git-wip-us.apache.org/repos/asf/lucene-solr/blob/0f5875b7/solr/scripts/rsyncd-disable
----------------------------------------------------------------------
diff --git a/solr/scripts/rsyncd-disable b/solr/scripts/rsyncd-disable
deleted file mode 100755
index 8e6c569..0000000
--- a/solr/scripts/rsyncd-disable
+++ /dev/null
@@ -1,77 +0,0 @@
-#!/bin/bash
-#
-# Licensed to the Apache Software Foundation (ASF) under one or more
-# contributor license agreements.  See the NOTICE file distributed with
-# this work for additional information regarding copyright ownership.
-# The ASF licenses this file to You under the Apache License, Version 2.0
-# (the "License"); you may not use this file except in compliance with
-# the License.  You may obtain a copy of the License at
-#
-#     http://www.apache.org/licenses/LICENSE-2.0
-#
-# Unless required by applicable law or agreed to in writing, software
-# distributed under the License is distributed on an "AS IS" BASIS,
-# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
-# See the License for the specific language governing permissions and
-# limitations under the License.
-#
-# Shell script to disable rsyncd
-
-orig_dir=$(pwd)
-cd ${0%/*}/..
-solr_root=$(pwd)
-cd ${orig_dir}
-
-unset user verbose debug
-. ${solr_root}/bin/scripts-util
-
-# set up variables
-prog=${0##*/}
-log=${solr_root}/logs/rsyncd.log
-
-# define usage string
-USAGE="\
-usage: $prog [-u username] [-v] [-V]
-       -u          specify user to sudo to before running script
-       -v          increase verbosity
-       -V          output debugging info
-"
-
-# parse args
-while getopts u:vV OPTION
-do
-    case $OPTION in
-    u)
-        user="$OPTARG"
-        ;;
-    v)
-        verbose="v"
-        ;;
-    V)
-        debug="V"
-        ;;
-    *)
-        echo "$USAGE"
-        exit 1
-    esac
-done
-
-[[ -n $debug ]] && set -x
-
-fixUser "$@"
-
-setStartTime
-
-logMessage disabled by $oldwhoami
-logMessage command: $0 $@
-name=${solr_root}/logs/rsyncd-enabled
-
-if [[ -f ${name} ]]
-then
-    rm -f ${name}
-else
-    logMessage rsyncd not currently enabled
-    logExit exited 1
-fi
-
-logExit ended 0

http://git-wip-us.apache.org/repos/asf/lucene-solr/blob/0f5875b7/solr/scripts/rsyncd-enable
----------------------------------------------------------------------
diff --git a/solr/scripts/rsyncd-enable b/solr/scripts/rsyncd-enable
deleted file mode 100755
index 075368d..0000000
--- a/solr/scripts/rsyncd-enable
+++ /dev/null
@@ -1,76 +0,0 @@
-#!/bin/bash
-#
-# Licensed to the Apache Software Foundation (ASF) under one or more
-# contributor license agreements.  See the NOTICE file distributed with
-# this work for additional information regarding copyright ownership.
-# The ASF licenses this file to You under the Apache License, Version 2.0
-# (the "License"); you may not use this file except in compliance with
-# the License.  You may obtain a copy of the License at
-#
-#     http://www.apache.org/licenses/LICENSE-2.0
-#
-# Unless required by applicable law or agreed to in writing, software
-# distributed under the License is distributed on an "AS IS" BASIS,
-# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
-# See the License for the specific language governing permissions and
-# limitations under the License.
-#
-# Shell script to enable rsyncd
-
-orig_dir=$(pwd)
-cd ${0%/*}/..
-solr_root=$(pwd)
-cd ${orig_dir}
-
-unset user verbose debug
-. ${solr_root}/bin/scripts-util
-
-# set up variables
-log=${solr_root}/logs/rsyncd.log
-
-# define usage string
-USAGE="\
-usage: $prog [-u username] [-v] [-V]
-       -u          specify user to sudo to before running script
-       -v          increase verbosity
-       -V          output debugging info
-"
-
-# parse args
-while getopts u:vV OPTION
-do
-    case $OPTION in
-    u)
-        user="$OPTARG"
-        ;;
-    v)
-        verbose="v"
-        ;;
-    V)
-        debug="V"
-        ;;
-    *)
-        echo "$USAGE"
-        exit 1
-    esac
-done
-
-[[ -n $debug ]] && set -x
-
-fixUser "$@"
-
-setStartTime
-
-logMessage enabled by $oldwhoami
-logMessage command: $0 $@
-name=${solr_root}/logs/rsyncd-enabled
-
-if [[ -f ${name} ]]
-then
-    logMessage rsyncd already enabled
-    logExit exited 1
-else
-    touch ${name}
-fi
-
-logExit ended 0

http://git-wip-us.apache.org/repos/asf/lucene-solr/blob/0f5875b7/solr/scripts/rsyncd-start
----------------------------------------------------------------------
diff --git a/solr/scripts/rsyncd-start b/solr/scripts/rsyncd-start
deleted file mode 100755
index 08929ce..0000000
--- a/solr/scripts/rsyncd-start
+++ /dev/null
@@ -1,147 +0,0 @@
-#!/bin/bash
-#
-# Licensed to the Apache Software Foundation (ASF) under one or more
-# contributor license agreements.  See the NOTICE file distributed with
-# this work for additional information regarding copyright ownership.
-# The ASF licenses this file to You under the Apache License, Version 2.0
-# (the "License"); you may not use this file except in compliance with
-# the License.  You may obtain a copy of the License at
-#
-#     http://www.apache.org/licenses/LICENSE-2.0
-#
-# Unless required by applicable law or agreed to in writing, software
-# distributed under the License is distributed on an "AS IS" BASIS,
-# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
-# See the License for the specific language governing permissions and
-# limitations under the License.
-#
-# Shell script to start rsyncd on master Solr server
-
-orig_dir=$(pwd)
-cd ${0%/*}/..
-solr_root=$(pwd)
-cd ${orig_dir}
-
-unset data_dir solr_port rsyncd_port user rsyncd_bwlimit verbose debug
-. ${solr_root}/bin/scripts-util
-
-# set up variables
-prog=${0##*/}
-log=${solr_root}/logs/rsyncd.log
-
-# define usage string
-USAGE="\
-usage: $prog [-d dir] [-p portnum] [-u username] [-b kbps] [-v] [-V]
-       -d          specify directory holding index data
-       -p          specify rsyncd port number
-       -u          specify user to sudo to before running script
-       -b          specify a max transfer rate in kilobytes per second (defaults to 0 (no limit))
-       -v          increase verbosity
-       -V          output debugging info
-"
-
-# parse args
-while getopts d:p:u:b:vV OPTION
-do
-    case $OPTION in
-    d)
-        data_dir="$OPTARG"
-        ;;
-    p)
-        rsyncd_port="$OPTARG"
-        ;;
-    u)
-        user="$OPTARG"
-        ;;
-    b)
-        rsyncd_bwlimit="$OPTARG"
-        ;;
-    v)
-        verbose="v"
-        ;;
-    V)
-        debug="V"
-        ;;
-    *)
-        echo "$USAGE"
-        exit 1
-    esac
-done
-
-[[ -n $debug ]] && set -x
-
-fixUser "$@"
-
-# try to determine rsyncd port number from $confFile if not specified on
-# command line, default to solr_port+10000
-if [[ -z ${rsyncd_port} ]]
-then
-    if [[ "${solr_port}" ]]
-    then
-        rsyncd_port=`expr 10000 + ${solr_port}`
-    else
-        echo "rsyncd port number missing in $confFile or command line."
-        echo "$USAGE"
-        exit 1
-    fi
-fi
-
-# Set bwlimit to unlimited by default
-if [[ -z ${rsyncd_bwlimit} ]]
-then
-       rsyncd_bwlimit='0'
-fi
-
-dataDir
-
-logMessage started by $oldwhoami
-logMessage command: $0 $@
-
-if [[ ! -f ${solr_root}/logs/rsyncd-enabled ]]
-then
-    logMessage rsyncd disabled
-    exit 2
-fi
-
-if \
-    rsync rsync://localhost:${rsyncd_port} >/dev/null 2>&1
-then
-    logMessage "rsyncd already running at port ${rsyncd_port}"
-    exit 1
-fi
-
-# create conf/rsyncd.conf on the fly, creating solrlogs directory if needed
-if [[ ! -d ${solr_root}/conf ]]
-then
-    mkdir ${solr_root}/conf
-fi
-cat <<EOF > ${solr_root}/conf/rsyncd.conf
-#### rsyncd.conf file ####
- 
-uid = $(whoami)
-gid = $(whoami)
-use chroot = no
-list = no
-pid file = ${solr_root}/logs/rsyncd.pid
-log file = ${solr_root}/logs/rsyncd.log
-[solr]
-    path = ${data_dir}
-    comment = Solr
-EOF
-
-rsync --daemon --port=${rsyncd_port} --bwlimit=${rsyncd_bwlimit} --config=${solr_root}/conf/rsyncd.conf
-
-# first make sure rsyncd is accepting connections
-i=1
-while \
- ! rsync rsync://localhost:${rsyncd_port} >/dev/null 2>&1
-do
-    if (( i++ > 15 ))
-    then
-        logMessage "rsyncd not accepting connections, exiting" >&2
-        exit 2
-    fi
-    sleep 1
-done
-
-logMessage rsyncd started with data_dir=${data_dir} and accepting requests

http://git-wip-us.apache.org/repos/asf/lucene-solr/blob/0f5875b7/solr/scripts/rsyncd-stop
----------------------------------------------------------------------
diff --git a/solr/scripts/rsyncd-stop b/solr/scripts/rsyncd-stop
deleted file mode 100755
index 4a5899d..0000000
--- a/solr/scripts/rsyncd-stop
+++ /dev/null
@@ -1,105 +0,0 @@
-#!/bin/bash
-#
-# Licensed to the Apache Software Foundation (ASF) under one or more
-# contributor license agreements.  See the NOTICE file distributed with
-# this work for additional information regarding copyright ownership.
-# The ASF licenses this file to You under the Apache License, Version 2.0
-# (the "License"); you may not use this file except in compliance with
-# the License.  You may obtain a copy of the License at
-#
-#     http://www.apache.org/licenses/LICENSE-2.0
-#
-# Unless required by applicable law or agreed to in writing, software
-# distributed under the License is distributed on an "AS IS" BASIS,
-# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
-# See the License for the specific language governing permissions and
-# limitations under the License.
-#
-# Shell script to stop rsyncd on master Solr server
-
-orig_dir=$(pwd)
-cd ${0%/*}/..
-solr_root=$(pwd)
-cd ${orig_dir}
-
-unset user verbose debug
-. ${solr_root}/bin/scripts-util
-
-# set up variables
-prog=${0##*/}
-log=${solr_root}/logs/rsyncd.log
-
-# define usage string
-USAGE="\
-usage: $prog [-u username] [-v] [-V]
-       -u          specify user to sudo to before running script
-       -v          increase verbosity
-       -V          output debugging info
-"
-
-# parse args
-while getopts u:vV OPTION
-do
-    case $OPTION in
-    u)
-        user="$OPTARG"
-        ;;
-    v)
-        verbose="v"
-        ;;
-    V)
-        debug="V"
-        ;;
-    *)
-        echo "$USAGE"
-        exit 1
-    esac
-done
-
-[[ -n $debug ]] && set -x
-
-fixUser "$@"
-
-logMessage stopped by $oldwhoami
-logMessage command: $0 $@
-
-# look for pid file
-if [[ ! -f ${solr_root}/logs/rsyncd.pid ]]
-then
-    logMessage "missing rsyncd pid file ${solr_root}/logs/rsyncd.pid"
-    exit 2
-fi
-
-# get PID from file
-pid=$(<${solr_root}/logs/rsyncd.pid)
-if [[ -z $pid ]]
-then
-    logMessage "unable to get rsyncd's PID"
-    exit 2
-fi
-
-kill $pid
-
-# wait until rsyncd dies or we time out
-dead=0
-timer=0
-timeout=300
-while (( ! dead && timer < timeout ))
-do
-    if ps -eo pid | grep -qw $pid
-    then
-	kill $pid
-        (( timer++ ))
-        sleep 1
-    else
-        dead=1
-    fi
-done
-if ps -eo pid | grep -qw $pid
-then
-    logMessage rsyncd failed to stop after $timeout seconds
-    exit 3
-fi
-
-# remove rsyncd.conf
-/bin/rm -f ${solr_root}/conf/rsyncd.conf

http://git-wip-us.apache.org/repos/asf/lucene-solr/blob/0f5875b7/solr/scripts/scripts-util
----------------------------------------------------------------------
diff --git a/solr/scripts/scripts-util b/solr/scripts/scripts-util
deleted file mode 100755
index 05441e0..0000000
--- a/solr/scripts/scripts-util
+++ /dev/null
@@ -1,141 +0,0 @@
-#!/bin/bash
-#
-# Licensed to the Apache Software Foundation (ASF) under one or more
-# contributor license agreements.  See the NOTICE file distributed with
-# this work for additional information regarding copyright ownership.
-# The ASF licenses this file to You under the Apache License, Version 2.0
-# (the "License"); you may not use this file except in compliance with
-# the License.  You may obtain a copy of the License at
-#
-#     http://www.apache.org/licenses/LICENSE-2.0
-#
-# Unless required by applicable law or agreed to in writing, software
-# distributed under the License is distributed on an "AS IS" BASIS,
-# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
-# See the License for the specific language governing permissions and
-# limitations under the License.
-#
-# util functions used by scripts
-
-export PATH=/sbin:/usr/sbin:/bin:/usr/bin:$PATH
-
-# set up variables
-prog=${0##*/}
-OS=`uname`
-
-# source the config file if present
-confFile=${solr_root}/conf/scripts.conf
-if [[ -f $confFile ]]
-then
-    . $confFile
-fi
-
-function fixUser
-{
-# set user to $(whoami) if not specified
-    if [[ -z ${user} ]]
-    then
-        user=$(whoami)
-    fi
-
-# sudo
-    if [[ $(whoami) != ${user} ]]
-    then
-        sudo -u ${user} $0 "$@"
-        exit $?
-    fi
-
-    oldwhoami=$(who -m | cut -d' ' -f1 | sed -e's/^.*!//')
-
-    if [[ "${oldwhoami}" == "" ]]
-    then
-        oldwhoami=`ps h -Hfp $(pgrep -f -g0 $0) | tail -1|cut -f1 -d" "`
-    fi
-}
-
-function setStartTime
-{
-    if [[ "${OS}" == "SunOS" ]]
-    then
-        start=`perl -e "print time;"`
-    else
-        start=`date +"%s"`
-    fi
-}
-
-function timeStamp
-{
-    date +'%Y/%m/%d %H:%M:%S'
-}
-
-function curlUrl
-{
-    curl_url=""
-    if [[ -n ${solr_url} ]]
-    then
-      curl_url=${solr_url}
-    else
-      if [[ -z ${solr_port} ]]
-      then
-        echo "Solr port number missing in $confFile or command line."
-        echo "$USAGE"
-        exit 1
-      fi
-
-      # use default hostname if not specified
-      if [[ -z ${solr_hostname} ]]
-      then
-        solr_hostname=localhost
-      fi
-
-      # use default webapp name if not specified
-      if [[ -z ${webapp_name} ]]
-      then
-        webapp_name=solr
-      fi
-      curl_url=http://${solr_hostname}:${solr_port}/${webapp_name}/update
-    fi
-}
-
-function dataDir
-{
-    # use default value for data_dir if not specified
-    # relative path starts at ${solr_root}
-    if [[ -z ${data_dir} ]]
-    then
-        data_dir=${solr_root}/data
-    elif [[ "`echo ${data_dir}|cut -c1`" != "/" ]]
-    then
-        data_dir=${solr_root}/${data_dir}
-    fi
-}
-
-function logMessage
-{
-    echo $(timeStamp) $@>>$log
-    if [[ -n ${verbose} ]]
-    then
-	echo $@
-    fi
-}
-
-function logExit
-{
-    if [[ "${OS}" == "SunOS" ]]
-    then
-        end=`perl -e "print time;"`
-    else
-        end=`date +"%s"`
-    fi
-    diff=`expr $end - $start`
-    echo "$(timeStamp) $1 (elapsed time: $diff sec)">>$log
-    exit $2
-}
-
-# create logs directory if not there
-if [[ ! -d ${solr_root}/logs ]]
-then
-    mkdir ${solr_root}/logs
-fi
-
-umask 002

http://git-wip-us.apache.org/repos/asf/lucene-solr/blob/0f5875b7/solr/scripts/snapcleaner
----------------------------------------------------------------------
diff --git a/solr/scripts/snapcleaner b/solr/scripts/snapcleaner
deleted file mode 100755
index a6af629..0000000
--- a/solr/scripts/snapcleaner
+++ /dev/null
@@ -1,146 +0,0 @@
-#!/bin/bash
-#
-# Licensed to the Apache Software Foundation (ASF) under one or more
-# contributor license agreements.  See the NOTICE file distributed with
-# this work for additional information regarding copyright ownership.
-# The ASF licenses this file to You under the Apache License, Version 2.0
-# (the "License"); you may not use this file except in compliance with
-# the License.  You may obtain a copy of the License at
-#
-#     http://www.apache.org/licenses/LICENSE-2.0
-#
-# Unless required by applicable law or agreed to in writing, software
-# distributed under the License is distributed on an "AS IS" BASIS,
-# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
-# See the License for the specific language governing permissions and
-# limitations under the License.
-#
-# Shell script to clean up snapshots of a Solr Lucene collection.
-
-orig_dir=$(pwd)
-cd ${0%/*}/..
-solr_root=$(pwd)
-cd ${orig_dir}
-
-unset days num data_dir user verbose debug
-. ${solr_root}/bin/scripts-util
-
-# set up variables
-prog=${0##*/}
-log=${solr_root}/logs/${prog}.log
-
-# define usage string
-USAGE="\
-usage: $prog -D <days> | -N <num> [-d dir] [-u username] [-v] [-V]
-       -D <days>   clean up snapshots more than <days> days old
-       -N <num>    keep the most recent <num> snapshots and
-                   clean up the remaining ones that are not being pulled
-       -d          specify directory holding index data
-       -u          specify user to sudo to before running script
-       -v          increase verbosity
-       -V          output debugging info
-"
-
-# parse args
-while getopts D:N:d:u:vV OPTION
-do
-    case $OPTION in
-    D)
-        days="$OPTARG"
-        ;;
-    N)
-        num="$OPTARG"
-        ;;
-    d)
-        data_dir="$OPTARG"
-        ;;
-    u)
-        user="$OPTARG"
-        ;;
-    v)
-        verbose="v"
-        ;;
-    V)
-        debug="V"
-        ;;
-    *)
-        echo "$USAGE"
-        exit 1
-    esac
-done
-
-[[ -n $debug ]] && set -x
-
-if [[ -z ${days} && -z ${num} ]]
-then
-    echo "$USAGE"
-    exit 1
-fi
-
-fixUser "$@"
-
-dataDir
-
-function remove
-{
-    if [[ "${OS}" == "Darwin" || "${OS}" == "FreeBSD" ]]
-    then
-     syncing=`ps -www -U ${user} |grep -w rsync|grep -v grep|grep -w $1`
-    else
-     syncing=`ps -fwwwu ${user}|grep -w rsync|grep -v grep|grep -w $1`
-    fi
-
-    if [[ -n $syncing ]]
-    then
-	logMessage $1 not removed - rsync in progress
-    else
-	logMessage removing snapshot $1
-	/bin/rm -rf $1
-    fi
-}
-
-setStartTime
-
-logMessage started by $oldwhoami
-logMessage command: $0 $@
-
-# trap control-c
-trap 'echo "caught INT/TERM, exiting now but partial cleanup may have already occured";logExit aborted 13' INT TERM
-
-if [[ -n ${days} ]]
-then
-    # is maxdepth supported?
-    find ${data_dir} -maxdepth 0 -name foobar >/dev/null 2>&1
-    if [ $? = 0 ]; then
-      maxdepth="-maxdepth 1"
-    else
-      unset maxdepth
-    fi
-
-    logMessage cleaning up snapshots more than ${days} days old
-    for i in `find ${data_dir} ${maxdepth} -name 'snapshot.*' -mtime +${days} -print`
-    do
-        remove $i
-    done
-elif [[ -n ${num} ]]
-then
-    logMessage cleaning up all snapshots except for the most recent ${num} ones
-    unset snapshots count
-    snapshots=`find ${data_dir} -type d -name 'snapshot.*' 2>/dev/null| sort -r`
-    if [[ $? == 0 ]]
-    then
-        count=`echo $snapshots|wc -w`
-        startpos=`expr $num + 1`
-        if [[ $count -gt $num ]]
-        then
-            for i in `echo $snapshots|cut -f${startpos}- -d" "`
-            do
-	        remove $i
-	    done
-        fi
-    fi
-fi
-
-logExit ended 0
-
-

http://git-wip-us.apache.org/repos/asf/lucene-solr/blob/0f5875b7/solr/scripts/snapinstaller
----------------------------------------------------------------------
diff --git a/solr/scripts/snapinstaller b/solr/scripts/snapinstaller
deleted file mode 100755
index f921b38..0000000
--- a/solr/scripts/snapinstaller
+++ /dev/null
@@ -1,190 +0,0 @@
-#!/bin/bash
-#
-# Licensed to the Apache Software Foundation (ASF) under one or more
-# contributor license agreements.  See the NOTICE file distributed with
-# this work for additional information regarding copyright ownership.
-# The ASF licenses this file to You under the Apache License, Version 2.0
-# (the "License"); you may not use this file except in compliance with
-# the License.  You may obtain a copy of the License at
-#
-#     http://www.apache.org/licenses/LICENSE-2.0
-#
-# Unless required by applicable law or agreed to in writing, software
-# distributed under the License is distributed on an "AS IS" BASIS,
-# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
-# See the License for the specific language governing permissions and
-# limitations under the License.
-#
-# Shell script to install a snapshot into place as the Lucene collection
-# for a Solr server
-
-orig_dir=$(pwd)
-cd ${0%/*}/..
-solr_root=$(pwd)
-cd ${orig_dir}
-
-unset master_host master_status_dir data_dir user verbose debug
-. ${solr_root}/bin/scripts-util
-
-# set up variables
-prog=${0##*/}
-log=${solr_root}/logs/${prog}.log
-
-LOCKDIR="${solr_root}/logs/snapinstaller-lock"
-PIDFILE="${LOCKDIR}/PID"
-
-
-# define usage string
-USAGE="\
-usage: $prog [-M master] [-S sdir] [-d dir] [-u username] [-v] [-V]
-       -M master   specify hostname of master server from where to pull index
-                   snapshot
-       -S          specify directory holding snapshot status on master server
-       -d          specify directory holding index data on local machine
-       -u          specify user to sudo to before running script
-       -v          increase verbosity
-       -V          output debugging info
-"
-
-# parse args
-while getopts M:S:d:u:vV OPTION
-do
-    case $OPTION in
-    M)
-        master_host="$OPTARG"
-        ;;
-    S)
-        master_status_dir="$OPTARG"
-        ;;
-    d)
-        data_dir="$OPTARG"
-        ;;
-    u)
-        user="$OPTARG"
-        ;;
-    v)
-        verbose="v"
-        ;;
-    V)
-        debug="V"
-        ;;
-    *)
-        echo "$USAGE"
-        exit 1
-    esac
-done
-
-[[ -n $debug ]] && set -x
-
-if [[ -z ${master_host} ]]
-then
-    echo "name of master server missing in $confFile or command line."
-    echo "$USAGE"
-    exit 1
-fi
-
-if [[ -z ${master_status_dir} ]]
-then
-    echo "directory holding snapshot status on master server missing in $confFile or command line."
-    echo "$USAGE"
-    exit 1
-fi
-
-fixUser "$@"
-
-dataDir
-
-# assume relative path to start at ${solr_root}
-if [[ "`echo ${master_status_dir}|cut -c1`" != "/" ]]
-then
-    master_status_dir=${solr_root}/${master_status_dir}
-fi
-
-setStartTime
-
-if test -r $PIDFILE
-then
-  OTHERPID="$(cat "${PIDFILE}")"
-  if ! kill -0 $OTHERPID &>/dev/null; then
-    logMessage removing stale lock ${OTHERPID}
-    rm -rf "${LOCKDIR}"
-  else
-    logExit "lock failed, PID ${OTHERPID} is active" 1
-  fi
-fi
-
-mkdir "${LOCKDIR}" &>/dev/null
-echo "$$" >"${PIDFILE}"
-
-logMessage started by $oldwhoami
-logMessage command: $0 $@
-
-# get directory name of latest snapshot
-name=`perl -e 'chdir q|'${data_dir}'|; print ((sort grep {/^snapshot[.][1-9][0-9]{13}$/} <*>)[-1])'`
-
-# clean up after INT/TERM
-trap 'echo "caught INT/TERM, exiting now but partial installation may have already occured";/bin/rm -rf ${data_dir}/index.tmp$$;logExit aborted 13' INT TERM
-
-# is there a snapshot
-if [[ "${name}" == "" ]]
-then
-    logMessage no snapshot available
-    logExit ended 0
-fi
-
-name=${data_dir}/${name}
-
-# has snapshot already been installed
-if [[ ${name} == `cat ${solr_root}/logs/snapshot.current 2>/dev/null` ]]
-then
-    logMessage latest snapshot ${name} already installed
-    logExit ended 0
-fi
-
-# make sure master has a directory to hold slaves' stats/state
-if
-    ! ssh -o StrictHostKeyChecking=no ${master_host} mkdir -p ${master_status_dir}
-then
-    logMessage failed to ssh to master ${master_host}, snapshot status not updated on master
-fi
-
-# install using hard links into temporary directory
-# remove original index and then atomically copy new one into place
-logMessage installing snapshot ${name}
-if [[ "${OS}" == "SunOS" || "${OS}" == "Darwin" || "${OS}" == "FreeBSD" ]]
-then
-  orig_dir=$(pwd)
-  mkdir ${data_dir}/index.tmp$$ && \
-  cd ${name} && \
-  find . -print|cpio -pdlmu ${data_dir}/index.tmp$$ 1>/dev/null 2>&1 && \
-  /bin/rm -rf ${data_dir}/index && \
-  mv -f ${data_dir}/index.tmp$$ ${data_dir}/index
-  cd ${orig_dir}
-else
-  cp -lr ${name}/ ${data_dir}/index.tmp$$ && \
-  /bin/rm -rf ${data_dir}/index && \
-  mv -f ${data_dir}/index.tmp$$ ${data_dir}/index
-fi
-
-# update distribution stats
-echo ${name} > ${solr_root}/logs/snapshot.current
-
-# push stats/state to master
-if
-    ! scp -q -o StrictHostKeyChecking=no ${solr_root}/logs/snapshot.current ${master_host}:${master_status_dir}/snapshot.current.`uname -n`
-then
-    logMessage failed to scp to master ${master_host}, snapshot status not updated on master
-fi
-
-# notify Solr to open a new Searcher
-logMessage notifying Solr to open a new Searcher
-${solr_root}/bin/commit
-if [[ $? != 0 ]]
-then
-  logMessage failed to connect to Solr server
-  logMessage snapshot installed but Solr server has not opened a new Searcher
-  logExit failed 1
-fi
-
-rm -rf "${LOCKDIR}"
-logExit ended 0

http://git-wip-us.apache.org/repos/asf/lucene-solr/blob/0f5875b7/solr/scripts/snappuller
----------------------------------------------------------------------
diff --git a/solr/scripts/snappuller b/solr/scripts/snappuller
deleted file mode 100755
index 7a7c4ae..0000000
--- a/solr/scripts/snappuller
+++ /dev/null
@@ -1,261 +0,0 @@
-#!/bin/bash
-#
-# Licensed to the Apache Software Foundation (ASF) under one or more
-# contributor license agreements.  See the NOTICE file distributed with
-# this work for additional information regarding copyright ownership.
-# The ASF licenses this file to You under the Apache License, Version 2.0
-# (the "License"); you may not use this file except in compliance with
-# the License.  You may obtain a copy of the License at
-#
-#     http://www.apache.org/licenses/LICENSE-2.0
-#
-# Unless required by applicable law or agreed to in writing, software
-# distributed under the License is distributed on an "AS IS" BASIS,
-# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
-# See the License for the specific language governing permissions and
-# limitations under the License.
-#
-# Shell script to copy snapshots of a Solr Lucene collection from the master
-
-orig_dir=$(pwd)
-cd ${0%/*}/..
-solr_root=$(pwd)
-cd ${orig_dir}
-
-unset master_host rsyncd_port master_data_dir master_status_dir snap_name
-unset sizeonly stats data_dir user verbose debug compress startStatus
-. ${solr_root}/bin/scripts-util
-
-# set up variables
-prog=${0##*/}
-log=${solr_root}/logs/${prog}.log
-
-# define usage string
-USAGE="\
-usage: $prog [-M master] [-P portnum] [-D mdir] [-S sdir] [-n snapshot] [-d dir] [-u username] [-svVz]
-       -M master   specify hostname of master server from where to pull index
-                   snapshot
-       -P port     specify rsyncd port number of master server from where to
-                   pull index snapshot
-       -D          specify directory holding index data on master server
-       -S          specify directory holding snapshot status on master server
-       -n snapshot pull a specific snapshot by name
-       -d          specify directory holding index data on local machine
-       -u          specify user to sudo to before running script
-       -s          use the --size-only option with rsync
-       -v          increase verbosity (-vv show file transfer stats also)
-       -V          output debugging info
-       -z          enable compression of data
-"
-
-# parse args
-while getopts M:P:D:S:n:d:u:svVz OPTION
-do
-    case $OPTION in
-    M)
-        master_host="$OPTARG"
-        ;;
-    P)
-        rsyncd_port="$OPTARG"
-        ;;
-    D)
-        master_data_dir="$OPTARG"
-        ;;
-    S)
-        master_status_dir="$OPTARG"
-        ;;
-    n)
-        snap_name="$OPTARG"
-        ;;
-    d)
-        data_dir="$OPTARG"
-        ;;
-    u)
-        user="$OPTARG"
-        ;;
-    s)
-        sizeonly="--size-only"
-        ;;
-    v)
-        [[ -n $verbose ]] && stats="--stats" || verbose=v
-        ;;
-    V)
-        debug="V"
-        ;;
-    z)
-        compress="z"
-        ;;
-    *)
-        echo "$USAGE"
-        exit 1
-    esac
-done
-
-[[ -n $debug ]] && set -x
-
-if [[ -z ${master_host} ]]
-then
-    echo "name of master server missing in $confFile or command line."
-    echo "$USAGE"
-    exit 1
-fi
-
-# try to determine rsyncd port number from $confFile if not specified on
-# command line, default to solr_port+10000
-if [[ -z ${rsyncd_port} ]]
-then
-    if [[ "${solr_port}" ]]
-    then
-        rsyncd_port=`expr 10000 + ${solr_port}`
-    else
-        echo "rsyncd port number of master server missing in $confFile or command line."
-        echo "$USAGE"
-        exit 1
-    fi
-fi
-
-if [[ -z ${master_data_dir} ]]
-then
-    echo "directory holding index data on master server missing in $confFile or command line."
-    echo "$USAGE"
-    exit 1
-fi
-
-if [[ -z ${master_status_dir} ]]
-then
-    echo "directory holding snapshot status on master server missing in $confFile or command line."
-    echo "$USAGE"
-    exit 1
-fi
-
-fixUser "$@"
-
-dataDir
-
-# assume relative path to start at ${solr_root}
-if [[ "`echo ${master_data_dir}|cut -c1`" != "/" ]]
-then
-    master_data_dir=${solr_root}/${master_data_dir}
-fi
-if [[ "`echo ${master_status_dir}|cut -c1`" != "/" ]]
-then
-    master_status_dir=${solr_root}/${master_status_dir}
-fi
-
-# push stats/state to master if necessary
-function pushStatus
-{
-    scp -q -o StrictHostKeyChecking=no ${solr_root}/logs/snappuller.status ${master_host}:${master_status_dir}/snapshot.status.`uname -n`
-}
-
-setStartTime
-
-logMessage started by $oldwhoami
-logMessage command: $0 $@
-
-if [[ ! -f ${solr_root}/logs/snappuller-enabled ]]
-then
-    logMessage snappuller disabled
-    exit 2
-fi
-
-# make sure we can ssh to master
-if
-    ! ssh -o StrictHostKeyChecking=no ${master_host} id 1>/dev/null 2>&1
-then
-    logMessage failed to ssh to master ${master_host}
-    exit 1
-fi
-
-# get directory name of latest snapshot if not specified on command line
-if [[ -z ${snap_name} ]]
-then
-    snap_name=`ssh -o StrictHostKeyChecking=no ${master_host} "perl -e 'chdir q|${master_data_dir}|; print ((sort grep {/^snapshot[.][1-9][0-9]{13}$/} <*>)[-1])'"`
-fi
-if [[ "${snap_name}" == "" ]]
-then
-    logMessage no snapshot available on ${master_host} in ${master_data_dir}
-    logExit ended 0
-else
-    name=`basename ${snap_name}`
-fi
-
-# clean up after INT/TERM
-trap 'echo cleaning up, please wait ...;/bin/rm -rf ${data_dir}/${name} ${data_dir}/${name}-wip;echo ${startStatus} aborted:$(timeStamp)>${solr_root}/logs/snappuller.status;pushStatus;logExit aborted 13' INT TERM
-
-if [[ -d ${data_dir}/${name} || -d ${data_dir}/${name}-wip ]]
-then
-    logMessage no new snapshot available on ${master_host} in ${master_data_dir}
-    logExit ended 0
-fi
-
-# take a snapshot of current index so that only modified files will be rsync-ed
-# put the snapshot in the 'work-in-progress' directory to prevent it from
-# being installed while the copying is still in progress
-if [[ "${OS}" == "SunOS" || "${OS}" == "Darwin" || "${OS}" == "FreeBSD"  ]]
-then
-  orig_dir=$(pwd)
-  mkdir ${data_dir}/${name}-wip
-  cd ${data_dir}/index
-  find . -print|cpio -pdlmu ${data_dir}/${name}-wip 1>/dev/null 2>&1
-  cd ${orig_dir}
-else
-  cp -lr ${data_dir}/index ${data_dir}/${name}-wip
-fi
-# force rsync of segments and .del files since we are doing size-only
-if [[ -n ${sizeonly} ]]
-then
-    rm -f ${data_dir}/${name}-wip/segments
-    rm -f ${data_dir}/${name}-wip/*.del
-fi
-
-logMessage pulling snapshot ${name}
-
-# make sure master has a directory to hold slaves' stats/state
-ssh -o StrictHostKeyChecking=no ${master_host} mkdir -p ${master_status_dir}
-
-# start new distribution stats
-rsyncStart=`date +'%Y-%m-%d %H:%M:%S'`
-if [[ "${OS}" == "Darwin" || "${OS}" == "FreeBSD"  ]]
-then
-  startTimestamp=`date -j -f '%Y-%m-%d %H:%M:%S' "$rsyncStart" +'%Y%m%d-%H%M%S'`
-  rsyncStartSec=`date -j -f '%Y-%m-%d %H:%M:%S' "$rsyncStart" +'%s'`
-else
-  startTimestamp=`date -d "$rsyncStart" +'%Y%m%d-%H%M%S'`
-  rsyncStartSec=`date -d "$rsyncStart" +'%s'`
-fi
-startStatus="rsync of `basename ${name}` started:$startTimestamp"
-echo ${startStatus} > ${solr_root}/logs/snappuller.status
-pushStatus
-
-# rsync over files that have changed
-rsync -Wa${verbose}${compress} --delete ${sizeonly} \
-${stats} rsync://${master_host}:${rsyncd_port}/solr/${name}/ ${data_dir}/${name}-wip
-
-rc=$?
-rsyncEnd=`date +'%Y-%m-%d %H:%M:%S'`
-if [[ "${OS}" == "Darwin" || "${OS}" == "FreeBSD"  ]]
-then
-  endTimestamp=`date -j -f '%Y-%m-%d %H:%M:%S' "$rsyncEnd" +'%Y%m%d-%H%M%S'`
-  rsyncEndSec=`date -j -f '%Y-%m-%d %H:%M:%S' "$rsyncEnd" +'%s'`
-else
-  endTimestamp=`date -d "$rsyncEnd" +'%Y%m%d-%H%M%S'`
-  rsyncEndSec=`date -d "$rsyncEnd" +'%s'`
-fi
-elapsed=`expr $rsyncEndSec - $rsyncStartSec`
-if [[ $rc != 0 ]]
-then
-  logMessage rsync failed
-  /bin/rm -rf ${data_dir}/${name}-wip
-  echo ${startStatus} failed:$endTimestamp > ${solr_root}/logs/snappuller.status
-  pushStatus
-  logExit failed 1
-fi
-
-# move into place atomically
-mv ${data_dir}/${name}-wip ${data_dir}/${name}
-
-# finish new distribution stats
-echo ${startStatus} ended:$endTimestamp rsync-elapsed:${elapsed} > ${solr_root}/logs/snappuller.status
-pushStatus
-logExit ended 0

http://git-wip-us.apache.org/repos/asf/lucene-solr/blob/0f5875b7/solr/scripts/snappuller-disable
----------------------------------------------------------------------
diff --git a/solr/scripts/snappuller-disable b/solr/scripts/snappuller-disable
deleted file mode 100755
index 1a988af..0000000
--- a/solr/scripts/snappuller-disable
+++ /dev/null
@@ -1,77 +0,0 @@
-#!/bin/bash
-#
-# Licensed to the Apache Software Foundation (ASF) under one or more
-# contributor license agreements.  See the NOTICE file distributed with
-# this work for additional information regarding copyright ownership.
-# The ASF licenses this file to You under the Apache License, Version 2.0
-# (the "License"); you may not use this file except in compliance with
-# the License.  You may obtain a copy of the License at
-#
-#     http://www.apache.org/licenses/LICENSE-2.0
-#
-# Unless required by applicable law or agreed to in writing, software
-# distributed under the License is distributed on an "AS IS" BASIS,
-# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
-# See the License for the specific language governing permissions and
-# limitations under the License.
-#
-# Shell script to disable snappuller
-
-orig_dir=$(pwd)
-cd ${0%/*}/..
-solr_root=$(pwd)
-cd ${orig_dir}
-
-unset user verbose debug
-. ${solr_root}/bin/scripts-util
-
-# set up variables
-prog=${0##*/}
-log=${solr_root}/logs/snappuller.log
-
-# define usage string
-USAGE="\
-usage: $prog [-u username] [-v] [-V]
-       -u          specify user to sudo to before running script
-       -v          increase verbosity
-       -V          output debugging info
-"
-
-# parse args
-while getopts u:vV OPTION
-do
-    case $OPTION in
-    u)
-        user="$OPTARG"
-        ;;
-    v)
-        verbose="v"
-        ;;
-    V)
-        debug="V"
-        ;;
-    *)
-        echo "$USAGE"
-        exit 1
-    esac
-done
-
-[[ -n $debug ]] && set -x
-
-fixUser "$@"
-
-setStartTime
-
-logMessage disabled by $oldwhoami
-logMessage command: $0 $@
-name=${solr_root}/logs/snappuller-enabled
-
-if [[ -f ${name} ]]
-then
-    rm -f ${name}
-else
-    logMessage snappuller not currently enabled
-    logExit exited 1
-fi
-
-logExit ended 0

http://git-wip-us.apache.org/repos/asf/lucene-solr/blob/0f5875b7/solr/scripts/snappuller-enable
----------------------------------------------------------------------
diff --git a/solr/scripts/snappuller-enable b/solr/scripts/snappuller-enable
deleted file mode 100755
index 7d842ba..0000000
--- a/solr/scripts/snappuller-enable
+++ /dev/null
@@ -1,77 +0,0 @@
-#!/bin/bash
-#
-# Licensed to the Apache Software Foundation (ASF) under one or more
-# contributor license agreements.  See the NOTICE file distributed with
-# this work for additional information regarding copyright ownership.
-# The ASF licenses this file to You under the Apache License, Version 2.0
-# (the "License"); you may not use this file except in compliance with
-# the License.  You may obtain a copy of the License at
-#
-#     http://www.apache.org/licenses/LICENSE-2.0
-#
-# Unless required by applicable law or agreed to in writing, software
-# distributed under the License is distributed on an "AS IS" BASIS,
-# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
-# See the License for the specific language governing permissions and
-# limitations under the License.
-#
-# Shell script to enable snappuller
-
-orig_dir=$(pwd)
-cd ${0%/*}/..
-solr_root=$(pwd)
-cd ${orig_dir}
-
-unset user verbose debug
-. ${solr_root}/bin/scripts-util
-
-# set up variables
-prog=${0##*/}
-log=${solr_root}/logs/snappuller.log
-
-# define usage string
-USAGE="\
-usage: $prog [-u username] [-v] [-V]
-       -u          specify user to sudo to before running script
-       -v          increase verbosity
-       -V          output debugging info
-"
-
-# parse args
-while getopts u:vV OPTION
-do
-    case $OPTION in
-    u)
-        user="$OPTARG"
-        ;;
-    v)
-        verbose="v"
-        ;;
-    V)
-        debug="V"
-        ;;
-    *)
-        echo "$USAGE"
-        exit 1
-    esac
-done
-
-[[ -n $debug ]] && set -x
-
-fixUser "$@"
-
-setStartTime
-
-logMessage enabled by $oldwhoami
-logMessage command: $0 $@
-name=${solr_root}/logs/snappuller-enabled
-
-if [[ -f ${name} ]]
-then
-    logMessage snappuller already enabled
-    logExit exited 1
-else
-    touch ${name}
-fi
-
-logExit ended 0

http://git-wip-us.apache.org/repos/asf/lucene-solr/blob/0f5875b7/solr/scripts/snapshooter
----------------------------------------------------------------------
diff --git a/solr/scripts/snapshooter b/solr/scripts/snapshooter
deleted file mode 100755
index 2bf3ff1..0000000
--- a/solr/scripts/snapshooter
+++ /dev/null
@@ -1,128 +0,0 @@
-#!/bin/bash
-#
-# Licensed to the Apache Software Foundation (ASF) under one or more
-# contributor license agreements.  See the NOTICE file distributed with
-# this work for additional information regarding copyright ownership.
-# The ASF licenses this file to You under the Apache License, Version 2.0
-# (the "License"); you may not use this file except in compliance with
-# the License.  You may obtain a copy of the License at
-#
-#     http://www.apache.org/licenses/LICENSE-2.0
-#
-# Unless required by applicable law or agreed to in writing, software
-# distributed under the License is distributed on an "AS IS" BASIS,
-# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
-# See the License for the specific language governing permissions and
-# limitations under the License.
-#
-# Shell script to take a snapshot of a Solr Lucene collection.
-
-orig_dir=$(pwd)
-cd ${0%/*}/..
-solr_root=$(pwd)
-cd ${orig_dir}
-
-unset data_dir user verbose debug
-. ${solr_root}/bin/scripts-util
-
-# set up variables
-prog=${0##*/}
-log=${solr_root}/logs/${prog}.log
-
-# define usage string
-USAGE="\
-usage: $prog [-d dir] [-u username] [-v] [-V] [-c]
-       -d          specify directory holding index data
-       -u          specify user to sudo to before running script
-       -v          increase verbosity
-       -V          output debugging info
-       -c          only take snapshot if different than previous
-"
-
-# parse args
-while getopts d:u:vVc OPTION
-do
-    case $OPTION in
-    d)
-        data_dir="$OPTARG"
-        ;;
-    u)
-        user="$OPTARG"
-        ;;
-    v)
-        verbose="v"
-        ;;
-    V)
-        debug="V"
-        ;;
-    c)
-        check=1
-        ;;
-    *)
-        echo "$USAGE"
-        exit 1
-    esac
-done
-
-[[ -n $debug ]] && set -x
-
-fixUser "$@"
-
-dataDir
-
-setStartTime
-
-logMessage started by $oldwhoami
-logMessage command: $0 $@
-
-snap_name=snapshot.`date +"%Y%m%d%H%M%S"`
-name=${data_dir}/${snap_name}
-temp=${data_dir}/temp-${snap_name}
-
-if [[ -d ${name} ]]
-then
-    logMessage snapshot directory ${name} already exists
-    logExit aborted 1
-fi
-
-if [[ -d ${temp} ]]
-then
-    logMessage snapshotting of ${name} in progress
-    logExit aborted 1
-fi
-
-if [[ ${check} ]]
-then
-   previous=`find ${data_dir} -name snapshot.\* | sort -r  | head -1` 
-   if [[ -d ${previous} ]]
-   then
-     differences=`diff -q ${data_dir}/index ${previous} | wc -l` 
-     if [[ ${differences} -lt 1 ]]
-     then 
-       logMessage snapshot would be same as the last one, exiting
-       logExit aborted 1 
-     fi
-   fi
-fi
-
-# clean up after INT/TERM
-trap 'echo cleaning up, please wait ...;/bin/rm -rf ${name} ${temp};logExit aborted 13' INT TERM
-
-logMessage taking snapshot ${name}
-
-# take a snapshot using hard links into temporary location
-# then move it into place atomically
-if [[ "${OS}" == "SunOS" || "${OS}" == "Darwin"  || "${OS}" == "FreeBSD" ]]
-then
-  orig_dir=$(pwd)
-  mkdir ${temp}
-  cd ${data_dir}/index
-  find . -print|cpio -pdlmu ${temp} 1>/dev/null 2>&1
-  cd ${orig_dir}
-else
-  cp -lr ${data_dir}/index ${temp}
-fi
-mv ${temp} ${name}
-
-logExit ended 0
-


[42/50] [abbrv] lucene-solr:jira/solr-9858: LUCENE-7410: Make cache keys and close listeners less trappy.

Posted by ab...@apache.org.
LUCENE-7410: Make cache keys and close listeners less trappy.


Project: http://git-wip-us.apache.org/repos/asf/lucene-solr/repo
Commit: http://git-wip-us.apache.org/repos/asf/lucene-solr/commit/df6f8307
Tree: http://git-wip-us.apache.org/repos/asf/lucene-solr/tree/df6f8307
Diff: http://git-wip-us.apache.org/repos/asf/lucene-solr/diff/df6f8307

Branch: refs/heads/jira/solr-9858
Commit: df6f83072303b4891a296b700a50c743284d3c30
Parents: 8e65aca
Author: Adrien Grand <jp...@gmail.com>
Authored: Tue Feb 28 14:21:30 2017 +0100
Committer: Adrien Grand <jp...@gmail.com>
Committed: Tue Feb 28 14:46:45 2017 +0100

----------------------------------------------------------------------
 lucene/CHANGES.txt                              |   4 +
 lucene/MIGRATE.txt                              |   8 +
 .../apache/lucene/index/FixBrokenOffsets.java   |  10 +
 .../apache/lucene/codecs/DocValuesConsumer.java |   4 +-
 .../lucene/index/ExitableDirectoryReader.java   |  21 +-
 .../apache/lucene/index/FilterCodecReader.java  |  13 +-
 .../apache/lucene/index/FilterLeafReader.java   |  73 +---
 .../org/apache/lucene/index/IndexReader.java    | 115 +++--
 .../org/apache/lucene/index/LeafReader.java     |  84 +---
 .../apache/lucene/index/MergeReaderWrapper.java |  20 +-
 .../org/apache/lucene/index/MultiDocValues.java |  22 +-
 .../org/apache/lucene/index/MultiReader.java    |  11 +
 .../lucene/index/ParallelCompositeReader.java   |  16 +-
 .../apache/lucene/index/ParallelLeafReader.java |  36 +-
 .../apache/lucene/index/SegmentCoreReaders.java |  39 +-
 .../org/apache/lucene/index/SegmentReader.java  |  57 ++-
 .../lucene/index/SlowCodecReaderWrapper.java    |   8 +-
 .../apache/lucene/index/SortingLeafReader.java  |  12 +
 .../lucene/index/StandardDirectoryReader.java   |  42 ++
 .../org/apache/lucene/search/LRUQueryCache.java |  39 +-
 .../index/TestDemoParallelLeafReader.java       |  11 +-
 .../lucene/index/TestDirectoryReader.java       |   8 +-
 .../lucene/index/TestDirectoryReaderReopen.java |  12 +-
 .../index/TestExitableDirectoryReader.java      |  10 +
 .../lucene/index/TestFilterDirectoryReader.java |   5 +
 .../lucene/index/TestFilterLeafReader.java      |  21 +-
 .../lucene/index/TestIndexReaderClose.java      |  62 ++-
 .../apache/lucene/index/TestMultiTermsEnum.java |  10 +
 .../index/TestParallelCompositeReader.java      |  33 +-
 .../lucene/search/TermInSetQueryTest.java       |  17 +-
 .../apache/lucene/search/TestLRUQueryCache.java |  61 ++-
 .../lucene/search/TestSearcherManager.java      |  15 +
 .../org/apache/lucene/search/TestTermQuery.java |  15 +
 .../apache/lucene/search/TestTermScorer.java    |  10 +
 .../DefaultSortedSetDocValuesReaderState.java   |   3 +-
 .../facet/taxonomy/CachedOrdinalsReader.java    |   7 +-
 .../taxonomy/OrdinalMappingLeafReader.java      |  10 +
 .../search/highlight/TermVectorLeafReader.java  |  20 +-
 .../highlight/WeightedSpanTermExtractor.java    |  10 +
 .../lucene/search/uhighlight/PhraseHelper.java  |  10 +
 .../TermVectorFilteredLeafReader.java           |  10 +
 .../search/uhighlight/UnifiedHighlighter.java   |  15 +
 .../TestUnifiedHighlighterTermVec.java          |  15 +
 .../lucene/search/join/QueryBitSetProducer.java |  14 +-
 .../apache/lucene/search/join/TestJoinUtil.java |  10 +-
 .../search/join/TestQueryBitSetProducer.java    | 110 +++++
 .../apache/lucene/index/memory/MemoryIndex.java |  20 +-
 .../lucene/index/MultiPassIndexSplitter.java    |  15 +
 .../apache/lucene/index/PKIndexSplitter.java    |  10 +
 .../nrt/SegmentInfosSearcherManager.java        |   8 +-
 .../lucene/index/AllDeletedFilterReader.java    |  10 +
 .../lucene/index/AssertingDirectoryReader.java  |   9 +-
 .../lucene/index/AssertingLeafReader.java       |  30 +-
 .../index/BaseStoredFieldsFormatTestCase.java   |  15 +
 .../lucene/index/FieldFilterLeafReader.java     |  12 +-
 .../lucene/index/MismatchedDirectoryReader.java |   5 +
 .../lucene/index/MismatchedLeafReader.java      |  10 +
 .../lucene/index/MockRandomMergePolicy.java     |  13 +-
 .../org/apache/lucene/search/QueryUtils.java    |  43 +-
 .../org/apache/lucene/util/LuceneTestCase.java  |  30 +-
 .../src/java/org/apache/solr/core/SolrCore.java |  13 +-
 .../solr/handler/component/ExpandComponent.java |  21 +-
 .../solr/highlight/DefaultSolrHighlighter.java  |  10 +
 .../solr/index/SlowCompositeReaderWrapper.java  |  35 +-
 .../schema/RptWithGeometrySpatialField.java     |   7 +-
 .../solr/search/CollapsingQParserPlugin.java    |  19 +-
 .../java/org/apache/solr/search/Insanity.java   |   9 +-
 .../org/apache/solr/uninverting/FieldCache.java |  18 +-
 .../apache/solr/uninverting/FieldCacheImpl.java |  70 +--
 .../uninverting/FieldCacheSanityChecker.java    | 426 -------------------
 .../solr/uninverting/UninvertingReader.java     |  21 +-
 .../apache/solr/update/SolrIndexSplitter.java   |  10 +
 .../test/org/apache/solr/core/TestNRTOpen.java  |   2 +-
 .../index/TestSlowCompositeReaderWrapper.java   |  53 ++-
 .../test/org/apache/solr/search/TestDocSet.java |  20 +-
 .../apache/solr/search/TestSolr4Spatial2.java   |   2 +-
 .../solr/uninverting/TestDocTermOrds.java       |   4 +-
 .../apache/solr/uninverting/TestFieldCache.java |   4 +-
 .../TestFieldCacheSanityChecker.java            | 164 -------
 .../solr/uninverting/TestLegacyFieldCache.java  |  35 +-
 80 files changed, 1089 insertions(+), 1242 deletions(-)
----------------------------------------------------------------------


http://git-wip-us.apache.org/repos/asf/lucene-solr/blob/df6f8307/lucene/CHANGES.txt
----------------------------------------------------------------------
diff --git a/lucene/CHANGES.txt b/lucene/CHANGES.txt
index 21a29c3..6026654 100644
--- a/lucene/CHANGES.txt
+++ b/lucene/CHANGES.txt
@@ -33,6 +33,10 @@ API Changes
 
 * LUCENE-7494: Points now have a per-field API, like doc values. (Adrien Grand)
 
+* LUCENE-7410: Cache keys and close listeners have been refactored in order
+  to be less trappy. See IndexReader.getReaderCacheHelper and
+  LeafReader.getCoreCacheHelper. (Adrien Grand)
+
 Bug Fixes
 
 * LUCENE-7626: IndexWriter will no longer accept broken token offsets

http://git-wip-us.apache.org/repos/asf/lucene-solr/blob/df6f8307/lucene/MIGRATE.txt
----------------------------------------------------------------------
diff --git a/lucene/MIGRATE.txt b/lucene/MIGRATE.txt
index 06e6a81..51f6435 100644
--- a/lucene/MIGRATE.txt
+++ b/lucene/MIGRATE.txt
@@ -47,3 +47,11 @@ queries.
 
 This option has been removed as expanded terms are now normalized through
 Analyzer#normalize.
+
+## Cache key and close listener refactoring (LUCENE-7410)
+
+The way to access cache keys and add close listeners has been refactored in
+order to be less trappy. You should now use IndexReader.getReaderCacheHelper()
+to manage caches that take deleted docs and doc values updates into
+account, and LeafReader.getCoreCacheHelper() to manage per-segment caches that
+do not take deleted docs and doc values updates into account.
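
As a minimal sketch of the new API (not part of this patch; the PerCoreCache
class, the cache field and the computeValue() placeholder are invented for
illustration), a per-core cache could be wired up like this:

    import java.io.IOException;
    import java.util.Map;
    import java.util.concurrent.ConcurrentHashMap;

    import org.apache.lucene.index.IndexReader;
    import org.apache.lucene.index.LeafReader;

    class PerCoreCache {
      private final Map<IndexReader.CacheKey, Object> cache = new ConcurrentHashMap<>();

      Object getOrCompute(LeafReader reader) throws IOException {
        IndexReader.CacheHelper helper = reader.getCoreCacheHelper();
        if (helper == null) {
          // this reader does not support core caching, so do not cache
          return computeValue(reader);
        }
        IndexReader.CacheKey key = helper.getKey();
        Object value = cache.get(key);
        if (value == null) {
          value = computeValue(reader);
          cache.put(key, value);
          // evict the entry once the segment core is closed
          helper.addClosedListener(cache::remove);
        }
        return value;
      }

      private Object computeValue(LeafReader reader) throws IOException {
        return reader.terms("myField"); // placeholder per-core computation
      }
    }

The same pattern works with getReaderCacheHelper() when the cached value must
also reflect deleted docs and doc values updates.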

http://git-wip-us.apache.org/repos/asf/lucene-solr/blob/df6f8307/lucene/backward-codecs/src/java/org/apache/lucene/index/FixBrokenOffsets.java
----------------------------------------------------------------------
diff --git a/lucene/backward-codecs/src/java/org/apache/lucene/index/FixBrokenOffsets.java b/lucene/backward-codecs/src/java/org/apache/lucene/index/FixBrokenOffsets.java
index d4d6f85..e775a28 100644
--- a/lucene/backward-codecs/src/java/org/apache/lucene/index/FixBrokenOffsets.java
+++ b/lucene/backward-codecs/src/java/org/apache/lucene/index/FixBrokenOffsets.java
@@ -114,6 +114,16 @@ public class FixBrokenOffsets {
               }
             };
           }
+
+          @Override
+          public CacheHelper getCoreCacheHelper() {
+            return null;
+          }
+
+          @Override
+          public CacheHelper getReaderCacheHelper() {
+            return null;
+          }
         });
     }
 

http://git-wip-us.apache.org/repos/asf/lucene-solr/blob/df6f8307/lucene/core/src/java/org/apache/lucene/codecs/DocValuesConsumer.java
----------------------------------------------------------------------
diff --git a/lucene/core/src/java/org/apache/lucene/codecs/DocValuesConsumer.java b/lucene/core/src/java/org/apache/lucene/codecs/DocValuesConsumer.java
index 3d06b51..88e34f6 100644
--- a/lucene/core/src/java/org/apache/lucene/codecs/DocValuesConsumer.java
+++ b/lucene/core/src/java/org/apache/lucene/codecs/DocValuesConsumer.java
@@ -521,7 +521,7 @@ public abstract class DocValuesConsumer implements Closeable {
     }
     
     // step 2: create ordinal map (this conceptually does the "merging")
-    final OrdinalMap map = OrdinalMap.build(this, liveTerms, weights, PackedInts.COMPACT);
+    final OrdinalMap map = OrdinalMap.build(null, liveTerms, weights, PackedInts.COMPACT);
     
     // step 3: add field
     addSortedField(fieldInfo,
@@ -689,7 +689,7 @@ public abstract class DocValuesConsumer implements Closeable {
     }
     
     // step 2: create ordinal map (this conceptually does the "merging")
-    final OrdinalMap map = OrdinalMap.build(this, liveTerms, weights, PackedInts.COMPACT);
+    final OrdinalMap map = OrdinalMap.build(null, liveTerms, weights, PackedInts.COMPACT);
     
     // step 3: add field
     addSortedSetField(mergeFieldInfo,

http://git-wip-us.apache.org/repos/asf/lucene-solr/blob/df6f8307/lucene/core/src/java/org/apache/lucene/index/ExitableDirectoryReader.java
----------------------------------------------------------------------
diff --git a/lucene/core/src/java/org/apache/lucene/index/ExitableDirectoryReader.java b/lucene/core/src/java/org/apache/lucene/index/ExitableDirectoryReader.java
index ee1c0ce..a9b7472 100644
--- a/lucene/core/src/java/org/apache/lucene/index/ExitableDirectoryReader.java
+++ b/lucene/core/src/java/org/apache/lucene/index/ExitableDirectoryReader.java
@@ -88,17 +88,19 @@ public class ExitableDirectoryReader extends FilterDirectoryReader {
         return fields;  // break out of wrapper as soon as possible
       }
     }
-    
+
+    // this impl does not change deletes or data so we can delegate the
+    // CacheHelpers
     @Override
-    public Object getCoreCacheKey() {
-      return in.getCoreCacheKey();  
+    public CacheHelper getReaderCacheHelper() {
+      return in.getReaderCacheHelper();
     }
-    
+
     @Override
-    public Object getCombinedCoreAndDeletesKey() {
-      return in.getCombinedCoreAndDeletesKey();
+    public CacheHelper getCoreCacheHelper() {
+      return in.getCoreCacheHelper();
     }
-    
+
   }
 
   /**
@@ -211,6 +213,11 @@ public class ExitableDirectoryReader extends FilterDirectoryReader {
   }
 
   @Override
+  public CacheHelper getReaderCacheHelper() {
+    return in.getReaderCacheHelper();
+  }
+
+  @Override
   public String toString() {
     return "ExitableDirectoryReader(" + in.toString() + ")";
   }

http://git-wip-us.apache.org/repos/asf/lucene-solr/blob/df6f8307/lucene/core/src/java/org/apache/lucene/index/FilterCodecReader.java
----------------------------------------------------------------------
diff --git a/lucene/core/src/java/org/apache/lucene/index/FilterCodecReader.java b/lucene/core/src/java/org/apache/lucene/index/FilterCodecReader.java
index c0ea8fc..5949fca 100644
--- a/lucene/core/src/java/org/apache/lucene/index/FilterCodecReader.java
+++ b/lucene/core/src/java/org/apache/lucene/index/FilterCodecReader.java
@@ -35,6 +35,9 @@ import org.apache.lucene.util.Bits;
  * A <code>FilterCodecReader</code> contains another CodecReader, which it
  * uses as its basic source of data, possibly transforming the data along the
  * way or providing additional functionality.
+ * <p><b>NOTE</b>: If this {@link FilterCodecReader} does not change the
+ * content of the contained reader, you could consider delegating calls to
+ * {@link #getCoreCacheHelper()} and {@link #getReaderCacheHelper()}.
  */
 public abstract class FilterCodecReader extends CodecReader {
   /** 
@@ -106,16 +109,6 @@ public abstract class FilterCodecReader extends CodecReader {
   }
 
   @Override
-  public void addCoreClosedListener(CoreClosedListener listener) {
-    in.addCoreClosedListener(listener);
-  }
-
-  @Override
-  public void removeCoreClosedListener(CoreClosedListener listener) {
-    in.removeCoreClosedListener(listener);
-  }
-
-  @Override
   protected void doClose() throws IOException {
     in.doClose();
   }

http://git-wip-us.apache.org/repos/asf/lucene-solr/blob/df6f8307/lucene/core/src/java/org/apache/lucene/index/FilterLeafReader.java
----------------------------------------------------------------------
diff --git a/lucene/core/src/java/org/apache/lucene/index/FilterLeafReader.java b/lucene/core/src/java/org/apache/lucene/index/FilterLeafReader.java
index 9ed62e7..0a3ec7f 100644
--- a/lucene/core/src/java/org/apache/lucene/index/FilterLeafReader.java
+++ b/lucene/core/src/java/org/apache/lucene/index/FilterLeafReader.java
@@ -19,9 +19,7 @@ package org.apache.lucene.index;
 
 import java.io.IOException;
 import java.util.Iterator;
-import java.util.Objects;
 
-import org.apache.lucene.search.QueryCache;
 import org.apache.lucene.search.Sort;
 import org.apache.lucene.util.AttributeSource;
 import org.apache.lucene.util.Bits;
@@ -38,12 +36,8 @@ import org.apache.lucene.util.BytesRef;
  * <p><b>NOTE</b>: If you override {@link #getLiveDocs()}, you will likely need
  * to override {@link #numDocs()} as well and vice-versa.
  * <p><b>NOTE</b>: If this {@link FilterLeafReader} does not change the
- * content the contained reader, you could consider overriding
- * {@link #getCoreCacheKey()} so that
- * {@link QueryCache} impls share the same entries for this atomic reader
- * and the wrapped one. {@link #getCombinedCoreAndDeletesKey()} could be
- * overridden as well if the {@link #getLiveDocs() live docs} are not changed
- * either.
+ * content of the contained reader, you could consider delegating calls to
+ * {@link #getCoreCacheHelper()} and {@link #getReaderCacheHelper()}.
  */
 public abstract class FilterLeafReader extends LeafReader {
 
@@ -307,69 +301,6 @@ public abstract class FilterLeafReader extends LeafReader {
     in.registerParentReader(this);
   }
 
-  /**
-   * A CoreClosedListener wrapper that adjusts the core cache key that
-   * the wrapper is called with. This is useful if the core cache key
-   * of a reader is different from the key of the wrapped reader.
-   */
-  private static class CoreClosedListenerWrapper implements CoreClosedListener {
-
-    public static CoreClosedListener wrap(CoreClosedListener listener, Object thisCoreKey, Object inCoreKey) {
-      if (thisCoreKey == inCoreKey) {
-        // this reader has the same core cache key as its parent, nothing to do
-        return listener;
-      } else {
-        // we don't have the same cache key as the wrapped reader, we need to wrap
-        // the listener to call it with the correct cache key
-        return new CoreClosedListenerWrapper(listener, thisCoreKey, inCoreKey);
-      }
-    }
-
-    private final CoreClosedListener in;
-    private final Object thisCoreKey;
-    private final Object inCoreKey;
-
-    private CoreClosedListenerWrapper(CoreClosedListener in, Object thisCoreKey, Object inCoreKey) {
-      this.in = in;
-      this.thisCoreKey = thisCoreKey;
-      this.inCoreKey = inCoreKey;
-    }
-
-    @Override
-    public void onClose(Object ownerCoreCacheKey) throws IOException {
-      assert inCoreKey == ownerCoreCacheKey;
-      in.onClose(thisCoreKey);
-    }
-
-    // NOTE: equals/hashcore are important for removeCoreClosedListener to work
-    // correctly
-
-    @Override
-    public boolean equals(Object obj) {
-      if (obj == null || obj.getClass() != CoreClosedListenerWrapper.class) {
-        return false;
-      }
-      CoreClosedListenerWrapper that = (CoreClosedListenerWrapper) obj;
-      return in.equals(that.in) && thisCoreKey == that.thisCoreKey;
-    }
-
-    @Override
-    public int hashCode() {
-      return Objects.hash(getClass(), in, thisCoreKey);
-    }
-
-  }
-
-  @Override
-  public void addCoreClosedListener(final CoreClosedListener listener) {
-    in.addCoreClosedListener(CoreClosedListenerWrapper.wrap(listener, getCoreCacheKey(), in.getCoreCacheKey()));
-  }
-
-  @Override
-  public void removeCoreClosedListener(CoreClosedListener listener) {
-    in.removeCoreClosedListener(CoreClosedListenerWrapper.wrap(listener, getCoreCacheKey(), in.getCoreCacheKey()));
-  }
-
   @Override
   public Bits getLiveDocs() {
     ensureOpen();

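As the updated NOTE suggests, a wrapper that changes neither content nor live docs can simply delegate both helpers. A minimal sketch, assuming a purely pass-through subclass (the class name is hypothetical):

    import org.apache.lucene.index.FilterLeafReader;
    import org.apache.lucene.index.LeafReader;

    // Hypothetical transparent wrapper: it alters neither postings nor
    // live docs, so it may expose the wrapped reader's cache entries by
    // delegating both helpers.
    class TransparentLeafReader extends FilterLeafReader {
      TransparentLeafReader(LeafReader in) {
        super(in);
      }

      @Override
      public CacheHelper getCoreCacheHelper() {
        return in.getCoreCacheHelper();
      }

      @Override
      public CacheHelper getReaderCacheHelper() {
        return in.getReaderCacheHelper();
      }
    }
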
http://git-wip-us.apache.org/repos/asf/lucene-solr/blob/df6f8307/lucene/core/src/java/org/apache/lucene/index/IndexReader.java
----------------------------------------------------------------------
diff --git a/lucene/core/src/java/org/apache/lucene/index/IndexReader.java b/lucene/core/src/java/org/apache/lucene/index/IndexReader.java
index 976f548..eb3a6db 100644
--- a/lucene/core/src/java/org/apache/lucene/index/IndexReader.java
+++ b/lucene/core/src/java/org/apache/lucene/index/IndexReader.java
@@ -20,7 +20,6 @@ package org.apache.lucene.index;
 import java.io.Closeable;
 import java.io.IOException;
 import java.util.Collections;
-import java.util.LinkedHashSet;
 import java.util.List;
 import java.util.Set;
 import java.util.WeakHashMap;
@@ -88,42 +87,48 @@ public abstract class IndexReader implements Closeable {
     if (!(this instanceof CompositeReader || this instanceof LeafReader))
       throw new Error("IndexReader should never be directly extended, subclass LeafReader or CompositeReader instead.");
   }
-  
+
   /**
-   * A custom listener that's invoked when the IndexReader
-   * is closed.
-   *
+   * A utility class that provides hooks to help build a cache based on
+   * the data contained in this index.
    * @lucene.experimental
    */
-  public static interface ReaderClosedListener {
-    /** Invoked when the {@link IndexReader} is closed. */
-    public void onClose(IndexReader reader) throws IOException;
-  }
+  public static interface CacheHelper {
 
-  private final Set<ReaderClosedListener> readerClosedListeners = 
-      Collections.synchronizedSet(new LinkedHashSet<ReaderClosedListener>());
+    /**
+     * Get a key that the resource can be cached on. The returned key can be
+     * compared using identity, i.e. {@link Object#equals} is implemented as
+     * {@code ==} and {@link Object#hashCode} is implemented as
+     * {@link System#identityHashCode}.
+     */
+    CacheKey getKey();
 
-  private final Set<IndexReader> parentReaders = 
-      Collections.synchronizedSet(Collections.newSetFromMap(new WeakHashMap<IndexReader,Boolean>()));
+    /**
+     * Add a {@link ClosedListener} which will be called when the resource
+     * guarded by {@link #getKey()} is closed.
+     */
+    void addClosedListener(ClosedListener listener);
 
-  /** Expert: adds a {@link ReaderClosedListener}.  The
-   * provided listener will be invoked when this reader is closed.
-   * At this point, it is safe for apps to evict this reader from
-   * any caches keyed on {@link #getCombinedCoreAndDeletesKey()}.
-   *
-   * @lucene.experimental */
-  public final void addReaderClosedListener(ReaderClosedListener listener) {
-    ensureOpen();
-    readerClosedListeners.add(listener);
   }
 
-  /** Expert: remove a previously added {@link ReaderClosedListener}.
-   *
-   * @lucene.experimental */
-  public final void removeReaderClosedListener(ReaderClosedListener listener) {
-    ensureOpen();
-    readerClosedListeners.remove(listener);
+  /** A cache key identifying a resource that is being cached on. */
+  public static final class CacheKey {
+    CacheKey() {} // only instantiable by core impls
+  }
+
+  /**
+   * A listener that is called when a resource gets closed.
+   * @lucene.experimental
+   */
+  @FunctionalInterface
+  public static interface ClosedListener {
+    /** Invoked when the resource (segment core, or index reader) that is
+     *  being cached on is closed. */
+    void onClose(CacheKey key) throws IOException;
   }
+
+  private final Set<IndexReader> parentReaders = 
+      Collections.synchronizedSet(Collections.newSetFromMap(new WeakHashMap<IndexReader,Boolean>()));
   
   /** Expert: This method is called by {@code IndexReader}s which wrap other readers
    * (e.g. {@link CompositeReader} or {@link FilterLeafReader}) to register the parent
@@ -136,21 +141,10 @@ public abstract class IndexReader implements Closeable {
     parentReaders.add(reader);
   }
 
-  private void notifyReaderClosedListeners(Throwable th) throws IOException {
-    synchronized(readerClosedListeners) {
-      for(ReaderClosedListener listener : readerClosedListeners) {
-        try {
-          listener.onClose(this);
-        } catch (Throwable t) {
-          if (th == null) {
-            th = t;
-          } else {
-            th.addSuppressed(t);
-          }
-        }
-      }
-      IOUtils.reThrow(th);
-    }
+  // overridden by StandardDirectoryReader and SegmentReader
+  void notifyReaderClosedListeners(Throwable th) throws IOException {
+    // nothing to notify in the base impl, just rethrow
+    IOUtils.reThrow(th);
   }
 
   private void reportCloseToParentReaders() {
@@ -279,10 +273,8 @@ public abstract class IndexReader implements Closeable {
   }
   
   /** {@inheritDoc}
-   * <p>For caching purposes, {@code IndexReader} subclasses are not allowed
+   * <p>{@code IndexReader} subclasses are not allowed
    * to implement equals/hashCode, so methods are declared final.
-   * To lookup instances from caches use {@link #getCoreCacheKey} and 
-   * {@link #getCombinedCoreAndDeletesKey}.
    */
   @Override
   public final boolean equals(Object obj) {
@@ -290,10 +282,8 @@ public abstract class IndexReader implements Closeable {
   }
   
   /** {@inheritDoc}
-   * <p>For caching purposes, {@code IndexReader} subclasses are not allowed
+   * <p>{@code IndexReader} subclasses are not allowed
    * to implement equals/hashCode, so methods are declared final.
-   * To lookup instances from caches use {@link #getCoreCacheKey} and 
-   * {@link #getCombinedCoreAndDeletesKey}.
    */
   @Override
   public final int hashCode() {
@@ -436,24 +426,17 @@ public abstract class IndexReader implements Closeable {
     return getContext().leaves();
   }
 
-  /** Expert: Returns a key for this IndexReader, so CachingWrapperFilter can find
-   * it again.
-   * This key must not have equals()/hashCode() methods, so &quot;equals&quot; means &quot;identical&quot;. */
-  public Object getCoreCacheKey() {
-    // Don't call ensureOpen since FC calls this (to evict)
-    // on close
-    return this;
-  }
+  /**
+   * Optional method: Return a {@link CacheHelper} that can be used to cache
+   * based on the content of this reader. Two readers that have different data
+   * or different sets of deleted documents will be considered different.
+   * <p>A return value of {@code null} indicates that this reader is not suited
+   * for caching, which is typically the case for short-lived wrappers that
+   * alter the content of the wrapped reader.
+   * @lucene.experimental
+   */
+  public abstract CacheHelper getReaderCacheHelper();
 
-  /** Expert: Returns a key for this IndexReader that also includes deletions,
-   * so CachingWrapperFilter can find it again.
-   * This key must not have equals()/hashCode() methods, so &quot;equals&quot; means &quot;identical&quot;. */
-  public Object getCombinedCoreAndDeletesKey() {
-    // Don't call ensureOpen since FC calls this (to evict)
-    // on close
-    return this;
-  }
-  
   /** Returns the number of documents containing the 
    * <code>term</code>.  This method returns 0 if the term or
    * field does not exist.  This method does not take into

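For consumers, getKey() plus addClosedListener() are the whole contract: key a map on the CacheKey (identity semantics) and evict from the listener. A minimal sketch of such a consumer; the class and method names are illustrative, not part of this change:

    import java.util.Map;
    import java.util.concurrent.ConcurrentHashMap;
    import java.util.function.Function;

    import org.apache.lucene.index.IndexReader;

    // Illustrative per-reader cache: values are keyed on CacheKey identity
    // and evicted by a ClosedListener once the resource is closed.
    class PerReaderCache<V> {
      private final Map<IndexReader.CacheKey, V> cache = new ConcurrentHashMap<>();

      V get(IndexReader reader, Function<IndexReader, V> loader) {
        IndexReader.CacheHelper helper = reader.getReaderCacheHelper();
        if (helper == null) {
          // reader not suited for caching: compute without caching
          return loader.apply(reader);
        }
        IndexReader.CacheKey key = helper.getKey();
        V value = cache.get(key);
        if (value == null) {
          value = loader.apply(reader);
          V previous = cache.putIfAbsent(key, value);
          if (previous == null) {
            // first entry for this key: evict once the reader is closed
            helper.addClosedListener(cache::remove);
          } else {
            value = previous;
          }
        }
        return value;
      }
    }
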
http://git-wip-us.apache.org/repos/asf/lucene-solr/blob/df6f8307/lucene/core/src/java/org/apache/lucene/index/LeafReader.java
----------------------------------------------------------------------
diff --git a/lucene/core/src/java/org/apache/lucene/index/LeafReader.java b/lucene/core/src/java/org/apache/lucene/index/LeafReader.java
index 73394f2..13c8646 100644
--- a/lucene/core/src/java/org/apache/lucene/index/LeafReader.java
+++ b/lucene/core/src/java/org/apache/lucene/index/LeafReader.java
@@ -18,7 +18,7 @@ package org.apache.lucene.index;
 
 import java.io.IOException;
 
-import org.apache.lucene.index.IndexReader.ReaderClosedListener;
+import org.apache.lucene.index.IndexReader.CacheHelper;
 import org.apache.lucene.search.Sort;
 import org.apache.lucene.util.Bits;
 
@@ -61,80 +61,18 @@ public abstract class LeafReader extends IndexReader {
   }
 
   /**
-   * Called when the shared core for this {@link LeafReader}
-   * is closed.
-   * <p>
-   * If this {@link LeafReader} impl has the ability to share
-   * resources across instances that might only vary through
-   * deleted documents and doc values updates, then this listener
-   * will only be called when the shared core is closed.
-   * Otherwise, this listener will be called when this reader is
-   * closed.</p>
-   * <p>
-   * This is typically useful to manage per-segment caches: when
-   * the listener is called, it is safe to evict this reader from
-   * any caches keyed on {@link #getCoreCacheKey}.</p>
-   *
+   * Optional method: Return a {@link CacheHelper} that can be used to cache
+   * based on the content of this leaf regardless of deletions. Two readers
+   * that have the same data but different sets of deleted documents or doc
+   * values updates may be considered equal. Consider using
+   * {@link #getReaderCacheHelper} if you need deletions or dv updates to be
+   * taken into account.
+   * <p>A return value of {@code null} indicates that this reader is not suited
+   * for caching, which is typically the case for short-lived wrappers that
+   * alter the content of the wrapped leaf reader.
    * @lucene.experimental
    */
-  public static interface CoreClosedListener {
-    /** Invoked when the shared core of the original {@code
-     *  SegmentReader} has closed. The provided {@code
-     *  ownerCoreCacheKey} will be the same key as the one
-     *  returned by {@link LeafReader#getCoreCacheKey()}. */
-    void onClose(Object ownerCoreCacheKey) throws IOException;
-  }
-
-  private static class CoreClosedListenerWrapper implements ReaderClosedListener {
-
-    private final CoreClosedListener listener;
-
-    CoreClosedListenerWrapper(CoreClosedListener listener) {
-      this.listener = listener;
-    }
-
-    @Override
-    public void onClose(IndexReader reader) throws IOException {
-      listener.onClose(reader.getCoreCacheKey());
-    }
-
-    @Override
-    public int hashCode() {
-      return listener.hashCode();
-    }
-
-    @Override
-    public boolean equals(Object other) {
-      if (!(other instanceof CoreClosedListenerWrapper)) {
-        return false;
-      }
-      return listener.equals(((CoreClosedListenerWrapper) other).listener);
-    }
-
-  }
-
-  /** Add a {@link CoreClosedListener} as a {@link ReaderClosedListener}. This
-   * method is typically useful for {@link LeafReader} implementations that
-   * don't have the concept of a core that is shared across several
-   * {@link LeafReader} instances in which case the {@link CoreClosedListener}
-   * is called when closing the reader. */
-  protected static void addCoreClosedListenerAsReaderClosedListener(IndexReader reader, CoreClosedListener listener) {
-    reader.addReaderClosedListener(new CoreClosedListenerWrapper(listener));
-  }
-
-  /** Remove a {@link CoreClosedListener} which has been added with
-   * {@link #addCoreClosedListenerAsReaderClosedListener(IndexReader, CoreClosedListener)}. */
-  protected static void removeCoreClosedListenerAsReaderClosedListener(IndexReader reader, CoreClosedListener listener) {
-    reader.removeReaderClosedListener(new CoreClosedListenerWrapper(listener));
-  }
-
-  /** Expert: adds a CoreClosedListener to this reader's shared core
-   *  @lucene.experimental */
-  public abstract void addCoreClosedListener(CoreClosedListener listener);
-
-  /** Expert: removes a CoreClosedListener from this reader's shared core
-   *  @lucene.experimental */
-  public abstract void removeCoreClosedListener(CoreClosedListener listener);
+  public abstract CacheHelper getCoreCacheHelper();
 
   /**
    * Returns {@link Fields} for this reader.

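In practice the two helpers split caches into two families: values derived only from the segment core (terms, postings, norms) can be keyed on getCoreCacheHelper() and survive reopens that merely change deletes, while anything that reads live docs or updated doc values must key on getReaderCacheHelper(). A small illustrative utility (names are hypothetical) making the choice explicit:

    import org.apache.lucene.index.IndexReader.CacheHelper;
    import org.apache.lucene.index.IndexReader.CacheKey;
    import org.apache.lucene.index.LeafReader;

    // Illustrative: pick the helper that matches what the cached value
    // depends on; a null key means "do not cache this leaf".
    final class CacheKeys {
      private CacheKeys() {}

      // for values derived from postings/terms only (deletes ignored)
      static CacheKey coreKey(LeafReader leaf) {
        CacheHelper helper = leaf.getCoreCacheHelper();
        return helper == null ? null : helper.getKey();
      }

      // for values that depend on live docs or doc-values updates
      static CacheKey readerKey(LeafReader leaf) {
        CacheHelper helper = leaf.getReaderCacheHelper();
        return helper == null ? null : helper.getKey();
      }
    }
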
http://git-wip-us.apache.org/repos/asf/lucene-solr/blob/df6f8307/lucene/core/src/java/org/apache/lucene/index/MergeReaderWrapper.java
----------------------------------------------------------------------
diff --git a/lucene/core/src/java/org/apache/lucene/index/MergeReaderWrapper.java b/lucene/core/src/java/org/apache/lucene/index/MergeReaderWrapper.java
index 7eb90df..fffb693 100644
--- a/lucene/core/src/java/org/apache/lucene/index/MergeReaderWrapper.java
+++ b/lucene/core/src/java/org/apache/lucene/index/MergeReaderWrapper.java
@@ -71,16 +71,6 @@ class MergeReaderWrapper extends LeafReader {
   }
 
   @Override
-  public void addCoreClosedListener(CoreClosedListener listener) {
-    in.addCoreClosedListener(listener);
-  }
-
-  @Override
-  public void removeCoreClosedListener(CoreClosedListener listener) {
-    in.removeCoreClosedListener(listener);
-  }
-
-  @Override
   public Fields fields() throws IOException {
     return fields;
   }
@@ -224,15 +214,15 @@ class MergeReaderWrapper extends LeafReader {
   }
 
   @Override
-  public Object getCoreCacheKey() {
-    return in.getCoreCacheKey();
+  public CacheHelper getCoreCacheHelper() {
+    return in.getCoreCacheHelper();
   }
 
   @Override
-  public Object getCombinedCoreAndDeletesKey() {
-    return in.getCombinedCoreAndDeletesKey();
+  public CacheHelper getReaderCacheHelper() {
+    return in.getReaderCacheHelper();
   }
-  
+
   private void checkBounds(int docID) {
     if (docID < 0 || docID >= maxDoc()) {       
       throw new IndexOutOfBoundsException("docID must be >= 0 and < maxDoc=" + maxDoc() + " (got docID=" + docID + ")");

http://git-wip-us.apache.org/repos/asf/lucene-solr/blob/df6f8307/lucene/core/src/java/org/apache/lucene/index/MultiDocValues.java
----------------------------------------------------------------------
diff --git a/lucene/core/src/java/org/apache/lucene/index/MultiDocValues.java b/lucene/core/src/java/org/apache/lucene/index/MultiDocValues.java
index 3970e0a..88dd6a1 100644
--- a/lucene/core/src/java/org/apache/lucene/index/MultiDocValues.java
+++ b/lucene/core/src/java/org/apache/lucene/index/MultiDocValues.java
@@ -598,7 +598,9 @@ public class MultiDocValues {
     if (anyReal == false) {
       return null;
     } else {
-      OrdinalMap mapping = OrdinalMap.build(r.getCoreCacheKey(), values, PackedInts.DEFAULT);
+      IndexReader.CacheHelper cacheHelper = r.getReaderCacheHelper();
+      IndexReader.CacheKey owner = cacheHelper == null ? null : cacheHelper.getKey();
+      OrdinalMap mapping = OrdinalMap.build(owner, values, PackedInts.DEFAULT);
       return new MultiSortedDocValues(values, starts, mapping, totalCost);
     }
   }
@@ -640,7 +642,9 @@ public class MultiDocValues {
     if (anyReal == false) {
       return null;
     } else {
-      OrdinalMap mapping = OrdinalMap.build(r.getCoreCacheKey(), values, PackedInts.DEFAULT);
+      IndexReader.CacheHelper cacheHelper = r.getReaderCacheHelper();
+      IndexReader.CacheKey owner = cacheHelper == null ? null : cacheHelper.getKey();
+      OrdinalMap mapping = OrdinalMap.build(owner, values, PackedInts.DEFAULT);
       return new MultiSortedSetDocValues(values, starts, mapping, totalCost);
     }
   }
@@ -710,9 +714,9 @@ public class MultiDocValues {
     /**
      * Create an ordinal map that uses the number of unique values of each
      * {@link SortedDocValues} instance as a weight.
-     * @see #build(Object, TermsEnum[], long[], float)
+     * @see #build(IndexReader.CacheKey, TermsEnum[], long[], float)
      */
-    public static OrdinalMap build(Object owner, SortedDocValues[] values, float acceptableOverheadRatio) throws IOException {
+    public static OrdinalMap build(IndexReader.CacheKey owner, SortedDocValues[] values, float acceptableOverheadRatio) throws IOException {
       final TermsEnum[] subs = new TermsEnum[values.length];
       final long[] weights = new long[values.length];
       for (int i = 0; i < values.length; ++i) {
@@ -725,9 +729,9 @@ public class MultiDocValues {
     /**
      * Create an ordinal map that uses the number of unique values of each
      * {@link SortedSetDocValues} instance as a weight.
-     * @see #build(Object, TermsEnum[], long[], float)
+     * @see #build(IndexReader.CacheKey, TermsEnum[], long[], float)
      */
-    public static OrdinalMap build(Object owner, SortedSetDocValues[] values, float acceptableOverheadRatio) throws IOException {
+    public static OrdinalMap build(IndexReader.CacheKey owner, SortedSetDocValues[] values, float acceptableOverheadRatio) throws IOException {
       final TermsEnum[] subs = new TermsEnum[values.length];
       final long[] weights = new long[values.length];
       for (int i = 0; i < values.length; ++i) {
@@ -748,7 +752,7 @@ public class MultiDocValues {
      *             to the other subs
      * @throws IOException if an I/O error occurred.
      */
-    public static OrdinalMap build(Object owner, TermsEnum subs[], long[] weights, float acceptableOverheadRatio) throws IOException {
+    public static OrdinalMap build(IndexReader.CacheKey owner, TermsEnum subs[], long[] weights, float acceptableOverheadRatio) throws IOException {
       if (subs.length != weights.length) {
         throw new IllegalArgumentException("subs and weights must have the same length");
       }
@@ -761,7 +765,7 @@ public class MultiDocValues {
     private static final long BASE_RAM_BYTES_USED = RamUsageEstimator.shallowSizeOfInstance(OrdinalMap.class);
 
     /** Cache key of whoever asked for this awful thing */
-    public final Object owner;
+    public final IndexReader.CacheKey owner;
     // globalOrd -> (globalOrd - segmentOrd) where segmentOrd is the ordinal in the first segment that contains this term
     final PackedLongValues globalOrdDeltas;
     // globalOrd -> first segment container
@@ -773,7 +777,7 @@ public class MultiDocValues {
     // ram usage
     final long ramBytesUsed;
     
-    OrdinalMap(Object owner, TermsEnum subs[], SegmentMap segmentMap, float acceptableOverheadRatio) throws IOException {
+    OrdinalMap(IndexReader.CacheKey owner, TermsEnum subs[], SegmentMap segmentMap, float acceptableOverheadRatio) throws IOException {
       // create the ordinal mappings by pulling a termsenum over each sub's 
       // unique terms, and walking a multitermsenum over those
       this.owner = owner;

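Callers of OrdinalMap.build now derive the owner from the reader's cache helper rather than passing the old untyped core key; a null helper simply yields a null owner, as in the hunks above. A sketch of the new calling convention (the wrapper class and method are hypothetical):

    import java.io.IOException;

    import org.apache.lucene.index.IndexReader;
    import org.apache.lucene.index.MultiDocValues.OrdinalMap;
    import org.apache.lucene.index.SortedDocValues;
    import org.apache.lucene.util.packed.PackedInts;

    final class OrdinalMaps {
      private OrdinalMaps() {}

      // The owner is now a typed CacheKey, or null when the reader is
      // not suitable for caching.
      static OrdinalMap build(IndexReader reader, SortedDocValues[] perLeafValues) throws IOException {
        IndexReader.CacheHelper helper = reader.getReaderCacheHelper();
        IndexReader.CacheKey owner = helper == null ? null : helper.getKey();
        return OrdinalMap.build(owner, perLeafValues, PackedInts.DEFAULT);
      }
    }
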
http://git-wip-us.apache.org/repos/asf/lucene-solr/blob/df6f8307/lucene/core/src/java/org/apache/lucene/index/MultiReader.java
----------------------------------------------------------------------
diff --git a/lucene/core/src/java/org/apache/lucene/index/MultiReader.java b/lucene/core/src/java/org/apache/lucene/index/MultiReader.java
index 8f1bb66..4d42382 100644
--- a/lucene/core/src/java/org/apache/lucene/index/MultiReader.java
+++ b/lucene/core/src/java/org/apache/lucene/index/MultiReader.java
@@ -66,6 +66,17 @@ public class MultiReader extends BaseCompositeReader<IndexReader> {
   }
 
   @Override
+  public CacheHelper getReaderCacheHelper() {
+    // MultiReader instances can be short-lived, which would make caching trappy
+    // so we do not cache on them, unless they wrap a single reader in which
+    // case we delegate
+    if (getSequentialSubReaders().size() == 1) {
+      return getSequentialSubReaders().get(0).getReaderCacheHelper();
+    }
+    return null;
+  }
+
+  @Override
   protected synchronized void doClose() throws IOException {
     IOException ioe = null;
     for (final IndexReader r : getSequentialSubReaders()) {

http://git-wip-us.apache.org/repos/asf/lucene-solr/blob/df6f8307/lucene/core/src/java/org/apache/lucene/index/ParallelCompositeReader.java
----------------------------------------------------------------------
diff --git a/lucene/core/src/java/org/apache/lucene/index/ParallelCompositeReader.java b/lucene/core/src/java/org/apache/lucene/index/ParallelCompositeReader.java
index dd82976..4fc8a20 100644
--- a/lucene/core/src/java/org/apache/lucene/index/ParallelCompositeReader.java
+++ b/lucene/core/src/java/org/apache/lucene/index/ParallelCompositeReader.java
@@ -51,6 +51,7 @@ public class ParallelCompositeReader extends BaseCompositeReader<LeafReader> {
   private final boolean closeSubReaders;
   private final Set<IndexReader> completeReaderSet =
     Collections.newSetFromMap(new IdentityHashMap<IndexReader,Boolean>());
+  private final CacheHelper cacheHelper;
 
   /** Create a ParallelCompositeReader based on the provided
    *  readers; auto-closes the given readers on {@link #close()}. */
@@ -80,6 +81,14 @@ public class ParallelCompositeReader extends BaseCompositeReader<LeafReader> {
     }
     // finally add our own synthetic readers, so we close or decRef them, too (it does not matter what we do)
     completeReaderSet.addAll(getSequentialSubReaders());
+    // ParallelReader instances can be short-lived, which would make caching trappy
+    // so we do not cache on them, unless they wrap a single reader in which
+    // case we delegate
+    if (readers.length == 1 && storedFieldReaders.length == 1 && readers[0] == storedFieldReaders[0]) {
+      cacheHelper = readers[0].getReaderCacheHelper();
+    } else {
+      cacheHelper = null;
+    }
   }
 
   private static LeafReader[] prepareLeafReaders(CompositeReader[] readers, CompositeReader[] storedFieldsReaders) throws IOException {
@@ -142,7 +151,12 @@ public class ParallelCompositeReader extends BaseCompositeReader<LeafReader> {
       }
     }    
   }
-  
+
+  @Override
+  public CacheHelper getReaderCacheHelper() {
+    return cacheHelper;
+  }
+
   @Override
   protected synchronized void doClose() throws IOException {
     IOException ioe = null;

http://git-wip-us.apache.org/repos/asf/lucene-solr/blob/df6f8307/lucene/core/src/java/org/apache/lucene/index/ParallelLeafReader.java
----------------------------------------------------------------------
diff --git a/lucene/core/src/java/org/apache/lucene/index/ParallelLeafReader.java b/lucene/core/src/java/org/apache/lucene/index/ParallelLeafReader.java
index 60886ea..c67d07b 100644
--- a/lucene/core/src/java/org/apache/lucene/index/ParallelLeafReader.java
+++ b/lucene/core/src/java/org/apache/lucene/index/ParallelLeafReader.java
@@ -159,16 +159,6 @@ public class ParallelLeafReader extends LeafReader {
     return buffer.append(')').toString();
   }
 
-  @Override
-  public void addCoreClosedListener(CoreClosedListener listener) {
-    addCoreClosedListenerAsReaderClosedListener(this, listener);
-  }
-
-  @Override
-  public void removeCoreClosedListener(CoreClosedListener listener) {
-    removeCoreClosedListenerAsReaderClosedListener(this, listener);
-  }
-
   // Single instance of this, per ParallelReader instance
   private final class ParallelFields extends Fields {
     final Map<String,Terms> fields = new TreeMap<>();
@@ -242,6 +232,32 @@ public class ParallelLeafReader extends LeafReader {
   }
   
   @Override
+  public CacheHelper getCoreCacheHelper() {
+    // ParallelReader instances can be short-lived, which would make caching trappy
+    // so we do not cache on them, unless they wrap a single reader in which
+    // case we delegate
+    if (parallelReaders.length == 1
+        && storedFieldsReaders.length == 1
+        && parallelReaders[0] == storedFieldsReaders[0]) {
+      return parallelReaders[0].getCoreCacheHelper();
+    }
+    return null;
+  }
+
+  @Override
+  public CacheHelper getReaderCacheHelper() {
+    // ParallelReader instances can be short-lived, which would make caching trappy
+    // so we do not cache on them, unless they wrap a single reader in which
+    // case we delegate
+    if (parallelReaders.length == 1
+        && storedFieldsReaders.length == 1
+        && parallelReaders[0] == storedFieldsReaders[0]) {
+      return parallelReaders[0].getReaderCacheHelper();
+    }
+    return null;
+  }
+
+  @Override
   public Fields getTermVectors(int docID) throws IOException {
     ensureOpen();
     ParallelFields fields = null;

http://git-wip-us.apache.org/repos/asf/lucene-solr/blob/df6f8307/lucene/core/src/java/org/apache/lucene/index/SegmentCoreReaders.java
----------------------------------------------------------------------
diff --git a/lucene/core/src/java/org/apache/lucene/index/SegmentCoreReaders.java b/lucene/core/src/java/org/apache/lucene/index/SegmentCoreReaders.java
index 270a2d5..99e503b 100644
--- a/lucene/core/src/java/org/apache/lucene/index/SegmentCoreReaders.java
+++ b/lucene/core/src/java/org/apache/lucene/index/SegmentCoreReaders.java
@@ -33,7 +33,8 @@ import org.apache.lucene.codecs.PointsReader;
 import org.apache.lucene.codecs.PostingsFormat;
 import org.apache.lucene.codecs.StoredFieldsReader;
 import org.apache.lucene.codecs.TermVectorsReader;
-import org.apache.lucene.index.LeafReader.CoreClosedListener;
+import org.apache.lucene.index.IndexReader.CacheKey;
+import org.apache.lucene.index.IndexReader.ClosedListener;
 import org.apache.lucene.store.AlreadyClosedException;
 import org.apache.lucene.store.Directory;
 import org.apache.lucene.store.IOContext;
@@ -84,8 +85,8 @@ final class SegmentCoreReaders {
     }
   };
 
-  private final Set<CoreClosedListener> coreClosedListeners = 
-      Collections.synchronizedSet(new LinkedHashSet<CoreClosedListener>());
+  private final Set<IndexReader.ClosedListener> coreClosedListeners = 
+      Collections.synchronizedSet(new LinkedHashSet<IndexReader.ClosedListener>());
   
   SegmentCoreReaders(Directory dir, SegmentCommitInfo si, IOContext context) throws IOException {
 
@@ -175,14 +176,32 @@ final class SegmentCoreReaders {
       }
     }
   }
-  
+
+  private final IndexReader.CacheHelper cacheHelper = new IndexReader.CacheHelper() {
+    private final IndexReader.CacheKey cacheKey = new IndexReader.CacheKey();
+
+    @Override
+    public CacheKey getKey() {
+      return cacheKey;
+    }
+
+    @Override
+    public void addClosedListener(ClosedListener listener) {
+      coreClosedListeners.add(listener);
+    }
+  };
+
+  IndexReader.CacheHelper getCacheHelper() {
+    return cacheHelper;
+  }
+
   private void notifyCoreClosedListeners(Throwable th) throws IOException {
     synchronized(coreClosedListeners) {
-      for (CoreClosedListener listener : coreClosedListeners) {
+      for (IndexReader.ClosedListener listener : coreClosedListeners) {
         // SegmentReader exposes our cacheHelper's key as its
         // core cache key:
         try {
-          listener.onClose(this);
+          listener.onClose(cacheHelper.getKey());
         } catch (Throwable t) {
           if (th == null) {
             th = t;
@@ -195,14 +214,6 @@ final class SegmentCoreReaders {
     }
   }
 
-  void addCoreClosedListener(CoreClosedListener listener) {
-    coreClosedListeners.add(listener);
-  }
-  
-  void removeCoreClosedListener(CoreClosedListener listener) {
-    coreClosedListeners.remove(listener);
-  }
-
   @Override
   public String toString() {
     return "SegmentCoreReader(" + segment + ")";

http://git-wip-us.apache.org/repos/asf/lucene-solr/blob/df6f8307/lucene/core/src/java/org/apache/lucene/index/SegmentReader.java
----------------------------------------------------------------------
diff --git a/lucene/core/src/java/org/apache/lucene/index/SegmentReader.java b/lucene/core/src/java/org/apache/lucene/index/SegmentReader.java
index b01f0b8..5dbc492 100644
--- a/lucene/core/src/java/org/apache/lucene/index/SegmentReader.java
+++ b/lucene/core/src/java/org/apache/lucene/index/SegmentReader.java
@@ -19,6 +19,8 @@ package org.apache.lucene.index;
 
 import java.io.IOException;
 import java.util.Collections;
+import java.util.Set;
+import java.util.concurrent.CopyOnWriteArraySet;
 
 import org.apache.lucene.codecs.Codec;
 import org.apache.lucene.codecs.DocValuesProducer;
@@ -32,6 +34,7 @@ import org.apache.lucene.search.Sort;
 import org.apache.lucene.store.Directory;
 import org.apache.lucene.store.IOContext;
 import org.apache.lucene.util.Bits;
+import org.apache.lucene.util.IOUtils;
 
 /**
  * IndexReader implementation over a single segment. 
@@ -282,32 +285,48 @@ public final class SegmentReader extends CodecReader {
     return si.info.dir;
   }
 
-  // This is necessary so that cloned SegmentReaders (which
-  // share the underlying postings data) will map to the
-  // same entry for CachingWrapperFilter.  See LUCENE-1579.
-  @Override
-  public Object getCoreCacheKey() {
-    // NOTE: if this ever changes, be sure to fix
-    // SegmentCoreReader.notifyCoreClosedListeners to match!
-    // Today it passes "this" as its coreCacheKey:
-    return core;
-  }
+  private final Set<ClosedListener> readerClosedListeners = new CopyOnWriteArraySet<>();
 
   @Override
-  public Object getCombinedCoreAndDeletesKey() {
-    return this;
+  void notifyReaderClosedListeners(Throwable th) throws IOException {
+    synchronized(readerClosedListeners) {
+      for(ClosedListener listener : readerClosedListeners) {
+        try {
+          listener.onClose(cacheHelper.getKey());
+        } catch (Throwable t) {
+          if (th == null) {
+            th = t;
+          } else {
+            th.addSuppressed(t);
+          }
+        }
+      }
+      IOUtils.reThrow(th);
+    }
   }
 
+  private final IndexReader.CacheHelper cacheHelper = new IndexReader.CacheHelper() {
+    private final IndexReader.CacheKey cacheKey = new IndexReader.CacheKey();
+
+    @Override
+    public CacheKey getKey() {
+      return cacheKey;
+    }
+
+    @Override
+    public void addClosedListener(ClosedListener listener) {
+      readerClosedListeners.add(listener);
+    }
+  };
+
   @Override
-  public void addCoreClosedListener(CoreClosedListener listener) {
-    ensureOpen();
-    core.addCoreClosedListener(listener);
+  public CacheHelper getReaderCacheHelper() {
+    return cacheHelper;
   }
-  
+
   @Override
-  public void removeCoreClosedListener(CoreClosedListener listener) {
-    ensureOpen();
-    core.removeCoreClosedListener(listener);
+  public CacheHelper getCoreCacheHelper() {
+    return core.getCacheHelper();
   }
 
   @Override

http://git-wip-us.apache.org/repos/asf/lucene-solr/blob/df6f8307/lucene/core/src/java/org/apache/lucene/index/SlowCodecReaderWrapper.java
----------------------------------------------------------------------
diff --git a/lucene/core/src/java/org/apache/lucene/index/SlowCodecReaderWrapper.java b/lucene/core/src/java/org/apache/lucene/index/SlowCodecReaderWrapper.java
index d5b5c33..99f35bc 100644
--- a/lucene/core/src/java/org/apache/lucene/index/SlowCodecReaderWrapper.java
+++ b/lucene/core/src/java/org/apache/lucene/index/SlowCodecReaderWrapper.java
@@ -113,13 +113,13 @@ public final class SlowCodecReaderWrapper {
         }
 
         @Override
-        public void addCoreClosedListener(CoreClosedListener listener) {
-          reader.addCoreClosedListener(listener);
+        public CacheHelper getCoreCacheHelper() {
+          return reader.getCoreCacheHelper();
         }
 
         @Override
-        public void removeCoreClosedListener(CoreClosedListener listener) {
-          reader.removeCoreClosedListener(listener);
+        public CacheHelper getReaderCacheHelper() {
+          return reader.getReaderCacheHelper();
         }
 
         @Override

http://git-wip-us.apache.org/repos/asf/lucene-solr/blob/df6f8307/lucene/core/src/java/org/apache/lucene/index/SortingLeafReader.java
----------------------------------------------------------------------
diff --git a/lucene/core/src/java/org/apache/lucene/index/SortingLeafReader.java b/lucene/core/src/java/org/apache/lucene/index/SortingLeafReader.java
index f24a4d0..321b552 100644
--- a/lucene/core/src/java/org/apache/lucene/index/SortingLeafReader.java
+++ b/lucene/core/src/java/org/apache/lucene/index/SortingLeafReader.java
@@ -1246,4 +1246,16 @@ class SortingLeafReader extends FilterLeafReader {
   public String toString() {
     return "SortingLeafReader(" + in + ")";
   }
+
+  // no caching on sorted views
+
+  @Override
+  public CacheHelper getCoreCacheHelper() {
+    return null;
+  }
+
+  @Override
+  public CacheHelper getReaderCacheHelper() {
+    return null;
+  }
 }

http://git-wip-us.apache.org/repos/asf/lucene-solr/blob/df6f8307/lucene/core/src/java/org/apache/lucene/index/StandardDirectoryReader.java
----------------------------------------------------------------------
diff --git a/lucene/core/src/java/org/apache/lucene/index/StandardDirectoryReader.java b/lucene/core/src/java/org/apache/lucene/index/StandardDirectoryReader.java
index 7ac059e..46f81af 100644
--- a/lucene/core/src/java/org/apache/lucene/index/StandardDirectoryReader.java
+++ b/lucene/core/src/java/org/apache/lucene/index/StandardDirectoryReader.java
@@ -25,6 +25,8 @@ import java.util.Collections;
 import java.util.HashMap;
 import java.util.List;
 import java.util.Map;
+import java.util.Set;
+import java.util.concurrent.CopyOnWriteArraySet;
 
 import org.apache.lucene.store.AlreadyClosedException;
 import org.apache.lucene.store.Directory;
@@ -469,4 +471,44 @@ public final class StandardDirectoryReader extends DirectoryReader {
       return reader;
     }
   }
+
+  private final Set<ClosedListener> readerClosedListeners = new CopyOnWriteArraySet<>();
+
+  private final CacheHelper cacheHelper = new CacheHelper() {
+    private final CacheKey cacheKey = new CacheKey();
+
+    @Override
+    public CacheKey getKey() {
+      return cacheKey;
+    }
+
+    @Override
+    public void addClosedListener(ClosedListener listener) {
+      readerClosedListeners.add(listener);
+    }
+
+  };
+
+  @Override
+  void notifyReaderClosedListeners(Throwable th) throws IOException {
+    synchronized(readerClosedListeners) {
+      for(ClosedListener listener : readerClosedListeners) {
+        try {
+          listener.onClose(cacheHelper.getKey());
+        } catch (Throwable t) {
+          if (th == null) {
+            th = t;
+          } else {
+            th.addSuppressed(t);
+          }
+        }
+      }
+      IOUtils.reThrow(th);
+    }
+  }
+
+  @Override
+  public CacheHelper getReaderCacheHelper() {
+    return cacheHelper;
+  }
 }

http://git-wip-us.apache.org/repos/asf/lucene-solr/blob/df6f8307/lucene/core/src/java/org/apache/lucene/search/LRUQueryCache.java
----------------------------------------------------------------------
diff --git a/lucene/core/src/java/org/apache/lucene/search/LRUQueryCache.java b/lucene/core/src/java/org/apache/lucene/search/LRUQueryCache.java
index fcdf2a5..b1ba4e4 100644
--- a/lucene/core/src/java/org/apache/lucene/search/LRUQueryCache.java
+++ b/lucene/core/src/java/org/apache/lucene/search/LRUQueryCache.java
@@ -32,6 +32,7 @@ import java.util.concurrent.atomic.AtomicBoolean;
 import java.util.concurrent.locks.ReentrantLock;
 import java.util.function.Predicate;
 
+import org.apache.lucene.index.IndexReader;
 import org.apache.lucene.index.IndexReaderContext;
 import org.apache.lucene.index.LeafReaderContext;
 import org.apache.lucene.index.ReaderUtil;
@@ -107,7 +108,7 @@ public class LRUQueryCache implements QueryCache, Accountable {
   // are only allowed to store sub-sets of the queries that are contained in
   // mostRecentlyUsedQueries. This is why write operations are performed under a lock
   private final Set<Query> mostRecentlyUsedQueries;
-  private final Map<Object, LeafCache> cache;
+  private final Map<IndexReader.CacheKey, LeafCache> cache;
   private final ReentrantLock lock;
 
   // these variables are volatile so that we do not need to sync reads
@@ -264,11 +265,11 @@ public class LRUQueryCache implements QueryCache, Accountable {
     }
   }
 
-  DocIdSet get(Query key, LeafReaderContext context) {
+  DocIdSet get(Query key, LeafReaderContext context, IndexReader.CacheHelper cacheHelper) {
     assert lock.isHeldByCurrentThread();
     assert key instanceof BoostQuery == false;
     assert key instanceof ConstantScoreQuery == false;
-    final Object readerKey = context.reader().getCoreCacheKey();
+    final IndexReader.CacheKey readerKey = cacheHelper.getKey();
     final LeafCache leafCache = cache.get(readerKey);
     if (leafCache == null) {
       onMiss(readerKey, key);
@@ -289,7 +290,7 @@ public class LRUQueryCache implements QueryCache, Accountable {
     return cached;
   }
 
-  void putIfAbsent(Query query, LeafReaderContext context, DocIdSet set) {
+  void putIfAbsent(Query query, LeafReaderContext context, DocIdSet set, IndexReader.CacheHelper cacheHelper) {
     assert query instanceof BoostQuery == false;
     assert query instanceof ConstantScoreQuery == false;
     // under a lock to make sure that mostRecentlyUsedQueries and cache remain sync'ed
@@ -301,15 +302,15 @@ public class LRUQueryCache implements QueryCache, Accountable {
       } else {
         query = singleton;
       }
-      final Object key = context.reader().getCoreCacheKey();
+      final IndexReader.CacheKey key = cacheHelper.getKey();
       LeafCache leafCache = cache.get(key);
       if (leafCache == null) {
         leafCache = new LeafCache(key);
-        final LeafCache previous = cache.put(context.reader().getCoreCacheKey(), leafCache);
+        final LeafCache previous = cache.put(key, leafCache);
         ramBytesUsed += HASHTABLE_RAM_BYTES_PER_ENTRY;
         assert previous == null;
         // we just created a new leaf cache, need to register a close listener
-        context.reader().addCoreClosedListener(this::clearCoreCacheKey);
+        cacheHelper.addClosedListener(this::clearCoreCacheKey);
       }
       leafCache.putIfAbsent(query, set);
       evictIfNecessary();
@@ -720,6 +721,14 @@ public class LRUQueryCache implements QueryCache, Accountable {
       if (used.compareAndSet(false, true)) {
         policy.onUse(getQuery());
       }
+
+      // TODO: should it be pluggable, e.g. for queries that run on doc values?
+      final IndexReader.CacheHelper cacheHelper = context.reader().getCoreCacheHelper();
+      if (cacheHelper == null) {
+        // this segment is not suitable for caching
+        return in.scorer(context);
+      }
+
       // Short-circuit: Check whether this segment is eligible for caching
       // before we take a lock because of #get
       if (shouldCache(context) == false) {
@@ -733,7 +742,7 @@ public class LRUQueryCache implements QueryCache, Accountable {
 
       DocIdSet docIdSet;
       try {
-        docIdSet = get(in.getQuery(), context);
+        docIdSet = get(in.getQuery(), context, cacheHelper);
       } finally {
         lock.unlock();
       }
@@ -741,7 +750,7 @@ public class LRUQueryCache implements QueryCache, Accountable {
       if (docIdSet == null) {
         if (policy.shouldCache(in.getQuery())) {
           docIdSet = cache(context);
-          putIfAbsent(in.getQuery(), context, docIdSet);
+          putIfAbsent(in.getQuery(), context, docIdSet, cacheHelper);
         } else {
           return in.scorer(context);
         }
@@ -764,6 +773,14 @@ public class LRUQueryCache implements QueryCache, Accountable {
       if (used.compareAndSet(false, true)) {
         policy.onUse(getQuery());
       }
+
+      // TODO: should it be pluggable, e.g. for queries that run on doc values?
+      final IndexReader.CacheHelper cacheHelper = context.reader().getCoreCacheHelper();
+      if (cacheHelper == null) {
+        // this segment is not suitable for caching
+        return in.bulkScorer(context);
+      }
+
       // Short-circuit: Check whether this segment is eligible for caching
       // before we take a lock because of #get
       if (shouldCache(context) == false) {
@@ -777,7 +794,7 @@ public class LRUQueryCache implements QueryCache, Accountable {
 
       DocIdSet docIdSet;
       try {
-        docIdSet = get(in.getQuery(), context);
+        docIdSet = get(in.getQuery(), context, cacheHelper);
       } finally {
         lock.unlock();
       }
@@ -785,7 +802,7 @@ public class LRUQueryCache implements QueryCache, Accountable {
       if (docIdSet == null) {
         if (policy.shouldCache(in.getQuery())) {
           docIdSet = cache(context);
-          putIfAbsent(in.getQuery(), context, docIdSet);
+          putIfAbsent(in.getQuery(), context, docIdSet, cacheHelper);
         } else {
           return in.bulkScorer(context);
         }

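Nothing changes in how the cache is installed; the behavioral difference is that a segment whose reader returns a null core helper now bypasses the cache rather than being cached under an unstable key. A hypothetical setup, with arbitrary sizes, for reference:

    import org.apache.lucene.index.IndexReader;
    import org.apache.lucene.search.IndexSearcher;
    import org.apache.lucene.search.LRUQueryCache;
    import org.apache.lucene.search.UsageTrackingQueryCachingPolicy;

    final class CachedSearcherFactory {
      // Hypothetical setup: wrappers that alter content should return a
      // null CacheHelper, and their segments then skip this cache.
      static IndexSearcher newSearcher(IndexReader reader) {
        IndexSearcher searcher = new IndexSearcher(reader);
        searcher.setQueryCache(new LRUQueryCache(1000, 64 * 1024 * 1024));
        searcher.setQueryCachingPolicy(new UsageTrackingQueryCachingPolicy());
        return searcher;
      }
    }
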
http://git-wip-us.apache.org/repos/asf/lucene-solr/blob/df6f8307/lucene/core/src/test/org/apache/lucene/index/TestDemoParallelLeafReader.java
----------------------------------------------------------------------
diff --git a/lucene/core/src/test/org/apache/lucene/index/TestDemoParallelLeafReader.java b/lucene/core/src/test/org/apache/lucene/index/TestDemoParallelLeafReader.java
index 7901214..34bde51 100644
--- a/lucene/core/src/test/org/apache/lucene/index/TestDemoParallelLeafReader.java
+++ b/lucene/core/src/test/org/apache/lucene/index/TestDemoParallelLeafReader.java
@@ -239,6 +239,11 @@ public class TestDemoParallelLeafReader extends LuceneTestCase {
         // throw the first exception
         IOUtils.reThrow(firstExc);
       }
+
+      @Override
+      public CacheHelper getReaderCacheHelper() {
+        return null;
+      }
     }
 
     @Override
@@ -324,7 +329,7 @@ public class TestDemoParallelLeafReader extends LuceneTestCase {
       }
     }
 
-    private class ParallelReaderClosed implements LeafReader.ReaderClosedListener {
+    private class ParallelReaderClosed implements IndexReader.ClosedListener {
       private final SegmentIDAndGen segIDGen;
       private final Directory dir;
 
@@ -334,7 +339,7 @@ public class TestDemoParallelLeafReader extends LuceneTestCase {
       }
 
       @Override
-      public void onClose(IndexReader ignored) {
+      public void onClose(IndexReader.CacheKey ignored) {
         try {
           // TODO: make this sync finer, i.e. just the segment + schemaGen
           synchronized(ReindexingReader.this) {
@@ -421,7 +426,7 @@ public class TestDemoParallelLeafReader extends LuceneTestCase {
               // the pruning may remove our directory:
               closedSegments.remove(segIDGen);
 
-              parLeafReader.addReaderClosedListener(new ParallelReaderClosed(segIDGen, dir));
+              parLeafReader.getReaderCacheHelper().addClosedListener(new ParallelReaderClosed(segIDGen, dir));
 
             } else {
               // Used only for merged segment warming:

http://git-wip-us.apache.org/repos/asf/lucene-solr/blob/df6f8307/lucene/core/src/test/org/apache/lucene/index/TestDirectoryReader.java
----------------------------------------------------------------------
diff --git a/lucene/core/src/test/org/apache/lucene/index/TestDirectoryReader.java b/lucene/core/src/test/org/apache/lucene/index/TestDirectoryReader.java
index 9ac719e..7afcf7a 100644
--- a/lucene/core/src/test/org/apache/lucene/index/TestDirectoryReader.java
+++ b/lucene/core/src/test/org/apache/lucene/index/TestDirectoryReader.java
@@ -927,14 +927,14 @@ public class TestDirectoryReader extends LuceneTestCase {
     writer.commit();
     final DirectoryReader reader = writer.getReader();
     final int[] closeCount = new int[1];
-    final IndexReader.ReaderClosedListener listener = new IndexReader.ReaderClosedListener() {
+    final IndexReader.ClosedListener listener = new IndexReader.ClosedListener() {
       @Override
-      public void onClose(IndexReader reader) {
+      public void onClose(IndexReader.CacheKey key) {
         closeCount[0]++;
       }
     };
   
-    reader.addReaderClosedListener(listener);
+    reader.getReaderCacheHelper().addClosedListener(listener);
   
     reader.close();
   
@@ -943,7 +943,7 @@ public class TestDirectoryReader extends LuceneTestCase {
     writer.close();
   
     DirectoryReader reader2 = DirectoryReader.open(dir);
-    reader2.addReaderClosedListener(listener);
+    reader2.getReaderCacheHelper().addClosedListener(listener);
   
     closeCount[0] = 0;
     reader2.close();

http://git-wip-us.apache.org/repos/asf/lucene-solr/blob/df6f8307/lucene/core/src/test/org/apache/lucene/index/TestDirectoryReaderReopen.java
----------------------------------------------------------------------
diff --git a/lucene/core/src/test/org/apache/lucene/index/TestDirectoryReaderReopen.java b/lucene/core/src/test/org/apache/lucene/index/TestDirectoryReaderReopen.java
index f415381..b38696a 100644
--- a/lucene/core/src/test/org/apache/lucene/index/TestDirectoryReaderReopen.java
+++ b/lucene/core/src/test/org/apache/lucene/index/TestDirectoryReaderReopen.java
@@ -811,7 +811,8 @@ public class TestDirectoryReaderReopen extends LuceneTestCase {
     assertEquals(1, oldest.leaves().size());
     
     // sharing same core
-    assertSame(latest.leaves().get(0).reader().getCoreCacheKey(), oldest.leaves().get(0).reader().getCoreCacheKey());
+    assertSame(latest.leaves().get(0).reader().getCoreCacheHelper().getKey(),
+        oldest.leaves().get(0).reader().getCoreCacheHelper().getKey());
     
     latest.close();
     oldest.close();
@@ -861,7 +862,8 @@ public class TestDirectoryReaderReopen extends LuceneTestCase {
     assertEquals(1, oldest.leaves().size());
     
     // sharing same core
-    assertSame(latest.leaves().get(0).reader().getCoreCacheKey(), oldest.leaves().get(0).reader().getCoreCacheKey());
+    assertSame(latest.leaves().get(0).reader().getCoreCacheHelper().getKey(),
+        oldest.leaves().get(0).reader().getCoreCacheHelper().getKey());
     
     latest.close();
     oldest.close();
@@ -901,7 +903,8 @@ public class TestDirectoryReaderReopen extends LuceneTestCase {
     assertEquals(1, oldest.leaves().size());
     
     // sharing same core
-    assertSame(latest.leaves().get(0).reader().getCoreCacheKey(), oldest.leaves().get(0).reader().getCoreCacheKey());
+    assertSame(latest.leaves().get(0).reader().getCoreCacheHelper().getKey(),
+        oldest.leaves().get(0).reader().getCoreCacheHelper().getKey());
 
     NumericDocValues values = getOnlyLeafReader(oldest).getNumericDocValues("dv");
     assertEquals(0, values.nextDoc());
@@ -948,7 +951,8 @@ public class TestDirectoryReaderReopen extends LuceneTestCase {
     assertEquals(1, oldest.leaves().size());
     
     // sharing same core
-    assertSame(latest.leaves().get(0).reader().getCoreCacheKey(), oldest.leaves().get(0).reader().getCoreCacheKey());
+    assertSame(latest.leaves().get(0).reader().getCoreCacheHelper().getKey(),
+        oldest.leaves().get(0).reader().getCoreCacheHelper().getKey());
 
     NumericDocValues values = getOnlyLeafReader(oldest).getNumericDocValues("dv");
     assertEquals(0, values.nextDoc());

http://git-wip-us.apache.org/repos/asf/lucene-solr/blob/df6f8307/lucene/core/src/test/org/apache/lucene/index/TestExitableDirectoryReader.java
----------------------------------------------------------------------
diff --git a/lucene/core/src/test/org/apache/lucene/index/TestExitableDirectoryReader.java b/lucene/core/src/test/org/apache/lucene/index/TestExitableDirectoryReader.java
index 71406c8..3f424f5 100644
--- a/lucene/core/src/test/org/apache/lucene/index/TestExitableDirectoryReader.java
+++ b/lucene/core/src/test/org/apache/lucene/index/TestExitableDirectoryReader.java
@@ -86,6 +86,16 @@ public class TestExitableDirectoryReader extends LuceneTestCase {
     public Fields fields() throws IOException {
       return new TestFields(super.fields());
     }
+
+    @Override
+    public CacheHelper getCoreCacheHelper() {
+      return in.getCoreCacheHelper();
+    }
+
+    @Override
+    public CacheHelper getReaderCacheHelper() {
+      return in.getReaderCacheHelper();
+    }
   }
 
   /**

http://git-wip-us.apache.org/repos/asf/lucene-solr/blob/df6f8307/lucene/core/src/test/org/apache/lucene/index/TestFilterDirectoryReader.java
----------------------------------------------------------------------
diff --git a/lucene/core/src/test/org/apache/lucene/index/TestFilterDirectoryReader.java b/lucene/core/src/test/org/apache/lucene/index/TestFilterDirectoryReader.java
index 4ce86e2..62a4294 100644
--- a/lucene/core/src/test/org/apache/lucene/index/TestFilterDirectoryReader.java
+++ b/lucene/core/src/test/org/apache/lucene/index/TestFilterDirectoryReader.java
@@ -49,6 +49,11 @@ public class TestFilterDirectoryReader extends LuceneTestCase {
     protected DirectoryReader doWrapDirectoryReader(DirectoryReader in) throws IOException {
       return new DummyFilterDirectoryReader(in);
     }
+
+    @Override
+    public CacheHelper getReaderCacheHelper() {
+      return in.getReaderCacheHelper();
+    }
     
   }
 

http://git-wip-us.apache.org/repos/asf/lucene-solr/blob/df6f8307/lucene/core/src/test/org/apache/lucene/index/TestFilterLeafReader.java
----------------------------------------------------------------------
diff --git a/lucene/core/src/test/org/apache/lucene/index/TestFilterLeafReader.java b/lucene/core/src/test/org/apache/lucene/index/TestFilterLeafReader.java
index e9f6fe2..79862fc 100644
--- a/lucene/core/src/test/org/apache/lucene/index/TestFilterLeafReader.java
+++ b/lucene/core/src/test/org/apache/lucene/index/TestFilterLeafReader.java
@@ -106,6 +106,16 @@ public class TestFilterLeafReader extends LuceneTestCase {
     public Fields fields() throws IOException {
       return new TestFields(super.fields());
     }
+
+    @Override
+    public CacheHelper getCoreCacheHelper() {
+      return null;
+    }
+
+    @Override
+    public CacheHelper getReaderCacheHelper() {
+      return null;
+    }
   }
     
   /**
@@ -196,7 +206,16 @@ public class TestFilterLeafReader extends LuceneTestCase {
     w.addDocument(new Document());
     DirectoryReader dr = w.getReader();
     LeafReader r = dr.leaves().get(0).reader();
-    FilterLeafReader r2 = new FilterLeafReader(r) {};
+    FilterLeafReader r2 = new FilterLeafReader(r) {
+      @Override
+      public CacheHelper getCoreCacheHelper() {
+        return in.getCoreCacheHelper();
+      }
+      @Override
+      public CacheHelper getReaderCacheHelper() {
+        return in.getReaderCacheHelper();
+      }
+    };
     assertEquals(r, r2.getDelegate());
     assertEquals(r, FilterLeafReader.unwrap(r2));
     w.close();

http://git-wip-us.apache.org/repos/asf/lucene-solr/blob/df6f8307/lucene/core/src/test/org/apache/lucene/index/TestIndexReaderClose.java
----------------------------------------------------------------------
diff --git a/lucene/core/src/test/org/apache/lucene/index/TestIndexReaderClose.java b/lucene/core/src/test/org/apache/lucene/index/TestIndexReaderClose.java
index 91dcb6e..20088a5 100644
--- a/lucene/core/src/test/org/apache/lucene/index/TestIndexReaderClose.java
+++ b/lucene/core/src/test/org/apache/lucene/index/TestIndexReaderClose.java
@@ -19,7 +19,6 @@ package org.apache.lucene.index;
 
 import java.io.IOException;
 import java.util.ArrayList;
-import java.util.Collections;
 import java.util.List;
 import java.util.concurrent.atomic.AtomicInteger;
 
@@ -47,10 +46,21 @@ public class TestIndexReaderClose extends LuceneTestCase {
       LeafReader leaf = getOnlyLeafReader(open);
       FilterLeafReader reader = new FilterLeafReader(leaf) {
         @Override
+        public CacheHelper getCoreCacheHelper() {
+          return in.getCoreCacheHelper();
+        }
+        @Override
+        public CacheHelper getReaderCacheHelper() {
+          return in.getReaderCacheHelper();
+        }
+        @Override
         protected void doClose() throws IOException {
-          super.doClose();
-          if (throwOnClose) {
-           throw new IllegalStateException("BOOM!");
+          try {
+            super.doClose();
+          } finally {
+            if (throwOnClose) {
+              throw new IllegalStateException("BOOM!");
+             }
           }
         }
       };
@@ -60,14 +70,14 @@ public class TestIndexReaderClose extends LuceneTestCase {
       for (int i = 0; i < listenerCount; i++) {
           if (rarely()) {
             faultySet = true;
-            reader.addReaderClosedListener(new FaultyListener());
+            reader.getReaderCacheHelper().addClosedListener(new FaultyListener());
           } else {
             count.incrementAndGet();
-            reader.addReaderClosedListener(new CountListener(count));
+            reader.getReaderCacheHelper().addClosedListener(new CountListener(count));
           }
       }
       if (!faultySet && !throwOnClose) {
-        reader.addReaderClosedListener(new FaultyListener());
+        reader.getReaderCacheHelper().addClosedListener(new FaultyListener());
       }
 
       IllegalStateException expected = expectThrows(IllegalStateException.class, () -> {
@@ -106,31 +116,19 @@ public class TestIndexReaderClose extends LuceneTestCase {
     w.close();
 
     final IndexReader reader = DirectoryReader.open(w.w.getDirectory());
-    // We explicitly define a different cache key
-    final Object coreCacheKey = new Object();
-    final LeafReader leafReader = new FilterLeafReader(getOnlyLeafReader(reader)) {
-      @Override
-      public Object getCoreCacheKey() {
-        return coreCacheKey;
-      }
-    };
+    final LeafReader leafReader = new AssertingLeafReader(getOnlyLeafReader(reader));
 
     final int numListeners = TestUtil.nextInt(random(), 1, 10);
-    final List<LeafReader.CoreClosedListener> listeners = new ArrayList<>();
+    final List<IndexReader.ClosedListener> listeners = new ArrayList<>();
     AtomicInteger counter = new AtomicInteger(numListeners);
-    
+
     for (int i = 0; i < numListeners; ++i) {
-      CountCoreListener listener = new CountCoreListener(counter, coreCacheKey);
+      CountCoreListener listener = new CountCoreListener(counter, leafReader.getCoreCacheHelper().getKey());
       listeners.add(listener);
-      leafReader.addCoreClosedListener(listener);
+      leafReader.getCoreCacheHelper().addClosedListener(listener);
     }
     for (int i = 0; i < 100; ++i) {
-      leafReader.addCoreClosedListener(listeners.get(random().nextInt(listeners.size())));
-    }
-    final int removed = random().nextInt(numListeners);
-    Collections.shuffle(listeners, random());
-    for (int i = 0; i < removed; ++i) {
-      leafReader.removeCoreClosedListener(listeners.get(i));
+      leafReader.getCoreCacheHelper().addClosedListener(listeners.get(random().nextInt(listeners.size())));
     }
     assertEquals(numListeners, counter.get());
     // make sure listeners are registered on the wrapped reader and that closing any of them has the same effect
@@ -139,11 +137,11 @@ public class TestIndexReaderClose extends LuceneTestCase {
     } else {
       leafReader.close();
     }
-    assertEquals(removed, counter.get());
+    assertEquals(0, counter.get());
     w.w.getDirectory().close();
   }
 
-  private static final class CountCoreListener implements LeafReader.CoreClosedListener {
+  private static final class CountCoreListener implements IndexReader.ClosedListener {
 
     private final AtomicInteger count;
     private final Object coreCacheKey;
@@ -154,14 +152,14 @@ public class TestIndexReaderClose extends LuceneTestCase {
     }
 
     @Override
-    public void onClose(Object coreCacheKey) {
+    public void onClose(IndexReader.CacheKey coreCacheKey) {
       assertSame(this.coreCacheKey, coreCacheKey);
       count.decrementAndGet();
     }
 
   }
 
-  private static final class CountListener implements IndexReader.ReaderClosedListener  {
+  private static final class CountListener implements IndexReader.ClosedListener  {
     private final AtomicInteger count;
 
     public CountListener(AtomicInteger count) {
@@ -169,15 +167,15 @@ public class TestIndexReaderClose extends LuceneTestCase {
     }
 
     @Override
-    public void onClose(IndexReader reader) {
+    public void onClose(IndexReader.CacheKey cacheKey) {
       count.decrementAndGet();
     }
   }
 
-  private static final class FaultyListener implements IndexReader.ReaderClosedListener {
+  private static final class FaultyListener implements IndexReader.ClosedListener {
 
     @Override
-    public void onClose(IndexReader reader) {
+    public void onClose(IndexReader.CacheKey cacheKey) {
       throw new IllegalStateException("GRRRRRRRRRRRR!");
     }
   }

http://git-wip-us.apache.org/repos/asf/lucene-solr/blob/df6f8307/lucene/core/src/test/org/apache/lucene/index/TestMultiTermsEnum.java
----------------------------------------------------------------------
diff --git a/lucene/core/src/test/org/apache/lucene/index/TestMultiTermsEnum.java b/lucene/core/src/test/org/apache/lucene/index/TestMultiTermsEnum.java
index ac352c1..a265c9c 100644
--- a/lucene/core/src/test/org/apache/lucene/index/TestMultiTermsEnum.java
+++ b/lucene/core/src/test/org/apache/lucene/index/TestMultiTermsEnum.java
@@ -265,5 +265,15 @@ public class TestMultiTermsEnum extends LuceneTestCase {
         delegate.close();
       }
     }
+
+    @Override
+    public CacheHelper getCoreCacheHelper() {
+      return null;
+    }
+
+    @Override
+    public CacheHelper getReaderCacheHelper() {
+      return null;
+    }
   }
 }

http://git-wip-us.apache.org/repos/asf/lucene-solr/blob/df6f8307/lucene/core/src/test/org/apache/lucene/index/TestParallelCompositeReader.java
----------------------------------------------------------------------
diff --git a/lucene/core/src/test/org/apache/lucene/index/TestParallelCompositeReader.java b/lucene/core/src/test/org/apache/lucene/index/TestParallelCompositeReader.java
index 3efdc8b..d452306 100644
--- a/lucene/core/src/test/org/apache/lucene/index/TestParallelCompositeReader.java
+++ b/lucene/core/src/test/org/apache/lucene/index/TestParallelCompositeReader.java
@@ -126,7 +126,7 @@ public class TestParallelCompositeReader extends LuceneTestCase {
     dir2.close();    
   }
   
-  private void testReaderClosedListener(boolean closeSubReaders, int wrapMultiReaderType) throws IOException {
+  private void testReaderClosedListener1(boolean closeSubReaders, int wrapMultiReaderType) throws IOException {
     final Directory dir1 = getDir1(random());
     final CompositeReader ir2, ir1 = DirectoryReader.open(dir1);
     switch (wrapMultiReaderType) {
@@ -147,18 +147,19 @@ public class TestParallelCompositeReader extends LuceneTestCase {
      new CompositeReader[] {ir2},
      new CompositeReader[] {ir2});
 
-    final int[] listenerClosedCount = new int[1];
-
     assertEquals(3, pr.leaves().size());
+    assertEquals(ir1.getReaderCacheHelper(), pr.getReaderCacheHelper());
 
+    int i = 0;
     for(LeafReaderContext cxt : pr.leaves()) {
-      cxt.reader().addReaderClosedListener(reader -> listenerClosedCount[0]++);
+      LeafReader originalLeaf = ir1.leaves().get(i++).reader();
+      assertEquals(originalLeaf.getCoreCacheHelper(), cxt.reader().getCoreCacheHelper());
+      assertEquals(originalLeaf.getReaderCacheHelper(), cxt.reader().getReaderCacheHelper());
     }
     pr.close();
     if (!closeSubReaders) {
       ir1.close();
     }
-    assertEquals(3, listenerClosedCount[0]);
     
     // We have to close the extra MultiReader, because it will not close its own subreaders:
     if (wrapMultiReaderType == 2) {
@@ -168,23 +169,11 @@ public class TestParallelCompositeReader extends LuceneTestCase {
   }
 
   public void testReaderClosedListener1() throws Exception {
-    testReaderClosedListener(false, 0);
-  }
-
-  public void testReaderClosedListener2() throws Exception {
-    testReaderClosedListener(true, 0);
-  }
-
-  public void testReaderClosedListener3() throws Exception {
-    testReaderClosedListener(false, 1);
-  }
-
-  public void testReaderClosedListener4() throws Exception {
-    testReaderClosedListener(true, 1);
-  }
-
-  public void testReaderClosedListener5() throws Exception {
-    testReaderClosedListener(false, 2);
+    testReaderClosedListener1(false, 0);
+    testReaderClosedListener1(true, 0);
+    testReaderClosedListener1(false, 1);
+    testReaderClosedListener1(true, 1);
+    testReaderClosedListener1(false, 2);
   }
 
   public void testCloseInnerReader() throws Exception {

http://git-wip-us.apache.org/repos/asf/lucene-solr/blob/df6f8307/lucene/core/src/test/org/apache/lucene/search/TermInSetQueryTest.java
----------------------------------------------------------------------
diff --git a/lucene/core/src/test/org/apache/lucene/search/TermInSetQueryTest.java b/lucene/core/src/test/org/apache/lucene/search/TermInSetQueryTest.java
index 3878d59..9b8a285 100644
--- a/lucene/core/src/test/org/apache/lucene/search/TermInSetQueryTest.java
+++ b/lucene/core/src/test/org/apache/lucene/search/TermInSetQueryTest.java
@@ -237,7 +237,17 @@ public class TermInSetQueryTest extends LuceneTestCase {
           }
         };
       }
-      
+
+      @Override
+      public CacheHelper getCoreCacheHelper() {
+        return null;
+      }
+
+      @Override
+      public CacheHelper getReaderCacheHelper() {
+        return null;
+      }
+
     }
 
     @Override
@@ -245,6 +255,11 @@ public class TermInSetQueryTest extends LuceneTestCase {
       return new TermsCountingDirectoryReaderWrapper(in, counter);
     }
 
+    @Override
+    public CacheHelper getReaderCacheHelper() {
+      return null;
+    }
+
   }
 
   public void testPullOneTermsEnum() throws Exception {

http://git-wip-us.apache.org/repos/asf/lucene-solr/blob/df6f8307/lucene/core/src/test/org/apache/lucene/search/TestLRUQueryCache.java
----------------------------------------------------------------------
diff --git a/lucene/core/src/test/org/apache/lucene/search/TestLRUQueryCache.java b/lucene/core/src/test/org/apache/lucene/search/TestLRUQueryCache.java
index 3acc3ea..91c1887 100644
--- a/lucene/core/src/test/org/apache/lucene/search/TestLRUQueryCache.java
+++ b/lucene/core/src/test/org/apache/lucene/search/TestLRUQueryCache.java
@@ -42,8 +42,11 @@ import org.apache.lucene.document.Field.Store;
 import org.apache.lucene.document.StringField;
 import org.apache.lucene.document.TextField;
 import org.apache.lucene.index.DirectoryReader;
+import org.apache.lucene.index.FilterDirectoryReader;
+import org.apache.lucene.index.FilterLeafReader;
 import org.apache.lucene.index.IndexReader;
 import org.apache.lucene.index.IndexWriterConfig;
+import org.apache.lucene.index.LeafReader;
 import org.apache.lucene.index.LeafReaderContext;
 import org.apache.lucene.index.NoMergePolicy;
 import org.apache.lucene.index.RandomIndexWriter;
@@ -607,12 +610,12 @@ public class TestLRUQueryCache extends LuceneTestCase {
     final int segmentCount2 = reader2.leaves().size();
     final IndexSearcher searcher2 = new IndexSearcher(reader2);
 
-    final Map<Object, Integer> indexId = new HashMap<>();
+    final Map<IndexReader.CacheKey, Integer> indexId = new HashMap<>();
     for (LeafReaderContext ctx : reader1.leaves()) {
-      indexId.put(ctx.reader().getCoreCacheKey(), 1);
+      indexId.put(ctx.reader().getCoreCacheHelper().getKey(), 1);
     }
     for (LeafReaderContext ctx : reader2.leaves()) {
-      indexId.put(ctx.reader().getCoreCacheKey(), 2);
+      indexId.put(ctx.reader().getCoreCacheHelper().getKey(), 2);
     }
 
     final AtomicLong hitCount1 = new AtomicLong();
@@ -1218,4 +1221,56 @@ public class TestLRUQueryCache extends LuceneTestCase {
     w.close();
     dir.close();
   }
+
+  // a reader whose sole purpose is to not be cacheable
+  private static class DummyDirectoryReader extends FilterDirectoryReader {
+
+    public DummyDirectoryReader(DirectoryReader in) throws IOException {
+      super(in, new SubReaderWrapper() {
+        @Override
+        public LeafReader wrap(LeafReader reader) {
+          return new FilterLeafReader(reader) {
+            @Override
+            public CacheHelper getCoreCacheHelper() {
+              return null;
+            }
+            @Override
+            public CacheHelper getReaderCacheHelper() {
+              return null;
+            }
+          };
+        }
+      });
+    }
+
+    @Override
+    protected DirectoryReader doWrapDirectoryReader(DirectoryReader in) throws IOException {
+      return new DummyDirectoryReader(in);
+    }
+
+    @Override
+    public CacheHelper getReaderCacheHelper() {
+      return null;
+    }
+  }
+
+  public void testReaderNotSuitedForCaching() throws IOException {
+    Directory dir = newDirectory();
+    IndexWriterConfig iwc = newIndexWriterConfig().setMergePolicy(NoMergePolicy.INSTANCE);
+    RandomIndexWriter w = new RandomIndexWriter(random(), dir, iwc);
+    w.addDocument(new Document());
+    DirectoryReader reader = new DummyDirectoryReader(w.getReader());
+    IndexSearcher searcher = newSearcher(reader);
+    searcher.setQueryCachingPolicy(QueryCachingPolicy.ALWAYS_CACHE);
+
+    // don't cache if the reader does not expose a cache helper
+    assertNull(reader.leaves().get(0).reader().getCoreCacheHelper());
+    LRUQueryCache cache = new LRUQueryCache(2, 10000, context -> true);
+    searcher.setQueryCache(cache);
+    assertEquals(0, searcher.count(new DummyQuery()));
+    assertEquals(0, cache.getCacheCount());
+    reader.close();
+    w.close();
+    dir.close();
+  }
 }

http://git-wip-us.apache.org/repos/asf/lucene-solr/blob/df6f8307/lucene/core/src/test/org/apache/lucene/search/TestSearcherManager.java
----------------------------------------------------------------------
diff --git a/lucene/core/src/test/org/apache/lucene/search/TestSearcherManager.java b/lucene/core/src/test/org/apache/lucene/search/TestSearcherManager.java
index 2fac35f..cc9a919 100644
--- a/lucene/core/src/test/org/apache/lucene/search/TestSearcherManager.java
+++ b/lucene/core/src/test/org/apache/lucene/search/TestSearcherManager.java
@@ -443,6 +443,16 @@ public class TestSearcherManager extends ThreadedIndexingAndSearchingTestCase {
     public MyFilterLeafReader(LeafReader in) {
       super(in);
     }
+
+    @Override
+    public CacheHelper getCoreCacheHelper() {
+      return in.getCoreCacheHelper();
+    }
+
+    @Override
+    public CacheHelper getReaderCacheHelper() {
+      return in.getReaderCacheHelper();
+    }
   }
 
   private static class MyFilterDirectoryReader extends FilterDirectoryReader {
@@ -462,6 +472,11 @@ public class TestSearcherManager extends ThreadedIndexingAndSearchingTestCase {
     protected DirectoryReader doWrapDirectoryReader(DirectoryReader in) throws IOException {
       return new MyFilterDirectoryReader(in);
     }
+
+    @Override
+    public CacheHelper getReaderCacheHelper() {
+      return in.getReaderCacheHelper();
+    }
   }
 
   // LUCENE-6087

http://git-wip-us.apache.org/repos/asf/lucene-solr/blob/df6f8307/lucene/core/src/test/org/apache/lucene/search/TestTermQuery.java
----------------------------------------------------------------------
diff --git a/lucene/core/src/test/org/apache/lucene/search/TestTermQuery.java b/lucene/core/src/test/org/apache/lucene/search/TestTermQuery.java
index a994118..41d82d5 100644
--- a/lucene/core/src/test/org/apache/lucene/search/TestTermQuery.java
+++ b/lucene/core/src/test/org/apache/lucene/search/TestTermQuery.java
@@ -108,6 +108,11 @@ public class TestTermQuery extends LuceneTestCase {
     protected DirectoryReader doWrapDirectoryReader(DirectoryReader in) throws IOException {
       return new NoSeekDirectoryReader(in);
     }
+
+    @Override
+    public CacheHelper getReaderCacheHelper() {
+      return in.getReaderCacheHelper();
+    }
     
   }
 
@@ -149,6 +154,16 @@ public class TestTermQuery extends LuceneTestCase {
       };
     }
 
+    @Override
+    public CacheHelper getCoreCacheHelper() {
+      return in.getCoreCacheHelper();
+    }
+
+    @Override
+    public CacheHelper getReaderCacheHelper() {
+      return in.getReaderCacheHelper();
+    }
+
   };
 
 }

http://git-wip-us.apache.org/repos/asf/lucene-solr/blob/df6f8307/lucene/core/src/test/org/apache/lucene/search/TestTermScorer.java
----------------------------------------------------------------------
diff --git a/lucene/core/src/test/org/apache/lucene/search/TestTermScorer.java b/lucene/core/src/test/org/apache/lucene/search/TestTermScorer.java
index 1ce1bc6..d00e520 100644
--- a/lucene/core/src/test/org/apache/lucene/search/TestTermScorer.java
+++ b/lucene/core/src/test/org/apache/lucene/search/TestTermScorer.java
@@ -179,6 +179,16 @@ public class TestTermScorer extends LuceneTestCase {
         // unreachable
         return null;
       }
+
+      @Override
+      public CacheHelper getCoreCacheHelper() {
+        return in.getCoreCacheHelper();
+      }
+
+      @Override
+      public CacheHelper getReaderCacheHelper() {
+        return in.getReaderCacheHelper();
+      }
     };
     // We don't use newSearcher because it sometimes runs checkIndex which loads norms
     IndexSearcher indexSearcher = new IndexSearcher(forbiddenNorms);

http://git-wip-us.apache.org/repos/asf/lucene-solr/blob/df6f8307/lucene/facet/src/java/org/apache/lucene/facet/sortedset/DefaultSortedSetDocValuesReaderState.java
----------------------------------------------------------------------
diff --git a/lucene/facet/src/java/org/apache/lucene/facet/sortedset/DefaultSortedSetDocValuesReaderState.java b/lucene/facet/src/java/org/apache/lucene/facet/sortedset/DefaultSortedSetDocValuesReaderState.java
index b959d25..6bcfa46 100644
--- a/lucene/facet/src/java/org/apache/lucene/facet/sortedset/DefaultSortedSetDocValuesReaderState.java
+++ b/lucene/facet/src/java/org/apache/lucene/facet/sortedset/DefaultSortedSetDocValuesReaderState.java
@@ -116,7 +116,8 @@ public class DefaultSortedSetDocValuesReaderState extends SortedSetDocValuesRead
         SortedSetDocValues dv = MultiDocValues.getSortedSetValues(origReader, field);
         if (dv instanceof MultiDocValues.MultiSortedSetDocValues) {
           map = ((MultiDocValues.MultiSortedSetDocValues)dv).mapping;
-          if (map.owner == origReader.getCoreCacheKey()) {
+          IndexReader.CacheHelper cacheHelper = origReader.getReaderCacheHelper();
+          if (cacheHelper != null && map.owner == cacheHelper.getKey()) {
             cachedOrdMaps.put(field, map);
           }
         }

http://git-wip-us.apache.org/repos/asf/lucene-solr/blob/df6f8307/lucene/facet/src/java/org/apache/lucene/facet/taxonomy/CachedOrdinalsReader.java
----------------------------------------------------------------------
diff --git a/lucene/facet/src/java/org/apache/lucene/facet/taxonomy/CachedOrdinalsReader.java b/lucene/facet/src/java/org/apache/lucene/facet/taxonomy/CachedOrdinalsReader.java
index 0fbf4fb..a52b2af 100644
--- a/lucene/facet/src/java/org/apache/lucene/facet/taxonomy/CachedOrdinalsReader.java
+++ b/lucene/facet/src/java/org/apache/lucene/facet/taxonomy/CachedOrdinalsReader.java
@@ -23,6 +23,7 @@ import java.util.WeakHashMap;
 
 import org.apache.lucene.codecs.DocValuesFormat;
 import org.apache.lucene.index.BinaryDocValues;
+import org.apache.lucene.index.IndexReader;
 import org.apache.lucene.index.LeafReaderContext;
 import org.apache.lucene.util.Accountable;
 import org.apache.lucene.util.Accountables;
@@ -67,7 +68,11 @@ public class CachedOrdinalsReader extends OrdinalsReader implements Accountable
   }
 
   private synchronized CachedOrds getCachedOrds(LeafReaderContext context) throws IOException {
-    Object cacheKey = context.reader().getCoreCacheKey();
+    IndexReader.CacheHelper cacheHelper = context.reader().getCoreCacheHelper();
+    if (cacheHelper == null) {
+      throw new IllegalStateException("Cannot cache ordinals on leaf: " + context.reader());
+    }
+    Object cacheKey = cacheHelper.getKey();
     CachedOrds ords = ordsCache.get(cacheKey);
     if (ords == null) {
       ords = new CachedOrds(source.getReader(context), context.reader().maxDoc());

http://git-wip-us.apache.org/repos/asf/lucene-solr/blob/df6f8307/lucene/facet/src/java/org/apache/lucene/facet/taxonomy/OrdinalMappingLeafReader.java
----------------------------------------------------------------------
diff --git a/lucene/facet/src/java/org/apache/lucene/facet/taxonomy/OrdinalMappingLeafReader.java b/lucene/facet/src/java/org/apache/lucene/facet/taxonomy/OrdinalMappingLeafReader.java
index cb798af..341411d 100644
--- a/lucene/facet/src/java/org/apache/lucene/facet/taxonomy/OrdinalMappingLeafReader.java
+++ b/lucene/facet/src/java/org/apache/lucene/facet/taxonomy/OrdinalMappingLeafReader.java
@@ -157,5 +157,15 @@ public class OrdinalMappingLeafReader extends FilterLeafReader {
       return in.getBinaryDocValues(field);
     }
   }
+
+  @Override
+  public CacheHelper getCoreCacheHelper() {
+    return null;
+  }
+
+  @Override
+  public CacheHelper getReaderCacheHelper() {
+    return null;
+  }
   
 }

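A note on the pattern running through this patch: every FilterLeafReader/FilterDirectoryReader now has to state its caching behaviour explicitly via getCoreCacheHelper/getReaderCacheHelper. A minimal sketch of the two common choices (the class name is illustrative; the overrides mirror the changes above):

  import org.apache.lucene.index.FilterLeafReader;
  import org.apache.lucene.index.LeafReader;

  class PassThroughLeafReader extends FilterLeafReader {

    PassThroughLeafReader(LeafReader in) {
      super(in);
    }

    // This wrapper does not change visible index content, so it is safe to
    // delegate both helpers and share cache entries with the wrapped reader.
    @Override
    public CacheHelper getCoreCacheHelper() {
      return in.getCoreCacheHelper();
    }

    @Override
    public CacheHelper getReaderCacheHelper() {
      return in.getReaderCacheHelper();
    }

    // A wrapper that does alter what queries can see (e.g. OrdinalMappingLeafReader
    // above) should instead return null from both methods to opt out of caching.
  }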

[43/50] [abbrv] lucene-solr:jira/solr-9858: Adding 6.4.2 version

Posted by ab...@apache.org.
Adding 6.4.2 version


Project: http://git-wip-us.apache.org/repos/asf/lucene-solr/repo
Commit: http://git-wip-us.apache.org/repos/asf/lucene-solr/commit/0010867a
Tree: http://git-wip-us.apache.org/repos/asf/lucene-solr/tree/0010867a
Diff: http://git-wip-us.apache.org/repos/asf/lucene-solr/diff/0010867a

Branch: refs/heads/jira/solr-9858
Commit: 0010867a631ced339ed9240f573d5e99cad282cf
Parents: df6f830
Author: Ishan Chattopadhyaya <is...@apache.org>
Authored: Tue Feb 28 20:23:11 2017 +0530
Committer: Ishan Chattopadhyaya <is...@apache.org>
Committed: Tue Feb 28 20:23:11 2017 +0530

----------------------------------------------------------------------
 lucene/core/src/java/org/apache/lucene/util/Version.java | 7 +++++++
 1 file changed, 7 insertions(+)
----------------------------------------------------------------------


http://git-wip-us.apache.org/repos/asf/lucene-solr/blob/0010867a/lucene/core/src/java/org/apache/lucene/util/Version.java
----------------------------------------------------------------------
diff --git a/lucene/core/src/java/org/apache/lucene/util/Version.java b/lucene/core/src/java/org/apache/lucene/util/Version.java
index 6477816..895f169 100644
--- a/lucene/core/src/java/org/apache/lucene/util/Version.java
+++ b/lucene/core/src/java/org/apache/lucene/util/Version.java
@@ -88,6 +88,13 @@ public final class Version {
   public static final Version LUCENE_6_4_1 = new Version(6, 4, 1);
 
   /**
+   * Match settings and bugs in Lucene's 6.4.2 release.
+   * @deprecated Use latest
+   */
+  @Deprecated
+  public static final Version LUCENE_6_4_2 = new Version(6, 4, 2);
+
+  /**
    * Match settings and bugs in Lucene's 6.5.0 release.
    * @deprecated Use latest
    */

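For context, a quick sketch of how such a constant is consumed; Version.parse, Version.LATEST and onOrAfter are pre-existing APIs, and the class below is purely illustrative:

  import org.apache.lucene.util.Version;

  public class VersionDemo {
    public static void main(String[] args) throws Exception {
      Version v = Version.parse("6.4.2");                     // equal to LUCENE_6_4_2
      System.out.println(v.onOrAfter(Version.LUCENE_6_4_1));  // true
      System.out.println(Version.LATEST.onOrAfter(v));        // true on this branch
    }
  }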

[04/50] [abbrv] lucene-solr:jira/solr-9858: SOLR-10126: Improve test a bit.

Posted by ab...@apache.org.
SOLR-10126: Improve test a bit.


Project: http://git-wip-us.apache.org/repos/asf/lucene-solr/repo
Commit: http://git-wip-us.apache.org/repos/asf/lucene-solr/commit/be64c26c
Tree: http://git-wip-us.apache.org/repos/asf/lucene-solr/tree/be64c26c
Diff: http://git-wip-us.apache.org/repos/asf/lucene-solr/diff/be64c26c

Branch: refs/heads/jira/solr-9858
Commit: be64c26c270fc9663609492de77c1dec5574afda
Parents: 55ef713
Author: markrmiller <ma...@apache.org>
Authored: Wed Feb 22 12:52:07 2017 -0500
Committer: markrmiller <ma...@apache.org>
Committed: Wed Feb 22 14:44:17 2017 -0500

----------------------------------------------------------------------
 .../solr/cloud/PeerSyncReplicationTest.java     | 26 ++++++++++++++------
 1 file changed, 19 insertions(+), 7 deletions(-)
----------------------------------------------------------------------


http://git-wip-us.apache.org/repos/asf/lucene-solr/blob/be64c26c/solr/core/src/test/org/apache/solr/cloud/PeerSyncReplicationTest.java
----------------------------------------------------------------------
diff --git a/solr/core/src/test/org/apache/solr/cloud/PeerSyncReplicationTest.java b/solr/core/src/test/org/apache/solr/cloud/PeerSyncReplicationTest.java
index 416e95e..0859eb5 100644
--- a/solr/core/src/test/org/apache/solr/cloud/PeerSyncReplicationTest.java
+++ b/solr/core/src/test/org/apache/solr/cloud/PeerSyncReplicationTest.java
@@ -195,9 +195,15 @@ public class PeerSyncReplicationTest extends AbstractFullDistribZkTestBase {
     }
   }
 
+  class IndexInBackGround extends Thread {
+    private int numDocs;
 
-  private void indexInBackground(int numDocs) {
-    new Thread(() -> {
+    public IndexInBackGround(int numDocs) {
+      super(getClassName());
+      this.numDocs = numDocs;
+    }
+    
+    public void run() {
       try {
         for (int i = 0; i < numDocs; i++) {
           indexDoc(id, docId, i1, 50, tlong, 50, t1, "document number " + docId);
@@ -209,10 +215,7 @@ public class PeerSyncReplicationTest extends AbstractFullDistribZkTestBase {
         log.error("Error indexing doc in background", e);
         //Throwing an error here will kill the thread
       }
-    }, getClassName())
-        .start();
-
-
+    }
   }
    
 
@@ -269,7 +272,8 @@ public class PeerSyncReplicationTest extends AbstractFullDistribZkTestBase {
     // disable fingerprint check if needed
     System.setProperty("solr.disableFingerprint", String.valueOf(disableFingerprint));
 
-    indexInBackground(50);
+    IndexInBackGround iib = new IndexInBackGround(50);
+    iib.start();
     
     // bring back dead node and ensure it recovers
     ChaosMonkey.start(nodeToBringUp.jetty);
@@ -284,6 +288,14 @@ public class PeerSyncReplicationTest extends AbstractFullDistribZkTestBase {
     jetties.removeAll(nodesDown);
     assertEquals(getShardCount() - nodesDown.size(), jetties.size());
 
+    waitForThingsToLevelOut(30);
+    
+    iib.join();
+    
+    cloudClient.commit();
+    
+    checkShardConsistency(false, false);
+    
     long cloudClientDocs = cloudClient.query(new SolrQuery("*:*")).getResults().getNumFound();
     assertEquals(docId, cloudClientDocs);
 

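The point of the rewrite above: the old helper started a fire-and-forget thread, so the assertions could run while documents were still being indexed. Holding a reference to the thread and join()ing it before the commit and consistency check makes the final counts deterministic. The same shape in miniature (indexDocsInBackground is a hypothetical stand-in for the test's own indexing loop; exception handling elided):

  Thread indexer = new Thread(() -> indexDocsInBackground(50), "indexer");
  indexer.start();
  // ... restart the dead node and wait for recovery ...
  indexer.join();                        // all background adds are done before we assert
  cloudClient.commit();                  // make them visible to searches
  checkShardConsistency(false, false);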

[08/50] [abbrv] lucene-solr:jira/solr-9858: LUCENE-7686: add efficient de-duping to the NRT document suggester

Posted by ab...@apache.org.
LUCENE-7686: add efficient de-duping to the NRT document suggester


Project: http://git-wip-us.apache.org/repos/asf/lucene-solr/repo
Commit: http://git-wip-us.apache.org/repos/asf/lucene-solr/commit/4e2cf61a
Tree: http://git-wip-us.apache.org/repos/asf/lucene-solr/tree/4e2cf61a
Diff: http://git-wip-us.apache.org/repos/asf/lucene-solr/diff/4e2cf61a

Branch: refs/heads/jira/solr-9858
Commit: 4e2cf61ac76db33f35d3aceacaf1563a9bd5edb2
Parents: 29a5ea4
Author: Mike McCandless <mi...@apache.org>
Authored: Wed Feb 22 16:04:26 2017 -0500
Committer: Mike McCandless <mi...@apache.org>
Committed: Wed Feb 22 16:04:26 2017 -0500

----------------------------------------------------------------------
 lucene/CHANGES.txt                              |   4 +
 .../java/org/apache/lucene/util/fst/Util.java   |  80 +++---
 .../suggest/document/CompletionAnalyzer.java    |   2 +-
 .../suggest/document/CompletionQuery.java       |   2 +-
 .../search/suggest/document/NRTSuggester.java   |  89 ++++--
 .../search/suggest/document/SuggestField.java   |   2 +-
 .../suggest/document/SuggestIndexSearcher.java  |   7 +-
 .../search/suggest/document/TopSuggestDocs.java |  19 ++
 .../document/TopSuggestDocsCollector.java       |  83 +++++-
 .../suggest/document/TestContextQuery.java      |  26 +-
 .../document/TestContextSuggestField.java       |   8 +-
 .../document/TestFuzzyCompletionQuery.java      |   6 +-
 .../document/TestPrefixCompletionQuery.java     |  28 +-
 .../document/TestRegexCompletionQuery.java      |   6 +-
 .../suggest/document/TestSuggestField.java      | 278 +++++++++++++++++--
 15 files changed, 517 insertions(+), 123 deletions(-)
----------------------------------------------------------------------


http://git-wip-us.apache.org/repos/asf/lucene-solr/blob/4e2cf61a/lucene/CHANGES.txt
----------------------------------------------------------------------
diff --git a/lucene/CHANGES.txt b/lucene/CHANGES.txt
index c6c97fb..e71149b 100644
--- a/lucene/CHANGES.txt
+++ b/lucene/CHANGES.txt
@@ -126,6 +126,10 @@ New Features
 * LUCENE-7688: Add OneMergeWrappingMergePolicy class.
   (Keith Laban, Christine Poerschke)
 
+* LUCENE-7686: The near-real-time document suggester can now
+  efficiently filter out duplicate suggestions (Uwe Schindler, Mike
+  McCandless)
+
 Bug Fixes
 
 * LUCENE-7630: Fix (Edge)NGramTokenFilter to no longer drop payloads

http://git-wip-us.apache.org/repos/asf/lucene-solr/blob/4e2cf61a/lucene/core/src/java/org/apache/lucene/util/fst/Util.java
----------------------------------------------------------------------
diff --git a/lucene/core/src/java/org/apache/lucene/util/fst/Util.java b/lucene/core/src/java/org/apache/lucene/util/fst/Util.java
index 341b8d0..2f83dd1 100644
--- a/lucene/core/src/java/org/apache/lucene/util/fst/Util.java
+++ b/lucene/core/src/java/org/apache/lucene/util/fst/Util.java
@@ -248,32 +248,38 @@ public final class Util {
    *  @lucene.experimental
    */
   public static class FSTPath<T> {
+    /** Holds the last arc appended to this path */
     public FST.Arc<T> arc;
-    public T cost;
+    /** Holds cost plus any usage-specific output: */
+    public T output;
     public final IntsRefBuilder input;
     public final float boost;
     public final CharSequence context;
 
+    // Custom int payload for consumers; the NRT suggester uses this to record if this path has already enumerated a surface form
+    public int payload;
+
     /** Sole constructor */
-    public FSTPath(T cost, FST.Arc<T> arc, IntsRefBuilder input) {
-      this(cost, arc, input, 0, null);
+    public FSTPath(T output, FST.Arc<T> arc, IntsRefBuilder input) {
+      this(output, arc, input, 0, null, -1);
     }
 
-    public FSTPath(T cost, FST.Arc<T> arc, IntsRefBuilder input, float boost, CharSequence context) {
+    public FSTPath(T output, FST.Arc<T> arc, IntsRefBuilder input, float boost, CharSequence context, int payload) {
       this.arc = new FST.Arc<T>().copyFrom(arc);
-      this.cost = cost;
+      this.output = output;
       this.input = input;
       this.boost = boost;
       this.context = context;
+      this.payload = payload;
     }
 
-    public FSTPath<T> newPath(T cost, IntsRefBuilder input) {
-      return new FSTPath<>(cost, this.arc, input, this.boost, this.context);
+    public FSTPath<T> newPath(T output, IntsRefBuilder input) {
+      return new FSTPath<>(output, this.arc, input, this.boost, this.context, this.payload);
     }
 
     @Override
     public String toString() {
-      return "input=" + input.get() + " cost=" + cost + "context=" + context + "boost=" + boost;
+      return "input=" + input.get() + " output=" + output + " context=" + context + " boost=" + boost + " payload=" + payload;
     }
   }
 
@@ -287,7 +293,7 @@ public final class Util {
 
     @Override
     public int compare(FSTPath<T> a, FSTPath<T> b) {
-      int cmp = comparator.compare(a.cost, b.cost);
+      int cmp = comparator.compare(a.output, b.output);
       if (cmp == 0) {
         return a.input.get().compareTo(b.input.get());
       } else {
@@ -339,8 +345,7 @@ public final class Util {
 
       assert queue != null;
 
-      T cost = fst.outputs.add(path.cost, path.arc.output);
-      //System.out.println("  addIfCompetitive queue.size()=" + queue.size() + " path=" + path + " + label=" + path.arc.label);
+      T output = fst.outputs.add(path.output, path.arc.output);
 
       if (queue.size() == maxQueueDepth) {
         FSTPath<T> bottom = queue.last();
@@ -373,32 +378,32 @@ public final class Util {
       newInput.copyInts(path.input.get());
       newInput.append(path.arc.label);
 
-      queue.add(path.newPath(cost, newInput));
-
-      if (queue.size() == maxQueueDepth+1) {
-        queue.pollLast();
+      FSTPath<T> newPath = path.newPath(output, newInput);
+      if (acceptPartialPath(newPath)) {
+        queue.add(newPath);
+        if (queue.size() == maxQueueDepth+1) {
+          queue.pollLast();
+        }
       }
     }
 
     public void addStartPaths(FST.Arc<T> node, T startOutput, boolean allowEmptyString, IntsRefBuilder input) throws IOException {
-      addStartPaths(node, startOutput, allowEmptyString, input, 0, null);
+      addStartPaths(node, startOutput, allowEmptyString, input, 0, null, -1);
     }
 
     /** Adds all leaving arcs, including 'finished' arc, if
      *  the node is final, from this node into the queue.  */
     public void addStartPaths(FST.Arc<T> node, T startOutput, boolean allowEmptyString, IntsRefBuilder input,
-                              float boost, CharSequence context) throws IOException {
+                              float boost, CharSequence context, int payload) throws IOException {
 
       // De-dup NO_OUTPUT since it must be a singleton:
       if (startOutput.equals(fst.outputs.getNoOutput())) {
         startOutput = fst.outputs.getNoOutput();
       }
 
-      FSTPath<T> path = new FSTPath<>(startOutput, node, input, boost, context);
+      FSTPath<T> path = new FSTPath<>(startOutput, node, input, boost, context, payload);
       fst.readFirstTargetArc(node, path.arc, bytesReader);
 
-      //System.out.println("add start paths");
-
       // Bootstrap: find the min starting arc
       while (true) {
         if (allowEmptyString || path.arc.label != FST.END_LABEL) {
@@ -415,8 +420,6 @@ public final class Util {
 
       final List<Result<T>> results = new ArrayList<>();
 
-      //System.out.println("search topN=" + topN);
-
       final BytesReader fstReader = fst.getBytesReader();
       final T NO_OUTPUT = fst.outputs.getNoOutput();
 
@@ -430,13 +433,11 @@ public final class Util {
 
       // For each top N path:
       while (results.size() < topN) {
-        //System.out.println("\nfind next path: queue.size=" + queue.size());
 
         FSTPath<T> path;
 
         if (queue == null) {
           // Ran out of paths
-          //System.out.println("  break queue=null");
           break;
         }
 
@@ -446,15 +447,18 @@ public final class Util {
 
         if (path == null) {
           // There were less than topN paths available:
-          //System.out.println("  break no more paths");
           break;
         }
+        //System.out.println("pop path=" + path + " arc=" + path.arc.output);
+
+        if (acceptPartialPath(path) == false) {
+          continue;
+        }
 
         if (path.arc.label == FST.END_LABEL) {
-          //System.out.println("    empty string!  cost=" + path.cost);
           // Empty string!
           path.input.setLength(path.input.length() - 1);
-          results.add(new Result<>(path.input.get(), path.cost));
+          results.add(new Result<>(path.input.get(), path.output));
           continue;
         }
 
@@ -463,8 +467,6 @@ public final class Util {
           queue = null;
         }
 
-        //System.out.println("  path: " + path);
-        
         // We take path and find its "0 output completion",
         // ie, just keep traversing the first arc with
         // NO_OUTPUT that we can find, since this must lead
@@ -474,13 +476,11 @@ public final class Util {
         // For each input letter:
         while (true) {
 
-          //System.out.println("\n    cycle path: " + path);         
           fst.readFirstTargetArc(path.arc, path.arc, fstReader);
 
           // For each arc leaving this node:
           boolean foundZero = false;
           while(true) {
-            //System.out.println("      arc=" + (char) path.arc.label + " cost=" + path.arc.output);
             // tricky: instead of comparing output == 0, we must
             // express it via the comparator compare(output, 0) == 0
             if (comparator.compare(NO_OUTPUT, path.arc.output) == 0) {
@@ -514,18 +514,19 @@ public final class Util {
 
           if (path.arc.label == FST.END_LABEL) {
             // Add final output:
-            //System.out.println("    done!: " + path);
-            path.cost = fst.outputs.add(path.cost, path.arc.output);
+            path.output = fst.outputs.add(path.output, path.arc.output);
             if (acceptResult(path)) {
-              //System.out.println("    add result: " + path);
-              results.add(new Result<>(path.input.get(), path.cost));
+              results.add(new Result<>(path.input.get(), path.output));
             } else {
               rejectCount++;
             }
             break;
           } else {
             path.input.append(path.arc.label);
-            path.cost = fst.outputs.add(path.cost, path.arc.output);
+            path.output = fst.outputs.add(path.output, path.arc.output);
+            if (acceptPartialPath(path) == false) {
+              break;
+            }
           }
         }
       }
@@ -533,7 +534,12 @@ public final class Util {
     }
 
     protected boolean acceptResult(FSTPath<T> path) {
-      return acceptResult(path.input.get(), path.cost);
+      return acceptResult(path.input.get(), path.output);
+    }
+
+    /** Override this to prevent considering a path before it's complete */
+    protected boolean acceptPartialPath(FSTPath<T> path) {
+      return true;
     }
 
     protected boolean acceptResult(IntsRef input, T output) {

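The new acceptPartialPath hook is what makes the suggester's segment-level pruning possible: per the hunks above, TopNSearcher consults it both when enqueuing a candidate in addIfCompetitive and when popping or extending a path in search(), so a subclass can abandon a path before it is complete. A toy subclass just to show the shape (fst, topN, maxQueueDepth and comparator are assumed to be in scope; the pruning predicate here is arbitrary):

  Util.TopNSearcher<Long> searcher = new Util.TopNSearcher<Long>(fst, topN, maxQueueDepth, comparator) {
    @Override
    protected boolean acceptPartialPath(Util.FSTPath<Long> path) {
      // e.g. abandon any completion longer than 20 input symbols
      return path.input.length() <= 20;
    }
  };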
http://git-wip-us.apache.org/repos/asf/lucene-solr/blob/4e2cf61a/lucene/suggest/src/java/org/apache/lucene/search/suggest/document/CompletionAnalyzer.java
----------------------------------------------------------------------
diff --git a/lucene/suggest/src/java/org/apache/lucene/search/suggest/document/CompletionAnalyzer.java b/lucene/suggest/src/java/org/apache/lucene/search/suggest/document/CompletionAnalyzer.java
index 6366b6c..13bd392 100644
--- a/lucene/suggest/src/java/org/apache/lucene/search/suggest/document/CompletionAnalyzer.java
+++ b/lucene/suggest/src/java/org/apache/lucene/search/suggest/document/CompletionAnalyzer.java
@@ -81,7 +81,7 @@ public final class CompletionAnalyzer extends AnalyzerWrapper {
   private final int maxGraphExpansions;
 
   /**
-   * Wraps an analyzer to convert it's output token stream to an automaton
+   * Wraps an analyzer to convert its output token stream to an automaton
    *
    * @param analyzer token stream to be converted to an automaton
    * @param preserveSep Preserve separation between tokens when converting to an automaton

http://git-wip-us.apache.org/repos/asf/lucene-solr/blob/4e2cf61a/lucene/suggest/src/java/org/apache/lucene/search/suggest/document/CompletionQuery.java
----------------------------------------------------------------------
diff --git a/lucene/suggest/src/java/org/apache/lucene/search/suggest/document/CompletionQuery.java b/lucene/suggest/src/java/org/apache/lucene/search/suggest/document/CompletionQuery.java
index 71ba15a..49fe7d0 100644
--- a/lucene/suggest/src/java/org/apache/lucene/search/suggest/document/CompletionQuery.java
+++ b/lucene/suggest/src/java/org/apache/lucene/search/suggest/document/CompletionQuery.java
@@ -34,7 +34,7 @@ import static org.apache.lucene.search.suggest.document.CompletionAnalyzer.SEP_L
  * filtered by {@link BitsProducer}. This should be used to query against any {@link SuggestField}s
  * or {@link ContextSuggestField}s of documents.
  * <p>
- * Use {@link SuggestIndexSearcher#suggest(CompletionQuery, int)} to execute any query
+ * Use {@link SuggestIndexSearcher#suggest(CompletionQuery, int, boolean)} to execute any query
  * that provides a concrete implementation of this query. Example below shows using this query
  * to retrieve the top 5 documents.
  *

http://git-wip-us.apache.org/repos/asf/lucene-solr/blob/4e2cf61a/lucene/suggest/src/java/org/apache/lucene/search/suggest/document/NRTSuggester.java
----------------------------------------------------------------------
diff --git a/lucene/suggest/src/java/org/apache/lucene/search/suggest/document/NRTSuggester.java b/lucene/suggest/src/java/org/apache/lucene/search/suggest/document/NRTSuggester.java
index 52e4ea0..7b8981a 100644
--- a/lucene/suggest/src/java/org/apache/lucene/search/suggest/document/NRTSuggester.java
+++ b/lucene/suggest/src/java/org/apache/lucene/search/suggest/document/NRTSuggester.java
@@ -32,12 +32,11 @@ import org.apache.lucene.util.BytesRef;
 import org.apache.lucene.util.CharsRefBuilder;
 import org.apache.lucene.util.fst.ByteSequenceOutputs;
 import org.apache.lucene.util.fst.FST;
-import org.apache.lucene.util.fst.PairOutputs;
 import org.apache.lucene.util.fst.PairOutputs.Pair;
+import org.apache.lucene.util.fst.PairOutputs;
 import org.apache.lucene.util.fst.PositiveIntOutputs;
 import org.apache.lucene.util.fst.Util;
 
-import static org.apache.lucene.search.suggest.document.NRTSuggester.PayLoadProcessor.parseDocID;
 import static org.apache.lucene.search.suggest.document.NRTSuggester.PayLoadProcessor.parseSurfaceForm;
 
 /**
@@ -142,21 +141,74 @@ public final class NRTSuggester implements Accountable {
     // maximum number of suggestions that can be collected.
     final int topN = collector.getCountToCollect() * prefixPaths.size();
     final int queueSize = getMaxTopNSearcherQueueSize(topN, scorer.reader.numDocs(), liveDocsRatio, scorer.filtered);
+
+    final CharsRefBuilder spare = new CharsRefBuilder();
+
     Comparator<Pair<Long, BytesRef>> comparator = getComparator();
     Util.TopNSearcher<Pair<Long, BytesRef>> searcher = new Util.TopNSearcher<Pair<Long, BytesRef>>(fst, topN, queueSize, comparator,
         new ScoringPathComparator(scorer)) {
 
-      private final CharsRefBuilder spare = new CharsRefBuilder();
+      private final ByteArrayDataInput scratchInput = new ByteArrayDataInput();
+
+      @Override
+      protected boolean acceptPartialPath(Util.FSTPath<Pair<Long,BytesRef>> path) {
+        if (collector.doSkipDuplicates()) {
+          // We are removing dups
+          if (path.payload == -1) {
+            // This path didn't yet see the complete surface form; let's see if it just did with the arc output we just added:
+            BytesRef arcOutput = path.arc.output.output2;
+            BytesRef output = path.output.output2;
+            for(int i=0;i<arcOutput.length;i++) {
+              if (arcOutput.bytes[arcOutput.offset + i] == payloadSep) {
+                // OK this arc that the path was just extended by contains the payloadSep, so we now have a full surface form in this path
+                path.payload = output.length - arcOutput.length + i;
+                assert output.bytes[output.offset + path.payload] == payloadSep;
+                break;
+              }
+            }
+          }
+
+          if (path.payload != -1) {
+            BytesRef output = path.output.output2;
+            spare.copyUTF8Bytes(output.bytes, output.offset, path.payload);
+            if (collector.seenSurfaceForms.contains(spare.chars(), 0, spare.length())) {
+              return false;
+            }
+          }
+        }
+        return true;
+      }
 
       @Override
       protected boolean acceptResult(Util.FSTPath<Pair<Long, BytesRef>> path) {
-        int payloadSepIndex = parseSurfaceForm(path.cost.output2, payloadSep, spare);
-        int docID = parseDocID(path.cost.output2, payloadSepIndex);
+        BytesRef output = path.output.output2;
+        int payloadSepIndex;
+        if (path.payload != -1) {
+          payloadSepIndex = path.payload;
+          spare.copyUTF8Bytes(output.bytes, output.offset, payloadSepIndex);
+        } else {
+          assert collector.doSkipDuplicates() == false;
+          payloadSepIndex = parseSurfaceForm(output, payloadSep, spare);
+        }
+
+        scratchInput.reset(output.bytes, output.offset + payloadSepIndex + 1, output.length - payloadSepIndex - 1);
+        int docID = scratchInput.readVInt();
+        
         if (!scorer.accept(docID, acceptDocs)) {
           return false;
         }
+        if (collector.doSkipDuplicates()) {
+          // now record that we've seen this surface form:
+          char[] key = new char[spare.length()];
+          System.arraycopy(spare.chars(), 0, key, 0, spare.length());
+          if (collector.seenSurfaceForms.contains(key)) {
+            // we already collected a higher scoring document with this key, in this segment:
+            return false;
+          }
+          collector.seenSurfaceForms.add(key);
+        }
         try {
-          float score = scorer.score(decode(path.cost.output1), path.boost);
+          float score = scorer.score(decode(path.output.output1), path.boost);
           collector.collect(docID, spare.toCharsRef(), path.context, score);
           return true;
         } catch (IOException e) {
@@ -167,8 +219,20 @@ public final class NRTSuggester implements Accountable {
 
     for (FSTUtil.Path<Pair<Long, BytesRef>> path : prefixPaths) {
       scorer.weight.setNextMatch(path.input.get());
+      BytesRef output = path.output.output2;
+      int payload = -1;
+      if (collector.doSkipDuplicates()) {
+        for(int j=0;j<output.length;j++) {
+          if (output.bytes[output.offset+j] == payloadSep) {
+            // Important to cache this, else we have a possibly O(N^2) cost where N is the length of suggestions
+            payload = j;
+            break;
+          }
+        }
+      }
+      
       searcher.addStartPaths(path.fstNode, path.output, false, path.input, scorer.weight.boost(),
-          scorer.weight.context());
+                             scorer.weight.context(), payload);
     }
     // hits are also returned by search()
     // we do not use it, instead collect at acceptResult
@@ -191,8 +255,8 @@ public final class NRTSuggester implements Accountable {
 
     @Override
     public int compare(Util.FSTPath<Pair<Long, BytesRef>> first, Util.FSTPath<Pair<Long, BytesRef>> second) {
-      int cmp = Float.compare(scorer.score(decode(second.cost.output1), second.boost),
-          scorer.score(decode(first.cost.output1), first.boost));
+      int cmp = Float.compare(scorer.score(decode(second.output.output1), second.boost),
+          scorer.score(decode(first.output.output1), first.boost));
       return (cmp != 0) ? cmp : first.input.get().compareTo(second.input.get());
     }
   }
@@ -285,13 +349,6 @@ public final class NRTSuggester implements Accountable {
       return surfaceFormLen;
     }
 
-    static int parseDocID(final BytesRef output, int payloadSepIndex) {
-      assert payloadSepIndex != -1 : "payload sep index can not be -1";
-      ByteArrayDataInput input = new ByteArrayDataInput(output.bytes, payloadSepIndex + output.offset + 1,
-          output.length - (payloadSepIndex + output.offset));
-      return input.readVInt();
-    }
-
     static BytesRef make(final BytesRef surface, int docID, int payloadSep) throws IOException {
       int len = surface.length + MAX_DOC_ID_LEN_WITH_SEP;
       byte[] buffer = new byte[len];

http://git-wip-us.apache.org/repos/asf/lucene-solr/blob/4e2cf61a/lucene/suggest/src/java/org/apache/lucene/search/suggest/document/SuggestField.java
----------------------------------------------------------------------
diff --git a/lucene/suggest/src/java/org/apache/lucene/search/suggest/document/SuggestField.java b/lucene/suggest/src/java/org/apache/lucene/search/suggest/document/SuggestField.java
index 798a0b8..e5bdda9 100644
--- a/lucene/suggest/src/java/org/apache/lucene/search/suggest/document/SuggestField.java
+++ b/lucene/suggest/src/java/org/apache/lucene/search/suggest/document/SuggestField.java
@@ -47,7 +47,7 @@ import org.apache.lucene.util.BytesRef;
  * document.add(new SuggestField(name, "suggestion", 4));
  * </pre>
  * To perform document suggestions based on the this field, use
- * {@link SuggestIndexSearcher#suggest(CompletionQuery, int)}
+ * {@link SuggestIndexSearcher#suggest(CompletionQuery, int, boolean)}
  *
  * @lucene.experimental
  */

http://git-wip-us.apache.org/repos/asf/lucene-solr/blob/4e2cf61a/lucene/suggest/src/java/org/apache/lucene/search/suggest/document/SuggestIndexSearcher.java
----------------------------------------------------------------------
diff --git a/lucene/suggest/src/java/org/apache/lucene/search/suggest/document/SuggestIndexSearcher.java b/lucene/suggest/src/java/org/apache/lucene/search/suggest/document/SuggestIndexSearcher.java
index a64afed..5f65906 100644
--- a/lucene/suggest/src/java/org/apache/lucene/search/suggest/document/SuggestIndexSearcher.java
+++ b/lucene/suggest/src/java/org/apache/lucene/search/suggest/document/SuggestIndexSearcher.java
@@ -38,6 +38,9 @@ import org.apache.lucene.search.Weight;
  */
 public class SuggestIndexSearcher extends IndexSearcher {
 
+  // NOTE: we do not accept an ExecutorService here, because at least the dedup
+  // logic in TopSuggestDocsCollector/NRTSuggester would not be thread safe (and maybe other things)
+
   /**
    * Creates a searcher with document suggest capabilities
    * for <code>reader</code>.
@@ -50,8 +53,8 @@ public class SuggestIndexSearcher extends IndexSearcher {
    * Returns top <code>n</code> completion hits for
    * <code>query</code>
    */
-  public TopSuggestDocs suggest(CompletionQuery query, int n) throws IOException {
-    TopSuggestDocsCollector collector = new TopSuggestDocsCollector(n);
+  public TopSuggestDocs suggest(CompletionQuery query, int n, boolean skipDuplicates) throws IOException {
+    TopSuggestDocsCollector collector = new TopSuggestDocsCollector(n, skipDuplicates);
     suggest(query, collector);
     return collector.get();
   }

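A minimal usage sketch of the changed entry point (reader/analyzer setup omitted; the field name is illustrative, while the constructors and the new three-argument suggest signature come from this patch and the tests below):

  import java.io.IOException;
  import org.apache.lucene.analysis.Analyzer;
  import org.apache.lucene.index.DirectoryReader;
  import org.apache.lucene.index.Term;
  import org.apache.lucene.search.suggest.document.CompletionQuery;
  import org.apache.lucene.search.suggest.document.PrefixCompletionQuery;
  import org.apache.lucene.search.suggest.document.SuggestIndexSearcher;
  import org.apache.lucene.search.suggest.document.TopSuggestDocs;

  class SuggestDemo {
    static void printTopSuggestions(DirectoryReader reader, Analyzer analyzer) throws IOException {
      SuggestIndexSearcher searcher = new SuggestIndexSearcher(reader);
      CompletionQuery query = new PrefixCompletionQuery(analyzer, new Term("suggest_field", "sugg"));
      // the new boolean toggles de-duplication of identical surface forms
      TopSuggestDocs top = searcher.suggest(query, 5, true);
      for (TopSuggestDocs.SuggestScoreDoc hit : top.scoreLookupDocs()) {
        System.out.println(hit.key + " doc=" + hit.doc + " score=" + hit.score);
      }
    }
  }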
http://git-wip-us.apache.org/repos/asf/lucene-solr/blob/4e2cf61a/lucene/suggest/src/java/org/apache/lucene/search/suggest/document/TopSuggestDocs.java
----------------------------------------------------------------------
diff --git a/lucene/suggest/src/java/org/apache/lucene/search/suggest/document/TopSuggestDocs.java b/lucene/suggest/src/java/org/apache/lucene/search/suggest/document/TopSuggestDocs.java
index 6154d29..1ffcbdc 100644
--- a/lucene/suggest/src/java/org/apache/lucene/search/suggest/document/TopSuggestDocs.java
+++ b/lucene/suggest/src/java/org/apache/lucene/search/suggest/document/TopSuggestDocs.java
@@ -66,6 +66,25 @@ public class TopSuggestDocs extends TopDocs {
     public int compareTo(SuggestScoreDoc o) {
       return Lookup.CHARSEQUENCE_COMPARATOR.compare(key, o.key);
     }
+
+    @Override
+    public boolean equals(Object other) {
+      if (other instanceof SuggestScoreDoc == false) {
+        return false;
+      } else {
+        return key.equals(((SuggestScoreDoc) other).key);
+      }
+    }
+
+    @Override
+    public int hashCode() {
+      return key.hashCode();
+    }
+
+    @Override
+    public String toString() {
+      return "key=" + key + " doc=" + doc + " score=" + score + " shardIndex=" + shardIndex;      
+    }
   }
 
   /**

http://git-wip-us.apache.org/repos/asf/lucene-solr/blob/4e2cf61a/lucene/suggest/src/java/org/apache/lucene/search/suggest/document/TopSuggestDocsCollector.java
----------------------------------------------------------------------
diff --git a/lucene/suggest/src/java/org/apache/lucene/search/suggest/document/TopSuggestDocsCollector.java b/lucene/suggest/src/java/org/apache/lucene/search/suggest/document/TopSuggestDocsCollector.java
index d50e93b..3336896 100644
--- a/lucene/suggest/src/java/org/apache/lucene/search/suggest/document/TopSuggestDocsCollector.java
+++ b/lucene/suggest/src/java/org/apache/lucene/search/suggest/document/TopSuggestDocsCollector.java
@@ -17,7 +17,12 @@
 package org.apache.lucene.search.suggest.document;
 
 import java.io.IOException;
+import java.util.ArrayList;
+import java.util.Collections;
+import java.util.Comparator;
+import java.util.List;
 
+import org.apache.lucene.analysis.CharArraySet;
 import org.apache.lucene.index.LeafReaderContext;
 import org.apache.lucene.search.CollectionTerminatedException;
 import org.apache.lucene.search.SimpleCollector;
@@ -47,9 +52,13 @@ public class TopSuggestDocsCollector extends SimpleCollector {
   private final SuggestScoreDocPriorityQueue priorityQueue;
   private final int num;
 
-  /**
-   * Document base offset for the current Leaf
-   */
+  /** Only set if we are deduplicating hits: holds all per-segment hits until the end, when we dedup them */
+  private final List<SuggestScoreDoc> pendingResults;
+
+  /** Only set if we are deduplicating hits: holds all surface forms seen so far in the current segment */
+  final CharArraySet seenSurfaceForms;
+
+  /** Document base offset for the current Leaf */
   protected int docBase;
 
   /**
@@ -58,12 +67,24 @@ public class TopSuggestDocsCollector extends SimpleCollector {
    * Collects at most <code>num</code> completions
    * with corresponding document and weight
    */
-  public TopSuggestDocsCollector(int num) {
+  public TopSuggestDocsCollector(int num, boolean skipDuplicates) {
     if (num <= 0) {
       throw new IllegalArgumentException("'num' must be > 0");
     }
     this.num = num;
     this.priorityQueue = new SuggestScoreDocPriorityQueue(num);
+    if (skipDuplicates) {
+      seenSurfaceForms = new CharArraySet(num, false);
+      pendingResults = new ArrayList<>();
+    } else {
+      seenSurfaceForms = null;
+      pendingResults = null;
+    }
+  }
+
+  /** Returns true if duplicates are filtered out */
+  protected boolean doSkipDuplicates() {
+    return seenSurfaceForms != null;
   }
 
   /**
@@ -76,6 +97,13 @@ public class TopSuggestDocsCollector extends SimpleCollector {
   @Override
   protected void doSetNextReader(LeafReaderContext context) throws IOException {
     docBase = context.docBase;
+    if (seenSurfaceForms != null) {
+      seenSurfaceForms.clear();
+      // NOTE: this also clears the priorityQueue:
+      for (SuggestScoreDoc hit : priorityQueue.getResults()) {
+        pendingResults.add(hit);
+      }
+    }
   }
 
   /**
@@ -101,7 +129,52 @@ public class TopSuggestDocsCollector extends SimpleCollector {
    * Returns at most <code>num</code> Top scoring {@link org.apache.lucene.search.suggest.document.TopSuggestDocs}s
    */
   public TopSuggestDocs get() throws IOException {
-    SuggestScoreDoc[] suggestScoreDocs = priorityQueue.getResults();
+
+    SuggestScoreDoc[] suggestScoreDocs;
+    
+    if (seenSurfaceForms != null) {
+      // NOTE: this also clears the priorityQueue:
+      for (SuggestScoreDoc hit : priorityQueue.getResults()) {
+        pendingResults.add(hit);
+      }
+
+      // Deduplicate all hits: we already dedup'd efficiently within each segment by
+      // truncating the FST top paths search, but across segments there may still be dups:
+      seenSurfaceForms.clear();
+
+      // TODO: we could use a priority queue here to make cost O(N * log(num)) instead of O(N * log(N)), where N = O(num *
+      // numSegments), but typically numSegments is smallish and num is smallish so this won't matter much in practice:
+
+      Collections.sort(pendingResults,
+                       new Comparator<SuggestScoreDoc>() {
+                         @Override
+                         public int compare(SuggestScoreDoc a, SuggestScoreDoc b) {
+                           // sort by higher score
+                           int cmp = Float.compare(b.score, a.score);
+                           if (cmp == 0) {
+                             // tie break by lower docID:
+                             cmp = Integer.compare(a.doc, b.doc);
+                           }
+                           return cmp;
+                         }
+                       });
+
+      List<SuggestScoreDoc> hits = new ArrayList<>();
+      
+      for (SuggestScoreDoc hit : pendingResults) {
+        if (seenSurfaceForms.contains(hit.key) == false) {
+          seenSurfaceForms.add(hit.key);
+          hits.add(hit);
+          if (hits.size() == num) {
+            break;
+          }
+        }
+      }
+      suggestScoreDocs = hits.toArray(new SuggestScoreDoc[0]);
+    } else {
+      suggestScoreDocs = priorityQueue.getResults();
+    }
+
     if (suggestScoreDocs.length > 0) {
       return new TopSuggestDocs(suggestScoreDocs.length, suggestScoreDocs, suggestScoreDocs[0].score);
     } else {

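The TODO above sketches an O(N * log(num)) alternative to the full sort; one way it could look, purely illustrative and not part of this commit (it assumes SuggestScoreDoc's public key/doc/score fields and keeps the commit's score-descending, docID-ascending tie-break):

  import java.util.Arrays;
  import java.util.HashMap;
  import java.util.List;
  import java.util.Map;
  import java.util.PriorityQueue;
  import org.apache.lucene.search.suggest.document.TopSuggestDocs.SuggestScoreDoc;

  class DedupTopN {
    static SuggestScoreDoc[] dedupTopN(List<SuggestScoreDoc> pendingResults, int num) {
      // 1) keep only the best-scoring hit per surface form: O(N)
      Map<String, SuggestScoreDoc> best = new HashMap<>();
      for (SuggestScoreDoc hit : pendingResults) {
        String key = hit.key.toString();
        SuggestScoreDoc prev = best.get(key);
        if (prev == null || hit.score > prev.score
            || (hit.score == prev.score && hit.doc < prev.doc)) {
          best.put(key, hit);
        }
      }
      // 2) select the top num with a bounded min-heap: O(N * log(num))
      PriorityQueue<SuggestScoreDoc> pq = new PriorityQueue<>(num, (a, b) -> {
        int cmp = Float.compare(a.score, b.score);   // weakest hit sits at the head
        return cmp != 0 ? cmp : Integer.compare(b.doc, a.doc);
      });
      for (SuggestScoreDoc hit : best.values()) {
        pq.offer(hit);
        if (pq.size() > num) {
          pq.poll();                                 // evict the current worst
        }
      }
      // 3) emit in final order: score descending, then docID ascending
      SuggestScoreDoc[] out = pq.toArray(new SuggestScoreDoc[0]);
      Arrays.sort(out, (a, b) -> {
        int cmp = Float.compare(b.score, a.score);
        return cmp != 0 ? cmp : Integer.compare(a.doc, b.doc);
      });
      return out;
    }
  }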
http://git-wip-us.apache.org/repos/asf/lucene-solr/blob/4e2cf61a/lucene/suggest/src/test/org/apache/lucene/search/suggest/document/TestContextQuery.java
----------------------------------------------------------------------
diff --git a/lucene/suggest/src/test/org/apache/lucene/search/suggest/document/TestContextQuery.java b/lucene/suggest/src/test/org/apache/lucene/search/suggest/document/TestContextQuery.java
index 35661ee..2c5dcd8 100644
--- a/lucene/suggest/src/test/org/apache/lucene/search/suggest/document/TestContextQuery.java
+++ b/lucene/suggest/src/test/org/apache/lucene/search/suggest/document/TestContextQuery.java
@@ -89,7 +89,7 @@ public class TestContextQuery extends LuceneTestCase {
     query.addContext("type2", 2);
     query.addContext("type3", 3);
     query.addContext("type4", 4);
-    TopSuggestDocs suggest = suggestIndexSearcher.suggest(query, 5);
+    TopSuggestDocs suggest = suggestIndexSearcher.suggest(query, 5, false);
     assertSuggestions(suggest,
         new Entry("suggestion4", "type4", 5 * 4),
         new Entry("suggestion3", "type3", 6 * 3),
@@ -124,7 +124,7 @@ public class TestContextQuery extends LuceneTestCase {
     SuggestIndexSearcher suggestIndexSearcher = new SuggestIndexSearcher(reader);
     ContextQuery query = new ContextQuery(new PrefixCompletionQuery(analyzer, new Term("suggest_field", "ab")));
     IllegalStateException expected = expectThrows(IllegalStateException.class, () -> {
-      suggestIndexSearcher.suggest(query, 4);
+      suggestIndexSearcher.suggest(query, 4, false);
     });
     assertTrue(expected.getMessage().contains("SuggestField"));
 
@@ -155,7 +155,7 @@ public class TestContextQuery extends LuceneTestCase {
     SuggestIndexSearcher suggestIndexSearcher = new SuggestIndexSearcher(reader);
     ContextQuery query = new ContextQuery(new PrefixCompletionQuery(analyzer, new Term("suggest_field", "sugg")));
     query.addContext("type", 1, false);
-    TopSuggestDocs suggest = suggestIndexSearcher.suggest(query, 5);
+    TopSuggestDocs suggest = suggestIndexSearcher.suggest(query, 5, false);
     assertSuggestions(suggest,
         new Entry("suggestion1", "type1", 4),
         new Entry("suggestion2", "type2", 3),
@@ -185,7 +185,7 @@ public class TestContextQuery extends LuceneTestCase {
     ContextQuery query = new ContextQuery(new PrefixCompletionQuery(analyzer, new Term("suggest_field", "sugg")));
     query.addContext("type", 1);
     query.addContext("typetype", 2);
-    TopSuggestDocs suggest = suggestIndexSearcher.suggest(query, 5);
+    TopSuggestDocs suggest = suggestIndexSearcher.suggest(query, 5, false);
     assertSuggestions(suggest,
         new Entry("suggestion1", "typetype", 4 * 2),
         new Entry("suggestion2", "type", 3 * 1)
@@ -215,7 +215,7 @@ public class TestContextQuery extends LuceneTestCase {
     DirectoryReader reader = iw.getReader();
     SuggestIndexSearcher suggestIndexSearcher = new SuggestIndexSearcher(reader);
     ContextQuery query = new ContextQuery(new PrefixCompletionQuery(analyzer, new Term("suggest_field", "sugg")));
-    TopSuggestDocs suggest = suggestIndexSearcher.suggest(query, 5);
+    TopSuggestDocs suggest = suggestIndexSearcher.suggest(query, 5, false);
     assertSuggestions(suggest,
         new Entry("suggestion_no_ctx", null, 4),
         new Entry("suggestion", "type4", 1));
@@ -249,7 +249,7 @@ public class TestContextQuery extends LuceneTestCase {
     ContextQuery query = new ContextQuery(new PrefixCompletionQuery(analyzer, new Term("suggest_field", "sugg")));
     query.addContext("type4", 10);
     query.addAllContexts();
-    TopSuggestDocs suggest = suggestIndexSearcher.suggest(query, 5);
+    TopSuggestDocs suggest = suggestIndexSearcher.suggest(query, 5, false);
     assertSuggestions(suggest,
         new Entry("suggestion4", "type4", 1 * 10),
         new Entry("suggestion1", null, 4),
@@ -284,7 +284,7 @@ public class TestContextQuery extends LuceneTestCase {
     query.addContext("type2", 2);
     query.addContext("type3", 3);
     query.addContext("type4", 4);
-    TopSuggestDocs suggest = suggestIndexSearcher.suggest(query, 5);
+    TopSuggestDocs suggest = suggestIndexSearcher.suggest(query, 5, false);
     assertSuggestions(suggest,
         new Entry("suggestion", "type1", 4 * 10),
         new Entry("suggestion", "type3", 4 * 3),
@@ -321,7 +321,7 @@ public class TestContextQuery extends LuceneTestCase {
     query.addContext("type1", 7);
     query.addContext("type2", 6);
     query.addAllContexts();
-    TopSuggestDocs suggest = suggestIndexSearcher.suggest(query, 5);
+    TopSuggestDocs suggest = suggestIndexSearcher.suggest(query, 5, false);
     assertSuggestions(suggest,
         new Entry("suggestion1", "type1", 4 * 7),
         new Entry("suggestion2", "type2", 3 * 6),
@@ -357,7 +357,7 @@ public class TestContextQuery extends LuceneTestCase {
     ContextQuery query = new ContextQuery(new PrefixCompletionQuery(analyzer, new Term("suggest_field", "sugg")));
     query.addContext("type3", 3);
     query.addContext("type4", 4);
-    TopSuggestDocs suggest = suggestIndexSearcher.suggest(query, 5);
+    TopSuggestDocs suggest = suggestIndexSearcher.suggest(query, 5, false);
     assertSuggestions(suggest,
         new Entry("suggestion3", "type3", 2 * 3),
         new Entry("suggestion4", "type4", 1 * 4)
@@ -389,7 +389,7 @@ public class TestContextQuery extends LuceneTestCase {
     DirectoryReader reader = iw.getReader();
     SuggestIndexSearcher suggestIndexSearcher = new SuggestIndexSearcher(reader);
     CompletionQuery query = new PrefixCompletionQuery(analyzer, new Term("suggest_field", "sugg"));
-    TopSuggestDocs suggest = suggestIndexSearcher.suggest(query, 5);
+    TopSuggestDocs suggest = suggestIndexSearcher.suggest(query, 5, false);
     assertSuggestions(suggest,
         new Entry("suggestion1", "type1", 4),
         new Entry("suggestion2", "type2", 3),
@@ -426,7 +426,7 @@ public class TestContextQuery extends LuceneTestCase {
     query.addContext("type2", 2);
     query.addContext("type3", 3);
     query.addContext("type4", 4);
-    TopSuggestDocs suggest = suggestIndexSearcher.suggest(query, 5);
+    TopSuggestDocs suggest = suggestIndexSearcher.suggest(query, 5, false);
     assertSuggestions(suggest,
         new Entry("suggestion1", "type3", 8 * 3),
         new Entry("suggestion4", "type4", 5 * 4),
@@ -460,7 +460,7 @@ public class TestContextQuery extends LuceneTestCase {
     DirectoryReader reader = iw.getReader();
     SuggestIndexSearcher suggestIndexSearcher = new SuggestIndexSearcher(reader);
     ContextQuery query = new ContextQuery(new PrefixCompletionQuery(analyzer, new Term("suggest_field", "sugg")));
-    TopSuggestDocs suggest = suggestIndexSearcher.suggest(query, 4);
+    TopSuggestDocs suggest = suggestIndexSearcher.suggest(query, 4, false);
     assertSuggestions(suggest,
         new Entry("suggestion1", "type1", 4),
         new Entry("suggestion2", "type2", 3),
@@ -520,7 +520,7 @@ public class TestContextQuery extends LuceneTestCase {
         for (int i = 0; i < contexts.size(); i++) {
           query.addContext(contexts.get(i), i + 1);
         }
-        TopSuggestDocs suggest = suggestIndexSearcher.suggest(query, 4);
+        TopSuggestDocs suggest = suggestIndexSearcher.suggest(query, 4, false);
         assertSuggestions(suggest, Arrays.copyOfRange(expectedResults, 0, 4));
       }
     }
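
All of these call sites switch from suggest(query, n) to suggest(query, n, false); the extra argument is the new skipDuplicates flag. Assuming the overload simply delegates to the collector shown earlier, the two forms below are equivalent:

    // Convenience overload (skipDuplicates=false keeps the old behavior):
    TopSuggestDocs a = suggestIndexSearcher.suggest(query, 5, false);

    // Explicit collector form:
    TopSuggestDocsCollector c = new TopSuggestDocsCollector(5, false);
    suggestIndexSearcher.suggest(query, c);
    TopSuggestDocs b = c.get();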

http://git-wip-us.apache.org/repos/asf/lucene-solr/blob/4e2cf61a/lucene/suggest/src/test/org/apache/lucene/search/suggest/document/TestContextSuggestField.java
----------------------------------------------------------------------
diff --git a/lucene/suggest/src/test/org/apache/lucene/search/suggest/document/TestContextSuggestField.java b/lucene/suggest/src/test/org/apache/lucene/search/suggest/document/TestContextSuggestField.java
index 9f207f8..0c3b254 100644
--- a/lucene/suggest/src/test/org/apache/lucene/search/suggest/document/TestContextSuggestField.java
+++ b/lucene/suggest/src/test/org/apache/lucene/search/suggest/document/TestContextSuggestField.java
@@ -172,7 +172,7 @@ public class TestContextSuggestField extends LuceneTestCase {
     SuggestIndexSearcher suggestIndexSearcher = new SuggestIndexSearcher(reader);
 
     CompletionQuery query = new PrefixCompletionQuery(analyzer, new Term("suggest_field", "sugg"));
-    TopSuggestDocs suggest = suggestIndexSearcher.suggest(query, 10);
+    TopSuggestDocs suggest = suggestIndexSearcher.suggest(query, 10, false);
     assertSuggestions(suggest,
         new Entry("suggestion1", 4),
         new Entry("suggestion2", 3),
@@ -180,7 +180,7 @@ public class TestContextSuggestField extends LuceneTestCase {
         new Entry("suggestion4", 1));
 
     query = new PrefixCompletionQuery(analyzer, new Term("context_suggest_field", "sugg"));
-    suggest = suggestIndexSearcher.suggest(query, 10);
+    suggest = suggestIndexSearcher.suggest(query, 10, false);
     assertSuggestions(suggest,
         new Entry("suggestion1", "type1", 4),
         new Entry("suggestion2", "type2", 3),
@@ -212,14 +212,14 @@ public class TestContextSuggestField extends LuceneTestCase {
     DirectoryReader reader = iw.getReader();
     SuggestIndexSearcher suggestIndexSearcher = new SuggestIndexSearcher(reader);
     ContextQuery query = new ContextQuery(new PrefixCompletionQuery(completionAnalyzer, new Term("suggest_field", "sugg")));
-    TopSuggestDocs suggest = suggestIndexSearcher.suggest(query, 4);
+    TopSuggestDocs suggest = suggestIndexSearcher.suggest(query, 4, false);
     assertSuggestions(suggest,
         new Entry("suggestion1", "type1", 4),
         new Entry("suggestion2", "type2", 3),
         new Entry("suggestion3", "type3", 2),
         new Entry("suggestion4", "type4", 1));
     query.addContext("type1");
-    suggest = suggestIndexSearcher.suggest(query, 4);
+    suggest = suggestIndexSearcher.suggest(query, 4, false);
     assertSuggestions(suggest,
         new Entry("suggestion1", "type1", 4));
     reader.close();

http://git-wip-us.apache.org/repos/asf/lucene-solr/blob/4e2cf61a/lucene/suggest/src/test/org/apache/lucene/search/suggest/document/TestFuzzyCompletionQuery.java
----------------------------------------------------------------------
diff --git a/lucene/suggest/src/test/org/apache/lucene/search/suggest/document/TestFuzzyCompletionQuery.java b/lucene/suggest/src/test/org/apache/lucene/search/suggest/document/TestFuzzyCompletionQuery.java
index 9a773ca..40c3f88 100644
--- a/lucene/suggest/src/test/org/apache/lucene/search/suggest/document/TestFuzzyCompletionQuery.java
+++ b/lucene/suggest/src/test/org/apache/lucene/search/suggest/document/TestFuzzyCompletionQuery.java
@@ -66,7 +66,7 @@ public class TestFuzzyCompletionQuery extends LuceneTestCase {
     DirectoryReader reader = iw.getReader();
     SuggestIndexSearcher suggestIndexSearcher = new SuggestIndexSearcher(reader);
     CompletionQuery query = new FuzzyCompletionQuery(analyzer, new Term("suggest_field", "sugg"));
-    TopSuggestDocs suggest = suggestIndexSearcher.suggest(query, 4);
+    TopSuggestDocs suggest = suggestIndexSearcher.suggest(query, 4, false);
     assertSuggestions(suggest,
         new Entry("suaggestion", 4 * 2),
         new Entry("suggestion", 2 * 3),
@@ -101,7 +101,7 @@ public class TestFuzzyCompletionQuery extends LuceneTestCase {
     DirectoryReader reader = iw.getReader();
     SuggestIndexSearcher suggestIndexSearcher = new SuggestIndexSearcher(reader);
     CompletionQuery query =  new ContextQuery(new FuzzyCompletionQuery(analyzer, new Term("suggest_field", "sugge")));
-    TopSuggestDocs suggest = suggestIndexSearcher.suggest(query, 5);
+    TopSuggestDocs suggest = suggestIndexSearcher.suggest(query, 5, false);
     assertSuggestions(suggest,
         new Entry("suggestion", "type4", 4),
         new Entry("suggdestion", "type4", 4),
@@ -140,7 +140,7 @@ public class TestFuzzyCompletionQuery extends LuceneTestCase {
     ContextQuery contextQuery = new ContextQuery(fuzzyQuery);
     contextQuery.addContext("type1", 6);
     contextQuery.addContext("type3", 2);
-    TopSuggestDocs suggest = suggestIndexSearcher.suggest(contextQuery, 5);
+    TopSuggestDocs suggest = suggestIndexSearcher.suggest(contextQuery, 5, false);
     assertSuggestions(suggest,
         new Entry("sduggestion", "type1", 1 * (1 + 6)),
         new Entry("sugdgestion", "type3", 1 * (3 + 2))

http://git-wip-us.apache.org/repos/asf/lucene-solr/blob/4e2cf61a/lucene/suggest/src/test/org/apache/lucene/search/suggest/document/TestPrefixCompletionQuery.java
----------------------------------------------------------------------
diff --git a/lucene/suggest/src/test/org/apache/lucene/search/suggest/document/TestPrefixCompletionQuery.java b/lucene/suggest/src/test/org/apache/lucene/search/suggest/document/TestPrefixCompletionQuery.java
index f5bacef..515ac2d 100644
--- a/lucene/suggest/src/test/org/apache/lucene/search/suggest/document/TestPrefixCompletionQuery.java
+++ b/lucene/suggest/src/test/org/apache/lucene/search/suggest/document/TestPrefixCompletionQuery.java
@@ -135,7 +135,7 @@ public class TestPrefixCompletionQuery extends LuceneTestCase {
     DirectoryReader reader = iw.getReader();
     SuggestIndexSearcher suggestIndexSearcher = new SuggestIndexSearcher(reader);
     PrefixCompletionQuery query = new PrefixCompletionQuery(analyzer, new Term("suggest_field", "ab"));
-    TopSuggestDocs lookupDocs = suggestIndexSearcher.suggest(query, 3);
+    TopSuggestDocs lookupDocs = suggestIndexSearcher.suggest(query, 3, false);
     assertSuggestions(lookupDocs, new Entry("abcdd", 5), new Entry("abd", 4), new Entry("abc", 3));
 
     reader.close();
@@ -165,7 +165,7 @@ public class TestPrefixCompletionQuery extends LuceneTestCase {
     PrefixCompletionQuery query = new PrefixCompletionQuery(analyzer, new Term("suggest_field", "abc_"), filter);
     // if at most half of the top scoring documents have been filtered out
     // the search should be admissible for a single segment
-    TopSuggestDocs suggest = indexSearcher.suggest(query, num);
+    TopSuggestDocs suggest = indexSearcher.suggest(query, num, false);
     assertTrue(suggest.totalHits >= 1);
     assertThat(suggest.scoreLookupDocs()[0].key.toString(), equalTo("abc_" + topScore));
     assertThat(suggest.scoreLookupDocs()[0].score, equalTo((float) topScore));
@@ -174,14 +174,14 @@ public class TestPrefixCompletionQuery extends LuceneTestCase {
     query = new PrefixCompletionQuery(analyzer, new Term("suggest_field", "abc_"), filter);
     // if more than half of the top scoring documents have been filtered out
     // search is not admissible, so # of suggestions requested is num instead of 1
-    suggest = indexSearcher.suggest(query, num);
+    suggest = indexSearcher.suggest(query, num, false);
     assertSuggestions(suggest, new Entry("abc_0", 0));
 
     filter = new NumericRangeBitsProducer("filter_int_fld", num - 1, num - 1);
     query = new PrefixCompletionQuery(analyzer, new Term("suggest_field", "abc_"), filter);
     // if only lower scoring documents are filtered out
     // search is admissible
-    suggest = indexSearcher.suggest(query, 1);
+    suggest = indexSearcher.suggest(query, 1, false);
     assertSuggestions(suggest, new Entry("abc_" + (num - 1), num - 1));
 
     reader.close();
@@ -216,13 +216,13 @@ public class TestPrefixCompletionQuery extends LuceneTestCase {
 
     // suggest without filter
     PrefixCompletionQuery query = new PrefixCompletionQuery(analyzer, new Term("suggest_field", "app"));
-    TopSuggestDocs suggest = indexSearcher.suggest(query, 3);
+    TopSuggestDocs suggest = indexSearcher.suggest(query, 3, false);
     assertSuggestions(suggest, new Entry("apple", 5), new Entry("applle", 4), new Entry("apples", 3));
 
     // suggest with filter
     BitsProducer filter = new NumericRangeBitsProducer("filter_int_fld", 5, 12);
     query = new PrefixCompletionQuery(analyzer, new Term("suggest_field", "app"), filter);
-    suggest = indexSearcher.suggest(query, 3);
+    suggest = indexSearcher.suggest(query, 3, false);
     assertSuggestions(suggest, new Entry("applle", 4), new Entry("apples", 3));
 
     reader.close();
@@ -243,10 +243,10 @@ public class TestPrefixCompletionQuery extends LuceneTestCase {
     DirectoryReader reader = iw.getReader();
     SuggestIndexSearcher indexSearcher = new SuggestIndexSearcher(reader);
     CompletionQuery query = new PrefixCompletionQuery(analyzer, new Term("suggest_field_no_p_sep_or_pos_inc", "fo"));
-    TopSuggestDocs suggest = indexSearcher.suggest(query, 4); // all 4
+    TopSuggestDocs suggest = indexSearcher.suggest(query, 4, false); // all 4
     assertSuggestions(suggest, new Entry("the foo bar", 10), new Entry("the fo", 9), new Entry("foo bar", 8), new Entry("foobar", 7));
     query = new PrefixCompletionQuery(analyzer, new Term("suggest_field_no_p_sep_or_pos_inc", "foob"));
-    suggest = indexSearcher.suggest(query, 4); // not the fo
+    suggest = indexSearcher.suggest(query, 4, false); // not the fo
     assertSuggestions(suggest, new Entry("the foo bar", 10), new Entry("foo bar", 8), new Entry("foobar", 7));
     reader.close();
     iw.close();
@@ -266,10 +266,10 @@ public class TestPrefixCompletionQuery extends LuceneTestCase {
     DirectoryReader reader = iw.getReader();
     SuggestIndexSearcher indexSearcher = new SuggestIndexSearcher(reader);
     CompletionQuery query = new PrefixCompletionQuery(analyzer, new Term("suggest_field_no_p_pos_inc", "fo"));
-    TopSuggestDocs suggest = indexSearcher.suggest(query, 4); //matches all 4
+    TopSuggestDocs suggest = indexSearcher.suggest(query, 4, false); //matches all 4
     assertSuggestions(suggest, new Entry("the foo bar", 10), new Entry("the fo", 9), new Entry("foo bar", 8), new Entry("foobar", 7));
     query = new PrefixCompletionQuery(analyzer, new Term("suggest_field_no_p_pos_inc", "foob"));
-    suggest = indexSearcher.suggest(query, 4); // only foobar
+    suggest = indexSearcher.suggest(query, 4, false); // only foobar
     assertSuggestions(suggest, new Entry("foobar", 7));
     reader.close();
     iw.close();
@@ -289,10 +289,10 @@ public class TestPrefixCompletionQuery extends LuceneTestCase {
     DirectoryReader reader = iw.getReader();
     SuggestIndexSearcher indexSearcher = new SuggestIndexSearcher(reader);
     CompletionQuery query = new PrefixCompletionQuery(analyzer, new Term("suggest_field_no_p_sep", "fo"));
-    TopSuggestDocs suggest = indexSearcher.suggest(query, 4); // matches all 4
+    TopSuggestDocs suggest = indexSearcher.suggest(query, 4, false); // matches all 4
     assertSuggestions(suggest, new Entry("the foo bar", 10), new Entry("the fo", 9), new Entry("foo bar", 8), new Entry("foobar", 7));
     query = new PrefixCompletionQuery(analyzer, new Term("suggest_field_no_p_sep", "foob"));
-    suggest = indexSearcher.suggest(query, 4); // except the fo
+    suggest = indexSearcher.suggest(query, 4, false); // except the fo
     assertSuggestions(suggest, new Entry("the foo bar", 10), new Entry("foo bar", 8), new Entry("foobar", 7));
     reader.close();
     iw.close();
@@ -329,10 +329,10 @@ public class TestPrefixCompletionQuery extends LuceneTestCase {
     SuggestIndexSearcher indexSearcher = new SuggestIndexSearcher(reader);
 
     PrefixCompletionQuery query = new PrefixCompletionQuery(analyzer, new Term("suggest_field", "app"));
-    assertEquals(0, indexSearcher.suggest(query, 3).totalHits);
+    assertEquals(0, indexSearcher.suggest(query, 3, false).totalHits);
 
     query = new PrefixCompletionQuery(analyzer, new Term("suggest_field2", "app"));
-    assertSuggestions(indexSearcher.suggest(query, 3), new Entry("apples", 3));
+    assertSuggestions(indexSearcher.suggest(query, 3, false), new Entry("apples", 3));
 
     reader.close();
     iw.close();

http://git-wip-us.apache.org/repos/asf/lucene-solr/blob/4e2cf61a/lucene/suggest/src/test/org/apache/lucene/search/suggest/document/TestRegexCompletionQuery.java
----------------------------------------------------------------------
diff --git a/lucene/suggest/src/test/org/apache/lucene/search/suggest/document/TestRegexCompletionQuery.java b/lucene/suggest/src/test/org/apache/lucene/search/suggest/document/TestRegexCompletionQuery.java
index 23710e9..2dd7184 100644
--- a/lucene/suggest/src/test/org/apache/lucene/search/suggest/document/TestRegexCompletionQuery.java
+++ b/lucene/suggest/src/test/org/apache/lucene/search/suggest/document/TestRegexCompletionQuery.java
@@ -67,7 +67,7 @@ public class TestRegexCompletionQuery extends LuceneTestCase {
     DirectoryReader reader = iw.getReader();
     SuggestIndexSearcher suggestIndexSearcher = new SuggestIndexSearcher(reader);
     RegexCompletionQuery query = new RegexCompletionQuery(new Term("suggest_field", "[a|w|s]s?ugg"));
-    TopSuggestDocs suggest = suggestIndexSearcher.suggest(query, 4);
+    TopSuggestDocs suggest = suggestIndexSearcher.suggest(query, 4, false);
     assertSuggestions(suggest, new Entry("wsuggestion", 4), new Entry("ssuggestion", 3),
         new Entry("asuggestion", 2), new Entry("suggestion", 1));
 
@@ -98,7 +98,7 @@ public class TestRegexCompletionQuery extends LuceneTestCase {
     DirectoryReader reader = iw.getReader();
     SuggestIndexSearcher suggestIndexSearcher = new SuggestIndexSearcher(reader);
     CompletionQuery query = new RegexCompletionQuery(new Term("suggest_field", "[a|s][d|u|s][u|d|g]"));
-    TopSuggestDocs suggest = suggestIndexSearcher.suggest(query, 5);
+    TopSuggestDocs suggest = suggestIndexSearcher.suggest(query, 5, false);
     assertSuggestions(suggest,
         new Entry("sduggestion", "type1", 5),
         new Entry("sudggestion", "type2", 4),
@@ -137,7 +137,7 @@ public class TestRegexCompletionQuery extends LuceneTestCase {
     contextQuery.addContext("type1", 6);
     contextQuery.addContext("type3", 7);
     contextQuery.addAllContexts();
-    TopSuggestDocs suggest = suggestIndexSearcher.suggest(contextQuery, 5);
+    TopSuggestDocs suggest = suggestIndexSearcher.suggest(contextQuery, 5, false);
     assertSuggestions(suggest,
         new Entry("sduggestion", "type1", 5 * 6),
         new Entry("sugdgestion", "type3", 3 * 7),

http://git-wip-us.apache.org/repos/asf/lucene-solr/blob/4e2cf61a/lucene/suggest/src/test/org/apache/lucene/search/suggest/document/TestSuggestField.java
----------------------------------------------------------------------
diff --git a/lucene/suggest/src/test/org/apache/lucene/search/suggest/document/TestSuggestField.java b/lucene/suggest/src/test/org/apache/lucene/search/suggest/document/TestSuggestField.java
index fe9992d..3efb50d 100644
--- a/lucene/suggest/src/test/org/apache/lucene/search/suggest/document/TestSuggestField.java
+++ b/lucene/suggest/src/test/org/apache/lucene/search/suggest/document/TestSuggestField.java
@@ -20,7 +20,10 @@ import java.io.ByteArrayOutputStream;
 import java.io.IOException;
 import java.util.ArrayList;
 import java.util.Arrays;
+import java.util.Collections;
+import java.util.Comparator;
 import java.util.HashMap;
+import java.util.HashSet;
 import java.util.List;
 import java.util.Map;
 import java.util.Set;
@@ -33,9 +36,9 @@ import org.apache.lucene.analysis.TokenStream;
 import org.apache.lucene.codecs.Codec;
 import org.apache.lucene.codecs.PostingsFormat;
 import org.apache.lucene.codecs.lucene70.Lucene70Codec;
-import org.apache.lucene.document.IntPoint;
 import org.apache.lucene.document.Document;
 import org.apache.lucene.document.Field;
+import org.apache.lucene.document.IntPoint;
 import org.apache.lucene.document.StoredField;
 import org.apache.lucene.index.DirectoryReader;
 import org.apache.lucene.index.IndexWriter;
@@ -122,7 +125,7 @@ public class TestSuggestField extends LuceneTestCase {
     DirectoryReader reader = iw.getReader();
     SuggestIndexSearcher suggestIndexSearcher = new SuggestIndexSearcher(reader);
     PrefixCompletionQuery query = new PrefixCompletionQuery(analyzer, new Term("suggest_field", "ab"));
-    TopSuggestDocs lookupDocs = suggestIndexSearcher.suggest(query, 3);
+    TopSuggestDocs lookupDocs = suggestIndexSearcher.suggest(query, 3, false);
     assertThat(lookupDocs.totalHits, equalTo(0));
     reader.close();
     iw.close();
@@ -157,7 +160,7 @@ public class TestSuggestField extends LuceneTestCase {
     int[] weights = new int[num];
     for(int i = 0; i < num; i++) {
       Document document = new Document();
-      weights[i] = Math.abs(random().nextInt());
+      weights[i] = random().nextInt(Integer.MAX_VALUE);
       document.add(new SuggestField("suggest_field", "abc", weights[i]));
       iw.addDocument(document);
 
@@ -175,12 +178,230 @@ public class TestSuggestField extends LuceneTestCase {
 
     SuggestIndexSearcher suggestIndexSearcher = new SuggestIndexSearcher(reader);
     PrefixCompletionQuery query = new PrefixCompletionQuery(analyzer, new Term("suggest_field", "abc"));
-    TopSuggestDocs lookupDocs = suggestIndexSearcher.suggest(query, num);
+    TopSuggestDocs lookupDocs = suggestIndexSearcher.suggest(query, num, false);
+    assertSuggestions(lookupDocs, expectedEntries);
+
+    reader.close();
+    iw.close();
+  }
+
+  public void testDeduplication() throws Exception {
+    Analyzer analyzer = new MockAnalyzer(random());
+    RandomIndexWriter iw = new RandomIndexWriter(random(), dir, iwcWithSuggestField(analyzer, "suggest_field"));
+    final int num = TestUtil.nextInt(random(), 2, 20);
+    int[] weights = new int[num];
+    int bestABCWeight = Integer.MIN_VALUE;
+    int bestABDWeight = Integer.MIN_VALUE;
+    for(int i = 0; i < num; i++) {
+      Document document = new Document();
+      weights[i] = random().nextInt(Integer.MAX_VALUE);
+      String suggestValue;
+      boolean doABC;
+      if (i == 0) {
+        doABC = true;
+      } else if (i == 1) {
+        doABC = false;
+      } else {
+        doABC = random().nextBoolean();
+      }
+      if (doABC) {
+        suggestValue = "abc";
+        bestABCWeight = Math.max(bestABCWeight, weights[i]);
+      } else {
+        suggestValue = "abd";
+        bestABDWeight = Math.max(bestABDWeight, weights[i]);
+      }
+      document.add(new SuggestField("suggest_field", suggestValue, weights[i]));
+      iw.addDocument(document);
+
+      if (usually()) {
+        iw.commit();
+      }
+    }
+
+    DirectoryReader reader = iw.getReader();
+    Entry[] expectedEntries = new Entry[2];
+    if (bestABDWeight > bestABCWeight) {
+      expectedEntries[0] = new Entry("abd", bestABDWeight);
+      expectedEntries[1] = new Entry("abc", bestABCWeight);
+    } else {
+      expectedEntries[0] = new Entry("abc", bestABCWeight);
+      expectedEntries[1] = new Entry("abd", bestABDWeight);
+    }
+
+    SuggestIndexSearcher suggestIndexSearcher = new SuggestIndexSearcher(reader);
+    PrefixCompletionQuery query = new PrefixCompletionQuery(analyzer, new Term("suggest_field", "a"));
+    TopSuggestDocsCollector collector = new TopSuggestDocsCollector(2, true);
+    suggestIndexSearcher.suggest(query, collector);
+    TopSuggestDocs lookupDocs = collector.get();
+    assertSuggestions(lookupDocs, expectedEntries);
+
+    reader.close();
+    iw.close();
+  }
+
+  public void testExtremeDeduplication() throws Exception {
+    Analyzer analyzer = new MockAnalyzer(random());
+    RandomIndexWriter iw = new RandomIndexWriter(random(), dir, iwcWithSuggestField(analyzer, "suggest_field"));
+    final int num = atLeast(5000);
+    int bestWeight = Integer.MIN_VALUE;
+    for(int i = 0; i < num; i++) {
+      Document document = new Document();
+      int weight = TestUtil.nextInt(random(), 10, 100);
+      bestWeight = Math.max(weight, bestWeight);
+      document.add(new SuggestField("suggest_field", "abc", weight));
+      iw.addDocument(document);
+      if (rarely()) {
+        iw.commit();
+      }
+    }
+    Document document = new Document();
+    document.add(new SuggestField("suggest_field", "abd", 7));
+    iw.addDocument(document);
+
+    if (random().nextBoolean()) {
+      iw.forceMerge(1);
+    }
+
+    DirectoryReader reader = iw.getReader();
+    Entry[] expectedEntries = new Entry[2];
+    expectedEntries[0] = new Entry("abc", bestWeight);
+    expectedEntries[1] = new Entry("abd", 7);
+
+    SuggestIndexSearcher suggestIndexSearcher = new SuggestIndexSearcher(reader);
+    PrefixCompletionQuery query = new PrefixCompletionQuery(analyzer, new Term("suggest_field", "a"));
+    TopSuggestDocsCollector collector = new TopSuggestDocsCollector(2, true);
+    suggestIndexSearcher.suggest(query, collector);
+    TopSuggestDocs lookupDocs = collector.get();
     assertSuggestions(lookupDocs, expectedEntries);
 
     reader.close();
     iw.close();
   }
+  
+  private static String randomSimpleString(int numDigits, int maxLen) {
+    final int len = TestUtil.nextInt(random(), 1, maxLen);
+    final char[] chars = new char[len];
+    for(int j=0;j<len;j++) {
+      chars[j] = (char) ('a' + random().nextInt(numDigits));
+    }
+    return new String(chars);
+  }
+
+  public void testRandom() throws Exception {
+    int numDigits = TestUtil.nextInt(random(), 1, 6);
+    Set<String> keys = new HashSet<>();
+    int keyCount = TestUtil.nextInt(random(), 1, 20);
+    if (numDigits == 1) {
+      keyCount = Math.min(9, keyCount);
+    }
+    while (keys.size() < keyCount) {
+      keys.add(randomSimpleString(numDigits, 10));
+    }
+    List<String> keysList = new ArrayList<>(keys);
+
+    Analyzer analyzer = new MockAnalyzer(random());
+    IndexWriterConfig iwc = iwcWithSuggestField(analyzer, "suggest_field");
+    // we rely on docID order:
+    iwc.setMergePolicy(newLogMergePolicy());
+    RandomIndexWriter iw = new RandomIndexWriter(random(), dir, iwc);
+    int docCount = TestUtil.nextInt(random(), 1, 200);
+    Entry[] docs = new Entry[docCount];
+    for(int i=0;i<docCount;i++) {
+      int weight = random().nextInt(40);
+      String key = keysList.get(random().nextInt(keyCount));
+      //System.out.println("KEY: " + key);
+      docs[i] = new Entry(key, null, weight, i);
+      Document doc = new Document();
+      doc.add(new SuggestField("suggest_field", key, weight));
+      iw.addDocument(doc);
+      if (usually()) {
+        iw.commit();
+      }
+    }
+
+    DirectoryReader reader = iw.getReader();
+    SuggestIndexSearcher searcher = new SuggestIndexSearcher(reader);
+
+    int iters = atLeast(200);
+    for(int iter=0;iter<iters;iter++) {
+      String prefix = randomSimpleString(numDigits, 2);
+      if (VERBOSE) {
+        System.out.println("\nTEST: prefix=" + prefix);
+      }
+
+      // slow but hopefully correct suggester:
+      List<Entry> expected = new ArrayList<>();
+      for(Entry doc : docs) {
+        if (doc.output.startsWith(prefix)) {
+          expected.add(doc);
+        }
+      }
+      Collections.sort(expected,
+                       new Comparator<Entry>() {
+                         @Override
+                         public int compare(Entry a, Entry b) {
+                           // sort by higher score:
+                           int cmp = Float.compare(b.value, a.value);
+                           if (cmp == 0) {
+                             // tie break by smaller docID:
+                             cmp = Integer.compare(a.id, b.id);
+                           }
+                           return cmp;
+                         }
+                       });
+
+      boolean dedup = random().nextBoolean();
+      if (dedup) {
+        List<Entry> deduped = new ArrayList<>();
+        Set<String> seen = new HashSet<>();
+        for(Entry entry : expected) {
+          if (seen.contains(entry.output) == false) {
+            seen.add(entry.output);
+            deduped.add(entry);
+          }
+        }
+        expected = deduped;
+      }
+
+      // TODO: re-enable this, except something is buggy about tie breaks at the topN threshold now:
+      //int topN = TestUtil.nextInt(random(), 1, docCount+10);
+      int topN = docCount;
+      
+      if (VERBOSE) {
+        if (dedup) {
+          System.out.println("  expected (dedup'd) topN=" + topN + ":");
+        } else {
+          System.out.println("  expected topN=" + topN + ":");
+        }
+        for(int i=0;i<expected.size();i++) {
+          if (i >= topN) {
+            System.out.println("    leftover: " + i + ": " + expected.get(i));
+          } else {
+            System.out.println("    " + i + ": " + expected.get(i));
+          }
+        }
+      }
+      expected = expected.subList(0, Math.min(topN, expected.size()));
+      
+      PrefixCompletionQuery query = new PrefixCompletionQuery(analyzer, new Term("suggest_field", prefix));
+      TopSuggestDocsCollector collector = new TopSuggestDocsCollector(topN, dedup);
+      searcher.suggest(query, collector);
+      TopSuggestDocs actual = collector.get();
+      if (VERBOSE) {
+        System.out.println("  actual:");
+        SuggestScoreDoc[] suggestScoreDocs = (SuggestScoreDoc[]) actual.scoreDocs;
+        for(int i=0;i<suggestScoreDocs.length;i++) {
+          System.out.println("    " + i + ": " + suggestScoreDocs[i]);
+        }
+      }
+
+      assertSuggestions(actual, expected.toArray(new Entry[expected.size()]));
+    }
+    
+    reader.close();
+    iw.close();
+  }
 
   @Test
   public void testNRTDeletedDocFiltering() throws Exception {
@@ -214,7 +435,7 @@ public class TestSuggestField extends LuceneTestCase {
     DirectoryReader reader = DirectoryReader.open(iw);
     SuggestIndexSearcher indexSearcher = new SuggestIndexSearcher(reader);
     PrefixCompletionQuery query = new PrefixCompletionQuery(analyzer, new Term("suggest_field", "abc_"));
-    TopSuggestDocs suggest = indexSearcher.suggest(query, numLive);
+    TopSuggestDocs suggest = indexSearcher.suggest(query, numLive, false);
     assertSuggestions(suggest, expectedEntries.toArray(new Entry[expectedEntries.size()]));
 
     reader.close();
@@ -248,7 +469,7 @@ public class TestSuggestField extends LuceneTestCase {
     // no random access required;
     // calling suggest with filter that does not match any documents should early terminate
     PrefixCompletionQuery query = new PrefixCompletionQuery(analyzer, new Term("suggest_field", "abc_"), filter);
-    TopSuggestDocs suggest = indexSearcher.suggest(query, num);
+    TopSuggestDocs suggest = indexSearcher.suggest(query, num, false);
     assertThat(suggest.totalHits, equalTo(0));
     reader.close();
     iw.close();
@@ -276,7 +497,7 @@ public class TestSuggestField extends LuceneTestCase {
     DirectoryReader reader = DirectoryReader.open(iw);
     SuggestIndexSearcher indexSearcher = new SuggestIndexSearcher(reader);
     PrefixCompletionQuery query = new PrefixCompletionQuery(analyzer, new Term("suggest_field", "abc_"));
-    TopSuggestDocs suggest = indexSearcher.suggest(query, num);
+    TopSuggestDocs suggest = indexSearcher.suggest(query, num, false);
     assertThat(suggest.totalHits, equalTo(0));
 
     reader.close();
@@ -306,7 +527,7 @@ public class TestSuggestField extends LuceneTestCase {
     DirectoryReader reader = DirectoryReader.open(iw);
     SuggestIndexSearcher indexSearcher = new SuggestIndexSearcher(reader);
     PrefixCompletionQuery query = new PrefixCompletionQuery(analyzer, new Term("suggest_field", "abc_"));
-    TopSuggestDocs suggest = indexSearcher.suggest(query, 1);
+    TopSuggestDocs suggest = indexSearcher.suggest(query, 1, false);
     assertSuggestions(suggest, new Entry("abc_1", 1));
 
     reader.close();
@@ -335,10 +556,10 @@ public class TestSuggestField extends LuceneTestCase {
 
     SuggestIndexSearcher suggestIndexSearcher = new SuggestIndexSearcher(reader);
     PrefixCompletionQuery query = new PrefixCompletionQuery(analyzer, new Term("sug_field_1", "ap"));
-    TopSuggestDocs suggestDocs1 = suggestIndexSearcher.suggest(query, 4);
+    TopSuggestDocs suggestDocs1 = suggestIndexSearcher.suggest(query, 4, false);
     assertSuggestions(suggestDocs1, new Entry("apple", 4), new Entry("aples", 3));
     query = new PrefixCompletionQuery(analyzer, new Term("sug_field_2", "ap"));
-    TopSuggestDocs suggestDocs2 = suggestIndexSearcher.suggest(query, 4);
+    TopSuggestDocs suggestDocs2 = suggestIndexSearcher.suggest(query, 4, false);
     assertSuggestions(suggestDocs2, new Entry("april", 3), new Entry("apartment", 2));
 
     // check that the doc ids are consistent
@@ -372,7 +593,7 @@ public class TestSuggestField extends LuceneTestCase {
     DirectoryReader reader = iw.getReader();
     SuggestIndexSearcher indexSearcher = new SuggestIndexSearcher(reader);
     PrefixCompletionQuery query = new PrefixCompletionQuery(analyzer, new Term("suggest_field", "abc_"));
-    TopSuggestDocs suggest = indexSearcher.suggest(query, 1);
+    TopSuggestDocs suggest = indexSearcher.suggest(query, 1, false);
     assertSuggestions(suggest, new Entry("abc_" + num, num));
 
     reader.close();
@@ -402,7 +623,7 @@ public class TestSuggestField extends LuceneTestCase {
     DirectoryReader reader = iw.getReader();
     SuggestIndexSearcher indexSearcher = new SuggestIndexSearcher(reader);
     PrefixCompletionQuery query = new PrefixCompletionQuery(analyzer, new Term("suggest_field", "abc_"));
-    TopSuggestDocs suggest = indexSearcher.suggest(query, (entries.size() == 0) ? 1 : entries.size());
+    TopSuggestDocs suggest = indexSearcher.suggest(query, (entries.size() == 0) ? 1 : entries.size(), false);
     assertSuggestions(suggest, entries.toArray(new Entry[entries.size()]));
 
     reader.close();
@@ -430,7 +651,7 @@ public class TestSuggestField extends LuceneTestCase {
     DirectoryReader reader = iw.getReader();
     SuggestIndexSearcher indexSearcher = new SuggestIndexSearcher(reader);
     PrefixCompletionQuery query = new PrefixCompletionQuery(analyzer, new Term("suggest_field", "abc_"));
-    TopSuggestDocs suggest = indexSearcher.suggest(query, num);
+    TopSuggestDocs suggest = indexSearcher.suggest(query, num, false);
     assertEquals(num, suggest.totalHits);
     for (SuggestScoreDoc suggestScoreDoc : suggest.scoreLookupDocs()) {
       String key = suggestScoreDoc.key.toString();
@@ -456,7 +677,7 @@ public class TestSuggestField extends LuceneTestCase {
     for (int i = 0; i < num; i++) {
       Document document = new Document();
       String suggest = prefixes[i % 3] + TestUtil.randomSimpleString(random(), 10) + "_" +String.valueOf(i);
-      int weight = Math.abs(random().nextInt());
+      int weight = random().nextInt(Integer.MAX_VALUE);
       document.add(new SuggestField("suggest_field", suggest, weight));
       mappings.put(suggest, weight);
       iw.addDocument(document);
@@ -470,7 +691,7 @@ public class TestSuggestField extends LuceneTestCase {
     SuggestIndexSearcher indexSearcher = new SuggestIndexSearcher(reader);
     for (String prefix : prefixes) {
       PrefixCompletionQuery query = new PrefixCompletionQuery(analyzer, new Term("suggest_field", prefix));
-      TopSuggestDocs suggest = indexSearcher.suggest(query, num);
+      TopSuggestDocs suggest = indexSearcher.suggest(query, num, false);
       assertTrue(suggest.totalHits > 0);
       float topScore = -1;
       for (SuggestScoreDoc scoreDoc : suggest.scoreLookupDocs()) {
@@ -498,7 +719,7 @@ public class TestSuggestField extends LuceneTestCase {
     for (int i = 0; i < num; i++) {
       Document document = lineFileDocs.nextDoc();
       String title = document.getField("title").stringValue();
-      int weight = Math.abs(random().nextInt());
+      int weight = random().nextInt(Integer.MAX_VALUE);
       Integer prevWeight = mappings.get(title);
       if (prevWeight == null || prevWeight < weight) {
         mappings.put(title, weight);
@@ -519,7 +740,7 @@ public class TestSuggestField extends LuceneTestCase {
       String title = entry.getKey();
 
       PrefixCompletionQuery query = new PrefixCompletionQuery(analyzer, new Term("suggest_field", title));
-      TopSuggestDocs suggest = indexSearcher.suggest(query, mappings.size());
+      TopSuggestDocs suggest = indexSearcher.suggest(query, mappings.size(), false);
       assertTrue(suggest.totalHits > 0);
       boolean matched = false;
       for (ScoreDoc scoreDoc : suggest.scoreDocs) {
@@ -577,13 +798,13 @@ public class TestSuggestField extends LuceneTestCase {
           try {
             startingGun.await();
             PrefixCompletionQuery query = new PrefixCompletionQuery(analyzer, new Term("suggest_field_1", prefix1));
-            TopSuggestDocs suggest = indexSearcher.suggest(query, num);
+            TopSuggestDocs suggest = indexSearcher.suggest(query, num, false);
             assertSuggestions(suggest, entries1);
             query = new PrefixCompletionQuery(analyzer, new Term("suggest_field_2", prefix2));
-            suggest = indexSearcher.suggest(query, num);
+            suggest = indexSearcher.suggest(query, num, false);
             assertSuggestions(suggest, entries2);
             query = new PrefixCompletionQuery(analyzer, new Term("suggest_field_3", prefix3));
-            suggest = indexSearcher.suggest(query, num);
+            suggest = indexSearcher.suggest(query, num, false);
             assertSuggestions(suggest, entries3);
           } catch (Throwable e) {
             errors.add(e);
@@ -607,28 +828,39 @@ public class TestSuggestField extends LuceneTestCase {
     final String output;
     final float value;
     final String context;
+    final int id;
 
     Entry(String output, float value) {
       this(output, null, value);
     }
 
     Entry(String output, String context, float value) {
+      this(output, context, value, -1);
+    }
+
+    Entry(String output, String context, float value, int id) {
       this.output = output;
       this.value = value;
       this.context = context;
+      this.id = id;
+    }
+
+    @Override
+    public String toString() {
+      return "key=" + output + " score=" + value + " context=" + context + " id=" + id;
     }
   }
 
   static void assertSuggestions(TopDocs actual, Entry... expected) {
     SuggestScoreDoc[] suggestScoreDocs = (SuggestScoreDoc[]) actual.scoreDocs;
-    assertThat(suggestScoreDocs.length, equalTo(expected.length));
-    for (int i = 0; i < suggestScoreDocs.length; i++) {
+    for (int i = 0; i < Math.min(expected.length, suggestScoreDocs.length); i++) {
       SuggestScoreDoc lookupDoc = suggestScoreDocs[i];
-      String msg = "Expected: " + toString(expected[i]) + " Actual: " + toString(lookupDoc);
+      String msg = "Hit " + i + ": expected: " + toString(expected[i]) + " but actual: " + toString(lookupDoc);
       assertThat(msg, lookupDoc.key.toString(), equalTo(expected[i].output));
       assertThat(msg, lookupDoc.score, equalTo(expected[i].value));
       assertThat(msg, lookupDoc.context, equalTo(expected[i].context));
     }
+    assertThat(suggestScoreDocs.length, equalTo(expected.length));
   }
 
   private static String toString(Entry expected) {
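
A subtle fix repeated throughout this test: non-negative random weights are now drawn with random().nextInt(Integer.MAX_VALUE) rather than Math.abs(random().nextInt()). The old idiom is a known trap, because Math.abs cannot represent -Integer.MIN_VALUE as an int and returns its argument unchanged:

    // Math.abs(Integer.MIN_VALUE) overflows: this prints -2147483648.
    System.out.println(Math.abs(Integer.MIN_VALUE));
    // nextInt(bound) is always in [0, bound), so the weight is never negative.
    System.out.println(new java.util.Random().nextInt(Integer.MAX_VALUE));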


[13/50] [abbrv] lucene-solr:jira/solr-9858: add javadocs explaining SynonymGraphFilter's ignoreCase

Posted by ab...@apache.org.
add javadocs explaining SynonymGraphFilter's ignoreCase


Project: http://git-wip-us.apache.org/repos/asf/lucene-solr/repo
Commit: http://git-wip-us.apache.org/repos/asf/lucene-solr/commit/3ad6e419
Tree: http://git-wip-us.apache.org/repos/asf/lucene-solr/tree/3ad6e419
Diff: http://git-wip-us.apache.org/repos/asf/lucene-solr/diff/3ad6e419

Branch: refs/heads/jira/solr-9858
Commit: 3ad6e41910158a46025ff78330d78a31a7081887
Parents: 8ed8ecf
Author: Mike McCandless <mi...@apache.org>
Authored: Thu Feb 23 07:22:57 2017 -0500
Committer: Mike McCandless <mi...@apache.org>
Committed: Thu Feb 23 07:23:24 2017 -0500

----------------------------------------------------------------------
 .../analysis/synonym/SynonymGraphFilter.java    |  8 +++++
 .../synonym/TestSynonymGraphFilter.java         | 34 ++++++++++++++++++++
 2 files changed, 42 insertions(+)
----------------------------------------------------------------------


http://git-wip-us.apache.org/repos/asf/lucene-solr/blob/3ad6e419/lucene/analysis/common/src/java/org/apache/lucene/analysis/synonym/SynonymGraphFilter.java
----------------------------------------------------------------------
diff --git a/lucene/analysis/common/src/java/org/apache/lucene/analysis/synonym/SynonymGraphFilter.java b/lucene/analysis/common/src/java/org/apache/lucene/analysis/synonym/SynonymGraphFilter.java
index 788db0a..e59e61b 100644
--- a/lucene/analysis/common/src/java/org/apache/lucene/analysis/synonym/SynonymGraphFilter.java
+++ b/lucene/analysis/common/src/java/org/apache/lucene/analysis/synonym/SynonymGraphFilter.java
@@ -160,6 +160,14 @@ public final class SynonymGraphFilter extends TokenFilter {
     }
   }
 
+  /**
+   * Apply previously built synonyms to incoming tokens.
+   * @param input input tokenstream
+   * @param synonyms synonym map
+   * @param ignoreCase case-folds input for matching with {@link Character#toLowerCase(int)}.
+   *                   Note: if you set this to true, it is your responsibility to lowercase
+   *                   the input entries when you create the {@link SynonymMap}
+   */
   public SynonymGraphFilter(TokenStream input, SynonymMap synonyms, boolean ignoreCase) {
     super(input);
     this.synonyms = synonyms;
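
A short sketch of the contract the new javadoc describes, using the same builder calls as the test below; `tokenizer` is a placeholder for any Tokenizer. With ignoreCase=true the filter case-folds only the incoming tokens, so the map entries must already be lowercase:

    SynonymMap.Builder builder = new SynonymMap.Builder(false); // dedup=false
    // Rules are lowercased up front; incoming tokens are folded by the filter.
    CharsRef input = SynonymMap.Builder.join("word".split(" "), new CharsRefBuilder());
    CharsRef output = SynonymMap.Builder.join("synonym".split(" "), new CharsRefBuilder());
    builder.add(input, output, true); // includeOrig=true
    TokenStream stream = new SynonymGraphFilter(tokenizer, builder.build(), true); // ignoreCase=true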

http://git-wip-us.apache.org/repos/asf/lucene-solr/blob/3ad6e419/lucene/analysis/common/src/test/org/apache/lucene/analysis/synonym/TestSynonymGraphFilter.java
----------------------------------------------------------------------
diff --git a/lucene/analysis/common/src/test/org/apache/lucene/analysis/synonym/TestSynonymGraphFilter.java b/lucene/analysis/common/src/test/org/apache/lucene/analysis/synonym/TestSynonymGraphFilter.java
index e00a165..730d00a 100644
--- a/lucene/analysis/common/src/test/org/apache/lucene/analysis/synonym/TestSynonymGraphFilter.java
+++ b/lucene/analysis/common/src/test/org/apache/lucene/analysis/synonym/TestSynonymGraphFilter.java
@@ -23,6 +23,7 @@ import java.text.ParseException;
 import java.util.ArrayList;
 import java.util.HashSet;
 import java.util.List;
+import java.util.Locale;
 import java.util.Set;
 
 import org.apache.lucene.analysis.Analyzer;
@@ -1922,4 +1923,37 @@ public class TestSynonymGraphFilter extends BaseTokenStreamTestCase {
         new int[]{1, 1, 0, 1, 1});
     a.close();
   }
+
+  public void testUpperCase() throws IOException {
+    assertMapping("word", "synonym");
+    assertMapping("word".toUpperCase(Locale.ROOT), "synonym");
+  }
+
+  private void assertMapping(String inputString, String outputString) throws IOException {
+    SynonymMap.Builder builder = new SynonymMap.Builder(false);
+    // the rules must be lowercased up front, but the incoming tokens will be case insensitive:
+    CharsRef input = SynonymMap.Builder.join(inputString.toLowerCase(Locale.ROOT).split(" "), new CharsRefBuilder());
+    CharsRef output = SynonymMap.Builder.join(outputString.split(" "), new CharsRefBuilder());
+    builder.add(input, output, true);
+    Analyzer analyzer = new CustomAnalyzer(builder.build());
+    TokenStream tokenStream = analyzer.tokenStream("field", inputString);
+    assertTokenStreamContents(tokenStream, new String[]{
+        outputString, inputString
+      });
+  }
+
+  static class CustomAnalyzer extends Analyzer {
+    private SynonymMap synonymMap;
+
+    CustomAnalyzer(SynonymMap synonymMap) {
+      this.synonymMap = synonymMap;
+    }
+
+    @Override
+    protected TokenStreamComponents createComponents(String s) {
+      Tokenizer tokenizer = new MockTokenizer(MockTokenizer.WHITESPACE, false);
+      TokenStream tokenStream = new SynonymGraphFilter(tokenizer, synonymMap, true); // ignoreCase=true
+      return new TokenStreamComponents(tokenizer, tokenStream);
+    }
+  }
 }


[35/50] [abbrv] lucene-solr:jira/solr-9858: SOLR-10196: ElectionContext#runLeaderProcess can hit NPE on core close.

Posted by ab...@apache.org.
SOLR-10196: ElectionContext#runLeaderProcess can hit NPE on core close.


Project: http://git-wip-us.apache.org/repos/asf/lucene-solr/repo
Commit: http://git-wip-us.apache.org/repos/asf/lucene-solr/commit/04ba9968
Tree: http://git-wip-us.apache.org/repos/asf/lucene-solr/tree/04ba9968
Diff: http://git-wip-us.apache.org/repos/asf/lucene-solr/diff/04ba9968

Branch: refs/heads/jira/solr-9858
Commit: 04ba9968c0686a5fa1a9c5d89a7cd92839902f32
Parents: ed0f0f4
Author: markrmiller <ma...@apache.org>
Authored: Mon Feb 27 23:41:30 2017 -0500
Committer: markrmiller <ma...@apache.org>
Committed: Mon Feb 27 23:41:30 2017 -0500

----------------------------------------------------------------------
 solr/CHANGES.txt                                             | 2 ++
 .../core/src/java/org/apache/solr/cloud/ElectionContext.java | 8 ++++++--
 .../src/test/org/apache/solr/cloud/CleanupOldIndexTest.java  | 2 +-
 3 files changed, 9 insertions(+), 3 deletions(-)
----------------------------------------------------------------------


http://git-wip-us.apache.org/repos/asf/lucene-solr/blob/04ba9968/solr/CHANGES.txt
----------------------------------------------------------------------
diff --git a/solr/CHANGES.txt b/solr/CHANGES.txt
index 0f1cac5..07f1c4e 100644
--- a/solr/CHANGES.txt
+++ b/solr/CHANGES.txt
@@ -186,6 +186,8 @@ Bug Fixes
 * SOLR-10055: Linux installer now renames existing bin/solr.in.* as bin/solr.in.*.orig to make the installed config in
   /etc/defaults be the one found by default when launching solr manually. (janhoy)
 
+* SOLR-10196: ElectionContext#runLeaderProcess can hit NPE on core close. (Mark Miller)
+
 Optimizations
 ----------------------
 

http://git-wip-us.apache.org/repos/asf/lucene-solr/blob/04ba9968/solr/core/src/java/org/apache/solr/cloud/ElectionContext.java
----------------------------------------------------------------------
diff --git a/solr/core/src/java/org/apache/solr/cloud/ElectionContext.java b/solr/core/src/java/org/apache/solr/cloud/ElectionContext.java
index 6c073b9..ff6fb30 100644
--- a/solr/core/src/java/org/apache/solr/cloud/ElectionContext.java
+++ b/solr/core/src/java/org/apache/solr/cloud/ElectionContext.java
@@ -423,8 +423,12 @@ final class ShardLeaderElectionContext extends ShardLeaderElectionContextBase {
           
           super.runLeaderProcess(weAreReplacement, 0);
           try (SolrCore core = cc.getCore(coreName)) {
-            core.getCoreDescriptor().getCloudDescriptor().setLeader(true);
-            publishActiveIfRegisteredAndNotActive(core);
+            if (core != null) {
+              core.getCoreDescriptor().getCloudDescriptor().setLeader(true);
+              publishActiveIfRegisteredAndNotActive(core);
+            } else {
+              return;
+            }
           }
           log.info("I am the new leader: " + ZkCoreNodeProps.getCoreUrl(leaderProps) + " " + shardId);
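
The trap behind this fix: cc.getCore(coreName) can return null while the container is shutting down, and Java's try-with-resources tolerates a null resource (close() is simply skipped), so the NPE came from the dereference inside the block. Reduced to its essentials:

    try (SolrCore core = cc.getCore(coreName)) { // may be null during close
      if (core == null) {
        return; // core is closing; abandon the leader process
      }
      core.getCoreDescriptor().getCloudDescriptor().setLeader(true);
    }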
           

http://git-wip-us.apache.org/repos/asf/lucene-solr/blob/04ba9968/solr/core/src/test/org/apache/solr/cloud/CleanupOldIndexTest.java
----------------------------------------------------------------------
diff --git a/solr/core/src/test/org/apache/solr/cloud/CleanupOldIndexTest.java b/solr/core/src/test/org/apache/solr/cloud/CleanupOldIndexTest.java
index cc03a25..547de8d 100644
--- a/solr/core/src/test/org/apache/solr/cloud/CleanupOldIndexTest.java
+++ b/solr/core/src/test/org/apache/solr/cloud/CleanupOldIndexTest.java
@@ -47,7 +47,7 @@ public class CleanupOldIndexTest extends SolrCloudTestCase {
   }
   
   @AfterClass
-  public static void teardownTestCases() throws Exception {
+  public static void afterClass() throws Exception {
 
     if (suiteFailureMarker.wasSuccessful()) {
       zkClient().printLayoutToStdOut();


[47/50] [abbrv] lucene-solr:jira/solr-9858: Fixing precommit by removing unused imports

Posted by ab...@apache.org.
Fixing precommit by removing unused imports


Project: http://git-wip-us.apache.org/repos/asf/lucene-solr/repo
Commit: http://git-wip-us.apache.org/repos/asf/lucene-solr/commit/8b4502c2
Tree: http://git-wip-us.apache.org/repos/asf/lucene-solr/tree/8b4502c2
Diff: http://git-wip-us.apache.org/repos/asf/lucene-solr/diff/8b4502c2

Branch: refs/heads/jira/solr-9858
Commit: 8b4502c21842374b93336a88c3978c0cc0afa205
Parents: 0b7b144
Author: Ishan Chattopadhyaya <is...@apache.org>
Authored: Wed Mar 1 04:38:35 2017 +0530
Committer: Ishan Chattopadhyaya <is...@apache.org>
Committed: Wed Mar 1 04:38:35 2017 +0530

----------------------------------------------------------------------
 solr/core/src/java/org/apache/solr/store/blockcache/Metrics.java | 3 ---
 1 file changed, 3 deletions(-)
----------------------------------------------------------------------


http://git-wip-us.apache.org/repos/asf/lucene-solr/blob/8b4502c2/solr/core/src/java/org/apache/solr/store/blockcache/Metrics.java
----------------------------------------------------------------------
diff --git a/solr/core/src/java/org/apache/solr/store/blockcache/Metrics.java b/solr/core/src/java/org/apache/solr/store/blockcache/Metrics.java
index 3dc8947..d3e3497 100644
--- a/solr/core/src/java/org/apache/solr/store/blockcache/Metrics.java
+++ b/solr/core/src/java/org/apache/solr/store/blockcache/Metrics.java
@@ -17,9 +17,6 @@
 package org.apache.solr.store.blockcache;
 
 import java.net.URL;
-import java.util.Map;
-import java.util.Map.Entry;
-import java.util.concurrent.ConcurrentHashMap;
 import java.util.concurrent.atomic.AtomicLong;
 
 import org.apache.solr.common.util.NamedList;


[41/50] [abbrv] lucene-solr:jira/solr-9858: LUCENE-7410: Make cache keys and close listeners less trappy.

Posted by ab...@apache.org.
http://git-wip-us.apache.org/repos/asf/lucene-solr/blob/df6f8307/lucene/highlighter/src/java/org/apache/lucene/search/highlight/TermVectorLeafReader.java
----------------------------------------------------------------------
diff --git a/lucene/highlighter/src/java/org/apache/lucene/search/highlight/TermVectorLeafReader.java b/lucene/highlighter/src/java/org/apache/lucene/search/highlight/TermVectorLeafReader.java
index 608e3d4..2e3cdab 100644
--- a/lucene/highlighter/src/java/org/apache/lucene/search/highlight/TermVectorLeafReader.java
+++ b/lucene/highlighter/src/java/org/apache/lucene/search/highlight/TermVectorLeafReader.java
@@ -85,16 +85,6 @@ public class TermVectorLeafReader extends LeafReader {
   }
 
   @Override
-  public void addCoreClosedListener(CoreClosedListener listener) {
-    addCoreClosedListenerAsReaderClosedListener(this, listener);
-  }
-
-  @Override
-  public void removeCoreClosedListener(CoreClosedListener listener) {
-    removeCoreClosedListenerAsReaderClosedListener(this, listener);
-  }
-
-  @Override
   protected void doClose() throws IOException {
   }
 
@@ -178,4 +168,14 @@ public class TermVectorLeafReader extends LeafReader {
   public Sort getIndexSort() {
     return null;
   }
+
+  @Override
+  public CacheHelper getCoreCacheHelper() {
+    return null;
+  }
+
+  @Override
+  public CacheHelper getReaderCacheHelper() {
+    return null;
+  }
 }
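
For context on the API these overrides implement: LUCENE-7410 replaces the core-closed listeners with IndexReader.CacheHelper, and readers that cannot safely participate in caching return null, as this wrapper does. A hedged consumer-side sketch (`cache` and `computeUncached` are hypothetical):

    IndexReader.CacheHelper helper = leafReader.getCoreCacheHelper();
    if (helper == null) {
      return computeUncached(context); // null means: do not cache for this reader
    }
    IndexReader.CacheKey key = helper.getKey();      // stable key for this core
    helper.addClosedListener(k -> cache.remove(k));  // evict when the core closes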

http://git-wip-us.apache.org/repos/asf/lucene-solr/blob/df6f8307/lucene/highlighter/src/java/org/apache/lucene/search/highlight/WeightedSpanTermExtractor.java
----------------------------------------------------------------------
diff --git a/lucene/highlighter/src/java/org/apache/lucene/search/highlight/WeightedSpanTermExtractor.java b/lucene/highlighter/src/java/org/apache/lucene/search/highlight/WeightedSpanTermExtractor.java
index 0e0093b..a898854 100644
--- a/lucene/highlighter/src/java/org/apache/lucene/search/highlight/WeightedSpanTermExtractor.java
+++ b/lucene/highlighter/src/java/org/apache/lucene/search/highlight/WeightedSpanTermExtractor.java
@@ -474,6 +474,16 @@ public class WeightedSpanTermExtractor {
     public NumericDocValues getNormValues(String field) throws IOException {
       return super.getNormValues(FIELD_NAME);
     }
+
+    @Override
+    public CacheHelper getCoreCacheHelper() {
+      return null;
+    }
+
+    @Override
+    public CacheHelper getReaderCacheHelper() {
+      return null;
+    }
   }
 
   /**

http://git-wip-us.apache.org/repos/asf/lucene-solr/blob/df6f8307/lucene/highlighter/src/java/org/apache/lucene/search/uhighlight/PhraseHelper.java
----------------------------------------------------------------------
diff --git a/lucene/highlighter/src/java/org/apache/lucene/search/uhighlight/PhraseHelper.java b/lucene/highlighter/src/java/org/apache/lucene/search/uhighlight/PhraseHelper.java
index d7e8671..ebc5d37 100644
--- a/lucene/highlighter/src/java/org/apache/lucene/search/uhighlight/PhraseHelper.java
+++ b/lucene/highlighter/src/java/org/apache/lucene/search/uhighlight/PhraseHelper.java
@@ -584,6 +584,16 @@ public class PhraseHelper {
     public NumericDocValues getNormValues(String field) throws IOException {
       return super.getNormValues(fieldName);
     }
+
+    @Override
+    public CacheHelper getCoreCacheHelper() {
+      return null;
+    }
+
+    @Override
+    public CacheHelper getReaderCacheHelper() {
+      return null;
+    }
   }
 
 

http://git-wip-us.apache.org/repos/asf/lucene-solr/blob/df6f8307/lucene/highlighter/src/java/org/apache/lucene/search/uhighlight/TermVectorFilteredLeafReader.java
----------------------------------------------------------------------
diff --git a/lucene/highlighter/src/java/org/apache/lucene/search/uhighlight/TermVectorFilteredLeafReader.java b/lucene/highlighter/src/java/org/apache/lucene/search/uhighlight/TermVectorFilteredLeafReader.java
index 954024c..8555cbd 100644
--- a/lucene/highlighter/src/java/org/apache/lucene/search/uhighlight/TermVectorFilteredLeafReader.java
+++ b/lucene/highlighter/src/java/org/apache/lucene/search/uhighlight/TermVectorFilteredLeafReader.java
@@ -127,4 +127,14 @@ final class TermVectorFilteredLeafReader extends FilterLeafReader {
     }
 
   }
+
+  @Override
+  public CacheHelper getCoreCacheHelper() {
+    return null;
+  }
+
+  @Override
+  public CacheHelper getReaderCacheHelper() {
+    return null;
+  }
 }

http://git-wip-us.apache.org/repos/asf/lucene-solr/blob/df6f8307/lucene/highlighter/src/java/org/apache/lucene/search/uhighlight/UnifiedHighlighter.java
----------------------------------------------------------------------
diff --git a/lucene/highlighter/src/java/org/apache/lucene/search/uhighlight/UnifiedHighlighter.java b/lucene/highlighter/src/java/org/apache/lucene/search/uhighlight/UnifiedHighlighter.java
index bbcfd5b..f1e2c44 100644
--- a/lucene/highlighter/src/java/org/apache/lucene/search/uhighlight/UnifiedHighlighter.java
+++ b/lucene/highlighter/src/java/org/apache/lucene/search/uhighlight/UnifiedHighlighter.java
@@ -1047,6 +1047,11 @@ public class UnifiedHighlighter {
         protected void doClose() throws IOException {
           reader.close();
         }
+
+        @Override
+        public CacheHelper getReaderCacheHelper() {
+          return null;
+        }
       };
     }
 
@@ -1066,6 +1071,16 @@ public class UnifiedHighlighter {
       return tvFields;
     }
 
+    @Override
+    public CacheHelper getCoreCacheHelper() {
+      return null;
+    }
+
+    @Override
+    public CacheHelper getReaderCacheHelper() {
+      return null;
+    }
+
   }
 
   /**

http://git-wip-us.apache.org/repos/asf/lucene-solr/blob/df6f8307/lucene/highlighter/src/test/org/apache/lucene/search/uhighlight/TestUnifiedHighlighterTermVec.java
----------------------------------------------------------------------
diff --git a/lucene/highlighter/src/test/org/apache/lucene/search/uhighlight/TestUnifiedHighlighterTermVec.java b/lucene/highlighter/src/test/org/apache/lucene/search/uhighlight/TestUnifiedHighlighterTermVec.java
index 57d398b..c462aee 100644
--- a/lucene/highlighter/src/test/org/apache/lucene/search/uhighlight/TestUnifiedHighlighterTermVec.java
+++ b/lucene/highlighter/src/test/org/apache/lucene/search/uhighlight/TestUnifiedHighlighterTermVec.java
@@ -132,6 +132,16 @@ public class TestUnifiedHighlighterTermVec extends LuceneTestCase {
 
             return super.getTermVectors(docID);
           }
+
+          @Override
+          public CacheHelper getCoreCacheHelper() {
+            return null;
+          }
+
+          @Override
+          public CacheHelper getReaderCacheHelper() {
+            return null;
+          }
         };
       }
     };
@@ -144,6 +154,11 @@ public class TestUnifiedHighlighterTermVec extends LuceneTestCase {
     protected DirectoryReader doWrapDirectoryReader(DirectoryReader in) throws IOException {
       return new AssertOnceTermVecDirectoryReader(in);
     }
+
+    @Override
+    public CacheHelper getReaderCacheHelper() {
+      return null;
+    }
   }
 
   private static boolean calledBy(Class<?> clazz) {

http://git-wip-us.apache.org/repos/asf/lucene-solr/blob/df6f8307/lucene/join/src/java/org/apache/lucene/search/join/QueryBitSetProducer.java
----------------------------------------------------------------------
diff --git a/lucene/join/src/java/org/apache/lucene/search/join/QueryBitSetProducer.java b/lucene/join/src/java/org/apache/lucene/search/join/QueryBitSetProducer.java
index 98d85cd..ac15664 100644
--- a/lucene/join/src/java/org/apache/lucene/search/join/QueryBitSetProducer.java
+++ b/lucene/join/src/java/org/apache/lucene/search/join/QueryBitSetProducer.java
@@ -21,6 +21,7 @@ import java.util.Collections;
 import java.util.Map;
 import java.util.WeakHashMap;
 
+import org.apache.lucene.index.IndexReader;
 import org.apache.lucene.index.IndexReaderContext;
 import org.apache.lucene.index.LeafReader;
 import org.apache.lucene.index.LeafReaderContext;
@@ -39,7 +40,7 @@ import org.apache.lucene.util.BitSet;
  */
 public class QueryBitSetProducer implements BitSetProducer {
   private final Query query;
-  private final Map<Object,DocIdSet> cache = Collections.synchronizedMap(new WeakHashMap<>());
+  final Map<IndexReader.CacheKey,DocIdSet> cache = Collections.synchronizedMap(new WeakHashMap<>());
 
   /** Wraps another query's result and caches it into bitsets.
    * @param query Query to cache results of
@@ -59,9 +60,12 @@ public class QueryBitSetProducer implements BitSetProducer {
   @Override
   public BitSet getBitSet(LeafReaderContext context) throws IOException {
     final LeafReader reader = context.reader();
-    final Object key = reader.getCoreCacheKey();
+    final IndexReader.CacheHelper cacheHelper = reader.getCoreCacheHelper();
 
-    DocIdSet docIdSet = cache.get(key);
+    DocIdSet docIdSet = null;
+    if (cacheHelper != null) {
+      docIdSet = cache.get(cacheHelper.getKey());
+    }
     if (docIdSet == null) {
       final IndexReaderContext topLevelContext = ReaderUtil.getTopLevelContext(context);
       final IndexSearcher searcher = new IndexSearcher(topLevelContext);
@@ -74,7 +78,9 @@ public class QueryBitSetProducer implements BitSetProducer {
       } else {
         docIdSet = new BitDocIdSet(BitSet.of(s.iterator(), context.reader().maxDoc()));
       }
-      cache.put(key, docIdSet);
+      if (cacheHelper != null) {
+        cache.put(cacheHelper.getKey(), docIdSet);
+      }
     }
     return docIdSet == DocIdSet.EMPTY ? null : ((BitDocIdSet) docIdSet).bits();
   }
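
QueryBitSetProducer now consults the core cache helper and simply skips its cache when the helper is absent, instead of caching under a key that may not identify the content. The null-guarded lookup generalizes; a sketch assuming only the CacheHelper API added by this patch (PerCoreCache and its compute callback are illustrative names, not part of the patch):

import java.util.Collections;
import java.util.Map;
import java.util.WeakHashMap;
import java.util.function.Function;

import org.apache.lucene.index.IndexReader;
import org.apache.lucene.index.LeafReader;

class PerCoreCache<V> {

  private final Map<IndexReader.CacheKey, V> cache =
      Collections.synchronizedMap(new WeakHashMap<>());

  V get(LeafReader reader, Function<LeafReader, V> compute) {
    IndexReader.CacheHelper helper = reader.getCoreCacheHelper();
    if (helper == null) {
      // reader not suited for caching: recompute on every call
      return compute.apply(reader);
    }
    return cache.computeIfAbsent(helper.getKey(), key -> compute.apply(reader));
  }
}

The weak keys keep the map from pinning closed cores, exactly as in the producer above; the new TestQueryBitSetProducer below pins down both branches.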

http://git-wip-us.apache.org/repos/asf/lucene-solr/blob/df6f8307/lucene/join/src/test/org/apache/lucene/search/join/TestJoinUtil.java
----------------------------------------------------------------------
diff --git a/lucene/join/src/test/org/apache/lucene/search/join/TestJoinUtil.java b/lucene/join/src/test/org/apache/lucene/search/join/TestJoinUtil.java
index 39979ac..72b6bf5 100644
--- a/lucene/join/src/test/org/apache/lucene/search/join/TestJoinUtil.java
+++ b/lucene/join/src/test/org/apache/lucene/search/join/TestJoinUtil.java
@@ -267,7 +267,7 @@ public class TestJoinUtil extends LuceneTestCase {
       values[i] = DocValues.getSorted(leafReader, joinField);
     }
     MultiDocValues.OrdinalMap ordinalMap = MultiDocValues.OrdinalMap.build(
-        r.getCoreCacheKey(), values, PackedInts.DEFAULT
+        null, values, PackedInts.DEFAULT
     );
 
     Query toQuery = new TermQuery(new Term(typeField, "price"));
@@ -372,7 +372,7 @@ public class TestJoinUtil extends LuceneTestCase {
       values[i] = DocValues.getSorted(leafReader, joinField);
     }
     MultiDocValues.OrdinalMap ordinalMap = MultiDocValues.OrdinalMap.build(
-        r.getCoreCacheKey(), values, PackedInts.DEFAULT
+        null, values, PackedInts.DEFAULT
     );
 
     Query toQuery = new TermQuery(new Term("price", "5.0"));
@@ -500,7 +500,7 @@ public class TestJoinUtil extends LuceneTestCase {
       values[leadContext.ord] = DocValues.getSorted(leadContext.reader(), "join_field");
     }
     MultiDocValues.OrdinalMap ordinalMap = MultiDocValues.OrdinalMap.build(
-        searcher.getIndexReader().getCoreCacheKey(), values, PackedInts.DEFAULT
+        null, values, PackedInts.DEFAULT
     );
     BooleanQuery.Builder fromQuery = new BooleanQuery.Builder();
     fromQuery.add(priceQuery, BooleanClause.Occur.MUST);
@@ -621,7 +621,7 @@ public class TestJoinUtil extends LuceneTestCase {
       values[leadContext.ord] = DocValues.getSorted(leadContext.reader(), "join_field");
     }
     MultiDocValues.OrdinalMap ordinalMap = MultiDocValues.OrdinalMap.build(
-        searcher.getIndexReader().getCoreCacheKey(), values, PackedInts.DEFAULT
+        null, values, PackedInts.DEFAULT
     );
     Query fromQuery = new TermQuery(new Term("type", "from"));
     Query toQuery = new TermQuery(new Term("type", "to"));
@@ -1336,7 +1336,7 @@ public class TestJoinUtil extends LuceneTestCase {
         values[leadContext.ord] = DocValues.getSorted(leadContext.reader(), "join_field");
       }
       context.ordinalMap = MultiDocValues.OrdinalMap.build(
-          topLevelReader.getCoreCacheKey(), values, PackedInts.DEFAULT
+          null, values, PackedInts.DEFAULT
       );
     }
 

http://git-wip-us.apache.org/repos/asf/lucene-solr/blob/df6f8307/lucene/join/src/test/org/apache/lucene/search/join/TestQueryBitSetProducer.java
----------------------------------------------------------------------
diff --git a/lucene/join/src/test/org/apache/lucene/search/join/TestQueryBitSetProducer.java b/lucene/join/src/test/org/apache/lucene/search/join/TestQueryBitSetProducer.java
new file mode 100644
index 0000000..c90b0a8
--- /dev/null
+++ b/lucene/join/src/test/org/apache/lucene/search/join/TestQueryBitSetProducer.java
@@ -0,0 +1,110 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements.  See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License.  You may obtain a copy of the License at
+ *
+ *     http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+package org.apache.lucene.search.join;
+
+import java.io.IOException;
+
+import org.apache.lucene.document.Document;
+import org.apache.lucene.index.DirectoryReader;
+import org.apache.lucene.index.FilterDirectoryReader;
+import org.apache.lucene.index.FilterLeafReader;
+import org.apache.lucene.index.IndexWriterConfig;
+import org.apache.lucene.index.LeafReader;
+import org.apache.lucene.index.NoMergePolicy;
+import org.apache.lucene.index.RandomIndexWriter;
+import org.apache.lucene.search.MatchAllDocsQuery;
+import org.apache.lucene.search.MatchNoDocsQuery;
+import org.apache.lucene.store.Directory;
+import org.apache.lucene.util.BitSet;
+import org.apache.lucene.util.IOUtils;
+import org.apache.lucene.util.LuceneTestCase;
+
+public class TestQueryBitSetProducer extends LuceneTestCase {
+
+  public void testSimple() throws Exception {
+    Directory dir = newDirectory();
+    IndexWriterConfig iwc = newIndexWriterConfig().setMergePolicy(NoMergePolicy.INSTANCE);
+    RandomIndexWriter w = new RandomIndexWriter(random(), dir, iwc);
+    w.addDocument(new Document());
+    DirectoryReader reader = w.getReader();
+
+    QueryBitSetProducer producer = new QueryBitSetProducer(new MatchNoDocsQuery());
+    assertNull(producer.getBitSet(reader.leaves().get(0)));
+    assertEquals(1, producer.cache.size());
+
+    producer = new QueryBitSetProducer(new MatchAllDocsQuery());
+    BitSet bitSet = producer.getBitSet(reader.leaves().get(0));
+    assertEquals(1, bitSet.length());
+    assertEquals(true, bitSet.get(0));
+    assertEquals(1, producer.cache.size());
+
+    IOUtils.close(reader, w, dir);
+  }
+
+  public void testReaderNotSuitedForCaching() throws IOException{
+    Directory dir = newDirectory();
+    IndexWriterConfig iwc = newIndexWriterConfig().setMergePolicy(NoMergePolicy.INSTANCE);
+    RandomIndexWriter w = new RandomIndexWriter(random(), dir, iwc);
+    w.addDocument(new Document());
+    DirectoryReader reader = new DummyDirectoryReader(w.getReader());
+
+    QueryBitSetProducer producer = new QueryBitSetProducer(new MatchNoDocsQuery());
+    assertNull(producer.getBitSet(reader.leaves().get(0)));
+    assertEquals(0, producer.cache.size());
+
+    producer = new QueryBitSetProducer(new MatchAllDocsQuery());
+    BitSet bitSet = producer.getBitSet(reader.leaves().get(0));
+    assertEquals(1, bitSet.length());
+    assertEquals(true, bitSet.get(0));
+    assertEquals(0, producer.cache.size());
+
+    IOUtils.close(reader, w, dir);
+  }
+
+  // a reader whose sole purpose is to not be cacheable
+  private static class DummyDirectoryReader extends FilterDirectoryReader {
+
+    public DummyDirectoryReader(DirectoryReader in) throws IOException {
+      super(in, new SubReaderWrapper() {
+        @Override
+        public LeafReader wrap(LeafReader reader) {
+          return new FilterLeafReader(reader) {
+
+            @Override
+            public CacheHelper getCoreCacheHelper() {
+              return null;
+            }
+
+            @Override
+            public CacheHelper getReaderCacheHelper() {
+              return null;
+            }};
+        }
+      });
+    }
+
+    @Override
+    protected DirectoryReader doWrapDirectoryReader(DirectoryReader in) throws IOException {
+      return new DummyDirectoryReader(in);
+    }
+
+    @Override
+    public CacheHelper getReaderCacheHelper() {
+      return null;
+    }
+  }
+}

http://git-wip-us.apache.org/repos/asf/lucene-solr/blob/df6f8307/lucene/memory/src/java/org/apache/lucene/index/memory/MemoryIndex.java
----------------------------------------------------------------------
diff --git a/lucene/memory/src/java/org/apache/lucene/index/memory/MemoryIndex.java b/lucene/memory/src/java/org/apache/lucene/index/memory/MemoryIndex.java
index 7587406..dc4a666 100644
--- a/lucene/memory/src/java/org/apache/lucene/index/memory/MemoryIndex.java
+++ b/lucene/memory/src/java/org/apache/lucene/index/memory/MemoryIndex.java
@@ -1182,16 +1182,6 @@ public class MemoryIndex {
       }
     }
 
-    @Override
-    public void addCoreClosedListener(CoreClosedListener listener) {
-      addCoreClosedListenerAsReaderClosedListener(this, listener);
-    }
-
-    @Override
-    public void removeCoreClosedListener(CoreClosedListener listener) {
-      removeCoreClosedListenerAsReaderClosedListener(this, listener);
-    }
-
     private Info getInfoForExpectedDocValuesType(String fieldName, DocValuesType expectedType) {
       if (expectedType == DocValuesType.NONE) {
         return null;
@@ -1684,6 +1674,16 @@ public class MemoryIndex {
     public Sort getIndexSort() {
       return null;
     }
+
+    @Override
+    public CacheHelper getCoreCacheHelper() {
+      return null;
+    }
+
+    @Override
+    public CacheHelper getReaderCacheHelper() {
+      return null;
+    }
   }
 
   /**

http://git-wip-us.apache.org/repos/asf/lucene-solr/blob/df6f8307/lucene/misc/src/java/org/apache/lucene/index/MultiPassIndexSplitter.java
----------------------------------------------------------------------
diff --git a/lucene/misc/src/java/org/apache/lucene/index/MultiPassIndexSplitter.java b/lucene/misc/src/java/org/apache/lucene/index/MultiPassIndexSplitter.java
index b4e2131..6d7e4ea 100644
--- a/lucene/misc/src/java/org/apache/lucene/index/MultiPassIndexSplitter.java
+++ b/lucene/misc/src/java/org/apache/lucene/index/MultiPassIndexSplitter.java
@@ -206,6 +206,11 @@ public class MultiPassIndexSplitter {
     @Override
     protected void doClose() {}
 
+    @Override
+    public CacheHelper getReaderCacheHelper() {
+      return null;
+    }
+
     // no need to override numDocs/hasDeletions,
     // as we pass the subreaders directly to IW.addIndexes().
   }
@@ -247,5 +252,15 @@ public class MultiPassIndexSplitter {
     public Bits getLiveDocs() {
       return liveDocs;
     }
+
+    @Override
+    public CacheHelper getCoreCacheHelper() {
+      return in.getCoreCacheHelper();
+    }
+
+    @Override
+    public CacheHelper getReaderCacheHelper() {
+      return null;
+    }
   }
 }

http://git-wip-us.apache.org/repos/asf/lucene-solr/blob/df6f8307/lucene/misc/src/java/org/apache/lucene/index/PKIndexSplitter.java
----------------------------------------------------------------------
diff --git a/lucene/misc/src/java/org/apache/lucene/index/PKIndexSplitter.java b/lucene/misc/src/java/org/apache/lucene/index/PKIndexSplitter.java
index 49ec6a2..c95cda6 100644
--- a/lucene/misc/src/java/org/apache/lucene/index/PKIndexSplitter.java
+++ b/lucene/misc/src/java/org/apache/lucene/index/PKIndexSplitter.java
@@ -167,5 +167,15 @@ public class PKIndexSplitter {
     public Bits getLiveDocs() {
       return liveDocs;
     }
+
+    @Override
+    public CacheHelper getCoreCacheHelper() {
+      return in.getCoreCacheHelper();
+    }
+
+    @Override
+    public CacheHelper getReaderCacheHelper() {
+      return null;
+    }
   }
 }
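
MultiPassIndexSplitter and PKIndexSplitter land on the same split decision: delegate the core helper, because postings and doc values are untouched, but return null for the reader helper, because the live docs no longer match the delegate's. A sketch of that pattern (CustomLiveDocsReader is hypothetical; matching overrides of numDocs() and hasDeletions() are omitted for brevity, though a real subclass would need them just as the splitters' readers do):

import org.apache.lucene.index.FilterLeafReader;
import org.apache.lucene.index.LeafReader;
import org.apache.lucene.util.Bits;

class CustomLiveDocsReader extends FilterLeafReader {

  private final Bits liveDocs; // replacement live docs, narrower than the delegate's

  CustomLiveDocsReader(LeafReader in, Bits liveDocs) {
    super(in);
    this.liveDocs = liveDocs;
  }

  @Override
  public Bits getLiveDocs() {
    return liveDocs;
  }

  @Override
  public CacheHelper getCoreCacheHelper() {
    return in.getCoreCacheHelper(); // core data is unchanged: safe to share
  }

  @Override
  public CacheHelper getReaderCacheHelper() {
    return null; // deletions differ: reader-level sharing would be wrong
  }
}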

http://git-wip-us.apache.org/repos/asf/lucene-solr/blob/df6f8307/lucene/replicator/src/java/org/apache/lucene/replicator/nrt/SegmentInfosSearcherManager.java
----------------------------------------------------------------------
diff --git a/lucene/replicator/src/java/org/apache/lucene/replicator/nrt/SegmentInfosSearcherManager.java b/lucene/replicator/src/java/org/apache/lucene/replicator/nrt/SegmentInfosSearcherManager.java
index 4cb49c4..a04464a 100644
--- a/lucene/replicator/src/java/org/apache/lucene/replicator/nrt/SegmentInfosSearcherManager.java
+++ b/lucene/replicator/src/java/org/apache/lucene/replicator/nrt/SegmentInfosSearcherManager.java
@@ -111,10 +111,14 @@ class SegmentInfosSearcherManager extends ReferenceManager<IndexSearcher> {
   }
 
   private void addReaderClosedListener(IndexReader r) {
+    IndexReader.CacheHelper cacheHelper = r.getReaderCacheHelper();
+    if (cacheHelper == null) {
+      throw new IllegalStateException("StandardDirectoryReader must support caching");
+    }
     openReaderCount.incrementAndGet();
-    r.addReaderClosedListener(new IndexReader.ReaderClosedListener() {
+    cacheHelper.addClosedListener(new IndexReader.ClosedListener() {
         @Override
-        public void onClose(IndexReader reader) {
+        public void onClose(IndexReader.CacheKey cacheKey) {
           onReaderClosed();
         }
       });
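
SegmentInfosSearcherManager shows the migration path for close hooks: what used to be IndexReader.addReaderClosedListener is now CacheHelper.addClosedListener, and every caller has to decide what a null helper means; here it is a hard error. A sketch of that idiom (CloseHooks is an illustrative name):

import org.apache.lucene.index.IndexReader;

final class CloseHooks {

  // Runs the hook once the reader's resources are released. Readers that
  // return a null helper cannot report close events through this API.
  static void register(IndexReader reader, Runnable onClose) {
    IndexReader.CacheHelper helper = reader.getReaderCacheHelper();
    if (helper == null) {
      throw new IllegalStateException("reader must support caching: " + reader);
    }
    helper.addClosedListener(cacheKey -> onClose.run());
  }
}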

http://git-wip-us.apache.org/repos/asf/lucene-solr/blob/df6f8307/lucene/test-framework/src/java/org/apache/lucene/index/AllDeletedFilterReader.java
----------------------------------------------------------------------
diff --git a/lucene/test-framework/src/java/org/apache/lucene/index/AllDeletedFilterReader.java b/lucene/test-framework/src/java/org/apache/lucene/index/AllDeletedFilterReader.java
index ff789a0..f11d2e3 100644
--- a/lucene/test-framework/src/java/org/apache/lucene/index/AllDeletedFilterReader.java
+++ b/lucene/test-framework/src/java/org/apache/lucene/index/AllDeletedFilterReader.java
@@ -39,4 +39,14 @@ public class AllDeletedFilterReader extends FilterLeafReader {
   public int numDocs() {
     return 0;
   }
+
+  @Override
+  public CacheHelper getCoreCacheHelper() {
+    return in.getCoreCacheHelper();
+  }
+
+  @Override
+  public CacheHelper getReaderCacheHelper() {
+    return null;
+  }
 }

http://git-wip-us.apache.org/repos/asf/lucene-solr/blob/df6f8307/lucene/test-framework/src/java/org/apache/lucene/index/AssertingDirectoryReader.java
----------------------------------------------------------------------
diff --git a/lucene/test-framework/src/java/org/apache/lucene/index/AssertingDirectoryReader.java b/lucene/test-framework/src/java/org/apache/lucene/index/AssertingDirectoryReader.java
index 712f36d..2326726 100644
--- a/lucene/test-framework/src/java/org/apache/lucene/index/AssertingDirectoryReader.java
+++ b/lucene/test-framework/src/java/org/apache/lucene/index/AssertingDirectoryReader.java
@@ -41,13 +41,8 @@ public class AssertingDirectoryReader extends FilterDirectoryReader {
   }
 
   @Override
-  public Object getCoreCacheKey() {
-    return in.getCoreCacheKey();
-  }
-
-  @Override
-  public Object getCombinedCoreAndDeletesKey() {
-    return in.getCombinedCoreAndDeletesKey();
+  public CacheHelper getReaderCacheHelper() {
+    return in.getReaderCacheHelper();
   }
 
 }

http://git-wip-us.apache.org/repos/asf/lucene-solr/blob/df6f8307/lucene/test-framework/src/java/org/apache/lucene/index/AssertingLeafReader.java
----------------------------------------------------------------------
diff --git a/lucene/test-framework/src/java/org/apache/lucene/index/AssertingLeafReader.java b/lucene/test-framework/src/java/org/apache/lucene/index/AssertingLeafReader.java
index e837359..b03fa3d 100644
--- a/lucene/test-framework/src/java/org/apache/lucene/index/AssertingLeafReader.java
+++ b/lucene/test-framework/src/java/org/apache/lucene/index/AssertingLeafReader.java
@@ -51,11 +51,23 @@ public class AssertingLeafReader extends FilterLeafReader {
     assert in.numDeletedDocs() + in.numDocs() == in.maxDoc();
     assert !in.hasDeletions() || in.numDeletedDocs() > 0 && in.numDocs() < in.maxDoc();
 
-    addCoreClosedListener(ownerCoreCacheKey -> {
-      final Object expectedKey = getCoreCacheKey();
-      assert expectedKey == ownerCoreCacheKey
-          : "Core closed listener called on a different key " + expectedKey + " <> " + ownerCoreCacheKey;
-    });
+    CacheHelper coreCacheHelper = in.getCoreCacheHelper();
+    if (coreCacheHelper != null) {
+      coreCacheHelper.addClosedListener(cacheKey -> {
+        final Object expectedKey = coreCacheHelper.getKey();
+        assert expectedKey == cacheKey
+            : "Core closed listener called on a different key " + expectedKey + " <> " + cacheKey;
+      });
+    }
+
+    CacheHelper readerCacheHelper = in.getReaderCacheHelper();
+    if (readerCacheHelper != null) {
+      readerCacheHelper.addClosedListener(cacheKey -> {
+        final Object expectedKey = readerCacheHelper.getKey();
+        assert expectedKey == cacheKey
+            : "Core closed listener called on a different key " + expectedKey + " <> " + cacheKey;
+      });
+    }
   }
 
   @Override
@@ -1137,12 +1149,12 @@ public class AssertingLeafReader extends FilterLeafReader {
   // we don't change behavior of the reader: just validate the API.
 
   @Override
-  public Object getCoreCacheKey() {
-    return in.getCoreCacheKey();
+  public CacheHelper getCoreCacheHelper() {
+    return in.getCoreCacheHelper();
   }
 
   @Override
-  public Object getCombinedCoreAndDeletesKey() {
-    return in.getCombinedCoreAndDeletesKey();
+  public CacheHelper getReaderCacheHelper() {
+    return in.getReaderCacheHelper();
   }
 }

http://git-wip-us.apache.org/repos/asf/lucene-solr/blob/df6f8307/lucene/test-framework/src/java/org/apache/lucene/index/BaseStoredFieldsFormatTestCase.java
----------------------------------------------------------------------
diff --git a/lucene/test-framework/src/java/org/apache/lucene/index/BaseStoredFieldsFormatTestCase.java b/lucene/test-framework/src/java/org/apache/lucene/index/BaseStoredFieldsFormatTestCase.java
index 554d908..60e2cca 100644
--- a/lucene/test-framework/src/java/org/apache/lucene/index/BaseStoredFieldsFormatTestCase.java
+++ b/lucene/test-framework/src/java/org/apache/lucene/index/BaseStoredFieldsFormatTestCase.java
@@ -586,6 +586,16 @@ public abstract class BaseStoredFieldsFormatTestCase extends BaseIndexFileFormat
       super.document(maxDoc() - 1 - docID, visitor);
     }
 
+    @Override
+    public CacheHelper getCoreCacheHelper() {
+      return null;
+    }
+
+    @Override
+    public CacheHelper getReaderCacheHelper() {
+      return null;
+    }
+
   }
 
   private static class DummyFilterDirectoryReader extends FilterDirectoryReader {
@@ -603,6 +613,11 @@ public abstract class BaseStoredFieldsFormatTestCase extends BaseIndexFileFormat
     protected DirectoryReader doWrapDirectoryReader(DirectoryReader in) throws IOException {
       return new DummyFilterDirectoryReader(in);
     }
+
+    @Override
+    public CacheHelper getReaderCacheHelper() {
+      return null;
+    }
     
   }
 

http://git-wip-us.apache.org/repos/asf/lucene-solr/blob/df6f8307/lucene/test-framework/src/java/org/apache/lucene/index/FieldFilterLeafReader.java
----------------------------------------------------------------------
diff --git a/lucene/test-framework/src/java/org/apache/lucene/index/FieldFilterLeafReader.java b/lucene/test-framework/src/java/org/apache/lucene/index/FieldFilterLeafReader.java
index a75af54..751833f 100644
--- a/lucene/test-framework/src/java/org/apache/lucene/index/FieldFilterLeafReader.java
+++ b/lucene/test-framework/src/java/org/apache/lucene/index/FieldFilterLeafReader.java
@@ -174,5 +174,15 @@ public final class FieldFilterLeafReader extends FilterLeafReader {
     }
     
   }
-  
+
+  @Override
+  public CacheHelper getCoreCacheHelper() {
+    return null;
+  }
+
+  @Override
+  public CacheHelper getReaderCacheHelper() {
+    return null;
+  }
+
 }

http://git-wip-us.apache.org/repos/asf/lucene-solr/blob/df6f8307/lucene/test-framework/src/java/org/apache/lucene/index/MismatchedDirectoryReader.java
----------------------------------------------------------------------
diff --git a/lucene/test-framework/src/java/org/apache/lucene/index/MismatchedDirectoryReader.java b/lucene/test-framework/src/java/org/apache/lucene/index/MismatchedDirectoryReader.java
index 76aecef..7fb1581 100644
--- a/lucene/test-framework/src/java/org/apache/lucene/index/MismatchedDirectoryReader.java
+++ b/lucene/test-framework/src/java/org/apache/lucene/index/MismatchedDirectoryReader.java
@@ -46,4 +46,9 @@ public class MismatchedDirectoryReader extends FilterDirectoryReader {
   protected DirectoryReader doWrapDirectoryReader(DirectoryReader in) throws IOException {
     return new AssertingDirectoryReader(in);
   }
+
+  @Override
+  public CacheHelper getReaderCacheHelper() {
+    return in.getReaderCacheHelper();
+  }
 }

http://git-wip-us.apache.org/repos/asf/lucene-solr/blob/df6f8307/lucene/test-framework/src/java/org/apache/lucene/index/MismatchedLeafReader.java
----------------------------------------------------------------------
diff --git a/lucene/test-framework/src/java/org/apache/lucene/index/MismatchedLeafReader.java b/lucene/test-framework/src/java/org/apache/lucene/index/MismatchedLeafReader.java
index 38c1d7f..7dd6ba8 100644
--- a/lucene/test-framework/src/java/org/apache/lucene/index/MismatchedLeafReader.java
+++ b/lucene/test-framework/src/java/org/apache/lucene/index/MismatchedLeafReader.java
@@ -45,6 +45,16 @@ public class MismatchedLeafReader extends FilterLeafReader {
     in.document(docID, new MismatchedVisitor(visitor));
   }
 
+  @Override
+  public CacheHelper getCoreCacheHelper() {
+    return in.getCoreCacheHelper();
+  }
+
+  @Override
+  public CacheHelper getReaderCacheHelper() {
+    return in.getReaderCacheHelper();
+  }
+
   static FieldInfos shuffleInfos(FieldInfos infos, Random random) {
     // first, shuffle the order
     List<FieldInfo> shuffled = new ArrayList<>();

http://git-wip-us.apache.org/repos/asf/lucene-solr/blob/df6f8307/lucene/test-framework/src/java/org/apache/lucene/index/MockRandomMergePolicy.java
----------------------------------------------------------------------
diff --git a/lucene/test-framework/src/java/org/apache/lucene/index/MockRandomMergePolicy.java b/lucene/test-framework/src/java/org/apache/lucene/index/MockRandomMergePolicy.java
index b1cc0ee..f9fa601 100644
--- a/lucene/test-framework/src/java/org/apache/lucene/index/MockRandomMergePolicy.java
+++ b/lucene/test-framework/src/java/org/apache/lucene/index/MockRandomMergePolicy.java
@@ -156,7 +156,18 @@ public class MockRandomMergePolicy extends MergePolicy {
         if (LuceneTestCase.VERBOSE) {
           System.out.println("NOTE: MockRandomMergePolicy now swaps in a SlowCodecReaderWrapper for merging reader=" + reader);
         }
-        return SlowCodecReaderWrapper.wrap(new FilterLeafReader(new MergeReaderWrapper(reader)) {});
+        return SlowCodecReaderWrapper.wrap(new FilterLeafReader(new MergeReaderWrapper(reader)) {
+
+          @Override
+          public CacheHelper getCoreCacheHelper() {
+            return in.getCoreCacheHelper();
+          }
+
+          @Override
+          public CacheHelper getReaderCacheHelper() {
+            return in.getReaderCacheHelper();
+          }
+        });
       } else if (thingToDo == 1) {
         // renumber fields
         // NOTE: currently this only "blocks" bulk merges just by

http://git-wip-us.apache.org/repos/asf/lucene-solr/blob/df6f8307/lucene/test-framework/src/java/org/apache/lucene/search/QueryUtils.java
----------------------------------------------------------------------
diff --git a/lucene/test-framework/src/java/org/apache/lucene/search/QueryUtils.java b/lucene/test-framework/src/java/org/apache/lucene/search/QueryUtils.java
index ae4c890..c40f875 100644
--- a/lucene/test-framework/src/java/org/apache/lucene/search/QueryUtils.java
+++ b/lucene/test-framework/src/java/org/apache/lucene/search/QueryUtils.java
@@ -132,27 +132,6 @@ public class QueryUtils {
     }
   }
 
-  /** This is a MultiReader that can be used for randomly wrapping other readers
-   * without creating FieldCache insanity.
-   * The trick is to use an opaque/fake cache key. */
-  public static class FCInvisibleMultiReader extends MultiReader {
-    private final Object cacheKey = new Object();
-
-    public FCInvisibleMultiReader(IndexReader... readers) throws IOException {
-      super(readers);
-    }
-
-    @Override
-    public Object getCoreCacheKey() {
-      return cacheKey;
-    }
-
-    @Override
-    public Object getCombinedCoreAndDeletesKey() {
-      return cacheKey;
-    }
-  }
-
   /**
    * Given an IndexSearcher, returns a new IndexSearcher whose IndexReader
    * is a MultiReader containing the Reader of the original IndexSearcher,
@@ -172,17 +151,17 @@ public class QueryUtils {
     IndexReader[] readers = new IndexReader[] {
       edge < 0 ? r : new MultiReader(),
       new MultiReader(),
-      new FCInvisibleMultiReader(edge < 0 ? emptyReader(4) : new MultiReader(),
+      new MultiReader(edge < 0 ? emptyReader(4) : new MultiReader(),
           new MultiReader(),
           0 == edge ? r : new MultiReader()),
       0 < edge ? new MultiReader() : emptyReader(7),
       new MultiReader(),
-      new FCInvisibleMultiReader(0 < edge ? new MultiReader() : emptyReader(5),
+      new MultiReader(0 < edge ? new MultiReader() : emptyReader(5),
           new MultiReader(),
           0 < edge ? r : new MultiReader())
     };
 
-    IndexSearcher out = LuceneTestCase.newSearcher(new FCInvisibleMultiReader(readers));
+    IndexSearcher out = LuceneTestCase.newSearcher(new MultiReader(readers));
     out.setSimilarity(s.getSimilarity(true));
     return out;
   }
@@ -191,12 +170,6 @@ public class QueryUtils {
     return new LeafReader() {
 
       @Override
-      public void addCoreClosedListener(CoreClosedListener listener) {}
-
-      @Override
-      public void removeCoreClosedListener(CoreClosedListener listener) {}
-
-      @Override
       public Fields fields() throws IOException {
         return new Fields() {
           @Override
@@ -290,6 +263,16 @@ public class QueryUtils {
       public Sort getIndexSort() {
         return null;
       }
+
+      @Override
+      public CacheHelper getCoreCacheHelper() {
+        return null;
+      }
+
+      @Override
+      public CacheHelper getReaderCacheHelper() {
+        return null;
+      }
     };
   }
 

http://git-wip-us.apache.org/repos/asf/lucene-solr/blob/df6f8307/lucene/test-framework/src/java/org/apache/lucene/util/LuceneTestCase.java
----------------------------------------------------------------------
diff --git a/lucene/test-framework/src/java/org/apache/lucene/util/LuceneTestCase.java b/lucene/test-framework/src/java/org/apache/lucene/util/LuceneTestCase.java
index 90f85b9..591db20 100644
--- a/lucene/test-framework/src/java/org/apache/lucene/util/LuceneTestCase.java
+++ b/lucene/test-framework/src/java/org/apache/lucene/util/LuceneTestCase.java
@@ -74,7 +74,6 @@ import org.apache.lucene.document.FieldType;
 import org.apache.lucene.document.StringField;
 import org.apache.lucene.document.TextField;
 import org.apache.lucene.index.*;
-import org.apache.lucene.index.IndexReader.ReaderClosedListener;
 import org.apache.lucene.index.LeafReaderContext;
 import org.apache.lucene.index.TermsEnum.SeekStatus;
 import org.apache.lucene.mockfile.FilterPath;
@@ -86,7 +85,6 @@ import org.apache.lucene.search.LRUQueryCache;
 import org.apache.lucene.search.Query;
 import org.apache.lucene.search.QueryCache;
 import org.apache.lucene.search.QueryCachingPolicy;
-import org.apache.lucene.search.QueryUtils.FCInvisibleMultiReader;
 import org.apache.lucene.store.BaseDirectoryWrapper;
 import org.apache.lucene.store.Directory;
 import org.apache.lucene.store.FSDirectory;
@@ -1664,7 +1662,7 @@ public abstract class LuceneTestCase extends Assert {
     Random random = random();
       
     for (int i = 0, c = random.nextInt(6)+1; i < c; i++) {
-      switch(random.nextInt(5)) {
+      switch(random.nextInt(4)) {
       case 0:
         // will create no FC insanity in atomic case, as ParallelLeafReader has own cache key:
         if (VERBOSE) {
@@ -1675,15 +1673,6 @@ public abstract class LuceneTestCase extends Assert {
         new ParallelCompositeReader((CompositeReader) r);
         break;
       case 1:
-        // Häckidy-Hick-Hack: a standard MultiReader will cause FC insanity, so we use
-        // QueryUtils' reader with a fake cache key, so insanity checker cannot walk
-        // along our reader:
-        if (VERBOSE) {
-          System.out.println("NOTE: LuceneTestCase.wrapReader: wrapping previous reader=" + r + " with FCInvisibleMultiReader");
-        }
-        r = new FCInvisibleMultiReader(r);
-        break;
-      case 2:
         if (r instanceof LeafReader) {
           final LeafReader ar = (LeafReader) r;
           final List<String> allFields = new ArrayList<>();
@@ -1703,7 +1692,7 @@ public abstract class LuceneTestCase extends Assert {
                                      );
         }
         break;
-      case 3:
+      case 2:
        // Häckidy-Hick-Hack: a standard Reader will cause FC insanity, so we use
         // QueryUtils' reader with a fake cache key, so insanity checker cannot walk
         // along our reader:
@@ -1716,7 +1705,7 @@ public abstract class LuceneTestCase extends Assert {
           r = new AssertingDirectoryReader((DirectoryReader)r);
         }
         break;
-      case 4:
+      case 3:
         if (VERBOSE) {
           System.out.println("NOTE: LuceneTestCase.wrapReader: wrapping previous reader=" + r + " with MismatchedLeaf/DirectoryReader");
         }
@@ -1731,10 +1720,6 @@ public abstract class LuceneTestCase extends Assert {
       }
     }
 
-    if ((r instanceof CompositeReader) && !(r instanceof FCInvisibleMultiReader)) {
-      // prevent cache insanity caused by e.g. ParallelCompositeReader, to fix we wrap one more time:
-      r = new FCInvisibleMultiReader(r);
-    }
     if (VERBOSE) {
       System.out.println("wrapReader wrapped: " +r);
     }
@@ -1900,7 +1885,7 @@ public abstract class LuceneTestCase extends Assert {
     } else {
       int threads = 0;
       final ThreadPoolExecutor ex;
-      if (random.nextBoolean()) {
+      if (r.getReaderCacheHelper() == null || random.nextBoolean()) {
         ex = null;
       } else {
         threads = TestUtil.nextInt(random, 1, 8);
@@ -1914,12 +1899,7 @@ public abstract class LuceneTestCase extends Assert {
        if (VERBOSE) {
          System.out.println("NOTE: newSearcher using ExecutorService with " + threads + " threads");
        }
-       r.addReaderClosedListener(new ReaderClosedListener() {
-         @Override
-         public void onClose(IndexReader reader) {
-           TestUtil.shutdownExecutorService(ex);
-         }
-       });
+       r.getReaderCacheHelper().addClosedListener(cacheKey -> TestUtil.shutdownExecutorService(ex));
       }
       IndexSearcher ret;
       if (wrapWithAssertions) {
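
The newSearcher() change above ties the executor's lifetime to the reader: with no reader cache helper there is no close event to hang the shutdown on, so no executor is created in the first place. Condensed into a standalone sketch (NewSearcherSketch and the fixed pool size are illustrative choices):

import java.util.concurrent.LinkedBlockingQueue;
import java.util.concurrent.ThreadPoolExecutor;
import java.util.concurrent.TimeUnit;

import org.apache.lucene.index.IndexReader;
import org.apache.lucene.search.IndexSearcher;

final class NewSearcherSketch {

  static IndexSearcher newSearcher(IndexReader reader) {
    IndexReader.CacheHelper helper = reader.getReaderCacheHelper();
    if (helper == null) {
      return new IndexSearcher(reader); // no close event: stay single-threaded
    }
    ThreadPoolExecutor pool = new ThreadPoolExecutor(
        2, 2, 0L, TimeUnit.MILLISECONDS, new LinkedBlockingQueue<>());
    // shut the pool down when the reader is closed, never before
    helper.addClosedListener(cacheKey -> pool.shutdown());
    return new IndexSearcher(reader, pool);
  }
}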

http://git-wip-us.apache.org/repos/asf/lucene-solr/blob/df6f8307/solr/core/src/java/org/apache/solr/core/SolrCore.java
----------------------------------------------------------------------
diff --git a/solr/core/src/java/org/apache/solr/core/SolrCore.java b/solr/core/src/java/org/apache/solr/core/SolrCore.java
index b108b38..1c30e4c 100644
--- a/solr/core/src/java/org/apache/solr/core/SolrCore.java
+++ b/solr/core/src/java/org/apache/solr/core/SolrCore.java
@@ -63,6 +63,7 @@ import org.apache.lucene.analysis.util.ResourceLoader;
 import org.apache.lucene.codecs.Codec;
 import org.apache.lucene.index.DirectoryReader;
 import org.apache.lucene.index.IndexDeletionPolicy;
+import org.apache.lucene.index.IndexReader;
 import org.apache.lucene.index.IndexWriter;
 import org.apache.lucene.index.LeafReaderContext;
 import org.apache.lucene.search.BooleanQuery;
@@ -219,7 +220,7 @@ public final class SolrCore implements SolrInfoMBean, Closeable {
 
   public Date getStartTimeStamp() { return startTime; }
 
-  private final Map<Object, IndexFingerprint> perSegmentFingerprintCache = new MapMaker().weakKeys().makeMap();
+  private final Map<IndexReader.CacheKey, IndexFingerprint> perSegmentFingerprintCache = new MapMaker().weakKeys().makeMap();
 
   public long getStartNanoTime() {
     return startNanoTime;
@@ -1775,8 +1776,14 @@ public final class SolrCore implements SolrInfoMBean, Closeable {
    */
   public IndexFingerprint getIndexFingerprint(SolrIndexSearcher searcher, LeafReaderContext ctx, long maxVersion)
       throws IOException {
+    IndexReader.CacheHelper cacheHelper = ctx.reader().getReaderCacheHelper();
+    if (cacheHelper == null) {
+      log.debug("Cannot cache IndexFingerprint as reader does not support caching. searcher:{} reader:{} readerHash:{} maxVersion:{}", searcher, ctx.reader(), ctx.reader().hashCode(), maxVersion);
+      return IndexFingerprint.getFingerprint(searcher, ctx, maxVersion);
+    }
+    
     IndexFingerprint f = null;
-    f = perSegmentFingerprintCache.get(ctx.reader().getCombinedCoreAndDeletesKey());
+    f = perSegmentFingerprintCache.get(cacheHelper.getKey());
     // fingerprint is either not cached or
     // if we want fingerprint only up to a version less than maxVersionEncountered in the segment, or
     // documents were deleted from segment for which fingerprint was cached
@@ -1787,7 +1794,7 @@ public final class SolrCore implements SolrInfoMBean, Closeable {
       // cache fingerprint for the segment only if all the versions in the segment are included in the fingerprint
       if (f.getMaxVersionEncountered() == f.getMaxInHash()) {
         log.info("Caching fingerprint for searcher:{} leafReaderContext:{} mavVersion:{}", searcher, ctx, maxVersion);
-        perSegmentFingerprintCache.put(ctx.reader().getCombinedCoreAndDeletesKey(), f);
+        perSegmentFingerprintCache.put(cacheHelper.getKey(), f);
       }
 
     } else {
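
Note which helper SolrCore picks: a fingerprint depends on deletions, so the key comes from getReaderCacheHelper() rather than the core helper, whose key survives deletes and would serve stale entries. A condensed sketch of the guarded lookup (FingerprintCacheSketch stands in for the Solr types, and the real code additionally re-checks maxVersion before trusting a cache hit):

import java.util.Map;
import java.util.concurrent.ConcurrentHashMap;

import org.apache.lucene.index.IndexReader;
import org.apache.lucene.index.LeafReaderContext;

final class FingerprintCacheSketch {

  private final Map<IndexReader.CacheKey, Long> cache = new ConcurrentHashMap<>();

  long fingerprint(LeafReaderContext ctx) {
    IndexReader.CacheHelper helper = ctx.reader().getReaderCacheHelper();
    if (helper == null) {
      return compute(ctx); // reader not suited for caching: recompute
    }
    return cache.computeIfAbsent(helper.getKey(), key -> compute(ctx));
  }

  private long compute(LeafReaderContext ctx) {
    return ctx.reader().maxDoc(); // stand-in for IndexFingerprint.getFingerprint(...)
  }
}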

http://git-wip-us.apache.org/repos/asf/lucene-solr/blob/df6f8307/solr/core/src/java/org/apache/solr/handler/component/ExpandComponent.java
----------------------------------------------------------------------
diff --git a/solr/core/src/java/org/apache/solr/handler/component/ExpandComponent.java b/solr/core/src/java/org/apache/solr/handler/component/ExpandComponent.java
index c06aab0..8078bdc 100644
--- a/solr/core/src/java/org/apache/solr/handler/component/ExpandComponent.java
+++ b/solr/core/src/java/org/apache/solr/handler/component/ExpandComponent.java
@@ -761,6 +761,8 @@ public class ExpandComponent extends SearchComponent implements PluginInfoInitia
     }
   }
 
+  // this reader alters the content of the given reader so it should not
+  // delegate the caching stuff
   private class ReaderWrapper extends FilterLeafReader {
 
     private String field;
@@ -774,10 +776,6 @@ public class ExpandComponent extends SearchComponent implements PluginInfoInitia
       return null;
     }
 
-    public Object getCoreCacheKey() {
-      return in.getCoreCacheKey();
-    }
-
     public FieldInfos getFieldInfos() {
       Iterator<FieldInfo> it = in.getFieldInfos().iterator();
       List<FieldInfo> newInfos = new ArrayList<>();
@@ -805,6 +803,21 @@ public class ExpandComponent extends SearchComponent implements PluginInfoInitia
       FieldInfos infos = new FieldInfos(newInfos.toArray(new FieldInfo[newInfos.size()]));
       return infos;
     }
+
+    // NOTE: delegating the caches is wrong here as we are altering the content
+    // of the reader, this should ONLY be used under an uninvertingreader which
+    // will restore doc values back using uninversion, otherwise all sorts of
+    // crazy things could happen.
+
+    @Override
+    public CacheHelper getCoreCacheHelper() {
+      return in.getCoreCacheHelper();
+    }
+
+    @Override
+    public CacheHelper getReaderCacheHelper() {
+      return in.getReaderCacheHelper();
+    }
   }
 
 }

http://git-wip-us.apache.org/repos/asf/lucene-solr/blob/df6f8307/solr/core/src/java/org/apache/solr/highlight/DefaultSolrHighlighter.java
----------------------------------------------------------------------
diff --git a/solr/core/src/java/org/apache/solr/highlight/DefaultSolrHighlighter.java b/solr/core/src/java/org/apache/solr/highlight/DefaultSolrHighlighter.java
index 33ea575..e4ada59 100644
--- a/solr/core/src/java/org/apache/solr/highlight/DefaultSolrHighlighter.java
+++ b/solr/core/src/java/org/apache/solr/highlight/DefaultSolrHighlighter.java
@@ -914,4 +914,14 @@ class TermVectorReusingLeafReader extends FilterLeafReader {
     return tvFields;
   }
 
+  @Override
+  public CacheHelper getCoreCacheHelper() {
+    return null;
+  }
+
+  @Override
+  public CacheHelper getReaderCacheHelper() {
+    return null;
+  }
+
 }

http://git-wip-us.apache.org/repos/asf/lucene-solr/blob/df6f8307/solr/core/src/java/org/apache/solr/index/SlowCompositeReaderWrapper.java
----------------------------------------------------------------------
diff --git a/solr/core/src/java/org/apache/solr/index/SlowCompositeReaderWrapper.java b/solr/core/src/java/org/apache/solr/index/SlowCompositeReaderWrapper.java
index 12f5bd1..c445cdf 100644
--- a/solr/core/src/java/org/apache/solr/index/SlowCompositeReaderWrapper.java
+++ b/solr/core/src/java/org/apache/solr/index/SlowCompositeReaderWrapper.java
@@ -47,7 +47,6 @@ public final class SlowCompositeReaderWrapper extends LeafReader {
 
   private final CompositeReader in;
   private final Fields fields;
-  private final boolean merging;
   
   /** This method is sugar for getting a {@link LeafReader} from
    * an {@link IndexReader} of any kind. If the reader is already atomic,
@@ -55,19 +54,18 @@ public final class SlowCompositeReaderWrapper extends LeafReader {
    */
   public static LeafReader wrap(IndexReader reader) throws IOException {
     if (reader instanceof CompositeReader) {
-      return new SlowCompositeReaderWrapper((CompositeReader) reader, false);
+      return new SlowCompositeReaderWrapper((CompositeReader) reader);
     } else {
       assert reader instanceof LeafReader;
       return (LeafReader) reader;
     }
   }
 
-  SlowCompositeReaderWrapper(CompositeReader reader, boolean merging) throws IOException {
+  SlowCompositeReaderWrapper(CompositeReader reader) throws IOException {
     super();
     in = reader;
     fields = MultiFields.getFields(in);
     in.registerParentReader(this);
-    this.merging = merging;
   }
 
   @Override
@@ -76,13 +74,16 @@ public final class SlowCompositeReaderWrapper extends LeafReader {
   }
 
   @Override
-  public void addCoreClosedListener(CoreClosedListener listener) {
-    addCoreClosedListenerAsReaderClosedListener(in, listener);
+  public CacheHelper getReaderCacheHelper() {
+    return in.getReaderCacheHelper();
   }
 
   @Override
-  public void removeCoreClosedListener(CoreClosedListener listener) {
-    removeCoreClosedListenerAsReaderClosedListener(in, listener);
+  public CacheHelper getCoreCacheHelper() {
+    // TODO: this is trappy as the expectation is that core keys live for a long
+    // time, but here we need to bound it to the lifetime of the wrapped
+    // composite reader? Unfortunately some features seem to rely on this...
+    return in.getReaderCacheHelper();
   }
 
   @Override
@@ -120,7 +121,8 @@ public final class SlowCompositeReaderWrapper extends LeafReader {
         SortedDocValues dv = MultiDocValues.getSortedValues(in, field);
         if (dv instanceof MultiSortedDocValues) {
           map = ((MultiSortedDocValues)dv).mapping;
-          if (map.owner == getCoreCacheKey() && merging == false) {
+          IndexReader.CacheHelper cacheHelper = getReaderCacheHelper();
+          if (cacheHelper != null && map.owner == cacheHelper.getKey()) {
             cachedOrdMaps.put(field, map);
           }
         }
@@ -161,7 +163,8 @@ public final class SlowCompositeReaderWrapper extends LeafReader {
         SortedSetDocValues dv = MultiDocValues.getSortedSetValues(in, field);
         if (dv instanceof MultiDocValues.MultiSortedSetDocValues) {
           map = ((MultiDocValues.MultiSortedSetDocValues)dv).mapping;
-          if (map.owner == getCoreCacheKey() && merging == false) {
+          IndexReader.CacheHelper cacheHelper = getReaderCacheHelper();
+          if (cacheHelper != null && map.owner == cacheHelper.getKey()) {
             cachedOrdMaps.put(field, map);
           }
         }
@@ -195,7 +198,7 @@ public final class SlowCompositeReaderWrapper extends LeafReader {
   
   // TODO: this could really be a weak map somewhere else on the coreCacheKey,
   // but do we really need to optimize slow-wrapper any more?
-  private final Map<String,OrdinalMap> cachedOrdMaps = new HashMap<>();
+  final Map<String,OrdinalMap> cachedOrdMaps = new HashMap<>();
 
   @Override
   public NumericDocValues getNormValues(String field) throws IOException {
@@ -246,16 +249,6 @@ public final class SlowCompositeReaderWrapper extends LeafReader {
   }
 
   @Override
-  public Object getCoreCacheKey() {
-    return in.getCoreCacheKey();
-  }
-
-  @Override
-  public Object getCombinedCoreAndDeletesKey() {
-    return in.getCombinedCoreAndDeletesKey();
-  }
-
-  @Override
   protected void doClose() throws IOException {
     // TODO: as this is a wrapper, should we really close the delegate?
     in.close();

http://git-wip-us.apache.org/repos/asf/lucene-solr/blob/df6f8307/solr/core/src/java/org/apache/solr/schema/RptWithGeometrySpatialField.java
----------------------------------------------------------------------
diff --git a/solr/core/src/java/org/apache/solr/schema/RptWithGeometrySpatialField.java b/solr/core/src/java/org/apache/solr/schema/RptWithGeometrySpatialField.java
index f072867..acd2255 100644
--- a/solr/core/src/java/org/apache/solr/schema/RptWithGeometrySpatialField.java
+++ b/solr/core/src/java/org/apache/solr/schema/RptWithGeometrySpatialField.java
@@ -22,6 +22,7 @@ import java.util.HashMap;
 import java.util.Map;
 
 import org.apache.lucene.analysis.Analyzer;
+import org.apache.lucene.index.IndexReader;
 import org.apache.lucene.index.LeafReaderContext;
 import org.apache.lucene.queries.function.FunctionValues;
 import org.apache.lucene.queries.function.ValueSource;
@@ -162,7 +163,11 @@ public class RptWithGeometrySpatialField extends AbstractSpatialFieldType<Compos
           }
           docId = doc;
           //lookup in cache
-          PerSegCacheKey key = new PerSegCacheKey(readerContext.reader().getCoreCacheKey(), doc);
+          IndexReader.CacheHelper cacheHelper = readerContext.reader().getCoreCacheHelper();
+          if (cacheHelper == null) {
+            throw new IllegalStateException("Leaf " + readerContext.reader() + " is not suited for caching");
+          }
+          PerSegCacheKey key = new PerSegCacheKey(cacheHelper.getKey(), doc);
           shape = cache.get(key);
           if (shape == null) {
             shape = (Shape) targetFuncValues.objectVal(doc);
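
Per-document caches compose the segment's core cache key with a document id, and with this patch they fail fast when no helper exists rather than caching under an unstable key. A sketch of such a composite key (PerSegmentDocKey is an illustrative name; the patch only changes where the segment key of the existing PerSegCacheKey comes from). CacheKey does not override equals, so identity comparison is the intended semantics:

import java.util.Objects;

import org.apache.lucene.index.IndexReader;

final class PerSegmentDocKey {

  private final IndexReader.CacheKey segmentKey;
  private final int docId;

  PerSegmentDocKey(IndexReader.CacheKey segmentKey, int docId) {
    this.segmentKey = Objects.requireNonNull(segmentKey);
    this.docId = docId;
  }

  @Override
  public boolean equals(Object o) {
    if (!(o instanceof PerSegmentDocKey)) return false;
    PerSegmentDocKey other = (PerSegmentDocKey) o;
    return segmentKey == other.segmentKey && docId == other.docId;
  }

  @Override
  public int hashCode() {
    return 31 * System.identityHashCode(segmentKey) + docId;
  }
}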

http://git-wip-us.apache.org/repos/asf/lucene-solr/blob/df6f8307/solr/core/src/java/org/apache/solr/search/CollapsingQParserPlugin.java
----------------------------------------------------------------------
diff --git a/solr/core/src/java/org/apache/solr/search/CollapsingQParserPlugin.java b/solr/core/src/java/org/apache/solr/search/CollapsingQParserPlugin.java
index 65d470e..71478aa 100644
--- a/solr/core/src/java/org/apache/solr/search/CollapsingQParserPlugin.java
+++ b/solr/core/src/java/org/apache/solr/search/CollapsingQParserPlugin.java
@@ -388,12 +388,23 @@ public class CollapsingQParserPlugin extends QParserPlugin {
       this.field = field;
     }
 
-    public SortedDocValues getSortedDocValues(String field) {
-      return null;
+    // NOTE: delegating the caches is wrong here as we are altering the content
+    // of the reader, this should ONLY be used under an uninvertingreader which
+    // will restore doc values back using uninversion, otherwise all sorts of
+    // crazy things could happen.
+
+    @Override
+    public CacheHelper getCoreCacheHelper() {
+      return in.getCoreCacheHelper();
     }
 
-    public Object getCoreCacheKey() {
-      return in.getCoreCacheKey();
+    @Override
+    public CacheHelper getReaderCacheHelper() {
+      return in.getReaderCacheHelper();
+    }
+
+    public SortedDocValues getSortedDocValues(String field) {
+      return null;
     }
 
     public FieldInfos getFieldInfos() {

http://git-wip-us.apache.org/repos/asf/lucene-solr/blob/df6f8307/solr/core/src/java/org/apache/solr/search/Insanity.java
----------------------------------------------------------------------
diff --git a/solr/core/src/java/org/apache/solr/search/Insanity.java b/solr/core/src/java/org/apache/solr/search/Insanity.java
index 7f16797..aa36652 100644
--- a/solr/core/src/java/org/apache/solr/search/Insanity.java
+++ b/solr/core/src/java/org/apache/solr/search/Insanity.java
@@ -118,13 +118,14 @@ public class Insanity {
     // important to override these, so fieldcaches are shared on what we wrap
     
     @Override
-    public Object getCoreCacheKey() {
-      return in.getCoreCacheKey();
+    public CacheHelper getCoreCacheHelper() {
+      return in.getCoreCacheHelper();
     }
 
     @Override
-    public Object getCombinedCoreAndDeletesKey() {
-      return in.getCombinedCoreAndDeletesKey();
+    public CacheHelper getReaderCacheHelper() {
+      return in.getReaderCacheHelper();
     }
+
   }
 }

http://git-wip-us.apache.org/repos/asf/lucene-solr/blob/df6f8307/solr/core/src/java/org/apache/solr/uninverting/FieldCache.java
----------------------------------------------------------------------
diff --git a/solr/core/src/java/org/apache/solr/uninverting/FieldCache.java b/solr/core/src/java/org/apache/solr/uninverting/FieldCache.java
index 544800e..89e6f0b 100644
--- a/solr/core/src/java/org/apache/solr/uninverting/FieldCache.java
+++ b/solr/core/src/java/org/apache/solr/uninverting/FieldCache.java
@@ -17,7 +17,6 @@
 package org.apache.solr.uninverting;
 
 import java.io.IOException;
-import java.io.PrintStream;
 
 import org.apache.lucene.document.NumericDocValuesField;
 import org.apache.lucene.index.BinaryDocValues;
@@ -41,7 +40,6 @@ import org.apache.lucene.util.RamUsageEstimator;
  * <p>Created: May 19, 2004 11:13:14 AM
  *
  * @since   lucene 1.4
- * @see FieldCacheSanityChecker
  *
  * @lucene.internal
  */
@@ -357,7 +355,7 @@ public interface FieldCache {
     private final Object custom;
     private final Accountable value;
 
-    public CacheEntry(Object readerKey, String fieldName,
+    public CacheEntry(IndexReader.CacheKey readerKey, String fieldName,
                       Class<?> cacheType,
                       Object custom,
                       Accountable value) {
@@ -437,21 +435,13 @@ public interface FieldCache {
 
   /**
    * Expert: drops all cache entries associated with this
-   * reader {@link IndexReader#getCoreCacheKey}.  NOTE: this cache key must
+   * reader {@link org.apache.lucene.index.IndexReader.CacheHelper#getKey()}.
+   * NOTE: this cache key must
    * precisely match the reader that the cache entry is
    * keyed on. If you pass a top-level reader, it usually
    * will have no effect as Lucene now caches at the segment
    * reader level.
    */
-  public void purgeByCacheKey(Object coreCacheKey);
+  public void purgeByCacheKey(IndexReader.CacheKey coreCacheKey);
 
-  /**
-   * If non-null, FieldCacheImpl will warn whenever
-   * entries are created that are not sane according to
-   * {@link FieldCacheSanityChecker}.
-   */
-  public void setInfoStream(PrintStream stream);
-
-  /** counterpart of {@link #setInfoStream(PrintStream)} */
-  public PrintStream getInfoStream();
 }

http://git-wip-us.apache.org/repos/asf/lucene-solr/blob/df6f8307/solr/core/src/java/org/apache/solr/uninverting/FieldCacheImpl.java
----------------------------------------------------------------------
diff --git a/solr/core/src/java/org/apache/solr/uninverting/FieldCacheImpl.java b/solr/core/src/java/org/apache/solr/uninverting/FieldCacheImpl.java
index 90be400..e6e8fda 100644
--- a/solr/core/src/java/org/apache/solr/uninverting/FieldCacheImpl.java
+++ b/solr/core/src/java/org/apache/solr/uninverting/FieldCacheImpl.java
@@ -17,7 +17,6 @@
 package org.apache.solr.uninverting;
 
 import java.io.IOException;
-import java.io.PrintStream;
 import java.util.ArrayList;
 import java.util.Collection;
 import java.util.Collections;
@@ -31,13 +30,13 @@ import org.apache.lucene.index.DocValues;
 import org.apache.lucene.index.DocValuesType;
 import org.apache.lucene.index.FieldInfo;
 import org.apache.lucene.index.IndexOptions;
+import org.apache.lucene.index.IndexReader;
 import org.apache.lucene.index.LeafReader;
 import org.apache.lucene.index.NumericDocValues;
 import org.apache.lucene.index.PointValues;
 import org.apache.lucene.index.PointValues.IntersectVisitor;
 import org.apache.lucene.index.PointValues.Relation;
 import org.apache.lucene.index.PostingsEnum;
-import org.apache.lucene.index.SegmentReader;
 import org.apache.lucene.index.SortedDocValues;
 import org.apache.lucene.index.SortedSetDocValues;
 import org.apache.lucene.index.Terms;
@@ -82,7 +81,7 @@ public class FieldCacheImpl implements FieldCache {
   }
 
   @Override
-  public synchronized void purgeByCacheKey(Object coreCacheKey) {
+  public synchronized void purgeByCacheKey(IndexReader.CacheKey coreCacheKey) {
     for(Cache c : caches.values()) {
       c.purgeByCacheKey(coreCacheKey);
     }
@@ -95,8 +94,8 @@ public class FieldCacheImpl implements FieldCache {
       final Cache cache = cacheEntry.getValue();
       final Class<?> cacheType = cacheEntry.getKey();
       synchronized(cache.readerCache) {
-        for (final Map.Entry<Object,Map<CacheKey, Accountable>> readerCacheEntry : cache.readerCache.entrySet()) {
-          final Object readerKey = readerCacheEntry.getKey();
+        for (final Map.Entry<IndexReader.CacheKey,Map<CacheKey, Accountable>> readerCacheEntry : cache.readerCache.entrySet()) {
+          final IndexReader.CacheKey readerKey = readerCacheEntry.getKey();
           if (readerKey == null) continue;
           final Map<CacheKey, Accountable> innerCache = readerCacheEntry.getValue();
           for (final Map.Entry<CacheKey, Accountable> mapEntry : innerCache.entrySet()) {
@@ -112,10 +111,14 @@ public class FieldCacheImpl implements FieldCache {
   }
 
   // per-segment fieldcaches don't purge until the shared core closes.
-  final SegmentReader.CoreClosedListener purgeCore = FieldCacheImpl.this::purgeByCacheKey;
+  final IndexReader.ClosedListener purgeCore = FieldCacheImpl.this::purgeByCacheKey;
   
   private void initReader(LeafReader reader) {
-    reader.addCoreClosedListener(purgeCore);
+    IndexReader.CacheHelper cacheHelper = reader.getCoreCacheHelper();
+    if (cacheHelper == null) {
+      throw new IllegalStateException("Cannot cache on " + reader);
+    }
+    cacheHelper.addClosedListener(purgeCore);
   }
 
   /** Expert: Internal cache. */
@@ -127,13 +130,13 @@ public class FieldCacheImpl implements FieldCache {
 
     final FieldCacheImpl wrapper;
 
-    final Map<Object,Map<CacheKey,Accountable>> readerCache = new WeakHashMap<>();
+    final Map<IndexReader.CacheKey,Map<CacheKey,Accountable>> readerCache = new WeakHashMap<>();
     
     protected abstract Accountable createValue(LeafReader reader, CacheKey key)
         throws IOException;
 
     /** Remove this reader from the cache, if present. */
-    public void purgeByCacheKey(Object coreCacheKey) {
+    public void purgeByCacheKey(IndexReader.CacheKey coreCacheKey) {
       synchronized(readerCache) {
         readerCache.remove(coreCacheKey);
       }
@@ -142,7 +145,11 @@ public class FieldCacheImpl implements FieldCache {
     /** Sets the key to the value for the provided reader;
      *  if the key is already set then this doesn't change it. */
     public void put(LeafReader reader, CacheKey key, Accountable value) {
-      final Object readerKey = reader.getCoreCacheKey();
+      IndexReader.CacheHelper cacheHelper = reader.getCoreCacheHelper();
+      if (cacheHelper == null) {
+        throw new IllegalStateException("Cannot cache on " + reader);
+      }
+      final IndexReader.CacheKey readerKey = cacheHelper.getKey();
       synchronized (readerCache) {
         Map<CacheKey,Accountable> innerCache = readerCache.get(readerKey);
         if (innerCache == null) {
@@ -163,7 +170,11 @@ public class FieldCacheImpl implements FieldCache {
     public Object get(LeafReader reader, CacheKey key) throws IOException {
       Map<CacheKey,Accountable> innerCache;
       Accountable value;
-      final Object readerKey = reader.getCoreCacheKey();
+      IndexReader.CacheHelper cacheHelper = reader.getCoreCacheHelper();
+      if (cacheHelper == null) {
+        throw new IllegalStateException("Cannot cache on " + reader);
+      }
+      final IndexReader.CacheKey readerKey = cacheHelper.getKey();
       synchronized (readerCache) {
         innerCache = readerCache.get(readerKey);
         if (innerCache == null) {
@@ -188,39 +200,12 @@ public class FieldCacheImpl implements FieldCache {
             synchronized (readerCache) {
               innerCache.put(key, progress.value);
             }
-
-            // Only check if key.custom (the parser) is
-            // non-null; else, we check twice for a single
-            // call to FieldCache.getXXX
-            if (key.custom != null && wrapper != null) {
-              final PrintStream infoStream = wrapper.getInfoStream();
-              if (infoStream != null) {
-                printNewInsanity(infoStream, progress.value);
-              }
-            }
           }
           return progress.value;
         }
       }
       return value;
     }
-
-    private void printNewInsanity(PrintStream infoStream, Object value) {
-      final FieldCacheSanityChecker.Insanity[] insanities = FieldCacheSanityChecker.checkSanity(wrapper);
-      for(int i=0;i<insanities.length;i++) {
-        final FieldCacheSanityChecker.Insanity insanity = insanities[i];
-        final CacheEntry[] entries = insanity.getCacheEntries();
-        for(int j=0;j<entries.length;j++) {
-          if (entries[j].getValue() == value) {
-            // OK this insanity involves our entry
-            infoStream.println("WARNING: new FieldCache insanity created\nDetails: " + insanity.toString());
-            infoStream.println("\nStack:\n");
-            new Throwable().printStackTrace(infoStream);
-            break;
-          }
-        }
-      }
-    }
   }
 
   /** Expert: Every composite-key in the internal cache is of this type. */
@@ -1265,14 +1250,5 @@ public class FieldCacheImpl implements FieldCache {
     }
   }
 
-  private volatile PrintStream infoStream;
-
-  public void setInfoStream(PrintStream stream) {
-    infoStream = stream;
-  }
-
-  public PrintStream getInfoStream() {
-    return infoStream;
-  }
 }
 

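For reference, here is a minimal sketch of the listener-registration pattern the FieldCacheImpl diff above migrates to, using Lucene's IndexReader.CacheHelper API; the perCoreState map is a hypothetical stand-in for the cache's internal state:

    // Register a per-core eviction hook; the listener fires exactly once,
    // when the segment's shared core closes.
    void initReader(LeafReader reader, Map<IndexReader.CacheKey, Object> perCoreState) {
      IndexReader.CacheHelper cacheHelper = reader.getCoreCacheHelper();
      if (cacheHelper == null) {
        throw new IllegalStateException("Cannot cache on " + reader);
      }
      cacheHelper.addClosedListener(key -> perCoreState.remove(key));
    }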
http://git-wip-us.apache.org/repos/asf/lucene-solr/blob/df6f8307/solr/core/src/java/org/apache/solr/uninverting/FieldCacheSanityChecker.java
----------------------------------------------------------------------
diff --git a/solr/core/src/java/org/apache/solr/uninverting/FieldCacheSanityChecker.java b/solr/core/src/java/org/apache/solr/uninverting/FieldCacheSanityChecker.java
deleted file mode 100644
index 3d874ce..0000000
--- a/solr/core/src/java/org/apache/solr/uninverting/FieldCacheSanityChecker.java
+++ /dev/null
@@ -1,426 +0,0 @@
-/*
- * Licensed to the Apache Software Foundation (ASF) under one or more
- * contributor license agreements.  See the NOTICE file distributed with
- * this work for additional information regarding copyright ownership.
- * The ASF licenses this file to You under the Apache License, Version 2.0
- * (the "License"); you may not use this file except in compliance with
- * the License.  You may obtain a copy of the License at
- *
- *     http://www.apache.org/licenses/LICENSE-2.0
- *
- * Unless required by applicable law or agreed to in writing, software
- * distributed under the License is distributed on an "AS IS" BASIS,
- * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
- * See the License for the specific language governing permissions and
- * limitations under the License.
- */
-package org.apache.solr.uninverting;
-
-import java.util.ArrayList;
-import java.util.Collection;
-import java.util.HashMap;
-import java.util.HashSet;
-import java.util.List;
-import java.util.Map;
-import java.util.Set;
-
-import org.apache.lucene.index.IndexReader;
-import org.apache.lucene.index.IndexReaderContext;
-import org.apache.lucene.store.AlreadyClosedException;
-import org.apache.lucene.util.Accountable;
-import org.apache.lucene.util.MapOfSets;
-import org.apache.solr.uninverting.FieldCache.CacheEntry;
-
-/** 
- * Provides methods for sanity checking that entries in the FieldCache 
- * are not wasteful or inconsistent.
- * </p>
- * <p>
- * Lucene 2.9 Introduced numerous enhancements into how the FieldCache 
- * is used by the low levels of Lucene searching (for Sorting and 
- * ValueSourceQueries) to improve both the speed for Sorting, as well 
- * as reopening of IndexReaders.  But these changes have shifted the 
- * usage of FieldCache from "top level" IndexReaders (frequently a 
- * MultiReader or DirectoryReader) down to the leaf level SegmentReaders.  
- * As a result, existing applications that directly access the FieldCache 
- * may find RAM usage increase significantly when upgrading to 2.9 or 
- * Later.  This class provides an API for these applications (or their 
- * Unit tests) to check at run time if the FieldCache contains "insane" 
- * usages of the FieldCache.
- * </p>
- * @lucene.experimental
- * @see FieldCache
- * @see FieldCacheSanityChecker.Insanity
- * @see FieldCacheSanityChecker.InsanityType
- */
-final class FieldCacheSanityChecker {
-
-  public FieldCacheSanityChecker() {
-    /* NOOP */
-  }
-
-  /** 
-   * Quick and dirty convenience method
-   * @see #check
-   */
-  public static Insanity[] checkSanity(FieldCache cache) {
-    return checkSanity(cache.getCacheEntries());
-  }
-
-  /** 
-   * Quick and dirty convenience method that instantiates an instance with 
-   * "good defaults" and uses it to test the CacheEntrys
-   * @see #check
-   */
-  public static Insanity[] checkSanity(CacheEntry... cacheEntries) {
-    FieldCacheSanityChecker sanityChecker = new FieldCacheSanityChecker();
-    return sanityChecker.check(cacheEntries);
-  }
-
-
-  /**
-   * Tests a CacheEntry[] for indication of "insane" cache usage.
-   * <p>
-   * <B>NOTE:</b>FieldCache CreationPlaceholder objects are ignored.
-   * (:TODO: is this a bad idea? are we masking a real problem?)
-   * </p>
-   */
-  public Insanity[] check(CacheEntry... cacheEntries) {
-    if (null == cacheEntries || 0 == cacheEntries.length) 
-      return new Insanity[0];
-
-    // the indirect mapping lets MapOfSet dedup identical valIds for us
-    //
-    // maps the (valId) identityhashCode of cache values to 
-    // sets of CacheEntry instances
-    final MapOfSets<Integer, CacheEntry> valIdToItems = new MapOfSets<>(new HashMap<Integer, Set<CacheEntry>>(17));
-    // maps ReaderField keys to Sets of ValueIds
-    final MapOfSets<ReaderField, Integer> readerFieldToValIds = new MapOfSets<>(new HashMap<ReaderField, Set<Integer>>(17));
-    //
-
-    // any keys that we know result in more then one valId
-    final Set<ReaderField> valMismatchKeys = new HashSet<>();
-
-    // iterate over all the cacheEntries to get the mappings we'll need
-    for (int i = 0; i < cacheEntries.length; i++) {
-      final CacheEntry item = cacheEntries[i];
-      final Accountable val = item.getValue();
-
-      // It's OK to have dup entries, where one is eg
-      // float[] and the other is the Bits (from
-      // getDocWithField())
-      if (val instanceof FieldCacheImpl.BitsEntry) {
-        continue;
-      }
-
-      if (val instanceof FieldCache.CreationPlaceholder)
-        continue;
-
-      final ReaderField rf = new ReaderField(item.getReaderKey(), 
-                                            item.getFieldName());
-
-      final Integer valId = Integer.valueOf(System.identityHashCode(val));
-
-      // indirect mapping, so the MapOfSet will dedup identical valIds for us
-      valIdToItems.put(valId, item);
-      if (1 < readerFieldToValIds.put(rf, valId)) {
-        valMismatchKeys.add(rf);
-      }
-    }
-
-    final List<Insanity> insanity = new ArrayList<>(valMismatchKeys.size() * 3);
-
-    insanity.addAll(checkValueMismatch(valIdToItems, 
-                                       readerFieldToValIds, 
-                                       valMismatchKeys));
-    insanity.addAll(checkSubreaders(valIdToItems, 
-                                    readerFieldToValIds));
-                    
-    return insanity.toArray(new Insanity[insanity.size()]);
-  }
-
-  /** 
-   * Internal helper method used by check that iterates over 
-   * valMismatchKeys and generates a Collection of Insanity 
-   * instances accordingly.  The MapOfSets are used to populate 
-   * the Insanity objects. 
-   * @see InsanityType#VALUEMISMATCH
-   */
-  private Collection<Insanity> checkValueMismatch(MapOfSets<Integer, CacheEntry> valIdToItems,
-                                        MapOfSets<ReaderField, Integer> readerFieldToValIds,
-                                        Set<ReaderField> valMismatchKeys) {
-
-    final List<Insanity> insanity = new ArrayList<>(valMismatchKeys.size() * 3);
-
-    if (! valMismatchKeys.isEmpty() ) { 
-      // we have multiple values for some ReaderFields
-
-      final Map<ReaderField, Set<Integer>> rfMap = readerFieldToValIds.getMap();
-      final Map<Integer, Set<CacheEntry>> valMap = valIdToItems.getMap();
-      for (final ReaderField rf : valMismatchKeys) {
-        final List<CacheEntry> badEntries = new ArrayList<>(valMismatchKeys.size() * 2);
-        for(final Integer value: rfMap.get(rf)) {
-          for (final CacheEntry cacheEntry : valMap.get(value)) {
-            badEntries.add(cacheEntry);
-          }
-        }
-
-        CacheEntry[] badness = new CacheEntry[badEntries.size()];
-        badness = badEntries.toArray(badness);
-
-        insanity.add(new Insanity(InsanityType.VALUEMISMATCH,
-                                  "Multiple distinct value objects for " + 
-                                  rf.toString(), badness));
-      }
-    }
-    return insanity;
-  }
-
-  /** 
-   * Internal helper method used by check that iterates over 
-   * the keys of readerFieldToValIds and generates a Collection 
-   * of Insanity instances whenever two (or more) ReaderField instances are 
-   * found that have an ancestry relationships.  
-   *
-   * @see InsanityType#SUBREADER
-   */
-  private Collection<Insanity> checkSubreaders( MapOfSets<Integer, CacheEntry>  valIdToItems,
-                                      MapOfSets<ReaderField, Integer> readerFieldToValIds) {
-
-    final List<Insanity> insanity = new ArrayList<>(23);
-
-    Map<ReaderField, Set<ReaderField>> badChildren = new HashMap<>(17);
-    MapOfSets<ReaderField, ReaderField> badKids = new MapOfSets<>(badChildren); // wrapper
-
-    Map<Integer, Set<CacheEntry>> viToItemSets = valIdToItems.getMap();
-    Map<ReaderField, Set<Integer>> rfToValIdSets = readerFieldToValIds.getMap();
-
-    Set<ReaderField> seen = new HashSet<>(17);
-
-    Set<ReaderField> readerFields = rfToValIdSets.keySet();
-    for (final ReaderField rf : readerFields) {
-      
-      if (seen.contains(rf)) continue;
-
-      List<Object> kids = getAllDescendantReaderKeys(rf.readerKey);
-      for (Object kidKey : kids) {
-        ReaderField kid = new ReaderField(kidKey, rf.fieldName);
-        
-        if (badChildren.containsKey(kid)) {
-          // we've already process this kid as RF and found other problems
-          // track those problems as our own
-          badKids.put(rf, kid);
-          badKids.putAll(rf, badChildren.get(kid));
-          badChildren.remove(kid);
-          
-        } else if (rfToValIdSets.containsKey(kid)) {
-          // we have cache entries for the kid
-          badKids.put(rf, kid);
-        }
-        seen.add(kid);
-      }
-      seen.add(rf);
-    }
-
-    // every mapping in badKids represents an Insanity
-    for (final ReaderField parent : badChildren.keySet()) {
-      Set<ReaderField> kids = badChildren.get(parent);
-
-      List<CacheEntry> badEntries = new ArrayList<>(kids.size() * 2);
-
-      // put parent entr(ies) in first
-      {
-        for (final Integer value  : rfToValIdSets.get(parent)) {
-          badEntries.addAll(viToItemSets.get(value));
-        }
-      }
-
-      // now the entries for the descendants
-      for (final ReaderField kid : kids) {
-        for (final Integer value : rfToValIdSets.get(kid)) {
-          badEntries.addAll(viToItemSets.get(value));
-        }
-      }
-
-      CacheEntry[] badness = new CacheEntry[badEntries.size()];
-      badness = badEntries.toArray(badness);
-
-      insanity.add(new Insanity(InsanityType.SUBREADER,
-                                "Found caches for descendants of " + 
-                                parent.toString(),
-                                badness));
-    }
-
-    return insanity;
-
-  }
-
-  /**
-   * Checks if the seed is an IndexReader, and if so will walk
-   * the hierarchy of subReaders building up a list of the objects 
-   * returned by {@code seed.getCoreCacheKey()}
-   */
-  private List<Object> getAllDescendantReaderKeys(Object seed) {
-    List<Object> all = new ArrayList<>(17); // will grow as we iter
-    all.add(seed);
-    for (int i = 0; i < all.size(); i++) {
-      final Object obj = all.get(i);
-      // TODO: We don't check closed readers here (as getTopReaderContext
-      // throws AlreadyClosedException), what should we do? Reflection?
-      if (obj instanceof IndexReader) {
-        try {
-          final List<IndexReaderContext> childs =
-            ((IndexReader) obj).getContext().children();
-          if (childs != null) { // it is composite reader
-            for (final IndexReaderContext ctx : childs) {
-              all.add(ctx.reader().getCoreCacheKey());
-            }
-          }
-        } catch (AlreadyClosedException ace) {
-          // ignore this reader
-        }
-      }
-    }
-    // need to skip the first, because it was the seed
-    return all.subList(1, all.size());
-  }
-
-  /**
-   * Simple pair object for using "readerKey + fieldName" a Map key
-   */
-  private final static class ReaderField {
-    public final Object readerKey;
-    public final String fieldName;
-    public ReaderField(Object readerKey, String fieldName) {
-      this.readerKey = readerKey;
-      this.fieldName = fieldName;
-    }
-    @Override
-    public int hashCode() {
-      return System.identityHashCode(readerKey) * fieldName.hashCode();
-    }
-    @Override
-    public boolean equals(Object that) {
-      if (! (that instanceof ReaderField)) return false;
-
-      ReaderField other = (ReaderField) that;
-      return (this.readerKey == other.readerKey &&
-              this.fieldName.equals(other.fieldName));
-    }
-    @Override
-    public String toString() {
-      return readerKey.toString() + "+" + fieldName;
-    }
-  }
-
-  /**
-   * Simple container for a collection of related CacheEntry objects that 
-   * in conjunction with each other represent some "insane" usage of the 
-   * FieldCache.
-   */
-  public final static class Insanity {
-    private final InsanityType type;
-    private final String msg;
-    private final CacheEntry[] entries;
-    public Insanity(InsanityType type, String msg, CacheEntry... entries) {
-      if (null == type) {
-        throw new IllegalArgumentException
-          ("Insanity requires non-null InsanityType");
-      }
-      if (null == entries || 0 == entries.length) {
-        throw new IllegalArgumentException
-          ("Insanity requires non-null/non-empty CacheEntry[]");
-      }
-      this.type = type;
-      this.msg = msg;
-      this.entries = entries;
-      
-    }
-    /**
-     * Type of insane behavior this object represents
-     */
-    public InsanityType getType() { return type; }
-    /**
-     * Description of hte insane behavior
-     */
-    public String getMsg() { return msg; }
-    /**
-     * CacheEntry objects which suggest a problem
-     */
-    public CacheEntry[] getCacheEntries() { return entries; }
-    /**
-     * Multi-Line representation of this Insanity object, starting with 
-     * the Type and Msg, followed by each CacheEntry.toString() on its 
-     * own line prefaced by a tab character
-     */
-    @Override
-    public String toString() {
-      StringBuilder buf = new StringBuilder();
-      buf.append(getType()).append(": ");
-
-      String m = getMsg();
-      if (null != m) buf.append(m);
-
-      buf.append('\n');
-
-      CacheEntry[] ce = getCacheEntries();
-      for (int i = 0; i < ce.length; i++) {
-        buf.append('\t').append(ce[i].toString()).append('\n');
-      }
-
-      return buf.toString();
-    }
-  }
-
-  /**
-   * An Enumeration of the different types of "insane" behavior that 
-   * may be detected in a FieldCache.
-   *
-   * @see InsanityType#SUBREADER
-   * @see InsanityType#VALUEMISMATCH
-   * @see InsanityType#EXPECTED
-   */
-  public final static class InsanityType {
-    private final String label;
-    private InsanityType(final String label) {
-      this.label = label;
-    }
-    @Override
-    public String toString() { return label; }
-
-    /** 
-     * Indicates an overlap in cache usage on a given field 
-     * in sub/super readers.
-     */
-    public final static InsanityType SUBREADER 
-      = new InsanityType("SUBREADER");
-
-    /** 
-     * <p>
-     * Indicates entries have the same reader+fieldname but 
-     * different cached values.  This can happen if different datatypes, 
-     * or parsers are used -- and while it's not necessarily a bug 
-     * it's typically an indication of a possible problem.
-     * </p>
-     * <p>
-     * <b>NOTE:</b> Only the reader, fieldname, and cached value are actually 
-     * tested -- if two cache entries have different parsers or datatypes but 
-     * the cached values are the same Object (== not just equal()) this method 
-     * does not consider that a red flag.  This allows for subtle variations 
-     * in the way a Parser is specified (null vs DEFAULT_LONG_PARSER, etc...)
-     * </p>
-     */
-    public final static InsanityType VALUEMISMATCH 
-      = new InsanityType("VALUEMISMATCH");
-
-    /** 
-     * Indicates an expected bit of "insanity".  This may be useful for 
-     * clients that wish to preserve/log information about insane usage 
-     * but indicate that it was expected. 
-     */
-    public final static InsanityType EXPECTED
-      = new InsanityType("EXPECTED");
-  }
-  
-  
-}

http://git-wip-us.apache.org/repos/asf/lucene-solr/blob/df6f8307/solr/core/src/java/org/apache/solr/uninverting/UninvertingReader.java
----------------------------------------------------------------------
diff --git a/solr/core/src/java/org/apache/solr/uninverting/UninvertingReader.java b/solr/core/src/java/org/apache/solr/uninverting/UninvertingReader.java
index bc27231..0ba0b81 100644
--- a/solr/core/src/java/org/apache/solr/uninverting/UninvertingReader.java
+++ b/solr/core/src/java/org/apache/solr/uninverting/UninvertingReader.java
@@ -227,6 +227,15 @@ public class UninvertingReader extends FilterLeafReader {
     protected DirectoryReader doWrapDirectoryReader(DirectoryReader in) throws IOException {
       return new UninvertingDirectoryReader(in, mapping);
     }
+
+    // NOTE: delegating the cache helpers would normally be wrong since this
+    // wrapper alters the content of the reader; it is only safe here because
+    // Solr ALWAYS consumes index readers through this wrapper
+
+    @Override
+    public CacheHelper getReaderCacheHelper() {
+      return in.getReaderCacheHelper();
+    }
   }
   
   final Map<String,Type> mapping;
@@ -391,14 +400,18 @@ public class UninvertingReader extends FilterLeafReader {
     return mapping.get(field);
   }
 
+  // NOTE: delegating the cache helpers would normally be wrong since this
+  // wrapper alters the content of the reader; it is only safe here because
+  // Solr ALWAYS consumes index readers through this wrapper
+
   @Override
-  public Object getCoreCacheKey() {
-    return in.getCoreCacheKey();
+  public CacheHelper getCoreCacheHelper() {
+    return in.getCoreCacheHelper();
   }
 
   @Override
-  public Object getCombinedCoreAndDeletesKey() {
-    return in.getCombinedCoreAndDeletesKey();
+  public CacheHelper getReaderCacheHelper() {
+    return in.getReaderCacheHelper();
   }
 
   @Override

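The NOTE above deserves emphasis: a wrapper that alters index content must normally not expose its delegate's cache helpers, since two readers with different content would then share one cache key. A minimal sketch of the safe default for a wrapper that lacks Solr's always-wrapped guarantee (the class name is hypothetical):

    class ContentAlteringReader extends FilterLeafReader {
      ContentAlteringReader(LeafReader in) { super(in); }
      @Override
      public CacheHelper getCoreCacheHelper() { return null; }   // opt out of core-level caching
      @Override
      public CacheHelper getReaderCacheHelper() { return null; } // opt out of reader-level caching
    }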
http://git-wip-us.apache.org/repos/asf/lucene-solr/blob/df6f8307/solr/core/src/java/org/apache/solr/update/SolrIndexSplitter.java
----------------------------------------------------------------------
diff --git a/solr/core/src/java/org/apache/solr/update/SolrIndexSplitter.java b/solr/core/src/java/org/apache/solr/update/SolrIndexSplitter.java
index bd4612e..a147b0f 100644
--- a/solr/core/src/java/org/apache/solr/update/SolrIndexSplitter.java
+++ b/solr/core/src/java/org/apache/solr/update/SolrIndexSplitter.java
@@ -289,6 +289,16 @@ public class SolrIndexSplitter {
     public Bits getLiveDocs() {
       return liveDocs;
     }
+
+    @Override
+    public CacheHelper getCoreCacheHelper() {
+      return in.getCoreCacheHelper();
+    }
+
+    @Override
+    public CacheHelper getReaderCacheHelper() {
+      return null;
+    }
   }
 
 }

http://git-wip-us.apache.org/repos/asf/lucene-solr/blob/df6f8307/solr/core/src/test/org/apache/solr/core/TestNRTOpen.java
----------------------------------------------------------------------
diff --git a/solr/core/src/test/org/apache/solr/core/TestNRTOpen.java b/solr/core/src/test/org/apache/solr/core/TestNRTOpen.java
index 1ebd5b6..0970953 100644
--- a/solr/core/src/test/org/apache/solr/core/TestNRTOpen.java
+++ b/solr/core/src/test/org/apache/solr/core/TestNRTOpen.java
@@ -145,7 +145,7 @@ public class TestNRTOpen extends SolrTestCaseJ4 {
     try {
       DirectoryReader ir = searcher.get().getRawReader();
       for (LeafReaderContext context : ir.leaves()) {
-        set.add(context.reader().getCoreCacheKey());
+        set.add(context.reader().getCoreCacheHelper().getKey());
       }
     } finally {
       searcher.decref();

http://git-wip-us.apache.org/repos/asf/lucene-solr/blob/df6f8307/solr/core/src/test/org/apache/solr/index/TestSlowCompositeReaderWrapper.java
----------------------------------------------------------------------
diff --git a/solr/core/src/test/org/apache/solr/index/TestSlowCompositeReaderWrapper.java b/solr/core/src/test/org/apache/solr/index/TestSlowCompositeReaderWrapper.java
index 0685e55..195aae5 100644
--- a/solr/core/src/test/org/apache/solr/index/TestSlowCompositeReaderWrapper.java
+++ b/solr/core/src/test/org/apache/solr/index/TestSlowCompositeReaderWrapper.java
@@ -18,15 +18,20 @@ package org.apache.solr.index;
 
 import java.io.IOException;
 import java.util.ArrayList;
-import java.util.Collections;
 import java.util.List;
 import java.util.concurrent.atomic.AtomicInteger;
 
 import org.apache.lucene.document.Document;
+import org.apache.lucene.document.SortedDocValuesField;
+import org.apache.lucene.document.SortedSetDocValuesField;
 import org.apache.lucene.index.DirectoryReader;
 import org.apache.lucene.index.IndexReader;
 import org.apache.lucene.index.LeafReader;
 import org.apache.lucene.index.RandomIndexWriter;
+import org.apache.lucene.index.MultiDocValues.MultiSortedDocValues;
+import org.apache.lucene.index.MultiDocValues.MultiSortedSetDocValues;
+import org.apache.lucene.store.Directory;
+import org.apache.lucene.util.BytesRef;
 import org.apache.lucene.util.LuceneTestCase;
 import org.apache.lucene.util.TestUtil;
 
@@ -48,21 +53,16 @@ public class TestSlowCompositeReaderWrapper extends LuceneTestCase {
     final LeafReader leafReader = SlowCompositeReaderWrapper.wrap(reader);
     
     final int numListeners = TestUtil.nextInt(random(), 1, 10);
-    final List<LeafReader.CoreClosedListener> listeners = new ArrayList<>();
+    final List<IndexReader.ClosedListener> listeners = new ArrayList<>();
     AtomicInteger counter = new AtomicInteger(numListeners);
     
     for (int i = 0; i < numListeners; ++i) {
-      CountCoreListener listener = new CountCoreListener(counter, leafReader.getCoreCacheKey());
+      CountCoreListener listener = new CountCoreListener(counter, leafReader.getCoreCacheHelper().getKey());
       listeners.add(listener);
-      leafReader.addCoreClosedListener(listener);
+      leafReader.getCoreCacheHelper().addClosedListener(listener);
     }
     for (int i = 0; i < 100; ++i) {
-      leafReader.addCoreClosedListener(listeners.get(random().nextInt(listeners.size())));
-    }
-    final int removed = random().nextInt(numListeners);
-    Collections.shuffle(listeners, random());
-    for (int i = 0; i < removed; ++i) {
-      leafReader.removeCoreClosedListener(listeners.get(i));
+      leafReader.getCoreCacheHelper().addClosedListener(listeners.get(random().nextInt(listeners.size())));
     }
     assertEquals(numListeners, counter.get());
     // make sure listeners are registered on the wrapped reader and that closing any of them has the same effect
@@ -71,11 +71,11 @@ public class TestSlowCompositeReaderWrapper extends LuceneTestCase {
     } else {
       leafReader.close();
     }
-    assertEquals(removed, counter.get());
+    assertEquals(0, counter.get());
     w.w.getDirectory().close();
   }
 
-  private static final class CountCoreListener implements LeafReader.CoreClosedListener {
+  private static final class CountCoreListener implements IndexReader.ClosedListener {
 
     private final AtomicInteger count;
     private final Object coreCacheKey;
@@ -86,10 +86,37 @@ public class TestSlowCompositeReaderWrapper extends LuceneTestCase {
     }
 
     @Override
-    public void onClose(Object coreCacheKey) {
+    public void onClose(IndexReader.CacheKey coreCacheKey) {
       assertSame(this.coreCacheKey, coreCacheKey);
       count.decrementAndGet();
     }
 
   }
+
+  public void testOrdMapsAreCached() throws Exception {
+    Directory dir = newDirectory();
+    RandomIndexWriter w = new RandomIndexWriter(random(), dir);
+    Document doc = new Document();
+    doc.add(new SortedDocValuesField("sorted", new BytesRef("a")));
+    doc.add(new SortedSetDocValuesField("sorted_set", new BytesRef("b")));
+    doc.add(new SortedSetDocValuesField("sorted_set", new BytesRef("c")));
+    w.addDocument(doc);
+    w.getReader().close();
+    doc = new Document();
+    doc.add(new SortedDocValuesField("sorted", new BytesRef("b")));
+    doc.add(new SortedSetDocValuesField("sorted_set", new BytesRef("c")));
+    doc.add(new SortedSetDocValuesField("sorted_set", new BytesRef("d")));
+    w.addDocument(doc);
+    IndexReader reader = w.getReader();
+    assertTrue(reader.leaves().size() > 1);
+    SlowCompositeReaderWrapper slowWrapper = (SlowCompositeReaderWrapper) SlowCompositeReaderWrapper.wrap(reader);
+    assertEquals(0, slowWrapper.cachedOrdMaps.size());
+    assertEquals(MultiSortedDocValues.class, slowWrapper.getSortedDocValues("sorted").getClass());
+    assertEquals(1, slowWrapper.cachedOrdMaps.size());
+    assertEquals(MultiSortedSetDocValues.class, slowWrapper.getSortedSetDocValues("sorted_set").getClass());
+    assertEquals(2, slowWrapper.cachedOrdMaps.size());
+    reader.close();
+    w.close();
+    dir.close();
+  }
 }

http://git-wip-us.apache.org/repos/asf/lucene-solr/blob/df6f8307/solr/core/src/test/org/apache/solr/search/TestDocSet.java
----------------------------------------------------------------------
diff --git a/solr/core/src/test/org/apache/solr/search/TestDocSet.java b/solr/core/src/test/org/apache/solr/search/TestDocSet.java
index 2849f09..db6523e 100644
--- a/solr/core/src/test/org/apache/solr/search/TestDocSet.java
+++ b/solr/core/src/test/org/apache/solr/search/TestDocSet.java
@@ -388,16 +388,6 @@ public class TestDocSet extends LuceneTestCase {
       }
 
       @Override
-      public void addCoreClosedListener(CoreClosedListener listener) {
-        throw new UnsupportedOperationException();
-      }
-
-      @Override
-      public void removeCoreClosedListener(CoreClosedListener listener) {
-        throw new UnsupportedOperationException();
-      }
-
-      @Override
       public FieldInfos getFieldInfos() {
         return new FieldInfos(new FieldInfo[0]);
       }
@@ -468,6 +458,16 @@ public class TestDocSet extends LuceneTestCase {
       public Sort getIndexSort() {
         return null;
       }
+
+      @Override
+      public CacheHelper getCoreCacheHelper() {
+        return null;
+      }
+
+      @Override
+      public CacheHelper getReaderCacheHelper() {
+        return null;
+      }
     };
   }
 


[48/50] [abbrv] lucene-solr:jira/solr-9858: SOLR-10134: EmbeddedSolrServer handles SchemaAPI requests

Posted by ab...@apache.org.
SOLR-10134: EmbeddedSolrServer handles SchemaAPI requests


Project: http://git-wip-us.apache.org/repos/asf/lucene-solr/repo
Commit: http://git-wip-us.apache.org/repos/asf/lucene-solr/commit/0baf2fa3
Tree: http://git-wip-us.apache.org/repos/asf/lucene-solr/tree/0baf2fa3
Diff: http://git-wip-us.apache.org/repos/asf/lucene-solr/diff/0baf2fa3

Branch: refs/heads/jira/solr-9858
Commit: 0baf2fa33cef485df94649fd408c22e6430b68cf
Parents: 8b4502c
Author: Mikhail Khludnev <mk...@apache.org>
Authored: Thu Feb 23 00:40:40 2017 +0300
Committer: Mikhail Khludnev <mk...@apache.org>
Committed: Wed Mar 1 08:32:35 2017 +0300

----------------------------------------------------------------------
 solr/CHANGES.txt                                |   1 +
 .../solrj/embedded/EmbeddedSolrServer.java      |  55 +++++----
 .../solr/request/SolrQueryRequestBase.java      |  17 ++-
 .../apache/solr/servlet/SolrRequestParsers.java |  11 +-
 .../TestEmbeddedSolrServerSchemaAPI.java        | 111 +++++++++++++++++++
 .../java/org/apache/solr/SolrTestCaseJ4.java    |   6 +-
 6 files changed, 168 insertions(+), 33 deletions(-)
----------------------------------------------------------------------

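In effect, Schema API requests can now be served fully in-process. A hedged usage sketch, assuming an initialized CoreContainer and borrowing the request and field names from the new test below:

    EmbeddedSolrServer server = new EmbeddedSolrServer(coreContainer, "collection1");
    Map<String, Object> field = new LinkedHashMap<>();
    field.put("name", "VerificationTest");
    field.put("type", "string");
    // Before this change the embedded code path could not answer Schema API
    // requests; it now round-trips through the same request parsing as HTTP.
    SchemaResponse.UpdateResponse rsp = new SchemaRequest.AddField(field).process(server);
    assert rsp.getStatus() == 0;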

http://git-wip-us.apache.org/repos/asf/lucene-solr/blob/0baf2fa3/solr/CHANGES.txt
----------------------------------------------------------------------
diff --git a/solr/CHANGES.txt b/solr/CHANGES.txt
index 47f190b..db5e3e6 100644
--- a/solr/CHANGES.txt
+++ b/solr/CHANGES.txt
@@ -259,6 +259,7 @@ Other Changes
 * SOLR-10214: Remove unused HDFS BlockCache metrics and add storeFails, as well as adding total
   counts for lookups, hits, and evictions. (yonik)
   
+* SOLR-10134: EmbeddedSolrServer now handles Schema API requests (Robert Alexandersson via Mikhail Khludnev)
 
 ==================  6.4.2 ==================
 

http://git-wip-us.apache.org/repos/asf/lucene-solr/blob/0baf2fa3/solr/core/src/java/org/apache/solr/client/solrj/embedded/EmbeddedSolrServer.java
----------------------------------------------------------------------
diff --git a/solr/core/src/java/org/apache/solr/client/solrj/embedded/EmbeddedSolrServer.java b/solr/core/src/java/org/apache/solr/client/solrj/embedded/EmbeddedSolrServer.java
index fc283f4..8de5fc9 100644
--- a/solr/core/src/java/org/apache/solr/client/solrj/embedded/EmbeddedSolrServer.java
+++ b/solr/core/src/java/org/apache/solr/client/solrj/embedded/EmbeddedSolrServer.java
@@ -172,6 +172,7 @@ public class EmbeddedSolrServer extends SolrClient {
 
       req = _parser.buildRequestFrom(core, params, request.getContentStreams());
       req.getContext().put(PATH, path);
+      req.getContext().put("httpMethod", request.getMethod().name());
       SolrQueryResponse rsp = new SolrQueryResponse();
       SolrRequestInfo.setRequestInfo(new SolrRequestInfo(req, rsp));
 
@@ -199,32 +200,13 @@ public class EmbeddedSolrServer extends SolrClient {
               };
 
 
-          ByteArrayOutputStream out = new ByteArrayOutputStream();
-          new JavaBinCodec(resolver) {
+          try(ByteArrayOutputStream out = new ByteArrayOutputStream()) {
+            createJavaBinCodec(callback, resolver).setWritableDocFields(resolver).marshal(rsp.getValues(), out);
 
-            @Override
-            public void writeSolrDocument(SolrDocument doc) {
-              callback.streamSolrDocument(doc);
-              //super.writeSolrDocument( doc, fields );
+            try(InputStream in = out.toInputStream()){
+              return (NamedList<Object>) new JavaBinCodec(resolver).unmarshal(in);
             }
-
-            @Override
-            public void writeSolrDocumentList(SolrDocumentList docs) throws IOException {
-              if (docs.size() > 0) {
-                SolrDocumentList tmp = new SolrDocumentList();
-                tmp.setMaxScore(docs.getMaxScore());
-                tmp.setNumFound(docs.getNumFound());
-                tmp.setStart(docs.getStart());
-                docs = tmp;
-              }
-              callback.streamDocListInfo(docs.getNumFound(), docs.getStart(), docs.getMaxScore());
-              super.writeSolrDocumentList(docs);
-            }
-
-          }.setWritableDocFields(resolver). marshal(rsp.getValues(), out);
-
-          InputStream in = out.toInputStream();
-          return (NamedList<Object>) new JavaBinCodec(resolver).unmarshal(in);
+          }
         } catch (Exception ex) {
           throw new RuntimeException(ex);
         }
@@ -243,6 +225,31 @@ public class EmbeddedSolrServer extends SolrClient {
     }
   }
 
+  private JavaBinCodec createJavaBinCodec(final StreamingResponseCallback callback, final BinaryResponseWriter.Resolver resolver) {
+    return new JavaBinCodec(resolver) {
+
+      @Override
+      public void writeSolrDocument(SolrDocument doc) {
+        callback.streamSolrDocument(doc);
+        //super.writeSolrDocument( doc, fields );
+      }
+
+      @Override
+      public void writeSolrDocumentList(SolrDocumentList docs) throws IOException {
+        if (docs.size() > 0) {
+          SolrDocumentList tmp = new SolrDocumentList();
+          tmp.setMaxScore(docs.getMaxScore());
+          tmp.setNumFound(docs.getNumFound());
+          tmp.setStart(docs.getStart());
+          docs = tmp;
+        }
+        callback.streamDocListInfo(docs.getNumFound(), docs.getStart(), docs.getMaxScore());
+        super.writeSolrDocumentList(docs);
+      }
+
+    };
+  }
+
   private static void checkForExceptions(SolrQueryResponse rsp) throws Exception {
     if (rsp.getException() != null) {
       if (rsp.getException() instanceof SolrException) {

http://git-wip-us.apache.org/repos/asf/lucene-solr/blob/0baf2fa3/solr/core/src/java/org/apache/solr/request/SolrQueryRequestBase.java
----------------------------------------------------------------------
diff --git a/solr/core/src/java/org/apache/solr/request/SolrQueryRequestBase.java b/solr/core/src/java/org/apache/solr/request/SolrQueryRequestBase.java
index 4b0e4d6..19350f0 100644
--- a/solr/core/src/java/org/apache/solr/request/SolrQueryRequestBase.java
+++ b/solr/core/src/java/org/apache/solr/request/SolrQueryRequestBase.java
@@ -31,13 +31,14 @@ import org.apache.solr.common.util.ContentStream;
 import org.apache.solr.core.SolrCore;
 
 import java.io.Closeable;
+import java.io.IOException;
 import java.io.InputStream;
 import java.io.InputStreamReader;
 import java.security.Principal;
 import java.util.Collections;
+import java.util.HashMap;
 import java.util.List;
 import java.util.Map;
-import java.util.HashMap;
 
 import static java.nio.charset.StandardCharsets.UTF_8;
 
@@ -202,7 +203,7 @@ public abstract class SolrQueryRequestBase implements SolrQueryRequest, Closeabl
       Iterable<ContentStream> contentStreams = getContentStreams();
       if (contentStreams == null) throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, "No content stream");
       for (ContentStream contentStream : contentStreams) {
-        parsedCommands = ApiBag.getCommandOperations(new InputStreamReader((InputStream) contentStream, UTF_8),
+        parsedCommands = ApiBag.getCommandOperations(getInputStream(contentStream),
             getValidators(), validateInput);
       }
 
@@ -211,6 +212,18 @@ public abstract class SolrQueryRequestBase implements SolrQueryRequest, Closeabl
 
   }
 
+  private InputStreamReader getInputStream(ContentStream contentStream) {
+    if(contentStream instanceof InputStream) {
+      return new InputStreamReader((InputStream)contentStream, UTF_8);
+    } else {
+      try {
+        return new InputStreamReader(contentStream.getStream(), UTF_8);
+      } catch (IOException e) {
+        throw new RuntimeException(e);
+      }
+    }
+  }
+
   protected ValidatingJsonMap getSpec() {
     return null;
   }

http://git-wip-us.apache.org/repos/asf/lucene-solr/blob/0baf2fa3/solr/core/src/java/org/apache/solr/servlet/SolrRequestParsers.java
----------------------------------------------------------------------
diff --git a/solr/core/src/java/org/apache/solr/servlet/SolrRequestParsers.java b/solr/core/src/java/org/apache/solr/servlet/SolrRequestParsers.java
index c311d4a..93baace 100644
--- a/solr/core/src/java/org/apache/solr/servlet/SolrRequestParsers.java
+++ b/solr/core/src/java/org/apache/solr/servlet/SolrRequestParsers.java
@@ -16,7 +16,7 @@
  */
 package org.apache.solr.servlet;
 
-import javax.servlet.http.HttpServletRequest;
+import static org.apache.solr.common.params.CommonParams.PATH;
 
 import java.io.ByteArrayOutputStream;
 import java.io.File;
@@ -33,13 +33,14 @@ import java.security.Principal;
 import java.util.ArrayList;
 import java.util.Arrays;
 import java.util.Collection;
-import java.util.Collections;
 import java.util.HashMap;
 import java.util.Iterator;
 import java.util.LinkedList;
 import java.util.List;
 import java.util.Map;
 
+import javax.servlet.http.HttpServletRequest;
+
 import org.apache.commons.fileupload.FileItem;
 import org.apache.commons.fileupload.disk.DiskFileItemFactory;
 import org.apache.commons.fileupload.servlet.ServletFileUpload;
@@ -64,8 +65,6 @@ import org.apache.solr.util.CommandOperation;
 import org.apache.solr.util.RTimerTree;
 import org.apache.solr.util.SolrFileCleaningTracker;
 
-import static org.apache.solr.common.params.CommonParams.PATH;
-
 
 public class SolrRequestParsers 
 {
@@ -239,7 +238,7 @@ public class SolrRequestParsers
         if (httpSolrCall != null) {
           return httpSolrCall.getCommands(validateInput);
         }
-        return Collections.emptyList();
+        return super.getCommands(validateInput);
       }
 
       @Override
@@ -247,7 +246,7 @@ public class SolrRequestParsers
         if (httpSolrCall != null && httpSolrCall instanceof V2HttpCall) {
           return ((V2HttpCall) httpSolrCall).getUrlParts();
         }
-        return Collections.EMPTY_MAP;
+        return super.getPathTemplateValues();
       }
 
       @Override

http://git-wip-us.apache.org/repos/asf/lucene-solr/blob/0baf2fa3/solr/core/src/test/org/apache/solr/client/solrj/embedded/TestEmbeddedSolrServerSchemaAPI.java
----------------------------------------------------------------------
diff --git a/solr/core/src/test/org/apache/solr/client/solrj/embedded/TestEmbeddedSolrServerSchemaAPI.java b/solr/core/src/test/org/apache/solr/client/solrj/embedded/TestEmbeddedSolrServerSchemaAPI.java
new file mode 100644
index 0000000..f253831
--- /dev/null
+++ b/solr/core/src/test/org/apache/solr/client/solrj/embedded/TestEmbeddedSolrServerSchemaAPI.java
@@ -0,0 +1,111 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements.  See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License.  You may obtain a copy of the License at
+ *
+ *     http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+package org.apache.solr.client.solrj.embedded;
+
+import java.io.IOException;
+import java.nio.file.Path;
+import java.util.Collections;
+import java.util.LinkedHashMap;
+import java.util.Map;
+
+import org.apache.solr.SolrTestCaseJ4;
+import org.apache.solr.client.solrj.SolrServerException;
+import org.apache.solr.client.solrj.request.schema.SchemaRequest;
+import org.apache.solr.client.solrj.response.schema.SchemaResponse;
+import org.apache.solr.client.solrj.response.schema.SchemaResponse.FieldResponse;
+import org.apache.solr.common.SolrException;
+import org.junit.AfterClass;
+import org.junit.Before;
+import org.junit.BeforeClass;
+import org.junit.Test;
+
+public class TestEmbeddedSolrServerSchemaAPI extends SolrTestCaseJ4 {
+
+  private String fieldName = "VerificationTest";
+  private static EmbeddedSolrServer server;
+  private final Map<String, Object> fieldAttributes;
+  {
+    Map<String,Object> field = new LinkedHashMap<>();
+    field.put("name", fieldName);
+    field.put("type", "string");
+    field.put("stored", false);
+    field.put("indexed", true);
+    field.put("multiValued", true);
+    fieldAttributes = Collections.unmodifiableMap(field);
+  }
+
+  @BeforeClass
+  public static void initClass() throws Exception {
+    assertNull("no system props clash please", System.getProperty("managed.schema.mutable"));
+    System.setProperty("managed.schema.mutable", ""+//true
+    random().nextBoolean()
+    );
+    Path tmpHome = createTempDir("tmp-home");
+    Path coreDir = tmpHome.resolve(DEFAULT_TEST_CORENAME);
+    copyMinConf(coreDir.toFile(), null, "solrconfig-managed-schema.xml");
+    initCore("solrconfig.xml" /*it's renamed to to*/, "schema.xml", tmpHome.toAbsolutePath().toString());
+    
+    server = new EmbeddedSolrServer(h.getCoreContainer(), DEFAULT_TEST_CORENAME);
+  }
+
+  @AfterClass
+  public static void destroyClass() throws IOException {
+    server.close(); // possibly redundant with the test harness teardown
+    server = null;
+    System.clearProperty("managed.schema.mutable");
+  }
+
+  @Before
+  public void thereIsNoFieldYet() throws SolrServerException, IOException {
+    try {
+      FieldResponse process = new SchemaRequest.Field(fieldName)
+          .process(server);
+      fail("" + process);
+    } catch (SolrException e) {
+      assertTrue(e.getMessage().contains("No")
+          && e.getMessage().contains("VerificationTest"));
+    }
+  }
+  
+  @Test
+  public void testSchemaAddFieldAndVerifyExistence() throws Exception {
+    assumeTrue("it needs to ammend schema", Boolean.getBoolean("managed.schema.mutable"));
+    SchemaResponse.UpdateResponse addFieldResponse = new SchemaRequest.AddField(fieldAttributes).process(server);
+
+    assertEquals(addFieldResponse.toString(), 0, addFieldResponse.getStatus());
+
+    // This asserts that the field was actually created
+    // this is due to the fact that the response gave OK but actually never created the field.
+    Map<String,Object> foundFieldAttributes = new SchemaRequest.Field(fieldName).process(server).getField();
+    assertEquals(fieldAttributes, foundFieldAttributes);
+
+    assertEquals("removing " + fieldName, 0,
+        new SchemaRequest.DeleteField(fieldName).process(server).getStatus());
+  }
+
+  @Test 
+  public void testSchemaAddFieldAndFailOnImmutable() throws Exception {
+    assumeFalse("it needs a readonly schema", Boolean.getBoolean("managed.schema.mutable"));
+
+    SchemaRequest.AddField addFieldUpdateSchemaRequest = new SchemaRequest.AddField(fieldAttributes);
+    SchemaResponse.UpdateResponse addFieldResponse = addFieldUpdateSchemaRequest.process(server);
+    // NOTE: the response status is 0 even though the update is rejected,
+    // so assert on the error message rather than the status code.
+    assertTrue(("" + addFieldResponse).contains("schema is not editable"));
+  }
+
+}

http://git-wip-us.apache.org/repos/asf/lucene-solr/blob/0baf2fa3/solr/test-framework/src/java/org/apache/solr/SolrTestCaseJ4.java
----------------------------------------------------------------------
diff --git a/solr/test-framework/src/java/org/apache/solr/SolrTestCaseJ4.java b/solr/test-framework/src/java/org/apache/solr/SolrTestCaseJ4.java
index 7d9d1f6..e5bd384 100644
--- a/solr/test-framework/src/java/org/apache/solr/SolrTestCaseJ4.java
+++ b/solr/test-framework/src/java/org/apache/solr/SolrTestCaseJ4.java
@@ -2019,6 +2019,10 @@ public abstract class SolrTestCaseJ4 extends LuceneTestCase {
   // the string to write to the core.properties file may be null in which case nothing is done with it.
   // propertiesContent may be an empty string, which will actually work.
   public static void copyMinConf(File dstRoot, String propertiesContent) throws IOException {
+    copyMinConf(dstRoot, propertiesContent, "solrconfig-minimal.xml");
+  }
+
+  public static void copyMinConf(File dstRoot, String propertiesContent, String solrconfigXmlName) throws IOException {
 
     File subHome = new File(dstRoot, "conf");
     if (! dstRoot.exists()) {
@@ -2030,7 +2034,7 @@ public abstract class SolrTestCaseJ4 extends LuceneTestCase {
     }
     String top = SolrTestCaseJ4.TEST_HOME() + "/collection1/conf";
     FileUtils.copyFile(new File(top, "schema-tiny.xml"), new File(subHome, "schema.xml"));
-    FileUtils.copyFile(new File(top, "solrconfig-minimal.xml"), new File(subHome, "solrconfig.xml"));
+    FileUtils.copyFile(new File(top, solrconfigXmlName), new File(subHome, "solrconfig.xml"));
     FileUtils.copyFile(new File(top, "solrconfig.snippet.randomindexconfig.xml"), new File(subHome, "solrconfig.snippet.randomindexconfig.xml"));
   }
 


[25/50] [abbrv] lucene-solr:jira/solr-9858: SOLR-10190: Fix NPE in CloudSolrClient when reading stale alias

Posted by ab...@apache.org.
SOLR-10190: Fix NPE in CloudSolrClient when reading stale alias

This closes #160


Project: http://git-wip-us.apache.org/repos/asf/lucene-solr/repo
Commit: http://git-wip-us.apache.org/repos/asf/lucene-solr/commit/39887b86
Tree: http://git-wip-us.apache.org/repos/asf/lucene-solr/tree/39887b86
Diff: http://git-wip-us.apache.org/repos/asf/lucene-solr/diff/39887b86

Branch: refs/heads/jira/solr-9858
Commit: 39887b86297e36785607f57cfd0e785bcae3c61a
Parents: 30125f9
Author: Tomas Fernandez Lobbe <tf...@apache.org>
Authored: Fri Feb 24 17:33:12 2017 -0800
Committer: Tomas Fernandez Lobbe <tf...@apache.org>
Committed: Fri Feb 24 17:33:12 2017 -0800

----------------------------------------------------------------------
 solr/CHANGES.txt                                |  2 ++
 .../solr/client/solrj/impl/CloudSolrClient.java |  3 +++
 .../client/solrj/impl/CloudSolrClientTest.java  | 25 ++++++++++++++++++++
 3 files changed, 30 insertions(+)
----------------------------------------------------------------------

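In short: an alias pointing at a deleted collection used to surface as a client-side NPE and now fails fast with a clear error. A condensed sketch of the fixed behavior, following the new test below (getCloudSolrClient is the test-framework helper used there; zkAddress is assumed):

    try (CloudSolrClient client = getCloudSolrClient(zkAddress)) {
      client.setDefaultCollection("misconfigured-alias"); // alias to a deleted collection
      client.add(new SolrInputDocument());                // now throws instead of NPE-ing
    } catch (SolrException e) {
      assert e.code() == SolrException.ErrorCode.BAD_REQUEST.code;
      assert e.getMessage().contains("Collection not found");
    }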

http://git-wip-us.apache.org/repos/asf/lucene-solr/blob/39887b86/solr/CHANGES.txt
----------------------------------------------------------------------
diff --git a/solr/CHANGES.txt b/solr/CHANGES.txt
index 0302615..2b0044c 100644
--- a/solr/CHANGES.txt
+++ b/solr/CHANGES.txt
@@ -275,6 +275,8 @@ Bug Fixes
 
 * SOLR-10083: Fix instanceof check in ConstDoubleSource.equals (Pushkar Raste via Christine Poerschke)
 
+* SOLR-10190: Fix NPE in CloudSolrClient when reading stale alias (Janosch Woschitz via Tomás Fernández Lóbbe)
+
 ==================  6.4.1 ==================
 
 Consult the LUCENE_CHANGES.txt file for additional, low level, changes in this release.

http://git-wip-us.apache.org/repos/asf/lucene-solr/blob/39887b86/solr/solrj/src/java/org/apache/solr/client/solrj/impl/CloudSolrClient.java
----------------------------------------------------------------------
diff --git a/solr/solrj/src/java/org/apache/solr/client/solrj/impl/CloudSolrClient.java b/solr/solrj/src/java/org/apache/solr/client/solrj/impl/CloudSolrClient.java
index d0263c8..3147d4e 100644
--- a/solr/solrj/src/java/org/apache/solr/client/solrj/impl/CloudSolrClient.java
+++ b/solr/solrj/src/java/org/apache/solr/client/solrj/impl/CloudSolrClient.java
@@ -1075,6 +1075,9 @@ public class CloudSolrClient extends SolrClient {
       for (String requestedCollection : requestedCollectionNames) {
         // track the version of state we're using on the client side using the _stateVer_ param
         DocCollection coll = getDocCollection(requestedCollection, null);
+        if (coll == null) {
+          throw new SolrException(ErrorCode.BAD_REQUEST, "Collection not found: " + requestedCollection);
+        }
         int collVer = coll.getZNodeVersion();
         if (coll.getStateFormat()>1) {
           if(requestedCollections == null) requestedCollections = new ArrayList<>(requestedCollectionNames.size());

http://git-wip-us.apache.org/repos/asf/lucene-solr/blob/39887b86/solr/solrj/src/test/org/apache/solr/client/solrj/impl/CloudSolrClientTest.java
----------------------------------------------------------------------
diff --git a/solr/solrj/src/test/org/apache/solr/client/solrj/impl/CloudSolrClientTest.java b/solr/solrj/src/test/org/apache/solr/client/solrj/impl/CloudSolrClientTest.java
index 1698075..cff5c23 100644
--- a/solr/solrj/src/test/org/apache/solr/client/solrj/impl/CloudSolrClientTest.java
+++ b/solr/solrj/src/test/org/apache/solr/client/solrj/impl/CloudSolrClientTest.java
@@ -147,6 +147,31 @@ public class CloudSolrClientTest extends SolrCloudTestCase {
   }
 
   @Test
+  public void testHandlingOfStaleAlias() throws Exception {
+    try (CloudSolrClient client = getCloudSolrClient(cluster.getZkServer().getZkAddress())) {
+      client.setDefaultCollection("misconfigured-alias");
+
+      CollectionAdminRequest.createCollection("nemesis", "conf", 2, 1).process(client);
+      CollectionAdminRequest.createAlias("misconfigured-alias", "nemesis").process(client);
+      CollectionAdminRequest.deleteCollection("nemesis").process(client);
+
+      List<SolrInputDocument> docs = new ArrayList<>();
+
+      SolrInputDocument doc = new SolrInputDocument();
+      doc.addField(id, Integer.toString(1));
+      docs.add(doc);
+
+      try {
+        client.add(docs);
+        fail("Alias points to non-existing collection, add should fail");
+      } catch (SolrException e) {
+        assertEquals(SolrException.ErrorCode.BAD_REQUEST.code, e.code());
+        assertTrue("Unexpected error exception", e.getMessage().contains("Collection not found"));
+      }
+    }
+  }
+
+  @Test
   public void testRouting() throws Exception {
     
     AbstractUpdateRequest request = new UpdateRequest()


[03/50] [abbrv] lucene-solr:jira/solr-9858: SOLR-10143: Added CHANGES entry

Posted by ab...@apache.org.
SOLR-10143: Added CHANGES entry


Project: http://git-wip-us.apache.org/repos/asf/lucene-solr/repo
Commit: http://git-wip-us.apache.org/repos/asf/lucene-solr/commit/55ef713e
Tree: http://git-wip-us.apache.org/repos/asf/lucene-solr/tree/55ef713e
Diff: http://git-wip-us.apache.org/repos/asf/lucene-solr/diff/55ef713e

Branch: refs/heads/jira/solr-9858
Commit: 55ef713eb281178a10ae9d34fce4d7a91a7d3733
Parents: 21690f5
Author: Tomas Fernandez Lobbe <tf...@apache.org>
Authored: Wed Feb 22 10:32:54 2017 -0800
Committer: Tomas Fernandez Lobbe <tf...@apache.org>
Committed: Wed Feb 22 10:32:54 2017 -0800

----------------------------------------------------------------------
 solr/CHANGES.txt | 3 +++
 1 file changed, 3 insertions(+)
----------------------------------------------------------------------


http://git-wip-us.apache.org/repos/asf/lucene-solr/blob/55ef713e/solr/CHANGES.txt
----------------------------------------------------------------------
diff --git a/solr/CHANGES.txt b/solr/CHANGES.txt
index 44e4fa9..a6b5504 100644
--- a/solr/CHANGES.txt
+++ b/solr/CHANGES.txt
@@ -190,6 +190,9 @@ Optimizations
 * SOLR-9584: Support Solr being proxied with another endpoint than default /solr, by using relative links
   in AdminUI javascripts (Yun Jie Zhou via janhoy)
 
+* SOLR-10143: PointFields will create IndexOrDocValuesQuery when a field is both indexed=true and docValues=true
+  (Tomás Fernández Lóbbe)
+
 Other Changes
 ----------------------
 * SOLR-9980: Expose configVersion in core admin status (Jessica Cheng Mallet via Tomás Fernández Lóbbe)

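For context, a sketch of the query shape that entry describes, written against plain Lucene (factory method names assumed from current Lucene releases; Solr's PointField builds the equivalent internally when a field is both indexed and has doc values):

    // Index-tree execution: fast when the range matches few documents.
    Query pointsQuery = IntPoint.newRangeQuery("price", 10, 20);
    // Doc-values execution: cheaper when the range acts as a verifying filter.
    Query dvQuery = SortedNumericDocValuesField.newSlowRangeQuery("price", 10, 20);
    // Lucene picks the cheaper execution strategy per segment.
    Query q = new IndexOrDocValuesQuery(pointsQuery, dvQuery);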

[36/50] [abbrv] lucene-solr:jira/solr-9858: Avoid infinite loop in TestFuzzyQuery.

Posted by ab...@apache.org.
Avoid infinite loop in TestFuzzyQuery.


Project: http://git-wip-us.apache.org/repos/asf/lucene-solr/repo
Commit: http://git-wip-us.apache.org/repos/asf/lucene-solr/commit/b6c5a8a0
Tree: http://git-wip-us.apache.org/repos/asf/lucene-solr/tree/b6c5a8a0
Diff: http://git-wip-us.apache.org/repos/asf/lucene-solr/diff/b6c5a8a0

Branch: refs/heads/jira/solr-9858
Commit: b6c5a8a0c1c6b93b36a57921b06346b577251439
Parents: 04ba996
Author: Adrien Grand <jp...@gmail.com>
Authored: Tue Feb 28 11:53:50 2017 +0100
Committer: Adrien Grand <jp...@gmail.com>
Committed: Tue Feb 28 11:53:50 2017 +0100

----------------------------------------------------------------------
 .../core/src/test/org/apache/lucene/search/TestFuzzyQuery.java  | 5 ++++-
 1 file changed, 4 insertions(+), 1 deletion(-)
----------------------------------------------------------------------

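The bug pattern, stated generally: a draw-until-unique loop cannot terminate once the requested number of unique values exceeds what the generator can produce, so the requested count must be capped by the vocabulary size. A self-contained illustration, with java.util.Random standing in for the test's randomSimpleString:

    Random r = new Random();
    int vocabularySize = 100;                      // distinct values the generator can emit
    int numTerms = Math.min(150, vocabularySize);  // the cap prevents an endless loop
    Set<String> terms = new HashSet<>();
    while (terms.size() < numTerms) {              // without the cap this may never finish
      terms.add(Integer.toString(r.nextInt(vocabularySize)));
    }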

http://git-wip-us.apache.org/repos/asf/lucene-solr/blob/b6c5a8a0/lucene/core/src/test/org/apache/lucene/search/TestFuzzyQuery.java
----------------------------------------------------------------------
diff --git a/lucene/core/src/test/org/apache/lucene/search/TestFuzzyQuery.java b/lucene/core/src/test/org/apache/lucene/search/TestFuzzyQuery.java
index 62e63ea..ebaf3c0 100644
--- a/lucene/core/src/test/org/apache/lucene/search/TestFuzzyQuery.java
+++ b/lucene/core/src/test/org/apache/lucene/search/TestFuzzyQuery.java
@@ -510,8 +510,11 @@ public class TestFuzzyQuery extends LuceneTestCase {
 
   @SuppressWarnings({"unchecked","rawtypes"})
   public void testRandom() throws Exception {
-    int numTerms = atLeast(100);
     int digits = TestUtil.nextInt(random(), 2, 3);
+    // an underestimate of the total number of unique terms that randomSimpleString
+    // may generate; it assumes all terms have a length of 7
+    int vocabularySize = digits << 7;
+    int numTerms = Math.min(atLeast(100), vocabularySize);
     Set<String> terms = new HashSet<>();
     while (terms.size() < numTerms) {
       terms.add(randomSimpleString(digits));


[18/50] [abbrv] lucene-solr:jira/solr-9858: LUCENE-7707: Use predefined shard index when merging top docs if present.

Posted by ab...@apache.org.
LUCENE-7707: Use predefined shard index when merging top docs if present.

This allows using TopDocs#merge to merge shard responses incrementally
instead of only once all shard responses are present.


Project: http://git-wip-us.apache.org/repos/asf/lucene-solr/repo
Commit: http://git-wip-us.apache.org/repos/asf/lucene-solr/commit/5eeb8136
Tree: http://git-wip-us.apache.org/repos/asf/lucene-solr/tree/5eeb8136
Diff: http://git-wip-us.apache.org/repos/asf/lucene-solr/diff/5eeb8136

Branch: refs/heads/jira/solr-9858
Commit: 5eeb8136f34fc7b3e2157982e5fa42da7f115d76
Parents: 05c17c9
Author: Simon Willnauer <si...@apache.org>
Authored: Fri Feb 24 11:43:24 2017 +0100
Committer: Simon Willnauer <si...@apache.org>
Committed: Fri Feb 24 11:43:24 2017 +0100

----------------------------------------------------------------------
 lucene/CHANGES.txt                              |   4 +
 .../java/org/apache/lucene/search/ScoreDoc.java |   2 +-
 .../java/org/apache/lucene/search/TopDocs.java  | 108 ++++++++++++-------
 .../apache/lucene/search/TestTopDocsMerge.java  |  61 +++++++++++
 4 files changed, 138 insertions(+), 37 deletions(-)
----------------------------------------------------------------------

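A hedged sketch of the incremental merging this enables, using ScoreDoc's shard-aware constructor (names assumed from Lucene's public API; documents and scores are illustrative):

    // Pre-setting ScoreDoc.shardIndex lets a partial merge keep shard identity,
    // so its output can later be merged again with more shard responses.
    ScoreDoc hit0 = new ScoreDoc(5, 2.0f, /* shardIndex */ 0);
    ScoreDoc hit1 = new ScoreDoc(3, 1.5f, /* shardIndex */ 1);
    TopDocs shard0 = new TopDocs(1, new ScoreDoc[] { hit0 }, 2.0f);
    TopDocs shard1 = new TopDocs(1, new ScoreDoc[] { hit1 }, 1.5f);
    TopDocs partial = TopDocs.merge(10, new TopDocs[] { shard0, shard1 });
    // ...later: TopDocs.merge(10, new TopDocs[] { partial, nextShardResponse });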

http://git-wip-us.apache.org/repos/asf/lucene-solr/blob/5eeb8136/lucene/CHANGES.txt
----------------------------------------------------------------------
diff --git a/lucene/CHANGES.txt b/lucene/CHANGES.txt
index e71149b..741418a 100644
--- a/lucene/CHANGES.txt
+++ b/lucene/CHANGES.txt
@@ -172,6 +172,10 @@ Improvements
   earlier than regular queries in order to improve cache efficiency.
   (Adrien Grand)
 
+* LUCENE-7707: Use predefined shard index when merging top docs if present. This
+  allows using TopDocs#merge to merge shard responses incrementally instead of
+  only once all shard responses are present. (Simon Willnauer)
+
 Optimizations
 
 * LUCENE-7641: Optimized point range queries to compute documents that do not

http://git-wip-us.apache.org/repos/asf/lucene-solr/blob/5eeb8136/lucene/core/src/java/org/apache/lucene/search/ScoreDoc.java
----------------------------------------------------------------------
diff --git a/lucene/core/src/java/org/apache/lucene/search/ScoreDoc.java b/lucene/core/src/java/org/apache/lucene/search/ScoreDoc.java
index 69464cf..eb95e29 100644
--- a/lucene/core/src/java/org/apache/lucene/search/ScoreDoc.java
+++ b/lucene/core/src/java/org/apache/lucene/search/ScoreDoc.java
@@ -28,7 +28,7 @@ public class ScoreDoc {
    * @see IndexSearcher#doc(int) */
   public int doc;
 
-  /** Only set by {@link TopDocs#merge} */
+  /** Only set by {@link TopDocs#merge}*/
   public int shardIndex;
 
   /** Constructs a ScoreDoc. */

http://git-wip-us.apache.org/repos/asf/lucene-solr/blob/5eeb8136/lucene/core/src/java/org/apache/lucene/search/TopDocs.java
----------------------------------------------------------------------
diff --git a/lucene/core/src/java/org/apache/lucene/search/TopDocs.java b/lucene/core/src/java/org/apache/lucene/search/TopDocs.java
index c1f825e..2913cb2 100644
--- a/lucene/core/src/java/org/apache/lucene/search/TopDocs.java
+++ b/lucene/core/src/java/org/apache/lucene/search/TopDocs.java
@@ -57,22 +57,54 @@ public class TopDocs {
   }
 
   // Refers to one hit:
-  private static class ShardRef {
+  private final static class ShardRef {
     // Which shard (index into shardHits[]):
     final int shardIndex;
+    final boolean useScoreDocIndex;
 
     // Which hit within the shard:
     int hitIndex;
 
-    public ShardRef(int shardIndex) {
+    ShardRef(int shardIndex, boolean useScoreDocIndex) {
       this.shardIndex = shardIndex;
+      this.useScoreDocIndex = useScoreDocIndex;
     }
 
     @Override
     public String toString() {
       return "ShardRef(shardIndex=" + shardIndex + " hitIndex=" + hitIndex + ")";
     }
-  };
+
+    int getShardIndex(ScoreDoc scoreDoc) {
+      if (useScoreDocIndex) {
+        assert scoreDoc.shardIndex != -1 : "scoreDoc shardIndex must be predefined but wasn't";
+        return scoreDoc.shardIndex;
+      } else {
+        assert scoreDoc.shardIndex == -1 : "scoreDoc shardIndex must be undefined but wasn't";
+        return shardIndex;
+      }
+    }
+  }
+
+  /**
+   * If we need to tie-break because the score / sort values are the same, we first compare
+   * the shard index (lower shard wins), and iff the shard index is the same we use the hit index.
+   */
+  static boolean tieBreakLessThan(ShardRef first, ScoreDoc firstDoc, ShardRef second, ScoreDoc secondDoc) {
+    final int firstShardIndex = first.getShardIndex(firstDoc);
+    final int secondShardIndex = second.getShardIndex(secondDoc);
+    // Tie break: earlier shard wins
+    if (firstShardIndex < secondShardIndex) {
+      return true;
+    } else if (firstShardIndex > secondShardIndex) {
+      return false;
+    } else {
+      // Tie break in same shard: resolve however the
+      // shard had resolved it:
+      assert first.hitIndex != second.hitIndex;
+      return first.hitIndex < second.hitIndex;
+    }
+  }
 
   // Specialized MergeSortQueue that just merges by
   // relevance score, descending:
@@ -91,25 +123,14 @@ public class TopDocs {
     @Override
     public boolean lessThan(ShardRef first, ShardRef second) {
       assert first != second;
-      final float firstScore = shardHits[first.shardIndex][first.hitIndex].score;
-      final float secondScore = shardHits[second.shardIndex][second.hitIndex].score;
-
-      if (firstScore < secondScore) {
+      ScoreDoc firstScoreDoc = shardHits[first.shardIndex][first.hitIndex];
+      ScoreDoc secondScoreDoc = shardHits[second.shardIndex][second.hitIndex];
+      if (firstScoreDoc.score < secondScoreDoc.score) {
         return false;
-      } else if (firstScore > secondScore) {
+      } else if (firstScoreDoc.score > secondScoreDoc.score) {
         return true;
       } else {
-        // Tie break: earlier shard wins
-        if (first.shardIndex < second.shardIndex) {
-          return true;
-        } else if (first.shardIndex > second.shardIndex) {
-          return false;
-        } else {
-          // Tie break in same shard: resolve however the
-          // shard had resolved it:
-          assert first.hitIndex != second.hitIndex;
-          return first.hitIndex < second.hitIndex;
-        }
+        return tieBreakLessThan(first, firstScoreDoc, second, secondScoreDoc);
       }
     }
   }
@@ -172,27 +193,15 @@ public class TopDocs {
           return cmp < 0;
         }
       }
-
-      // Tie break: earlier shard wins
-      if (first.shardIndex < second.shardIndex) {
-        //System.out.println("    return tb true");
-        return true;
-      } else if (first.shardIndex > second.shardIndex) {
-        //System.out.println("    return tb false");
-        return false;
-      } else {
-        // Tie break in same shard: resolve however the
-        // shard had resolved it:
-        //System.out.println("    return tb " + (first.hitIndex < second.hitIndex));
-        assert first.hitIndex != second.hitIndex;
-        return first.hitIndex < second.hitIndex;
-      }
+      return tieBreakLessThan(first, firstFD, second, secondFD);
     }
   }
 
   /** Returns a new TopDocs, containing topN results across
    *  the provided TopDocs, sorting by score. Each {@link TopDocs}
    *  instance must be sorted.
+   *
+   *  @see #merge(int, int, TopDocs[])
    *  @lucene.experimental */
   public static TopDocs merge(int topN, TopDocs[] shardHits) {
     return merge(0, topN, shardHits);
@@ -201,6 +210,10 @@ public class TopDocs {
   /**
    * Same as {@link #merge(int, TopDocs[])} but also ignores the top
    * {@code start} top docs. This is typically useful for pagination.
+   *
+   * Note: This method will fill the {@link ScoreDoc#shardIndex} on all score docs returned iff all ScoreDocs passed
+   * to this method have their shard index set to <tt>-1</tt>. Otherwise the shard index is not set. This allows
+   * predefining the shard index in order to incrementally merge shard responses without losing the original shard index.
    * @lucene.experimental
    */
   public static TopDocs merge(int start, int topN, TopDocs[] shardHits) {
@@ -213,6 +226,7 @@ public class TopDocs {
    *  the same Sort, and sort field values must have been
    *  filled (ie, <code>fillFields=true</code> must be
    *  passed to {@link TopFieldCollector#create}).
+   *  @see #merge(Sort, int, int, TopFieldDocs[])
    * @lucene.experimental */
   public static TopFieldDocs merge(Sort sort, int topN, TopFieldDocs[] shardHits) {
     return merge(sort, 0, topN, shardHits);
@@ -221,6 +235,10 @@ public class TopDocs {
   /**
    * Same as {@link #merge(Sort, int, TopFieldDocs[])} but also ignores the top
    * {@code start} top docs. This is typically useful for pagination.
+   *
+   * Note: This method will fill the {@link ScoreDoc#shardIndex} on all score docs returned iff all ScoreDocs passed
+   * to this method have their shard index set to <tt>-1</tt>. Otherwise the shard index is not set. This allows
+   * predefining the shard index in order to incrementally merge shard responses without losing the original shard index.
    * @lucene.experimental
    */
   public static TopFieldDocs merge(Sort sort, int start, int topN, TopFieldDocs[] shardHits) {
@@ -243,14 +261,26 @@ public class TopDocs {
     int totalHitCount = 0;
     int availHitCount = 0;
     float maxScore = Float.MIN_VALUE;
+    Boolean setShardIndex = null;
     for(int shardIDX=0;shardIDX<shardHits.length;shardIDX++) {
       final TopDocs shard = shardHits[shardIDX];
       // totalHits can be non-zero even if no hits were
       // collected, when searchAfter was used:
       totalHitCount += shard.totalHits;
       if (shard.scoreDocs != null && shard.scoreDocs.length > 0) {
+        if (shard.scoreDocs[0].shardIndex == -1) {
+          if (setShardIndex != null && setShardIndex == false) {
+            throw new IllegalStateException("scoreDocs at index " + shardIDX + " has undefined shard indices but previous scoreDocs were predefined");
+          }
+          setShardIndex = true;
+        } else {
+          if (setShardIndex != null && setShardIndex) {
+            throw new IllegalStateException("scoreDocs at index " + shardIDX + " has predefined shard indices but previous scoreDocs were undefined");
+          }
+          setShardIndex = false;
+        }
         availHitCount += shard.scoreDocs.length;
-        queue.add(new ShardRef(shardIDX));
+        queue.add(new ShardRef(shardIDX, setShardIndex == false));
         maxScore = Math.max(maxScore, shard.getMaxScore());
         //System.out.println("  maxScore now " + maxScore + " vs " + shard.getMaxScore());
       }
@@ -272,7 +302,13 @@ public class TopDocs {
         assert queue.size() > 0;
         ShardRef ref = queue.top();
         final ScoreDoc hit = shardHits[ref.shardIndex].scoreDocs[ref.hitIndex++];
-        hit.shardIndex = ref.shardIndex;
+        if (setShardIndex) {
+          // Unless this index is already initialized, potentially due to multiple merge phases
+          // or explicitly by the user, we set the shard index to the index of the TopDocs array
+          // this hit is coming from. This allows multiple merge phases if needed but requires
+          // extra accounting on the user's end. At the same time this is fully backwards
+          // compatible since the value was initialized to -1 from the beginning.
+          hit.shardIndex = ref.shardIndex;
+        }
         if (hitUpto >= start) {
           hits[hitUpto - start] = hit;
         }

http://git-wip-us.apache.org/repos/asf/lucene-solr/blob/5eeb8136/lucene/core/src/test/org/apache/lucene/search/TestTopDocsMerge.java
----------------------------------------------------------------------
diff --git a/lucene/core/src/test/org/apache/lucene/search/TestTopDocsMerge.java b/lucene/core/src/test/org/apache/lucene/search/TestTopDocsMerge.java
index a5eafad..37c61a4 100644
--- a/lucene/core/src/test/org/apache/lucene/search/TestTopDocsMerge.java
+++ b/lucene/core/src/test/org/apache/lucene/search/TestTopDocsMerge.java
@@ -30,6 +30,7 @@ import org.apache.lucene.index.RandomIndexWriter;
 import org.apache.lucene.index.ReaderUtil;
 import org.apache.lucene.index.Term;
 import org.apache.lucene.store.Directory;
+import org.apache.lucene.util.ArrayUtil;
 import org.apache.lucene.util.BytesRef;
 import org.apache.lucene.util.LuceneTestCase;
 import org.apache.lucene.util.TestUtil;
@@ -37,7 +38,9 @@ import org.apache.lucene.util.TestUtil;
 import java.io.IOException;
 import java.util.ArrayList;
 import java.util.Collections;
+import java.util.HashMap;
 import java.util.List;
+import java.util.Map;
 
 public class TestTopDocsMerge extends LuceneTestCase {
 
@@ -72,6 +75,64 @@ public class TestTopDocsMerge extends LuceneTestCase {
     testSort(true);
   }
 
+  public void testInconsistentTopDocsFail() {
+    TopDocs[] topDocs = new TopDocs[] {
+        new TopDocs(1, new ScoreDoc[] { new ScoreDoc(1, 1.0f, 1) }),
+        new TopDocs(1, new ScoreDoc[] { new ScoreDoc(1, 1.0f, -1) })
+    };
+    if (random().nextBoolean()) {
+      ArrayUtil.swap(topDocs, 0, 1);
+    }
+    expectThrows(IllegalStateException.class, () -> {
+      TopDocs.merge(0, 1, topDocs);
+    });
+  }
+
+  public void testAssignShardIndex() {
+    boolean useConstantScore = random().nextBoolean();
+    int numTopDocs = 2 + random().nextInt(10);
+    ArrayList<TopDocs> topDocs = new ArrayList<>(numTopDocs);
+    Map<Integer, TopDocs> shardResultMapping = new HashMap<>();
+    int numHitsTotal = 0;
+    for (int i = 0; i < numTopDocs; i++) {
+      int numHits = 1 + random().nextInt(10);
+      numHitsTotal += numHits;
+      ScoreDoc[] scoreDocs = new ScoreDoc[numHits];
+      for (int j = 0; j < scoreDocs.length; j++) {
+        float score = useConstantScore ? 1.0f : random().nextFloat();
+        scoreDocs[j] = new ScoreDoc((100 * i) + j, score , i);
+        // we set the shard index to index in the list here but shuffle the entire list below
+      }
+      topDocs.add(new TopDocs(numHits, scoreDocs));
+      shardResultMapping.put(i, topDocs.get(i));
+    }
+    // shuffle the entire thing such that we don't get 1 to 1 mapping of shard index to index in the array
+    // -- well likely ;)
+    Collections.shuffle(topDocs, random());
+    final int from = random().nextInt(numHitsTotal-1);
+    final int size = 1 + random().nextInt(numHitsTotal - from);
+    TopDocs merge = TopDocs.merge(from, size, topDocs.toArray(new TopDocs[0]));
+    assertTrue(merge.scoreDocs.length > 0);
+    for (ScoreDoc scoreDoc : merge.scoreDocs) {
+      assertTrue(scoreDoc.shardIndex != -1);
+      TopDocs shardTopDocs = shardResultMapping.get(scoreDoc.shardIndex);
+      assertNotNull(shardTopDocs);
+      boolean found = false;
+      for (ScoreDoc shardScoreDoc : shardTopDocs.scoreDocs) {
+        if (shardScoreDoc == scoreDoc) {
+          found = true;
+          break;
+        }
+      }
+      assertTrue(found);
+    }
+
+    // now ensure merge is stable even if we use our own shard IDs
+    Collections.shuffle(topDocs, random());
+    TopDocs merge2 = TopDocs.merge(from, size, topDocs.toArray(new TopDocs[0]));
+    assertArrayEquals(merge.scoreDocs, merge2.scoreDocs);
+  }
+
   void testSort(boolean useFrom) throws Exception {
 
     IndexReader reader = null;


[06/50] [abbrv] lucene-solr:jira/solr-9858: SOLR-9855: DynamicInterceptor in HttpClientUtil uses synchronization that can deadlock and puts a global mutex around per-request process calls.

Posted by ab...@apache.org.
SOLR-9855: DynamicInterceptor in HttpClientUtil uses synchronization that can deadlock and puts a global mutex around per-request process calls.
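
For context on why CopyOnWriteArrayList is the right container here: Collections.synchronizedList serializes every call on a single monitor, so traversing the list while interceptors run arbitrary code holds that mutex across the whole per-request process call and can deadlock against other lock orders. A minimal illustration with plain Runnable interceptors (not the HttpRequestInterceptor type):

  import java.util.List;
  import java.util.concurrent.CopyOnWriteArrayList;

  public class InterceptorListSketch {
    public static void main(String[] args) {
      // forEach works on an immutable snapshot: no monitor is held while each
      // interceptor runs, and concurrent registration can never block or
      // deadlock the traversal.
      List<Runnable> interceptors = new CopyOnWriteArrayList<>();
      interceptors.add(() -> System.out.println("intercepting request"));
      interceptors.forEach(Runnable::run);                                    // lock-free traversal
      interceptors.add(() -> System.out.println("registered concurrently"));  // safe at any time
    }
  }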


Project: http://git-wip-us.apache.org/repos/asf/lucene-solr/repo
Commit: http://git-wip-us.apache.org/repos/asf/lucene-solr/commit/2f82409e
Tree: http://git-wip-us.apache.org/repos/asf/lucene-solr/tree/2f82409e
Diff: http://git-wip-us.apache.org/repos/asf/lucene-solr/diff/2f82409e

Branch: refs/heads/jira/solr-9858
Commit: 2f82409e5b3a90363941caa3767c3de2abecdaf0
Parents: be64c26
Author: markrmiller <ma...@apache.org>
Authored: Wed Feb 22 13:01:21 2017 -0500
Committer: markrmiller <ma...@apache.org>
Committed: Wed Feb 22 14:44:18 2017 -0500

----------------------------------------------------------------------
 .../org/apache/solr/client/solrj/impl/HttpClientUtil.java    | 8 +++++---
 1 file changed, 5 insertions(+), 3 deletions(-)
----------------------------------------------------------------------


http://git-wip-us.apache.org/repos/asf/lucene-solr/blob/2f82409e/solr/solrj/src/java/org/apache/solr/client/solrj/impl/HttpClientUtil.java
----------------------------------------------------------------------
diff --git a/solr/solrj/src/java/org/apache/solr/client/solrj/impl/HttpClientUtil.java b/solr/solrj/src/java/org/apache/solr/client/solrj/impl/HttpClientUtil.java
index 7ee90e1..7a84e7f 100644
--- a/solr/solrj/src/java/org/apache/solr/client/solrj/impl/HttpClientUtil.java
+++ b/solr/solrj/src/java/org/apache/solr/client/solrj/impl/HttpClientUtil.java
@@ -19,10 +19,9 @@ package org.apache.solr.client.solrj.impl;
 import java.io.IOException;
 import java.io.InputStream;
 import java.lang.invoke.MethodHandles;
-import java.util.ArrayList;
-import java.util.Collections;
 import java.util.List;
 import java.util.Optional;
+import java.util.concurrent.CopyOnWriteArrayList;
 import java.util.concurrent.TimeUnit;
 import java.util.function.Consumer;
 import java.util.zip.GZIPInputStream;
@@ -128,7 +127,7 @@ public class HttpClientUtil {
 
   private static volatile SchemaRegistryProvider schemaRegistryProvider;
   private static volatile String cookiePolicy;
-  private static final List<HttpRequestInterceptor> interceptors = Collections.synchronizedList(new ArrayList<HttpRequestInterceptor>());
+  private static final List<HttpRequestInterceptor> interceptors = new CopyOnWriteArrayList<>();
 
 
   static {
@@ -156,6 +155,9 @@ public class HttpClientUtil {
 
     @Override
     public void process(HttpRequest request, HttpContext context) throws HttpException, IOException {
+      // don't synchronize traversal - can lead to deadlock - CopyOnWriteArrayList is critical
+      // we also do not want to have to acquire the mutex when the list is empty or put a global
+      // mutex around the process calls
       interceptors.forEach(new Consumer<HttpRequestInterceptor>() {
 
         @Override


[23/50] [abbrv] lucene-solr:jira/solr-9858: LUCENE-7708: Fix position length attribute set by the ShingleFilter when outputUnigrams=false

Posted by ab...@apache.org.
LUCENE-7708: Fix position length attribute set by the ShingleFilter when outputUnigrams=false


Project: http://git-wip-us.apache.org/repos/asf/lucene-solr/repo
Commit: http://git-wip-us.apache.org/repos/asf/lucene-solr/commit/57a42e4e
Tree: http://git-wip-us.apache.org/repos/asf/lucene-solr/tree/57a42e4e
Diff: http://git-wip-us.apache.org/repos/asf/lucene-solr/diff/57a42e4e

Branch: refs/heads/jira/solr-9858
Commit: 57a42e4ec54aebac40c1ef7dc93d933cd00dbe1e
Parents: cab3aae
Author: Jim Ferenczi <ji...@elastic.co>
Authored: Fri Feb 24 23:37:37 2017 +0100
Committer: Jim Ferenczi <ji...@elastic.co>
Committed: Fri Feb 24 23:37:37 2017 +0100

----------------------------------------------------------------------
 lucene/CHANGES.txt                              |  4 +
 .../lucene/analysis/shingle/ShingleFilter.java  |  7 +-
 .../analysis/shingle/ShingleFilterTest.java     | 94 +++++++++++++++++++-
 3 files changed, 102 insertions(+), 3 deletions(-)
----------------------------------------------------------------------


http://git-wip-us.apache.org/repos/asf/lucene-solr/blob/57a42e4e/lucene/CHANGES.txt
----------------------------------------------------------------------
diff --git a/lucene/CHANGES.txt b/lucene/CHANGES.txt
index 1d45ab8..c119eaa 100644
--- a/lucene/CHANGES.txt
+++ b/lucene/CHANGES.txt
@@ -155,6 +155,10 @@ Bug Fixes
   token graph, messing up phrase queries when it was used during query
   parsing (Ere Maijala via Mike McCandless)
 
+* LUCENE-7708: ShingleFilter without unigram was producing a disconnected
+  token graph, messing up queries when it was used during query
+  parsing (Jim Ferenczi)
+
 Improvements
 
 * LUCENE-7055: Added Weight#scorerSupplier, which allows to estimate the cost

http://git-wip-us.apache.org/repos/asf/lucene-solr/blob/57a42e4e/lucene/analysis/common/src/java/org/apache/lucene/analysis/shingle/ShingleFilter.java
----------------------------------------------------------------------
diff --git a/lucene/analysis/common/src/java/org/apache/lucene/analysis/shingle/ShingleFilter.java b/lucene/analysis/common/src/java/org/apache/lucene/analysis/shingle/ShingleFilter.java
index 5d99291..e3fa803 100644
--- a/lucene/analysis/common/src/java/org/apache/lucene/analysis/shingle/ShingleFilter.java
+++ b/lucene/analysis/common/src/java/org/apache/lucene/analysis/shingle/ShingleFilter.java
@@ -343,7 +343,12 @@ public final class ShingleFilter extends TokenFilter {
           noShingleOutput = false;
         }
         offsetAtt.setOffset(offsetAtt.startOffset(), nextToken.offsetAtt.endOffset());
-        posLenAtt.setPositionLength(builtGramSize);
+        if (outputUnigrams) {
+          posLenAtt.setPositionLength(builtGramSize);
+        } else {
+          // position length for this token is the number of positions created by shingles of smaller size.
+          posLenAtt.setPositionLength(Math.max(1, (builtGramSize - minShingleSize) + 1));
+        }
         isOutputHere = true;
         gramSize.advance();
         tokenAvailable = true;
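
A worked instance of the rule above, assuming minShingleSize=2 and unigrams disabled: the smallest shingles are what create positions in the graph, so an n-gram must span the positions of the (n - minShingleSize + 1) smallest shingles starting at the same token. A tiny sketch of the arithmetic for a 2..4 shingle configuration:

  public class ShinglePosLenSketch {
    public static void main(String[] args) {
      int minShingleSize = 2;
      for (int builtGramSize = minShingleSize; builtGramSize <= 4; builtGramSize++) {
        int posLen = Math.max(1, (builtGramSize - minShingleSize) + 1);
        System.out.println(builtGramSize + "-gram -> positionLength " + posLen); // 1, 2, 3
      }
    }
  }

These are exactly the position length values the updated ShingleFilterTest below expects for the 2..4 case.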

http://git-wip-us.apache.org/repos/asf/lucene-solr/blob/57a42e4e/lucene/analysis/common/src/test/org/apache/lucene/analysis/shingle/ShingleFilterTest.java
----------------------------------------------------------------------
diff --git a/lucene/analysis/common/src/test/org/apache/lucene/analysis/shingle/ShingleFilterTest.java b/lucene/analysis/common/src/test/org/apache/lucene/analysis/shingle/ShingleFilterTest.java
index 192de38..5645900 100644
--- a/lucene/analysis/common/src/test/org/apache/lucene/analysis/shingle/ShingleFilterTest.java
+++ b/lucene/analysis/common/src/test/org/apache/lucene/analysis/shingle/ShingleFilterTest.java
@@ -30,7 +30,7 @@ import org.apache.lucene.analysis.TokenStream;
 import org.apache.lucene.analysis.Tokenizer;
 import org.apache.lucene.analysis.core.KeywordTokenizer;
 import org.apache.lucene.analysis.core.WhitespaceTokenizer;
-import org.apache.lucene.analysis.tokenattributes.*;
+import org.apache.lucene.analysis.tokenattributes.TypeAttribute;
 
 public class ShingleFilterTest extends BaseTokenStreamTestCase {
 
@@ -1239,7 +1239,6 @@ public class ShingleFilterTest extends BaseTokenStreamTestCase {
     filter = new ShingleFilter(new CannedTokenStream(2, 20, inputTokens), 2, 3);
     filter.setFillerToken(null);
     filter.setTokenSeparator(null);
-
     assertTokenStreamContents(filter,
         new String[] {"purple", "purplewizard", "purplewizard", "wizard", "wizard", "wizard"},
         new int[] {0, 0, 0, 7, 7, 7},
@@ -1247,4 +1246,95 @@ public class ShingleFilterTest extends BaseTokenStreamTestCase {
         new int[] {1, 0, 0, 1, 0, 0},
         20);
   }
+
+  public void testPositionLength() throws Exception {
+    Analyzer a = new Analyzer() {
+      @Override
+      protected TokenStreamComponents createComponents(String fieldName) {
+        Tokenizer tokenizer = new MockTokenizer(MockTokenizer.WHITESPACE, false);
+        ShingleFilter filter = new ShingleFilter(tokenizer, 4, 4);
+        filter.setOutputUnigrams(false);
+        return new TokenStreamComponents(tokenizer, filter);
+      }
+    };
+    assertTokenStreamContents(a.tokenStream("", "to be or not to be"),
+        new String[] {"to be or not", "be or not to", "or not to be"},
+        new int[] {0, 3, 6},
+        new int[] {12, 15, 18},
+        null,
+        new int[] {1, 1, 1},
+        new int[] {1, 1, 1},
+        18,
+        // offsets are correct but assertTokenStreamContents does not handle multiple terms with different offsets
+        // finishing at the same position
+        false);
+
+
+    a = new Analyzer() {
+      @Override
+      protected TokenStreamComponents createComponents(String fieldName) {
+        Tokenizer tokenizer = new MockTokenizer(MockTokenizer.WHITESPACE, false);
+        ShingleFilter filter = new ShingleFilter(tokenizer, 2, 4);
+        filter.setOutputUnigrams(false);
+        return new TokenStreamComponents(tokenizer, filter);
+      }
+    };
+    assertTokenStreamContents(a.tokenStream("", "to be or not to be"),
+        new String[] {"to be", "to be or", "to be or not", "be or", "be or not", "be or not to", "or not", "or not to",
+            "or not to be", "not to", "not to be", "to be"},
+        new int[] {0, 0, 0, 3, 3, 3, 6, 6, 6, 9, 9, 13},
+        new int[] {5, 8, 12, 8, 12, 15, 12, 15, 18, 15, 18, 18},
+        null,
+        new int[] {1, 0, 0, 1, 0, 0, 1, 0, 0, 1, 0, 1},
+        new int[] {1, 2, 3, 1, 2, 3, 1, 2, 3, 1, 2, 1},
+        18,
+        // offsets are correct but assertTokenStreamContents does not handle multiple terms with different offsets
+        // finishing at the same position
+        false);
+
+    a = new Analyzer() {
+      @Override
+      protected TokenStreamComponents createComponents(String fieldName) {
+        Tokenizer tokenizer = new MockTokenizer(MockTokenizer.WHITESPACE, false);
+        ShingleFilter filter = new ShingleFilter(tokenizer, 3, 4);
+        filter.setOutputUnigrams(false);
+        return new TokenStreamComponents(tokenizer, filter);
+      }
+    };
+
+    assertTokenStreamContents(a.tokenStream("", "to be or not to be"),
+        new String[] {"to be or", "to be or not", "be or not", "be or not to", "or not to",
+            "or not to be", "not to be"},
+        new int[] {0, 0, 3, 3, 6, 6, 9},
+        new int[] {8, 12, 12, 15, 15, 18, 18},
+        null,
+        new int[] {1, 0, 1, 0, 1, 0, 1, 0},
+        new int[] {1, 2, 1, 2, 1, 2, 1, 2},
+        18,
+        // offsets are correct but assertTokenStreamContents does not handle multiple terms with different offsets
+        // finishing at the same position
+        false);
+
+    a = new Analyzer() {
+      @Override
+      protected TokenStreamComponents createComponents(String fieldName) {
+        Tokenizer tokenizer = new MockTokenizer(MockTokenizer.WHITESPACE, false);
+        ShingleFilter filter = new ShingleFilter(tokenizer, 3, 5);
+        filter.setOutputUnigrams(false);
+        return new TokenStreamComponents(tokenizer, filter);
+      }
+    };
+    assertTokenStreamContents(a.tokenStream("", "to be or not to be"),
+        new String[] {"to be or", "to be or not", "to be or not to", "be or not", "be or not to",
+            "be or not to be", "or not to", "or not to be", "not to be"},
+        new int[] {0, 0, 0, 3, 3, 3, 6, 6, 9, 9},
+        new int[] {8, 12, 15, 12, 15, 18, 15, 18, 18},
+        null,
+        new int[] {1, 0, 0, 1, 0, 0, 1, 0, 1, 0},
+        new int[] {1, 2, 3, 1, 2, 3, 1, 2, 1},
+        18,
+        // offsets are correct but assertTokenStreamContents does not handle multiple terms with different offsets
+        // finishing at the same position
+        false);
+  }
 }


[31/50] [abbrv] lucene-solr:jira/solr-9858: SOLR-10208: Adjust scoring formula for the scoreNodes function

Posted by ab...@apache.org.
SOLR-10208: Adjust scoring formula for the scoreNodes function
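
The effect of the adjustment: raw term frequency is replaced by log(termFreq)+1, so one very frequent co-occurrence no longer dominates the node score. A quick numeric comparison under assumed values (termFreq=100, docFreq=3, numDocs=1000 are illustrative, not taken from the tests):

  public class ScoreNodesFormulaSketch {
    public static void main(String[] args) {
      double termFreq = 100, docFreq = 3, numDocs = 1000;
      double idf = Math.log((numDocs + 1) / (docFreq + 1)) + 1.0;
      double oldScore = termFreq * idf;                   // ~652.2: tf dominates
      double newScore = (Math.log(termFreq) + 1.0) * idf; // ~36.6: tf is damped
      System.out.println(oldScore + " vs " + newScore);
    }
  }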


Project: http://git-wip-us.apache.org/repos/asf/lucene-solr/repo
Commit: http://git-wip-us.apache.org/repos/asf/lucene-solr/commit/0c1fde66
Tree: http://git-wip-us.apache.org/repos/asf/lucene-solr/tree/0c1fde66
Diff: http://git-wip-us.apache.org/repos/asf/lucene-solr/diff/0c1fde66

Branch: refs/heads/jira/solr-9858
Commit: 0c1fde664fb1c9456b3fdc2abd08e80dc8f86eb8
Parents: a248e6e
Author: Joel Bernstein <jb...@apache.org>
Authored: Mon Feb 27 12:03:03 2017 -0500
Committer: Joel Bernstein <jb...@apache.org>
Committed: Mon Feb 27 12:03:26 2017 -0500

----------------------------------------------------------------------
 .../solr/client/solrj/io/stream/ScoreNodesStream.java    |  2 +-
 .../client/solrj/io/stream/SignificantTermsStream.java   |  2 +-
 .../solr/client/solrj/io/graph/GraphExpressionTest.java  | 11 +++++------
 3 files changed, 7 insertions(+), 8 deletions(-)
----------------------------------------------------------------------


http://git-wip-us.apache.org/repos/asf/lucene-solr/blob/0c1fde66/solr/solrj/src/java/org/apache/solr/client/solrj/io/stream/ScoreNodesStream.java
----------------------------------------------------------------------
diff --git a/solr/solrj/src/java/org/apache/solr/client/solrj/io/stream/ScoreNodesStream.java b/solr/solrj/src/java/org/apache/solr/client/solrj/io/stream/ScoreNodesStream.java
index 41e7197..f394424 100644
--- a/solr/solrj/src/java/org/apache/solr/client/solrj/io/stream/ScoreNodesStream.java
+++ b/solr/solrj/src/java/org/apache/solr/client/solrj/io/stream/ScoreNodesStream.java
@@ -237,7 +237,7 @@ public class ScoreNodesStream extends TupleStream implements Expressible
             throw new Exception("termFreq field not present in the Tuple");
           }
           Number termFreqValue = (Number)tuple.get(termFreq);
-          float score = termFreqValue.floatValue() * (float) (Math.log((numDocs + 1) / (docFreq.doubleValue() + 1)) + 1.0);
+          float score = (float)(Math.log(termFreqValue.floatValue())+1.0) * (float) (Math.log((numDocs + 1) / (docFreq.doubleValue() + 1)) + 1.0);
           tuple.put("nodeScore", score);
           tuple.put("docFreq", docFreq);
           tuple.put("numDocs", numDocs);

http://git-wip-us.apache.org/repos/asf/lucene-solr/blob/0c1fde66/solr/solrj/src/java/org/apache/solr/client/solrj/io/stream/SignificantTermsStream.java
----------------------------------------------------------------------
diff --git a/solr/solrj/src/java/org/apache/solr/client/solrj/io/stream/SignificantTermsStream.java b/solr/solrj/src/java/org/apache/solr/client/solrj/io/stream/SignificantTermsStream.java
index f5f8a06..87b5a9f 100644
--- a/solr/solrj/src/java/org/apache/solr/client/solrj/io/stream/SignificantTermsStream.java
+++ b/solr/solrj/src/java/org/apache/solr/client/solrj/io/stream/SignificantTermsStream.java
@@ -358,7 +358,7 @@ public class SignificantTermsStream extends TupleStream implements Expressible{
           map.put("background", freqs[0]);
           map.put("foreground", freqs[1]);
 
-          float score = (float)Math.log(freqs[1]) * (float) (Math.log(((float)(numDocs + 1)) / (freqs[0] + 1)) + 1.0);
+          float score = (float)(Math.log(freqs[1])+1.0) * (float) (Math.log(((float)(numDocs + 1)) / (freqs[0] + 1)) + 1.0);
 
           map.put("score", score);
           maps.add(map);

http://git-wip-us.apache.org/repos/asf/lucene-solr/blob/0c1fde66/solr/solrj/src/test/org/apache/solr/client/solrj/io/graph/GraphExpressionTest.java
----------------------------------------------------------------------
diff --git a/solr/solrj/src/test/org/apache/solr/client/solrj/io/graph/GraphExpressionTest.java b/solr/solrj/src/test/org/apache/solr/client/solrj/io/graph/GraphExpressionTest.java
index cf07058..33781ef 100644
--- a/solr/solrj/src/test/org/apache/solr/client/solrj/io/graph/GraphExpressionTest.java
+++ b/solr/solrj/src/test/org/apache/solr/client/solrj/io/graph/GraphExpressionTest.java
@@ -521,10 +521,10 @@ public class GraphExpressionTest extends SolrCloudTestCase {
         .add(id, "3", "basket_s", "basket2", "product_ss", "product1", "product_ss", "product6", "product_ss", "product7", "price_f", "1")
         .add(id, "6", "basket_s", "basket3", "product_ss", "product4",  "product_ss","product3", "product_ss","product1", "price_f", "1")
         .add(id, "9", "basket_s", "basket4", "product_ss", "product4", "product_ss", "product3", "product_ss", "product1","price_f", "1")
-        .add(id, "12", "basket_s", "basket5", "product_ss", "product1", "price_f", "1")
-        .add(id, "13", "basket_s", "basket6", "product_ss", "product1", "price_f", "1")
-        .add(id, "14", "basket_s", "basket7", "product_ss", "product1", "price_f", "1")
-        .add(id, "15", "basket_s", "basket4", "product_ss", "product1", "price_f", "1")
+        //.add(id, "12", "basket_s", "basket5", "product_ss", "product1", "price_f", "1")
+        //.add(id, "13", "basket_s", "basket6", "product_ss", "product1", "price_f", "1")
+        //.add(id, "14", "basket_s", "basket7", "product_ss", "product1", "price_f", "1")
+        //.add(id, "15", "basket_s", "basket4", "product_ss", "product1", "price_f", "1")
         .commit(cluster.getSolrClient(), COLLECTION);
 
     List<Tuple> tuples = null;
@@ -557,7 +557,6 @@ public class GraphExpressionTest extends SolrCloudTestCase {
     stream.setStreamContext(context);
     tuples = getTuples(stream);
 
-    //The highest scoring tuple will be the product searched for.
     Tuple tuple = tuples.get(0);
     assert(tuple.getString("node").equals("product3"));
     assert(tuple.getLong("docFreq") == 3);
@@ -570,7 +569,7 @@ public class GraphExpressionTest extends SolrCloudTestCase {
 
     Tuple tuple1 = tuples.get(2);
     assert(tuple1.getString("node").equals("product1"));
-    assert(tuple1.getLong("docFreq") == 8);
+    assert(tuple1.getLong("docFreq") == 4);
     assert(tuple1.getLong("count(*)") == 3);
 
     Tuple tuple2 = tuples.get(3);


[34/50] [abbrv] lucene-solr:jira/solr-9858: SOLR-10207: Harden CleanupOldIndexTest.

Posted by ab...@apache.org.
SOLR-10207: Harden CleanupOldIndexTest.


Project: http://git-wip-us.apache.org/repos/asf/lucene-solr/repo
Commit: http://git-wip-us.apache.org/repos/asf/lucene-solr/commit/ed0f0f45
Tree: http://git-wip-us.apache.org/repos/asf/lucene-solr/tree/ed0f0f45
Diff: http://git-wip-us.apache.org/repos/asf/lucene-solr/diff/ed0f0f45

Branch: refs/heads/jira/solr-9858
Commit: ed0f0f45ce17e2218ec2e97aab2fcb1a0d4defa6
Parents: 86b5b63
Author: markrmiller <ma...@apache.org>
Authored: Mon Feb 27 22:55:10 2017 -0500
Committer: markrmiller <ma...@apache.org>
Committed: Mon Feb 27 23:33:47 2017 -0500

----------------------------------------------------------------------
 .../apache/solr/cloud/CleanupOldIndexTest.java    | 18 ++++++++++++++----
 1 file changed, 14 insertions(+), 4 deletions(-)
----------------------------------------------------------------------


http://git-wip-us.apache.org/repos/asf/lucene-solr/blob/ed0f0f45/solr/core/src/test/org/apache/solr/cloud/CleanupOldIndexTest.java
----------------------------------------------------------------------
diff --git a/solr/core/src/test/org/apache/solr/cloud/CleanupOldIndexTest.java b/solr/core/src/test/org/apache/solr/cloud/CleanupOldIndexTest.java
index 006c3c9..cc03a25 100644
--- a/solr/core/src/test/org/apache/solr/cloud/CleanupOldIndexTest.java
+++ b/solr/core/src/test/org/apache/solr/cloud/CleanupOldIndexTest.java
@@ -30,6 +30,7 @@ import org.apache.solr.common.cloud.DocCollection;
 import org.apache.solr.core.CoreContainer;
 import org.apache.solr.core.SolrCore;
 import org.apache.solr.handler.SnapShooter;
+import org.junit.AfterClass;
 import org.junit.BeforeClass;
 import org.junit.Test;
 
@@ -44,6 +45,15 @@ public class CleanupOldIndexTest extends SolrCloudTestCase {
         .addConfig("conf1", TEST_PATH().resolve("configsets").resolve("cloud-dynamic").resolve("conf"))
         .configure();
   }
+  
+  @AfterClass
+  public static void teardownTestCases() throws Exception {
+
+    if (suiteFailureMarker.wasSuccessful()) {
+      zkClient().printLayoutToStdOut();
+    }
+
+  }
 
   private static final String COLLECTION = "oldindextest";
 
@@ -54,14 +64,14 @@ public class CleanupOldIndexTest extends SolrCloudTestCase {
         .processAndWait(cluster.getSolrClient(), DEFAULT_TIMEOUT);
     cluster.getSolrClient().setDefaultCollection(COLLECTION); // TODO make this configurable on StoppableIndexingThread
 
-    int[] maxDocList = new int[] {300, 700, 1200};
+    int[] maxDocList = new int[] {300, 500, 700};
     int maxDoc = maxDocList[random().nextInt(maxDocList.length - 1)];
 
     StoppableIndexingThread indexThread = new StoppableIndexingThread(null, cluster.getSolrClient(), "1", true, maxDoc, 1, true);
     indexThread.start();
 
     // give some time to index...
-    int[] waitTimes = new int[] {200, 2000, 3000};
+    int[] waitTimes = new int[] {3000, 4000};
     Thread.sleep(waitTimes[random().nextInt(waitTimes.length - 1)]);
 
     // create some "old" index directories
@@ -86,13 +96,13 @@ public class CleanupOldIndexTest extends SolrCloudTestCase {
     assertTrue(oldIndexDir2.isDirectory());
 
     // bring shard replica down
-    jetty.stop();
+    ChaosMonkey.stop(jetty);
 
     // wait a moment - lets allow some docs to be indexed so replication time is non 0
     Thread.sleep(waitTimes[random().nextInt(waitTimes.length - 1)]);
 
     // bring shard replica up
-    jetty.start();
+    ChaosMonkey.start(jetty);
 
     // make sure replication can start
     Thread.sleep(3000);


[14/50] [abbrv] lucene-solr:jira/solr-9858: SOLR-9450: The docs/ folder in the binary distribution now contains a single index.html file linking to the online documentation, reducing the size of the download

Posted by ab...@apache.org.
SOLR-9450: The docs/ folder in the binary distribution now contains a single index.html file linking to the online documentation, reducing the size of the download


Project: http://git-wip-us.apache.org/repos/asf/lucene-solr/repo
Commit: http://git-wip-us.apache.org/repos/asf/lucene-solr/commit/894a43b2
Tree: http://git-wip-us.apache.org/repos/asf/lucene-solr/tree/894a43b2
Diff: http://git-wip-us.apache.org/repos/asf/lucene-solr/diff/894a43b2

Branch: refs/heads/jira/solr-9858
Commit: 894a43b259a72a82f07649b0d93ab3c17c4d89c4
Parents: 3ad6e41
Author: Uwe Schindler <us...@apache.org>
Authored: Thu Feb 23 15:43:45 2017 +0100
Committer: Uwe Schindler <us...@apache.org>
Committed: Thu Feb 23 15:43:45 2017 +0100

----------------------------------------------------------------------
 solr/CHANGES.txt          |  6 ++--
 solr/README.txt           |  2 +-
 solr/build.xml            | 21 +++++++++++--
 solr/common-build.xml     | 18 ++++++++++-
 solr/site/online-link.xsl | 69 ++++++++++++++++++++++++++++++++++++++++++
 5 files changed, 109 insertions(+), 7 deletions(-)
----------------------------------------------------------------------


http://git-wip-us.apache.org/repos/asf/lucene-solr/blob/894a43b2/solr/CHANGES.txt
----------------------------------------------------------------------
diff --git a/solr/CHANGES.txt b/solr/CHANGES.txt
index fc5bfe1..9ece4f8 100644
--- a/solr/CHANGES.txt
+++ b/solr/CHANGES.txt
@@ -246,6 +246,9 @@ Other Changes
 
 * SOLR-10020: Cannot reload a core if it fails initialization. (Mike Drob via Erick Erickson)
 
+* SOLR-9450: The docs/ folder in the binary distribution now contains a single index.html file linking
+  to the online documentation, reducing the size of the download (janhoy, Shawn Heisey, Uwe Schindler)
+
 ==================  6.4.2 ==================
 
 Consult the LUCENE_CHANGES.txt file for additional, low level, changes in this release.
@@ -697,7 +700,7 @@ Upgrade Notes
 
 New Features
 ----------------------
-* SOLR-5725: facet.method=enum can bypass exact counts calculation with facet.exists=true, it just returns 1 for 
+* SOLR-5725: facet.method=enum can bypass exact counts calculation with facet.exists=true, it just returns 1 for
   terms which exists in result docset. (Alexey Kozhemiakin, Sebastian Koziel, Radoslaw Zielinski via Mikhail Khludnev)
 
 * SOLR-9127: Excel workbook (.xlsx) response writer. use 'wt=xlsx' (Tony Moriarty, noble)
@@ -974,7 +977,6 @@ Other Changes
 * SOLR-9371: Fix bin/solr calculations for start/stop wait time and RMI_PORT.
   (Shawn Heisey via Erick Erickson)
 
-
 ==================  6.2.1 ==================
 
 Bug Fixes

http://git-wip-us.apache.org/repos/asf/lucene-solr/blob/894a43b2/solr/README.txt
----------------------------------------------------------------------
diff --git a/solr/README.txt b/solr/README.txt
index 3e7a09c..4ef5eac 100644
--- a/solr/README.txt
+++ b/solr/README.txt
@@ -118,7 +118,7 @@ dist/solr-<component>-XX.jar
   for more information).
 
 docs/index.html
-  The Apache Solr Javadoc API documentation and Tutorial
+  A link to the online version of Apache Solr Javadoc API documentation and Tutorial
 
 
 Instructions for Building Apache Solr from Source

http://git-wip-us.apache.org/repos/asf/lucene-solr/blob/894a43b2/solr/build.xml
----------------------------------------------------------------------
diff --git a/solr/build.xml b/solr/build.xml
index b900aed..b176585 100644
--- a/solr/build.xml
+++ b/solr/build.xml
@@ -209,6 +209,21 @@
       </filterchain>
     </copy>
   </target>
+  
+  <target name="documentation-online" description="Generate a link to the online documentation"
+      depends="define-solr-javadoc-url">
+    <xslt in="${ant.file}" out="${javadoc-online.dir}/index.html" style="site/online-link.xsl" force="true">
+      <outputproperty name="method" value="html"/>
+      <outputproperty name="version" value="4.0"/>
+      <outputproperty name="encoding" value="UTF-8"/>
+      <outputproperty name="indent" value="yes"/>
+      <param name="version" expression="${version}"/>
+      <param name="solrJavadocUrl" expression="${solr.javadoc.url}"/>
+    </xslt>
+    <copy todir="${javadoc-online.dir}">
+      <fileset dir="site/assets" includes="**/solr.svg"/>
+    </copy>
+  </target>
 
   <target name="process-webpages" depends="define-lucene-javadoc-url,resolve-pegdown">
     <makeurl property="process-webpages.buildfiles" separator="|">
@@ -455,7 +470,7 @@
   <target name="prepare-release" depends="prepare-release-no-sign, sign-artifacts"/>
  
   <!-- make a distribution -->
-  <target name="package" depends="package-src-tgz,create-package,-dist-changes,-dist-keys"/>
+  <target name="package" depends="package-src-tgz,create-package,documentation,-dist-changes,-dist-keys"/>
 
   <!-- copy changes/ to the release folder -->
   <target name="-dist-changes">
@@ -545,7 +560,7 @@
       <target name="init-dist"/>
       <target name="dist"/>
       <target name="server"/>
-      <target name="documentation"/>
+      <target name="documentation-online"/>
     </antcall>
     <mkdir dir="${dest}/${fullnamever}"/>
     <delete includeemptydirs="true">
@@ -586,7 +601,7 @@
                             dist/solrj-lib/*
                             dist/test-framework/**"
                   excludes="**/*.tgz **/*.zip **/*.md5 **/*src*.jar **/*docs*.jar **/*.sha1" />
-      <tarfileset dir="${javadoc.dir}"
+      <tarfileset dir="${javadoc-online.dir}"
                   prefix="${fullnamever}/docs" />
     </tar>
     <make-checksums file="${package.dir}/${fullnamever}.tgz"/>

http://git-wip-us.apache.org/repos/asf/lucene-solr/blob/894a43b2/solr/common-build.xml
----------------------------------------------------------------------
diff --git a/solr/common-build.xml b/solr/common-build.xml
index 8bf9db7..5e8976c 100644
--- a/solr/common-build.xml
+++ b/solr/common-build.xml
@@ -42,6 +42,7 @@
   <property name="server.dir" location="${common-solr.dir}/server" />
   <property name="example" location="${common-solr.dir}/example" />
   <property name="javadoc.dir" location="${dest}/docs"/>
+  <property name="javadoc-online.dir" location="${dest}/docs-online"/>
   <property name="tests.cleanthreads.sysprop" value="perClass"/>
 
   <property name="changes.target.dir" location="${dest}/docs/changes"/>
@@ -339,7 +340,7 @@
     <groovy><![CDATA[
       String url, version = properties['version'];
       String useLocalJavadocUrl = properties['useLocalJavadocUrl'];
-      if (version.contains('-SNAPSHOT') || Boolean.parseBoolean(useLocalJavadocUrl)) {
+      if (version != properties['version.base'] || Boolean.parseBoolean(useLocalJavadocUrl)) {
         url = new File(properties['common.dir'], 'build' + File.separator + 'docs').toURI().toASCIIString();
         if (!(url =~ /\/$/)) url += '/';
       } else {
@@ -351,6 +352,21 @@
     ]]></groovy>
   </target>
 
+  <target name="define-solr-javadoc-url" depends="resolve-groovy" unless="solr.javadoc.url">
+    <groovy><![CDATA[
+      String url, version = properties['version'];
+      if (version != properties['version.base']) {
+        url = '';
+        task.log('Disabled Solr Javadocs online URL for packaging (custom build / SNAPSHOT version).');
+      } else {
+        version = version.replace('.', '_');
+        url = 'http://lucene.apache.org/solr/' + version + '/';
+        task.log('Using the following URL to refer to Solr Javadocs: ' + url);
+      }
+      properties['solr.javadoc.url'] = url;
+    ]]></groovy>
+  </target>
+
   <target name="jar-src">
     <sequential>
       <mkdir dir="${build.dir}"/>
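
To make the Groovy above concrete: for a release build (version equals version.base) the link target is derived by replacing dots with underscores, while SNAPSHOT and custom builds get an empty URL so the generated index.html falls back to the "build the docs locally" message. A minimal Java rendering of the same string logic (the version value is an example):

  public class SolrJavadocUrlSketch {
    public static void main(String[] args) {
      String version = "6.5.0"; // example release version
      String url = "http://lucene.apache.org/solr/" + version.replace('.', '_') + "/";
      System.out.println(url);  // -> http://lucene.apache.org/solr/6_5_0/
    }
  }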

http://git-wip-us.apache.org/repos/asf/lucene-solr/blob/894a43b2/solr/site/online-link.xsl
----------------------------------------------------------------------
diff --git a/solr/site/online-link.xsl b/solr/site/online-link.xsl
new file mode 100644
index 0000000..0f2fd18
--- /dev/null
+++ b/solr/site/online-link.xsl
@@ -0,0 +1,69 @@
+<?xml version="1.0" encoding="UTF-8"?>
+<!--
+  Licensed to the Apache Software Foundation (ASF) under one or more
+  contributor license agreements.  See the NOTICE file distributed with
+  this work for additional information regarding copyright ownership.
+  The ASF licenses this file to You under the Apache License, Version 2.0
+  (the "License"); you may not use this file except in compliance with
+  the License.  You may obtain a copy of the License at
+
+      http://www.apache.org/licenses/LICENSE-2.0
+
+  Unless required by applicable law or agreed to in writing, software
+  distributed under the License is distributed on an "AS IS" BASIS,
+  WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+  See the License for the specific language governing permissions and
+  limitations under the License.
+-->
+<xsl:stylesheet version="1.0" 
+  xmlns:xsl="http://www.w3.org/1999/XSL/Transform"
+>
+  <xsl:param name="version"/>
+  <xsl:param name="solrJavadocUrl"/>
+  
+  <!--
+    NOTE: This template matches the root element of any given input XML document!
+    The XSL input file is ignored completely, but XSL expects one to be given,
+    so build.xml passes itself here. The actual inputs are the 'version' and
+    'solrJavadocUrl' parameters passed in from the build.
+  --> 
+  <xsl:template match="/">
+    <html>
+      <head>
+        <title><xsl:text>Apache Solr </xsl:text><xsl:value-of select="$version"/><xsl:text> Documentation</xsl:text></title>
+        <link rel="icon" type="image/x-icon" href="images/favicon.ico"/>
+        <link rel="shortcut icon" type="image/x-icon" href="images/favicon.ico"/>
+      </head>
+      <body>
+        <div>
+          <a href="http://lucene.apache.org/solr/">
+            <img src="images/solr.svg" style="width:210px; margin:22px 0px 7px 20px; border:none;" title="Apache Solr Logo" alt="Solr" />
+          </a>
+          <div style="z-index:100;position:absolute;top:25px;left:226px">
+            <span style="font-size: x-small">TM</span>
+          </div>
+        </div>
+        <h1>
+          <xsl:text>Apache Solr</xsl:text>
+          <span style="vertical-align: top; font-size: x-small">
+            <xsl:text>TM</xsl:text>
+          </span>
+          <xsl:text> </xsl:text>
+          <xsl:value-of select="$version"/>
+          <xsl:text> Documentation</xsl:text>
+        </h1>
+        <p>
+          <xsl:choose>
+            <xsl:when test="$solrJavadocUrl">
+              <a href="{$solrJavadocUrl}">Follow this link to view online documentation for Solr <xsl:value-of select="$version"/>.</a>
+            </xsl:when>
+            <xsl:otherwise>
+              No online documentation available for custom builds or SNAPSHOT versions. Run <code>ant documentation</code> from <code>src.tgz</code> package to build docs locally.
+            </xsl:otherwise>
+          </xsl:choose>
+        </p>
+      </body>
+    </html>
+  </xsl:template>
+
+</xsl:stylesheet>


[46/50] [abbrv] lucene-solr:jira/solr-9858: Fix UpdateLogTest failure

Posted by ab...@apache.org.
Fix UpdateLogTest failure
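
As far as the diff suggests, the failure mode was a lifecycle mismatch: the system properties are set once for the whole class, but the old @After hook cleared them after the first test method, leaving later tests without them. A minimal sketch of the JUnit 4 pairing (hypothetical property name):

  import org.junit.AfterClass;
  import org.junit.BeforeClass;

  public class PropertyLifecycleSketch {
    @BeforeClass
    public static void beforeClass() {
      System.setProperty("example.prop", "value"); // set once, before any test runs
    }

    @AfterClass
    public static void afterClass() {
      System.clearProperty("example.prop"); // cleared once, after all tests ran
    }
  }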


Project: http://git-wip-us.apache.org/repos/asf/lucene-solr/repo
Commit: http://git-wip-us.apache.org/repos/asf/lucene-solr/commit/0b7b1443
Tree: http://git-wip-us.apache.org/repos/asf/lucene-solr/tree/0b7b1443
Diff: http://git-wip-us.apache.org/repos/asf/lucene-solr/diff/0b7b1443

Branch: refs/heads/jira/solr-9858
Commit: 0b7b1443c27c9d666a3cca8f683d4b19fbf9ce14
Parents: 34bb7f3
Author: Ishan Chattopadhyaya <is...@apache.org>
Authored: Wed Mar 1 04:18:22 2017 +0530
Committer: Ishan Chattopadhyaya <is...@apache.org>
Committed: Wed Mar 1 04:18:22 2017 +0530

----------------------------------------------------------------------
 solr/core/src/test/org/apache/solr/update/UpdateLogTest.java | 6 +++---
 1 file changed, 3 insertions(+), 3 deletions(-)
----------------------------------------------------------------------


http://git-wip-us.apache.org/repos/asf/lucene-solr/blob/0b7b1443/solr/core/src/test/org/apache/solr/update/UpdateLogTest.java
----------------------------------------------------------------------
diff --git a/solr/core/src/test/org/apache/solr/update/UpdateLogTest.java b/solr/core/src/test/org/apache/solr/update/UpdateLogTest.java
index e9269b0..8abfe2a 100644
--- a/solr/core/src/test/org/apache/solr/update/UpdateLogTest.java
+++ b/solr/core/src/test/org/apache/solr/update/UpdateLogTest.java
@@ -27,7 +27,7 @@ import org.apache.solr.common.SolrInputDocument;
 import org.apache.solr.handler.component.RealTimeGetComponent;
 import org.apache.solr.request.SolrQueryRequest;
 import org.apache.solr.update.processor.DistributedUpdateProcessor;
-import org.junit.After;
+import org.junit.AfterClass;
 import org.junit.BeforeClass;
 import org.junit.Test;
 
@@ -58,8 +58,8 @@ public class UpdateLogTest extends SolrTestCaseJ4 {
     }
   }
 
-  @After
-  public void after() {
+  @AfterClass
+  public static void afterClass() {
     System.clearProperty("solr.tests.intClassName");
     System.clearProperty("solr.tests.longClassName");
     System.clearProperty("solr.tests.floatClassName");


[09/50] [abbrv] lucene-solr:jira/solr-9858: SOLR-10055: Linux installer now renames existing bin/solr.in.* as bin/solr.in.*.orig to avoid the wrong config file being resolved.

Posted by ab...@apache.org.
SOLR-10055: Linux installer now renames existing bin/solr.in.* as bin/solr.in.*.orig to avoid the wrong config file being resolved.


Project: http://git-wip-us.apache.org/repos/asf/lucene-solr/repo
Commit: http://git-wip-us.apache.org/repos/asf/lucene-solr/commit/1e206d82
Tree: http://git-wip-us.apache.org/repos/asf/lucene-solr/tree/1e206d82
Diff: http://git-wip-us.apache.org/repos/asf/lucene-solr/diff/1e206d82

Branch: refs/heads/jira/solr-9858
Commit: 1e206d820ab0a3c080562e056970c77ef5c99f04
Parents: 4e2cf61
Author: Jan Høydahl <ja...@apache.org>
Authored: Wed Feb 22 23:44:19 2017 +0100
Committer: Jan Høydahl <ja...@apache.org>
Committed: Wed Feb 22 23:44:19 2017 +0100

----------------------------------------------------------------------
 solr/CHANGES.txt                 | 3 +++
 solr/bin/install_solr_service.sh | 2 ++
 2 files changed, 5 insertions(+)
----------------------------------------------------------------------


http://git-wip-us.apache.org/repos/asf/lucene-solr/blob/1e206d82/solr/CHANGES.txt
----------------------------------------------------------------------
diff --git a/solr/CHANGES.txt b/solr/CHANGES.txt
index dcea40c..ed30d53 100644
--- a/solr/CHANGES.txt
+++ b/solr/CHANGES.txt
@@ -180,6 +180,9 @@ Bug Fixes
 
 * SOLR-9824: Some bulk update paths could be very slow due to CUSC polling. (David Smiley, Mark Miller)
 
+* SOLR-10055: Linux installer now renames existing bin/solr.in.* as bin/solr.in.*.orig to make the installed config in
+  /etc/defaults be the one found by default when launching solr manually. (janhoy)
+
 Optimizations
 ----------------------
 

http://git-wip-us.apache.org/repos/asf/lucene-solr/blob/1e206d82/solr/bin/install_solr_service.sh
----------------------------------------------------------------------
diff --git a/solr/bin/install_solr_service.sh b/solr/bin/install_solr_service.sh
index a23612f..b331870 100755
--- a/solr/bin/install_solr_service.sh
+++ b/solr/bin/install_solr_service.sh
@@ -322,6 +322,8 @@ elif [ -f "/etc/default/$SOLR_SERVICE.in.sh" ]; then
 else
   echo -e "\nInstalling /etc/default/$SOLR_SERVICE.in.sh ...\n"
   cp "$SOLR_INSTALL_DIR/bin/solr.in.sh" "/etc/default/$SOLR_SERVICE.in.sh"
+  mv "$SOLR_INSTALL_DIR/bin/solr.in.sh" "$SOLR_INSTALL_DIR/bin/solr.in.sh.orig"  
+  mv "$SOLR_INSTALL_DIR/bin/solr.in.cmd" "$SOLR_INSTALL_DIR/bin/solr.in.cmd.orig"  
   echo "SOLR_PID_DIR=\"$SOLR_VAR_DIR\"
 SOLR_HOME=\"$SOLR_VAR_DIR/data\"
 LOG4J_PROPS=\"$SOLR_VAR_DIR/log4j.properties\"


[28/50] [abbrv] lucene-solr:jira/solr-9858: SOLR-10158: Add support for "preload" option in MMapDirectoryFactory

Posted by ab...@apache.org.
SOLR-10158: Add support for "preload" option in MMapDirectoryFactory
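
For context, preload asks MMapDirectory to touch every page of a newly mapped file up front, trading slower opens for fewer page faults on first access. A minimal sketch of the underlying Lucene call the new factory option delegates to (the index path is an example):

  import java.nio.file.Paths;
  import org.apache.lucene.store.MMapDirectory;

  public class PreloadSketch {
    public static void main(String[] args) throws Exception {
      MMapDirectory dir = new MMapDirectory(Paths.get("/var/solr/data/index"));
      dir.setPreload(true); // files opened from now on are eagerly mapped into memory
      dir.close();
    }
  }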


Project: http://git-wip-us.apache.org/repos/asf/lucene-solr/repo
Commit: http://git-wip-us.apache.org/repos/asf/lucene-solr/commit/ea37b9ae
Tree: http://git-wip-us.apache.org/repos/asf/lucene-solr/tree/ea37b9ae
Diff: http://git-wip-us.apache.org/repos/asf/lucene-solr/diff/ea37b9ae

Branch: refs/heads/jira/solr-9858
Commit: ea37b9ae870257c943bdc8c2896f1238a4dc94b6
Parents: 6f3f6a2
Author: Uwe Schindler <us...@apache.org>
Authored: Sat Feb 25 21:15:09 2017 +0100
Committer: Uwe Schindler <us...@apache.org>
Committed: Sat Feb 25 21:15:09 2017 +0100

----------------------------------------------------------------------
 solr/CHANGES.txt                                                 | 3 +++
 .../core/src/java/org/apache/solr/core/MMapDirectoryFactory.java | 4 ++++
 2 files changed, 7 insertions(+)
----------------------------------------------------------------------


http://git-wip-us.apache.org/repos/asf/lucene-solr/blob/ea37b9ae/solr/CHANGES.txt
----------------------------------------------------------------------
diff --git a/solr/CHANGES.txt b/solr/CHANGES.txt
index 2b0044c..e06603c 100644
--- a/solr/CHANGES.txt
+++ b/solr/CHANGES.txt
@@ -134,6 +134,9 @@ New Features
   field must both be stored=false, indexed=false, docValues=true. (Ishan Chattopadhyaya, hossman, noble,
   shalin, yonik)
 
+* SOLR-10158: Add support for "preload" option in MMapDirectoryFactory.
+  (Amrit Sarkar via Uwe Schindler)
+
 Bug Fixes
 ----------------------
 

http://git-wip-us.apache.org/repos/asf/lucene-solr/blob/ea37b9ae/solr/core/src/java/org/apache/solr/core/MMapDirectoryFactory.java
----------------------------------------------------------------------
diff --git a/solr/core/src/java/org/apache/solr/core/MMapDirectoryFactory.java b/solr/core/src/java/org/apache/solr/core/MMapDirectoryFactory.java
index c68ae62..e9fbce7 100644
--- a/solr/core/src/java/org/apache/solr/core/MMapDirectoryFactory.java
+++ b/solr/core/src/java/org/apache/solr/core/MMapDirectoryFactory.java
@@ -35,6 +35,7 @@ import org.slf4j.LoggerFactory;
  * Can set the following parameters:
  * <ul>
  *  <li>unmap -- See {@link MMapDirectory#setUseUnmap(boolean)}</li>
+ *  <li>preload -- See {@link MMapDirectory#setPreload(boolean)}</li>
  *  <li>maxChunkSize -- The Max chunk size.  See {@link MMapDirectory#MMapDirectory(Path, LockFactory, int)}</li>
  * </ul>
  *
@@ -42,6 +43,7 @@ import org.slf4j.LoggerFactory;
 public class MMapDirectoryFactory extends StandardDirectoryFactory {
   private static final Logger log = LoggerFactory.getLogger(MethodHandles.lookup().lookupClass());
   boolean unmapHack;
+  boolean preload;
   private int maxChunk;
 
   @Override
@@ -53,6 +55,7 @@ public class MMapDirectoryFactory extends StandardDirectoryFactory {
       throw new IllegalArgumentException("maxChunk must be greater than 0");
     }
     unmapHack = params.getBool("unmap", true);
+    preload = params.getBool("preload", false); // off by default
   }
 
   @Override
@@ -64,6 +67,7 @@ public class MMapDirectoryFactory extends StandardDirectoryFactory {
     } catch (IllegalArgumentException e) {
       log.warn("Unmap not supported on this JVM, continuing on without setting unmap", e);
     }
+    mapDirectory.setPreload(preload);
     return mapDirectory;
   }
   


[45/50] [abbrv] lucene-solr:jira/solr-9858: SOLR-10214: clean up BlockCache Metrics, add storeFails and counts

Posted by ab...@apache.org.
SOLR-10214: clean up BlockCache Metrics, add storeFails and counts
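
The shape of the change: hit/miss accounting moves into BlockCache.fetch itself, so every lookup outcome is counted exactly once, including the late miss when a bank slot is reused mid-read, and a redundant store is deliberately not counted as a store fail. A generic sketch of that counting pattern (not the Solr Metrics class):

  import java.util.concurrent.atomic.AtomicLong;

  public class CacheCountersSketch {
    final AtomicLong hits = new AtomicLong();
    final AtomicLong misses = new AtomicLong();
    final AtomicLong storeFails = new AtomicLong();

    boolean recordFetch(boolean found) {
      (found ? hits : misses).incrementAndGet(); // exactly one increment per lookup
      return found;
    }

    double hitRatio() {
      long h = hits.get();
      long lookups = h + misses.get();
      return lookups == 0 ? 0.0 : (double) h / lookups;
    }
  }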


Project: http://git-wip-us.apache.org/repos/asf/lucene-solr/repo
Commit: http://git-wip-us.apache.org/repos/asf/lucene-solr/commit/34bb7f31
Tree: http://git-wip-us.apache.org/repos/asf/lucene-solr/tree/34bb7f31
Diff: http://git-wip-us.apache.org/repos/asf/lucene-solr/diff/34bb7f31

Branch: refs/heads/jira/solr-9858
Commit: 34bb7f31e546856094cb378b9d12c9ac7540e7e2
Parents: 2adc11c
Author: yonik <yo...@apache.org>
Authored: Tue Feb 28 12:35:13 2017 -0500
Committer: yonik <yo...@apache.org>
Committed: Tue Feb 28 12:36:08 2017 -0500

----------------------------------------------------------------------
 solr/CHANGES.txt                                |   4 +
 .../solr/store/blockcache/BlockCache.java       |   6 +
 .../store/blockcache/BlockDirectoryCache.java   |   5 -
 .../apache/solr/store/blockcache/Metrics.java   | 118 ++++++++-----------
 4 files changed, 62 insertions(+), 71 deletions(-)
----------------------------------------------------------------------


http://git-wip-us.apache.org/repos/asf/lucene-solr/blob/34bb7f31/solr/CHANGES.txt
----------------------------------------------------------------------
diff --git a/solr/CHANGES.txt b/solr/CHANGES.txt
index 07f1c4e..47f190b 100644
--- a/solr/CHANGES.txt
+++ b/solr/CHANGES.txt
@@ -256,6 +256,10 @@ Other Changes
 
 * SOLR-7453: Remove replication & backup scripts in the solr/scripts directory of the checkout (Varun Thacker)
 
+* SOLR-10214: Remove unused HDFS BlockCache metrics and add storeFails, as well as adding total
+  counts for lookups, hits, and evictions. (yonik)
+  
+
 ==================  6.4.2 ==================
 
 Consult the LUCENE_CHANGES.txt file for additional, low level, changes in this release.

http://git-wip-us.apache.org/repos/asf/lucene-solr/blob/34bb7f31/solr/core/src/java/org/apache/solr/store/blockcache/BlockCache.java
----------------------------------------------------------------------
diff --git a/solr/core/src/java/org/apache/solr/store/blockcache/BlockCache.java b/solr/core/src/java/org/apache/solr/store/blockcache/BlockCache.java
index b774782..ad5b2f4 100644
--- a/solr/core/src/java/org/apache/solr/store/blockcache/BlockCache.java
+++ b/solr/core/src/java/org/apache/solr/store/blockcache/BlockCache.java
@@ -133,6 +133,7 @@ public class BlockCache {
         // YCS: it looks like when the cache is full (a normal scenario), then two concurrent writes will result in one of them failing
         // because no eviction is done first.  The code seems to rely on leaving just a single block empty.
         // TODO: simplest fix would be to leave more than one block empty
+        metrics.blockCacheStoreFail.incrementAndGet();
         return false;
       }
     } else {
@@ -141,6 +142,8 @@ public class BlockCache {
       // purpose (and then our write may overwrite that).  This can happen even if clients never try to update existing blocks,
       // since two clients can try to cache the same block concurrently.  Because of this, the ability to update an existing
       // block has been removed for the time being (see SOLR-10121).
+
+      // No metrics to update: we don't count a redundant store as a store fail.
       return false;
     }
 
@@ -168,6 +171,7 @@ public class BlockCache {
       int blockOffset, int off, int length) {
     BlockCacheLocation location = cache.getIfPresent(blockCacheKey);
     if (location == null) {
+      metrics.blockCacheMiss.incrementAndGet();
       return false;
     }
 
@@ -181,9 +185,11 @@ public class BlockCache {
     if (location.isRemoved()) {
       // must check *after* the read is done since the bank may have been reused for another block
       // before or during the read.
+      metrics.blockCacheMiss.incrementAndGet();
       return false;
     }
 
+    metrics.blockCacheHit.incrementAndGet();
     return true;
   }
   

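The hit/miss accounting now lives inside fetch() next to its check-after-read
idiom. A condensed sketch of the control flow (readBlockBytes is a
hypothetical stand-in for the actual bank read, not a real method):

  BlockCacheLocation location = cache.getIfPresent(blockCacheKey);
  if (location == null) {
    metrics.blockCacheMiss.incrementAndGet();  // never stored, or already evicted
    return false;
  }
  readBlockBytes(location, b, blockOffset, off, length);  // hypothetical helper
  if (location.isRemoved()) {
    // the bank may have been reused for another block before or during the
    // read, so the copied bytes cannot be trusted; count it as a miss
    metrics.blockCacheMiss.incrementAndGet();
    return false;
  }
  metrics.blockCacheHit.incrementAndGet();  // counted only once the location survived the read
  return true;
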
http://git-wip-us.apache.org/repos/asf/lucene-solr/blob/34bb7f31/solr/core/src/java/org/apache/solr/store/blockcache/BlockDirectoryCache.java
----------------------------------------------------------------------
diff --git a/solr/core/src/java/org/apache/solr/store/blockcache/BlockDirectoryCache.java b/solr/core/src/java/org/apache/solr/store/blockcache/BlockDirectoryCache.java
index e8a9f43..6e999d5 100644
--- a/solr/core/src/java/org/apache/solr/store/blockcache/BlockDirectoryCache.java
+++ b/solr/core/src/java/org/apache/solr/store/blockcache/BlockDirectoryCache.java
@@ -104,11 +104,6 @@ public class BlockDirectoryCache implements Cache {
     blockCacheKey.setFile(file);
     boolean fetch = blockCache.fetch(blockCacheKey, b, blockOffset, off,
         lengthToReadInBlock);
-    if (fetch) {
-      metrics.blockCacheHit.incrementAndGet();
-    } else {
-      metrics.blockCacheMiss.incrementAndGet();
-    }
     return fetch;
   }
   

http://git-wip-us.apache.org/repos/asf/lucene-solr/blob/34bb7f31/solr/core/src/java/org/apache/solr/store/blockcache/Metrics.java
----------------------------------------------------------------------
diff --git a/solr/core/src/java/org/apache/solr/store/blockcache/Metrics.java b/solr/core/src/java/org/apache/solr/store/blockcache/Metrics.java
index cfab89e..3dc8947 100644
--- a/solr/core/src/java/org/apache/solr/store/blockcache/Metrics.java
+++ b/solr/core/src/java/org/apache/solr/store/blockcache/Metrics.java
@@ -33,92 +33,78 @@ import org.apache.solr.search.SolrCacheBase;
  * @lucene.experimental
  */
 public class Metrics extends SolrCacheBase implements SolrInfoMBean {
-  
-  public static class MethodCall {
-    public AtomicLong invokes = new AtomicLong();
-    public AtomicLong times = new AtomicLong();
-  }
 
+
+  public AtomicLong blockCacheSize = new AtomicLong(0);
   public AtomicLong blockCacheHit = new AtomicLong(0);
   public AtomicLong blockCacheMiss = new AtomicLong(0);
   public AtomicLong blockCacheEviction = new AtomicLong(0);
-  public AtomicLong blockCacheSize = new AtomicLong(0);
-  public AtomicLong rowReads = new AtomicLong(0);
-  public AtomicLong rowWrites = new AtomicLong(0);
-  public AtomicLong recordReads = new AtomicLong(0);
-  public AtomicLong recordWrites = new AtomicLong(0);
-  public AtomicLong queriesExternal = new AtomicLong(0);
-  public AtomicLong queriesInternal = new AtomicLong(0);
+  public AtomicLong blockCacheStoreFail = new AtomicLong(0);
+
+  // since the last call
+  private AtomicLong blockCacheHit_last = new AtomicLong(0);
+  private AtomicLong blockCacheMiss_last = new AtomicLong(0);
+  private AtomicLong blockCacheEviction_last = new AtomicLong(0);
+  public AtomicLong blockCacheStoreFail_last = new AtomicLong(0);
+
+
+  // These are used by the BufferStore (just a generic cache of byte[]).
+  // TODO: If this (the Store) is a good idea, we should make it more general and use it across more places in Solr.
   public AtomicLong shardBuffercacheAllocate = new AtomicLong(0);
   public AtomicLong shardBuffercacheLost = new AtomicLong(0);
-  public Map<String,MethodCall> methodCalls = new ConcurrentHashMap<>();
-  
-  public AtomicLong tableCount = new AtomicLong(0);
-  public AtomicLong rowCount = new AtomicLong(0);
-  public AtomicLong recordCount = new AtomicLong(0);
-  public AtomicLong indexCount = new AtomicLong(0);
-  public AtomicLong indexMemoryUsage = new AtomicLong(0);
-  public AtomicLong segmentCount = new AtomicLong(0);
+
 
   private long previous = System.nanoTime();
 
-  public static void main(String[] args) throws InterruptedException {
-    Metrics metrics = new Metrics();
-    MethodCall methodCall = new MethodCall();
-    metrics.methodCalls.put("test", methodCall);
-    for (int i = 0; i < 100; i++) {
-      metrics.blockCacheHit.incrementAndGet();
-      metrics.blockCacheMiss.incrementAndGet();
-      methodCall.invokes.incrementAndGet();
-      methodCall.times.addAndGet(56000000);
-      Thread.sleep(500);
-    }
-  }
 
   public NamedList<Number> getStatistics() {
     NamedList<Number> stats = new SimpleOrderedMap<>(21); // room for one method call before growing
-    
+
     long now = System.nanoTime();
-    float seconds = (now - previous) / 1000000000.0f;
-    
-    long hits = blockCacheHit.getAndSet(0);
-    long lookups = hits + blockCacheMiss.getAndSet(0);
-    
-    stats.add("lookups", getPerSecond(lookups, seconds));
-    stats.add("hits", getPerSecond(hits, seconds));
-    stats.add("hitratio", calcHitRatio(lookups, hits));
-    stats.add("evictions", getPerSecond(blockCacheEviction.getAndSet(0), seconds));
+    long delta = Math.max(now - previous, 1);
+    double seconds = delta / 1000000000.0;
+
+    long hits_total = blockCacheHit.get();
+    long hits_delta = hits_total - blockCacheHit_last.get();
+    blockCacheHit_last.set(hits_total);
+
+    long miss_total = blockCacheMiss.get();
+    long miss_delta = miss_total - blockCacheMiss_last.get();
+    blockCacheMiss_last.set(miss_total);
+
+    long evict_total = blockCacheEviction.get();
+    long evict_delta = evict_total - blockCacheEviction_last.get();
+    blockCacheEviction_last.set(evict_total);
+
+    long storeFail_total = blockCacheStoreFail.get();
+    long storeFail_delta = storeFail_total - blockCacheStoreFail_last.get();
+    blockCacheStoreFail_last.set(storeFail_total);
+
+    long lookups_delta = hits_delta + miss_delta;
+    long lookups_total = hits_total + miss_total;
+
     stats.add("size", blockCacheSize.get());
-    stats.add("row.reads", getPerSecond(rowReads.getAndSet(0), seconds));
-    stats.add("row.writes", getPerSecond(rowWrites.getAndSet(0), seconds));
-    stats.add("record.reads", getPerSecond(recordReads.getAndSet(0), seconds));
-    stats.add("record.writes", getPerSecond(recordWrites.getAndSet(0), seconds));
-    stats.add("query.external", getPerSecond(queriesExternal.getAndSet(0), seconds));
-    stats.add("query.internal", getPerSecond(queriesInternal.getAndSet(0), seconds));
+    stats.add("lookups", lookups_total);
+    stats.add("hits", hits_total);
+    stats.add("evictions", evict_total);
+    stats.add("storeFails", storeFail_total);
+    stats.add("hitratio_current", calcHitRatio(lookups_delta, hits_delta));  // hit ratio since the last call
+    stats.add("lookups_persec", getPerSecond(lookups_delta, seconds)); // lookups per second since the last call
+    stats.add("hits_persec", getPerSecond(hits_delta, seconds));       // hits per second since the last call
+    stats.add("evictions_persec", getPerSecond(evict_delta, seconds));  // evictions per second since the last call
+    stats.add("storeFails_persec", getPerSecond(storeFail_delta, seconds));  // evictions per second since the last call
+    stats.add("time_delta", seconds);  // seconds since last call
+
+    // TODO: these aren't really related to the BlockCache
     stats.add("buffercache.allocations", getPerSecond(shardBuffercacheAllocate.getAndSet(0), seconds));
     stats.add("buffercache.lost", getPerSecond(shardBuffercacheLost.getAndSet(0), seconds));
-    for (Entry<String,MethodCall> entry : methodCalls.entrySet()) {
-      String key = entry.getKey();
-      MethodCall value = entry.getValue();
-      long invokes = value.invokes.getAndSet(0);
-      long times = value.times.getAndSet(0);
-      
-      float avgTimes = (times / (float) invokes) / 1000000000.0f;
-      stats.add("methodcalls." + key + ".count", getPerSecond(invokes, seconds));
-      stats.add("methodcalls." + key + ".time", avgTimes);
-    }
-    stats.add("tables", tableCount.get());
-    stats.add("rows", rowCount.get());
-    stats.add("records", recordCount.get());
-    stats.add("index.count", indexCount.get());
-    stats.add("index.memoryusage", indexMemoryUsage.get());
-    stats.add("index.segments", segmentCount.get());
+
     previous = now;
-    
+
     return stats;
   }
 
-  private float getPerSecond(long value, float seconds) {
+  private float getPerSecond(long value, double seconds) {
     return (float) (value / seconds);
   }
 

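The previous implementation reset every counter with getAndSet(0) on each
stats call, so cumulative totals were lost. The rewrite keeps monotonic totals
and derives rates from the delta since the previous snapshot; reduced to a
single counter, the pattern is:

  long hitsTotal = blockCacheHit.get();               // monotonic total, reported as "hits"
  long hitsDelta = hitsTotal - blockCacheHit_last.get();
  blockCacheHit_last.set(hitsTotal);                  // snapshot for the next call
  float hitsPerSec = (float) (hitsDelta / seconds);   // reported as "hits_persec"

Monotonic totals are also friendlier to external metrics systems, which can
compute their own rates without being affected by who else polled the stats.
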

[49/50] [abbrv] lucene-solr:jira/solr-9858: LUCENE-7717: UnifiedHighlighter and PostingsHighlighter bug in PrefixQuery and TermRangeQuery for multi-byte text

Posted by ab...@apache.org.
LUCENE-7717: UnifiedHighlighter and PostingsHighlighter bug in PrefixQuery and TermRangeQuery for multi-byte text


Project: http://git-wip-us.apache.org/repos/asf/lucene-solr/repo
Commit: http://git-wip-us.apache.org/repos/asf/lucene-solr/commit/ec13032a
Tree: http://git-wip-us.apache.org/repos/asf/lucene-solr/tree/ec13032a
Diff: http://git-wip-us.apache.org/repos/asf/lucene-solr/diff/ec13032a

Branch: refs/heads/jira/solr-9858
Commit: ec13032a948a29f69d50d41e4859fd38ed5ca377
Parents: 0baf2fa
Author: David Smiley <ds...@apache.org>
Authored: Wed Mar 1 01:38:54 2017 -0500
Committer: David Smiley <ds...@apache.org>
Committed: Wed Mar 1 01:38:54 2017 -0500

----------------------------------------------------------------------
 lucene/CHANGES.txt                              |  4 +++
 .../MultiTermHighlighting.java                  | 20 ++++++-------
 .../uhighlight/MultiTermHighlighting.java       | 20 ++++++-------
 .../uhighlight/TestUnifiedHighlighterMTQ.java   | 30 ++++++++++++++++----
 4 files changed, 49 insertions(+), 25 deletions(-)
----------------------------------------------------------------------


http://git-wip-us.apache.org/repos/asf/lucene-solr/blob/ec13032a/lucene/CHANGES.txt
----------------------------------------------------------------------
diff --git a/lucene/CHANGES.txt b/lucene/CHANGES.txt
index 6026654..7d8e363 100644
--- a/lucene/CHANGES.txt
+++ b/lucene/CHANGES.txt
@@ -257,6 +257,10 @@ Bug Fixes
 * LUCENE-7676: Fixed FilterCodecReader to override more super-class methods.
   Also added TestFilterCodecReader class. (Christine Poerschke)
 
+* LUCENE-7717: The UnifiedHighlighter and PostingsHighlighter were not highlighting
+  prefix queries with multi-byte characters. TermRangeQuery was affected too.
+  (Dmitry Malinin, David Smiley)
+
 ======================= Lucene 6.4.1 =======================
 
 Build

http://git-wip-us.apache.org/repos/asf/lucene-solr/blob/ec13032a/lucene/highlighter/src/java/org/apache/lucene/search/postingshighlight/MultiTermHighlighting.java
----------------------------------------------------------------------
diff --git a/lucene/highlighter/src/java/org/apache/lucene/search/postingshighlight/MultiTermHighlighting.java b/lucene/highlighter/src/java/org/apache/lucene/search/postingshighlight/MultiTermHighlighting.java
index 56345c2..c9733d3 100644
--- a/lucene/highlighter/src/java/org/apache/lucene/search/postingshighlight/MultiTermHighlighting.java
+++ b/lucene/highlighter/src/java/org/apache/lucene/search/postingshighlight/MultiTermHighlighting.java
@@ -87,16 +87,6 @@ class MultiTermHighlighting {
       list.addAll(Arrays.asList(extractAutomata(((SpanPositionCheckQuery) query).getMatch(), field)));
     } else if (query instanceof SpanMultiTermQueryWrapper) {
       list.addAll(Arrays.asList(extractAutomata(((SpanMultiTermQueryWrapper<?>) query).getWrappedQuery(), field)));
-    } else if (query instanceof AutomatonQuery) {
-      final AutomatonQuery aq = (AutomatonQuery) query;
-      if (aq.getField().equals(field)) {
-        list.add(new CharacterRunAutomaton(aq.getAutomaton()) {
-          @Override
-          public String toString() {
-            return aq.toString();
-          }
-        });
-      }
     } else if (query instanceof PrefixQuery) {
       final PrefixQuery pq = (PrefixQuery) query;
       Term prefix = pq.getPrefix();
@@ -182,6 +172,16 @@ class MultiTermHighlighting {
           }
         });
       }
+    } else if (query instanceof AutomatonQuery) {
+      final AutomatonQuery aq = (AutomatonQuery) query;
+      if (aq.getField().equals(field)) {
+        list.add(new CharacterRunAutomaton(aq.getAutomaton()) {
+          @Override
+          public String toString() {
+            return aq.toString();
+          }
+        });
+      }
     }
     return list.toArray(new CharacterRunAutomaton[list.size()]);
   }

http://git-wip-us.apache.org/repos/asf/lucene-solr/blob/ec13032a/lucene/highlighter/src/java/org/apache/lucene/search/uhighlight/MultiTermHighlighting.java
----------------------------------------------------------------------
diff --git a/lucene/highlighter/src/java/org/apache/lucene/search/uhighlight/MultiTermHighlighting.java b/lucene/highlighter/src/java/org/apache/lucene/search/uhighlight/MultiTermHighlighting.java
index 267d603..89403d5 100644
--- a/lucene/highlighter/src/java/org/apache/lucene/search/uhighlight/MultiTermHighlighting.java
+++ b/lucene/highlighter/src/java/org/apache/lucene/search/uhighlight/MultiTermHighlighting.java
@@ -100,16 +100,6 @@ class MultiTermHighlighting {
     } else if (lookInSpan && query instanceof SpanMultiTermQueryWrapper) {
       list.addAll(Arrays.asList(extractAutomata(((SpanMultiTermQueryWrapper<?>) query).getWrappedQuery(),
           fieldMatcher, lookInSpan, preRewriteFunc)));
-    } else if (query instanceof AutomatonQuery) {
-      final AutomatonQuery aq = (AutomatonQuery) query;
-      if (fieldMatcher.test(aq.getField())) {
-        list.add(new CharacterRunAutomaton(aq.getAutomaton()) {
-          @Override
-          public String toString() {
-            return aq.toString();
-          }
-        });
-      }
     } else if (query instanceof PrefixQuery) {
       final PrefixQuery pq = (PrefixQuery) query;
       Term prefix = pq.getPrefix();
@@ -197,6 +187,16 @@ class MultiTermHighlighting {
           }
         });
       }
+    } else if (query instanceof AutomatonQuery) {
+      final AutomatonQuery aq = (AutomatonQuery) query;
+      if (fieldMatcher.test(aq.getField())) {
+        list.add(new CharacterRunAutomaton(aq.getAutomaton()) {
+          @Override
+          public String toString() {
+            return aq.toString();
+          }
+        });
+      }
     }
     return list.toArray(new CharacterRunAutomaton[list.size()]);
   }

http://git-wip-us.apache.org/repos/asf/lucene-solr/blob/ec13032a/lucene/highlighter/src/test/org/apache/lucene/search/uhighlight/TestUnifiedHighlighterMTQ.java
----------------------------------------------------------------------
diff --git a/lucene/highlighter/src/test/org/apache/lucene/search/uhighlight/TestUnifiedHighlighterMTQ.java b/lucene/highlighter/src/test/org/apache/lucene/search/uhighlight/TestUnifiedHighlighterMTQ.java
index 10f36a7..4a4b7ed 100644
--- a/lucene/highlighter/src/test/org/apache/lucene/search/uhighlight/TestUnifiedHighlighterMTQ.java
+++ b/lucene/highlighter/src/test/org/apache/lucene/search/uhighlight/TestUnifiedHighlighterMTQ.java
@@ -29,6 +29,7 @@ import org.apache.lucene.analysis.MockAnalyzer;
 import org.apache.lucene.analysis.MockTokenizer;
 import org.apache.lucene.analysis.TokenStream;
 import org.apache.lucene.analysis.Tokenizer;
+import org.apache.lucene.analysis.standard.StandardAnalyzer;
 import org.apache.lucene.document.Document;
 import org.apache.lucene.document.Field;
 import org.apache.lucene.document.FieldType;
@@ -668,10 +669,11 @@ public class TestUnifiedHighlighterMTQ extends LuceneTestCase {
 
     IndexSearcher searcher = newSearcher(ir);
     UnifiedHighlighter highlighter = new UnifiedHighlighter(searcher, indexAnalyzer);
+    // use a variety of common MTQ types
     BooleanQuery query = new BooleanQuery.Builder()
-        .add(new WildcardQuery(new Term("body", "te*")), BooleanClause.Occur.SHOULD)
-        .add(new WildcardQuery(new Term("body", "one")), BooleanClause.Occur.SHOULD)
-        .add(new WildcardQuery(new Term("body", "se*")), BooleanClause.Occur.SHOULD)
+        .add(new PrefixQuery(new Term("body", "te")), BooleanClause.Occur.SHOULD)
+        .add(new WildcardQuery(new Term("body", "*one*")), BooleanClause.Occur.SHOULD)
+        .add(new FuzzyQuery(new Term("body", "zentence~")), BooleanClause.Occur.SHOULD)
         .build();
     TopDocs topDocs = searcher.search(query, 10, Sort.INDEXORDER);
     assertEquals(1, topDocs.totalHits);
@@ -732,8 +734,7 @@ public class TestUnifiedHighlighterMTQ extends LuceneTestCase {
     snippets = highlighter.highlight("body", query, topDocs);
     assertEquals(1, snippets.length);
 
-    // Default formatter bolds each hit:
-    assertEquals("<b>Test(body:te*)</b> a <b>one(body:one)</b> <b>sentence(body:se*)</b> document.", snippets[0]);
+    assertEquals("<b>Test(body:te*)</b> a <b>one(body:*one*)</b> <b>sentence(body:zentence~~2)</b> document.", snippets[0]);
 
     ir.close();
   }
@@ -1054,4 +1055,23 @@ public class TestUnifiedHighlighterMTQ extends LuceneTestCase {
     }
   }
 
+  // LUCENE-7717 bug, ordering of MTQ AutomatonQuery detection
+  public void testRussianPrefixQuery() throws IOException {
+    Analyzer analyzer = new StandardAnalyzer();
+    RandomIndexWriter iw = new RandomIndexWriter(random(), dir, analyzer);
+    String field = "title";
+    Document doc = new Document();
+    doc.add(new Field(field, "\u044f", fieldType)); // Russian char; uses 2 UTF8 bytes
+    iw.addDocument(doc);
+    IndexReader ir = iw.getReader();
+    iw.close();
+
+    IndexSearcher searcher = newSearcher(ir);
+    Query query = new PrefixQuery(new Term(field, "\u044f"));
+    TopDocs topDocs = searcher.search(query, 1);
+    UnifiedHighlighter highlighter = new UnifiedHighlighter(searcher, analyzer);
+    String[] snippets = highlighter.highlight(field, query, topDocs);
+    assertEquals("[<b>\u044f</b>]", Arrays.toString(snippets));
+    ir.close();
+  }
 }

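The ordering change above is the whole fix: PrefixQuery and TermRangeQuery are
themselves AutomatonQuery subclasses, so testing the generic supertype first
routed them to the byte-level automaton path, which splits multi-byte UTF-8
characters. A reduced sketch of the hazard (not the patched code itself):

  Query query = new PrefixQuery(new Term("body", "\u044f"));
  if (query instanceof AutomatonQuery) {
    // matched first before this patch: a byte-based automaton that breaks
    // on multi-byte terms
  } else if (query instanceof PrefixQuery) {
    // character-level handling; unreachable while the supertype is tested first
  }

Moving the AutomatonQuery branch last lets the more specific branches match
first, which is what both copies of extractAutomata now do.
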

[39/50] [abbrv] lucene-solr:jira/solr-9858: LUCENE-7716: Reduce specialization in TopFieldCollector.

Posted by ab...@apache.org.
LUCENE-7716: Reduce specialization in TopFieldCollector.


Project: http://git-wip-us.apache.org/repos/asf/lucene-solr/repo
Commit: http://git-wip-us.apache.org/repos/asf/lucene-solr/commit/8e65aca0
Tree: http://git-wip-us.apache.org/repos/asf/lucene-solr/tree/8e65aca0
Diff: http://git-wip-us.apache.org/repos/asf/lucene-solr/diff/8e65aca0

Branch: refs/heads/jira/solr-9858
Commit: 8e65aca0e1e08c8f3e3d53e2561b8cd09a5e1a22
Parents: c7fd143
Author: Adrien Grand <jp...@gmail.com>
Authored: Tue Feb 28 13:38:55 2017 +0100
Committer: Adrien Grand <jp...@gmail.com>
Committed: Tue Feb 28 14:46:44 2017 +0100

----------------------------------------------------------------------
 .../lucene/search/MultiLeafFieldComparator.java |  92 ++++++++
 .../apache/lucene/search/TopFieldCollector.java | 212 +++++--------------
 2 files changed, 141 insertions(+), 163 deletions(-)
----------------------------------------------------------------------


http://git-wip-us.apache.org/repos/asf/lucene-solr/blob/8e65aca0/lucene/core/src/java/org/apache/lucene/search/MultiLeafFieldComparator.java
----------------------------------------------------------------------
diff --git a/lucene/core/src/java/org/apache/lucene/search/MultiLeafFieldComparator.java b/lucene/core/src/java/org/apache/lucene/search/MultiLeafFieldComparator.java
new file mode 100644
index 0000000..5fdb87d
--- /dev/null
+++ b/lucene/core/src/java/org/apache/lucene/search/MultiLeafFieldComparator.java
@@ -0,0 +1,92 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements.  See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License.  You may obtain a copy of the License at
+ *
+ *     http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+package org.apache.lucene.search;
+
+import java.io.IOException;
+
+final class MultiLeafFieldComparator implements LeafFieldComparator {
+
+  private final LeafFieldComparator[] comparators;
+  private final int[] reverseMul;
+  // we extract the first comparator to avoid array access in the common case
+  // that the first comparator compares worse than the bottom entry in the queue
+  private final LeafFieldComparator firstComparator;
+  private final int firstReverseMul;
+
+  MultiLeafFieldComparator(LeafFieldComparator[] comparators, int[] reverseMul) {
+    if (comparators.length != reverseMul.length) {
+      throw new IllegalArgumentException("Must have the same number of comparators and reverseMul, got "
+          + comparators.length + " and " + reverseMul.length);
+    }
+    this.comparators = comparators;
+    this.reverseMul = reverseMul;
+    this.firstComparator = comparators[0];
+    this.firstReverseMul = reverseMul[0];
+  }
+
+  @Override
+  public void setBottom(int slot) throws IOException {
+    for (LeafFieldComparator comparator : comparators) {
+      comparator.setBottom(slot);
+    }
+  }
+
+  @Override
+  public int compareBottom(int doc) throws IOException {
+    int cmp = firstReverseMul * firstComparator.compareBottom(doc);
+    if (cmp != 0) {
+      return cmp;
+    }
+    for (int i = 1; i < comparators.length; ++i) {
+      cmp = reverseMul[i] * comparators[i].compareBottom(doc);
+      if (cmp != 0) {
+        return cmp;
+      }
+    }
+    return 0;
+  }
+
+  @Override
+  public int compareTop(int doc) throws IOException {
+    int cmp = firstReverseMul * firstComparator.compareTop(doc);
+    if (cmp != 0) {
+      return cmp;
+    }
+    for (int i = 1; i < comparators.length; ++i) {
+      cmp = reverseMul[i] * comparators[i].compareTop(doc);
+      if (cmp != 0) {
+        return cmp;
+      }
+    }
+    return 0;
+  }
+
+  @Override
+  public void copy(int slot, int doc) throws IOException {
+    for (LeafFieldComparator comparator : comparators) {
+      comparator.copy(slot, doc);
+    }
+  }
+
+  @Override
+  public void setScorer(Scorer scorer) throws IOException {
+    for (LeafFieldComparator comparator : comparators) {
+      comparator.setScorer(scorer);
+    }
+  }
+
+}

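A sketch of how a collector is expected to wire this up, condensed from the
TopFieldCollector changes below (queue and context come from the surrounding
collector code):

  LeafFieldComparator[] comparators = queue.getComparators(context);
  int[] reverseMul = queue.getReverseMul();
  final LeafFieldComparator comparator;
  final int mul;
  if (comparators.length == 1) {
    comparator = comparators[0];  // avoid the wrapper in the common case
    mul = reverseMul[0];
  } else {
    comparator = new MultiLeafFieldComparator(comparators, reverseMul);
    mul = 1;  // the wrapper applies reverseMul internally
  }
  // from here on, mul * comparator.compareBottom(doc) works for any number of sort fields

This is the same trade-off the removed OneComparatorLeafCollector made, now
hidden behind a single interface instead of two specialized collector classes.
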
http://git-wip-us.apache.org/repos/asf/lucene-solr/blob/8e65aca0/lucene/core/src/java/org/apache/lucene/search/TopFieldCollector.java
----------------------------------------------------------------------
diff --git a/lucene/core/src/java/org/apache/lucene/search/TopFieldCollector.java b/lucene/core/src/java/org/apache/lucene/search/TopFieldCollector.java
index 3433906..1ec322f 100644
--- a/lucene/core/src/java/org/apache/lucene/search/TopFieldCollector.java
+++ b/lucene/core/src/java/org/apache/lucene/search/TopFieldCollector.java
@@ -39,16 +39,21 @@ public abstract class TopFieldCollector extends TopDocsCollector<Entry> {
   // always compare lower than a real hit; this would
   // save having to check queueFull on each insert
 
-  private static abstract class OneComparatorLeafCollector implements LeafCollector {
+  private static abstract class MultiComparatorLeafCollector implements LeafCollector {
 
     final LeafFieldComparator comparator;
     final int reverseMul;
     final boolean mayNeedScoresTwice;
     Scorer scorer;
 
-    OneComparatorLeafCollector(LeafFieldComparator comparator, int reverseMul, boolean mayNeedScoresTwice) {
-      this.comparator = comparator;
-      this.reverseMul = reverseMul;
+    MultiComparatorLeafCollector(LeafFieldComparator[] comparators, int[] reverseMul, boolean mayNeedScoresTwice) {
+      if (comparators.length == 1) {
+        this.reverseMul = reverseMul[0];
+        this.comparator = comparators[0];
+      } else {
+        this.reverseMul = 1;
+        this.comparator = new MultiLeafFieldComparator(comparators, reverseMul);
+      }
       this.mayNeedScoresTwice = mayNeedScoresTwice;
     }
 
@@ -57,77 +62,8 @@ public abstract class TopFieldCollector extends TopDocsCollector<Entry> {
       if (mayNeedScoresTwice && scorer instanceof ScoreCachingWrappingScorer == false) {
         scorer = new ScoreCachingWrappingScorer(scorer);
       }
-      this.scorer = scorer;
       comparator.setScorer(scorer);
-    }
-  }
-
-  private static abstract class MultiComparatorLeafCollector implements LeafCollector {
-
-    final LeafFieldComparator[] comparators;
-    final int[] reverseMul;
-    final LeafFieldComparator firstComparator;
-    final int firstReverseMul;
-    final boolean mayNeedScoresTwice;
-    Scorer scorer;
-
-    MultiComparatorLeafCollector(LeafFieldComparator[] comparators, int[] reverseMul, boolean mayNeedScoresTwice) {
-      this.comparators = comparators;
-      this.reverseMul = reverseMul;
-      firstComparator = comparators[0];
-      firstReverseMul = reverseMul[0];
-      this.mayNeedScoresTwice = mayNeedScoresTwice;
-    }
-
-    protected final int compareBottom(int doc) throws IOException {
-      int cmp = firstReverseMul * firstComparator.compareBottom(doc);
-      if (cmp != 0) {
-        return cmp;
-      }
-      for (int i = 1; i < comparators.length; ++i) {
-        cmp = reverseMul[i] * comparators[i].compareBottom(doc);
-        if (cmp != 0) {
-          return cmp;
-        }
-      }
-      return 0;
-    }
-
-    protected final void copy(int slot, int doc) throws IOException {
-      for (LeafFieldComparator comparator : comparators) {
-        comparator.copy(slot, doc);
-      }
-    }
-
-    protected final void setBottom(int slot) throws IOException {
-      for (LeafFieldComparator comparator : comparators) {
-        comparator.setBottom(slot);
-      }
-    }
-
-    protected final int compareTop(int doc) throws IOException {
-      int cmp = firstReverseMul * firstComparator.compareTop(doc);
-      if (cmp != 0) {
-        return cmp;
-      }
-      for (int i = 1; i < comparators.length; ++i) {
-        cmp = reverseMul[i] * comparators[i].compareTop(doc);
-        if (cmp != 0) {
-          return cmp;
-        }
-      }
-      return 0;
-    }
-
-    @Override
-    public void setScorer(Scorer scorer) throws IOException {
       this.scorer = scorer;
-      if (mayNeedScoresTwice && scorer instanceof ScoreCachingWrappingScorer == false) {
-        scorer = new ScoreCachingWrappingScorer(scorer);
-      }
-      for (LeafFieldComparator comparator : comparators) {
-        comparator.setScorer(scorer);
-      }
     }
   }
 
@@ -164,103 +100,53 @@ public abstract class TopFieldCollector extends TopDocsCollector<Entry> {
       final LeafFieldComparator[] comparators = queue.getComparators(context);
       final int[] reverseMul = queue.getReverseMul();
 
-      if (comparators.length == 1) {
-        return new OneComparatorLeafCollector(comparators[0], reverseMul[0], mayNeedScoresTwice) {
+      return new MultiComparatorLeafCollector(comparators, reverseMul, mayNeedScoresTwice) {
 
-          @Override
-          public void collect(int doc) throws IOException {
-            float score = Float.NaN;
-            if (trackMaxScore) {
-              score = scorer.score();
-              if (score > maxScore) {
-                maxScore = score;
-              }
+        @Override
+        public void collect(int doc) throws IOException {
+          float score = Float.NaN;
+          if (trackMaxScore) {
+            score = scorer.score();
+            if (score > maxScore) {
+              maxScore = score;
             }
+          }
 
-            ++totalHits;
-            if (queueFull) {
-              if (reverseMul * comparator.compareBottom(doc) <= 0) {
-                // since docs are visited in doc Id order, if compare is 0, it means
-                // this document is largest than anything else in the queue, and
-                // therefore not competitive.
-                return;
-              }
-
-              if (trackDocScores && !trackMaxScore) {
-                score = scorer.score();
-              }
-
-              // This hit is competitive - replace bottom element in queue & adjustTop
-              comparator.copy(bottom.slot, doc);
-              updateBottom(doc, score);
-              comparator.setBottom(bottom.slot);
-            } else {
-              // Startup transient: queue hasn't gathered numHits yet
-              final int slot = totalHits - 1;
-
-              if (trackDocScores && !trackMaxScore) {
-                score = scorer.score();
-              }
-
-              // Copy hit into queue
-              comparator.copy(slot, doc);
-              add(slot, doc, score);
-              if (queueFull) {
-                comparator.setBottom(bottom.slot);
-              }
+          ++totalHits;
+          if (queueFull) {
+            if (reverseMul * comparator.compareBottom(doc) <= 0) {
+              // since docs are visited in doc Id order, if compare is 0, it means
+              // this document is largest than anything else in the queue, and
+              // therefore not competitive.
+              return;
             }
-          }
 
-        };
-      } else {
-        return new MultiComparatorLeafCollector(comparators, reverseMul, mayNeedScoresTwice) {
+            if (trackDocScores && !trackMaxScore) {
+              score = scorer.score();
+            }
 
-          @Override
-          public void collect(int doc) throws IOException {
-            float score = Float.NaN;
-            if (trackMaxScore) {
+            // This hit is competitive - replace bottom element in queue & adjustTop
+            comparator.copy(bottom.slot, doc);
+            updateBottom(doc, score);
+            comparator.setBottom(bottom.slot);
+          } else {
+            // Startup transient: queue hasn't gathered numHits yet
+            final int slot = totalHits - 1;
+
+            if (trackDocScores && !trackMaxScore) {
               score = scorer.score();
-              if (score > maxScore) {
-                maxScore = score;
-              }
             }
 
-            ++totalHits;
+            // Copy hit into queue
+            comparator.copy(slot, doc);
+            add(slot, doc, score);
             if (queueFull) {
-              if (compareBottom(doc) <= 0) {
-                // since docs are visited in doc Id order, if compare is 0, it means
-                // this document is largest than anything else in the queue, and
-                // therefore not competitive.
-                return;
-              }
-
-              if (trackDocScores && !trackMaxScore) {
-                score = scorer.score();
-              }
-
-              // This hit is competitive - replace bottom element in queue & adjustTop
-              copy(bottom.slot, doc);
-              updateBottom(doc, score);
-              setBottom(bottom.slot);
-            } else {
-              // Startup transient: queue hasn't gathered numHits yet
-              final int slot = totalHits - 1;
-
-              if (trackDocScores && !trackMaxScore) {
-                score = scorer.score();
-              }
-
-              // Copy hit into queue
-              copy(slot, doc);
-              add(slot, doc, score);
-              if (queueFull) {
-                setBottom(bottom.slot);
-              }
+              comparator.setBottom(bottom.slot);
             }
           }
+        }
 
-        };
-      }
+      };
     }
 
   }
@@ -321,14 +207,14 @@ public abstract class TopFieldCollector extends TopDocsCollector<Entry> {
           if (queueFull) {
             // Fastmatch: return if this hit is no better than
             // the worst hit currently in the queue:
-            final int cmp = compareBottom(doc);
+            final int cmp = reverseMul * comparator.compareBottom(doc);
             if (cmp <= 0) {
               // not competitive since documents are visited in doc id order
               return;
             }
           }
 
-          final int topCmp = compareTop(doc);
+          final int topCmp = reverseMul * comparator.compareTop(doc);
           if (topCmp > 0 || (topCmp == 0 && doc <= afterDoc)) {
             // Already collected on a previous page
             return;
@@ -336,7 +222,7 @@ public abstract class TopFieldCollector extends TopDocsCollector<Entry> {
 
           if (queueFull) {
             // This hit is competitive - replace bottom element in queue & adjustTop
-            copy(bottom.slot, doc);
+            comparator.copy(bottom.slot, doc);
 
             // Compute score only if it is competitive.
             if (trackDocScores && !trackMaxScore) {
@@ -344,7 +230,7 @@ public abstract class TopFieldCollector extends TopDocsCollector<Entry> {
             }
             updateBottom(doc, score);
 
-            setBottom(bottom.slot);
+            comparator.setBottom(bottom.slot);
           } else {
             collectedHits++;
 
@@ -352,7 +238,7 @@ public abstract class TopFieldCollector extends TopDocsCollector<Entry> {
             final int slot = collectedHits - 1;
             //System.out.println("    slot=" + slot);
             // Copy hit into queue
-            copy(slot, doc);
+            comparator.copy(slot, doc);
 
             // Compute score only if it is competitive.
             if (trackDocScores && !trackMaxScore) {
@@ -361,7 +247,7 @@ public abstract class TopFieldCollector extends TopDocsCollector<Entry> {
             bottom = pq.add(new Entry(slot, docBase + doc, score));
             queueFull = collectedHits == numHits;
             if (queueFull) {
-              setBottom(bottom.slot);
+              comparator.setBottom(bottom.slot);
             }
           }
         }


[26/50] [abbrv] lucene-solr:jira/solr-9858: SOLR-10190: Fixed assert message

Posted by ab...@apache.org.
SOLR-10190: Fixed assert message


Project: http://git-wip-us.apache.org/repos/asf/lucene-solr/repo
Commit: http://git-wip-us.apache.org/repos/asf/lucene-solr/commit/99e8ef23
Tree: http://git-wip-us.apache.org/repos/asf/lucene-solr/tree/99e8ef23
Diff: http://git-wip-us.apache.org/repos/asf/lucene-solr/diff/99e8ef23

Branch: refs/heads/jira/solr-9858
Commit: 99e8ef2304b67712d45a2393e649c5319aaac972
Parents: 39887b8
Author: Tomas Fernandez Lobbe <tf...@apache.org>
Authored: Fri Feb 24 17:37:44 2017 -0800
Committer: Tomas Fernandez Lobbe <tf...@apache.org>
Committed: Fri Feb 24 17:37:44 2017 -0800

----------------------------------------------------------------------
 .../org/apache/solr/client/solrj/impl/CloudSolrClientTest.java     | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)
----------------------------------------------------------------------


http://git-wip-us.apache.org/repos/asf/lucene-solr/blob/99e8ef23/solr/solrj/src/test/org/apache/solr/client/solrj/impl/CloudSolrClientTest.java
----------------------------------------------------------------------
diff --git a/solr/solrj/src/test/org/apache/solr/client/solrj/impl/CloudSolrClientTest.java b/solr/solrj/src/test/org/apache/solr/client/solrj/impl/CloudSolrClientTest.java
index cff5c23..dd0dd16 100644
--- a/solr/solrj/src/test/org/apache/solr/client/solrj/impl/CloudSolrClientTest.java
+++ b/solr/solrj/src/test/org/apache/solr/client/solrj/impl/CloudSolrClientTest.java
@@ -166,7 +166,7 @@ public class CloudSolrClientTest extends SolrCloudTestCase {
         fail("Alias points to non-existing collection, add should fail");
       } catch (SolrException e) {
         assertEquals(SolrException.ErrorCode.BAD_REQUEST.code, e.code());
-        assertTrue("Unexpected error exception", e.getMessage().contains("Collection not found"));
+        assertTrue("Unexpected exception", e.getMessage().contains("Collection not found"));
       }
     }
   }


[37/50] [abbrv] lucene-solr:jira/solr-9858: LUCENE-7703: Record the index creation version.

Posted by ab...@apache.org.
LUCENE-7703: Record the index creation version.


Project: http://git-wip-us.apache.org/repos/asf/lucene-solr/repo
Commit: http://git-wip-us.apache.org/repos/asf/lucene-solr/commit/d9c0f259
Tree: http://git-wip-us.apache.org/repos/asf/lucene-solr/tree/d9c0f259
Diff: http://git-wip-us.apache.org/repos/asf/lucene-solr/diff/d9c0f259

Branch: refs/heads/jira/solr-9858
Commit: d9c0f2599d934766549b2566d7c0dd159c3af5c8
Parents: b6c5a8a
Author: Adrien Grand <jp...@gmail.com>
Authored: Wed Feb 22 16:11:52 2017 +0100
Committer: Adrien Grand <jp...@gmail.com>
Committed: Tue Feb 28 13:37:07 2017 +0100

----------------------------------------------------------------------
 lucene/CHANGES.txt                              |   6 ++
 .../index/TestBackwardsCompatibility.java       |  49 ++++++++++--
 .../lucene/index/TestFixBrokenOffsets.java      |   7 +-
 .../lucene/index/TestIndexWriterOnOldIndex.java |  55 +++++++++++++
 .../lucene/index/index.single-empty-doc.630.zip | Bin 0 -> 1363 bytes
 .../org/apache/lucene/index/IndexWriter.java    |  20 +++--
 .../org/apache/lucene/index/SegmentInfos.java   |  77 +++++++++++++++----
 .../apache/lucene/index/TestIndexWriter.java    |  11 ++-
 .../apache/lucene/index/TestSegmentInfos.java   |  11 ++-
 .../org/apache/lucene/index/IndexSplitter.java  |   4 +-
 .../lucene/replicator/nrt/ReplicaNode.java      |   3 +-
 11 files changed, 203 insertions(+), 40 deletions(-)
----------------------------------------------------------------------


http://git-wip-us.apache.org/repos/asf/lucene-solr/blob/d9c0f259/lucene/CHANGES.txt
----------------------------------------------------------------------
diff --git a/lucene/CHANGES.txt b/lucene/CHANGES.txt
index c119eaa..21a29c3 100644
--- a/lucene/CHANGES.txt
+++ b/lucene/CHANGES.txt
@@ -5,6 +5,12 @@ http://s.apache.org/luceneversions
 
 ======================= Lucene 7.0.0 =======================
 
+New Features
+
+* LUCENE-7703: SegmentInfos now record the Lucene version at index creation
+  time. (Adrien Grand)
+
+
 API Changes
 
 * LUCENE-2605: Classic QueryParser no longer splits on whitespace by default.

http://git-wip-us.apache.org/repos/asf/lucene-solr/blob/d9c0f259/lucene/backward-codecs/src/test/org/apache/lucene/index/TestBackwardsCompatibility.java
----------------------------------------------------------------------
diff --git a/lucene/backward-codecs/src/test/org/apache/lucene/index/TestBackwardsCompatibility.java b/lucene/backward-codecs/src/test/org/apache/lucene/index/TestBackwardsCompatibility.java
index 0b8f2c4..57ce52a 100644
--- a/lucene/backward-codecs/src/test/org/apache/lucene/index/TestBackwardsCompatibility.java
+++ b/lucene/backward-codecs/src/test/org/apache/lucene/index/TestBackwardsCompatibility.java
@@ -702,13 +702,27 @@ public class TestBackwardsCompatibility extends LuceneTestCase {
       if (VERBOSE) {
         System.out.println("\nTEST: old index " + name);
       }
+      Directory oldDir = oldIndexDirs.get(name);
+      Version indexCreatedVersion = SegmentInfos.readLatestCommit(oldDir).getIndexCreatedVersion();
+
       Directory targetDir = newDirectory();
+      // Simulate writing into an index that was created with the same version
+      new SegmentInfos(indexCreatedVersion).commit(targetDir);
       IndexWriter w = new IndexWriter(targetDir, newIndexWriterConfig(new MockAnalyzer(random())));
-      w.addIndexes(oldIndexDirs.get(name));
+      w.addIndexes(oldDir);
+      w.close();
+      targetDir.close();
+
+      // Now check that we forbid calling addIndexes with a different version
+      targetDir = newDirectory();
+      IndexWriter oldWriter = new IndexWriter(targetDir, newIndexWriterConfig(new MockAnalyzer(random())));
+      IllegalArgumentException e = expectThrows(IllegalArgumentException.class, () -> oldWriter.addIndexes(oldDir));
+      assertTrue(e.getMessage(), e.getMessage().startsWith("Cannot use addIndexes(Directory) with indexes that have been created by a different Lucene version."));
+
       if (VERBOSE) {
         System.out.println("\nTEST: done adding indices; now close");
       }
-      w.close();
+      oldWriter.close();
       
       targetDir.close();
     }
@@ -1221,6 +1235,20 @@ public class TestBackwardsCompatibility extends LuceneTestCase {
     }
   }
 
+  public void testIndexCreatedVersion() throws IOException {
+    for (String name : oldNames) {
+      Directory dir = oldIndexDirs.get(name);
+      SegmentInfos infos = SegmentInfos.readLatestCommit(dir);
+      // those indexes are created by a single version so we can
+      // compare the commit version with the created version
+      if (infos.getCommitLuceneVersion().onOrAfter(Version.LUCENE_7_0_0)) {
+        assertEquals(infos.getCommitLuceneVersion(), infos.getIndexCreatedVersion());
+      } else {
+        assertNull(infos.getIndexCreatedVersion());
+      }
+    }
+  }
+
   public void verifyUsesDefaultCodec(Directory dir, String name) throws Exception {
     DirectoryReader r = DirectoryReader.open(dir);
     for (LeafReaderContext context : r.leaves()) {
@@ -1284,7 +1312,7 @@ public class TestBackwardsCompatibility extends LuceneTestCase {
     }
   }
   
-  private int checkAllSegmentsUpgraded(Directory dir) throws IOException {
+  private int checkAllSegmentsUpgraded(Directory dir, Version indexCreatedVersion) throws IOException {
     final SegmentInfos infos = SegmentInfos.readLatestCommit(dir);
     if (VERBOSE) {
       System.out.println("checkAllSegmentsUpgraded: " + infos);
@@ -1293,6 +1321,7 @@ public class TestBackwardsCompatibility extends LuceneTestCase {
       assertEquals(Version.LATEST, si.info.getVersion());
     }
     assertEquals(Version.LATEST, infos.getCommitLuceneVersion());
+    assertEquals(indexCreatedVersion, infos.getIndexCreatedVersion());
     return infos.size();
   }
   
@@ -1310,10 +1339,11 @@ public class TestBackwardsCompatibility extends LuceneTestCase {
         System.out.println("testUpgradeOldIndex: index=" +name);
       }
       Directory dir = newDirectory(oldIndexDirs.get(name));
+      Version indexCreatedVersion = SegmentInfos.readLatestCommit(dir).getIndexCreatedVersion();
 
       newIndexUpgrader(dir).upgrade();
 
-      checkAllSegmentsUpgraded(dir);
+      checkAllSegmentsUpgraded(dir, indexCreatedVersion);
       
       dir.close();
     }
@@ -1324,7 +1354,9 @@ public class TestBackwardsCompatibility extends LuceneTestCase {
     PrintStream savedSystemOut = System.out;
     System.setOut(new PrintStream(new ByteArrayOutputStream(), false, "UTF-8"));
     try {
-      for (String name : oldIndexDirs.keySet()) {
+      for (Map.Entry<String,Directory> entry : oldIndexDirs.entrySet()) {
+        String name = entry.getKey();
+        Version indexCreatedVersion = SegmentInfos.readLatestCommit(entry.getValue()).getIndexCreatedVersion();
         Path dir = createTempDir(name);
         TestUtil.unzip(getDataInputStream("index." + name + ".zip"), dir);
         
@@ -1360,7 +1392,7 @@ public class TestBackwardsCompatibility extends LuceneTestCase {
         
         Directory upgradedDir = newFSDirectory(dir);
         try {
-          checkAllSegmentsUpgraded(upgradedDir);
+          checkAllSegmentsUpgraded(upgradedDir, indexCreatedVersion);
         } finally {
           upgradedDir.close();
         }
@@ -1377,6 +1409,7 @@ public class TestBackwardsCompatibility extends LuceneTestCase {
       }
       Directory dir = newDirectory(oldIndexDirs.get(name));
       assertEquals("Original index must be single segment", 1, getNumberOfSegments(dir));
+      Version indexCreatedVersion = SegmentInfos.readLatestCommit(dir).getIndexCreatedVersion();
 
       // create a bunch of dummy segments
       int id = 40;
@@ -1418,7 +1451,7 @@ public class TestBackwardsCompatibility extends LuceneTestCase {
       assertEquals(1, DirectoryReader.listCommits(dir).size());
       newIndexUpgrader(dir).upgrade();
 
-      final int segCount = checkAllSegmentsUpgraded(dir);
+      final int segCount = checkAllSegmentsUpgraded(dir, indexCreatedVersion);
       assertEquals("Index must still contain the same number of segments, as only one segment was upgraded and nothing else merged",
         origSegCount, segCount);
       
@@ -1435,7 +1468,7 @@ public class TestBackwardsCompatibility extends LuceneTestCase {
 
     newIndexUpgrader(dir).upgrade();
 
-    checkAllSegmentsUpgraded(dir);
+    checkAllSegmentsUpgraded(dir, null);
     
     dir.close();
   }

http://git-wip-us.apache.org/repos/asf/lucene-solr/blob/d9c0f259/lucene/backward-codecs/src/test/org/apache/lucene/index/TestFixBrokenOffsets.java
----------------------------------------------------------------------
diff --git a/lucene/backward-codecs/src/test/org/apache/lucene/index/TestFixBrokenOffsets.java b/lucene/backward-codecs/src/test/org/apache/lucene/index/TestFixBrokenOffsets.java
index 4ecbd13..917785e 100644
--- a/lucene/backward-codecs/src/test/org/apache/lucene/index/TestFixBrokenOffsets.java
+++ b/lucene/backward-codecs/src/test/org/apache/lucene/index/TestFixBrokenOffsets.java
@@ -78,10 +78,11 @@ public class TestFixBrokenOffsets extends LuceneTestCase {
     MockDirectoryWrapper tmpDir = newMockDirectory();
     tmpDir.setCheckIndexOnClose(false);
     IndexWriter w = new IndexWriter(tmpDir, new IndexWriterConfig());
-    w.addIndexes(dir);
+    IndexWriter finalW = w;
+    IllegalArgumentException e = expectThrows(IllegalArgumentException.class, () -> finalW.addIndexes(dir));
+    assertTrue(e.getMessage(), e.getMessage().startsWith("Cannot use addIndexes(Directory) with indexes that have been created by a different Lucene version."));
     w.close();
-    // OK: addIndexes(Directory...) also keeps version as 6.3.0, so offsets not checked:
-    TestUtil.checkIndex(tmpDir);
+    // OK: addIndexes(Directory...) refuses to execute if the index creation version is different, so broken offsets are not carried over
     tmpDir.close();
 
     final MockDirectoryWrapper tmpDir2 = newMockDirectory();

http://git-wip-us.apache.org/repos/asf/lucene-solr/blob/d9c0f259/lucene/backward-codecs/src/test/org/apache/lucene/index/TestIndexWriterOnOldIndex.java
----------------------------------------------------------------------
diff --git a/lucene/backward-codecs/src/test/org/apache/lucene/index/TestIndexWriterOnOldIndex.java b/lucene/backward-codecs/src/test/org/apache/lucene/index/TestIndexWriterOnOldIndex.java
new file mode 100644
index 0000000..73d933a
--- /dev/null
+++ b/lucene/backward-codecs/src/test/org/apache/lucene/index/TestIndexWriterOnOldIndex.java
@@ -0,0 +1,55 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements.  See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License.  You may obtain a copy of the License at
+ *
+ *     http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+package org.apache.lucene.index;
+
+import java.io.IOException;
+import java.io.InputStream;
+import java.nio.file.Path;
+
+import org.apache.lucene.index.IndexWriterConfig.OpenMode;
+import org.apache.lucene.store.Directory;
+import org.apache.lucene.util.LuceneTestCase;
+import org.apache.lucene.util.TestUtil;
+import org.apache.lucene.util.Version;
+
+public class TestIndexWriterOnOldIndex extends LuceneTestCase {
+
+  public void testOpenModeAndCreatedVersion() throws IOException {
+    InputStream resource = getClass().getResourceAsStream("index.single-empty-doc.630.zip");
+    assertNotNull(resource);
+    Path path = createTempDir();
+    TestUtil.unzip(resource, path);
+    Directory dir = newFSDirectory(path);
+    for (OpenMode openMode : OpenMode.values()) {
+      Directory tmpDir = newDirectory(dir);
+      assertEquals(null /* 6.3.0 */, SegmentInfos.readLatestCommit(tmpDir).getIndexCreatedVersion());
+      IndexWriter w = new IndexWriter(tmpDir, newIndexWriterConfig().setOpenMode(openMode));
+      w.commit();
+      w.close();
+      switch (openMode) {
+        case CREATE:
+          assertEquals(Version.LATEST, SegmentInfos.readLatestCommit(tmpDir).getIndexCreatedVersion());
+          break;
+        default:
+          assertEquals(null /* 6.3.0 */, SegmentInfos.readLatestCommit(tmpDir).getIndexCreatedVersion());
+      }
+      tmpDir.close();
+    }
+    dir.close();
+  }
+
+}

http://git-wip-us.apache.org/repos/asf/lucene-solr/blob/d9c0f259/lucene/backward-codecs/src/test/org/apache/lucene/index/index.single-empty-doc.630.zip
----------------------------------------------------------------------
diff --git a/lucene/backward-codecs/src/test/org/apache/lucene/index/index.single-empty-doc.630.zip b/lucene/backward-codecs/src/test/org/apache/lucene/index/index.single-empty-doc.630.zip
new file mode 100644
index 0000000..1bf1d08
Binary files /dev/null and b/lucene/backward-codecs/src/test/org/apache/lucene/index/index.single-empty-doc.630.zip differ

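A short usage sketch of the new metadata, assuming dir is any Directory
holding a commit:

  SegmentInfos infos = SegmentInfos.readLatestCommit(dir);
  Version createdOn = infos.getIndexCreatedVersion();
  if (createdOn == null) {
    // the index was created before Lucene 7.0, so its creation version is unknown
  }

As the IndexWriter change below shows, addIndexes(Directory...) now compares
this value across indexes and rejects mixing creation versions.
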
http://git-wip-us.apache.org/repos/asf/lucene-solr/blob/d9c0f259/lucene/core/src/java/org/apache/lucene/index/IndexWriter.java
----------------------------------------------------------------------
diff --git a/lucene/core/src/java/org/apache/lucene/index/IndexWriter.java b/lucene/core/src/java/org/apache/lucene/index/IndexWriter.java
index cbf2ae2..da030ca 100644
--- a/lucene/core/src/java/org/apache/lucene/index/IndexWriter.java
+++ b/lucene/core/src/java/org/apache/lucene/index/IndexWriter.java
@@ -30,6 +30,7 @@ import java.util.LinkedList;
 import java.util.List;
 import java.util.Locale;
 import java.util.Map.Entry;
+import java.util.Objects;
 import java.util.Map;
 import java.util.Queue;
 import java.util.Set;
@@ -864,14 +865,13 @@ public class IndexWriter implements Closeable, TwoPhaseCommit, Accountable {
         // against an index that's currently open for
         // searching.  In this case we write the next
         // segments_N file with no segments:
-        SegmentInfos sis = null;
+        final SegmentInfos sis = new SegmentInfos(Version.LATEST);
         try {
-          sis = SegmentInfos.readLatestCommit(directory);
-          sis.clear();
+          final SegmentInfos previous = SegmentInfos.readLatestCommit(directory);
+          sis.updateGenerationVersionAndCounter(previous);
         } catch (IOException e) {
           // Likely this means it's a fresh directory
           initialIndexExists = false;
-          sis = new SegmentInfos();
         }
         
         segmentInfos = sis;
@@ -2624,6 +2624,9 @@ public class IndexWriter implements Closeable, TwoPhaseCommit, Accountable {
    *
    * <p>This requires this index not be among those to be added.
    *
+   * <p>All added indexes must have been created by the same
+   * Lucene version as this index.
+   *
    * @return The <a href="#sequence_number">sequence number</a>
    * for this operation
    *
@@ -2663,6 +2666,13 @@ public class IndexWriter implements Closeable, TwoPhaseCommit, Accountable {
           infoStream.message("IW", "addIndexes: process directory " + dir);
         }
         SegmentInfos sis = SegmentInfos.readLatestCommit(dir); // read infos from dir
+        if (Objects.equals(segmentInfos.getIndexCreatedVersion(), sis.getIndexCreatedVersion()) == false) {
+          throw new IllegalArgumentException("Cannot use addIndexes(Directory) with indexes that have been created "
+              + "by a different Lucene version. The current index was generated by "
+              + segmentInfos.getIndexCreatedVersion()
+              + " while one of the directories contains an index that was generated with "
+              + sis.getIndexCreatedVersion());
+        }
         totalMaxDoc += sis.totalMaxDoc();
         commits.add(sis);
       }
@@ -4600,7 +4610,7 @@ public class IndexWriter implements Closeable, TwoPhaseCommit, Accountable {
 
   // For infoStream output
   synchronized SegmentInfos toLiveInfos(SegmentInfos sis) {
-    final SegmentInfos newSIS = new SegmentInfos();
+    final SegmentInfos newSIS = new SegmentInfos(sis.getIndexCreatedVersion());
     final Map<SegmentCommitInfo,SegmentCommitInfo> liveSIS = new HashMap<>();
     for(SegmentCommitInfo info : segmentInfos) {
       liveSIS.put(info, info);

http://git-wip-us.apache.org/repos/asf/lucene-solr/blob/d9c0f259/lucene/core/src/java/org/apache/lucene/index/SegmentInfos.java
----------------------------------------------------------------------
diff --git a/lucene/core/src/java/org/apache/lucene/index/SegmentInfos.java b/lucene/core/src/java/org/apache/lucene/index/SegmentInfos.java
index aaa1d89..12305d0 100644
--- a/lucene/core/src/java/org/apache/lucene/index/SegmentInfos.java
+++ b/lucene/core/src/java/org/apache/lucene/index/SegmentInfos.java
@@ -124,8 +124,10 @@ public final class SegmentInfos implements Cloneable, Iterable<SegmentCommitInfo
 
   /** Adds the {@link Version} that committed this segments_N file, as well as the {@link Version} of the oldest segment, since 5.3+ */
   public static final int VERSION_53 = 6;
+  /** The version that added information about the Lucene version at index creation time. */
+  public static final int VERSION_70 = 7;
 
-  static final int VERSION_CURRENT = VERSION_53;
+  static final int VERSION_CURRENT = VERSION_70;
 
   /** Used to name new segments. */
   // TODO: should this be a long ...?
@@ -153,18 +155,22 @@ public final class SegmentInfos implements Cloneable, Iterable<SegmentCommitInfo
   /** Id for this commit; only written starting with Lucene 5.0 */
   private byte[] id;
 
-  /** Which Lucene version wrote this commit, or null if this commit is pre-5.3. */
+  /** Which Lucene version wrote this commit. */
   private Version luceneVersion;
 
   /** Version of the oldest segment in the index, or null if there are no segments. */
   private Version minSegmentLuceneVersion;
 
-  /** Sole constructor. Typically you call this and then
-   *  use {@link #readLatestCommit(Directory) or
-   *  #readCommit(Directory,String)} to populate each {@link
-   *  SegmentCommitInfo}.  Alternatively, you can add/remove your
-   *  own {@link SegmentCommitInfo}s. */
-  public SegmentInfos() {
+  /** The Lucene version that was used to create the index. */
+  private final Version indexCreatedVersion;
+
+  /** Sole constructor.
+   *  @param indexCreatedVersion the Lucene version at index creation time, or {@code null} if the index was created before 7.0 */
+  public SegmentInfos(Version indexCreatedVersion) {
+    if (indexCreatedVersion != null && indexCreatedVersion.onOrAfter(Version.LUCENE_7_0_0) == false) {
+      throw new IllegalArgumentException("indexCreatedVersion may only be non-null if the index was created on or after 7.0, got " + indexCreatedVersion);
+    }
+    this.indexCreatedVersion = indexCreatedVersion;
   }
 
   /** Returns {@link SegmentCommitInfo} at the provided
@@ -302,19 +308,38 @@ public final class SegmentInfos implements Cloneable, Iterable<SegmentCommitInfo
     input.readBytes(id, 0, id.length);
     CodecUtil.checkIndexHeaderSuffix(input, Long.toString(generation, Character.MAX_RADIX));
 
-    SegmentInfos infos = new SegmentInfos();
-    infos.id = id;
-    infos.generation = generation;
-    infos.lastGeneration = generation;
-    if (format >= VERSION_53) {
-      infos.luceneVersion = Version.fromBits(input.readVInt(), input.readVInt(), input.readVInt());
-      if (infos.luceneVersion.onOrAfter(Version.LUCENE_6_0_0) == false) {
-        throw new IndexFormatTooOldException(input, "this index is too old (version: " + infos.luceneVersion + ")");
+    Version luceneVersion = Version.fromBits(input.readVInt(), input.readVInt(), input.readVInt());
+    if (luceneVersion.onOrAfter(Version.LUCENE_6_0_0) == false) {
+      // TODO: should we check indexCreatedVersion instead?
+      throw new IndexFormatTooOldException(input, "this index is too old (version: " + luceneVersion + ")");
+    }
+
+    Version indexCreatedVersion;
+    if (format >= VERSION_70) {
+      byte b = input.readByte();
+      switch (b) {
+        case 0:
+          // version is not known: pre-7.0 index that has been modified since the 7.0 upgrade
+          indexCreatedVersion = null;
+          break;
+        case 1:
+          // version is known: index has been created on or after 7.0
+          indexCreatedVersion = Version.fromBits(input.readVInt(), input.readVInt(), input.readVInt());
+          break;
+        default:
+          throw new CorruptIndexException("Illegal byte value for a boolean: " + b + ", expected 0 or 1", input);
       }
     } else {
-      throw new IndexFormatTooOldException(input, "this index segments file is too old (segment infos format: " + format + ")");
+      // pre-7.0 index that has not been modified since the 7.0 upgrade
+      indexCreatedVersion = null;
     }
 
+    SegmentInfos infos = new SegmentInfos(indexCreatedVersion);
+    infos.id = id;
+    infos.generation = generation;
+    infos.lastGeneration = generation;
+    infos.luceneVersion = luceneVersion;
+
     infos.version = input.readLong();
     //System.out.println("READ sis version=" + infos.version);
     infos.counter = input.readInt();
@@ -470,6 +495,17 @@ public final class SegmentInfos implements Cloneable, Iterable<SegmentCommitInfo
     out.writeVInt(Version.LATEST.bugfix);
     //System.out.println(Thread.currentThread().getName() + ": now write " + out.getName() + " with version=" + version);
 
+    if (indexCreatedVersion != null) {
+      // 7.0+ index
+      out.writeByte((byte) 1);
+      out.writeVInt(indexCreatedVersion.major);
+      out.writeVInt(indexCreatedVersion.minor);
+      out.writeVInt(indexCreatedVersion.bugfix);
+    } else {
+      // pre-7.0 index
+      out.writeByte((byte) 0);
+    }
+
     out.writeLong(version); 
     out.writeInt(counter); // write counter
     out.writeInt(size());
@@ -1001,4 +1037,11 @@ public final class SegmentInfos implements Cloneable, Iterable<SegmentCommitInfo
   public Version getMinSegmentLuceneVersion() {
     return minSegmentLuceneVersion;
   }
+
+  /** Return the version that was used to initially create the index. This
+   *  version is set when the index is first created and then never changes.
+   *  This returns {@code null} if the index was created before 7.0. */
+  public Version getIndexCreatedVersion() {
+    return indexCreatedVersion;
+  }
 }
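
A minimal usage sketch of the new accessor, separate from the patch above. The
index path is an assumption; a null result means the index was created before
7.0, as the javadoc states.

import java.nio.file.Paths;

import org.apache.lucene.index.SegmentInfos;
import org.apache.lucene.store.Directory;
import org.apache.lucene.store.FSDirectory;
import org.apache.lucene.util.Version;

public class ShowIndexCreatedVersion {
  public static void main(String[] args) throws Exception {
    // Path is illustrative only
    try (Directory dir = FSDirectory.open(Paths.get("/tmp/idx"))) {
      // Read the latest segments_N and ask for the recorded creation version
      Version created = SegmentInfos.readLatestCommit(dir).getIndexCreatedVersion();
      System.out.println(created == null ? "created before 7.0" : created.toString());
    }
  }
}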

http://git-wip-us.apache.org/repos/asf/lucene-solr/blob/d9c0f259/lucene/core/src/test/org/apache/lucene/index/TestIndexWriter.java
----------------------------------------------------------------------
diff --git a/lucene/core/src/test/org/apache/lucene/index/TestIndexWriter.java b/lucene/core/src/test/org/apache/lucene/index/TestIndexWriter.java
index e4f0ab0..d153ac3 100644
--- a/lucene/core/src/test/org/apache/lucene/index/TestIndexWriter.java
+++ b/lucene/core/src/test/org/apache/lucene/index/TestIndexWriter.java
@@ -94,6 +94,7 @@ import org.apache.lucene.util.SetOnce;
 import org.apache.lucene.util.StringHelper;
 import org.apache.lucene.util.TestUtil;
 import org.apache.lucene.util.ThreadInterruptedException;
+import org.apache.lucene.util.Version;
 import org.apache.lucene.util.automaton.Automata;
 import org.apache.lucene.util.automaton.Automaton;
 import org.apache.lucene.util.automaton.CharacterRunAutomaton;
@@ -2799,5 +2800,13 @@ public class TestIndexWriter extends LuceneTestCase {
     dir.close();
   }
 
-}
+  public void testRecordsIndexCreatedVersion() throws IOException {
+    Directory dir = newDirectory();
+    IndexWriter w = new IndexWriter(dir, newIndexWriterConfig());
+    w.commit();
+    w.close();
+    assertEquals(Version.LATEST, SegmentInfos.readLatestCommit(dir).getIndexCreatedVersion());
+    dir.close();
+  }
 
+}

http://git-wip-us.apache.org/repos/asf/lucene-solr/blob/d9c0f259/lucene/core/src/test/org/apache/lucene/index/TestSegmentInfos.java
----------------------------------------------------------------------
diff --git a/lucene/core/src/test/org/apache/lucene/index/TestSegmentInfos.java b/lucene/core/src/test/org/apache/lucene/index/TestSegmentInfos.java
index feea111..7552786 100644
--- a/lucene/core/src/test/org/apache/lucene/index/TestSegmentInfos.java
+++ b/lucene/core/src/test/org/apache/lucene/index/TestSegmentInfos.java
@@ -29,9 +29,14 @@ import java.util.Collections;
 
 public class TestSegmentInfos extends LuceneTestCase {
 
+  public void testIllegalCreatedVersion() {
+    IllegalArgumentException e = expectThrows(IllegalArgumentException.class, () -> new SegmentInfos(Version.LUCENE_6_5_0));
+    assertEquals("indexCreatedVersion may only be non-null if the index was created on or after 7.0, got 6.5.0", e.getMessage());
+  }
+
   // LUCENE-5954
   public void testVersionsNoSegments() throws IOException {
-    SegmentInfos sis = new SegmentInfos();
+    SegmentInfos sis = new SegmentInfos(Version.LATEST);
     BaseDirectoryWrapper dir = newDirectory();
     dir.setCheckIndexOnClose(false);
     sis.commit(dir);
@@ -48,7 +53,7 @@ public class TestSegmentInfos extends LuceneTestCase {
     byte id[] = StringHelper.randomId();
     Codec codec = Codec.getDefault();
 
-    SegmentInfos sis = new SegmentInfos();
+    SegmentInfos sis = new SegmentInfos(Version.LATEST);
     SegmentInfo info = new SegmentInfo(dir, Version.LUCENE_6_0_0, "_0", 1, false, Codec.getDefault(), 
                                        Collections.<String,String>emptyMap(), id, Collections.<String,String>emptyMap(), null);
     info.setFiles(Collections.<String>emptySet());
@@ -70,7 +75,7 @@ public class TestSegmentInfos extends LuceneTestCase {
     byte id[] = StringHelper.randomId();
     Codec codec = Codec.getDefault();
 
-    SegmentInfos sis = new SegmentInfos();
+    SegmentInfos sis = new SegmentInfos(Version.LATEST);
     SegmentInfo info = new SegmentInfo(dir, Version.LUCENE_6_0_0, "_0", 1, false, Codec.getDefault(), 
                                        Collections.<String,String>emptyMap(), id, Collections.<String,String>emptyMap(), null);
     info.setFiles(Collections.<String>emptySet());

http://git-wip-us.apache.org/repos/asf/lucene-solr/blob/d9c0f259/lucene/misc/src/java/org/apache/lucene/index/IndexSplitter.java
----------------------------------------------------------------------
diff --git a/lucene/misc/src/java/org/apache/lucene/index/IndexSplitter.java b/lucene/misc/src/java/org/apache/lucene/index/IndexSplitter.java
index 368c285..a3d720d 100644
--- a/lucene/misc/src/java/org/apache/lucene/index/IndexSplitter.java
+++ b/lucene/misc/src/java/org/apache/lucene/index/IndexSplitter.java
@@ -49,7 +49,7 @@ import org.apache.lucene.util.SuppressForbidden;
  * careful!
  */
 public class IndexSplitter {
-  public SegmentInfos infos;
+  public final SegmentInfos infos;
 
   FSDirectory fsDir;
 
@@ -133,7 +133,7 @@ public class IndexSplitter {
   public void split(Path destDir, String[] segs) throws IOException {
     Files.createDirectories(destDir);
     FSDirectory destFSDir = FSDirectory.open(destDir);
-    SegmentInfos destInfos = new SegmentInfos();
+    SegmentInfos destInfos = new SegmentInfos(infos.getIndexCreatedVersion());
     destInfos.counter = infos.counter;
     for (String n : segs) {
       SegmentCommitInfo infoPerCommit = getInfo(n);

http://git-wip-us.apache.org/repos/asf/lucene-solr/blob/d9c0f259/lucene/replicator/src/java/org/apache/lucene/replicator/nrt/ReplicaNode.java
----------------------------------------------------------------------
diff --git a/lucene/replicator/src/java/org/apache/lucene/replicator/nrt/ReplicaNode.java b/lucene/replicator/src/java/org/apache/lucene/replicator/nrt/ReplicaNode.java
index ce9c3ce..5319956 100644
--- a/lucene/replicator/src/java/org/apache/lucene/replicator/nrt/ReplicaNode.java
+++ b/lucene/replicator/src/java/org/apache/lucene/replicator/nrt/ReplicaNode.java
@@ -50,6 +50,7 @@ import org.apache.lucene.store.IOContext;
 import org.apache.lucene.store.IndexOutput;
 import org.apache.lucene.store.Lock;
 import org.apache.lucene.util.IOUtils;
+import org.apache.lucene.util.Version;
 
 /** Replica node, that pulls index changes from the primary node by copying newly flushed or merged index files.
  * 
@@ -138,7 +139,7 @@ public abstract class ReplicaNode extends Node {
       SegmentInfos infos;
       if (segmentsFileName == null) {
         // No index here yet:
-        infos = new SegmentInfos();
+        infos = new SegmentInfos(Version.LATEST);
         message("top: init: no segments in index");
       } else {
         message("top: init: read existing segments commit " + segmentsFileName);


[02/50] [abbrv] lucene-solr:jira/solr-9858: SOLR-10143: PointFields will create IndexOrDocValuesQuery when a field is both indexed=true and docValues=true

Posted by ab...@apache.org.
SOLR-10143: PointFields will create IndexOrDocValuesQuery when a field is both indexed=true and docValues=true


Project: http://git-wip-us.apache.org/repos/asf/lucene-solr/repo
Commit: http://git-wip-us.apache.org/repos/asf/lucene-solr/commit/21690f5e
Tree: http://git-wip-us.apache.org/repos/asf/lucene-solr/tree/21690f5e
Diff: http://git-wip-us.apache.org/repos/asf/lucene-solr/diff/21690f5e

Branch: refs/heads/jira/solr-9858
Commit: 21690f5e126e1be0baf70cd3af2d570a18cd712d
Parents: 6ddf369
Author: Tomas Fernandez Lobbe <tf...@apache.org>
Authored: Wed Feb 22 10:28:53 2017 -0800
Committer: Tomas Fernandez Lobbe <tf...@apache.org>
Committed: Wed Feb 22 10:31:12 2017 -0800

----------------------------------------------------------------------
 .../java/org/apache/solr/schema/PointField.java |  9 +++
 .../test-files/solr/collection1/conf/schema.xml |  3 +-
 .../org/apache/solr/schema/PolyFieldTest.java   | 14 +++--
 .../org/apache/solr/schema/TestPointFields.java | 63 +++++++++++++++++++-
 .../solr/search/TestMaxScoreQueryParser.java    |  3 +-
 5 files changed, 83 insertions(+), 9 deletions(-)
----------------------------------------------------------------------


http://git-wip-us.apache.org/repos/asf/lucene-solr/blob/21690f5e/solr/core/src/java/org/apache/solr/schema/PointField.java
----------------------------------------------------------------------
diff --git a/solr/core/src/java/org/apache/solr/schema/PointField.java b/solr/core/src/java/org/apache/solr/schema/PointField.java
index 1168386..8746dac 100644
--- a/solr/core/src/java/org/apache/solr/schema/PointField.java
+++ b/solr/core/src/java/org/apache/solr/schema/PointField.java
@@ -28,6 +28,7 @@ import org.apache.lucene.document.SortedNumericDocValuesField;
 import org.apache.lucene.document.StoredField;
 import org.apache.lucene.index.IndexableField;
 import org.apache.lucene.queries.function.ValueSource;
+import org.apache.lucene.search.IndexOrDocValuesQuery;
 import org.apache.lucene.search.Query;
 import org.apache.lucene.search.SortedNumericSelector;
 import org.apache.lucene.util.BytesRef;
@@ -117,6 +118,10 @@ public abstract class PointField extends NumericFieldType {
     if (!field.indexed() && field.hasDocValues()) {
       // currently implemented as singleton range
       return getRangeQuery(parser, field, externalVal, externalVal, true, true);
+    } else if (field.indexed() && field.hasDocValues()) {
+      Query pointsQuery = getExactQuery(field, externalVal);
+      Query dvQuery = getDocValuesRangeQuery(parser, field, externalVal, externalVal, true, true);
+      return new IndexOrDocValuesQuery(pointsQuery, dvQuery);
     } else {
       return getExactQuery(field, externalVal);
     }
@@ -132,6 +137,10 @@ public abstract class PointField extends NumericFieldType {
       boolean maxInclusive) {
     if (!field.indexed() && field.hasDocValues()) {
       return getDocValuesRangeQuery(parser, field, min, max, minInclusive, maxInclusive);
+    } else if (field.indexed() && field.hasDocValues()) {
+      Query pointsQuery = getPointRangeQuery(parser, field, min, max, minInclusive, maxInclusive);
+      Query dvQuery = getDocValuesRangeQuery(parser, field, min, max, minInclusive, maxInclusive);
+      return new IndexOrDocValuesQuery(pointsQuery, dvQuery);
     } else {
       return getPointRangeQuery(parser, field, min, max, minInclusive, maxInclusive);
     }
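
A sketch of the query shape PointField now builds when a numeric field has both
indexed=true and docValues=true, written against the Lucene-level factories.
The field name "price", the bounds, and the SortedNumericDocValuesField factory
are assumptions for illustration.

import org.apache.lucene.document.IntPoint;
import org.apache.lucene.document.SortedNumericDocValuesField;
import org.apache.lucene.search.IndexOrDocValuesQuery;
import org.apache.lucene.search.Query;

public class RangeQueryShape {
  static Query priceRange() {
    // Points query: fast when the range query leads iteration
    Query pointsQuery = IntPoint.newRangeQuery("price", 0, 10);
    // Doc-values query: cheaper when only verifying a sparse candidate set
    Query dvQuery = SortedNumericDocValuesField.newRangeQuery("price", 0, 10);
    // The wrapper lets the searcher pick whichever side costs less per segment
    return new IndexOrDocValuesQuery(pointsQuery, dvQuery);
  }
}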

http://git-wip-us.apache.org/repos/asf/lucene-solr/blob/21690f5e/solr/core/src/test-files/solr/collection1/conf/schema.xml
----------------------------------------------------------------------
diff --git a/solr/core/src/test-files/solr/collection1/conf/schema.xml b/solr/core/src/test-files/solr/collection1/conf/schema.xml
index ef7fc8d..c53be9b 100644
--- a/solr/core/src/test-files/solr/collection1/conf/schema.xml
+++ b/solr/core/src/test-files/solr/collection1/conf/schema.xml
@@ -395,7 +395,7 @@
   <fieldType name="x" class="solr.PointType" dimension="1" subFieldType="double"/>
   <fieldType name="tenD" class="solr.PointType" dimension="10" subFieldType="double"/>
   <!-- Use the sub field suffix -->
-  <fieldType name="xyd" class="solr.PointType" dimension="2" subFieldSuffix="_d1"/>
+  <fieldType name="xyd" class="solr.PointType" dimension="2" subFieldSuffix="_d1_ndv"/>
   <fieldType name="geohash" class="solr.GeoHashField"/>
 
 
@@ -620,6 +620,7 @@
   <dynamicField name="*_f1" type="${solr.tests.floatClass:pfloat}" indexed="true" stored="true" multiValued="false"/>
   <dynamicField name="*_d" type="${solr.tests.doubleClass:pdouble}" indexed="true" stored="true"/>
   <dynamicField name="*_d1" type="${solr.tests.doubleClass:pdouble}" indexed="true" stored="true" multiValued="false"/>
+  <dynamicField name="*_d1_ndv" type="${solr.tests.doubleClass:pdouble}" indexed="true" docValues="false" stored="true" multiValued="false"/>
   <dynamicField name="*_dt" type="date" indexed="true" stored="true"/>
   <dynamicField name="*_dt1" type="date" indexed="true" stored="true" multiValued="false"/>
 

http://git-wip-us.apache.org/repos/asf/lucene-solr/blob/21690f5e/solr/core/src/test/org/apache/solr/schema/PolyFieldTest.java
----------------------------------------------------------------------
diff --git a/solr/core/src/test/org/apache/solr/schema/PolyFieldTest.java b/solr/core/src/test/org/apache/solr/schema/PolyFieldTest.java
index 56eb7e0..f788ba0 100644
--- a/solr/core/src/test/org/apache/solr/schema/PolyFieldTest.java
+++ b/solr/core/src/test/org/apache/solr/schema/PolyFieldTest.java
@@ -111,13 +111,17 @@ public class PolyFieldTest extends SolrTestCaseJ4 {
       //
     }
 
-    //
+    
     SchemaField s1 = schema.getField("test_p");
     SchemaField s2 = schema.getField("test_p");
-    ValueSource v1 = s1.getType().getValueSource(s1, null);
-    ValueSource v2 = s2.getType().getValueSource(s2, null);
-    assertEquals(v1, v2);
-    assertEquals(v1.hashCode(), v2.hashCode());
+    // If we use [Int/Double/Long/Float]PointField, we can't get the valueSource, since docValues is false
+    if (s1.createFields("1,2", 0).get(0).fieldType().pointDimensionCount() == 0) {
+      assertFalse(s2.getType().isPointField());
+      ValueSource v1 = s1.getType().getValueSource(s1, null);
+      ValueSource v2 = s2.getType().getValueSource(s2, null);
+      assertEquals(v1, v2);
+      assertEquals(v1.hashCode(), v2.hashCode());
+    }
   }
 
   @Test

http://git-wip-us.apache.org/repos/asf/lucene-solr/blob/21690f5e/solr/core/src/test/org/apache/solr/schema/TestPointFields.java
----------------------------------------------------------------------
diff --git a/solr/core/src/test/org/apache/solr/schema/TestPointFields.java b/solr/core/src/test/org/apache/solr/schema/TestPointFields.java
index bf34ce4..b3d0b97 100644
--- a/solr/core/src/test/org/apache/solr/schema/TestPointFields.java
+++ b/solr/core/src/test/org/apache/solr/schema/TestPointFields.java
@@ -21,6 +21,8 @@ import java.util.Locale;
 import java.util.Set;
 import java.util.TreeSet;
 
+import org.apache.lucene.search.IndexOrDocValuesQuery;
+import org.apache.lucene.search.PointRangeQuery;
 import org.apache.solr.SolrTestCaseJ4;
 import org.apache.solr.common.SolrException;
 import org.junit.After;
@@ -71,6 +73,7 @@ public class TestPointFields extends SolrTestCaseJ4 {
   public void testIntPointFieldRangeQuery() throws Exception {
     doTestIntPointFieldRangeQuery("number_p_i", "int", false);
     doTestIntPointFieldRangeQuery("number_p_i_ni_ns_dv", "int", false);
+    doTestIntPointFieldRangeQuery("number_p_i_dv", "int", false);
   }
   
   @Test
@@ -120,6 +123,7 @@ public class TestPointFields extends SolrTestCaseJ4 {
   public void testIntPointFieldMultiValuedRangeQuery() throws Exception {
     testPointFieldMultiValuedRangeQuery("number_p_i_mv", "int", getSequentialStringArrayWithInts(20));
     testPointFieldMultiValuedRangeQuery("number_p_i_ni_mv_dv", "int", getSequentialStringArrayWithInts(20));
+    testPointFieldMultiValuedRangeQuery("number_p_i_mv_dv", "int", getSequentialStringArrayWithInts(20));
   }
   
   //TODO MV SORT?
@@ -198,6 +202,7 @@ public class TestPointFields extends SolrTestCaseJ4 {
   public void testDoublePointFieldRangeQuery() throws Exception {
     doTestFloatPointFieldRangeQuery("number_p_d", "double", true);
     doTestFloatPointFieldRangeQuery("number_p_d_ni_ns_dv", "double", true);
+    doTestFloatPointFieldRangeQuery("number_p_d_dv", "double", true);
   }
   
   @Test
@@ -249,6 +254,7 @@ public class TestPointFields extends SolrTestCaseJ4 {
   public void testDoublePointFieldMultiValuedRangeQuery() throws Exception {
     testPointFieldMultiValuedRangeQuery("number_p_d_mv", "double", getSequentialStringArrayWithDoubles(20));
     testPointFieldMultiValuedRangeQuery("number_p_d_ni_mv_dv", "double", getSequentialStringArrayWithDoubles(20));
+    testPointFieldMultiValuedRangeQuery("number_p_d_mv_dv", "double", getSequentialStringArrayWithDoubles(20));
   }
   
   @Test
@@ -360,6 +366,7 @@ public class TestPointFields extends SolrTestCaseJ4 {
   public void testFloatPointFieldRangeQuery() throws Exception {
     doTestFloatPointFieldRangeQuery("number_p_f", "float", false);
     doTestFloatPointFieldRangeQuery("number_p_f_ni_ns_dv", "float", false);
+    doTestFloatPointFieldRangeQuery("number_p_f_dv", "float", false);
   }
   
   @Test
@@ -411,6 +418,7 @@ public class TestPointFields extends SolrTestCaseJ4 {
   public void testFloatPointFieldMultiValuedRangeQuery() throws Exception {
     testPointFieldMultiValuedRangeQuery("number_p_f_mv", "float", getSequentialStringArrayWithDoubles(20));
     testPointFieldMultiValuedRangeQuery("number_p_f_ni_mv_dv", "float", getSequentialStringArrayWithDoubles(20));
+    testPointFieldMultiValuedRangeQuery("number_p_f_mv_dv", "float", getSequentialStringArrayWithDoubles(20));
   }
   
   @Test
@@ -481,6 +489,7 @@ public class TestPointFields extends SolrTestCaseJ4 {
   public void testLongPointFieldRangeQuery() throws Exception {
     doTestIntPointFieldRangeQuery("number_p_l", "long", true);
     doTestIntPointFieldRangeQuery("number_p_l_ni_ns_dv", "long", true);
+    doTestIntPointFieldRangeQuery("number_p_l_dv", "long", true);
   }
   
   @Test
@@ -533,6 +542,7 @@ public class TestPointFields extends SolrTestCaseJ4 {
   public void testLongPointFieldMultiValuedRangeQuery() throws Exception {
     testPointFieldMultiValuedRangeQuery("number_p_l_mv", "long", getSequentialStringArrayWithInts(20));
     testPointFieldMultiValuedRangeQuery("number_p_l_ni_mv_dv", "long", getSequentialStringArrayWithInts(20));
+    testPointFieldMultiValuedRangeQuery("number_p_l_mv_dv", "long", getSequentialStringArrayWithInts(20));
   }
   
   @Test
@@ -578,6 +588,27 @@ public class TestPointFields extends SolrTestCaseJ4 {
     doTestSetQueries("number_p_l_ni_dv", getRandomStringArrayWithLongs(10, false), false);
   }
   
+  @Test
+  public void testIndexOrDocValuesQuery() throws Exception {
+    String[] fieldTypeNames = new String[]{"_p_i", "_p_l", "_p_d", "_p_f"};
+    FieldType[] fieldTypes = new FieldType[]{new IntPointField(), new LongPointField(), new DoublePointField(), new FloatPointField()};
+    assert fieldTypeNames.length == fieldTypes.length;
+    for (int i = 0; i < fieldTypeNames.length; i++) {
+      SchemaField fieldIndexed = h.getCore().getLatestSchema().getField("foo_" + fieldTypeNames[i]);
+      SchemaField fieldIndexedAndDv = h.getCore().getLatestSchema().getField("foo_" + fieldTypeNames[i] + "_dv");
+      SchemaField fieldIndexedMv = h.getCore().getLatestSchema().getField("foo_" + fieldTypeNames[i] + "_mv");
+      SchemaField fieldIndexedAndDvMv = h.getCore().getLatestSchema().getField("foo_" + fieldTypeNames[i] + "_mv_dv");
+      assertTrue(fieldTypes[i].getRangeQuery(null, fieldIndexed, "0", "10", true, true) instanceof PointRangeQuery);
+      assertTrue(fieldTypes[i].getRangeQuery(null, fieldIndexedAndDv, "0", "10", true, true) instanceof IndexOrDocValuesQuery);
+      assertTrue(fieldTypes[i].getRangeQuery(null, fieldIndexedMv, "0", "10", true, true) instanceof PointRangeQuery);
+      assertTrue(fieldTypes[i].getRangeQuery(null, fieldIndexedAndDvMv, "0", "10", true, true) instanceof IndexOrDocValuesQuery);
+      assertTrue(fieldTypes[i].getFieldQuery(null, fieldIndexed, "0") instanceof PointRangeQuery);
+      assertTrue(fieldTypes[i].getFieldQuery(null, fieldIndexedAndDv, "0") instanceof IndexOrDocValuesQuery);
+      assertTrue(fieldTypes[i].getFieldQuery(null, fieldIndexedMv, "0") instanceof PointRangeQuery);
+      assertTrue(fieldTypes[i].getFieldQuery(null, fieldIndexedAndDvMv, "0") instanceof IndexOrDocValuesQuery);
+    }
+  }
+  
   // Helper methods
   
   private String[] getRandomStringArrayWithDoubles(int length, boolean sorted) {
@@ -803,14 +834,29 @@ public class TestPointFields extends SolrTestCaseJ4 {
         "//result/doc[1]/" + type + "[@name='" + fieldName + "'][.='0']",
         "//result/doc[10]/" + type + "[@name='" + fieldName + "'][.='9']");
     
+    assertQ(req("q", fieldName + ":[0 TO 1] OR " + fieldName + ":[8 TO 9]" , "fl", "id, " + fieldName), 
+        "//*[@numFound='4']",
+        "//result/doc[1]/" + type + "[@name='" + fieldName + "'][.='0']",
+        "//result/doc[2]/" + type + "[@name='" + fieldName + "'][.='1']",
+        "//result/doc[3]/" + type + "[@name='" + fieldName + "'][.='8']",
+        "//result/doc[4]/" + type + "[@name='" + fieldName + "'][.='9']");
+    
+    assertQ(req("q", fieldName + ":[0 TO 1] AND " + fieldName + ":[1 TO 2]" , "fl", "id, " + fieldName), 
+        "//*[@numFound='1']",
+        "//result/doc[1]/" + type + "[@name='" + fieldName + "'][.='1']");
+    
+    assertQ(req("q", fieldName + ":[0 TO 1] AND NOT " + fieldName + ":[1 TO 2]" , "fl", "id, " + fieldName), 
+        "//*[@numFound='1']",
+        "//result/doc[1]/" + type + "[@name='" + fieldName + "'][.='0']");
+
     clearIndex();
     assertU(commit());
     
     String[] arr;
     if (testLong) {
-      arr = getRandomStringArrayWithLongs(10, true);
+      arr = getRandomStringArrayWithLongs(100, true);
     } else {
-      arr = getRandomStringArrayWithInts(10, true);
+      arr = getRandomStringArrayWithInts(100, true);
     }
     for (int i = 0; i < arr.length; i++) {
       assertU(adoc("id", String.valueOf(i), fieldName, arr[i]));
@@ -821,6 +867,8 @@ public class TestPointFields extends SolrTestCaseJ4 {
           "//*[@numFound='" + (i + 1) + "']");
       assertQ(req("q", fieldName + ":{" + arr[0] + " TO " + arr[i] + "}", "fl", "id, " + fieldName), 
           "//*[@numFound='" + (Math.max(0,  i-1)) + "']");
+      assertQ(req("q", fieldName + ":[" + arr[0] + " TO " + arr[i] + "] AND " + fieldName + ":" + arr[0].replace("-", "\\-"), "fl", "id, " + fieldName), 
+          "//*[@numFound='1']");
     }
   }
   
@@ -1092,6 +1140,17 @@ public class TestPointFields extends SolrTestCaseJ4 {
         "//*[@numFound='10']",
         "//result/doc[1]/arr[@name='" + fieldName + "']/" + type + "[1][.='" + numbers[0] + "']",
         "//result/doc[10]/arr[@name='" + fieldName + "']/" + type + "[1][.='" + numbers[9] + "']");
+    
+    assertQ(req("q", fieldName + ":[0 TO 1] OR " + fieldName + ":[8 TO 9]", "fl", "id, " + fieldName), 
+        "//*[@numFound='4']",
+        "//result/doc[1]/arr[@name='" + fieldName + "']/" + type + "[1][.='" + numbers[0] + "']",
+        "//result/doc[2]/arr[@name='" + fieldName + "']/" + type + "[1][.='" + numbers[1] + "']",
+        "//result/doc[3]/arr[@name='" + fieldName + "']/" + type + "[1][.='" + numbers[8] + "']",
+        "//result/doc[4]/arr[@name='" + fieldName + "']/" + type + "[1][.='" + numbers[9] + "']");
+    
+    assertQ(req("q", fieldName + ":[0 TO 0] AND " + fieldName + ":[10 TO 10]", "fl", "id, " + fieldName), 
+        "//*[@numFound='1']",
+        "//result/doc[1]/arr[@name='" + fieldName + "']/" + type + "[1][.='" + numbers[0] + "']");
   }
 
   private void testPointFieldMultiValuedFacetField(String nonDocValuesField, String dvFieldName, String[] numbers) throws Exception {

http://git-wip-us.apache.org/repos/asf/lucene-solr/blob/21690f5e/solr/core/src/test/org/apache/solr/search/TestMaxScoreQueryParser.java
----------------------------------------------------------------------
diff --git a/solr/core/src/test/org/apache/solr/search/TestMaxScoreQueryParser.java b/solr/core/src/test/org/apache/solr/search/TestMaxScoreQueryParser.java
index 610e998..4699a66 100644
--- a/solr/core/src/test/org/apache/solr/search/TestMaxScoreQueryParser.java
+++ b/solr/core/src/test/org/apache/solr/search/TestMaxScoreQueryParser.java
@@ -46,7 +46,8 @@ public class TestMaxScoreQueryParser extends AbstractSolrTestCase {
     assertEquals(new BoostQuery(new TermQuery(new Term("text", "foo")), 3f), q);
 
     q = parse("price:[0 TO 10]");
-    assertTrue(q instanceof LegacyNumericRangeQuery || q instanceof PointRangeQuery);
+    assertTrue(q instanceof LegacyNumericRangeQuery 
+        || (q instanceof IndexOrDocValuesQuery && ((IndexOrDocValuesQuery)q).getIndexQuery() instanceof PointRangeQuery));
   }
 
   @Test


[11/50] [abbrv] lucene-solr:jira/solr-9858: SOLR-10021: Cannot reload a core if it fails initialization.

Posted by ab...@apache.org.
SOLR-10021: Cannot reload a core if it fails initialization.


Project: http://git-wip-us.apache.org/repos/asf/lucene-solr/repo
Commit: http://git-wip-us.apache.org/repos/asf/lucene-solr/commit/8367e159
Tree: http://git-wip-us.apache.org/repos/asf/lucene-solr/tree/8367e159
Diff: http://git-wip-us.apache.org/repos/asf/lucene-solr/diff/8367e159

Branch: refs/heads/jira/solr-9858
Commit: 8367e159e4a287a34adf6552a5aecfe3b8073d8e
Parents: c53b7c3
Author: Erick Erickson <er...@apache.org>
Authored: Wed Feb 22 17:46:36 2017 -0800
Committer: Erick Erickson <er...@apache.org>
Committed: Wed Feb 22 17:46:36 2017 -0800

----------------------------------------------------------------------
 solr/CHANGES.txt                                |  2 +
 .../org/apache/solr/core/CoreContainer.java     | 44 +++++++++++---------
 .../solr/handler/admin/CoreAdminOperation.java  |  6 +--
 .../handler/admin/CoreAdminHandlerTest.java     |  5 ++-
 .../client/solrj/request/TestCoreAdmin.java     | 39 +++++++++++++++++
 5 files changed, 69 insertions(+), 27 deletions(-)
----------------------------------------------------------------------


http://git-wip-us.apache.org/repos/asf/lucene-solr/blob/8367e159/solr/CHANGES.txt
----------------------------------------------------------------------
diff --git a/solr/CHANGES.txt b/solr/CHANGES.txt
index ed30d53..fc5bfe1 100644
--- a/solr/CHANGES.txt
+++ b/solr/CHANGES.txt
@@ -244,6 +244,8 @@ Other Changes
 * SOLR-9848: Lower solr.cloud.wait-for-updates-with-stale-state-pause back down from 7 seconds. 
   (Mark Miller) 
 
+* SOLR-10021: Cannot reload a core if it fails initialization. (Mike Drob via Erick Erickson)
+
 ==================  6.4.2 ==================
 
 Consult the LUCENE_CHANGES.txt file for additional, low level, changes in this release.

http://git-wip-us.apache.org/repos/asf/lucene-solr/blob/8367e159/solr/core/src/java/org/apache/solr/core/CoreContainer.java
----------------------------------------------------------------------
diff --git a/solr/core/src/java/org/apache/solr/core/CoreContainer.java b/solr/core/src/java/org/apache/solr/core/CoreContainer.java
index f7a8f33..e3977d7 100644
--- a/solr/core/src/java/org/apache/solr/core/CoreContainer.java
+++ b/solr/core/src/java/org/apache/solr/core/CoreContainer.java
@@ -1116,28 +1116,32 @@ public class CoreContainer {
    * @param name the name of the SolrCore to reload
    */
   public void reload(String name) {
-
     SolrCore core = solrCores.getCoreFromAnyList(name, false);
-    if (core == null)
-      throw new SolrException( SolrException.ErrorCode.BAD_REQUEST, "No such core: " + name );
-
-    CoreDescriptor cd = core.getCoreDescriptor();
-    try {
-      solrCores.waitAddPendingCoreOps(name);
-      ConfigSet coreConfig = coreConfigService.getConfig(cd);
-      log.info("Reloading SolrCore '{}' using configuration from {}", cd.getName(), coreConfig.getName());
-      SolrCore newCore = core.reload(coreConfig);
-      registerCore(name, newCore, false, false);
-    } catch (SolrCoreState.CoreIsClosedException e) {
-      throw e;
-    } catch (Exception e) {
-      coreInitFailures.put(cd.getName(), new CoreLoadFailure(cd, e));
-      throw new SolrException(ErrorCode.SERVER_ERROR, "Unable to reload core [" + cd.getName() + "]", e);
-    }
-    finally {
-      solrCores.removeFromPendingOps(name);
+    if (core != null) {
+      CoreDescriptor cd = core.getCoreDescriptor();
+      try {
+        solrCores.waitAddPendingCoreOps(cd.getName());
+        ConfigSet coreConfig = coreConfigService.getConfig(cd);
+        log.info("Reloading SolrCore '{}' using configuration from {}", cd.getName(), coreConfig.getName());
+        SolrCore newCore = core.reload(coreConfig);
+        registerCore(cd.getName(), newCore, false, false);
+      } catch (SolrCoreState.CoreIsClosedException e) {
+        throw e;
+      } catch (Exception e) {
+        coreInitFailures.put(cd.getName(), new CoreLoadFailure(cd, e));
+        throw new SolrException(ErrorCode.SERVER_ERROR, "Unable to reload core [" + cd.getName() + "]", e);
+      }
+      finally {
+        solrCores.removeFromPendingOps(cd.getName());
+      }
+    } else {
+      CoreLoadFailure clf = coreInitFailures.get(name);
+      if (clf != null) {
+        create(clf.cd, true, false);
+      } else {
+        throw new SolrException(SolrException.ErrorCode.BAD_REQUEST, "No such core: " + name );
+      }
     }
-
   }
 
   /**

http://git-wip-us.apache.org/repos/asf/lucene-solr/blob/8367e159/solr/core/src/java/org/apache/solr/handler/admin/CoreAdminOperation.java
----------------------------------------------------------------------
diff --git a/solr/core/src/java/org/apache/solr/handler/admin/CoreAdminOperation.java b/solr/core/src/java/org/apache/solr/handler/admin/CoreAdminOperation.java
index a5782db..e712407 100644
--- a/solr/core/src/java/org/apache/solr/handler/admin/CoreAdminOperation.java
+++ b/solr/core/src/java/org/apache/solr/handler/admin/CoreAdminOperation.java
@@ -102,11 +102,7 @@ enum CoreAdminOperation implements CoreAdminOp {
   }),
   RELOAD_OP(RELOAD, it -> {
     SolrParams params = it.req.getParams();
-    String cname = params.get(CoreAdminParams.CORE);
-
-    if (cname == null || !it.handler.coreContainer.getCoreNames().contains(cname)) {
-      throw new SolrException(ErrorCode.BAD_REQUEST, "Core with core name [" + cname + "] does not exist.");
-    }
+    String cname = params.required().get(CoreAdminParams.CORE);
 
     try {
       it.handler.coreContainer.reload(cname);

http://git-wip-us.apache.org/repos/asf/lucene-solr/blob/8367e159/solr/core/src/test/org/apache/solr/handler/admin/CoreAdminHandlerTest.java
----------------------------------------------------------------------
diff --git a/solr/core/src/test/org/apache/solr/handler/admin/CoreAdminHandlerTest.java b/solr/core/src/test/org/apache/solr/handler/admin/CoreAdminHandlerTest.java
index 1a596ab..a81cf13 100644
--- a/solr/core/src/test/org/apache/solr/handler/admin/CoreAdminHandlerTest.java
+++ b/solr/core/src/test/org/apache/solr/handler/admin/CoreAdminHandlerTest.java
@@ -349,7 +349,8 @@ public class CoreAdminHandlerTest extends SolrTestCaseJ4 {
           , resp);
       fail("Was able to successfully reload non-existent-core");
     } catch (Exception e) {
-      assertEquals("Expected error message for non-existent core.", "Core with core name [non-existent-core] does not exist.", e.getMessage());
+      String e1 = e.getCause().getMessage();
+      assertEquals("Expected error message for non-existent core.", "No such core: non-existent-core", e.getCause().getMessage());
     }
 
     // test null core
@@ -364,7 +365,7 @@ public class CoreAdminHandlerTest extends SolrTestCaseJ4 {
       if (!(e instanceof SolrException)) {
         fail("Expected SolrException but got " + e);
       }
-      assertEquals("Expected error message for non-existent core.", "Core with core name [null] does not exist.", e.getMessage());
+      assertEquals("Expected error message for non-existent core.", "Missing required parameter: core", e.getMessage());
     }
 
   }

http://git-wip-us.apache.org/repos/asf/lucene-solr/blob/8367e159/solr/solrj/src/test/org/apache/solr/client/solrj/request/TestCoreAdmin.java
----------------------------------------------------------------------
diff --git a/solr/solrj/src/test/org/apache/solr/client/solrj/request/TestCoreAdmin.java b/solr/solrj/src/test/org/apache/solr/client/solrj/request/TestCoreAdmin.java
index c8c67ec..b2174cd 100644
--- a/solr/solrj/src/test/org/apache/solr/client/solrj/request/TestCoreAdmin.java
+++ b/solr/solrj/src/test/org/apache/solr/client/solrj/request/TestCoreAdmin.java
@@ -19,6 +19,9 @@ package org.apache.solr.client.solrj.request;
 import java.io.File;
 import java.io.IOException;
 import java.lang.invoke.MethodHandles;
+import java.nio.file.Files;
+import java.nio.file.Path;
+import java.nio.file.Paths;
 import java.util.Collection;
 
 import com.carrotsearch.randomizedtesting.annotations.ThreadLeakFilters;
@@ -41,6 +44,7 @@ import org.apache.solr.common.SolrException;
 import org.apache.solr.common.SolrInputDocument;
 import org.apache.solr.common.params.ModifiableSolrParams;
 import org.apache.solr.common.util.NamedList;
+import org.apache.solr.core.CoreContainer;
 import org.apache.solr.core.SolrCore;
 import org.apache.solr.metrics.SolrCoreMetricManager;
 import org.apache.solr.metrics.SolrMetricManager;
@@ -292,6 +296,41 @@ public class TestCoreAdmin extends AbstractEmbeddedSolrServerTestCase {
       expectThrows(SolrException.class, () -> recoverRequestCmd.process(getSolrAdmin()));
   }
   
+  @Test
+  public void testReloadCoreAfterFailure() throws Exception {
+    cores.shutdown();
+    useFactory(null); // use FS factory
+
+    try {
+      cores = CoreContainer.createAndLoad(SOLR_HOME, getSolrXml());
+
+      String ddir = CoreAdminRequest.getCoreStatus("core0", getSolrCore0()).getDataDirectory();
+      Path data = Paths.get(ddir, "index");
+      assumeTrue("test can't handle relative data directory paths (yet?)", data.isAbsolute());
+
+      getSolrCore0().add(new SolrInputDocument("id", "core0-1"));
+      getSolrCore0().commit();
+
+      cores.shutdown();
+
+      // destroy the index
+      Files.move(data.resolve("_0.si"), data.resolve("backup"));
+      cores = CoreContainer.createAndLoad(SOLR_HOME, getSolrXml());
+
+      // Need to run a query to confirm that the core couldn't load
+      expectThrows(SolrException.class, () -> getSolrCore0().query(new SolrQuery("*:*")));
+
+      // We didn't fix anything, so should still throw
+      expectThrows(SolrException.class, () -> CoreAdminRequest.reloadCore("core0", getSolrCore0()));
+
+      Files.move(data.resolve("backup"), data.resolve("_0.si"));
+      CoreAdminRequest.reloadCore("core0", getSolrCore0());
+      assertEquals(1, getSolrCore0().query(new SolrQuery("*:*")).getResults().getNumFound());
+    } finally {
+      resetFactory();
+    }
+  }
+
   @BeforeClass
   public static void before() {
     // wtf?
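
A minimal SolrJ sketch of the behavior this change enables: a RELOAD on a core
whose initialization failed now retries the create instead of answering
"No such core". The base URL and core name are assumptions.

import org.apache.solr.client.solrj.impl.HttpSolrClient;
import org.apache.solr.client.solrj.request.CoreAdminRequest;

public class ReloadFailedCore {
  public static void main(String[] args) throws Exception {
    try (HttpSolrClient client = new HttpSolrClient.Builder("http://localhost:8983/solr").build()) {
      // After repairing whatever broke initialization (e.g. restoring the
      // segment file moved aside in the test above), reload recreates the core
      CoreAdminRequest.reloadCore("core0", client);
    }
  }
}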


[40/50] [abbrv] lucene-solr:jira/solr-9858: LUCENE-7410: Make cache keys and close listeners less trappy.

Posted by ab...@apache.org.
http://git-wip-us.apache.org/repos/asf/lucene-solr/blob/df6f8307/solr/core/src/test/org/apache/solr/search/TestSolr4Spatial2.java
----------------------------------------------------------------------
diff --git a/solr/core/src/test/org/apache/solr/search/TestSolr4Spatial2.java b/solr/core/src/test/org/apache/solr/search/TestSolr4Spatial2.java
index 7a7dc6b..1fcfe9a 100644
--- a/solr/core/src/test/org/apache/solr/search/TestSolr4Spatial2.java
+++ b/solr/core/src/test/org/apache/solr/search/TestSolr4Spatial2.java
@@ -157,7 +157,7 @@ public class TestSolr4Spatial2 extends SolrTestCaseJ4 {
 
 
   protected Object getFirstLeafReaderKey() {
-    return getSearcher().getRawReader().leaves().get(0).reader().getCoreCacheKey();
+    return getSearcher().getRawReader().leaves().get(0).reader().getCoreCacheHelper().getKey();
   }
 
   @Test// SOLR-8541

http://git-wip-us.apache.org/repos/asf/lucene-solr/blob/df6f8307/solr/core/src/test/org/apache/solr/uninverting/TestDocTermOrds.java
----------------------------------------------------------------------
diff --git a/solr/core/src/test/org/apache/solr/uninverting/TestDocTermOrds.java b/solr/core/src/test/org/apache/solr/uninverting/TestDocTermOrds.java
index 67b62cf..69b89b4 100644
--- a/solr/core/src/test/org/apache/solr/uninverting/TestDocTermOrds.java
+++ b/solr/core/src/test/org/apache/solr/uninverting/TestDocTermOrds.java
@@ -218,7 +218,7 @@ public class TestDocTermOrds extends LuceneTestCase {
     TestUtil.checkReader(slowR);
     verify(slowR, idToOrds, termsArray, null);
 
-    FieldCache.DEFAULT.purgeByCacheKey(slowR.getCoreCacheKey());
+    FieldCache.DEFAULT.purgeByCacheKey(slowR.getCoreCacheHelper().getKey());
 
     r.close();
     dir.close();
@@ -338,7 +338,7 @@ public class TestDocTermOrds extends LuceneTestCase {
       verify(slowR, idToOrdsPrefix, termsArray, prefixRef);
     }
 
-    FieldCache.DEFAULT.purgeByCacheKey(slowR.getCoreCacheKey());
+    FieldCache.DEFAULT.purgeByCacheKey(slowR.getCoreCacheHelper().getKey());
 
     r.close();
     dir.close();

http://git-wip-us.apache.org/repos/asf/lucene-solr/blob/df6f8307/solr/core/src/test/org/apache/solr/uninverting/TestFieldCache.java
----------------------------------------------------------------------
diff --git a/solr/core/src/test/org/apache/solr/uninverting/TestFieldCache.java b/solr/core/src/test/org/apache/solr/uninverting/TestFieldCache.java
index 2d2c381..60bb9e8 100644
--- a/solr/core/src/test/org/apache/solr/uninverting/TestFieldCache.java
+++ b/solr/core/src/test/org/apache/solr/uninverting/TestFieldCache.java
@@ -267,7 +267,7 @@ public class TestFieldCache extends LuceneTestCase {
     termOrds = cache.getDocTermOrds(reader, "bogusfield", null);
     assertTrue(termOrds.getValueCount() == 0);
 
-    FieldCache.DEFAULT.purgeByCacheKey(reader.getCoreCacheKey());
+    FieldCache.DEFAULT.purgeByCacheKey(reader.getCoreCacheHelper().getKey());
   }
 
   public void testEmptyIndex() throws Exception {
@@ -279,7 +279,7 @@ public class TestFieldCache extends LuceneTestCase {
     TestUtil.checkReader(reader);
     FieldCache.DEFAULT.getTerms(reader, "foobar");
     FieldCache.DEFAULT.getTermsIndex(reader, "foobar");
-    FieldCache.DEFAULT.purgeByCacheKey(reader.getCoreCacheKey());
+    FieldCache.DEFAULT.purgeByCacheKey(reader.getCoreCacheHelper().getKey());
     r.close();
     dir.close();
   }

http://git-wip-us.apache.org/repos/asf/lucene-solr/blob/df6f8307/solr/core/src/test/org/apache/solr/uninverting/TestFieldCacheSanityChecker.java
----------------------------------------------------------------------
diff --git a/solr/core/src/test/org/apache/solr/uninverting/TestFieldCacheSanityChecker.java b/solr/core/src/test/org/apache/solr/uninverting/TestFieldCacheSanityChecker.java
deleted file mode 100644
index b031681..0000000
--- a/solr/core/src/test/org/apache/solr/uninverting/TestFieldCacheSanityChecker.java
+++ /dev/null
@@ -1,164 +0,0 @@
-/*
- * Licensed to the Apache Software Foundation (ASF) under one or more
- * contributor license agreements.  See the NOTICE file distributed with
- * this work for additional information regarding copyright ownership.
- * The ASF licenses this file to You under the Apache License, Version 2.0
- * (the "License"); you may not use this file except in compliance with
- * the License.  You may obtain a copy of the License at
- *
- *     http://www.apache.org/licenses/LICENSE-2.0
- *
- * Unless required by applicable law or agreed to in writing, software
- * distributed under the License is distributed on an "AS IS" BASIS,
- * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
- * See the License for the specific language governing permissions and
- * limitations under the License.
- */
-package org.apache.solr.uninverting;
-
-import java.io.IOException;
-
-import org.apache.lucene.analysis.MockAnalyzer;
-import org.apache.lucene.document.Document;
-import org.apache.lucene.document.Field;
-import org.apache.lucene.index.DirectoryReader;
-import org.apache.lucene.index.IndexWriter;
-import org.apache.lucene.index.LeafReader;
-import org.apache.lucene.index.MultiReader;
-import org.apache.lucene.legacy.LegacyDoubleField;
-import org.apache.lucene.legacy.LegacyFloatField;
-import org.apache.lucene.legacy.LegacyIntField;
-import org.apache.lucene.legacy.LegacyLongField;
-import org.apache.lucene.store.Directory;
-import org.apache.lucene.util.LuceneTestCase;
-import org.apache.solr.index.SlowCompositeReaderWrapper;
-import org.apache.solr.uninverting.FieldCacheSanityChecker.Insanity;
-import org.apache.solr.uninverting.FieldCacheSanityChecker.InsanityType;
-
-public class TestFieldCacheSanityChecker extends LuceneTestCase {
-
-  protected LeafReader readerA;
-  protected LeafReader readerB;
-  protected LeafReader readerX;
-  protected LeafReader readerAclone;
-  protected Directory dirA, dirB;
-  private static final int NUM_DOCS = 1000;
-
-  @Override
-  public void setUp() throws Exception {
-    super.setUp();
-    dirA = newDirectory();
-    dirB = newDirectory();
-
-    IndexWriter wA = new IndexWriter(dirA, newIndexWriterConfig(new MockAnalyzer(random())));
-    IndexWriter wB = new IndexWriter(dirB, newIndexWriterConfig(new MockAnalyzer(random())));
-
-    long theLong = Long.MAX_VALUE;
-    double theDouble = Double.MAX_VALUE;
-    int theInt = Integer.MAX_VALUE;
-    float theFloat = Float.MAX_VALUE;
-    for (int i = 0; i < NUM_DOCS; i++){
-      Document doc = new Document();
-      doc.add(new LegacyLongField("theLong", theLong--, Field.Store.NO));
-      doc.add(new LegacyDoubleField("theDouble", theDouble--, Field.Store.NO));
-      doc.add(new LegacyIntField("theInt", theInt--, Field.Store.NO));
-      doc.add(new LegacyFloatField("theFloat", theFloat--, Field.Store.NO));
-      if (0 == i % 3) {
-        wA.addDocument(doc);
-      } else {
-        wB.addDocument(doc);
-      }
-    }
-    wA.close();
-    wB.close();
-    DirectoryReader rA = DirectoryReader.open(dirA);
-    readerA = SlowCompositeReaderWrapper.wrap(rA);
-    readerAclone = SlowCompositeReaderWrapper.wrap(rA);
-    readerA = SlowCompositeReaderWrapper.wrap(DirectoryReader.open(dirA));
-    readerB = SlowCompositeReaderWrapper.wrap(DirectoryReader.open(dirB));
-    readerX = SlowCompositeReaderWrapper.wrap(new MultiReader(readerA, readerB));
-  }
-
-  @Override
-  public void tearDown() throws Exception {
-    readerA.close();
-    readerAclone.close();
-    readerB.close();
-    readerX.close();
-    dirA.close();
-    dirB.close();
-    super.tearDown();
-  }
-
-  public void testSanity() throws IOException {
-    FieldCache cache = FieldCache.DEFAULT;
-    cache.purgeAllCaches();
-
-    cache.getNumerics(readerA, "theDouble", FieldCache.LEGACY_DOUBLE_PARSER);
-    cache.getNumerics(readerAclone, "theDouble", FieldCache.LEGACY_DOUBLE_PARSER);
-    cache.getNumerics(readerB, "theDouble", FieldCache.LEGACY_DOUBLE_PARSER);
-
-    cache.getNumerics(readerX, "theInt", FieldCache.LEGACY_INT_PARSER);
-
-    // // // 
-
-    Insanity[] insanity = 
-      FieldCacheSanityChecker.checkSanity(cache.getCacheEntries());
-    
-    if (0 < insanity.length)
-      dumpArray(getTestClass().getName() + "#" + getTestName() 
-          + " INSANITY", insanity, System.err);
-
-    assertEquals("shouldn't be any cache insanity", 0, insanity.length);
-    cache.purgeAllCaches();
-  }
-
-  public void testInsanity1() throws IOException {
-    FieldCache cache = FieldCache.DEFAULT;
-    cache.purgeAllCaches();
-
-    cache.getNumerics(readerX, "theInt", FieldCache.LEGACY_INT_PARSER);
-    cache.getTerms(readerX, "theInt");
-
-    // // // 
-
-    Insanity[] insanity = 
-      FieldCacheSanityChecker.checkSanity(cache.getCacheEntries());
-
-    assertEquals("wrong number of cache errors", 1, insanity.length);
-    assertEquals("wrong type of cache error", 
-                 InsanityType.VALUEMISMATCH,
-                 insanity[0].getType());
-    assertEquals("wrong number of entries in cache error", 2,
-                 insanity[0].getCacheEntries().length);
-
-    // we expect bad things, don't let tearDown complain about them
-    cache.purgeAllCaches();
-  }
-
-  public void testInsanity2() throws IOException {
-    FieldCache cache = FieldCache.DEFAULT;
-    cache.purgeAllCaches();
-
-    cache.getTerms(readerA, "theInt");
-    cache.getTerms(readerB, "theInt");
-    cache.getTerms(readerX, "theInt");
-
-
-    // // // 
-
-    Insanity[] insanity = 
-      FieldCacheSanityChecker.checkSanity(cache.getCacheEntries());
-    
-    assertEquals("wrong number of cache errors", 1, insanity.length);
-    assertEquals("wrong type of cache error", 
-                 InsanityType.SUBREADER,
-                 insanity[0].getType());
-    assertEquals("wrong number of entries in cache error", 3,
-                 insanity[0].getCacheEntries().length);
-
-    // we expect bad things, don't let tearDown complain about them
-    cache.purgeAllCaches();
-  }
-
-}

http://git-wip-us.apache.org/repos/asf/lucene-solr/blob/df6f8307/solr/core/src/test/org/apache/solr/uninverting/TestLegacyFieldCache.java
----------------------------------------------------------------------
diff --git a/solr/core/src/test/org/apache/solr/uninverting/TestLegacyFieldCache.java b/solr/core/src/test/org/apache/solr/uninverting/TestLegacyFieldCache.java
index 9dc047b..e38e193 100644
--- a/solr/core/src/test/org/apache/solr/uninverting/TestLegacyFieldCache.java
+++ b/solr/core/src/test/org/apache/solr/uninverting/TestLegacyFieldCache.java
@@ -32,26 +32,20 @@ import org.apache.lucene.index.IndexWriterConfig;
 import org.apache.lucene.index.LeafReader;
 import org.apache.lucene.index.NumericDocValues;
 import org.apache.lucene.index.RandomIndexWriter;
-import org.apache.lucene.index.Terms;
-import org.apache.lucene.index.TermsEnum;
 import org.apache.lucene.legacy.LegacyDoubleField;
 import org.apache.lucene.legacy.LegacyFloatField;
 import org.apache.lucene.legacy.LegacyIntField;
 import org.apache.lucene.legacy.LegacyLongField;
-import org.apache.lucene.legacy.LegacyNumericUtils;
 import org.apache.lucene.store.Directory;
 import org.apache.lucene.util.Bits;
 import org.apache.lucene.util.BytesRef;
-import org.apache.lucene.util.IOUtils;
 import org.apache.lucene.util.LuceneTestCase;
 import org.apache.lucene.util.TestUtil;
 import org.apache.solr.index.SlowCompositeReaderWrapper;
 import org.junit.AfterClass;
 import org.junit.BeforeClass;
 
-import java.io.ByteArrayOutputStream;
 import java.io.IOException;
-import java.io.PrintStream;
 import java.util.HashSet;
 import java.util.Set;
 import java.util.concurrent.CyclicBarrier;
@@ -106,31 +100,6 @@ public class TestLegacyFieldCache extends LuceneTestCase {
     directory.close();
     directory = null;
   }
-  
-  public void testInfoStream() throws Exception {
-    try {
-      FieldCache cache = FieldCache.DEFAULT;
-      ByteArrayOutputStream bos = new ByteArrayOutputStream(1024);
-      cache.setInfoStream(new PrintStream(bos, false, IOUtils.UTF_8));
-      cache.getNumerics(reader, "theDouble", FieldCache.LEGACY_DOUBLE_PARSER);
-      cache.getNumerics(reader, "theDouble", new FieldCache.Parser() {
-        @Override
-        public TermsEnum termsEnum(Terms terms) throws IOException {
-          return LegacyNumericUtils.filterPrefixCodedLongs(terms.iterator());
-        }
-        @Override
-        public long parseValue(BytesRef term) {
-          int val = (int) LegacyNumericUtils.prefixCodedToLong(term);
-          if (val<0) val ^= 0x7fffffff;
-          return val;
-        }
-      });
-      assertTrue(bos.toString(IOUtils.UTF_8).indexOf("WARNING") != -1);
-    } finally {
-      FieldCache.DEFAULT.setInfoStream(null);
-      FieldCache.DEFAULT.purgeAllCaches();
-    }
-  }
 
   public void test() throws IOException {
     FieldCache cache = FieldCache.DEFAULT;
@@ -174,7 +143,7 @@ public class TestLegacyFieldCache extends LuceneTestCase {
       assertEquals(i%2 == 0, docsWithField.get(i));
     }
 
-    FieldCache.DEFAULT.purgeByCacheKey(reader.getCoreCacheKey());
+    FieldCache.DEFAULT.purgeByCacheKey(reader.getCoreCacheHelper().getKey());
   }
 
   public void testEmptyIndex() throws Exception {
@@ -186,7 +155,7 @@ public class TestLegacyFieldCache extends LuceneTestCase {
     TestUtil.checkReader(reader);
     FieldCache.DEFAULT.getTerms(reader, "foobar");
     FieldCache.DEFAULT.getTermsIndex(reader, "foobar");
-    FieldCache.DEFAULT.purgeByCacheKey(reader.getCoreCacheKey());
+    FieldCache.DEFAULT.purgeByCacheKey(reader.getCoreCacheHelper().getKey());
     r.close();
     dir.close();
   }
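
A sketch of the caching pattern this migration targets: entries are keyed on the
CacheKey from IndexReader.CacheHelper rather than on the reader object, and are
evicted through the helper's closed listener. The cache map and value
computation are assumptions; getCoreCacheHelper() may legitimately return null.

import java.util.Map;
import java.util.concurrent.ConcurrentHashMap;

import org.apache.lucene.index.IndexReader;
import org.apache.lucene.index.LeafReader;

public class PerCoreCache {
  private final Map<IndexReader.CacheKey, Object> cache = new ConcurrentHashMap<>();

  public Object get(LeafReader reader) {
    IndexReader.CacheHelper helper = reader.getCoreCacheHelper();
    if (helper == null) {
      return null; // this reader does not support core-level caching
    }
    return cache.computeIfAbsent(helper.getKey(), key -> {
      // Drop the entry when the segment core is closed
      helper.addClosedListener(cache::remove);
      return computeValue(reader); // stand-in for an expensive computation
    });
  }

  private Object computeValue(LeafReader reader) {
    return new Object(); // placeholder
  }
}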


[17/50] [abbrv] lucene-solr:jira/solr-9858: SOLR-9764: fix CHANGES entry

Posted by ab...@apache.org.
SOLR-9764: fix CHANGES entry


Project: http://git-wip-us.apache.org/repos/asf/lucene-solr/repo
Commit: http://git-wip-us.apache.org/repos/asf/lucene-solr/commit/05c17c9a
Tree: http://git-wip-us.apache.org/repos/asf/lucene-solr/tree/05c17c9a
Diff: http://git-wip-us.apache.org/repos/asf/lucene-solr/diff/05c17c9a

Branch: refs/heads/jira/solr-9858
Commit: 05c17c9a516d8501b2dcce9b5910a3d0b5510bc4
Parents: a0aef2f
Author: yonik <yo...@apache.org>
Authored: Thu Feb 23 16:55:36 2017 -0500
Committer: yonik <yo...@apache.org>
Committed: Thu Feb 23 16:55:36 2017 -0500

----------------------------------------------------------------------
 solr/CHANGES.txt | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)
----------------------------------------------------------------------


http://git-wip-us.apache.org/repos/asf/lucene-solr/blob/05c17c9a/solr/CHANGES.txt
----------------------------------------------------------------------
diff --git a/solr/CHANGES.txt b/solr/CHANGES.txt
index 9ece4f8..0302615 100644
--- a/solr/CHANGES.txt
+++ b/solr/CHANGES.txt
@@ -189,7 +189,7 @@ Optimizations
 * SOLR-9941: Clear the deletes lists at UpdateLog before replaying from log. This prevents redundantly pre-applying
   DBQs, during the log replay, to every update in the log as if the DBQs were out of order. (hossman, Ishan Chattopadhyaya)
 
-* SOLR-9764: All filters that which all documents in the index now share the same memory (DocSet).
+* SOLR-9764: All filters that match all documents in the index now share the same memory (DocSet).
   (Michael Sun, yonik)
 
 * SOLR-9584: Support Solr being proxied with another endpoint than default /solr, by using relative links


[22/50] [abbrv] lucene-solr:jira/solr-9858: LUCENE-7710: BlockPackedReader now throws CorruptIndexException if bitsPerValue is out of bounds, not generic IOException

Posted by ab...@apache.org.
LUCENE-7710: BlockPackedReader now throws CorruptIndexException if bitsPerValue is out of bounds, not generic IOException


Project: http://git-wip-us.apache.org/repos/asf/lucene-solr/repo
Commit: http://git-wip-us.apache.org/repos/asf/lucene-solr/commit/cab3aae1
Tree: http://git-wip-us.apache.org/repos/asf/lucene-solr/tree/cab3aae1
Diff: http://git-wip-us.apache.org/repos/asf/lucene-solr/diff/cab3aae1

Branch: refs/heads/jira/solr-9858
Commit: cab3aae11dd6e781acabf513095eb11606feddde
Parents: 2e56c0e
Author: Mike McCandless <mi...@apache.org>
Authored: Fri Feb 24 17:13:49 2017 -0500
Committer: Mike McCandless <mi...@apache.org>
Committed: Fri Feb 24 17:13:49 2017 -0500

----------------------------------------------------------------------
 lucene/CHANGES.txt                                               | 4 ++++
 .../java/org/apache/lucene/util/packed/BlockPackedReader.java    | 3 ++-
 2 files changed, 6 insertions(+), 1 deletion(-)
----------------------------------------------------------------------


http://git-wip-us.apache.org/repos/asf/lucene-solr/blob/cab3aae1/lucene/CHANGES.txt
----------------------------------------------------------------------
diff --git a/lucene/CHANGES.txt b/lucene/CHANGES.txt
index 5d3a077..1d45ab8 100644
--- a/lucene/CHANGES.txt
+++ b/lucene/CHANGES.txt
@@ -178,6 +178,10 @@ Improvements
   earlier than regular queries in order to improve cache efficiency.
   (Adrien Grand)
 
+* LUCENE-7710: BlockPackedReader throws CorruptIndexException and includes
+  IndexInput description instead of plain IOException (Mike Drob via
+  Mike McCandless)
+
 Optimizations
 
 * LUCENE-7641: Optimized point range queries to compute documents that do not

http://git-wip-us.apache.org/repos/asf/lucene-solr/blob/cab3aae1/lucene/core/src/java/org/apache/lucene/util/packed/BlockPackedReader.java
----------------------------------------------------------------------
diff --git a/lucene/core/src/java/org/apache/lucene/util/packed/BlockPackedReader.java b/lucene/core/src/java/org/apache/lucene/util/packed/BlockPackedReader.java
index 82bf93f..986cba7 100644
--- a/lucene/core/src/java/org/apache/lucene/util/packed/BlockPackedReader.java
+++ b/lucene/core/src/java/org/apache/lucene/util/packed/BlockPackedReader.java
@@ -28,6 +28,7 @@ import static org.apache.lucene.util.packed.PackedInts.numBlocks;
 
 import java.io.IOException;
 
+import org.apache.lucene.index.CorruptIndexException;
 import org.apache.lucene.store.IndexInput;
 import org.apache.lucene.util.Accountable;
 import org.apache.lucene.util.LongValues;
@@ -58,7 +59,7 @@ public final class BlockPackedReader extends LongValues implements Accountable {
       final int bitsPerValue = token >>> BPV_SHIFT;
       sumBPV += bitsPerValue;
       if (bitsPerValue > 64) {
-        throw new IOException("Corrupted");
+        throw new CorruptIndexException("Corrupted Block#" + i, in);
       }
       if ((token & MIN_VALUE_EQUALS_0) == 0) {
         if (minValues == null) {
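
A sketch of what callers gain from the change: corruption can now be caught
separately from transient I/O failures, and the message names the block and the
input. The path, file name, and reader parameters are assumptions and would
have to match whatever writer produced the data.

import java.nio.file.Paths;

import org.apache.lucene.index.CorruptIndexException;
import org.apache.lucene.store.Directory;
import org.apache.lucene.store.FSDirectory;
import org.apache.lucene.store.IOContext;
import org.apache.lucene.store.IndexInput;
import org.apache.lucene.util.packed.BlockPackedReader;
import org.apache.lucene.util.packed.PackedInts;

public class ReadBlockPacked {
  public static void main(String[] args) throws Exception {
    try (Directory dir = FSDirectory.open(Paths.get("/tmp/idx"));
         IndexInput in = dir.openInput("values.bin", IOContext.READONCE)) {
      // Version, block size and value count must match the writer (assumed here)
      BlockPackedReader values = new BlockPackedReader(in, PackedInts.VERSION_CURRENT, 128, 1000, false);
      System.out.println(values.get(0));
    } catch (CorruptIndexException e) {
      // Message now reads "Corrupted Block#<i>" plus the input's resource description
      System.err.println("index corruption: " + e.getMessage());
    }
  }
}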


[05/50] [abbrv] lucene-solr:jira/solr-9858: SOLR-9824: Some bulk update paths could be very slow due to CUSC polling.

Posted by ab...@apache.org.
SOLR-9824: Some bulk update paths could be very slow due to CUSC polling.


Project: http://git-wip-us.apache.org/repos/asf/lucene-solr/repo
Commit: http://git-wip-us.apache.org/repos/asf/lucene-solr/commit/d6337ac3
Tree: http://git-wip-us.apache.org/repos/asf/lucene-solr/tree/d6337ac3
Diff: http://git-wip-us.apache.org/repos/asf/lucene-solr/diff/d6337ac3

Branch: refs/heads/jira/solr-9858
Commit: d6337ac3e566c504766d69499ab470bd26744a29
Parents: 2f82409
Author: markrmiller <ma...@apache.org>
Authored: Wed Feb 22 13:00:42 2017 -0500
Committer: markrmiller <ma...@apache.org>
Committed: Wed Feb 22 14:44:18 2017 -0500

----------------------------------------------------------------------
 solr/CHANGES.txt                                |   2 +
 .../handler/loader/ContentStreamLoader.java     |   2 -
 .../solr/handler/loader/JavabinLoader.java      |   3 -
 .../apache/solr/update/AddUpdateCommand.java    |   2 -
 .../apache/solr/update/SolrCmdDistributor.java  |  15 +-
 .../solr/update/StreamingSolrClients.java       |   2 +-
 .../processor/DistributedUpdateProcessor.java   |   7 +
 .../solrj/impl/ConcurrentUpdateSolrClient.java  | 317 ++++++++++++++-----
 8 files changed, 252 insertions(+), 98 deletions(-)
----------------------------------------------------------------------


http://git-wip-us.apache.org/repos/asf/lucene-solr/blob/d6337ac3/solr/CHANGES.txt
----------------------------------------------------------------------
diff --git a/solr/CHANGES.txt b/solr/CHANGES.txt
index a6b5504..dcea40c 100644
--- a/solr/CHANGES.txt
+++ b/solr/CHANGES.txt
@@ -178,6 +178,8 @@ Bug Fixes
 
 * SOLR-10168: ShardSplit can fail with NPE in OverseerCollectionMessageHandler#waitForCoreAdminAsyncCallToComplete. (Mark Miller)
 
+* SOLR-9824: Some bulk update paths could be very slow due to CUSC polling. (David Smiley, Mark Miller)
+
 Optimizations
 ----------------------
 

http://git-wip-us.apache.org/repos/asf/lucene-solr/blob/d6337ac3/solr/core/src/java/org/apache/solr/handler/loader/ContentStreamLoader.java
----------------------------------------------------------------------
diff --git a/solr/core/src/java/org/apache/solr/handler/loader/ContentStreamLoader.java b/solr/core/src/java/org/apache/solr/handler/loader/ContentStreamLoader.java
index 1dd038f..7751b43 100644
--- a/solr/core/src/java/org/apache/solr/handler/loader/ContentStreamLoader.java
+++ b/solr/core/src/java/org/apache/solr/handler/loader/ContentStreamLoader.java
@@ -29,8 +29,6 @@ import org.apache.solr.update.processor.UpdateRequestProcessor;
  */
 public abstract class ContentStreamLoader {
 
-  protected static final int pollQueueTime = Integer.getInteger("solr.cloud.replication.poll-queue-time-ms", 25);
-
   /**
    * This should be called once for each RequestHandler
    */

http://git-wip-us.apache.org/repos/asf/lucene-solr/blob/d6337ac3/solr/core/src/java/org/apache/solr/handler/loader/JavabinLoader.java
----------------------------------------------------------------------
diff --git a/solr/core/src/java/org/apache/solr/handler/loader/JavabinLoader.java b/solr/core/src/java/org/apache/solr/handler/loader/JavabinLoader.java
index 6114280..873bcd1 100644
--- a/solr/core/src/java/org/apache/solr/handler/loader/JavabinLoader.java
+++ b/solr/core/src/java/org/apache/solr/handler/loader/JavabinLoader.java
@@ -116,9 +116,6 @@ public class JavabinLoader extends ContentStreamLoader {
 
   private AddUpdateCommand getAddCommand(SolrQueryRequest req, SolrParams params) {
     AddUpdateCommand addCmd = new AddUpdateCommand(req);
-    // since we can give a hint to the leader that the end of a batch is being processed, it's OK to have a larger
-    // pollQueueTime than the default 0 since we can optimize around not waiting unnecessarily
-    addCmd.pollQueueTime = pollQueueTime;
     addCmd.overwrite = params.getBool(UpdateParams.OVERWRITE, true);
     addCmd.commitWithin = params.getInt(UpdateParams.COMMIT_WITHIN, -1);
     return addCmd;

http://git-wip-us.apache.org/repos/asf/lucene-solr/blob/d6337ac3/solr/core/src/java/org/apache/solr/update/AddUpdateCommand.java
----------------------------------------------------------------------
diff --git a/solr/core/src/java/org/apache/solr/update/AddUpdateCommand.java b/solr/core/src/java/org/apache/solr/update/AddUpdateCommand.java
index 0ede728..f526397 100644
--- a/solr/core/src/java/org/apache/solr/update/AddUpdateCommand.java
+++ b/solr/core/src/java/org/apache/solr/update/AddUpdateCommand.java
@@ -60,8 +60,6 @@ public class AddUpdateCommand extends UpdateCommand implements Iterable<Document
    public int commitWithin = -1;
 
    public boolean isLastDocInBatch = false;
-
-   public int pollQueueTime = 0;
    
    public AddUpdateCommand(SolrQueryRequest req) {
      super(req);

http://git-wip-us.apache.org/repos/asf/lucene-solr/blob/d6337ac3/solr/core/src/java/org/apache/solr/update/SolrCmdDistributor.java
----------------------------------------------------------------------
diff --git a/solr/core/src/java/org/apache/solr/update/SolrCmdDistributor.java b/solr/core/src/java/org/apache/solr/update/SolrCmdDistributor.java
index 5caf43e..dac4000 100644
--- a/solr/core/src/java/org/apache/solr/update/SolrCmdDistributor.java
+++ b/solr/core/src/java/org/apache/solr/update/SolrCmdDistributor.java
@@ -36,6 +36,7 @@ import org.apache.solr.update.processor.DistributedUpdateProcessor.RequestReplic
 import org.slf4j.Logger;
 import org.slf4j.LoggerFactory;
 
+import java.io.Closeable;
 import java.io.IOException;
 import java.io.InputStream;
 import java.lang.invoke.MethodHandles;
@@ -51,7 +52,7 @@ import java.util.concurrent.ExecutorService;
 import java.util.concurrent.Future;
 
 
-public class SolrCmdDistributor {
+public class SolrCmdDistributor implements Closeable {
   private static final int MAX_RETRIES_ON_FORWARD = 25;
   private static final Logger log = LoggerFactory.getLogger(MethodHandles.lookup().lookupClass());
   
@@ -96,6 +97,10 @@ public class SolrCmdDistributor {
       clients.shutdown();
     }
   }
+  
+  public void close() {
+    clients.shutdown();
+  }
 
   private void doRetriesIfNeeded() {
     // NOTE: retries will be forwards to a single url
@@ -210,7 +215,7 @@ public class SolrCmdDistributor {
       if (cmd.isInPlaceUpdate()) {
         params.set(DistributedUpdateProcessor.DISTRIB_INPLACE_PREVVERSION, String.valueOf(cmd.prevVersion));
       }
-      submit(new Req(cmd, node, uReq, synchronous, rrt, cmd.pollQueueTime), false);
+      submit(new Req(cmd, node, uReq, synchronous, rrt), false);
     }
     
   }
@@ -314,19 +319,17 @@ public class SolrCmdDistributor {
     public boolean synchronous;
     public UpdateCommand cmd;
     public RequestReplicationTracker rfTracker;
-    public int pollQueueTime;
 
     public Req(UpdateCommand cmd, Node node, UpdateRequest uReq, boolean synchronous) {
-      this(cmd, node, uReq, synchronous, null, 0);
+      this(cmd, node, uReq, synchronous, null);
     }
     
-    public Req(UpdateCommand cmd, Node node, UpdateRequest uReq, boolean synchronous, RequestReplicationTracker rfTracker, int pollQueueTime) {
+    public Req(UpdateCommand cmd, Node node, UpdateRequest uReq, boolean synchronous, RequestReplicationTracker rfTracker) {
       this.node = node;
       this.uReq = uReq;
       this.synchronous = synchronous;
       this.cmd = cmd;
       this.rfTracker = rfTracker;
-      this.pollQueueTime = pollQueueTime;
     }
     
     public String toString() {

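The change above makes SolrCmdDistributor a Closeable so its streaming clients can be released deterministically. A self-contained sketch of the idiom (a hypothetical stand-in class, not the Solr API) showing what this buys callers:

import java.io.Closeable;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;

// Hedged stand-in for the pattern the commit adopts: shutdown logic lives
// in close(), so try-with-resources guarantees the worker pool is released
// even when request distribution throws.
public class DistributorSketch implements Closeable {
  private final ExecutorService clients = Executors.newCachedThreadPool();

  public void distribute(Runnable req) { clients.execute(req); }

  @Override
  public void close() { clients.shutdown(); }

  public static void main(String[] args) throws Exception {
    try (DistributorSketch d = new DistributorSketch()) {
      d.distribute(() -> System.out.println("forwarding an update"));
    } // close() has run here, whatever happened inside the block
  }
}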
http://git-wip-us.apache.org/repos/asf/lucene-solr/blob/d6337ac3/solr/core/src/java/org/apache/solr/update/StreamingSolrClients.java
----------------------------------------------------------------------
diff --git a/solr/core/src/java/org/apache/solr/update/StreamingSolrClients.java b/solr/core/src/java/org/apache/solr/update/StreamingSolrClients.java
index fc50be2..7c630f4 100644
--- a/solr/core/src/java/org/apache/solr/update/StreamingSolrClients.java
+++ b/solr/core/src/java/org/apache/solr/update/StreamingSolrClients.java
@@ -73,9 +73,9 @@ public class StreamingSolrClients {
       // on a greater scale since the current behavior is to only increase the number of connections/Runners when
       // the queue is more than half full.
       client = new ErrorReportingConcurrentUpdateSolrClient(url, httpClient, 100, runnerCount, updateExecutor, true, req);
+      client.setPollQueueTime(Integer.MAX_VALUE); // minimize connections created
       client.setParser(new BinaryResponseParser());
       client.setRequestWriter(new BinaryRequestWriter());
-      client.setPollQueueTime(req.pollQueueTime);
       Set<String> queryParams = new HashSet<>(2);
       queryParams.add(DistributedUpdateProcessor.DISTRIB_FROM);
       queryParams.add(DistributingUpdateProcessorFactory.DISTRIB_UPDATE_PARAM);

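Why setPollQueueTime(Integer.MAX_VALUE) "minimizes connections created": a runner that times out on an empty queue exits, and the next batch must spin up a new runner (and a new HTTP connection). Parking the poll indefinitely keeps the runner alive between batches; shutdown wakes it by interrupting the poll. A minimal sketch of that mechanic (illustrative names, not the Solr classes):

import java.util.concurrent.LinkedBlockingQueue;
import java.util.concurrent.TimeUnit;

// Sketch: the worker parks on the queue instead of timing out and dying,
// so no new runner/connection is needed per batch; an interrupt (as in
// interruptRunnerThreadsPolling()) wakes it for shutdown.
public class LongPollSketch {
  public static void main(String[] args) throws Exception {
    LinkedBlockingQueue<String> queue = new LinkedBlockingQueue<>();
    Thread worker = new Thread(() -> {
      try {
        while (true) {
          String update = queue.poll(Long.MAX_VALUE, TimeUnit.MILLISECONDS);
          if (update != null) System.out.println("sent " + update);
        }
      } catch (InterruptedException e) {
        // shutdown path: the parked poll was interrupted
      }
    });
    worker.start();
    queue.add("doc-1");
    Thread.sleep(100);   // let the worker drain the queue
    worker.interrupt();  // analogous to close() interrupting pollers
    worker.join();
  }
}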
http://git-wip-us.apache.org/repos/asf/lucene-solr/blob/d6337ac3/solr/core/src/java/org/apache/solr/update/processor/DistributedUpdateProcessor.java
----------------------------------------------------------------------
diff --git a/solr/core/src/java/org/apache/solr/update/processor/DistributedUpdateProcessor.java b/solr/core/src/java/org/apache/solr/update/processor/DistributedUpdateProcessor.java
index c6ccb71..ec093cf 100644
--- a/solr/core/src/java/org/apache/solr/update/processor/DistributedUpdateProcessor.java
+++ b/solr/core/src/java/org/apache/solr/update/processor/DistributedUpdateProcessor.java
@@ -828,6 +828,13 @@ public class DistributedUpdateProcessor extends UpdateRequestProcessor {
     // Given that, it may also make sense to move the version reporting out of this
     // processor too.
   }
+  
+  @Override
+  protected void doClose() {
+    if (cmdDistrib != null) {
+      cmdDistrib.close();
+    }
+  }
  
   // TODO: optionally fail if n replicas are not reached...
   private void doFinish() {

http://git-wip-us.apache.org/repos/asf/lucene-solr/blob/d6337ac3/solr/solrj/src/java/org/apache/solr/client/solrj/impl/ConcurrentUpdateSolrClient.java
----------------------------------------------------------------------
diff --git a/solr/solrj/src/java/org/apache/solr/client/solrj/impl/ConcurrentUpdateSolrClient.java b/solr/solrj/src/java/org/apache/solr/client/solrj/impl/ConcurrentUpdateSolrClient.java
index 5c3f289..4eac2a5 100644
--- a/solr/solrj/src/java/org/apache/solr/client/solrj/impl/ConcurrentUpdateSolrClient.java
+++ b/solr/solrj/src/java/org/apache/solr/client/solrj/impl/ConcurrentUpdateSolrClient.java
@@ -30,6 +30,7 @@ import java.util.concurrent.CountDownLatch;
 import java.util.concurrent.ExecutorService;
 import java.util.concurrent.LinkedBlockingQueue;
 import java.util.concurrent.TimeUnit;
+import java.util.concurrent.atomic.AtomicInteger;
 
 import org.apache.http.HttpResponse;
 import org.apache.http.HttpStatus;
@@ -87,7 +88,13 @@ public class ConcurrentUpdateSolrClient extends SolrClient {
   private boolean internalHttpClient;
   private volatile Integer connectionTimeout;
   private volatile Integer soTimeout;
-
+  private volatile boolean closed;
+  
+  AtomicInteger pollInterrupts;
+  AtomicInteger pollExits;
+  AtomicInteger blockLoops;
+  AtomicInteger emptyQueueLoops;
+  
   /**
    * Uses an internally managed HttpClient instance.
    * 
@@ -156,6 +163,13 @@ public class ConcurrentUpdateSolrClient extends SolrClient {
       scheduler = ExecutorUtil.newMDCAwareCachedThreadPool(new SolrjNamedThreadFactory("concurrentUpdateScheduler"));
       shutdownExecutor = true;
     }
+    
+    if (log.isDebugEnabled()) {
+      pollInterrupts = new AtomicInteger();
+      pollExits = new AtomicInteger();
+      blockLoops = new AtomicInteger();
+      emptyQueueLoops = new AtomicInteger();
+    }
   }
 
   public Set<String> getQueryParams() {
@@ -174,13 +188,19 @@ public class ConcurrentUpdateSolrClient extends SolrClient {
    * Opens a connection and sends everything...
    */
   class Runner implements Runnable {
+    volatile Thread thread = null;
+    volatile boolean inPoll = false;
+    
+    public Thread getThread() {
+      return thread;
+    }
+    
     @Override
     public void run() {
+      this.thread = Thread.currentThread();
       log.debug("starting runner: {}", this);
-
       // This loop is so we can continue if an element was added to the queue after the last runner exited.
       for (;;) {
-
         try {
 
           sendUpdateStream();
@@ -191,7 +211,6 @@ public class ConcurrentUpdateSolrClient extends SolrClient {
           }
           handleError(e);
         } finally {
-
           synchronized (runners) {
             // check to see if anything else was added to the queue
             if (runners.size() == 1 && !queue.isEmpty() && !scheduler.isShutdown()) {
@@ -205,26 +224,42 @@ public class ConcurrentUpdateSolrClient extends SolrClient {
               break;
             }
           }
-
         }
       }
 
       log.debug("finished: {}", this);
     }
 
+    public void interruptPoll() {
+      Thread lthread = thread;
+      if (inPoll && lthread != null) {
+        lthread.interrupt();
+      }
+    }
+    
     //
     // Pull from the queue multiple times and streams over a single connection.
     // Exits on exception, interruption, or an empty queue to pull from.
     //
     void sendUpdateStream() throws Exception {
+    
       while (!queue.isEmpty()) {
         HttpPost method = null;
         HttpResponse response = null;
-
+        
         InputStream rspBody = null;
         try {
-          final Update update = 
-              queue.poll(pollQueueTime, TimeUnit.MILLISECONDS);
+          Update update;
+          notifyQueueAndRunnersIfEmptyQueue();
+          try {
+            inPoll = true;
+            update = queue.poll(pollQueueTime, TimeUnit.MILLISECONDS);
+          } catch (InterruptedException e) {
+            if (log.isDebugEnabled()) pollInterrupts.incrementAndGet();
+            continue;
+          } finally {
+            inPoll = false;
+          }
           if (update == null)
             break;
 
@@ -234,61 +269,73 @@ public class ConcurrentUpdateSolrClient extends SolrClient {
           final ModifiableSolrParams origParams = new ModifiableSolrParams(update.getRequest().getParams());
 
           EntityTemplate template = new EntityTemplate(new ContentProducer() {
-
+            
             @Override
             public void writeTo(OutputStream out) throws IOException {
-              try {
-                if (isXml) {
-                  out.write("<stream>".getBytes(StandardCharsets.UTF_8)); // can be anything
+
+              if (isXml) {
+                out.write("<stream>".getBytes(StandardCharsets.UTF_8)); // can be anything
+              }
+              Update upd = update;
+              while (upd != null) {
+                UpdateRequest req = upd.getRequest();
+                SolrParams currentParams = new ModifiableSolrParams(req.getParams());
+                if (!origParams.toNamedList().equals(currentParams.toNamedList())) {
+                  queue.add(upd); // params are different, push back to queue
+                  break;
                 }
-                Update upd = update;
-                while (upd != null) {
-                  UpdateRequest req = upd.getRequest();
-                  SolrParams currentParams = new ModifiableSolrParams(req.getParams());
-                  if (!origParams.toNamedList().equals(currentParams.toNamedList())) {
-                    queue.add(upd); // params are different, push back to queue
-                    break;
-                  }
 
-                  client.requestWriter.write(req, out);
-                  if (isXml) {
-                    // check for commit or optimize
-                    SolrParams params = req.getParams();
-                    if (params != null) {
-                      String fmt = null;
-                      if (params.getBool(UpdateParams.OPTIMIZE, false)) {
-                        fmt = "<optimize waitSearcher=\"%s\" />";
-                      } else if (params.getBool(UpdateParams.COMMIT, false)) {
-                        fmt = "<commit waitSearcher=\"%s\" />";
-                      }
-                      if (fmt != null) {
-                        byte[] content = String.format(Locale.ROOT,
-                            fmt,
-                            params.getBool(UpdateParams.WAIT_SEARCHER, false)
-                                + "").getBytes(StandardCharsets.UTF_8);
-                        out.write(content);
-                      }
+                client.requestWriter.write(req, out);
+                if (isXml) {
+                  // check for commit or optimize
+                  SolrParams params = req.getParams();
+                  if (params != null) {
+                    String fmt = null;
+                    if (params.getBool(UpdateParams.OPTIMIZE, false)) {
+                      fmt = "<optimize waitSearcher=\"%s\" />";
+                    } else if (params.getBool(UpdateParams.COMMIT, false)) {
+                      fmt = "<commit waitSearcher=\"%s\" />";
+                    }
+                    if (fmt != null) {
+                      byte[] content = String.format(Locale.ROOT,
+                          fmt, params.getBool(UpdateParams.WAIT_SEARCHER, false)
+                              + "")
+                          .getBytes(StandardCharsets.UTF_8);
+                      out.write(content);
                     }
                   }
-                  out.flush();
-
-                  if (pollQueueTime > 0 && threadCount == 1 && req.isLastDocInBatch()) {
-                    // no need to wait to see another doc in the queue if we've hit the last doc in a batch
-                    upd = queue.poll(0, TimeUnit.MILLISECONDS);
-                  } else {
-                    upd = queue.poll(pollQueueTime, TimeUnit.MILLISECONDS);
-                  }
-
                 }
-
-                if (isXml) {
-                  out.write("</stream>".getBytes(StandardCharsets.UTF_8));
+                out.flush();
+
+                notifyQueueAndRunnersIfEmptyQueue();
+                inPoll = true;
+                try {
+                  while (true) {
+                    try {
+                      upd = queue.poll(pollQueueTime, TimeUnit.MILLISECONDS);
+                      break;
+                    } catch (InterruptedException e) {
+                      if (log.isDebugEnabled()) pollInterrupts.incrementAndGet();
+                      if (!queue.isEmpty()) {
+                        continue;
+                      }
+                      if (log.isDebugEnabled()) pollExits.incrementAndGet();
+                      upd = null;
+                      break;
+                    } finally {
+                      inPoll = false;
+                    }
+                  }
+                } finally {
+                  inPoll = false;
                 }
+              }
 
-              } catch (InterruptedException e) {
-                Thread.currentThread().interrupt();
-                log.warn("", e);
+              if (isXml) {
+                out.write("</stream>".getBytes(StandardCharsets.UTF_8));
               }
+            
+            
             }
           });
 
@@ -318,10 +365,13 @@ public class ConcurrentUpdateSolrClient extends SolrClient {
           method.setEntity(template);
           method.addHeader("User-Agent", HttpSolrClient.AGENT);
           method.addHeader("Content-Type", contentType);
-
+          
+       
           response = client.getHttpClient()
               .execute(method, HttpClientUtil.createNewHttpClientRequestContext());
+          
           rspBody = response.getEntity().getContent();
+            
           int statusCode = response.getStatusLine().getStatusCode();
           if (statusCode != HttpStatus.SC_OK) {
             StringBuilder msg = new StringBuilder();
@@ -364,6 +414,7 @@ public class ConcurrentUpdateSolrClient extends SolrClient {
           } else {
             onSuccess(response);
           }
+          
         } finally {
           try {
             if (response != null) {
@@ -372,10 +423,25 @@ public class ConcurrentUpdateSolrClient extends SolrClient {
           } catch (Exception e) {
             log.error("Error consuming and closing http response stream.", e);
           }
+          notifyQueueAndRunnersIfEmptyQueue();
         }
       }
     }
   }
+  
+  private void notifyQueueAndRunnersIfEmptyQueue() {
+    if (queue.size() == 0) {
+      synchronized (queue) {
+        // queue may be empty
+        queue.notifyAll();
+      }
+      synchronized (runners) {
+        // we notify runners too - if there is a high queue poll time and this is the update
+        // that emptied the queue, we make an attempt to avoid the 250ms timeout in blockUntilFinished
+        runners.notifyAll();
+      }
+    }
+  }
 
   // *must* be called with runners monitor held, e.g. synchronized(runners){ addRunner() }
   private void addRunner() {
@@ -383,7 +449,9 @@ public class ConcurrentUpdateSolrClient extends SolrClient {
     try {
       Runner r = new Runner();
       runners.add(r);
+      
       scheduler.execute(r);  // this can throw an exception if the scheduler has been shutdown, but that should be fine.
+
     } finally {
       MDC.remove("ConcurrentUpdateSolrClient.url");
     }
@@ -517,29 +585,52 @@ public class ConcurrentUpdateSolrClient extends SolrClient {
   public synchronized void blockUntilFinished() {
     lock = new CountDownLatch(1);
     try {
+
+      waitForEmptyQueue();
+      interruptRunnerThreadsPolling();
+
       synchronized (runners) {
 
         // NOTE: if the executor is shut down, runners may never become empty (a scheduled task may never be run,
-        // which means it would never remove itself from the runners list.  This is why we don't wait forever
+        // which means it would never remove itself from the runners list. This is why we don't wait forever
         // and periodically check if the scheduler is shutting down.
+        int loopCount = 0;
         while (!runners.isEmpty()) {
-          try {
-            runners.wait(250);
-          } catch (InterruptedException e) {
-            Thread.interrupted();
-          }
+          
+          if (log.isDebugEnabled()) blockLoops.incrementAndGet();
           
           if (scheduler.isShutdown())
             break;
-                      
+          
+          loopCount++;
+          
           // Need to check if the queue is empty before really considering this is finished (SOLR-4260)
           int queueSize = queue.size();
           if (queueSize > 0 && runners.isEmpty()) {
             // TODO: can this still happen?
-            log.warn("No more runners, but queue still has "+
-              queueSize+" adding more runners to process remaining requests on queue");
+            log.warn("No more runners, but queue still has " +
+                queueSize + " adding more runners to process remaining requests on queue");
             addRunner();
           }
+          
+          interruptRunnerThreadsPolling();
+          
+          // try to avoid the worst case wait timeout
+          // without bad spin
+          int timeout;
+          if (loopCount < 3) {
+            timeout = 10;
+          } else if (loopCount < 10) {
+            timeout = 25;
+          } else {
+            timeout = 250;
+          }
+          
+          try {
+            runners.wait(timeout);
+          } catch (InterruptedException e) {
+            Thread.currentThread().interrupt();
+          }
         }
       }
     } finally {
@@ -548,6 +639,29 @@ public class ConcurrentUpdateSolrClient extends SolrClient {
     }
   }
 
+  private void waitForEmptyQueue() {
+
+    while (!queue.isEmpty()) {
+      if (log.isDebugEnabled()) emptyQueueLoops.incrementAndGet();
+
+      synchronized (runners) {
+        int queueSize = queue.size();
+        if (queueSize > 0 && runners.isEmpty()) {
+          log.warn("No more runners, but queue still has " +
+              queueSize + " adding more runners to process remaining requests on queue");
+          addRunner();
+        }
+      }
+      synchronized (queue) {
+        try {
+          queue.wait(250);
+        } catch (InterruptedException e) {
+          Thread.currentThread().interrupt();
+        }
+      }
+    }
+  }
+
   public void handleError(Throwable ex) {
     log.error("error", ex);
   }
@@ -560,19 +674,42 @@ public class ConcurrentUpdateSolrClient extends SolrClient {
   }
 
   @Override
-  public void close() {
-    if (internalHttpClient) IOUtils.closeQuietly(client);
-    if (shutdownExecutor) {
-      scheduler.shutdown();
-      try {
-        if (!scheduler.awaitTermination(60, TimeUnit.SECONDS)) {
+  public synchronized void close() {
+    if (closed) {
+      interruptRunnerThreadsPolling();
+      return;
+    }
+    closed = true;
+    
+    try {
+      if (shutdownExecutor) {
+        scheduler.shutdown();
+        interruptRunnerThreadsPolling();
+        try {
+          if (!scheduler.awaitTermination(60, TimeUnit.SECONDS)) {
+            scheduler.shutdownNow();
+            if (!scheduler.awaitTermination(60, TimeUnit.SECONDS)) log
+                .error("ExecutorService did not terminate");
+          }
+        } catch (InterruptedException ie) {
           scheduler.shutdownNow();
-          if (!scheduler.awaitTermination(60, TimeUnit.SECONDS)) log
-              .error("ExecutorService did not terminate");
+          Thread.currentThread().interrupt();
         }
-      } catch (InterruptedException ie) {
-        scheduler.shutdownNow();
-        Thread.currentThread().interrupt();
+      } else {
+        interruptRunnerThreadsPolling();
+      }
+    } finally {
+      if (internalHttpClient) IOUtils.closeQuietly(client);
+      if (log.isDebugEnabled()) {
+        log.debug("STATS pollInteruppts={} pollExists={} blockLoops={} emptyQueueLoops={}", pollInterrupts.get(), pollExits.get(), blockLoops.get(), emptyQueueLoops.get());
+      }
+    }
+  }
+
+  private void interruptRunnerThreadsPolling() {
+    synchronized (runners) {
+      for (Runner runner : runners) {
+        runner.interruptPoll();
       }
     }
   }
@@ -590,17 +727,29 @@ public class ConcurrentUpdateSolrClient extends SolrClient {
   }
 
   public void shutdownNow() {
-    if (internalHttpClient) IOUtils.closeQuietly(client);
-    if (shutdownExecutor) {
-      scheduler.shutdownNow(); // Cancel currently executing tasks
-      try {
-        if (!scheduler.awaitTermination(30, TimeUnit.SECONDS)) 
-          log.error("ExecutorService did not terminate");
-      } catch (InterruptedException ie) {
-        scheduler.shutdownNow();
-        Thread.currentThread().interrupt();
+    if (closed) {
+      return;
+    }
+    closed = true;
+    try {
+
+      if (shutdownExecutor) {
+        scheduler.shutdown();
+        interruptRunnerThreadsPolling();
+        scheduler.shutdownNow(); // Cancel currently executing tasks
+        try {
+          if (!scheduler.awaitTermination(30, TimeUnit.SECONDS))
+            log.error("ExecutorService did not terminate");
+        } catch (InterruptedException ie) {
+          scheduler.shutdownNow();
+          Thread.currentThread().interrupt();
+        }
+      } else {
+        interruptRunnerThreadsPolling();
       }
-    }    
+    } finally {
+      if (internalHttpClient) IOUtils.closeQuietly(client);
+    }
   }
   
   public void setParser(ResponseParser responseParser) {

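Taken together, the latency fix in ConcurrentUpdateSolrClient is a two-sided handshake: runners call notifyAll() the moment the queue drains, and blockUntilFinished()/waitForEmptyQueue() wait with an escalating timeout (10ms, then 25ms, then 250ms) instead of a flat 250ms. A condensed, self-contained sketch of that pattern (illustrative names only):

import java.util.concurrent.LinkedBlockingQueue;

// Condensed sketch of the SOLR-9824 handshake: the draining side notifies
// waiters as soon as the queue empties; the waiting side escalates its
// timeout (10ms -> 25ms -> 250ms) to avoid both busy-spin and a flat
// worst-case 250ms stall. Not the Solr classes.
public class BackoffWaitSketch {
  static final LinkedBlockingQueue<String> queue = new LinkedBlockingQueue<>();

  static void drainOne() throws InterruptedException {
    String item = queue.take();
    System.out.println("processed " + item);
    if (queue.isEmpty()) {
      synchronized (queue) {
        queue.notifyAll(); // wake blockers immediately, skipping the timeout
      }
    }
  }

  static void waitForEmptyQueue() throws InterruptedException {
    int loopCount = 0;
    while (!queue.isEmpty()) {
      int timeout = loopCount < 3 ? 10 : (loopCount < 10 ? 25 : 250);
      loopCount++;
      synchronized (queue) {
        if (!queue.isEmpty()) queue.wait(timeout);
      }
    }
  }

  public static void main(String[] args) throws Exception {
    queue.add("doc-1");
    Thread drainer = new Thread(() -> {
      try { drainOne(); } catch (InterruptedException ignored) {}
    });
    drainer.start();
    waitForEmptyQueue();
    System.out.println("queue drained");
    drainer.join();
  }
}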

[12/50] [abbrv] lucene-solr:jira/solr-9858: LUCENE-7706: Update MergeScheduler's documentation - clone() is no longer there

Posted by ab...@apache.org.
LUCENE-7706: Update MergeScheduler's documentation - clone() is no longer there


Project: http://git-wip-us.apache.org/repos/asf/lucene-solr/repo
Commit: http://git-wip-us.apache.org/repos/asf/lucene-solr/commit/8ed8ecfc
Tree: http://git-wip-us.apache.org/repos/asf/lucene-solr/tree/8ed8ecfc
Diff: http://git-wip-us.apache.org/repos/asf/lucene-solr/diff/8ed8ecfc

Branch: refs/heads/jira/solr-9858
Commit: 8ed8ecfc7e972d8dbbf02497f53da54b1d6b8461
Parents: 8367e15
Author: Dawid Weiss <dw...@apache.org>
Authored: Thu Feb 23 11:51:03 2017 +0100
Committer: Dawid Weiss <dw...@apache.org>
Committed: Thu Feb 23 11:51:03 2017 +0100

----------------------------------------------------------------------
 lucene/core/src/java/org/apache/lucene/index/MergeScheduler.java | 3 ---
 1 file changed, 3 deletions(-)
----------------------------------------------------------------------


http://git-wip-us.apache.org/repos/asf/lucene-solr/blob/8ed8ecfc/lucene/core/src/java/org/apache/lucene/index/MergeScheduler.java
----------------------------------------------------------------------
diff --git a/lucene/core/src/java/org/apache/lucene/index/MergeScheduler.java b/lucene/core/src/java/org/apache/lucene/index/MergeScheduler.java
index 3a9f98f..65af45b 100644
--- a/lucene/core/src/java/org/apache/lucene/index/MergeScheduler.java
+++ b/lucene/core/src/java/org/apache/lucene/index/MergeScheduler.java
@@ -26,9 +26,6 @@ import org.apache.lucene.util.InfoStream;
  *  implementing this interface to execute the merges
  *  selected by a {@link MergePolicy}.  The default
  *  MergeScheduler is {@link ConcurrentMergeScheduler}.</p>
- *  <p>Implementers of sub-classes should make sure that {@link #clone()}
- *  returns an independent instance able to work with any {@link IndexWriter}
- *  instance.</p>
  * @lucene.experimental
 */
 public abstract class MergeScheduler implements Closeable {
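With the stale clone() note removed, the documented contract reduces to implementing merge() and close(). For orientation, a hedged sketch modeled on Lucene's own SerialMergeScheduler (signatures assumed from this era of Lucene; getNextMerge() is package-private, which is why the built-in schedulers live in org.apache.lucene.index):

package org.apache.lucene.index; // getNextMerge() is package-private here

import java.io.IOException;

// Hedged sketch of the minimal MergeScheduler contract after this cleanup:
// merge() plus close(), no clone() obligation. Runs every pending merge
// inline on the calling thread, one at a time.
public class InlineMergeScheduler extends MergeScheduler {

  @Override
  public synchronized void merge(IndexWriter writer, MergeTrigger trigger,
                                 boolean newMergesFound) throws IOException {
    MergePolicy.OneMerge merge;
    while ((merge = writer.getNextMerge()) != null) {
      writer.merge(merge);
    }
  }

  @Override
  public void close() {
    // stateless scheduler: nothing to release
  }
}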