Posted to commits@lucene.apache.org by dw...@apache.org on 2021/03/10 09:48:47 UTC

[lucene] branch branch_7_0 created (now e3d379a)

This is an automated email from the ASF dual-hosted git repository.

dweiss pushed a change to branch branch_7_0
in repository https://gitbox.apache.org/repos/asf/lucene.git.


      at e3d379a  SOLR-9743: documentation

This branch includes the following new commits:

     new a29a087  SOLR-10842: Convert all remaining {{quickstart.html}} links to {{guide/solr-tutorial.html}}; remove all references to quickstart from the build; and version the link to the ref guide's tutorial in Solr's versioned top-level documentation page.
     new db54a7a  Add version 7.0.2.  Add 7.0.1 backcompat test indexes
     new ce2b7c9  Add Lucene & Solr 7.0.1
     new 657ac33  remove unreleased/unsupported java9 note
     new f5ec4c0  LUCENE-7995: 'ant stage-maven-artifacts' should work from the top-level project directory, and should provide a better error message when its 'maven.dist.dir' param points to a non-existent directory
     new 719c992  addBackCompatIndexes.py: Don't generate sorted indexes for versions < 6.2
     new 1f41349  LUCENE-6144: Upgrade Ivy to 2.4.0; 'ant ivy-bootstrap' now removes old Ivy jars in ~/.ant/lib/.
     new e3d379a  SOLR-9743: documentation

The 8 revisions listed above as "new" are entirely new to this
repository and will be described in separate emails.  The revisions
listed as "add" were already present in the repository and have only
been added to this reference.



[lucene] 06/08: addBackCompatIndexes.py: Don't generate sorted indexes for versions < 6.2

Posted by dw...@apache.org.
This is an automated email from the ASF dual-hosted git repository.

dweiss pushed a commit to branch branch_7_0
in repository https://gitbox.apache.org/repos/asf/lucene.git

commit 719c9920f38fdc906e63d1622c2f227847ed5994
Author: Steve Rowe <sa...@apache.org>
AuthorDate: Tue Oct 24 12:27:55 2017 -0400

    addBackCompatIndexes.py: Don't generate sorted indexes for versions < 6.2
---
 dev-tools/scripts/addBackcompatIndexes.py | 8 +++++---
 1 file changed, 5 insertions(+), 3 deletions(-)

diff --git a/dev-tools/scripts/addBackcompatIndexes.py b/dev-tools/scripts/addBackcompatIndexes.py
index 08257ea..7a36002 100644
--- a/dev-tools/scripts/addBackcompatIndexes.py
+++ b/dev-tools/scripts/addBackcompatIndexes.py
@@ -103,7 +103,7 @@ def update_backcompat_tests(types, index_version, current_version):
   module = 'lucene/backward-codecs'
   filename = '%s/src/test/org/apache/lucene/index/TestBackwardsCompatibility.java' % module
   if not current_version.is_back_compat_with(index_version):
-    matcher = re.compile(r'final String\[\] unsupportedNames = {|};'),
+    matcher = re.compile(r'final String\[\] unsupportedNames = {|};')
   elif 'sorted' in types:
     matcher = re.compile(r'final static String\[\] oldSortedNames = {|};')
   else:
@@ -245,7 +245,8 @@ def main():
   current_version = scriptutil.Version.parse(scriptutil.find_current_version())
   create_and_add_index(source, 'cfs', c.version, current_version, c.temp_dir)
   create_and_add_index(source, 'nocfs', c.version, current_version, c.temp_dir)
-  create_and_add_index(source, 'sorted', c.version, current_version, c.temp_dir)
+  if c.version.major > 6 or (c.version.major == 6 and c.version.minor >= 2):
+    create_and_add_index(source, 'sorted', c.version, current_version, c.temp_dir)
   if c.version.minor == 0 and c.version.bugfix == 0 and c.version.major < current_version.major:
     create_and_add_index(source, 'moreterms', c.version, current_version, c.temp_dir)
     create_and_add_index(source, 'dvupdates', c.version, current_version, c.temp_dir)
@@ -254,7 +255,8 @@ def main():
     
   print('\nAdding backwards compatibility tests')
   update_backcompat_tests(['cfs', 'nocfs'], c.version, current_version)
-  update_backcompat_tests(['sorted'], c.version, current_version)
+  if c.version.major > 6 or (c.version.major == 6 and c.version.minor >= 2):
+    update_backcompat_tests(['sorted'], c.version, current_version)
 
   print('\nTesting changes')
   check_backcompat_tests()
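
For context: per the change above, sorted back-compat indexes are only generated and tested for releases from 6.2 onward (earlier releases predate configurable index sorting). A minimal, standalone sketch of the same cutoff check, using a stand-in namedtuple rather than the real scriptutil.Version class:

    from collections import namedtuple

    # Stand-in for scriptutil.Version; only major/minor matter for this check.
    Version = namedtuple('Version', ['major', 'minor', 'bugfix'])

    def has_sorted_index(version):
        """Mirror of the guard added above: sorted indexes exist only for >= 6.2."""
        return version.major > 6 or (version.major == 6 and version.minor >= 2)

    for v in (Version(6, 1, 0), Version(6, 2, 0), Version(7, 0, 1)):
        print('%d.%d.%d -> %s' % (v.major, v.minor, v.bugfix, has_sorted_index(v)))
    # 6.1.0 -> False, 6.2.0 -> True, 7.0.1 -> True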


[lucene] 04/08: remove unreleased/unsupported java9 note

Posted by dw...@apache.org.
This is an automated email from the ASF dual-hosted git repository.

dweiss pushed a commit to branch branch_7_0
in repository https://gitbox.apache.org/repos/asf/lucene.git

commit 657ac3375e713660a869177d15bb0a6dd505fad1
Author: Steve Rowe <sa...@gmail.com>
AuthorDate: Fri Oct 13 19:37:15 2017 -0400

    remove unreleased/unsupported java9 note
---
 lucene/JRE_VERSION_MIGRATION.txt | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)

diff --git a/lucene/JRE_VERSION_MIGRATION.txt b/lucene/JRE_VERSION_MIGRATION.txt
index c840464..5bda8da 100644
--- a/lucene/JRE_VERSION_MIGRATION.txt
+++ b/lucene/JRE_VERSION_MIGRATION.txt
@@ -17,7 +17,7 @@ For reference, JRE major versions with their corresponding Unicode versions:
  * Java 6, Unicode 4.0
  * Java 7, Unicode 6.0
  * Java 8, Unicode 6.2
- * Java 9 (not yet released / offcially supported by Lucene), Unicode 8.0
+ * Java 9, Unicode 8.0
 
 In general, whether or not you need to re-index largely depends upon the data that
 you are searching, and what was changed in any given Unicode version. For example, 


[lucene] 03/08: Add Lucene & Solr 7.0.1

Posted by dw...@apache.org.
This is an automated email from the ASF dual-hosted git repository.

dweiss pushed a commit to branch branch_7_0
in repository https://gitbox.apache.org/repos/asf/lucene.git

commit ce2b7c90b1c999479a84f90a13bf48f7dbd90ddc
Author: Steve Rowe <sa...@apache.org>
AuthorDate: Fri Oct 6 16:17:29 2017 -0400

    Add Lucene & Solr 7.0.1
---
 dev-tools/doap/lucene.rdf | 7 +++++++
 dev-tools/doap/solr.rdf   | 7 +++++++
 2 files changed, 14 insertions(+)

diff --git a/dev-tools/doap/lucene.rdf b/dev-tools/doap/lucene.rdf
index d09e1d1..9f0b752 100644
--- a/dev-tools/doap/lucene.rdf
+++ b/dev-tools/doap/lucene.rdf
@@ -68,6 +68,13 @@
 
     <release>
       <Version>
+        <name>lucene-7.0.1</name>
+        <created>2017-10-06</created>
+	      <revision>7.0.1</revision>
+      </Version>
+    </release>
+    <release>
+      <Version>
         <name>lucene-7.0.0</name>
         <created>2017-09-20</created>
 	      <revision>7.0.0</revision>
diff --git a/dev-tools/doap/solr.rdf b/dev-tools/doap/solr.rdf
index 31c18ff..b172507 100644
--- a/dev-tools/doap/solr.rdf
+++ b/dev-tools/doap/solr.rdf
@@ -68,6 +68,13 @@
 
     <release>
       <Version>
+        <name>solr-7.0.1</name>
+        <created>2017-10-06</created>
+        <revision>7.0.1</revision>
+      </Version>
+    </release>
+    <release>
+      <Version>
         <name>solr-7.0.0</name>
         <created>2017-09-20</created>
         <revision>7.0.0</revision>


[lucene] 05/08: LUCENE-7995: 'ant stage-maven-artifacts' should work from the top-level project directory, and should provide a better error message when its 'maven.dist.dir' param points to a non-existent directory

Posted by dw...@apache.org.
This is an automated email from the ASF dual-hosted git repository.

dweiss pushed a commit to branch branch_7_0
in repository https://gitbox.apache.org/repos/asf/lucene.git

commit f5ec4c02e2dad6f8ca490670cb53e3bcf26e797e
Author: Steve Rowe <sa...@gmail.com>
AuthorDate: Mon Oct 16 16:00:14 2017 -0400

    LUCENE-7995: 'ant stage-maven-artifacts' should work from the top-level project directory, and should provide a better error message when its 'maven.dist.dir' param points to a non-existent directory
---
 dev-tools/scripts/write.stage.maven.build.xml.pl |  1 -
 lucene/common-build.xml                          | 10 ++++++++--
 2 files changed, 8 insertions(+), 3 deletions(-)

diff --git a/dev-tools/scripts/write.stage.maven.build.xml.pl b/dev-tools/scripts/write.stage.maven.build.xml.pl
index c5e8aa8..21f09e8 100755
--- a/dev-tools/scripts/write.stage.maven.build.xml.pl
+++ b/dev-tools/scripts/write.stage.maven.build.xml.pl
@@ -46,7 +46,6 @@ my $output_build_xml_file = $ARGV[1];
 my $common_build_xml = $ARGV[2];
 my $m2_credentials_prompt = $ARGV[3];
 my $m2_repository_id = $ARGV[4];
-my $m2_repository_url = $ARGV[5];
 if ($^O eq 'cygwin') { # Make sure Cygwin Perl can find the output path
   $output_build_xml_file = `cygpath -u "$output_build_xml_file"`;
   $output_build_xml_file =~ s/\s+$//; # Trim trailing whitespace
diff --git a/lucene/common-build.xml b/lucene/common-build.xml
index dba69db..37f35f3 100644
--- a/lucene/common-build.xml
+++ b/lucene/common-build.xml
@@ -1812,7 +1812,14 @@ ${ant.project.name}.test.dependencies=${test.classpath.list}
   <target name="stage-maven-artifacts">
     <sequential>
       <property name="output.build.xml" location="${build.dir}/stage_maven_build.xml"/>
-      <property name="dev-tools.scripts.dir" value="../dev-tools/scripts"/>
+      <property name="dev-tools.scripts.dir" value="${common.dir}/../dev-tools/scripts"/>
+      <fail message="maven.dist.dir '${maven.dist.dir}' does not exist!">
+        <condition>
+          <not>
+            <available file="${maven.dist.dir}" type="dir"/>
+          </not>
+        </condition>
+      </fail>
       <exec dir="." executable="${perl.exe}" failonerror="false" outputproperty="stage.maven.script.output"
         resultproperty="stage.maven.script.success">
         <arg value="-CSD"/>
@@ -1822,7 +1829,6 @@ ${ant.project.name}.test.dependencies=${test.classpath.list}
         <arg value="${common.dir}/common-build.xml"/> <!-- Imported from the ant file to be written -->
         <arg value="${m2.credentials.prompt}"/>
         <arg value="${m2.repository.id}"/>
-        <arg value="${m2.repository.url}"/>
       </exec>
       <echo message="${stage.maven.script.output}"/>
       <fail message="maven stage script failed!">


[lucene] 02/08: Add version 7.0.2. Add 7.0.1 backcompat test indexes

Posted by dw...@apache.org.
This is an automated email from the ASF dual-hosted git repository.

dweiss pushed a commit to branch branch_7_0
in repository https://gitbox.apache.org/repos/asf/lucene.git

commit db54a7a5f8e2028fe42601cf0b3d6e137bb6a82f
Author: Steve Rowe <sa...@apache.org>
AuthorDate: Thu Oct 5 21:03:46 2017 -0400

    Add version 7.0.2.  Add 7.0.1 backcompat test indexes
---
 lucene/CHANGES.txt                                     |   3 +++
 .../lucene/index/TestBackwardsCompatibility.java       |   7 +++++--
 .../test/org/apache/lucene/index/index.7.0.1-cfs.zip   | Bin 0 -> 15617 bytes
 .../test/org/apache/lucene/index/index.7.0.1-nocfs.zip | Bin 0 -> 15596 bytes
 .../src/test/org/apache/lucene/index/sorted.7.0.1.zip  | Bin 0 -> 75330 bytes
 .../core/src/java/org/apache/lucene/util/Version.java  |  11 +++++++++--
 lucene/version.properties                              |   2 +-
 solr/CHANGES.txt                                       |  17 +++++++++++++++++
 .../solr/configsets/_default/conf/solrconfig.xml       |   2 +-
 solr/example/example-DIH/solr/atom/conf/solrconfig.xml |   2 +-
 solr/example/example-DIH/solr/db/conf/solrconfig.xml   |   2 +-
 solr/example/example-DIH/solr/mail/conf/solrconfig.xml |   2 +-
 solr/example/example-DIH/solr/solr/conf/solrconfig.xml |   2 +-
 solr/example/example-DIH/solr/tika/conf/solrconfig.xml |   2 +-
 solr/example/files/conf/solrconfig.xml                 |   2 +-
 .../solr/configsets/_default/conf/solrconfig.xml       |   2 +-
 .../sample_techproducts_configs/conf/solrconfig.xml    |   2 +-
 17 files changed, 44 insertions(+), 14 deletions(-)

diff --git a/lucene/CHANGES.txt b/lucene/CHANGES.txt
index 3b14b65..fa4ef80 100644
--- a/lucene/CHANGES.txt
+++ b/lucene/CHANGES.txt
@@ -3,6 +3,9 @@ Lucene Change Log
 For more information on past and future Lucene versions, please see:
 http://s.apache.org/luceneversions
 
+======================= Lucene 7.0.2 =======================
+(No Changes)
+
 ======================= Lucene 7.0.1 =======================
 
 Bug Fixes
diff --git a/lucene/backward-codecs/src/test/org/apache/lucene/index/TestBackwardsCompatibility.java b/lucene/backward-codecs/src/test/org/apache/lucene/index/TestBackwardsCompatibility.java
index f1f3e48..c9244a4 100644
--- a/lucene/backward-codecs/src/test/org/apache/lucene/index/TestBackwardsCompatibility.java
+++ b/lucene/backward-codecs/src/test/org/apache/lucene/index/TestBackwardsCompatibility.java
@@ -313,7 +313,9 @@ public class TestBackwardsCompatibility extends LuceneTestCase {
     "6.6.1-cfs",
     "6.6.1-nocfs",
     "7.0.0-cfs",
-    "7.0.0-nocfs"
+    "7.0.0-nocfs",
+    "7.0.1-cfs",
+    "7.0.1-nocfs"
   };
 
   public static String[] getOldNames() {
@@ -331,7 +333,8 @@ public class TestBackwardsCompatibility extends LuceneTestCase {
     "sorted.6.5.1",
     "sorted.6.6.0",
     "sorted.6.6.1",
-    "sorted.7.0.0"
+    "sorted.7.0.0",
+    "sorted.7.0.1"
   };
 
   public static String[] getOldSortedNames() {
diff --git a/lucene/backward-codecs/src/test/org/apache/lucene/index/index.7.0.1-cfs.zip b/lucene/backward-codecs/src/test/org/apache/lucene/index/index.7.0.1-cfs.zip
new file mode 100644
index 0000000..30c2c06
Binary files /dev/null and b/lucene/backward-codecs/src/test/org/apache/lucene/index/index.7.0.1-cfs.zip differ
diff --git a/lucene/backward-codecs/src/test/org/apache/lucene/index/index.7.0.1-nocfs.zip b/lucene/backward-codecs/src/test/org/apache/lucene/index/index.7.0.1-nocfs.zip
new file mode 100644
index 0000000..2f435f2
Binary files /dev/null and b/lucene/backward-codecs/src/test/org/apache/lucene/index/index.7.0.1-nocfs.zip differ
diff --git a/lucene/backward-codecs/src/test/org/apache/lucene/index/sorted.7.0.1.zip b/lucene/backward-codecs/src/test/org/apache/lucene/index/sorted.7.0.1.zip
new file mode 100644
index 0000000..9aa32e8
Binary files /dev/null and b/lucene/backward-codecs/src/test/org/apache/lucene/index/sorted.7.0.1.zip differ
diff --git a/lucene/core/src/java/org/apache/lucene/util/Version.java b/lucene/core/src/java/org/apache/lucene/util/Version.java
index 552f479..f991ff4 100644
--- a/lucene/core/src/java/org/apache/lucene/util/Version.java
+++ b/lucene/core/src/java/org/apache/lucene/util/Version.java
@@ -138,11 +138,18 @@ public final class Version {
 
   /**
    * Match settings and bugs in Lucene's 7.0.1 release.
+   * @deprecated (7.0.2) Use latest
+   */
+  @Deprecated
+  public static final Version LUCENE_7_0_1 = new Version(7, 0, 1);
+
+  /**
+   * Match settings and bugs in Lucene's 7.0.2 release.
    * <p>
    * Use this to get the latest &amp; greatest settings, bug
    * fixes, etc, for Lucene.
    */
-  public static final Version LUCENE_7_0_1 = new Version(7, 0, 1);
+  public static final Version LUCENE_7_0_2 = new Version(7, 0, 2);
 
   // To add a new version:
   //  * Only add above this comment
@@ -163,7 +170,7 @@ public final class Version {
    * some defaults may have changed and may break functionality 
    * in your application.
    */
-  public static final Version LATEST = LUCENE_7_0_1;
+  public static final Version LATEST = LUCENE_7_0_2;
 
   /**
    * Constant for backwards compatibility.
diff --git a/lucene/version.properties b/lucene/version.properties
index 3fa1bf9..491d4de 100644
--- a/lucene/version.properties
+++ b/lucene/version.properties
@@ -2,7 +2,7 @@
 
 # RELEASE MANAGER must change this file after creating a release and
 # enter new base version (format "x.y.z", no prefix/appendix): 
-version.base=7.0.1
+version.base=7.0.2
 
 # Other version property defaults, don't change:
 version.suffix=SNAPSHOT
diff --git a/solr/CHANGES.txt b/solr/CHANGES.txt
index 8472098..0ac26b3 100644
--- a/solr/CHANGES.txt
+++ b/solr/CHANGES.txt
@@ -16,6 +16,23 @@ In this release, there is an example Solr server including a bundled
 servlet container in the directory named "example".
 See the Solr tutorial at https://lucene.apache.org/solr/guide/solr-tutorial.html
 
+==================  7.0.2 ==================
+
+Consult the LUCENE_CHANGES.txt file for additional, low level, changes in this release.
+
+Versions of Major Components
+---------------------
+Apache Tika 1.13
+Carrot2 3.15.0
+Velocity 1.7 and Velocity Tools 2.0
+Apache UIMA 2.3.1
+Apache ZooKeeper 3.4.10
+Jetty 9.3.14.v20161028
+
+
+(No Changes)
+
+
 ==================  7.0.1 ==================
 
 Consult the LUCENE_CHANGES.txt file for additional, low level, changes in this release.
diff --git a/solr/core/src/test-files/solr/configsets/_default/conf/solrconfig.xml b/solr/core/src/test-files/solr/configsets/_default/conf/solrconfig.xml
index 43000fa..d6ae2f7 100644
--- a/solr/core/src/test-files/solr/configsets/_default/conf/solrconfig.xml
+++ b/solr/core/src/test-files/solr/configsets/_default/conf/solrconfig.xml
@@ -35,7 +35,7 @@
        that you fully re-index after changing this setting as it can
        affect both how text is indexed and queried.
   -->
-  <luceneMatchVersion>7.0.1</luceneMatchVersion>
+  <luceneMatchVersion>7.0.2</luceneMatchVersion>
 
   <!-- <lib/> directives can be used to instruct Solr to load any Jars
        identified and use them to resolve any "plugins" specified in
diff --git a/solr/example/example-DIH/solr/atom/conf/solrconfig.xml b/solr/example/example-DIH/solr/atom/conf/solrconfig.xml
index 40e1826..33646fe 100644
--- a/solr/example/example-DIH/solr/atom/conf/solrconfig.xml
+++ b/solr/example/example-DIH/solr/atom/conf/solrconfig.xml
@@ -36,7 +36,7 @@
     that you fully re-index after changing this setting as it can
     affect both how text is indexed and queried.
   -->
-  <luceneMatchVersion>7.0.1</luceneMatchVersion>
+  <luceneMatchVersion>7.0.2</luceneMatchVersion>
 
   <lib dir="${solr.install.dir:../../../..}/dist/" regex="solr-dataimporthandler-.*\.jar"/>
 
diff --git a/solr/example/example-DIH/solr/db/conf/solrconfig.xml b/solr/example/example-DIH/solr/db/conf/solrconfig.xml
index d12e733..8884580 100644
--- a/solr/example/example-DIH/solr/db/conf/solrconfig.xml
+++ b/solr/example/example-DIH/solr/db/conf/solrconfig.xml
@@ -35,7 +35,7 @@
        that you fully re-index after changing this setting as it can
        affect both how text is indexed and queried.
   -->
-  <luceneMatchVersion>7.0.1</luceneMatchVersion>
+  <luceneMatchVersion>7.0.2</luceneMatchVersion>
 
   <!-- <lib/> directives can be used to instruct Solr to load any Jars
        identified and use them to resolve any "plugins" specified in
diff --git a/solr/example/example-DIH/solr/mail/conf/solrconfig.xml b/solr/example/example-DIH/solr/mail/conf/solrconfig.xml
index 2cd357d..71c5e0d 100644
--- a/solr/example/example-DIH/solr/mail/conf/solrconfig.xml
+++ b/solr/example/example-DIH/solr/mail/conf/solrconfig.xml
@@ -35,7 +35,7 @@
        that you fully re-index after changing this setting as it can
        affect both how text is indexed and queried.
   -->
-  <luceneMatchVersion>7.0.1</luceneMatchVersion>
+  <luceneMatchVersion>7.0.2</luceneMatchVersion>
 
   <!-- <lib/> directives can be used to instruct Solr to load any Jars
        identified and use them to resolve any "plugins" specified in
diff --git a/solr/example/example-DIH/solr/solr/conf/solrconfig.xml b/solr/example/example-DIH/solr/solr/conf/solrconfig.xml
index c7fd997..97ff3fa 100644
--- a/solr/example/example-DIH/solr/solr/conf/solrconfig.xml
+++ b/solr/example/example-DIH/solr/solr/conf/solrconfig.xml
@@ -35,7 +35,7 @@
        that you fully re-index after changing this setting as it can
        affect both how text is indexed and queried.
   -->
-  <luceneMatchVersion>7.0.1</luceneMatchVersion>
+  <luceneMatchVersion>7.0.2</luceneMatchVersion>
 
   <!-- <lib/> directives can be used to instruct Solr to load any Jars
        identified and use them to resolve any "plugins" specified in
diff --git a/solr/example/example-DIH/solr/tika/conf/solrconfig.xml b/solr/example/example-DIH/solr/tika/conf/solrconfig.xml
index 285682d..06cf869 100644
--- a/solr/example/example-DIH/solr/tika/conf/solrconfig.xml
+++ b/solr/example/example-DIH/solr/tika/conf/solrconfig.xml
@@ -36,7 +36,7 @@
    that you fully re-index after changing this setting as it can
    affect both how text is indexed and queried.
   -->
-  <luceneMatchVersion>7.0.1</luceneMatchVersion>
+  <luceneMatchVersion>7.0.2</luceneMatchVersion>
 
   <!-- Load Data Import Handler and Apache Tika (extraction) libraries -->
   <lib dir="${solr.install.dir:../../../..}/dist/" regex="solr-dataimporthandler-.*\.jar"/>
diff --git a/solr/example/files/conf/solrconfig.xml b/solr/example/files/conf/solrconfig.xml
index e7c616c..e7fa607 100644
--- a/solr/example/files/conf/solrconfig.xml
+++ b/solr/example/files/conf/solrconfig.xml
@@ -35,7 +35,7 @@
        that you fully re-index after changing this setting as it can
        affect both how text is indexed and queried.
   -->
-  <luceneMatchVersion>7.0.1</luceneMatchVersion>
+  <luceneMatchVersion>7.0.2</luceneMatchVersion>
 
   <!-- <lib/> directives can be used to instruct Solr to load any Jars
        identified and use them to resolve any "plugins" specified in
diff --git a/solr/server/solr/configsets/_default/conf/solrconfig.xml b/solr/server/solr/configsets/_default/conf/solrconfig.xml
index 43000fa..d6ae2f7 100644
--- a/solr/server/solr/configsets/_default/conf/solrconfig.xml
+++ b/solr/server/solr/configsets/_default/conf/solrconfig.xml
@@ -35,7 +35,7 @@
        that you fully re-index after changing this setting as it can
        affect both how text is indexed and queried.
   -->
-  <luceneMatchVersion>7.0.1</luceneMatchVersion>
+  <luceneMatchVersion>7.0.2</luceneMatchVersion>
 
   <!-- <lib/> directives can be used to instruct Solr to load any Jars
        identified and use them to resolve any "plugins" specified in
diff --git a/solr/server/solr/configsets/sample_techproducts_configs/conf/solrconfig.xml b/solr/server/solr/configsets/sample_techproducts_configs/conf/solrconfig.xml
index 98cedb6..f77478e 100644
--- a/solr/server/solr/configsets/sample_techproducts_configs/conf/solrconfig.xml
+++ b/solr/server/solr/configsets/sample_techproducts_configs/conf/solrconfig.xml
@@ -35,7 +35,7 @@
        that you fully re-index after changing this setting as it can
        affect both how text is indexed and queried.
   -->
-  <luceneMatchVersion>7.0.1</luceneMatchVersion>
+  <luceneMatchVersion>7.0.2</luceneMatchVersion>
 
   <!-- <lib/> directives can be used to instruct Solr to load any Jars
        identified and use them to resolve any "plugins" specified in


[lucene] 01/08: SOLR-10842: Convert all remaining {{quickstart.html}} links to {{guide/solr-tutorial.html}}; remove all references to quickstart from the build; and version the link to the ref guide's tutorial in Solr's versioned top-level documentation page.

Posted by dw...@apache.org.
This is an automated email from the ASF dual-hosted git repository.

dweiss pushed a commit to branch branch_7_0
in repository https://gitbox.apache.org/repos/asf/lucene.git

commit a29a08716766706bd913792cfd3a5dc1cd970de9
Author: Steve Rowe <sa...@apache.org>
AuthorDate: Wed Oct 4 14:47:11 2017 -0400

    SOLR-10842: Convert all remaining {{quickstart.html}} links to {{guide/solr-tutorial.html}}; remove all references to quickstart from the build; and version the link to the ref guide's tutorial in Solr's versioned top-level documentation page.
---
 dev-tools/scripts/checkJavadocLinks.py |   2 +-
 solr/CHANGES.txt                       |   2 +-
 solr/README.txt                        |   4 +-
 solr/build.xml                         |  29 +-
 solr/contrib/ltr/README.md             |   2 +-
 solr/example/README.txt                |   4 +-
 solr/site/index.xsl                    |   3 +-
 solr/site/quickstart.mdtext            | 612 ---------------------------------
 8 files changed, 21 insertions(+), 637 deletions(-)

diff --git a/dev-tools/scripts/checkJavadocLinks.py b/dev-tools/scripts/checkJavadocLinks.py
index 2e3cdea..8ae0f4c 100644
--- a/dev-tools/scripts/checkJavadocLinks.py
+++ b/dev-tools/scripts/checkJavadocLinks.py
@@ -210,7 +210,7 @@ def checkAll(dirName):
         elif link.find('lucene.apache.org/solr/mirrors-solr-latest-redir.html') != -1:
           # OK
           pass
-        elif link.find('lucene.apache.org/solr/quickstart.html') != -1:
+        elif link.find('lucene.apache.org/solr/guide/') != -1:
           # OK
           pass
         elif link.find('lucene.apache.org/solr/downloads.html') != -1:
diff --git a/solr/CHANGES.txt b/solr/CHANGES.txt
index e78883b..8472098 100644
--- a/solr/CHANGES.txt
+++ b/solr/CHANGES.txt
@@ -14,7 +14,7 @@ Getting Started
 You need a Java 1.8 VM or later installed.
 In this release, there is an example Solr server including a bundled 
 servlet container in the directory named "example".
-See the Quick Start guide at http://lucene.apache.org/solr/quickstart.html
+See the Solr tutorial at https://lucene.apache.org/solr/guide/solr-tutorial.html
 
 ==================  7.0.1 ==================
 
diff --git a/solr/README.txt b/solr/README.txt
index 6af0cc6..0de8b57 100644
--- a/solr/README.txt
+++ b/solr/README.txt
@@ -87,8 +87,8 @@ For more information about Solr examples please read...
 
  * example/README.txt
    For more information about the "Solr Home" and Solr specific configuration
- * http://lucene.apache.org/solr/quickstart.html
-   For a Quick Start guide
+ * https://lucene.apache.org/solr/guide/solr-tutorial.html
+   For a Solr tutorial
  * http://lucene.apache.org/solr/resources.html
    For a list of other tutorials and introductory articles.
 
diff --git a/solr/build.xml b/solr/build.xml
index 5048f1d..064084d 100644
--- a/solr/build.xml
+++ b/solr/build.xml
@@ -191,22 +191,6 @@
     depends="javadocs,changes-to-html,process-webpages"/>
   <target name="compile-core" depends="compile-solr-core" unless="solr.core.compiled"/>
 
-  <target name="generate-website-quickstart"
-          description="Generate a version of the quickstart tutorial suitable for the website, at build/website/quickstart.mdtext">
-    <copy file="${common-solr.dir}/site/quickstart.mdtext" tofile="${common-solr.dir}/build/website/quickstart.mdtext"
-          overwrite="false" encoding="UTF-8">
-      <filterchain>
-        <tokenfilter>
-          <filetokenizer/>
-          <!-- Website images are under /solr/assets/images/ -->
-          <replaceregex pattern="src\s*=\s*&quot;images/" replace="src=&quot;/solr/assets/images/" flags="gs"/>
-          <!-- Redirect to the website's version-specific system requirements page -->
-          <replaceregex pattern="\(SYSTEM_REQUIREMENTS.html\)" replace="(/solr/api/SYSTEM_REQUIREMENTS.html)" flags="gs"/>
-        </tokenfilter>
-      </filterchain>
-    </copy>
-  </target>
-  
   <target name="documentation-online" description="Generate a link to the online documentation"
       depends="define-solr-javadoc-url">
     <xslt in="${ant.file}" out="${javadoc-online.dir}/index.html" style="site/online-link.xsl" force="true">
@@ -226,6 +210,16 @@
     <makeurl property="process-webpages.buildfiles" separator="|">
       <fileset dir="." includes="core/build.xml,test-framework/build.xml,solrj/build.xml,contrib/**/build.xml"/>
     </makeurl>
+
+    <loadresource property="doc-solr-guide-version-path">
+      <propertyresource name="version"/>
+      <filterchain>
+        <tokenfilter>
+          <filetokenizer/>
+          <replaceregex pattern="^(\d+)\.(\d+).*" replace="\1_\2"/>
+        </tokenfilter>
+      </filterchain>
+    </loadresource>
     <!--
       The XSL input file is ignored completely, but XSL expects one to be given,
       so we pass ourself (${ant.file}) here. The list of module build.xmls is given
@@ -239,6 +233,7 @@
       <param name="buildfiles" expression="${process-webpages.buildfiles}"/>
       <param name="version" expression="${version}"/>
       <param name="luceneJavadocUrl" expression="${lucene.javadoc.url}"/>
+      <param name="solrGuideVersion" expression="${doc-solr-guide-version-path}"/>
     </xslt>
     
     <markdown todir="${javadoc.dir}">
@@ -676,7 +671,7 @@
        <!-- NOTE: must currently exclude deprecated-list due to a javadocs bug (as of 1.7.0_09)
             javadocs generates invalid XML if you deprecate a method that takes a parameter
             with a generic type -->
-      <fileset dir="build/docs" includes="**/*.html" excludes="**/deprecated-list.html,quickstart.html"/>
+      <fileset dir="build/docs" includes="**/*.html" excludes="**/deprecated-list.html"/>
     </jtidy-macro>
     <echo message="Checking for broken links..."/>
     <check-broken-links dir="${javadoc.dir}"/>
diff --git a/solr/contrib/ltr/README.md b/solr/contrib/ltr/README.md
index 6b56cdf..6324ecf 100644
--- a/solr/contrib/ltr/README.md
+++ b/solr/contrib/ltr/README.md
@@ -14,7 +14,7 @@ For information on how to get started with solr ltr please see:
 
 For information on how to get started with solr please see:
  * [solr/README.txt](../../README.txt)
- * [Solr Quick Start](http://lucene.apache.org/solr/quickstart.html)
+ * [Solr Tutorial](https://lucene.apache.org/solr/guide/solr-tutorial.html)
 
 # How To Contribute
 
diff --git a/solr/example/README.txt b/solr/example/README.txt
index 4c8cca1..562c256 100644
--- a/solr/example/README.txt
+++ b/solr/example/README.txt
@@ -48,8 +48,8 @@ For more information about this example please read...
 
  * example/solr/README.txt
    For more information about the "Solr Home" and Solr specific configuration
- * http://lucene.apache.org/solr/quickstart.html
-   For a Tutorial using this example configuration
+ * https://lucene.apache.org/solr/guide/solr-tutorial.html
+   For a Solr tutorial
  * http://wiki.apache.org/solr/SolrResources 
    For a list of other tutorials and introductory articles.
 
diff --git a/solr/site/index.xsl b/solr/site/index.xsl
index b75fb9c..20eeea7 100644
--- a/solr/site/index.xsl
+++ b/solr/site/index.xsl
@@ -23,6 +23,7 @@
   <xsl:param name="buildfiles"/>
   <xsl:param name="version"/>
   <xsl:param name="luceneJavadocUrl"/>
+  <xsl:param name="solrGuideVersion"/>
   
   <!--
     NOTE: This template matches the root element of any given input XML document!
@@ -74,7 +75,7 @@
             <li><a href="http://wiki.apache.org/solr">Wiki</a>: Additional documentation, especially focused on using Solr.</li>
             <li><a href="changes/Changes.html">Changes</a>: List of changes in this release.</li>
             <li><a href="SYSTEM_REQUIREMENTS.html">System Requirements</a>: Minimum and supported Java versions.</li>
-            <li><a href="quickstart.html">Solr Quick Start</a>: This document covers the basics of running Solr using an example schema, and some sample data.</li>
+            <li><a href="https://lucene.apache.org/solr/guide/{$solrGuideVersion}/solr-tutorial.html">Solr Tutorial</a>: This document covers the basics of running Solr using an example schema, and some sample data.</li>
             <li><a href="{$luceneJavadocUrl}index.html">Lucene Documentation</a></li>
           </ul>
         <h2>API Javadocs</h2>
diff --git a/solr/site/quickstart.mdtext b/solr/site/quickstart.mdtext
deleted file mode 100644
index fcd9be0..0000000
--- a/solr/site/quickstart.mdtext
+++ /dev/null
@@ -1,612 +0,0 @@
-# Solr Quick Start
-
-## Overview
-
-This document covers getting Solr up and running, ingesting a variety of data sources into multiple collections,
-and getting a feel for the Solr administrative and search interfaces.
-
-## Requirements
-
-To follow along with this tutorial, you will need...
-
-1. To meet the [system requirements](SYSTEM_REQUIREMENTS.html)
-2. An Apache Solr release ([download](http://lucene.apache.org/solr/downloads.html)). This tutorial was written using Apache Solr 6.2.0.
-
-## Getting Started
-
-Please run the browser showing this tutorial and the Solr server on the same machine so tutorial links will correctly
-point to your Solr server.
-
-Begin by unzipping the Solr release and changing your working directory to the subdirectory where Solr was installed.
-Note that the base directory name may vary with the version of Solr downloaded.  For example, with a shell in UNIX,
-Cygwin, or MacOS:
-
-    /:$ ls solr*
-    solr-X.Y.Z.zip
-    /:$ unzip -q solr-X.Y.Z.zip
-    /:$ cd solr-X.Y.Z/
-
-Note that "X.Y.Z" will be replaced by an official Solr version (i.e. 6.4.3, 7.0.0, etc.)
-
-To launch Solr, run: `bin/solr start -e cloud -noprompt`
-
-    /solr-X.Y.Z:$ bin/solr start -e cloud -noprompt
-
-    Welcome to the SolrCloud example!
-
-    Starting up 2 Solr nodes for your example SolrCloud cluster.
-    ...
-
-    Started Solr server on port 8983 (pid=8404). Happy searching!
-    ...
-
-    Started Solr server on port 7574 (pid=8549). Happy searching!
-    ...
-
-    SolrCloud example running, please visit http://localhost:8983/solr
-
-    /solr-X.Y.Z:$ _
-
-You can see that the Solr is running by loading the Solr Admin UI in your web browser: <http://localhost:8983/solr/>.
-This is the main starting point for administering Solr.
-
-Solr will now be running two "nodes", one on port 7574 and one on port 8983.  There is one collection created
-automatically, `gettingstarted`, a two shard collection, each with two replicas.
-The [Cloud tab](http://localhost:8983/solr/#/~cloud) in the Admin UI diagrams the collection nicely:
-
-<img alt="Solr Quick Start: SolrCloud diagram" style="width:800px" src="images/quickstart-solrcloud.png" />
-
-## Indexing Data
-
-Your Solr server is up and running, but it doesn't contain any data.  The Solr install includes the `bin/post` tool in
-order to facilitate getting various types of documents easily into Solr from the start.  We'll be
-using this tool for the indexing examples below.
-
-You'll need a command shell to run these examples, rooted in the Solr install directory; the shell from where you
-launched Solr works just fine.
-
-* NOTE: Currently the `bin/post` tool does not have a comparable Windows script, but the underlying Java program invoked
-is available.  See the
-[Post Tool, Windows section](https://cwiki.apache.org/confluence/display/solr/Post+Tool#PostTool-Windows)
-for details.
-
-### Indexing a directory of "rich" files
-
-Let's first index local "rich" files including HTML, PDF, Microsoft Office formats (such as MS Word), plain text and
-many other formats.  `bin/post` features the ability to crawl a directory of files, optionally recursively even,
-sending the raw content of each file into Solr for extraction and indexing.   A Solr install includes a `docs/`
-subdirectory, so that makes a convenient set of (mostly) HTML files built-in to start with.
-
-    bin/post -c gettingstarted docs/
-
-Here's what it'll look like:
-
-    /solr-X.Y.Z:$ bin/post -c gettingstarted docs/
-    java -classpath /solr-X.Y.Z/dist/solr-core-X.Y.Z.jar -Dauto=yes -Dc=gettingstarted -Ddata=files -Drecursive=yes org.apache.solr.util.SimplePostTool docs/
-    SimplePostTool version 5.0.0
-    Posting files to [base] url http://localhost:8983/solr/gettingstarted/update...
-    Entering auto mode. File endings considered are xml,json,jsonl,csv,pdf,doc,docx,ppt,pptx,xls,xlsx,odt,odp,ods,ott,otp,ots,rtf,htm,html,txt,log
-    Entering recursive mode, max depth=999, delay=0s
-    Indexing directory docs (3 files, depth=0)
-    POSTing file index.html (text/html) to [base]/extract
-    POSTing file quickstart.html (text/html) to [base]/extract
-    POSTing file SYSTEM_REQUIREMENTS.html (text/html) to [base]/extract
-    Indexing directory docs/changes (1 files, depth=1)
-    POSTing file Changes.html (text/html) to [base]/extract
-    ...
-    4329 files indexed.
-    COMMITting Solr index changes to http://localhost:8983/solr/gettingstarted/update...
-    Time spent: 0:01:16.252
-
-The command-line breaks down as follows:
-
-   * `-c gettingstarted`: name of the collection to index into
-   * `docs/`: a relative path of the Solr install `docs/` directory
-
-You have now indexed thousands of documents into the `gettingstarted` collection in Solr and committed these changes.
-You can search for "solr" by loading the Admin UI [Query tab](http://localhost:8983/solr/#/gettingstarted/query),
-enter "solr" in the `q` param (replacing `*:*`, which matches all documents), and "Execute Query".
-See the [Searching](#searching) section below for more information.
-
-To index your own data, re-run the directory indexing command pointed to your own directory of documents.  For example,
-on a Mac instead of `docs/` try `~/Documents/` or `~/Desktop/`!   You may want to start from a clean, empty system
-again rather than have your content in addition to the Solr `docs/` directory; see the Cleanup section [below](#cleanup)
-for how to get back to a clean starting point.
-
-### Indexing Solr XML
-
-Solr supports indexing structured content in a variety of incoming formats.  The historically predominant format for
-getting structured content into Solr has been
-[Solr XML](https://cwiki.apache.org/confluence/display/solr/Uploading+Data+with+Index+Handlers#UploadingDatawithIndexHandlers-XMLFormattedIndexUpdates).
-Many Solr indexers have been coded to process domain content into Solr XML output, generally HTTP POSTed directly to
-Solr's `/update` endpoint.
-
-<a name="techproducts"></a>
-Solr's install includes a handful of Solr XML formatted files with example data (mostly mocked tech product data).
-NOTE: This tech product data has a more domain-specific configuration, including schema and browse UI.  The `bin/solr`
-script includes built-in support for this by running `bin/solr start -e techproducts` which not only starts Solr but
-also then indexes this data too (be sure to `bin/solr stop -all` before trying it out).
-However, the example below assumes Solr was started with `bin/solr start -e cloud` to stay consistent with all examples
-on this page, and thus the collection used is "gettingstarted", not "techproducts".
-
-Using `bin/post`, index the example Solr XML files in `example/exampledocs/`:
-
-    bin/post -c gettingstarted example/exampledocs/*.xml
-
-Here's what you'll see:
-
-    /solr-X.Y.Z:$ bin/post -c gettingstarted example/exampledocs/*.xml
-    java -classpath /solr-X.Y.Z/dist/solr-core-X.Y.Z.jar -Dauto=yes -Dc=gettingstarted -Ddata=files org.apache.solr.util.SimplePostTool example/exampledocs/gb18030-example.xml ...
-    SimplePostTool version 5.0.0
-    Posting files to [base] url http://localhost:8983/solr/gettingstarted/update...
-    Entering auto mode. File endings considered are xml,json,jsonl,csv,pdf,doc,docx,ppt,pptx,xls,xlsx,odt,odp,ods,ott,otp,ots,rtf,htm,html,txt,log
-    POSTing file gb18030-example.xml (application/xml) to [base]
-    POSTing file hd.xml (application/xml) to [base]
-    POSTing file ipod_other.xml (application/xml) to [base]
-    POSTing file ipod_video.xml (application/xml) to [base]
-    POSTing file manufacturers.xml (application/xml) to [base]
-    POSTing file mem.xml (application/xml) to [base]
-    POSTing file money.xml (application/xml) to [base]
-    POSTing file monitor.xml (application/xml) to [base]
-    POSTing file monitor2.xml (application/xml) to [base]
-    POSTing file mp500.xml (application/xml) to [base]
-    POSTing file sd500.xml (application/xml) to [base]
-    POSTing file solr.xml (application/xml) to [base]
-    POSTing file utf8-example.xml (application/xml) to [base]
-    POSTing file vidcard.xml (application/xml) to [base]
-    14 files indexed.
-    COMMITting Solr index changes to http://localhost:8983/solr/gettingstarted/update...
-    Time spent: 0:00:02.077
-
-...and now you can search for all sorts of things using the default
-[Solr Query Syntax](https://cwiki.apache.org/confluence/display/solr/The+Standard+Query+Parser#TheStandardQueryParser-SpecifyingTermsfortheStandardQueryParser)
-(a superset of the Lucene query syntax)...
-
-NOTE:
-You can browse the documents indexed at <http://localhost:8983/solr/gettingstarted/browse>.  The `/browse` UI allows
-getting a feel for how Solr's technical capabilities can be worked with in a familiar, though a bit rough and
-prototypical, interactive HTML view.  (The `/browse` view defaults to assuming the `gettingstarted` schema and data
-are a catch-all mix of structured XML, JSON, CSV example data, and unstructured rich documents.  Your own data may not
-look ideal at first, though the `/browse` templates are customizable.)
-
-### Indexing JSON
-
-Solr supports indexing JSON, either arbitrary structured JSON or "Solr JSON" (which is similar to Solr XML).
-
-Solr includes a small sample Solr JSON file to illustrate this capability.  Again using `bin/post`, index the
-sample JSON file:
-
-    bin/post -c gettingstarted example/exampledocs/books.json
-
-You'll see:
-
-    /solr-X.Y.Z:$ bin/post -c gettingstarted example/exampledocs/books.json
-    java -classpath /solr-X.Y.Z/dist/solr-core-X.Y.Z.jar -Dauto=yes -Dc=gettingstarted -Ddata=files org.apache.solr.util.SimplePostTool example/exampledocs/books.json
-    SimplePostTool version 5.0.0
-    Posting files to [base] url http://localhost:8983/solr/gettingstarted/update...
-    Entering auto mode. File endings considered are xml,json,jsonl,csv,pdf,doc,docx,ppt,pptx,xls,xlsx,odt,odp,ods,ott,otp,ots,rtf,htm,html,txt,log
-    POSTing file books.json (application/json) to [base]/json/docs
-    1 files indexed.
-    COMMITting Solr index changes to http://localhost:8983/solr/gettingstarted/update...
-    Time spent: 0:00:00.493
-
-For more information on indexing Solr JSON, see the Solr Reference Guide section
-[Solr-Style JSON](https://cwiki.apache.org/confluence/display/solr/Uploading+Data+with+Index+Handlers#UploadingDatawithIndexHandlers-Solr-StyleJSON)
-
-To flatten (and/or split) and index arbitrary structured JSON, a topic beyond this quick start guide, check out
-[Transforming and Indexing Custom JSON data](https://cwiki.apache.org/confluence/display/solr/Uploading+Data+with+Index+Handlers#UploadingDatawithIndexHandlers-TransformingandIndexingCustomJSON).
-
-### Indexing CSV (Comma/Column Separated Values)
-
-A great conduit of data into Solr is via CSV, especially when the documents are homogeneous by all having the
-same set of fields.  CSV can be conveniently exported from a spreadsheet such as Excel, or exported from databases such
-as MySQL.  When getting started with Solr, it can often be easiest to get your structured data into CSV format and then
-index that into Solr rather than a more sophisticated single step operation.
-
-Using `bin/post` index the included example CSV file:
-
-    bin/post -c gettingstarted example/exampledocs/books.csv
-
-In your terminal you'll see:
-
-    /solr-X.Y.Z:$ bin/post -c gettingstarted example/exampledocs/books.csv
-    java -classpath /solr-X.Y.Z/dist/solr-core-X.Y.Z.jar -Dauto=yes -Dc=gettingstarted -Ddata=files org.apache.solr.util.SimplePostTool example/exampledocs/books.csv
-    SimplePostTool version 5.0.0
-    Posting files to [base] url http://localhost:8983/solr/gettingstarted/update...
-    Entering auto mode. File endings considered are xml,json,jsonl,csv,pdf,doc,docx,ppt,pptx,xls,xlsx,odt,odp,ods,ott,otp,ots,rtf,htm,html,txt,log
-    POSTing file books.csv (text/csv) to [base]
-    1 files indexed.
-    COMMITting Solr index changes to http://localhost:8983/solr/gettingstarted/update...
-    Time spent: 0:00:00.109
-
-For more information, see the Solr Reference Guide section
-[CSV Formatted Index Updates](https://cwiki.apache.org/confluence/display/solr/Uploading+Data+with+Index+Handlers#UploadingDatawithIndexHandlers-CSVFormattedIndexUpdates)
-
-### Other indexing techniques
-
-* Import records from a database using the
-[Data Import Handler (DIH)](https://cwiki.apache.org/confluence/display/solr/Uploading+Structured+Data+Store+Data+with+the+Data+Import+Handler).
-
-* Use [SolrJ](https://cwiki.apache.org/confluence/display/solr/Using+SolrJ) from JVM-based languages or
-other [Solr clients](https://cwiki.apache.org/confluence/display/solr/Client+APIs) to programmatically create documents
-to send to Solr.
-
-* Use the Admin UI [Documents tab](http://localhost:8983/solr/#/gettingstarted/documents) to paste in a document to be
-indexed, or select `Document Builder` from the `Document Type` dropdown to build a document one field at a time.
-Click on the `Submit Document` button below the form to index your document.
-
-***
-
-## Updating Data
-
-You may notice that even if you index content in this guide more than once, it does not duplicate the results found.
-This is because the example `schema.xml` specifies a "`uniqueKey`" field called "`id`". Whenever you POST commands to
-Solr to add a document with the same value for the `uniqueKey` as an existing document, it automatically replaces it
-for you. You can see that that has happened by looking at the values for `numDocs` and `maxDoc` in the core-specific
-Overview section of the Solr Admin UI.
-
-`numDocs` represents the number of searchable documents in the index (and will be larger than the number of XML, JSON,
-or CSV files since some files contained more than one document).  The maxDoc value may be larger as the maxDoc count
-includes logically deleted documents that have not yet been physically removed from the index. You can re-post the
-sample files over and over again as much as you want and `numDocs` will never increase, because the new documents will
-constantly be replacing the old.
-
-Go ahead and edit any of the existing example data files, change some of the data, and re-run the SimplePostTool
-command.  You'll see your changes reflected in subsequent searches.
-
-## Deleting Data
-
-You can delete data by POSTing a delete command to the update URL and specifying the value of the document's unique key
-field, or a query that matches multiple documents (be careful with that one!). Since these commands are smaller, we
-specify them right on the command line rather than reference a JSON or XML file.
-
-Execute the following command to delete a specific document:
-
-    bin/post -c gettingstarted -d "<delete><id>SP2514N</id></delete>"
-
-
-## Searching
-
-Solr can be queried via REST clients, cURL, wget, Chrome POSTMAN, etc., as well as via the native clients available for
-many programming languages.
-
-The Solr Admin UI includes a query builder interface - see the `gettingstarted` query tab at
-<http://localhost:8983/solr/#/gettingstarted/query>.  If you click the `Execute Query` button without changing anything
-in the form, you'll get 10 documents in JSON format (`*:*` in the `q` param matches all documents):
-
-<img style="border:1px solid #ccc; width:800px" src="images/quickstart-query-screen.png" alt="Solr Quick Start: gettingstarted Query tab"/>
-
-The URL sent by the Admin UI to Solr is shown in light grey near the top right of the above screenshot - if you click on
-it, your browser will show you the raw response.  To use cURL, give the same URL in quotes on the `curl` command line:
-
-    curl "http://localhost:8983/solr/gettingstarted/select?q=*:*"
-
-
-### Basics
-
-#### Search for a single term
-
-To search for a term, give it as the `q` param value in the core-specific Solr Admin UI Query section, replace `*:*`
-with the term you want to find.  To search for "foundation":
-
-    curl "http://localhost:8983/solr/gettingstarted/select?q=foundation"
-
-You'll see:
-
-    $ curl "http://localhost:8983/solr/gettingstarted/select?q=foundation"
-    {
-      "responseHeader":{
-        "zkConnected":true,
-        "status":0,
-        "QTime":527,
-        "params":{
-          "q":"foundation",
-          "indent":"true",
-          "wt":"json"}},
-      "response":{"numFound":4156,"start":0,"maxScore":0.10203234,"docs":[
-          {
-            "id":"0553293354",
-            "cat":["book"],
-            "name":["Foundation"],
-    ...
-
-The response indicates that there are 4,156 hits (`"numFound":4156`), of which the first 10 were returned, since by
-default `start=0` and `rows=10`.  You can specify these params to page through results, where `start` is the
-(zero-based) position of the first result to return, and `rows` is the page size.
-
-To restrict fields returned in the response, use the `fl` param, which takes a comma-separated list of field names.
-E.g. to only return the `id` field:
-
-    curl "http://localhost:8983/solr/gettingstarted/select?q=foundation&fl=id"
-
-`q=foundation` matches nearly all of the docs we've indexed, since most of the files under `docs/` contain
-"The Apache Software Foundation".  To restrict search to a particular field, use the syntax "`q=field:value`",
-e.g. to search for `Foundation` only in the `name` field:
-
-    curl "http://localhost:8983/solr/gettingstarted/select?q=name:Foundation"
-
-The above request returns only one document (`"numFound":1`) - from the response:
-
-    ...
-      "response":{"numFound":1,"start":0,"maxScore":2.5902672,"docs":[
-          {
-            "id":"0553293354",
-            "cat":["book"],
-            "name":["Foundation"],
-    ...
-
-#### Phrase search
-
-To search for a multi-term phrase, enclose it in double quotes: `q="multiple terms here"`.  E.g. to search for
-"CAS latency" - note that the space between terms must be converted to "`+`" in a URL (the Admin UI will handle URL
-encoding for you automatically):
-
-    curl "http://localhost:8983/solr/gettingstarted/select?indent=true&q=\"CAS+latency\""
-
-You'll get back:
-
-    {
-      "responseHeader":{
-        "zkConnected":true,
-        "status":0,
-        "QTime":391,
-        "params":{
-          "q":"\"CAS latency\"",
-          "indent":"true",
-          "wt":"json"}},
-      "response":{"numFound":3,"start":0,"maxScore":22.027056,"docs":[
-          {
-            "id":"TWINX2048-3200PRO",
-            "name":["CORSAIR  XMS 2GB (2 x 1GB) 184-Pin DDR SDRAM Unbuffered DDR 400 (PC 3200) Dual Channel Kit System Memory - Retail"],
-            "manu":["Corsair Microsystems Inc."],
-            "manu_id_s":"corsair",
-            "cat":["electronics", "memory"],
-            "features":["CAS latency 2,  2-3-3-6 timing, 2.75v, unbuffered, heat-spreader"],
-    ...
-
-#### Combining searches
-
-By default, when you search for multiple terms and/or phrases in a single query, Solr will only require that one of them
-is present in order for a document to match.  Documents containing more terms will be sorted higher in the results list.
-
-You can require that a term or phrase is present by prefixing it with a "`+`"; conversely, to disallow the presence of a
-term or phrase, prefix it with a "`-`".
-
-To find documents that contain both terms "`one`" and "`three`", enter `+one +three` in the `q` param in the
-Admin UI Query tab.  Because the "`+`" character has a reserved purpose in URLs (encoding the space character),
-you must URL encode it for `curl` as "`%2B`":
-
-    curl "http://localhost:8983/solr/gettingstarted/select?q=%2Bone+%2Bthree"
-
-To search for documents that contain the term "`two`" but **don't** contain the term "`one`", enter `+two -one` in the
-`q` param in the Admin UI.  Again, URL encode "`+`" as "`%2B`":
-
-    curl "http://localhost:8983/solr/gettingstarted/select?q=%2Btwo+-one"
-
-#### In depth
-
-For more Solr search options, see the Solr Reference Guide's
-[Searching](https://cwiki.apache.org/confluence/display/solr/Searching) section.
-
-
-### Faceting
-
-One of Solr's most popular features is faceting.  Faceting allows the search results to be arranged into subsets (or
-buckets or categories), providing a count for each subset.  There are several types of faceting: field values, numeric
-and date ranges, pivots (decision tree), and arbitrary query faceting.
-
-#### Field facets
-
-In addition to providing search results, a Solr query can return the number of documents that contain each unique value
-in the whole result set.
-
-From the core-specific Admin UI Query tab, if you check the "`facet`" checkbox, you'll see a few facet-related options
-appear:
-
-<img style="border:1px solid #ccc" src="images/quickstart-admin-ui-facet-options.png" alt="Solr Quick Start: Query tab facet options"/>
-
-To see facet counts from all documents (`q=*:*`): turn on faceting (`facet=true`), and specify the field to facet on via
-the `facet.field` param.  If you only want facets, and no document contents, specify `rows=0`.  The `curl` command below
-will return facet counts for the `manu_id_s` field:
-
-    curl 'http://localhost:8983/solr/gettingstarted/select?q=*:*&rows=0'\
-    '&facet=true&facet.field=manu_id_s'
-
-In your terminal, you'll see:
-
-    {
-      "responseHeader":{
-        "zkConnected":true,
-        "status":0,
-        "QTime":201,
-        "params":{
-          "q":"*:*",
-          "facet.field":"manu_id_s",
-          "indent":"true",
-          "rows":"0",
-          "wt":"json",
-          "facet":"true"}},
-      "response":{"numFound":4374,"start":0,"maxScore":1.0,"docs":[]
-      },
-      "facet_counts":{
-        "facet_queries":{},
-        "facet_fields":{
-          "manu_id_s":[
-            "corsair",3,
-            "belkin",2,
-            "canon",2,
-            "apple",1,
-            "asus",1,
-            "ati",1,
-            "boa",1,
-            "dell",1,
-            "eu",1,
-            "maxtor",1,
-            "nor",1,
-            "uk",1,
-            "viewsonic",1,
-            "samsung",0]},
-        "facet_ranges":{},
-        "facet_intervals":{},
-        "facet_heatmaps":{}}}
-
-#### Range facets
-
-For numerics or dates, it's often desirable to partition the facet counts into ranges rather than discrete values.
-A prime example of numeric range faceting, using the example product data, is `price`.  In the `/browse` UI, it looks
-like this:
-
-<img style="border:1px solid #ccc" src="images/quickstart-range-facet.png" alt="Solr Quick Start: Range facets"/>
-
-The data for these price range facets can be seen in JSON format with this command:
-
-    curl 'http://localhost:8983/solr/gettingstarted/select?q=*:*&rows=0'\
-    '&facet=true'\
-    '&facet.range=price'\
-    '&f.price.facet.range.start=0'\
-    '&f.price.facet.range.end=600'\
-    '&f.price.facet.range.gap=50'\
-    '&facet.range.other=after'
-
-In your terminal you will see:
-
-    {
-      "responseHeader":{
-        "zkConnected":true,
-        "status":0,
-        "QTime":248,
-        "params":{
-          "facet.range":"price",
-          "q":"*:*",
-          "f.price.facet.range.start":"0",
-          "facet.range.other":"after",
-          "indent":"on",
-          "f.price.facet.range.gap":"50",
-          "rows":"0",
-          "wt":"json",
-          "facet":"true",
-          "f.price.facet.range.end":"600"}},
-      "response":{"numFound":4374,"start":0,"maxScore":1.0,"docs":[]
-      },
-      "facet_counts":{
-        "facet_queries":{},
-        "facet_fields":{},
-        "facet_ranges":{
-          "price":{
-            "counts":[
-              "0.0",19,
-              "50.0",1,
-              "100.0",0,
-              "150.0",2,
-              "200.0",0,
-              "250.0",1,
-              "300.0",1,
-              "350.0",2,
-              "400.0",0,
-              "450.0",1,
-              "500.0",0,
-              "550.0",0],
-            "gap":50.0,
-            "after":2,
-            "start":0.0,
-            "end":600.0}},
-        "facet_intervals":{},
-        "facet_heatmaps":{}}}
-
-#### Pivot facets
-
-Another faceting type is pivot facets, also known as "decision trees", allowing two or more fields to be nested for all
-the various possible combinations.  Using the example technical product data, pivot facets can be used to see how many
-of the products in the "book" category (the `cat` field) are in stock or not in stock.  Here's how to get at the raw
-data for this scenario:
-
-    curl 'http://localhost:8983/solr/gettingstarted/select?q=*:*&rows=0&facet=on&facet.pivot=cat,inStock'
-
-This results in the following response (trimmed to just the book category output), which says out of 14 items in the
-"book" category, 12 are in stock and 2 are not in stock:
-
-    ...
-    "facet_pivot":{
-      "cat,inStock":[{
-          "field":"cat",
-          "value":"book",
-          "count":14,
-          "pivot":[{
-              "field":"inStock",
-              "value":true,
-              "count":12},
-            {
-              "field":"inStock",
-              "value":false,
-              "count":2}]},
-    ...
-
-#### More faceting options
-
-For the full scoop on Solr faceting, visit the Solr Reference Guide's
-[Faceting](https://cwiki.apache.org/confluence/display/solr/Faceting) section.
-
-
-### Spatial
-
-Solr has sophisticated geospatial support, including searching within a specified distance range of a given location
-(or within a bounding box), sorting by distance, or even boosting results by the distance.  Some of the example tech
-products documents in `example/exampledocs/*.xml` have locations associated with them to illustrate the spatial
-capabilities. To run the tech products example, see the [techproducts example section](#techproducts). Spatial queries
-can be combined with any other types of queries, such as in this example of querying for "ipod" within 10 kilometers
-from San Francisco:
-
-<img style="border:1px solid #ccc; width:800px" src="images/quickstart-spatial.png" alt="Solr Quick Start: spatial search"/>
-
-The URL to this example is
-<http://localhost:8983/solr/techproducts/browse?q=ipod&pt=37.7752%2C-122.4232&d=10&sfield=store&fq=%7B%21bbox%7D&queryOpts=spatial>,
-leveraging the `/browse` UI to show a map for each item and allow easy selection of the location to search near.
-
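The same kind of query can also be sent straight to the select handler rather than /browse. A minimal sketch (assuming
the techproducts example is running with its default `store` location field; `{!bbox}` could be used in place of
`{!geofilt}` to match within a bounding box rather than a radius):

    curl 'http://localhost:8983/solr/techproducts/select?q=ipod&fq=%7B%21geofilt%7D&sfield=store&pt=37.7752%2C-122.4232&d=10'
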
-To learn more about Solr's spatial capabilities, see the Solr Reference Guide's
-[Spatial Search](https://cwiki.apache.org/confluence/display/solr/Spatial+Search) section.
-
-## Wrapping up
-
-If you've run the full set of commands in this quick start guide, you have done the following:
-
-* Launched Solr in SolrCloud mode, with two nodes and two collections including shards and replicas
-* Indexed a directory of rich text files
-* Indexed Solr XML files
-* Indexed Solr JSON files
-* Indexed CSV content
-* Opened the admin console and used its query interface to get JSON-formatted results
-* Opened the /browse interface to explore Solr's features in a friendlier, more familiar way
-
-Nice work!  The script below, which runs all of these steps, took under two minutes to complete! (Your run time may
-vary, depending on your computer's power and available resources.)
-
-Here's a Unix script for convenient copying and pasting in order to run the key commands for this quick start guide:
-
-    date
-    bin/solr start -e cloud -noprompt
-      open http://localhost:8983/solr
-      bin/post -c gettingstarted docs/
-      open http://localhost:8983/solr/gettingstarted/browse
-      bin/post -c gettingstarted example/exampledocs/*.xml
-      bin/post -c gettingstarted example/exampledocs/books.json
-      bin/post -c gettingstarted example/exampledocs/books.csv
-      bin/post -c gettingstarted -d "<delete><id>SP2514N</id></delete>"
-      bin/solr healthcheck -c gettingstarted
-    date
-
-## Cleanup
-
-As you work through this guide, you may want to stop Solr and reset the environment back to the starting point.
-The following command line will stop Solr and remove the directories for each of the two nodes that the start script
-created:
-
-    bin/solr stop -all ; rm -Rf example/cloud/
-
-## Where to next?
-
-For more information on Solr, check out the following resources:
-
-  * [Solr Reference Guide](https://cwiki.apache.org/confluence/display/solr/Apache+Solr+Reference+Guide) (ensure you
-    match the version of the reference guide with your version of Solr)
-  * See also additional [Resources](http://lucene.apache.org/solr/resources.html)
-


[lucene] 07/08: LUCENE-6144: Upgrade Ivy to 2.4.0; 'ant ivy-bootstrap' now removes old Ivy jars in ~/.ant/lib/.

Posted by dw...@apache.org.
This is an automated email from the ASF dual-hosted git repository.

dweiss pushed a commit to branch branch_7_0
in repository https://gitbox.apache.org/repos/asf/lucene.git

commit 1f41349718de41bc5817d2262fb657a0436a7bef
Author: Steve Rowe <sa...@apache.org>
AuthorDate: Thu Nov 2 17:00:43 2017 -0400

    LUCENE-6144: Upgrade Ivy to 2.4.0; 'ant ivy-bootstrap' now removes old Ivy jars in ~/.ant/lib/.
---
 lucene/CHANGES.txt                 | 18 +++++++++++++-
 lucene/common-build.xml            | 48 ++++++++++++++++++++++++++++++++------
 lucene/ivy-versions.properties     |  2 +-
 lucene/licenses/ivy-2.3.0.jar.sha1 |  1 -
 lucene/licenses/ivy-2.4.0.jar.sha1 |  1 +
 5 files changed, 60 insertions(+), 10 deletions(-)

diff --git a/lucene/CHANGES.txt b/lucene/CHANGES.txt
index fa4ef80..dd86a97 100644
--- a/lucene/CHANGES.txt
+++ b/lucene/CHANGES.txt
@@ -4,7 +4,11 @@ For more information on past and future Lucene versions, please see:
 http://s.apache.org/luceneversions
 
 ======================= Lucene 7.0.2 =======================
-(No Changes)
+
+Build
+
+* LUCENE-6144: Upgrade Ivy to 2.4.0; 'ant ivy-bootstrap' now removes old Ivy
+  jars in ~/.ant/lib/.  (Shawn Heisey, Steve Rowe)
 
 ======================= Lucene 7.0.1 =======================
 
@@ -232,6 +236,18 @@ Other
   that are trivially replaced by LeafReader.terms() and MultiFields.getTerms()
   (David Smiley)
 
+======================= Lucene 6.6.2 =======================
+
+Changes in Runtime Behavior
+
+* Resolving of external entities in queryparser/xml/CoreParser is disallowed
+  by default. See SOLR-11477 for details.
+
+Bug Fixes
+
+* SOLR-11477: Disallow resolving of external entities in queryparser/xml/CoreParser
+  by default. (Michael Stepankin, Olga Barinova, Uwe Schindler, Christine Poerschke)
+
 ======================= Lucene 6.6.1 =======================
 
 Bug Fixes
diff --git a/lucene/common-build.xml b/lucene/common-build.xml
index 37f35f3..312c5b8 100644
--- a/lucene/common-build.xml
+++ b/lucene/common-build.xml
@@ -80,13 +80,15 @@
   <!-- Needed in case a module needs the original build, also for compile-tools to be called from a module -->
   <property name="common.build.dir" location="${common.dir}/build"/>
 
-  <property name="ivy.bootstrap.version" value="2.3.0" />
+  <property name="ivy.bootstrap.version" value="2.4.0" /> <!-- UPGRADE NOTE: update disallowed_ivy_jars_regex below -->
+  <property name="disallowed_ivy_jars_regex" value="ivy-2\.[0123].*\.jar"/>
+
   <property name="ivy.default.configuration" value="*"/>
 
   <!-- Running ant targets in parallel may require this set to false because ivy:retrieve tasks may race with resolve -->
   <property name="ivy.sync" value="true"/>
   <property name="ivy.resolution-cache.dir" location="${common.build.dir}/ivy-resolution-cache"/>
-  <property name="ivy.lock-strategy" value="artifact-lock"/>
+  <property name="ivy.lock-strategy" value="artifact-lock-nio"/>
 
   <property name="local.caches" location="${common.dir}/../.caches" />
   <property name="tests.cachedir"  location="${local.caches}/test-stats" />
@@ -413,15 +415,38 @@
   <property name="ivy_bootstrap_url1" value="http://repo1.maven.org/maven2"/>
   <!-- you might need to tweak this from china so it works -->
   <property name="ivy_bootstrap_url2" value="http://uk.maven.org/maven2"/>
-  <property name="ivy_checksum_sha1" value="c5ebf1c253ad4959a29f4acfe696ee48cdd9f473"/>
+  <property name="ivy_checksum_sha1" value="5abe4c24bbe992a9ac07ca563d5bd3e8d569e9ed"/>
 
   <target name="ivy-availability-check" unless="ivy.available">
+    <path id="disallowed.ivy.jars">
+      <fileset dir="${ivy_install_path}">
+        <filename regex="${disallowed_ivy_jars_regex}"/>
+      </fileset>
+    </path>
+    <loadresource property="disallowed.ivy.jars.list">
+      <string value="${toString:disallowed.ivy.jars}"/>
+      <filterchain><tokenfilter><replacestring from="jar:" to="jar, "/></tokenfilter></filterchain>
+    </loadresource>
+    <condition property="disallowed.ivy.jar.found">
+      <resourcecount when="greater" count="0">
+        <path refid="disallowed.ivy.jars"/>
+      </resourcecount>
+    </condition>
+    <antcall target="-ivy-fail-disallowed-ivy-version"/>
+
     <condition property="ivy.available">
       <typefound uri="antlib:org.apache.ivy.ant" name="configure" />
     </condition>
     <antcall target="ivy-fail" />
   </target>
 
+  <target name="-ivy-fail-disallowed-ivy-version" if="disallowed.ivy.jar.found">
+    <sequential>
+      <echo message="Please delete the following disallowed Ivy jar(s): ${disallowed.ivy.jars.list}"/>
+      <fail>Found disallowed Ivy jar(s): ${disallowed.ivy.jars.list}</fail>
+    </sequential>
+  </target>
+
   <target name="ivy-fail" unless="ivy.available">
    <echo>
      This build requires Ivy and Ivy could not be found in your ant classpath.
@@ -459,19 +484,20 @@
     <fail>Ivy is not available</fail>
   </target>
 
-  <target name="ivy-bootstrap" description="Download and install Ivy in the users ant lib dir" depends="ivy-bootstrap1,ivy-bootstrap2,ivy-checksum"/>
+  <target name="ivy-bootstrap" description="Download and install Ivy in the users ant lib dir" 
+          depends="-ivy-bootstrap1,-ivy-bootstrap2,-ivy-checksum,-ivy-remove-old-versions"/>
 
   <!-- try to download from repo1.maven.org -->
-  <target name="ivy-bootstrap1">
+  <target name="-ivy-bootstrap1">
     <ivy-download src="${ivy_bootstrap_url1}" dest="${ivy_install_path}"/>
     <available file="${ivy_install_path}/ivy-${ivy.bootstrap.version}.jar" property="ivy.bootstrap1.success" />
   </target> 
 
-  <target name="ivy-bootstrap2" unless="ivy.bootstrap1.success">
+  <target name="-ivy-bootstrap2" unless="ivy.bootstrap1.success">
     <ivy-download src="${ivy_bootstrap_url2}" dest="${ivy_install_path}"/>
   </target>
 
-  <target name="ivy-checksum">
+  <target name="-ivy-checksum">
     <checksum file="${ivy_install_path}/ivy-${ivy.bootstrap.version}.jar"
               property="${ivy_checksum_sha1}"
               algorithm="SHA"
@@ -482,6 +508,14 @@
       </condition>
     </fail>
   </target>
+  
+  <target name="-ivy-remove-old-versions">
+    <delete verbose="true" failonerror="true">
+      <fileset dir="${ivy_install_path}">
+        <filename regex="${disallowed_ivy_jars_regex}"/>
+      </fileset>
+    </delete>
+  </target>
    
   <macrodef name="ivy-download">
       <attribute name="src"/>
diff --git a/lucene/ivy-versions.properties b/lucene/ivy-versions.properties
index 3482d64..c6d0cda 100644
--- a/lucene/ivy-versions.properties
+++ b/lucene/ivy-versions.properties
@@ -151,7 +151,7 @@ org.apache.hadoop.version = 2.7.4
 /org.apache.httpcomponents/httpcore = 4.4.1
 /org.apache.httpcomponents/httpmime = 4.4.1
 
-/org.apache.ivy/ivy = 2.3.0
+/org.apache.ivy/ivy = 2.4.0
 
 org.apache.james.apache.mime4j.version = 0.7.2
 /org.apache.james/apache-mime4j-core = ${org.apache.james.apache.mime4j.version}
diff --git a/lucene/licenses/ivy-2.3.0.jar.sha1 b/lucene/licenses/ivy-2.3.0.jar.sha1
deleted file mode 100644
index f4b036f..0000000
--- a/lucene/licenses/ivy-2.3.0.jar.sha1
+++ /dev/null
@@ -1 +0,0 @@
-c5ebf1c253ad4959a29f4acfe696ee48cdd9f473
diff --git a/lucene/licenses/ivy-2.4.0.jar.sha1 b/lucene/licenses/ivy-2.4.0.jar.sha1
new file mode 100644
index 0000000..3863b25
--- /dev/null
+++ b/lucene/licenses/ivy-2.4.0.jar.sha1
@@ -0,0 +1 @@
+5abe4c24bbe992a9ac07ca563d5bd3e8d569e9ed
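
As a practical note (not part of the diff above), re-running the bootstrap target is enough to pick up the new Ivy
version. A sketch of the expected workflow, assuming a checkout of this branch and the default ~/.ant/lib install path:

    cd lucene
    # downloads ivy-2.4.0.jar into ~/.ant/lib/ and deletes any old ivy-2.0 through ivy-2.3 jars found there
    ant ivy-bootstrap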


[lucene] 08/08: SOLR-9743: documentation

Posted by dw...@apache.org.
This is an automated email from the ASF dual-hosted git repository.

dweiss pushed a commit to branch branch_7_0
in repository https://gitbox.apache.org/repos/asf/lucene.git

commit e3d379af1b26c6d4341209a22370bcd9fe4ffabc
Author: Noble Paul <no...@apache.org>
AuthorDate: Fri Dec 8 19:30:57 2017 +1100

    SOLR-9743: documentation
---
 solr/solr-ref-guide/src/collections-api.adoc | 11 +++++++++++
 1 file changed, 11 insertions(+)

diff --git a/solr/solr-ref-guide/src/collections-api.adoc b/solr/solr-ref-guide/src/collections-api.adoc
index b7fa7f2..1732c2a 100644
--- a/solr/solr-ref-guide/src/collections-api.adoc
+++ b/solr/solr-ref-guide/src/collections-api.adoc
@@ -1904,6 +1904,17 @@ The name of the destination node. This parameter is required.
 `async`::
 Request ID to track this action which will be <<Asynchronous Calls,processed asynchronously>>.
 
+[[utilizenode]]
+== UTILIZENODE: Utilize a new node
+
+This command can be used to move some replicas from the existing nodes to a new or lightly loaded node, reducing the load on the existing nodes. It uses your autoscaling policies and preferences to identify which replicas need to be moved. It first tries to fix any policy violations, and then tries to move some load off of the most heavily loaded nodes according to the preferences.
+
+`/admin/collections?action=UTILIZENODE&node=nodeName`
+=== UTILIZENODE Parameters
+
+`node`:: The name of the node that needs to be utilized. This parameter is required.
+
+
 == Asynchronous Calls
 
 Since some collection API calls can be long running tasks (such as SPLITSHARD), you can optionally have the calls run asynchronously. Specifying `async=<request-id>` enables you to make an asynchronous call, the status of which can be requested using the <<requeststatus,REQUESTSTATUS>> call at any time.
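
As an aside, an invocation of the new command described above might look like the following sketch; the node name
shown is only a placeholder, and an `async` request ID can be added as described under Asynchronous Calls:

    curl 'http://localhost:8983/solr/admin/collections?action=UTILIZENODE&node=10.0.0.4:8983_solr'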