Posted to commits@lucene.apache.org by ct...@apache.org on 2017/05/08 15:24:11 UTC

[2/2] lucene-solr:jira/solr-10290: SOLR-10296: conversion, remaining letter S minus solr-glossary


Project: http://git-wip-us.apache.org/repos/asf/lucene-solr/repo
Commit: http://git-wip-us.apache.org/repos/asf/lucene-solr/commit/d05e3a40
Tree: http://git-wip-us.apache.org/repos/asf/lucene-solr/tree/d05e3a40
Diff: http://git-wip-us.apache.org/repos/asf/lucene-solr/diff/d05e3a40

Branch: refs/heads/jira/solr-10290
Commit: d05e3a4066fcf7446479a233f26a254c970fb95c
Parents: 3f9dc38
Author: Cassandra Targett <ct...@apache.org>
Authored: Mon May 8 10:23:43 2017 -0500
Committer: Cassandra Targett <ct...@apache.org>
Committed: Mon May 8 10:23:43 2017 -0500

----------------------------------------------------------------------
 .../src/solr-control-script-reference.adoc      | 111 ++--
 .../src/solr-cores-and-solr-xml.adoc            |  12 +-
 solr/solr-ref-guide/src/solr-field-types.adoc   |   6 +-
 .../src/solr-jdbc-apache-zeppelin.adoc          |  27 +-
 .../src/solr-jdbc-dbvisualizer.adoc             |   3 +-
 .../src/solr-jdbc-python-jython.adoc            |  73 +--
 solr/solr-ref-guide/src/solr-jdbc-r.adoc        |  19 +-
 .../src/solr-jdbc-squirrel-sql.adoc             |  13 -
 ...lrcloud-with-legacy-configuration-files.adoc |  26 +-
 solr/solr-ref-guide/src/solrcloud.adoc          |   2 +-
 solr/solr-ref-guide/src/spatial-search.adoc     |  98 ++--
 solr/solr-ref-guide/src/spell-checking.adoc     |  72 ++-
 solr/solr-ref-guide/src/stream-screen.adoc      |   2 +-
 .../src/streaming-expressions.adoc              | 508 ++++++++++---------
 solr/solr-ref-guide/src/suggester.adoc          |  34 +-
 15 files changed, 502 insertions(+), 504 deletions(-)
----------------------------------------------------------------------


http://git-wip-us.apache.org/repos/asf/lucene-solr/blob/d05e3a40/solr/solr-ref-guide/src/solr-control-script-reference.adoc
----------------------------------------------------------------------
diff --git a/solr/solr-ref-guide/src/solr-control-script-reference.adoc b/solr/solr-ref-guide/src/solr-control-script-reference.adoc
index eeece8f..d150d39 100644
--- a/solr/solr-ref-guide/src/solr-control-script-reference.adoc
+++ b/solr/solr-ref-guide/src/solr-control-script-reference.adoc
@@ -2,11 +2,13 @@
 :page-shortname: solr-control-script-reference
 :page-permalink: solr-control-script-reference.html
 
-Solr includes a script known as "`bin/solr`" that allows you to start and stop Solr, create and delete collections or cores, perform operations on ZooKeeper and check the status of Solr and configured shards. You can find the script in the `bin/` directory of your Solr installation. The `bin/solr` script makes Solr easier to work with by providing simple commands and options to quickly accomplish common goals.
+Solr includes a script known as "`bin/solr`" that allows you to perform many common operations on your Solr installation or cluster.
 
-The headings below correspond to available commands. For each command, the available options are described with examples.
+You can start and stop Solr, create and delete collections or cores, perform operations on ZooKeeper and check the status of Solr and configured shards.
 
-More examples of bin/solr in use are available throughout the Solr Reference Guide, but particularly in the sections <<running-solr.adoc#running-solr,Running Solr>> and <<getting-started-with-solrcloud.adoc#getting-started-with-solrcloud,Getting Started with SolrCloud>>.
+You can find the script in the `bin/` directory of your Solr installation. The `bin/solr` script makes Solr easier to work with by providing simple commands and options to quickly accomplish common goals.
+
+More examples of `bin/solr` in use are available throughout the Solr Reference Guide, but particularly in the sections <<running-solr.adoc#running-solr,Running Solr>> and <<getting-started-with-solrcloud.adoc#getting-started-with-solrcloud,Getting Started with SolrCloud>>.
 
 [[SolrControlScriptReference-StartingandStopping]]
 == Starting and Stopping
@@ -14,9 +16,9 @@ More examples of bin/solr in use are available throughout the Solr Reference Gui
 [[SolrControlScriptReference-StartandRestart]]
 === Start and Restart
 
-The start command starts Solr. The restart command allows you to restart Solr while it is already running or if it has been stopped already.
+The `start` command starts Solr. The `restart` command allows you to restart Solr while it is already running or if it has been stopped already.
 
-The start and restart commands have several options to allow you to run in SolrCloud mode, use an example configuration set, start with a hostname or port that is not the default and point to a local ZooKeeper ensemble.
+The `start` and `restart` commands have several options to allow you to run in SolrCloud mode, use an example configuration set, start with a hostname or port that is not the default and point to a local ZooKeeper ensemble.
 
 `bin/solr start [options]`
 
@@ -26,16 +28,16 @@ The start and restart commands have several options to allow you to run in SolrC
 
 `bin/solr restart -help`
 
-When using the restart command, you must pass all of the parameters you initially passed when you started Solr. Behind the scenes, a stop request is initiated, so Solr will be stopped before being started again. If no nodes are already running, restart will skip the step to stop and proceed to starting Solr.
+When using the `restart` command, you must pass all of the parameters you initially passed when you started Solr. Behind the scenes, a stop request is initiated, so Solr will be stopped before being started again. If no nodes are already running, `restart` will skip the stop step and proceed to starting Solr.
 
 [[SolrControlScriptReference-AvailableParameters]]
 ==== Available Parameters
 
-The bin/solr script provides many options to allow you to customize the server in common ways, such as changing the listening port. However, most of the defaults are adequate for most Solr installations, especially when just getting started.
+The `bin/solr` script provides many options to allow you to customize the server in common ways, such as changing the listening port. However, most of the defaults are adequate for most Solr installations, especially when just getting started.
 
 // TODO: This table has cells that won't work with PDF: https://github.com/ctargett/refguide-asciidoc-poc/issues/13
 
-[width="100%",cols="34%,33%,33%",options="header",]
+[width="100%",options="header",]
 |===
 |Parameter |Description |Example
 |-a "<string>" |Start Solr with additional JVM parameters, such as those starting with -X. If you are passing JVM parameters that begin with "-D", you can omit the -a option. |`bin/solr start -a "-Xdebug -Xrunjdwp:transport=dt_socket, server=y,suspend=n,address=1044"`
@@ -61,7 +63,6 @@ The available options are:
 * schemaless
 
 See the section <<SolrControlScriptReference-RunningwithExampleConfigurations,Running with Example Configurations>> below for more details on the example configurations.
-
  |`bin/solr start -e schemaless`
 |-f |Start Solr in the foreground; you cannot use this option when running examples with the -e option. |`bin/solr start -f`
 |-h <hostname> |Start Solr with the defined hostname. If this is not specified, 'localhost' will be assumed. |`bin/solr start -h search.mysolr.com`
@@ -97,26 +98,30 @@ It is not necessary to define all of the options when starting if the defaults a
 [[SolrControlScriptReference-SettingJavaSystemProperties]]
 ==== Setting Java System Properties
 
-The bin/solr script will pass any additional parameters that begin with -D to the JVM, which allows you to set arbitrary Java system properties. For example, to set the auto soft-commit frequency to 3 seconds, you can do:
+The `bin/solr` script will pass any additional parameters that begin with `-D` to the JVM, which allows you to set arbitrary Java system properties.
+
+For example, to set the auto soft-commit frequency to 3 seconds, you can do:
 
 `bin/solr start -Dsolr.autoSoftCommit.maxTime=3000`
 
 [[SolrControlScriptReference-SolrCloudMode]]
 ==== SolrCloud Mode
 
-The -c and -cloud options are equivalent:
+The `-c` and `-cloud` options are equivalent:
 
 `bin/solr start -c`
 
 `bin/solr start -cloud`
 
-If you specify a ZooKeeper connection string, such as `-z 192.168.1.4:2181`, then Solr will connect to ZooKeeper and join the cluster. If you do not specify the -z option when starting Solr in cloud mode, then Solr will launch an embedded ZooKeeper server listening on the Solr port + 1000, i.e., if Solr is running on port 8983, then the embedded ZooKeeper will be listening on port 9983.
+If you specify a ZooKeeper connection string, such as `-z 192.168.1.4:2181`, then Solr will connect to ZooKeeper and join the cluster.
+
+If you do not specify the `-z` option when starting Solr in cloud mode, then Solr will launch an embedded ZooKeeper server listening on the Solr port + 1000, i.e., if Solr is running on port 8983, then the embedded ZooKeeper will be listening on port 9983.
 
 [IMPORTANT]
 ====
-
-IMPORTANT: If your ZooKeeper connection string uses a chroot, such as `localhost:2181/solr`, then you need to create the /solr znode before launching SolrCloud using the bin/solr script. To do this use the "mkroot" command outlined below, for example: `bin/solr zk mkroot /solr -z 192.168.1.4:2181`
-
+If your ZooKeeper connection string uses a chroot, such as `localhost:2181/solr`, then you need to create the `/solr` znode before launching SolrCloud using the `bin/solr` script.
+
+To do this, use the `mkroot` command outlined below, for example: `bin/solr zk mkroot /solr -z 192.168.1.4:2181`
 ====
 
 When starting in SolrCloud mode, the interactive script session will prompt you to choose a configset to use.
@@ -130,28 +135,26 @@ For more information about starting Solr in SolrCloud mode, see also the section
 
 The example configurations allow you to get started quickly with a configuration that mirrors what you hope to accomplish with Solr.
 
-Each example launches Solr in with a managed schema, which allows use of the <<schema-api.adoc#schema-api,Schema API>> to make schema edits, but does not allow manual editing of a Schema file If you would prefer to manually modify a `schema.xml` file directly, you can change this default as described in the section <<schema-factory-definition-in-solrconfig.adoc#schema-factory-definition-in-solrconfig,Schema Factory Definition in SolrConfig>>.
+Each example launches Solr with a managed schema, which allows use of the <<schema-api.adoc#schema-api,Schema API>> to make schema edits, but does not allow manual editing of a Schema file. If you would prefer to manually modify a `schema.xml` file directly, you can change this default as described in the section <<schema-factory-definition-in-solrconfig.adoc#schema-factory-definition-in-solrconfig,Schema Factory Definition in SolrConfig>>.
 
 Unless otherwise noted in the descriptions below, the examples do not enable <<solrcloud.adoc#solrcloud,SolrCloud>> nor <<schemaless-mode.adoc#schemaless-mode,schemaless mode>>.
 
 The following examples are provided:
 
-* **cloud**: This example starts a 1-4 node SolrCloud cluster on a single machine. When chosen, an interactive session will start to guide you through options to select the initial configset to use, the number of nodes for your example cluster, the ports to use, and name of the collection to be created. When using this example, you can choose from any of the available configsets found in `$SOLR_HOME/server/solr/configsets`.
-* **techproducts**: This example starts Solr in standalone mode with a schema designed for the sample documents included in the `$SOLR_HOME/example/exampledocs` directory. The configset used can be found in `$SOLR_HOME/server/solr/configsets/sample_techproducts_configs`.
-* **dih**: This example starts Solr in standalone mode with the DataImportHandler (DIH) enabled and several example `dataconfig.xml` files pre-configured for different types of data supported with DIH (such as, database contents, email, RSS feeds, etc.). The configset used is customized for DIH, and is found in `$SOLR_HOME/example/example-DIH/solr/conf`. For more information about DIH, see the section <<uploading-structured-data-store-data-with-the-data-import-handler.adoc#uploading-structured-data-store-data-with-the-data-import-handler,Uploading Structured Data Store Data with the Data Import Handler>>.
-* **schemaless**: This example starts Solr in standalone mode using a managed schema, as described in the section <<schema-factory-definition-in-solrconfig.adoc#schema-factory-definition-in-solrconfig,Schema Factory Definition in SolrConfig>>, and provides a very minimal pre-defined schema. Solr will run in <<schemaless-mode.adoc#schemaless-mode,Schemaless Mode>> with this configuration, where Solr will create fields in the schema on the fly and will guess field types used in incoming documents. The configset used can be found in `$SOLR_HOME/server/solr/configsets/data_driven_schema_configs`.
+* *cloud*: This example starts a 1-4 node SolrCloud cluster on a single machine. When chosen, an interactive session will start to guide you through options to select the initial configset to use, the number of nodes for your example cluster, the ports to use, and name of the collection to be created. When using this example, you can choose from any of the available configsets found in `$SOLR_HOME/server/solr/configsets`.
+* *techproducts*: This example starts Solr in standalone mode with a schema designed for the sample documents included in the `$SOLR_HOME/example/exampledocs` directory. The configset used can be found in `$SOLR_HOME/server/solr/configsets/sample_techproducts_configs`.
+* *dih*: This example starts Solr in standalone mode with the DataImportHandler (DIH) enabled and several example `dataconfig.xml` files pre-configured for different types of data supported with DIH (such as, database contents, email, RSS feeds, etc.). The configset used is customized for DIH, and is found in `$SOLR_HOME/example/example-DIH/solr/conf`. For more information about DIH, see the section <<uploading-structured-data-store-data-with-the-data-import-handler.adoc#uploading-structured-data-store-data-with-the-data-import-handler,Uploading Structured Data Store Data with the Data Import Handler>>.
+* *schemaless*: This example starts Solr in standalone mode using a managed schema, as described in the section <<schema-factory-definition-in-solrconfig.adoc#schema-factory-definition-in-solrconfig,Schema Factory Definition in SolrConfig>>, and provides a very minimal pre-defined schema. Solr will run in <<schemaless-mode.adoc#schemaless-mode,Schemaless Mode>> with this configuration, where Solr will create fields in the schema on the fly and will guess field types used in incoming documents. The configset used can be found in `$SOLR_HOME/server/solr/configsets/data_driven_schema_configs`.
 
 [IMPORTANT]
 ====
-
-The run in-foreground option (-f) is not compatible with the -e option since the script needs to perform additional tasks after starting the Solr server.
-
+The run in-foreground option (`-f`) is not compatible with the `-e` option since the script needs to perform additional tasks after starting the Solr server.
 ====
 
 [[SolrControlScriptReference-Stop]]
 === Stop
 
-The stop command sends a STOP request to a running Solr node, which allows it to shutdown gracefully. The command will wait up to 5 seconds for Solr to stop gracefully and then will forcefully kill the process (kill -9).
+The `stop` command sends a STOP request to a running Solr node, which allows it to shut down gracefully. The command will wait up to 5 seconds for Solr to stop gracefully and then will forcefully kill the process (`kill -9`).
 
 `bin/solr stop [options]`
 
@@ -174,7 +177,7 @@ The stop command sends a STOP request to a running Solr node, which allows it to
 [[SolrControlScriptReference-Version]]
 === Version
 
-The version command simply returns the version of Solr currently installed and immediately exists.
+The `version` command simply returns the version of Solr currently installed and immediately exits.
 
 [source,plain]
 ----
@@ -185,7 +188,9 @@ X.Y.0
 [[SolrControlScriptReference-Status]]
 === Status
 
-The status command displays basic JSON-formatted information for any Solr nodes found running on the local system. The status command uses the SOLR_PID_DIR environment variable to locate Solr process ID files to find running Solr instances; the SOLR_PID_DIR variable defaults to the bin directory.
+The `status` command displays basic JSON-formatted information for any Solr nodes found running on the local system.
+
+The `status` command uses the `SOLR_PID_DIR` environment variable to locate the Solr process ID files of running Solr instances; `SOLR_PID_DIR` defaults to the `bin` directory.
 
 `bin/solr status`
 
@@ -193,7 +198,7 @@ The output will include a status of each node of the cluster, as in this example
 
 [source,plain]
 ----
-Found 2 Solr nodes: 
+Found 2 Solr nodes:
 
 Solr process 39920 running on port 7574
 {
@@ -223,7 +228,7 @@ Solr process 39827 running on port 8865
 [[SolrControlScriptReference-Healthcheck]]
 === Healthcheck
 
-The healthcheck command generates a JSON-formatted health report for a collection when running in SolrCloud mode. The health report provides information about the state of every replica for all shards in a collection, including the number of committed documents and its current state.
+The `healthcheck` command generates a JSON-formatted health report for a collection when running in SolrCloud mode. The health report provides information about the state of every replica for all shards in a collection, including the number of committed documents and each replica's current state.
 
 `bin/solr healthcheck [options]`
 
@@ -241,10 +246,10 @@ The healthcheck command generates a JSON-formatted health report for a collectio
 
 Below is an example healthcheck request and response using a non-standard ZooKeeper connect string, with 2 nodes running:
 
-[source,plain]
-----
-$ bin/solr healthcheck -c gettingstarted -z localhost:9865
+`$ bin/solr healthcheck -c gettingstarted -z localhost:9865`
 
+[source,json]
+----
 {
   "collection":"gettingstarted",
   "status":"healthy",
@@ -294,12 +299,12 @@ $ bin/solr healthcheck -c gettingstarted -z localhost:9865
 [[SolrControlScriptReference-CollectionsandCores]]
 == Collections and Cores
 
-The bin/solr script can also help you create new collections (in SolrCloud mode) or cores (in standalone mode), or delete collections.
+The `bin/solr` script can also help you create new collections (in SolrCloud mode) or cores (in standalone mode), or delete collections.
 
 [[SolrControlScriptReference-Create]]
 === Create
 
-The create command detects the mode that Solr is running in (standalone or SolrCloud) and then creates a core or collection depending on the mode.
+The `create` command detects the mode that Solr is running in (standalone or SolrCloud) and then creates a core or collection depending on the mode.
 
 `bin/solr create [options]`
 
@@ -310,7 +315,7 @@ The create command detects the mode that Solr is running in (standalone or SolrC
 
 // TODO: This table has cells that won't work with PDF: https://github.com/ctargett/refguide-asciidoc-poc/issues/13
 
-[width="100%",cols="34%,33%,33%",options="header",]
+[width="100%",options="header",]
 |===
 |Parameter |Description |Example
 |-c <name> |Name of the core or collection to create (required). |`bin/solr create -c mycollection`
@@ -345,7 +350,9 @@ a|
 [[SolrControlScriptReference-ConfigurationDirectoriesandSolrCloud]]
 ==== Configuration Directories and SolrCloud
 
-Before creating a collection in SolrCloud, the configuration directory used by the collection must be uploaded to ZooKeeper. The create command supports several use cases for how collections and configuration directories work. The main decision you need to make is whether a configuration directory in ZooKeeper should be shared across multiple collections. Let's work through a few examples to illustrate how configuration directories work in SolrCloud.
+Before creating a collection in SolrCloud, the configuration directory used by the collection must be uploaded to ZooKeeper. The `create` command supports several use cases for how collections and configuration directories work. The main decision you need to make is whether a configuration directory in ZooKeeper should be shared across multiple collections.
+
+Let's work through a few examples to illustrate how configuration directories work in SolrCloud.
 
 First, if you don't provide the `-d` or `-n` options, then the default configuration (`$SOLR_HOME/server/solr/configsets/data_driven_schema_configs/conf`) is uploaded to ZooKeeper using the same name as the collection. For example, the following command will result in the *data_driven_schema_configs* configuration being uploaded to `/configs/contacts` in ZooKeeper: `bin/solr create -c contacts`. If you create another collection, by doing `bin/solr create -c contacts2`, then another copy of the `data_driven_schema_configs` directory will be uploaded to ZooKeeper under `/configs/contacts2`. Any changes you make to the configuration for the contacts collection will not affect the contacts2 collection. Put simply, the default behavior creates a unique copy of the configuration directory for each collection you create.
 
@@ -363,7 +370,7 @@ The `data_driven_schema_configs` schema can mutate as data is indexed. Consequen
 [[SolrControlScriptReference-Delete]]
 === Delete
 
-The delete command detects the mode that Solr is running in (standalone or SolrCloud) and then deletes the specified core (standalone) or collection (SolrCloud) as appropriate.
+The `delete` command detects the mode that Solr is running in (standalone or SolrCloud) and then deletes the specified core (standalone) or collection (SolrCloud) as appropriate.
 
 `bin/solr delete [options]`
 
@@ -376,7 +383,7 @@ If running in SolrCloud mode, the delete command checks if the configuration dir
 
 // TODO: This table has cells that won't work with PDF: https://github.com/ctargett/refguide-asciidoc-poc/issues/13
 
-[width="100%",cols="34%,33%,33%",options="header",]
+[width="100%",options="header",]
 |===
 |Parameter |Description |Example
 |-c <name> |Name of the core / collection to delete (required). |`bin/solr delete -c mycoll`
@@ -397,7 +404,7 @@ This option is useful if you are running multiple standalone Solr instances on t
 [[SolrControlScriptReference-ZooKeeperOperations]]
 == ZooKeeper Operations
 
-The bin/solr script allows certain operations affecting ZooKeeper. These operations are for SolrCloud mode only. The operations are available as sub-commands, which each have their own set of options.
+The `bin/solr` script allows certain operations affecting ZooKeeper. These operations are for SolrCloud mode only. The operations are available as sub-commands, which each have their own set of options.
 
 `bin/solr zk [sub-command] [options]`
 
@@ -417,7 +424,7 @@ Use the `zk upconfig` command to upload one of the pre-configured configuration
 
 // TODO: This table has cells that won't work with PDF: https://github.com/ctargett/refguide-asciidoc-poc/issues/13
 
-[width="100%",cols="34%,33%,33%",options="header",]
+[width="100%",options="header",]
 |===
 |Parameter |Description |Example
 |-n <name> a|
@@ -448,9 +455,7 @@ An example of this command with these parameters is:
 .Reload Collections When Changing Configurations
 [WARNING]
 ====
-
 This command does *not* automatically make changes effective! It simply uploads the configuration sets to ZooKeeper. You can use the Collection API's <<collections-api.adoc#CollectionsAPI-reload,RELOAD command>> to reload any collections that uses this configuration set.
-
 ====
 
 [[SolrControlScriptReference-DownloadaConfigurationSet]]
@@ -465,7 +470,7 @@ Use the `zk downconfig` command to download a configuration set from ZooKeeper t
 
 // TODO: This table has cells that won't work with PDF: https://github.com/ctargett/refguide-asciidoc-poc/issues/13
 
-[width="100%",cols="34%,33%,33%",options="header",]
+[width="100%",options="header",]
 |===
 |Parameter |Description |Example
 |-n <name> |Name of config set in ZooKeeper to download. The Admin UI Cloud -> Tree -> configs node lists all available configuration sets. |`-n myconfig`
@@ -494,7 +499,7 @@ Use the `zk cp` command for transferring files and directories between ZooKeeper
 
 // TODO: This table has cells that won't work with PDF: https://github.com/ctargett/refguide-asciidoc-poc/issues/13
 
-[width="100%",cols="34%,33%,33%",options="header",]
+[width="100%",options="header",]
 |===
 |Parameter |Description |Example
 |-r |Optional. Do a recursive copy. The command will fail if the <src> has children unless '-r' is specified. |`-r`
@@ -504,7 +509,7 @@ Use the `zk cp` command for transferring files and directories between ZooKeeper
 `file:/Users/apache/configs/src`
 
 |<dest> |The file or path to copy to. If prefixed with `zk:` then the source is presumed to be ZooKeeper. If no prefix or the prefix is 'file:' this is the local drive. At least one of <src> or <dest> must be prefixed by `zk:` or the command will fail. If <dest> ends in a slash character it names a directory. |`zk:/configs/myconfigs/solrconfig.xml` `file:/Users/apache/configs/src`
-|-z <zkHost> |The ZooKeeper connection string. Unnecessary if ZK_HOST is defined in `solr.in.sh` or `solr.in.cmd`. |`-z 123.321.23.43:2181 `
+|-z <zkHost> |The ZooKeeper connection string. Unnecessary if ZK_HOST is defined in `solr.in.sh` or `solr.in.cmd`. |`-z 123.321.23.43:2181`
 |===
 
 An example of this command with the parameters is:
@@ -527,7 +532,7 @@ Use the `zk rm` command to remove a znode (and optionally all child nodes) from
 
 // TODO: This table has cells that won't work with PDF: https://github.com/ctargett/refguide-asciidoc-poc/issues/13
 
-[width="100%",cols="34%,33%,33%",options="header",]
+[width="100%",options="header",]
 |===
 |Parameter |Description |Example
 |-r |Optional. Do a recursive removal. The command will fail if the <path> has children unless '-r' is specified. |`-r`
@@ -545,7 +550,7 @@ The path is assumed to be a ZooKeeper node, no `zk:` prefix is necessary.
 
 `/configs/myconfigset/solrconfig.xml`
 
-|-z <zkHost> |The ZooKeeper connection string. Unnecessary if ZK_HOST is defined in `solr.in.sh` or `solr.in.cmd`. |`-z 123.321.23.43:2181 `
+|-z <zkHost> |The ZooKeeper connection string. Unnecessary if ZK_HOST is defined in `solr.in.sh` or `solr.in.cmd`. |`-z 123.321.23.43:2181`
 |===
 
 An example of this command with the parameters is:
@@ -564,12 +569,12 @@ Use the `zk mv` command to move (rename) a ZooKeeper znode
 [[SolrControlScriptReference-AvailableParameters.7]]
 ==== Available Parameters
 
-[width="100%",cols="34%,33%,33%",options="header",]
+[width="100%",options="header",]
 |===
 |Parameter |Description |Example
 |<src> |The znode to rename. The `zk:` prefix is assumed. |`/configs/oldconfigset`
 |<dest> |The new name of the znode. The `zk:` prefix is assumed. |`/configs/newconfigset`
-|-z <zkHost> |The ZooKeeper connection string. Unnecessary if ZK_HOST is defined in `solr.in.sh` or `solr.in.cmd`. |`-z 123.321.23.43:2181 `
+|-z <zkHost> |The ZooKeeper connection string. Unnecessary if ZK_HOST is defined in `solr.in.sh` or `solr.in.cmd`. |`-z 123.321.23.43:2181`
 |===
 
 An example of this command is:
@@ -586,12 +591,12 @@ Use the `zk ls` command to see the children of a znode.
 [[SolrControlScriptReference-AvailableParameters.8]]
 ==== Available Parameters
 
-[width="100%",cols="34%,33%,33%",options="header",]
+[width="100%",options="header",]
 |===
 |Parameter |Description |Example
 |-r |Optional. Recursively list all descendants of a znode. |`-r`
 |<path> |The path on ZooKeeper to list. |`/collections/mycollection`
-|-z <zkHost> |The ZooKeeper connection string. Unnecessary if ZK_HOST is defined in `solr.in.sh` or `solr.in.cmd`. |`-z 123.321.23.43:2181 `
+|-z <zkHost> |The ZooKeeper connection string. Unnecessary if ZK_HOST is defined in `solr.in.sh` or `solr.in.cmd`. |`-z 123.321.23.43:2181`
 |===
 
 An example of this command with the parameters is:
@@ -610,15 +615,15 @@ Use the `zk mkroot` command to create a znode. The primary use-case for this com
 [[SolrControlScriptReference-AvailableParameters.9]]
 ==== Available Parameters
 
-[width="100%",cols="34%,33%,33%",options="header",]
+[width="100%",options="header",]
 |===
 |Parameter |Description |Example
 |<path> |The path on ZooKeeper to create. Intermediate znodes will be created if necessary. A leading slash is assumed even if not specified. |`/solr`
-|-z <zkHost> |The ZooKeeper connection string. Unnecessary if ZK_HOST is defined in `solr.in.sh` or `solr.in.cmd`. |`-z 123.321.23.43:2181 `
+|-z <zkHost> |The ZooKeeper connection string. Unnecessary if ZK_HOST is defined in `solr.in.sh` or `solr.in.cmd`. |`-z 123.321.23.43:2181`
 |===
 
 Examples of this command:
 
-`bin/solr zk mkroot /solr -z 123.321.23.43:2181 `
+`bin/solr zk mkroot /solr -z 123.321.23.43:2181`
 
 `bin/solr zk mkroot /solr/production`
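The embedded-ZooKeeper rule described in the SolrCloud Mode section above (when `-z` is omitted, ZooKeeper listens on the Solr port + 1000) can be sketched as a small shell helper. This is an illustration only; the function name and sample ports are hypothetical, not part of the `bin/solr` script.

```shell
# Sketch of the embedded-ZooKeeper port rule: when -z is not given in
# cloud mode, the embedded ZooKeeper listens on the Solr port + 1000.
# The helper name and sample ports here are illustrative only.
embedded_zk_port() {
  local solr_port=$1
  echo $(( solr_port + 1000 ))
}

embedded_zk_port 8983   # default Solr port -> 9983
embedded_zk_port 7574   # a second example node -> 8574
```

So a node started with `bin/solr start -c -p 8983` and no `-z` option would expose its embedded ZooKeeper on port 9983.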

http://git-wip-us.apache.org/repos/asf/lucene-solr/blob/d05e3a40/solr/solr-ref-guide/src/solr-cores-and-solr-xml.adoc
----------------------------------------------------------------------
diff --git a/solr/solr-ref-guide/src/solr-cores-and-solr-xml.adoc b/solr/solr-ref-guide/src/solr-cores-and-solr-xml.adoc
index e3cfa77..2e44bab 100644
--- a/solr/solr-ref-guide/src/solr-cores-and-solr-xml.adoc
+++ b/solr/solr-ref-guide/src/solr-cores-and-solr-xml.adoc
@@ -3,7 +3,7 @@
 :page-permalink: solr-cores-and-solr-xml.html
 :page-children: format-of-solr-xml, defining-core-properties, coreadmin-api, config-sets
 
-In Solr, the term _core_ is used to refer to a single index and associated transaction log and configuration files (including the `solrconfig.xml` and Schema files, among others). Your Solr installation can have multiple cores if needed, which allows you to index data with different structures in the same server, and maintain more control over how your data is presented to different audiences. In SolrCloud mode you will be more familiar with the term __collection.__ Behind the scenes a collection consists of one or more cores.
+In Solr, the term _core_ is used to refer to a single index and associated transaction log and configuration files (including the `solrconfig.xml` and Schema files, among others). Your Solr installation can have multiple cores if needed, which allows you to index data with different structures in the same server, and maintain more control over how your data is presented to different audiences. In SolrCloud mode you will be more familiar with the term _collection._ Behind the scenes a collection consists of one or more cores.
 
 Cores can be created using `bin/solr` script or as part of SolrCloud collection creation using the APIs. Core-specific properties (such as the directories to use for the indexes or configuration files, the core name, and other options) are defined in a `core.properties` file. Any `core.properties` file in any directory of your Solr installation (or in a directory under where `solr_home` is defined) will be found by Solr and the defined properties will be used for the core named in the file.
 
@@ -11,14 +11,12 @@ In standalone mode, `solr.xml` must reside in `solr_home`. In SolrCloud mode, `s
 
 [NOTE]
 ====
-
 In older versions of Solr, cores had to be predefined as `<core>` tags in `solr.xml` in order for Solr to know about them. Now, however, Solr supports automatic discovery of cores and they no longer need to be explicitly defined. The recommended way is to dynamically create cores/collections using the APIs.
-
 ====
 
 The following sections describe these options in more detail.
 
-* **<<format-of-solr-xml.adoc#format-of-solr-xml,Format of solr.xml>>**: Details on how to define `solr.xml`, including the acceptable parameters for the `solr.xml` file
-* **<<defining-core-properties.adoc#defining-core-properties,Defining core.properties>>**: Details on placement of `core.properties` and available property options.
-* **<<coreadmin-api.adoc#coreadmin-api,CoreAdmin API>>**: Tools and commands for core administration using a REST API.
-* **<<config-sets.adoc#config-sets,Config Sets>>**: How to use configsets to avoid duplicating effort when defining a new core.
+* *<<format-of-solr-xml.adoc#format-of-solr-xml,Format of solr.xml>>*: Details on how to define `solr.xml`, including the acceptable parameters for the `solr.xml` file
+* *<<defining-core-properties.adoc#defining-core-properties,Defining core.properties>>*: Details on placement of `core.properties` and available property options.
+* *<<coreadmin-api.adoc#coreadmin-api,CoreAdmin API>>*: Tools and commands for core administration using a REST API.
+* *<<config-sets.adoc#config-sets,Config Sets>>*: How to use configsets to avoid duplicating effort when defining a new core.

http://git-wip-us.apache.org/repos/asf/lucene-solr/blob/d05e3a40/solr/solr-ref-guide/src/solr-field-types.adoc
----------------------------------------------------------------------
diff --git a/solr/solr-ref-guide/src/solr-field-types.adoc b/solr/solr-ref-guide/src/solr-field-types.adoc
index f8f08a6..b9bf1da 100644
--- a/solr/solr-ref-guide/src/solr-field-types.adoc
+++ b/solr/solr-ref-guide/src/solr-field-types.adoc
@@ -21,8 +21,4 @@ Topics covered in this section:
 
 * <<field-properties-by-use-case.adoc#field-properties-by-use-case,Field Properties by Use Case>>
 
-[[SolrFieldTypes-RelatedTopics]]
-== Related Topics
-
-* http://wiki.apache.org/solr/SchemaXml#Data_Types[SchemaXML-DataTypes]
-* {solr-javadocs}/solr-core/org/apache/solr/schema/FieldType.html[FieldType Javadoc]
+TIP: See also the {solr-javadocs}/solr-core/org/apache/solr/schema/FieldType.html[FieldType Javadoc].

http://git-wip-us.apache.org/repos/asf/lucene-solr/blob/d05e3a40/solr/solr-ref-guide/src/solr-jdbc-apache-zeppelin.adoc
----------------------------------------------------------------------
diff --git a/solr/solr-ref-guide/src/solr-jdbc-apache-zeppelin.adoc b/solr/solr-ref-guide/src/solr-jdbc-apache-zeppelin.adoc
index 858aae2..e78a936 100644
--- a/solr/solr-ref-guide/src/solr-jdbc-apache-zeppelin.adoc
+++ b/solr/solr-ref-guide/src/solr-jdbc-apache-zeppelin.adoc
@@ -2,59 +2,52 @@
 :page-shortname: solr-jdbc-apache-zeppelin
 :page-permalink: solr-jdbc-apache-zeppelin.html
 
-[IMPORTANT]
-====
+The Solr JDBC driver supports Apache Zeppelin.
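As a sketch of what the interpreter setup boils down to, a default-driver configuration pairs the Solr JDBC driver class with a Solr JDBC URL. The host, port, and collection name below are placeholders matching the examples elsewhere in this guide:

```text
default.driver   org.apache.solr.client.solrj.io.sql.DriverImpl
default.url      jdbc:solr://localhost:9983?collection=test
default.user     (leave empty; Solr JDBC does not require credentials here)
```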
 
-This requires Apache Zeppelin 0.6.0 or greater which contains the JDBC interpreter.
-
-====
+IMPORTANT: This requires Apache Zeppelin 0.6.0 or greater which contains the JDBC interpreter.
 
-For http://zeppelin.apache.org[Apache Zeppelin], you will need to create a JDBC interpreter for Solr. This will add SolrJ to the interpreter classpath. Once the interpreter has been created, you can create a notebook to issue queries. The http://zeppelin.apache.org/docs/latest/interpreter/jdbc.html[Apache Zeppelin JDBC interpreter documentation] provides additional information about JDBC prefixes and other features.
+To use http://zeppelin.apache.org[Apache Zeppelin] with Solr, you will need to create a JDBC interpreter for Solr. This will add SolrJ to the interpreter classpath. Once the interpreter has been created, you can create a notebook to issue queries. The http://zeppelin.apache.org/docs/latest/interpreter/jdbc.html[Apache Zeppelin JDBC interpreter documentation] provides additional information about JDBC prefixes and other features.
 
 [[SolrJDBC-ApacheZeppelin-CreatetheApacheSolrJDBCInterpreter]]
 == Create the Apache Solr JDBC Interpreter
 
+.Click "Interpreter" in the top navigation
 image::images/solr-jdbc-apache-zeppelin/zeppelin_solrjdbc_1.png[image,height=400]
 
-
+.Click "Create"
 image::images/solr-jdbc-apache-zeppelin/zeppelin_solrjdbc_2.png[image,height=400]
 
-
+.Enter information about your Solr installation
 image::images/solr-jdbc-apache-zeppelin/zeppelin_solrjdbc_3.png[image,height=400]
 
-
 [NOTE]
 ====
-
 For most installations, Apache Zeppelin configures PostgreSQL as the JDBC interpreter default driver. The default driver can either be replaced by the Solr driver as outlined above or you can add a separate JDBC interpreter prefix as outlined in the http://zeppelin.apache.org/docs/latest/interpreter/jdbc.html[Apache Zeppelin JDBC interpreter documentation].
-
 ====
 
 [[SolrJDBC-ApacheZeppelin-CreateaNotebook]]
 == Create a Notebook
 
+.Click Notebook -> Create new note
 image::images/solr-jdbc-apache-zeppelin/zeppelin_solrjdbc_4.png[image,width=517,height=400]
 
-
+.Provide a name and click "Create Note"
 image::images/solr-jdbc-apache-zeppelin/zeppelin_solrjdbc_5.png[image,width=839,height=400]
 
-
 [[SolrJDBC-ApacheZeppelin-QuerywiththeNotebook]]
 == Query with the Notebook
 
 [IMPORTANT]
 ====
-
 For some notebooks, the JDBC interpreter will not be bound to the notebook by default. Instructions on how to bind the JDBC interpreter to a notebook are available https://zeppelin.apache.org/docs/latest/interpreter/jdbc.html#bind-to-notebook[here].
-
 ====
 
+.Results of Solr query
 image::images/solr-jdbc-apache-zeppelin/zeppelin_solrjdbc_6.png[image,width=481,height=400]
 
-
 The below code block assumes that the Apache Solr driver is set up as the default JDBC interpreter driver. If that is not the case, instructions for using a different prefix are available https://zeppelin.apache.org/docs/latest/interpreter/jdbc.html#how-to-use[here].
 
-[source,java]
+[source,sql]
 ----
 %jdbc
 select fielda, fieldb from test limit 10

http://git-wip-us.apache.org/repos/asf/lucene-solr/blob/d05e3a40/solr/solr-ref-guide/src/solr-jdbc-dbvisualizer.adoc
----------------------------------------------------------------------
diff --git a/solr/solr-ref-guide/src/solr-jdbc-dbvisualizer.adoc b/solr/solr-ref-guide/src/solr-jdbc-dbvisualizer.adoc
index f0f341c..af70dfd 100644
--- a/solr/solr-ref-guide/src/solr-jdbc-dbvisualizer.adoc
+++ b/solr/solr-ref-guide/src/solr-jdbc-dbvisualizer.adoc
@@ -2,6 +2,8 @@
 :page-shortname: solr-jdbc-dbvisualizer
 :page-permalink: solr-jdbc-dbvisualizer.html
 
+Solr's JDBC driver supports DbVisualizer for querying Solr.
+
 For https://www.dbvis.com/[DbVisualizer], you will need to create a new driver for Solr using the DbVisualizer Driver Manager. This will add several SolrJ client .jars to the DbVisualizer classpath. The files required are:
 
 * all .jars found in `$SOLR_HOME/dist/solrj-lib`
@@ -116,4 +118,3 @@ image::images/solr-jdbc-dbvisualizer/dbvisualizer_solrjdbc_19.png[image,width=57
 
 
 image::images/solr-jdbc-dbvisualizer/dbvisualizer_solrjdbc_20.png[image,width=556,height=400]
-

http://git-wip-us.apache.org/repos/asf/lucene-solr/blob/d05e3a40/solr/solr-ref-guide/src/solr-jdbc-python-jython.adoc
----------------------------------------------------------------------
diff --git a/solr/solr-ref-guide/src/solr-jdbc-python-jython.adoc b/solr/solr-ref-guide/src/solr-jdbc-python-jython.adoc
index 37ae6ac..bb91425 100644
--- a/solr/solr-ref-guide/src/solr-jdbc-python-jython.adoc
+++ b/solr/solr-ref-guide/src/solr-jdbc-python-jython.adoc
@@ -2,35 +2,29 @@
 :page-shortname: solr-jdbc-python-jython
 :page-permalink: solr-jdbc-python-jython.html
 
-// OLD_CONFLUENCE_ID: SolrJDBC-Python/Jython-Python
+Solr's JDBC driver supports Python and Jython.
 
-[[SolrJDBC-Python_Jython-Python]]
 == Python
 
 Python supports accessing JDBC using the https://pypi.python.org/pypi/JayDeBeApi/[JayDeBeApi] library. The CLASSPATH variable must be configured to contain the solr-solrj jar and the supporting solrj-lib jars.
 
-// OLD_CONFLUENCE_ID: SolrJDBC-Python/Jython-JayDeBeApi
 
-[[SolrJDBC-Python_Jython-JayDeBeApi]]
 === JayDeBeApi
 
-*run.sh*
-
+.run.sh
 [source,bash]
 ----
 #!/usr/bin/env bash
- 
 # Java 8 must already be installed
- 
+
 pip install JayDeBeApi
- 
+
 export CLASSPATH="$(echo $(ls /opt/solr/dist/solr-solrj* /opt/solr/dist/solrj-lib/*) | tr ' ' ':')"
 
 python solr_jaydebeapi.py
 ----
 
-*solr_jaydebeapi.py*
-
+.solr_jaydebeapi.py
 [source,py]
 ----
 #!/usr/bin/env python
@@ -43,103 +37,90 @@ if __name__ == '__main__':
   jdbc_url = "jdbc:solr://localhost:9983?collection=test"
   driverName = "org.apache.solr.client.solrj.io.sql.DriverImpl"
   statement = "select fielda, fieldb, fieldc, fieldd_s, fielde_i from test limit 10"
- 
+
   conn = jaydebeapi.connect(driverName, jdbc_url)
   curs = conn.cursor()
   curs.execute(statement)
   print(curs.fetchall())
-  
+
   conn.close()
-  
+
   sys.exit(0)
 ----
 
-// OLD_CONFLUENCE_ID: SolrJDBC-Python/Jython-Jython
-
-[[SolrJDBC-Python_Jython-Jython]]
 == Jython
 
 Jython supports accessing JDBC natively with Java interfaces or with the zxJDBC library. The CLASSPATH variable must be configured to contain the solr-solrj jar and the supporting solrj-lib jars.
 
-*run.sh*
-
+.run.sh
 [source,bash]
 ----
 #!/usr/bin/env bash
- 
 # Java 8 and Jython must already be installed
- 
+
 export CLASSPATH="$(echo $(ls /opt/solr/dist/solr-solrj* /opt/solr/dist/solrj-lib/*) | tr ' ' ':')"
- 
+
 jython [solr_java_native.py | solr_zxjdbc.py]
 ----
 
-// OLD_CONFLUENCE_ID: SolrJDBC-Python/Jython-JavaNative
-
-[[SolrJDBC-Python_Jython-JavaNative]]
 === Java Native
 
-*solr_java_native.py*
-
+.solr_java_native.py
 [source,py]
 ----
 #!/usr/bin/env jython
- 
+
 # http://www.jython.org/jythonbook/en/1.0/DatabasesAndJython.html
 # https://wiki.python.org/jython/DatabaseExamples#SQLite_using_JDBC
- 
+
 import sys
- 
+
 from java.lang import Class
 from java.sql  import DriverManager, SQLException
- 
+
 if __name__ == '__main__':
   jdbc_url = "jdbc:solr://localhost:9983?collection=test"
   driverName = "org.apache.solr.client.solrj.io.sql.DriverImpl"
   statement = "select fielda, fieldb, fieldc, fieldd_s, fielde_i from test limit 10"
-  
+
   dbConn = DriverManager.getConnection(jdbc_url)
   stmt = dbConn.createStatement()
-  
+
   resultSet = stmt.executeQuery(statement)
   while resultSet.next():
     print(resultSet.getString("fielda"))
-  
+
   resultSet.close()
   stmt.close()
   dbConn.close()
-  
+
   sys.exit(0)
 ----
 
-// OLD_CONFLUENCE_ID: SolrJDBC-Python/Jython-zxJDBC
-
-[[SolrJDBC-Python_Jython-zxJDBC]]
 === zxJDBC
 
-*solr_zxjdbc.py*
-
+.solr_zxjdbc.py
 [source,py]
 ----
 #!/usr/bin/env jython
- 
+
 # http://www.jython.org/jythonbook/en/1.0/DatabasesAndJython.html
 # https://wiki.python.org/jython/DatabaseExamples#SQLite_using_ziclix
- 
+
 import sys
- 
+
 from com.ziclix.python.sql import zxJDBC
- 
+
 if __name__ == '__main__':
   jdbc_url = "jdbc:solr://localhost:9983?collection=test"
   driverName = "org.apache.solr.client.solrj.io.sql.DriverImpl"
   statement = "select fielda, fieldb, fieldc, fieldd_s, fielde_i from test limit 10"
-  
+
   with zxJDBC.connect(jdbc_url, None, None, driverName) as conn:
     with conn:
       with conn.cursor() as c:
         c.execute(statement)
         print(c.fetchall())
-  
+
   sys.exit(0)
 ----

http://git-wip-us.apache.org/repos/asf/lucene-solr/blob/d05e3a40/solr/solr-ref-guide/src/solr-jdbc-r.adoc
----------------------------------------------------------------------
diff --git a/solr/solr-ref-guide/src/solr-jdbc-r.adoc b/solr/solr-ref-guide/src/solr-jdbc-r.adoc
index bd38ffe..3dedbcc 100644
--- a/solr/solr-ref-guide/src/solr-jdbc-r.adoc
+++ b/solr/solr-ref-guide/src/solr-jdbc-r.adoc
@@ -4,31 +4,28 @@
 
 R supports accessing JDBC using the https://www.rforge.net/RJDBC/[RJDBC] library.
 
-[[SolrJDBC-R-RJDBC]]
-=== RJDBC
-
-*run.sh*
+== RJDBC
 
+.run.sh
 [source,bash]
 ----
 #!/usr/bin/env bash
- 
+
 # Java 8 must already be installed and R configured with `R CMD javareconf`
 
 Rscript -e 'install.packages("RJDBC", dep=TRUE)'
 Rscript solr_rjdbc.R
 ----
 
-*solr_rjdbc.R*
-
-[source,java]
+.solr_rjdbc.R
+[source,r]
 ----
 # https://www.rforge.net/RJDBC/
- 
+
 library("RJDBC")
- 
+
 solrCP <- c(list.files('/opt/solr/dist/solrj-lib', full.names=TRUE), list.files('/opt/solr/dist', pattern='solrj', full.names=TRUE, recursive = TRUE))
- 
+
 drv <- JDBC("org.apache.solr.client.solrj.io.sql.DriverImpl",
            solrCP,
            identifier.quote="`")

http://git-wip-us.apache.org/repos/asf/lucene-solr/blob/d05e3a40/solr/solr-ref-guide/src/solr-jdbc-squirrel-sql.adoc
----------------------------------------------------------------------
diff --git a/solr/solr-ref-guide/src/solr-jdbc-squirrel-sql.adoc b/solr/solr-ref-guide/src/solr-jdbc-squirrel-sql.adoc
index 9031312..bac4cbd 100644
--- a/solr/solr-ref-guide/src/solr-jdbc-squirrel-sql.adoc
+++ b/solr/solr-ref-guide/src/solr-jdbc-squirrel-sql.adoc
@@ -9,22 +9,18 @@ For http://squirrel-sql.sourceforge.net[SQuirreL SQL], you will need to create a
 
 Once the driver has been created, you can create a connection to Solr with the connection string format outlined in the generic section and use the editor to issue queries.
 
-[[SolrJDBC-SQuirreLSQL-AddSolrJDBCDriver]]
 == Add Solr JDBC Driver
 
-[[SolrJDBC-SQuirreLSQL-OpenDrivers]]
 === Open Drivers
 
 image::images/solr-jdbc-squirrel-sql/squirrelsql_solrjdbc_1.png[image,width=900,height=400]
 
 
-[[SolrJDBC-SQuirreLSQL-AddDriver]]
 === Add Driver
 
 image::images/solr-jdbc-squirrel-sql/squirrelsql_solrjdbc_2.png[image,width=892,height=400]
 
 
-[[SolrJDBC-SQuirreLSQL-NametheDriver]]
 === Name the Driver
 
 Provide a name for the driver, and provide the URL format: `jdbc:solr://<zk_connection_string>/?collection=<collection>`. Do not fill in values for the variables "```zk_connection_string```" and "```collection```", those will be defined later when the connection to Solr is configured.
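For example, once the connection is configured, a filled-in connection string for a local Solr instance with embedded ZooKeeper and a collection named `test` (both placeholder values, consistent with the other examples in this guide) would look like:

```text
jdbc:solr://localhost:9983/?collection=test
```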
@@ -32,7 +28,6 @@ Provide a name for the driver, and provide the URL format: `jdbc:solr://<zk_conn
 image::images/solr-jdbc-squirrel-sql/squirrelsql_solrjdbc_3.png[image,width=467,height=400]
 
 
-[[SolrJDBC-SQuirreLSQL-AddSolrJDBCjarstoClasspath]]
 === Add Solr JDBC jars to Classpath
 
 image::images/solr-jdbc-squirrel-sql/squirrelsql_solrjdbc_4.png[image,width=467,height=400]
@@ -47,7 +42,6 @@ image::images/solr-jdbc-squirrel-sql/squirrelsql_solrjdbc_5.png[image,width=469,
 image::images/solr-jdbc-squirrel-sql/squirrelsql_solrjdbc_7.png[image,width=467,height=400]
 
 
-[[SolrJDBC-SQuirreLSQL-AddtheSolrJDBCdriverclassname]]
 === Add the Solr JDBC driver class name
 
 After adding the .jars, you will need to additionally define the Class Name `org.apache.solr.client.solrj.io.sql.DriverImpl`.
@@ -55,39 +49,32 @@ After adding the .jars, you will need to additionally define the Class Name `org
 image::images/solr-jdbc-squirrel-sql/squirrelsql_solrjdbc_11.png[image,width=470,height=400]
 
 
-[[SolrJDBC-SQuirreLSQL-CreateanAlias]]
 == Create an Alias
 
 To define a JDBC connection, you must define an alias.
 
-[[SolrJDBC-SQuirreLSQL-OpenAliases]]
 === Open Aliases
 
 image::images/solr-jdbc-squirrel-sql/squirrelsql_solrjdbc_10.png[image,width=840,height=400]
 
 
-[[SolrJDBC-SQuirreLSQL-AddanAlias]]
 === Add an Alias
 
 image::images/solr-jdbc-squirrel-sql/squirrelsql_solrjdbc_12.png[image,width=959,height=400]
 
 
-[[SolrJDBC-SQuirreLSQL-ConfiguretheAlias]]
 === Configure the Alias
 
 image::images/solr-jdbc-squirrel-sql/squirrelsql_solrjdbc_14.png[image,width=470,height=400]
 
 
-[[SolrJDBC-SQuirreLSQL-ConnecttotheAlias]]
 === Connect to the Alias
 
 image::images/solr-jdbc-squirrel-sql/squirrelsql_solrjdbc_13.png[image,width=522,height=400]
 
 
-[[SolrJDBC-SQuirreLSQL-Querying]]
 == Querying
 
 Once you've successfully connected to Solr, you can use the SQL interface to enter queries and work with data.
 
 image::images/solr-jdbc-squirrel-sql/squirrelsql_solrjdbc_15.png[image,width=655,height=400]
-

http://git-wip-us.apache.org/repos/asf/lucene-solr/blob/d05e3a40/solr/solr-ref-guide/src/solrcloud-with-legacy-configuration-files.adoc
----------------------------------------------------------------------
diff --git a/solr/solr-ref-guide/src/solrcloud-with-legacy-configuration-files.adoc b/solr/solr-ref-guide/src/solrcloud-with-legacy-configuration-files.adoc
index 155271c..b528abe 100644
--- a/solr/solr-ref-guide/src/solrcloud-with-legacy-configuration-files.adoc
+++ b/solr/solr-ref-guide/src/solrcloud-with-legacy-configuration-files.adoc
@@ -2,19 +2,21 @@
 :page-shortname: solrcloud-with-legacy-configuration-files
 :page-permalink: solrcloud-with-legacy-configuration-files.html
 
+If you are migrating from a non-SolrCloud environment to SolrCloud, this information may be helpful.
+
 All of the required configuration is already set up in the sample configurations shipped with Solr. You only need to add the following if you are migrating old configuration files. Do not remove these files and parameters from a new Solr instance if you intend to use Solr in SolrCloud mode.
 
 These properties exist in 3 files: `schema.xml`, `solrconfig.xml`, and `solr.xml`.
 
-\1. In `schema.xml`, you must have a `_version_` field defined:
-
+. In `schema.xml`, you must have a `_version_` field defined:
++
 [source,xml]
 ----
 <field name="_version_" type="long" indexed="true" stored="true" multiValued="false"/>
 ----
-
-\2. In `solrconfig.xml`, you must have an `UpdateLog` defined. This should be defined in the `updateHandler` section.
-
++
+. In `solrconfig.xml`, you must have an `UpdateLog` defined. This should be defined in the `updateHandler` section.
++
 [source,xml]
 ----
 <updateHandler>
@@ -25,9 +27,9 @@ These properties exist in 3 files: `schema.xml`, `solrconfig.xml`, and `solr.xml
   ...
 </updateHandler>
 ----
-
-\3. The http://wiki.apache.org/solr/UpdateRequestProcessor#Distributed_Updates[DistributedUpdateProcessor] is part of the default update chain and is automatically injected into any of your custom update chains, so you don't actually need to make any changes for this capability. However, should you wish to add it explicitly, you can still add it to the `solrconfig.xml` file as part of an `updateRequestProcessorChain`. For example:
-
++
+. The http://wiki.apache.org/solr/UpdateRequestProcessor#Distributed_Updates[DistributedUpdateProcessor] is part of the default update chain and is automatically injected into any of your custom update chains, so you don't actually need to make any changes for this capability. However, should you wish to add it explicitly, you can still add it to the `solrconfig.xml` file as part of an `updateRequestProcessorChain`. For example:
++
 [source,xml]
 ----
 <updateRequestProcessorChain name="sample">
@@ -37,17 +39,17 @@ These properties exist in 3 files: `schema.xml`, `solrconfig.xml`, and `solr.xml
   <processor class="solr.RunUpdateProcessorFactory" />
 </updateRequestProcessorChain>
 ----
-
++
 If you do not want the DistributedUpdateProcessorFactory auto-injected into your chain (for example, if you want to use SolrCloud functionality, but you want to distribute updates yourself) then specify the `NoOpDistributingUpdateProcessorFactory` update processor factory in your chain:
-
++
 [source,xml]
 ----
 <updateRequestProcessorChain name="sample">
   <processor class="solr.LogUpdateProcessorFactory" />
-  <processor class="solr.NoOpDistributingUpdateProcessorFactory"/>  
+  <processor class="solr.NoOpDistributingUpdateProcessorFactory"/>
   <processor class="my.package.MyDistributedUpdateFactory"/>
   <processor class="solr.RunUpdateProcessorFactory" />
 </updateRequestProcessorChain>
 ----
-
++
 In the update process, Solr skips updating processors that have already been run on other nodes.

http://git-wip-us.apache.org/repos/asf/lucene-solr/blob/d05e3a40/solr/solr-ref-guide/src/solrcloud.adoc
----------------------------------------------------------------------
diff --git a/solr/solr-ref-guide/src/solrcloud.adoc b/solr/solr-ref-guide/src/solrcloud.adoc
index b112597..644b143 100644
--- a/solr/solr-ref-guide/src/solrcloud.adoc
+++ b/solr/solr-ref-guide/src/solrcloud.adoc
@@ -3,7 +3,7 @@
 :page-permalink: solrcloud.html
 :page-children: getting-started-with-solrcloud, how-solrcloud-works, solrcloud-configuration-and-parameters, rule-based-replica-placement, cross-data-center-replication-cdcr-
 
-Apache Solr includes the ability to set up a cluster of Solr servers that combines fault tolerance and high availability. Called **SolrCloud**, these capabilities provide distributed indexing and search capabilities, supporting the following features:
+Apache Solr includes the ability to set up a cluster of Solr servers that combines fault tolerance and high availability. Called *SolrCloud*, these capabilities provide distributed indexing and search capabilities, supporting the following features:
 
 * Central configuration for the entire cluster
 * Automatic load balancing and fail-over for queries

http://git-wip-us.apache.org/repos/asf/lucene-solr/blob/d05e3a40/solr/solr-ref-guide/src/spatial-search.adoc
----------------------------------------------------------------------
diff --git a/solr/solr-ref-guide/src/spatial-search.adoc b/solr/solr-ref-guide/src/spatial-search.adoc
index 2d37c6a..c6da376 100644
--- a/solr/solr-ref-guide/src/spatial-search.adoc
+++ b/solr/solr-ref-guide/src/spatial-search.adoc
@@ -2,7 +2,9 @@
 :page-shortname: spatial-search
 :page-permalink: spatial-search.html
 
-Solr supports location data for use in spatial/geospatial searches. Using spatial search, you can:
+Solr supports location data for use in spatial/geospatial searches.
+
+Using spatial search, you can:
 
 * Index points or other shapes
 * Filter search results by a bounding box or circle or by other shapes
@@ -12,24 +14,25 @@ Solr supports location data for use in spatial/geospatial searches. Using spatia
 There are four main field types available for spatial search:
 
 * `LatLonPointSpatialField`
-* `LatLonType` (now deprecated) and its non-geodetic twin PointType
+* `LatLonType` (now deprecated) and its non-geodetic twin `PointType`
 * `SpatialRecursivePrefixTreeFieldType` (RPT for short), including `RptWithGeometrySpatialField`, a derivative
 * `BBoxField`
 
-LatLonPointSpatialField is the ideal field type for the most common use-cases for lat-lon point data. It replaces LatLonType which still exists for backwards compatibility. RPT offers some more features for more advanced/custom use cases / options like polygons and heatmaps.
+`LatLonPointSpatialField` is the ideal field type for the most common use-cases for lat-lon point data. It replaces `LatLonType`, which still exists for backwards compatibility. RPT offers some more features for more advanced/custom use cases and options like polygons and heatmaps.
 
-RptWithGeometrySpatialField is for indexing and searching non-point data though it can do points too. It can't do sorting/boosting.
+`RptWithGeometrySpatialField` is for indexing and searching non-point data though it can do points too. It can't do sorting/boosting.
 
-BBoxField is for indexing bounding boxes, querying by a box, specifying a search predicate (Intersects,Within,Contains,Disjoint,Equals), and a relevancy sort/boost like overlapRatio or simply the area.
+`BBoxField` is for indexing bounding boxes, querying by a box, specifying a search predicate (Intersects, Within, Contains, Disjoint, Equals), and a relevancy sort/boost like `overlapRatio` or simply the area.
 
 Some esoteric details that are not in this guide can be found at http://wiki.apache.org/solr/SpatialSearch.
 
 [[SpatialSearch-LatLonPointSpatialField]]
 == LatLonPointSpatialField
 
-Here's how LatLonPointSpatialField should usually be configured in the schema:
+Here's how `LatLonPointSpatialField` (LLPSF) should usually be configured in the schema:
 
-`<fieldType name="location" class="solr.LatLonPointSpatialField" docValues="true"/>`
+[source,xml]
+<fieldType name="location" class="solr.LatLonPointSpatialField" docValues="true"/>
 
 LLPSF supports toggling `indexed`, `stored`, `docValues`, and `multiValued`. LLPSF internally uses a 2-dimensional Lucene "Points" (BKD tree) index when "indexed" is enabled (the default). When "docValues" is enabled, a latitude and longitude pair is bit-interleaved into 64 bits and put into Lucene DocValues. The accuracy of the docValues data is about a centimeter.
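The bit-interleaving idea can be sketched in a few lines of Python. This is an illustrative simplification, not Lucene's exact encoding (the real quantization details differ), but it shows why a single 64-bit docValues value can round-trip a lat/lon pair to roughly centimeter accuracy:

```python
def interleave64(lat, lon):
    """Quantize lat/lon into 32-bit ints, then bit-interleave them
    into one 64-bit value (a simplified sketch of the idea)."""
    lat_i = int((lat + 90.0) / 180.0 * (2**32 - 1))   # 0..2^32-1
    lon_i = int((lon + 180.0) / 360.0 * (2**32 - 1))  # 0..2^32-1
    out = 0
    for bit in range(32):
        out |= ((lat_i >> bit) & 1) << (2 * bit + 1)  # odd bits: latitude
        out |= ((lon_i >> bit) & 1) << (2 * bit)      # even bits: longitude
    return out

def deinterleave64(value):
    """Recover the quantized lat/lon pair from the interleaved value."""
    lat_i = lon_i = 0
    for bit in range(32):
        lat_i |= ((value >> (2 * bit + 1)) & 1) << bit
        lon_i |= ((value >> (2 * bit)) & 1) << bit
    lat = lat_i / (2**32 - 1) * 180.0 - 90.0
    lon = lon_i / (2**32 - 1) * 360.0 - 180.0
    return lat, lon
```

With 32 bits per dimension, the quantization step is about 180/2^32 degrees of latitude, i.e. a few millimeters, which is consistent with the "about a centimeter" accuracy stated above.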
 
@@ -38,7 +41,7 @@ LLPSF supports toggling `indexed`, `stored`, `docValues`, and `multiValued`. LLP
 
 For indexing geodetic points (latitude and longitude), supply it in "lat,lon" order (comma separated).
 
-For indexing non-geodetic points, it depends. Use "x y" (a space) if RPT. For PointType however, use "x,y" (a comma).
+For indexing non-geodetic points, it depends. Use `x y` (a space) if RPT. For PointType however, use `x,y` (a comma).
 
 If you'd rather use a standard industry format, Solr supports WKT and GeoJSON. However, it's much bulkier than the raw coordinates for such simple data. (Not supported by the deprecated LatLonType or PointType.)
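To make the format differences concrete, here is the same point (the sample store location used elsewhere on this page) in each syntax; note that WKT and GeoJSON put longitude first:

```text
45.15,-93.85                                    "lat,lon" raw coordinates (geodetic)
POINT(-93.85 45.15)                             WKT (x y, i.e. lon lat)
{"type":"Point","coordinates":[-93.85,45.15]}   GeoJSON
```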
 
@@ -49,7 +52,7 @@ There are two spatial Solr "query parsers" for geospatial search: `geofilt` and
 
 // TODO: This table has cells that won't work with PDF: https://github.com/ctargett/refguide-asciidoc-poc/issues/13
 
-[width="100%",cols="50%,50%",options="header",]
+[width="100%",options="header",]
 |===
 |Parameter |Description
 |d |the radial distance, usually in kilometers. (RPT & BBoxField can set other units via the setting `distanceUnits`)
@@ -67,9 +70,7 @@ There are two spatial Solr "query parsers" for geospatial search: `geofilt` and
 
 [WARNING]
 ====
-
 Don't use this for indexed non-point shapes (e.g., polygons); the results will be erroneous. With RPT, it's only recommended for multi-valued point data, as the implementation doesn't scale very well; for single-valued fields, you should instead use a separate non-RPT field purely for distance sorting.
-
 ====
 
 When used with `BBoxField`, additional options are supported:
@@ -92,29 +93,41 @@ image::images/spatial-search/circle.png[image]
 [[SpatialSearch-bbox]]
 === `bbox`
 
-The `bbox` filter is very similar to `geofilt` except it uses the _bounding box_ of the calculated circle. See the blue box in the diagram below. It takes the same parameters as geofilt. Here's a sample query: `&q=*:*&fq={!bbox sfield=store}&pt=45.15,-93.85&d=5`. The rectangular shape is faster to compute and so it's sometimes used as an alternative to geofilt when it's acceptable to return points outside of the radius. However, if the ideal goal is a circle but you want it to run faster, then instead consider using the RPT field and try a large "distErrPct" value like `0.1` (10% radius). This will return results outside the radius but it will do so somewhat uniformly around the shape.
+The `bbox` filter is very similar to `geofilt` except it uses the _bounding box_ of the calculated circle. See the blue box in the diagram below. It takes the same parameters as geofilt.
+
+Here's a sample query:
+
+`&q=*:*&fq={!bbox sfield=store}&pt=45.15,-93.85&d=5`
+
+The rectangular shape is faster to compute and so it's sometimes used as an alternative to `geofilt` when it's acceptable to return points outside of the radius. However, if the ideal goal is a circle but you want it to run faster, then instead consider using the RPT field and try a large `distErrPct` value like `0.1` (10% radius). This will return results outside the radius but it will do so somewhat uniformly around the shape.
 
 image::images/spatial-search/bbox.png[image]
 
 
 [IMPORTANT]
 ====
-
-When a bounding box includes a pole, the bounding box ends up being a "bounding bowl" (a __spherical cap__) that includes all values north of the lowest latitude of the circle if it touches the north pole (or south of the highest latitude if it touches the south pole).
-
+When a bounding box includes a pole, the bounding box ends up being a "bounding bowl" (a _spherical cap_) that includes all values north of the lowest latitude of the circle if it touches the north pole (or south of the highest latitude if it touches the south pole).
 ====
 
 [[SpatialSearch-Filteringbyanarbitraryrectangle]]
-=== Filtering by an arbitrary rectangle
+=== Filtering by an Arbitrary Rectangle
+
+Sometimes the spatial search requirement calls for finding everything in a rectangular area, such as the area covered by a map the user is looking at. For this case, `geofilt` and `bbox` won't cut it. This is somewhat of a trick, but you can use Solr's range query syntax for this by supplying the lower-left corner as the start of the range and the upper-right corner as the end of the range.
 
-Sometimes the spatial search requirement calls for finding everything in a rectangular area, such as the area covered by a map the user is looking at. For this case, geofilt and bbox won't cut it. This is somewhat of a trick, but you can use Solr's range query syntax for this by supplying the lower-left corner as the start of the range and the upper-right corner as the end of the range. Here's an example: `&q=*:*&fq=store:[45,-94 TO 46,-93]`. LatLonType (deprecated) does *not* support rectangles that cross the dateline. For RPT and BBoxField, if you are non-geospatial coordinates (`geo="false"`) then you must quote the points due to the space, e.g. `"x y"`.
+Here's an example:
+
+`&q=*:*&fq=store:[45,-94 TO 46,-93]`
+
+LatLonType (deprecated) does *not* support rectangles that cross the dateline. For RPT and BBoxField, if you are using non-geospatial coordinates (`geo="false"`) then you must quote the points due to the space, e.g. `"x y"`.
 
 // OLD_CONFLUENCE_ID: SpatialSearch-Optimizing:CacheorNot
 
 [[SpatialSearch-Optimizing_CacheorNot]]
 === Optimizing: Cache or Not
 
-It's most common to put a spatial query into an "fq" parameter – a filter query. By default, Solr will cache the query in the filter cache. If you know the filter query (be it spatial or not) is fairly unique and not likely to get a cache hit then specify `cache="false"` as a local-param as seen in the following example. The only spatial types which stand to benefit from this technique are LatLonPointSpatialField and LatLonType (deprecated). Enable docValues on the field (if it isn't already). LatLonType (deprecated) additionally requires a `cost="100"` (or more) local-param.
+It's most common to put a spatial query into an "fq" parameter – a filter query. By default, Solr will cache the query in the filter cache.
+
+If you know the filter query (be it spatial or not) is fairly unique and not likely to get a cache hit then specify `cache="false"` as a local-param as seen in the following example. The only spatial types which stand to benefit from this technique are `LatLonPointSpatialField` and `LatLonType` (deprecated). Enable docValues on the field (if it isn't already). `LatLonType` (deprecated) additionally requires a `cost="100"` (or more) local-param.
 
 `&q=...mykeywords...&fq=...someotherfilters...&fq={!geofilt cache=false}&sfield=store&pt=45.15,-93.85&d=5`
 
@@ -125,7 +138,14 @@ LLPSF does not support Solr's "PostFilter".
 [[SpatialSearch-DistanceSortingorBoosting_FunctionQueries_]]
 == Distance Sorting or Boosting (Function Queries)
 
-There are four distance function queries: `geodist`, see below, usually the most appropriate; http://wiki.apache.org/solr/FunctionQuery#dist[`dist`], to calculate the p-norm distance between multi-dimensional vectors; http://wiki.apache.org/solr/FunctionQuery#hsin.2C_ghhsin_-_Haversine_Formula[`hsin`], to calculate the distance between two points on a sphere; and https://wiki.apache.org/solr/FunctionQuery#sqedist_-_Squared_Euclidean_Distance[`sqedist`], to calculate the squared Euclidean distance between two points. For more information about these function queries, see the section on <<function-queries.adoc#function-queries,Function Queries>>.
+There are four distance function queries:
+
+* `geodist`, see below, usually the most appropriate;
* http://wiki.apache.org/solr/FunctionQuery#dist[`dist`], to calculate the p-norm distance between multi-dimensional vectors;
+* http://wiki.apache.org/solr/FunctionQuery#hsin.2C_ghhsin_-_Haversine_Formula[`hsin`], to calculate the distance between two points on a sphere;
+* https://wiki.apache.org/solr/FunctionQuery#sqedist_-_Squared_Euclidean_Distance[`sqedist`], to calculate the squared Euclidean distance between two points.
+
+For more information about these function queries, see the section on <<function-queries.adoc#function-queries,Function Queries>>.
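For intuition, `geodist` and `hsin` boil down to a great-circle distance computed with the haversine formula. Here is a minimal sketch in Python (illustrative only: the function name and radius constant are assumptions, not Solr API; Solr's actual implementation lives in Lucene's spatial utilities):

```python
import math

# Mean Earth radius in km; treat the exact constant as an assumption,
# not necessarily the value Solr uses internally.
EARTH_RADIUS_KM = 6371.0087714

def haversine_km(lat1, lon1, lat2, lon2, radius_km=EARTH_RADIUS_KM):
    """Great-circle distance between two lat/lon points, in km."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlmb = math.radians(lon2 - lon1)
    # 'a' is the square of half the chord length between the points
    a = math.sin(dphi / 2) ** 2 + math.cos(phi1) * math.cos(phi2) * math.sin(dlmb / 2) ** 2
    return 2 * radius_km * math.asin(math.sqrt(a))

# Distance from the example point used elsewhere on this page (45.15,-93.85)
# to a second, hypothetical point:
print(round(haversine_km(45.15, -93.85, 44.98, -93.27), 1))
```

A quarter of the way around the equator comes out to roughly 10,007 km with this radius, which is a handy sanity check.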
 
 [[SpatialSearch-geodist]]
 === `geodist`
@@ -169,16 +189,16 @@ Using the <<the-dismax-query-parser.adoc#the-dismax-query-parser,DisMax>> or <<t
 
 RPT refers to either `SpatialRecursivePrefixTreeFieldType` (aka simply RPT) or an extended version: `RptWithGeometrySpatialField` (aka RPT with Geometry). RPT offers several functional improvements over `LatLonPointSpatialField`:
 
-* Non-geodetic – geo=false general x & y (__not__ latitude and longitude)
+* Non-geodetic – `geo=false`: general x & y (_not_ latitude and longitude)
 * Query by polygons and other complex shapes, in addition to circles & rectangles
 * Ability to index non-point shapes (e.g. polygons) as well as points – see RptWithGeometrySpatialField
 * Heatmap grid faceting
 
-RPT _shares_ various features in common with LatLonPointSpatialField. Some are listed here:
+RPT _shares_ various features in common with `LatLonPointSpatialField`. Some are listed here:
 
 * Latitude/Longitude indexed point data; possibly multi-valued
-* Fast filtering with geofilt, bbox filters, and range query syntax (dateline crossing is supported)
-* Sort/boost via geodist
+* Fast filtering with `geofilt`, `bbox` filters, and range query syntax (dateline crossing is supported)
+* Sort/boost via `geodist`
 * Well-Known-Text (WKT) shape syntax (required for specifying polygons & other complex shapes), and GeoJSON too. In addition to indexing and searching, this works with the `wt=geojson` (GeoJSON Solr response-writer) and `[geo f=myfield]` (geo Solr document-transformer).
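
For instance, a hedged sketch of exercising those two output features (the field name `mygeofield` is hypothetical; the GeoJSON response writer is selected with `wt=geojson` and told which field holds the geometry via the `geojson.field` parameter):

[source,plain]
----
&q=*:*&fl=id,mygeofield&wt=geojson&geojson.field=mygeofield
&q=*:*&fl=id,[geo f=mygeofield]
----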
 
 [[SpatialSearch-Schemaconfiguration]]
@@ -188,7 +208,7 @@ To use RPT, the field type must be registered and configured in `schema.xml`. Th
 
 // TODO: This table has cells that won't work with PDF: https://github.com/ctargett/refguide-asciidoc-poc/issues/13
 
-[width="100%",cols="50%,50%",options="header",]
+[width="100%",options="header",]
 |===
 |Setting |Description
 |name |The name of the field type.
@@ -211,12 +231,16 @@ This is used to specify the units for distance measurements used throughout the
 |maxLevels |Sets the maximum grid depth for indexed data. Instead, it's usually more intuitive to compute an appropriate `maxLevels` by specifying `maxDistErr`.
 |===
 
-*_And there are others:_* `normWrapLongitude` _,_ `datelineRule`, `validationRule`, `autoIndex`, `allowMultiOverlap`, `precisionModel`. For further info, see notes below about spatialContextFactory implementations referenced above, especially the link to the JTS based one.
+*_And there are others:_* `normWrapLongitude`, `datelineRule`, `validationRule`, `autoIndex`, `allowMultiOverlap`, `precisionModel`. For further info, see notes below about `spatialContextFactory` implementations referenced above, especially the link to the JTS based one.
 
 [[SpatialSearch-JTSandPolygons]]
 === JTS and Polygons
 
-As indicated above, `spatialContextFactory` must be set to `JTS` for polygon support, including multi-polygon. All other shapes, including even line-strings, are supported without JTS. JTS stands for http://sourceforge.net/projects/jts-topo-suite/[JTS Topology Suite], which does not come with Solr due to its LGPL license. You must download it (a JAR file) and put that in a special location internal to Solr: `SOLR_INSTALL/server/solr-webapp/webapp/WEB-INF/lib/`. You can readily download it here: https://repo1.maven.org/maven2/com/vividsolutions/jts-core/. It will not work if placed in other more typical Solr lib directories, unfortunately. When activated, there are additional configuration attributes available; see https://locationtech.github.io/spatial4j/apidocs/org/locationtech/spatial4j/context/jts/JtsSpatialContextFactory.html[org.locationtech.spatial4j.context.jts.JtsSpatialContextFactory] for the Javadocs, and remember to look at the superclass's options in as well. One option in particular you should most likely enable is `autoIndex` (i.e., use JTS's PreparedGeometry) as it's been shown to be a major performance boost for non-trivial polygons.
+As indicated above, `spatialContextFactory` must be set to `JTS` for polygon support, including multi-polygon.
+
+All other shapes, including even line-strings, are supported without JTS. JTS stands for http://sourceforge.net/projects/jts-topo-suite/[JTS Topology Suite], which does not come with Solr due to its LGPL license. You must download it (a JAR file) and put it in a special location internal to Solr: `SOLR_INSTALL/server/solr-webapp/webapp/WEB-INF/lib/`. You can readily download it here: https://repo1.maven.org/maven2/com/vividsolutions/jts-core/. It will not work if placed in other, more typical Solr lib directories, unfortunately.
+
+When activated, there are additional configuration attributes available; see https://locationtech.github.io/spatial4j/apidocs/org/locationtech/spatial4j/context/jts/JtsSpatialContextFactory.html[org.locationtech.spatial4j.context.jts.JtsSpatialContextFactory] for the Javadocs, and remember to look at the superclass's options as well. One option in particular you should most likely enable is `autoIndex` (i.e., use JTS's PreparedGeometry) as it's been shown to be a major performance boost for non-trivial polygons.
 
 [source,xml]
 ----
@@ -233,13 +257,12 @@ Once the field type has been defined, define a field that uses it.
 
 Here's an example polygon query for a field "geo" that can be either `solr.SpatialRecursivePrefixTreeFieldType` or `RptWithGeometrySpatialField`:
 
-....
+[source,plain]
 &q=*:*&fq={!field f=geo}Intersects(POLYGON((-10 30, -40 40, -10 -20, 40 20, 0 0, -10 30)))
-....
 
 Inside the parentheses following the search predicate is the shape definition. The format of that shape is governed by the `format` attribute on the field type, defaulting to WKT. If you prefer GeoJSON, you can specify that instead.
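
For instance, if the field type's `format` attribute were set to GeoJSON, the same shape as the WKT polygon above could be written as follows (a sketch; note that GeoJSON coordinates are `[x, y]` pairs, i.e., longitude before latitude for geodetic fields):

[source,plain]
----
{"type":"Polygon","coordinates":[[[-10,30],[-40,40],[-10,-20],[40,20],[0,0],[-10,30]]]}
----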
 
-*Beyond this reference guide and Spatila4j's docs, there are some details that remain at the Solr Wiki at* http://wiki.apache.org/solr/SolrAdaptersForLuceneSpatial4
+Beyond this Reference Guide and Spatial4j's docs, there are some details that remain at the Solr Wiki at http://wiki.apache.org/solr/SolrAdaptersForLuceneSpatial4.
 
 [[SpatialSearch-RptWithGeometrySpatialField]]
 === RptWithGeometrySpatialField
@@ -267,7 +290,7 @@ The RPT field supports generating a 2D grid of facet counts for documents having
 
 The heatmap feature is accessed from Solr's faceting feature. As a part of faceting, it supports the `key` local parameter as well as excluding tagged filter queries, just like other types of faceting do. This allows multiple heatmaps to be returned on the same field with different filters.
 
-[width="100%",cols="50%,50%",options="header",]
+[width="100%",options="header",]
 |===
 |Parameter |Description
 |facet |Set to `true` to enable faceting
@@ -279,17 +302,16 @@ The heatmap feature is accessed from Solr's faceting feature. As a part of facet
 |facet.heatmap.format |The format, either `ints2D` (default) or `png`.
 |===
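
To make the parameters above concrete, here is a hedged Python sketch of how a client might map one cell of the returned `counts_ints2D` grid back to a geographic bounding box, using the `columns`/`rows`/`minX`/`maxX`/`minY`/`maxY` fields that accompany the counts (the function and the top-down row ordering are assumptions to verify against your Solr version, not documented API):

```python
def heatmap_cell_bounds(hm, row, col):
    """Bounding box (minX, minY, maxX, maxY) of one heatmap grid cell.

    Assumes rows are returned top-down (row 0 touches maxY), which
    is worth verifying against your Solr version.
    """
    cell_w = (hm["maxX"] - hm["minX"]) / hm["columns"]
    cell_h = (hm["maxY"] - hm["minY"]) / hm["rows"]
    x0 = hm["minX"] + col * cell_w          # left edge of this column
    y1 = hm["maxY"] - row * cell_h          # top edge of this row
    return (x0, y1 - cell_h, x0 + cell_w, y1)

# A world-spanning 64x64 grid, as in the sample response shown later
# in this section:
hm = {"columns": 64, "rows": 64,
      "minX": -180.0, "maxX": 180.0, "minY": -90.0, "maxY": 90.0}
print(heatmap_cell_bounds(hm, 0, 0))  # top-left cell
```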
 
-.Tip
-[NOTE]
+[TIP]
 ====
 
 You'll want to experiment with different `distErrPct` values (probably 0.10 to 0.20) with various input geometries until the default size is what you're looking for. The specific details of how it's computed aren't important. For high-detail grids used in point-plotting (loosely one cell per pixel), set `distErr` to be the number of decimal degrees of several pixels or so of the map being displayed. Also, you probably don't want to use a geohash-based grid because the cell orientation between grid levels flip-flops between being square and rectangular. Quad is consistent and has more levels, albeit at the expense of a larger index.
 
 ====
 
-Here's some sample output in JSON (with some ..... inserted for brevity):
+Here's some sample output in JSON (with "..." inserted for brevity):
 
-[source,java]
+[source,plain]
 ----
 {gridLevel=6,columns=64,rows=64,minX=-180.0,maxX=180.0,minY=-90.0,maxY=90.0,
 counts_ints2D=[[0, 0, 2, 1, ....],[1, 1, 3, 2, ...],...]}
@@ -322,14 +344,14 @@ To index a box, add a field value to a bbox field that's a string in the WKT/CQL
 
 To search, you can use the `{!bbox}` query parser, or the range syntax, e.g., `[10,-10 TO 15,20]`, or the ENVELOPE syntax wrapped in parentheses with a leading search predicate. The latter is the only way to choose a predicate other than Intersects. For example:
 
-....
+[source,plain]
 &q={!field f=bbox}Contains(ENVELOPE(-10, 20, 15, 10))
-....
+
 
 Now to sort the results by one of the relevancy modes, use it like this:
 
-....
+[source,plain]
 &q={!field f=bbox score=overlapRatio}Intersects(ENVELOPE(-10, 20, 15, 10))
-....
+
 
 The `score` local parameter can be one of `overlapRatio`, `area`, or `area2D`. `area` scores by the document area using surface-of-a-sphere (assuming `geo=true`) math, while `area2D` uses simple width * height. `overlapRatio` computes a [0-1] ranged score based on how much overlap exists relative to the document's area and the query area. The javadocs of {lucene-javadocs}/spatial-extras/org/apache/lucene/spatial/bbox/BBoxOverlapRatioValueSource.html[BBoxOverlapRatioValueSource] have more info on the formula. There is an additional parameter `queryTargetProportion` that allows you to weight the query side of the formula against the index (target) side of the formula. You can also use `&debug=results` to see useful score computation info.
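
As a rough illustration of the `overlapRatio` idea (this is an intentional simplification, not Lucene's exact `BBoxOverlapRatioValueSource` formula, which also handles degenerate zero-area cases; see its javadocs for the real thing), a blended overlap score for 2D boxes might look like:

```python
def overlap_ratio(query, doc, query_target_proportion=0.25):
    """Blend of (overlap/queryArea) and (overlap/docArea) for 2D boxes.

    Boxes are (minX, minY, maxX, maxY) tuples. The 0.25 default weight
    is an assumption for illustration, not necessarily Solr's default.
    """
    # Width and height of the intersection rectangle (0 if disjoint)
    ox = max(0.0, min(query[2], doc[2]) - max(query[0], doc[0]))
    oy = max(0.0, min(query[3], doc[3]) - max(query[1], doc[1]))
    overlap = ox * oy
    if overlap == 0.0:
        return 0.0
    area = lambda b: (b[2] - b[0]) * (b[3] - b[1])
    q_side = overlap / area(query)   # how much of the query is covered
    t_side = overlap / area(doc)     # how much of the document is covered
    return query_target_proportion * q_side + (1 - query_target_proportion) * t_side

# A doc box covering half the query box (and vice versa):
print(overlap_ratio((0, 0, 10, 10), (5, 0, 15, 10)))  # prints 0.5
```

Raising `query_target_proportion` favors documents that fill more of the query box; lowering it favors documents that are themselves mostly inside the overlap.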