Posted to commits@nlpcraft.apache.org by ar...@apache.org on 2020/09/13 23:16:20 UTC

[incubator-nlpcraft] branch NLPCRAFT-116 updated: Code review.

This is an automated email from the ASF dual-hosted git repository.

aradzinski pushed a commit to branch NLPCRAFT-116
in repository https://gitbox.apache.org/repos/asf/incubator-nlpcraft.git


The following commit(s) were added to refs/heads/NLPCRAFT-116 by this push:
     new 6f5a361  Code review.
6f5a361 is described below

commit 6f5a361ed79eac39504a97f496f68496b6d071fc
Author: Aaron Radzinski <ar...@datalingvo.com>
AuthorDate: Sun Sep 13 16:16:07 2020 -0700

    Code review.
---
 nlpcraft/src/main/resources/nlpcraft.conf          | 23 ++++++++++++----------
 .../nlpcraft/common/config/NCConfigurable.scala    |  7 +++----
 .../org/apache/nlpcraft/examples/alarm/README.md   |  2 +-
 .../org/apache/nlpcraft/examples/echo/README.md    |  2 +-
 .../apache/nlpcraft/examples/helloworld/README.md  |  2 +-
 .../apache/nlpcraft/examples/lightswitch/README.md |  2 +-
 .../org/apache/nlpcraft/examples/phone/README.md   |  5 ++---
 .../org/apache/nlpcraft/examples/sql/README.md     |  2 +-
 .../org/apache/nlpcraft/examples/time/README.md    |  2 +-
 .../org/apache/nlpcraft/examples/weather/README.md |  2 +-
 10 files changed, 25 insertions(+), 24 deletions(-)

diff --git a/nlpcraft/src/main/resources/nlpcraft.conf b/nlpcraft/src/main/resources/nlpcraft.conf
index 72ae0f9..edad00b 100644
--- a/nlpcraft/src/main/resources/nlpcraft.conf
+++ b/nlpcraft/src/main/resources/nlpcraft.conf
@@ -17,17 +17,20 @@
 
 #
 # This is a joint configuration file for both the server and the data probes. Note that
-# server and probe configuration can be placed into separate files. You can also provide
-# configuration properties or override the default ones via environment variables.
+# server and probe configuration can be placed into separate files, each file containing only
+# the 'nlpcraft.server' or 'nlpcraft.probe' sub-section, respectively.
 #
-# To use environment variables:
+# You can also provide configuration properties or override the default ones via environment variables.
+# To use environment variable overrides:
 # 1. Set probe or server JVM system property -Dconfig.override_with_env_vars=true
 # 2. For each configuration 'x.y.z' set the environment variable CONFIG_FORCE_x_y_z=some_value
 #
-# Examples of environment variable usage:
-#   CONFIG_FORCE_nlpcraft_server_rest_host=localhost
-#   CONFIG_FORCE_nlpcraft_server_lifecycle.0=org.apache.nlpcraft.server.lifecycle.opencensus.NCStackdriverTraceExporter
-#   CONFIG_FORCE_nlpcraft_server_lifecycle.1=org.apache.nlpcraft.server.lifecycle.opencensus.NCStackdriverStatsExporter
+# Examples of environment variable overrides:
+#   -- Overrides 'nlpcraft.server.rest.host' configuration property.
+#   CONFIG_FORCE_nlpcraft_server_rest_host="localhost"
+#
+#   -- Overrides 'nlpcraft.probe.models' configuration property.
+#   CONFIG_FORCE_nlpcraft_probe_models="com.models.MyModel"
 #
 # See https://nlpcraft.apache.org/server-and-probe.html for more details.
 #
@@ -38,7 +41,7 @@ nlpcraft {
     # | REST server configuration. |
     # +----------------------------+
     server {
-        # Specify class names for server lifecycle components.
+        # Comma-separated list of class names for server lifecycle components.
         # Each class should implement 'NCServerLifecycle' interface/trait and provide a no-arg constructor.
         #
         # The following built-in OpenCensus exporters are supported as lifecycle components:
@@ -180,7 +183,7 @@ nlpcraft {
         # Supported formats: MDY, DMY, YMD.
         datesFormatStyle = MDY
 
-        # Enabled built-in token providers (each token represents a named entity).
+        # Comma-separated list of enabled built-in token providers (each token represents a named entity).
         # User models can only use built-in tokens from the token providers configured here.
         #
         # Supported values:
@@ -198,7 +201,7 @@ nlpcraft {
         # See Integrations section (https://nlpcraft.apache.org/integrations.html) for details on how to
         # configure 3rd party token providers.
         # By default, only NLPCraft tokens are enabled and can be used by the user data models.
-        tokenProviders = nlpcraft
+        tokenProviders = "nlpcraft"
 
         # If Spacy is enabled as a token provider (value 'spacy') - defines Spacy proxy URL.
         # spacy.proxy.url=http://localhost:5002
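
The override mechanism described in the comments above comes from the Lightbend (Typesafe) Config library that backs 'nlpcraft.conf'. Below is a minimal standalone Scala sketch, not NLPCraft's own loading code, showing how a CONFIG_FORCE_* environment variable surfaces through that library and how the comma-separated 'tokenProviders' value can be consumed; the resource base name passed to load() is an assumption made for the sketch only.

    import com.typesafe.config.ConfigFactory

    object ConfigOverrideSketch extends App {
        // With the JVM started with '-Dconfig.override_with_env_vars=true', the library
        // applies any CONFIG_FORCE_x_y_z environment variables on top of the loaded file,
        // e.g. CONFIG_FORCE_nlpcraft_server_rest_host=localhost overrides the
        // 'nlpcraft.server.rest.host' property.
        val cfg = ConfigFactory.load("nlpcraft") // Assumed resource base name.

        // 'tokenProviders' is a single comma-separated string rather than a HOCON array,
        // so a consumer would split it on commas:
        val tokenProviders = cfg.getString("nlpcraft.server.tokenProviders")
            .split(",").map(_.trim).filter(_.nonEmpty).toSeq

        println(tokenProviders)
    }

Started with CONFIG_FORCE_nlpcraft_server_tokenProviders="nlpcraft, google" in the environment and the VM switch above, the sketch would print List(nlpcraft, google).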
diff --git a/nlpcraft/src/main/scala/org/apache/nlpcraft/common/config/NCConfigurable.scala b/nlpcraft/src/main/scala/org/apache/nlpcraft/common/config/NCConfigurable.scala
index be199b1..1cb4d7a 100644
--- a/nlpcraft/src/main/scala/org/apache/nlpcraft/common/config/NCConfigurable.scala
+++ b/nlpcraft/src/main/scala/org/apache/nlpcraft/common/config/NCConfigurable.scala
@@ -276,10 +276,9 @@ object NCConfigurable extends LazyLogging {
       *   2. Use environment variables in the form 'CONFIG_FORCE_x_y_z' to override configuration
       *      property 'x.y.z' from the file.
       * <p>
-      * Examples: TODO:
-      *   CONFIG_FORCE_nlpcraft_server_rest_host=localhost
-      *   CONFIG_FORCE_nlpcraft_server_lifecycle.0=org.apache.nlpcraft.server.lifecycle.opencensus.NCStackdriverTraceExporter
-      *   CONFIG_FORCE_nlpcraft_server_lifecycle.1=org.apache.nlpcraft.server.lifecycle.opencensus.NCStackdriverStatsExporter
+      * Examples:
+      *   CONFIG_FORCE_nlpcraft_server_rest_host="localhost"
+      *   CONFIG_FORCE_nlpcraft_probe_models="com.mymodels.MyModel"
       *
       * @param overrideCfg Optional overriding configuration.
       * @param cfgFileOpt Optional file name.
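
To make the naming convention in the scaladoc above concrete, here is a tiny hypothetical helper (illustration only, not part of NCConfigurable) showing the basic mapping between a CONFIG_FORCE_* variable name and the configuration property it overrides; the real Lightbend Config implementation additionally handles escape sequences for dashes and literal underscores.

    object EnvVarNameSketch extends App {
        // Drop the 'CONFIG_FORCE_' prefix and turn each single underscore into a dot.
        // (The real library also maps '__' to '-' and '___' to '_'.)
        def toPropertyPath(envVar: String): String =
            envVar.stripPrefix("CONFIG_FORCE_").replace("_", ".")

        println(toPropertyPath("CONFIG_FORCE_nlpcraft_server_rest_host")) // nlpcraft.server.rest.host
        println(toPropertyPath("CONFIG_FORCE_nlpcraft_probe_models"))     // nlpcraft.probe.models
    }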
diff --git a/nlpcraft/src/main/scala/org/apache/nlpcraft/examples/alarm/README.md b/nlpcraft/src/main/scala/org/apache/nlpcraft/examples/alarm/README.md
index 232e977..0f63d39 100644
--- a/nlpcraft/src/main/scala/org/apache/nlpcraft/examples/alarm/README.md
+++ b/nlpcraft/src/main/scala/org/apache/nlpcraft/examples/alarm/README.md
@@ -44,7 +44,7 @@ If not using built-in test framework (i.e. not using embedded probe) you need to
  * Run probe standalone and use your own [REST client](https://nlpcraft.apache.org/using-rest.html):
     * **Main class:** `org.apache.nlpcraft.NCStart`
     * **VM arguments:** `-Dconfig.override_with_env_vars=true`
-    * **Environment variables:** `CONFIG_FORCE_nlpcraft_probe_models.0=org.apache.nlpcraft.examples.alarm.AlarmModel`
+    * **Environment variables:** `CONFIG_FORCE_nlpcraft_probe_models="org.apache.nlpcraft.examples.alarm.AlarmModel"`
     * **Program arguments:** `-probe`
      
 ### Documentation
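
The run configuration above (and the analogous ones in the other example READMEs updated below) translates directly into a small launcher. The following Scala sketch is a hypothetical equivalent, assuming the NLPCraft jars and the example model are already on the classpath; note that the CONFIG_FORCE_* variable must be present in the process environment before the JVM starts, since it cannot be injected from inside the running JVM.

    object StartAlarmProbeSketch extends App {
        // Same switch as the '-Dconfig.override_with_env_vars=true' VM argument above.
        System.setProperty("config.override_with_env_vars", "true")

        // Expects CONFIG_FORCE_nlpcraft_probe_models="org.apache.nlpcraft.examples.alarm.AlarmModel"
        // to be set in this process' environment.
        org.apache.nlpcraft.NCStart.main(Array("-probe"))
    }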
diff --git a/nlpcraft/src/main/scala/org/apache/nlpcraft/examples/echo/README.md b/nlpcraft/src/main/scala/org/apache/nlpcraft/examples/echo/README.md
index 6170884..0be1754 100644
--- a/nlpcraft/src/main/scala/org/apache/nlpcraft/examples/echo/README.md
+++ b/nlpcraft/src/main/scala/org/apache/nlpcraft/examples/echo/README.md
@@ -47,7 +47,7 @@ If not using built-in test framework (i.e. not using embedded probe) you need to
  * Run probe standalone and use your own [REST client](https://nlpcraft.apache.org/using-rest.html):
     * **Main class:** `org.apache.nlpcraft.NCStart`
     * **VM arguments:** `-Dconfig.override_with_env_vars=true`
-    * **Environment variables:** `CONFIG_FORCE_nlpcraft_probe_models.0=org.apache.nlpcraft.examples.echo.EchoModel`
+    * **Environment variables:** `CONFIG_FORCE_nlpcraft_probe_models="org.apache.nlpcraft.examples.echo.EchoModel"`
     * **Program arguments:** `-probe`
     
 ### Documentation  
diff --git a/nlpcraft/src/main/scala/org/apache/nlpcraft/examples/helloworld/README.md b/nlpcraft/src/main/scala/org/apache/nlpcraft/examples/helloworld/README.md
index 9d280b1..edeaecc 100644
--- a/nlpcraft/src/main/scala/org/apache/nlpcraft/examples/helloworld/README.md
+++ b/nlpcraft/src/main/scala/org/apache/nlpcraft/examples/helloworld/README.md
@@ -42,7 +42,7 @@ If not using built-in test framework (i.e. not using embedded probe) you need to
  * Run probe standalone and use your own [REST client](https://nlpcraft.apache.org/using-rest.html):
     * **Main class:** `org.apache.nlpcraft.NCStart`
     * **VM arguments:** `-Dconfig.override_with_env_vars=true`
-    * **Environment variables:** `CONFIG_FORCE_nlpcraft_probe_models.0=org.apache.nlpcraft.examples.helloworld.HelloWorldModel`
+    * **Environment variables:** `CONFIG_FORCE_nlpcraft_probe_models="org.apache.nlpcraft.examples.helloworld.HelloWorldModel"`
     * **Program arguments:** `-probe`
  
 ### Documentation
diff --git a/nlpcraft/src/main/scala/org/apache/nlpcraft/examples/lightswitch/README.md b/nlpcraft/src/main/scala/org/apache/nlpcraft/examples/lightswitch/README.md
index 402d8d7..289f3b4 100644
--- a/nlpcraft/src/main/scala/org/apache/nlpcraft/examples/lightswitch/README.md
+++ b/nlpcraft/src/main/scala/org/apache/nlpcraft/examples/lightswitch/README.md
@@ -44,7 +44,7 @@ If not using built-in test framework (i.e. not using embedded probe) you need to
  * Run data probe standalone and use your own [REST client](https://nlpcraft.apache.org/using-rest.html):
     * **Main class:** `org.apache.nlpcraft.NCStart`
     * **VM arguments:** `-Dconfig.override_with_env_vars=true`
-    * **Environment variables:** `CONFIG_FORCE_nlpcraft_probe_models.0=org.apache.nlpcraft.examples.lightswitch.LightSwitchModel`
+    * **Environment variables:** `CONFIG_FORCE_nlpcraft_probe_models="org.apache.nlpcraft.examples.lightswitch.LightSwitchModel"`
     * **Program arguments:** `-probe`
 
 ### Documentation
diff --git a/nlpcraft/src/main/scala/org/apache/nlpcraft/examples/phone/README.md b/nlpcraft/src/main/scala/org/apache/nlpcraft/examples/phone/README.md
index cc6c948..29ad3f4 100644
--- a/nlpcraft/src/main/scala/org/apache/nlpcraft/examples/phone/README.md
+++ b/nlpcraft/src/main/scala/org/apache/nlpcraft/examples/phone/README.md
@@ -36,8 +36,7 @@ embedded probe and starts it automatically:
     * **Main class:** `org.apache.nlpcraft.NCStart`
     * **VM arguments:** `-Dconfig.override_with_env_vars=true`
     * **Environment variables:** 
-      * `CONFIG_FORCE_nlpcraft_server_tokenProviders.0=nlpcraft`
-      * `CONFIG_FORCE_nlpcraft_server_tokenProviders.1=google`
+      * `CONFIG_FORCE_nlpcraft_server_tokenProviders="nlpcraft, google"`
     * **Program arguments:** `-server`
  * Test using built-in model auto-validator:
     * **Main class:** `org.apache.nlpcraft.model.tools.test.NCTestAutoModelValidator`
@@ -47,7 +46,7 @@ If not using built-in test framework (i.e. not using embedded probe) you need to
  * Run probe standalone and use your own [REST client](https://nlpcraft.apache.org/using-rest.html):
     * **Main class:** `org.apache.nlpcraft.NCStart`
     * **VM arguments:** `-Dconfig.override_with_env_vars=true`
-    * **Environment variables:** `CONFIG_FORCE_nlpcraft_probe_models.0=org.apache.nlpcraft.examples.phone.PhoneModel`
+    * **Environment variables:** `CONFIG_FORCE_nlpcraft_probe_models="org.apache.nlpcraft.examples.phone.PhoneModel"`
     * **Program arguments:** `-probe`
 
 ### Documentation  
diff --git a/nlpcraft/src/main/scala/org/apache/nlpcraft/examples/sql/README.md b/nlpcraft/src/main/scala/org/apache/nlpcraft/examples/sql/README.md
index 1a821bf..8ca409c 100644
--- a/nlpcraft/src/main/scala/org/apache/nlpcraft/examples/sql/README.md
+++ b/nlpcraft/src/main/scala/org/apache/nlpcraft/examples/sql/README.md
@@ -45,7 +45,7 @@ If not using built-in test framework (i.e. not using embedded probe) you need to
  * To run probe standalone and use your own [REST client](https://nlpcraft.apache.org/using-rest.html):
     * **Main class:** `org.apache.nlpcraft.NCStart`
     * **VM arguments:** `-Dconfig.override_with_env_vars=true`
-    * **Environment variables:** `CONFIG_FORCE_nlpcraft_probe_models.0=org.apache.nlpcraft.examples.sql.SqlModel`
+    * **Environment variables:** `CONFIG_FORCE_nlpcraft_probe_models="org.apache.nlpcraft.examples.sql.SqlModel"`
     * **Program arguments:** `-probe`
     
     When running the data probe standalone you need to run the H2 database server manually (from command line or IDE):
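
As an illustration only (not necessarily the exact command the example README uses), the H2 TCP server can also be started programmatically when the H2 jar is on the classpath:

    import org.h2.tools.Server

    object StartH2Sketch extends App {
        // Starts an H2 TCP server on its default port (9092) so that the standalone
        // data probe can reach the example database.
        val srv = Server.createTcpServer().start()
        println(srv.getStatus)
    }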
diff --git a/nlpcraft/src/main/scala/org/apache/nlpcraft/examples/time/README.md b/nlpcraft/src/main/scala/org/apache/nlpcraft/examples/time/README.md
index 9bbec40..f08da3b 100644
--- a/nlpcraft/src/main/scala/org/apache/nlpcraft/examples/time/README.md
+++ b/nlpcraft/src/main/scala/org/apache/nlpcraft/examples/time/README.md
@@ -42,7 +42,7 @@ If not using built-in test framework (i.e. not using embedded probe) you need to
  * Run probe standalone and use your own [REST client](https://nlpcraft.apache.org/using-rest.html):
     * **Main class:** `org.apache.nlpcraft.NCStart`
     * **VM arguments:** `-Dconfig.override_with_env_vars=true`
-    * **Environment variables:** `CONFIG_FORCE_nlpcraft_probe_models.0=org.apache.nlpcraft.examples.time.TimeModel`
+    * **Environment variables:** `CONFIG_FORCE_nlpcraft_probe_models="org.apache.nlpcraft.examples.time.TimeModel"`
     * **Program arguments:** `-probe`
 
 ### Documentation  
diff --git a/nlpcraft/src/main/scala/org/apache/nlpcraft/examples/weather/README.md b/nlpcraft/src/main/scala/org/apache/nlpcraft/examples/weather/README.md
index cf36ee7..925b4a1 100644
--- a/nlpcraft/src/main/scala/org/apache/nlpcraft/examples/weather/README.md
+++ b/nlpcraft/src/main/scala/org/apache/nlpcraft/examples/weather/README.md
@@ -43,7 +43,7 @@ If not using built-in test framework (i.e. not using embedded probe) you need to
  * Run probe standalone and use your own [REST client](https://nlpcraft.apache.org/using-rest.html):
     * **Main class:** `org.apache.nlpcraft.NCStart`
     * **VM arguments:** `-Dconfig.override_with_env_vars=true`
-    * **Environment variables:** `CONFIG_FORCE_nlpcraft_probe_models.0=org.apache.nlpcraft.examples.weather.WeatherModel`
+    * **Environment variables:** `CONFIG_FORCE_nlpcraft_probe_models="org.apache.nlpcraft.examples.weather.WeatherModel"`
     * **Program arguments:** `-probe`
 
 ### Documentation