Posted to commits@nlpcraft.apache.org by ar...@apache.org on 2021/05/11 18:18:32 UTC
[incubator-nlpcraft-website] branch master updated: Update server-and-probe.html
This is an automated email from the ASF dual-hosted git repository.
aradzinski pushed a commit to branch master
in repository https://gitbox.apache.org/repos/asf/incubator-nlpcraft-website.git
The following commit(s) were added to refs/heads/master by this push:
new 837e4c5 Update server-and-probe.html
837e4c5 is described below
commit 837e4c5b52d2ddc0728d3070bd85307fa866d5fb
Author: Aaron Radzinski <ar...@apache.org>
AuthorDate: Tue May 11 11:18:21 2021 -0700
Update server-and-probe.html
---
server-and-probe.html | 12 ++++++++++--
1 file changed, 10 insertions(+), 2 deletions(-)
diff --git a/server-and-probe.html b/server-and-probe.html
index 4a872ad..ff0cda5 100644
--- a/server-and-probe.html
+++ b/server-and-probe.html
@@ -49,7 +49,7 @@ id: server_and_probe
<a href="/download.html#zip">Binary</a> NLPCraft ZIP download comes with a single executable JAR file that includes all
necessary dependencies (except for examples): <code>build/<b>apache-nlpcraft-incubating-{{site.latest_version}}-all-deps.jar</b></code>.
This single all-inclusive JAR file can be used to start any NLPCraft runtime components as standard
- Java applications and includes binary classes for:
+ Java applications:
</p>
<ul>
<li><a href="#server">REST Server</a></li>
@@ -238,7 +238,8 @@ id: server_and_probe
<div class="tab-content">
<div class="tab-pane fade show active" id="nav-probe-script" role="tabpanel">
<pre class="brush: bash">
- $ bin/nlpcraft.sh start-probe
+ $ bin/nlpcraft.sh start-probe --cp=/path/to/model/classes
+ $ bin/nlpcraft.sh start-probe --cp=/path/to/model/classes --mdls=com.package.MyModel
</pre>
<p>
<b>NOTES:</b>
@@ -250,6 +251,13 @@ id: server_and_probe
for <i class="fab fa-fw fa-windows"></i>.
</li>
<li>
+ Mandatory <code>--cp</code> parameter provides additional JVM classpath for the models to deploy in this probe.
+ </li>
+ <li>
+ Optional <code>--mdls</code> parameter can be used to specify one or more specific models to deploy if more than
+ one model is available.
+ </li>
+ <li>
Run <code class="script">bin/nlpcraft.sh help --cmd=start-probe</code> to get a full help on this command.
</li>
</ul>
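For context, the two commands introduced by this commit could be invoked as in the sketch below. The classpath directory and the model class name are the illustrative placeholders from the diff itself, not real artifacts:

```shell
# Start a probe, adding the directory with compiled model classes
# to the JVM classpath (placeholder path).
bin/nlpcraft.sh start-probe --cp=/path/to/model/classes

# Deploy only one specific model when several are present on the
# classpath (placeholder class name).
bin/nlpcraft.sh start-probe --cp=/path/to/model/classes --mdls=com.package.MyModel
```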