Posted to commits@manifoldcf.apache.org by ri...@apache.org on 2012/09/20 15:25:05 UTC

svn commit: r1388020 - /manifoldcf/trunk/site/src/documentation/content/xdocs/en_US/how-to-build-and-deploy.xml

Author: ridder
Date: Thu Sep 20 13:25:05 2012
New Revision: 1388020

URL: http://svn.apache.org/viewvc?rev=1388020&view=rev
Log:
Added English documentation for CONNECTORS-486

Modified:
    manifoldcf/trunk/site/src/documentation/content/xdocs/en_US/how-to-build-and-deploy.xml

Modified: manifoldcf/trunk/site/src/documentation/content/xdocs/en_US/how-to-build-and-deploy.xml
URL: http://svn.apache.org/viewvc/manifoldcf/trunk/site/src/documentation/content/xdocs/en_US/how-to-build-and-deploy.xml?rev=1388020&r1=1388019&r2=1388020&view=diff
==============================================================================
--- manifoldcf/trunk/site/src/documentation/content/xdocs/en_US/how-to-build-and-deploy.xml (original)
+++ manifoldcf/trunk/site/src/documentation/content/xdocs/en_US/how-to-build-and-deploy.xml Thu Sep 20 13:25:05 2012
@@ -532,6 +532,7 @@ cd dist/example
             <tr><td>org.apache.manifoldcf.scheduling</td><td>No</td><td>Log document scheduling activity.  Legal values INFO, WARN, or DEBUG.</td></tr>
             <tr><td>org.apache.manifoldcf.authorityconnectors</td><td>No</td><td>Log authority connector activity.  Legal values INFO, WARN, or DEBUG.</td></tr>
             <tr><td>org.apache.manifoldcf.authorityservice</td><td>No</td><td>Log authority service activity.  Legal values are INFO, WARN, or DEBUG.</td></tr>
+            <tr><td>org.apache.manifoldcf.seed</td><td>Yes, if file encryption is used</td><td>Specify the seed value to be used for encrypting the file to which the crawler configuration is exported.</td></tr>
           </table>
           <p></p>
         </section>
@@ -579,8 +580,8 @@ cd dist/example
             <tr><td>org.apache.manifoldcf.crawler.UnRegister</td><td><em>classname</em></td><td>Un-register a repository connector class</td></tr>
             <tr><td>org.apache.manifoldcf.crawler.UnRegisterAll</td><td>None</td><td>Un-register all repository connector classes</td></tr>
             <tr><td>org.apache.manifoldcf.crawler.SynchronizeConnectors</td><td>None</td><td>Un-register all registered repository connector classes that can't be found</td></tr>
-            <tr><td>org.apache.manifoldcf.crawler.ExportConfiguration</td><td><em>filename</em></td><td>Export crawler configuration to a file</td></tr>
-            <tr><td>org.apache.manifoldcf.crawler.ImportConfiguration</td><td><em>filename</em></td><td>Import crawler configuration from a file</td></tr>
+            <tr><td>org.apache.manifoldcf.crawler.ExportConfiguration</td><td><em>filename</em> [<em>passcode</em>]</td><td>Export crawler configuration to a file</td></tr>
+            <tr><td>org.apache.manifoldcf.crawler.ImportConfiguration</td><td><em>filename</em> [<em>passcode</em>]</td><td>Import crawler configuration from a file</td></tr>
           </table>
           <p></p>
           <table>
@@ -598,6 +599,12 @@ cd dist/example
           <p></p>
         </section>
         <section>
+          <title>Encrypting crawler configuration data</title>
+          <p></p>
+          <p>By adding a passcode as a second argument to the ExportConfiguration command class, the exported file will be encrypted using the AES algorithm. This can be useful to prevent repository passwords from being stored in clear text. In order to use this functionality, you must add a seed value to your configuration file. The same passcode, along with the seed value, is used to decrypt the file with the ImportConfiguration command class. See the documentation for the commands and properties above for the correct arguments and settings.</p>
+          <p></p>
+        </section>
+        <section>
           <title>Initializing the database</title>
           <p></p>
           <p>If you run the multiprocess model, you will need to initialize the database before you start the agents process or use the crawler UI.  To do this, all you need to do is
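
[Editor's note, not part of the commit: the org.apache.manifoldcf.seed property described above is set in the crawler's configuration file. A minimal sketch, assuming the `<property name="..." value="..."/>` entry format used by ManifoldCF properties files; the seed value shown is purely illustrative, not a recommended value:]

```xml
<!-- Illustrative entry only: the property name comes from the table in the
     commit above, but the seed value here is a made-up example. -->
<property name="org.apache.manifoldcf.seed" value="ExampleSeedValue123"/>
```

[With a seed configured, the same passcode is then passed as the optional second argument, after the filename, to both the ExportConfiguration and ImportConfiguration command classes, per the modified table rows above.]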