Posted to commits@labs.apache.org by th...@apache.org on 2008/02/22 09:34:39 UTC

svn commit: r630116 - /labs/droids/trunk/src/documentation/content/xdocs/index.xml

Author: thorsten
Date: Fri Feb 22 00:34:37 2008
New Revision: 630116

URL: http://svn.apache.org/viewvc?rev=630116&view=rev
Log:
white noise - formatting changes

Modified:
    labs/droids/trunk/src/documentation/content/xdocs/index.xml

Modified: labs/droids/trunk/src/documentation/content/xdocs/index.xml
URL: http://svn.apache.org/viewvc/labs/droids/trunk/src/documentation/content/xdocs/index.xml?rev=630116&r1=630115&r2=630116&view=diff
==============================================================================
--- labs/droids/trunk/src/documentation/content/xdocs/index.xml (original)
+++ labs/droids/trunk/src/documentation/content/xdocs/index.xml Fri Feb 22 00:34:37 2008
@@ -15,65 +15,91 @@
   See the License for the specific language governing permissions and
   limitations under the License.
 -->
-<!DOCTYPE document PUBLIC "-//APACHE//DTD Documentation V2.0//EN" "http://forrest.apache.org/dtd/document-v20.dtd">
+<!DOCTYPE document PUBLIC "-//APACHE//DTD Documentation V2.0//EN"
+"http://forrest.apache.org/dtd/document-v20.dtd">
 <document>
   <header>
     <title>Welcome to Apache Droids</title>
   </header>
+
   <body>
-    <section id="what">
+    <section
+     id="what">
       <title>What is this?</title>
-      <p>Droids aims to be an intelligent standalone robot framework that allows to create
-        robots as plugins, which can automatically seek out relevant online information
-        based on the user's specifications. For the core I formally took nutch, ripped out and
-        modified the plugin/extension framework. </p>
-      <p> However the current is not based on this framework but is using Spring instead. The
-        main reason is that Spring has become a standard and helps to make Droids as extensible
-        as possible. </p>
-      <p>Droids makes it very easy to extend existing robots or write a new one from scratch.
-        </p>
+
+      <p>Droids aims to be an intelligent standalone robot framework that
+      allows you to create robots as plugins, which can automatically seek
+      out relevant online information based on the user's specifications. For
+      the core I originally took Nutch and ripped out and modified its
+      plugin/extension framework.</p>
+
+      <p>However, the current version is not based on this framework but uses
+      Spring instead. The main reason is that Spring has become a standard and
+      helps to make Droids as extensible as possible.</p>
+
+      <p>Droids makes it very easy to extend existing robots or write a new one
+      from scratch.</p>
     </section>
+
     <section>
       <title>Why was it created?</title>
-      <p>Mainly because of personal curiosity: The background of this work is that Cocoon
-        trunk does not provide a crawler anymore and Forrest is based on it, meaning we cannot
-        update anymore till we found a crawler replacement. Getting more involved in Solr and
-        Nutch I see request for a generic standalone crawler. </p>
+
+      <p>Mainly because of personal curiosity: the background of this work is
+      that Cocoon trunk does not provide a crawler anymore, and Forrest is
+      based on it, meaning we cannot update anymore until we find a crawler
+      replacement. Getting more involved in Solr and Nutch, I see requests
+      for a generic standalone crawler.</p>
     </section>
+
     <section>
       <title>Requirements</title>
+
       <ul>
         <li>Apache Ant version 1.7.0 or higher</li>
+
         <li>JDK 1.5 or higher</li>
       </ul>
     </section>
-    <warning label="HEADSUP"> !!! Please ONLY crawl localhost NEVER a internet site when you
-      test the first time!!! You will need to adjust the urlfilters to limit loops. </warning>
+
+    <warning
+     label="HEADSUP">Please ONLY crawl localhost, NEVER an internet site, when
+    you test for the first time! You will need to adjust the urlfilters to
+    limit loops.</warning>
+
     <section>
       <title>Links / related projects</title>
+
       <ul>
         <li>
-          <a href="http://lucene.apache.org/nutch/">Nutch web-search software</a>
+          <a
+           href="http://lucene.apache.org/nutch/">Nutch web-search software</a>
         </li>
+
         <li>
-          <a href="http://www.robotstxt.org/wc/robots.html">The Web Robots Pages</a>
+          <a
+           href="http://www.robotstxt.org/wc/robots.html">The Web Robots
+          Pages</a>
         </li>
+
         <li>
           <a
-            href="http://www.andreas-hess.info/programming/webcrawler/index.html">
-            Programming webcrawler</a>
+           href="http://www.andreas-hess.info/programming/webcrawler/index.html">
+          Programming webcrawler</a>
         </li>
+
         <li>
           <a
-            href="http://www.andreas-hess.info/programming/webcrawler/index.html">
-            Writing a Web Crawler in the Java Programming Language</a>
+           href="http://www.andreas-hess.info/programming/webcrawler/index.html">
+          Writing a Web Crawler in the Java Programming Language</a>
         </li>
+
         <li>
           <a
-            href="http://svn.apache.org/repos/asf/httpcomponents/norobots-rfc/trunk/src/java/org/apache/http/norobots/">
-            Norbert</a>
+           href="http://svn.apache.org/repos/asf/httpcomponents/norobots-rfc/trunk/src/java/org/apache/http/norobots/">
+          Norbert</a>
         </li>
       </ul>
     </section>
   </body>
 </document>
+


