Posted to commits@falcon.apache.org by pa...@apache.org on 2016/03/01 08:22:35 UTC

[32/51] [partial] falcon git commit: Copying over falcon site related files to git asf-site

http://git-wip-us.apache.org/repos/asf/falcon/blob/8609ffd6/site/0.5-incubating/InstallationSteps.html
----------------------------------------------------------------------
diff --git a/site/0.5-incubating/InstallationSteps.html b/site/0.5-incubating/InstallationSteps.html
new file mode 100644
index 0000000..1fb47a0
--- /dev/null
+++ b/site/0.5-incubating/InstallationSteps.html
@@ -0,0 +1,362 @@
+<!DOCTYPE html>
+<!--
+ | Generated by Apache Maven Doxia at 2015-11-30
+ | Rendered using Apache Maven Fluido Skin 1.3.0
+-->
+<html xmlns="http://www.w3.org/1999/xhtml" xml:lang="en" lang="en">
+  <head>
+    <meta charset="UTF-8" />
+    <meta name="viewport" content="width=device-width, initial-scale=1.0" />
+    <meta name="Date-Revision-yyyymmdd" content="20151130" />
+    <meta http-equiv="Content-Language" content="en" />
+    <title>Falcon - Building & Installing Falcon</title>
+    <link rel="stylesheet" href="./css/apache-maven-fluido-1.3.0.min.css" />
+    <link rel="stylesheet" href="./css/site.css" />
+    <link rel="stylesheet" href="./css/print.css" media="print" />
+
+      
+    <script type="text/javascript" src="./js/apache-maven-fluido-1.3.0.min.js"></script>
+
+    
+            </head>
+        <body class="topBarDisabled">
+          
+                        
+                    
+    
+        <div class="container">
+          <div id="banner">
+        <div class="pull-left">
+                                    <a href="http://falcon.incubator.apache.org/index.html" id="bannerLeft">
+                                                                                                <img src="images/falcon-logo.png"  alt="Falcon" width="200px" height="45px"/>
+                </a>
+                      </div>
+        <div class="pull-right">                  <a href="http://incubator.apache.org" id="bannerRight">
+                                                                                                <img src="images/apache-incubator-logo.png"  alt="Apache Incubator"/>
+                </a>
+      </div>
+        <div class="clear"><hr/></div>
+      </div>
+
+      <div id="breadcrumbs">
+        <ul class="breadcrumb">
+                
+                    
+                              <li class="">
+                    <a href="index.html" title="Home">
+        Home</a>
+        </li>
+      <li class="divider ">/</li>
+        <li class="">Building & Installing Falcon</li>
+        
+                
+                    
+      
+                                              
+    <li class="pull-right">              <a href="http://s.apache.org/falcon-0.5-release-notes" class="externalLink" title="Released: 2014-09-22">
+        Released: 2014-09-22</a>
+  </li>
+
+        <li class="divider pull-right">|</li>
+      
+    <li class="pull-right">              <a href="http://archive.apache.org/dist/incubator/falcon/" class="externalLink" title="0.5-incubating">
+        0.5-incubating</a>
+  </li>
+
+                        </ul>
+      </div>
+
+      
+                
+        <div id="bodyColumn" >
+                                  
+            <div class="section">
+<h3>Building &amp; Installing Falcon<a name="Building__Installing_Falcon"></a></h3></div>
+<div class="section">
+<h4>Building Falcon<a name="Building_Falcon"></a></h4>
+<div class="source">
+<pre>
+git clone https://git-wip-us.apache.org/repos/asf/falcon.git falcon
+
+cd falcon
+
+export MAVEN_OPTS=&quot;-Xmx1024m -XX:MaxPermSize=256m&quot; &amp;&amp; mvn clean install [For hadoop 1]
+export MAVEN_OPTS=&quot;-Xmx1024m -XX:MaxPermSize=256m&quot; &amp;&amp; mvn clean install -Phadoop-2 [For hadoop 2]
+
+[optionally -Dhadoop.version=&lt;&lt;hadoop.version&gt;&gt; can be appended to build for a specific version of hadoop]
+[optionally -Doozie.version=&lt;&lt;oozie version&gt;&gt; can be appended to build with a specific version of Oozie. Oozie versions &gt;= 3.2.0-incubating are supported]
+
+
+</pre></div>
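+<p>For example, a build against a specific Hadoop 2 release might look like the following (the version numbers are illustrative only; use the versions that match your cluster):</p>
+<div class="source">
+<pre>
+export MAVEN_OPTS=&quot;-Xmx1024m -XX:MaxPermSize=256m&quot;
+mvn clean install -Phadoop-2 -Dhadoop.version=2.4.0 -Doozie.version=4.0.0
+</pre></div>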
+<p>Once the build successfully completes, artifacts can be packaged for deployment. The package can be built in embedded or distributed mode.</p>
+<p><b>Embedded Mode</b></p>
+<div class="source">
+<pre>
+
+mvn clean assembly:assembly -DskipTests -DskipCheck=true [For hadoop 1]
+mvn clean assembly:assembly -DskipTests -DskipCheck=true -P hadoop-2 [For hadoop 2]
+
+
+</pre></div>
+<p>The tar can be found at {project dir}/target/falcon-${project.version}-bin.tar.gz</p>
+<p>The tar is structured as follows:</p>
+<div class="source">
+<pre>
+
+|- bin
+   |- falcon
+   |- falcon-start
+   |- falcon-stop
+   |- falcon-config.sh
+   |- service-start.sh
+   |- service-stop.sh
+|- conf
+   |- startup.properties
+   |- runtime.properties
+   |- client.properties
+   |- log4j.xml
+   |- falcon-env.sh
+|- docs
+|- client
+   |- lib (client support libs)
+|- server
+   |- webapp
+      |- falcon.war
+|- hadooplibs
+|- README
+|- NOTICE.txt
+|- LICENSE.txt
+|- DISCLAIMER.txt
+|- CHANGES.txt
+
+</pre></div>
+<p><b>Distributed Mode</b></p>
+<div class="source">
+<pre>
+
+mvn clean assembly:assembly -DskipTests -DskipCheck=true -Pdistributed,hadoop-1 [For hadoop 1]
+mvn clean assembly:assembly -DskipTests -DskipCheck=true -Pdistributed,hadoop-2 [For hadoop 2]
+
+
+</pre></div>
+<p>The tar can be found at {project dir}/target/falcon-distributed-${project.version}-server.tar.gz</p>
+<p>The tar is structured as follows:</p>
+<div class="source">
+<pre>
+
+|- bin
+   |- falcon
+   |- falcon-start
+   |- falcon-stop
+   |- falcon-config.sh
+   |- service-start.sh
+   |- service-stop.sh
+   |- prism-stop
+   |- prism-start
+|- conf
+   |- startup.properties
+   |- runtime.properties
+   |- client.properties
+   |- log4j.xml
+   |- falcon-env.sh
+|- docs
+|- client
+   |- lib (client support libs)
+|- server
+   |- webapp
+      |- falcon.war
+      |- prism.war
+|- hadooplibs
+|- README
+|- NOTICE.txt
+|- LICENSE.txt
+|- DISCLAIMER.txt
+|- CHANGES.txt
+
+</pre></div></div>
+<div class="section">
+<h4>Installing &amp; running Falcon<a name="Installing__running_Falcon"></a></h4>
+<p><b>Installing falcon</b></p>
+<div class="source">
+<pre>
+tar -xzvf {falcon package}
+cd falcon-distributed-${project.version} or falcon-${project.version}
+
+</pre></div>
+<p><b>Configuring Falcon</b></p>
+<p>By default, the config directory used by falcon is {package dir}/conf. To override this, set the environment variable FALCON_CONF to the path of the conf dir, as in the example below.</p>
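+<p>For example (the path below is purely illustrative):</p>
+<div class="source">
+<pre>
+export FALCON_CONF=/etc/falcon/conf
+</pre></div>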
+<p>falcon-env.sh has been added to the falcon conf. This file can be used to set the various environment variables that you need for your services. In addition, you can set any other environment variables you might need. This file will be sourced by the falcon scripts before any commands are executed. The following environment variables are available to set.</p>
+<div class="source">
+<pre>
+# The java implementation to use. If JAVA_HOME is not found we expect java and jar to be in path
+#export JAVA_HOME=
+
+# any additional java opts you want to set. This will apply to both client and server operations
+#export FALCON_OPTS=
+
+# any additional java opts that you want to set for client only
+#export FALCON_CLIENT_OPTS=
+
+# java heap size we want to set for the client. Default is 1024MB
+#export FALCON_CLIENT_HEAP=
+
+# any additional opts you want to set for prism service.
+#export FALCON_PRISM_OPTS=
+
+# java heap size we want to set for the prism service. Default is 1024MB
+#export FALCON_PRISM_HEAP=
+
+# any additional opts you want to set for falcon service.
+#export FALCON_SERVER_OPTS=
+
+# java heap size we want to set for the falcon server. Default is 1024MB
+#export FALCON_SERVER_HEAP=
+
+# What is considered the falcon home dir. Default is the base location of the installed software
+#export FALCON_HOME_DIR=
+
+# Where log files are stored. Default is logs directory under the base install location
+#export FALCON_LOG_DIR=
+
+# Where pid files are stored. Default is logs directory under the base install location
+#export FALCON_PID_DIR=
+
+# where the falcon active mq data is stored. Default is logs/data directory under the base install location
+#export FALCON_DATA_DIR=
+
+# Where do you want to expand the war file. By default it is in the /server/webapp dir under the base install dir.
+#export FALCON_EXPANDED_WEBAPP_DIR=
+
+</pre></div>
+<p><b>Starting Falcon Server</b></p>
+<div class="source">
+<pre>
+bin/falcon-start [-port &lt;port&gt;]
+
+</pre></div>
+<p>By default:</p>
+<ul>
+<li>The falcon server starts at port 15443 (https). To change the port, use the -port option.
+<ul>
+<li>falcon.enableTLS can be set to true or false explicitly to enable or disable SSL; if it is not set, a port ending with 443 automatically puts falcon on HTTPS.</li></ul></li>
+<li>The falcon server starts an embedded ActiveMQ. To control this behaviour, set the following system properties using the -D option in the environment variable FALCON_OPTS (see the example after this list):
+<ul>
+<li>falcon.embeddedmq=&lt;true/false&gt; - whether the server should start embedded ActiveMQ, default true</li>
+<li>falcon.embeddedmq.port=&lt;port&gt; - port for embedded ActiveMQ, default 61616</li>
+<li>falcon.embeddedmq.data=&lt;path&gt; - data path for embedded ActiveMQ, default {package dir}/logs/data</li></ul></li>
+<li>The falcon server starts with conf from {package dir}/conf. To override this (for example, to reuse the same conf across multiple falcon upgrades), set the environment variable FALCON_CONF to the path of the conf dir.</li></ul>
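+<p>For instance, to keep the embedded ActiveMQ but move its data directory, the properties can be passed through FALCON_OPTS before starting the server (a sketch only; the data path is illustrative):</p>
+<div class="source">
+<pre>
+# illustrative values; adjust to your environment
+export FALCON_OPTS=&quot;-Dfalcon.embeddedmq=true -Dfalcon.embeddedmq.port=61616 -Dfalcon.embeddedmq.data=/var/falcon/mqdata&quot;
+bin/falcon-start
+</pre></div>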
+<p><b><i>Adding Extension Libraries</i></b></p>
+<p>Library extensions allow users to add custom libraries to entity lifecycles such as feed retention, feed replication and process execution. This is useful for use cases such as adding filesystem extensions. To enable this, add the following configs to startup.properties:</p>
+<ul>
+<li>*.libext.paths=&lt;paths to be added to all entity lifecycles&gt;</li>
+<li>*.libext.feed.paths=&lt;paths to be added to all feed lifecycles&gt;</li>
+<li>*.libext.feed.retentions.paths=&lt;paths to be added to feed retention workflow&gt;</li>
+<li>*.libext.feed.replication.paths=&lt;paths to be added to feed replication workflow&gt;</li>
+<li>*.libext.process.paths=&lt;paths to be added to process workflow&gt;</li></ul>
+<p>The configured jars are added to the falcon classpath and to the corresponding workflows.</p>
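+<p>A minimal sketch of such entries in startup.properties (the HDFS paths shown are hypothetical and only illustrate the format):</p>
+<div class="source">
+<pre>
+# jars under this path are added to all entity lifecycles
+*.libext.paths=/projects/falcon/libext
+# jars under this path are added only to feed replication workflows
+*.libext.feed.replication.paths=/projects/falcon/libext/replication
+</pre></div>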
+<p><b>Starting Prism</b></p>
+<div class="source">
+<pre>
+bin/prism-start [-port &lt;port&gt;]
+
+</pre></div>
+<p>By default:</p>
+<ul>
+<li>The prism server starts at port 16443. To change the port, use the -port option.
+<ul>
+<li>falcon.enableTLS can be set to true or false explicitly to enable or disable SSL; if it is not set, a port ending with 443 automatically puts prism on HTTPS.</li></ul></li>
+<li>Prism starts with conf from {package dir}/conf. To override this (for example, to reuse the same conf across multiple prism upgrades), set the environment variable FALCON_CONF to the path of the conf dir.</li></ul>
+<p><b>Using Falcon</b></p>
+<div class="source">
+<pre>
+bin/falcon admin -version
+Falcon server build version: {Version:&quot;0.3-SNAPSHOT-rd7e2be9afa2a5dc96acd1ec9e325f39c6b2f17f7&quot;,Mode:&quot;embedded&quot;}
+
+----
+
+bin/falcon help
+(for more details about falcon cli usage)
+
+</pre></div>
+<p><b>Dashboard</b></p>
+<p>Once falcon / prism is started, you can view the status of falcon entities using the Web-based dashboard. The web UI works in both distributed and embedded mode. You can open your browser at the corresponding port to use the web UI.</p>
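+<p>With the default ports and TLS enabled, the dashboard is typically reachable at URLs like the following (a local install is assumed here):</p>
+<div class="source">
+<pre>
+Falcon server : https://localhost:15443/
+Prism         : https://localhost:16443/
+</pre></div>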
+<p><b>Stopping Falcon Server</b></p>
+<div class="source">
+<pre>
+bin/falcon-stop
+
+</pre></div>
+<p><b>Stopping Prism</b></p>
+<div class="source">
+<pre>
+bin/prism-stop
+
+</pre></div></div>
+<div class="section">
+<h4>Preparing Oozie and Falcon packages for deployment<a name="Preparing_Oozie_and_Falcon_packages_for_deployment"></a></h4>
+<div class="source">
+<pre>
+cd &lt;&lt;project home&gt;&gt;
+src/bin/package.sh &lt;&lt;hadoop-version&gt;&gt; &lt;&lt;oozie-version&gt;&gt;
+
+&gt;&gt; ex. src/bin/package.sh 1.1.2 3.1.3-incubating or src/bin/package.sh 0.20.2-cdh3u5 4.0.0
+&gt;&gt; Falcon package is available in &lt;&lt;falcon home&gt;&gt;/target/falcon-&lt;&lt;version&gt;&gt;-bin.tar.gz
+&gt;&gt; Oozie package is available in &lt;&lt;falcon home&gt;&gt;/target/oozie-3.3.2-distro.tar.gz
+
+</pre></div></div>
+<div class="section">
+<h4>Running Examples using embedded package<a name="Running_Examples_using_embedded_package"></a></h4>
+<div class="source">
+<pre>
+bin/falcon-start
+
+</pre></div>
+<p>Make sure the hadoop and oozie endpoints in examples/entity/filesystem/standalone-cluster.xml match your setup.</p>
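+<p>The relevant entries in that file look roughly like the following (the endpoints and versions shown are placeholders; use the values for your own hadoop and oozie installations):</p>
+<div class="source">
+<pre>
+&lt;interface type=&quot;write&quot; endpoint=&quot;hdfs://localhost:8020&quot; version=&quot;2.2.0&quot; /&gt;
+&lt;interface type=&quot;workflow&quot; endpoint=&quot;http://localhost:11000/oozie/&quot; version=&quot;4.0.0&quot; /&gt;
+</pre></div>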
+<div class="source">
+<pre>
+bin/falcon entity -submit -type cluster -file examples/entity/filesystem/standalone-cluster.xml
+
+</pre></div>
+<p>Submit input and output feeds:</p>
+<div class="source">
+<pre>
+bin/falcon entity -submit -type feed -file examples/entity/filesystem/in-feed.xml
+bin/falcon entity -submit -type feed -file examples/entity/filesystem/out-feed.xml
+
+</pre></div>
+<p>Set up the workflow for the process:</p>
+<div class="source">
+<pre>
+hadoop fs -put examples/app /
+
+</pre></div>
+<p>Submit and schedule the process:</p>
+<div class="source">
+<pre>
+bin/falcon entity -submitAndSchedule -type process -file examples/entity/filesystem/oozie-mr-process.xml
+bin/falcon entity -submitAndSchedule -type process -file examples/entity/filesystem/pig-process.xml
+
+</pre></div>
+<p>Generate input data:</p>
+<div class="source">
+<pre>
+examples/data/generate.sh &lt;&lt;hdfs endpoint&gt;&gt;
+
+</pre></div>
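+<p>For instance, against a local HDFS (the endpoint below is only an example):</p>
+<div class="source">
+<pre>
+examples/data/generate.sh hdfs://localhost:8020
+</pre></div>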
+<p>Get status of instances:</p>
+<div class="source">
+<pre>
+bin/falcon instance -status -type process -name oozie-mr-process -start 2013-11-15T00:05Z -end 2013-11-15T01:00Z
+
+</pre></div>
+<p>HCat based example entities are in examples/entity/hcat.</p></div>
+                  </div>
+          </div>
+
+    <hr/>
+
+    <footer>
+            <div class="container">
+              <div class="row span12">Copyright &copy;                    2013-2015
+                        <a href="http://www.apache.org">Apache Software Foundation</a>.
+            All Rights Reserved.      
+                    
+      </div>
+
+                          
+                <p id="poweredBy" class="pull-right">
+                          <a href="http://maven.apache.org/" title="Built by Maven" class="poweredBy">
+        <img class="builtBy" alt="Built by Maven" src="./images/logos/maven-feather.png" />
+      </a>
+              </p>
+        
+                </div>
+    </footer>
+  </body>
+</html>

http://git-wip-us.apache.org/repos/asf/falcon/blob/8609ffd6/site/0.5-incubating/OnBoarding.html
----------------------------------------------------------------------
diff --git a/site/0.5-incubating/OnBoarding.html b/site/0.5-incubating/OnBoarding.html
new file mode 100644
index 0000000..a37b1a3
--- /dev/null
+++ b/site/0.5-incubating/OnBoarding.html
@@ -0,0 +1,377 @@
+<!DOCTYPE html>
+<!--
+ | Generated by Apache Maven Doxia at 2015-11-30
+ | Rendered using Apache Maven Fluido Skin 1.3.0
+-->
+<html xmlns="http://www.w3.org/1999/xhtml" xml:lang="en" lang="en">
+  <head>
+    <meta charset="UTF-8" />
+    <meta name="viewport" content="width=device-width, initial-scale=1.0" />
+    <meta name="Date-Revision-yyyymmdd" content="20151130" />
+    <meta http-equiv="Content-Language" content="en" />
+    <title>Falcon - Contents</title>
+    <link rel="stylesheet" href="./css/apache-maven-fluido-1.3.0.min.css" />
+    <link rel="stylesheet" href="./css/site.css" />
+    <link rel="stylesheet" href="./css/print.css" media="print" />
+
+      
+    <script type="text/javascript" src="./js/apache-maven-fluido-1.3.0.min.js"></script>
+
+    
+            </head>
+        <body class="topBarDisabled">
+          
+                        
+                    
+    
+        <div class="container">
+          <div id="banner">
+        <div class="pull-left">
+                                    <a href="http://falcon.incubator.apache.org/index.html" id="bannerLeft">
+                                                                                                <img src="images/falcon-logo.png"  alt="Falcon" width="200px" height="45px"/>
+                </a>
+                      </div>
+        <div class="pull-right">                  <a href="http://incubator.apache.org" id="bannerRight">
+                                                                                                <img src="images/apache-incubator-logo.png"  alt="Apache Incubator"/>
+                </a>
+      </div>
+        <div class="clear"><hr/></div>
+      </div>
+
+      <div id="breadcrumbs">
+        <ul class="breadcrumb">
+                
+                    
+                              <li class="">
+                    <a href="index.html" title="Home">
+        Home</a>
+        </li>
+      <li class="divider ">/</li>
+        <li class="">Contents</li>
+        
+                
+                    
+      
+                                              
+    <li class="pull-right">              <a href="http://s.apache.org/falcon-0.5-release-notes" class="externalLink" title="Released: 2014-09-22">
+        Released: 2014-09-22</a>
+  </li>
+
+        <li class="divider pull-right">|</li>
+      
+    <li class="pull-right">              <a href="http://archive.apache.org/dist/incubator/falcon/" class="externalLink" title="0.5-incubating">
+        0.5-incubating</a>
+  </li>
+
+                        </ul>
+      </div>
+
+      
+                
+        <div id="bodyColumn" >
+                                  
+            <div class="section">
+<h3>Contents<a name="Contents"></a></h3>
+<p></p>
+<ul>
+<li><a href="#Onboarding Steps">Onboarding Steps</a></li>
+<li><a href="#Sample Pipeline">Sample Pipeline</a></li>
+<li><a href="./HiveIntegration.html">Hive Examples</a></li></ul></div>
+<div class="section">
+<h4>Onboarding Steps<a name="Onboarding_Steps"></a></h4>
+<p></p>
+<ul>
+<li>Create a cluster definition for the cluster, specifying the name node, job tracker, workflow engine endpoint and messaging endpoint. Refer to <a href="./EntitySpecification.html">cluster definition</a> for details.</li>
+<li>Create feed definitions for each of the inputs and outputs, specifying frequency, data path and ownership. Refer to <a href="./EntitySpecification.html">feed definition</a> for details.</li>
+<li>Create a process definition for your job. The process defines the configuration for the workflow job. Important attributes are frequency, inputs/outputs and workflow path. Refer to <a href="./EntitySpecification.html">process definition</a> for details.</li>
+<li>Define the workflow for your job using the workflow engine (only Oozie is supported as of now). Refer to the <a class="externalLink" href="http://oozie.apache.org/docs/3.1.3-incubating/WorkflowFunctionalSpec.html">Oozie Workflow Specification</a>. The libraries required for the workflow should be available in the lib folder under the workflow path.</li>
+<li>Set up the workflow definition, libraries and referenced scripts on hadoop.</li>
+<li>Submit the cluster definition.</li>
+<li>Submit and schedule the feed and process definitions (see the command sketch after this list).</li></ul></div>
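+<p>The submit and schedule steps above map onto falcon CLI calls along these lines (a sketch only; the entity file names are illustrative):</p>
+<div class="source">
+<pre>
+# submit the cluster definition first
+bin/falcon entity -submit -type cluster -file corp-cluster.xml
+
+# then submit the input and output feeds
+bin/falcon entity -submit -type feed -file sample-input-feed.xml
+bin/falcon entity -submit -type feed -file sample-output-feed.xml
+
+# finally submit and schedule the process
+bin/falcon entity -submitAndSchedule -type process -file sample-process.xml
+</pre></div>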
+<div class="section">
+<h4>Sample Pipeline<a name="Sample_Pipeline"></a></h4></div>
+<div class="section">
+<h5>Cluster   <a name="Cluster"></a></h5>
+<p>Cluster definition that contains end points for name node, job tracker, oozie and jms server:</p>
+<div class="source">
+<pre>
+&lt;?xml version=&quot;1.0&quot;?&gt;
+&lt;!--
+    Cluster configuration
+  --&gt;
+&lt;cluster colo=&quot;ua2&quot; description=&quot;&quot; name=&quot;corp&quot; xmlns=&quot;uri:falcon:cluster:0.1&quot;
+    xmlns:xsi=&quot;http://www.w3.org/2001/XMLSchema-instance&quot;&gt;    
+    &lt;interfaces&gt;
+        &lt;interface type=&quot;readonly&quot; endpoint=&quot;hftp://name-node.com:50070&quot; version=&quot;0.20.2-cdh3u0&quot; /&gt;
+
+        &lt;interface type=&quot;write&quot; endpoint=&quot;hdfs://name-node.com:54310&quot; version=&quot;0.20.2-cdh3u0&quot; /&gt;
+
+        &lt;interface type=&quot;execute&quot; endpoint=&quot;job-tracker:54311&quot; version=&quot;0.20.2-cdh3u0&quot; /&gt;
+
+        &lt;interface type=&quot;workflow&quot; endpoint=&quot;http://oozie.com:11000/oozie/&quot; version=&quot;3.1.4&quot; /&gt;
+
+        &lt;interface type=&quot;messaging&quot; endpoint=&quot;tcp://jms-server.com:61616?daemon=true&quot; version=&quot;5.1.6&quot; /&gt;
+    &lt;/interfaces&gt;
+
+    &lt;locations&gt;
+        &lt;location name=&quot;staging&quot; path=&quot;/projects/falcon/staging&quot; /&gt;
+        &lt;location name=&quot;temp&quot; path=&quot;/tmp&quot; /&gt;
+        &lt;location name=&quot;working&quot; path=&quot;/projects/falcon/working&quot; /&gt;
+    &lt;/locations&gt;
+&lt;/cluster&gt;
+
+</pre></div></div>
+<div class="section">
+<h5>Input Feed<a name="Input_Feed"></a></h5>
+<p>Hourly feed that defines feed path, frequency, ownership and validity:</p>
+<div class="source">
+<pre>
+&lt;?xml version=&quot;1.0&quot; encoding=&quot;UTF-8&quot;?&gt;
+&lt;!--
+    Hourly sample input data
+  --&gt;
+
+&lt;feed description=&quot;sample input data&quot; name=&quot;SampleInput&quot; xmlns=&quot;uri:falcon:feed:0.1&quot;
+    xmlns:xsi=&quot;http://www.w3.org/2001/XMLSchema-instance&quot;&gt;
+    &lt;groups&gt;group&lt;/groups&gt;
+
+    &lt;frequency&gt;hours(1)&lt;/frequency&gt;
+
+    &lt;late-arrival cut-off=&quot;hours(6)&quot; /&gt;
+
+    &lt;clusters&gt;
+        &lt;cluster name=&quot;corp&quot; type=&quot;source&quot;&gt;
+            &lt;validity start=&quot;2009-01-01T00:00Z&quot; end=&quot;2099-12-31T00:00Z&quot; timezone=&quot;UTC&quot; /&gt;
+            &lt;retention limit=&quot;months(24)&quot; action=&quot;delete&quot; /&gt;
+        &lt;/cluster&gt;
+    &lt;/clusters&gt;
+
+    &lt;locations&gt;
+        &lt;location type=&quot;data&quot; path=&quot;/projects/bootcamp/data/${YEAR}-${MONTH}-${DAY}-${HOUR}/SampleInput&quot; /&gt;
+        &lt;location type=&quot;stats&quot; path=&quot;/projects/bootcamp/stats/SampleInput&quot; /&gt;
+        &lt;location type=&quot;meta&quot; path=&quot;/projects/bootcamp/meta/SampleInput&quot; /&gt;
+    &lt;/locations&gt;
+
+    &lt;ACL owner=&quot;suser&quot; group=&quot;users&quot; permission=&quot;0755&quot; /&gt;
+
+    &lt;schema location=&quot;/none&quot; provider=&quot;none&quot; /&gt;
+&lt;/feed&gt;
+
+</pre></div></div>
+<div class="section">
+<h5>Output Feed<a name="Output_Feed"></a></h5>
+<p>Daily feed that defines feed path, frequency, ownership and validity:</p>
+<div class="source">
+<pre>
+&lt;?xml version=&quot;1.0&quot; encoding=&quot;UTF-8&quot;?&gt;
+&lt;!--
+    Daily sample output data
+  --&gt;
+
+&lt;feed description=&quot;sample output data&quot; name=&quot;SampleOutput&quot; xmlns=&quot;uri:falcon:feed:0.1&quot;
+xmlns:xsi=&quot;http://www.w3.org/2001/XMLSchema-instance&quot;&gt;
+    &lt;groups&gt;group&lt;/groups&gt;
+
+    &lt;frequency&gt;days(1)&lt;/frequency&gt;
+
+    &lt;late-arrival cut-off=&quot;hours(6)&quot; /&gt;
+
+    &lt;clusters&gt;
+        &lt;cluster name=&quot;corp&quot; type=&quot;source&quot;&gt;
+            &lt;validity start=&quot;2009-01-01T00:00Z&quot; end=&quot;2099-12-31T00:00Z&quot; timezone=&quot;UTC&quot; /&gt;
+            &lt;retention limit=&quot;months(24)&quot; action=&quot;delete&quot; /&gt;
+        &lt;/cluster&gt;
+    &lt;/clusters&gt;
+
+    &lt;locations&gt;
+        &lt;location type=&quot;data&quot; path=&quot;/projects/bootcamp/output/${YEAR}-${MONTH}-${DAY}/SampleOutput&quot; /&gt;
+        &lt;location type=&quot;stats&quot; path=&quot;/projects/bootcamp/stats/SampleOutput&quot; /&gt;
+        &lt;location type=&quot;meta&quot; path=&quot;/projects/bootcamp/meta/SampleOutput&quot; /&gt;
+    &lt;/locations&gt;
+
+    &lt;ACL owner=&quot;suser&quot; group=&quot;users&quot; permission=&quot;0755&quot; /&gt;
+
+    &lt;schema location=&quot;/none&quot; provider=&quot;none&quot; /&gt;
+&lt;/feed&gt;
+
+</pre></div></div>
+<div class="section">
+<h5>Process<a name="Process"></a></h5>
+<p>Sample process that runs daily at the 6th hour on the corp cluster. It takes one input - SampleInput for the previous day (24 instances) - and generates one output - SampleOutput for the previous day. The workflow is defined at /projects/bootcamp/workflow/workflow.xml. Any libraries required by the workflow should be at /projects/bootcamp/workflow/lib. The process also defines the properties queueName, ssh.host and fileTimestamp, which are passed to the workflow. In addition, Falcon exposes the following properties to the workflow: nameNode and jobTracker (hadoop properties), and input and output (input/output properties).</p>
+<div class="source">
+<pre>
+&lt;?xml version=&quot;1.0&quot; encoding=&quot;UTF-8&quot;?&gt;
+&lt;!--
+    Daily sample process. Runs at 6th hour every day. Input - last day's hourly data. Generates output for yesterday
+ --&gt;
+&lt;process name=&quot;SampleProcess&quot;&gt;
+    &lt;cluster name=&quot;corp&quot; /&gt;
+
+    &lt;frequency&gt;days(1)&lt;/frequency&gt;
+
+    &lt;validity start=&quot;2012-04-03T06:00Z&quot; end=&quot;2022-12-30T00:00Z&quot; timezone=&quot;UTC&quot; /&gt;
+
+    &lt;inputs&gt;
+        &lt;input name=&quot;input&quot; feed=&quot;SampleInput&quot; start=&quot;yesterday(0,0)&quot; end=&quot;today(-1,0)&quot; /&gt;
+    &lt;/inputs&gt;
+
+    &lt;outputs&gt;
+            &lt;output name=&quot;output&quot; feed=&quot;SampleOutput&quot; instance=&quot;yesterday(0,0)&quot; /&gt;
+    &lt;/outputs&gt;
+
+    &lt;properties&gt;
+        &lt;property name=&quot;queueName&quot; value=&quot;reports&quot; /&gt;
+        &lt;property name=&quot;ssh.host&quot; value=&quot;host.com&quot; /&gt;
+        &lt;property name=&quot;fileTimestamp&quot; value=&quot;${coord:formatTime(coord:nominalTime(), 'yyyy-MM-dd')}&quot; /&gt;
+    &lt;/properties&gt;
+
+    &lt;workflow engine=&quot;oozie&quot; path=&quot;/projects/bootcamp/workflow&quot; /&gt;
+
+    &lt;retry policy=&quot;backoff&quot; delay=&quot;minutes(5)&quot; attempts=&quot;3&quot; /&gt;
+    
+    &lt;late-process policy=&quot;exp-backoff&quot; delay=&quot;hours(1)&quot;&gt;
+        &lt;late-input input=&quot;input&quot; workflow-path=&quot;/projects/bootcamp/workflow/lateinput&quot; /&gt;
+    &lt;/late-process&gt;
+&lt;/process&gt;
+
+</pre></div></div>
+<div class="section">
+<h5>Oozie Workflow<a name="Oozie_Workflow"></a></h5>
+<p>The sample user workflow contains 3 actions:</p>
+<ul>
+<li>Pig action - Executes pig script /projects/bootcamp/workflow/script.pig</li>
+<li>concatenator - Java action that concatenates part files and generates a single file</li>
+<li>file upload - ssh action that gets the concatenated file from hadoop and sends the file to a remote host</li></ul>
+<div class="source">
+<pre>
+&lt;workflow-app xmlns=&quot;uri:oozie:workflow:0.2&quot; name=&quot;sample-wf&quot;&gt;
+        &lt;start to=&quot;pig&quot; /&gt;
+
+        &lt;action name=&quot;pig&quot;&gt;
+                &lt;pig&gt;
+                        &lt;job-tracker&gt;${jobTracker}&lt;/job-tracker&gt;
+                        &lt;name-node&gt;${nameNode}&lt;/name-node&gt;
+                        &lt;prepare&gt;
+                                &lt;delete path=&quot;${output}&quot;/&gt;
+                        &lt;/prepare&gt;
+                        &lt;configuration&gt;
+                                &lt;property&gt;
+                                        &lt;name&gt;mapred.job.queue.name&lt;/name&gt;
+                                        &lt;value&gt;${queueName}&lt;/value&gt;
+                                &lt;/property&gt;
+                                &lt;property&gt;
+                                        &lt;name&gt;mapreduce.fileoutputcommitter.marksuccessfuljobs&lt;/name&gt;
+                                        &lt;value&gt;true&lt;/value&gt;
+                                &lt;/property&gt;
+                        &lt;/configuration&gt;
+                        &lt;script&gt;${nameNode}/projects/bootcamp/workflow/script.pig&lt;/script&gt;
+                        &lt;param&gt;input=${input}&lt;/param&gt;
+                        &lt;param&gt;output=${output}&lt;/param&gt;
+                        &lt;file&gt;lib/dependent.jar&lt;/file&gt;
+                &lt;/pig&gt;
+                &lt;ok to=&quot;concatenator&quot; /&gt;
+                &lt;error to=&quot;fail&quot; /&gt;
+        &lt;/action&gt;
+
+        &lt;action name=&quot;concatenator&quot;&gt;
+                &lt;java&gt;
+                        &lt;job-tracker&gt;${jobTracker}&lt;/job-tracker&gt;
+                        &lt;name-node&gt;${nameNode}&lt;/name-node&gt;
+                        &lt;prepare&gt;
+                                &lt;delete path=&quot;${nameNode}/projects/bootcamp/concat/data-${fileTimestamp}.csv&quot;/&gt;
+                        &lt;/prepare&gt;
+                        &lt;configuration&gt;
+                                &lt;property&gt;
+                                        &lt;name&gt;mapred.job.queue.name&lt;/name&gt;
+                                        &lt;value&gt;${queueName}&lt;/value&gt;
+                                &lt;/property&gt;
+                        &lt;/configuration&gt;
+                        &lt;main-class&gt;com.wf.Concatenator&lt;/main-class&gt;
+                        &lt;arg&gt;${output}&lt;/arg&gt;
+                        &lt;arg&gt;${nameNode}/projects/bootcamp/concat/data-${fileTimestamp}.csv&lt;/arg&gt;
+                &lt;/java&gt;
+                &lt;ok to=&quot;fileupload&quot; /&gt;
+                &lt;error to=&quot;fail&quot;/&gt;
+        &lt;/action&gt;
+                        
+        &lt;action name=&quot;fileupload&quot;&gt;
+                &lt;ssh&gt;
+                        &lt;host&gt;localhost&lt;/host&gt;
+                        &lt;command&gt;/tmp/fileupload.sh&lt;/command&gt;
+                        &lt;args&gt;${nameNode}/projects/bootcamp/concat/data-${fileTimestamp}.csv&lt;/args&gt;
+                        &lt;args&gt;${wf:conf(&quot;ssh.host&quot;)}&lt;/args&gt;
+                        &lt;capture-output/&gt;
+                &lt;/ssh&gt;
+                &lt;ok to=&quot;fileUploadDecision&quot; /&gt;
+                &lt;error to=&quot;fail&quot;/&gt;
+        &lt;/action&gt;
+
+        &lt;decision name=&quot;fileUploadDecision&quot;&gt;
+                &lt;switch&gt;
+                        &lt;case to=&quot;end&quot;&gt;
+                                ${wf:actionData('fileupload')['output'] == '0'}
+                        &lt;/case&gt;
+                        &lt;default to=&quot;fail&quot;/&gt;
+                &lt;/switch&gt;
+        &lt;/decision&gt;
+
+        &lt;kill name=&quot;fail&quot;&gt;
+                &lt;message&gt;Workflow failed, error message[${wf:errorMessage(wf:lastErrorNode())}]&lt;/message&gt;
+        &lt;/kill&gt;
+
+        &lt;end name=&quot;end&quot; /&gt;
+&lt;/workflow-app&gt;
+
+</pre></div></div>
+<div class="section">
+<h5>File Upload Script<a name="File_Upload_Script"></a></h5>
+<p>The script gets the file from hadoop, rsyncs it to /tmp on the remote host and deletes the file from hadoop.</p>
+<div class="source">
+<pre>
+#!/bin/bash
+
+trap 'echo &quot;output=$?&quot;; exit $?' ERR INT TERM
+
+echo &quot;Arguments: $@&quot;
+SRCFILE=$1
+DESTHOST=$3
+
+FILENAME=`basename $SRCFILE`
+rm -f /tmp/$FILENAME
+hadoop fs -copyToLocal $SRCFILE /tmp/
+echo &quot;Copied $SRCFILE to /tmp&quot;
+
+rsync -ztv --rsh=ssh --stats /tmp/$FILENAME $DESTHOST:/tmp
+echo &quot;rsynced $FILENAME to $DESTHOST:/tmp&quot;
+
+hadoop fs -rmr $SRCFILE
+echo &quot;Deleted $SRCFILE&quot;
+
+rm -f /tmp/$FILENAME
+echo &quot;output=0&quot;
+
+</pre></div></div>
+                  </div>
+          </div>
+
+    <hr/>
+
+    <footer>
+            <div class="container">
+              <div class="row span12">Copyright &copy;                    2013-2015
+                        <a href="http://www.apache.org">Apache Software Foundation</a>.
+            All Rights Reserved.      
+                    
+      </div>
+
+                          
+                <p id="poweredBy" class="pull-right">
+                          <a href="http://maven.apache.org/" title="Built by Maven" class="poweredBy">
+        <img class="builtBy" alt="Built by Maven" src="./images/logos/maven-feather.png" />
+      </a>
+              </p>
+        
+                </div>
+    </footer>
+  </body>
+</html>

http://git-wip-us.apache.org/repos/asf/falcon/blob/8609ffd6/site/0.5-incubating/PrismSetup.png
----------------------------------------------------------------------
diff --git a/site/0.5-incubating/PrismSetup.png b/site/0.5-incubating/PrismSetup.png
new file mode 100644
index 0000000..b0dc9a5
Binary files /dev/null and b/site/0.5-incubating/PrismSetup.png differ

http://git-wip-us.apache.org/repos/asf/falcon/blob/8609ffd6/site/0.5-incubating/ProcessSchedule.png
----------------------------------------------------------------------
diff --git a/site/0.5-incubating/ProcessSchedule.png b/site/0.5-incubating/ProcessSchedule.png
new file mode 100644
index 0000000..a7dd788
Binary files /dev/null and b/site/0.5-incubating/ProcessSchedule.png differ

http://git-wip-us.apache.org/repos/asf/falcon/blob/8609ffd6/site/0.5-incubating/Security.html
----------------------------------------------------------------------
diff --git a/site/0.5-incubating/Security.html b/site/0.5-incubating/Security.html
new file mode 100644
index 0000000..2db4e7d
--- /dev/null
+++ b/site/0.5-incubating/Security.html
@@ -0,0 +1,289 @@
+<!DOCTYPE html>
+<!--
+ | Generated by Apache Maven Doxia at 2015-11-30
+ | Rendered using Apache Maven Fluido Skin 1.3.0
+-->
+<html xmlns="http://www.w3.org/1999/xhtml" xml:lang="en" lang="en">
+  <head>
+    <meta charset="UTF-8" />
+    <meta name="viewport" content="width=device-width, initial-scale=1.0" />
+    <meta name="Date-Revision-yyyymmdd" content="20151130" />
+    <meta http-equiv="Content-Language" content="en" />
+    <title>Falcon - Securing Falcon</title>
+    <link rel="stylesheet" href="./css/apache-maven-fluido-1.3.0.min.css" />
+    <link rel="stylesheet" href="./css/site.css" />
+    <link rel="stylesheet" href="./css/print.css" media="print" />
+
+      
+    <script type="text/javascript" src="./js/apache-maven-fluido-1.3.0.min.js"></script>
+
+    
+            </head>
+        <body class="topBarDisabled">
+          
+                        
+                    
+    
+        <div class="container">
+          <div id="banner">
+        <div class="pull-left">
+                                    <a href="http://falcon.incubator.apache.org/index.html" id="bannerLeft">
+                                                                                                <img src="images/falcon-logo.png"  alt="Falcon" width="200px" height="45px"/>
+                </a>
+                      </div>
+        <div class="pull-right">                  <a href="http://incubator.apache.org" id="bannerRight">
+                                                                                                <img src="images/apache-incubator-logo.png"  alt="Apache Incubator"/>
+                </a>
+      </div>
+        <div class="clear"><hr/></div>
+      </div>
+
+      <div id="breadcrumbs">
+        <ul class="breadcrumb">
+                
+                    
+                              <li class="">
+                    <a href="index.html" title="Home">
+        Home</a>
+        </li>
+      <li class="divider ">/</li>
+        <li class="">Securing Falcon</li>
+        
+                
+                    
+      
+                                              
+    <li class="pull-right">              <a href="http://s.apache.org/falcon-0.5-release-notes" class="externalLink" title="Released: 2014-09-22">
+        Released: 2014-09-22</a>
+  </li>
+
+        <li class="divider pull-right">|</li>
+      
+    <li class="pull-right">              <a href="http://archive.apache.org/dist/incubator/falcon/" class="externalLink" title="0.5-incubating">
+        0.5-incubating</a>
+  </li>
+
+                        </ul>
+      </div>
+
+      
+                
+        <div id="bodyColumn" >
+                                  
+            <div class="section">
+<h2>Securing Falcon<a name="Securing_Falcon"></a></h2></div>
+<div class="section">
+<h3>Overview<a name="Overview"></a></h3>
+<p>Apache Falcon enforces authentication on protected resources. Once authentication has been established it sets a signed HTTP Cookie that contains an authentication token with the user name, user principal, authentication type and expiration time.</p>
+<p>It does so by using <a class="externalLink" href="http://hadoop.apache.org/docs/current/hadoop-auth/index.html">Hadoop Auth</a>. Hadoop Auth is a Java library consisting of client and server components that enable Kerberos SPNEGO authentication for HTTP. Hadoop Auth also supports additional authentication mechanisms on the client and the server side via two simple interfaces.</p></div>
+<div class="section">
+<h3>Authentication Methods<a name="Authentication_Methods"></a></h3>
+<p>Falcon supports two authentication methods out of the box: simple and kerberos.</p></div>
+<div class="section">
+<h4>Pseudo/Simple Authentication<a name="PseudoSimple_Authentication"></a></h4>
+<p>Falcon authenticates the user by simply trusting the value of the query string parameter 'user.name'. This is the default mode Falcon is configured with.</p></div>
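+<p>In this mode a request only needs to carry the user name as a query parameter, as in the curl example later on this page:</p>
+<div class="source">
+<pre>
+curl http://localhost:15000/api/admin/version?user.name=venkatesh
+</pre></div>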
+<div class="section">
+<h4>Kerberos Authentication<a name="Kerberos_Authentication"></a></h4>
+<p>Falcon uses HTTP Kerberos SPNEGO to authenticate the user.</p></div>
+<div class="section">
+<h3>Server Side Configuration Setup<a name="Server_Side_Configuration_Setup"></a></h3></div>
+<div class="section">
+<h4>Common Configuration Parameters<a name="Common_Configuration_Parameters"></a></h4>
+<div class="source">
+<pre>
+# Authentication type must be specified: simple|kerberos
+*.falcon.authentication.type=kerberos
+
+</pre></div></div>
+<div class="section">
+<h4>Kerberos Configuration<a name="Kerberos_Configuration"></a></h4>
+<div class="source">
+<pre>
+##### Service Configuration
+
+# Indicates the Kerberos principal to be used in Falcon Service.
+*.falcon.service.authentication.kerberos.principal=falcon/_HOST@EXAMPLE.COM
+
+# Location of the keytab file with the credentials for the Service principal.
+*.falcon.service.authentication.kerberos.keytab=/etc/security/keytabs/falcon.service.keytab
+
+# name node principal to talk to config store
+*.dfs.namenode.kerberos.principal=nn/_HOST@EXAMPLE.COM
+
+##### SPNEGO Configuration
+
+# Authentication type must be specified: simple|kerberos|&lt;class&gt;
+# org.apache.falcon.security.RemoteUserInHeaderBasedAuthenticationHandler can be used for backwards compatibility
+*.falcon.http.authentication.type=kerberos
+
+# Indicates how long (in seconds) an authentication token is valid before it has to be renewed.
+*.falcon.http.authentication.token.validity=36000
+
+# The signature secret for signing the authentication tokens.
+*.falcon.http.authentication.signature.secret=falcon
+
+# The domain to use for the HTTP cookie that stores the authentication token.
+*.falcon.http.authentication.cookie.domain=
+
+# Indicates if anonymous requests are allowed when using 'simple' authentication.
+*.falcon.http.authentication.simple.anonymous.allowed=true
+
+# Indicates the Kerberos principal to be used for HTTP endpoint.
+# The principal MUST start with 'HTTP/' as per Kerberos HTTP SPNEGO specification.
+*.falcon.http.authentication.kerberos.principal=HTTP/_HOST@EXAMPLE.COM
+
+# Location of the keytab file with the credentials for the HTTP principal.
+*.falcon.http.authentication.kerberos.keytab=/etc/security/keytabs/spnego.service.keytab
+
+# The kerberos name rules are used to resolve kerberos principal names; refer to Hadoop's KerberosName for more details.
+*.falcon.http.authentication.kerberos.name.rules=DEFAULT
+
+# Comma separated list of black listed users
+*.falcon.http.authentication.blacklisted.users=
+
+</pre></div></div>
+<div class="section">
+<h4>Pseudo/Simple Configuration<a name="PseudoSimple_Configuration"></a></h4>
+<div class="source">
+<pre>
+##### SPNEGO Configuration
+
+# Authentication type must be specified: simple|kerberos|&lt;class&gt;
+# org.apache.falcon.security.RemoteUserInHeaderBasedAuthenticationHandler can be used for backwards compatibility
+*.falcon.http.authentication.type=simple
+
+# Indicates how long (in seconds) an authentication token is valid before it has to be renewed.
+*.falcon.http.authentication.token.validity=36000
+
+# The signature secret for signing the authentication tokens.
+*.falcon.http.authentication.signature.secret=falcon
+
+# The domain to use for the HTTP cookie that stores the authentication token.
+*.falcon.http.authentication.cookie.domain=
+
+# Indicates if anonymous requests are allowed when using 'simple' authentication.
+*.falcon.http.authentication.simple.anonymous.allowed=true
+
+# Comma separated list of black listed users
+*.falcon.http.authentication.blacklisted.users=
+
+</pre></div></div>
+<div class="section">
+<h4>SSL Configuration<a name="SSL_Configuration"></a></h4>
+<div class="source">
+<pre>
+*.falcon.enableTLS=true
+*.keystore.file=/path/to/keystore/file
+*.keystore.password=password
+
+</pre></div></div>
+<div class="section">
+<h4>Distributed Falcon Setup<a name="Distributed_Falcon_Setup"></a></h4>
+<p>Falcon should be configured to communicate with Prism over TLS in secure mode. This is not enabled by default.</p></div>
+<div class="section">
+<h3>Changes to ownership and permissions of directories managed by Falcon<a name="Changes_to_ownership_and_permissions_of_directories_managed_by_Falcon"></a></h3>
+<p></p>
+<table border="0" class="table table-striped">
+<tr class="a">
+<th>Directory</th>
+<th>Location</th>
+<th>Owner</th>
+<th>Permissions</th></tr>
+<tr class="b">
+<td>Configuration Store</td>
+<td>${config.store.uri}</td>
+<td>falcon</td>
+<td>750</td></tr>
+<tr class="a">
+<td>Oozie coord/bundle XMLs</td>
+<td>${cluster.staging-location}/workflows/{entity}/{entity-name}</td>
+<td>falcon</td>
+<td>644</td></tr>
+<tr class="b">
+<td>Shared libs</td>
+<td>{cluster.working}/{lib,libext}</td>
+<td>falcon</td>
+<td>755</td></tr>
+<tr class="a">
+<td>App logs</td>
+<td>${cluster.staging-location}/workflows/{entity}/{entity-name}/logs</td>
+<td>falcon</td>
+<td>777</td></tr></table></div>
+<div class="section">
+<h3>Backwards compatibility<a name="Backwards_compatibility"></a></h3></div>
+<div class="section">
+<h4>Scheduled Entities<a name="Scheduled_Entities"></a></h4>
+<p>Entities already scheduled with an earlier version of Falcon are not compatible with this version</p></div>
+<div class="section">
+<h4>Falcon Clients<a name="Falcon_Clients"></a></h4>
+<p>Older Falcon clients are backwards compatible with respect to authentication; user information sent as part of the HTTP header (Remote-User) is still honoured when the authentication type is configured as below:</p>
+<div class="source">
+<pre>
+*.falcon.http.authentication.type=org.apache.falcon.security.RemoteUserInHeaderBasedAuthenticationHandler
+
+</pre></div></div>
+<div class="section">
+<h4>Blacklisted super users for authentication<a name="Blacklisted_super_users_for_authentication"></a></h4>
+<p>The blacklist of users previously contained the following super users: hdfs, mapreduce, oozie, and falcon. The list has been externalized from code into the startup.properties file; it is now empty by default and must be configured explicitly in that file (see the example below).</p></div>
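+<p>For example, to restore the earlier behaviour the property can be configured explicitly in startup.properties (shown here only as an illustration):</p>
+<div class="source">
+<pre>
+*.falcon.http.authentication.blacklisted.users=hdfs,mapreduce,oozie,falcon
+</pre></div>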
+<div class="section">
+<h4>Falcon Dashboard<a name="Falcon_Dashboard"></a></h4>
+<p>The dashboard assumes an anonymous user in Pseudo/Simple method and hence anonymous users must be enabled for it to work.</p>
+<div class="source">
+<pre>
+# Indicates if anonymous requests are allowed when using 'simple' authentication.
+*.falcon.http.authentication.simple.anonymous.allowed=true
+
+</pre></div>
+<p>In Kerberos method, the browser must support HTTP Kerberos SPNEGO.</p></div>
+<div class="section">
+<h3>Known Limitations<a name="Known_Limitations"></a></h3>
+<p></p>
+<ul>
+<li>ActiveMQ topics are not secure yet, but will be in the near future.</li>
+<li>Entities already scheduled with an earlier version of Falcon are not compatible with this version, as new workflow parameters, such as the user, are now required and passed back into Falcon.</li>
+<li>Use of hftp as the scheme for the read-only interface in the cluster entity <a class="externalLink" href="https://issues.apache.org/jira/browse/HADOOP-10215">will not work in Oozie</a>. The alternative is to use the webhdfs scheme instead; it has been tested with <a href="./DistCp.html">DistCp</a>.</li></ul></div>
+<div class="section">
+<h3>Examples<a name="Examples"></a></h3></div>
+<div class="section">
+<h4>Accessing the server using Falcon CLI (Java client)<a name="Accessing_the_server_using_Falcon_CLI_Java_client"></a></h4>
+<p>There is no change in the way the CLI is used. The CLI has been changed to work with the configured authentication method.</p></div>
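+<p>For example, the admin call from the installation steps continues to work unchanged against a secured server:</p>
+<div class="source">
+<pre>
+bin/falcon admin -version
+</pre></div>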
+<div class="section">
+<h4>Accessing the server using curl<a name="Accessing_the_server_using_curl"></a></h4>
+<p>Try accessing the protected resources using curl, for example:</p>
+<div class="source">
+<pre>
+$ kinit
+Please enter the password for venkatesh@LOCALHOST:
+
+$ curl http://localhost:15000/api/admin/version
+
+$ curl http://localhost:15000/api/admin/version?user.name=venkatesh
+
+$ curl --negotiate -u foo -b ~/cookiejar.txt -c ~/cookiejar.txt http://localhost:15000/api/admin/version
+
+</pre></div></div>
+                  </div>
+          </div>
+
+    <hr/>
+
+    <footer>
+            <div class="container">
+              <div class="row span12">Copyright &copy;                    2013-2015
+                        <a href="http://www.apache.org">Apache Software Foundation</a>.
+            All Rights Reserved.      
+                    
+      </div>
+
+                          
+                <p id="poweredBy" class="pull-right">
+                          <a href="http://maven.apache.org/" title="Built by Maven" class="poweredBy">
+        <img class="builtBy" alt="Built by Maven" src="./images/logos/maven-feather.png" />
+      </a>
+              </p>
+        
+                </div>
+    </footer>
+  </body>
+</html>