Posted to commits@nifi.apache.org by al...@apache.org on 2018/07/17 01:35:44 UTC

svn commit: r1836075 [10/39] - in /nifi/site/trunk/docs/nifi-docs: ./ components/org.apache.nifi/nifi-ambari-nar/1.7.1/ components/org.apache.nifi/nifi-ambari-nar/1.7.1/org.apache.nifi.reporting.ambari.AmbariReportingTask/ components/org.apache.nifi/ni...

Added: nifi/site/trunk/docs/nifi-docs/components/org.apache.nifi/nifi-hadoop-nar/1.7.1/org.apache.nifi.processors.hadoop.CreateHadoopSequenceFile/index.html
URL: http://svn.apache.org/viewvc/nifi/site/trunk/docs/nifi-docs/components/org.apache.nifi/nifi-hadoop-nar/1.7.1/org.apache.nifi.processors.hadoop.CreateHadoopSequenceFile/index.html?rev=1836075&view=auto
==============================================================================
--- nifi/site/trunk/docs/nifi-docs/components/org.apache.nifi/nifi-hadoop-nar/1.7.1/org.apache.nifi.processors.hadoop.CreateHadoopSequenceFile/index.html (added)
+++ nifi/site/trunk/docs/nifi-docs/components/org.apache.nifi/nifi-hadoop-nar/1.7.1/org.apache.nifi.processors.hadoop.CreateHadoopSequenceFile/index.html Tue Jul 17 01:35:38 2018
@@ -0,0 +1,3 @@
+<!DOCTYPE html><html lang="en"><head><meta charset="utf-8"></meta><title>CreateHadoopSequenceFile</title><link rel="stylesheet" href="../../../../../css/component-usage.css" type="text/css"></link></head><script type="text/javascript">window.onload = function(){if(self==top) { document.getElementById('nameHeader').style.display = "inherit"; } }</script><body><h1 id="nameHeader" style="display: none;">CreateHadoopSequenceFile</h1><h2>Description: </h2><p>Creates Hadoop Sequence Files from incoming flow files</p><p><a href="additionalDetails.html">Additional Details...</a></p><h3>Tags: </h3><p>hadoop, sequence file, create, sequencefile</p><h3>Properties: </h3><p>In the list below, the names of required properties appear in <strong>bold</strong>. Any other properties (not in bold) are considered optional. The table also indicates any default values, and whether a property supports the <a href="../../../../../html/expression-language-guide.html">NiFi Expression Language</a>.</p><table 
 id="properties"><tr><th>Name</th><th>Default Value</th><th>Allowable Values</th><th>Description</th></tr><tr><td id="name">Hadoop Configuration Resources</td><td id="default-value"></td><td id="allowable-values"></td><td id="description">A file or comma separated list of files which contains the Hadoop file system configuration. Without this, Hadoop will search the classpath for a 'core-site.xml' and 'hdfs-site.xml' file or will revert to a default configuration. To use swebhdfs, see 'Additional Details' section of PutHDFS's documentation.<br/><strong>Supports Expression Language: true (will be evaluated using variable registry only)</strong></td></tr><tr><td id="name">Kerberos Credentials Service</td><td id="default-value"></td><td id="allowable-values"><strong>Controller Service API: </strong><br/>KerberosCredentialsService<br/><strong>Implementation: </strong><a href="../../../nifi-kerberos-credentials-service-nar/1.7.1/org.apache.nifi.kerberos.KeytabCredentialsService/index.html
 ">KeytabCredentialsService</a></td><td id="description">Specifies the Kerberos Credentials Controller Service that should be used for authenticating with Kerberos</td></tr><tr><td id="name">Kerberos Principal</td><td id="default-value"></td><td id="allowable-values"></td><td id="description">Kerberos principal to authenticate as. Requires nifi.kerberos.krb5.file to be set in your nifi.properties<br/><strong>Supports Expression Language: true (will be evaluated using variable registry only)</strong></td></tr><tr><td id="name">Kerberos Keytab</td><td id="default-value"></td><td id="allowable-values"></td><td id="description">Kerberos keytab associated with the principal. Requires nifi.kerberos.krb5.file to be set in your nifi.properties<br/><strong>Supports Expression Language: true (will be evaluated using variable registry only)</strong></td></tr><tr><td id="name">Kerberos Relogin Period</td><td id="default-value">4 hours</td><td id="allowable-values"></td><td id="description">Perio
 d of time which should pass before attempting a kerberos relogin.
+
+This property has been deprecated, and has no effect on processing. Relogins now occur automatically.<br/><strong>Supports Expression Language: true (will be evaluated using variable registry only)</strong></td></tr><tr><td id="name">Additional Classpath Resources</td><td id="default-value"></td><td id="allowable-values"></td><td id="description">A comma-separated list of paths to files and/or directories that will be added to the classpath. When specifying a directory, all files within the directory will be added to the classpath, but further sub-directories will not be included.</td></tr><tr><td id="name">Compression type</td><td id="default-value"></td><td id="allowable-values"><ul><li>NONE</li><li>RECORD</li><li>BLOCK</li></ul></td><td id="description">Type of compression to use when creating Sequence File</td></tr><tr><td id="name"><strong>Compression codec</strong></td><td id="default-value">NONE</td><td id="allowable-values"><ul><li>NONE <img src="../../../../../html/images/
 iconInfo.png" alt="No compression" title="No compression"></img></li><li>DEFAULT <img src="../../../../../html/images/iconInfo.png" alt="Default ZLIB compression" title="Default ZLIB compression"></img></li><li>BZIP <img src="../../../../../html/images/iconInfo.png" alt="BZIP compression" title="BZIP compression"></img></li><li>GZIP <img src="../../../../../html/images/iconInfo.png" alt="GZIP compression" title="GZIP compression"></img></li><li>LZ4 <img src="../../../../../html/images/iconInfo.png" alt="LZ4 compression" title="LZ4 compression"></img></li><li>LZO <img src="../../../../../html/images/iconInfo.png" alt="LZO compression - it assumes LD_LIBRARY_PATH has been set and jar is available" title="LZO compression - it assumes LD_LIBRARY_PATH has been set and jar is available"></img></li><li>SNAPPY <img src="../../../../../html/images/iconInfo.png" alt="Snappy compression" title="Snappy compression"></img></li><li>AUTOMATIC <img src="../../../../../html/images/iconInfo.png" alt=
 "Will attempt to automatically detect the compression codec." title="Will attempt to automatically detect the compression codec."></img></li></ul></td><td id="description">No Description Provided.</td></tr></table><h3>Relationships: </h3><table id="relationships"><tr><th>Name</th><th>Description</th></tr><tr><td>success</td><td>Generated Sequence Files are sent to this relationship</td></tr><tr><td>failure</td><td>Incoming files that failed to generate a Sequence File are sent to this relationship</td></tr></table><h3>Reads Attributes: </h3>None specified.<h3>Writes Attributes: </h3>None specified.<h3>State management: </h3>This component does not store state.<h3>Restricted: </h3>This component is not restricted.<h3>Input requirement: </h3>This component requires an incoming relationship.<h3>System Resource Considerations:</h3>None specified.<h3>See Also:</h3><p><a href="../org.apache.nifi.processors.hadoop.PutHDFS/index.html">PutHDFS</a></p></body></html>
\ No newline at end of file
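
[Editorial aside, not part of this commit: the page above documents a processor that packages FlowFile content into Hadoop SequenceFiles. As a minimal, hypothetical sketch of how such an output file could be read back with the standard Hadoop 2.x client API (the path /data/out.seq is an example only):]

    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.fs.Path;
    import org.apache.hadoop.io.SequenceFile;
    import org.apache.hadoop.io.Writable;
    import org.apache.hadoop.util.ReflectionUtils;

    public class ReadSequenceFileExample {
        public static void main(String[] args) throws Exception {
            Configuration conf = new Configuration();   // picks up core-site.xml/hdfs-site.xml from the classpath
            Path path = new Path("/data/out.seq");      // hypothetical SequenceFile produced upstream
            try (SequenceFile.Reader reader =
                     new SequenceFile.Reader(conf, SequenceFile.Reader.file(path))) {
                Writable key = (Writable) ReflectionUtils.newInstance(reader.getKeyClass(), conf);
                Writable value = (Writable) ReflectionUtils.newInstance(reader.getValueClass(), conf);
                while (reader.next(key, value)) {        // iterate the key/value records in the file
                    System.out.println(key + " -> " + value);
                }
            }
        }
    }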

Added: nifi/site/trunk/docs/nifi-docs/components/org.apache.nifi/nifi-hadoop-nar/1.7.1/org.apache.nifi.processors.hadoop.DeleteHDFS/index.html
URL: http://svn.apache.org/viewvc/nifi/site/trunk/docs/nifi-docs/components/org.apache.nifi/nifi-hadoop-nar/1.7.1/org.apache.nifi.processors.hadoop.DeleteHDFS/index.html?rev=1836075&view=auto
==============================================================================
--- nifi/site/trunk/docs/nifi-docs/components/org.apache.nifi/nifi-hadoop-nar/1.7.1/org.apache.nifi.processors.hadoop.DeleteHDFS/index.html (added)
+++ nifi/site/trunk/docs/nifi-docs/components/org.apache.nifi/nifi-hadoop-nar/1.7.1/org.apache.nifi.processors.hadoop.DeleteHDFS/index.html Tue Jul 17 01:35:38 2018
@@ -0,0 +1,3 @@
+<!DOCTYPE html><html lang="en"><head><meta charset="utf-8"></meta><title>DeleteHDFS</title><link rel="stylesheet" href="../../../../../css/component-usage.css" type="text/css"></link></head><script type="text/javascript">window.onload = function(){if(self==top) { document.getElementById('nameHeader').style.display = "inherit"; } }</script><body><h1 id="nameHeader" style="display: none;">DeleteHDFS</h1><h2>Description: </h2><p>Deletes one or more files or directories from HDFS. The path can be provided as an attribute from an incoming FlowFile, or a statically set path that is periodically removed. If this processor has an incoming connection, it will ignore running on a periodic basis and instead rely on incoming FlowFiles to trigger a delete. Note that you may use a wildcard character to match multiple files or directories. If there are no incoming connections, no flowfiles will be transferred to any output relationships.  If there is an incoming flowfile then, provided there are no
 detected failures, it will be transferred to success; otherwise it will be sent to failure. If knowledge of globbed files deleted is necessary, use ListHDFS first to produce a specific list of files to delete. </p><h3>Tags: </h3><p>hadoop, HDFS, delete, remove, filesystem</p><h3>Properties: </h3><p>In the list below, the names of required properties appear in <strong>bold</strong>. Any other properties (not in bold) are considered optional. The table also indicates any default values, and whether a property supports the <a href="../../../../../html/expression-language-guide.html">NiFi Expression Language</a>.</p><table id="properties"><tr><th>Name</th><th>Default Value</th><th>Allowable Values</th><th>Description</th></tr><tr><td id="name">Hadoop Configuration Resources</td><td id="default-value"></td><td id="allowable-values"></td><td id="description">A file or comma separated list of files which contains the Hadoop file system configuration. Without this, Hadoop will search the classpath
  for a 'core-site.xml' and 'hdfs-site.xml' file or will revert to a default configuration. To use swebhdfs, see 'Additional Details' section of PutHDFS's documentation.<br/><strong>Supports Expression Language: true (will be evaluated using variable registry only)</strong></td></tr><tr><td id="name">Kerberos Credentials Service</td><td id="default-value"></td><td id="allowable-values"><strong>Controller Service API: </strong><br/>KerberosCredentialsService<br/><strong>Implementation: </strong><a href="../../../nifi-kerberos-credentials-service-nar/1.7.1/org.apache.nifi.kerberos.KeytabCredentialsService/index.html">KeytabCredentialsService</a></td><td id="description">Specifies the Kerberos Credentials Controller Service that should be used for authenticating with Kerberos</td></tr><tr><td id="name">Kerberos Principal</td><td id="default-value"></td><td id="allowable-values"></td><td id="description">Kerberos principal to authenticate as. Requires nifi.kerberos.krb5.file to be set in
  your nifi.properties<br/><strong>Supports Expression Language: true (will be evaluated using variable registry only)</strong></td></tr><tr><td id="name">Kerberos Keytab</td><td id="default-value"></td><td id="allowable-values"></td><td id="description">Kerberos keytab associated with the principal. Requires nifi.kerberos.krb5.file to be set in your nifi.properties<br/><strong>Supports Expression Language: true (will be evaluated using variable registry only)</strong></td></tr><tr><td id="name">Kerberos Relogin Period</td><td id="default-value">4 hours</td><td id="allowable-values"></td><td id="description">Period of time which should pass before attempting a kerberos relogin.
+
+This property has been deprecated, and has no effect on processing. Relogins now occur automatically.<br/><strong>Supports Expression Language: true (will be evaluated using variable registry only)</strong></td></tr><tr><td id="name">Additional Classpath Resources</td><td id="default-value"></td><td id="allowable-values"></td><td id="description">A comma-separated list of paths to files and/or directories that will be added to the classpath. When specifying a directory, all files within the directory will be added to the classpath, but further sub-directories will not be included.</td></tr><tr><td id="name"><strong>Path</strong></td><td id="default-value"></td><td id="allowable-values"></td><td id="description">The HDFS file or directory to delete. A wildcard expression may be used to only delete certain files<br/><strong>Supports Expression Language: true (will be evaluated using flow file attributes and variable registry)</strong></td></tr><tr><td id="name"><strong>Recursive</str
 ong></td><td id="default-value">true</td><td id="allowable-values"><ul><li>true</li><li>false</li></ul></td><td id="description">Remove contents of a non-empty directory recursively</td></tr></table><h3>Relationships: </h3><table id="relationships"><tr><th>Name</th><th>Description</th></tr><tr><td>success</td><td>When an incoming flowfile is used then if there are no errors invoking delete the flowfile will route here.</td></tr><tr><td>failure</td><td>When an incoming flowfile is used and there is a failure while deleting then the flowfile will route here.</td></tr></table><h3>Reads Attributes: </h3>None specified.<h3>Writes Attributes: </h3><table id="writes-attributes"><tr><th>Name</th><th>Description</th></tr><tr><td>hdfs.filename</td><td>HDFS file to be deleted. If multiple files are deleted, then only the last filename is set.</td></tr><tr><td>hdfs.path</td><td>HDFS Path specified in the delete request. If multiple paths are deleted, then only the last path is set.</td></tr><tr
 ><td>hdfs.error.message</td><td>HDFS error message related to the hdfs.error.code</td></tr></table><h3>State management: </h3>This component does not store state.<h3>Restricted: </h3><table id="restrictions"><tr><th>Required Permission</th><th>Explanation</th></tr><tr><td>write filesystem</td><td>Provides operator the ability to delete any file that NiFi has access to in HDFS or the local filesystem.</td></tr></table><h3>Input requirement: </h3>This component allows an incoming relationship.<h3>System Resource Considerations:</h3>None specified.<h3>See Also:</h3><p><a href="../org.apache.nifi.processors.hadoop.ListHDFS/index.html">ListHDFS</a>, <a href="../org.apache.nifi.processors.hadoop.PutHDFS/index.html">PutHDFS</a></p></body></html>
\ No newline at end of file
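
[Editorial aside, not part of this commit: the wildcard delete behavior described above corresponds roughly to glob expansion plus a recursive delete in the Hadoop FileSystem API. A sketch under assumed example values (the pattern /tmp/staging/*.csv and the recursive flag are illustrative only):]

    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.fs.FileStatus;
    import org.apache.hadoop.fs.FileSystem;
    import org.apache.hadoop.fs.Path;

    public class DeleteGlobExample {
        public static void main(String[] args) throws Exception {
            Configuration conf = new Configuration();   // loads core-site.xml/hdfs-site.xml from the classpath
            try (FileSystem fs = FileSystem.get(conf)) {
                // Expand a wildcard much like a shell glob; example pattern only.
                FileStatus[] matches = fs.globStatus(new Path("/tmp/staging/*.csv"));
                if (matches != null) {
                    for (FileStatus status : matches) {
                        // true = recursive, comparable to the Recursive property above
                        boolean deleted = fs.delete(status.getPath(), true);
                        System.out.println(status.getPath() + " deleted=" + deleted);
                    }
                }
            }
        }
    }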

Added: nifi/site/trunk/docs/nifi-docs/components/org.apache.nifi/nifi-hadoop-nar/1.7.1/org.apache.nifi.processors.hadoop.FetchHDFS/index.html
URL: http://svn.apache.org/viewvc/nifi/site/trunk/docs/nifi-docs/components/org.apache.nifi/nifi-hadoop-nar/1.7.1/org.apache.nifi.processors.hadoop.FetchHDFS/index.html?rev=1836075&view=auto
==============================================================================
--- nifi/site/trunk/docs/nifi-docs/components/org.apache.nifi/nifi-hadoop-nar/1.7.1/org.apache.nifi.processors.hadoop.FetchHDFS/index.html (added)
+++ nifi/site/trunk/docs/nifi-docs/components/org.apache.nifi/nifi-hadoop-nar/1.7.1/org.apache.nifi.processors.hadoop.FetchHDFS/index.html Tue Jul 17 01:35:38 2018
@@ -0,0 +1,3 @@
+<!DOCTYPE html><html lang="en"><head><meta charset="utf-8"></meta><title>FetchHDFS</title><link rel="stylesheet" href="../../../../../css/component-usage.css" type="text/css"></link></head><script type="text/javascript">window.onload = function(){if(self==top) { document.getElementById('nameHeader').style.display = "inherit"; } }</script><body><h1 id="nameHeader" style="display: none;">FetchHDFS</h1><h2>Description: </h2><p>Retrieves a file from HDFS. The content of the incoming FlowFile is replaced by the content of the file in HDFS. The file in HDFS is left intact without any changes being made to it.</p><h3>Tags: </h3><p>hadoop, hdfs, get, ingest, fetch, source</p><h3>Properties: </h3><p>In the list below, the names of required properties appear in <strong>bold</strong>. Any other properties (not in bold) are considered optional. The table also indicates any default values, and whether a property supports the <a href="../../../../../html/expression-language-guide.html">NiFi Expre
 ssion Language</a>.</p><table id="properties"><tr><th>Name</th><th>Default Value</th><th>Allowable Values</th><th>Description</th></tr><tr><td id="name">Hadoop Configuration Resources</td><td id="default-value"></td><td id="allowable-values"></td><td id="description">A file or comma separated list of files which contains the Hadoop file system configuration. Without this, Hadoop will search the classpath for a 'core-site.xml' and 'hdfs-site.xml' file or will revert to a default configuration. To use swebhdfs, see 'Additional Details' section of PutHDFS's documentation.<br/><strong>Supports Expression Language: true (will be evaluated using variable registry only)</strong></td></tr><tr><td id="name">Kerberos Credentials Service</td><td id="default-value"></td><td id="allowable-values"><strong>Controller Service API: </strong><br/>KerberosCredentialsService<br/><strong>Implementation: </strong><a href="../../../nifi-kerberos-credentials-service-nar/1.7.1/org.apache.nifi.kerberos.Keyta
 bCredentialsService/index.html">KeytabCredentialsService</a></td><td id="description">Specifies the Kerberos Credentials Controller Service that should be used for authenticating with Kerberos</td></tr><tr><td id="name">Kerberos Principal</td><td id="default-value"></td><td id="allowable-values"></td><td id="description">Kerberos principal to authenticate as. Requires nifi.kerberos.krb5.file to be set in your nifi.properties<br/><strong>Supports Expression Language: true (will be evaluated using variable registry only)</strong></td></tr><tr><td id="name">Kerberos Keytab</td><td id="default-value"></td><td id="allowable-values"></td><td id="description">Kerberos keytab associated with the principal. Requires nifi.kerberos.krb5.file to be set in your nifi.properties<br/><strong>Supports Expression Language: true (will be evaluated using variable registry only)</strong></td></tr><tr><td id="name">Kerberos Relogin Period</td><td id="default-value">4 hours</td><td id="allowable-values"><
 /td><td id="description">Period of time which should pass before attempting a kerberos relogin.
+
+This property has been deprecated, and has no effect on processing. Relogins now occur automatically.<br/><strong>Supports Expression Language: true (will be evaluated using variable registry only)</strong></td></tr><tr><td id="name">Additional Classpath Resources</td><td id="default-value"></td><td id="allowable-values"></td><td id="description">A comma-separated list of paths to files and/or directories that will be added to the classpath. When specifying a directory, all files within the directory will be added to the classpath, but further sub-directories will not be included.</td></tr><tr><td id="name"><strong>HDFS Filename</strong></td><td id="default-value">${path}/${filename}</td><td id="allowable-values"></td><td id="description">The name of the HDFS file to retrieve<br/><strong>Supports Expression Language: true (will be evaluated using flow file attributes and variable registry)</strong></td></tr><tr><td id="name"><strong>Compression codec</strong></td><td id="default-va
 lue">NONE</td><td id="allowable-values"><ul><li>NONE <img src="../../../../../html/images/iconInfo.png" alt="No compression" title="No compression"></img></li><li>DEFAULT <img src="../../../../../html/images/iconInfo.png" alt="Default ZLIB compression" title="Default ZLIB compression"></img></li><li>BZIP <img src="../../../../../html/images/iconInfo.png" alt="BZIP compression" title="BZIP compression"></img></li><li>GZIP <img src="../../../../../html/images/iconInfo.png" alt="GZIP compression" title="GZIP compression"></img></li><li>LZ4 <img src="../../../../../html/images/iconInfo.png" alt="LZ4 compression" title="LZ4 compression"></img></li><li>LZO <img src="../../../../../html/images/iconInfo.png" alt="LZO compression - it assumes LD_LIBRARY_PATH has been set and jar is available" title="LZO compression - it assumes LD_LIBRARY_PATH has been set and jar is available"></img></li><li>SNAPPY <img src="../../../../../html/images/iconInfo.png" alt="Snappy compression" title="Snappy com
 pression"></img></li><li>AUTOMATIC <img src="../../../../../html/images/iconInfo.png" alt="Will attempt to automatically detect the compression codec." title="Will attempt to automatically detect the compression codec."></img></li></ul></td><td id="description">No Description Provided.</td></tr></table><h3>Relationships: </h3><table id="relationships"><tr><th>Name</th><th>Description</th></tr><tr><td>success</td><td>FlowFiles will be routed to this relationship once they have been updated with the content of the HDFS file</td></tr><tr><td>comms.failure</td><td>FlowFiles will be routed to this relationship if the content of the HDFS file cannot be retrieve due to a communications failure. This generally indicates that the Fetch should be tried again.</td></tr><tr><td>failure</td><td>FlowFiles will be routed to this relationship if the content of the HDFS file cannot be retrieved and trying again will likely not be helpful. This would occur, for instance, if the file is not found or i
 f there is a permissions issue</td></tr></table><h3>Reads Attributes: </h3>None specified.<h3>Writes Attributes: </h3><table id="writes-attributes"><tr><th>Name</th><th>Description</th></tr><tr><td>hdfs.failure.reason</td><td>When a FlowFile is routed to 'failure', this attribute is added indicating why the file could not be fetched from HDFS</td></tr></table><h3>State management: </h3>This component does not store state.<h3>Restricted: </h3><table id="restrictions"><tr><th>Required Permission</th><th>Explanation</th></tr><tr><td>read filesystem</td><td>Provides operator the ability to retrieve any file that NiFi has access to in HDFS or the local filesystem.</td></tr></table><h3>Input requirement: </h3>This component requires an incoming relationship.<h3>System Resource Considerations:</h3>None specified.<h3>See Also:</h3><p><a href="../org.apache.nifi.processors.hadoop.ListHDFS/index.html">ListHDFS</a>, <a href="../org.apache.nifi.processors.hadoop.GetHDFS/index.html">GetHDFS</a>,
  <a href="../org.apache.nifi.processors.hadoop.PutHDFS/index.html">PutHDFS</a></p></body></html>
\ No newline at end of file
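
[Editorial aside, not part of this commit: fetching a single file's bytes out of HDFS, which is roughly what FetchHDFS does when it replaces the FlowFile content, can be sketched with the FileSystem API. The source path and local destination below are hypothetical examples:]

    import java.io.OutputStream;
    import java.nio.file.Files;
    import java.nio.file.Paths;

    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.fs.FSDataInputStream;
    import org.apache.hadoop.fs.FileSystem;
    import org.apache.hadoop.fs.Path;
    import org.apache.hadoop.io.IOUtils;

    public class FetchFileExample {
        public static void main(String[] args) throws Exception {
            Configuration conf = new Configuration();
            Path source = new Path("/data/incoming/report.json");        // hypothetical HDFS file
            try (FileSystem fs = FileSystem.get(conf);
                 FSDataInputStream in = fs.open(source);                 // the HDFS file is only read, never modified
                 OutputStream out = Files.newOutputStream(Paths.get("report.json"))) {
                IOUtils.copyBytes(in, out, 4096, false);                 // stream the contents in 4 KB chunks
            }
        }
    }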

Added: nifi/site/trunk/docs/nifi-docs/components/org.apache.nifi/nifi-hadoop-nar/1.7.1/org.apache.nifi.processors.hadoop.GetHDFS/index.html
URL: http://svn.apache.org/viewvc/nifi/site/trunk/docs/nifi-docs/components/org.apache.nifi/nifi-hadoop-nar/1.7.1/org.apache.nifi.processors.hadoop.GetHDFS/index.html?rev=1836075&view=auto
==============================================================================
--- nifi/site/trunk/docs/nifi-docs/components/org.apache.nifi/nifi-hadoop-nar/1.7.1/org.apache.nifi.processors.hadoop.GetHDFS/index.html (added)
+++ nifi/site/trunk/docs/nifi-docs/components/org.apache.nifi/nifi-hadoop-nar/1.7.1/org.apache.nifi.processors.hadoop.GetHDFS/index.html Tue Jul 17 01:35:38 2018
@@ -0,0 +1,3 @@
+<!DOCTYPE html><html lang="en"><head><meta charset="utf-8"></meta><title>GetHDFS</title><link rel="stylesheet" href="../../../../../css/component-usage.css" type="text/css"></link></head><script type="text/javascript">window.onload = function(){if(self==top) { document.getElementById('nameHeader').style.display = "inherit"; } }</script><body><h1 id="nameHeader" style="display: none;">GetHDFS</h1><h2>Description: </h2><p>Fetch files from Hadoop Distributed File System (HDFS) into FlowFiles. This Processor will delete the file from HDFS after fetching it.</p><h3>Tags: </h3><p>hadoop, HDFS, get, fetch, ingest, source, filesystem</p><h3>Properties: </h3><p>In the list below, the names of required properties appear in <strong>bold</strong>. Any other properties (not in bold) are considered optional. The table also indicates any default values, and whether a property supports the <a href="../../../../../html/expression-language-guide.html">NiFi Expression Language</a>.</p><table id="prope
 rties"><tr><th>Name</th><th>Default Value</th><th>Allowable Values</th><th>Description</th></tr><tr><td id="name">Hadoop Configuration Resources</td><td id="default-value"></td><td id="allowable-values"></td><td id="description">A file or comma separated list of files which contains the Hadoop file system configuration. Without this, Hadoop will search the classpath for a 'core-site.xml' and 'hdfs-site.xml' file or will revert to a default configuration. To use swebhdfs, see 'Additional Details' section of PutHDFS's documentation.<br/><strong>Supports Expression Language: true (will be evaluated using variable registry only)</strong></td></tr><tr><td id="name">Kerberos Credentials Service</td><td id="default-value"></td><td id="allowable-values"><strong>Controller Service API: </strong><br/>KerberosCredentialsService<br/><strong>Implementation: </strong><a href="../../../nifi-kerberos-credentials-service-nar/1.7.1/org.apache.nifi.kerberos.KeytabCredentialsService/index.html">KeytabC
 redentialsService</a></td><td id="description">Specifies the Kerberos Credentials Controller Service that should be used for authenticating with Kerberos</td></tr><tr><td id="name">Kerberos Principal</td><td id="default-value"></td><td id="allowable-values"></td><td id="description">Kerberos principal to authenticate as. Requires nifi.kerberos.krb5.file to be set in your nifi.properties<br/><strong>Supports Expression Language: true (will be evaluated using variable registry only)</strong></td></tr><tr><td id="name">Kerberos Keytab</td><td id="default-value"></td><td id="allowable-values"></td><td id="description">Kerberos keytab associated with the principal. Requires nifi.kerberos.krb5.file to be set in your nifi.properties<br/><strong>Supports Expression Language: true (will be evaluated using variable registry only)</strong></td></tr><tr><td id="name">Kerberos Relogin Period</td><td id="default-value">4 hours</td><td id="allowable-values"></td><td id="description">Period of time
  which should pass before attempting a kerberos relogin.
+
+This property has been deprecated, and has no effect on processing. Relogins now occur automatically.<br/><strong>Supports Expression Language: true (will be evaluated using variable registry only)</strong></td></tr><tr><td id="name">Additional Classpath Resources</td><td id="default-value"></td><td id="allowable-values"></td><td id="description">A comma-separated list of paths to files and/or directories that will be added to the classpath. When specifying a directory, all files within the directory will be added to the classpath, but further sub-directories will not be included.</td></tr><tr><td id="name"><strong>Directory</strong></td><td id="default-value"></td><td id="allowable-values"></td><td id="description">The HDFS directory from which files should be read<br/><strong>Supports Expression Language: true (will be evaluated using variable registry only)</strong></td></tr><tr><td id="name"><strong>Recurse Subdirectories</strong></td><td id="default-value">true</td><td id="all
 owable-values"><ul><li>true</li><li>false</li></ul></td><td id="description">Indicates whether to pull files from subdirectories of the HDFS directory</td></tr><tr><td id="name"><strong>Keep Source File</strong></td><td id="default-value">false</td><td id="allowable-values"><ul><li>true</li><li>false</li></ul></td><td id="description">Determines whether to delete the file from HDFS after it has been successfully transferred. If true, the file will be fetched repeatedly. This is intended for testing only.</td></tr><tr><td id="name">File Filter Regex</td><td id="default-value"></td><td id="allowable-values"></td><td id="description">A Java Regular Expression for filtering Filenames; if a filter is supplied then only files whose names match that Regular Expression will be fetched, otherwise all files will be fetched</td></tr><tr><td id="name"><strong>Filter Match Name Only</strong></td><td id="default-value">true</td><td id="allowable-values"><ul><li>true</li><li>false</li></ul></td><t
 d id="description">If true then File Filter Regex will match on just the filename, otherwise subdirectory names will be included with filename in the regex comparison</td></tr><tr><td id="name"><strong>Ignore Dotted Files</strong></td><td id="default-value">true</td><td id="allowable-values"><ul><li>true</li><li>false</li></ul></td><td id="description">If true, files whose names begin with a dot (".") will be ignored</td></tr><tr><td id="name"><strong>Minimum File Age</strong></td><td id="default-value">0 sec</td><td id="allowable-values"></td><td id="description">The minimum age that a file must be in order to be pulled; any file younger than this amount of time (based on last modification date) will be ignored</td></tr><tr><td id="name">Maximum File Age</td><td id="default-value"></td><td id="allowable-values"></td><td id="description">The maximum age that a file must be in order to be pulled; any file older than this amount of time (based on last modification date) will be ignore
 d</td></tr><tr><td id="name"><strong>Polling Interval</strong></td><td id="default-value">0 sec</td><td id="allowable-values"></td><td id="description">Indicates how long to wait between performing directory listings</td></tr><tr><td id="name"><strong>Batch Size</strong></td><td id="default-value">100</td><td id="allowable-values"></td><td id="description">The maximum number of files to pull in each iteration, based on run schedule.</td></tr><tr><td id="name">IO Buffer Size</td><td id="default-value"></td><td id="allowable-values"></td><td id="description">Amount of memory to use to buffer file contents during IO. This overrides the Hadoop Configuration</td></tr><tr><td id="name"><strong>Compression codec</strong></td><td id="default-value">NONE</td><td id="allowable-values"><ul><li>NONE <img src="../../../../../html/images/iconInfo.png" alt="No compression" title="No compression"></img></li><li>DEFAULT <img src="../../../../../html/images/iconInfo.png" alt="Default ZLIB compression
 " title="Default ZLIB compression"></img></li><li>BZIP <img src="../../../../../html/images/iconInfo.png" alt="BZIP compression" title="BZIP compression"></img></li><li>GZIP <img src="../../../../../html/images/iconInfo.png" alt="GZIP compression" title="GZIP compression"></img></li><li>LZ4 <img src="../../../../../html/images/iconInfo.png" alt="LZ4 compression" title="LZ4 compression"></img></li><li>LZO <img src="../../../../../html/images/iconInfo.png" alt="LZO compression - it assumes LD_LIBRARY_PATH has been set and jar is available" title="LZO compression - it assumes LD_LIBRARY_PATH has been set and jar is available"></img></li><li>SNAPPY <img src="../../../../../html/images/iconInfo.png" alt="Snappy compression" title="Snappy compression"></img></li><li>AUTOMATIC <img src="../../../../../html/images/iconInfo.png" alt="Will attempt to automatically detect the compression codec." title="Will attempt to automatically detect the compression codec."></img></li></ul></td><td id="de
 scription">No Description Provided.</td></tr></table><h3>Relationships: </h3><table id="relationships"><tr><th>Name</th><th>Description</th></tr><tr><td>success</td><td>All files retrieved from HDFS are transferred to this relationship</td></tr></table><h3>Reads Attributes: </h3>None specified.<h3>Writes Attributes: </h3><table id="writes-attributes"><tr><th>Name</th><th>Description</th></tr><tr><td>filename</td><td>The name of the file that was read from HDFS.</td></tr><tr><td>path</td><td>The path is set to the relative path of the file's directory on HDFS. For example, if the Directory property is set to /tmp, then files picked up from /tmp will have the path attribute set to "./". If the Recurse Subdirectories property is set to true and a file is picked up from /tmp/abc/1/2/3, then the path attribute will be set to "abc/1/2/3".</td></tr></table><h3>State management: </h3>This component does not store state.<h3>Restricted: </h3><table id="restrictions"><tr><th>Required Permissio
 n</th><th>Explanation</th></tr><tr><td>read filesystem</td><td>Provides operator the ability to retrieve any file that NiFi has access to in HDFS or the local filesystem.</td></tr><tr><td>write filesystem</td><td>Provides operator the ability to delete any file that NiFi has access to in HDFS or the local filesystem.</td></tr></table><h3>Input requirement: </h3>This component does not allow an incoming relationship.<h3>System Resource Considerations:</h3>None specified.<h3>See Also:</h3><p><a href="../org.apache.nifi.processors.hadoop.PutHDFS/index.html">PutHDFS</a>, <a href="../org.apache.nifi.processors.hadoop.ListHDFS/index.html">ListHDFS</a></p></body></html>
\ No newline at end of file
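
[Editorial aside, not part of this commit: the 'Hadoop Configuration Resources' property that appears in each of these tables corresponds to adding explicit site files to a Hadoop Configuration object rather than relying on whatever is on the classpath. A minimal sketch, assuming the files live under /etc/hadoop/conf (an example location):]

    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.fs.FileSystem;
    import org.apache.hadoop.fs.Path;

    public class ExplicitConfigExample {
        public static void main(String[] args) throws Exception {
            // Rough equivalent of listing files in 'Hadoop Configuration Resources'.
            Configuration conf = new Configuration();
            conf.addResource(new Path("/etc/hadoop/conf/core-site.xml"));   // example paths only
            conf.addResource(new Path("/etc/hadoop/conf/hdfs-site.xml"));

            try (FileSystem fs = FileSystem.get(conf)) {
                System.out.println("Connected to: " + fs.getUri());         // e.g. hdfs://namenode:8020
            }
        }
    }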

Added: nifi/site/trunk/docs/nifi-docs/components/org.apache.nifi/nifi-hadoop-nar/1.7.1/org.apache.nifi.processors.hadoop.GetHDFSFileInfo/index.html
URL: http://svn.apache.org/viewvc/nifi/site/trunk/docs/nifi-docs/components/org.apache.nifi/nifi-hadoop-nar/1.7.1/org.apache.nifi.processors.hadoop.GetHDFSFileInfo/index.html?rev=1836075&view=auto
==============================================================================
--- nifi/site/trunk/docs/nifi-docs/components/org.apache.nifi/nifi-hadoop-nar/1.7.1/org.apache.nifi.processors.hadoop.GetHDFSFileInfo/index.html (added)
+++ nifi/site/trunk/docs/nifi-docs/components/org.apache.nifi/nifi-hadoop-nar/1.7.1/org.apache.nifi.processors.hadoop.GetHDFSFileInfo/index.html Tue Jul 17 01:35:38 2018
@@ -0,0 +1,3 @@
+<!DOCTYPE html><html lang="en"><head><meta charset="utf-8"></meta><title>GetHDFSFileInfo</title><link rel="stylesheet" href="../../../../../css/component-usage.css" type="text/css"></link></head><script type="text/javascript">window.onload = function(){if(self==top) { document.getElementById('nameHeader').style.display = "inherit"; } }</script><body><h1 id="nameHeader" style="display: none;">GetHDFSFileInfo</h1><h2>Description: </h2><p>Retrieves a listing of files and directories from HDFS. This processor creates one or more FlowFiles that represent the HDFS file/dir with relevant information. The main purpose of this processor is to provide functionality similar to HDFS Client, i.e. count, du, ls, test, etc. Unlike ListHDFS, this processor is stateless, supports incoming connections and provides information on a dir level. </p><h3>Tags: </h3><p>hadoop, HDFS, get, list, ingest, source, filesystem</p><h3>Properties: </h3><p>In the list below, the names of required properties appear in <strong>bold
 </strong>. Any other properties (not in bold) are considered optional. The table also indicates any default values, and whether a property supports the <a href="../../../../../html/expression-language-guide.html">NiFi Expression Language</a>.</p><table id="properties"><tr><th>Name</th><th>Default Value</th><th>Allowable Values</th><th>Description</th></tr><tr><td id="name">Hadoop Configuration Resources</td><td id="default-value"></td><td id="allowable-values"></td><td id="description">A file or comma separated list of files which contains the Hadoop file system configuration. Without this, Hadoop will search the classpath for a 'core-site.xml' and 'hdfs-site.xml' file or will revert to a default configuration. To use swebhdfs, see 'Additional Details' section of PutHDFS's documentation.<br/><strong>Supports Expression Language: true (will be evaluated using variable registry only)</strong></td></tr><tr><td id="name">Kerberos Credentials Service</td><td id="default-value"></td><td i
 d="allowable-values"><strong>Controller Service API: </strong><br/>KerberosCredentialsService<br/><strong>Implementation: </strong><a href="../../../nifi-kerberos-credentials-service-nar/1.7.1/org.apache.nifi.kerberos.KeytabCredentialsService/index.html">KeytabCredentialsService</a></td><td id="description">Specifies the Kerberos Credentials Controller Service that should be used for authenticating with Kerberos</td></tr><tr><td id="name">Kerberos Principal</td><td id="default-value"></td><td id="allowable-values"></td><td id="description">Kerberos principal to authenticate as. Requires nifi.kerberos.krb5.file to be set in your nifi.properties<br/><strong>Supports Expression Language: true (will be evaluated using variable registry only)</strong></td></tr><tr><td id="name">Kerberos Keytab</td><td id="default-value"></td><td id="allowable-values"></td><td id="description">Kerberos keytab associated with the principal. Requires nifi.kerberos.krb5.file to be set in your nifi.properties
 <br/><strong>Supports Expression Language: true (will be evaluated using variable registry only)</strong></td></tr><tr><td id="name">Kerberos Relogin Period</td><td id="default-value">4 hours</td><td id="allowable-values"></td><td id="description">Period of time which should pass before attempting a kerberos relogin.
+
+This property has been deprecated, and has no effect on processing. Relogins now occur automatically.<br/><strong>Supports Expression Language: true (will be evaluated using variable registry only)</strong></td></tr><tr><td id="name">Additional Classpath Resources</td><td id="default-value"></td><td id="allowable-values"></td><td id="description">A comma-separated list of paths to files and/or directories that will be added to the classpath. When specifying a directory, all files within the directory will be added to the classpath, but further sub-directories will not be included.</td></tr><tr><td id="name"><strong>Full path</strong></td><td id="default-value"></td><td id="allowable-values"></td><td id="description">A directory to start listing from, or a file's full path.<br/><strong>Supports Expression Language: true (will be evaluated using flow file attributes and variable registry)</strong></td></tr><tr><td id="name"><strong>Recurse Subdirectories</strong></td><td id="default-
 value">true</td><td id="allowable-values"><ul><li>true</li><li>false</li></ul></td><td id="description">Indicates whether to list files from subdirectories of the HDFS directory</td></tr><tr><td id="name">Directory Filter</td><td id="default-value"></td><td id="allowable-values"></td><td id="description">Regex. Only directories whose names match the given regular expression will be picked up. If not provided, any filter would be apply (performance considerations).<br/><strong>Supports Expression Language: true (will be evaluated using flow file attributes and variable registry)</strong></td></tr><tr><td id="name">File Filter</td><td id="default-value"></td><td id="allowable-values"></td><td id="description">Regex. Only files whose names match the given regular expression will be picked up. If not provided, any filter would be apply (performance considerations).<br/><strong>Supports Expression Language: true (will be evaluated using flow file attributes and variable registry)</strong
 ></td></tr><tr><td id="name">Exclude Files</td><td id="default-value"></td><td id="allowable-values"></td><td id="description">Regex. Files whose names match the given regular expression will not be picked up. If not provided, no exclusion filter will be applied (performance considerations).<br/><strong>Supports Expression Language: true (will be evaluated using flow file attributes and variable registry)</strong></td></tr><tr><td id="name"><strong>Ignore Dotted Directories</strong></td><td id="default-value">true</td><td id="allowable-values"><ul><li>true</li><li>false</li></ul></td><td id="description">If true, directories whose names begin with a dot (".") will be ignored</td></tr><tr><td id="name"><strong>Ignore Dotted Files</strong></td><td id="default-value">true</td><td id="allowable-values"><ul><li>true</li><li>false</li></ul></td><td id="description">If true, files whose names begin with a dot (".") will be ignored</td></tr><tr><td id="name"><strong>Group Results</strong></td><td id=
 "default-value">gethdfsfileinfo-group-all</td><td id="allowable-values"><ul><li>All <img src="../../../../../html/images/iconInfo.png" alt="Group all results into a single flowfile." title="Group all results into a single flowfile."></img></li><li>Parent Directory <img src="../../../../../html/images/iconInfo.png" alt="Group HDFS objects by their parent directories only. Processor will generate flowfile for each directory (if recursive). If 'Recurse Subdirectories' property set to 'false', then will have the same effect as 'All'" title="Group HDFS objects by their parent directories only. Processor will generate flowfile for each directory (if recursive). If 'Recurse Subdirectories' property set to 'false', then will have the same effect as 'All'"></img></li><li>None <img src="../../../../../html/images/iconInfo.png" alt="Don't group results. Generate flowfile per each HDFS object." title="Don't group results. Generate flowfile per each HDFS object."></img></li></ul></td><td id="des
 cription">Groups HDFS objects</td></tr><tr><td id="name"><strong>Destination</strong></td><td id="default-value">gethdfsfileinfo-dest-content</td><td id="allowable-values"><ul><li>Attributes <img src="../../../../../html/images/iconInfo.png" alt="Details of given HDFS object will be stored in attributes of flowfile. WARNING: In case when scan finds thousands or millions of objects, having huge values in attribute could impact flow file repo and GC/heap usage. Use content destination for such cases." title="Details of given HDFS object will be stored in attributes of flowfile. WARNING: In case when scan finds thousands or millions of objects, having huge values in attribute could impact flow file repo and GC/heap usage. Use content destination for such cases."></img></li><li>Content <img src="../../../../../html/images/iconInfo.png" alt="Details of given HDFS object will be stored in a content in JSON format" title="Details of given HDFS object will be stored in a content in JSON for
 mat"></img></li></ul></td><td id="description">Sets the destination for the resutls. When set to 'Content', attributes of flowfile won't be used for storing results. </td></tr></table><h3>Relationships: </h3><table id="relationships"><tr><th>Name</th><th>Description</th></tr><tr><td>success</td><td>All successfully generated FlowFiles are transferred to this relationship</td></tr><tr><td>not found</td><td>If no objects are found, original FlowFile are transferred to this relationship</td></tr><tr><td>failure</td><td>All failed attempts to access HDFS will be routed to this relationship</td></tr><tr><td>original</td><td>Original FlowFiles are transferred to this relationship</td></tr></table><h3>Reads Attributes: </h3>None specified.<h3>Writes Attributes: </h3><table id="writes-attributes"><tr><th>Name</th><th>Description</th></tr><tr><td>hdfs.objectName</td><td>The name of the file/dir found on HDFS.</td></tr><tr><td>hdfs.path</td><td>The path is set to the absolute path of the obje
 ct's parent directory on HDFS. For example, if an object is a directory 'foo', under directory '/bar' then 'hdfs.objectName' will have value 'foo', and 'hdfs.path' will be '/bar'</td></tr><tr><td>hdfs.type</td><td>The type of an object. Possible values: directory, file, link</td></tr><tr><td>hdfs.owner</td><td>The user that owns the object in HDFS</td></tr><tr><td>hdfs.group</td><td>The group that owns the object in HDFS</td></tr><tr><td>hdfs.lastModified</td><td>The timestamp of when the object in HDFS was last modified, as milliseconds since midnight Jan 1, 1970 UTC</td></tr><tr><td>hdfs.length</td><td>In case of files: The number of bytes in the file in HDFS.  In case of dirs: Returns storage space consumed by directory. </td></tr><tr><td>hdfs.count.files</td><td>In case of type='directory' will represent total count of files under this dir. Won't be populated to other types of HDFS objects. </td></tr><tr><td>hdfs.count.dirs</td><td>In case of type='directory' will represent total
  count of directories under this dir (including itself). Won't be populated to other types of HDFS objects. </td></tr><tr><td>hdfs.replication</td><td>The number of HDFS replicas for the file</td></tr><tr><td>hdfs.permissions</td><td>The permissions for the object in HDFS. This is formatted as 3 characters for the owner, 3 for the group, and 3 for other users. For example rw-rw-r--</td></tr><tr><td>hdfs.status</td><td>The status contains comma separated list of file/dir paths, which couldn't be listed/accessed. Status won't be set if no errors occurred.</td></tr><tr><td>hdfs.full.tree</td><td>When destination is 'attribute', will be populated with full tree of HDFS directory in JSON format. WARNING: In case when scan finds thousands or millions of objects, having huge values in attribute could impact flow file repo and GC/heap usage. Use content destination for such cases</td></tr></table><h3>State management: </h3>This component does not store state.<h3>Restricted: </h3>This componen
 t is not restricted.<h3>Input requirement: </h3>This component allows an incoming relationship.<h3>System Resource Considerations:</h3>None specified.<h3>See Also:</h3><p><a href="../org.apache.nifi.processors.hadoop.ListHDFS/index.html">ListHDFS</a>, <a href="../org.apache.nifi.processors.hadoop.GetHDFS/index.html">GetHDFS</a>, <a href="../org.apache.nifi.processors.hadoop.FetchHDFS/index.html">FetchHDFS</a>, <a href="../org.apache.nifi.processors.hadoop.PutHDFS/index.html">PutHDFS</a></p></body></html>
\ No newline at end of file
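
[Editorial aside, not part of this commit: the attributes GetHDFSFileInfo writes (owner, group, length, file/dir counts) line up with standard FileStatus and ContentSummary fields in the Hadoop client. A hedged sketch against a hypothetical directory:]

    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.fs.ContentSummary;
    import org.apache.hadoop.fs.FileStatus;
    import org.apache.hadoop.fs.FileSystem;
    import org.apache.hadoop.fs.Path;

    public class FileInfoExample {
        public static void main(String[] args) throws Exception {
            Configuration conf = new Configuration();
            Path target = new Path("/data/warehouse");                    // hypothetical directory
            try (FileSystem fs = FileSystem.get(conf)) {
                FileStatus status = fs.getFileStatus(target);
                System.out.println("owner=" + status.getOwner()           // comparable to hdfs.owner
                        + " group=" + status.getGroup()                    // comparable to hdfs.group
                        + " modified=" + status.getModificationTime()      // comparable to hdfs.lastModified (ms since epoch)
                        + " permissions=" + status.getPermission());       // comparable to hdfs.permissions

                ContentSummary summary = fs.getContentSummary(target);     // du/count style totals
                System.out.println("files=" + summary.getFileCount()       // comparable to hdfs.count.files
                        + " dirs=" + summary.getDirectoryCount()           // comparable to hdfs.count.dirs
                        + " bytes=" + summary.getLength());                // comparable to hdfs.length for directories
            }
        }
    }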

Added: nifi/site/trunk/docs/nifi-docs/components/org.apache.nifi/nifi-hadoop-nar/1.7.1/org.apache.nifi.processors.hadoop.GetHDFSSequenceFile/index.html
URL: http://svn.apache.org/viewvc/nifi/site/trunk/docs/nifi-docs/components/org.apache.nifi/nifi-hadoop-nar/1.7.1/org.apache.nifi.processors.hadoop.GetHDFSSequenceFile/index.html?rev=1836075&view=auto
==============================================================================
--- nifi/site/trunk/docs/nifi-docs/components/org.apache.nifi/nifi-hadoop-nar/1.7.1/org.apache.nifi.processors.hadoop.GetHDFSSequenceFile/index.html (added)
+++ nifi/site/trunk/docs/nifi-docs/components/org.apache.nifi/nifi-hadoop-nar/1.7.1/org.apache.nifi.processors.hadoop.GetHDFSSequenceFile/index.html Tue Jul 17 01:35:38 2018
@@ -0,0 +1,3 @@
+<!DOCTYPE html><html lang="en"><head><meta charset="utf-8"></meta><title>GetHDFSSequenceFile</title><link rel="stylesheet" href="../../../../../css/component-usage.css" type="text/css"></link></head><script type="text/javascript">window.onload = function(){if(self==top) { document.getElementById('nameHeader').style.display = "inherit"; } }</script><body><h1 id="nameHeader" style="display: none;">GetHDFSSequenceFile</h1><h2>Description: </h2><p>Fetch sequence files from Hadoop Distributed File System (HDFS) into FlowFiles</p><h3>Tags: </h3><p>hadoop, HDFS, get, fetch, ingest, source, sequence file</p><h3>Properties: </h3><p>In the list below, the names of required properties appear in <strong>bold</strong>. Any other properties (not in bold) are considered optional. The table also indicates any default values, and whether a property supports the <a href="../../../../../html/expression-language-guide.html">NiFi Expression Language</a>.</p><table id="properties"><tr><th>Name</th><th>De
 fault Value</th><th>Allowable Values</th><th>Description</th></tr><tr><td id="name">Hadoop Configuration Resources</td><td id="default-value"></td><td id="allowable-values"></td><td id="description">A file or comma separated list of files which contains the Hadoop file system configuration. Without this, Hadoop will search the classpath for a 'core-site.xml' and 'hdfs-site.xml' file or will revert to a default configuration. To use swebhdfs, see 'Additional Details' section of PutHDFS's documentation.<br/><strong>Supports Expression Language: true (will be evaluated using variable registry only)</strong></td></tr><tr><td id="name">Kerberos Credentials Service</td><td id="default-value"></td><td id="allowable-values"><strong>Controller Service API: </strong><br/>KerberosCredentialsService<br/><strong>Implementation: </strong><a href="../../../nifi-kerberos-credentials-service-nar/1.7.1/org.apache.nifi.kerberos.KeytabCredentialsService/index.html">KeytabCredentialsService</a></td><td 
 id="description">Specifies the Kerberos Credentials Controller Service that should be used for authenticating with Kerberos</td></tr><tr><td id="name">Kerberos Principal</td><td id="default-value"></td><td id="allowable-values"></td><td id="description">Kerberos principal to authenticate as. Requires nifi.kerberos.krb5.file to be set in your nifi.properties<br/><strong>Supports Expression Language: true (will be evaluated using variable registry only)</strong></td></tr><tr><td id="name">Kerberos Keytab</td><td id="default-value"></td><td id="allowable-values"></td><td id="description">Kerberos keytab associated with the principal. Requires nifi.kerberos.krb5.file to be set in your nifi.properties<br/><strong>Supports Expression Language: true (will be evaluated using variable registry only)</strong></td></tr><tr><td id="name">Kerberos Relogin Period</td><td id="default-value">4 hours</td><td id="allowable-values"></td><td id="description">Period of time which should pass before atte
 mpting a kerberos relogin.
+
+This property has been deprecated, and has no effect on processing. Relogins now occur automatically.<br/><strong>Supports Expression Language: true (will be evaluated using variable registry only)</strong></td></tr><tr><td id="name">Additional Classpath Resources</td><td id="default-value"></td><td id="allowable-values"></td><td id="description">A comma-separated list of paths to files and/or directories that will be added to the classpath. When specifying a directory, all files within the directory will be added to the classpath, but further sub-directories will not be included.</td></tr><tr><td id="name"><strong>Directory</strong></td><td id="default-value"></td><td id="allowable-values"></td><td id="description">The HDFS directory from which files should be read<br/><strong>Supports Expression Language: true (will be evaluated using variable registry only)</strong></td></tr><tr><td id="name"><strong>Recurse Subdirectories</strong></td><td id="default-value">true</td><td id="all
 owable-values"><ul><li>true</li><li>false</li></ul></td><td id="description">Indicates whether to pull files from subdirectories of the HDFS directory</td></tr><tr><td id="name"><strong>Keep Source File</strong></td><td id="default-value">false</td><td id="allowable-values"><ul><li>true</li><li>false</li></ul></td><td id="description">Determines whether to delete the file from HDFS after it has been successfully transferred. If true, the file will be fetched repeatedly. This is intended for testing only.</td></tr><tr><td id="name">File Filter Regex</td><td id="default-value"></td><td id="allowable-values"></td><td id="description">A Java Regular Expression for filtering Filenames; if a filter is supplied then only files whose names match that Regular Expression will be fetched, otherwise all files will be fetched</td></tr><tr><td id="name"><strong>Filter Match Name Only</strong></td><td id="default-value">true</td><td id="allowable-values"><ul><li>true</li><li>false</li></ul></td><t
 d id="description">If true then File Filter Regex will match on just the filename, otherwise subdirectory names will be included with filename in the regex comparison</td></tr><tr><td id="name"><strong>Ignore Dotted Files</strong></td><td id="default-value">true</td><td id="allowable-values"><ul><li>true</li><li>false</li></ul></td><td id="description">If true, files whose names begin with a dot (".") will be ignored</td></tr><tr><td id="name"><strong>Minimum File Age</strong></td><td id="default-value">0 sec</td><td id="allowable-values"></td><td id="description">The minimum age that a file must be in order to be pulled; any file younger than this amount of time (based on last modification date) will be ignored</td></tr><tr><td id="name">Maximum File Age</td><td id="default-value"></td><td id="allowable-values"></td><td id="description">The maximum age that a file must be in order to be pulled; any file older than this amount of time (based on last modification date) will be ignore
 d</td></tr><tr><td id="name"><strong>Polling Interval</strong></td><td id="default-value">0 sec</td><td id="allowable-values"></td><td id="description">Indicates how long to wait between performing directory listings</td></tr><tr><td id="name"><strong>Batch Size</strong></td><td id="default-value">100</td><td id="allowable-values"></td><td id="description">The maximum number of files to pull in each iteration, based on run schedule.</td></tr><tr><td id="name">IO Buffer Size</td><td id="default-value"></td><td id="allowable-values"></td><td id="description">Amount of memory to use to buffer file contents during IO. This overrides the Hadoop Configuration</td></tr><tr><td id="name"><strong>Compression codec</strong></td><td id="default-value">NONE</td><td id="allowable-values"><ul><li>NONE <img src="../../../../../html/images/iconInfo.png" alt="No compression" title="No compression"></img></li><li>DEFAULT <img src="../../../../../html/images/iconInfo.png" alt="Default ZLIB compression
 " title="Default ZLIB compression"></img></li><li>BZIP <img src="../../../../../html/images/iconInfo.png" alt="BZIP compression" title="BZIP compression"></img></li><li>GZIP <img src="../../../../../html/images/iconInfo.png" alt="GZIP compression" title="GZIP compression"></img></li><li>LZ4 <img src="../../../../../html/images/iconInfo.png" alt="LZ4 compression" title="LZ4 compression"></img></li><li>LZO <img src="../../../../../html/images/iconInfo.png" alt="LZO compression - it assumes LD_LIBRARY_PATH has been set and jar is available" title="LZO compression - it assumes LD_LIBRARY_PATH has been set and jar is available"></img></li><li>SNAPPY <img src="../../../../../html/images/iconInfo.png" alt="Snappy compression" title="Snappy compression"></img></li><li>AUTOMATIC <img src="../../../../../html/images/iconInfo.png" alt="Will attempt to automatically detect the compression codec." title="Will attempt to automatically detect the compression codec."></img></li></ul></td><td id="de
 scription">No Description Provided.</td></tr><tr><td id="name"><strong>FlowFile Content</strong></td><td id="default-value">VALUE ONLY</td><td id="allowable-values"><ul><li>VALUE ONLY</li><li>KEY VALUE PAIR</li></ul></td><td id="description">Indicate if the content is to be both the key and value of the Sequence File, or just the value.</td></tr></table><h3>Relationships: </h3><table id="relationships"><tr><th>Name</th><th>Description</th></tr><tr><td>success</td><td>All files retrieved from HDFS are transferred to this relationship</td></tr></table><h3>Reads Attributes: </h3>None specified.<h3>Writes Attributes: </h3><table id="writes-attributes"><tr><th>Name</th><th>Description</th></tr><tr><td>filename</td><td>The name of the file that was read from HDFS.</td></tr><tr><td>path</td><td>The path is set to the relative path of the file's directory on HDFS. For example, if the Directory property is set to /tmp, then files picked up from /tmp will have the path attribute set to "./". 
 If the Recurse Subdirectories property is set to true and a file is picked up from /tmp/abc/1/2/3, then the path attribute will be set to "abc/1/2/3".</td></tr></table><h3>State management: </h3>This component does not store state.<h3>Restricted: </h3><table id="restrictions"><tr><th>Required Permission</th><th>Explanation</th></tr><tr><td>read filesystem</td><td>Provides operator the ability to retrieve any file that NiFi has access to in HDFS or the local filesystem.</td></tr><tr><td>write filesystem</td><td>Provides operator the ability to delete any file that NiFi has access to in HDFS or the local filesystem.</td></tr></table><h3>Input requirement: </h3>This component does not allow an incoming relationship.<h3>System Resource Considerations:</h3>None specified.<h3>See Also:</h3><p><a href="../org.apache.nifi.processors.hadoop.PutHDFS/index.html">PutHDFS</a></p></body></html>
\ No newline at end of file
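
[Editorial aside, not part of this commit: the Kerberos Principal / Kerberos Keytab properties repeated in the tables above correspond to an explicit keytab login in the Hadoop security API. A sketch with placeholder principal and keytab values:]

    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.fs.FileSystem;
    import org.apache.hadoop.security.UserGroupInformation;

    public class KeytabLoginExample {
        public static void main(String[] args) throws Exception {
            Configuration conf = new Configuration();
            conf.set("hadoop.security.authentication", "kerberos");       // enable Kerberos for this client

            UserGroupInformation.setConfiguration(conf);
            // Placeholder principal and keytab path; substitute real values.
            UserGroupInformation.loginUserFromKeytab("nifi@EXAMPLE.COM", "/etc/security/keytabs/nifi.keytab");

            try (FileSystem fs = FileSystem.get(conf)) {
                System.out.println("Authenticated as: " + UserGroupInformation.getCurrentUser().getUserName());
            }
        }
    }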

Added: nifi/site/trunk/docs/nifi-docs/components/org.apache.nifi/nifi-hadoop-nar/1.7.1/org.apache.nifi.processors.hadoop.ListHDFS/index.html
URL: http://svn.apache.org/viewvc/nifi/site/trunk/docs/nifi-docs/components/org.apache.nifi/nifi-hadoop-nar/1.7.1/org.apache.nifi.processors.hadoop.ListHDFS/index.html?rev=1836075&view=auto
==============================================================================
--- nifi/site/trunk/docs/nifi-docs/components/org.apache.nifi/nifi-hadoop-nar/1.7.1/org.apache.nifi.processors.hadoop.ListHDFS/index.html (added)
+++ nifi/site/trunk/docs/nifi-docs/components/org.apache.nifi/nifi-hadoop-nar/1.7.1/org.apache.nifi.processors.hadoop.ListHDFS/index.html Tue Jul 17 01:35:38 2018
@@ -0,0 +1,3 @@
+<!DOCTYPE html><html lang="en"><head><meta charset="utf-8"></meta><title>ListHDFS</title><link rel="stylesheet" href="../../../../../css/component-usage.css" type="text/css"></link></head><script type="text/javascript">window.onload = function(){if(self==top) { document.getElementById('nameHeader').style.display = "inherit"; } }</script><body><h1 id="nameHeader" style="display: none;">ListHDFS</h1><h2>Description: </h2><p>Retrieves a listing of files from HDFS. Each time a listing is performed, the files with the latest timestamp will be excluded and picked up during the next execution of the processor. This is done to ensure that we do not miss any files, or produce duplicates, in the cases where files with the same timestamp are written immediately before and after a single execution of the processor. For each file that is listed in HDFS, this processor creates a FlowFile that represents the HDFS file to be fetched in conjunction with FetchHDFS. This Processor is designed to run o
 n Primary Node only in a cluster. If the primary node changes, the new Primary Node will pick up where the previous node left off without duplicating all of the data. Unlike GetHDFS, this Processor does not delete any data from HDFS.</p><h3>Tags: </h3><p>hadoop, HDFS, get, list, ingest, source, filesystem</p><h3>Properties: </h3><p>In the list below, the names of required properties appear in <strong>bold</strong>. Any other properties (not in bold) are considered optional. The table also indicates any default values, and whether a property supports the <a href="../../../../../html/expression-language-guide.html">NiFi Expression Language</a>.</p><table id="properties"><tr><th>Name</th><th>Default Value</th><th>Allowable Values</th><th>Description</th></tr><tr><td id="name">Hadoop Configuration Resources</td><td id="default-value"></td><td id="allowable-values"></td><td id="description">A file or comma separated list of files which contains the Hadoop file system configuration. Witho
 ut this, Hadoop will search the classpath for a 'core-site.xml' and 'hdfs-site.xml' file or will revert to a default configuration. To use swebhdfs, see 'Additional Details' section of PutHDFS's documentation.<br/><strong>Supports Expression Language: true (will be evaluated using variable registry only)</strong></td></tr><tr><td id="name">Kerberos Credentials Service</td><td id="default-value"></td><td id="allowable-values"><strong>Controller Service API: </strong><br/>KerberosCredentialsService<br/><strong>Implementation: </strong><a href="../../../nifi-kerberos-credentials-service-nar/1.7.1/org.apache.nifi.kerberos.KeytabCredentialsService/index.html">KeytabCredentialsService</a></td><td id="description">Specifies the Kerberos Credentials Controller Service that should be used for authenticating with Kerberos</td></tr><tr><td id="name">Kerberos Principal</td><td id="default-value"></td><td id="allowable-values"></td><td id="description">Kerberos principal to authenticate as. Requ
 ires nifi.kerberos.krb5.file to be set in your nifi.properties<br/><strong>Supports Expression Language: true (will be evaluated using variable registry only)</strong></td></tr><tr><td id="name">Kerberos Keytab</td><td id="default-value"></td><td id="allowable-values"></td><td id="description">Kerberos keytab associated with the principal. Requires nifi.kerberos.krb5.file to be set in your nifi.properties<br/><strong>Supports Expression Language: true (will be evaluated using variable registry only)</strong></td></tr><tr><td id="name">Kerberos Relogin Period</td><td id="default-value">4 hours</td><td id="allowable-values"></td><td id="description">Period of time which should pass before attempting a kerberos relogin.
+
+This property has been deprecated, and has no effect on processing. Relogins now occur automatically.<br/><strong>Supports Expression Language: true (will be evaluated using variable registry only)</strong></td></tr><tr><td id="name">Additional Classpath Resources</td><td id="default-value"></td><td id="allowable-values"></td><td id="description">A comma-separated list of paths to files and/or directories that will be added to the classpath. When specifying a directory, all files within the directory will be added to the classpath, but further sub-directories will not be included.</td></tr><tr><td id="name">Distributed Cache Service</td><td id="default-value"></td><td id="allowable-values"><strong>Controller Service API: </strong><br/>DistributedMapCacheClient<br/><strong>Implementations: </strong><a href="../../../nifi-hbase_1_1_2-client-service-nar/1.7.1/org.apache.nifi.hbase.HBase_1_1_2_ClientMapCacheService/index.html">HBase_1_1_2_ClientMapCacheService</a><br/><a href="../../..
 /nifi-couchbase-nar/1.7.1/org.apache.nifi.couchbase.CouchbaseMapCacheClient/index.html">CouchbaseMapCacheClient</a><br/><a href="../../../nifi-distributed-cache-services-nar/1.7.1/org.apache.nifi.distributed.cache.client.DistributedMapCacheClientService/index.html">DistributedMapCacheClientService</a><br/><a href="../../../nifi-redis-nar/1.7.1/org.apache.nifi.redis.service.RedisDistributedMapCacheClientService/index.html">RedisDistributedMapCacheClientService</a></td><td id="description">Specifies the Controller Service that should be used to maintain state about what has been pulled from HDFS so that if a new node begins pulling data, it won't duplicate all of the work that has been done.</td></tr><tr><td id="name"><strong>Directory</strong></td><td id="default-value"></td><td id="allowable-values"></td><td id="description">The HDFS directory from which files should be read<br/><strong>Supports Expression Language: true (will be evaluated using variable registry only)</strong></td>
 </tr><tr><td id="name"><strong>Recurse Subdirectories</strong></td><td id="default-value">true</td><td id="allowable-values"><ul><li>true</li><li>false</li></ul></td><td id="description">Indicates whether to list files from subdirectories of the HDFS directory</td></tr><tr><td id="name"><strong>File Filter</strong></td><td id="default-value">[^\.].*</td><td id="allowable-values"></td><td id="description">Only files whose names match the given regular expression will be picked up</td></tr><tr><td id="name">Minimum File Age</td><td id="default-value"></td><td id="allowable-values"></td><td id="description">The minimum age that a file must be in order to be pulled; any file younger than this amount of time (based on last modification date) will be ignored</td></tr><tr><td id="name">Maximum File Age</td><td id="default-value"></td><td id="allowable-values"></td><td id="description">The maximum age that a file must be in order to be pulled; any file older than this amount of time (based 
 on last modification date) will be ignored. Minimum value is 100ms.</td></tr></table><h3>Relationships: </h3><table id="relationships"><tr><th>Name</th><th>Description</th></tr><tr><td>success</td><td>All FlowFiles are transferred to this relationship</td></tr></table><h3>Reads Attributes: </h3>None specified.<h3>Writes Attributes: </h3><table id="writes-attributes"><tr><th>Name</th><th>Description</th></tr><tr><td>filename</td><td>The name of the file that was read from HDFS.</td></tr><tr><td>path</td><td>The path is set to the absolute path of the file's directory on HDFS. For example, if the Directory property is set to /tmp, then files picked up from /tmp will have the path attribute set to "/tmp". If the Recurse Subdirectories property is set to true and a file is picked up from /tmp/abc/1/2/3, then the path attribute will be set to "/tmp/abc/1/2/3".</td></tr><tr><td>hdfs.owner</td><td>The user that owns the file in HDFS</td></tr><tr><td>hdfs.group</td><td>The group that owns the
  file in HDFS</td></tr><tr><td>hdfs.lastModified</td><td>The timestamp of when the file in HDFS was last modified, as milliseconds since midnight Jan 1, 1970 UTC</td></tr><tr><td>hdfs.length</td><td>The number of bytes in the file in HDFS</td></tr><tr><td>hdfs.replication</td><td>The number of HDFS replicas for the file</td></tr><tr><td>hdfs.permissions</td><td>The permissions for the file in HDFS. This is formatted as 3 characters for the owner, 3 for the group, and 3 for other users. For example rw-rw-r--</td></tr></table><h3>State management: </h3><table id="stateful"><tr><th>Scope</th><th>Description</th></tr><tr><td>CLUSTER</td><td>After performing a listing of HDFS files, the latest timestamp of all the files listed and the latest timestamp of all the files transferred are both stored. This allows the Processor to list only files that have been added or modified after this date the next time that the Processor is run, without having to store all of the actual filenames/paths w
 hich could lead to performance problems. State is stored across the cluster so that this Processor can be run on Primary Node only and if a new Primary Node is selected, the new node can pick up where the previous node left off, without duplicating the data.</td></tr></table><h3>Restricted: </h3>This component is not restricted.<h3>Input requirement: </h3>This component does not allow an incoming relationship.<h3>System Resource Considerations:</h3>None specified.<h3>See Also:</h3><p><a href="../org.apache.nifi.processors.hadoop.GetHDFS/index.html">GetHDFS</a>, <a href="../org.apache.nifi.processors.hadoop.FetchHDFS/index.html">FetchHDFS</a>, <a href="../org.apache.nifi.processors.hadoop.PutHDFS/index.html">PutHDFS</a></p></body></html>
\ No newline at end of file
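
For reference, the 'Hadoop Configuration Resources' property that appears in the tables above is normally set to a comma-separated list of the cluster's client configuration files, for example /etc/hadoop/conf/core-site.xml,/etc/hadoop/conf/hdfs-site.xml (the paths shown here are illustrative, not defaults). A minimal core-site.xml sketch, assuming a hypothetical NameNode address, is enough for these processors to resolve the default file system; if the property is left blank, the processors instead search the classpath for 'core-site.xml' and 'hdfs-site.xml' as described above.

    <configuration>
      <property>
        <!-- hypothetical NameNode address; replace with your cluster's value -->
        <name>fs.defaultFS</name>
        <value>hdfs://namenode.example.com:8020</value>
      </property>
    </configuration>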

Added: nifi/site/trunk/docs/nifi-docs/components/org.apache.nifi/nifi-hadoop-nar/1.7.1/org.apache.nifi.processors.hadoop.MoveHDFS/index.html
URL: http://svn.apache.org/viewvc/nifi/site/trunk/docs/nifi-docs/components/org.apache.nifi/nifi-hadoop-nar/1.7.1/org.apache.nifi.processors.hadoop.MoveHDFS/index.html?rev=1836075&view=auto
==============================================================================
--- nifi/site/trunk/docs/nifi-docs/components/org.apache.nifi/nifi-hadoop-nar/1.7.1/org.apache.nifi.processors.hadoop.MoveHDFS/index.html (added)
+++ nifi/site/trunk/docs/nifi-docs/components/org.apache.nifi/nifi-hadoop-nar/1.7.1/org.apache.nifi.processors.hadoop.MoveHDFS/index.html Tue Jul 17 01:35:38 2018
@@ -0,0 +1,3 @@
+<!DOCTYPE html><html lang="en"><head><meta charset="utf-8"></meta><title>MoveHDFS</title><link rel="stylesheet" href="../../../../../css/component-usage.css" type="text/css"></link></head><script type="text/javascript">window.onload = function(){if(self==top) { document.getElementById('nameHeader').style.display = "inherit"; } }</script><body><h1 id="nameHeader" style="display: none;">MoveHDFS</h1><h2>Description: </h2><p>Rename existing files or a directory of files (non-recursive) on Hadoop Distributed File System (HDFS).</p><h3>Tags: </h3><p>hadoop, HDFS, put, move, filesystem, moveHDFS</p><h3>Properties: </h3><p>In the list below, the names of required properties appear in <strong>bold</strong>. Any other properties (not in bold) are considered optional. The table also indicates any default values, and whether a property supports the <a href="../../../../../html/expression-language-guide.html">NiFi Expression Language</a>.</p><table id="properties"><tr><th>Name</th><th>Default V
 alue</th><th>Allowable Values</th><th>Description</th></tr><tr><td id="name">Hadoop Configuration Resources</td><td id="default-value"></td><td id="allowable-values"></td><td id="description">A file or comma separated list of files which contains the Hadoop file system configuration. Without this, Hadoop will search the classpath for a 'core-site.xml' and 'hdfs-site.xml' file or will revert to a default configuration. To use swebhdfs, see 'Additional Details' section of PutHDFS's documentation.<br/><strong>Supports Expression Language: true (will be evaluated using variable registry only)</strong></td></tr><tr><td id="name">Kerberos Credentials Service</td><td id="default-value"></td><td id="allowable-values"><strong>Controller Service API: </strong><br/>KerberosCredentialsService<br/><strong>Implementation: </strong><a href="../../../nifi-kerberos-credentials-service-nar/1.7.1/org.apache.nifi.kerberos.KeytabCredentialsService/index.html">KeytabCredentialsService</a></td><td id="des
 cription">Specifies the Kerberos Credentials Controller Service that should be used for authenticating with Kerberos</td></tr><tr><td id="name">Kerberos Principal</td><td id="default-value"></td><td id="allowable-values"></td><td id="description">Kerberos principal to authenticate as. Requires nifi.kerberos.krb5.file to be set in your nifi.properties<br/><strong>Supports Expression Language: true (will be evaluated using variable registry only)</strong></td></tr><tr><td id="name">Kerberos Keytab</td><td id="default-value"></td><td id="allowable-values"></td><td id="description">Kerberos keytab associated with the principal. Requires nifi.kerberos.krb5.file to be set in your nifi.properties<br/><strong>Supports Expression Language: true (will be evaluated using variable registry only)</strong></td></tr><tr><td id="name">Kerberos Relogin Period</td><td id="default-value">4 hours</td><td id="allowable-values"></td><td id="description">Period of time which should pass before attempting 
 a kerberos relogin.
+
+This property has been deprecated, and has no effect on processing. Relogins now occur automatically.<br/><strong>Supports Expression Language: true (will be evaluated using variable registry only)</strong></td></tr><tr><td id="name">Additional Classpath Resources</td><td id="default-value"></td><td id="allowable-values"></td><td id="description">A comma-separated list of paths to files and/or directories that will be added to the classpath. When specifying a directory, all files within the directory will be added to the classpath, but further sub-directories will not be included.</td></tr><tr><td id="name"><strong>Conflict Resolution Strategy</strong></td><td id="default-value">fail</td><td id="allowable-values"><ul><li>replace <img src="../../../../../html/images/iconInfo.png" alt="Replaces the existing file if any." title="Replaces the existing file if any."></img></li><li>ignore <img src="../../../../../html/images/iconInfo.png" alt="Failed rename operation stops processing and
  routes to success." title="Failed rename operation stops processing and routes to success."></img></li><li>fail <img src="../../../../../html/images/iconInfo.png" alt="Failing to rename a file routes to failure." title="Failing to rename a file routes to failure."></img></li></ul></td><td id="description">Indicates what should happen when a file with the same name already exists in the output directory</td></tr><tr><td id="name"><strong>Input Directory or File</strong></td><td id="default-value">${path}</td><td id="allowable-values"></td><td id="description">The HDFS directory from which files should be read, or a single file to read.<br/><strong>Supports Expression Language: true (will be evaluated using flow file attributes and variable registry)</strong></td></tr><tr><td id="name"><strong>Output Directory</strong></td><td id="default-value"></td><td id="allowable-values"></td><td id="description">The HDFS directory where the files will be moved to<br/><strong>Supports Expression
  Language: true (will be evaluated using variable registry only)</strong></td></tr><tr><td id="name"><strong>HDFS Operation</strong></td><td id="default-value">move</td><td id="allowable-values"><ul><li>move</li><li>copy</li></ul></td><td id="description">The operation that will be performed on the source file</td></tr><tr><td id="name">File Filter Regex</td><td id="default-value"></td><td id="allowable-values"></td><td id="description">A Java Regular Expression for filtering Filenames; if a filter is supplied then only files whose names match that Regular Expression will be fetched, otherwise all files will be fetched</td></tr><tr><td id="name"><strong>Ignore Dotted Files</strong></td><td id="default-value">true</td><td id="allowable-values"><ul><li>true</li><li>false</li></ul></td><td id="description">If true, files whose names begin with a dot (".") will be ignored</td></tr><tr><td id="name">Remote Owner</td><td id="default-value"></td><td id="allowable-values"></td><td id="descr
 iption">Changes the owner of the HDFS file to this value after it is written. This only works if NiFi is running as a user that has HDFS super user privilege to change owner</td></tr><tr><td id="name">Remote Group</td><td id="default-value"></td><td id="allowable-values"></td><td id="description">Changes the group of the HDFS file to this value after it is written. This only works if NiFi is running as a user that has HDFS super user privilege to change group</td></tr></table><h3>Relationships: </h3><table id="relationships"><tr><th>Name</th><th>Description</th></tr><tr><td>success</td><td>Files that have been successfully renamed on HDFS are transferred to this relationship</td></tr><tr><td>failure</td><td>Files that could not be renamed on HDFS are transferred to this relationship</td></tr></table><h3>Reads Attributes: </h3><table id="reads-attributes"><tr><th>Name</th><th>Description</th></tr><tr><td>filename</td><td>The name of the file written to HDFS comes from the value of th
 is attribute.</td></tr></table><h3>Writes Attributes: </h3><table id="writes-attributes"><tr><th>Name</th><th>Description</th></tr><tr><td>filename</td><td>The name of the file written to HDFS is stored in this attribute.</td></tr><tr><td>absolute.hdfs.path</td><td>The absolute path to the file on HDFS is stored in this attribute.</td></tr></table><h3>State management: </h3>This component does not store state.<h3>Restricted: </h3><table id="restrictions"><tr><th>Required Permission</th><th>Explanation</th></tr><tr><td>read filesystem</td><td>Provides operator the ability to retrieve any file that NiFi has access to in HDFS or the local filesystem.</td></tr><tr><td>write filesystem</td><td>Provides operator the ability to delete any file that NiFi has access to in HDFS or the local filesystem.</td></tr></table><h3>Input requirement: </h3>This component allows an incoming relationship.<h3>System Resource Considerations:</h3>None specified.<h3>See Also:</h3><p><a href="../org.apache.ni
 fi.processors.hadoop.PutHDFS/index.html">PutHDFS</a>, <a href="../org.apache.nifi.processors.hadoop.GetHDFS/index.html">GetHDFS</a></p></body></html>
\ No newline at end of file

Added: nifi/site/trunk/docs/nifi-docs/components/org.apache.nifi/nifi-hadoop-nar/1.7.1/org.apache.nifi.processors.hadoop.PutHDFS/additionalDetails.html
URL: http://svn.apache.org/viewvc/nifi/site/trunk/docs/nifi-docs/components/org.apache.nifi/nifi-hadoop-nar/1.7.1/org.apache.nifi.processors.hadoop.PutHDFS/additionalDetails.html?rev=1836075&view=auto
==============================================================================
--- nifi/site/trunk/docs/nifi-docs/components/org.apache.nifi/nifi-hadoop-nar/1.7.1/org.apache.nifi.processors.hadoop.PutHDFS/additionalDetails.html (added)
+++ nifi/site/trunk/docs/nifi-docs/components/org.apache.nifi/nifi-hadoop-nar/1.7.1/org.apache.nifi.processors.hadoop.PutHDFS/additionalDetails.html Tue Jul 17 01:35:38 2018
@@ -0,0 +1,101 @@
+<!DOCTYPE html>
+<html lang="en">
+<!--
+      Licensed to the Apache Software Foundation (ASF) under one or more
+      contributor license agreements.  See the NOTICE file distributed with
+      this work for additional information regarding copyright ownership.
+      The ASF licenses this file to You under the Apache License, Version 2.0
+      (the "License"); you may not use this file except in compliance with
+      the License.  You may obtain a copy of the License at
+          http://www.apache.org/licenses/LICENSE-2.0
+      Unless required by applicable law or agreed to in writing, software
+      distributed under the License is distributed on an "AS IS" BASIS,
+      WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+      See the License for the specific language governing permissions and
+      limitations under the License.
+    -->
+
+<head>
+  <meta charset="utf-8" />
+  <title>PutHDFS</title>
+  <link rel="stylesheet" href="../../../../../css/component-usage.css" type="text/css" />
+</head>
+
+<body>
+  <!-- Processor Documentation ================================================== -->
+  <h2>SSL Configuration:</h2>
+  <p>
+    Hadoop provides the ability to configure keystore and/or truststore properties. If you want to use an SSL-secured file system such as swebhdfs, you can use the Hadoop configuration files instead of an SSL Context Service.
+    <ol>
+      <li>Create 'ssl-client.xml' to configure the truststores.</li>
+      <p>ssl-client.xml Properties:</p>
+      <table>
+        <tr>
+          <th>Property</th>
+          <th>Default Value</th>
+          <th>Explanation</th>
+        </tr>
+        <tr>
+          <td>ssl.client.truststore.type</td>
+          <td>jks</td>
+          <td>Truststore file type</td>
+        </tr>
+        <tr>
+          <td>ssl.client.truststore.location</td>
+          <td>NONE</td>
+          <td>Truststore file location</td>
+        </tr>
+        <tr>
+          <td>ssl.client.truststore.password</td>
+          <td>NONE</td>
+          <td>Truststore file password</td>
+        </tr>
+        <tr>
+          <td>ssl.client.truststore.reload.interval</td>
+          <td>10000</td>
+          <td>Truststore reload interval, in milliseconds</td>
+        </tr>
+      </table>
+
+      <p>ssl-client.xml Example:</p>
+      <pre>
+&lt;configuration&gt;
+  &lt;property&gt;
+    &lt;name&gt;ssl.client.truststore.type&lt;/name&gt;
+    &lt;value&gt;jks&lt;/value&gt;
+  &lt;/property&gt;
+  &lt;property&gt;
+    &lt;name&gt;ssl.client.truststore.location&lt;/name&gt;
+    &lt;value&gt;/path/to/truststore.jks&lt;/value&gt;
+  &lt;/property&gt;
+  &lt;property&gt;
+    &lt;name&gt;ssl.client.truststore.password&lt;/name&gt;
+    &lt;value&gt;clientfoo&lt;/value&gt;
+  &lt;/property&gt;
+  &lt;property&gt;
+    &lt;name&gt;ssl.client.truststore.reload.interval&lt;/name&gt;
+    &lt;value&gt;10000&lt;/value&gt;
+  &lt;/property&gt;
+&lt;/configuration&gt;
+                    </pre>
+
+      <li>Put 'ssl-client.xml' in a location that is on the classpath, such as the NiFi configuration directory.</li>
+
+      <li>Set the name of 'ssl-client.xml' as the value of <i>hadoop.ssl.client.conf</i> in the 'core-site.xml' used by the HDFS processors.</li>
+      <pre>
+&lt;configuration&gt;
+    &lt;property&gt;
+      &lt;name&gt;fs.defaultFS&lt;/name&gt;
+      &lt;value&gt;swebhdfs://{namenode.hostname:port}&lt;/value&gt;
+    &lt;/property&gt;
+    &lt;property&gt;
+      &lt;name&gt;hadoop.ssl.client.conf&lt;/name&gt;
+      &lt;value&gt;ssl-client.xml&lt;/value&gt;
+    &lt;/property&gt;
+&lt;/configuration&gt;
+                  </pre>
+    </ol>
+  </p>
+</body>
+
+</html>
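
The ssl-client.xml example above covers only the truststore side. If client certificate authentication is also required, Hadoop's ssl-client.xml accepts analogous keystore properties; the following is a sketch, assuming the standard ssl.client.keystore.* keys and illustrative paths and passwords, to be merged into the same file:

    <configuration>
      <property>
        <name>ssl.client.keystore.type</name>
        <value>jks</value>
      </property>
      <property>
        <!-- illustrative path and password -->
        <name>ssl.client.keystore.location</name>
        <value>/path/to/keystore.jks</value>
      </property>
      <property>
        <name>ssl.client.keystore.password</name>
        <value>clientfoo</value>
      </property>
    </configuration>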

Added: nifi/site/trunk/docs/nifi-docs/components/org.apache.nifi/nifi-hadoop-nar/1.7.1/org.apache.nifi.processors.hadoop.PutHDFS/index.html
URL: http://svn.apache.org/viewvc/nifi/site/trunk/docs/nifi-docs/components/org.apache.nifi/nifi-hadoop-nar/1.7.1/org.apache.nifi.processors.hadoop.PutHDFS/index.html?rev=1836075&view=auto
==============================================================================
--- nifi/site/trunk/docs/nifi-docs/components/org.apache.nifi/nifi-hadoop-nar/1.7.1/org.apache.nifi.processors.hadoop.PutHDFS/index.html (added)
+++ nifi/site/trunk/docs/nifi-docs/components/org.apache.nifi/nifi-hadoop-nar/1.7.1/org.apache.nifi.processors.hadoop.PutHDFS/index.html Tue Jul 17 01:35:38 2018
@@ -0,0 +1,3 @@
+<!DOCTYPE html><html lang="en"><head><meta charset="utf-8"></meta><title>PutHDFS</title><link rel="stylesheet" href="../../../../../css/component-usage.css" type="text/css"></link></head><script type="text/javascript">window.onload = function(){if(self==top) { document.getElementById('nameHeader').style.display = "inherit"; } }</script><body><h1 id="nameHeader" style="display: none;">PutHDFS</h1><h2>Description: </h2><p>Write FlowFile data to Hadoop Distributed File System (HDFS)</p><p><a href="additionalDetails.html">Additional Details...</a></p><h3>Tags: </h3><p>hadoop, HDFS, put, copy, filesystem</p><h3>Properties: </h3><p>In the list below, the names of required properties appear in <strong>bold</strong>. Any other properties (not in bold) are considered optional. The table also indicates any default values, and whether a property supports the <a href="../../../../../html/expression-language-guide.html">NiFi Expression Language</a>.</p><table id="properties"><tr><th>Name</th><th
 >Default Value</th><th>Allowable Values</th><th>Description</th></tr><tr><td id="name">Hadoop Configuration Resources</td><td id="default-value"></td><td id="allowable-values"></td><td id="description">A file or comma separated list of files which contains the Hadoop file system configuration. Without this, Hadoop will search the classpath for a 'core-site.xml' and 'hdfs-site.xml' file or will revert to a default configuration. To use swebhdfs, see 'Additional Details' section of PutHDFS's documentation.<br/><strong>Supports Expression Language: true (will be evaluated using variable registry only)</strong></td></tr><tr><td id="name">Kerberos Credentials Service</td><td id="default-value"></td><td id="allowable-values"><strong>Controller Service API: </strong><br/>KerberosCredentialsService<br/><strong>Implementation: </strong><a href="../../../nifi-kerberos-credentials-service-nar/1.7.1/org.apache.nifi.kerberos.KeytabCredentialsService/index.html">KeytabCredentialsService</a></td><
 td id="description">Specifies the Kerberos Credentials Controller Service that should be used for authenticating with Kerberos</td></tr><tr><td id="name">Kerberos Principal</td><td id="default-value"></td><td id="allowable-values"></td><td id="description">Kerberos principal to authenticate as. Requires nifi.kerberos.krb5.file to be set in your nifi.properties<br/><strong>Supports Expression Language: true (will be evaluated using variable registry only)</strong></td></tr><tr><td id="name">Kerberos Keytab</td><td id="default-value"></td><td id="allowable-values"></td><td id="description">Kerberos keytab associated with the principal. Requires nifi.kerberos.krb5.file to be set in your nifi.properties<br/><strong>Supports Expression Language: true (will be evaluated using variable registry only)</strong></td></tr><tr><td id="name">Kerberos Relogin Period</td><td id="default-value">4 hours</td><td id="allowable-values"></td><td id="description">Period of time which should pass before a
 ttempting a kerberos relogin.
+
+This property has been deprecated, and has no effect on processing. Relogins now occur automatically.<br/><strong>Supports Expression Language: true (will be evaluated using variable registry only)</strong></td></tr><tr><td id="name">Additional Classpath Resources</td><td id="default-value"></td><td id="allowable-values"></td><td id="description">A comma-separated list of paths to files and/or directories that will be added to the classpath. When specifying a directory, all files within the directory will be added to the classpath, but further sub-directories will not be included.</td></tr><tr><td id="name"><strong>Directory</strong></td><td id="default-value"></td><td id="allowable-values"></td><td id="description">The parent HDFS directory to which files should be written. The directory will be created if it doesn't exist.<br/><strong>Supports Expression Language: true (will be evaluated using flow file attributes and variable registry)</strong></td></tr><tr><td id="name"><strong
 >Conflict Resolution Strategy</strong></td><td id="default-value">fail</td><td id="allowable-values"><ul><li>replace <img src="../../../../../html/images/iconInfo.png" alt="Replaces the existing file if any." title="Replaces the existing file if any."></img></li><li>ignore <img src="../../../../../html/images/iconInfo.png" alt="Ignores the flow file and routes it to success." title="Ignores the flow file and routes it to success."></img></li><li>fail <img src="../../../../../html/images/iconInfo.png" alt="Penalizes the flow file and routes it to failure." title="Penalizes the flow file and routes it to failure."></img></li><li>append <img src="../../../../../html/images/iconInfo.png" alt="Appends to the existing file if any, creates a new file otherwise." title="Appends to the existing file if any, creates a new file otherwise."></img></li></ul></td><td id="description">Indicates what should happen when a file with the same name already exists in the output directory</td></tr><tr><t
 d id="name">Block Size</td><td id="default-value"></td><td id="allowable-values"></td><td id="description">Size of each block as written to HDFS. This overrides the Hadoop Configuration</td></tr><tr><td id="name">IO Buffer Size</td><td id="default-value"></td><td id="allowable-values"></td><td id="description">Amount of memory to use to buffer file contents during IO. This overrides the Hadoop Configuration</td></tr><tr><td id="name">Replication</td><td id="default-value"></td><td id="allowable-values"></td><td id="description">Number of times that HDFS will replicate each file. This overrides the Hadoop Configuration</td></tr><tr><td id="name">Permissions umask</td><td id="default-value"></td><td id="allowable-values"></td><td id="description">A umask represented as an octal number which determines the permissions of files written to HDFS. This overrides the Hadoop Configuration dfs.umaskmode</td></tr><tr><td id="name">Remote Owner</td><td id="default-value"></td><td id="allowable-
 values"></td><td id="description">Changes the owner of the HDFS file to this value after it is written. This only works if NiFi is running as a user that has HDFS super user privilege to change owner<br/><strong>Supports Expression Language: true (will be evaluated using flow file attributes and variable registry)</strong></td></tr><tr><td id="name">Remote Group</td><td id="default-value"></td><td id="allowable-values"></td><td id="description">Changes the group of the HDFS file to this value after it is written. This only works if NiFi is running as a user that has HDFS super user privilege to change group<br/><strong>Supports Expression Language: true (will be evaluated using flow file attributes and variable registry)</strong></td></tr><tr><td id="name"><strong>Compression codec</strong></td><td id="default-value">NONE</td><td id="allowable-values"><ul><li>NONE <img src="../../../../../html/images/iconInfo.png" alt="No compression" title="No compression"></img></li><li>DEFAULT <i
 mg src="../../../../../html/images/iconInfo.png" alt="Default ZLIB compression" title="Default ZLIB compression"></img></li><li>BZIP <img src="../../../../../html/images/iconInfo.png" alt="BZIP compression" title="BZIP compression"></img></li><li>GZIP <img src="../../../../../html/images/iconInfo.png" alt="GZIP compression" title="GZIP compression"></img></li><li>LZ4 <img src="../../../../../html/images/iconInfo.png" alt="LZ4 compression" title="LZ4 compression"></img></li><li>LZO <img src="../../../../../html/images/iconInfo.png" alt="LZO compression - it assumes LD_LIBRARY_PATH has been set and jar is available" title="LZO compression - it assumes LD_LIBRARY_PATH has been set and jar is available"></img></li><li>SNAPPY <img src="../../../../../html/images/iconInfo.png" alt="Snappy compression" title="Snappy compression"></img></li><li>AUTOMATIC <img src="../../../../../html/images/iconInfo.png" alt="Will attempt to automatically detect the compression codec." title="Will attempt t
 o automatically detect the compression codec."></img></li></ul></td><td id="description">No Description Provided.</td></tr></table><h3>Relationships: </h3><table id="relationships"><tr><th>Name</th><th>Description</th></tr><tr><td>success</td><td>Files that have been successfully written to HDFS are transferred to this relationship</td></tr><tr><td>failure</td><td>Files that could not be written to HDFS for some reason are transferred to this relationship</td></tr></table><h3>Reads Attributes: </h3><table id="reads-attributes"><tr><th>Name</th><th>Description</th></tr><tr><td>filename</td><td>The name of the file written to HDFS comes from the value of this attribute.</td></tr></table><h3>Writes Attributes: </h3><table id="writes-attributes"><tr><th>Name</th><th>Description</th></tr><tr><td>filename</td><td>The name of the file written to HDFS is stored in this attribute.</td></tr><tr><td>absolute.hdfs.path</td><td>The absolute path to the file on HDFS is stored in this attribute.</
 td></tr></table><h3>State management: </h3>This component does not store state.<h3>Restricted: </h3><table id="restrictions"><tr><th>Required Permission</th><th>Explanation</th></tr><tr><td>write filesystem</td><td>Provides operator the ability to delete any file that NiFi has access to in HDFS or the local filesystem.</td></tr></table><h3>Input requirement: </h3>This component requires an incoming relationship.<h3>System Resource Considerations:</h3>None specified.<h3>See Also:</h3><p><a href="../org.apache.nifi.processors.hadoop.GetHDFS/index.html">GetHDFS</a></p></body></html>
\ No newline at end of file
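
As a worked example of the 'Permissions umask' property in the table above (illustrative values, not defaults): a umask of 022 results in new files being written with mode 644 (666 & ~022) and new directories with mode 755 (777 & ~022). The same default can come from the Hadoop Configuration key dfs.umaskmode named in the property description, which the NiFi property overrides when set; a minimal sketch, assuming that key and an octal value:

    <configuration>
      <property>
        <!-- illustrative cluster-side default; NiFi's "Permissions umask" property overrides this when set -->
        <name>dfs.umaskmode</name>
        <value>022</value>
      </property>
    </configuration>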