Posted to commits@tajo.apache.org by hy...@apache.org on 2014/04/09 13:39:16 UTC

svn commit: r1585942 [2/3] - in /tajo/site/docs: 0.8.0/ 0.8.0/_sources/configuration/ 0.8.0/_sources/partitioning/ 0.8.0/_sources/table_management/ 0.8.0/configuration/ 0.8.0/partitioning/ 0.8.0/table_management/ current/ current/_sources/configuration...

Modified: tajo/site/docs/0.8.0/table_management/rcfile.html
URL: http://svn.apache.org/viewvc/tajo/site/docs/0.8.0/table_management/rcfile.html?rev=1585942&r1=1585941&r2=1585942&view=diff
==============================================================================
--- tajo/site/docs/0.8.0/table_management/rcfile.html (original)
+++ tajo/site/docs/0.8.0/table_management/rcfile.html Wed Apr  9 11:39:15 2014
@@ -7,7 +7,7 @@
   <meta charset="utf-8">
   <meta name="viewport" content="width=device-width, initial-scale=1.0">
   
-  <title>RCFIle &mdash; Apache Tajo 0.8.0 documentation</title>
+  <title>RCFile &mdash; Apache Tajo 0.8.0 documentation</title>
   
 
   
@@ -31,7 +31,7 @@
     <link rel="top" title="Apache Tajo 0.8.0 documentation" href="../index.html"/>
         <link rel="up" title="File Formats" href="file_formats.html"/>
         <link rel="next" title="Parquet" href="parquet.html"/>
-        <link rel="prev" title="CSV" href="csv.html"/> 
+        <link rel="prev" title="CSV (TextFile)" href="csv.html"/> 
 
   
   <script src="https://cdnjs.cloudflare.com/ajax/libs/modernizr/2.6.2/modernizr.min.js"></script>
@@ -153,7 +153,7 @@
       
           <li><a href="file_formats.html">File Formats</a> &raquo;</li>
       
-    <li>RCFIle</li>
+    <li>RCFile</li>
       <li class="wy-breadcrumbs-aside">
         
           <a href="../_sources/table_management/rcfile.txt" rel="nofollow"> View page source</a>
@@ -165,8 +165,127 @@
           <div role="main">
             
   <div class="section" id="rcfile">
-<h1>RCFIle<a class="headerlink" href="#rcfile" title="Permalink to this headline">¶</a></h1>
-<p>(TODO)</p>
+<h1>RCFile<a class="headerlink" href="#rcfile" title="Permalink to this headline">¶</a></h1>
+<p>RCFile, short for Record Columnar File, is a flat file format consisting of binary key/value pairs,
+which shares many similarities with SequenceFile.</p>
+<div class="section" id="how-to-create-a-rcfile-table">
+<h2>How to Create a RCFile Table?<a class="headerlink" href="#how-to-create-a-rcfile-table" title="Permalink to this headline">¶</a></h2>
+<p>If you are not familiar with the <tt class="docutils literal"><span class="pre">CREATE</span> <span class="pre">TABLE</span></tt> statement, please refer to <a class="reference internal" href="../sql_language/ddl.html"><em>Data Definition Language</em></a>.</p>
+<p>In order to specify a certain file format for your table, you need to use the <tt class="docutils literal"><span class="pre">USING</span></tt> clause in your <tt class="docutils literal"><span class="pre">CREATE</span> <span class="pre">TABLE</span></tt>
+statement. Below is an example statement for creating a table using RCFile.</p>
+<div class="highlight-sql"><div class="highlight"><pre><span class="k">CREATE</span> <span class="k">TABLE</span> <span class="n">table1</span> <span class="p">(</span>
+  <span class="n">id</span> <span class="nb">int</span><span class="p">,</span>
+  <span class="n">name</span> <span class="nb">text</span><span class="p">,</span>
+  <span class="n">score</span> <span class="nb">float</span><span class="p">,</span>
+  <span class="k">type</span> <span class="nb">text</span>
+<span class="p">)</span> <span class="k">USING</span> <span class="n">RCFILE</span><span class="p">;</span>
+</pre></div>
+</div>
+</div>
+<div class="section" id="physical-properties">
+<h2>Physical Properties<a class="headerlink" href="#physical-properties" title="Permalink to this headline">¶</a></h2>
+<p>Some table storage formats provide parameters for enabling or disabling features and adjusting physical parameters.
+The <tt class="docutils literal"><span class="pre">WITH</span></tt> clause in the CREATE TABLE statement allows users to set those parameters.</p>
+<p>Now, the RCFile storage type provides the following physical properties.</p>
+<ul class="simple">
+<li><tt class="docutils literal"><span class="pre">rcfile.serde</span></tt> : custom (De)serializer class. <tt class="docutils literal"><span class="pre">org.apache.tajo.storage.BinarySerializerDeserializer</span></tt> is the default (de)serializer class.</li>
+<li><tt class="docutils literal"><span class="pre">rcfile.null</span></tt> : NULL character. It is only used when a table uses <tt class="docutils literal"><span class="pre">org.apache.tajo.storage.TextSerializerDeserializer</span></tt>. The default NULL character is an empty string <tt class="docutils literal"><span class="pre">''</span></tt>. Hive&#8217;s default NULL character is <tt class="docutils literal"><span class="pre">'\\N'</span></tt>.</li>
+<li><tt class="docutils literal"><span class="pre">compression.codec</span></tt> : Compression codec. You can enable compression by setting this property to the fully qualified class name of a codec class inheriting from <a class="reference external" href="https://hadoop.apache.org/docs/current/api/org/apache/hadoop/io/compress/CompressionCodec.html">org.apache.hadoop.io.compress.CompressionCodec</a>. By default, compression is disabled.</li>
+</ul>
+<p>The following is an example for creating a table using RCFile that uses compression.</p>
+<div class="highlight-sql"><div class="highlight"><pre><span class="k">CREATE</span> <span class="k">TABLE</span> <span class="n">table1</span> <span class="p">(</span>
+  <span class="n">id</span> <span class="nb">int</span><span class="p">,</span>
+  <span class="n">name</span> <span class="nb">text</span><span class="p">,</span>
+  <span class="n">score</span> <span class="nb">float</span><span class="p">,</span>
+  <span class="k">type</span> <span class="nb">text</span>
+<span class="p">)</span> <span class="k">USING</span> <span class="n">RCFILE</span> <span class="k">WITH</span> <span class="p">(</span><span class="s1">&#39;compression.codec&#39;</span><span class="o">=</span><span class="s1">&#39;org.apache.hadoop.io.compress.SnappyCodec&#39;</span><span class="p">);</span>
+</pre></div>
+</div>
+</div>
+<div class="section" id="rcfile-de-serializers">
+<h2>RCFile (De)serializers<a class="headerlink" href="#rcfile-de-serializers" title="Permalink to this headline">¶</a></h2>
+<p>Tajo provides two built-in (de)serializers for RCFile:</p>
+<ul class="simple">
+<li><tt class="docutils literal"><span class="pre">org.apache.tajo.storage.TextSerializerDeserializer</span></tt>: stores column values in a plain-text form.</li>
+<li><tt class="docutils literal"><span class="pre">org.apache.tajo.storage.BinarySerializerDeserializer</span></tt>: stores column values in a binary file format.</li>
+</ul>
+<p>The RCFile format can store some metadata in the RCFile header. Tajo writes the (de)serializer class name into
+the metadata header of each RCFile when the RCFile is created in Tajo.</p>
+<div class="admonition note">
+<p class="first admonition-title">Note</p>
+<p class="last"><tt class="docutils literal"><span class="pre">org.apache.tajo.storage.BinarySerializerDeserializer</span></tt> is the default (de)serializer for RCFile.</p>
+</div>
+</div>
+<div class="section" id="compatibility-issues-with-apache-hive">
+<h2>Compatibility Issues with Apache Hive™<a class="headerlink" href="#compatibility-issues-with-apache-hive" title="Permalink to this headline">¶</a></h2>
+<p>Regardless of whether the RCFiles are written by Apache Hive™ or Apache Tajo™, the files are compatible in both systems.
+In other words, Tajo can process RCFiles written by Apache Hive and vice versa.</p>
+<p>Since RCFiles written by Hive contain no such metadata, you need to manually specify the (de)serializer class name
+by setting a physical property.</p>
+<p>In Hive, there are two SerDe classes for RCFile, and they correspond to the following (de)serializers in Tajo.</p>
+<ul class="simple">
+<li><tt class="docutils literal"><span class="pre">org.apache.hadoop.hive.serde2.columnar.ColumnarSerDe</span></tt>: corresponds to <tt class="docutils literal"><span class="pre">TextSerializerDeserializer</span></tt> in Tajo.</li>
+<li><tt class="docutils literal"><span class="pre">org.apache.hadoop.hive.serde2.columnar.LazyBinaryColumnarSerDe</span></tt>: corresponds to <tt class="docutils literal"><span class="pre">BinarySerializerDeserializer</span></tt> in Tajo.</li>
+</ul>
+<p>The compatibility issue mostly occurs when a user creates an external table pointing to data of an existing table.
+The following section explains two cases: 1) the case where Tajo reads RCFile written by Hive, and
+2) the case where Hive reads RCFile written by Tajo.</p>
+<div class="section" id="when-tajo-reads-rcfile-generated-in-hive">
+<h3>When Tajo reads RCFile generated in Hive<a class="headerlink" href="#when-tajo-reads-rcfile-generated-in-hive" title="Permalink to this headline">¶</a></h3>
+<p>To create an external RCFile table generated with <tt class="docutils literal"><span class="pre">ColumnarSerDe</span></tt> in Hive,
+you should set the physical property <tt class="docutils literal"><span class="pre">rcfile.serde</span></tt> in Tajo as follows:</p>
+<div class="highlight-sql"><div class="highlight"><pre><span class="k">CREATE</span> <span class="k">EXTERNAL</span> <span class="k">TABLE</span> <span class="n">table1</span> <span class="p">(</span>
+  <span class="n">id</span> <span class="nb">int</span><span class="p">,</span>
+  <span class="n">name</span> <span class="nb">text</span><span class="p">,</span>
+  <span class="n">score</span> <span class="nb">float</span><span class="p">,</span>
+  <span class="k">type</span> <span class="nb">text</span>
+<span class="p">)</span> <span class="k">USING</span> <span class="n">RCFILE</span> <span class="k">with</span> <span class="p">(</span> <span class="s1">&#39;rcfile.serde&#39;</span><span class="o">=</span><span class="s1">&#39;org.apache.tajo.storage.TextSerializerDeserializer&#39;</span><span class="p">,</span> <span class="s1">&#39;rcfile.null&#39;</span><span class="o">=</span><span class="s1">&#39;\\N&#39;</span> <span class="p">)</span>
+<span class="k">LOCATION</span> <span class="s1">&#39;....&#39;</span><span class="p">;</span>
+</pre></div>
+</div>
+<p>To create an external RCFile table generated with <tt class="docutils literal"><span class="pre">LazyBinaryColumnarSerDe</span></tt> in Hive,
+you should set the physical property <tt class="docutils literal"><span class="pre">rcfile.serde</span></tt> in Tajo as follows:</p>
+<div class="highlight-sql"><div class="highlight"><pre><span class="k">CREATE</span> <span class="k">EXTERNAL</span> <span class="k">TABLE</span> <span class="n">table1</span> <span class="p">(</span>
+  <span class="n">id</span> <span class="nb">int</span><span class="p">,</span>
+  <span class="n">name</span> <span class="nb">text</span><span class="p">,</span>
+  <span class="n">score</span> <span class="nb">float</span><span class="p">,</span>
+  <span class="k">type</span> <span class="nb">text</span>
+<span class="p">)</span> <span class="k">USING</span> <span class="n">RCFILE</span> <span class="k">WITH</span> <span class="p">(</span><span class="s1">&#39;rcfile.serde&#39;</span> <span class="o">=</span> <span class="s1">&#39;org.apache.tajo.storage.BinarySerializerDeserializer&#39;</span><span class="p">)</span>
+<span class="k">LOCATION</span> <span class="s1">&#39;....&#39;</span><span class="p">;</span>
+</pre></div>
+</div>
+<div class="admonition note">
+<p class="first admonition-title">Note</p>
+<p class="last">As mentioned above, <tt class="docutils literal"><span class="pre">BinarySerializerDeserializer</span></tt> is the default (de)serializer for RCFile.
+So, you can omit the <tt class="docutils literal"><span class="pre">rcfile.serde</span></tt> property only when using <tt class="docutils literal"><span class="pre">org.apache.tajo.storage.BinarySerializerDeserializer</span></tt>.</p>
+</div>
+</div>
+<div class="section" id="when-hive-reads-rcfile-generated-in-tajo">
+<h3>When Hive reads RCFile generated in Tajo<a class="headerlink" href="#when-hive-reads-rcfile-generated-in-tajo" title="Permalink to this headline">¶</a></h3>
+<p>To create an external RCFile table written by Tajo with <tt class="docutils literal"><span class="pre">TextSerializerDeserializer</span></tt>,
+you should set the <tt class="docutils literal"><span class="pre">SERDE</span></tt> as follows:</p>
+<div class="highlight-sql"><div class="highlight"><pre><span class="k">CREATE</span> <span class="k">TABLE</span> <span class="n">table1</span> <span class="p">(</span>
+  <span class="n">id</span> <span class="nb">int</span><span class="p">,</span>
+  <span class="n">name</span> <span class="n">string</span><span class="p">,</span>
+  <span class="n">score</span> <span class="nb">float</span><span class="p">,</span>
+  <span class="k">type</span> <span class="n">string</span>
+<span class="p">)</span> <span class="k">ROW</span> <span class="n">FORMAT</span> <span class="n">SERDE</span> <span class="s1">&#39;org.apache.hadoop.hive.serde2.columnar.ColumnarSerDe&#39;</span> <span class="n">STORED</span> <span class="k">AS</span> <span class="n">RCFILE</span>
+<span class="k">LOCATION</span> <span class="s1">&#39;&lt;hdfs_location&gt;&#39;</span><span class="p">;</span>
+</pre></div>
+</div>
+<p>To create an external RCFile table written by Tajo with <tt class="docutils literal"><span class="pre">BinarySerializerDeserializer</span></tt>,
+you should set the <tt class="docutils literal"><span class="pre">SERDE</span></tt> as follows:</p>
+<div class="highlight-sql"><div class="highlight"><pre><span class="k">CREATE</span> <span class="k">TABLE</span> <span class="n">table1</span> <span class="p">(</span>
+  <span class="n">id</span> <span class="nb">int</span><span class="p">,</span>
+  <span class="n">name</span> <span class="n">string</span><span class="p">,</span>
+  <span class="n">score</span> <span class="nb">float</span><span class="p">,</span>
+  <span class="k">type</span> <span class="n">string</span>
+<span class="p">)</span> <span class="k">ROW</span> <span class="n">FORMAT</span> <span class="n">SERDE</span> <span class="s1">&#39;org.apache.hadoop.hive.serde2.columnar.LazyBinaryColumnarSerDe&#39;</span> <span class="n">STORED</span> <span class="k">AS</span> <span class="n">RCFILE</span>
+<span class="k">LOCATION</span> <span class="s1">&#39;&lt;hdfs_location&gt;&#39;</span><span class="p">;</span>
+</pre></div>
+</div>
+</div>
+</div>
 </div>
 
 
@@ -178,7 +297,7 @@
         <a href="parquet.html" class="btn btn-neutral float-right" title="Parquet"/>Next <span class="fa fa-arrow-circle-right"></span></a>
       
       
-        <a href="csv.html" class="btn btn-neutral" title="CSV"><span class="fa fa-arrow-circle-left"></span> Previous</a>
+        <a href="csv.html" class="btn btn-neutral" title="CSV (TextFile)"><span class="fa fa-arrow-circle-left"></span> Previous</a>
       
     </div>
   

Modified: tajo/site/docs/current/_sources/configuration/cluster_setup.txt
URL: http://svn.apache.org/viewvc/tajo/site/docs/current/_sources/configuration/cluster_setup.txt?rev=1585942&r1=1585941&r2=1585942&view=diff
==============================================================================
--- tajo/site/docs/current/_sources/configuration/cluster_setup.txt (original)
+++ tajo/site/docs/current/_sources/configuration/cluster_setup.txt Wed Apr  9 11:39:15 2014
@@ -33,10 +33,30 @@ Please add the following configs to tajo
   </property>
 
   <property>
+    <name>tajo.resource-tracker.rpc.address</name>
+    <value>hostname:26003</value>
+  </property>
+
+  <property>
     <name>tajo.catalog.client-rpc.address</name>
     <value>hostname:26005</value>
   </property>
 
+Workers
+--------------------------------------------------------
+
+The file ``conf/workers`` lists all host names of workers, one per line.
+By default, this file contains the single entry ``localhost``.
+You can easily add host names of workers via your favorite text editor.
+
+For example: ::
+
+  $ cat > conf/workers
+  host1.domain.com
+  host2.domain.com
+  ....
+
+  <ctrl + d>
 
 Make base directories and set permissions
 --------------------------------------------------------

Modified: tajo/site/docs/current/_sources/partitioning/column_partitioning.txt
URL: http://svn.apache.org/viewvc/tajo/site/docs/current/_sources/partitioning/column_partitioning.txt?rev=1585942&r1=1585941&r2=1585942&view=diff
==============================================================================
--- tajo/site/docs/current/_sources/partitioning/column_partitioning.txt (original)
+++ tajo/site/docs/current/_sources/partitioning/column_partitioning.txt Wed Apr  9 11:39:15 2014
@@ -2,4 +2,51 @@
 Column Partitioning
 *********************************
 
-.. todo::
\ No newline at end of file
+Column partitioning in Tajo is designed to be compatible with the table partitioning of Apache Hive™.
+
+================================================
+How to Create a Column Partitioned Table
+================================================
+
+You can create a partitioned table by using the ``PARTITION BY`` clause. For a column partitioned table, you should use
+the ``PARTITION BY COLUMN`` clause with partition keys.
+
+For example, assume there is a table ``orders`` composed of the following schema. ::
+
+  id          INT,
+  item_name   TEXT,
+  price       FLOAT
+
+Also, assume that you want to use ``order_date TEXT`` and ``ship_date TEXT`` as the partition keys.
+Then, you should create a table as follows:
+
+.. code-block:: sql
+
+  CREATE TABLE orders (
+    id INT,
+    item_name TEXT,
+    price FLOAT
+  ) PARTITION BY COLUMN (order_date TEXT, ship_date TEXT);
+
+==================================================
+Partition Pruning on Column Partitioned Tables
+==================================================
+
+The following predicates in the ``WHERE`` clause can be used to prune unqualified column partitions during the
+query planning phase, so that those partitions are never processed.
+
+* ``=``
+* ``<>``
+* ``>``
+* ``<``
+* ``>=``
+* ``<=``
+* LIKE predicates with a leading wild-card character
+* IN list predicates
+
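+For instance, a query that restricts a partition key with one of the predicates above (a
+hypothetical example, assuming the partitioned ``orders`` table defined earlier) only scans the
+partitions matching the predicate:
+
+.. code-block:: sql
+
+  SELECT id, item_name, price
+  FROM orders
+  WHERE order_date = '2014-04-01';
+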
+==================================================
+Compatibility Issues with Apache Hive™
+==================================================
+
+If partitioned tables created by Hive are registered as external tables in Tajo, Tajo can process them directly.
+No compatibility issues have been found yet.
\ No newline at end of file

Modified: tajo/site/docs/current/_sources/partitioning/intro_to_partitioning.txt
URL: http://svn.apache.org/viewvc/tajo/site/docs/current/_sources/partitioning/intro_to_partitioning.txt?rev=1585942&r1=1585941&r2=1585942&view=diff
==============================================================================
--- tajo/site/docs/current/_sources/partitioning/intro_to_partitioning.txt (original)
+++ tajo/site/docs/current/_sources/partitioning/intro_to_partitioning.txt Wed Apr  9 11:39:15 2014
@@ -2,9 +2,8 @@
 Introduction to Partitioning
 **************************************
 
-======================
-Partition Key
-======================
+Table partitioning provides two benefits: easy table management and data pruning by partition keys.
+Currently, Apache Tajo only provides Apache Hive-compatible column partitioning.
 
 =========================
 Partitioning Methods

Modified: tajo/site/docs/current/_sources/table_management/csv.txt
URL: http://svn.apache.org/viewvc/tajo/site/docs/current/_sources/table_management/csv.txt?rev=1585942&r1=1585941&r2=1585942&view=diff
==============================================================================
--- tajo/site/docs/current/_sources/table_management/csv.txt (original)
+++ tajo/site/docs/current/_sources/table_management/csv.txt Wed Apr  9 11:39:15 2014
@@ -1,6 +1,110 @@
 *************************************
-CSV
+CSV (TextFile)
 *************************************
 
+A character-separated values (CSV) file represents a tabular data set consisting of rows and columns.
+Each row is a plain-text line. A line is usually terminated by a line feed ``\n`` or a carriage return ``\r``.
+The line feed ``\n`` is the default line delimiter in Tajo. Each record consists of multiple fields, separated by
+some other character or string, most commonly a literal vertical bar ``|``, comma ``,``, or tab ``\t``.
+The vertical bar is the default field delimiter in Tajo.
 
-(TODO)
\ No newline at end of file
+=========================================
+How to Create a CSV Table?
+=========================================
+
+If you are not familiar with the ``CREATE TABLE`` statement, please refer to :doc:`/sql_language/ddl`.
+
+In order to specify a certain file format for your table, you need to use the ``USING`` clause in your ``CREATE TABLE``
+statement. Below is an example statement for creating a table using CSV files.
+
+.. code-block:: sql
+
+ CREATE TABLE table1 (
+  id int,
+  name text,
+  score float,
+  type text
+ ) USING CSV;
+
+=========================================
+Physical Properties
+=========================================
+
+Some table storage formats provide parameters for enabling or disabling features and adjusting physical parameters.
+The ``WITH`` clause in the CREATE TABLE statement allows users to set those parameters.
+
+Now, the CSV storage format provides the following physical properties.
+
+* ``csvfile.delimiter``: delimiter character. ``|`` or ``\u0001`` is usually used, and the default field delimiter is ``|``.
+* ``csvfile.null``: NULL character. The default NULL character is an empty string ``''``. Hive's default NULL character is ``'\\N'``.
+* ``compression.codec``: Compression codec. You can enable compression by setting this property to the fully qualified class name of a codec class inheriting from `org.apache.hadoop.io.compress.CompressionCodec <https://hadoop.apache.org/docs/current/api/org/apache/hadoop/io/compress/CompressionCodec.html>`_. By default, compression is disabled.
+* ``csvfile.serde``: custom (De)serializer class. ``org.apache.tajo.storage.TextSerializerDeserializer`` is the default (De)serializer class.
+
+The following example sets a custom field delimiter, NULL character, and compression codec:
+
+.. code-block:: sql
+
+ CREATE TABLE table1 (
+  id int,
+  name text,
+  score float,
+  type text
+ ) USING CSV WITH('csvfile.delimiter'='\u0001',
+                  'csvfile.null'='\\N',
+                  'compression.codec'='org.apache.hadoop.io.compress.SnappyCodec');
+
+.. warning::
+
+  Be careful when using ``\n`` as the field delimiter because CSV uses ``\n`` as the line delimiter.
+  At the moment, Tajo does not provide a way to specify the line delimiter.
+
+=========================================
+Custom (De)serializer
+=========================================
+
+The CSV storage format not only provides reading and writing interfaces for CSV data but also allows users to process custom
+plain-text file formats with user-defined (de)serializer classes.
+For example, with custom (de)serializers, Tajo can process JSON files or any other specialized plain-text file format.
+
+In order to specify a custom (De)serializer, set a physical property ``csvfile.serde``.
+The property value should be a fully qualified class name.
+
+For example:
+
+.. code-block:: sql
+
+ CREATE TABLE table1 (
+  id int,
+  name text,
+  score float,
+  type text
+ ) USING CSV WITH ('csvfile.serde'='org.my.storage.CustomSerializerDeserializer')
+
+
+=========================================
+Null Value Handling Issues
+=========================================
+By default, the NULL character in CSV files is an empty string ``''``.
+In other words, an empty field is recognized as a NULL value in Tajo.
+If a field's domain is ``TEXT``, an empty field is recognized as the string value ``''`` instead of a NULL value.
+You can also use your own NULL character by specifying the physical property ``csvfile.null``.
+
+=========================================
+Compatibility Issues with Apache Hive™
+=========================================
+
+CSV files generated in Tajo can be processed directly by Apache Hive™ without further processing.
+In this section, we explain some compatibility issues for users who use both Hive and Tajo.
+
+If you set a custom field delimiter, the CSV tables cannot be directly used in Hive.
+In order to specify the custom field delimiter in Hive, you need to use ``ROW FORMAT DELIMITED FIELDS TERMINATED BY``
+clause in a Hive's ``CREATE TABLE`` statement as follows:
+
+.. code-block:: sql
+
+ CREATE TABLE table1 (id int, name string, score float, type string)
+ ROW FORMAT DELIMITED FIELDS TERMINATED BY '|'
+ STORED AS TEXTFILE
+
+To the best of our knowledge, there is no way to specify a custom NULL character in Hive.
\ No newline at end of file

Modified: tajo/site/docs/current/_sources/table_management/parquet.txt
URL: http://svn.apache.org/viewvc/tajo/site/docs/current/_sources/table_management/parquet.txt?rev=1585942&r1=1585941&r2=1585942&view=diff
==============================================================================
--- tajo/site/docs/current/_sources/table_management/parquet.txt (original)
+++ tajo/site/docs/current/_sources/table_management/parquet.txt Wed Apr  9 11:39:15 2014
@@ -2,5 +2,47 @@
 Parquet
 *************************************
 
+Parquet is a columnar storage format for Hadoop. Parquet is designed to make the advantages of compressed,
+efficient columnar data representation available to any project in the Hadoop ecosystem,
+regardless of the choice of data processing framework, data model, or programming language.
+For more details, please refer to `Parquet File Format <http://parquet.io/>`_.
 
-(TODO)
\ No newline at end of file
+=========================================
+How to Create a Parquet Table?
+=========================================
+
+If you are not familiar with the ``CREATE TABLE`` statement, please refer to :doc:`/sql_language/ddl`.
+
+In order to specify a certain file format for your table, you need to use the ``USING`` clause in your ``CREATE TABLE``
+statement. Below is an example statement for creating a table using Parquet files.
+
+.. code-block:: sql
+
+  CREATE TABLE table1 (
+    id int,
+    name text,
+    score float,
+    type text
+  ) USING PARQUET;
+
+=========================================
+Physical Properties
+=========================================
+
+Some table storage formats provide parameters for enabling or disabling features and adjusting physical parameters.
+The ``WITH`` clause in the CREATE TABLE statement allows users to set those parameters.
+
+Currently, the Parquet storage format provides the following physical properties.
+
+* ``parquet.block.size``: The block size is the size of a row group being buffered in memory. This limits the memory usage when writing. Larger values will improve the I/O when reading but consume more memory when writing. Default size is 134217728 bytes (= 128 * 1024 * 1024).
+* ``parquet.page.size``: The page size is for compression. When reading, each page can be decompressed independently. A block is composed of pages. The page is the smallest unit that must be read fully to access a single record. If this value is too small, the compression will deteriorate. Default size is 1048576 bytes (= 1 * 1024 * 1024).
+* ``parquet.compression``: The compression algorithm used to compress pages. It should be one of ``uncompressed``, ``snappy``, ``gzip``, ``lzo``. Default is ``uncompressed``.
+* ``parquet.enable.dictionary``: Enables or disables dictionary encoding. It should be either ``true`` or ``false``. Default is ``true``.
+
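+Following the ``WITH`` clause pattern used for the other storage formats, these properties could be
+set as in the sketch below; the property names are the ones listed above, but the chosen values are
+only illustrative:
+
+.. code-block:: sql
+
+  CREATE TABLE table1 (
+    id int,
+    name text,
+    score float,
+    type text
+  ) USING PARQUET WITH ('parquet.compression'='snappy',
+                        'parquet.enable.dictionary'='true');
+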
+=========================================
+Compatibility Issues with Apache Hive™
+=========================================
+
+At the moment, Tajo only supports flat relational tables.
+As a result, Tajo's Parquet storage type does not support nested schemas.
+However, we are currently working on adding support for nested schemas and non-scalar types (`TAJO-710 <https://issues.apache.org/jira/browse/TAJO-710>`_).
\ No newline at end of file

Modified: tajo/site/docs/current/_sources/table_management/rcfile.txt
URL: http://svn.apache.org/viewvc/tajo/site/docs/current/_sources/table_management/rcfile.txt?rev=1585942&r1=1585941&r2=1585942&view=diff
==============================================================================
--- tajo/site/docs/current/_sources/table_management/rcfile.txt (original)
+++ tajo/site/docs/current/_sources/table_management/rcfile.txt Wed Apr  9 11:39:15 2014
@@ -1,6 +1,149 @@
 *************************************
-RCFIle
+RCFile
 *************************************
 
+RCFile, short for Record Columnar File, is a flat file format consisting of binary key/value pairs,
+which shares many similarities with SequenceFile.
 
-(TODO)
\ No newline at end of file
+=========================================
+How to Create a RCFile Table?
+=========================================
+
+If you are not familiar with the ``CREATE TABLE`` statement, please refer to :doc:`/sql_language/ddl`.
+
+In order to specify a certain file format for your table, you need to use the ``USING`` clause in your ``CREATE TABLE``
+statement. Below is an example statement for creating a table using RCFile.
+
+.. code-block:: sql
+
+  CREATE TABLE table1 (
+    id int,
+    name text,
+    score float,
+    type text
+  ) USING RCFILE;
+
+=========================================
+Physical Properties
+=========================================
+
+Some table storage formats provide parameters for enabling or disabling features and adjusting physical parameters.
+The ``WITH`` clause in the CREATE TABLE statement allows users to set those parameters.
+
+Now, the RCFile storage type provides the following physical properties.
+
+* ``rcfile.serde`` : custom (De)serializer class. ``org.apache.tajo.storage.BinarySerializerDeserializer`` is the default (de)serializer class.
+* ``rcfile.null`` : NULL character. It is only used when a table uses ``org.apache.tajo.storage.TextSerializerDeserializer``. The default NULL character is an empty string ``''``. Hive's default NULL character is ``'\\N'``.
+* ``compression.codec`` : Compression codec. You can enable compression by setting this property to the fully qualified class name of a codec class inheriting from `org.apache.hadoop.io.compress.CompressionCodec <https://hadoop.apache.org/docs/current/api/org/apache/hadoop/io/compress/CompressionCodec.html>`_. By default, compression is disabled.
+
+The following is an example for creating a table using RCFile that uses compression.
+
+.. code-block:: sql
+
+  CREATE TABLE table1 (
+    id int,
+    name text,
+    score float,
+    type text
+  ) USING RCFILE WITH ('compression.codec'='org.apache.hadoop.io.compress.SnappyCodec');
+
+=========================================
+RCFile (De)serializers
+=========================================
+
+Tajo provides two built-in (de)serializers for RCFile:
+
+* ``org.apache.tajo.storage.TextSerializerDeserializer``: stores column values in a plain-text form.
+* ``org.apache.tajo.storage.BinarySerializerDeserializer``: stores column values in a binary file format.
+
+The RCFile format can store some metadata in the file header. When an RCFile is created in Tajo,
+Tajo writes the (de)serializer class name into the metadata header of the file.
+
+.. note::
+
+  ``org.apache.tajo.storage.BinarySerializerDeserializer`` is the default (de)serializer for RCFile.
+
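+As an illustration, the following hypothetical DDL creates an RCFile table that explicitly selects the text (de)serializer and a Hive-compatible NULL character. The table and column names are made up; the property names come from the list above:
+
+.. code-block:: sql
+
+  CREATE TABLE table2 (
+    id int,
+    name text
+  ) USING RCFILE WITH ('rcfile.serde' = 'org.apache.tajo.storage.TextSerializerDeserializer', 'rcfile.null' = '\\N');
+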
+
+=========================================
+Compatibility Issues with Apache Hive™
+=========================================
+
+RCFiles are compatible between Apache Hive™ and Apache Tajo™ regardless of which system wrote them.
+In other words, Tajo can process RCFiles written by Apache Hive, and vice versa.
+
+Since RCFiles written by Hive contain no such metadata, you need to specify the (de)serializer class name
+manually by setting a physical property.
+
+Hive provides two SerDes for RCFile, and they correspond to the following (de)serializers in Tajo:
+
+* ``org.apache.hadoop.hive.serde2.columnar.ColumnarSerDe``: corresponds to ``TextSerializerDeserializer`` in Tajo.
+* ``org.apache.hadoop.hive.serde2.columnar.LazyBinaryColumnarSerDe``: corresponds to ``BinarySerializerDeserializer`` in Tajo.
+
+The compatibility issue mostly occurs when a user creates an external table pointing to data of an existing table.
+The following sections explain two cases: 1) Tajo reading an RCFile written by Hive, and
+2) Hive reading an RCFile written by Tajo.
+
+-----------------------------------------
+When Tajo reads RCFile generated in Hive
+-----------------------------------------
+
+To create an external table in Tajo over an RCFile generated with ``ColumnarSerDe`` in Hive,
+you should set the physical property ``rcfile.serde`` as follows:
+
+.. code-block:: sql
+
+  CREATE EXTERNAL TABLE table1 (
+    id int,
+    name text,
+    score float,
+    type text
+  ) USING RCFILE WITH ('rcfile.serde' = 'org.apache.tajo.storage.TextSerializerDeserializer', 'rcfile.null' = '\\N')
+  LOCATION '....';
+
+To create an external table in Tajo over an RCFile generated with ``LazyBinaryColumnarSerDe`` in Hive,
+you should set the physical property ``rcfile.serde`` as follows:
+
+.. code-block:: sql
+
+  CREATE EXTERNAL TABLE table1 (
+    id int,
+    name text,
+    score float,
+    type text
+  ) USING RCFILE WITH ('rcfile.serde' = 'org.apache.tajo.storage.BinarySerializerDeserializer')
+  LOCATION '....';
+
+.. note::
+
+  As mentioned above, ``BinarySerializerDeserializer`` is the default (de)serializer for RCFile.
+  So, you can omit ``rcfile.serde`` when the table uses ``org.apache.tajo.storage.BinarySerializerDeserializer``.
+
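+For example, the DDL above for data written with ``LazyBinaryColumnarSerDe`` can be shortened by dropping the ``WITH`` clause entirely (the location is a placeholder, as in the examples above):
+
+.. code-block:: sql
+
+  CREATE EXTERNAL TABLE table1 (
+    id int,
+    name text,
+    score float,
+    type text
+  ) USING RCFILE LOCATION '....';
+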
+-----------------------------------------
+When Hive reads RCFile generated in Tajo
+-----------------------------------------
+
+To read in Hive an RCFile table written by Tajo with ``TextSerializerDeserializer``,
+you should create a table with the ``SERDE`` set as follows:
+
+.. code-block:: sql
+
+  CREATE TABLE table1 (
+    id int,
+    name string,
+    score float,
+    type string
+  ) ROW FORMAT SERDE 'org.apache.hadoop.hive.serde2.columnar.ColumnarSerDe' STORED AS RCFILE
+  LOCATION '<hdfs_location>';
+
+To read in Hive an RCFile table written by Tajo with ``BinarySerializerDeserializer``,
+you should create a table with the ``SERDE`` set as follows:
+
+.. code-block:: sql
+
+  CREATE TABLE table1 (
+    id int,
+    name string,
+    score float,
+    type string
+  ) ROW FORMAT SERDE 'org.apache.hadoop.hive.serde2.columnar.LazyBinaryColumnarSerDe' STORED AS RCFILE
+  LOCATION '<hdfs_location>';
\ No newline at end of file

Modified: tajo/site/docs/current/configuration/cluster_setup.html
URL: http://svn.apache.org/viewvc/tajo/site/docs/current/configuration/cluster_setup.html?rev=1585942&r1=1585941&r2=1585942&view=diff
==============================================================================
--- tajo/site/docs/current/configuration/cluster_setup.html (original)
+++ tajo/site/docs/current/configuration/cluster_setup.html Wed Apr  9 11:39:15 2014
@@ -198,6 +198,21 @@
 </pre></div>
 </div>
 </div>
+<div class="section" id="workers">
+<h3>Workers<a class="headerlink" href="#workers" title="Permalink to this headline">¶</a></h3>
+<p>The file <tt class="docutils literal"><span class="pre">conf/workers</span></tt> lists the host names of all workers, one per line.
+By default, this file contains the single entry <tt class="docutils literal"><span class="pre">localhost</span></tt>.
+You can add worker host names with your favorite text editor.</p>
+<p>For example:</p>
+<div class="highlight-python"><div class="highlight"><pre>$ cat &gt; conf/workers
+host1.domain.com
+host2.domain.com
+....
+
+&lt;ctrl + d&gt;
+</pre></div>
+</div>
+</div>
 <div class="section" id="make-base-directories-and-set-permissions">
 <h3>Make base directories and set permissions<a class="headerlink" href="#make-base-directories-and-set-permissions" title="Permalink to this headline">¶</a></h3>
 <p>If you want to know Tajo’s configuration in more detail, see Configuration page.
@@ -291,4 +306,4 @@ $ $HADOOP_HOME/bin/hadoop fs -chmod g+w 
    
 
 </body>
-</html>
+</html>
\ No newline at end of file

Modified: tajo/site/docs/current/index.html
URL: http://svn.apache.org/viewvc/tajo/site/docs/current/index.html?rev=1585942&r1=1585941&r2=1585942&view=diff
==============================================================================
--- tajo/site/docs/current/index.html (original)
+++ tajo/site/docs/current/index.html Wed Apr  9 11:39:15 2014
@@ -266,8 +266,8 @@
 </li>
 <li class="toctree-l1"><a class="reference internal" href="table_management.html">Table Management</a><ul>
 <li class="toctree-l2"><a class="reference internal" href="table_management/file_formats.html">File Formats</a><ul>
-<li class="toctree-l3"><a class="reference internal" href="table_management/csv.html">CSV</a></li>
-<li class="toctree-l3"><a class="reference internal" href="table_management/rcfile.html">RCFIle</a></li>
+<li class="toctree-l3"><a class="reference internal" href="table_management/csv.html">CSV (TextFile)</a></li>
+<li class="toctree-l3"><a class="reference internal" href="table_management/rcfile.html">RCFile</a></li>
 <li class="toctree-l3"><a class="reference internal" href="table_management/parquet.html">Parquet</a></li>
 <li class="toctree-l3"><a class="reference internal" href="table_management/sequencefile.html">SequenceFile</a></li>
 </ul>
@@ -277,11 +277,15 @@
 </li>
 <li class="toctree-l1"><a class="reference internal" href="table_partitioning.html">Table Partitioning</a><ul>
 <li class="toctree-l2"><a class="reference internal" href="partitioning/intro_to_partitioning.html">Introduction to Partitioning</a><ul>
-<li class="toctree-l3"><a class="reference internal" href="partitioning/intro_to_partitioning.html#partition-key">Partition Key</a></li>
 <li class="toctree-l3"><a class="reference internal" href="partitioning/intro_to_partitioning.html#partitioning-methods">Partitioning Methods</a></li>
 </ul>
 </li>
-<li class="toctree-l2"><a class="reference internal" href="partitioning/column_partitioning.html">Column Partitioning</a></li>
+<li class="toctree-l2"><a class="reference internal" href="partitioning/column_partitioning.html">Column Partitioning</a><ul>
+<li class="toctree-l3"><a class="reference internal" href="partitioning/column_partitioning.html#how-to-create-a-column-partitioned-table">How to Create a Column Partitioned Table</a></li>
+<li class="toctree-l3"><a class="reference internal" href="partitioning/column_partitioning.html#partition-pruning-on-column-partitioned-tables">Partition Pruning on Column Partitioned Tables</a></li>
+<li class="toctree-l3"><a class="reference internal" href="partitioning/column_partitioning.html#compatibility-issues-with-apache-hive">Compatibility Issues with Apache Hive™</a></li>
+</ul>
+</li>
 <li class="toctree-l2"><a class="reference internal" href="partitioning/range_partitioning.html">Range Partitioning</a></li>
 <li class="toctree-l2"><a class="reference internal" href="partitioning/hash_partitioning.html">Hash Partitioning</a></li>
 </ul>

Modified: tajo/site/docs/current/partitioning/column_partitioning.html
URL: http://svn.apache.org/viewvc/tajo/site/docs/current/partitioning/column_partitioning.html?rev=1585942&r1=1585941&r2=1585942&view=diff
==============================================================================
--- tajo/site/docs/current/partitioning/column_partitioning.html (original)
+++ tajo/site/docs/current/partitioning/column_partitioning.html Wed Apr  9 11:39:15 2014
@@ -164,8 +164,46 @@
             
   <div class="section" id="column-partitioning">
 <h1>Column Partitioning<a class="headerlink" href="#column-partitioning" title="Permalink to this headline">¶</a></h1>
-<div class="admonition-todo admonition" id="index-0">
-<p class="first last admonition-title">Todo</p>
+<p>Column table partitioning is designed to be compatible with the partitioning of Apache Hive™.</p>
+<div class="section" id="how-to-create-a-column-partitioned-table">
+<h2>How to Create a Column Partitioned Table<a class="headerlink" href="#how-to-create-a-column-partitioned-table" title="Permalink to this headline">¶</a></h2>
+<p>You can create a partitioned table by using the <tt class="docutils literal"><span class="pre">PARTITION</span> <span class="pre">BY</span></tt> clause. For a column partitioned table, you should use
+the <tt class="docutils literal"><span class="pre">PARTITION</span> <span class="pre">BY</span> <span class="pre">COLUMN</span></tt> clause with partition keys.</p>
+<p>For example, assume there is a table <tt class="docutils literal"><span class="pre">orders</span></tt> composed of the following schema.</p>
+<div class="highlight-python"><div class="highlight"><pre>id          INT,
+item_name   TEXT,
+price       FLOAT
+</pre></div>
+</div>
+<p>Also, assume that you want to use <tt class="docutils literal"><span class="pre">order_date</span> <span class="pre">TEXT</span></tt> and <tt class="docutils literal"><span class="pre">ship_date</span> <span class="pre">TEXT</span></tt> as the partition keys.
+Then, you should create a table as follows:</p>
+<div class="highlight-sql"><div class="highlight"><pre><span class="k">CREATE</span> <span class="k">TABLE</span> <span class="n">orders</span> <span class="p">(</span>
+  <span class="n">id</span> <span class="nb">INT</span><span class="p">,</span>
+  <span class="n">item_name</span> <span class="nb">TEXT</span><span class="p">,</span>
+  <span class="n">price</span> <span class="nb">FLOAT</span>
+<span class="p">)</span> <span class="n">PARTITION</span> <span class="k">BY</span> <span class="k">COLUMN</span> <span class="p">(</span><span class="n">order_date</span> <span class="nb">TEXT</span><span class="p">,</span> <span class="n">ship_date</span> <span class="nb">TEXT</span><span class="p">);</span>
+</pre></div>
+</div>
+</div>
+<div class="section" id="partition-pruning-on-column-partitioned-tables">
+<h2>Partition Pruning on Column Partitioned Tables<a class="headerlink" href="#partition-pruning-on-column-partitioned-tables" title="Permalink to this headline">¶</a></h2>
+<p>The following predicates in the <tt class="docutils literal"><span class="pre">WHERE</span></tt> clause can be used to prune unqualified column partitions
+during the query planning phase, so that they are never processed:</p>
+<ul class="simple">
+<li><tt class="docutils literal"><span class="pre">=</span></tt></li>
+<li><tt class="docutils literal"><span class="pre">&lt;&gt;</span></tt></li>
+<li><tt class="docutils literal"><span class="pre">&gt;</span></tt></li>
+<li><tt class="docutils literal"><span class="pre">&lt;</span></tt></li>
+<li><tt class="docutils literal"><span class="pre">&gt;=</span></tt></li>
+<li><tt class="docutils literal"><span class="pre">&lt;=</span></tt></li>
+<li>LIKE predicates with a leading wild-card character</li>
+<li>IN list predicates</li>
+</ul>
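+<p>For example, given the partitioned <tt class="docutils literal"><span class="pre">orders</span></tt> table above, an equality predicate on a partition key allows Tajo to skip every partition except the matching one (the date value is made up for illustration):</p>
+<div class="highlight-sql"><div class="highlight"><pre>SELECT id, item_name, price
+FROM orders
+WHERE order_date = '2014-04-01';
+</pre></div>
+</div>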
+</div>
+<div class="section" id="compatibility-issues-with-apache-hive">
+<h2>Compatibility Issues with Apache Hive™<a class="headerlink" href="#compatibility-issues-with-apache-hive" title="Permalink to this headline">¶</a></h2>
+<p>If partitioned tables of Hive are created as external tables in Tajo, Tajo can process the Hive partitioned tables directly.
+There are no known compatibility issues yet.</p>
 </div>
 </div>
 

Modified: tajo/site/docs/current/partitioning/intro_to_partitioning.html
URL: http://svn.apache.org/viewvc/tajo/site/docs/current/partitioning/intro_to_partitioning.html?rev=1585942&r1=1585941&r2=1585942&view=diff
==============================================================================
--- tajo/site/docs/current/partitioning/intro_to_partitioning.html (original)
+++ tajo/site/docs/current/partitioning/intro_to_partitioning.html Wed Apr  9 11:39:15 2014
@@ -164,9 +164,8 @@
             
   <div class="section" id="introduction-to-partitioning">
 <h1>Introduction to Partitioning<a class="headerlink" href="#introduction-to-partitioning" title="Permalink to this headline">¶</a></h1>
-<div class="section" id="partition-key">
-<h2>Partition Key<a class="headerlink" href="#partition-key" title="Permalink to this headline">¶</a></h2>
-</div>
+<p>Table partitioning provides two benefits: easy table management and data pruning by partition keys.
+Currently, Apache Tajo only provides Apache Hive-compatible column partitioning.</p>
 <div class="section" id="partitioning-methods">
 <h2>Partitioning Methods<a class="headerlink" href="#partitioning-methods" title="Permalink to this headline">¶</a></h2>
 <dl class="docutils">

Modified: tajo/site/docs/current/searchindex.js
URL: http://svn.apache.org/viewvc/tajo/site/docs/current/searchindex.js?rev=1585942&r1=1585941&r2=1585942&view=diff
==============================================================================
--- tajo/site/docs/current/searchindex.js (original)
+++ tajo/site/docs/current/searchindex.js Wed Apr  9 11:39:15 2014
@@ -1 +1 @@
-Search.setIndex({envversion:42,terms:{represent:15,all:[37,12,36,2,3,6,7,42,43,34],code:[3,42,23],dist:[37,14,42],c_name:34,queri:[42,12,8,23],month:15,four:39,concept:37,mno:43,follow:[25,24,11,12,36,37,38,3,28,14,30,18,42,19,43,20,39,34,8,7],disk:[3,8],row:[18,43,7,42,12],profil:36,privat:37,depend:[36,7],c_acctbal:34,zone:20,getconnect:37,decim:20,init:43,program:25,drivermanag:37,sql_languag:12,backup_and_restor:12,"case":[35,37,7,25,42,18],digit:20,sourc:[12,42,23],string:[10,3,25,15],fals:[20,3,7],"void":37,csvfile:[30,43],volum:[12,43],veri:[2,38,7],affect:[42,8],exact:20,dbname:12,tri:37,recordcompresswrit:18,list:[9,43,12,19],hive:[36,18,42],"try":[37,12],item:7,adjust:[24,8],form:24,progress:13,geoip:3,println:37,tajo_src:14,locat:[11,3,30,43,34,8],tajo_hom:[25,37,12,2,38,42,43],dir:[38,11],pleas:[37,2,38,3,43,34,8],upper:35,templet:25,smaller:11,ten:[36,23],annot:37,enjoi:43,dowload:23,second:[35,42],cost:13,design:[13,25],val3:12,val2:[7,12],val1:[7,12],port:[24,37,38,12
 ,42],tmpdir:8,compat:[18,8],index:9,abc:[7,43],addr:[3,42],compar:20,hdf:[24,12,13,37,38,30,43,23],section:[36,37,38,23,8],xtajoxx:35,access:[36,37,38],c_phone:34,"19th":15,version:[12,42],directori:42,supplier:12,"new":12,method:[9,34],elimin:30,hash:9,valn:7,gener:12,o_totalpric:12,here:[2,7,43,18],let:18,address:[20,40,38,3,42],path:[24,12,36,37,3,30,42,43,34],o_orderprior:12,strong:34,valu:[25,24,12,36,38,3,42,8],convert:[3,45],aaabc:7,derbi:42,shift:15,larger:20,table_backup:34,precis:20,datetim:10,codebas:19,implement:[2,12],sequencefil:[9,39],via:23,data_typ:30,apach:[23,3,42,12],modul:[9,25],sign:7,filenam:12,unix:15,"boolean":[20,3],namenode_hostnam:24,wip:[42,19],instal:[3,14],total:8,unit:42,from:[9,11,35,37,15,30,7,19,34,43],describ:[36,12],would:[36,14],memori:42,distinct:6,two:[2,7,25,34,45,18],next:[36,37],connector:[36,42],few:25,call:9,usr:42,compressiontyp:18,type:[9,12,3,15,25,43,34,8,35],until:12,more:[36,38,7,42,43,23,8],flat:18,discompress:14,l_quantiti:11,hive
 ql:36,claus:[9,43,7,20],tbname:12,trail:35,rootdir:38,appendix:9,name2:12,name3:12,name1:12,git:42,unpack:23,must:[20,24,37,42],hour:15,alia:[20,6,35,15],setup:33,work:[20,12],uniqu:12,dev:3,cat:[34,43],archiv:[14,42],can:[25,37,12,36,2,43,30,18,42,19,14,8,7],parquet:[9,39],purpos:[36,25],root:38,def:43,bewtween:20,dayofweek:15,prompt:12,tar:[14,42,19],give:[30,8],process:[13,12],share:37,backslash:12,indic:12,liter:25,want:[25,24,37,11,12,36,2,38,30,18,42,43,34,8],tcp:12,occur:[30,15],timestamptz:20,sundai:15,end:[35,19],goal:13,quot:12,str2:35,str1:35,mapr:18,how:[9,36,2,38,23,8],purg:30,env:42,c_address:34,instead:[43,7,12],csv:[9,43,30,12,34],simpl:[2,43],lazybinaryserd:18,blockcompresswrit:18,map:20,resourc:38,ltrim:35,clone:[42,19],after:14,resultset:37,concatnen:35,befor:38,catalog:1,date:[20,34],multipl:8,underscor:7,data:38,parallel:38,physic:38,tajo_worker_heaps:8,github:19,third:[42,15],host:[37,12,42],grant:42,element:25,inform:[12,42],preced:3,environ:[25,24,2,36,8],all
 ow:[24,36,38,30,43,8],anoth:45,worst:13,order:[12,13,36,37,38,6,42,43,20,8],help:12,shorten:42,move:14,midnight:15,vari:20,dynam:13,paramet:[35,30,3,15],joda:37,write:18,style:7,group:[6,7],cli:[37,12],html:12,persist:42,main:[13,37,7,25],lastli:42,"return":[35,15,3,7],thei:[37,7],timestamp:[20,15],col2:[11,7],col3:11,auth:37,nation:12,now:[12,14,34],utc_usec_to:15,nor:7,introduct:9,serde2:18,hive_jdbc_driver_dir:[36,42],name:[25,24,12,36,37,38,3,40,6,18,42,43,20,34,8],anyth:12,config:[25,12,36,38,40,42,8],drop:9,regexp_replac:35,instruct:[36,3,19],guava:37,easili:37,mode:42,each:[38,12,25],clsspath:37,compatibl:34,side:[37,35],mean:7,compil:[36,14,42],domain:7,replac:35,saturdai:15,procedur:14,"static":37,connect:[36,37,12,42],year:15,o_clerk:12,out:[37,8,19],unquot:12,network:10,space:35,o_orderkei:12,xxtajo:35,content:[9,30],com:3,hivemetastor:[36,42],current:[24,12,36,37,39,7,42,8],math:10,integr:42,shut:23,advanc:13,manipul:12,differ:7,free:43,standard:[13,20,7,43],asc:6,small:
 20,base:42,asf:[42,19],org:[12,36,37,30,18,42,19],"byte":[20,12,43],fridai:15,refer:[37,38,3,43,18],val:12,launch:42,could:15,filter:7,length:20,umbil:[40,38],licens:12,mvn:[36,37,14,42],first:[35,23,25,15,14,42],oper:10,rang:[9,28],resourceconfigur:8,feel:43,onc:[3,19],arrai:7,applciat:37,number:[38,12],capac:[38,8],mai:[37,25,15],o_orderd:12,set:[12,42],done:14,overwrit:[9,44],tpch:[12,34],open:13,primari:24,given:[11,12,3,30,7,43,20,35],"long":[13,15],script:[14,25],interact:[13,12,43],mkdir:[38,43],system:[37,13,2,38,25,14,8],least:7,master:[33,38,12,42],statement:[11,12,37,30,18,43,7],l_partkei:[30,11],scheme:[24,43],"final":[36,37,42,43],store:[12,13,36,18,42,43,8],listen:[37,38],mysql_jdbc_driv:[36,42],incub:[37,19],option:[34,30,12,14,15],especi:20,namespac:37,copi:[25,42],inexact:20,specifi:[24,11,12,36,37,3,18,42,20,45,8,7],part:12,rtrim:35,standalon:23,exactli:7,than:[36,11,7,23],target:[37,11,14,42],keyword:18,provid:[37,12,13,2,3,28,39,7,42,43,8,18],remov:[30,35],tree:1
 4,zero:7,charact:[20,7,35],project:13,posix:7,balanc:13,sec:[42,43],minut:[36,23],schema:[11,12,43],createdatabaseifnotexist:42,textserd:18,ran:42,respons:[13,43],argument:15,packag:[36,37,14,42],manner:37,have:[24,36,30,42,19,6],tabl:[36,34,43,12,23],need:[37,36,2,18,42,43,34],maxmind:3,predic:[9,44],featur:36,rcfile:[9,39],techniqu:13,equival:11,partsupp:12,click:19,note:[7,42],also:[18,19,7,43,12],without:13,build:[23,42],xxxxxx:42,prepar:[36,14,42,43],channel:9,begin:[12,15],c_nationkei:34,distribut:42,preliminari:33,buffer:21,filesystem:[23,19],compress:9,deflatecodec:30,most:42,microsecond:15,plan:13,pair:[18,25],metacharact:7,"class":[36,37,7,42,18],dai:15,don:[37,34,25],url:[24,37],doc:12,later:23,request:8,uri:42,ghi:43,bracket:7,snapshot:[37,14,42],determin:8,pattern:[9,35],querymast:8,think:43,show:[37,12,43,23],text:[12,3,15,35,30,18,43,20,34,7],hostnam:[24,37,38,12],syntax:45,column_refer:7,identifi:42,xml:[38,12,42],absolut:43,onli:[24,25,2,15,18,42,19],explicitli:25,r
 atio:8,just:[34,7,42,12],copyright:12,explain:[38,2,37,23],configur:12,jar:[9,36,42],enough:42,geoip_country_cod:3,templat:42,local:[23,42,3,8,19],info:40,unresolvedaddressexcept:9,variou:12,get:42,necesssari:37,express:[9,35,44],repo:[42,19],java_hom:2,"import":37,increas:38,requir:[2,15],enabl:[37,38],borrow:7,"public":37,remot:38,common:37,partit:9,contain:[37,30],where:[9,2,38,7,43,20,8],bytea:20,legaci:3,startup:12,memstor:42,c_custkei:34,see:[2,38,42,8,43],num:12,float8:[20,11,12,34],result:[37,43],arg:37,float4:20,close:37,rlike:7,o_orderstatu:12,concern:[34,25],slf4j:37,kei:[9,12],correctli:20,netti:37,ieee:20,char_length:35,state:43,score:[18,43],subdir:11,between:[37,7],"\u02c6ab":35,l_orderkei:[30,11],neither:7,approach:34,across:38,altern:7,smallint:20,accord:20,parent:14,hadoop_hom:[38,2,37],stmt:37,outfil:34,disconnect:12,addit:[37,11,7,8],both:[13,18,35],last:6,delimit:[30,35,43],fault:13,region:12,gzipcodec:30,c_comment:34,groupbi:9,against:23,eta:37,logic:7,mani:[37
 ,8],whole:8,load:[13,43],simpli:14,overview:9,"4028235e":20,split_part:35,rpc:[40,37,38],written:18,toler:13,respect:[20,7],tuesdai:15,guid:12,assum:[8,43],quit:12,xzvf:[14,19],o_com:12,three:[18,42],much:8,newinst:37,basic:34,c_mktsegment:34,pdist:[36,37,14,42],derbystor:42,o_custkei:12,xxx:12,worker:[33,38],search:9,ani:[37,7],tajojdbccli:37,work1:12,repetit:7,present:23,emploi:13,bye:43,look:19,implementaion:24,plain:18,servic:[40,42],properti:[24,25,36,38,3,40,42,8],batch:13,durat:42,cast:9,"while":37,abov:[18,43],error:30,textserializerdeseri:18,table_nam:[30,34],layer:24,file:[9,24,12,25,36,2,38,3,14,42,19,34,8,43],ctrl:43,key1:12,site:[38,12,42],cluster:[33,42],itself:12,warehous:[13,30],conf:[25,24,37,12,36,2,3,42,8],march:15,abcdef:35,conn:37,"null":[37,6],develop:19,"35b":43,minim:9,suggest:19,classpath:9,hadoop:[37,13,2,38,30,7,25,43,21,23,18],same:[36,15,3,7],binari:[20,18,42,14],instanc:[24,2,38,23],split:35,document:[12,19],log4j:37,log4j12:37,higher:21,week:15,varbit:
 20,lineitem:[30,11,12],http:[40,19,3,42,12],nio:9,optim:13,driver:[9,36,42],geoip_in_countri:3,decompress:19,jkl:43,column_nam:30,retriv:37,dskiptest:[36,37,14,42],user:[24,43,12,8,42],tajo_classpath:42,countri:3,extern:[30,34,43],col1:[11,7],tsql:34,chang:[37,38,12,42,19],built:37,lower:35,appropri:[25,43],lib:42,entri:30,well:[30,11,20],parenthes:7,know:[38,43],exampl:34,command:34,thi:[25,36,37,38,14,15,18,42,19,43,34,23,8,7],choos:[14,19],model:[9,44],usual:[37,25],explan:43,protocol:21,execut:[38,12,8,23],when:[30,37,11],dtar:[36,37,14,42],rest:19,q_1363768615503_0001_000001:43,mysql:[36,42],simultan:8,database_nam:[30,12],languag:[13,9,12,43],inet4:[20,3],web:13,binaryserializerdeseri:18,all_backup:34,easi:34,field:35,disk1:8,disk2:8,disk3:8,except:37,littl:38,add:[2,25,36,37,38,30,42,8],other:[13,11,25],blob:20,exercis:23,primit:20,input:35,approxim:20,match:[9,35,20],take:[36,23,19],bin:[37,12,36,2,38,42,43,34],varchar:20,which:[12,13,37,30,18,42,43],format:[9,31],read:[38,1
 8,19],bzip2codec:30,workload:8,phcatalog:[36,42],chmod:38,lineitem_100_snappi:30,bit:20,password:42,desc:6,insert:[9,44],storage_typ:30,table1:[37,7,43,18],resid:43,like:[24,37,14],specif:[20,24,11],substr:35,should:[24,12,36,37,38,3,23,30,42,43,34,8],reoptim:13,integ:[20,8],server:[25,42],query_succeed:43,collect:18,api:[9,37],necessari:[13,37],singl:7,output:[11,18],timetz:20,tajo:34,page:[9,38],createstat:37,"1st":15,some:[36,37,25,43,20,14],tajowork:[25,8],certain:11,intern:25,"export":[36,2,42],mirror:19,proper:[24,14,8],home:[36,2,34,42,43],tmp:[24,43],separ:18,scale:13,lead:35,"512mb":8,avoid:[13,30],exec:18,definit:[9,44],per:[12,8],exit:43,select:[9,11,35,37,3,15,7,43],bigint:[20,30],ddl:[34,43],condit:6,fornam:37,localhost:[40,30,12,42],either:[20,36,7,19],machin:[12,42],core:8,previou:7,run:[38,12],power:7,uncompress:18,ab_bc_cd:35,usag:34,hive_hom:[36,42],maven:[36,37],step:14,character_length:35,repositori:19,peer:40,java:[9,25,36,2,42,21],unicod:20,stage:43,comparison:
 [20,7],postgresql:7,column:9,varbinari:20,manag:[9,36],degre:38,regular:35,"1gb":[24,8],block:18,client:[9,40,38,12],"float":[20,18,8,43],lazybinari:18,due:8,down:[7,23],defaultport:42,storag:[24,37,18,42,34],your:[24,25,36,37,34,42,19,23,14],jdbc:[9,36,42],log:37,wai:34,support:[24,34,42],submit:[23,43],custom:[12,42,34],avail:[37,8],start:[38,35],current_databas:12,arithmet:9,includ:[37,8],unix_timestamp:15,ipv4:[20,3],"function":12,snappycodec:[30,18],untar:19,tajoxx:35,l_comment:30,int4:[20,11,12,35],link:19,select_stat:30,int2:20,"true":[3,18,42,20,8,7],int8:[20,11,12,34],count:35,"throw":37,consist:[18,25],possibl:8,whether:7,day_of_week:15,record:[7,18],below:[37,23,34],limit:8,otherwis:[12,3,7],executequeri:37,trim:35,binaryserd:18,creat:[9,12,38,23,42,43,34],btrim:35,"int":[20,43,35,18,15],sequecnefil:18,forgot:37,repres:15,"char":20,exist:[30,11,12],o_shipprior:12,guarante:37,download:[3,19],doe:[37,30],check:19,denot:7,percent:7,detail:[38,42,43],getstr:37,"default":[33,3
 8,12,42],codec:[30,18],bool:20,rdbm:37,varieti:25,test:[37,12,42],you:[25,24,37,11,12,36,2,38,3,14,15,30,7,42,19,43,34,23,8,18],node:[37,38],clean:[36,37,14,42],sequenc:7,insensit:7,tajo_dump:34,protobuf:37,tajomast:[25,38,12,42],hurt:13,lang:37,algorithm:8,function_nam:45,hyunsik:[12,34],regexp:7,descript:[12,40,7,42,43,20,8],tajo_master_heaps:24,tinyint:20,time:[12,13,37,7,43,20],backward:8,cpu:8,unset:12,tajodriv:37},objtypes:{"0":"py:function"},objnames:{"0":["py","function","Python function"]},filenames:["partitioning/hash_partitioning","backup_and_restore","getting_started/local_setup","functions/network_func_and_operators","partitioning/range_partitioning","table_management/csv","sql_language/queries","sql_language/predicates","configuration/worker_configuration","index","functions","sql_language/insert","cli","introduction","getting_started/building","functions/datetime_func_and_operators","table_management/rcfile","partitioning/column_partitioning","table_management/sequenc
 efile","getting_started/downloading_source","sql_language/data_model","getting_started/prerequisites","functions/math_func_and_operators","getting_started","configuration/tajo_master_configuration","configuration/preliminary","table_management/compression","faq","partitioning/intro_to_partitioning","table_partitioning","sql_language/ddl","table_management","table_management/parquet","configuration","backup_and_restore/catalog","functions/string_func_and_operators","hcatalog_integration","jdbc_driver","configuration/cluster_setup","table_management/file_formats","configuration/configuration_defaults","tajo_client_api","configuration/catalog_configuration","getting_started/first_query","sql_language","sql_language/sql_expression"],titles:["Hash Partitioning","Backup and Restore","Setting up a local Tajo cluster","Network Functions and Operators","Range Partitioning","CSV","Queries","Predicates","Worker Configuration","Apache Tajo\u2122 0.8.0 (dev) - User documentation","Functions","IN
 SERT (OVERWRITE) INTO","Tajo Shell (TSQL)","Introduction","Build source code","DateTime Functions and Operators","RCFIle","Column Partitioning","SequenceFile","Dowload and unpack the source code of Apache Tajo","Data Model","Prerequisites","Math Functions and Operators","Getting Started","Tajo Master Configuration","Preliminary","Compression","FAQ","Introduction to Partitioning","Table Partitioning","Data Definition Language","Table Management","Parquet","Configuration","Backup and Restore Catalog","String Functions and Operators","HCatalog Integration","Tajo JDBC Driver","Cluster Setup","File Formats","Configuration Defaults","Tajo Client API","Catalog Configuration","First query execution","SQL Language","SQL Expressions"],objects:{"":{trim:[35,0,1,""],upper:[35,0,1,""],lower:[35,0,1,""],ltrim:[35,0,1,""],btrim:[35,0,1,""],regexp_replace:[35,0,1,""],split_part:[35,0,1,""],char_length:[35,0,1,""],geoip_country_code:[3,0,1,""],rtrim:[35,0,1,""],utc_usec_to:[15,0,1,""],geoip_in_count
 ry:[3,0,1,""]}},titleterms:{code:[14,19],execut:43,queri:[6,43],session:12,permiss:38,hcatalogstor:42,ilik:7,languag:[44,30],xml:25,configur:[40,24,33,42,8],jar:37,overview:6,local:2,match:7,real:20,sourc:[14,19],synopsi:12,string:[7,45,35],format:39,faq:[27,37],insert:11,introduct:[13,28],mysqlstor:42,document:9,like:7,level:34,drop:30,list:[37,6],mode:[38,8],partit:[0,29,17,28,4],each:8,fulli:38,where:6,manag:31,set:[38,2,37],hcatalog:36,dump:34,dowload:19,meta:12,run:8,todo:[0,22,26,27,4,17,41,34],distribut:38,variabl:12,network:3,cast:45,databas:[30,34],definit:30,method:28,math:22,integr:36,hash:0,kei:28,driver:37,usag:12,serializerdeseri:18,base:38,prerequisit:[21,3],releas:19,java:37,rcfile:16,valu:20,launch:38,via:19,tajo:[9,24,37,12,2,38,40,41,25,19],groupbi:6,datetim:15,backup:[1,34],first:43,oper:[22,35,3,15],rang:4,writer:18,apach:[9,19],tajomast:24,number:[20,8],api:41,select:6,size:[24,8],git:19,from:6,memori:[24,8],tsql:12,support:3,doubl:20,overwrit:11,start:23,call:
 45,basic:12,master:[24,40],type:[20,45],"function":[22,10,35,3,15,45],shell:12,claus:6,worker:[40,8],rootdir:24,appendix:37,heap:[24,8],jdbc:37,unpack:19,restor:[1,34],"default":40,setup:38,maximum:8,dev:9,column:17,serd:18,introduc:18,parquet:32,similar:7,creat:30,site:25,indic:9,cluster:[2,38],unresolvedaddressexcept:37,file:[37,39],tabl:[9,29,30,31],dedic:8,minim:37,make:38,get:[37,23,19],classpath:37,how:37,exampl:[37,8],build:14,env:25,csv:5,channel:37,express:[7,45],resourc:8,nio:37,preliminari:25,enter:12,compress:[26,30],catalog:[25,34,42],regular:7,temporari:8,user:9,sql:[44,34,45],data:[20,30,8],parallel:8,task:8,directori:[38,8],predic:7,arithmet:45,client:[37,41],command:12,pattern:7,model:20,sequencefil:18,latest:19}})
\ No newline at end of file
+Search.setIndex({envversion:42,terms:{represent:[32,15],all:[37,12,36,2,38,3,6,7,42,43,34],code:[3,42,23],dist:[37,14,42],c_name:34,rdbm:37,queri:[42,12,8,23],consum:32,function_nam:45,month:15,four:39,scalar:32,concept:37,per:[38,12,8],mno:43,follow:[3,5,7,8,11,12,34,16,17,18,19,20,24,42,28,30,32,14,36,37,38,39,25,43],disk:[3,8],row:[12,32,5,16,7,42,43,18],profil:36,privat:37,depend:[36,7],c_acctbal:34,zone:20,getconnect:37,decim:20,init:43,program:[25,32],drivermanag:37,sql_languag:12,backup_and_restor:12,"case":[25,35,37,16,7,42,18],digit:20,sourc:[12,42,23],string:[10,3,25,15],geoip_country_cod:3,fals:[20,32,3,7],"void":37,csvfile:[43,30,5],volum:[12,43],veri:[2,38,7],affect:[42,8],hdfs_locat:16,exact:20,dbname:12,tri:37,recordcompresswrit:18,list:[9,43,38,12,19],hive:[9,36,42],"try":[37,12],item:7,adjust:[16,24,32,8,5],form:[16,24],progress:13,geoip:3,println:37,tajo_src:14,locat:[11,3,16,30,43,34,8],tajo_hom:[25,37,12,2,38,42,43],dir:[38,11],pleas:[37,2,38,3,5,16,43,32,34,8],u
 pper:35,templet:25,smaller:11,ten:[36,23],shut:23,annot:37,enjoi:43,dowload:23,second:[35,42],cost:13,design:[13,17,25,32],val3:12,val2:[7,12],val1:[7,12],port:[24,37,38,12,42],tmpdir:8,compat:[9,8],index:9,abc:[7,43],addr:[3,42],compar:20,cast:9,section:[36,37,38,5,16,23,8],xtajoxx:35,access:[36,37,38,32],c_phone:34,"19th":15,version:[12,42],directori:42,supplier:12,"new":12,method:[9,34],metadata:16,elimin:30,hash:9,valn:7,gener:[12,5],o_totalpric:12,here:[2,7,43,18],let:18,address:[20,40,38,3,42],path:[24,12,36,37,3,30,42,43,34],o_orderprior:12,strong:34,sinc:16,valu:[25,24,12,36,38,3,42,8],convert:[3,45],aaabc:7,deterior:32,derbi:42,shift:15,larger:[20,32],table_backup:34,precis:20,datetim:10,codebas:19,implement:[2,12],commonli:5,sequencefil:[9,39,16],host2:38,via:[38,23],regardless:[16,32],data_typ:30,apach:[23,3,42,12],modul:[9,25],sign:7,filenam:12,unix:15,"boolean":[20,3,32],namenode_hostnam:24,wip:[42,19],instal:[3,14],total:8,unit:[42,32],from:[9,11,35,37,15,30,7,19,34,43
 ],describ:[36,12],would:[36,14],memori:42,distinct:6,two:[2,28,16,7,25,34,45,18],next:[36,37],connector:[36,42],few:25,call:9,usr:42,compressiontyp:18,type:[9,12,3,15,25,43,34,8,35],until:12,more:[36,32,38,7,42,43,23,8],flat:[16,18,32],discompress:14,l_quantiti:11,hiveql:36,claus:[9,43,17,7,20],tbname:12,trail:35,rootdir:38,appendix:9,name2:12,name3:12,name1:12,known:17,git:42,editor:38,unpack:23,must:[20,24,37,42,32],word:[16,5],hour:15,alia:[20,6,35,15],setup:33,work:[20,12,32],uniqu:12,dev:3,cat:[38,34,43],other:[13,16,11,25,5],archiv:[14,42],can:[25,37,12,36,2,38,43,5,16,17,7,42,19,30,32,14,8,18],parquet:[9,39],purpos:[36,25],root:38,def:43,bewtween:20,dayofweek:15,prompt:12,tab:5,tar:[14,42,19],give:[30,8],process:[12,13,5,16,17,32],share:[16,37],backslash:12,indic:12,tabular:5,liter:[25,5],want:[25,24,37,11,12,36,2,38,17,18,42,43,30,34,8],tcp:12,occur:[16,30,15],timestamptz:20,sundai:15,end:[35,19],goal:13,quot:12,str2:35,str1:35,hdf:[24,12,13,37,38,30,43,23],mapr:18,how:[9,36
 ,2,38,23,8],purg:30,env:42,c_address:34,instead:[12,43,7,5],csv:[9,43,30,12,34],simpl:[2,43],lazybinaryserd:18,blockcompresswrit:18,map:20,recogn:5,ltrim:35,clone:[42,19],after:14,resultset:37,concatnen:35,befor:38,catalog:1,mai:[37,25,15],multipl:[8,5],underscor:7,data:38,parallel:38,physic:38,tajo_worker_heaps:8,"short":16,third:[42,15],host:[37,38,12,42],grant:42,favorit:38,correspond:16,element:25,issu:9,inform:[12,42],preced:3,environ:[25,24,2,36,8],allow:[24,36,32,38,5,16,30,43,8],anoth:45,worst:13,order:[12,13,36,37,38,5,16,6,42,20,17,32,8,43],oper:10,help:12,shorten:42,move:14,midnight:15,becaus:5,comma:5,vari:20,dynam:13,paramet:[35,3,15,16,30,32,5],joda:37,write:[16,32,18,5],style:7,group:[6,7,32],cli:[37,12],html:12,persist:42,main:[13,37,7,25],non:32,lastli:42,"return":[35,15,7,3,5],thei:[16,37,7],timestamp:[20,15],col2:[11,7],col3:11,auth:37,nation:12,framework:32,lzo:32,item_nam:17,binaryserializerdeseri:[16,18],utc_usec_to:15,nor:7,introduct:9,choic:32,hive_jdbc_drive
 r_dir:[36,42],name:[25,24,12,36,37,38,3,5,16,40,6,18,42,43,20,32,34,8],anyth:12,config:[25,12,36,38,40,42,8],drop:9,regexp_replac:35,o_orderd:12,guava:37,easili:[37,38],mode:42,each:[38,12,25],clsspath:37,compatibl:34,side:[37,35],mean:7,compil:[36,14,42],domain:[38,7,5],due:8,replac:35,saturdai:15,procedur:14,"static":37,connect:[36,37,12,42],year:15,resourc:38,o_clerk:12,out:[37,8,19],unquot:12,network:10,space:35,develop:19,o_orderkei:12,xxtajo:35,content:[9,30],hivemetastor:[36,42],current:[24,12,36,37,28,39,7,42,32,8],math:10,integr:42,qualifi:[16,5],advanc:13,manipul:12,given:[11,12,3,30,7,43,20,35],free:43,standard:[13,20,7,43],asc:6,small:[20,32],base:42,asf:[42,19],dictionari:32,org:[12,36,37,5,16,30,18,42,19],"byte":[20,32,12,43],fridai:15,card:17,care:5,val:12,launch:42,guarante:37,could:15,omit:16,retriv:37,filter:7,length:20,rcfile:[9,39],umbil:[40,38],interact:[13,12,43],mvn:[36,37,14,42],first:[35,23,25,15,14,42],feed:5,rang:[9,28],directli:[17,5],resourceconfigur:8,f
 eel:43,onc:[3,19],arrai:7,independ:32,applciat:37,number:[38,12],capac:[38,8],o_com:12,instruct:[36,3,19],set:[12,42],done:14,submit:[23,43],tpch:[12,34],open:13,primari:24,differ:7,"long":[13,15],script:[14,25],licens:12,mkdir:[38,43],system:[37,13,2,38,16,25,14,8],least:7,master:[33,38,12,42],too:32,statement:[11,12,37,5,16,30,7,43,32,18],l_partkei:[30,11],scheme:[24,43],"final":[36,37,42,43],store:[12,13,36,5,16,18,42,43,8],listen:[37,38],mysql_jdbc_driv:[36,42],incub:[37,19],option:[34,30,12,14,15],especi:20,namespac:37,copi:[25,42],inexact:20,specifi:[24,11,12,36,37,3,5,16,7,42,32,20,45,8,18],github:19,mostli:16,rtrim:35,standalon:23,exactli:7,than:[36,11,7,23],target:[37,11,14,42],keyword:18,provid:[37,12,13,2,3,28,5,16,7,42,43,32,39,8,18],remov:[30,35],tree:14,zero:7,charact:[35,5,16,17,7,20],project:[13,32],serd:[16,5],posix:7,balanc:13,entri:[38,30],sec:[42,43],minut:[36,23],schema:[17,32,11,12,43],createdatabaseifnotexist:42,textserd:18,ran:42,respons:[13,43],argument:15,p
 ackag:[36,37,14,42],manner:37,have:[24,36,30,42,19,6],further:5,need:[37,36,2,5,16,18,42,43,32,34],maxmind:3,predic:[9,17,44],featur:[16,36,32,5],date:[20,34],techniqu:13,equival:11,partsupp:12,click:19,note:[7,42],also:[12,43,5,17,18,19,7],without:[13,17,5],build:[23,42],xxxxxx:42,prepar:[36,14,42,43],singl:[38,7,32],begin:[12,15],c_nationkei:34,distribut:42,preliminari:33,buffer:[21,32],price:17,filesystem:[23,19],compress:9,deflatecodec:30,most:[42,5],microsecond:15,plan:[13,17,5],arithmet:9,phase:17,metacharact:7,"class":[36,37,5,16,7,42,18],dai:15,don:[37,34,25],url:[24,37],doc:12,later:23,request:8,uri:42,ghi:43,part:12,snapshot:[37,14,42],"1gb":[24,8],determin:8,pattern:[9,35],querymast:8,think:43,show:[37,12,43,23],text:[12,35,43,38,3,15,16,17,7,20,30,32,5,34,18],hostnam:[24,37,38,12],syntax:45,column_refer:7,identifi:42,ship_dat:17,xml:[38,12,42],absolut:43,onli:[24,25,2,28,15,16,18,42,19,32,5],explicitli:25,ratio:8,just:[34,7,42,12],copyright:12,explain:[37,2,38,5,16,23],c
 onfigur:12,jar:[9,36,42],enough:42,haven:17,score:[16,32,43,18,5],templat:42,local:[23,42,3,8,19],info:40,unresolvedaddressexcept:9,variou:12,get:42,necesssari:37,familiar:[16,32,5],express:[9,35,44],repo:[42,19],cannot:5,java_hom:2,"import":37,increas:38,requir:[2,15],prune:9,bar:5,enabl:[16,32,37,38,5],termin:5,borrow:7,"public":37,remot:38,common:37,partit:9,contain:[38,37,30],where:[9,2,38,17,7,43,20,8],bytea:20,legaci:3,knowledg:5,startup:12,memstor:42,c_custkei:34,see:[2,38,42,8,43],num:12,float8:[20,11,12,34],result:[32,37,43],arg:37,float4:20,close:37,rlike:7,unqualifi:17,o_orderstatu:12,concern:[34,25],slf4j:37,kei:[12,28,16,30,18,17],correctli:20,netti:37,ieee:20,char_length:35,state:43,smallest:32,subdir:11,between:[37,7],"\u02c6ab":35,l_orderkei:[30,11],neither:7,approach:34,across:38,altern:7,smallint:20,accord:20,parent:14,hadoop_hom:[38,2,37],snappi:32,stmt:37,outfil:34,disconnect:12,addit:[37,11,7,8],both:[13,16,35,18,5],last:6,delimit:[43,30,35,5],fault:13,region:12
 ,gzipcodec:30,c_comment:34,groupbi:9,against:23,eta:37,compressioncodec:[16,5],logic:7,improv:32,whole:8,load:[13,43],simpli:14,point:16,overview:9,"4028235e":20,split_part:35,header:16,rpc:[40,37,38],written:[16,18],toler:13,respect:[20,7],tuesdai:15,guid:12,assum:[17,8,43],quit:12,xzvf:[14,19],vertic:5,java:[9,25,36,2,42,21],compos:[17,32],empti:[16,5],json:5,much:8,besid:5,newinst:37,basic:34,columnar:[16,32],c_mktsegment:34,pdist:[36,37,14,42],derbystor:42,o_custkei:12,host1:38,xxx:12,worker:33,search:9,ani:[32,37,7,5],tajojdbccli:37,work1:12,repetit:7,countri:3,those:[16,32,5],emploi:13,bye:43,look:19,implementaion:24,plain:[16,18],servic:[40,42],properti:[24,25,36,38,3,40,42,8],batch:13,durat:42,defin:5,"while":37,abov:[16,18,43],error:30,wild:17,howev:32,lazybinarycolumnarserd:16,textserializerdeseri:[16,18,5],table_nam:[30,34],layer:24,file:[9,24,12,25,36,2,38,3,14,42,19,34,8,43],advantag:32,ctrl:[38,43],key1:12,tabl:[36,34,43,12,23],site:[38,12,42],cluster:[33,42],itself:12
 ,warehous:[13,30],textfil:9,conf:[25,24,37,12,36,2,38,3,42,8],march:15,abcdef:35,conn:37,"null":[37,6],present:23,"35b":43,minim:9,suggest:19,format:[9,31,5],classpath:9,hadoop:[37,13,2,38,5,16,30,7,25,43,32,21,23,18],same:[36,15,3,7],binari:[16,14,18,42,20],instanc:[24,2,38,23],split:35,document:[12,19],log4j:37,log4j12:37,higher:21,week:15,varbit:20,lineitem:[30,11,12],http:[40,19,3,42,12],nio:9,optim:13,nest:32,driver:[9,36,42],geoip_in_countri:3,decompress:[32,19],jkl:43,column_nam:30,moment:[32,5],dskiptest:[36,37,14,42],user:[24,43,12,8,42],tajo_classpath:42,mani:[16,37,8],extern:[30,16,17,34,43],col1:[11,7],tsql:34,chang:[37,38,12,42,19],built:[16,37],lower:35,appropri:[25,43],lib:42,mention:16,com:[38,3],well:[30,11,20],parenthes:7,inherit:[16,5],exampl:[38,34],command:34,thi:[32,25,36,37,38,14,15,18,42,19,43,34,5,23,8,7],choos:[14,19],gzip:32,model:[9,44],usual:[37,25,5],explan:43,protocol:21,execut:[38,12,8,23],when:[30,32,37,11,5],dtar:[36,37,14,42],rest:19,detail:[32,38,
 42,43],q_1363768615503_0001_000001:43,mysql:[36,42],simultan:8,yet:17,languag:[13,9,12,43],inet4:[20,3],web:13,now:[12,14,16,32,5,34],all_backup:34,easi:[34,28],field:[35,5],disk1:8,disk2:8,disk3:8,except:37,littl:38,add:[2,25,36,37,38,30,42,8],codec:[16,30,18,5],blob:20,exercis:23,primit:20,input:35,approxim:20,match:[9,35,20],take:[36,23,19],bin:[37,12,36,2,38,42,43,34],varchar:20,which:[12,13,37,16,30,18,42,43],serde2:[16,18],read:[32,19,38,5],bzip2codec:30,workload:8,phcatalog:[36,42],chmod:38,lineitem_100_snappi:30,insert:[9,44],bit:20,password:42,desc:6,u0001:5,storage_typ:30,table1:[37,5,16,7,43,32,18],resid:43,like:[24,37,17,14],specif:[20,24,11],substr:35,should:[24,32,12,36,37,38,3,5,16,17,42,43,30,34,23,8],reoptim:13,manual:16,integ:[20,8],server:[25,42],query_succeed:43,collect:18,benefit:28,api:[9,37],necessari:[13,37],either:[20,36,32,7,19],output:[11,18],timetz:20,tajo:34,page:[9,38,32],versa:16,order_d:17,createstat:37,"1st":15,some:[36,37,5,16,25,43,20,32,14],tajowo
 rk:[25,8],certain:[16,32,11,5],intern:25,previou:7,"export":[36,2,42],mirror:19,proper:[24,14,8],home:[36,2,34,42,43],tmp:[24,43],separ:[18,5],scale:13,lead:[17,35],channel:9,"512mb":8,avoid:[13,30],exec:18,definit:[9,44],best:5,tracker:38,exit:43,select:[9,11,35,37,3,15,7,43],bigint:[20,30],ddl:[34,43],condit:6,fornam:37,localhost:[38,40,30,12,42],refer:[37,38,3,5,16,18,43,32],machin:[12,42],core:8,who:5,run:[38,12],power:7,uncompress:[18,32],ab_bc_cd:35,usag:34,hive_hom:[36,42],maven:[36,37],broken:5,step:14,character_length:35,repositori:19,peer:40,regexp:7,unicod:20,stage:43,comparison:[20,7],postgresql:7,column:9,varbinari:20,manag:[9,36,28],degre:38,regular:35,disabl:[16,32,5],block:[18,32],columnarserd:16,client:[9,40,38,12],own:5,effici:32,"float":[32,5,16,17,18,43,20,8],encod:32,lazybinari:18,three:[18,42],down:[7,23],pair:[16,18,25],storag:[24,37,5,16,18,42,32,34],your:[24,25,36,37,38,14,23,16,42,19,32,5,34],jdbc:[9,36,42],log:37,wai:[34,5],support:[24,34,42],overwrit:[9,4
 4],custom:[12,42,34],avail:[37,8,32],start:[38,35],current_databas:12,interfac:5,includ:[37,8],unix_timestamp:15,ipv4:[20,3],"function":12,snappycodec:[16,30,18,5],untar:19,tajoxx:35,l_comment:30,int4:[20,11,12,35],link:19,select_stat:30,int2:20,line:[38,5],"true":[3,18,42,32,20,8,7],int8:[20,11,12,34],count:35,"throw":37,consist:[16,18,25,5],possibl:8,whether:[16,7],ecosystem:32,day_of_week:15,record:[16,32,7,18,5],below:[37,23,16,32,5,34],limit:[8,32],otherwis:[12,3,7],executequeri:37,defaultport:42,trim:35,binaryserd:18,creat:[9,12,38,23,42,43,34],btrim:35,"int":[35,32,15,16,17,18,43,20,5],sequecnefil:18,dure:17,forgot:37,repres:[15,5],"char":20,exist:[30,16,11,12],o_shipprior:12,our:5,download:[3,19],doe:[32,37,30,5],check:19,denot:7,know:[38,43],percent:7,tinyint:20,getstr:37,"default":[33,38,12,42],bracket:7,bool:20,special:5,varieti:25,test:[37,12,42],you:[2,3,5,7,15,8,11,12,34,16,17,18,19,23,24,42,30,32,14,36,37,38,25,43],node:[37,38],relat:32,database_nam:[30,12],clean:[36,
 37,14,42],customserializerdeseri:5,sequenc:7,insensit:7,tajo_dump:34,protobuf:37,tajomast:[25,38,12,42],hurt:13,lang:37,algorithm:[16,32,8,5],vice:16,hyunsik:[12,34],descript:[12,40,7,42,43,20,8],tajo_master_heaps:24,carriag:5,time:[12,13,37,7,43,20],backward:8,cpu:8,unset:12,tajodriv:37},objtypes:{"0":"py:function"},objnames:{"0":["py","function","Python function"]},filenames:["partitioning/hash_partitioning","backup_and_restore","getting_started/local_setup","functions/network_func_and_operators","partitioning/range_partitioning","table_management/csv","sql_language/queries","sql_language/predicates","configuration/worker_configuration","index","functions","sql_language/insert","cli","introduction","getting_started/building","functions/datetime_func_and_operators","table_management/rcfile","partitioning/column_partitioning","table_management/sequencefile","getting_started/downloading_source","sql_language/data_model","getting_started/prerequisites","functions/math_func_and_operato
 rs","getting_started","configuration/tajo_master_configuration","configuration/preliminary","table_management/compression","faq","partitioning/intro_to_partitioning","table_partitioning","sql_language/ddl","table_management","table_management/parquet","configuration","backup_and_restore/catalog","functions/string_func_and_operators","hcatalog_integration","jdbc_driver","configuration/cluster_setup","table_management/file_formats","configuration/configuration_defaults","tajo_client_api","configuration/catalog_configuration","getting_started/first_query","sql_language","sql_language/sql_expression"],titles:["Hash Partitioning","Backup and Restore","Setting up a local Tajo cluster","Network Functions and Operators","Range Partitioning","CSV (TextFile)","Queries","Predicates","Worker Configuration","Apache Tajo\u2122 0.8.0 (dev) - User documentation","Functions","INSERT (OVERWRITE) INTO","Tajo Shell (TSQL)","Introduction","Build source code","DateTime Functions and Operators","RCFile","
 Column Partitioning","SequenceFile","Dowload and unpack the source code of Apache Tajo","Data Model","Prerequisites","Math Functions and Operators","Getting Started","Tajo Master Configuration","Preliminary","Compression","FAQ","Introduction to Partitioning","Table Partitioning","Data Definition Language","Table Management","Parquet","Configuration","Backup and Restore Catalog","String Functions and Operators","HCatalog Integration","Tajo JDBC Driver","Cluster Setup","File Formats","Configuration Defaults","Tajo Client API","Catalog Configuration","First query execution","SQL Language","SQL Expressions"],objects:{"":{trim:[35,0,1,""],upper:[35,0,1,""],lower:[35,0,1,""],ltrim:[35,0,1,""],btrim:[35,0,1,""],regexp_replace:[35,0,1,""],split_part:[35,0,1,""],char_length:[35,0,1,""],geoip_country_code:[3,0,1,""],rtrim:[35,0,1,""],utc_usec_to:[15,0,1,""],geoip_in_country:[3,0,1,""]}},titleterms:{code:[14,19],execut:43,queri:[6,43],session:12,permiss:38,hcatalogstor:42,ilik:7,languag:[44,30
 ],xml:25,writer:18,jar:37,overview:6,local:2,match:7,real:20,sourc:[14,19],synopsi:12,string:[7,45,35],format:39,handl:5,faq:[27,37],insert:11,introduct:[13,28],mysqlstor:42,document:9,like:7,level:34,drop:30,list:[37,6],hive:[16,32,17,5],mode:[38,8],partit:[0,29,17,28,4],each:8,fulli:38,where:6,manag:31,set:[38,2,37],hcatalog:36,dump:34,dowload:19,meta:12,run:8,todo:[0,22,26,27,4,41,34],distribut:38,variabl:12,network:3,cast:45,databas:[30,34],preliminari:25,definit:30,directori:[38,8],method:28,math:22,integr:36,hash:0,driver:37,gener:16,usag:12,serializerdeseri:18,base:38,prerequisit:[21,3],releas:19,java:37,rcfile:16,valu:[20,5],launch:38,via:19,tajo:[9,24,37,12,2,38,16,40,41,25,19],groupbi:6,datetim:15,backup:[1,34],first:43,oper:[22,35,3,15],rang:4,configur:[40,24,33,42,8],apach:[9,19,5,16,17,32],tajomast:24,number:[20,8],api:41,select:6,size:[24,8],git:19,from:6,memori:[24,8],tsql:12,support:3,doubl:20,overwrit:11,custom:5,start:23,call:45,basic:12,master:[24,40],type:[20,45]
 ,"function":[22,10,35,3,15,45],shell:12,claus:6,worker:[40,38,8],rootdir:24,appendix:37,heap:[24,8],jdbc:37,unpack:19,prune:17,restor:[1,34],"default":40,setup:38,properti:[16,32,5],maximum:8,dev:9,column:17,serd:18,introduc:18,parquet:32,similar:7,creat:[30,16,32,17,5],site:25,indic:9,cluster:[2,38],unresolvedaddressexcept:37,textfil:5,file:[37,39],tabl:[9,5,16,29,17,31,32,30],dedic:8,"null":5,serial:[16,5],issu:[16,32,17,5],minim:37,make:38,get:[37,23,19],when:16,how:[16,32,37,17,5],exampl:[37,8],build:14,env:25,csv:5,channel:37,compat:[16,32,17,5],express:[7,45],resourc:8,nio:37,classpath:37,enter:12,compress:[26,30],catalog:[25,34,42],regular:7,temporari:8,user:9,sql:[44,34,45],data:[20,30,8],parallel:8,physic:[16,32,5],task:8,read:16,predic:7,arithmet:45,client:[37,41],command:12,pattern:7,model:20,sequencefil:18,latest:19}})
\ No newline at end of file