Posted to commits@druid.apache.org by jo...@apache.org on 2020/10/09 06:36:19 UTC

[druid-website-src] branch 20rc2_update created (now bd206f9)

This is an automated email from the ASF dual-hosted git repository.

jonwei pushed a change to branch 20rc2_update
in repository https://gitbox.apache.org/repos/asf/druid-website-src.git.


      at bd206f9  0.20.0-rc2 updates

This branch includes the following new commits:

     new bd206f9  0.20.0-rc2 updates

The 1 revision listed above as "new" is entirely new to this
repository and will be described in a separate email.  Revisions
listed as "add" were already present in the repository and have only
been added to this reference.



---------------------------------------------------------------------
To unsubscribe, e-mail: commits-unsubscribe@druid.apache.org
For additional commands, e-mail: commits-help@druid.apache.org


[druid-website-src] 01/01: 0.20.0-rc2 updates

Posted by jo...@apache.org.
This is an automated email from the ASF dual-hosted git repository.

jonwei pushed a commit to branch 20rc2_update
in repository https://gitbox.apache.org/repos/asf/druid-website-src.git

commit bd206f9f00961d660fc43f6b38d98b6c9dbd261e
Author: jon-wei <jo...@imply.io>
AuthorDate: Thu Oct 8 23:36:01 2020 -0700

    0.20.0-rc2 updates
---
 docs/0.20.0/development/extensions-core/avro.html | 13 ++++++++++++-
 docs/0.20.0/ingestion/data-formats.html           |  9 +++++++++
 docs/latest/development/extensions-core/avro.html | 13 ++++++++++++-
 docs/latest/ingestion/data-formats.html           |  9 +++++++++
 4 files changed, 42 insertions(+), 2 deletions(-)

diff --git a/docs/0.20.0/development/extensions-core/avro.html b/docs/0.20.0/development/extensions-core/avro.html
index b12a1e7..38a808e 100644
--- a/docs/0.20.0/development/extensions-core/avro.html
+++ b/docs/0.20.0/development/extensions-core/avro.html
@@ -82,8 +82,19 @@
 two Avro Parsers for stream ingestion and Hadoop batch ingestion.
 See <a href="/docs/0.20.0/ingestion/data-formats.html#avro-hadoop-parser">Avro Hadoop Parser</a> and <a href="/docs/0.20.0/ingestion/data-formats.html#avro-stream-parser">Avro Stream Parser</a>
 for more details about how to use these in an ingestion spec.</p>
+<p>Additionally, it provides an InputFormat for reading Avro OCF files when using
+<a href="/docs/0.20.0/ingestion/native-batch.html">native batch indexing</a>; see <a href="/docs/0.20.0/ingestion/data-formats.html#avro-ocf">Avro OCF</a>
+for details on how to ingest OCF files.</p>
 <p>Make sure to <a href="/docs/0.20.0/development/extensions.html#loading-extensions">include</a> <code>druid-avro-extensions</code> as an extension.</p>
-</span></div></article></div><div class="docs-prevnext"><a class="docs-prev button" href="/docs/0.20.0/development/extensions-core/approximate-histograms.html"><span class="arrow-prev">← </span><span>Approximate Histogram aggregators</span></a><a class="docs-next button" href="/docs/0.20.0/development/extensions-core/azure.html"><span>Microsoft Azure</span><span class="arrow-next"> →</span></a></div></div></div><nav class="onPageNav"><ul class="toc-headings"><li><a href="#avro-extension" [...]
+<h3><a class="anchor" aria-hidden="true" id="avro-types"></a><a href="#avro-types" aria-hidden="true" class="hash-link"><svg class="hash-link-icon" aria-hidden="true" height="16" version="1.1" viewBox="0 0 16 16" width="16"><path fill-rule="evenodd" d="M4 9h1v1H4c-1.5 0-3-1.69-3-3.5S2.55 3 4 3h4c1.45 0 3 1.69 3 3.5 0 1.41-.91 2.72-2 3.25V8.59c.58-.45 1-1.27 1-2.09C10 5.22 8.98 4 8 4H4c-.98 0-2 1.22-2 2.5S3 9 4 9zm9-3h-1v1h1c1 0 2 1.22 2 2.5S13.98 12 13 12H9c-.98 0-2-1.22-2-2.5 0-.83.42-1 [...]
+<p>Druid supports most Avro types natively; the exceptions are detailed below.</p>
+<p><code>union</code> types that aren't of the form <code>[null, otherType]</code> aren't currently supported.</p>
+<p><code>bytes</code> and <code>fixed</code> Avro types are returned as base64-encoded strings by default. When the <code>binaryAsString</code> option is enabled on the Avro parser,
+these types are decoded as UTF-8 strings instead.</p>
+<p><code>enum</code> types are returned as a <code>string</code> containing the enum symbol.</p>
+<p><code>record</code> and <code>map</code> types representing nested data can be ingested using a <a href="/docs/0.20.0/ingestion/data-formats.html#flattenspec">flattenSpec</a> on the parser.</p>
+<p>Druid doesn't currently support Avro logical types; they are ignored, and fields are handled according to the underlying primitive type.</p>
+</span></div></article></div><div class="docs-prevnext"><a class="docs-prev button" href="/docs/0.20.0/development/extensions-core/approximate-histograms.html"><span class="arrow-prev">← </span><span>Approximate Histogram aggregators</span></a><a class="docs-next button" href="/docs/0.20.0/development/extensions-core/azure.html"><span>Microsoft Azure</span><span class="arrow-next"> →</span></a></div></div></div><nav class="onPageNav"><ul class="toc-headings"><li><a href="#avro-extension" [...]
                 document.addEventListener('keyup', function(e) {
                   if (e.target !== document.body) {
                     return;
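The <code>flattenSpec</code> and <code>binaryAsString</code> options described in the Avro types notes above can be combined in a single <code>inputFormat</code>. A minimal sketch, following the <code>avro_ocf</code> input format; the field name and JSONPath expression are hypothetical placeholders:

```json
"ioConfig": {
  "inputFormat": {
    "type": "avro_ocf",
    "flattenSpec": {
      "useFieldDiscovery": true,
      "fields": [
        {
          "type": "path",
          "name": "someRecordMember",
          "expr": "$.someRecord.subInt"
        }
      ]
    },
    "binaryAsString": false
  }
}
```

With <code>binaryAsString</code> left at <code>false</code>, any <code>bytes</code> or <code>fixed</code> fields surface as base64-encoded strings; flipping it to <code>true</code> decodes them as UTF-8 instead.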
diff --git a/docs/0.20.0/ingestion/data-formats.html b/docs/0.20.0/ingestion/data-formats.html
index 7846c68..9553c2b 100644
--- a/docs/0.20.0/ingestion/data-formats.html
+++ b/docs/0.20.0/ingestion/data-formats.html
@@ -262,6 +262,9 @@ please read <a href="/docs/0.20.0/development/extensions-core/orc.html#migration
 <blockquote>
 <p>You need to include the <a href="/docs/0.20.0/development/extensions-core/avro.html"><code>druid-avro-extensions</code></a> as an extension to use the Avro OCF input format.</p>
 </blockquote>
+<blockquote>
+<p>See the <a href="/docs/0.20.0/development/extensions-core/avro.html#avro-types">Avro Types</a> section for how Avro types are handled in Druid.</p>
+</blockquote>
 <p>The <code>inputFormat</code> to load data of Avro OCF format. An example is:</p>
 <pre><code class="hljs css language-json">"ioConfig": {
   "inputFormat": {
@@ -383,6 +386,9 @@ Each line can be further parsed using <a href="#parsespec"><code>parseSpec</code
 <blockquote>
 <p>You need to include the <a href="/docs/0.20.0/development/extensions-core/avro.html"><code>druid-avro-extensions</code></a> as an extension to use the Avro Hadoop Parser.</p>
 </blockquote>
+<blockquote>
+<p>See the <a href="/docs/0.20.0/development/extensions-core/avro.html#avro-types">Avro Types</a> section for how Avro types are handled in Druid.</p>
+</blockquote>
 <p>This parser is for <a href="/docs/0.20.0/ingestion/hadoop.html">Hadoop batch ingestion</a>.
 The <code>inputFormat</code> of <code>inputSpec</code> in <code>ioConfig</code> must be set to <code>&quot;org.apache.druid.data.input.avro.AvroValueInputFormat&quot;</code>.
 You may want to set Avro reader's schema in <code>jobProperties</code> in <code>tuningConfig</code>,
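A reader schema for the Hadoop parser can be supplied through <code>jobProperties</code> as the sentence above begins to describe. A hedged sketch; the property name follows the Avro Hadoop convention referenced in the Druid docs, and the schema path is a placeholder:

```json
"tuningConfig": {
  "jobProperties": {
    "avro.schema.input.value.path": "/path/to/reader-schema.avsc"
  }
}
```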
@@ -880,6 +886,9 @@ an explicitly defined <a href="http://www.joda.org/joda-time/apidocs/org/joda/ti
 <blockquote>
 <p>You need to include the <a href="/docs/0.20.0/development/extensions-core/avro.html"><code>druid-avro-extensions</code></a> as an extension to use the Avro Stream Parser.</p>
 </blockquote>
+<blockquote>
+<p>See the <a href="/docs/0.20.0/development/extensions-core/avro.html#avro-types">Avro Types</a> section for how Avro types are handled in Druid.</p>
+</blockquote>
 <p>This parser is for <a href="/docs/0.20.0/ingestion/index.html#streaming">stream ingestion</a> and reads Avro data from a stream directly.</p>
 <table>
 <thead>
diff --git a/docs/latest/development/extensions-core/avro.html b/docs/latest/development/extensions-core/avro.html
index 1b8a32b..32e689b 100644
--- a/docs/latest/development/extensions-core/avro.html
+++ b/docs/latest/development/extensions-core/avro.html
@@ -82,8 +82,19 @@
 two Avro Parsers for stream ingestion and Hadoop batch ingestion.
 See <a href="/docs/latest/ingestion/data-formats.html#avro-hadoop-parser">Avro Hadoop Parser</a> and <a href="/docs/latest/ingestion/data-formats.html#avro-stream-parser">Avro Stream Parser</a>
 for more details about how to use these in an ingestion spec.</p>
+<p>Additionally, it provides an InputFormat for reading Avro OCF files when using
+<a href="/docs/latest/ingestion/native-batch.html">native batch indexing</a>; see <a href="/docs/latest/ingestion/data-formats.html#avro-ocf">Avro OCF</a>
+for details on how to ingest OCF files.</p>
 <p>Make sure to <a href="/docs/latest/development/extensions.html#loading-extensions">include</a> <code>druid-avro-extensions</code> as an extension.</p>
-</span></div></article></div><div class="docs-prevnext"><a class="docs-prev button" href="/docs/latest/development/extensions-core/approximate-histograms.html"><span class="arrow-prev">← </span><span>Approximate Histogram aggregators</span></a><a class="docs-next button" href="/docs/latest/development/extensions-core/azure.html"><span>Microsoft Azure</span><span class="arrow-next"> →</span></a></div></div></div><nav class="onPageNav"><ul class="toc-headings"><li><a href="#avro-extension" [...]
+<h3><a class="anchor" aria-hidden="true" id="avro-types"></a><a href="#avro-types" aria-hidden="true" class="hash-link"><svg class="hash-link-icon" aria-hidden="true" height="16" version="1.1" viewBox="0 0 16 16" width="16"><path fill-rule="evenodd" d="M4 9h1v1H4c-1.5 0-3-1.69-3-3.5S2.55 3 4 3h4c1.45 0 3 1.69 3 3.5 0 1.41-.91 2.72-2 3.25V8.59c.58-.45 1-1.27 1-2.09C10 5.22 8.98 4 8 4H4c-.98 0-2 1.22-2 2.5S3 9 4 9zm9-3h-1v1h1c1 0 2 1.22 2 2.5S13.98 12 13 12H9c-.98 0-2-1.22-2-2.5 0-.83.42-1 [...]
+<p>Druid supports most Avro types natively; the exceptions are detailed below.</p>
+<p><code>union</code> types that aren't of the form <code>[null, otherType]</code> aren't currently supported.</p>
+<p><code>bytes</code> and <code>fixed</code> Avro types are returned as base64-encoded strings by default. When the <code>binaryAsString</code> option is enabled on the Avro parser,
+these types are decoded as UTF-8 strings instead.</p>
+<p><code>enum</code> types are returned as a <code>string</code> containing the enum symbol.</p>
+<p><code>record</code> and <code>map</code> types representing nested data can be ingested using a <a href="/docs/latest/ingestion/data-formats.html#flattenspec">flattenSpec</a> on the parser.</p>
+<p>Druid doesn't currently support Avro logical types; they are ignored, and fields are handled according to the underlying primitive type.</p>
+</span></div></article></div><div class="docs-prevnext"><a class="docs-prev button" href="/docs/latest/development/extensions-core/approximate-histograms.html"><span class="arrow-prev">← </span><span>Approximate Histogram aggregators</span></a><a class="docs-next button" href="/docs/latest/development/extensions-core/azure.html"><span>Microsoft Azure</span><span class="arrow-next"> →</span></a></div></div></div><nav class="onPageNav"><ul class="toc-headings"><li><a href="#avro-extension" [...]
                 document.addEventListener('keyup', function(e) {
                   if (e.target !== document.body) {
                     return;
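The default base64 encoding of <code>bytes</code>/<code>fixed</code> fields versus the <code>binaryAsString</code> UTF-8 decoding described above can be sketched in Python. This is illustrative only, not Druid code:

```python
import base64

raw = b"hello"  # an Avro `bytes` value

# Default behavior: binary fields surface as base64-encoded strings.
as_base64 = base64.b64encode(raw).decode("ascii")

# With binaryAsString enabled, the bytes are decoded as UTF-8 instead.
as_utf8 = raw.decode("utf-8")

print(as_base64)  # aGVsbG8=
print(as_utf8)    # hello
```

Note the UTF-8 path only makes sense when the binary field actually holds text; arbitrary binary data would raise a decode error.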
diff --git a/docs/latest/ingestion/data-formats.html b/docs/latest/ingestion/data-formats.html
index efe7249..192e9c0 100644
--- a/docs/latest/ingestion/data-formats.html
+++ b/docs/latest/ingestion/data-formats.html
@@ -262,6 +262,9 @@ please read <a href="/docs/latest/development/extensions-core/orc.html#migration
 <blockquote>
 <p>You need to include the <a href="/docs/latest/development/extensions-core/avro.html"><code>druid-avro-extensions</code></a> as an extension to use the Avro OCF input format.</p>
 </blockquote>
+<blockquote>
+<p>See the <a href="/docs/latest/development/extensions-core/avro.html#avro-types">Avro Types</a> section for how Avro types are handled in Druid.</p>
+</blockquote>
 <p>The <code>inputFormat</code> to load data of Avro OCF format. An example is:</p>
 <pre><code class="hljs css language-json">"ioConfig": {
   "inputFormat": {
@@ -383,6 +386,9 @@ Each line can be further parsed using <a href="#parsespec"><code>parseSpec</code
 <blockquote>
 <p>You need to include the <a href="/docs/latest/development/extensions-core/avro.html"><code>druid-avro-extensions</code></a> as an extension to use the Avro Hadoop Parser.</p>
 </blockquote>
+<blockquote>
+<p>See the <a href="/docs/latest/development/extensions-core/avro.html#avro-types">Avro Types</a> section for how Avro types are handled in Druid.</p>
+</blockquote>
 <p>This parser is for <a href="/docs/latest/ingestion/hadoop.html">Hadoop batch ingestion</a>.
 The <code>inputFormat</code> of <code>inputSpec</code> in <code>ioConfig</code> must be set to <code>&quot;org.apache.druid.data.input.avro.AvroValueInputFormat&quot;</code>.
 You may want to set Avro reader's schema in <code>jobProperties</code> in <code>tuningConfig</code>,
@@ -880,6 +886,9 @@ an explicitly defined <a href="http://www.joda.org/joda-time/apidocs/org/joda/ti
 <blockquote>
 <p>You need to include the <a href="/docs/latest/development/extensions-core/avro.html"><code>druid-avro-extensions</code></a> as an extension to use the Avro Stream Parser.</p>
 </blockquote>
+<blockquote>
+<p>See the <a href="/docs/latest/development/extensions-core/avro.html#avro-types">Avro Types</a> section for how Avro types are handled in Druid.</p>
+</blockquote>
 <p>This parser is for <a href="/docs/latest/ingestion/index.html#streaming">stream ingestion</a> and reads Avro data from a stream directly.</p>
 <table>
 <thead>

