Posted to commits@arrow.apache.org by we...@apache.org on 2019/08/13 13:11:08 UTC

[arrow] branch master updated: ARROW-5746: [Website] Move website source out of apache/arrow

This is an automated email from the ASF dual-hosted git repository.

wesm pushed a commit to branch master
in repository https://gitbox.apache.org/repos/asf/arrow.git


The following commit(s) were added to refs/heads/master by this push:
     new 84254c0  ARROW-5746: [Website] Move website source out of apache/arrow
84254c0 is described below

commit 84254c029924af3baa5c8d85b7f2c49671b2e017
Author: Neal Richardson <ne...@gmail.com>
AuthorDate: Tue Aug 13 08:10:51 2019 -0500

    ARROW-5746: [Website] Move website source out of apache/arrow
    
    Companion to https://github.com/apache/arrow-site/pull/9
    
    In addition to `rm -rf site`, this patch edits a few scripts that reference the site directory. The least invasive approach I found was to add `site/` to `.gitignore` and recommend that folks clone `apache/arrow-site` there, at which point the existing scripts should mostly just work.
    
    There were also a few misspellings ("consitent", "outsite") that spuriously matched `grep site` while I was searching for places to edit, so this patch fixes those typos as well.
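
    A minimal sketch of that recommended setup, assuming the root of an apache/arrow checkout (the layout mirrors the old site/README.md and the site/asf-site paths used by dev/gen_apidocs/create_documents.sh; it is a suggestion rather than something this patch enforces):

        # Clone the website source into the now-gitignored site/ directory so
        # that the existing scripts referencing site/ keep working.
        git clone https://github.com/apache/arrow-site.git site

        # Scripts that publish rendered HTML (e.g. dev/gen_apidocs) expect the
        # asf-site branch checked out as a subdirectory of site/:
        git clone --branch=asf-site https://github.com/apache/arrow-site.git site/asf-site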
    
    Closes #5015 from nealrichardson/move-website-source and squashes the following commits:
    
    c23a26951 <Neal Richardson> Add this dockerignore back in case people use the site dir locally
    e811f94ba <Neal Richardson> Resolve TODOs by recommending cloning arrow-site into the (now gitignored) site directory
    96ba26811 <Neal Richardson> Remove website source, note a few TODOs for doc building; fix some typos that 'grep site' shouldn't have hit
    
    Authored-by: Neal Richardson <ne...@gmail.com>
    Signed-off-by: Wes McKinney <we...@apache.org>
---
 .dockerignore                                      |   2 +-
 .gitignore                                         |   2 +
 ci/detect-changes.py                               |   6 +-
 ci/docker_build_sphinx.sh                          |   5 +-
 cpp/src/parquet/schema-test.cc                     |   2 +-
 dev/gen_apidocs/create_documents.sh                |   1 +
 dev/release/post-03-website.sh                     |   1 +
 dev/run_site/run_site.sh                           |   1 +
 go/arrow/ipc/file_reader.go                        |   2 +-
 .../org/apache/arrow/memory/AllocationManager.java |   2 +-
 js/README.md                                       |   2 +-
 python/manylinux1/build_arrow.sh                   |   2 +-
 python/manylinux2010/build_arrow.sh                |   2 +-
 site/.gitignore                                    |   9 -
 site/Gemfile                                       |  26 -
 site/README.md                                     | 124 ----
 site/_config.yml                                   |  54 --
 site/_data/contributors.yml                        |  52 --
 site/_data/versions.yml                            |  31 -
 site/_docs/.gitignore                              |   1 -
 site/_docs/c_glib/index.md                         |  36 -
 site/_includes/analytics.html                      |   9 -
 site/_includes/blog_entry.html                     |  20 -
 site/_includes/blog_header.html                    |  59 --
 site/_includes/footer.html                         |   6 -
 site/_includes/header.html                         |  86 ---
 site/_includes/top.html                            |  21 -
 site/_layouts/default.html                         |  16 -
 site/_layouts/post.html                            |  22 -
 site/_posts/2017-05-08-0.3-release-japanese.md     | 189 -----
 site/_posts/2017-05-08-0.3-release.md              | 265 -------
 site/_posts/2017-05-23-0.4.0-release.md            | 111 ---
 site/_posts/2017-06-14-0.4.1-release.md            |  67 --
 site/_posts/2017-06-16-turbodbc-arrow.md           | 117 ----
 site/_posts/2017-07-24-0.5.0-release.md            | 114 ----
 site/_posts/2017-07-26-spark-arrow.md              | 158 -----
 .../2017-08-08-plasma-in-memory-object-store.md    | 150 ----
 site/_posts/2017-08-16-0.6.0-release.md            | 112 ---
 site/_posts/2017-09-19-0.7.0-release.md            | 190 ------
 ...fast-python-serialization-with-ray-and-arrow.md | 308 ---------
 site/_posts/2017-12-19-0.8.0-release.md            | 192 ------
 site/_posts/2017-12-19-java-vector-improvements.md | 104 ---
 site/_posts/2018-03-22-0.9.0-release.md            | 100 ---
 site/_posts/2018-03-22-go-code-donation.md         |  82 ---
 site/_posts/2018-07-20-jemalloc.md                 | 141 ----
 site/_posts/2018-08-07-0.10.0-release.md           |  65 --
 site/_posts/2018-10-09-0.11.0-release.md           | 119 ----
 site/_posts/2018-12-05-gandiva-donation.md         |  93 ---
 site/_posts/2019-01-21-0.12.0-release.md           | 184 -----
 site/_posts/2019-01-25-r-spark-improvements.md     | 198 ------
 site/_posts/2019-02-04-datafusion-donation.md      | 119 ----
 .../_posts/2019-02-05-python-string-memory-0.12.md | 247 -------
 site/_posts/2019-04-02-0.13.0-release.md           | 187 -----
 site/_posts/2019-07-08-0.14.0-release.md           | 300 --------
 site/_posts/2019-08-01-r-package-on-cran.md        | 173 -----
 site/_release/0.1.0.md                             | 236 -------
 site/_release/0.10.0.md                            | 612 -----------------
 site/_release/0.11.0.md                            | 427 ------------
 site/_release/0.11.1.md                            |  52 --
 site/_release/0.12.0.md                            | 736 --------------------
 site/_release/0.13.0.md                            | 715 -------------------
 site/_release/0.14.0.md                            | 760 ---------------------
 site/_release/0.14.1.md                            | 133 ----
 site/_release/0.2.0.md                             | 255 -------
 site/_release/0.3.0.md                             | 365 ----------
 site/_release/0.4.0.md                             | 138 ----
 site/_release/0.4.1.md                             |  98 ---
 site/_release/0.5.0.md                             | 203 ------
 site/_release/0.6.0.md                             | 156 -----
 site/_release/0.7.0.md                             | 205 ------
 site/_release/0.7.1.md                             |  91 ---
 site/_release/0.8.0.md                             | 370 ----------
 site/_release/0.9.0.md                             | 354 ----------
 site/_release/index.md                             |  65 --
 site/_sass/_font-awesome.scss                      |  25 -
 site/arrow.rdf                                     |  62 --
 .../arrow_object.png                               | Bin 10539 -> 0 bytes
 .../python_object.png                              | Bin 24165 -> 0 bytes
 .../speedups0.png                                  | Bin 4050 -> 0 bytes
 .../speedups1.png                                  | Bin 4502 -> 0 bytes
 .../speedups2.png                                  | Bin 4276 -> 0 bytes
 .../speedups3.png                                  | Bin 13637 -> 0 bytes
 site/assets/javascript/main.js                     |  18 -
 site/blog.html                                     |  30 -
 site/committers.html                               | 270 --------
 site/css/main.scss                                 |   5 -
 site/css/syntax.css                                | 216 ------
 site/faq.md                                        |  53 --
 site/img/20190205-arrow-string.png                 | Bin 4127 -> 0 bytes
 site/img/20190205-numpy-string.png                 | Bin 13714 -> 0 bytes
 site/img/arrow-inverse-300px.png                   | Bin 10756 -> 0 bytes
 site/img/arrow-inverse.png                         | Bin 26443 -> 0 bytes
 site/img/arrow-r-spark-collecting.png              | Bin 12026 -> 0 bytes
 site/img/arrow-r-spark-copying.png                 | Bin 11921 -> 0 bytes
 site/img/arrow-r-spark-transforming.png            | Bin 9176 -> 0 bytes
 site/img/arrow.png                                 | Bin 21636 -> 0 bytes
 site/img/asf_logo.svg                              | 210 ------
 site/img/copy.png                                  | Bin 64271 -> 0 bytes
 site/img/native_go_implementation.png              | Bin 56186 -> 0 bytes
 site/img/shared.png                                | Bin 20925 -> 0 bytes
 site/img/simd.png                                  | Bin 101031 -> 0 bytes
 site/img/turbodbc_arrow.png                        | Bin 75697 -> 0 bytes
 site/index.html                                    |  72 --
 site/install.md                                    | 249 -------
 site/powered_by.md                                 | 176 -----
 site/scripts/sync_format_docs.sh                   |  24 -
 106 files changed, 17 insertions(+), 12051 deletions(-)

diff --git a/.dockerignore b/.dockerignore
index 20098d0..16bdebb 100644
--- a/.dockerignore
+++ b/.dockerignore
@@ -92,5 +92,5 @@ js/test/data/**/*.arrow
 # Rust
 rust/target
 
-# site
+# Site
 site/
diff --git a/.gitignore b/.gitignore
index 21e8f34..bad26c4 100644
--- a/.gitignore
+++ b/.gitignore
@@ -64,3 +64,5 @@ docker_cache
 .gdb_history
 .DS_Store
 *.orig
+
+site/
diff --git a/ci/detect-changes.py b/ci/detect-changes.py
index 9194502..d35d9b5 100644
--- a/ci/detect-changes.py
+++ b/ci/detect-changes.py
@@ -132,7 +132,7 @@ def list_appveyor_affected_files():
 LANGUAGE_TOPICS = ['c_glib', 'cpp', 'docs', 'go', 'java', 'js', 'python',
                    'r', 'ruby', 'rust', 'csharp']
 
-ALL_TOPICS = LANGUAGE_TOPICS + ['integration', 'site', 'dev']
+ALL_TOPICS = LANGUAGE_TOPICS + ['integration', 'dev']
 
 
 AFFECTED_DEPENDENCIES = {
@@ -147,7 +147,7 @@ AFFECTED_DEPENDENCIES = {
 }
 
 COMPONENTS = {'cpp', 'java', 'c_glib', 'r', 'ruby', 'integration', 'js',
-              'rust', 'csharp', 'site', 'go', 'docs', 'python', 'dev'}
+              'rust', 'csharp', 'go', 'docs', 'python', 'dev'}
 
 
 def get_affected_topics(affected_files):
@@ -270,7 +270,6 @@ def test_get_affected_topics():
         'rust': False,
         'csharp': False,
         'integration': True,
-        'site': False,
         'dev': False
     }
 
@@ -288,7 +287,6 @@ def test_get_affected_topics():
         'rust': True,
         'csharp': True,
         'integration': True,
-        'site': False,
         'dev': False
     }
 
diff --git a/ci/docker_build_sphinx.sh b/ci/docker_build_sphinx.sh
index 4a65f81..4f36910 100755
--- a/ci/docker_build_sphinx.sh
+++ b/ci/docker_build_sphinx.sh
@@ -24,5 +24,6 @@ popd
 
 sphinx-build -b html /arrow/docs/source /arrow/docs/_build/html
 
-mkdir -p /arrow/site/asf-site/docs/latest
-rsync -r /arrow/docs/_build/html/ /arrow/site/asf-site/docs/latest/
+git clone -b asf-site https://github.com/apache/arrow-site /arrow/site
+mkdir -p /arrow/site/docs/latest
+rsync -r /arrow/docs/_build/html/ /arrow/site/docs/latest/
diff --git a/cpp/src/parquet/schema-test.cc b/cpp/src/parquet/schema-test.cc
index 023a1b0..511f09a 100644
--- a/cpp/src/parquet/schema-test.cc
+++ b/cpp/src/parquet/schema-test.cc
@@ -445,7 +445,7 @@ TEST_F(TestSchemaConverter, NestedExample) {
 
   ASSERT_TRUE(schema->Equals(group_));
 
-  // Check that the parent relationship in each node is consitent
+  // Check that the parent relationship in each node is consistent
   ASSERT_EQ(group_->parent(), nullptr);
   ASSERT_TRUE(check_for_parent_consistency(group_));
 }
diff --git a/dev/gen_apidocs/create_documents.sh b/dev/gen_apidocs/create_documents.sh
index c946f48..142e6e0 100755
--- a/dev/gen_apidocs/create_documents.sh
+++ b/dev/gen_apidocs/create_documents.sh
@@ -30,6 +30,7 @@ export CXXFLAGS="-D_GLIBCXX_USE_CXX11_ABI=0"
 
 # Make Java documentation
 # Override user.home to cache dependencies outside the Docker container
+# NB: this assumes that you have arrow-site cloned in the (gitignored) site directory
 pushd arrow/java
 mvn -Duser.home=`pwd`/.apidocs-m2 -Drat.skip=true -Dcheckstyle.skip=true install site
 mkdir -p ../site/asf-site/docs/java/
diff --git a/dev/release/post-03-website.sh b/dev/release/post-03-website.sh
index 826dc4e..db1374f 100755
--- a/dev/release/post-03-website.sh
+++ b/dev/release/post-03-website.sh
@@ -37,6 +37,7 @@ branch_name=release-note-${version}
 git checkout master
 git checkout -b ${branch_name}
 
+# NB: this assumes that you have arrow-site cloned in the (gitignored) site directory
 site_dir="${SOURCE_TOP_DIR}/site"
 release_dir="${site_dir}/_release"
 announce_file="${release_dir}/${version}.md"
diff --git a/dev/run_site/run_site.sh b/dev/run_site/run_site.sh
index 088eabd..3f03aea 100755
--- a/dev/run_site/run_site.sh
+++ b/dev/run_site/run_site.sh
@@ -15,6 +15,7 @@
 # See the License for the specific language governing permissions and
 # limitations under the License.
 #
+# NB: this assumes that you have arrow-site cloned in the (gitignored) site directory
 cd /apache-arrow/arrow/site
 export GEM_HOME=$(pwd)
 export BUNDLE_PATH=$(pwd)
diff --git a/go/arrow/ipc/file_reader.go b/go/arrow/ipc/file_reader.go
index 5de7efb..813f583 100644
--- a/go/arrow/ipc/file_reader.go
+++ b/go/arrow/ipc/file_reader.go
@@ -82,7 +82,7 @@ func NewFileReader(r ReadAtSeeker, opts ...Option) (*FileReader, error) {
 	}
 
 	if cfg.schema != nil && !cfg.schema.Equal(f.schema) {
-		return nil, errors.Errorf("arrow/ipc: inconsitent schema for reading (got: %v, want: %v)", f.schema, cfg.schema)
+		return nil, errors.Errorf("arrow/ipc: inconsistent schema for reading (got: %v, want: %v)", f.schema, cfg.schema)
 	}
 
 	return &f, err
diff --git a/java/memory/src/main/java/org/apache/arrow/memory/AllocationManager.java b/java/memory/src/main/java/org/apache/arrow/memory/AllocationManager.java
index 993ae5d..d516d9c 100644
--- a/java/memory/src/main/java/org/apache/arrow/memory/AllocationManager.java
+++ b/java/memory/src/main/java/org/apache/arrow/memory/AllocationManager.java
@@ -136,7 +136,7 @@ public class AllocationManager {
       // store the mapping for <allocator, reference manager>
       BufferLedger oldLedger = map.put(ledger);
       Preconditions.checkState(oldLedger == null,
-          "Detected inconsitent state: A reference manager already exists for this allocator");
+          "Detected inconsistent state: A reference manager already exists for this allocator");
 
       // needed for debugging only: keep a pointer to reference manager inside allocator
       // to dump state, verify allocator state etc
diff --git a/js/README.md b/js/README.md
index 0a72b52..bbf5fe3 100644
--- a/js/README.md
+++ b/js/README.md
@@ -251,7 +251,7 @@ Full list of broader Apache Arrow [committers](https://arrow.apache.org/committe
 
 # Powered By Apache Arrow in JS
 
-Full list of broader Apache Arrow [projects & organizations](https://github.com/apache/arrow/blob/master/site/powered_by.md).
+Full list of broader Apache Arrow [projects & organizations](https://arrow.apache.org/powered_by/).
 
 ## Open Source Projects
 
diff --git a/python/manylinux1/build_arrow.sh b/python/manylinux1/build_arrow.sh
index 9128b73..95f1f5a 100755
--- a/python/manylinux1/build_arrow.sh
+++ b/python/manylinux1/build_arrow.sh
@@ -148,7 +148,7 @@ if sys.version_info.major > 2:
     import pyarrow.gandiva
   "
 
-  # More thorough testing happens outsite of the build to prevent
+  # More thorough testing happens outside of the build to prevent
   # packaging issues like ARROW-4372
   mv dist/*.tar.gz /io/dist
   mv repaired_wheels/*.whl /io/dist
diff --git a/python/manylinux2010/build_arrow.sh b/python/manylinux2010/build_arrow.sh
index d4268e5..ee3497d 100755
--- a/python/manylinux2010/build_arrow.sh
+++ b/python/manylinux2010/build_arrow.sh
@@ -144,7 +144,7 @@ if sys.version_info.major > 2:
     import pyarrow.gandiva
 "
 
-# More thorough testing happens outsite of the build to prevent
+# More thorough testing happens outside of the build to prevent
 # packaging issues like ARROW-4372
 mv dist/*.tar.gz /io/dist
 mv repaired_wheels/*.whl /io/dist
diff --git a/site/.gitignore b/site/.gitignore
deleted file mode 100644
index 3a88173..0000000
--- a/site/.gitignore
+++ /dev/null
@@ -1,9 +0,0 @@
-_site
-.sass-cache
-.jekyll-cache
-.jekyll-metadata
-Gemfile.lock
-asf-site
-build/
-.bundle/
-ruby/
diff --git a/site/Gemfile b/site/Gemfile
deleted file mode 100644
index a8b1245..0000000
--- a/site/Gemfile
+++ /dev/null
@@ -1,26 +0,0 @@
-# Licensed to the Apache Software Foundation (ASF) under one or more
-# contributor license agreements.  See the NOTICE file distributed with
-# this work for additional information regarding copyright ownership.
-# The ASF licenses this file to you under the Apache License, Version 2.0
-# (the "License"); you may not use this file except in compliance with
-# the License.  You may obtain a copy of the License at
-#
-# http://www.apache.org/licenses/LICENSE-2.0
-#
-# Unless required by applicable law or agreed to in writing, software
-# distributed under the License is distributed on an "AS IS" BASIS,
-# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
-# See the License for the specific language governing permissions and
-# limitations under the License.
-#
-
-source "https://rubygems.org"
-ruby RUBY_VERSION
-gem "jekyll", "3.8.4"
-gem 'bootstrap', '~> 4.1.3'
-group :jekyll_plugins do
-   gem "jekyll-assets"
-   gem "jekyll-feed", "~> 0.6"
-   gem "jekyll-jupyter-notebook"
-end
-gem 'tzinfo-data', platforms: [:mingw, :mswin, :x64_mingw, :jruby]
diff --git a/site/README.md b/site/README.md
deleted file mode 100644
index 8dddbf0..0000000
--- a/site/README.md
+++ /dev/null
@@ -1,124 +0,0 @@
-<!---
-  Licensed to the Apache Software Foundation (ASF) under one
-  or more contributor license agreements.  See the NOTICE file
-  distributed with this work for additional information
-  regarding copyright ownership.  The ASF licenses this file
-  to you under the Apache License, Version 2.0 (the
-  "License"); you may not use this file except in compliance
-  with the License.  You may obtain a copy of the License at
-
-    http://www.apache.org/licenses/LICENSE-2.0
-
-  Unless required by applicable law or agreed to in writing,
-  software distributed under the License is distributed on an
-  "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
-  KIND, either express or implied.  See the License for the
-  specific language governing permissions and limitations
-  under the License.
--->
-
-# Apache Arrow Website
-
-## Overview
-
-Site content is maintained in the main Arrow repository, mostly in markdown
-format. [Jekyll](https://jekyllrb.com/) is used to generate HTML files that can
-then be committed to the [arrow-site](https://github.com/apache/arrow-site)
-repository.
-
-## Adding Content
-
-To add a blog post, create a new markdown file in the `_posts` directory, following the model of existing posts. In the front matter, you should specify an "author". This should be your Apache ID if you have one, or it can just be your name. To add additional metadata about yourself (GitHub ID, website), add yourself to `_data/contributors.yml`. This object is keyed by `apacheId`, so use that as the `author` in your post. (It doesn't matter if the ID actually exists in the ASF; all metad [...]
-
-## Prerequisites
-
-With Ruby >= 2.1 installed, run the following commands to install
-[Jekyll](https://jekyllrb.com/).
-
-```shell
-gem install jekyll bundler
-bundle install
-```
-
-On some platforms, the Ruby `nokogiri` library may fail to build, in
-such cases the following configuration option may help:
-
-```
-bundle config build.nokogiri --use-system-libraries
-```
-
-`nokogiri` depends on the `libxml2` and `libxslt1` libraries, which can be
-installed on Debian-like systems with
-
-```
-apt-get install libxml2-dev libxslt1-dev
-```
-
-If you are planning to publish the website, you must clone the arrow-site git
-repository. Run this command from the `site` directory so that `asf-site` is a
-subdirectory of `site`.
-
-```shell
-git clone --branch=asf-site https://github.com/apache/arrow-site.git asf-site
-```
-
-## Previewing the site
-
-From the `site` directory, run the following to generate HTML files and run the
-web site locally.
-
-```
-bundle exec jekyll serve
-```
-
-## Publishing
-
-After following the above instructions, run the following commands from the
-`site` directory:
-
-```shell
-JEKYLL_ENV=production bundle exec jekyll build
-rsync -r build/ asf-site/
-cd asf-site
-git status
-```
-
-Now `git add` any new files, then commit everything, and push:
-
-```
-git push
-```
-
-### Updating Code Documentation
-
-To update the documentation, run the script `./dev/gen_apidocs.sh`. This script
-will run the code documentation tools in a fixed environment.
-
-#### C (GLib)
-
-First, build Apache Arrow C++ and Apache Arrow GLib.
-
-```
-mkdir -p ../cpp/build
-cd ../cpp/build
-cmake .. -DCMAKE_BUILD_TYPE=debug
-make
-cd ../../c_glib
-./autogen.sh
-./configure \
-  --with-arrow-cpp-build-dir=$PWD/../cpp/build \
-  --with-arrow-cpp-build-type=debug \
-  --enable-gtk-doc
-LD_LIBRARY_PATH=$PWD/../cpp/build/debug make GTK_DOC_V_XREF=": "
-rsync -r doc/reference/html/ ../site/asf-site/docs/c_glib/
-```
-
-#### Javascript
-
-```
-cd ../js
-npm run doc
-rsync -r doc/ ../site/asf-site/docs/js
-```
-
-Then add/commit/push from the site/asf-site git checkout.
diff --git a/site/_config.yml b/site/_config.yml
deleted file mode 100644
index ae420e4..0000000
--- a/site/_config.yml
+++ /dev/null
@@ -1,54 +0,0 @@
-# Licensed to the Apache Software Foundation (ASF) under one or more
-# contributor license agreements.  See the NOTICE file distributed with
-# this work for additional information regarding copyright ownership.
-# The ASF licenses this file to you under the Apache License, Version 2.0
-# (the "License"); you may not use this file except in compliance with
-# the License.  You may obtain a copy of the License at
-#
-# http://www.apache.org/licenses/LICENSE-2.0
-#
-# Unless required by applicable law or agreed to in writing, software
-# distributed under the License is distributed on an "AS IS" BASIS,
-# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
-# See the License for the specific language governing permissions and
-# limitations under the License.
-#
-markdown: kramdown
-permalink: /blog/:year/:month/:day/:title/
-repository: https://github.com/apache/arrow
-destination: build
-excerpt_separator: ""
-timezone: America/New_York
-
-kramdown:
-  input: GFM
-  syntax_highlighter: rouge
-
-include:
-  - _static
-
-exclude:
-  - Gemfile
-  - Gemfile.lock
-  - ruby
-  - asf-site
-  - scripts
-  - README.md
-  - vendor
-
-collections:
-  docs:
-    output: true
-  release:
-    output: true
-
-sass:
-  style: compressed
-
-# The base path where the website is deployed
-baseurl:
-
-plugins:
-  - jekyll-assets
-  - jekyll-feed
-  - jekyll-jupyter-notebook
diff --git a/site/_data/contributors.yml b/site/_data/contributors.yml
deleted file mode 100644
index e70d9af..0000000
--- a/site/_data/contributors.yml
+++ /dev/null
@@ -1,52 +0,0 @@
-# Licensed to the Apache Software Foundation (ASF) under one or more
-# contributor license agreements.  See the NOTICE file distributed with
-# this work for additional information regarding copyright ownership.
-# The ASF licenses this file to you under the Apache License, Version 2.0
-# (the "License"); you may not use this file except in compliance with
-# the License.  You may obtain a copy of the License at
-#
-# http://www.apache.org/licenses/LICENSE-2.0
-#
-# Unless required by applicable law or agreed to in writing, software
-# distributed under the License is distributed on an "AS IS" BASIS,
-# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
-# See the License for the specific language governing permissions and
-# limitations under the License.
-#
-# Database of contributors to Apache Arrow (WIP)
-# Blogs and other pages use this data
-#
-- name: Wes McKinney
-  apacheId: wesm
-  githubId: wesm
-  homepage: https://wesmckinney.com
-  role: PMC
-- name: Uwe Korn
-  apacheId: uwe
-  githubId: xhochy
-  role: PMC
-- name: Jacques Nadeau
-  apacheId: jacques
-  githubId: jacques-n
-  role: PMC
-- name: Julien Le Dem
-  apacheId: julienledem
-  githubId: julienledem
-  role: PMC
-- name: Michael König
-  apacheId: MathMagique
-  githubId: MathMagique
-- name: Siddharth Teotia
-  apacheId: siddharthteotia
-  githubId: siddharthteotia
-- name: The Apache Arrow PMC
-  apacheId: pmc # Not a real apacheId
-  homepage: https://arrow.apache.org
-- name: Andy Grove
-  apacheId: agrove
-  githubId: andygrove
-  role: PMC
-- name: Neal Richardson
-  apacheId: npr # Not a real apacheId
-  githubId: nealrichardson
-# End contributors.yml
diff --git a/site/_data/versions.yml b/site/_data/versions.yml
deleted file mode 100644
index 813a2f7..0000000
--- a/site/_data/versions.yml
+++ /dev/null
@@ -1,31 +0,0 @@
-# Licensed to the Apache Software Foundation (ASF) under one or more
-# contributor license agreements.  See the NOTICE file distributed with
-# this work for additional information regarding copyright ownership.
-# The ASF licenses this file to you under the Apache License, Version 2.0
-# (the "License"); you may not use this file except in compliance with
-# the License.  You may obtain a copy of the License at
-#
-# http://www.apache.org/licenses/LICENSE-2.0
-#
-# Unless required by applicable law or agreed to in writing, software
-# distributed under the License is distributed on an "AS IS" BASIS,
-# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
-# See the License for the specific language governing permissions and
-# limitations under the License.
-#
-# Database of the current verion
-#
-current:
-  number: '0.14.1'
-  pinned_number: '0.14.*'
-  date: '22 July 2019'
-  git-tag: '5f564424c71cef12619522cdde59be5f69b31b68'
-  github-tag-link: 'https://github.com/apache/arrow/releases/tag/apache-arrow-0.14.1'
-  release-notes: 'https://arrow.apache.org/release/0.14.1.html'
-  mirrors: 'https://www.apache.org/dyn/closer.cgi/arrow/arrow-0.14.1/'
-  tarball_name: 'apache-arrow-0.14.1.tar.gz'
-  mirrors-tar: 'https://www.apache.org/dyn/closer.cgi/arrow/arrow-0.14.1/apache-arrow-0.14.1.tar.gz'
-  java-artifacts: 'http://search.maven.org/#search%7Cga%7C1%7Cg%3A%22org.apache.arrow%22%20AND%20v%3A%220.14.1%22'
-  asc: 'https://www.apache.org/dist/arrow/arrow-0.14.1/apache-arrow-0.14.1.tar.gz.asc'
-  sha256: 'https://www.apache.org/dist/arrow/arrow-0.14.1/apache-arrow-0.14.1.tar.gz.sha256'
-  sha512: 'https://www.apache.org/dist/arrow/arrow-0.14.1/apache-arrow-0.14.1.tar.gz.sha512'
diff --git a/site/_docs/.gitignore b/site/_docs/.gitignore
deleted file mode 100644
index 1e942fc..0000000
--- a/site/_docs/.gitignore
+++ /dev/null
@@ -1 +0,0 @@
-format/
\ No newline at end of file
diff --git a/site/_docs/c_glib/index.md b/site/_docs/c_glib/index.md
deleted file mode 100644
index d232d88..0000000
--- a/site/_docs/c_glib/index.md
+++ /dev/null
@@ -1,36 +0,0 @@
----
-layout: default
-title: Apache Arrow GLib (C)
-permalink: /docs/c_glib/index.html
----
-<!--
-{% comment %}
-Licensed to the Apache Software Foundation (ASF) under one or more
-contributor license agreements.  See the NOTICE file distributed with
-this work for additional information regarding copyright ownership.
-The ASF licenses this file to you under the Apache License, Version 2.0
-(the "License"); you may not use this file except in compliance with
-the License.  You may obtain a copy of the License at
-
-http://www.apache.org/licenses/LICENSE-2.0
-
-Unless required by applicable law or agreed to in writing, software
-distributed under the License is distributed on an "AS IS" BASIS,
-WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
-See the License for the specific language governing permissions and
-limitations under the License.
-{% endcomment %}
--->
-
-# Apache Arrow GLib (C)
-
-Apache Arrow GLib is a wrapper library for Apache Arrow C++. Apache Arrow GLib provides C API.
-
-Apache Arrow GLib supports [GObject Introspection][gobject-introspection]. It means that you can create language bindings at runtime or compile time automatically.
-
-## API reference manuals
-
-  * [Apache Arrow GLib](arrow-glib/)
-  * [Apache Parquet GLib](parquet-glib/)
-
-[gobject-introspection]: https://wiki.gnome.org/action/show/Projects/GObjectIntrospection
diff --git a/site/_includes/analytics.html b/site/_includes/analytics.html
deleted file mode 100644
index db0e0b7..0000000
--- a/site/_includes/analytics.html
+++ /dev/null
@@ -1,9 +0,0 @@
-<!-- Global Site Tag (gtag.js) - Google Analytics -->
-<script async src="https://www.googletagmanager.com/gtag/js?id=UA-107500873-1"></script>
-<script>
-  window.dataLayer = window.dataLayer || [];
-  function gtag(){dataLayer.push(arguments)};
-  gtag('js', new Date());
-
-  gtag('config', 'UA-107500873-1');
-</script>
diff --git a/site/_includes/blog_entry.html b/site/_includes/blog_entry.html
deleted file mode 100644
index 65ec138..0000000
--- a/site/_includes/blog_entry.html
+++ /dev/null
@@ -1,20 +0,0 @@
-{% comment %}
-Licensed to the Apache Software Foundation (ASF) under one or more
-contributor license agreements.  See the NOTICE file distributed with
-this work for additional information regarding copyright ownership.
-The ASF licenses this file to you under the Apache License, Version 2.0
-(the "License"); you may not use this file except in compliance with
-the License.  You may obtain a copy of the License at
-
-http://www.apache.org/licenses/LICENSE-2.0
-
-Unless required by applicable law or agreed to in writing, software
-distributed under the License is distributed on an "AS IS" BASIS,
-WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
-See the License for the specific language governing permissions and
-limitations under the License.
-{% endcomment %}
-  <div class="blog-post" style="margin-bottom: 4rem">
-    {% include blog_header.html %}
-    {{ post.content }}
-  </div>
diff --git a/site/_includes/blog_header.html b/site/_includes/blog_header.html
deleted file mode 100644
index a5a1bac..0000000
--- a/site/_includes/blog_header.html
+++ /dev/null
@@ -1,59 +0,0 @@
-{% comment %}
-Licensed to the Apache Software Foundation (ASF) under one or more
-contributor license agreements.  See the NOTICE file distributed with
-this work for additional information regarding copyright ownership.
-The ASF licenses this file to you under the Apache License, Version 2.0
-(the "License"); you may not use this file except in compliance with
-the License.  You may obtain a copy of the License at
-
-http://www.apache.org/licenses/LICENSE-2.0
-
-Unless required by applicable law or agreed to in writing, software
-distributed under the License is distributed on an "AS IS" BASIS,
-WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
-See the License for the specific language governing permissions and
-limitations under the License.
-{% endcomment %}
-<h1>
-  {{ post.title }}
-  <a href="{{ post.url }}" class="permalink" title="Permalink">∞</a>
-</h1>
-
-{% capture discard %}
-  {% comment %}
-    We have to initialize the author_name to nil so that when this _include
-    is used in a for loop, the previous entry doesn't contaminate
-  {% endcomment %}
-  {% assign author_name = nil %}
-  {% for c in site.data.contributors %}
-    {% if c.apacheId == post.author %}
-      {% assign author_name = c.name %}
-      {% if c.homepage %}
-        {% assign homepage = c.homepage %}
-      {% else if c.githubId %}
-        {% capture homepage %}http://github.com/{{ c.githubId }}{% endcapture %}
-      {% else %}
-        {% capture homepage %}http://people.apache.org/~{{ post.author }}{% endcapture %}
-      {% endif %}
-      {% if c.avatar %}
-        {% assign avatar = c.avatar %}
-      {% else %}
-        {% capture avatar %}http://github.com/{{ c.githubId }}.png{% endcapture %}
-      {% endif %}
-    {% endif %}
-  {% endfor %}
-{% endcapture %}{% assign discard = nil %}
-
-<p>
-  <span class="badge badge-secondary">Published</span>
-  <span class="published">
-    {{ post.date | date_to_string }}
-  </span>
-  <br />
-  <span class="badge badge-secondary">By</span>
-  {% if author_name %}
-    <a href="{{ homepage }}">{{ author_name }} ({{ post.author }}) </a>
-  {% else %}
-    {{ post.author }}
-  {% endif %}
-</p>
diff --git a/site/_includes/footer.html b/site/_includes/footer.html
deleted file mode 100644
index 25dfac4..0000000
--- a/site/_includes/footer.html
+++ /dev/null
@@ -1,6 +0,0 @@
-<hr/>
-<footer class="footer">
-  <p>Apache Arrow, Arrow, Apache, the Apache feather logo, and the Apache Arrow project logo are either registered trademarks or trademarks of The Apache Software Foundation in the United States and other countries.</p>
-  <p>&copy; 2016-2019 The Apache Software Foundation</p>
-  {% asset main.js %}
-</footer>
diff --git a/site/_includes/header.html b/site/_includes/header.html
deleted file mode 100644
index 7c02533..0000000
--- a/site/_includes/header.html
+++ /dev/null
@@ -1,86 +0,0 @@
-<nav class="navbar navbar-expand-md navbar-dark bg-dark">
-  <a class="navbar-brand" href="{{ site.baseurl }}/"><img src="{{ site.baseurl }}/img/arrow-inverse-300px.png" height="60px"/></a>
-  <button class="navbar-toggler" type="button" data-toggle="collapse" data-target="#arrow-navbar" aria-controls="arrow-navbar" aria-expanded="false" aria-label="Toggle navigation">
-    <span class="navbar-toggler-icon"></span>
-  </button>
-
-    <!-- Collect the nav links, forms, and other content for toggling -->
-    <div class="collapse navbar-collapse" id="arrow-navbar">
-      <ul class="nav navbar-nav">
-        <li class="nav-item dropdown">
-          <a class="nav-link dropdown-toggle" href="#"
-             id="navbarDropdownProjectLinks" role="button" data-toggle="dropdown"
-             aria-haspopup="true" aria-expanded="false">
-             Project Links
-          </a>
-          <div class="dropdown-menu" aria-labelledby="navbarDropdownProjectLinks">
-            <a class="dropdown-item" href="{{ site.baseurl }}/install/">Installation</a>
-            <a class="dropdown-item" href="{{ site.baseurl }}/release/">Releases</a>
-            <a class="dropdown-item" href="{{ site.baseurl }}/faq/">FAQ</a>
-            <a class="dropdown-item" href="{{ site.baseurl }}/blog/">Blog</a>
-            <a class="dropdown-item" href="https://github.com/apache/arrow">Source Code</a>
-            <a class="dropdown-item" href="https://issues.apache.org/jira/browse/ARROW">Issue Tracker</a>
-          </div>
-        </li>
-        <li class="nav-item dropdown">
-          <a class="nav-link dropdown-toggle" href="#"
-             id="navbarDropdownCommunity" role="button" data-toggle="dropdown"
-             aria-haspopup="true" aria-expanded="false">
-             Community
-          </a>
-          <div class="dropdown-menu" aria-labelledby="navbarDropdownCommunity">
-            <a class="dropdown-item" href="http://mail-archives.apache.org/mod_mbox/arrow-user/">User Mailing List</a>
-            <a class="dropdown-item" href="http://mail-archives.apache.org/mod_mbox/arrow-dev/">Dev Mailing List</a>
-            <a class="dropdown-item" href="https://cwiki.apache.org/confluence/display/ARROW">Developer Wiki</a>
-            <a class="dropdown-item" href="{{ site.baseurl }}/committers/">Committers</a>
-            <a class="dropdown-item" href="{{ site.baseurl }}/powered_by/">Powered By</a>
-          </div>
-        </li>
-        <li class="nav-item">
-          <a class="nav-link" href="{{ site.baseurl }}/docs/format/README.html"
-             role="button" aria-haspopup="true" aria-expanded="false">
-             Specification
-          </a>
-        </li>
-        <li class="nav-item dropdown">
-          <a class="nav-link dropdown-toggle" href="#"
-             id="navbarDropdownDocumentation" role="button" data-toggle="dropdown"
-             aria-haspopup="true" aria-expanded="false">
-             Documentation
-          </a>
-          <div class="dropdown-menu" aria-labelledby="navbarDropdownDocumentation">
-            <a class="dropdown-item" href="{{ site.baseurl }}/docs">Project Docs</a>
-            <a class="dropdown-item" href="{{ site.baseurl }}/docs/python">Python</a>
-            <a class="dropdown-item" href="{{ site.baseurl }}/docs/cpp">C++</a>
-            <a class="dropdown-item" href="{{ site.baseurl }}/docs/java">Java API</a>
-            <a class="dropdown-item" href="{{ site.baseurl }}/docs/c_glib">C GLib API</a>
-            <a class="dropdown-item" href="{{ site.baseurl }}/docs/js">Javascript API</a>
-          </div>
-        </li>
-        <!-- <li><a href="{{ site.baseurl }}/blog">Blog</a></li> -->
-        <li class="nav-item dropdown">
-          <a class="nav-link dropdown-toggle" href="#"
-             id="navbarDropdownASF" role="button" data-toggle="dropdown"
-             aria-haspopup="true" aria-expanded="false">
-             ASF Links
-          </a>
-          <div class="dropdown-menu" aria-labelledby="navbarDropdownASF">
-            <a class="dropdown-item" href="http://www.apache.org/">ASF Website</a>
-            <a class="dropdown-item" href="http://www.apache.org/licenses/">License</a>
-            <a class="dropdown-item" href="http://www.apache.org/foundation/sponsorship.html">Donate</a>
-            <a class="dropdown-item" href="http://www.apache.org/foundation/thanks.html">Thanks</a>
-            <a class="dropdown-item" href="http://www.apache.org/security/">Security</a>
-          </div>
-        </li>
-      </ul>
-      <div class="flex-row justify-content-end ml-md-auto">
-        <a class="d-sm-none d-md-inline pr-2" href="https://www.apache.org/events/current-event.html">
-          <img src="https://www.apache.org/events/current-event-234x60.png"/>
-        </a>
-        <a href="http://www.apache.org/">
-          <img src="{{ site.baseurl }}/img/asf_logo.svg" width="120px"/>
-        </a>
-      </div>
-      </div><!-- /.navbar-collapse -->
-    </div>
-  </nav>
diff --git a/site/_includes/top.html b/site/_includes/top.html
deleted file mode 100644
index 270b87e..0000000
--- a/site/_includes/top.html
+++ /dev/null
@@ -1,21 +0,0 @@
-<!DOCTYPE html>
-<html lang="en-US">
-  <head>
-    <meta charset="UTF-8">
-    <title>Apache Arrow Homepage</title>
-    <meta http-equiv="X-UA-Compatible" content="IE=edge">
-    <meta name="viewport" content="width=device-width, initial-scale=1">
-    <meta name="generator" content="Jekyll v{{ jekyll.version }}">
-    <!-- The above 3 meta tags *must* come first in the head; any other head content must come *after* these tags -->
-    <link rel="icon" type="image/x-icon" href="{{ site.baseurl }}/favicon.ico">
-
-    <link rel="stylesheet" href="//fonts.googleapis.com/css?family=Lato:300,300italic,400,400italic,700,700italic,900">
-
-    <link href="{{ site.baseurl }}/css/main.css" rel="stylesheet">
-    <link href="{{ site.baseurl }}/css/syntax.css" rel="stylesheet">
-    <script src="https://code.jquery.com/jquery-3.3.1.slim.min.js" integrity="sha384-q8i/X+965DzO0rT7abK41JStQIAqVgRVzpbzo5smXKp4YfRvH+8abtTE1Pi6jizo" crossorigin="anonymous"></script>
-    <script src="https://cdnjs.cloudflare.com/ajax/libs/popper.js/1.14.3/umd/popper.min.js" integrity="sha384-ZMP7rVo3mIykV+2+9J3UJ46jBk0WLaUAdn689aCwoqbBJiSnjAK/l8WvCWPIPm49" crossorigin="anonymous"></script>
-    {% if jekyll.environment == 'production' %}
-    {% include analytics.html %}
-    {% endif %}
-  </head>
diff --git a/site/_layouts/default.html b/site/_layouts/default.html
deleted file mode 100644
index 21c000c..0000000
--- a/site/_layouts/default.html
+++ /dev/null
@@ -1,16 +0,0 @@
-{% include top.html %}
-
-<body class="wrap">
-  <header>
-    {% include header.html %}
-  </header>
-
-  <div class="container p-lg-4">
-    <main role="main">
-      {{ content }}
-    </main>
-
-    {% include footer.html %}
-  </div>
-</body>
-</html>
diff --git a/site/_layouts/post.html b/site/_layouts/post.html
deleted file mode 100644
index 1e9db93..0000000
--- a/site/_layouts/post.html
+++ /dev/null
@@ -1,22 +0,0 @@
-{% include top.html %}
-
-<body class="wrap">
-  <header>
-    {% include header.html %}
-  </header>
-
-  <div class="container p-lg-4">
-    <main role="main">
-    {% comment %}
-      This 'assign' is to match the variable name used in _includes/blog_header.html
-    {% endcomment %}
-    {% assign post = page %}
-    {% include blog_header.html %}
-
-    {{ content }}
-    </main>
-
-    {% include footer.html %}
-  </div>
-</body>
-</html>
diff --git a/site/_posts/2017-05-08-0.3-release-japanese.md b/site/_posts/2017-05-08-0.3-release-japanese.md
deleted file mode 100644
index 6bad5ad..0000000
--- a/site/_posts/2017-05-08-0.3-release-japanese.md
+++ /dev/null
@@ -1,189 +0,0 @@
----
-layout: post
-title: "Apache Arrow 0.3.0リリース"
-date: "2017-05-08 00:00:00 -0400"
-author: wesm
-categories: [release,translation]
----
-<!--
-{% comment %}
-Licensed to the Apache Software Foundation (ASF) under one or more
-contributor license agreements.  See the NOTICE file distributed with
-this work for additional information regarding copyright ownership.
-The ASF licenses this file to you under the Apache License, Version 2.0
-(the "License"); you may not use this file except in compliance with
-the License.  You may obtain a copy of the License at
-
-http://www.apache.org/licenses/LICENSE-2.0
-
-Unless required by applicable law or agreed to in writing, software
-distributed under the License is distributed on an "AS IS" BASIS,
-WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
-See the License for the specific language governing permissions and
-limitations under the License.
-{% endcomment %}
--->
-
-[原文(English)]({% post_url 2017-05-08-0.3-release %})
-
-Apache Arrowチームは0.3.0のリリースをアナウンスできてうれしいです。2月にリリースした0.2.0から10週間の活発な開発の結果が今回のリリースです。[**23人のコントリビューター**][14]が[**306個のJIRAのissueを解決**][1]しました。
-
-複数のArrowの実装にたくさんの新しい機能を追加しています。2017年、特に注力して開発するのは、インメモリー用のフォーマット、型のメタデータ、メッセージング用のプロトコルです。これは、ビッグデータアプリケーションに**安定していてプロダクションで使える基盤**を提供するためです。高性能IOとインメモリーデータ処理にArrowを活用するために、[Apache Spark][3]・[GeoMesa][2]コミュニティーと協力していてとてもエキサイティングです。
-
-それぞれのプラットフォームでArrowを使う方法は[インストールページ][6]を見てください。
-
-Arrowでビッグデータシステムを高速化するケースを増やすために、近いうちにApache Arrowのロードマップを公開する予定です。
-
-Arrowの開発に参加するコントリビューターを募集しています。すでにArrowの開発に参加しているコミュニティーからのコントリビューターもそうですし、まだ参加していないGo、R、Juliaといったコミュニティーからのコントリビューターも募集しています。
-
-### ファイルフォーマットとストリーミングフォーマットの強化
-
-0.2.0では**ランダムアクセス**用と**ストリーミング**用のArrowのワイヤーフォーマットを導入しました。実装の詳細は[IPC仕様][15]を見てください。ユースケースは[使用例を紹介したブログ][4]を見てください。これらのフォーマットを使うと低オーバーヘッド・コピーなしでArrowのレコードバッチのペイロードにアクセスできます。
-
-0.3.0ではこのバイナリーフォマットの細かい詳細をたくさん固めました。Java、C++、Python間の連携のテストおよびそれぞれ言語での単体テストの整備も進めました。[Google Flatbuffers][5]は、前方互換性を壊さずにメタデータに新しい機能を追加するのに非常に助かりました。
-
-まだバイナリーフォーマットの前方互換性を必ず壊さないと約束できる状態ではありませんが(もしかしたら変更する必要があるなにかが見つかるかもしれない)、メジャーリリース間では不必要に互換性を壊さないように努力するつもりです。Apache ArrowのWebサイト、各コンポーネントのユーザー向けのドキュメントおよびAPIドキュメントへのコントリビューションを非常に歓迎します。
-
-### 辞書エンコーディングのサポート
-
-[GeoMesa][2]プロジェクトの[Emilio Lahr-Vivaz][13]はJavaのArrow実装に辞書エンコード対応ベクターをコントリビュートしました。これを受けて、C++とPythonでもサポートしました。(`pandas.Categorical`とも連携できます。)辞書エンコーディング用のインテグレーションテスト(C++とJava間でこのデータを送受信するテスト)はまだ完成していませんが、0.4.0までには完成させたいです。
-
-これはカテゴリーデータ用の一般的なデータ表現テクニックです。これを使うと、複数のレコードバッチで共通の「辞書」を共有し、各レコードバッチの値はこの辞書を参照する整数になります。このデータは統計的言語(statistical language)の分野では「カテゴリー(categorical)」や「因子(factor)」と呼ばれています。Apache Parquetのようなファイルフォーマットの分野ではデータ圧縮のためだけに使われています。
-
-### 日付、時刻、固定長型の拡張
-
-0.2.0では現実に使われている日付・時刻型をインテグレーションテスト付きで完全にサポートすることを諦めました。これらは[Apache Parquet][21]とApache Sparkとの連携に必要な機能です。
-
-* **日付**: 32-bit(日単位)と64-bit(ミリ秒単位)
-* **時刻**: 単位付き64-bit整数(単位:秒、ミリ秒、マイクロ秒、ナノ秒)
-* **タイムスタンプ(UNIXエポックからの経過時間)**: 単位付き64-bit整数のタイムゾーン付きとタイムゾーンなし
-* **固定長バイナリー**: 決まったバイト数のプリミティブな値
-* **固定長リスト**: 各要素が同じサイズのリスト(要素のベクターとは別にオフセットのベクターを持つ必要がない)
-
-C++のArrow実装では、[Boost.Multiprecision][12]を使ったexactな小数のサポートを実験的に追加しました。ただし、Java実装とC++実装間での小数のメモリーフォーマットはまだ固まっていません。
-
-### C++とPythonのWindowsサポート
-
-一般的なC++とPythonでの開発用に、パッケージ周りの改良も多数入っています。0.3.0はVisual Studio(MSVC)2015と2017を使ってWindowsを完全にサポートした最初のバージョンです。AppveyorでMSVC用のCIを実行しています。Windows上でソースからビルドするためのガイドも書きました。[C++][7]用と[Python][8]用。
-
-[conda-forge][31]からWindows用のArrowのPythonライブラリーをインストールできます。
-
-```shell
-conda install pyarrow -c conda-forge
-```
-
-### C(GLib)バインディングとRuby・Lua・他のサポート
-
-[Kouhei Sutou][9]は新しいApache Arrowのコントリビューターです。Linux用の(ArrowのC++実装の)GLibを使ったCバインディングをコントリビュートしました。[GObject Introspection][10]というCのミドルウェアを使うことでRuby、Lua、Goや[他にも様々なプログラミング言語][11]でシームレスにバインディングを使うことができます。これらのバインディングがどのように動いているか、これらのバインディングをどのように使うかを説明するブログ記事が別途必要な気がします。
-
-### PySparkを使ったApache Sparkとの連携
-
-[SPARK-13534][16]でApache Sparkコミュニティーと協力しています。PySparkでの`DataFrame.toPandas`をArrowを使って高速化しようとしています。効率的なデータのシリアライズにより[**40倍以上高速化**][17]できるケースがあります。
-
-PySparkでArrowを使うことでこれまでできなかったパフォーマンス最適化の道が開けました。特に、UDFの評価まわりでいろいろやれることがあるでしょう。(たとえば、Pythonのラムダ関数を使って`map`・`filter`を実行するケース。)
-
-### Python実装での新しい機能:メモリービュー、Feather、Apache Parquetのサポート
-
-ArrowのPythonライブラリーである`pyarrow`は`libarrow`と`libarrow_python`というC++ライブラリーのCythonバインディングです。`pyarrow`はNumPyと[pandas][29]とPythonの標準ライブラリー間のシームレスな連携を実現します。
-
-ArrowのC++ライブラリーで最も重要なものは`arrow::Buffer`オブジェクトです。これはメモリービューを管理します。コピーなしの読み込みとスライスをサポートしている点が重要です。[Jeff Knupp][25]はArrowのバッファーとPythonのバッファープロトコルとmemoryviewの連携処理をコントリビュートしました。これにより次のようなことができるようになりました。
-
-```python
-In [6]: import pyarrow as pa
-
-In [7]: buf = pa.frombuffer(b'foobarbaz')
-
-In [8]: buf
-Out[8]: <pyarrow._io.Buffer at 0x7f6c0a84b538>
-
-In [9]: memoryview(buf)
-Out[9]: <memory at 0x7f6c0a8c5e88>
-
-In [10]: buf.to_pybytes()
-Out[10]: b'foobarbaz'
-```
-
-C++でのParquet実装である[parquet-cpp][23]を使うことで大幅に[**Apache Parquet**][21]サポートを改良しました。たとえば、ディスク上にあるかHDFS上にあるか関係なく、パーティションされたデータセットをサポートしました。[Daskプロジェクト][24]はArrowを使ったParquetサポートを実装した最初のプロジェクトです。Dask開発者とはpandsデータを分散処理する文脈でさらに協力できることを楽しみにしています。
-
-pandasを成熟させるためにArrowを改良することもあり、[**Featherフォーマット**][22]の実装をマージしたのもその1つです。Featherフォーマットは本質的にはArrowのランダムアクセスフォーマットの特別なケースの1つです。ArrowのコードベースでFeatherの開発を続けます。たとえば、今のFeatherはArrowのPythonバインディングのレイヤーを使うことでPythonのファイルオブジェクトを読み書きできるようになっています。
-
-`DatetimeTZ`や`Categorical`といったpandas固有のデータ型のちゃんとした(robust)サポートも実装しました。
-
-### C++ライブラリーでのテンソルサポート
-
-Apache Arrowはコピーなしで共有メモリーを管理するツールという側面があります。機械学習アプリケーションの文脈でこの機能への関心が増えています。UCバークレー校の[RISELab][27]の[Rayプロジェクト][26]が最初の例です。
-
-機械学習ではは「テンソル」とも呼ばれる多次元配列というデータ構造を扱います。このようなデータ構造はArrowのカラムフォーマットがサポートしているデータ構造の範囲を超えています。今回のケースでは、[`arrow::Tensor`][28]というC++の型を追加で実装しました。これはArrowのコピーなしの共有メモリー機能を活用して実装しました。(メモリーの生存期間の管理に`arrow::Buffer`を使いました。)C++実装では、これからも、共通のIO・メモリー管理ツールとしてArrowを活用できるようにするため、追加のデータ構造を提供するつもりです。
-
-### JavaScript(TypeScript)実装の開始
-
-[Brian Hulette][20]はNodeJSとWebブラウザー上で動くアプリケーションで使うために[TypeScript][30]でのArrowの実装を始めました。FlatBuffersがJavaScriptをファーストクラスでサポートしているので実装が捗ります。
-
-### Webサイトと開発者用ドキュメントの改良
-
-0.2.0をリリースしてからドキュメントとブログを公開するためにWebサイトのシステムを[Jekyll][18]ベースで作りました。Kouhei Sutouは[Jekyll Jupyter Notebookプラグイン][19]を作りました。これによりArrowのWebサイトのコンテンツを作るためにJupyterを使うことができます。
-
-WebサイトにはC、C++、Java、PythonのAPIドキュメントを公開しました。これらの中にArrowを使い始めるための有益な情報を見つけられるでしょう。
-
-### コントリビューター
-
-このリリースにパッチをコントリビュートしたみなさんに感謝します。
-
-```
-$ git shortlog -sn apache-arrow-0.2.0..apache-arrow-0.3.0
-    119 Wes McKinney
-     55 Kouhei Sutou
-     18 Uwe L. Korn
-     17 Julien Le Dem
-      9 Phillip Cloud
-      6 Bryan Cutler
-      5 Philipp Moritz
-      5 Emilio Lahr-Vivaz
-      4 Max Risuhin
-      4 Johan Mabille
-      4 Jeff Knupp
-      3 Steven Phillips
-      3 Miki Tebeka
-      2 Leif Walsh
-      2 Jeff Reback
-      2 Brian Hulette
-      1 Tsuyoshi Ozawa
-      1 rvernica
-      1 Nong Li
-      1 Julien Lafaye
-      1 Itai Incze
-      1 Holden Karau
-      1 Deepak Majeti
-```
-
-[1]: https://issues.apache.org/jira/issues/?jql=project%20%3D%20ARROW%20AND%20status%20in%20(Resolved%2C%20Closed)%20AND%20fixVersion%20%3D%200.3.0
-[2]: http://www.geomesa.org/
-[3]: http://spark.apache.org
-[4]: http://wesmckinney.com/blog/arrow-streaming-columnar/
-[5]: http://github.com/google/flatbuffers
-[6]: http://arrow.apache.org/install
-[7]: https://github.com/apache/arrow/blob/master/cpp/apidoc/Windows.md
-[8]: https://github.com/apache/arrow/blob/master/python/doc/source/development.rst
-[9]: http://github.com/kou
-[10]: https://wiki.gnome.org/Projects/GObjectIntrospection
-[11]: https://wiki.gnome.org/Projects/GObjectIntrospection/Users
-[12]: https://github.com/boostorg/multiprecision
-[13]: https://github.com/elahrvivaz
-[14]: https://github.com/apache/arrow/graphs/contributors
-[15]: http://arrow.apache.org/docs/ipc.html
-[16]: https://issues.apache.org/jira/browse/SPARK-13534
-[17]: https://github.com/apache/spark/pull/15821#issuecomment-282175163
-[18]: https://jekyllrb.com
-[19]: https://github.com/red-data-tools/jekyll-jupyter-notebook
-[20]: https://github.com/TheNeuralBit
-[21]: http://parquet.apache.org
-[22]: https://github.com/wesm/feather
-[23]: https://github.com/apache/parquet-cpp
-[24]: https://github.com/dask/dask/commit/68f9e417924a985c1f2e2a587126833c70a2e9f4
-[25]: https://github.com/JeffKnupp
-[26]: https://github.com/ray-project/ray
-[27]: https://rise.cs.berkeley.edu/
-[28]: http://arrow.apache.org/docs/cpp/classarrow_1_1_tensor.html
-[29]: http://pandas.pydata.org
-[30]: https://github.com/apache/arrow/tree/master/js
-[31]: https://conda-forge.github.io
diff --git a/site/_posts/2017-05-08-0.3-release.md b/site/_posts/2017-05-08-0.3-release.md
deleted file mode 100644
index cf38a8b..0000000
--- a/site/_posts/2017-05-08-0.3-release.md
+++ /dev/null
@@ -1,265 +0,0 @@
----
-layout: post
-title: "Apache Arrow 0.3.0 Release"
-date: "2017-05-08 00:00:00 -0400"
-author: wesm
-categories: [release]
----
-<!--
-{% comment %}
-Licensed to the Apache Software Foundation (ASF) under one or more
-contributor license agreements.  See the NOTICE file distributed with
-this work for additional information regarding copyright ownership.
-The ASF licenses this file to you under the Apache License, Version 2.0
-(the "License"); you may not use this file except in compliance with
-the License.  You may obtain a copy of the License at
-
-http://www.apache.org/licenses/LICENSE-2.0
-
-Unless required by applicable law or agreed to in writing, software
-distributed under the License is distributed on an "AS IS" BASIS,
-WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
-See the License for the specific language governing permissions and
-limitations under the License.
-{% endcomment %}
--->
-
-Translations: [日本語]({% post_url 2017-05-08-0.3-release-japanese %})
-
-The Apache Arrow team is pleased to announce the 0.3.0 release of the
-project. It is the product of an intense 10 weeks of development since the
-0.2.0 release from this past February. It includes [**306 resolved JIRAs**][1]
-from [**23 contributors**][14].
-
-While we have added many new features to the different Arrow implementations,
-one of the major development focuses in 2017 has been hardening the in-memory
-format, type metadata, and messaging protocol to provide a **stable,
-production-ready foundation** for big data applications. We are excited to be
-collaborating with the [Apache Spark][3] and [GeoMesa][2] communities on
-utilizing Arrow for high performance IO and in-memory data processing.
-
-See the [Install Page][6] to learn how to get the libraries for your platform.
-
-We will be publishing more information about the Apache Arrow roadmap as we
-forge ahead with using Arrow to accelerate big data systems.
-
-We are looking for more contributors from within our existing communities and
-from other communities (such as Go, R, or Julia) to get involved in Arrow
-development.
-
-### File and Streaming Format Hardening
-
-The 0.2.0 release brought with it the first iterations of the **random access**
-and **streaming** Arrow wire formats. See the [IPC specification][15] for
-implementation details and [example blog post][4] with some use cases. These
-provide low-overhead, zero-copy access to Arrow record batch payloads.
-
-In 0.3.0 we have solidified a number of small details with the binary format
-and improved our integration and unit testing particularly in the Java, C++,
-and Python libraries. Using the [Google Flatbuffers][5] project has helped with
-adding new features to our metadata without breaking forward compatibility.
-
-We are not yet ready to make a firm commitment to strong forward compatibility
-(in case we find something needs to change) in the binary format, but we will
-make efforts between major releases to not make unnecessary
-breakages. Contributions to the website and component user and API
-documentation would also be most welcome.
-
-### Dictionary Encoding Support
-
-[Emilio Lahr-Vivaz][13] from the [GeoMesa][2] project contributed Java support
-for dictionary-encoded Arrow vectors. We followed up with C++ and Python
-support (and `pandas.Categorical` integration). We have not yet implemented
-full integration tests for dictionaries (for sending this data between C++ and
-Java), but hope to achieve this in the 0.4.0 Arrow release.
-
-This common data representation technique for categorical data allows multiple
-record batches to share a common "dictionary", with the values in the batches
-being represented as integers referencing the dictionary. This data is called
-"categorical" or "factor" in statistical languages, while in file formats like
-Apache Parquet it is strictly used for data compression.
-
-### Expanded Date, Time, and Fixed Size Types
-
-A notable omission from the 0.2.0 release was complete and integration-tested
-support for the gamut of date and time types that occur in the wild. These are
-needed for [Apache Parquet][21] and Apache Spark integration.
-
-* **Date**: 32-bit (days unit) and 64-bit (milliseconds unit)
-* **Time**: 64-bit integer with unit (second, millisecond, microsecond, nanosecond)
-* **Timestamp**: 64-bit integer with unit, with or without timezone
-* **Fixed Size Binary**: Primitive values occupying certain number of bytes
-* **Fixed Size List**: List values with constant size (no separate offsets vector)
-
-We have additionally added experimental support for exact decimals in C++ using
-[Boost.Multiprecision][12], though we have not yet hardened the Decimal memory
-format between the Java and C++ implementations.
-
-### C++ and Python Support on Windows
-
-We have made many general improvements to development and packaging for general
-C++ and Python development. 0.3.0 is the first release to bring full C++ and
-Python support for Windows on Visual Studio (MSVC) 2015 and 2017. In addition
-to adding Appveyor continuous integration for MSVC, we have also written guides
-for building from source on Windows: [C++][7] and [Python][8].
-
-For the first time, you can install the Arrow Python library on Windows from
-[conda-forge][31]:
-
-```shell
-conda install pyarrow -c conda-forge
-```
-
-### C (GLib) Bindings, with support for Ruby, Lua, and more
-
-[Kouhei Sutou][9] is a new Apache Arrow contributor and has contributed GLib C
-bindings (to the C++ libraries) for Linux. Using a C middleware framework
-called [GObject Introspection][10], it is possible to use these bindings
-seamlessly in Ruby, Lua, Go, and [other programming languages][11]. We will
-probably need to publish some follow up blogs explaining how these bindings
-work and how to use them.
-
-### Apache Spark Integration for PySpark
-
-We have been collaborating with the Apache Spark community on [SPARK-13534][16]
-to add support for using Arrow to accelerate `DataFrame.toPandas` in
-PySpark. We have observed over [**40x speedup**][17] from the more efficient
-data serialization.
-
-Using Arrow in PySpark opens the door to many other performance optimizations,
-particularly around UDF evaluation (e.g. `map` and `filter` operations with
-Python lambda functions).
-
-### New Python Feature: Memory Views, Feather, Apache Parquet support
-
-Arrow's Python library `pyarrow` is a Cython binding for the `libarrow` and
-`libarrow_python` C++ libraries, which handle inteoperability with NumPy,
-[pandas][29], and the Python standard library.
-
-At the heart of Arrow's C++ libraries is the `arrow::Buffer` object, which is a
-managed memory view supporting zero-copy reads and slices. [Jeff Knupp][25]
-contributed integration between Arrow buffers and the Python buffer protocol
-and memoryviews, so now code like this is possible:
-
-```python
-In [6]: import pyarrow as pa
-
-In [7]: buf = pa.frombuffer(b'foobarbaz')
-
-In [8]: buf
-Out[8]: <pyarrow._io.Buffer at 0x7f6c0a84b538>
-
-In [9]: memoryview(buf)
-Out[9]: <memory at 0x7f6c0a8c5e88>
-
-In [10]: buf.to_pybytes()
-Out[10]: b'foobarbaz'
-```
-
-We have significantly expanded [**Apache Parquet**][21] support via the C++
-Parquet implementation [parquet-cpp][23]. This includes support for partitioned
-datasets on disk or in HDFS. We added initial Arrow-powered Parquet support [in
-the Dask project][24], and look forward to more collaborations with the Dask
-developers on distributed processing of pandas data.
-
-With Arrow's support for pandas maturing, we were able to merge in the
-[**Feather format**][22] implementation, which is essentially a special case of
-the Arrow random access format. We'll be continuing Feather development within
-the Arrow codebase. For example, Feather can now read and write with Python
-file objects using Arrow's Python binding layer.
-
-We also implemented more robust support for pandas-specific data types, like
-`DatetimeTZ` and `Categorical`.
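-
-As a rough illustration (using current pyarrow API names, which may differ in
-detail from the 0.3.0 release), such a DataFrame round-trips through Arrow like
-this:
-
-```python
-import pandas as pd
-import pyarrow as pa
-
-df = pd.DataFrame({
-    'cat': pd.Categorical(['a', 'b', 'a']),
-    'ts': pd.date_range('2017-01-01', periods=3, tz='US/Eastern'),
-})
-
-table = pa.Table.from_pandas(df)  # Categorical becomes dictionary-encoded data
-roundtripped = table.to_pandas()  # back to Categorical / DatetimeTZ columns
-```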
-
-### Support for Tensors and beyond in C++ Library
-
-There has been increased interest in using Apache Arrow as a tool for zero-copy
-shared memory management for machine learning applications. A flagship example
-is the [Ray project][26] from the UC Berkeley [RISELab][27].
-
-Machine learning deals with additional kinds of data structures beyond what the
-Arrow columnar format supports, like multidimensional arrays, a.k.a. "tensors". As
-such, we implemented the [`arrow::Tensor`][28] C++ type which can utilize the
-rest of Arrow's zero-copy shared memory machinery (using `arrow::Buffer` for
-managing memory lifetime). In C++ in particular, we will want to provide for
-additional data structures utilizing common IO and memory management tools.
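-
-For a flavor of the Python-facing API (a sketch against current pyarrow; the
-Tensor helpers may have evolved since this release), a NumPy array can be
-wrapped as an Arrow tensor without copying the data:
-
-```python
-import numpy as np
-import pyarrow as pa
-
-arr = np.random.randn(4, 6)
-tensor = pa.Tensor.from_numpy(arr)  # zero-copy wrapper around the ndarray
-print(tensor.shape, tensor.type)    # [4, 6] double
-```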
-
-### Start of JavaScript (TypeScript) Implementation
-
-[Brian Hulette][20] started developing an Arrow implementation in
-[TypeScript][30] for use in NodeJS and browser-side applications. We are
-benefiting from Flatbuffers' first-class support for JavaScript.
-
-### Improved Website and Developer Documentation
-
-Since 0.2.0 we have implemented a new website stack for publishing
-documentation and blogs based on [Jekyll][18]. Kouhei Sutou developed a [Jekyll
-Jupyter Notebook plugin][19] so that we can use Jupyter to author content for
-the Arrow website.
-
-On the website, we have now published API documentation for the C, C++, Java,
-and Python subcomponents. Within these you will find easier-to-follow developer
-instructions for getting started.
-
-### Contributors
-
-Thanks to all who contributed patches to this release.
-
-```
-$ git shortlog -sn apache-arrow-0.2.0..apache-arrow-0.3.0
-    119 Wes McKinney
-     55 Kouhei Sutou
-     18 Uwe L. Korn
-     17 Julien Le Dem
-      9 Phillip Cloud
-      6 Bryan Cutler
-      5 Philipp Moritz
-      5 Emilio Lahr-Vivaz
-      4 Max Risuhin
-      4 Johan Mabille
-      4 Jeff Knupp
-      3 Steven Phillips
-      3 Miki Tebeka
-      2 Leif Walsh
-      2 Jeff Reback
-      2 Brian Hulette
-      1 Tsuyoshi Ozawa
-      1 rvernica
-      1 Nong Li
-      1 Julien Lafaye
-      1 Itai Incze
-      1 Holden Karau
-      1 Deepak Majeti
-```
-
-[1]: https://issues.apache.org/jira/issues/?jql=project%20%3D%20ARROW%20AND%20status%20in%20(Resolved%2C%20Closed)%20AND%20fixVersion%20%3D%200.3.0
-[2]: http://www.geomesa.org/
-[3]: http://spark.apache.org
-[4]: http://wesmckinney.com/blog/arrow-streaming-columnar/
-[5]: http://github.com/google/flatbuffers
-[6]: http://arrow.apache.org/install
-[7]: https://github.com/apache/arrow/blob/master/cpp/apidoc/Windows.md
-[8]: https://github.com/apache/arrow/blob/master/python/doc/source/development.rst
-[9]: http://github.com/kou
-[10]: https://wiki.gnome.org/Projects/GObjectIntrospection
-[11]: https://wiki.gnome.org/Projects/GObjectIntrospection/Users
-[12]: https://github.com/boostorg/multiprecision
-[13]: https://github.com/elahrvivaz
-[14]: https://github.com/apache/arrow/graphs/contributors
-[15]: http://arrow.apache.org/docs/ipc.html
-[16]: https://issues.apache.org/jira/browse/SPARK-13534
-[17]: https://github.com/apache/spark/pull/15821#issuecomment-282175163
-[18]: https://jekyllrb.com
-[19]: https://github.com/red-data-tools/jekyll-jupyter-notebook
-[20]: https://github.com/TheNeuralBit
-[21]: http://parquet.apache.org
-[22]: https://github.com/wesm/feather
-[23]: https://github.com/apache/parquet-cpp
-[24]: https://github.com/dask/dask/commit/68f9e417924a985c1f2e2a587126833c70a2e9f4
-[25]: https://github.com/JeffKnupp
-[26]: https://github.com/ray-project/ray
-[27]: https://rise.cs.berkeley.edu/
-[28]: http://arrow.apache.org/docs/cpp/classarrow_1_1_tensor.html
-[29]: http://pandas.pydata.org
-[30]: https://github.com/apache/arrow/tree/master/js
-[31]: https://conda-forge.github.io
diff --git a/site/_posts/2017-05-23-0.4.0-release.md b/site/_posts/2017-05-23-0.4.0-release.md
deleted file mode 100644
index d94a68a..0000000
--- a/site/_posts/2017-05-23-0.4.0-release.md
+++ /dev/null
@@ -1,111 +0,0 @@
----
-layout: post
-title: "Apache Arrow 0.4.0 Release"
-date: "2017-05-23 00:00:00 -0400"
-author: wesm
-categories: [release]
----
-<!--
-{% comment %}
-Licensed to the Apache Software Foundation (ASF) under one or more
-contributor license agreements.  See the NOTICE file distributed with
-this work for additional information regarding copyright ownership.
-The ASF licenses this file to you under the Apache License, Version 2.0
-(the "License"); you may not use this file except in compliance with
-the License.  You may obtain a copy of the License at
-
-http://www.apache.org/licenses/LICENSE-2.0
-
-Unless required by applicable law or agreed to in writing, software
-distributed under the License is distributed on an "AS IS" BASIS,
-WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
-See the License for the specific language governing permissions and
-limitations under the License.
-{% endcomment %}
--->
-
-The Apache Arrow team is pleased to announce the 0.4.0 release of the
-project. Although it comes only 17 days after the previous release, it includes
-[**77 resolved JIRAs**][1] with some important new features and bug fixes.
-
-See the [Install Page][6] to learn how to get the libraries for your platform.
-
-### Expanded JavaScript Implementation
-
-The TypeScript Arrow implementation has undergone some work since 0.3.0 and can
-now read a substantial portion of the Arrow streaming binary format. As this
-implementation develops, we will eventually want to include JS in the
-integration test suite along with Java and C++ to ensure wire
-cross-compatibility.
-
-### Python Support for Apache Parquet on Windows
-
-With the [1.1.0 C++ release][7] of [Apache Parquet][8], we have enabled the
-`pyarrow.parquet` extension on Windows for Python 3.5 and 3.6. This should
-appear in conda-forge packages and PyPI in the near future. Developers can
-follow the [source build instructions][9].
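-
-Once the extension is available, reading and writing Parquet files looks the
-same on Windows as on other platforms. A minimal sketch (the file name and
-column name are placeholders):
-
-{% highlight python %}
-import pyarrow as pa
-import pyarrow.parquet as pq
-
-table = pa.Table.from_arrays([pa.array([1, 2, 3])], names=['ints'])
-pq.write_table(table, 'example.parquet')
-table2 = pq.read_table('example.parquet')
-{% endhighlight %}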
-
-### Generalizing Arrow Streams
-
-In the 0.2.0 release, we defined the first version of the Arrow streaming
-binary format for low-cost messaging with columnar data. These streams presume
-that the message components are written as a continuous byte stream over a
-socket or file.
-
-We would like to be able to support other transport protocols, like
-[gRPC][3], for the message components of Arrow streams. To that end, in C++ we
-defined an abstract stream reader interface, for which the current contiguous
-streaming format is one implementation:
-
-{% highlight cpp %}
-class RecordBatchReader {
- public:
-  virtual std::shared_ptr<Schema> schema() const = 0;
-  virtual Status GetNextRecordBatch(std::shared_ptr<RecordBatch>* batch) = 0;
-};
-{% endhighlight %}
-
-It would also be good to define abstract stream reader and writer interfaces in
-the Java implementation.
-
-In an upcoming blog post, we will explain in more depth how Arrow streams work,
-but you can learn more about them by reading the [IPC specification][4].
-
-### C++ and Cython API for Python Extensions
-
-As other Python libraries with C or C++ extensions use Apache Arrow, they will
-need to be able to return Python objects wrapping the underlying C++
-objects. In this release, we have implemented a prototype C++ API which enables
-Python wrapper objects to be constructed from C++ extension code:
-
-{% highlight cpp %}
-#include "arrow/python/pyarrow.h"
-
-if (!arrow::py::import_pyarrow()) {
-  // Error
-}
-
-std::shared_ptr<arrow::RecordBatch> cpp_batch = GetData(...);
-PyObject* py_batch = arrow::py::wrap_batch(cpp_batch);
-{% endhighlight %}
-
-This API is intended to be usable from Cython code as well:
-
-{% highlight cython %}
-cimport pyarrow
-pyarrow.import_pyarrow()
-{% endhighlight %}
-
-### Python Wheel Installers on macOS
-
-With this release, `pip install pyarrow` works on macOS (OS X) as well as
-Linux. We are working on providing binary wheel installers for Windows as well.
-
-[1]: https://issues.apache.org/jira/issues/?jql=project%20%3D%20ARROW%20AND%20status%20in%20(Resolved%2C%20Closed)%20AND%20fixVersion%20%3D%200.4.0
-[2]: http://arrow.apache.org/install
-[3]: http://grpc.io/
-[4]: http://arrow.apache.org/docs/ipc.html
-[6]: http://arrow.apache.org/install
-[7]: https://github.com/apache/parquet-cpp/releases/tag/apache-parquet-cpp-1.1.0
-[8]: http://parquet.apache.org
-[9]: http://arrow.apache.org/docs/python/development.html
\ No newline at end of file
diff --git a/site/_posts/2017-06-14-0.4.1-release.md b/site/_posts/2017-06-14-0.4.1-release.md
deleted file mode 100644
index 4dd8b4e..0000000
--- a/site/_posts/2017-06-14-0.4.1-release.md
+++ /dev/null
@@ -1,67 +0,0 @@
----
-layout: post
-title: "Apache Arrow 0.4.1 Release"
-date: "2017-06-14 10:00:00 -0400"
-author: wesm
-categories: [release]
----
-<!--
-{% comment %}
-Licensed to the Apache Software Foundation (ASF) under one or more
-contributor license agreements.  See the NOTICE file distributed with
-this work for additional information regarding copyright ownership.
-The ASF licenses this file to you under the Apache License, Version 2.0
-(the "License"); you may not use this file except in compliance with
-the License.  You may obtain a copy of the License at
-
-http://www.apache.org/licenses/LICENSE-2.0
-
-Unless required by applicable law or agreed to in writing, software
-distributed under the License is distributed on an "AS IS" BASIS,
-WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
-See the License for the specific language governing permissions and
-limitations under the License.
-{% endcomment %}
--->
-
-The Apache Arrow team is pleased to announce the 0.4.1 release of the
-project. This is a bug fix release that addresses a regression with Decimal
-types in the Java implementation introduced in 0.4.0 (see
-[ARROW-1091][10]). There were a total of [31 resolved JIRAs][1].
-
-See the [Install Page][2] to learn how to get the libraries for your platform.
-
-### Python Wheel Installers for Windows
-
-Max Risuhin contributed fixes to enable binary wheel installers to be generated
-for Python 3.5 and 3.6. Thus, 0.4.1 is the first Arrow release for which
-PyArrow, including bundled [Apache Parquet][8] support, can be installed
-with either conda or pip across the three major platforms: Linux, macOS, and
-Windows. Use one of:
-
-```
-pip install pyarrow
-conda install pyarrow -c conda-forge
-```
-
-### Turbodbc 2.0.0 with Apache Arrow Support
-
-[Turbodbc][12], a fast C++ ODBC interface with Python bindings, released
-version 2.0.0, which includes support for reading SQL result sets as Arrow record batches. The
-team used the PyArrow C++ API introduced in version 0.4.0 to construct
-`pyarrow.Table` objects inside the `turbodbc` library. Learn more in their
-[documentation][13] and install with one of:
-
-```
-pip install turbodbc
-conda install turbodbc -c conda-forge
-```
-
-[1]: https://issues.apache.org/jira/issues/?jql=project%20%3D%20ARROW%20AND%20status%20in%20(Resolved%2C%20Closed)%20AND%20fixVersion%20%3D%200.4.1
-[2]: http://arrow.apache.org/install
-[8]: http://parquet.apache.org
-[9]: http://arrow.apache.org/docs/python/development.html
-[10]: https://issues.apache.org/jira/browse/ARROW-1091
-[11]: https://github.com/maxris
-[12]: http://turbodbc.readthedocs.io/
-[13]: http://turbodbc.readthedocs.io/en/latest/pages/advanced_usage.html#apache-arrow-support
\ No newline at end of file
diff --git a/site/_posts/2017-06-16-turbodbc-arrow.md b/site/_posts/2017-06-16-turbodbc-arrow.md
deleted file mode 100644
index be6256d..0000000
--- a/site/_posts/2017-06-16-turbodbc-arrow.md
+++ /dev/null
@@ -1,117 +0,0 @@
----
-layout: post
-title: "Connecting Relational Databases to the Apache Arrow World with turbodbc"
-date: "2017-06-16 04:00:00 -0400"
-author: MathMagique
-categories: [application]
----
-<!--
-{% comment %}
-Licensed to the Apache Software Foundation (ASF) under one or more
-contributor license agreements.  See the NOTICE file distributed with
-this work for additional information regarding copyright ownership.
-The ASF licenses this file to you under the Apache License, Version 2.0
-(the "License"); you may not use this file except in compliance with
-the License.  You may obtain a copy of the License at
-
-http://www.apache.org/licenses/LICENSE-2.0
-
-Unless required by applicable law or agreed to in writing, software
-distributed under the License is distributed on an "AS IS" BASIS,
-WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
-See the License for the specific language governing permissions and
-limitations under the License.
-{% endcomment %}
--->
-
-*[Michael König][15] is the lead developer of the [turbodbc project][2]*
-
-The [Apache Arrow][1] project set out to become the universal data layer for
-column-oriented data processing systems without incurring serialization costs
-or compromising overall performance. While relational
-databases still lag behind in Apache Arrow adoption, the Python database module
-[turbodbc][2] brings Apache Arrow support to these databases using a much
-older, more specialized data exchange layer: [ODBC][3].
-
-ODBC is a database interface that offers developers the option to transfer data
-either in row-wise or column-wise fashion. Previous Python ODBC modules typically
-use the row-wise approach, and often trade repeated database roundtrips for simplified
-buffer handling. This makes them less suited for data-intensive applications,
-particularly when interfacing with modern columnar analytical databases.
-
-In contrast, turbodbc was designed to leverage columnar data processing from day
-one. Naturally, this implies using the columnar portion of the ODBC API. Equally
-important, however, is to find new ways of providing columnar data to Python users
-that exceed the capabilities of the row-wise API mandated by Python’s [PEP 249][4].
-Turbodbc has adopted Apache Arrow for this very task with the recently released
-version 2.0.0:
-
-```python
->>> from turbodbc import connect
->>> connection = connect(dsn="My columnar database")
->>> cursor = connection.cursor()
->>> cursor.execute("SELECT some_integers, some_strings FROM my_table")
->>> cursor.fetchallarrow()
-pyarrow.Table
-some_integers: int64
-some_strings: string
-```
-
-With this new addition, the data flow for a result set of a typical SELECT query
-is as follows:
-*   The database prepares the result set and exposes it to the ODBC driver using
-    either row-wise or column-wise storage.
-*   Turbodbc has the ODBC driver write chunks of the result set into columnar buffers.
-*   These buffers are exposed to turbodbc's Apache Arrow frontend. This frontend
-    will create an Arrow table and fill in the buffered values.
-*   The previous steps are repeated until the entire result set is retrieved.
-
-![Data flow from relational databases to Python with turbodbc and the Apache Arrow frontend]({{ site.url }}/img/turbodbc_arrow.png){:class="img-responsive" width="75%"}
-
-In practice, it is possible to achieve the following ideal situation: A 64-bit integer
-column is stored as one contiguous block of memory in a columnar database. A huge chunk
-of 64-bit integers is transferred over the network and the ODBC driver directly writes
-it to a turbodbc buffer of 64-bit integers. The Arrow frontend accumulates these values
-by copying the entire 64-bit buffer into a free portion of an Arrow table's 64-bit
-integer column.
-
-Moving data from the database to an Arrow table and, thus, providing it to the Python
-user can be as simple as copying memory blocks around, megabytes (equivalent to hundreds
-of thousands of rows) at a time. The absence of serialization and conversion logic renders
-the process extremely efficient.
-
-Once the data is stored in an Arrow table, Python users can continue to do some
-actual work. They can convert it into a [Pandas DataFrame][5] for data analysis
-(using a quick `table.to_pandas()`), pass it on to other data processing
-systems such as [Apache Spark][6] or [Apache Impala (incubating)][7], or store
-it in the [Apache Parquet][8] file format. This way, non-Python systems are
-efficiently connected with relational databases.
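-
-Continuing the cursor from the snippet above (the query and table are of course
-placeholders), handing a result set to pandas takes just two calls:
-
-```python
->>> cursor.execute("SELECT some_integers, some_strings FROM my_table")
->>> table = cursor.fetchallarrow()  # pyarrow.Table filled from columnar ODBC buffers
->>> df = table.to_pandas()          # pandas DataFrame for analysis
-```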
-
-In the future, turbodbc’s Arrow support will be extended to use more
-sophisticated features such as [dictionary-encoded][9] string fields. We also
-plan to pick smaller than 64-bit [data types][10] where possible. Last but not
-least, Arrow support will be extended to cover the reverse direction of data
-flow, so that Python users can quickly insert Arrow tables into relational
-databases.
-
-If you would like to learn more about turbodbc, check out the [GitHub project][2] and the
-[project documentation][11]. If you want to learn more about how turbodbc implements the
-nitty-gritty details, check out parts [one][12] and [two][13] of the
-["Making of turbodbc"][12] series at [Blue Yonder's technology blog][14].
-
-
-[1]: https://arrow.apache.org/
-[2]: https://github.com/blue-yonder/turbodbc
-[3]: https://en.wikipedia.org/wiki/Open_Database_Connectivity
-[4]: https://www.python.org/dev/peps/pep-0249/
-[5]: https://arrow.apache.org/docs/python/pandas.html
-[6]: http://spark.apache.org/
-[7]: http://impala.apache.org/
-[8]: http://parquet.apache.org/
-[9]: https://arrow.apache.org/docs/memory_layout.html#dictionary-encoding
-[10]: https://arrow.apache.org/docs/metadata.html#integers
-[11]: http://turbodbc.readthedocs.io/
-[12]: https://tech.blue-yonder.com/making-of-turbodbc-part-1-wrestling-with-the-side-effects-of-a-c-api/
-[13]: https://tech.blue-yonder.com/making-of-turbodbc-part-2-c-to-python/
-[14]: https://tech.blue-yonder.com/
-[15]: https://github.com/mathmagique
\ No newline at end of file
diff --git a/site/_posts/2017-07-24-0.5.0-release.md b/site/_posts/2017-07-24-0.5.0-release.md
deleted file mode 100644
index 5c156bf..0000000
--- a/site/_posts/2017-07-24-0.5.0-release.md
+++ /dev/null
@@ -1,114 +0,0 @@
----
-layout: post
-title: "Apache Arrow 0.5.0 Release"
-date: "2017-07-25 00:00:00 -0400"
-author: wesm
-categories: [release]
----
-<!--
-{% comment %}
-Licensed to the Apache Software Foundation (ASF) under one or more
-contributor license agreements.  See the NOTICE file distributed with
-this work for additional information regarding copyright ownership.
-The ASF licenses this file to you under the Apache License, Version 2.0
-(the "License"); you may not use this file except in compliance with
-the License.  You may obtain a copy of the License at
-
-http://www.apache.org/licenses/LICENSE-2.0
-
-Unless required by applicable law or agreed to in writing, software
-distributed under the License is distributed on an "AS IS" BASIS,
-WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
-See the License for the specific language governing permissions and
-limitations under the License.
-{% endcomment %}
--->
-
-The Apache Arrow team is pleased to announce the 0.5.0 release. It includes
-[**130 resolved JIRAs**][1] with some new features, expanded integration
-testing between implementations, and bug fixes. The Arrow memory format has
-remained stable since the 0.3.x and 0.4.x releases.
-
-See the [Install Page][2] to learn how to get the libraries for your
-platform. The [complete changelog][5] is also available.
-
-## Expanded Integration Testing
-
-In this release, we added compatibility tests for dictionary-encoded data
-between Java and C++. This enables the distinct values (the *dictionary*) in a
-vector to be transmitted as part of an Arrow schema while the record batches
-contain integers which correspond to the dictionary.
-
-So we might have:
-
-```
-data (string): ['foo', 'bar', 'foo', 'bar']
-```
-
-In dictionary-encoded form, this could be represented as:
-
-```
-indices (int8): [0, 1, 0, 1]
-dictionary (string): ['foo', 'bar']
-```
-
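-In Python, current pyarrow versions expose this encoding through
-`Array.dictionary_encode` (a helper that may postdate the 0.5.0 release); a
-quick sketch:
-
-```python
-import pyarrow as pa
-
-arr = pa.array(['foo', 'bar', 'foo', 'bar'])
-dict_arr = arr.dictionary_encode()
-print(dict_arr.indices)     # values 0, 1, 0, 1
-print(dict_arr.dictionary)  # values 'foo', 'bar'
-```
-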
-In upcoming releases, we plan to complete integration testing for the remaining
-data types (including some more complicated types like unions and decimals) on
-the road to a 1.0.0 release in the future.
-
-## C++ Activity
-
-We completed a number of significant pieces of work in the C++ part of Apache
-Arrow.
-
-### Using jemalloc as default memory allocator
-
-We decided to use [jemalloc][4] as the default memory allocator unless it is
-explicitly disabled. This memory allocator has significant performance
-advantages in Arrow workloads over the default `malloc` implementation. We will
-publish a blog post going into more detail about this and why you might care.
-
-### Sharing more C++ code with Apache Parquet
-
-We imported the compression library interfaces and dictionary encoding
-algorithms from the [Apache Parquet C++ library][3]. The Parquet library now
-depends on this code in Arrow, and we will be able to use it more easily for
-data compression in Arrow use cases.
-
-As part of incorporating Parquet's dictionary encoding utilities, we have
-developed an `arrow::DictionaryBuilder` class to enable building
-dictionary-encoded arrays iteratively. This can help save memory and yield
-better performance when interacting with databases, Parquet files, or other
-sources whose columns may have many duplicate values.
-
-### Support for LZ4 and ZSTD compressors
-
-We added LZ4 and ZSTD compression library support. In ARROW-300 and other
-planned work, we intend to add some compression features for data sent via RPC.
-
-## Python Activity
-
-We fixed many bugs affecting Parquet and Feather users and smoothed several
-other rough edges in normal Arrow use. We also added some additional
-Arrow type conversions: structs, lists embedded in pandas objects, and Arrow
-time types (which deserialize to the `datetime.time` type).
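-
-For instance (a sketch against the current pyarrow API, which may differ in
-detail from 0.5.0), nested Python data and time values convert directly:
-
-```python
-import datetime
-import pyarrow as pa
-
-structs = pa.array([{'x': 1, 'y': 'a'}, {'x': 2, 'y': 'b'}])  # inferred struct type
-times = pa.array([datetime.time(12, 30), None], type=pa.time64('us'))
-times.to_pylist()  # [datetime.time(12, 30), None]
-```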
-
-In upcoming releases we plan to continue to improve [Dask][7] support and
-performance for distributed processing of Apache Parquet files with pyarrow.
-
-## The Road Ahead
-
-We have much work ahead of us to build out Arrow integrations in other data
-systems to improve their processing performance and interoperability with other
-systems.
-
-We are discussing the roadmap to a future 1.0.0 release on the [developer
-mailing list][6]. Please join the discussion there.
-
-[1]: https://issues.apache.org/jira/issues/?jql=project%20%3D%20ARROW%20AND%20status%20in%20(Resolved%2C%20Closed)%20AND%20fixVersion%20%3D%200.5.0
-[2]: http://arrow.apache.org/install
-[3]: http://github.com/apache/parquet-cpp
-[4]: https://github.com/jemalloc/jemalloc
-[5]: http://arrow.apache.org/release/0.5.0.html
-[6]: http://mail-archives.apache.org/mod_mbox/arrow-dev/
-[7]: http://github.com/dask/dask
\ No newline at end of file
diff --git a/site/_posts/2017-07-26-spark-arrow.md b/site/_posts/2017-07-26-spark-arrow.md
deleted file mode 100644
index 211e5a4..0000000
--- a/site/_posts/2017-07-26-spark-arrow.md
+++ /dev/null
@@ -1,158 +0,0 @@
----
-layout: post
-title: "Speeding up PySpark with Apache Arrow"
-date: "2017-07-26 08:00:00 -0800"
-author: BryanCutler
-categories: [application]
----
-<!--
-{% comment %}
-Licensed to the Apache Software Foundation (ASF) under one or more
-contributor license agreements.  See the NOTICE file distributed with
-this work for additional information regarding copyright ownership.
-The ASF licenses this file to you under the Apache License, Version 2.0
-(the "License"); you may not use this file except in compliance with
-the License.  You may obtain a copy of the License at
-
-http://www.apache.org/licenses/LICENSE-2.0
-
-Unless required by applicable law or agreed to in writing, software
-distributed under the License is distributed on an "AS IS" BASIS,
-WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
-See the License for the specific language governing permissions and
-limitations under the License.
-{% endcomment %}
--->
-
-*[Bryan Cutler][11] is a software engineer at IBM's Spark Technology Center [STC][12]*
-
-Beginning with [Apache Spark][1] version 2.3, [Apache Arrow][2] will be a supported
-dependency and begin to offer increased performance with columnar data transfer.
-If you are a Spark user who prefers to work in Python and Pandas, this is cause
-for excitement! The initial work is limited to collecting a Spark DataFrame
-with `toPandas()`, which I will discuss below; however, there are many additional
-improvements currently [underway][3].
-
-# Optimizing Spark Conversion to Pandas
-
-The previous way of converting a Spark DataFrame to Pandas with `DataFrame.toPandas()`
-in PySpark was painfully inefficient. Basically, it worked by first collecting all
-rows to the Spark driver. Next, each row would get serialized into Python's pickle
-format and sent to a Python worker process. This child process unpickles each row into
-a huge list of tuples. Finally, a Pandas DataFrame is created from the list using
-`pandas.DataFrame.from_records()`.
-
-This all might seem like standard procedure, but it suffers from two glaring issues: 1)
-even using cPickle, Python serialization is a slow process, and 2) creating
-a `pandas.DataFrame` using `from_records` must slowly iterate over the list of pure
-Python data and convert each value to Pandas format. See [here][4] for a detailed
-analysis.
-
-Here is where Arrow really shines, optimizing both of these steps: 1) once the data is
-in Arrow memory format, there is no need to serialize/pickle anymore, as Arrow data can
-be sent directly to the Python process; 2) when the Arrow data is received in Python,
-pyarrow can utilize zero-copy methods to create a `pandas.DataFrame` from entire
-chunks of data at once instead of processing individual scalar values. Additionally,
-the conversion to Arrow data can be done on the JVM and pushed back for the Spark
-executors to perform in parallel, drastically reducing the load on the driver.
-
-As of the merging of [SPARK-13534][5], the use of Arrow when calling `toPandas()`
-needs to be enabled by setting the SQLConf "spark.sql.execution.arrow.enabled" to
-"true".  Let's look at a simple usage example.
-
-```
-Welcome to
-      ____              __
-     / __/__  ___ _____/ /__
-    _\ \/ _ \/ _ `/ __/  '_/
-   /__ / .__/\_,_/_/ /_/\_\   version 2.3.0-SNAPSHOT
-      /_/
-
-Using Python version 2.7.13 (default, Dec 20 2016 23:09:15)
-SparkSession available as 'spark'.
-
-In [1]: from pyspark.sql.functions import rand
-   ...: df = spark.range(1 << 22).toDF("id").withColumn("x", rand())
-   ...: df.printSchema()
-   ...: 
-root
- |-- id: long (nullable = false)
- |-- x: double (nullable = false)
-
-
-In [2]: %time pdf = df.toPandas()
-CPU times: user 17.4 s, sys: 792 ms, total: 18.1 s
-Wall time: 20.7 s
-
-In [3]: spark.conf.set("spark.sql.execution.arrow.enabled", "true")
-
-In [4]: %time pdf = df.toPandas()
-CPU times: user 40 ms, sys: 32 ms, total: 72 ms                                 
-Wall time: 737 ms
-
-In [5]: pdf.describe()
-Out[5]: 
-                 id             x
-count  4.194304e+06  4.194304e+06
-mean   2.097152e+06  4.998996e-01
-std    1.210791e+06  2.887247e-01
-min    0.000000e+00  8.291929e-07
-25%    1.048576e+06  2.498116e-01
-50%    2.097152e+06  4.999210e-01
-75%    3.145727e+06  7.498380e-01
-max    4.194303e+06  9.999996e-01
-```
-
-This example was run locally on my laptop using Spark defaults, so the times
-shown should not be taken as precise. Even so, it is clear there is a huge
-performance boost: using Arrow took something that was excruciatingly slow
-and sped it up to the point of being barely noticeable.
-
-# Notes on Usage
-
-Here are some things to keep in mind before making use of this new feature. At
-the time of writing, pyarrow is not installed automatically with
-pyspark and needs to be installed manually; see the installation [instructions][6].
-It is planned to add pyarrow as a pyspark dependency so that
-`> pip install pyspark` will also install pyarrow.
-
-Currently, the controlling SQLConf is disabled by default. This can be enabled
-programmatically as in the example above or by adding the line
-"spark.sql.execution.arrow.enabled=true" to `SPARK_HOME/conf/spark-defaults.conf`.
-
-Also, not all Spark data types are currently supported; coverage is limited to
-primitive types. Expanded type support is in the works and is also expected to
-land in the Spark 2.3 release.
-
-# Future Improvements
-
-As mentioned, this was just a first step in using Arrow to make life easier for
-Spark Python users. A few exciting initiatives in the works are to allow for
-vectorized UDF evaluation ([SPARK-21190][7], [SPARK-21404][8]), and the ability
-to apply a function on grouped data using a Pandas DataFrame ([SPARK-20396][9]).
-Just as Arrow helped in converting a Spark DataFrame to Pandas, it can also work in the
-other direction when creating a Spark DataFrame from an existing Pandas
-DataFrame ([SPARK-20791][10]). Stay tuned for more!
-
-# Collaborators
-
-Reaching this first milestone was a group effort from both the Apache Arrow and
-Spark communities. Thanks to the hard work of [Wes McKinney][13], [Li Jin][14],
-[Holden Karau][15], Reynold Xin, Wenchen Fan, Shane Knapp and many others that
-helped push this effort forwards.
-
-[1]: https://spark.apache.org/
-[2]: https://arrow.apache.org/
-[3]: https://issues.apache.org/jira/issues/?filter=12335725&jql=project%20%3D%20SPARK%20AND%20status%20in%20(Open%2C%20%22In%20Progress%22%2C%20Reopened)%20AND%20text%20~%20%22arrow%22%20ORDER%20BY%20createdDate%20DESC
-[4]: https://gist.github.com/wesm/0cb5531b1c2e346a0007
-[5]: https://issues.apache.org/jira/browse/SPARK-13534
-[6]: https://github.com/apache/arrow/blob/master/site/install.md
-[7]: https://issues.apache.org/jira/browse/SPARK-21190
-[8]: https://issues.apache.org/jira/browse/SPARK-21404
-[9]: https://issues.apache.org/jira/browse/SPARK-20396
-[10]: https://issues.apache.org/jira/browse/SPARK-20791
-[11]: https://github.com/BryanCutler
-[12]: http://www.spark.tc/
-[13]: https://github.com/wesm
-[14]: https://github.com/icexelloss
-[15]: https://github.com/holdenk
diff --git a/site/_posts/2017-08-08-plasma-in-memory-object-store.md b/site/_posts/2017-08-08-plasma-in-memory-object-store.md
deleted file mode 100644
index 48cfb66..0000000
--- a/site/_posts/2017-08-08-plasma-in-memory-object-store.md
+++ /dev/null
@@ -1,150 +0,0 @@
----
-layout: post
-title: "Plasma In-Memory Object Store"
-date: "2017-08-08 00:00:00 -0400"
-author: Philipp Moritz and Robert Nishihara
-categories: [application]
----
-<!--
-{% comment %}
-Licensed to the Apache Software Foundation (ASF) under one or more
-contributor license agreements.  See the NOTICE file distributed with
-this work for additional information regarding copyright ownership.
-The ASF licenses this file to you under the Apache License, Version 2.0
-(the "License"); you may not use this file except in compliance with
-the License.  You may obtain a copy of the License at
-
-http://www.apache.org/licenses/LICENSE-2.0
-
-Unless required by applicable law or agreed to in writing, software
-distributed under the License is distributed on an "AS IS" BASIS,
-WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
-See the License for the specific language governing permissions and
-limitations under the License.
-{% endcomment %}
--->
-
-*[Philipp Moritz][1] and [Robert Nishihara][2] are graduate students at UC
- Berkeley.*
-
-## Plasma: A High-Performance Shared-Memory Object Store
-
-### Motivating Plasma
-
-This blog post presents Plasma, an in-memory object store that is being
-developed as part of Apache Arrow. **Plasma holds immutable objects in shared
-memory so that they can be accessed efficiently by many clients across process
-boundaries.** In light of the trend toward larger and larger multicore machines,
-Plasma enables critical performance optimizations in the big data regime.
-
-Plasma was initially developed as part of [Ray][3], and has recently been moved
-to Apache Arrow in the hopes that it will be broadly useful.
-
-One of the goals of Apache Arrow is to serve as a common data layer enabling
-zero-copy data exchange between multiple frameworks. A key component of this
-vision is the use of off-heap memory management (via Plasma) for storing and
-sharing Arrow-serialized objects between applications.
-
-**Expensive serialization and deserialization as well as data copying are a
-common performance bottleneck in distributed computing.** For example, a
-Python-based execution framework that wishes to distribute computation across
-multiple Python “worker” processes and then aggregate the results in a single
-“driver” process may choose to serialize data using the built-in `pickle`
-library. Assuming one Python process per core, each worker process would have to
-copy and deserialize the data, resulting in excessive memory usage. The driver
-process would then have to deserialize results from each of the workers,
-resulting in a bottleneck.
-
-Using Plasma plus Arrow, the data being operated on would be placed in the
-Plasma store once, and all of the workers would read the data without copying or
-deserializing it (the workers would map the relevant region of memory into their
-own address spaces). The workers would then put the results of their computation
-back into the Plasma store, which the driver could then read and aggregate
-without copying or deserializing the data.
-
-### The Plasma API
-
-Below we illustrate a subset of the API. The C++ API is documented more fully
-[here][6], and the Python API is documented [here][7].
-
-**Object IDs:** Each object is associated with a string of bytes.
-
-**Creating an object:** Objects are stored in Plasma in two stages. First, the
-object store *creates* the object by allocating a buffer for it. At this point,
-the client can write to the buffer and construct the object within the allocated
-buffer. When the client is done, it *seals* the buffer, making the object
-immutable and available to other Plasma clients.
-
-```python
-# Create an object.
-object_id = pyarrow.plasma.ObjectID(20 * b'a')
-object_size = 1000
-buffer = memoryview(client.create(object_id, object_size))
-
-# Write to the buffer.
-for i in range(1000):
-    buffer[i] = 0
-
-# Seal the object making it immutable and available to other clients.
-client.seal(object_id)
-```
-
-**Getting an object:** After an object has been sealed, any client who knows the
-object ID can get the object.
-
-```python
-# Get the object from the store. This blocks until the object has been sealed.
-object_id = pyarrow.plasma.ObjectID(20 * b'a')
-[buff] = client.get([object_id])
-buffer = memoryview(buff)
-```
-
-If the object has not been sealed yet, then the call to `client.get` will block
-until the object has been sealed.
-
-### A sorting application
-
-To illustrate the benefits of Plasma, we demonstrate an **11x speedup** (on a
-machine with 20 physical cores) for sorting a large pandas DataFrame (one
-billion entries). The baseline is the built-in pandas sort function, which sorts
-the DataFrame in 477 seconds. To leverage multiple cores, we implement the
-following standard distributed sorting scheme.
-
-* We assume that the data is partitioned across K pandas DataFrames and that
-  each one already lives in the Plasma store.
-* We subsample the data, sort the subsampled data, and use the result to define
-  L non-overlapping buckets.
-* For each of the K data partitions and each of the L buckets, we find the
-  subset of the data partition that falls in the bucket, and we sort that
-  subset.
-* For each of the L buckets, we gather all of the K sorted subsets that fall in
-  that bucket.
-* For each of the L buckets, we merge the corresponding K sorted subsets.
-* We turn each bucket into a pandas DataFrame and place it in the Plasma store.
-
-Using this scheme, we can sort the DataFrame (the data starts and ends in the
-Plasma store) in 44 seconds, giving an 11x speedup over the baseline. A
-simplified single-process sketch of the bucketing logic appears below.
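-
-The following sketch illustrates just the bucketing and per-bucket sorting
-steps with plain pandas and NumPy in a single process (the column name `key`
-and the 1% sample rate are arbitrary choices); the real benchmark runs these
-steps in parallel across workers, with every intermediate DataFrame living in
-the Plasma store rather than in a Python list:
-
-```python
-import numpy as np
-import pandas as pd
-
-def bucket_sort(partitions, num_buckets):
-    """Sort K DataFrames (each with a 'key' column) into globally ordered buckets."""
-    # Subsample to choose approximate bucket boundaries (assumes large partitions).
-    sample = pd.concat(p['key'].sample(frac=0.01) for p in partitions)
-    bounds = np.quantile(sample, np.linspace(0, 1, num_buckets + 1)[1:-1])
-
-    # Assign every row of every partition to a bucket and sort each piece.
-    pieces = [[] for _ in range(num_buckets)]
-    for part in partitions:
-        labels = np.searchsorted(bounds, part['key'].to_numpy())
-        for b in range(num_buckets):
-            pieces[b].append(part[labels == b].sort_values('key'))
-
-    # Merge the sorted pieces of each bucket (re-sorted here for brevity).
-    return [pd.concat(bucket).sort_values('key') for bucket in pieces]
-```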
-
-### Design
-
-The Plasma store runs as a separate process. It is written in C++ and is
-designed as a single-threaded event loop based on the [Redis][4] event loop library.
-The Plasma client library can be linked into applications. Clients communicate
-with the Plasma store via messages serialized using [Google Flatbuffers][5].
-
-### Call for contributions
-
-Plasma is a work in progress, and the API is currently unstable. Today Plasma is
-primarily used in [Ray][3] as an in-memory cache for Arrow serialized objects.
-We are looking for a broader set of use cases to help refine Plasma’s API. In
-addition, we are looking for contributions in a variety of areas including
-improving performance and building other language bindings. Please let us know
-if you are interested in getting involved with the project.
-
-[1]: https://people.eecs.berkeley.edu/~pcmoritz/
-[2]: http://www.robertnishihara.com
-[3]: https://github.com/ray-project/ray
-[4]: https://redis.io/
-[5]: https://google.github.io/flatbuffers/
-[6]: https://github.com/apache/arrow/blob/master/cpp/apidoc/tutorials/plasma.md
-[7]: https://github.com/apache/arrow/blob/master/python/doc/source/plasma.rst
diff --git a/site/_posts/2017-08-16-0.6.0-release.md b/site/_posts/2017-08-16-0.6.0-release.md
deleted file mode 100644
index 2796c4b..0000000
--- a/site/_posts/2017-08-16-0.6.0-release.md
+++ /dev/null
@@ -1,112 +0,0 @@
----
-layout: post
-title: "Apache Arrow 0.6.0 Release"
-date: "2017-08-16 00:00:00 -0400"
-author: wesm
-categories: [release]
----
-<!--
-{% comment %}
-Licensed to the Apache Software Foundation (ASF) under one or more
-contributor license agreements.  See the NOTICE file distributed with
-this work for additional information regarding copyright ownership.
-The ASF licenses this file to you under the Apache License, Version 2.0
-(the "License"); you may not use this file except in compliance with
-the License.  You may obtain a copy of the License at
-
-http://www.apache.org/licenses/LICENSE-2.0
-
-Unless required by applicable law or agreed to in writing, software
-distributed under the License is distributed on an "AS IS" BASIS,
-WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
-See the License for the specific language governing permissions and
-limitations under the License.
-{% endcomment %}
--->
-
-The Apache Arrow team is pleased to announce the 0.6.0 release. It includes
-[**90 resolved JIRAs**][1] with the new Plasma shared memory object store, and
-improvements and bug fixes to the various language implementations. The Arrow
-memory format has remained stable since the 0.3.x release.
-
-See the [Install Page][2] to learn how to get the libraries for your
-platform. The [complete changelog][5] is also available.
-
-## Plasma Shared Memory Object Store
-
-This release includes the [Plasma Store][7], which you can read more about in
-the linked blog post. This system was originally developed as part of the [Ray
-Project][8] at the [UC Berkeley RISELab][9]. We recognized that Plasma would be
-highly valuable to the Arrow community as a tool for shared memory management
-and zero-copy deserialization. Additionally, we believe we will be able to
-develop a stronger software stack through sharing of IO and buffer management
-code.
-
-The Plasma store is a server application which runs as a separate process. A
-reference C++ client, with Python bindings, is made available in this
-release. Clients can be developed in Java or other languages in the future to
-enable simple sharing of complex datasets through shared memory.
-
-## Arrow Format Addition: Map type
-
-We added a Map logical type to represent ordered and unordered maps
-in-memory. This corresponds to the `MAP` logical type annotation in the Parquet
-format (where maps are represented as repeated structs).
-
-Map is represented as a list of structs. It is the first example of a logical
-type whose physical representation is a nested type. We have not yet
-implemented Map containers in any of the language implementations, but this can
-be done in a future release.
-
-As an example, the Python data:
-
-```
-data = [{'a': 1, 'bb': 2, 'cc': 3}, {'dddd': 4}]
-```
-
-Could be represented in an Arrow `Map<String, Int32>` as:
-
-```
-Map<String, Int32> = List<Struct<keys: String, values: Int32>>
-  is_valid: [true, true]
-  offsets: [0, 3, 4]
-  values: Struct<keys: String, values: Int32>
-    children:
-      - keys: String
-          is_valid: [true, true, true, true]
-          offsets: [0, 1, 3, 5, 9]
-          data: abbccdddd
-      - values: Int32
-          is_valid: [true, true, true, true]
-          data: [1, 2, 3, 4]
-```
-
-## Python Changes
-
-Some highlights of Python development, outside of bug fixes and general API
-improvements, include the following (a short usage sketch follows the list):
-
-* New `strings_to_categorical=True` option when calling `Table.to_pandas` will
-  yield pandas `Categorical` types from Arrow binary and string columns
-* Expanded Hadoop Filesystem (HDFS) functionality to improve compatibility with
-  Dask and other HDFS-aware Python libraries.
-* s3fs and other Dask-oriented filesystems can now be used with
-  `pyarrow.parquet.ParquetDataset`
-* More graceful handling of pandas's nanosecond timestamps when writing to
-  Parquet format. You can now pass `coerce_timestamps='ms'` to cast to
-  milliseconds, or `'us'` for microseconds.
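-
-A short sketch of two of these options (illustrative only; the file and column
-names are placeholders):
-
-```python
-import datetime
-import pyarrow as pa
-import pyarrow.parquet as pq
-
-table = pa.Table.from_arrays(
-    [pa.array(['x', 'y', 'x']),
-     pa.array([datetime.datetime(2017, 8, 16)] * 3, type=pa.timestamp('ns'))],
-    names=['strings', 'ts'])
-
-# Yield a pandas Categorical column from the Arrow string column.
-df = table.to_pandas(strings_to_categorical=True)
-
-# Cast nanosecond timestamps to milliseconds when writing Parquet.
-pq.write_table(table, 'data.parquet', coerce_timestamps='ms')
-```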
-
-## Toward Arrow 1.0.0 and Beyond
-
-We are still discussing the roadmap to 1.0.0 release on the [developer mailing
-list][6]. The focus of the 1.0.0 release will likely be memory format stability
-and hardening integration tests across the remaining data types implemented in
-Java and C++. Please join the discussion there.
-
-[1]: https://issues.apache.org/jira/issues/?jql=project%20%3D%20ARROW%20AND%20status%20in%20(Resolved%2C%20Closed)%20AND%20fixVersion%20%3D%200.6.0
-[2]: http://arrow.apache.org/install
-[3]: http://github.com/apache/parquet-cpp
-[5]: http://arrow.apache.org/release/0.6.0.html
-[6]: http://mail-archives.apache.org/mod_mbox/arrow-dev/
-[7]: http://arrow.apache.org/blog/2017/08/08/plasma-in-memory-object-store/
-[8]: https://ray-project.github.io/ray/
-[9]: https://rise.cs.berkeley.edu/
\ No newline at end of file
diff --git a/site/_posts/2017-09-19-0.7.0-release.md b/site/_posts/2017-09-19-0.7.0-release.md
deleted file mode 100644
index dd253df..0000000
--- a/site/_posts/2017-09-19-0.7.0-release.md
+++ /dev/null
@@ -1,190 +0,0 @@
----
-layout: post
-title: "Apache Arrow 0.7.0 Release"
-date: "2017-09-19 00:00:00 -0400"
-author: wesm
-categories: [release]
----
-<!--
-{% comment %}
-Licensed to the Apache Software Foundation (ASF) under one or more
-contributor license agreements.  See the NOTICE file distributed with
-this work for additional information regarding copyright ownership.
-The ASF licenses this file to you under the Apache License, Version 2.0
-(the "License"); you may not use this file except in compliance with
-the License.  You may obtain a copy of the License at
-
-http://www.apache.org/licenses/LICENSE-2.0
-
-Unless required by applicable law or agreed to in writing, software
-distributed under the License is distributed on an "AS IS" BASIS,
-WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
-See the License for the specific language governing permissions and
-limitations under the License.
-{% endcomment %}
--->
-
-The Apache Arrow team is pleased to announce the 0.7.0 release. It includes
-[**133 resolved JIRAs**][1] with many new features and bug fixes to the various
-language implementations. The Arrow memory format has remained stable since the
-0.3.x release.
-
-See the [Install Page][2] to learn how to get the libraries for your
-platform. The [complete changelog][3] is also available.
-
-We include some highlights from the release in this post.
-
-## New PMC Member: Kouhei Sutou
-
-Since the last release we have added [Kou][4] to the Arrow Project Management
-Committee. He is also a PMC member of Apache Subversion and a major contributor to
-many other open source projects.
-
-As an active member of the Ruby community in Japan, Kou has been developing the
-GLib-based C bindings for Arrow with associated Ruby wrappers, to enable Ruby
-users to benefit from the work that's happening in Apache Arrow.
-
-We are excited to be collaborating with the Ruby community on shared
-infrastructure for in-memory analytics and data science.
-
-## Expanded JavaScript (TypeScript) Implementation
-
-[Paul Taylor][5] from the [Falcor][7] and [ReactiveX][6] projects has worked to
-expand the JavaScript implementation (which is written in TypeScript), using
-the latest in modern JavaScript build and packaging technology. We are looking
-forward to building out the JS implementation and bringing it up to full
-functionality with the C++ and Java implementations.
-
-We are looking for more JavaScript developers to join the project and work
-together to make Arrow for JS work well with many kinds of front end use cases,
-like real time data visualization.
-
-## Type casting for C++ and Python
-
-As part of longer-term efforts to build an Arrow-native in-memory analytics
-library, we implemented a variety of type conversion functions. These functions
-are essential in ETL tasks when conforming one table schema to another, and
-are similar to the `astype` method in NumPy.
-
-```python
-In [17]: import pyarrow as pa
-
-In [18]: arr = pa.array([True, False, None, True])
-
-In [19]: arr
-Out[19]:
-<pyarrow.lib.BooleanArray object at 0x7ff6fb069b88>
-[
-  True,
-  False,
-  NA,
-  True
-]
-
-In [20]: arr.cast(pa.int32())
-Out[20]:
-<pyarrow.lib.Int32Array object at 0x7ff6fb0383b8>
-[
-  1,
-  0,
-  NA,
-  1
-]
-```
-
-Over time these will expand to support as many input-and-output type
-combinations as possible, with optimized conversions.
-
-## New Arrow GPU (CUDA) Extension Library for C++
-
-To help with GPU-related projects using Arrow, like the [GPU Open Analytics
-Initiative][8], we have started a C++ add-on library to simplify Arrow memory
-management on CUDA-enabled graphics cards. We would like to expand this to
-include a library of reusable CUDA kernel functions for GPU analytics on Arrow
-columnar memory.
-
-For example, we could write a record batch from CPU memory to GPU device memory
-like so (some error checking omitted):
-
-```c++
-#include <arrow/api.h>
-#include <arrow/gpu/cuda_api.h>
-
-using namespace arrow;
-
-gpu::CudaDeviceManager* manager;
-std::shared_ptr<gpu::CudaContext> context;
-const int kGpuNumber = 0;  // which CUDA device to use
-
-gpu::CudaDeviceManager::GetInstance(&manager);
-manager->GetContext(kGpuNumber, &context);
-
-std::shared_ptr<RecordBatch> batch = GetCpuData();
-
-std::shared_ptr<gpu::CudaBuffer> device_serialized;
-gpu::SerializeRecordBatch(*batch, context.get(), &device_serialized);
-```
-
-We can then "read" the GPU record batch, but the returned `arrow::RecordBatch`
-internally will contain GPU device pointers that you can use for CUDA kernel
-calls:
-
-```c++
-std::shared_ptr<RecordBatch> device_batch;
-gpu::ReadRecordBatch(batch->schema(), device_serialized,
-                     default_memory_pool(), &device_batch);
-
-// Now run some CUDA kernels on device_batch
-```
-
-## Decimal Integration Tests
-
-[Phillip Cloud][9] has been working on decimal support in C++ to enable Parquet
-read/write support in C++ and Python, and also end-to-end testing against the
-Arrow Java libraries.
-
-In the upcoming releases, we hope to complete the remaining data types that
-need end-to-end testing between Java and C++:
-
-* Fixed size lists (variable-size lists already implemented)
-* Fixed size binary
-* Unions
-* Maps
-* Time intervals
-
-## Other Notable Python Changes
-
-Some highlights of Python development, outside of bug fixes and general API
-improvements, include the following (see the sketch after this list):
-
-* Simplified `put` and `get` of arbitrary Python objects in the Plasma object store
-* [High-speed, memory efficient object serialization][10]. This is important
-  enough that we will likely write a dedicated blog post about it.
-* New `flavor='spark'` option to `pyarrow.parquet.write_table` to enable easy
-  writing of Parquet files maximized for Spark compatibility
-* `parquet.write_to_dataset` function with support for partitioned writes
-* Improved support for Dask filesystems
-* Improved Python usability for IPC: read and write schemas and record batches
-  more easily. See the [API docs][11] for more about these.
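-
-As a quick illustration of the Parquet-related additions (a sketch; the path,
-column names, and values are placeholders):
-
-```python
-import pyarrow as pa
-import pyarrow.parquet as pq
-
-table = pa.Table.from_arrays(
-    [pa.array([2016, 2017, 2017]), pa.array([1.0, 2.0, 3.0])],
-    names=['year', 'value'])
-
-# Write a single file maximized for Spark compatibility.
-pq.write_table(table, 'data.parquet', flavor='spark')
-
-# Write a partitioned dataset with one directory per distinct 'year' value.
-pq.write_to_dataset(table, root_path='dataset_root', partition_cols=['year'])
-```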
-
-## The Road Ahead
-
-Upcoming Arrow releases will continue to expand the project to cover more use
-cases. In addition to completing end-to-end testing for all the major data
-types, some of us will be shifting attention to building Arrow-native in-memory
-analytics libraries.
-
-We are looking for more JavaScript, R, and other programming language
-developers to join the project and expand the available implementations and
-bindings to more languages.
-
-[1]: https://issues.apache.org/jira/issues/?jql=project%20%3D%20ARROW%20AND%20status%20in%20(Resolved%2C%20Closed)%20AND%20fixVersion%20%3D%200.7.0
-[2]: http://arrow.apache.org/install
-[3]: http://arrow.apache.org/release/0.7.0.html
-[4]: https://github.com/kou
-[5]: https://github.com/trxcllnt
-[6]: http://reactivex.io
-[7]: https://github.com/netflix/falcor
-[8]: http://gpuopenanalytics.com/
-[9]: http://github.com/cpcloud
-[10]: http://arrow.apache.org/docs/python/ipc.html
-[11]: http://arrow.apache.org/docs/python/api.html
\ No newline at end of file
diff --git a/site/_posts/2017-10-15-fast-python-serialization-with-ray-and-arrow.md b/site/_posts/2017-10-15-fast-python-serialization-with-ray-and-arrow.md
deleted file mode 100644
index f5f3cca..0000000
--- a/site/_posts/2017-10-15-fast-python-serialization-with-ray-and-arrow.md
+++ /dev/null
@@ -1,308 +0,0 @@
----
-layout: post
-title: "Fast Python Serialization with Ray and Apache Arrow"
-excerpt: "This post describes how serialization works in Ray."
-date: 2017-10-15 14:00:00
-author: Philipp Moritz, Robert Nishihara
-categories: [application]
----
-
-<!--
-{% comment %}
-Licensed to the Apache Software Foundation (ASF) under one or more
-contributor license agreements.  See the NOTICE file distributed with
-this work for additional information regarding copyright ownership.
-The ASF licenses this file to you under the Apache License, Version 2.0
-(the "License"); you may not use this file except in compliance with
-the License.  You may obtain a copy of the License at
-
-http://www.apache.org/licenses/LICENSE-2.0
-
-Unless required by applicable law or agreed to in writing, software
-distributed under the License is distributed on an "AS IS" BASIS,
-WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
-See the License for the specific language governing permissions and
-limitations under the License.
-{% endcomment %}
--->
-
-*This was originally posted on the [Ray blog][11]. [Philipp Moritz][12] and [Robert Nishihara][13] are graduate students at UC Berkeley.*
-
-This post elaborates on the integration between [Ray][1] and [Apache Arrow][2].
-The main problem this addresses is [data serialization][3].
-
-From [Wikipedia][3], **serialization** is
-
-> ... the process of translating data structures or object state into a format
-> that can be stored ... or transmitted ... and reconstructed later (possibly
-> in a different computer environment).
-
-Why is any translation necessary? Well, when you create a Python object, it may
-have pointers to other Python objects, and these objects are all allocated in
-different regions of memory, and all of this has to make sense when unpacked by
-another process on another machine.
-
-Serialization and deserialization are **bottlenecks in parallel and distributed
-computing**, especially in machine learning applications with large objects and
-large quantities of data.
-
-## Design Goals
-
-As Ray is optimized for machine learning and AI applications, we have focused a
-lot on serialization and data handling, with the following design goals:
-
-1. It should be very efficient with **large numerical data** (this includes
-NumPy arrays and Pandas DataFrames, as well as objects that recursively contain
-NumPy arrays and Pandas DataFrames).
-2. It should be about as fast as Pickle for **general Python types**.
-3. It should be compatible with **shared memory**, allowing multiple processes
-to use the same data without copying it.
-4. **Deserialization** should be extremely fast (when possible, it should not
-require reading the entire serialized object).
-5. It should be **language independent** (eventually we'd like to enable Python
-workers to use objects created by workers in Java or other languages and vice
-versa).
-
-## Our Approach and Alternatives
-
-The go-to serialization approach in Python is the **pickle** module. Pickle is
-very general, especially if you use variants like [cloudpickle][4]. However, it
-does not satisfy requirements 1, 3, 4, or 5. Alternatives like **json** satisfy
-5, but not 1-4.
-
-**Our Approach:** To satisfy requirements 1-5, we chose to use the
-[Apache Arrow][2] format as our underlying data representation. In collaboration
-with the Apache Arrow team, we built [libraries][7] for mapping general Python
-objects to and from the Arrow format. Some properties of this approach:
-
-- The data layout is language independent (requirement 5).
-- Offsets into a serialized data blob can be computed in constant time without
-reading the full object (requirements 1 and 4).
-- Arrow supports **zero-copy reads**, so objects can naturally be stored in
-shared memory and used by multiple processes (requirements 1 and 3).
-- We can naturally fall back to pickle for anything we can’t handle well
-(requirement 2).
-
-**Alternatives to Arrow:** We could have built on top of
-[**Protocol Buffers**][5], but Protocol Buffers really isn't designed for
-numerical data, and that approach wouldn’t satisfy 1, 3, or 4. Building on top
-of [**Flatbuffers**][6] actually could be made to work, but it would have
-required implementing a lot of the facilities that Arrow already has and we
-preferred a columnar data layout more optimized for big data.
-
-## Speedups
-
-Here we show some performance improvements over Python’s pickle module. The
-experiments were done using `pickle.HIGHEST_PROTOCOL`. Code for generating these
-plots is included at the end of the post.
-
-**With NumPy arrays:** In machine learning and AI applications, data (e.g.,
-images, neural network weights, text documents) are typically represented as
-data structures containing NumPy arrays. When using NumPy arrays, the speedups
-are impressive.
-
-The fact that the Ray bars for deserialization are barely visible is not a
-mistake. This is a consequence of the support for zero-copy reads (the savings
-largely come from the lack of memory movement).
-
-<div align="center">
-<img src="{{ site.base-url }}/assets/fast_python_serialization_with_ray_and_arrow/speedups0.png" width="365" height="255">
-<img src="{{ site.base-url }}/assets/fast_python_serialization_with_ray_and_arrow/speedups1.png" width="365" height="255">
-</div>
-
-Note that the biggest wins are with deserialization. The speedups here are
-multiple orders of magnitude and get better as the NumPy arrays get larger
-(thanks to design goals 1, 3, and 4). Making **deserialization** fast is
-important for two reasons. First, an object may be serialized once and then
-deserialized many times (e.g., an object that is broadcast to all workers).
-Second, a common pattern is for many objects to be serialized in parallel and
-then aggregated and deserialized one at a time on a single worker, making
-deserialization the bottleneck.
-
-**Without NumPy arrays:** When using regular Python objects, for which we
-cannot take advantage of shared memory, the results are comparable to pickle.
-
-<div align="center">
-<img src="{{ site.base-url }}/assets/fast_python_serialization_with_ray_and_arrow/speedups2.png" width="365" height="255">
-<img src="{{ site.base-url }}/assets/fast_python_serialization_with_ray_and_arrow/speedups3.png" width="365" height="255">
-</div>
-
-These are just a few examples of interesting Python objects. The most important
-case is the case where NumPy arrays are nested within other objects. Note that
-our serialization library works with very general Python types including custom
-Python classes and deeply nested objects.
-
-## The API
-
-The serialization library can be used directly through pyarrow as follows. More
-documentation is available [here][7].
-
-```python
-x = [(1, 2), 'hello', 3, 4, np.array([5.0, 6.0])]
-serialized_x = pyarrow.serialize(x).to_buffer()
-deserialized_x = pyarrow.deserialize(serialized_x)
-```
-
-It can be used directly through the Ray API as follows.
-
-```python
-x = [(1, 2), 'hello', 3, 4, np.array([5.0, 6.0])]
-x_id = ray.put(x)
-deserialized_x = ray.get(x_id)
-```
-
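-For completeness, here is a minimal sketch (added for illustration, not taken
-from the original post) of writing the same serialized buffer to a
-memory-mapped file and reading it back with zero-copy deserialization; the
-file name `x.arrow` is only illustrative.
-
-```python
-import numpy as np
-import pyarrow
-
-x = [(1, 2), 'hello', 3, 4, np.array([5.0, 6.0])]
-buf = pyarrow.serialize(x).to_buffer()
-
-# Persist the serialized blob, then memory-map it so that deserialization
-# can reference the bytes in place instead of copying them.
-with pyarrow.OSFile('x.arrow', 'wb') as f:
-    f.write(buf)
-
-with pyarrow.memory_map('x.arrow', 'rb') as source:
-    deserialized_x = pyarrow.deserialize(source.read_buffer())
-```
-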
-## Data Representation
-
-We use Apache Arrow as the underlying language-independent data layout. Objects
-are stored in two parts: a **schema** and a **data blob**. At a high level, the
-data blob is roughly a flattened concatenation of all of the data values
-recursively contained in the object, and the schema defines the types and
-nesting structure of the data blob.
-
-**Technical Details:** Python sequences (e.g., dictionaries, lists, tuples,
-sets) are encoded as Arrow [UnionArrays][8] of other types (e.g., bools, ints,
-strings, bytes, floats, doubles, date64s, tensors (i.e., NumPy arrays), lists,
-tuples, dicts and sets). Nested sequences are encoded using Arrow
-[ListArrays][9]. All tensors are collected and appended to the end of the
-serialized object, and the UnionArray contains references to these tensors.
-
-To give a concrete example, consider the following object.
-
-```python
-[(1, 2), 'hello', 3, 4, np.array([5.0, 6.0])]
-```
-
-It would be represented in Arrow with the following structure.
-
-```
-UnionArray(type_ids=[tuple, string, int, int, ndarray],
-           tuples=ListArray(offsets=[0, 2],
-                            UnionArray(type_ids=[int, int],
-                                       ints=[1, 2])),
-           strings=['hello'],
-           ints=[3, 4],
-           ndarrays=[<offset of numpy array>])
-```
-
-Arrow uses Flatbuffers to encode serialized schemas. **Using only the schema, we
-can compute the offsets of each value in the data blob without scanning through
-the data blob** (unlike Pickle, this is what enables fast deserialization). This
-means that we can avoid copying or otherwise converting large arrays and other
-values during deserialization. Tensors are appended at the end of the UnionArray
-and can be efficiently shared and accessed using shared memory.
-
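-As a quick way to observe this zero-copy behavior, the sketch below (added
-here for illustration; it uses the `Buffer.address` attribute purely as a
-diagnostic) checks that a deserialized NumPy array points directly into the
-serialized buffer, so it should print `True` when no copy was made.
-
-```python
-import numpy as np
-import pyarrow
-
-x = [(1, 2), 'hello', 3, 4, np.array([5.0, 6.0])]
-buf = pyarrow.serialize(x).to_buffer()
-y = pyarrow.deserialize(buf)
-
-# The ndarray's data pointer falls inside the serialized buffer, so the
-# tensor contents were not copied during deserialization.
-ptr = y[4].__array_interface__['data'][0]
-print(buf.address <= ptr < buf.address + buf.size)
-```
-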
-Note that the actual object would be laid out in memory as shown below.
-
-<div align="center">
-<img src="{{ site.base-url }}/assets/fast_python_serialization_with_ray_and_arrow/python_object.png" width="600">
-</div>
-<div><i>The layout of a Python object in the heap. Each box is allocated in a
-different memory region, and arrows between boxes represent pointers.</i></div>
-<br />
-
-The Arrow serialized representation would be as follows.
-
-<div align="center">
-<img src="{{ site.base-url }}/assets/fast_python_serialization_with_ray_and_arrow/arrow_object.png" width="400">
-</div>
-<div><i>The memory layout of the Arrow-serialized object.</i></div>
-<br />
-
-## Getting Involved
-
-We welcome contributions, especially in the following areas.
-
-- Use the C++ and Java implementations of Arrow to implement versions of this
-for C++ and Java.
-- Implement support for more Python types and better test coverage.
-
-## Reproducing the Figures Above
-
-For reference, the figures can be reproduced with the following code.
-Benchmarking `ray.put` and `ray.get` instead of `pyarrow.serialize` and
-`pyarrow.deserialize` gives similar figures. The plots were generated at this
-[commit][10].
-
-```python
-import pickle
-import pyarrow
-import matplotlib.pyplot as plt
-import numpy as np
-import timeit
-
-
-def benchmark_object(obj, number=10):
-    # Time serialization and deserialization for pickle.
-    pickle_serialize = timeit.timeit(
-        lambda: pickle.dumps(obj, protocol=pickle.HIGHEST_PROTOCOL),
-        number=number)
-    serialized_obj = pickle.dumps(obj, pickle.HIGHEST_PROTOCOL)
-    pickle_deserialize = timeit.timeit(lambda: pickle.loads(serialized_obj),
-                                       number=number)
-
-    # Time serialization and deserialization for Ray.
-    ray_serialize = timeit.timeit(
-        lambda: pyarrow.serialize(obj).to_buffer(), number=number)
-    serialized_obj = pyarrow.serialize(obj).to_buffer()
-    ray_deserialize = timeit.timeit(
-        lambda: pyarrow.deserialize(serialized_obj), number=number)
-
-    return [[pickle_serialize, pickle_deserialize],
-            [ray_serialize, ray_deserialize]]
-
-
-def plot(pickle_times, ray_times, title, i):
-    fig, ax = plt.subplots()
-    fig.set_size_inches(3.8, 2.7)
-
-    bar_width = 0.35
-    index = np.arange(2)
-    opacity = 0.6
-
-    plt.bar(index, pickle_times, bar_width,
-            alpha=opacity, color='r', label='Pickle')
-
-    plt.bar(index + bar_width, ray_times, bar_width,
-            alpha=opacity, color='c', label='Ray')
-
-    plt.title(title, fontweight='bold')
-    plt.ylabel('Time (seconds)', fontsize=10)
-    labels = ['serialization', 'deserialization']
-    plt.xticks(index + bar_width / 2, labels, fontsize=10)
-    plt.legend(fontsize=10, bbox_to_anchor=(1, 1))
-    plt.tight_layout()
-    plt.yticks(fontsize=10)
-    plt.savefig('plot-' + str(i) + '.png', format='png')
-
-
-test_objects = [
-    [np.random.randn(50000) for i in range(100)],
-    {'weight-' + str(i): np.random.randn(50000) for i in range(100)},
-    {i: set(['string1' + str(i), 'string2' + str(i)]) for i in range(100000)},
-    [str(i) for i in range(200000)]
-]
-
-titles = [
-    'List of large numpy arrays',
-    'Dictionary of large numpy arrays',
-    'Large dictionary of small sets',
-    'Large list of strings'
-]
-
-for i in range(len(test_objects)):
-    plot(*benchmark_object(test_objects[i]), titles[i], i)
-```
-
-[1]: http://ray.readthedocs.io/en/latest/index.html
-[2]: https://arrow.apache.org/
-[3]: https://en.wikipedia.org/wiki/Serialization
-[4]: https://github.com/cloudpipe/cloudpickle/
-[5]: https://developers.google.com/protocol-buffers/
-[6]: https://google.github.io/flatbuffers/
-[7]: https://arrow.apache.org/docs/python/ipc.html#arbitrary-object-serialization
-[8]: http://arrow.apache.org/docs/memory_layout.html#dense-union-type
-[9]: http://arrow.apache.org/docs/memory_layout.html#list-type
-[10]: https://github.com/apache/arrow/tree/894f7400977693b4e0e8f4b9845fd89481f6bf29
-[11]: https://ray-project.github.io/
-[12]: https://people.eecs.berkeley.edu/~pcmoritz/
-[13]: http://www.robertnishihara.com
diff --git a/site/_posts/2017-12-19-0.8.0-release.md b/site/_posts/2017-12-19-0.8.0-release.md
deleted file mode 100644
index 1ecc939..0000000
--- a/site/_posts/2017-12-19-0.8.0-release.md
+++ /dev/null
@@ -1,192 +0,0 @@
----
-layout: post
-title: "Apache Arrow 0.8.0 Release"
-date: "2017-12-19 00:01:00 -0400"
-author: wesm
-categories: [release]
----
-<!--
-{% comment %}
-Licensed to the Apache Software Foundation (ASF) under one or more
-contributor license agreements.  See the NOTICE file distributed with
-this work for additional information regarding copyright ownership.
-The ASF licenses this file to you under the Apache License, Version 2.0
-(the "License"); you may not use this file except in compliance with
-the License.  You may obtain a copy of the License at
-
-http://www.apache.org/licenses/LICENSE-2.0
-
-Unless required by applicable law or agreed to in writing, software
-distributed under the License is distributed on an "AS IS" BASIS,
-WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
-See the License for the specific language governing permissions and
-limitations under the License.
-{% endcomment %}
--->
-
-The Apache Arrow team is pleased to announce the 0.8.0 release. It is the
-product of 10 weeks of development and includes [**286 resolved JIRAs**][1] with
-many new features and bug fixes to the various language implementations. This
-is the largest release since 0.3.0 earlier this year.
-
-As part of work toward stabilizing the Arrow format and making a 1.0.0
-release sometime in 2018, we made a series of backwards-incompatible changes to
-the serialized Arrow metadata that require Arrow readers and writers (0.7.1
-and earlier) to upgrade in order to be compatible with 0.8.0 and higher. We
-expect future backwards-incompatible changes to be rare going forward.
-
-See the [Install Page][2] to learn how to get the libraries for your
-platform. The [complete changelog][3] is also available.
-
-We discuss some highlights from the release and other project news in this
-post.
-
-## Projects "Powered By" Apache Arrow
-
-A growing ecosystem of projects is using Arrow to solve in-memory analytics
-and data interchange problems. We have added a new [Powered By][19] page to the
-Arrow website where we can acknowledge open source projects and companies which
-are using Arrow. If you would like to add your project to the list as an Arrow
-user, please let us know.
-
-## New Arrow committers
-
-Since the last release, we have added 5 new Apache committers:
-
-* [Phillip Cloud][6], who has mainly contributed to C++ and Python
-* [Bryan Cutler][14], who has mainly contributed to Java and Spark integration
-* [Li Jin][15], who has mainly contributed to Java and Spark integration
-* [Paul Taylor][5], who has mainly contributed to JavaScript
-* [Siddharth Teotia][16], who has mainly contributed to Java
-
-Welcome to the Arrow team, and thank you for your contributions!
-
-## Improved Java vector API, performance improvements
-
-Siddharth Teotia led efforts to revamp the Java vector API to make things
-simpler and faster. As part of this, we removed the dichotomy between nullable
-and non-nullable vectors.
-
-See [Sidd's blog post][11] for more about these changes.
-
-## Decimal support in C++, Python, consistency with Java
-
-[Phillip Cloud][6] led efforts this release to harden details about exact
-decimal values in the Arrow specification and ensure a consistent
-implementation across Java, C++, and Python.
-
-Arrow now supports decimals represented internally as a 128-bit little-endian
-integer, with a set precision and scale (as defined in many SQL-based
-systems). As part of this work, we needed to change Java's internal
-representation from big- to little-endian.
-
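-As a brief illustration (a sketch using the `pyarrow.decimal128` factory found
-in later pyarrow releases; the exact 0.8.0-era spelling may differ), decimal
-values with a fixed precision and scale can be stored in an Arrow array:
-
-```python
-from decimal import Decimal
-import pyarrow
-
-# 128-bit decimal type with precision 10 and scale 2
-ty = pyarrow.decimal128(10, 2)
-arr = pyarrow.array([Decimal('123.45'), None, Decimal('-0.10')], type=ty)
-print(arr.type)
-```
-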
-We are now integration testing decimals between Java, C++, and Python, which
-will facilitate Arrow adoption in Apache Spark and other systems that use both
-Java and Python.
-
-Decimal data can now be read and written by the [Apache Parquet C++
-library][7], including via pyarrow.
-
-In the future, we may implement support for smaller-precision decimals
-represented by 32- or 64-bit integers.
-
-## C++ improvements: expanded kernels library and more
-
-In C++, we have continued developing the new `arrow::compute` submodule
-consisting of native computation functions for Arrow data. New contributor
-[Licht Takeuchi][8] helped expand the supported types for type casting in
-`compute::Cast`. We have also implemented new kernels `Unique` and
-`DictionaryEncode` for computing the distinct elements of an array and
-dictionary encoding (conversion to categorical), respectively.
-
-We expect the C++ computation "kernel" library to be a major expansion area for
-the project over the next year and beyond. Here, we can also implement SIMD-
-and GPU-accelerated versions of basic in-memory analytics functionality.
-
-As a minor breaking API change in C++, we have made the `RecordBatch` and `Table`
-APIs "virtual" or abstract interfaces, to enable different implementations of a
-record batch or table which conform to the standard interface. This will help
-enable features like lazy IO or column loading.
-
-There was significant work improving the C++ library generally and supporting
-work happening in Python and C. See the change log for full details.
-
-## GLib C improvements: Meson build, GPU support
-
-Development of the GLib-based C bindings has generally tracked work happening in
-the C++ library. These bindings are being used to develop [data science tools
-for Ruby users][9] and elsewhere.
-
-The C bindings now support the [Meson build system][10] in addition to
-autotools, which enables them to be built on Windows.
-
-The Arrow GPU extension library is now also supported in the C bindings.
-
-## JavaScript: first independent release on NPM
-
-[Brian Hulette][12] and [Paul Taylor][5] have been continuing to drive efforts
-on the TypeScript-based JavaScript implementation.
-
-Since the last release, we made a first JavaScript-only Apache release, version
-0.2.0, which is [now available on NPM][13]. We decided to make separate
-JavaScript releases to enable the JS library to release more frequently than
-the rest of the project.
-
-## Python improvements
-
-In addition to some of the new features mentioned above, we have made a variety
-of usability and performance improvements for integrations with pandas, NumPy,
-Dask, and other Python projects which may make use of pyarrow, the Arrow Python
-library.
-
-Some of these improvements include:
-
-* [Component-based serialization][17] for more flexible and memory-efficient
-  transport of large or complex Python objects
-* Substantially improved serialization performance for pandas objects when
-  using `pyarrow.serialize` and `pyarrow.deserialize`. This includes a special
-  `pyarrow.pandas_serialization_context` which further accelerates certain
-  internal details of pandas serialization
-* Support zero-copy reads for `pandas.DataFrame` using `pyarrow.deserialize`
-  for objects without Python objects
-* Multithreaded conversions from `pandas.DataFrame` to `pyarrow.Table` (we
-  already supported multithreaded conversions from Arrow back to pandas)
-* More efficient conversion from 1-dimensional NumPy arrays to Arrow format
-* New generic buffer compression and decompression APIs `pyarrow.compress` and
-  `pyarrow.decompress` (see the sketch after this list)
-* Enhanced Parquet cross-compatibility with [fastparquet][18] and improved Dask
-  support
-* Python support for accessing Parquet row group column statistics
-
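-The sketch below (written for illustration here, not taken from the release
-notes) shows how the new compression APIs operate on buffers; the `gzip` codec
-and sample bytes are arbitrary choices:
-
-```python
-import pyarrow
-
-buf = pyarrow.py_buffer(b'some bytes worth compressing ' * 100)
-compressed = pyarrow.compress(buf, codec='gzip')
-
-# The original size must be passed back to decompress for codecs that do
-# not store it alongside the compressed data.
-restored = pyarrow.decompress(compressed, decompressed_size=buf.size,
-                              codec='gzip')
-assert restored.to_pybytes() == buf.to_pybytes()
-```
-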
-## Upcoming Roadmap
-
-The 0.8.0 release includes some API and format changes, but upcoming releases
-will focus on completing and stabilizing critical functionality to move the
-project closer to a 1.0.0 release.
-
-With the ecosystem of projects using Arrow expanding rapidly, we will be
-working to improve and expand the libraries in support of downstream use cases.
-
-We continue to look for more JavaScript, Julia, R, Rust, and other programming
-language developers to join the project and expand the available
-implementations and bindings to more languages.
-
-[1]: https://issues.apache.org/jira/issues/?jql=project%20%3D%20ARROW%20AND%20status%20in%20(Resolved%2C%20Closed)%20AND%20fixVersion%20%3D%200.8.0
-[2]: https://arrow.apache.org/install
-[3]: https://arrow.apache.org/release/0.8.0.html
-[4]: https://github.com/kou
-[5]: https://github.com/trxcllnt
-[6]: https://github.com/cpcloud
-[7]: https://github.com/apache/parquet-cpp
-[8]: https://github.com/licht-t
-[9]: https://github.com/red-data-tools
-[10]: https://mesonbuild.com
-[11]: https://arrow.apache.org/blog/2017/12/19/java-vector-improvements/
-[12]: https://github.com/TheNeuralBit
-[13]: http://npmjs.org/package/apache-arrow
-[14]: https://github.com/BryanCutler
-[15]: https://github.com/icexelloss
-[16]: https://github.com/siddharthteotia
-[17]: http://arrow.apache.org/docs/python/ipc.html
-[18]: https://github.com/dask/fastparquet
-[19]: http://arrow.apache.org/powered_by/
diff --git a/site/_posts/2017-12-19-java-vector-improvements.md b/site/_posts/2017-12-19-java-vector-improvements.md
deleted file mode 100644
index 2854861..0000000
--- a/site/_posts/2017-12-19-java-vector-improvements.md
+++ /dev/null
@@ -1,104 +0,0 @@
----
-layout: post
-title: "Improvements to Java Vector API in Apache Arrow 0.8.0"
-excerpt: "This post describes the recent improvements in Java Vector code"
-date: 2017-12-19 00:00:00
-author: Siddharth Teotia
-categories: [application]
----
-
-<!--
-{% comment %}
-Licensed to the Apache Software Foundation (ASF) under one or more
-contributor license agreements.  See the NOTICE file distributed with
-this work for additional information regarding copyright ownership.
-The ASF licenses this file to you under the Apache License, Version 2.0
-(the "License"); you may not use this file except in compliance with
-the License.  You may obtain a copy of the License at
-
-http://www.apache.org/licenses/LICENSE-2.0
-
-Unless required by applicable law or agreed to in writing, software
-distributed under the License is distributed on an "AS IS" BASIS,
-WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
-See the License for the specific language governing permissions and
-limitations under the License.
-{% endcomment %}
--->
-
-This post gives insight into the major improvements in the Java implementation
-of vectors. We undertook this work over the last 10 weeks since the last Arrow
-release.
-
-## Design Goals
-
-1. Improved maintainability and extensibility
-2. Improved heap memory usage
-3. No performance overhead on hot code paths
-
-## Background
-
-### Improved maintainability and extensibility
-
-We use templates in several places for compile time Java code generation for
-different vector classes, readers, writers etc. Templates are helpful as the
-developers don't have to write a lot of duplicate code.
-
-However, we realized that over time some specific Java templates had become
-extremely complex, with giant if-else blocks, poor code indentation, and sparse
-documentation. All of this made it harder to extend these templates to add new
-functionality or improve the existing infrastructure.
-
-So we re-evaluated the use of templates for compile-time code generation and,
-in some places, decided against complex templates in favor of a small amount of
-duplicate code that is elegant, well documented, and extensible.
-
-### Improved heap usage
-
-We did extensive memory analysis downstream in [Dremio][1] where Arrow is used
-heavily for in-memory query execution on columnar data. The general conclusion
-was that Arrow's Java vector classes have non-negligible heap overhead and
-volume of objects was too high. There were places in code where we were
-creating objects unnecessarily and using structures that could be substituted
-with better alternatives.
-
-### No performance overhead on hot code paths
-
-Java vectors used delegation and abstraction heavily throughout the object
-hierarchy. The performance critical get/set methods of vectors went through a
-chain of function calls back and forth between different objects before doing
-meaningful work. We also evaluated the usage of branches in vector APIs and
-reimplemented some of them by avoiding branches completely.
-
-We took inspiration from how the Java memory code in `ArrowBuf` works. For all
-the performance critical methods, `ArrowBuf` bypasses all the netty object
-hierarchy, grabs the target virtual address and directly interacts with the
-memory.
-
-There were also cases where branches could be avoided altogether. For example,
-in the case of nullable vectors, we were doing multiple checks to confirm
-whether the value at a given position in the vector is null.
-
-## Our implementation approach
-
-- For scalars, the inheritance tree was simplified by writing different
-abstract base classes for fixed and variable width scalars.
-- The base classes contained all the common functionality across different
-types.
-- The individual subclasses implemented type specific APIs for fixed and
-variable width scalar vectors.
-- For the performance critical methods, all the work is done either in
-the vector class or corresponding ArrowBuf. There is no delegation to any
-internal object.
-- The mutator and accessor based access to vector APIs is removed. These
-objects led to unnecessary heap overhead and complicated the use of APIs.
-- Both scalar and complex vectors directly interact with underlying buffers
-that manage the offsets, data and validity. Earlier we were creating different
-inner vectors for each vector and delegating all the functionality to inner
-vectors. This introduced a lot of bugs in memory management, excessive heap
-overhead and performance penalty due to chain of delegations.
-- We reduced the number of vector classes by removing non-nullable vectors.
-In the new implementation, all vectors in Java are nullable in nature.
-
-[1]: https://www.dremio.com/
\ No newline at end of file
diff --git a/site/_posts/2018-03-22-0.9.0-release.md b/site/_posts/2018-03-22-0.9.0-release.md
deleted file mode 100644
index e8778d5..0000000
--- a/site/_posts/2018-03-22-0.9.0-release.md
+++ /dev/null
@@ -1,100 +0,0 @@
----
-layout: post
-title: "Apache Arrow 0.9.0 Release"
-date: "2018-03-22 00:00:00 -0400"
-author: wesm
-categories: [release]
----
-<!--
-{% comment %}
-Licensed to the Apache Software Foundation (ASF) under one or more
-contributor license agreements.  See the NOTICE file distributed with
-this work for additional information regarding copyright ownership.
-The ASF licenses this file to you under the Apache License, Version 2.0
-(the "License"); you may not use this file except in compliance with
-the License.  You may obtain a copy of the License at
-
-http://www.apache.org/licenses/LICENSE-2.0
-
-Unless required by applicable law or agreed to in writing, software
-distributed under the License is distributed on an "AS IS" BASIS,
-WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
-See the License for the specific language governing permissions and
-limitations under the License.
-{% endcomment %}
--->
-
-The Apache Arrow team is pleased to announce the 0.9.0 release. It is the
-product of over 3 months of development and includes [**260 resolved
-JIRAs**][1].
-
-While we made some backwards-incompatible columnar binary format changes in
-last December's 0.8.0 release, the 0.9.0 release is backwards-compatible with
-0.8.0. We will be working toward a 1.0.0 release this year, which will mark
-longer-term binary stability for the Arrow columnar format and metadata.
-
-See the [Install Page][2] to learn how to get the libraries for your
-platform. The [complete changelog][3] is also available.
-
-We discuss some highlights from the release and other project news in this
-post. Overall, this release focused more on bug fixes, compatibility, and
-stability than previous releases, which pushed more on new and expanded
-features.
-
-## New Arrow committers and PMC members
-
-Since the last release, we have added 2 new Arrow committers: [Brian
-Hulette][4] and [Robert Nishihara][5]. Additionally, [Phillip Cloud][6] and
-[Philipp Moritz][7] have been promoted from committer to PMC
-member. Congratulations and thank you for your contributions!
-
-## Plasma Object Store Improvements
-
-The Plasma Object Store now supports managing interprocess shared memory on
-CUDA-enabled GPUs. We are excited to see more GPU-related functionality develop
-in Apache Arrow, as this has become a key computing environment for scalable
-machine learning.
-
-## Python Improvements
-
-[Antoine Pitrou][8] has joined the Python development efforts and helped
-significantly this release with interoperability with built-in CPython data
-structures and NumPy structured data types.
-
-* New experimental support for reading Apache ORC files
-* `pyarrow.array` now accepts lists of tuples or Python dicts for creating
-  Arrow struct type arrays (see the sketch after this list).
-* NumPy structured dtypes (which are row/record-oriented) can be directly
-  converted to Arrow struct (column-oriented) arrays
-* Python 3.6 `pathlib` objects for file paths are now accepted in many file
-  APIs, including for Parquet files
-* Arrow integer arrays with nulls can now be converted to NumPy object arrays
-  with `None` values
-* New `pyarrow.foreign_buffer` API for interacting with memory blocks located
-  at particular memory addresses
-
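-The sketch below (an illustration added here, not part of the original release
-notes) shows the struct-array construction mentioned in the list above; the
-field names and values are arbitrary:
-
-```python
-import pyarrow
-
-ty = pyarrow.struct([('x', pyarrow.int64()), ('y', pyarrow.string())])
-
-# A list of Python dicts maps directly onto an Arrow struct array.
-arr = pyarrow.array([{'x': 1, 'y': 'a'}, {'x': 2, 'y': 'b'}], type=ty)
-print(arr.type)
-```
-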
-## Java Improvements
-
-Java now fully supports the `FixedSizeBinary` data type.
-
-## JavaScript Improvements
-
-The JavaScript library has been significantly refactored and expanded. We are
-making separate Apache releases (most recently `JS-0.3.1`) for JavaScript,
-which are being [published to NPM][9].
-
-## Upcoming Roadmap
-
-In the coming months, we will be working to move Apache Arrow closer to a 1.0.0
-release. We will also be discussing plans to develop native Arrow-based
-computational libraries within the project.
-
-[1]: https://issues.apache.org/jira/issues/?jql=project%20%3D%20ARROW%20AND%20status%20in%20(Resolved%2C%20Closed)%20AND%20fixVersion%20%3D%200.9.0
-[2]: https://arrow.apache.org/install
-[3]: https://arrow.apache.org/release/0.8.0.html
-[4]: https://github.com/theneuralbit
-[5]: https://github.com/robertnishihara
-[6]: https://github.com/cpcloud
-[7]: https://github.com/pcmoritz
-[8]: https://github.com/pitrou
-[9]: https://www.npmjs.com/package/apache-arrow
\ No newline at end of file
diff --git a/site/_posts/2018-03-22-go-code-donation.md b/site/_posts/2018-03-22-go-code-donation.md
deleted file mode 100644
index 41ddace..0000000
--- a/site/_posts/2018-03-22-go-code-donation.md
+++ /dev/null
@@ -1,82 +0,0 @@
----
-layout: post
-title: "A Native Go Library for Apache Arrow"
-date: "2018-03-22 00:00:00 -0400"
-author: pmc
-categories: [application]
----
-<!--
-{% comment %}
-Licensed to the Apache Software Foundation (ASF) under one or more
-contributor license agreements.  See the NOTICE file distributed with
-this work for additional information regarding copyright ownership.
-The ASF licenses this file to you under the Apache License, Version 2.0
-(the "License"); you may not use this file except in compliance with
-the License.  You may obtain a copy of the License at
-
-http://www.apache.org/licenses/LICENSE-2.0
-
-Unless required by applicable law or agreed to in writing, software
-distributed under the License is distributed on an "AS IS" BASIS,
-WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
-See the License for the specific language governing permissions and
-limitations under the License.
-{% endcomment %}
--->
-
-Since launching in early 2016, Apache Arrow has been growing fast. We have made
-nine major releases through the efforts of over 120 distinct contributors. The
-project’s scope has also expanded. We began by focusing on the development of
-the standardized in-memory columnar data format, which now serves as a pillar
-of the project. Since then, we have been growing into a more general
-cross-language platform for in-memory data analysis through new additions to
-the project like the [Plasma shared memory object store][1]. A primary goal of
-the project is to enable data system developers to process and move data fast.
-
-So far, we officially have developed native Arrow implementations in C++, Java,
-and JavaScript. We have created binding layers for the C++ libraries in C
-(using the GLib libraries) and Python. We have also seen efforts to develop
-interfaces to the Arrow C++ libraries in Go, Lua, Ruby, and Rust. While binding
-layers serve many purposes, there can be benefits to native implementations,
-and so we’ve been keen to see future work on native implementations in growing
-systems languages like Go and Rust.
-
-This past October, engineers [Stuart Carnie][2], [Nathaniel Cook][3], and
-[Chris Goller][4], employees of [InfluxData][5], began developing a native
-[Go language implementation][6] of the Apache Arrow in-memory columnar format for
-use in Go-based database systems like InfluxDB. We are excited to announce that
-InfluxData has [donated this native Go implementation to the Apache Arrow
-project][10], where it will continue to be developed. This work features
-low-level integration with the Go runtime and native support for SIMD
-instruction sets. We are looking forward to working more closely with the Go
-community on solving in-memory analytics and data interoperability problems.
-
-<div align="center">
-<img src="{{ site.base-url }}/img/native_go_implementation.png"
-     alt="Apache Arrow implementations and bindings"
-     width="60%" class="img-responsive">
-</div>
-
-One of the mantras in [The Apache Software Foundation][7] is "Community over
-Code". By building an open and collaborative development community across many
-programming language ecosystems, we will be able to develop better and
-longer-lived solutions to the systems problems faced by data developers.
-
-We are excited for what the future holds for the Apache Arrow project. Adding
-first-class support for a popular systems programming language like Go is an
-important step along the way. We welcome others from the Go community to get
-involved in the project. We also welcome others who wish to explore building
-Arrow support for other programming languages not yet represented. Learn more
-at [https://arrow.apache.org][9] and join the mailing list
-[dev@arrow.apache.org][8].
-
-[1]: http://arrow.apache.org/blog/2017/08/16/0.6.0-release/
-[2]: https://github.com/stuartcarnie
-[3]: https://github.com/nathanielc
-[4]: https://github.com/goller
-[5]: https://influxdata.com
-[6]: https://github.com/influxdata/arrow
-[7]: https://www.apache.org
-[8]: https://lists.apache.org/list.html?dev@arrow.apache.org
-[9]: https://arrow.apache.org
-[10]: https://www.businesswire.com/news/home/20180322005393/en/InfluxData-Announces-Language-Implementation-Contribution-Apache-Arrow
\ No newline at end of file
diff --git a/site/_posts/2018-07-20-jemalloc.md b/site/_posts/2018-07-20-jemalloc.md
deleted file mode 100644
index ea47665..0000000
--- a/site/_posts/2018-07-20-jemalloc.md
+++ /dev/null
@@ -1,141 +0,0 @@
----
-layout: post
-title: "Faster, scalable memory allocations in Apache Arrow with jemalloc"
-date: "2018-07-20 12:00:00 +0100"
-author: uwe
-categories: [release]
----
-<!--
-{% comment %}
-Licensed to the Apache Software Foundation (ASF) under one or more
-contributor license agreements.  See the NOTICE file distributed with
-this work for additional information regarding copyright ownership.
-The ASF licenses this file to you under the Apache License, Version 2.0
-(the "License"); you may not use this file except in compliance with
-the License.  You may obtain a copy of the License at
-
-http://www.apache.org/licenses/LICENSE-2.0
-
-Unless required by applicable law or agreed to in writing, software
-distributed under the License is distributed on an "AS IS" BASIS,
-WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
-See the License for the specific language governing permissions and
-limitations under the License.
-{% endcomment %}
--->
-
-With the release of the 0.9 version of Apache Arrow, we have switched our
-default allocator for array buffers from the system allocator to jemalloc on
-OSX and Linux. This applies to the C++/GLib/Python implementations of Arrow.
-Changing the default allocator is normally done to avoid problems that occur
-with many small, frequent (de)allocations. In contrast, in Arrow we normally
-deal with large in-memory datasets. While jemalloc provides good strategies for
-[avoiding RAM fragmentation for allocations that are smaller than a memory page
-(4 KB)][1], it also provides functionality that improves performance on
-allocations that span several memory pages.
-
-Outside of Apache Arrow, [jemalloc powers the infrastructure of Facebook][2]
-(this is also where most of its development happens). It is also used as the
-[default allocator in Rust][3], and it helps [Redis reduce the memory
-fragmentation on Linux][4] (see the "Allocator" section).
-
-One allocation requirement that is special to Arrow is that memory should be
-64-byte aligned, so that we can get the most performance out of SIMD
-instruction sets like AVX. While most modern SIMD instructions also work on
-unaligned memory, their performance is much better on aligned memory. To get the
-best performance for our analytical applications, we want all memory to be
-allocated such that SIMD performance is maximized.
-
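-A quick way to observe this alignment from Python (a sketch added here for
-illustration; `pyarrow.allocate_buffer` draws from the default memory pool) is
-to check the address of a freshly allocated buffer:
-
-```python
-import pyarrow
-
-buf = pyarrow.allocate_buffer(100)
-# Buffers from the default pool are allocated on 64-byte boundaries.
-print(buf.address % 64 == 0)
-```
-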
-For aligned allocations, the standard C and POSIX APIs only provide
-`aligned_alloc(size_t alignment, size_t size)` and
-`posix_memalign(void **ptr, size_t alignment, size_t size)` to allocate
-memory with a chosen alignment. But neither of them caters for expansion of an
-existing allocation. While the `realloc` function can often expand allocations
-without moving them physically, it does not guarantee that the alignment is
-kept in the case where the allocation is moved.
-
-When Arrow was built without jemalloc enabled, this resulted in copying the
-data on each expansion of an allocation. To reduce the number
-of memory copies, we use jemalloc's `*allocx()`-APIs to create, modify and free
-aligned allocations. One of the typical tasks where this gives us a major
-speedup is on the incremental construction of an Arrow table that consists of
-several columns. We often don't know the size of the table in advance and need
-to expand our allocations as the data is loaded.
-
-To incrementally build a vector using memory expansion of a factor of 2, we
-would use the following C-code with the standard POSIX APIs:
-
-```c
-size_t size = 128 * 1024;
-void* ptr = aligned_alloc(64, size);
-for (int i = 0; i < 10; i++) {
-  size_t new_size = size * 2;
-  void* ptr2 = aligned_alloc(64, new_size);
-  memcpy(ptr2, ptr, size);
-  free(ptr);
-  ptr = ptr2;
-  size = new_size;
-}
-free(ptr);
-```
-
-With jemalloc's special APIs, we are able to omit the explicit call to `memcpy`.
-Where an expansion cannot be done in place, the copy is still performed
-internally by the allocator, but it is no longer needed on every occasion. This
-simplifies our user code to:
-
-```c
-size_t size = 128 * 1024;
-void* ptr = mallocx(size, MALLOCX_ALIGN(64));
-for (int i = 0; i < 10; i++) {
-  size *= 2;
-  ptr = rallocx(ptr, size, MALLOCX_ALIGN(64));
-}
-dallocx(ptr, MALLOCX_ALIGN(64));
-```
-
-To see the real world benefits of using jemalloc, we look at the benchmarks in
-Arrow C++. There we have modeled a typical use case of incrementally building up
-an array of primitive values. For the build-up of the array, we don't know the
-number of elements in the final array so we need to continuously expand the
-memory region in which the data is stored. The code for this benchmark is part
-of the `builder-benchmark` in the Arrow C++ sources as
-`BuildPrimitiveArrayNoNulls`.
-
-Runtimes without `jemalloc`:
-
-```
-BM_BuildPrimitiveArrayNoNulls/repeats:3                 636726 us   804.114MB/s
-BM_BuildPrimitiveArrayNoNulls/repeats:3                 621345 us   824.019MB/s
-BM_BuildPrimitiveArrayNoNulls/repeats:3                 625008 us    819.19MB/s
-BM_BuildPrimitiveArrayNoNulls/repeats:3_mean            627693 us   815.774MB/s
-BM_BuildPrimitiveArrayNoNulls/repeats:3_median          625008 us    819.19MB/s
-BM_BuildPrimitiveArrayNoNulls/repeats:3_stddev            8034 us   10.3829MB/s
-```
-
-Runtimes with `jemalloc`:
-
-```
-BM_BuildPrimitiveArrayNoNulls/repeats:3                 630881 us   811.563MB/s
-BM_BuildPrimitiveArrayNoNulls/repeats:3                 352891 us   1.41687GB/s
-BM_BuildPrimitiveArrayNoNulls/repeats:3                 351039 us   1.42434GB/s
-BM_BuildPrimitiveArrayNoNulls/repeats:3_mean            444937 us   1.21125GB/s
-BM_BuildPrimitiveArrayNoNulls/repeats:3_median          352891 us   1.41687GB/s
-BM_BuildPrimitiveArrayNoNulls/repeats:3_stddev          161035 us   371.335MB/s
-```
-
-The benchmark was run three times for each configuration to see the performance
-differences. The first run in each configuration yielded the same performance,
-but in all subsequent runs the version using jemalloc was about twice as fast. In
-these cases, the memory region that was used for constructing the array could be
-expanded in place without moving the data around. This was possible as there
-were memory pages assigned to the process that were unused but not reclaimed by
-the operating system. Without `jemalloc`, we cannot make use of them, simply
-because the default allocator has no API that provides aligned
-reallocation.
-
-[1]: https://zapier.com/engineering/celery-python-jemalloc/
-[2]: https://www.facebook.com/notes/facebook-engineering/scalable-memory-allocation-using-jemalloc/480222803919/
-[3]: https://github.com/rust-lang/rust/pull/6895
-[4]: http://download.redis.io/redis-stable/README.md
diff --git a/site/_posts/2018-08-07-0.10.0-release.md b/site/_posts/2018-08-07-0.10.0-release.md
deleted file mode 100644
index 471f865..0000000
--- a/site/_posts/2018-08-07-0.10.0-release.md
+++ /dev/null
@@ -1,65 +0,0 @@
----
-layout: post
-title: "Apache Arrow 0.10.0 Release"
-date: "2018-08-07 00:00:00 -0400"
-author: wesm
-categories: [release]
----
-<!--
-{% comment %}
-Licensed to the Apache Software Foundation (ASF) under one or more
-contributor license agreements.  See the NOTICE file distributed with
-this work for additional information regarding copyright ownership.
-The ASF licenses this file to you under the Apache License, Version 2.0
-(the "License"); you may not use this file except in compliance with
-the License.  You may obtain a copy of the License at
-
-http://www.apache.org/licenses/LICENSE-2.0
-
-Unless required by applicable law or agreed to in writing, software
-distributed under the License is distributed on an "AS IS" BASIS,
-WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
-See the License for the specific language governing permissions and
-limitations under the License.
-{% endcomment %}
--->
-
-The Apache Arrow team is pleased to announce the 0.10.0 release. It is the
-product of over 4 months of development and includes [**470 resolved
-issues**][1]. It is the largest release so far in the project's history. 90
-individuals contributed to this release.
-
-See the [Install Page][2] to learn how to get the libraries for your
-platform. The [complete changelog][3] is also available.
-
-We discuss some highlights from the release and other project news in this
-post.
-
-## Official Binary Packages and Packaging Automation
-
-One of the largest projects in this release cycle was automating our build and
-packaging tooling to be able to easily and reproducibly create a [comprehensive
-set of binary artifacts][4] which have been approved and released by the Arrow
-PMC. We developed a tool called **Crossbow** which uses Appveyor and Travis CI
-to build each of the different supported packages on all 3 platforms (Linux,
-macOS, and Windows). As a result of our efforts, we should be able to make more
-frequent Arrow releases. This work was led by Phillip Cloud, Kouhei Sutou, and
-Krisztián Szűcs. Bravo!
-
-## New Programming Languages: Go, Ruby, Rust
-
-This release also adds 3 new programming languages to the project: Go, Ruby,
-and Rust. Together with C, C++, Java, JavaScript, and Python, **we now have
-some level of support for 8 programming languages**.
-
-## Upcoming Roadmap
-
-In the coming months, we will be working to move Apache Arrow closer to a 1.0.0
-release. We will continue to grow new features, improve performance and
-stability, and expand support for currently supported and new programming
-languages.
-
-[1]: https://issues.apache.org/jira/issues/?jql=project%20%3D%20ARROW%20AND%20status%20in%20(Resolved%2C%20Closed)%20AND%20fixVersion%20%3D%200.10.0
-[2]: https://arrow.apache.org/install
-[3]: https://arrow.apache.org/release/0.10.0.html
-[4]: https://www.apache.org/dyn/closer.cgi/arrow/arrow-0.10.0/binaries
\ No newline at end of file
diff --git a/site/_posts/2018-10-09-0.11.0-release.md b/site/_posts/2018-10-09-0.11.0-release.md
deleted file mode 100644
index 3825ce1..0000000
--- a/site/_posts/2018-10-09-0.11.0-release.md
+++ /dev/null
@@ -1,119 +0,0 @@
----
-layout: post
-title: "Apache Arrow 0.11.0 Release"
-date: "2018-10-09 00:00:00 -0400"
-author: wesm
-categories: [release]
----
-<!--
-{% comment %}
-Licensed to the Apache Software Foundation (ASF) under one or more
-contributor license agreements.  See the NOTICE file distributed with
-this work for additional information regarding copyright ownership.
-The ASF licenses this file to you under the Apache License, Version 2.0
-(the "License"); you may not use this file except in compliance with
-the License.  You may obtain a copy of the License at
-
-http://www.apache.org/licenses/LICENSE-2.0
-
-Unless required by applicable law or agreed to in writing, software
-distributed under the License is distributed on an "AS IS" BASIS,
-WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
-See the License for the specific language governing permissions and
-limitations under the License.
-{% endcomment %}
--->
-
-The Apache Arrow team is pleased to announce the 0.11.0 release. It is the
-product of 2 months of development and includes [**287 resolved
-issues**][1].
-
-See the [Install Page][2] to learn how to get the libraries for your
-platform. The [complete changelog][3] is also available.
-
-We discuss some highlights from the release and other project news in this
-post.
-
-## Arrow Flight RPC and Messaging Framework
-
-We are developing a new Arrow-native RPC framework, Arrow Flight, based on
-[gRPC][11] for high performance Arrow-based messaging. Through low-level
-extensions to gRPC's internal memory management, we are able to avoid expensive
-parsing when receiving datasets over the wire, unlocking unprecedented levels
-of performance in moving datasets from one machine to another. We will be
-writing more about Flight on the Arrow blog in the future.
-
-Prototype implementations are available in Java and C++, and we will be focused
-in the coming months on hardening the Flight RPC framework for enterprise-grade
-production use cases.
-
-## Parquet and Arrow C++ communities joining forces
-
-After discussion over the last year, the Apache Arrow and Apache Parquet C++
-communities decided to merge the Parquet C++ codebase into the Arrow C++
-codebase and work together in a "monorepo" structure. This should result in
-better developer productivity in core Parquet work as well as in Arrow
-integration.
-
-Before this codebase merge, we had a circular dependency between the Arrow and
-Parquet codebases, since the Parquet C++ library is used in the Arrow Python
-library.
-
-## Gandiva LLVM Expression Compiler donation
-
-[Dremio Corporation][5] has donated the [Gandiva][6] LLVM expression compiler
-to Apache Arrow. We will be working on cross-platform builds, packaging, and
-language bindings (e.g. in Python) for Gandiva in the upcoming 0.12 release and
-beyond. We will write more about Gandiva in the future.
-
-## Parquet C GLib Bindings Donation
-
-PMC member [Kouhei Sutou][7] has donated GLib bindings for the Parquet C++
-libraries, which are designed to work together with the existing Arrow GLib
-bindings.
-
-## C++ CSV Reader Project
-
-We have begun developing a general purpose multithreaded CSV file parser in
-C++. The purpose of this library is to parse and convert comma-separated text
-files into Arrow columnar record batches as efficiently as possible. The
-prototype version features Python bindings, and it can be used from any
-language that can use the C++ libraries (including C, R, and Ruby).
-
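-A minimal sketch of the Python bindings (added here for illustration; the
-sample CSV bytes are made up, and it assumes the `pyarrow.csv` module shipped
-around this release) parses in-memory CSV data into an Arrow table:
-
-```python
-import io
-
-import pyarrow.csv
-
-# Parse a small CSV payload into an Arrow Table.
-data = io.BytesIO(b'a,b\n1,x\n2,y\n')
-table = pyarrow.csv.read_csv(data)
-print(table.to_pydict())
-```
-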
-## New MATLAB bindings
-
-[The MathWorks][12] has contributed an initial MEX file binding to the Arrow
-C++ libraries. Initially, it is possible to read Arrow-based Feather files in
-MATLAB. We are looking forward to seeing more developments for MATLAB users.
-
-## R Library in Development
-
-The community has begun implementing [R language bindings and interoperability][8]
-with the Arrow C++ libraries. This will include support for zero-copy shared
-memory IPC and other tools needed to improve R integration with Apache Spark
-and more.
-
-## Support for CUDA-based GPUs in Python
-
-This release includes Python bindings to the Arrow CUDA integration C++
-library. This work is targeting interoperability with [Numba][9] and the [GPU
-Open Analytics Initiative][10].
-
-## Upcoming Roadmap
-
-In the coming months, we will continue to make progress on many fronts, with
-Gandiva packaging, expanded language support (especially in R), and improved
-data access (e.g. CSV, Parquet files) in focus.
-
-[1]: https://issues.apache.org/jira/issues/?jql=project%20%3D%20ARROW%20AND%20status%20in%20(Resolved%2C%20Closed)%20AND%20fixVersion%20%3D%200.11.0
-[2]: https://arrow.apache.org/install
-[3]: https://arrow.apache.org/release/0.11.0.html
-[4]: https://www.apache.org/dyn/closer.cgi/arrow/arrow-0.11.0/binaries
-[5]: http://dremio.com
-[6]: http://github.com/dremio/gandiva
-[7]: https://github.com/kou
-[8]: https://github.com/apache/arrow/tree/master/r
-[9]: https://github.com/numba/numba
-[10]: http://gpuopenanalytics.com/
-[11]: http://grpc.io
-[12]: https://mathworks.com
\ No newline at end of file
diff --git a/site/_posts/2018-12-05-gandiva-donation.md b/site/_posts/2018-12-05-gandiva-donation.md
deleted file mode 100644
index ab12e4d..0000000
--- a/site/_posts/2018-12-05-gandiva-donation.md
+++ /dev/null
@@ -1,93 +0,0 @@
----
-layout: post
-title: "Gandiva: A LLVM-based Analytical Expression Compiler for Apache Arrow"
-date: "2018-12-05 00:00:00 -0500"
-author: jacques
-categories: [application]
----
-<!--
-{% comment %}
-Licensed to the Apache Software Foundation (ASF) under one or more
-contributor license agreements.  See the NOTICE file distributed with
-this work for additional information regarding copyright ownership.
-The ASF licenses this file to you under the Apache License, Version 2.0
-(the "License"); you may not use this file except in compliance with
-the License.  You may obtain a copy of the License at
-
-http://www.apache.org/licenses/LICENSE-2.0
-
-Unless required by applicable law or agreed to in writing, software
-distributed under the License is distributed on an "AS IS" BASIS,
-WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
-See the License for the specific language governing permissions and
-limitations under the License.
-{% endcomment %}
--->
-
-Today we're happy to announce that the Gandiva Initiative for Apache Arrow, an
-LLVM-based execution kernel, is now part of the Apache Arrow project. Gandiva
-was kindly donated by [Dremio](https://www.dremio.com/), where it was
-originally developed and open-sourced. Gandiva extends Arrow's capabilities to
-provide high performance analytical execution and is composed of two main
-components:
-
-* A runtime expression compiler leveraging LLVM
-
-* A high performance execution environment
-
-Gandiva works as follows: applications submit an expression tree to the
-compiler, built in a language agnostic protobuf-based expression
-representation. From there, Gandiva then compiles the expression tree to native
-code for the current runtime environment and hardware. Once compiled, the
-Gandiva execution kernel then consumes and produces Arrow columnar batches. The
-generated code is highly optimized for parallel processing on modern CPUs. For
-example, on AVX-128 processors Gandiva can process 8 pairs of 2-byte values in
-a single vectorized operation, and on AVX-512 processors Gandiva can process 4x
-as many values in a single operation. Gandiva is built from the ground up to
-understand Arrow's in-memory representation and optimize processing against it.
-
-While Gandiva is just starting within the Arrow community, it already supports
-hundreds of [expressions][1], ranging from math functions to case
-statements. Gandiva was built as a standalone C++ library on top of the core
-Apache Arrow codebase and was donated with C++ and Java expression construction
-and execution APIs for projection and filtering operations. The Arrow community
-is already looking to expand Gandiva's capabilities. This will include
-incorporating more operations and supporting many new language bindings. As an
-example, multiple community members are already actively building new language
-bindings that allow use of Gandiva within Python and Ruby.
-
-While young within the Arrow community, Gandiva is already shipped and used in
-production by many Dremio customers as part of Dremio's execution
-engine. Experiments have demonstrated a [70x performance improvement][2] on many
-SQL queries. We expect to see similar performance gains for many other projects
-that leverage Arrow.
-
-The Arrow community is working to ship the first formal Apache Arrow release
-that includes Gandiva, and we hope this will be available within the next
-couple months. This should make it much easier for the broader analytics and
-data science development communities to leverage runtime code generation for
-high-performance data processing in a variety of contexts and projects.
-
-We started the Arrow project a couple of years ago with the objective of
-creating an industry-standard columnar in-memory data representation for
-analytics. Within this short period of time, Apache Arrow has been adopted by
-dozens of both open source and commercial software products. Some key examples
-include technologies such as Apache Spark, Pandas, Nvidia RAPIDS, Dremio, and
-InfluxDB. This success has driven Arrow to now be downloaded more than 1
-million times per month. Over 200 developers have already contributed to Apache
-Arrow. If you're interested in contributing to Gandiva or any other part of the
-Apache Arrow project, feel free to reach out on the mailing list and join us!
-
-For additional technical details on Gandiva, you can check out some of the
-following resources:
-
-* [https://www.dremio.com/announcing-gandiva-initiative-for-apache-arrow/](https://www.dremio.com/announcing-gandiva-initiative-for-apache-arrow/)
-
-* [https://www.dremio.com/gandiva-performance-improvements-production-query/](https://www.dremio.com/gandiva-performance-improvements-production-query/)
-
-* [https://www.dremio.com/webinars/vectorized-query-processing-apache-arrow/](https://www.dremio.com/webinars/vectorized-query-processing-apache-arrow/)
-
-* [https://www.dremio.com/adding-a-user-define-function-to-gandiva/](https://www.dremio.com/adding-a-user-define-function-to-gandiva/)
-
-[1]: https://github.com/apache/arrow/blob/master/cpp/src/gandiva/function_registry.cc
-[2]: https://www.dremio.com/gandiva-performance-improvements-production-query/
\ No newline at end of file
diff --git a/site/_posts/2019-01-21-0.12.0-release.md b/site/_posts/2019-01-21-0.12.0-release.md
deleted file mode 100644
index eef865c..0000000
--- a/site/_posts/2019-01-21-0.12.0-release.md
+++ /dev/null
@@ -1,184 +0,0 @@
----
-layout: post
-title: "Apache Arrow 0.12.0 Release"
-date: "2019-01-21 07:00:00 -0600"
-author: wesm
-categories: [release]
----
-<!--
-{% comment %}
-Licensed to the Apache Software Foundation (ASF) under one or more
-contributor license agreements.  See the NOTICE file distributed with
-this work for additional information regarding copyright ownership.
-The ASF licenses this file to you under the Apache License, Version 2.0
-(the "License"); you may not use this file except in compliance with
-the License.  You may obtain a copy of the License at
-
-http://www.apache.org/licenses/LICENSE-2.0
-
-Unless required by applicable law or agreed to in writing, software
-distributed under the License is distributed on an "AS IS" BASIS,
-WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
-See the License for the specific language governing permissions and
-limitations under the License.
-{% endcomment %}
--->
-
-The Apache Arrow team is pleased to announce the 0.12.0 release. This is the
-largest release yet in the project, covering 3 months of development work and
-including [**614 resolved issues**][1] from [**77 distinct contributors**][22].
-
-See the [Install Page][2] to learn how to get the libraries for your
-platform. The [complete changelog][3] is also available.
-
-It's a huge release, but we'll give some brief highlights and news from the
-project to help guide you to the parts of the project that may be of interest.
-
-## New committers and PMC member
-
-The Arrow team is growing! Since the 0.11.0 release we have added 3 new
-committers:
-
-* [Sebastien Binet][13], who has mainly worked on the Go implementation
-* [Romain Francois][14], who has mainly worked on the R implementation
-* [Yosuke Shiro][15], who has mainly worked on the GLib (C) and Ruby
-  implementations
-
-We are also pleased to announce that [Krisztián Szűcs][16] has been promoted
-from committer to PMC (Project Management Committee) member.
-
-Thank you for all your contributions!
-
-## Code donations
-
-Since the last release, we have received 3 code donations into the Apache
-project.
-
-* A [native C# .NET library][17] donated by Feyen Zylstra LLC.
-* A [Ruby library for Parquet files][18] which uses the existing GLib bindings to
-  the C++ Parquet library.
-* A [native Rust Parquet library][19]
-
-We are excited to continue to grow the Apache Arrow development community.
-
-## Combined project-level documentation
-
-Since the last release, we have merged the Python and C++ documentation to
-create a combined project-wide documentation site:
-https://arrow.apache.org/docs. There is now some prose documentation about many
-parts of the C++ library. We intend to keep adding documentation for other
-parts of Apache Arrow to this site.
-
-## Packages
-
-We have started providing official APT and Yum repositories for C++ and
-GLib (C). See the [install document][23] for details.
-
-## C++ notes
-
-Much of the C++ development work the last 3 months concerned internal code
-refactoring and performance improvements. Some user-visible highlights of note:
-
-* Experimental support for [in-memory sparse tensors (or ndarrays)][21], with
-  support for zero-copy IPC
-* Support for building on Alpine Linux
-* Significantly improved hash table utilities, with improved hash table
-  performance in many parts of the library
-* IO library improvements for both read and write buffering
-* A fast [trie implementation][20] for string searching
-* Many improvements to the parallel CSV reader in performance and features. See
-  the changelog.
-
-Since the LLVM-based Gandiva expression compiler was donated to Apache Arrow
-during the last release cycle, development there has been moving along. We
-expect to have Windows support for Gandiva and to ship this in downstream
-packages (like Python) in the 0.13 release time frame.
-
-## Go notes
-
-The Arrow Go development team has been expanding. The Go library has gained
-support for many missing features from the columnar format as well as semantic
-constructs like chunked arrays and tables that are used heavily in the C++
-project.
-
-## GLib and Ruby notes
-
-Development of the GLib-based C bindings and corresponding Ruby interfaces has
-advanced in lock-step with the C++, Python, and R libraries. In this release,
-there are many new features in C and Ruby:
-
-* Compressed file read/write support
-* Support for using the C++ parallel CSV reader
-* Feather file support
-* Gandiva bindings
-* Plasma bindings
-
-## Python notes
-
-We fixed a ton of bugs and made many improvements throughout the Python
-project. Some highlights from the Python side include:
-
-* Python 3.7 support: wheels and conda packages are now available for Python
-  3.7
-* Substantially improved memory use when converting string types to pandas
-  format, including when reading Parquet files. Parquet users should notice
-  significantly lower memory use in common use cases
-* Support for reading and writing compressed files, can be used for CSV files,
-  IPC, or any other form of IO
-* The new `pyarrow.input_stream` and `pyarrow.output_stream` functions support
-  read and write buffering. This is analogous to `BufferedIOBase` from the
-  Python standard library, but the internals are implemented natively in C++
-  (see the sketch after this list).
-* Gandiva (LLVM expression compiler) bindings, though not yet available in
-  pip/conda. Look for this in 0.13.0.
-* Many improvements to Arrow CUDA integration, including interoperability with
-  Numba
-
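-The sketch below (added here for illustration; the file name and buffer size
-are arbitrary) shows the new buffered stream functions from the list above:
-
-```python
-import pyarrow
-
-# Buffered writes through the native C++ implementation...
-with pyarrow.output_stream('data.bin', buffer_size=64 * 1024) as out:
-    out.write(b'hello arrow')
-
-# ...and buffered reads through the same API.
-with pyarrow.input_stream('data.bin', buffer_size=64 * 1024) as source:
-    print(source.read())
-```
-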
-## R notes
-
-The R library made huge progress in 0.12, with work led by new committer Romain
-Francois. The R project's features are not far behind the Python library, and
-we are hoping to be able to make the R library available to CRAN users for use
-with Apache Spark or for reading and writing Parquet files over the next
-quarter.
-
-Users of the `feather` R library will see significant speed increases in many
-cases when reading Feather files with the new Arrow R library.
-
-## Rust notes
-
-Rust development has been active over the last 3 months; see the changelog for details.
-
-A native Rust Parquet implementation was just donated to the project, and the community
-intends to provide a similar level of functionality for reading and writing
-Parquet files using the Arrow in-memory columnar format as an intermediary.
-
-## Upcoming Roadmap, Outlook for 2019
-
-Apache Arrow has become a large, diverse open source project. It is now being
-used in dozens of downstream open source and commercial projects. Work will be
-proceeding in many areas in 2019:
-
-* Development of in-memory query execution engines (e.g. in C++, Rust)
-* Expanded support for reading and writing the Apache Parquet format, and other
-  common data formats like Apache Avro, CSV, JSON, and Apache ORC.
-* New Flight RPC system for fast messaging of Arrow datasets
-* Expanded support in existing programming languages
-* New programming language bindings or native implementations
-
-It promises to be an exciting 2019. We look forward to having you involved in
-the development community.
-
-[1]: https://issues.apache.org/jira/issues/?jql=project%20%3D%20ARROW%20AND%20status%20in%20(Resolved%2C%20Closed)%20AND%20fixVersion%20%3D%200.12.0
-[2]: https://arrow.apache.org/install
-[3]: https://arrow.apache.org/release/0.12.0.html
-[13]: https://github.com/sbinet
-[14]: https://github.com/romainfrancois
-[15]: https://github.com/shiro615
-[16]: https://github.com/kszucs
-[17]: http://incubator.apache.org/ip-clearance/arrow-csharp-library.html
-[18]: http://incubator.apache.org/ip-clearance/arrow-parquet-ruby.html
-[19]: http://incubator.apache.org/ip-clearance/arrow-parquet-rust.html
-[20]: https://github.com/apache/arrow/blob/master/cpp/src/arrow/util/trie.h
-[21]: https://github.com/apache/arrow/blob/master/cpp/src/arrow/sparse_tensor.h
-[22]: https://arrow.apache.org/release/0.12.0.html#contributors
-[23]: https://arrow.apache.org/install/
diff --git a/site/_posts/2019-01-25-r-spark-improvements.md b/site/_posts/2019-01-25-r-spark-improvements.md
deleted file mode 100644
index a06d7aa..0000000
--- a/site/_posts/2019-01-25-r-spark-improvements.md
+++ /dev/null
@@ -1,198 +0,0 @@
----
-layout: post
-title: "Speeding up R and Apache Spark using Apache Arrow"
-date: "2019-01-25 00:00:00 -0600"
-author: Javier Luraschi
-categories: [application]
----
-<!--
-{% comment %}
-Licensed to the Apache Software Foundation (ASF) under one or more
-contributor license agreements.  See the NOTICE file distributed with
-this work for additional information regarding copyright ownership.
-The ASF licenses this file to you under the Apache License, Version 2.0
-(the "License"); you may not use this file except in compliance with
-the License.  You may obtain a copy of the License at
-
-http://www.apache.org/licenses/LICENSE-2.0
-
-Unless required by applicable law or agreed to in writing, software
-distributed under the License is distributed on an "AS IS" BASIS,
-WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
-See the License for the specific language governing permissions and
-limitations under the License.
-{% endcomment %}
--->
-
-*[Javier Luraschi][1] is a software engineer at [RStudio][2]*
-
-Support for Apache Arrow in Apache Spark with R is currently under active
-development in the [sparklyr][3] and [SparkR][4] projects. This post explores early, yet
-promising, performance improvements achieved when using R with [Apache Spark][5],
-Arrow and `sparklyr`.
-
-# Setup
-
-Since this work is under active development, install `sparklyr` and
-`arrow` from GitHub as follows:
-
-```r
-devtools::install_github("apache/arrow", subdir = "r", ref = "apache-arrow-0.12.0")
-devtools::install_github("rstudio/sparklyr", ref = "apache-arrow-0.12.0")
-```
-
-In this benchmark, we will use [dplyr][6], but similar improvements can
-be expected from using [DBI][7] or [Spark DataFrames][8] in `sparklyr`.
-The local Spark connection and a data frame with 10M numeric rows were
-initialized as follows:
-
-```r
-library(sparklyr)
-library(dplyr)
-
-sc <- spark_connect(master = "local", config = list("sparklyr.shell.driver-memory" = "6g"))
-data <- data.frame(y = runif(10^7, 0, 1))
-```
-
-# Copying
-
-Currently, copying data to Spark using `sparklyr` is performed by persisting
-data on disk from R and reading it back from Spark. This was meant to be used
-for small datasets, since there are better tools to transfer data into
-distributed storage systems. Nevertheless, many users have requested support for
-transferring more data at fast speeds into Spark.
-
-Using `arrow` with `sparklyr`, we can transfer data directly from R to
-Spark without having to serialize this data in R or persist it on disk.
-
-The following example copies 10M rows from R into Spark using `sparklyr`
-with and without `arrow`; there is close to a 16x improvement using `arrow`.
-
-This benchmark uses the [microbenchmark][9] R package, which runs code
-multiple times, provides stats on total execution time, and plots each
-execution time to understand the distribution over each iteration.
-
-```r
-microbenchmark::microbenchmark(
-  setup = library(arrow),
-  arrow_on = {
-    sparklyr_df <<- copy_to(sc, data, overwrite = T)
-    count(sparklyr_df) %>% collect()
-  },
-  arrow_off = {
-    if ("arrow" %in% .packages()) detach("package:arrow")
-    sparklyr_df <<- copy_to(sc, data, overwrite = T)
-    count(sparklyr_df) %>% collect()
-  },
-  times = 10
-) %T>% print() %>% ggplot2::autoplot()
-```
-```
- Unit: seconds
-      expr       min        lq       mean    median         uq       max neval
-  arrow_on  3.011515  4.250025   7.257739  7.273011   8.974331  14.23325    10
- arrow_off 50.051947 68.523081 119.946947 71.898908 138.743419 390.44028    10
-```
-
-<div align="center">
-<img src="{{ site.baseurl }}/img/arrow-r-spark-copying.png"
-     alt="Copying data with R into Spark with and without Arrow"
-     width="60%" class="img-responsive">
-</div>
-
-# Collecting
-
-Similarly, `arrow` with `sparklyr` can now avoid deserializing data in R
-while collecting data from Spark into R. These improvements are not as
-significant as for copying data, since `sparklyr` already collects data in
-columnar format.
-
-The following benchmark collects 10M rows from Spark into R and shows that
-`arrow` can bring a 3x improvement.
-
-```r
-microbenchmark::microbenchmark(
-  setup = library(arrow),
-  arrow_on = {
-    collect(sparklyr_df)
-  },
-  arrow_off = {
-    if ("arrow" %in% .packages()) detach("package:arrow")
-    collect(sparklyr_df)
-  },
-  times = 10
-) %T>% print() %>% ggplot2::autoplot()
-```
-```
-Unit: seconds
-      expr      min        lq      mean    median        uq       max neval
-  arrow_on 4.520593  5.609812  6.154509  5.928099  6.217447  9.432221    10
- arrow_off 7.882841 13.358113 16.670708 16.127704 21.051382 24.373331    10
-```
-
-<div align="center">
-<img src="{{ site.baseurl }}/img/arrow-r-spark-collecting.png"
-     alt="Collecting data with R from Spark with and without Arrow"
-     width="60%" class="img-responsive">
-</div>
-
-# Transforming
-
-Today, custom transformations of data using R functions are performed in
-`sparklyr` by moving data in row format from Spark into an R process through a
-socket connection. Transferring data in row format is inefficient, since
-multiple data types need to be deserialized over each row, after which the data
-gets converted to columnar format (R was originally designed to use columnar
-data). Once R finishes this computation, the data is again converted to row
-format, serialized row by row, and then sent back to Spark over the socket
-connection.
-
-Adding support for `arrow` in `sparklyr` makes Spark perform the row-format to
-columnar-format conversion in parallel in Spark itself. Data is then transferred
-through the socket, but no custom serialization takes place. All the R process
-needs to do is copy this data from the socket into its heap, transform it, and
-copy it back to the socket connection.
-
-The following example transforms 100K rows with and without `arrow` enabled;
-`arrow` makes transformations with R functions close to 41x faster.
-
-```r
-microbenchmark::microbenchmark(
-  setup = library(arrow),
-  arrow_on = {
-    sample_n(sparklyr_df, 10^5) %>% spark_apply(~ .x / 2) %>% count()
-  },
-  arrow_off = {
-    if ("arrow" %in% .packages()) detach("package:arrow")
-    sample_n(sparklyr_df, 10^5) %>% spark_apply(~ .x / 2) %>% count()
-  },
-  times = 10
-) %T>% print() %>% ggplot2::autoplot()
-```
-```
-Unit: seconds
-      expr        min         lq       mean     median         uq        max neval
-  arrow_on   3.881293   4.038376   5.136604   4.772739   5.759082   7.873711    10
- arrow_off 178.605733 183.654887 213.296238 227.182018 233.601885 238.877341    10
-```
-
-<div align="center">
-<img src="{{ site.base-url }}/img/arrow-r-spark-transforming.png"
-     alt="Transforming data with R in Spark with and without Arrow"
-     width="60%" class="img-responsive">
-</div>
-
-Additional benchmarks and fine-tuning parameters can be found under `sparklyr`
-[/rstudio/sparklyr/pull/1611][10] and `SparkR` [/apache/spark/pull/22954][11]. We look forward to bringing this feature
-to the Spark, Arrow, and R communities.
-
-[1]: https://github.com/javierluraschi
-[2]: https://rstudio.com
-[3]: https://github.com/rstudio/sparklyr
-[4]: https://spark.apache.org/docs/latest/sparkr.html
-[5]: https://spark.apache.org
-[6]: https://dplyr.tidyverse.org
-[7]: https://cran.r-project.org/package=DBI
-[8]: https://spark.rstudio.com/reference/#section-spark-dataframes
-[9]: https://CRAN.R-project.org/package=microbenchmark
-[10]: https://github.com/rstudio/sparklyr/pull/1611
-[11]: https://github.com/apache/spark/pull/22954
diff --git a/site/_posts/2019-02-04-datafusion-donation.md b/site/_posts/2019-02-04-datafusion-donation.md
deleted file mode 100644
index 9a7806c..0000000
--- a/site/_posts/2019-02-04-datafusion-donation.md
+++ /dev/null
@@ -1,119 +0,0 @@
----
-layout: post
-title: "DataFusion: A Rust-native Query Engine for Apache Arrow"
-date: "2019-02-04 00:00:00 -0600"
-author: agrove
-categories: [application]
----
-<!--
-{% comment %}
-Licensed to the Apache Software Foundation (ASF) under one or more
-contributor license agreements.  See the NOTICE file distributed with
-this work for additional information regarding copyright ownership.
-The ASF licenses this file to you under the Apache License, Version 2.0
-(the "License"); you may not use this file except in compliance with
-the License.  You may obtain a copy of the License at
-
-http://www.apache.org/licenses/LICENSE-2.0
-
-Unless required by applicable law or agreed to in writing, software
-distributed under the License is distributed on an "AS IS" BASIS,
-WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
-See the License for the specific language governing permissions and
-limitations under the License.
-{% endcomment %}
--->
-
-We are excited to announce that [DataFusion](https://github.com/apache/arrow/tree/master/rust/datafusion) has been donated to the Apache Arrow project. DataFusion is an in-memory query engine for the Rust implementation of Apache Arrow.
-
-Although DataFusion was started two years ago, it was recently re-implemented to be Arrow-native. Its capabilities are currently limited, but it supports SQL queries against iterators of RecordBatch and can read CSV files. There are plans to [add support for Parquet files](https://issues.apache.org/jira/browse/ARROW-4466).
-
-SQL support is limited to projection (`SELECT`), selection (`WHERE`), and simple aggregates (`MIN`, `MAX`, `SUM`) with an optional `GROUP BY` clause.
-
-Supported expressions are identifiers, literals, simple math operations (`+`, `-`, `*`, `/`), binary expressions (`AND`, `OR`), equality and comparison operators (`=`, `!=`, `<`, `<=`, `>=`, `>`), and `CAST(expr AS type)`.
-
-## Example
-
-The following example demonstrates running a simple aggregate SQL query against a CSV file.
-
-```rust
-// create execution context
-let mut ctx = ExecutionContext::new();
-
-// define schema for data source (csv file)
-let schema = Arc::new(Schema::new(vec![
-    Field::new("c1", DataType::Utf8, false),
-    Field::new("c2", DataType::UInt32, false),
-    Field::new("c3", DataType::Int8, false),
-    Field::new("c4", DataType::Int16, false),
-    Field::new("c5", DataType::Int32, false),
-    Field::new("c6", DataType::Int64, false),
-    Field::new("c7", DataType::UInt8, false),
-    Field::new("c8", DataType::UInt16, false),
-    Field::new("c9", DataType::UInt32, false),
-    Field::new("c10", DataType::UInt64, false),
-    Field::new("c11", DataType::Float32, false),
-    Field::new("c12", DataType::Float64, false),
-    Field::new("c13", DataType::Utf8, false),
-]));
-
-// register csv file with the execution context
-let csv_datasource =
-    CsvDataSource::new("test/data/aggregate_test_100.csv", schema.clone(), 1024);
-ctx.register_datasource("aggregate_test_100", Rc::new(RefCell::new(csv_datasource)));
-
-let sql = "SELECT c1, MIN(c12), MAX(c12) FROM aggregate_test_100 WHERE c11 > 0.1 AND c11 < 0.9 GROUP BY c1";
-
-// execute the query
-let relation = ctx.sql(&sql).unwrap();
-let mut results = relation.borrow_mut();
-
-// iterate over the results
-while let Some(batch) = results.next().unwrap() {
-    println!(
-        "RecordBatch has {} rows and {} columns",
-        batch.num_rows(),
-        batch.num_columns()
-    );
-
-    let c1 = batch
-        .column(0)
-        .as_any()
-        .downcast_ref::<BinaryArray>()
-        .unwrap();
-
-    let min = batch
-        .column(1)
-        .as_any()
-        .downcast_ref::<Float64Array>()
-        .unwrap();
-
-    let max = batch
-        .column(2)
-        .as_any()
-        .downcast_ref::<Float64Array>()
-        .unwrap();
-
-    for i in 0..batch.num_rows() {
-        let c1_value: String = String::from_utf8(c1.value(i).to_vec()).unwrap();
-        println!("{}, Min: {}, Max: {}", c1_value, min.value(i), max.value(i),);
-    }
-}
-```
-
-## Roadmap
-
-The roadmap for DataFusion will depend on interest from the Rust community, but here are some of the short-term items that are planned:
-
-- Extending test coverage of the existing functionality
-- Adding support for Parquet data sources
-- Implementing more SQL features such as `JOIN`, `ORDER BY`, and `LIMIT`
-- Implementing a DataFrame API as an alternative to SQL
-- Adding support for partitioning and parallel query execution using Rust's async and await functionality
-- Creating a Docker image to make it easy to use DataFusion as a standalone query tool for interactive and batch queries
-
-## Contributors Welcome!
-
-If you are excited about being able to use Rust for data science and would like to contribute to this work then there are many ways to get involved. The simplest way to get started is to try out DataFusion against your own data sources and file bug reports for any issues that you find. You could also check out the current [list of issues](https://cwiki.apache.org/confluence/display/ARROW/Rust+JIRA+Dashboard) and have a go at fixing one. You can also join the [user mailing list](http://ma [...]
-
-
diff --git a/site/_posts/2019-02-05-python-string-memory-0.12.md b/site/_posts/2019-02-05-python-string-memory-0.12.md
deleted file mode 100644
index 0979274..0000000
--- a/site/_posts/2019-02-05-python-string-memory-0.12.md
+++ /dev/null
@@ -1,247 +0,0 @@
----
-layout: post
-title: "Reducing Python String Memory Use in Apache Arrow 0.12"
-date: "2019-02-05 07:00:00 -0600"
-author: wesm
-categories: [application]
----
-<!--
-{% comment %}
-Licensed to the Apache Software Foundation (ASF) under one or more
-contributor license agreements.  See the NOTICE file distributed with
-this work for additional information regarding copyright ownership.
-The ASF licenses this file to you under the Apache License, Version 2.0
-(the "License"); you may not use this file except in compliance with
-the License.  You may obtain a copy of the License at
-
-http://www.apache.org/licenses/LICENSE-2.0
-
-Unless required by applicable law or agreed to in writing, software
-distributed under the License is distributed on an "AS IS" BASIS,
-WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
-See the License for the specific language governing permissions and
-limitations under the License.
-{% endcomment %}
--->
-
-Python users who upgrade to recently released `pyarrow` 0.12 may find that
-their applications use significantly less memory when converting Arrow string
-data to pandas format. This includes using `pyarrow.parquet.read_table` and
-`pandas.read_parquet`. This article details some of what is going on under the
-hood, and why Python applications dealing with large amounts of strings are
-prone to memory use problems.
-
-## Why Python strings can use a lot of memory
-
-Let's start with some possibly surprising facts. I'm going to create an empty
-`bytes` object and an empty `str` (unicode) object in Python 3.7:
-
-```
-In [1]: val = b''
-
-In [2]: unicode_val = u''
-```
-
-The `sys.getsizeof` function accurately reports the number of bytes used by
-built-in Python objects. You might be surprised to find that:
-
-
-```
-In [4]: import sys
-In [5]: sys.getsizeof(val)
-Out[5]: 33
-
-In [6]: sys.getsizeof(unicode_val)
-Out[6]: 49
-```
-
-Since strings in Python are nul-terminated, we can infer that a bytes object
-has 32 bytes of overhead while unicode has 48 bytes. One must also account for
-`PyObject*` pointer references to the objects, so the actual overhead is 40 and
-56 bytes, respectively. With large strings and text, this overhead may not
-matter much, but when you have a lot of small strings, such as those arising
-from reading a CSV or Apache Parquet file, they can take up an unexpected
-amount of memory. pandas represents strings in NumPy arrays of `PyObject*`
-pointers, so the total memory used by a unique unicode string is
-
-```
-8 (PyObject*) + 48 (Python C struct) + string_length + 1
-```
-
-Suppose that we read a CSV file with
-
-* 1 column
-* 1 million rows
-* Each value in the column is a string with 10 characters
-
-On disk this file would take approximately 10MB. Read into memory, however, it
-could take up over 60MB, as a 10 character string object takes up 67 bytes in a
-`pandas.Series`.
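-
-As a quick sanity check of that 67-byte figure, here is a small sketch using
-`sys.getsizeof` (CPython 3.7; exact numbers can vary by interpreter build):
-
-```python
-import sys
-
-s = "a" * 10  # a 10-character ASCII string
-
-# 49 bytes of unicode object overhead (including the nul terminator)
-# plus 10 one-byte characters.
-print(sys.getsizeof(s))      # 59
-
-# Adding the 8-byte PyObject* slot held by the pandas/NumPy object
-# array gives the 67 bytes quoted above.
-print(sys.getsizeof(s) + 8)  # 67
-```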
-
-## How Apache Arrow represents strings
-
-While a Python unicode string can have 57 bytes of overhead, a string in the
-Arrow columnar format has only 4 (32 bits) or 4.125 (33 bits) bytes of
-overhead. 32-bit integer offsets encode the position and size of a string
-value in a contiguous chunk of memory:
-
-<div align="center">
-<img src="{{ site.baseurl }}/img/20190205-arrow-string.png"
-     alt="Apache Arrow string memory layout"
-     width="80%" class="img-responsive">
-</div>
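-
-A small sketch (assuming a recent pyarrow) makes this layout visible:
-`Array.buffers()` returns the validity bitmap (None here, since there are no
-nulls), the 32-bit offsets buffer, and the single contiguous data buffer.
-
-```python
-import pyarrow as pa
-
-arr = pa.array(["foo", "ba", ""])
-validity, offsets, data = arr.buffers()
-
-print(validity)              # None: no nulls, so no validity bitmap
-print(offsets.to_pybytes())  # little-endian int32 offsets 0, 3, 5, 5
-print(data.to_pybytes())     # b'fooba'
-```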
-
-When you call `table.to_pandas()` or `array.to_pandas()` with `pyarrow`, we
-have to convert this compact string representation back to pandas's
-Python-based strings. This can use a huge amount of memory when we have a large
-number of small strings. This is quite a common occurrence when working with web
-analytics data, which compresses to a compact size when stored in the Parquet
-columnar file format.
-
-Note that the Arrow string memory format has other benefits beyond memory
-use. It is also much more efficient for analytics due to the guarantee of data
-locality; all strings are next to each other in memory. In the case of pandas
-and Python strings, the string data can be located anywhere in the process
-heap. Arrow PMC member Uwe Korn did some work to [extend pandas with Arrow
-string arrays][1] for improved performance and memory use.
-
-## Reducing pandas memory use when converting from Arrow
-
-For many years, the `pandas.read_csv` function has relied on a trick to limit
-the amount of string memory allocated. Because pandas uses arrays of
-`PyObject*` pointers to refer to objects in the Python heap, we can avoid
-creating multiple strings with the same value, instead reusing existing objects
-and incrementing their reference counts.
-
-Schematically, we have the following:
-
-<div align="center">
-<img src="{{ site.baseurl }}/img/20190205-numpy-string.png"
-     alt="pandas string memory optimization"
-     width="80%" class="img-responsive">
-</div>
-
-In `pyarrow` 0.12, we have implemented this when calling `to_pandas`. It
-requires using a hash table to deduplicate the Arrow string data as it's being
-converted to pandas. Hashing data is not free, but counterintuitively it can be
-faster in addition to being vastly more memory efficient in the common case in
-analytics where we have table columns with many instances of the same string
-values.
-
-## Memory and Performance Benchmarks
-
-We can use the [`memory_profiler`][2] Python package to easily get process
-memory usage within a running Python application.
-
-```python
-import memory_profiler
-def mem():
-    return memory_profiler.memory_usage()[0]
-```
-
-In a new application I have:
-
-```
-In [7]: mem()
-Out[7]: 86.21875
-```
-
-I will generate approximately 1 gigabyte of string data represented as Python
-strings with length 10. The `pandas.util.testing` module has a handy `rands`
-function for generating random strings. Here is the data generation function:
-
-```python
-from pandas.util.testing import rands
-def generate_strings(length, nunique, string_length=10):
-    unique_values = [rands(string_length) for i in range(nunique)]
-    values = unique_values * (length // nunique)
-    return values
-```
-
-This generates a certain number of unique strings, then duplicates them to
-yield the desired number of total strings. So I'm going to create 100 million
-strings with only 10000 unique values:
-
-```
-In [8]: values = generate_strings(100000000, 10000)
-
-In [9]: mem()
-Out[9]: 852.140625
-```
-
-100 million `PyObject*` values is only 745 MB, so this increase of a little
-over 770 MB is consistent with what we know so far. Now I'm going to convert
-this to Arrow format:
-
-```
-In [11]: arr = pa.array(values)
-
-In [12]: mem()
-Out[12]: 2276.9609375
-```
-
-Since `pyarrow` exactly accounts for all of its memory allocations, we also
-check that
-
-```
-In [13]: pa.total_allocated_bytes()
-Out[13]: 1416777280
-```
-
-Since each string takes about 14 bytes (10 bytes plus 4 bytes of overhead),
-this is what we expect.
-
-Now, converting `arr` back to pandas is where things get tricky. The _minimum_
-amount of memory that pandas can use is a little under 800 MB, as above, since we
-need 100 million `PyObject*` values, which are 8 bytes each.
-
-```
-In [14]: arr_as_pandas = arr.to_pandas()
-
-In [15]: mem()
-Out[15]: 3041.78125
-```
-
-Doing the math, we used 765 MB, which seems right. We can disable the string
-deduplication logic by passing `deduplicate_objects=False` to `to_pandas`:
-
-```
-In [16]: arr_as_pandas_no_dedup = arr.to_pandas(deduplicate_objects=False)
-
-In [17]: mem()
-Out[17]: 10006.95703125
-```
-
-Without object deduplication, we use 6965 megabytes, or an average of 73 bytes
-per value. This is a little bit higher than the theoretical size of 67 bytes
-computed above.
-
-One of the more surprising results is that the new behavior is about twice as fast:
-
-```
-In [18]: %time arr_as_pandas_time = arr.to_pandas()
-CPU times: user 2.94 s, sys: 213 ms, total: 3.15 s
-Wall time: 3.14 s
-
-In [19]: %time arr_as_pandas_no_dedup_time = arr.to_pandas(deduplicate_objects=False)
-CPU times: user 4.19 s, sys: 2.04 s, total: 6.23 s
-Wall time: 6.21 s
-```
-
-The reason for this is that creating so many Python objects is more expensive
-than hashing the 10 byte values and looking them up in a hash table.
-
-Note that when you convert Arrow data with mostly unique values back to pandas,
-the memory use benefits here won't have as much of an impact.
-
-## Takeaways
-
-In Apache Arrow, our goal is to develop computational tools to operate natively
-on the cache- and SIMD-friendly efficient Arrow columnar format. In the
-meantime, though, we recognize that users have legacy applications using the
-native memory layout of pandas or other analytics tools. We will do our best to
-provide fast and memory-efficient interoperability with pandas and other
-popular libraries.
-
-[1]: https://www.slideshare.net/xhochy/extending-pandas-using-apache-arrow-and-numba
-[2]: https://pypi.org/project/memory-profiler/
\ No newline at end of file
diff --git a/site/_posts/2019-04-02-0.13.0-release.md b/site/_posts/2019-04-02-0.13.0-release.md
deleted file mode 100644
index 1450bfc..0000000
--- a/site/_posts/2019-04-02-0.13.0-release.md
+++ /dev/null
@@ -1,187 +0,0 @@
----
-layout: post
-title: "Apache Arrow 0.13.0 Release"
-date: "2019-04-02 07:00:00 -0600"
-author: wesm
-categories: [release]
----
-<!--
-{% comment %}
-Licensed to the Apache Software Foundation (ASF) under one or more
-contributor license agreements.  See the NOTICE file distributed with
-this work for additional information regarding copyright ownership.
-The ASF licenses this file to you under the Apache License, Version 2.0
-(the "License"); you may not use this file except in compliance with
-the License.  You may obtain a copy of the License at
-
-http://www.apache.org/licenses/LICENSE-2.0
-
-Unless required by applicable law or agreed to in writing, software
-distributed under the License is distributed on an "AS IS" BASIS,
-WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
-See the License for the specific language governing permissions and
-limitations under the License.
-{% endcomment %}
--->
-
-The Apache Arrow team is pleased to announce the 0.13.0 release. This covers
-more than 2 months of development work and includes [**550 resolved
-issues**][1] from [**81 distinct contributors**][22].
-
-See the [Install Page][2] to learn how to get the libraries for your
-platform. The [complete changelog][3] is also available.
-
-While it's a large release, this post will give some brief highlights in the
-project since the 0.12.0 release from January.
-
-## New committers and PMC member
-
-The Arrow team is growing! Since the 0.12.0 release we have increased the size
-of our committer and PMC rosters.
-
-* [Andy Grove][13] was promoted to PMC member
-* [Paddy Horan][245] was added as a committer
-* [Micah Kornfield][14] was added as a committer
-* [Ravindra Pindikura][246] was added as a committer
-* [Chao Sun][247] was added as a committer
-
-Thank you for all your contributions!
-
-## Rust DataFusion Query Engine donation
-
-Since the last release, we received a donation of [DataFusion][17], a
-Rust-native query engine for the Arrow columnar format, whose development had
-previously been led by Andy Grove. [Read more about DataFusion][19] in our February
-blog post.
-
-This is an exciting development for the Rust community, and we look forward to
-developing more analytical query processing within the Apache Arrow project.
-
-## Arrow Flight gRPC progress
-
-Over the last couple months, we have made significant progress on Arrow Flight,
-an Arrow-native data messaging framework. We have integration tests to check
-C++ and Java compatibility, and we have added Python bindings for the C++
-library. We will write a future blog post to go into more detail about how
-Flight works.
-
-## C++ notes
-
-There were 231 issues relating to C++ in this release, far too many to
-summarize in a blog post. Some notable items include:
-
-* An experimental `ExtensionType` was developed for creating user-defined data
-  types that can be embedded in the Arrow binary protocol. This is not yet
-  finalized, but [feedback would be welcome][51].
-* We have undertaken a significant reworking of our CMake build system for C++
-  to make the third party dependencies more configurable. Among other things,
-  this eases work on packaging for Linux distributions. Read more about this in
-  the [C++ developer documentation][18].
-* Laying more groundwork for an Arrow-native in-memory query engine
-* We began building a reader for line-delimited JSON files
-* Gandiva can now be compiled on Windows with Visual Studio
-
-## C# Notes
-
-C# .NET development has picked up since the initial code donation last
-fall. 11 issues were resolved this release cycle.
-
-The Arrow C# package is [now available via NuGet][244].
-
-## Go notes
-
-8 Go-related issues were resolved. A notable feature is the addition of a CSV
-file writer.
-
-## Java notes
-
-26 Java issues were resolved. Outside of Flight-related work, some notable
-items include:
-
-* Migration to Java 8 date and time APIs from Joda
-* Array type support in JDBC adapter
-
-## JavaScript Notes
-
-The recent [JavaScript 0.4.1 release][243] is the last JavaScript-only release
-of Apache Arrow. Starting with 0.13, the JavaScript implementation is
-included in mainline Arrow releases! The version number of the released
-JavaScript packages will now be in sync with the mainline version number.
-
-## Python notes
-
-86 Python-related issues were resolved. Some highlights include:
-
-* The Gandiva LLVM expression compiler is now available in the Python wheels
-  through the ``pyarrow.gandiva`` module.
-* Flight RPC bindings
-* Improved pandas serialization performance with RangeIndex
-* pyarrow can be used without pandas installed
-
-Note that Apache Arrow will continue to support Python 2.7 until January 2020.
-
-## Ruby and C GLib notes
-
-36 C/GLib- and Ruby-related issues were resolved. The work continues to follow
-the upstream work in the C++ project.
-
-* `Arrow::RecordBatch#raw_records` was added. It can convert a record batch to
-  a Ruby array 10x-200x faster than the same conversion by a pure-Ruby
-  implementation.
-
-## Rust notes
-
-69 Rust-related issues were resolved. Many of these relate to ongoing work in
-the DataFusion query engine. Some notable items include:
-
-* Date/time support
-* SIMD for arithmetic operations
-* Writing CSV and reading line-delimited JSON
-* Parquet data source support for DataFusion
-* Prototype DataFrame-style API for DataFusion
-* Continued evolution of Parquet file reader
-
-## R development progress
-
-The Arrow R developers have expanded the scope of the R language bindings and
-additionally worked on packaging support to be able to submit the package to
-CRAN in the near future. 23 issues were resolved for this release.
-
-[We wrote in January about ongoing work][50] to accelerate R work on Apache Spark
-using Arrow.
-
-## Community Discussions Ongoing
-
-There are a number of active discussions ongoing on the developer
-``dev@arrow.apache.org`` mailing list. We look forward to hearing from the
-community there:
-
-* **Benchmarking**: we are working to create tools for tracking all of our
-  benchmark results on a commit-by-commit basis in a centralized database
-  schema so that we can monitor for performance regressions over time. We hope
-  to develop a publicly viewable benchmark result dashboard.
-* **C++ Datasets**: development of a unified API for reading and writing
-  datasets stored in various common formats like Parquet, JSON, and CSV.
-* **C++ Query Engine**: architecture of a parallel Arrow-native query engine
-  for C++
-* **Arrow Flight Evolution**: adding features to support different real-world
-  data messaging use cases
-* **Arrow Columnar Format evolution**: we are discussing a new "duration" or
-  "time interval" type and some other additions to the Arrow columnar format.
-
-[1]: https://issues.apache.org/jira/issues/?jql=project%20%3D%20ARROW%20AND%20status%20%3D%20Resolved%20AND%20fixVersion%20%3D%200.13.0
-[2]: https://arrow.apache.org/install
-[3]: https://arrow.apache.org/release/0.13.0.html
-[13]: https://github.com/andygrove
-[14]: https://github.com/emkornfield
-[17]: http://incubator.apache.org/ip-clearance/arrow-rust-datafusion.html
-[18]: https://github.com/apache/arrow/blob/master/docs/source/developers/cpp.rst#build-dependency-management
-[19]: http://arrow.apache.org/blog/2019/02/04/datafusion-donation/
-[22]: https://arrow.apache.org/release/0.13.0.html#contributors
-[50]: http://arrow.apache.org/blog/2019/01/25/r-spark-improvements/
-[51]: https://github.com/apache/arrow/blob/master/cpp/src/arrow/extension_type.h
-[243]: https://www.npmjs.com/package/apache-arrow/v/0.4.1
-[244]: https://www.nuget.org/packages/Apache.Arrow/0.13.0
-[245]: https://github.com/paddyhoran
-[246]: https://github.com/pravindra
-[247]: https://github.com/sunchao
diff --git a/site/_posts/2019-07-08-0.14.0-release.md b/site/_posts/2019-07-08-0.14.0-release.md
deleted file mode 100644
index bd85705..0000000
--- a/site/_posts/2019-07-08-0.14.0-release.md
+++ /dev/null
@@ -1,300 +0,0 @@
----
-layout: post
-title: "Apache Arrow 0.14.0 Release"
-date: "2019-07-02 00:00:00 -0600"
-author: pmc
-categories: [release]
----
-<!--
-{% comment %}
-Licensed to the Apache Software Foundation (ASF) under one or more
-contributor license agreements.  See the NOTICE file distributed with
-this work for additional information regarding copyright ownership.
-The ASF licenses this file to you under the Apache License, Version 2.0
-(the "License"); you may not use this file except in compliance with
-the License.  You may obtain a copy of the License at
-
-http://www.apache.org/licenses/LICENSE-2.0
-
-Unless required by applicable law or agreed to in writing, software
-distributed under the License is distributed on an "AS IS" BASIS,
-WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
-See the License for the specific language governing permissions and
-limitations under the License.
-{% endcomment %}
--->
-
-The Apache Arrow team is pleased to announce the 0.14.0 release. This
-covers 3 months of development work and includes [**602 resolved
-issues**][1] from [**75 distinct contributors**][2].  See the Install
-Page to learn how to get the libraries for your platform. The
-[complete changelog][3] is also available.
-
-This post will give some brief highlights in the project since the
-0.13.0 release from April.
-
-## New committers
-
-Since the 0.13.0 release, the following have been added:
-
-* [Neville Dipale][5] was added as a committer
-* [François Saint-Jacques][6] was added as a committer
-* [Praveen Kumar][7] was added as a committer
-
-Thank you for all your contributions!
-
-## Upcoming 1.0.0 Format Stability Release
-
-We are planning for our next major release to move from 0.14.0 to
-1.0.0. The major version number will indicate stability of the Arrow
-columnar format and binary protocol. While the format has already been
-stable since December 2017, we believe it is a good idea to make this
-stability official and to indicate that it is safe to persist
-serialized Arrow data in applications. This means that applications
-will be able to safely upgrade to new Arrow versions without having to
-worry about backwards incompatibilities. We will write in a future
-blog post about the stability guarantees we intend to provide to help
-application developers plan accordingly.
-
-## Packaging
-
-We added support for the following platforms:
-
-* Debian GNU/Linux buster
-* Ubuntu 19.04
-
-We dropped support for Ubuntu 14.04.
-
-## Development Infrastructure and Tooling
-
-As the project has grown larger and more diverse, we are increasingly
-outgrowing what we can test in public continuous integration services
-like Travis CI and Appveyor. In addition, we share these resources
-with the entire Apache Software Foundation, and given the high volume
-of pull requests into Apache Arrow, maintainers are frequently waiting
-many hours for the green light to merge patches.
-
-The complexity of our testing is driven by the number of different
-components and programming languages as well as increasingly long
-compilation and test execution times as individual libraries grow
-larger. The 50 minute time limit of public CI services is simply too
-limited to comprehensively test the project. Additionally, the CI host
-machines are constrained in their features and memory limits,
-preventing us from testing features that are only relevant on large
-amounts of data (10GB or more) or functionality that requires a
-CUDA-enabled GPU.
-
-Organizations that contribute to Apache Arrow are working on physical
-build infrastructure and tools to improve build times and build
-scalability. One such new tool is `ursabot`, a GitHub-enabled bot
-that can be used to trigger builds either on physical build machines or in the
-cloud. It can also be used to trigger benchmark timing comparisons. If
-you are contributing to the project, you may see Ursabot being
-employed to trigger tests in pull requests.
-
-To help assist with migrating away from Travis CI, we are also working
-to make as many of our builds reproducible with Docker and not reliant
-on Travis CI-specific configuration details. This will also help
-contributors reproduce build failures locally without having to wait
-for Travis CI.
-
-## Columnar Format Notes
-
-* User-defined "extension" types have been formalized in the Arrow
-  format, enabling library users to embed custom data types in the
-  Arrow columnar format. Initial support is available in C++, Java,
-  and Python.
-* A new Duration logical type was added to represent absolute lengths
-  of time.
-
-## Arrow Flight notes
-
-Flight now supports many of the features of a complete RPC
-framework.
-
-* Authentication APIs are now supported across all languages (ARROW-5137)
-* Encrypted communication using OpenSSL is supported (ARROW-5643,
-  ARROW-5529)
-* Clients can specify timeouts on remote calls (ARROW-5136)
-* On the protocol level, endpoints are now identified with URIs, to
-  support an open-ended number of potential transports (including TLS
-  and Unix sockets, and perhaps even non-gRPC-based transports in the
-  future) (ARROW-4651)
-* Application-defined metadata can be sent alongside data (ARROW-4626,
-  ARROW-4627).
-
-Windows is now a supported platform for Flight in C++ and Python
-(ARROW-3294), and Flight is now shipped in the Python wheels
-(ARROW-3150, ARROW-5656). C++, Python, and Java have been brought to
-parity, now that actions can return streaming results in Java
-(ARROW-5254).
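-
-As a rough sketch of what the client side looks like from Python: the server
-address is hypothetical, and the calls shown are the `pyarrow.flight` API as it
-exists in later releases, which may differ in detail from 0.14.
-
-```python
-import pyarrow.flight as flight
-
-# Connect to a hypothetical Flight server and list the available datasets.
-client = flight.connect("grpc+tcp://localhost:8815")
-for info in client.list_flights():
-    print(info.descriptor, info.total_records)
-
-# Fetch the stream for the first endpoint of one flight and read it as a table.
-descriptor = flight.FlightDescriptor.for_path("example.parquet")
-flight_info = client.get_flight_info(descriptor)
-reader = client.do_get(flight_info.endpoints[0].ticket)
-table = reader.read_all()
-```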
-
-## C++ notes
-
-There were 188 resolved issues related to the C++ implementation, so we summarize
-some of the work here.
-
-### General platform improvements
-
-* A FileSystem abstraction (ARROW-767) has been added, which paves the
-  way for a future Arrow Datasets library that will allow access to sharded
-  data on arbitrary storage systems, including remote or cloud
-  storage. A first draft of the Datasets API was committed in
-  ARROW-5512. Right now, this comes with no implementation, but we
-  expect to slowly build it up in the coming weeks or months. Early
-  feedback is welcome on this API.
-* The dictionary API has been reworked in ARROW-3144. The dictionary
-  values used to be tied to the DictionaryType instance, which ended
-  up too inflexible. Since dictionary-encoding is more often an
-  optimization than a semantic property of the data, we decided to
-  move the dictionary values to the ArrayData structure, making it
-  natural for dictionary-encoded arrays to share the same DataType
-  instance, regardless of the encoding details (see the sketch below).
-* The FixedSizeList and Map types have been implemented, including in
-  integration tests. The Map type is akin to a List of Struct(key,
-  value) entries, but making it explicit that the underlying data has
-  key-value mapping semantics. Also, map entries are always non-null.
-* A `Result<T>` class has been introduced in ARROW-4800. The aim is to
-  allow returning an error as well as a function's logical result
-  without resorting to pointer-out arguments.
-* The Parquet C++ library has been refactored to use common Arrow IO
-  classes for improved C++ platform interoperability.
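-
-A short Python sketch (current pyarrow) of the dictionary behavior described
-above: the dictionary values now live on the array rather than the type, so two
-arrays with different dictionaries can share one `DictionaryType`.
-
-```python
-import pyarrow as pa
-
-a = pa.array(["x", "y", "x"]).dictionary_encode()
-b = pa.array(["p", "q", "q"]).dictionary_encode()
-
-print(a.type)            # dictionary<values=string, indices=int32, ordered=0>
-print(a.type == b.type)  # True: the values are not part of the type
-print(a.dictionary)      # ["x", "y"]
-print(a.indices)         # [0, 1, 0]
-```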
-
-### Line-delimited JSON reader
-
-A multithreaded line-delimited JSON reader (powered internally by
-RapidJSON) is now available for use (also in Python and R via
-bindings). This will likely be expanded to support more kinds of JSON
-storage in the future.
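-
-From Python, the reader is exposed through `pyarrow.json`; the file name below
-is hypothetical.
-
-```python
-import pyarrow.json as pj
-
-# Parse newline-delimited JSON records into an Arrow Table,
-# inferring the schema from the data by default.
-table = pj.read_json("events.jsonl")
-print(table.schema)
-```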
-
-### New computational kernels
-
-A number of new computational kernels have been developed; a sketch of their
-semantics follows this list:
-
-* Compare filter for logical comparisons yielding boolean arrays
-* Filter kernel for selecting elements of an input array according to
-  a boolean selection array.
-* Take kernel, which selects elements by integer index, has been
-  expanded to support nested types
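-
-These kernels live in the C++ library; the sketch below uses the
-`pyarrow.compute` wrappers from later releases to illustrate their semantics
-(0.14 itself may not expose all of them from Python).
-
-```python
-import pyarrow as pa
-import pyarrow.compute as pc
-
-arr = pa.array([1, 2, 3, 4])
-
-# Compare kernel: element-wise comparison yielding a boolean array.
-mask = pc.greater(arr, 2)          # [false, false, true, true]
-
-# Filter kernel: keep the elements where the mask is true.
-print(pc.filter(arr, mask))        # [3, 4]
-
-# Take kernel: select elements by integer index (works on nested types too).
-print(arr.take(pa.array([3, 0])))  # [4, 1]
-```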
-
-## C# Notes
-
-The native C# implementation has continued to mature since 0.13. This
-release includes a number of performance, memory use, and usability
-improvements.
-
-## Go notes
-
-Go's support for the Arrow columnar format continues to expand. Go now
-supports reading and writing the Arrow columnar binary protocol, and
-it has also been **added to the cross language integration
-tests**. There are now four languages (C++, Go, Java, and JavaScript)
-included in our integration tests to verify cross-language
-interoperability.
-
-## Java notes
-
-* Support for referencing arbitrary memory using `ArrowBuf` has been
-  implemented, paving the way for memory map support in Java
-* A number of performance improvements around vector value access were
-  added (see ARROW-5264, ARROW-5290).
-* The Map type has been implemented in Java and integration tested
-  with C++
-* Several microbenchmarks have been added and improved, including a
-  significant speed-up of zeroing out buffers.
-* A new algorithms package has been started to contain reference
-  implementations of common algorithms. The initial contribution is
-  for Array/Vector sorting.
-
-## JavaScript Notes
-
-A new incremental [array builder API][4] is available.
-
-## MATLAB Notes
-
-Version 0.14.0 features improved Feather file support in the MEX bindings.
-
-## Python notes
-
-* We fixed a problem that caused the Python wheels to be much larger in
-  0.13.0 than they were in 0.12.0. Since the introduction of LLVM into our
-  build toolchain, the wheels will still be significantly bigger than
-  before. We are interested in approaches that would enable pyarrow to be
-  installed in pieces with pip or conda rather than monolithically.
-* It is now possible to define ExtensionTypes with a Python
-  implementation (ARROW-840). Those ExtensionTypes can survive a
-  roundtrip through C++ and serialization.
-* The Flight improvements highlighted above (see C++ notes) are all
-  available from Python. Furthermore, Flight is now bundled in our
-  binary wheels and conda packages for Linux, Windows and macOS
-  (ARROW-3150, ARROW-5656).
-* We will build "manylinux2010" binary wheels for Linux systems, in
-  addition to "manylinux1" wheels (ARROW-2461). Manylinux2010 is a
-  newer standard for more recent systems, with less limiting toolchain
-  constraints. Installing manylinux2010 wheels requires an up-to-date
-  version of pip.
-* Various bug fixes for CSV reading in Python and C++, including the
-  ability to parse Decimal(x, y) columns (see the sketch after this list).
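-
-For the CSV item above, a minimal sketch of parsing a decimal column; the file
-and column names are hypothetical.
-
-```python
-import pyarrow as pa
-import pyarrow.csv as csv
-
-# Declare the "price" column as decimal(10, 2) so values are parsed
-# as decimals rather than floating point.
-opts = csv.ConvertOptions(column_types={"price": pa.decimal128(10, 2)})
-table = csv.read_csv("prices.csv", convert_options=opts)
-print(table.schema)
-```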
-
-### Parquet improvements
-
-* Column statistics for logical types like unicode strings, unsigned
-  integers, and timestamps are cast to compatible Python types (see
-  ARROW-4139)
-* It's now possible to configure "data page" sizes when writing a file
-  from Python (see the sketch below)
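-
-A brief sketch of the data page size option with current pyarrow; the output
-path is hypothetical and `data_page_size` is in bytes.
-
-```python
-import pyarrow as pa
-import pyarrow.parquet as pq
-
-table = pa.table({"x": list(range(1_000_000))})
-
-# Smaller data pages give finer-grained reads at the cost of some
-# storage and metadata overhead.
-pq.write_table(table, "example.parquet", data_page_size=64 * 1024)
-```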
-
-## Ruby and C GLib notes
-
-The GLib and Ruby bindings have been tracking features in the C++
-project. This release includes bindings for Gandiva, JSON reader, and
-other C++ features.
-
-## Rust notes
-
-There is ongoing work in Rust happening on Parquet file support,
-computational kernels, and the DataFusion query engine. See the full
-changelog for details.
-
-## R notes
-
-We have been working on build and packaging for R so that the package can
-hopefully be released to CRAN in the near
-future. Feature development for R has continued to follow the upstream
-C++ project.
-
-## Community Discussions Ongoing
-
-There are a number of active discussions ongoing on the developer
-dev@arrow.apache.org mailing list. We look forward to hearing from the
-community there:
-
-* [Timing and scope of 1.0.0 release][15]
-* [Solutions to increase continuous integration capacity][13]
-* [A proposal for versioning and forward/backward compatibility
-  guarantees for the 1.0.0 release][8] was shared; not much discussion has
-  occurred yet.
-* [Addressing possible unaligned access and undefined behavior concerns][9]
-  in the Arrow binary protocol
-* [Supporting smaller than 128-bit encoding of fixed width decimals][10]
-* [Forking the Avro C++ implementation][11] so as to adapt it to Arrow's
-  needs
-* [Sparse representation and compression in Arrow][12]
-* [Flight extensions: middleware API and generalized Put operations][14]
-
-[1]: https://issues.apache.org/jira/issues/?jql=project%20%3D%20ARROW%20AND%20status%20%3D%20Resolved%20AND%20fixVersion%20%3D%200.13.0
-[2]: https://arrow.apache.org/release/0.14.0.html#contributors
-[3]: https://arrow.apache.org/release/0.14.0.html
-[4]: https://github.com/apache/arrow/tree/master/js/src/builder
-[5]: https://github.com/nevi-me
-[6]: https://github.com/fsaintjacques
-[7]: https://github.com/praveenbingo
-[8]: https://lists.apache.org/thread.html/5715a4d402c835d22d929a8069c5c0cf232077a660ee98639d544af8@%3Cdev.arrow.apache.org%3E
-[9]: https://lists.apache.org/thread.html/8440be572c49b7b2ffb76b63e6d935ada9efd9c1c2021369b6d27786@%3Cdev.arrow.apache.org%3E
-[10]: https://lists.apache.org/thread.html/31b00086c2991104bd71fb1a2173f32b4a2f569d8e7b5b41e836f3a3@%3Cdev.arrow.apache.org%3E
-[11]: https://lists.apache.org/thread.html/97d78112ab583eecb155a7d78342c1063df65d64ec3ccfa0b18737c3@%3Cdev.arrow.apache.org%3E
-[12]: https://lists.apache.org/thread.html/a99124e57c14c3c9ef9d98f3c80cfe1dd25496bf3ff7046778add937@%3Cdev.arrow.apache.org%3E
-[13]: https://lists.apache.org/thread.html/96b2e22606e8a7b0ad7dc4aae16f232724d1059b34636676ed971d40@%3Cdev.arrow.apache.org%3E
-[14]: https://lists.apache.org/thread.html/82a7c026ad18dbe9fdbcffa3560979aff6fd86dd56a49f40d9cfb46e@%3Cdev.arrow.apache.org%3E
-[15]: https://lists.apache.org/thread.html/44a7a3d256ab5dbd62da6fe45b56951b435697426bf4adedb6520907@%3Cdev.arrow.apache.org%3E
diff --git a/site/_posts/2019-08-01-r-package-on-cran.md b/site/_posts/2019-08-01-r-package-on-cran.md
deleted file mode 100644
index d3d8172..0000000
--- a/site/_posts/2019-08-01-r-package-on-cran.md
+++ /dev/null
@@ -1,173 +0,0 @@
----
-layout: post
-title: "Apache Arrow R Package On CRAN"
-date: "2019-08-08 06:00:00 -0600"
-author: npr
-categories: [application]
----
-<!--
-{% comment %}
-Licensed to the Apache Software Foundation (ASF) under one or more
-contributor license agreements.  See the NOTICE file distributed with
-this work for additional information regarding copyright ownership.
-The ASF licenses this file to you under the Apache License, Version 2.0
-(the "License"); you may not use this file except in compliance with
-the License.  You may obtain a copy of the License at
-
-http://www.apache.org/licenses/LICENSE-2.0
-
-Unless required by applicable law or agreed to in writing, software
-distributed under the License is distributed on an "AS IS" BASIS,
-WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
-See the License for the specific language governing permissions and
-limitations under the License.
-{% endcomment %}
--->
-
-We are very excited to announce that the `arrow` R package is now available on
-[CRAN](https://cran.r-project.org/).
-
-[Apache Arrow](https://arrow.apache.org/) is a cross-language development
-platform for in-memory data that specifies a standardized columnar memory
-format for flat and hierarchical data, organized for efficient analytic
-operations on modern hardware. The `arrow` package provides an R interface to
-the Arrow C++ library, including support for working with Parquet and Feather
-files, as well as lower-level access to Arrow memory and messages.
-
-You can install the package from CRAN with
-
-```r
-install.packages("arrow")
-```
-
-On macOS and Windows, installing a binary package from CRAN will generally
-handle Arrow's C++ dependencies for you. However, the macOS CRAN binaries are
-unfortunately incomplete for this version, so to install 0.14.1, you'll first
-need to use Homebrew to get the Arrow C++ library (`brew install
-apache-arrow`), and then from R you can `install.packages("arrow", type =
-"source")`.
-
-Windows binaries are not yet available on CRAN but should be published soon.
-
-On Linux, you'll need to first install the C++ library. See the [Arrow project
-installation page](https://arrow.apache.org/install/) to find pre-compiled
-binary packages for some common Linux distributions, including Debian, Ubuntu,
-and CentOS. You'll need to install `libparquet-dev` on Debian and Ubuntu, or
-`parquet-devel` on CentOS. This will also automatically install the Arrow C++
-library as a dependency. Other Linux distributions must install the C++ library
-from source.
-
-If you install the `arrow` R package from source and the C++ library is not
-found, the R package functions will notify you that Arrow is not
-available. Call
-
-```r
-arrow::install_arrow()
-```
-
-for version- and platform-specific guidance on installing the Arrow C++
-library.
-
-## Parquet files
-
-This release introduces basic read and write support for the [Apache
-Parquet](https://parquet.apache.org/) columnar data file format. Prior to this
-release, options for accessing Parquet data in R were limited; the most common
-recommendation was to use Apache Spark. The `arrow` package greatly simplifies
-this access and lets you go from a Parquet file to a `data.frame` and back
-easily, without having to set up a database.
-
-```r
-library(arrow)
-df <- read_parquet("path/to/file.parquet")
-```
-
-This function, along with the other readers in the package, takes an optional
-`col_select` argument, inspired by the
-[`vroom`](https://vroom.r-lib.org/reference/vroom.html) package. This argument
-lets you use the ["tidyselect" helper
-functions](https://tidyselect.r-lib.org/reference/select_helpers.html), as you
-can do in `dplyr::select()`, to specify that you only want to keep certain
-columns. By narrowing your selection at read time, you can load a `data.frame`
-with less memory overhead.
-
-For example, suppose you had written the `iris` dataset to Parquet. You could
-read a `data.frame` with only the columns `c("Sepal.Length", "Sepal.Width")` by
-doing
-
-```r
-df <- read_parquet("iris.parquet", col_select = starts_with("Sepal"))
-```
-
-Just as you can read, you can write Parquet files:
-
-```r
-write_parquet(df, "path/to/different_file.parquet")
-```
-
-Note that this read and write support for Parquet files in R is in its early
-stages of development. The Python Arrow library
-([pyarrow](https://arrow.apache.org/docs/python/)) still has much richer
-support for Parquet files, including working with multi-file datasets. We
-intend to reach feature equivalency between the R and Python packages in the
-future.
-
-## Feather files
-
-This release also includes a faster and more robust implementation of the
-Feather file format, providing `read_feather()` and
-`write_feather()`. [Feather](https://github.com/wesm/feather) was one of the
-initial applications of Apache Arrow for Python and R, providing an efficient,
-common file format for language-agnostic data frame storage, along with
-implementations in R and Python.
-
-As Arrow progressed, development of Feather moved to the
-[`apache/arrow`](https://github.com/apache/arrow) project, and for the last two
-years, the Python implementation of Feather has just been a wrapper around
-`pyarrow`. This meant that as Arrow progressed and bugs were fixed, the Python
-version of Feather got the improvements but sadly R did not.
-
-With this release, the R implementation of Feather catches up and now depends
-on the same underlying C++ library as the Python version does. This should
-result in more reliable and consistent behavior across the two languages, as
-well as [improved
-performance](https://wesmckinney.com/blog/feather-arrow-future/).
-
-We encourage all R users of `feather` to switch to using
-`arrow::read_feather()` and `arrow::write_feather()`.
-
-Note that both Feather and Parquet are columnar data formats that allow sharing
-data frames across R, Pandas, and other tools. When should you use Feather and
-when should you use Parquet? Parquet balances space-efficiency with
-deserialization costs, making it an ideal choice for remote storage systems
-like HDFS or Amazon S3. Feather is designed for fast local reads, particularly
-with solid-state drives, and is not intended for use with remote storage
-systems. Feather files can be memory-mapped and accessed as Arrow columnar data
-in-memory without any deserialization while Parquet files always must be
-decompressed and decoded. See the [Arrow project
-FAQ](https://arrow.apache.org/faq/) for more.
-
-## Other capabilities
-
-In addition to these readers and writers, the `arrow` package has wrappers for
-other readers in the C++ library; see `?read_csv_arrow` and
-`?read_json_arrow`. These readers are being developed to optimize for the
-memory layout of the Arrow columnar format and are not intended as a direct
-replacement for existing R CSV readers (`base::read.csv`, `readr::read_csv`,
-`data.table::fread`) that return an R `data.frame`.
-
-It also provides many lower-level bindings to the C++ library, which enable you
-to access and manipulate Arrow objects. You can use these to build connectors
-to other applications and services that use Arrow. One example is Spark: the
-[`sparklyr`](https://spark.rstudio.com/) package has support for using Arrow to
-move data to and from Spark, yielding [significant performance
-gains](http://arrow.apache.org/blog/2019/01/25/r-spark-improvements/).
-
-## Acknowledgements
-
-In addition to the work on wiring the R package up to the Arrow Parquet C++
-library, a lot of effort went into building and packaging Arrow for R users,
-ensuring its ease of installation across platforms. We'd like to thank the
-support of Jeroen Ooms, Javier Luraschi, JJ Allaire, Davis Vaughan, the CRAN
-team, and many others in the Apache Arrow community for helping us get to this
-point.
diff --git a/site/_release/0.1.0.md b/site/_release/0.1.0.md
deleted file mode 100644
index cf30fde..0000000
--- a/site/_release/0.1.0.md
+++ /dev/null
@@ -1,236 +0,0 @@
----
-layout: default
-title: 0.1.0 Release
-permalink: /release/0.1.0.html
----
-<!--
-{% comment %}
-Licensed to the Apache Software Foundation (ASF) under one or more
-contributor license agreements.  See the NOTICE file distributed with
-this work for additional information regarding copyright ownership.
-The ASF licenses this file to you under the Apache License, Version 2.0
-(the "License"); you may not use this file except in compliance with
-the License.  You may obtain a copy of the License at
-
-http://www.apache.org/licenses/LICENSE-2.0
-
-Unless required by applicable law or agreed to in writing, software
-distributed under the License is distributed on an "AS IS" BASIS,
-WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
-See the License for the specific language governing permissions and
-limitations under the License.
-{% endcomment %}
--->
-
-# Apache Arrow 0.1.0 (10 October 2016)
-
-## Download
-
-* **Source Release**: [apache-arrow-0.1.0.tar.gz][6]
-* **Verification**: [md5][3], [asc][7]
-
-# Changelog
-
-## Contributors
-
-```shell
-$ git shortlog -sn d5aa7c46..apache-arrow-0.1.0
-    49  Wes McKinney
-    27  Uwe L. Korn
-    25  Julien Le Dem
-    13  Micah Kornfield
-    11  Steven Phillips
-     6  Jihoon Son
-     5  Laurent Goujon
-     5  adeneche
-     4  Dan Robinson
-     4  proflin
-     2  Jacques Nadeau
-     1  Christopher C. Aycock
-     1  Edmon Begoli
-     1  Kai Zheng
-     1  MechCoder
-     1  Minji Kim
-     1  Philipp Moritz
-     1  Smyatkin Maxim
-     1  fengguangyuan
-     1  hyukjinkwon
-     1  hzhang2
-     1  lfzCarlosC
-```
-
-## New Features and Improvements
-
-* [ARROW-1](https://issues.apache.org/jira/browse/ARROW-1) - Import Initial Codebase
-* [ARROW-10](https://issues.apache.org/jira/browse/ARROW-10) - Fix mismatch of javadoc names and method parameters
-* [ARROW-100](https://issues.apache.org/jira/browse/ARROW-100) - [C++] Computing RowBatch size
-* [ARROW-101](https://issues.apache.org/jira/browse/ARROW-101) - Fix java warnings emitted by java compiler
-* [ARROW-102](https://issues.apache.org/jira/browse/ARROW-102) - travis-ci support for java project
-* [ARROW-106](https://issues.apache.org/jira/browse/ARROW-106) - Add IPC round trip for string types (string, char, varchar, binary)
-* [ARROW-107](https://issues.apache.org/jira/browse/ARROW-107) - [C++] add ipc round trip for struct types
-* [ARROW-11](https://issues.apache.org/jira/browse/ARROW-11) - Mirror JIRA activity to dev@arrow.apache.org
-* [ARROW-13](https://issues.apache.org/jira/browse/ARROW-13) - Add PR merge tool similar to that used in Parquet
-* [ARROW-14](https://issues.apache.org/jira/browse/ARROW-14) - Add JIRA components
-* [ARROW-15](https://issues.apache.org/jira/browse/ARROW-15) - Fix a naming typo for memory.AllocationManager.AllocationOutcome
-* [ARROW-19](https://issues.apache.org/jira/browse/ARROW-19) - C++: Externalize memory allocations and add a MemoryPool abstract interface to builder classes
-* [ARROW-190](https://issues.apache.org/jira/browse/ARROW-190) - Python: Provide installable sdist builds
-* [ARROW-197](https://issues.apache.org/jira/browse/ARROW-197) - [Python] Add conda dev recipe for pyarrow
-* [ARROW-199](https://issues.apache.org/jira/browse/ARROW-199) - [C++] Refine third party dependency
-* [ARROW-2](https://issues.apache.org/jira/browse/ARROW-2) - Post Simple Website
-* [ARROW-20](https://issues.apache.org/jira/browse/ARROW-20) - C++: Add null count member to Array containers, remove nullable member
-* [ARROW-201](https://issues.apache.org/jira/browse/ARROW-201) - C++: Initial ParquetWriter implementation
-* [ARROW-203](https://issues.apache.org/jira/browse/ARROW-203) - Python: Basic filename based Parquet read/write
-* [ARROW-204](https://issues.apache.org/jira/browse/ARROW-204) - [Python] Automate uploading conda build artifacts for libarrow and pyarrow
-* [ARROW-206](https://issues.apache.org/jira/browse/ARROW-206) - [C++] Expose an equality API for arrays that compares a range of slots on two arrays
-* [ARROW-21](https://issues.apache.org/jira/browse/ARROW-21) - C++: Add in-memory schema metadata container
-* [ARROW-212](https://issues.apache.org/jira/browse/ARROW-212) - [C++] Clarify the fact that PrimitiveArray is now abstract class
-* [ARROW-213](https://issues.apache.org/jira/browse/ARROW-213) - Exposing static arrow build
-* [ARROW-214](https://issues.apache.org/jira/browse/ARROW-214) - C++: Add String support to Parquet I/O
-* [ARROW-215](https://issues.apache.org/jira/browse/ARROW-215) - C++: Support other integer types in Parquet I/O
-* [ARROW-218](https://issues.apache.org/jira/browse/ARROW-218) - Add option to use GitHub API token via environment variable when merging PRs
-* [ARROW-22](https://issues.apache.org/jira/browse/ARROW-22) - C++: Add schema adapter routines for converting flat Parquet schemas to in-memory Arrow schemas
-* [ARROW-222](https://issues.apache.org/jira/browse/ARROW-222) - [C++] Create prototype file-like interface to HDFS (via libhdfs) and begin defining more general IO interface for Arrow data adapters
-* [ARROW-23](https://issues.apache.org/jira/browse/ARROW-23) - C++: Add logical "Column" container for chunked data
-* [ARROW-233](https://issues.apache.org/jira/browse/ARROW-233) - [C++] Add visibility defines for limiting shared library symbol visibility
-* [ARROW-234](https://issues.apache.org/jira/browse/ARROW-234) - [C++] Build with libhdfs support in arrow_io in conda builds
-* [ARROW-236](https://issues.apache.org/jira/browse/ARROW-236) - [Python] Enable Parquet read/write to work with HDFS file objects
-* [ARROW-237](https://issues.apache.org/jira/browse/ARROW-237) - [C++] Create Arrow specializations of Parquet allocator and read interfaces
-* [ARROW-238](https://issues.apache.org/jira/browse/ARROW-238) - C++: InternalMemoryPool::Free() should throw an error when there is insufficient allocated memory
-* [ARROW-24](https://issues.apache.org/jira/browse/ARROW-24) - C++: Add logical "Table" container
-* [ARROW-242](https://issues.apache.org/jira/browse/ARROW-242) - C++/Python: Support Timestamp Data Type
-* [ARROW-245](https://issues.apache.org/jira/browse/ARROW-245) - [Format] Clarify Arrow's relationship with big endian platforms
-* [ARROW-251](https://issues.apache.org/jira/browse/ARROW-251) - [C++] Expose APIs for getting code and message of the status
-* [ARROW-252](https://issues.apache.org/jira/browse/ARROW-252) - Add implementation guidelines to the documentation
-* [ARROW-253](https://issues.apache.org/jira/browse/ARROW-253) - Int types should only have width of 8*2^n (8, 16, 32, 64)
-* [ARROW-254](https://issues.apache.org/jira/browse/ARROW-254) - Remove Bit type as it is redundant with boolean
-* [ARROW-255](https://issues.apache.org/jira/browse/ARROW-255) - Finalize Dictionary representation
-* [ARROW-256](https://issues.apache.org/jira/browse/ARROW-256) - Add versioning to the arrow spec.
-* [ARROW-257](https://issues.apache.org/jira/browse/ARROW-257) - Add a typeids Vector to Union type
-* [ARROW-26](https://issues.apache.org/jira/browse/ARROW-26) - C++: Add developer instructions for building parquet-cpp integration
-* [ARROW-260](https://issues.apache.org/jira/browse/ARROW-260) - TestValueVector.testFixedVectorReallocation and testVariableVectorReallocation are flaky
-* [ARROW-262](https://issues.apache.org/jira/browse/ARROW-262) - [Format] Add a new format document for metadata and logical types for messaging and IPC / on-wire/file representations
-* [ARROW-264](https://issues.apache.org/jira/browse/ARROW-264) - Create an Arrow File format
-* [ARROW-267](https://issues.apache.org/jira/browse/ARROW-267) - [C++] C++ implementation of file-like layout for RPC / IPC
-* [ARROW-270](https://issues.apache.org/jira/browse/ARROW-270) - [Format] Define more generic Interval logical type
-* [ARROW-271](https://issues.apache.org/jira/browse/ARROW-271) - Update Field structure to be more explicit
-* [ARROW-272](https://issues.apache.org/jira/browse/ARROW-272) - Arrow release 0.1
-* [ARROW-279](https://issues.apache.org/jira/browse/ARROW-279) - rename vector module to arrow-vector for consistency
-* [ARROW-28](https://issues.apache.org/jira/browse/ARROW-28) - C++: Add google/benchmark to the 3rd-party build toolchain
-* [ARROW-280](https://issues.apache.org/jira/browse/ARROW-280) - [C++] Consolidate file and shared memory IO interfaces
-* [ARROW-285](https://issues.apache.org/jira/browse/ARROW-285) - Allow for custom flatc compiler
-* [ARROW-286](https://issues.apache.org/jira/browse/ARROW-286) - Build thirdparty dependencies in parallel
-* [ARROW-289](https://issues.apache.org/jira/browse/ARROW-289) - Install test-util.h
-* [ARROW-290](https://issues.apache.org/jira/browse/ARROW-290) - Specialize alloc() in ArrowBuf
-* [ARROW-292](https://issues.apache.org/jira/browse/ARROW-292) - [Java] Upgrade Netty to 4.041
-* [ARROW-293](https://issues.apache.org/jira/browse/ARROW-293) - [C++] Implementations of IO interfaces for operating system files
-* [ARROW-296](https://issues.apache.org/jira/browse/ARROW-296) - [C++] Remove arrow_parquet C++ module and related parts of build system
-* [ARROW-298](https://issues.apache.org/jira/browse/ARROW-298) - create release scripts
-* [ARROW-299](https://issues.apache.org/jira/browse/ARROW-299) - Use absolute namespace in macros
-* [ARROW-3](https://issues.apache.org/jira/browse/ARROW-3) - Post Initial Arrow Format Spec
-* [ARROW-30](https://issues.apache.org/jira/browse/ARROW-30) - Python: pandas/NumPy to/from Arrow conversion routines
-* [ARROW-301](https://issues.apache.org/jira/browse/ARROW-301) - [Format] Add some form of user field metadata to IPC schemas
-* [ARROW-302](https://issues.apache.org/jira/browse/ARROW-302) - [Python] Add support to use the Arrow file format with file-like objects
-* [ARROW-305](https://issues.apache.org/jira/browse/ARROW-305) - Add compression and use_dictionary options to Parquet interface
-* [ARROW-306](https://issues.apache.org/jira/browse/ARROW-306) - Add option to pass cmake arguments via environment variable
-* [ARROW-31](https://issues.apache.org/jira/browse/ARROW-31) - Python: basic PyList <-> Arrow marshaling code
-* [ARROW-315](https://issues.apache.org/jira/browse/ARROW-315) - Finalize timestamp type
-* [ARROW-318](https://issues.apache.org/jira/browse/ARROW-318) - [Python] Revise README to reflect current state of project
-* [ARROW-319](https://issues.apache.org/jira/browse/ARROW-319) - Add canonical Arrow Schema json representation
-* [ARROW-324](https://issues.apache.org/jira/browse/ARROW-324) - Update arrow metadata diagram
-* [ARROW-325](https://issues.apache.org/jira/browse/ARROW-325) - make TestArrowFile not dependent on timezone
-* [ARROW-35](https://issues.apache.org/jira/browse/ARROW-35) - Add a short call-to-action / how-to-get-involved to the main README.md
-* [ARROW-37](https://issues.apache.org/jira/browse/ARROW-37) - C++: Represent boolean array data in bit-packed form
-* [ARROW-4](https://issues.apache.org/jira/browse/ARROW-4) - Initial Arrow CPP Implementation
-* [ARROW-42](https://issues.apache.org/jira/browse/ARROW-42) - Python: Add to Travis CI build
-* [ARROW-43](https://issues.apache.org/jira/browse/ARROW-43) - Python: Add rudimentary console __repr__ for array types
-* [ARROW-44](https://issues.apache.org/jira/browse/ARROW-44) - Python: Implement basic object model for scalar values (i.e. results of arrow_arr[i])
-* [ARROW-48](https://issues.apache.org/jira/browse/ARROW-48) - Python: Add Schema object wrapper
-* [ARROW-49](https://issues.apache.org/jira/browse/ARROW-49) - Python: Add Column and Table wrapper interface
-* [ARROW-50](https://issues.apache.org/jira/browse/ARROW-50) - C++: Enable library builds for 3rd-party users without having to build thirdparty googletest
-* [ARROW-53](https://issues.apache.org/jira/browse/ARROW-53) - Python: Fix RPATH and add source installation instructions
-* [ARROW-54](https://issues.apache.org/jira/browse/ARROW-54) - Python: rename package to "pyarrow"
-* [ARROW-56](https://issues.apache.org/jira/browse/ARROW-56) - Format: Specify LSB bit ordering in bit arrays
-* [ARROW-57](https://issues.apache.org/jira/browse/ARROW-57) - Format: Draft data headers IDL for data interchange
-* [ARROW-58](https://issues.apache.org/jira/browse/ARROW-58) - Format: Draft type metadata ("schemas") IDL
-* [ARROW-59](https://issues.apache.org/jira/browse/ARROW-59) - Python: Boolean data support for builtin data structures
-* [ARROW-60](https://issues.apache.org/jira/browse/ARROW-60) - C++: Struct type builder API
-* [ARROW-64](https://issues.apache.org/jira/browse/ARROW-64) - Add zsh support to C++ build scripts
-* [ARROW-66](https://issues.apache.org/jira/browse/ARROW-66) - Maybe some missing steps in installation guide
-* [ARROW-67](https://issues.apache.org/jira/browse/ARROW-67) - C++: Draft type metadata conversion to/from IPC representation
-* [ARROW-68](https://issues.apache.org/jira/browse/ARROW-68) - Update setup_build_env and third-party script to be more user-friendly
-* [ARROW-7](https://issues.apache.org/jira/browse/ARROW-7) - Add Python library build toolchain
-* [ARROW-70](https://issues.apache.org/jira/browse/ARROW-70) - C++: Add "lite" DCHECK macros used in parquet-cpp
-* [ARROW-71](https://issues.apache.org/jira/browse/ARROW-71) - C++: Add script to run clang-tidy on codebase
-* [ARROW-73](https://issues.apache.org/jira/browse/ARROW-73) - Support CMake 2.8
-* [ARROW-76](https://issues.apache.org/jira/browse/ARROW-76) - Revise format document to include null count, defer non-nullable arrays to the domain of metadata
-* [ARROW-78](https://issues.apache.org/jira/browse/ARROW-78) - C++: Add constructor for DecimalType
-* [ARROW-79](https://issues.apache.org/jira/browse/ARROW-79) - Python: Add benchmarks
-* [ARROW-8](https://issues.apache.org/jira/browse/ARROW-8) - Set up Travis CI
-* [ARROW-82](https://issues.apache.org/jira/browse/ARROW-82) - C++: Implement IPC exchange for List types
-* [ARROW-83](https://issues.apache.org/jira/browse/ARROW-83) - Add basic test infrastructure for DecimalType
-* [ARROW-85](https://issues.apache.org/jira/browse/ARROW-85) - C++: memcmp can be avoided in Equal when comparing with the same Buffer
-* [ARROW-86](https://issues.apache.org/jira/browse/ARROW-86) - Python: Implement zero-copy Arrow-to-Pandas conversion
-* [ARROW-87](https://issues.apache.org/jira/browse/ARROW-87) - Implement Decimal schema conversion for all ways supported in Parquet
-* [ARROW-89](https://issues.apache.org/jira/browse/ARROW-89) - Python: Add benchmarks for Arrow<->Pandas conversion
-* [ARROW-9](https://issues.apache.org/jira/browse/ARROW-9) - Rename some unchanged "Drill" to "Arrow"
-* [ARROW-90](https://issues.apache.org/jira/browse/ARROW-90) - Apache Arrow cpp code does not support power architecture
-* [ARROW-91](https://issues.apache.org/jira/browse/ARROW-91) - C++: First draft of an adapter class for parquet-cpp's ParquetFileReader that produces Arrow table/row batch objects
-* [ARROW-92](https://issues.apache.org/jira/browse/ARROW-92) - C++: Arrow to Parquet Schema conversion
-
-## Bug Fixes
-
-* [ARROW-103](https://issues.apache.org/jira/browse/ARROW-103) - Missing patterns from .gitignore
-* [ARROW-104](https://issues.apache.org/jira/browse/ARROW-104) - Update Layout.md based on discussion on the mailing list
-* [ARROW-105](https://issues.apache.org/jira/browse/ARROW-105) - Unit tests fail if assertions are disabled
-* [ARROW-113](https://issues.apache.org/jira/browse/ARROW-113) - TestValueVector test fails if cannot allocate 2GB of memory
-* [ARROW-16](https://issues.apache.org/jira/browse/ARROW-16) - Building cpp issues on XCode 7.2.1
-* [ARROW-17](https://issues.apache.org/jira/browse/ARROW-17) - Set some vector fields to default access level for Drill compatibility
-* [ARROW-18](https://issues.apache.org/jira/browse/ARROW-18) - Fix bug with decimal precision and scale
-* [ARROW-185](https://issues.apache.org/jira/browse/ARROW-185) - [C++] Make sure alignment and memory padding conform to spec
-* [ARROW-188](https://issues.apache.org/jira/browse/ARROW-188) - Python: Add numpy as install requirement
-* [ARROW-193](https://issues.apache.org/jira/browse/ARROW-193) - For the instruction, typos "int his" should be "in this"
-* [ARROW-194](https://issues.apache.org/jira/browse/ARROW-194) - C++: Allow read-only memory mapped source
-* [ARROW-200](https://issues.apache.org/jira/browse/ARROW-200) - [Python] Convert Values String looks like it has incorrect error handling
-* [ARROW-209](https://issues.apache.org/jira/browse/ARROW-209) - [C++] Broken builds: llvm.org apt repos are unavailable
-* [ARROW-210](https://issues.apache.org/jira/browse/ARROW-210) - [C++] Tidy up the type system a little bit
-* [ARROW-211](https://issues.apache.org/jira/browse/ARROW-211) - Several typos/errors in Layout.md examples
-* [ARROW-217](https://issues.apache.org/jira/browse/ARROW-217) - Fix Travis w.r.t conda 4.1.0 changes
-* [ARROW-219](https://issues.apache.org/jira/browse/ARROW-219) - [C++] Passed CMAKE_CXX_FLAGS are being dropped, fix compiler warnings
-* [ARROW-223](https://issues.apache.org/jira/browse/ARROW-223) - Do not link against libpython
-* [ARROW-225](https://issues.apache.org/jira/browse/ARROW-225) - [C++/Python] master Travis CI build is broken
-* [ARROW-244](https://issues.apache.org/jira/browse/ARROW-244) - [C++] Some global APIs of IPC module should be visible to the outside
-* [ARROW-246](https://issues.apache.org/jira/browse/ARROW-246) - [Java] UnionVector doesn't call allocateNew() when creating its vectorType
-* [ARROW-247](https://issues.apache.org/jira/browse/ARROW-247) - [C++] Missing explicit destructor in RowBatchReader causes an incomplete type error
-* [ARROW-250](https://issues.apache.org/jira/browse/ARROW-250) - Fix for ARROW-246 may cause memory leaks
-* [ARROW-259](https://issues.apache.org/jira/browse/ARROW-259) - Use flatbuffer fields in java implementation
-* [ARROW-265](https://issues.apache.org/jira/browse/ARROW-265) - Negative decimal values have wrong padding
-* [ARROW-266](https://issues.apache.org/jira/browse/ARROW-266) - [C++] Fix the broken build
-* [ARROW-274](https://issues.apache.org/jira/browse/ARROW-274) - Make the MapVector nullable
-* [ARROW-278](https://issues.apache.org/jira/browse/ARROW-278) - [Format] Struct type name consistency in implementations and metadata
-* [ARROW-283](https://issues.apache.org/jira/browse/ARROW-283) - [C++] Update arrow_parquet to account for API changes in PARQUET-573
-* [ARROW-284](https://issues.apache.org/jira/browse/ARROW-284) - [C++] Triage builds by disabling Arrow-Parquet module
-* [ARROW-287](https://issues.apache.org/jira/browse/ARROW-287) - [java] Make nullable vectors use a BitVector instead of UInt1Vector for bits
-* [ARROW-297](https://issues.apache.org/jira/browse/ARROW-297) - Fix Arrow pom for release
-* [ARROW-304](https://issues.apache.org/jira/browse/ARROW-304) - NullableMapReaderImpl.isSet() always returns true
-* [ARROW-308](https://issues.apache.org/jira/browse/ARROW-308) - UnionListWriter.setPosition() should not call startList()
-* [ARROW-309](https://issues.apache.org/jira/browse/ARROW-309) - Types.getMinorTypeForArrowType() does not work for Union type
-* [ARROW-313](https://issues.apache.org/jira/browse/ARROW-313) - XCode 8.0 breaks builds
-* [ARROW-314](https://issues.apache.org/jira/browse/ARROW-314) - JSONScalar is unnecessary and unused.
-* [ARROW-320](https://issues.apache.org/jira/browse/ARROW-320) - ComplexCopier.copy(FieldReader, FieldWriter) should not start a list if reader is not set
-* [ARROW-321](https://issues.apache.org/jira/browse/ARROW-321) - Fix Arrow licences
-* [ARROW-36](https://issues.apache.org/jira/browse/ARROW-36) - Remove fixVersions from patch tool (until we have them)
-* [ARROW-46](https://issues.apache.org/jira/browse/ARROW-46) - Port DRILL-4410 to Arrow
-* [ARROW-5](https://issues.apache.org/jira/browse/ARROW-5) - Error when run maven install
-* [ARROW-51](https://issues.apache.org/jira/browse/ARROW-51) - Move ValueVector test from Drill project
-* [ARROW-55](https://issues.apache.org/jira/browse/ARROW-55) - Python: fix legacy Python (2.7) tests and add to Travis CI
-* [ARROW-62](https://issues.apache.org/jira/browse/ARROW-62) - Format: Are the nulls bits 0 or 1 for null values?
-* [ARROW-63](https://issues.apache.org/jira/browse/ARROW-63) - C++: ctest fails if Python 3 is the active Python interpreter
-* [ARROW-65](https://issues.apache.org/jira/browse/ARROW-65) - Python: FindPythonLibsNew does not work in a virtualenv
-* [ARROW-69](https://issues.apache.org/jira/browse/ARROW-69) - Change permissions for assignable users
-* [ARROW-72](https://issues.apache.org/jira/browse/ARROW-72) - FindParquet searches for non-existent header
-* [ARROW-75](https://issues.apache.org/jira/browse/ARROW-75) - C++: Fix handling of empty strings
-* [ARROW-77](https://issues.apache.org/jira/browse/ARROW-77) - C++: conform null bit interpretation to match ARROW-62
-* [ARROW-80](https://issues.apache.org/jira/browse/ARROW-80) - Segmentation fault on len(Array) for empty arrays
-* [ARROW-88](https://issues.apache.org/jira/browse/ARROW-88) - C++: Refactor given PARQUET-572
-* [ARROW-93](https://issues.apache.org/jira/browse/ARROW-93) - XCode 7.3 breaks builds
-* [ARROW-94](https://issues.apache.org/jira/browse/ARROW-94) - Expand list example to clarify null vs empty list
\ No newline at end of file
diff --git a/site/_release/0.10.0.md b/site/_release/0.10.0.md
deleted file mode 100644
index 3c7b263..0000000
--- a/site/_release/0.10.0.md
+++ /dev/null
@@ -1,612 +0,0 @@
----
-layout: default
-title: Apache Arrow 0.10.0 Release
-permalink: /release/0.10.0.html
----
-<!--
-{% comment %}
-Licensed to the Apache Software Foundation (ASF) under one or more
-contributor license agreements.  See the NOTICE file distributed with
-this work for additional information regarding copyright ownership.
-The ASF licenses this file to you under the Apache License, Version 2.0
-(the "License"); you may not use this file except in compliance with
-the License.  You may obtain a copy of the License at
-
-http://www.apache.org/licenses/LICENSE-2.0
-
-Unless required by applicable law or agreed to in writing, software
-distributed under the License is distributed on an "AS IS" BASIS,
-WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
-See the License for the specific language governing permissions and
-limitations under the License.
-{% endcomment %}
--->
-
-# Apache Arrow 0.10.0 (6 August 2018)
-
-This is a major release.
-
-## Download
-
-* [**Source Artifacts**][2]
-* [**Binary Artifacts**][3]
-* [Git tag][1]
-
-## Contributors
-
-```shell
-$ git shortlog -sn apache-arrow-0.9.0..apache-arrow-0.10.0
-    70  Antoine Pitrou
-    49  Kouhei Sutou
-    40  Korn, Uwe
-    37  Wes McKinney
-    32  Krisztián Szűcs
-    30  Andy Grove
-    20  Philipp Moritz
-    13  Phillip Cloud
-    11  Bryan Cutler
-    11  yosuke shiro
-     7  Dimitri Vorona
-     6  Zhijun Fu
-     5  Bruce Mitchener
-     5  Joshua Storck
-     5  Robert Nishihara
-     5  ptaylor
-     4  Maximilian Roos
-     4  Sebastien Binet
-     3  Alex
-     3  Brian Hulette
-     3  Chao Sun
-     3  Dominik Moritz
-     3  Kenji Okimoto
-     3  Marco Neumann
-     3  Yuhong Guo
-     2  Abhi
-     2  Dhruv Madeka
-     2  Dmitry Kalinkin
-     2  Donal Simmie
-     2  Frank Wessels
-     2  Julius Neuffer
-     2  Manabu Ejima
-     2  Omer Katz
-     2  Paddy
-     2  Paddy Horan
-     2  Robert Gruener
-     2  Teddy Choi
-     2  Vanco Buca
-     2  Venki Korukanti
-     2  bomeng
-     2  fjetter
-     2  liurenjie1024
-     2  songqing
-     1  284km
-     1  Adrian Dorr
-     1  Albert Shieh
-     1  Alessandro Andrioni
-     1  Alok Singh
-     1  Aneesh Karve
-     1  Atul Dambalkar
-     1  Ben Wolfson
-     1  Brent Kerby
-     1  Daniel Chalef
-     1  Daniel Compton
-     1  Florian Rathgeber
-     1  Gatis Seja
-     1  HE, Tao
-     1  James Lamb
-     1  Jeff Zhang
-     1  Juan Paulo Gutierrez
-     1  Kane
-     1  Kee Chong Tan
-     1  Kelsey Jordahl
-     1  Kendall Willets
-     1  Li Jin
-     1  Licht-T
-     1  Lizhou Gao
-     1  Louis Potok
-     1  Markus Klein
-     1  Matt Topol
-     1  Matthew Topol
-     1  Michael Sarahan
-     1  Paul Taylor
-     1  Peter Schafhalter
-     1  Philipp Hoch
-     1  Renato Marroquin
-     1  Richard Gowers
-     1  Robbie Gruener
-```
-
-# Patch Committers
-
-The following Apache committers merged contributed patches to the repository.
-
-```
-$ git shortlog -csn apache-arrow-0.9.0..apache-arrow-0.10.0
-   120  Wes McKinney
-   119  Korn, Uwe
-    63  Antoine Pitrou
-    50  Uwe L. Korn
-    28  Kouhei Sutou
-    27  Philipp Moritz
-    15  Bryan Cutler
-    15  Phillip Cloud
-     8  Robert Nishihara
-     6  Sidd
-     4  Brian Hulette
-     2  GitHub
-     1  Your Name Here
-     1  ptaylor
-```
-
-# Changelog
-
-## New Features and Improvements
-
-* [ARROW-1018](https://issues.apache.org/jira/browse/ARROW-1018) - [C++] Add option to create FileOutputStream, ReadableFile from OS file descriptor
-* [ARROW-1163](https://issues.apache.org/jira/browse/ARROW-1163) - [Plasma][Java] Java client for Plasma
-* [ARROW-1388](https://issues.apache.org/jira/browse/ARROW-1388) - [Python] Add Table.drop method for removing columns
-* [ARROW-1454](https://issues.apache.org/jira/browse/ARROW-1454) - [Python] More informative error message when attempting to write an unsupported Arrow type to Parquet format
-* [ARROW-1715](https://issues.apache.org/jira/browse/ARROW-1715) - [Python] Implement pickling for Column, ChunkedArray, RecordBatch, Table
-* [ARROW-1722](https://issues.apache.org/jira/browse/ARROW-1722) - [C++] Add linting script to look for C++/CLI issues
-* [ARROW-1731](https://issues.apache.org/jira/browse/ARROW-1731) - [Python] Provide for selecting a subset of columns to convert in RecordBatch/Table.from\_pandas
-* [ARROW-1744](https://issues.apache.org/jira/browse/ARROW-1744) - [Plasma] Provide TensorFlow operator to read tensors from plasma
-* [ARROW-1780](https://issues.apache.org/jira/browse/ARROW-1780) - [Java] JDBC Adapter for Apache Arrow
-* [ARROW-1858](https://issues.apache.org/jira/browse/ARROW-1858) - [Python] Add documentation about parquet.write\_to\_dataset and related methods
-* [ARROW-1868](https://issues.apache.org/jira/browse/ARROW-1868) - [Java] Change vector getMinorType to use MinorType instead of Types.MinorType
-* [ARROW-1886](https://issues.apache.org/jira/browse/ARROW-1886) - [Python] Add function to "flatten" structs within tables
-* [ARROW-1913](https://issues.apache.org/jira/browse/ARROW-1913) - [Java] Fix Javadoc generation bugs with JDK8
-* [ARROW-1928](https://issues.apache.org/jira/browse/ARROW-1928) - [C++] Add benchmarks comparing performance of internal::BitmapReader/Writer with naive approaches
-* [ARROW-1954](https://issues.apache.org/jira/browse/ARROW-1954) - [Python] Add metadata accessor to pyarrow.Field
-* [ARROW-1964](https://issues.apache.org/jira/browse/ARROW-1964) - [Python] Expose Builder classes
-* [ARROW-2014](https://issues.apache.org/jira/browse/ARROW-2014) - [Python] Document read\_pandas method in pyarrow.parquet
-* [ARROW-2055](https://issues.apache.org/jira/browse/ARROW-2055) - [Java] Upgrade to Java 8
-* [ARROW-2060](https://issues.apache.org/jira/browse/ARROW-2060) - [Python] Documentation for creating StructArray using from\_arrays or a sequence of dicts
-* [ARROW-2061](https://issues.apache.org/jira/browse/ARROW-2061) - [C++] Run ASAN builds in Travis CI
-* [ARROW-2074](https://issues.apache.org/jira/browse/ARROW-2074) - [Python] Allow type inference for struct arrays
-* [ARROW-2097](https://issues.apache.org/jira/browse/ARROW-2097) - [Python] Suppress valgrind stdout/stderr in Travis CI builds when there are no errors
-* [ARROW-2100](https://issues.apache.org/jira/browse/ARROW-2100) - [Python] Drop Python 3.4 support
-* [ARROW-2140](https://issues.apache.org/jira/browse/ARROW-2140) - [Python] Conversion from Numpy float16 array unimplemented
-* [ARROW-2141](https://issues.apache.org/jira/browse/ARROW-2141) - [Python] Conversion from Numpy object array to varsize binary unimplemented
-* [ARROW-2147](https://issues.apache.org/jira/browse/ARROW-2147) - [Python] Type inference doesn't work on lists of Numpy arrays
-* [ARROW-2207](https://issues.apache.org/jira/browse/ARROW-2207) - [GLib] Support decimal type
-* [ARROW-2222](https://issues.apache.org/jira/browse/ARROW-2222) - [C++] Add option to validate Flatbuffers messages
-* [ARROW-2224](https://issues.apache.org/jira/browse/ARROW-2224) - [C++] Get rid of boost regex usage
-* [ARROW-2241](https://issues.apache.org/jira/browse/ARROW-2241) - [Python] Simple script for running all current ASV benchmarks at a commit or tag
-* [ARROW-2264](https://issues.apache.org/jira/browse/ARROW-2264) - [Python] Efficiently serialize numpy arrays with dtype of unicode fixed length string
-* [ARROW-2267](https://issues.apache.org/jira/browse/ARROW-2267) - Rust bindings
-* [ARROW-2276](https://issues.apache.org/jira/browse/ARROW-2276) - [Python] Tensor could implement the buffer protocol
-* [ARROW-2281](https://issues.apache.org/jira/browse/ARROW-2281) - [Python] Expose MakeArray to construct arrays from buffers
-* [ARROW-2285](https://issues.apache.org/jira/browse/ARROW-2285) - [Python] Can't convert Numpy string arrays
-* [ARROW-2286](https://issues.apache.org/jira/browse/ARROW-2286) - [Python] Allow subscripting pyarrow.lib.StructValue
-* [ARROW-2287](https://issues.apache.org/jira/browse/ARROW-2287) - [Python] chunked array not iterable, not indexable
-* [ARROW-2299](https://issues.apache.org/jira/browse/ARROW-2299) - [Go] Go language implementation
-* [ARROW-2301](https://issues.apache.org/jira/browse/ARROW-2301) - [Python] Add source distribution publishing instructions to package / release management documentation
-* [ARROW-2302](https://issues.apache.org/jira/browse/ARROW-2302) - [GLib] Run autotools and meson Linux builds in same Travis CI build entry
-* [ARROW-2308](https://issues.apache.org/jira/browse/ARROW-2308) - Serialized tensor data should be 64-byte aligned.
-* [ARROW-2315](https://issues.apache.org/jira/browse/ARROW-2315) - [C++/Python] Add method to flatten a struct array
-* [ARROW-2319](https://issues.apache.org/jira/browse/ARROW-2319) - [C++] Add buffered output class implementing OutputStream interface
-* [ARROW-2322](https://issues.apache.org/jira/browse/ARROW-2322) - Document requirements to run dev/release/01-perform.sh
-* [ARROW-2325](https://issues.apache.org/jira/browse/ARROW-2325) - [Python] Update setup.py to use Markdown project description
-* [ARROW-2330](https://issues.apache.org/jira/browse/ARROW-2330) - [C++] Optimize delta buffer creation with partially finishable array builders
-* [ARROW-2332](https://issues.apache.org/jira/browse/ARROW-2332) - [Python] Provide API for reading multiple Feather files
-* [ARROW-2334](https://issues.apache.org/jira/browse/ARROW-2334) - [C++] Update boost to 1.66.0
-* [ARROW-2335](https://issues.apache.org/jira/browse/ARROW-2335) - [Go] Move Go README one directory higher
-* [ARROW-2340](https://issues.apache.org/jira/browse/ARROW-2340) - [Website] Add blog post about Go codebase donation
-* [ARROW-2341](https://issues.apache.org/jira/browse/ARROW-2341) - [Python] pa.union() mode argument unintuitive
-* [ARROW-2343](https://issues.apache.org/jira/browse/ARROW-2343) - [Java/Packaging] Run mvn clean in API doc builds
-* [ARROW-2344](https://issues.apache.org/jira/browse/ARROW-2344) - [Go] Run Go unit tests in Travis CI
-* [ARROW-2345](https://issues.apache.org/jira/browse/ARROW-2345) - [Documentation] Fix bundle exec and set sphinx nosidebar to True
-* [ARROW-2348](https://issues.apache.org/jira/browse/ARROW-2348) - [GLib] Remove Go example
-* [ARROW-2350](https://issues.apache.org/jira/browse/ARROW-2350) - Shrink size of spark\_integration Docker container
-* [ARROW-2353](https://issues.apache.org/jira/browse/ARROW-2353) - Test correctness of built wheel on AppVeyor
-* [ARROW-2361](https://issues.apache.org/jira/browse/ARROW-2361) - [Rust] Start native Rust Implementation
-* [ARROW-2364](https://issues.apache.org/jira/browse/ARROW-2364) - [Plasma] PlasmaClient::Get() could take vector of object ids
-* [ARROW-2376](https://issues.apache.org/jira/browse/ARROW-2376) - [Rust] Travis should run tests for Rust library
-* [ARROW-2378](https://issues.apache.org/jira/browse/ARROW-2378) - [Rust] Use rustfmt to format source code
-* [ARROW-2381](https://issues.apache.org/jira/browse/ARROW-2381) - [Rust] Buffer<T> should have an Iterator
-* [ARROW-2384](https://issues.apache.org/jira/browse/ARROW-2384) - Rust: Use Traits rather than defining methods directly
-* [ARROW-2385](https://issues.apache.org/jira/browse/ARROW-2385) - [Rust] Implement to\_json() for Field and DataType
-* [ARROW-2388](https://issues.apache.org/jira/browse/ARROW-2388) - [C++] Arrow::StringBuilder::Append() uses null\_bytes not valid\_bytes
-* [ARROW-2389](https://issues.apache.org/jira/browse/ARROW-2389) - [C++] Add StatusCode::OverflowError
-* [ARROW-2390](https://issues.apache.org/jira/browse/ARROW-2390) - [C++/Python] CheckPyError() could inspect exception type
-* [ARROW-2395](https://issues.apache.org/jira/browse/ARROW-2395) - [Python] Correct flake8 errors outside of pyarrow/ directory
-* [ARROW-2396](https://issues.apache.org/jira/browse/ARROW-2396) - Unify Rust Errors
-* [ARROW-2397](https://issues.apache.org/jira/browse/ARROW-2397) - Document changes in Tensor encoding in IPC.md.
-* [ARROW-2398](https://issues.apache.org/jira/browse/ARROW-2398) - [Rust] Provide a zero-copy builder for type-safe Buffer<T>
-* [ARROW-2400](https://issues.apache.org/jira/browse/ARROW-2400) - [C++] Status destructor is expensive
-* [ARROW-2401](https://issues.apache.org/jira/browse/ARROW-2401) - Support filters on Hive partitioned Parquet files
-* [ARROW-2402](https://issues.apache.org/jira/browse/ARROW-2402) - [C++] FixedSizeBinaryBuilder::Append lacks "const char\*" overload
-* [ARROW-2404](https://issues.apache.org/jira/browse/ARROW-2404) - Fix declaration of 'type\_id' hides class member warning in msvc build
-* [ARROW-2407](https://issues.apache.org/jira/browse/ARROW-2407) - [GLib] Add garrow\_string\_array\_builder\_append\_values()
-* [ARROW-2408](https://issues.apache.org/jira/browse/ARROW-2408) - [Rust] It should be possible to get a &mut[T] from Builder<T>
-* [ARROW-2411](https://issues.apache.org/jira/browse/ARROW-2411) - [C++] Add method to append batches of null-terminated strings to StringBuilder
-* [ARROW-2413](https://issues.apache.org/jira/browse/ARROW-2413) - [Rust] Remove useless use of \`format!\`
-* [ARROW-2414](https://issues.apache.org/jira/browse/ARROW-2414) - [Documentation] Fix miscellaneous documentation typos
-* [ARROW-2415](https://issues.apache.org/jira/browse/ARROW-2415) - [Rust] Fix using references in pattern matching
-* [ARROW-2416](https://issues.apache.org/jira/browse/ARROW-2416) - [C++] Support system libprotobuf
-* [ARROW-2417](https://issues.apache.org/jira/browse/ARROW-2417) - [Rust] Review APIs for safety
-* [ARROW-2422](https://issues.apache.org/jira/browse/ARROW-2422) - [Python] Support more filter operators on Hive partitioned Parquet files
-* [ARROW-2427](https://issues.apache.org/jira/browse/ARROW-2427) - [C++] ReadAt implementations suboptimal
-* [ARROW-2430](https://issues.apache.org/jira/browse/ARROW-2430) - MVP for branch based packaging automation
-* [ARROW-2433](https://issues.apache.org/jira/browse/ARROW-2433) - [Rust] Add Builder.push\_slice(&[T])
-* [ARROW-2434](https://issues.apache.org/jira/browse/ARROW-2434) - [Rust] Add windows support
-* [ARROW-2435](https://issues.apache.org/jira/browse/ARROW-2435) - [Rust] Add memory pool abstraction.
-* [ARROW-2436](https://issues.apache.org/jira/browse/ARROW-2436) - [Rust] Add windows CI
-* [ARROW-2440](https://issues.apache.org/jira/browse/ARROW-2440) - [Rust] Implement ListBuilder<T>
-* [ARROW-2442](https://issues.apache.org/jira/browse/ARROW-2442) - [C++] Disambiguate Builder::Append overloads
-* [ARROW-2445](https://issues.apache.org/jira/browse/ARROW-2445) - [Rust] Add documentation and make some fields private
-* [ARROW-2448](https://issues.apache.org/jira/browse/ARROW-2448) - Segfault when plasma client goes out of scope before buffer.
-* [ARROW-2451](https://issues.apache.org/jira/browse/ARROW-2451) - Handle more dtypes efficiently in custom numpy array serializer.
-* [ARROW-2453](https://issues.apache.org/jira/browse/ARROW-2453) - [Python] Improve Table column access
-* [ARROW-2458](https://issues.apache.org/jira/browse/ARROW-2458) - [Plasma] PlasmaClient uses global variable
-* [ARROW-2463](https://issues.apache.org/jira/browse/ARROW-2463) - [C++] Update flatbuffers to 1.9.0
-* [ARROW-2464](https://issues.apache.org/jira/browse/ARROW-2464) - [Python] Use a python\_version marker instead of a condition
-* [ARROW-2469](https://issues.apache.org/jira/browse/ARROW-2469) - Make out arguments last in ReadMessage API.
-* [ARROW-2470](https://issues.apache.org/jira/browse/ARROW-2470) - [C++] FileGetSize() should not seek
-* [ARROW-2472](https://issues.apache.org/jira/browse/ARROW-2472) - [Rust] The Schema and Fields types should not have public attributes
-* [ARROW-2477](https://issues.apache.org/jira/browse/ARROW-2477) - [Rust] Set up code coverage in CI
-* [ARROW-2478](https://issues.apache.org/jira/browse/ARROW-2478) - [C++] Introduce a checked\_cast function that performs a dynamic\_cast in debug mode
-* [ARROW-2479](https://issues.apache.org/jira/browse/ARROW-2479) - [C++] Have a global thread pool
-* [ARROW-2480](https://issues.apache.org/jira/browse/ARROW-2480) - [C++] Enable casting the value of a decimal to int32\_t or int64\_t
-* [ARROW-2481](https://issues.apache.org/jira/browse/ARROW-2481) - [Rust] Move calls to free() into memory.rs
-* [ARROW-2482](https://issues.apache.org/jira/browse/ARROW-2482) - [Rust] support nested types
-* [ARROW-2484](https://issues.apache.org/jira/browse/ARROW-2484) - [C++] Document ABI compliance checking
-* [ARROW-2485](https://issues.apache.org/jira/browse/ARROW-2485) - [C++] Output diff when run\_clang\_format.py reports a change
-* [ARROW-2486](https://issues.apache.org/jira/browse/ARROW-2486) - [C++/Python] Provide a Docker image that contains all dependencies for development
-* [ARROW-2488](https://issues.apache.org/jira/browse/ARROW-2488) - [C++] List Boost 1.67 as supported version
-* [ARROW-2493](https://issues.apache.org/jira/browse/ARROW-2493) - [Python] Add support for pickling to buffers and arrays
-* [ARROW-2494](https://issues.apache.org/jira/browse/ARROW-2494) - Return status codes from PlasmaClient::Seal
-* [ARROW-2498](https://issues.apache.org/jira/browse/ARROW-2498) - [Java] Upgrade to JDK 1.8
-* [ARROW-2499](https://issues.apache.org/jira/browse/ARROW-2499) - [C++] Add iterator facility for Python sequences
-* [ARROW-2505](https://issues.apache.org/jira/browse/ARROW-2505) - [C++] Disable MSVC warning C4800
-* [ARROW-2506](https://issues.apache.org/jira/browse/ARROW-2506) - [Plasma] Build error on macOS
-* [ARROW-2507](https://issues.apache.org/jira/browse/ARROW-2507) - [Rust] Don't take a reference when not needed
-* [ARROW-2508](https://issues.apache.org/jira/browse/ARROW-2508) - [Python] pytest API changes make tests fail
-* [ARROW-2513](https://issues.apache.org/jira/browse/ARROW-2513) - [Python] DictionaryType should give access to index type and dictionary array
-* [ARROW-2516](https://issues.apache.org/jira/browse/ARROW-2516) - AppVeyor Build Matrix should be specific to the changes made in a PR
-* [ARROW-2521](https://issues.apache.org/jira/browse/ARROW-2521) - [Rust] Refactor Rust API to use traits and generics
-* [ARROW-2522](https://issues.apache.org/jira/browse/ARROW-2522) - [C++] Version shared library files
-* [ARROW-2525](https://issues.apache.org/jira/browse/ARROW-2525) - [GLib] Add garrow\_struct\_array\_flatten()
-* [ARROW-2526](https://issues.apache.org/jira/browse/ARROW-2526) - [GLib] Update .gitignore
-* [ARROW-2527](https://issues.apache.org/jira/browse/ARROW-2527) - [GLib] Enable GPU document
-* [ARROW-2529](https://issues.apache.org/jira/browse/ARROW-2529) - [C++] Update mention of clang-format to 5.0 in the docs
-* [ARROW-2531](https://issues.apache.org/jira/browse/ARROW-2531) - [C++] Update clang bits to 6.0
-* [ARROW-2533](https://issues.apache.org/jira/browse/ARROW-2533) - [CI] Fast finish failing AppVeyor builds
-* [ARROW-2536](https://issues.apache.org/jira/browse/ARROW-2536) - [Rust] ListBuilder uses wrong initial size for offset builder
-* [ARROW-2537](https://issues.apache.org/jira/browse/ARROW-2537) - [Ruby] Import
-* [ARROW-2539](https://issues.apache.org/jira/browse/ARROW-2539) - [Plasma] Use unique\_ptr instead of raw pointer
-* [ARROW-2540](https://issues.apache.org/jira/browse/ARROW-2540) - [Plasma] add constructor/destructor to make sure dlfree is called automatically
-* [ARROW-2541](https://issues.apache.org/jira/browse/ARROW-2541) - [Plasma] Clean up macro usage
-* [ARROW-2543](https://issues.apache.org/jira/browse/ARROW-2543) - [Rust] CI should cache dependencies for faster builds
-* [ARROW-2544](https://issues.apache.org/jira/browse/ARROW-2544) - [CI] Run C++ tests with two jobs on Travis-CI
-* [ARROW-2547](https://issues.apache.org/jira/browse/ARROW-2547) - [Format] Fix off-by-one in List<List<byte>> example
-* [ARROW-2548](https://issues.apache.org/jira/browse/ARROW-2548) - [Format] Clarify \`List<Char>\` Array example
-* [ARROW-2549](https://issues.apache.org/jira/browse/ARROW-2549) - [GLib] Apply arrow::StatusCodes changes to GArrowError
-* [ARROW-2550](https://issues.apache.org/jira/browse/ARROW-2550) - [C++] Add missing status codes into arrow::StatusCode::CodeAsString()
-* [ARROW-2551](https://issues.apache.org/jira/browse/ARROW-2551) - [Plasma] Improve notification logic
-* [ARROW-2553](https://issues.apache.org/jira/browse/ARROW-2553) - [Python] Set MACOSX\_DEPLOYMENT\_TARGET in wheel build
-* [ARROW-2558](https://issues.apache.org/jira/browse/ARROW-2558) - [Plasma] avoid walk through all the objects when a client disconnects
-* [ARROW-2562](https://issues.apache.org/jira/browse/ARROW-2562) - [C++] Upload coverage data to codecov.io
-* [ARROW-2563](https://issues.apache.org/jira/browse/ARROW-2563) - [Rust] Poor caching in Travis-CI
-* [ARROW-2566](https://issues.apache.org/jira/browse/ARROW-2566) - [CI] Add codecov.io badge to README
-* [ARROW-2567](https://issues.apache.org/jira/browse/ARROW-2567) - [C++/Python] Unit is ignored on comparison of TimestampArrays
-* [ARROW-2568](https://issues.apache.org/jira/browse/ARROW-2568) - [Python] Expose thread pool size setting to Python, and deprecate "nthreads"
-* [ARROW-2569](https://issues.apache.org/jira/browse/ARROW-2569) - [C++] Improve thread pool size heuristic
-* [ARROW-2574](https://issues.apache.org/jira/browse/ARROW-2574) - [CI] Collect and publish Python coverage
-* [ARROW-2576](https://issues.apache.org/jira/browse/ARROW-2576) - [GLib] Add abs functions for Decimal128.
-* [ARROW-2577](https://issues.apache.org/jira/browse/ARROW-2577) - [Plasma] Add ASV benchmarks
-* [ARROW-2580](https://issues.apache.org/jira/browse/ARROW-2580) - [GLib] Fix abs functions for Decimal128
-* [ARROW-2582](https://issues.apache.org/jira/browse/ARROW-2582) - [GLib] Add negate functions for Decimal128
-* [ARROW-2585](https://issues.apache.org/jira/browse/ARROW-2585) - [C++] Add Decimal128::FromBigEndian
-* [ARROW-2586](https://issues.apache.org/jira/browse/ARROW-2586) - [C++] Make child builders of ListBuilder and StructBuilder shared\_ptr's
-* [ARROW-2595](https://issues.apache.org/jira/browse/ARROW-2595) - [Plasma] operator[] creates entries in map
-* [ARROW-2596](https://issues.apache.org/jira/browse/ARROW-2596) - [GLib] Use the default value of GTK-Doc
-* [ARROW-2597](https://issues.apache.org/jira/browse/ARROW-2597) - [Plasma] remove UniqueIDHasher
-* [ARROW-2604](https://issues.apache.org/jira/browse/ARROW-2604) - [Java] Add method overload for VarCharVector.set(int,String)
-* [ARROW-2608](https://issues.apache.org/jira/browse/ARROW-2608) - [Java/Python] Add pyarrow.{Array,Field}.from\_jvm / jvm\_buffer
-* [ARROW-2611](https://issues.apache.org/jira/browse/ARROW-2611) - [Python] Python 2 integer serialization
-* [ARROW-2612](https://issues.apache.org/jira/browse/ARROW-2612) - [Plasma] Fix deprecated PLASMA\_DEFAULT\_RELEASE\_DELAY
-* [ARROW-2613](https://issues.apache.org/jira/browse/ARROW-2613) - [Docs] Update the gen\_apidocs docker script
-* [ARROW-2614](https://issues.apache.org/jira/browse/ARROW-2614) - [CI] Remove 'group: deprecated' in Travis
-* [ARROW-2626](https://issues.apache.org/jira/browse/ARROW-2626) - [Python] pandas ArrowInvalid message should include failing column name
-* [ARROW-2634](https://issues.apache.org/jira/browse/ARROW-2634) - [Go] Add LICENSE additions for Go subproject
-* [ARROW-2635](https://issues.apache.org/jira/browse/ARROW-2635) - [Ruby] LICENSE.txt isn't suitable
-* [ARROW-2636](https://issues.apache.org/jira/browse/ARROW-2636) - [Ruby] "Unofficial" package note is missing
-* [ARROW-2638](https://issues.apache.org/jira/browse/ARROW-2638) - [Python] Prevent calling extension class constructors directly
-* [ARROW-2639](https://issues.apache.org/jira/browse/ARROW-2639) - [Python] Remove unnecessary \_check\_nullptr methods
-* [ARROW-2641](https://issues.apache.org/jira/browse/ARROW-2641) - [C++] Investigate spurious memset() calls
-* [ARROW-2645](https://issues.apache.org/jira/browse/ARROW-2645) - [Java] ArrowStreamWriter accumulates DictionaryBatch ArrowBlocks
-* [ARROW-2649](https://issues.apache.org/jira/browse/ARROW-2649) - [C++] Add std::generate()-like function for faster bitmap writing
-* [ARROW-2656](https://issues.apache.org/jira/browse/ARROW-2656) - [Python] Improve ParquetManifest creation time
-* [ARROW-2660](https://issues.apache.org/jira/browse/ARROW-2660) - [Python] Experiment with zero-copy pickling
-* [ARROW-2661](https://issues.apache.org/jira/browse/ARROW-2661) - [Python/C++] Allow passing HDFS Config values via map/dict instead of needing an hdfs-site.xml file
-* [ARROW-2662](https://issues.apache.org/jira/browse/ARROW-2662) - [Python] Add to\_pandas / to\_numpy to ChunkedArray
-* [ARROW-2663](https://issues.apache.org/jira/browse/ARROW-2663) - [Python] Make dictionary\_encode and unique accessible on Column / ChunkedArray
-* [ARROW-2664](https://issues.apache.org/jira/browse/ARROW-2664) - [Python] Implement \_\_getitem\_\_ / slicing on Buffer
-* [ARROW-2666](https://issues.apache.org/jira/browse/ARROW-2666) - [Python] numpy.asarray should trigger to\_pandas on Array/ChunkedArray
-* [ARROW-2672](https://issues.apache.org/jira/browse/ARROW-2672) - [Python] Build ORC extension in manylinux1 wheels
-* [ARROW-2674](https://issues.apache.org/jira/browse/ARROW-2674) - [Packaging] Start building nightlies
-* [ARROW-2676](https://issues.apache.org/jira/browse/ARROW-2676) - [Packaging] Deploy build artifacts to github releases
-* [ARROW-2677](https://issues.apache.org/jira/browse/ARROW-2677) - [Python] Expose Parquet ZSTD compression
-* [ARROW-2678](https://issues.apache.org/jira/browse/ARROW-2678) - [GLib] Add extra information to common build problems on macOS
-* [ARROW-2680](https://issues.apache.org/jira/browse/ARROW-2680) - [Python] Add documentation about type inference in Table.from\_pandas
-* [ARROW-2682](https://issues.apache.org/jira/browse/ARROW-2682) - [CI] Notify in Slack about broken builds
-* [ARROW-2689](https://issues.apache.org/jira/browse/ARROW-2689) - [Python] Remove references to timestamps\_to\_ms argument from documentation
-* [ARROW-2692](https://issues.apache.org/jira/browse/ARROW-2692) - [Python] Add test for writing dictionary encoded columns to chunked Parquet files
-* [ARROW-2695](https://issues.apache.org/jira/browse/ARROW-2695) - [Python] Prevent calling scalar constructors directly
-* [ARROW-2696](https://issues.apache.org/jira/browse/ARROW-2696) - [JAVA] enhance AllocationListener with an onFailedAllocation() call
-* [ARROW-2699](https://issues.apache.org/jira/browse/ARROW-2699) - [C++/Python] Add Table method that replaces a column with a new supplied column
-* [ARROW-2700](https://issues.apache.org/jira/browse/ARROW-2700) - [Python] Add simple examples to Array.cast docstring
-* [ARROW-2701](https://issues.apache.org/jira/browse/ARROW-2701) - [C++] Make MemoryMappedFile resizable
-* [ARROW-2704](https://issues.apache.org/jira/browse/ARROW-2704) - [Java] IPC stream handling should be more friendly to low level processing
-* [ARROW-2713](https://issues.apache.org/jira/browse/ARROW-2713) - [Packaging] Fix linux package builds
-* [ARROW-2717](https://issues.apache.org/jira/browse/ARROW-2717) - [Packaging] Postfix conda artifacts with target arch
-* [ARROW-2718](https://issues.apache.org/jira/browse/ARROW-2718) - [Packaging] GPG sign downloaded artifacts
-* [ARROW-2724](https://issues.apache.org/jira/browse/ARROW-2724) - [Packaging] Determine whether all the expected artifacts are uploaded
-* [ARROW-2725](https://issues.apache.org/jira/browse/ARROW-2725) - [JAVA] make Accountant.AllocationOutcome publicly visible
-* [ARROW-2729](https://issues.apache.org/jira/browse/ARROW-2729) - [GLib] Add decimal128 array builder
-* [ARROW-2731](https://issues.apache.org/jira/browse/ARROW-2731) - Allow usage of external ORC library
-* [ARROW-2732](https://issues.apache.org/jira/browse/ARROW-2732) - Update brew packages for macOS
-* [ARROW-2733](https://issues.apache.org/jira/browse/ARROW-2733) - [GLib] Cast garrow\_decimal128 to gint64
-* [ARROW-2738](https://issues.apache.org/jira/browse/ARROW-2738) - [GLib] Use Brewfile on installation process
-* [ARROW-2739](https://issues.apache.org/jira/browse/ARROW-2739) - [GLib] Use G\_DECLARE\_DERIVABLE\_TYPE for GArrowDecimalDataType and GArrowDecimal128ArrayBuilder
-* [ARROW-2740](https://issues.apache.org/jira/browse/ARROW-2740) - [Python] Add address property to Buffer
-* [ARROW-2742](https://issues.apache.org/jira/browse/ARROW-2742) - [Python] Allow Table.from\_batches to use Iterator of ArrowRecordBatches
-* [ARROW-2748](https://issues.apache.org/jira/browse/ARROW-2748) - [GLib] Add garrow\_decimal\_data\_type\_get\_scale() (and \_precision())
-* [ARROW-2749](https://issues.apache.org/jira/browse/ARROW-2749) - [GLib] Rename \*garrow\_decimal128\_array\_get\_value to \*garrow\_decimal128\_array\_format\_value
-* [ARROW-2751](https://issues.apache.org/jira/browse/ARROW-2751) - [GLib] Add garrow\_table\_replace\_column()
-* [ARROW-2752](https://issues.apache.org/jira/browse/ARROW-2752) - [GLib] Document garrow\_decimal\_data\_type\_new()
-* [ARROW-2753](https://issues.apache.org/jira/browse/ARROW-2753) - [GLib] Add garrow\_schema\_\*\_field()
-* [ARROW-2755](https://issues.apache.org/jira/browse/ARROW-2755) - [Python] Allow using Ninja to build extension
-* [ARROW-2756](https://issues.apache.org/jira/browse/ARROW-2756) - [Python] Remove redundant imports and minor fixes in parquet tests
-* [ARROW-2758](https://issues.apache.org/jira/browse/ARROW-2758) - [Plasma] Use Scope enum in Plasma
-* [ARROW-2760](https://issues.apache.org/jira/browse/ARROW-2760) - [Python] Remove legacy property definition syntax from parquet module and test them
-* [ARROW-2761](https://issues.apache.org/jira/browse/ARROW-2761) - Support set filter operators on Hive partitioned Parquet files
-* [ARROW-2763](https://issues.apache.org/jira/browse/ARROW-2763) - [Python] Make parquet \_metadata file accessible from ParquetDataset
-* [ARROW-2780](https://issues.apache.org/jira/browse/ARROW-2780) - [Go] Run code coverage analysis
-* [ARROW-2784](https://issues.apache.org/jira/browse/ARROW-2784) - [C++] MemoryMappedFile::WriteAt allow writing past the end
-* [ARROW-2790](https://issues.apache.org/jira/browse/ARROW-2790) - [C++] Buffers contain uninitialized memory
-* [ARROW-2791](https://issues.apache.org/jira/browse/ARROW-2791) - [Packaging] Build Ubuntu 18.04 packages
-* [ARROW-2792](https://issues.apache.org/jira/browse/ARROW-2792) - [Packaging] Consider uploading tarballs to avoid naming conflicts
-* [ARROW-2794](https://issues.apache.org/jira/browse/ARROW-2794) - [Plasma] Add Delete method for multiple objects
-* [ARROW-2798](https://issues.apache.org/jira/browse/ARROW-2798) - [Plasma] Use hashing function that takes into account all UniqueID bytes
-* [ARROW-2802](https://issues.apache.org/jira/browse/ARROW-2802) - [Docs] Move release management guide to project wiki
-* [ARROW-2804](https://issues.apache.org/jira/browse/ARROW-2804) - [Website] Link to Developer wiki (Confluence) from front page
-* [ARROW-2805](https://issues.apache.org/jira/browse/ARROW-2805) - [Python] TensorFlow import workaround not working with tensorflow-gpu if CUDA is not installed
-* [ARROW-2809](https://issues.apache.org/jira/browse/ARROW-2809) - [C++] Decrease verbosity of lint checks in Travis CI
-* [ARROW-2811](https://issues.apache.org/jira/browse/ARROW-2811) - [Python] Test serialization for determinism
-* [ARROW-2815](https://issues.apache.org/jira/browse/ARROW-2815) - [CI] Suppress DEBUG logging when building Java library in C++ CI entries
-* [ARROW-2816](https://issues.apache.org/jira/browse/ARROW-2816) - [Python] Add \_\_iter\_\_ method to NativeFile
-* [ARROW-2821](https://issues.apache.org/jira/browse/ARROW-2821) - [C++] Only zero memory in BooleanBuilder in one place
-* [ARROW-2822](https://issues.apache.org/jira/browse/ARROW-2822) - [C++] Zero padding bytes in PoolBuffer::Resize
-* [ARROW-2824](https://issues.apache.org/jira/browse/ARROW-2824) - [GLib] Add garrow\_decimal128\_array\_get\_value()
-* [ARROW-2825](https://issues.apache.org/jira/browse/ARROW-2825) - [C++] Need AllocateBuffer / AllocateResizableBuffer variant with default memory pool
-* [ARROW-2826](https://issues.apache.org/jira/browse/ARROW-2826) - [C++] Clarification needed between ArrayBuilder::Init(), Resize() and Reserve()
-* [ARROW-2827](https://issues.apache.org/jira/browse/ARROW-2827) - [C++] LZ4 and Zstd build may be failed in parallel build
-* [ARROW-2829](https://issues.apache.org/jira/browse/ARROW-2829) - [GLib] Add GArrowORCFileReader
-* [ARROW-2830](https://issues.apache.org/jira/browse/ARROW-2830) - [Packaging] Enable parallel build for deb package build again
-* [ARROW-2833](https://issues.apache.org/jira/browse/ARROW-2833) - [Python] Column.\_\_repr\_\_ will lock up Jupyter with large datasets
-* [ARROW-2834](https://issues.apache.org/jira/browse/ARROW-2834) - [GLib] Remove "enable\_" prefix from Meson options
-* [ARROW-2836](https://issues.apache.org/jira/browse/ARROW-2836) - [Packaging] Expand build matrices to multiple tasks
-* [ARROW-2837](https://issues.apache.org/jira/browse/ARROW-2837) - [C++] ArrayBuilder::null\_bitmap returns PoolBuffer
-* [ARROW-2838](https://issues.apache.org/jira/browse/ARROW-2838) - [Python] Speed up null testing with Pandas semantics
-* [ARROW-2844](https://issues.apache.org/jira/browse/ARROW-2844) - [Packaging] Test OSX wheels after build
-* [ARROW-2845](https://issues.apache.org/jira/browse/ARROW-2845) - [Packaging] Upload additional debian artifacts
-* [ARROW-2846](https://issues.apache.org/jira/browse/ARROW-2846) - [Packaging] Update nightly build in crossbow as well as the sample configuration
-* [ARROW-2847](https://issues.apache.org/jira/browse/ARROW-2847) - [Packaging] Fix artifact name matching for conda forge packages
-* [ARROW-2848](https://issues.apache.org/jira/browse/ARROW-2848) - [Packaging] lib\*.deb package name doesn't match so version
-* [ARROW-2849](https://issues.apache.org/jira/browse/ARROW-2849) - [Ruby] Arrow::Table#load supports ORC
-* [ARROW-2855](https://issues.apache.org/jira/browse/ARROW-2855) - [C++] Blog post that outlines the benefits of using jemalloc
-* [ARROW-2859](https://issues.apache.org/jira/browse/ARROW-2859) - [Python] Handle objects exporting the buffer protocol in open\_stream, open\_file, and RecordBatch\*Reader APIs
-* [ARROW-2861](https://issues.apache.org/jira/browse/ARROW-2861) - [Python] Add extra tips about using Parquet to store index-less pandas data
-* [ARROW-2864](https://issues.apache.org/jira/browse/ARROW-2864) - [Plasma] Add deletion cache to delete objects later
-* [ARROW-2868](https://issues.apache.org/jira/browse/ARROW-2868) - [Packaging] Fix centos-7 build
-* [ARROW-2869](https://issues.apache.org/jira/browse/ARROW-2869) - [Python] Add documentation for Array.to\_numpy
-* [ARROW-2875](https://issues.apache.org/jira/browse/ARROW-2875) - [Packaging] Don't attempt to download arrow archive in linux builds
-* [ARROW-2881](https://issues.apache.org/jira/browse/ARROW-2881) - [Website] Add Community tab to website
-* [ARROW-2884](https://issues.apache.org/jira/browse/ARROW-2884) - [Packaging] Options to build packages from apache source archive
-* [ARROW-2886](https://issues.apache.org/jira/browse/ARROW-2886) - [Release] An unused variable exists
-* [ARROW-2890](https://issues.apache.org/jira/browse/ARROW-2890) - [Plasma] Make Python PlasmaClient.release private
-* [ARROW-2893](https://issues.apache.org/jira/browse/ARROW-2893) - [C++] Remove PoolBuffer class from public API and hide implementation details behind factory functions
-* [ARROW-2897](https://issues.apache.org/jira/browse/ARROW-2897) - Organize supported Ubuntu versions
-* [ARROW-2898](https://issues.apache.org/jira/browse/ARROW-2898) - [Packaging] Setuptools\_scm just shipped a new version which fails to parse \`apache-arrow-<version>\` tag
-* [ARROW-2906](https://issues.apache.org/jira/browse/ARROW-2906) - [Website] Remove the link to slack channel
-* [ARROW-2907](https://issues.apache.org/jira/browse/ARROW-2907) - [GitHub] Improve "How to contribute patches"
-* [ARROW-2908](https://issues.apache.org/jira/browse/ARROW-2908) - [Rust] Update version to 0.10.0
-* [ARROW-2914](https://issues.apache.org/jira/browse/ARROW-2914) - [Integration] Add WindowPandasUDFTests to Spark Integration
-* [ARROW-2915](https://issues.apache.org/jira/browse/ARROW-2915) - [Packaging] Remove artifact from ubuntu-trusty build
-* [ARROW-2918](https://issues.apache.org/jira/browse/ARROW-2918) - [C++] Improve formatting of Struct pretty prints
-* [ARROW-2921](https://issues.apache.org/jira/browse/ARROW-2921) - [Release] Update .deb/.rpm changelogs in preparation
-* [ARROW-2922](https://issues.apache.org/jira/browse/ARROW-2922) - [Release] Make python command name customizable
-* [ARROW-2923](https://issues.apache.org/jira/browse/ARROW-2923) - [Doc] Add instructions for running Spark integration tests
-* [ARROW-2924](https://issues.apache.org/jira/browse/ARROW-2924) - [Java] mvn release fails when an older maven javadoc plugin is installed
-* [ARROW-2927](https://issues.apache.org/jira/browse/ARROW-2927) - [Packaging] AppVeyor wheel task is failing on initial checkout
-* [ARROW-2928](https://issues.apache.org/jira/browse/ARROW-2928) - [Packaging] AppVeyor crossbow conda builds are picking up boost 1.63.0 instead of the installed version
-* [ARROW-2929](https://issues.apache.org/jira/browse/ARROW-2929) - [C++] ARROW-2826 Breaks parquet-cpp 1.4.0 builds
-* [ARROW-2934](https://issues.apache.org/jira/browse/ARROW-2934) - [Packaging] Add checksums creation to sign subcommand
-* [ARROW-2935](https://issues.apache.org/jira/browse/ARROW-2935) - [Packaging] Add verify\_binary\_artifacts function to verify-release-candidate.sh
-* [ARROW-2937](https://issues.apache.org/jira/browse/ARROW-2937) - [Java] Follow-up changes to ARROW-2704
-* [ARROW-2943](https://issues.apache.org/jira/browse/ARROW-2943) - [C++] Implement BufferedOutputStream::Flush
-* [ARROW-2944](https://issues.apache.org/jira/browse/ARROW-2944) - [Format] Arrow columnar format docs mentions VectorLayout that does not exist anymore
-* [ARROW-2946](https://issues.apache.org/jira/browse/ARROW-2946) - [Packaging] Stop using PWD in debian/rules
-* [ARROW-2947](https://issues.apache.org/jira/browse/ARROW-2947) - [Packaging] Remove Ubuntu Artful
-* [ARROW-2949](https://issues.apache.org/jira/browse/ARROW-2949) - [CI] repo.continuum.io can be flaky in builds
-* [ARROW-2951](https://issues.apache.org/jira/browse/ARROW-2951) - [CI] Changes in format/ should cause Appveyor builds to run
-* [ARROW-2953](https://issues.apache.org/jira/browse/ARROW-2953) - [Plasma] Store memory usage
-* [ARROW-2954](https://issues.apache.org/jira/browse/ARROW-2954) - [Plasma] Store object\_id only once in object table
-* [ARROW-2962](https://issues.apache.org/jira/browse/ARROW-2962) - [Packaging] Bintray descriptor files are no longer needed
-* [ARROW-2977](https://issues.apache.org/jira/browse/ARROW-2977) - [Packaging] Release verification script should check rust too
-* [ARROW-2985](https://issues.apache.org/jira/browse/ARROW-2985) - [Ruby] Run unit tests in verify-release-candidate.sh
-* [ARROW-2988](https://issues.apache.org/jira/browse/ARROW-2988) - [Release] More automated release verification on Windows
-* [ARROW-2990](https://issues.apache.org/jira/browse/ARROW-2990) - [GLib] Fail to build with rpath-ed Arrow C++ on macOS
-* [ARROW-530](https://issues.apache.org/jira/browse/ARROW-530) - C++/Python: Provide subpools for better memory allocation tracking
-* [ARROW-564](https://issues.apache.org/jira/browse/ARROW-564) - [Python] Add methods to return vanilla NumPy arrays (plus boolean mask array if there are nulls)
-* [ARROW-889](https://issues.apache.org/jira/browse/ARROW-889) - [C++] Implement arrow::PrettyPrint for ChunkedArray
-* [ARROW-902](https://issues.apache.org/jira/browse/ARROW-902) - [C++] Build C++ project including thirdparty dependencies from local tarballs
-* [ARROW-906](https://issues.apache.org/jira/browse/ARROW-906) - [C++] Serialize Field metadata to IPC metadata
-
-## Bug Fixes
-
-* [ARROW-2059](https://issues.apache.org/jira/browse/ARROW-2059) - [Python] Possible performance regression in Feather read/write path
-* [ARROW-2101](https://issues.apache.org/jira/browse/ARROW-2101) - [Python] from\_pandas reads 'str' type as binary Arrow data with Python 2
-* [ARROW-2122](https://issues.apache.org/jira/browse/ARROW-2122) - [Python] Pyarrow fails to serialize dataframe with timestamp.
-* [ARROW-2182](https://issues.apache.org/jira/browse/ARROW-2182) - [Python] ASV benchmark setup does not account for C++ library changing
-* [ARROW-2193](https://issues.apache.org/jira/browse/ARROW-2193) - [Plasma] plasma\_store has runtime dependency on Boost shared libraries when ARROW\_BOOST\_USE\_SHARED=on
-* [ARROW-2195](https://issues.apache.org/jira/browse/ARROW-2195) - [Plasma] Segfault when retrieving RecordBatch from plasma store
-* [ARROW-2247](https://issues.apache.org/jira/browse/ARROW-2247) - [Python] Statically-linking boost\_regex in both libarrow and libparquet results in segfault
-* [ARROW-2273](https://issues.apache.org/jira/browse/ARROW-2273) - Cannot deserialize pandas SparseDataFrame
-* [ARROW-2300](https://issues.apache.org/jira/browse/ARROW-2300) - [Python] python/testing/test\_hdfs.sh no longer works
-* [ARROW-2305](https://issues.apache.org/jira/browse/ARROW-2305) - [Python] Cython 0.25.2 compilation failure
-* [ARROW-2314](https://issues.apache.org/jira/browse/ARROW-2314) - [Python] Union array slicing is defective
-* [ARROW-2326](https://issues.apache.org/jira/browse/ARROW-2326) - [Python] cannot import pip installed pyarrow on OS X (10.9)
-* [ARROW-2328](https://issues.apache.org/jira/browse/ARROW-2328) - Writing a slice with feather ignores the offset
-* [ARROW-2331](https://issues.apache.org/jira/browse/ARROW-2331) - [Python] Fix indexing implementations
-* [ARROW-2333](https://issues.apache.org/jira/browse/ARROW-2333) - [Python] boost bundling fails in setup.py
-* [ARROW-2342](https://issues.apache.org/jira/browse/ARROW-2342) - [Python] Aware timestamp type fails pickling
-* [ARROW-2346](https://issues.apache.org/jira/browse/ARROW-2346) - [Python] PYARROW\_CXXFLAGS doesn't accept multiple options
-* [ARROW-2349](https://issues.apache.org/jira/browse/ARROW-2349) - [Python] Boost shared library bundling is broken for MSVC
-* [ARROW-2351](https://issues.apache.org/jira/browse/ARROW-2351) - [C++] StringBuilder::append(vector<string>...) not implemented
-* [ARROW-2354](https://issues.apache.org/jira/browse/ARROW-2354) - [C++] PyDecimal\_Check() is much too slow
-* [ARROW-2355](https://issues.apache.org/jira/browse/ARROW-2355) - [Python] Unable to import pyarrow [0.9.0] OSX
-* [ARROW-2357](https://issues.apache.org/jira/browse/ARROW-2357) - Benchmark PandasObjectIsNull
-* [ARROW-2368](https://issues.apache.org/jira/browse/ARROW-2368) - DecimalVector#setBigEndian is not padding correctly for negative values
-* [ARROW-2369](https://issues.apache.org/jira/browse/ARROW-2369) - Large (>~20 GB) files written to Parquet via PyArrow are corrupted
-* [ARROW-2370](https://issues.apache.org/jira/browse/ARROW-2370) - [GLib] include path is wrong on Meson build
-* [ARROW-2371](https://issues.apache.org/jira/browse/ARROW-2371) - [GLib] gio-2.0 isn't required on GNU Autotools build
-* [ARROW-2372](https://issues.apache.org/jira/browse/ARROW-2372) - [Python] ArrowIOError: Invalid argument when reading Parquet file
-* [ARROW-2375](https://issues.apache.org/jira/browse/ARROW-2375) - [Rust] Buffer should release memory when dropped
-* [ARROW-2377](https://issues.apache.org/jira/browse/ARROW-2377) - [GLib] Travis-CI failures
-* [ARROW-2380](https://issues.apache.org/jira/browse/ARROW-2380) - [Python] Correct issues in numpy\_to\_arrow conversion routines
-* [ARROW-2382](https://issues.apache.org/jira/browse/ARROW-2382) - [Rust] List<T> was not using memory safely
-* [ARROW-2383](https://issues.apache.org/jira/browse/ARROW-2383) - [C++] Debian packages need to depend on libprotobuf
-* [ARROW-2387](https://issues.apache.org/jira/browse/ARROW-2387) - [Python] negative decimal values get spurious rescaling error
-* [ARROW-2391](https://issues.apache.org/jira/browse/ARROW-2391) - [Python] Segmentation fault from PyArrow when mapping Pandas datetime column to pyarrow.date64
-* [ARROW-2393](https://issues.apache.org/jira/browse/ARROW-2393) - [C++] arrow/status.h does not define ARROW\_CHECK needed for ARROW\_CHECK\_OK
-* [ARROW-2403](https://issues.apache.org/jira/browse/ARROW-2403) - [C++] arrow::CpuInfo::model\_name\_ destructed twice on exit
-* [ARROW-2405](https://issues.apache.org/jira/browse/ARROW-2405) - [C++] <functional> is missing in plasma/client.h
-* [ARROW-2418](https://issues.apache.org/jira/browse/ARROW-2418) - [Rust] List builder fails due to memory not being reserved correctly
-* [ARROW-2419](https://issues.apache.org/jira/browse/ARROW-2419) - [Site] Website generation depends on local timezone
-* [ARROW-2420](https://issues.apache.org/jira/browse/ARROW-2420) - [Rust] Memory is never released
-* [ARROW-2423](https://issues.apache.org/jira/browse/ARROW-2423) - [Python] PyArrow datatypes raise ValueError on equality checks against non-PyArrow objects
-* [ARROW-2424](https://issues.apache.org/jira/browse/ARROW-2424) - [Rust] Missing import causing broken build
-* [ARROW-2425](https://issues.apache.org/jira/browse/ARROW-2425) - [Rust] Array::from missing mapping for u8 type
-* [ARROW-2426](https://issues.apache.org/jira/browse/ARROW-2426) - [CI] glib build failure
-* [ARROW-2432](https://issues.apache.org/jira/browse/ARROW-2432) - [Python] from\_pandas fails when converting decimals if have None values
-* [ARROW-2437](https://issues.apache.org/jira/browse/ARROW-2437) - [C++] Change of arrow::ipc::ReadMessage signature breaks ABI compatibility
-* [ARROW-2441](https://issues.apache.org/jira/browse/ARROW-2441) - [Rust] Builder<T>::slice\_mut assertions are too strict
-* [ARROW-2443](https://issues.apache.org/jira/browse/ARROW-2443) - [Python] Conversion from pandas of empty categorical fails with ArrowInvalid
-* [ARROW-2450](https://issues.apache.org/jira/browse/ARROW-2450) - [Python] Saving to parquet fails for empty lists
-* [ARROW-2452](https://issues.apache.org/jira/browse/ARROW-2452) - [TEST] Spark integration test fails with permission error
-* [ARROW-2454](https://issues.apache.org/jira/browse/ARROW-2454) - [Python] Empty chunked array slice crashes
-* [ARROW-2455](https://issues.apache.org/jira/browse/ARROW-2455) - [C++] The bytes\_allocated\_ in CudaContextImpl isn't initialized
-* [ARROW-2457](https://issues.apache.org/jira/browse/ARROW-2457) - garrow\_array\_builder\_append\_values() won't work for large arrays
-* [ARROW-2459](https://issues.apache.org/jira/browse/ARROW-2459) - pyarrow: Segfault with pyarrow.deserialize\_pandas
-* [ARROW-2462](https://issues.apache.org/jira/browse/ARROW-2462) - [C++] Segfault when writing a parquet table containing a dictionary column from Record Batch Stream
-* [ARROW-2465](https://issues.apache.org/jira/browse/ARROW-2465) - [Plasma] plasma\_store fails to find libarrow\_gpu.so
-* [ARROW-2466](https://issues.apache.org/jira/browse/ARROW-2466) - [C++] misleading "append" flag to FileOutputStream
-* [ARROW-2468](https://issues.apache.org/jira/browse/ARROW-2468) - [Rust] Builder::slice\_mut should take mut self
-* [ARROW-2471](https://issues.apache.org/jira/browse/ARROW-2471) - [Rust] Assertion when pushing value to Builder/ListBuilder with zero capacity
-* [ARROW-2473](https://issues.apache.org/jira/browse/ARROW-2473) - [Rust] List assertion error with list of zero length
-* [ARROW-2474](https://issues.apache.org/jira/browse/ARROW-2474) - [Rust] Add windows support for memory pool abstraction
-* [ARROW-2489](https://issues.apache.org/jira/browse/ARROW-2489) - [Plasma] test\_plasma.py crashes
-* [ARROW-2491](https://issues.apache.org/jira/browse/ARROW-2491) - [Python] Array.from\_buffers does not work for ListArray
-* [ARROW-2492](https://issues.apache.org/jira/browse/ARROW-2492) - [Python] Prevent segfault on accidental call of pyarrow.Array
-* [ARROW-2500](https://issues.apache.org/jira/browse/ARROW-2500) - [Java] IPC Writers/readers are not always setting validity bits correctly
-* [ARROW-2502](https://issues.apache.org/jira/browse/ARROW-2502) - [Rust] Restore Windows Compatibility
-* [ARROW-2503](https://issues.apache.org/jira/browse/ARROW-2503) - [Python] Trailing space character in RowGroup statistics of pyarrow.parquet.ParquetFile
-* [ARROW-2509](https://issues.apache.org/jira/browse/ARROW-2509) - [CI] Intermittent npm failures
-* [ARROW-2511](https://issues.apache.org/jira/browse/ARROW-2511) - BaseVariableWidthVector.allocateNew is not throwing OOM when it can't allocate memory
-* [ARROW-2514](https://issues.apache.org/jira/browse/ARROW-2514) - [Python] Inferring / converting nested Numpy array is very slow
-* [ARROW-2515](https://issues.apache.org/jira/browse/ARROW-2515) - Errors with DictionaryArray inside of ListArray or other DictionaryArray
-* [ARROW-2518](https://issues.apache.org/jira/browse/ARROW-2518) - [Java] Restore Java unit tests and javadoc test to CI matrix
-* [ARROW-2530](https://issues.apache.org/jira/browse/ARROW-2530) - [GLib] Out-of-source build is failed
-* [ARROW-2534](https://issues.apache.org/jira/browse/ARROW-2534) - [C++] libarrow.so leaks zlib symbols
-* [ARROW-2545](https://issues.apache.org/jira/browse/ARROW-2545) - [Python] Arrow fails linking against statically-compiled Python
-* [ARROW-2554](https://issues.apache.org/jira/browse/ARROW-2554) - pa.array type inference bug when using NS-timestamp
-* [ARROW-2557](https://issues.apache.org/jira/browse/ARROW-2557) - [Rust] Add badge for code coverage in README
-* [ARROW-2561](https://issues.apache.org/jira/browse/ARROW-2561) - [C++] Crash in cuda-test shutdown with coverage enabled
-* [ARROW-2564](https://issues.apache.org/jira/browse/ARROW-2564) - [C++] Rowwise Tutorial is out of date
-* [ARROW-2565](https://issues.apache.org/jira/browse/ARROW-2565) - [Plasma] new subscriber cannot receive notifications about existing objects
-* [ARROW-2570](https://issues.apache.org/jira/browse/ARROW-2570) - [Python] Add support for writing parquet files with LZ4 compression
-* [ARROW-2571](https://issues.apache.org/jira/browse/ARROW-2571) - [C++] Lz4Codec doesn't properly handle empty data
-* [ARROW-2575](https://issues.apache.org/jira/browse/ARROW-2575) - [Python] Exclude hidden files when reading Parquet dataset
-* [ARROW-2578](https://issues.apache.org/jira/browse/ARROW-2578) - [Plasma] Valgrind errors related to std::random\_device
-* [ARROW-2589](https://issues.apache.org/jira/browse/ARROW-2589) - [Python] test\_parquet.py regression with Pandas 0.23.0
-* [ARROW-2593](https://issues.apache.org/jira/browse/ARROW-2593) - [Python] TypeError: data type "mixed-integer" not understood
-* [ARROW-2594](https://issues.apache.org/jira/browse/ARROW-2594) - [Java] Vector reallocation does not properly clear reused buffers
-* [ARROW-2601](https://issues.apache.org/jira/browse/ARROW-2601) - [Python] MemoryPool bytes\_allocated causes seg
-* [ARROW-2603](https://issues.apache.org/jira/browse/ARROW-2603) - [Python] from pandas raises ArrowInvalid for date(time) subclasses
-* [ARROW-2615](https://issues.apache.org/jira/browse/ARROW-2615) - [Rust] Refactor introduced a bug around Arrays of String
-* [ARROW-2629](https://issues.apache.org/jira/browse/ARROW-2629) - [Plasma] Iterator invalidation for pending\_notifications\_
-* [ARROW-2630](https://issues.apache.org/jira/browse/ARROW-2630) - [Java] Typo in the document
-* [ARROW-2632](https://issues.apache.org/jira/browse/ARROW-2632) - [Java] ArrowStreamWriter accumulates ArrowBlocks but does not use them
-* [ARROW-2640](https://issues.apache.org/jira/browse/ARROW-2640) - JS Writer should serialize schema metadata
-* [ARROW-2643](https://issues.apache.org/jira/browse/ARROW-2643) - [C++] Travis-CI build failure with cpp toolchain enabled
-* [ARROW-2644](https://issues.apache.org/jira/browse/ARROW-2644) - [Python] parquet binding fails building on AppVeyor
-* [ARROW-2655](https://issues.apache.org/jira/browse/ARROW-2655) - [C++] Failure with -Werror=conversion on gcc 7.3.0
-* [ARROW-2657](https://issues.apache.org/jira/browse/ARROW-2657) - Segfault when importing TensorFlow after Pyarrow
-* [ARROW-2668](https://issues.apache.org/jira/browse/ARROW-2668) - [C++] -Wnull-pointer-arithmetic warning with dlmalloc.c on clang 6.0, Ubuntu 14.04
-* [ARROW-2669](https://issues.apache.org/jira/browse/ARROW-2669) - [C++] EP\_CXX\_FLAGS not passed on when building gbenchmark
-* [ARROW-2675](https://issues.apache.org/jira/browse/ARROW-2675) - Arrow build error with clang-10 (Apple Clang / LLVM)
-* [ARROW-2683](https://issues.apache.org/jira/browse/ARROW-2683) - [Python] Resource Warning (Unclosed File) when using pyarrow.parquet.read\_table()
-* [ARROW-2690](https://issues.apache.org/jira/browse/ARROW-2690) - [C++] Plasma does not follow style conventions for variable and function names
-* [ARROW-2691](https://issues.apache.org/jira/browse/ARROW-2691) - [Rust] Travis fails due to formatting diff
-* [ARROW-2693](https://issues.apache.org/jira/browse/ARROW-2693) - [Python] pa.chunked\_array causes a segmentation fault on empty input
-* [ARROW-2694](https://issues.apache.org/jira/browse/ARROW-2694) - [Python] ArrayValue string conversion returns the representation instead of the converted python object string
-* [ARROW-2698](https://issues.apache.org/jira/browse/ARROW-2698) - [Python] Exception when passing a string to Table.column
-* [ARROW-2711](https://issues.apache.org/jira/browse/ARROW-2711) - [Python/C++] Pandas-Arrow doesn't roundtrip when column of lists has empty first element
-* [ARROW-2716](https://issues.apache.org/jira/browse/ARROW-2716) - [Python] Make manylinux1 base image independent of Python patch releases
-* [ARROW-2721](https://issues.apache.org/jira/browse/ARROW-2721) - [C++] Link error with Arrow C++ build with -DARROW\_ORC=ON on CentOS 7
-* [ARROW-2722](https://issues.apache.org/jira/browse/ARROW-2722) - [Python] ndarray to arrow conversion fails when downcasted from pandas to\_numeric
-* [ARROW-2723](https://issues.apache.org/jira/browse/ARROW-2723) - [C++] arrow-orc.pc is missing
-* [ARROW-2726](https://issues.apache.org/jira/browse/ARROW-2726) - [C++] The latest Boost version is wrong
-* [ARROW-2727](https://issues.apache.org/jira/browse/ARROW-2727) - [Java] Unable to build java/adapters module
-* [ARROW-2741](https://issues.apache.org/jira/browse/ARROW-2741) - [Python] pa.array from np.datetime[D] and type=pa.date64 produces invalid results
-* [ARROW-2744](https://issues.apache.org/jira/browse/ARROW-2744) - [Python] Writing to parquet crashes when writing a ListArray of empty lists
-* [ARROW-2745](https://issues.apache.org/jira/browse/ARROW-2745) - [C++] ORC ExternalProject needs to declare dependency on vendored protobuf
-* [ARROW-2747](https://issues.apache.org/jira/browse/ARROW-2747) - [CI] [Plasma] huge tables test failure on Travis
-* [ARROW-2754](https://issues.apache.org/jira/browse/ARROW-2754) - [Python] When installing pyarrow via pip, a debug build is created
-* [ARROW-2770](https://issues.apache.org/jira/browse/ARROW-2770) - [Packaging] Account for conda-forge compiler migration in conda recipes
-* [ARROW-2773](https://issues.apache.org/jira/browse/ARROW-2773) - [Python] Corrected parquet docs partition\_cols parameter name
-* [ARROW-2781](https://issues.apache.org/jira/browse/ARROW-2781) - [Python] Download boost using curl in manylinux1 image
-* [ARROW-2787](https://issues.apache.org/jira/browse/ARROW-2787) - [Python] Memory Issue passing table from python to c++ via cython
-* [ARROW-2795](https://issues.apache.org/jira/browse/ARROW-2795) - [Python] Run TensorFlow import workaround only on Linux
-* [ARROW-2806](https://issues.apache.org/jira/browse/ARROW-2806) - [Python] Inconsistent handling of np.nan
-* [ARROW-2810](https://issues.apache.org/jira/browse/ARROW-2810) - [Plasma] Plasma public headers leak flatbuffers.h
-* [ARROW-2812](https://issues.apache.org/jira/browse/ARROW-2812) - [Ruby] StructArray#[] raises NoMethodError
-* [ARROW-2820](https://issues.apache.org/jira/browse/ARROW-2820) - [Python] RecordBatch.from\_arrays does not validate array lengths are all equal
-* [ARROW-2823](https://issues.apache.org/jira/browse/ARROW-2823) - [C++] Search for flatbuffers in <root>/lib64
-* [ARROW-2841](https://issues.apache.org/jira/browse/ARROW-2841) - [Go] Fix recent Go build failures in Travis CI
-* [ARROW-2850](https://issues.apache.org/jira/browse/ARROW-2850) - [C++/Python] PARQUET\_RPATH\_ORIGIN=ON missing in manylinux1 build
-* [ARROW-2851](https://issues.apache.org/jira/browse/ARROW-2851) - [C++] Update RAT excludes for new install file names
-* [ARROW-2852](https://issues.apache.org/jira/browse/ARROW-2852) - [Rust] Mark Array as Sync and Send
-* [ARROW-2862](https://issues.apache.org/jira/browse/ARROW-2862) - [C++] Ensure thirdparty download directory has been created in thirdparty/download\_thirdparty.sh
-* [ARROW-2867](https://issues.apache.org/jira/browse/ARROW-2867) - [Python] Incorrect example for Cython usage
-* [ARROW-2871](https://issues.apache.org/jira/browse/ARROW-2871) - [Python] Array.to\_numpy is invalid for boolean arrays
-* [ARROW-2872](https://issues.apache.org/jira/browse/ARROW-2872) - [Python] Add pytest mark to opt into TensorFlow-related unit tests
-* [ARROW-2876](https://issues.apache.org/jira/browse/ARROW-2876) - [Packaging] Crossbow builds can hang if you cloned using SSH
-* [ARROW-2877](https://issues.apache.org/jira/browse/ARROW-2877) - [Packaging] crossbow submit results in duplicate Travis CI build
-* [ARROW-2878](https://issues.apache.org/jira/browse/ARROW-2878) - [Packaging] README.md does not mention setting GitHub API token in user's crossbow repo settings
-* [ARROW-2883](https://issues.apache.org/jira/browse/ARROW-2883) - [Plasma] Compilation warnings
-* [ARROW-2891](https://issues.apache.org/jira/browse/ARROW-2891) - Preserve schema in write\_to\_dataset
-* [ARROW-2894](https://issues.apache.org/jira/browse/ARROW-2894) - [GLib] Format tests broken due to recent refactor
-* [ARROW-2895](https://issues.apache.org/jira/browse/ARROW-2895) - [Ruby] CI isn't run when C++ is changed
-* [ARROW-2896](https://issues.apache.org/jira/browse/ARROW-2896) - [GLib] exports are missing
-* [ARROW-2901](https://issues.apache.org/jira/browse/ARROW-2901) - [Java] Build is failing on Java9
-* [ARROW-2902](https://issues.apache.org/jira/browse/ARROW-2902) - [Python] HDFS Docker integration tests leave around files created by root
-* [ARROW-2911](https://issues.apache.org/jira/browse/ARROW-2911) - [Python] Parquet binary statistics that end in '\0' truncate last byte
-* [ARROW-2917](https://issues.apache.org/jira/browse/ARROW-2917) - [Python] Tensor requiring gradient cannot be serialized with pyarrow.serialize
-* [ARROW-2920](https://issues.apache.org/jira/browse/ARROW-2920) - [Python] Segfault with pytorch 0.4
-* [ARROW-2926](https://issues.apache.org/jira/browse/ARROW-2926) - [Python] ParquetWriter segfaults in example where passed schema and table schema do not match
-* [ARROW-2930](https://issues.apache.org/jira/browse/ARROW-2930) - [C++] Trying to set target properties on a non-existent CMake target
-* [ARROW-2940](https://issues.apache.org/jira/browse/ARROW-2940) - [Python] Import error with pytorch 0.3
-* [ARROW-2945](https://issues.apache.org/jira/browse/ARROW-2945) - [Packaging] Update argument check for 02-source.sh
-* [ARROW-2955](https://issues.apache.org/jira/browse/ARROW-2955) - [Python] Typo in pyarrow's HDFS API result
-* [ARROW-2963](https://issues.apache.org/jira/browse/ARROW-2963) - [Python] Deadlock during fork-join and use\_threads=True
-* [ARROW-2978](https://issues.apache.org/jira/browse/ARROW-2978) - [Rust] Travis CI build is failing
-* [ARROW-2982](https://issues.apache.org/jira/browse/ARROW-2982) - The "--show-progress" option is only supported in wget 1.16 and higher
-* [ARROW-640](https://issues.apache.org/jira/browse/ARROW-640) - [Python] Arrow scalar values should have a sensible \_\_hash\_\_ and comparison
-
-[1]: https://github.com/apache/arrow/releases/tag/apache-arrow-0.10.0
-[2]: https://www.apache.org/dyn/closer.cgi/arrow/arrow-0.10.0/
-[3]: https://www.apache.org/dyn/closer.cgi/arrow/arrow-0.10.0/binaries
diff --git a/site/_release/0.11.0.md b/site/_release/0.11.0.md
deleted file mode 100644
index 4dd9e26..0000000
--- a/site/_release/0.11.0.md
+++ /dev/null
@@ -1,427 +0,0 @@
----
-layout: default
-title: Apache Arrow 0.11.0 Release
-permalink: /release/0.11.0.html
----
-<!--
-{% comment %}
-Licensed to the Apache Software Foundation (ASF) under one or more
-contributor license agreements.  See the NOTICE file distributed with
-this work for additional information regarding copyright ownership.
-The ASF licenses this file to you under the Apache License, Version 2.0
-(the "License"); you may not use this file except in compliance with
-the License.  You may obtain a copy of the License at
-
-http://www.apache.org/licenses/LICENSE-2.0
-
-Unless required by applicable law or agreed to in writing, software
-distributed under the License is distributed on an "AS IS" BASIS,
-WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
-See the License for the specific language governing permissions and
-limitations under the License.
-{% endcomment %}
--->
-
-# Apache Arrow 0.11.0 (8 October 2018)
-
-This is a major release.
-
-## Download
-
-* [**Source Artifacts**][1]
-* [**Binary Artifacts**][2]
-* [Git tag][3]
-
-## Contributors
-
-This includes patches from Apache Parquet that were merged.
-
-```shell
-$ git shortlog -sn apache-arrow-0.10.0..apache-arrow-0.11.0
-   166	Wes McKinney
-    59	Uwe L. Korn
-    57	Deepak Majeti
-    54	Kouhei Sutou
-    50	Krisztián Szűcs
-    48	Antoine Pitrou
-    38	Korn, Uwe
-    29	Pindikura Ravindra
-    21	praveenbingo
-     9	Vivekanand Vellanki
-     8	Philipp Moritz
-     8	Sebastien Binet
-     7	Paddy Horan
-     7	Phillip Cloud
-     6	Brian Hulette
-     6	Max Risuhin
-     6	Xianjin YE
-     5	Aliaksei Sandryhaila
-     5	Bryan Cutler
-     5	Romain Francois
-     4	Gidon Gershinsky
-     4	Thomas Sanchez
-     3	Artem Tarasov
-     3	Joshua Storck
-     3	Lukasz A. Bartnik
-     3	Robert Nishihara
-     3	rip.nsk
-     3	yosuke shiro
-     2	Amir Szekely
-     2	Andy Grove
-     2	Chao Sun
-     2	Chen, Junjie
-     2	Itai Incze
-     2	Jacek Pliszka
-     2	Li Jin
-     2	Marc Vertes
-     2	Nong Li
-     2	Pearu Peterson
-     2	Philipp Hoch
-     2	Renato Marroquin
-     2	Robert Gruener
-     2	fscheibner
-     2	ptaylor
-     1	Alec Posney
-     1	Animesh Trivedi
-     1	Chongyu Zhu
-     1	Colin Nichols
-     1	Dominik Moritz
-     1	Eric Conlon
-     1	Eric Daniel
-     1	Fabrizio Fabbri
-     1	Florian Scheibner
-     1	Jacques Nadeau
-     1	Jim Apple
-     1	Jim Crist
-     1	Julius Neuffer
-     1	Kalon Mills
-     1	Kashif Rasul
-     1	Kevin Gurney
-     1	Laurent Goujon
-     1	Masayuki Takahashi
-     1	Mike Trinkala
-     1	Pav A
-     1	Pedro M. Duarte
-     1	Rene Sugar
-     1	Ted Haining
-     1	Toby Shaw
-     1	Victor Uriarte
-     1	Wan Shen Lim
-     1	William Forson
-     1	Yuhong Guo
-     1	luc1ph3r
-     1	praveenkrishna.d
-     1	revaliu
-```
-
-# Patch Committers
-
-The following Apache committers committed the contributed patches to the
-repository. This includes patches from Apache Parquet that were merged.
-
-```shell
-$ git shortlog -csn apache-arrow-0.10.0..apache-arrow-0.11.0
-   471  Wes McKinney
-    66  Uwe L. Korn
-    30  Julien Le Dem
-    25  Korn, Uwe
-    23  Kouhei Sutou
-    18  Krisztián Szűcs
-    14  Nong Li
-    11  Antoine Pitrou
-     6  Deepak Majeti
-     3  Bryan Cutler
-     3  Li Jin
-     3  Philipp Moritz
-     3  Robert Nishihara
-     2  Brian Hulette
-     1  Andy Grove
-     1  Sidd
-     1  ptaylor
-```
-
-# Changelog
-
-## New Features and Improvements
-
-* [ARROW-1325](https://issues.apache.org/jira/browse/ARROW-1325) - [R] Bootstrap R bindings subproject
-* [ARROW-1424](https://issues.apache.org/jira/browse/ARROW-1424) - [Python] Initial bindings for libarrow\_gpu
-* [ARROW-1491](https://issues.apache.org/jira/browse/ARROW-1491) - [C++] Add casting implementations from strings to numbers or boolean
-* [ARROW-1521](https://issues.apache.org/jira/browse/ARROW-1521) - [C++] Add Reset method to BufferOutputStream to enable object reuse
-* [ARROW-1563](https://issues.apache.org/jira/browse/ARROW-1563) - [C++] Implement logical unary and binary kernels for boolean arrays
-* [ARROW-1860](https://issues.apache.org/jira/browse/ARROW-1860) - [C++] Add data structure to "stage" a sequence of IPC messages from in-memory data
-* [ARROW-1949](https://issues.apache.org/jira/browse/ARROW-1949) - [Python/C++] Add option to Array.from\_pandas and pyarrow.array to perform unsafe casts
-* [ARROW-1963](https://issues.apache.org/jira/browse/ARROW-1963) - [C++/Python] Create Array from sequence of numpy.datetime64
-* [ARROW-1968](https://issues.apache.org/jira/browse/ARROW-1968) - [Python] Unit testing setup for ORC files
-* [ARROW-2165](https://issues.apache.org/jira/browse/ARROW-2165) - Enhance AllocatorListener to listen for child allocator addition and removal
-* [ARROW-2338](https://issues.apache.org/jira/browse/ARROW-2338) - [Scripts] Windows release verification script should create a conda environment
-* [ARROW-249](https://issues.apache.org/jira/browse/ARROW-249) - [Flight] Define GRPC IDL / wire protocol for messaging with Arrow data
-* [ARROW-25](https://issues.apache.org/jira/browse/ARROW-25) - [C++] Implement delimited file scanner / CSV reader
-* [ARROW-2520](https://issues.apache.org/jira/browse/ARROW-2520) - [Rust] CI should also build against nightly Rust
-* [ARROW-2555](https://issues.apache.org/jira/browse/ARROW-2555) - [Python] Provide an option to convert on coerce\_timestamps instead of error
-* [ARROW-2583](https://issues.apache.org/jira/browse/ARROW-2583) - [Rust] Buffer should be typeless
-* [ARROW-2617](https://issues.apache.org/jira/browse/ARROW-2617) - [Rust] Schema should contain fields not columns
-* [ARROW-2687](https://issues.apache.org/jira/browse/ARROW-2687) - [JS] Example usage in README is outdated
-* [ARROW-2734](https://issues.apache.org/jira/browse/ARROW-2734) - [Python] Cython api example doesn't work by default on macOS
-* [ARROW-2750](https://issues.apache.org/jira/browse/ARROW-2750) - [MATLAB] Add MATLAB support for reading numeric types from Feather files
-* [ARROW-2799](https://issues.apache.org/jira/browse/ARROW-2799) - [Python] Add safe option to Table.from\_pandas to avoid unsafe casts
-* [ARROW-2813](https://issues.apache.org/jira/browse/ARROW-2813) - [C++] Strip uninformative lcov output from Travis CI logs
-* [ARROW-2817](https://issues.apache.org/jira/browse/ARROW-2817) - [C++] Enable libraries to be installed in msys2 on Windows
-* [ARROW-2840](https://issues.apache.org/jira/browse/ARROW-2840) - [C++] See if stream alignment logic can be simplified
-* [ARROW-2865](https://issues.apache.org/jira/browse/ARROW-2865) - [C++/Python] Reduce some duplicated code in python/builtin\_convert.cc
-* [ARROW-2889](https://issues.apache.org/jira/browse/ARROW-2889) - [C++] Add optional argument to ADD\_ARROW\_TEST CMake function to add unit test prefix
-* [ARROW-2900](https://issues.apache.org/jira/browse/ARROW-2900) - [Python] Improve performance of appending nested NumPy arrays in builtin\_convert.cc
-* [ARROW-2936](https://issues.apache.org/jira/browse/ARROW-2936) - [Python] Implement Table.cast for casting from one schema to another (if possible)
-* [ARROW-2948](https://issues.apache.org/jira/browse/ARROW-2948) - [Packaging] Generate changelog with crossbow
-* [ARROW-2950](https://issues.apache.org/jira/browse/ARROW-2950) - [C++] Clean up util/bit-util.h
-* [ARROW-2952](https://issues.apache.org/jira/browse/ARROW-2952) - [C++] Dockerfile for running include-what-you-use checks
-* [ARROW-2958](https://issues.apache.org/jira/browse/ARROW-2958) - [C++] Flatbuffers EP fails to compile with GCC 8.1
-* [ARROW-2960](https://issues.apache.org/jira/browse/ARROW-2960) - [Packaging] Fix verify-release-candidate for binary packages and fix release cutting script for lib64 cmake issue
-* [ARROW-2964](https://issues.apache.org/jira/browse/ARROW-2964) - [Go] wire all currently implemented array types in array.MakeFromData
-* [ARROW-2971](https://issues.apache.org/jira/browse/ARROW-2971) - [Python] Give more descriptive names to python\_to\_arrow.cc/arrow\_to\_python.cc
-* [ARROW-2975](https://issues.apache.org/jira/browse/ARROW-2975) - [Plasma] TensorFlow op: Compilation only working if arrow found by pkg-config
-* [ARROW-2976](https://issues.apache.org/jira/browse/ARROW-2976) - [Python] Directory in pyarrow.get\_library\_dirs() on Travis doesn't contain libarrow.so
-* [ARROW-2979](https://issues.apache.org/jira/browse/ARROW-2979) - [GLib] Add operator functions in GArrowDecimal128
-* [ARROW-2983](https://issues.apache.org/jira/browse/ARROW-2983) - [Packaging] Verify source release and binary artifacts in different scripts
-* [ARROW-2989](https://issues.apache.org/jira/browse/ARROW-2989) - [C++] Remove deprecated APIs in 0.10.0 and below
-* [ARROW-2991](https://issues.apache.org/jira/browse/ARROW-2991) - [CI] Cut down number of AppVeyor jobs
-* [ARROW-2994](https://issues.apache.org/jira/browse/ARROW-2994) - [C++] Only include Python C header directories for Python-related compilation units
-* [ARROW-2996](https://issues.apache.org/jira/browse/ARROW-2996) - [C++] Fix typo in cpp/.clang-tidy
-* [ARROW-2998](https://issues.apache.org/jira/browse/ARROW-2998) - [C++] Add variants of AllocateBuffer, AllocateResizeableBuffer that return unique\_ptr<Buffer>
-* [ARROW-2999](https://issues.apache.org/jira/browse/ARROW-2999) - [Python] Do not run ASV benchmarks in every Travis CI build to improve runtimes
-* [ARROW-3000](https://issues.apache.org/jira/browse/ARROW-3000) - [Python] Do not build unit tests other than python-test in travis\_script\_python.sh
-* [ARROW-3001](https://issues.apache.org/jira/browse/ARROW-3001) - [Packaging] Don't modify PATH during rust release verification
-* [ARROW-3002](https://issues.apache.org/jira/browse/ARROW-3002) - [Python] Implement better DataType hash function
-* [ARROW-3003](https://issues.apache.org/jira/browse/ARROW-3003) - [Doc] Enable Java doc in dev/gen\_apidocs/create\_documents.sh
-* [ARROW-3005](https://issues.apache.org/jira/browse/ARROW-3005) - [Website] Update website and write blog post for 0.10.0 release announcement
-* [ARROW-3008](https://issues.apache.org/jira/browse/ARROW-3008) - [Packaging] Verify GPU related modules if available
-* [ARROW-3009](https://issues.apache.org/jira/browse/ARROW-3009) - [Python] pyarrow.orc uses APIs now prohibited in 0.10.0
-* [ARROW-3010](https://issues.apache.org/jira/browse/ARROW-3010) - [GLib] Update README to use Bundler
-* [ARROW-3017](https://issues.apache.org/jira/browse/ARROW-3017) - [C++] Don't throw exception in arrow/util/thread-pool.h
-* [ARROW-3018](https://issues.apache.org/jira/browse/ARROW-3018) - [Plasma] Improve random ObjectID generation
-* [ARROW-3019](https://issues.apache.org/jira/browse/ARROW-3019) - [Packaging] Use Bundler to verify Arrow GLib
-* [ARROW-3021](https://issues.apache.org/jira/browse/ARROW-3021) - [Go] support for List
-* [ARROW-3022](https://issues.apache.org/jira/browse/ARROW-3022) - [Go] support for Struct
-* [ARROW-3023](https://issues.apache.org/jira/browse/ARROW-3023) - [C++] Use gold linker in builds if it is available
-* [ARROW-3024](https://issues.apache.org/jira/browse/ARROW-3024) - [C++] Replace usages of std::mutex with atomics in memory\_pool.cc
-* [ARROW-3026](https://issues.apache.org/jira/browse/ARROW-3026) - [Plasma] Only run Plasma Python unit tests under valgrind once instead of twice in CI
-* [ARROW-3027](https://issues.apache.org/jira/browse/ARROW-3027) - [Ruby] Stop "git tag" by "rake release"
-* [ARROW-3028](https://issues.apache.org/jira/browse/ARROW-3028) - [Python] Trim unneeded work from documentation build in Travis CI
-* [ARROW-3029](https://issues.apache.org/jira/browse/ARROW-3029) - [Python] pkg\_resources is slow
-* [ARROW-3031](https://issues.apache.org/jira/browse/ARROW-3031) - [Go] Streamline release of Arrays and Builders
-* [ARROW-3034](https://issues.apache.org/jira/browse/ARROW-3034) - [Packaging] Source archive can't be extracted by bsdtar on MSYS2
-* [ARROW-3035](https://issues.apache.org/jira/browse/ARROW-3035) - [Rust] Examples in README.md do not run
-* [ARROW-3036](https://issues.apache.org/jira/browse/ARROW-3036) - [Go] add support for slicing Arrays
-* [ARROW-3037](https://issues.apache.org/jira/browse/ARROW-3037) - [Go] add support for NullArray
-* [ARROW-3042](https://issues.apache.org/jira/browse/ARROW-3042) - [Go] add badge to GoDoc in the Go-Arrow README
-* [ARROW-3043](https://issues.apache.org/jira/browse/ARROW-3043) - [C++] pthread doesn't exist on MinGW
-* [ARROW-3044](https://issues.apache.org/jira/browse/ARROW-3044) - [Python] Remove all occurrences of cython's legacy property definition syntax
-* [ARROW-3045](https://issues.apache.org/jira/browse/ARROW-3045) - [Python] Remove nullcheck from ipc Message and MessageReader
-* [ARROW-3046](https://issues.apache.org/jira/browse/ARROW-3046) - [GLib] Use rubyish method in test-orc-file-reader.rb
-* [ARROW-3050](https://issues.apache.org/jira/browse/ARROW-3050) - [C++] Adopt HiveServer2 client C++ codebase
-* [ARROW-3057](https://issues.apache.org/jira/browse/ARROW-3057) - [INTEGRATION] Fix spark and hdfs dockerfiles
-* [ARROW-3059](https://issues.apache.org/jira/browse/ARROW-3059) - [C++] Streamline namespace array::test
-* [ARROW-3060](https://issues.apache.org/jira/browse/ARROW-3060) - [C++] Factor out parsing routines
-* [ARROW-3062](https://issues.apache.org/jira/browse/ARROW-3062) - [Python] Extend fast libtensorflow\_framework.so compatibility workaround to Python 2.7
-* [ARROW-3064](https://issues.apache.org/jira/browse/ARROW-3064) - [C++] Add option to ADD\_ARROW\_TEST to indicate additional dependencies for particular unit test executables
-* [ARROW-3067](https://issues.apache.org/jira/browse/ARROW-3067) - [Packaging] Support dev/rc/release .deb/.rpm builds
-* [ARROW-3068](https://issues.apache.org/jira/browse/ARROW-3068) - [Packaging] Bump version to 0.11.0-SNAPSHOT
-* [ARROW-3069](https://issues.apache.org/jira/browse/ARROW-3069) - [Release] Stop using SHA1 checksums per ASF policy
-* [ARROW-3072](https://issues.apache.org/jira/browse/ARROW-3072) - [C++] Use ARROW\_RETURN\_NOT\_OK instead of RETURN\_NOT\_OK in header files
-* [ARROW-3075](https://issues.apache.org/jira/browse/ARROW-3075) - [C++] Incorporate apache/parquet-cpp codebase into Arrow C++ codebase and build system
-* [ARROW-3076](https://issues.apache.org/jira/browse/ARROW-3076) - [Website] Add Google Analytics tags to C++, Python API docs
-* [ARROW-3088](https://issues.apache.org/jira/browse/ARROW-3088) - [Rust] Use internal \`Result<T>\` type instead of \`Result<T, ArrowError>\`
-* [ARROW-3090](https://issues.apache.org/jira/browse/ARROW-3090) - [Rust] Accompany error messages with assertions
-* [ARROW-3094](https://issues.apache.org/jira/browse/ARROW-3094) - [Python] Allow lighter construction of pa.Schema / pa.StructType
-* [ARROW-3099](https://issues.apache.org/jira/browse/ARROW-3099) - [C++] Add benchmark for number parsing
-* [ARROW-3105](https://issues.apache.org/jira/browse/ARROW-3105) - [Plasma] Improve flushing error message
-* [ARROW-3106](https://issues.apache.org/jira/browse/ARROW-3106) - [Website] Update committers and PMC roster on website
-* [ARROW-3109](https://issues.apache.org/jira/browse/ARROW-3109) - [Python] Add Python 3.7 virtualenvs to manylinux1 container
-* [ARROW-3110](https://issues.apache.org/jira/browse/ARROW-3110) - [C++] Compilation warnings with gcc 7.3.0
-* [ARROW-3111](https://issues.apache.org/jira/browse/ARROW-3111) - [Java] Enable changing default logging level when running tests
-* [ARROW-3114](https://issues.apache.org/jira/browse/ARROW-3114) - [Website] Add information about user@ mailing list to website / Community page
-* [ARROW-3115](https://issues.apache.org/jira/browse/ARROW-3115) - [Java] Style Checks - Fix import ordering
-* [ARROW-3116](https://issues.apache.org/jira/browse/ARROW-3116) - [Plasma] Add "ls" to object store
-* [ARROW-3117](https://issues.apache.org/jira/browse/ARROW-3117) - [GLib] Add garrow\_chunked\_array\_to\_string()
-* [ARROW-3119](https://issues.apache.org/jira/browse/ARROW-3119) - [Packaging] Nightly packaging script fails
-* [ARROW-3127](https://issues.apache.org/jira/browse/ARROW-3127) - [C++] Add Tutorial about Sending Tensor from C++ to Python
-* [ARROW-3128](https://issues.apache.org/jira/browse/ARROW-3128) - [C++] Support system shared zlib
-* [ARROW-3129](https://issues.apache.org/jira/browse/ARROW-3129) - [Packaging] Stop using deprecated BuildRoot and Group in .rpm
-* [ARROW-3130](https://issues.apache.org/jira/browse/ARROW-3130) - [Go] add initial support for Go modules
-* [ARROW-3136](https://issues.apache.org/jira/browse/ARROW-3136) - [C++] Clean up arrow:: public API
-* [ARROW-3142](https://issues.apache.org/jira/browse/ARROW-3142) - [C++] Fetch all libs from toolchain environment
-* [ARROW-3143](https://issues.apache.org/jira/browse/ARROW-3143) - [C++] CopyBitmap into existing memory
-* [ARROW-3146](https://issues.apache.org/jira/browse/ARROW-3146) - [C++] Barebones Flight RPC server and client implementations
-* [ARROW-3147](https://issues.apache.org/jira/browse/ARROW-3147) - [C++] MSVC version isn't detected in code page 932
-* [ARROW-3148](https://issues.apache.org/jira/browse/ARROW-3148) - [C++] MSVC shows C4819 warning on code page 932
-* [ARROW-3152](https://issues.apache.org/jira/browse/ARROW-3152) - [C++][Packaging] Use dynamic linking for zlib in conda recipes
-* [ARROW-3153](https://issues.apache.org/jira/browse/ARROW-3153) - [Packaging] Fix broken nightly package builds introduced with recent cmake changes and orc tests
-* [ARROW-3157](https://issues.apache.org/jira/browse/ARROW-3157) - [C++] Improve buffer creation for typed data
-* [ARROW-3158](https://issues.apache.org/jira/browse/ARROW-3158) - [C++] Handle float truncation during casting
-* [ARROW-3160](https://issues.apache.org/jira/browse/ARROW-3160) - [Python] Improve pathlib.Path support in parquet and filesystem modules
-* [ARROW-3163](https://issues.apache.org/jira/browse/ARROW-3163) - [Python] Cython dependency is missing in non wheel package
-* [ARROW-3167](https://issues.apache.org/jira/browse/ARROW-3167) - [CI] Limit clcache cache size
-* [ARROW-3170](https://issues.apache.org/jira/browse/ARROW-3170) - [C++] Implement "readahead spooler" class for background input buffering
-* [ARROW-3171](https://issues.apache.org/jira/browse/ARROW-3171) - [Java] checkstyle - fix line length and indentation
-* [ARROW-3172](https://issues.apache.org/jira/browse/ARROW-3172) - [Rust] Update documentation for datatypes.rs
-* [ARROW-3174](https://issues.apache.org/jira/browse/ARROW-3174) - [Rust] run examples as part of CI
-* [ARROW-3177](https://issues.apache.org/jira/browse/ARROW-3177) - [Rust] Update expected error messages for tests that 'should panic'
-* [ARROW-3180](https://issues.apache.org/jira/browse/ARROW-3180) - [C++] Add docker-compose setup to simulate Travis CI run locally
-* [ARROW-3181](https://issues.apache.org/jira/browse/ARROW-3181) - [Packaging] Adjust conda package scripts to account for Parquet codebase migration
-* [ARROW-3182](https://issues.apache.org/jira/browse/ARROW-3182) - [C++] Merge Gandiva codebase
-* [ARROW-3187](https://issues.apache.org/jira/browse/ARROW-3187) - [Plasma] Make Plasma Log pluggable with glog
-* [ARROW-3195](https://issues.apache.org/jira/browse/ARROW-3195) - [C++] NumPy initialization error check is missing in test
-* [ARROW-3196](https://issues.apache.org/jira/browse/ARROW-3196) - Enable merge\_arrow\_py.py script to merge Parquet patches and set fix versions
-* [ARROW-3197](https://issues.apache.org/jira/browse/ARROW-3197) - [C++] Add instructions to cpp/README.md about Parquet-only development and Arrow+Parquet
-* [ARROW-3211](https://issues.apache.org/jira/browse/ARROW-3211) - [C++] gold linker doesn't work with MinGW-w64
-* [ARROW-3212](https://issues.apache.org/jira/browse/ARROW-3212) - [C++] Create deterministic IPC metadata
-* [ARROW-3213](https://issues.apache.org/jira/browse/ARROW-3213) - [C++] Use CMake to build vendored Snappy on Windows
-* [ARROW-3214](https://issues.apache.org/jira/browse/ARROW-3214) - [C++] Disable insecure warnings with MinGW build
-* [ARROW-3215](https://issues.apache.org/jira/browse/ARROW-3215) - [C++] Add support for finding libpython on MSYS2
-* [ARROW-3216](https://issues.apache.org/jira/browse/ARROW-3216) - [C++] libpython isn't linked to libarrow\_python in MinGW build
-* [ARROW-3217](https://issues.apache.org/jira/browse/ARROW-3217) - [C++] ARROW\_STATIC definition is missing in MinGW build
-* [ARROW-3218](https://issues.apache.org/jira/browse/ARROW-3218) - [C++] Utilities has needless pthread link in MinGW build
-* [ARROW-3219](https://issues.apache.org/jira/browse/ARROW-3219) - [C++] Use Win32 API in MinGW
-* [ARROW-3223](https://issues.apache.org/jira/browse/ARROW-3223) - [GLib] Use the same shared object versioning rule as C++
-* [ARROW-3229](https://issues.apache.org/jira/browse/ARROW-3229) - [Packaging] Adjust wheel package scripts to account for Parquet codebase migration
-* [ARROW-3234](https://issues.apache.org/jira/browse/ARROW-3234) - [C++] Link order is wrong when ARROW\_ORC=on and ARROW\_PROTOBUF\_USE\_SHARED=ON
-* [ARROW-3235](https://issues.apache.org/jira/browse/ARROW-3235) - [Packaging] Update deb names
-* [ARROW-3236](https://issues.apache.org/jira/browse/ARROW-3236) - [C++] OutputStream bookkeeping logic when writing IPC file format is incorrect
-* [ARROW-3240](https://issues.apache.org/jira/browse/ARROW-3240) - [GLib] Add build instructions using Meson
-* [ARROW-3242](https://issues.apache.org/jira/browse/ARROW-3242) - [C++] Use coarser-grained dispatch to SIMD hash functions
-* [ARROW-3249](https://issues.apache.org/jira/browse/ARROW-3249) - [Python] Run flake8 on integration\_test.py and crossbow.py
-* [ARROW-3250](https://issues.apache.org/jira/browse/ARROW-3250) - [C++] Create Buffer implementation that takes ownership for the memory from a std::string via std::move
-* [ARROW-3252](https://issues.apache.org/jira/browse/ARROW-3252) - [C++] Do not hard code the "v" part of versions in thirdparty toolchain
-* [ARROW-3257](https://issues.apache.org/jira/browse/ARROW-3257) - [C++] Stop using IMPORTED\_LINK\_INTERFACE\_LIBRARIES
-* [ARROW-3258](https://issues.apache.org/jira/browse/ARROW-3258) - [GLib] CI is failing on macOS
-* [ARROW-3259](https://issues.apache.org/jira/browse/ARROW-3259) - [GLib] Rename "writeable" to "writable"
-* [ARROW-3261](https://issues.apache.org/jira/browse/ARROW-3261) - [Python] Add "field" method to select fields from StructArray
-* [ARROW-3262](https://issues.apache.org/jira/browse/ARROW-3262) - [Python] Implement \_\_getitem\_\_ with integers on pyarrow.Column
-* [ARROW-3264](https://issues.apache.org/jira/browse/ARROW-3264) - [Java] checkstyle - fix whitespace
-* [ARROW-3267](https://issues.apache.org/jira/browse/ARROW-3267) - [Python] Create empty table from schema
-* [ARROW-3268](https://issues.apache.org/jira/browse/ARROW-3268) - [CI] Reduce conda times on AppVeyor
-* [ARROW-3269](https://issues.apache.org/jira/browse/ARROW-3269) - [Python] Fix warnings in unit test suite
-* [ARROW-3270](https://issues.apache.org/jira/browse/ARROW-3270) - [Release] Adjust release verification scripts to recent parquet migration
-* [ARROW-3274](https://issues.apache.org/jira/browse/ARROW-3274) - [Packaging] Missing glog dependency from conda-forge recipes
-* [ARROW-3276](https://issues.apache.org/jira/browse/ARROW-3276) - [Packaging] Add support for Parquet related Linux packages
-* [ARROW-3281](https://issues.apache.org/jira/browse/ARROW-3281) - [Java] Make sure that WritableByteChannel in WriteChannel writes out complete bytes
-* [ARROW-3282](https://issues.apache.org/jira/browse/ARROW-3282) - [R] initial R functionality
-* [ARROW-3284](https://issues.apache.org/jira/browse/ARROW-3284) - [R] Adding R Error in Status
-* [ARROW-3285](https://issues.apache.org/jira/browse/ARROW-3285) - [GLib] Add arrow\_cpp\_build\_type and arrow\_cpp\_build\_dir Meson options
-* [ARROW-3286](https://issues.apache.org/jira/browse/ARROW-3286) - [C++] ARROW\_EXPORT for RecordBatchBuilder is missing
-* [ARROW-3287](https://issues.apache.org/jira/browse/ARROW-3287) - [C++] "redeclared without dllimport attribute after being referenced with dll linkage" with MinGW
-* [ARROW-3288](https://issues.apache.org/jira/browse/ARROW-3288) - [GLib] Add new API index for 0.11.0
-* [ARROW-3300](https://issues.apache.org/jira/browse/ARROW-3300) - [Release] Update .deb package names in preparation
-* [ARROW-3301](https://issues.apache.org/jira/browse/ARROW-3301) - [Website] Update Jekyll and Bootstrap 4
-* [ARROW-3305](https://issues.apache.org/jira/browse/ARROW-3305) - [JS] Incorrect development documentation link in javascript readme
-* [ARROW-3309](https://issues.apache.org/jira/browse/ARROW-3309) - [JS] Missing links from DEVELOP.md
-* [ARROW-3313](https://issues.apache.org/jira/browse/ARROW-3313) - [R] Run clang-format, cpplint checks on R C++ code
-* [ARROW-3319](https://issues.apache.org/jira/browse/ARROW-3319) - [GLib] Expose AlignStream methods in InputStream, OutputStream classes
-* [ARROW-3320](https://issues.apache.org/jira/browse/ARROW-3320) - [C++] Improve float parsing performance
-* [ARROW-3321](https://issues.apache.org/jira/browse/ARROW-3321) - [C++] Improve integer parsing performance
-* [ARROW-3334](https://issues.apache.org/jira/browse/ARROW-3334) - [Python] Update conda packages to new numpy requirement
-* [ARROW-3335](https://issues.apache.org/jira/browse/ARROW-3335) - [Python] Add ccache to manylinux1 container
-* [ARROW-3339](https://issues.apache.org/jira/browse/ARROW-3339) - [R] Support for character vectors
-* [ARROW-3341](https://issues.apache.org/jira/browse/ARROW-3341) - [R] Support for logical vector
-* [ARROW-3349](https://issues.apache.org/jira/browse/ARROW-3349) - [C++] Use aligned API in MinGW
-* [ARROW-3350](https://issues.apache.org/jira/browse/ARROW-3350) - [Website] Fix powered by links
-* [ARROW-3352](https://issues.apache.org/jira/browse/ARROW-3352) - [Packaging] Fix recently failing wheel builds
-* [ARROW-3356](https://issues.apache.org/jira/browse/ARROW-3356) - [Python] Document parameters of Table.to\_pandas method
-* [ARROW-3357](https://issues.apache.org/jira/browse/ARROW-3357) - [Rust] Add a mutable buffer implementation
-* [ARROW-3360](https://issues.apache.org/jira/browse/ARROW-3360) - [GLib] Import Parquet bindings
-* [ARROW-3363](https://issues.apache.org/jira/browse/ARROW-3363) - [C++/Python] Add helper functions to detect scalar Python types
-* [ARROW-3371](https://issues.apache.org/jira/browse/ARROW-3371) - [Python] Remove check\_metadata argument for Field.equals docstring
-* [ARROW-3375](https://issues.apache.org/jira/browse/ARROW-3375) - [Rust] Remove memory\_pool.rs
-* [ARROW-3376](https://issues.apache.org/jira/browse/ARROW-3376) - [C++] Add double-conversion to cpp/thirdparty/download\_dependencies.sh
-* [ARROW-3377](https://issues.apache.org/jira/browse/ARROW-3377) - [Gandiva][C++] Remove If statement from bit map set function
-* [ARROW-3382](https://issues.apache.org/jira/browse/ARROW-3382) - [C++] Run Gandiva tests in Travis CI
-* [ARROW-3392](https://issues.apache.org/jira/browse/ARROW-3392) - [Python] Support filters in disjunctive normal form in ParquetDataset
-* [ARROW-3395](https://issues.apache.org/jira/browse/ARROW-3395) - [C++/Python] Add docker container for linting
-* [ARROW-3397](https://issues.apache.org/jira/browse/ARROW-3397) - [C++] Use relative CMake path for modules
-* [ARROW-3400](https://issues.apache.org/jira/browse/ARROW-3400) - [Packaging] Add support for Parquet GLib related Linux packages
-* [ARROW-3404](https://issues.apache.org/jira/browse/ARROW-3404) - [C++] Make CSV chunker faster
-* [ARROW-3411](https://issues.apache.org/jira/browse/ARROW-3411) - [Packaging] dev/release/01-perform.sh doesn't have executable bit
-* [ARROW-3412](https://issues.apache.org/jira/browse/ARROW-3412) - [Packaging] rat failure in dev/release/02-source.sh
-* [ARROW-3413](https://issues.apache.org/jira/browse/ARROW-3413) - [Packaging] dev/release/02-source.sh doesn't generate Parquet GLib document
-* [ARROW-3415](https://issues.apache.org/jira/browse/ARROW-3415) - [Packaging] dev/release/verify-release-candidate.sh fails in "conda activate arrow-test"
-* [ARROW-3416](https://issues.apache.org/jira/browse/ARROW-3416) - [Packaging] dev/release/02-source.sh must use SHA512 instead of SHA1
-* [ARROW-3417](https://issues.apache.org/jira/browse/ARROW-3417) - [Packaging] dev/release/verify-release-candidate.sh fails Parquet C++ test
-* [ARROW-3418](https://issues.apache.org/jira/browse/ARROW-3418) - [C++] Update Parquet snapshot version for release
-* [ARROW-3423](https://issues.apache.org/jira/browse/ARROW-3423) - [Packaging] Remove RC information from deb/rpm
-
-## Bug Fixes
-
-* [ARROW-1380](https://issues.apache.org/jira/browse/ARROW-1380) - [C++] Fix "still reachable" valgrind warnings when PLASMA\_VALGRIND=1
-* [ARROW-1661](https://issues.apache.org/jira/browse/ARROW-1661) - [Python] Python 3.7 support
-* [ARROW-1799](https://issues.apache.org/jira/browse/ARROW-1799) - [Plasma C++] Make unittest does not create plasma store executable
-* [ARROW-1996](https://issues.apache.org/jira/browse/ARROW-1996) - [Python] pyarrow.read\_serialized cannot read concatenated records
-* [ARROW-2027](https://issues.apache.org/jira/browse/ARROW-2027) - [C++] ipc::Message::SerializeTo does not pad the message body
-* [ARROW-2220](https://issues.apache.org/jira/browse/ARROW-2220) - Change default fix version in merge tool to be the next mainline release version
-* [ARROW-2310](https://issues.apache.org/jira/browse/ARROW-2310) - Source release scripts fail with Java8
-* [ARROW-2646](https://issues.apache.org/jira/browse/ARROW-2646) - [C++/Python] Pandas roundtrip for date objects
-* [ARROW-2776](https://issues.apache.org/jira/browse/ARROW-2776) - [C++] Do not pass -Wno-noexcept-type for compilers that do not support it
-* [ARROW-2782](https://issues.apache.org/jira/browse/ARROW-2782) - [Python] Ongoing Travis CI failures in Plasma unit tests
-* [ARROW-2814](https://issues.apache.org/jira/browse/ARROW-2814) - [Python] Unify PyObject\* sequence conversion paths for built-in sequences, NumPy arrays
-* [ARROW-2854](https://issues.apache.org/jira/browse/ARROW-2854) - [C++/Python] Casting float NaN to int should raise an error on safe cast
-* [ARROW-2925](https://issues.apache.org/jira/browse/ARROW-2925) - [JS] Documentation failing in docker container
-* [ARROW-2965](https://issues.apache.org/jira/browse/ARROW-2965) - [Python] Possible uint64 overflow issues in python\_to\_arrow.cc
-* [ARROW-2966](https://issues.apache.org/jira/browse/ARROW-2966) - [Python] Data type conversion error
-* [ARROW-2973](https://issues.apache.org/jira/browse/ARROW-2973) - [Python] pitrou/asv.git@customize\_commands does not work with the "new" way of activating conda
-* [ARROW-2974](https://issues.apache.org/jira/browse/ARROW-2974) - [Python] Replace usages of "source activate" with "conda activate" in CI scripts
-* [ARROW-2986](https://issues.apache.org/jira/browse/ARROW-2986) - [C++] /EHsc possibly needed for Visual Studio 2015 builds
-* [ARROW-2992](https://issues.apache.org/jira/browse/ARROW-2992) - [Python] Parquet benchmark failure
-* [ARROW-3006](https://issues.apache.org/jira/browse/ARROW-3006) - [GLib] .gir/.typelib for GPU aren't installed
-* [ARROW-3007](https://issues.apache.org/jira/browse/ARROW-3007) - [Packaging] libarrow-gpu10 deb for Ubuntu 18.04 has broken dependencies
-* [ARROW-3011](https://issues.apache.org/jira/browse/ARROW-3011) - [CI] Remove Slack notification
-* [ARROW-3012](https://issues.apache.org/jira/browse/ARROW-3012) - [Python] Installation crashes with setuptools\_scm error
-* [ARROW-3013](https://issues.apache.org/jira/browse/ARROW-3013) - [Website] Fix download links on website for tarballs, checksums
-* [ARROW-3015](https://issues.apache.org/jira/browse/ARROW-3015) - [Python] Fix documentation typo for pa.uint8
-* [ARROW-3047](https://issues.apache.org/jira/browse/ARROW-3047) - [C++] cmake downloads and builds ORC even though it's installed
-* [ARROW-3049](https://issues.apache.org/jira/browse/ARROW-3049) - [C++/Python] ORC reader fails on empty file
-* [ARROW-3053](https://issues.apache.org/jira/browse/ARROW-3053) - [Python] Pandas decimal conversion segfault
-* [ARROW-3056](https://issues.apache.org/jira/browse/ARROW-3056) - [Python] Indicate in NativeFile docstrings methods that are part of the RawIOBase API but not implemented
-* [ARROW-3061](https://issues.apache.org/jira/browse/ARROW-3061) - [Java] headroom does not take into account reservation
-* [ARROW-3065](https://issues.apache.org/jira/browse/ARROW-3065) - [Python] concat\_tables() failing from bad Pandas Metadata
-* [ARROW-3083](https://issues.apache.org/jira/browse/ARROW-3083) - [Python] Version in manylinux1 wheel builds is wrong
-* [ARROW-3093](https://issues.apache.org/jira/browse/ARROW-3093) - [C++] Linking errors with ORC enabled
-* [ARROW-3095](https://issues.apache.org/jira/browse/ARROW-3095) - [Python] test\_plasma.py fails
-* [ARROW-3098](https://issues.apache.org/jira/browse/ARROW-3098) - [Python] BufferReader doesn't adhere to the seek protocol
-* [ARROW-3100](https://issues.apache.org/jira/browse/ARROW-3100) - [CI] C/glib build broken on OS X
-* [ARROW-3125](https://issues.apache.org/jira/browse/ARROW-3125) - [Python] Update ASV instructions
-* [ARROW-3132](https://issues.apache.org/jira/browse/ARROW-3132) - Regenerate 0.10.0 changelog
-* [ARROW-3140](https://issues.apache.org/jira/browse/ARROW-3140) - [Plasma] Plasma fails building with GPU enabled
-* [ARROW-3141](https://issues.apache.org/jira/browse/ARROW-3141) - [Python] Tensorflow support in pyarrow wheels pins numpy>=1.14
-* [ARROW-3145](https://issues.apache.org/jira/browse/ARROW-3145) - [C++] Thrift compiler reruns in arrow/dbi/hiveserver2/thrift when using Ninja build
-* [ARROW-3173](https://issues.apache.org/jira/browse/ARROW-3173) - [Rust] dynamic\_types example does not run
-* [ARROW-3175](https://issues.apache.org/jira/browse/ARROW-3175) - [Java] Upgrade to official FlatBuffers release (Flatbuffers incompatibility)
-* [ARROW-3183](https://issues.apache.org/jira/browse/ARROW-3183) - [Python] get\_library\_dirs on Windows can give the wrong directory
-* [ARROW-3188](https://issues.apache.org/jira/browse/ARROW-3188) - [Python] Table.from\_arrays segfaults if lists and schema are passed
-* [ARROW-3190](https://issues.apache.org/jira/browse/ARROW-3190) - [C++] "WriteableFile" is misspelled, should be renamed "WritableFile" with deprecation for old name
-* [ARROW-3206](https://issues.apache.org/jira/browse/ARROW-3206) - [C++] Building with ARROW\_HIVESERVER2=ON with unit tests disabled causes error
-* [ARROW-3227](https://issues.apache.org/jira/browse/ARROW-3227) - [Python] NativeFile.write shouldn't accept unicode strings
-* [ARROW-3228](https://issues.apache.org/jira/browse/ARROW-3228) - [Python] Immutability of bytes is ignored
-* [ARROW-3231](https://issues.apache.org/jira/browse/ARROW-3231) - [Python] Sphinx's autodoc\_default\_flags is now deprecated
-* [ARROW-3237](https://issues.apache.org/jira/browse/ARROW-3237) - [CI] Update linux packaging filenames in rat exclusion list
-* [ARROW-3241](https://issues.apache.org/jira/browse/ARROW-3241) - [Plasma] test\_plasma\_list test failure on Ubuntu 14.04
-* [ARROW-3251](https://issues.apache.org/jira/browse/ARROW-3251) - [C++] Conversion warnings in cast.cc
-* [ARROW-3256](https://issues.apache.org/jira/browse/ARROW-3256) - [JS] File footer and message metadata is inconsistent
-* [ARROW-3279](https://issues.apache.org/jira/browse/ARROW-3279) - [C++] Allow linking Arrow tests dynamically on Windows
-* [ARROW-3299](https://issues.apache.org/jira/browse/ARROW-3299) - [C++] Appveyor builds failing
-* [ARROW-3322](https://issues.apache.org/jira/browse/ARROW-3322) - [CI] Rust job always runs on AppVeyor
-* [ARROW-3327](https://issues.apache.org/jira/browse/ARROW-3327) - [Python] manylinux container confusing
-* [ARROW-3338](https://issues.apache.org/jira/browse/ARROW-3338) - [Python] Crash when schema and columns do not match
-* [ARROW-3342](https://issues.apache.org/jira/browse/ARROW-3342) - Appveyor builds have stopped triggering on GitHub
-* [ARROW-3348](https://issues.apache.org/jira/browse/ARROW-3348) - Plasma store dies when an object that a dead client is waiting for gets created.
-* [ARROW-3354](https://issues.apache.org/jira/browse/ARROW-3354) - [Python] read\_record\_batch interfaces differ in pyarrow and pyarrow.cuda
-* [ARROW-3369](https://issues.apache.org/jira/browse/ARROW-3369) - [Packaging] Wheel builds are failing due to wheel 0.32 release
-* [ARROW-3370](https://issues.apache.org/jira/browse/ARROW-3370) - [Packaging] Centos 6 build is failing
-* [ARROW-3373](https://issues.apache.org/jira/browse/ARROW-3373) - Fix bug in which plasma store can die when client gets multiple objects and object becomes available.
-* [ARROW-3374](https://issues.apache.org/jira/browse/ARROW-3374) - [Python] Dictionary has out-of-bound index when creating DictionaryArray from Pandas with NaN
-* [ARROW-3393](https://issues.apache.org/jira/browse/ARROW-3393) - [C++] Fix compiler warning in util/task-group-cc on clang 6
-* [ARROW-3394](https://issues.apache.org/jira/browse/ARROW-3394) - [Java] Remove duplicate dependency entry in Flight
-* [ARROW-3403](https://issues.apache.org/jira/browse/ARROW-3403) - [Website] Source tarball link missing from install page
-* [ARROW-3420](https://issues.apache.org/jira/browse/ARROW-3420) - [C++] Fix outstanding include-what-you-use issues in src/arrow, src/parquet codebases
-
-[1]: https://www.apache.org/dyn/closer.cgi/arrow/arrow-0.11.0/
-[2]: https://www.apache.org/dyn/closer.cgi/arrow/arrow-0.11.0/binaries
-[3]: https://github.com/apache/arrow/releases/tag/apache-arrow-0.11.0
diff --git a/site/_release/0.11.1.md b/site/_release/0.11.1.md
deleted file mode 100644
index f58162d..0000000
--- a/site/_release/0.11.1.md
+++ /dev/null
@@ -1,52 +0,0 @@
----
-layout: default
-title: Apache Arrow 0.11.1 Release
-permalink: /release/0.11.1.html
----
-<!--
-{% comment %}
-Licensed to the Apache Software Foundation (ASF) under one or more
-contributor license agreements.  See the NOTICE file distributed with
-this work for additional information regarding copyright ownership.
-The ASF licenses this file to you under the Apache License, Version 2.0
-(the "License"); you may not use this file except in compliance with
-the License.  You may obtain a copy of the License at
-
-http://www.apache.org/licenses/LICENSE-2.0
-
-Unless required by applicable law or agreed to in writing, software
-distributed under the License is distributed on an "AS IS" BASIS,
-WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
-See the License for the specific language governing permissions and
-limitations under the License.
-{% endcomment %}
--->
-
-# Apache Arrow 0.11.1 (19 October 2018)
-
-This is a bugfix release to address a Python packaging issue with zlib that
-resulted in bug [ARROW-3514][4].
-
-## Download
-
-* [**Source Artifacts**][1]
-* [**Binary Artifacts**][2]
-* [Git tag][3]
-
-# Changelog
-
-## New Features and Improvements
-
-* [ARROW-3353](https://issues.apache.org/jira/browse/ARROW-3353) - [Packaging] Build python 3.7 wheels
-* [ARROW-3534](https://issues.apache.org/jira/browse/ARROW-3534) - [Python] Update zlib library in manylinux1 image
-* [ARROW-3546](https://issues.apache.org/jira/browse/ARROW-3546) - [Python] Provide testing setup to verify wheel binaries work in one or more common Linux distributions
-* [ARROW-3565](https://issues.apache.org/jira/browse/ARROW-3565) - [Python] Pin tensorflow to 1.11.0 in manylinux1 container
-
-## Bug Fixes
-
-* [ARROW-3514](https://issues.apache.org/jira/browse/ARROW-3514) - [Python] zlib deflate exception when writing Parquet file
-
-[1]: https://www.apache.org/dyn/closer.cgi/arrow/arrow-0.11.1/
-[2]: https://www.apache.org/dyn/closer.cgi/arrow/arrow-0.11.1/binaries
-[3]: https://github.com/apache/arrow/releases/tag/apache-arrow-0.11.1
-[4]: https://issues.apache.org/jira/browse/ARROW-3514
\ No newline at end of file
diff --git a/site/_release/0.12.0.md b/site/_release/0.12.0.md
deleted file mode 100644
index 5a1306b..0000000
--- a/site/_release/0.12.0.md
+++ /dev/null
@@ -1,736 +0,0 @@
----
-layout: default
-title: Apache Arrow 0.12.0 Release
-permalink: /release/0.12.0.html
----
-<!--
-{% comment %}
-Licensed to the Apache Software Foundation (ASF) under one or more
-contributor license agreements.  See the NOTICE file distributed with
-this work for additional information regarding copyright ownership.
-The ASF licenses this file to you under the Apache License, Version 2.0
-(the "License"); you may not use this file except in compliance with
-the License.  You may obtain a copy of the License at
-
-http://www.apache.org/licenses/LICENSE-2.0
-
-Unless required by applicable law or agreed to in writing, software
-distributed under the License is distributed on an "AS IS" BASIS,
-WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
-See the License for the specific language governing permissions and
-limitations under the License.
-{% endcomment %}
--->
-
-# Apache Arrow 0.12.0 (20 January 2019)
-
-This is a major release covering more than 3 months of development.
-
-## Download
-
-* [**Source Artifacts**][1]
-* [**Binary Artifacts**][2]
-* [Git tag 8ca41384b5324bfd0ef3d3ed3f728e1d10ed73f0][3]
-
-## Contributors
-
-This release includes 601 commits from 77 distinct contributors.
-
-```shell
-$ git shortlog -sn apache-arrow-0.11.0..apache-arrow-0.12.0
-    94  Kouhei Sutou
-    76  Wes McKinney
-    61  Antoine Pitrou
-    60  Krisztián Szűcs
-    25  Pindikura Ravindra
-    25  Yosuke Shiro
-    21  Romain Francois
-    17  Chao Sun
-    16  Andy Grove
-    15  Korn, Uwe
-    15  Sebastien Binet
-    13  Paddy Horan
-    13  Philipp Moritz
-    10  François Saint-Jacques
-     9  Praveen
-     8  Robert Nishihara
-     7  Uwe L. Korn
-     6  Brian Hulette
-     6  Bryan Cutler
-     6  praveenbingo
-     6  suquark
-     5  Javier Luraschi
-     5  Kousuke Saruta
-     4  Tanya Schlusser
-     3  Benjamin Kietzman
-     3  Dimitri Vorona
-     3  James Lamb
-     3  Kenta Murata
-     3  Neville Dipale
-     3  Yuhong Guo
-     3  alexandreyc
-     2  Animesh Trivedi
-     2  Deepak Majeti
-     2  Dmitry Kalinkin
-     2  Hatem Helal
-     2  Kengo Seki
-     2  Micah Kornfield
-     2  Pearu Peterson
-     2  Yasuo Honda
-     2  jlapacik
-     2  ptaylor
-     2  shyam
-     1  Antonio Cavallo
-     1  Binyang2014
-     1  Bradley Skaggs
-     1  Brent Kerby
-     1  Bruno P. Kinoshita
-     1  Christopher Hutchinson
-     1  Dmitry Vukolov
-     1  Dominik Moritz
-     1  Dustin Long
-     1  Jacques Nadeau
-     1  Jeffrey Wong
-     1  Joris Van den Bossche
-     1  Justin Dunham
-     1  Li Jin
-     1  Renjie Liu
-     1  ShubhamChaurasia
-     1  Siddharth Dave
-     1  Suvayu Ali
-     1  Sweeney, Mack
-     1  Sylvain Joubert
-     1  Tim Paine
-     1  Todd Mostak
-     1  Vivekanand Vellanki
-     1  Yevgeni Litvin
-     1  Yuqi Gu
-     1  Zhenyuan Zhao
-     1  andrew
-     1  c-bata
-     1  cav71
-     1  kabukawa
-     1  m.wang
-     1  minmin.fmm
-     1  philjdf
-     1  sarutak
-     1  siddharth
-     1  zeroshade
-```
-
-# Patch Committers
-
-The following Apache committers merged contributed patches to the repository.
-
-```
-$ git shortlog -csn apache-arrow-0.11.0..apache-arrow-0.12.0
-   297  Wes McKinney
-    79  Kouhei Sutou
-    76  Krisztián Szűcs
-    35  Antoine Pitrou
-    25  Korn, Uwe
-    25  Yosuke Shiro
-    16  Philipp Moritz
-    15  Andy Grove
-    14  Uwe L. Korn
-     6  Robert Nishihara
-     4  Brian Hulette
-     4  Sidd
-     3  Bryan Cutler
-     3  GitHub
-```
-
-# Changelog
-
-## New Features and Improvements
-
-* [ARROW-1019](https://issues.apache.org/jira/browse/ARROW-1019) - [C++] Implement input stream and output stream with Gzip codec
-* [ARROW-1423](https://issues.apache.org/jira/browse/ARROW-1423) - [C++] Create non-owned CudaContext from context handle provided by thirdparty user
-* [ARROW-1492](https://issues.apache.org/jira/browse/ARROW-1492) - [C++] Type casting function kernel suite
-* [ARROW-1688](https://issues.apache.org/jira/browse/ARROW-1688) - [Java] Fail build on checkstyle warnings
-* [ARROW-1696](https://issues.apache.org/jira/browse/ARROW-1696) - [C++] Add codec benchmarks
-* [ARROW-1993](https://issues.apache.org/jira/browse/ARROW-1993) - [Python] Add function for determining implied Arrow schema from pandas.DataFrame
-* [ARROW-2211](https://issues.apache.org/jira/browse/ARROW-2211) - [C++] Use simpler hash functions for integers
-* [ARROW-2216](https://issues.apache.org/jira/browse/ARROW-2216) - [CI] CI descriptions and envars are misleading
-* [ARROW-2337](https://issues.apache.org/jira/browse/ARROW-2337) - [Scripts] Windows release verification script should use boost DSOs instead of static linkage
-* [ARROW-2475](https://issues.apache.org/jira/browse/ARROW-2475) - [Format] Confusing array length description
-* [ARROW-2483](https://issues.apache.org/jira/browse/ARROW-2483) - [Rust] use bit-packing for boolean vectors
-* [ARROW-2504](https://issues.apache.org/jira/browse/ARROW-2504) - [Website] Add ApacheCon NA link
-* [ARROW-2535](https://issues.apache.org/jira/browse/ARROW-2535) - [Python] Provide pre-commit hooks that check flake8
-* [ARROW-2560](https://issues.apache.org/jira/browse/ARROW-2560) - [Rust] The Rust README should include Rust-specific information on contributing
-* [ARROW-2624](https://issues.apache.org/jira/browse/ARROW-2624) - [Python] Random schema and data generator for Arrow conversion and Parquet testing
-* [ARROW-2637](https://issues.apache.org/jira/browse/ARROW-2637) - [C++/Python] Build support and instructions for development on Alpine Linux
-* [ARROW-2653](https://issues.apache.org/jira/browse/ARROW-2653) - [C++] Refactor hash table support
-* [ARROW-2670](https://issues.apache.org/jira/browse/ARROW-2670) - [C++/Python] Add Ubuntu 18.04 / gcc7 as a nightly build
-* [ARROW-2673](https://issues.apache.org/jira/browse/ARROW-2673) - [Python] Add documentation + docstring for ARROW-2661
-* [ARROW-2684](https://issues.apache.org/jira/browse/ARROW-2684) - [Python] Various documentation improvements
-* [ARROW-2712](https://issues.apache.org/jira/browse/ARROW-2712) - [C#] Initial C# .NET library
-* [ARROW-2720](https://issues.apache.org/jira/browse/ARROW-2720) - [C++] Clean up cmake CXX\_STANDARD and PIC flag setting
-* [ARROW-2759](https://issues.apache.org/jira/browse/ARROW-2759) - Export notification socket of Plasma
-* [ARROW-2803](https://issues.apache.org/jira/browse/ARROW-2803) - [C++] Put hashing function into src/arrow/util
-* [ARROW-2807](https://issues.apache.org/jira/browse/ARROW-2807) - [Python] Enable memory-mapping to be toggled in get\_reader when reading Parquet files
-* [ARROW-2808](https://issues.apache.org/jira/browse/ARROW-2808) - [Python] Add unit tests for ProxyMemoryPool, enable new default MemoryPool to be constructed
-* [ARROW-2919](https://issues.apache.org/jira/browse/ARROW-2919) - [C++] Improve error message when listing empty HDFS file
-* [ARROW-2968](https://issues.apache.org/jira/browse/ARROW-2968) - [R] Multi-threaded conversion from Arrow table to R data.frame
-* [ARROW-3020](https://issues.apache.org/jira/browse/ARROW-3020) - [Python] Addition of option to allow empty Parquet row groups
-* [ARROW-3038](https://issues.apache.org/jira/browse/ARROW-3038) - [Go] add support for StringArray
-* [ARROW-3063](https://issues.apache.org/jira/browse/ARROW-3063) - [Go] move list of supported/TODO features to confluence
-* [ARROW-3070](https://issues.apache.org/jira/browse/ARROW-3070) - [Release] Host binary artifacts for RCs and releases on ASF Bintray account instead of dist/mirror system
-* [ARROW-3108](https://issues.apache.org/jira/browse/ARROW-3108) - [C++] arrow::PrettyPrint for Table instances
-* [ARROW-3126](https://issues.apache.org/jira/browse/ARROW-3126) - [Python] Make Buffered\* IO classes available to Python, incorporate into input\_stream, output\_stream factory functions
-* [ARROW-3131](https://issues.apache.org/jira/browse/ARROW-3131) - [Go] add test for Go-1.11
-* [ARROW-3161](https://issues.apache.org/jira/browse/ARROW-3161) - [Packaging] Ensure to run pyarrow unit tests in conda and wheel builds
-* [ARROW-3169](https://issues.apache.org/jira/browse/ARROW-3169) - [C++] Break array-test.cc and array.cc into multiple compilation units
-* [ARROW-3184](https://issues.apache.org/jira/browse/ARROW-3184) - [C++] Add modular build targets, "all" target, and require explicit target when invoking make or ninja
-* [ARROW-3194](https://issues.apache.org/jira/browse/ARROW-3194) - [Java] Fix setValueCount in splitAndTransfer for variable width vectors
-* [ARROW-3199](https://issues.apache.org/jira/browse/ARROW-3199) - [Plasma] Check for EAGAIN in recvmsg and sendmsg
-* [ARROW-3209](https://issues.apache.org/jira/browse/ARROW-3209) - [C++] Rename libarrow\_gpu to libarrow\_cuda
-* [ARROW-3230](https://issues.apache.org/jira/browse/ARROW-3230) - [Python] Missing comparisons on ChunkedArray, Table
-* [ARROW-3233](https://issues.apache.org/jira/browse/ARROW-3233) - [Python] Sphinx documentation for pyarrow.cuda GPU support
-* [ARROW-3248](https://issues.apache.org/jira/browse/ARROW-3248) - [C++] Arrow tests should have label "arrow"
-* [ARROW-3260](https://issues.apache.org/jira/browse/ARROW-3260) - [CI] Make linting a separate job
-* [ARROW-3272](https://issues.apache.org/jira/browse/ARROW-3272) - [Java] Document checkstyle deviations from Google style guide
-* [ARROW-3273](https://issues.apache.org/jira/browse/ARROW-3273) - [Java] checkstyle - fix javadoc style
-* [ARROW-3278](https://issues.apache.org/jira/browse/ARROW-3278) - [Python] Retrieve StructType's and StructArray's field by name
-* [ARROW-3291](https://issues.apache.org/jira/browse/ARROW-3291) - [C++] Convenience API for constructing arrow::io::BufferReader from std::string
-* [ARROW-3303](https://issues.apache.org/jira/browse/ARROW-3303) - [C++] Enable example arrays to be written with a simplified JSON representation
-* [ARROW-3306](https://issues.apache.org/jira/browse/ARROW-3306) - [R] Objects and support functions for different kinds of arrow::Buffer
-* [ARROW-3307](https://issues.apache.org/jira/browse/ARROW-3307) - [R] Convert chunked arrow::Column to R vector
-* [ARROW-3310](https://issues.apache.org/jira/browse/ARROW-3310) - [R] Create wrapper classes for various Arrow IO interfaces
-* [ARROW-3312](https://issues.apache.org/jira/browse/ARROW-3312) - [R] Use same .clang-format file for both R binding C++ code and main C++ codebase
-* [ARROW-3318](https://issues.apache.org/jira/browse/ARROW-3318) - [C++] Convenience method for reading all batches from an IPC stream or file as arrow::Table
-* [ARROW-3323](https://issues.apache.org/jira/browse/ARROW-3323) - [Java] checkstyle - fix naming
-* [ARROW-3331](https://issues.apache.org/jira/browse/ARROW-3331) - [C++] Add re2 to ThirdpartyToolchain
-* [ARROW-3340](https://issues.apache.org/jira/browse/ARROW-3340) - [R] support for dates and time classes
-* [ARROW-3347](https://issues.apache.org/jira/browse/ARROW-3347) - [Rust] Implement PrimitiveArrayBuilder
-* [ARROW-3353](https://issues.apache.org/jira/browse/ARROW-3353) - [Packaging] Build python 3.7 wheels
-* [ARROW-3355](https://issues.apache.org/jira/browse/ARROW-3355) - [R] Support for factors
-* [ARROW-3358](https://issues.apache.org/jira/browse/ARROW-3358) - [Gandiva][C++] Replace usages of gandiva/status.h with arrow/status.h
-* [ARROW-3362](https://issues.apache.org/jira/browse/ARROW-3362) - [R] Guard against null buffers
-* [ARROW-3366](https://issues.apache.org/jira/browse/ARROW-3366) - [R] Dockerfile for docker-compose setup
-* [ARROW-3368](https://issues.apache.org/jira/browse/ARROW-3368) - [Integration/CI/Python] Add dask integration test to docker-compose setup
-* [ARROW-3380](https://issues.apache.org/jira/browse/ARROW-3380) - [Python] Support reading CSV files and more from a gzipped file
-* [ARROW-3381](https://issues.apache.org/jira/browse/ARROW-3381) - [C++] Implement InputStream for bz2 files
-* [ARROW-3383](https://issues.apache.org/jira/browse/ARROW-3383) - [Java] Run Gandiva tests in Travis CI
-* [ARROW-3384](https://issues.apache.org/jira/browse/ARROW-3384) - [Gandiva] Sync remaining commits from gandiva repo
-* [ARROW-3385](https://issues.apache.org/jira/browse/ARROW-3385) - [Java] [Gandiva] Deploy gandiva snapshot jars automatically
-* [ARROW-3387](https://issues.apache.org/jira/browse/ARROW-3387) - [C++] Function to cast binary to string/utf8 with UTF8 validation
-* [ARROW-3398](https://issues.apache.org/jira/browse/ARROW-3398) - [Rust] Update existing Builder to use MutableBuffer internally
-* [ARROW-3402](https://issues.apache.org/jira/browse/ARROW-3402) - [Gandiva][C++] Utilize common bitmap operation implementations in precompiled IR routines
-* [ARROW-3407](https://issues.apache.org/jira/browse/ARROW-3407) - [C++] Add UTF8 conversion modes in CSV reader conversion options
-* [ARROW-3409](https://issues.apache.org/jira/browse/ARROW-3409) - [C++] Add streaming compression interfaces
-* [ARROW-3421](https://issues.apache.org/jira/browse/ARROW-3421) - [C++] Add include-what-you-use setup to primary docker-compose.yml
-* [ARROW-3427](https://issues.apache.org/jira/browse/ARROW-3427) - [C++] Add Windows support, Unix static libs for double-conversion package in conda-forge
-* [ARROW-3429](https://issues.apache.org/jira/browse/ARROW-3429) - [Packaging] Add a script to release binaries that use source archive at dist.apache.org
-* [ARROW-3430](https://issues.apache.org/jira/browse/ARROW-3430) - [Packaging] Add workaround to verify 0.11.0
-* [ARROW-3431](https://issues.apache.org/jira/browse/ARROW-3431) - [GLib] Include Gemfile to archive
-* [ARROW-3432](https://issues.apache.org/jira/browse/ARROW-3432) - [Packaging] Variables aren't expanded Subversion commit message
-* [ARROW-3439](https://issues.apache.org/jira/browse/ARROW-3439) - [R] R language bindings for Feather format
-* [ARROW-3440](https://issues.apache.org/jira/browse/ARROW-3440) - [Gandiva][C++] Remove outdated cpp/src/gandiva/README.md, add build documentation to cpp/README.md
-* [ARROW-3441](https://issues.apache.org/jira/browse/ARROW-3441) - [Gandiva][C++] Produce fewer test executables
-* [ARROW-3442](https://issues.apache.org/jira/browse/ARROW-3442) - [C++] Use dynamic linking for unit tests, ensure coverage working properly with clang
-* [ARROW-3450](https://issues.apache.org/jira/browse/ARROW-3450) - [R] Wrap MemoryMappedFile class
-* [ARROW-3451](https://issues.apache.org/jira/browse/ARROW-3451) - [Python] Allocate CUDA memory from a CUcontext created by numba.cuda
-* [ARROW-3455](https://issues.apache.org/jira/browse/ARROW-3455) - [Gandiva][C++] Support pkg-config for Gandiva
-* [ARROW-3456](https://issues.apache.org/jira/browse/ARROW-3456) - [CI] Reuse docker images and optimize docker-compose containers
-* [ARROW-3460](https://issues.apache.org/jira/browse/ARROW-3460) - [Packaging] Add a script to rebase master on local release branch
-* [ARROW-3461](https://issues.apache.org/jira/browse/ARROW-3461) - [Packaging] Add a script to upload RC artifacts as the official release
-* [ARROW-3462](https://issues.apache.org/jira/browse/ARROW-3462) - [Packaging] Update CHANGELOG for 0.11.0
-* [ARROW-3463](https://issues.apache.org/jira/browse/ARROW-3463) - [Website] Update for 0.11.0
-* [ARROW-3465](https://issues.apache.org/jira/browse/ARROW-3465) - [Documentation] Fix gen\_apidocs' docker image
-* [ARROW-3469](https://issues.apache.org/jira/browse/ARROW-3469) - [Gandiva] add travis entry for gandiva on OSX
-* [ARROW-3472](https://issues.apache.org/jira/browse/ARROW-3472) - [Gandiva] remove gandiva helpers library
-* [ARROW-3473](https://issues.apache.org/jira/browse/ARROW-3473) - [Format] Update Layout.md document to clarify use of 64-bit array lengths
-* [ARROW-3474](https://issues.apache.org/jira/browse/ARROW-3474) - [GLib] Extend gparquet API with get\_schema and read\_column
-* [ARROW-3479](https://issues.apache.org/jira/browse/ARROW-3479) - [R] Support to write record\_batch as stream
-* [ARROW-3482](https://issues.apache.org/jira/browse/ARROW-3482) - [C++] Build with JEMALLOC by default
-* [ARROW-3487](https://issues.apache.org/jira/browse/ARROW-3487) - [Gandiva] simplify NULL\_IF\_NULL functions that can return errors
-* [ARROW-3488](https://issues.apache.org/jira/browse/ARROW-3488) - [Packaging] Separate crossbow task definition files for packaging and tests
-* [ARROW-3489](https://issues.apache.org/jira/browse/ARROW-3489) - [Gandiva] Support for in expressions
-* [ARROW-3490](https://issues.apache.org/jira/browse/ARROW-3490) - [R] streaming arrow objects to output streams
-* [ARROW-3492](https://issues.apache.org/jira/browse/ARROW-3492) - [C++] Build jemalloc in parallel
-* [ARROW-3493](https://issues.apache.org/jira/browse/ARROW-3493) - [Java] Document BOUNDS\_CHECKING\_ENABLED
-* [ARROW-3499](https://issues.apache.org/jira/browse/ARROW-3499) - [R] Expose arrow::ipc::Message type
-* [ARROW-3501](https://issues.apache.org/jira/browse/ARROW-3501) - [Gandiva] Enable building with gcc 4.8.x on Ubuntu Trusty, similar distros
-* [ARROW-3504](https://issues.apache.org/jira/browse/ARROW-3504) - [Plasma] Add support for Plasma Client to put/get raw bytes without pyarrow serialization.
-* [ARROW-3505](https://issues.apache.org/jira/browse/ARROW-3505) - [R] Read record batch and table
-* [ARROW-3506](https://issues.apache.org/jira/browse/ARROW-3506) - [Packaging] Nightly tests for docker-compose images
-* [ARROW-3515](https://issues.apache.org/jira/browse/ARROW-3515) - Introduce NumericTensor class
-* [ARROW-3518](https://issues.apache.org/jira/browse/ARROW-3518) - [C++] Detect HOMEBREW\_PREFIX automatically
-* [ARROW-3519](https://issues.apache.org/jira/browse/ARROW-3519) - [Gandiva] Add support for functions that can return variable len output
-* [ARROW-3521](https://issues.apache.org/jira/browse/ARROW-3521) - [GLib] Run Python using find\_program in meson.build
-* [ARROW-3529](https://issues.apache.org/jira/browse/ARROW-3529) - [Ruby] Import Red Parquet
-* [ARROW-3530](https://issues.apache.org/jira/browse/ARROW-3530) - [Java/Python] Add conversion for pyarrow.Schema from org.apache…pojo.Schema
-* [ARROW-3533](https://issues.apache.org/jira/browse/ARROW-3533) - [Python/Documentation] Use sphinx\_rtd\_theme instead of Bootstrap
-* [ARROW-3536](https://issues.apache.org/jira/browse/ARROW-3536) - [C++] Fast UTF8 validation functions
-* [ARROW-3537](https://issues.apache.org/jira/browse/ARROW-3537) - [Rust] Implement Tensor Type
-* [ARROW-3539](https://issues.apache.org/jira/browse/ARROW-3539) - [CI/Packaging] Update scripts to build against vendored jemalloc
-* [ARROW-3540](https://issues.apache.org/jira/browse/ARROW-3540) - [Rust] Incorporate BooleanArray into PrimitiveArray
-* [ARROW-3542](https://issues.apache.org/jira/browse/ARROW-3542) - [C++] Use unsafe appends when building array from CSV
-* [ARROW-3545](https://issues.apache.org/jira/browse/ARROW-3545) - [C++/Python] Normalize child/field terminology with StructType
-* [ARROW-3547](https://issues.apache.org/jira/browse/ARROW-3547) - [R] Protect against Null crash when reading from RecordBatch
-* [ARROW-3548](https://issues.apache.org/jira/browse/ARROW-3548) - Speed up storing small objects in the object store.
-* [ARROW-3551](https://issues.apache.org/jira/browse/ARROW-3551) - Change MapD to OmniSci on Powered By page
-* [ARROW-3555](https://issues.apache.org/jira/browse/ARROW-3555) - [Plasma] Unify plasma client get function using metadata.
-* [ARROW-3556](https://issues.apache.org/jira/browse/ARROW-3556) - [CI] Disable optimizations on Windows
-* [ARROW-3557](https://issues.apache.org/jira/browse/ARROW-3557) - [Python] Set language\_level in Cython sources
-* [ARROW-3558](https://issues.apache.org/jira/browse/ARROW-3558) - [Plasma] Remove fatal error when plasma client calls get on an unsealed object that it created.
-* [ARROW-3559](https://issues.apache.org/jira/browse/ARROW-3559) - Statically link libraries for plasma\_store\_server executable.
-* [ARROW-3562](https://issues.apache.org/jira/browse/ARROW-3562) - [R] Disallow creation of objects with null shared\_ptr<T>
-* [ARROW-3563](https://issues.apache.org/jira/browse/ARROW-3563) - [C++] Declare public link dependencies so arrow\_static, plasma\_static automatically pull in transitive dependencies
-* [ARROW-3566](https://issues.apache.org/jira/browse/ARROW-3566) - Clarify that the type of dictionary encoded field should be the encoded(index) type
-* [ARROW-3567](https://issues.apache.org/jira/browse/ARROW-3567) - [Gandiva] [GLib] Add GLib bindings of Gandiva
-* [ARROW-3568](https://issues.apache.org/jira/browse/ARROW-3568) - [Packaging] Run pyarrow unittests for windows wheels
-* [ARROW-3569](https://issues.apache.org/jira/browse/ARROW-3569) - [Packaging] Run pyarrow unittests when building conda package
-* [ARROW-3574](https://issues.apache.org/jira/browse/ARROW-3574) - Fix remaining bug with plasma static versus shared libraries.
-* [ARROW-3576](https://issues.apache.org/jira/browse/ARROW-3576) - [Python] Expose compressed file readers as NativeFile
-* [ARROW-3577](https://issues.apache.org/jira/browse/ARROW-3577) - [Go] add support for ChunkedArray
-* [ARROW-3581](https://issues.apache.org/jira/browse/ARROW-3581) - [Gandiva][C++] ARROW\_PROTOBUF\_USE\_SHARED isn't used
-* [ARROW-3582](https://issues.apache.org/jira/browse/ARROW-3582) - [CI] Gandiva C++ build is always triggered
-* [ARROW-3583](https://issues.apache.org/jira/browse/ARROW-3583) - [Python/Java] Create RecordBatch from VectorSchemaRoot
-* [ARROW-3584](https://issues.apache.org/jira/browse/ARROW-3584) - [Go] add support for Table
-* [ARROW-3587](https://issues.apache.org/jira/browse/ARROW-3587) - [Python] Efficient serialization for Arrow Objects (array, table, tensor, etc)
-* [ARROW-3588](https://issues.apache.org/jira/browse/ARROW-3588) - [Java] checkstyle - fix license
-* [ARROW-3589](https://issues.apache.org/jira/browse/ARROW-3589) - [Gandiva] Make it possible to compile gandiva without JNI
-* [ARROW-3591](https://issues.apache.org/jira/browse/ARROW-3591) - [R] Support to collect decimal type
-* [ARROW-3592](https://issues.apache.org/jira/browse/ARROW-3592) - [Python] Get BinaryArray value as zero copy memory view
-* [ARROW-3597](https://issues.apache.org/jira/browse/ARROW-3597) - [Gandiva] gandiva should integrate with ADD\_ARROW\_TEST for tests
-* [ARROW-3600](https://issues.apache.org/jira/browse/ARROW-3600) - [Packaging] Support Ubuntu 18.10
-* [ARROW-3601](https://issues.apache.org/jira/browse/ARROW-3601) - [Rust] Release 0.11.0
-* [ARROW-3602](https://issues.apache.org/jira/browse/ARROW-3602) - [Gandiva] [Python] Add preliminary Cython bindings for Gandiva
-* [ARROW-3603](https://issues.apache.org/jira/browse/ARROW-3603) - [Gandiva][C++] Can't build with vendored Boost
-* [ARROW-3605](https://issues.apache.org/jira/browse/ARROW-3605) - Remove AE library from plasma header files.
-* [ARROW-3607](https://issues.apache.org/jira/browse/ARROW-3607) - [Java] delete() method via JNI for plasma
-* [ARROW-3608](https://issues.apache.org/jira/browse/ARROW-3608) - [R] Support for time32 and time64 array types
-* [ARROW-3609](https://issues.apache.org/jira/browse/ARROW-3609) - [Gandiva] Move benchmark tests out of unit test
-* [ARROW-3610](https://issues.apache.org/jira/browse/ARROW-3610) - [C++] Add interface to turn stl\_allocator into arrow::MemoryPool
-* [ARROW-3611](https://issues.apache.org/jira/browse/ARROW-3611) - Give error more quickly when pyarrow serialization context is used incorrectly.
-* [ARROW-3612](https://issues.apache.org/jira/browse/ARROW-3612) - [Go] implement RecordBatch and RecordBatchReader
-* [ARROW-3615](https://issues.apache.org/jira/browse/ARROW-3615) - [R] Support for NaN
-* [ARROW-3616](https://issues.apache.org/jira/browse/ARROW-3616) - [Java] checkstyle - fix remaining coding checks
-* [ARROW-3618](https://issues.apache.org/jira/browse/ARROW-3618) - [Packaging/Documentation] Add \`-c conda-forge\` option to avoid PackagesNotFoundError
-* [ARROW-3620](https://issues.apache.org/jira/browse/ARROW-3620) - [Python] Document multithreading options in Sphinx and add to api.rst
-* [ARROW-3621](https://issues.apache.org/jira/browse/ARROW-3621) - [Go] implement TableBatchReader
-* [ARROW-3622](https://issues.apache.org/jira/browse/ARROW-3622) - [Go] implement Schema.Equal
-* [ARROW-3623](https://issues.apache.org/jira/browse/ARROW-3623) - [Go] implement Field.Equal
-* [ARROW-3624](https://issues.apache.org/jira/browse/ARROW-3624) - [Python/C++] Support for zero-sized device buffers
-* [ARROW-3626](https://issues.apache.org/jira/browse/ARROW-3626) - [Go] add a CSV TableReader
-* [ARROW-3629](https://issues.apache.org/jira/browse/ARROW-3629) - [Python] Add write\_to\_dataset to Python Sphinx API listing
-* [ARROW-3630](https://issues.apache.org/jira/browse/ARROW-3630) - [Plasma] [GLib] Add GLib bindings of Plasma
-* [ARROW-3632](https://issues.apache.org/jira/browse/ARROW-3632) - [Packaging] Update deb names in dev/tasks/tasks.yml in dev/release/00-prepare.sh
-* [ARROW-3633](https://issues.apache.org/jira/browse/ARROW-3633) - [Packaging] Update deb names in dev/tasks/tasks.yml for 0.12.0
-* [ARROW-3636](https://issues.apache.org/jira/browse/ARROW-3636) - [C++/Python] Update arrow/python/pyarrow\_api.h
-* [ARROW-3638](https://issues.apache.org/jira/browse/ARROW-3638) - [C++][Python] Move reading from Feather as Table feature to C++ from Python
-* [ARROW-3639](https://issues.apache.org/jira/browse/ARROW-3639) - [Packaging] Run gandiva nightly packaging tasks
-* [ARROW-3640](https://issues.apache.org/jira/browse/ARROW-3640) - [Go] add support for Tensors
-* [ARROW-3641](https://issues.apache.org/jira/browse/ARROW-3641) - [C++/Python] remove public keyword from Cython api functions
-* [ARROW-3642](https://issues.apache.org/jira/browse/ARROW-3642) - [C++] Add arrowConfig.cmake generation
-* [ARROW-3645](https://issues.apache.org/jira/browse/ARROW-3645) - [Python] Document compression support in Sphinx
-* [ARROW-3646](https://issues.apache.org/jira/browse/ARROW-3646) - [Python] Add convenience factories to create IO streams
-* [ARROW-3647](https://issues.apache.org/jira/browse/ARROW-3647) - [R] Crash after unloading bit64 package
-* [ARROW-3648](https://issues.apache.org/jira/browse/ARROW-3648) - [Plasma] Add API to get metadata and data at the same time
-* [ARROW-3649](https://issues.apache.org/jira/browse/ARROW-3649) - [Rust] Refactor MutableBuffer's resize
-* [ARROW-3656](https://issues.apache.org/jira/browse/ARROW-3656) - [C++] Allow whitespace in numeric CSV fields
-* [ARROW-3657](https://issues.apache.org/jira/browse/ARROW-3657) - [R] Require bit64 package
-* [ARROW-3659](https://issues.apache.org/jira/browse/ARROW-3659) - [C++] Clang Travis build (matrix entry 2) might not actually be using clang
-* [ARROW-3660](https://issues.apache.org/jira/browse/ARROW-3660) - [C++] Don't unnecessarily lock MemoryMappedFile for resizing in readonly files
-* [ARROW-3661](https://issues.apache.org/jira/browse/ARROW-3661) - [Gandiva][GLib] Improve constant name
-* [ARROW-3662](https://issues.apache.org/jira/browse/ARROW-3662) - [C++] Add a const overload to MemoryMappedFile::GetSize
-* [ARROW-3664](https://issues.apache.org/jira/browse/ARROW-3664) - [Rust] Add benchmark for PrimitiveArrayBuilder
-* [ARROW-3665](https://issues.apache.org/jira/browse/ARROW-3665) - [Rust] Implement StructArrayBuilder
-* [ARROW-3666](https://issues.apache.org/jira/browse/ARROW-3666) - [C++] Improve CSV parser performance
-* [ARROW-3672](https://issues.apache.org/jira/browse/ARROW-3672) - [Go] implement Time32 array
-* [ARROW-3673](https://issues.apache.org/jira/browse/ARROW-3673) - [Go] implement Time64 array
-* [ARROW-3674](https://issues.apache.org/jira/browse/ARROW-3674) - [Go] implement Date32 array
-* [ARROW-3675](https://issues.apache.org/jira/browse/ARROW-3675) - [Go] implement Date64 array
-* [ARROW-3677](https://issues.apache.org/jira/browse/ARROW-3677) - [Go] implement FixedSizedBinary array
-* [ARROW-3681](https://issues.apache.org/jira/browse/ARROW-3681) - [Go] add benchmarks for CSV reader
-* [ARROW-3682](https://issues.apache.org/jira/browse/ARROW-3682) - [Go] unexport encoding/csv.Reader from CSV reader
-* [ARROW-3683](https://issues.apache.org/jira/browse/ARROW-3683) - [Go] add functional-option style to CSV reader
-* [ARROW-3684](https://issues.apache.org/jira/browse/ARROW-3684) - [Go] add chunk size option to CSV reader
-* [ARROW-3692](https://issues.apache.org/jira/browse/ARROW-3692) - [Gandiva] [Ruby] Add Ruby bindings of Gandiva
-* [ARROW-3693](https://issues.apache.org/jira/browse/ARROW-3693) - [R] Invalid buffer for empty characters with null data
-* [ARROW-3694](https://issues.apache.org/jira/browse/ARROW-3694) - [Java] Avoid superfluous string creation when logging level is disabled
-* [ARROW-3695](https://issues.apache.org/jira/browse/ARROW-3695) - [Gandiva] Use add\_arrow\_lib()
-* [ARROW-3696](https://issues.apache.org/jira/browse/ARROW-3696) - [C++] Add feather::TableWriter::Write(table)
-* [ARROW-3697](https://issues.apache.org/jira/browse/ARROW-3697) - [Ruby] Add schema#[]
-* [ARROW-3701](https://issues.apache.org/jira/browse/ARROW-3701) - [Gandiva] Add support for decimal operations
-* [ARROW-3708](https://issues.apache.org/jira/browse/ARROW-3708) - [Packaging] Nightly CentOS builds are failing
-* [ARROW-3713](https://issues.apache.org/jira/browse/ARROW-3713) - [Rust] Implement BinaryArrayBuilder
-* [ARROW-3718](https://issues.apache.org/jira/browse/ARROW-3718) - [Gandiva] Remove spurious gtest include
-* [ARROW-3719](https://issues.apache.org/jira/browse/ARROW-3719) - [GLib] Support read/write table to/from Feather
-* [ARROW-3720](https://issues.apache.org/jira/browse/ARROW-3720) - [GLib] Use "indices" instead of "indexes"
-* [ARROW-3721](https://issues.apache.org/jira/browse/ARROW-3721) - [Gandiva] [Python] Support all Gandiva literals
-* [ARROW-3722](https://issues.apache.org/jira/browse/ARROW-3722) - [C++] Allow specifying column types to CSV reader
-* [ARROW-3723](https://issues.apache.org/jira/browse/ARROW-3723) - [Plasma] [Ruby] Add Ruby bindings of Plasma
-* [ARROW-3724](https://issues.apache.org/jira/browse/ARROW-3724) - [GLib] Update gitignore
-* [ARROW-3725](https://issues.apache.org/jira/browse/ARROW-3725) - [GLib] Add field readers to GArrowStructDataType
-* [ARROW-3726](https://issues.apache.org/jira/browse/ARROW-3726) - [Rust] CSV Reader & Writer
-* [ARROW-3727](https://issues.apache.org/jira/browse/ARROW-3727) - [Python] Document use of pyarrow.foreign\_buffer, cuda.foreign\_buffer in Sphinx
-* [ARROW-3731](https://issues.apache.org/jira/browse/ARROW-3731) - [R] R API for reading and writing Parquet files
-* [ARROW-3733](https://issues.apache.org/jira/browse/ARROW-3733) - [GLib] Add to\_string() to GArrowTable and GArrowColumn
-* [ARROW-3736](https://issues.apache.org/jira/browse/ARROW-3736) - [CI/Docker] Ninja test in docker-compose run cpp hangs
-* [ARROW-3738](https://issues.apache.org/jira/browse/ARROW-3738) - [C++] Add CSV conversion option to parse ISO8601-like timestamp strings
-* [ARROW-3741](https://issues.apache.org/jira/browse/ARROW-3741) - [R] Add support for arrow::compute::Cast to convert Arrow arrays from one type to another
-* [ARROW-3743](https://issues.apache.org/jira/browse/ARROW-3743) - [Ruby] Add support for saving/loading Feather
-* [ARROW-3744](https://issues.apache.org/jira/browse/ARROW-3744) - [Ruby] Use garrow\_table\_to\_string() in Arrow::Table#to\_s
-* [ARROW-3746](https://issues.apache.org/jira/browse/ARROW-3746) - [Gandiva] [Python] Make it possible to list all functions registered with Gandiva
-* [ARROW-3747](https://issues.apache.org/jira/browse/ARROW-3747) - [C++] Flip order of data members in arrow::Decimal128
-* [ARROW-3748](https://issues.apache.org/jira/browse/ARROW-3748) - [GLib] Add GArrowCSVReader
-* [ARROW-3749](https://issues.apache.org/jira/browse/ARROW-3749) - [GLib] Typos in documentation and test case name
-* [ARROW-3751](https://issues.apache.org/jira/browse/ARROW-3751) - [Python] Add more cython bindings for gandiva
-* [ARROW-3752](https://issues.apache.org/jira/browse/ARROW-3752) - [C++] Remove unused status::ArrowError
-* [ARROW-3753](https://issues.apache.org/jira/browse/ARROW-3753) - [Gandiva] Remove debug print
-* [ARROW-3755](https://issues.apache.org/jira/browse/ARROW-3755) - [GLib] Support for CompressedInputStream, CompressedOutputStream
-* [ARROW-3760](https://issues.apache.org/jira/browse/ARROW-3760) - [R] Support Arrow CSV reader
-* [ARROW-3773](https://issues.apache.org/jira/browse/ARROW-3773) - [C++] Remove duplicated AssertArraysEqual code in parquet/arrow/arrow-reader-writer-test.cc
-* [ARROW-3778](https://issues.apache.org/jira/browse/ARROW-3778) - [C++] Don't put implementations in test-util.h
-* [ARROW-3781](https://issues.apache.org/jira/browse/ARROW-3781) - [C++] Configure buffer size in arrow::io::BufferedOutputStream
-* [ARROW-3782](https://issues.apache.org/jira/browse/ARROW-3782) - [C++] Implement BufferedReader for C++
-* [ARROW-3784](https://issues.apache.org/jira/browse/ARROW-3784) - [R] Array with type fails with x is not a vector
-* [ARROW-3785](https://issues.apache.org/jira/browse/ARROW-3785) - [C++] Use double-conversion conda package in CI toolchain
-* [ARROW-3787](https://issues.apache.org/jira/browse/ARROW-3787) - Implement From<ListArray> for BinaryArray
-* [ARROW-3788](https://issues.apache.org/jira/browse/ARROW-3788) - [Ruby] Add support for CSV parser written in C++
-* [ARROW-3795](https://issues.apache.org/jira/browse/ARROW-3795) - [R] Support for retrieving NAs from INT64 arrays
-* [ARROW-3796](https://issues.apache.org/jira/browse/ARROW-3796) - [Rust] Add Example for PrimitiveArrayBuilder
-* [ARROW-3798](https://issues.apache.org/jira/browse/ARROW-3798) - [GLib] Add support for column type CSV read options
-* [ARROW-3800](https://issues.apache.org/jira/browse/ARROW-3800) - [C++] Vendor a string\_view backport
-* [ARROW-3803](https://issues.apache.org/jira/browse/ARROW-3803) - [C++/Python] Split C++ and Python unit test Travis CI jobs, run all C++ tests (including Gandiva) together
-* [ARROW-3807](https://issues.apache.org/jira/browse/ARROW-3807) - [R] Missing Field API
-* [ARROW-3819](https://issues.apache.org/jira/browse/ARROW-3819) - [Packaging] Update conda variant files to conform with feedstock after compiler migration
-* [ARROW-3821](https://issues.apache.org/jira/browse/ARROW-3821) - [Format/Documentation]: Fix typos and grammar issues in Flight.proto comments
-* [ARROW-3823](https://issues.apache.org/jira/browse/ARROW-3823) - [R] + buffer.complex
-* [ARROW-3825](https://issues.apache.org/jira/browse/ARROW-3825) - [Python] The Python README.md does not show how to run the unit test suite
-* [ARROW-3830](https://issues.apache.org/jira/browse/ARROW-3830) - [GLib] Add GArrowCodec
-* [ARROW-3834](https://issues.apache.org/jira/browse/ARROW-3834) - [Doc] Merge Python & C++ and move to top-level
-* [ARROW-3836](https://issues.apache.org/jira/browse/ARROW-3836) - [C++] Add PREFIX option to ADD\_ARROW\_BENCHMARK
-* [ARROW-3839](https://issues.apache.org/jira/browse/ARROW-3839) - [Rust] Add ability to infer schema in CSV reader
-* [ARROW-3841](https://issues.apache.org/jira/browse/ARROW-3841) - [C++] warning: catching polymorphic type by value
-* [ARROW-3842](https://issues.apache.org/jira/browse/ARROW-3842) - [R] RecordBatchStreamWriter api
-* [ARROW-3844](https://issues.apache.org/jira/browse/ARROW-3844) - [C++] Remove ARROW\_USE\_SSE and ARROW\_SSE3
-* [ARROW-3845](https://issues.apache.org/jira/browse/ARROW-3845) - [Gandiva] [GLib] Add GGandivaNode
-* [ARROW-3847](https://issues.apache.org/jira/browse/ARROW-3847) - [GLib] Remove unnecessary “\”.
-* [ARROW-3849](https://issues.apache.org/jira/browse/ARROW-3849) - Leverage Armv8 crc32 extension instructions to accelerate the hash computation for Arm64.
-* [ARROW-3851](https://issues.apache.org/jira/browse/ARROW-3851) - [C++] "make check-format" is slow
-* [ARROW-3852](https://issues.apache.org/jira/browse/ARROW-3852) - [C++] used uninitialized warning
-* [ARROW-3853](https://issues.apache.org/jira/browse/ARROW-3853) - [C++] Implement string to timestamp cast
-* [ARROW-3854](https://issues.apache.org/jira/browse/ARROW-3854) - [GLib] Deprecate garrow\_gio\_{input,output}\_stream\_get\_raw()
-* [ARROW-3855](https://issues.apache.org/jira/browse/ARROW-3855) - [Rust] Schema/Field/Datatype should implement serde traits
-* [ARROW-3856](https://issues.apache.org/jira/browse/ARROW-3856) - [Ruby] Support compressed CSV save/load
-* [ARROW-3858](https://issues.apache.org/jira/browse/ARROW-3858) - [GLib] Use {class\_name}\_get\_instance\_private
-* [ARROW-3859](https://issues.apache.org/jira/browse/ARROW-3859) - [Java] Fix ComplexWriter backward incompatible change
-* [ARROW-3860](https://issues.apache.org/jira/browse/ARROW-3860) - [Gandiva] [C++] Add option to use -static-libstdc++ when building libgandiva\_jni.so
-* [ARROW-3862](https://issues.apache.org/jira/browse/ARROW-3862) - [C++] Improve dependencies download script
-* [ARROW-3863](https://issues.apache.org/jira/browse/ARROW-3863) - [GLib] Use travis\_retry with brew bundle command
-* [ARROW-3864](https://issues.apache.org/jira/browse/ARROW-3864) - [GLib] Add support for allow-float-truncate cast option
-* [ARROW-3865](https://issues.apache.org/jira/browse/ARROW-3865) - [Packaging] Add double-conversion dependency to conda forge recipes and the windows wheel build
-* [ARROW-3867](https://issues.apache.org/jira/browse/ARROW-3867) - [Documentation] Uploading binary release artifacts to Bintray
-* [ARROW-3868](https://issues.apache.org/jira/browse/ARROW-3868) - [Rust] Build against nightly Rust in CI
-* [ARROW-3870](https://issues.apache.org/jira/browse/ARROW-3870) - [C++] Add Peek to InputStream API
-* [ARROW-3871](https://issues.apache.org/jira/browse/ARROW-3871) - [R] Replace usages of C++ GetValuesSafely with new methods on ArrayData
-* [ARROW-3878](https://issues.apache.org/jira/browse/ARROW-3878) - [Rust] Improve primitive types
-* [ARROW-3880](https://issues.apache.org/jira/browse/ARROW-3880) - [Rust] PrimitiveArray<T> should support simple math operations
-* [ARROW-3883](https://issues.apache.org/jira/browse/ARROW-3883) - [Rust] Update Rust README to reflect new functionality
-* [ARROW-3884](https://issues.apache.org/jira/browse/ARROW-3884) - [Python] Add LLVM6 to manylinux1 base image
-* [ARROW-3885](https://issues.apache.org/jira/browse/ARROW-3885) - [Rust] Update version to 0.12.0 and update release instructions on wiki
-* [ARROW-3886](https://issues.apache.org/jira/browse/ARROW-3886) - [C++] Additional test cases for ARROW-3831
-* [ARROW-3891](https://issues.apache.org/jira/browse/ARROW-3891) - [Java] Remove Long.bitCount with simple bitmap operations
-* [ARROW-3893](https://issues.apache.org/jira/browse/ARROW-3893) - [C++] Improve adaptive int builder performance
-* [ARROW-3895](https://issues.apache.org/jira/browse/ARROW-3895) - [Rust] CSV reader should return Result<Option<>> not Option<Result<>>
-* [ARROW-3900](https://issues.apache.org/jira/browse/ARROW-3900) - [GLib] Add garrow\_mutable\_buffer\_set\_data()
-* [ARROW-3905](https://issues.apache.org/jira/browse/ARROW-3905) - [Ruby] Add StructDataType#[]
-* [ARROW-3906](https://issues.apache.org/jira/browse/ARROW-3906) - [C++] Break builder.cc into multiple compilation units
-* [ARROW-3908](https://issues.apache.org/jira/browse/ARROW-3908) - [Rust] Update rust dockerfile to use nightly toolchain
-* [ARROW-3910](https://issues.apache.org/jira/browse/ARROW-3910) - [Python] Set date\_as\_object to True in \*.to\_pandas as default after deduplicating logic implemented
-* [ARROW-3911](https://issues.apache.org/jira/browse/ARROW-3911) - [Python] Deduplicate datetime.date objects in Table.to\_pandas internals
-* [ARROW-3912](https://issues.apache.org/jira/browse/ARROW-3912) - [Plasma][GLib] Add support for creating and referring objects
-* [ARROW-3913](https://issues.apache.org/jira/browse/ARROW-3913) - [Gandiva] [GLib] Add GGandivaLiteralNode
-* [ARROW-3914](https://issues.apache.org/jira/browse/ARROW-3914) - [C++/Python/Packaging] Docker-compose setup for Alpine linux
-* [ARROW-3916](https://issues.apache.org/jira/browse/ARROW-3916) - [Python] Support caller-provided filesystem in \`ParquetWriter\` constructor
-* [ARROW-3922](https://issues.apache.org/jira/browse/ARROW-3922) - [C++] improve the performance of bitmap operations
-* [ARROW-3924](https://issues.apache.org/jira/browse/ARROW-3924) - [Packaging][Plasma] Add support for Plasma deb/rpm packages
-* [ARROW-3925](https://issues.apache.org/jira/browse/ARROW-3925) - [Python] Include autoconf in Linux/macOS dependencies in conda environment
-* [ARROW-3928](https://issues.apache.org/jira/browse/ARROW-3928) - [Python] Add option to deduplicate PyBytes / PyString / PyUnicode objects in Table.to\_pandas conversion path
-* [ARROW-3929](https://issues.apache.org/jira/browse/ARROW-3929) - [Go] improve memory usage of CSV reader to improve runtime performance
-* [ARROW-3930](https://issues.apache.org/jira/browse/ARROW-3930) - [C++] Random test data generation is slow
-* [ARROW-3932](https://issues.apache.org/jira/browse/ARROW-3932) - [Python/Documentation] Include Benchmarks.md in Sphinx docs
-* [ARROW-3934](https://issues.apache.org/jira/browse/ARROW-3934) - [Gandiva] Don't compile precompiled tests if ARROW\_GANDIVA\_BUILD\_TESTS=off
-* [ARROW-3938](https://issues.apache.org/jira/browse/ARROW-3938) - [Packaging] Stop referring to java/pom.xml to get version information
-* [ARROW-3939](https://issues.apache.org/jira/browse/ARROW-3939) - [Rust] Remove macro definition for ListArrayBuilder
-* [ARROW-3945](https://issues.apache.org/jira/browse/ARROW-3945) - [Website] Blog post about Gandiva code donation
-* [ARROW-3946](https://issues.apache.org/jira/browse/ARROW-3946) - [GLib] Add support for union
-* [ARROW-3948](https://issues.apache.org/jira/browse/ARROW-3948) - [CI][GLib] Set timeout to Homebrew
-* [ARROW-3950](https://issues.apache.org/jira/browse/ARROW-3950) - [Plasma] Don't force loading the TensorFlow op on import
-* [ARROW-3952](https://issues.apache.org/jira/browse/ARROW-3952) - [Rust] Specify edition="2018" in Cargo.toml
-* [ARROW-3958](https://issues.apache.org/jira/browse/ARROW-3958) - [Plasma] Reduce number of IPCs
-* [ARROW-3959](https://issues.apache.org/jira/browse/ARROW-3959) - [Rust] Time and Timestamp Support
-* [ARROW-3960](https://issues.apache.org/jira/browse/ARROW-3960) - [Rust] remove extern crate for Rust 2018
-* [ARROW-3963](https://issues.apache.org/jira/browse/ARROW-3963) - [Packaging/Docker] Nightly test for building sphinx documentations
-* [ARROW-3964](https://issues.apache.org/jira/browse/ARROW-3964) - [Go] More readable example for csv.Reader
-* [ARROW-3967](https://issues.apache.org/jira/browse/ARROW-3967) - [Gandiva] [C++] Make gandiva/node.h public
-* [ARROW-3970](https://issues.apache.org/jira/browse/ARROW-3970) - [Gandiva][C++] Remove unnecessary boost dependencies
-* [ARROW-3971](https://issues.apache.org/jira/browse/ARROW-3971) - [Python] Remove APIs deprecated in 0.11 and prior
-* [ARROW-3974](https://issues.apache.org/jira/browse/ARROW-3974) - [C++] Combine field\_builders\_ and children\_ members in array/builder.h
-* [ARROW-3982](https://issues.apache.org/jira/browse/ARROW-3982) - [C++] Allow "binary" input in simple JSON format
-* [ARROW-3983](https://issues.apache.org/jira/browse/ARROW-3983) - [Gandiva][Crossbow] Use static boost while packaging
-* [ARROW-3984](https://issues.apache.org/jira/browse/ARROW-3984) - [C++] Exit with error if user hits zstd ExternalProject path
-* [ARROW-3986](https://issues.apache.org/jira/browse/ARROW-3986) - [C++] Write prose documentation
-* [ARROW-3988](https://issues.apache.org/jira/browse/ARROW-3988) - [C++] Do not build unit tests by default in build system
-* [ARROW-3993](https://issues.apache.org/jira/browse/ARROW-3993) - [JS] CI Jobs Failing
-* [ARROW-3994](https://issues.apache.org/jira/browse/ARROW-3994) - [C++] Remove ARROW\_GANDIVA\_BUILD\_TESTS option
-* [ARROW-3995](https://issues.apache.org/jira/browse/ARROW-3995) - [CI] Use understandable names in Travis Matrix
-* [ARROW-3997](https://issues.apache.org/jira/browse/ARROW-3997) - [C++] [Doc] Clarify dictionary encoding integer signedness (and width?)
-* [ARROW-4002](https://issues.apache.org/jira/browse/ARROW-4002) - [C++][Gandiva] Remove CMake version check
-* [ARROW-4004](https://issues.apache.org/jira/browse/ARROW-4004) - [GLib] Replace GPU with CUDA
-* [ARROW-4005](https://issues.apache.org/jira/browse/ARROW-4005) - [Plasma] [GLib] Add gplasma\_client\_disconnect()
-* [ARROW-4006](https://issues.apache.org/jira/browse/ARROW-4006) - Add CODE\_OF\_CONDUCT.md
-* [ARROW-4009](https://issues.apache.org/jira/browse/ARROW-4009) - [CI] Run Valgrind and C++ code coverage in different builds
-* [ARROW-4015](https://issues.apache.org/jira/browse/ARROW-4015) - [Plasma] remove legacy interfaces for plasma manager
-* [ARROW-4017](https://issues.apache.org/jira/browse/ARROW-4017) - [C++] Check and update vendored libraries
-* [ARROW-4026](https://issues.apache.org/jira/browse/ARROW-4026) - [C++] Use separate modular $COMPONENT-test targets for unit tests
-* [ARROW-4028](https://issues.apache.org/jira/browse/ARROW-4028) - [Rust] Merge parquet-rs codebase
-* [ARROW-4029](https://issues.apache.org/jira/browse/ARROW-4029) - [C++] Define and document naming convention for internal / private header files not to be installed
-* [ARROW-4030](https://issues.apache.org/jira/browse/ARROW-4030) - [CI] Use travis\_terminate to halt builds when a step fails
-* [ARROW-4035](https://issues.apache.org/jira/browse/ARROW-4035) - [Ruby] Support msys2 mingw dependencies
-* [ARROW-4037](https://issues.apache.org/jira/browse/ARROW-4037) - [Packaging] Remove workaround to verify 0.11.0
-* [ARROW-4038](https://issues.apache.org/jira/browse/ARROW-4038) - [Rust] Add array\_ops methods for boolean AND, OR, NOT
-* [ARROW-4039](https://issues.apache.org/jira/browse/ARROW-4039) - Update link to 'development.rst' page from Python README.md
-* [ARROW-4042](https://issues.apache.org/jira/browse/ARROW-4042) - [Rust] Inconsistent method naming between BinaryArray and PrimitiveArray
-* [ARROW-4043](https://issues.apache.org/jira/browse/ARROW-4043) - [Packaging/Docker] Python tests on alpine miss pytest dependency
-* [ARROW-4044](https://issues.apache.org/jira/browse/ARROW-4044) - [Packaging/Python] Add hypothesis test dependency to pyarrow conda recipe
-* [ARROW-4045](https://issues.apache.org/jira/browse/ARROW-4045) - [Packaging/Python] Add hypothesis test dependency to wheel crossbow tests
-* [ARROW-4048](https://issues.apache.org/jira/browse/ARROW-4048) - [GLib] Return ChunkedArray instead of Array in gparquet\_arrow\_file\_reader\_read\_column
-* [ARROW-4051](https://issues.apache.org/jira/browse/ARROW-4051) - [Gandiva] [GLib] Add support for null literal
-* [ARROW-4054](https://issues.apache.org/jira/browse/ARROW-4054) - [Python] Update gtest, flatbuffers and OpenSSL in manylinux1 base image
-* [ARROW-4060](https://issues.apache.org/jira/browse/ARROW-4060) - [Rust] Add Parquet/Arrow schema converter
-* [ARROW-4069](https://issues.apache.org/jira/browse/ARROW-4069) - [Python] Add tests for casting from binary to utf8
-* [ARROW-4075](https://issues.apache.org/jira/browse/ARROW-4075) - [Rust] Reuse array builder after calling finish()
-* [ARROW-4079](https://issues.apache.org/jira/browse/ARROW-4079) - [C++] Add machine benchmarks
-* [ARROW-4080](https://issues.apache.org/jira/browse/ARROW-4080) - [Rust] Improving lengthy build times in Appveyor
-* [ARROW-4082](https://issues.apache.org/jira/browse/ARROW-4082) - [C++] CMake tweaks: allow RelWithDebInfo, improve FindClangTools
-* [ARROW-4084](https://issues.apache.org/jira/browse/ARROW-4084) - [C++] Simplify Status and stringstream boilerplate
-* [ARROW-4085](https://issues.apache.org/jira/browse/ARROW-4085) - [GLib] Use "field" for struct data type
-* [ARROW-4087](https://issues.apache.org/jira/browse/ARROW-4087) - [C++] Make CSV nulls configurable
-* [ARROW-4093](https://issues.apache.org/jira/browse/ARROW-4093) - [C++] Deprecated method suggests wrong method
-* [ARROW-4098](https://issues.apache.org/jira/browse/ARROW-4098) - [Python] Deprecate pyarrow.open\_stream, open\_file in favor of pa.ipc.open\_stream/open\_file
-* [ARROW-4100](https://issues.apache.org/jira/browse/ARROW-4100) - [Gandiva][C++] Fix regex to ignore "." character
-* [ARROW-4102](https://issues.apache.org/jira/browse/ARROW-4102) - [C++] FixedSizeBinary identity cast not implemented
-* [ARROW-4103](https://issues.apache.org/jira/browse/ARROW-4103) - [Documentation] Add README to docs/ root
-* [ARROW-4105](https://issues.apache.org/jira/browse/ARROW-4105) - Add rust-toolchain to enforce use of the nightly toolchain for building
-* [ARROW-4107](https://issues.apache.org/jira/browse/ARROW-4107) - [Python] Use ninja in pyarrow manylinux1 build
-* [ARROW-4112](https://issues.apache.org/jira/browse/ARROW-4112) - [Packaging][Gandiva] Add support for deb packages
-* [ARROW-4116](https://issues.apache.org/jira/browse/ARROW-4116) - [Python] Clarify in development.rst that virtualenv cannot be used with miniconda/Anaconda
-* [ARROW-4122](https://issues.apache.org/jira/browse/ARROW-4122) - [C++] Initialize some uninitialized class members
-* [ARROW-4127](https://issues.apache.org/jira/browse/ARROW-4127) - [Documentation] Add Docker build instructions
-* [ARROW-4129](https://issues.apache.org/jira/browse/ARROW-4129) - [Python] Fix syntax problem in benchmark docs
-* [ARROW-4132](https://issues.apache.org/jira/browse/ARROW-4132) - [GLib] Add more GArrowTable constructors
-* [ARROW-4141](https://issues.apache.org/jira/browse/ARROW-4141) - [Ruby] Add support for creating schema from raw Ruby objects
-* [ARROW-4148](https://issues.apache.org/jira/browse/ARROW-4148) - [CI/Python] Disable ORC on nightly Alpine builds
-* [ARROW-4150](https://issues.apache.org/jira/browse/ARROW-4150) - [C++] Do not return buffers containing nullptr from internal allocations
-* [ARROW-4151](https://issues.apache.org/jira/browse/ARROW-4151) - [Rust] Restructure project directories
-* [ARROW-4152](https://issues.apache.org/jira/browse/ARROW-4152) - [GLib] Remove an example to show Torch integration
-* [ARROW-4153](https://issues.apache.org/jira/browse/ARROW-4153) - [GLib] Add builder\_append\_value() for consistency
-* [ARROW-4154](https://issues.apache.org/jira/browse/ARROW-4154) - [GLib] Add GArrowDecimal128DataType
-* [ARROW-4155](https://issues.apache.org/jira/browse/ARROW-4155) - [Rust] Implement array\_ops::sum() for PrimitiveArray<T>
-* [ARROW-4156](https://issues.apache.org/jira/browse/ARROW-4156) - [C++] xcodebuild failure for cmake generated project
-* [ARROW-4158](https://issues.apache.org/jira/browse/ARROW-4158) - [Dev] Allow maintainers to use a GitHub API token when merging pull requests
-* [ARROW-4160](https://issues.apache.org/jira/browse/ARROW-4160) - [Rust] Add README and executable files to parquet
-* [ARROW-4161](https://issues.apache.org/jira/browse/ARROW-4161) - [GLib] Add GPlasmaClientOptions
-* [ARROW-4162](https://issues.apache.org/jira/browse/ARROW-4162) - [Ruby] Add support for creating data types from description
-* [ARROW-4166](https://issues.apache.org/jira/browse/ARROW-4166) - [Ruby] Add support for saving to and loading from buffer
-* [ARROW-4167](https://issues.apache.org/jira/browse/ARROW-4167) - [Gandiva] switch to arrow/util/variant
-* [ARROW-4168](https://issues.apache.org/jira/browse/ARROW-4168) - [GLib] Use property to keep GArrowDataType passed in garrow\_field\_new()
-* [ARROW-4172](https://issues.apache.org/jira/browse/ARROW-4172) - [Rust] more consistent naming in array builders
-* [ARROW-4174](https://issues.apache.org/jira/browse/ARROW-4174) - [Ruby] Add support for building composite array from raw Ruby objects
-* [ARROW-4175](https://issues.apache.org/jira/browse/ARROW-4175) - [GLib] Add support for decimal compare operators
-* [ARROW-4177](https://issues.apache.org/jira/browse/ARROW-4177) - [C++] Add ThreadPool and TaskGroup microbenchmarks
-* [ARROW-4183](https://issues.apache.org/jira/browse/ARROW-4183) - [Ruby] Add Arrow::Struct as an element of Arrow::StructArray
-* [ARROW-4184](https://issues.apache.org/jira/browse/ARROW-4184) - [Ruby] Add Arrow::RecordBatch#to\_table
-* [ARROW-4191](https://issues.apache.org/jira/browse/ARROW-4191) - [C++] Use same CC and AR for jemalloc as for the main sources
-* [ARROW-4199](https://issues.apache.org/jira/browse/ARROW-4199) - [GLib] Add garrow\_seekable\_input\_stream\_peek()
-* [ARROW-4207](https://issues.apache.org/jira/browse/ARROW-4207) - [Gandiva] [GLib] Add support for IfNode
-* [ARROW-4210](https://issues.apache.org/jira/browse/ARROW-4210) - [Python] Mention boost-cpp directly in the conda meta.yaml for pyarrow
-* [ARROW-4211](https://issues.apache.org/jira/browse/ARROW-4211) - [GLib] Add GArrowFixedSizeBinaryDataType
-* [ARROW-4214](https://issues.apache.org/jira/browse/ARROW-4214) - [Ruby] Add support for building RecordBatch from raw Ruby objects
-* [ARROW-4216](https://issues.apache.org/jira/browse/ARROW-4216) - [Python] Add CUDA API docs
-* [ARROW-4228](https://issues.apache.org/jira/browse/ARROW-4228) - [GLib] Add garrow\_list\_data\_type\_get\_field()
-* [ARROW-4229](https://issues.apache.org/jira/browse/ARROW-4229) - [Packaging] Set crossbow target explicitly to enable building arbitrary arrow repo
-* [ARROW-4233](https://issues.apache.org/jira/browse/ARROW-4233) - [Packaging] Create a Dockerfile to build source archive
-* [ARROW-4239](https://issues.apache.org/jira/browse/ARROW-4239) - [Release] Updating .deb package names in the prepare script failed to run on OSX
-* [ARROW-4240](https://issues.apache.org/jira/browse/ARROW-4240) - [Packaging] Documents for Plasma GLib and Gandiva GLib are missing in source archive
-* [ARROW-4241](https://issues.apache.org/jira/browse/ARROW-4241) - [Packaging] Disable crossbow conda OSX clang builds
-* [ARROW-4243](https://issues.apache.org/jira/browse/ARROW-4243) - [Python] Test failure with pandas 0.24.0rc1
-* [ARROW-4249](https://issues.apache.org/jira/browse/ARROW-4249) - [Plasma] Remove reference to logging.h from plasma/common.h
-* [ARROW-4257](https://issues.apache.org/jira/browse/ARROW-4257) - [Release] Update release verification script to check binaries on Bintray
-* [ARROW-4266](https://issues.apache.org/jira/browse/ARROW-4266) - [Python][CI] Disable ORC tests in dask integration test
-* [ARROW-4269](https://issues.apache.org/jira/browse/ARROW-4269) - [Python] AttributeError: module 'pandas.core' has no attribute 'arrays'
-* [ARROW-4270](https://issues.apache.org/jira/browse/ARROW-4270) - [Packaging][Conda] Update xcode version and remove toolchain builds
-* [ARROW-4276](https://issues.apache.org/jira/browse/ARROW-4276) - [Release] Remove needless Bintray authentication from binaries verify script
-* [ARROW-45](https://issues.apache.org/jira/browse/ARROW-45) - [Python] Add unnest/flatten function for List types
-* [ARROW-554](https://issues.apache.org/jira/browse/ARROW-554) - [C++] Implement functions to conform unequal dictionaries amongst multiple Arrow arrays
-* [ARROW-854](https://issues.apache.org/jira/browse/ARROW-854) - [Format] Support sparse tensor
-* [ARROW-912](https://issues.apache.org/jira/browse/ARROW-912) - [Python] Account for multiarch systems in development.rst
-
-## Bug Fixes
-
-* [ARROW-1847](https://issues.apache.org/jira/browse/ARROW-1847) - [Doc] Document the difference between RecordBatch and Table in an FAQ fashion
-* [ARROW-1994](https://issues.apache.org/jira/browse/ARROW-1994) - [Python] Test against Pandas master
-* [ARROW-2026](https://issues.apache.org/jira/browse/ARROW-2026) - [Python] Cast all timestamp resolutions to INT96 use\_deprecated\_int96\_timestamps=True
-* [ARROW-2038](https://issues.apache.org/jira/browse/ARROW-2038) - [Python] Follow-up bug fixes for s3fs Parquet support
-* [ARROW-2113](https://issues.apache.org/jira/browse/ARROW-2113) - [Python] Incomplete CLASSPATH with "hadoop" contained in it can fool the classpath setting HDFS logic
-* [ARROW-2591](https://issues.apache.org/jira/browse/ARROW-2591) - [Python] Segmentation fault when writing empty ListType column to Parquet
-* [ARROW-2592](https://issues.apache.org/jira/browse/ARROW-2592) - [Python] Error reading old Parquet file due to metadata backwards compatibility issue
-* [ARROW-2708](https://issues.apache.org/jira/browse/ARROW-2708) - [C++] Internal GetValues function in arrow::compute should check for nullptr
-* [ARROW-2970](https://issues.apache.org/jira/browse/ARROW-2970) - [Python] NumPyConverter::Visit for Binary/String/FixedSizeBinary can overflow
-* [ARROW-3058](https://issues.apache.org/jira/browse/ARROW-3058) - [Python] Feather reads fail with unintuitive error when conversion from pandas yields ChunkedArray
-* [ARROW-3186](https://issues.apache.org/jira/browse/ARROW-3186) - [GLib] mesonbuild failures in Travis CI
-* [ARROW-3202](https://issues.apache.org/jira/browse/ARROW-3202) - [C++] Build does not succeed on Alpine Linux
-* [ARROW-3225](https://issues.apache.org/jira/browse/ARROW-3225) - [C++/Python] Pandas object conversion of ListType<DateType> and ListType<TimeType>
-* [ARROW-3324](https://issues.apache.org/jira/browse/ARROW-3324) - [Parquet] Free more internal resources when writing multiple row groups
-* [ARROW-3343](https://issues.apache.org/jira/browse/ARROW-3343) - [Java] Java tests fail non-deterministically with memory leak from Flight tests
-* [ARROW-3405](https://issues.apache.org/jira/browse/ARROW-3405) - [Python] Document CSV reader
-* [ARROW-3428](https://issues.apache.org/jira/browse/ARROW-3428) - [Python] from\_pandas gives incorrect results when converting floating point to bool
-* [ARROW-3436](https://issues.apache.org/jira/browse/ARROW-3436) - [C++] Boost version required by Gandiva is too new for Ubuntu 14.04
-* [ARROW-3437](https://issues.apache.org/jira/browse/ARROW-3437) - [Gandiva][C++] Configure static linking of libgcc, libstdc++ with LDFLAGS
-* [ARROW-3438](https://issues.apache.org/jira/browse/ARROW-3438) - [Packaging] Escaped bulletpoints in changelog
-* [ARROW-3445](https://issues.apache.org/jira/browse/ARROW-3445) - [GLib] Parquet GLib doesn't link Arrow GLib
-* [ARROW-3449](https://issues.apache.org/jira/browse/ARROW-3449) - [C++] Support CMake 3.2 for "out of the box" builds
-* [ARROW-3466](https://issues.apache.org/jira/browse/ARROW-3466) - [Python] Crash when importing tensorflow and pyarrow
-* [ARROW-3467](https://issues.apache.org/jira/browse/ARROW-3467) - Building against external double conversion is broken
-* [ARROW-3470](https://issues.apache.org/jira/browse/ARROW-3470) - [C++] Row-wise conversion tutorial has fallen out of date
-* [ARROW-3477](https://issues.apache.org/jira/browse/ARROW-3477) - [C++] Testsuite fails on 32 bit arch
-* [ARROW-3480](https://issues.apache.org/jira/browse/ARROW-3480) - [Website] Install document for Ubuntu is broken
-* [ARROW-3485](https://issues.apache.org/jira/browse/ARROW-3485) - [C++] Examples fail with Protobuf error
-* [ARROW-3494](https://issues.apache.org/jira/browse/ARROW-3494) - [C++] re2 conda-forge package not working in toolchain
-* [ARROW-3516](https://issues.apache.org/jira/browse/ARROW-3516) - [C++] Use unsigned type for difference of pointers in parallel\_memcpy
-* [ARROW-3517](https://issues.apache.org/jira/browse/ARROW-3517) - [C++] MinGW 32bit build causes g++ segv
-* [ARROW-3524](https://issues.apache.org/jira/browse/ARROW-3524) - [C++] Fix compiler warnings from ARROW-3409 on clang-6
-* [ARROW-3527](https://issues.apache.org/jira/browse/ARROW-3527) - [R] Unused variables in R-package C++ code
-* [ARROW-3528](https://issues.apache.org/jira/browse/ARROW-3528) - [R] Typo in R documentation
-* [ARROW-3535](https://issues.apache.org/jira/browse/ARROW-3535) - [Python] pip install tensorflow installs too new a numpy in manylinux1 build
-* [ARROW-3541](https://issues.apache.org/jira/browse/ARROW-3541) - [Rust] Update BufferBuilder to allow for new bit-packed BooleanArray
-* [ARROW-3544](https://issues.apache.org/jira/browse/ARROW-3544) - [Gandiva] Populate function registry in multiple compilation units to mitigate long compile times in release mode
-* [ARROW-3549](https://issues.apache.org/jira/browse/ARROW-3549) - [Rust] Replace i64 with usize for some bit utility functions
-* [ARROW-3573](https://issues.apache.org/jira/browse/ARROW-3573) - [Rust] with\_bitset does not set valid bits correctly
-* [ARROW-3580](https://issues.apache.org/jira/browse/ARROW-3580) - [Gandiva][C++] Build error with g++ 8.2.0
-* [ARROW-3586](https://issues.apache.org/jira/browse/ARROW-3586) - [Python] Segmentation fault when converting empty table to pandas with categoricals
-* [ARROW-3598](https://issues.apache.org/jira/browse/ARROW-3598) - [Plasma] plasma\_store\_server fails linking with GPU enabled
-* [ARROW-3613](https://issues.apache.org/jira/browse/ARROW-3613) - [Go] Resize does not correctly update the length
-* [ARROW-3614](https://issues.apache.org/jira/browse/ARROW-3614) - [R] Handle Type::TIMESTAMP from Arrow to R
-* [ARROW-3634](https://issues.apache.org/jira/browse/ARROW-3634) - [GLib] cuda.cpp compile error
-* [ARROW-3658](https://issues.apache.org/jira/browse/ARROW-3658) - [Rust] validation of offsets buffer is incorrect for \`List<T>\`
-* [ARROW-3670](https://issues.apache.org/jira/browse/ARROW-3670) - [C++] Use FindBacktrace to find execinfo.h support
-* [ARROW-3687](https://issues.apache.org/jira/browse/ARROW-3687) - [Rust] Anything measuring array slots should be \`usize\`
-* [ARROW-3698](https://issues.apache.org/jira/browse/ARROW-3698) - [C++] Segmentation fault when using a large table in Gandiva
-* [ARROW-3700](https://issues.apache.org/jira/browse/ARROW-3700) - [C++] CSV parser should allow ignoring empty lines
-* [ARROW-3703](https://issues.apache.org/jira/browse/ARROW-3703) - [Python] DataFrame.to\_parquet crashes if datetime column has time zones
-* [ARROW-3704](https://issues.apache.org/jira/browse/ARROW-3704) - [Gandiva] Can't build with g++ 8.2.0
-* [ARROW-3707](https://issues.apache.org/jira/browse/ARROW-3707) - [C++] test failure with zstd 1.3.7
-* [ARROW-3711](https://issues.apache.org/jira/browse/ARROW-3711) - [C++] Don't pass CXX\_FLAGS to C\_FLAGS
-* [ARROW-3712](https://issues.apache.org/jira/browse/ARROW-3712) - [CI] License check regression (RAT failure)
-* [ARROW-3715](https://issues.apache.org/jira/browse/ARROW-3715) - [C++] gflags\_ep fails to build with CMake 3.13
-* [ARROW-3716](https://issues.apache.org/jira/browse/ARROW-3716) - [R] Missing cases for ChunkedArray conversion
-* [ARROW-3728](https://issues.apache.org/jira/browse/ARROW-3728) - [Python] Merging Parquet Files - Pandas Meta in Schema Mismatch
-* [ARROW-3734](https://issues.apache.org/jira/browse/ARROW-3734) - [C++] Linking static zstd library fails on Arch x86-64
-* [ARROW-3740](https://issues.apache.org/jira/browse/ARROW-3740) - [C++] Calling ArrayBuilder::Resize with length smaller than current appended length results in invalid state
-* [ARROW-3742](https://issues.apache.org/jira/browse/ARROW-3742) - Fix pyarrow.types & gandiva cython bindings
-* [ARROW-3745](https://issues.apache.org/jira/browse/ARROW-3745) - [C++] CMake passes static libraries multiple times to linker
-* [ARROW-3754](https://issues.apache.org/jira/browse/ARROW-3754) - [Packaging] Zstd configure error on linux package builds
-* [ARROW-3756](https://issues.apache.org/jira/browse/ARROW-3756) - [CI/Docker/Java] Java tests are failing in docker-compose setup
-* [ARROW-3762](https://issues.apache.org/jira/browse/ARROW-3762) - [C++] Parquet arrow::Table reads error when overflowing capacity of BinaryArray
-* [ARROW-3765](https://issues.apache.org/jira/browse/ARROW-3765) - [Gandiva] Segfault when the validity bitmap has not been allocated
-* [ARROW-3766](https://issues.apache.org/jira/browse/ARROW-3766) - [Python] pa.Table.from\_pandas doesn't use schema ordering
-* [ARROW-3768](https://issues.apache.org/jira/browse/ARROW-3768) - [Python] set classpath to hdfs not hadoop executable
-* [ARROW-3790](https://issues.apache.org/jira/browse/ARROW-3790) - [C++] Signed to unsigned integer cast yields incorrect results when type sizes are the same
-* [ARROW-3792](https://issues.apache.org/jira/browse/ARROW-3792) - [Python] Segmentation fault when writing empty RecordBatches to Parquet
-* [ARROW-3793](https://issues.apache.org/jira/browse/ARROW-3793) - [C++] TestScalarAppendUnsafe is not testing unsafe appends
-* [ARROW-3797](https://issues.apache.org/jira/browse/ARROW-3797) - [Rust] BinaryArray::value\_offset incorrect in offset case
-* [ARROW-3805](https://issues.apache.org/jira/browse/ARROW-3805) - [Gandiva] handle null validity bitmap in if-else expressions
-* [ARROW-3831](https://issues.apache.org/jira/browse/ARROW-3831) - [C++] arrow::util::Codec::Decompress() doesn't return decompressed data size
-* [ARROW-3835](https://issues.apache.org/jira/browse/ARROW-3835) - [C++] arrow::io::CompressedOutputStream::raw() implementation is missing
-* [ARROW-3837](https://issues.apache.org/jira/browse/ARROW-3837) - [C++] gflags link errors on Windows
-* [ARROW-3866](https://issues.apache.org/jira/browse/ARROW-3866) - [Python] Column metadata is not transferred to tables in pyarrow
-* [ARROW-3874](https://issues.apache.org/jira/browse/ARROW-3874) - [Gandiva] Cannot build: LLVM not detected correctly
-* [ARROW-3879](https://issues.apache.org/jira/browse/ARROW-3879) - [C++] cuda-test failure
-* [ARROW-3888](https://issues.apache.org/jira/browse/ARROW-3888) - [C++] Compilation warnings with gcc 7.3.0
-* [ARROW-3889](https://issues.apache.org/jira/browse/ARROW-3889) - [Python] Creating schema with invalid parameters causes segmentation fault
-* [ARROW-3890](https://issues.apache.org/jira/browse/ARROW-3890) - [Python] Creating Array with explicit string type fails on Python 2.7
-* [ARROW-3894](https://issues.apache.org/jira/browse/ARROW-3894) - [Python] Error reading IPC file with no record batches
-* [ARROW-3898](https://issues.apache.org/jira/browse/ARROW-3898) - parquet-arrow example has compilation errors
-* [ARROW-3920](https://issues.apache.org/jira/browse/ARROW-3920) - Plasma reference counting not properly done in TensorFlow custom operator.
-* [ARROW-3931](https://issues.apache.org/jira/browse/ARROW-3931) - Make it possible to build regardless of LANG
-* [ARROW-3936](https://issues.apache.org/jira/browse/ARROW-3936) - Add \_O\_NOINHERIT to the file open flags on Windows
-* [ARROW-3937](https://issues.apache.org/jira/browse/ARROW-3937) - [Rust] Rust nightly build is failing
-* [ARROW-3940](https://issues.apache.org/jira/browse/ARROW-3940) - [Python/Documentation] Add required packages to the development instruction
-* [ARROW-3941](https://issues.apache.org/jira/browse/ARROW-3941) - [R] RecordBatchStreamReader$schema
-* [ARROW-3942](https://issues.apache.org/jira/browse/ARROW-3942) - [R] Feather api fixes
-* [ARROW-3953](https://issues.apache.org/jira/browse/ARROW-3953) - Compat with pandas 0.24 rename of MultiIndex labels -> codes
-* [ARROW-3955](https://issues.apache.org/jira/browse/ARROW-3955) - [GLib] Add (transfer full) to free when no longer needed
-* [ARROW-3957](https://issues.apache.org/jira/browse/ARROW-3957) - [Python] Better error message when user connects to HDFS cluster with wrong port
-* [ARROW-3961](https://issues.apache.org/jira/browse/ARROW-3961) - [Python/Documentation] Fix wrong path in the pyarrow README
-* [ARROW-3969](https://issues.apache.org/jira/browse/ARROW-3969) - [Rust] CI build broken because rustfmt not available on nightly toolchain
-* [ARROW-3976](https://issues.apache.org/jira/browse/ARROW-3976) - [Ruby] Homebrew donation solicitation on CLI breaking CI builds
-* [ARROW-3977](https://issues.apache.org/jira/browse/ARROW-3977) - [Gandiva] gandiva cpp tests not running in CI
-* [ARROW-3979](https://issues.apache.org/jira/browse/ARROW-3979) - [Gandiva] fix all valgrind reported errors
-* [ARROW-3980](https://issues.apache.org/jira/browse/ARROW-3980) - [C++] Fix CRTP use in json-simple.cc
-* [ARROW-3989](https://issues.apache.org/jira/browse/ARROW-3989) - [Rust] CSV reader should handle case sensitivity for boolean values
-* [ARROW-3996](https://issues.apache.org/jira/browse/ARROW-3996) - [C++] Insufficient description on build
-* [ARROW-4008](https://issues.apache.org/jira/browse/ARROW-4008) - [C++] Integration test executable failure
-* [ARROW-4011](https://issues.apache.org/jira/browse/ARROW-4011) - [Gandiva] Refer irhelpers.bc in build directory
-* [ARROW-4019](https://issues.apache.org/jira/browse/ARROW-4019) - [C++] Fix coverity issues
-* [ARROW-4033](https://issues.apache.org/jira/browse/ARROW-4033) - [C++] thirdparty/download\_dependencies.sh uses tools or options not available in older Linuxes
-* [ARROW-4034](https://issues.apache.org/jira/browse/ARROW-4034) - [Ruby] Interface for FileOutputStream doesn't respect append=True
-* [ARROW-4041](https://issues.apache.org/jira/browse/ARROW-4041) - [CI] Python 2.7 run uses Python 3.6
-* [ARROW-4049](https://issues.apache.org/jira/browse/ARROW-4049) - [C++] Arrow never uses glog even though glog is linked
-* [ARROW-4052](https://issues.apache.org/jira/browse/ARROW-4052) - [C++] Linker errors with glog and gflags
-* [ARROW-4053](https://issues.apache.org/jira/browse/ARROW-4053) - [Python/Integration] HDFS Tests failing with I/O operation on closed file
-* [ARROW-4055](https://issues.apache.org/jira/browse/ARROW-4055) - [Python] Fails to convert pytz.utc with versions 2018.3 and earlier
-* [ARROW-4058](https://issues.apache.org/jira/browse/ARROW-4058) - [C++] arrow-io-hdfs-test fails when run against HDFS cluster from docker-compose
-* [ARROW-4065](https://issues.apache.org/jira/browse/ARROW-4065) - [C++] arrowTargets.cmake is broken
-* [ARROW-4066](https://issues.apache.org/jira/browse/ARROW-4066) - Instructions to create Sphinx documentation
-* [ARROW-4070](https://issues.apache.org/jira/browse/ARROW-4070) - [C++] ARROW\_BOOST\_VENDORED doesn't work properly with ninja build
-* [ARROW-4073](https://issues.apache.org/jira/browse/ARROW-4073) - [Python] Parquet test failures on AppVeyor
-* [ARROW-4074](https://issues.apache.org/jira/browse/ARROW-4074) - [Python] test\_get\_library\_dirs\_win32 fails if libraries installed someplace different from conda or wheel packages
-* [ARROW-4078](https://issues.apache.org/jira/browse/ARROW-4078) - [CI] Run Travis job where documentation is built when docs/ is changed
-* [ARROW-4088](https://issues.apache.org/jira/browse/ARROW-4088) - [Python] Table.from\_batches() fails when passed a schema with metadata
-* [ARROW-4089](https://issues.apache.org/jira/browse/ARROW-4089) - [Plasma] The tutorial is wrong regarding the parameter type of PlasmaClient.Create
-* [ARROW-4101](https://issues.apache.org/jira/browse/ARROW-4101) - [C++] Binary identity cast not implemented
-* [ARROW-4106](https://issues.apache.org/jira/browse/ARROW-4106) - [Python] Tests fail to run because hypothesis update broke its API
-* [ARROW-4109](https://issues.apache.org/jira/browse/ARROW-4109) - [Packaging] Missing glog dependency from arrow-cpp conda recipe
-* [ARROW-4113](https://issues.apache.org/jira/browse/ARROW-4113) - [R] Version number patch broke build
-* [ARROW-4114](https://issues.apache.org/jira/browse/ARROW-4114) - [C++][DOCUMENTATION]
-* [ARROW-4115](https://issues.apache.org/jira/browse/ARROW-4115) - [Gandiva] valgrind complains that boolean output data buffer has uninited data
-* [ARROW-4118](https://issues.apache.org/jira/browse/ARROW-4118) - [Python] Error with "asv run"
-* [ARROW-4125](https://issues.apache.org/jira/browse/ARROW-4125) - [Python] ASV benchmarks fail to run if Plasma extension is not built (e.g. on Windows)
-* [ARROW-4126](https://issues.apache.org/jira/browse/ARROW-4126) - [Go] offset not used when accessing boolean array
-* [ARROW-4128](https://issues.apache.org/jira/browse/ARROW-4128) - [C++][DOCUMENTATION] Update style guide to reflect some more exceptions
-* [ARROW-4130](https://issues.apache.org/jira/browse/ARROW-4130) - [Go] offset not used when accessing binary array
-* [ARROW-4134](https://issues.apache.org/jira/browse/ARROW-4134) - [Packaging] Properly setup timezone in docker tests to prevent ORC adapter's abort
-* [ARROW-4135](https://issues.apache.org/jira/browse/ARROW-4135) - [Python] Can't reload a pandas dataframe containing a list of datetime.time
-* [ARROW-4137](https://issues.apache.org/jira/browse/ARROW-4137) - [Rust] Move parquet code into a separate crate
-* [ARROW-4138](https://issues.apache.org/jira/browse/ARROW-4138) - [Python] setuptools\_scm customization does not work for versions above 0.9.0 on Windows
-* [ARROW-4147](https://issues.apache.org/jira/browse/ARROW-4147) - [JAVA] Reduce heap usage for variable width vectors
-* [ARROW-4149](https://issues.apache.org/jira/browse/ARROW-4149) - [CI/C++] Parquet test misses ZSTD compression codec in CMake 3.2 nightly builds
-* [ARROW-4157](https://issues.apache.org/jira/browse/ARROW-4157) - [C++] -Wdocumentation failures with clang 6.0 on Ubuntu 18.04
-* [ARROW-4171](https://issues.apache.org/jira/browse/ARROW-4171) - [Rust] fix parquet crate release version
-* [ARROW-4173](https://issues.apache.org/jira/browse/ARROW-4173) - JIRA library name is wrong in error message of dev/merge\_arrow\_pr.py
-* [ARROW-4178](https://issues.apache.org/jira/browse/ARROW-4178) - [C++] Fix TSan and UBSan errors
-* [ARROW-4179](https://issues.apache.org/jira/browse/ARROW-4179) - [Python] Tests crashing on all platforms in CI
-* [ARROW-4185](https://issues.apache.org/jira/browse/ARROW-4185) - [Rust] Appveyor builds are broken
-* [ARROW-4186](https://issues.apache.org/jira/browse/ARROW-4186) - [C++] BitmapWriters clobber the first byte when length=0
-* [ARROW-4188](https://issues.apache.org/jira/browse/ARROW-4188) - [Rust] There should be a README in the top level rust directory
-* [ARROW-4197](https://issues.apache.org/jira/browse/ARROW-4197) - [C++] Emscripten compiler fails building Arrow
-* [ARROW-4200](https://issues.apache.org/jira/browse/ARROW-4200) - [C++] conda\_env\_\* files cannot be used to create a fresh conda environment on Windows
-* [ARROW-4209](https://issues.apache.org/jira/browse/ARROW-4209) - [Gandiva] Returning IR structs causes issues with Windows
-* [ARROW-4215](https://issues.apache.org/jira/browse/ARROW-4215) - [GLib] Fix typos in documentation
-* [ARROW-4227](https://issues.apache.org/jira/browse/ARROW-4227) - [GLib] Field in composite data type returns wrong data type
-* [ARROW-4237](https://issues.apache.org/jira/browse/ARROW-4237) - [Packaging] Fix CMAKE\_INSTALL\_LIBDIR in release verification script
-* [ARROW-4238](https://issues.apache.org/jira/browse/ARROW-4238) - [Packaging] Fix RC version conflict between crossbow and rake
-* [ARROW-4246](https://issues.apache.org/jira/browse/ARROW-4246) - [Plasma][Python] PlasmaClient.list doesn't work with CUDA enabled Plasma
-* [ARROW-4256](https://issues.apache.org/jira/browse/ARROW-4256) - [Release] Update Windows verification script for 0.12 release
-* [ARROW-4258](https://issues.apache.org/jira/browse/ARROW-4258) - [Python] Safe cast fails from numpy float64 array with nans to integer
-* [ARROW-4260](https://issues.apache.org/jira/browse/ARROW-4260) - [Python] test\_serialize\_deserialize\_pandas is failing in multiple build entries
-
-[1]: https://www.apache.org/dyn/closer.cgi/arrow/arrow-0.12.0/
-[2]: https://www.apache.org/dyn/closer.cgi/arrow/arrow-0.12.0/binaries
-[3]: https://github.com/apache/arrow/releases/tag/apache-arrow-0.12.0
diff --git a/site/_release/0.13.0.md b/site/_release/0.13.0.md
deleted file mode 100644
index 5a61a17..0000000
--- a/site/_release/0.13.0.md
+++ /dev/null
@@ -1,715 +0,0 @@
----
-layout: default
-title: Apache Arrow 0.13.0 Release
-permalink: /release/0.13.0.html
----
-<!--
-{% comment %}
-Licensed to the Apache Software Foundation (ASF) under one or more
-contributor license agreements.  See the NOTICE file distributed with
-this work for additional information regarding copyright ownership.
-The ASF licenses this file to you under the Apache License, Version 2.0
-(the "License"); you may not use this file except in compliance with
-the License.  You may obtain a copy of the License at
-
-http://www.apache.org/licenses/LICENSE-2.0
-
-Unless required by applicable law or agreed to in writing, software
-distributed under the License is distributed on an "AS IS" BASIS,
-WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
-See the License for the specific language governing permissions and
-limitations under the License.
-{% endcomment %}
--->
-
-# Apache Arrow 0.13.0 (1 April 2019)
-
-This is a major release covering more than 2 months of development.
-
-## Download
-
-* [**Source Artifacts**][1]
-* **Binary Artifacts**
-  * [For CentOS][2]
-  * [For Debian][3]
-  * [For Python][4]
-  * [For Ubuntu][5]
-* [Git tag][6]
-
-## Contributors
-
-This release includes 550 commits from 81 distinct contributors.
-
-```console
-$ git shortlog -sn apache-arrow-0.12.0..apache-arrow-0.13.0
-    56	Wes McKinney
-    55	Kouhei Sutou
-    31	Antoine Pitrou
-    31	François Saint-Jacques
-    31	Korn, Uwe
-    30	Krisztián Szűcs
-    28	Andy Grove
-    27	Uwe L. Korn
-    21	Micah Kornfield
-    17	Benjamin Kietzman
-    17	Kenta Murata
-    17	Pindikura Ravindra
-    15	ptaylor
-    13	David Li
-    12	Paddy Horan
-    11	Neville Dipale
-     9	Philipp Moritz
-     8	Chao Sun
-     8	Romain Francois
-     7	Eric Erhardt
-     7	Yosuke Shiro
-     6	Nicolas Trinquier
-     5	Javier Luraschi
-     4	Anson Qian
-     4	Brian Hulette
-     4	Yuhong Guo
-     3	Anurag Khandelwal
-     3	Bryan Cutler
-     3	Dominik Moritz
-     3	Mike Pigott
-     3	Pearu Peterson
-     3	Sebastien Binet
-     2	Hatem Helal
-     2	Jeroen Ooms
-     2	Owen Nelson
-     2	Peter
-     2	Praveen
-     2	Renat Valiullin
-     2	Renjie Liu
-     2	emkornfield
-     2	tigerchen
-     1	Alexandre Crayssac
-     1	Areg Melik-Adamyan
-     1	Brandon Greenwell
-     1	Bruno P. Kinoshita
-     1	Christian Thiel
-     1	Christopher Hutchinson
-     1	Daniel Bast
-     1	Felix
-     1	Gene Novark
-     1	George Sakkis
-     1	Gidon Gershinsky
-     1	Ildar Musin
-     1	Ilya Tocar
-     1	Laurent Goujon
-     1	Matthias Vallentin
-     1	Maxim Novikov
-     1	Michael Pigott
-     1	Michael Vilim
-     1	Mike Bostock
-     1	P. Taylor Goetz
-     1	Paul Kernfeld
-     1	Paul Taylor
-     1	Prashanth Govindarajan
-     1	Robert Nishihara
-     1	Ross Smith II
-     1	Ryan Mackenzie White
-     1	Rylan Dmello
-     1	Stephen Toub
-     1	Steven Fackler
-     1	Suvayu Ali
-     1	Tanya Schlusser
-     1	Todd A. Anderson
-     1	Yu Ding
-     1	Yurui Zhou
-     1	alexandreyc
-     1	dsisnero
-     1	fjetter
-     1	praveenbingo
-     1	praveenkrishna
-     1	yl187661
-```
-
-## Patch Committers
-
-The following Apache committers merged contributed patches to the repository.
-
-```console
-$ git shortlog -csn apache-arrow-0.12.0..apache-arrow-0.13.0
-   193	Wes McKinney
-    95	Krisztián Szűcs
-    66	Uwe L. Korn
-    53	Kouhei Sutou
-    40	Antoine Pitrou
-    22	Andy Grove
-    21	Korn, Uwe
-    11	Pindikura Ravindra
-     9	Yosuke Shiro
-     8	Brian Hulette
-     7	Chao Sun
-     5	Philipp Moritz
-     5	Sebastien Binet
-     4	Bryan Cutler
-     2	GitHub
-     2	Paddy Horan
-     2	Robert Nishihara
-     2	Sidd
-     2	ptaylor
-     1	Micah Kornfield
-```
-
-## Changelog
-
-### New Features and Improvements
-
-* [ARROW-1425](https://issues.apache.org/jira/browse/ARROW-1425) - [Python] Document semantic differences between Spark timestamps and Arrow timestamps
-* [ARROW-1572](https://issues.apache.org/jira/browse/ARROW-1572) - [C++] Implement "value counts" kernels for tabulating value frequencies
-* [ARROW-1639](https://issues.apache.org/jira/browse/ARROW-1639) - [Python] More efficient serialization for RangeIndex in serialize\_pandas
-* [ARROW-1807](https://issues.apache.org/jira/browse/ARROW-1807) - [JAVA] Reduce Heap Usage (Phase 3): consolidate buffers
-* [ARROW-1896](https://issues.apache.org/jira/browse/ARROW-1896) - [C++] Do not allocate memory for primitive outputs in CastKernel::Call implementation
-* [ARROW-2015](https://issues.apache.org/jira/browse/ARROW-2015) - [Java] Use Java Time and Date APIs instead of JodaTime
-* [ARROW-2022](https://issues.apache.org/jira/browse/ARROW-2022) - [Format] Add custom metadata field specific to a RecordBatch message
-* [ARROW-2112](https://issues.apache.org/jira/browse/ARROW-2112) - [C++] Enable cpplint to be run on Windows
-* [ARROW-2409](https://issues.apache.org/jira/browse/ARROW-2409) - [Rust] Test for build warnings, remove current warnings
-* [ARROW-2627](https://issues.apache.org/jira/browse/ARROW-2627) - [Python] Add option (or some equivalent) to toggle memory mapping functionality when using parquet.ParquetFile or other read entry points
-* [ARROW-3107](https://issues.apache.org/jira/browse/ARROW-3107) - [C++] arrow::PrettyPrint for Column instances
-* [ARROW-3121](https://issues.apache.org/jira/browse/ARROW-3121) - [C++] Mean kernel aggregate
-* [ARROW-3123](https://issues.apache.org/jira/browse/ARROW-3123) - [C++] Incremental Count, Count Not Null aggregator
-* [ARROW-3135](https://issues.apache.org/jira/browse/ARROW-3135) - [C++] Add helper functions for validity bitmap propagation in kernel context
-* [ARROW-3149](https://issues.apache.org/jira/browse/ARROW-3149) - [C++] Use gRPC (when it exists) from conda-forge for CI builds
-* [ARROW-3162](https://issues.apache.org/jira/browse/ARROW-3162) - [Python] Enable Flight servers to be implemented in pure Python
-* [ARROW-3239](https://issues.apache.org/jira/browse/ARROW-3239) - [C++] Improve random data generation functions
-* [ARROW-3289](https://issues.apache.org/jira/browse/ARROW-3289) - [C++] Implement DoPut command for Flight on client and server side  
-* [ARROW-3292](https://issues.apache.org/jira/browse/ARROW-3292) - [C++] Test Flight RPC in Travis CI
-* [ARROW-3297](https://issues.apache.org/jira/browse/ARROW-3297) - [Python] Python bindings for Flight C++ client
-* [ARROW-331](https://issues.apache.org/jira/browse/ARROW-331) - [Python] Timeline for dropping Python 2.7 support
-* [ARROW-3311](https://issues.apache.org/jira/browse/ARROW-3311) - [R] Functions for deserializing IPC components from arrow::Buffer or from IO interface
-* [ARROW-3361](https://issues.apache.org/jira/browse/ARROW-3361) - [R] Run cpp/build-support/cpplint.py on C++ source files
-* [ARROW-3364](https://issues.apache.org/jira/browse/ARROW-3364) - [Doc] Document docker compose setup
-* [ARROW-3367](https://issues.apache.org/jira/browse/ARROW-3367) - [INTEGRATION] Port Spark integration test to the docker-compose setup
-* [ARROW-3422](https://issues.apache.org/jira/browse/ARROW-3422) - [C++] Add "toolchain" target to ensure that all required toolchain libraries are built
-* [ARROW-3434](https://issues.apache.org/jira/browse/ARROW-3434) - [Packaging] Add Apache ORC C++ library to conda-forge
-* [ARROW-3511](https://issues.apache.org/jira/browse/ARROW-3511) - [Gandiva] support input selection vectors for both projector and filter
-* [ARROW-3532](https://issues.apache.org/jira/browse/ARROW-3532) - [Python] Schema, StructType, StructArray field retrieval by name should raise warning or exception for multiple matches
-* [ARROW-3550](https://issues.apache.org/jira/browse/ARROW-3550) - [C++] Use kUnknownNullCount in NumericArray constructor
-* [ARROW-3554](https://issues.apache.org/jira/browse/ARROW-3554) - [C++] Reverse traits for C++
-* [ARROW-3596](https://issues.apache.org/jira/browse/ARROW-3596) - [Packaging] Build gRPC in conda-forge
-* [ARROW-3619](https://issues.apache.org/jira/browse/ARROW-3619) - [R] Expose global thread pool options
-* [ARROW-3631](https://issues.apache.org/jira/browse/ARROW-3631) - [C#] Add Appveyor build for C#
-* [ARROW-3653](https://issues.apache.org/jira/browse/ARROW-3653) - [Python/C++] Support data copying between different GPU devices
-* [ARROW-3735](https://issues.apache.org/jira/browse/ARROW-3735) - [Python] Proper error handling in \_ensure\_type
-* [ARROW-3761](https://issues.apache.org/jira/browse/ARROW-3761) - [R] Bindings for CompressedInputStream, CompressedOutputStream
-* [ARROW-3769](https://issues.apache.org/jira/browse/ARROW-3769) - [C++] Support reading non-dictionary encoded binary Parquet columns directly as DictionaryArray
-* [ARROW-3770](https://issues.apache.org/jira/browse/ARROW-3770) - [C++] Validate or add option to validate arrow::Table schema in parquet::arrow::FileWriter::WriteTable
-* [ARROW-3816](https://issues.apache.org/jira/browse/ARROW-3816) - [R] nrow.RecordBatch method
-* [ARROW-3824](https://issues.apache.org/jira/browse/ARROW-3824) - [R] Document developer workflow for building project, running unit tests in r/README.md
-* [ARROW-3838](https://issues.apache.org/jira/browse/ARROW-3838) - [Rust] Implement CSV Writer
-* [ARROW-3846](https://issues.apache.org/jira/browse/ARROW-3846) - [Gandiva] Build on Windows
-* [ARROW-3882](https://issues.apache.org/jira/browse/ARROW-3882) - [Rust] PrimitiveArray<T> should support cast operations
-* [ARROW-3903](https://issues.apache.org/jira/browse/ARROW-3903) - [Python] Random array generator for Arrow conversion and Parquet testing
-* [ARROW-3926](https://issues.apache.org/jira/browse/ARROW-3926) - [Python] Add Gandiva bindings to Python wheels
-* [ARROW-3951](https://issues.apache.org/jira/browse/ARROW-3951) - [Go] implement a CSV writer
-* [ARROW-3954](https://issues.apache.org/jira/browse/ARROW-3954) - [Rust] Add Slice to Array and ArrayData
-* [ARROW-3965](https://issues.apache.org/jira/browse/ARROW-3965) - [Java] JDBC-to-Arrow Conversion: Configuration Object
-* [ARROW-3966](https://issues.apache.org/jira/browse/ARROW-3966) - [Java] JDBC-to-Arrow Conversion: JDBC Metadata in Schema Fields
-* [ARROW-3972](https://issues.apache.org/jira/browse/ARROW-3972) - [C++] Update to LLVM and Clang bits to 7.0
-* [ARROW-3981](https://issues.apache.org/jira/browse/ARROW-3981) - [C++] Rename json.h
-* [ARROW-3985](https://issues.apache.org/jira/browse/ARROW-3985) - [C++] Pass -C option when compiling with ccache to avoid some warnings
-* [ARROW-4012](https://issues.apache.org/jira/browse/ARROW-4012) - [Documentation][C++] Document how to install Apache Arrow on MSYS2
-* [ARROW-4014](https://issues.apache.org/jira/browse/ARROW-4014) - [C++] Fix "LIBCMT" warnings on MSVC
-* [ARROW-4024](https://issues.apache.org/jira/browse/ARROW-4024) - [Python] Cython compilation error on cython==0.27.3
-* [ARROW-4031](https://issues.apache.org/jira/browse/ARROW-4031) - [C++] Refactor ArrayBuilder bitmap logic into TypedBufferBuilder<bool>
-* [ARROW-4056](https://issues.apache.org/jira/browse/ARROW-4056) - [C++] Upgrade to boost-cpp 1.69.0 again
-* [ARROW-4061](https://issues.apache.org/jira/browse/ARROW-4061) - [Rust] [Parquet] Implement "spaced" version for non-dictionary encoding/decoding
-* [ARROW-4071](https://issues.apache.org/jira/browse/ARROW-4071) - [Rust] Add rustfmt as a pre-commit hook
-* [ARROW-4072](https://issues.apache.org/jira/browse/ARROW-4072) - [Rust] Set default value for PARQUET\_TEST\_DATA
-* [ARROW-4094](https://issues.apache.org/jira/browse/ARROW-4094) - [Python] Store RangeIndex in Parquet files as metadata rather than a physical data column
-* [ARROW-4110](https://issues.apache.org/jira/browse/ARROW-4110) - [C++] Do not generate distinct cast kernels when input and output type are the same
-* [ARROW-4123](https://issues.apache.org/jira/browse/ARROW-4123) - [C++] Improve linting workflow and documentation for Windows-based developers
-* [ARROW-4124](https://issues.apache.org/jira/browse/ARROW-4124) - [C++] Abstract aggregation kernel API
-* [ARROW-4142](https://issues.apache.org/jira/browse/ARROW-4142) - [Java] JDBC-to-Arrow: JDBC Arrays
-* [ARROW-4165](https://issues.apache.org/jira/browse/ARROW-4165) - [C++] Port cpp/apidoc/Windows.md and other files to Sphinx / rst
-* [ARROW-4180](https://issues.apache.org/jira/browse/ARROW-4180) - [Java] Reduce verbose logging of ArrowBuf creation events?
-* [ARROW-4196](https://issues.apache.org/jira/browse/ARROW-4196) - [Rust] Add explicit SIMD vectorization for arithmetic ops in "array\_ops"
-* [ARROW-4198](https://issues.apache.org/jira/browse/ARROW-4198) - [Gandiva] Add support to cast timestamp
-* [ARROW-4204](https://issues.apache.org/jira/browse/ARROW-4204) - [Gandiva] implement decimal subtract
-* [ARROW-4205](https://issues.apache.org/jira/browse/ARROW-4205) - [Gandiva] Implement decimal multiply
-* [ARROW-4206](https://issues.apache.org/jira/browse/ARROW-4206) - [Gandiva] Implement decimal divide
-* [ARROW-4212](https://issues.apache.org/jira/browse/ARROW-4212) - [Python] [CUDA] Creating a CUDA buffer from Numba device array should be easier
-* [ARROW-4230](https://issues.apache.org/jira/browse/ARROW-4230) - [C++] Enable building flight against system gRPC
-* [ARROW-4234](https://issues.apache.org/jira/browse/ARROW-4234) - [C++] Add memory bandwidth benchmarks to arrow/util/machine-benchmark.cc
-* [ARROW-4235](https://issues.apache.org/jira/browse/ARROW-4235) - [GLib] Use "column\_builder" in GArrowRecordBatchBuilder
-* [ARROW-4236](https://issues.apache.org/jira/browse/ARROW-4236) - [JAVA] Distinct plasma client create exceptions
-* [ARROW-4245](https://issues.apache.org/jira/browse/ARROW-4245) - [Rust] Add Rustdoc header to each source file
-* [ARROW-4247](https://issues.apache.org/jira/browse/ARROW-4247) - [Packaging] Update verify script for 0.12.0
-* [ARROW-4251](https://issues.apache.org/jira/browse/ARROW-4251) - [C++] Add option to use vendored Boost in verify-release-candidate.sh
-* [ARROW-4262](https://issues.apache.org/jira/browse/ARROW-4262) - [Website] Blog post to give preview into using R and Arrow with Apache Spark
-* [ARROW-4263](https://issues.apache.org/jira/browse/ARROW-4263) - [Rust] Donate DataFusion
-* [ARROW-4265](https://issues.apache.org/jira/browse/ARROW-4265) - [C++] Automatic conversion between Table and std::vector<std::tuple<..>>
-* [ARROW-4268](https://issues.apache.org/jira/browse/ARROW-4268) - [C++] Add C primitive to Arrow:Type compile time in TypeTraits
-* [ARROW-4271](https://issues.apache.org/jira/browse/ARROW-4271) - [Rust] Move Parquet specific info to Parquet Readme
-* [ARROW-4273](https://issues.apache.org/jira/browse/ARROW-4273) - [Release] Fix verification script to use cf201901 conda-forge label
-* [ARROW-4277](https://issues.apache.org/jira/browse/ARROW-4277) - [C++] Add gmock to toolchain
-* [ARROW-4281](https://issues.apache.org/jira/browse/ARROW-4281) - [CI] Use Ubuntu Xenial (16.04) VMs on Travis-CI
-* [ARROW-4285](https://issues.apache.org/jira/browse/ARROW-4285) - [Python] Use proper builder interface for serialization
-* [ARROW-4287](https://issues.apache.org/jira/browse/ARROW-4287) - [C++] Ensure minimal bison version on OSX for Thrift
-* [ARROW-4289](https://issues.apache.org/jira/browse/ARROW-4289) - [C++] Forward AR and RANLIB to thirdparty builds
-* [ARROW-4290](https://issues.apache.org/jira/browse/ARROW-4290) - [C++/Gandiva] Support detecting correct LLVM version in Homebrew
-* [ARROW-4291](https://issues.apache.org/jira/browse/ARROW-4291) - [Dev] Support selecting features in release scripts
-* [ARROW-4294](https://issues.apache.org/jira/browse/ARROW-4294) - [Plasma] Add support for evicting objects to external store
-* [ARROW-4297](https://issues.apache.org/jira/browse/ARROW-4297) - [C++] Fix build for 32-bit MSYS2
-* [ARROW-4298](https://issues.apache.org/jira/browse/ARROW-4298) - [Java] Building Flight fails with OpenJDK 11
-* [ARROW-4299](https://issues.apache.org/jira/browse/ARROW-4299) - [Ruby] Depend on the same version as Red Arrow
-* [ARROW-4300](https://issues.apache.org/jira/browse/ARROW-4300) - [C++] Restore apache-arrow Homebrew recipe and define process for maintaining and updating for releases
-* [ARROW-4303](https://issues.apache.org/jira/browse/ARROW-4303) - [Gandiva/Python] Build LLVM with RTTI in manylinux1 container
-* [ARROW-4305](https://issues.apache.org/jira/browse/ARROW-4305) - [Rust] Fix parquet version number in README
-* [ARROW-4307](https://issues.apache.org/jira/browse/ARROW-4307) - [C++] Fix doxygen warnings, include doxygen warning checks in CI linting
-* [ARROW-4310](https://issues.apache.org/jira/browse/ARROW-4310) - [Website] Update install document for 0.12.0
-* [ARROW-4313](https://issues.apache.org/jira/browse/ARROW-4313) - Define general benchmark database schema
-* [ARROW-4315](https://issues.apache.org/jira/browse/ARROW-4315) - [Website] Home page of https://arrow.apache.org/ does not mention Go or Rust
-* [ARROW-4318](https://issues.apache.org/jira/browse/ARROW-4318) - [C++] Add Tensor::CountNonZero
-* [ARROW-4321](https://issues.apache.org/jira/browse/ARROW-4321) - [CI] Setup conda-forge channel globally in docker containers
-* [ARROW-4330](https://issues.apache.org/jira/browse/ARROW-4330) - [C++] Use FindThreads.cmake to handle -pthread compiler/link options
-* [ARROW-4332](https://issues.apache.org/jira/browse/ARROW-4332) - [Website] Instructions and scripts for publishing web site appear to be incorrect
-* [ARROW-4334](https://issues.apache.org/jira/browse/ARROW-4334) - [CI] Setup conda-forge channel globally in travis builds
-* [ARROW-4335](https://issues.apache.org/jira/browse/ARROW-4335) - [C++] Better document sparse tensor support
-* [ARROW-4336](https://issues.apache.org/jira/browse/ARROW-4336) - [C++] Default BUILD\_WARNING\_LEVEL to CHECKIN
-* [ARROW-4339](https://issues.apache.org/jira/browse/ARROW-4339) - [C++] Rewrite cpp/README to be shorter, with a separate contribution guide
-* [ARROW-4340](https://issues.apache.org/jira/browse/ARROW-4340) - [C++] Update IWYU version in the \`lint\` dockerfile
-* [ARROW-4341](https://issues.apache.org/jira/browse/ARROW-4341) - [C++] Use TypedBufferBuilder<bool> in BooleanBuilder
-* [ARROW-4344](https://issues.apache.org/jira/browse/ARROW-4344) - [Java] Further cleanup maven output
-* [ARROW-4345](https://issues.apache.org/jira/browse/ARROW-4345) - [C++] Add Apache 2.0 license file to the Parquet-testing repository
-* [ARROW-4346](https://issues.apache.org/jira/browse/ARROW-4346) - [C++] Fix compiler warnings with gcc 8.2.0
-* [ARROW-4352](https://issues.apache.org/jira/browse/ARROW-4352) - [C++] Add support for system Google Test
-* [ARROW-4353](https://issues.apache.org/jira/browse/ARROW-4353) - [CI] Add jobs for 32-bit and 64-bit MinGW
-* [ARROW-4358](https://issues.apache.org/jira/browse/ARROW-4358) - [Gandiva][Crossbow] Trusty build broken
-* [ARROW-4361](https://issues.apache.org/jira/browse/ARROW-4361) - [Website] Update committers list
-* [ARROW-4362](https://issues.apache.org/jira/browse/ARROW-4362) - [Java] Test OpenJDK 11 in CI
-* [ARROW-4363](https://issues.apache.org/jira/browse/ARROW-4363) - [C++] Add CMake format checks
-* [ARROW-4372](https://issues.apache.org/jira/browse/ARROW-4372) - [C++] Embed precompiled bitcode in the gandiva library
-* [ARROW-4373](https://issues.apache.org/jira/browse/ARROW-4373) - [Packaging] Travis fails to deploy conda packages on OSX
-* [ARROW-4375](https://issues.apache.org/jira/browse/ARROW-4375) - [CI] Sphinx dependencies were removed from docs conda environment
-* [ARROW-4376](https://issues.apache.org/jira/browse/ARROW-4376) - [Rust] Implement from\_buf\_reader for csv::Reader
-* [ARROW-4377](https://issues.apache.org/jira/browse/ARROW-4377) - [Rust] Implement std::fmt::Debug for all PrimitiveArrays
-* [ARROW-4379](https://issues.apache.org/jira/browse/ARROW-4379) - Register pyarrow serializers for collections.Counter and collections.deque.
-* [ARROW-4383](https://issues.apache.org/jira/browse/ARROW-4383) - [C++] Use the CMake's standard find features
-* [ARROW-4386](https://issues.apache.org/jira/browse/ARROW-4386) - [Rust] Implement Date and Time Arrays
-* [ARROW-4388](https://issues.apache.org/jira/browse/ARROW-4388) - [Go] add DimNames() method to tensor Interface?
-* [ARROW-4393](https://issues.apache.org/jira/browse/ARROW-4393) - [Rust] coding style: apply 90 characters per line limit
-* [ARROW-4396](https://issues.apache.org/jira/browse/ARROW-4396) - Update Typedoc to support TypeScript 3.2
-* [ARROW-4397](https://issues.apache.org/jira/browse/ARROW-4397) - [C++] dim\_names in Tensor and SparseTensor
-* [ARROW-4399](https://issues.apache.org/jira/browse/ARROW-4399) - [C++] Remove usage of "extern template class" from NumericArray<T>
-* [ARROW-4401](https://issues.apache.org/jira/browse/ARROW-4401) - [Python] Alpine dockerfile fails to build because pandas requires numpy as build dependency
-* [ARROW-4406](https://issues.apache.org/jira/browse/ARROW-4406) - Ignore "\*\_$folder$" files on S3
-* [ARROW-4408](https://issues.apache.org/jira/browse/ARROW-4408) - [CPP/Doc] Remove outdated Parquet documentation
-* [ARROW-4422](https://issues.apache.org/jira/browse/ARROW-4422) - [Plasma] Enforce memory limit in plasma, rather than relying on dlmalloc\_set\_footprint\_limit
-* [ARROW-4423](https://issues.apache.org/jira/browse/ARROW-4423) - [C++] Update version of vendored gtest to 1.8.1
-* [ARROW-4424](https://issues.apache.org/jira/browse/ARROW-4424) - [Python] Manylinux CI builds failing
-* [ARROW-4425](https://issues.apache.org/jira/browse/ARROW-4425) - Add link to 'Contributing' page in the top-level Arrow README
-* [ARROW-4430](https://issues.apache.org/jira/browse/ARROW-4430) - [C++] add unit test for currently unused append method
-* [ARROW-4431](https://issues.apache.org/jira/browse/ARROW-4431) - [C++] Build gRPC as ExternalProject without allowing it to build its vendored dependencies
-* [ARROW-4435](https://issues.apache.org/jira/browse/ARROW-4435) - [C#] Add .sln file and minor .csproj fix ups
-* [ARROW-4436](https://issues.apache.org/jira/browse/ARROW-4436) - [Documentation] Clarify instructions for building documentation
-* [ARROW-4442](https://issues.apache.org/jira/browse/ARROW-4442) - [JS] Overly broad type annotation for Chunked typeId leading to type mismatches in generated typing
-* [ARROW-4444](https://issues.apache.org/jira/browse/ARROW-4444) - [Testing] Add DataFusion test files to arrow-testing repo
-* [ARROW-4445](https://issues.apache.org/jira/browse/ARROW-4445) - [C++][Gandiva] Run Gandiva-LLVM tests in Appveyor
-* [ARROW-4446](https://issues.apache.org/jira/browse/ARROW-4446) - [Python] Run Gandiva tests on Windows and Appveyor
-* [ARROW-4448](https://issues.apache.org/jira/browse/ARROW-4448) - [JAVA][Flight] Flaky Flight java test
-* [ARROW-4449](https://issues.apache.org/jira/browse/ARROW-4449) - [Rust] Convert File to T: Read + Seek for schema inference
-* [ARROW-4454](https://issues.apache.org/jira/browse/ARROW-4454) - [C++] fix unused parameter warnings
-* [ARROW-4455](https://issues.apache.org/jira/browse/ARROW-4455) - [Plasma] g++ 8 reports class-memaccess warnings
-* [ARROW-4459](https://issues.apache.org/jira/browse/ARROW-4459) - [Testing] Add git submodule for arrow-testing data files
-* [ARROW-4460](https://issues.apache.org/jira/browse/ARROW-4460) - [Website] Write blog post to announce DataFusion donation
-* [ARROW-4461](https://issues.apache.org/jira/browse/ARROW-4461) - [C++] Expose bit-util methods for binary boolean operations that don't allocate
-* [ARROW-4462](https://issues.apache.org/jira/browse/ARROW-4462) - [C++] Upgrade LZ4 v1.7.5 to v1.8.3 to compile with VS2017
-* [ARROW-4464](https://issues.apache.org/jira/browse/ARROW-4464) - [Rust] [DataFusion] Add support for LIMIT
-* [ARROW-4466](https://issues.apache.org/jira/browse/ARROW-4466) - [Rust] [DataFusion] Add support for Parquet data sources
-* [ARROW-4468](https://issues.apache.org/jira/browse/ARROW-4468) - [Rust] Implement BitAnd/BitOr for &Buffer (with SIMD)
-* [ARROW-4472](https://issues.apache.org/jira/browse/ARROW-4472) - [Website][Python] Blog post about Python string memory use improvements in 0.12
-* [ARROW-4475](https://issues.apache.org/jira/browse/ARROW-4475) - [Python] Serializing objects that contain themselves
-* [ARROW-4476](https://issues.apache.org/jira/browse/ARROW-4476) - [Rust] [DataFusion] Post donation clean up tasks
-* [ARROW-4481](https://issues.apache.org/jira/browse/ARROW-4481) - [Website] Instructions for publishing web site are missing a step
-* [ARROW-4483](https://issues.apache.org/jira/browse/ARROW-4483) - [Website] Fix broken link (author) in DataFusion blog post
-* [ARROW-4485](https://issues.apache.org/jira/browse/ARROW-4485) - [CI] Determine maintenance approach to pinned conda-forge binutils package
-* [ARROW-4486](https://issues.apache.org/jira/browse/ARROW-4486) - [Python][CUDA] pyarrow.cuda.Context.foreign\_buffer should have a \`base=None\` argument
-* [ARROW-4488](https://issues.apache.org/jira/browse/ARROW-4488) - [Rust] From AsRef<[u8]> for Buffer does not ensure correct padding
-* [ARROW-4489](https://issues.apache.org/jira/browse/ARROW-4489) - [Rust] PrimitiveArray.value\_slice performs bounds checking when it should not
-* [ARROW-4490](https://issues.apache.org/jira/browse/ARROW-4490) - [Rust] Add explicit SIMD vectorization for boolean ops in "array\_ops"
-* [ARROW-4491](https://issues.apache.org/jira/browse/ARROW-4491) - [Python] Remove usage of std::to\_string and std::stoi
-* [ARROW-4499](https://issues.apache.org/jira/browse/ARROW-4499) - [Python][CI] Upgrade to latest flake8 3.7.5 in travis\_lint.sh
-* [ARROW-4502](https://issues.apache.org/jira/browse/ARROW-4502) - [C#] Add support for zero-copy reads
-* [ARROW-4506](https://issues.apache.org/jira/browse/ARROW-4506) - [Ruby] Add Arrow::RecordBatch#raw\_records
-* [ARROW-4513](https://issues.apache.org/jira/browse/ARROW-4513) - [Rust] Implement BitAnd/BitOr for &Bitmap
-* [ARROW-4518](https://issues.apache.org/jira/browse/ARROW-4518) - [JS] add jsdelivr to package.json
-* [ARROW-4528](https://issues.apache.org/jira/browse/ARROW-4528) - [C++] Update lint docker container to LLVM-7
-* [ARROW-4529](https://issues.apache.org/jira/browse/ARROW-4529) - [C++] Add test coverage for BitUtils::RoundDown
-* [ARROW-4531](https://issues.apache.org/jira/browse/ARROW-4531) - [C++] Handling of non-aligned slices in Sum kernel
-* [ARROW-4537](https://issues.apache.org/jira/browse/ARROW-4537) - [CI] Suppress shell warning on travis-ci
-* [ARROW-4539](https://issues.apache.org/jira/browse/ARROW-4539) - [Java] List vector child value count not set correctly
-* [ARROW-4540](https://issues.apache.org/jira/browse/ARROW-4540) - [Rust] Add basic JSON reader
-* [ARROW-4543](https://issues.apache.org/jira/browse/ARROW-4543) - [C#] Update Flat Buffers code to latest version
-* [ARROW-4547](https://issues.apache.org/jira/browse/ARROW-4547) - [Python][Documentation] Update python/development.rst with instructions for CUDA-enabled builds
-* [ARROW-4556](https://issues.apache.org/jira/browse/ARROW-4556) - [Rust] Preserve order of JSON inferred schema
-* [ARROW-4558](https://issues.apache.org/jira/browse/ARROW-4558) - [C++][Flight] Avoid undefined behavior with gRPC memory optimizations
-* [ARROW-4560](https://issues.apache.org/jira/browse/ARROW-4560) - [R] array() needs to take single input, not ...
-* [ARROW-4562](https://issues.apache.org/jira/browse/ARROW-4562) - [C++][Flight] Create outgoing composite grpc::ByteBuffer instead of allocating contiguous slice and copying IpcPayload into it
-* [ARROW-4565](https://issues.apache.org/jira/browse/ARROW-4565) - [R] Reading records with all non-null decimals SEGFAULTs
-* [ARROW-4568](https://issues.apache.org/jira/browse/ARROW-4568) - [C++] Add version macros to headers
-* [ARROW-4572](https://issues.apache.org/jira/browse/ARROW-4572) - [C++] Remove memory zeroing from PrimitiveAllocatingUnaryKernel
-* [ARROW-4583](https://issues.apache.org/jira/browse/ARROW-4583) - [Plasma] There are bugs reported by code scan tool
-* [ARROW-4586](https://issues.apache.org/jira/browse/ARROW-4586) - [Rust] Remove arrow/mod.rs as it is not needed
-* [ARROW-4590](https://issues.apache.org/jira/browse/ARROW-4590) - [Rust] Add explicit SIMD vectorization for comparison ops in "array\_ops"
-* [ARROW-4592](https://issues.apache.org/jira/browse/ARROW-4592) - [GLib] Stop configure immediately when GLib isn't available
-* [ARROW-4593](https://issues.apache.org/jira/browse/ARROW-4593) - [Ruby] Arrow::Array#[out\_of\_range] returns nil
-* [ARROW-4594](https://issues.apache.org/jira/browse/ARROW-4594) - [Ruby] Arrow::StructArray#[] returns Arrow::Struct instead of Arrow::Array
-* [ARROW-4595](https://issues.apache.org/jira/browse/ARROW-4595) - [Rust] [DataFusion] Implement DataFrame style API
-* [ARROW-4598](https://issues.apache.org/jira/browse/ARROW-4598) - [CI] Remove needless LLVM\_DIR for macOS
-* [ARROW-4599](https://issues.apache.org/jira/browse/ARROW-4599) - [C++] Add support for system GFlags
-* [ARROW-4602](https://issues.apache.org/jira/browse/ARROW-4602) - [Rust] [DataFusion] Integrate query optimizer with ExecutionContext
-* [ARROW-4605](https://issues.apache.org/jira/browse/ARROW-4605) - [Rust] Move filter and limit code from DataFusion into compute module
-* [ARROW-4609](https://issues.apache.org/jira/browse/ARROW-4609) - [C++] Use google benchmark from toolchain
-* [ARROW-4610](https://issues.apache.org/jira/browse/ARROW-4610) - [Plasma] Avoid JNI from crashing
-* [ARROW-4611](https://issues.apache.org/jira/browse/ARROW-4611) - [C++] Rework CMake third-party logic
-* [ARROW-4612](https://issues.apache.org/jira/browse/ARROW-4612) - [Python] Use cython from PyPI for windows wheels build
-* [ARROW-4613](https://issues.apache.org/jira/browse/ARROW-4613) - [C++] Alpine build failing as libgtestd.so is not found
-* [ARROW-4614](https://issues.apache.org/jira/browse/ARROW-4614) - [C++/CI] Activate flight build in ci/docker\_build\_cpp.sh
-* [ARROW-4615](https://issues.apache.org/jira/browse/ARROW-4615) - [C++] Add checked\_pointer\_cast
-* [ARROW-4616](https://issues.apache.org/jira/browse/ARROW-4616) - [C++] Log message in BuildUtils as STATUS
-* [ARROW-4618](https://issues.apache.org/jira/browse/ARROW-4618) - [Docker] Makefile to build dependent docker images
-* [ARROW-4619](https://issues.apache.org/jira/browse/ARROW-4619) - [R] Fix the autobrew script
-* [ARROW-4620](https://issues.apache.org/jira/browse/ARROW-4620) - [C#] Add unit tests for "Types" in arrow/csharp
-* [ARROW-4623](https://issues.apache.org/jira/browse/ARROW-4623) - [R] update Rcpp dependency
-* [ARROW-4628](https://issues.apache.org/jira/browse/ARROW-4628) - [Rust] [DataFusion] Implement type coercion query optimizer rule
-* [ARROW-4632](https://issues.apache.org/jira/browse/ARROW-4632) - [Ruby] Add BigDecimal#to\_arrow
-* [ARROW-4634](https://issues.apache.org/jira/browse/ARROW-4634) - [Rust] [Parquet] Reorganize test\_common mod to allow more test util code
-* [ARROW-4637](https://issues.apache.org/jira/browse/ARROW-4637) - [Python] Avoid importing Pandas unless necessary
-* [ARROW-4638](https://issues.apache.org/jira/browse/ARROW-4638) - [R] install instructions using brew
-* [ARROW-4640](https://issues.apache.org/jira/browse/ARROW-4640) - [Python] Add docker-compose configuration to build and test the project without pandas installed
-* [ARROW-4643](https://issues.apache.org/jira/browse/ARROW-4643) - [C++] Add compiler diagnostic color when using Ninja
-* [ARROW-4644](https://issues.apache.org/jira/browse/ARROW-4644) - [C++/Docker] Build Gandiva in the docker containers
-* [ARROW-4645](https://issues.apache.org/jira/browse/ARROW-4645) - [C++/Packaging] Ship Gandiva with OSX and Windows wheels
-* [ARROW-4646](https://issues.apache.org/jira/browse/ARROW-4646) - [C++/Packaging] Ship gandiva with the conda-forge packages
-* [ARROW-4655](https://issues.apache.org/jira/browse/ARROW-4655) - [Packaging] Parallelize binary upload
-* [ARROW-4662](https://issues.apache.org/jira/browse/ARROW-4662) - [Python] Add type\_codes property in UnionType
-* [ARROW-4667](https://issues.apache.org/jira/browse/ARROW-4667) - [C++] Suppress unused function warnings with MinGW
-* [ARROW-4670](https://issues.apache.org/jira/browse/ARROW-4670) - [Rust] compute::sum performance issue
-* [ARROW-4671](https://issues.apache.org/jira/browse/ARROW-4671) - [C++] MakeBuilder doesn't support Type::DICTIONARY
-* [ARROW-4673](https://issues.apache.org/jira/browse/ARROW-4673) - [C++] Implement AssertDatumEquals
-* [ARROW-4676](https://issues.apache.org/jira/browse/ARROW-4676) - [C++] Add support for debug build with MinGW
-* [ARROW-4678](https://issues.apache.org/jira/browse/ARROW-4678) - [Rust] Minimize unstable feature usage
-* [ARROW-4679](https://issues.apache.org/jira/browse/ARROW-4679) - [Rust] [DataFusion] Implement in-memory DataSource
-* [ARROW-4681](https://issues.apache.org/jira/browse/ARROW-4681) - [Rust] [DataFusion] Implement parallel query execution using threads
-* [ARROW-4686](https://issues.apache.org/jira/browse/ARROW-4686) - Only accept 'y' or 'n' in merge\_arrow\_pr.py prompts
-* [ARROW-4689](https://issues.apache.org/jira/browse/ARROW-4689) - [Go] add support for WASM
-* [ARROW-4690](https://issues.apache.org/jira/browse/ARROW-4690) - [Python] Building TensorFlow compatible wheels for Arrow
-* [ARROW-4692](https://issues.apache.org/jira/browse/ARROW-4692) - [Format][Documentation] Add more details about "sidecar" to flight proto
-* [ARROW-4693](https://issues.apache.org/jira/browse/ARROW-4693) - [CI] Build boost library with multi precision  
-* [ARROW-4697](https://issues.apache.org/jira/browse/ARROW-4697) - [C++] Add URI parsing facility
-* [ARROW-47](https://issues.apache.org/jira/browse/ARROW-47) - [C++] Consider adding a scalar type object model
-* [ARROW-4705](https://issues.apache.org/jira/browse/ARROW-4705) - [Rust] CSV reader should show line number and error message when failing to parse a line
-* [ARROW-4707](https://issues.apache.org/jira/browse/ARROW-4707) - [C++] move BitsetStack to bit-util.h
-* [ARROW-4718](https://issues.apache.org/jira/browse/ARROW-4718) - Add ArrowStreamWriter/Reader ctors that leave open the underlying Stream
-* [ARROW-4727](https://issues.apache.org/jira/browse/ARROW-4727) - [Rust] Implement ability to check if two schemas are the same
-* [ARROW-4730](https://issues.apache.org/jira/browse/ARROW-4730) - [C++] Add docker-compose entry for testing Fedora build with system packages
-* [ARROW-4731](https://issues.apache.org/jira/browse/ARROW-4731) - [C++] Add docker-compose entry for testing Ubuntu Xenial build with system packages
-* [ARROW-4732](https://issues.apache.org/jira/browse/ARROW-4732) - [C++] Add docker-compose entry for testing Debian Testing build with system packages
-* [ARROW-4733](https://issues.apache.org/jira/browse/ARROW-4733) - [C++] Add CI entry that builds without the conda-forge toolchain but with system packages
-* [ARROW-4734](https://issues.apache.org/jira/browse/ARROW-4734) - [Go] Add option to write a header for CSV writer
-* [ARROW-4735](https://issues.apache.org/jira/browse/ARROW-4735) - [Go] Benchmark strconv.Format vs. fmt.Sprintf for CSV writer
-* [ARROW-4739](https://issues.apache.org/jira/browse/ARROW-4739) - [Rust] [DataFusion] It should be possible to share a logical plan between threads
-* [ARROW-4740](https://issues.apache.org/jira/browse/ARROW-4740) - [Java] Upgrade to JUnit 5 
-* [ARROW-4743](https://issues.apache.org/jira/browse/ARROW-4743) - [Java] Fix documentation in arrow memory module
-* [ARROW-4745](https://issues.apache.org/jira/browse/ARROW-4745) - [C++][Documentation] Document process for replicating static\_crt builds on windows
-* [ARROW-4749](https://issues.apache.org/jira/browse/ARROW-4749) - [Rust] RecordBatch::new() should return result instead of panicking
-* [ARROW-4751](https://issues.apache.org/jira/browse/ARROW-4751) - [C++] Add pkg-config to conda\_env\_cpp.yml
-* [ARROW-4754](https://issues.apache.org/jira/browse/ARROW-4754) - [CI][Java] Flaky TestAuth Flight test
-* [ARROW-4756](https://issues.apache.org/jira/browse/ARROW-4756) - [CI] document the procedure to update docker image for manylinux1 builds
-* [ARROW-4758](https://issues.apache.org/jira/browse/ARROW-4758) - [Flight] Build fails on Mac due to missing Schema\_generated.h
-* [ARROW-4769](https://issues.apache.org/jira/browse/ARROW-4769) - [Rust] Improve array limit function where max records > len
-* [ARROW-4772](https://issues.apache.org/jira/browse/ARROW-4772) - Provide new ORC adapter interface that allows users to specify row number
-* [ARROW-4776](https://issues.apache.org/jira/browse/ARROW-4776) - [C++] DictionaryBuilder should support bootstrapping from an existing dict type
-* [ARROW-4777](https://issues.apache.org/jira/browse/ARROW-4777) - [C++/Python] manylinux1: Update lz4 to 1.8.3
-* [ARROW-4778](https://issues.apache.org/jira/browse/ARROW-4778) - [C++/Python] manylinux1: Update Thrift to 0.12.0
-* [ARROW-4782](https://issues.apache.org/jira/browse/ARROW-4782) - [C++] Prototype scalar and array expression types for developing deferred operator algebra
-* [ARROW-4786](https://issues.apache.org/jira/browse/ARROW-4786) - [C++/Python] Support better parallelisation in manylinux1 base build
-* [ARROW-4789](https://issues.apache.org/jira/browse/ARROW-4789) - [C++] Deprecate and later remove arrow::io::ReadableFileInterface
-* [ARROW-4790](https://issues.apache.org/jira/browse/ARROW-4790) - [Python/Packaging] Update manylinux docker image in crossbow task
-* [ARROW-4791](https://issues.apache.org/jira/browse/ARROW-4791) - Unused dependencies in arrow and datafusion
-* [ARROW-4794](https://issues.apache.org/jira/browse/ARROW-4794) - [Python] Make pandas an optional test dependency
-* [ARROW-4797](https://issues.apache.org/jira/browse/ARROW-4797) - [Plasma] Avoid store crash if not enough memory is available
-* [ARROW-4801](https://issues.apache.org/jira/browse/ARROW-4801) - [GLib] Suppress pkgconfig.generate() warnings
-* [ARROW-4808](https://issues.apache.org/jira/browse/ARROW-4808) - [Java][Vector] Convenience methods for setting decimal vector
-* [ARROW-4817](https://issues.apache.org/jira/browse/ARROW-4817) - [Rust] [DataFusion] Small re-org of modules
-* [ARROW-4826](https://issues.apache.org/jira/browse/ARROW-4826) - [Go] export Flush method for CSV writer
-* [ARROW-4831](https://issues.apache.org/jira/browse/ARROW-4831) - [C++] CMAKE\_AR is not passed to ZSTD thirdparty dependency 
-* [ARROW-4833](https://issues.apache.org/jira/browse/ARROW-4833) - [Release] Document how to update the brew formula in the release management guide
-* [ARROW-4834](https://issues.apache.org/jira/browse/ARROW-4834) - [R] Feature flag to disable parquet
-* [ARROW-4835](https://issues.apache.org/jira/browse/ARROW-4835) - [GLib] Add boolean operations
-* [ARROW-4837](https://issues.apache.org/jira/browse/ARROW-4837) - [C++] Support c++filt on a custom path in the run-test.sh script
-* [ARROW-4839](https://issues.apache.org/jira/browse/ARROW-4839) - [C#] Add NuGet support
-* [ARROW-4846](https://issues.apache.org/jira/browse/ARROW-4846) - [Java] Update Jackson to 2.9.8
-* [ARROW-4849](https://issues.apache.org/jira/browse/ARROW-4849) - [C++] Add docker-compose entry for testing Ubuntu Bionic build with system packages
-* [ARROW-4854](https://issues.apache.org/jira/browse/ARROW-4854) - [Rust] Use Array Slice for limit kernel
-* [ARROW-4855](https://issues.apache.org/jira/browse/ARROW-4855) - [Packaging] Generate default package version based on cpp tags in crossbow.py
-* [ARROW-4858](https://issues.apache.org/jira/browse/ARROW-4858) - [Flight][Python] Enable custom FlightDataStream in Python
-* [ARROW-4859](https://issues.apache.org/jira/browse/ARROW-4859) - [GLib] Add garrow\_numeric\_array\_mean()
-* [ARROW-4862](https://issues.apache.org/jira/browse/ARROW-4862) - [GLib] Add GArrowCastOptions::allow-invalid-utf8 property
-* [ARROW-4865](https://issues.apache.org/jira/browse/ARROW-4865) - [Rust] Support casting lists and primitives to lists
-* [ARROW-4873](https://issues.apache.org/jira/browse/ARROW-4873) - [C++] Clarify documentation about how to use external ARROW\_PACKAGE\_PREFIX while also using CONDA dependency resolution
-* [ARROW-4878](https://issues.apache.org/jira/browse/ARROW-4878) - [C++] ARROW\_DEPENDENCY\_SOURCE=CONDA does not work properly with MSVC
-* [ARROW-4882](https://issues.apache.org/jira/browse/ARROW-4882) - [GLib] Add "Sum" functions
-* [ARROW-4887](https://issues.apache.org/jira/browse/ARROW-4887) - [GLib] Add garrow\_array\_count()
-* [ARROW-4889](https://issues.apache.org/jira/browse/ARROW-4889) - [C++] Add STATUS messages for Protobuf in CMake
-* [ARROW-4891](https://issues.apache.org/jira/browse/ARROW-4891) - [C++] ZLIB include directories not added
-* [ARROW-4892](https://issues.apache.org/jira/browse/ARROW-4892) - [Rust] [DataFusion] Move SQL parser and planner into sql package
-* [ARROW-4893](https://issues.apache.org/jira/browse/ARROW-4893) - [C++] conda packages should use $PREFIX inside of conda-build
-* [ARROW-4894](https://issues.apache.org/jira/browse/ARROW-4894) - [Rust] [DataFusion] Remove all uses of panic! from aggregate.rs
-* [ARROW-4895](https://issues.apache.org/jira/browse/ARROW-4895) - [Rust] [DataFusion] Move error.rs to top level package
-* [ARROW-4896](https://issues.apache.org/jira/browse/ARROW-4896) - [Rust] [DataFusion] Remove all uses of panic! from tests
-* [ARROW-4897](https://issues.apache.org/jira/browse/ARROW-4897) - [Rust] [DataFusion] Improve Rustdoc
-* [ARROW-4898](https://issues.apache.org/jira/browse/ARROW-4898) - [C++] Old versions of FindProtobuf.cmake use ALL-CAPS for variables
-* [ARROW-4899](https://issues.apache.org/jira/browse/ARROW-4899) - [Rust] [DataFusion] Remove all uses of panic! from expression.rs
-* [ARROW-4901](https://issues.apache.org/jira/browse/ARROW-4901) - [Go] Run tests in Appveyor
-* [ARROW-4905](https://issues.apache.org/jira/browse/ARROW-4905) - [C++][Plasma] Remove dlmalloc from client library
-* [ARROW-4907](https://issues.apache.org/jira/browse/ARROW-4907) - [CI] Add docker container to inspect docker context
-* [ARROW-4908](https://issues.apache.org/jira/browse/ARROW-4908) - [Rust] [DataFusion] Add support for parquet date/time in int32/64 encoding
-* [ARROW-4909](https://issues.apache.org/jira/browse/ARROW-4909) - [CI] Use hadolint to lint Dockerfiles
-* [ARROW-4910](https://issues.apache.org/jira/browse/ARROW-4910) - [Rust] [DataFusion] Remove all uses of unimplemented!
-* [ARROW-4915](https://issues.apache.org/jira/browse/ARROW-4915) - [GLib] Add support for arrow::NullBuilder
-* [ARROW-4922](https://issues.apache.org/jira/browse/ARROW-4922) - [Packaging] Use system libraries for .deb and .rpm
-* [ARROW-4924](https://issues.apache.org/jira/browse/ARROW-4924) - [Ruby] Add Decimal128#to\_s(scale=nil)
-* [ARROW-4926](https://issues.apache.org/jira/browse/ARROW-4926) - [Rust] [DataFusion] Update README for 0.13.0 release
-* [ARROW-4929](https://issues.apache.org/jira/browse/ARROW-4929) - [GLib] Add garrow\_array\_count\_values()
-* [ARROW-4932](https://issues.apache.org/jira/browse/ARROW-4932) - [GLib] Use G\_DECLARE\_DERIVABLE\_TYPE macro
-* [ARROW-4933](https://issues.apache.org/jira/browse/ARROW-4933) - [R] Autodetect Parquet support using pkg-config
-* [ARROW-4937](https://issues.apache.org/jira/browse/ARROW-4937) - [R] Clean pkg-config related logic
-* [ARROW-4939](https://issues.apache.org/jira/browse/ARROW-4939) - [Python] Add wrapper for "sum" kernel
-* [ARROW-4940](https://issues.apache.org/jira/browse/ARROW-4940) - [Rust] Enhance documentation for datafusion
-* [ARROW-4944](https://issues.apache.org/jira/browse/ARROW-4944) - [C++] Raise minimum required thrift-cpp to 0.11 in conda environment
-* [ARROW-4946](https://issues.apache.org/jira/browse/ARROW-4946) - [C++] Support detection of flatbuffers without FlatbuffersConfig.cmake 
-* [ARROW-4947](https://issues.apache.org/jira/browse/ARROW-4947) - [Flight][C++/Python] Remove redundant schema parameter in DoGet
-* [ARROW-4951](https://issues.apache.org/jira/browse/ARROW-4951) - [C++] Turn off cpp benchmarks in cpp docker images
-* [ARROW-4955](https://issues.apache.org/jira/browse/ARROW-4955) - [GLib] Add garrow\_file\_is\_closed()
-* [ARROW-4964](https://issues.apache.org/jira/browse/ARROW-4964) - [Ruby] Add closed check if available on auto close
-* [ARROW-4969](https://issues.apache.org/jira/browse/ARROW-4969) - [C++] Set RPATH in correct order for test executables on OSX
-* [ARROW-4977](https://issues.apache.org/jira/browse/ARROW-4977) - [Ruby] Add support for building on Windows
-* [ARROW-4978](https://issues.apache.org/jira/browse/ARROW-4978) - [Ruby] Fix wrong internal variable name for table data
-* [ARROW-4979](https://issues.apache.org/jira/browse/ARROW-4979) - [GLib] Add missing lock to garrow::GIOInputStream
-* [ARROW-4980](https://issues.apache.org/jira/browse/ARROW-4980) - [GLib] Use GInputStream as the parent of GArrowInputStream
-* [ARROW-4981](https://issues.apache.org/jira/browse/ARROW-4981) - [Ruby] Add support for CSV data encoding conversion
-* [ARROW-4983](https://issues.apache.org/jira/browse/ARROW-4983) - [Plasma] Unmap memory when the client is destroyed
-* [ARROW-4994](https://issues.apache.org/jira/browse/ARROW-4994) - [Website] Update Details for ptgoetz
-* [ARROW-4995](https://issues.apache.org/jira/browse/ARROW-4995) - [R] Make sure winbuilder tests pass for package
-* [ARROW-4996](https://issues.apache.org/jira/browse/ARROW-4996) - [Plasma] There are many log files in /tmp
-* [ARROW-5003](https://issues.apache.org/jira/browse/ARROW-5003) - [R] remove dependency on withr
-* [ARROW-5006](https://issues.apache.org/jira/browse/ARROW-5006) - [R] parquet.cpp does not include enough Rcpp
-* [ARROW-5011](https://issues.apache.org/jira/browse/ARROW-5011) - [Release] Add support in the source release script for custom hash
-* [ARROW-5013](https://issues.apache.org/jira/browse/ARROW-5013) - [Rust] [DataFusion] Refactor runtime expression support
-* [ARROW-5014](https://issues.apache.org/jira/browse/ARROW-5014) - [Java] Fix typos in Flight module
-* [ARROW-5018](https://issues.apache.org/jira/browse/ARROW-5018) - [Release] Include JavaScript implementation
-* [ARROW-5032](https://issues.apache.org/jira/browse/ARROW-5032) - [C++] Headers in vendored/datetime directory aren't installed
-* [ARROW-5041](https://issues.apache.org/jira/browse/ARROW-5041) - [Release][C++] use bundled gtest and gmock in verify-release-candidate.bat
-* [ARROW-549](https://issues.apache.org/jira/browse/ARROW-549) - [C++] Add function to concatenate like-typed arrays
-* [ARROW-572](https://issues.apache.org/jira/browse/ARROW-572) - [C++] Apply visitor pattern in IPC metadata
-* [ARROW-585](https://issues.apache.org/jira/browse/ARROW-585) - [C++] Define public API for user-defined data types
-* [ARROW-694](https://issues.apache.org/jira/browse/ARROW-694) - [C++] Build JSON "scanner" for reading record batches from line-delimited JSON files
-
-### Bug Fixes
-
-* [ARROW-2392](https://issues.apache.org/jira/browse/ARROW-2392) - [Python] pyarrow RecordBatchStreamWriter allows writing batches with different schemas
-* [ARROW-295](https://issues.apache.org/jira/browse/ARROW-295) - Create DOAP File
-* [ARROW-3086](https://issues.apache.org/jira/browse/ARROW-3086) - [GLib] GISCAN fails due to conda-shipped openblas
-* [ARROW-3096](https://issues.apache.org/jira/browse/ARROW-3096) - [Python] Update Python source build instructions given Anaconda/conda-forge toolchain migration
-* [ARROW-3133](https://issues.apache.org/jira/browse/ARROW-3133) - [C++] Logical boolean kernels in kernels/boolean.cc cannot write into preallocated memory
-* [ARROW-3208](https://issues.apache.org/jira/browse/ARROW-3208) - [C++] Segmentation fault when casting dictionary to numeric with nullptr valid\_bitmap 
-* [ARROW-3564](https://issues.apache.org/jira/browse/ARROW-3564) - [Python] writing version 2.0 parquet format with dictionary encoding enabled
-* [ARROW-3578](https://issues.apache.org/jira/browse/ARROW-3578) - [Release] Address spurious Apache RAT failures in source release script
-* [ARROW-3593](https://issues.apache.org/jira/browse/ARROW-3593) - [R] CI builds failing due to GitHub API rate limits
-* [ARROW-3606](https://issues.apache.org/jira/browse/ARROW-3606) - [Python] flake8 fails on Crossbow
-* [ARROW-3669](https://issues.apache.org/jira/browse/ARROW-3669) - [Python] Convert big-endian numbers or raise error in pyarrow.array
-* [ARROW-3843](https://issues.apache.org/jira/browse/ARROW-3843) - [Python] Writing Parquet file from empty table created with Table.from\_pandas(..., preserve\_index=False) fails
-* [ARROW-3923](https://issues.apache.org/jira/browse/ARROW-3923) - [Java] JDBC-to-Arrow Conversion: Unnecessary Calendar Requirement
-* [ARROW-4081](https://issues.apache.org/jira/browse/ARROW-4081) - [Go] Sum methods on Mac OS X panic when the array is empty
-* [ARROW-4104](https://issues.apache.org/jira/browse/ARROW-4104) - [Java] race in AllocationManager during release
-* [ARROW-4117](https://issues.apache.org/jira/browse/ARROW-4117) - [Python] "asv dev" command fails with latest revision
-* [ARROW-4181](https://issues.apache.org/jira/browse/ARROW-4181) - [Python] TestConvertStructTypes.test\_from\_numpy\_large failing
-* [ARROW-4192](https://issues.apache.org/jira/browse/ARROW-4192) - "./dev/run\_docker\_compose.sh" is out of date
-* [ARROW-4213](https://issues.apache.org/jira/browse/ARROW-4213) - [Flight] C++ and Java implementations are incompatible
-* [ARROW-4244](https://issues.apache.org/jira/browse/ARROW-4244) - Clarify language around padding/alignment
-* [ARROW-4250](https://issues.apache.org/jira/browse/ARROW-4250) - [C++][Gandiva] Use approximate comparisons for floating point numbers in gandiva-projector-test
-* [ARROW-4252](https://issues.apache.org/jira/browse/ARROW-4252) - [C++] Status error context strings missing lines of code
-* [ARROW-4253](https://issues.apache.org/jira/browse/ARROW-4253) - [GLib] Cannot use non-system Boost specified with $BOOST\_ROOT
-* [ARROW-4254](https://issues.apache.org/jira/browse/ARROW-4254) - [C++] Gandiva tests fail to compile with Boost in Ubuntu 14.04 apt
-* [ARROW-4255](https://issues.apache.org/jira/browse/ARROW-4255) - [C++] Schema::GetFieldIndex is not thread-safe
-* [ARROW-4261](https://issues.apache.org/jira/browse/ARROW-4261) - [C++] CMake paths for IPC, Flight, Thrift, and Plasma don't support using Arrow as a subproject
-* [ARROW-4264](https://issues.apache.org/jira/browse/ARROW-4264) - [C++] Document why DCHECKs are used in kernels
-* [ARROW-4267](https://issues.apache.org/jira/browse/ARROW-4267) - [Python/C++][Parquet] Segfault when reading rowgroups with duplicated columns
-* [ARROW-4274](https://issues.apache.org/jira/browse/ARROW-4274) - [Gandiva] static jni library broken after decimal changes
-* [ARROW-4275](https://issues.apache.org/jira/browse/ARROW-4275) - [C++] gandiva-decimal\_single\_test extremely slow
-* [ARROW-4280](https://issues.apache.org/jira/browse/ARROW-4280) - [C++][Documentation] It looks like flex and bison are required for parquet
-* [ARROW-4282](https://issues.apache.org/jira/browse/ARROW-4282) - [Rust] builder benchmark is broken
-* [ARROW-4284](https://issues.apache.org/jira/browse/ARROW-4284) - [C#] File / Stream serialization fails due to type mismatch / missing footer
-* [ARROW-4295](https://issues.apache.org/jira/browse/ARROW-4295) - [Plasma] Incorrect log message when evicting objects
-* [ARROW-4296](https://issues.apache.org/jira/browse/ARROW-4296) - [Plasma] Starting Plasma store with use\_one\_memory\_mapped\_file enabled crashes due to improper memory alignment
-* [ARROW-4312](https://issues.apache.org/jira/browse/ARROW-4312) - [C++] Lint doesn't work anymore ("[Errno 24] Too many open files")
-* [ARROW-4319](https://issues.apache.org/jira/browse/ARROW-4319) - plasma/store.h pulls in flatbuffer dependency
-* [ARROW-4320](https://issues.apache.org/jira/browse/ARROW-4320) - [C++] Add tests for non-contiguous tensors
-* [ARROW-4322](https://issues.apache.org/jira/browse/ARROW-4322) - [CI] docker nightlies fails after conda-forge compiler migration
-* [ARROW-4323](https://issues.apache.org/jira/browse/ARROW-4323) - [Packaging] Fix failing OSX clang conda forge builds
-* [ARROW-4326](https://issues.apache.org/jira/browse/ARROW-4326) - [C++] Development instructions in python/development.rst will not work for many Linux distros with new conda-forge toolchain
-* [ARROW-4327](https://issues.apache.org/jira/browse/ARROW-4327) - [Python] Add requirements-build.txt file to simplify setting up Python build environment
-* [ARROW-4328](https://issues.apache.org/jira/browse/ARROW-4328) - Make R build compatible with DARROW\_TENSORFLOW=ON
-* [ARROW-4329](https://issues.apache.org/jira/browse/ARROW-4329) - Python should include the parquet headers
-* [ARROW-4342](https://issues.apache.org/jira/browse/ARROW-4342) - [Gandiva][Java] spurious failures in projector cache test
-* [ARROW-4347](https://issues.apache.org/jira/browse/ARROW-4347) - [Python] Run Python Travis CI unit tests on Linux when Java codebase changed
-* [ARROW-4349](https://issues.apache.org/jira/browse/ARROW-4349) - [C++] Build all benchmarks on Windows without failing
-* [ARROW-4351](https://issues.apache.org/jira/browse/ARROW-4351) - [C++] Fail to build with static parquet
-* [ARROW-4355](https://issues.apache.org/jira/browse/ARROW-4355) - [C++] test-util functions are no longer part of libarrow
-* [ARROW-4360](https://issues.apache.org/jira/browse/ARROW-4360) - [C++] Query homebrew for Thrift
-* [ARROW-4364](https://issues.apache.org/jira/browse/ARROW-4364) - [C++] Fix -weverything -wextra compilation errors
-* [ARROW-4366](https://issues.apache.org/jira/browse/ARROW-4366) - [Docs] Change extension from format/README.md to format/README.rst
-* [ARROW-4367](https://issues.apache.org/jira/browse/ARROW-4367) - [C++] StringDictionaryBuilder segfaults on Finish with only null entries
-* [ARROW-4368](https://issues.apache.org/jira/browse/ARROW-4368) - Bintray repository signature verification fails
-* [ARROW-4370](https://issues.apache.org/jira/browse/ARROW-4370) - [Python] Table to pandas conversion fails for list of bool
-* [ARROW-4374](https://issues.apache.org/jira/browse/ARROW-4374) - [C++] DictionaryBuilder does not correctly report length and null\_count
-* [ARROW-4381](https://issues.apache.org/jira/browse/ARROW-4381) - [Docker] docker-compose build lint fails
-* [ARROW-4385](https://issues.apache.org/jira/browse/ARROW-4385) - [Python] default\_version of a release should not include SNAPSHOT
-* [ARROW-4389](https://issues.apache.org/jira/browse/ARROW-4389) - [R] Installing clang-tools in CI is failing on trusty
-* [ARROW-4395](https://issues.apache.org/jira/browse/ARROW-4395) - ts-node throws type error running \`bin/arrow2csv.js\`
-* [ARROW-4400](https://issues.apache.org/jira/browse/ARROW-4400) - [CI] install of clang tools failing
-* [ARROW-4403](https://issues.apache.org/jira/browse/ARROW-4403) - [Rust] CI fails due to formatting errors
-* [ARROW-4404](https://issues.apache.org/jira/browse/ARROW-4404) - [CI] AppVeyor toolchain build does not build anything
-* [ARROW-4407](https://issues.apache.org/jira/browse/ARROW-4407) - [C++] ExternalProject\_Add does not capture CC/CXX correctly
-* [ARROW-4410](https://issues.apache.org/jira/browse/ARROW-4410) - [C++] Fix InvertKernel edge cases
-* [ARROW-4413](https://issues.apache.org/jira/browse/ARROW-4413) - [Python] pyarrow.hdfs.connect() failing
-* [ARROW-4414](https://issues.apache.org/jira/browse/ARROW-4414) - [C++] Stop using cmake COMMAND\_EXPAND\_LISTS because it breaks package builds for older distros
-* [ARROW-4417](https://issues.apache.org/jira/browse/ARROW-4417) - [C++] Doc build broken
-* [ARROW-4420](https://issues.apache.org/jira/browse/ARROW-4420) - [INTEGRATION] Make spark integration test pass and test against spark's master branch
-* [ARROW-4421](https://issues.apache.org/jira/browse/ARROW-4421) - [Flight][C++] Handle large Flight data messages
-* [ARROW-4434](https://issues.apache.org/jira/browse/ARROW-4434) - [Python] Cannot create empty StructArray via pa.StructArray.from\_arrays
-* [ARROW-4440](https://issues.apache.org/jira/browse/ARROW-4440) - [C++] Fix flatbuffers build using msvc
-* [ARROW-4457](https://issues.apache.org/jira/browse/ARROW-4457) - [Python] Cannot create Decimal128 array using integers
-* [ARROW-4469](https://issues.apache.org/jira/browse/ARROW-4469) - [Python][C++] CI Failing for Python 2.7 and 3.6 with valgrind
-* [ARROW-4471](https://issues.apache.org/jira/browse/ARROW-4471) - [C++] Pass AR and RANLIB to all external projects
-* [ARROW-4474](https://issues.apache.org/jira/browse/ARROW-4474) - [Flight] FlightInfo should use signed integer types for payload size
-* [ARROW-4496](https://issues.apache.org/jira/browse/ARROW-4496) - [CI] CI failing for python Xcode 7.3
-* [ARROW-4498](https://issues.apache.org/jira/browse/ARROW-4498) - [Plasma] Plasma fails building with CUDA enabled
-* [ARROW-4500](https://issues.apache.org/jira/browse/ARROW-4500) - [C++] librt and pthread hacks can cause linking problems
-* [ARROW-4501](https://issues.apache.org/jira/browse/ARROW-4501) - [C++] Unique returns non-unique strings
-* [ARROW-4525](https://issues.apache.org/jira/browse/ARROW-4525) - [Rust] [Parquet] Convert ArrowError to ParquetError
-* [ARROW-4527](https://issues.apache.org/jira/browse/ARROW-4527) - [Packaging] Update linux packaging tasks to align with the LLVM 7 migration
-* [ARROW-4532](https://issues.apache.org/jira/browse/ARROW-4532) - [Java] varchar value buffer much larger than expected
-* [ARROW-4533](https://issues.apache.org/jira/browse/ARROW-4533) - [Python] Document how to run hypothesis tests
-* [ARROW-4535](https://issues.apache.org/jira/browse/ARROW-4535) - [C++] Fix MakeBuilder to preserve ListType's field name
-* [ARROW-4536](https://issues.apache.org/jira/browse/ARROW-4536) - Add data\_type argument in garrow\_list\_array\_new
-* [ARROW-4538](https://issues.apache.org/jira/browse/ARROW-4538) - [Python] Remove index column from subschema in write\_to\_dataframe
-* [ARROW-4549](https://issues.apache.org/jira/browse/ARROW-4549) - [C++] Can't build benchmark code on CUDA enabled build
-* [ARROW-4550](https://issues.apache.org/jira/browse/ARROW-4550) - [JS] Fix AMD pattern
-* [ARROW-4559](https://issues.apache.org/jira/browse/ARROW-4559) - [Python] pyarrow can't read/write filenames with special characters
-* [ARROW-4563](https://issues.apache.org/jira/browse/ARROW-4563) - [Python] pa.decimal128 should validate inputs
-* [ARROW-4571](https://issues.apache.org/jira/browse/ARROW-4571) - [Format] Tensor.fbs file has multiple root\_type declarations
-* [ARROW-4576](https://issues.apache.org/jira/browse/ARROW-4576) - [Python] Benchmark failures
-* [ARROW-4577](https://issues.apache.org/jira/browse/ARROW-4577) - [C++] Interface link libraries declared on arrow\_shared target that are actually non-interface
-* [ARROW-4581](https://issues.apache.org/jira/browse/ARROW-4581) - [C++] gbenchmark\_ep is a dependency of unit tests when ARROW\_BUILD\_BENCHMARKS=ON
-* [ARROW-4582](https://issues.apache.org/jira/browse/ARROW-4582) - [C++/Python] Memory corruption on Pandas->Arrow conversion
-* [ARROW-4584](https://issues.apache.org/jira/browse/ARROW-4584) - [Python] Add built wheel to manylinux1 dockerignore.
-* [ARROW-4585](https://issues.apache.org/jira/browse/ARROW-4585) - [C++] Dependency of Flight C++ sources on generated protobuf is not respected
-* [ARROW-4587](https://issues.apache.org/jira/browse/ARROW-4587) - Flight C++ DoPut segfaults
-* [ARROW-4597](https://issues.apache.org/jira/browse/ARROW-4597) - [C++] Targets for system Google Mock shared library are missing
-* [ARROW-4601](https://issues.apache.org/jira/browse/ARROW-4601) - [Python] Master build is broken due to missing licence for .dockerignore
-* [ARROW-4608](https://issues.apache.org/jira/browse/ARROW-4608) - [C++] cmake script assumes that double-conversion installs static libs
-* [ARROW-4617](https://issues.apache.org/jira/browse/ARROW-4617) - [C++] Support double-conversion<3.1
-* [ARROW-4624](https://issues.apache.org/jira/browse/ARROW-4624) - [C++] Linker errors when building benchmarks
-* [ARROW-4629](https://issues.apache.org/jira/browse/ARROW-4629) - [Python] Pandas to arrow conversion slowed down by local imports
-* [ARROW-4639](https://issues.apache.org/jira/browse/ARROW-4639) - [CI] Crossbow build failing for Gandiva jars
-* [ARROW-4641](https://issues.apache.org/jira/browse/ARROW-4641) - [C++] Flight builds complain of -Wstrict-aliasing 
-* [ARROW-4642](https://issues.apache.org/jira/browse/ARROW-4642) - [R] Change \`f\` to \`file\` in \`read\_parquet\_file()\`
-* [ARROW-4654](https://issues.apache.org/jira/browse/ARROW-4654) - [C++] Implicit Flight target dependencies cause compilation failure
-* [ARROW-4657](https://issues.apache.org/jira/browse/ARROW-4657) - [Release] gbenchmark should not be needed for verification
-* [ARROW-4658](https://issues.apache.org/jira/browse/ARROW-4658) - [C++] Shared gflags is also a run-time conda requirement
-* [ARROW-4659](https://issues.apache.org/jira/browse/ARROW-4659) - [CI] ubuntu/debian nightlies fail because of missing gandiva files
-* [ARROW-4660](https://issues.apache.org/jira/browse/ARROW-4660) - [C++] gflags fails to build due to CMake error
-* [ARROW-4664](https://issues.apache.org/jira/browse/ARROW-4664) - [C++] DCHECK macro conditions are evaluated in release builds
-* [ARROW-4669](https://issues.apache.org/jira/browse/ARROW-4669) - [Java] No Bounds checking on ArrowBuf.slice
-* [ARROW-4672](https://issues.apache.org/jira/browse/ARROW-4672) - [C++] clang-7 matrix entry is build using gcc
-* [ARROW-4680](https://issues.apache.org/jira/browse/ARROW-4680) - [CI] [Rust] Travis CI builds fail with latest Rust 1.34.0-nightly (2019-02-25)
-* [ARROW-4684](https://issues.apache.org/jira/browse/ARROW-4684) - [Python] CI failures in test\_cython.py
-* [ARROW-4687](https://issues.apache.org/jira/browse/ARROW-4687) - [Python] FlightServerBase.run should exit on Ctrl-C
-* [ARROW-4688](https://issues.apache.org/jira/browse/ARROW-4688) - [C++][Parquet] 16MB limit on (nested) column chunk prevents tuning row\_group\_size
-* [ARROW-4696](https://issues.apache.org/jira/browse/ARROW-4696) - Verify release script is over-optimistic with CUDA detection
-* [ARROW-4699](https://issues.apache.org/jira/browse/ARROW-4699) - [C++] json parser should not rely on null terminated buffers
-* [ARROW-4704](https://issues.apache.org/jira/browse/ARROW-4704) - [CI][GLib] Plasma test is flaky
-* [ARROW-4710](https://issues.apache.org/jira/browse/ARROW-4710) - [C++][R] New linting script skips files with "cpp" extension
-* [ARROW-4712](https://issues.apache.org/jira/browse/ARROW-4712) - [C++][CI] Clang7 Valgrind complains when not moving shared\_ptr
-* [ARROW-4721](https://issues.apache.org/jira/browse/ARROW-4721) - [Rust] [DataFusion] Propagate schema in filter
-* [ARROW-4724](https://issues.apache.org/jira/browse/ARROW-4724) - [C++] Python not being built nor tested under MinGW builds
-* [ARROW-4728](https://issues.apache.org/jira/browse/ARROW-4728) - [JS] Failing test Table#assign with a zero-length Null column round-trips through serialization
-* [ARROW-4737](https://issues.apache.org/jira/browse/ARROW-4737) - [C#] tests are not running in CI
-* [ARROW-4744](https://issues.apache.org/jira/browse/ARROW-4744) - [CI][C++] Mingw32 builds failing
-* [ARROW-4750](https://issues.apache.org/jira/browse/ARROW-4750) - [C++] RapidJSON triggers Wclass-memaccess on GCC 8+
-* [ARROW-4760](https://issues.apache.org/jira/browse/ARROW-4760) - [C++] protobuf 3.7 defines EXPECT\_OK that clashes with Arrow's macro
-* [ARROW-4766](https://issues.apache.org/jira/browse/ARROW-4766) - [C++] Casting empty boolean array causes segfault
-* [ARROW-4767](https://issues.apache.org/jira/browse/ARROW-4767) - [C#] ArrowStreamReader crashes while reading the end of a stream
-* [ARROW-4768](https://issues.apache.org/jira/browse/ARROW-4768) - [C++][CI] arrow-test-array sometimes gets stuck in MinGW build
-* [ARROW-4774](https://issues.apache.org/jira/browse/ARROW-4774) - [C++][Parquet] Call Table::Validate when writing a table
-* [ARROW-4775](https://issues.apache.org/jira/browse/ARROW-4775) - [Website] Site navbar cannot be expanded
-* [ARROW-4783](https://issues.apache.org/jira/browse/ARROW-4783) - [C++][CI] Mingw32 builds sometimes timeout
-* [ARROW-4793](https://issues.apache.org/jira/browse/ARROW-4793) - [Ruby] Suppress unused variable warning
-* [ARROW-4796](https://issues.apache.org/jira/browse/ARROW-4796) - [Flight][Python] segfault in simple server implementation
-* [ARROW-4802](https://issues.apache.org/jira/browse/ARROW-4802) - [Python] Hadoop classpath discovery broken when HADOOP\_HOME is a symlink
-* [ARROW-4807](https://issues.apache.org/jira/browse/ARROW-4807) - [Rust] Fix csv\_writer benchmark
-* [ARROW-4811](https://issues.apache.org/jira/browse/ARROW-4811) - [C++] An incorrect dependency leads "ninja" to re-evaluate steps unnecessarily on subsequent calls
-* [ARROW-4813](https://issues.apache.org/jira/browse/ARROW-4813) - [Ruby] Add tests for #== and #!=
-* [ARROW-4820](https://issues.apache.org/jira/browse/ARROW-4820) - [Python] Hadoop classpath is derived incorrectly
-* [ARROW-4822](https://issues.apache.org/jira/browse/ARROW-4822) - [C++/Python] pyarrow.Table.equals segmentation fault on None
-* [ARROW-4828](https://issues.apache.org/jira/browse/ARROW-4828) - [Python] manylinux1 docker-compose context should be python/manylinux1
-* [ARROW-4850](https://issues.apache.org/jira/browse/ARROW-4850) - [CI] Integration test failures do not fail the Travis CI build
-* [ARROW-4853](https://issues.apache.org/jira/browse/ARROW-4853) - [Rust] Array slice doesn't work on ListArray and StructArray
-* [ARROW-4857](https://issues.apache.org/jira/browse/ARROW-4857) - [C++/Python/CI] docker-compose in manylinux1 crossbow jobs too old
-* [ARROW-4866](https://issues.apache.org/jira/browse/ARROW-4866) - [C++] zstd ExternalProject failing on Windows
-* [ARROW-4867](https://issues.apache.org/jira/browse/ARROW-4867) - [Python] Table.from\_pandas() column order not respected
-* [ARROW-4869](https://issues.apache.org/jira/browse/ARROW-4869) - [C++] Use of gmock fails in compute/kernels/util-internal-test.cc 
-* [ARROW-4870](https://issues.apache.org/jira/browse/ARROW-4870) - [Ruby] gemspec has wrong msys2 dependency listed
-* [ARROW-4871](https://issues.apache.org/jira/browse/ARROW-4871) - [Flight][Java] Handle large Flight messages
-* [ARROW-4872](https://issues.apache.org/jira/browse/ARROW-4872) - [Python] Keep backward compatibility for ParquetDatasetPiece
-* [ARROW-4881](https://issues.apache.org/jira/browse/ARROW-4881) - [Python] bundle\_zlib CMake function still uses ARROW\_BUILD\_TOOLCHAIN
-* [ARROW-4900](https://issues.apache.org/jira/browse/ARROW-4900) - mingw-w64 < 5 does not have \_\_cpuidex
-* [ARROW-4903](https://issues.apache.org/jira/browse/ARROW-4903) - [C++] Building tests using only static libs not possible
-* [ARROW-4906](https://issues.apache.org/jira/browse/ARROW-4906) - [Format] Fix document to describe that SparseMatrixIndexCSR assumes indptr is sorted for each row
-* [ARROW-4918](https://issues.apache.org/jira/browse/ARROW-4918) - [C++] Add cmake-format to pre-commit
-* [ARROW-4928](https://issues.apache.org/jira/browse/ARROW-4928) - [Python] Hypothesis test failures
-* [ARROW-4931](https://issues.apache.org/jira/browse/ARROW-4931) - [C++] CMake fails on gRPC ExternalProject
-* [ARROW-4942](https://issues.apache.org/jira/browse/ARROW-4942) - [Ruby] Remove needless omits
-* [ARROW-4948](https://issues.apache.org/jira/browse/ARROW-4948) - [JS] Nightly test failing with "Cannot assign to read only property"
-* [ARROW-4950](https://issues.apache.org/jira/browse/ARROW-4950) - [C++] Thirdparty CMake error get\_target\_property() called with non-existent target LZ4::lz4
-* [ARROW-4952](https://issues.apache.org/jira/browse/ARROW-4952) - [C++] Equals / ApproxEquals behaviour undefined on FP NaNs
-* [ARROW-4954](https://issues.apache.org/jira/browse/ARROW-4954) - [Python] test failure with Flight enabled
-* [ARROW-4958](https://issues.apache.org/jira/browse/ARROW-4958) - [C++] Purely static linking broken
-* [ARROW-4961](https://issues.apache.org/jira/browse/ARROW-4961) - [C++][Python] Add GTest\_SOURCE=BUNDLED to relevant build docs that use conda-forge toolchain
-* [ARROW-4962](https://issues.apache.org/jira/browse/ARROW-4962) - [C++] Warning level to CHECKIN can't compile on modern GCC
-* [ARROW-4976](https://issues.apache.org/jira/browse/ARROW-4976) - [JS] RecordBatchReader should reset its Node/DOM streams
-* [ARROW-4982](https://issues.apache.org/jira/browse/ARROW-4982) - [GLib][CI] Run tests on AppVeyor
-* [ARROW-4984](https://issues.apache.org/jira/browse/ARROW-4984) - [Flight][C++] Flight server segfaults when port is in use
-* [ARROW-4986](https://issues.apache.org/jira/browse/ARROW-4986) - [CI] Travis fails to install llvm@7 
-* [ARROW-4989](https://issues.apache.org/jira/browse/ARROW-4989) - [C++] Build fails to find Ubuntu-packaged re2 library
-* [ARROW-4991](https://issues.apache.org/jira/browse/ARROW-4991) - [CI] Bump travis node version to 11.12
-* [ARROW-4997](https://issues.apache.org/jira/browse/ARROW-4997) - [C#] ArrowStreamReader doesn't consume whole stream and doesn't implement sync read
-* [ARROW-5009](https://issues.apache.org/jira/browse/ARROW-5009) - [C++] Clean up \`using\` declarations for std::\* in files
-* [ARROW-5010](https://issues.apache.org/jira/browse/ARROW-5010) - [Release] Fix release script with llvm-7
-* [ARROW-5012](https://issues.apache.org/jira/browse/ARROW-5012) - [C++] "testing" headers not installed
-* [ARROW-5023](https://issues.apache.org/jira/browse/ARROW-5023) - [Release] Default value syntax in shell is wrong
-* [ARROW-5024](https://issues.apache.org/jira/browse/ARROW-5024) - [Release] crossbow.py --arrow-version causes missing variable error
-* [ARROW-5025](https://issues.apache.org/jira/browse/ARROW-5025) - [Python][Packaging] wheels for Windows are broken
-* [ARROW-5026](https://issues.apache.org/jira/browse/ARROW-5026) - [Python][Packaging] conda package on non-Windows is broken
-* [ARROW-5029](https://issues.apache.org/jira/browse/ARROW-5029) - [C++] Compilation warnings in release mode
-* [ARROW-5031](https://issues.apache.org/jira/browse/ARROW-5031) - [Dev] Release verification script does not run CUDA tests in Python
-* [ARROW-5042](https://issues.apache.org/jira/browse/ARROW-5042) - [Release] Wrong ARROW\_DEPENDENCY\_SOURCE in verification script
-* [ARROW-5043](https://issues.apache.org/jira/browse/ARROW-5043) - [Release][Ruby] red-arrow dependency can't be resolved in verification script
-* [ARROW-5044](https://issues.apache.org/jira/browse/ARROW-5044) - [Release][Rust] Format error in verification script
-* [ARROW-5046](https://issues.apache.org/jira/browse/ARROW-5046) - [Release][C++] Plasma test is fragile in verification script
-* [ARROW-5047](https://issues.apache.org/jira/browse/ARROW-5047) - [Release] Always set up parquet-testing in verification script
-* [ARROW-5048](https://issues.apache.org/jira/browse/ARROW-5048) - [Release][Rust] arrow-testing is missing in verification script
-* [ARROW-5050](https://issues.apache.org/jira/browse/ARROW-5050) - [C++] cares\_ep should build before grpc\_ep
-
-
-[1]: https://www.apache.org/dyn/closer.cgi/arrow/arrow-0.13.0/
-[2]: https://bintray.com/apache/arrow/centos/0.13.0/
-[3]: https://bintray.com/apache/arrow/debian/0.13.0/
-[4]: https://bintray.com/apache/arrow/python/0.13.0/
-[5]: https://bintray.com/apache/arrow/ubuntu/0.13.0/
-[6]: https://github.com/apache/arrow/releases/tag/apache-arrow-0.13.0
diff --git a/site/_release/0.14.0.md b/site/_release/0.14.0.md
deleted file mode 100644
index ed191d9..0000000
--- a/site/_release/0.14.0.md
+++ /dev/null
@@ -1,760 +0,0 @@
----
-layout: default
-title: Apache Arrow 0.14.0 Release
-permalink: /release/0.14.0.html
----
-<!--
-{% comment %}
-Licensed to the Apache Software Foundation (ASF) under one or more
-contributor license agreements.  See the NOTICE file distributed with
-this work for additional information regarding copyright ownership.
-The ASF licenses this file to you under the Apache License, Version 2.0
-(the "License"); you may not use this file except in compliance with
-the License.  You may obtain a copy of the License at
-
-http://www.apache.org/licenses/LICENSE-2.0
-
-Unless required by applicable law or agreed to in writing, software
-distributed under the License is distributed on an "AS IS" BASIS,
-WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
-See the License for the specific language governing permissions and
-limitations under the License.
-{% endcomment %}
--->
-
-# Apache Arrow 0.14.0 (4 July 2019)
-
-This is a major release covering more than 3 months of development.
-
-## Download
-
-* [**Source Artifacts**][1]
-* **Binary Artifacts**
-  * [For CentOS][2]
-  * [For Debian][3]
-  * [For Python][4]
-  * [For Ubuntu][5]
-* [Git tag][6]
-
-## Contributors
-
-This release includes 570 commits from 78 distinct contributors.
-
-```console
-$ git shortlog -sn apache-arrow-0.13.0..apache-arrow-0.14.0
-    67	Antoine Pitrou
-    62	Wes McKinney
-    37	Sebastien Binet
-    34	Sutou Kouhei
-    25	Kouhei Sutou
-    24	Neal Richardson
-    22	Romain Francois
-    21	Joris Van den Bossche
-    20	Benjamin Kietzman
-    19	Micah Kornfield
-    18	Krisztián Szűcs
-    16	David Li
-    16	François Saint-Jacques
-    14	Yosuke Shiro
-    11	Pindikura Ravindra
-    11	Praveen
-    10	liyafan82
-    10	tianchen
-     7	Kenta Murata
-     7	Neville Dipale
-     7	Renjie Liu
-     6	Chao Sun
-     6	Deepak Majeti
-     6	Hatem Helal
-     5	Andy Grove
-     4	Brian Hulette
-     4	Bryan Cutler
-     4	Eric Erhardt
-     4	Zhuo Peng
-     4	ptaylor
-     4	shengjun.li
-     3	Marco Neumann
-     3	Philipp Moritz
-     3	Prudhvi Porandla
-     3	Renat Valiullin
-     3	TP Boudreau
-     3	Zhiyuan Zheng
-     2	Fabio B. Silva
-     2	HyukjinKwon
-     2	John Muehlhausen
-     2	Prashanth Govindarajan
-     2	alexandreyc
-     2	shyam
-     1	Alex Sergeev
-     1	Anatoly Myachev
-     1	Anson Qian
-     1	Frank Wessels
-     1	Gidon Gershinsky
-     1	Guillaume Horel
-     1	Ivan Sadikov
-     1	Jamie Blondin
-     1	Javier Luraschi
-     1	Jim Walker
-     1	Johannes Luong
-     1	Jonathan A. Sternberg
-     1	Kevin Gurney
-     1	MASAMIKI
-     1	Mark Cafaro
-     1	Masayuki Takahashi
-     1	Miguel Cabrera
-     1	Nuno Sucena Almeida
-     1	Pearu Peterson
-     1	Robin Kåveland Hansen
-     1	Rok
-     1	Ryan Murray
-     1	Takuya Kato
-     1	Yngve Sekse Kristiansen
-     1	Yuqi Gu
-     1	Yurui Zhou
-     1	Zhijun Fu
-     1	daslu
-     1	dlekkas
-     1	emkornfield
-     1	praveenbingo
-     1	rjzamora
-     1	siddharth
-     1	tianchen92
-     1	tiger
-```
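-
-The distinct-contributor count quoted above can be reproduced from the same shortlog output. The one-liner below is a small illustrative sketch (one shortlog line per contributor, counted with wc -l); it is not part of the release tooling itself.
-
-```console
-$ git shortlog -sn apache-arrow-0.13.0..apache-arrow-0.14.0 | wc -l
-78
-```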
-
-## Patch Committers
-
-The following Apache committers merged contributed patches to the repository.
-
-```console
-$ git shortlog -csn apache-arrow-0.13.0..apache-arrow-0.14.0
-   213	Wes McKinney
-    67	Antoine Pitrou
-    40	Sutou Kouhei
-    36	Krisztián Szűcs
-    36	Sebastien Binet
-    34	Micah Kornfield
-    33	Kouhei Sutou
-    22	Pindikura Ravindra
-    19	Romain Francois
-    18	Chao Sun
-    16	François Saint-Jacques
-    10	Uwe L. Korn
-     6	Bryan Cutler
-     4	Brian Hulette
-     4	Yosuke Shiro
-     3	GitHub
-     2	Andy Grove
-     2	Philipp Moritz
-     2	Sidd
-     1	Deepak Majeti
-     1	Robert Nishihara
-     1	ptaylor
-```
-
-## Changelog
-
-### New Features and Improvements
-
-* [ARROW-1012](https://issues.apache.org/jira/browse/ARROW-1012) - [C++] Create a configurable implementation of RecordBatchReader that reads from Apache Parquet files
-* [ARROW-1207](https://issues.apache.org/jira/browse/ARROW-1207) - [C++] Implement Map logical type
-* [ARROW-1261](https://issues.apache.org/jira/browse/ARROW-1261) - [Java] Add container type for Map logical type
-* [ARROW-1278](https://issues.apache.org/jira/browse/ARROW-1278) - Integration tests for Fixed Size List type
-* [ARROW-1279](https://issues.apache.org/jira/browse/ARROW-1279) - [Integration][Java] Integration tests for Map type
-* [ARROW-1280](https://issues.apache.org/jira/browse/ARROW-1280) - [C++] Implement Fixed Size List type
-* [ARROW-1496](https://issues.apache.org/jira/browse/ARROW-1496) - [JS] Upload coverage data to codecov.io
-* [ARROW-1558](https://issues.apache.org/jira/browse/ARROW-1558) - [C++] Implement boolean selection kernels
-* [ARROW-1774](https://issues.apache.org/jira/browse/ARROW-1774) - [C++] Add "view" function to create zero-copy views for compatible types, if supported
-* [ARROW-1957](https://issues.apache.org/jira/browse/ARROW-1957) - [Python] Write nanosecond timestamps using new NANO LogicalType Parquet unit
-* [ARROW-1983](https://issues.apache.org/jira/browse/ARROW-1983) - [Python] Add ability to write parquet \`\_metadata\` file
-* [ARROW-2057](https://issues.apache.org/jira/browse/ARROW-2057) - [Python] Configure size of data pages in pyarrow.parquet.write\_table
-* [ARROW-2102](https://issues.apache.org/jira/browse/ARROW-2102) - [C++] Implement take kernel functions - primitive value type
-* [ARROW-2103](https://issues.apache.org/jira/browse/ARROW-2103) - [C++] Implement take kernel functions - string/binary value type
-* [ARROW-2104](https://issues.apache.org/jira/browse/ARROW-2104) - [C++] Implement take kernel functions - nested array value type
-* [ARROW-2105](https://issues.apache.org/jira/browse/ARROW-2105) - [C++] Implement take kernel functions - properly handle special indices
-* [ARROW-2217](https://issues.apache.org/jira/browse/ARROW-2217) - [C++] Add option to use dynamic linking for compression library dependencies
-* [ARROW-2298](https://issues.apache.org/jira/browse/ARROW-2298) - [Python] Add option to not consider NaN to be null when converting to an integer Arrow type
-* [ARROW-2412](https://issues.apache.org/jira/browse/ARROW-2412) - [Integration] Add nested dictionary integration test
-* [ARROW-2467](https://issues.apache.org/jira/browse/ARROW-2467) - [Rust] Generate code using Flatbuffers
-* [ARROW-2517](https://issues.apache.org/jira/browse/ARROW-2517) - [Java] Add list<decimal> writer
-* [ARROW-2707](https://issues.apache.org/jira/browse/ARROW-2707) - [C++] Implement Table::Slice methods using Column::Slice
-* [ARROW-2796](https://issues.apache.org/jira/browse/ARROW-2796) - [C++] Simplify symbols.map file, use when building libarrow\_python
-* [ARROW-2818](https://issues.apache.org/jira/browse/ARROW-2818) - [Python] Better error message when passing SparseDataFrame into Table.from\_pandas
-* [ARROW-2835](https://issues.apache.org/jira/browse/ARROW-2835) - [C++] ReadAt/WriteAt are inconsistent with moving the file's position
-* [ARROW-2969](https://issues.apache.org/jira/browse/ARROW-2969) - [R] Convert between StructArray and "nested" data.frame column containing data frame in each cell
-* [ARROW-2981](https://issues.apache.org/jira/browse/ARROW-2981) - [C++] Support scripts / documentation for running clang-tidy on codebase
-* [ARROW-3040](https://issues.apache.org/jira/browse/ARROW-3040) - [Go] add support for comparing Arrays
-* [ARROW-3041](https://issues.apache.org/jira/browse/ARROW-3041) - [Go] add support for TimeArray
-* [ARROW-3052](https://issues.apache.org/jira/browse/ARROW-3052) - [C++] Detect ORC system packages
-* [ARROW-3087](https://issues.apache.org/jira/browse/ARROW-3087) - [C++] Add kernels for comparison operations to scalars
-* [ARROW-3144](https://issues.apache.org/jira/browse/ARROW-3144) - [C++] Move "dictionary" member from DictionaryType to ArrayData to allow for changing dictionaries between Array chunks
-* [ARROW-3150](https://issues.apache.org/jira/browse/ARROW-3150) - [Python] Ship Flight-enabled Python wheels on Linux and Windows
-* [ARROW-3166](https://issues.apache.org/jira/browse/ARROW-3166) - [C++] Consolidate IO interfaces used in arrow/io and parquet-cpp
-* [ARROW-3191](https://issues.apache.org/jira/browse/ARROW-3191) - [Java] Add support for ArrowBuf to point to arbitrary memory.
-* [ARROW-3200](https://issues.apache.org/jira/browse/ARROW-3200) - [C++] Add support for reading Flight streams with dictionaries
-* [ARROW-3290](https://issues.apache.org/jira/browse/ARROW-3290) - [C++] Toolchain support for secure gRPC
-* [ARROW-3294](https://issues.apache.org/jira/browse/ARROW-3294) - [C++] Test Flight RPC on Windows / Appveyor
-* [ARROW-3314](https://issues.apache.org/jira/browse/ARROW-3314) - [R] Set -rpath using pkg-config when building
-* [ARROW-3419](https://issues.apache.org/jira/browse/ARROW-3419) - [C++] Run include-what-you-use checks as nightly build
-* [ARROW-3459](https://issues.apache.org/jira/browse/ARROW-3459) - [C++][Gandiva] Add support for variable length output vectors
-* [ARROW-3475](https://issues.apache.org/jira/browse/ARROW-3475) - [C++] Int64Builder.Finish(NumericArray<Int64Type>)
-* [ARROW-3572](https://issues.apache.org/jira/browse/ARROW-3572) - [Packaging] Correctly handle ssh origin urls for crossbow
-* [ARROW-3671](https://issues.apache.org/jira/browse/ARROW-3671) - [Go] implement Interval array
-* [ARROW-3676](https://issues.apache.org/jira/browse/ARROW-3676) - [Go] implement Decimal128 array
-* [ARROW-3679](https://issues.apache.org/jira/browse/ARROW-3679) - [Go] implement IPC protocol
-* [ARROW-3680](https://issues.apache.org/jira/browse/ARROW-3680) - [Go] implement Float16 array
-* [ARROW-3686](https://issues.apache.org/jira/browse/ARROW-3686) - [Python] Support for masked arrays in to/from numpy
-* [ARROW-3729](https://issues.apache.org/jira/browse/ARROW-3729) - [C++] Support for writing TIMESTAMP\_NANOS Parquet metadata
-* [ARROW-3732](https://issues.apache.org/jira/browse/ARROW-3732) - [R] Add functions to write RecordBatch or Schema to Message value, then read back
-* [ARROW-3758](https://issues.apache.org/jira/browse/ARROW-3758) - [R] Build R library on Windows, document build instructions for Windows developers
-* [ARROW-3759](https://issues.apache.org/jira/browse/ARROW-3759) - [R][CI] Build and test on Windows in Appveyor
-* [ARROW-3767](https://issues.apache.org/jira/browse/ARROW-3767) - [C++] Add cast for Null to any type
-* [ARROW-3780](https://issues.apache.org/jira/browse/ARROW-3780) - [R] Failed to fetch data: invalid data when collecting int16
-* [ARROW-3791](https://issues.apache.org/jira/browse/ARROW-3791) - [C++] Add type inference for boolean values in CSV files
-* [ARROW-3794](https://issues.apache.org/jira/browse/ARROW-3794) - [R] Consider mapping INT8 to integer() not raw()
-* [ARROW-3804](https://issues.apache.org/jira/browse/ARROW-3804) - [R] Consider lowering required R runtime
-* [ARROW-3810](https://issues.apache.org/jira/browse/ARROW-3810) - [R] type= argument for Array and ChunkedArray
-* [ARROW-3811](https://issues.apache.org/jira/browse/ARROW-3811) - [R] struct arrays inference
-* [ARROW-3814](https://issues.apache.org/jira/browse/ARROW-3814) - [R] RecordBatch$from\_arrays()
-* [ARROW-3815](https://issues.apache.org/jira/browse/ARROW-3815) - [R] refine record batch factory
-* [ARROW-3848](https://issues.apache.org/jira/browse/ARROW-3848) - [R] allow nbytes to be missing in RandomAccessFile$Read()
-* [ARROW-3897](https://issues.apache.org/jira/browse/ARROW-3897) - [MATLAB] Add MATLAB support for writing numeric datatypes to a Feather file
-* [ARROW-3904](https://issues.apache.org/jira/browse/ARROW-3904) - [C++/Python] Validate scale and precision of decimal128 type
-* [ARROW-4013](https://issues.apache.org/jira/browse/ARROW-4013) - [Documentation][C++] Document how to build Apache Arrow on MSYS2
-* [ARROW-4020](https://issues.apache.org/jira/browse/ARROW-4020) - [Release] Remove source artifacts from dev dist system after release vote passes
-* [ARROW-4036](https://issues.apache.org/jira/browse/ARROW-4036) - [C++] Make status codes pluggable
-* [ARROW-4047](https://issues.apache.org/jira/browse/ARROW-4047) - [Python] Document use of int96 timestamps and options in Parquet docs
-* [ARROW-4086](https://issues.apache.org/jira/browse/ARROW-4086) - [Java] Add apis to debug alloc failures
-* [ARROW-4121](https://issues.apache.org/jira/browse/ARROW-4121) - [C++] Refactor memory allocation from InvertKernel
-* [ARROW-4159](https://issues.apache.org/jira/browse/ARROW-4159) - [C++] Check for -Wdocumentation issues
-* [ARROW-4194](https://issues.apache.org/jira/browse/ARROW-4194) - [Format] Metadata.rst does not specify timezone for Timestamp type
-* [ARROW-4302](https://issues.apache.org/jira/browse/ARROW-4302) - [C++] Add OpenSSL to C++ build toolchain
-* [ARROW-4337](https://issues.apache.org/jira/browse/ARROW-4337) - [C#] Array / RecordBatch Builder Fluent API
-* [ARROW-4343](https://issues.apache.org/jira/browse/ARROW-4343) - [C++] Add as complete as possible Ubuntu Trusty / 14.04 build to docker-compose setup
-* [ARROW-4356](https://issues.apache.org/jira/browse/ARROW-4356) - [CI] Add integration (docker) test for turbodbc
-* [ARROW-4452](https://issues.apache.org/jira/browse/ARROW-4452) - [Python] Serializing sparse torch tensors
-* [ARROW-4453](https://issues.apache.org/jira/browse/ARROW-4453) - [Python] Create Cython wrappers for SparseTensor
-* [ARROW-4467](https://issues.apache.org/jira/browse/ARROW-4467) - [Rust] [DataFusion] Create a REPL & Dockerfile for DataFusion
-* [ARROW-4503](https://issues.apache.org/jira/browse/ARROW-4503) - [C#] ArrowStreamReader allocates and copies data excessively
-* [ARROW-4504](https://issues.apache.org/jira/browse/ARROW-4504) - [C++] Reduce the number of unit test executables
-* [ARROW-4505](https://issues.apache.org/jira/browse/ARROW-4505) - [C++] Nicer PrettyPrint for date32
-* [ARROW-4566](https://issues.apache.org/jira/browse/ARROW-4566) - [C++][Flight] Add option to run arrow-flight-benchmark against a perf server running on a different host
-* [ARROW-4596](https://issues.apache.org/jira/browse/ARROW-4596) - [Rust] [DataFusion] Implement COUNT aggregate function
-* [ARROW-4622](https://issues.apache.org/jira/browse/ARROW-4622) - [C++] [Python] MakeDense and MakeSparse in UnionArray should accept a vector of Field
-* [ARROW-4625](https://issues.apache.org/jira/browse/ARROW-4625) - [Flight] Wrap server busy-wait methods
-* [ARROW-4626](https://issues.apache.org/jira/browse/ARROW-4626) - [Flight] Add application metadata field to DoGet
-* [ARROW-4627](https://issues.apache.org/jira/browse/ARROW-4627) - [Flight] Add application metadata field to DoPut
-* [ARROW-4701](https://issues.apache.org/jira/browse/ARROW-4701) - [C++] Add JSON chunker benchmarks
-* [ARROW-4702](https://issues.apache.org/jira/browse/ARROW-4702) - [C++] Upgrade dependency versions
-* [ARROW-4708](https://issues.apache.org/jira/browse/ARROW-4708) - [C++] Add multithreaded JSON reader
-* [ARROW-4714](https://issues.apache.org/jira/browse/ARROW-4714) - [C++][Java] Providing JNI interface to read ORC files via Arrow C++
-* [ARROW-4717](https://issues.apache.org/jira/browse/ARROW-4717) - [C#] Consider exposing ValueTask instead of Task
-* [ARROW-4719](https://issues.apache.org/jira/browse/ARROW-4719) - [C#] Implement ChunkedArray, Column and Table in C#
-* [ARROW-4741](https://issues.apache.org/jira/browse/ARROW-4741) - [Java] Add documentation to all classes and enable checkstyle for class javadocs
-* [ARROW-4787](https://issues.apache.org/jira/browse/ARROW-4787) - [C++] Include "null" values (perhaps with an option to toggle on/off) in hash kernel actions
-* [ARROW-4788](https://issues.apache.org/jira/browse/ARROW-4788) - [C++] Develop less verbose API for constructing StructArray
-* [ARROW-4800](https://issues.apache.org/jira/browse/ARROW-4800) - [C++] Create/port a StatusOr implementation to be able to return a status or a type
-* [ARROW-4805](https://issues.apache.org/jira/browse/ARROW-4805) - [Rust] Write temporal arrays to CSV
-* [ARROW-4806](https://issues.apache.org/jira/browse/ARROW-4806) - [Rust] Support casting temporal arrays in cast kernels
-* [ARROW-4824](https://issues.apache.org/jira/browse/ARROW-4824) - [Python] read\_csv should accept io.StringIO objects
-* [ARROW-4827](https://issues.apache.org/jira/browse/ARROW-4827) - [C++] Implement benchmark comparison between two git revisions
-* [ARROW-4847](https://issues.apache.org/jira/browse/ARROW-4847) - [Python] Add pyarrow.table factory function that dispatches to various ctors based on type of input
-* [ARROW-4904](https://issues.apache.org/jira/browse/ARROW-4904) - [C++] Move implementations in arrow/ipc/test-common.h into libarrow\_testing
-* [ARROW-4911](https://issues.apache.org/jira/browse/ARROW-4911) - [R] Support for building package for Windows
-* [ARROW-4912](https://issues.apache.org/jira/browse/ARROW-4912) - [C++, Python] Allow specifying column names to CSV reader
-* [ARROW-4913](https://issues.apache.org/jira/browse/ARROW-4913) - [Java][Memory] Limit number of ledgers and arrowbufs
-* [ARROW-4945](https://issues.apache.org/jira/browse/ARROW-4945) - [Flight] Enable Flight integration tests in Travis
-* [ARROW-4956](https://issues.apache.org/jira/browse/ARROW-4956) - [C#] Allow ArrowBuffers to wrap external Memory in C#
-* [ARROW-4959](https://issues.apache.org/jira/browse/ARROW-4959) - [Gandiva][Crossbow] Builds broken
-* [ARROW-4968](https://issues.apache.org/jira/browse/ARROW-4968) - [Rust] StructArray builder and From<> methods should check that field types match schema
-* [ARROW-4971](https://issues.apache.org/jira/browse/ARROW-4971) - [Go] DataType equality
-* [ARROW-4972](https://issues.apache.org/jira/browse/ARROW-4972) - [Go] Array equality
-* [ARROW-4973](https://issues.apache.org/jira/browse/ARROW-4973) - [Go] Slice Array equality
-* [ARROW-4974](https://issues.apache.org/jira/browse/ARROW-4974) - [Go] Array approx equality
-* [ARROW-4990](https://issues.apache.org/jira/browse/ARROW-4990) - [C++] Kernel to compare array with array
-* [ARROW-4993](https://issues.apache.org/jira/browse/ARROW-4993) - [C++] Display summary at the end of CMake configuration
-* [ARROW-5000](https://issues.apache.org/jira/browse/ARROW-5000) - [Python] Fix deprecation warning from setup.py
-* [ARROW-5007](https://issues.apache.org/jira/browse/ARROW-5007) - [C++] Move DCHECK out of sse-utils
-* [ARROW-5020](https://issues.apache.org/jira/browse/ARROW-5020) - [C++][Gandiva] Split Gandiva-related conda packages for builds into separate .yml conda env file
-* [ARROW-5027](https://issues.apache.org/jira/browse/ARROW-5027) - [Python] Add JSON Reader
-* [ARROW-5038](https://issues.apache.org/jira/browse/ARROW-5038) - [Rust] [DataFusion] Implement AVG aggregate function
-* [ARROW-5039](https://issues.apache.org/jira/browse/ARROW-5039) - [Rust] [DataFusion] Fix bugs in CAST support
-* [ARROW-5045](https://issues.apache.org/jira/browse/ARROW-5045) - [Rust] Code coverage silently failing in CI
-* [ARROW-5053](https://issues.apache.org/jira/browse/ARROW-5053) - [Rust] [DataFusion] Use env var for location of arrow test data
-* [ARROW-5054](https://issues.apache.org/jira/browse/ARROW-5054) - [C++][Release] Test Flight in verify-release-candidate.sh
-* [ARROW-5056](https://issues.apache.org/jira/browse/ARROW-5056) - [Packaging] Adjust conda recipes to use ORC conda-forge package on unix systems
-* [ARROW-5061](https://issues.apache.org/jira/browse/ARROW-5061) - [Release] Improve 03-binary performance
-* [ARROW-5062](https://issues.apache.org/jira/browse/ARROW-5062) - [Java] Shade Java Guava dependency for Flight
-* [ARROW-5063](https://issues.apache.org/jira/browse/ARROW-5063) - [Java] FlightClient should not create a child allocator
-* [ARROW-5064](https://issues.apache.org/jira/browse/ARROW-5064) - [Release] Pass PKG\_CONFIG\_PATH to glib in the verification script
-* [ARROW-5066](https://issues.apache.org/jira/browse/ARROW-5066) - [Integration] Add flags to enable/disable implementations in integration/integration\_test.py
-* [ARROW-5071](https://issues.apache.org/jira/browse/ARROW-5071) - [Benchmarking] Perform a benchmark run with archery
-* [ARROW-5076](https://issues.apache.org/jira/browse/ARROW-5076) - [Packaging] Improve post binary upload performance
-* [ARROW-5077](https://issues.apache.org/jira/browse/ARROW-5077) - [Rust] Release process should change Cargo.toml to use release versions
-* [ARROW-5078](https://issues.apache.org/jira/browse/ARROW-5078) - [Documentation] Sphinx build fails due to RemovedInSphinx30Warning
-* [ARROW-5079](https://issues.apache.org/jira/browse/ARROW-5079) - [Release] Add a script to release C# package
-* [ARROW-5080](https://issues.apache.org/jira/browse/ARROW-5080) - [Release] Add a script to release Rust packages
-* [ARROW-5081](https://issues.apache.org/jira/browse/ARROW-5081) - [C++] Consistently use PATH\_SUFFIXES in CMake config
-* [ARROW-5082](https://issues.apache.org/jira/browse/ARROW-5082) - [Python][Packaging] Reduce size of macOS and manylinux1 wheels
-* [ARROW-5083](https://issues.apache.org/jira/browse/ARROW-5083) - [Developer] In merge\_arrow\_pr.py script, allow user to set a released Fix Version
-* [ARROW-5088](https://issues.apache.org/jira/browse/ARROW-5088) - [C++] Do not set -Werror when using BUILD\_WARNING\_LEVEL=CHECKIN in release mode
-* [ARROW-5091](https://issues.apache.org/jira/browse/ARROW-5091) - [Flight] Rename FlightGetInfo message to FlightInfo
-* [ARROW-5093](https://issues.apache.org/jira/browse/ARROW-5093) - [Packaging] Add support for selective binary upload
-* [ARROW-5094](https://issues.apache.org/jira/browse/ARROW-5094) - [Packaging] Add APT/Yum verification scripts
-* [ARROW-5102](https://issues.apache.org/jira/browse/ARROW-5102) - [C++] Reduce header dependencies
-* [ARROW-5108](https://issues.apache.org/jira/browse/ARROW-5108) - [Go] implement reading primitive arrays from Arrow file
-* [ARROW-5109](https://issues.apache.org/jira/browse/ARROW-5109) - [Go] implement reading binary/string arrays from Arrow file
-* [ARROW-5110](https://issues.apache.org/jira/browse/ARROW-5110) - [Go] implement reading struct arrays from Arrow file
-* [ARROW-5111](https://issues.apache.org/jira/browse/ARROW-5111) - [Go] implement reading list arrays from Arrow file
-* [ARROW-5112](https://issues.apache.org/jira/browse/ARROW-5112) - [Go] implement writing arrays to Arrow file
-* [ARROW-5113](https://issues.apache.org/jira/browse/ARROW-5113) - [C++][Flight] Unit tests in C++ for DoPut
-* [ARROW-5115](https://issues.apache.org/jira/browse/ARROW-5115) - [JS] Implement the Vector Builders
-* [ARROW-5116](https://issues.apache.org/jira/browse/ARROW-5116) - [Rust] move kernel related files under compute/kernels
-* [ARROW-5124](https://issues.apache.org/jira/browse/ARROW-5124) - [C++] Add support for Parquet in MinGW build
-* [ARROW-5126](https://issues.apache.org/jira/browse/ARROW-5126) - [Rust] [Parquet] Convert parquet column desc to arrow data type
-* [ARROW-5127](https://issues.apache.org/jira/browse/ARROW-5127) - [Rust] [Parquet] Add page iterator
-* [ARROW-5136](https://issues.apache.org/jira/browse/ARROW-5136) - [Flight] Implement call options (timeouts)
-* [ARROW-5137](https://issues.apache.org/jira/browse/ARROW-5137) - [Flight] Implement authentication APIs
-* [ARROW-5145](https://issues.apache.org/jira/browse/ARROW-5145) - [C++] Release mode lacks convenience input validation
-* [ARROW-5150](https://issues.apache.org/jira/browse/ARROW-5150) - [Ruby] Add Arrow::Table#raw\_records
-* [ARROW-5155](https://issues.apache.org/jira/browse/ARROW-5155) - [GLib][Ruby] Add support for building union arrays from data type
-* [ARROW-5157](https://issues.apache.org/jira/browse/ARROW-5157) - [Website] Add MATLAB to powered by Apache Arrow page
-* [ARROW-5162](https://issues.apache.org/jira/browse/ARROW-5162) - [Rust] [Parquet] Rename mod reader to arrow.
-* [ARROW-5163](https://issues.apache.org/jira/browse/ARROW-5163) - [Gandiva] Casting timestamp/date incorrectly evaluates year 0097 as 1997
-* [ARROW-5164](https://issues.apache.org/jira/browse/ARROW-5164) - [Gandiva] [C++] Introduce 32bit hash functions
-* [ARROW-5165](https://issues.apache.org/jira/browse/ARROW-5165) - [Python][Documentation] Build docs don't suggest assigning $ARROW\_BUILD\_TYPE
-* [ARROW-5168](https://issues.apache.org/jira/browse/ARROW-5168) - [GLib] Add garrow\_array\_take()
-* [ARROW-5171](https://issues.apache.org/jira/browse/ARROW-5171) - [C++] Use LESS instead of LOWER in compare enum option.
-* [ARROW-5172](https://issues.apache.org/jira/browse/ARROW-5172) - [Go] implement reading fixed-size binary arrays from Arrow file
-* [ARROW-5178](https://issues.apache.org/jira/browse/ARROW-5178) - [Python] Allow creating Table from Python dict
-* [ARROW-5179](https://issues.apache.org/jira/browse/ARROW-5179) - [Python] Return plain dicts, not OrderedDict, on Python 3.7+
-* [ARROW-5185](https://issues.apache.org/jira/browse/ARROW-5185) - [C++] Add support for Boost with CMake configuration file
-* [ARROW-5187](https://issues.apache.org/jira/browse/ARROW-5187) - [Rust] Ability to flatten StructArray into a RecordBatch
-* [ARROW-5188](https://issues.apache.org/jira/browse/ARROW-5188) - [Rust] Add temporal builders for StructArray
-* [ARROW-5189](https://issues.apache.org/jira/browse/ARROW-5189) - [Rust] [Parquet] Format individual fields within a parquet row
-* [ARROW-5190](https://issues.apache.org/jira/browse/ARROW-5190) - [R] Discussion: tibble dependency in R package
-* [ARROW-5191](https://issues.apache.org/jira/browse/ARROW-5191) - [Rust] Expose CSV and JSON reader schemas
-* [ARROW-5203](https://issues.apache.org/jira/browse/ARROW-5203) - [GLib] Add support for Compare filter
-* [ARROW-5204](https://issues.apache.org/jira/browse/ARROW-5204) - [C++] Improve BufferBuilder performance
-* [ARROW-5212](https://issues.apache.org/jira/browse/ARROW-5212) - [Go] Array BinaryBuilder in Go library has no access to resize the values buffer
-* [ARROW-5218](https://issues.apache.org/jira/browse/ARROW-5218) - [C++] Improve build when third-party library locations are specified
-* [ARROW-5219](https://issues.apache.org/jira/browse/ARROW-5219) - [C++] Build protobuf\_ep in parallel when using Ninja
-* [ARROW-5222](https://issues.apache.org/jira/browse/ARROW-5222) - [Python] Issues with installing pyarrow for development on MacOS
-* [ARROW-5225](https://issues.apache.org/jira/browse/ARROW-5225) - [Java] Improve performance of BaseValueVector#getValidityBufferSizeFromCount
-* [ARROW-5226](https://issues.apache.org/jira/browse/ARROW-5226) - [Gandiva] support compare operators for decimal
-* [ARROW-5238](https://issues.apache.org/jira/browse/ARROW-5238) - [Python] Improve usability of pyarrow.dictionary function
-* [ARROW-5241](https://issues.apache.org/jira/browse/ARROW-5241) - [Python] Add option to disable writing statistics to parquet file
-* [ARROW-5250](https://issues.apache.org/jira/browse/ARROW-5250) - [Java] remove javadoc suppression on methods.
-* [ARROW-5252](https://issues.apache.org/jira/browse/ARROW-5252) - [C++] Change variant implementation
-* [ARROW-5256](https://issues.apache.org/jira/browse/ARROW-5256) - [Packaging][deb] Failed to build with LLVM 7.1.0
-* [ARROW-5257](https://issues.apache.org/jira/browse/ARROW-5257) - [Website] Update site to use "official" Apache Arrow logo, add clearly marked links to logo
-* [ARROW-5258](https://issues.apache.org/jira/browse/ARROW-5258) - [C++/Python] Expose file metadata of dataset pieces to caller
-* [ARROW-5261](https://issues.apache.org/jira/browse/ARROW-5261) - [C++] Finish implementation of scalar types for Duration and Interval
-* [ARROW-5262](https://issues.apache.org/jira/browse/ARROW-5262) - [Python] Fix typo
-* [ARROW-5264](https://issues.apache.org/jira/browse/ARROW-5264) - [Java] Allow enabling/disabling boundary checking by environmental variable
-* [ARROW-5266](https://issues.apache.org/jira/browse/ARROW-5266) - [Go] implement read/write IPC for Float16
-* [ARROW-5268](https://issues.apache.org/jira/browse/ARROW-5268) - [GLib] Add GArrowJSONReader
-* [ARROW-5269](https://issues.apache.org/jira/browse/ARROW-5269) - [C++] Whitelist benchmark candidates for regression checks
-* [ARROW-5275](https://issues.apache.org/jira/browse/ARROW-5275) - [C++] Write generic filesystem tests
-* [ARROW-5281](https://issues.apache.org/jira/browse/ARROW-5281) - [Rust] [Parquet] Move DataPageBuilder to test\_common
-* [ARROW-5284](https://issues.apache.org/jira/browse/ARROW-5284) - [Rust] Replace libc with std::alloc for memory allocation
-* [ARROW-5286](https://issues.apache.org/jira/browse/ARROW-5286) - [Python] support Structs in Table.from\_pandas given a known schema
-* [ARROW-5288](https://issues.apache.org/jira/browse/ARROW-5288) - [Documentation] Enrich the contribution guidelines
-* [ARROW-5289](https://issues.apache.org/jira/browse/ARROW-5289) - [C++] Move arrow/util/concatenate.h to arrow/array/
-* [ARROW-5290](https://issues.apache.org/jira/browse/ARROW-5290) - [Java] Provide a flag to enable/disable null-checking in vectors' get methods
-* [ARROW-5291](https://issues.apache.org/jira/browse/ARROW-5291) - [Python] Add wrapper for "take" kernel on Array
-* [ARROW-5298](https://issues.apache.org/jira/browse/ARROW-5298) - [Rust] Add debug implementation for Buffer
-* [ARROW-5299](https://issues.apache.org/jira/browse/ARROW-5299) - [C++] ListArray comparison is incorrect
-* [ARROW-5309](https://issues.apache.org/jira/browse/ARROW-5309) - [Python] Add clarifications to Python "append" methods that return new objects
-* [ARROW-5311](https://issues.apache.org/jira/browse/ARROW-5311) - [C++] Return more specific invalid Status in Take kernel
-* [ARROW-5313](https://issues.apache.org/jira/browse/ARROW-5313) - [Format] Comments on Field table are a bit confusing
-* [ARROW-5317](https://issues.apache.org/jira/browse/ARROW-5317) - [Rust] [Parquet] impl IntoIterator for SerializedFileReader
-* [ARROW-5319](https://issues.apache.org/jira/browse/ARROW-5319) - [CI] Enable ccache with MinGW builds
-* [ARROW-5321](https://issues.apache.org/jira/browse/ARROW-5321) - [Gandiva][C++] add isnull and isnotnull for utf8 and binary types
-* [ARROW-5323](https://issues.apache.org/jira/browse/ARROW-5323) - [CI] Use compression with clcache
-* [ARROW-5328](https://issues.apache.org/jira/browse/ARROW-5328) - [R] Add shell scripts to do a full package rebuild and test locally
-* [ARROW-5329](https://issues.apache.org/jira/browse/ARROW-5329) - Add support for building MATLAB interface to Feather directly within MATLAB
-* [ARROW-5334](https://issues.apache.org/jira/browse/ARROW-5334) - [C++] Add "Type" to names of arrow::Integer, arrow::FloatingPoint classes for consistency
-* [ARROW-5335](https://issues.apache.org/jira/browse/ARROW-5335) - [Python] Raise on variable dictionaries when converting to pandas
-* [ARROW-5339](https://issues.apache.org/jira/browse/ARROW-5339) - [C++] Add jemalloc to thirdparty dependency download script
-* [ARROW-5341](https://issues.apache.org/jira/browse/ARROW-5341) - [C++] Add instructions about fixing and testing for -Wdocumentation clang warnings locally
-* [ARROW-5342](https://issues.apache.org/jira/browse/ARROW-5342) - [Format] Formalize extension type metadata in IPC protocol
-* [ARROW-5346](https://issues.apache.org/jira/browse/ARROW-5346) - [C++] Revert changes to qualify duration in vendored date code
-* [ARROW-5349](https://issues.apache.org/jira/browse/ARROW-5349) - [Python/C++] Provide a way to specify the file path in parquet ColumnChunkMetaData
-* [ARROW-5361](https://issues.apache.org/jira/browse/ARROW-5361) - [R] Follow DictionaryType/DictionaryArray changes from ARROW-3144
-* [ARROW-5363](https://issues.apache.org/jira/browse/ARROW-5363) - [GLib] Fix coding styles
-* [ARROW-5364](https://issues.apache.org/jira/browse/ARROW-5364) - [C++] Use ASCII rather than UTF-8 in BuildUtils.cmake comment
-* [ARROW-5365](https://issues.apache.org/jira/browse/ARROW-5365) - [C++][CI] Add UBSan and ASAN into CI
-* [ARROW-5368](https://issues.apache.org/jira/browse/ARROW-5368) - [C++] Disable jemalloc by default with MinGW
-* [ARROW-5369](https://issues.apache.org/jira/browse/ARROW-5369) - [C++] Add support for glog on Windows
-* [ARROW-5370](https://issues.apache.org/jira/browse/ARROW-5370) - [C++] Detect system uriparser by default
-* [ARROW-5372](https://issues.apache.org/jira/browse/ARROW-5372) - [GLib] Add support for null/boolean values CSV read option
-* [ARROW-5378](https://issues.apache.org/jira/browse/ARROW-5378) - [C++] Add local FileSystem implementation
-* [ARROW-5380](https://issues.apache.org/jira/browse/ARROW-5380) - [C++] Fix and enable UBSan for unaligned accesses.
-* [ARROW-5384](https://issues.apache.org/jira/browse/ARROW-5384) - [Go] add FixedSizeList array
-* [ARROW-5389](https://issues.apache.org/jira/browse/ARROW-5389) - [C++] Add an internal temporary directory API
-* [ARROW-5392](https://issues.apache.org/jira/browse/ARROW-5392) - [C++][CI][MinGW] Disable static library build on AppVeyor
-* [ARROW-5393](https://issues.apache.org/jira/browse/ARROW-5393) - [R] Add tests and example for read\_parquet()
-* [ARROW-5395](https://issues.apache.org/jira/browse/ARROW-5395) - [C++] Utilize stream EOS in File format
-* [ARROW-5396](https://issues.apache.org/jira/browse/ARROW-5396) - [JS] Ensure reader and writer support files and streams with no RecordBatches
-* [ARROW-5401](https://issues.apache.org/jira/browse/ARROW-5401) - [CI] [C++] Print ccache statistics on Travis-CI
-* [ARROW-5404](https://issues.apache.org/jira/browse/ARROW-5404) - [C++] nonstd::string\_view conflicts with std::string\_view in c++17
-* [ARROW-5407](https://issues.apache.org/jira/browse/ARROW-5407) - [C++] Integration test Travis CI entry builds many unnecessary targets
-* [ARROW-5413](https://issues.apache.org/jira/browse/ARROW-5413) - [C++] CSV reader doesn't remove BOM
-* [ARROW-5415](https://issues.apache.org/jira/browse/ARROW-5415) - [Release] Release script should update R version everywhere
-* [ARROW-5416](https://issues.apache.org/jira/browse/ARROW-5416) - [Website] Add Homebrew to project installation page
-* [ARROW-5418](https://issues.apache.org/jira/browse/ARROW-5418) - [CI][R] Run code coverage and report to codecov.io
-* [ARROW-5420](https://issues.apache.org/jira/browse/ARROW-5420) - [Java] Implement or remove getCurrentSizeInBytes in VariableWidthVector
-* [ARROW-5427](https://issues.apache.org/jira/browse/ARROW-5427) - [Python] RangeIndex serialization change implications
-* [ARROW-5428](https://issues.apache.org/jira/browse/ARROW-5428) - [C++] Add option to set "read extent" in arrow::io::BufferedInputStream
-* [ARROW-5429](https://issues.apache.org/jira/browse/ARROW-5429) - [Java] Provide alternative buffer allocation policy
-* [ARROW-5432](https://issues.apache.org/jira/browse/ARROW-5432) - [Python] Add 'read\_at' method to pyarrow.NativeFile
-* [ARROW-5433](https://issues.apache.org/jira/browse/ARROW-5433) - [C++][Parquet] improve parquet-reader columns information
-* [ARROW-5434](https://issues.apache.org/jira/browse/ARROW-5434) - [Java] Introduce wrappers for backward compatibility for ArrowBuf changes in ARROW-3191
-* [ARROW-5436](https://issues.apache.org/jira/browse/ARROW-5436) - [Python] expose filters argument in parquet.read\_table
-* [ARROW-5438](https://issues.apache.org/jira/browse/ARROW-5438) - [JS] Utilize stream EOS in File format
-* [ARROW-5441](https://issues.apache.org/jira/browse/ARROW-5441) - [C++] Implement FindArrowFlight.cmake
-* [ARROW-5442](https://issues.apache.org/jira/browse/ARROW-5442) - [Website] Clarify what makes a release artifact "official"
-* [ARROW-5443](https://issues.apache.org/jira/browse/ARROW-5443) - [Gandiva][Crossbow] Turn parquet encryption off
-* [ARROW-5447](https://issues.apache.org/jira/browse/ARROW-5447) - [CI] [Ruby] CI fails on AppVeyor
-* [ARROW-5449](https://issues.apache.org/jira/browse/ARROW-5449) - [C++] Local filesystem implementation: investigate Windows UNC paths
-* [ARROW-5451](https://issues.apache.org/jira/browse/ARROW-5451) - [C++][Gandiva] Add round functions for decimals
-* [ARROW-5452](https://issues.apache.org/jira/browse/ARROW-5452) - [R] Add documentation website (pkgdown)
-* [ARROW-5461](https://issues.apache.org/jira/browse/ARROW-5461) - [Java] Add micro-benchmarks for Float8Vector and allocators
-* [ARROW-5463](https://issues.apache.org/jira/browse/ARROW-5463) - [Rust] Implement AsRef for Buffer
-* [ARROW-5464](https://issues.apache.org/jira/browse/ARROW-5464) - [Archery] Bad --benchmark-filter default
-* [ARROW-5465](https://issues.apache.org/jira/browse/ARROW-5465) - [Crossbow] Support writing submitted job definition yaml to a file
-* [ARROW-5466](https://issues.apache.org/jira/browse/ARROW-5466) - [Java] Dockerize Java builds in Travis CI, run multiple JDKs in single entry
-* [ARROW-5467](https://issues.apache.org/jira/browse/ARROW-5467) - [Go] implement read/write IPC for Time32/Time64 arrays
-* [ARROW-5468](https://issues.apache.org/jira/browse/ARROW-5468) - [Go] implement read/write IPC for Timestamp arrays
-* [ARROW-5469](https://issues.apache.org/jira/browse/ARROW-5469) - [Go] implement read/write IPC for Date32/Date64 arrays
-* [ARROW-5470](https://issues.apache.org/jira/browse/ARROW-5470) - [CI] C++ local filesystem patch breaks Travis R job
-* [ARROW-5472](https://issues.apache.org/jira/browse/ARROW-5472) - [Development] Add warning to PR merge tool if no JIRA component is set
-* [ARROW-5474](https://issues.apache.org/jira/browse/ARROW-5474) - [C++] Document required Boost version
-* [ARROW-5476](https://issues.apache.org/jira/browse/ARROW-5476) - [Java][Memory] Fix Netty ArrowBuf Slice
-* [ARROW-5477](https://issues.apache.org/jira/browse/ARROW-5477) - [C++] Check required RapidJSON version
-* [ARROW-5478](https://issues.apache.org/jira/browse/ARROW-5478) - [Packaging] Drop Ubuntu 14.04 support
-* [ARROW-5481](https://issues.apache.org/jira/browse/ARROW-5481) - [GLib] garrow\_seekable\_input\_stream\_peek() misses "error" parameter document
-* [ARROW-5483](https://issues.apache.org/jira/browse/ARROW-5483) - [Java] add ValueVector constructors that take a Field object
-* [ARROW-5485](https://issues.apache.org/jira/browse/ARROW-5485) - [Gandiva][Crossbow] OSx builds failing
-* [ARROW-5486](https://issues.apache.org/jira/browse/ARROW-5486) - [GLib] Add binding of gandiva::FunctionRegistry and related things
-* [ARROW-5488](https://issues.apache.org/jira/browse/ARROW-5488) - [R] Workaround when C++ lib not available
-* [ARROW-5490](https://issues.apache.org/jira/browse/ARROW-5490) - [C++] Remove ARROW\_BOOST\_HEADER\_ONLY
-* [ARROW-5491](https://issues.apache.org/jira/browse/ARROW-5491) - [C++] Remove unnecessary semicolons following MACRO definitions
-* [ARROW-5492](https://issues.apache.org/jira/browse/ARROW-5492) - [R] Add "col\_select" argument to read\_\* functions to read subset of columns
-* [ARROW-5495](https://issues.apache.org/jira/browse/ARROW-5495) - [C++] Use HTTPS consistently for downloading dependencies
-* [ARROW-5496](https://issues.apache.org/jira/browse/ARROW-5496) - [R][CI] Fix relative paths in R codecov.io reporting
-* [ARROW-5498](https://issues.apache.org/jira/browse/ARROW-5498) - [C++] Build failure with Flatbuffers 1.11.0 and MinGW
-* [ARROW-5500](https://issues.apache.org/jira/browse/ARROW-5500) - [R] read\_csv\_arrow() signature should match readr::read\_csv()
-* [ARROW-5503](https://issues.apache.org/jira/browse/ARROW-5503) - [R] add read\_json()
-* [ARROW-5504](https://issues.apache.org/jira/browse/ARROW-5504) - [R] move use\_threads argument to global option
-* [ARROW-5509](https://issues.apache.org/jira/browse/ARROW-5509) - [R] write\_parquet()
-* [ARROW-5511](https://issues.apache.org/jira/browse/ARROW-5511) - [Packaging] Enable Flight in Conda packages
-* [ARROW-5512](https://issues.apache.org/jira/browse/ARROW-5512) - [C++] Draft initial public APIs for Datasets project
-* [ARROW-5513](https://issues.apache.org/jira/browse/ARROW-5513) - [Java] Refactor method name for getstartOffset to use camel case
-* [ARROW-5516](https://issues.apache.org/jira/browse/ARROW-5516) - [Python] Development page for pyarrow has a missing dependency in using pip
-* [ARROW-5518](https://issues.apache.org/jira/browse/ARROW-5518) - [Java] Set VectorSchemaRoot rowCount to 0 on allocateNew and clear
-* [ARROW-5524](https://issues.apache.org/jira/browse/ARROW-5524) - [C++] Turn off PARQUET\_BUILD\_ENCRYPTION in CMake if OpenSSL not found
-* [ARROW-5526](https://issues.apache.org/jira/browse/ARROW-5526) - [Developer] Add more prominent notice to GitHub issue template to direct bug reports to JIRA
-* [ARROW-5529](https://issues.apache.org/jira/browse/ARROW-5529) - [Flight] Allow serving with multiple TLS certificates
-* [ARROW-5531](https://issues.apache.org/jira/browse/ARROW-5531) - [Python] Support binary, utf8, and nested types in Array.from\_buffers
-* [ARROW-5533](https://issues.apache.org/jira/browse/ARROW-5533) - [Plasma] Plasma client should be thread-safe
-* [ARROW-5534](https://issues.apache.org/jira/browse/ARROW-5534) - [GLib] Add garrow\_table\_concatenate()
-* [ARROW-5535](https://issues.apache.org/jira/browse/ARROW-5535) - [GLib] Add garrow\_table\_slice()
-* [ARROW-5537](https://issues.apache.org/jira/browse/ARROW-5537) - [JS] Support delta dictionaries in RecordBatchWriter and DictionaryBuilder
-* [ARROW-5538](https://issues.apache.org/jira/browse/ARROW-5538) - [C++] Restrict minimum OpenSSL version to 1.0.2
-* [ARROW-5541](https://issues.apache.org/jira/browse/ARROW-5541) - [R] casts from negative int32 to uint32 and uint64 are now safe
-* [ARROW-5544](https://issues.apache.org/jira/browse/ARROW-5544) - [Archery] should not return non-zero in \`benchmark diff\` sub command on regression
-* [ARROW-5545](https://issues.apache.org/jira/browse/ARROW-5545) - [C++][Docs] Clarify expectation of UTC values for timestamps with time zones in C++ API docs
-* [ARROW-5547](https://issues.apache.org/jira/browse/ARROW-5547) - [C++][FlightRPC] arrow-flight.pc isn't provided
-* [ARROW-5552](https://issues.apache.org/jira/browse/ARROW-5552) - [Go] make Schema and Field implement Stringer
-* [ARROW-5554](https://issues.apache.org/jira/browse/ARROW-5554) - Add a python wrapper for arrow::Concatenate
-* [ARROW-5555](https://issues.apache.org/jira/browse/ARROW-5555) - [R] Add install\_arrow() function to assist the user in obtaining C++ runtime libraries
-* [ARROW-5556](https://issues.apache.org/jira/browse/ARROW-5556) - [Doc] Document JSON reader
-* [ARROW-5557](https://issues.apache.org/jira/browse/ARROW-5557) - [C++] Investigate performance of VisitBitsUnrolled on different platforms
-* [ARROW-5564](https://issues.apache.org/jira/browse/ARROW-5564) - [C++] Add uriparser to conda-forge
-* [ARROW-5565](https://issues.apache.org/jira/browse/ARROW-5565) - [Python] Document how to use gdb when working on pyarrow
-* [ARROW-5567](https://issues.apache.org/jira/browse/ARROW-5567) - [C++] Fix build error of memory-benchmark
-* [ARROW-5574](https://issues.apache.org/jira/browse/ARROW-5574) - [R] documentation error for read\_arrow()
-* [ARROW-5580](https://issues.apache.org/jira/browse/ARROW-5580) - Correct definitions of timestamp functions in Gandiva
-* [ARROW-5581](https://issues.apache.org/jira/browse/ARROW-5581) - [Java] Provide interfaces and initial implementations for vector sorting
-* [ARROW-5582](https://issues.apache.org/jira/browse/ARROW-5582) - [Go] add support for comparing Records
-* [ARROW-5586](https://issues.apache.org/jira/browse/ARROW-5586) - [R] convert Array of LIST type to R lists
-* [ARROW-5587](https://issues.apache.org/jira/browse/ARROW-5587) - [Java] Add more maven style check for Java code
-* [ARROW-5590](https://issues.apache.org/jira/browse/ARROW-5590) - [R] Run "no libarrow" R build in the same CI entry if possible
-* [ARROW-5591](https://issues.apache.org/jira/browse/ARROW-5591) - [Go] implement read/write IPC for Duration & Intervals
-* [ARROW-5597](https://issues.apache.org/jira/browse/ARROW-5597) - [Packaging][deb] Add Flight packages
-* [ARROW-5600](https://issues.apache.org/jira/browse/ARROW-5600) - [R] R package namespace cleanup
-* [ARROW-5602](https://issues.apache.org/jira/browse/ARROW-5602) - [Java][Gandiva] Add test for decimal round functions
-* [ARROW-5604](https://issues.apache.org/jira/browse/ARROW-5604) - [Go] improve test coverage of type-traits
-* [ARROW-5609](https://issues.apache.org/jira/browse/ARROW-5609) - [C++] Set CMP0068 CMake policy to avoid macOS warnings
-* [ARROW-5612](https://issues.apache.org/jira/browse/ARROW-5612) - [Python][Documentation] Clarify date\_as\_object option behavior
-* [ARROW-5621](https://issues.apache.org/jira/browse/ARROW-5621) - [Go] implement read/write IPC for Decimal128 arrays
-* [ARROW-5622](https://issues.apache.org/jira/browse/ARROW-5622) - [C++][Dataset] arrow-dataset.pc isn't provided
-* [ARROW-5625](https://issues.apache.org/jira/browse/ARROW-5625) - [R] convert Array of struct type to data frame columns
-* [ARROW-5632](https://issues.apache.org/jira/browse/ARROW-5632) - [Doc] Add some documentation describing compile/debug workflow on macOS with Xcode IDE
-* [ARROW-5633](https://issues.apache.org/jira/browse/ARROW-5633) - [Python] Enable bz2 in Linux wheels
-* [ARROW-5635](https://issues.apache.org/jira/browse/ARROW-5635) - [C++] Support "compacting" a table
-* [ARROW-5637](https://issues.apache.org/jira/browse/ARROW-5637) - [Gandiva] [Java] Complete IN Expression
-* [ARROW-5639](https://issues.apache.org/jira/browse/ARROW-5639) - [Java] Remove floating point computation from getOffsetBufferValueCapacity
-* [ARROW-5641](https://issues.apache.org/jira/browse/ARROW-5641) - [GLib] Remove enums files generated by GNU Autotools from Git targets
-* [ARROW-5643](https://issues.apache.org/jira/browse/ARROW-5643) - [Flight] Add ability to override hostname checking
-* [ARROW-5650](https://issues.apache.org/jira/browse/ARROW-5650) - [Python] Update manylinux dependency versions
-* [ARROW-5652](https://issues.apache.org/jira/browse/ARROW-5652) - [CI] Fix iwyu docker image
-* [ARROW-5656](https://issues.apache.org/jira/browse/ARROW-5656) - [Python] Enable Flight wheels on macOS
-* [ARROW-5659](https://issues.apache.org/jira/browse/ARROW-5659) - [C++] Add support for finding OpenSSL installed by Homebrew
-* [ARROW-5660](https://issues.apache.org/jira/browse/ARROW-5660) - [GLib][CI] Use the latest macOS image and all Homebrew based libraries
-* [ARROW-5661](https://issues.apache.org/jira/browse/ARROW-5661) - Support hash functions for decimal in Gandiva
-* [ARROW-5662](https://issues.apache.org/jira/browse/ARROW-5662) - [C++] Add support for BOOST\_SOURCE=AUTO|BUNDLED|SYSTEM
-* [ARROW-5663](https://issues.apache.org/jira/browse/ARROW-5663) - [Packaging][RPM] Update CentOS packages for 0.14.0
-* [ARROW-5664](https://issues.apache.org/jira/browse/ARROW-5664) - [Crossbow] Execute nightly crossbow tests on CircleCI instead of Travis
-* [ARROW-5668](https://issues.apache.org/jira/browse/ARROW-5668) - [Python] Display "not null" in Schema.\_\_repr\_\_ for non-nullable fields
-* [ARROW-5669](https://issues.apache.org/jira/browse/ARROW-5669) - [Crossbow] manylinux1 wheel building failing
-* [ARROW-5670](https://issues.apache.org/jira/browse/ARROW-5670) - [Crossbow] get\_apache\_mirror.py fails with TLS error on macOS with Python 3.5
-* [ARROW-5671](https://issues.apache.org/jira/browse/ARROW-5671) - [Crossbow] macOS Python wheels failing
-* [ARROW-5672](https://issues.apache.org/jira/browse/ARROW-5672) - [Java] Refactor redundant method modifier
-* [ARROW-5683](https://issues.apache.org/jira/browse/ARROW-5683) - [R] Add snappy to Rtools Windows builds
-* [ARROW-5684](https://issues.apache.org/jira/browse/ARROW-5684) - [Packaging][deb] Add support for Ubuntu 19.04
-* [ARROW-5685](https://issues.apache.org/jira/browse/ARROW-5685) - [Packaging][deb] Add support for Apache Arrow Datasets
-* [ARROW-5687](https://issues.apache.org/jira/browse/ARROW-5687) - [C++] Remove remaining uses of ARROW\_BOOST\_VENDORED
-* [ARROW-5690](https://issues.apache.org/jira/browse/ARROW-5690) - [Packaging][Python] macOS wheels broken: libprotobuf.18.dylib missing
-* [ARROW-5694](https://issues.apache.org/jira/browse/ARROW-5694) - [Python] List of decimals are not supported when converting to pandas
-* [ARROW-5695](https://issues.apache.org/jira/browse/ARROW-5695) - [C#][Release] Run sourcelink test in verify-release-candidate.sh
-* [ARROW-5696](https://issues.apache.org/jira/browse/ARROW-5696) - [Gandiva] [C++] Introduce castVarcharVarchar
-* [ARROW-5699](https://issues.apache.org/jira/browse/ARROW-5699) - [C++] Optimize parsing of Decimal128 in CSV
-* [ARROW-5701](https://issues.apache.org/jira/browse/ARROW-5701) - [C++][Gandiva] Build expressions only for the required selection vector types
-* [ARROW-5702](https://issues.apache.org/jira/browse/ARROW-5702) - [C++] parquet::arrow::FileReader::GetSchema()
-* [ARROW-5704](https://issues.apache.org/jira/browse/ARROW-5704) - [C++] Stop using ARROW\_TEMPLATE\_EXPORT for SparseTensorImpl class
-* [ARROW-5705](https://issues.apache.org/jira/browse/ARROW-5705) - [Java] Optimize BaseValueVector#computeCombinedBufferSize logic
-* [ARROW-5706](https://issues.apache.org/jira/browse/ARROW-5706) - [Java] Remove type conversion in getValidityBufferValueCapacity
-* [ARROW-5707](https://issues.apache.org/jira/browse/ARROW-5707) - [Java] Improve the performance and code structure for ArrowRecordBatch
-* [ARROW-5710](https://issues.apache.org/jira/browse/ARROW-5710) - [C++] Allow compiling Gandiva with Ninja on Windows
-* [ARROW-5718](https://issues.apache.org/jira/browse/ARROW-5718) - [R] auto splice data frames in record\_batch() and table()
-* [ARROW-5721](https://issues.apache.org/jira/browse/ARROW-5721) - [Rust] Move array related code into a separate module
-* [ARROW-5724](https://issues.apache.org/jira/browse/ARROW-5724) - [R] [CI] AppVeyor build should use ccache
-* [ARROW-5725](https://issues.apache.org/jira/browse/ARROW-5725) - [Crossbow] Port conda recipes to azure pipelines
-* [ARROW-5726](https://issues.apache.org/jira/browse/ARROW-5726) - [Java] Implement a common interface for int vectors
-* [ARROW-5727](https://issues.apache.org/jira/browse/ARROW-5727) - [Python] [CI] Install pytest-faulthandler before running tests
-* [ARROW-5748](https://issues.apache.org/jira/browse/ARROW-5748) - [Packaging][deb] Add support for Debian GNU/Linux buster
-* [ARROW-5749](https://issues.apache.org/jira/browse/ARROW-5749) - [Python] Add Python binding for Table::CombineChunks()
-* [ARROW-5751](https://issues.apache.org/jira/browse/ARROW-5751) - [Packaging][Python] Python macOS wheels have dynamic dependency on libcares
-* [ARROW-5752](https://issues.apache.org/jira/browse/ARROW-5752) - [Java] Improve the performance of ArrowBuf#setZero
-* [ARROW-5755](https://issues.apache.org/jira/browse/ARROW-5755) - [Rust] [Parquet] Add derived clone for Type
-* [ARROW-5768](https://issues.apache.org/jira/browse/ARROW-5768) - [Release] There are needless newlines at the end of CHANGELOG.md
-* [ARROW-5773](https://issues.apache.org/jira/browse/ARROW-5773) - [R] Clean up documentation before release
-* [ARROW-5780](https://issues.apache.org/jira/browse/ARROW-5780) - [C++] Add benchmark for Decimal128 operations
-* [ARROW-5782](https://issues.apache.org/jira/browse/ARROW-5782) - [Release] Setup test data for Flight in dev/release/01-perform.sh
-* [ARROW-5783](https://issues.apache.org/jira/browse/ARROW-5783) - [Release][C#] Exclude dummy.git from RAT check
-* [ARROW-5785](https://issues.apache.org/jira/browse/ARROW-5785) - Rust datafusion implementation should not depend on rustyline
-* [ARROW-5787](https://issues.apache.org/jira/browse/ARROW-5787) - [Release][Rust] Use local modules to verify RC
-* [ARROW-5793](https://issues.apache.org/jira/browse/ARROW-5793) - [Release] Avoid duplicate known host SSH error in dev/release/03-binary.sh
-* [ARROW-5794](https://issues.apache.org/jira/browse/ARROW-5794) - [Release] Skip uploading already uploaded binaries
-* [ARROW-5795](https://issues.apache.org/jira/browse/ARROW-5795) - [Release] Add missing waits on uploading binaries
-* [ARROW-5796](https://issues.apache.org/jira/browse/ARROW-5796) - [Release][APT] Update expected package list
-* [ARROW-5797](https://issues.apache.org/jira/browse/ARROW-5797) - [Release][APT] Update supported distributions
-* [ARROW-5813](https://issues.apache.org/jira/browse/ARROW-5813) - [C++] Support checking the equality of the different contiguous tensors
-* [ARROW-5818](https://issues.apache.org/jira/browse/ARROW-5818) - [Java][Gandiva] support varlen output vectors
-* [ARROW-5820](https://issues.apache.org/jira/browse/ARROW-5820) - [Release] Remove undefined variable check from verify script
-* [ARROW-653](https://issues.apache.org/jira/browse/ARROW-653) - [Python / C++] Add debugging function to print an array's buffer contents in hexadecimal
-* [ARROW-767](https://issues.apache.org/jira/browse/ARROW-767) - [C++] Adopt FileSystem abstraction
-* [ARROW-835](https://issues.apache.org/jira/browse/ARROW-835) - [Format] Add Timedelta type to describe time intervals
-* [ARROW-840](https://issues.apache.org/jira/browse/ARROW-840) - [Python] Provide Python API for creating user-defined data types that can survive Arrow IPC
-* [ARROW-973](https://issues.apache.org/jira/browse/ARROW-973) - [Website] Add FAQ page about project
-
-### Bug Fixes
-
-* [ARROW-1837](https://issues.apache.org/jira/browse/ARROW-1837) - [Java] Unable to read unsigned integers outside signed range for bit width in integration tests
-* [ARROW-2119](https://issues.apache.org/jira/browse/ARROW-2119) - [C++][Java] Handle Arrow stream with zero record batch
-* [ARROW-2136](https://issues.apache.org/jira/browse/ARROW-2136) - [Python] Non-nullable schema fields not checked in conversions from pandas
-* [ARROW-2256](https://issues.apache.org/jira/browse/ARROW-2256) - [C++] Fuzzer builds fail out of the box on Ubuntu 16.04 using LLVM apt repos
-* [ARROW-2461](https://issues.apache.org/jira/browse/ARROW-2461) - [Python] Build wheels for manylinux2010 tag
-* [ARROW-3344](https://issues.apache.org/jira/browse/ARROW-3344) - [Python] test\_plasma.py fails (in test\_plasma\_list)
-* [ARROW-3399](https://issues.apache.org/jira/browse/ARROW-3399) - [Python] Cannot serialize numpy matrix object
-* [ARROW-3650](https://issues.apache.org/jira/browse/ARROW-3650) - [Python] Mixed column indexes are read back as strings
-* [ARROW-3762](https://issues.apache.org/jira/browse/ARROW-3762) - [C++] Parquet arrow::Table reads error when overflowing capacity of BinaryArray
-* [ARROW-4021](https://issues.apache.org/jira/browse/ARROW-4021) - [Ruby] Error building red-arrow on msys2
-* [ARROW-4076](https://issues.apache.org/jira/browse/ARROW-4076) - [Python] schema validation and filters
-* [ARROW-4139](https://issues.apache.org/jira/browse/ARROW-4139) - [Python] Cast Parquet column statistics to unicode if UTF8 ConvertedType is set
-* [ARROW-4301](https://issues.apache.org/jira/browse/ARROW-4301) - [Java][Gandiva] Maven snapshot version update does not seem to update Gandiva submodule
-* [ARROW-4324](https://issues.apache.org/jira/browse/ARROW-4324) - [Python] Array dtype inference incorrect when created from list of mixed numpy scalars
-* [ARROW-4350](https://issues.apache.org/jira/browse/ARROW-4350) - [Python] dtype=object arrays cannot be converted to a list-of-list ListArray
-* [ARROW-4447](https://issues.apache.org/jira/browse/ARROW-4447) - [C++] Investigate dynamic linking for libthrift
-* [ARROW-4516](https://issues.apache.org/jira/browse/ARROW-4516) - [Python] Error while creating a ParquetDataset on a path without \`\_common\_dataset\` but with an empty \`\_tempfile\`
-* [ARROW-4523](https://issues.apache.org/jira/browse/ARROW-4523) - [JS] Add row proxy generation benchmark
-* [ARROW-4651](https://issues.apache.org/jira/browse/ARROW-4651) - [Format] Flight Location should be more flexible than a (host, port) pair
-* [ARROW-4675](https://issues.apache.org/jira/browse/ARROW-4675) - [Python] Error serializing bool ndarray in py2 and deserializing in py3
-* [ARROW-4694](https://issues.apache.org/jira/browse/ARROW-4694) - [CI] detect-changes.py is inconsistent
-* [ARROW-4723](https://issues.apache.org/jira/browse/ARROW-4723) - [Python] Skip \_files when reading a directory containing parquet files
-* [ARROW-4725](https://issues.apache.org/jira/browse/ARROW-4725) - [C++] Dictionary tests disabled under MinGW builds
-* [ARROW-4823](https://issues.apache.org/jira/browse/ARROW-4823) - [Python] read\_csv shouldn't close file handles it doesn't own
-* [ARROW-4845](https://issues.apache.org/jira/browse/ARROW-4845) - [R] Compiler warnings on Windows MingW64
-* [ARROW-4851](https://issues.apache.org/jira/browse/ARROW-4851) - [Java] BoundsChecking.java defaulting behavior for old drill parameter seems off
-* [ARROW-4885](https://issues.apache.org/jira/browse/ARROW-4885) - [Python] read\_csv() can't handle decimal128 columns
-* [ARROW-4886](https://issues.apache.org/jira/browse/ARROW-4886) - [Rust] Inconsistent behaviour with casting sliced primitive array to list array
-* [ARROW-4923](https://issues.apache.org/jira/browse/ARROW-4923) - Expose setters for Decimal vector that take long and double inputs
-* [ARROW-4934](https://issues.apache.org/jira/browse/ARROW-4934) - [Python] Address deprecation notice that will be a bug in Python 3.8
-* [ARROW-5019](https://issues.apache.org/jira/browse/ARROW-5019) - [C#] ArrowStreamWriter doesn't work on a non-seekable stream
-* [ARROW-5049](https://issues.apache.org/jira/browse/ARROW-5049) - [Python] org/apache/hadoop/fs/FileSystem class not found when pyarrow FileSystem used in spark
-* [ARROW-5051](https://issues.apache.org/jira/browse/ARROW-5051) - [GLib][Gandiva] Test failure in release verification script
-* [ARROW-5058](https://issues.apache.org/jira/browse/ARROW-5058) - [Release] 02-source.sh generates e-mail template with wrong links
-* [ARROW-5068](https://issues.apache.org/jira/browse/ARROW-5068) - [Gandiva][Packaging] Fix gandiva nightly builds after the CMake refactor
-* [ARROW-5090](https://issues.apache.org/jira/browse/ARROW-5090) - Parquet linking fails on MacOS due to @rpath in dylib
-* [ARROW-5092](https://issues.apache.org/jira/browse/ARROW-5092) - [C#] Source Link doesn't work with the C# release script
-* [ARROW-5095](https://issues.apache.org/jira/browse/ARROW-5095) - [Flight][C++] Flight DoGet doesn't expose server error message
-* [ARROW-5096](https://issues.apache.org/jira/browse/ARROW-5096) - [Packaging][deb] plasma-store-server packages are missing
-* [ARROW-5097](https://issues.apache.org/jira/browse/ARROW-5097) - [Packaging][CentOS6] arrow-lib has unresolvable dependencies
-* [ARROW-5098](https://issues.apache.org/jira/browse/ARROW-5098) - [Website] Update APT install document for 0.13.0
-* [ARROW-5100](https://issues.apache.org/jira/browse/ARROW-5100) - [JS] Writer swaps byte order if buffers share the same underlying ArrayBuffer
-* [ARROW-5117](https://issues.apache.org/jira/browse/ARROW-5117) - [Go] Panic when appending zero slices after initializing a builder
-* [ARROW-5119](https://issues.apache.org/jira/browse/ARROW-5119) - [Go] invalid Stringer implementation for array.Boolean
-* [ARROW-5129](https://issues.apache.org/jira/browse/ARROW-5129) - [Rust][Parquet] Column writer bug: check dictionary encoder when adding a new data page
-* [ARROW-5130](https://issues.apache.org/jira/browse/ARROW-5130) - [Python] Segfault when importing TensorFlow after Pyarrow
-* [ARROW-5132](https://issues.apache.org/jira/browse/ARROW-5132) - [Java] Errors on building gandiva\_jni.dll on Windows with Visual Studio 2017
-* [ARROW-5138](https://issues.apache.org/jira/browse/ARROW-5138) - [Python/C++] Row group retrieval doesn't restore index properly
-* [ARROW-5142](https://issues.apache.org/jira/browse/ARROW-5142) - [CI] Fix conda calls in AppVeyor scripts
-* [ARROW-5144](https://issues.apache.org/jira/browse/ARROW-5144) - [Python] ParquetDataset and ParquetPiece not serializable
-* [ARROW-5146](https://issues.apache.org/jira/browse/ARROW-5146) - [Dev] Merge script imposes directory name
-* [ARROW-5147](https://issues.apache.org/jira/browse/ARROW-5147) - [C++] get an error in building: Could NOT find DoubleConversion
-* [ARROW-5148](https://issues.apache.org/jira/browse/ARROW-5148) - [CI] [C++] LLVM-related compile errors
-* [ARROW-5149](https://issues.apache.org/jira/browse/ARROW-5149) - [Packaging][Wheel] Pin LLVM to version 7 in windows builds
-* [ARROW-5152](https://issues.apache.org/jira/browse/ARROW-5152) - [Python] CMake warnings when building
-* [ARROW-5159](https://issues.apache.org/jira/browse/ARROW-5159) - Unable to build benches in arrow crate.
-* [ARROW-5160](https://issues.apache.org/jira/browse/ARROW-5160) - [C++] ABORT\_NOT\_OK evaluates expression twice
-* [ARROW-5166](https://issues.apache.org/jira/browse/ARROW-5166) - [Python][Parquet] Statistics for uint64 columns may overflow
-* [ARROW-5167](https://issues.apache.org/jira/browse/ARROW-5167) - [C++] Upgrade string-view-light to latest
-* [ARROW-5169](https://issues.apache.org/jira/browse/ARROW-5169) - [Python] non-nullable fields are converted to nullable in Table.from\_pandas
-* [ARROW-5173](https://issues.apache.org/jira/browse/ARROW-5173) - [Go] handle multiple concatenated streams back-to-back
-* [ARROW-5174](https://issues.apache.org/jira/browse/ARROW-5174) - [Go] implement Stringer for DataTypes
-* [ARROW-5177](https://issues.apache.org/jira/browse/ARROW-5177) - [Python] ParquetReader.read\_column() doesn't check bounds
-* [ARROW-5183](https://issues.apache.org/jira/browse/ARROW-5183) - [CI] MinGW build failures on AppVeyor
-* [ARROW-5184](https://issues.apache.org/jira/browse/ARROW-5184) - [Rust] Broken links and other documentation warnings
-* [ARROW-5194](https://issues.apache.org/jira/browse/ARROW-5194) - [C++][Plasma] TEST(PlasmaSerialization, GetReply) is failing
-* [ARROW-5195](https://issues.apache.org/jira/browse/ARROW-5195) - [Python] read\_csv ignores null\_values on string types
-* [ARROW-5201](https://issues.apache.org/jira/browse/ARROW-5201) - [Python] Import ABCs from collections is deprecated in Python 3.7
-* [ARROW-5208](https://issues.apache.org/jira/browse/ARROW-5208) - [Python] Inconsistent resulting type during casting in pa.array() when mask is present
-* [ARROW-5214](https://issues.apache.org/jira/browse/ARROW-5214) - [C++] Offline dependency downloader misses some libraries
-* [ARROW-5217](https://issues.apache.org/jira/browse/ARROW-5217) - [Rust] [CI] DataFusion test failure
-* [ARROW-5232](https://issues.apache.org/jira/browse/ARROW-5232) - [Java] value vector size increases rapidly in case of clear/setSafe loop
-* [ARROW-5233](https://issues.apache.org/jira/browse/ARROW-5233) - [Go] migrate to new flatbuffers-v1.11.0
-* [ARROW-5237](https://issues.apache.org/jira/browse/ARROW-5237) - [Python] pandas\_version key in pandas metadata no longer populated
-* [ARROW-5240](https://issues.apache.org/jira/browse/ARROW-5240) - [C++][CI] cmake\_format 0.5.0 appears to fail the build
-* [ARROW-5242](https://issues.apache.org/jira/browse/ARROW-5242) - [C++] Arrow doesn't compile cleanly with Visual Studio 2017 Update 9 or later due to narrowing
-* [ARROW-5243](https://issues.apache.org/jira/browse/ARROW-5243) - [Java][Gandiva] Add test for decimal compare functions
-* [ARROW-5245](https://issues.apache.org/jira/browse/ARROW-5245) - [C++][CI] Unpin cmake\_format
-* [ARROW-5246](https://issues.apache.org/jira/browse/ARROW-5246) - [Go] use Go-1.12 in CI
-* [ARROW-5249](https://issues.apache.org/jira/browse/ARROW-5249) - [Java] Flight client doesn't handle auth correctly in some cases
-* [ARROW-5253](https://issues.apache.org/jira/browse/ARROW-5253) - [C++] external Snappy fails on Alpine
-* [ARROW-5254](https://issues.apache.org/jira/browse/ARROW-5254) - [Flight][Java] DoAction does not support result streams
-* [ARROW-5255](https://issues.apache.org/jira/browse/ARROW-5255) - [Java] Implement user-defined data types API
-* [ARROW-5260](https://issues.apache.org/jira/browse/ARROW-5260) - [Python][C++] Crash when deserializing from components in a fresh new process
-* [ARROW-5274](https://issues.apache.org/jira/browse/ARROW-5274) - [JavaScript] Wrong array type for countBy
-* [ARROW-5285](https://issues.apache.org/jira/browse/ARROW-5285) - [C++][Plasma] GpuProcessHandle is not released when GPU object deleted
-* [ARROW-5293](https://issues.apache.org/jira/browse/ARROW-5293) - [C++] Take kernel on DictionaryArray does not preserve ordered flag
-* [ARROW-5294](https://issues.apache.org/jira/browse/ARROW-5294) - [CI] setuptools\_scm failures
-* [ARROW-5296](https://issues.apache.org/jira/browse/ARROW-5296) - [Java] Sporadic Flight test failures
-* [ARROW-5301](https://issues.apache.org/jira/browse/ARROW-5301) - [Python] parquet documentation outdated on nthreads argument
-* [ARROW-5306](https://issues.apache.org/jira/browse/ARROW-5306) - [CI] [GLib] Disable GTK-Doc
-* [ARROW-5308](https://issues.apache.org/jira/browse/ARROW-5308) - [Go] remove deprecated Feather format
-* [ARROW-5314](https://issues.apache.org/jira/browse/ARROW-5314) - [Go] Incorrect Printing for String Arrays with Offsets
-* [ARROW-5325](https://issues.apache.org/jira/browse/ARROW-5325) - [Archery][Benchmark] Output properly formatted jsonlines from benchmark diff cli command
-* [ARROW-5330](https://issues.apache.org/jira/browse/ARROW-5330) - [Python] [CI] Run Python Flight tests on Travis-CI
-* [ARROW-5332](https://issues.apache.org/jira/browse/ARROW-5332) - [R] R package fails to build/install: error in dyn.load()
-* [ARROW-5348](https://issues.apache.org/jira/browse/ARROW-5348) - [CI] [Java] Gandiva checkstyle failure
-* [ARROW-5360](https://issues.apache.org/jira/browse/ARROW-5360) - [Rust] Builds are broken by rustyline on nightly 2019-05-16+
-* [ARROW-5362](https://issues.apache.org/jira/browse/ARROW-5362) - [C++] Compression round trip test can cause some sanitizers to fail
-* [ARROW-5371](https://issues.apache.org/jira/browse/ARROW-5371) - [Release] Add tests for dev/release/00-prepare.sh
-* [ARROW-5373](https://issues.apache.org/jira/browse/ARROW-5373) - [Java] Add missing details for Gandiva Java Build
-* [ARROW-5376](https://issues.apache.org/jira/browse/ARROW-5376) - [C++] Compile failure on gcc 5.4.0
-* [ARROW-5383](https://issues.apache.org/jira/browse/ARROW-5383) - [Go] update IPC flatbuf (new Duration type)
-* [ARROW-5387](https://issues.apache.org/jira/browse/ARROW-5387) - [Go] properly handle sub-slice of List
-* [ARROW-5388](https://issues.apache.org/jira/browse/ARROW-5388) - [Go] use arrow.TypeEqual in array.NewChunked
-* [ARROW-5390](https://issues.apache.org/jira/browse/ARROW-5390) - [CI] Job time limit exceeded on Travis
-* [ARROW-5397](https://issues.apache.org/jira/browse/ARROW-5397) - Test Flight TLS support
-* [ARROW-5398](https://issues.apache.org/jira/browse/ARROW-5398) - [Python] Flight tests broken by URI changes
-* [ARROW-5403](https://issues.apache.org/jira/browse/ARROW-5403) - [C++] Test failures not propagated in Windows shared builds
-* [ARROW-5411](https://issues.apache.org/jira/browse/ARROW-5411) - [C++][Python] Build error building on Mac OS Mojave
-* [ARROW-5412](https://issues.apache.org/jira/browse/ARROW-5412) - [Java] Integration test fails with UnsupportedOperationException
-* [ARROW-5419](https://issues.apache.org/jira/browse/ARROW-5419) - [C++] CSV strings\_can\_be\_null option doesn't respect all null\_values
-* [ARROW-5421](https://issues.apache.org/jira/browse/ARROW-5421) - [Packaging][Crossbow] Duplicated key in nightly test configuration
-* [ARROW-5430](https://issues.apache.org/jira/browse/ARROW-5430) - [Python] Can read but not write parquet partitioned on large ints
-* [ARROW-5435](https://issues.apache.org/jira/browse/ARROW-5435) - [Java] add test for IntervalYearVector#getAsStringBuilder
-* [ARROW-5437](https://issues.apache.org/jira/browse/ARROW-5437) - [Python] Missing pandas pytest marker from parquet tests
-* [ARROW-5446](https://issues.apache.org/jira/browse/ARROW-5446) - [C++] Use cmake header install directory instead of include
-* [ARROW-5448](https://issues.apache.org/jira/browse/ARROW-5448) - [CI] MinGW build failures on AppVeyor
-* [ARROW-5453](https://issues.apache.org/jira/browse/ARROW-5453) - [C++] Just-released cmake-format 0.5.2 breaks the build
-* [ARROW-5455](https://issues.apache.org/jira/browse/ARROW-5455) - [Rust] Build broken by 2019-05-30 Rust nightly
-* [ARROW-5456](https://issues.apache.org/jira/browse/ARROW-5456) - [GLib][Plasma] Installed plasma-glib may be used when building documentation
-* [ARROW-5457](https://issues.apache.org/jira/browse/ARROW-5457) - [GLib][Plasma] Environment variable name for test is wrong
-* [ARROW-5459](https://issues.apache.org/jira/browse/ARROW-5459) - [Go] implement Stringer for Float16 DataType
-* [ARROW-5462](https://issues.apache.org/jira/browse/ARROW-5462) - [Go] support writing zero-length List
-* [ARROW-5479](https://issues.apache.org/jira/browse/ARROW-5479) - [Rust] [DataFusion] Use ARROW\_TEST\_DATA instead of relative path for testing
-* [ARROW-5487](https://issues.apache.org/jira/browse/ARROW-5487) - [CI] [Python] Failure in docs build
-* [ARROW-5493](https://issues.apache.org/jira/browse/ARROW-5493) - [Integration/Go] add Go support for IPC integration tests
-* [ARROW-5507](https://issues.apache.org/jira/browse/ARROW-5507) - [Plasma] [CUDA] Compile error
-* [ARROW-5514](https://issues.apache.org/jira/browse/ARROW-5514) - [C++] Printer for uint64 shows wrong values
-* [ARROW-5517](https://issues.apache.org/jira/browse/ARROW-5517) - [C++] Header collection CMake logic should only consider filename without directory included
-* [ARROW-5520](https://issues.apache.org/jira/browse/ARROW-5520) - [C++][Packaging] No NVidia CUDA toolkit on AArch64
-* [ARROW-5521](https://issues.apache.org/jira/browse/ARROW-5521) - [Packaging] License check fails with Apache RAT 0.13
-* [ARROW-5528](https://issues.apache.org/jira/browse/ARROW-5528) - Concatenate() crashes when concatenating empty binary arrays.
-* [ARROW-5532](https://issues.apache.org/jira/browse/ARROW-5532) - [JS] Field Metadata Not Read
-* [ARROW-5551](https://issues.apache.org/jira/browse/ARROW-5551) - [Go] invalid FixedSizeArray representation
-* [ARROW-5553](https://issues.apache.org/jira/browse/ARROW-5553) - [Ruby] red-arrow gem does not compile on ruby:2.5 docker image
-* [ARROW-5576](https://issues.apache.org/jira/browse/ARROW-5576) - [C++] Flaky thrift\_ep tarball downloads
-* [ARROW-5577](https://issues.apache.org/jira/browse/ARROW-5577) - [C++] Link failure due to googletest shared library on Alpine Linux
-* [ARROW-5583](https://issues.apache.org/jira/browse/ARROW-5583) - [Java] When the isSet of a NullableValueHolder is 0, the buffer field should not be used
-* [ARROW-5584](https://issues.apache.org/jira/browse/ARROW-5584) - [Java] Add import for link reference in FieldReader javadoc
-* [ARROW-5589](https://issues.apache.org/jira/browse/ARROW-5589) - [C++][Fuzzing] arrow-ipc-fuzzing-test crash 2354085db0125113f04f7bd23f54b85cca104713
-* [ARROW-5592](https://issues.apache.org/jira/browse/ARROW-5592) - [Go] implement Duration array
-* [ARROW-5596](https://issues.apache.org/jira/browse/ARROW-5596) - [Python] Flight tests failing on Python 2.7
-* [ARROW-5601](https://issues.apache.org/jira/browse/ARROW-5601) - [gandiva] Error when projector with a string field
-* [ARROW-5603](https://issues.apache.org/jira/browse/ARROW-5603) - [Python] register pytest markers to avoid warnings
-* [ARROW-5605](https://issues.apache.org/jira/browse/ARROW-5605) - [C++][Fuzzing] arrow-ipc-fuzzing-test crash 74aec871d14bb6b07c72ea8f0e8c9f72cbe6b73c
-* [ARROW-5606](https://issues.apache.org/jira/browse/ARROW-5606) - [Python] pandas.RangeIndex.\_start/\_stop/\_step are deprecated
-* [ARROW-5608](https://issues.apache.org/jira/browse/ARROW-5608) - [C++][parquet] Invalid memory access when using parquet::arrow::ColumnReader
-* [ARROW-5615](https://issues.apache.org/jira/browse/ARROW-5615) - [C++] Compilation error due to C++11 string literals on gcc 5.4.0 Ubuntu 16.04
-* [ARROW-5616](https://issues.apache.org/jira/browse/ARROW-5616) - [Python] C++ build failure against Python 2.7 headers
-* [ARROW-5617](https://issues.apache.org/jira/browse/ARROW-5617) - [C++] thrift\_ep 0.12.0 fails to build when using ARROW\_BOOST\_VENDORED=ON
-* [ARROW-5619](https://issues.apache.org/jira/browse/ARROW-5619) - [C++] get\_apache\_mirror.py doesn't work with Python 3.5
-* [ARROW-5623](https://issues.apache.org/jira/browse/ARROW-5623) - [CI][GLib] Failed on macOS
-* [ARROW-5624](https://issues.apache.org/jira/browse/ARROW-5624) - [C++] -Duriparser\_SOURCE=BUNDLED is broken
-* [ARROW-5626](https://issues.apache.org/jira/browse/ARROW-5626) - [C++][Gandiva] Expression cache should consider precision and scale too
-* [ARROW-5629](https://issues.apache.org/jira/browse/ARROW-5629) - [C++] Fix Coverity issues
-* [ARROW-5631](https://issues.apache.org/jira/browse/ARROW-5631) - [C++] CMake 3.2 build is broken
-* [ARROW-5648](https://issues.apache.org/jira/browse/ARROW-5648) - [C++] Build fails on mingw without codecvt
-* [ARROW-5654](https://issues.apache.org/jira/browse/ARROW-5654) - [C++] ChunkedArray should validate the types of the arrays
-* [ARROW-5674](https://issues.apache.org/jira/browse/ARROW-5674) - [Python] Missing pandas pytest markers from test\_parquet.py
-* [ARROW-5675](https://issues.apache.org/jira/browse/ARROW-5675) - [Doc] Fix typo in documentation describing compile/debug workflow on macOS with Xcode IDE
-* [ARROW-5678](https://issues.apache.org/jira/browse/ARROW-5678) - [R][Lint] Fix hadolint docker linting error
-* [ARROW-5693](https://issues.apache.org/jira/browse/ARROW-5693) - [Go] skip IPC integration test for Decimal128
-* [ARROW-5697](https://issues.apache.org/jira/browse/ARROW-5697) - [GLib] c\_glib/Dockerfile is broken
-* [ARROW-5698](https://issues.apache.org/jira/browse/ARROW-5698) - [R] r/Dockerfile docker-compose build is broken
-* [ARROW-5709](https://issues.apache.org/jira/browse/ARROW-5709) - [C++] gandiva-date\_time\_test failure on Windows
-* [ARROW-5714](https://issues.apache.org/jira/browse/ARROW-5714) - [JS] Inconsistent behavior in Int64Builder with/without BigNum
-* [ARROW-5723](https://issues.apache.org/jira/browse/ARROW-5723) - [Gandiva][Crossbow] Builds failing
-* [ARROW-5728](https://issues.apache.org/jira/browse/ARROW-5728) - [Python] [CI] Travis-CI failures in test\_jvm.py
-* [ARROW-5730](https://issues.apache.org/jira/browse/ARROW-5730) - [Python][CI] Selectively skip test cases in the dask integration test
-* [ARROW-5732](https://issues.apache.org/jira/browse/ARROW-5732) - [C++] macOS builds failing idiosyncratically on master with warnings from pmmintrin.h
-* [ARROW-5735](https://issues.apache.org/jira/browse/ARROW-5735) - [C++] Appveyor builds failing persistently in thrift\_ep build
-* [ARROW-5737](https://issues.apache.org/jira/browse/ARROW-5737) - [C++][Gandiva] Gandiva not building in manylinux
-* [ARROW-5738](https://issues.apache.org/jira/browse/ARROW-5738) - [Crossbow][Conda] OSX package builds are failing with missing intrinsics
-* [ARROW-5739](https://issues.apache.org/jira/browse/ARROW-5739) - [CI] Fix docker python build
-* [ARROW-5750](https://issues.apache.org/jira/browse/ARROW-5750) - [Java] Java compilation failures on master
-* [ARROW-5754](https://issues.apache.org/jira/browse/ARROW-5754) - [C++] Missing override for ~GrpcStreamWriter?
-* [ARROW-5765](https://issues.apache.org/jira/browse/ARROW-5765) - [C++] TestDictionary.Validate test is crashed with release build
-* [ARROW-5769](https://issues.apache.org/jira/browse/ARROW-5769) - [Java] org.apache.arrow.flight.TestTls is failed via dev/release/00-prepare.sh
-* [ARROW-5770](https://issues.apache.org/jira/browse/ARROW-5770) - [C++] Fix -Wpessimizing-move in result.h
-* [ARROW-5771](https://issues.apache.org/jira/browse/ARROW-5771) - [Python] Docker python-nopandas job fails
-* [ARROW-5774](https://issues.apache.org/jira/browse/ARROW-5774) - [Java][Documentation] Document the need to checkout git submodules for flight
-* [ARROW-5781](https://issues.apache.org/jira/browse/ARROW-5781) - [Archery] Ensure benchmark clone accepts remotes in revision
-* [ARROW-5791](https://issues.apache.org/jira/browse/ARROW-5791) - [Python] pyarrow.csv.read\_csv hangs + eats all RAM
-* [ARROW-5816](https://issues.apache.org/jira/browse/ARROW-5816) - [Release] Parallel curl does not work reliably in verify-release-candidate.sh
-* [ARROW-5824](https://issues.apache.org/jira/browse/ARROW-5824) - [Gandiva] [C++] Fix decimal null
-* [ARROW-61](https://issues.apache.org/jira/browse/ARROW-61) - [Java] Method can return the value bigger than long MAX\_VALUE
-
-
-[1]: https://www.apache.org/dyn/closer.cgi/arrow/arrow-0.14.0/
-[2]: https://bintray.com/apache/arrow/centos/0.14.0/
-[3]: https://bintray.com/apache/arrow/debian/0.14.0/
-[4]: https://bintray.com/apache/arrow/python/0.14.0/
-[5]: https://bintray.com/apache/arrow/ubuntu/0.14.0/
-[6]: https://github.com/apache/arrow/releases/tag/apache-arrow-0.14.0
diff --git a/site/_release/0.14.1.md b/site/_release/0.14.1.md
deleted file mode 100644
index eb5d8d6..0000000
--- a/site/_release/0.14.1.md
+++ /dev/null
@@ -1,133 +0,0 @@
----
-layout: default
-title: Apache Arrow 0.14.1 Release
-permalink: /release/0.14.1.html
----
-<!--
-{% comment %}
-Licensed to the Apache Software Foundation (ASF) under one or more
-contributor license agreements.  See the NOTICE file distributed with
-this work for additional information regarding copyright ownership.
-The ASF licenses this file to you under the Apache License, Version 2.0
-(the "License"); you may not use this file except in compliance with
-the License.  You may obtain a copy of the License at
-
-http://www.apache.org/licenses/LICENSE-2.0
-
-Unless required by applicable law or agreed to in writing, software
-distributed under the License is distributed on an "AS IS" BASIS,
-WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
-See the License for the specific language governing permissions and
-limitations under the License.
-{% endcomment %}
--->
-
-# Apache Arrow 0.14.1 (22 July 2019)
-
-This is a bugfix release to address Python wheel packaging issues and Parquet
-forward compatibility problems.
-
-## Download
-
-* [**Source Artifacts**][1]
-* **Binary Artifacts**
-  * [For CentOS][2]
-  * [For Debian][3]
-  * [For Python][4]
-  * [For Ubuntu][5]
-* [Git tag][6]
-
-## Contributors
-
-This release includes 52 commits from 16 distinct contributors.
-
-```console
-$ git shortlog -sn apache-arrow-0.14.0..apache-arrow-0.14.1
-    15	Sutou Kouhei
-    10	Krisztián Szűcs
-     5	Antoine Pitrou
-     5	Wes McKinney
-     2	David Li
-     2	Eric Erhardt
-     2	Joris Van den Bossche
-     2	Micah Kornfield
-     2	Yosuke Shiro
-     1	Hatem Helal
-     1	Marco Neumann
-     1	Marius Seritan
-     1	Neal Richardson
-     1	Philipp Moritz
-     1	TP Boudreau
-     1	Uwe L. Korn
-```
-
-## Patch Committers
-
-The following Apache committers merged contributed patches to the repository.
-
-```console
-$ git shortlog -csn apache-arrow-0.14.0..apache-arrow-0.14.1
-    43	Wes McKinney
-     9	Krisztián Szűcs
-```
-
-## Changelog
-
-### New Features and Improvements
-
-* [ARROW-5351](https://issues.apache.org/jira/browse/ARROW-5351) - [Rust] Add support for take kernel functions
-* [ARROW-5380](https://issues.apache.org/jira/browse/ARROW-5380) - [C++] Fix and enable UBSan for unaligned accesses.
-* [ARROW-5564](https://issues.apache.org/jira/browse/ARROW-5564) - [C++] Add uriparser to conda-forge
-* [ARROW-5609](https://issues.apache.org/jira/browse/ARROW-5609) - [C++] Set CMP0068 CMake policy to avoid macOS warnings
-* [ARROW-5784](https://issues.apache.org/jira/browse/ARROW-5784) - [Release][GLib] Replace c\_glib/ after running c\_glib/autogen.sh in dev/release/02-source.sh
-* [ARROW-5785](https://issues.apache.org/jira/browse/ARROW-5785) - [Rust] Rust datafusion implementation should not depend on rustyline
-* [ARROW-5787](https://issues.apache.org/jira/browse/ARROW-5787) - [Release][Rust] Use local modules to verify RC
-* [ARROW-5793](https://issues.apache.org/jira/browse/ARROW-5793) - [Release] Avoid duplicate known host SSH error in dev/release/03-binary.sh
-* [ARROW-5794](https://issues.apache.org/jira/browse/ARROW-5794) - [Release] Skip uploading already uploaded binaries
-* [ARROW-5795](https://issues.apache.org/jira/browse/ARROW-5795) - [Release] Add missing waits on uploading binaries
-* [ARROW-5796](https://issues.apache.org/jira/browse/ARROW-5796) - [Release][APT] Update expected package list
-* [ARROW-5797](https://issues.apache.org/jira/browse/ARROW-5797) - [Release][APT] Update supported distributions
-* [ARROW-5820](https://issues.apache.org/jira/browse/ARROW-5820) - [Release] Remove undefined variable check from verify script
-* [ARROW-5827](https://issues.apache.org/jira/browse/ARROW-5827) - [C++] Require c-ares CMake config
-* [ARROW-5828](https://issues.apache.org/jira/browse/ARROW-5828) - [C++] Add Protocol Buffers version check
-* [ARROW-5866](https://issues.apache.org/jira/browse/ARROW-5866) - [C++] Remove duplicate library in cpp/Brewfile
-* [ARROW-5877](https://issues.apache.org/jira/browse/ARROW-5877) - [FlightRPC] Fix auth incompatibilities between Python/Java
-* [ARROW-5904](https://issues.apache.org/jira/browse/ARROW-5904) - [Java] [Plasma] Fix compilation of Plasma Java client
-* [ARROW-5908](https://issues.apache.org/jira/browse/ARROW-5908) - [C#] ArrowStreamWriter doesn't align buffers to 8 bytes
-* [ARROW-5934](https://issues.apache.org/jira/browse/ARROW-5934) - [Python] Bundle arrow's LICENSE with the wheels
-* [ARROW-5937](https://issues.apache.org/jira/browse/ARROW-5937) - [Release] Stop parallel binary upload
-* [ARROW-5938](https://issues.apache.org/jira/browse/ARROW-5938) - [Release] Create branch for adding release note automatically
-* [ARROW-5939](https://issues.apache.org/jira/browse/ARROW-5939) - [Release] Add support for generating vote email template separately
-* [ARROW-5940](https://issues.apache.org/jira/browse/ARROW-5940) - [Release] Add support for re-uploading sign/checksum for binary artifacts
-* [ARROW-5941](https://issues.apache.org/jira/browse/ARROW-5941) - [Release] Avoid re-uploading already uploaded binary artifacts
-* [ARROW-5958](https://issues.apache.org/jira/browse/ARROW-5958) - [Python] Link zlib statically in the wheels
-
-### Bug Fixes
-
-* [ARROW-5775](https://issues.apache.org/jira/browse/ARROW-5775) - [C++] StructArray : cached boxed fields not thread-safe
-* [ARROW-5790](https://issues.apache.org/jira/browse/ARROW-5790) - [Python] Passing zero-dim numpy array to pa.array causes segfault
-* [ARROW-5791](https://issues.apache.org/jira/browse/ARROW-5791) - [Python] pyarrow.csv.read\_csv hangs + eats all RAM
-* [ARROW-5816](https://issues.apache.org/jira/browse/ARROW-5816) - [Release] Parallel curl does not work reliably in verify-release-candidate.sh
-* [ARROW-5836](https://issues.apache.org/jira/browse/ARROW-5836) - [Java][OSX] Flight tests are failing: address already in use
-* [ARROW-5838](https://issues.apache.org/jira/browse/ARROW-5838) - [C++][Flight][OSX] Building 3rdparty grpc cannot find OpenSSL
-* [ARROW-5849](https://issues.apache.org/jira/browse/ARROW-5849) - [C++] Compiler warnings on mingw-w64
-* [ARROW-5851](https://issues.apache.org/jira/browse/ARROW-5851) - [C++] Compilation of reference benchmarks fails
-* [ARROW-5856](https://issues.apache.org/jira/browse/ARROW-5856) - [Python] linking 3rd party cython modules against pyarrow fails since 0.14.0
-* [ARROW-5863](https://issues.apache.org/jira/browse/ARROW-5863) - [Python] Segmentation Fault via pytest-runner
-* [ARROW-5868](https://issues.apache.org/jira/browse/ARROW-5868) - [Python] manylinux2010 wheels have shared library dependency on liblz4
-* [ARROW-5873](https://issues.apache.org/jira/browse/ARROW-5873) - [Python] Segmentation fault when comparing schema with None
-* [ARROW-5874](https://issues.apache.org/jira/browse/ARROW-5874) - [Python] pyarrow 0.14.0 macOS wheels depend on shared libs under /usr/local/opt
-* [ARROW-5878](https://issues.apache.org/jira/browse/ARROW-5878) - [Python][C++] Parquet reader not forward compatible for timestamps without timezone
-* [ARROW-5886](https://issues.apache.org/jira/browse/ARROW-5886) - [Python][Packaging] Manylinux1/2010 compliance issue with libz
-* [ARROW-5887](https://issues.apache.org/jira/browse/ARROW-5887) - [C#] ArrowStreamWriter writes FieldNodes in wrong order
-* [ARROW-5889](https://issues.apache.org/jira/browse/ARROW-5889) - [Python][C++] Parquet backwards compat for timestamps without timezone broken
-* [ARROW-5899](https://issues.apache.org/jira/browse/ARROW-5899) - [Python][Packaging] Bundle uriparser.dll in windows wheels
-* [ARROW-5921](https://issues.apache.org/jira/browse/ARROW-5921) - [C++][Fuzzing] Missing nullptr checks in IPC
-
-
-[1]: https://www.apache.org/dyn/closer.cgi/arrow/arrow-0.14.1/
-[2]: https://bintray.com/apache/arrow/centos/0.14.1/
-[3]: https://bintray.com/apache/arrow/debian/0.14.1/
-[4]: https://bintray.com/apache/arrow/python/0.14.1/
-[5]: https://bintray.com/apache/arrow/ubuntu/0.14.1/
-[6]: https://github.com/apache/arrow/releases/tag/apache-arrow-0.14.1
diff --git a/site/_release/0.2.0.md b/site/_release/0.2.0.md
deleted file mode 100644
index 72bce7d..0000000
--- a/site/_release/0.2.0.md
+++ /dev/null
@@ -1,255 +0,0 @@
----
-layout: default
-title: 0.2.0 Release
-permalink: /release/0.2.0.html
----
-<!--
-{% comment %}
-Licensed to the Apache Software Foundation (ASF) under one or more
-contributor license agreements.  See the NOTICE file distributed with
-this work for additional information regarding copyright ownership.
-The ASF licenses this file to you under the Apache License, Version 2.0
-(the "License"); you may not use this file except in compliance with
-the License.  You may obtain a copy of the License at
-
-http://www.apache.org/licenses/LICENSE-2.0
-
-Unless required by applicable law or agreed to in writing, software
-distributed under the License is distributed on an "AS IS" BASIS,
-WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
-See the License for the specific language governing permissions and
-limitations under the License.
-{% endcomment %}
--->
-
-# Apache Arrow 0.2.0 (18 February 2017)
-
-## Download
-
-* [**Source Artifacts**][6]
-* [Git tag][2]
-
-# Changelog
-
-## Contributors
-
-```shell
-$ git shortlog -sn apache-arrow-0.1.0..apache-arrow-0.2.0
-    73  Wes McKinney
-    55  Uwe L. Korn
-    16  Julien Le Dem
-     4  Bryan Cutler
-     4  Nong Li
-     2  Christopher C. Aycock
-     2  Jingyuan Wang
-     2  Kouhei Sutou
-     2  Laurent Goujon
-     2  Leif Walsh
-     1  Emilio Lahr-Vivaz
-     1  Holden Karau
-     1  Li Jin
-     1  Mohamed Zenadi
-     1  Peter Hoffmann
-     1  Steven Phillips
-     1  adeneche
-     1  ahnj
-     1  vkorukanti
-```
-
-## New Features and Improvements
-
-* [ARROW-108](https://issues.apache.org/jira/browse/ARROW-108) - [C++] Add IPC round trip for union types
-* [ARROW-189](https://issues.apache.org/jira/browse/ARROW-189) - C++: Use ExternalProject to build thirdparty dependencies
-* [ARROW-191](https://issues.apache.org/jira/browse/ARROW-191) - Python: Provide infrastructure for manylinux1 wheels
-* [ARROW-221](https://issues.apache.org/jira/browse/ARROW-221) - Add switch for writing Parquet 1.0 compatible logical types
-* [ARROW-227](https://issues.apache.org/jira/browse/ARROW-227) - [C++/Python] Hook arrow_io generic reader / writer interface into arrow_parquet
-* [ARROW-228](https://issues.apache.org/jira/browse/ARROW-228) - [Python] Create an Arrow-cpp-compatible interface for reading bytes from Python file-like objects
-* [ARROW-243](https://issues.apache.org/jira/browse/ARROW-243) - [C++] Add "driver" option to HdfsClient to choose between libhdfs and libhdfs3 at runtime
-* [ARROW-268](https://issues.apache.org/jira/browse/ARROW-268) - [C++] Flesh out union implementation to have all required methods for IPC
-* [ARROW-303](https://issues.apache.org/jira/browse/ARROW-303) - [C++] Also build static libraries for leaf libraries
-* [ARROW-312](https://issues.apache.org/jira/browse/ARROW-312) - [Python] Provide Python API to read/write the Arrow IPC file format
-* [ARROW-317](https://issues.apache.org/jira/browse/ARROW-317) - [C++] Implement zero-copy Slice method on arrow::Buffer that retains reference to parent
-* [ARROW-327](https://issues.apache.org/jira/browse/ARROW-327) - [Python] Remove conda builds from Travis CI processes
-* [ARROW-328](https://issues.apache.org/jira/browse/ARROW-328) - [C++] Return shared_ptr by value instead of const-ref?
-* [ARROW-33](https://issues.apache.org/jira/browse/ARROW-33) - C++: Implement zero-copy array slicing
-* [ARROW-330](https://issues.apache.org/jira/browse/ARROW-330) - [C++] CMake functions to simplify shared / static library configuration
-* [ARROW-332](https://issues.apache.org/jira/browse/ARROW-332) - [Python] Add helper function to convert RecordBatch to pandas.DataFrame
-* [ARROW-333](https://issues.apache.org/jira/browse/ARROW-333) - Make writers update their internal schema even when no data is written.
-* [ARROW-335](https://issues.apache.org/jira/browse/ARROW-335) - Improve Type apis and toString() by encapsulating flatbuffers better
-* [ARROW-336](https://issues.apache.org/jira/browse/ARROW-336) - Run Apache Rat in Travis builds
-* [ARROW-338](https://issues.apache.org/jira/browse/ARROW-338) - [C++] Refactor IPC vector "loading" and "unloading" to be based on cleaner visitor pattern
-* [ARROW-350](https://issues.apache.org/jira/browse/ARROW-350) - Add Kerberos support to HDFS shim
-* [ARROW-353](https://issues.apache.org/jira/browse/ARROW-353) - Arrow release 0.2
-* [ARROW-355](https://issues.apache.org/jira/browse/ARROW-355) - Add tests for serialising arrays of empty strings to Parquet
-* [ARROW-356](https://issues.apache.org/jira/browse/ARROW-356) - Add documentation about reading Parquet
-* [ARROW-359](https://issues.apache.org/jira/browse/ARROW-359) - Need to document ARROW_LIBHDFS_DIR
-* [ARROW-360](https://issues.apache.org/jira/browse/ARROW-360) - C++: Add method to shrink PoolBuffer using realloc
-* [ARROW-361](https://issues.apache.org/jira/browse/ARROW-361) - Python: Support reading a column-selection from Parquet files
-* [ARROW-363](https://issues.apache.org/jira/browse/ARROW-363) - Set up Java/C++ integration test harness
-* [ARROW-365](https://issues.apache.org/jira/browse/ARROW-365) - Python: Provide Array.to_pandas()
-* [ARROW-366](https://issues.apache.org/jira/browse/ARROW-366) - [java] implement Dictionary vector
-* [ARROW-367](https://issues.apache.org/jira/browse/ARROW-367) - [java] converter csv/json <=> Arrow file format for Integration tests
-* [ARROW-368](https://issues.apache.org/jira/browse/ARROW-368) - Document use of LD_LIBRARY_PATH when using Python
-* [ARROW-369](https://issues.apache.org/jira/browse/ARROW-369) - [Python] Add ability to convert multiple record batches at once to pandas
-* [ARROW-372](https://issues.apache.org/jira/browse/ARROW-372) - Create JSON arrow file format for integration tests
-* [ARROW-373](https://issues.apache.org/jira/browse/ARROW-373) - [C++] Implement C++ version of JSON file format for testing
-* [ARROW-374](https://issues.apache.org/jira/browse/ARROW-374) - Python: clarify unicode vs. binary in API
-* [ARROW-377](https://issues.apache.org/jira/browse/ARROW-377) - Python: Add support for conversion of Pandas.Categorical
-* [ARROW-379](https://issues.apache.org/jira/browse/ARROW-379) - Python: Use setuptools_scm/setuptools_scm_git_archive to provide the version number
-* [ARROW-380](https://issues.apache.org/jira/browse/ARROW-380) - [Java] optimize null count when serializing vectors.
-* [ARROW-381](https://issues.apache.org/jira/browse/ARROW-381) - [C++] Simplify primitive array type builders to use a default type singleton
-* [ARROW-382](https://issues.apache.org/jira/browse/ARROW-382) - Python: Extend API documentation
-* [ARROW-383](https://issues.apache.org/jira/browse/ARROW-383) - [C++] Implement C++ version of ARROW-367 integration test validator
-* [ARROW-389](https://issues.apache.org/jira/browse/ARROW-389) - Python: Write Parquet files to pyarrow.io.NativeFile objects
-* [ARROW-394](https://issues.apache.org/jira/browse/ARROW-394) - Add integration tests for boolean, list, struct, and other basic types
-* [ARROW-396](https://issues.apache.org/jira/browse/ARROW-396) - Python: Add pyarrow.schema.Schema.equals
-* [ARROW-409](https://issues.apache.org/jira/browse/ARROW-409) - Python: Change pyarrow.Table.dataframe_from_batches API to create Table instead
-* [ARROW-410](https://issues.apache.org/jira/browse/ARROW-410) - [C++] Add Flush method to arrow::io::OutputStream
-* [ARROW-411](https://issues.apache.org/jira/browse/ARROW-411) - [Java] Move Intergration.compare and Intergration.compareSchemas to a public utils class
-* [ARROW-415](https://issues.apache.org/jira/browse/ARROW-415) - C++: Add Equals implementation to compare Tables
-* [ARROW-416](https://issues.apache.org/jira/browse/ARROW-416) - C++: Add Equals implementation to compare Columns
-* [ARROW-417](https://issues.apache.org/jira/browse/ARROW-417) - C++: Add Equals implementation to compare ChunkedArrays
-* [ARROW-418](https://issues.apache.org/jira/browse/ARROW-418) - [C++] Consolidate array container and builder code, remove arrow/types
-* [ARROW-419](https://issues.apache.org/jira/browse/ARROW-419) - [C++] Promote util/{status.h, buffer.h, memory-pool.h} to top level of arrow/ source directory
-* [ARROW-423](https://issues.apache.org/jira/browse/ARROW-423) - C++: Define BUILD_BYPRODUCTS in external project to support non-make CMake generators
-* [ARROW-425](https://issues.apache.org/jira/browse/ARROW-425) - Python: Expose a C function to convert arrow::Table to pyarrow.Table
-* [ARROW-426](https://issues.apache.org/jira/browse/ARROW-426) - Python: Conversion from pyarrow.Array to a Python list
-* [ARROW-427](https://issues.apache.org/jira/browse/ARROW-427) - [C++] Implement dictionary-encoded array container
-* [ARROW-428](https://issues.apache.org/jira/browse/ARROW-428) - [Python] Deserialize from Arrow record batches to pandas in parallel using a thread pool
-* [ARROW-430](https://issues.apache.org/jira/browse/ARROW-430) - Python: Better version handling
-* [ARROW-432](https://issues.apache.org/jira/browse/ARROW-432) - [Python] Avoid unnecessary memory copy in to_pandas conversion by using low-level pandas internals APIs
-* [ARROW-438](https://issues.apache.org/jira/browse/ARROW-438) - [Python] Concatenate Table instances with equal schemas
-* [ARROW-440](https://issues.apache.org/jira/browse/ARROW-440) - [C++] Support pkg-config
-* [ARROW-441](https://issues.apache.org/jira/browse/ARROW-441) - [Python] Expose Arrow's file and memory map classes as NativeFile subclasses
-* [ARROW-442](https://issues.apache.org/jira/browse/ARROW-442) - [Python] Add public Python API to inspect Parquet file metadata
-* [ARROW-444](https://issues.apache.org/jira/browse/ARROW-444) - [Python] Avoid unnecessary memory copies from use of PyBytes_* C APIs
-* [ARROW-449](https://issues.apache.org/jira/browse/ARROW-449) - Python: Conversion from pyarrow.{Table,RecordBatch} to a Python dict
-* [ARROW-450](https://issues.apache.org/jira/browse/ARROW-450) - Python: Fixes for PARQUET-818
-* [ARROW-456](https://issues.apache.org/jira/browse/ARROW-456) - C++: Add jemalloc based MemoryPool
-* [ARROW-457](https://issues.apache.org/jira/browse/ARROW-457) - Python: Better control over memory pool
-* [ARROW-458](https://issues.apache.org/jira/browse/ARROW-458) - Python: Expose jemalloc MemoryPool
-* [ARROW-461](https://issues.apache.org/jira/browse/ARROW-461) - [Python] Implement conversion between arrow::DictionaryArray and pandas.Categorical
-* [ARROW-463](https://issues.apache.org/jira/browse/ARROW-463) - C++: Support jemalloc 4.x
-* [ARROW-466](https://issues.apache.org/jira/browse/ARROW-466) - C++: ExternalProject for jemalloc
-* [ARROW-467](https://issues.apache.org/jira/browse/ARROW-467) - [Python] Run parquet-cpp unit tests in Travis CI
-* [ARROW-468](https://issues.apache.org/jira/browse/ARROW-468) - Python: Conversion of nested data in pd.DataFrames to/from Arrow structures
-* [ARROW-470](https://issues.apache.org/jira/browse/ARROW-470) - [Python] Add "FileSystem" abstraction to access directories of files in a uniform way
-* [ARROW-471](https://issues.apache.org/jira/browse/ARROW-471) - [Python] Enable ParquetFile to pass down separately-obtained file metadata
-* [ARROW-472](https://issues.apache.org/jira/browse/ARROW-472) - [Python] Expose parquet::{SchemaDescriptor, ColumnDescriptor}::Equals
-* [ARROW-474](https://issues.apache.org/jira/browse/ARROW-474) - Create an Arrow streaming file format
-* [ARROW-475](https://issues.apache.org/jira/browse/ARROW-475) - [Python] High level support for reading directories of Parquet files (as a single Arrow table) from supported file system interfaces
-* [ARROW-476](https://issues.apache.org/jira/browse/ARROW-476) - [Integration] Add integration tests for Binary / Varbytes type
-* [ARROW-477](https://issues.apache.org/jira/browse/ARROW-477) - [Java] Add support for second/microsecond/nanosecond timestamps in-memory and in IPC/JSON layer
-* [ARROW-478](https://issues.apache.org/jira/browse/ARROW-478) - [Python] Accept a PyBytes object in the pyarrow.io.BufferReader ctor
-* [ARROW-479](https://issues.apache.org/jira/browse/ARROW-479) - Python: Test for expected schema in Pandas conversion
-* [ARROW-484](https://issues.apache.org/jira/browse/ARROW-484) - Add more detail about what of technology can be found in the Arrow implementations to README
-* [ARROW-485](https://issues.apache.org/jira/browse/ARROW-485) - [Java] Users are required to initialize VariableLengthVectors.offsetVector before calling VariableLengthVectors.mutator.getSafe
-* [ARROW-490](https://issues.apache.org/jira/browse/ARROW-490) - Python: Update manylinux1 build scripts
-* [ARROW-495](https://issues.apache.org/jira/browse/ARROW-495) - [C++] Add C++ implementation of streaming serialized format
-* [ARROW-497](https://issues.apache.org/jira/browse/ARROW-497) - [Java] Integration test harness for streaming format
-* [ARROW-498](https://issues.apache.org/jira/browse/ARROW-498) - [C++] Integration test harness for streaming format
-* [ARROW-503](https://issues.apache.org/jira/browse/ARROW-503) - [Python] Interface to streaming binary format
-* [ARROW-506](https://issues.apache.org/jira/browse/ARROW-506) - Implement Arrow Echo server for integration testing
-* [ARROW-508](https://issues.apache.org/jira/browse/ARROW-508) - [C++] Make file/memory-mapped file interfaces threadsafe
-* [ARROW-509](https://issues.apache.org/jira/browse/ARROW-509) - [Python] Add support for PARQUET-835 (parallel column reads)
-* [ARROW-512](https://issues.apache.org/jira/browse/ARROW-512) - C++: Add method to check for primitive types
-* [ARROW-514](https://issues.apache.org/jira/browse/ARROW-514) - [Python] Accept pyarrow.io.Buffer as input to StreamReader, FileReader classes
-* [ARROW-515](https://issues.apache.org/jira/browse/ARROW-515) - [Python] Add StreamReader/FileReader methods that read all record batches as a Table
-* [ARROW-521](https://issues.apache.org/jira/browse/ARROW-521) - [C++/Python] Track peak memory use in default MemoryPool
-* [ARROW-524](https://issues.apache.org/jira/browse/ARROW-524) - [java] provide apis to access nested vectors and buffers
-* [ARROW-525](https://issues.apache.org/jira/browse/ARROW-525) - Python: Add more documentation to the package
-* [ARROW-527](https://issues.apache.org/jira/browse/ARROW-527) - clean drill-module.conf file
-* [ARROW-529](https://issues.apache.org/jira/browse/ARROW-529) - Python: Add jemalloc and Python 3.6 to manylinux1 build
-* [ARROW-531](https://issues.apache.org/jira/browse/ARROW-531) - Python: Document jemalloc, extend Pandas section, add Getting Involved
-* [ARROW-538](https://issues.apache.org/jira/browse/ARROW-538) - [C++] Set up AddressSanitizer (ASAN) builds
-* [ARROW-546](https://issues.apache.org/jira/browse/ARROW-546) - Python: Account for changes in PARQUET-867
-* [ARROW-547](https://issues.apache.org/jira/browse/ARROW-547) - [Python] Expose Array::Slice and RecordBatch::Slice
-* [ARROW-553](https://issues.apache.org/jira/browse/ARROW-553) - C++: Faster valid bitmap building
-* [ARROW-558](https://issues.apache.org/jira/browse/ARROW-558) - Add KEYS files
-* [ARROW-81](https://issues.apache.org/jira/browse/ARROW-81) - [Format] Add a Category logical type (distinct from dictionary-encoding)
-* [ARROW-96](https://issues.apache.org/jira/browse/ARROW-96) - C++: API documentation using Doxygen
-* [ARROW-97](https://issues.apache.org/jira/browse/ARROW-97) - Python: API documentation via sphinx-apidoc
-
-## Bug Fixes
-
-* [ARROW-112](https://issues.apache.org/jira/browse/ARROW-112) - [C++] Style fix for constants/enums
-* [ARROW-202](https://issues.apache.org/jira/browse/ARROW-202) - [C++] Integrate with appveyor ci for windows support and get arrow building on windows
-* [ARROW-220](https://issues.apache.org/jira/browse/ARROW-220) - [C++] Build conda artifacts in a build environment with better cross-linux ABI compatibility
-* [ARROW-224](https://issues.apache.org/jira/browse/ARROW-224) - [C++] Address static linking of boost dependencies
-* [ARROW-230](https://issues.apache.org/jira/browse/ARROW-230) - Python: Do not name modules like native ones (i.e. rename pyarrow.io)
-* [ARROW-239](https://issues.apache.org/jira/browse/ARROW-239) - [Python] HdfsFile.read called with no arguments should read remainder of file
-* [ARROW-261](https://issues.apache.org/jira/browse/ARROW-261) - [C++] Refactor BinaryArray/StringArray classes to not inherit from ListArray
-* [ARROW-275](https://issues.apache.org/jira/browse/ARROW-275) - Add tests for UnionVector in Arrow File
-* [ARROW-294](https://issues.apache.org/jira/browse/ARROW-294) - [C++] Do not use fopen / fclose / etc. methods for memory mapped file implementation
-* [ARROW-322](https://issues.apache.org/jira/browse/ARROW-322) - [C++] Do not build HDFS IO interface optionally
-* [ARROW-323](https://issues.apache.org/jira/browse/ARROW-323) - [Python] Opt-in to PyArrow parquet build rather than skipping silently on failure
-* [ARROW-334](https://issues.apache.org/jira/browse/ARROW-334) - [Python] OS X rpath issues on some configurations
-* [ARROW-337](https://issues.apache.org/jira/browse/ARROW-337) - UnionListWriter.list() is doing more than it should, this can cause data corruption
-* [ARROW-339](https://issues.apache.org/jira/browse/ARROW-339) - Make merge_arrow_pr script work with Python 3
-* [ARROW-340](https://issues.apache.org/jira/browse/ARROW-340) - [C++] Opening a writeable file on disk that already exists does not truncate to zero
-* [ARROW-342](https://issues.apache.org/jira/browse/ARROW-342) - Set Python version on release
-* [ARROW-345](https://issues.apache.org/jira/browse/ARROW-345) - libhdfs integration doesn't work for Mac
-* [ARROW-346](https://issues.apache.org/jira/browse/ARROW-346) - Python API Documentation
-* [ARROW-348](https://issues.apache.org/jira/browse/ARROW-348) - [Python] CMake build type should be configurable on the command line
-* [ARROW-349](https://issues.apache.org/jira/browse/ARROW-349) - Six is missing as a requirement in the python setup.py
-* [ARROW-351](https://issues.apache.org/jira/browse/ARROW-351) - Time type has no unit
-* [ARROW-354](https://issues.apache.org/jira/browse/ARROW-354) - Cannot compare an array of empty strings to another
-* [ARROW-357](https://issues.apache.org/jira/browse/ARROW-357) - Default Parquet chunk_size of 64k is too small
-* [ARROW-358](https://issues.apache.org/jira/browse/ARROW-358) - [C++] libhdfs can be in non-standard locations in some Hadoop distributions
-* [ARROW-362](https://issues.apache.org/jira/browse/ARROW-362) - Python: Calling to_pandas on a table read from Parquet leaks memory
-* [ARROW-371](https://issues.apache.org/jira/browse/ARROW-371) - Python: Table with null timestamp becomes float in pandas
-* [ARROW-375](https://issues.apache.org/jira/browse/ARROW-375) - columns parameter in parquet.read_table() raises KeyError for valid column
-* [ARROW-384](https://issues.apache.org/jira/browse/ARROW-384) - Align Java and C++ RecordBatch data and metadata layout
-* [ARROW-386](https://issues.apache.org/jira/browse/ARROW-386) - [Java] Respect case of struct / map field names
-* [ARROW-387](https://issues.apache.org/jira/browse/ARROW-387) - [C++] arrow::io::BufferReader does not permit shared memory ownership in zero-copy reads
-* [ARROW-390](https://issues.apache.org/jira/browse/ARROW-390) - C++: CMake fails on json-integration-test with ARROW_BUILD_TESTS=OFF
-* [ARROW-392](https://issues.apache.org/jira/browse/ARROW-392) - Fix string/binary integration tests
-* [ARROW-393](https://issues.apache.org/jira/browse/ARROW-393) - [JAVA] JSON file reader fails to set the buffer size on String data vector
-* [ARROW-395](https://issues.apache.org/jira/browse/ARROW-395) - Arrow file format writes record batches in reverse order.
-* [ARROW-398](https://issues.apache.org/jira/browse/ARROW-398) - [Java] Java file format requires bitmaps of all 1's to be written when there are no nulls
-* [ARROW-399](https://issues.apache.org/jira/browse/ARROW-399) - [Java] ListVector.loadFieldBuffers ignores the ArrowFieldNode length metadata
-* [ARROW-400](https://issues.apache.org/jira/browse/ARROW-400) - [Java] ArrowWriter writes length 0 for Struct types
-* [ARROW-401](https://issues.apache.org/jira/browse/ARROW-401) - [Java] Floating point vectors should do an approximate comparison in integration tests
-* [ARROW-402](https://issues.apache.org/jira/browse/ARROW-402) - [Java] "refCnt gone negative" error in integration tests
-* [ARROW-403](https://issues.apache.org/jira/browse/ARROW-403) - [JAVA] UnionVector: Creating a transfer pair doesn't transfer the schema to destination vector
-* [ARROW-404](https://issues.apache.org/jira/browse/ARROW-404) - [Python] Closing an HdfsClient while there are still open file handles results in a crash
-* [ARROW-405](https://issues.apache.org/jira/browse/ARROW-405) - [C++] Be less stringent about finding include/hdfs.h in HADOOP_HOME
-* [ARROW-406](https://issues.apache.org/jira/browse/ARROW-406) - [C++] Large HDFS reads must utilize the set file buffer size when making RPCs
-* [ARROW-408](https://issues.apache.org/jira/browse/ARROW-408) - [C++/Python] Remove defunct conda recipes
-* [ARROW-414](https://issues.apache.org/jira/browse/ARROW-414) - [Java] "Buffer too large to resize to ..." error
-* [ARROW-420](https://issues.apache.org/jira/browse/ARROW-420) - Align Date implementation between Java and C++
-* [ARROW-421](https://issues.apache.org/jira/browse/ARROW-421) - [Python] Zero-copy buffers read by pyarrow::PyBytesReader must retain a reference to the parent PyBytes to avoid premature garbage collection issues
-* [ARROW-422](https://issues.apache.org/jira/browse/ARROW-422) - C++: IPC should depend on rapidjson_ep if RapidJSON is vendored
-* [ARROW-429](https://issues.apache.org/jira/browse/ARROW-429) - git-archive SHA-256 checksums are changing
-* [ARROW-433](https://issues.apache.org/jira/browse/ARROW-433) - [Python] Date conversion is locale-dependent
-* [ARROW-434](https://issues.apache.org/jira/browse/ARROW-434) - Segfaults and encoding issues in Python Parquet reads
-* [ARROW-435](https://issues.apache.org/jira/browse/ARROW-435) - C++: Spelling mistake in if(RAPIDJSON_VENDORED)
-* [ARROW-437](https://issues.apache.org/jira/browse/ARROW-437) - [C++] clang compiler warnings from overridden virtual functions
-* [ARROW-445](https://issues.apache.org/jira/browse/ARROW-445) - C++: arrow_ipc is built before arrow/ipc/Message_generated.h was generated
-* [ARROW-447](https://issues.apache.org/jira/browse/ARROW-447) - Python: Align scalar/pylist string encoding with pandas' one.
-* [ARROW-455](https://issues.apache.org/jira/browse/ARROW-455) - [C++] BufferOutputStream dtor does not call Close()
-* [ARROW-469](https://issues.apache.org/jira/browse/ARROW-469) - C++: Add option so that resize doesn't decrease the capacity
-* [ARROW-481](https://issues.apache.org/jira/browse/ARROW-481) - [Python] Fix Python 2.7 regression in patch for PARQUET-472
-* [ARROW-486](https://issues.apache.org/jira/browse/ARROW-486) - [C++] arrow::io::MemoryMappedFile can't be casted to arrow::io::FileInterface
-* [ARROW-487](https://issues.apache.org/jira/browse/ARROW-487) - Python: ConvertTableToPandas segfaults if ObjectBlock::Write fails
-* [ARROW-494](https://issues.apache.org/jira/browse/ARROW-494) - [C++] When MemoryMappedFile is destructed, memory is unmapped even if buffer references still exist
-* [ARROW-499](https://issues.apache.org/jira/browse/ARROW-499) - Update file serialization to use streaming serialization format
-* [ARROW-505](https://issues.apache.org/jira/browse/ARROW-505) - [C++] Fix compiler warnings in release mode
-* [ARROW-511](https://issues.apache.org/jira/browse/ARROW-511) - [Python] List[T] conversions not implemented for single arrays
-* [ARROW-513](https://issues.apache.org/jira/browse/ARROW-513) - [C++] Fix Appveyor build
-* [ARROW-519](https://issues.apache.org/jira/browse/ARROW-519) - [C++] Missing vtable in libarrow.dylib on Xcode 6.4
-* [ARROW-523](https://issues.apache.org/jira/browse/ARROW-523) - Python: Account for changes in PARQUET-834
-* [ARROW-533](https://issues.apache.org/jira/browse/ARROW-533) - [C++] arrow::TimestampArray / TimeArray has a broken constructor
-* [ARROW-535](https://issues.apache.org/jira/browse/ARROW-535) - [Python] Add type mapping for NPY_LONGLONG
-* [ARROW-537](https://issues.apache.org/jira/browse/ARROW-537) - [C++] StringArray/BinaryArray comparisons may be incorrect when values with non-zero length are null
-* [ARROW-540](https://issues.apache.org/jira/browse/ARROW-540) - [C++] Fix build in aftermath of ARROW-33
-* [ARROW-543](https://issues.apache.org/jira/browse/ARROW-543) - C++: Lazily computed null_counts counts number of non-null entries
-* [ARROW-544](https://issues.apache.org/jira/browse/ARROW-544) - [C++] ArrayLoader::LoadBinary fails for length-0 arrays
-* [ARROW-545](https://issues.apache.org/jira/browse/ARROW-545) - [Python] Ignore files without .parq or .parquet prefix when reading directory of files
-* [ARROW-548](https://issues.apache.org/jira/browse/ARROW-548) - [Python] Add nthreads option to pyarrow.Filesystem.read_parquet
-* [ARROW-551](https://issues.apache.org/jira/browse/ARROW-551) - C++: Construction of Column with nullptr Array segfaults
-* [ARROW-556](https://issues.apache.org/jira/browse/ARROW-556) - [Integration] Can not run Integration tests if different cpp build path
-* [ARROW-561](https://issues.apache.org/jira/browse/ARROW-561) - Update java & python dependencies to improve downstream packaging experience
-
-[2]: https://github.com/apache/arrow/releases/tag/apache-arrow-0.2.0
-[6]: https://www.apache.org/dyn/closer.cgi/arrow/arrow-0.2.0/
diff --git a/site/_release/0.3.0.md b/site/_release/0.3.0.md
deleted file mode 100644
index a88ce4d..0000000
--- a/site/_release/0.3.0.md
+++ /dev/null
@@ -1,365 +0,0 @@
----
-layout: default
-title: 0.3.0 Release
-permalink: /release/0.3.0.html
----
-<!--
-{% comment %}
-Licensed to the Apache Software Foundation (ASF) under one or more
-contributor license agreements.  See the NOTICE file distributed with
-this work for additional information regarding copyright ownership.
-The ASF licenses this file to you under the Apache License, Version 2.0
-(the "License"); you may not use this file except in compliance with
-the License.  You may obtain a copy of the License at
-
-http://www.apache.org/licenses/LICENSE-2.0
-
-Unless required by applicable law or agreed to in writing, software
-distributed under the License is distributed on an "AS IS" BASIS,
-WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
-See the License for the specific language governing permissions and
-limitations under the License.
-{% endcomment %}
--->
-
-# Apache Arrow 0.3.0 (5 May 2017)
-
-Read more in the [release blog post][8]
-
-## Download
-
-* [**Source Artifacts**][6]
-* [Git tag d8db8f8][2]
-
-# Changelog
-
-## Contributors
-
-```shell
-$ git shortlog -sn apache-arrow-0.2.0..apache-arrow-0.3.0
-   119  Wes McKinney
-    55  Kouhei Sutou
-    18  Uwe L. Korn
-    17  Julien Le Dem
-     9  Phillip Cloud
-     6  Bryan Cutler
-     5  Emilio Lahr-Vivaz
-     5  Philipp Moritz
-     4  Jeff Knupp
-     4  Johan Mabille
-     4  Max Risuhin
-     3  Miki Tebeka
-     3  Steven Phillips
-     2  Brian Hulette
-     2  Jeff Reback
-     2  Leif Walsh
-     1  Deepak Majeti
-     1  Holden Karau
-     1  Itai Incze
-     1  Julien Lafaye
-     1  Nong Li
-     1  Tsuyoshi Ozawa
-     1  rvernica
-```
-
-## New Features and Improvements
-
-* [ARROW-183](https://issues.apache.org/jira/browse/ARROW-183) - C++: Add storage type to DecimalType
-* [ARROW-231](https://issues.apache.org/jira/browse/ARROW-231) - C++: Add typed Resize to PoolBuffer
-* [ARROW-281](https://issues.apache.org/jira/browse/ARROW-281) - [C++] IPC/RPC support on Win32 platforms
-* [ARROW-316](https://issues.apache.org/jira/browse/ARROW-316) - Finalize Date type
-* [ARROW-341](https://issues.apache.org/jira/browse/ARROW-341) - [Python] Making libpyarrow available to third parties
-* [ARROW-452](https://issues.apache.org/jira/browse/ARROW-452) - [C++/Python] Merge "Feather" file format implementation
-* [ARROW-459](https://issues.apache.org/jira/browse/ARROW-459) - [C++] Implement IPC round trip for DictionaryArray, dictionaries shared across record batches
-* [ARROW-483](https://issues.apache.org/jira/browse/ARROW-483) - [C++/Python] Provide access to "custom_metadata" Field attribute in IPC setting
-* [ARROW-491](https://issues.apache.org/jira/browse/ARROW-491) - [C++] Add FixedWidthBinary type
-* [ARROW-493](https://issues.apache.org/jira/browse/ARROW-493) - [C++] Allow in-memory array over 2^31 -1 elements but require splitting at IPC / RPC boundaries
-* [ARROW-502](https://issues.apache.org/jira/browse/ARROW-502) - [C++/Python] Add MemoryPool implementation that logs allocation activity to std::cout
-* [ARROW-510](https://issues.apache.org/jira/browse/ARROW-510) - Add integration tests for date and time types
-* [ARROW-52](https://issues.apache.org/jira/browse/ARROW-52) - Set up project blog
-* [ARROW-520](https://issues.apache.org/jira/browse/ARROW-520) - [C++] Add STL-compliant allocator that hooks into an arrow::MemoryPool
-* [ARROW-528](https://issues.apache.org/jira/browse/ARROW-528) - [Python] Support _metadata or _common_metadata files when reading Parquet directories
-* [ARROW-534](https://issues.apache.org/jira/browse/ARROW-534) - [C++] Add IPC tests for date/time types
-* [ARROW-539](https://issues.apache.org/jira/browse/ARROW-539) - [Python] Support reading Parquet datasets with standard partition directory schemes
-* [ARROW-542](https://issues.apache.org/jira/browse/ARROW-542) - [Java] Implement dictionaries in stream/file encoding
-* [ARROW-550](https://issues.apache.org/jira/browse/ARROW-550) - [Format] Add a TensorMessage type
-* [ARROW-552](https://issues.apache.org/jira/browse/ARROW-552) - [Python] Add scalar value support for Dictionary type
-* [ARROW-557](https://issues.apache.org/jira/browse/ARROW-557) - [Python] Explicitly opt in to HDFS unit tests
-* [ARROW-563](https://issues.apache.org/jira/browse/ARROW-563) - C++: Support non-standard gcc version strings
-* [ARROW-566](https://issues.apache.org/jira/browse/ARROW-566) - Python: Deterministic position of libarrow in manylinux1 wheels
-* [ARROW-568](https://issues.apache.org/jira/browse/ARROW-568) - [C++] Add default implementations for TypeVisitor, ArrayVisitor methods that return NotImplemented
-* [ARROW-569](https://issues.apache.org/jira/browse/ARROW-569) - [C++] Set version for *.pc
-* [ARROW-574](https://issues.apache.org/jira/browse/ARROW-574) - Python: Add support for nested Python lists in Pandas conversion
-* [ARROW-576](https://issues.apache.org/jira/browse/ARROW-576) - [C++] Complete round trip Union file/stream IPC tests
-* [ARROW-577](https://issues.apache.org/jira/browse/ARROW-577) - [C++] Refactor StreamWriter and FileWriter to have private implementations
-* [ARROW-578](https://issues.apache.org/jira/browse/ARROW-578) - [C++] Add CMake option to add custom $CXXFLAGS
-* [ARROW-580](https://issues.apache.org/jira/browse/ARROW-580) - C++: Also provide jemalloc_X targets if only a static or shared version is found
-* [ARROW-582](https://issues.apache.org/jira/browse/ARROW-582) - [Java] Add Date/Time Support to JSON File
-* [ARROW-589](https://issues.apache.org/jira/browse/ARROW-589) - C++: Use system provided shared jemalloc if static is unavailable
-* [ARROW-593](https://issues.apache.org/jira/browse/ARROW-593) - [C++] Rename ReadableFileInterface to RandomAccessFile
-* [ARROW-598](https://issues.apache.org/jira/browse/ARROW-598) - [Python] Add support for converting pyarrow.Buffer to a memoryview with zero copy
-* [ARROW-603](https://issues.apache.org/jira/browse/ARROW-603) - [C++] Add RecordBatch::Validate method that at least checks that schema matches the array metadata
-* [ARROW-605](https://issues.apache.org/jira/browse/ARROW-605) - [C++] Refactor generic ArrayLoader class, support work for Feather merge
-* [ARROW-606](https://issues.apache.org/jira/browse/ARROW-606) - [C++] Upgrade to flatbuffers 1.6.0
-* [ARROW-608](https://issues.apache.org/jira/browse/ARROW-608) - [Format] Days since epoch date type
-* [ARROW-610](https://issues.apache.org/jira/browse/ARROW-610) - [C++] Win32 compatibility in file.cc
-* [ARROW-612](https://issues.apache.org/jira/browse/ARROW-612) - [Java] Field toString should show nullable flag status
-* [ARROW-615](https://issues.apache.org/jira/browse/ARROW-615) - Move ByteArrayReadableSeekableByteChannel to vector.util package
-* [ARROW-616](https://issues.apache.org/jira/browse/ARROW-616) - [C++] Remove -g flag in release builds
-* [ARROW-618](https://issues.apache.org/jira/browse/ARROW-618) - [Python] Implement support for DatetimeTZ custom type from pandas
-* [ARROW-620](https://issues.apache.org/jira/browse/ARROW-620) - [C++] Add date/time support to JSON reader/writer for integration testing
-* [ARROW-621](https://issues.apache.org/jira/browse/ARROW-621) - [C++] Implement an "inline visitor" template that enables visitor-pattern-like code without virtual function dispatch
-* [ARROW-625](https://issues.apache.org/jira/browse/ARROW-625) - [C++] Add time unit to TimeType::ToString
-* [ARROW-626](https://issues.apache.org/jira/browse/ARROW-626) - [Python] Enable pyarrow.BufferReader to read from any Python object implementing the buffer/memoryview protocol
-* [ARROW-631](https://issues.apache.org/jira/browse/ARROW-631) - [GLib] Import C API (C++ API wrapper) based on GLib from https://github.com/kou/arrow-glib
-* [ARROW-632](https://issues.apache.org/jira/browse/ARROW-632) - [Python] Add support for FixedWidthBinary type
-* [ARROW-635](https://issues.apache.org/jira/browse/ARROW-635) - [C++] Add JSON read/write support for FixedWidthBinary
-* [ARROW-637](https://issues.apache.org/jira/browse/ARROW-637) - [Format] Add time zone metadata to Timestamp type
-* [ARROW-646](https://issues.apache.org/jira/browse/ARROW-646) - Cache miniconda packages
-* [ARROW-647](https://issues.apache.org/jira/browse/ARROW-647) - [C++] Don't require Boost static libraries to support CentOS 7
-* [ARROW-648](https://issues.apache.org/jira/browse/ARROW-648) - [C++] Support multiarch on Debian
-* [ARROW-650](https://issues.apache.org/jira/browse/ARROW-650) - [GLib] Follow ReadableFileInterface -> RandomAccessFile change
-* [ARROW-651](https://issues.apache.org/jira/browse/ARROW-651) - [C++] Set shared library version for .deb packages
-* [ARROW-655](https://issues.apache.org/jira/browse/ARROW-655) - Implement DecimalArray
-* [ARROW-656](https://issues.apache.org/jira/browse/ARROW-656) - [C++] Implement IO interface that can read and write to a fixed-size mutable buffer
-* [ARROW-657](https://issues.apache.org/jira/browse/ARROW-657) - [Python] Write and read tensors (with zero copy) into shared memory
-* [ARROW-658](https://issues.apache.org/jira/browse/ARROW-658) - [C++] Implement in-memory arrow::Tensor objects
-* [ARROW-659](https://issues.apache.org/jira/browse/ARROW-659) - [C++] Add multithreaded memcpy implementation (for hardware where it helps)
-* [ARROW-660](https://issues.apache.org/jira/browse/ARROW-660) - [C++] Restore function that can read a complete encapsulated record batch message
-* [ARROW-661](https://issues.apache.org/jira/browse/ARROW-661) - [C++] Add a Flatbuffer metadata type that supports array data over 2^31 - 1 elements
-* [ARROW-662](https://issues.apache.org/jira/browse/ARROW-662) - [Format] Factor Flatbuffer schema metadata into a Schema.fbs
-* [ARROW-663](https://issues.apache.org/jira/browse/ARROW-663) - [Java] Support additional Time metadata + vector value accessors
-* [ARROW-664](https://issues.apache.org/jira/browse/ARROW-664) - Make C++ Arrow serialization deterministic
-* [ARROW-669](https://issues.apache.org/jira/browse/ARROW-669) - [Python] Attach proper tzinfo when computing boxed scalars for TimestampArray
-* [ARROW-670](https://issues.apache.org/jira/browse/ARROW-670) - Arrow 0.3 release
-* [ARROW-672](https://issues.apache.org/jira/browse/ARROW-672) - [Format] Bump metadata version for 0.3 release
-* [ARROW-674](https://issues.apache.org/jira/browse/ARROW-674) - [Java] Support additional Timestamp timezone metadata
-* [ARROW-675](https://issues.apache.org/jira/browse/ARROW-675) - [GLib] Update package metadata
-* [ARROW-676](https://issues.apache.org/jira/browse/ARROW-676) - [java] move from MinorType to FieldType in ValueVectors to carry all the relevant type bits
-* [ARROW-679](https://issues.apache.org/jira/browse/ARROW-679) - [Format] Change RecordBatch and Field length members from int to long
-* [ARROW-681](https://issues.apache.org/jira/browse/ARROW-681) - [C++] Build Arrow on Windows with dynamically linked boost
-* [ARROW-684](https://issues.apache.org/jira/browse/ARROW-684) - Python: More informative message when parquet-cpp but not parquet-arrow is available
-* [ARROW-687](https://issues.apache.org/jira/browse/ARROW-687) - [C++] Build and run full test suite in Appveyor
-* [ARROW-688](https://issues.apache.org/jira/browse/ARROW-688) - [C++] Use CMAKE_INSTALL_INCLUDEDIR for consistency
-* [ARROW-690](https://issues.apache.org/jira/browse/ARROW-690) - Only send JIRA updates to issues@arrow.apache.org
-* [ARROW-698](https://issues.apache.org/jira/browse/ARROW-698) - [C++] Add options to StreamWriter/FileWriter to permit large record batches
-* [ARROW-700](https://issues.apache.org/jira/browse/ARROW-700) - Add headroom interface for allocator.
-* [ARROW-701](https://issues.apache.org/jira/browse/ARROW-701) - [Java] Support additional Date metadata
-* [ARROW-706](https://issues.apache.org/jira/browse/ARROW-706) - [GLib] Add package install document
-* [ARROW-707](https://issues.apache.org/jira/browse/ARROW-707) - Python: All non-Pandas columns should be converted to NullArray
-* [ARROW-708](https://issues.apache.org/jira/browse/ARROW-708) - [C++] Some IPC code simplification, perf analysis
-* [ARROW-710](https://issues.apache.org/jira/browse/ARROW-710) - [Python] Enable Feather APIs to read and write using Python file-like objects
-* [ARROW-712](https://issues.apache.org/jira/browse/ARROW-712) - [C++] Implement Array::Accept as inline visitor
-* [ARROW-717](https://issues.apache.org/jira/browse/ARROW-717) - [C++] IPC zero-copy round trips for arrow::Tensor
-* [ARROW-718](https://issues.apache.org/jira/browse/ARROW-718) - [Python] Expose arrow::Tensor with conversions to/from NumPy arrays
-* [ARROW-719](https://issues.apache.org/jira/browse/ARROW-719) - [GLib] Support prepared source archive release
-* [ARROW-722](https://issues.apache.org/jira/browse/ARROW-722) - [Python] pandas conversions for new date and time types/metadata
-* [ARROW-724](https://issues.apache.org/jira/browse/ARROW-724) - Add "How to Contribute" section to README
-* [ARROW-725](https://issues.apache.org/jira/browse/ARROW-725) - [Format] Constant length list type
-* [ARROW-727](https://issues.apache.org/jira/browse/ARROW-727) - [Python] Write memoryview-compatible objects in NativeFile.write with zero copy
-* [ARROW-728](https://issues.apache.org/jira/browse/ARROW-728) - [C++/Python] Add arrow::Table function for removing a column
-* [ARROW-729](https://issues.apache.org/jira/browse/ARROW-729) - [Java] Add vector type for 32-bit date as days since UNIX epoch
-* [ARROW-731](https://issues.apache.org/jira/browse/ARROW-731) - [C++] Add shared library related versions to .pc
-* [ARROW-733](https://issues.apache.org/jira/browse/ARROW-733) - [C++/Format] Change name of Fixed Width Binary to Fixed *Size* Binary for consistency
-* [ARROW-734](https://issues.apache.org/jira/browse/ARROW-734) - [Python] Support for pyarrow on Windows / MSVC
-* [ARROW-735](https://issues.apache.org/jira/browse/ARROW-735) - [C++] Developer instruction document for MSVC on Windows
-* [ARROW-737](https://issues.apache.org/jira/browse/ARROW-737) - [C++] Support obtaining mutable slices of mutable buffers
-* [ARROW-741](https://issues.apache.org/jira/browse/ARROW-741) - [Python] Add Python 3.6 to Travis CI
-* [ARROW-743](https://issues.apache.org/jira/browse/ARROW-743) - [C++] Consolidate unit tests for code in array.h
-* [ARROW-744](https://issues.apache.org/jira/browse/ARROW-744) - [GLib] Re-add an assertion to garrow_table_new() test
-* [ARROW-745](https://issues.apache.org/jira/browse/ARROW-745) - [C++] Allow use of system cpplint
-* [ARROW-746](https://issues.apache.org/jira/browse/ARROW-746) - [GLib] Add garrow_array_get_data_type()
-* [ARROW-748](https://issues.apache.org/jira/browse/ARROW-748) - [Python] Pin runtime library versions in conda-forge packages to force upgrades
-* [ARROW-751](https://issues.apache.org/jira/browse/ARROW-751) - [Python] Rename all Cython extensions to "private" status with leading underscore
-* [ARROW-752](https://issues.apache.org/jira/browse/ARROW-752) - [Python] Construct pyarrow.DictionaryArray from boxed pyarrow array objects
-* [ARROW-754](https://issues.apache.org/jira/browse/ARROW-754) - [GLib] Add garrow_array_is_null()
-* [ARROW-755](https://issues.apache.org/jira/browse/ARROW-755) - [GLib] Add garrow_array_get_value_type()
-* [ARROW-758](https://issues.apache.org/jira/browse/ARROW-758) - [C++] Fix compiler warnings on MSVC x64
-* [ARROW-761](https://issues.apache.org/jira/browse/ARROW-761) - [Python] Add function to compute the total size of tensor payloads, including metadata and padding
-* [ARROW-763](https://issues.apache.org/jira/browse/ARROW-763) - C++: Use `python-config` to find libpythonX.X.dylib
-* [ARROW-765](https://issues.apache.org/jira/browse/ARROW-765) - [Python] Make generic ArrowException subclass value error
-* [ARROW-768](https://issues.apache.org/jira/browse/ARROW-768) - [Java] Change the "boxed" object representation of date and time types
-* [ARROW-769](https://issues.apache.org/jira/browse/ARROW-769) - [GLib] Support building without installed Arrow C++
-* [ARROW-770](https://issues.apache.org/jira/browse/ARROW-770) - [C++] Move clang-tidy/format config files back to C++ source tree
-* [ARROW-771](https://issues.apache.org/jira/browse/ARROW-771) - [Python] Add APIs for reading individual Parquet row groups
-* [ARROW-773](https://issues.apache.org/jira/browse/ARROW-773) - [C++] Add function to create arrow::Table with column appended to existing table
-* [ARROW-774](https://issues.apache.org/jira/browse/ARROW-774) - [GLib] Remove needless LICENSE.txt copy
-* [ARROW-775](https://issues.apache.org/jira/browse/ARROW-775) - [Java] add simple constructors to value vectors
-* [ARROW-779](https://issues.apache.org/jira/browse/ARROW-779) - [C++/Python] Raise exception if old metadata encountered
-* [ARROW-782](https://issues.apache.org/jira/browse/ARROW-782) - [C++] Change struct to class for objects that meet the criteria in the Google style guide
-* [ARROW-788](https://issues.apache.org/jira/browse/ARROW-788) - Possible nondeterminism in Tensor serialization code
-* [ARROW-795](https://issues.apache.org/jira/browse/ARROW-795) - [C++] Combine libarrow/libarrow_io/libarrow_ipc
-* [ARROW-798](https://issues.apache.org/jira/browse/ARROW-798) - [Docs] Publish Format Markdown documents somehow on arrow.apache.org
-* [ARROW-802](https://issues.apache.org/jira/browse/ARROW-802) - [GLib] Add read examples
-* [ARROW-803](https://issues.apache.org/jira/browse/ARROW-803) - [GLib] Update package repository URL
-* [ARROW-804](https://issues.apache.org/jira/browse/ARROW-804) - [GLib] Update build document
-* [ARROW-806](https://issues.apache.org/jira/browse/ARROW-806) - [GLib] Support add/remove a column from table
-* [ARROW-807](https://issues.apache.org/jira/browse/ARROW-807) - [GLib] Update "Since" tag
-* [ARROW-808](https://issues.apache.org/jira/browse/ARROW-808) - [GLib] Remove needless ignore entries
-* [ARROW-810](https://issues.apache.org/jira/browse/ARROW-810) - [GLib] Remove io/ipc prefix
-* [ARROW-811](https://issues.apache.org/jira/browse/ARROW-811) - [GLib] Add GArrowBuffer
-* [ARROW-815](https://issues.apache.org/jira/browse/ARROW-815) - [Java] Allow for expanding underlying buffer size after allocation
-* [ARROW-816](https://issues.apache.org/jira/browse/ARROW-816) - [C++] Use conda packages for RapidJSON, Flatbuffers to speed up builds
-* [ARROW-818](https://issues.apache.org/jira/browse/ARROW-818) - [Python] Review public pyarrow.* API completeness and update docs
-* [ARROW-820](https://issues.apache.org/jira/browse/ARROW-820) - [C++] Build dependencies for Parquet library without arrow support
-* [ARROW-825](https://issues.apache.org/jira/browse/ARROW-825) - [Python] Generalize pyarrow.from_pylist to accept any object implementing the PySequence protocol
-* [ARROW-827](https://issues.apache.org/jira/browse/ARROW-827) - [Python] Variety of Parquet improvements to support Dask integration
-* [ARROW-828](https://issues.apache.org/jira/browse/ARROW-828) - [CPP] Document new requirement (libboost-regex-dev) in README.md
-* [ARROW-832](https://issues.apache.org/jira/browse/ARROW-832) - [C++] Upgrade thirdparty gtest to 1.8.0
-* [ARROW-833](https://issues.apache.org/jira/browse/ARROW-833) - [Python] "Quickstart" build / environment setup guide for Python developers
-* [ARROW-836](https://issues.apache.org/jira/browse/ARROW-836) - Test for timedelta compat with pandas
-* [ARROW-841](https://issues.apache.org/jira/browse/ARROW-841) - [Python] Add pyarrow build to Appveyor
-* [ARROW-844](https://issues.apache.org/jira/browse/ARROW-844) - [Format] Revise format/README.md to reflect progress reaching a more complete specification
-* [ARROW-845](https://issues.apache.org/jira/browse/ARROW-845) - [Python] Sync FindArrow.cmake changes from parquet-cpp
-* [ARROW-846](https://issues.apache.org/jira/browse/ARROW-846) - [GLib] Add GArrowTensor, GArrowInt8Tensor and GArrowUInt8Tensor
-* [ARROW-848](https://issues.apache.org/jira/browse/ARROW-848) - [Python] Improvements / fixes to conda quickstart guide
-* [ARROW-849](https://issues.apache.org/jira/browse/ARROW-849) - [C++] Add optional $ARROW_BUILD_TOOLCHAIN environment variable option for configuring build environment
-* [ARROW-857](https://issues.apache.org/jira/browse/ARROW-857) - [Python] Automate publishing Python documentation to arrow-site
-* [ARROW-860](https://issues.apache.org/jira/browse/ARROW-860) - [C++] Decide if typed Tensor subclasses are worthwhile
-* [ARROW-861](https://issues.apache.org/jira/browse/ARROW-861) - [Python] Move DEVELOPMENT.md to Sphinx docs
-* [ARROW-862](https://issues.apache.org/jira/browse/ARROW-862) - [Python] Improve source build instructions in README
-* [ARROW-863](https://issues.apache.org/jira/browse/ARROW-863) - [GLib] Use GBytes to implement zero-copy
-* [ARROW-864](https://issues.apache.org/jira/browse/ARROW-864) - [GLib] Unify Array files
-* [ARROW-865](https://issues.apache.org/jira/browse/ARROW-865) - [Python] Verify Parquet roundtrips for new date/time types
-* [ARROW-868](https://issues.apache.org/jira/browse/ARROW-868) - [GLib] Use GBytes to reduce copy
-* [ARROW-869](https://issues.apache.org/jira/browse/ARROW-869) - [JS] Rename directory to js/
-* [ARROW-871](https://issues.apache.org/jira/browse/ARROW-871) - [GLib] Unify DataType files
-* [ARROW-876](https://issues.apache.org/jira/browse/ARROW-876) - [GLib] Unify ArrayBuffer files
-* [ARROW-877](https://issues.apache.org/jira/browse/ARROW-877) - [GLib] Add garrow_array_get_null_bitmap()
-* [ARROW-878](https://issues.apache.org/jira/browse/ARROW-878) - [GLib] Add garrow_binary_array_get_buffer()
-* [ARROW-880](https://issues.apache.org/jira/browse/ARROW-880) - [GLib] Add garrow_primitive_array_get_buffer()
-* [ARROW-890](https://issues.apache.org/jira/browse/ARROW-890) - [GLib] Add GArrowMutableBuffer
-* [ARROW-892](https://issues.apache.org/jira/browse/ARROW-892) - [GLib] Fix GArrowTensor document
-* [ARROW-893](https://issues.apache.org/jira/browse/ARROW-893) - Add GLib document to Web site
-* [ARROW-894](https://issues.apache.org/jira/browse/ARROW-894) - [GLib] Add GArrowPoolBuffer
-* [ARROW-896](https://issues.apache.org/jira/browse/ARROW-896) - [Docs] Add Jekyll plugin for including rendered Jupyter notebooks on website
-* [ARROW-898](https://issues.apache.org/jira/browse/ARROW-898) - [C++] Expand metadata support to field level, provide for sharing instances of KeyValueMetadata
-* [ARROW-904](https://issues.apache.org/jira/browse/ARROW-904) - [GLib] Simplify error check codes
-* [ARROW-907](https://issues.apache.org/jira/browse/ARROW-907) - C++: Convenience construct Table from schema and arrays
-* [ARROW-908](https://issues.apache.org/jira/browse/ARROW-908) - [GLib] Unify OutputStream files
-* [ARROW-910](https://issues.apache.org/jira/browse/ARROW-910) - [C++] Write 0-length EOS indicator at end of stream
-* [ARROW-916](https://issues.apache.org/jira/browse/ARROW-916) - [GLib] Add GArrowBufferOutputStream
-* [ARROW-917](https://issues.apache.org/jira/browse/ARROW-917) - [GLib] Add GArrowBufferReader
-* [ARROW-918](https://issues.apache.org/jira/browse/ARROW-918) - [GLib] Use GArrowBuffer for read
-* [ARROW-919](https://issues.apache.org/jira/browse/ARROW-919) - [GLib] Use "id" to get type enum value from GArrowDataType
-* [ARROW-920](https://issues.apache.org/jira/browse/ARROW-920) - [GLib] Add Lua examples
-* [ARROW-925](https://issues.apache.org/jira/browse/ARROW-925) - [GLib] Fix GArrowBufferReader test
-* [ARROW-926](https://issues.apache.org/jira/browse/ARROW-926) - Update KEYS to include wesm
-* [ARROW-927](https://issues.apache.org/jira/browse/ARROW-927) - C++/Python: Add manylinux1 builds to Travis matrix
-* [ARROW-930](https://issues.apache.org/jira/browse/ARROW-930) - javadoc generation fails with java 8
-* [ARROW-931](https://issues.apache.org/jira/browse/ARROW-931) - [GLib] Reconstruct input stream
-* [ARROW-95](https://issues.apache.org/jira/browse/ARROW-95) - Scaffold Main Documentation using asciidoc
-* [ARROW-965](https://issues.apache.org/jira/browse/ARROW-965) - Website updates for 0.3.0 release
-* [ARROW-98](https://issues.apache.org/jira/browse/ARROW-98) - Java: API documentation
-
-## Bug Fixes
-
-* [ARROW-109](https://issues.apache.org/jira/browse/ARROW-109) - [C++] Investigate recursive data types limit in flatbuffers
-* [ARROW-208](https://issues.apache.org/jira/browse/ARROW-208) - Add checkstyle policy to java project
-* [ARROW-347](https://issues.apache.org/jira/browse/ARROW-347) - Add method to pass CallBack when creating a transfer pair
-* [ARROW-413](https://issues.apache.org/jira/browse/ARROW-413) - DATE type is not specified clearly
-* [ARROW-431](https://issues.apache.org/jira/browse/ARROW-431) - [Python] Review GIL release and acquisition in to_pandas conversion
-* [ARROW-443](https://issues.apache.org/jira/browse/ARROW-443) - [Python] Support for converting from strided pandas data in Table.from_pandas
-* [ARROW-451](https://issues.apache.org/jira/browse/ARROW-451) - [C++] Override DataType::Equals for other types with additional metadata
-* [ARROW-454](https://issues.apache.org/jira/browse/ARROW-454) - pojo.Field doesn't implement hashCode()
-* [ARROW-526](https://issues.apache.org/jira/browse/ARROW-526) - [Format] Update IPC.md to account for File format changes and Streaming format
-* [ARROW-565](https://issues.apache.org/jira/browse/ARROW-565) - [C++] Examine "Field::dictionary" member
-* [ARROW-570](https://issues.apache.org/jira/browse/ARROW-570) - Determine Java tools JAR location from project metadata
-* [ARROW-584](https://issues.apache.org/jira/browse/ARROW-584) - [C++] Fix compiler warnings exposed with -Wconversion
-* [ARROW-588](https://issues.apache.org/jira/browse/ARROW-588) - [C++] Fix compiler warnings on 32-bit platforms
-* [ARROW-595](https://issues.apache.org/jira/browse/ARROW-595) - [Python] StreamReader.schema returns None
-* [ARROW-604](https://issues.apache.org/jira/browse/ARROW-604) - Python: boxed Field instances are missing the reference to DataType
-* [ARROW-613](https://issues.apache.org/jira/browse/ARROW-613) - [JS] Implement random-access file format
-* [ARROW-617](https://issues.apache.org/jira/browse/ARROW-617) - Time type is not specified clearly
-* [ARROW-619](https://issues.apache.org/jira/browse/ARROW-619) - Python: Fix typos in setup.py args and LD_LIBRARY_PATH
-* [ARROW-623](https://issues.apache.org/jira/browse/ARROW-623) - segfault with __repr__ of empty Field
-* [ARROW-624](https://issues.apache.org/jira/browse/ARROW-624) - [C++] Restore MakePrimitiveArray function
-* [ARROW-627](https://issues.apache.org/jira/browse/ARROW-627) - [C++] Compatibility macros for exported extern template class declarations
-* [ARROW-628](https://issues.apache.org/jira/browse/ARROW-628) - [Python] Install nomkl metapackage when building parquet-cpp for faster Travis builds
-* [ARROW-630](https://issues.apache.org/jira/browse/ARROW-630) - [C++] IPC unloading for BooleanArray does not account for offset
-* [ARROW-636](https://issues.apache.org/jira/browse/ARROW-636) - [C++] Add Boost / other system requirements to C++ README
-* [ARROW-639](https://issues.apache.org/jira/browse/ARROW-639) - [C++] Invalid offset in slices
-* [ARROW-642](https://issues.apache.org/jira/browse/ARROW-642) - [Java] Remove temporary file in java/tools
-* [ARROW-644](https://issues.apache.org/jira/browse/ARROW-644) - Python: Cython should be a setup-only requirement
-* [ARROW-652](https://issues.apache.org/jira/browse/ARROW-652) - Remove trailing f in merge script output
-* [ARROW-654](https://issues.apache.org/jira/browse/ARROW-654) - [C++] Support timezone metadata in file/stream formats
-* [ARROW-668](https://issues.apache.org/jira/browse/ARROW-668) - [Python] Convert nanosecond timestamps to pandas.Timestamp when converting from TimestampValue
-* [ARROW-671](https://issues.apache.org/jira/browse/ARROW-671) - [GLib] License file isn't installed
-* [ARROW-673](https://issues.apache.org/jira/browse/ARROW-673) - [Java] Support additional Time metadata
-* [ARROW-677](https://issues.apache.org/jira/browse/ARROW-677) - [java] Fix checkstyle jcl-over-slf4j conflict issue
-* [ARROW-678](https://issues.apache.org/jira/browse/ARROW-678) - [GLib] Fix dependencies
-* [ARROW-680](https://issues.apache.org/jira/browse/ARROW-680) - [C++] Multiarch support impacts user-supplied install prefix
-* [ARROW-682](https://issues.apache.org/jira/browse/ARROW-682) - Add self-validation checks in integration tests
-* [ARROW-683](https://issues.apache.org/jira/browse/ARROW-683) - [C++] Support date32 (DateUnit::DAY) in IPC metadata, rename date to date64
-* [ARROW-686](https://issues.apache.org/jira/browse/ARROW-686) - [C++] Account for time metadata changes, add time32 and time64 types
-* [ARROW-689](https://issues.apache.org/jira/browse/ARROW-689) - [GLib] Install header files and documents to wrong directories
-* [ARROW-691](https://issues.apache.org/jira/browse/ARROW-691) - [Java] Encode dictionary Int type in message format
-* [ARROW-697](https://issues.apache.org/jira/browse/ARROW-697) - [Java] Raise appropriate exceptions when encountering large (> INT32_MAX) record batches
-* [ARROW-699](https://issues.apache.org/jira/browse/ARROW-699) - [C++] Arrow dynamic libraries are missed on run of unit tests on Windows
-* [ARROW-702](https://issues.apache.org/jira/browse/ARROW-702) - Fix BitVector.copyFromSafe to reAllocate instead of returning false
-* [ARROW-703](https://issues.apache.org/jira/browse/ARROW-703) - Fix issue where setValueCount(0) doesn’t work in the case that we’ve shipped vectors across the wire
-* [ARROW-704](https://issues.apache.org/jira/browse/ARROW-704) - Fix bad import caused by conflicting changes
-* [ARROW-709](https://issues.apache.org/jira/browse/ARROW-709) - [C++] Restore type comparator for DecimalType
-* [ARROW-713](https://issues.apache.org/jira/browse/ARROW-713) - [C++] Fix linking issue with ipc benchmark
-* [ARROW-715](https://issues.apache.org/jira/browse/ARROW-715) - Python: Explicit pandas import makes it a hard requirement
-* [ARROW-716](https://issues.apache.org/jira/browse/ARROW-716) - error building arrow/python
-* [ARROW-720](https://issues.apache.org/jira/browse/ARROW-720) - [java] arrow should not have a dependency on slf4j bridges in compile
-* [ARROW-723](https://issues.apache.org/jira/browse/ARROW-723) - Arrow freezes on write if chunk_size=0
-* [ARROW-726](https://issues.apache.org/jira/browse/ARROW-726) - [C++] PyBuffer dtor may segfault if constructor passed an object not exporting buffer protocol
-* [ARROW-732](https://issues.apache.org/jira/browse/ARROW-732) - Schema comparison bugs in struct and union types
-* [ARROW-736](https://issues.apache.org/jira/browse/ARROW-736) - [Python] Mixed-type object DataFrame columns should not silently coerce to an Arrow type by default
-* [ARROW-738](https://issues.apache.org/jira/browse/ARROW-738) - [Python] Fix manylinux1 packaging
-* [ARROW-739](https://issues.apache.org/jira/browse/ARROW-739) - Parallel build fails non-deterministically.
-* [ARROW-740](https://issues.apache.org/jira/browse/ARROW-740) - FileReader fails for large objects
-* [ARROW-747](https://issues.apache.org/jira/browse/ARROW-747) - [C++] Fix spurious warning caused by passing dl to add_dependencies
-* [ARROW-749](https://issues.apache.org/jira/browse/ARROW-749) - [Python] Delete incomplete binary files when writing fails
-* [ARROW-753](https://issues.apache.org/jira/browse/ARROW-753) - [Python] Unit tests in arrow/python fail to link on some OS X platforms
-* [ARROW-756](https://issues.apache.org/jira/browse/ARROW-756) - [C++] Do not pass -fPIC when compiling with MSVC
-* [ARROW-757](https://issues.apache.org/jira/browse/ARROW-757) - [C++] MSVC build fails on googletest when using NMake
-* [ARROW-762](https://issues.apache.org/jira/browse/ARROW-762) - Kerberos Problem with PyArrow
-* [ARROW-776](https://issues.apache.org/jira/browse/ARROW-776) - [GLib] Cast type is wrong
-* [ARROW-777](https://issues.apache.org/jira/browse/ARROW-777) - [Java] Resolve getObject behavior per changes / discussion in ARROW-729
-* [ARROW-778](https://issues.apache.org/jira/browse/ARROW-778) - Modify merge tool to work on Windows
-* [ARROW-781](https://issues.apache.org/jira/browse/ARROW-781) - [Python/C++] Increase reference count for base object?
-* [ARROW-783](https://issues.apache.org/jira/browse/ARROW-783) - Integration tests fail for length-0 record batch
-* [ARROW-787](https://issues.apache.org/jira/browse/ARROW-787) - [GLib] Fix compilation errors caused by ARROW-758
-* [ARROW-793](https://issues.apache.org/jira/browse/ARROW-793) - [GLib] Wrong indent
-* [ARROW-794](https://issues.apache.org/jira/browse/ARROW-794) - [C++] Check whether data is contiguous in ipc::WriteTensor
-* [ARROW-797](https://issues.apache.org/jira/browse/ARROW-797) - [Python] Add updated pyarrow.* public API listing in Sphinx docs
-* [ARROW-800](https://issues.apache.org/jira/browse/ARROW-800) - [C++] Boost headers being transitively included in pyarrow
-* [ARROW-805](https://issues.apache.org/jira/browse/ARROW-805) - listing empty HDFS directory returns an error instead of returning empty list
-* [ARROW-809](https://issues.apache.org/jira/browse/ARROW-809) - C++: Writing sliced record batch to IPC writes the entire array
-* [ARROW-812](https://issues.apache.org/jira/browse/ARROW-812) - Pip install pyarrow on mac failed.
-* [ARROW-817](https://issues.apache.org/jira/browse/ARROW-817) - [C++] Fix incorrect code comment from ARROW-722
-* [ARROW-821](https://issues.apache.org/jira/browse/ARROW-821) - [Python] Extra file _table_api.h generated during Python build process
-* [ARROW-822](https://issues.apache.org/jira/browse/ARROW-822) - [Python] StreamWriter fails to open with socket as sink
-* [ARROW-826](https://issues.apache.org/jira/browse/ARROW-826) - Compilation error on Mac with -DARROW_PYTHON=on
-* [ARROW-829](https://issues.apache.org/jira/browse/ARROW-829) - Python: Parquet: Dictionary encoding is deactivated if column-wise compression was selected
-* [ARROW-830](https://issues.apache.org/jira/browse/ARROW-830) - Python: jemalloc is no longer publicly exposed
-* [ARROW-839](https://issues.apache.org/jira/browse/ARROW-839) - [C++] Portable alternative to PyDate_to_ms function
-* [ARROW-847](https://issues.apache.org/jira/browse/ARROW-847) - C++: BUILD_BYPRODUCTS not specified anymore for gtest
-* [ARROW-852](https://issues.apache.org/jira/browse/ARROW-852) - Python: Also set Arrow Library PATHS when detection was done through pkg-config
-* [ARROW-853](https://issues.apache.org/jira/browse/ARROW-853) - [Python] It is no longer necessary to modify the RPATH of the Cython extensions on many environments
-* [ARROW-858](https://issues.apache.org/jira/browse/ARROW-858) - Remove dependency on boost regex
-* [ARROW-866](https://issues.apache.org/jira/browse/ARROW-866) - [Python] Error from file object destructor
-* [ARROW-867](https://issues.apache.org/jira/browse/ARROW-867) - [Python] Miscellaneous pyarrow MSVC fixes
-* [ARROW-875](https://issues.apache.org/jira/browse/ARROW-875) - Nullable variable length vector fillEmpties() fills an extra value
-* [ARROW-879](https://issues.apache.org/jira/browse/ARROW-879) - compat with pandas 0.20.0
-* [ARROW-882](https://issues.apache.org/jira/browse/ARROW-882) - [C++] On Windows statically built lib file overwrites lib file of shared build
-* [ARROW-886](https://issues.apache.org/jira/browse/ARROW-886) - VariableLengthVectors don't reAlloc offsets
-* [ARROW-887](https://issues.apache.org/jira/browse/ARROW-887) - [format] For backward compatibility, new unit fields must have default values matching previous implied unit
-* [ARROW-888](https://issues.apache.org/jira/browse/ARROW-888) - BitVector transfer() does not transfer ownership
-* [ARROW-895](https://issues.apache.org/jira/browse/ARROW-895) - Nullable variable length vector lastSet not set correctly
-* [ARROW-900](https://issues.apache.org/jira/browse/ARROW-900) - [Python] UnboundLocalError in ParquetDatasetPiece
-* [ARROW-903](https://issues.apache.org/jira/browse/ARROW-903) - [GLib] Remove a needless "."
-* [ARROW-914](https://issues.apache.org/jira/browse/ARROW-914) - [C++/Python] Fix Decimal ToBytes
-* [ARROW-922](https://issues.apache.org/jira/browse/ARROW-922) - Allow Flatbuffers and RapidJSON to be used locally on Windows
-* [ARROW-928](https://issues.apache.org/jira/browse/ARROW-928) - Update CMAKE script to detect unsupported msvc compilers versions
-* [ARROW-933](https://issues.apache.org/jira/browse/ARROW-933) - [Python] arrow_python bindings have debug print statement
-* [ARROW-934](https://issues.apache.org/jira/browse/ARROW-934) - [GLib] Glib sources missing from result of 02-source.sh
-* [ARROW-936](https://issues.apache.org/jira/browse/ARROW-936) - Fix release README
-* [ARROW-938](https://issues.apache.org/jira/browse/ARROW-938) - Fix Apache Rat errors from source release build
-
-[2]: https://github.com/apache/arrow/releases/tag/apache-arrow-0.3.0
-[6]: https://www.apache.org/dyn/closer.cgi/arrow/arrow-0.3.0/
-[8]: http://arrow.apache.org/blog/2017/05/08/0.3-release/
\ No newline at end of file
diff --git a/site/_release/0.4.0.md b/site/_release/0.4.0.md
deleted file mode 100644
index d0060a3..0000000
--- a/site/_release/0.4.0.md
+++ /dev/null
@@ -1,138 +0,0 @@
----
-layout: default
-title: 0.4.0 Release
-permalink: /release/0.4.0.html
----
-<!--
-{% comment %}
-Licensed to the Apache Software Foundation (ASF) under one or more
-contributor license agreements.  See the NOTICE file distributed with
-this work for additional information regarding copyright ownership.
-The ASF licenses this file to you under the Apache License, Version 2.0
-(the "License"); you may not use this file except in compliance with
-the License.  You may obtain a copy of the License at
-
-http://www.apache.org/licenses/LICENSE-2.0
-
-Unless required by applicable law or agreed to in writing, software
-distributed under the License is distributed on an "AS IS" BASIS,
-WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
-See the License for the specific language governing permissions and
-limitations under the License.
-{% endcomment %}
--->
-
-# Apache Arrow 0.4.0 (22 May 2017)
-
-Read more in the [release blog post][8]
-
-## Download
-
-* [**Source Artifacts**][6]
-* [Git tag a8f8ba0][2]
-
-# Changelog
-
-## Contributors
-
-```shell
-$ git shortlog -sn apache-arrow-0.3.0..apache-arrow-0.4.0
-    28  Wes McKinney
-    18  Kouhei Sutou
-     9  Uwe L. Korn
-     3  Brian Hulette
-     3  Emilio Lahr-Vivaz
-     3  Philipp Moritz
-     3  Phillip Cloud
-     2  Julien Le Dem
-     1  Bryan Cutler
-     1  Jeff Reback
-     1  Max Risuhin
-     1  Wenchen Fan
-     1  bgosztonyi
-```
-## New Features and Improvements
-
-* [ARROW-1000](https://issues.apache.org/jira/browse/ARROW-1000) - [GLib] Move install document to Website
-* [ARROW-1001](https://issues.apache.org/jira/browse/ARROW-1001) - [GLib] Unify writer files
-* [ARROW-1002](https://issues.apache.org/jira/browse/ARROW-1002) - [C++] It is not necessary to add padding after the magic header in the FileWriter implementation
-* [ARROW-1008](https://issues.apache.org/jira/browse/ARROW-1008) - [C++] Define abstract interface for stream iteration
-* [ARROW-1010](https://issues.apache.org/jira/browse/ARROW-1010) - [Website] Only show English posts in /blog/
-* [ARROW-1011](https://issues.apache.org/jira/browse/ARROW-1011) - [Format] Clarify requirements around buffer padding in validity bitmaps
-* [ARROW-1015](https://issues.apache.org/jira/browse/ARROW-1015) - [Java] Implement schema-level metadata
-* [ARROW-1016](https://issues.apache.org/jira/browse/ARROW-1016) - Python: Include C++ headers (optionally) in wheels
-* [ARROW-1022](https://issues.apache.org/jira/browse/ARROW-1022) - [Python] Add nthreads option to Feather read method
-* [ARROW-1024](https://issues.apache.org/jira/browse/ARROW-1024) - Python: Update build time numpy version to 1.10.1
-* [ARROW-1025](https://issues.apache.org/jira/browse/ARROW-1025) - [Website] Improve changelog on website
-* [ARROW-1027](https://issues.apache.org/jira/browse/ARROW-1027) - [Python] Allow negative indexing in fields/columns on pyarrow Table and Schema objects
-* [ARROW-1028](https://issues.apache.org/jira/browse/ARROW-1028) - [Python] Documentation updates after ARROW-1008
-* [ARROW-1029](https://issues.apache.org/jira/browse/ARROW-1029) - [Python] Fix --with-parquet build on Windows, add unit tests to Appveyor
-* [ARROW-1030](https://issues.apache.org/jira/browse/ARROW-1030) - Python: Account for library versioning in parquet-cpp
-* [ARROW-1031](https://issues.apache.org/jira/browse/ARROW-1031) - [GLib] Support pretty print
-* [ARROW-1037](https://issues.apache.org/jira/browse/ARROW-1037) - [GLib] Follow reader name change
-* [ARROW-1038](https://issues.apache.org/jira/browse/ARROW-1038) - [GLib] Follow writer name change
-* [ARROW-1040](https://issues.apache.org/jira/browse/ARROW-1040) - [GLib] Follow tensor IO
-* [ARROW-1044](https://issues.apache.org/jira/browse/ARROW-1044) - [GLib] Support Feather
-* [ARROW-182](https://issues.apache.org/jira/browse/ARROW-182) - [C++] Remove Array::Validate virtual function and make a separate method
-* [ARROW-29](https://issues.apache.org/jira/browse/ARROW-29) - C++: Add re2 as optional 3rd-party toolchain dependency
-* [ARROW-446](https://issues.apache.org/jira/browse/ARROW-446) - [Python] Document NativeFile interfaces, HDFS client in Sphinx
-* [ARROW-482](https://issues.apache.org/jira/browse/ARROW-482) - [Java] Provide API access to "custom_metadata" Field attribute in IPC setting
-* [ARROW-532](https://issues.apache.org/jira/browse/ARROW-532) - [Python] Expand pyarrow.parquet documentation for 0.3 release
-* [ARROW-629](https://issues.apache.org/jira/browse/ARROW-629) - [JS] Add unit test suite
-* [ARROW-714](https://issues.apache.org/jira/browse/ARROW-714) - [C++] Add import_pyarrow C API in the style of NumPy for thirdparty C++ users
-* [ARROW-819](https://issues.apache.org/jira/browse/ARROW-819) - [Python] Define public Cython API
-* [ARROW-872](https://issues.apache.org/jira/browse/ARROW-872) - [JS] Read streaming format
-* [ARROW-873](https://issues.apache.org/jira/browse/ARROW-873) - [JS] Implement fixed width list type
-* [ARROW-874](https://issues.apache.org/jira/browse/ARROW-874) - [JS] Read dictionary-encoded vectors
-* [ARROW-899](https://issues.apache.org/jira/browse/ARROW-899) - [Docs] Add CHANGELOG for 0.3.0
-* [ARROW-901](https://issues.apache.org/jira/browse/ARROW-901) - [Python] Write FixedSizeBinary to Parquet
-* [ARROW-923](https://issues.apache.org/jira/browse/ARROW-923) - [Docs] Generate Changelog for website with JIRA links
-* [ARROW-929](https://issues.apache.org/jira/browse/ARROW-929) - Move KEYS file to SVN, remove from git
-* [ARROW-943](https://issues.apache.org/jira/browse/ARROW-943) - [GLib] Support running unit tests with source archive
-* [ARROW-945](https://issues.apache.org/jira/browse/ARROW-945) - [GLib] Add a Lua example to show Torch integration
-* [ARROW-946](https://issues.apache.org/jira/browse/ARROW-946) - [GLib] Use "new" instead of "open" for constructor name
-* [ARROW-947](https://issues.apache.org/jira/browse/ARROW-947) - [Python] Improve execution time of manylinux1 build
-* [ARROW-953](https://issues.apache.org/jira/browse/ARROW-953) - Use cmake / curl from conda-forge in CI builds
-* [ARROW-954](https://issues.apache.org/jira/browse/ARROW-954) - Make it possible to compile Arrow with header-only boost
-* [ARROW-956](https://issues.apache.org/jira/browse/ARROW-956) - remove pandas pre-0.20.0 compat
-* [ARROW-957](https://issues.apache.org/jira/browse/ARROW-957) - [Doc] Add HDFS and Windows documents to doxygen output
-* [ARROW-961](https://issues.apache.org/jira/browse/ARROW-961) - [Python] Rename InMemoryOutputStream to BufferOutputStream
-* [ARROW-963](https://issues.apache.org/jira/browse/ARROW-963) - [GLib] Add equal
-* [ARROW-967](https://issues.apache.org/jira/browse/ARROW-967) - [GLib] Support initializing array with buffer
-* [ARROW-970](https://issues.apache.org/jira/browse/ARROW-970) - [Python] Accidentally calling pyarrow.Table() should not segfault process
-* [ARROW-977](https://issues.apache.org/jira/browse/ARROW-977) - [java] Add Timezone aware timestamp vectors
-* [ARROW-984](https://issues.apache.org/jira/browse/ARROW-984) - [GLib] Add Go examples
-* [ARROW-985](https://issues.apache.org/jira/browse/ARROW-985) - [GLib] Update package information
-* [ARROW-988](https://issues.apache.org/jira/browse/ARROW-988) - [JS] Add entry to Travis CI matrix
-* [ARROW-993](https://issues.apache.org/jira/browse/ARROW-993) - [GLib] Add missing error checks in Go examples
-* [ARROW-996](https://issues.apache.org/jira/browse/ARROW-996) - [Website] Add 0.3 release announce in Japanese
-* [ARROW-997](https://issues.apache.org/jira/browse/ARROW-997) - [Java] Implement transfer in FixedSizeListVector
-
-## Bug Fixes
-
-* [ARROW-1003](https://issues.apache.org/jira/browse/ARROW-1003) - [C++] Hdfs and java dlls fail to load when built for Windows with MSVC
-* [ARROW-1004](https://issues.apache.org/jira/browse/ARROW-1004) - ArrowInvalid: Invalid: Python object of type float is not None and is not a string, bool, or date object
-* [ARROW-1017](https://issues.apache.org/jira/browse/ARROW-1017) - Python: Table.to_pandas leaks memory
-* [ARROW-1023](https://issues.apache.org/jira/browse/ARROW-1023) - Python: Fix bundling of arrow-cpp for macOS
-* [ARROW-1033](https://issues.apache.org/jira/browse/ARROW-1033) - [Python] pytest discovers scripts/test_leak.py
-* [ARROW-1046](https://issues.apache.org/jira/browse/ARROW-1046) - [Python] Conform DataFrame metadata to pandas spec
-* [ARROW-1053](https://issues.apache.org/jira/browse/ARROW-1053) - [Python] Memory leak with RecordBatchFileReader
-* [ARROW-1054](https://issues.apache.org/jira/browse/ARROW-1054) - [Python] Test suite fails on pandas 0.19.2
-* [ARROW-813](https://issues.apache.org/jira/browse/ARROW-813) - [Python] setup.py sdist must also bundle dependent cmake modules
-* [ARROW-824](https://issues.apache.org/jira/browse/ARROW-824) - Date and Time Vectors should reflect timezone-less semantics
-* [ARROW-856](https://issues.apache.org/jira/browse/ARROW-856) - CMake error caused by unknown compiler
-* [ARROW-881](https://issues.apache.org/jira/browse/ARROW-881) - [Python] Reconstruct Pandas DataFrame indexes using custom_metadata
-* [ARROW-909](https://issues.apache.org/jira/browse/ARROW-909) - libjemalloc.so.2: cannot open shared object file
-* [ARROW-939](https://issues.apache.org/jira/browse/ARROW-939) - Fix division by zero for zero-dimensional Tensors
-* [ARROW-940](https://issues.apache.org/jira/browse/ARROW-940) - [JS] Generate multiple sets of artifacts
-* [ARROW-944](https://issues.apache.org/jira/browse/ARROW-944) - Python: Compat broken for pandas==0.18.1
-* [ARROW-948](https://issues.apache.org/jira/browse/ARROW-948) - [GLib] Update C++ header file list
-* [ARROW-952](https://issues.apache.org/jira/browse/ARROW-952) - Compilation error on macOS with clang-802.0.42
-* [ARROW-958](https://issues.apache.org/jira/browse/ARROW-958) - [Python] Conda build guide still needs ARROW_HOME, PARQUET_HOME
-* [ARROW-991](https://issues.apache.org/jira/browse/ARROW-991) - [Python] PyArray_SimpleNew should not be used with NPY_DATETIME
-* [ARROW-995](https://issues.apache.org/jira/browse/ARROW-995) - [Website] 0.3 release announce has a typo in reference
-* [ARROW-998](https://issues.apache.org/jira/browse/ARROW-998) - [Doc] File format documents incorrect schema location
-
-[2]: https://github.com/apache/arrow/releases/tag/apache-arrow-0.4.0
-[6]: https://www.apache.org/dyn/closer.cgi/arrow/arrow-0.4.0/
-[8]: http://arrow.apache.org/blog/2017/05/23/0.4.0-release/
\ No newline at end of file
diff --git a/site/_release/0.4.1.md b/site/_release/0.4.1.md
deleted file mode 100644
index 7d48a99..0000000
--- a/site/_release/0.4.1.md
+++ /dev/null
@@ -1,98 +0,0 @@
----
-layout: default
-title: 0.4.1 Release
-permalink: /release/0.4.1.html
----
-<!--
-{% comment %}
-Licensed to the Apache Software Foundation (ASF) under one or more
-contributor license agreements.  See the NOTICE file distributed with
-this work for additional information regarding copyright ownership.
-The ASF licenses this file to you under the Apache License, Version 2.0
-(the "License"); you may not use this file except in compliance with
-the License.  You may obtain a copy of the License at
-
-http://www.apache.org/licenses/LICENSE-2.0
-
-Unless required by applicable law or agreed to in writing, software
-distributed under the License is distributed on an "AS IS" BASIS,
-WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
-See the License for the specific language governing permissions and
-limitations under the License.
-{% endcomment %}
--->
-
-# Apache Arrow 0.4.1 (9 June 2017)
-
-This is primarily a bug fix release, but also includes some packaging and
-documentation improvements.
-
-Read more in the [release blog post][8].
-
-## Download
-
-* [**Source Artifacts**][6]
-* [Git tag 46315431][2]
-
-# Changelog
-
-## New Features and Improvements
-
-* [ARROW-1020](https://issues.apache.org/jira/browse/ARROW-1020) - [Format] Add additional language to Schema.fbs to clarify naive vs. localized Timestamp values
-* [ARROW-1034](https://issues.apache.org/jira/browse/ARROW-1034) - [Python] Enable creation of binary wheels on Windows / MSVC
-* [ARROW-1049](https://issues.apache.org/jira/browse/ARROW-1049) - [java] vector template cleanup
-* [ARROW-1060](https://issues.apache.org/jira/browse/ARROW-1060) - [Python] Add unit test for ARROW-1053
-* [ARROW-1063](https://issues.apache.org/jira/browse/ARROW-1063) - [Website] Blog post and website updates for 0.4.0 release
-* [ARROW-1068](https://issues.apache.org/jira/browse/ARROW-1068) - [Python] Create external repo with appveyor.yml configured for building Python wheel installers
-* [ARROW-1069](https://issues.apache.org/jira/browse/ARROW-1069) - Add instructions for publishing maven artifacts
-* [ARROW-1078](https://issues.apache.org/jira/browse/ARROW-1078) - [Python] Account for PARQUET-967
-* [ARROW-1080](https://issues.apache.org/jira/browse/ARROW-1080) - C++: Add tutorial about converting to/from row-wise representation
-* [ARROW-1082](https://issues.apache.org/jira/browse/ARROW-1082) - [GLib] Add CI on macOS
-* [ARROW-1084](https://issues.apache.org/jira/browse/ARROW-1084) - Implementations of BufferAllocator should handle Netty's OutOfDirectMemoryError
-* [ARROW-897](https://issues.apache.org/jira/browse/ARROW-897) - [GLib] Build arrow-glib as a separate build in the Travis CI build matrix
-* [ARROW-986](https://issues.apache.org/jira/browse/ARROW-986) - [Format] Update IPC.md to account for dictionary batches
-* [ARROW-990](https://issues.apache.org/jira/browse/ARROW-990) - [JS] Add tslint support for linting TypeScript
-
-## Bug Fixes
-
-* [ARROW-1039](https://issues.apache.org/jira/browse/ARROW-1039) - Python: pyarrow.Filesystem.read_parquet causing error if nthreads>1
-* [ARROW-1050](https://issues.apache.org/jira/browse/ARROW-1050) - [C++] Export arrow::ValidateArray
-* [ARROW-1051](https://issues.apache.org/jira/browse/ARROW-1051) - [Python] If pyarrow.parquet fails to import due to a shared library ABI conflict, the test_parquet.py tests silently do not run
-* [ARROW-1056](https://issues.apache.org/jira/browse/ARROW-1056) - [Python] Parquet+HDFS test failure due to writing pandas index
-* [ARROW-1057](https://issues.apache.org/jira/browse/ARROW-1057) - Fix cmake warning and msvc debug asserts
-* [ARROW-1062](https://issues.apache.org/jira/browse/ARROW-1062) - [GLib] Examples use old API
-* [ARROW-1066](https://issues.apache.org/jira/browse/ARROW-1066) - remove warning on feather for pandas >= 0.20.1
-* [ARROW-1070](https://issues.apache.org/jira/browse/ARROW-1070) - [C++] Feather files for date/time types should be written with the physical types
-* [ARROW-1075](https://issues.apache.org/jira/browse/ARROW-1075) - [GLib] Build error on macOS
-* [ARROW-1085](https://issues.apache.org/jira/browse/ARROW-1085) - [java] Follow up on template cleanup. Missing method for IntervalYear
-* [ARROW-1086](https://issues.apache.org/jira/browse/ARROW-1086) - [Python] pyarrow 0.4.0 on pypi is missing pxd files
-* [ARROW-1088](https://issues.apache.org/jira/browse/ARROW-1088) - [Python] test_unicode_filename test fails when unicode filenames aren't supported by system
-* [ARROW-1090](https://issues.apache.org/jira/browse/ARROW-1090) - [Python] build_ext usability
-* [ARROW-1091](https://issues.apache.org/jira/browse/ARROW-1091) - Decimal scale and precision are flipped
-* [ARROW-1092](https://issues.apache.org/jira/browse/ARROW-1092) - Follow-up fixes for flipped Decimal scale and precision
-* [ARROW-1094](https://issues.apache.org/jira/browse/ARROW-1094) - [C++] Incomplete buffer reads in arrow::io::ReadableFile should exactly truncate returned buffer
-* [ARROW-424](https://issues.apache.org/jira/browse/ARROW-424) - [C++] Threadsafety in arrow/io/hdfs.h
-
-## Contributors
-
-```shell
-$ git shortlog -sn apache-arrow-0.4.0..apache-arrow-0.4.1
-    16  Wes McKinney
-     5  Julien Le Dem
-     4  Kouhei Sutou
-     2  Jacob Scott
-     1  Brian Hulette
-     1  Jeff Knupp
-     1  Jeff Reback
-     1  Max Risuhin
-     1  Phillip Cloud
-     1  Tony Kelman
-     1  Uwe L. Korn
-     1  adeneche
-     1  kimanjun
-     1  rip.nsk
-```
-
-[2]: https://github.com/apache/arrow/releases/tag/apache-arrow-0.4.1
-[6]: https://www.apache.org/dyn/closer.cgi/arrow/arrow-0.4.1/
-[8]: http://arrow.apache.org/blog/2017/06/14/0.4.1-release/
\ No newline at end of file
diff --git a/site/_release/0.5.0.md b/site/_release/0.5.0.md
deleted file mode 100644
index f28d866..0000000
--- a/site/_release/0.5.0.md
+++ /dev/null
@@ -1,203 +0,0 @@
----
-layout: default
-title: Apache Arrow 0.5.0 Release
-permalink: /release/0.5.0.html
----
-<!--
-{% comment %}
-Licensed to the Apache Software Foundation (ASF) under one or more
-contributor license agreements.  See the NOTICE file distributed with
-this work for additional information regarding copyright ownership.
-The ASF licenses this file to you under the Apache License, Version 2.0
-(the "License"); you may not use this file except in compliance with
-the License.  You may obtain a copy of the License at
-
-http://www.apache.org/licenses/LICENSE-2.0
-
-Unless required by applicable law or agreed to in writing, software
-distributed under the License is distributed on an "AS IS" BASIS,
-WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
-See the License for the specific language governing permissions and
-limitations under the License.
-{% endcomment %}
--->
-
-# Apache Arrow 0.5.0 (23 July 2017)
-
-This is a major release, with expanded features in the supported languages and
-additional integration test coverage between Java and C++.
-
-Read more in the [release blog post][8].
-
-## Download
-
-* [**Source Artifacts**][6]
-* [Git tag][2]
-
-## Contributors
-
-```shell
-$ git shortlog -sn apache-arrow-0.4.1..apache-arrow-0.5.0
-    42  Wes McKinney
-    22  Uwe L. Korn
-    12  Kouhei Sutou
-     9  Max Risuhin
-     9  Phillip Cloud
-     6  Philipp Moritz
-     5  Steven Phillips
-     3  Julien Le Dem
-     2  Bryan Cutler
-     2  Kengo Seki
-     2  Max Risukhin
-     2  fjetter
-     1  Antony Mayi
-     1  Deepak Majeti
-     1  Fang Zheng
-     1  Hideo Hattori
-     1  Holden Karau
-     1  Itai Incze
-     1  Jeff Knupp
-     1  LynnYuan
-     1  Mark Lavrynenko
-     1  Michael König
-     1  Robert Nishihara
-     1  Sudheesh Katkam
-     1  Zahari
-     1  vkorukanti
-```
-
-# Changelog
-
-## New Features and Improvements
-
-* [ARROW-1041](https://issues.apache.org/jira/browse/ARROW-1041) - [Python] Support read_pandas on a directory of Parquet files
-* [ARROW-1048](https://issues.apache.org/jira/browse/ARROW-1048) - Allow user LD_LIBRARY_PATH to be used with source release script
-* [ARROW-1052](https://issues.apache.org/jira/browse/ARROW-1052) - Arrow 0.5.0 release
-* [ARROW-1073](https://issues.apache.org/jira/browse/ARROW-1073) - C++: Adaptive integer builder
-* [ARROW-1095](https://issues.apache.org/jira/browse/ARROW-1095) - [Website] Add Arrow icon asset
-* [ARROW-1100](https://issues.apache.org/jira/browse/ARROW-1100) - [Python] Add "mode" property to NativeFile instances
-* [ARROW-1102](https://issues.apache.org/jira/browse/ARROW-1102) - Make MessageSerializer.serializeMessage() public
-* [ARROW-111](https://issues.apache.org/jira/browse/ARROW-111) - [C++] Add static analyzer to toolchain to verify checking of Status returns
-* [ARROW-1120](https://issues.apache.org/jira/browse/ARROW-1120) - [Python] Write support for int96
-* [ARROW-1122](https://issues.apache.org/jira/browse/ARROW-1122) - [Website] Guest blog post on Arrow + ODBC from turbodbc
-* [ARROW-1123](https://issues.apache.org/jira/browse/ARROW-1123) - C++: Make jemalloc the default allocator
-* [ARROW-1135](https://issues.apache.org/jira/browse/ARROW-1135) - Upgrade Travis CI clang builds to use LLVM 4.0
-* [ARROW-1137](https://issues.apache.org/jira/browse/ARROW-1137) - Python: Ensure Pandas roundtrip of all-None column
-* [ARROW-1142](https://issues.apache.org/jira/browse/ARROW-1142) - [C++] Move over compression library toolchain from parquet-cpp
-* [ARROW-1145](https://issues.apache.org/jira/browse/ARROW-1145) - [GLib] Add get_values()
-* [ARROW-1146](https://issues.apache.org/jira/browse/ARROW-1146) - Add .gitignore for *_generated.h files in src/plasma/format
-* [ARROW-1148](https://issues.apache.org/jira/browse/ARROW-1148) - [C++] Raise minimum CMake version to 3.2
-* [ARROW-1151](https://issues.apache.org/jira/browse/ARROW-1151) - [C++] Add gcc branch prediction to status check macro
-* [ARROW-1154](https://issues.apache.org/jira/browse/ARROW-1154) - [C++] Migrate more computational utility code from parquet-cpp
-* [ARROW-1160](https://issues.apache.org/jira/browse/ARROW-1160) - C++: Implement DictionaryBuilder
-* [ARROW-1165](https://issues.apache.org/jira/browse/ARROW-1165) - [C++] Refactor PythonDecimalToArrowDecimal to not use templates
-* [ARROW-1172](https://issues.apache.org/jira/browse/ARROW-1172) - [C++] Use unique_ptr with array builder classes
-* [ARROW-1183](https://issues.apache.org/jira/browse/ARROW-1183) - [Python] Implement time type conversions in to_pandas
-* [ARROW-1185](https://issues.apache.org/jira/browse/ARROW-1185) - [C++] Clean up arrow::Status implementation, add warn_unused_result attribute for clang
-* [ARROW-1187](https://issues.apache.org/jira/browse/ARROW-1187) - Serialize a DataFrame with None column
-* [ARROW-1193](https://issues.apache.org/jira/browse/ARROW-1193) - [C++] Support pkg-config for arrow_python.so
-* [ARROW-1196](https://issues.apache.org/jira/browse/ARROW-1196) - [C++] Appveyor separate jobs for Debug/Release builds from sources; Build with conda toolchain; Build with NMake Makefiles Generator
-* [ARROW-1198](https://issues.apache.org/jira/browse/ARROW-1198) - Python: Add public C++ API to unwrap PyArrow object
-* [ARROW-1199](https://issues.apache.org/jira/browse/ARROW-1199) - [C++] Introduce mutable POD struct for generic array data
-* [ARROW-1202](https://issues.apache.org/jira/browse/ARROW-1202) - Remove semicolons from status macros
-* [ARROW-1212](https://issues.apache.org/jira/browse/ARROW-1212) - [GLib] Add garrow_binary_array_get_offsets_buffer()
-* [ARROW-1214](https://issues.apache.org/jira/browse/ARROW-1214) - [Python] Add classes / functions to enable stream message components to be handled outside of the stream reader class
-* [ARROW-1217](https://issues.apache.org/jira/browse/ARROW-1217) - [GLib] Add GInputStream based arrow::io::RandomAccessFile
-* [ARROW-1220](https://issues.apache.org/jira/browse/ARROW-1220) - [C++] Standardize usage of *_HOME cmake script variables for 3rd party libs
-* [ARROW-1221](https://issues.apache.org/jira/browse/ARROW-1221) - [C++] Pin clang-format version
-* [ARROW-1227](https://issues.apache.org/jira/browse/ARROW-1227) - [GLib] Support GOutputStream
-* [ARROW-1228](https://issues.apache.org/jira/browse/ARROW-1228) - [GLib] Test file name should be the same name as target class
-* [ARROW-1229](https://issues.apache.org/jira/browse/ARROW-1229) - [GLib] Follow Reader API change (get -> read)
-* [ARROW-1233](https://issues.apache.org/jira/browse/ARROW-1233) - [C++] Validate cmake script resolving of 3rd party linked libs from correct location in toolchain build
-* [ARROW-460](https://issues.apache.org/jira/browse/ARROW-460) - [C++] Implement JSON round trip for DictionaryArray
-* [ARROW-462](https://issues.apache.org/jira/browse/ARROW-462) - [C++] Implement in-memory conversions between non-nested primitive types and DictionaryArray equivalent
-* [ARROW-575](https://issues.apache.org/jira/browse/ARROW-575) - Python: Auto-detect nested lists and nested numpy arrays in Pandas
-* [ARROW-597](https://issues.apache.org/jira/browse/ARROW-597) - [Python] Add convenience function to yield DataFrame from any object that a StreamReader or FileReader can read from
-* [ARROW-599](https://issues.apache.org/jira/browse/ARROW-599) - [C++] Add LZ4 codec to 3rd-party toolchain
-* [ARROW-600](https://issues.apache.org/jira/browse/ARROW-600) - [C++] Add ZSTD codec to 3rd-party toolchain
-* [ARROW-692](https://issues.apache.org/jira/browse/ARROW-692) - Java<->C++ Integration tests for dictionary-encoded vectors
-* [ARROW-693](https://issues.apache.org/jira/browse/ARROW-693) - [Java] Add JSON support for dictionary vectors
-* [ARROW-742](https://issues.apache.org/jira/browse/ARROW-742) - Handling exceptions during execution of std::wstring_convert
-* [ARROW-834](https://issues.apache.org/jira/browse/ARROW-834) - [Python] Support creating Arrow arrays from Python iterables
-* [ARROW-915](https://issues.apache.org/jira/browse/ARROW-915) - Struct Array reads limited support
-* [ARROW-935](https://issues.apache.org/jira/browse/ARROW-935) - [Java] Build Javadoc in Travis CI
-* [ARROW-960](https://issues.apache.org/jira/browse/ARROW-960) - [Python] Add source build guide for macOS + Homebrew
-* [ARROW-962](https://issues.apache.org/jira/browse/ARROW-962) - [Python] Add schema attribute to FileReader
-* [ARROW-966](https://issues.apache.org/jira/browse/ARROW-966) - [Python] pyarrow.list_ should also accept Field instance
-* [ARROW-978](https://issues.apache.org/jira/browse/ARROW-978) - [Python] Use sphinx-bootstrap-theme for Sphinx documentation
-
-## Bug Fixes
-
-* [ARROW-1074](https://issues.apache.org/jira/browse/ARROW-1074) - from_pandas doesn't convert ndarray to list
-* [ARROW-1079](https://issues.apache.org/jira/browse/ARROW-1079) - [Python] Empty "private" directories should be ignored by Parquet interface
-* [ARROW-1081](https://issues.apache.org/jira/browse/ARROW-1081) - C++: arrow::test::TestBase::MakePrimitive doesn't fill null_bitmap
-* [ARROW-1096](https://issues.apache.org/jira/browse/ARROW-1096) - [C++] Memory mapping file over 4GB fails on Windows
-* [ARROW-1097](https://issues.apache.org/jira/browse/ARROW-1097) - Reading tensor needs file to be opened in writeable mode
-* [ARROW-1098](https://issues.apache.org/jira/browse/ARROW-1098) - Document Error?
-* [ARROW-1101](https://issues.apache.org/jira/browse/ARROW-1101) - UnionListWriter is not implementing all methods on interface ScalarWriter
-* [ARROW-1103](https://issues.apache.org/jira/browse/ARROW-1103) - [Python] Utilize pandas metadata from common _metadata Parquet file if it exists
-* [ARROW-1107](https://issues.apache.org/jira/browse/ARROW-1107) - [JAVA] NullableMapVector getField() should return nullable type
-* [ARROW-1108](https://issues.apache.org/jira/browse/ARROW-1108) - Check if ArrowBuf is empty buffer in getActualConsumedMemory() and getPossibleConsumedMemory()
-* [ARROW-1109](https://issues.apache.org/jira/browse/ARROW-1109) - [JAVA] transferOwnership fails when readerIndex is not 0
-* [ARROW-1110](https://issues.apache.org/jira/browse/ARROW-1110) - [JAVA] make union vector naming consistent
-* [ARROW-1111](https://issues.apache.org/jira/browse/ARROW-1111) - [JAVA] Make aligning buffers optional, and allow -1 for unknown null count
-* [ARROW-1112](https://issues.apache.org/jira/browse/ARROW-1112) - [JAVA] Set lastSet for VarLength and List vectors when loading
-* [ARROW-1113](https://issues.apache.org/jira/browse/ARROW-1113) - [C++] gflags EP build gets triggered (as a no-op) on subsequent calls to make or ninja build
-* [ARROW-1115](https://issues.apache.org/jira/browse/ARROW-1115) - [C++] Use absolute path for ccache
-* [ARROW-1117](https://issues.apache.org/jira/browse/ARROW-1117) - [Docs] Minor issues in GLib README
-* [ARROW-1124](https://issues.apache.org/jira/browse/ARROW-1124) - [Python] pyarrow needs to depend on numpy>=1.10 (not 1.9)
... 2899 lines suppressed ...