Posted to commits@arrow.apache.org by ko...@apache.org on 2021/01/14 23:48:28 UTC

[arrow] branch master updated: ARROW-11212: [Packaging][Python] Use vcpkg as dependency source for manylinux and windows wheels

This is an automated email from the ASF dual-hosted git repository.

kou pushed a commit to branch master
in repository https://gitbox.apache.org/repos/asf/arrow.git


The following commit(s) were added to refs/heads/master by this push:
     new ddd9c6d  ARROW-11212: [Packaging][Python] Use vcpkg as dependency source for manylinux and windows wheels
ddd9c6d is described below

commit ddd9c6dd4e1155bd9050011889a8b712e91f4c46
Author: Krisztián Szűcs <sz...@gmail.com>
AuthorDate: Fri Jan 15 08:47:17 2021 +0900

    ARROW-11212: [Packaging][Python] Use vcpkg as dependency source for manylinux and windows wheels
    
    Also resolves:
    - ARROW-11213: [Packaging][Python] Dockerize wheel building on windows
    - ARROW-11215: [CI] Use named volumes by default for caching in docker-compose
    - ARROW-11231: [Python][Packaging] Enable mimalloc in wheels
    
    Main features:
    - dockerize windows builds
    - use vcpkg as the dependency source, where we can explicitly pin a known-working revision without worrying about the build environment drifting over time
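    
    For reference, the pinned revision added to `.env` by this commit (copied from the diff below) looks like this:
    
    ```bash
    # Used for the manylinux and windows wheels
    VCPKG=c7e96f2a5b73b3278b004aa88abec2f8ebfb43b5
    ```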
    
    Potential follow-up:
    - use vcpkg on macos as well
    
    cc @kou @xhochy
    
    ## Manylinux testing
    
    Should be straightforward (I'm going to share pre-built images for quicker testing).
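    
    A minimal invocation sketch (the `python-wheel-manylinux-2010` service name is assumed here by analogy with the Windows service used below; check `docker-compose.yml` for the exact names):
    
    ```bash
    pip install -e dev/archery[docker]
    # build the image and run the manylinux wheel build for a given Python version
    PYTHON=3.6 archery docker run python-wheel-manylinux-2010
    ```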
    
    ## Windows testing
    
    Only a Windows host is able to run Windows containers, so I'm virtualizing Windows on macOS (this should work on Linux as well) using VirtualBox:
    
    ```bash
    git clone https://github.com/StefanScherer/windows-docker-machine
    cd windows-docker-machine
    # grant more resources in the Vagrantfile under the virtualbox section (I use 8 cores and 16 GiB of RAM)
    vagrant up --provider virtualbox 2019-box
    docker context ls
    docker context use 2019-box
    docker image ls
    ```
    
    Now docker should use the Windows docker daemon.
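    
    One way to double-check the context switch (not part of the original instructions, just a sanity check):
    
    ```bash
    # should print "windows" once the 2019-box context is active
    docker info --format '{{.OSType}}'
    ```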
    
    ```bash
    pip install -e dev/archery[docker]
    # comment out the volumes defined in the docker-compose file and pass arrow's source explicitly;
    # this is required because compose runs on a unix host while the directory is actually mounted through the virtual machine
    PYTHON=3.6 archery docker run --no-pull --using-docker-cli -v C:$(pwd):C:/arrow python-wheel-windows-vs2017 cmd.exe
    # then, within the container, run a build or do other interactive work
    arrow\ci\scripts\python_wheel_windows_build.bat
    ```
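    
    After the build finishes, the test script added by this commit can presumably be run from the same container in the same way:
    
    ```bash
    arrow\ci\scripts\python_wheel_windows_test.bat
    ```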
    
    Closes #9096 from kszucs/wheel-vcpkg
    
    Lead-authored-by: Krisztián Szűcs <sz...@gmail.com>
    Co-authored-by: Sutou Kouhei <ko...@clear-code.com>
    Co-authored-by: Antoine Pitrou <an...@python.org>
    Signed-off-by: Sutou Kouhei <ko...@clear-code.com>
---
 .env                                               |  12 +
 .github/workflows/cpp.yml                          |   1 +
 .github/workflows/cpp_cron.yml                     |   1 +
 .github/workflows/integration.yml                  |   1 +
 .github/workflows/java.yml                         |   1 +
 .github/workflows/java_jni.yml                     |   1 +
 .github/workflows/python.yml                       |   6 +-
 .github/workflows/python_cron.yml                  |   1 +
 .github/workflows/r.yml                            |   1 +
 .github/workflows/ruby.yml                         |   1 +
 ci/docker/python-wheel-manylinux-201x.dockerfile   | 103 ++++++++
 ci/docker/python-wheel-windows-vs2017.dockerfile   |  99 ++++++++
 ci/scripts/python_wheel_manylinux_build.sh         | 142 +++++++++++
 .../scripts/python_wheel_manylinux_test.sh         |  47 ++--
 ci/scripts/python_wheel_windows_build.bat          | 111 ++++++++
 ci/scripts/python_wheel_windows_test.bat           |  54 ++++
 ci/vcpkg/ports.patch                               |  48 ++++
 .../vcpkg/x64-linux-static-debug.cmake             |  16 +-
 .../vcpkg/x64-linux-static-release.cmake           |  16 +-
 .../vcpkg/x64-windows-static-md-debug.cmake        |  11 +-
 .../vcpkg/x64-windows-static-md-release.cmake      |  11 +-
 cpp/cmake_modules/ThirdpartyToolchain.cmake        |   5 +-
 dev/archery/archery/bot.py                         |   2 +-
 dev/archery/archery/cli.py                         | 116 +++++++--
 dev/archery/archery/docker.py                      | 282 +++++++++++++--------
 dev/archery/archery/testing.py                     |   7 +
 dev/archery/archery/tests/test_cli.py              | 232 ++++++++---------
 dev/archery/archery/tests/test_docker.py           | 127 +++-------
 dev/archery/archery/utils/command.py               |   3 +
 dev/archery/archery/utils/source.py                |   5 +-
 dev/release/rat_exclude_files.txt                  |   1 +
 dev/tasks/crossbow.py                              |   7 +-
 dev/tasks/python-wheels/azure.linux.yml            | 109 --------
 .../{github.win.yml => github.linux.yml}           |  85 ++++---
 .../{github.win.yml => github.windows.yml}         |  79 +++---
 dev/tasks/python-wheels/osx-build.sh               |   1 +
 dev/tasks/python-wheels/win-build.bat              | 116 ---------
 dev/tasks/tasks.yml                                | 166 +++---------
 docker-compose.yml                                 | 187 ++++++++------
 python/CMakeLists.txt                              |   7 +-
 python/manylinux1/.dockerignore                    |   1 -
 python/manylinux1/Dockerfile-x86_64_base           | 106 --------
 python/manylinux1/Dockerfile-x86_64_ubuntu         |  89 -------
 python/manylinux1/README.md                        | 130 ----------
 python/manylinux1/build_arrow.sh                   | 176 -------------
 python/manylinux1/scripts/build_absl.sh            |  35 ---
 python/manylinux1/scripts/build_aws_sdk.sh         |  48 ----
 python/manylinux1/scripts/build_bison.sh           |  29 ---
 python/manylinux1/scripts/build_boost.sh           |  50 ----
 python/manylinux1/scripts/build_brotli.sh          |  34 ---
 python/manylinux1/scripts/build_bz2.sh             |  30 ---
 python/manylinux1/scripts/build_cares.sh           |  34 ---
 python/manylinux1/scripts/build_ccache.sh          |  32 ---
 python/manylinux1/scripts/build_curl.sh            |  55 ----
 python/manylinux1/scripts/build_flatbuffers.sh     |  32 ---
 python/manylinux1/scripts/build_gflags.sh          |  39 ---
 python/manylinux1/scripts/build_glog.sh            |  35 ---
 python/manylinux1/scripts/build_grpc.sh            |  49 ----
 python/manylinux1/scripts/build_gtest.sh           |  39 ---
 python/manylinux1/scripts/build_llvm.sh            |  88 -------
 python/manylinux1/scripts/build_lz4.sh             |  35 ---
 python/manylinux1/scripts/build_openssl.sh         |  34 ---
 python/manylinux1/scripts/build_protobuf.sh        |  29 ---
 python/manylinux1/scripts/build_python.sh          | 218 ----------------
 python/manylinux1/scripts/build_rapidjson.sh       |  37 ---
 python/manylinux1/scripts/build_re2.sh             |  35 ---
 python/manylinux1/scripts/build_snappy.sh          |  31 ---
 python/manylinux1/scripts/build_thrift.sh          |  54 ----
 python/manylinux1/scripts/build_utf8proc.sh        |  38 ---
 python/manylinux1/scripts/build_zstd.sh            |  39 ---
 .../manylinux1/scripts/check_arrow_visibility.sh   |  35 ---
 python/manylinux1/scripts/python-tag-abi-tag.py    |  30 ---
 python/manylinux1/scripts/requirements.txt         |  34 ---
 python/manylinux201x/Dockerfile-x86_64_base_2010   | 101 --------
 python/manylinux201x/Dockerfile-x86_64_base_2014   | 101 --------
 python/manylinux201x/README.md                     |  77 ------
 python/manylinux201x/build_arrow.sh                | 173 -------------
 python/manylinux201x/scripts/build_absl.sh         |  35 ---
 python/manylinux201x/scripts/build_aws_sdk.sh      |  44 ----
 python/manylinux201x/scripts/build_boost.sh        |  50 ----
 python/manylinux201x/scripts/build_brotli.sh       |  33 ---
 python/manylinux201x/scripts/build_bz2.sh          |  30 ---
 python/manylinux201x/scripts/build_cares.sh        |  34 ---
 python/manylinux201x/scripts/build_curl.sh         |  53 ----
 python/manylinux201x/scripts/build_flatbuffers.sh  |  32 ---
 python/manylinux201x/scripts/build_gflags.sh       |  39 ---
 python/manylinux201x/scripts/build_glog.sh         |  35 ---
 python/manylinux201x/scripts/build_grpc.sh         |  49 ----
 python/manylinux201x/scripts/build_llvm.sh         |  87 -------
 python/manylinux201x/scripts/build_lz4.sh          |  35 ---
 python/manylinux201x/scripts/build_openssl.sh      |  31 ---
 python/manylinux201x/scripts/build_protobuf.sh     |  29 ---
 python/manylinux201x/scripts/build_rapidjson.sh    |  37 ---
 python/manylinux201x/scripts/build_re2.sh          |  35 ---
 python/manylinux201x/scripts/build_snappy.sh       |  31 ---
 python/manylinux201x/scripts/build_thrift.sh       |  50 ----
 python/manylinux201x/scripts/build_utf8proc.sh     |  38 ---
 python/manylinux201x/scripts/build_zstd.sh         |  39 ---
 .../scripts/check_arrow_visibility.sh              |  35 ---
 python/requirements-wheel-test.txt                 |   2 +-
 python/setup.py                                    |  11 -
 101 files changed, 1335 insertions(+), 4012 deletions(-)

diff --git a/.env b/.env
index d2b817d..f8a086c 100644
--- a/.env
+++ b/.env
@@ -18,6 +18,15 @@
 # All of the following environment variables are required to set default values
 # for the parameters in docker-compose.yml.
 
+# empty prefix means that the docker-compose configuration will use named
+# volumes which potentially improves the performance on docker for macos and
+# docker for windows, it also prevents the contamination of the source
+# directory
+# a non-empty prefix means that directories from the host are bind-mounted
+# into the container, it should be set to ".docker/" on github actions to keep
+# the cache plugin functional
+DOCKER_VOLUME_PREFIX=""
+
 ULIMIT_CORE=-1
 REPO=apache/arrow-dev
 ARCH=amd64
@@ -49,3 +58,6 @@ R_IMAGE=ubuntu-gcc-release
 R_TAG=latest
 # -1 does not attempt to install a devtoolset version, any positive integer will install devtoolset-n
 DEVTOOLSET_VERSION=-1
+
+# Used for the manylinux and windows wheels
+VCPKG=c7e96f2a5b73b3278b004aa88abec2f8ebfb43b5
diff --git a/.github/workflows/cpp.yml b/.github/workflows/cpp.yml
index f484521..48f74f7 100644
--- a/.github/workflows/cpp.yml
+++ b/.github/workflows/cpp.yml
@@ -39,6 +39,7 @@ on:
 
 env:
   DOCKER_BUILDKIT: 0
+  DOCKER_VOLUME_PREFIX: ".docker/"
   COMPOSE_DOCKER_CLI_BUILD: 1
   ARROW_ENABLE_TIMING_TESTS: OFF
   ARCHERY_DOCKER_USER: ${{ secrets.DOCKERHUB_USER }}
diff --git a/.github/workflows/cpp_cron.yml b/.github/workflows/cpp_cron.yml
index 700fb3e..5cd692b 100644
--- a/.github/workflows/cpp_cron.yml
+++ b/.github/workflows/cpp_cron.yml
@@ -31,6 +31,7 @@ on:
 
 env:
   DOCKER_BUILDKIT: 0
+  DOCKER_VOLUME_PREFIX: ".docker/"
   COMPOSE_DOCKER_CLI_BUILD: 1
   ARROW_ENABLE_TIMING_TESTS: OFF
   ARCHERY_DOCKER_USER: ${{ secrets.DOCKERHUB_USER }}
diff --git a/.github/workflows/integration.yml b/.github/workflows/integration.yml
index ead90ef..7cf2ddf 100644
--- a/.github/workflows/integration.yml
+++ b/.github/workflows/integration.yml
@@ -45,6 +45,7 @@ on:
 
 env:
   DOCKER_BUILDKIT: 0
+  DOCKER_VOLUME_PREFIX: ".docker/"
   COMPOSE_DOCKER_CLI_BUILD: 1
   ARCHERY_DOCKER_USER: ${{ secrets.DOCKERHUB_USER }}
   ARCHERY_DOCKER_PASSWORD: ${{ secrets.DOCKERHUB_TOKEN }}
diff --git a/.github/workflows/java.yml b/.github/workflows/java.yml
index 385d021..38f5884 100644
--- a/.github/workflows/java.yml
+++ b/.github/workflows/java.yml
@@ -37,6 +37,7 @@ on:
 
 env:
   DOCKER_BUILDKIT: 0
+  DOCKER_VOLUME_PREFIX: ".docker/"
   COMPOSE_DOCKER_CLI_BUILD: 1
   ARCHERY_DOCKER_USER: ${{ secrets.DOCKERHUB_USER }}
   ARCHERY_DOCKER_PASSWORD: ${{ secrets.DOCKERHUB_TOKEN }}
diff --git a/.github/workflows/java_jni.yml b/.github/workflows/java_jni.yml
index ef40642..46dae21 100644
--- a/.github/workflows/java_jni.yml
+++ b/.github/workflows/java_jni.yml
@@ -37,6 +37,7 @@ on:
 
 env:
   DOCKER_BUILDKIT: 0
+  DOCKER_VOLUME_PREFIX: ".docker/"
   COMPOSE_DOCKER_CLI_BUILD: 1
   ARCHERY_DOCKER_USER: ${{ secrets.DOCKERHUB_USER }}
   ARCHERY_DOCKER_PASSWORD: ${{ secrets.DOCKERHUB_TOKEN }}
diff --git a/.github/workflows/python.yml b/.github/workflows/python.yml
index 169a718..254d949 100644
--- a/.github/workflows/python.yml
+++ b/.github/workflows/python.yml
@@ -33,6 +33,7 @@ on:
 
 env:
   DOCKER_BUILDKIT: 0
+  DOCKER_VOLUME_PREFIX: ".docker/"
   COMPOSE_DOCKER_CLI_BUILD: 1
   ARCHERY_DOCKER_USER: ${{ secrets.DOCKERHUB_USER }}
   ARCHERY_DOCKER_PASSWORD: ${{ secrets.DOCKERHUB_TOKEN }}
@@ -50,7 +51,6 @@ jobs:
           - conda-python-3.8-nopandas
           - conda-python-3.6-pandas-0.23
           - conda-python-3.7-pandas-latest
-          - centos-python-3.6-manylinux1
         include:
           - name: conda-python-3.8-nopandas
             cache: conda-python-3.8
@@ -69,10 +69,6 @@ jobs:
             title: AMD64 Conda Python 3.7 Pandas latest
             python: 3.7
             pandas: latest
-          - name: centos-python-3.6-manylinux1
-            cache: manylinux1
-            image: centos-python-manylinux1
-            title: AMD64 CentOS 5.11 Python 3.6 manylinux1
     env:
       PYTHON: ${{ matrix.python || 3.7 }}
       UBUNTU: ${{ matrix.ubuntu || 18.04 }}
diff --git a/.github/workflows/python_cron.yml b/.github/workflows/python_cron.yml
index 256856b..88007ba 100644
--- a/.github/workflows/python_cron.yml
+++ b/.github/workflows/python_cron.yml
@@ -30,6 +30,7 @@ on:
 
 env:
   DOCKER_BUILDKIT: 0
+  DOCKER_VOLUME_PREFIX: ".docker/"
   COMPOSE_DOCKER_CLI_BUILD: 1
   ARCHERY_DOCKER_USER: ${{ secrets.DOCKERHUB_USER }}
   ARCHERY_DOCKER_PASSWORD: ${{ secrets.DOCKERHUB_TOKEN }}
diff --git a/.github/workflows/r.yml b/.github/workflows/r.yml
index 37aee19..3b200c1 100644
--- a/.github/workflows/r.yml
+++ b/.github/workflows/r.yml
@@ -41,6 +41,7 @@ on:
 
 env:
   DOCKER_BUILDKIT: 0
+  DOCKER_VOLUME_PREFIX: ".docker/"
   COMPOSE_DOCKER_CLI_BUILD: 1
   ARCHERY_DOCKER_USER: ${{ secrets.DOCKERHUB_USER }}
   ARCHERY_DOCKER_PASSWORD: ${{ secrets.DOCKERHUB_TOKEN }}
diff --git a/.github/workflows/ruby.yml b/.github/workflows/ruby.yml
index 03f4f0e..6ade7d8 100644
--- a/.github/workflows/ruby.yml
+++ b/.github/workflows/ruby.yml
@@ -45,6 +45,7 @@ on:
 
 env:
   DOCKER_BUILDKIT: 0
+  DOCKER_VOLUME_PREFIX: ".docker/"
   COMPOSE_DOCKER_CLI_BUILD: 1
   ARCHERY_DOCKER_USER: ${{ secrets.DOCKERHUB_USER }}
   ARCHERY_DOCKER_PASSWORD: ${{ secrets.DOCKERHUB_TOKEN }}
diff --git a/ci/docker/python-wheel-manylinux-201x.dockerfile b/ci/docker/python-wheel-manylinux-201x.dockerfile
new file mode 100644
index 0000000..0a17d86
--- /dev/null
+++ b/ci/docker/python-wheel-manylinux-201x.dockerfile
@@ -0,0 +1,103 @@
+# Licensed to the Apache Software Foundation (ASF) under one
+# or more contributor license agreements.  See the NOTICE file
+# distributed with this work for additional information
+# regarding copyright ownership.  The ASF licenses this file
+# to you under the Apache License, Version 2.0 (the
+# "License"); you may not use this file except in compliance
+# with the License.  You may obtain a copy of the License at
+#
+#   http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing,
+# software distributed under the License is distributed on an
+# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+# KIND, either express or implied.  See the License for the
+# specific language governing permissions and limitations
+# under the License.
+
+ARG base
+FROM ${base}
+
+RUN yum install -y git flex curl autoconf zip wget
+
+# Install CMake
+ARG cmake=3.19.2
+RUN wget -q https://github.com/Kitware/CMake/releases/download/v${cmake}/cmake-${cmake}-Linux-x86_64.tar.gz -O - | \
+    tar -xzf - --directory /usr/local --strip-components=1
+
+# Install Ninja
+ARG ninja=1.10.2
+RUN mkdir /tmp/ninja && \
+    wget -q https://github.com/ninja-build/ninja/archive/v${ninja}.tar.gz -O - | \
+    tar -xzf - --directory /tmp/ninja --strip-components=1 && \
+    cd /tmp/ninja && \
+    ./configure.py --bootstrap && \
+    mv ninja /usr/local/bin && \
+    rm -rf /tmp/ninja
+
+# Install ccache
+ARG ccache=4.1
+RUN mkdir /tmp/ccache && \
+    wget -q https://github.com/ccache/ccache/archive/v${ccache}.tar.gz -O - | \
+    tar -xzf - --directory /tmp/ccache --strip-components=1 && \
+    cd /tmp/ccache && \
+    mkdir build && \
+    cd build && \
+    cmake -GNinja -DCMAKE_BUILD_TYPE=Release -DZSTD_FROM_INTERNET=ON .. && \
+    ninja install && \
+    rm -rf /tmp/ccache
+
+# Install vcpkg
+ARG vcpkg
+RUN git clone https://github.com/microsoft/vcpkg /opt/vcpkg && \
+    git -C /opt/vcpkg checkout ${vcpkg} && \
+    /opt/vcpkg/bootstrap-vcpkg.sh --useSystemBinaries --disableMetrics && \
+    ln -s /opt/vcpkg/vcpkg /usr/bin/vcpkg
+
+# Patch ports files as needed
+COPY ci/vcpkg arrow/ci/vcpkg
+RUN cd /opt/vcpkg && patch -p1 -i /arrow/ci/vcpkg/ports.patch
+
+ARG build_type=release
+ENV CMAKE_BUILD_TYPE=${build_type} \
+    VCPKG_FORCE_SYSTEM_BINARIES=1 \
+    VCPKG_OVERLAY_TRIPLETS=/arrow/ci/vcpkg \
+    VCPKG_DEFAULT_TRIPLET=x64-linux-static-${build_type}
+
+# TODO(kszucs): factor out the package enumeration to a text file and reuse it
+# from the windows image and potentially in a future macos wheel build
+RUN vcpkg install --clean-after-build \
+        abseil \
+        aws-sdk-cpp[config,cognito-identity,core,identity-management,s3,sts,transfer] \
+        boost-filesystem \
+        boost-regex \
+        brotli \
+        bzip2 \
+        c-ares \
+        curl \
+        flatbuffers \
+        gflags \
+        glog \
+        grpc \
+        lz4 \
+        openssl \
+        orc \
+        protobuf \
+        rapidjson \
+        re2 \
+        snappy \
+        thrift \
+        utf8proc \
+        zlib \
+        zstd
+
+ARG python=3.6
+ENV PYTHON_VERSION=${python}
+RUN PYTHON_ROOT=$(find /opt/python -name cp${PYTHON_VERSION/./}-*) && \
+    echo "export PATH=$PYTHON_ROOT/bin:\$PATH" >> /etc/profile.d/python.sh
+
+SHELL ["/bin/bash", "-i", "-c"]
+ENTRYPOINT ["/bin/bash", "-i", "-c"]
+
+COPY python/requirements-wheel-build.txt /arrow/python/
+RUN pip install -r /arrow/python/requirements-wheel-build.txt
diff --git a/ci/docker/python-wheel-windows-vs2017.dockerfile b/ci/docker/python-wheel-windows-vs2017.dockerfile
new file mode 100644
index 0000000..50372bd
--- /dev/null
+++ b/ci/docker/python-wheel-windows-vs2017.dockerfile
@@ -0,0 +1,99 @@
+# Licensed to the Apache Software Foundation (ASF) under one
+# or more contributor license agreements.  See the NOTICE file
+# distributed with this work for additional information
+# regarding copyright ownership.  The ASF licenses this file
+# to you under the Apache License, Version 2.0 (the
+# "License"); you may not use this file except in compliance
+# with the License.  You may obtain a copy of the License at
+#
+#   http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing,
+# software distributed under the License is distributed on an
+# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+# KIND, either express or implied.  See the License for the
+# specific language governing permissions and limitations
+# under the License.
+
+# based on mcr.microsoft.com/windows/servercore:ltsc2019
+# contains choco and vs2017 preinstalled
+FROM abrarov/msvc-2017:2.10.0
+
+# Install CMake and Ninja
+RUN choco install --no-progress -r -y cmake --installargs 'ADD_CMAKE_TO_PATH=System' && \
+    choco install --no-progress -r -y gzip wget ninja
+
+# Add unix tools to path
+RUN setx path "%path%;C:\Program Files\Git\usr\bin"
+
+# Install vcpkg
+ARG vcpkg
+RUN git clone https://github.com/Microsoft/vcpkg && \
+    git -C vcpkg checkout %vcpkg% && \
+    vcpkg\bootstrap-vcpkg.bat -disableMetrics -win64 && \
+    setx PATH "%PATH%;C:\vcpkg"
+
+# Patch ports files as needed
+COPY ci/vcpkg arrow/ci/vcpkg
+RUN cd vcpkg && patch -p1 -i C:/arrow/ci/vcpkg/ports.patch
+
+# Configure vcpkg and install dependencies
+# NOTE: use windows batch environment notation for build arguments in RUN
+# statements but bash notation in ENV statements
+# VCPKG_FORCE_SYSTEM_BINARIES=1 spare around ~750MB of image size if the system
+# cmake's and ninja's versions are recent enough
+COPY ci/vcpkg arrow/ci/vcpkg
+ARG build_type=release
+ENV CMAKE_BUILD_TYPE=${build_type} \
+    VCPKG_OVERLAY_TRIPLETS=C:\\arrow\\ci\\vcpkg \
+    VCPKG_DEFAULT_TRIPLET=x64-windows-static-md-${build_type}
+RUN vcpkg install --clean-after-build \
+        abseil \
+        aws-sdk-cpp[config,cognito-identity,core,identity-management,s3,sts,transfer] \
+        boost-filesystem \
+        boost-multiprecision \
+        boost-regex \
+        boost-system \
+        brotli \
+        bzip2 \
+        c-ares \
+        curl \
+        flatbuffers \
+        gflags \
+        glog \
+        grpc \
+        lz4 \
+        openssl \
+        orc \
+        protobuf \
+        rapidjson \
+        re2 \
+        snappy \
+        thrift \
+        utf8proc \
+        zlib \
+        zstd
+
+# Remove previous installations of python from the base image
+RUN wmic product where "name like 'python%%'" call uninstall /nointeractive && \
+    rm -rf Python*
+
+# Define the full version number otherwise choco falls back to patch number 0 (3.7 => 3.7.0)
+ARG python=3.6
+RUN (if "%python%"=="3.6" setx PYTHON_VERSION 3.6.8) & \
+    (if "%python%"=="3.7" setx PYTHON_VERSION 3.7.4) & \
+    (if "%python%"=="3.8" setx PYTHON_VERSION 3.8.6) & \
+    (if "%python%"=="3.9" setx PYTHON_VERSION 3.9.1)
+RUN choco install -r -y --no-progress python --version=%PYTHON_VERSION%
+RUN python -m pip install -U pip
+
+COPY python/requirements-wheel-build.txt arrow/python/
+RUN pip install -r arrow/python/requirements-wheel-build.txt
+
+# TODO(kszucs): set clcache as the compiler
+ENV CLCACHE_DIR="C:\clcache"
+RUN pip install clcache
+
+# For debugging purposes
+# RUN wget --no-check-certificate https://github.com/lucasg/Dependencies/releases/download/v1.10/Dependencies_x64_Release.zip
+# RUN unzip Dependencies_x64_Release.zip -d Dependencies && setx path "%path%;C:\Depencencies"
diff --git a/ci/scripts/python_wheel_manylinux_build.sh b/ci/scripts/python_wheel_manylinux_build.sh
new file mode 100755
index 0000000..68e75c3
--- /dev/null
+++ b/ci/scripts/python_wheel_manylinux_build.sh
@@ -0,0 +1,142 @@
+#!/usr/bin/env bash
+#
+# Licensed to the Apache Software Foundation (ASF) under one
+# or more contributor license agreements.  See the NOTICE file
+# distributed with this work for additional information
+# regarding copyright ownership.  The ASF licenses this file
+# to you under the Apache License, Version 2.0 (the
+# "License"); you may not use this file except in compliance
+# with the License.  You may obtain a copy of the License at
+#
+#   http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing,
+# software distributed under the License is distributed on an
+# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+# KIND, either express or implied.  See the License for the
+# specific language governing permissions and limitations
+# under the License.
+
+set -ex
+
+function check_arrow_visibility {
+    nm --demangle --dynamic /tmp/arrow-dist/lib/libarrow.so > nm_arrow.log
+
+    # Filter out Arrow symbols and see if anything remains.
+    # '_init' and '_fini' symbols may or not be present, we don't care.
+    # (note we must ignore the grep exit status when no match is found)
+    grep ' T ' nm_arrow.log | grep -v -E '(arrow|\b_init\b|\b_fini\b)' | cat - > visible_symbols.log
+
+    if [[ -f visible_symbols.log && `cat visible_symbols.log | wc -l` -eq 0 ]]; then
+        return 0
+    else
+        echo "== Unexpected symbols exported by libarrow.so =="
+        cat visible_symbols.log
+        echo "================================================"
+
+        exit 1
+    fi
+}
+
+echo "=== (${PYTHON_VERSION}) Clear output directories and leftovers ==="
+# Clear output directories and leftovers
+rm -rf /tmp/arrow-build
+rm -rf /arrow/python/dist
+rm -rf /arrow/python/build
+rm -rf /arrow/python/repaired_wheels
+rm -rf /arrow/python/pyarrow/*.so
+rm -rf /arrow/python/pyarrow/*.so.*
+
+echo "=== (${PYTHON_VERSION}) Building Arrow C++ libraries ==="
+: ${ARROW_DATASET:=ON}
+: ${ARROW_FLIGHT:=ON}
+: ${ARROW_GANDIVA:=OFF}
+: ${ARROW_HDFS:=ON}
+: ${ARROW_JEMALLOC:=ON}
+: ${ARROW_MIMALLOC:=ON}
+: ${ARROW_ORC:=ON}
+: ${ARROW_PARQUET:=ON}
+: ${ARROW_PLASMA:=ON}
+: ${ARROW_S3:=ON}
+: ${ARROW_TENSORFLOW:=ON}
+: ${ARROW_WITH_BROTLI:=ON}
+: ${ARROW_WITH_BZ2:=ON}
+: ${ARROW_WITH_LZ4:=ON}
+: ${ARROW_WITH_SNAPPY:=ON}
+: ${ARROW_WITH_ZLIB:=ON}
+: ${ARROW_WITH_ZSTD:=ON}
+: ${CMAKE_BUILD_TYPE:=release}
+: ${CMAKE_GENERATOR:=Ninja}
+
+mkdir /tmp/arrow-build
+pushd /tmp/arrow-build
+cmake \
+    -DARROW_BROTLI_USE_SHARED=OFF \
+    -DARROW_BUILD_SHARED=ON \
+    -DARROW_BUILD_STATIC=OFF \
+    -DARROW_BUILD_TESTS=OFF \
+    -DARROW_DATASET=${ARROW_DATASET} \
+    -DARROW_DEPENDENCY_SOURCE="SYSTEM" \
+    -DARROW_DEPENDENCY_USE_SHARED=OFF \
+    -DARROW_FLIGHT==${ARROW_FLIGHT} \
+    -DARROW_GANDIVA=${ARROW_GANDIVA} \
+    -DARROW_HDFS=${ARROW_HDFS} \
+    -DARROW_JEMALLOC=${ARROW_JEMALLOC} \
+    -DARROW_MIMALLOC=${ARROW_MIMALLOC} \
+    -DARROW_ORC=${ARROW_ORC} \
+    -DARROW_PACKAGE_KIND="manylinux${MANYLINUX_VERSION}" \
+    -DARROW_PARQUET=${ARROW_PARQUET} \
+    -DARROW_PLASMA=${ARROW_PLASMA} \
+    -DARROW_PYTHON=ON \
+    -DARROW_RPATH_ORIGIN=ON \
+    -DARROW_S3=${ARROW_S3} \
+    -DARROW_TENSORFLOW=${ARROW_TENSORFLOW} \
+    -DARROW_USE_CCACHE=ON \
+    -DARROW_UTF8PROC_USE_SHARED=OFF \
+    -DARROW_WITH_BROTLI=${ARROW_WITH_BROTLI} \
+    -DARROW_WITH_BZ2=${ARROW_WITH_BZ2} \
+    -DARROW_WITH_LZ4=${ARROW_WITH_LZ4} \
+    -DARROW_WITH_SNAPPY=${ARROW_WITH_SNAPPY} \
+    -DARROW_WITH_ZLIB=${ARROW_WITH_ZLIB} \
+    -DARROW_WITH_ZSTD=${ARROW_WITH_ZSTD} \
+    -DCMAKE_BUILD_TYPE=${CMAKE_BUILD_TYPE} \
+    -DCMAKE_INSTALL_LIBDIR=lib \
+    -DCMAKE_INSTALL_PREFIX=/tmp/arrow-dist \
+    -DCMAKE_TOOLCHAIN_FILE=/opt/vcpkg/scripts/buildsystems/vcpkg.cmake \
+    -DCMAKE_UNITY_BUILD=ON \
+    -DOPENSSL_USE_STATIC_LIBS=ON \
+    -DThrift_ROOT=/opt/vcpkg/installed/x64-linux/lib \
+    -DVCPKG_TARGET_TRIPLET=x64-linux-static-${CMAKE_BUILD_TYPE} \
+    -G ${CMAKE_GENERATOR} \
+    /arrow/cpp
+cmake --build . --target install
+popd
+
+# Check that we don't expose any unwanted symbols
+check_arrow_visibility
+
+echo "=== (${PYTHON_VERSION}) Building wheel ==="
+export PYARROW_BUILD_TYPE=${CMAKE_BUILD_TYPE}
+export PYARROW_BUNDLE_ARROW_CPP=1
+export PYARROW_CMAKE_GENERATOR=${CMAKE_GENERATOR}
+export PYARROW_INSTALL_TESTS=1
+export PYARROW_WITH_DATASET=${ARROW_DATASET}
+export PYARROW_WITH_FLIGHT=${ARROW_FLIGHT}
+export PYARROW_WITH_GANDIVA=${ARROW_GANDIVA}
+export PYARROW_WITH_HDFS=${ARROW_HDFS}
+export PYARROW_WITH_ORC=${ARROW_ORC}
+export PYARROW_WITH_PARQUET=${ARROW_PARQUET}
+export PYARROW_WITH_PLASMA=${ARROW_PLASMA}
+export PYARROW_WITH_S3=${ARROW_S3}
+# PyArrow build configuration
+export PKG_CONFIG_PATH=/usr/lib/pkgconfig:/tmp/arrow-dist/lib/pkgconfig
+
+pushd /arrow/python
+python setup.py bdist_wheel
+
+echo "=== (${PYTHON_VERSION}) Tag the wheel with manylinux${MANYLINUX_VERSION} ==="
+auditwheel repair \
+    --plat "manylinux${MANYLINUX_VERSION}_x86_64" \
+    -L . dist/pyarrow-*.whl \
+    -w repaired_wheels
+popd
diff --git a/dev/tasks/python-wheels/manylinux-test.sh b/ci/scripts/python_wheel_manylinux_test.sh
similarity index 61%
rename from dev/tasks/python-wheels/manylinux-test.sh
rename to ci/scripts/python_wheel_manylinux_test.sh
index e1a6659..d603f7c 100755
--- a/dev/tasks/python-wheels/manylinux-test.sh
+++ b/ci/scripts/python_wheel_manylinux_test.sh
@@ -1,5 +1,5 @@
-#!/bin/bash
-
+#!/usr/bin/env bash
+#
 # Licensed to the Apache Software Foundation (ASF) under one
 # or more contributor license agreements.  See the NOTICE file
 # distributed with this work for additional information
@@ -17,36 +17,43 @@
 # specific language governing permissions and limitations
 # under the License.
 
-set -e
+set -ex
 
-export ARROW_TEST_DATA=/arrow/testing/data
 export PYARROW_TEST_CYTHON=OFF
+export PYARROW_TEST_DATASET=ON
+export PYARROW_TEST_GANDIVA=OFF
+export PYARROW_TEST_HDFS=ON
+export PYARROW_TEST_ORC=ON
+export PYARROW_TEST_PANDAS=ON
+export PYARROW_TEST_PARQUET=ON
+export PYARROW_TEST_PLASMA=ON
+export PYARROW_TEST_S3=ON
+export PYARROW_TEST_TENSORFLOW=ON
+export PYARROW_TEST_FLIGHT=ON
 
-python --version
-# Install built wheel
-pip install -q /arrow/python/$WHEEL_DIR/dist/*.whl
-# Install test dependencies
-pip install -q -r /arrow/python/requirements-wheel-test.txt
-# Run pyarrow tests
-pytest -rs --pyargs pyarrow
+export ARROW_TEST_DATA=/arrow/testing/data
+export PARQUET_TEST_DATA=/arrow/submodules/parquet-testing/data
 
-if [[ "$1" == "--remove-system-libs" ]]; then
-  # Run import tests after removing the bundled dependencies from the system
-  # (be careful not to remove libz.so, required for some Python stdlib modules)
-  echo "Removing the following libraries to fail loudly if they are bundled incorrectly:"
-  ldconfig -p | grep "lib\(bz2\|lz4\|z[^.]\|boost\)" | awk -F'> ' '{print $2}' | sort | xargs rm -v -f
-fi
+# Install the built wheels
+pip install /arrow/python/repaired_wheels/*.whl
 
-# Test import and optional dependencies
+# Test that the modules are importable
 python -c "
 import pyarrow
+import pyarrow._hdfs
+import pyarrow._s3fs
 import pyarrow.csv
 import pyarrow.dataset
 import pyarrow.flight
 import pyarrow.fs
-import pyarrow._hdfs
 import pyarrow.json
+import pyarrow.orc
 import pyarrow.parquet
 import pyarrow.plasma
-import pyarrow._s3fs
 "
+
+# Install testing dependencies
+pip install -r /arrow/python/requirements-wheel-test.txt
+
+# Execute unittest
+pytest -v -r s --pyargs pyarrow
diff --git a/ci/scripts/python_wheel_windows_build.bat b/ci/scripts/python_wheel_windows_build.bat
new file mode 100644
index 0000000..a9a596b
--- /dev/null
+++ b/ci/scripts/python_wheel_windows_build.bat
@@ -0,0 +1,111 @@
+@rem Licensed to the Apache Software Foundation (ASF) under one
+@rem or more contributor license agreements.  See the NOTICE file
+@rem distributed with this work for additional information
+@rem regarding copyright ownership.  The ASF licenses this file
+@rem to you under the Apache License, Version 2.0 (the
+@rem "License"); you may not use this file except in compliance
+@rem with the License.  You may obtain a copy of the License at
+@rem
+@rem   http://www.apache.org/licenses/LICENSE-2.0
+@rem
+@rem Unless required by applicable law or agreed to in writing,
+@rem software distributed under the License is distributed on an
+@rem "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+@rem KIND, either express or implied.  See the License for the
+@rem specific language governing permissions and limitations
+@rem under the License.
+
+@echo on
+
+echo "Building windows wheel..."
+
+call "C:\Program Files (x86)\Microsoft Visual Studio\2017\Community\VC\Auxiliary\Build\vcvars64.bat"
+
+echo "=== (%PYTHON_VERSION%) Clear output directories and leftovers ==="
+del /s /q C:\arrow-build
+del /s /q C:\arrow-dist
+del /s /q C:\arrow\python\dist
+del /s /q C:\arrow\python\build
+del /s /q C:\arrow\python\pyarrow\*.so
+del /s /q C:\arrow\python\pyarrow\*.so.*
+
+echo "=== (%PYTHON_VERSION%) Building Arrow C++ libraries ==="
+set ARROW_DATASET=ON
+set ARROW_FLIGHT=ON
+set ARROW_GANDIVA=OFF
+set ARROW_HDFS=ON
+set ARROW_ORC=OFF
+set ARROW_PARQUET=ON
+set ARROW_MIMALLOC=ON
+set ARROW_S3=ON
+set ARROW_TENSORFLOW=ON
+set ARROW_WITH_BROTLI=ON
+set ARROW_WITH_BZ2=ON
+set ARROW_WITH_LZ4=ON
+set ARROW_WITH_SNAPPY=ON
+set ARROW_WITH_ZLIB=ON
+set ARROW_WITH_ZSTD=ON
+set CMAKE_UNITY_BUILD=ON
+set CMAKE_GENERATOR=Visual Studio 15 2017 Win64
+
+mkdir C:\arrow-build
+pushd C:\arrow-build
+cmake ^
+    -DARROW_BUILD_SHARED=ON ^
+    -DARROW_BUILD_STATIC=OFF ^
+    -DARROW_BUILD_TESTS=OFF ^
+    -DARROW_CXXFLAGS="/MP" ^
+    -DARROW_DATASET=%ARROW_DATASET% ^
+    -DARROW_DEPENDENCY_SOURCE=SYSTEM ^
+    -DARROW_DEPENDENCY_USE_SHARED=OFF ^
+    -DARROW_FLIGHT=%ARROW_FLIGHT% ^
+    -DARROW_GANDIVA=%ARROW_GANDIVA% ^
+    -DARROW_HDFS=%ARROW_HDFS% ^
+    -DARROW_MIMALLOC=%ARROW_MIMALLOC% ^
+    -DARROW_ORC=%ARROW_ORC% ^
+    -DARROW_PACKAGE_KIND="wheel-windows" ^
+    -DARROW_PARQUET=%ARROW_PARQUET% ^
+    -DARROW_PYTHON=ON ^
+    -DARROW_S3=%ARROW_S3% ^
+    -DARROW_TENSORFLOW=%ARROW_TENSORFLOW% ^
+    -DARROW_WITH_BROTLI=%ARROW_WITH_BROTLI% ^
+    -DARROW_WITH_BZ2=%ARROW_WITH_BZ2% ^
+    -DARROW_WITH_LZ4=%ARROW_WITH_LZ4% ^
+    -DARROW_WITH_SNAPPY=%ARROW_WITH_SNAPPY% ^
+    -DARROW_WITH_ZLIB=%ARROW_WITH_ZLIB% ^
+    -DARROW_WITH_ZSTD=%ARROW_WITH_ZSTD% ^
+    -DCMAKE_BUILD_TYPE=%CMAKE_BUILD_TYPE% ^
+    -DLZ4_MSVC_LIB_PREFIX="" ^
+    -DLZ4_MSVC_STATIC_LIB_SUFFIX="" ^
+    -DZSTD_MSVC_LIB_PREFIX="" ^
+    -DCMAKE_CXX_COMPILER=clcache ^
+    -DCMAKE_INSTALL_PREFIX=C:\arrow-dist ^
+    -DCMAKE_TOOLCHAIN_FILE=C:\vcpkg\scripts\buildsystems\vcpkg.cmake ^
+    -DCMAKE_UNITY_BUILD=%CMAKE_UNITY_BUILD% ^
+    -DMSVC_LINK_VERBOSE=ON ^
+    -DVCPKG_TARGET_TRIPLET=x64-windows-static-md-%CMAKE_BUILD_TYPE% ^
+    -G "%CMAKE_GENERATOR%" ^
+    C:\arrow\cpp || exit /B
+cmake --build . --config %CMAKE_BUILD_TYPE% --target install || exit /B
+popd
+
+echo "=== (%PYTHON_VERSION%) Building wheel ==="
+set PYARROW_BUILD_TYPE=%CMAKE_BUILD_TYPE%
+set PYARROW_BUNDLE_ARROW_CPP=ON
+set PYARROW_BUNDLE_BOOST=OFF
+set PYARROW_CMAKE_GENERATOR=%CMAKE_GENERATOR%
+set PYARROW_INSTALL_TESTS=ON
+set PYARROW_WITH_DATASET=%ARROW_DATASET%
+set PYARROW_WITH_FLIGHT=%ARROW_FLIGHT%
+set PYARROW_WITH_GANDIVA=%ARROW_GANDIVA%
+set PYARROW_WITH_HDFS=%ARROW_HDFS%
+set PYARROW_WITH_ORC=%ARROW_ORC%
+set PYARROW_WITH_PARQUET=%ARROW_PARQUET%
+set PYARROW_WITH_S3=%ARROW_S3%
+set ARROW_HOME=C:\arrow-dist
+
+pushd C:\arrow\python
+@REM bundle the msvc runtime
+cp "C:\Program Files (x86)\Microsoft Visual Studio\2017\Community\VC\Redist\MSVC\14.16.27012\x64\Microsoft.VC141.CRT\msvcp140.dll" pyarrow\
+python setup.py bdist_wheel || exit /B
+popd
diff --git a/ci/scripts/python_wheel_windows_test.bat b/ci/scripts/python_wheel_windows_test.bat
new file mode 100755
index 0000000..8352e58
--- /dev/null
+++ b/ci/scripts/python_wheel_windows_test.bat
@@ -0,0 +1,54 @@
+@rem Licensed to the Apache Software Foundation (ASF) under one
+@rem or more contributor license agreements.  See the NOTICE file
+@rem distributed with this work for additional information
+@rem regarding copyright ownership.  The ASF licenses this file
+@rem to you under the Apache License, Version 2.0 (the
+@rem "License"); you may not use this file except in compliance
+@rem with the License.  You may obtain a copy of the License at
+@rem
+@rem   http://www.apache.org/licenses/LICENSE-2.0
+@rem
+@rem Unless required by applicable law or agreed to in writing,
+@rem software distributed under the License is distributed on an
+@rem "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+@rem KIND, either express or implied.  See the License for the
+@rem specific language governing permissions and limitations
+@rem under the License.
+
+@echo on
+
+set PYARROW_TEST_CYTHON=OFF
+set PYARROW_TEST_DATASET=ON
+set PYARROW_TEST_GANDIVA=OFF
+set PYARROW_TEST_HDFS=ON
+set PYARROW_TEST_ORC=OFF
+set PYARROW_TEST_PANDAS=ON
+set PYARROW_TEST_PARQUET=ON
+set PYARROW_TEST_PLASMA=OFF
+set PYARROW_TEST_S3=OFF
+set PYARROW_TEST_TENSORFLOW=ON
+set PYARROW_TEST_FLIGHT=ON
+
+set ARROW_TEST_DATA=C:\arrow\testing\data
+set PARQUET_TEST_DATA=C:\arrow\submodules\parquet-testing\data
+
+@REM Install the built wheels
+python -m pip install numpy
+python -m pip install --no-index --find-links=C:\arrow\python\dist\ pyarrow || exit /B
+
+@REM Test that the modules are importable
+python -c "import pyarrow"
+python -c "import pyarrow._hdfs"
+python -c "import pyarrow._s3fs"
+python -c "import pyarrow.csv"
+python -c "import pyarrow.dataset"
+python -c "import pyarrow.flight"
+python -c "import pyarrow.fs"
+python -c "import pyarrow.json"
+python -c "import pyarrow.parquet"
+
+@REM Install testing dependencies
+pip install -r C:\arrow\python\requirements-wheel-test.txt
+
+@REM Execute unittest
+pytest -r s --pyargs pyarrow
diff --git a/ci/vcpkg/ports.patch b/ci/vcpkg/ports.patch
new file mode 100644
index 0000000..106bf72
--- /dev/null
+++ b/ci/vcpkg/ports.patch
@@ -0,0 +1,48 @@
+diff --git a/ports/curl/portfile.cmake b/ports/curl/portfile.cmake
+index 6e18aecd0..2ccecf33c 100644
+--- a/ports/curl/portfile.cmake
++++ b/ports/curl/portfile.cmake
+@@ -76,6 +76,8 @@ vcpkg_configure_cmake(
+         -DCMAKE_DISABLE_FIND_PACKAGE_Perl=ON
+         -DENABLE_DEBUG=ON
+         -DCURL_CA_FALLBACK=ON
++        -DCURL_CA_PATH=none
++        -DCURL_CA_BUNDLE=none
+ )
+ 
+ vcpkg_install_cmake()
+diff --git a/ports/snappy/portfile.cmake b/ports/snappy/portfile.cmake
+index 75dd13302..84345c7ca 100644
+--- a/ports/snappy/portfile.cmake
++++ b/ports/snappy/portfile.cmake
+@@ -4,6 +4,7 @@ vcpkg_from_github(
+     REF 537f4ad6240e586970fe554614542e9717df7902 # 1.1.8
+     SHA512 555d3b69a6759592736cbaae8f41654f0cf14e8be693b5dde37640191e53daec189f895872557b173e905d10681ef502f3e6ed8566811add963ffef96ce4016d
+     HEAD_REF master
++    PATCHES "snappy-disable-bmi.patch"
+ )
+ 
+ vcpkg_configure_cmake(
+diff --git a/ports/snappy/snappy-disable-bmi.patch b/ports/snappy/snappy-disable-bmi.patch
+new file mode 100644
+index 000000000..2cbb1533a
+--- /dev/null
++++ b/ports/snappy/snappy-disable-bmi.patch
+@@ -0,0 +1,17 @@
++--- snappy.cc  2020-06-27 17:38:49.718993748 -0500
+++++ snappy.cc  2020-06-27 17:37:57.543268213 -0500
++@@ -717,14 +717,10 @@
++ static inline uint32 ExtractLowBytes(uint32 v, int n) {
++   assert(n >= 0);
++   assert(n <= 4);
++-#if SNAPPY_HAVE_BMI2
++-  return _bzhi_u32(v, 8 * n);
++-#else
++   // This needs to be wider than uint32 otherwise `mask << 32` will be
++   // undefined.
++   uint64 mask = 0xffffffff;
++   return v & ~(mask << (8 * n));
++-#endif
++ }
++
++ static inline bool LeftShiftOverflows(uint8 value, uint32 shift) {
diff --git a/python/manylinux1/scripts/build_zlib.sh b/ci/vcpkg/x64-linux-static-debug.cmake
old mode 100755
new mode 100644
similarity index 77%
rename from python/manylinux1/scripts/build_zlib.sh
rename to ci/vcpkg/x64-linux-static-debug.cmake
index 71968c1..3acee2e
--- a/python/manylinux1/scripts/build_zlib.sh
+++ b/ci/vcpkg/x64-linux-static-debug.cmake
@@ -1,4 +1,3 @@
-#!/bin/bash -ex
 # Licensed to the Apache Software Foundation (ASF) under one
 # or more contributor license agreements.  See the NOTICE file
 # distributed with this work for additional information
@@ -16,11 +15,10 @@
 # specific language governing permissions and limitations
 # under the License.
 
-curl -sL https://zlib.net/zlib-1.2.11.tar.gz -o /zlib-1.2.11.tar.gz
-tar xf zlib-1.2.11.tar.gz
-pushd zlib-1.2.11
-CFLAGS=-fPIC ./configure --static
-make -j8
-make install
-popd
-rm -rf zlib-1.2.11.tar.gz zlib-1.2.11
+set(VCPKG_TARGET_ARCHITECTURE x64)
+set(VCPKG_CRT_LINKAGE dynamic)
+set(VCPKG_LIBRARY_LINKAGE static)
+
+set(VCPKG_CMAKE_SYSTEM_NAME Linux)
+
+set(VCPKG_BUILD_TYPE debug)
diff --git a/python/manylinux201x/scripts/build_zlib.sh b/ci/vcpkg/x64-linux-static-release.cmake
old mode 100755
new mode 100644
similarity index 77%
rename from python/manylinux201x/scripts/build_zlib.sh
rename to ci/vcpkg/x64-linux-static-release.cmake
index 71968c1..c2caa49
--- a/python/manylinux201x/scripts/build_zlib.sh
+++ b/ci/vcpkg/x64-linux-static-release.cmake
@@ -1,4 +1,3 @@
-#!/bin/bash -ex
 # Licensed to the Apache Software Foundation (ASF) under one
 # or more contributor license agreements.  See the NOTICE file
 # distributed with this work for additional information
@@ -16,11 +15,10 @@
 # specific language governing permissions and limitations
 # under the License.
 
-curl -sL https://zlib.net/zlib-1.2.11.tar.gz -o /zlib-1.2.11.tar.gz
-tar xf zlib-1.2.11.tar.gz
-pushd zlib-1.2.11
-CFLAGS=-fPIC ./configure --static
-make -j8
-make install
-popd
-rm -rf zlib-1.2.11.tar.gz zlib-1.2.11
+set(VCPKG_TARGET_ARCHITECTURE x64)
+set(VCPKG_CRT_LINKAGE dynamic)
+set(VCPKG_LIBRARY_LINKAGE static)
+
+set(VCPKG_CMAKE_SYSTEM_NAME Linux)
+
+set(VCPKG_BUILD_TYPE release)
diff --git a/python/manylinux1/scripts/install_cmake.sh b/ci/vcpkg/x64-windows-static-md-debug.cmake
old mode 100755
new mode 100644
similarity index 72%
rename from python/manylinux1/scripts/install_cmake.sh
rename to ci/vcpkg/x64-windows-static-md-debug.cmake
index 5a51b35..3eae3cf
--- a/python/manylinux1/scripts/install_cmake.sh
+++ b/ci/vcpkg/x64-windows-static-md-debug.cmake
@@ -1,4 +1,3 @@
-#!/bin/bash -e
 # Licensed to the Apache Software Foundation (ASF) under one
 # or more contributor license agreements.  See the NOTICE file
 # distributed with this work for additional information
@@ -16,8 +15,8 @@
 # specific language governing permissions and limitations
 # under the License.
 
-export CMAKE_VERSION=3.18.2.post1
-/opt/python/cp37-cp37m/bin/pip install cmake==${CMAKE_VERSION} ninja
-ln -s /opt/python/cp37-cp37m/bin/cmake /usr/bin/cmake
-ln -s /opt/python/cp37-cp37m/bin/ninja /usr/bin/ninja
-strip /opt/_internal/cpython-3.*/lib/python3.7/site-packages/cmake/data/bin/*
+set(VCPKG_TARGET_ARCHITECTURE x64)
+set(VCPKG_CRT_LINKAGE dynamic)
+set(VCPKG_LIBRARY_LINKAGE static)
+
+set(VCPKG_BUILD_TYPE debug)
diff --git a/python/manylinux201x/scripts/install_cmake.sh b/ci/vcpkg/x64-windows-static-md-release.cmake
old mode 100755
new mode 100644
similarity index 71%
rename from python/manylinux201x/scripts/install_cmake.sh
rename to ci/vcpkg/x64-windows-static-md-release.cmake
index 474b72e..b8dfbc8
--- a/python/manylinux201x/scripts/install_cmake.sh
+++ b/ci/vcpkg/x64-windows-static-md-release.cmake
@@ -1,4 +1,3 @@
-#!/bin/bash -e
 # Licensed to the Apache Software Foundation (ASF) under one
 # or more contributor license agreements.  See the NOTICE file
 # distributed with this work for additional information
@@ -16,8 +15,8 @@
 # specific language governing permissions and limitations
 # under the License.
 
-export CMAKE_VERSION=3.18.2.post1
-/opt/python/cp37-cp37m/bin/pip install cmake==${CMAKE_VERSION} ninja
-ln -fs /opt/python/cp37-cp37m/bin/cmake /usr/local/bin/cmake
-ln -fs /opt/python/cp37-cp37m/bin/ninja /usr/local/bin/ninja
-strip /opt/_internal/cpython-3.*/lib/python3.7/site-packages/cmake/data/bin/*
+set(VCPKG_TARGET_ARCHITECTURE x64)
+set(VCPKG_CRT_LINKAGE dynamic)
+set(VCPKG_LIBRARY_LINKAGE static)
+
+set(VCPKG_BUILD_TYPE release)
diff --git a/cpp/cmake_modules/ThirdpartyToolchain.cmake b/cpp/cmake_modules/ThirdpartyToolchain.cmake
index e5c72be..1ebf836 100644
--- a/cpp/cmake_modules/ThirdpartyToolchain.cmake
+++ b/cpp/cmake_modules/ThirdpartyToolchain.cmake
@@ -1326,6 +1326,7 @@ endif()
 
 macro(build_protobuf)
   message("Building Protocol Buffers from source")
+  set(PROTOBUF_VENDORED TRUE)
   set(PROTOBUF_PREFIX "${CMAKE_CURRENT_BINARY_DIR}/protobuf_ep-install")
   set(PROTOBUF_INCLUDE_DIR "${PROTOBUF_PREFIX}/include")
   # Newer protobuf releases always have a lib prefix independent from CMAKE_STATIC_LIBRARY_PREFIX
@@ -2466,13 +2467,15 @@ macro(build_grpc)
       -DgRPC_GFLAGS_PROVIDER=package
       -DgRPC_MSVC_STATIC_RUNTIME=${ARROW_USE_STATIC_CRT}
       -DgRPC_PROTOBUF_PROVIDER=package
-      -DgRPC_PROTOBUF_PACKAGE_TYPE=CONFIG
       -DgRPC_RE2_PROVIDER=package
       -DgRPC_SSL_PROVIDER=package
       -DgRPC_ZLIB_PROVIDER=package
       -DCMAKE_INSTALL_PREFIX=${GRPC_PREFIX}
       -DCMAKE_INSTALL_LIBDIR=lib
       -DBUILD_SHARED_LIBS=OFF)
+  if(PROTOBUF_VENDORED)
+    list(APPEND GRPC_CMAKE_ARGS -DgRPC_PROTOBUF_PACKAGE_TYPE=CONFIG)
+  endif()
   if(OPENSSL_ROOT_DIR)
     list(APPEND GRPC_CMAKE_ARGS -DOPENSSL_ROOT_DIR=${OPENSSL_ROOT_DIR})
   endif()
diff --git a/dev/archery/archery/bot.py b/dev/archery/archery/bot.py
index d222d1e..d81be5c 100644
--- a/dev/archery/archery/bot.py
+++ b/dev/archery/archery/bot.py
@@ -244,7 +244,7 @@ def actions(ctx):
 
 
 @actions.group()
-@click.option('--crossbow', '-c', default='ursa-labs/crossbow',
+@click.option('--crossbow', '-c', default='ursacomputing/crossbow',
               help='Crossbow repository on github to use')
 @click.pass_obj
 def crossbow(obj, crossbow):
diff --git a/dev/archery/archery/cli.py b/dev/archery/archery/cli.py
index 827f435..def4c9e 100644
--- a/dev/archery/archery/cli.py
+++ b/dev/archery/archery/cli.py
@@ -707,12 +707,32 @@ def trigger_bot(event_name, event_payload, arrow_token, crossbow_token):
     bot.handle(event_name, event_payload)
 
 
+def _mock_compose_calls(compose):
+    from types import MethodType
+    from subprocess import CompletedProcess
+
+    def _mock(compose, executable):
+        def _execute(self, *args, **kwargs):
+            params = ['{}={}'.format(k, v)
+                      for k, v in self.config.params.items()]
+            command = ' '.join(params + [executable] + list(args))
+            click.echo(command)
+            return CompletedProcess([], 0)
+        return MethodType(_execute, compose)
+
+    compose._execute_docker = _mock(compose, executable='docker')
+    compose._execute_compose = _mock(compose, executable='docker-compose')
+
+
 @archery.group('docker')
 @click.option("--src", metavar="<arrow_src>", default=None,
               callback=validate_arrow_sources,
               help="Specify Arrow source directory.")
+@click.option('--dry-run/--execute', default=False,
+              help="Display the docker-compose commands instead of executing "
+                   "them.")
 @click.pass_obj
-def docker_compose(obj, src):
+def docker_compose(obj, src, dry_run):
     """Interact with docker-compose based builds."""
     from .docker import DockerCompose
 
@@ -725,7 +745,51 @@ def docker_compose(obj, src):
 
     # take the docker-compose parameters like PYTHON, PANDAS, UBUNTU from the
     # environment variables to keep the usage similar to docker-compose
-    obj['compose'] = DockerCompose(config_path, params=os.environ)
+    compose = DockerCompose(config_path, params=os.environ)
+    if dry_run:
+        _mock_compose_calls(compose)
+    obj['compose'] = compose
+
+
+@docker_compose.command('build')
+@click.argument('image')
+@click.option('--force-pull/--no-pull', default=True,
+              help="Whether to force pull the image and its ancestor images")
+@click.option('--using-docker-cli', default=False, is_flag=True,
+              envvar='ARCHERY_USE_DOCKER_CLI',
+              help="Use docker CLI directly for building instead of calling "
+                   "docker-compose. This may help to reuse cached layers.")
+@click.option('--use-cache/--no-cache', default=True,
+              help="Whether to use cache when building the image and its "
+                   "ancestor images")
+@click.option('--use-leaf-cache/--no-leaf-cache', default=True,
+              help="Whether to use cache when building only the (leaf) image "
+                   "passed as the argument. To disable caching for both the "
+                   "image and its ancestors use --no-cache option.")
+@click.pass_obj
+def docker_compose_build(obj, image, *, force_pull, using_docker_cli,
+                         use_cache, use_leaf_cache):
+    """
+    Execute docker-compose builds.
+    """
+    from .docker import UndefinedImage
+
+    compose = obj['compose']
+
+    try:
+        if force_pull:
+            compose.pull(image, pull_leaf=use_leaf_cache,
+                         using_docker=using_docker_cli)
+        compose.build(image, use_cache=use_cache,
+                      use_leaf_cache=use_leaf_cache,
+                      using_docker=using_docker_cli)
+    except UndefinedImage as e:
+        raise click.ClickException(
+            "There is no service/image defined in docker-compose.yml with "
+            "name: {}".format(str(e))
+        )
+    except RuntimeError as e:
+        raise click.ClickException(str(e))
 
 
 @docker_compose.command('run')
@@ -741,6 +805,10 @@ def docker_compose(obj, src):
               help="Whether to force build the image and its ancestor images")
 @click.option('--build-only', default=False, is_flag=True,
               help="Pull and/or build the image, but do not run it")
+@click.option('--using-docker-cli', default=False, is_flag=True,
+              envvar='ARCHERY_USE_DOCKER_CLI',
+              help="Use docker CLI directly for building instead of calling "
+                   "docker-compose. This may help to reuse cached layers.")
 @click.option('--use-cache/--no-cache', default=True,
               help="Whether to use cache when building the image and its "
                    "ancestor images")
@@ -748,15 +816,12 @@ def docker_compose(obj, src):
               help="Whether to use cache when building only the (leaf) image "
                    "passed as the argument. To disable caching for both the "
                    "image and its ancestors use --no-cache option.")
-@click.option('--dry-run/--execute', default=False,
-              help="Display the docker-compose commands instead of executing "
-                   "them.")
 @click.option('--volume', '-v', multiple=True,
               help="Set volume within the container")
 @click.pass_obj
 def docker_compose_run(obj, image, command, *, env, user, force_pull,
-                       force_build, build_only, use_cache, use_leaf_cache,
-                       dry_run, volume):
+                       force_build, build_only, using_docker_cli, use_cache,
+                       use_leaf_cache, volume):
     """Execute docker-compose builds.
 
     To see the available builds run `archery docker images`.
@@ -791,28 +856,23 @@ def docker_compose_run(obj, image, command, *, env, user, force_pull,
 
     compose = obj['compose']
 
-    if dry_run:
-        from types import MethodType
-
-        def _print_command(self, *args, **kwargs):
-            params = ['{}={}'.format(k, v) for k, v in self.params.items()]
-            command = ' '.join(params + ['docker-compose'] + list(args))
-            click.echo(command)
-
-        compose._execute_compose = MethodType(_print_command, compose)
-
     env = dict(kv.split('=', 1) for kv in env)
     try:
+        if force_pull:
+            compose.pull(image, pull_leaf=use_leaf_cache,
+                         using_docker=using_docker_cli)
+        if force_build:
+            compose.build(image, use_cache=use_cache,
+                          use_leaf_cache=use_leaf_cache,
+                          using_docker=using_docker_cli)
+        if build_only:
+            return
         compose.run(
             image,
             command=command,
             env=env,
             user=user,
-            force_pull=force_pull,
-            force_build=force_build,
-            build_only=build_only,
-            use_cache=use_cache,
-            use_leaf_cache=use_leaf_cache,
+            using_docker=using_docker_cli,
             volumes=volume
         )
     except UndefinedImage as e:
@@ -826,16 +886,20 @@ def docker_compose_run(obj, image, command, *, env, user, force_pull,
 
 @docker_compose.command('push')
 @click.argument('image')
-@click.option('--user', '-u', required=True, envvar='ARCHERY_DOCKER_USER',
+@click.option('--user', '-u', required=False, envvar='ARCHERY_DOCKER_USER',
               help='Docker repository username')
-@click.option('--password', '-p', required=True,
+@click.option('--password', '-p', required=False,
               envvar='ARCHERY_DOCKER_PASSWORD',
               help='Docker repository password')
+@click.option('--using-docker-cli', default=False, is_flag=True,
+              help="Use docker CLI directly for building instead of calling "
+                   "docker-compose. This may help to reuse cached layers.")
 @click.pass_obj
-def docker_compose_push(obj, image, user, password):
+def docker_compose_push(obj, image, user, password, using_docker_cli):
     """Push the generated docker-compose image."""
     compose = obj['compose']
-    compose.push(image, user=user, password=password)
+    compose.push(image, user=user, password=password,
+                 using_docker=using_docker_cli)
 
 
 @docker_compose.command('images')
diff --git a/dev/archery/archery/docker.py b/dev/archery/archery/docker.py
index 65fac3a..e31ce69 100644
--- a/dev/archery/archery/docker.py
+++ b/dev/archery/archery/docker.py
@@ -47,39 +47,46 @@ class UndefinedImage(Exception):
     pass
 
 
-class Docker(Command):
-
-    def __init__(self, docker_bin=None):
-        self.bin = default_bin(docker_bin, "docker")
+class ComposeConfig:
 
-
-class DockerCompose(Command):
-
-    def __init__(self, config_path, dotenv_path=None, compose_bin=None,
-                 params=None):
-        self.bin = default_bin(compose_bin, 'docker-compose')
-
-        self.config_path = _ensure_path(config_path)
+    def __init__(self, config_path, dotenv_path, compose_bin, params=None):
+        config_path = _ensure_path(config_path)
         if dotenv_path:
-            self.dotenv_path = _ensure_path(dotenv_path)
+            dotenv_path = _ensure_path(dotenv_path)
+        else:
+            dotenv_path = config_path.parent / '.env'
+        self._read_env(dotenv_path, params)
+        self._read_config(config_path, compose_bin)
+
+    def _read_env(self, dotenv_path, params):
+        """
+        Read .env and merge it with explicitly passed parameters.
+        """
+        self.dotenv = dotenv_values(str(dotenv_path))
+        if params is None:
+            self.params = {}
         else:
-            self.dotenv_path = self.config_path.parent / '.env'
+            self.params = {k: v for k, v in params.items() if k in self.dotenv}
 
-        self._read_env(params)
-        self._read_config()
+        # forward the process' environment variables
+        self.env = os.environ.copy()
+        # set the defaults from the dotenv files
+        self.env.update(self.dotenv)
+        # override the defaults passed as parameters
+        self.env.update(self.params)
 
-    def _read_config(self):
+    def _read_config(self, config_path, compose_bin):
         """
         Validate and read the docker-compose.yml
         """
         yaml = YAML()
-        with self.config_path.open() as fp:
+        with config_path.open() as fp:
             config = yaml.load(fp)
 
         services = config['services'].keys()
-        self.nodes = dict(flatten(config.get('x-hierarchy', {})))
+        self.hierarchy = dict(flatten(config.get('x-hierarchy', {})))
         self.with_gpus = config.get('x-with-gpus', [])
-        nodes = self.nodes.keys()
+        nodes = self.hierarchy.keys()
         errors = []
 
         for name in self.with_gpus:
@@ -100,13 +107,14 @@ class DockerCompose(Command):
             )
 
         # trigger docker-compose's own validation
-        result = self._execute_compose('config', check=False,
-                                       stderr=subprocess.PIPE,
-                                       stdout=subprocess.PIPE)
+        compose = Command('docker-compose')
+        args = ['--file', str(config_path), 'config']
+        result = compose.run(*args, env=self.env, check=False,
+                             stderr=subprocess.PIPE, stdout=subprocess.PIPE)
 
         if result.returncode != 0:
             # strip the intro line of docker-compose errors
-            errors += result.stderr.decode().splitlines()[1:]
+            errors += result.stderr.decode().splitlines()
 
         if errors:
             msg = '\n'.join([' - {}'.format(msg) for msg in errors])
@@ -115,34 +123,48 @@ class DockerCompose(Command):
             )
 
         rendered_config = StringIO(result.stdout.decode())
+        self.path = config_path
         self.config = yaml.load(rendered_config)
 
-    def _read_env(self, params):
-        """
-        Read .env and merge it with explicitly passed parameters.
-        """
-        self.dotenv = dotenv_values(str(self.dotenv_path))
-        if params is None:
-            self.params = {}
-        else:
-            self.params = {k: v for k, v in params.items() if k in self.dotenv}
+    def get(self, service_name):
+        try:
+            service = self.config['services'][service_name]
+        except KeyError:
+            raise UndefinedImage(service_name)
+        service['name'] = service_name
+        service['need_gpu'] = service_name in self.with_gpus
+        service['ancestors'] = self.hierarchy[service_name]
+        return service
 
-        # forward the process' environment variables
-        self._compose_env = os.environ.copy()
-        # set the defaults from the dotenv files
-        self._compose_env.update(self.dotenv)
-        # override the defaults passed as parameters
-        self._compose_env.update(self.params)
+    def __getitem__(self, service_name):
+        return self.get(service_name)
+
+
+class Docker(Command):
+
+    def __init__(self, docker_bin=None):
+        self.bin = default_bin(docker_bin, "docker")
 
-    def _validate_image(self, name):
-        if name not in self.nodes:
-            raise UndefinedImage(name)
+
+class DockerCompose(Command):
+
+    def __init__(self, config_path, dotenv_path=None, compose_bin=None,
+                 params=None):
+        compose_bin = default_bin(compose_bin, 'docker-compose')
+        self.config = ComposeConfig(config_path, dotenv_path, compose_bin,
+                                    params)
+        self.bin = compose_bin
+        self.pull_memory = set()
+
+    def clear_pull_memory(self):
+        self.pull_memory = set()
 
     def _execute_compose(self, *args, **kwargs):
         # execute as a docker compose command
         try:
-            return super().run('--file', str(self.config_path), *args,
-                               env=self._compose_env, **kwargs)
+            result = super().run('--file', str(self.config.path), *args,
+                                 env=self.config.env, **kwargs)
+            result.check_returncode()
         except subprocess.CalledProcessError as e:
             def formatdict(d, template):
                 return '\n'.join(
@@ -158,15 +180,18 @@ class DockerCompose(Command):
                 msg.format(
                     cmd=' '.join(e.cmd),
                     code=e.returncode,
-                    dotenv=formatdict(self.dotenv, template='  {}: {}'),
-                    params=formatdict(self.params, template='  export {}={}')
+                    dotenv=formatdict(self.config.dotenv, template='  {}: {}'),
+                    params=formatdict(
+                        self.config.params, template='  export {}={}'
+                    )
                 )
             )
 
     def _execute_docker(self, *args, **kwargs):
         # execute as a plain docker cli command
         try:
-            return Docker().run(*args, **kwargs)
+            result = Docker().run(*args, **kwargs)
+            result.check_returncode()
         except subprocess.CalledProcessError as e:
             raise RuntimeError(
                 "{} exited with non-zero exit code {}".format(
@@ -174,41 +199,75 @@ class DockerCompose(Command):
                 )
             )
 
-    def pull(self, image, pull_leaf=True):
-        self._validate_image(image)
+    def pull(self, service_name, pull_leaf=True, using_docker=False):
+        def _pull(service):
+            args = ['pull']
+            if service['image'] in self.pull_memory:
+                return
+
+            if using_docker:
+                try:
+                    self._execute_docker(*args, service['image'])
+                except Exception as e:
+                    # no --ignore-pull-failures equivalent for the docker CLI; log and continue
+                    print(e)
+            else:
+                args.append('--ignore-pull-failures')
+                self._execute_compose(*args, service['name'])
 
-        for ancestor in self.nodes[image]:
-            self._execute_compose('pull', '--ignore-pull-failures', ancestor)
+            self.pull_memory.add(service['image'])
 
+        service = self.config.get(service_name)
+        for ancestor in service['ancestors']:
+            _pull(self.config.get(ancestor))
         if pull_leaf:
-            self._execute_compose('pull', '--ignore-pull-failures', image)
+            _pull(service)
 
-    def build(self, image, use_cache=True, use_leaf_cache=True):
-        self._validate_image(image)
-
-        for ancestor in self.nodes[image]:
+    def build(self, service_name, use_cache=True, use_leaf_cache=True,
+              using_docker=False):
+        def _build(service, use_cache):
+            args = ['build']
+            cache_from = list(service.get('build', {}).get('cache_from', []))
             if use_cache:
-                self._execute_compose('build', ancestor)
+                for image in cache_from:
+                    if image not in self.pull_memory:
+                        try:
+                            self._execute_docker('pull', image)
+                        except Exception as e:
+                            print(e)
+                        finally:
+                            self.pull_memory.add(image)
             else:
-                self._execute_compose('build', '--no-cache', ancestor)
-
-        if use_cache and use_leaf_cache:
-            self._execute_compose('build', image)
-        else:
-            self._execute_compose('build', '--no-cache', image)
+                args.append('--no-cache')
+
+            if using_docker:
+                if 'build' not in service:
+                    # nothing to do
+                    return
+                # forward the compose build args explicitly (better for caching)
+                for k, v in service['build'].get('args', {}).items():
+                    args.extend(['--build-arg', '{}={}'.format(k, v)])
+                for img in cache_from:
+                    args.append('--cache-from="{}"'.format(img))
+                args.extend([
+                    '-f', service['build']['dockerfile'],
+                    '-t', service['image'],
+                    service['build'].get('context', '.')
+                ])
+                self._execute_docker(*args)
+            else:
+                self._execute_compose(*args, service['name'])
 
-    def run(self, image, command=None, *, env=None, force_pull=False,
-            force_build=False, use_cache=True, use_leaf_cache=True,
-            volumes=None, build_only=False, user=None):
-        self._validate_image(image)
+        service = self.config.get(service_name)
+        # build ancestor services
+        for ancestor in service['ancestors']:
+            _build(self.config.get(ancestor), use_cache=use_cache)
+        # build the leaf/target service
+        _build(service, use_cache=use_cache and use_leaf_cache)
 
-        if force_pull:
-            self.pull(image, pull_leaf=use_leaf_cache)
-        if force_build:
-            self.build(image, use_cache=use_cache,
-                       use_leaf_cache=use_leaf_cache)
-        if build_only:
-            return
+    def run(self, service_name, command=None, *, env=None, volumes=None,
+            user=None, using_docker=False):
+        service = self.config.get(service_name)
 
         args = []
         if user is not None:
@@ -222,58 +281,73 @@ class DockerCompose(Command):
             for volume in volumes:
                 args.extend(['--volume', volume])
 
-        if image in self.with_gpus:
-            # rendered compose configuration for the image
-            cc = self.config['services'][image]
-
+        if using_docker or service['need_gpu']:
             # use gpus, requires docker>=19.03
-            args.extend(['--gpus', 'all'])
+            if service['need_gpu']:
+                args.extend(['--gpus', 'all'])
+
+            if service.get('shm_size'):
+                args.extend(['--shm-size', service['shm_size']])
 
             # append env variables from the compose conf
-            for k, v in cc.get('environment', {}).items():
+            for k, v in service.get('environment', {}).items():
                 args.extend(['-e', '{}={}'.format(k, v)])
 
             # append volumes from the compose conf
-            for v in cc.get('volumes', []):
+            for v in service.get('volumes', []):
+                if not isinstance(v, str):
+                    # if not the compact string volume definition
+                    v = "{}:{}".format(v['source'], v['target'])
                 args.extend(['-v', v])
 
+            # infer whether an interactive shell is desired or not
+            if command in ['cmd.exe', 'bash', 'sh', 'powershell']:
+                args.append('-it')
+
             # get the actual docker image name instead of the compose service
             # name, which we generally refer to as the image
-            args.append(cc['image'])
+            args.append(service['image'])
 
             # add command from compose if it wasn't overridden
             if command is not None:
                 args.append(command)
             else:
                 # replace whitespaces from the preformatted compose command
-                cmd = shlex.split(cc.get('command', ''))
+                cmd = shlex.split(service.get('command', ''))
                 cmd = [re.sub(r"\s+", " ", token) for token in cmd]
                 if cmd:
                     args.extend(cmd)
 
             # execute as a plain docker cli command
-            return self._execute_docker('run', '--rm', '-it', *args)
+            self._execute_docker('run', '--rm', *args)
+        else:
+            # execute as a docker-compose command
+            args.append(service_name)
+            if command is not None:
+                args.append(command)
+            self._execute_compose('run', '--rm', *args)
 
-        # execute as a docker-compose command
-        args.append(image)
-        if command is not None:
-            args.append(command)
-        self._execute_compose('run', '--rm', *args)
+    def push(self, service_name, user=None, password=None, using_docker=False):
+        def _push(service):
+            if using_docker:
+                return self._execute_docker('push', service['image'])
+            else:
+                return self._execute_compose('push', service['name'])
 
-    def push(self, image, user, password):
-        self._validate_image(image)
-        try:
-            # TODO(kszucs): have an option for a prompt
-            Docker().run('login', '-u', user, '-p', password)
-        except subprocess.CalledProcessError:
-            # hide credentials
-            msg = ('Failed to push `{}`, check the passed credentials'
-                   .format(image))
-            raise RuntimeError(msg) from None
-        else:
-            for ancestor in self.nodes[image]:
-                self._execute_compose('push', ancestor)
-            self._execute_compose('push', image)
+        if user is not None:
+            try:
+                # TODO(kszucs): have an option for a prompt
+                self._execute_docker('login', '-u', user, '-p', password)
+            except subprocess.CalledProcessError:
+                # hide credentials
+                msg = ('Failed to push `{}`, check the passed credentials'
+                       .format(service_name))
+                raise RuntimeError(msg) from None
+
+        service = self.config.get(service_name)
+        for ancestor in service['ancestors']:
+            _push(self.config.get(ancestor))
+        _push(service)
 
     def images(self):
-        return sorted(self.nodes.keys())
+        return sorted(self.config.hierarchy.keys())
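
A minimal usage sketch of the refactored archery.docker API above (an editorial illustration, not part of this commit; the service name, params and flags are only examples):

```python
from archery.docker import DockerCompose

# the compose file plus the .env next to it drive the configuration
compose = DockerCompose('docker-compose.yml', params={'PYTHON': '3.8'})

# pull and build a service together with its ancestors from x-hierarchy
compose.pull('conda-python', pull_leaf=True, using_docker=False)
compose.build('conda-python', use_cache=True, use_leaf_cache=True,
              using_docker=False)

# run the service; using_docker=True switches to the plain docker CLI,
# which the Windows builders request via ARCHERY_USE_DOCKER_CLI=1
compose.run('conda-python', command='bash', volumes=(), using_docker=False)
```
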
diff --git a/dev/archery/archery/testing.py b/dev/archery/archery/testing.py
index a773e156..471a54d 100644
--- a/dev/archery/archery/testing.py
+++ b/dev/archery/archery/testing.py
@@ -55,6 +55,12 @@ def _ensure_mock_call_object(obj, **kwargs):
         raise TypeError(obj)
 
 
+class SuccessfulSubprocessResult:
+
+    def check_returncode(self):
+        return
+
+
 @contextmanager
 def assert_subprocess_calls(expected_commands_or_calls, **kwargs):
     calls = [
@@ -62,6 +68,7 @@ def assert_subprocess_calls(expected_commands_or_calls, **kwargs):
         for obj in expected_commands_or_calls
     ]
     with mock.patch('subprocess.run', autospec=True) as run:
+        run.return_value = SuccessfulSubprocessResult()
         yield run
         run.assert_has_calls(calls)
 
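
A short sketch (again editorial, not in the commit) of why the stubbed return value is needed: the reworked _execute_compose/_execute_docker helpers call check_returncode() on the result of subprocess.run, so the patched mock must hand back an object on which that call is a harmless no-op:

```python
from unittest import mock
import subprocess


class SuccessfulSubprocessResult:

    def check_returncode(self):
        return  # behave like a process that exited with status 0


with mock.patch('subprocess.run', autospec=True) as run:
    run.return_value = SuccessfulSubprocessResult()
    result = subprocess.run(['docker-compose', 'config'])
    result.check_returncode()  # no exception, just as for a real success
    run.assert_called_once_with(['docker-compose', 'config'])
```
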
diff --git a/dev/archery/archery/tests/test_cli.py b/dev/archery/archery/tests/test_cli.py
index 61ef75a..d2f0355 100644
--- a/dev/archery/archery/tests/test_cli.py
+++ b/dev/archery/archery/tests/test_cli.py
@@ -17,124 +17,130 @@
 
 from unittest.mock import patch
 
-import pytest
 from click.testing import CliRunner
 
 from archery.cli import archery
 from archery.docker import DockerCompose
 
 
-@pytest.mark.parametrize(('command', 'args', 'kwargs'), [
-    (
-        ['ubuntu-cpp', '--build-only'],
-        ['ubuntu-cpp'],
-        dict(
-            command=None,
-            env={},
-            user=None,
-            force_pull=True,
-            force_build=True,
-            build_only=True,
-            use_cache=True,
-            use_leaf_cache=True,
-            volumes=()
-        )
-    ),
-    (
-        ['ubuntu-cpp', 'bash'],
-        ['ubuntu-cpp'],
-        dict(
-            command='bash',
-            env={},
-            user=None,
-            force_pull=True,
-            force_build=True,
-            build_only=False,
-            use_cache=True,
-            use_leaf_cache=True,
-            volumes=()
-        )
-    ),
-    (
-        ['ubuntu-cpp', '--no-pull', '--no-build'],
-        ['ubuntu-cpp'],
-        dict(
-            command=None,
-            env={},
-            user=None,
-            force_pull=False,
-            force_build=False,
-            build_only=False,
-            use_cache=True,
-            use_leaf_cache=True,
-            volumes=()
-        )
-    ),
-    (
-        [
-            'ubuntu-cpp', '--no-pull', '--force-build', '--user', 'me',
-            '--no-cache', '--no-leaf-cache'
-        ],
-        ['ubuntu-cpp'],
-        dict(
-            command=None,
-            env={},
-            user='me',
-            force_pull=False,
-            force_build=True,
-            build_only=False,
-            use_cache=False,
-            use_leaf_cache=False,
-            volumes=()
-        )
-    ),
-    (
-        [
-            '-e', 'ARROW_GANDIVA=OFF', '-e', 'ARROW_FLIGHT=ON', '-u', 'root',
-            'ubuntu-cpp'
-        ],
-        ['ubuntu-cpp'],
-        dict(
-            command=None,
-            env={
-                'ARROW_GANDIVA': 'OFF',
-                'ARROW_FLIGHT': 'ON'
-            },
-            user='root',
-            force_pull=True,
-            force_build=True,
-            build_only=False,
-            use_cache=True,
-            use_leaf_cache=True,
-            volumes=()
-        )
-    ),
-    (
-        [
-            '--volume', './build:/build', '-v', './ccache:/ccache:delegated',
-            'ubuntu-cpp'
-        ],
-        ['ubuntu-cpp'],
-        dict(
-            command=None,
-            env={},
-            user=None,
-            force_pull=True,
-            force_build=True,
-            build_only=False,
-            use_cache=True,
-            use_leaf_cache=True,
-            volumes=(
-                './build:/build',
-                './ccache:/ccache:delegated',
-            )
-        )
+@patch.object(DockerCompose, "pull")
+@patch.object(DockerCompose, "build")
+@patch.object(DockerCompose, "run")
+def test_docker_run_with_custom_command(run, build, pull):
+    # with custom command
+    args = ["docker", "run", "ubuntu-cpp", "bash"]
+    result = CliRunner().invoke(archery, args)
+    assert result.exit_code == 0
+    pull.assert_called_once_with(
+        "ubuntu-cpp", pull_leaf=True, using_docker=False
     )
-])
-def test_docker_run(command, args, kwargs):
-    runner = CliRunner()
+    build.assert_called_once_with(
+        "ubuntu-cpp", use_cache=True, use_leaf_cache=True, using_docker=False
+    )
+    run.assert_called_once_with(
+        "ubuntu-cpp",
+        command="bash",
+        env={},
+        user=None,
+        using_docker=False,
+        volumes=(),
+    )
+
 
-    with patch.object(DockerCompose, 'run') as run:
-        result = runner.invoke(archery, ['docker', 'run'] + command)
-        assert result.exit_code == 0
-        run.assert_called_once_with(*args, **kwargs)
+@patch.object(DockerCompose, "pull")
+@patch.object(DockerCompose, "build")
+@patch.object(DockerCompose, "run")
+def test_docker_run_options(run, build, pull):
+    # environment variables and volumes
+    args = [
+        "docker",
+        "run",
+        "-e",
+        "ARROW_GANDIVA=OFF",
+        "-e",
+        "ARROW_FLIGHT=ON",
+        "--volume",
+        "./build:/build",
+        "-v",
+        "./ccache:/ccache:delegated",
+        "-u",
+        "root",
+        "ubuntu-cpp",
+    ]
+    result = CliRunner().invoke(archery, args)
+    assert result.exit_code == 0
+    pull.assert_called_once_with(
+        "ubuntu-cpp", pull_leaf=True, using_docker=False
+    )
+    build.assert_called_once_with(
+        "ubuntu-cpp", use_cache=True, use_leaf_cache=True, using_docker=False
+    )
+    run.assert_called_once_with(
+        "ubuntu-cpp",
+        command=None,
+        env={"ARROW_GANDIVA": "OFF", "ARROW_FLIGHT": "ON"},
+        user="root",
+        using_docker=False,
+        volumes=(
+            "./build:/build",
+            "./ccache:/ccache:delegated",
+        ),
+    )
+
+
+@patch.object(DockerCompose, "run")
+def test_docker_run_without_pulling_or_building(run):
+    args = ["docker", "run", "--no-pull", "--no-build", "ubuntu-cpp"]
+    result = CliRunner().invoke(archery, args)
+    assert result.exit_code == 0
+    run.assert_called_once_with(
+        "ubuntu-cpp",
+        command=None,
+        env={},
+        user=None,
+        using_docker=False,
+        volumes=(),
+    )
+
+
+@patch.object(DockerCompose, "pull")
+@patch.object(DockerCompose, "build")
+def test_docker_run_only_pulling_and_building(build, pull):
+    args = ["docker", "run", "ubuntu-cpp", "--build-only"]
+    result = CliRunner().invoke(archery, args)
+    assert result.exit_code == 0
+    pull.assert_called_once_with(
+        "ubuntu-cpp", pull_leaf=True, using_docker=False
+    )
+    build.assert_called_once_with(
+        "ubuntu-cpp", use_cache=True, use_leaf_cache=True, using_docker=False
+    )
+
+
+@patch.object(DockerCompose, "build")
+@patch.object(DockerCompose, "run")
+def test_docker_run_without_build_cache(run, build):
+    args = [
+        "docker",
+        "run",
+        "--no-pull",
+        "--force-build",
+        "--user",
+        "me",
+        "--no-cache",
+        "--no-leaf-cache",
+        "ubuntu-cpp",
+    ]
+    result = CliRunner().invoke(archery, args)
+    assert result.exit_code == 0
+    build.assert_called_once_with(
+        "ubuntu-cpp", use_cache=False, use_leaf_cache=False, using_docker=False
+    )
+    run.assert_called_once_with(
+        "ubuntu-cpp",
+        command=None,
+        env={},
+        user="me",
+        using_docker=False,
+        volumes=(),
+    )
diff --git a/dev/archery/archery/tests/test_docker.py b/dev/archery/archery/tests/test_docker.py
index 2ed0c9d..cd8cfcf 100644
--- a/dev/archery/archery/tests/test_docker.py
+++ b/dev/archery/archery/tests/test_docker.py
@@ -41,17 +41,17 @@ x-hierarchy:
 
 services:
   foo:
-    image: dummy
+    image: org/foo
   sub-sub-foo:
-    image: dummy
+    image: org/sub-sub-foo
   another-sub-sub-foo:
-    image: dummy
+    image: org/another-sub-sub-foo
   bar:
-    image: dummy
+    image: org/bar
   sub-bar:
-    image: dummy
+    image: org/sub-bar
   baz:
-    image: dummy
+    image: org/baz
 """
 
 missing_node_compose_yml = """
@@ -67,19 +67,19 @@ x-hierarchy:
 
 services:
   foo:
-    image: dummy
+    image: org/foo
   sub-foo:
-    image: dummy
+    image: org/sub-foo
   sub-sub-foo:
-    image: dummy
+    image: org/sub-sub-foo
   another-sub-sub-foo:
-    image: dummy
+    image: org/another-sub-sub-foo
   bar:
-    image: dummy
+    image: org/bar
   sub-bar:
-    image: dummy
+    image: org/sub-bar
   baz:
-    image: dummy
+    image: org/baz
 """
 
 ok_compose_yml = """
@@ -96,19 +96,19 @@ x-hierarchy:
 
 services:
   foo:
-    image: dummy
+    image: org/foo
   sub-foo:
-    image: dummy
+    image: org/sub-foo
   sub-sub-foo:
-    image: dummy
+    image: org/sub-sub-foo
   another-sub-sub-foo:
-    image: dummy
+    image: org/another-sub-sub-foo
   bar:
-    image: dummy
+    image: org/bar
   sub-bar:
-    image: dummy
+    image: org/sub-bar
   baz:
-    image: dummy
+    image: org/baz
 """
 
 arrow_compose_yml = """
@@ -130,23 +130,23 @@ x-hierarchy:
 
 services:
   conda-cpp:
-    image: dummy
+    image: org/conda-cpp
   conda-python:
-    image: dummy
+    image: org/conda-python
   conda-python-pandas:
-    image: dummy
+    image: org/conda-python-pandas
   conda-python-dask:
-    image: dummy
+    image: org/conda-python-dask
   ubuntu-cpp:
-    image: dummy
+    image: org/ubuntu-cpp
   ubuntu-cpp-cmake32:
-    image: dummy
+    image: org/ubuntu-cpp-cmake32
   ubuntu-c-glib:
-    image: dummy
+    image: org/ubuntu-c-glib
   ubuntu-ruby:
-    image: dummy
+    image: org/ubuntu-ruby
   ubuntu-cuda:
-    image: dummy-cuda
+    image: org/ubuntu-cuda
     environment:
       CUDA_ENV: 1
       OTHER_ENV: 2
@@ -217,7 +217,7 @@ def assert_docker_calls(compose, expected_args):
 
 
 def assert_compose_calls(compose, expected_args, env=mock.ANY):
-    base_command = ['docker-compose', '--file', str(compose.config_path)]
+    base_command = ['docker-compose', '--file', str(compose.config.path)]
     expected_commands = []
     for args in expected_args:
         if isinstance(args, str):
@@ -235,8 +235,8 @@ def test_compose_default_params_and_env(arrow_compose_path):
         UBUNTU='18.04',
         DASK='master'
     ))
-    assert compose.dotenv == arrow_compose_env
-    assert compose.params == {
+    assert compose.config.dotenv == arrow_compose_env
+    assert compose.config.params == {
         'UBUNTU': '18.04',
         'DASK': 'master',
     }
@@ -267,6 +267,7 @@ def test_compose_pull(arrow_compose_path):
         "pull --ignore-pull-failures conda-cpp",
     ]
     with assert_compose_calls(compose, expected_calls):
+        compose.clear_pull_memory()
         compose.pull('conda-cpp')
 
     expected_calls = [
@@ -275,6 +276,7 @@ def test_compose_pull(arrow_compose_path):
         "pull --ignore-pull-failures conda-python-pandas"
     ]
     with assert_compose_calls(compose, expected_calls):
+        compose.clear_pull_memory()
         compose.pull('conda-python-pandas')
 
     expected_calls = [
@@ -282,6 +284,7 @@ def test_compose_pull(arrow_compose_path):
         "pull --ignore-pull-failures conda-python",
     ]
     with assert_compose_calls(compose, expected_calls):
+        compose.clear_pull_memory()
         compose.pull('conda-python-pandas', pull_leaf=False)
 
 
@@ -293,6 +296,7 @@ def test_compose_pull_params(arrow_compose_path):
     compose = DockerCompose(arrow_compose_path, params=dict(UBUNTU='18.04'))
     expected_env = PartialEnv(PYTHON='3.6', PANDAS='latest')
     with assert_compose_calls(compose, expected_calls, env=expected_env):
+        compose.clear_pull_memory()
         compose.pull('conda-python-pandas', pull_leaf=False)
 
 
@@ -419,57 +423,6 @@ def test_compose_run(arrow_compose_path):
         compose.run('conda-python', volumes=volumes)
 
 
-def test_compose_run_force_pull_and_build(arrow_compose_path):
-    compose = DockerCompose(arrow_compose_path)
-
-    expected_calls = [
-        "pull --ignore-pull-failures conda-cpp",
-        format_run("conda-cpp")
-    ]
-    with assert_compose_calls(compose, expected_calls):
-        compose.run('conda-cpp', force_pull=True)
-
-    expected_calls = [
-        "build conda-cpp",
-        format_run("conda-cpp")
-    ]
-    with assert_compose_calls(compose, expected_calls):
-        compose.run('conda-cpp', force_build=True)
-
-    expected_calls = [
-        "pull --ignore-pull-failures conda-cpp",
-        "build conda-cpp",
-        format_run("conda-cpp")
-    ]
-    with assert_compose_calls(compose, expected_calls):
-        compose.run('conda-cpp', force_pull=True, force_build=True)
-
-    expected_calls = [
-        "pull --ignore-pull-failures conda-cpp",
-        "pull --ignore-pull-failures conda-python",
-        "pull --ignore-pull-failures conda-python-pandas",
-        "build conda-cpp",
-        "build conda-python",
-        "build conda-python-pandas",
-        format_run("conda-python-pandas bash")
-    ]
-    with assert_compose_calls(compose, expected_calls):
-        compose.run('conda-python-pandas', command='bash', force_build=True,
-                    force_pull=True)
-
-    expected_calls = [
-        "pull --ignore-pull-failures conda-cpp",
-        "pull --ignore-pull-failures conda-python",
-        "build conda-cpp",
-        "build conda-python",
-        "build --no-cache conda-python-pandas",
-        format_run("conda-python-pandas bash")
-    ]
-    with assert_compose_calls(compose, expected_calls):
-        compose.run('conda-python-pandas', command='bash', force_build=True,
-                    force_pull=True, use_leaf_cache=False)
-
-
 def test_compose_push(arrow_compose_path):
     compose = DockerCompose(arrow_compose_path, params=dict(PYTHON='3.8'))
     expected_env = PartialEnv(PYTHON="3.8")
@@ -478,7 +431,7 @@ def test_compose_push(arrow_compose_path):
     ]
     for image in ["conda-cpp", "conda-python", "conda-python-pandas"]:
         expected_calls.append(
-            mock.call(["docker-compose", "--file", str(compose.config_path),
+            mock.call(["docker-compose", "--file", str(compose.config.path),
                        "push", image], check=True, env=expected_env)
         )
     with assert_subprocess_calls(expected_calls):
@@ -507,16 +460,16 @@ def test_image_with_gpu(arrow_compose_path):
 
     expected_calls = [
         [
-            "run", "--rm", "-it", "--gpus", "all",
+            "run", "--rm", "--gpus", "all",
             "-e", "CUDA_ENV=1",
             "-e", "OTHER_ENV=2",
             "-v", "/host:/container:rw",
-            "dummy-cuda",
+            "org/ubuntu-cuda",
             "/bin/bash", "-c", "echo 1 > /tmp/dummy && cat /tmp/dummy"
         ]
     ]
     with assert_docker_calls(compose, expected_calls):
-        compose.run('ubuntu-cuda', force_pull=False, force_build=False)
+        compose.run('ubuntu-cuda')
 
 
 def test_listing_images(arrow_compose_path):
diff --git a/dev/archery/archery/utils/command.py b/dev/archery/archery/utils/command.py
index 2e27f08..84d2842 100644
--- a/dev/archery/archery/utils/command.py
+++ b/dev/archery/archery/utils/command.py
@@ -56,6 +56,9 @@ class Command:
     property/attribute.
     """
 
+    def __init__(self, bin):
+        self.bin = bin
+
     def run(self, *argv, **kwargs):
         assert hasattr(self, "bin")
         invocation = shlex.split(self.bin)
diff --git a/dev/archery/archery/utils/source.py b/dev/archery/archery/utils/source.py
index 305f830..d30b4f1 100644
--- a/dev/archery/archery/utils/source.py
+++ b/dev/archery/archery/utils/source.py
@@ -182,7 +182,10 @@ class ArrowSources:
         cwd = Path.cwd()
 
         # Implicit via current file
-        this = Path(__file__).parents[4]
+        try:
+            this = Path(__file__).parents[4]
+        except IndexError:
+            this = None
 
         # Implicit via git repository (if archery is installed system wide)
         try:
diff --git a/dev/release/rat_exclude_files.txt b/dev/release/rat_exclude_files.txt
index 85ac2c2..9580e67 100644
--- a/dev/release/rat_exclude_files.txt
+++ b/dev/release/rat_exclude_files.txt
@@ -10,6 +10,7 @@
 .github/ISSUE_TEMPLATE/question.md
 ci/etc/rprofile
 ci/etc/*.patch
+ci/vcpkg/*.patch
 CHANGELOG.md
 cpp/CHANGELOG_PARQUET.md
 cpp/src/arrow/io/mman.h
diff --git a/dev/tasks/crossbow.py b/dev/tasks/crossbow.py
index a68794c..60c0c59 100755
--- a/dev/tasks/crossbow.py
+++ b/dev/tasks/crossbow.py
@@ -1436,12 +1436,14 @@ def check_config(config_path):
 @click.option('--arrow-sha', '-t', default=None,
               help='Set commit SHA or Tag name explicitly, e.g. f67a515, '
                    'apache-arrow-0.11.1.')
+@click.option('--fetch/--no-fetch', default=True,
+              help='Fetch references (branches and tags) from the remote')
 @click.option('--dry-run/--push', default=False,
               help='Just display the rendered CI configurations without '
                    'submitting them')
 @click.pass_obj
 def submit(obj, tasks, groups, params, job_prefix, config_path, arrow_version,
-           arrow_remote, arrow_branch, arrow_sha, dry_run):
+           arrow_remote, arrow_branch, arrow_sha, fetch, dry_run):
     output = obj['output']
     queue, arrow = obj['queue'], obj['arrow']
 
@@ -1469,7 +1471,8 @@ def submit(obj, tasks, groups, params, job_prefix, config_path, arrow_version,
     if dry_run:
         yaml.dump(job, output)
     else:
-        queue.fetch()
+        if fetch:
+            queue.fetch()
         queue.put(job, prefix=job_prefix)
         queue.push()
         yaml.dump(job, output)
diff --git a/dev/tasks/python-wheels/azure.linux.yml b/dev/tasks/python-wheels/azure.linux.yml
deleted file mode 100644
index 279fca9..0000000
--- a/dev/tasks/python-wheels/azure.linux.yml
+++ /dev/null
@@ -1,109 +0,0 @@
-# Licensed to the Apache Software Foundation (ASF) under one
-# or more contributor license agreements.  See the NOTICE file
-# distributed with this work for additional information
-# regarding copyright ownership.  The ASF licenses this file
-# to you under the Apache License, Version 2.0 (the
-# "License"); you may not use this file except in compliance
-# with the License.  You may obtain a copy of the License at
-#
-# http://www.apache.org/licenses/LICENSE-2.0
-#
-# Unless required by applicable law or agreed to in writing, software
-# distributed under the License is distributed on an "AS IS" BASIS,
-# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
-# See the License for the specific language governing permissions and
-# limitations under the License.
-
-jobs:
-- job: linux
-  pool:
-    vmImage: ubuntu-latest
-  timeoutInMinutes: 360
-  steps:
-    - task: CondaEnvironment@1
-      inputs:
-        packageSpecs: 'click github3.py jinja2 jira pygit2 ruamel.yaml setuptools_scm toolz'
-        installOptions: '-c conda-forge'
-        updateConda: false
-
-    - script: |
-        set -ex
-        git clone --no-checkout {{ arrow.remote }} arrow
-        git -C arrow fetch -t {{ arrow.remote }} {{ arrow.branch }}
-        git -C arrow checkout FETCH_HEAD
-        git -C arrow submodule update --init --recursive
-      displayName: Clone arrow
-
-    - script: pip install -e arrow/dev/archery[docker]
-      displayName: Setup Archery
-
-    - script: |
-        set -ex
-        cd arrow
-        BUILD_IMAGE=centos-python-{{ wheel_tag }}
-        archery docker run \
-          -e SETUPTOOLS_SCM_PRETEND_VERSION="{{ arrow.no_rc_version }}" \
-          -e PYTHON_VERSION="{{ python_version }}" \
-          -e UNICODE_WIDTH="{{ unicode_width }}" \
-          ${BUILD_IMAGE}
-        {% if arrow.branch == 'master' %}
-        archery docker push ${BUILD_IMAGE} || :
-        {% endif %}
-      env:
-        ARCHERY_DOCKER_USER: $(ARCHERY_DOCKER_USER)
-        ARCHERY_DOCKER_PASSWORD: $(ARCHERY_DOCKER_PASSWORD)
-        DOCKER_BUILDKIT: 0
-        COMPOSE_DOCKER_CLI_BUILD: 1
-      displayName: Build wheel
-
-    # auditwheel does always exit with 0 so it is mostly for debugging
-    # purposes
-    - script: |
-        set -ex
-        cd arrow
-        docker run \
-          -v $(pwd):/arrow quay.io/pypa/{{ wheel_tag }}_x86_64 \
-          /bin/bash -c \
-            "auditwheel show /arrow/python/{{ wheel_dir }}/dist/*.whl"
-      displayName: Audit wheel
-
-    - script: |
-        set -ex
-        cd arrow
-        test_args=""
-        {%- if test_remove_system_libs %}
-        test_args="${test_args} --remove-system-libs"
-        {%- endif %}
-        {%- for image in test_docker_images %}
-        docker run \
-          --shm-size 2G \
-          -v $(pwd):/arrow \
-          -e WHEEL_DIR="{{ wheel_dir }}" \
-          {{ image }} \
-          /arrow/dev/tasks/python-wheels/manylinux-test.sh ${test_args}
-        {%- endfor %}
-      displayName: Test wheel
-
-    - script: |
-        set -ex
-        python arrow/dev/tasks/crossbow.py \
-          --queue-path . \
-          --queue-remote {{ queue_remote_url }} \
-          upload-artifacts \
-          --pattern "arrow/python/{{ wheel_dir }}/dist/*" \
-          --sha {{ task.branch }} \
-          --tag {{ task.tag }}
-      env:
-        CROSSBOW_GITHUB_TOKEN: $(CROSSBOW_GITHUB_TOKEN)
-      displayName: Upload packages as a GitHub release
-
-    {% if arrow.branch == 'master' %}
-    # upload to gemfury
-    - script: |
-        path=$(ls arrow/python/{{ wheel_dir }}/dist/*.whl)
-        curl -F "package=@${path}" https://${CROSSBOW_GEMFURY_TOKEN}@push.fury.io/${CROSSBOW_GEMFURY_ORG}/
-      env:
-        CROSSBOW_GEMFURY_TOKEN: $(CROSSBOW_GEMFURY_TOKEN)
-        CROSSBOW_GEMFURY_ORG: $(CROSSBOW_GEMFURY_ORG)
-      displayName: Upload packages to Gemfury
-    {% endif %}
diff --git a/dev/tasks/python-wheels/github.win.yml b/dev/tasks/python-wheels/github.linux.yml
similarity index 54%
copy from dev/tasks/python-wheels/github.win.yml
copy to dev/tasks/python-wheels/github.linux.yml
index 4226432..d7f01c7 100644
--- a/dev/tasks/python-wheels/github.win.yml
+++ b/dev/tasks/python-wheels/github.linux.yml
@@ -26,14 +26,24 @@ on:
 
 jobs:
   build:
-    name: "Build wheel for Windows"
-    runs-on: windows-2016
+    name: "Build wheel for Manylinux {{ manylinux_version }}"
+    runs-on: ubuntu-latest
     env:
-      ARCH: "x64"
-      GENERATOR: Visual Studio 15 2017
-      PYTHON_VERSION: "{{ python_version }}"
-      PYARROW_VERSION: {{ arrow.no_rc_version }}
+      # archery uses this environment variable
+      PYTHON: {{ python_version }}
+      # this is a private repository at the moment (mostly because of licensing
+      # considerations for windows images with visual studio), but anyone can
+      # recreate the image by manually building it via:
+      # `archery build python-wheel-manylinux-2010|2014`
+      # note that we don't run docker build since there wouldn't be a cache hit
+      # and rebuilding the dependencies takes a fair amount of time
+      # (in the case of linux, multi-host caching works much better with buildkit)
+      REPO: ghcr.io/ursacomputing/arrow
+      # prefer the docker cli over docker-compose
+      ARCHERY_USE_DOCKER_CLI: 1
+
     steps:
+      ############################ Checkout ###################################
       - name: Checkout Arrow
         shell: bash
         run: |
@@ -45,27 +55,50 @@ jobs:
       - name: Fetch Submodules and Tags
         shell: bash
         run: cd arrow && ci/scripts/util_checkout.sh
-      - uses: actions/setup-python@v2
+
+      ############################ Docker Registry ############################
+      - name: Login to GitHub Container Registry
+        uses: docker/login-action@v1
         with:
-          python-version: "{{ python_version }}"
-      - name: Set up Miniconda
-        shell: bash
-        run: |
-          echo "c:\\Miniconda\\condabin" >> $GITHUB_PATH
+          registry: ghcr.io
+          username: {{ '${{ github.repository_owner }}' }}
+          password: {{ '${{ secrets.CROSSBOW_GHCR_TOKEN }}' }}
+
+      ############################ Archery Installation #######################
+      - name: Set up Python
+        uses: actions/setup-python@v2
+        with:
+          python-version: 3.8
+      - name: Install Archery and Crossbow dependencies
+        run: pip install -e arrow/dev/archery[all]
+
+      ############################ Build & Test ###############################
       - name: Build wheel
         shell: bash
-        run: |
-          arrow/dev/tasks/python-wheels/win-build.bat
-      - name: Prepare artifacts
-        # the artifacts must be uploaded from a directory relative to the build root
+        run: archery docker run --no-build -e SETUPTOOLS_SCM_PRETEND_VERSION={{ arrow.no_rc_version }} python-wheel-manylinux-{{ manylinux_version }}
+      # TODO(kszucs): auditwheel show
+      - name: Test wheel
+        shell: bash
+        run: archery docker run python-wheel-manylinux-test
+
+      ############################ Artifact Uploading #########################
+      - name: Upload artifacts
         shell: bash
         run: |
-          mv arrow/python/dist/ wheels/
+          python arrow/dev/tasks/crossbow.py \
+            --queue-path . \
+            --queue-remote {{ queue_remote_url }} \
+            upload-artifacts \
+            --pattern "arrow/python/repaired_wheels/*.whl" \
+            --sha {{ task.branch }} \
+            --tag {{ task.tag }}
+        env:
+          CROSSBOW_GITHUB_TOKEN: {{ '${{ secrets.CROSSBOW_GITHUB_TOKEN }}' }}
+
       {% if arrow.branch == 'master' %}
       - name: Upload to gemfury
         shell: bash
         run: |
-          conda.bat install -y curl
           WHEEL_PATH=$(echo wheels/*.whl)
           curl.exe \
             -F "package=@${WHEEL_PATH}" \
@@ -74,19 +107,3 @@ jobs:
           CROSSBOW_GEMFURY_ORG: {{ '${{ secrets.CROSSBOW_GEMFURY_ORG }}' }}
           CROSSBOW_GEMFURY_TOKEN: {{ '${{ secrets.CROSSBOW_GEMFURY_TOKEN }}' }}
       {% endif %}
-      - name: Set up Crossbow
-        shell: bash
-        run: |
-          pip install --requirement arrow/dev/tasks/requirements-crossbow.txt
-      - name: Upload artifacts
-        shell: bash
-        run: |
-          python arrow/dev/tasks/crossbow.py \
-            --queue-path . \
-            --queue-remote {{ queue_remote_url }} \
-            upload-artifacts \
-            --pattern "wheels/*.whl" \
-            --sha {{ task.branch }} \
-            --tag {{ task.tag }}
-        env:
-          CROSSBOW_GITHUB_TOKEN: {{ '${{ secrets.CROSSBOW_GITHUB_TOKEN }}' }}
diff --git a/dev/tasks/python-wheels/github.win.yml b/dev/tasks/python-wheels/github.windows.yml
similarity index 58%
rename from dev/tasks/python-wheels/github.win.yml
rename to dev/tasks/python-wheels/github.windows.yml
index 4226432..f09c357 100644
--- a/dev/tasks/python-wheels/github.win.yml
+++ b/dev/tasks/python-wheels/github.windows.yml
@@ -27,13 +27,22 @@ on:
 jobs:
   build:
     name: "Build wheel for Windows"
-    runs-on: windows-2016
+    runs-on: windows-2019
     env:
-      ARCH: "x64"
-      GENERATOR: Visual Studio 15 2017
-      PYTHON_VERSION: "{{ python_version }}"
-      PYARROW_VERSION: {{ arrow.no_rc_version }}
+      # archery uses this environment variable
+      PYTHON: {{ python_version }}
+      # this is a private repository at the moment (mostly because of licensing
+      # considerations for windows images with visual studio), but anyone can
+      # recreate the image by manually building it via:
+      # `archery build python-wheel-windows-vs2017`
+      # note that we don't run docker build since there wouldn't be a cache hit
+      # and rebuilding the dependencies takes a fair amount of time
+      REPO: ghcr.io/ursacomputing/arrow
+      # prefer the docker cli over docker-compose
+      ARCHERY_USE_DOCKER_CLI: 1
+
     steps:
+      ############################ Checkout ###################################
       - name: Checkout Arrow
         shell: bash
         run: |
@@ -45,27 +54,47 @@ jobs:
       - name: Fetch Submodules and Tags
         shell: bash
         run: cd arrow && ci/scripts/util_checkout.sh
-      - uses: actions/setup-python@v2
+
+      ############################ Docker Registry ############################
+      - name: Login to GitHub Container Registry
+        shell: bash
+        run: docker login ghcr.io -u {{ '${{ github.repository_owner }}' }} -p {{ '${{ secrets.CROSSBOW_GHCR_TOKEN }}' }}
+
+      ############################ Archery Installation #######################
+      - name: Set up Python
+        uses: actions/setup-python@v2
         with:
-          python-version: "{{ python_version }}"
-      - name: Set up Miniconda
+          python-version: 3.8
+      - name: Install Archery and Crossbow dependencies
         shell: bash
-        run: |
-          echo "c:\\Miniconda\\condabin" >> $GITHUB_PATH
+        run: pip install -e arrow/dev/archery[all]
+
+      ############################ Build & Test ###############################
       - name: Build wheel
+        shell: cmd
+        run: archery docker run --no-build -e SETUPTOOLS_SCM_PRETEND_VERSION={{ arrow.no_rc_version }} python-wheel-windows-vs2017
+      - name: Test wheel
+        shell: cmd
+        run: archery docker run python-wheel-windows-test
+
+      ############################ Artifact Uploading #########################
+      - name: Upload artifacts
         shell: bash
         run: |
-          arrow/dev/tasks/python-wheels/win-build.bat
-      - name: Prepare artifacts
-        # the artifacts must be uploaded from a directory relative to the build root
-        shell: bash
-        run: |
-          mv arrow/python/dist/ wheels/
+          python arrow/dev/tasks/crossbow.py \
+            --queue-path . \
+            --queue-remote {{ queue_remote_url }} \
+            upload-artifacts \
+            --pattern "arrow/python/dist/*.whl" \
+            --sha {{ task.branch }} \
+            --tag {{ task.tag }}
+        env:
+          CROSSBOW_GITHUB_TOKEN: {{ '${{ secrets.CROSSBOW_GITHUB_TOKEN }}' }}
+
       {% if arrow.branch == 'master' %}
       - name: Upload to gemfury
         shell: bash
         run: |
-          conda.bat install -y curl
           WHEEL_PATH=$(echo wheels/*.whl)
           curl.exe \
             -F "package=@${WHEEL_PATH}" \
@@ -74,19 +103,3 @@ jobs:
           CROSSBOW_GEMFURY_ORG: {{ '${{ secrets.CROSSBOW_GEMFURY_ORG }}' }}
           CROSSBOW_GEMFURY_TOKEN: {{ '${{ secrets.CROSSBOW_GEMFURY_TOKEN }}' }}
       {% endif %}
-      - name: Set up Crossbow
-        shell: bash
-        run: |
-          pip install --requirement arrow/dev/tasks/requirements-crossbow.txt
-      - name: Upload artifacts
-        shell: bash
-        run: |
-          python arrow/dev/tasks/crossbow.py \
-            --queue-path . \
-            --queue-remote {{ queue_remote_url }} \
-            upload-artifacts \
-            --pattern "wheels/*.whl" \
-            --sha {{ task.branch }} \
-            --tag {{ task.tag }}
-        env:
-          CROSSBOW_GITHUB_TOKEN: {{ '${{ secrets.CROSSBOW_GITHUB_TOKEN }}' }}
diff --git a/dev/tasks/python-wheels/osx-build.sh b/dev/tasks/python-wheels/osx-build.sh
index e6bd615..176b8b3 100755
--- a/dev/tasks/python-wheels/osx-build.sh
+++ b/dev/tasks/python-wheels/osx-build.sh
@@ -71,6 +71,7 @@ function build_wheel {
           -DARROW_GRPC_USE_SHARED=OFF \
           -DARROW_HDFS=ON \
           -DARROW_JEMALLOC=ON \
+          -DARROW_MIMALLOC=ON \
           -DARROW_OPENSSL_USE_SHARED=OFF \
           -DARROW_ORC=OFF \
           -DARROW_PARQUET=ON \
diff --git a/dev/tasks/python-wheels/win-build.bat b/dev/tasks/python-wheels/win-build.bat
deleted file mode 100644
index f2ca57d..0000000
--- a/dev/tasks/python-wheels/win-build.bat
+++ /dev/null
@@ -1,116 +0,0 @@
-@rem Licensed to the Apache Software Foundation (ASF) under one
-@rem or more contributor license agreements.  See the NOTICE file
-@rem distributed with this work for additional information
-@rem regarding copyright ownership.  The ASF licenses this file
-@rem to you under the Apache License, Version 2.0 (the
-@rem "License"); you may not use this file except in compliance
-@rem with the License.  You may obtain a copy of the License at
-@rem
-@rem   http://www.apache.org/licenses/LICENSE-2.0
-@rem
-@rem Unless required by applicable law or agreed to in writing,
-@rem software distributed under the License is distributed on an
-@rem "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
-@rem KIND, either express or implied.  See the License for the
-@rem specific language governing permissions and limitations
-@rem under the License.
-
-@echo on
-
-@rem Building Gandiva in the wheels is disabled for now to make the wheels
-@rem smaller.
-
-@rem --file=arrow\ci\conda_env_gandiva.yml ^
-
-@rem create conda environment for compiling
-@rem FIXME: Update to numpy 1.16.6
-call conda.bat create -n wheel-build -q -y -c conda-forge ^
-    cmake ninja pkg-config numpy=1.16.5 ^
-    "vs2015_runtime<14.16" ^
-    python=%PYTHON_VERSION% || exit /B
-
-call conda.bat activate wheel-build
-python -m pip install cython setuptools-scm
-
-set ARROW_HOME=%CONDA_PREFIX%\Library
-set PARQUET_HOME=%CONDA_PREFIX%\Library
-echo %ARROW_HOME%
-
-@rem Build Arrow C++ libraries
-mkdir arrow\cpp\build
-pushd arrow\cpp\build
-
-@rem ARROW-6938(wesm): bz2 is disabled on Windows because the build
-@rem currently selects the shared lib for linking. Using the zstd lib from
-@rem conda-forge also results in a broken build so we use the BUNDLED
-@rem dependency resolution strategy for now
-
-cmake -A "%ARCH%" ^
-      -G "%GENERATOR%" ^
-      -DARROW_BUILD_STATIC=OFF ^
-      -DARROW_BUILD_TESTS=OFF ^
-      -DARROW_CXXFLAGS="/MP" ^
-      -DARROW_DATASET=ON ^
-      -DARROW_DEPENDENCY_SOURCE=BUNDLED ^
-      -DARROW_DEPENDENCY_USE_SHARED=OFF ^
-      -DARROW_FLIGHT=ON ^
-      -DARROW_GANDIVA=OFF ^
-      -DARROW_MIMALLOC=ON ^
-      -DARROW_PARQUET=ON ^
-      -DARROW_PYTHON=ON ^
-      -DARROW_VERBOSE_THIRDPARTY_BUILD=ON ^
-      -DARROW_WITH_BROTLI=ON ^
-      -DARROW_WITH_BZ2=OFF ^
-      -DARROW_WITH_LZ4=ON ^
-      -DARROW_WITH_SNAPPY=ON ^
-      -DARROW_WITH_ZLIB=ON ^
-      -DARROW_WITH_ZSTD=ON ^
-      -DCMAKE_BUILD_TYPE=Release ^
-      -DCMAKE_INSTALL_PREFIX=%ARROW_HOME% ^
-      -DCMAKE_VERBOSE_MAKEFILE=ON ^
-      -DMSVC_LINK_VERBOSE=ON ^
-      .. || exit /B
-cmake ^
-  --build . ^
-  --config Release ^
-  --parallel %NUMBER_OF_PROCESSORS% ^
-  --target install || exit /B
-popd
-
-set PYARROW_BUILD_TYPE=Release
-set PYARROW_BUNDLE_ARROW_CPP=1
-set PYARROW_CMAKE_GENERATOR=%GENERATOR%
-set PYARROW_CMAKE_OPTIONS=-A %ARCH%
-set PYARROW_INSTALL_TESTS=1
-set PYARROW_PARALLEL=%NUMBER_OF_PROCESSORS%
-set PYARROW_WITH_DATASET=1
-set PYARROW_WITH_FLIGHT=1
-set PYARROW_WITH_GANDIVA=0
-set PYARROW_WITH_PARQUET=1
-set PYARROW_WITH_STATIC_BOOST=1
-set SETUPTOOLS_SCM_PRETEND_VERSION=%PYARROW_VERSION%
-
-pushd arrow\python
-python setup.py bdist_wheel || exit /B
-popd
-
-call conda.bat deactivate
-
-set ARROW_TEST_DATA=arrow\testing\data
-
-@rem install the test dependencies
-python -m pip install -r arrow\python\requirements-wheel-test.txt || exit /B
-
-@rem install the produced wheel in a non-conda environment
-python -m pip install --no-index --find-links=arrow\python\dist\ pyarrow || exit /B
-
-@rem test the imports
-python -c "import pyarrow" || exit /B
-python -c "import pyarrow.parquet" || exit /B
-python -c "import pyarrow.flight" || exit /B
-python -c "import pyarrow.dataset" || exit /B
-
-@rem run the python tests, but disable the cython because there is a linking
-@rem issue on python 3.8
-set PYARROW_TEST_CYTHON=OFF
-python -m pytest -rs --pyargs pyarrow || exit /B
diff --git a/dev/tasks/tasks.yml b/dev/tasks/tasks.yml
index f19e0a4..10b79db 100644
--- a/dev/tasks/tasks.yml
+++ b/dev/tasks/tasks.yml
@@ -342,171 +342,75 @@ tasks:
 
   ############################## Wheel Linux ##################################
 
-  wheel-manylinux1-cp36m:
-    ci: azure
-    template: python-wheels/azure.linux.yml
-    params:
-      python_version: 3.6
-      unicode_width: 16
-      wheel_tag: manylinux1
-      wheel_dir: manylinux1
-      test_docker_images:
-        - python:3.6
-      test_remove_system_libs: true
-    artifacts:
-      - pyarrow-{no_rc_version}-cp36-cp36m-manylinux1_x86_64.whl
-
-  wheel-manylinux1-cp37m:
-    ci: azure
-    template: python-wheels/azure.linux.yml
-    params:
-      python_version: 3.7
-      unicode_width: 16
-      wheel_tag: manylinux1
-      wheel_dir: manylinux1
-      test_docker_images:
-        - python:3.7
-      test_remove_system_libs: true
-    artifacts:
-      - pyarrow-{no_rc_version}-cp37-cp37m-manylinux1_x86_64.whl
-
-  wheel-manylinux1-cp38:
-    ci: azure
-    template: python-wheels/azure.linux.yml
-    params:
-      python_version: 3.8
-      unicode_width: 16
-      wheel_tag: manylinux1
-      wheel_dir: manylinux1
-      test_docker_images:
-        - python:3.8
-      test_remove_system_libs: true
-    artifacts:
-      - pyarrow-{no_rc_version}-cp38-cp38-manylinux1_x86_64.whl
-
-  wheel-manylinux1-cp39:
-    ci: azure
-    template: python-wheels/azure.linux.yml
-    params:
-      python_version: 3.9
-      unicode_width: 16
-      wheel_tag: manylinux1
-      wheel_dir: manylinux1
-      test_docker_images:
-        - python:3.9
-      test_remove_system_libs: true
-    artifacts:
-      - pyarrow-{no_rc_version}-cp39-cp39-manylinux1_x86_64.whl
-
   wheel-manylinux2010-cp36m:
-    ci: azure
-    template: python-wheels/azure.linux.yml
+    ci: github
+    template: python-wheels/github.linux.yml
     params:
       python_version: 3.6
-      unicode_width: 16
-      wheel_tag: manylinux2010
-      wheel_dir: manylinux201x
-      test_docker_images:
-        - python:3.6
-      test_remove_system_libs: true
+      manylinux_version: 2010
     artifacts:
       - pyarrow-{no_rc_version}-cp36-cp36m-manylinux2010_x86_64.whl
 
   wheel-manylinux2010-cp37m:
-    ci: azure
-    template: python-wheels/azure.linux.yml
+    ci: github
+    template: python-wheels/github.linux.yml
     params:
       python_version: 3.7
-      unicode_width: 16
-      wheel_tag: manylinux2010
-      wheel_dir: manylinux201x
-      test_docker_images:
-        - python:3.7
-      test_remove_system_libs: true
+      manylinux_version: 2010
     artifacts:
       - pyarrow-{no_rc_version}-cp37-cp37m-manylinux2010_x86_64.whl
 
   wheel-manylinux2010-cp38:
-    ci: azure
-    template: python-wheels/azure.linux.yml
+    ci: github
+    template: python-wheels/github.linux.yml
     params:
       python_version: 3.8
-      unicode_width: 16
-      wheel_tag: manylinux2010
-      wheel_dir: manylinux201x
-      test_docker_images:
-        - python:3.8
-      test_remove_system_libs: true
+      manylinux_version: 2010
     artifacts:
       - pyarrow-{no_rc_version}-cp38-cp38-manylinux2010_x86_64.whl
 
   wheel-manylinux2010-cp39:
-    ci: azure
-    template: python-wheels/azure.linux.yml
+    ci: github
+    template: python-wheels/github.linux.yml
     params:
       python_version: 3.9
-      unicode_width: 16
-      wheel_tag: manylinux2010
-      wheel_dir: manylinux201x
-      test_docker_images:
-        - python:3.9
-      test_remove_system_libs: true
+      manylinux_version: 2010
     artifacts:
       - pyarrow-{no_rc_version}-cp39-cp39-manylinux2010_x86_64.whl
 
   wheel-manylinux2014-cp36m:
-    ci: azure
-    template: python-wheels/azure.linux.yml
+    ci: github
+    template: python-wheels/github.linux.yml
     params:
       python_version: 3.6
-      unicode_width: 16
-      wheel_tag: manylinux2014
-      wheel_dir: manylinux201x
-      test_docker_images:
-        - python:3.6
-      test_remove_system_libs: true
+      manylinux_version: 2014
     artifacts:
       - pyarrow-{no_rc_version}-cp36-cp36m-manylinux2014_x86_64.whl
 
   wheel-manylinux2014-cp37m:
-    ci: azure
-    template: python-wheels/azure.linux.yml
+    ci: github
+    template: python-wheels/github.linux.yml
     params:
       python_version: 3.7
-      unicode_width: 16
-      wheel_tag: manylinux2014
-      wheel_dir: manylinux201x
-      test_docker_images:
-        - python:3.7
-      test_remove_system_libs: true
+      manylinux_version: 2014
     artifacts:
       - pyarrow-{no_rc_version}-cp37-cp37m-manylinux2014_x86_64.whl
 
   wheel-manylinux2014-cp38:
-    ci: azure
-    template: python-wheels/azure.linux.yml
+    ci: github
+    template: python-wheels/github.linux.yml
     params:
       python_version: 3.8
-      unicode_width: 16
-      wheel_tag: manylinux2014
-      wheel_dir: manylinux201x
-      test_docker_images:
-        - python:3.8
-      test_remove_system_libs: true
+      manylinux_version: 2014
     artifacts:
       - pyarrow-{no_rc_version}-cp38-cp38-manylinux2014_x86_64.whl
 
   wheel-manylinux2014-cp39:
-    ci: azure
-    template: python-wheels/azure.linux.yml
+    ci: github
+    template: python-wheels/github.linux.yml
     params:
       python_version: 3.9
-      unicode_width: 16
-      wheel_tag: manylinux2014
-      wheel_dir: manylinux201x
-      test_docker_images:
-        - python:3.9
-      test_remove_system_libs: true
+      manylinux_version: 2014
     artifacts:
       - pyarrow-{no_rc_version}-cp39-cp39-manylinux2014_x86_64.whl
 
@@ -596,30 +500,38 @@ tasks:
 
   ############################## Wheel Windows ################################
 
-  wheel-win-cp36m:
+  wheel-windows-cp36m:
     ci: github
-    template: python-wheels/github.win.yml
+    template: python-wheels/github.windows.yml
     params:
       python_version: 3.6
     artifacts:
       - pyarrow-{no_rc_version}-cp36-cp36m-win_amd64.whl
 
-  wheel-win-cp37m:
+  wheel-windows-cp37m:
     ci: github
-    template: python-wheels/github.win.yml
+    template: python-wheels/github.windows.yml
     params:
       python_version: 3.7
     artifacts:
       - pyarrow-{no_rc_version}-cp37-cp37m-win_amd64.whl
 
-  wheel-win-cp38:
+  wheel-windows-cp38:
     ci: github
-    template: python-wheels/github.win.yml
+    template: python-wheels/github.windows.yml
     params:
       python_version: 3.8
     artifacts:
       - pyarrow-{no_rc_version}-cp38-cp38-win_amd64.whl
 
+  wheel-windows-cp39:
+    ci: github
+    template: python-wheels/github.windows.yml
+    params:
+      python_version: 3.9
+    artifacts:
+      - pyarrow-{no_rc_version}-cp39-cp39-win_amd64.whl
+
   ############################ Python sdist ####################################
 
   python-sdist:
@@ -1770,7 +1682,7 @@ tasks:
     template: r/github.linux.cran.yml
     params:
       MATRIX: "${{ matrix.r_image }}"
-      
+
   test-r-version-compatibility:
     ci: github
     template: r/github.linux.version.compatibility.yml
diff --git a/docker-compose.yml b/docker-compose.yml
index a7c4915..b66619d 100644
--- a/docker-compose.yml
+++ b/docker-compose.yml
@@ -79,9 +79,6 @@ x-hierarchy:
   # Each node must be either a string scalar or a list containing the
   # descendant images, if any. Archery checks that every node has a
   # corresponding service entry, so any new image/service must be listed here.
-  - centos-python-manylinux1
-  - centos-python-manylinux2010
-  - centos-python-manylinux2014
   - conda-cpp:
     - conda-cpp-hiveserver2
     - conda-cpp-valgrind
@@ -122,6 +119,31 @@ x-hierarchy:
   # helper services
   - impala
   - postgres
+  - python-wheel-manylinux-2010
+  - python-wheel-manylinux-2014
+  - python-wheel-manylinux-test
+  - python-wheel-windows-vs2017
+  - python-wheel-windows-test
+
+volumes:
+  conda-ccache:
+    name: ${ARCH}-conda-ccache
+  debian-ccache:
+    name: ${ARCH}-debian-${DEBIAN}-ccache
+  ubuntu-ccache:
+    name: ${ARCH}-ubuntu-${UBUNTU}-ccache
+  fedora-ccache:
+    name: ${ARCH}-fedora-${FEDORA}-ccache
+  debian-rust:
+    name: ${ARCH}-debian-${DEBIAN}-rust
+  maven-cache:
+    name: maven-cache
+  python-wheel-manylinux2010-ccache:
+    name: python-wheel-manylinux2010-ccache
+  python-wheel-manylinux2014-ccache:
+    name: python-wheel-manylinux2014-ccache
+  python-wheel-windows-clcache:
+    name: python-wheel-windows-clcache
 
 services:
 
@@ -169,7 +191,7 @@ services:
       ARROW_USE_PRECOMPILED_HEADERS: "ON"
     volumes: &conda-volumes
       - .:/arrow:delegated
-      - ${DOCKER_VOLUME_DIRECTORY:-.docker}/${ARCH}-conda-ccache:/ccache:delegated
+      - ${DOCKER_VOLUME_PREFIX}conda-ccache:/ccache:delegated
     command: &cpp-conda-command
       ["/arrow/ci/scripts/cpp_build.sh /arrow /build true &&
         /arrow/ci/scripts/cpp_test.sh /arrow /build"]
@@ -229,7 +251,7 @@ services:
       ARROW_MIMALLOC: "ON"
     volumes: &debian-volumes
       - .:/arrow:delegated
-      - ${DOCKER_VOLUME_DIRECTORY:-.docker}/${ARCH}-debian-${DEBIAN}-ccache:/ccache:delegated
+      - ${DOCKER_VOLUME_PREFIX}debian-ccache:/ccache:delegated
     command: &cpp-command >
       /bin/bash -c "
         /arrow/ci/scripts/cpp_build.sh /arrow /build &&
@@ -261,7 +283,7 @@ services:
       ARROW_MIMALLOC: "ON"
     volumes: &ubuntu-volumes
       - .:/arrow:delegated
-      - ${DOCKER_VOLUME_DIRECTORY:-.docker}/${ARCH}-ubuntu-${UBUNTU}-ccache:/ccache:delegated
+      - ${DOCKER_VOLUME_PREFIX}ubuntu-ccache:/ccache:delegated
     command: *cpp-command
 
   ubuntu-cuda-cpp:
@@ -352,7 +374,7 @@ services:
       ARROW_MIMALLOC: "ON"
     volumes: &fedora-volumes
       - .:/arrow:delegated
-      - ${DOCKER_VOLUME_DIRECTORY:-.docker}/${ARCH}-fedora-${FEDORA}-ccache:/ccache:delegated
+      - ${DOCKER_VOLUME_PREFIX}fedora-ccache:/ccache:delegated
     command: *cpp-command
 
   ############################### C GLib ######################################
@@ -637,6 +659,85 @@ services:
         /arrow/ci/scripts/cpp_build.sh /arrow /build &&
         /arrow/ci/scripts/python_sdist_test.sh /arrow"
 
+  ############################ Python wheels ##################################
+
+  # See available versions at:
+  #    https://quay.io/repository/pypa/manylinux2010_x86_64?tab=tags
+  python-wheel-manylinux-2010:
+    image: ${REPO}:python-${PYTHON}-wheel-manylinux-2010
+    build:
+      args:
+        base: quay.io/pypa/manylinux2010_x86_64:2020-12-03-912b0de
+        vcpkg: ${VCPKG}
+        python: ${PYTHON}
+      context: .
+      dockerfile: ci/docker/python-wheel-manylinux-201x.dockerfile
+      cache_from:
+        - ${REPO}:python-${PYTHON}-wheel-manylinux-2010
+    environment:
+      <<: *ccache
+      MANYLINUX_VERSION: 2010
+    volumes:
+      - .:/arrow:delegated
+      - ${DOCKER_VOLUME_PREFIX}python-wheel-manylinux2010-ccache:/ccache:delegated
+    command: /arrow/ci/scripts/python_wheel_manylinux_build.sh
+
+  # See available versions at:
+  #    https://quay.io/repository/pypa/manylinux2014_x86_64?tab=tags
+  python-wheel-manylinux-2014:
+    image: ${REPO}:python-${PYTHON}-wheel-manylinux-2014
+    build:
+      args:
+        base:  quay.io/pypa/manylinux2014_x86_64:2020-11-11-bc8ce45
+        vcpkg: ${VCPKG}
+        python: ${PYTHON}
+      context: .
+      dockerfile: ci/docker/python-wheel-manylinux-201x.dockerfile
+      cache_from:
+        - ${REPO}:python-${PYTHON}-wheel-manylinux-2014
+    environment:
+      <<: *ccache
+      MANYLINUX_VERSION: 2014
+    volumes:
+      - .:/arrow:delegated
+      - ${DOCKER_VOLUME_PREFIX}python-wheel-manylinux2014-ccache:/ccache:delegated
+    command: /arrow/ci/scripts/python_wheel_manylinux_build.sh
+
+  python-wheel-manylinux-test:
+    image: python:${PYTHON}
+    shm_size: 2G
+    volumes:
+      - .:/arrow:delegated
+    command: /arrow/ci/scripts/python_wheel_manylinux_test.sh
+
+  python-wheel-windows-vs2017:
+    image: ${REPO}:python-${PYTHON}-wheel-windows-vs2017
+    build:
+      args:
+        vcpkg: ${VCPKG}
+        python: ${PYTHON}
+      context: .
+      dockerfile: ci/docker/python-wheel-windows-vs2017.dockerfile
+      # This should make the pushed images reusable, but the image gets rebuilt.
+      # Uncomment if no local cache is available.
+      # cache_from:
+      #   - mcr.microsoft.com/windows/servercore:ltsc2019
+      #   - ${REPO}:wheel-windows-vs2017
+    volumes:
+      - "${DOCKER_VOLUME_PREFIX}python-wheel-windows-clcache:C:/clcache"
+      - type: bind
+        source: .
+        target: "C:/arrow"
+    command: arrow\\ci\\scripts\\python_wheel_windows_build.bat
+
+  python-wheel-windows-test:
+    image: python:${PYTHON}-windowsservercore-1809
+    volumes:
+      - type: bind
+        source: .
+        target: "C:/arrow"
+    command: arrow\\ci\\scripts\\python_wheel_windows_test.bat
+
   ##############################  Integration #################################
 
   conda-python-pandas:
@@ -785,68 +886,6 @@ services:
         /arrow/ci/scripts/python_build.sh /arrow /build &&
         /arrow/ci/scripts/integration_kartothek.sh /kartothek /build"]
 
-  ########################## Python Wheels ####################################
-
-  centos-python-manylinux1:
-    image: ${REPO}:amd64-centos-5.11-python-manylinux1
-    build:
-      context: python/manylinux1
-      dockerfile: Dockerfile-x86_64_base
-      cache_from:
-        - ${REPO}:amd64-centos-5.11-python-manylinux1
-      args:
-        llvm: ${LLVM}
-    shm_size: *shm-size
-    environment:
-      <<: *ccache
-      PYTHON_VERSION: ${PYTHON_VERSION:-3.6}
-      UNICODE_WIDTH: ${UNICODE_WIDTH:-16}
-    volumes:
-      - .:/arrow:delegated
-      - ./python/manylinux1:/io:delegated
-      - ${DOCKER_VOLUME_DIRECTORY:-.docker}/centos-python-manylinux1-ccache:/ccache:delegated
-    command: &manylinux-command /io/build_arrow.sh
-
-  centos-python-manylinux2010:
-    image: ${REPO}:amd64-centos-6.10-python-manylinux2010
-    build:
-      context: python/manylinux201x
-      dockerfile: Dockerfile-x86_64_base_2010
-      cache_from:
-        - ${REPO}:amd64-centos-6.10-python-manylinux2010
-      args:
-        llvm: ${LLVM}
-    shm_size: *shm-size
-    environment:
-      <<: *ccache
-      PYTHON_VERSION: ${PYTHON_VERSION:-3.6}
-      UNICODE_WIDTH: ${UNICODE_WIDTH:-16}
-    volumes:
-      - .:/arrow:delegated
-      - ./python/manylinux201x:/io:delegated
-      - ${DOCKER_VOLUME_DIRECTORY:-.docker}/centos-python-manylinux2010-ccache:/ccache:delegated
-    command: *manylinux-command
-
-  centos-python-manylinux2014:
-    image: ${REPO}:amd64-centos-7.7-python-manylinux2014
-    build:
-      context: python/manylinux201x
-      dockerfile: Dockerfile-x86_64_base_2014
-      cache_from:
-        - ${REPO}:amd64-centos-7.7-python-manylinux2014
-      args:
-        llvm: ${LLVM}
-    shm_size: *shm-size
-    environment:
-      <<: *ccache
-      PYTHON_VERSION: ${PYTHON_VERSION:-3.6}
-      UNICODE_WIDTH: ${UNICODE_WIDTH:-16}
-    volumes:
-      - .:/arrow:delegated
-      - ./python/manylinux201x:/io:delegated
-      - ${DOCKER_VOLUME_DIRECTORY:-.docker}/centos-python-manylinux2014-ccache:/ccache:delegated
-    command: *manylinux-command
-
   ################################## R ########################################
 
   ubuntu-r:
@@ -1017,7 +1056,7 @@ services:
     shm_size: *shm-size
     volumes: &java-volumes
       - .:/arrow:delegated
-      - ${DOCKER_VOLUME_DIRECTORY:-.docker}/maven-cache:/root/.m2:delegated
+      - ${DOCKER_VOLUME_PREFIX}maven-cache:/root/.m2:delegated
     command: &java-command >
       /bin/bash -c "
         /arrow/ci/scripts/java_build.sh /arrow /build &&
@@ -1043,8 +1082,8 @@ services:
       <<: *ccache
     volumes:
       - .:/arrow:delegated
-      - ${DOCKER_VOLUME_DIRECTORY:-.docker}/maven-cache:/root/.m2:delegated
-      - ${DOCKER_VOLUME_DIRECTORY:-.docker}/${ARCH}-debian-9-ccache:/ccache:delegated
+      - ${DOCKER_VOLUME_PREFIX}maven-cache:/root/.m2:delegated
+      - ${DOCKER_VOLUME_PREFIX}debian-ccache:/ccache:delegated
     command:
       /bin/bash -c "
         /arrow/ci/scripts/cpp_build.sh /arrow /build &&
@@ -1232,8 +1271,8 @@ services:
     shm_size: *shm-size
     volumes: &conda-maven-volumes
       - .:/arrow:delegated
-      - ${DOCKER_VOLUME_DIRECTORY:-.docker}/maven-cache:/root/.m2:delegated
-      - ${DOCKER_VOLUME_DIRECTORY:-.docker}/${ARCH}-conda-ccache:/ccache:delegated
+      - ${DOCKER_VOLUME_PREFIX}maven-cache:/root/.m2:delegated
+      - ${DOCKER_VOLUME_PREFIX}conda-ccache:/ccache:delegated
     command:
       ["/arrow/ci/scripts/cpp_build.sh /arrow /build &&
         /arrow/ci/scripts/python_build.sh /arrow /build &&
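
For orientation, here is a minimal sketch of driving the new wheel services above
directly with docker-compose. The REPO and VCPKG values are placeholders
(assumptions, not taken from this commit); with DOCKER_VOLUME_PREFIX left unset,
the ccache mounts resolve to the named volumes declared in the new "volumes:"
section.

```bash
# Illustrative sketch only -- REPO/VCPKG values are hypothetical placeholders.
export REPO=localhost/arrow-ci     # repository part of the service image tags
export VCPKG=master                # vcpkg revision to pin; pick a known-good one
export PYTHON=3.8

# Build the manylinux 2014 image, then produce a wheel via the service's command
# (ci/scripts/python_wheel_manylinux_build.sh).
docker-compose build python-wheel-manylinux-2014
docker-compose run --rm python-wheel-manylinux-2014

# Smoke-test the built wheel in a plain python:${PYTHON} container.
docker-compose run --rm python-wheel-manylinux-test
```
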
diff --git a/python/CMakeLists.txt b/python/CMakeLists.txt
index 73faaa7..16e9e84 100644
--- a/python/CMakeLists.txt
+++ b/python/CMakeLists.txt
@@ -367,10 +367,9 @@ if(PYARROW_BUNDLE_ARROW_CPP)
   endif()
 
   if(MSVC)
-    find_package(ZLIB)
+    # TODO(kszucs): locate msvcp140.dll in a portable fashion and bundle it
     bundle_arrow_import_lib(ARROW_IMPORT_LIB)
     bundle_arrow_import_lib(ARROW_PYTHON_IMPORT_LIB)
-    bundle_arrow_dependency(zlib)
   endif()
 endif()
 
@@ -499,8 +498,8 @@ if(PYARROW_BUILD_FLIGHT)
       # XXX Hardcoded library names because CMake is too stupid to give us
       # the shared library paths.
       # https://gitlab.kitware.com/cmake/cmake/issues/16210
-      bundle_arrow_dependency(libcrypto-1_1-x64)
-      bundle_arrow_dependency(libssl-1_1-x64)
+      # bundle_arrow_dependency(libcrypto-1_1-x64)
+      # bundle_arrow_dependency(libssl-1_1-x64)
     endif()
   endif()
 
diff --git a/python/manylinux1/.dockerignore b/python/manylinux1/.dockerignore
deleted file mode 100644
index 1521c8b..0000000
--- a/python/manylinux1/.dockerignore
+++ /dev/null
@@ -1 +0,0 @@
-dist
diff --git a/python/manylinux1/Dockerfile-x86_64_base b/python/manylinux1/Dockerfile-x86_64_base
deleted file mode 100644
index d92782f..0000000
--- a/python/manylinux1/Dockerfile-x86_64_base
+++ /dev/null
@@ -1,106 +0,0 @@
-# Licensed to the Apache Software Foundation (ASF) under one
-# or more contributor license agreements.  See the NOTICE file
-# distributed with this work for additional information
-# regarding copyright ownership.  The ASF licenses this file
-# to you under the Apache License, Version 2.0 (the
-# "License"); you may not use this file except in compliance
-# with the License.  You may obtain a copy of the License at
-#
-#   http://www.apache.org/licenses/LICENSE-2.0
-#
-# Unless required by applicable law or agreed to in writing,
-# software distributed under the License is distributed on an
-# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
-# KIND, either express or implied.  See the License for the
-# specific language governing permissions and limitations
-# under the License.
-
-# See https://quay.io/repository/pypa/manylinux1_x86_64?tab=tags
-# to update base image.
-FROM quay.io/pypa/manylinux1_x86_64:2020-12-11-df302d3
-
-# Install dependencies
-RUN yum install -y xz ccache flex && yum clean all
-
-ADD scripts/build_zlib.sh /
-RUN /build_zlib.sh
-
-ADD scripts/build_openssl.sh /
-RUN /build_openssl.sh
-
-# Install CMake and Ninja manylinux1 packages
-ADD scripts/install_cmake.sh /
-RUN /install_cmake.sh
-
-ADD scripts/build_gflags.sh /
-RUN /build_gflags.sh
-
-ADD scripts/build_protobuf.sh /
-RUN /build_protobuf.sh
-ENV PROTOBUF_HOME /usr/local
-
-ADD scripts/build_cares.sh /
-RUN /build_cares.sh
-
-ADD scripts/build_absl.sh /
-RUN /build_absl.sh
-
-ADD scripts/build_grpc.sh /
-RUN /build_grpc.sh
-
-ADD scripts/build_boost.sh /
-RUN /build_boost.sh
-
-ADD scripts/build_bison.sh /
-RUN /build_bison.sh
-
-ADD scripts/build_thrift.sh /
-RUN /build_thrift.sh
-ENV THRIFT_HOME /usr/local
-
-ADD scripts/build_curl.sh /
-RUN /build_curl.sh
-ENV CURL_HOME /opt/curl
-
-ADD scripts/build_aws_sdk.sh /
-RUN /build_aws_sdk.sh
-
-ADD scripts/build_brotli.sh /
-RUN /build_brotli.sh
-ENV BROTLI_HOME /usr/local
-
-ADD scripts/build_snappy.sh /
-RUN /build_snappy.sh
-ENV SNAPPY_HOME /usr/local
-
-ADD scripts/build_lz4.sh /
-RUN /build_lz4.sh
-ENV LZ4_HOME /usr/local
-
-ADD scripts/build_zstd.sh /
-RUN /build_zstd.sh
-ENV ZSTD_HOME /usr/local
-
-ADD scripts/build_ccache.sh /
-RUN /build_ccache.sh
-
-ADD scripts/build_glog.sh /
-RUN /build_glog.sh
-ENV GLOG_HOME /usr/local
-
-WORKDIR /
-RUN git clone https://github.com/matthew-brett/multibuild.git && cd multibuild && git checkout 8882150df6529658700b66bec124dfb77eefca26
-# Remove unneeded Python versions
-RUN rm -rf /opt/_internal/cpython-2.7.*
-
-ADD scripts/build_rapidjson.sh /
-RUN /build_rapidjson.sh
-
-ADD scripts/build_re2.sh /
-RUN /build_re2.sh
-
-ADD scripts/build_bz2.sh /
-RUN /build_bz2.sh
-
-ADD scripts/build_utf8proc.sh /
-RUN /build_utf8proc.sh
diff --git a/python/manylinux1/Dockerfile-x86_64_ubuntu b/python/manylinux1/Dockerfile-x86_64_ubuntu
deleted file mode 100644
index 290fbc1..0000000
--- a/python/manylinux1/Dockerfile-x86_64_ubuntu
+++ /dev/null
@@ -1,89 +0,0 @@
-# Licensed to the Apache Software Foundation (ASF) under one
-# or more contributor license agreements.  See the NOTICE file
-# distributed with this work for additional information
-# regarding copyright ownership.  The ASF licenses this file
-# to you under the Apache License, Version 2.0 (the
-# "License"); you may not use this file except in compliance
-# with the License.  You may obtain a copy of the License at
-#
-#   http://www.apache.org/licenses/LICENSE-2.0
-#
-# Unless required by applicable law or agreed to in writing,
-# software distributed under the License is distributed on an
-# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
-# KIND, either express or implied.  See the License for the
-# specific language governing permissions and limitations
-# under the License.
-FROM ubuntu:14.04
-
-# Install dependencies
-RUN apt update
-RUN apt install -y ccache flex wget curl build-essential git libffi-dev autoconf pkg-config
-
-ADD scripts/build_zlib.sh /
-RUN /build_zlib.sh
-
-# Install python
-ADD scripts/requirements.txt /
-ADD scripts/build_python.sh /
-ADD scripts/python-tag-abi-tag.py /
-RUN /build_python.sh
-
-# Install cmake manylinux1 package
-ADD scripts/install_cmake.sh /
-RUN /install_cmake.sh
-
-WORKDIR /
-RUN git clone https://github.com/matthew-brett/multibuild.git && cd multibuild && git checkout 8882150df6529658700b66bec124dfb77eefca26
-
-ADD scripts/build_openssl.sh /
-RUN /build_openssl.sh
-
-ADD scripts/build_boost.sh /
-RUN /build_boost.sh
-
-ADD scripts/build_gtest.sh /
-RUN /build_gtest.sh
-ENV GTEST_HOME /usr
-
-ADD scripts/build_flatbuffers.sh /
-RUN /build_flatbuffers.sh
-ENV FLATBUFFERS_HOME /usr
-
-ADD scripts/build_bison.sh /
-RUN /build_bison.sh
-
-ADD scripts/build_thrift.sh /
-RUN /build_thrift.sh
-ENV THRIFT_HOME /usr
-
-ADD scripts/build_brotli.sh /
-RUN /build_brotli.sh
-ENV BROTLI_HOME /usr
-
-ADD scripts/build_snappy.sh /
-RUN /build_snappy.sh
-ENV SNAPPY_HOME /usr
-
-ADD scripts/build_lz4.sh /
-RUN /build_lz4.sh
-ENV LZ4_HOME /usr
-
-ADD scripts/build_zstd.sh /
-RUN /build_zstd.sh
-ENV ZSTD_HOME /usr
-
-ADD scripts/build_ccache.sh /
-RUN /build_ccache.sh
-
-ADD scripts/build_protobuf.sh /
-RUN /build_protobuf.sh
-ENV PROTOBUF_HOME /usr
-
-ADD scripts/build_glog.sh /
-RUN /build_glog.sh
-ENV GLOG_HOME /usr
-
-ARG llvm
-ADD scripts/build_llvm.sh /
-RUN /build_llvm.sh ${llvm}
diff --git a/python/manylinux1/README.md b/python/manylinux1/README.md
deleted file mode 100644
index 50dce7a..0000000
--- a/python/manylinux1/README.md
+++ /dev/null
@@ -1,130 +0,0 @@
-<!---
-  Licensed to the Apache Software Foundation (ASF) under one
-  or more contributor license agreements.  See the NOTICE file
-  distributed with this work for additional information
-  regarding copyright ownership.  The ASF licenses this file
-  to you under the Apache License, Version 2.0 (the
-  "License"); you may not use this file except in compliance
-  with the License.  You may obtain a copy of the License at
-
-    http://www.apache.org/licenses/LICENSE-2.0
-
-  Unless required by applicable law or agreed to in writing,
-  software distributed under the License is distributed on an
-  "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
-  KIND, either express or implied.  See the License for the
-  specific language governing permissions and limitations
-  under the License.
--->
-
-## Manylinux1 wheels for Apache Arrow
-
-This folder provides base Docker images and an infrastructure to build
-`manylinux1` compatible Python wheels that should be installable on all
-Linux distributions published in the last four years.
-
-The process is split up in two parts: There are base Docker images that build
-the native, Python-independent dependencies. For these you can select if you
-want to also build the dependencies used for the Parquet support. Depending on
-these images, there is also a bash script that will build the pyarrow wheels
-for all supported Python versions and place them in the `dist` folder.
-
-### Build instructions
-
-You can build the wheels with the following command (this is for Python 3.7,
-similarly you can pass another value for `PYTHON_VERSION`):
-
-```bash
-# Build the python packages
-docker-compose run -e PYTHON_VERSION="3.7" centos-python-manylinux1
-# Now the new packages are located in the dist/ folder
-ls -l dist/
-```
-
-### Updating the build environment
-The base docker image is updated less often. When we want to update
-a dependency to a new version, we also need to adjust the image. You can rebuild
-this image using
-
-```bash
-docker-compose build centos-python-manylinux1
-```
-
-For each dependency, we have a bash script in the directory `scripts/` that
-downloads the sources, builds and installs them. At the end of each dependency
-build the sources are removed again so that only the binary installation of a
-dependency is persisted in the docker image. When you do local adjustments to
-this image, you need to change the name of the docker image in the `docker run`
-command.
-
-### Publishing a new build image
-
-If you have write access to the Docker Hub Ursa Labs account, you can directly
-publish a build image that you built locally.
-
-```bash
-$ docker-compose push centos-python-manylinux1
-```
-
-### Using quay.io to trigger and build the docker image
-
-The images used by the docker-compose setup can be freely changed; currently
-the images are hosted on dockerhub.
-
-1.  Make the change in the build scripts (eg. to modify the boost build, update
-    `scripts/boost.sh`).
-
-2.  Set up an account on quay.io and link it to your GitHub account
-
-3.  In quay.io, add a new repository using:
-
-    1.  Link to GitHub repository push
-    2.  Trigger build on changes to a specific branch (eg. myquay) of the repo
-        (eg. `pravindra/arrow`)
-    3.  Set Dockerfile location to `/python/manylinux1/Dockerfile-x86_64_base`
-    4.  Set Context location to `/python/manylinux1`
-
-4.  Push change (in step 1) to the branch specified in step 3.ii
-
-    *  This should trigger a build in quay.io; the build takes about 2 hrs to
-       finish.
-
-5.  Add a tag `latest` to the build after step 4 finishes, save the build ID
-    (eg. `quay.io/pravindra/arrow_manylinux1_x86_64_base:latest`)
-
-6.  In your arrow PR,
-
-    *  include the change from 1.
-    *  modify the docker-compose.yml's python-manylinux1 entry to switch to
-       the location from step 5 for the docker image.
-
-## TensorFlow compatible wheels for Arrow
-
-As TensorFlow is not compatible with the manylinux1 standard, the above
-wheels can cause segfaults if they are used together with the TensorFlow wheels
-from https://www.tensorflow.org/install/pip. We do not recommend using
-TensorFlow wheels with pyarrow manylinux1 wheels until these incompatibilities
-are addressed by the TensorFlow team [1]. For most end-users, the recommended
-way to use Arrow together with TensorFlow is through conda.
-If this is not an option for you, there is also a way to produce TensorFlow
-compatible Arrow wheels that however do not conform to the manylinux1 standard
-and are not officially supported by the Arrow community.
-
-Similar to the manylinux1 wheels, there is a base image that can be built with
-
-```bash
-docker build -t arrow_linux_x86_64_base -f Dockerfile-x86_64_ubuntu .
-```
-
-Once the image has been built, you can then build the wheels with the following
-command:
-
-```bash
-# Build the python packages
-sudo docker run --env UBUNTU_WHEELS=1 --env PYTHON_VERSION="3.7" --rm -t -i -v $PWD:/io -v $PWD/../../:/arrow arrow_linux_x86_64_base:latest /io/build_arrow.sh
-# Now the new packages are located in the dist/ folder
-ls -l dist/
-echo "Please note that these wheels are not manylinux1 compliant"
-```
-
-[1] https://groups.google.com/a/tensorflow.org/d/topic/developers/TMqRaT-H2bI/discussion
diff --git a/python/manylinux1/build_arrow.sh b/python/manylinux1/build_arrow.sh
deleted file mode 100755
index f72dc6c..0000000
--- a/python/manylinux1/build_arrow.sh
+++ /dev/null
@@ -1,176 +0,0 @@
-#!/bin/bash
-#
-# Licensed to the Apache Software Foundation (ASF) under one
-# or more contributor license agreements.  See the NOTICE file
-# distributed with this work for additional information
-# regarding copyright ownership.  The ASF licenses this file
-# to you under the Apache License, Version 2.0 (the
-# "License"); you may not use this file except in compliance
-# with the License.  You may obtain a copy of the License at
-#
-#   http://www.apache.org/licenses/LICENSE-2.0
-#
-# Unless required by applicable law or agreed to in writing,
-# software distributed under the License is distributed on an
-# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
-# KIND, either express or implied.  See the License for the
-# specific language governing permissions and limitations
-# under the License.
-#
-# Build upon the scripts in https://github.com/matthew-brett/manylinux-builds
-# * Copyright (c) 2013-2016, Matt Terry and Matthew Brett (BSD 2-clause)
-#
-# Usage:
-#   either build:
-#     $ docker-compose build python-manylinux1
-#   or pull:
-#     $ docker-compose pull python-manylinux1
-#   and then run:
-#     $ docker-compose run -e PYTHON_VERSION=3.7 python-manylinux1
-
-source /multibuild/manylinux_utils.sh
-
-# Quit on failure
-set -e
-
-# Print commands for debugging
-# set -x
-
-cd /arrow/python
-
-NCORES=$(($(grep -c ^processor /proc/cpuinfo) + 1))
-
-# PyArrow build configuration
-export PYARROW_BUILD_TYPE='release'
-export PYARROW_CMAKE_GENERATOR='Ninja'
-export PYARROW_PARALLEL=${NCORES}
-
-# ARROW-6860: Disabling ORC in wheels until Protobuf static linking issues
-# across projects is resolved
-export PYARROW_WITH_ORC=0
-export PYARROW_WITH_HDFS=1
-export PYARROW_WITH_PARQUET=1
-export PYARROW_WITH_PLASMA=1
-export PYARROW_WITH_S3=1
-export PYARROW_BUNDLE_ARROW_CPP=1
-export PYARROW_BUNDLE_BOOST=1
-export PYARROW_BOOST_NAMESPACE=arrow_boost
-export PKG_CONFIG_PATH=/usr/lib/pkgconfig:/arrow-dist/lib/pkgconfig
-
-export PYARROW_CMAKE_OPTIONS='-DTHRIFT_HOME=/usr -DBoost_NAMESPACE=arrow_boost -DBOOST_ROOT=/arrow_boost_dist'
-# Ensure the target directory exists
-mkdir -p /io/dist
-
-# Must pass PYTHON_VERSION env variable
-# possible values are: 3.6 3.7 3.8 3.9
-
-UNICODE_WIDTH=32  # Dummy value, irrelevant for Python 3
-CPYTHON_PATH="$(cpython_path ${PYTHON_VERSION} ${UNICODE_WIDTH})"
-PYTHON_INTERPRETER="${CPYTHON_PATH}/bin/python"
-PIP="${CPYTHON_PATH}/bin/pip"
-# Put our Python first to avoid picking up an antiquated Python from CMake
-PATH="${CPYTHON_PATH}/bin:${PATH}"
-
-echo "=== (${PYTHON_VERSION}) Install the wheel build dependencies ==="
-$PIP install -r requirements-wheel-build.txt
-
-export PYARROW_INSTALL_TESTS=1
-export PYARROW_WITH_DATASET=1
-export PYARROW_WITH_FLIGHT=1
-export PYARROW_WITH_GANDIVA=0
-export BUILD_ARROW_DATASET=ON
-export BUILD_ARROW_FLIGHT=ON
-export BUILD_ARROW_GANDIVA=OFF
-
-echo "=== (${PYTHON_VERSION}) Building Arrow C++ libraries ==="
-ARROW_BUILD_DIR=/tmp/build-PY${PYTHON_VERSION}
-mkdir -p "${ARROW_BUILD_DIR}"
-pushd "${ARROW_BUILD_DIR}"
-cmake \
-    -DCMAKE_BUILD_TYPE=Release \
-    -DARROW_BOOST_USE_SHARED=ON \
-    -DARROW_BROTLI_USE_SHARED=OFF \
-    -DARROW_BUILD_SHARED=ON \
-    -DARROW_BUILD_STATIC=OFF \
-    -DARROW_BUILD_TESTS=OFF \
-    -DARROW_DATASET=${BUILD_ARROW_DATASET} \
-    -DARROW_DEPENDENCY_SOURCE="SYSTEM" \
-    -DARROW_DEPENDENCY_USE_SHARED=OFF \
-    -DARROW_FLIGHT=${BUILD_ARROW_FLIGHT} \
-    -DARROW_GANDIVA_JAVA=OFF \
-    -DARROW_GANDIVA_PC_CXX_FLAGS="-isystem;/opt/rh/devtoolset-2/root/usr/include/c++/4.8.2;-isystem;/opt/rh/devtoolset-2/root/usr/include/c++/4.8.2/x86_64-CentOS-linux/" \
-    -DARROW_GANDIVA=${BUILD_ARROW_GANDIVA} \
-    -DARROW_HDFS=ON \
-    -DARROW_JEMALLOC=ON \
-    -DARROW_ORC=OFF \
-    -DARROW_PACKAGE_KIND=manylinux1 \
-    -DARROW_PARQUET=ON \
-    -DARROW_PLASMA=ON \
-    -DARROW_PYTHON=ON \
-    -DARROW_RPATH_ORIGIN=ON \
-    -DARROW_S3=ON \
-    -DARROW_TENSORFLOW=ON \
-    -DARROW_UTF8PROC_USE_SHARED=OFF \
-    -DARROW_WITH_BROTLI=ON \
-    -DARROW_WITH_BZ2=ON \
-    -DARROW_WITH_LZ4=ON \
-    -DARROW_WITH_SNAPPY=ON \
-    -DARROW_WITH_ZLIB=ON \
-    -DARROW_WITH_ZSTD=ON \
-    -DBoost_NAMESPACE=arrow_boost \
-    -DBOOST_ROOT=/arrow_boost_dist \
-    -DCMAKE_INSTALL_LIBDIR=lib \
-    -DCMAKE_INSTALL_PREFIX=/arrow-dist \
-    -DCMAKE_UNITY_BUILD=ON \
-    -DOPENSSL_USE_STATIC_LIBS=ON \
-    -DORC_SOURCE=BUNDLED \
-    -GNinja /arrow/cpp
-ninja
-ninja install
-popd
-
-# Check that we don't expose any unwanted symbols
-/io/scripts/check_arrow_visibility.sh
-
-# Clear output directories and leftovers
-rm -rf dist/
-rm -rf build/
-rm -rf repaired_wheels/
-find -name "*.so" -delete
-
-echo "=== (${PYTHON_VERSION}) Building wheel ==="
-PATH="$PATH:${CPYTHON_PATH}/bin" $PYTHON_INTERPRETER setup.py build_ext --inplace
-PATH="$PATH:${CPYTHON_PATH}/bin" $PYTHON_INTERPRETER setup.py bdist_wheel
-# Source distribution is used for debian pyarrow packages.
-PATH="$PATH:${CPYTHON_PATH}/bin" $PYTHON_INTERPRETER setup.py sdist
-
-if [ -n "$UBUNTU_WHEELS" ]; then
-  echo "=== (${PYTHON_VERSION}) Wheels are not compatible with manylinux1 ==="
-  mv dist/pyarrow-*.whl /io/dist
-else
-  echo "=== (${PYTHON_VERSION}) Tag the wheel with manylinux1 ==="
-  mkdir -p repaired_wheels/
-  auditwheel repair -L . dist/pyarrow-*.whl -w repaired_wheels/
-
-  # Install the built wheels
-  $PIP install repaired_wheels/*.whl
-
-  # Test that the modules are importable
-  $PYTHON_INTERPRETER -c "
-import pyarrow
-import pyarrow.csv
-import pyarrow.dataset
-import pyarrow.flight
-import pyarrow.fs
-import pyarrow._hdfs
-import pyarrow.json
-import pyarrow.parquet
-import pyarrow.plasma
-import pyarrow._s3fs
-  "
-
-  # More thorough testing happens outside of the build to prevent
-  # packaging issues like ARROW-4372
-  mv dist/*.tar.gz /io/dist
-  mv repaired_wheels/*.whl /io/dist
-fi
diff --git a/python/manylinux1/scripts/build_absl.sh b/python/manylinux1/scripts/build_absl.sh
deleted file mode 100755
index 89d988c..0000000
--- a/python/manylinux1/scripts/build_absl.sh
+++ /dev/null
@@ -1,35 +0,0 @@
-#!/bin/bash -ex
-# Licensed to the Apache Software Foundation (ASF) under one
-# or more contributor license agreements.  See the NOTICE file
-# distributed with this work for additional information
-# regarding copyright ownership.  The ASF licenses this file
-# to you under the Apache License, Version 2.0 (the
-# "License"); you may not use this file except in compliance
-# with the License.  You may obtain a copy of the License at
-#
-#   http://www.apache.org/licenses/LICENSE-2.0
-#
-# Unless required by applicable law or agreed to in writing,
-# software distributed under the License is distributed on an
-# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
-# KIND, either express or implied.  See the License for the
-# specific language governing permissions and limitations
-# under the License.
-
-export ABSL_VERSION="20200225.2"
-export CFLAGS="-fPIC"
-export CXXFLAGS="-fPIC"
-
-curl -sL "https://github.com/abseil/abseil-cpp/archive/${ABSL_VERSION}.tar.gz" -o ${ABSL_VERSION}.tar.gz
-tar xf ${ABSL_VERSION}.tar.gz
-pushd abseil-cpp-${ABSL_VERSION}
-
-cmake . -GNinja \
-    -DCMAKE_BUILD_TYPE=Release \
-    -DCMAKE_INSTALL_PREFIX=/usr \
-    -DABSL_RUN_TESTS=OFF \
-    -DCMAKE_CXX_STANDARD=11
-
-ninja install
-popd
-rm -rf abseil-cpp-${ABSL_VERSION} ${ABSL_VERSION}.tar.gz
diff --git a/python/manylinux1/scripts/build_aws_sdk.sh b/python/manylinux1/scripts/build_aws_sdk.sh
deleted file mode 100755
index f33f3e4..0000000
--- a/python/manylinux1/scripts/build_aws_sdk.sh
+++ /dev/null
@@ -1,48 +0,0 @@
-#!/bin/bash -ex
-# Licensed to the Apache Software Foundation (ASF) under one
-# or more contributor license agreements.  See the NOTICE file
-# distributed with this work for additional information
-# regarding copyright ownership.  The ASF licenses this file
-# to you under the Apache License, Version 2.0 (the
-# "License"); you may not use this file except in compliance
-# with the License.  You may obtain a copy of the License at
-#
-#   http://www.apache.org/licenses/LICENSE-2.0
-#
-# Unless required by applicable law or agreed to in writing,
-# software distributed under the License is distributed on an
-# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
-# KIND, either express or implied.  See the License for the
-# specific language governing permissions and limitations
-# under the License.
-
-export AWS_SDK_VERSION="1.7.356"
-# Avoid compilation error due to a spurious warning
-export CFLAGS="-Wno-unused-parameter"
-export PREFIX="/usr/local"
-
-curl -sL "https://github.com/aws/aws-sdk-cpp/archive/${AWS_SDK_VERSION}.tar.gz" -o aws-sdk-cpp-${AWS_SDK_VERSION}.tar.gz
-tar xf aws-sdk-cpp-${AWS_SDK_VERSION}.tar.gz
-pushd aws-sdk-cpp-${AWS_SDK_VERSION}
-
-mkdir build
-pushd build
-
-cmake .. -GNinja \
-    -DCMAKE_BUILD_TYPE=Release \
-    -DCMAKE_PREFIX_PATH=${CURL_HOME} \
-    -DCMAKE_C_FLAGS=${CFLAGS} \
-    -DCMAKE_CXX_FLAGS=${CFLAGS} \
-    -DCMAKE_INSTALL_PREFIX=${PREFIX} \
-    -DBUILD_ONLY='s3;core;transfer;config;identity-management;sts' \
-    -DBUILD_SHARED_LIBS=OFF \
-    -DENABLE_CURL_LOGGING=ON \
-    -DENABLE_UNITY_BUILD=ON \
-    -DENABLE_TESTING=OFF
-
-ninja install
-
-popd
-popd
-
-rm -r aws-sdk-cpp-${AWS_SDK_VERSION}.tar.gz aws-sdk-cpp-${AWS_SDK_VERSION}
diff --git a/python/manylinux1/scripts/build_bison.sh b/python/manylinux1/scripts/build_bison.sh
deleted file mode 100755
index b937aaa..0000000
--- a/python/manylinux1/scripts/build_bison.sh
+++ /dev/null
@@ -1,29 +0,0 @@
-#!/bin/bash -ex
-# Licensed to the Apache Software Foundation (ASF) under one
-# or more contributor license agreements.  See the NOTICE file
-# distributed with this work for additional information
-# regarding copyright ownership.  The ASF licenses this file
-# to you under the Apache License, Version 2.0 (the
-# "License"); you may not use this file except in compliance
-# with the License.  You may obtain a copy of the License at
-#
-#   http://www.apache.org/licenses/LICENSE-2.0
-#
-# Unless required by applicable law or agreed to in writing,
-# software distributed under the License is distributed on an
-# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
-# KIND, either express or implied.  See the License for the
-# specific language governing permissions and limitations
-# under the License.
-
-BISON_VERSION=3.3.2
-NCORES=$(($(grep -c ^processor /proc/cpuinfo) + 1))
-
-curl -sL https://ftp.gnu.org/gnu/bison/bison-${BISON_VERSION}.tar.gz -o bison-${BISON_VERSION}.tar.gz
-tar xf bison-${BISON_VERSION}.tar.gz
-pushd bison-${BISON_VERSION}
-./configure --prefix=/usr
-make -j$NCORES
-make install
-popd
-rm -rf bison-${BISON_VERSION} bison-${BISON_VERSION}.tar.gz
diff --git a/python/manylinux1/scripts/build_boost.sh b/python/manylinux1/scripts/build_boost.sh
deleted file mode 100755
index d54a1ef..0000000
--- a/python/manylinux1/scripts/build_boost.sh
+++ /dev/null
@@ -1,50 +0,0 @@
-#!/bin/bash -ex
-# Licensed to the Apache Software Foundation (ASF) under one
-# or more contributor license agreements.  See the NOTICE file
-# distributed with this work for additional information
-# regarding copyright ownership.  The ASF licenses this file
-# to you under the Apache License, Version 2.0 (the
-# "License"); you may not use this file except in compliance
-# with the License.  You may obtain a copy of the License at
-#
-#   http://www.apache.org/licenses/LICENSE-2.0
-#
-# Unless required by applicable law or agreed to in writing,
-# software distributed under the License is distributed on an
-# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
-# KIND, either express or implied.  See the License for the
-# specific language governing permissions and limitations
-# under the License.
-
-BOOST_VERSION=1.73.0
-BOOST_VERSION_UNDERSCORE=${BOOST_VERSION//\./_}
-NCORES=$(($(grep -c ^processor /proc/cpuinfo) + 1))
-
-BASE_NAME=boost_${BOOST_VERSION_UNDERSCORE}
-ARCHIVE_NAME=boost_${BOOST_VERSION_UNDERSCORE}.tar.bz2
-
-# Bintray is much faster but can fail because of limitations
-curl -sL https://dl.bintray.com/boostorg/release/${BOOST_VERSION}/source/boost_${BOOST_VERSION_UNDERSCORE}.tar.bz2 -o /${ARCHIVE_NAME} \
-    || curl -sL https://sourceforge.net/projects/boost/files/boost/${BOOST_VERSION}/boost_${BOOST_VERSION_UNDERSCORE}.tar.bz2 -o /${ARCHIVE_NAME}
-
-tar xf ${ARCHIVE_NAME}
-mkdir /arrow_boost
-pushd /${BASE_NAME}
-./bootstrap.sh
-./b2 -j${NCORES} tools/bcp
-./dist/bin/bcp --namespace=arrow_boost --namespace-alias filesystem date_time system regex build predef algorithm locale format variant multiprecision/cpp_int /arrow_boost
-popd
-
-pushd /arrow_boost
-ls -l
-./bootstrap.sh
-./b2 -j${NCORES} dll-path="'\$ORIGIN/'" cxxflags='-std=c++11 -fPIC' cflags=-fPIC linkflags="-std=c++11" variant=release link=shared --prefix=/arrow_boost_dist --with-filesystem --with-date_time --with-system --with-regex install
-popd
-
-rm -rf ${ARCHIVE_NAME} ${BASE_NAME} arrow_boost
-# Boost always installs header-only parts but they also take up quite some space.
-# We don't need them in arrow, so don't persist them in the docker layer.
-# fusion 16.7 MiB
-rm -r /arrow_boost_dist/include/boost/fusion
-# spirit 8.2 MiB
-rm -r /arrow_boost_dist/include/boost/spirit
diff --git a/python/manylinux1/scripts/build_brotli.sh b/python/manylinux1/scripts/build_brotli.sh
deleted file mode 100755
index 612927c..0000000
--- a/python/manylinux1/scripts/build_brotli.sh
+++ /dev/null
@@ -1,34 +0,0 @@
-#!/bin/bash -ex
-# Licensed to the Apache Software Foundation (ASF) under one
-# or more contributor license agreements.  See the NOTICE file
-# distributed with this work for additional information
-# regarding copyright ownership.  The ASF licenses this file
-# to you under the Apache License, Version 2.0 (the
-# "License"); you may not use this file except in compliance
-# with the License.  You may obtain a copy of the License at
-#
-#   http://www.apache.org/licenses/LICENSE-2.0
-#
-# Unless required by applicable law or agreed to in writing,
-# software distributed under the License is distributed on an
-# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
-# KIND, either express or implied.  See the License for the
-# specific language governing permissions and limitations
-# under the License.
-
-export BROTLI_VERSION="1.0.7"
-curl -sL "https://github.com/google/brotli/archive/v${BROTLI_VERSION}.tar.gz" -o brotli-${BROTLI_VERSION}.tar.gz
-tar xf brotli-${BROTLI_VERSION}.tar.gz
-pushd brotli-${BROTLI_VERSION}
-mkdir build
-pushd build
-cmake -DCMAKE_BUILD_TYPE=release \
-    "-DCMAKE_CXX_FLAGS=-fPIC" \
-    "-DCMAKE_C_FLAGS=-fPIC" \
-    -DCMAKE_INSTALL_PREFIX=/usr/local \
-    -GNinja \
-    ..
-ninja install
-popd
-popd
-rm -rf brotli-${BROTLI_VERSION}.tar.gz brotli-${BROTLI_VERSION}
diff --git a/python/manylinux1/scripts/build_bz2.sh b/python/manylinux1/scripts/build_bz2.sh
deleted file mode 100755
index 84fd705..0000000
--- a/python/manylinux1/scripts/build_bz2.sh
+++ /dev/null
@@ -1,30 +0,0 @@
-#!/bin/bash -ex
-# Licensed to the Apache Software Foundation (ASF) under one
-# or more contributor license agreements.  See the NOTICE file
-# distributed with this work for additional information
-# regarding copyright ownership.  The ASF licenses this file
-# to you under the Apache License, Version 2.0 (the
-# "License"); you may not use this file except in compliance
-# with the License.  You may obtain a copy of the License at
-#
-#   http://www.apache.org/licenses/LICENSE-2.0
-#
-# Unless required by applicable law or agreed to in writing,
-# software distributed under the License is distributed on an
-# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
-# KIND, either express or implied.  See the License for the
-# specific language governing permissions and limitations
-# under the License.
-
-NCORES=$(($(grep -c ^processor /proc/cpuinfo) + 1))
-export BZ2_VERSION="1.0.8"
-export CFLAGS="-Wall -Winline -O2 -fPIC -D_FILE_OFFSET_BITS=64"
-
-curl -sL "https://www.sourceware.org/pub/bzip2/bzip2-${BZ2_VERSION}.tar.gz" -o bzip2-${BZ2_VERSION}.tar.gz
-tar xf bzip2-${BZ2_VERSION}.tar.gz
-
-pushd bzip2-${BZ2_VERSION}
-make PREFIX=/usr/local CFLAGS="$CFLAGS" install -j${NCORES}
-popd
-
-rm -rf bzip2-${BZ2_VERSION}.tar.gz bzip2-${BZ2_VERSION}
diff --git a/python/manylinux1/scripts/build_cares.sh b/python/manylinux1/scripts/build_cares.sh
deleted file mode 100755
index 5848eaf..0000000
--- a/python/manylinux1/scripts/build_cares.sh
+++ /dev/null
@@ -1,34 +0,0 @@
-#!/bin/bash -ex
-# Licensed to the Apache Software Foundation (ASF) under one
-# or more contributor license agreements.  See the NOTICE file
-# distributed with this work for additional information
-# regarding copyright ownership.  The ASF licenses this file
-# to you under the Apache License, Version 2.0 (the
-# "License"); you may not use this file except in compliance
-# with the License.  You may obtain a copy of the License at
-#
-#   http://www.apache.org/licenses/LICENSE-2.0
-#
-# Unless required by applicable law or agreed to in writing,
-# software distributed under the License is distributed on an
-# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
-# KIND, either express or implied.  See the License for the
-# specific language governing permissions and limitations
-# under the License.
-
-export CARES_VERSION="1.16.1"
-export CFLAGS="-fPIC"
-export PREFIX="/usr/local"
-curl -sL "https://c-ares.haxx.se/download/c-ares-$CARES_VERSION.tar.gz" -o c-ares-${CARES_VERSION}.tar.gz
-tar xf c-ares-${CARES_VERSION}.tar.gz
-pushd c-ares-${CARES_VERSION}
-
-cmake -DCMAKE_BUILD_TYPE=Release \
-      -DCMAKE_INSTALL_PREFIX=${PREFIX} \
-      -DCARES_STATIC=ON \
-      -DCARES_SHARED=OFF \
-      -DCMAKE_C_FLAGS=${CFLAGS} \
-      -GNinja .
-ninja install
-popd
-rm -rf c-ares-${CARES_VERSION}.tar.gz c-ares-${CARES_VERSION}
diff --git a/python/manylinux1/scripts/build_ccache.sh b/python/manylinux1/scripts/build_ccache.sh
deleted file mode 100755
index 9dc0336..0000000
--- a/python/manylinux1/scripts/build_ccache.sh
+++ /dev/null
@@ -1,32 +0,0 @@
-#!/bin/bash -ex
-# Licensed to the Apache Software Foundation (ASF) under one
-# or more contributor license agreements.  See the NOTICE file
-# distributed with this work for additional information
-# regarding copyright ownership.  The ASF licenses this file
-# to you under the Apache License, Version 2.0 (the
-# "License"); you may not use this file except in compliance
-# with the License.  You may obtain a copy of the License at
-#
-#   http://www.apache.org/licenses/LICENSE-2.0
-#
-# Unless required by applicable law or agreed to in writing,
-# software distributed under the License is distributed on an
-# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
-# KIND, either express or implied.  See the License for the
-# specific language governing permissions and limitations
-# under the License.
-
-NCORES=$(($(grep -c ^processor /proc/cpuinfo) + 1))
-CCACHE_VERSION=3.6
-
-curl -sLO https://www.samba.org/ftp/ccache/ccache-${CCACHE_VERSION}.tar.bz2
-tar xf ccache-${CCACHE_VERSION}.tar.bz2
-pushd ccache-${CCACHE_VERSION}
-./configure --prefix=/usr
-make -j$NCORES
-make install
-popd
-rm -rf ccache-${CCACHE_VERSION}.tar.bz2 ccache-${CCACHE_VERSION}
-
-# Initialize the config directory, otherwise the build sometimes fails.
-mkdir /root/.ccache
diff --git a/python/manylinux1/scripts/build_curl.sh b/python/manylinux1/scripts/build_curl.sh
deleted file mode 100755
index d673e83..0000000
--- a/python/manylinux1/scripts/build_curl.sh
+++ /dev/null
@@ -1,55 +0,0 @@
-#!/bin/bash -ex
-# Licensed to the Apache Software Foundation (ASF) under one
-# or more contributor license agreements.  See the NOTICE file
-# distributed with this work for additional information
-# regarding copyright ownership.  The ASF licenses this file
-# to you under the Apache License, Version 2.0 (the
-# "License"); you may not use this file except in compliance
-# with the License.  You may obtain a copy of the License at
-#
-#   http://www.apache.org/licenses/LICENSE-2.0
-#
-# Unless required by applicable law or agreed to in writing,
-# software distributed under the License is distributed on an
-# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
-# KIND, either express or implied.  See the License for the
-# specific language governing permissions and limitations
-# under the License.
-
-export CURL_VERSION="7.70.0"
-# Install our curl in a separate directory to distinguish from the already
-# existing /usr/local/libcurl.so (used by git tools)
-export PREFIX="/opt/curl"
-
-NCORES=$(($(grep -c ^processor /proc/cpuinfo) + 1))
-
-curl -sL "http://curl.haxx.se/download/curl-${CURL_VERSION}.tar.bz2" -o curl-${CURL_VERSION}.tar.bz2
-tar xf curl-${CURL_VERSION}.tar.bz2
-pushd curl-${CURL_VERSION}
-
-./configure \
-    --prefix=${PREFIX} \
-    --disable-ldap \
-    --disable-ldaps \
-    --disable-rtsp \
-    --disable-telnet \
-    --disable-tftp \
-    --disable-pop3 \
-    --disable-imap \
-    --disable-smb \
-    --disable-smtp \
-    --disable-gopher \
-    --disable-mqtt \
-    --disable-manual \
-    --disable-shared \
-    --without-ca-bundle \
-    --without-ca-path \
-    --with-ssl=/usr/local \
-    --with-zlib=/usr/local
-
-make -j${NCORES}
-make install
-
-popd
-
-rm -r curl-${CURL_VERSION}.tar.bz2 curl-${CURL_VERSION}
diff --git a/python/manylinux1/scripts/build_flatbuffers.sh b/python/manylinux1/scripts/build_flatbuffers.sh
deleted file mode 100755
index 7aaaa60..0000000
--- a/python/manylinux1/scripts/build_flatbuffers.sh
+++ /dev/null
@@ -1,32 +0,0 @@
-#!/bin/bash -ex
-# Licensed to the Apache Software Foundation (ASF) under one
-# or more contributor license agreements.  See the NOTICE file
-# distributed with this work for additional information
-# regarding copyright ownership.  The ASF licenses this file
-# to you under the Apache License, Version 2.0 (the
-# "License"); you may not use this file except in compliance
-# with the License.  You may obtain a copy of the License at
-#
-#   http://www.apache.org/licenses/LICENSE-2.0
-#
-# Unless required by applicable law or agreed to in writing,
-# software distributed under the License is distributed on an
-# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
-# KIND, either express or implied.  See the License for the
-# specific language governing permissions and limitations
-# under the License.
-
-export FLATBUFFERS_VERSION=1.10.0
-curl -sL https://github.com/google/flatbuffers/archive/v${FLATBUFFERS_VERSION}.tar.gz \
-    -o flatbuffers-${FLATBUFFERS_VERSION}.tar.gz
-tar xf flatbuffers-${FLATBUFFERS_VERSION}.tar.gz
-pushd flatbuffers-${FLATBUFFERS_VERSION}
-cmake \
-    "-DCMAKE_CXX_FLAGS=-fPIC" \
-    "-DCMAKE_INSTALL_PREFIX:PATH=/usr" \
-    -DFLATBUFFERS_BUILD_TESTS=OFF \
-    -DCMAKE_BUILD_TYPE=Release \
-    -GNinja .
-ninja install
-popd
-rm -rf flatbuffers-${FLATBUFFERS_VERSION}.tar.gz flatbuffers-${FLATBUFFERS_VERSION}
diff --git a/python/manylinux1/scripts/build_gflags.sh b/python/manylinux1/scripts/build_gflags.sh
deleted file mode 100755
index b26c14b..0000000
--- a/python/manylinux1/scripts/build_gflags.sh
+++ /dev/null
@@ -1,39 +0,0 @@
-#!/bin/bash -ex
-# Licensed to the Apache Software Foundation (ASF) under one
-# or more contributor license agreements.  See the NOTICE file
-# distributed with this work for additional information
-# regarding copyright ownership.  The ASF licenses this file
-# to you under the Apache License, Version 2.0 (the
-# "License"); you may not use this file except in compliance
-# with the License.  You may obtain a copy of the License at
-#
-#   http://www.apache.org/licenses/LICENSE-2.0
-#
-# Unless required by applicable law or agreed to in writing,
-# software distributed under the License is distributed on an
-# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
-# KIND, either express or implied.  See the License for the
-# specific language governing permissions and limitations
-# under the License.
-
-export GFLAGS_VERSION="2.2.2"
-export CFLAGS="-fPIC"
-export CXXFLAGS="-fPIC"
-
-curl -sL "https://github.com/gflags/gflags/archive/v${GFLAGS_VERSION}.tar.gz" -o gflags-${GFLAGS_VERSION}.tar.gz
-tar xf gflags-${GFLAGS_VERSION}.tar.gz
-pushd gflags-${GFLAGS_VERSION}
-
-cmake . -GNinja \
-    -DCMAKE_BUILD_TYPE=Release \
-    -DCMAKE_INSTALL_PREFIX=/usr \
-    -DINSTALL_HEADERS=ON \
-    -DBUILD_SHARED_LIBS=OFF \
-    -DBUILD_STATIC_LIBS=ON \
-    -DBUILD_PACKAGING=OFF \
-    -DBUILD_TESTING=OFF \
-    -DBUILD_CONFIG_TESTS=OFF \
-
-ninja install
-popd
-rm -rf gflags-${GFLAGS_VERSION}.tar.gz gflags-${GFLAGS_VERSION}
diff --git a/python/manylinux1/scripts/build_glog.sh b/python/manylinux1/scripts/build_glog.sh
deleted file mode 100755
index 4d43d36..0000000
--- a/python/manylinux1/scripts/build_glog.sh
+++ /dev/null
@@ -1,35 +0,0 @@
-#!/bin/bash -ex
-# Licensed to the Apache Software Foundation (ASF) under one
-# or more contributor license agreements.  See the NOTICE file
-# distributed with this work for additional information
-# regarding copyright ownership.  The ASF licenses this file
-# to you under the Apache License, Version 2.0 (the
-# "License"); you may not use this file except in compliance
-# with the License.  You may obtain a copy of the License at
-#
-#   http://www.apache.org/licenses/LICENSE-2.0
-#
-# Unless required by applicable law or agreed to in writing,
-# software distributed under the License is distributed on an
-# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
-# KIND, either express or implied.  See the License for the
-# specific language governing permissions and limitations
-# under the License.
-
-export GLOG_VERSION="0.4.0"
-export PREFIX="/usr/local"
-curl -sL "https://github.com/google/glog/archive/v${GLOG_VERSION}.tar.gz" -o glog-${GLOG_VERSION}.tar.gz
-tar xf glog-${GLOG_VERSION}.tar.gz
-pushd glog-${GLOG_VERSION}
-
-cmake -DCMAKE_BUILD_TYPE=Release \
-      -DCMAKE_INSTALL_PREFIX=${PREFIX} \
-      -DCMAKE_POSITION_INDEPENDENT_CODE=1 \
-      -DBUILD_SHARED_LIBS=OFF \
-      -DBUILD_TESTING=OFF \
-      -DWITH_GFLAGS=OFF \
-      -GNinja .
-ninja install
-popd
-rm -rf glog-${GLOG_VERSION}.tar.gz glog-${GLOG_VERSION}
-
diff --git a/python/manylinux1/scripts/build_grpc.sh b/python/manylinux1/scripts/build_grpc.sh
deleted file mode 100755
index ef2e831..0000000
--- a/python/manylinux1/scripts/build_grpc.sh
+++ /dev/null
@@ -1,49 +0,0 @@
-#!/bin/bash -ex
-# Licensed to the Apache Software Foundation (ASF) under one
-# or more contributor license agreements.  See the NOTICE file
-# distributed with this work for additional information
-# regarding copyright ownership.  The ASF licenses this file
-# to you under the Apache License, Version 2.0 (the
-# "License"); you may not use this file except in compliance
-# with the License.  You may obtain a copy of the License at
-#
-#   http://www.apache.org/licenses/LICENSE-2.0
-#
-# Unless required by applicable law or agreed to in writing,
-# software distributed under the License is distributed on an
-# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
-# KIND, either express or implied.  See the License for the
-# specific language governing permissions and limitations
-# under the License.
-
-export GRPC_VERSION="1.29.1"
-export CFLAGS="-fPIC -DGPR_MANYLINUX1=1"
-export PREFIX="/usr/local"
-
-curl -sL "https://github.com/grpc/grpc/archive/v${GRPC_VERSION}.tar.gz" -o grpc-${GRPC_VERSION}.tar.gz
-tar xf grpc-${GRPC_VERSION}.tar.gz
-pushd grpc-${GRPC_VERSION}
-
-cmake -DCMAKE_BUILD_TYPE=Release \
-      -DCMAKE_INSTALL_PREFIX=${PREFIX} \
-      -DBUILD_SHARED_LIBS=OFF \
-      -DCMAKE_C_FLAGS="${CFLAGS}" \
-      -DCMAKE_CXX_FLAGS="${CFLAGS}" \
-      -DgRPC_BUILD_CSHARP_EXT=OFF \
-      -DgRPC_BUILD_GRPC_CSHARP_PLUGIN=OFF \
-      -DgRPC_BUILD_GRPC_NODE_PLUGIN=OFF \
-      -DgRPC_BUILD_GRPC_OBJECTIVE_C_PLUGIN=OFF \
-      -DgRPC_BUILD_GRPC_PHP_PLUGIN=OFF \
-      -DgRPC_BUILD_GRPC_PYTHON_PLUGIN=OFF \
-      -DgRPC_BUILD_GRPC_RUBY_PLUGIN=OFF \
-      -DgRPC_ABSL_PROVIDER=package \
-      -DgRPC_CARES_PROVIDER=package \
-      -DgRPC_GFLAGS_PROVIDER=package \
-      -DgRPC_PROTOBUF_PROVIDER=package \
-      -DgRPC_SSL_PROVIDER=package \
-      -DgRPC_ZLIB_PROVIDER=package \
-      -DOPENSSL_USE_STATIC_LIBS=ON \
-      -GNinja .
-ninja install
-popd
-rm -rf grpc-${GRPC_VERSION}.tar.gz grpc-${GRPC_VERSION}
diff --git a/python/manylinux1/scripts/build_gtest.sh b/python/manylinux1/scripts/build_gtest.sh
deleted file mode 100755
index 6fd1a5b..0000000
--- a/python/manylinux1/scripts/build_gtest.sh
+++ /dev/null
@@ -1,39 +0,0 @@
-#!/bin/bash -ex
-# Licensed to the Apache Software Foundation (ASF) under one
-# or more contributor license agreements.  See the NOTICE file
-# distributed with this work for additional information
-# regarding copyright ownership.  The ASF licenses this file
-# to you under the Apache License, Version 2.0 (the
-# "License"); you may not use this file except in compliance
-# with the License.  You may obtain a copy of the License at
-#
-#   http://www.apache.org/licenses/LICENSE-2.0
-#
-# Unless required by applicable law or agreed to in writing,
-# software distributed under the License is distributed on an
-# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
-# KIND, either express or implied.  See the License for the
-# specific language governing permissions and limitations
-# under the License.
-
-GTEST_VERSION=1.8.1
-
-curl -sL https://github.com/google/googletest/archive/release-${GTEST_VERSION}.tar.gz -o googletest-release-${GTEST_VERSION}.tar.gz
-tar xf googletest-release-${GTEST_VERSION}.tar.gz
-ls -l
-pushd googletest-release-${GTEST_VERSION}
-
-mkdir build_so
-pushd build_so
-cmake -DCMAKE_CXX_FLAGS='-fPIC' -Dgtest_force_shared_crt=ON -DBUILD_SHARED_LIBS=ON -DBUILD_GMOCK=ON -GNinja -DCMAKE_INSTALL_PREFIX=/usr ..
-ninja install
-popd
-
-mkdir build_a
-pushd build_a
-cmake -DCMAKE_CXX_FLAGS='-fPIC' -Dgtest_force_shared_crt=ON -DBUILD_SHARED_LIBS=OFF -DBUILD_GMOCK=ON -GNinja -DCMAKE_INSTALL_PREFIX=/usr ..
-ninja install
-popd
-
-popd
-rm -rf googletest-release-${GTEST_VERSION}.tar.gz googletest-release-${GTEST_VERSION}
diff --git a/python/manylinux1/scripts/build_llvm.sh b/python/manylinux1/scripts/build_llvm.sh
deleted file mode 100755
index 4e3077c..0000000
--- a/python/manylinux1/scripts/build_llvm.sh
+++ /dev/null
@@ -1,88 +0,0 @@
-#!/bin/bash -ex
-#
-# Licensed to the Apache Software Foundation (ASF) under one
-# or more contributor license agreements.  See the NOTICE file
-# distributed with this work for additional information
-# regarding copyright ownership.  The ASF licenses this file
-# to you under the Apache License, Version 2.0 (the
-# "License"); you may not use this file except in compliance
-# with the License.  You may obtain a copy of the License at
-#
-#   http://www.apache.org/licenses/LICENSE-2.0
-#
-# Unless required by applicable law or agreed to in writing,
-# software distributed under the License is distributed on an
-# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
-# KIND, either express or implied.  See the License for the
-# specific language governing permissions and limitations
-# under the License.
-
-LLVM_VERSION_MAJOR=$1
-
-source /multibuild/manylinux_utils.sh
-
-detect_llvm_version() {
-  curl -sL https://api.github.com/repos/llvm/llvm-project/releases | \
-    grep tag_name | \
-    grep -o "llvmorg-${LLVM_VERSION_MAJOR}[^\"]*" | \
-    grep -v rc | \
-    sed -e "s/^llvmorg-//g" | \
-    head -n 1
-}
-
-LLVM_VERSION=$(detect_llvm_version)
-curl -sL https://github.com/llvm/llvm-project/releases/download/llvmorg-${LLVM_VERSION}/llvm-${LLVM_VERSION}.src.tar.xz -o llvm-${LLVM_VERSION}.src.tar.xz
-unxz llvm-${LLVM_VERSION}.src.tar.xz
-tar xf llvm-${LLVM_VERSION}.src.tar
-pushd llvm-${LLVM_VERSION}.src
-mkdir build
-pushd build
-cmake \
-    -DCMAKE_BUILD_TYPE=Release \
-    -DCMAKE_INSTALL_PREFIX=/usr \
-    -DLLVM_TARGETS_TO_BUILD=host \
-    -DLLVM_INCLUDE_DOCS=OFF \
-    -DLLVM_INCLUDE_EXAMPLES=OFF \
-    -DLLVM_INCLUDE_TESTS=OFF \
-    -DLLVM_INCLUDE_UTILS=OFF \
-    -DLLVM_ENABLE_TERMINFO=OFF \
-    -DLLVM_ENABLE_ASSERTIONS=ON \
-    -DLLVM_ENABLE_RTTI=ON \
-    -DLLVM_ENABLE_OCAMLDOC=OFF \
-    -DLLVM_USE_INTEL_JITEVENTS=ON \
-    -DLLVM_TEMPORARILY_ALLOW_OLD_TOOLCHAIN=ON \
-    -DPYTHON_EXECUTABLE="$(cpython_path 3.6)/bin/python" \
-    -GNinja \
-    ..
-ninja install
-popd
-popd
-rm -rf llvm-${LLVM_VERSION}.src.tar.xz llvm-${LLVM_VERSION}.src.tar llvm-${LLVM_VERSION}.src
-
-
-# clang is only used to precompile Gandiva bitcode
-if [ ${LLVM_VERSION_MAJOR} -lt 9 ]; then
-  clang_package_name=cfe
-else
-  clang_package_name=clang
-fi
-curl -sL https://github.com/llvm/llvm-project/releases/download/llvmorg-${LLVM_VERSION}/${clang_package_name}-${LLVM_VERSION}.src.tar.xz -o ${clang_package_name}-${LLVM_VERSION}.src.tar.xz
-unxz ${clang_package_name}-${LLVM_VERSION}.src.tar.xz
-tar xf ${clang_package_name}-${LLVM_VERSION}.src.tar
-pushd ${clang_package_name}-${LLVM_VERSION}.src
-mkdir build
-pushd build
-cmake \
-    -DCMAKE_BUILD_TYPE=Release \
-    -DCMAKE_INSTALL_PREFIX=/usr \
-    -DCLANG_INCLUDE_TESTS=OFF \
-    -DCLANG_INCLUDE_DOCS=OFF \
-    -DLLVM_INCLUDE_TESTS=OFF \
-    -DLLVM_INCLUDE_DOCS=OFF \
-    -DLLVM_TEMPORARILY_ALLOW_OLD_TOOLCHAIN=ON \
-    -GNinja \
-    ..
-ninja -w dupbuild=warn install # both clang and llvm builds generate llvm-config file
-popd
-popd
-rm -rf ${clang_package_name}-${LLVM_VERSION}.src.tar.xz ${clang_package_name}-${LLVM_VERSION}.src.tar ${clang_package_name}-${LLVM_VERSION}.src
diff --git a/python/manylinux1/scripts/build_lz4.sh b/python/manylinux1/scripts/build_lz4.sh
deleted file mode 100755
index 8b26155..0000000
--- a/python/manylinux1/scripts/build_lz4.sh
+++ /dev/null
@@ -1,35 +0,0 @@
-#!/bin/bash -ex
-# Licensed to the Apache Software Foundation (ASF) under one
-# or more contributor license agreements.  See the NOTICE file
-# distributed with this work for additional information
-# regarding copyright ownership.  The ASF licenses this file
-# to you under the Apache License, Version 2.0 (the
-# "License"); you may not use this file except in compliance
-# with the License.  You may obtain a copy of the License at
-#
-#   http://www.apache.org/licenses/LICENSE-2.0
-#
-# Unless required by applicable law or agreed to in writing,
-# software distributed under the License is distributed on an
-# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
-# KIND, either express or implied.  See the License for the
-# specific language governing permissions and limitations
-# under the License.
-
-NCORES=$(($(grep -c ^processor /proc/cpuinfo) + 1))
-export LZ4_VERSION="1.9.2"
-export PREFIX="/usr/local"
-export CFLAGS="${CFLAGS} -O3 -fPIC"
-export LDFLAGS="${LDFLAGS} -Wl,-rpath,${PREFIX}/lib -L${PREFIX}/lib"
-
-curl -sL "https://github.com/lz4/lz4/archive/v${LZ4_VERSION}.tar.gz" -o lz4-${LZ4_VERSION}.tar.gz
-tar xf lz4-${LZ4_VERSION}.tar.gz
-pushd lz4-${LZ4_VERSION}
-
-make -j$NCORES PREFIX=${PREFIX} CFLAGS="${CFLAGS}"
-make install PREFIX=${PREFIX}
-popd
-
-rm -rf lz4-${LZ4_VERSION}.tar.gz lz4-${LZ4_VERSION}
-# We don't want to link against shared libs
-rm -rf ${PREFIX}/lib/liblz4.so*
diff --git a/python/manylinux1/scripts/build_openssl.sh b/python/manylinux1/scripts/build_openssl.sh
deleted file mode 100755
index f2ef594..0000000
--- a/python/manylinux1/scripts/build_openssl.sh
+++ /dev/null
@@ -1,34 +0,0 @@
-#!/bin/bash -ex
-# Licensed to the Apache Software Foundation (ASF) under one
-# or more contributor license agreements.  See the NOTICE file
-# distributed with this work for additional information
-# regarding copyright ownership.  The ASF licenses this file
-# to you under the Apache License, Version 2.0 (the
-# "License"); you may not use this file except in compliance
-# with the License.  You may obtain a copy of the License at
-#
-#   http://www.apache.org/licenses/LICENSE-2.0
-#
-# Unless required by applicable law or agreed to in writing,
-# software distributed under the License is distributed on an
-# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
-# KIND, either express or implied.  See the License for the
-# specific language governing permissions and limitations
-# under the License.
-
-# XXX OpenSSL 1.1.1 needs Perl 5.10 to compile, so stick to 1.0.x
-OPENSSL_VERSION="1.0.2u"
-NCORES=$(($(grep -c ^processor /proc/cpuinfo) + 1))
-
-curl -sL https://www.openssl.org/source/openssl-${OPENSSL_VERSION}.tar.gz -o openssl-${OPENSSL_VERSION}.tar.gz
-tar xf openssl-${OPENSSL_VERSION}.tar.gz
-
-pushd openssl-${OPENSSL_VERSION}
-
-./config -fpic no-shared --prefix=/usr/local
-make -j${NCORES}
-make install
-
-popd
-
-rm -rf openssl-${OPENSSL_VERSION}.tar.gz openssl-${OPENSSL_VERSION}
diff --git a/python/manylinux1/scripts/build_protobuf.sh b/python/manylinux1/scripts/build_protobuf.sh
deleted file mode 100755
index f6153d0..0000000
--- a/python/manylinux1/scripts/build_protobuf.sh
+++ /dev/null
@@ -1,29 +0,0 @@
-#!/bin/bash -ex
-# Licensed to the Apache Software Foundation (ASF) under one
-# or more contributor license agreements.  See the NOTICE file
-# distributed with this work for additional information
-# regarding copyright ownership.  The ASF licenses this file
-# to you under the Apache License, Version 2.0 (the
-# "License"); you may not use this file except in compliance
-# with the License.  You may obtain a copy of the License at
-#
-#   http://www.apache.org/licenses/LICENSE-2.0
-#
-# Unless required by applicable law or agreed to in writing,
-# software distributed under the License is distributed on an
-# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
-# KIND, either express or implied.  See the License for the
-# specific language governing permissions and limitations
-# under the License.
-
-NCORES=$(($(grep -c ^processor /proc/cpuinfo) + 1))
-PROTOBUF_VERSION="3.12.1"
-
-curl -sL "https://github.com/google/protobuf/releases/download/v${PROTOBUF_VERSION}/protobuf-all-${PROTOBUF_VERSION}.tar.gz" -o protobuf-${PROTOBUF_VERSION}.tar.gz
-tar xf protobuf-${PROTOBUF_VERSION}.tar.gz
-pushd protobuf-${PROTOBUF_VERSION}
-./configure --disable-shared --prefix=/usr/local "CXXFLAGS=-O2 -fPIC"
-make -j$NCORES
-make install
-popd
-rm -rf protobuf-${PROTOBUF_VERSION}.tar.gz protobuf-${PROTOBUF_VERSION}
diff --git a/python/manylinux1/scripts/build_python.sh b/python/manylinux1/scripts/build_python.sh
deleted file mode 100755
index a8063b9..0000000
--- a/python/manylinux1/scripts/build_python.sh
+++ /dev/null
@@ -1,218 +0,0 @@
-#!/bin/bash -e
-# Copyright (c) 2016 manylinux
-#
-# Permission is hereby granted, free of charge, to any person obtaining a copy
-# of this software and associated documentation files (the "Software"), to deal
-# in the Software without restriction, including without limitation the rights
-# to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
-# copies of the Software, and to permit persons to whom the Software is
-# furnished to do so, subject to the following conditions:
-#
-# The above copyright notice and this permission notice shall be included in all
-# copies or substantial portions of the Software.
-#
-# THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
-# IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
-# FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
-# AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
-# LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
-# OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE
-# SOFTWARE.
-
-# The following is taken from docker/build_scripts/build_env.sh,
-# docker/build_scripts/build_utils.sh and
-# docker/build_scripts/build.sh from the manylinux1 project
-# (https://github.com/pypa/manylinux/).
-
-PYTHON_DOWNLOAD_URL=https://www.python.org/ftp/python
-CPYTHON_VERSIONS="3.6.8 3.7.2"
-
-# openssl version to build, with expected sha256 hash of .tar.gz
-# archive.
-OPENSSL_ROOT=openssl-1.0.2q
-OPENSSL_HASH=5744cfcbcec2b1b48629f7354203bc1e5e9b5466998bbccc5b5fcde3b18eb684
-OPENSSL_DOWNLOAD_URL=https://www.openssl.org/source
-
-# Update to slightly newer, verified Git commit:
-# https://github.com/NixOS/patchelf/commit/2a9cefd7d637d160d12dc7946393778fa8abbc58
-PATCHELF_VERSION=2a9cefd7d637d160d12dc7946393778fa8abbc58
-PATCHELF_HASH=12da4727f09be42ae0b54878e1b8e86d85cb7a5b595731cdc1a0a170c4873c6d
-
-CURL_ROOT=curl-7.61.1
-CURL_HASH=eaa812e9a871ea10dbe8e1d3f8f12a64a8e3e62aeab18cb23742e2f1727458ae
-CURL_DOWNLOAD_URL=https://curl.haxx.se/download
-
-AUTOCONF_ROOT=autoconf-2.69
-AUTOCONF_HASH=954bd69b391edc12d6a4a51a2dd1476543da5c6bbf05a95b59dc0dd6fd4c2969
-AUTOCONF_DOWNLOAD_URL=https://ftp.gnu.org/gnu/autoconf
-AUTOMAKE_ROOT=automake-1.16.1
-AUTOMAKE_HASH=608a97523f97db32f1f5d5615c98ca69326ced2054c9f82e65bade7fc4c9dea8
-AUTOMAKE_DOWNLOAD_URL=https://ftp.gnu.org/gnu/automake
-LIBTOOL_ROOT=libtool-2.4.6
-LIBTOOL_HASH=e3bd4d5d3d025a36c21dd6af7ea818a2afcd4dfc1ea5a17b39d7854bcd0c06e3
-LIBTOOL_DOWNLOAD_URL=https://ftp.gnu.org/gnu/libtool
-
-SQLITE_AUTOCONF_VERSION=sqlite-autoconf-3230100
-SQLITE_AUTOCONF_HASH=92842b283e5e744eff5da29ed3c69391de7368fccc4d0ee6bf62490ce555ef25
-SQLITE_AUTOCONF_DOWNLOAD_URL=https://www.sqlite.org/2018
-
-GIT_ROOT=2.19.1
-GIT_HASH=ba2fed9d02e424b735e035c4f2b0bdb168ef0df7e35156b5051d900dc7247787
-GIT_DOWNLOAD_URL=https://github.com/git/git/archive
-
-GET_PIP_URL=https://bootstrap.pypa.io/get-pip.py
-EPEL_RPM_HASH=0dcc89f9bf67a2a515bad64569b7a9615edc5e018f676a578d5fd0f17d3c81d4
-DEVTOOLS_HASH=a8ebeb4bed624700f727179e6ef771dafe47651131a00a78b342251415646acc
-
-function check_var {
-    if [ -z "$1" ]; then
-        echo "required variable not defined"
-        exit 1
-    fi
-}
-
-function lex_pyver {
-    # Echoes Python version string padded with zeros
-    # Thus:
-    # 3.2.1 -> 003002001
-    # 3     -> 003000000
-    echo $1 | awk -F "." '{printf "%03d%03d%03d", $1, $2, $3}'
-}
-
-function pyver_dist_dir {
-    # Echoes the dist directory name of given pyver, removing alpha/beta prerelease
-    # Thus:
-    # 3.2.1   -> 3.2.1
-    # 3.7.0b4 -> 3.7.0
-    echo $1 | awk -F "." '{printf "%d.%d.%d", $1, $2, $3}'
-}
-
-function do_cpython_build {
-    local py_ver=$1
-    check_var $py_ver
-    local ucs_setting=$2
-    check_var $ucs_setting
-    tar -xzf Python-$py_ver.tgz
-    pushd Python-$py_ver
-    if [ "$ucs_setting" = "none" ]; then
-        unicode_flags=""
-        dir_suffix=""
-    else
-        local unicode_flags="--enable-unicode=$ucs_setting"
-        local dir_suffix="-$ucs_setting"
-    fi
-    local prefix="/opt/_internal/cpython-${py_ver}${dir_suffix}"
-    mkdir -p ${prefix}/lib
-    ./configure --prefix=${prefix} --disable-shared $unicode_flags > /dev/null
-    make -j2 > /dev/null
-    make install > /dev/null
-    popd
-    rm -rf Python-$py_ver
-    # Some Pythons install as bin/python3. Make them available as
-    # bin/python.
-    if [ -e ${prefix}/bin/python3 ]; then
-        ln -s python3 ${prefix}/bin/python
-    fi
-    # --force-reinstall is to work around:
-    #   https://github.com/pypa/pip/issues/5220
-    #   https://github.com/pypa/get-pip/issues/19
-    ${prefix}/bin/python get-pip.py --force-reinstall
-    if [ -e ${prefix}/bin/pip3 ] && [ ! -e ${prefix}/bin/pip ]; then
-        ln -s pip3 ${prefix}/bin/pip
-    fi
-    # Since we fall back on a canned copy of get-pip.py, we might not have
-    # the latest pip and friends. Upgrade them to make sure.
-    ${prefix}/bin/pip install -U --require-hashes -r ${MY_DIR}/requirements.txt
-    local abi_tag=$(${prefix}/bin/python ${MY_DIR}/python-tag-abi-tag.py)
-    ln -s ${prefix} /opt/python/${abi_tag}
-}
-
-
-function build_cpython {
-    local py_ver=$1
-    check_var $py_ver
-    check_var $PYTHON_DOWNLOAD_URL
-    local py_dist_dir=$(pyver_dist_dir $py_ver)
-    curl -fsSLO $PYTHON_DOWNLOAD_URL/$py_dist_dir/Python-$py_ver.tgz
-    curl -fsSLO $PYTHON_DOWNLOAD_URL/$py_dist_dir/Python-$py_ver.tgz.asc
-    if [ $(lex_pyver $py_ver) -lt $(lex_pyver 3.3) ]; then
-        do_cpython_build $py_ver ucs2
-        do_cpython_build $py_ver ucs4
-    else
-        do_cpython_build $py_ver none
-    fi
-    rm -f Python-$py_ver.tgz
-    rm -f Python-$py_ver.tgz.asc
-}
-
-
-function build_cpythons {
-    check_var $GET_PIP_URL
-    curl -fsSLO $GET_PIP_URL
-    for py_ver in $@; do
-        build_cpython $py_ver
-    done
-    rm -f get-pip.py
-}
-
-function do_openssl_build {
-    ./config no-ssl2 no-shared -fPIC --prefix=/usr/local/ssl > /dev/null
-    make > /dev/null
-    make install_sw > /dev/null
-}
-
-
-function check_required_source {
-    local file=$1
-    check_var ${file}
-    if [ ! -f $file ]; then
-        echo "Required source archive must be prefetched to docker/sources/ with prefetch.sh: $file"
-        return 1
-    fi
-}
-
-
-function fetch_source {
-    # This is called both inside and outside the build context (e.g. in Travis) to prefetch
-    # source tarballs, where curl exists (and works)
-    local file=$1
-    check_var ${file}
-    local url=$2
-    check_var ${url}
-    if [ -f ${file} ]; then
-        echo "${file} exists, skipping fetch"
-    else
-        curl -fsSL -o ${file} ${url}/${file}
-    fi
-}
-
-
-function check_sha256sum {
-    local fname=$1
-    check_var ${fname}
-    local sha256=$2
-    check_var ${sha256}
-
-    echo "${sha256}  ${fname}" > ${fname}.sha256
-    sha256sum -c ${fname}.sha256
-    rm -f ${fname}.sha256
-}
-
-
-function build_openssl {
-    local openssl_fname=$1
-    check_var ${openssl_fname}
-    local openssl_sha256=$2
-    check_var ${openssl_sha256}
-    # Can't use curl here because we don't have it yet, OpenSSL must be prefetched
-    fetch_source ${openssl_fname}.tar.gz ${OPENSSL_DOWNLOAD_URL}
-    check_sha256sum ${openssl_fname}.tar.gz ${openssl_sha256}
-    tar -xzf ${openssl_fname}.tar.gz
-    (cd ${openssl_fname} && do_openssl_build)
-    rm -rf ${openssl_fname} ${openssl_fname}.tar.gz
-}
-
-build_openssl $OPENSSL_ROOT $OPENSSL_HASH
-
-mkdir -p /opt/python
-build_cpythons $CPYTHON_VERSIONS
diff --git a/python/manylinux1/scripts/build_rapidjson.sh b/python/manylinux1/scripts/build_rapidjson.sh
deleted file mode 100755
index 5e012ae..0000000
--- a/python/manylinux1/scripts/build_rapidjson.sh
+++ /dev/null
@@ -1,37 +0,0 @@
-#!/bin/bash -ex
-# Licensed to the Apache Software Foundation (ASF) under one
-# or more contributor license agreements.  See the NOTICE file
-# distributed with this work for additional information
-# regarding copyright ownership.  The ASF licenses this file
-# to you under the Apache License, Version 2.0 (the
-# "License"); you may not use this file except in compliance
-# with the License.  You may obtain a copy of the License at
-#
-#   http://www.apache.org/licenses/LICENSE-2.0
-#
-# Unless required by applicable law or agreed to in writing,
-# software distributed under the License is distributed on an
-# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
-# KIND, either express or implied.  See the License for the
-# specific language governing permissions and limitations
-# under the License.
-
-export RAPIDJSON_VERSION="1a803826f1197b5e30703afe4b9c0e7dd48074f5"
-
-curl -sL "https://github.com/Tencent/rapidjson/archive/${RAPIDJSON_VERSION}.tar.gz" -o rapidjson-${RAPIDJSON_VERSION}.tar.gz
-tar xf rapidjson-${RAPIDJSON_VERSION}.tar.gz
-pushd rapidjson-${RAPIDJSON_VERSION}
-mkdir build
-pushd build
-cmake -GNinja \
-      -DCMAKE_INSTALL_PREFIX=/usr/local \
-      -DRAPIDJSON_HAS_STDSTRING=ON \
-      -DRAPIDJSON_BUILD_TESTS=OFF \
-      -DRAPIDJSON_BUILD_EXAMPLES=OFF \
-      -DRAPIDJSON_BUILD_DOC=OFF \
-      -DCMAKE_BUILD_TYPE=release \
-      ..
-ninja install
-popd
-popd
-rm -rf rapidjson-${RAPIDJSON_VERSION}.tar.gz rapidjson-${RAPIDJSON_VERSION}
diff --git a/python/manylinux1/scripts/build_re2.sh b/python/manylinux1/scripts/build_re2.sh
deleted file mode 100755
index 4886b20..0000000
--- a/python/manylinux1/scripts/build_re2.sh
+++ /dev/null
@@ -1,35 +0,0 @@
-#!/bin/bash -ex
-# Licensed to the Apache Software Foundation (ASF) under one
-# or more contributor license agreements.  See the NOTICE file
-# distributed with this work for additional information
-# regarding copyright ownership.  The ASF licenses this file
-# to you under the Apache License, Version 2.0 (the
-# "License"); you may not use this file except in compliance
-# with the License.  You may obtain a copy of the License at
-#
-#   http://www.apache.org/licenses/LICENSE-2.0
-#
-# Unless required by applicable law or agreed to in writing,
-# software distributed under the License is distributed on an
-# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
-# KIND, either express or implied.  See the License for the
-# specific language governing permissions and limitations
-# under the License.
-
-export RE2_VERSION="2019-08-01"
-NCORES=$(($(grep -c ^processor /proc/cpuinfo) + 1))
-
-curl -sL "https://github.com/google/re2/archive/${RE2_VERSION}.tar.gz" -o re2-${RE2_VERSION}.tar.gz
-tar xf re2-${RE2_VERSION}.tar.gz
-pushd re2-${RE2_VERSION}
-
-export CXXFLAGS="-fPIC -O2 ${CXXFLAGS}"
-export CFLAGS="-fPIC -O2 ${CFLAGS}"
-
-# Build shared libraries
-make prefix=/usr/local -j${NCORES} install
-
-popd
-
-# Need to remove shared library to make sure the static library is picked up by Arrow
-rm -rf re2-${RE2_VERSION}.tar.gz re2-${RE2_VERSION} /usr/local/lib/libre2.so*
diff --git a/python/manylinux1/scripts/build_snappy.sh b/python/manylinux1/scripts/build_snappy.sh
deleted file mode 100755
index b8eea9e..0000000
--- a/python/manylinux1/scripts/build_snappy.sh
+++ /dev/null
@@ -1,31 +0,0 @@
-#!/bin/bash -ex
-# Licensed to the Apache Software Foundation (ASF) under one
-# or more contributor license agreements.  See the NOTICE file
-# distributed with this work for additional information
-# regarding copyright ownership.  The ASF licenses this file
-# to you under the Apache License, Version 2.0 (the
-# "License"); you may not use this file except in compliance
-# with the License.  You may obtain a copy of the License at
-#
-#   http://www.apache.org/licenses/LICENSE-2.0
-#
-# Unless required by applicable law or agreed to in writing,
-# software distributed under the License is distributed on an
-# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
-# KIND, either express or implied.  See the License for the
-# specific language governing permissions and limitations
-# under the License.
-
-export SNAPPY_VERSION="1.1.8"
-curl -sL "https://github.com/google/snappy/archive/${SNAPPY_VERSION}.tar.gz" -o snappy-${SNAPPY_VERSION}.tar.gz
-tar xf snappy-${SNAPPY_VERSION}.tar.gz
-pushd snappy-${SNAPPY_VERSION}
-CXXFLAGS='-DNDEBUG -O2' cmake -GNinja \
-    -DCMAKE_INSTALL_PREFIX=/usr/local \
-    -DCMAKE_POSITION_INDEPENDENT_CODE=1 \
-    -DBUILD_SHARED_LIBS=OFF \
-    -DSNAPPY_BUILD_TESTS=OFF \
-    .
-ninja install
-popd
-rm -rf snappy-${SNAPPY_VERSION}.tar.gz snappy-${SNAPPY_VERSION}
diff --git a/python/manylinux1/scripts/build_thrift.sh b/python/manylinux1/scripts/build_thrift.sh
deleted file mode 100755
index e53d70d..0000000
--- a/python/manylinux1/scripts/build_thrift.sh
+++ /dev/null
@@ -1,54 +0,0 @@
-#!/bin/bash -ex
-# Licensed to the Apache Software Foundation (ASF) under one
-# or more contributor license agreements.  See the NOTICE file
-# distributed with this work for additional information
-# regarding copyright ownership.  The ASF licenses this file
-# to you under the Apache License, Version 2.0 (the
-# "License"); you may not use this file except in compliance
-# with the License.  You may obtain a copy of the License at
-#
-#   http://www.apache.org/licenses/LICENSE-2.0
-#
-# Unless required by applicable law or agreed to in writing,
-# software distributed under the License is distributed on an
-# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
-# KIND, either express or implied.  See the License for the
-# specific language governing permissions and limitations
-# under the License.
-
-export THRIFT_VERSION=0.13.0
-curl -sL https://archive.apache.org/dist/thrift/${THRIFT_VERSION}/thrift-${THRIFT_VERSION}.tar.gz -o thrift-${THRIFT_VERSION}.tar.gz
-tar xf thrift-${THRIFT_VERSION}.tar.gz
-
-pushd thrift-${THRIFT_VERSION}
-mkdir build-tmp
-pushd build-tmp
-
-cmake -DCMAKE_BUILD_TYPE=release \
-    -DCMAKE_CXX_FLAGS=-fPIC \
-    -DCMAKE_C_FLAGS=-fPIC \
-    -DCMAKE_INSTALL_PREFIX=/usr/local \
-    -DCMAKE_INSTALL_RPATH=/usr/local/lib \
-    -DBUILD_EXAMPLES=OFF \
-    -DBUILD_TESTING=OFF \
-    -DWITH_QT4=OFF \
-    -DWITH_AS3=OFF \
-    -DWITH_C_GLIB=OFF \
-    -DWITH_CPP=ON \
-    -DWITH_HASKELL=OFF \
-    -DWITH_JAVA=OFF \
-    -DWITH_JAVASCRIPT=OFF \
-    -DWITH_NODEJS=OFF \
-    -DWITH_PYTHON=OFF \
-    -DWITH_STATIC_LIB=ON \
-    -DWITH_SHARED_LIB=OFF \
-    -DWITH_OPENSSL=OFF \
-    -DBoost_NAMESPACE=arrow_boost \
-    -DBOOST_ROOT=/arrow_boost_dist \
-    -GNinja ..
-ninja install
-
-popd
-popd
-
-rm -rf thrift-${THRIFT_VERSION}.tar.gz thrift-${THRIFT_VERSION}
diff --git a/python/manylinux1/scripts/build_utf8proc.sh b/python/manylinux1/scripts/build_utf8proc.sh
deleted file mode 100755
index d74c36a..0000000
--- a/python/manylinux1/scripts/build_utf8proc.sh
+++ /dev/null
@@ -1,38 +0,0 @@
-#!/bin/bash -ex
-# Licensed to the Apache Software Foundation (ASF) under one
-# or more contributor license agreements.  See the NOTICE file
-# distributed with this work for additional information
-# regarding copyright ownership.  The ASF licenses this file
-# to you under the Apache License, Version 2.0 (the
-# "License"); you may not use this file except in compliance
-# with the License.  You may obtain a copy of the License at
-#
-#   http://www.apache.org/licenses/LICENSE-2.0
-#
-# Unless required by applicable law or agreed to in writing,
-# software distributed under the License is distributed on an
-# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
-# KIND, either express or implied.  See the License for the
-# specific language governing permissions and limitations
-# under the License.
-
-NCORES=$(($(grep -c ^processor /proc/cpuinfo) + 1))
-export UTF8PROC_VERSION="2.5.0"
-export PREFIX="/usr/local"
-
-curl -sL "https://github.com/JuliaStrings/utf8proc/archive/v${UTF8PROC_VERSION}.tar.gz" -o utf8proc-$UTF8PROC_VERSION}.tar.gz
-tar xf utf8proc-$UTF8PROC_VERSION}.tar.gz
-
-pushd utf8proc-${UTF8PROC_VERSION}
-mkdir build
-pushd build
-cmake .. -GNinja \
-  -DBUILD_SHARED_LIBS=OFF \
-  -DCMAKE_BUILD_TYPE=Release \
-  -DCMAKE_INSTALL_PREFIX=${PREFIX}
-
-ninja install
-popd
-popd
-
-rm -rf utf8proc-${UTF8PROC_VERSION}.tar.gz utf8proc-${UTF8PROC_VERSION}
diff --git a/python/manylinux1/scripts/build_zstd.sh b/python/manylinux1/scripts/build_zstd.sh
deleted file mode 100755
index 9e38c70..0000000
--- a/python/manylinux1/scripts/build_zstd.sh
+++ /dev/null
@@ -1,39 +0,0 @@
-#!/bin/bash -ex
-# Licensed to the Apache Software Foundation (ASF) under one
-# or more contributor license agreements.  See the NOTICE file
-# distributed with this work for additional information
-# regarding copyright ownership.  The ASF licenses this file
-# to you under the Apache License, Version 2.0 (the
-# "License"); you may not use this file except in compliance
-# with the License.  You may obtain a copy of the License at
-#
-#   http://www.apache.org/licenses/LICENSE-2.0
-#
-# Unless required by applicable law or agreed to in writing,
-# software distributed under the License is distributed on an
-# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
-# KIND, either express or implied.  See the License for the
-# specific language governing permissions and limitations
-# under the License.
-
-export ZSTD_VERSION="1.4.5"
-
-curl -sL "https://github.com/facebook/zstd/archive/v${ZSTD_VERSION}.tar.gz" -o zstd-${ZSTD_VERSION}.tar.gz
-tar xf zstd-${ZSTD_VERSION}.tar.gz
-pushd zstd-${ZSTD_VERSION}
-mkdir build_cmake
-pushd build_cmake
-
-cmake -GNinja -DCMAKE_BUILD_TYPE=Release \
-    -DCMAKE_INSTALL_PREFIX=/usr/local \
-    -DZSTD_BUILD_PROGRAMS=off \
-    -DZSTD_BUILD_SHARED=off \
-    -DZSTD_BUILD_STATIC=on \
-    -DZSTD_MULTITHREAD_SUPPORT=off \
-    -DCMAKE_POSITION_INDEPENDENT_CODE=1 \
-    ../build/cmake
-ninja install
-
-popd
-popd
-rm -rf zstd-${ZSTD_VERSION}.tar.gz zstd-${ZSTD_VERSION}
diff --git a/python/manylinux1/scripts/check_arrow_visibility.sh b/python/manylinux1/scripts/check_arrow_visibility.sh
deleted file mode 100755
index ed464e0..0000000
--- a/python/manylinux1/scripts/check_arrow_visibility.sh
+++ /dev/null
@@ -1,35 +0,0 @@
-#!/bin/bash -ex
-# Licensed to the Apache Software Foundation (ASF) under one
-# or more contributor license agreements.  See the NOTICE file
-# distributed with this work for additional information
-# regarding copyright ownership.  The ASF licenses this file
-# to you under the Apache License, Version 2.0 (the
-# "License"); you may not use this file except in compliance
-# with the License.  You may obtain a copy of the License at
-#
-#   http://www.apache.org/licenses/LICENSE-2.0
-#
-# Unless required by applicable law or agreed to in writing,
-# software distributed under the License is distributed on an
-# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
-# KIND, either express or implied.  See the License for the
-# specific language governing permissions and limitations
-# under the License.
-
-nm --demangle --dynamic /arrow-dist/lib/libarrow.so > nm_arrow.log
-
-# Filter out Arrow symbols and see if anything remains.
-# '_init' and '_fini' symbols may or may not be present; we don't care.
-# (note we must ignore the grep exit status when no match is found)
-grep ' T ' nm_arrow.log | grep -v -E '(arrow|\b_init\b|\b_fini\b)' | cat - > visible_symbols.log
-
-if [[ -f visible_symbols.log && `cat visible_symbols.log | wc -l` -eq 0 ]]
-then
-    exit 0
-fi
-
-echo "== Unexpected symbols exported by libarrow.so =="
-cat visible_symbols.log
-echo "================================================"
-
-exit 1
diff --git a/python/manylinux1/scripts/python-tag-abi-tag.py b/python/manylinux1/scripts/python-tag-abi-tag.py
deleted file mode 100644
index 212ab54..0000000
--- a/python/manylinux1/scripts/python-tag-abi-tag.py
+++ /dev/null
@@ -1,30 +0,0 @@
-# Copyright (c) 2016 manylinux
-#
-# Permission is hereby granted, free of charge, to any person obtaining a copy
-# of this software and associated documentation files (the "Software"), to deal
-# in the Software without restriction, including without limitation the rights
-# to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
-# copies of the Software, and to permit persons to whom the Software is
-# furnished to do so, subject to the following conditions:
-#
-# The above copyright notice and this permission notice shall be included in
-# all copies or substantial portions of the Software.
-#
-# THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
-# IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
-# FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
-# AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
-# LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
-# OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE
-# SOFTWARE.
-
-# The following is taken from docker/build_scripts/python-tag-abi-tag.py
-# from the manylinux1 project (https://github.com/pypa/manylinux/).
-
-# Utility script to print the python tag + the abi tag for a Python
-# See PEP 425 for exactly what these are, but an example would be:
-#   cp27-cp27mu
-
-from wheel.pep425tags import get_abbr_impl, get_impl_ver, get_abi_tag
-
-print("{0}{1}-{2}".format(get_abbr_impl(), get_impl_ver(), get_abi_tag()))
diff --git a/python/manylinux1/scripts/requirements.txt b/python/manylinux1/scripts/requirements.txt
deleted file mode 100644
index 38a32df..0000000
--- a/python/manylinux1/scripts/requirements.txt
+++ /dev/null
@@ -1,34 +0,0 @@
-# Copyright (c) 2016 manylinux
-#
-# Permission is hereby granted, free of charge, to any person obtaining a copy
-# of this software and associated documentation files (the "Software"), to deal
-# in the Software without restriction, including without limitation the rights
-# to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
-# copies of the Software, and to permit persons to whom the Software is
-# furnished to do so, subject to the following conditions:
-#
-# The above copyright notice and this permission notice shall be included in all
-# copies or substantial portions of the Software.
-#
-# THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
-# IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
-# FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
-# AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
-# LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
-# OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE
-# SOFTWARE.
-
-# The following is taken from docker/build_scripts/requirements.txt
-# from the manylinux1 project (https://github.com/pypa/manylinux/).
-
-# pip requirements for all cpythons
-# NOTE: pip has GPG signatures; could download and verify independently.
-pip==19.0.3 \
-    --hash=sha256:6e6f197a1abfb45118dbb878b5c859a0edbdd33fd250100bc015b67fded4b9f2 \
-    --hash=sha256:bd812612bbd8ba84159d9ddc0266b7fbce712fc9bc98c82dee5750546ec8ec64
-wheel==0.31.1 \
-    --hash=sha256:80044e51ec5bbf6c894ba0bc48d26a8c20a9ba629f4ca19ea26ecfcf87685f5f \
-    --hash=sha256:0a2e54558a0628f2145d2fc822137e322412115173e8a2ddbe1c9024338ae83c
-setuptools==40.7.3 \
-    --hash=sha256:4f4acaf06d617dccfd3fbbc9fbd83cf4749759a1fa2bdf589206a3278e0d537a \
-    --hash=sha256:702fdd31cb10a65a94beba1a7d89219a58d2587a349e0a1b7827b133e99ca430
diff --git a/python/manylinux201x/Dockerfile-x86_64_base_2010 b/python/manylinux201x/Dockerfile-x86_64_base_2010
deleted file mode 100644
index d86ef6d..0000000
--- a/python/manylinux201x/Dockerfile-x86_64_base_2010
+++ /dev/null
@@ -1,101 +0,0 @@
-# Licensed to the Apache Software Foundation (ASF) under one
-# or more contributor license agreements.  See the NOTICE file
-# distributed with this work for additional information
-# regarding copyright ownership.  The ASF licenses this file
-# to you under the Apache License, Version 2.0 (the
-# "License"); you may not use this file except in compliance
-# with the License.  You may obtain a copy of the License at
-#
-#   http://www.apache.org/licenses/LICENSE-2.0
-#
-# Unless required by applicable law or agreed to in writing,
-# software distributed under the License is distributed on an
-# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
-# KIND, either express or implied.  See the License for the
-# specific language governing permissions and limitations
-# under the License.
-
-# See https://quay.io/repository/pypa/manylinux2010_x86_64?tab=tags
-# to update base image.
-FROM quay.io/pypa/manylinux2010_x86_64:2020-12-10-f10f3bb
-
-# Install build dependencies
-RUN yum install -y xz bison ccache flex wget
-RUN yum clean all
-
-# Install up-to-date CMake and Ninja versions
-# A fresh CMake is required to recognize recent Boost versions...
-ADD scripts/install_cmake.sh /
-RUN /install_cmake.sh
-
-ADD scripts/build_zlib.sh /
-RUN /build_zlib.sh
-
-WORKDIR /
-RUN git clone https://github.com/matthew-brett/multibuild.git && cd multibuild && git checkout 8882150df6529658700b66bec124dfb77eefca26
-# Remove unneeded Python versions
-RUN rm -rf /opt/_internal/cpython-2.7.*
-
-ADD scripts/build_openssl.sh /
-RUN /build_openssl.sh
-
-ADD scripts/build_boost.sh /
-RUN /build_boost.sh
-
-ADD scripts/build_thrift.sh /
-RUN /build_thrift.sh
-ENV THRIFT_HOME /usr/local
-
-ADD scripts/build_gflags.sh /
-RUN /build_gflags.sh
-
-ADD scripts/build_protobuf.sh /
-RUN /build_protobuf.sh
-ENV PROTOBUF_HOME /usr/local
-
-ADD scripts/build_cares.sh /
-RUN /build_cares.sh
-
-ADD scripts/build_absl.sh /
-RUN /build_absl.sh
-
-ADD scripts/build_grpc.sh /
-RUN /build_grpc.sh
-
-ADD scripts/build_curl.sh /
-RUN /build_curl.sh
-
-ADD scripts/build_aws_sdk.sh /
-RUN /build_aws_sdk.sh
-
-ADD scripts/build_brotli.sh /
-RUN /build_brotli.sh
-ENV BROTLI_HOME /usr/local
-
-ADD scripts/build_snappy.sh /
-RUN /build_snappy.sh
-ENV SNAPPY_HOME /usr/local
-
-ADD scripts/build_lz4.sh /
-RUN /build_lz4.sh
-ENV LZ4_HOME /usr/local
-
-ADD scripts/build_zstd.sh /
-RUN /build_zstd.sh
-ENV ZSTD_HOME /usr/local
-
-ADD scripts/build_glog.sh /
-RUN /build_glog.sh
-ENV GLOG_HOME /usr/local
-
-ADD scripts/build_rapidjson.sh /
-RUN /build_rapidjson.sh
-
-ADD scripts/build_re2.sh /
-RUN /build_re2.sh
-
-ADD scripts/build_bz2.sh /
-RUN /build_bz2.sh
-
-ADD scripts/build_utf8proc.sh /
-RUN /build_utf8proc.sh
diff --git a/python/manylinux201x/Dockerfile-x86_64_base_2014 b/python/manylinux201x/Dockerfile-x86_64_base_2014
deleted file mode 100644
index f53b439..0000000
--- a/python/manylinux201x/Dockerfile-x86_64_base_2014
+++ /dev/null
@@ -1,101 +0,0 @@
-# Licensed to the Apache Software Foundation (ASF) under one
-# or more contributor license agreements.  See the NOTICE file
-# distributed with this work for additional information
-# regarding copyright ownership.  The ASF licenses this file
-# to you under the Apache License, Version 2.0 (the
-# "License"); you may not use this file except in compliance
-# with the License.  You may obtain a copy of the License at
-#
-#   http://www.apache.org/licenses/LICENSE-2.0
-#
-# Unless required by applicable law or agreed to in writing,
-# software distributed under the License is distributed on an
-# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
-# KIND, either express or implied.  See the License for the
-# specific language governing permissions and limitations
-# under the License.
-
-# See https://quay.io/repository/pypa/manylinux2014_x86_64?tab=tags
-# to update base image.
-FROM quay.io/pypa/manylinux2014_x86_64:2020-12-10-c7182d6
-
-# Install build dependencies
-RUN yum install -y xz bison ccache flex wget
-RUN yum clean all
-
-# Install up-to-date CMake and Ninja versions
-# A fresh CMake is required to recognize recent Boost versions...
-ADD scripts/install_cmake.sh /
-RUN /install_cmake.sh
-
-ADD scripts/build_zlib.sh /
-RUN /build_zlib.sh
-
-WORKDIR /
-RUN git clone https://github.com/matthew-brett/multibuild.git && cd multibuild && git checkout 8882150df6529658700b66bec124dfb77eefca26
-# Remove unneeded Python versions
-RUN rm -rf /opt/_internal/cpython-2.7.*
-
-ADD scripts/build_openssl.sh /
-RUN /build_openssl.sh
-
-ADD scripts/build_boost.sh /
-RUN /build_boost.sh
-
-ADD scripts/build_thrift.sh /
-RUN /build_thrift.sh
-ENV THRIFT_HOME /usr/local
-
-ADD scripts/build_gflags.sh /
-RUN /build_gflags.sh
-
-ADD scripts/build_protobuf.sh /
-RUN /build_protobuf.sh
-ENV PROTOBUF_HOME /usr/local
-
-ADD scripts/build_cares.sh /
-RUN /build_cares.sh
-
-ADD scripts/build_absl.sh /
-RUN /build_absl.sh
-
-ADD scripts/build_grpc.sh /
-RUN /build_grpc.sh
-
-ADD scripts/build_curl.sh /
-RUN /build_curl.sh
-
-ADD scripts/build_aws_sdk.sh /
-RUN /build_aws_sdk.sh
-
-ADD scripts/build_brotli.sh /
-RUN /build_brotli.sh
-ENV BROTLI_HOME /usr/local
-
-ADD scripts/build_snappy.sh /
-RUN /build_snappy.sh
-ENV SNAPPY_HOME /usr/local
-
-ADD scripts/build_lz4.sh /
-RUN /build_lz4.sh
-ENV LZ4_HOME /usr/local
-
-ADD scripts/build_zstd.sh /
-RUN /build_zstd.sh
-ENV ZSTD_HOME /usr/local
-
-ADD scripts/build_glog.sh /
-RUN /build_glog.sh
-ENV GLOG_HOME /usr/local
-
-ADD scripts/build_rapidjson.sh /
-RUN /build_rapidjson.sh
-
-ADD scripts/build_re2.sh /
-RUN /build_re2.sh
-
-ADD scripts/build_bz2.sh /
-RUN /build_bz2.sh
-
-ADD scripts/build_utf8proc.sh /
-RUN /build_utf8proc.sh
diff --git a/python/manylinux201x/README.md b/python/manylinux201x/README.md
deleted file mode 100644
index d630f69..0000000
--- a/python/manylinux201x/README.md
+++ /dev/null
@@ -1,77 +0,0 @@
-<!---
-  Licensed to the Apache Software Foundation (ASF) under one
-  or more contributor license agreements.  See the NOTICE file
-  distributed with this work for additional information
-  regarding copyright ownership.  The ASF licenses this file
-  to you under the Apache License, Version 2.0 (the
-  "License"); you may not use this file except in compliance
-  with the License.  You may obtain a copy of the License at
-
-    http://www.apache.org/licenses/LICENSE-2.0
-
-  Unless required by applicable law or agreed to in writing,
-  software distributed under the License is distributed on an
-  "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
-  KIND, either express or implied.  See the License for the
-  specific language governing permissions and limitations
-  under the License.
--->
-
-## Manylinux201x wheels for Apache Arrow
-
-This folder provides base Docker images and an infrastructure to build
-`manylinux2010` and `manylinux2014` compatible Python wheels that should be installable on all
-Linux distributions published in the last several years.
-
-The process is split up in two parts:
-
-1. There are base Docker images that contain the build dependencies for
-   Arrow.  Those images do not need to be rebuilt frequently, and are hosted
-   on the public Docker Hub service.
-
-2. Based on these images, there is a bash script (`build_arrow.sh`) that
-   builds the PyArrow wheels for all supported Python versions and places them
-   in the `dist` folder.
-
-### Building PyArrow
-
-You can build the PyArrow wheels by running the following command in this
-directory (this is for Python 3.7; you can similarly pass another value
-for `PYTHON_VERSION`):
-
-```bash
-# Build the python packages
-docker-compose run -e PYTHON_VERSION="3.7" centos-python-manylinux2010
-# Now the new packages are located in the dist/ folder
-ls -l dist/
-```
-
-You can do the same for `manylinux2014` by substituting `centos-python-manylinux2014`.
-
-### Re-building the build image
-
-In case you want to make changes to the base Docker image (for example because
-you want to update a dependency to a new version), you must re-build it.
-The Docker configuration is in `Dockerfile-x86_64_base_2010` and `Dockerfile-x86_64_base_2014`,
-and it calls into scripts stored under the `scripts` directory.
-
-```bash
-docker-compose build centos-python-manylinux2010
-```
-
-For each dependency, a bash script in the `scripts/` directory downloads the
-source code, builds it and installs the build results.  At the end of each
-dependency build the sources are removed again so that only the binary
-installation of a dependency is persisted in the Docker image.
-
-### Publishing a new build image
-
-If you have write access to the Docker Hub Ursa Labs account, you can directly
-publish a build image that you built locally.
-
-```bash
-$ docker push centos-python-manylinux2010
-The push refers to repository [arrowdev/arrow_manylinux2010_x86_64_base]
-a1ab88d27acc: Pushing [==============>                                    ]  492.5MB/1.645GB
-[... etc. ...]
-```
diff --git a/python/manylinux201x/build_arrow.sh b/python/manylinux201x/build_arrow.sh
deleted file mode 100755
index d87d256..0000000
--- a/python/manylinux201x/build_arrow.sh
+++ /dev/null
@@ -1,173 +0,0 @@
-#!/bin/bash
-#
-# Licensed to the Apache Software Foundation (ASF) under one
-# or more contributor license agreements.  See the NOTICE file
-# distributed with this work for additional information
-# regarding copyright ownership.  The ASF licenses this file
-# to you under the Apache License, Version 2.0 (the
-# "License"); you may not use this file except in compliance
-# with the License.  You may obtain a copy of the License at
-#
-#   http://www.apache.org/licenses/LICENSE-2.0
-#
-# Unless required by applicable law or agreed to in writing,
-# software distributed under the License is distributed on an
-# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
-# KIND, either express or implied.  See the License for the
-# specific language governing permissions and limitations
-# under the License.
-#
-# Build upon the scripts in https://github.com/matthew-brett/manylinux-builds
-# * Copyright (c) 2013-2019, Matt Terry and Matthew Brett (BSD 2-clause)
-#
-# Usage:
-#   either build:
-#     $ docker-compose build centos-python-manylinux2010
-#   or pull:
-#     $ docker-compose pull centos-python-manylinux2010
-#   and then run:
-#     $ docker-compose run -e PYTHON_VERSION=3.7 centos-python-manylinux2010
-# Can use either manylinux2010 or manylinux2014
-
-source /multibuild/manylinux_utils.sh
-
-# Quit on failure
-set -e
-
-# Print commands for debugging
-# set -x
-
-cd /arrow/python
-
-NCORES=$(($(grep -c ^processor /proc/cpuinfo) + 1))
-
-# PyArrow build configuration
-export PYARROW_BUILD_TYPE='release'
-export PYARROW_CMAKE_GENERATOR='Ninja'
-export PYARROW_PARALLEL=${NCORES}
-
-# ARROW-6860: Disabling ORC in wheels until Protobuf static linking issues
-# across projects are resolved
-export PYARROW_WITH_ORC=0
-export PYARROW_WITH_HDFS=1
-export PYARROW_WITH_PARQUET=1
-export PYARROW_WITH_PLASMA=1
-export PYARROW_WITH_S3=1
-export PYARROW_BUNDLE_ARROW_CPP=1
-# Boost is only a compile-time dependency for wheels => no need to bundle .so's
-export PYARROW_BUNDLE_BOOST=0
-export PKG_CONFIG_PATH=/usr/lib/pkgconfig:/arrow-dist/lib/pkgconfig
-
-# Ensure the target directory exists
-mkdir -p /io/dist
-
-# Must pass PYTHON_VERSION env variable
-# possible values are: 3.6 3.7 3.8 3.9
-
-UNICODE_WIDTH=32  # Dummy value, irrelevant for Python 3
-CPYTHON_PATH="$(cpython_path ${PYTHON_VERSION} ${UNICODE_WIDTH})"
-PYTHON_INTERPRETER="${CPYTHON_PATH}/bin/python"
-PIP="${CPYTHON_PATH}/bin/pip"
-PATH="${PATH}:${CPYTHON_PATH}"
-
-# Will be "manylinux2010" or "manylinux2014"
-manylinux_kind=$(${PYTHON_INTERPRETER} -c "import os; print(os.environ['AUDITWHEEL_PLAT'].split('_')[0], end='')")
-
-echo "=== (${PYTHON_VERSION}) Install the wheel build dependencies ==="
-$PIP install -r requirements-wheel-build.txt
-
-export PYARROW_INSTALL_TESTS=1
-export PYARROW_WITH_DATASET=1
-export PYARROW_WITH_FLIGHT=1
-export PYARROW_WITH_GANDIVA=0
-export BUILD_ARROW_DATASET=ON
-export BUILD_ARROW_FLIGHT=ON
-export BUILD_ARROW_GANDIVA=OFF
-
-echo "=== (${PYTHON_VERSION}) Building Arrow C++ libraries ==="
-ARROW_BUILD_DIR=/tmp/build-PY${PYTHON_VERSION}
-mkdir -p "${ARROW_BUILD_DIR}"
-pushd "${ARROW_BUILD_DIR}"
-PATH="${CPYTHON_PATH}/bin:${PATH}" cmake \
-    -DARROW_BOOST_USE_SHARED=ON \
-    -DARROW_BROTLI_USE_SHARED=OFF \
-    -DARROW_BUILD_SHARED=ON \
-    -DARROW_BUILD_STATIC=OFF \
-    -DARROW_BUILD_TESTS=OFF \
-    -DARROW_DATASET=${BUILD_ARROW_DATASET} \
-    -DARROW_DEPENDENCY_SOURCE="SYSTEM" \
-    -DARROW_DEPENDENCY_USE_SHARED=OFF \
-    -DARROW_FLIGHT=${BUILD_ARROW_FLIGHT} \
-    -DARROW_GANDIVA_JAVA=OFF \
-    -DARROW_GANDIVA_PC_CXX_FLAGS="-isystem;/opt/rh/devtoolset-8/root/usr/include/c++/8/;-isystem;/opt/rh/devtoolset-8/root/usr/include/c++/8/x86_64-redhat-linux/" \
-    -DARROW_GANDIVA=${BUILD_ARROW_GANDIVA} \
-    -DARROW_HDFS=ON \
-    -DARROW_JEMALLOC=ON \
-    -DARROW_ORC=OFF \
-    -DARROW_PACKAGE_KIND=${manylinux_kind} \
-    -DARROW_PARQUET=ON \
-    -DARROW_PLASMA=ON \
-    -DARROW_PYTHON=ON \
-    -DARROW_RPATH_ORIGIN=ON \
-    -DARROW_S3=ON \
-    -DARROW_TENSORFLOW=ON \
-    -DARROW_UTF8PROC_USE_SHARED=OFF \
-    -DARROW_WITH_BROTLI=ON \
-    -DARROW_WITH_BZ2=ON \
-    -DARROW_WITH_LZ4=ON \
-    -DARROW_WITH_SNAPPY=ON \
-    -DARROW_WITH_ZLIB=ON \
-    -DARROW_WITH_ZSTD=ON \
-    -DBoost_NAMESPACE=arrow_boost \
-    -DBOOST_ROOT=/arrow_boost_dist \
-    -DCMAKE_BUILD_TYPE=Release \
-    -DCMAKE_INSTALL_LIBDIR=lib \
-    -DCMAKE_INSTALL_PREFIX=/arrow-dist \
-    -DCMAKE_UNITY_BUILD=ON \
-    -DOPENSSL_USE_STATIC_LIBS=ON \
-    -DORC_SOURCE=BUNDLED \
-    -DZLIB_ROOT=/usr/local \
-    -GNinja /arrow/cpp
-ninja install
-popd
-
-# Check that we don't expose any unwanted symbols
-/io/scripts/check_arrow_visibility.sh
-
-# Clear output directories and leftovers
-rm -rf dist/
-rm -rf build/
-rm -rf repaired_wheels/
-find -name "*.so" -delete
-
-echo "=== (${PYTHON_VERSION}) Building wheel ==="
-PATH="${CPYTHON_PATH}/bin:$PATH" $PYTHON_INTERPRETER setup.py build_ext --inplace
-PATH="${CPYTHON_PATH}/bin:$PATH" $PYTHON_INTERPRETER setup.py bdist_wheel
-# Source distribution is used for debian pyarrow packages.
-PATH="${CPYTHON_PATH}/bin:$PATH" $PYTHON_INTERPRETER setup.py sdist
-
-echo "=== (${PYTHON_VERSION}) Tag the wheel with manylinux201x ==="
-mkdir -p repaired_wheels/
-auditwheel repair --plat ${AUDITWHEEL_PLAT} -L . dist/pyarrow-*.whl -w repaired_wheels/
-
-# Install the built wheels
-$PIP install repaired_wheels/*.whl
-
-# Test that the modules are importable
-$PYTHON_INTERPRETER -c "
-import pyarrow
-import pyarrow.csv
-import pyarrow.dataset
-import pyarrow.flight
-import pyarrow.fs
-import pyarrow._hdfs
-import pyarrow.json
-import pyarrow.parquet
-import pyarrow.plasma
-import pyarrow._s3fs
-"
-
-# More thorough testing happens outside of the build to prevent
-# packaging issues like ARROW-4372
-mv dist/*.tar.gz /io/dist
-mv repaired_wheels/*.whl /io/dist
diff --git a/python/manylinux201x/scripts/build_absl.sh b/python/manylinux201x/scripts/build_absl.sh
deleted file mode 100755
index 9bb7484..0000000
--- a/python/manylinux201x/scripts/build_absl.sh
+++ /dev/null
@@ -1,35 +0,0 @@
-#!/bin/bash -ex
-# Licensed to the Apache Software Foundation (ASF) under one
-# or more contributor license agreements.  See the NOTICE file
-# distributed with this work for additional information
-# regarding copyright ownership.  The ASF licenses this file
-# to you under the Apache License, Version 2.0 (the
-# "License"); you may not use this file except in compliance
-# with the License.  You may obtain a copy of the License at
-#
-#   http://www.apache.org/licenses/LICENSE-2.0
-#
-# Unless required by applicable law or agreed to in writing,
-# software distributed under the License is distributed on an
-# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
-# KIND, either express or implied.  See the License for the
-# specific language governing permissions and limitations
-# under the License.
-
-export ABSL_VERSION="2eba343b51e0923cd3fb919a6abd6120590fc059"
-export CFLAGS="-fPIC"
-export CXXFLAGS="-fPIC"
-
-curl -sL "https://github.com/abseil/abseil-cpp/archive/${ABSL_VERSION}.tar.gz" -o ${ABSL_VERSION}.tar.gz
-tar xf ${ABSL_VERSION}.tar.gz
-pushd abseil-cpp-${ABSL_VERSION}
-
-cmake . -GNinja \
-    -DCMAKE_BUILD_TYPE=Release \
-    -DCMAKE_INSTALL_PREFIX=/usr \
-    -DABSL_RUN_TESTS=OFF \
-    -DCMAKE_CXX_STANDARD=11
-
-ninja install
-popd
-rm -rf abseil-cpp-${ABSL_VERSION} ${ABSL_VERSION}.tar.gz
diff --git a/python/manylinux201x/scripts/build_aws_sdk.sh b/python/manylinux201x/scripts/build_aws_sdk.sh
deleted file mode 100755
index 8271b74..0000000
--- a/python/manylinux201x/scripts/build_aws_sdk.sh
+++ /dev/null
@@ -1,44 +0,0 @@
-#!/bin/bash -ex
-# Licensed to the Apache Software Foundation (ASF) under one
-# or more contributor license agreements.  See the NOTICE file
-# distributed with this work for additional information
-# regarding copyright ownership.  The ASF licenses this file
-# to you under the Apache License, Version 2.0 (the
-# "License"); you may not use this file except in compliance
-# with the License.  You may obtain a copy of the License at
-#
-#   http://www.apache.org/licenses/LICENSE-2.0
-#
-# Unless required by applicable law or agreed to in writing,
-# software distributed under the License is distributed on an
-# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
-# KIND, either express or implied.  See the License for the
-# specific language governing permissions and limitations
-# under the License.
-
-export AWS_SDK_VERSION="1.7.356"
-export CFLAGS="-fPIC"
-export PREFIX="/usr/local"
-
-curl -sL "https://github.com/aws/aws-sdk-cpp/archive/${AWS_SDK_VERSION}.tar.gz" -o aws-sdk-cpp-${AWS_SDK_VERSION}.tar.gz
-tar xf aws-sdk-cpp-${AWS_SDK_VERSION}.tar.gz
-pushd aws-sdk-cpp-${AWS_SDK_VERSION}
-
-mkdir build
-pushd build
-
-cmake .. -GNinja \
-    -DCMAKE_BUILD_TYPE=Release \
-    -DCMAKE_INSTALL_PREFIX=${PREFIX} \
-    -DBUILD_ONLY='s3;core;transfer;config;identity-management;sts' \
-    -DBUILD_SHARED_LIBS=OFF \
-    -DENABLE_CURL_LOGGING=ON \
-    -DENABLE_UNITY_BUILD=ON \
-    -DENABLE_TESTING=OFF
-
-ninja install
-
-popd
-popd
-
-rm -r aws-sdk-cpp-${AWS_SDK_VERSION}.tar.gz aws-sdk-cpp-${AWS_SDK_VERSION}
diff --git a/python/manylinux201x/scripts/build_boost.sh b/python/manylinux201x/scripts/build_boost.sh
deleted file mode 100755
index d54a1ef..0000000
--- a/python/manylinux201x/scripts/build_boost.sh
+++ /dev/null
@@ -1,50 +0,0 @@
-#!/bin/bash -ex
-# Licensed to the Apache Software Foundation (ASF) under one
-# or more contributor license agreements.  See the NOTICE file
-# distributed with this work for additional information
-# regarding copyright ownership.  The ASF licenses this file
-# to you under the Apache License, Version 2.0 (the
-# "License"); you may not use this file except in compliance
-# with the License.  You may obtain a copy of the License at
-#
-#   http://www.apache.org/licenses/LICENSE-2.0
-#
-# Unless required by applicable law or agreed to in writing,
-# software distributed under the License is distributed on an
-# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
-# KIND, either express or implied.  See the License for the
-# specific language governing permissions and limitations
-# under the License.
-
-BOOST_VERSION=1.73.0
-BOOST_VERSION_UNDERSCORE=${BOOST_VERSION//\./_}
-NCORES=$(($(grep -c ^processor /proc/cpuinfo) + 1))
-
-BASE_NAME=boost_${BOOST_VERSION_UNDERSCORE}
-ARCHIVE_NAME=boost_${BOOST_VERSION_UNDERSCORE}.tar.bz2
-
-# Bintray is much faster but can fail because of limitations
-curl -sL https://dl.bintray.com/boostorg/release/${BOOST_VERSION}/source/boost_${BOOST_VERSION_UNDERSCORE}.tar.bz2 -o /${ARCHIVE_NAME} \
-    || curl -sL https://sourceforge.net/projects/boost/files/boost/${BOOST_VERSION}/boost_${BOOST_VERSION_UNDERSCORE}.tar.bz2 -o /${ARCHIVE_NAME}
-
-tar xf ${ARCHIVE_NAME}
-mkdir /arrow_boost
-pushd /${BASE_NAME}
-./bootstrap.sh
-./b2 -j${NCORES} tools/bcp
-./dist/bin/bcp --namespace=arrow_boost --namespace-alias filesystem date_time system regex build predef algorithm locale format variant multiprecision/cpp_int /arrow_boost
-popd
-
-pushd /arrow_boost
-ls -l
-./bootstrap.sh
-./b2 -j${NCORES} dll-path="'\$ORIGIN/'" cxxflags='-std=c++11 -fPIC' cflags=-fPIC linkflags="-std=c++11" variant=release link=shared --prefix=/arrow_boost_dist --with-filesystem --with-date_time --with-system --with-regex install
-popd
-
-rm -rf ${ARCHIVE_NAME} ${BASE_NAME} arrow_boost
-# Boost always installs header-only parts, but they also take up quite some space.
-# We don't need them in Arrow, so don't persist them in the docker layer.
-# fusion 16.7 MiB
-rm -r /arrow_boost_dist/include/boost/fusion
-# spirit 8.2 MiB
-rm -r /arrow_boost_dist/include/boost/spirit
diff --git a/python/manylinux201x/scripts/build_brotli.sh b/python/manylinux201x/scripts/build_brotli.sh
deleted file mode 100755
index 4be14b1..0000000
--- a/python/manylinux201x/scripts/build_brotli.sh
+++ /dev/null
@@ -1,33 +0,0 @@
-#!/bin/bash -ex
-# Licensed to the Apache Software Foundation (ASF) under one
-# or more contributor license agreements.  See the NOTICE file
-# distributed with this work for additional information
-# regarding copyright ownership.  The ASF licenses this file
-# to you under the Apache License, Version 2.0 (the
-# "License"); you may not use this file except in compliance
-# with the License.  You may obtain a copy of the License at
-#
-#   http://www.apache.org/licenses/LICENSE-2.0
-#
-# Unless required by applicable law or agreed to in writing,
-# software distributed under the License is distributed on an
-# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
-# KIND, either express or implied.  See the License for the
-# specific language governing permissions and limitations
-# under the License.
-
-export BROTLI_VERSION="1.0.7"
-curl -sL "https://github.com/google/brotli/archive/v${BROTLI_VERSION}.tar.gz" -o brotli-${BROTLI_VERSION}.tar.gz
-tar xf brotli-${BROTLI_VERSION}.tar.gz
-pushd brotli-${BROTLI_VERSION}
-mkdir build
-pushd build
-cmake -DCMAKE_BUILD_TYPE=release \
-    -DCMAKE_POSITION_INDEPENDENT_CODE=ON \
-    -DCMAKE_INSTALL_PREFIX=/usr/local \
-    -GNinja \
-    ..
-ninja install
-popd
-popd
-rm -rf brotli-${BROTLI_VERSION}.tar.gz brotli-${BROTLI_VERSION}
diff --git a/python/manylinux201x/scripts/build_bz2.sh b/python/manylinux201x/scripts/build_bz2.sh
deleted file mode 100755
index 84fd705..0000000
--- a/python/manylinux201x/scripts/build_bz2.sh
+++ /dev/null
@@ -1,30 +0,0 @@
-#!/bin/bash -ex
-# Licensed to the Apache Software Foundation (ASF) under one
-# or more contributor license agreements.  See the NOTICE file
-# distributed with this work for additional information
-# regarding copyright ownership.  The ASF licenses this file
-# to you under the Apache License, Version 2.0 (the
-# "License"); you may not use this file except in compliance
-# with the License.  You may obtain a copy of the License at
-#
-#   http://www.apache.org/licenses/LICENSE-2.0
-#
-# Unless required by applicable law or agreed to in writing,
-# software distributed under the License is distributed on an
-# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
-# KIND, either express or implied.  See the License for the
-# specific language governing permissions and limitations
-# under the License.
-
-NCORES=$(($(grep -c ^processor /proc/cpuinfo) + 1))
-export BZ2_VERSION="1.0.8"
-export CFLAGS="-Wall -Winline -O2 -fPIC -D_FILE_OFFSET_BITS=64"
-
-curl -sL "https://www.sourceware.org/pub/bzip2/bzip2-${BZ2_VERSION}.tar.gz" -o bzip2-${BZ2_VERSION}.tar.gz
-tar xf bzip2-${BZ2_VERSION}.tar.gz
-
-pushd bzip2-${BZ2_VERSION}
-make PREFIX=/usr/local CFLAGS="$CFLAGS" install -j${NCORES}
-popd
-
-rm -rf bzip2-${BZ2_VERSION}.tar.gz bzip2-${BZ2_VERSION}
diff --git a/python/manylinux201x/scripts/build_cares.sh b/python/manylinux201x/scripts/build_cares.sh
deleted file mode 100755
index 5848eaf..0000000
--- a/python/manylinux201x/scripts/build_cares.sh
+++ /dev/null
@@ -1,34 +0,0 @@
-#!/bin/bash -ex
-# Licensed to the Apache Software Foundation (ASF) under one
-# or more contributor license agreements.  See the NOTICE file
-# distributed with this work for additional information
-# regarding copyright ownership.  The ASF licenses this file
-# to you under the Apache License, Version 2.0 (the
-# "License"); you may not use this file except in compliance
-# with the License.  You may obtain a copy of the License at
-#
-#   http://www.apache.org/licenses/LICENSE-2.0
-#
-# Unless required by applicable law or agreed to in writing,
-# software distributed under the License is distributed on an
-# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
-# KIND, either express or implied.  See the License for the
-# specific language governing permissions and limitations
-# under the License.
-
-export CARES_VERSION="1.16.1"
-export CFLAGS="-fPIC"
-export PREFIX="/usr/local"
-curl -sL "https://c-ares.haxx.se/download/c-ares-$CARES_VERSION.tar.gz" -o c-ares-${CARES_VERSION}.tar.gz
-tar xf c-ares-${CARES_VERSION}.tar.gz
-pushd c-ares-${CARES_VERSION}
-
-cmake -DCMAKE_BUILD_TYPE=Release \
-      -DCMAKE_INSTALL_PREFIX=${PREFIX} \
-      -DCARES_STATIC=ON \
-      -DCARES_SHARED=OFF \
-      -DCMAKE_C_FLAGS=${CFLAGS} \
-      -GNinja .
-ninja install
-popd
-rm -rf c-ares-${CARES_VERSION}.tar.gz c-ares-${CARES_VERSION}
diff --git a/python/manylinux201x/scripts/build_curl.sh b/python/manylinux201x/scripts/build_curl.sh
deleted file mode 100755
index 7659449..0000000
--- a/python/manylinux201x/scripts/build_curl.sh
+++ /dev/null
@@ -1,53 +0,0 @@
-#!/bin/bash -ex
-# Licensed to the Apache Software Foundation (ASF) under one
-# or more contributor license agreements.  See the NOTICE file
-# distributed with this work for additional information
-# regarding copyright ownership.  The ASF licenses this file
-# to you under the Apache License, Version 2.0 (the
-# "License"); you may not use this file except in compliance
-# with the License.  You may obtain a copy of the License at
-#
-#   http://www.apache.org/licenses/LICENSE-2.0
-#
-# Unless required by applicable law or agreed to in writing,
-# software distributed under the License is distributed on an
-# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
-# KIND, either express or implied.  See the License for the
-# specific language governing permissions and limitations
-# under the License.
-
-export CURL_VERSION="7.70.0"
-export PREFIX="/usr/local"
-
-NCORES=$(($(grep -c ^processor /proc/cpuinfo) + 1))
-
-curl -sL "http://curl.haxx.se/download/curl-${CURL_VERSION}.tar.bz2" -o curl-${CURL_VERSION}.tar.bz2
-tar xf curl-${CURL_VERSION}.tar.bz2
-pushd curl-${CURL_VERSION}
-
-./configure \
-    --prefix=${PREFIX} \
-    --disable-ldap \
-    --disable-ldaps \
-    --disable-rtsp \
-    --disable-telnet \
-    --disable-tftp \
-    --disable-pop3 \
-    --disable-imap \
-    --disable-smb \
-    --disable-smtp \
-    --disable-gopher \
-    --disable-mqtt \
-    --disable-manual \
-    --disable-shared \
-    --without-ca-bundle \
-    --without-ca-path \
-    --with-ssl=${PREFIX} \
-    --with-zlib=${PREFIX}
-
-make -j${NCORES}
-make install
-
-popd
-
-rm -r curl-${CURL_VERSION}.tar.bz2 curl-${CURL_VERSION}
diff --git a/python/manylinux201x/scripts/build_flatbuffers.sh b/python/manylinux201x/scripts/build_flatbuffers.sh
deleted file mode 100755
index f5af7cc..0000000
--- a/python/manylinux201x/scripts/build_flatbuffers.sh
+++ /dev/null
@@ -1,32 +0,0 @@
-#!/bin/bash -ex
-# Licensed to the Apache Software Foundation (ASF) under one
-# or more contributor license agreements.  See the NOTICE file
-# distributed with this work for additional information
-# regarding copyright ownership.  The ASF licenses this file
-# to you under the Apache License, Version 2.0 (the
-# "License"); you may not use this file except in compliance
-# with the License.  You may obtain a copy of the License at
-#
-#   http://www.apache.org/licenses/LICENSE-2.0
-#
-# Unless required by applicable law or agreed to in writing,
-# software distributed under the License is distributed on an
-# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
-# KIND, either express or implied.  See the License for the
-# specific language governing permissions and limitations
-# under the License.
-
-export FLATBUFFERS_VERSION=1.10.0
-curl -sL https://github.com/google/flatbuffers/archive/v${FLATBUFFERS_VERSION}.tar.gz \
-    -o flatbuffers-${FLATBUFFERS_VERSION}.tar.gz
-tar xf flatbuffers-${FLATBUFFERS_VERSION}.tar.gz
-pushd flatbuffers-${FLATBUFFERS_VERSION}
-cmake \
-    -DCMAKE_CXX_FLAGS=-fPIC \
-    -DCMAKE_INSTALL_PREFIX:PATH=/usr/local \
-    -DFLATBUFFERS_BUILD_TESTS=OFF \
-    -DCMAKE_BUILD_TYPE=Release \
-    -GNinja .
-ninja install
-popd
-rm -rf flatbuffers-${FLATBUFFERS_VERSION}.tar.gz flatbuffers-${FLATBUFFERS_VERSION}
diff --git a/python/manylinux201x/scripts/build_gflags.sh b/python/manylinux201x/scripts/build_gflags.sh
deleted file mode 100755
index b26c14b..0000000
--- a/python/manylinux201x/scripts/build_gflags.sh
+++ /dev/null
@@ -1,39 +0,0 @@
-#!/bin/bash -ex
-# Licensed to the Apache Software Foundation (ASF) under one
-# or more contributor license agreements.  See the NOTICE file
-# distributed with this work for additional information
-# regarding copyright ownership.  The ASF licenses this file
-# to you under the Apache License, Version 2.0 (the
-# "License"); you may not use this file except in compliance
-# with the License.  You may obtain a copy of the License at
-#
-#   http://www.apache.org/licenses/LICENSE-2.0
-#
-# Unless required by applicable law or agreed to in writing,
-# software distributed under the License is distributed on an
-# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
-# KIND, either express or implied.  See the License for the
-# specific language governing permissions and limitations
-# under the License.
-
-export GFLAGS_VERSION="2.2.2"
-export CFLAGS="-fPIC"
-export CXXFLAGS="-fPIC"
-
-curl -sL "https://github.com/gflags/gflags/archive/v${GFLAGS_VERSION}.tar.gz" -o gflags-${GFLAGS_VERSION}.tar.gz
-tar xf gflags-${GFLAGS_VERSION}.tar.gz
-pushd gflags-${GFLAGS_VERSION}
-
-cmake . -GNinja \
-    -DCMAKE_BUILD_TYPE=Release \
-    -DCMAKE_INSTALL_PREFIX=/usr \
-    -DINSTALL_HEADERS=ON \
-    -DBUILD_SHARED_LIBS=OFF \
-    -DBUILD_STATIC_LIBS=ON \
-    -DBUILD_PACKAGING=OFF \
-    -DBUILD_TESTING=OFF \
-    -DBUILD_CONFIG_TESTS=OFF \
-
-ninja install
-popd
-rm -rf gflags-${GFLAGS_VERSION}.tar.gz gflags-${GFLAGS_VERSION}
diff --git a/python/manylinux201x/scripts/build_glog.sh b/python/manylinux201x/scripts/build_glog.sh
deleted file mode 100755
index 4d43d36..0000000
--- a/python/manylinux201x/scripts/build_glog.sh
+++ /dev/null
@@ -1,35 +0,0 @@
-#!/bin/bash -ex
-# Licensed to the Apache Software Foundation (ASF) under one
-# or more contributor license agreements.  See the NOTICE file
-# distributed with this work for additional information
-# regarding copyright ownership.  The ASF licenses this file
-# to you under the Apache License, Version 2.0 (the
-# "License"); you may not use this file except in compliance
-# with the License.  You may obtain a copy of the License at
-#
-#   http://www.apache.org/licenses/LICENSE-2.0
-#
-# Unless required by applicable law or agreed to in writing,
-# software distributed under the License is distributed on an
-# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
-# KIND, either express or implied.  See the License for the
-# specific language governing permissions and limitations
-# under the License.
-
-export GLOG_VERSION="0.4.0"
-export PREFIX="/usr/local"
-curl -sL "https://github.com/google/glog/archive/v${GLOG_VERSION}.tar.gz" -o glog-${GLOG_VERSION}.tar.gz
-tar xf glog-${GLOG_VERSION}.tar.gz
-pushd glog-${GLOG_VERSION}
-
-cmake -DCMAKE_BUILD_TYPE=Release \
-      -DCMAKE_INSTALL_PREFIX=${PREFIX} \
-      -DCMAKE_POSITION_INDEPENDENT_CODE=1 \
-      -DBUILD_SHARED_LIBS=OFF \
-      -DBUILD_TESTING=OFF \
-      -DWITH_GFLAGS=OFF \
-      -GNinja .
-ninja install
-popd
-rm -rf glog-${GLOG_VERSION}.tar.gz glog-${GLOG_VERSION}
-
diff --git a/python/manylinux201x/scripts/build_grpc.sh b/python/manylinux201x/scripts/build_grpc.sh
deleted file mode 100755
index ef2e831..0000000
--- a/python/manylinux201x/scripts/build_grpc.sh
+++ /dev/null
@@ -1,49 +0,0 @@
-#!/bin/bash -ex
-# Licensed to the Apache Software Foundation (ASF) under one
-# or more contributor license agreements.  See the NOTICE file
-# distributed with this work for additional information
-# regarding copyright ownership.  The ASF licenses this file
-# to you under the Apache License, Version 2.0 (the
-# "License"); you may not use this file except in compliance
-# with the License.  You may obtain a copy of the License at
-#
-#   http://www.apache.org/licenses/LICENSE-2.0
-#
-# Unless required by applicable law or agreed to in writing,
-# software distributed under the License is distributed on an
-# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
-# KIND, either express or implied.  See the License for the
-# specific language governing permissions and limitations
-# under the License.
-
-export GRPC_VERSION="1.29.1"
-export CFLAGS="-fPIC -DGPR_MANYLINUX1=1"
-export PREFIX="/usr/local"
-
-curl -sL "https://github.com/grpc/grpc/archive/v${GRPC_VERSION}.tar.gz" -o grpc-${GRPC_VERSION}.tar.gz
-tar xf grpc-${GRPC_VERSION}.tar.gz
-pushd grpc-${GRPC_VERSION}
-
-cmake -DCMAKE_BUILD_TYPE=Release \
-      -DCMAKE_INSTALL_PREFIX=${PREFIX} \
-      -DBUILD_SHARED_LIBS=OFF \
-      -DCMAKE_C_FLAGS="${CFLAGS}" \
-      -DCMAKE_CXX_FLAGS="${CFLAGS}" \
-      -DgRPC_BUILD_CSHARP_EXT=OFF \
-      -DgRPC_BUILD_GRPC_CSHARP_PLUGIN=OFF \
-      -DgRPC_BUILD_GRPC_NODE_PLUGIN=OFF \
-      -DgRPC_BUILD_GRPC_OBJECTIVE_C_PLUGIN=OFF \
-      -DgRPC_BUILD_GRPC_PHP_PLUGIN=OFF \
-      -DgRPC_BUILD_GRPC_PYTHON_PLUGIN=OFF \
-      -DgRPC_BUILD_GRPC_RUBY_PLUGIN=OFF \
-      -DgRPC_ABSL_PROVIDER=package \
-      -DgRPC_CARES_PROVIDER=package \
-      -DgRPC_GFLAGS_PROVIDER=package \
-      -DgRPC_PROTOBUF_PROVIDER=package \
-      -DgRPC_SSL_PROVIDER=package \
-      -DgRPC_ZLIB_PROVIDER=package \
-      -DOPENSSL_USE_STATIC_LIBS=ON \
-      -GNinja .
-ninja install
-popd
-rm -rf grpc-${GRPC_VERSION}.tar.gz grpc-${GRPC_VERSION}
diff --git a/python/manylinux201x/scripts/build_llvm.sh b/python/manylinux201x/scripts/build_llvm.sh
deleted file mode 100755
index 8468dcf..0000000
--- a/python/manylinux201x/scripts/build_llvm.sh
+++ /dev/null
@@ -1,87 +0,0 @@
-#!/bin/bash -ex
-# Licensed to the Apache Software Foundation (ASF) under one
-# or more contributor license agreements.  See the NOTICE file
-# distributed with this work for additional information
-# regarding copyright ownership.  The ASF licenses this file
-# to you under the Apache License, Version 2.0 (the
-# "License"); you may not use this file except in compliance
-# with the License.  You may obtain a copy of the License at
-#
-#   http://www.apache.org/licenses/LICENSE-2.0
-#
-# Unless required by applicable law or agreed to in writing,
-# software distributed under the License is distributed on an
-# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
-# KIND, either express or implied.  See the License for the
-# specific language governing permissions and limitations
-# under the License.
-
-LLVM_VERSION_MAJOR=$1
-
-source /multibuild/manylinux_utils.sh
-
-detect_llvm_version() {
-  curl -sL https://api.github.com/repos/llvm/llvm-project/releases | \
-    grep tag_name | \
-    grep -o "llvmorg-${LLVM_VERSION_MAJOR}[^\"]*" | \
-    grep -v rc | \
-    sed -e "s/^llvmorg-//g" | \
-    head -n 1
-}
-
-LLVM_VERSION=$(detect_llvm_version)
-
-curl -sL https://github.com/llvm/llvm-project/releases/download/llvmorg-${LLVM_VERSION}/llvm-${LLVM_VERSION}.src.tar.xz -o llvm-${LLVM_VERSION}.src.tar.xz
-unxz llvm-${LLVM_VERSION}.src.tar.xz
-tar xf llvm-${LLVM_VERSION}.src.tar
-pushd llvm-${LLVM_VERSION}.src
-mkdir build
-pushd build
-cmake \
-    -DCMAKE_BUILD_TYPE=Release \
-    -DCMAKE_INSTALL_PREFIX=/usr \
-    -DLLVM_TARGETS_TO_BUILD=host \
-    -DLLVM_INCLUDE_DOCS=OFF \
-    -DLLVM_INCLUDE_EXAMPLES=OFF \
-    -DLLVM_INCLUDE_TESTS=OFF \
-    -DLLVM_INCLUDE_UTILS=OFF \
-    -DLLVM_ENABLE_TERMINFO=OFF \
-    -DLLVM_ENABLE_ASSERTIONS=ON \
-    -DLLVM_ENABLE_RTTI=ON \
-    -DLLVM_ENABLE_OCAMLDOC=OFF \
-    -DLLVM_USE_INTEL_JITEVENTS=ON \
-    -DLLVM_TEMPORARILY_ALLOW_OLD_TOOLCHAIN=ON \
-    -DPYTHON_EXECUTABLE="$(cpython_path 3.6 16)/bin/python" \
-    -GNinja \
-    ..
-ninja install
-popd
-popd
-rm -rf llvm-${LLVM_VERSION}.src.tar.xz llvm-${LLVM_VERSION}.src.tar llvm-${LLVM_VERSION}.src
-
-
-# clang is only used to precompile Gandiva bitcode
-curl -sL https://github.com/llvm/llvm-project/releases/download/llvmorg-${LLVM_VERSION}/clang-${LLVM_VERSION}.src.tar.xz -o clang-${LLVM_VERSION}.src.tar.xz
-unxz clang-${LLVM_VERSION}.src.tar.xz
-tar xf clang-${LLVM_VERSION}.src.tar
-pushd clang-${LLVM_VERSION}.src
-mkdir build
-pushd build
-cmake \
-    -DCMAKE_BUILD_TYPE=Release \
-    -DCMAKE_INSTALL_PREFIX=/usr \
-    -DCLANG_INCLUDE_TESTS=OFF \
-    -DCLANG_INCLUDE_DOCS=OFF \
-    -DCLANG_ENABLE_ARCMT=OFF \
-    -DCLANG_ENABLE_STATIC_ANALYZER=OFF \
-    -DLLVM_ENABLE_RTTI=ON \
-    -DLLVM_INSTALL_TOOLCHAIN_ONLY=ON \
-    -DLLVM_INCLUDE_TESTS=OFF \
-    -DLLVM_INCLUDE_DOCS=OFF \
-    -DLLVM_TEMPORARILY_ALLOW_OLD_TOOLCHAIN=ON \
-    -GNinja \
-    ..
-ninja -w dupbuild=warn install # both the clang and llvm builds generate an llvm-config file
-popd
-popd
-rm -rf clang-${LLVM_VERSION}.src.tar.xz clang-${LLVM_VERSION}.src.tar clang-${LLVM_VERSION}.src
diff --git a/python/manylinux201x/scripts/build_lz4.sh b/python/manylinux201x/scripts/build_lz4.sh
deleted file mode 100755
index 8b26155..0000000
--- a/python/manylinux201x/scripts/build_lz4.sh
+++ /dev/null
@@ -1,35 +0,0 @@
-#!/bin/bash -ex
-# Licensed to the Apache Software Foundation (ASF) under one
-# or more contributor license agreements.  See the NOTICE file
-# distributed with this work for additional information
-# regarding copyright ownership.  The ASF licenses this file
-# to you under the Apache License, Version 2.0 (the
-# "License"); you may not use this file except in compliance
-# with the License.  You may obtain a copy of the License at
-#
-#   http://www.apache.org/licenses/LICENSE-2.0
-#
-# Unless required by applicable law or agreed to in writing,
-# software distributed under the License is distributed on an
-# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
-# KIND, either express or implied.  See the License for the
-# specific language governing permissions and limitations
-# under the License.
-
-NCORES=$(($(grep -c ^processor /proc/cpuinfo) + 1))
-export LZ4_VERSION="1.9.2"
-export PREFIX="/usr/local"
-export CFLAGS="${CFLAGS} -O3 -fPIC"
-export LDFLAGS="${LDFLAGS} -Wl,-rpath,${PREFIX}/lib -L${PREFIX}/lib"
-
-curl -sL "https://github.com/lz4/lz4/archive/v${LZ4_VERSION}.tar.gz" -o lz4-${LZ4_VERSION}.tar.gz
-tar xf lz4-${LZ4_VERSION}.tar.gz
-pushd lz4-${LZ4_VERSION}
-
-make -j$NCORES PREFIX=${PREFIX} CFLAGS="${CFLAGS}"
-make install PREFIX=${PREFIX}
-popd
-
-rm -rf lz4-${LZ4_VERSION}.tar.gz lz4-${LZ4_VERSION}
-# We don't want to link against shared libs
-rm -rf ${PREFIX}/lib/liblz4.so*
diff --git a/python/manylinux201x/scripts/build_openssl.sh b/python/manylinux201x/scripts/build_openssl.sh
deleted file mode 100755
index 78093a4..0000000
--- a/python/manylinux201x/scripts/build_openssl.sh
+++ /dev/null
@@ -1,31 +0,0 @@
-#!/bin/bash -ex
-# Licensed to the Apache Software Foundation (ASF) under one
-# or more contributor license agreements.  See the NOTICE file
-# distributed with this work for additional information
-# regarding copyright ownership.  The ASF licenses this file
-# to you under the Apache License, Version 2.0 (the
-# "License"); you may not use this file except in compliance
-# with the License.  You may obtain a copy of the License at
-#
-#   http://www.apache.org/licenses/LICENSE-2.0
-#
-# Unless required by applicable law or agreed to in writing,
-# software distributed under the License is distributed on an
-# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
-# KIND, either express or implied.  See the License for the
-# specific language governing permissions and limitations
-# under the License.
-
-OPENSSL_VERSION="1.1.1g"
-NCORES=$(($(grep -c ^processor /proc/cpuinfo) + 1))
-
-wget --no-check-certificate https://www.openssl.org/source/openssl-${OPENSSL_VERSION}.tar.gz -O openssl-${OPENSSL_VERSION}.tar.gz
-tar xf openssl-${OPENSSL_VERSION}.tar.gz
-
-pushd openssl-${OPENSSL_VERSION}
-./config -fpic no-shared no-tests --prefix=/usr/local
-make -j${NCORES}
-make install_sw
-popd
-
-rm -rf openssl-${OPENSSL_VERSION}.tar.gz openssl-${OPENSSL_VERSION}
diff --git a/python/manylinux201x/scripts/build_protobuf.sh b/python/manylinux201x/scripts/build_protobuf.sh
deleted file mode 100755
index f6153d0..0000000
--- a/python/manylinux201x/scripts/build_protobuf.sh
+++ /dev/null
@@ -1,29 +0,0 @@
-#!/bin/bash -ex
-# Licensed to the Apache Software Foundation (ASF) under one
-# or more contributor license agreements.  See the NOTICE file
-# distributed with this work for additional information
-# regarding copyright ownership.  The ASF licenses this file
-# to you under the Apache License, Version 2.0 (the
-# "License"); you may not use this file except in compliance
-# with the License.  You may obtain a copy of the License at
-#
-#   http://www.apache.org/licenses/LICENSE-2.0
-#
-# Unless required by applicable law or agreed to in writing,
-# software distributed under the License is distributed on an
-# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
-# KIND, either express or implied.  See the License for the
-# specific language governing permissions and limitations
-# under the License.
-
-NCORES=$(($(grep -c ^processor /proc/cpuinfo) + 1))
-PROTOBUF_VERSION="3.12.1"
-
-curl -sL "https://github.com/google/protobuf/releases/download/v${PROTOBUF_VERSION}/protobuf-all-${PROTOBUF_VERSION}.tar.gz" -o protobuf-${PROTOBUF_VERSION}.tar.gz
-tar xf protobuf-${PROTOBUF_VERSION}.tar.gz
-pushd protobuf-${PROTOBUF_VERSION}
-./configure --disable-shared --prefix=/usr/local "CXXFLAGS=-O2 -fPIC"
-make -j$NCORES
-make install
-popd
-rm -rf protobuf-${PROTOBUF_VERSION}.tar.gz protobuf-${PROTOBUF_VERSION}
diff --git a/python/manylinux201x/scripts/build_rapidjson.sh b/python/manylinux201x/scripts/build_rapidjson.sh
deleted file mode 100755
index 5e012ae..0000000
--- a/python/manylinux201x/scripts/build_rapidjson.sh
+++ /dev/null
@@ -1,37 +0,0 @@
-#!/bin/bash -ex
-# Licensed to the Apache Software Foundation (ASF) under one
-# or more contributor license agreements.  See the NOTICE file
-# distributed with this work for additional information
-# regarding copyright ownership.  The ASF licenses this file
-# to you under the Apache License, Version 2.0 (the
-# "License"); you may not use this file except in compliance
-# with the License.  You may obtain a copy of the License at
-#
-#   http://www.apache.org/licenses/LICENSE-2.0
-#
-# Unless required by applicable law or agreed to in writing,
-# software distributed under the License is distributed on an
-# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
-# KIND, either express or implied.  See the License for the
-# specific language governing permissions and limitations
-# under the License.
-
-export RAPIDJSON_VERSION="1a803826f1197b5e30703afe4b9c0e7dd48074f5"
-
-curl -sL "https://github.com/Tencent/rapidjson/archive/${RAPIDJSON_VERSION}.tar.gz" -o rapidjson-${RAPIDJSON_VERSION}.tar.gz
-tar xf rapidjson-${RAPIDJSON_VERSION}.tar.gz
-pushd rapidjson-${RAPIDJSON_VERSION}
-mkdir build
-pushd build
-cmake -GNinja \
-      -DCMAKE_INSTALL_PREFIX=/usr/local \
-      -DRAPIDJSON_HAS_STDSTRING=ON \
-      -DRAPIDJSON_BUILD_TESTS=OFF \
-      -DRAPIDJSON_BUILD_EXAMPLES=OFF \
-      -DRAPIDJSON_BUILD_DOC=OFF \
-      -DCMAKE_BUILD_TYPE=release \
-      ..
-ninja install
-popd
-popd
-rm -rf rapidjson-${RAPIDJSON_VERSION}.tar.gz rapidjson-${RAPIDJSON_VERSION}
diff --git a/python/manylinux201x/scripts/build_re2.sh b/python/manylinux201x/scripts/build_re2.sh
deleted file mode 100755
index 4886b20..0000000
--- a/python/manylinux201x/scripts/build_re2.sh
+++ /dev/null
@@ -1,35 +0,0 @@
-#!/bin/bash -ex
-# Licensed to the Apache Software Foundation (ASF) under one
-# or more contributor license agreements.  See the NOTICE file
-# distributed with this work for additional information
-# regarding copyright ownership.  The ASF licenses this file
-# to you under the Apache License, Version 2.0 (the
-# "License"); you may not use this file except in compliance
-# with the License.  You may obtain a copy of the License at
-#
-#   http://www.apache.org/licenses/LICENSE-2.0
-#
-# Unless required by applicable law or agreed to in writing,
-# software distributed under the License is distributed on an
-# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
-# KIND, either express or implied.  See the License for the
-# specific language governing permissions and limitations
-# under the License.
-
-export RE2_VERSION="2019-08-01"
-NCORES=$(($(grep -c ^processor /proc/cpuinfo) + 1))
-
-curl -sL "https://github.com/google/re2/archive/${RE2_VERSION}.tar.gz" -o re2-${RE2_VERSION}.tar.gz
-tar xf re2-${RE2_VERSION}.tar.gz
-pushd re2-${RE2_VERSION}
-
-export CXXFLAGS="-fPIC -O2 ${CXXFLAGS}"
-export CFLAGS="-fPIC -O2 ${CFLAGS}"
-
-# Build and install the library (both static and shared are built; shared is removed below)
-make prefix=/usr/local -j${NCORES} install
-
-popd
-
-# Need to remove shared library to make sure the static library is picked up by Arrow
-rm -rf re2-${RE2_VERSION}.tar.gz re2-${RE2_VERSION} /usr/local/lib/libre2.so*
diff --git a/python/manylinux201x/scripts/build_snappy.sh b/python/manylinux201x/scripts/build_snappy.sh
deleted file mode 100755
index b8eea9e..0000000
--- a/python/manylinux201x/scripts/build_snappy.sh
+++ /dev/null
@@ -1,31 +0,0 @@
-#!/bin/bash -ex
-# Licensed to the Apache Software Foundation (ASF) under one
-# or more contributor license agreements.  See the NOTICE file
-# distributed with this work for additional information
-# regarding copyright ownership.  The ASF licenses this file
-# to you under the Apache License, Version 2.0 (the
-# "License"); you may not use this file except in compliance
-# with the License.  You may obtain a copy of the License at
-#
-#   http://www.apache.org/licenses/LICENSE-2.0
-#
-# Unless required by applicable law or agreed to in writing,
-# software distributed under the License is distributed on an
-# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
-# KIND, either express or implied.  See the License for the
-# specific language governing permissions and limitations
-# under the License.
-
-export SNAPPY_VERSION="1.1.8"
-curl -sL "https://github.com/google/snappy/archive/${SNAPPY_VERSION}.tar.gz" -o snappy-${SNAPPY_VERSION}.tar.gz
-tar xf snappy-${SNAPPY_VERSION}.tar.gz
-pushd snappy-${SNAPPY_VERSION}
-CXXFLAGS='-DNDEBUG -O2' cmake -GNinja \
-    -DCMAKE_INSTALL_PREFIX=/usr/local \
-    -DCMAKE_POSITION_INDEPENDENT_CODE=1 \
-    -DBUILD_SHARED_LIBS=OFF \
-    -DSNAPPY_BUILD_TESTS=OFF \
-    .
-ninja install
-popd
-rm -rf snappy-${SNAPPY_VERSION}.tar.gz snappy-${SNAPPY_VERSION}
diff --git a/python/manylinux201x/scripts/build_thrift.sh b/python/manylinux201x/scripts/build_thrift.sh
deleted file mode 100755
index 35a40c5..0000000
--- a/python/manylinux201x/scripts/build_thrift.sh
+++ /dev/null
@@ -1,50 +0,0 @@
-#!/bin/bash -ex
-# Licensed to the Apache Software Foundation (ASF) under one
-# or more contributor license agreements.  See the NOTICE file
-# distributed with this work for additional information
-# regarding copyright ownership.  The ASF licenses this file
-# to you under the Apache License, Version 2.0 (the
-# "License"); you may not use this file except in compliance
-# with the License.  You may obtain a copy of the License at
-#
-#   http://www.apache.org/licenses/LICENSE-2.0
-#
-# Unless required by applicable law or agreed to in writing,
-# software distributed under the License is distributed on an
-# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
-# KIND, either express or implied.  See the License for the
-# specific language governing permissions and limitations
-# under the License.
-
-export THRIFT_VERSION=0.13.0
-wget https://archive.apache.org/dist/thrift/${THRIFT_VERSION}/thrift-${THRIFT_VERSION}.tar.gz
-tar xf thrift-${THRIFT_VERSION}.tar.gz
-pushd thrift-${THRIFT_VERSION}
-mkdir build-tmp
-pushd build-tmp
-cmake -DCMAKE_BUILD_TYPE=release \
-    -DCMAKE_CXX_FLAGS=-fPIC \
-    -DCMAKE_C_FLAGS=-fPIC \
-    -DCMAKE_INSTALL_PREFIX=/usr/local \
-    -DCMAKE_INSTALL_RPATH=/usr/local/lib \
-    -DBUILD_EXAMPLES=OFF \
-    -DBUILD_TESTING=OFF \
-    -DWITH_QT4=OFF \
-    -DWITH_AS3=OFF \
-    -DWITH_C_GLIB=OFF \
-    -DWITH_CPP=ON \
-    -DWITH_HASKELL=OFF \
-    -DWITH_JAVA=OFF \
-    -DWITH_JAVASCRIPT=OFF \
-    -DWITH_NODEJS=OFF \
-    -DWITH_PYTHON=OFF \
-    -DWITH_STATIC_LIB=ON \
-    -DWITH_SHARED_LIB=OFF \
-    -DWITH_OPENSSL=OFF \
-    -DBoost_NAMESPACE=arrow_boost \
-    -DBOOST_ROOT=/arrow_boost_dist \
-    -GNinja ..
-ninja install
-popd
-popd
-rm -rf thrift-${THRIFT_VERSION}.tar.gz thrift-${THRIFT_VERSION}
diff --git a/python/manylinux201x/scripts/build_utf8proc.sh b/python/manylinux201x/scripts/build_utf8proc.sh
deleted file mode 100755
index d74c36a..0000000
--- a/python/manylinux201x/scripts/build_utf8proc.sh
+++ /dev/null
@@ -1,38 +0,0 @@
-#!/bin/bash -ex
-# Licensed to the Apache Software Foundation (ASF) under one
-# or more contributor license agreements.  See the NOTICE file
-# distributed with this work for additional information
-# regarding copyright ownership.  The ASF licenses this file
-# to you under the Apache License, Version 2.0 (the
-# "License"); you may not use this file except in compliance
-# with the License.  You may obtain a copy of the License at
-#
-#   http://www.apache.org/licenses/LICENSE-2.0
-#
-# Unless required by applicable law or agreed to in writing,
-# software distributed under the License is distributed on an
-# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
-# KIND, either express or implied.  See the License for the
-# specific language governing permissions and limitations
-# under the License.
-
-NCORES=$(($(grep -c ^processor /proc/cpuinfo) + 1))
-export UTF8PROC_VERSION="2.5.0"
-export PREFIX="/usr/local"
-
-curl -sL "https://github.com/JuliaStrings/utf8proc/archive/v${UTF8PROC_VERSION}.tar.gz" -o utf8proc-$UTF8PROC_VERSION}.tar.gz
-tar xf utf8proc-$UTF8PROC_VERSION}.tar.gz
-
-pushd utf8proc-${UTF8PROC_VERSION}
-mkdir build
-pushd build
-cmake .. -GNinja \
-  -DBUILD_SHARED_LIBS=OFF \
-  -DCMAKE_BUILD_TYPE=Release \
-  -DCMAKE_INSTALL_PREFIX=${PREFIX}
-
-ninja install
-popd
-popd
-
-rm -rf utf8proc-${UTF8PROC_VERSION}.tar.gz utf8proc-${UTF8PROC_VERSION}
diff --git a/python/manylinux201x/scripts/build_zstd.sh b/python/manylinux201x/scripts/build_zstd.sh
deleted file mode 100755
index 9e38c70..0000000
--- a/python/manylinux201x/scripts/build_zstd.sh
+++ /dev/null
@@ -1,39 +0,0 @@
-#!/bin/bash -ex
-# Licensed to the Apache Software Foundation (ASF) under one
-# or more contributor license agreements.  See the NOTICE file
-# distributed with this work for additional information
-# regarding copyright ownership.  The ASF licenses this file
-# to you under the Apache License, Version 2.0 (the
-# "License"); you may not use this file except in compliance
-# with the License.  You may obtain a copy of the License at
-#
-#   http://www.apache.org/licenses/LICENSE-2.0
-#
-# Unless required by applicable law or agreed to in writing,
-# software distributed under the License is distributed on an
-# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
-# KIND, either express or implied.  See the License for the
-# specific language governing permissions and limitations
-# under the License.
-
-export ZSTD_VERSION="1.4.5"
-
-curl -sL "https://github.com/facebook/zstd/archive/v${ZSTD_VERSION}.tar.gz" -o zstd-${ZSTD_VERSION}.tar.gz
-tar xf zstd-${ZSTD_VERSION}.tar.gz
-pushd zstd-${ZSTD_VERSION}
-mkdir build_cmake
-pushd build_cmake
-
-cmake -GNinja -DCMAKE_BUILD_TYPE=Release \
-    -DCMAKE_INSTALL_PREFIX=/usr/local \
-    -DZSTD_BUILD_PROGRAMS=off \
-    -DZSTD_BUILD_SHARED=off \
-    -DZSTD_BUILD_STATIC=on \
-    -DZSTD_MULTITHREAD_SUPPORT=off \
-    -DCMAKE_POSITION_INDEPENDENT_CODE=1 \
-    ../build/cmake
-ninja install
-
-popd
-popd
-rm -rf zstd-${ZSTD_VERSION}.tar.gz zstd-${ZSTD_VERSION}
diff --git a/python/manylinux201x/scripts/check_arrow_visibility.sh b/python/manylinux201x/scripts/check_arrow_visibility.sh
deleted file mode 100755
index ed464e0..0000000
--- a/python/manylinux201x/scripts/check_arrow_visibility.sh
+++ /dev/null
@@ -1,35 +0,0 @@
-#!/bin/bash -ex
-# Licensed to the Apache Software Foundation (ASF) under one
-# or more contributor license agreements.  See the NOTICE file
-# distributed with this work for additional information
-# regarding copyright ownership.  The ASF licenses this file
-# to you under the Apache License, Version 2.0 (the
-# "License"); you may not use this file except in compliance
-# with the License.  You may obtain a copy of the License at
-#
-#   http://www.apache.org/licenses/LICENSE-2.0
-#
-# Unless required by applicable law or agreed to in writing,
-# software distributed under the License is distributed on an
-# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
-# KIND, either express or implied.  See the License for the
-# specific language governing permissions and limitations
-# under the License.
-
-nm --demangle --dynamic /arrow-dist/lib/libarrow.so > nm_arrow.log
-
-# Filter out Arrow symbols and see if anything remains.
-# '_init' and '_fini' symbols may or not be present, we don't care.
-# (note we must ignore the grep exit status when no match is found)
-grep ' T ' nm_arrow.log | grep -v -E '(arrow|\b_init\b|\b_fini\b)' | cat - > visible_symbols.log
-
-if [[ -f visible_symbols.log && `cat visible_symbols.log | wc -l` -eq 0 ]]
-then
-    exit 0
-fi
-
-echo "== Unexpected symbols exported by libarrow.so =="
-cat visible_symbols.log
-echo "================================================"
-
-exit 1
diff --git a/python/requirements-wheel-test.txt b/python/requirements-wheel-test.txt
index a532a27..01aca24 100644
--- a/python/requirements-wheel-test.txt
+++ b/python/requirements-wheel-test.txt
@@ -4,7 +4,7 @@ hypothesis
 numpy==1.19.4
 pandas<1.1.0; python_version < "3.8"
 pandas; python_version >= "3.8"
-pickle5; python_version == "3.6" or python_version == "3.7"
+pickle5; (python_version == "3.6" or python_version == "3.7") and sys_platform != 'win32'
 pytest
 pytest-lazy-fixture
 pytz
diff --git a/python/setup.py b/python/setup.py
index 853ef18..01060f1 100755
--- a/python/setup.py
+++ b/python/setup.py
@@ -385,17 +385,6 @@ class build_ext(_build_ext):
                 build_prefix, build_lib,
                 "{}_regex".format(self.boost_namespace),
                 implib_required=False)
-        if sys.platform == 'win32':
-            # zlib uses zlib.dll for Windows
-            zlib_lib_name = 'zlib'
-            move_shared_libs(build_prefix, build_lib, zlib_lib_name,
-                             implib_required=False)
-            if self.with_flight:
-                # DLL dependencies for gRPC / Flight
-                for lib_name in ['libcrypto-1_1-x64',
-                                 'libssl-1_1-x64']:
-                    move_shared_libs(build_prefix, build_lib, lib_name,
-                                     implib_required=False)
 
     def _bundle_cython_cpp(self, name, lib_path):
         cpp_generated_path = self.get_ext_generated_cpp_source(name)