Posted to commits@arrow.apache.org by ks...@apache.org on 2020/05/05 17:02:57 UTC
[arrow] branch master updated: ARROW-8628: [Dev] Wrap docker-compose commands with archery
This is an automated email from the ASF dual-hosted git repository.
kszucs pushed a commit to branch master
in repository https://gitbox.apache.org/repos/asf/arrow.git
The following commit(s) were added to refs/heads/master by this push:
new 4116516 ARROW-8628: [Dev] Wrap docker-compose commands with archery
4116516 is described below
commit 4116516f22b680b70ab32ea28c029cc47e8d8988
Author: Krisztián Szűcs <sz...@gmail.com>
AuthorDate: Tue May 5 19:02:22 2020 +0200
ARROW-8628: [Dev] Wrap docker-compose commands with archery
- extend docker-compose.yml with a custom field to define the image hierarchy
- add docker and docker-compose commands to archery
- validate the docker-compose configuration's custom fields, as well as exercise docker-compose's built-in validation
- cover the added features with unit tests
- add an archery command to execute the pull/build/run docker-compose sequences
- add archery support for Python 3.5, and test it on GitHub Actions
- update all of the GitHub workflows to use the new archery tooling
- update all of the crossbow tasks to use archery as well
- remove the old Makefile.docker
- fix archery's package and module definitions, because a non-editable installation caused import errors
- add documentation
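The "image hierarchy" idea behind the new pull/build/run sequence can be sketched as follows. This is a minimal illustration, not archery's actual implementation: the `hierarchy` mapping and the function names are hypothetical stand-ins for the custom field this commit adds to docker-compose.yml, which lets archery build a service's ancestor images before the service itself.

```python
# Sketch only: archery reads the real hierarchy from a custom field
# in docker-compose.yml; the mapping and helpers here are hypothetical.

def ancestors(image, hierarchy):
    """Return the chain of parent images for `image`, oldest first."""
    chain = []
    parent = hierarchy.get(image)
    while parent is not None:
        chain.append(parent)
        parent = hierarchy.get(parent)
    return list(reversed(chain))

def build_order(image, hierarchy):
    """All images to pull/build before running `image`, in order."""
    return ancestors(image, hierarchy) + [image]

# Example mirroring the workflows below, where the old jobs had to
# pull and build conda-cpp and conda-python explicitly before
# running conda-python-pandas.
hierarchy = {
    "conda-python": "conda-cpp",
    "conda-python-pandas": "conda-python",
}
print(build_order("conda-python-pandas", hierarchy))
# -> ['conda-cpp', 'conda-python', 'conda-python-pandas']
```

With such an ordering resolved internally, a single `archery docker run conda-python-pandas` can replace the per-service pull/build/run steps the old workflows spelled out by hand.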
Closes #7021 from kszucs/archery-docker
Authored-by: Krisztián Szűcs <sz...@gmail.com>
Signed-off-by: Krisztián Szűcs <sz...@gmail.com>
---
.github/workflows/archery.yml | 10 +-
.github/workflows/comment_bot.yml | 2 +-
.github/workflows/cpp.yml | 81 +---
.github/workflows/cpp_cron.yml | 185 ++------
.github/workflows/dev.yml | 92 ++--
.github/workflows/go.yml | 26 +-
.github/workflows/integration.yml | 32 +-
.github/workflows/java.yml | 22 +-
.github/workflows/java_jni.yml | 33 +-
.github/workflows/js.yml | 37 +-
.github/workflows/python.yml | 183 +++-----
.github/workflows/python_cron.yml | 238 +++-------
.github/workflows/r.yml | 105 ++---
.github/workflows/ruby.yml | 32 +-
.github/workflows/rust.yml | 30 +-
Makefile.docker | 65 ---
.../{setup.py => archery/benchmark/__init__.py} | 25 -
dev/archery/archery/benchmark/core.py | 8 +-
dev/archery/archery/benchmark/google.py | 14 +-
dev/archery/archery/benchmark/runner.py | 11 +-
dev/archery/archery/bot.py | 9 +-
dev/archery/archery/cli.py | 153 ++++++-
dev/archery/archery/compat.py | 51 +++
dev/archery/archery/docker.py | 186 ++++++++
dev/archery/{setup.py => archery/lang/__init__.py} | 25 -
dev/archery/archery/lang/cpp.py | 7 +-
dev/archery/archery/testing.py | 63 +++
dev/archery/archery/tests/test_docker.py | 410 +++++++++++++++++
dev/archery/archery/tests/test_testing.py | 62 +++
.../{setup.py => archery/utils/__init__.py} | 25 -
dev/archery/archery/utils/cache.py | 2 +-
dev/archery/archery/utils/cmake.py | 14 +-
dev/archery/archery/utils/command.py | 4 +-
dev/archery/archery/utils/git.py | 10 +-
dev/archery/archery/utils/lint.py | 2 +-
dev/archery/archery/utils/rat.py | 8 +-
dev/archery/archery/utils/source.py | 77 ++--
dev/archery/requirements.txt | 3 +-
dev/archery/setup.py | 23 +-
dev/tasks/crossbow.py | 6 +-
dev/tasks/docker-tests/azure.linux.yml | 38 +-
dev/tasks/docker-tests/circle.linux.yml | 55 +--
dev/tasks/docker-tests/github.linux.yml | 40 +-
dev/tasks/tasks.yml | 506 +++++----------------
docker-compose.yml | 71 ++-
docs/source/developers/cpp/development.rst | 3 +-
docs/source/developers/crossbow.rst | 35 +-
docs/source/developers/docker.rst | 232 ++++++++++
docs/source/developers/documentation.rst | 8 +-
docs/source/developers/integration.rst | 26 +-
docs/source/example.gz | Bin 0 -> 41 bytes
docs/source/index.rst | 1 +
52 files changed, 1881 insertions(+), 1505 deletions(-)
diff --git a/.github/workflows/archery.yml b/.github/workflows/archery.yml
index 34997e3..4eda469 100644
--- a/.github/workflows/archery.yml
+++ b/.github/workflows/archery.yml
@@ -23,11 +23,13 @@ on:
- '.github/workflows/archery.yml'
- 'dev/archery/**'
- 'dev/tasks/**'
+ - 'docker-compose.yml'
pull_request:
paths:
- '.github/workflows/archery.yml'
- 'dev/archery/**'
- 'dev/tasks/**'
+ - 'docker-compose.yml'
jobs:
@@ -48,13 +50,15 @@ jobs:
- name: Setup Python
uses: actions/setup-python@v1
with:
- python-version: '3.7'
- - name: Install
+ python-version: '3.5'
+ - name: Install Archery, Crossbow, and Test Dependencies
working-directory: dev/archery
- run: pip install pytest responses ruamel.yaml toolz jinja2 -e .
+ run: pip install pytest responses toolz jinja2 -e .[all]
- name: Archery Unittests
working-directory: dev/archery
run: pytest -v archery
+ - name: Archery Docker Validation
+ run: archery docker
- name: Crossbow Check Config
working-directory: dev/tasks
run: python crossbow.py check-config
diff --git a/.github/workflows/comment_bot.yml b/.github/workflows/comment_bot.yml
index 838c3c9..340d6bc 100644
--- a/.github/workflows/comment_bot.yml
+++ b/.github/workflows/comment_bot.yml
@@ -44,7 +44,7 @@ jobs:
- name: Install Archery and Crossbow dependencies
run: |
conda install -y --file arrow/ci/conda_env_crossbow.txt pygithub
- pip install -e arrow/dev/archery
+ pip install -e arrow/dev/archery[bot]
- name: Handle Github comment event
env:
ARROW_GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
diff --git a/.github/workflows/cpp.yml b/.github/workflows/cpp.yml
index 34ac5c5..2c39aac 100644
--- a/.github/workflows/cpp.yml
+++ b/.github/workflows/cpp.yml
@@ -41,83 +41,50 @@ env:
DOCKER_BUILDKIT: 0
COMPOSE_DOCKER_CLI_BUILD: 1
ARROW_ENABLE_TIMING_TESTS: OFF
+ ARCHERY_DOCKER_USER: ${{ secrets.DOCKERHUB_USER }}
+ ARCHERY_DOCKER_PASSWORD: ${{ secrets.DOCKERHUB_TOKEN }}
jobs:
- conda:
- name: AMD64 Conda C++
- runs-on: ubuntu-latest
- if: ${{ !contains(github.event.pull_request.title, 'WIP') }}
- steps:
- - name: Checkout Arrow
- uses: actions/checkout@v2
- with:
- fetch-depth: 0
- - name: Fetch Submodules and Tags
- shell: bash
- run: ci/scripts/util_checkout.sh
- - name: Free Up Disk Space
- shell: bash
- run: ci/scripts/util_cleanup.sh
- - name: Docker Pull
- shell: bash
- run: docker-compose pull --ignore-pull-failures conda-cpp
- - name: Docker Build
- shell: bash
- run: docker-compose build conda-cpp
- - name: Docker Run
- shell: bash
- run: |
- sudo sysctl -w kernel.core_pattern="core.%e.%p"
- ulimit -c unlimited
- docker-compose run conda-cpp
- - name: Docker Push
- if: success() && github.event_name == 'push' && github.repository == 'apache/arrow'
- continue-on-error: true
- shell: bash
- run: |
- docker login -u ${{ secrets.DOCKERHUB_USER }} \
- -p ${{ secrets.DOCKERHUB_TOKEN }}
- docker-compose push conda-cpp
-
- ubuntu-sanitizer:
- name: AMD64 Ubuntu ${{ matrix.ubuntu }} C++ ASAN UBSAN
+ docker:
+ name: ${{ matrix.title }}
runs-on: ubuntu-latest
if: ${{ !contains(github.event.pull_request.title, 'WIP') }}
strategy:
fail-fast: false
matrix:
- ubuntu: [18.04]
- env:
- UBUNTU: ${{ matrix.ubuntu }}
+ image:
+ - conda-cpp
+ - ubuntu-cpp-sanitizer
+ include:
+ - image: conda-cpp
+ title: AMD64 Conda C++
+ - image: ubuntu-cpp-sanitizer
+ title: AMD64 Ubuntu 18.04 C++ ASAN UBSAN
steps:
- name: Checkout Arrow
uses: actions/checkout@v2
with:
fetch-depth: 0
- name: Fetch Submodules and Tags
- shell: bash
run: ci/scripts/util_checkout.sh
- name: Free Up Disk Space
- shell: bash
run: ci/scripts/util_cleanup.sh
- - name: Docker Pull
- shell: bash
- run: docker-compose pull --ignore-pull-failures ubuntu-cpp-sanitizer
- - name: Docker Build
- shell: bash
- run: docker-compose build ubuntu-cpp-sanitizer
- - name: Docker Run
- shell: bash
- run: docker-compose run ubuntu-cpp-sanitizer
+ - name: Setup Python
+ uses: actions/setup-python@v1
+ with:
+ python-version: 3.8
+ - name: Setup Archery
+ run: pip install -e dev/archery[docker]
+ - name: Execute Docker Build
+ run: |
+ sudo sysctl -w kernel.core_pattern="core.%e.%p"
+ ulimit -c unlimited
+ archery docker run ${{ matrix.image }}
- name: Docker Push
if: success() && github.event_name == 'push' && github.repository == 'apache/arrow'
continue-on-error: true
- shell: bash
- run: |
- docker login -u ${{ secrets.DOCKERHUB_USER }} \
- -p ${{ secrets.DOCKERHUB_TOKEN }}
- docker-compose push ubuntu-cpp-sanitizer
+ run: archery docker push ${{ matrix.image }}
macos:
name: AMD64 MacOS 10.15 C++
diff --git a/.github/workflows/cpp_cron.yml b/.github/workflows/cpp_cron.yml
index bdfed05..f4a7a86 100644
--- a/.github/workflows/cpp_cron.yml
+++ b/.github/workflows/cpp_cron.yml
@@ -36,170 +36,69 @@ env:
jobs:
- debian:
- name: AMD64 Debian ${{ matrix.debian }} C++
+ docker:
+ name: ${{ matrix.title }}
runs-on: ubuntu-latest
if: ${{ !contains(github.event.pull_request.title, 'WIP') && github.repository == 'apache/arrow' }}
strategy:
fail-fast: false
matrix:
- debian: [10]
+ name:
+ - debian-10-cpp
+ - fedora-30-cpp
+ - ubuntu-16.04-cpp
+ - ubuntu-18.04-cpp
+ - ubuntu-18.04-cpp-cmake32
+ include:
+ - name: debian-10-cpp
+ image: debian-cpp
+ title: AMD64 Debian 10 C++
+ debian: 10
+ - name: fedora-30-cpp
+ image: fedora-cpp
+ title: AMD64 Fedora 30 C++
+ fedora: 30
+ - name: ubuntu-16.04-cpp
+ image: ubuntu-cpp
+ title: AMD64 Ubuntu 16.04 C++
+ ubuntu: 16.04
+ - name: ubuntu-18.04-cpp
+ image: ubuntu-cpp
+ title: AMD64 Ubuntu 18.04 C++
+ ubuntu: 18.04
+ - name: ubuntu-18.04-cpp-cmake32
+ image: ubuntu-cpp-cmake32
+ title: AMD64 Ubuntu 18.04 C++ CMake 3.2
+ ubuntu: 18.04
env:
- DEBIAN: ${{ matrix.debian }}
+ # the defaults here should correspond to the values in .env
+ DEBIAN: ${{ matrix.debian || 10 }}
+ FEDORA: ${{ matrix.fedora || 30 }}
+ UBUNTU: ${{ matrix.ubuntu || 18.04 }}
steps:
- name: Checkout Arrow
uses: actions/checkout@v2
with:
fetch-depth: 0
- name: Fetch Submodules and Tags
- shell: bash
run: ci/scripts/util_checkout.sh
- name: Free Up Disk Space
- shell: bash
run: ci/scripts/util_cleanup.sh
- - name: Docker Pull
- shell: bash
- run: docker-compose pull --ignore-pull-failures debian-cpp
- - name: Docker Build
- shell: bash
- run: docker-compose build debian-cpp
- - name: Docker Run
- shell: bash
- run: |
- sudo sysctl -w kernel.core_pattern="core.%e.%p"
- ulimit -c unlimited
- docker-compose run debian-cpp
- - name: Docker Push
- if: success() && github.repository == 'apache/arrow'
- continue-on-error: true
- shell: bash
- run: |
- docker login -u ${{ secrets.DOCKERHUB_USER }} \
- -p ${{ secrets.DOCKERHUB_TOKEN }}
- docker-compose push debian-cpp
-
- fedora:
- name: AMD64 Fedora ${{ matrix.fedora }} C++
- runs-on: ubuntu-latest
- if: ${{ !contains(github.event.pull_request.title, 'WIP') && github.repository == 'apache/arrow' }}
- strategy:
- fail-fast: false
- matrix:
- fedora: [30]
- env:
- FEDORA: ${{ matrix.fedora }}
- steps:
- - name: Checkout Arrow
- uses: actions/checkout@v2
+ - name: Setup Python
+ uses: actions/setup-python@v1
with:
- fetch-depth: 0
- - name: Fetch Submodules and Tags
- shell: bash
- run: ci/scripts/util_checkout.sh
- - name: Free Up Disk Space
- shell: bash
- run: ci/scripts/util_cleanup.sh
- - name: Docker Pull
- shell: bash
- run: docker-compose pull --ignore-pull-failures fedora-cpp
- - name: Docker Build
- shell: bash
- run: docker-compose build fedora-cpp
- - name: Docker Run
- shell: bash
+ python-version: 3.8
+ - name: Setup Archery
+ run: pip install -e dev/archery[docker]
+ - name: Execute Docker Build
run: |
sudo sysctl -w kernel.core_pattern="core.%e.%p"
ulimit -c unlimited
- docker-compose run fedora-cpp
+ archery docker run ${{ matrix.image }}
- name: Docker Push
- if: success() && github.repository == 'apache/arrow'
+ if: success() && github.event_name == 'push' && github.repository == 'apache/arrow'
continue-on-error: true
- shell: bash
- run: |
- docker login -u ${{ secrets.DOCKERHUB_USER }} \
- -p ${{ secrets.DOCKERHUB_TOKEN }}
- docker-compose push fedora-cpp
-
- ubuntu:
- name: AMD64 Ubuntu ${{ matrix.ubuntu }} C++
- runs-on: ubuntu-latest
- if: ${{ !contains(github.event.pull_request.title, 'WIP') && github.repository == 'apache/arrow' }}
- strategy:
- fail-fast: false
- matrix:
- ubuntu: [16.04, 18.04]
- env:
- UBUNTU: ${{ matrix.ubuntu }}
- steps:
- - name: Checkout Arrow
- uses: actions/checkout@v2
- with:
- fetch-depth: 0
- - name: Fetch Submodules and Tags
- shell: bash
- run: ci/scripts/util_checkout.sh
- - name: Free Up Disk Space
- shell: bash
- run: ci/scripts/util_cleanup.sh
- - name: Docker Pull
- shell: bash
- run: docker-compose pull --ignore-pull-failures ubuntu-cpp
- - name: Docker Build
- shell: bash
- run: docker-compose build ubuntu-cpp
- - name: Docker Run
- shell: bash
- run: |
- sudo sysctl -w kernel.core_pattern="core.%e.%p"
- ulimit -c unlimited
- docker-compose run ubuntu-cpp
- # No push step, it's the same image as ubuntu-cpp-sanitizer
-
- ubuntu-cmake32:
- name: AMD64 Ubuntu 18.04 C++ CMake 3.2
- runs-on: ubuntu-latest
- if: ${{ !contains(github.event.pull_request.title, 'WIP') && github.repository == 'apache/arrow' }}
- strategy:
- fail-fast: false
- matrix:
- ubuntu: [18.04]
- env:
- UBUNTU: ${{ matrix.ubuntu }}
- steps:
- - name: Checkout Arrow
- uses: actions/checkout@v2
- with:
- fetch-depth: 0
- - name: Fetch Submodules and Tags
- shell: bash
- run: ci/scripts/util_checkout.sh
- - name: Free Up Disk Space
- shell: bash
- run: ci/scripts/util_cleanup.sh
- - name: Docker Pull
- shell: bash
- run: |
- docker-compose pull --ignore-pull-failures ubuntu-cpp
- docker-compose pull --ignore-pull-failures ubuntu-cpp-cmake32
- - name: Docker Build
- shell: bash
- run: |
- docker-compose build ubuntu-cpp
- docker-compose build ubuntu-cpp-cmake32
- - name: Docker Run
- shell: bash
- run: |
- sudo sysctl -w kernel.core_pattern="core.%e.%p"
- ulimit -c unlimited
- docker-compose run ubuntu-cpp-cmake32
- - name: Docker Push
- if: success() && github.repository == 'apache/arrow'
- continue-on-error: true
- shell: bash
- run: |
- docker login -u ${{ secrets.DOCKERHUB_USER }} \
- -p ${{ secrets.DOCKERHUB_TOKEN }}
- docker-compose push ubuntu-cpp-cmake32
+ run: archery docker push ${{ matrix.image }}
oss-fuzz:
name: OSS-Fuzz build check
diff --git a/.github/workflows/dev.yml b/.github/workflows/dev.yml
index 6eb92b8..1100dda 100644
--- a/.github/workflows/dev.yml
+++ b/.github/workflows/dev.yml
@@ -25,6 +25,8 @@ on:
env:
DOCKER_BUILDKIT: 0
COMPOSE_DOCKER_CLI_BUILD: 1
+ ARCHERY_DOCKER_USER: ${{ secrets.DOCKERHUB_USER }}
+ ARCHERY_DOCKER_PASSWORD: ${{ secrets.DOCKERHUB_TOKEN }}
jobs:
@@ -38,29 +40,55 @@ jobs:
with:
fetch-depth: 0
- name: Fetch Submodules and Tags
- shell: bash
run: ci/scripts/util_checkout.sh
- name: Free Up Disk Space
- shell: bash
run: ci/scripts/util_cleanup.sh
- - name: Docker Pull
- run: |
- docker-compose pull --ignore-pull-failures ubuntu-cpp
- docker-compose pull --ignore-pull-failures ubuntu-lint
- - name: Docker Build
+ - name: Setup Python
+ uses: actions/setup-python@v1
+ with:
+ python-version: 3.8
+ - name: Setup Archery
+ run: pip install -e dev/archery[docker]
+ - name: Execute Docker Build
run: |
- docker-compose build ubuntu-cpp
- docker-compose build ubuntu-lint
- - name: Docker Run
- run: docker-compose run ubuntu-lint
+ sudo sysctl -w kernel.core_pattern="core.%e.%p"
+ ulimit -c unlimited
+ archery docker run ubuntu-lint
- name: Docker Push
if: success() && github.event_name == 'push' && github.repository == 'apache/arrow'
continue-on-error: true
+ run: archery docker push ubuntu-lint
+
+ docs:
+ name: Sphinx and API documentation
+ runs-on: ubuntu-latest
+ if: github.event_name == 'push'
+ steps:
+ - name: Checkout Arrow
+ uses: actions/checkout@v2
+ with:
+ fetch-depth: 0
+ - name: Fetch Submodules and Tags
+ shell: bash
+ run: ci/scripts/util_checkout.sh
+ - name: Free Up Disk Space
shell: bash
+ run: ci/scripts/util_cleanup.sh
+ - name: Setup Python
+ uses: actions/setup-python@v1
+ with:
+ python-version: 3.8
+ - name: Setup Archery
+ run: pip install -e dev/archery[docker]
+ - name: Execute Docker Build
run: |
- docker login -u ${{ secrets.DOCKERHUB_USER }} \
- -p ${{ secrets.DOCKERHUB_TOKEN }}
- docker-compose push ubuntu-lint
+ sudo sysctl -w kernel.core_pattern="core.%e.%p"
+ ulimit -c unlimited
+ archery docker run ubuntu-docs
+ - name: Docker Push
+ if: success() && github.event_name == 'push' && github.repository == 'apache/arrow'
+ continue-on-error: true
+ run: archery docker push ubuntu-docs
release:
name: Source Release and Merge Script
@@ -104,39 +132,3 @@ jobs:
shell: bash
run: |
pytest -v dev/test_merge_arrow_pr.py
-
- docs:
- name: Sphinx and API documentations
- runs-on: ubuntu-latest
- if: github.event_name == 'push'
- steps:
- - name: Checkout Arrow
- uses: actions/checkout@v2
- with:
- fetch-depth: 0
- - name: Fetch Submodules and Tags
- shell: bash
- run: ci/scripts/util_checkout.sh
- - name: Free Up Disk Space
- shell: bash
- run: ci/scripts/util_cleanup.sh
- - name: Docker Pull
- run: |
- docker-compose pull --ignore-pull-failures ubuntu-cpp
- docker-compose pull --ignore-pull-failures ubuntu-python
- docker-compose pull --ignore-pull-failures ubuntu-docs
- - name: Docker Build
- run: |
- docker-compose build ubuntu-cpp
- docker-compose build ubuntu-python
- docker-compose build ubuntu-docs
- - name: Docker Run
- run: docker-compose run ubuntu-docs
- - name: Docker Push
- if: success() && github.event_name == 'push' && github.repository == 'apache/arrow'
- continue-on-error: true
- shell: bash
- run: |
- docker login -u ${{ secrets.DOCKERHUB_USER }} \
- -p ${{ secrets.DOCKERHUB_TOKEN }}
- docker-compose push ubuntu-docs
diff --git a/.github/workflows/go.yml b/.github/workflows/go.yml
index 5f9d6df..94bf519 100644
--- a/.github/workflows/go.yml
+++ b/.github/workflows/go.yml
@@ -35,10 +35,12 @@ on:
env:
DOCKER_BUILDKIT: 0
COMPOSE_DOCKER_CLI_BUILD: 1
+ ARCHERY_DOCKER_USER: ${{ secrets.DOCKERHUB_USER }}
+ ARCHERY_DOCKER_PASSWORD: ${{ secrets.DOCKERHUB_TOKEN }}
jobs:
- debian:
+ docker:
name: AMD64 Debian 10 Go ${{ matrix.go }}
runs-on: ubuntu-latest
if: ${{ !contains(github.event.pull_request.title, 'WIP') }}
@@ -54,25 +56,21 @@ jobs:
with:
fetch-depth: 0
- name: Fetch Submodules and Tags
- shell: bash
run: ci/scripts/util_checkout.sh
- name: Free Up Disk Space
- shell: bash
run: ci/scripts/util_cleanup.sh
- - name: Docker Pull
- run: docker-compose pull --ignore-pull-failures debian-go
- - name: Docker Build
- run: docker-compose build debian-go
- - name: Docker Run
- run: docker-compose run debian-go
+ - name: Setup Python
+ uses: actions/setup-python@v1
+ with:
+ python-version: 3.8
+ - name: Setup Archery
+ run: pip install -e dev/archery[docker]
+ - name: Execute Docker Build
+ run: archery docker run debian-go
- name: Docker Push
if: success() && github.event_name == 'push' && github.repository == 'apache/arrow'
continue-on-error: true
- shell: bash
- run: |
- docker login -u ${{ secrets.DOCKERHUB_USER }} \
- -p ${{ secrets.DOCKERHUB_TOKEN }}
- docker-compose push debian-go
+ run: archery docker push debian-go
windows:
name: AMD64 Windows 2019 Go ${{ matrix.go }}
diff --git a/.github/workflows/integration.yml b/.github/workflows/integration.yml
index b0d01d7..e82b06f 100644
--- a/.github/workflows/integration.yml
+++ b/.github/workflows/integration.yml
@@ -44,41 +44,33 @@ on:
env:
DOCKER_BUILDKIT: 0
COMPOSE_DOCKER_CLI_BUILD: 1
+ ARCHERY_DOCKER_USER: ${{ secrets.DOCKERHUB_USER }}
+ ARCHERY_DOCKER_PASSWORD: ${{ secrets.DOCKERHUB_TOKEN }}
jobs:
- conda-integration:
+ docker:
name: AMD64 Conda Integration Test
runs-on: ubuntu-latest
if: ${{ !contains(github.event.pull_request.title, 'WIP') }}
- env:
- MAVEN: 3.5
steps:
- name: Checkout Arrow
uses: actions/checkout@v2
with:
fetch-depth: 0
- name: Fetch Submodules and Tags
- shell: bash
run: ci/scripts/util_checkout.sh
- name: Free Up Disk Space
- shell: bash
run: ci/scripts/util_cleanup.sh
- - name: Docker Pull
- run: |
- docker-compose pull --ignore-pull-failures conda-cpp
- docker-compose pull --ignore-pull-failures conda-integration
- - name: Docker Build
- run: |
- docker-compose build conda-cpp
- docker-compose build conda-integration
- - name: Docker Run
- run: docker-compose run conda-integration
+ - name: Setup Python
+ uses: actions/setup-python@v1
+ with:
+ python-version: 3.8
+ - name: Setup Archery
+ run: pip install -e dev/archery[docker]
+ - name: Execute Docker Build
+ run: archery docker run conda-integration
- name: Docker Push
if: success() && github.event_name == 'push' && github.repository == 'apache/arrow'
continue-on-error: true
- shell: bash
- run: |
- docker login -u ${{ secrets.DOCKERHUB_USER }} \
- -p ${{ secrets.DOCKERHUB_TOKEN }}
- docker-compose push conda-integration
+ run: archery docker push conda-integration
diff --git a/.github/workflows/java.yml b/.github/workflows/java.yml
index 6869283..aa038d9 100644
--- a/.github/workflows/java.yml
+++ b/.github/workflows/java.yml
@@ -38,6 +38,8 @@ on:
env:
DOCKER_BUILDKIT: 0
COMPOSE_DOCKER_CLI_BUILD: 1
+ ARCHERY_DOCKER_USER: ${{ secrets.DOCKERHUB_USER }}
+ ARCHERY_DOCKER_PASSWORD: ${{ secrets.DOCKERHUB_TOKEN }}
jobs:
@@ -64,20 +66,18 @@ jobs:
- name: Free Up Disk Space
shell: bash
run: ci/scripts/util_cleanup.sh
- - name: Docker Pull
- run: docker-compose pull --ignore-pull-failures debian-java
- - name: Docker Build
- run: docker-compose build debian-java
- - name: Docker Run
- run: docker-compose run debian-java
+ - name: Setup Python
+ uses: actions/setup-python@v1
+ with:
+ python-version: 3.8
+ - name: Setup Archery
+ run: pip install -e dev/archery[docker]
+ - name: Execute Docker Build
+ run: archery docker run debian-java
- name: Docker Push
if: success() && github.event_name == 'push' && github.repository == 'apache/arrow'
continue-on-error: true
- shell: bash
- run: |
- docker login -u ${{ secrets.DOCKERHUB_USER }} \
- -p ${{ secrets.DOCKERHUB_TOKEN }}
- docker-compose push debian-java
+ run: archery docker push debian-java
macos:
name: AMD64 MacOS 10.15 Java JDK ${{ matrix.jdk }}
diff --git a/.github/workflows/java_jni.yml b/.github/workflows/java_jni.yml
index bdb2bf0..90b5172 100644
--- a/.github/workflows/java_jni.yml
+++ b/.github/workflows/java_jni.yml
@@ -38,11 +38,13 @@ on:
env:
DOCKER_BUILDKIT: 0
COMPOSE_DOCKER_CLI_BUILD: 1
+ ARCHERY_DOCKER_USER: ${{ secrets.DOCKERHUB_USER }}
+ ARCHERY_DOCKER_PASSWORD: ${{ secrets.DOCKERHUB_TOKEN }}
jobs:
- debian:
- name: AMD64 Debian 9 Java JNI and Plasma
+ docker:
+ name: AMD64 Debian 9 Java JNI (Gandiva, Plasma, ORC)
runs-on: ubuntu-latest
if: ${{ !contains(github.event.pull_request.title, 'WIP') }}
strategy:
@@ -59,27 +61,18 @@ jobs:
with:
fetch-depth: 0
- name: Fetch Submodules and Tags
- shell: bash
run: ci/scripts/util_checkout.sh
- name: Free Up Disk Space
- shell: bash
run: ci/scripts/util_cleanup.sh
- - name: Docker Pull
- run: |
- docker-compose pull --ignore-pull-failures debian-java
- docker-compose pull --ignore-pull-failures debian-java-jni
- - name: Docker Build
- run: |
- docker-compose build debian-java
- docker-compose build debian-java-jni
- - name: Docker Run
- run: |
- docker-compose run debian-java-jni
+ - name: Setup Python
+ uses: actions/setup-python@v1
+ with:
+ python-version: 3.8
+ - name: Setup Archery
+ run: pip install -e dev/archery[docker]
+ - name: Execute Docker Build
+ run: archery docker run debian-java-jni
- name: Docker Push
if: success() && github.event_name == 'push' && github.repository == 'apache/arrow'
continue-on-error: true
- shell: bash
- run: |
- docker login -u ${{ secrets.DOCKERHUB_USER }} \
- -p ${{ secrets.DOCKERHUB_TOKEN }}
- docker-compose push debian-java-jni
+ run: archery docker push debian-java-jni
diff --git a/.github/workflows/js.yml b/.github/workflows/js.yml
index fc749fa..383326e 100644
--- a/.github/workflows/js.yml
+++ b/.github/workflows/js.yml
@@ -34,44 +34,39 @@ on:
env:
DOCKER_BUILDKIT: 0
COMPOSE_DOCKER_CLI_BUILD: 1
+ ARCHERY_DOCKER_USER: ${{ secrets.DOCKERHUB_USER }}
+ ARCHERY_DOCKER_PASSWORD: ${{ secrets.DOCKERHUB_TOKEN }}
jobs:
- debian:
- name: AMD64 Debian 10 NodeJS ${{ matrix.node }}
+ docker:
+ name: AMD64 Debian 10 NodeJS 11
runs-on: ubuntu-latest
if: ${{ !contains(github.event.pull_request.title, 'WIP') }}
- strategy:
- fail-fast: false
- matrix:
- node: [11]
- env:
- NODE: ${{ matrix.node }}
steps:
- name: Checkout Arrow
uses: actions/checkout@v2
with:
fetch-depth: 0
- name: Fetch Submodules and Tags
- shell: bash
run: ci/scripts/util_checkout.sh
- name: Free Up Disk Space
- shell: bash
run: ci/scripts/util_cleanup.sh
- - name: Docker Pull
- run: docker-compose pull --ignore-pull-failures debian-js
- - name: Docker Build
- run: docker-compose build debian-js
- - name: Docker Run
- run: docker-compose run debian-js
+ - name: Setup Python
+ uses: actions/setup-python@v1
+ with:
+ python-version: 3.8
+ - name: Setup Archery
+ run: pip install -e dev/archery[docker]
+ - name: Execute Docker Build
+ run: |
+ sudo sysctl -w kernel.core_pattern="core.%e.%p"
+ ulimit -c unlimited
+ archery docker run debian-js
- name: Docker Push
if: success() && github.event_name == 'push' && github.repository == 'apache/arrow'
continue-on-error: true
- shell: bash
- run: |
- docker login -u ${{ secrets.DOCKERHUB_USER }} \
- -p ${{ secrets.DOCKERHUB_TOKEN }}
- docker-compose push debian-js
+ run: archery docker push debian-js
macos:
name: AMD64 MacOS 10.15 NodeJS ${{ matrix.node }}
diff --git a/.github/workflows/python.yml b/.github/workflows/python.yml
index f1a4ff3..9817770 100644
--- a/.github/workflows/python.yml
+++ b/.github/workflows/python.yml
@@ -34,103 +34,76 @@ on:
env:
DOCKER_BUILDKIT: 0
COMPOSE_DOCKER_CLI_BUILD: 1
+ ARCHERY_DOCKER_USER: ${{ secrets.DOCKERHUB_USER }}
+ ARCHERY_DOCKER_PASSWORD: ${{ secrets.DOCKERHUB_TOKEN }}
jobs:
- ubuntu:
- name: AMD64 Ubuntu 16.04 Python 3.5
- runs-on: ubuntu-latest
- if: ${{ !contains(github.event.pull_request.title, 'WIP') }}
+ docker:
+ name: ${{ matrix.title }}
+ runs-on: ubuntu-latest
+ if: ${{ !contains(github.event.pull_request.title, 'WIP') }}
+ strategy:
+ fail-fast: false
+ matrix:
+ name:
+ - ubuntu-16.04-python-3
+ - conda-python-3.8-nopandas
+ - conda-python-3.6-pandas-0.23
+ - conda-python-3.6-pandas-latest
+ include:
+ - name: ubuntu-16.04-python-3
+ image: ubuntu-python
+ # this image always builds with python 3.5
+ title: AMD64 Ubuntu 16.04 Python 3.5
+ ubuntu: 16.04
+ - name: conda-python-3.8-nopandas
+ image: conda-python
+ title: AMD64 Conda Python 3.8 Without Pandas
+ python: 3.8
+ - name: conda-python-3.6-pandas-0.23
+ image: conda-python-pandas
+ title: AMD64 Conda Python 3.6 Pandas 0.23
+ python: 3.6
+ pandas: 0.23
+ - name: conda-python-3.6-pandas-latest
+ image: conda-python-pandas
+ title: AMD64 Conda Python 3.6 Pandas latest
+ python: 3.6
+ pandas: latest
env:
- UBUNTU: 16.04
+ PYTHON: ${{ matrix.python || 3.7 }}
+ UBUNTU: ${{ matrix.ubuntu || 18.04 }}
+ PANDAS: ${{ matrix.pandas || 'latest' }}
steps:
- name: Checkout Arrow
uses: actions/checkout@v2
with:
fetch-depth: 0
- name: Fetch Submodules and Tags
- shell: bash
run: ci/scripts/util_checkout.sh
- name: Free Up Disk Space
- shell: bash
run: ci/scripts/util_cleanup.sh
- - name: Docker Pull
- shell: bash
- run: |
- docker-compose pull --ignore-pull-failures ubuntu-cpp
- docker-compose pull --ignore-pull-failures ubuntu-python
- - name: Docker Build
- shell: bash
- run: |
- docker-compose build ubuntu-cpp
- docker-compose build ubuntu-python
- - name: Docker Run
- shell: bash
- run: docker-compose run ubuntu-python
- - name: Docker Push
- if: success() && github.event_name == 'push' && github.repository == 'apache/arrow'
- continue-on-error: true
- shell: bash
- run: |
- docker login -u ${{ secrets.DOCKERHUB_USER }} \
- -p ${{ secrets.DOCKERHUB_TOKEN }}
- docker-compose push ubuntu-python
-
- conda-nopandas:
- name: AMD64 Conda Python ${{ matrix.python }} Without Pandas
- runs-on: ubuntu-latest
- if: ${{ !contains(github.event.pull_request.title, 'WIP') }}
- strategy:
- fail-fast: false
- matrix:
- python: [3.8]
- env:
- PYTHON: ${{ matrix.python }}
- steps:
- - name: Checkout Arrow
- uses: actions/checkout@v2
+ - name: Setup Python
+ uses: actions/setup-python@v1
with:
- fetch-depth: 0
- - name: Fetch Submodules and Tags
- shell: bash
- run: ci/scripts/util_checkout.sh
- - name: Free Up Disk Space
- shell: bash
- run: ci/scripts/util_cleanup.sh
- - name: Docker Pull
- shell: bash
- run: |
- docker-compose pull --ignore-pull-failures conda-cpp
- docker-compose pull --ignore-pull-failures conda-python
- - name: Docker Build
- shell: bash
+ python-version: 3.8
+ - name: Setup Archery
+ run: pip install -e dev/archery[docker]
+ - name: Execute Docker Build
run: |
- docker-compose build conda-cpp
- docker-compose build conda-python
- - name: Docker Run
- shell: bash
- run: docker-compose run conda-python
+ sudo sysctl -w kernel.core_pattern="core.%e.%p"
+ ulimit -c unlimited
+ archery docker run ${{ matrix.image }}
- name: Docker Push
if: success() && github.event_name == 'push' && github.repository == 'apache/arrow'
continue-on-error: true
- shell: bash
- run: |
- docker login -u ${{ secrets.DOCKERHUB_USER }} \
- -p ${{ secrets.DOCKERHUB_TOKEN }}
- docker-compose push conda-python
+ run: archery docker push ${{ matrix.image }}
- conda-pandas:
- name: AMD64 Conda Python ${{ matrix.python }} Pandas ${{ matrix.pandas }}
+ manylinux1:
+ name: AMD64 CentOS 5.11 Python 3.6 manylinux1
runs-on: ubuntu-latest
if: ${{ !contains(github.event.pull_request.title, 'WIP') }}
- strategy:
- fail-fast: false
- matrix:
- python: [3.6]
- pandas: ["latest", "0.23"]
- env:
- PYTHON: ${{ matrix.python }}
- PANDAS: ${{ matrix.pandas }}
steps:
- name: Checkout Arrow
uses: actions/checkout@v2
@@ -142,30 +115,17 @@ jobs:
- name: Free Up Disk Space
shell: bash
run: ci/scripts/util_cleanup.sh
- - name: Docker Pull
- shell: bash
- run: |
- docker-compose pull --ignore-pull-failures conda-cpp
- docker-compose pull --ignore-pull-failures conda-python
- docker-compose pull --ignore-pull-failures conda-python-pandas
- - name: Docker Build
- shell: bash
- run: |
- docker-compose build conda-cpp
- docker-compose build conda-python
- docker-compose build conda-python-pandas
- - name: Docker Run
- shell: bash
- run: |
- docker-compose run conda-python-pandas
- - name: Docker Push
- if: success() && github.event_name == 'push' && github.repository == 'apache/arrow'
- continue-on-error: true
- shell: bash
+ - name: Setup Python
+ uses: actions/setup-python@v1
+ with:
+ python-version: 3.8
+ - name: Setup Archery
+ run: pip install -e dev/archery[docker]
+ - name: Execute Docker Build
run: |
- docker login -u ${{ secrets.DOCKERHUB_USER }} \
- -p ${{ secrets.DOCKERHUB_TOKEN }}
- docker-compose push conda-python-pandas
+ sudo sysctl -w kernel.core_pattern="core.%e.%p"
+ ulimit -c unlimited
+ archery docker run --no-build ${{ matrix.image }}
macos:
name: AMD64 MacOS 10.15 Python 3.7
@@ -212,28 +172,3 @@ jobs:
- name: Test
shell: bash
run: ci/scripts/python_test.sh $(pwd) $(pwd)/build
-
- manylinux1:
- name: AMD64 CentOS 5.11 Python 3.6 manylinux1
- runs-on: ubuntu-latest
- if: ${{ !contains(github.event.pull_request.title, 'WIP') }}
- steps:
- - name: Checkout Arrow
- uses: actions/checkout@v2
- with:
- fetch-depth: 0
- - name: Fetch Submodules and Tags
- shell: bash
- run: ci/scripts/util_checkout.sh
- - name: Free Up Disk Space
- shell: bash
- run: ci/scripts/util_cleanup.sh
- - name: Docker Pull
- shell: bash
- run: docker-compose pull centos-python-manylinux1
- - name: Docker Run
- shell: bash
- run: |
- sudo sysctl -w kernel.core_pattern="core.%e.%p"
- ulimit -c unlimited
- docker-compose run centos-python-manylinux1
diff --git a/.github/workflows/python_cron.yml b/.github/workflows/python_cron.yml
index 10f709f..db6c828 100644
--- a/.github/workflows/python_cron.yml
+++ b/.github/workflows/python_cron.yml
@@ -31,180 +31,71 @@ on:
env:
DOCKER_BUILDKIT: 0
COMPOSE_DOCKER_CLI_BUILD: 1
+ ARCHERY_DOCKER_USER: ${{ secrets.DOCKERHUB_USER }}
+ ARCHERY_DOCKER_PASSWORD: ${{ secrets.DOCKERHUB_TOKEN }}
jobs:
- debian:
- name: AMD64 Debian ${{ matrix.debian }} Python 3
- runs-on: ubuntu-latest
- if: ${{ !contains(github.event.pull_request.title, 'WIP') && github.repository == 'apache/arrow' }}
- strategy:
- fail-fast: false
- matrix:
- debian: [10]
- env:
- DEBIAN: ${{ matrix.debian }}
- steps:
- - name: Checkout Arrow
- uses: actions/checkout@v2
- with:
- fetch-depth: 0
- - name: Fetch Submodules and Tags
- shell: bash
- run: ci/scripts/util_checkout.sh
- - name: Free Up Disk Space
- shell: bash
- run: ci/scripts/util_cleanup.sh
- - name: Docker Pull
- shell: bash
- run: |
- docker-compose pull --ignore-pull-failures debian-cpp
- docker-compose pull --ignore-pull-failures debian-python
- - name: Docker Build
- shell: bash
- run: |
- docker-compose build debian-cpp
- docker-compose build debian-python
- - name: Docker Run
- shell: bash
- run: docker-compose run debian-python
- - name: Docker Push
- if: success() && github.repository == 'apache/arrow'
- continue-on-error: true
- shell: bash
- run: |
- docker login -u ${{ secrets.DOCKERHUB_USER }} \
- -p ${{ secrets.DOCKERHUB_TOKEN }}
- docker-compose push debian-python
-
- ubuntu:
- name: AMD64 Ubuntu ${{ matrix.ubuntu }} Python 3
- runs-on: ubuntu-latest
- if: ${{ !contains(github.event.pull_request.title, 'WIP') && github.repository == 'apache/arrow' }}
- strategy:
- fail-fast: false
- matrix:
- ubuntu: [18.04]
- env:
- UBUNTU: ${{ matrix.ubuntu }}
- steps:
- - name: Checkout Arrow
- uses: actions/checkout@v2
- with:
- fetch-depth: 0
- - name: Fetch Submodules and Tags
- shell: bash
- run: ci/scripts/util_checkout.sh
- - name: Free Up Disk Space
- shell: bash
- run: ci/scripts/util_cleanup.sh
- - name: Docker Pull
- shell: bash
- run: |
- docker-compose pull --ignore-pull-failures ubuntu-cpp
- docker-compose pull --ignore-pull-failures ubuntu-python
- - name: Docker Build
- shell: bash
- run: |
- docker-compose build ubuntu-cpp
- docker-compose build ubuntu-python
- - name: Docker Run
- shell: bash
- run: docker-compose run ubuntu-python
- - name: Docker Push
- if: success() && github.repository == 'apache/arrow'
- continue-on-error: true
- shell: bash
- run: |
- docker login -u ${{ secrets.DOCKERHUB_USER }} \
- -p ${{ secrets.DOCKERHUB_TOKEN }}
- docker-compose push ubuntu-python
-
- fedora:
- name: AMD64 Fedora ${{ matrix.fedora }} Python 3
- runs-on: ubuntu-latest
- if: ${{ !contains(github.event.pull_request.title, 'WIP') && github.repository == 'apache/arrow' }}
- strategy:
- fail-fast: false
- matrix:
- fedora: [30]
- env:
- FEDORA: ${{ matrix.fedora }}
- steps:
- - name: Checkout Arrow
- uses: actions/checkout@v2
- with:
- fetch-depth: 0
- - name: Fetch Submodules and Tags
- shell: bash
- run: ci/scripts/util_checkout.sh
- - name: Free Up Disk Space
- shell: bash
- run: ci/scripts/util_cleanup.sh
- - name: Docker Pull
- shell: bash
- run: |
- docker-compose pull --ignore-pull-failures fedora-cpp
- docker-compose pull --ignore-pull-failures fedora-python
- - name: Docker Build
- shell: bash
- run: |
- docker-compose build fedora-cpp
- docker-compose build fedora-python
- - name: Docker Run
- shell: bash
- run: docker-compose run fedora-python
- - name: Docker Push
- if: success() && github.repository == 'apache/arrow'
- continue-on-error: true
- shell: bash
- run: |
- docker login -u ${{ secrets.DOCKERHUB_USER }} \
- -p ${{ secrets.DOCKERHUB_TOKEN }}
- docker-compose push fedora-python
-
- downstream:
- name: AMD64 Conda Python 3.7 ${{ matrix.title }}
+ docker:
+ name: ${{ matrix.title }}
runs-on: ubuntu-latest
if: ${{ !contains(github.event.pull_request.title, 'WIP') && github.repository == 'apache/arrow' }}
strategy:
fail-fast: false
matrix:
name:
- - dask-latest
- - hdfs-2.9.2
- - turbodbc-latest
- - kartothek-latest
- - pandas-master
- - pandas-0.24
+ - debian-10-python-3
+ - fedora-30-python-3
+ - ubuntu-18.04-python-3
+ - conda-python-3.7-dask-latest
+ - conda-python-3.7-turbodbc-latest
+ - conda-python-3.7-kartothek-latest
+ - conda-python-3.7-pandas-0.24
+ - conda-python-3.7-pandas-master
+ - conda-python-3.7-hdfs-2.9.2
include:
- - name: dask-latest
- library: dask
+ - name: debian-10-python-3
+ image: debian-python
+ title: AMD64 Debian 10 Python 3
+ debian: 10
+ - name: fedora-30-python-3
+ image: fedora-python
+ title: AMD64 Fedora 30 Python 3
+ fedora: 30
+ - name: ubuntu-18.04-python-3
+ image: ubuntu-python
+ title: AMD64 Ubuntu 18.04 Python 3
+ ubuntu: 18.04
+ - name: conda-python-3.7-dask-latest
+ image: conda-python-dask
+ title: AMD64 Conda Python 3.7 Dask latest
dask: latest
- title: Dask latest
- - name: turbodbc-latest
- library: turbodbc
+ - name: conda-python-3.7-turbodbc-latest
+ image: conda-python-turbodbc
+ title: AMD64 Conda Python 3.7 Turbodbc latest
turbodbc: latest
- title: Turbodbc latest
- - name: kartothek-latest
- library: kartothek
+ - name: conda-python-3.7-kartothek-latest
+ image: conda-python-kartothek
+ title: AMD64 Conda Python 3.7 Kartothek latest
kartothek: latest
- title: Kartothek latest
- - name: pandas-0.24
- library: pandas
- title: Pandas 0.24
+ - name: conda-python-3.7-pandas-0.24
+ image: conda-python-pandas
+ title: AMD64 Conda Python 3.7 Pandas 0.24
pandas: 0.24
- - name: pandas-master
- library: pandas
- title: Pandas master
+ - name: conda-python-3.7-pandas-master
+ image: --no-cache-leaf conda-python-pandas
+ title: AMD64 Conda Python 3.7 Pandas master
pandas: master
- - name: hdfs-2.9.2
- library: hdfs
+ - name: conda-python-3.7-hdfs-2.9.2
+ image: conda-python-hdfs
+ title: AMD64 Conda Python 3.7 HDFS 2.9.2
hdfs: 2.9.2
- title: HDFS 2.9.2
env:
# the defaults here should correspond to the values in .env
- PYTHON: 3.7
+ DEBIAN: ${{ matrix.debian || 10 }}
+ FEDORA: ${{ matrix.fedora || 30 }}
+ UBUNTU: ${{ matrix.ubuntu || 18.04 }}
+ PYTHON: ${{ matrix.python || 3.7 }}
HDFS: ${{ matrix.hdfs || '2.9.2' }}
DASK: ${{ matrix.dask || 'latest' }}
TURBODBC: ${{ matrix.turbodbc || 'latest' }}
@@ -216,32 +107,21 @@ jobs:
with:
fetch-depth: 0
- name: Fetch Submodules and Tags
- shell: bash
run: ci/scripts/util_checkout.sh
- name: Free Up Disk Space
- shell: bash
run: ci/scripts/util_cleanup.sh
- - name: Docker Pull
- shell: bash
- run: |
- docker-compose pull --ignore-pull-failures conda-cpp
- docker-compose pull --ignore-pull-failures conda-python
- docker-compose pull --ignore-pull-failures conda-python-${{ matrix.library }}
- - name: Docker Build
- shell: bash
- run: |
- docker-compose build conda-cpp
- docker-compose build conda-python
- docker-compose build conda-python-${{ matrix.library }}
- - name: Docker Run
- shell: bash
+ - name: Setup Python
+ uses: actions/setup-python@v1
+ with:
+ python-version: 3.8
+ - name: Setup Archery
+ run: pip install -e dev/archery[docker]
+ - name: Execute Docker Build
run: |
- docker-compose run conda-python-${{ matrix.library }}
+ sudo sysctl -w kernel.core_pattern="core.%e.%p"
+ ulimit -c unlimited
+ archery docker run ${{ matrix.image }}
- name: Docker Push
- if: success() && github.repository == 'apache/arrow'
+ if: success() && github.event_name == 'push' && github.repository == 'apache/arrow'
continue-on-error: true
- shell: bash
- run: |
- docker login -u ${{ secrets.DOCKERHUB_USER }} \
- -p ${{ secrets.DOCKERHUB_TOKEN }}
- docker-compose push conda-python-${{ matrix.library }}
+ run: archery docker push ${{ matrix.image }}
diff --git a/.github/workflows/r.yml b/.github/workflows/r.yml
index f1389ff..c5fcba8 100644
--- a/.github/workflows/r.yml
+++ b/.github/workflows/r.yml
@@ -42,8 +42,11 @@ on:
env:
DOCKER_BUILDKIT: 0
COMPOSE_DOCKER_CLI_BUILD: 1
+ ARCHERY_DOCKER_USER: ${{ secrets.DOCKERHUB_USER }}
+ ARCHERY_DOCKER_PASSWORD: ${{ secrets.DOCKERHUB_TOKEN }}
jobs:
+
ubuntu:
name: AMD64 Ubuntu ${{ matrix.ubuntu }} R ${{ matrix.r }}
runs-on: ubuntu-latest
@@ -51,93 +54,55 @@ jobs:
strategy:
fail-fast: false
matrix:
- r: [3.6]
- ubuntu: [18.04]
+ name:
+ - amd64-ubuntu-18.04-r-3.6
+ - amd64-centos-7-rstudio-r-3.6
+ include:
+ - name: amd64-ubuntu-18.04-r-3.6
+ image: ubuntu-r
+ title: AMD64 Ubuntu 18.04 R 3.6
+ ubuntu: 18.04
+ r: 3.6
+ - name: amd64-centos-7-rstudio-r-3.6
+ image: r
+ title: AMD64 CentOS 7 RStudio R 3.6
+ r: 3.6
+ r_org: rstudio
+ r_image: r-base
+ r_tag: 3.6-centos7
env:
- R: ${{ matrix.r }}
- UBUNTU: ${{ matrix.ubuntu }}
+ R: ${{ matrix.r || 3.6 }}
+ UBUNTU: ${{ matrix.ubuntu || 18.04 }}
+ R_ORG: ${{ matrix.r_org || 'rhub' }}
+ R_IMAGE: ${{ matrix.r_image || 'ubuntu-gcc-release' }}
+ R_TAG: ${{ matrix.r_tag || 'latest' }}
steps:
- name: Checkout Arrow
uses: actions/checkout@v2
with:
fetch-depth: 0
- name: Fetch Submodules and Tags
- shell: bash
run: ci/scripts/util_checkout.sh
- name: Free Up Disk Space
- shell: bash
run: ci/scripts/util_cleanup.sh
- - name: Docker Pull
- shell: bash
- run: |
- docker-compose pull --ignore-pull-failures ubuntu-cpp
- docker-compose pull --ignore-pull-failures ubuntu-r
- - name: Docker Build
- shell: bash
- run: |
- docker-compose build ubuntu-cpp
- docker-compose build ubuntu-r
- - name: Docker Run
- shell: bash
- run: docker-compose run ubuntu-r
- - name: Dump install logs on failure
- if: failure()
- run: cat r/check/arrow.Rcheck/00install.out
- - name: Docker Push
- if: success() && github.event_name == 'push' && github.repository == 'apache/arrow'
- continue-on-error: true
- shell: bash
- run: |
- docker login -u ${{ secrets.DOCKERHUB_USER }} \
- -p ${{ secrets.DOCKERHUB_TOKEN }}
- docker-compose push ubuntu-r
-
- rstudio:
- name: "rstudio/r-base:${{ matrix.r_version }}-${{ matrix.r_image }}"
- runs-on: ubuntu-latest
- if: ${{ !contains(github.event.pull_request.title, 'WIP') }}
- strategy:
- fail-fast: false
- matrix:
- # See https://hub.docker.com/r/rstudio/r-base
- r_version: ["3.6"]
- r_image:
- - centos7
- env:
- R_ORG: "rstudio"
- R_IMAGE: "r-base"
- R_TAG: ${{ matrix.r_version }}-${{ matrix.r_image }}
- steps:
- - name: Checkout Arrow
- uses: actions/checkout@v2
+ - name: Setup Python
+ uses: actions/setup-python@v1
with:
- fetch-depth: 0
- - name: Fetch Submodules and Tags
- shell: bash
- run: ci/scripts/util_checkout.sh
- - name: Free Up Disk Space
- shell: bash
- run: ci/scripts/util_cleanup.sh
- - name: Docker Pull
- shell: bash
- run: docker-compose pull --ignore-pull-failures r
- - name: Docker Build
- shell: bash
- run: docker-compose build r
- - name: Docker Run
- shell: bash
- run: docker-compose run r
+ python-version: 3.8
+ - name: Setup Archery
+ run: pip install -e dev/archery[docker]
+ - name: Execute Docker Build
+ run: |
+ sudo sysctl -w kernel.core_pattern="core.%e.%p"
+ ulimit -c unlimited
+ archery docker run ${{ matrix.image }}
- name: Dump install logs on failure
if: failure()
run: cat r/check/arrow.Rcheck/00install.out
- name: Docker Push
if: success() && github.event_name == 'push' && github.repository == 'apache/arrow'
continue-on-error: true
- shell: bash
- run: |
- docker login -u ${{ secrets.DOCKERHUB_USER }} \
- -p ${{ secrets.DOCKERHUB_TOKEN }}
- docker-compose push r
+ run: archery docker push ${{ matrix.image }}
windows:
name: AMD64 Windows RTools ${{ matrix.rtools }}
diff --git a/.github/workflows/ruby.yml b/.github/workflows/ruby.yml
index c7ceccb..563394b 100644
--- a/.github/workflows/ruby.yml
+++ b/.github/workflows/ruby.yml
@@ -36,6 +36,8 @@ on:
env:
DOCKER_BUILDKIT: 0
COMPOSE_DOCKER_CLI_BUILD: 1
+ ARCHERY_DOCKER_USER: ${{ secrets.DOCKERHUB_USER }}
+ ARCHERY_DOCKER_PASSWORD: ${{ secrets.DOCKERHUB_TOKEN }}
jobs:
@@ -63,30 +65,22 @@ jobs:
- name: Free Up Disk Space
shell: bash
run: ci/scripts/util_cleanup.sh
- - name: Docker Pull
- shell: bash
- run: |
- docker-compose pull --ignore-pull-failures ubuntu-cpp
- docker-compose pull --ignore-pull-failures ubuntu-c-glib
- docker-compose pull --ignore-pull-failures ubuntu-ruby
- - name: Docker Build
- shell: bash
+ - name: Setup Python
+ uses: actions/setup-python@v1
+ with:
+ python-version: 3.8
+ - name: Setup Archery
+ run: pip install -e dev/archery[docker]
+ - name: Execute Docker Build
run: |
- docker-compose build ubuntu-cpp
- docker-compose build ubuntu-c-glib
- docker-compose build ubuntu-ruby
- - name: Docker Run
- shell: bash
- run: docker-compose run ubuntu-ruby
+ sudo sysctl -w kernel.core_pattern="core.%e.%p"
+ ulimit -c unlimited
+ archery docker run ubuntu-ruby
- name: Docker Push
if: success() && github.event_name == 'push' && github.repository == 'apache/arrow'
continue-on-error: true
shell: bash
- run: |
- docker login -u ${{ secrets.DOCKERHUB_USER }} \
- -p ${{ secrets.DOCKERHUB_TOKEN }}
- docker-compose push ubuntu-c-glib
- docker-compose push ubuntu-ruby
+ run: archery docker push ubuntu-ruby
macos:
name: AMD64 MacOS 10.15 GLib & Ruby
diff --git a/.github/workflows/rust.yml b/.github/workflows/rust.yml
index 361ca43..01a13cc 100644
--- a/.github/workflows/rust.yml
+++ b/.github/workflows/rust.yml
@@ -38,6 +38,8 @@ on:
env:
DOCKER_BUILDKIT: 0
COMPOSE_DOCKER_CLI_BUILD: 1
+ ARCHERY_DOCKER_USER: ${{ secrets.DOCKERHUB_USER }}
+ ARCHERY_DOCKER_PASSWORD: ${{ secrets.DOCKERHUB_TOKEN }}
jobs:
@@ -62,20 +64,22 @@ jobs:
- name: Free Up Disk Space
shell: bash
run: ci/scripts/util_cleanup.sh
- - name: Docker Pull
- run: docker-compose pull --ignore-pull-failures debian-rust
- - name: Docker Build
- run: docker-compose build debian-rust
- - name: Docker Run
- run: docker-compose run debian-rust
+ - name: Setup Python
+ uses: actions/setup-python@v1
+ with:
+ python-version: 3.8
+ - name: Setup Archery
+ run: pip install -e dev/archery[docker]
+ - name: Execute Docker Build
+ run: |
+ sudo sysctl -w kernel.core_pattern="core.%e.%p"
+ ulimit -c unlimited
+ archery docker run debian-rust
- name: Docker Push
if: success() && github.event_name == 'push' && github.repository == 'apache/arrow'
continue-on-error: true
shell: bash
- run: |
- docker login -u ${{ secrets.DOCKERHUB_USER }} \
- -p ${{ secrets.DOCKERHUB_TOKEN }}
- docker-compose push debian-rust
+ run: archery docker push debian-rust
windows:
name: AMD64 Windows 2019 Rust ${{ matrix.rust }}
@@ -121,9 +125,9 @@ jobs:
- name: Install Rust
uses: actions-rs/toolchain@v1
with:
- toolchain: ${{ matrix.rust }}
- override: true
- components: rustfmt
+ toolchain: ${{ matrix.rust }}
+ override: true
+ components: rustfmt
- name: Install Flatbuffers
shell: bash
run: brew install flatbuffers
diff --git a/Makefile.docker b/Makefile.docker
deleted file mode 100644
index a044333..0000000
--- a/Makefile.docker
+++ /dev/null
@@ -1,65 +0,0 @@
-# Licensed to the Apache Software Foundation (ASF) under one
-# or more contributor license agreements. See the NOTICE file
-# distributed with this work for additional information
-# regarding copyright ownership. The ASF licenses this file
-# to you under the Apache License, Version 2.0 (the
-# "License"); you may not use this file except in compliance
-# with the License. You may obtain a copy of the License at
-#
-# http://www.apache.org/licenses/LICENSE-2.0
-#
-# Unless required by applicable law or agreed to in writing,
-# software distributed under the License is distributed on an
-# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
-# KIND, either express or implied. See the License for the
-# specific language governing permissions and limitations
-# under the License.
-
-# build docker compose images:
-# $ make -f Makefile.docker build-cpp
-# To run the test suite
-# $ make -f Makefile.docker cpp
-
-LANGUAGES = cpp cpp-alpine cpp-cmake32 c_glib go java js python python-alpine rust r r-sanitizer
-MISC = lint iwyu clang-format docs pandas-master
-TESTS = dask hdfs-integration spark-integration python-nopandas
-
-# declare images dependencies
-DEPENDS_ON_CPP = build-c_glib build-python build-r
-DEPENDS_ON_CPP_ALPINE = build-python-alpine
-DEPENDS_ON_PYTHON = build-lint build-docs build-dask build-hdfs-integration \
- build-spark-integration build-python-nopandas \
- build-turbodbc-integration build-kartothek-integration
-DEPENDS_ON_LINT = build-iwyu build-clang-format
-
-SERVICES = $(LANGUAGES) $(MISC) $(TESTS)
-.PHONY: clean build-% run-% $(SERVICES)
-
-DC := docker-compose
-
-clean:
- $(DC) down -v
-
-# Default build target if no dependencies
-build-%:
- $(DC) build $*
-
-# The following targets create the dependencies of the form `build-X: build-Y`
-$(DEPENDS_ON_CPP): build-%: build-cpp
- $(DC) build $*
-$(DEPENDS_ON_CPP_ALPINE): build-%: build-cpp-alpine
- $(DC) build $*
-$(DEPENDS_ON_PYTHON): build-%: build-python
- $(DC) build $*
-# The dependents of lint image don't build anything
-$(DEPENDS_ON_LINT): build-%: build-lint
-
-# panda master is a special case due to --no-cache
-build-pandas-master: build-python
- $(DC) build --no-cache pandas-master
-
-run-%: build-%
- $(DC) run --rm $*
-
-# Trick to get `service` expand to `run-service`
-$(SERVICES): % : run-%
diff --git a/dev/archery/setup.py b/dev/archery/archery/benchmark/__init__.py
similarity index 57%
copy from dev/archery/setup.py
copy to dev/archery/archery/benchmark/__init__.py
index c7be38e..13a8339 100644
--- a/dev/archery/setup.py
+++ b/dev/archery/archery/benchmark/__init__.py
@@ -1,4 +1,3 @@
-#!/usr/bin/env python
# Licensed to the Apache Software Foundation (ASF) under one
# or more contributor license agreements. See the NOTICE file
# distributed with this work for additional information
@@ -15,27 +14,3 @@
# KIND, either express or implied. See the License for the
# specific language governing permissions and limitations
# under the License.
-
-import sys
-from setuptools import setup
-
-
-if sys.version_info < (3, 5):
- sys.exit('Python < 3.5 is not supported')
-
-
-setup(
- name='archery',
- version="0.1.0",
- description='Apache Arrow Developers Tools',
- url='http://github.com/apache/arrow',
- maintainer='Arrow Developers',
- maintainer_email='dev@arrow.apache.org',
- packages=['archery'],
- install_requires=['click', 'pygithub'],
- tests_require=['pytest', 'ruamel.yaml'],
- entry_points='''
- [console_scripts]
- archery=archery.cli:archery
- ''',
-)
diff --git a/dev/archery/archery/benchmark/core.py b/dev/archery/archery/benchmark/core.py
index ddf53d5..b21c7ee 100644
--- a/dev/archery/archery/benchmark/core.py
+++ b/dev/archery/archery/benchmark/core.py
@@ -39,7 +39,7 @@ class Benchmark:
return self.median
def __repr__(self):
- return f"Benchmark[name={self.name},value={self.value}]"
+ return "Benchmark[name={},value={}]".format(self.name, self.value)
class BenchmarkSuite:
@@ -48,6 +48,6 @@ class BenchmarkSuite:
self.benchmarks = benchmarks
def __repr__(self):
- name = self.name
- benchmarks = self.benchmarks
- return f"BenchmarkSuite[name={name}, benchmarks={benchmarks}]"
+ return "BenchmarkSuite[name={}, benchmarks={}]".format(
+ self.name, self.benchmarks
+ )
diff --git a/dev/archery/archery/benchmark/google.py b/dev/archery/archery/benchmark/google.py
index 49e6ad1..5b2176c 100644
--- a/dev/archery/archery/benchmark/google.py
+++ b/dev/archery/archery/benchmark/google.py
@@ -47,19 +47,21 @@ class GoogleBenchmarkCommand(Command):
def list_benchmarks(self):
argv = ["--benchmark_list_tests"]
if self.benchmark_filter:
- argv.append(f"--benchmark_filter={self.benchmark_filter}")
+ argv.append("--benchmark_filter={}".format(self.benchmark_filter))
result = self.run(*argv, stdout=subprocess.PIPE,
stderr=subprocess.PIPE)
return str.splitlines(result.stdout.decode("utf-8"))
def results(self, repetitions=DEFAULT_REPETITIONS):
with NamedTemporaryFile() as out:
- argv = [f"--benchmark_repetitions={repetitions}",
- f"--benchmark_out={out.name}",
+ argv = ["--benchmark_repetitions={}".format(repetitions),
+ "--benchmark_out={}".format(out.name),
"--benchmark_out_format=json"]
if self.benchmark_filter:
- argv.append(f"--benchmark_filter={self.benchmark_filter}")
+ argv.append(
+ "--benchmark_filter={}".format(self.benchmark_filter)
+ )
self.run(*argv, check=True)
return json.load(out)
@@ -134,7 +136,7 @@ class GoogleBenchmarkObservation:
return self.time_unit
def __repr__(self):
- return f"{self.value}"
+ return str(self.value)
class GoogleBenchmark(Benchmark):
@@ -161,7 +163,7 @@ class GoogleBenchmark(Benchmark):
super().__init__(name, unit, less_is_better, values)
def __repr__(self):
- return f"GoogleBenchmark[name={self.name},runs={self.runs}]"
+ return "GoogleBenchmark[name={},runs={}]".format(self.name, self.runs)
@classmethod
def from_json(cls, payload):
diff --git a/dev/archery/archery/benchmark/runner.py b/dev/archery/archery/benchmark/runner.py
index d33de42..e36696b 100644
--- a/dev/archery/archery/benchmark/runner.py
+++ b/dev/archery/archery/benchmark/runner.py
@@ -92,7 +92,7 @@ class StaticBenchmarkRunner(BenchmarkRunner):
def list_benchmarks(self):
for suite in self._suites:
for benchmark in suite.benchmarks:
- yield f"{suite.name}.{benchmark.name}"
+ yield "{}.{}".format(suite.name, benchmark.name)
@property
def suites(self):
@@ -122,7 +122,7 @@ class StaticBenchmarkRunner(BenchmarkRunner):
return BenchmarkRunnerCodec.decode(json_load(path_or_str), **kwargs)
def __repr__(self):
- return f"BenchmarkRunner[suites={list(self.suites)}]"
+ return "BenchmarkRunner[suites={}]".format(list(self.suites))
class CppBenchmarkRunner(BenchmarkRunner):
@@ -167,7 +167,7 @@ class CppBenchmarkRunner(BenchmarkRunner):
for suite_name, suite_bin in self.suites_binaries.items():
suite_cmd = GoogleBenchmarkCommand(suite_bin)
for benchmark_name in suite_cmd.list_benchmarks():
- yield f"{suite_name}.{benchmark_name}"
+ yield "{}.{}".format(suite_name, benchmark_name)
@property
def suites(self):
@@ -177,7 +177,7 @@ class CppBenchmarkRunner(BenchmarkRunner):
suite_and_binaries = self.suites_binaries
for suite_name in suite_and_binaries:
if not suite_matcher(suite_name):
- logger.debug(f"Ignoring suite {suite_name}")
+ logger.debug("Ignoring suite {}".format(suite_name))
continue
suite_bin = suite_and_binaries[suite_name]
@@ -185,7 +185,8 @@ class CppBenchmarkRunner(BenchmarkRunner):
# Filter may exclude all benchmarks
if not suite:
- logger.debug(f"Suite {suite_name} executed but no results")
+ logger.debug("Suite {} executed but no results"
+ .format(suite_name))
continue
yield suite
diff --git a/dev/archery/archery/bot.py b/dev/archery/archery/bot.py
index 1c8a485..b4ea24c 100644
--- a/dev/archery/archery/bot.py
+++ b/dev/archery/archery/bot.py
@@ -132,8 +132,9 @@ class CrossbowCommentFormatter:
url = 'https://github.com/{repo}/branches/all?query={branch}'
sha = job['target']['head']
- msg = f'Revision: {sha}\n\n'
- msg += f'Submitted crossbow builds: [{{repo}} @ {{branch}}]({url})\n'
+ msg = 'Revision: {}\n\n'.format(sha)
+ msg += 'Submitted crossbow builds: [{repo} @ {branch}]'
+ msg += '({})\n'.format(url)
msg += '\n|Task|Status|\n|----|------|'
tasks = sorted(job['tasks'].items(), key=operator.itemgetter(0))
@@ -150,7 +151,7 @@ class CrossbowCommentFormatter:
except KeyError:
badge = 'unsupported CI service `{}`'.format(task['ci'])
- msg += f'\n|{key}|{badge}|'
+ msg += '\n|{}|{}|'.format(key, badge)
return msg.format(repo=self.crossbow_repo, branch=job['branch'])
@@ -169,7 +170,7 @@ class CommentBot:
# only allow users of apache org to submit commands, for more see
# https://developer.github.com/v4/enum/commentauthorassociation/
allowed_roles = {'OWNER', 'MEMBER', 'CONTRIBUTOR'}
- mention = f'@{self.name}'
+ mention = '@{}'.format(self.name)
comment = payload['comment']
if payload['sender']['login'] == self.name:
diff --git a/dev/archery/archery/cli.py b/dev/archery/archery/cli.py
index 1cfd11a..4021c0a 100644
--- a/dev/archery/archery/cli.py
+++ b/dev/archery/archery/cli.py
@@ -30,9 +30,8 @@ from .lang.cpp import CppCMakeDefinition, CppConfiguration
from .utils.codec import JsonEncoder
from .utils.lint import linter, python_numpydoc, LintValidationException
from .utils.logger import logger, ctx as log_ctx
-from .utils.source import ArrowSources
+from .utils.source import ArrowSources, InvalidArrowSource
from .utils.tmpdir import tmpdir
-from .bot import CommentBot, actions
# Set default logging to INFO in command line.
logging.basicConfig(level=logging.INFO)
@@ -90,11 +89,10 @@ def archery(ctx, debug, pdb, quiet):
def validate_arrow_sources(ctx, param, src):
""" Ensure a directory contains Arrow cpp sources. """
- if isinstance(src, str):
- if not ArrowSources.valid(src):
- raise click.BadParameter(f"No Arrow C++ sources found in {src}.")
- src = ArrowSources(src)
- return src
+ try:
+ return ArrowSources.find(src)
+ except InvalidArrowSource as e:
+ raise click.BadParameter(str(e))
build_dir_type = click.Path(dir_okay=True, file_okay=False, resolve_path=True)
@@ -125,7 +123,7 @@ def _apply_options(cmd, options):
@archery.command(short_help="Initialize an Arrow C++ build")
-@click.option("--src", metavar="<arrow_src>", default=ArrowSources.find(),
+@click.option("--src", metavar="<arrow_src>", default=None,
callback=validate_arrow_sources,
help="Specify Arrow source directory")
# toolchain
@@ -284,7 +282,7 @@ def decorate_lint_command(cmd):
@archery.command(short_help="Check Arrow source tree for errors")
-@click.option("--src", metavar="<arrow_src>", default=ArrowSources.find(),
+@click.option("--src", metavar="<arrow_src>", default=None,
callback=validate_arrow_sources,
help="Specify Arrow source directory")
@click.option("--fix", is_flag=True, type=BOOL, default=False,
@@ -312,7 +310,7 @@ def lint(ctx, src, fix, iwyu_all, **checks):
@archery.command(short_help="Lint python docstring with NumpyDoc")
@click.argument('symbols', nargs=-1)
-@click.option("--src", metavar="<arrow_src>", default=ArrowSources.find(),
+@click.option("--src", metavar="<arrow_src>", default=None,
callback=validate_arrow_sources,
help="Specify Arrow source directory")
@click.option("--whitelist", "-w", help="Allow only these rules")
@@ -352,8 +350,7 @@ def benchmark(ctx):
def benchmark_common_options(cmd):
options = [
click.option("--src", metavar="<arrow_src>", show_default=True,
- default=ArrowSources.find(),
- callback=validate_arrow_sources,
+ default=None, callback=validate_arrow_sources,
help="Specify Arrow source directory"),
click.option("--preserve", type=BOOL, default=False, show_default=True,
is_flag=True,
@@ -392,7 +389,7 @@ def benchmark_list(ctx, rev_or_path, src, preserve, output, cmake_extras,
""" List benchmark suite.
"""
with tmpdir(preserve=preserve) as root:
- logger.debug(f"Running benchmark {rev_or_path}")
+ logger.debug("Running benchmark {}".format(rev_or_path))
conf = CppBenchmarkRunner.default_configuration(
cmake_extras=cmake_extras, **kwargs)
@@ -445,7 +442,7 @@ def benchmark_run(ctx, rev_or_path, src, preserve, output, cmake_extras,
archery benchmark run --output=run.json
"""
with tmpdir(preserve=preserve) as root:
- logger.debug(f"Running benchmark {rev_or_path}")
+ logger.debug("Running benchmark {}".format(rev_or_path))
conf = CppBenchmarkRunner.default_configuration(
cmake_extras=cmake_extras, **kwargs)
@@ -471,7 +468,7 @@ def benchmark_run(ctx, rev_or_path, src, preserve, output, cmake_extras,
def benchmark_diff(ctx, src, preserve, output, cmake_extras,
suite_filter, benchmark_filter,
threshold, contender, baseline, **kwargs):
- """ Compare (diff) benchmark runs.
+ """Compare (diff) benchmark runs.
This command acts like git-diff but for benchmark results.
@@ -542,8 +539,8 @@ def benchmark_diff(ctx, src, preserve, output, cmake_extras,
archery --quiet benchmark diff WORKSPACE run.json > result.json
"""
with tmpdir(preserve=preserve) as root:
- logger.debug(f"Comparing {contender} (contender) with "
- f"{baseline} (baseline)")
+ logger.debug("Comparing {} (contender) with {} (baseline)"
+ .format(contender, baseline))
conf = CppBenchmarkRunner.default_configuration(
cmake_extras=cmake_extras, **kwargs)
@@ -650,11 +647,133 @@ def integration(with_all=False, random_seed=12345, **args):
@click.option('--crossbow-token', '-ct', envvar='CROSSBOW_GITHUB_TOKEN',
help='OAuth token for pushing to the crossow repository')
def trigger_bot(event_name, event_payload, arrow_token, crossbow_token):
+ from .bot import CommentBot, actions
+
event_payload = json.loads(event_payload.read())
bot = CommentBot(name='github-actions', handler=actions, token=arrow_token)
bot.handle(event_name, event_payload)
+@archery.group('docker')
+@click.option("--src", metavar="<arrow_src>", default=None,
+ callback=validate_arrow_sources,
+ help="Specify Arrow source directory.")
+@click.pass_obj
+def docker_compose(obj, src):
+ """Interact with docker-compose based builds."""
+ from .docker import DockerCompose
+
+ config_path = src.path / 'docker-compose.yml'
+ if not config_path.exists():
+ raise click.ClickException(
+ "Docker compose configuration cannot be found in directory {}, "
+ "try to pass the arrow source directory explicitly.".format(src)
+ )
+
+ # take the docker-compose parameters like PYTHON, PANDAS, UBUNTU from the
+ # environment variables to keep the usage similar to docker-compose
+ obj['compose'] = DockerCompose(config_path, params=os.environ)
+ obj['compose'].validate()
+
+
+@docker_compose.command('run')
+@click.argument('image')
+@click.argument('command', required=False, default=None)
+@click.option('--env', '-e', multiple=True,
+ help="Set environment variable within the container")
+@click.option('--build/--no-build', default=True,
+ help="Whether to force build the image and its ancestor images")
+@click.option('--cache/--no-cache', default=True,
+ help="Whether to use cache when building the image and its "
+ "ancestor images")
+@click.option('--cache-leaf/--no-cache-leaf', default=True,
+ help="Whether to use cache when building only the (leaf) image "
+ "passed as the argument. To disable caching for both the "
+ "image and its ancestors use --no-cache option.")
+@click.option('--dry-run/--execute', default=False,
+ help="Display the docker-compose commands instead of executing "
+ "them.")
+@click.pass_obj
+def docker_compose_run(obj, image, command, env, build, cache, cache_leaf,
+ dry_run):
+ """Execute docker-compose builds.
+
+ To see the available builds run `archery docker images`.
+
+ Examples:
+
+ # execute a single build
+ archery docker run conda-python
+
+ # execute the builds but disable the image pulling
+ archery docker run --no-cache conda-python
+
+ # pass a docker-compose parameter, like the python version
+ PYTHON=3.8 archery docker run conda-python
+
+ # disable the cache only for the leaf image
+ PANDAS=master archery docker run --no-cache-leaf conda-python-pandas
+
+ # entirely skip building the image
+ archery docker run --no-build conda-python
+
+ # pass runtime parameters via docker environment variables
+ archery docker run -e CMAKE_BUILD_TYPE=release ubuntu-cpp
+
+ # starting an interactive bash session for debugging
+ archery docker run ubuntu-cpp bash
+ """
+ from .docker import UndefinedImage
+
+ compose = obj['compose']
+
+ if dry_run:
+ from types import MethodType
+
+ def _print_command(self, *args, **kwargs):
+ params = ['{}={}'.format(k, v) for k, v in self.params.items()]
+ command = ' '.join(params + ['docker-compose'] + list(args))
+ click.echo(command)
+
+ compose._execute = MethodType(_print_command, compose)
+
+ try:
+ if build:
+ compose.build(image, cache=cache, cache_leaf=cache_leaf)
+ compose.run(image, command=command)
+ except UndefinedImage as e:
+ raise click.ClickException(
+ "There is no service/image defined in docker-compose.yml with "
+ "name: {}".format(str(e))
+ )
+ except RuntimeError as e:
+ raise click.ClickException(str(e))
+
+
+@docker_compose.command('push')
+@click.argument('image')
+@click.option('--user', '-u', required=True, envvar='ARCHERY_DOCKER_USER',
+ help='Docker repository username')
+@click.option('--password', '-p', required=True,
+ envvar='ARCHERY_DOCKER_PASSWORD',
+ help='Docker repository password')
+@click.pass_obj
+def docker_compose_push(obj, image, user, password):
+ """Push the generated docker-compose image."""
+ compose = obj['compose']
+ compose.push(image, user=user, password=password)
+
+
+@docker_compose.command('images')
+@click.pass_obj
+def docker_compose_images(obj):
+ """List the available docker-compose images."""
+ compose = obj['compose']
+ click.echo('Available images:')
+ for image in compose.images():
+ click.echo(' - {}'.format(image))
+
+
if __name__ == "__main__":
archery(obj={})
diff --git a/dev/archery/archery/compat.py b/dev/archery/archery/compat.py
new file mode 100644
index 0000000..22cb9fc
--- /dev/null
+++ b/dev/archery/archery/compat.py
@@ -0,0 +1,51 @@
+# Licensed to the Apache Software Foundation (ASF) under one
+# or more contributor license agreements. See the NOTICE file
+# distributed with this work for additional information
+# regarding copyright ownership. The ASF licenses this file
+# to you under the Apache License, Version 2.0 (the
+# "License"); you may not use this file except in compliance
+# with the License. You may obtain a copy of the License at
+#
+# http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing,
+# software distributed under the License is distributed on an
+# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+# KIND, either express or implied. See the License for the
+# specific language governing permissions and limitations
+# under the License.
+
+import pathlib
+
+
+def _is_path_like(path):
+ # the PEP 519 filesystem path protocol is only available from Python 3.6,
+ # so pathlib doesn't implement __fspath__ on earlier versions
+ return (isinstance(path, str) or
+ hasattr(path, '__fspath__') or
+ isinstance(path, pathlib.Path))
+
+
+def _ensure_path(path):
+ if isinstance(path, pathlib.Path):
+ return path
+ else:
+ return pathlib.Path(_stringify_path(path))
+
+
+def _stringify_path(path):
+ """
+ Convert *path* to a string or unicode path if possible.
+ """
+ if isinstance(path, str):
+ return path
+
+ # checking whether path implements the filesystem protocol
+ try:
+ return path.__fspath__() # new in python 3.6
+ except AttributeError:
+ # fallback pathlib check for Python versions earlier than 3.6
+ if isinstance(path, pathlib.Path):
+ return str(path)
+
+ raise TypeError("not a path-like object")
diff --git a/dev/archery/archery/docker.py b/dev/archery/archery/docker.py
new file mode 100644
index 0000000..42660cd
--- /dev/null
+++ b/dev/archery/archery/docker.py
@@ -0,0 +1,186 @@
+#!/usr/bin/env python3
+# Licensed to the Apache Software Foundation (ASF) under one
+# or more contributor license agreements. See the NOTICE file
+# distributed with this work for additional information
+# regarding copyright ownership. The ASF licenses this file
+# to you under the Apache License, Version 2.0 (the
+# "License"); you may not use this file except in compliance
+# with the License. You may obtain a copy of the License at
+#
+# http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing,
+# software distributed under the License is distributed on an
+# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+# KIND, either express or implied. See the License for the
+# specific language governing permissions and limitations
+# under the License.
+
+import os
+import subprocess
+
+from dotenv import dotenv_values
+from ruamel.yaml import YAML
+
+from .utils.command import Command, default_bin
+from .compat import _ensure_path
+
+
+def flatten(node, parents=None):
+ parents = list(parents or [])
+ if isinstance(node, str):
+ yield (node, parents)
+ elif isinstance(node, list):
+ for value in node:
+ yield from flatten(value, parents=parents)
+ elif isinstance(node, dict):
+ for key, value in node.items():
+ yield (key, parents)
+ yield from flatten(value, parents=parents + [key])
+ else:
+ raise TypeError(node)
+
+
+class UndefinedImage(Exception):
+ pass
+
+
+class Docker(Command):
+
+ def __init__(self, docker_bin=None):
+ self.bin = default_bin(docker_bin, "docker")
+
+
+class DockerCompose(Command):
+
+ def __init__(self, config_path, dotenv_path=None, compose_bin=None,
+ params=None):
+ self.config_path = _ensure_path(config_path)
+ if dotenv_path:
+ self.dotenv_path = _ensure_path(dotenv_path)
+ else:
+ self.dotenv_path = self.config_path.parent / '.env'
+
+ yaml = YAML()
+ with self.config_path.open() as fp:
+ self.config = yaml.load(fp)
+
+ self.bin = default_bin(compose_bin, 'docker-compose')
+ self.nodes = dict(flatten(self.config['x-hierarchy']))
+ self.dotenv = dotenv_values(str(self.dotenv_path))
+ if params is None:
+ self.params = {}
+ else:
+ self.params = {k: v for k, v in params.items() if k in self.dotenv}
+
+ # forward the process' environment variables
+ self._compose_env = os.environ.copy()
+ # set the defaults from the dotenv files
+ self._compose_env.update(self.dotenv)
+ # override the defaults passed as parameters
+ self._compose_env.update(self.params)
+
+ def validate(self):
+ services = self.config['services'].keys()
+ nodes = self.nodes.keys()
+ errors = []
+
+ for name in nodes - services:
+ errors.append(
+ 'Service `{}` is defined in `x-hierarchy` but not in '
+ '`services`'.format(name)
+ )
+ for name in services - nodes:
+ errors.append(
+ 'Service `{}` is defined in `services` but not in '
+ '`x-hierarchy`'.format(name)
+ )
+
+ # trigger docker-compose's own validation
+ result = self._execute('config', check=False, stderr=subprocess.PIPE,
+ stdout=subprocess.PIPE)
+
+ if result.returncode != 0:
+ # strip the intro line of docker-compose errors
+ errors += result.stderr.decode().splitlines()[1:]
+
+ if errors:
+ msg = '\n'.join([' - {}'.format(msg) for msg in errors])
+ raise ValueError(
+ 'Found errors with docker-compose:\n{}'.format(msg)
+ )
+
+ def _validate_image(self, name):
+ if name not in self.nodes:
+ raise UndefinedImage(name)
+
+ def _execute(self, *args, **kwargs):
+ try:
+ return super().run('--file', str(self.config_path), *args,
+ env=self._compose_env, **kwargs)
+ except subprocess.CalledProcessError as e:
+ def formatdict(d, template):
+ return '\n'.join(
+ template.format(k, v) for k, v in sorted(d.items())
+ )
+ msg = (
+ "`{cmd}` exited with a non-zero exit code {code}, see the "
+ "process log above.\n\nThe docker-compose command was "
+ "invoked with the following parameters:\n\nDefaults defined "
+ "in .env:\n{dotenv}\n\nArchery was called with:\n{params}"
+ )
+ raise RuntimeError(
+ msg.format(
+ cmd=' '.join(e.cmd),
+ code=e.returncode,
+ dotenv=formatdict(self.dotenv, template=' {}: {}'),
+ params=formatdict(self.params, template=' export {}={}')
+ )
+ )
+
+ def _pull_andor_build(self, image, pull_if):
+ if pull_if:
+ self._execute('pull', '--ignore-pull-failures', image)
+ self._execute('build', image)
+ else:
+ self._execute('build', '--no-cache', image)
+
+ def build(self, image, cache=True, cache_leaf=True, params=None):
+ self._validate_image(image)
+
+ # build each ancestor first
+ for ancestor in self.nodes[image]:
+ self._pull_andor_build(ancestor, pull_if=cache)
+
+ # finally build the image itself
+ self._pull_andor_build(image, pull_if=(cache and cache_leaf))
+
+ def run(self, image, command=None, env=None, params=None):
+ self._validate_image(image)
+
+ args = []
+ if env is not None:
+ for k, v in env.items():
+ args.extend(['-e', '{}={}'.format(k, v)])
+
+ args.append(image)
+ if command is not None:
+ args.append(command)
+
+ self._execute('run', '--rm', *args)
+
+ def push(self, image, user, password):
+ self._validate_image(image)
+ try:
+ # TODO(kszucs): have an option for a prompt
+ Docker().run('login', '-u', user, '-p', password)
+ except subprocess.CalledProcessError:
+ # hide credentials
+ msg = ('Failed to push `{}`, check the passed credentials'
+ .format(image))
+ raise RuntimeError(msg) from None
+ else:
+ self._execute('push', image)
+
+ def images(self):
+ return sorted(self.nodes.keys())
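
`DockerCompose` derives each image's ancestor chain from the `x-hierarchy` section via the `flatten` generator above; a standalone sketch of that traversal (the hierarchy below is a made-up example):

```python
def flatten(node, parents=None):
    # yield (name, ancestors) pairs for a nested list/dict hierarchy
    parents = list(parents or [])
    if isinstance(node, str):
        yield (node, parents)
    elif isinstance(node, list):
        for value in node:
            yield from flatten(value, parents=parents)
    elif isinstance(node, dict):
        for key, value in node.items():
            yield (key, parents)
            yield from flatten(value, parents=parents + [key])
    else:
        raise TypeError(node)

hierarchy = [{'conda-cpp': [{'conda-python': ['conda-python-pandas']}]}, 'ubuntu-cpp']
nodes = dict(flatten(hierarchy))
print(nodes['conda-python-pandas'])  # ancestors built first: ['conda-cpp', 'conda-python']
```

The resulting mapping is exactly what `build()` walks: it pulls or builds every ancestor before the requested leaf image.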
diff --git a/dev/archery/setup.py b/dev/archery/archery/lang/__init__.py
similarity index 57%
copy from dev/archery/setup.py
copy to dev/archery/archery/lang/__init__.py
index c7be38e..13a8339 100644
--- a/dev/archery/setup.py
+++ b/dev/archery/archery/lang/__init__.py
@@ -1,4 +1,3 @@
-#!/usr/bin/env python
# Licensed to the Apache Software Foundation (ASF) under one
# or more contributor license agreements. See the NOTICE file
# distributed with this work for additional information
@@ -15,27 +14,3 @@
# KIND, either express or implied. See the License for the
# specific language governing permissions and limitations
# under the License.
-
-import sys
-from setuptools import setup
-
-
-if sys.version_info < (3, 5):
- sys.exit('Python < 3.5 is not supported')
-
-
-setup(
- name='archery',
- version="0.1.0",
- description='Apache Arrow Developers Tools',
- url='http://github.com/apache/arrow',
- maintainer='Arrow Developers',
- maintainer_email='dev@arrow.apache.org',
- packages=['archery'],
- install_requires=['click', 'pygithub'],
- tests_require=['pytest', 'ruamel.yaml'],
- entry_points='''
- [console_scripts]
- archery=archery.cli:archery
- ''',
-)
diff --git a/dev/archery/archery/lang/cpp.py b/dev/archery/archery/lang/cpp.py
index a9997ad..2769781 100644
--- a/dev/archery/archery/lang/cpp.py
+++ b/dev/archery/archery/lang/cpp.py
@@ -151,7 +151,7 @@ class CppConfiguration:
return self._cc
if self.with_fuzzing:
- return f"clang-{LLVM_VERSION}"
+ return "clang-{}".format(LLVM_VERSION)
return None
@@ -161,7 +161,7 @@ class CppConfiguration:
return self._cxx
if self.with_fuzzing:
- return f"clang++-{LLVM_VERSION}"
+ return "clang++-{}".format(LLVM_VERSION)
return None
@@ -261,7 +261,8 @@ class CppConfiguration:
@property
def definitions(self):
extras = list(self.cmake_extras) if self.cmake_extras else []
- return [f"-D{d[0]}={d[1]}" for d in self._gen_defs()] + extras
+ definitions = ["-D{}={}".format(d[0], d[1]) for d in self._gen_defs()]
+ return definitions + extras
@property
def environment(self):
diff --git a/dev/archery/archery/testing.py b/dev/archery/archery/testing.py
new file mode 100644
index 0000000..9b9c619
--- /dev/null
+++ b/dev/archery/archery/testing.py
@@ -0,0 +1,63 @@
+# Licensed to the Apache Software Foundation (ASF) under one
+# or more contributor license agreements. See the NOTICE file
+# distributed with this work for additional information
+# regarding copyright ownership. The ASF licenses this file
+# to you under the Apache License, Version 2.0 (the
+# "License"); you may not use this file except in compliance
+# with the License. You may obtain a copy of the License at
+#
+# http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing,
+# software distributed under the License is distributed on an
+# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+# KIND, either express or implied. See the License for the
+# specific language governing permissions and limitations
+# under the License.
+
+from contextlib import contextmanager
+import os
+from unittest import mock
+import re
+
+
+class PartialEnv(dict):
+
+ def __eq__(self, other):
+ return self.items() <= other.items()
+
+
+_mock_call_type = type(mock.call())
+
+
+def _ensure_mock_call_object(obj, **kwargs):
+ if isinstance(obj, _mock_call_type):
+ return obj
+ elif isinstance(obj, str):
+ cmd = re.split(r"\s+", obj)
+ return mock.call(cmd, **kwargs)
+ elif isinstance(obj, list):
+ return mock.call(obj, **kwargs)
+ else:
+ raise TypeError(obj)
+
+
+@contextmanager
+def assert_subprocess_calls(expected_commands_or_calls, **kwargs):
+ calls = [
+ _ensure_mock_call_object(obj, **kwargs)
+ for obj in expected_commands_or_calls
+ ]
+ with mock.patch('subprocess.run', autospec=True) as run:
+ yield run
+ run.assert_has_calls(calls)
+
+
+@contextmanager
+def override_env(mapping):
+ original = os.environ
+ try:
+ os.environ = dict(os.environ, **mapping)
+ yield os.environ
+ finally:
+ os.environ = original
diff --git a/dev/archery/archery/tests/test_docker.py b/dev/archery/archery/tests/test_docker.py
new file mode 100644
index 0000000..902234f
--- /dev/null
+++ b/dev/archery/archery/tests/test_docker.py
@@ -0,0 +1,410 @@
+# Licensed to the Apache Software Foundation (ASF) under one
+# or more contributor license agreements. See the NOTICE file
+# distributed with this work for additional information
+# regarding copyright ownership. The ASF licenses this file
+# to you under the Apache License, Version 2.0 (the
+# "License"); you may not use this file except in compliance
+# with the License. You may obtain a copy of the License at
+#
+# http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing,
+# software distributed under the License is distributed on an
+# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+# KIND, either express or implied. See the License for the
+# specific language governing permissions and limitations
+# under the License.
+
+import collections
+import os
+import re
+import subprocess
+from unittest import mock
+
+import pytest
+
+from archery.docker import DockerCompose
+from archery.testing import assert_subprocess_calls, override_env, PartialEnv
+
+
+missing_service_compose_yml = """
+version: '3.5'
+
+x-hierarchy:
+ - foo:
+ - sub-foo:
+ - sub-sub-foo
+ - another-sub-sub-foo
+ - bar:
+ - sub-bar
+ - baz
+
+services:
+ foo:
+ image: dummy
+ sub-sub-foo:
+ image: dummy
+ another-sub-sub-foo:
+ image: dummy
+ bar:
+ image: dummy
+ sub-bar:
+ image: dummy
+ baz:
+ image: dummy
+"""
+
+missing_node_compose_yml = """
+version: '3.5'
+
+x-hierarchy:
+ - foo:
+ - sub-foo:
+ - sub-sub-foo
+ - another-sub-sub-foo
+ - bar
+ - baz
+
+services:
+ foo:
+ image: dummy
+ sub-foo:
+ image: dummy
+ sub-sub-foo:
+ image: dummy
+ another-sub-sub-foo:
+ image: dummy
+ bar:
+ image: dummy
+ sub-bar:
+ image: dummy
+ baz:
+ image: dummy
+"""
+
+ok_compose_yml = """
+version: '3.5'
+
+x-hierarchy:
+ - foo:
+ - sub-foo:
+ - sub-sub-foo
+ - another-sub-sub-foo
+ - bar:
+ - sub-bar
+ - baz
+
+services:
+ foo:
+ image: dummy
+ sub-foo:
+ image: dummy
+ sub-sub-foo:
+ image: dummy
+ another-sub-sub-foo:
+ image: dummy
+ bar:
+ image: dummy
+ sub-bar:
+ image: dummy
+ baz:
+ image: dummy
+"""
+
+arrow_compose_yml = """
+version: '3.5'
+
+x-hierarchy:
+ - conda-cpp:
+ - conda-python:
+ - conda-python-pandas
+ - conda-python-dask
+ - conda-r
+ - ubuntu-cpp:
+ - ubuntu-cpp-cmake32
+ - ubuntu-c-glib:
+ - ubuntu-ruby
+
+services:
+ conda-cpp:
+ image: dummy
+ conda-python:
+ image: dummy
+ conda-python-pandas:
+ image: dummy
+ conda-python-dask:
+ image: dummy
+ conda-r:
+ image: dummy
+ ubuntu-cpp:
+ image: dummy
+ ubuntu-cpp-cmake32:
+ image: dummy
+ ubuntu-c-glib:
+ image: dummy
+ ubuntu-ruby:
+ image: dummy
+"""
+
+arrow_compose_env = {
+ 'UBUNTU': '20.04', # overridden below
+ 'PYTHON': '3.6',
+ 'PANDAS': 'latest',
+ 'DASK': 'latest', # overridden below
+}
+
+
+def create_config(directory, yml_content, env_content=None):
+ env_path = directory / '.env'
+ config_path = directory / 'docker-compose.yml'
+
+ with config_path.open('w') as fp:
+ fp.write(yml_content)
+
+ if env_content is not None:
+ with env_path.open('w') as fp:
+ for k, v in env_content.items():
+ fp.write("{}={}\n".format(k, v))
+
+ return config_path
+
+
+@pytest.fixture
+def arrow_compose_path(tmpdir):
+ return create_config(tmpdir, arrow_compose_yml, arrow_compose_env)
+
+
+def test_config_validation(tmpdir):
+ config_path = create_config(tmpdir, missing_service_compose_yml)
+ compose = DockerCompose(config_path)
+ msg = "`sub-foo` is defined in `x-hierarchy` but not in `services`"
+ with pytest.raises(ValueError, match=msg):
+ compose.validate()
+
+ config_path = create_config(tmpdir, missing_node_compose_yml)
+ compose = DockerCompose(config_path)
+ msg = "`sub-bar` is defined in `services` but not in `x-hierarchy`"
+ with pytest.raises(ValueError, match=msg):
+ compose.validate()
+
+ config_path = create_config(tmpdir, ok_compose_yml)
+ compose = DockerCompose(config_path)
+ compose.validate()
+
+
+def assert_compose_calls(compose, expected_args, env=mock.ANY):
+ base_command = ['docker-compose', '--file', str(compose.config_path)]
+ expected_commands = []
+ for args in expected_args:
+ if isinstance(args, str):
+ cmd = base_command + re.split(r"\s", args)
+ expected_commands.append(cmd)
+ return assert_subprocess_calls(expected_commands, check=True, env=env)
+
+
+def test_arrow_example_validation_passes(arrow_compose_path):
+ compose = DockerCompose(arrow_compose_path)
+ compose.validate()
+
+
+def test_compose_default_params_and_env(arrow_compose_path):
+ compose = DockerCompose(arrow_compose_path, params=dict(
+ UBUNTU='18.04',
+ DASK='master'
+ ))
+ assert compose.dotenv == arrow_compose_env
+ assert compose.params == {
+ 'UBUNTU': '18.04',
+ 'DASK': 'master',
+ }
+
+
+def test_forwarding_env_variables(arrow_compose_path):
+ expected_calls = [
+ "pull --ignore-pull-failures conda-cpp",
+ "build conda-cpp",
+ ]
+ expected_env = PartialEnv(
+ MY_CUSTOM_VAR_A='a',
+ MY_CUSTOM_VAR_B='b'
+ )
+ with override_env({'MY_CUSTOM_VAR_A': 'a', 'MY_CUSTOM_VAR_B': 'b'}):
+ compose = DockerCompose(arrow_compose_path)
+ with assert_compose_calls(compose, expected_calls, env=expected_env):
+ assert os.environ['MY_CUSTOM_VAR_A'] == 'a'
+ assert os.environ['MY_CUSTOM_VAR_B'] == 'b'
+ compose.build('conda-cpp')
+
+
+def test_compose_build(arrow_compose_path):
+ compose = DockerCompose(arrow_compose_path)
+
+ expected_calls = [
+ "pull --ignore-pull-failures conda-cpp",
+ "build conda-cpp",
+ ]
+ with assert_compose_calls(compose, expected_calls):
+ compose.build('conda-cpp')
+
+ expected_calls = [
+ "build --no-cache conda-cpp"
+ ]
+ with assert_compose_calls(compose, expected_calls):
+ compose.build('conda-cpp', cache=False)
+
+ expected_calls = [
+ "pull --ignore-pull-failures conda-cpp",
+ "build conda-cpp",
+ "pull --ignore-pull-failures conda-python",
+ "build conda-python",
+ "pull --ignore-pull-failures conda-python-pandas",
+ "build conda-python-pandas"
+ ]
+ with assert_compose_calls(compose, expected_calls):
+ compose.build('conda-python-pandas')
+
+ expected_calls = [
+ "build --no-cache conda-cpp",
+ "build --no-cache conda-python",
+ "build --no-cache conda-python-pandas",
+ ]
+ with assert_compose_calls(compose, expected_calls):
+ compose.build('conda-python-pandas', cache=False)
+
+ expected_calls = [
+ "pull --ignore-pull-failures conda-cpp",
+ "build conda-cpp",
+ "pull --ignore-pull-failures conda-python",
+ "build conda-python",
+ "build --no-cache conda-python-pandas",
+ ]
+ with assert_compose_calls(compose, expected_calls):
+ compose.build('conda-python-pandas', cache=True, cache_leaf=False)
+
+
+def test_compose_build_params(arrow_compose_path):
+ expected_calls = [
+ "pull --ignore-pull-failures ubuntu-cpp",
+ "build ubuntu-cpp",
+ ]
+
+ compose = DockerCompose(arrow_compose_path, params=dict(UBUNTU='18.04'))
+ expected_env = PartialEnv(UBUNTU="18.04")
+ with assert_compose_calls(compose, expected_calls, env=expected_env):
+ compose.build('ubuntu-cpp')
+
+ compose = DockerCompose(arrow_compose_path, params=dict(UBUNTU='16.04'))
+ expected_env = PartialEnv(UBUNTU="16.04")
+ with assert_compose_calls(compose, expected_calls, env=expected_env):
+ compose.build('ubuntu-cpp')
+
+ expected_calls = [
+ "build --no-cache conda-cpp",
+ "build --no-cache conda-python",
+ "build --no-cache conda-python-pandas",
+ ]
+ compose = DockerCompose(arrow_compose_path, params=dict(UBUNTU='18.04'))
+ expected_env = PartialEnv(PYTHON='3.6', PANDAS='latest')
+ with assert_compose_calls(compose, expected_calls, env=expected_env):
+ compose.build('conda-python-pandas', cache=False)
+
+ compose = DockerCompose(arrow_compose_path, params=dict(PANDAS='0.25.3'))
+ expected_env = PartialEnv(PYTHON='3.6', PANDAS='0.25.3')
+ with assert_compose_calls(compose, expected_calls, env=expected_env):
+ compose.build('conda-python-pandas', cache=False)
+
+ compose = DockerCompose(arrow_compose_path,
+ params=dict(PYTHON='3.8', PANDAS='master'))
+ expected_env = PartialEnv(PYTHON='3.8', PANDAS='master')
+ with assert_compose_calls(compose, expected_calls, env=expected_env):
+ compose.build('conda-python-pandas', cache=False)
+
+
+def test_compose_run(arrow_compose_path):
+ expected_calls = [
+ "run --rm conda-cpp",
+ ]
+ compose = DockerCompose(arrow_compose_path)
+ with assert_compose_calls(compose, expected_calls):
+ compose.run('conda-cpp')
+
+ expected_calls = [
+ "run --rm conda-python"
+ ]
+ expected_env = PartialEnv(PYTHON='3.6')
+ with assert_compose_calls(compose, expected_calls, env=expected_env):
+ compose.run('conda-python')
+
+ compose = DockerCompose(arrow_compose_path, params=dict(PYTHON='3.8'))
+ expected_env = PartialEnv(PYTHON='3.8')
+ with assert_compose_calls(compose, expected_calls, env=expected_env):
+ compose.run('conda-python')
+
+ compose = DockerCompose(arrow_compose_path, params=dict(PYTHON='3.8'))
+ for command in ["bash", "echo 1"]:
+ expected_calls = [
+ ["run", "--rm", "conda-python", command]
+ ]
+ expected_env = PartialEnv(PYTHON='3.8')
+ with assert_compose_calls(compose, expected_calls, env=expected_env):
+ compose.run('conda-python', command)
+
+ expected_calls = [
+ (
+ "run --rm -e CONTAINER_ENV_VAR_A=a -e CONTAINER_ENV_VAR_B=b "
+ "conda-python"
+ )
+ ]
+ compose = DockerCompose(arrow_compose_path)
+ expected_env = PartialEnv(PYTHON='3.6')
+ with assert_compose_calls(compose, expected_calls, env=expected_env):
+ env = collections.OrderedDict([
+ ("CONTAINER_ENV_VAR_A", "a"),
+ ("CONTAINER_ENV_VAR_B", "b")
+ ])
+ compose.run('conda-python', env=env)
+
+
+def test_compose_push(arrow_compose_path):
+ compose = DockerCompose(arrow_compose_path, params=dict(PYTHON='3.8'))
+ expected_env = PartialEnv(PYTHON="3.8")
+ expected_calls = [
+ mock.call(["docker", "login", "-u", "user", "-p", "pass"], check=True),
+ mock.call(["docker-compose", "--file", str(compose.config_path),
+ "push", "conda-python"], check=True, env=expected_env)
+ ]
+ with assert_subprocess_calls(expected_calls):
+ compose.push('conda-python', user='user', password='pass')
+
+
+def test_compose_error(arrow_compose_path):
+ compose = DockerCompose(arrow_compose_path, params=dict(
+ PYTHON='3.8',
+ PANDAS='master'
+ ))
+ compose.validate()
+
+ error = subprocess.CalledProcessError(99, [])
+ with mock.patch('subprocess.run', side_effect=error):
+ with pytest.raises(RuntimeError) as exc:
+ compose.run('conda-cpp')
+
+ exception_message = str(exc.value)
+ assert "exited with a non-zero exit code 99" in exception_message
+ assert "PANDAS: latest" in exception_message
+ assert "export PANDAS=master" in exception_message
+
+
+def test_listing_images(arrow_compose_path):
+ compose = DockerCompose(arrow_compose_path)
+ assert compose.images() == [
+ 'conda-cpp',
+ 'conda-python',
+ 'conda-python-dask',
+ 'conda-python-pandas',
+ 'conda-r',
+ 'ubuntu-c-glib',
+ 'ubuntu-cpp',
+ 'ubuntu-cpp-cmake32',
+ 'ubuntu-ruby',
+ ]
diff --git a/dev/archery/archery/tests/test_testing.py b/dev/archery/archery/tests/test_testing.py
new file mode 100644
index 0000000..117b928
--- /dev/null
+++ b/dev/archery/archery/tests/test_testing.py
@@ -0,0 +1,62 @@
+# Licensed to the Apache Software Foundation (ASF) under one
+# or more contributor license agreements. See the NOTICE file
+# distributed with this work for additional information
+# regarding copyright ownership. The ASF licenses this file
+# to you under the Apache License, Version 2.0 (the
+# "License"); you may not use this file except in compliance
+# with the License. You may obtain a copy of the License at
+#
+# http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing,
+# software distributed under the License is distributed on an
+# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+# KIND, either express or implied. See the License for the
+# specific language governing permissions and limitations
+# under the License.
+
+import subprocess
+
+import pytest
+
+from archery.testing import PartialEnv, assert_subprocess_calls
+
+
+def test_partial_env():
+ assert PartialEnv(a=1, b=2) == {'a': 1, 'b': 2, 'c': 3}
+ assert PartialEnv(a=1) == {'a': 1, 'b': 2, 'c': 3}
+ assert PartialEnv(a=1, b=2) == {'a': 1, 'b': 2}
+ assert PartialEnv(a=1, b=2) != {'b': 2, 'c': 3}
+ assert PartialEnv(a=1, b=2) != {'a': 1, 'c': 3}
+
+
+def test_assert_subprocess_calls():
+ expected_calls = [
+ "echo Hello",
+ ["echo", "World"]
+ ]
+ with assert_subprocess_calls(expected_calls):
+ subprocess.run(['echo', 'Hello'])
+ subprocess.run(['echo', 'World'])
+
+ expected_env = PartialEnv(
+ CUSTOM_ENV_A='a',
+ CUSTOM_ENV_C='c'
+ )
+ with assert_subprocess_calls(expected_calls, env=expected_env):
+ env = {
+ 'CUSTOM_ENV_A': 'a',
+ 'CUSTOM_ENV_B': 'b',
+ 'CUSTOM_ENV_C': 'c'
+ }
+ subprocess.run(['echo', 'Hello'], env=env)
+ subprocess.run(['echo', 'World'], env=env)
+
+ with pytest.raises(AssertionError):
+ with assert_subprocess_calls(expected_calls, env=expected_env):
+ env = {
+ 'CUSTOM_ENV_B': 'b',
+ 'CUSTOM_ENV_C': 'c'
+ }
+ subprocess.run(['echo', 'Hello'], env=env)
+ subprocess.run(['echo', 'World'], env=env)
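
`PartialEnv` lets the assertions above match on just a subset of the environment passed to `subprocess.run`; the comparison itself is a one-liner over dict item views:

```python
class PartialEnv(dict):
    # compares equal when every item of this dict appears in the other mapping
    def __eq__(self, other):
        return self.items() <= other.items()

env = {'PYTHON': '3.6', 'PANDAS': 'latest', 'PATH': '/usr/bin'}
print(PartialEnv(PYTHON='3.6') == env)   # True: subset matches
print(PartialEnv(PYTHON='3.8') == env)   # False: value differs
```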
diff --git a/dev/archery/setup.py b/dev/archery/archery/utils/__init__.py
similarity index 57%
copy from dev/archery/setup.py
copy to dev/archery/archery/utils/__init__.py
index c7be38e..13a8339 100644
--- a/dev/archery/setup.py
+++ b/dev/archery/archery/utils/__init__.py
@@ -1,4 +1,3 @@
-#!/usr/bin/env python
# Licensed to the Apache Software Foundation (ASF) under one
# or more contributor license agreements. See the NOTICE file
# distributed with this work for additional information
@@ -15,27 +14,3 @@
# KIND, either express or implied. See the License for the
# specific language governing permissions and limitations
# under the License.
-
-import sys
-from setuptools import setup
-
-
-if sys.version_info < (3, 5):
- sys.exit('Python < 3.5 is not supported')
-
-
-setup(
- name='archery',
- version="0.1.0",
- description='Apache Arrow Developers Tools',
- url='http://github.com/apache/arrow',
- maintainer='Arrow Developers',
- maintainer_email='dev@arrow.apache.org',
- packages=['archery'],
- install_requires=['click', 'pygithub'],
- tests_require=['pytest', 'ruamel.yaml'],
- entry_points='''
- [console_scripts]
- archery=archery.cli:archery
- ''',
-)
diff --git a/dev/archery/archery/utils/cache.py b/dev/archery/archery/utils/cache.py
index 2e476bd..d92c5f3 100644
--- a/dev/archery/archery/utils/cache.py
+++ b/dev/archery/archery/utils/cache.py
@@ -70,7 +70,7 @@ class Cache:
"""
def download(path):
""" Tiny wrapper that download a file and save as key. """
- logger.debug(f"Downloading {url} as {path}")
+ logger.debug("Downloading {} as {}".format(url, path))
conn = urlopen(url)
# Ensure the download is completed before writing to disks.
content = conn.read()
diff --git a/dev/archery/archery/utils/cmake.py b/dev/archery/archery/utils/cmake.py
index 4be3dc9..f93895b 100644
--- a/dev/archery/archery/utils/cmake.py
+++ b/dev/archery/archery/utils/cmake.py
@@ -79,7 +79,7 @@ class CMakeDefinition:
def arguments(self):
"""" Return the arguments to cmake invocation. """
arguments = [
- f"-G{self.generator}",
+ "-G{}".format(self.generator),
] + self.definitions + [
self.source
]
@@ -99,9 +99,13 @@ class CMakeDefinition:
if os.path.exists(build_dir):
# Extra safety to ensure we're deleting a build folder.
if not CMakeBuild.is_build_dir(build_dir):
- raise FileExistsError(f"{build_dir} is not a cmake build")
+ raise FileExistsError(
+ "{} is not a cmake build".format(build_dir)
+ )
if not force:
- raise FileExistsError(f"{build_dir} exists use force=True")
+ raise FileExistsError(
+ "{} exists, use force=True".format(build_dir)
+ )
rmtree(build_dir)
os.mkdir(build_dir)
@@ -112,7 +116,7 @@ class CMakeDefinition:
**kwargs)
def __repr__(self):
- return f"CMakeDefinition[source={self.source}]"
+ return "CMakeDefinition[source={}]".format(self.source)
CMAKE_BUILD_TYPE_RE = re.compile("CMAKE_BUILD_TYPE:STRING=([a-zA-Z]+)")
@@ -190,7 +194,7 @@ class CMakeBuild(CMake):
be lost. Only build_type is recovered.
"""
if not CMakeBuild.is_build_dir(path):
- raise ValueError(f"Not a valid CMakeBuild path: {path}")
+ raise ValueError("Not a valid CMakeBuild path: {}".format(path))
build_type = None
# Infer build_type by looking at CMakeCache.txt and looking for a magic
diff --git a/dev/archery/archery/utils/command.py b/dev/archery/archery/utils/command.py
index 77f3e9e..3ef6abe 100644
--- a/dev/archery/archery/utils/command.py
+++ b/dev/archery/archery/utils/command.py
@@ -57,7 +57,7 @@ class Command:
"""
def run(self, *argv, **kwargs):
- assert(hasattr(self, "bin"))
+ assert hasattr(self, "bin")
invocation = shlex.split(self.bin)
invocation.extend(argv)
@@ -70,7 +70,7 @@ class Command:
if "check" not in kwargs:
kwargs["check"] = True
- logger.debug(f"Executing `{invocation}`")
+ logger.debug("Executing `{}`".format(invocation))
return subprocess.run(invocation, **kwargs)
@property
diff --git a/dev/archery/archery/utils/git.py b/dev/archery/archery/utils/git.py
index 0d83f79..798bc5d 100644
--- a/dev/archery/archery/utils/git.py
+++ b/dev/archery/archery/utils/git.py
@@ -16,6 +16,7 @@
# under the License.
from .command import Command, capture_stdout, default_bin
+from ..compat import _stringify_path
# Decorator prepending argv with the git sub-command found with the method
@@ -36,8 +37,8 @@ class Git(Command):
def run_cmd(self, cmd, *argv, git_dir=None, **kwargs):
""" Inject flags before sub-command in argv. """
opts = []
- if git_dir and isinstance(git_dir, str):
- opts.extend(("-C", git_dir))
+ if git_dir is not None:
+ opts.extend(["-C", _stringify_path(git_dir)])
return self.run(*opts, cmd, *argv, **kwargs)
@@ -90,5 +91,10 @@ class Git(Command):
def current_branch(self, **kwargs):
return self.rev_parse("--abbrev-ref", "HEAD", **kwargs)
+ def repository_root(self, git_dir=None, **kwargs):
+ """ Locates the repository's root path from a subdirectory. """
+ stdout = self.rev_parse("--show-toplevel", git_dir=git_dir, **kwargs)
+ return stdout.decode('utf-8')
+
git = Git()
diff --git a/dev/archery/archery/utils/lint.py b/dev/archery/archery/utils/lint.py
index 8b0052a..d24f55b 100644
--- a/dev/archery/archery/utils/lint.py
+++ b/dev/archery/archery/utils/lint.py
@@ -231,7 +231,7 @@ def rat_linter(src, root):
violations = list(report.validate(exclusion=exclusion))
for violation in violations:
- print(f"apache-rat license violation: {violation}")
+ print("apache-rat license violation: {}".format(violation))
yield LintResult(len(violations) == 0)
diff --git a/dev/archery/archery/utils/rat.py b/dev/archery/archery/utils/rat.py
index 53f9da9..ce78f9f 100644
--- a/dev/archery/archery/utils/rat.py
+++ b/dev/archery/archery/utils/rat.py
@@ -24,9 +24,9 @@ from .cache import Cache
from .command import capture_stdout
RAT_VERSION = 0.13
-RAT_JAR_FILENAME = f"apache-rat-{RAT_VERSION}.jar"
-RAT_URL_ = f"https://repo1.maven.org/maven2/org/apache/rat/apache-rat"
-RAT_URL = f"{RAT_URL_}/{RAT_VERSION}/{RAT_JAR_FILENAME}"
+RAT_JAR_FILENAME = "apache-rat-{}.jar".format(RAT_VERSION)
+RAT_URL_ = "https://repo1.maven.org/maven2/org/apache/rat/apache-rat"
+RAT_URL = "/".join([RAT_URL_, str(RAT_VERSION), RAT_JAR_FILENAME])
class Rat(Jar):
@@ -54,7 +54,7 @@ class RatReport:
self.tree = ElementTree.fromstring(xml)
def __repr__(self):
- return f"RatReport({self.xml})"
+ return "RatReport({})".format(self.xml)
def validate(self, exclusion=None):
for r in self.tree.findall('resource'):
diff --git a/dev/archery/archery/utils/source.py b/dev/archery/archery/utils/source.py
index 913e8d7..4f542e8 100644
--- a/dev/archery/archery/utils/source.py
+++ b/dev/archery/archery/utils/source.py
@@ -16,10 +16,16 @@
# under the License.
import os
+from pathlib import Path
+import subprocess
from .git import git
+class InvalidArrowSource(Exception):
+ pass
+
+
class ArrowSources:
""" ArrowSources is a companion class representing a directory containing
Apache Arrow's sources.
@@ -39,49 +45,53 @@ class ArrowSources:
----------
path : src
"""
- assert isinstance(path, str) and ArrowSources.valid(path)
+ path = Path(path)
+ # validate by checking a specific path in the arrow source tree
+ if not (path / 'cpp' / 'CMakeLists.txt').exists():
+ raise InvalidArrowSource(
+ "No Arrow C++ sources found in {}.".format(path)
+ )
self.path = path
@property
def archery(self):
""" Returns the archery directory of an Arrow sources. """
- return os.path.join(self.dev, "archery")
+ return self.dev / "archery"
@property
def cpp(self):
""" Returns the cpp directory of an Arrow sources. """
- return os.path.join(self.path, "cpp")
+ return self.path / "cpp"
@property
def dev(self):
""" Returns the dev directory of an Arrow sources. """
- return os.path.join(self.path, "dev")
+ return self.path / "dev"
@property
def python(self):
""" Returns the python directory of an Arrow sources. """
- return os.path.join(self.path, "python")
+ return self.path / "python"
@property
def pyarrow(self):
""" Returns the python/pyarrow directory of an Arrow sources. """
- return os.path.join(self.python, "pyarrow")
+ return self.python / "pyarrow"
@property
def r(self):
""" Returns the r directory of an Arrow sources. """
- return os.path.join(self.path, "r")
+ return self.path / "r"
@property
def rust(self):
""" Returns the rust directory of an Arrow sources. """
- return os.path.join(self.path, "rust")
+ return self.path / "rust"
@property
def git_backed(self):
""" Indicate if the sources are backed by git. """
- git_path = os.path.join(self.path, ".git")
- return os.path.exists(git_path)
+ return (self.path / ".git").exists()
@property
def git_dirty(self):
@@ -91,10 +101,11 @@ class ArrowSources:
def archive(self, path, dereference=False, compressor=None, revision=None):
""" Saves a git archive at path. """
if not self.git_backed:
- raise ValueError(f"{self} is not backed by git")
+ raise ValueError("{} is not backed by git".format(self))
rev = revision if revision else "HEAD"
- archive = git.archive("--prefix=apache-arrow/", rev, git_dir=self.path)
+ archive = git.archive("--prefix=apache-arrow/", rev,
+ git_dir=self.path)
# TODO(fsaintjacques): fix dereference for
@@ -124,7 +135,7 @@ class ArrowSources:
Path to checkout the local clone.
"""
if not self.git_backed:
- raise ValueError(f"{self} is not backed by git")
+ raise ValueError("{} is not backed by git".format(self))
if revision == ArrowSources.WORKSPACE:
return self, False
@@ -144,17 +155,6 @@ class ArrowSources:
return ArrowSources(clone_dir), True
@staticmethod
- def valid(src):
- """ Indicate if current sources are valid. """
- if isinstance(src, ArrowSources):
- return True
- if isinstance(src, str):
- cpp_path = os.path.join(src, "cpp")
- cmake_path = os.path.join(cpp_path, "CMakeLists.txt")
- return os.path.exists(cmake_path)
- return False
-
- @staticmethod
def find(path=None):
""" Infer Arrow sources directory from various method.
@@ -177,17 +177,30 @@ class ArrowSources:
env = os.environ.get("ARROW_SRC")
# Implicit via cwd
- cwd = os.getcwd()
+ cwd = Path.cwd()
# Implicit via current file
- this_dir = os.path.dirname(os.path.realpath(__file__))
- this = os.path.join(this_dir, "..", "..", "..", "..")
-
- for p in [path, env, cwd, this]:
- if ArrowSources.valid(p):
+ this = Path(__file__).parents[4]
+
+ # Implicit via git repository (if archery is installed system wide)
+ try:
+ repo = git.repository_root(git_dir=cwd)
+ except subprocess.CalledProcessError:
+ # We're not inside a git repository.
+ repo = None
+
+ paths = list(filter(None, [path, env, cwd, this, repo]))
+ for p in paths:
+ try:
return ArrowSources(p)
+ except InvalidArrowSource:
+ pass
- return None
+ searched_paths = "\n".join([" - {}".format(p) for p in paths])
+ raise InvalidArrowSource(
+ "Unable to locate Arrow's source directory. "
+ "Searched paths are:\n{}".format(searched_paths)
+ )
def __repr__(self):
- return f"{self.path}"
+ return str(self.path)
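The reworked `ArrowSources.find` collects candidate directories (explicit path, `ARROW_SRC`, cwd, the installed package location, the enclosing git repository) and returns the first one that validates, raising `InvalidArrowSource` with all searched paths otherwise. A self-contained sketch of that search pattern — `validate` and `find_first_valid` are hypothetical names, not archery's API:

```python
from pathlib import Path
import tempfile


class InvalidArrowSource(Exception):
    pass


def validate(path):
    # Mirror the patch: a valid Arrow checkout has cpp/CMakeLists.txt.
    path = Path(path)
    if not (path / 'cpp' / 'CMakeLists.txt').exists():
        raise InvalidArrowSource(
            "No Arrow C++ sources found in {}.".format(path))
    return path


def find_first_valid(candidates):
    # Try each candidate in priority order; filter(None, ...) drops
    # unset candidates, just as the patch drops a missing ARROW_SRC.
    paths = list(filter(None, candidates))
    for p in paths:
        try:
            return validate(p)
        except InvalidArrowSource:
            pass
    searched = "\n".join(" - {}".format(p) for p in paths)
    raise InvalidArrowSource(
        "Unable to locate Arrow's source directory. "
        "Searched paths are:\n{}".format(searched))


# Demo: build a fake checkout in a temp dir and locate it.
with tempfile.TemporaryDirectory() as tmp:
    fake = Path(tmp) / 'arrow'
    (fake / 'cpp').mkdir(parents=True)
    (fake / 'cpp' / 'CMakeLists.txt').touch()
    print(find_first_valid([None, Path(tmp) / 'elsewhere', fake]) == fake)
```

Raising a dedicated exception instead of the removed `assert`-based `valid()` lets every caller surface a readable error, even under `python -O` where asserts are stripped.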
diff --git a/dev/archery/requirements.txt b/dev/archery/requirements.txt
index 7105030..0e1258a 100644
--- a/dev/archery/requirements.txt
+++ b/dev/archery/requirements.txt
@@ -1,3 +1,4 @@
click
pygithub
-ruamel.yaml
\ No newline at end of file
+python-dotenv
+ruamel.yaml
diff --git a/dev/archery/setup.py b/dev/archery/setup.py
index c7be38e..6952fed 100644
--- a/dev/archery/setup.py
+++ b/dev/archery/setup.py
@@ -16,13 +16,19 @@
# specific language governing permissions and limitations
# under the License.
+import functools
+import operator
import sys
from setuptools import setup
-
if sys.version_info < (3, 5):
sys.exit('Python < 3.5 is not supported')
+extras = {
+ 'bot': ['ruamel.yaml', 'pygithub'],
+ 'docker': ['ruamel.yaml', 'python-dotenv']
+}
+extras['all'] = list(set(functools.reduce(operator.add, extras.values())))
setup(
name='archery',
@@ -31,11 +37,18 @@ setup(
url='http://github.com/apache/arrow',
maintainer='Arrow Developers',
maintainer_email='dev@arrow.apache.org',
- packages=['archery'],
- install_requires=['click', 'pygithub'],
- tests_require=['pytest', 'ruamel.yaml'],
+ packages=[
+ 'archery',
+ 'archery.benchmark',
+ 'archery.integration',
+ 'archery.lang',
+ 'archery.utils'
+ ],
+ install_requires=['click'],
+ tests_require=['pytest', 'responses'],
+ extras_require=extras,
entry_points='''
[console_scripts]
archery=archery.cli:archery
- ''',
+ '''
)
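The setup.py change above splits optional dependencies into `bot` and `docker` extras and derives an `all` extra as their union. The derivation can be sketched in isolation:

```python
import functools
import operator

# Optional dependency groups, as in the patched setup.py.
extras = {
    'bot': ['ruamel.yaml', 'pygithub'],
    'docker': ['ruamel.yaml', 'python-dotenv'],
}
# reduce(operator.add, ...) concatenates the per-group lists; set()
# removes the duplicate 'ruamel.yaml' shared by both groups.
extras['all'] = list(set(functools.reduce(operator.add, extras.values())))
print(sorted(extras['all']))
# ['pygithub', 'python-dotenv', 'ruamel.yaml']
```

Users then opt in per feature, e.g. `pip install -e arrow/dev/archery[docker]` as the CI templates below do.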
diff --git a/dev/tasks/crossbow.py b/dev/tasks/crossbow.py
index 41f1fce..621aa4b 100755
--- a/dev/tasks/crossbow.py
+++ b/dev/tasks/crossbow.py
@@ -769,11 +769,9 @@ class Task(Serializable):
submitting the job to a queue.
"""
- def __init__(self, platform, ci, template, artifacts=None, params=None):
- assert platform in {'win', 'osx', 'linux'}
+ def __init__(self, ci, template, artifacts=None, params=None):
assert ci in {'circle', 'travis', 'appveyor', 'azure', 'github'}
self.ci = ci
- self.platform = platform
self.template = template
self.artifacts = artifacts or []
self.params = params or {}
@@ -1435,7 +1433,7 @@ yaml.register_class(Target)
# define default paths
-DEFAULT_CONFIG_PATH = CWD / 'tasks.yml'
+DEFAULT_CONFIG_PATH = str(CWD / 'tasks.yml')
DEFAULT_ARROW_PATH = CWD.parents[1]
DEFAULT_QUEUE_PATH = CWD.parents[2] / 'crossbow'
diff --git a/dev/tasks/docker-tests/azure.linux.yml b/dev/tasks/docker-tests/azure.linux.yml
index 6df166d..f136947 100644
--- a/dev/tasks/docker-tests/azure.linux.yml
+++ b/dev/tasks/docker-tests/azure.linux.yml
@@ -20,6 +20,12 @@ jobs:
pool:
vmImage: ubuntu-16.04
timeoutInMinutes: 360
+ {%- if env is defined %}
+ variables:
+ {%- for key, value in env.items() %}
+ {{ key }}: {{ value }}
+ {%- endfor %}
+ {%- endif %}
steps:
- task: DockerInstaller@0
@@ -28,6 +34,10 @@ jobs:
dockerVersion: 17.09.0-ce
releaseType: stable
+ - task: UsePythonVersion@0
+ inputs:
+ versionSpec: '3.6'
+
- script: |
git clone --no-checkout {{ arrow.remote }} arrow
git -C arrow fetch -t {{ arrow.remote }} {{ arrow.branch }}
@@ -35,28 +45,8 @@ jobs:
git -C arrow submodule update --init --recursive
displayName: Clone arrow
- - script: |
- {% if env is defined %}
- {%- for key, value in env.items() %}
- export {{ key }}={{ value }}
- {%- endfor %}
- {% endif %}
-
- {% if build is defined %}
- {%- for image in build %}
- docker-compose pull --ignore-pull-failures {{ image }}
- docker-compose build {{ image }}
- {%- endfor %}
- {% endif %}
-
- {% if nocache is defined %}
- {%- for image in nocache %}
- docker-compose build --no-cache {{ image }}
- {%- endfor %}
- {% endif %}
+ - script: pip install -e arrow/dev/archery[docker]
+ displayName: Setup Archery
- {%- for image in run %}
- docker-compose run --rm -e SETUPTOOLS_SCM_PRETEND_VERSION="{{ arrow.no_rc_version }}" {{ image }}
- {%- endfor %}
- workingDirectory: arrow
- displayName: Run docker test
+ - script: archery docker run -e SETUPTOOLS_SCM_PRETEND_VERSION="{{ arrow.no_rc_version }}" {{ run }}
+ displayName: Execute Docker Build
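The template now delegates to `archery docker run -e KEY=value <image>`, which forwards each variable to the underlying `docker-compose run -e` invocation. A hypothetical sketch of flattening such a mapping into CLI flags (`env_flags` is an illustrative helper, not archery's actual implementation):

```python
def env_flags(env):
    """Flatten an environment mapping into docker-compose `-e` flags."""
    args = []
    # Sort for a deterministic command line regardless of dict order.
    for key, value in sorted(env.items()):
        args.extend(['-e', '{}={}'.format(key, value)])
    return args


flags = env_flags({'SETUPTOOLS_SCM_PRETEND_VERSION': '1.0.0',
                   'PYTHON': '3.6'})
print(flags)
# ['-e', 'PYTHON=3.6', '-e', 'SETUPTOOLS_SCM_PRETEND_VERSION=1.0.0']
```

The resulting list would be appended to a `docker-compose run --rm` argument vector before the service name.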
diff --git a/dev/tasks/docker-tests/circle.linux.yml b/dev/tasks/docker-tests/circle.linux.yml
index 9135810..3ddb93d 100644
--- a/dev/tasks/docker-tests/circle.linux.yml
+++ b/dev/tasks/docker-tests/circle.linux.yml
@@ -19,46 +19,29 @@ version: 2
jobs:
build:
machine:
- image: ubuntu-1604:201903-01
+ image: ubuntu-1604:202004-01
+ {%- if env is defined %}
+ environment:
+ {%- for key, value in env.items() %}
+ {{ key }}: {{ value }}
+ {%- endfor %}
+ {%- endif %}
steps:
- - run: docker -v
- - run: docker-compose -v
- {% if arrow.branch == "master" %}
- run: |
- if [ -n $DOCKER_USER ] && [ -n $DOCKER_PASS ]; then
- docker login -u $DOCKER_USER -p $DOCKER_PASS
- fi
- {% endif %}
- - run: git clone --no-checkout {{ arrow.remote }} arrow
- - run: git -C arrow fetch -t {{ arrow.remote }} {{ arrow.branch }}
- - run: git -C arrow checkout FETCH_HEAD
- - run: git -C arrow submodule update --init --recursive
+ docker -v
+ docker-compose -v
+ - run: |
+ git clone --no-checkout {{ arrow.remote }} arrow
+ git -C arrow fetch -t {{ arrow.remote }} {{ arrow.branch }}
+ git -C arrow checkout FETCH_HEAD
+ git -C arrow submodule update --init --recursive
- run:
+ name: Execute Docker Build
command: |
- pushd arrow
- {% if env is defined %}
- {%- for key, value in env.items() %}
- export {{ key }}={{ value }}
- {%- endfor %}
- {% endif %}
-
- {% if build is defined %}
- {%- for image in build %}
- docker-compose pull --ignore-pull-failures {{ image }}
- docker-compose build {{ image }}
- {%- endfor %}
- {% endif %}
-
- {% if nocache is defined %}
- {%- for image in nocache %}
- docker-compose build --no-cache {{ image }}
- {%- endfor %}
- {% endif %}
-
- {%- for image in run %}
- docker-compose run --rm -e SETUPTOOLS_SCM_PRETEND_VERSION="{{ arrow.no_rc_version }}" {{ image }}
- {%- endfor %}
- popd
+ pyenv versions
+ pyenv global 3.6.10
+ pip install -e arrow/dev/archery[docker]
+ archery docker run -e SETUPTOOLS_SCM_PRETEND_VERSION="{{ arrow.no_rc_version }}" {{ run }}
no_output_timeout: "1h"
workflows:
diff --git a/dev/tasks/docker-tests/github.linux.yml b/dev/tasks/docker-tests/github.linux.yml
index e7496bb..1bc9e07 100644
--- a/dev/tasks/docker-tests/github.linux.yml
+++ b/dev/tasks/docker-tests/github.linux.yml
@@ -39,30 +39,18 @@ jobs:
- name: Free Up Disk Space
shell: bash
run: arrow/ci/scripts/util_cleanup.sh
- - name: Run Docker Build
+ - name: Setup Python
+ uses: actions/setup-python@v1
+ with:
+ python-version: 3.8
+ - name: Setup Archery
+ run: pip install -e arrow/dev/archery[docker]
+ - name: Execute Docker Build
shell: bash
- run: |
- pushd arrow
- {% if env is defined %}
- {%- for key, value in env.items() %}
- export {{ key }}={{ value }}
- {%- endfor %}
- {% endif %}
-
- {% if build is defined %}
- {%- for image in build %}
- docker-compose pull --ignore-pull-failures {{ image }}
- docker-compose build {{ image }}
- {%- endfor %}
- {% endif %}
-
- {% if nocache is defined %}
- {%- for image in nocache %}
- docker-compose build --no-cache {{ image }}
- {%- endfor %}
- {% endif %}
-
- {%- for image in run %}
- docker-compose run --rm -e SETUPTOOLS_SCM_PRETEND_VERSION="{{ arrow.no_rc_version }}" {{ image }}
- {%- endfor %}
- popd
+ {%- if env is defined %}
+ env:
+ {%- for key, value in env.items() %}
+ {{ key }}: {{ value }}
+ {%- endfor %}
+ {%- endif %}
+ run: archery docker run -e SETUPTOOLS_SCM_PRETEND_VERSION="{{ arrow.no_rc_version }}" {{ run }}
diff --git a/dev/tasks/tasks.yml b/dev/tasks/tasks.yml
index 25f64dc..51f830d 100644
--- a/dev/tasks/tasks.yml
+++ b/dev/tasks/tasks.yml
@@ -112,7 +112,6 @@ groups:
tasks:
# arbitrary_task_name:
- # platform: osx|linux|win
# template: path of jinja2 templated yml
# params: optional extra parameters
# artifacts: list of regex patterns, each needs to match a single github
@@ -124,7 +123,6 @@ tasks:
conda-linux-gcc-py36:
ci: azure
- platform: linux
template: conda-recipes/azure.linux.yml
params:
config: linux_python3.6
@@ -134,7 +132,6 @@ tasks:
conda-linux-gcc-py37:
ci: azure
- platform: linux
template: conda-recipes/azure.linux.yml
params:
config: linux_python3.7
@@ -144,7 +141,6 @@ tasks:
conda-linux-gcc-py38:
ci: azure
- platform: linux
template: conda-recipes/azure.linux.yml
params:
config: linux_python3.8
@@ -156,7 +152,6 @@ tasks:
conda-osx-clang-py36:
ci: azure
- platform: osx
template: conda-recipes/azure.osx.yml
params:
config: osx_python3.6
@@ -166,7 +161,6 @@ tasks:
conda-osx-clang-py37:
ci: azure
- platform: osx
template: conda-recipes/azure.osx.yml
params:
config: osx_python3.7
@@ -176,7 +170,6 @@ tasks:
conda-osx-clang-py38:
ci: azure
- platform: osx
template: conda-recipes/azure.osx.yml
params:
config: osx_python3.8
@@ -188,7 +181,6 @@ tasks:
conda-win-vs2015-py36:
ci: azure
- platform: win
template: conda-recipes/azure.win.yml
params:
config: win_c_compilervs2015cxx_compilervs2015python3.6vc14
@@ -198,7 +190,6 @@ tasks:
conda-win-vs2015-py37:
ci: azure
- platform: win
template: conda-recipes/azure.win.yml
params:
config: win_c_compilervs2015cxx_compilervs2015python3.7vc14
@@ -208,7 +199,6 @@ tasks:
conda-win-vs2015-py38:
ci: azure
- platform: win
template: conda-recipes/azure.win.yml
params:
config: win_c_compilervs2015cxx_compilervs2015python3.8vc14
@@ -220,7 +210,6 @@ tasks:
wheel-manylinux1-cp35m:
ci: azure
- platform: linux
template: python-wheels/azure.linux.yml
params:
python_version: 3.5
@@ -235,7 +224,6 @@ tasks:
wheel-manylinux1-cp36m:
ci: azure
- platform: linux
template: python-wheels/azure.linux.yml
params:
python_version: 3.6
@@ -250,7 +238,6 @@ tasks:
wheel-manylinux1-cp37m:
ci: azure
- platform: linux
template: python-wheels/azure.linux.yml
params:
python_version: 3.7
@@ -265,7 +252,6 @@ tasks:
wheel-manylinux1-cp38:
ci: azure
- platform: linux
template: python-wheels/azure.linux.yml
params:
python_version: 3.8
@@ -280,7 +266,6 @@ tasks:
wheel-manylinux2010-cp35m:
ci: azure
- platform: linux
template: python-wheels/azure.linux.yml
params:
python_version: 3.5
@@ -295,7 +280,6 @@ tasks:
wheel-manylinux2010-cp36m:
ci: azure
- platform: linux
template: python-wheels/azure.linux.yml
params:
python_version: 3.6
@@ -310,7 +294,6 @@ tasks:
wheel-manylinux2010-cp37m:
ci: azure
- platform: linux
template: python-wheels/azure.linux.yml
params:
python_version: 3.7
@@ -325,7 +308,6 @@ tasks:
wheel-manylinux2010-cp38:
ci: azure
- platform: linux
template: python-wheels/azure.linux.yml
params:
python_version: 3.8
@@ -340,7 +322,6 @@ tasks:
wheel-manylinux2014-cp35m:
ci: azure
- platform: linux
template: python-wheels/azure.linux.yml
params:
python_version: 3.5
@@ -355,7 +336,6 @@ tasks:
wheel-manylinux2014-cp36m:
ci: azure
- platform: linux
template: python-wheels/azure.linux.yml
params:
python_version: 3.6
@@ -370,7 +350,6 @@ tasks:
wheel-manylinux2014-cp37m:
ci: azure
- platform: linux
template: python-wheels/azure.linux.yml
params:
python_version: 3.7
@@ -385,7 +364,6 @@ tasks:
wheel-manylinux2014-cp38:
ci: azure
- platform: linux
template: python-wheels/azure.linux.yml
params:
python_version: 3.8
@@ -402,7 +380,6 @@ tasks:
wheel-osx-cp35m:
ci: travis
- platform: osx
template: python-wheels/travis.osx.yml
params:
python_version: 3.5
@@ -411,7 +388,6 @@ tasks:
wheel-osx-cp36m:
ci: travis
- platform: osx
template: python-wheels/travis.osx.yml
params:
python_version: 3.6
@@ -420,7 +396,6 @@ tasks:
wheel-osx-cp37m:
ci: travis
- platform: osx
template: python-wheels/travis.osx.yml
params:
python_version: 3.7
@@ -429,7 +404,6 @@ tasks:
wheel-osx-cp38:
ci: travis
- platform: osx
template: python-wheels/travis.osx.yml
params:
python_version: 3.8
@@ -440,7 +414,6 @@ tasks:
wheel-win-cp35m:
ci: appveyor
- platform: win
template: python-wheels/appveyor.yml
params:
script: win-build-3.5.bat
@@ -450,7 +423,6 @@ tasks:
wheel-win-cp36m:
ci: appveyor
- platform: win
template: python-wheels/appveyor.yml
params:
script: win-build.bat
@@ -460,7 +432,6 @@ tasks:
wheel-win-cp37m:
ci: appveyor
- platform: win
template: python-wheels/appveyor.yml
params:
script: win-build.bat
@@ -470,7 +441,6 @@ tasks:
wheel-win-cp38:
ci: appveyor
- platform: win
template: python-wheels/appveyor.yml
params:
script: win-build.bat
@@ -482,7 +452,6 @@ tasks:
debian-stretch-amd64:
ci: github
- platform: linux
template: linux-packages/github.linux.amd64.yml
params:
build_task: "apt:build"
@@ -554,7 +523,6 @@ tasks:
debian-stretch-arm64:
ci: travis
- platform: linux
template: linux-packages/travis.linux.arm64.yml
params:
build_task: "apt:build"
@@ -597,7 +565,6 @@ tasks:
debian-buster-amd64:
ci: github
- platform: linux
template: linux-packages/github.linux.amd64.yml
params:
build_task: "apt:build"
@@ -669,7 +636,6 @@ tasks:
debian-buster-arm64:
ci: travis
- platform: linux
template: linux-packages/travis.linux.arm64.yml
params:
build_task: "apt:build"
@@ -712,7 +678,6 @@ tasks:
ubuntu-xenial-amd64:
ci: github
- platform: linux
template: linux-packages/github.linux.amd64.yml
params:
build_task: "apt:build"
@@ -768,7 +733,6 @@ tasks:
ubuntu-xenial-arm64:
ci: travis
- platform: linux
template: linux-packages/travis.linux.arm64.yml
params:
build_task: "apt:build"
@@ -802,7 +766,6 @@ tasks:
ubuntu-bionic-amd64:
ci: github
- platform: linux
template: linux-packages/github.linux.amd64.yml
params:
build_task: "apt:build"
@@ -863,7 +826,6 @@ tasks:
ubuntu-bionic-arm64:
ci: travis
- platform: linux
template: linux-packages/travis.linux.arm64.yml
params:
build_task: "apt:build"
@@ -901,7 +863,6 @@ tasks:
ubuntu-eoan-amd64:
ci: github
- platform: linux
template: linux-packages/github.linux.amd64.yml
params:
build_task: "apt:build"
@@ -962,7 +923,6 @@ tasks:
ubuntu-eoan-arm64:
ci: travis
- platform: linux
template: linux-packages/travis.linux.arm64.yml
params:
build_task: "apt:build"
@@ -1000,7 +960,6 @@ tasks:
ubuntu-focal-amd64:
ci: github
- platform: linux
template: linux-packages/github.linux.amd64.yml
params:
build_task: "apt:build"
@@ -1061,7 +1020,6 @@ tasks:
ubuntu-focal-arm64:
ci: travis
- platform: linux
template: linux-packages/travis.linux.arm64.yml
params:
build_task: "apt:build"
@@ -1099,7 +1057,6 @@ tasks:
centos-6-amd64:
ci: github
- platform: linux
template: linux-packages/github.linux.amd64.yml
params:
build_task: "yum:build"
@@ -1122,7 +1079,6 @@ tasks:
centos-7-amd64:
ci: github
- platform: linux
template: linux-packages/github.linux.amd64.yml
params:
build_task: "yum:build"
@@ -1157,7 +1113,6 @@ tasks:
centos-7-aarch64:
ci: travis
- platform: linux
template: linux-packages/travis.linux.arm64.yml
params:
build_task: "yum:build"
@@ -1190,7 +1145,6 @@ tasks:
centos-8-amd64:
ci: github
- platform: linux
template: linux-packages/github.linux.amd64.yml
params:
build_task: "yum:build"
@@ -1248,7 +1202,6 @@ tasks:
centos-8-aarch64:
ci: travis
- platform: linux
template: linux-packages/travis.linux.arm64.yml
params:
build_task: "yum:build"
@@ -1299,14 +1252,12 @@ tasks:
homebrew-cpp:
ci: travis
- platform: osx
template: homebrew-formulae/travis.osx.yml
params:
formula: apache-arrow.rb
homebrew-cpp-autobrew:
ci: travis
- platform: osx
template: homebrew-formulae/travis.osx.yml
params:
formula: autobrew/apache-arrow.rb
@@ -1314,7 +1265,6 @@ tasks:
homebrew-r-autobrew:
# This tests that the autobrew formula + script work in practice
ci: travis
- platform: osx
template: homebrew-formulae/travis.osx.r.yml
params:
r_version: release
@@ -1323,14 +1273,12 @@ tasks:
gandiva-jar-xenial:
ci: travis
- platform: linux
template: gandiva-jars/travis.linux.yml
artifacts:
- arrow-gandiva-{no_rc_version}-SNAPSHOT.jar
gandiva-jar-osx:
ci: travis
- platform: osx
template: gandiva-jars/travis.osx.yml
artifacts:
- arrow-gandiva-{no_rc_version}-SNAPSHOT.jar
@@ -1339,7 +1287,6 @@ tasks:
verify-rc-binaries-binary:
ci: github
- platform: linux
template: verify-rc/github.nix.yml
params:
os: "ubuntu"
@@ -1348,7 +1295,6 @@ tasks:
verify-rc-binaries-apt:
ci: github
- platform: linux
template: verify-rc/github.nix.yml
params:
os: "ubuntu"
@@ -1357,7 +1303,6 @@ tasks:
verify-rc-binaries-yum:
ci: github
- platform: linux
template: verify-rc/github.nix.yml
params:
os: "ubuntu"
@@ -1366,7 +1311,6 @@ tasks:
verify-rc-wheels-linux:
ci: github
- platform: linux
template: verify-rc/github.nix.yml
params:
os: "ubuntu"
@@ -1375,7 +1319,6 @@ tasks:
verify-rc-wheels-macos:
ci: github
- platform: linux
template: verify-rc/github.nix.yml
params:
os: "macOS"
@@ -1384,7 +1327,6 @@ tasks:
verify-rc-source-macos-java:
ci: github
- platform: linux
template: verify-rc/github.nix.yml
params:
os: "macOS"
@@ -1393,7 +1335,6 @@ tasks:
verify-rc-source-macos-csharp:
ci: github
- platform: linux
template: verify-rc/github.nix.yml
params:
os: "macOS"
@@ -1402,7 +1343,6 @@ tasks:
verify-rc-source-macos-ruby:
ci: github
- platform: linux
template: verify-rc/github.nix.yml
params:
os: "macOS"
@@ -1411,7 +1351,6 @@ tasks:
verify-rc-source-macos-python:
ci: github
- platform: linux
template: verify-rc/github.nix.yml
params:
os: "macOS"
@@ -1420,7 +1359,6 @@ tasks:
verify-rc-source-macos-js:
ci: github
- platform: linux
template: verify-rc/github.nix.yml
params:
os: "macOS"
@@ -1429,7 +1367,6 @@ tasks:
verify-rc-source-macos-go:
ci: github
- platform: linux
template: verify-rc/github.nix.yml
params:
os: "macOS"
@@ -1438,7 +1375,6 @@ tasks:
verify-rc-source-macos-rust:
ci: github
- platform: linux
template: verify-rc/github.nix.yml
params:
os: "macOS"
@@ -1447,7 +1383,6 @@ tasks:
verify-rc-source-macos-integration:
ci: github
- platform: linux
template: verify-rc/github.nix.yml
params:
os: "macOS"
@@ -1456,7 +1391,6 @@ tasks:
verify-rc-source-linux-java:
ci: github
- platform: linux
template: verify-rc/github.nix.yml
params:
os: "ubuntu"
@@ -1465,7 +1399,6 @@ tasks:
verify-rc-source-linux-csharp:
ci: github
- platform: linux
template: verify-rc/github.nix.yml
params:
os: "ubuntu"
@@ -1474,7 +1407,6 @@ tasks:
verify-rc-source-linux-ruby:
ci: github
- platform: linux
template: verify-rc/github.nix.yml
params:
os: "ubuntu"
@@ -1483,7 +1415,6 @@ tasks:
verify-rc-source-linux-python:
ci: github
- platform: linux
template: verify-rc/github.nix.yml
params:
os: "ubuntu"
@@ -1492,7 +1423,6 @@ tasks:
verify-rc-source-linux-js:
ci: github
- platform: linux
template: verify-rc/github.nix.yml
params:
os: "ubuntu"
@@ -1501,7 +1431,6 @@ tasks:
verify-rc-source-linux-go:
ci: github
- platform: linux
template: verify-rc/github.nix.yml
params:
os: "ubuntu"
@@ -1510,7 +1439,6 @@ tasks:
verify-rc-source-linux-rust:
ci: github
- platform: linux
template: verify-rc/github.nix.yml
params:
os: "ubuntu"
@@ -1519,7 +1447,6 @@ tasks:
verify-rc-source-linux-integration:
ci: github
- platform: linux
template: verify-rc/github.nix.yml
params:
os: "ubuntu"
@@ -1528,248 +1455,156 @@ tasks:
verify-rc-source-windows:
ci: github
- platform: linux
template: verify-rc/github.windows.source.yml
verify-rc-wheels-windows:
ci: github
- platform: linux
template: verify-rc/github.windows.wheels.yml
############################## Docker tests #################################
test-conda-cpp:
ci: github
- platform: linux
template: docker-tests/github.linux.yml
params:
- build:
- - conda-cpp
- run:
- - conda-cpp
+ run: conda-cpp
test-conda-cpp-valgrind:
ci: github
- platform: linux
template: docker-tests/github.linux.yml
params:
- build:
- - conda-cpp
- run:
- - conda-cpp-valgrind
+ run: conda-cpp-valgrind
test-debian-10-cpp:
- ci: github
- platform: linux
- template: docker-tests/github.linux.yml
+ ci: circle
+ template: docker-tests/circle.linux.yml
params:
env:
DEBIAN: 10
- build:
- - debian-cpp
- run:
- - debian-cpp
+ run: debian-cpp
test-ubuntu-16.04-cpp:
- ci: github
- platform: linux
- template: docker-tests/github.linux.yml
+ ci: circle
+ template: docker-tests/circle.linux.yml
params:
env:
UBUNTU: 16.04
- build:
- - ubuntu-cpp
- run:
- - ubuntu-cpp
+ run: ubuntu-cpp
test-ubuntu-18.04-cpp:
- ci: github
- platform: linux
- template: docker-tests/github.linux.yml
+ ci: circle
+ template: docker-tests/circle.linux.yml
params:
env:
UBUNTU: 18.04
- build:
- - ubuntu-cpp
- run:
- - ubuntu-cpp
+ run: ubuntu-cpp
test-fedora-30-cpp:
- ci: github
- platform: linux
- template: docker-tests/github.linux.yml
+ ci: circle
+ template: docker-tests/circle.linux.yml
params:
env:
FEDORA: 30
- build:
- - fedora-cpp
- run:
- - fedora-cpp
+ run: fedora-cpp
test-ubuntu-18.04-cpp-release:
- ci: github
- platform: linux
- template: docker-tests/github.linux.yml
+ ci: circle
+ template: docker-tests/circle.linux.yml
params:
- build:
- - ubuntu-cpp
- run:
- - -e ARROW_BUILD_TYPE=release ubuntu-cpp
+ run: "-e ARROW_BUILD_TYPE=release ubuntu-cpp"
test-ubuntu-18.04-cpp-static:
- ci: github
- platform: linux
- template: docker-tests/github.linux.yml
+ ci: circle
+ template: docker-tests/circle.linux.yml
params:
- build:
- - ubuntu-cpp
- run:
- - -e ARROW_BUILD_SHARED=OFF -e ARROW_BUILD_STATIC=ON -e ARROW_TEST_LINKAGE=static ubuntu-cpp
+ run: "-e ARROW_BUILD_SHARED=OFF -e ARROW_BUILD_STATIC=ON -e ARROW_TEST_LINKAGE=static ubuntu-cpp"
test-ubuntu-18.04-cpp-cmake32:
- ci: github
- platform: linux
- template: docker-tests/github.linux.yml
+ ci: circle
+ template: docker-tests/circle.linux.yml
params:
- build:
- - ubuntu-cpp-cmake32
- run:
- - ubuntu-cpp-cmake32
+ run: ubuntu-cpp-cmake32
test-debian-c-glib:
- ci: github
- platform: linux
- template: docker-tests/github.linux.yml
+ ci: circle
+ template: docker-tests/circle.linux.yml
params:
- build:
- - debian-cpp
- - debian-c-glib
- run:
- - debian-c-glib
+ run: debian-c-glib
test-ubuntu-c-glib:
- ci: github
- platform: linux
- template: docker-tests/github.linux.yml
+ ci: circle
+ template: docker-tests/circle.linux.yml
params:
- build:
- - ubuntu-cpp
- - ubuntu-c-glib
- run:
- - ubuntu-c-glib
+ run: ubuntu-c-glib
test-debian-ruby:
- ci: azure
- platform: linux
- template: docker-tests/azure.linux.yml
+ ci: circle
+ template: docker-tests/circle.linux.yml
params:
- build:
- - debian-cpp
- - debian-c-glib
- - debian-ruby
- run:
- - debian-ruby
+ run: debian-ruby
test-ubuntu-ruby:
ci: azure
- platform: linux
template: docker-tests/azure.linux.yml
params:
- build:
- - ubuntu-cpp
- - ubuntu-c-glib
- - ubuntu-ruby
- run:
- - ubuntu-ruby
+ run: ubuntu-ruby
test-conda-python-3.6:
- ci: azure
- platform: linux
- template: docker-tests/azure.linux.yml
+ ci: github
+ template: docker-tests/github.linux.yml
params:
env:
PYTHON: 3.6
- build:
- - conda-cpp
- - conda-python
- run:
- - conda-python
+ run: conda-python
test-conda-python-3.7:
- ci: azure
- platform: linux
- template: docker-tests/azure.linux.yml
+ ci: github
+ template: docker-tests/github.linux.yml
params:
env:
PYTHON: 3.7
- build:
- - conda-cpp
- - conda-python
- run:
- - conda-python
+ run: conda-python
test-conda-python-3.8:
- ci: azure
- platform: linux
- template: docker-tests/azure.linux.yml
+ ci: github
+ template: docker-tests/github.linux.yml
params:
env:
PYTHON: 3.8
- build:
- - conda-cpp
- - conda-python
- run:
- - conda-python
+ run: conda-python
test-debian-10-python-3:
ci: azure
- platform: linux
template: docker-tests/azure.linux.yml
params:
env:
DEBIAN: 10
- build:
- - debian-cpp
- - debian-python
- run:
- - debian-python
+ run: debian-python
test-ubuntu-18.04-python-3:
ci: azure
- platform: linux
template: docker-tests/azure.linux.yml
params:
env:
UBUNTU: 18.04
- build:
- - ubuntu-cpp
- - ubuntu-python
- run:
- - ubuntu-python
+ run: ubuntu-python
test-fedora-30-python-3:
ci: azure
- platform: linux
template: docker-tests/azure.linux.yml
params:
env:
FEDORA: 30
- build:
- - fedora-cpp
- - fedora-python
- run:
- - fedora-python
+ run: fedora-python
test-r-linux-as-cran:
ci: github
- platform: linux
template: r/github.linux.cran.yml
params:
MATRIX: "${{ matrix.r_image }}"
test-r-rhub-ubuntu-gcc-release:
ci: azure
- platform: linux
template: r/azure.linux.yml
params:
r_org: rhub
@@ -1779,7 +1614,6 @@ tasks:
test-r-rocker-r-base-latest:
ci: azure
- platform: linux
template: r/azure.linux.yml
params:
r_org: rocker
@@ -1789,7 +1623,6 @@ tasks:
test-r-rstudio-r-base-3.6-bionic:
ci: azure
- platform: linux
template: r/azure.linux.yml
params:
r_org: rstudio
@@ -1799,7 +1632,6 @@ tasks:
test-r-rstudio-r-base-3.6-centos6:
ci: azure
- platform: linux
template: r/azure.linux.yml
params:
r_org: rstudio
@@ -1809,7 +1641,6 @@ tasks:
test-r-rstudio-r-base-3.6-opensuse15:
ci: azure
- platform: linux
template: r/azure.linux.yml
params:
r_org: rstudio
@@ -1819,7 +1650,6 @@ tasks:
test-r-rstudio-r-base-3.6-opensuse42:
ci: azure
- platform: linux
template: r/azure.linux.yml
params:
r_org: rstudio
@@ -1828,286 +1658,174 @@ tasks:
not_cran: "TRUE"
test-ubuntu-18.04-r-3.6:
- ci: circle
- platform: linux
- template: docker-tests/circle.linux.yml
+ ci: azure
+ template: docker-tests/azure.linux.yml
params:
env:
UBUNTU: 18.04
R: 3.6
- build:
- - ubuntu-cpp
- - ubuntu-r
- run:
- - ubuntu-r
+ run: ubuntu-r
test-conda-r-3.6:
- ci: circle
- platform: linux
- template: docker-tests/circle.linux.yml
+ ci: github
+ template: docker-tests/github.linux.yml
params:
env:
R: 3.6
- build:
- - conda-cpp
- - conda-r
- run:
- - conda-r
+ run: conda-r
test-ubuntu-18.04-r-sanitizer:
- ci: circle
- platform: linux
- template: docker-tests/circle.linux.yml
+ ci: azure
+ template: docker-tests/azure.linux.yml
params:
- build:
- - ubuntu-r-sanitizer
- run:
- - ubuntu-r-sanitizer
+ run: ubuntu-r-sanitizer
test-debian-10-go-1.12:
- ci: circle
- platform: linux
- template: docker-tests/circle.linux.yml
+ ci: azure
+ template: docker-tests/azure.linux.yml
params:
env:
GO: 1.12
- build:
- - debian-go
- run:
- - debian-go
+ run: debian-go
test-ubuntu-18.04-docs:
- ci: circle
- platform: linux
- template: docker-tests/circle.linux.yml
+ ci: azure
+ template: docker-tests/azure.linux.yml
params:
env:
UBUNTU: 18.04
- build:
- - ubuntu-cpp
- - ubuntu-python
- - ubuntu-docs
- run:
- - ubuntu-docs
+ run: ubuntu-docs
############################## Integration tests ############################
test-conda-python-3.7-pandas-latest:
- ci: circle
- platform: linux
- template: docker-tests/circle.linux.yml
+ ci: github
+ template: docker-tests/github.linux.yml
params:
env:
PYTHON: 3.7
PANDAS: latest
- build:
- - conda-cpp
- - conda-python
- nocache:
- - conda-python-pandas
- run:
- - conda-python-pandas
+ # use the latest pandas release, so prevent reusing any cached layers
+ run: --no-cache-leaf conda-python-pandas
test-conda-python-3.8-pandas-latest:
- ci: circle
- platform: linux
- template: docker-tests/circle.linux.yml
+ ci: github
+ template: docker-tests/github.linux.yml
params:
env:
PYTHON: 3.8
PANDAS: latest
- build:
- - conda-cpp
- - conda-python
- nocache:
- - conda-python-pandas
- run:
- - conda-python-pandas
+ # use the latest pandas release, so prevent reusing any cached layers
+ run: --no-cache-leaf conda-python-pandas
test-conda-python-3.7-pandas-master:
- ci: circle
- platform: linux
- template: docker-tests/circle.linux.yml
+ ci: github
+ template: docker-tests/github.linux.yml
params:
env:
PYTHON: 3.7
PANDAS: master
- build:
- - conda-cpp
- - conda-python
- nocache:
- - conda-python-pandas
- run:
- - conda-python-pandas
+ # use the master branch of pandas, so prevent reusing any cached layers
+ run: --no-cache-leaf conda-python-pandas
test-conda-python-3.6-pandas-0.23:
- ci: circle
- platform: linux
- template: docker-tests/circle.linux.yml
+ ci: github
+ template: docker-tests/github.linux.yml
params:
env:
PYTHON: 3.6
PANDAS: 0.23
- build:
- - conda-cpp
- - conda-python
- nocache:
- - conda-python-pandas
- run:
- - conda-python-pandas
+ run: conda-python-pandas
test-conda-python-3.7-dask-latest:
- ci: circle
- platform: linux
- template: docker-tests/circle.linux.yml
+ ci: github
+ template: docker-tests/github.linux.yml
params:
env:
PYTHON: 3.7
DASK: latest
- build:
- - conda-cpp
- - conda-python
- nocache:
- - conda-python-dask
- run:
- - conda-python-dask
+ # use the latest dask release, so prevent reusing any cached layers
+ run: --no-cache-leaf conda-python-dask
test-conda-python-3.8-dask-master:
- ci: circle
- platform: linux
- template: docker-tests/circle.linux.yml
+ ci: github
+ template: docker-tests/github.linux.yml
params:
env:
PYTHON: 3.8
- DASK: latest
- build:
- - conda-cpp
- - conda-python
- nocache:
- - conda-python-dask
- run:
- - conda-python-dask
+ DASK: master
+ # use the master branch of dask, so prevent reusing any cached layers
+ run: --no-cache-leaf conda-python-dask
test-conda-python-3.8-jpype:
- ci: circle
- platform: linux
- template: docker-tests/circle.linux.yml
+ ci: github
+ template: docker-tests/github.linux.yml
params:
env:
PYTHON: 3.8
- build:
- - conda-cpp
- - conda-python
- nocache:
- - conda-python-jpype
- run:
- - conda-python-jpype
+ run: conda-python-jpype
test-conda-python-3.7-turbodbc-latest:
- ci: circle
- platform: linux
- template: docker-tests/circle.linux.yml
+ ci: github
+ template: docker-tests/github.linux.yml
params:
env:
PYTHON: 3.7
TURBODBC: latest
- build:
- - conda-cpp
- - conda-python
- nocache:
- - conda-python-turbodbc
- run:
- - conda-python-turbodbc
+ # use the latest turbodbc release to prevent reusing any cached layers
+ run: --no-cache-leaf conda-python-turbodbc
test-conda-python-3.7-turbodbc-master:
- ci: circle
- platform: linux
- template: docker-tests/circle.linux.yml
+ ci: github
+ template: docker-tests/github.linux.yml
params:
env:
PYTHON: 3.7
TURBODBC: master
- build:
- - conda-cpp
- - conda-python
- nocache:
- - conda-python-turbodbc
- run:
- - conda-python-turbodbc
+ # use the master branch of turbodbc to prevent reusing any cached layers
+ run: --no-cache-leaf conda-python-turbodbc
test-conda-python-3.7-kartothek-latest:
- ci: circle
- platform: linux
- template: docker-tests/circle.linux.yml
+ ci: github
+ template: docker-tests/github.linux.yml
params:
env:
PYTHON: 3.7
KARTOTHEK: latest
- build:
- - conda-cpp
- - conda-python
- nocache:
- - conda-python-kartothek
- run:
- - conda-python-kartothek
+ run: --no-cache-leaf conda-python-kartothek
test-conda-python-3.7-kartothek-master:
- ci: circle
- platform: linux
- template: docker-tests/circle.linux.yml
+ ci: github
+ template: docker-tests/github.linux.yml
params:
env:
PYTHON: 3.7
KARTOTHEK: master
- build:
- - conda-cpp
- - conda-python
- nocache:
- - conda-python-kartothek
- run:
- - conda-python-kartothek
+ # use the master branch of kartothek to prevent reusing any cached layers
+ run: --no-cache-leaf conda-python-kartothek
test-conda-python-3.7-hdfs-2.9.2:
- ci: circle
- platform: linux
- template: docker-tests/circle.linux.yml
+ ci: github
+ template: docker-tests/github.linux.yml
params:
env:
PYTHON: 3.7
HDFS: 2.9.2
- build:
- - conda-cpp
- - conda-python
- nocache:
- - conda-python-hdfs
- run:
- - conda-python-hdfs
+ run: conda-python-hdfs
test-conda-python-3.7-spark-master:
- ci: circle
- platform: linux
- template: docker-tests/circle.linux.yml
+ ci: github
+ template: docker-tests/github.linux.yml
params:
env:
PYTHON: 3.7
SPARK: master
- build:
- - conda-cpp
- - conda-python
- nocache:
- - conda-python-spark
- run:
- - conda-python-spark
+ # use the master branch of spark to prevent reusing any cached layers
+ run: --no-cache-leaf conda-python-spark
# Remove the "skipped-" prefix in ARROW-8475
skipped-test-conda-cpp-hiveserver2:
- ci: circle
- platform: linux
- template: docker-tests/circle.linux.yml
+ ci: github
+ template: docker-tests/github.linux.yml
params:
- build:
- - conda-cpp
- nocache:
- - conda-cpp-hiveserver2
- run:
- - conda-cpp-hiveserver2
+ run: conda-cpp-hiveserver2
diff --git a/docker-compose.yml b/docker-compose.yml
index ec8d483..1f4bcfc 100644
--- a/docker-compose.yml
+++ b/docker-compose.yml
@@ -79,6 +79,64 @@ x-ccache: &ccache
CCACHE_MAXSIZE: 500M
CCACHE_DIR: /build/ccache
+x-hierarchy:
+ # This section is used by the archery tool to enable building nested images,
+ # so it is enough to call:
+ # archery docker run debian-ruby
+ # instead of a sequence of docker-compose commands:
+ # docker-compose build debian-cpp
+ # docker-compose build debian-c-glib
+ # docker-compose build debian-ruby
+ # docker-compose run --rm debian-ruby
+ #
+ # Each node must be either a string scalar or a list containing the
+ # descendant images, if any. Archery checks that every node has a
+ # corresponding service entry, so any new image/service must be listed here.
+ - centos-python-manylinux1
+ - centos-python-manylinux2010
+ - centos-python-manylinux2014
+ - conda-cpp:
+ - conda-cpp-hiveserver2
+ - conda-cpp-valgrind
+ - conda-integration
+ - conda-python:
+ - conda-python-pandas
+ - conda-python-dask
+ - conda-python-hdfs
+ - conda-python-jpype
+ - conda-python-turbodbc
+ - conda-python-kartothek
+ - conda-python-spark
+ - conda-r
+ - cuda-cpp:
+ - cuda-python
+ - debian-cpp:
+ - debian-c-glib:
+ - debian-ruby
+ - debian-python
+ - debian-go
+ - debian-java:
+ - debian-java-jni
+ - debian-js
+ - debian-rust
+ - fedora-cpp:
+ - fedora-python
+ - ubuntu-cpp:
+ - ubuntu-cpp-cmake32
+ - ubuntu-c-glib:
+ - ubuntu-ruby
+ - ubuntu-lint
+ - ubuntu-python:
+ - ubuntu-docs
+ - ubuntu-r
+ - ubuntu-csharp
+ - ubuntu-cpp-sanitizer
+ - ubuntu-r-sanitizer
+ - r
+ # helper services
+ - impala
+ - postgres
+
services:
################################# C++ #######################################
@@ -1148,14 +1206,13 @@ services:
links:
- impala:impala
environment:
- - ARROW_FLIGHT=OFF
- - ARROW_GANDIVA=OFF
- - ARROW_PLASMA=OFF
- - ARROW_HIVESERVER2=ON
- - ARROW_HIVESERVER2_TEST_HOST=impala
- shm_size: *shm-size
- environment:
<<: *ccache
+ ARROW_FLIGHT: "OFF"
+ ARROW_GANDIVA: "OFF"
+ ARROW_PLASMA: "OFF"
+ ARROW_HIVESERVER2: "ON"
+ ARROW_HIVESERVER2_TEST_HOST: impala
+ shm_size: *shm-size
volumes: *conda-volumes
command:
["/arrow/ci/scripts/cpp_build.sh /arrow /build &&
diff --git a/docs/source/developers/cpp/development.rst b/docs/source/developers/cpp/development.rst
index 680f000..f77abac 100644
--- a/docs/source/developers/cpp/development.rst
+++ b/docs/source/developers/cpp/development.rst
@@ -128,8 +128,7 @@ codebase:
.. code-block:: shell
- make -f Makefile.docker build-iwyu
- docker-compose run lint
+ archery docker run ubuntu-lint
Checking for ABI and API stability
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
diff --git a/docs/source/developers/crossbow.rst b/docs/source/developers/crossbow.rst
index 85ccad1..d34aecf 100644
--- a/docs/source/developers/crossbow.rst
+++ b/docs/source/developers/crossbow.rst
@@ -1,22 +1,19 @@
-.. raw:: html
-
- <!--
- Licensed to the Apache Software Foundation (ASF) under one
- or more contributor license agreements. See the NOTICE file
- distributed with this work for additional information
- regarding copyright ownership. The ASF licenses this file
- to you under the Apache License, Version 2.0 (the
- "License"); you may not use this file except in compliance
- with the License. You may obtain a copy of the License at
-
- http://www.apache.org/licenses/LICENSE-2.0
-
- Unless required by applicable law or agreed to in writing, software
- distributed under the License is distributed on an "AS IS" BASIS,
- WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
- See the License for the specific language governing permissions and
- limitations under the License.
- -->
+.. Licensed to the Apache Software Foundation (ASF) under one
+.. or more contributor license agreements. See the NOTICE file
+.. distributed with this work for additional information
+.. regarding copyright ownership. The ASF licenses this file
+.. to you under the Apache License, Version 2.0 (the
+.. "License"); you may not use this file except in compliance
+.. with the License. You may obtain a copy of the License at
+
+.. http://www.apache.org/licenses/LICENSE-2.0
+
+.. Unless required by applicable law or agreed to in writing,
+.. software distributed under the License is distributed on an
+.. "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+.. KIND, either express or implied. See the License for the
+.. specific language governing permissions and limitations
+.. under the License.
Packaging and Testing with Crossbow
===================================
diff --git a/docs/source/developers/docker.rst b/docs/source/developers/docker.rst
new file mode 100644
index 0000000..e3f125d
--- /dev/null
+++ b/docs/source/developers/docker.rst
@@ -0,0 +1,232 @@
+.. Licensed to the Apache Software Foundation (ASF) under one
+.. or more contributor license agreements. See the NOTICE file
+.. distributed with this work for additional information
+.. regarding copyright ownership. The ASF licenses this file
+.. to you under the Apache License, Version 2.0 (the
+.. "License"); you may not use this file except in compliance
+.. with the License. You may obtain a copy of the License at
+
+.. http://www.apache.org/licenses/LICENSE-2.0
+
+.. Unless required by applicable law or agreed to in writing,
+.. software distributed under the License is distributed on an
+.. "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+.. KIND, either express or implied. See the License for the
+.. specific language governing permissions and limitations
+.. under the License.
+
+Running Docker Builds
+=====================
+
+Most of our Linux-based continuous integration tasks are decoupled from public
+CI services using docker and docker-compose. Keeping the CI configuration
+minimal makes local reproducibility possible.
+
+Usage
+-----
+
+There are multiple ways to execute the docker based builds. The recommended
+way is to use the archery tool:
+
+Installation
+~~~~~~~~~~~~
+
+``archery`` requires ``python>=3.5``. It is recommended to install archery in
+``editable`` mode with the ``-e`` flag, so the installation is automatically
+updated when pulling the arrow repository.
+
+.. code:: bash
+
+ pip install -e dev/archery[docker]
+
+For the available commands and options invoke the installed archery commands
+with the ``--help`` flag:
+
+.. code:: bash
+
+ archery docker --help
+ archery docker run --help
+
+
+Examples
+~~~~~~~~
+
+**List the available images:**
+
+.. code:: bash
+
+ archery docker images
+
+**Execute a build:**
+
+.. code:: bash
+
+ archery docker run conda-python
+
+Archery calls the following docker-compose commands:
+
+.. code:: bash
+
+ docker-compose pull --ignore-pull-failures conda-cpp
+ docker-compose build conda-cpp
+ docker-compose pull --ignore-pull-failures conda-python
+ docker-compose build conda-python
+ docker-compose run --rm conda-python
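The expansion above can be sketched in Python. This is a hypothetical, simplified model of how a tool could walk the ``x-hierarchy`` tree to find the ancestor chain of a target image; the names ``hierarchy`` and ``ancestor_chain`` are illustrative, not archery's actual API:

```python
# Simplified model of the x-hierarchy tree from docker-compose.yml:
# each key is an image, each value maps to its descendant images.
hierarchy = {
    "conda-cpp": {
        "conda-python": {
            "conda-python-pandas": {},
        },
    },
}

def ancestor_chain(tree, target, path=()):
    """Return the root-to-target image path, or None if target is absent."""
    for name, children in tree.items():
        current = path + (name,)
        if name == target:
            return current
        found = ancestor_chain(children, target, current)
        if found:
            return found
    return None

# Expanding "conda-python" yields its ancestors in build order,
# reproducing the five docker-compose commands listed above.
chain = ancestor_chain(hierarchy, "conda-python")
for image in chain:
    print("docker-compose pull --ignore-pull-failures " + image)
    print("docker-compose build " + image)
print("docker-compose run --rm " + chain[-1])
```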
+
+**Show the docker-compose commands instead of executing them:**
+
+.. code:: bash
+
+ archery docker run --dry-run conda-python
+
+**To disable pulling and build without the cache:**
+
+.. code:: bash
+
+ archery docker run --no-cache conda-python
+
+Which translates to:
+
+.. code:: bash
+
+ docker-compose build --no-cache conda-cpp
+ docker-compose build --no-cache conda-python
+ docker-compose run --rm conda-python
+
+**To disable the cache only for the leaf image:**
+
+This is useful to force building against the development version of a
+dependency. In the example below the command builds the
+``conda-cpp > conda-python > conda-python-pandas`` branch of the image tree,
+where the leaf image is ``conda-python-pandas``.
+
+.. code:: bash
+
+ PANDAS=master archery docker run --no-cache-leaf conda-python-pandas
+
+Which translates to:
+
+.. code:: bash
+
+ export PANDAS=master
+ docker-compose pull --ignore-pull-failures conda-cpp
+ docker-compose build conda-cpp
+ docker-compose pull --ignore-pull-failures conda-python
+ docker-compose build conda-python
+ docker-compose build --no-cache conda-python-pandas
+ docker-compose run --rm conda-python-pandas
+
+Note that it doesn't pull the conda-python-pandas image and disables the
+cache when building it.
+
+``PANDAS`` is a `build parameter <Docker Build Parameters>`_; see the
+defaults in the ``.env`` file.
+
+**To entirely skip building the image:**
+
+The layer-caching mechanism of docker-compose can be less reliable than
+docker's, depending on the version, the ``cache_from`` build entry, and the
+backend used (docker-py, docker-cli, or docker-cli with buildkit). This can
+lead to different layer hashes - even when executing the same build command
+repeatedly - eventually causing cache misses and full image rebuilds.
+
+*If the image has been already built but the cache doesn't work properly*, it
+can be useful to skip the build phases:
+
+.. code:: bash
+
+ # first run ensures that the image is built
+ archery docker run conda-python
+
+ # if the second run tries to build the image again and none of the files
+ # referenced in the relevant dockerfile have changed, then it indicates a
+ # cache miss caused by the issue described above
+ archery docker run conda-python
+
+ # since the image is properly built with the first command, there is no
+ # need to rebuild it, so manually disable the build phase to spare the
+ # build time
+ archery docker run --no-build conda-python
+
+**Pass environment variables to the container:**
+
+Most of the build scripts used within the containers can be configured through
+environment variables. Pass them using ``--env`` or ``-e`` CLI options -
+similar to the ``docker run`` and ``docker-compose run`` interface.
+
+.. code:: bash
+
+ archery docker run --env CMAKE_BUILD_TYPE=release ubuntu-cpp
+
+For the available environment variables in the C++ builds see the
+``ci/scripts/cpp_build.sh`` script.
+
+**Run the image with custom command:**
+
+Custom docker commands may be passed as the second argument to
+``archery docker run``.
+
+The following example starts an interactive ``bash`` session in the container
+- useful for debugging the build interactively:
+
+.. code:: bash
+
+ archery docker run ubuntu-cpp bash
+
+
+Development
+-----------
+
+The docker-compose configuration is tuned towards reusable development
+containers using hierarchical images. For example, multiple language bindings
+depend on the C++ implementation, so instead of redefining the C++ environment
+in multiple Dockerfiles, we can reuse the exact same base C++ image when
+building the GLib, Ruby, R, and Python bindings.
+This reduces duplication and streamlines maintenance, but makes the
+docker-compose configuration more complicated.
+
+Docker Build Parameters
+~~~~~~~~~~~~~~~~~~~~~~~
+
+The build-time parameters are pushed down to the dockerfiles to make image
+building more flexible. These parameters are usually called docker build
+args, but we pass these values as environment variables through
+docker-compose.yml. The build parameters are extensively used for:
+
+- defining the docker registry used for caching
+- platform architectures
+- operating systems and versions
+- defining various versions of dependencies
+
+The default parameter values are stored in the top level .env file.
+For detailed examples see the docker-compose.yml.
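The precedence between the ``.env`` defaults and the caller's environment can be sketched as follows. This is a hypothetical illustration of the docker-compose convention (a real environment variable overrides the default from the ``.env`` file); the helper ``resolve`` and the sample values are not part of the actual tooling:

```python
import os

# Illustrative defaults, standing in for the values read from the .env file.
env_defaults = {"PYTHON": "3.6", "PANDAS": "latest"}

def resolve(name, defaults, environ=os.environ):
    """A variable set in the environment wins over the .env default,
    mirroring docker-compose's substitution rules."""
    return environ.get(name, defaults.get(name, ""))
```

For example, running ``PANDAS=master archery docker run conda-python-pandas`` would resolve ``PANDAS`` to ``master`` while ``PYTHON`` keeps its default.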
+
+Build Scripts
+~~~~~~~~~~~~~
+
+The scripts maintained under the ci/scripts directory should be kept
+parametrizable but reasonably minimal, so that each clearly encapsulates
+the task it is responsible for, for example:
+
+- ``cpp_build.sh``: build the C++ implementation without running the tests.
+- ``cpp_test.sh``: execute the C++ tests.
+- ``python_build.sh``: build the Python bindings without running the tests.
+- ``python_test.sh``: execute the python tests.
+- ``docs_build.sh``: build the Sphinx documentation.
+- ``integration_dask.sh``: execute the dask integration tests.
+- ``integration_pandas.sh``: execute the pandas integration tests.
+- ``install_minio.sh``: install minio server for multiple platforms.
+- ``install_conda.sh``: install miniconda for multiple platforms.
+
+The parametrization (like the C++ CMake options) is achieved via environment
+variables with useful defaults to keep the build configurations declarative.
+
+A good example is the ``cpp_build.sh`` build script, which forwards
+environment variables as CMake options, so the same script can be invoked
+in various configurations without modification. For examples, see how the
+environment variables are passed to the C++ images in docker-compose.yml.
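The forwarding pattern can be sketched as follows. This is a hypothetical, condensed model of the convention the build scripts follow, not the actual contents of ``cpp_build.sh``; the function name and default values are illustrative:

```python
def cmake_args(environ, defaults):
    """Turn whitelisted environment variables into CMake -D options,
    falling back to declarative defaults when a variable is unset."""
    merged = {**defaults, **{k: v for k, v in environ.items() if k in defaults}}
    return ["-D{}={}".format(k, v) for k, v in sorted(merged.items())]

# With ARROW_FLIGHT=ON exported, the default OFF is overridden while the
# other options keep their declared defaults.
args = cmake_args(
    {"ARROW_FLIGHT": "ON"},
    {"ARROW_FLIGHT": "OFF", "ARROW_PLASMA": "OFF"},
)
```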
+
+Adding New Images
+~~~~~~~~~~~~~~~~~
+
+See the inline comments available in the docker-compose.yml file.
diff --git a/docs/source/developers/documentation.rst b/docs/source/developers/documentation.rst
index e2e9e88..5878aa5 100644
--- a/docs/source/developers/documentation.rst
+++ b/docs/source/developers/documentation.rst
@@ -86,16 +86,14 @@ format in ``docs/_build/html``. In particular, you can point your browser
at ``docs/_build/html/index.html`` to read the docs and review any changes
you made.
-
-.. _building-docker:
-
Building with Docker
--------------------
-You can use Docker to build the documentation:
+You can use Archery to build the documentation within a docker container.
+For installation and usage see the `Running Docker Builds`_ section.
.. code-block:: shell
- make -f Makefile.docker docs
+ archery docker run ubuntu-docs
The final output is located under ``docs/_build/html``.
diff --git a/docs/source/developers/integration.rst b/docs/source/developers/integration.rst
index 7b87733..e6ce3be 100644
--- a/docs/source/developers/integration.rst
+++ b/docs/source/developers/integration.rst
@@ -31,11 +31,11 @@ Docker images (services)
------------------------
The docker-compose services are defined in the ``docker-compose.yml`` file.
-Each service usually correspond to a language binding or an important service to
-test with Arrow.
+Each service usually corresponds to a language binding or an important service
+to test with Arrow.
-Services are configured with 2 local mounts, ``/arrow`` for the top-level source
-directory and ``/build`` for caching build artifacts. The source level
+Services are configured with 2 local mounts, ``/arrow`` for the top-level
+source directory and ``/build`` for caching build artifacts. The source level
directory mount can be paired with git checkout to test a specific commit. The
build mount is used for caching and sharing state between staged images.
@@ -56,15 +56,19 @@ build mount is used for caching and sharing state between staged images.
You can build and run a service by using the `build` and `run` docker-compose
sub-command, e.g. `docker-compose build python && docker-compose run python`.
We do not publish the build images, you need to build them manually. This
-method requires the user to build the images in reverse dependency order. To
-simplify this, we provide a Makefile.
+method requires the user to build the images in reverse dependency order.
.. code-block:: shell
# Build and run manually
- docker-compose build cpp
- docker-compose build python
- docker-compose run python
+ docker-compose build conda-cpp
+ docker-compose build conda-python
+ docker-compose run conda-python
- # Using the makefile with proper image dependency resolution
- make -f Makefile.docker python
+To simplify this, Archery provides a command for it:
+
+.. code-block:: shell
+
+ archery docker run conda-python
+
+See `Running Docker Builds`_ for more details.
diff --git a/docs/source/example.gz b/docs/source/example.gz
new file mode 100644
index 0000000..4fc6040
Binary files /dev/null and b/docs/source/example.gz differ
diff --git a/docs/source/index.rst b/docs/source/index.rst
index a3f3302..021e2d5 100644
--- a/docs/source/index.rst
+++ b/docs/source/index.rst
@@ -69,5 +69,6 @@ such topics as:
developers/python
developers/integration
developers/crossbow
+ developers/docker
developers/benchmarks
developers/documentation