Posted to commits@airflow.apache.org by "ASF GitHub Bot (JIRA)" <ji...@apache.org> on 2018/08/22 08:27:00 UTC

[jira] [Commented] (AIRFLOW-2499) Dockerised CI pipeline

    [ https://issues.apache.org/jira/browse/AIRFLOW-2499?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16588544#comment-16588544 ] 

ASF GitHub Bot commented on AIRFLOW-2499:
-----------------------------------------

Fokko closed pull request #3393: [AIRFLOW-2499] Dockerised CI pipeline
URL: https://github.com/apache/incubator-airflow/pull/3393
 
 
   

This is a PR merged from a forked repository. As GitHub hides the original
diff on merge, it is reproduced below for the sake of provenance:

diff --git a/.gitignore b/.gitignore
index 9749b1aed3..d005437890 100644
--- a/.gitignore
+++ b/.gitignore
@@ -144,3 +144,5 @@ scripts/ci/kubernetes/kube/.generated/airflow.yaml
 node_modules
 npm-debug.log*
 static/dist
+derby.log
+metastore_db
diff --git a/.travis.yml b/.travis.yml
index ecc4e62ac9..9c7cfd0208 100644
--- a/.travis.yml
+++ b/.travis.yml
@@ -19,94 +19,40 @@
 sudo: true
 dist: trusty
 language: python
-jdk:
-  - oraclejdk8
-services:
-  - cassandra
-  - mongodb
-  - mysql
-  - postgresql
-  - rabbitmq
-addons:
-  apt:
-    packages:
-      - slapd
-      - ldap-utils
-      - openssh-server
-      - mysql-server-5.6
-      - mysql-client-core-5.6
-      - mysql-client-5.6
-      - krb5-user
-      - krb5-kdc
-      - krb5-admin-server
-      - oracle-java8-installer
-  postgresql: "9.2"
-python:
-  - "2.7"
-  - "3.5"
 env:
   global:
+    - DOCKER_COMPOSE_VERSION=1.20.0
     - SLUGIFY_USES_TEXT_UNIDECODE=yes
     - TRAVIS_CACHE=$HOME/.travis_cache/
-    - KRB5_CONFIG=/etc/krb5.conf
-    - KRB5_KTNAME=/etc/airflow.keytab
-    # Travis on google cloud engine has a global /etc/boto.cfg that
-    # does not work with python 3
-    - BOTO_CONFIG=/tmp/bogusvalue
   matrix:
+    - TOX_ENV=flake8
     - TOX_ENV=py27-backend_mysql
     - TOX_ENV=py27-backend_sqlite
     - TOX_ENV=py27-backend_postgres
-    - TOX_ENV=py35-backend_mysql
-    - TOX_ENV=py35-backend_sqlite
-    - TOX_ENV=py35-backend_postgres
-    - TOX_ENV=flake8
+    - TOX_ENV=py35-backend_mysql PYTHON_VERSION=3
+    - TOX_ENV=py35-backend_sqlite PYTHON_VERSION=3
+    - TOX_ENV=py35-backend_postgres PYTHON_VERSION=3
     - TOX_ENV=py27-backend_postgres KUBERNETES_VERSION=v1.9.0
-    - TOX_ENV=py35-backend_postgres KUBERNETES_VERSION=v1.10.0
-matrix:
-  exclude:
-    - python: "3.5"
-      env: TOX_ENV=py27-backend_mysql
-    - python: "3.5"
-      env: TOX_ENV=py27-backend_sqlite
-    - python: "3.5"
-      env: TOX_ENV=py27-backend_postgres
-    - python: "2.7"
-      env: TOX_ENV=py35-backend_mysql
-    - python: "2.7"
-      env: TOX_ENV=py35-backend_sqlite
-    - python: "2.7"
-      env: TOX_ENV=py35-backend_postgres
-    - python: "2.7"
-      env: TOX_ENV=flake8
-    - python: "3.5"
-      env: TOX_ENV=py27-backend_postgres KUBERNETES_VERSION=v1.9.0
-    - python: "2.7"
-      env: TOX_ENV=py35-backend_postgres KUBERNETES_VERSION=v1.10.0
+    - TOX_ENV=py35-backend_postgres KUBERNETES_VERSION=v1.10.0 PYTHON_VERSION=3
 cache:
   directories:
     - $HOME/.wheelhouse/
+    - $HOME/.cache/pip
     - $HOME/.travis_cache/
 before_install:
-  - yes | ssh-keygen -t rsa -C your_email@youremail.com -P '' -f ~/.ssh/id_rsa
-  - cat ~/.ssh/id_rsa.pub >> ~/.ssh/authorized_keys
-  - ln -s ~/.ssh/authorized_keys ~/.ssh/authorized_keys2
-  - chmod 600 ~/.ssh/*
-  - jdk_switcher use oraclejdk8
+  - sudo ls -lh $HOME/.cache/pip/
+  - sudo rm -rf $HOME/.cache/pip/* $HOME/.wheelhouse/*
+  - sudo chown -R travis.travis $HOME/.cache/pip
 install:
+  # Use recent docker-compose version
+  - sudo rm /usr/local/bin/docker-compose
+  - curl -L https://github.com/docker/compose/releases/download/${DOCKER_COMPOSE_VERSION}/docker-compose-`uname -s`-`uname -m` > docker-compose
+  - chmod +x docker-compose
+  - sudo mv docker-compose /usr/local/bin
   - pip install --upgrade pip
-  - pip install tox
   - pip install codecov
-before_script:
-  - cat "$TRAVIS_BUILD_DIR/scripts/ci/my.cnf" | sudo tee -a /etc/mysql/my.cnf
-  - mysql -e 'drop database if exists airflow; create database airflow' -u root
-  - sudo service mysql restart
-  - psql -c 'create database airflow;' -U postgres
-  - export PATH=${PATH}:/tmp/hive/bin
-  # Required for K8s v1.10.x. See
-  # https://github.com/kubernetes/kubernetes/issues/61058#issuecomment-372764783
-  - sudo mount --make-shared / && sudo service docker restart
 script:
-  - ./scripts/ci/travis_script.sh
+  - docker-compose --log-level ERROR -f scripts/ci/docker-compose.yml run airflow-testing /app/scripts/ci/run-ci.sh
 after_success:
+  - sudo chown -R travis.travis .
   - codecov
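
For reference, the new pipeline can be reproduced locally through the same
entry point Travis now calls; a sketch, assuming Docker and docker-compose
(>= 1.20.0) are installed and you are in the repository root:

    # TOX_ENV selects one environment from the matrix above; compose passes
    # it through to the airflow-testing container
    TOX_ENV=py27-backend_sqlite \
    docker-compose --log-level ERROR -f scripts/ci/docker-compose.yml \
        run airflow-testing /app/scripts/ci/run-ci.sh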
diff --git a/CONTRIBUTING.md b/CONTRIBUTING.md
index e6df3d4751..beaf609b5a 100644
--- a/CONTRIBUTING.md
+++ b/CONTRIBUTING.md
@@ -3,22 +3,20 @@
 Contributions are welcome and are greatly appreciated! Every
 little bit helps, and credit will always be given.
 
-## Table of Contents
-
-- [TOC](#table-of-contents)
-- [Types of Contributions](#types-of-contributions)
-  - [Report Bugs](#report-bugs)
-  - [Fix Bugs](#fix-bugs)
-  - [Implement Features](#implement-features)
-  - [Improve Documentation](#improve-documentation)
-  - [Submit Feedback](#submit-feedback)
-- [Documentation](#documentation)
-- [Development and Testing](#development-and-testing)
-  - [Setting up a development environment](#setting-up-a-development-environment)
-  - [Pull requests guidelines](#pull-request-guidelines)
-  - [Testing on Travis CI](#testing-on-travis-ci)
-  - [Testing Locally](#testing-locally)
-- [Changing the Metadata Database](#changing-the-metadata-database)
+# Table of Contents
+  * [TOC](#table-of-contents)
+  * [Types of Contributions](#types-of-contributions)
+      - [Report Bugs](#report-bugs)
+      - [Fix Bugs](#fix-bugs)
+      - [Implement Features](#implement-features)
+      - [Improve Documentation](#improve-documentation)
+      - [Submit Feedback](#submit-feedback)
+  * [Documentation](#documentation)
+  * [Development and Testing](#development-and-testing)
+      - [Setting up a development environment](#setting-up-a-development-environment)
+      - [Running unit tests](#running-unit-tests)
+  * [Pull Request Guidelines](#pull-request-guidelines)
+  * [Changing the Metadata Database](#changing-the-metadata-database)
 
 ## Types of Contributions
 
@@ -83,57 +81,110 @@ extras to build the full API reference.
 
 ## Development and Testing
 
-### Set up a development env using Docker
+### Set up a development environment
 
-Go to your Airflow directory and start a new docker container. You can choose between Python 2 or 3, whatever you prefer.
+There are three ways to set up an Apache Airflow development environment.
 
-```
-# Start docker in your Airflow directory
-docker run -t -i -v `pwd`:/airflow/ -w /airflow/ -e SLUGIFY_USES_TEXT_UNIDECODE=yes python:2 bash
+1. Using tools and libraries installed directly on your system.
+
+  First, install Python (2.7.x or 3.4.x), MySQL, and libxml using system-level package
+  managers such as yum or apt-get on Linux, or Homebrew on Mac OS. Refer to the [base CI Dockerfile](https://github.com/apache/incubator-airflow-ci/blob/master/Dockerfile.base) for
+  a comprehensive list of required packages.
+
+  Then install the Python development requirements. It is usually best to work in a virtualenv:
+
+  ```bash
+  cd $AIRFLOW_HOME
+  virtualenv env
+  source env/bin/activate
+  pip install -e .[devel]
+  ```
+
+2. Using a Docker container
+
+  Go to your Airflow directory and start a new docker container. You can choose between Python 2 or 3, whatever you prefer.
+
+  ```
+  # Start docker in your Airflow directory
+  docker run -t -i -v `pwd`:/airflow/ -w /airflow/ -e SLUGIFY_USES_TEXT_UNIDECODE=yes python:2 bash
+
+  # Go to the Airflow directory
+  cd /airflow/
+
+  # Install Airflow with all the required dependencies,
+  # including the devel which will provide the development tools
+  pip install -e ".[hdfs,hive,druid,devel]"
+
+  # Init the database
+  airflow initdb
+
+  nosetests -v tests/hooks/test_druid_hook.py
+
+    test_get_first_record (tests.hooks.test_druid_hook.TestDruidDbApiHook) ... ok
+    test_get_records (tests.hooks.test_druid_hook.TestDruidDbApiHook) ... ok
+    test_get_uri (tests.hooks.test_druid_hook.TestDruidDbApiHook) ... ok
+    test_get_conn_url (tests.hooks.test_druid_hook.TestDruidHook) ... ok
+    test_submit_gone_wrong (tests.hooks.test_druid_hook.TestDruidHook) ... ok
+    test_submit_ok (tests.hooks.test_druid_hook.TestDruidHook) ... ok
+    test_submit_timeout (tests.hooks.test_druid_hook.TestDruidHook) ... ok
+    test_submit_unknown_response (tests.hooks.test_druid_hook.TestDruidHook) ... ok
+
+    ----------------------------------------------------------------------
+    Ran 8 tests in 3.036s
+
+    OK
+  ```
+
+  The Airflow code is mounted inside the Docker container, so if you change something using your favorite IDE, you can directly test it in the container.
+
+3. Using [Docker Compose](https://docs.docker.com/compose/) and Airflow's CI scripts.
 
-# Install Airflow with all the required dependencies,
-# including the devel which will provide the development tools
-pip install -e .[devel,druid,hdfs,hive]
+  Start a docker container through Compose for development to avoid installing the packages directly on your system. The following gives you a shell inside a container, starts all required service containers (MySQL, PostgreSQL, krb5 and so on), and installs all the dependencies:
 
-# Init the database
-airflow initdb
+  ```bash
+  docker-compose -f scripts/ci/docker-compose.yml run airflow-testing bash
+  # From the container
+  pip install -e .[devel]
+  # Run all the tests with python and mysql through tox
+  tox -e py35-backend_mysql
+  ```
 
-nosetests -v tests/hooks/test_druid_hook.py
+### Running unit tests
 
-  test_get_first_record (tests.hooks.test_druid_hook.TestDruidDbApiHook) ... ok
-  test_get_records (tests.hooks.test_druid_hook.TestDruidDbApiHook) ... ok
-  test_get_uri (tests.hooks.test_druid_hook.TestDruidDbApiHook) ... ok
-  test_get_conn_url (tests.hooks.test_druid_hook.TestDruidHook) ... ok
-  test_submit_gone_wrong (tests.hooks.test_druid_hook.TestDruidHook) ... ok
-  test_submit_ok (tests.hooks.test_druid_hook.TestDruidHook) ... ok
-  test_submit_timeout (tests.hooks.test_druid_hook.TestDruidHook) ... ok
-  test_submit_unknown_response (tests.hooks.test_druid_hook.TestDruidHook) ... ok
+To run tests locally, once your unit test environment is set up (directly on your
+system or through our Docker setup), you should be able to simply run
+``./run_unit_tests.sh`` at will.
 
-  ----------------------------------------------------------------------
-  Ran 8 tests in 3.036s
+For example, in order to just execute the "core" unit tests, run the following:
 
-  OK
+```
+./run_unit_tests.sh tests.core:CoreTest -s --logging-level=DEBUG
 ```
 
-The Airflow code is mounted inside of the Docker container, so if you change something using your favorite IDE, you can directly test is in the container.
+or a single test method:
 
-### Set up a development env using Virtualenv
+```
+./run_unit_tests.sh tests.core:CoreTest.test_check_operators -s --logging-level=DEBUG
+```
 
-Please install python(2.7.x or 3.4.x), mysql, and libxml by using system-level package
-managers like yum, apt-get for Linux, or homebrew for Mac OS at first.
-It is usually best to work in a virtualenv and tox. Install development requirements:
+To run the whole test suite with Docker Compose, do:
 
 ```
-cd $AIRFLOW_HOME
-virtualenv env
-source env/bin/activate
-pip install -e .[devel]
-tox
+# Install Docker Compose first, then this will run the tests
+docker-compose -f scripts/ci/docker-compose.yml run airflow-testing /app/scripts/ci/run-ci.sh
 ```
 
+Alternatively, you can also set up [Travis CI](https://travis-ci.org/) on your repo to automate this.
+It is free for open source projects.
+
+For more information on how to run a subset of the tests, take a look at the
+nosetests docs.
+
+See also the list of test classes and methods in `tests/core.py`.
+
 Feel free to customize based on the extras available in [setup.py](./setup.py)
 
-### Pull Request Guidelines
+## Pull Request Guidelines
 
 Before you submit a pull request from your forked repo, check that it
 meets these guidelines:
@@ -213,64 +264,6 @@ More information:
 [travis-ci-open-source]: https://docs.travis-ci.com/user/open-source-on-travis-ci-com/
 [travis-ci-org-vs-com]: https://devops.stackexchange.com/a/4305/8830
 
-### Testing locally
-
-#### TL;DR
-
-Tests can then be run with (see also the [Running unit tests](#running-unit-tests) section below):
-
-```
-./run_unit_tests.sh
-```
-
-Individual test files can be run with:
-
-```
-nosetests [path to file]
-```
-
-#### Running unit tests
-
-We *highly* recommend setting up [Travis CI](https://travis-ci.org/) on
-your repo to automate this. It is free for open source projects. If for
-some reason you cannot, you can use the steps below to run tests.
-
-Here are loose guidelines on how to get your environment to run the unit tests.
-We do understand that no one out there can run the full test suite since
-Airflow is meant to connect to virtually any external system and that you most
-likely have only a subset of these in your environment. You should run the
-CoreTests and tests related to things you touched in your PR.
-
-To set up a unit test environment, first take a look at `run_unit_tests.sh` and
-understand that your ``AIRFLOW_CONFIG`` points to an alternate config file
-while running the tests. You shouldn't have to alter this config file but
-you may if need be.
-
-From that point, you can actually export these same environment variables in
-your shell, start an Airflow webserver ``airflow webserver -d`` and go and
-configure your connection. Default connections that are used in the tests
-should already have been created, you just need to point them to the systems
-where you want your tests to run.
-
-Once your unit test environment is setup, you should be able to simply run
-``./run_unit_tests.sh`` at will.
-
-For example, in order to just execute the "core" unit tests, run the following:
-
-```
-./run_unit_tests.sh tests.core:CoreTest -s --logging-level=DEBUG
-```
-
-or a single test method:
-
-```
-./run_unit_tests.sh tests.core:CoreTest.test_check_operators -s --logging-level=DEBUG
-```
-
-For more information on how to run a subset of the tests, take a look at the
-nosetests docs.
-
-See also the list of test classes and methods in `tests/core.py`.
 
 ### Changing the Metadata Database
 
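As the updated CONTRIBUTING.md above describes, individual tests can also be
run from a shell inside the testing container; a sketch, assuming the compose
file added by this PR:

    docker-compose -f scripts/ci/docker-compose.yml run airflow-testing bash
    # from inside the container:
    pip install -e .[devel]
    ./run_unit_tests.sh tests.core:CoreTest.test_check_operators -s --logging-level=DEBUG
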
diff --git a/README.md b/README.md
index f33c55cb2f..cad776ba06 100644
--- a/README.md
+++ b/README.md
@@ -78,6 +78,12 @@ unit of work and continuity.
 
   ![](/docs/img/code.png)
 
+
+## Contributing
+
+Want to help build Apache Airflow? Check out our [contributing documentation](https://github.com/apache/incubator-airflow/blob/master/CONTRIBUTING.md).
+
+
 ## Who uses Airflow?
 
 As the Airflow community grows, we'd like to keep track of who is using
diff --git a/airflow/operators/python_operator.py b/airflow/operators/python_operator.py
index 2817f663a8..a01c93ca0a 100644
--- a/airflow/operators/python_operator.py
+++ b/airflow/operators/python_operator.py
@@ -188,10 +188,8 @@ class PythonVirtualenvOperator(PythonOperator):
     variable named virtualenv_string_args will be available (populated by
     string_args). In addition, one can pass stuff through op_args and op_kwargs, and one
     can use a return value.
-
     Note that if your virtualenv runs in a different Python major version than Airflow,
     you cannot use return values, op_args, or op_kwargs. You can use string_args though.
-
     :param python_callable: A python function with no references to outside variables,
         defined with def, which will be run in a virtualenv
     :type python_callable: function
diff --git a/airflow/utils/db.py b/airflow/utils/db.py
index 93458c6f61..b57b8cf92b 100644
--- a/airflow/utils/db.py
+++ b/airflow/utils/db.py
@@ -94,12 +94,12 @@ def initdb(rbac=False):
     merge_conn(
         models.Connection(
             conn_id='airflow_db', conn_type='mysql',
-            host='localhost', login='root', password='',
+            host='mysql', login='root', password='',
             schema='airflow'))
     merge_conn(
         models.Connection(
             conn_id='airflow_ci', conn_type='mysql',
-            host='localhost', login='root', extra="{\"local_infile\": true}",
+            host='mysql', login='root', extra="{\"local_infile\": true}",
             schema='airflow_ci'))
     merge_conn(
         models.Connection(
@@ -141,18 +141,19 @@ def initdb(rbac=False):
     merge_conn(
         models.Connection(
             conn_id='mongo_default', conn_type='mongo',
-            host='localhost', port=27017))
+            host='mongo', port=27017))
     merge_conn(
         models.Connection(
             conn_id='mysql_default', conn_type='mysql',
             login='root',
-            host='localhost'))
+            host='mysql'))
     merge_conn(
         models.Connection(
             conn_id='postgres_default', conn_type='postgres',
             login='postgres',
+            password='airflow',
             schema='airflow',
-            host='localhost'))
+            host='postgres'))
     merge_conn(
         models.Connection(
             conn_id='sqlite_default', conn_type='sqlite',
@@ -184,7 +185,7 @@ def initdb(rbac=False):
     merge_conn(
         models.Connection(
             conn_id='sftp_default', conn_type='sftp',
-            host='localhost', port=22, login='travis',
+            host='localhost', port=22, login='airflow',
             extra='''
                 {"private_key": "~/.ssh/id_rsa", "ignore_hostkey_verification": true}
             '''))
@@ -283,7 +284,7 @@ def initdb(rbac=False):
     merge_conn(
         models.Connection(
             conn_id='cassandra_default', conn_type='cassandra',
-            host='localhost', port=9042))
+            host='cassandra', port=9042))
 
     # Known event types
     KET = models.KnownEventType
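
The host changes above follow from each backing service now running in its
own container: within the compose network the service names (mysql, postgres,
mongo, cassandra) resolve through Docker's embedded DNS. A quick connectivity
check, assuming the mysql client is available in the CI image as the setup
scripts imply:

    docker-compose -f scripts/ci/docker-compose.yml run airflow-testing \
        bash -c 'mysql -h mysql -u root -e "SELECT 1"'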
diff --git a/run_tox.sh b/run_tox.sh
deleted file mode 100755
index b4f204d649..0000000000
--- a/run_tox.sh
+++ /dev/null
@@ -1,21 +0,0 @@
-#
-# Licensed to the Apache Software Foundation (ASF) under one
-# or more contributor license agreements.  See the NOTICE file
-# distributed with this work for additional information
-# regarding copyright ownership.  The ASF licenses this file
-# to you under the Apache License, Version 2.0 (the
-# "License"); you may not use this file except in compliance
-# with the License.  You may obtain a copy of the License at
-# 
-#   http://www.apache.org/licenses/LICENSE-2.0
-# 
-# Unless required by applicable law or agreed to in writing,
-# software distributed under the License is distributed on an
-# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
-# KIND, either express or implied.  See the License for the
-# specific language governing permissions and limitations
-# under the License.
-
-set -o verbose
-
-python setup.py test --tox-args="-v -e $TOX_ENV"
diff --git a/run_unit_tests.sh b/run_unit_tests.sh
index 27e4d08af1..f923de9e5c 100755
--- a/run_unit_tests.sh
+++ b/run_unit_tests.sh
@@ -21,7 +21,7 @@
 set -x
 
 # environment
-export AIRFLOW_HOME=${AIRFLOW_HOME:=~/airflow}
+export AIRFLOW_HOME=${AIRFLOW_HOME:=~}
 export AIRFLOW__CORE__UNIT_TEST_MODE=True
 
 # configuration test
@@ -35,57 +35,41 @@ DIR="$( cd "$( dirname "${BASH_SOURCE[0]}" )" && pwd )"
 export PYTHONPATH=$PYTHONPATH:${DIR}/tests/test_utils
 
 # any argument received is overriding the default nose execution arguments:
-
 nose_args=$@
 
-#--with-doctest
-
 # Generate the `airflow` executable if needed
 which airflow > /dev/null || python setup.py develop
 
 echo "Initializing the DB"
+yes | airflow initdb
 yes | airflow resetdb
 
-if [ "${TRAVIS}" ]; then
-    if [ -z "$nose_args" ]; then
-      nose_args="--with-coverage \
-    --cover-erase \
-    --cover-html \
-    --cover-package=airflow \
-    --cover-html-dir=airflow/www/static/coverage \
-    --with-ignore-docstrings \
-    --rednose \
-    --with-timer \
-    -v \
-    --logging-level=DEBUG "
-    fi
-
-  # For impersonation tests running on SQLite on Travis, make the database world readable so other
-  # users can update it
-  AIRFLOW_DB="/home/travis/airflow/airflow.db"
-  if [ -f "${AIRFLOW_DB}" ]; then
-    sudo chmod a+rw "${AIRFLOW_DB}"
-  fi
-
-  # For impersonation tests on Travis, make airflow accessible to other users via the global PATH
-  # (which contains /usr/local/bin)
-  sudo ln -s "${VIRTUAL_ENV}/bin/airflow" /usr/local/bin/
-else
-    if [ -z "$nose_args" ]; then
-      nose_args="--with-coverage \
-    --cover-erase \
-    --cover-html \
-    --cover-package=airflow \
-    --cover-html-dir=airflow/www/static/coverage \
-    --with-ignore-docstrings \
-    --rednose \
-    --with-timer \
-    -s \
-    -v \
-    --logging-level=DEBUG "
-    fi
+if [ -z "$nose_args" ]; then
+  nose_args="--with-coverage \
+  --cover-erase \
+  --cover-html \
+  --cover-package=airflow \
+  --cover-html-dir=airflow/www/static/coverage \
+  --with-ignore-docstrings \
+  --rednose \
+  --with-timer \
+  -v \
+  --logging-level=DEBUG "
+fi
+
+# For impersonation tests running on SQLite on Travis, make the database world readable so other
+# users can update it
+AIRFLOW_DB="$HOME/airflow.db"
+
+if [ -f "${AIRFLOW_DB}" ]; then
+  chmod a+rw "${AIRFLOW_DB}"
+  chmod g+rwx "${AIRFLOW_HOME}"
 fi
 
+# For impersonation tests on Travis, make airflow accessible to other users via the global PATH
+# (which contains /usr/local/bin)
+sudo ln -sf "${VIRTUAL_ENV}/bin/airflow" /usr/local/bin/
+
 echo "Starting the unit tests with the following nose arguments: "$nose_args
 nosetests $nose_args
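
Because run_unit_tests.sh forwards any arguments straight to nose, the
default coverage run can be replaced with a narrower selection; for example:

    # run only the Druid hook tests, skipping the coverage plugins
    ./run_unit_tests.sh tests/hooks/test_druid_hook.py -s -v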
 
diff --git a/scripts/ci/ldap.sh b/scripts/ci/1-setup-env.sh
similarity index 67%
rename from scripts/ci/ldap.sh
rename to scripts/ci/1-setup-env.sh
index d0e3043a9b..0a976b35f0 100755
--- a/scripts/ci/ldap.sh
+++ b/scripts/ci/1-setup-env.sh
@@ -8,9 +8,9 @@
 # to you under the Apache License, Version 2.0 (the
 # "License"); you may not use this file except in compliance
 # with the License.  You may obtain a copy of the License at
-# 
+#
 #   http://www.apache.org/licenses/LICENSE-2.0
-# 
+#
 # Unless required by applicable law or agreed to in writing,
 # software distributed under the License is distributed on an
 # "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
@@ -18,19 +18,16 @@
 # specific language governing permissions and limitations
 # under the License.
 
-set -o verbose
-
-DIR=$( cd "$( dirname "${BASH_SOURCE[0]}" )" && pwd )
-LDAP_DB=/tmp/ldap_db
-
-echo "Creating database directory"
-
-rm -rf ${LDAP_DB} && mkdir ${LDAP_DB} && cp  /usr/share/doc/slapd/examples/DB_CONFIG ${LDAP_DB}
+set -exuo pipefail
 
-echo "Launching OpenLDAP ..."
+# Start MiniCluster
+java -cp "/tmp/minicluster-1.1-SNAPSHOT/*" com.ing.minicluster.MiniCluster > /dev/null &
 
-# Start slapd with non root privileges
-slapd -h "ldap://127.0.0.1:3890/" -f ${DIR}/slapd.conf
+# Set up ssh keys
+echo 'yes' | ssh-keygen -t rsa -C your_email@youremail.com -P '' -f ~/.ssh/id_rsa
+cat ~/.ssh/id_rsa.pub >> ~/.ssh/authorized_keys
+ln -s ~/.ssh/authorized_keys ~/.ssh/authorized_keys2
+chmod 600 ~/.ssh/*
 
-# Wait for LDAP to start
-sleep 1
+# SSH Service
+sudo service ssh restart
diff --git a/scripts/ci/load_fixtures.sh b/scripts/ci/2-setup-kdc.sh
similarity index 59%
rename from scripts/ci/load_fixtures.sh
rename to scripts/ci/2-setup-kdc.sh
index 55beb919df..56824f5a5e 100755
--- a/scripts/ci/load_fixtures.sh
+++ b/scripts/ci/2-setup-kdc.sh
@@ -8,9 +8,9 @@
 # to you under the Apache License, Version 2.0 (the
 # "License"); you may not use this file except in compliance
 # with the License.  You may obtain a copy of the License at
-# 
+#
 #   http://www.apache.org/licenses/LICENSE-2.0
-# 
+#
 # Unless required by applicable law or agreed to in writing,
 # software distributed under the License is distributed on an
 # "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
@@ -18,17 +18,21 @@
 # specific language governing permissions and limitations
 # under the License.
 
-set -o verbose
+set -exuo pipefail
+
+DIRNAME=$(cd "$(dirname "$0")"; pwd)
+
+FQDN=`hostname`
+ADMIN="admin"
+PASS="airflow"
+KRB5_KTNAME=/etc/airflow.keytab
 
-DIR=$( cd "$( dirname "${BASH_SOURCE[0]}" )" && pwd )
-FIXTURES_DIR="$DIR/ldif"
-LOAD_ORDER=("example.com.ldif" "manager.example.com.ldif" "users.example.com.ldif" "groups.example.com.ldif")
+cat /etc/hosts
+echo "hostname: ${FQDN}"
 
-load_fixture () {
-  ldapadd -x -H ldap://127.0.0.1:3890/ -D "cn=Manager,dc=example,dc=com" -w insecure -f $1
-}
+sudo cp $DIRNAME/krb5/krb-conf/client/krb5.conf /etc/krb5.conf
 
-for FILE in "${LOAD_ORDER[@]}"
-do
-  load_fixture "${FIXTURES_DIR}/${FILE}"
-done;
+echo -e "${PASS}\n${PASS}" | sudo kadmin -p ${ADMIN}/admin -w ${PASS} -q "addprinc -randkey airflow/${FQDN}"
+sudo kadmin -p ${ADMIN}/admin -w ${PASS} -q "ktadd -k ${KRB5_KTNAME} airflow"
+sudo kadmin -p ${ADMIN}/admin -w ${PASS} -q "ktadd -k ${KRB5_KTNAME} airflow/${FQDN}"
+sudo chmod 0644 ${KRB5_KTNAME}
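
Once this script has run, the keytab can be sanity-checked with the standard
MIT Kerberos tools before the test run calls kinit; a sketch:

    # list the principals stored in the keytab created above
    klist -kt /etc/airflow.keytab
    # obtain a ticket the same way 5-run-tests.sh does
    kinit -kt /etc/airflow.keytab airflow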
diff --git a/scripts/ci/travis_script.sh b/scripts/ci/3-setup-databases.sh
similarity index 58%
rename from scripts/ci/travis_script.sh
rename to scripts/ci/3-setup-databases.sh
index 52571cce14..2a5cb682e0 100755
--- a/scripts/ci/travis_script.sh
+++ b/scripts/ci/3-setup-databases.sh
@@ -1,5 +1,4 @@
 #!/usr/bin/env bash
-
 #  Licensed to the Apache Software Foundation (ASF) under one   *
 #  or more contributor license agreements.  See the NOTICE file *
 #  distributed with this work for additional information        *
@@ -17,24 +16,8 @@
 #  specific language governing permissions and limitations      *
 #  under the License.                                           *
 
-DIRNAME=$(cd "$(dirname "$0")"; pwd)
-AIRFLOW_ROOT="$DIRNAME/../.."
-cd $AIRFLOW_ROOT && pip --version && ls -l $HOME/.wheelhouse && tox --version
+set -exuo pipefail
+
+MYSQL_HOST=mysql
 
-if [ -z "$KUBERNETES_VERSION" ];
-then
-  tox -e $TOX_ENV
-else
-  KUBERNETES_VERSION=${KUBERNETES_VERSION} $DIRNAME/kubernetes/setup_kubernetes.sh && \
-  tox -e $TOX_ENV -- tests.contrib.minikube \
-                     --with-coverage \
-                     --cover-erase \
-                     --cover-html \
-                     --cover-package=airflow \
-                     --cover-html-dir=airflow/www/static/coverage \
-                     --with-ignore-docstrings \
-                     --rednose \
-                     --with-timer \
-                     -v \
-                     --logging-level=DEBUG
-fi
+mysql -h ${MYSQL_HOST} -u root -e 'drop database if exists airflow; create database airflow'
diff --git a/scripts/ci/load_data.sh b/scripts/ci/4-load-data.sh
similarity index 81%
rename from scripts/ci/load_data.sh
rename to scripts/ci/4-load-data.sh
index 3422b07c20..7935482be0 100755
--- a/scripts/ci/load_data.sh
+++ b/scripts/ci/4-load-data.sh
@@ -15,14 +15,15 @@
 #  KIND, either express or implied.  See the License for the    *
 #  specific language governing permissions and limitations      *
 #  under the License.                                           *
-set -o verbose
+
+set -exuo pipefail
 
 DIR=$( cd "$( dirname "${BASH_SOURCE[0]}" )" && pwd )
 DATA_DIR="${DIR}/data"
 DATA_FILE="${DATA_DIR}/baby_names.csv"
 DATABASE=airflow_ci
+HOST=mysql
 
-mysqladmin -u root create ${DATABASE}
-mysql -u root < ${DATA_DIR}/mysql_schema.sql
-mysqlimport --local -u root --fields-optionally-enclosed-by="\"" --fields-terminated-by=, --ignore-lines=1 ${DATABASE} ${DATA_FILE}
-
+mysqladmin -h ${HOST} -u root create ${DATABASE}
+mysql -h ${HOST} -u root < ${DATA_DIR}/mysql_schema.sql
+mysqlimport --local -h ${HOST} -u root --fields-optionally-enclosed-by="\"" --fields-terminated-by=, --ignore-lines=1 ${DATABASE} ${DATA_FILE}
diff --git a/scripts/ci/5-run-tests.sh b/scripts/ci/5-run-tests.sh
new file mode 100755
index 0000000000..9a4e06f5e2
--- /dev/null
+++ b/scripts/ci/5-run-tests.sh
@@ -0,0 +1,97 @@
+#!/usr/bin/env bash
+
+#
+#  Licensed to the Apache Software Foundation (ASF) under one
+#  or more contributor license agreements.  See the NOTICE file
+#  distributed with this work for additional information
+#  regarding copyright ownership.  The ASF licenses this file
+#  to you under the Apache License, Version 2.0 (the
+#  "License"); you may not use this file except in compliance
+#  with the License.  You may obtain a copy of the License at
+#
+#    http://www.apache.org/licenses/LICENSE-2.0
+#
+#  Unless required by applicable law or agreed to in writing,
+#  software distributed under the License is distributed on an
+#  "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+#  KIND, either express or implied.  See the License for the
+#  specific language governing permissions and limitations
+#  under the License.
+
+set -o verbose
+
+if [ -z "$HADOOP_HOME" ]; then
+    echo "HADOOP_HOME not set - abort" >&2
+    exit 1
+fi
+
+echo "Using ${HADOOP_DISTRO} distribution of Hadoop from ${HADOOP_HOME}"
+
+pwd
+
+echo "Using travis airflow.cfg"
+DIR="$( cd "$( dirname "${BASH_SOURCE[0]}" )" && pwd )"
+cp -f ${DIR}/airflow_travis.cfg ~/unittests.cfg
+
+ROOTDIR="$(dirname $(dirname $DIR))"
+export AIRFLOW__CORE__DAGS_FOLDER="$ROOTDIR/tests/dags"
+
+# add test/contrib to PYTHONPATH
+export PYTHONPATH=${PYTHONPATH:-$ROOTDIR/tests/test_utils}
+
+echo Backend: $AIRFLOW__CORE__SQL_ALCHEMY_CONN
+
+# environment
+export AIRFLOW_HOME=${AIRFLOW_HOME:=~}
+export AIRFLOW__CORE__UNIT_TEST_MODE=True
+
+# configuration test
+export AIRFLOW__TESTSECTION__TESTKEY=testvalue
+
+# use Airflow 2.0-style imports
+export AIRFLOW_USE_NEW_IMPORTS=1
+
+# any argument received is overriding the default nose execution arguments:
+nose_args=$@
+
+# Generate the `airflow` executable if needed
+which airflow > /dev/null || python setup.py develop
+
+# For impersonation tests on Travis, make airflow accessible to other users via the global PATH
+# (which contains /usr/local/bin)
+sudo ln -sf "${VIRTUAL_ENV}/bin/airflow" /usr/local/bin/
+
+echo "Initializing the DB"
+yes | airflow initdb
+yes | airflow resetdb
+
+if [ -z "$nose_args" ]; then
+  nose_args="--with-coverage \
+  --cover-erase \
+  --cover-html \
+  --cover-package=airflow \
+  --cover-html-dir=airflow/www/static/coverage \
+  --with-ignore-docstrings \
+  --rednose \
+  --with-timer \
+  -v \
+  --logging-level=DEBUG"
+fi
+
+# kdc init happens in setup_kdc.sh
+kinit -kt ${KRB5_KTNAME} airflow
+
+# For impersonation tests running on SQLite on Travis, make the database world readable so other
+# users can update it
+AIRFLOW_DB="$HOME/airflow.db"
+
+if [ -f "${AIRFLOW_DB}" ]; then
+  chmod a+rw "${AIRFLOW_DB}"
+  chmod g+rwx "${AIRFLOW_HOME}"
+fi
+
+echo "Starting the unit tests with the following nose arguments: "$nose_args
+nosetests $nose_args
+
+# To run individual tests:
+# nosetests tests.core:CoreTest.test_scheduler_job
diff --git a/scripts/ci/check-license.sh b/scripts/ci/6-check-license.sh
similarity index 87%
rename from scripts/ci/check-license.sh
rename to scripts/ci/6-check-license.sh
index 83c942a0a4..52d49896fe 100755
--- a/scripts/ci/check-license.sh
+++ b/scripts/ci/6-check-license.sh
@@ -53,9 +53,7 @@ acquire_rat_jar () {
 FWDIR="$(cd "`dirname "$0"`"/../..; pwd)"
 cd "$FWDIR"
 
-if [ -z "${TRAVIS_CACHE}" ]; then
-    TRAVIS_CACHE=/tmp
-fi
+TMP_DIR=/tmp
 
 if test -x "$JAVA_HOME/bin/java"; then
     declare java_cmd="$JAVA_HOME/bin/java"
@@ -64,8 +62,8 @@ else
 fi
 
 export RAT_VERSION=0.12
-export rat_jar="${TRAVIS_CACHE}"/lib/apache-rat-${RAT_VERSION}.jar
-mkdir -p ${TRAVIS_CACHE}/lib
+export rat_jar="${TMP_DIR}"/lib/apache-rat-${RAT_VERSION}.jar
+mkdir -p ${TMP_DIR}/lib
 
 
 [[ -f "$rat_jar" ]] || acquire_rat_jar || {
@@ -88,18 +86,18 @@ if test ! -z "$ERRORS"; then
     echo "$ERRORS"
     COUNT=`echo "${ERRORS}" | wc -l`
     # due to old builds can be removed later
-    rm -rf ${TRAVIS_CACHE}/rat-error-count
-    if [ ! -f ${TRAVIS_CACHE}/rat-error-count-builds ]; then
-        [ "${TRAVIS_PULL_REQUEST}" = "false" ] && echo ${COUNT} > ${TRAVIS_CACHE}/rat-error-count-builds
+    rm -rf ${TMP_DIR}/rat-error-count
+    if [ ! -f ${TMP_DIR}/rat-error-count-builds ]; then
+        [ "${TRAVIS_PULL_REQUEST}" = "false" ] && echo ${COUNT} > ${TMP_DIR}/rat-error-count-builds
         OLD_COUNT=${COUNT}
     else
-        typeset -i OLD_COUNT=$(cat ${TRAVIS_CACHE}/rat-error-count-builds)
+        typeset -i OLD_COUNT=$(cat ${TMP_DIR}/rat-error-count-builds)
     fi
     if [ ${COUNT} -gt ${OLD_COUNT} ]; then
         echo "New missing licenses (${COUNT} vs ${OLD_COUNT}) detected. Please correct them by adding them to to header of your files"
         exit 1
     else
-        [ "${TRAVIS_PULL_REQUEST}" = "false" ] && echo ${COUNT} > ${TRAVIS_CACHE}/rat-error-count-builds
+        [ "${TRAVIS_PULL_REQUEST}" = "false" ] && echo ${COUNT} > ${TMP_DIR}/rat-error-count-builds
     fi
     exit 0
 else
diff --git a/scripts/ci/airflow_travis.cfg b/scripts/ci/airflow_travis.cfg
index 552e836018..2d412e182c 100644
--- a/scripts/ci/airflow_travis.cfg
+++ b/scripts/ci/airflow_travis.cfg
@@ -21,7 +21,7 @@ airflow_home = ~/airflow
 dags_folder = ~/airflow/dags
 base_log_folder = ~/airflow/logs
 executor = LocalExecutor
-sql_alchemy_conn = mysql://root@localhost/airflow
+sql_alchemy_conn = mysql://root@mysql/airflow
 unit_test_mode = True
 load_examples = True
 donot_pickle = False
@@ -55,8 +55,8 @@ smtp_mail_from = airflow@example.com
 celery_app_name = airflow.executors.celery_executor
 worker_concurrency = 16
 worker_log_server_port = 8793
-broker_url = amqp://guest:guest@localhost:5672/
-result_backend = db+mysql://root@localhost/airflow
+broker_url = amqp://guest:guest@rabbitmq:5672/
+result_backend = db+mysql://root@mysql/airflow
 flower_port = 5555
 default_queue = default
 
diff --git a/scripts/ci/docker-compose.yml b/scripts/ci/docker-compose.yml
new file mode 100644
index 0000000000..861cf9e8b8
--- /dev/null
+++ b/scripts/ci/docker-compose.yml
@@ -0,0 +1,90 @@
+#
+# Licensed to the Apache Software Foundation (ASF) under one or more
+# contributor license agreements.  See the NOTICE file distributed with
+# this work for additional information regarding copyright ownership.
+# The ASF licenses this file to You under the Apache License, Version 2.0
+# (the "License"); you may not use this file except in compliance with
+# the License.  You may obtain a copy of the License at
+#
+#    http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
+
+version: "2.2"
+services:
+  mysql:
+    image: mysql:5.6
+    restart: always
+    environment:
+      - MYSQL_ALLOW_EMPTY_PASSWORD=true
+      - MYSQL_ROOT_HOST=%
+    volumes:
+      - ./mysql/conf.d:/etc/mysql/conf.d
+
+  postgres:
+    image: postgres:9.6
+    restart: always
+    environment:
+      - POSTGRES_USER=postgres
+      - POSTGRES_PASSWORD=airflow
+      - POSTGRES_DB=airflow
+
+  mongo:
+    image: mongo:3
+    restart: always
+
+  cassandra:
+    image: cassandra:3.0
+    restart: always
+
+  rabbitmq:
+    image: rabbitmq:3.7
+    restart: always
+
+  openldap:
+    image: osixia/openldap:1.2.0
+    restart: always
+    command: --copy-service
+    environment:
+      - LDAP_DOMAIN=example.com
+      - LDAP_ADMIN_PASSWORD=insecure
+      - LDAP_CONFIG_PASSWORD=insecure
+    volumes:
+      - ./openldap/ldif:/container/service/slapd/assets/config/bootstrap/ldif/custom
+
+  krb5-kdc-server:
+    build: ./krb5
+    image: krb5-kdc-server
+    hostname: krb5-kdc-server
+    domainname: example.com
+
+  airflow-testing:
+    image: airflowci/incubator-airflow-ci:latest
+    init: true
+    environment:
+      - USER=airflow
+      - SLUGIFY_USES_TEXT_UNIDECODE=yes
+      - TOX_ENV
+      - PYTHON_VERSION
+      - TRAVIS
+      - TRAVIS_BRANCH
+      - TRAVIS_BUILD_DIR
+      - TRAVIS_JOB_ID
+      - TRAVIS_PULL_REQUEST
+    depends_on:
+      - postgres
+      - mysql
+      - mongo
+      - cassandra
+      - rabbitmq
+      - openldap
+      - krb5-kdc-server
+    volumes:
+      - ../../:/app
+      - ~/.cache/pip:/home/airflow/.cache/pip
+      - ~/.wheelhouse/:/home/airflow/.wheelhouse/
+    working_dir: /app
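
The compose file can also be used piecemeal during development, for instance
to start only the backing databases while hacking on Airflow from the host; a
sketch:

    # start just the databases in the background
    docker-compose -f scripts/ci/docker-compose.yml up -d mysql postgres
    # tear everything down (and remove the containers) when done
    docker-compose -f scripts/ci/docker-compose.yml down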
diff --git a/scripts/ci/flake8_diff.sh b/scripts/ci/flake8-diff.sh
similarity index 100%
rename from scripts/ci/flake8_diff.sh
rename to scripts/ci/flake8-diff.sh
diff --git a/scripts/ci/run_tests.sh b/scripts/ci/krb5/Dockerfile
old mode 100755
new mode 100644
similarity index 51%
rename from scripts/ci/run_tests.sh
rename to scripts/ci/krb5/Dockerfile
index d5e7655803..cdb4cf979e
--- a/scripts/ci/run_tests.sh
+++ b/scripts/ci/krb5/Dockerfile
@@ -1,5 +1,3 @@
-#!/usr/bin/env bash
-
 #
 # Licensed to the Apache Software Foundation (ASF) under one
 # or more contributor license agreements.  See the NOTICE file
@@ -8,9 +6,9 @@
 # to you under the Apache License, Version 2.0 (the
 # "License"); you may not use this file except in compliance
 # with the License.  You may obtain a copy of the License at
-# 
+#
 #   http://www.apache.org/licenses/LICENSE-2.0
-# 
+#
 # Unless required by applicable law or agreed to in writing,
 # software distributed under the License is distributed on an
 # "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
@@ -18,30 +16,37 @@
 # specific language governing permissions and limitations
 # under the License.
 
-set -o verbose
+FROM ubuntu:xenial
 
-if [ -z "$HADOOP_HOME" ]; then
-    echo "HADOOP_HOME not set - abort" >&2
-    exit 1
-fi
+# environment variables
+ENV DEBIAN_FRONTEND noninteractive
 
-echo "Using ${HADOOP_DISTRO} distribution of Hadoop from ${HADOOP_HOME}"
+# Kerberos server
+RUN apt-get update && apt-get install --no-install-recommends -y \
+    ntp \
+    python-dev \
+    python-pip \
+    python-wheel \
+    python-setuptools \
+    python-pkg-resources \
+    krb5-admin-server \
+    krb5-kdc
 
-pwd
+RUN mkdir /app/
 
-mkdir ~/airflow/
+# Supervisord
+RUN pip install supervisor==3.3.4
+RUN mkdir -p /var/log/supervisord/
 
-if [ "${TRAVIS}" ]; then
-    echo "Using travis airflow.cfg"
-    DIR="$( cd "$( dirname "${BASH_SOURCE[0]}" )" && pwd )"
-    cp -f ${DIR}/airflow_travis.cfg ~/airflow/unittests.cfg
+COPY ./krb-conf/server/kdc.conf /etc/krb5kdc/kdc.conf
+COPY ./krb-conf/server/kadm5.acl /etc/krb5kdc/kadm5.acl
+COPY ./krb-conf/client/krb5.conf /etc/krb5.conf
+COPY ./start_kdc.sh /app/start_kdc.sh
 
-    ROOTDIR="$(dirname $(dirname $DIR))"
-    export AIRFLOW__CORE__DAGS_FOLDER="$ROOTDIR/tests/dags"
+# supervisord
+COPY ./supervisord.conf /etc/supervisord.conf
 
-    # kdc init happens in setup_kdc.sh
-    kinit -kt ${KRB5_KTNAME} airflow
-fi
+WORKDIR /app
 
-echo Backend: $AIRFLOW__CORE__SQL_ALCHEMY_CONN
-./run_unit_tests.sh $@
+# when container is starting
+CMD ["/bin/bash", "/app/start_kdc.sh"]
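
This image is built on demand by compose (the krb5-kdc-server service above
declares build: ./krb5); it can also be rebuilt explicitly:

    docker-compose -f scripts/ci/docker-compose.yml build krb5-kdc-server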
diff --git a/scripts/ci/krb5.conf b/scripts/ci/krb5/krb-conf/client/krb5.conf
similarity index 94%
rename from scripts/ci/krb5.conf
rename to scripts/ci/krb5/krb-conf/client/krb5.conf
index bbc802a0eb..471737a964 100644
--- a/scripts/ci/krb5.conf
+++ b/scripts/ci/krb5/krb-conf/client/krb5.conf
@@ -6,9 +6,9 @@
 # to you under the Apache License, Version 2.0 (the
 # "License"); you may not use this file except in compliance
 # with the License.  You may obtain a copy of the License at
-# 
+#
 #   http://www.apache.org/licenses/LICENSE-2.0
-# 
+#
 # Unless required by applicable law or agreed to in writing,
 # software distributed under the License is distributed on an
 # "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
@@ -31,5 +31,6 @@ forwardable = true
 
 [realms]
  TEST.LOCAL = {
-   kdc = localhost:88
+   kdc = krb5-kdc-server:88
+   admin_server = krb5-kdc-server
  }
diff --git a/scripts/ci/kadm5.acl b/scripts/ci/krb5/krb-conf/server/kadm5.acl
similarity index 99%
rename from scripts/ci/kadm5.acl
rename to scripts/ci/krb5/krb-conf/server/kadm5.acl
index 691dce6c2b..41d17385ff 100644
--- a/scripts/ci/kadm5.acl
+++ b/scripts/ci/krb5/krb-conf/server/kadm5.acl
@@ -6,9 +6,9 @@
 # to you under the Apache License, Version 2.0 (the
 # "License"); you may not use this file except in compliance
 # with the License.  You may obtain a copy of the License at
-# 
+#
 #   http://www.apache.org/licenses/LICENSE-2.0
-# 
+#
 # Unless required by applicable law or agreed to in writing,
 # software distributed under the License is distributed on an
 # "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
diff --git a/scripts/ci/kdc.conf b/scripts/ci/krb5/krb-conf/server/kdc.conf
similarity index 99%
rename from scripts/ci/kdc.conf
rename to scripts/ci/krb5/krb-conf/server/kdc.conf
index 5eef3053b3..c21095f418 100644
--- a/scripts/ci/kdc.conf
+++ b/scripts/ci/krb5/krb-conf/server/kdc.conf
@@ -6,9 +6,9 @@
 # to you under the Apache License, Version 2.0 (the
 # "License"); you may not use this file except in compliance
 # with the License.  You may obtain a copy of the License at
-# 
+#
 #   http://www.apache.org/licenses/LICENSE-2.0
-# 
+#
 # Unless required by applicable law or agreed to in writing,
 # software distributed under the License is distributed on an
 # "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
diff --git a/scripts/ci/setup_kdc.sh b/scripts/ci/krb5/start_kdc.sh
similarity index 75%
rename from scripts/ci/setup_kdc.sh
rename to scripts/ci/krb5/start_kdc.sh
index 3e525c5e82..6e02f006fa 100755
--- a/scripts/ci/setup_kdc.sh
+++ b/scripts/ci/krb5/start_kdc.sh
@@ -8,9 +8,9 @@
 # to you under the Apache License, Version 2.0 (the
 # "License"); you may not use this file except in compliance
 # with the License.  You may obtain a copy of the License at
-# 
+#
 #   http://www.apache.org/licenses/LICENSE-2.0
-# 
+#
 # Unless required by applicable law or agreed to in writing,
 # software distributed under the License is distributed on an
 # "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
@@ -18,23 +18,15 @@
 # specific language governing permissions and limitations
 # under the License.
 
-cat /etc/hosts
+set -exuo pipefail
 
 FQDN=`hostname`
-
-echo "hostname: ${FQDN}"
-
 ADMIN="admin"
 PASS="airflow"
+KRB5_KTNAME=/etc/airflow.keytab
 
-DIR="$( cd "$( dirname "${BASH_SOURCE[0]}" )" && pwd )"
-
-ln -sf /dev/urandom /dev/random
-
-cp ${DIR}/kdc.conf /etc/krb5kdc/kdc.conf
-cp ${DIR}/kadm5.acl /etc/krb5kdc/kadm5.acl
-cp ${DIR}/krb5.conf /etc/krb5.conf
-
+cat /etc/hosts
+echo "hostname: ${FQDN}"
 # create kerberos database
 echo -e "${PASS}\n${PASS}" | kdb5_util create -s
 # create admin
@@ -45,12 +37,5 @@ echo -e "${PASS}\n${PASS}" | kadmin.local -q "addprinc -randkey airflow/${FQDN}"
 kadmin.local -q "ktadd -k ${KRB5_KTNAME} airflow"
 kadmin.local -q "ktadd -k ${KRB5_KTNAME} airflow/${FQDN}"
 
-service krb5-kdc restart
-
-# make sure the keytab is readable to anyone
-chmod 664 ${KRB5_KTNAME}
-
-# don't do a kinit here as this happens under super user privileges
-# on travis
-# kinit -kt ${KRB5_KTNAME} airflow
-
+# Start services
+/usr/local/bin/supervisord -n -c /etc/supervisord.conf
diff --git a/scripts/ci/krb5/supervisord.conf b/scripts/ci/krb5/supervisord.conf
new file mode 100644
index 0000000000..165e5cde84
--- /dev/null
+++ b/scripts/ci/krb5/supervisord.conf
@@ -0,0 +1,43 @@
+;
+; Licensed to the Apache Software Foundation (ASF) under one
+; or more contributor license agreements.  See the NOTICE file
+; distributed with this work for additional information
+; regarding copyright ownership.  The ASF licenses this file
+; to you under the Apache License, Version 2.0 (the
+; "License"); you may not use this file except in compliance
+; with the License.  You may obtain a copy of the License at
+;
+;   http://www.apache.org/licenses/LICENSE-2.0
+;
+; Unless required by applicable law or agreed to in writing,
+; software distributed under the License is distributed on an
+; "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+; KIND, either express or implied.  See the License for the
+; specific language governing permissions and limitations
+; under the License.
+
+; supervisord.conf - kdc-server
+
+[supervisord]
+logfile=/var/log/supervisord/supervisord.log    ; supervisord log file
+logfile_maxbytes=50MB                           ; maximum size of logfile before rotation
+logfile_backups=10                              ; number of backed up logfiles
+loglevel=error                                  ; info, debug, warn, trace
+pidfile=/var/run/supervisord.pid                ; pidfile location
+nodaemon=false                                  ; run supervisord as a daemon
+minfds=1024                                     ; number of startup file descriptors
+minprocs=200                                    ; number of process descriptors
+user=root                                       ; default user
+childlogdir=/var/log/supervisord/               ; where child log files will live
+
+[program:krb5-kdc]
+command=service krb5-kdc start
+autostart=true
+autorestart=true
+
+[program:krb5-admin-server]
+command=service krb5-admin-server start
+autostart=true
+autorestart=true
+
+[supervisorctl]
diff --git a/scripts/ci/ldif/example.com.ldif b/scripts/ci/ldif/example.com.ldif
deleted file mode 100644
index cda5d00561..0000000000
--- a/scripts/ci/ldif/example.com.ldif
+++ /dev/null
@@ -1,24 +0,0 @@
-#
-# Licensed to the Apache Software Foundation (ASF) under one
-# or more contributor license agreements.  See the NOTICE file
-# distributed with this work for additional information
-# regarding copyright ownership.  The ASF licenses this file
-# to you under the Apache License, Version 2.0 (the
-# "License"); you may not use this file except in compliance
-# with the License.  You may obtain a copy of the License at
-# 
-#   http://www.apache.org/licenses/LICENSE-2.0
-# 
-# Unless required by applicable law or agreed to in writing,
-# software distributed under the License is distributed on an
-# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
-# KIND, either express or implied.  See the License for the
-# specific language governing permissions and limitations
-# under the License.
-
-dn: dc=example,dc=com
-dc: example
-description: LDAP Example
-objectClass: dcObject
-objectClass: organization
-o: example
diff --git a/scripts/ci/my.cnf b/scripts/ci/mysql/conf.d/airflow.cnf
similarity index 99%
rename from scripts/ci/my.cnf
rename to scripts/ci/mysql/conf.d/airflow.cnf
index 5ff549ca4e..4802d195a2 100644
--- a/scripts/ci/my.cnf
+++ b/scripts/ci/mysql/conf.d/airflow.cnf
@@ -6,9 +6,9 @@
 # to you under the Apache License, Version 2.0 (the
 # "License"); you may not use this file except in compliance
 # with the License.  You may obtain a copy of the License at
-# 
+#
 #   http://www.apache.org/licenses/LICENSE-2.0
-# 
+#
 # Unless required by applicable law or agreed to in writing,
 # software distributed under the License is distributed on an
 # "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
diff --git a/scripts/ci/ldif/users.example.com.ldif b/scripts/ci/openldap/ldif/01-users.example.com.ldif
similarity index 99%
rename from scripts/ci/ldif/users.example.com.ldif
rename to scripts/ci/openldap/ldif/01-users.example.com.ldif
index ed4a7ceb76..bf5baf5ad8 100644
--- a/scripts/ci/ldif/users.example.com.ldif
+++ b/scripts/ci/openldap/ldif/01-users.example.com.ldif
@@ -6,9 +6,9 @@
 # to you under the Apache License, Version 2.0 (the
 # "License"); you may not use this file except in compliance
 # with the License.  You may obtain a copy of the License at
-# 
+#
 #   http://www.apache.org/licenses/LICENSE-2.0
-# 
+#
 # Unless required by applicable law or agreed to in writing,
 # software distributed under the License is distributed on an
 # "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
diff --git a/scripts/ci/ldif/groups.example.com.ldif b/scripts/ci/openldap/ldif/02-groups.example.com.ldif
similarity index 96%
rename from scripts/ci/ldif/groups.example.com.ldif
rename to scripts/ci/openldap/ldif/02-groups.example.com.ldif
index 804dcb008b..9d1deb349f 100644
--- a/scripts/ci/ldif/groups.example.com.ldif
+++ b/scripts/ci/openldap/ldif/02-groups.example.com.ldif
@@ -6,9 +6,9 @@
 # to you under the Apache License, Version 2.0 (the
 # "License"); you may not use this file except in compliance
 # with the License.  You may obtain a copy of the License at
-# 
+#
 #   http://www.apache.org/licenses/LICENSE-2.0
-# 
+#
 # Unless required by applicable law or agreed to in writing,
 # software distributed under the License is distributed on an
 # "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
@@ -27,7 +27,7 @@ dn: cn=group1,ou=groups,dc=example,dc=com
 objectclass: groupofnames
 cn: group1
 description: Group 1 of users
-# add the group members all of which are 
+# add the group members all of which are
 # assumed to exist under example
 member: cn=user1,dc=example,dc=com
 
diff --git a/scripts/ci/ldif/manager.example.com.ldif b/scripts/ci/openldap/ldif/03-manager.example.com.ldif
similarity index 99%
rename from scripts/ci/ldif/manager.example.com.ldif
rename to scripts/ci/openldap/ldif/03-manager.example.com.ldif
index 1d06a4f2eb..d4b90b73b8 100644
--- a/scripts/ci/ldif/manager.example.com.ldif
+++ b/scripts/ci/openldap/ldif/03-manager.example.com.ldif
@@ -6,9 +6,9 @@
 # to you under the Apache License, Version 2.0 (the
 # "License"); you may not use this file except in compliance
 # with the License.  You may obtain a copy of the License at
-# 
+#
 #   http://www.apache.org/licenses/LICENSE-2.0
-# 
+#
 # Unless required by applicable law or agreed to in writing,
 # software distributed under the License is distributed on an
 # "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
diff --git a/init.sh b/scripts/ci/openldap/ldif/04-rootdn.ldif
similarity index 84%
rename from init.sh
rename to scripts/ci/openldap/ldif/04-rootdn.ldif
index 6f4adcad06..59bd3e554d 100644
--- a/init.sh
+++ b/scripts/ci/openldap/ldif/04-rootdn.ldif
@@ -1,5 +1,3 @@
-#!/usr/bin/env bash
-
 #
 # Licensed to the Apache Software Foundation (ASF) under one
 # or more contributor license agreements.  See the NOTICE file
@@ -8,9 +6,9 @@
 # to you under the Apache License, Version 2.0 (the
 # "License"); you may not use this file except in compliance
 # with the License.  You may obtain a copy of the License at
-# 
+#
 #   http://www.apache.org/licenses/LICENSE-2.0
-# 
+#
 # Unless required by applicable law or agreed to in writing,
 # software distributed under the License is distributed on an
 # "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
@@ -18,4 +16,9 @@
 # specific language governing permissions and limitations
 # under the License.
 
-source $AIRFLOW_HOME/env/bin/activate
+dn: cn=config
+changetype: modify
+
+dn: olcDatabase={1}{{ LDAP_BACKEND }},cn=config
+replace: olcRootDN
+olcRootDN: cn=Manager,{{ LDAP_BASE_DN }}
diff --git a/scripts/ci/slapd.conf b/scripts/ci/openldap/slapd.conf
similarity index 99%
rename from scripts/ci/slapd.conf
rename to scripts/ci/openldap/slapd.conf
index 34df186e4a..a404530b8d 100644
--- a/scripts/ci/slapd.conf
+++ b/scripts/ci/openldap/slapd.conf
@@ -6,9 +6,9 @@
 # to you under the Apache License, Version 2.0 (the
 # "License"); you may not use this file except in compliance
 # with the License.  You may obtain a copy of the License at
-# 
+#
 #   http://www.apache.org/licenses/LICENSE-2.0
-# 
+#
 # Unless required by applicable law or agreed to in writing,
 # software distributed under the License is distributed on an
 # "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
@@ -38,7 +38,6 @@ disallow bind_anon
 database hdb
 
 suffix "dc=example,dc=com"
-
 rootdn "cn=Manager,dc=example,dc=com"
 rootpw insecure
 
diff --git a/scripts/ci/requirements.txt b/scripts/ci/requirements.txt
deleted file mode 100644
index 837c43f2f7..0000000000
--- a/scripts/ci/requirements.txt
+++ /dev/null
@@ -1,100 +0,0 @@
-#
-# Licensed under the Apache License, Version 2.0 (the "License");
-# you may not use this file except in compliance with the License.
-# You may obtain a copy of the License at
-#
-# http://www.apache.org/licenses/LICENSE-2.0
-#
-# Unless required by applicable law or agreed to in writing, software
-# distributed under the License is distributed on an "AS IS" BASIS,
-# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
-# See the License for the specific language governing permissions and
-# limitations under the License.
-
-alembic
-atlasclient
-azure-storage>=0.34.0
-bcrypt
-bleach
-boto
-boto3
-celery
-cgroupspy
-chartkick
-cloudant
-coverage
-coveralls
-croniter>=0.3.17
-cryptography
-datadog
-dill
-distributed
-docker-py
-filechunkio
-flake8
-flask
-flask-admin
-flask-bcrypt
-flask-cache
-flask-login==0.2.11
-Flask-WTF
-flower
-freezegun
-future
-google-api-python-client>=1.6.0,<2.0.0dev
-google-auth>=1.0.0,<2.0.0dev
-google-auth-httplib2
-gunicorn
-hdfs
-hive-thrift-py
-ipython
-jaydebeapi
-jinja2<2.9.0
-jira
-ldap3
-lxml
-markdown
-mock
-moto==1.1.19
-mysqlclient
-nose
-nose-exclude
-nose-ignore-docstring==0.2
-nose-timer
-pandas
-pandas-gbq
-parameterized
-paramiko>=2.1.1
-pendulum>=1.3.2
-psutil>=4.2.0, <5.0.0
-psycopg2
-pygments
-pyhive
-pykerberos
-pymongo
-PyOpenSSL
-PySmbClient
-python-daemon
-python-dateutil
-qds-sdk>=1.9.6
-redis
-rednose
-requests
-requests-kerberos
-requests_mock
-sendgrid
-setproctitle
-slackclient
-sphinx
-sphinx-argparse
-Sphinx-PyPI-upload
-sphinx_rtd_theme
-sqlalchemy>=1.1.15, <1.2.0
-statsd
-tenacity==4.8.0
-thrift
-thrift_sasl
-unicodecsv
-vertica_python
-zdesk
-kubernetes
diff --git a/scripts/ci/run-ci.sh b/scripts/ci/run-ci.sh
new file mode 100755
index 0000000000..f2815bbd95
--- /dev/null
+++ b/scripts/ci/run-ci.sh
@@ -0,0 +1,56 @@
+#!/usr/bin/env bash
+
+#
+#  Licensed to the Apache Software Foundation (ASF) under one
+#  or more contributor license agreements.  See the NOTICE file
+#  distributed with this work for additional information
+#  regarding copyright ownership.  The ASF licenses this file
+#  to you under the Apache License, Version 2.0 (the
+#  "License"); you may not use this file except in compliance
+#  with the License.  You may obtain a copy of the License at
+#
+#    http://www.apache.org/licenses/LICENSE-2.0
+#
+#  Unless required by applicable law or agreed to in writing,
+#  software distributed under the License is distributed on an
+#  "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+#  KIND, either express or implied.  See the License for the
+#  specific language governing permissions and limitations
+#  under the License.
+
+set -x
+
+DIRNAME=$(cd "$(dirname "$0")"; pwd)
+AIRFLOW_ROOT="$DIRNAME/../.."
+
+# Fix file permissions
+sudo chown -R airflow.airflow . $HOME/.wheelhouse/ $HOME/.cache/pip
+
+if [[ $PYTHON_VERSION == '3' ]]; then
+  PIP=pip3
+else
+  PIP=pip
+fi
+
+sudo $PIP install --upgrade pip
+sudo $PIP install tox
+
+cd $AIRFLOW_ROOT && $PIP --version && tox --version
+
+if [ -z "$KUBERNETES_VERSION" ];
+then
+  tox -e $TOX_ENV
+else
+  KUBERNETES_VERSION=${KUBERNETES_VERSION} $DIRNAME/kubernetes/setup_kubernetes.sh && \
+  tox -e $TOX_ENV -- tests.contrib.minikube \
+                     --with-coverage \
+                     --cover-erase \
+                     --cover-html \
+                     --cover-package=airflow \
+                     --cover-html-dir=airflow/www/static/coverage \
+                     --with-ignore-docstrings \
+                     --rednose \
+                     --with-timer \
+                     -v \
+                     --logging-level=DEBUG
+fi
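
Note that run-ci.sh switches to pip3 when PYTHON_VERSION=3, which is how the
py35 entries in the Travis matrix are wired up; a local Python 3 run would
therefore look like:

    PYTHON_VERSION=3 TOX_ENV=py35-backend_mysql \
    docker-compose -f scripts/ci/docker-compose.yml \
        run airflow-testing /app/scripts/ci/run-ci.sh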
diff --git a/scripts/ci/setup_env.sh b/scripts/ci/setup_env.sh
deleted file mode 100755
index a1d9a4c244..0000000000
--- a/scripts/ci/setup_env.sh
+++ /dev/null
@@ -1,166 +0,0 @@
-#!/usr/bin/env bash
-
-#
-# Licensed to the Apache Software Foundation (ASF) under one
-# or more contributor license agreements.  See the NOTICE file
-# distributed with this work for additional information
-# regarding copyright ownership.  The ASF licenses this file
-# to you under the Apache License, Version 2.0 (the
-# "License"); you may not use this file except in compliance
-# with the License.  You may obtain a copy of the License at
-# 
-#   http://www.apache.org/licenses/LICENSE-2.0
-# 
-# Unless required by applicable law or agreed to in writing,
-# software distributed under the License is distributed on an
-# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
-# KIND, either express or implied.  See the License for the
-# specific language governing permissions and limitations
-# under the License.
-
-set -o verbose
-
-MINIKDC_VERSION=2.7.1
-
-HADOOP_DISTRO=${HADOOP_DISTRO:-"hdp"}
-
-ONLY_DOWNLOAD=${ONLY_DOWNLOAD:-false}
-ONLY_EXTRACT=${ONLY_EXTRACT:-false}
-
-MINICLUSTER_URL=https://github.com/bolkedebruin/minicluster/releases/download/1.1/minicluster-1.1-SNAPSHOT-bin.zip
-
-HIVE_HOME=/tmp/hive
-
-while test $# -gt 0; do
-    case "$1" in
-        -h|--help)
-            echo "Setup environment for airflow tests"
-            echo " "
-            echo "options:"
-            echo -e "\t-h, --help            show brief help"
-            echo -e "\t-o, --only-download   just download hadoop tar(s)"
-            echo -e "\t-e, --only-extract    just extract hadoop tar(s)"
-            echo -e "\t-d, --distro          select distro (hdp|cdh)"
-            exit 0
-            ;;
-        -o|--only-download)
-            shift
-            ONLY_DOWNLOAD=true
-            ;;
-        -e|--only-extract)
-            shift
-            ONLY_EXTRACT=true
-            ;;
-        -d|--distro)
-            shift
-            if test $# -gt 0; then
-                HADOOP_DISTRO=$1
-            else
-                echo "No Hadoop distro specified - abort" >&2
-                exit 1
-            fi
-            shift
-            ;;
-        *)
-            echo "Unknown options: $1" >&2
-            exit 1
-            ;;
-    esac
-done
-
-HADOOP_HOME=/tmp/hadoop-${HADOOP_DISTRO}
-MINICLUSTER_HOME=/tmp/minicluster
-
-if $ONLY_DOWNLOAD && $ONLY_EXTRACT; then
-    echo "Both only-download and only-extract specified - abort" >&2
-    exit 1
-fi
-
-mkdir -p ${HADOOP_HOME}
-mkdir -p ${TRAVIS_CACHE}/${HADOOP_DISTRO}
-mkdir -p ${TRAVIS_CACHE}/minicluster
-mkdir -p ${TRAVIS_CACHE}/hive
-mkdir -p ${HIVE_HOME}
-chmod -R 777 ${HIVE_HOME}
-sudo mkdir -p /user/hive/warehouse
-sudo chown -R ${USER} /user/
-sudo chmod -R 777 /user/
-ls -l /
-
-if [ $HADOOP_DISTRO = "cdh" ]; then
-    # URL="http://archive.cloudera.com/cdh5/cdh/5/hadoop-latest.tar.gz"
-    URL="https://archive.cloudera.com/cdh5/cdh/5/hadoop-2.6.0-cdh5.11.0.tar.gz"
-    # HIVE_URL="http://archive.cloudera.com/cdh5/cdh/5/hive-latest.tar.gz"
-    HIVE_URL="https://archive.cloudera.com/cdh5/cdh/5/hive-1.1.0-cdh5.11.0.tar.gz"
-elif [ $HADOOP_DISTRO = "hdp" ]; then
-    URL="http://public-repo-1.hortonworks.com/HDP/centos6/2.x/updates/2.3.2.0/tars/hadoop-2.7.1.2.3.2.0-2950.tar.gz"
-    HIVE_URL="http://public-repo-1.hortonworks.com/HDP/centos6/2.x/updates/2.3.2.0/tars/apache-hive-1.2.1.2.3.2.0-2950-bin.tar.gz"
-else
-    echo "No/bad HADOOP_DISTRO='${HADOOP_DISTRO}' specified" >&2
-    exit 1
-fi
-
-if ! $ONLY_EXTRACT; then
-    echo "Downloading Hadoop from $URL to ${TRAVIS_CACHE}/${HADOOP_DISTRO}/hadoop.tar.gz"
-    curl -z ${TRAVIS_CACHE}/${HADOOP_DISTRO}/hadoop.tar.gz -o ${TRAVIS_CACHE}/${HADOOP_DISTRO}/hadoop.tar.gz -L $URL
-
-    if [ $? != 0 ]; then
-        echo "Failed to download Hadoop from $URL - abort" >&2
-        exit 1
-    fi
-fi
-
-if $ONLY_DOWNLOAD; then
-    exit 0
-fi
-
-echo "Extracting ${HADOOP_HOME}/hadoop.tar.gz into $HADOOP_HOME"
-tar zxf ${TRAVIS_CACHE}/${HADOOP_DISTRO}/hadoop.tar.gz --strip-components 1 -C $HADOOP_HOME
-
-if [ $? != 0 ]; then
-    echo "Failed to extract Hadoop from ${HADOOP_HOME}/hadoop.tar.gz to ${HADOOP_HOME} - abort" >&2
-    echo "Trying again..." >&2
-    # dont use cache
-    curl -o ${TRAVIS_CACHE}/${HADOOP_DISTRO}/hadoop.tar.gz -L $URL
-    tar zxf ${TRAVIS_CACHE}/${HADOOP_DISTRO}/hadoop.tar.gz --strip-components 1 -C $HADOOP_HOME
-    if [ $? != 0 ]; then
-        echo "Failed twice in downloading and unpacking hadoop!" >&2
-        exit 1
-    fi
-fi
-
-echo "Downloading and unpacking hive"
-curl -z ${TRAVIS_CACHE}/hive/hive.tar.gz -o ${TRAVIS_CACHE}/hive/hive.tar.gz -L ${HIVE_URL}
-tar zxf ${TRAVIS_CACHE}/hive/hive.tar.gz --strip-components 1 -C ${HIVE_HOME}
-
-if [ $? != 0 ]; then
-    echo "Failed to extract hive from ${TRAVIS_CACHE}/hive/hive.tar.gz" >&2
-    echo "Trying again..." >&2
-    # dont use cache
-    curl -o ${TRAVIS_CACHE}/hive/hive.tar.gz -L ${HIVE_URL}
-    tar zxf ${TRAVIS_CACHE}/hive/hive.tar.gz --strip-components 1 -C ${HIVE_HOME}
-    if [ $? != 0 ]; then
-        echo "Failed twice in downloading and unpacking hive!" >&2
-        exit 1
-    fi
-fi
-
-echo "Downloading and unpacking minicluster"
-curl -z ${TRAVIS_CACHE}/minicluster/minicluster.zip -o ${TRAVIS_CACHE}/minicluster/minicluster.zip -L ${MINICLUSTER_URL}
-ls -l ${TRAVIS_CACHE}/minicluster/minicluster.zip
-unzip ${TRAVIS_CACHE}/minicluster/minicluster.zip -d /tmp
-if [ $? != 0 ] ; then
-    # Try downloading w/o cache if there's a failure
-    curl -o ${TRAVIS_CACHE}/minicluster/minicluster.zip -L ${MINICLUSTER_URL}
-    ls -l ${TRAVIS_CACHE}/minicluster/minicluster.zip
-    unzip ${TRAVIS_CACHE}/minicluster/minicluster.zip -d /tmp
-    if [ $? != 0 ] ; then
-        echo "Failed twice in downloading and unpacking minicluster!" >&2
-        exit 1
-    fi
-    exit 1
-fi
-
-echo "Path = ${PATH}"
-
-java -cp "/tmp/minicluster-1.1-SNAPSHOT/*" com.ing.minicluster.MiniCluster > /dev/null &
diff --git a/scripts/ci/test-environment.sh b/scripts/ci/test-environment.sh
new file mode 100644
index 0000000000..5c402d46df
--- /dev/null
+++ b/scripts/ci/test-environment.sh
@@ -0,0 +1,19 @@
+#!/bin/bash -e
+#
+# Licensed to the Apache Software Foundation (ASF) under one or more
+# contributor license agreements.  See the NOTICE file distributed with
+# this work for additional information regarding copyright ownership.
+# The ASF licenses this file to You under the Apache License, Version 2.0
+# (the "License"); you may not use this file except in compliance with
+# the License.  You may obtain a copy of the License at
+#
+#    http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
+#
+
+docker-compose -f scripts/ci/docker-compose.yml run --rm airflow-testing "${@-bash}"
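
Because "${@-bash}" substitutes bash when no arguments are supplied, this script doubles as an entry point for an interactive shell. For example:

    # Drop into an interactive shell inside the test container:
    ./scripts/ci/test-environment.sh
    # Or run a one-off command in place of bash:
    ./scripts/ci/test-environment.sh tox -e py27-backend_sqlite
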
diff --git a/tests/contrib/hooks/test_cassandra_hook.py b/tests/contrib/hooks/test_cassandra_hook.py
index 9cb0739993..73dac4f3b4 100644
--- a/tests/contrib/hooks/test_cassandra_hook.py
+++ b/tests/contrib/hooks/test_cassandra_hook.py
@@ -42,7 +42,7 @@ def setUp(self):
         db.merge_conn(
             models.Connection(
                 conn_id='cassandra_default_with_schema', conn_type='cassandra',
-                host='localhost', port='9042', schema='s'))
+                host='cassandra', port='9042', schema='s'))
 
         hook = CassandraHook("cassandra_default")
         session = hook.get_conn()
diff --git a/tests/contrib/sensors/test_mongo_sensor.py b/tests/contrib/sensors/test_mongo_sensor.py
index 876cb99ba4..6a78b7d146 100644
--- a/tests/contrib/sensors/test_mongo_sensor.py
+++ b/tests/contrib/sensors/test_mongo_sensor.py
@@ -38,7 +38,7 @@ def setUp(self):
         db.merge_conn(
             Connection(
                 conn_id='mongo_test', conn_type='mongo',
-                host='localhost', port='27017', schema='test'))
+                host='mongo', port='27017', schema='test'))
 
         args = {
             'owner': 'airflow',
diff --git a/tests/core.py b/tests/core.py
index bef47b6e1d..8df6312eeb 100644
--- a/tests/core.py
+++ b/tests/core.py
@@ -2078,7 +2078,7 @@ def setUp(self):
             configuration.conf.add_section("ldap")
         except:
             pass
-        configuration.conf.set("ldap", "uri", "ldap://localhost:3890")
+        configuration.conf.set("ldap", "uri", "ldap://openldap:389")
         configuration.conf.set("ldap", "user_filter", "objectClass=*")
         configuration.conf.set("ldap", "user_name_attr", "uid")
         configuration.conf.set("ldap", "bind_user", "cn=Manager,dc=example,dc=com")
@@ -2165,7 +2165,7 @@ def setUp(self):
             configuration.conf.add_section("ldap")
         except:
             pass
-        configuration.conf.set("ldap", "uri", "ldap://localhost:3890")
+        configuration.conf.set("ldap", "uri", "ldap://openldap:389")
         configuration.conf.set("ldap", "user_filter", "objectClass=*")
         configuration.conf.set("ldap", "user_name_attr", "uid")
         configuration.conf.set("ldap", "bind_user", "cn=Manager,dc=example,dc=com")
diff --git a/tox.ini b/tox.ini
index b9c386afb1..73e3170ec8 100644
--- a/tox.ini
+++ b/tox.ini
@@ -40,42 +40,40 @@ basepython =
     py35: python3.5
 
 setenv =
-    HADOOP_DISTRO = cdh
-    HADOOP_HOME = /tmp/hadoop-cdh
-    HADOOP_OPTS = -D/tmp/krb5.conf
-    MINICLUSTER_HOME = /tmp/minicluster-1.1-SNAPSHOT
-    HIVE_HOME = /tmp/hive
-    backend_mysql: AIRFLOW__CORE__SQL_ALCHEMY_CONN = mysql://root@localhost/airflow
-    backend_postgres: AIRFLOW__CORE__SQL_ALCHEMY_CONN = postgresql+psycopg2://postgres@localhost/airflow
-    backend_sqlite: AIRFLOW__CORE__SQL_ALCHEMY_CONN = sqlite:///{homedir}/airflow/airflow.db
-    backend_sqlite: AIRFLOW__CORE__EXECUTOR = SequentialExecutor
+  HADOOP_DISTRO=cdh
+  HADOOP_HOME=/tmp/hadoop-cdh
+  HADOOP_OPTS=-D/tmp/krb5.conf
+  HIVE_HOME=/tmp/hive
+  MINICLUSTER_HOME=/tmp/minicluster-1.1-SNAPSHOT
+  KRB5_CONFIG=/etc/krb5.conf
+  KRB5_KTNAME=/etc/airflow.keytab
+  backend_mysql: AIRFLOW__CORE__SQL_ALCHEMY_CONN=mysql://root@mysql/airflow
+  backend_postgres: AIRFLOW__CORE__SQL_ALCHEMY_CONN=postgresql+psycopg2://postgres:airflow@postgres/airflow
+  backend_sqlite: AIRFLOW__CORE__SQL_ALCHEMY_CONN=sqlite:///{homedir}/airflow.db
+  backend_sqlite: AIRFLOW__CORE__EXECUTOR=SequentialExecutor
 
 passenv =
     HOME
     JAVA_HOME
+    USER
+    PATH
+    BOTO_CONFIG
     TRAVIS
     TRAVIS_BRANCH
     TRAVIS_BUILD_DIR
     TRAVIS_JOB_ID
-    USER
-    TRAVIS_CACHE
     TRAVIS_PULL_REQUEST
-    PATH
-    BOTO_CONFIG
-    KRB5_CONFIG
-    KRB5_KTNAME
     SLUGIFY_USES_TEXT_UNIDECODE
 
 commands =
-    pip wheel -w {homedir}/.wheelhouse -f {homedir}/.wheelhouse -e .[devel_ci]
-    pip install --find-links={homedir}/.wheelhouse --no-index -e .[devel_ci]
-    sudo {toxinidir}/scripts/ci/setup_kdc.sh
-    {toxinidir}/scripts/ci/setup_env.sh
-    {toxinidir}/scripts/ci/ldap.sh
-    {toxinidir}/scripts/ci/load_fixtures.sh
-    {toxinidir}/scripts/ci/load_data.sh
-    {toxinidir}/scripts/ci/run_tests.sh []
-    {toxinidir}/scripts/ci/check-license.sh
+  pip wheel --progress-bar off -w {homedir}/.wheelhouse -f {homedir}/.wheelhouse -e .[devel_ci]
+  pip install --progress-bar off --find-links={homedir}/.wheelhouse --no-index -e .[devel_ci]
+  {toxinidir}/scripts/ci/1-setup-env.sh
+  {toxinidir}/scripts/ci/2-setup-kdc.sh
+  {toxinidir}/scripts/ci/3-setup-databases.sh
+  {toxinidir}/scripts/ci/4-load-data.sh
+  {toxinidir}/scripts/ci/5-run-tests.sh []
+  {toxinidir}/scripts/ci/6-check-license.sh
 
 [testenv:flake8]
 basepython = python3
@@ -84,4 +82,4 @@ deps =
     flake8
 
 commands =
-    {toxinidir}/scripts/ci/flake8_diff.sh
+    {toxinidir}/scripts/ci/flake8-diff.sh
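
A side note on the setenv block above: the backend_mysql/backend_postgres/backend_sqlite prefixes are tox factors, so each conditional line applies only to environments whose name contains that factor. For example:

    # Picks up the MySQL connection string from the factor-conditional setenv:
    tox -e py27-backend_mysql
    # Picks up the SQLite URL plus the SequentialExecutor override:
    tox -e py35-backend_sqlite
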


 


> Dockerised CI pipeline
> ----------------------
>
>                 Key: AIRFLOW-2499
>                 URL: https://issues.apache.org/jira/browse/AIRFLOW-2499
>             Project: Apache Airflow
>          Issue Type: Test
>          Components: ci, tests
>    Affects Versions: 1.10.0, 1.10
>            Reporter: Gerardo Curiel
>            Assignee: Gerardo Curiel
>            Priority: Major
>              Labels: ci, docker, travis-ci
>             Fix For: 2.0.0
>
>   Original Estimate: 72h
>  Remaining Estimate: 72h
>
> PR: https://github.com/apache/incubator-airflow/pull/3393
> Currently, running unit tests is a difficult process. Airflow tests depend on many external services and other custom setup, which makes it hard for contributors to work on this codebase. CI builds have also been unreliable[0][1][2][3], and their failures are hard to reproduce. Asking every contributor to hand-build an approximation of the CI environment invites "it works on my machine" failures.
> This PR implements a dockerised version of the current build pipeline. This setup has a few advantages:
>  * TravisCI tests are reproducible locally
>  * The same build setup can be used to create a local development environment (there's a request for it [4])
>  
> *Implementation details*
>  * I'm using Docker Compose for container orchestration and configuration (see the usage sketch after this list).
>  * MySQL, PostgreSQL, OpenLDAP, krb5 and rabbitmq are now services running inside their own containers
>  * I created a separate repo, called incubator-airflow-ci[5] (TravisCI build here[6]), where a base image with all dependencies is built, following the same pattern as the CouchDB[7] project
>  * Hadoop, Hive and MiniCluster were moved to this base image
>  * The current TravisCI pipeline lives here[8]. A few tests are still failing, so this remains a work in progress.
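> 
> As a rough illustration (the actual service definitions live in scripts/ci/docker-compose.yml in this PR; the exact service names below are assumptions based on the hostnames used in the tests), a local run looks roughly like:
> 
>     # Start backing services, then run the test container against them
>     # (docker-compose run also starts declared dependencies automatically):
>     docker-compose -f scripts/ci/docker-compose.yml up -d mysql postgres openldap
>     docker-compose -f scripts/ci/docker-compose.yml run --rm airflow-testing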
>  
> *References*
> [0] https://issues.apache.org/jira/browse/AIRFLOW-671
>  [1] https://issues.apache.org/jira/browse/AIRFLOW-968
>  [2] https://issues.apache.org/jira/browse/AIRFLOW-2157
>  [3] https://issues.apache.org/jira/browse/AIRFLOW-2272
>  [4] https://issues.apache.org/jira/browse/AIRFLOW-1042
>  [5] https://github.com/gerardo/incubator-airflow-ci
>  [6] https://travis-ci.org/gerardo/incubator-airflow-ci
>  [7] https://travis-ci.org/apache/couchdb-ci
>  [8] https://travis-ci.org/gerardo/incubator-airflow
>  


