Posted to commits@airflow.apache.org by po...@apache.org on 2020/06/20 17:43:45 UTC

[airflow] branch v1-10-test updated (f47e81a -> e31761f)

This is an automated email from the ASF dual-hosted git repository.

potiuk pushed a change to branch v1-10-test
in repository https://gitbox.apache.org/repos/asf/airflow.git.


    omit f47e81a  In case of worktree .git might be a file - rat-check fails (#9435)
    omit 8fd5a6c  Fixed rendering of IMAGES.rst (#9433)
    omit 467dbe4  add guidance re yarn build for local virtualenv development (#9411)
    omit 7743aef  Fix in-breeze CLI tools to work also on Linux (#9376)
    omit b2196a5  Fixes Breeze 'tests' command (#9384)
    omit f734b86  Fixed crashing webserver after /tmp is mounted from the host (#9378)
    omit a801dc3  Fix Airflow Stable version in README.md (#9360)
    omit 3d254e5  clarify breeze initialize virtualenv instructions (#9319)
    omit 6794c90  Fixes unbound variable on MacOS (#9335)
    omit b6bca1c  Improve production image iteration speed (#9162)
    omit 55eda65  Fix broken CI image optimisation (#9313)
    omit 9ef7081  Update pre-commit-hooks repo version (#9195)
    omit 6f54b12  Remove generating temp remote manifest file in project dir (#9267)
    omit fe034cb  Add missing variable in run_cli_tool.sh (#9239)
    omit e473d06  Additional apt dependencies options in breeze (#9231)
    omit bbddbdb  Add generic CLI tool wrapper (#9223)
    omit ec80897  Correctly restore colour in logs after format arg (#9222)
    omit 4c9b9e1  n Improved compatibility with Python 3.5+ - Convert signal.SIGTERM to int (#9207)
    omit 55d15ae  Support additional apt dependencies (#9189)
    omit f3946bc  Updated missing parameters for docker image building (#9039)
    omit 14057cf  Fix typo in BREEZE.rst (#9199)
    omit 5b8a1f3  Remove httplib2 from Google requirements (#9194)
    omit b00e310  Add PR/issue note in Contribution Workflow Example (#9177)
    omit 076b7ee  Don't use the term "whitelist" - language matters (#9174)
    omit 9b46a44  Merging multiple sql operators (#9124)
     new 910ac9a  Merging multiple sql operators (#9124)
     new 566b9d3  Don't use the term "whitelist" - language matters (#9174)
     new 659863d  Add PR/issue note in Contribution Workflow Example (#9177)
     new 7aa60b8  Remove httplib2 from Google requirements (#9194)
     new 3a54137  Fix typo in BREEZE.rst (#9199)
     new c7bca9d  Updated missing parameters for docker image building (#9039)
     new 83ba24b  Support additional apt dependencies (#9189)
     new 936d767  n Improved compatibility with Python 3.5+ - Convert signal.SIGTERM to int (#9207)
     new 7658c18  Correctly restore colour in logs after format arg (#9222)
     new 68d7399  Add generic CLI tool wrapper (#9223)
     new 00211dc  Additional apt dependencies options in breeze (#9231)
     new 42bfb2e  Add missing variable in run_cli_tool.sh (#9239)
     new 5267573  Remove generating temp remote manifest file in project dir (#9267)
     new fcdb35c  Update pre-commit-hooks repo version (#9195)
     new 5135568  Fix broken CI image optimisation (#9313)
     new ea93adc  Improve production image iteration speed (#9162)
     new f045a8d  Fixes unbound variable on MacOS (#9335)
     new 6efe62f  clarify breeze initialize virtualenv instructions (#9319)
     new 0eb1378  Fix Airflow Stable version in README.md (#9360)
     new d22054f  Fixed crashing webserver after /tmp is mounted from the host (#9378)
     new 7835318  Fixes Breeze 'tests' command (#9384)
     new a6f3d3e  Fix in-breeze CLI tools to work also on Linux (#9376)
     new 3e10afc  add guidance re yarn build for local virtualenv development (#9411)
     new e1675c3  Fixed rendering of IMAGES.rst (#9433)
     new e31761f  In case of worktree .git might be a file - rat-check fails (#9435)

This update added new revisions after undoing existing revisions.
That is to say, some revisions that were in the old version of the
branch are not in the new version.  This situation occurs
when a user --force pushes a change and generates a repository
containing something like this:

 * -- * -- B -- O -- O -- O   (f47e81a)
            \
             N -- N -- N   refs/heads/v1-10-test (e31761f)

You should already have received notification emails for all of the O
revisions, and so the following emails describe only the N revisions
from the common base, B.

Any revisions marked "omit" are not gone; other references still
refer to them.  Any revisions marked "discard" are gone forever.

The 25 revisions listed above as "new" are entirely new to this
repository and will be described in separate emails.  The revisions
listed as "add" were already present in the repository and have only
been added to this reference.


Summary of changes:
 airflow/operators/sql.py | 4 ++++
 1 file changed, 4 insertions(+)


[airflow] 25/25: In case of worktree .git might be a file - rat-check fails (#9435)

Posted by po...@apache.org.
This is an automated email from the ASF dual-hosted git repository.

potiuk pushed a commit to branch v1-10-test
in repository https://gitbox.apache.org/repos/asf/airflow.git

commit e31761f4eceb84215f4ac8f08f6ac25205af11e4
Author: Jarek Potiuk <ja...@potiuk.com>
AuthorDate: Sat Jun 20 11:27:50 2020 +0200

    In case of worktree .git might be a file - rat-check fails (#9435)
    
    (cherry picked from commit 4359b8e6ca484ded750dcf2f93131bae3e8502dc)
---
 .rat-excludes | 3 +++
 1 file changed, 3 insertions(+)

diff --git a/.rat-excludes b/.rat-excludes
index f50f7d3..6bef964 100644
--- a/.rat-excludes
+++ b/.rat-excludes
@@ -83,3 +83,6 @@ PULL_REQUEST_TEMPLATE.md
 
 # the example notebook is ASF 2 licensed but RAT cannot read this
 input_notebook.ipynb
+
+# .git might be a file in case of worktree
+.git
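
For context, in a linked git worktree the .git entry is a plain file that
points back at the main repository instead of being a directory, which is
what makes the rat-check stumble. A minimal Python sketch (illustration
only, not part of this patch; the helper name is made up) of the two cases:

    import os

    def describe_git_entry(repo_root="."):
        """Report whether .git is a directory (normal clone) or a file (worktree)."""
        git_path = os.path.join(repo_root, ".git")
        if os.path.isdir(git_path):
            return "directory (normal clone)"
        if os.path.isfile(git_path):
            # A worktree's .git file holds a single pointer line such as:
            #   gitdir: /path/to/main/repo/.git/worktrees/<name>
            with open(git_path) as handle:
                return "file -> " + handle.read().strip()
        return "missing"

    if __name__ == "__main__":
        print(describe_git_entry())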


[airflow] 19/25: Fix Airflow Stable version in README.md (#9360)

Posted by po...@apache.org.
This is an automated email from the ASF dual-hosted git repository.

potiuk pushed a commit to branch v1-10-test
in repository https://gitbox.apache.org/repos/asf/airflow.git

commit 0eb13789fcc914a1c83f7163603ce89c2c2635b8
Author: Kaxil Naik <ka...@gmail.com>
AuthorDate: Thu Jun 18 00:33:04 2020 +0100

    Fix Airflow Stable version in README.md (#9360)
    
    
    (cherry picked from commit 8622c134a02cf2f57f57692e4008b0883039c65a)
---
 README.md | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)

diff --git a/README.md b/README.md
index ad8915b..a6ca2af 100644
--- a/README.md
+++ b/README.md
@@ -67,7 +67,7 @@ Apache Airflow is tested with:
 * Sqlite - latest stable (it is used mainly for development purpose)
 * Kubernetes - 1.16.2, 1.17.0
 
-### Stable version (1.10.9)
+### Stable version (1.10.10)
 
 * Python versions: 2.7, 3.5, 3.6, 3.7
 * Postgres DB: 9.6, 10


[airflow] 11/25: Additional apt dependencies options in breeze (#9231)

Posted by po...@apache.org.
This is an automated email from the ASF dual-hosted git repository.

potiuk pushed a commit to branch v1-10-test
in repository https://gitbox.apache.org/repos/asf/airflow.git

commit 00211dcc13b3ead5720f304dc0c7c4f959ad8d5f
Author: zikun <33...@users.noreply.github.com>
AuthorDate: Fri Jun 12 00:53:26 2020 +0800

    Additional apt dependencies options in breeze (#9231)
    
    (cherry picked from commit 0682e784b1515284c2954464a37ff9fe880171e3)
---
 BREEZE.rst                            | 24 ++++++++++++++++++++++++
 breeze                                | 14 ++++++++++++++
 breeze-complete                       |  2 +-
 scripts/ci/libraries/_build_images.sh | 10 ++++++++++
 4 files changed, 49 insertions(+), 1 deletion(-)

diff --git a/BREEZE.rst b/BREEZE.rst
index bf5e311..0393369 100644
--- a/BREEZE.rst
+++ b/BREEZE.rst
@@ -838,6 +838,12 @@ This is the current syntax for  `./breeze <./breeze>`_:
   --additional-python-deps
           Additional python dependencies to use when building the images.
 
+  --additional-dev-deps
+          Additional apt dev dependencies to use when building the images.
+
+  --additional-runtime-deps
+          Additional apt runtime dependencies to use when building the images.
+
   -C, --force-clean-images
           Force build images with cache disabled. This will remove the pulled or build images
           and start building images from scratch. This might take a long time.
@@ -1064,6 +1070,12 @@ This is the current syntax for  `./breeze <./breeze>`_:
   --additional-python-deps
           Additional python dependencies to use when building the images.
 
+  --additional-dev-deps
+          Additional apt dev dependencies to use when building the images.
+
+  --additional-runtime-deps
+          Additional apt runtime dependencies to use when building the images.
+
   -C, --force-clean-images
           Force build images with cache disabled. This will remove the pulled or build images
           and start building images from scratch. This might take a long time.
@@ -1283,6 +1295,12 @@ This is the current syntax for  `./breeze <./breeze>`_:
   --additional-python-deps
           Additional python dependencies to use when building the images.
 
+  --additional-dev-deps
+          Additional apt dev dependencies to use when building the images.
+
+  --additional-runtime-deps
+          Additional apt runtime dependencies to use when building the images.
+
   -C, --force-clean-images
           Force build images with cache disabled. This will remove the pulled or build images
           and start building images from scratch. This might take a long time.
@@ -1553,6 +1571,12 @@ This is the current syntax for  `./breeze <./breeze>`_:
   --additional-python-deps
           Additional python dependencies to use when building the images.
 
+  --additional-dev-deps
+          Additional apt dev dependencies to use when building the images.
+
+  --additional-runtime-deps
+          Additional apt runtime dependencies to use when building the images.
+
   -C, --force-clean-images
           Force build images with cache disabled. This will remove the pulled or build images
           and start building images from scratch. This might take a long time.
diff --git a/breeze b/breeze
index 005f917..9e1c746 100755
--- a/breeze
+++ b/breeze
@@ -724,6 +724,14 @@ function parse_arguments() {
           export ADDITIONAL_PYTHON_DEPS="${2}"
           echo "Additional python dependencies: ${ADDITIONAL_PYTHON_DEPS}"
           shift 2 ;;
+        --additional-dev-deps)
+          export ADDITIONAL_DEV_DEPS="${2}"
+          echo "Additional apt dev dependencies: ${ADDITIONAL_DEV_DEPS}"
+          shift 2 ;;
+        --additional-runtime-deps)
+          export ADDITIONAL_RUNTIME_DEPS="${2}"
+          echo "Additional apt runtime dependencies: ${ADDITIONAL_RUNTIME_DEPS}"
+          shift 2 ;;
         -D|--dockerhub-user)
           export DOCKERHUB_USER="${2}"
           echo "Dockerhub user ${DOCKERHUB_USER}"
@@ -1577,6 +1585,12 @@ ${FORMATTED_DEFAULT_PROD_EXTRAS}
 --additional-python-deps
         Additional python dependencies to use when building the images.
 
+--additional-dev-deps
+        Additional apt dev dependencies to use when building the images.
+
+--additional-runtime-deps
+        Additional apt runtime dependencies to use when building the images.
+
 -C, --force-clean-images
         Force build images with cache disabled. This will remove the pulled or build images
         and start building images from scratch. This might take a long time.
diff --git a/breeze-complete b/breeze-complete
index 5d8a724..6038014 100644
--- a/breeze-complete
+++ b/breeze-complete
@@ -101,7 +101,7 @@ verbose assume-yes assume-no assume-quit forward-credentials
 force-build-images force-pull-images production-image extras: force-clean-images use-local-cache
 dockerhub-user: dockerhub-repo: registry-cache github-organisation: github-repo:
 postgres-version: mysql-version:
-additional-extras: additional-python-deps:
+additional-extras: additional-python-deps: additional-dev-deps: additional-runtime-deps:
 "
 
 export BREEZE_COMMANDS="
diff --git a/scripts/ci/libraries/_build_images.sh b/scripts/ci/libraries/_build_images.sh
index 83cce06..be349b8 100644
--- a/scripts/ci/libraries/_build_images.sh
+++ b/scripts/ci/libraries/_build_images.sh
@@ -347,6 +347,8 @@ function prepare_ci_build() {
     export AIRFLOW_EXTRAS="${AIRFLOW_EXTRAS:="${DEFAULT_CI_EXTRAS}"}"
     export ADDITIONAL_AIRFLOW_EXTRAS="${ADDITIONAL_AIRFLOW_EXTRAS:=""}"
     export ADDITIONAL_PYTHON_DEPS="${ADDITIONAL_PYTHON_DEPS:=""}"
+    export ADDITIONAL_DEV_DEPS="${ADDITIONAL_DEV_DEPS:=""}"
+    export ADDITIONAL_RUNTIME_DEPS="${ADDITIONAL_RUNTIME_DEPS:=""}"
     export AIRFLOW_IMAGE="${AIRFLOW_CI_IMAGE}"
     go_to_airflow_sources
     fix_group_permissions
@@ -546,6 +548,8 @@ Docker building ${AIRFLOW_CI_IMAGE}.
         --build-arg AIRFLOW_EXTRAS="${AIRFLOW_EXTRAS}" \
         --build-arg ADDITIONAL_AIRFLOW_EXTRAS="${ADDITIONAL_AIRFLOW_EXTRAS}" \
         --build-arg ADDITIONAL_PYTHON_DEPS="${ADDITIONAL_PYTHON_DEPS}" \
+        --build-arg ADDITIONAL_DEV_DEPS="${ADDITIONAL_DEV_DEPS}" \
+        --build-arg ADDITIONAL_RUNTIME_DEPS="${ADDITIONAL_RUNTIME_DEPS}" \
         --build-arg AIRFLOW_CONTAINER_CI_OPTIMISED_BUILD="${AIRFLOW_CONTAINER_CI_OPTIMISED_BUILD}" \
         --build-arg UPGRADE_TO_LATEST_REQUIREMENTS="${UPGRADE_TO_LATEST_REQUIREMENTS}" \
         "${DOCKER_CACHE_CI_DIRECTIVE[@]}" \
@@ -581,6 +585,8 @@ function prepare_prod_build() {
     export AIRFLOW_EXTRAS="${AIRFLOW_EXTRAS:="${DEFAULT_PROD_EXTRAS}"}"
     export ADDITIONAL_AIRFLOW_EXTRAS="${ADDITIONAL_AIRFLOW_EXTRAS:=""}"
     export ADDITIONAL_PYTHON_DEPS="${ADDITIONAL_PYTHON_DEPS:=""}"
+    export ADDITIONAL_DEV_DEPS="${ADDITIONAL_DEV_DEPS:=""}"
+    export ADDITIONAL_RUNTIME_DEPS="${ADDITIONAL_RUNTIME_DEPS:=""}"
     export AIRFLOW_IMAGE="${AIRFLOW_PROD_IMAGE}"
 
     if [[ ${ENABLE_REGISTRY_CACHE="false"} == "true" ]]; then
@@ -667,6 +673,8 @@ function build_prod_image() {
         --build-arg AIRFLOW_EXTRAS="${AIRFLOW_EXTRAS}" \
         --build-arg ADDITIONAL_AIRFLOW_EXTRAS="${ADDITIONAL_AIRFLOW_EXTRAS}" \
         --build-arg ADDITIONAL_PYTHON_DEPS="${ADDITIONAL_PYTHON_DEPS}" \
+        --build-arg ADDITIONAL_DEV_DEPS="${ADDITIONAL_DEV_DEPS}" \
+        --build-arg ADDITIONAL_RUNTIME_DEPS="${ADDITIONAL_RUNTIME_DEPS}" \
         "${DOCKER_CACHE_PROD_BUILD_DIRECTIVE[@]}" \
         -t "${AIRFLOW_PROD_BUILD_IMAGE}" \
         --target "airflow-build-image" \
@@ -677,6 +685,8 @@ function build_prod_image() {
         --build-arg PYTHON_MAJOR_MINOR_VERSION="${PYTHON_MAJOR_MINOR_VERSION}" \
         --build-arg ADDITIONAL_AIRFLOW_EXTRAS="${ADDITIONAL_AIRFLOW_EXTRAS}" \
         --build-arg ADDITIONAL_PYTHON_DEPS="${ADDITIONAL_PYTHON_DEPS}" \
+        --build-arg ADDITIONAL_DEV_DEPS="${ADDITIONAL_DEV_DEPS}" \
+        --build-arg ADDITIONAL_RUNTIME_DEPS="${ADDITIONAL_RUNTIME_DEPS}" \
         --build-arg AIRFLOW_VERSION="${AIRFLOW_VERSION}" \
         --build-arg AIRFLOW_EXTRAS="${AIRFLOW_EXTRAS}" \
         "${DOCKER_CACHE_PROD_DIRECTIVE[@]}" \


[airflow] 05/25: Fix typo in BREEZE.rst (#9199)

Posted by po...@apache.org.
This is an automated email from the ASF dual-hosted git repository.

potiuk pushed a commit to branch v1-10-test
in repository https://gitbox.apache.org/repos/asf/airflow.git

commit 3a541373eb8e9d35c9cbd84c382098e3c5315676
Author: Udit Chaudhary <33...@users.noreply.github.com>
AuthorDate: Wed Jun 10 00:04:46 2020 +0530

    Fix typo in BREEZE.rst (#9199)
    
    Changed 'y' to 'by' since it was incorrect.
    
    (cherry picked from commit c18f4c035c3fcee2e47c5f274b2f59add4d14ced)
---
 BREEZE.rst | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)

diff --git a/BREEZE.rst b/BREEZE.rst
index b2ff795..bf5e311 100644
--- a/BREEZE.rst
+++ b/BREEZE.rst
@@ -1641,7 +1641,7 @@ On Linux, there is a problem with propagating ownership of created files (a know
 files and directories created in the container are not owned by the host user (but by the root user in our
 case). This may prevent you from switching branches, for example, if files owned by the root user are
 created within your sources. In case you are on a Linux host and have some files in your sources created
-y the root user, you can fix the ownership of those files by running this script:
+by the root user, you can fix the ownership of those files by running this script:
 
 .. code-block::
 


[airflow] 01/25: Merging multiple sql operators (#9124)

Posted by po...@apache.org.
This is an automated email from the ASF dual-hosted git repository.

potiuk pushed a commit to branch v1-10-test
in repository https://gitbox.apache.org/repos/asf/airflow.git

commit 910ac9a94d75a26fae681dbc6dda94deed405465
Author: samuelkhtu <46...@users.noreply.github.com>
AuthorDate: Wed Jun 17 14:32:46 2020 -0400

    Merging multiple sql operators (#9124)
    
    * Merge various SQL Operators into sql.py
    
    * Fix unit test code format
    
    * Merge multiple SQL operators into one
    
    1. Merge check_operator.py into airflow.operators.sql
    2. Merge sql_branch_operator.py into airflow.operators.sql
    3. Merge unit test for both into test_sql.py
    
    * Rename test_core_to_contrib Interval/ValueCheckOperator to SQLInterval/ValueCheckOperator
    
    * Fixed deprecated class and added check to test_core_to_contrib
    
    (cherry picked from commit 0b9bf4a285a074bbde270839a90fb53c257340be)
---
 ...eea_add_precision_to_execution_date_in_mysql.py |   2 +-
 airflow/operators/check_operator.py                | 425 ++------------
 airflow/operators/sql.py                           | 636 +++++++++++++++++++++
 airflow/operators/sql_branch_operator.py           | 162 +-----
 docs/operators-and-hooks-ref.rst                   |  30 +-
 tests/api/common/experimental/test_pool.py         |  64 ++-
 tests/contrib/hooks/test_discord_webhook_hook.py   |   6 +-
 .../contrib/operators/test_databricks_operator.py  |   6 +-
 .../contrib/operators/test_gcs_to_gcs_operator.py  |   4 +-
 .../operators/test_qubole_check_operator.py        |   7 +-
 tests/contrib/operators/test_sftp_operator.py      |   6 +-
 tests/contrib/operators/test_ssh_operator.py       |   6 +-
 tests/contrib/operators/test_winrm_operator.py     |   6 +-
 tests/contrib/sensors/test_weekday_sensor.py       |  18 +-
 .../contrib/utils/test_mlengine_operator_utils.py  |  16 +-
 tests/jobs/test_backfill_job.py                    |   9 +-
 tests/kubernetes/test_worker_configuration.py      |  11 +-
 tests/models/test_baseoperator.py                  |   5 +-
 tests/operators/test_check_operator.py             | 327 -----------
 tests/operators/test_s3_to_hive_operator.py        |  12 +-
 .../{test_sql_branch_operator.py => test_sql.py}   | 342 +++++++++--
 tests/secrets/test_local_filesystem.py             |  16 +-
 tests/sensors/test_http_sensor.py                  |   3 +-
 tests/utils/test_compression.py                    |  16 +-
 tests/utils/test_decorators.py                     |  10 +-
 tests/utils/test_json.py                           |  11 +-
 tests/utils/test_module_loading.py                 |   4 +-
 tests/www/test_validators.py                       |  11 +-
 tests/www_rbac/test_validators.py                  |   9 +-
 29 files changed, 1159 insertions(+), 1021 deletions(-)
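
For orientation before the file-by-file diff, a minimal usage sketch of the
consolidated airflow.operators.sql module (illustration only, not part of
this patch), assuming Airflow 1.10 with this commit applied and an existing
connection named my_db:

    from airflow.operators.sql import SQLCheckOperator, SQLValueCheckOperator

    # Fails the task if any column of the first returned row is falsy.
    row_count_check = SQLCheckOperator(
        task_id="row_count_check",
        conn_id="my_db",
        sql="SELECT COUNT(*) FROM foo WHERE ds = '{{ ds }}'",
    )

    # Fails the task unless the result is within 10% of the expected value.
    value_check = SQLValueCheckOperator(
        task_id="value_check",
        conn_id="my_db",
        sql="SELECT COUNT(*) FROM foo WHERE ds = '{{ ds }}'",
        pass_value=100,
        tolerance=0.1,
    )

    # The old import paths (airflow.operators.check_operator and
    # airflow.operators.sql_branch_operator) keep working but emit a
    # DeprecationWarning, as the shims in the diff below show.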

diff --git a/airflow/migrations/versions/a66efa278eea_add_precision_to_execution_date_in_mysql.py b/airflow/migrations/versions/a66efa278eea_add_precision_to_execution_date_in_mysql.py
index ecb589d..59098a8 100644
--- a/airflow/migrations/versions/a66efa278eea_add_precision_to_execution_date_in_mysql.py
+++ b/airflow/migrations/versions/a66efa278eea_add_precision_to_execution_date_in_mysql.py
@@ -29,7 +29,7 @@ from sqlalchemy.dialects import mysql
 
 # revision identifiers, used by Alembic.
 revision = 'a66efa278eea'
-down_revision = '8f966b9c467a'
+down_revision = '952da73b5eff'
 branch_labels = None
 depends_on = None
 
diff --git a/airflow/operators/check_operator.py b/airflow/operators/check_operator.py
index b6d3a18..12ac472 100644
--- a/airflow/operators/check_operator.py
+++ b/airflow/operators/check_operator.py
@@ -17,409 +17,70 @@
 # specific language governing permissions and limitations
 # under the License.
 
-from builtins import str, zip
-from typing import Optional, Any, Iterable, Dict, SupportsAbs
+"""This module is deprecated. Please use `airflow.operators.sql`."""
 
-from airflow.exceptions import AirflowException
-from airflow.hooks.base_hook import BaseHook
-from airflow.models import BaseOperator
-from airflow.utils.decorators import apply_defaults
+import warnings
 
+from airflow.operators.sql import (
+    SQLCheckOperator, SQLIntervalCheckOperator, SQLThresholdCheckOperator, SQLValueCheckOperator,
+)
 
-class CheckOperator(BaseOperator):
-    """
-    Performs checks against a db. The ``CheckOperator`` expects
-    a sql query that will return a single row. Each value on that
-    first row is evaluated using python ``bool`` casting. If any of the
-    values return ``False`` the check is failed and errors out.
-
-    Note that Python bool casting evals the following as ``False``:
-
-    * ``False``
-    * ``0``
-    * Empty string (``""``)
-    * Empty list (``[]``)
-    * Empty dictionary or set (``{}``)
-
-    Given a query like ``SELECT COUNT(*) FROM foo``, it will fail only if
-    the count ``== 0``. You can craft much more complex query that could,
-    for instance, check that the table has the same number of rows as
-    the source table upstream, or that the count of today's partition is
-    greater than yesterday's partition, or that a set of metrics are less
-    than 3 standard deviation for the 7 day average.
-
-    This operator can be used as a data quality check in your pipeline, and
-    depending on where you put it in your DAG, you have the choice to
-    stop the critical path, preventing from
-    publishing dubious data, or on the side and receive email alerts
-    without stopping the progress of the DAG.
 
-    Note that this is an abstract class and get_db_hook
-    needs to be defined. Whereas a get_db_hook is hook that gets a
-    single record from an external source.
-
-    :param sql: the sql to be executed. (templated)
-    :type sql: str
+class CheckOperator(SQLCheckOperator):
     """
-
-    template_fields = ('sql',)  # type: Iterable[str]
-    template_ext = ('.hql', '.sql',)  # type: Iterable[str]
-    ui_color = '#fff7e6'
-
-    @apply_defaults
-    def __init__(
-        self,
-        sql,  # type: str
-        conn_id=None,  # type: Optional[str]
-        *args,
-        **kwargs
-    ):
-        super(CheckOperator, self).__init__(*args, **kwargs)
-        self.conn_id = conn_id
-        self.sql = sql
-
-    def execute(self, context=None):
-        self.log.info('Executing SQL check: %s', self.sql)
-        records = self.get_db_hook().get_first(self.sql)
-
-        self.log.info('Record: %s', records)
-        if not records:
-            raise AirflowException("The query returned None")
-        elif not all([bool(r) for r in records]):
-            raise AirflowException("Test failed.\nQuery:\n{query}\nResults:\n{records!s}".format(
-                query=self.sql, records=records))
-
-        self.log.info("Success.")
-
-    def get_db_hook(self):
-        return BaseHook.get_hook(conn_id=self.conn_id)
-
-
-def _convert_to_float_if_possible(s):
+    This class is deprecated.
+    Please use `airflow.operators.sql.SQLCheckOperator`.
     """
-    A small helper function to convert a string to a numeric value
-    if appropriate
 
-    :param s: the string to be converted
-    :type s: str
-    """
-    try:
-        ret = float(s)
-    except (ValueError, TypeError):
-        ret = s
-    return ret
+    def __init__(self, *args, **kwargs):
+        warnings.warn(
+            """This class is deprecated.
+            Please use `airflow.operators.sql.SQLCheckOperator`.""",
+            DeprecationWarning, stacklevel=2
+        )
+        super(CheckOperator, self).__init__(*args, **kwargs)
 
 
-class ValueCheckOperator(BaseOperator):
+class IntervalCheckOperator(SQLIntervalCheckOperator):
     """
-    Performs a simple value check using sql code.
-
-    Note that this is an abstract class and get_db_hook
-    needs to be defined. Whereas a get_db_hook is hook that gets a
-    single record from an external source.
-
-    :param sql: the sql to be executed. (templated)
-    :type sql: str
+    This class is deprecated.
+    Please use `airflow.operators.sql.SQLIntervalCheckOperator`.
     """
 
-    __mapper_args__ = {
-        'polymorphic_identity': 'ValueCheckOperator'
-    }
-    template_fields = ('sql', 'pass_value',)  # type: Iterable[str]
-    template_ext = ('.hql', '.sql',)  # type: Iterable[str]
-    ui_color = '#fff7e6'
-
-    @apply_defaults
-    def __init__(
-        self,
-        sql,  # type: str
-        pass_value,  # type: Any
-        tolerance=None,  # type: Any
-        conn_id=None,  # type: Optional[str]
-        *args,
-        **kwargs
-    ):
-        super(ValueCheckOperator, self).__init__(*args, **kwargs)
-        self.sql = sql
-        self.conn_id = conn_id
-        self.pass_value = str(pass_value)
-        tol = _convert_to_float_if_possible(tolerance)
-        self.tol = tol if isinstance(tol, float) else None
-        self.has_tolerance = self.tol is not None
-
-    def execute(self, context=None):
-        self.log.info('Executing SQL check: %s', self.sql)
-        records = self.get_db_hook().get_first(self.sql)
-
-        if not records:
-            raise AirflowException("The query returned None")
-
-        pass_value_conv = _convert_to_float_if_possible(self.pass_value)
-        is_numeric_value_check = isinstance(pass_value_conv, float)
-
-        tolerance_pct_str = str(self.tol * 100) + '%' if self.has_tolerance else None
-        error_msg = ("Test failed.\nPass value:{pass_value_conv}\n"
-                     "Tolerance:{tolerance_pct_str}\n"
-                     "Query:\n{sql}\nResults:\n{records!s}").format(
-            pass_value_conv=pass_value_conv,
-            tolerance_pct_str=tolerance_pct_str,
-            sql=self.sql,
-            records=records
+    def __init__(self, *args, **kwargs):
+        warnings.warn(
+            """This class is deprecated.
+            Please use `airflow.operators.sql.SQLIntervalCheckOperator`.""",
+            DeprecationWarning, stacklevel=2
         )
-
-        if not is_numeric_value_check:
-            tests = self._get_string_matches(records, pass_value_conv)
-        elif is_numeric_value_check:
-            try:
-                numeric_records = self._to_float(records)
-            except (ValueError, TypeError):
-                raise AirflowException("Converting a result to float failed.\n{}".format(error_msg))
-            tests = self._get_numeric_matches(numeric_records, pass_value_conv)
-        else:
-            tests = []
-
-        if not all(tests):
-            raise AirflowException(error_msg)
-
-    def _to_float(self, records):
-        return [float(record) for record in records]
-
-    def _get_string_matches(self, records, pass_value_conv):
-        return [str(record) == pass_value_conv for record in records]
-
-    def _get_numeric_matches(self, numeric_records, numeric_pass_value_conv):
-        if self.has_tolerance:
-            return [
-                numeric_pass_value_conv * (1 - self.tol) <= record <= numeric_pass_value_conv * (1 + self.tol)
-                for record in numeric_records
-            ]
-
-        return [record == numeric_pass_value_conv for record in numeric_records]
-
-    def get_db_hook(self):
-        return BaseHook.get_hook(conn_id=self.conn_id)
+        super(IntervalCheckOperator, self).__init__(*args, **kwargs)
 
 
-class IntervalCheckOperator(BaseOperator):
+class ThresholdCheckOperator(SQLThresholdCheckOperator):
     """
-    Checks that the values of metrics given as SQL expressions are within
-    a certain tolerance of the ones from days_back before.
-
-    Note that this is an abstract class and get_db_hook
-    needs to be defined. Whereas a get_db_hook is hook that gets a
-    single record from an external source.
-
-    :param table: the table name
-    :type table: str
-    :param days_back: number of days between ds and the ds we want to check
-        against. Defaults to 7 days
-    :type days_back: int
-    :param ratio_formula: which formula to use to compute the ratio between
-        the two metrics. Assuming cur is the metric of today and ref is
-        the metric to today - days_back.
-
-        max_over_min: computes max(cur, ref) / min(cur, ref)
-        relative_diff: computes abs(cur-ref) / ref
-
-        Default: 'max_over_min'
-    :type ratio_formula: str
-    :param ignore_zero: whether we should ignore zero metrics
-    :type ignore_zero: bool
-    :param metrics_threshold: a dictionary of ratios indexed by metrics
-    :type metrics_threshold: dict
+    This class is deprecated.
+    Please use `airflow.operators.sql.SQLThresholdCheckOperator`.
     """
 
-    __mapper_args__ = {
-        'polymorphic_identity': 'IntervalCheckOperator'
-    }
-    template_fields = ('sql1', 'sql2')  # type: Iterable[str]
-    template_ext = ('.hql', '.sql',)  # type: Iterable[str]
-    ui_color = '#fff7e6'
-
-    ratio_formulas = {
-        'max_over_min': lambda cur, ref: float(max(cur, ref)) / min(cur, ref),
-        'relative_diff': lambda cur, ref: float(abs(cur - ref)) / ref,
-    }
-
-    @apply_defaults
-    def __init__(
-        self,
-        table,  # type: str
-        metrics_thresholds,  # type: Dict[str, int]
-        date_filter_column='ds',  # type: Optional[str]
-        days_back=-7,  # type: SupportsAbs[int]
-        ratio_formula='max_over_min',  # type: Optional[str]
-        ignore_zero=True,  # type: Optional[bool]
-        conn_id=None,  # type: Optional[str]
-        *args, **kwargs
-    ):
-        super(IntervalCheckOperator, self).__init__(*args, **kwargs)
-        if ratio_formula not in self.ratio_formulas:
-            msg_template = "Invalid diff_method: {diff_method}. " \
-                           "Supported diff methods are: {diff_methods}"
-
-            raise AirflowException(
-                msg_template.format(diff_method=ratio_formula,
-                                    diff_methods=self.ratio_formulas)
-            )
-        self.ratio_formula = ratio_formula
-        self.ignore_zero = ignore_zero
-        self.table = table
-        self.metrics_thresholds = metrics_thresholds
-        self.metrics_sorted = sorted(metrics_thresholds.keys())
-        self.date_filter_column = date_filter_column
-        self.days_back = -abs(days_back)
-        self.conn_id = conn_id
-        sqlexp = ', '.join(self.metrics_sorted)
-        sqlt = "SELECT {sqlexp} FROM {table} WHERE {date_filter_column}=".format(
-            sqlexp=sqlexp, table=table, date_filter_column=date_filter_column
+    def __init__(self, *args, **kwargs):
+        warnings.warn(
+            """This class is deprecated.
+            Please use `airflow.operators.sql.SQLThresholdCheckOperator`.""",
+            DeprecationWarning, stacklevel=2
         )
-
-        self.sql1 = sqlt + "'{{ ds }}'"
-        self.sql2 = sqlt + "'{{ macros.ds_add(ds, " + str(self.days_back) + ") }}'"
-
-    def execute(self, context=None):
-        hook = self.get_db_hook()
-        self.log.info('Using ratio formula: %s', self.ratio_formula)
-        self.log.info('Executing SQL check: %s', self.sql2)
-        row2 = hook.get_first(self.sql2)
-        self.log.info('Executing SQL check: %s', self.sql1)
-        row1 = hook.get_first(self.sql1)
-
-        if not row2:
-            raise AirflowException("The query {} returned None".format(self.sql2))
-        if not row1:
-            raise AirflowException("The query {} returned None".format(self.sql1))
-
-        current = dict(zip(self.metrics_sorted, row1))
-        reference = dict(zip(self.metrics_sorted, row2))
-
-        ratios = {}
-        test_results = {}
-
-        for m in self.metrics_sorted:
-            cur = current[m]
-            ref = reference[m]
-            threshold = self.metrics_thresholds[m]
-            if cur == 0 or ref == 0:
-                ratios[m] = None
-                test_results[m] = self.ignore_zero
-            else:
-                ratios[m] = self.ratio_formulas[self.ratio_formula](current[m], reference[m])
-                test_results[m] = ratios[m] < threshold
-
-            self.log.info(
-                (
-                    "Current metric for %s: %s\n"
-                    "Past metric for %s: %s\n"
-                    "Ratio for %s: %s\n"
-                    "Threshold: %s\n"
-                ), m, cur, m, ref, m, ratios[m], threshold)
-
-        if not all(test_results.values()):
-            failed_tests = [it[0] for it in test_results.items() if not it[1]]
-            j = len(failed_tests)
-            n = len(self.metrics_sorted)
-            self.log.warning("The following %s tests out of %s failed:", j, n)
-            for k in failed_tests:
-                self.log.warning(
-                    "'%s' check failed. %s is above %s", k, ratios[k], self.metrics_thresholds[k]
-                )
-            raise AirflowException("The following tests have failed:\n {0}".format(", ".join(
-                sorted(failed_tests))))
-
-        self.log.info("All tests have passed")
-
-    def get_db_hook(self):
-        return BaseHook.get_hook(conn_id=self.conn_id)
+        super(ThresholdCheckOperator, self).__init__(*args, **kwargs)
 
 
-class ThresholdCheckOperator(BaseOperator):
+class ValueCheckOperator(SQLValueCheckOperator):
     """
-    Performs a value check using sql code against a mininmum threshold
-    and a maximum threshold. Thresholds can be in the form of a numeric
-    value OR a sql statement that results a numeric.
-
-    Note that this is an abstract class and get_db_hook
-    needs to be defined. Whereas a get_db_hook is hook that gets a
-    single record from an external source.
-
-    :param sql: the sql to be executed. (templated)
-    :type sql: str
-    :param min_threshold: numerical value or min threshold sql to be executed (templated)
-    :type min_threshold: numeric or str
-    :param max_threshold: numerical value or max threshold sql to be executed (templated)
-    :type max_threshold: numeric or str
+    This class is deprecated.
+    Please use `airflow.operators.sql.SQLValueCheckOperator`.
     """
 
-    template_fields = ('sql', 'min_threshold', 'max_threshold')  # type: Iterable[str]
-    template_ext = ('.hql', '.sql',)  # type: Iterable[str]
-
-    @apply_defaults
-    def __init__(
-        self,
-        sql,   # type: str
-        min_threshold,   # type: Any
-        max_threshold,   # type: Any
-        conn_id=None,   # type: Optional[str]
-        *args, **kwargs
-    ):
-        super(ThresholdCheckOperator, self).__init__(*args, **kwargs)
-        self.sql = sql
-        self.conn_id = conn_id
-        self.min_threshold = _convert_to_float_if_possible(min_threshold)
-        self.max_threshold = _convert_to_float_if_possible(max_threshold)
-
-    def execute(self, context=None):
-        hook = self.get_db_hook()
-        result = hook.get_first(self.sql)[0][0]
-
-        if isinstance(self.min_threshold, float):
-            lower_bound = self.min_threshold
-        else:
-            lower_bound = hook.get_first(self.min_threshold)[0][0]
-
-        if isinstance(self.max_threshold, float):
-            upper_bound = self.max_threshold
-        else:
-            upper_bound = hook.get_first(self.max_threshold)[0][0]
-
-        meta_data = {
-            "result": result,
-            "task_id": self.task_id,
-            "min_threshold": lower_bound,
-            "max_threshold": upper_bound,
-            "within_threshold": lower_bound <= result <= upper_bound
-        }
-
-        self.push(meta_data)
-        if not meta_data["within_threshold"]:
-            error_msg = (
-                'Threshold Check: "{task_id}" failed.\n'
-                'DAG: {dag_id}\nTask_id: {task_id}\n'
-                'Check description: {description}\n'
-                'SQL: {sql}\n'
-                'Result: {result} is not within thresholds '
-                '{min_threshold} and {max_threshold}'
-            ).format(
-                task_id=self.task_id, dag_id=self.dag_id,
-                description=meta_data.get("description"), sql=self.sql,
-                result=round(meta_data.get("result"), 2),
-                min_threshold=meta_data.get("min_threshold"),
-                max_threshold=meta_data.get("max_threshold")
-            )
-            raise AirflowException(error_msg)
-
-        self.log.info("Test %s Successful.", self.task_id)
-
-    def push(self, meta_data):
-        """
-        Optional: Send data check info and metadata to an external database.
-        Default functionality will log metadata.
-        """
-
-        info = "\n".join(["""{}: {}""".format(key, item) for key, item in meta_data.items()])
-        self.log.info("Log from %s:\n%s", self.dag_id, info)
-
-    def get_db_hook(self):
-        return BaseHook.get_hook(conn_id=self.conn_id)
+    def __init__(self, *args, **kwargs):
+        warnings.warn(
+            """This class is deprecated.
+            Please use `airflow.operators.sql.SQLValueCheckOperator`.""",
+            DeprecationWarning, stacklevel=2
+        )
+        super(ValueCheckOperator, self).__init__(*args, **kwargs)
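
The shims above keep the old check_operator entry points importable while
steering users to airflow.operators.sql. A small sketch (illustration only,
not part of this patch) of the behaviour one would expect from them:

    import warnings

    from airflow.operators.check_operator import CheckOperator
    from airflow.operators.sql import SQLCheckOperator

    with warnings.catch_warnings(record=True) as caught:
        warnings.simplefilter("always")
        legacy = CheckOperator(task_id="legacy_check", sql="SELECT 1", conn_id="my_db")

    # The deprecated class is now a thin subclass of the new one and warns on use.
    assert isinstance(legacy, SQLCheckOperator)
    assert any(issubclass(w.category, DeprecationWarning) for w in caught)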
diff --git a/airflow/operators/sql.py b/airflow/operators/sql.py
new file mode 100644
index 0000000..1e5b090
--- /dev/null
+++ b/airflow/operators/sql.py
@@ -0,0 +1,636 @@
+#
+# Licensed to the Apache Software Foundation (ASF) under one
+# or more contributor license agreements.  See the NOTICE file
+# distributed with this work for additional information
+# regarding copyright ownership.  The ASF licenses this file
+# to you under the Apache License, Version 2.0 (the
+# "License"); you may not use this file except in compliance
+# with the License.  You may obtain a copy of the License at
+#
+#   http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing,
+# software distributed under the License is distributed on an
+# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+# KIND, either express or implied.  See the License for the
+# specific language governing permissions and limitations
+# under the License.
+import six
+from distutils.util import strtobool
+from typing import Iterable
+
+from airflow.exceptions import AirflowException
+from airflow.hooks.base_hook import BaseHook
+from airflow.models import BaseOperator, SkipMixin
+from airflow.utils.decorators import apply_defaults
+
+ALLOWED_CONN_TYPE = {
+    "google_cloud_platform",
+    "jdbc",
+    "mssql",
+    "mysql",
+    "odbc",
+    "oracle",
+    "postgres",
+    "presto",
+    "sqlite",
+    "vertica",
+}
+
+
+class SQLCheckOperator(BaseOperator):
+    """
+    Performs checks against a db. The ``SQLCheckOperator`` expects
+    a sql query that will return a single row. Each value on that
+    first row is evaluated using python ``bool`` casting. If any of the
+    values return ``False`` the check is failed and errors out.
+
+    Note that Python bool casting evals the following as ``False``:
+
+    * ``False``
+    * ``0``
+    * Empty string (``""``)
+    * Empty list (``[]``)
+    * Empty dictionary or set (``{}``)
+
+    Given a query like ``SELECT COUNT(*) FROM foo``, it will fail only if
+    the count ``== 0``. You can craft a much more complex query that could,
+    for instance, check that the table has the same number of rows as
+    the source table upstream, or that the count of today's partition is
+    greater than yesterday's partition, or that a set of metrics are less
+    than 3 standard deviations from the 7 day average.
+
+    This operator can be used as a data quality check in your pipeline, and
+    depending on where you put it in your DAG, you have the choice to
+    stop the critical path, preventing the DAG from
+    publishing dubious data, or to run it on the side and receive email alerts
+    without stopping the progress of the DAG.
+
+    Note that this is an abstract class and get_db_hook
+    needs to be defined, where get_db_hook returns a hook that gets a
+    single record from an external source.
+
+    :param sql: the sql to be executed. (templated)
+    :type sql: str
+    """
+
+    template_fields = ("sql",)  # type: Iterable[str]
+    template_ext = (
+        ".hql",
+        ".sql",
+    )  # type: Iterable[str]
+    ui_color = "#fff7e6"
+
+    @apply_defaults
+    def __init__(
+        self, sql, conn_id=None, *args, **kwargs
+    ):
+        super(SQLCheckOperator, self).__init__(*args, **kwargs)
+        self.conn_id = conn_id
+        self.sql = sql
+
+    def execute(self, context=None):
+        self.log.info("Executing SQL check: %s", self.sql)
+        records = self.get_db_hook().get_first(self.sql)
+
+        self.log.info("Record: %s", records)
+        if not records:
+            raise AirflowException("The query returned None")
+        elif not all([bool(r) for r in records]):
+            raise AirflowException(
+                "Test failed.\nQuery:\n{query}\nResults:\n{records!s}".format(
+                    query=self.sql, records=records
+                )
+            )
+
+        self.log.info("Success.")
+
+    def get_db_hook(self):
+        """
+        Get the database hook for the connection.
+
+        :return: the database hook object.
+        :rtype: DbApiHook
+        """
+        return BaseHook.get_hook(conn_id=self.conn_id)
+
+
+def _convert_to_float_if_possible(s):
+    """
+    A small helper function to convert a string to a numeric value
+    if appropriate
+
+    :param s: the string to be converted
+    :type s: str
+    """
+    try:
+        ret = float(s)
+    except (ValueError, TypeError):
+        ret = s
+    return ret
+
+
+class SQLValueCheckOperator(BaseOperator):
+    """
+    Performs a simple value check using sql code.
+
+    Note that this is an abstract class and get_db_hook
+    needs to be defined, where get_db_hook returns a hook that gets a
+    single record from an external source.
+
+    :param sql: the sql to be executed. (templated)
+    :type sql: str
+    """
+
+    __mapper_args__ = {"polymorphic_identity": "SQLValueCheckOperator"}
+    template_fields = (
+        "sql",
+        "pass_value",
+    )  # type: Iterable[str]
+    template_ext = (
+        ".hql",
+        ".sql",
+    )  # type: Iterable[str]
+    ui_color = "#fff7e6"
+
+    @apply_defaults
+    def __init__(
+            self,
+            sql,
+            pass_value,
+            tolerance=None,
+            conn_id=None,
+            *args,
+            **kwargs):
+        super(SQLValueCheckOperator, self).__init__(*args, **kwargs)
+        self.sql = sql
+        self.conn_id = conn_id
+        self.pass_value = str(pass_value)
+        tol = _convert_to_float_if_possible(tolerance)
+        self.tol = tol if isinstance(tol, float) else None
+        self.has_tolerance = self.tol is not None
+
+    def execute(self, context=None):
+        self.log.info("Executing SQL check: %s", self.sql)
+        records = self.get_db_hook().get_first(self.sql)
+
+        if not records:
+            raise AirflowException("The query returned None")
+
+        pass_value_conv = _convert_to_float_if_possible(self.pass_value)
+        is_numeric_value_check = isinstance(pass_value_conv, float)
+
+        tolerance_pct_str = str(self.tol * 100) + "%" if self.has_tolerance else None
+        error_msg = (
+            "Test failed.\nPass value:{pass_value_conv}\n"
+            "Tolerance:{tolerance_pct_str}\n"
+            "Query:\n{sql}\nResults:\n{records!s}"
+        ).format(
+            pass_value_conv=pass_value_conv,
+            tolerance_pct_str=tolerance_pct_str,
+            sql=self.sql,
+            records=records,
+        )
+
+        if not is_numeric_value_check:
+            tests = self._get_string_matches(records, pass_value_conv)
+        elif is_numeric_value_check:
+            try:
+                numeric_records = self._to_float(records)
+            except (ValueError, TypeError):
+                raise AirflowException(
+                    "Converting a result to float failed.\n{}".format(error_msg)
+                )
+            tests = self._get_numeric_matches(numeric_records, pass_value_conv)
+        else:
+            tests = []
+
+        if not all(tests):
+            raise AirflowException(error_msg)
+
+    def _to_float(self, records):
+        return [float(record) for record in records]
+
+    def _get_string_matches(self, records, pass_value_conv):
+        return [str(record) == pass_value_conv for record in records]
+
+    def _get_numeric_matches(self, numeric_records, numeric_pass_value_conv):
+        if self.has_tolerance:
+            return [
+                numeric_pass_value_conv * (1 - self.tol) <= record <= numeric_pass_value_conv * (1 + self.tol)
+                for record in numeric_records
+            ]
+
+        return [record == numeric_pass_value_conv for record in numeric_records]
+
+    def get_db_hook(self):
+        """
+        Get the database hook for the connection.
+
+        :return: the database hook object.
+        :rtype: DbApiHook
+        """
+        return BaseHook.get_hook(conn_id=self.conn_id)
+
+
+class SQLIntervalCheckOperator(BaseOperator):
+    """
+    Checks that the values of metrics given as SQL expressions are within
+    a certain tolerance of the ones from days_back before.
+
+    Note that this is an abstract class and get_db_hook
+    needs to be defined, where get_db_hook returns a hook that gets a
+    single record from an external source.
+
+    :param table: the table name
+    :type table: str
+    :param days_back: number of days between ds and the ds we want to check
+        against. Defaults to 7 days
+    :type days_back: int
+    :param ratio_formula: which formula to use to compute the ratio between
+        the two metrics. Assuming cur is the metric of today and ref is
+        the metric to today - days_back.
+
+        max_over_min: computes max(cur, ref) / min(cur, ref)
+        relative_diff: computes abs(cur-ref) / ref
+
+        Default: 'max_over_min'
+    :type ratio_formula: str
+    :param ignore_zero: whether we should ignore zero metrics
+    :type ignore_zero: bool
+    :param metrics_threshold: a dictionary of ratios indexed by metrics
+    :type metrics_threshold: dict
+    """
+
+    __mapper_args__ = {"polymorphic_identity": "SQLIntervalCheckOperator"}
+    template_fields = ("sql1", "sql2")  # type: Iterable[str]
+    template_ext = (
+        ".hql",
+        ".sql",
+    )  # type: Iterable[str]
+    ui_color = "#fff7e6"
+
+    ratio_formulas = {
+        "max_over_min": lambda cur, ref: float(max(cur, ref)) / min(cur, ref),
+        "relative_diff": lambda cur, ref: float(abs(cur - ref)) / ref,
+    }
+
+    @apply_defaults
+    def __init__(
+        self,
+        table,
+        metrics_thresholds,
+        date_filter_column="ds",
+        days_back=-7,
+        ratio_formula="max_over_min",
+        ignore_zero=True,
+        conn_id=None,
+        *args,
+        **kwargs
+    ):
+        super(SQLIntervalCheckOperator, self).__init__(*args, **kwargs)
+        if ratio_formula not in self.ratio_formulas:
+            msg_template = (
+                "Invalid diff_method: {diff_method}. "
+                "Supported diff methods are: {diff_methods}"
+            )
+
+            raise AirflowException(
+                msg_template.format(
+                    diff_method=ratio_formula, diff_methods=self.ratio_formulas
+                )
+            )
+        self.ratio_formula = ratio_formula
+        self.ignore_zero = ignore_zero
+        self.table = table
+        self.metrics_thresholds = metrics_thresholds
+        self.metrics_sorted = sorted(metrics_thresholds.keys())
+        self.date_filter_column = date_filter_column
+        self.days_back = -abs(days_back)
+        self.conn_id = conn_id
+        sqlexp = ", ".join(self.metrics_sorted)
+        sqlt = "SELECT {sqlexp} FROM {table} WHERE {date_filter_column}=".format(
+            sqlexp=sqlexp, table=table, date_filter_column=date_filter_column
+        )
+
+        self.sql1 = sqlt + "'{{ ds }}'"
+        self.sql2 = sqlt + "'{{ macros.ds_add(ds, " + str(self.days_back) + ") }}'"
+
+    def execute(self, context=None):
+        hook = self.get_db_hook()
+        self.log.info("Using ratio formula: %s", self.ratio_formula)
+        self.log.info("Executing SQL check: %s", self.sql2)
+        row2 = hook.get_first(self.sql2)
+        self.log.info("Executing SQL check: %s", self.sql1)
+        row1 = hook.get_first(self.sql1)
+
+        if not row2:
+            raise AirflowException("The query {} returned None".format(self.sql2))
+        if not row1:
+            raise AirflowException("The query {} returned None".format(self.sql1))
+
+        current = dict(zip(self.metrics_sorted, row1))
+        reference = dict(zip(self.metrics_sorted, row2))
+
+        ratios = {}
+        test_results = {}
+
+        for metric in self.metrics_sorted:
+            cur = current[metric]
+            ref = reference[metric]
+            threshold = self.metrics_thresholds[metric]
+            if cur == 0 or ref == 0:
+                ratios[metric] = None
+                test_results[metric] = self.ignore_zero
+            else:
+                ratios[metric] = self.ratio_formulas[self.ratio_formula](
+                    current[metric], reference[metric]
+                )
+                test_results[metric] = ratios[metric] < threshold
+
+            self.log.info(
+                (
+                    "Current metric for %s: %s\n"
+                    "Past metric for %s: %s\n"
+                    "Ratio for %s: %s\n"
+                    "Threshold: %s\n"
+                ),
+                metric,
+                cur,
+                metric,
+                ref,
+                metric,
+                ratios[metric],
+                threshold,
+            )
+
+        if not all(test_results.values()):
+            failed_tests = [it[0] for it in test_results.items() if not it[1]]
+            self.log.warning(
+                "The following %s tests out of %s failed:",
+                len(failed_tests),
+                len(self.metrics_sorted),
+            )
+            for k in failed_tests:
+                self.log.warning(
+                    "'%s' check failed. %s is above %s",
+                    k,
+                    ratios[k],
+                    self.metrics_thresholds[k],
+                )
+            raise AirflowException(
+                "The following tests have failed:\n {0}".format(
+                    ", ".join(sorted(failed_tests))
+                )
+            )
+
+        self.log.info("All tests have passed")
+
+    def get_db_hook(self):
+        """
+        Get the database hook for the connection.
+
+        :return: the database hook object.
+        :rtype: DbApiHook
+        """
+        return BaseHook.get_hook(conn_id=self.conn_id)
+
+
+class SQLThresholdCheckOperator(BaseOperator):
+    """
+    Performs a value check using sql code against a minimum threshold
+    and a maximum threshold. Thresholds can be in the form of a numeric
+    value OR a sql statement that results in a numeric value.
+
+    Note that this is an abstract class and get_db_hook
+    needs to be defined, where get_db_hook returns a hook that gets a
+    single record from an external source.
+
+    :param sql: the sql to be executed. (templated)
+    :type sql: str
+    :param min_threshold: numerical value or min threshold sql to be executed (templated)
+    :type min_threshold: numeric or str
+    :param max_threshold: numerical value or max threshold sql to be executed (templated)
+    :type max_threshold: numeric or str
+    """
+
+    template_fields = ("sql", "min_threshold", "max_threshold")  # type: Iterable[str]
+    template_ext = (
+        ".hql",
+        ".sql",
+    )  # type: Iterable[str]
+
+    @apply_defaults
+    def __init__(
+        self,
+        sql,
+        min_threshold,
+        max_threshold,
+        conn_id=None,
+        *args,
+        **kwargs
+    ):
+        super(SQLThresholdCheckOperator, self).__init__(*args, **kwargs)
+        self.sql = sql
+        self.conn_id = conn_id
+        self.min_threshold = _convert_to_float_if_possible(min_threshold)
+        self.max_threshold = _convert_to_float_if_possible(max_threshold)
+
+    def execute(self, context=None):
+        hook = self.get_db_hook()
+        result = hook.get_first(self.sql)[0][0]
+
+        if isinstance(self.min_threshold, float):
+            lower_bound = self.min_threshold
+        else:
+            lower_bound = hook.get_first(self.min_threshold)[0][0]
+
+        if isinstance(self.max_threshold, float):
+            upper_bound = self.max_threshold
+        else:
+            upper_bound = hook.get_first(self.max_threshold)[0][0]
+
+        meta_data = {
+            "result": result,
+            "task_id": self.task_id,
+            "min_threshold": lower_bound,
+            "max_threshold": upper_bound,
+            "within_threshold": lower_bound <= result <= upper_bound,
+        }
+
+        self.push(meta_data)
+        if not meta_data["within_threshold"]:
+            error_msg = (
+                'Threshold Check: "{task_id}" failed.\n'
+                'DAG: {dag_id}\nTask_id: {task_id}\n'
+                'Check description: {description}\n'
+                "SQL: {sql}\n"
+                'Result: {round} is not within thresholds '
+                '{min} and {max}'
+                .format(task_id=meta_data.get("task_id"),
+                        dag_id=self.dag_id,
+                        description=meta_data.get("description"),
+                        sql=self.sql,
+                        round=round(meta_data.get("result"), 2),
+                        min=meta_data.get("min_threshold"),
+                        max=meta_data.get("max_threshold"),
+                        ))
+            raise AirflowException(error_msg)
+
+        self.log.info("Test %s Successful.", self.task_id)
+
+    def push(self, meta_data):
+        """
+        Optional: Send data check info and metadata to an external database.
+        Default functionality will log metadata.
+        """
+
+        info = "\n".join(["{key}: {item}".format(key=key, item=item) for key, item in meta_data.items()])
+        self.log.info("Log from %s:\n%s", self.dag_id, info)
+
+    def get_db_hook(self):
+        """
+        Returns DB hook
+        """
+        return BaseHook.get_hook(conn_id=self.conn_id)
+
+
+class BranchSQLOperator(BaseOperator, SkipMixin):
+    """
+    Executes sql code in a specific database
+
+    :param sql: the sql code to be executed. (templated)
+    :type sql: Can receive a str representing a sql statement or reference to a template file.
+               Template references are recognized by a str ending in '.sql'.
+               Expected SQL query to return Boolean (True/False), integer (0 = False, Otherwise = 1)
+               or string (true/y/yes/1/on/false/n/no/0/off).
+    :param follow_task_ids_if_true: task id or task ids to follow if the query returns true
+    :type follow_task_ids_if_true: str or list
+    :param follow_task_ids_if_false: task id or task ids to follow if the query returns false
+    :type follow_task_ids_if_false: str or list
+    :param conn_id: reference to a specific database
+    :type conn_id: str
+    :param database: name of the database which overrides the one defined in the connection
+    :param parameters: (optional) the parameters to render the SQL query with.
+    :type parameters: mapping or iterable
+    """
+
+    template_fields = ("sql",)
+    template_ext = (".sql",)
+    ui_color = "#a22034"
+    ui_fgcolor = "#F7F7F7"
+
+    @apply_defaults
+    def __init__(
+        self,
+        sql,
+        follow_task_ids_if_true,
+        follow_task_ids_if_false,
+        conn_id="default_conn_id",
+        database=None,
+        parameters=None,
+        *args,
+        **kwargs
+    ):
+        super(BranchSQLOperator, self).__init__(*args, **kwargs)
+        self.conn_id = conn_id
+        self.sql = sql
+        self.parameters = parameters
+        self.follow_task_ids_if_true = follow_task_ids_if_true
+        self.follow_task_ids_if_false = follow_task_ids_if_false
+        self.database = database
+        self._hook = None
+
+    def _get_hook(self):
+        self.log.debug("Get connection for %s", self.conn_id)
+        conn = BaseHook.get_connection(self.conn_id)
+
+        if conn.conn_type not in ALLOWED_CONN_TYPE:
+            raise AirflowException(
+                "The connection type is not supported by BranchSQLOperator. "
+                + "Supported connection types: {}".format(list(ALLOWED_CONN_TYPE))
+            )
+
+        if not self._hook:
+            self._hook = conn.get_hook()
+            if self.database:
+                self._hook.schema = self.database
+
+        return self._hook
+
+    def execute(self, context):
+        # get supported hook
+        self._hook = self._get_hook()
+
+        if self._hook is None:
+            raise AirflowException(
+                "Failed to establish connection to '%s'" % self.conn_id
+            )
+
+        if self.sql is None:
+            raise AirflowException("Expected 'sql' parameter is missing.")
+
+        if self.follow_task_ids_if_true is None:
+            raise AirflowException(
+                "Expected 'follow_task_ids_if_true' parameter is missing."
+            )
+
+        if self.follow_task_ids_if_false is None:
+            raise AirflowException(
+                "Expected 'follow_task_ids_if_false' parameter is missing."
+            )
+
+        self.log.info(
+            "Executing: %s (with parameters %s) with connection: %s",
+            self.sql,
+            self.parameters,
+            self._hook,
+        )
+        record = self._hook.get_first(self.sql, self.parameters)
+        if not record:
+            raise AirflowException(
+                "No rows returned from sql query. Operator expected True or False return value."
+            )
+
+        if isinstance(record, list):
+            if isinstance(record[0], list):
+                query_result = record[0][0]
+            else:
+                query_result = record[0]
+        elif isinstance(record, tuple):
+            query_result = record[0]
+        else:
+            query_result = record
+
+        self.log.info("Query returns %s, type '%s'", query_result, type(query_result))
+
+        follow_branch = None
+        try:
+            if isinstance(query_result, bool):
+                if query_result:
+                    follow_branch = self.follow_task_ids_if_true
+            elif isinstance(query_result, str):
+                # return result is not Boolean, try to convert from String to Boolean
+                if bool(strtobool(query_result)):
+                    follow_branch = self.follow_task_ids_if_true
+            elif isinstance(query_result, int):
+                if bool(query_result):
+                    follow_branch = self.follow_task_ids_if_true
+            elif six.PY2 and isinstance(query_result, long):  # noqa
+                if bool(query_result):
+                    follow_branch = self.follow_task_ids_if_true
+            else:
+                raise AirflowException(
+                    "Unexpected query return result '%s' type '%s'"
+                    % (query_result, type(query_result))
+                )
+
+            if follow_branch is None:
+                follow_branch = self.follow_task_ids_if_false
+        except ValueError:
+            raise AirflowException(
+                "Unexpected query return result '%s' type '%s'"
+                % (query_result, type(query_result))
+            )
+
+        self.skip_all_except(context["ti"], follow_branch)
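
For illustration, below is a minimal sketch of wiring the consolidated operators into a DAG. The connection id "my_db", the table names and the downstream task ids are assumptions made for this example only; ThresholdCheckOperator is imported from airflow.operators.check_operator, as the updated tests do.

    # Sketch only: connection id, tables and downstream task ids are hypothetical.
    from datetime import datetime

    from airflow import DAG
    from airflow.operators.check_operator import ThresholdCheckOperator
    from airflow.operators.dummy_operator import DummyOperator
    from airflow.operators.sql import BranchSQLOperator

    with DAG("sql_operators_example",
             start_date=datetime(2020, 1, 1),
             schedule_interval="@daily") as dag:

        # Fails the task if the daily row count is outside [1000, 50000].
        check_volume = ThresholdCheckOperator(
            task_id="check_row_count",
            conn_id="my_db",
            sql="SELECT COUNT(*) FROM events WHERE ds = '{{ ds }}'",
            min_threshold=1000,
            max_threshold=50000,
        )

        # Follows 'full_refresh' when the query evaluates to true
        # (True / non-zero / 'yes'-like string), otherwise 'incremental_load'.
        branch = BranchSQLOperator(
            task_id="choose_load_type",
            conn_id="my_db",
            sql="SELECT COUNT(*) = 0 FROM state_table",
            follow_task_ids_if_true="full_refresh",
            follow_task_ids_if_false="incremental_load",
        )

        full_refresh = DummyOperator(task_id="full_refresh")
        incremental_load = DummyOperator(task_id="incremental_load")

        check_volume >> branch >> [full_refresh, incremental_load]
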
diff --git a/airflow/operators/sql_branch_operator.py b/airflow/operators/sql_branch_operator.py
index 072c40c..b911e34 100644
--- a/airflow/operators/sql_branch_operator.py
+++ b/airflow/operators/sql_branch_operator.py
@@ -14,160 +14,22 @@
 # KIND, either express or implied.  See the License for the
 # specific language governing permissions and limitations
 # under the License.
+"""This module is deprecated. Please use `airflow.operators.sql`."""
+import warnings
 
-from distutils.util import strtobool
+from airflow.operators.sql import BranchSQLOperator
 
-from airflow.exceptions import AirflowException
-from airflow.hooks.base_hook import BaseHook
-from airflow.models import BaseOperator, SkipMixin
-from airflow.utils.decorators import apply_defaults
 
-ALLOWED_CONN_TYPE = {
-    "google_cloud_platform",
-    "jdbc",
-    "mssql",
-    "mysql",
-    "odbc",
-    "oracle",
-    "postgres",
-    "presto",
-    "sqlite",
-    "vertica",
-}
-
-
-class BranchSqlOperator(BaseOperator, SkipMixin):
+class BranchSqlOperator(BranchSQLOperator):
     """
-    Executes sql code in a specific database
-
-    :param sql: the sql code to be executed. (templated)
-    :type sql: Can receive a str representing a sql statement or reference to a template file.
-               Template reference are recognized by str ending in '.sql'.
-               Expected SQL query to return Boolean (True/False), integer (0 = False, Otherwise = 1)
-               or string (true/y/yes/1/on/false/n/no/0/off).
-    :param follow_task_ids_if_true: task id or task ids to follow if query return true
-    :type follow_task_ids_if_true: str or list
-    :param follow_task_ids_if_false: task id or task ids to follow if query return true
-    :type follow_task_ids_if_false: str or list
-    :param conn_id: reference to a specific database
-    :type conn_id: str
-    :param database: name of database which overwrite defined one in connection
-    :param parameters: (optional) the parameters to render the SQL query with.
-    :type parameters: mapping or iterable
+    This class is deprecated.
+    Please use `airflow.operators.sql.BranchSQLOperator`.
     """
 
-    template_fields = ("sql",)
-    template_ext = (".sql",)
-    ui_color = "#a22034"
-    ui_fgcolor = "#F7F7F7"
-
-    @apply_defaults
-    def __init__(
-            self,
-            sql,
-            follow_task_ids_if_true,
-            follow_task_ids_if_false,
-            conn_id="default_conn_id",
-            database=None,
-            parameters=None,
-            *args,
-            **kwargs):
-        super(BranchSqlOperator, self).__init__(*args, **kwargs)
-        self.conn_id = conn_id
-        self.sql = sql
-        self.parameters = parameters
-        self.follow_task_ids_if_true = follow_task_ids_if_true
-        self.follow_task_ids_if_false = follow_task_ids_if_false
-        self.database = database
-        self._hook = None
-
-    def _get_hook(self):
-        self.log.debug("Get connection for %s", self.conn_id)
-        conn = BaseHook.get_connection(self.conn_id)
-
-        if conn.conn_type not in ALLOWED_CONN_TYPE:
-            raise AirflowException(
-                "The connection type is not supported by BranchSqlOperator. "
-                + "Supported connection types: {}".format(list(ALLOWED_CONN_TYPE))
-            )
-
-        if not self._hook:
-            self._hook = conn.get_hook()
-            if self.database:
-                self._hook.schema = self.database
-
-        return self._hook
-
-    def execute(self, context):
-        # get supported hook
-        self._hook = self._get_hook()
-
-        if self._hook is None:
-            raise AirflowException(
-                "Failed to establish connection to '%s'" % self.conn_id
-            )
-
-        if self.sql is None:
-            raise AirflowException("Expected 'sql' parameter is missing.")
-
-        if self.follow_task_ids_if_true is None:
-            raise AirflowException(
-                "Expected 'follow_task_ids_if_true' paramter is missing."
-            )
-
-        if self.follow_task_ids_if_false is None:
-            raise AirflowException(
-                "Expected 'follow_task_ids_if_false' parameter is missing."
-            )
-
-        self.log.info(
-            "Executing: %s (with parameters %s) with connection: %s",
-            self.sql,
-            self.parameters,
-            self._hook,
+    def __init__(self, *args, **kwargs):
+        warnings.warn(
+            """This class is deprecated.
+            Please use `airflow.operators.sql.BranchSQLOperator`.""",
+            DeprecationWarning, stacklevel=2
         )
-        record = self._hook.get_first(self.sql, self.parameters)
-        if not record:
-            raise AirflowException(
-                "No rows returned from sql query. Operator expected True or False return value."
-            )
-
-        if isinstance(record, list):
-            if isinstance(record[0], list):
-                query_result = record[0][0]
-            else:
-                query_result = record[0]
-        elif isinstance(record, tuple):
-            query_result = record[0]
-        else:
-            query_result = record
-
-        self.log.info("Query returns %s, type '%s'", query_result, type(query_result))
-
-        follow_branch = None
-        try:
-            if isinstance(query_result, bool):
-                if query_result:
-                    follow_branch = self.follow_task_ids_if_true
-            elif isinstance(query_result, str):
-                # return result is not Boolean, try to convert from String to Boolean
-                if bool(strtobool(query_result)):
-                    follow_branch = self.follow_task_ids_if_true
-            elif isinstance(query_result, int):
-                if bool(query_result):
-                    follow_branch = self.follow_task_ids_if_true
-            else:
-                raise AirflowException(
-                    "Unexpected query return result '%s' type '%s'"
-                    % (query_result, type(query_result))
-                )
-
-            if follow_branch is None:
-                follow_branch = self.follow_task_ids_if_false
-        except ValueError:
-            raise AirflowException(
-                "Unexpected query return result '%s' type '%s'"
-                % (query_result, type(query_result))
-            )
-
-        self.skip_all_except(context["ti"], follow_branch)
+        super(BranchSqlOperator, self).__init__(*args, **kwargs)
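
The shim above keeps the old import path working but emits a DeprecationWarning when the operator is instantiated; migrating is an import change. A minimal sketch (nothing here beyond the two import paths is part of the change itself):

    # Old location: still importable, warns on instantiation via the shim above.
    from airflow.operators.sql_branch_operator import BranchSqlOperator

    # New canonical location and spelling.
    from airflow.operators.sql import BranchSQLOperator

    # The shim subclasses the new operator, so existing isinstance checks keep passing.
    assert issubclass(BranchSqlOperator, BranchSQLOperator)
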
diff --git a/docs/operators-and-hooks-ref.rst b/docs/operators-and-hooks-ref.rst
index 55176f8..1fd11c3 100644
--- a/docs/operators-and-hooks-ref.rst
+++ b/docs/operators-and-hooks-ref.rst
@@ -57,10 +57,6 @@ Fundamentals
 
    * - :mod:`airflow.operators.branch_operator`
      -
-
-   * - :mod:`airflow.operators.check_operator`
-     -
-
    * - :mod:`airflow.operators.dagrun_operator`
      -
 
@@ -76,7 +72,7 @@ Fundamentals
    * - :mod:`airflow.operators.subdag_operator`
      -
 
-   * - :mod:`airflow.operators.sql_branch_operator`
+   * - :mod:`airflow.operators.sql`
      -
 
 **Sensors:**
@@ -90,9 +86,6 @@ Fundamentals
    * - :mod:`airflow.sensors.weekday_sensor`
      -
 
-   * - :mod:`airflow.sensors.external_task_sensor`
-     - :doc:`How to use <howto/operator/external_task_sensor>`
-
    * - :mod:`airflow.sensors.sql_sensor`
      -
 
@@ -470,7 +463,7 @@ These integrations allow you to copy data from/to Amazon Web Services.
 
    * - `Amazon Simple Storage Service (S3) <https://aws.amazon.com/s3/>`__
      - `Google Cloud Storage (GCS) <https://cloud.google.com/gcs/>`__
-     - :doc:`How to use <howto/operator/gcp/cloud_storage_transfer_service>`
+     -
      - :mod:`airflow.contrib.operators.s3_to_gcs_operator`,
        :mod:`airflow.gcp.operators.cloud_storage_transfer_service`
 
@@ -551,7 +544,7 @@ These integrations allow you to perform various operations within the Google Clo
      - Sensors
 
    * - `AutoML <https://cloud.google.com/automl/>`__
-     - :doc:`How to use <howto/operator/gcp/automl>`
+     -
      - :mod:`airflow.gcp.hooks.automl`
      - :mod:`airflow.gcp.operators.automl`
      -
@@ -563,7 +556,7 @@ These integrations allow you to perform various operations within the Google Clo
      - :mod:`airflow.gcp.sensors.bigquery`
 
    * - `BigQuery Data Transfer Service <https://cloud.google.com/bigquery/transfer/>`__
-     - :doc:`How to use <howto/operator/gcp/bigquery_dts>`
+     -
      - :mod:`airflow.gcp.hooks.bigquery_dts`
      - :mod:`airflow.gcp.operators.bigquery_dts`
      - :mod:`airflow.gcp.sensors.bigquery_dts`
@@ -611,7 +604,7 @@ These integrations allow you to perform various operations within the Google Clo
      -
 
    * - `Cloud Functions <https://cloud.google.com/functions/>`__
-     - :doc:`How to use <howto/operator/gcp/functions>`
+     - :doc:`How to use <howto/operator/gcp/function>`
      - :mod:`airflow.gcp.hooks.functions`
      - :mod:`airflow.gcp.operators.functions`
      -
@@ -635,7 +628,7 @@ These integrations allow you to perform various operations within the Google Clo
      -
 
    * - `Cloud Memorystore <https://cloud.google.com/memorystore/>`__
-     - :doc:`How to use <howto/operator/gcp/cloud_memorystore>`
+     -
      - :mod:`airflow.gcp.hooks.cloud_memorystore`
      - :mod:`airflow.gcp.operators.cloud_memorystore`
      -
@@ -677,7 +670,7 @@ These integrations allow you to perform various operations within the Google Clo
      - :mod:`airflow.gcp.sensors.gcs`
 
    * - `Storage Transfer Service <https://cloud.google.com/storage/transfer/>`__
-     - :doc:`How to use <howto/operator/gcp/cloud_storage_transfer_service>`
+     -
      - :mod:`airflow.gcp.hooks.cloud_storage_transfer_service`
      - :mod:`airflow.gcp.operators.cloud_storage_transfer_service`
      - :mod:`airflow.gcp.sensors.cloud_storage_transfer_service`
@@ -701,7 +694,7 @@ These integrations allow you to perform various operations within the Google Clo
      -
 
    * - `Cloud Video Intelligence <https://cloud.google.com/video_intelligence/>`__
-     - :doc:`How to use <howto/operator/gcp/video_intelligence>`
+     - :doc:`How to use <howto/operator/gcp/video>`
      - :mod:`airflow.gcp.hooks.video_intelligence`
      - :mod:`airflow.gcp.operators.video_intelligence`
      -
@@ -741,7 +734,7 @@ These integrations allow you to copy data from/to Google Cloud Platform.
 
    * - `Amazon Simple Storage Service (S3) <https://aws.amazon.com/s3/>`__
      - `Google Cloud Storage (GCS) <https://cloud.google.com/gcs/>`__
-     - :doc:`How to use <howto/operator/gcp/cloud_storage_transfer_service>`
+     -
      - :mod:`airflow.contrib.operators.s3_to_gcs_operator`,
        :mod:`airflow.gcp.operators.cloud_storage_transfer_service`
 
@@ -772,8 +765,7 @@ These integrations allow you to copy data from/to Google Cloud Platform.
 
    * - `Google Cloud Storage (GCS) <https://cloud.google.com/gcs/>`__
      - `Google Cloud Storage (GCS) <https://cloud.google.com/gcs/>`__
-     - :doc:`How to use <howto/operator/gcp/gcs_to_gcs>`,
-       :doc:`How to use <howto/operator/gcp/cloud_storage_transfer_service>`
+     -
      - :mod:`airflow.operators.gcs_to_gcs`,
        :mod:`airflow.gcp.operators.cloud_storage_transfer_service`
 
@@ -1037,7 +1029,7 @@ These integrations allow you to perform various operations using various softwar
      - :mod:`airflow.contrib.sensors.bash_sensor`
 
    * - `Kubernetes <https://kubernetes.io/>`__
-     - :doc:`How to use <howto/operator/kubernetes>`
+     -
      -
      - :mod:`airflow.contrib.operators.kubernetes_pod_operator`
      -
diff --git a/tests/api/common/experimental/test_pool.py b/tests/api/common/experimental/test_pool.py
index 29c7105..97c970b 100644
--- a/tests/api/common/experimental/test_pool.py
+++ b/tests/api/common/experimental/test_pool.py
@@ -56,17 +56,18 @@ class TestPool(unittest.TestCase):
         self.assertEqual(pool.pool, self.pools[0].pool)
 
     def test_get_pool_non_existing(self):
-        self.assertRaisesRegexp(PoolNotFound,
-                                "^Pool 'test' doesn't exist$",
-                                pool_api.get_pool,
-                                name='test')
+        six.assertRaisesRegex(self, PoolNotFound,
+                              "^Pool 'test' doesn't exist$",
+                              pool_api.get_pool,
+                              name='test')
 
     def test_get_pool_bad_name(self):
         for name in ('', '    '):
-            self.assertRaisesRegexp(AirflowBadRequest,
-                                    "^Pool name shouldn't be empty$",
-                                    pool_api.get_pool,
-                                    name=name)
+            six.assertRaisesRegex(self,
+                                  AirflowBadRequest,
+                                  "^Pool name shouldn't be empty$",
+                                  pool_api.get_pool,
+                                  name=name)
 
     def test_get_pools(self):
         pools = sorted(pool_api.get_pools(),
@@ -96,20 +97,21 @@ class TestPool(unittest.TestCase):
 
     def test_create_pool_bad_name(self):
         for name in ('', '    '):
-            self.assertRaisesRegexp(AirflowBadRequest,
-                                    "^Pool name shouldn't be empty$",
-                                    pool_api.create_pool,
-                                    name=name,
-                                    slots=5,
-                                    description='')
+            six.assertRaisesRegex(self,
+                                  AirflowBadRequest,
+                                  "^Pool name shouldn't be empty$",
+                                  pool_api.create_pool,
+                                  name=name,
+                                  slots=5,
+                                  description='')
 
     def test_create_pool_bad_slots(self):
-        self.assertRaisesRegexp(AirflowBadRequest,
-                                "^Bad value for `slots`: foo$",
-                                pool_api.create_pool,
-                                name='foo',
-                                slots='foo',
-                                description='')
+        six.assertRaisesRegex(self, AirflowBadRequest,
+                              "^Bad value for `slots`: foo$",
+                              pool_api.create_pool,
+                              name='foo',
+                              slots='foo',
+                              description='')
 
     def test_delete_pool(self):
         pool = pool_api.delete_pool(name=self.pools[-1].pool)
@@ -118,21 +120,23 @@ class TestPool(unittest.TestCase):
             self.assertEqual(session.query(models.Pool).count(), self.TOTAL_POOL_COUNT - 1)
 
     def test_delete_pool_non_existing(self):
-        self.assertRaisesRegexp(pool_api.PoolNotFound,
-                                "^Pool 'test' doesn't exist$",
-                                pool_api.delete_pool,
-                                name='test')
+        six.assertRaisesRegex(self, pool_api.PoolNotFound,
+                              "^Pool 'test' doesn't exist$",
+                              pool_api.delete_pool,
+                              name='test')
 
     def test_delete_pool_bad_name(self):
         for name in ('', '    '):
-            self.assertRaisesRegexp(AirflowBadRequest,
-                                    "^Pool name shouldn't be empty$",
-                                    pool_api.delete_pool,
-                                    name=name)
+            six.assertRaisesRegex(self,
+                                  AirflowBadRequest,
+                                  "^Pool name shouldn't be empty$",
+                                  pool_api.delete_pool,
+                                  name=name)
 
     def test_delete_default_pool_not_allowed(self):
-        with self.assertRaisesRegex(AirflowBadRequest,
-                                    "^default_pool cannot be deleted$"):
+        with six.assertRaisesRegex(self,
+                                   AirflowBadRequest,
+                                   "^default_pool cannot be deleted$"):
             pool_api.delete_pool(Pool.DEFAULT_POOL_NAME)
 
 
diff --git a/tests/contrib/hooks/test_discord_webhook_hook.py b/tests/contrib/hooks/test_discord_webhook_hook.py
index d0c9001..384b7f3 100644
--- a/tests/contrib/hooks/test_discord_webhook_hook.py
+++ b/tests/contrib/hooks/test_discord_webhook_hook.py
@@ -20,6 +20,8 @@
 import json
 import unittest
 
+import six
+
 from airflow import AirflowException
 from airflow.models import Connection
 from airflow.utils import db
@@ -73,7 +75,7 @@ class TestDiscordWebhookHook(unittest.TestCase):
 
         # When/Then
         expected_message = 'Expected Discord webhook endpoint in the form of'
-        with self.assertRaisesRegexp(AirflowException, expected_message):
+        with six.assertRaisesRegex(self, AirflowException, expected_message):
             DiscordWebhookHook(webhook_endpoint=provided_endpoint)
 
     def test_get_webhook_endpoint_conn_id(self):
@@ -107,7 +109,7 @@ class TestDiscordWebhookHook(unittest.TestCase):
 
         # When/Then
         expected_message = 'Discord message length must be 2000 or fewer characters'
-        with self.assertRaisesRegexp(AirflowException, expected_message):
+        with six.assertRaisesRegex(self, AirflowException, expected_message):
             hook._build_discord_payload()
 
 
diff --git a/tests/contrib/operators/test_databricks_operator.py b/tests/contrib/operators/test_databricks_operator.py
index 9a7b6ec..6e59408 100644
--- a/tests/contrib/operators/test_databricks_operator.py
+++ b/tests/contrib/operators/test_databricks_operator.py
@@ -21,6 +21,8 @@
 import unittest
 from datetime import datetime
 
+import six
+
 from airflow.contrib.hooks.databricks_hook import RunState
 import airflow.contrib.operators.databricks_operator as databricks_operator
 from airflow.contrib.operators.databricks_operator import DatabricksSubmitRunOperator
@@ -180,7 +182,7 @@ class DatabricksSubmitRunOperatorTest(unittest.TestCase):
         # Looks a bit weird since we have to escape regex reserved symbols.
         exception_message = r'Type \<(type|class) \'datetime.datetime\'\> used ' + \
                             r'for parameter json\[test\] is not a number or a string'
-        with self.assertRaisesRegexp(AirflowException, exception_message):
+        with six.assertRaisesRegex(self, AirflowException, exception_message):
             DatabricksSubmitRunOperator(task_id=TASK_ID, json=json)
 
     @mock.patch('airflow.contrib.operators.databricks_operator.DatabricksHook')
@@ -347,7 +349,7 @@ class DatabricksRunNowOperatorTest(unittest.TestCase):
         # Looks a bit weird since we have to escape regex reserved symbols.
         exception_message = r'Type \<(type|class) \'datetime.datetime\'\> used ' + \
                             r'for parameter json\[test\] is not a number or a string'
-        with self.assertRaisesRegexp(AirflowException, exception_message):
+        with six.assertRaisesRegex(self, AirflowException, exception_message):
             DatabricksRunNowOperator(task_id=TASK_ID, job_id=JOB_ID, json=json)
 
     @mock.patch('airflow.contrib.operators.databricks_operator.DatabricksHook')
diff --git a/tests/contrib/operators/test_gcs_to_gcs_operator.py b/tests/contrib/operators/test_gcs_to_gcs_operator.py
index f9085e2..622fa8f 100644
--- a/tests/contrib/operators/test_gcs_to_gcs_operator.py
+++ b/tests/contrib/operators/test_gcs_to_gcs_operator.py
@@ -20,6 +20,8 @@
 import unittest
 from datetime import datetime
 
+import six
+
 from airflow.contrib.operators.gcs_to_gcs import \
     GoogleCloudStorageToGoogleCloudStorageOperator, WILDCARD
 from airflow.exceptions import AirflowException
@@ -290,7 +292,7 @@ class GoogleCloudStorageToCloudStorageOperatorTest(unittest.TestCase):
         error_msg = "Only one wildcard '[*]' is allowed in source_object parameter. " \
                     "Found {}".format(total_wildcards)
 
-        with self.assertRaisesRegexp(AirflowException, error_msg):
+        with six.assertRaisesRegex(self, AirflowException, error_msg):
             operator.execute(None)
 
     @mock.patch('airflow.contrib.operators.gcs_to_gcs.GoogleCloudStorageHook')
diff --git a/tests/contrib/operators/test_qubole_check_operator.py b/tests/contrib/operators/test_qubole_check_operator.py
index b1692d8..f6d875a 100644
--- a/tests/contrib/operators/test_qubole_check_operator.py
+++ b/tests/contrib/operators/test_qubole_check_operator.py
@@ -19,6 +19,9 @@
 #
 import unittest
 from datetime import datetime
+
+import six
+
 from airflow.models import DAG
 from airflow.exceptions import AirflowException
 from airflow.contrib.operators.qubole_check_operator import QuboleValueCheckOperator
@@ -88,8 +91,8 @@ class QuboleValueCheckOperatorTest(unittest.TestCase):
 
         operator = self.__construct_operator('select value from tab1 limit 1;', 5, 1)
 
-        with self.assertRaisesRegexp(AirflowException,
-                                     'Qubole Command Id: ' + str(mock_cmd.id)):
+        with six.assertRaisesRegex(self, AirflowException,
+                                   'Qubole Command Id: ' + str(mock_cmd.id)):
             operator.execute()
 
         mock_cmd.is_success.assert_called_with(mock_cmd.status)
diff --git a/tests/contrib/operators/test_sftp_operator.py b/tests/contrib/operators/test_sftp_operator.py
index 24db36e..fe478b9 100644
--- a/tests/contrib/operators/test_sftp_operator.py
+++ b/tests/contrib/operators/test_sftp_operator.py
@@ -362,10 +362,8 @@ class SFTPOperatorTest(unittest.TestCase):
         os.environ['AIRFLOW_CONN_' + conn_id.upper()] = "ssh://test_id@localhost"
 
         # Exception should be raised if neither ssh_hook nor ssh_conn_id is provided
-        if six.PY2:
-            self.assertRaisesRegex = self.assertRaisesRegexp
-        with self.assertRaisesRegex(AirflowException,
-                                    "Cannot operate without ssh_hook or ssh_conn_id."):
+        with six.assertRaisesRegex(self, AirflowException,
+                                   "Cannot operate without ssh_hook or ssh_conn_id."):
             task_0 = SFTPOperator(
                 task_id="test_sftp",
                 local_filepath=self.test_local_filepath,
diff --git a/tests/contrib/operators/test_ssh_operator.py b/tests/contrib/operators/test_ssh_operator.py
index f2294ba..1413050 100644
--- a/tests/contrib/operators/test_ssh_operator.py
+++ b/tests/contrib/operators/test_ssh_operator.py
@@ -152,10 +152,8 @@ class SSHOperatorTest(TestCase):
         os.environ['AIRFLOW_CONN_' + conn_id.upper()] = "ssh://test_id@localhost"
 
         # Exception should be raised if neither ssh_hook nor ssh_conn_id is provided
-        if six.PY2:
-            self.assertRaisesRegex = self.assertRaisesRegexp
-        with self.assertRaisesRegex(AirflowException,
-                                    "Cannot operate without ssh_hook or ssh_conn_id."):
+        with six.assertRaisesRegex(self, AirflowException,
+                                   "Cannot operate without ssh_hook or ssh_conn_id."):
             task_0 = SSHOperator(task_id="test", command=COMMAND,
                                  timeout=TIMEOUT, dag=self.dag)
             task_0.execute(None)
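
The same substitution recurs throughout these test changes: unittest's assertRaisesRegexp is deprecated on Python 3, while assertRaisesRegex does not exist on Python 2.7, so the tests call six.assertRaisesRegex(self, ...), which dispatches to whichever method the running interpreter provides. A minimal sketch of the pattern (the test class, exception and message are illustrative):

    import unittest

    import six


    class SixRegexAssertionExample(unittest.TestCase):
        def test_error_message_matches(self):
            # six picks assertRaisesRegexp on Python 2 and assertRaisesRegex on
            # Python 3, so a single spelling works on both interpreters.
            with six.assertRaisesRegex(self, ValueError, "^bad value$"):
                raise ValueError("bad value")


    if __name__ == "__main__":
        unittest.main()
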
diff --git a/tests/contrib/operators/test_winrm_operator.py b/tests/contrib/operators/test_winrm_operator.py
index 27792a0..c6b26f7 100644
--- a/tests/contrib/operators/test_winrm_operator.py
+++ b/tests/contrib/operators/test_winrm_operator.py
@@ -20,6 +20,8 @@
 import mock
 import unittest
 
+import six
+
 from airflow.contrib.operators.winrm_operator import WinRMOperator
 from airflow.exceptions import AirflowException
 
@@ -30,7 +32,7 @@ class WinRMOperatorTest(unittest.TestCase):
                            winrm_hook=None,
                            ssh_conn_id=None)
         exception_msg = "Cannot operate without winrm_hook or ssh_conn_id."
-        with self.assertRaisesRegexp(AirflowException, exception_msg):
+        with six.assertRaisesRegex(self, AirflowException, exception_msg):
             op.execute(None)
 
     @mock.patch('airflow.contrib.operators.winrm_operator.WinRMHook')
@@ -41,7 +43,7 @@ class WinRMOperatorTest(unittest.TestCase):
             command=None
         )
         exception_msg = "No command specified so nothing to execute here."
-        with self.assertRaisesRegexp(AirflowException, exception_msg):
+        with six.assertRaisesRegex(self, AirflowException, exception_msg):
             op.execute(None)
 
 
diff --git a/tests/contrib/sensors/test_weekday_sensor.py b/tests/contrib/sensors/test_weekday_sensor.py
index 016f71c..d822a69 100644
--- a/tests/contrib/sensors/test_weekday_sensor.py
+++ b/tests/contrib/sensors/test_weekday_sensor.py
@@ -19,6 +19,9 @@
 #
 
 import unittest
+
+import six
+
 from airflow import DAG, models
 from airflow.contrib.sensors.weekday_sensor import DayOfWeekSensor
 from airflow.contrib.utils.weekday import WeekDay
@@ -78,9 +81,8 @@ class DayOfWeekSensorTests(unittest.TestCase):
 
     def test_invalid_weekday_number(self):
         invalid_week_day = 'Thsday'
-        with self.assertRaisesRegexp(AttributeError,
-                                     'Invalid Week Day passed: "{}"'.format(
-                                         invalid_week_day)):
+        with six.assertRaisesRegex(self, AttributeError,
+                                   'Invalid Week Day passed: "{}"'.format(invalid_week_day)):
             DayOfWeekSensor(
                 task_id='weekday_sensor_invalid_weekday_num',
                 week_day=invalid_week_day,
@@ -139,11 +141,11 @@ class DayOfWeekSensorTests(unittest.TestCase):
 
     def test_weekday_sensor_with_invalid_type(self):
         invalid_week_day = ['Thsday']
-        with self.assertRaisesRegexp(TypeError,
-                                     'Unsupported Type for week_day parameter:'
-                                     ' {}. It should be one of str, set or '
-                                     'Weekday enum type'.format(type(invalid_week_day))
-                                     ):
+        with six.assertRaisesRegex(self, TypeError,
+                                   'Unsupported Type for week_day parameter:'
+                                   ' {}. It should be one of str, set or '
+                                   'Weekday enum type'.format(type(invalid_week_day))
+                                   ):
             DayOfWeekSensor(
                 task_id='weekday_sensor_check_true',
                 week_day=invalid_week_day,
diff --git a/tests/contrib/utils/test_mlengine_operator_utils.py b/tests/contrib/utils/test_mlengine_operator_utils.py
index 28efef5..53e1323 100644
--- a/tests/contrib/utils/test_mlengine_operator_utils.py
+++ b/tests/contrib/utils/test_mlengine_operator_utils.py
@@ -22,6 +22,8 @@ from __future__ import print_function
 import datetime
 import unittest
 
+import six
+
 from airflow import DAG
 from airflow.contrib.utils import mlengine_operator_utils
 from airflow.exceptions import AirflowException
@@ -152,25 +154,25 @@ class CreateEvaluateOpsTest(unittest.TestCase):
             'dag': dag,
         }
 
-        with self.assertRaisesRegexp(AirflowException, 'Missing model origin'):
+        with six.assertRaisesRegex(self, AirflowException, 'Missing model origin'):
             mlengine_operator_utils.create_evaluate_ops(**other_params_but_models)
 
-        with self.assertRaisesRegexp(AirflowException, 'Ambiguous model origin'):
+        with six.assertRaisesRegex(self, AirflowException, 'Ambiguous model origin'):
             mlengine_operator_utils.create_evaluate_ops(model_uri='abc', model_name='cde',
                                                         **other_params_but_models)
 
-        with self.assertRaisesRegexp(AirflowException, 'Ambiguous model origin'):
+        with six.assertRaisesRegex(self, AirflowException, 'Ambiguous model origin'):
             mlengine_operator_utils.create_evaluate_ops(model_uri='abc', version_name='vvv',
                                                         **other_params_but_models)
 
-        with self.assertRaisesRegexp(AirflowException,
-                                     '`metric_fn` param must be callable'):
+        with six.assertRaisesRegex(self, AirflowException,
+                                   '`metric_fn` param must be callable'):
             params = other_params_but_models.copy()
             params['metric_fn_and_keys'] = (None, ['abc'])
             mlengine_operator_utils.create_evaluate_ops(model_uri='gs://blah', **params)
 
-        with self.assertRaisesRegexp(AirflowException,
-                                     '`validate_fn` param must be callable'):
+        with six.assertRaisesRegex(self, AirflowException,
+                                   '`validate_fn` param must be callable'):
             params = other_params_but_models.copy()
             params['validate_fn'] = None
             mlengine_operator_utils.create_evaluate_ops(model_uri='gs://blah', **params)
diff --git a/tests/jobs/test_backfill_job.py b/tests/jobs/test_backfill_job.py
index e522556..d272fcf 100644
--- a/tests/jobs/test_backfill_job.py
+++ b/tests/jobs/test_backfill_job.py
@@ -25,6 +25,7 @@ import threading
 import unittest
 
 import pytest
+import six
 import sqlalchemy
 from parameterized import parameterized
 
@@ -790,7 +791,8 @@ class BackfillJobTest(unittest.TestCase):
         run_date = DEFAULT_DATE + datetime.timedelta(days=5)
 
         # backfill should deadlock
-        self.assertRaisesRegexp(
+        six.assertRaisesRegex(
+            self,
             AirflowException,
             'BackfillJob is deadlocked',
             BackfillJob(dag=dag, start_date=run_date, end_date=run_date).run)
@@ -890,7 +892,7 @@ class BackfillJobTest(unittest.TestCase):
         # raises backwards
         expected_msg = 'You cannot backfill backwards because one or more tasks depend_on_past: {}'.format(
             'test_dop_task')
-        with self.assertRaisesRegexp(AirflowException, expected_msg):
+        with six.assertRaisesRegex(self, AirflowException, expected_msg):
             executor = MockExecutor()
             job = BackfillJob(dag=dag,
                               executor=executor,
@@ -1166,7 +1168,8 @@ class BackfillJobTest(unittest.TestCase):
                           start_date=DEFAULT_DATE,
                           end_date=DEFAULT_DATE,
                           executor=executor)
-        self.assertRaisesRegexp(
+        six.assertRaisesRegex(
+            self,
             AirflowException,
             'Some task instances failed',
             job.run)
diff --git a/tests/kubernetes/test_worker_configuration.py b/tests/kubernetes/test_worker_configuration.py
index 1b15d98..8378f9f 100644
--- a/tests/kubernetes/test_worker_configuration.py
+++ b/tests/kubernetes/test_worker_configuration.py
@@ -19,6 +19,9 @@
 import unittest
 import uuid
 from datetime import datetime
+
+import six
+
 from tests.compat import mock
 from tests.test_utils.config import conf_vars
 try:
@@ -99,10 +102,10 @@ class TestKubernetesWorkerConfiguration(unittest.TestCase):
         ('kubernetes', 'kube_client_request_args'): '{"_request_timeout" : [60,360]}',
     })
     def test_worker_configuration_auth_both_ssh_and_user(self):
-        with self.assertRaisesRegexp(AirflowConfigException,
-                                     'either `git_user` and `git_password`.*'
-                                     'or `git_ssh_key_secret_name`.*'
-                                     'but not both$'):
+        with six.assertRaisesRegex(self, AirflowConfigException,
+                                   'either `git_user` and `git_password`.*'
+                                   'or `git_ssh_key_secret_name`.*'
+                                   'but not both$'):
             KubeConfig()
 
     def test_worker_with_subpaths(self):
diff --git a/tests/models/test_baseoperator.py b/tests/models/test_baseoperator.py
index 2d00c59..ea65823 100644
--- a/tests/models/test_baseoperator.py
+++ b/tests/models/test_baseoperator.py
@@ -19,6 +19,9 @@
 
 import datetime
 import unittest
+
+import six
+
 from tests.compat import mock
 import uuid
 
@@ -196,7 +199,7 @@ class BaseOperatorTest(unittest.TestCase):
 
         re = "('ClassWithCustomAttributes' object|ClassWithCustomAttributes instance) " \
              "has no attribute 'missing_field'"
-        with self.assertRaisesRegexp(AttributeError, re):
+        with six.assertRaisesRegex(self, AttributeError, re):
             task.render_template(ClassWithCustomAttributes(template_fields=["missing_field"]), {})
 
     def test_jinja_invalid_expression_is_just_propagated(self):
diff --git a/tests/operators/test_check_operator.py b/tests/operators/test_check_operator.py
deleted file mode 100644
index 22523a4..0000000
--- a/tests/operators/test_check_operator.py
+++ /dev/null
@@ -1,327 +0,0 @@
-# -*- coding: utf-8 -*-
-#
-# Licensed to the Apache Software Foundation (ASF) under one
-# or more contributor license agreements.  See the NOTICE file
-# distributed with this work for additional information
-# regarding copyright ownership.  The ASF licenses this file
-# to you under the Apache License, Version 2.0 (the
-# "License"); you may not use this file except in compliance
-# with the License.  You may obtain a copy of the License at
-#
-#   http://www.apache.org/licenses/LICENSE-2.0
-#
-# Unless required by applicable law or agreed to in writing,
-# software distributed under the License is distributed on an
-# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
-# KIND, either express or implied.  See the License for the
-# specific language governing permissions and limitations
-# under the License.
-
-import six
-import unittest
-from datetime import datetime
-
-from airflow.exceptions import AirflowException
-from airflow.models import DAG
-from airflow.operators.check_operator import (
-    CheckOperator, IntervalCheckOperator, ThresholdCheckOperator, ValueCheckOperator,
-)
-from tests.compat import mock
-
-
-class TestCheckOperator(unittest.TestCase):
-
-    @mock.patch.object(CheckOperator, 'get_db_hook')
-    def test_execute_no_records(self, mock_get_db_hook):
-        mock_get_db_hook.return_value.get_first.return_value = []
-
-        with self.assertRaises(AirflowException):
-            CheckOperator(sql='sql').execute()
-
-    @mock.patch.object(CheckOperator, 'get_db_hook')
-    def test_execute_not_all_records_are_true(self, mock_get_db_hook):
-        mock_get_db_hook.return_value.get_first.return_value = ["data", ""]
-
-        with self.assertRaises(AirflowException):
-            CheckOperator(sql='sql').execute()
-
-
-class TestValueCheckOperator(unittest.TestCase):
-
-    def setUp(self):
-        self.task_id = 'test_task'
-        self.conn_id = 'default_conn'
-
-    def _construct_operator(self, sql, pass_value, tolerance=None):
-        dag = DAG('test_dag', start_date=datetime(2017, 1, 1))
-
-        return ValueCheckOperator(
-            dag=dag,
-            task_id=self.task_id,
-            conn_id=self.conn_id,
-            sql=sql,
-            pass_value=pass_value,
-            tolerance=tolerance)
-
-    def test_pass_value_template_string(self):
-        pass_value_str = "2018-03-22"
-        operator = self._construct_operator('select date from tab1;', "{{ ds }}")
-
-        operator.render_template_fields({'ds': pass_value_str})
-
-        self.assertEqual(operator.task_id, self.task_id)
-        self.assertEqual(operator.pass_value, pass_value_str)
-
-    def test_pass_value_template_string_float(self):
-        pass_value_float = 4.0
-        operator = self._construct_operator('select date from tab1;', pass_value_float)
-
-        operator.render_template_fields({})
-
-        self.assertEqual(operator.task_id, self.task_id)
-        self.assertEqual(operator.pass_value, str(pass_value_float))
-
-    @mock.patch.object(ValueCheckOperator, 'get_db_hook')
-    def test_execute_pass(self, mock_get_db_hook):
-        mock_hook = mock.Mock()
-        mock_hook.get_first.return_value = [10]
-        mock_get_db_hook.return_value = mock_hook
-        sql = 'select value from tab1 limit 1;'
-        operator = self._construct_operator(sql, 5, 1)
-
-        operator.execute(None)
-
-        mock_hook.get_first.assert_called_with(sql)
-
-    @mock.patch.object(ValueCheckOperator, 'get_db_hook')
-    def test_execute_fail(self, mock_get_db_hook):
-        mock_hook = mock.Mock()
-        mock_hook.get_first.return_value = [11]
-        mock_get_db_hook.return_value = mock_hook
-
-        operator = self._construct_operator('select value from tab1 limit 1;', 5, 1)
-
-        with self.assertRaisesRegexp(AirflowException, 'Tolerance:100.0%'):
-            operator.execute()
-
-
-class IntervalCheckOperatorTest(unittest.TestCase):
-
-    def _construct_operator(self, table, metric_thresholds,
-                            ratio_formula, ignore_zero):
-        return IntervalCheckOperator(
-            task_id='test_task',
-            table=table,
-            metrics_thresholds=metric_thresholds,
-            ratio_formula=ratio_formula,
-            ignore_zero=ignore_zero,
-        )
-
-    def test_invalid_ratio_formula(self):
-        with self.assertRaisesRegexp(AirflowException, 'Invalid diff_method'):
-            self._construct_operator(
-                table='test_table',
-                metric_thresholds={
-                    'f1': 1,
-                },
-                ratio_formula='abs',
-                ignore_zero=False,
-            )
-
-    @mock.patch.object(IntervalCheckOperator, 'get_db_hook')
-    def test_execute_not_ignore_zero(self, mock_get_db_hook):
-        mock_hook = mock.Mock()
-        mock_hook.get_first.return_value = [0]
-        mock_get_db_hook.return_value = mock_hook
-
-        operator = self._construct_operator(
-            table='test_table',
-            metric_thresholds={
-                'f1': 1,
-            },
-            ratio_formula='max_over_min',
-            ignore_zero=False,
-        )
-
-        with self.assertRaises(AirflowException):
-            operator.execute()
-
-    @mock.patch.object(IntervalCheckOperator, 'get_db_hook')
-    def test_execute_ignore_zero(self, mock_get_db_hook):
-        mock_hook = mock.Mock()
-        mock_hook.get_first.return_value = [0]
-        mock_get_db_hook.return_value = mock_hook
-
-        operator = self._construct_operator(
-            table='test_table',
-            metric_thresholds={
-                'f1': 1,
-            },
-            ratio_formula='max_over_min',
-            ignore_zero=True,
-        )
-
-        operator.execute()
-
-    @mock.patch.object(IntervalCheckOperator, 'get_db_hook')
-    def test_execute_min_max(self, mock_get_db_hook):
-        mock_hook = mock.Mock()
-
-        def returned_row():
-            rows = [
-                [2, 2, 2, 2],  # reference
-                [1, 1, 1, 1],  # current
-            ]
-
-            for r in rows:
-                yield r
-
-        mock_hook.get_first.side_effect = returned_row()
-        mock_get_db_hook.return_value = mock_hook
-
-        operator = self._construct_operator(
-            table='test_table',
-            metric_thresholds={
-                'f0': 1.0,
-                'f1': 1.5,
-                'f2': 2.0,
-                'f3': 2.5,
-            },
-            ratio_formula='max_over_min',
-            ignore_zero=True,
-        )
-
-        with self.assertRaisesRegexp(AirflowException, "f0, f1, f2"):
-            operator.execute()
-
-    @mock.patch.object(IntervalCheckOperator, 'get_db_hook')
-    def test_execute_diff(self, mock_get_db_hook):
-        mock_hook = mock.Mock()
-
-        def returned_row():
-            rows = [
-                [3, 3, 3, 3],  # reference
-                [1, 1, 1, 1],  # current
-            ]
-
-            for r in rows:
-                yield r
-
-        mock_hook.get_first.side_effect = returned_row()
-        mock_get_db_hook.return_value = mock_hook
-
-        operator = self._construct_operator(
-            table='test_table',
-            metric_thresholds={
-                'f0': 0.5,
-                'f1': 0.6,
-                'f2': 0.7,
-                'f3': 0.8,
-            },
-            ratio_formula='relative_diff',
-            ignore_zero=True,
-        )
-
-        with self.assertRaisesRegexp(AirflowException, "f0, f1"):
-            operator.execute()
-
-
-class TestThresholdCheckOperator(unittest.TestCase):
-
-    def _construct_operator(self, sql, min_threshold, max_threshold):
-        dag = DAG('test_dag', start_date=datetime(2017, 1, 1))
-
-        return ThresholdCheckOperator(
-            task_id='test_task',
-            sql=sql,
-            min_threshold=min_threshold,
-            max_threshold=max_threshold,
-            dag=dag
-        )
-
-    @mock.patch.object(ThresholdCheckOperator, 'get_db_hook')
-    def test_pass_min_value_max_value(self, mock_get_db_hook):
-        mock_hook = mock.Mock()
-        mock_hook.get_first.return_value = [(10,)]
-        mock_get_db_hook.return_value = mock_hook
-
-        operator = self._construct_operator(
-            'Select avg(val) from table1 limit 1',
-            1,
-            100
-        )
-
-        operator.execute()
-
-    @mock.patch.object(ThresholdCheckOperator, 'get_db_hook')
-    def test_fail_min_value_max_value(self, mock_get_db_hook):
-        mock_hook = mock.Mock()
-        mock_hook.get_first.return_value = [(10,)]
-        mock_get_db_hook.return_value = mock_hook
-
-        operator = self._construct_operator(
-            'Select avg(val) from table1 limit 1',
-            20,
-            100
-        )
-
-        with six.assertRaisesRegex(self, AirflowException, '10.*20.0.*100.0'):
-            operator.execute()
-
-    @mock.patch.object(ThresholdCheckOperator, 'get_db_hook')
-    def test_pass_min_sql_max_sql(self, mock_get_db_hook):
-        mock_hook = mock.Mock()
-        mock_hook.get_first.side_effect = lambda x: [(int(x.split()[1]),)]
-        mock_get_db_hook.return_value = mock_hook
-
-        operator = self._construct_operator(
-            'Select 10',
-            'Select 1',
-            'Select 100'
-        )
-
-        operator.execute()
-
-    @mock.patch.object(ThresholdCheckOperator, 'get_db_hook')
-    def test_fail_min_sql_max_sql(self, mock_get_db_hook):
-        mock_hook = mock.Mock()
-        mock_hook.get_first.side_effect = lambda x: [(int(x.split()[1]),)]
-        mock_get_db_hook.return_value = mock_hook
-
-        operator = self._construct_operator(
-            'Select 10',
-            'Select 20',
-            'Select 100'
-        )
-
-        with six.assertRaisesRegex(self, AirflowException, '10.*20.*100'):
-            operator.execute()
-
-    @mock.patch.object(ThresholdCheckOperator, 'get_db_hook')
-    def test_pass_min_value_max_sql(self, mock_get_db_hook):
-        mock_hook = mock.Mock()
-        mock_hook.get_first.side_effect = lambda x: [(int(x.split()[1]),)]
-        mock_get_db_hook.return_value = mock_hook
-
-        operator = self._construct_operator(
-            'Select 75',
-            45,
-            'Select 100'
-        )
-
-        operator.execute()
-
-    @mock.patch.object(ThresholdCheckOperator, 'get_db_hook')
-    def test_fail_min_sql_max_value(self, mock_get_db_hook):
-        mock_hook = mock.Mock()
-        mock_hook.get_first.side_effect = lambda x: [(int(x.split()[1]),)]
-        mock_get_db_hook.return_value = mock_hook
-
-        operator = self._construct_operator(
-            'Select 155',
-            'Select 45',
-            100
-        )
-
-        with six.assertRaisesRegex(self, AirflowException, '155.*45.*100.0'):
-            operator.execute()
diff --git a/tests/operators/test_s3_to_hive_operator.py b/tests/operators/test_s3_to_hive_operator.py
index 8366465..8ed7ad2 100644
--- a/tests/operators/test_s3_to_hive_operator.py
+++ b/tests/operators/test_s3_to_hive_operator.py
@@ -19,12 +19,14 @@
 
 import unittest
 
+import six
+
+from airflow import AirflowException
 from tests.compat import mock
 import logging
 from itertools import product
 from airflow.operators.s3_to_hive_operator import S3ToHiveTransfer
 from collections import OrderedDict
-from airflow.exceptions import AirflowException
 from tempfile import NamedTemporaryFile, mkdtemp
 from gzip import GzipFile
 import bz2
@@ -156,10 +158,10 @@ class S3ToHiveTransferTest(unittest.TestCase):
     def test_bad_parameters(self):
         self.kwargs['check_headers'] = True
         self.kwargs['headers'] = False
-        self.assertRaisesRegexp(AirflowException,
-                                "To check_headers.*",
-                                S3ToHiveTransfer,
-                                **self.kwargs)
+        six.assertRaisesRegex(self, AirflowException,
+                              "To check_headers.*",
+                              S3ToHiveTransfer,
+                              **self.kwargs)
 
     def test__get_top_row_as_list(self):
         self.kwargs['delimiter'] = '\t'
diff --git a/tests/operators/test_sql_branch_operator.py b/tests/operators/test_sql.py
similarity index 57%
rename from tests/operators/test_sql_branch_operator.py
rename to tests/operators/test_sql.py
index 6510609..6ccc5fa 100644
--- a/tests/operators/test_sql_branch_operator.py
+++ b/tests/operators/test_sql.py
@@ -18,14 +18,20 @@
 
 import datetime
 import unittest
+
+import six
+
 from tests.compat import mock
 
 import pytest
 
 from airflow.exceptions import AirflowException
 from airflow.models import DAG, DagRun, TaskInstance as TI
+from airflow.operators.check_operator import (
+    CheckOperator, IntervalCheckOperator, ThresholdCheckOperator, ValueCheckOperator,
+)
 from airflow.operators.dummy_operator import DummyOperator
-from airflow.operators.sql_branch_operator import BranchSqlOperator
+from airflow.operators.sql import BranchSQLOperator
 from airflow.utils import timezone
 from airflow.utils.db import create_session
 from airflow.utils.state import State
@@ -60,6 +66,266 @@ SUPPORTED_FALSE_VALUES = [
 ]
 
 
+class TestCheckOperator(unittest.TestCase):
+    @mock.patch.object(CheckOperator, "get_db_hook")
+    def test_execute_no_records(self, mock_get_db_hook):
+        mock_get_db_hook.return_value.get_first.return_value = []
+
+        with self.assertRaises(AirflowException):
+            CheckOperator(sql="sql").execute()
+
+    @mock.patch.object(CheckOperator, "get_db_hook")
+    def test_execute_not_all_records_are_true(self, mock_get_db_hook):
+        mock_get_db_hook.return_value.get_first.return_value = ["data", ""]
+
+        with self.assertRaises(AirflowException):
+            CheckOperator(sql="sql").execute()
+
+
+class TestValueCheckOperator(unittest.TestCase):
+    def setUp(self):
+        self.task_id = "test_task"
+        self.conn_id = "default_conn"
+
+    def _construct_operator(self, sql, pass_value, tolerance=None):
+        dag = DAG("test_dag", start_date=datetime.datetime(2017, 1, 1))
+
+        return ValueCheckOperator(
+            dag=dag,
+            task_id=self.task_id,
+            conn_id=self.conn_id,
+            sql=sql,
+            pass_value=pass_value,
+            tolerance=tolerance,
+        )
+
+    def test_pass_value_template_string(self):
+        pass_value_str = "2018-03-22"
+        operator = self._construct_operator(
+            "select date from tab1;", "{{ ds }}")
+
+        operator.render_template_fields({"ds": pass_value_str})
+
+        self.assertEqual(operator.task_id, self.task_id)
+        self.assertEqual(operator.pass_value, pass_value_str)
+
+    def test_pass_value_template_string_float(self):
+        pass_value_float = 4.0
+        operator = self._construct_operator(
+            "select date from tab1;", pass_value_float)
+
+        operator.render_template_fields({})
+
+        self.assertEqual(operator.task_id, self.task_id)
+        self.assertEqual(operator.pass_value, str(pass_value_float))
+
+    @mock.patch.object(ValueCheckOperator, "get_db_hook")
+    def test_execute_pass(self, mock_get_db_hook):
+        mock_hook = mock.Mock()
+        mock_hook.get_first.return_value = [10]
+        mock_get_db_hook.return_value = mock_hook
+        sql = "select value from tab1 limit 1;"
+        operator = self._construct_operator(sql, 5, 1)
+
+        operator.execute(None)
+
+        mock_hook.get_first.assert_called_once_with(sql)
+
+    @mock.patch.object(ValueCheckOperator, "get_db_hook")
+    def test_execute_fail(self, mock_get_db_hook):
+        mock_hook = mock.Mock()
+        mock_hook.get_first.return_value = [11]
+        mock_get_db_hook.return_value = mock_hook
+
+        operator = self._construct_operator(
+            "select value from tab1 limit 1;", 5, 1)
+
+        with six.assertRaisesRegex(self, AirflowException, "Tolerance:100.0%"):
+            operator.execute()
+
+
+class TestIntervalCheckOperator(unittest.TestCase):
+    def _construct_operator(self, table, metric_thresholds, ratio_formula, ignore_zero):
+        return IntervalCheckOperator(
+            task_id="test_task",
+            table=table,
+            metrics_thresholds=metric_thresholds,
+            ratio_formula=ratio_formula,
+            ignore_zero=ignore_zero,
+        )
+
+    def test_invalid_ratio_formula(self):
+        with six.assertRaisesRegex(self, AirflowException, "Invalid diff_method"):
+            self._construct_operator(
+                table="test_table",
+                metric_thresholds={"f1": 1, },
+                ratio_formula="abs",
+                ignore_zero=False,
+            )
+
+    @mock.patch.object(IntervalCheckOperator, "get_db_hook")
+    def test_execute_not_ignore_zero(self, mock_get_db_hook):
+        mock_hook = mock.Mock()
+        mock_hook.get_first.return_value = [0]
+        mock_get_db_hook.return_value = mock_hook
+
+        operator = self._construct_operator(
+            table="test_table",
+            metric_thresholds={"f1": 1, },
+            ratio_formula="max_over_min",
+            ignore_zero=False,
+        )
+
+        with self.assertRaises(AirflowException):
+            operator.execute()
+
+    @mock.patch.object(IntervalCheckOperator, "get_db_hook")
+    def test_execute_ignore_zero(self, mock_get_db_hook):
+        mock_hook = mock.Mock()
+        mock_hook.get_first.return_value = [0]
+        mock_get_db_hook.return_value = mock_hook
+
+        operator = self._construct_operator(
+            table="test_table",
+            metric_thresholds={"f1": 1, },
+            ratio_formula="max_over_min",
+            ignore_zero=True,
+        )
+
+        operator.execute()
+
+    @mock.patch.object(IntervalCheckOperator, "get_db_hook")
+    def test_execute_min_max(self, mock_get_db_hook):
+        mock_hook = mock.Mock()
+
+        def returned_row():
+            rows = [
+                [2, 2, 2, 2],  # reference
+                [1, 1, 1, 1],  # current
+            ]
+            return rows
+
+        mock_hook.get_first.side_effect = returned_row()
+        mock_get_db_hook.return_value = mock_hook
+
+        operator = self._construct_operator(
+            table="test_table",
+            metric_thresholds={"f0": 1.0, "f1": 1.5, "f2": 2.0, "f3": 2.5, },
+            ratio_formula="max_over_min",
+            ignore_zero=True,
+        )
+
+        with six.assertRaisesRegex(self, AirflowException, "f0, f1, f2"):
+            operator.execute()
+
+    @mock.patch.object(IntervalCheckOperator, "get_db_hook")
+    def test_execute_diff(self, mock_get_db_hook):
+        mock_hook = mock.Mock()
+
+        def returned_row():
+            rows = [
+                [3, 3, 3, 3],  # reference
+                [1, 1, 1, 1],  # current
+            ]
+
+            return rows
+
+        mock_hook.get_first.side_effect = returned_row()
+        mock_get_db_hook.return_value = mock_hook
+
+        operator = self._construct_operator(
+            table="test_table",
+            metric_thresholds={"f0": 0.5, "f1": 0.6, "f2": 0.7, "f3": 0.8, },
+            ratio_formula="relative_diff",
+            ignore_zero=True,
+        )
+
+        with six.assertRaisesRegex(self, AirflowException, "f0, f1"):
+            operator.execute()
+
+
+class TestThresholdCheckOperator(unittest.TestCase):
+    def _construct_operator(self, sql, min_threshold, max_threshold):
+        dag = DAG("test_dag", start_date=datetime.datetime(2017, 1, 1))
+
+        return ThresholdCheckOperator(
+            task_id="test_task",
+            sql=sql,
+            min_threshold=min_threshold,
+            max_threshold=max_threshold,
+            dag=dag,
+        )
+
+    @mock.patch.object(ThresholdCheckOperator, "get_db_hook")
+    def test_pass_min_value_max_value(self, mock_get_db_hook):
+        mock_hook = mock.Mock()
+        mock_hook.get_first.return_value = [(10,)]
+        mock_get_db_hook.return_value = mock_hook
+
+        operator = self._construct_operator(
+            "Select avg(val) from table1 limit 1", 1, 100
+        )
+
+        operator.execute()
+
+    @mock.patch.object(ThresholdCheckOperator, "get_db_hook")
+    def test_fail_min_value_max_value(self, mock_get_db_hook):
+        mock_hook = mock.Mock()
+        mock_hook.get_first.return_value = [(10,)]
+        mock_get_db_hook.return_value = mock_hook
+
+        operator = self._construct_operator(
+            "Select avg(val) from table1 limit 1", 20, 100
+        )
+
+        with six.assertRaisesRegex(self, AirflowException, "10.*20.0.*100.0"):
+            operator.execute()
+
+    @mock.patch.object(ThresholdCheckOperator, "get_db_hook")
+    def test_pass_min_sql_max_sql(self, mock_get_db_hook):
+        mock_hook = mock.Mock()
+        mock_hook.get_first.side_effect = lambda x: [(int(x.split()[1]),)]
+        mock_get_db_hook.return_value = mock_hook
+
+        operator = self._construct_operator(
+            "Select 10", "Select 1", "Select 100")
+
+        operator.execute()
+
+    @mock.patch.object(ThresholdCheckOperator, "get_db_hook")
+    def test_fail_min_sql_max_sql(self, mock_get_db_hook):
+        mock_hook = mock.Mock()
+        mock_hook.get_first.side_effect = lambda x: [(int(x.split()[1]),)]
+        mock_get_db_hook.return_value = mock_hook
+
+        operator = self._construct_operator(
+            "Select 10", "Select 20", "Select 100")
+
+        with six.assertRaisesRegex(self, AirflowException, "10.*20.*100"):
+            operator.execute()
+
+    @mock.patch.object(ThresholdCheckOperator, "get_db_hook")
+    def test_pass_min_value_max_sql(self, mock_get_db_hook):
+        mock_hook = mock.Mock()
+        mock_hook.get_first.side_effect = lambda x: [(int(x.split()[1]),)]
+        mock_get_db_hook.return_value = mock_hook
+
+        operator = self._construct_operator("Select 75", 45, "Select 100")
+
+        operator.execute()
+
+    @mock.patch.object(ThresholdCheckOperator, "get_db_hook")
+    def test_fail_min_sql_max_value(self, mock_get_db_hook):
+        mock_hook = mock.Mock()
+        mock_hook.get_first.side_effect = lambda x: [(int(x.split()[1]),)]
+        mock_get_db_hook.return_value = mock_hook
+
+        operator = self._construct_operator("Select 155", "Select 45", 100)
+
+        with six.assertRaisesRegex(self, AirflowException, "155.*45.*100.0"):
+            operator.execute()
+
+
 class TestSqlBranch(TestHiveEnvironment, unittest.TestCase):
     """
     Test for SQL Branch Operator
@@ -92,8 +358,8 @@ class TestSqlBranch(TestHiveEnvironment, unittest.TestCase):
             session.query(TI).delete()
 
     def test_unsupported_conn_type(self):
-        """ Check if BranchSqlOperator throws an exception for unsupported connection type """
-        op = BranchSqlOperator(
+        """ Check if BranchSQLOperator throws an exception for unsupported connection type """
+        op = BranchSQLOperator(
             task_id="make_choice",
             conn_id="redis_default",
             sql="SELECT count(1) FROM INFORMATION_SCHEMA.TABLES",
@@ -103,11 +369,12 @@ class TestSqlBranch(TestHiveEnvironment, unittest.TestCase):
         )
 
         with self.assertRaises(AirflowException):
-            op.run(start_date=DEFAULT_DATE, end_date=DEFAULT_DATE, ignore_ti_state=True)
+            op.run(start_date=DEFAULT_DATE,
+                   end_date=DEFAULT_DATE, ignore_ti_state=True)
 
     def test_invalid_conn(self):
-        """ Check if BranchSqlOperator throws an exception for invalid connection """
-        op = BranchSqlOperator(
+        """ Check if BranchSQLOperator throws an exception for invalid connection """
+        op = BranchSQLOperator(
             task_id="make_choice",
             conn_id="invalid_connection",
             sql="SELECT count(1) FROM INFORMATION_SCHEMA.TABLES",
@@ -117,11 +384,12 @@ class TestSqlBranch(TestHiveEnvironment, unittest.TestCase):
         )
 
         with self.assertRaises(AirflowException):
-            op.run(start_date=DEFAULT_DATE, end_date=DEFAULT_DATE, ignore_ti_state=True)
+            op.run(start_date=DEFAULT_DATE,
+                   end_date=DEFAULT_DATE, ignore_ti_state=True)
 
     def test_invalid_follow_task_true(self):
-        """ Check if BranchSqlOperator throws an exception for invalid connection """
-        op = BranchSqlOperator(
+        """ Check if BranchSQLOperator throws an exception for invalid connection """
+        op = BranchSQLOperator(
             task_id="make_choice",
             conn_id="invalid_connection",
             sql="SELECT count(1) FROM INFORMATION_SCHEMA.TABLES",
@@ -131,11 +399,12 @@ class TestSqlBranch(TestHiveEnvironment, unittest.TestCase):
         )
 
         with self.assertRaises(AirflowException):
-            op.run(start_date=DEFAULT_DATE, end_date=DEFAULT_DATE, ignore_ti_state=True)
+            op.run(start_date=DEFAULT_DATE,
+                   end_date=DEFAULT_DATE, ignore_ti_state=True)
 
     def test_invalid_follow_task_false(self):
-        """ Check if BranchSqlOperator throws an exception for invalid connection """
-        op = BranchSqlOperator(
+        """ Check if BranchSQLOperator throws an exception for invalid connection """
+        op = BranchSQLOperator(
             task_id="make_choice",
             conn_id="invalid_connection",
             sql="SELECT count(1) FROM INFORMATION_SCHEMA.TABLES",
@@ -145,12 +414,13 @@ class TestSqlBranch(TestHiveEnvironment, unittest.TestCase):
         )
 
         with self.assertRaises(AirflowException):
-            op.run(start_date=DEFAULT_DATE, end_date=DEFAULT_DATE, ignore_ti_state=True)
+            op.run(start_date=DEFAULT_DATE,
+                   end_date=DEFAULT_DATE, ignore_ti_state=True)
 
     @pytest.mark.backend("mysql")
     def test_sql_branch_operator_mysql(self):
-        """ Check if BranchSqlOperator works with backend """
-        branch_op = BranchSqlOperator(
+        """ Check if BranchSQLOperator works with backend """
+        branch_op = BranchSQLOperator(
             task_id="make_choice",
             conn_id="mysql_default",
             sql="SELECT 1",
@@ -164,8 +434,8 @@ class TestSqlBranch(TestHiveEnvironment, unittest.TestCase):
 
     @pytest.mark.backend("postgres")
     def test_sql_branch_operator_postgres(self):
-        """ Check if BranchSqlOperator works with backend """
-        branch_op = BranchSqlOperator(
+        """ Check if BranchSQLOperator works with backend """
+        branch_op = BranchSQLOperator(
             task_id="make_choice",
             conn_id="postgres_default",
             sql="SELECT 1",
@@ -177,10 +447,10 @@ class TestSqlBranch(TestHiveEnvironment, unittest.TestCase):
             start_date=DEFAULT_DATE, end_date=DEFAULT_DATE, ignore_ti_state=True
         )
 
-    @mock.patch("airflow.operators.sql_branch_operator.BaseHook")
+    @mock.patch("airflow.operators.sql.BaseHook")
     def test_branch_single_value_with_dag_run(self, mock_hook):
-        """ Check BranchSqlOperator branch operation """
-        branch_op = BranchSqlOperator(
+        """ Check BranchSQLOperator branch operation """
+        branch_op = BranchSQLOperator(
             task_id="make_choice",
             conn_id="mysql_default",
             sql="SELECT 1",
@@ -220,10 +490,10 @@ class TestSqlBranch(TestHiveEnvironment, unittest.TestCase):
             else:
                 raise ValueError("Invalid task id {task_id} found!".format(task_id=ti.task_id))
 
-    @mock.patch("airflow.operators.sql_branch_operator.BaseHook")
+    @mock.patch("airflow.operators.sql.BaseHook")
     def test_branch_true_with_dag_run(self, mock_hook):
-        """ Check BranchSqlOperator branch operation """
-        branch_op = BranchSqlOperator(
+        """ Check BranchSQLOperator branch operation """
+        branch_op = BranchSQLOperator(
             task_id="make_choice",
             conn_id="mysql_default",
             sql="SELECT 1",
@@ -264,10 +534,10 @@ class TestSqlBranch(TestHiveEnvironment, unittest.TestCase):
                 else:
                     raise ValueError("Invalid task id {task_id} found!".format(task_id=ti.task_id))
 
-    @mock.patch("airflow.operators.sql_branch_operator.BaseHook")
+    @mock.patch("airflow.operators.sql.BaseHook")
     def test_branch_false_with_dag_run(self, mock_hook):
-        """ Check BranchSqlOperator branch operation """
-        branch_op = BranchSqlOperator(
+        """ Check BranchSQLOperator branch operation """
+        branch_op = BranchSQLOperator(
             task_id="make_choice",
             conn_id="mysql_default",
             sql="SELECT 1",
@@ -308,10 +578,10 @@ class TestSqlBranch(TestHiveEnvironment, unittest.TestCase):
                 else:
                     raise ValueError("Invalid task id {task_id} found!".format(task_id=ti.task_id))
 
-    @mock.patch("airflow.operators.sql_branch_operator.BaseHook")
+    @mock.patch("airflow.operators.sql.BaseHook")
     def test_branch_list_with_dag_run(self, mock_hook):
-        """ Checks if the BranchSqlOperator supports branching off to a list of tasks."""
-        branch_op = BranchSqlOperator(
+        """ Checks if the BranchSQLOperator supports branching off to a list of tasks."""
+        branch_op = BranchSQLOperator(
             task_id="make_choice",
             conn_id="mysql_default",
             sql="SELECT 1",
@@ -354,10 +624,10 @@ class TestSqlBranch(TestHiveEnvironment, unittest.TestCase):
             else:
                 raise ValueError("Invalid task id {task_id} found!".format(task_id=ti.task_id))
 
-    @mock.patch("airflow.operators.sql_branch_operator.BaseHook")
+    @mock.patch("airflow.operators.sql.BaseHook")
     def test_invalid_query_result_with_dag_run(self, mock_hook):
-        """ Check BranchSqlOperator branch operation """
-        branch_op = BranchSqlOperator(
+        """ Check BranchSQLOperator branch operation """
+        branch_op = BranchSQLOperator(
             task_id="make_choice",
             conn_id="mysql_default",
             sql="SELECT 1",
@@ -387,10 +657,10 @@ class TestSqlBranch(TestHiveEnvironment, unittest.TestCase):
         with self.assertRaises(AirflowException):
             branch_op.run(start_date=DEFAULT_DATE, end_date=DEFAULT_DATE)
 
-    @mock.patch("airflow.operators.sql_branch_operator.BaseHook")
+    @mock.patch("airflow.operators.sql.BaseHook")
     def test_with_skip_in_branch_downstream_dependencies(self, mock_hook):
         """ Test SQL Branch with skipping all downstream dependencies """
-        branch_op = BranchSqlOperator(
+        branch_op = BranchSQLOperator(
             task_id="make_choice",
             conn_id="mysql_default",
             sql="SELECT 1",
@@ -431,10 +701,10 @@ class TestSqlBranch(TestHiveEnvironment, unittest.TestCase):
                 else:
                     raise ValueError("Invalid task id {task_id} found!".format(task_id=ti.task_id))
 
-    @mock.patch("airflow.operators.sql_branch_operator.BaseHook")
+    @mock.patch("airflow.operators.sql.BaseHook")
     def test_with_skip_in_branch_downstream_dependencies2(self, mock_hook):
         """ Test skipping downstream dependency for false condition"""
-        branch_op = BranchSqlOperator(
+        branch_op = BranchSQLOperator(
             task_id="make_choice",
             conn_id="mysql_default",
             sql="SELECT 1",
diff --git a/tests/secrets/test_local_filesystem.py b/tests/secrets/test_local_filesystem.py
index 60cec06..dc06969 100644
--- a/tests/secrets/test_local_filesystem.py
+++ b/tests/secrets/test_local_filesystem.py
@@ -48,7 +48,7 @@ class FileParsers(unittest.TestCase):
     )
     def test_env_file_invalid_format(self, content, expected_message):
         with mock_local_file(content):
-            with self.assertRaisesRegexp(AirflowFileParseException, re.escape(expected_message)):
+            with six.assertRaisesRegex(self, AirflowFileParseException, re.escape(expected_message)):
                 local_filesystem.load_variables("a.env")
 
     @parameterized.expand(
@@ -65,7 +65,7 @@ class FileParsers(unittest.TestCase):
     )
     def test_json_file_invalid_format(self, content, expected_message):
         with mock_local_file(content):
-            with self.assertRaisesRegexp(AirflowFileParseException, re.escape(expected_message)):
+            with six.assertRaisesRegex(self, AirflowFileParseException, re.escape(expected_message)):
                 local_filesystem.load_variables("a.json")
 
 
@@ -87,7 +87,7 @@ class TestLoadVariables(unittest.TestCase):
     @parameterized.expand((("AA=A\nAA=B", "The \"a.env\" file contains multiple values for keys: ['AA']"),))
     def test_env_file_invalid_logic(self, content, expected_message):
         with mock_local_file(content):
-            with self.assertRaisesRegexp(AirflowException, re.escape(expected_message)):
+            with six.assertRaisesRegex(self, AirflowException, re.escape(expected_message)):
                 local_filesystem.load_variables("a.env")
 
     @parameterized.expand(
@@ -105,7 +105,8 @@ class TestLoadVariables(unittest.TestCase):
 
     @mock.patch("airflow.secrets.local_filesystem.os.path.exists", return_value=False)
     def test_missing_file(self, mock_exists):
-        with self.assertRaisesRegexp(
+        with six.assertRaisesRegex(
+            self,
             AirflowException,
             re.escape("File a.json was not found. Check the configuration of your Secrets backend."),
         ):
@@ -148,7 +149,7 @@ class TestLoadConnection(unittest.TestCase):
     )
     def test_env_file_invalid_format(self, content, expected_message):
         with mock_local_file(content):
-            with self.assertRaisesRegexp(AirflowFileParseException, re.escape(expected_message)):
+            with six.assertRaisesRegex(self, AirflowFileParseException, re.escape(expected_message)):
                 local_filesystem.load_connections("a.env")
 
     @parameterized.expand(
@@ -189,12 +190,13 @@ class TestLoadConnection(unittest.TestCase):
     )
     def test_env_file_invalid_input(self, file_content, expected_connection_uris):
         with mock_local_file(json.dumps(file_content)):
-            with self.assertRaisesRegexp(AirflowException, re.escape(expected_connection_uris)):
+            with six.assertRaisesRegex(self, AirflowException, re.escape(expected_connection_uris)):
                 local_filesystem.load_connections("a.json")
 
     @mock.patch("airflow.secrets.local_filesystem.os.path.exists", return_value=False)
     def test_missing_file(self, mock_exists):
-        with self.assertRaisesRegexp(
+        with six.assertRaisesRegex(
+            self,
             AirflowException,
             re.escape("File a.json was not found. Check the configuration of your Secrets backend."),
         ):
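
Every assertRaisesRegexp -> six.assertRaisesRegex substitution in this and the following test files applies the same Python 2/3 compatibility pattern. A minimal, self-contained sketch of it (the test names are made up):

    # unittest's assertRaisesRegexp is the Python 2 spelling and is deprecated on
    # Python 3, while assertRaisesRegex does not exist on Python 2.7;
    # six.assertRaisesRegex(self, ...) dispatches to whichever one the running
    # interpreter provides, so the same test code works on both.
    import unittest

    import six


    class ExampleCompatTest(unittest.TestCase):
        def test_error_message_is_checked(self):
            with six.assertRaisesRegex(self, ValueError, "bad value"):
                raise ValueError("bad value indeed")


    if __name__ == "__main__":
        unittest.main()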
diff --git a/tests/sensors/test_http_sensor.py b/tests/sensors/test_http_sensor.py
index 5b2d19e..db0f02a 100644
--- a/tests/sensors/test_http_sensor.py
+++ b/tests/sensors/test_http_sensor.py
@@ -19,6 +19,7 @@
 import unittest
 
 import requests
+import six
 from mock import patch
 
 from airflow import DAG
@@ -61,7 +62,7 @@ class HttpSensorTests(unittest.TestCase):
             response_check=resp_check,
             timeout=5,
             poke_interval=1)
-        with self.assertRaisesRegexp(AirflowException, 'AirflowException raised here!'):
+        with six.assertRaisesRegex(self, AirflowException, 'AirflowException raised here!'):
             task.execute(None)
 
     @patch("airflow.hooks.http_hook.requests.Session.send")
diff --git a/tests/utils/test_compression.py b/tests/utils/test_compression.py
index 5a36709..022d981 100644
--- a/tests/utils/test_compression.py
+++ b/tests/utils/test_compression.py
@@ -26,6 +26,8 @@ import shutil
 import tempfile
 import unittest
 
+import six
+
 from airflow.utils import compression
 
 
@@ -81,13 +83,13 @@ class Compression(unittest.TestCase):
 
     def test_uncompress_file(self):
         # Testing txt file type
-        self.assertRaisesRegexp(NotImplementedError,
-                                "^Received .txt format. Only gz and bz2.*",
-                                compression.uncompress_file,
-                                **{'input_file_name': None,
-                                   'file_extension': '.txt',
-                                   'dest_dir': None
-                                   })
+        six.assertRaisesRegex(self, NotImplementedError,
+                              "^Received .txt format. Only gz and bz2.*",
+                              compression.uncompress_file,
+                              **{'input_file_name': None,
+                                 'file_extension': '.txt',
+                                 'dest_dir': None
+                                 })
         # Testing gz file type
         fn_txt = self._get_fn('.txt')
         fn_gz = self._get_fn('.gz')
diff --git a/tests/utils/test_decorators.py b/tests/utils/test_decorators.py
index d23cdcc..05df91b 100644
--- a/tests/utils/test_decorators.py
+++ b/tests/utils/test_decorators.py
@@ -19,6 +19,8 @@
 
 import unittest
 
+import six
+
 from airflow.utils.decorators import apply_defaults
 from airflow.exceptions import AirflowException
 
@@ -43,7 +45,7 @@ class ApplyDefaultTest(unittest.TestCase):
         dc = DummyClass(test_param=True)
         self.assertTrue(dc.test_param)
 
-        with self.assertRaisesRegexp(AirflowException, 'Argument.*test_param.*required'):
+        with six.assertRaisesRegex(self, AirflowException, 'Argument.*test_param.*required'):
             DummySubClass(test_sub_param=True)
 
     def test_default_args(self):
@@ -61,8 +63,8 @@ class ApplyDefaultTest(unittest.TestCase):
         self.assertTrue(dc.test_param)
         self.assertTrue(dsc.test_sub_param)
 
-        with self.assertRaisesRegexp(AirflowException,
-                                     'Argument.*test_sub_param.*required'):
+        with six.assertRaisesRegex(self, AirflowException,
+                                   'Argument.*test_sub_param.*required'):
             DummySubClass(default_args=default_args)
 
     def test_incorrect_default_args(self):
@@ -71,5 +73,5 @@ class ApplyDefaultTest(unittest.TestCase):
         self.assertTrue(dc.test_param)
 
         default_args = {'random_params': True}
-        with self.assertRaisesRegexp(AirflowException, 'Argument.*test_param.*required'):
+        with six.assertRaisesRegex(self, AirflowException, 'Argument.*test_param.*required'):
             DummyClass(default_args=default_args)
diff --git a/tests/utils/test_json.py b/tests/utils/test_json.py
index ce0eece..bc3c7f9 100644
--- a/tests/utils/test_json.py
+++ b/tests/utils/test_json.py
@@ -22,6 +22,7 @@ import json
 import unittest
 
 import numpy as np
+import six
 
 from airflow.utils import json as utils_json
 
@@ -76,11 +77,11 @@ class TestAirflowJsonEncoder(unittest.TestCase):
         )
 
     def test_encode_raises(self):
-        self.assertRaisesRegexp(TypeError,
-                                "^.*is not JSON serializable$",
-                                json.dumps,
-                                Exception,
-                                cls=utils_json.AirflowJsonEncoder)
+        six.assertRaisesRegex(self, TypeError,
+                              "^.*is not JSON serializable$",
+                              json.dumps,
+                              Exception,
+                              cls=utils_json.AirflowJsonEncoder)
 
 
 if __name__ == '__main__':
diff --git a/tests/utils/test_module_loading.py b/tests/utils/test_module_loading.py
index ba1ebca..cde32c5 100644
--- a/tests/utils/test_module_loading.py
+++ b/tests/utils/test_module_loading.py
@@ -19,6 +19,8 @@
 
 import unittest
 
+import six
+
 from airflow.utils.module_loading import import_string
 
 
@@ -31,5 +33,5 @@ class ModuleImportTestCase(unittest.TestCase):
         with self.assertRaises(ImportError):
             import_string('no_dots_in_path')
         msg = 'Module "airflow.utils" does not define a "nonexistent" attribute'
-        with self.assertRaisesRegexp(ImportError, msg):
+        with six.assertRaisesRegex(self, ImportError, msg):
             import_string('airflow.utils.nonexistent')
diff --git a/tests/www/test_validators.py b/tests/www/test_validators.py
index e624263..6b4fcbd 100644
--- a/tests/www/test_validators.py
+++ b/tests/www/test_validators.py
@@ -20,6 +20,8 @@
 import mock
 import unittest
 
+import six
+
 from airflow.www import validators
 
 
@@ -46,7 +48,8 @@ class TestGreaterEqualThan(unittest.TestCase):
         return validator(self.form_mock, self.form_field_mock)
 
     def test_field_not_found(self):
-        self.assertRaisesRegexp(
+        six.assertRaisesRegex(
+            self,
             validators.ValidationError,
             "^Invalid field name 'some'.$",
             self._validate,
@@ -75,7 +78,8 @@ class TestGreaterEqualThan(unittest.TestCase):
     def test_validation_raises(self):
         self.form_field_mock.data = '2017-05-04'
 
-        self.assertRaisesRegexp(
+        six.assertRaisesRegex(
+            self,
             validators.ValidationError,
             "^Field must be greater than or equal to other field.$",
             self._validate,
@@ -84,7 +88,8 @@ class TestGreaterEqualThan(unittest.TestCase):
     def test_validation_raises_custom_message(self):
         self.form_field_mock.data = '2017-05-04'
 
-        self.assertRaisesRegexp(
+        six.assertRaisesRegex(
+            self,
             validators.ValidationError,
             "^This field must be greater than or equal to MyField.$",
             self._validate,
diff --git a/tests/www_rbac/test_validators.py b/tests/www_rbac/test_validators.py
index 4a543ff..95c7562 100644
--- a/tests/www_rbac/test_validators.py
+++ b/tests/www_rbac/test_validators.py
@@ -48,7 +48,8 @@ class TestGreaterEqualThan(unittest.TestCase):
         return validator(self.form_mock, self.form_field_mock)
 
     def test_field_not_found(self):
-        self.assertRaisesRegexp(
+        six.assertRaisesRegex(
+            self,
             validators.ValidationError,
             "^Invalid field name 'some'.$",
             self._validate,
@@ -77,7 +78,8 @@ class TestGreaterEqualThan(unittest.TestCase):
     def test_validation_raises(self):
         self.form_field_mock.data = '2017-05-04'
 
-        self.assertRaisesRegexp(
+        six.assertRaisesRegex(
+            self,
             validators.ValidationError,
             "^Field must be greater than or equal to other field.$",
             self._validate,
@@ -86,7 +88,8 @@ class TestGreaterEqualThan(unittest.TestCase):
     def test_validation_raises_custom_message(self):
         self.form_field_mock.data = '2017-05-04'
 
-        self.assertRaisesRegexp(
+        six.assertRaisesRegex(
+            self,
             validators.ValidationError,
             "^This field must be greater than or equal to MyField.$",
             self._validate,


[airflow] 18/25: clarify breeze initialize virtualenv instructions (#9319)

Posted by po...@apache.org.
This is an automated email from the ASF dual-hosted git repository.

potiuk pushed a commit to branch v1-10-test
in repository https://gitbox.apache.org/repos/asf/airflow.git

commit 6efe62fba0f15f6d3a08fa406310aa1512e64ecf
Author: dstandish <ds...@users.noreply.github.com>
AuthorDate: Tue Jun 16 11:38:42 2020 -0700

    clarify breeze initialize virtualenv instructions (#9319)
    
    * you need to activate virtualenv, not enter breeze, before running the command
    
    (cherry picked from commit d6e5e7ce52f0a4b28dee29a64bcd2f5d6b152c92)
---
 LOCAL_VIRTUALENV.rst | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)

diff --git a/LOCAL_VIRTUALENV.rst b/LOCAL_VIRTUALENV.rst
index e45e76c..cd66ea4 100644
--- a/LOCAL_VIRTUALENV.rst
+++ b/LOCAL_VIRTUALENV.rst
@@ -141,7 +141,7 @@ You can solve the problem by:
 
 Note that if you have the Breeze development environment installed, the ``breeze``
 script can automate initializing the created virtualenv (steps 2 and 3).
-Simply enter the Breeze environment by using ``workon`` and, once you are in it, run:
+Activate your virtualenv, e.g. by using ``workon``, and once you are in it, run:
 
 .. code-block:: bash
 

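A short sketch of the workflow the changed sentence describes, assuming a virtualenvwrapper-managed environment named ``airflow-env`` (the name is illustrative) and the ``initialize-local-virtualenv`` command listed in the breeze help further down in this thread:

    # Activate the virtualenv first (workon comes from virtualenvwrapper), then
    # let breeze initialize it; run this from the airflow source tree.
    workon airflow-env
    ./breeze initialize-local-virtualenv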

[airflow] 06/25: Updated missing parameters for docker image building (#9039)

Posted by po...@apache.org.
This is an automated email from the ASF dual-hosted git repository.

potiuk pushed a commit to branch v1-10-test
in repository https://gitbox.apache.org/repos/asf/airflow.git

commit c7bca9ddfdf3588209fe0a3bcc8a1be2721ae881
Author: Jarek Potiuk <ja...@polidea.com>
AuthorDate: Tue Jun 2 09:27:09 2020 +0200

    Updated missing parameters for docker image building (#9039)
    
    
    (cherry picked from commit b7b48463b17cf656c34859baafcf7c941c664ae1)
---
 IMAGES.rst | 24 +++++++++++++++++++++---
 1 file changed, 21 insertions(+), 3 deletions(-)

diff --git a/IMAGES.rst b/IMAGES.rst
index 37b8d03..0a34140 100644
--- a/IMAGES.rst
+++ b/IMAGES.rst
@@ -198,6 +198,8 @@ The following build arguments (``--build-arg`` in docker build command) can be u
 +------------------------------------------+------------------------------------------+------------------------------------------+
 | ``AIRFLOW_EXTRAS``                       | ``all``                                  | extras to install                        |
 +------------------------------------------+------------------------------------------+------------------------------------------+
+| ``ADDITIONAL_AIRFLOW_EXTRAS``            | ````                                     | additional extras to install             |
++------------------------------------------+------------------------------------------+------------------------------------------+
 | ``ADDITIONAL_PYTHON_DEPS``               | \```\`                                   | additional python dependencies to        |
 |                                          |                                          | install                                  |
 +------------------------------------------+------------------------------------------+------------------------------------------+
@@ -220,6 +222,22 @@ This builds the CI image in version 3.6 with "gcp" extra only.
     --build-arg PYTHON_MAJOR_MINOR_VERSION=3.6 --build-arg AIRFLOW_EXTRAS=gcp
 
 
+This builds the CI image in version 3.6 with "apache-beam" extra added.
+
+.. code-block::
+
+  docker build . -f Dockerfile.ci --build-arg PYTHON_BASE_IMAGE="python:3.7-slim-buster" \
+    --build-arg PYTHON_MAJOR_MINOR_VERSION=3.6 --build-arg ADDITIONAL_AIRFLOW_EXTRAS="apache-beam"
+
+This builds the CI image in version 3.6 with "mssql" additional package added.
+
+.. code-block::
+
+  docker build . -f Dockerfile.ci --build-arg PYTHON_BASE_IMAGE="python:3.7-slim-buster" \
+    --build-arg PYTHON_MAJOR_MINOR_VERSION=3.6 --build-arg ADDITIONAL_PYTHON_DEPS="mssql"
+
+
+
 Production images
 .................
 
@@ -253,10 +271,10 @@ The following build arguments (``--build-arg`` in docker build command) can be u
 | ``AIRFLOW_EXTRAS``                       | (see Dockerfile)                         | Default extras with which airflow is     |
 |                                          |                                          | installed                                |
 +------------------------------------------+------------------------------------------+------------------------------------------+
-| ``ADDITIONAL_AIRFLOW_EXTRAS``            | (see Dockerfile)                         | Additional extras with which airflow is  |
-|                                          |                                          | installed                                |
+| ``ADDITIONAL_AIRFLOW_EXTRAS``            | ````                                     | Optional additional extras with which    |
+|                                          |                                          | airflow is installed                     |
 +------------------------------------------+------------------------------------------+------------------------------------------+
-| ``ADDITIONAL_PYTHON_DEPS``               | (see Dockerfile)                         | Optional python packages to extend       |
+| ``ADDITIONAL_PYTHON_DEPS``               | ````                                     | Optional python packages to extend       |
 |                                          |                                          | the image with some extra dependencies   |
 +------------------------------------------+------------------------------------------+------------------------------------------+
 | ``AIRFLOW_HOME``                         | ``/opt/airflow``                         | Airflow’s HOME (that’s where logs and    |

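By analogy with the CI examples above - a sketch only, not taken from the docs, and assuming the production Dockerfile accepts the same ``--build-arg`` syntax - the two optional arguments from this table can be combined when building the production image:

    # Hypothetical production-image build adding one extra Airflow extra and one
    # extra pip package on top of the defaults; all values are illustrative.
    docker build . \
      --build-arg PYTHON_BASE_IMAGE="python:3.7-slim-buster" \
      --build-arg PYTHON_MAJOR_MINOR_VERSION=3.7 \
      --build-arg ADDITIONAL_AIRFLOW_EXTRAS="mssql" \
      --build-arg ADDITIONAL_PYTHON_DEPS="sshtunnel"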

[airflow] 03/25: Add PR/issue note in Contribution Workflow Example (#9177)

Posted by po...@apache.org.
This is an automated email from the ASF dual-hosted git repository.

potiuk pushed a commit to branch v1-10-test
in repository https://gitbox.apache.org/repos/asf/airflow.git

commit 659863d8b0995f9c7fac8b99835c227d1c75d9a9
Author: Kamil Breguła <mi...@users.noreply.github.com>
AuthorDate: Mon Jun 8 13:24:56 2020 +0200

    Add PR/issue note in Contribution Workflow Example (#9177)
    
    * Add PR/issue note in Contribution Workflow Example
    
    * Update CONTRIBUTING.rst
    
    Co-authored-by: Eric Lopes <nu...@users.noreply.github.com>
    
    * Update CONTRIBUTING.rst
    
    Co-authored-by: Eric Lopes <nu...@users.noreply.github.com>
    
    * Update CONTRIBUTING.rst
    
    Co-authored-by: Eric Lopes <nu...@users.noreply.github.com>
    (cherry picked from commit 2038b6957c48de8b2ad580d4ba7f51c1d1c98252)
---
 CONTRIBUTING.rst | 3 +++
 1 file changed, 3 insertions(+)

diff --git a/CONTRIBUTING.rst b/CONTRIBUTING.rst
index 1b8ce2f..e77a526 100644
--- a/CONTRIBUTING.rst
+++ b/CONTRIBUTING.rst
@@ -668,6 +668,9 @@ at `Apache JIRA <https://issues.apache.org/jira/browse/AIRFLOW>`__.
 If you create pull-request, you don't have to create an issue first, but if you want, you can do it.
 Creating an issue will allow you to collect feedback or share plans with other people.
 
+If you create pull-request, you don't have to create an issue first, but if you want, you can do it.
+Creating an issue will allow you to collect feedback or share plans with other people.
+
 For example, you want to have the following sample ticket assigned to you:
 `AIRFLOW-5934: Add extra CC: to the emails sent by Aiflow <https://issues.apache.org/jira/browse/AIRFLOW-5934>`_.
 


[airflow] 16/25: Improve production image iteration speed (#9162)

Posted by po...@apache.org.
This is an automated email from the ASF dual-hosted git repository.

potiuk pushed a commit to branch v1-10-test
in repository https://gitbox.apache.org/repos/asf/airflow.git

commit ea93adc72ae52a6e531505c8464508f3fda8d5c9
Author: Jarek Potiuk <ja...@polidea.com>
AuthorDate: Tue Jun 16 12:36:46 2020 +0200

    Improve production image iteration speed (#9162)
    
    For a long time the way the entrypoint worked in ci scripts
    was wrong. The way it worked was convoluted and little short of black
    magic. This did not allow passing multiple test targets and
    required separate execute command scripts in Breeze.
    
    This is all now straightened out and both production and
    CI image are always using the right entrypoint by default
    and we can simply pass parameters to the image as usual without
    escaping strings.
    
    This also allowed to remove some breeze commands and
    change names of several flags in Breeze to make them more
    meaningful.
    
    Both CI and PROD image have now embedded scripts for log
    cleaning.
    
    History of image releases is added for 1.10.10-*
    alpha quality images.
    
    (cherry picked from commit 7c12a9d4e0b6c1e01fee6ab227a6e25b5aa5b157)
---
 .dockerignore                                      |   2 +-
 .github/workflows/ci.yml                           |   2 +-
 BREEZE.rst                                         | 219 +++----------
 Dockerfile                                         |  34 +-
 Dockerfile.ci                                      |  17 +-
 IMAGES.rst                                         | 346 +++++++++++++++++----
 TESTING.rst                                        |  10 +-
 breeze                                             | 216 ++++---------
 breeze-complete                                    |   7 +-
 scripts/ci/ci_run_airflow_testing.sh               |   6 +-
 scripts/ci/docker-compose/base.yml                 |   7 -
 scripts/ci/docker-compose/local-prod.yml           |   2 +-
 scripts/ci/docker-compose/local.yml                |   4 +-
 scripts/ci/in_container/entrypoint_ci.sh           |  42 +--
 scripts/ci/in_container/entrypoint_exec.sh         |   2 +-
 scripts/ci/in_container/run_ci_tests.sh            |   9 +-
 scripts/ci/libraries/_build_images.sh              |  90 +++---
 scripts/ci/libraries/_initialization.sh            |  12 +-
 scripts/ci/libraries/_local_mounts.sh              |   2 +-
 scripts/ci/libraries/_md5sum.sh                    |   4 +-
 scripts/ci/libraries/_push_pull_remove_images.sh   |   5 +-
 scripts/ci/libraries/_start_end.sh                 |   4 +
 scripts/ci/libraries/_verbosity.sh                 |  11 +
 scripts/docker/entrypoint.sh                       | 110 -------
 .../entrypoint_exec.sh => prod/clean-logs.sh}      |  22 +-
 entrypoint.sh => scripts/prod/entrypoint_prod.sh   |   0
 26 files changed, 534 insertions(+), 651 deletions(-)

diff --git a/.dockerignore b/.dockerignore
index e7d6564..0a89434 100644
--- a/.dockerignore
+++ b/.dockerignore
@@ -49,7 +49,7 @@
 !NOTICE
 !.github
 !requirements
-!entrypoint.sh
+!empty
 
 # Avoid triggering context change on README change (new companies using Airflow)
 # So please do not uncomment this line ;)
diff --git a/.github/workflows/ci.yml b/.github/workflows/ci.yml
index f2c96d1..ee799dd 100644
--- a/.github/workflows/ci.yml
+++ b/.github/workflows/ci.yml
@@ -33,7 +33,7 @@ env:
   VERBOSE: "true"
   UPGRADE_TO_LATEST_REQUIREMENTS: "false"
   PYTHON_MAJOR_MINOR_VERSION: 3.5
-  ENABLE_REGISTRY_CACHE: "true"
+  USE_GITHUB_REGISTRY: "true"
   CACHE_IMAGE_PREFIX: ${{ github.repository }}
   CACHE_REGISTRY_USERNAME: ${{ github.actor }}
   CACHE_REGISTRY_PASSWORD: ${{ secrets.GITHUB_TOKEN }}
diff --git a/BREEZE.rst b/BREEZE.rst
index 0393369..3e9305a 100644
--- a/BREEZE.rst
+++ b/BREEZE.rst
@@ -293,7 +293,7 @@ Manage environments - CI (default) or Production - if ``--production-image`` fla
 
 Interact with CI environment:
 
-    * Run test target specified with ``breeze test-target`` command
+    * Run test target specified with ``breeze tests`` command
     * Execute arbitrary command in the test environment with ``breeze execute-command`` command
     * Execute arbitrary docker-compose command with ``breeze docker-compose`` command
 
@@ -709,7 +709,6 @@ This is the current syntax for  `./breeze <./breeze>`_:
     generate-requirements                    Generates pinned requirements for pip dependencies
     push-image                               Pushes images to registry
     initialize-local-virtualenv              Initializes local virtualenv
-    kind-cluster                             Manages KinD cluster on the host
     setup-autocomplete                       Sets up autocomplete for breeze
     stop                                     Stops the docker-compose environment
     restart                                  Stops the docker-compose environment including DB cleanup
@@ -719,11 +718,9 @@ This is the current syntax for  `./breeze <./breeze>`_:
   Commands with arguments:
 
     docker-compose                <ARG>      Executes specified docker-compose command
-    execute-command               <ARG>      Executes specified command in the container
     kind-cluster                  <ARG>      Manages KinD cluster on the host
     static-check                  <ARG>      Performs selected static check for changed files
-    static-check-all-files        <ARG>      Performs selected static check for all files
-    test-target                   <ARG>      Runs selected test target in the container
+    tests                         <ARG>      Runs selected tests in the container
 
   Help commands:
 
@@ -741,7 +738,7 @@ This is the current syntax for  `./breeze <./breeze>`_:
   Detailed usage for command: shell
 
 
-  breeze shell [FLAGS] -- <EXTRA_ARGS>
+  breeze shell [FLAGS] [-- <EXTRA_ARGS>]
 
         This is default subcommand if no subcommand is used.
 
@@ -758,6 +755,11 @@ This is the current syntax for  `./breeze <./breeze>`_:
         and webserver ports are forwarded to appropriate database/webserver so that you can
         connect to it from your host environment.
 
+        You can also pass <EXTRA_ARGS> after --; they will be passed as bash parameters. This is
+        especially useful for passing bash options, for example -c to execute a command:
+
+        'breeze shell -- -c "ls -la"'
+
   Flags:
 
   Run 'breeze flags' to see all applicable flags.
@@ -795,6 +797,7 @@ This is the current syntax for  `./breeze <./breeze>`_:
 
   -p, --python <PYTHON_MAJOR_MINOR_VERSION>
           Python version used for the image. This is always major/minor version.
+
           One of:
 
                  2.7 3.5 3.6 3.7
@@ -858,10 +861,10 @@ This is the current syntax for  `./breeze <./breeze>`_:
   -H, --dockerhub-repo
           DockerHub repository used to pull, push, build images. Default: airflow.
 
-  -c, --registry-cache
-          If registry cache is enabled, pulls and pushes are done from the registry cache in github.
-          You need to be logged in to the registry in order to be able to pull/push from it and you
-          need to be committer to push to airflow registry.
+  -c, --github-registry
+          If GitHub registry is enabled, pulls and pushes are done from the GitHub registry not
+          DockerHub. You need to be logged in to the registry in order to be able to pull/push from it
+          and you need to be a committer to push to Apache Airflow's GitHub registry.
 
   -G, --github-organisation
           GitHub organisation used to pull, push images when cache is used. Default: apache.
@@ -891,6 +894,7 @@ This is the current syntax for  `./breeze <./breeze>`_:
 
   -p, --python <PYTHON_MAJOR_MINOR_VERSION>
           Python version used for the image. This is always major/minor version.
+
           One of:
 
                  2.7 3.5 3.6 3.7
@@ -910,7 +914,7 @@ This is the current syntax for  `./breeze <./breeze>`_:
   Detailed usage for command: exec
 
 
-  breeze exec
+  breeze exec [-- <EXTRA_ARGS>]
 
         Execs into an interactive shell in an already running container. The container must be started
         already by the breeze shell command. If you are not familiar with tmux, this is the best
@@ -936,6 +940,7 @@ This is the current syntax for  `./breeze <./breeze>`_:
 
   -p, --python <PYTHON_MAJOR_MINOR_VERSION>
           Python version used for the image. This is always major/minor version.
+
           One of:
 
                  2.7 3.5 3.6 3.7
@@ -955,7 +960,7 @@ This is the current syntax for  `./breeze <./breeze>`_:
   breeze push_image [FLAGS]
 
         Pushes images to docker registry. You can push the images to DockerHub registry (default)
-        or to the GitHub cache registry (if --registry-cache flag is used).
+        or to the GitHub registry (if --github-registry flag is used).
 
         For DockerHub pushes --dockerhub-user and --dockerhub-repo flags can be used to specify
         the repository to push to. For GitHub repository --github-organisation and --github-repo
@@ -968,8 +973,8 @@ This is the current syntax for  `./breeze <./breeze>`_:
         'breeze push-image' or
         'breeze push-image --dockerhub-user user' to push to your private registry or
         'breeze push-image --production-image' - to push production image or
-        'breeze push-image --registry-cache' - to push to GitHub cache or
-        'breeze push-image --registry-cache --github-organisation org' - for other organisation
+        'breeze push-image --github-registry' - to push to GitHub image registry or
+        'breeze push-image --github-registry --github-organisation org' - for other organisation
 
   Flags:
 
@@ -979,10 +984,10 @@ This is the current syntax for  `./breeze <./breeze>`_:
   -H, --dockerhub-repo
           DockerHub repository used to pull, push, build images. Default: airflow.
 
-  -c, --registry-cache
-          If registry cache is enabled, pulls and pushes are done from the registry cache in github.
-          You need to be logged in to the registry in order to be able to pull/push from it and you
-          need to be committer to push to airflow registry.
+  -c, --github-registry
+          If GitHub registry is enabled, pulls and pushes are done from the GitHub registry not
+          DockerHub. You need to be logged in to the registry in order to be able to pull/push from it
+          and you need to be a committer to push to Apache Airflow's GitHub registry.
 
   -G, --github-organisation
           GitHub organisation used to pull, push images when cache is used. Default: apache.
@@ -1014,76 +1019,11 @@ This is the current syntax for  `./breeze <./breeze>`_:
 
   -p, --python <PYTHON_MAJOR_MINOR_VERSION>
           Python version used for the image. This is always major/minor version.
-          One of:
-
-                 2.7 3.5 3.6 3.7
-
 
-  ####################################################################################################
-
-
-  Detailed usage for command: kind-cluster
-
-
-  breeze kind-cluster [FLAGS] OPERATION
-
-        Manages host-side Kind Kubernetes cluster that is used to run Kubernetes integration tests.
-        It allows to start/stop/restart/status the Kind Kubernetes cluster and deploy Airflow to it.
-        This enables you to run tests inside the breeze environment with latest airflow images loaded.
-        Note that in case of deploying airflow, the first step is to rebuild the image and loading it
-        to the cluster so you can also pass appropriate build image flags that will influence
-        rebuilding the production image. Operation is one of:
-
-                 start stop restart status deploy test
-
-  Flags:
-
-  -p, --python <PYTHON_MAJOR_MINOR_VERSION>
-          Python version used for the image. This is always major/minor version.
           One of:
 
                  2.7 3.5 3.6 3.7
 
-  -F, --force-build-images
-          Forces building of the local docker images. The images are rebuilt
-          automatically for the first time or when changes are detected in
-          package-related files, but you can force it using this flag.
-
-  -P, --force-pull-images
-          Forces pulling of images from DockerHub before building to populate cache. The
-          images are pulled by default only for the first time you run the
-          environment, later the locally build images are used as cache.
-
-  -E, --extras
-          Extras to pass to build images The default are different for CI and production images:
-
-          CI image:
-                 devel_ci
-
-          Production image:
-                 async,aws,azure,celery,dask,elasticsearch,gcp,kubernetes,mysql,postgres,redis,slack,
-                 ssh,statsd,virtualenv
-
-  --additional-extras
-          Additional extras to pass to build images The default is no additional extras.
-
-  --additional-python-deps
-          Additional python dependencies to use when building the images.
-
-  --additional-dev-deps
-          Additional apt dev dependencies to use when building the images.
-
-  --additional-runtime-deps
-          Additional apt runtime dependencies to use when building the images.
-
-  -C, --force-clean-images
-          Force build images with cache disabled. This will remove the pulled or build images
-          and start building images from scratch. This might take a long time.
-
-  -L, --use-local-cache
-          Uses local cache to build images. No pulled images will be used, but results of local
-          builds in the Docker cache are used instead.
-
 
   ####################################################################################################
 
@@ -1156,7 +1096,7 @@ This is the current syntax for  `./breeze <./breeze>`_:
   Detailed usage for command: docker-compose
 
 
-  breeze docker-compose [FLAGS] COMMAND -- <EXTRA_ARGS>
+  breeze docker-compose [FLAGS] COMMAND [-- <EXTRA_ARGS>]
 
         Run docker-compose command instead of entering the environment. Use 'help' as command
         to see available commands. The <EXTRA_ARGS> passed after -- are treated
@@ -1168,54 +1108,7 @@ This is the current syntax for  `./breeze <./breeze>`_:
 
   -p, --python <PYTHON_MAJOR_MINOR_VERSION>
           Python version used for the image. This is always major/minor version.
-          One of:
 
-                 2.7 3.5 3.6 3.7
-
-  -b, --backend <BACKEND>
-          Backend to use for tests - it determines which database is used.
-          One of:
-
-                 sqlite mysql postgres
-
-          Default: sqlite
-
-  --postgres-version <POSTGRES_VERSION>
-          Postgres version used. One of:
-
-                 9.6 10
-
-  --mysql-version <MYSQL_VERSION>
-          Mysql version used. One of:
-
-                 5.6 5.7
-
-  -v, --verbose
-          Show verbose information about executed commands (enabled by default for running test).
-          Note that you can further increase verbosity and see all the commands executed by breeze
-          by running 'export VERBOSE_COMMANDS="true"' before running breeze.
-
-
-  ####################################################################################################
-
-
-  Detailed usage for command: execute-command
-
-
-  breeze execute-command [FLAGS] COMMAND -- <EXTRA_ARGS>
-
-        Run chosen command instead of entering the environment. The command is run using
-        'bash -c "<command with args>" if you need to pass arguments to your command, you need
-        to pass them together with command surrounded with " or '. Alternatively you can
-        pass arguments as <EXTRA_ARGS> passed after --. For example:
-
-        'breeze execute-command "ls -la"' or
-        'breeze execute-command ls -- --la'
-
-  Flags:
-
-  -p, --python <PYTHON_MAJOR_MINOR_VERSION>
-          Python version used for the image. This is always major/minor version.
           One of:
 
                  2.7 3.5 3.6 3.7
@@ -1265,6 +1158,7 @@ This is the current syntax for  `./breeze <./breeze>`_:
 
   -p, --python <PYTHON_MAJOR_MINOR_VERSION>
           Python version used for the image. This is always major/minor version.
+
           One of:
 
                  2.7 3.5 3.6 3.7
@@ -1316,7 +1210,7 @@ This is the current syntax for  `./breeze <./breeze>`_:
   Detailed usage for command: static-check
 
 
-  breeze static-check [FLAGS] STATIC_CHECK
+  breeze static-check [FLAGS] STATIC_CHECK [-- <EXTRA_ARGS>]
 
         Run selected static checks for currently changed files. You should specify static check that
         you would like to run or 'all' to run all checks. One of:
@@ -1334,6 +1228,7 @@ This is the current syntax for  `./breeze <./breeze>`_:
 
         'breeze static-check mypy' or
         'breeze static-check mypy -- --files tests/core.py'
+        'breeze static-check mypy -- --all-files'
 
         You can see all the options by adding --help EXTRA_ARG:
 
@@ -1343,46 +1238,18 @@ This is the current syntax for  `./breeze <./breeze>`_:
   ####################################################################################################
 
 
-  Detailed usage for command: static-check-all-files
-
-
-  breeze static-check-all [FLAGS] STATIC_CHECK
-
-        Run selected static checks for all applicable files. You should specify static check that
-        you would like to run or 'all' to run all checks. One of:
-
-                 all airflow-config-yaml bat-tests build check-apache-license
-                 check-executables-have-shebangs check-hooks-apply check-integrations
-                 check-merge-conflict check-xml debug-statements detect-private-key doctoc
-                 end-of-file-fixer fix-encoding-pragma flake8 forbid-tabs insert-license
-                 language-matters lint-dockerfile mixed-line-ending mypy pydevd python2-compile
-                 python2-fastcheck python-no-log-warn rst-backticks setup-order shellcheck
-                 trailing-whitespace update-breeze-file update-extras update-local-yml-file yamllint
-
-        You can pass extra arguments including options to the pre-commit framework as
-        <EXTRA_ARGS> passed after --. For example:
-
-        'breeze static-check-all-files mypy' or
-        'breeze static-check-all-files mypy -- --verbose'
-
-        You can see all the options by adding --help EXTRA_ARG:
-
-        'breeze static-check-all-files mypy -- --help'
-
-
-  ####################################################################################################
-
-
-  Detailed usage for command: test-target
+  Detailed usage for command: tests
 
 
-  breeze test-target [FLAGS] TEST_TARGET -- <EXTRA_ARGS>
+  breeze tests [FLAGS] [TEST_TARGET ..] [-- <EXTRA_ARGS>]
 
         Run the specified unit test target. There might be multiple
         targets specified separated with commas. The <EXTRA_ARGS> passed after -- are treated
-        as additional options passed to pytest. For example:
+        as additional options passed to pytest. You can pass 'tests' as target to
+        run all tests. For example:
 
-        'breeze test-target tests/test_core.py -- --logging-level=DEBUG'
+        'breeze tests tests/test_core.py -- --logging-level=DEBUG'
+        'breeze tests tests'
 
   Flags:
 
@@ -1432,6 +1299,7 @@ This is the current syntax for  `./breeze <./breeze>`_:
 
   -p, --python <PYTHON_MAJOR_MINOR_VERSION>
           Python version used for the image. This is always major/minor version.
+
           One of:
 
                  2.7 3.5 3.6 3.7
@@ -1594,10 +1462,10 @@ This is the current syntax for  `./breeze <./breeze>`_:
   -H, --dockerhub-repo
           DockerHub repository used to pull, push, build images. Default: airflow.
 
-  -c, --registry-cache
-          If registry cache is enabled, pulls and pushes are done from the registry cache in github.
-          You need to be logged in to the registry in order to be able to pull/push from it and you
-          need to be committer to push to airflow registry.
+  -c, --github-registry
+          If GitHub registry is enabled, pulls and pushes are done from the GitHub registry not
+          DockerHub. You need to be logged in to the registry in order to be able to pull/push from it
+          and you need to be a committer to push to Apache Airflow's GitHub registry.
 
   -G, --github-organisation
           GitHub organisation used to pull, push images when cache is used. Default: apache.
@@ -1621,17 +1489,6 @@ This is the current syntax for  `./breeze <./breeze>`_:
 
  .. END BREEZE HELP MARKER
 
-Convenience Scripts
--------------------
-
-Once you run ``./breeze`` you can also execute various actions via generated convenience scripts:
-
-.. code-block::
-
-   Enter the environment          : ./.build/cmd_run
-   Run command in the environment : ./.build/cmd_run "[command with args]" [bash options]
-   Run tests in the environment   : ./.build/test_run [test-target] [pytest options]
-   Run Docker compose command     : ./.build/dc [help/pull/...] [docker-compose options]
 
 Troubleshooting
 ===============
diff --git a/Dockerfile b/Dockerfile
index 7c722cf..56fdf0f 100644
--- a/Dockerfile
+++ b/Dockerfile
@@ -158,6 +158,23 @@ ENV PIP_VERSION=${PIP_VERSION}
 
 RUN pip install --upgrade pip==${PIP_VERSION}
 
+ARG AIRFLOW_REPO=apache/airflow
+ENV AIRFLOW_REPO=${AIRFLOW_REPO}
+
+ARG AIRFLOW_BRANCH=master
+ENV AIRFLOW_BRANCH=${AIRFLOW_BRANCH}
+
+ARG AIRFLOW_EXTRAS
+ARG ADDITIONAL_AIRFLOW_EXTRAS=""
+ENV AIRFLOW_EXTRAS=${AIRFLOW_EXTRAS}${ADDITIONAL_AIRFLOW_EXTRAS:+,}${ADDITIONAL_AIRFLOW_EXTRAS}
+
+# In case of the production build image segment we want to pre-install the master version of airflow
+# dependencies from GitHub so that we do not have to always reinstall them from scratch.
+RUN pip install --user \
+    "https://github.com/${AIRFLOW_REPO}/archive/${AIRFLOW_BRANCH}.tar.gz#egg=apache-airflow[${AIRFLOW_EXTRAS}]" \
+        --constraint "https://raw.githubusercontent.com/${AIRFLOW_REPO}/${AIRFLOW_BRANCH}/requirements/requirements-python${PYTHON_MAJOR_MINOR_VERSION}.txt" \
+    && pip uninstall --yes apache-airflow;
+
 ARG AIRFLOW_SOURCES_FROM="."
 ENV AIRFLOW_SOURCES_FROM=${AIRFLOW_SOURCES_FROM}
 
@@ -172,10 +189,6 @@ ENV CASS_DRIVER_BUILD_CONCURRENCY=${CASS_DRIVER_BUILD_CONCURRENCY}
 ARG AIRFLOW_VERSION
 ENV AIRFLOW_VERSION=${AIRFLOW_VERSION}
 
-ARG AIRFLOW_EXTRAS
-ARG ADDITIONAL_AIRFLOW_EXTRAS=""
-ENV AIRFLOW_EXTRAS=${AIRFLOW_EXTRAS}${ADDITIONAL_AIRFLOW_EXTRAS:+,}${ADDITIONAL_AIRFLOW_EXTRAS}
-
 ARG ADDITIONAL_PYTHON_DEPS=""
 ENV ADDITIONAL_PYTHON_DEPS=${ADDITIONAL_PYTHON_DEPS}
 
@@ -215,13 +228,6 @@ RUN \
         rm -rf "${WWW_DIR}/node_modules"; \
     fi
 
-ARG ENTRYPOINT_FILE="entrypoint.sh"
-ENV ENTRYPOINT_FILE="${ENTRYPOINT_FILE}"
-
-# hadolint ignore=DL3020
-ADD ${ENTRYPOINT_FILE} /entrypoint
-RUN chmod a+x /entrypoint
-
 ##############################################################################################
 # This is the actual Airflow image - much smaller than the build one. We copy
 # installed Airflow and all it's dependencies from the build image to make it smaller.
@@ -334,7 +340,11 @@ RUN mkdir -pv "${AIRFLOW_HOME}"; \
     chown -R "airflow" "${AIRFLOW_HOME}"
 
 COPY --chown=airflow:airflow --from=airflow-build-image /root/.local "/home/airflow/.local"
-COPY --chown=airflow:airflow --from=airflow-build-image /entrypoint /entrypoint
+
+COPY scripts/prod/entrypoint_prod.sh /entrypoint
+COPY scripts/prod/clean-logs.sh /clean-logs
+
+RUN chmod a+x /entrypoint /clean-logs
 
 USER airflow
 
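The RUN instruction added to the build segment above pre-installs the master dependency set and then removes Airflow itself, purely to warm the Docker layer cache. The same idea expressed outside Docker, as a hedged sketch (repo, branch, extras and python version are illustrative):

    # Install apache-airflow from a GitHub tarball, pinned by the matching
    # requirements file, then uninstall airflow itself - its dependencies stay
    # installed, so a later install of local sources is much faster.
    AIRFLOW_REPO=apache/airflow
    AIRFLOW_BRANCH=master
    PYTHON_MAJOR_MINOR_VERSION=3.6

    pip install --user \
      "https://github.com/${AIRFLOW_REPO}/archive/${AIRFLOW_BRANCH}.tar.gz#egg=apache-airflow[all]" \
      --constraint "https://raw.githubusercontent.com/${AIRFLOW_REPO}/${AIRFLOW_BRANCH}/requirements/requirements-python${PYTHON_MAJOR_MINOR_VERSION}.txt"
    pip uninstall --yes apache-airflow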
diff --git a/Dockerfile.ci b/Dockerfile.ci
index 4c9741b..232711b 100644
--- a/Dockerfile.ci
+++ b/Dockerfile.ci
@@ -222,10 +222,11 @@ ENV AIRFLOW_CI_BUILD_EPOCH=${AIRFLOW_CI_BUILD_EPOCH}
 # In case of CI builds we want to pre-install master version of airflow dependencies so that
 # We do not have to always reinstall it from the scratch.
 # This can be reinstalled from latest master by increasing PIP_DEPENDENCIES_EPOCH_NUMBER.
-# And is automatically reinstalled from the scratch with every python patch level release
-RUN pip install "https://github.com/${AIRFLOW_REPO}/archive/${AIRFLOW_BRANCH}.tar.gz#egg=apache-airflow[${AIRFLOW_EXTRAS}]" \
+# And is automatically reinstalled from scratch every time a patch release of python gets released
+RUN pip install \
+    "https://github.com/${AIRFLOW_REPO}/archive/${AIRFLOW_BRANCH}.tar.gz#egg=apache-airflow[${AIRFLOW_EXTRAS}]" \
         --constraint "https://raw.githubusercontent.com/${AIRFLOW_REPO}/${AIRFLOW_BRANCH}/requirements/requirements-python${PYTHON_MAJOR_MINOR_VERSION}.txt" \
-    && pip uninstall --yes apache-airflow
+    && pip uninstall --yes apache-airflow;
 
 # Link dumb-init for backwards compatibility (so that older images also work)
 RUN ln -sf /usr/bin/dumb-init /usr/local/bin/dumb-init
@@ -274,7 +275,8 @@ COPY airflow/www_rbac/static ${AIRFLOW_SOURCES}/airflow/www_rbac/static/
 # Package JS/css for production
 RUN yarn --cwd airflow/www_rbac run prod
 
-COPY entrypoint.sh /entrypoint.sh
+COPY scripts/ci/in_container/entrypoint_ci.sh /entrypoint
+RUN chmod a+x /entrypoint
 
 # Copy selected subdirectories only
 COPY .github/ ${AIRFLOW_SOURCES}/.github/
@@ -290,9 +292,6 @@ COPY .coveragerc .rat-excludes .flake8 LICENSE MANIFEST.in NOTICE CHANGELOG.txt
      setup.cfg setup.py \
      ${AIRFLOW_SOURCES}/
 
-# Needed for building images via docker-in-docker inside the docker
-COPY Dockerfile.ci ${AIRFLOW_SOURCES}/Dockerfile.ci
-
 # Install autocomplete for airflow
 RUN register-python-argcomplete airflow >> ~/.bashrc
 
@@ -316,6 +315,4 @@ ENV PATH="${HOME}:${PATH}"
 
 EXPOSE 8080
 
-ENTRYPOINT ["/usr/bin/dumb-init", "--", "/entrypoint.sh"]
-
-CMD ["--help"]
+ENTRYPOINT ["/usr/bin/dumb-init", "--", "/entrypoint"]
diff --git a/IMAGES.rst b/IMAGES.rst
index 3add528..4cdf86d 100644
--- a/IMAGES.rst
+++ b/IMAGES.rst
@@ -53,13 +53,13 @@ also change the repository itself by adding ``--dockerhub-user`` and ``--dockerh
 
 You can build the CI image using this command:
 
-.. code-block::
+.. code-block:: bash
 
   ./breeze build-image
 
 You can build production image using this command:
 
-.. code-block::
+.. code-block:: bash
 
   ./breeze build-image --production-image
 
@@ -73,7 +73,7 @@ can change the extras via the ``--extras`` parameters. You can see default extra
 For example if you want to build python 3.7 version of production image with
 "all" extras installed you should run this command:
 
-.. code-block::
+.. code-block:: bash
 
   ./breeze build-image --python 3.7 --extras "all" --production-image
 
@@ -90,42 +90,132 @@ In Breeze by default, the airflow is installed using local sources of Apache Air
 You can also build production images from PIP packages via providing ``--install-airflow-version``
 parameter to Breeze:
 
-.. code-block::
+.. code-block:: bash
 
   ./breeze build-image --python 3.7 --extras=gcp --production-image --install-airflow-version=1.10.9
 
 This will build the image using command similar to:
 
-.. code-block::
+.. code-block:: bash
 
-    pip install apache-airflow[gcp]==1.10.9 \
+    pip install apache-airflow[gcp]==1.10.9 \
        --constraint https://raw.githubusercontent.com/apache/airflow/v1-10-test/requirements/requirements-python3.7.txt
 
-This will also download entrypoint script from https://raw.githubusercontent.com/apache/airflow/v1-10-test/entrypoint.sh
-url. It is important so that we have matching version of the requirements.
-
-The requirement files and entrypoint only appeared in version 1.10.10 of airflow so if you install
+The requirement files only appeared in version 1.10.10 of Airflow, so if you install
 an earlier version -  both constraint and requirements should point to 1.10.10 version.
 
 You can also build production images from specific Git version via providing ``--install-airflow-reference``
 parameter to Breeze:
 
-.. code-block::
+.. code-block:: bash
 
     pip install https://github.com/apache/airflow/archive/<tag>.tar.gz#egg=apache-airflow \
        --constraint https://raw.githubusercontent.com/apache/airflow/<tag>/requirements/requirements-python3.7.txt
 
-This will also Download entrypoint script from ``https://raw.githubusercontent.com/apache/airflow/<tag>/entrypoint.sh``
-url.
+Using cache during builds
+=========================
+
+The default mechanism used in Breeze for building images uses - as base - images pulled from DockerHub or
+GitHub Image Registry. This is in order to speed up local builds and CI builds - instead of 15 minutes
+for a rebuild of CI images, it usually takes less than 3 minutes when the cache is used. For CI builds this
+is usually the best strategy - to use the default "pull" cache. The same applies to the Production Image -
+it's better to rely on the "pull" mechanism rather than rebuild the image from scratch.
+
+However, when you are iterating on the images and want to rebuild them quickly and often, you can provide
+the ``--use-local-cache`` flag to build commands - this way the standard docker mechanism based on the
+local cache will be used. The first time you run it, it will take considerably longer than the default
+pull mechanism, but when you then make small, incremental changes to the local sources, the Dockerfile and
+the scripts, further rebuilds with ``--use-local-cache`` will be considerably faster.
+
+.. code-block:: bash
+
+  ./breeze build-image --python 3.7 --production-image --use-local-cache
+
+You can also turn on local docker caching by setting the DOCKER_CACHE variable to "local" instead of the
+default "pulled" and exporting it to Breeze.
+
+.. code-block:: bash
+
+  export DOCKER_CACHE="local"
+
+You can also - if you really want - disable caching altogether by setting this variable to "no-cache".
+This is how "scheduled" builds in our CI are run - those builds take a long time because they
+always rebuild everything from scratch.
+
+.. code-block:: bash
+
+  export DOCKER_CACHE="no-cache"
+
+
+Choosing image registry
+=======================
+
+By default images are pulled and pushed from and to DockerHub registry when you use Breeze's push-image
+or build commands.
+
+Our images are named as follows:
+
+.. code-block:: bash
+
+  apache/airflow:<BRANCH_OR_TAG>[-<PATCH>]-pythonX.Y         - for production images
+  apache/airflow:<BRANCH_OR_TAG>[-<PATCH>]-pythonX.Y-ci      - for CI images
+  apache/airflow:<BRANCH_OR_TAG>[-<PATCH>]-pythonX.Y-build   - for production build stage
+
+For example:
+
+.. code-block:: bash
+
+  apache/airflow:master-python3.6                - production "latest" image from current master
+  apache/airflow:master-python3.6-ci             - CI "latest" image from current master
+  apache/airflow:v1-10-test-python2.7-ci         - CI "latest" image from current v1-10-test branch
+  apache/airflow:1.10.10-python3.6               - production image for 1.10.10 release
+  apache/airflow:1.10.10-1-python3.6             - production image for 1.10.10 with some patches applied
+
+
+You can see DockerHub images at `<https://hub.docker.com/repository/docker/apache/airflow>`_
+
+By default DockerHub registry is used when you push or pull such images.
+However, for CI builds we keep the images in the GitHub registry as well - this way we can easily push
+the images automatically after merge requests and use such images for Pull Requests
+as cache - which makes CI builds much faster (images are available in the cache
+right after a merged request in master finishes its build). The difference is visible especially when
+significant changes are done in the Dockerfile.CI.
+
+The images are named differently (in the Docker definition of image names - the registry URL is part of the
+image name if DockerHub is not used as the registry). Also, GitHub has its own structure for registries -
+each project has its own registry naming convention that should be followed. The names of the
+images for the GitHub registry are:
+
+.. code-block:: bash
+
+  docker.pkg.github.com/apache/airflow/<BRANCH>-pythonX.Y       - for production images
+  docker.pkg.github.com/apache/airflow/<BRANCH>-pythonX.Y-ci    - for CI images
+  docker.pkg.github.com/apache/airflow/<BRANCH>-pythonX.Y-build - for production build stage
+
+Note that we never push or pull TAG images to the GitHub registry. It is only used for CI builds.
+
+You can see all the current GitHub images at `<https://github.com/apache/airflow/packages>`_
+
+In order to interact with the GitHub images you need to add the ``--github-registry`` flag to the pull/push
+commands in Breeze. This way the images will be pulled/pushed from/to GitHub rather than from/to
+DockerHub. Images are built locally as ``apache/airflow`` images but then they are tagged with the right
+GitHub tags for you.
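+
+For example, pushing a locally built CI image to the GitHub registry could look like this (a sketch - it
+assumes the image has already been built and that you are logged in to the registry):
+
+.. code-block:: bash
+
+  ./breeze push-image --github-registry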
+
+You can read more about the CI configuration and how CI builds are using DockerHub/GitHub images
+in `<CI.rst>`_.
+
+Note that you need to be a committer and have the right to push to DockerHub and GitHub, and you need to
+be logged in. Only committers can push images directly.
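+
+For example, logging in before pushing typically looks like this (a sketch - the GitHub registry hostname
+is the one used in the image names above):
+
+.. code-block:: bash
+
+  docker login                           # DockerHub
+  docker login docker.pkg.github.com     # GitHub registry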
+
 
 Technical details of Airflow images
 ===================================
 
 The CI image is used by Breeze as shell image but it is also used during CI build.
 The image is single segment image that contains Airflow installation with "all" dependencies installed.
-It is optimised for rebuild speed It installs PIP dependencies from the current branch first -
-so that any changes in setup.py do not trigger
-reinstalling of all dependencies. There is a second step of installation that re-installs the dependencies
+It is optimised for rebuild speed. It installs PIP dependencies from the current branch first -
+so that any changes in setup.py do not trigger reinstalling of all dependencies.
+There is a second step of installation that re-installs the dependencies
 from the latest sources so that we are sure that latest dependencies are installed.
 
 The production image is a multi-segment image. The first segment "airflow-build-image" contains all the
@@ -135,6 +225,11 @@ build it from local sources. This is particularly useful in CI environment where
 to run Kubernetes tests. See below for the list of arguments that should be provided to build
 production image from the local sources.
 
+The image is primarily optimised for the size of the final image, but also for the speed of rebuilds - the
+'airflow-build-image' segment uses the same technique as the CI builds for pre-installing PIP dependencies.
+It first pre-installs them from the right GitHub branch and only after that is the final airflow installation
+done from either local sources or a remote location (PIP or GitHub repository).
+
 Manually building the images
 ----------------------------
 
@@ -180,11 +275,10 @@ The following build arguments (``--build-arg`` in docker build command) can be u
 |                                          |                                          | done for cassandra driver (much faster)  |
 +------------------------------------------+------------------------------------------+------------------------------------------+
 | ``AIRFLOW_REPO``                         | ``apache/airflow``                       | the repository from which PIP            |
-|                                          |                                          | dependencies are installed (CI           |
-|                                          |                                          | optimised)                               |
+|                                          |                                          | dependencies are pre-installed           |
 +------------------------------------------+------------------------------------------+------------------------------------------+
 | ``AIRFLOW_BRANCH``                       | ``master``                               | the branch from which PIP dependencies   |
-|                                          |                                          | are installed (CI optimised)             |
+|                                          |                                          | are pre-installed                        |
 +------------------------------------------+------------------------------------------+------------------------------------------+
 | ``AIRFLOW_CI_BUILD_EPOCH``               | ``1``                                    | increasing this value will reinstall PIP |
 |                                          |                                          | dependencies from the repository from    |
@@ -192,9 +286,9 @@ The following build arguments (``--build-arg`` in docker build command) can be u
 +------------------------------------------+------------------------------------------+------------------------------------------+
 | ``AIRFLOW_EXTRAS``                       | ``all``                                  | extras to install                        |
 +------------------------------------------+------------------------------------------+------------------------------------------+
-| ``ADDITIONAL_AIRFLOW_EXTRAS``            | ````                                     | additional extras to install             |
+| ``ADDITIONAL_AIRFLOW_EXTRAS``            |                                          | additional extras to install             |
 +------------------------------------------+------------------------------------------+------------------------------------------+
-| ``ADDITIONAL_PYTHON_DEPS``               | \```\`                                   | additional python dependencies to        |
+| ``ADDITIONAL_PYTHON_DEPS``               |                                          | additional python dependencies to        |
 |                                          |                                          | install                                  |
 +------------------------------------------+------------------------------------------+------------------------------------------+
 | ``ADDITIONAL_DEV_DEPS``                  | ````                                     | additional apt dev dependencies to       |
@@ -208,7 +302,7 @@ Here are some examples of how CI images can built manually. CI is always built f
 
 This builds the CI image in version 3.7 with default extras ("all").
 
-.. code-block::
+.. code-block:: bash
 
   docker build . -f Dockerfile.ci --build-arg PYTHON_BASE_IMAGE="python:3.7-slim-buster" \
     --build-arg PYTHON_MAJOR_MINOR_VERSION=3.7
@@ -216,7 +310,7 @@ This builds the CI image in version 3.7 with default extras ("all").
 
 This builds the CI image in version 3.6 with "gcp" extra only.
 
-.. code-block::
+.. code-block:: bash
 
   docker build . -f Dockerfile.ci --build-arg PYTHON_BASE_IMAGE="python:3.7-slim-buster" \
     --build-arg PYTHON_MAJOR_MINOR_VERSION=3.6 --build-arg AIRFLOW_EXTRAS=gcp
@@ -224,14 +318,14 @@ This builds the CI image in version 3.6 with "gcp" extra only.
 
 This builds the CI image in version 3.6 with "apache-beam" extra added.
 
-.. code-block::
+.. code-block:: bash
 
   docker build . -f Dockerfile.ci --build-arg PYTHON_BASE_IMAGE="python:3.7-slim-buster" \
     --build-arg PYTHON_MAJOR_MINOR_VERSION=3.6 --build-arg ADDITIONAL_AIRFLOW_EXTRAS="apache-beam"
 
 This builds the CI image in version 3.6 with "mssql" additional package added.
 
-.. code-block::
+.. code-block:: bash
 
   docker build . -f Dockerfile.ci --build-arg PYTHON_BASE_IMAGE="python:3.7-slim-buster" \
     --build-arg PYTHON_MAJOR_MINOR_VERSION=3.6 --build-arg ADDITIONAL_PYTHON_DEPS="mssql"
@@ -270,8 +364,11 @@ The following build arguments (``--build-arg`` in docker build command) can be u
 | ``AIRFLOW_ORG``                          | ``apache``                               | Github organisation from which Airflow   |
 |                                          |                                          | is installed (when installed from repo)  |
 +------------------------------------------+------------------------------------------+------------------------------------------+
-| ``AIRFLOW_REPO``                         | ``airflow``                              | Github repository from which Airflow is  |
-|                                          |                                          | installed (when installed from repo)     |
+| ``AIRFLOW_REPO``                         | ``apache/airflow``                       | the repository from which PIP            |
+|                                          |                                          | dependencies are pre-installed           |
++------------------------------------------+------------------------------------------+------------------------------------------+
+| ``AIRFLOW_BRANCH``                       | ``master``                               | the branch from which PIP dependencies   |
+|                                          |                                          | are pre-installed                        |
 +------------------------------------------+------------------------------------------+------------------------------------------+
 | ``AIRFLOW_GIT_REFERENCE``                | ``master``                               | reference (branch or tag) from Github    |
 |                                          |                                          | repository from which Airflow is         |
@@ -285,10 +382,10 @@ The following build arguments (``--build-arg`` in docker build command) can be u
 | ``AIRFLOW_EXTRAS``                       | (see Dockerfile)                         | Default extras with which airflow is     |
 |                                          |                                          | installed                                |
 +------------------------------------------+------------------------------------------+------------------------------------------+
-| ``ADDITIONAL_AIRFLOW_EXTRAS``            | ````                                     | Optional additional extras with which    |
+| ``ADDITIONAL_AIRFLOW_EXTRAS``            |                                          | Optional additional extras with which    |
 |                                          |                                          | airflow is installed                     |
 +------------------------------------------+------------------------------------------+------------------------------------------+
-| ``ADDITIONAL_PYTHON_DEPS``               | ````                                     | Optional python packages to extend       |
+| ``ADDITIONAL_PYTHON_DEPS``               |                                          | Optional python packages to extend       |
 |                                          |                                          | the image with some extra dependencies   |
 +------------------------------------------+------------------------------------------+------------------------------------------+
 | ``ADDITIONAL_DEV_DEPS``                  | ````                                     | additional apt dev dependencies to       |
@@ -342,25 +439,20 @@ production image. There are three types of build:
 |                                   | the package or from GitHub URL.   |
 |                                   | See examples below                |
 +-----------------------------------+-----------------------------------+
-| ``ENTRYPOINT_FILE``               | Should point to entrypoint.sh     |
-|                                   | file in case of installation from |
-|                                   | the package or from GitHub URL.   |
-|                                   | See examples below                |
-+-----------------------------------+-----------------------------------+
 | ``AIRFLOW_WWW``                   | In case of Airflow 2.0 it should  |
 |                                   | be "www", in case of Airflow 1.10 |
 |                                   | series it should be "www_rbac".   |
 |                                   | See examples below                |
 +-----------------------------------+-----------------------------------+
 | ``AIRFLOW_SOURCES_FROM``          | Sources of Airflow. Set it to     |
-|                                   | "entrypoint.sh" to avoid costly   |
+|                                   | "empty" to avoid costly           |
 |                                   | Docker context copying            |
 |                                   | in case of installation from      |
 |                                   | the package or from GitHub URL.   |
 |                                   | See examples below                |
 +-----------------------------------+-----------------------------------+
 | ``AIRFLOW_SOURCES_TO``            | Target for Airflow sources. Set   |
-|                                   | to "/entrypoint" to avoid costly  |
+|                                   | to "/empty" to avoid costly       |
 |                                   | Docker context copying            |
 |                                   | in case of installation from      |
 |                                   | the package or from GitHub URL.   |
@@ -368,9 +460,10 @@ production image. There are three types of build:
 +-----------------------------------+-----------------------------------+
 
 
-This builds production image in version 3.6 with default extras from the local sources:
+This builds the production image in version 3.6 with default extras from the local sources (currently the
+master version of 2.0):
 
-.. code-block::
+.. code-block:: bash
 
   docker build .
 
@@ -379,46 +472,48 @@ requirements taken from v1-10-test branch in Github.
 Note that versions 1.10.9 and below have no requirements so requirements should be taken from head of
 the 1.10.10 tag.
 
-.. code-block::
+.. code-block:: bash
 
   docker build . \
     --build-arg PYTHON_BASE_IMAGE="python:3.7-slim-buster" \
     --build-arg PYTHON_MAJOR_MINOR_VERSION=3.7 \
     --build-arg AIRFLOW_INSTALL_SOURCES="https://github.com/apache/airflow/archive/1.10.10.tar.gz#egg=apache-airflow" \
     --build-arg CONSTRAINT_REQUIREMENTS="https://raw.githubusercontent.com/apache/airflow/1.10.10/requirements/requirements-python3.7.txt" \
-    --build-arg ENTRYPOINT_FILE="https://raw.githubusercontent.com/apache/airflow/1.10.10/entrypoint.sh" \
-    --build-arg AIRFLOW_SOURCES_FROM="entrypoint.sh" \
-    --build-arg AIRFLOW_SOURCES_TO="/entrypoint"
+    --build-arg AIRFLOW_BRANCH="v1-10-test" \
+    --build-arg AIRFLOW_SOURCES_FROM="empty" \
+    --build-arg AIRFLOW_SOURCES_TO="/empty"
 
 This builds the production image in version 3.7 with default extras from 1.10.10 Pypi package and
-requirements taken from v1-10-test branch in Github.
+requirements taken from the 1.10.10 tag in GitHub and pre-installed pip dependencies from the top
+of the v1-10-test branch.
 
-.. code-block::
+.. code-block:: bash
 
   docker build . \
     --build-arg PYTHON_BASE_IMAGE="python:3.7-slim-buster" \
     --build-arg PYTHON_MAJOR_MINOR_VERSION=3.7 \
     --build-arg AIRFLOW_INSTALL_SOURCES="apache-airflow" \
     --build-arg AIRFLOW_INSTALL_VERSION="==1.10.10" \
+    --build-arg AIRFLOW_BRANCH="v1-10-test" \
     --build-arg CONSTRAINT_REQUIREMENTS="https://raw.githubusercontent.com/apache/airflow/1.10.10/requirements/requirements-python3.7.txt" \
-    --build-arg ENTRYPOINT_FILE="https://raw.githubusercontent.com/apache/airflow/1.10.10/entrypoint.sh" \
-    --build-arg AIRFLOW_SOURCES_FROM="entrypoint.sh" \
-    --build-arg AIRFLOW_SOURCES_TO="/entrypoint"
+    --build-arg AIRFLOW_SOURCES_FROM="empty" \
+    --build-arg AIRFLOW_SOURCES_TO="/empty"
 
 This builds the production image in version 3.7 with additional airflow extras from 1.10.10 Pypi package and
-additional python dependencies.
+additional python dependencies, and pre-installed pip dependencies from the top
+of the v1-10-test branch.
 
-.. code-block::
+.. code-block:: bash
 
   docker build . \
     --build-arg PYTHON_BASE_IMAGE="python:3.7-slim-buster" \
     --build-arg PYTHON_MAJOR_MINOR_VERSION=3.7 \
     --build-arg AIRFLOW_INSTALL_SOURCES="apache-airflow" \
     --build-arg AIRFLOW_INSTALL_VERSION="==1.10.10" \
+    --build-arg AIRFLOW_BRANCH="v1-10-test" \
     --build-arg CONSTRAINT_REQUIREMENTS="https://raw.githubusercontent.com/apache/airflow/1.10.10/requirements/requirements-python3.7.txt" \
-    --build-arg ENTRYPOINT_FILE="https://raw.githubusercontent.com/apache/airflow/1.10.10/entrypoint.sh" \
-    --build-arg AIRFLOW_SOURCES_FROM="entrypoint.sh" \
-    --build-arg AIRFLOW_SOURCES_TO="/entrypoint" \
+    --build-arg AIRFLOW_SOURCES_FROM="empty" \
+    --build-arg AIRFLOW_SOURCES_TO="/empty" \
     --build-arg ADDITIONAL_AIRFLOW_EXTRAS="mssql,hdfs"
     --build-arg ADDITIONAL_PYTHON_DEPS="sshtunnel oauth2client"
 
@@ -446,7 +541,7 @@ Image manifests
 Together with the main CI images we also build and push image manifests. Those manifests are very small images
 that contain only results of the docker inspect for the image. This is in order to be able to
 determine very quickly if the image in the docker registry has changed a lot since the last time.
-Unfortunately docker registry (specifically dockerhub registry) has no anonymous way of querying image
+Unfortunately docker registry (specifically DockerHub registry) has no anonymous way of querying image
 details via API, you need to download the image to inspect it. We overcame it in the way that
 always when we build the image we build a very small image manifest and push it to registry together
 with the main CI image. The tag for the manifest image is the same as for the image it refers
@@ -465,29 +560,158 @@ You can do it via the ``--force-pull-images`` flag to force pulling the latest i
 
 For production image:
 
-.. code-block::
+.. code-block:: bash
 
   ./breeze build-image --force-pull-images --production-image
 
 For CI image Breeze automatically uses force pulling in case it determines that your image is very outdated,
 however you can also force it with the same flag.
 
-.. code-block::
+.. code-block:: bash
 
   ./breeze build-image --force-pull-images
 
-Using the images
-================
 
-Both images have entrypoint set as dumb-init with entrypoint.sh script executed (in order to forward
-signals). This entrypoint works as follows:
+Embedded image scripts
+======================
+
+Both images have a set of scripts that can be used in the image. Those are:
+ * /entrypoint - entrypoint script used when entering the image
+ * /clean-logs - script for periodic log cleaning
+
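+For example, the log-cleaning script can be invoked directly by overriding the entrypoint (a sketch - the
+image tag is only an example):
+
+.. code-block:: bash
+
+  docker run --entrypoint /clean-logs apache/airflow:master-python3.6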
+
+Running the CI image
+====================
+
+The entrypoint in the CI image contains all the initialisation needed for tests to be immediately executed.
+It is copied from ``scripts/ci/in_container/entrypoint_ci.sh``.
+
+The default behaviour is that you are dropped into a bash shell. However, if the RUN_TESTS variable is
+set to "true", then tests passed as arguments are executed.
+
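+In practice this entrypoint is usually exercised through Breeze rather than by running the image directly,
+for example:
+
+.. code-block:: bash
+
+  ./breeze tests tests/test_core.py::TestCore
+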
+The entrypoint performs these operations:
+
+* checks if the environment is ready for testing (including the database and all integrations). It waits
+  until all the components are ready to work
+
+* installs an older version of Airflow (if an older version of Airflow is requested to be installed
+  via the ``INSTALL_AIRFLOW_VERSION`` variable)
+
+* Sets up Kerberos if Kerberos integration is enabled (generates and configures Kerberos token)
+
+* Sets up ssh keys for ssh tests and restarts the SSH server
+
+* Sets all variables and configurations needed for unit tests to run
+
+* Reads additional variables set in ``files/airflow-breeze-config/variables.env`` by sourcing that file
+
+* In case of a CI run, sets parallelism to 2 to avoid running an excessive number of processes
+
+* In case of a CI run, sets default parameters for pytest
+
+* In case of integration/long_running/quarantined tests, sets the right pytest flags
+
+* Sets the default "tests" target in case the target is not explicitly set as an additional argument
+
+* Runs system tests if RUN_SYSTEM_TESTS flag is specified, otherwise runs regular unit and integration tests
+
+
+Using the PROD image
+====================
+
+The PROD image entrypoint works as follows:
 
 * If ``AIRFLOW__CORE__SQL_ALCHEMY_CONN`` variable is passed to the container and it is either mysql or postgres
   SQL alchemy connection, then the connection is checked and the script waits until the database is reachable.
+
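+For example, starting a webserver against a PostgreSQL database might look like this (a sketch - the
+connection string and the image tag are placeholders for your own setup):
+
+.. code-block:: bash
+
+  docker run -it \
+    -e AIRFLOW__CORE__SQL_ALCHEMY_CONN="postgresql+psycopg2://airflow:airflow@postgres/airflow" \
+    apache/airflow:master-python3.6 webserver
+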
 * If no ``AIRFLOW__CORE__SQL_ALCHEMY_CONN`` variable is set or if it is set to sqlite SQL alchemy connection
   then db reset is executed.
+
 * If ``AIRFLOW__CELERY__BROKER_URL`` variable is passed and scheduler, worker or flower command is used then
   the connection is checked and the script waits until the Celery broker database is reachable.
 
-* If no argument is specified - you are dropped in bash shell.
-* If there are any arguments they are passed to "airflow" command
+* If the first argument equals "bash" - you are dropped into a bash shell or a bash command is executed if
+  you specify extra arguments. For example:
+
+.. code-block:: bash
+
+  docker run -it apache/airflow:master-python3.6 bash -c "ls -la"
+  total 16
+  drwxr-xr-x 4 airflow root 4096 Jun  5 18:12 .
+  drwxr-xr-x 1 root    root 4096 Jun  5 18:12 ..
+  drwxr-xr-x 2 airflow root 4096 Jun  5 18:12 dags
+  drwxr-xr-x 2 airflow root 4096 Jun  5 18:12 logs
+
+* If the first argument is equal to "python" - you are dropped into a python shell or python commands are
+  executed if you pass extra parameters. For example:
+
+.. code-block:: bash
+
+  > docker run -it apache/airflow:master-python3.6 python -c "print('test')"
+  test
+
+* If there are any other arguments - they are passed to the "airflow" command
+
+.. code-block:: bash
+
+  > docker run -it apache/airflow:master-python3.6 --help
+
+  usage: airflow [-h]
+                 {celery,config,connections,dags,db,info,kerberos,plugins,pools,roles,rotate_fernet_key,scheduler,sync_perm,tasks,users,variables,version,webserver}
+                 ...
+
+  positional arguments:
+
+    Groups:
+      celery              Start celery components
+      connections         List/Add/Delete connections
+      dags                List and manage DAGs
+      db                  Database operations
+      pools               CRUD operations on pools
+      roles               Create/List roles
+      tasks               List and manage tasks
+      users               CRUD operations on users
+      variables           CRUD operations on variables
+
+    Commands:
+      config              Show current application configuration
+      info                Show information about current Airflow and environment
+      kerberos            Start a kerberos ticket renewer
+      plugins             Dump information about loaded plugins
+      rotate_fernet_key   Rotate encrypted connection credentials and variables
+      scheduler           Start a scheduler instance
+      sync_perm           Update permissions for existing roles and DAGs
+      version             Show the version
+      webserver           Start a Airflow webserver instance
+
+  optional arguments:
+    -h, --help            show this help message and exit
+
+
+Alpha versions of 1.10.10 production-ready images
+=================================================
+
+The production images were released for the first time in the 1.10.10 release of Airflow as "Alpha" quality
+ones. Since 1.10.10 the images are being improved and the 1.10.10 images should be patched and
+published several times separately in order to test them with the upcoming Helm Chart.
+
+Those images are for development and testing only and should not be used outside of the
+development community.
+
+The images were pushed with tags following the pattern: ``apache/airflow:1.10.10.1-alphaN-pythonX.Y``.
+Patch level is an increasing number (starting from 1).
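+
+For example, pulling one of those alpha images could look like this (a sketch - the patch level and python
+version are just an illustration):
+
+.. code-block:: bash
+
+  docker pull apache/airflow:1.10.10.1-alpha2-python3.6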
+
+Those are alpha-quality releases; however, they contain the officially released Airflow ``1.10.10`` code.
+The main changes in the images are scripts embedded in the images.
+
+The following versions were pushed:
+
++-------+--------------------------------+----------------------------------------------------------+
+| Patch | Tag pattern                    | Description                                              |
++=======+================================+==========================================================+
+| 1     | ``1.10.10.1-alpha1-pythonX.Y`` | Support for parameters added to bash and python commands |
++-------+--------------------------------+----------------------------------------------------------+
+| 2     | ``1.10.10.1-alpha2-pythonX.Y`` | Added "/clean-logs" script                               |
++-------+--------------------------------+----------------------------------------------------------+
+
+The commits used to generate those images are tagged with ``prod-image-1.10.10.1-alphaN`` tags.
diff --git a/TESTING.rst b/TESTING.rst
index ea6884e..36c1427 100644
--- a/TESTING.rst
+++ b/TESTING.rst
@@ -131,23 +131,23 @@ Running Tests for a Specified Target Using Breeze from the Host
 ---------------------------------------------------------------
 
 If you wish to only run tests and not to drop into shell, apply the
-``-t``, ``--test-target`` flag. You can add extra pytest flags after ``--`` in the command line.
+``tests`` command. You can add extra targets and pytest flags after the ``tests`` command.
 
 .. code-block:: bash
 
-     ./breeze test-target tests/hooks/test_druid_hook.py -- --logging-level=DEBUG
+     ./breeze tests tests/hooks/test_druid_hook.py tests/tests_core.py --logging-level=DEBUG
 
-You can run the whole test suite with a special '.' test target:
+You can run the whole test suite with a 'tests' test target:
 
 .. code-block:: bash
 
-    ./breeze test-target .
+    ./breeze tests tests
 
 You can also specify individual tests or a group of tests:
 
 .. code-block:: bash
 
-    ./breeze test-target tests/test_core.py::TestCore
+    ./breeze tests tests/test_core.py::TestCore
 
 
 Airflow Integration Tests
diff --git a/breeze b/breeze
index 9e1c746..d78eaf2 100755
--- a/breeze
+++ b/breeze
@@ -68,7 +68,7 @@ function setup_default_breeze_variables() {
     # We have different versions of images depending on the python version used. We keep up with the
     # Latest patch-level changes in Python (this is done automatically during CI builds) so we have
     # To only take into account MAJOR and MINOR version of python. This variable keeps the major/minor
-    # version of python in X.Y format (3.6, 3.7 etc).
+    # version of python in X.Y format (2.7, 3.5, 3.6, 3.7 etc).
     export PYTHON_MAJOR_MINOR_VERSION="${PYTHON_MAJOR_MINOR_VERSION:=$(read_from_file PYTHON_MAJOR_MINOR_VERSION)}"
 
     # When we generate documentation for README files, we want to force the width of terminal so that
@@ -103,12 +103,6 @@ function setup_default_breeze_variables() {
     # Determines if help should be run (set to true by --help flag)
     RUN_HELP="false"
 
-    # Holds chosen command to run in case 'execute-command' command is used.
-    RUN_COMMAND=""
-
-    # Holds the test target if the 'test-target' command is used.
-    TEST_TARGET=""
-
     # Holds docker compose command if the `docker-compose` command is used.
     DOCKER_COMPOSE_COMMAND=""
 
@@ -426,12 +420,24 @@ function prepare_command_file() {
     local TESTS="${3}"
     local COMPOSE_FILE="${4}"
     local AIRFLOW_IMAGE="${5}"
-    local EXPANSION="${6-@}"
     cat <<EOF > "${FILE}"
 #!/usr/bin/env bash
-cd "\$(pwd)" || exit
+if [[ \${VERBOSE} == "true" ]]; then
+  echo
+  echo "Executing script:"
+  echo
+  echo "\${BASH_SOURCE[0]} \${@}"
+  echo
+  set -x
+fi
+cd "\$( dirname "\${BASH_SOURCE[0]}" )" || exit
 export DOCKERHUB_USER=${DOCKERHUB_USER}
 export DOCKERHUB_REPO=${DOCKERHUB_REPO}
+HOST_USER_ID=\$(id -ur)
+export HOST_USER_ID
+HOST_GROUP_ID=\$(id -gr)
+export HOST_GROUP_ID
+export HOST_AIRFLOW_SOURCES="${AIRFLOW_SOURCES}"
 export COMPOSE_FILE="${COMPOSE_FILE}"
 export PYTHON_MAJOR_MINOR_VERSION="${PYTHON_MAJOR_MINOR_VERSION}"
 export BACKEND="${BACKEND}"
@@ -446,10 +452,10 @@ export MYSQL_VERSION="${MYSQL_VERSION}"
 export AIRFLOW_SOURCES="${AIRFLOW_SOURCES}"
 export MYSQL_ENCODING="${MYSQL_ENCODING}"
 export AIRFLOW_CI_IMAGE="${AIRFLOW_CI_IMAGE}"
-export AIRFOW_PROD_IMAGE="${AIRFLOW_PROD_IMAGE}"
+export AIRFLOW_PROD_IMAGE="${AIRFLOW_PROD_IMAGE}"
 export AIRFLOW_IMAGE="${AIRFLOW_IMAGE}"
 export SQLITE_URL="${SQLITE_URL}"
-docker-compose --log-level INFO ${CMD}\$${EXPANSION}"
+docker-compose --log-level INFO ${CMD}
 EOF
     chmod u+x "${FILE}"
 }
@@ -502,43 +508,22 @@ function prepare_command_files() {
     export COMPOSE_CI_FILE
     export COMPOSE_PROD_FILE
 
-    CI_ENTRYPOINT_FILE="/opt/airflow/scripts/ci/in_container/entrypoint_ci.sh"
-    PROD_ENTRYPOINT_FILE="/entrypoint"
-
     # Base python image for the build
     export PYTHON_BASE_IMAGE=python:${PYTHON_MAJOR_MINOR_VERSION}-slim-buster
     export AIRFLOW_CI_IMAGE="${DOCKERHUB_USER}/${DOCKERHUB_REPO}:${BRANCH_NAME}-python${PYTHON_MAJOR_MINOR_VERSION}-ci"
     export AIRFLOW_PROD_IMAGE="${DOCKERHUB_USER}/${DOCKERHUB_REPO}:${BRANCH_NAME}-python${PYTHON_MAJOR_MINOR_VERSION}"
     export BUILT_IMAGE_FLAG_FILE="${BUILD_CACHE_DIR}/${BRANCH_NAME}/.built_${PYTHON_MAJOR_MINOR_VERSION}"
 
-    DC_RUN_CI_COMMAND="run --service-ports --rm airflow \"${CI_ENTRYPOINT_FILE} "
-    DC_RUN_PROD_COMMAND="run --service-ports --rm airflow \"${PROD_ENTRYPOINT_FILE} "
-
-    LAST_DC_RUN_CI_FILE="cmd_run_ci"
-    LAST_DC_RUN_PROD_FILE="cmd_run_prod"
-    LAST_DC_TEST_CI_FILE="test_run_ci"
     LAST_DC_CI_FILE="dc_ci"
     LAST_DC_PROD_FILE="dc_prod"
 
-    # Prepare script for "run ci command"
-    prepare_command_file "${BUILD_CACHE_DIR}/${LAST_DC_RUN_CI_FILE}" \
-        "${DC_RUN_CI_COMMAND}" "false" "${COMPOSE_CI_FILE}" "${AIRFLOW_CI_IMAGE}" '*'
-
-    # Prepare script for "run prod command"
-    prepare_command_file "${BUILD_CACHE_DIR}/${LAST_DC_RUN_PROD_FILE}" \
-        "${DC_RUN_PROD_COMMAND}" "false" "${COMPOSE_PROD_FILE}" "${AIRFLOW_PROD_IMAGE}" '*'
-
-    # Prepare script for "run test"
-    prepare_command_file "${BUILD_CACHE_DIR}/${LAST_DC_TEST_CI_FILE}" \
-        "${DC_RUN_CI_COMMAND}" "true" "${COMPOSE_CI_FILE}" "${AIRFLOW_CI_IMAGE}" '*'
-
-    # Prepare script for "run docker compose command"
+    # Prepare script for "run docker compose CI command"
     prepare_command_file "${BUILD_CACHE_DIR}/${LAST_DC_CI_FILE}" \
-        '"' "false" "${COMPOSE_CI_FILE}" "${AIRFLOW_CI_IMAGE}"
+        "\"\${@}\"" "false" "${COMPOSE_CI_FILE}" "${AIRFLOW_CI_IMAGE}"
 
-    # Prepare script for "run docker compose prod command"
+    # Prepare script for "run docker compose PROD command"
     prepare_command_file "${BUILD_CACHE_DIR}/${LAST_DC_PROD_FILE}" \
-        '"' "false" "${COMPOSE_PROD_FILE}" "${AIRFLOW_PROD_IMAGE}"
+        "\"\${@}\"" "false" "${COMPOSE_PROD_FILE}" "${AIRFLOW_PROD_IMAGE}"
 }
 
 # Prints detailed help for all commands and flags. Used to generate documentation added to BREEZE.rst
@@ -747,11 +732,11 @@ function parse_arguments() {
           echo
           export FORWARD_CREDENTIALS="true"
           shift 1 ;;
-        -c|--registry-cache)
+        -c|--github-registry)
           echo
-          echo "Use cache for the container registry"
+          echo "Use github registry"
           echo
-          export ENABLE_REGISTRY_CACHE="true"
+          export USE_GITHUB_REGISTRY="true"
           shift ;;
         -G|--github-organisation)
           echo
@@ -819,10 +804,6 @@ function parse_arguments() {
           fi
           COMMAND_TO_RUN="run_docker_compose"
           ;;
-        execute-command)
-          LAST_SUBCOMMAND="${1}"
-          COMMAND_TO_RUN="run_in_bash"
-          shift ;;
         generate-requirements)
           LAST_SUBCOMMAND="${1}"
           COMMAND_TO_RUN="perform_generate_requirements"
@@ -834,7 +815,7 @@ function parse_arguments() {
         push-image)
           LAST_SUBCOMMAND="${1}"
           COMMAND_TO_RUN="perform_push_image"
-          SKIP_CHECK_REMOTE_IMAGE="true"
+          export SKIP_CHECK_REMOTE_IMAGE="true"
           shift ;;
         initialize-local-virtualenv)
           LAST_SUBCOMMAND="${1}"
@@ -885,33 +866,6 @@ function parse_arguments() {
             shift 2
           fi
           ;;
-        static-check-all-files)
-          LAST_SUBCOMMAND="${1}"
-          COMMAND_TO_RUN="perform_static_checks"
-          if [[ "$#" -lt 2 ]]; then
-            if [[ ${RUN_HELP} != "true" ]]; then
-              echo "You should specify static check that you would like to run or 'all' to run all checks."
-              echo
-              echo "One of :"
-              echo
-              echo "${_BREEZE_ALLOWED_STATIC_CHECKS:=}"
-              echo
-              echo "For example:"
-              echo
-              echo "${CMDNAME} static-check-all-files mypy"
-              echo
-              exit 1
-            else
-              shift
-            fi
-          else
-            export PYTHON_MAJOR_MINOR_VERSION=${STATIC_CHECK_PYTHON_MAJOR_MINOR_VERSION}
-            export STATIC_CHECK="${2:-}"
-            export STATIC_CHECK_ALL_FILES="true"
-            EXTRA_STATIC_CHECK_OPTIONS+=("--all-files" "--show-diff-on-failure")
-            shift 2
-          fi
-          ;;
         stop)
           LAST_SUBCOMMAND="${1}"
           COMMAND_TO_RUN="run_docker_compose"
@@ -926,14 +880,11 @@ function parse_arguments() {
           SECOND_COMMAND_TO_RUN="enter_breeze"
           echo "Restarts the environment. Includes emptying the databases."
           shift ;;
-        test-target)
+        tests)
           LAST_SUBCOMMAND="${1}"
           if [[ $# -lt 2 ]]; then
             RUN_HELP="true"
             shift
-          else
-            export TEST_TARGET="${2}"
-            shift 2
           fi
           COMMAND_TO_RUN="run_tests" ;;
         toggle-suppress-cheatsheet)
@@ -1033,7 +984,6 @@ function prepare_usage() {
     export USAGE_BUILD_IMAGE="Builds CI or Production docker image"
     export USAGE_CLEANUP_IMAGE="Cleans up the container image created"
     export USAGE_DOCKER_COMPOSE="Executes specified docker-compose command"
-    export USAGE_EXECUTE_COMMAND="Executes specified command in the container"
     export USAGE_FLAGS="Shows all breeze's flags"
     export USAGE_GENERATE_REQUIREMENTS="Generates pinned requirements for pip dependencies"
     export USAGE_INITIALIZE_LOCAL_VIRTUALENV="Initializes local virtualenv"
@@ -1046,14 +996,14 @@ function prepare_usage() {
     export USAGE_STATIC_CHECK_ALL_FILES="Performs selected static check for all files"
     export USAGE_TOGGLE_SUPPRESS_CHEATSHEET="Toggles on/off cheatsheet"
     export USAGE_TOGGLE_SUPPRESS_ASCIIART="Toggles on/off asciiart"
-    export USAGE_TEST_TARGET="Runs selected test target in the container"
+    export USAGE_TESTS="Runs selected tests in the container"
     export USAGE_HELP="Shows this help message"
     export USAGE_HELP_ALL="Shows detailed help for all commands and flags"
 
 
     # shellcheck disable=SC2089
     DETAILED_USAGE_SHELL="
-${CMDNAME} shell [FLAGS] -- <EXTRA_ARGS>
+${CMDNAME} shell [FLAGS] [-- <EXTRA_ARGS>]
 
       This is default subcommand if no subcommand is used.
 
@@ -1070,13 +1020,18 @@ ${CMDNAME} shell [FLAGS] -- <EXTRA_ARGS>
       and webserver ports are forwarded to appropriate database/webserver so that you can
       connect to it from your host environment.
 
+      You can also pass <EXTRA_ARGS> after --; they will be passed as bash parameters. This is
+      especially useful to pass bash options, for example -c to execute a command:
+
+      '${CMDNAME} shell -- -c \"ls -la\"'
+
 Flags:
 $(flag_footer)
 "
     # shellcheck disable=SC2090
     export DETAILED_USAGE_SHELL
     export DETAILED_USAGE_EXEC="
-${CMDNAME} exec
+${CMDNAME} exec [-- <EXTRA_ARGS>]
 
       Execs into interactive shell to an already running container. The container must be started
       already by breeze shell command. If you are not familiar with tmux, this is the best
@@ -1128,7 +1083,7 @@ $(flag_verbosity)
     export DETAILED_USAGE_CLEANUP_IMAGE
     # shellcheck disable=SC2089
     DETAILED_USAGE_DOCKER_COMPOSE="
-${CMDNAME} docker-compose [FLAGS] COMMAND -- <EXTRA_ARGS>
+${CMDNAME} docker-compose [FLAGS] COMMAND [-- <EXTRA_ARGS>]
 
       Run docker-compose command instead of entering the environment. Use 'help' as command
       to see available commands. The <EXTRA_ARGS> passed after -- are treated
@@ -1143,25 +1098,6 @@ $(flag_verbosity)
 "
     # shellcheck disable=SC2090
     export DETAILED_USAGE_DOCKER_COMPOSE
-    # shellcheck disable=SC2089
-    DETAILED_USAGE_EXECUTE_COMMAND="
-${CMDNAME} execute-command [FLAGS] COMMAND -- <EXTRA_ARGS>
-
-      Run chosen command instead of entering the environment. The command is run using
-      'bash -c \"<command with args>\" if you need to pass arguments to your command, you need
-      to pass them together with command surrounded with \" or '. Alternatively you can
-      pass arguments as <EXTRA_ARGS> passed after --. For example:
-
-      '${CMDNAME} execute-command \"ls -la\"' or
-      '${CMDNAME} execute-command ls -- --la'
-
-Flags:
-$(flag_airflow_variants)
-$(flag_backend_variants)
-$(flag_verbosity)
-"
-    # shellcheck disable=SC2090
-    export DETAILED_USAGE_EXECUTE_COMMAND
     export DETAILED_USAGE_FLAGS="
       Explains in detail all the flags that can be used with breeze.
 "
@@ -1199,7 +1135,7 @@ $(flag_airflow_variants)
 ${CMDNAME} push_image [FLAGS]
 
       Pushes images to docker registry. You can push the images to DockerHub registry (default)
-      or to the GitHub cache registry (if --registry-cache flag is used).
+      or to the GitHub registry (if --github-registry flag is used).
 
       For DockerHub pushes --dockerhub-user and --dockerhub-repo flags can be used to specify
       the repository to push to. For GitHub repository --github-organisation and --github-repo
@@ -1212,8 +1148,8 @@ ${CMDNAME} push_image [FLAGS]
       '${CMDNAME} push-image' or
       '${CMDNAME} push-image --dockerhub-user user' to push to your private registry or
       '${CMDNAME} push-image --production-image' - to push production image or
-      '${CMDNAME} push-image --registry-cache' - to push to GitHub cache or
-      '${CMDNAME} push-image --registry-cache --github-organisation org' - for other organisation
+      '${CMDNAME} push-image --github-registry' - to push to GitHub image registry or
+      '${CMDNAME} push-image --github-registry --github-organisation org' - for other organisation
 
 Flags:
 $(flag_pull_push_docker_images)
@@ -1264,7 +1200,7 @@ $(flag_footer)
 "
     export DETAILED_USAGE_RESTART
     export DETAILED_USAGE_STATIC_CHECK="
-${CMDNAME} static-check [FLAGS] STATIC_CHECK
+${CMDNAME} static-check [FLAGS] STATIC_CHECK [-- <EXTRA_ARGS>]
 
       Run selected static checks for currently changed files. You should specify static check that
       you would like to run or 'all' to run all checks. One of:
@@ -1276,44 +1212,29 @@ ${FORMATTED_STATIC_CHECKS}
 
       '${CMDNAME} static-check mypy' or
       '${CMDNAME} static-check mypy -- --files tests/core.py'
+      '${CMDNAME} static-check mypy -- --all-files'
 
       You can see all the options by adding --help EXTRA_ARG:
 
       '${CMDNAME} static-check mypy -- --help'
 "
-    export DETAILED_USAGE_STATIC_CHECK_ALL_FILES="
-${CMDNAME} static-check-all [FLAGS] STATIC_CHECK
-
-      Run selected static checks for all applicable files. You should specify static check that
-      you would like to run or 'all' to run all checks. One of:
-
-${FORMATTED_STATIC_CHECKS}
-
-      You can pass extra arguments including options to the pre-commit framework as
-      <EXTRA_ARGS> passed after --. For example:
-
-      '${CMDNAME} static-check-all-files mypy' or
-      '${CMDNAME} static-check-all-files mypy -- --verbose'
-
-      You can see all the options by adding --help EXTRA_ARG:
-
-      '${CMDNAME} static-check-all-files mypy -- --help'
-"
     # shellcheck disable=SC2089
-    DETAILED_USAGE_TEST_TARGET="
-${CMDNAME} test-target [FLAGS] TEST_TARGET -- <EXTRA_ARGS>
+    DETAILED_USAGE_TESTS="
+${CMDNAME} tests [FLAGS] [TEST_TARGET ..] [-- <EXTRA_ARGS>]
 
       Run the specified unit test target. There might be multiple
       targets specified separated with commas. The <EXTRA_ARGS> passed after -- are treated
-      as additional options passed to pytest. For example:
+      as additional options passed to pytest. You can pass 'tests' as target to
+      run all tests. For example:
 
-      '${CMDNAME} test-target tests/test_core.py -- --logging-level=DEBUG'
+      '${CMDNAME} tests tests/test_core.py -- --logging-level=DEBUG'
+      '${CMDNAME} tests tests'
 
 Flags:
 $(flag_footer)
 "
     # shellcheck disable=SC2090
-    export DETAILED_USAGE_TEST_TARGET
+    export DETAILED_USAGE_TESTS
     export DETAILED_USAGE_TOGGLE_SUPPRESS_CHEATSHEET="
 ${CMDNAME} toggle-suppress-cheatsheet
 
@@ -1412,6 +1333,7 @@ function flag_airflow_variants() {
       echo "
 -p, --python <PYTHON_MAJOR_MINOR_VERSION>
         Python version used for the image. This is always major/minor version.
+
         One of:
 
 ${FORMATTED_PYTHON_MAJOR_MINOR_VERSIONS}
@@ -1610,10 +1532,10 @@ function flag_pull_push_docker_images() {
 -H, --dockerhub-repo
         DockerHub repository used to pull, push, build images. Default: ${_BREEZE_DEFAULT_DOCKERHUB_REPO:=}.
 
--c, --registry-cache
-        If registry cache is enabled, pulls and pushes are done from the registry cache in github.
-        You need to be logged in to the registry in order to be able to pull/push from it and you
-        need to be committer to push to airflow registry.
+-c, --github-registry
+        If GitHub registry is enabled, pulls and pushes are done from the GitHub registry, not
+        DockerHub. You need to be logged in to the registry in order to be able to pull/push from it
+        and you need to be a committer to push to Apache Airflow's GitHub registry.
 
 -G, --github-organisation
         GitHub organisation used to pull, push images when cache is used. Default: ${_BREEZE_DEFAULT_GITHUB_ORGANISATION:=}.
@@ -1764,25 +1686,6 @@ function print_cheatsheet() {
         echo
         echo "                                  Airflow Breeze CHEATSHEET"
         echo
-        print_line
-        echo
-        echo
-        print_line
-        echo
-        echo " Bash scripts to run commands quickly:"
-        echo
-        echo "    * Enter the CI environment          : ${BUILD_CACHE_DIR}/${LAST_DC_RUN_CI_FILE}"
-        echo "    * Enter the production environment  : ${BUILD_CACHE_DIR}/${LAST_DC_RUN_PROD_FILE}"
-        echo "    * Run command in CI environment     : ${BUILD_CACHE_DIR}/${LAST_DC_RUN_CI_FILE} "\
-                                                           "[command with args] [bash options]"
-        echo "    * Run tests in CI environment       : ${BUILD_CACHE_DIR}/${LAST_DC_TEST_CI_FILE} "\
-                                                           "[test target] [pytest options]"
-        echo "    * Run docker-compose CI command     : ${BUILD_CACHE_DIR}/${LAST_DC_CI_FILE} "\
-                                                           "[docker compose command] [docker-compose options]"
-        echo "    * Run docker-compose production cmd : ${BUILD_CACHE_DIR}/${LAST_DC_PROD_FILE} "\
-                                                           "[docker compose command] [docker-compose options]"
-        echo
-
         set +e
         if ! command -v breeze; then
             print_line
@@ -1890,7 +1793,7 @@ function run_static_checks {
 # command chosen
 function run_build_command {
     case "${COMMAND_TO_RUN}" in
-        run_tests|run_docker_compose|run_in_bash)
+        run_tests|run_docker_compose)
             prepare_ci_build
             rebuild_ci_image_if_needed
             ;;
@@ -1965,9 +1868,9 @@ function run_breeze_command {
     case "${COMMAND_TO_RUN}" in
         enter_breeze)
             if [[ ${PRODUCTION_IMAGE} == "true" ]]; then
-                "${BUILD_CACHE_DIR}/${LAST_DC_RUN_PROD_FILE}"
+                "${BUILD_CACHE_DIR}/${LAST_DC_PROD_FILE}" run --service-ports --rm airflow "${@}"
             else
-                "${BUILD_CACHE_DIR}/${LAST_DC_RUN_CI_FILE}"
+                "${BUILD_CACHE_DIR}/${LAST_DC_CI_FILE}" run --service-ports --rm airflow "${@}"
             fi
             ;;
         run_exec)
@@ -1983,10 +1886,10 @@ function run_breeze_command {
               : "${AIRFLOW_TESTING_CONTAINER:?"ERROR! Breeze must be running in order to exec into running container"}"
             set -e
             docker exec -it "${AIRFLOW_TESTING_CONTAINER}" \
-                "/opt/airflow/scripts/ci/in_container/entrypoint_exec.sh"
+                "/opt/airflow/scripts/ci/in_container/entrypoint_exec.sh" "${@}"
             ;;
         run_tests)
-            "${BUILD_CACHE_DIR}/${LAST_DC_TEST_CI_FILE}" "\"${TEST_TARGET}\"" "$@"
+            "${BUILD_CACHE_DIR}/${LAST_DC_CI_FILE}" run --service-ports --rm airflow "$@"
             ;;
         run_docker_compose)
             set +u
@@ -1998,12 +1901,9 @@ function run_breeze_command {
             "${DC_FILE}" "${DOCKER_COMPOSE_COMMAND}" "${EXTRA_DC_OPTIONS[@]}" "$@"
             set -u
             ;;
-        run_in_bash)
-            "${BUILD_CACHE_DIR}/${LAST_DC_RUN_CI_FILE}" "${RUN_COMMAND}" "$@"
-            ;;
         perform_static_checks)
             make_sure_precommit_is_installed
-            run_static_checks "$@"
+            run_static_checks "${@}"
             ;;
         build_image)
             ;;
diff --git a/breeze-complete b/breeze-complete
index 6038014..6bde8fa 100644
--- a/breeze-complete
+++ b/breeze-complete
@@ -99,7 +99,7 @@ kubernetes-mode: kubernetes-version:
 skip-mounting-local-sources install-airflow-version: install-airflow-reference: db-reset
 verbose assume-yes assume-no assume-quit forward-credentials
 force-build-images force-pull-images production-image extras: force-clean-images use-local-cache
-dockerhub-user: dockerhub-repo: registry-cache github-organisation: github-repo:
+dockerhub-user: dockerhub-repo: github-registry github-organisation: github-repo:
 postgres-version: mysql-version:
 additional-extras: additional-python-deps: additional-dev-deps: additional-runtime-deps:
 "
@@ -113,7 +113,6 @@ exec
 generate-requirements
 push-image
 initialize-local-virtualenv
-kind-cluster
 setup-autocomplete
 stop
 restart
@@ -122,11 +121,9 @@ toggle-suppress-asciiart"
 
 export BREEZE_EXTRA_ARG_COMMANDS="
 docker-compose
-execute-command
 kind-cluster
 static-check
-static-check-all-files
-test-target"
+tests"
 
 export BREEZE_HELP_COMMANDS="
 flags
diff --git a/scripts/ci/ci_run_airflow_testing.sh b/scripts/ci/ci_run_airflow_testing.sh
index ad24a77..7b69a36 100755
--- a/scripts/ci/ci_run_airflow_testing.sh
+++ b/scripts/ci/ci_run_airflow_testing.sh
@@ -25,11 +25,7 @@ function run_airflow_testing_in_docker() {
       -f "${MY_DIR}/docker-compose/backend-${BACKEND}.yml" \
       "${INTEGRATIONS[@]}" \
       "${DOCKER_COMPOSE_LOCAL[@]}" \
-         run airflow \
-           '/opt/airflow/scripts/ci/in_container/entrypoint_ci.sh "${@}"' \
-           /opt/airflow/scripts/ci/in_container/entrypoint_ci.sh "${@}"
-         # Note the command is there twice (!) because it is passed via bash -c
-         # and bash -c starts passing parameters from $0. TODO: fixme
+         run airflow "${@}"
     set -u
 }
 
diff --git a/scripts/ci/docker-compose/base.yml b/scripts/ci/docker-compose/base.yml
index 34e31ff..0feea60 100644
--- a/scripts/ci/docker-compose/base.yml
+++ b/scripts/ci/docker-compose/base.yml
@@ -19,16 +19,9 @@ version: "2.2"
 services:
   airflow:
     image: ${AIRFLOW_IMAGE}
-    init: true
-    entrypoint: ["/bin/bash", "-c"]
     environment:
       - USER=root
       - ADDITIONAL_PATH=~/.local/bin
-      - HADOOP_DISTRO=cdh
-      - HADOOP_HOME=/opt/hadoop-cdh
-      - HADOOP_OPTS=-D/opt/krb5.conf
-      - HIVE_HOME=/opt/hive
-      - MINICLUSTER_HOME=/opt/minicluster
       - CELERY_BROKER_URLS=amqp://guest:guest@rabbitmq:5672,redis://redis:6379/0
       - BACKEND
       - CI
diff --git a/scripts/ci/docker-compose/local-prod.yml b/scripts/ci/docker-compose/local-prod.yml
index 6342d33..4ad5d7e 100644
--- a/scripts/ci/docker-compose/local-prod.yml
+++ b/scripts/ci/docker-compose/local-prod.yml
@@ -31,7 +31,7 @@ services:
       - ../../../.kube:/root/.kube:cached
       - ../../../files:/files:cached
       - ../../../dist:/dist:cached
-      - ../../../scripts/ci/in_container/entrypoint_ci.sh:/entrypoint_ci.sh:cached
+      - ../../../scripts/prod/entrypoint_prod.sh:/entrypoint:cached
       - ../../../setup.cfg:/opt/airflow/setup.cfg:cached
       - ../../../setup.py:/opt/airflow/setup.py:cached
       - ../../../tests:/opt/airflow/tests:cached
diff --git a/scripts/ci/docker-compose/local.yml b/scripts/ci/docker-compose/local.yml
index ff88c6c..822d49d 100644
--- a/scripts/ci/docker-compose/local.yml
+++ b/scripts/ci/docker-compose/local.yml
@@ -18,6 +18,8 @@
 version: "2.2"
 services:
   airflow:
+    stdin_open: true # docker run -i
+    tty: true        # docker run -t
     # We need to mount files an directories individually because some files
     # such apache_airflow.egg-info should not be mounted from host
     # we only mount those files that it makes sense to edit while developing
@@ -49,7 +51,7 @@ services:
       - ../../../pytest.ini:/opt/airflow/pytest.ini:cached
       - ../../../requirements:/opt/airflow/requirements:cached
       - ../../../scripts:/opt/airflow/scripts:cached
-      - ../../../scripts/ci/in_container/entrypoint_ci.sh:/entrypoint_ci.sh:cached
+      - ../../../scripts/ci/in_container/entrypoint_ci.sh:/entrypoint:cached
       - ../../../setup.cfg:/opt/airflow/setup.cfg:cached
       - ../../../setup.py:/opt/airflow/setup.py:cached
       - ../../../tests:/opt/airflow/tests:cached
diff --git a/scripts/ci/in_container/entrypoint_ci.sh b/scripts/ci/in_container/entrypoint_ci.sh
index 2a7535c..1a753cf 100755
--- a/scripts/ci/in_container/entrypoint_ci.sh
+++ b/scripts/ci/in_container/entrypoint_ci.sh
@@ -20,7 +20,7 @@ if [[ ${VERBOSE_COMMANDS:="false"} == "true" ]]; then
 fi
 
 # shellcheck source=scripts/ci/in_container/_in_container_script_init.sh
-. "$( dirname "${BASH_SOURCE[0]}" )/_in_container_script_init.sh"
+. /opt/airflow/scripts/ci/in_container/_in_container_script_init.sh
 
 AIRFLOW_SOURCES=$(cd "${MY_DIR}/../../.." || exit 1; pwd)
 
@@ -41,8 +41,6 @@ fi
 
 echo
 
-ARGS=( "$@" )
-
 RUN_TESTS=${RUN_TESTS:="true"}
 INSTALL_AIRFLOW_VERSION="${INSTALL_AIRFLOW_VERSION:=""}"
 
@@ -86,15 +84,6 @@ fi
 
 export RUN_AIRFLOW_1_10=${RUN_AIRFLOW_1_10:="false"}
 
-export HADOOP_DISTRO="${HADOOP_DISTRO:="cdh"}"
-export HADOOP_HOME="${HADOOP_HOME:="/opt/hadoop-cdh"}"
-
-if [[ ${VERBOSE} == "true" ]]; then
-    echo
-    echo "Using ${HADOOP_DISTRO} distribution of Hadoop from ${HADOOP_HOME}"
-    echo
-fi
-
 # Added to have run-tests on path
 export PATH=${PATH}:${AIRFLOW_SOURCES}
 
@@ -134,10 +123,6 @@ if [[ ${INTEGRATION_KERBEROS:="false"} == "true" ]]; then
 fi
 
 
-# Start MiniCluster
-java -cp "/opt/minicluster-1.1-SNAPSHOT/*" com.ing.minicluster.MiniCluster \
-    >"${AIRFLOW_HOME}/logs/minicluster.log" 2>&1 &
-
 # Set up ssh keys
 echo 'yes' | ssh-keygen -t rsa -C your_email@youremail.com -m PEM -P '' -f ~/.ssh/id_rsa \
     >"${AIRFLOW_HOME}/logs/ssh-keygen.log" 2>&1
@@ -160,25 +145,13 @@ ssh-keyscan -H localhost >> ~/.ssh/known_hosts 2>/dev/null
 # shellcheck source=scripts/ci/in_container/configure_environment.sh
 . "${MY_DIR}/configure_environment.sh"
 
-if [[ ${CI:=} == "true" && ${RUN_TESTS} == "true" ]] ; then
-    echo
-    echo " !!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!"
-    echo "  Setting default parallellism to 2 because we can run out of memory during tests on CI"
-    echo " !!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!"
-    echo
-    export AIRFLOW__CORE__PARALELLISM=2
-fi
+cd "${AIRFLOW_SOURCES}"
 
 set +u
 # If we do not want to run tests, we simply drop into bash
-if [[ "${RUN_TESTS}" == "false" ]]; then
-    if [[ ${#ARGS} == 0 ]]; then
-        exec /bin/bash
-    else
-        exec /bin/bash -c "$(printf "%q " "${ARGS[@]}")"
-    fi
+if [[ "${RUN_TESTS:=false}" != "true" ]]; then
+    exec /bin/bash "${@}"
 fi
-
 set -u
 
 if [[ "${CI}" == "true" ]]; then
@@ -199,10 +172,11 @@ else
     CI_ARGS=()
 fi
 
-TESTS_TO_RUN="tests/"
+declare -a TESTS_TO_RUN
+TESTS_TO_RUN=("tests")
 
 if [[ ${#@} -gt 0 && -n "$1" ]]; then
-    TESTS_TO_RUN="$1"
+    TESTS_TO_RUN=("${@}")
 fi
 
 if [[ -n ${RUN_INTEGRATION_TESTS:=""} ]]; then
@@ -227,7 +201,7 @@ elif [[ ${ONLY_RUN_QUARANTINED_TESTS:=""} == "true" ]]; then
         "--timeout" "90")
 fi
 
-ARGS=("${CI_ARGS[@]}" "${TESTS_TO_RUN}")
+ARGS=("${CI_ARGS[@]}" "${TESTS_TO_RUN[@]}")
 
 if [[ ${RUN_SYSTEM_TESTS:="false"} == "true" ]]; then
     "${MY_DIR}/run_system_tests.sh" "${ARGS[@]}"
diff --git a/scripts/ci/in_container/entrypoint_exec.sh b/scripts/ci/in_container/entrypoint_exec.sh
index d675d91..cfee03a 100755
--- a/scripts/ci/in_container/entrypoint_exec.sh
+++ b/scripts/ci/in_container/entrypoint_exec.sh
@@ -18,4 +18,4 @@
 # shellcheck source=scripts/ci/in_container/configure_environment.sh
 . "$( dirname "${BASH_SOURCE[0]}" )/configure_environment.sh"
 
-exec /bin/bash
+exec /bin/bash "${@}"
diff --git a/scripts/ci/in_container/run_ci_tests.sh b/scripts/ci/in_container/run_ci_tests.sh
index fe5a574..dc86cf0 100755
--- a/scripts/ci/in_container/run_ci_tests.sh
+++ b/scripts/ci/in_container/run_ci_tests.sh
@@ -18,15 +18,12 @@
 # shellcheck source=scripts/ci/in_container/_in_container_script_init.sh
 . "$( dirname "${BASH_SOURCE[0]}" )/_in_container_script_init.sh"
 
-# any argument received is overriding the default nose execution arguments:
-PYTEST_ARGS=( "$@" )
-
 echo
-echo "Starting the tests with those pytest arguments: ${PYTEST_ARGS[*]}"
+echo "Starting the tests with those pytest arguments:" "${@}"
 echo
 set +e
 
-pytest "${PYTEST_ARGS[@]}"
+pytest "${@}"
 
 RES=$?
 
@@ -36,7 +33,7 @@ if [[ "${RES}" == "0" && ${CI:="false"} == "true" ]]; then
     bash <(curl -s https://codecov.io/bash)
 fi
 
-if [[ ${CI} == "true" ]]; then
+if [[ ${CI:=} == "true" ]]; then
     send_airflow_logs_to_file_io
 fi
 
diff --git a/scripts/ci/libraries/_build_images.sh b/scripts/ci/libraries/_build_images.sh
index a54d6d8..fcb2cc8 100644
--- a/scripts/ci/libraries/_build_images.sh
+++ b/scripts/ci/libraries/_build_images.sh
@@ -23,10 +23,9 @@
 function add_build_args_for_remote_install() {
     # entrypoint is used as AIRFLOW_SOURCES_FROM/TO in order to avoid costly copying of all sources of
     # Airflow - those are not needed for remote install at all. Entrypoint is later overwritten by
-    # ENTRYPOINT_FILE - downloaded entrypoint.sh so this is only for the purpose of iteration on Dockerfile
     EXTRA_DOCKER_PROD_BUILD_FLAGS+=(
-        "--build-arg" "AIRFLOW_SOURCES_FROM=entrypoint.sh"
-        "--build-arg" "AIRFLOW_SOURCES_TO=/entrypoint"
+        "--build-arg" "AIRFLOW_SOURCES_FROM=empty"
+        "--build-arg" "AIRFLOW_SOURCES_TO=/empty"
     )
     if [[ ${AIRFLOW_VERSION} =~ [^0-9]*1[^0-9]*10[^0-9]([0-9]*) ]]; then
         # All types of references/versions match this regexp for 1.10 series
@@ -36,21 +35,20 @@ function add_build_args_for_remote_install() {
             # This is only for 1.10.0 - 1.10.9
             EXTRA_DOCKER_PROD_BUILD_FLAGS+=(
                 "--build-arg" "CONSTRAINT_REQUIREMENTS=https://raw.githubusercontent.com/apache/airflow/1.10.10/requirements/requirements-python${PYTHON_MAJOR_MINOR_VERSION}.txt"
-                "--build-arg" "ENTRYPOINT_FILE=https://raw.githubusercontent.com/apache/airflow/1.10.10/entrypoint.sh"
             )
         else
             EXTRA_DOCKER_PROD_BUILD_FLAGS+=(
                 # For 1.10.10+ and v1-10-test it's ok to use AIRFLOW_VERSION as reference
                 "--build-arg" "CONSTRAINT_REQUIREMENTS=https://raw.githubusercontent.com/apache/airflow/${AIRFLOW_VERSION}/requirements/requirements-python${PYTHON_MAJOR_MINOR_VERSION}.txt"
-                "--build-arg" "ENTRYPOINT_FILE=https://raw.githubusercontent.com/apache/airflow/${AIRFLOW_VERSION}/entrypoint.sh"
             )
         fi
+        AIRFLOW_BRANCH_FOR_PYPI_PRELOADING="v1-10-test"
     else
         # For all other (master, 2.0+) we just match ${AIRFLOW_VERSION}
         EXTRA_DOCKER_PROD_BUILD_FLAGS+=(
             "--build-arg" "CONSTRAINT_REQUIREMENTS=https://raw.githubusercontent.com/apache/airflow/${AIRFLOW_VERSION}/requirements/requirements-python${PYTHON_MAJOR_MINOR_VERSION}.txt"
-            "--build-arg" "ENTRYPOINT_FILE=https://raw.githubusercontent.com/apache/airflow/${AIRFLOW_VERSION}/entrypoint.sh"
         )
+        AIRFLOW_BRANCH_FOR_PYPI_PRELOADING="master"
     fi
 }
 
@@ -205,10 +203,10 @@ function get_local_image_info() {
     TMP_MANIFEST_LOCAL_SHA=$(mktemp)
     set +e
     # Remove the container just in case
-    verbose_docker rm --force "local-airflow-manifest"  >/dev/null 2>&1
+    verbose_docker_hide_output_on_success rm --force "local-airflow-manifest"
     # Create manifest from the local manifest image
-    if ! verbose_docker create --name "local-airflow-manifest" \
-        "${AIRFLOW_CI_LOCAL_MANIFEST_IMAGE}"  >/dev/null 2>&1 ; then
+    if ! verbose_docker_hide_output_on_success create --name "local-airflow-manifest" \
+        "${AIRFLOW_CI_LOCAL_MANIFEST_IMAGE}"  >>"${OUTPUT_LOG}" 2>&1 ; then
         echo
         echo "Local manifest image not available"
         echo
@@ -217,9 +215,10 @@ function get_local_image_info() {
     fi
     set -e
      # Create manifest from the local manifest image
-    verbose_docker cp "local-airflow-manifest:${AIRFLOW_CI_BASE_TAG}.json" "${TMP_MANIFEST_LOCAL_JSON}" >/dev/null 2>&1
+    verbose_docker_hide_output_on_success cp "local-airflow-manifest:${AIRFLOW_CI_BASE_TAG}.json" \
+        "${TMP_MANIFEST_LOCAL_JSON}" >>"${OUTPUT_LOG}" 2>&1
     sed 's/ *//g' "${TMP_MANIFEST_LOCAL_JSON}" | grep '^"sha256:' >"${TMP_MANIFEST_LOCAL_SHA}"
-    verbose_docker rm --force "local-airflow-manifest" >/dev/null 2>&1
+    verbose_docker_hide_output_on_success rm --force "local-airflow-manifest" >>"${OUTPUT_LOG}" 2>&1
 }
 
 #
@@ -233,7 +232,7 @@ function get_local_image_info() {
 function get_remote_image_info() {
     set +e
     # Pull remote manifest image
-    if ! verbose_docker pull "${AIRFLOW_CI_REMOTE_MANIFEST_IMAGE}" >/dev/null; then
+    if ! verbose_docker_hide_output_on_success pull "${AIRFLOW_CI_REMOTE_MANIFEST_IMAGE}";  then
         echo
         echo "Remote docker registry unreachable"
         echo
@@ -250,12 +249,14 @@ function get_remote_image_info() {
     TMP_MANIFEST_REMOTE_JSON=$(mktemp)
     TMP_MANIFEST_REMOTE_SHA=$(mktemp)
     # Create container out of the manifest image without running it
-    verbose_docker create --cidfile "${TMP_CONTAINER_ID}" "${AIRFLOW_CI_REMOTE_MANIFEST_IMAGE}"
+    verbose_docker_hide_output_on_success create --cidfile "${TMP_CONTAINER_ID}" \
+        "${AIRFLOW_CI_REMOTE_MANIFEST_IMAGE}"
     # Extract manifest and store it in local file
-    verbose_docker cp "$(cat "${TMP_CONTAINER_ID}"):${AIRFLOW_CI_BASE_TAG}.json" "${TMP_MANIFEST_REMOTE_JSON}"
+    verbose_docker_hide_output_on_success cp "$(cat "${TMP_CONTAINER_ID}"):${AIRFLOW_CI_BASE_TAG}.json" \
+        "${TMP_MANIFEST_REMOTE_JSON}"
     # Filter everything except SHAs of image layers
     sed 's/ *//g' "${TMP_MANIFEST_REMOTE_JSON}" | grep '^"sha256:' >"${TMP_MANIFEST_REMOTE_SHA}"
-    verbose_docker rm --force "$( cat "${TMP_CONTAINER_ID}")"
+    verbose_docker_hide_output_on_success rm --force "$( cat "${TMP_CONTAINER_ID}")"
 }
 
 # The Number determines the cut-off between local building time and pull + build time.
@@ -273,7 +274,7 @@ function get_remote_image_info() {
 # Note that this only matters if you have any of the important files changed since the last build
 # of your image such as Dockerfile.ci, setup.py etc.
 #
-MAGIC_CUT_OFF_NUMBER_OF_LAYERS=34
+MAGIC_CUT_OFF_NUMBER_OF_LAYERS=36
 
 # Compares layers from both remote and local image and set FORCE_PULL_IMAGES to true in case
 # More than the last NN layers are different.
@@ -315,11 +316,11 @@ function print_build_info() {
 # Prepares all variables needed by the CI build. Depending on the configuration used (python version
 # DockerHub user etc. the variables are set so that other functions can use those variables.
 function prepare_ci_build() {
-    export AIRFLOW_CI_BASE_TAG="${DEFAULT_BRANCH}-python${PYTHON_MAJOR_MINOR_VERSION}-ci"
+    export AIRFLOW_CI_BASE_TAG="${BRANCH_NAME}-python${PYTHON_MAJOR_MINOR_VERSION}-ci"
     export AIRFLOW_CI_LOCAL_MANIFEST_IMAGE="local/${DOCKERHUB_REPO}:${AIRFLOW_CI_BASE_TAG}-manifest"
     export AIRFLOW_CI_REMOTE_MANIFEST_IMAGE="${DOCKERHUB_USER}/${DOCKERHUB_REPO}:${AIRFLOW_CI_BASE_TAG}-manifest"
     export AIRFLOW_CI_IMAGE="${DOCKERHUB_USER}/${DOCKERHUB_REPO}:${AIRFLOW_CI_BASE_TAG}"
-    if [[ ${ENABLE_REGISTRY_CACHE="false"} == "true" ]]; then
+    if [[ ${USE_GITHUB_REGISTRY="false"} == "true" ]]; then
         if [[ ${CACHE_REGISTRY_PASSWORD:=} != "" ]]; then
             echo "${CACHE_REGISTRY_PASSWORD}" | docker login \
                 --username "${CACHE_REGISTRY_USERNAME}" \
@@ -334,7 +335,7 @@ function prepare_ci_build() {
         export CACHED_PYTHON_BASE_IMAGE=""
     fi
     export AIRFLOW_BUILD_CI_IMAGE="${DOCKERHUB_USER}/${DOCKERHUB_REPO}/${AIRFLOW_CI_BASE_TAG}"
-    export AIRFLOW_CI_IMAGE_DEFAULT="${DOCKERHUB_USER}/${DOCKERHUB_REPO}:${DEFAULT_BRANCH}-ci"
+    export AIRFLOW_CI_IMAGE_DEFAULT="${DOCKERHUB_USER}/${DOCKERHUB_REPO}:${BRANCH_NAME}-ci"
     export PYTHON_BASE_IMAGE="python:${PYTHON_MAJOR_MINOR_VERSION}-slim-buster"
     export BUILT_IMAGE_FLAG_FILE="${BUILD_CACHE_DIR}/${BRANCH_NAME}/.built_${PYTHON_MAJOR_MINOR_VERSION}"
     if [[ "${DEFAULT_PYTHON_MAJOR_MINOR_VERSION}" == "${PYTHON_MAJOR_MINOR_VERSION}" ]]; then
@@ -569,10 +570,31 @@ Docker building ${AIRFLOW_CI_IMAGE}.
 # Prepares all variables needed by the CI build. Depending on the configuration used (python version
 # DockerHub user etc. the variables are set so that other functions can use those variables.
 function prepare_prod_build() {
-    export AIRFLOW_PROD_BASE_TAG="${DEFAULT_BRANCH}-python${PYTHON_MAJOR_MINOR_VERSION}"
+    if [[ "${INSTALL_AIRFLOW_REFERENCE:=}" != "" ]]; then
+        # When --install-airflow-reference is used then the image is built from github tag
+        EXTRA_DOCKER_PROD_BUILD_FLAGS=(
+            "--build-arg" "AIRFLOW_INSTALL_SOURCES=https://github.com/apache/airflow/archive/${INSTALL_AIRFLOW_REFERENCE}.tar.gz#egg=apache-airflow"
+        )
+        export AIRFLOW_VERSION="${INSTALL_AIRFLOW_REFERENCE}"
+        add_build_args_for_remote_install
+    elif [[ "${INSTALL_AIRFLOW_VERSION:=}" != "" ]]; then
+        # When --install-airflow-version is used then the image is built from PIP package
+        EXTRA_DOCKER_PROD_BUILD_FLAGS=(
+            "--build-arg" "AIRFLOW_INSTALL_SOURCES=apache-airflow"
+            "--build-arg" "AIRFLOW_INSTALL_VERSION===${INSTALL_AIRFLOW_VERSION}"
+        )
+        export AIRFLOW_VERSION="${INSTALL_AIRFLOW_VERSION}"
+        add_build_args_for_remote_install
+    else
+        # When no airflow version/reference is specified, production image is built from local sources
+        EXTRA_DOCKER_PROD_BUILD_FLAGS=(
+        )
+    fi
+
+    export AIRFLOW_PROD_BASE_TAG="${BRANCH_NAME}-python${PYTHON_MAJOR_MINOR_VERSION}"
     export AIRFLOW_PROD_BUILD_IMAGE="${DOCKERHUB_USER}/${DOCKERHUB_REPO}:${AIRFLOW_PROD_BASE_TAG}-build"
     export AIRFLOW_PROD_IMAGE="${DOCKERHUB_USER}/${DOCKERHUB_REPO}:${AIRFLOW_PROD_BASE_TAG}"
-    export AIRFLOW_PROD_IMAGE_DEFAULT="${DOCKERHUB_USER}/${DOCKERHUB_REPO}:${DEFAULT_BRANCH}"
+    export AIRFLOW_PROD_IMAGE_DEFAULT="${DOCKERHUB_USER}/${DOCKERHUB_REPO}:${BRANCH_NAME}"
     export PYTHON_BASE_IMAGE="python:${PYTHON_MAJOR_MINOR_VERSION}-slim-buster"
     if [[ "${DEFAULT_PYTHON_MAJOR_MINOR_VERSION}" == "${PYTHON_MAJOR_MINOR_VERSION}" ]]; then
         export DEFAULT_IMAGE="${AIRFLOW_PROD_IMAGE_DEFAULT}"
@@ -588,7 +610,7 @@ function prepare_prod_build() {
     export ADDITIONAL_RUNTIME_DEPS="${ADDITIONAL_RUNTIME_DEPS:=""}"
     export AIRFLOW_IMAGE="${AIRFLOW_PROD_IMAGE}"
 
-    if [[ ${ENABLE_REGISTRY_CACHE="false"} == "true" ]]; then
+    if [[ ${USE_GITHUB_REGISTRY="false"} == "true" ]]; then
         if [[ ${CACHE_REGISTRY_PASSWORD:=} != "" ]]; then
             echo "${CACHE_REGISTRY_PASSWORD}" | docker login \
                 --username "${CACHE_REGISTRY_USERNAME}" \
@@ -610,26 +632,8 @@ function prepare_prod_build() {
     AIRFLOW_KUBERNETES_IMAGE_TAG=$(echo "${AIRFLOW_KUBERNETES_IMAGE}" | cut -f 2 -d ":")
     export AIRFLOW_KUBERNETES_IMAGE_TAG
 
-    if [[ "${INSTALL_AIRFLOW_REFERENCE:=}" != "" ]]; then
-        # When --install-airflow-reference is used then the image is build from github tag
-        EXTRA_DOCKER_PROD_BUILD_FLAGS=(
-            "--build-arg" "AIRFLOW_INSTALL_SOURCES=https://github.com/apache/airflow/archive/${INSTALL_AIRFLOW_REFERENCE}.tar.gz#egg=apache-airflow"
-        )
-        export AIRFLOW_VERSION="${INSTALL_AIRFLOW_REFERENCE}"
-        add_build_args_for_remote_install
-    elif [[ "${INSTALL_AIRFLOW_VERSION:=}" != "" ]]; then
-        # When --install-airflow-version is used then the image is build from PIP package
-        EXTRA_DOCKER_PROD_BUILD_FLAGS=(
-            "--build-arg" "AIRFLOW_INSTALL_SOURCES=apache-airflow"
-            "--build-arg" "AIRFLOW_INSTALL_VERSION===${INSTALL_AIRFLOW_VERSION}"
-        )
-        export AIRFLOW_VERSION="${INSTALL_AIRFLOW_VERSION}"
-        add_build_args_for_remote_install
-    else
-        # When no airflow version/reference is specified, production image is built from local sources
-        EXTRA_DOCKER_PROD_BUILD_FLAGS=(
-        )
-    fi
+    AIRFLOW_BRANCH_FOR_PYPI_PRELOADING="${BRANCH_NAME}"
+
     go_to_airflow_sources
 }
 
@@ -669,6 +673,7 @@ function build_prod_image() {
         --build-arg PYTHON_BASE_IMAGE="${PYTHON_BASE_IMAGE}" \
         --build-arg PYTHON_MAJOR_MINOR_VERSION="${PYTHON_MAJOR_MINOR_VERSION}" \
         --build-arg AIRFLOW_VERSION="${AIRFLOW_VERSION}" \
+        --build-arg AIRFLOW_BRANCH="${AIRFLOW_BRANCH_FOR_PYPI_PRELOADING}" \
         --build-arg AIRFLOW_EXTRAS="${AIRFLOW_EXTRAS}" \
         --build-arg ADDITIONAL_AIRFLOW_EXTRAS="${ADDITIONAL_AIRFLOW_EXTRAS}" \
         --build-arg ADDITIONAL_PYTHON_DEPS="${ADDITIONAL_PYTHON_DEPS}" \
@@ -687,6 +692,7 @@ function build_prod_image() {
         --build-arg ADDITIONAL_DEV_DEPS="${ADDITIONAL_DEV_DEPS}" \
         --build-arg ADDITIONAL_RUNTIME_DEPS="${ADDITIONAL_RUNTIME_DEPS}" \
         --build-arg AIRFLOW_VERSION="${AIRFLOW_VERSION}" \
+        --build-arg AIRFLOW_BRANCH="${AIRFLOW_BRANCH_FOR_PYPI_PRELOADING}" \
         --build-arg AIRFLOW_EXTRAS="${AIRFLOW_EXTRAS}" \
         "${DOCKER_CACHE_PROD_DIRECTIVE[@]}" \
         -t "${AIRFLOW_PROD_IMAGE}" \
diff --git a/scripts/ci/libraries/_initialization.sh b/scripts/ci/libraries/_initialization.sh
index 6f66b0d..f70655b 100644
--- a/scripts/ci/libraries/_initialization.sh
+++ b/scripts/ci/libraries/_initialization.sh
@@ -46,6 +46,16 @@ function initialize_common_environment {
     # All the subsequent questions
     export LAST_FORCE_ANSWER_FILE="${BUILD_CACHE_DIR}/last_force_answer.sh"
 
+    # This folder is mounted inside the container under /files. This is how
+    # we can exchange DAGs, scripts, packages etc. with the container environment
+    export FILES_DIR="${AIRFLOW_SOURCES}/files"
+    # Temporary dir used well ... temporarily
+    export TMP_DIR="${AIRFLOW_SOURCES}/tmp"
+
+    # Create those folders above in case they do not exist
+    mkdir -p "${TMP_DIR}"
+    mkdir -p "${FILES_DIR}"
+
     # Create useful directories if not yet created
     mkdir -p "${AIRFLOW_SOURCES}/.mypy_cache"
     mkdir -p "${AIRFLOW_SOURCES}/logs"
@@ -69,7 +79,7 @@ function initialize_common_environment {
     export GITHUB_ORGANISATION=${GITHUB_ORGANISATION:="apache"}
     export GITHUB_REPO=${GITHUB_REPO:="airflow"}
     export CACHE_REGISTRY=${CACHE_REGISTRY:="docker.pkg.github.com"}
-    export ENABLE_REGISTRY_CACHE=${ENABLE_REGISTRY_CACHE:="false"}
+    export USE_GITHUB_REGISTRY=${USE_GITHUB_REGISTRY:="false"}
 
     # Default port numbers for forwarded ports
     export WEBSERVER_HOST_PORT=${WEBSERVER_HOST_PORT:="28080"}
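A short aside on the FILES_DIR mount introduced above: anything placed in ./files on the host shows up under /files inside the Breeze container (this is the ../../../files:/files:cached volume visible in the compose files). A minimal, hypothetical usage sketch, where the wheel file name is only an example:

    # on the host, from the Airflow sources root
    cp my_package.whl files/
    # later, inside the Breeze container
    pip install /files/my_package.whl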
diff --git a/scripts/ci/libraries/_local_mounts.sh b/scripts/ci/libraries/_local_mounts.sh
index 24a2eb3..127ebb3 100644
--- a/scripts/ci/libraries/_local_mounts.sh
+++ b/scripts/ci/libraries/_local_mounts.sh
@@ -47,7 +47,7 @@ function generate_local_mounts_list {
         "$prefix"pytest.ini:/opt/airflow/pytest.ini:cached
         "$prefix"requirements:/opt/airflow/requirements:cached
         "$prefix"scripts:/opt/airflow/scripts:cached
-        "$prefix"scripts/ci/in_container/entrypoint_ci.sh:/entrypoint_ci.sh:cached
+        "$prefix"scripts/ci/in_container/entrypoint_ci.sh:/entrypoint:cached
         "$prefix"setup.cfg:/opt/airflow/setup.cfg:cached
         "$prefix"setup.py:/opt/airflow/setup.py:cached
         "$prefix"tests:/opt/airflow/tests:cached
diff --git a/scripts/ci/libraries/_md5sum.sh b/scripts/ci/libraries/_md5sum.sh
index c9a254a..95e4478 100644
--- a/scripts/ci/libraries/_md5sum.sh
+++ b/scripts/ci/libraries/_md5sum.sh
@@ -24,7 +24,7 @@
 function calculate_file_md5sum {
     local FILE="${1}"
     local MD5SUM
-    local MD5SUM_CACHE_DIR="${BUILD_CACHE_DIR}/${DEFAULT_BRANCH}/${PYTHON_MAJOR_MINOR_VERSION}/${THE_IMAGE_TYPE}"
+    local MD5SUM_CACHE_DIR="${BUILD_CACHE_DIR}/${BRANCH_NAME}/${PYTHON_MAJOR_MINOR_VERSION}/${THE_IMAGE_TYPE}"
     mkdir -pv "${MD5SUM_CACHE_DIR}"
     MD5SUM=$(md5sum "${FILE}")
     local MD5SUM_FILE
@@ -54,7 +54,7 @@ function calculate_file_md5sum {
 function move_file_md5sum {
     local FILE="${1}"
     local MD5SUM_FILE
-    local MD5SUM_CACHE_DIR="${BUILD_CACHE_DIR}/${DEFAULT_BRANCH}/${PYTHON_MAJOR_MINOR_VERSION}/${THE_IMAGE_TYPE}"
+    local MD5SUM_CACHE_DIR="${BUILD_CACHE_DIR}/${BRANCH_NAME}/${PYTHON_MAJOR_MINOR_VERSION}/${THE_IMAGE_TYPE}"
     mkdir -pv "${MD5SUM_CACHE_DIR}"
     MD5SUM_FILE="${MD5SUM_CACHE_DIR}"/$(basename "${FILE}").md5sum
     local MD5SUM_FILE_NEW
diff --git a/scripts/ci/libraries/_push_pull_remove_images.sh b/scripts/ci/libraries/_push_pull_remove_images.sh
index c09ff57..7b9d8a8 100644
--- a/scripts/ci/libraries/_push_pull_remove_images.sh
+++ b/scripts/ci/libraries/_push_pull_remove_images.sh
@@ -112,8 +112,8 @@ function pull_prod_images_if_needed() {
         fi
         # "Build" segment of production image
         pull_image_possibly_from_cache "${AIRFLOW_PROD_BUILD_IMAGE}" "${CACHED_AIRFLOW_PROD_BUILD_IMAGE}"
-        # Main segment of production image
-        pull_image_possibly_from_cache "${AIRFLOW_PROD_IMAGE}" "${CACHED_AIRFLOW_PROD_IMAGE}"
+        # we never pull the main segment of the production image - we always build it locally,
+        # which is usually very fast and much nicer for rebuilds and development
     fi
 }
 
@@ -162,7 +162,6 @@ function push_prod_images() {
     if [[ -n ${DEFAULT_IMAGE:=""} && ${CACHED_AIRFLOW_PROD_IMAGE} == "" ]]; then
         verbose_docker push "${DEFAULT_IMAGE}"
     fi
-
     # we do not need to push PYTHON base image here - they are already pushed in the CI push
 }
 
diff --git a/scripts/ci/libraries/_start_end.sh b/scripts/ci/libraries/_start_end.sh
index b50a405..4dcb150 100644
--- a/scripts/ci/libraries/_start_end.sh
+++ b/scripts/ci/libraries/_start_end.sh
@@ -54,6 +54,10 @@ function script_end {
     #shellcheck disable=2181
     EXIT_CODE=$?
     if [[ ${EXIT_CODE} != 0 ]]; then
+        # Cat output log in case we exit with error
+        if [[ -f "${OUTPUT_LOG}" ]]; then
+            cat "${OUTPUT_LOG}"
+        fi
         print_info "###########################################################################################"
         print_info "                   EXITING WITH STATUS CODE ${EXIT_CODE}"
         print_info "###########################################################################################"
diff --git a/scripts/ci/libraries/_verbosity.sh b/scripts/ci/libraries/_verbosity.sh
index 9958635..b7a4c0a 100644
--- a/scripts/ci/libraries/_verbosity.sh
+++ b/scripts/ci/libraries/_verbosity.sh
@@ -35,6 +35,17 @@ function verbose_docker {
     docker "${@}"
 }
 
+# Runs a docker command, appending its output to ${OUTPUT_LOG} so it only surfaces on failure.
+# In case "VERBOSE" is set to "true" (--verbose flag in Breeze) the command is printed before execution
+function verbose_docker_hide_output_on_success {
+    if [[ ${VERBOSE:="false"} == "true" && ${VERBOSE_COMMANDS:=} != "true" ]]; then
+       # do not print the echo if VERBOSE_COMMANDS is set (set -x does it already)
+        echo "docker" "${@}"
+    fi
+    docker "${@}" >>"${OUTPUT_LOG}" 2>&1
+}
+
+
 # Prints verbose information in case VERBOSE variable is set
 function print_info() {
     if [[ ${VERBOSE:="false"} == "true" ]]; then
diff --git a/scripts/docker/entrypoint.sh b/scripts/docker/entrypoint.sh
deleted file mode 100755
index 3d436e2..0000000
--- a/scripts/docker/entrypoint.sh
+++ /dev/null
@@ -1,110 +0,0 @@
-#!/usr/bin/env bash
-# Licensed to the Apache Software Foundation (ASF) under one
-# or more contributor license agreements.  See the NOTICE file
-# distributed with this work for additional information
-# regarding copyright ownership.  The ASF licenses this file
-# to you under the Apache License, Version 2.0 (the
-# "License"); you may not use this file except in compliance
-# with the License.  You may obtain a copy of the License at
-#
-#   http://www.apache.org/licenses/LICENSE-2.0
-#
-# Unless required by applicable law or agreed to in writing,
-# software distributed under the License is distributed on an
-# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
-# KIND, either express or implied.  See the License for the
-# specific language governing permissions and limitations
-# under the License.
-
-# Might be empty
-AIRFLOW_COMMAND="${1}"
-
-set -euo pipefail
-
-function verify_db_connection {
-    DB_URL="${1}"
-
-    DB_CHECK_MAX_COUNT=${MAX_DB_CHECK_COUNT:=20}
-    DB_CHECK_SLEEP_TIME=${DB_CHECK_SLEEP_TIME:=3}
-
-    local DETECTED_DB_BACKEND=""
-    local DETECTED_DB_HOST=""
-    local DETECTED_DB_PORT=""
-
-
-    if [[ ${DB_URL} != sqlite* ]]; then
-        # Auto-detect DB parameters
-        [[ ${DB_URL} =~ ([^:]*)://([^@/]*)@?([^/:]*):?([0-9]*)/([^\?]*)\??(.*) ]] && \
-            DETECTED_DB_BACKEND=${BASH_REMATCH[1]} &&
-            # Not used USER match
-            DETECTED_DB_HOST=${BASH_REMATCH[3]} &&
-            DETECTED_DB_PORT=${BASH_REMATCH[4]} &&
-            # Not used SCHEMA match
-            # Not used PARAMS match
-
-        echo DB_BACKEND="${DB_BACKEND:=${DETECTED_DB_BACKEND}}"
-
-        if [[ -z "${DETECTED_DB_PORT}" ]]; then
-            if [[ ${DB_BACKEND} == "postgres"* ]]; then
-                DETECTED_DB_PORT=5432
-            elif [[ ${DB_BACKEND} == "mysql"* ]]; then
-                DETECTED_DB_PORT=3306
-            fi
-        fi
-
-        DETECTED_DB_HOST=${DETECTED_DB_HOST:="localhost"}
-
-        # Allow the DB parameters to be overridden by environment variable
-        echo DB_HOST="${DB_HOST:=${DETECTED_DB_HOST}}"
-        echo DB_PORT="${DB_PORT:=${DETECTED_DB_PORT}}"
-
-        while true
-        do
-            set +e
-            LAST_CHECK_RESULT=$(nc -zvv "${DB_HOST}" "${DB_PORT}" >/dev/null 2>&1)
-            RES=$?
-            set -e
-            if [[ ${RES} == 0 ]]; then
-                echo
-                break
-            else
-                echo -n "."
-                DB_CHECK_MAX_COUNT=$((DB_CHECK_MAX_COUNT-1))
-            fi
-            if [[ ${DB_CHECK_MAX_COUNT} == 0 ]]; then
-                echo
-                echo "ERROR! Maximum number of retries (${DB_CHECK_MAX_COUNT}) reached while checking ${DB_BACKEND} db. Exiting"
-                echo
-                break
-            else
-                sleep "${DB_CHECK_SLEEP_TIME}"
-            fi
-        done
-        if [[ ${RES} != 0 ]]; then
-            echo "        ERROR: ${BACKEND} db could not be reached!"
-            echo
-            echo "${LAST_CHECK_RESULT}"
-            echo
-            export EXIT_CODE=${RES}
-        fi
-    fi
-}
-
-# if no DB configured - use sqlite db by default
-AIRFLOW__CORE__SQL_ALCHEMY_CONN="${AIRFLOW__CORE__SQL_ALCHEMY_CONN:="sqlite:///${AIRFLOW_HOME}/airflow.db"}"
-
-verify_db_connection "${AIRFLOW__CORE__SQL_ALCHEMY_CONN}"
-
-AIRFLOW__CELERY__BROKER_URL=${AIRFLOW__CELERY__BROKER_URL:=}
-
-if [[ -n ${AIRFLOW__CELERY__BROKER_URL} ]] && \
-        [[ ${AIRFLOW_COMMAND} =~ ^(scheduler|worker|flower)$ ]]; then
-    verify_db_connection "${AIRFLOW__CELERY__BROKER_URL}"
-fi
-
-if [[ ${AIRFLOW_COMMAND} == "" ]]; then
-   exec "/bin/bash"
-fi
-
-# Run the command
-exec airflow "${@}"
diff --git a/scripts/ci/in_container/entrypoint_exec.sh b/scripts/prod/clean-logs.sh
similarity index 64%
copy from scripts/ci/in_container/entrypoint_exec.sh
copy to scripts/prod/clean-logs.sh
index d675d91..d05f8a8 100755
--- a/scripts/ci/in_container/entrypoint_exec.sh
+++ b/scripts/prod/clean-logs.sh
@@ -1,4 +1,5 @@
 #!/usr/bin/env bash
+
 # Licensed to the Apache Software Foundation (ASF) under one
 # or more contributor license agreements.  See the NOTICE file
 # distributed with this work for additional information
@@ -15,7 +16,22 @@
 # KIND, either express or implied.  See the License for the
 # specific language governing permissions and limitations
 # under the License.
-# shellcheck source=scripts/ci/in_container/configure_environment.sh
-. "$( dirname "${BASH_SOURCE[0]}" )/configure_environment.sh"
 
-exec /bin/bash
+set -euo pipefail
+
+DIRECTORY="${AIRFLOW_HOME:-/usr/local/airflow}"
+RETENTION="${AIRFLOW__LOG_RETENTION_DAYS:-15}"
+
+trap "exit" INT TERM
+
+EVERY=$((15*60))
+
+echo "Cleaning logs every $EVERY seconds"
+
+while true; do
+  seconds=$(( $(date -u +%s) % EVERY))
+  [[ $seconds -lt 1 ]] || sleep $((EVERY - seconds))
+
+  echo "Trimming airflow logs to ${RETENTION} days."
+  find "${DIRECTORY}"/logs -mtime +"${RETENTION}" -name '*.log' -delete
+done
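The wake-up arithmetic in clean-logs.sh above may not be obvious at first glance: the loop sleeps until the next multiple of EVERY seconds since the Unix epoch, so trims happen on aligned 15-minute boundaries rather than "15 minutes after container start". A small sketch of the same calculation (illustrative only, not part of the commit):

    EVERY=$((15*60))
    now=$(date -u +%s)
    seconds=$(( now % EVERY ))
    echo "seconds since the last 15-minute boundary: ${seconds}"
    echo "next cleanup would run in $(( EVERY - seconds )) seconds"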
diff --git a/entrypoint.sh b/scripts/prod/entrypoint_prod.sh
similarity index 100%
rename from entrypoint.sh
rename to scripts/prod/entrypoint_prod.sh
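Taken together, the prepare_prod_build() changes above select the production image sources in one of three ways: a GitHub reference, a PyPI version, or local sources when neither is given. A hedged sketch of the matching Breeze invocations; the --install-airflow-* flags are the ones referenced in the comments of this diff, while the exact 'build-image --production-image' command shape is assumed here for illustration:

    ./breeze build-image --production-image                                         # local sources
    ./breeze build-image --production-image --install-airflow-version 1.10.10       # PyPI package
    ./breeze build-image --production-image --install-airflow-reference v1-10-test  # GitHub reference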


[airflow] 07/25: Support additional apt dependencies (#9189)

Posted by po...@apache.org.
This is an automated email from the ASF dual-hosted git repository.

potiuk pushed a commit to branch v1-10-test
in repository https://gitbox.apache.org/repos/asf/airflow.git

commit 83ba24b9d46e0cbf32e1b4d579c7f8fa91d688da
Author: zikun <33...@users.noreply.github.com>
AuthorDate: Wed Jun 10 05:05:43 2020 +0800

    Support additional apt dependencies (#9189)
    
    * Add ADDITIONAL_DEV_DEPS and ADDITIONAL_RUNTIME_DEPS
    
    * Add examples for additional apt dev and runtime dependencies
    
    * Update comment
    
    * Fix typo
    
    (cherry picked from commit 82c8343ab6294168104cb2f25018656b681d2de9)
---
 Dockerfile        | 12 ++++++++++--
 Dockerfile.ci     | 10 +++++++++-
 IMAGES.rst        | 44 ++++++++++++++++++++++++++++++++++++++++++++
 docs/concepts.rst |  2 +-
 4 files changed, 64 insertions(+), 4 deletions(-)

diff --git a/Dockerfile b/Dockerfile
index ac27823..7c722cf 100644
--- a/Dockerfile
+++ b/Dockerfile
@@ -79,7 +79,10 @@ RUN apt-get update \
     && apt-get clean \
     && rm -rf /var/lib/apt/lists/*
 
-# Install basic apt dependencies
+ARG ADDITIONAL_DEV_DEPS=""
+ENV ADDITIONAL_DEV_DEPS=${ADDITIONAL_DEV_DEPS}
+
+# Install basic and additional apt dependencies
 RUN curl --fail --location https://deb.nodesource.com/setup_10.x | bash - \
     && curl https://dl.yarnpkg.com/debian/pubkey.gpg | apt-key add - > /dev/null \
     && echo "deb https://dl.yarnpkg.com/debian/ stable main" > /etc/apt/sources.list.d/yarn.list \
@@ -121,6 +124,7 @@ RUN curl --fail --location https://deb.nodesource.com/setup_10.x | bash - \
            unixodbc \
            unixodbc-dev \
            yarn \
+           ${ADDITIONAL_DEV_DEPS} \
     && apt-get autoremove -yqq --purge \
     && apt-get clean \
     && rm -rf /var/lib/apt/lists/*
@@ -242,13 +246,16 @@ ENV PYTHON_BASE_IMAGE=${PYTHON_BASE_IMAGE}
 ARG AIRFLOW_VERSION
 ENV AIRFLOW_VERSION=${AIRFLOW_VERSION}
 
+ARG ADDITIONAL_RUNTIME_DEPS=""
+ENV ADDITIONAL_RUNTIME_DEPS=${ADDITIONAL_RUNTIME_DEPS}
+
 # Make sure noninteractive debian install is used and language variables set
 ENV DEBIAN_FRONTEND=noninteractive LANGUAGE=C.UTF-8 LANG=C.UTF-8 LC_ALL=C.UTF-8 \
     LC_CTYPE=C.UTF-8 LC_MESSAGES=C.UTF-8
 
 # Note missing man directories on debian-buster
 # https://bugs.debian.org/cgi-bin/bugreport.cgi?bug=863199
-# Install basic apt dependencies
+# Install basic and additional apt dependencies
 RUN mkdir -pv /usr/share/man/man1 \
     && mkdir -pv /usr/share/man/man7 \
     && apt-get update \
@@ -277,6 +284,7 @@ RUN mkdir -pv /usr/share/man/man1 \
            sqlite3 \
            sudo \
            unixodbc \
+           ${ADDITIONAL_RUNTIME_DEPS} \
     && apt-get autoremove -yqq --purge \
     && apt-get clean \
     && rm -rf /var/lib/apt/lists/*
diff --git a/Dockerfile.ci b/Dockerfile.ci
index 8051431..1549214 100644
--- a/Dockerfile.ci
+++ b/Dockerfile.ci
@@ -54,7 +54,10 @@ RUN apt-get update \
     && apt-get clean \
     && rm -rf /var/lib/apt/lists/*
 
-# Install basic apt dependencies
+ARG ADDITIONAL_DEV_DEPS=""
+ENV ADDITIONAL_DEV_DEPS=${ADDITIONAL_DEV_DEPS}
+
+# Install basic and additional apt dependencies
 RUN curl --fail --location https://deb.nodesource.com/setup_10.x | bash - \
     && curl https://dl.yarnpkg.com/debian/pubkey.gpg | apt-key add - > /dev/null \
     && echo "deb https://dl.yarnpkg.com/debian/ stable main" > /etc/apt/sources.list.d/yarn.list \
@@ -83,6 +86,7 @@ RUN curl --fail --location https://deb.nodesource.com/setup_10.x | bash - \
            sasl2-bin \
            sudo \
            yarn \
+           ${ADDITIONAL_DEV_DEPS} \
     && apt-get autoremove -yqq --purge \
     && apt-get clean \
     && rm -rf /var/lib/apt/lists/*
@@ -115,6 +119,9 @@ RUN adduser airflow \
     && echo "airflow ALL=(ALL) NOPASSWD: ALL" > /etc/sudoers.d/airflow \
     && chmod 0440 /etc/sudoers.d/airflow
 
+ARG ADDITIONAL_RUNTIME_DEPS=""
+ENV ADDITIONAL_RUNTIME_DEPS=${ADDITIONAL_RUNTIME_DEPS}
+
 # Note missing man directories on debian-buster
 # https://bugs.debian.org/cgi-bin/bugreport.cgi?bug=863199
 RUN mkdir -pv /usr/share/man/man1 \
@@ -142,6 +149,7 @@ RUN mkdir -pv /usr/share/man/man1 \
       tmux \
       unzip \
       vim \
+      ${ADDITIONAL_RUNTIME_DEPS} \
     && apt-get autoremove -yqq --purge \
     && apt-get clean \
     && rm -rf /var/lib/apt/lists/*
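One detail worth noting about the new build args: ${ADDITIONAL_DEV_DEPS} and ${ADDITIONAL_RUNTIME_DEPS} are expanded unquoted inside the RUN apt-get lines, so a space-separated value is word-split into individual package names. A small illustrative shell sketch of that expansion (not part of the commit):

    ADDITIONAL_DEV_DEPS="gcc g++"
    # unquoted expansion word-splits the value into two separate apt-get arguments
    set -- apt-get install --no-install-recommends -y ${ADDITIONAL_DEV_DEPS}
    printf '%s\n' "$@"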
diff --git a/IMAGES.rst b/IMAGES.rst
index 0a34140..0d4bd8c 100644
--- a/IMAGES.rst
+++ b/IMAGES.rst
@@ -203,6 +203,12 @@ The following build arguments (``--build-arg`` in docker build command) can be u
 | ``ADDITIONAL_PYTHON_DEPS``               | \```\`                                   | additional python dependencies to        |
 |                                          |                                          | install                                  |
 +------------------------------------------+------------------------------------------+------------------------------------------+
+| ``ADDITIONAL_DEV_DEPS``                  | ````                                     | additional apt dev dependencies to       |
+|                                          |                                          | install                                  |
++------------------------------------------+------------------------------------------+------------------------------------------+
+| ``ADDITIONAL_RUNTIME_DEPS``              | ````                                     | additional apt runtime dependencies to   |
+|                                          |                                          | install                                  |
++------------------------------------------+------------------------------------------+------------------------------------------+
 
 Here are some examples of how CI images can be built manually. CI is always built from local sources.
 
@@ -236,6 +242,20 @@ This builds the CI image in version 3.6 with "mssql" additional package added.
   docker build . -f Dockerfile.ci --build-arg PYTHON_BASE_IMAGE="python:3.7-slim-buster" \
     --build-arg PYTHON_MAJOR_MINOR_VERSION=3.6 --build-arg ADDITIONAL_PYTHON_DEPS="mssql"
 
+This builds the CI image in version 3.6 with "gcc" and "g++" additional apt dev dependencies added.
+
+.. code-block::
+
+  docker build . -f Dockerfile.ci --build-arg PYTHON_BASE_IMAGE="python:3.7-slim-buster" \
+    --build-arg PYTHON_MAJOR_MINOR_VERSION=3.6 --build-arg ADDITIONAL_DEV_DEPS="gcc g++"
+
+This builds the CI image in version 3.6 with "jdbc" extra and "default-jre-headless" additional apt runtime dependencies added.
+
+.. code-block::
+
+  docker build . -f Dockerfile.ci --build-arg PYTHON_BASE_IMAGE="python:3.7-slim-buster" \
+    --build-arg PYTHON_MAJOR_MINOR_VERSION=3.6 --build-arg AIRFLOW_EXTRAS=jdbc --build-arg ADDITIONAL_RUNTIME_DEPS="default-jre-headless"
+
 
 
 Production images
@@ -277,6 +297,12 @@ The following build arguments (``--build-arg`` in docker build command) can be u
 | ``ADDITIONAL_PYTHON_DEPS``               | ````                                     | Optional python packages to extend       |
 |                                          |                                          | the image with some extra dependencies   |
 +------------------------------------------+------------------------------------------+------------------------------------------+
+| ``ADDITIONAL_DEV_DEPS``                  | ````                                     | additional apt dev dependencies to       |
+|                                          |                                          | install                                  |
++------------------------------------------+------------------------------------------+------------------------------------------+
+| ``ADDITIONAL_RUNTIME_DEPS``              | ````                                     | additional apt runtime dependencies to   |
+|                                          |                                          | install                                  |
++------------------------------------------+------------------------------------------+------------------------------------------+
 | ``AIRFLOW_HOME``                         | ``/opt/airflow``                         | Airflow’s HOME (that’s where logs and    |
 |                                          |                                          | sqlite databases are stored)             |
 +------------------------------------------+------------------------------------------+------------------------------------------+
@@ -402,6 +428,24 @@ additional python dependencies.
     --build-arg ADDITIONAL_AIRFLOW_EXTRAS="mssql,hdfs"
     --build-arg ADDITIONAL_PYTHON_DEPS="sshtunnel oauth2client"
 
+This builds the production image in version 3.7 with additional airflow extras from 1.10.10 Pypi package and
+additional apt dev and runtime dependencies.
+
+.. code-block::
+
+  docker build . \
+    --build-arg PYTHON_BASE_IMAGE="python:3.7-slim-buster" \
+    --build-arg PYTHON_MAJOR_MINOR_VERSION=3.7 \
+    --build-arg AIRFLOW_INSTALL_SOURCES="apache-airflow" \
+    --build-arg AIRFLOW_INSTALL_VERSION="==1.10.10" \
+    --build-arg CONSTRAINT_REQUIREMENTS="https://raw.githubusercontent.com/apache/airflow/1.10.10/requirements/requirements-python3.7.txt" \
+    --build-arg ENTRYPOINT_FILE="https://raw.githubusercontent.com/apache/airflow/1.10.10/entrypoint.sh" \
+    --build-arg AIRFLOW_SOURCES_FROM="entrypoint.sh" \
+    --build-arg AIRFLOW_SOURCES_TO="/entrypoint" \
+    --build-arg ADDITIONAL_AIRFLOW_EXTRAS="jdbc" \
+    --build-arg ADDITIONAL_DEV_DEPS="gcc g++" \
+    --build-arg ADDITIONAL_RUNTIME_DEPS="default-jre-headless"
+
 Image manifests
 ---------------
 
diff --git a/docs/concepts.rst b/docs/concepts.rst
index 3a9c4c2..603a729 100644
--- a/docs/concepts.rst
+++ b/docs/concepts.rst
@@ -522,7 +522,7 @@ with the same ``conn_id``, the :py:meth:`~airflow.hooks.base_hook.BaseHook.get_c
 provide basic load balancing and fault tolerance, when used in conjunction with retries.
 
 Airflow also provides a mechanism to store connections outside the database, e.g. in :ref:`environment variables <environment_variables_secrets_backend>`.
-Additonal sources may be enabled, e.g. :ref:`AWS SSM Parameter Store <ssm_parameter_store_secrets>`, or you may
+Additional sources may be enabled, e.g. :ref:`AWS SSM Parameter Store <ssm_parameter_store_secrets>`, or you may
 :ref:`roll your own secrets backend <roll_your_own_secrets_backend>`.
 
 Many hooks have a default ``conn_id``, where operators using that hook do not


[airflow] 21/25: Fixes Breeze 'tests' command (#9384)

Posted by po...@apache.org.
This is an automated email from the ASF dual-hosted git repository.

potiuk pushed a commit to branch v1-10-test
in repository https://gitbox.apache.org/repos/asf/airflow.git

commit 7835318e06cca1e332930a49839448f741a79869
Author: Jarek Potiuk <ja...@polidea.com>
AuthorDate: Thu Jun 18 21:59:52 2020 +0200

    Fixes Breeze 'tests' command (#9384)
    
    Fixes the 'tests' command so that it again allows running individual tests
    immediately from the host without entering the container. It had been broken
    in 7c12a9d4e0b6c1e01fee6ab227a6e25b5aa5b157
    
    (cherry picked from commit 6484dea15b9962dd6a714316c68e5e4aadb84382)
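In practical terms, the fix restores invocations like the ones below, run from the host: everything after the 'tests' subcommand is forwarded to pytest inside the CI container (the test paths are examples only):

    ./breeze tests tests/test_core.py
    ./breeze tests tests/operators tests/hooks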
---
 breeze                                   | 12 ++++++------
 scripts/ci/in_container/entrypoint_ci.sh | 19 ++++++++++---------
 2 files changed, 16 insertions(+), 15 deletions(-)

diff --git a/breeze b/breeze
index 118703d..74c0974 100755
--- a/breeze
+++ b/breeze
@@ -417,9 +417,8 @@ EOF
 function prepare_command_file() {
     local FILE="${1}"
     local CMD="${2}"
-    local TESTS="${3}"
-    local COMPOSE_FILE="${4}"
-    local AIRFLOW_IMAGE="${5}"
+    local COMPOSE_FILE="${3}"
+    local AIRFLOW_IMAGE="${4}"
     cat <<EOF > "${FILE}"
 #!/usr/bin/env bash
 if [[ \${VERBOSE} == "true" ]]; then
@@ -443,7 +442,6 @@ export PYTHON_MAJOR_MINOR_VERSION="${PYTHON_MAJOR_MINOR_VERSION}"
 export BACKEND="${BACKEND}"
 export AIRFLOW_VERSION="${AIRFLOW_VERSION}"
 export INSTALL_AIRFLOW_VERSION="${INSTALL_AIRFLOW_VERSION}"
-export RUN_TESTS="${TESTS}"
 export WEBSERVER_HOST_PORT="${WEBSERVER_HOST_PORT}"
 export POSTGRES_HOST_PORT="${POSTGRES_HOST_PORT}"
 export POSTGRES_VERSION="${POSTGRES_VERSION}"
@@ -519,11 +517,11 @@ function prepare_command_files() {
 
     # Prepare script for "run docker compose CI command"
     prepare_command_file "${BUILD_CACHE_DIR}/${LAST_DC_CI_FILE}" \
-        "\"\${@}\"" "false" "${COMPOSE_CI_FILE}" "${AIRFLOW_CI_IMAGE}"
+        "\"\${@}\"" "${COMPOSE_CI_FILE}" "${AIRFLOW_CI_IMAGE}"
 
     # Prepare script for "run docker compose PROD command"
     prepare_command_file "${BUILD_CACHE_DIR}/${LAST_DC_PROD_FILE}" \
-        "\"\${@}\"" "false" "${COMPOSE_PROD_FILE}" "${AIRFLOW_PROD_IMAGE}"
+        "\"\${@}\"" "${COMPOSE_PROD_FILE}" "${AIRFLOW_PROD_IMAGE}"
 }
 
 # Prints detailed help for all commands and flags. Used to generate documentation added to BREEZE.rst
@@ -884,6 +882,7 @@ function parse_arguments() {
           LAST_SUBCOMMAND="${1}"
           if [[ $# -lt 2 ]]; then
             RUN_HELP="true"
+          else
             shift
           fi
           COMMAND_TO_RUN="run_tests" ;;
@@ -1889,6 +1888,7 @@ function run_breeze_command {
                 "/opt/airflow/scripts/ci/in_container/entrypoint_exec.sh" "${@}"
             ;;
         run_tests)
+            export RUN_TESTS="true"
             "${BUILD_CACHE_DIR}/${LAST_DC_CI_FILE}" run --service-ports --rm airflow "$@"
             ;;
         run_docker_compose)
diff --git a/scripts/ci/in_container/entrypoint_ci.sh b/scripts/ci/in_container/entrypoint_ci.sh
index 1a753cf..2ea5a14 100755
--- a/scripts/ci/in_container/entrypoint_ci.sh
+++ b/scripts/ci/in_container/entrypoint_ci.sh
@@ -41,7 +41,8 @@ fi
 
 echo
 
-RUN_TESTS=${RUN_TESTS:="true"}
+RUN_TESTS=${RUN_TESTS:="false"}
+CI=${CI:="false"}
 INSTALL_AIRFLOW_VERSION="${INSTALL_AIRFLOW_VERSION:=""}"
 
 if [[ ${AIRFLOW_VERSION} == *1.10* || ${INSTALL_AIRFLOW_VERSION} == *1.10* ]]; then
@@ -149,13 +150,13 @@ cd "${AIRFLOW_SOURCES}"
 
 set +u
 # If we do not want to run tests, we simply drop into bash
-if [[ "${RUN_TESTS:=false}" != "true" ]]; then
+if [[ "${RUN_TESTS}" != "true" ]]; then
     exec /bin/bash "${@}"
 fi
 set -u
 
 if [[ "${CI}" == "true" ]]; then
-    CI_ARGS=(
+    EXTRA_PYTEST_ARGS=(
         "--verbosity=0"
         "--strict-markers"
         "--instafail"
@@ -169,7 +170,7 @@ if [[ "${CI}" == "true" ]]; then
         "--pythonwarnings=ignore::PendingDeprecationWarning"
         )
 else
-    CI_ARGS=()
+    EXTRA_PYTEST_ARGS=()
 fi
 
 declare -a TESTS_TO_RUN
@@ -182,18 +183,18 @@ fi
 if [[ -n ${RUN_INTEGRATION_TESTS:=""} ]]; then
     for INT in ${RUN_INTEGRATION_TESTS}
     do
-        CI_ARGS+=("--integration" "${INT}")
+        EXTRA_PYTEST_ARGS+=("--integration" "${INT}")
     done
-    CI_ARGS+=("-rpfExX")
+    EXTRA_PYTEST_ARGS+=("-rpfExX")
 elif [[ ${ONLY_RUN_LONG_RUNNING_TESTS:=""} == "true" ]]; then
-    CI_ARGS+=(
+    EXTRA_PYTEST_ARGS+=(
         "-m" "long_running"
         "--include-long-running"
         "--verbosity=1"
         "--reruns" "3"
         "--timeout" "90")
 elif [[ ${ONLY_RUN_QUARANTINED_TESTS:=""} == "true" ]]; then
-    CI_ARGS+=(
+    EXTRA_PYTEST_ARGS+=(
         "-m" "quarantined"
         "--include-quarantined"
         "--verbosity=1"
@@ -201,7 +202,7 @@ elif [[ ${ONLY_RUN_QUARANTINED_TESTS:=""} == "true" ]]; then
         "--timeout" "90")
 fi
 
-ARGS=("${CI_ARGS[@]}" "${TESTS_TO_RUN[@]}")
+ARGS=("${EXTRA_PYTEST_ARGS[@]}" "${TESTS_TO_RUN[@]}")
 
 if [[ ${RUN_SYSTEM_TESTS:="false"} == "true" ]]; then
     "${MY_DIR}/run_system_tests.sh" "${ARGS[@]}"


[airflow] 02/25: Don't use the term "whitelist" - language matters (#9174)

Posted by po...@apache.org.
This is an automated email from the ASF dual-hosted git repository.

potiuk pushed a commit to branch v1-10-test
in repository https://gitbox.apache.org/repos/asf/airflow.git

commit 566b9d3ea7b44c21c656dc32f0f895165e8197fe
Author: Ash Berlin-Taylor <as...@firemirror.com>
AuthorDate: Mon Jun 8 10:01:46 2020 +0100

    Don't use the term "whitelist" - language matters (#9174)
    
    It's fairly common to say whitelisting and blacklisting to describe
    desirable and undesirable things in cyber security. However just because
    it is common doesn't mean it's right.
    
    However, there's an issue with the terminology. It only makes sense if
    you equate white with 'good, permitted, safe' and black with 'bad,
    dangerous, forbidden'. There are some obvious problems with this.
    
    You may not see why this matters. If you're not adversely affected by
    racial stereotyping yourself, then please count yourself lucky. For some
    of your friends and colleagues (and potential future colleagues), this
    really is a change worth making.
    
    From now on, we will use 'allow list' and 'deny list' in place of
    'whitelist' and 'blacklist' wherever possible. Which, in fact, is
    clearer and less ambiguous. So as well as being more inclusive of all,
    this is a net benefit to our understandability.
    
    (Words mostly borrowed from
    <https://www.ncsc.gov.uk/blog-post/terminology-its-not-black-and-white>)
    
    Co-authored-by: Jarek Potiuk <ja...@potiuk.com>
    (cherry picked from commit 6350fd6ebb9958982cb3fa1d466168fc31708035)
---
 .pre-commit-config.yaml                            | 17 +++++++++++++++
 BREEZE.rst                                         | 12 +++++------
 STATIC_CODE_CHECKS.rst                             |  2 ++
 UPDATING.md                                        | 11 ++++------
 .../config_templates/default_webserver_config.py   |  1 -
 airflow/contrib/plugins/metastore_browser/main.py  | 12 +++++------
 airflow/jobs/scheduler_job.py                      | 18 ++++++++--------
 airflow/operators/hive_stats_operator.py           | 24 +++++++++++++++-------
 airflow/plugins_manager.py                         |  2 +-
 airflow/www/static/underscore.js                   |  4 ++--
 breeze-complete                                    |  1 +
 docs/security.rst                                  |  4 ++--
 tests/contrib/hooks/test_cassandra_hook.py         |  2 +-
 tests/contrib/operators/test_hive_stats.py         |  6 +++---
 tests/utils/test_dag_processing.py                 |  4 ++--
 15 files changed, 73 insertions(+), 47 deletions(-)

diff --git a/.pre-commit-config.yaml b/.pre-commit-config.yaml
index 88ab3cc..b1852b7 100644
--- a/.pre-commit-config.yaml
+++ b/.pre-commit-config.yaml
@@ -228,6 +228,23 @@ repos:
         files: ^setup.py$|^INSTALL$|^CONTRIBUTING.rst$
         pass_filenames: false
         require_serial: true
+      - id: pydevd
+        language: pygrep
+        name: Check for pydevd debug statements accidentally left
+        entry: "pydevd.*settrace\\("
+        pass_filenames: true
+        files: \.py$
+      - id: language-matters
+        language: pygrep
+        name: Check for language that we do not accept as community
+        description: Please use "deny_list" or "allow_list"  instead.
+        entry: "(?i)(black|white)[_-]?list"
+        pass_filenames: true
+        exclude: >
+          (?x)
+          ^airflow/contrib/hooks/cassandra_hook.py$|
+          ^airflow/operators/hive_stats_operator.py$|
+          ^tests/contrib/hooks/test_cassandra_hook.py
       - id: dont-use-safe-filter
         language: pygrep
         name: Don't use safe in templates
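For readers unfamiliar with pygrep hooks: the language-matters entry above greps the staged files with the case-insensitive pattern (?i)(black|white)[_-]?list, excluding the legacy files listed under 'exclude'. A quick illustrative way to see what that pattern flags:

    echo "whitelist WHITE_LIST black-list blocklist" | grep -o -i -E "(black|white)[_-]?list"
    # prints: whitelist, WHITE_LIST, black-list - "blocklist" is not matched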
diff --git a/BREEZE.rst b/BREEZE.rst
index 3a1a6a0..b2ff795 100644
--- a/BREEZE.rst
+++ b/BREEZE.rst
@@ -1307,9 +1307,9 @@ This is the current syntax for  `./breeze <./breeze>`_:
                  check-executables-have-shebangs check-hooks-apply check-integrations
                  check-merge-conflict check-xml debug-statements detect-private-key doctoc
                  end-of-file-fixer fix-encoding-pragma flake8 forbid-tabs insert-license
-                 lint-dockerfile mixed-line-ending mypy pydevd python2-compile python2-fastcheck
-                 python-no-log-warn rst-backticks setup-order shellcheck trailing-whitespace
-                 update-breeze-file update-extras update-local-yml-file yamllint
+                 language-matters lint-dockerfile mixed-line-ending mypy pydevd python2-compile
+                 python2-fastcheck python-no-log-warn rst-backticks setup-order shellcheck
+                 trailing-whitespace update-breeze-file update-extras update-local-yml-file yamllint
 
        You can pass extra arguments including options to the pre-commit framework as
         <EXTRA_ARGS> passed after --. For example:
@@ -1337,9 +1337,9 @@ This is the current syntax for  `./breeze <./breeze>`_:
                  check-executables-have-shebangs check-hooks-apply check-integrations
                  check-merge-conflict check-xml debug-statements detect-private-key doctoc
                  end-of-file-fixer fix-encoding-pragma flake8 forbid-tabs insert-license
-                 lint-dockerfile mixed-line-ending mypy pydevd python2-compile python2-fastcheck
-                 python-no-log-warn rst-backticks setup-order shellcheck trailing-whitespace
-                 update-breeze-file update-extras update-local-yml-file yamllint
+                 language-matters lint-dockerfile mixed-line-ending mypy pydevd python2-compile
+                 python2-fastcheck python-no-log-warn rst-backticks setup-order shellcheck
+                 trailing-whitespace update-breeze-file update-extras update-local-yml-file yamllint
 
         You can pass extra arguments including options to the pre-commit framework as
         <EXTRA_ARGS> passed after --. For example:
diff --git a/STATIC_CODE_CHECKS.rst b/STATIC_CODE_CHECKS.rst
index 1a5a6f6..3cb2c5e 100644
--- a/STATIC_CODE_CHECKS.rst
+++ b/STATIC_CODE_CHECKS.rst
@@ -76,6 +76,8 @@ require Breeze Docker images to be installed locally:
 ----------------------------------- ---------------------------------------------------------------- ------------
 ``isort``                             Sorts imports in python files.
 ----------------------------------- ---------------------------------------------------------------- ------------
+``language-matters``                  Check for language that we do not accept as community
+----------------------------------- ---------------------------------------------------------------- ------------
 ``lint-dockerfile``                   Lints a dockerfile.
 ----------------------------------- ---------------------------------------------------------------- ------------
 ``mixed-line-ending``                 Detects if mixed line ending is used (\r vs. \r\n).
diff --git a/UPDATING.md b/UPDATING.md
index 336459b..ea8c262 100644
--- a/UPDATING.md
+++ b/UPDATING.md
@@ -1013,14 +1013,11 @@ dags_are_paused_at_creation = False
 
 If you specify a hive conf to the run_cli command of the HiveHook, Airflow adds some
 convenience variables to the config. In case you run a secure Hadoop setup it might be
-required to whitelist these variables by adding the following to your configuration:
+required to allow these variables by adjusting your hive configuration to add `airflow\.ctx\..*` to the regex
+of user-editable configuration properties. See
+[the Hive docs on Configuration Properties][hive.security.authorization.sqlstd] for more info.
 
-```
-<property>
-     <name>hive.security.authorization.sqlstd.confwhitelist.append</name>
-     <value>airflow\.ctx\..*</value>
-</property>
-```
+[hive.security.authorization.sqlstd]: https://cwiki.apache.org/confluence/pages/viewpage.action?pageId=82903061#ConfigurationProperties-SQLStandardBasedAuthorization.1
 
 ### Google Cloud Operator and Hook alignment
 
diff --git a/airflow/config_templates/default_webserver_config.py b/airflow/config_templates/default_webserver_config.py
index bc8a6bb..23d3985 100644
--- a/airflow/config_templates/default_webserver_config.py
+++ b/airflow/config_templates/default_webserver_config.py
@@ -65,7 +65,6 @@ AUTH_TYPE = AUTH_DB
 # Google OAuth example:
 # OAUTH_PROVIDERS = [{
 #   'name':'google',
-#     'whitelist': ['@YOU_COMPANY_DOMAIN'],  # optional
 #     'token_key':'access_token',
 #     'icon':'fa-google',
 #         'remote_app': {
diff --git a/airflow/contrib/plugins/metastore_browser/main.py b/airflow/contrib/plugins/metastore_browser/main.py
index 6826030..d5a68a2 100644
--- a/airflow/contrib/plugins/metastore_browser/main.py
+++ b/airflow/contrib/plugins/metastore_browser/main.py
@@ -35,8 +35,8 @@ METASTORE_MYSQL_CONN_ID = 'metastore_mysql'
 PRESTO_CONN_ID = 'presto_default'
 HIVE_CLI_CONN_ID = 'hive_default'
 DEFAULT_DB = 'default'
-DB_WHITELIST = None
-DB_BLACKLIST = ['tmp']
+DB_ALLOW_LIST = None
+DB_DENY_LIST = ['tmp']
 TABLE_SELECTOR_LIMIT = 2000
 
 # Keeping pandas from truncating long strings
@@ -118,11 +118,11 @@ class MetastoreBrowserView(BaseView, wwwutils.DataProfilingMixin):
     @expose('/objects/')
     def objects(self):
         where_clause = ''
-        if DB_WHITELIST:
-            dbs = ",".join(["'" + db + "'" for db in DB_WHITELIST])
+        if DB_ALLOW_LIST:
+            dbs = ",".join(["'" + db + "'" for db in DB_ALLOW_LIST])
             where_clause = "AND b.name IN ({})".format(dbs)
-        if DB_BLACKLIST:
-            dbs = ",".join(["'" + db + "'" for db in DB_BLACKLIST])
+        if DB_DENY_LIST:
+            dbs = ",".join(["'" + db + "'" for db in DB_DENY_LIST])
             where_clause = "AND b.name NOT IN ({})".format(dbs)
         sql = """
         SELECT CONCAT(b.NAME, '.', a.TBL_NAME), TBL_TYPE
diff --git a/airflow/jobs/scheduler_job.py b/airflow/jobs/scheduler_job.py
index 1dd986e..e142dc5 100644
--- a/airflow/jobs/scheduler_job.py
+++ b/airflow/jobs/scheduler_job.py
@@ -68,8 +68,8 @@ class DagFileProcessor(AbstractDagFileProcessor, LoggingMixin):
     :type file_path: unicode
     :param pickle_dags: whether to serialize the DAG objects to the DB
     :type pickle_dags: bool
-    :param dag_id_white_list: If specified, only look at these DAG ID's
-    :type dag_id_white_list: list[unicode]
+    :param dag_ids: If specified, only look at these DAG ID's
+    :type dag_ids: list[unicode]
     :param zombies: zombie task instances to kill
     :type zombies: list[airflow.utils.dag_processing.SimpleTaskInstance]
     """
@@ -77,12 +77,12 @@ class DagFileProcessor(AbstractDagFileProcessor, LoggingMixin):
     # Counter that increments every time an instance of this class is created
     class_creation_counter = 0
 
-    def __init__(self, file_path, pickle_dags, dag_id_white_list, zombies):
+    def __init__(self, file_path, pickle_dags, dag_ids, zombies):
         self._file_path = file_path
 
         # The process that was launched to process the given .
         self._process = None
-        self._dag_id_white_list = dag_id_white_list
+        self._dag_ids = dag_ids
         self._pickle_dags = pickle_dags
         self._zombies = zombies
         # The result of Scheduler.process_file(file_path).
@@ -104,7 +104,7 @@ class DagFileProcessor(AbstractDagFileProcessor, LoggingMixin):
     def _run_file_processor(result_channel,
                             file_path,
                             pickle_dags,
-                            dag_id_white_list,
+                            dag_ids,
                             thread_name,
                             zombies):
         """
@@ -117,9 +117,9 @@ class DagFileProcessor(AbstractDagFileProcessor, LoggingMixin):
         :param pickle_dags: whether to pickle the DAGs found in the file and
             save them to the DB
         :type pickle_dags: bool
-        :param dag_id_white_list: if specified, only examine DAG ID's that are
+        :param dag_ids: if specified, only examine DAG ID's that are
             in this list
-        :type dag_id_white_list: list[unicode]
+        :type dag_ids: list[unicode]
         :param thread_name: the name to use for the process that is launched
         :type thread_name: unicode
         :param zombies: zombie task instances to kill
@@ -152,7 +152,7 @@ class DagFileProcessor(AbstractDagFileProcessor, LoggingMixin):
 
             log.info("Started process (PID=%s) to work on %s",
                      os.getpid(), file_path)
-            scheduler_job = SchedulerJob(dag_ids=dag_id_white_list, log=log)
+            scheduler_job = SchedulerJob(dag_ids=dag_ids, log=log)
             result = scheduler_job.process_file(file_path,
                                                 zombies,
                                                 pickle_dags)
@@ -184,7 +184,7 @@ class DagFileProcessor(AbstractDagFileProcessor, LoggingMixin):
                 _child_channel,
                 self.file_path,
                 self._pickle_dags,
-                self._dag_id_white_list,
+                self._dag_ids,
                 "DagFileProcessor{}".format(self._instance_id),
                 self._zombies
             ),
diff --git a/airflow/operators/hive_stats_operator.py b/airflow/operators/hive_stats_operator.py
index fe8ff9d..8648845 100644
--- a/airflow/operators/hive_stats_operator.py
+++ b/airflow/operators/hive_stats_operator.py
@@ -20,6 +20,7 @@
 from builtins import zip
 from collections import OrderedDict
 import json
+import warnings
 
 from airflow.exceptions import AirflowException
 from airflow.hooks.mysql_hook import MySqlHook
@@ -49,9 +50,9 @@ class HiveStatsCollectionOperator(BaseOperator):
     :param extra_exprs: dict of expression to run against the table where
         keys are metric names and values are Presto compatible expressions
     :type extra_exprs: dict
-    :param col_blacklist: list of columns to blacklist, consider
-        blacklisting blobs, large json columns, ...
-    :type col_blacklist: list
+    :param excluded_columns: list of columns to exclude, consider
+        excluding blobs, large json columns, ...
+    :type excluded_columns: list
     :param assignment_func: a function that receives a column name and
         a type, and returns a dict of metric names and an Presto expressions.
         If None is returned, the global defaults are applied. If an
@@ -69,18 +70,27 @@ class HiveStatsCollectionOperator(BaseOperator):
             table,
             partition,
             extra_exprs=None,
-            col_blacklist=None,
+            excluded_columns=None,
             assignment_func=None,
             metastore_conn_id='metastore_default',
             presto_conn_id='presto_default',
             mysql_conn_id='airflow_db',
             *args, **kwargs):
-        super(HiveStatsCollectionOperator, self).__init__(*args, **kwargs)
+        if 'col_blacklist' in kwargs:
+            warnings.warn(
+                'col_blacklist kwarg passed to {c} (task_id: {t}) is deprecated, please rename it to '
+                'excluded_columns instead'.format(
+                    c=self.__class__.__name__, t=kwargs.get('task_id')),
+                category=FutureWarning,
+                stacklevel=2
+            )
+            excluded_columns = kwargs.pop('col_blacklist')
 
+        super(HiveStatsCollectionOperator, self).__init__(*args, **kwargs)
         self.table = table
         self.partition = partition
         self.extra_exprs = extra_exprs or {}
-        self.col_blacklist = col_blacklist or {}
+        self.excluded_columns = excluded_columns or {}
         self.metastore_conn_id = metastore_conn_id
         self.presto_conn_id = presto_conn_id
         self.mysql_conn_id = mysql_conn_id
@@ -89,7 +99,7 @@ class HiveStatsCollectionOperator(BaseOperator):
         self.dttm = '{{ execution_date.isoformat() }}'
 
     def get_default_exprs(self, col, col_type):
-        if col in self.col_blacklist:
+        if col in self.excluded_columns:
             return {}
         d = {(col, 'non_null'): "COUNT({col})"}
         if col_type in ['double', 'int', 'bigint', 'float', 'double']:
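
Illustration only (not part of the patch above): a minimal sketch of the renamed
argument as it would appear in a DAG file, assuming the operator is importable in
the environment; the table, partition, column and task_id values are made up.
Passing the old col_blacklist kwarg still works, but now emits a FutureWarning
and is mapped onto excluded_columns.

    from airflow.operators.hive_stats_operator import HiveStatsCollectionOperator

    # New-style argument name; 'raw_blob' is an illustrative column to skip.
    gather_stats = HiveStatsCollectionOperator(
        task_id='gather_hive_stats',
        table='my_table',
        partition={'ds': '{{ ds }}'},
        excluded_columns=['raw_blob'],
    )

    # Old-style kwarg is still accepted for backwards compatibility, but emits a
    # FutureWarning before being renamed internally.
    legacy_stats = HiveStatsCollectionOperator(
        task_id='gather_hive_stats_legacy',
        table='my_table',
        partition={'ds': '{{ ds }}'},
        col_blacklist=['raw_blob'],
    )
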
diff --git a/airflow/plugins_manager.py b/airflow/plugins_manager.py
index 317cbde..d68f882 100644
--- a/airflow/plugins_manager.py
+++ b/airflow/plugins_manager.py
@@ -114,7 +114,7 @@ def register_inbuilt_operator_links():
     Register all the Operators Links that are already defined for the operators
     in the "airflow" project. Example: QDSLink (Operator Link for Qubole Operator)
 
-    This is required to populate the "whitelist" of allowed classes when deserializing operator links
+    This is required to populate the "allowed list" of allowed classes when deserializing operator links
     """
     inbuilt_operator_links = set()  # type: Set[Type]
 
diff --git a/airflow/www/static/underscore.js b/airflow/www/static/underscore.js
index 70fae3f..7252539 100644
--- a/airflow/www/static/underscore.js
+++ b/airflow/www/static/underscore.js
@@ -806,7 +806,7 @@
     return obj;
   };
 
-  // Return a copy of the object only containing the whitelisted properties.
+  // Return a copy of the object only containing the allowed list properties.
   _.pick = function(obj) {
     var copy = {};
     var keys = concat.apply(ArrayProto, slice.call(arguments, 1));
@@ -816,7 +816,7 @@
     return copy;
   };
 
-   // Return a copy of the object without the blacklisted properties.
+   // Return a copy of the object without the disallowed properties.
   _.omit = function(obj) {
     var copy = {};
     var keys = concat.apply(ArrayProto, slice.call(arguments, 1));
diff --git a/breeze-complete b/breeze-complete
index d1df5b9..5d8a724 100644
--- a/breeze-complete
+++ b/breeze-complete
@@ -60,6 +60,7 @@ fix-encoding-pragma
 flake8
 forbid-tabs
 insert-license
+language-matters
 lint-dockerfile
 mixed-line-ending
 mypy
diff --git a/docs/security.rst b/docs/security.rst
index e6bced8..0e49885 100644
--- a/docs/security.rst
+++ b/docs/security.rst
@@ -322,7 +322,7 @@ GitHub Enterprise (GHE) Authentication
 
 The GitHub Enterprise authentication backend can be used to authenticate users
 against an installation of GitHub Enterprise using OAuth2. You can optionally
-specify a team whitelist (composed of slug cased team names) to restrict login
+specify a team allowed list (composed of slug cased team names) to restrict login
 to only members of those teams.
 
 .. code-block:: bash
@@ -338,7 +338,7 @@ to only members of those teams.
     oauth_callback_route = /example/ghe_oauth/callback
     allowed_teams = 1, 345, 23
 
-.. note:: If you do not specify a team whitelist, anyone with a valid account on
+.. note:: If you do not specify a team allowed list, anyone with a valid account on
    your GHE installation will be able to login to Airflow.
 
 To use GHE authentication, you must install Airflow with the ``github_enterprise`` extras group:
diff --git a/tests/contrib/hooks/test_cassandra_hook.py b/tests/contrib/hooks/test_cassandra_hook.py
index a798152..b8c120b 100644
--- a/tests/contrib/hooks/test_cassandra_hook.py
+++ b/tests/contrib/hooks/test_cassandra_hook.py
@@ -113,7 +113,7 @@ class TestCassandraHook(unittest.TestCase):
                                    TokenAwarePolicy,
                                    expected_child_policy_type=RoundRobinPolicy)
 
-    def test_get_lb_policy_no_host_for_white_list(self):
+    def test_get_lb_policy_no_host_for_allow_list(self):
         # test host not specified for WhiteListRoundRobinPolicy should throw exception
         self._assert_get_lb_policy('WhiteListRoundRobinPolicy',
                                    {},
diff --git a/tests/contrib/operators/test_hive_stats.py b/tests/contrib/operators/test_hive_stats.py
index 7df8e35..69c9b74 100644
--- a/tests/contrib/operators/test_hive_stats.py
+++ b/tests/contrib/operators/test_hive_stats.py
@@ -60,9 +60,9 @@ class TestHiveStatsCollectionOperator(TestHiveEnvironment):
             (col, 'non_null'): 'COUNT({})'.format(col)
         })
 
-    def test_get_default_exprs_blacklist(self):
-        col = 'blacklisted_col'
-        self.kwargs.update(dict(col_blacklist=[col]))
+    def test_get_default_exprs_excluded_cols(self):
+        col = 'excluded_col'
+        self.kwargs.update(dict(excluded_columns=[col]))
 
         default_exprs = HiveStatsCollectionOperator(**self.kwargs).get_default_exprs(col, None)
 
diff --git a/tests/utils/test_dag_processing.py b/tests/utils/test_dag_processing.py
index de8f06c..2232310 100644
--- a/tests/utils/test_dag_processing.py
+++ b/tests/utils/test_dag_processing.py
@@ -246,9 +246,9 @@ class TestDagFileProcessorManager(unittest.TestCase):
             class FakeDagFIleProcessor(DagFileProcessor):
                 # This fake processor will return the zombies it received in constructor
                 # as its processing result w/o actually parsing anything.
-                def __init__(self, file_path, pickle_dags, dag_id_white_list, zombies):
+                def __init__(self, file_path, pickle_dags, dag_ids, zombies):
                     super(FakeDagFIleProcessor, self).__init__(
-                        file_path, pickle_dags, dag_id_white_list, zombies
+                        file_path, pickle_dags, dag_ids, zombies
                     )
 
                     self._result = zombies, 0


[airflow] 23/25: add guidance re yarn build for local virtualenv development (#9411)

Posted by po...@apache.org.
This is an automated email from the ASF dual-hosted git repository.

potiuk pushed a commit to branch v1-10-test
in repository https://gitbox.apache.org/repos/asf/airflow.git

commit 3e10afcf98978a11f488dccd10fc822723e0115c
Author: dstandish <ds...@users.noreply.github.com>
AuthorDate: Fri Jun 19 10:20:56 2020 -0700

    add guidance re yarn build for local virtualenv development (#9411)
    
    
    (cherry picked from commit 05ea88869be6a62d312598b5b01095e16f0a16f8)
---
 LOCAL_VIRTUALENV.rst | 7 +++++++
 1 file changed, 7 insertions(+)

diff --git a/LOCAL_VIRTUALENV.rst b/LOCAL_VIRTUALENV.rst
index cd66ea4..b744fc3 100644
--- a/LOCAL_VIRTUALENV.rst
+++ b/LOCAL_VIRTUALENV.rst
@@ -147,6 +147,13 @@ Activate your virtualenv, e.g. by using ``workon``, and once you are in it, run:
 
   ./breeze initialize-local-virtualenv
 
+5. (optionally) run yarn build if you plan to run the webserver
+
+.. code-block:: bash
+
+    cd airflow/www
+    yarn build
+
 Running Tests
 -------------
 


[airflow] 13/25: Remove generating temp remote manifest file in project dir (#9267)

Posted by po...@apache.org.
This is an automated email from the ASF dual-hosted git repository.

potiuk pushed a commit to branch v1-10-test
in repository https://gitbox.apache.org/repos/asf/airflow.git

commit 52675739eeaa393c71c32991b885eac9f6897d24
Author: Kaxil Naik <ka...@gmail.com>
AuthorDate: Sat Jun 13 18:35:34 2020 +0100

    Remove generating temp remote manifest file in project dir (#9267)
    
    
    (cherry picked from commit f5795f1d6e965cccf76970f4eccab1f23c3b1092)
---
 scripts/ci/libraries/_build_images.sh | 3 ++-
 1 file changed, 2 insertions(+), 1 deletion(-)

diff --git a/scripts/ci/libraries/_build_images.sh b/scripts/ci/libraries/_build_images.sh
index be349b8..9d2b748 100644
--- a/scripts/ci/libraries/_build_images.sh
+++ b/scripts/ci/libraries/_build_images.sh
@@ -243,7 +243,8 @@ function get_remote_image_info() {
     set -e
 
     # Docker needs the file passed to --cidfile to not exist, so we can't use mktemp
-    TMP_CONTAINER_ID="remote-airflow-manifest-$$.container_id"
+    TMP_CONTAINER_DIR="$(mktemp -d)"
+    TMP_CONTAINER_ID="${TMP_CONTAINER_DIR}/remote-airflow-manifest-$$.container_id"
     FILES_TO_CLEANUP_ON_EXIT+=("$TMP_CONTAINER_ID")
 
     TMP_MANIFEST_REMOTE_JSON=$(mktemp)
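
Side note, not part of the patch: the same pattern expressed in Python for
comparison - Docker's --cidfile needs a path that does not exist yet, so the
script now creates a throwaway temporary directory outside the project tree and
points at a not-yet-created file inside it (the file name below is illustrative).

    import os
    import tempfile

    # Fresh directory outside the source tree; the container-id file inside it
    # does not exist yet, which is exactly what --cidfile requires.
    tmp_container_dir = tempfile.mkdtemp()
    tmp_container_id = os.path.join(tmp_container_dir, "remote-airflow-manifest.container_id")
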


[airflow] 10/25: Add generic CLI tool wrapper (#9223)

Posted by po...@apache.org.
This is an automated email from the ASF dual-hosted git repository.

potiuk pushed a commit to branch v1-10-test
in repository https://gitbox.apache.org/repos/asf/airflow.git

commit 68d7399cb615518404aef52f137d858286feb877
Author: Kamil Breguła <mi...@users.noreply.github.com>
AuthorDate: Thu Jun 11 18:50:31 2020 +0200

    Add generic CLI tool wrapper (#9223)
    
    * Add generic CLI tool wrapper

    * Pass working directory to container
    
    * Share namespaces between all containers
    
    * Fix permissions hack
    
    * Unify code style
    
    Co-authored-by: Felix Uellendall <fe...@users.noreply.github.com>
    
    * Detect standalone execution by checking symbolic link
    
    * User friendly error message when env var is missing
    
    * Display error to stderr
    
    * Display errors on stderr
    
    * Fix permission hack
    
    * Fix condition in if
    
    * Fix missing env-file
    
    * TEST: Install airflow without copying sources
    
    * Update scripts/ci/in_container/run_prepare_backport_readme.sh
    
    Co-authored-by: Felix Uellendall <fe...@users.noreply.github.com>
    (cherry picked from commit f17a02d33047ebbfd9f92d3d1d54d6d810f596c1)
---
 Dockerfile.ci                            |   2 +-
 scripts/ci/docker-compose/local-prod.yml |   2 +-
 scripts/ci/docker-compose/local.yml      |   2 +-
 scripts/ci/libraries/_local_mounts.sh    |   2 +-
 scripts/ci/prepare_tool_scripts.sh       |  64 ------------
 scripts/ci/run_cli_tool.sh               | 167 +++++++++++++++++++++++++++++++
 6 files changed, 171 insertions(+), 68 deletions(-)

diff --git a/Dockerfile.ci b/Dockerfile.ci
index 1549214..24ee87d 100644
--- a/Dockerfile.ci
+++ b/Dockerfile.ci
@@ -315,7 +315,7 @@ RUN if [[ -n "${ADDITIONAL_PYTHON_DEPS}" ]]; then \
         pip install ${ADDITIONAL_PYTHON_DEPS}; \
     fi
 
-RUN scripts/ci/prepare_tool_scripts.sh
+RUN source <(bash scripts/ci/run_cli_tool.sh)
 
 WORKDIR ${AIRFLOW_SOURCES}
 
diff --git a/scripts/ci/docker-compose/local-prod.yml b/scripts/ci/docker-compose/local-prod.yml
index a82b4f8..6342d33 100644
--- a/scripts/ci/docker-compose/local-prod.yml
+++ b/scripts/ci/docker-compose/local-prod.yml
@@ -35,7 +35,7 @@ services:
       - ../../../setup.cfg:/opt/airflow/setup.cfg:cached
       - ../../../setup.py:/opt/airflow/setup.py:cached
       - ../../../tests:/opt/airflow/tests:cached
-      - ../../../tmp:/opt/airflow/tmp:cached
+      - ../../../tmp:/tmp:cached
     environment:
       - HOST_USER_ID
       - HOST_GROUP_ID
diff --git a/scripts/ci/docker-compose/local.yml b/scripts/ci/docker-compose/local.yml
index 3c9e40b..ff88c6c 100644
--- a/scripts/ci/docker-compose/local.yml
+++ b/scripts/ci/docker-compose/local.yml
@@ -54,7 +54,7 @@ services:
       - ../../../setup.py:/opt/airflow/setup.py:cached
       - ../../../tests:/opt/airflow/tests:cached
       - ../../../kubernetes_tests:/opt/airflow/kubernetes_tests:cached
-      - ../../../tmp:/opt/airflow/tmp:cached
+      - ../../../tmp:/tmp:cached
       # END automatically generated volumes from LOCAL_MOUNTS in _local_mounts.sh
     environment:
       - HOST_USER_ID
diff --git a/scripts/ci/libraries/_local_mounts.sh b/scripts/ci/libraries/_local_mounts.sh
index 5750600..24a2eb3 100644
--- a/scripts/ci/libraries/_local_mounts.sh
+++ b/scripts/ci/libraries/_local_mounts.sh
@@ -52,7 +52,7 @@ function generate_local_mounts_list {
         "$prefix"setup.py:/opt/airflow/setup.py:cached
         "$prefix"tests:/opt/airflow/tests:cached
         "$prefix"kubernetes_tests:/opt/airflow/kubernetes_tests:cached
-        "$prefix"tmp:/opt/airflow/tmp:cached
+        "$prefix"tmp:/tmp:cached
     )
 }
 
diff --git a/scripts/ci/prepare_tool_scripts.sh b/scripts/ci/prepare_tool_scripts.sh
deleted file mode 100755
index 7a98c50..0000000
--- a/scripts/ci/prepare_tool_scripts.sh
+++ /dev/null
@@ -1,64 +0,0 @@
-#!/usr/bin/env bash
-
-# Licensed to the Apache Software Foundation (ASF) under one
-# or more contributor license agreements.  See the NOTICE file
-# distributed with this work for additional information
-# regarding copyright ownership.  The ASF licenses this file
-# to you under the Apache License, Version 2.0 (the
-# "License"); you may not use this file except in compliance
-# with the License.  You may obtain a copy of the License at
-#
-#   http://www.apache.org/licenses/LICENSE-2.0
-#
-# Unless required by applicable law or agreed to in writing,
-# software distributed under the License is distributed on an
-# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
-# KIND, either express or implied.  See the License for the
-# specific language governing permissions and limitations
-# under the License.
-set -euo pipefail
-
-function prepare_tool_script() {
-    IMAGE="${1}"
-    VOLUME="${2}"
-    TOOL="${3}"
-    COMMAND="${4:-}"
-
-    TARGET_TOOL_PATH="/usr/bin/${TOOL}"
-    TARGET_TOOL_UPDATE_PATH="/usr/bin/${TOOL}-update"
-
-    cat >"${TARGET_TOOL_PATH}" <<EOF
-#!/usr/bin/env bash
-docker run --rm -it \
-    -v "\${HOST_AIRFLOW_SOURCES}/tmp:/tmp" \
-    -v "\${HOST_AIRFLOW_SOURCES}/files:/files" \
-    -v "\${HOST_AIRFLOW_SOURCES}:/opt/airflow" \
-    -v "\${HOST_HOME}/${VOLUME}:/root/${VOLUME}" \
-    "${IMAGE}" ${COMMAND} "\$@"
-RES=\$?
-if [[ \${HOST_OS} == "Linux" ]]; then
-    docker run --rm \
-        -v "\${HOST_AIRFLOW_SOURCES}/tmp:/tmp" \
-        -v "\${HOST_AIRFLOW_SOURCES}/files:/files" \
-        -v "\${HOST_HOME}/${VOLUME}:/root/${VOLUME}" \
-        "\${AIRFLOW_CI_IMAGE}" bash -c \
-        "find '/tmp/' '/files/' '/root/${VOLUME}' -user root -print0 | xargs --null chown '\${HOST_USER_ID}.\${HOST_GROUP_ID}' --no-dereference" >/dev/null 2>&1
-fi
-exit \${RES}
-EOF
-
-    cat >"${TARGET_TOOL_UPDATE_PATH}" <<EOF
-#!/usr/bin/env bash
-docker pull "${IMAGE}"
-EOF
-
-    chmod a+x "${TARGET_TOOL_PATH}" "${TARGET_TOOL_UPDATE_PATH}"
-}
-
-GCLOUD_IMAGE="gcr.io/google.com/cloudsdktool/cloud-sdk:latest"
-
-prepare_tool_script "amazon/aws-cli:latest" ".aws" aws
-prepare_tool_script "mcr.microsoft.com/azure-cli:latest" ".azure" az az
-prepare_tool_script "${GCLOUD_IMAGE}" ".config/gcloud" bq bq
-prepare_tool_script "${GCLOUD_IMAGE}" ".config/gcloud" gcloud gcloud
-prepare_tool_script "${GCLOUD_IMAGE}" ".config/gcloud" gsutil gsutil
diff --git a/scripts/ci/run_cli_tool.sh b/scripts/ci/run_cli_tool.sh
new file mode 100755
index 0000000..cf840bc
--- /dev/null
+++ b/scripts/ci/run_cli_tool.sh
@@ -0,0 +1,167 @@
+#!/usr/bin/env bash
+
+# Licensed to the Apache Software Foundation (ASF) under one
+# or more contributor license agreements.  See the NOTICE file
+# distributed with this work for additional information
+# regarding copyright ownership.  The ASF licenses this file
+# to you under the Apache License, Version 2.0 (the
+# "License"); you may not use this file except in compliance
+# with the License.  You may obtain a copy of the License at
+#
+#   http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing,
+# software distributed under the License is distributed on an
+# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+# KIND, either express or implied.  See the License for the
+# specific language governing permissions and limitations
+# under the License.
+set -euo pipefail
+
+if [ -z "${AIRFLOW_CI_IMAGE}" ]; then
+    >&2 echo "Missing environment variable AIRFLOW_CI_IMAGE"
+    exit 1
+fi
+if [ -z "${HOST_AIRFLOW_SOURCES}" ]; then
+    >&2 echo "Missing environment variable HOST_AIRFLOW_SOURCES"
+    exit 1
+fi
+if [ -z "${HOST_USER_ID}" ]; then
+    >&2 echo "Missing environment variable HOST_USER_ID"
+    exit 1
+fi
+if [ -z "${HOST_GROUP_ID}" ]; then
+    >&2 echo "Missing environment variable HOST_GROUP_ID"
+    exit 1
+fi
+
+SCRIPT_NAME="$( basename "${BASH_SOURCE[0]}")"
+# Drop "-update" suffix, if exists
+TOOL_NAME="$(echo "${SCRIPT_NAME}" | cut -d "-" -f 1)"
+
+SUPPORTED_TOOL_NAMES=("aws" "az" "gcloud" "bq" "gsutil" "terraform" "java")
+
+if [ ! -L "${BASH_SOURCE[0]}" ]
+then
+    # Direct execution - return installation script
+    >&2 echo "# CLI tool wrappers"
+    >&2 echo "#"
+    >&2 echo "# To install, run the following command:"
+    >&2 echo "#     source <(bash ${SCRIPT_PATH@Q})"
+    >&2 echo "#"
+    >&2 echo ""
+    # Print installation script
+    for NAME in "${SUPPORTED_TOOL_NAMES[@]}"
+    do
+        echo "ln -s ${SCRIPT_PATH@Q} /usr/bin/${NAME}"
+        echo "ln -s ${SCRIPT_PATH@Q} /usr/bin/${NAME}-update"
+        echo "chmod +x /usr/bin/${NAME} /usr/bin/${NAME}-update"
+    done
+    exit 0
+fi
+ENV_TMP_FILE=$(mktemp)
+env > "${ENV_TMP_FILE}"
+cleanup() {
+    rm "${ENV_TMP_FILE}"
+}
+trap cleanup EXIT
+
+CONTAINER_ID="$(head -n 1 < /proc/self/cgroup | cut -d ":" -f 3 | cut -d "/" -f 3)"
+
+COMMON_DOCKER_ARGS=(
+    # Share namespaces between all containers.
+    # This way we are even closer to run those tools like if they were installed.
+    # More information: https://docs.docker.com/get-started/overview/#namespaces
+    --ipc "container:${CONTAINER_ID}"
+    --pid "container:${CONTAINER_ID}"
+    --network "container:${CONTAINER_ID}"
+    -v "${HOST_AIRFLOW_SOURCES}/tmp:/tmp"
+    -v "${HOST_AIRFLOW_SOURCES}/files:/files"
+    -v "${HOST_AIRFLOW_SOURCES}:/opt/airflow"
+    --env-file "${ENV_TMP_FILE}"
+    -w "${PWD}"
+)
+
+AWS_CREDENTIALS_DOCKER_ARGS=(-v "${HOST_HOME}/.aws:/root/.aws")
+AZURE_CREDENTIALS_DOCKER_ARGS=(-v "${HOST_HOME}/.azure:/root/.azure")
+GOOGLE_CREDENTIALS_DOCKER_ARGS=(-v "${HOST_HOME}/.config/gcloud:/root/.config/gcloud")
+
+DIRECTORIES_TO_FIX=('/tmp/' '/files/')
+
+COMMAND=("${@}")
+
+# Configure selected tool
+case "${TOOL_NAME}" in
+    aws )
+        COMMON_DOCKER_ARGS+=("${AWS_CREDENTIALS_DOCKER_ARGS[@]}")
+        DIRECTORIES_TO_FIX+=("/root/.aws")
+        IMAGE_NAME="amazon/aws-cli:latest"
+        ;;
+    az )
+        COMMON_DOCKER_ARGS+=("${AZURE_CREDENTIALS_DOCKER_ARGS[@]}")
+        DIRECTORIES_TO_FIX+=("/root/.azure")
+        IMAGE_NAME="mcr.microsoft.com/azure-cli:latest"
+        ;;
+    gcloud | bq | gsutil )
+        COMMON_DOCKER_ARGS+=("${GOOGLE_CREDENTIALS_DOCKER_ARGS[@]}")
+        DIRECTORIES_TO_FIX+=("/root/.config/gcloud")
+        IMAGE_NAME="gcr.io/google.com/cloudsdktool/cloud-sdk:latest"
+        COMMAND=("$TOOL_NAME" "${@}")
+        ;;
+    terraform )
+        COMMON_DOCKER_ARGS+=(
+            "${GOOGLE_CREDENTIALS_DOCKER_ARGS[@]}"
+            "${AZURE_CREDENTIALS_DOCKER_ARGS[@]}"
+            "${AWS_CREDENTIALS_DOCKER_ARGS[@]}"
+        )
+        DIRECTORIES_TO_FIX+=(
+            "/root/.config/gcloud"
+            "/root/.aws"
+            "/root/.azure"
+        )
+        IMAGE_NAME="hashicorp/terraform:latest"
+        ;;
+    java )
+        # TODO: Should we add other credentials?
+        COMMON_DOCKER_ARGS+=("${GOOGLE_CREDENTIALS_DOCKER_ARGS[@]}")
+        DIRECTORIES_TO_FIX+=("/root/.config/gcloud")
+        IMAGE_NAME="openjdk:8-jre-slim"
+        COMMAND=("/usr/local/openjdk-8/bin/java" "${@}")
+        ;;
+    * )
+        >&2 echo "Unsupported tool name: ${TOOL_NAME}"
+        exit 1
+        ;;
+esac
+
+# Run update, if requested
+if [[ "${SCRIPT_NAME}" == *-update ]]; then
+    docker pull "${IMAGE_NAME}"
+    exit $?
+fi
+
+# Otherwise, run tool
+TOOL_DOCKER_ARGS=(--rm --interactive)
+TOOL_DOCKER_ARGS+=("${COMMON_DOCKER_ARGS[@]}")
+
+if [ -t 0 ] ; then
+    TOOL_DOCKER_ARGS+=(
+        --tty
+    )
+fi
+
+docker run "${TOOL_DOCKER_ARGS[@]}" "${IMAGE_NAME}" "${COMMAND[@]}"
+
+RES=$?
+
+# Set file permissions to the host user
+if [[ "${HOST_OS}" == "Linux" ]]; then
+    FIX_DOCKER_ARGS=(--rm)
+    FIX_DOCKER_ARGS+=("${COMMON_DOCKER_ARGS[@]}")
+    FIX_COMMAND=(bash -c
+        "find ${DIRECTORIES_TO_FIX[@]@Q} -user root -print0 | xargs --null chown '${HOST_USER_ID}.${HOST_GROUP_ID}' --no-dereference")
+
+    docker run "${FIX_DOCKER_ARGS[@]}" "${AIRFLOW_CI_IMAGE}" "${FIX_COMMAND[@]}" >/dev/null 2>&1
+fi
+
+exit ${RES}


[airflow] 17/25: Fixes unbound variable on MacOS (#9335)

Posted by po...@apache.org.
This is an automated email from the ASF dual-hosted git repository.

potiuk pushed a commit to branch v1-10-test
in repository https://gitbox.apache.org/repos/asf/airflow.git

commit f045a8d849feb84d924fecc37446200017a6283e
Author: Jarek Potiuk <ja...@polidea.com>
AuthorDate: Tue Jun 16 20:25:24 2020 +0200

    Fixes unbound variable on MacOS (#9335)
    
    Closes #9334
    
    (cherry picked from commit 2fc13f00682fe98061f55686d7aa5bada969a931)
---
 breeze | 3 ++-
 1 file changed, 2 insertions(+), 1 deletion(-)

diff --git a/breeze b/breeze
index d78eaf2..118703d 100755
--- a/breeze
+++ b/breeze
@@ -1864,7 +1864,7 @@ function run_build_command {
 # Runs the actual command - depending on the command chosen it will use the right
 # Convenient script and run the right command with it
 function run_breeze_command {
-    set -u
+    set +u
     case "${COMMAND_TO_RUN}" in
         enter_breeze)
             if [[ ${PRODUCTION_IMAGE} == "true" ]]; then
@@ -1939,6 +1939,7 @@ function run_breeze_command {
           echo >&2
           ;;
     esac
+    set -u
 }
 
 setup_default_breeze_variables


[airflow] 04/25: Remove httplib2 from Google requirements (#9194)

Posted by po...@apache.org.
This is an automated email from the ASF dual-hosted git repository.

potiuk pushed a commit to branch v1-10-test
in repository https://gitbox.apache.org/repos/asf/airflow.git

commit 7aa60b85f22b8da1bf6dd28e8ab90834bb8cb952
Author: Kamil Breguła <mi...@users.noreply.github.com>
AuthorDate: Tue Jun 9 20:33:17 2020 +0200

    Remove httplib2 from Google requirements (#9194)
    
    * Remove httplib2 from Google requirements
    
    * fixup! Remove httplib2 from Google requirements
    
    (cherry picked from commit 6d4972a0a7ff497a018f15a919a829f58de6be32)
---
 requirements/requirements-python2.7.txt | 54 +++++++++++++-------------
 requirements/requirements-python3.5.txt | 64 +++++++++++++++----------------
 requirements/requirements-python3.6.txt | 68 ++++++++++++++++-----------------
 requirements/requirements-python3.7.txt | 66 ++++++++++++++++----------------
 requirements/setup-2.7.md5              |  2 +-
 requirements/setup-3.5.md5              |  2 +-
 requirements/setup-3.6.md5              |  2 +-
 requirements/setup-3.7.md5              |  2 +-
 setup.py                                |  1 -
 9 files changed, 130 insertions(+), 131 deletions(-)

diff --git a/requirements/requirements-python2.7.txt b/requirements/requirements-python2.7.txt
index 909d880..bcce040 100644
--- a/requirements/requirements-python2.7.txt
+++ b/requirements/requirements-python2.7.txt
@@ -12,7 +12,7 @@ Flask-SQLAlchemy==2.4.3
 Flask-WTF==0.14.3
 Flask==1.1.2
 JPype1==0.7.1
-JayDeBeApi==1.2.2
+JayDeBeApi==1.2.3
 Jinja2==2.10.3
 Mako==1.1.3
 Markdown==2.6.11
@@ -45,7 +45,7 @@ astroid==1.6.6
 atlasclient==1.0.0
 atomicwrites==1.4.0
 attrs==19.3.0
-aws-sam-translator==1.24.0
+aws-sam-translator==1.25.0
 aws-xray-sdk==2.6.0
 azure-common==1.1.25
 azure-cosmos==3.1.2
@@ -69,9 +69,9 @@ beautifulsoup4==4.7.1
 billiard==3.6.3.0
 bleach==3.1.5
 blinker==1.4
-boto3==1.13.25
+boto3==1.14.7
 boto==2.49.0
-botocore==1.16.25
+botocore==1.17.7
 cached-property==1.5.1
 cachetools==3.1.1
 cassandra-driver==3.20.2
@@ -80,7 +80,7 @@ celery==4.4.5
 certifi==2020.4.5.2
 cffi==1.14.0
 cfgv==2.0.1
-cfn-lint==0.33.0
+cfn-lint==0.33.1
 cgroupspy==0.1.6
 chardet==3.0.4
 click==6.7
@@ -92,13 +92,13 @@ contextdecorator==0.10.0
 contextlib2==0.6.0.post1
 cookies==2.2.1
 coverage==5.1
-croniter==0.3.32
+croniter==0.3.34
 cryptography==2.9.2
 cx-Oracle==7.3.0
 datadog==0.36.0
 decorator==4.4.2
 defusedxml==0.6.0
-dill==0.3.1.1
+dill==0.3.2
 distlib==0.3.0
 dnspython==1.16.0
 docker-pycreds==0.4.0
@@ -126,24 +126,24 @@ future-fstrings==1.2.0
 future==0.18.2
 futures==3.3.0
 gcsfs==0.2.3
-google-api-core==1.19.1
-google-api-python-client==1.9.1
+google-api-core==1.21.0
+google-api-python-client==1.9.3
 google-auth-httplib2==0.0.3
 google-auth-oauthlib==0.4.1
-google-auth==1.16.1
+google-auth==1.18.0
 google-cloud-bigquery==1.25.0
 google-cloud-bigtable==1.2.1
-google-cloud-container==0.5.0
+google-cloud-container==1.0.1
 google-cloud-core==1.3.0
-google-cloud-dlp==0.15.0
+google-cloud-dlp==1.0.0
 google-cloud-language==1.3.0
 google-cloud-secret-manager==1.0.0
 google-cloud-spanner==1.17.0
 google-cloud-speech==1.3.2
-google-cloud-storage==1.28.1
+google-cloud-storage==1.29.0
 google-cloud-texttospeech==1.0.1
 google-cloud-translate==2.0.1
-google-cloud-videointelligence==1.14.0
+google-cloud-videointelligence==1.15.0
 google-cloud-vision==1.0.0
 google-resumable-media==0.5.1
 googleapis-common-protos==1.52.0
@@ -156,13 +156,13 @@ hdfs==2.5.8
 hmsclient==0.1.1
 httplib2==0.18.1
 humanize==0.5.1
-hvac==0.10.3
+hvac==0.10.4
 identify==1.4.19
 idna==2.9
 ijson==2.6.1
 imagesize==1.2.0
 importlib-metadata==1.6.1
-importlib-resources==2.0.0
+importlib-resources==2.0.1
 inflection==0.3.1
 ipaddress==1.0.23
 ipdb==0.13.2
@@ -198,7 +198,7 @@ mongomock==3.19.0
 monotonic==1.5
 more-itertools==5.0.0
 moto==1.3.14
-msrest==0.6.15
+msrest==0.6.16
 msrestazure==0.6.3
 multi-key-dict==2.0.3
 mysqlclient==1.3.14
@@ -208,7 +208,7 @@ nbformat==4.4.0
 networkx==2.2
 nodeenv==1.4.0
 nteract-scrapbook==0.3.1
-ntlm-auth==1.4.0
+ntlm-auth==1.5.0
 numpy==1.16.6
 oauthlib==3.1.0
 oscrypto==1.2.0
@@ -235,7 +235,7 @@ protobuf==3.12.2
 psutil==5.7.0
 psycopg2-binary==2.8.5
 ptyprocess==0.6.0
-py==1.8.1
+py==1.8.2
 pyOpenSSL==19.1.0
 pyasn1-modules==0.2.8
 pyasn1==0.4.8
@@ -251,11 +251,11 @@ pymssql==2.1.4
 pyparsing==2.4.7
 pyrsistent==0.16.0
 pysftp==0.2.9
-pytest-cov==2.9.0
+pytest-cov==2.10.0
 pytest-forked==1.1.3
-pytest-instafail==0.4.1.post0
+pytest-instafail==0.4.2
 pytest-rerunfailures==9.0
-pytest-timeout==1.3.4
+pytest-timeout==1.4.1
 pytest-xdist==1.32.0
 pytest==4.6.11
 python-daemon==2.1.2
@@ -279,15 +279,15 @@ requests-mock==1.8.0
 requests-ntlm==1.1.0
 requests-oauthlib==1.3.0
 requests-toolbelt==0.9.1
-requests==2.23.0
-responses==0.10.14
+requests==2.24.0
+responses==0.10.15
 rsa==4.0
 s3transfer==0.3.3
 sasl==0.2.1
 scandir==1.10.0
 sendgrid==5.6.0
 sentinels==1.0.0
-sentry-sdk==0.14.4
+sentry-sdk==0.15.1
 setproctitle==1.1.10
 simplegeneric==0.8.1
 singledispatch==3.4.0.3
@@ -301,7 +301,7 @@ soupsieve==1.9.6
 sphinx-argparse==0.2.5
 sphinx-autoapi==1.0.0
 sphinx-jinja==1.1.1
-sphinx-rtd-theme==0.4.3
+sphinx-rtd-theme==0.5.0
 sphinxcontrib-dotnetdomain==0.4
 sphinxcontrib-golangdomain==0.2.0.dev0
 sphinxcontrib-httpdomain==1.7.0
@@ -331,7 +331,7 @@ uritemplate==3.0.1
 urllib3==1.25.9
 vertica-python==0.10.4
 vine==1.3.0
-virtualenv==20.0.21
+virtualenv==20.0.23
 wcwidth==0.2.4
 webencodings==0.5.1
 websocket-client==0.57.0
diff --git a/requirements/requirements-python3.5.txt b/requirements/requirements-python3.5.txt
index a167c60..725fdd8 100644
--- a/requirements/requirements-python3.5.txt
+++ b/requirements/requirements-python3.5.txt
@@ -12,7 +12,7 @@ Flask-SQLAlchemy==2.4.3
 Flask-WTF==0.14.3
 Flask==1.1.2
 JPype1==0.7.1
-JayDeBeApi==1.2.2
+JayDeBeApi==1.2.3
 Jinja2==2.10.3
 Mako==1.1.3
 Markdown==2.6.11
@@ -25,7 +25,7 @@ PyYAML==5.3.1
 Pygments==2.6.1
 SQLAlchemy-JSONField==0.9.0
 SQLAlchemy==1.3.17
-Sphinx==3.1.0
+Sphinx==3.1.1
 Unidecode==1.1.1
 WTForms==2.3.1
 Werkzeug==0.16.1
@@ -44,7 +44,7 @@ aspy.yaml==1.3.0
 astroid==2.4.2
 atlasclient==1.0.0
 attrs==19.3.0
-aws-sam-translator==1.24.0
+aws-sam-translator==1.25.0
 aws-xray-sdk==2.6.0
 azure-common==1.1.25
 azure-cosmos==3.1.2
@@ -60,9 +60,9 @@ bcrypt==3.1.7
 beautifulsoup4==4.7.1
 billiard==3.6.3.0
 blinker==1.4
-boto3==1.13.25
+boto3==1.14.7
 boto==2.49.0
-botocore==1.16.25
+botocore==1.17.7
 cached-property==1.5.1
 cachetools==4.1.0
 cassandra-driver==3.20.2
@@ -71,7 +71,7 @@ celery==4.4.5
 certifi==2020.4.5.2
 cffi==1.14.0
 cfgv==2.0.1
-cfn-lint==0.33.0
+cfn-lint==0.33.1
 cgroupspy==0.1.6
 chardet==3.0.4
 click==6.7
@@ -80,13 +80,13 @@ colorama==0.4.3
 colorlog==4.0.2
 configparser==3.5.3
 coverage==5.1
-croniter==0.3.32
+croniter==0.3.34
 cryptography==2.9.2
 cx-Oracle==7.3.0
 datadog==0.36.0
 decorator==4.4.2
 defusedxml==0.6.0
-dill==0.3.1.1
+dill==0.3.2
 distlib==0.3.0
 dnspython==1.16.0
 docker-pycreds==0.4.0
@@ -112,24 +112,24 @@ funcsigs==1.0.2
 future-fstrings==1.2.0
 future==0.18.2
 gcsfs==0.6.2
-google-api-core==1.19.1
-google-api-python-client==1.9.1
+google-api-core==1.21.0
+google-api-python-client==1.9.3
 google-auth-httplib2==0.0.3
 google-auth-oauthlib==0.4.1
-google-auth==1.16.1
+google-auth==1.18.0
 google-cloud-bigquery==1.25.0
 google-cloud-bigtable==1.2.1
-google-cloud-container==0.5.0
+google-cloud-container==1.0.1
 google-cloud-core==1.3.0
-google-cloud-dlp==0.15.0
+google-cloud-dlp==1.0.0
 google-cloud-language==1.3.0
 google-cloud-secret-manager==1.0.0
 google-cloud-spanner==1.17.0
 google-cloud-speech==1.3.2
-google-cloud-storage==1.28.1
+google-cloud-storage==1.29.0
 google-cloud-texttospeech==1.0.1
 google-cloud-translate==2.0.1
-google-cloud-videointelligence==1.14.0
+google-cloud-videointelligence==1.15.0
 google-cloud-vision==1.0.0
 google-resumable-media==0.5.1
 googleapis-common-protos==1.52.0
@@ -142,13 +142,13 @@ hdfs==2.5.8
 hmsclient==0.1.1
 httplib2==0.18.1
 humanize==0.5.1
-hvac==0.10.3
+hvac==0.10.4
 identify==1.4.19
 idna==2.9
 ijson==2.6.1
 imagesize==1.2.0
 importlib-metadata==1.6.1
-importlib-resources==2.0.0
+importlib-resources==2.0.1
 inflection==0.5.0
 ipdb==0.13.2
 ipython-genutils==0.2.0
@@ -156,7 +156,7 @@ ipython==7.9.0
 iso8601==0.1.12
 isodate==0.6.0
 itsdangerous==1.1.0
-jedi==0.17.0
+jedi==0.17.1
 jira==2.0.0
 jmespath==0.10.0
 json-merge-patch==0.2
@@ -179,9 +179,9 @@ marshmallow==2.19.5
 mccabe==0.6.1
 mock==3.0.5
 mongomock==3.19.0
-more-itertools==8.3.0
+more-itertools==8.4.0
 moto==1.3.14
-msrest==0.6.15
+msrest==0.6.16
 msrestazure==0.6.3
 multi-key-dict==2.0.3
 mypy-extensions==0.4.3
@@ -189,11 +189,11 @@ mypy==0.720
 mysqlclient==1.3.14
 natsort==7.0.1
 nbclient==0.1.0
-nbformat==5.0.6
+nbformat==5.0.7
 networkx==2.4
 nodeenv==1.4.0
 nteract-scrapbook==0.4.1
-ntlm-auth==1.4.0
+ntlm-auth==1.5.0
 numpy==1.18.5
 oauthlib==3.1.0
 oscrypto==1.2.0
@@ -220,7 +220,7 @@ protobuf==3.12.2
 psutil==5.7.0
 psycopg2-binary==2.8.5
 ptyprocess==0.6.0
-py==1.8.1
+py==1.8.2
 pyOpenSSL==19.1.0
 pyarrow==0.17.1
 pyasn1-modules==0.2.8
@@ -237,11 +237,11 @@ pymssql==2.1.4
 pyparsing==2.4.7
 pyrsistent==0.16.0
 pysftp==0.2.9
-pytest-cov==2.9.0
+pytest-cov==2.10.0
 pytest-forked==1.1.3
-pytest-instafail==0.4.1.post0
+pytest-instafail==0.4.2
 pytest-rerunfailures==9.0
-pytest-timeout==1.3.4
+pytest-timeout==1.4.1
 pytest-xdist==1.32.0
 pytest==5.4.3
 python-daemon==2.1.2
@@ -265,14 +265,14 @@ requests-mock==1.8.0
 requests-ntlm==1.1.0
 requests-oauthlib==1.3.0
 requests-toolbelt==0.9.1
-requests==2.23.0
-responses==0.10.14
-rsa==4.0
+requests==2.24.0
+responses==0.10.15
+rsa==4.6
 s3transfer==0.3.3
 sasl==0.2.1
 sendgrid==5.6.0
 sentinels==1.0.0
-sentry-sdk==0.14.4
+sentry-sdk==0.15.1
 setproctitle==1.1.10
 six==1.15.0
 slackclient==1.3.2
@@ -283,7 +283,7 @@ soupsieve==2.0.1
 sphinx-argparse==0.2.5
 sphinx-autoapi==1.0.0
 sphinx-jinja==1.1.1
-sphinx-rtd-theme==0.4.3
+sphinx-rtd-theme==0.5.0
 sphinxcontrib-applehelp==1.0.2
 sphinxcontrib-devhelp==1.0.2
 sphinxcontrib-dotnetdomain==0.4
@@ -316,7 +316,7 @@ uritemplate==3.0.1
 urllib3==1.25.9
 vertica-python==0.10.4
 vine==1.3.0
-virtualenv==20.0.21
+virtualenv==20.0.23
 wcwidth==0.2.4
 websocket-client==0.57.0
 wrapt==1.12.1
diff --git a/requirements/requirements-python3.6.txt b/requirements/requirements-python3.6.txt
index b189c42..18a1b61 100644
--- a/requirements/requirements-python3.6.txt
+++ b/requirements/requirements-python3.6.txt
@@ -12,7 +12,7 @@ Flask-SQLAlchemy==2.4.3
 Flask-WTF==0.14.3
 Flask==1.1.2
 JPype1==0.7.1
-JayDeBeApi==1.2.2
+JayDeBeApi==1.2.3
 Jinja2==2.10.3
 Mako==1.1.3
 Markdown==2.6.11
@@ -26,7 +26,7 @@ Pygments==2.6.1
 SQLAlchemy-JSONField==0.9.0
 SQLAlchemy-Utils==0.36.6
 SQLAlchemy==1.3.17
-Sphinx==3.1.0
+Sphinx==3.1.1
 Unidecode==1.1.1
 WTForms==2.3.1
 Werkzeug==0.16.1
@@ -45,7 +45,7 @@ astroid==2.4.2
 async-generator==1.10
 atlasclient==1.0.0
 attrs==19.3.0
-aws-sam-translator==1.24.0
+aws-sam-translator==1.25.0
 aws-xray-sdk==2.6.0
 azure-common==1.1.25
 azure-cosmos==3.1.2
@@ -62,9 +62,9 @@ beautifulsoup4==4.7.1
 billiard==3.6.3.0
 black==19.10b0
 blinker==1.4
-boto3==1.13.25
+boto3==1.14.7
 boto==2.49.0
-botocore==1.16.25
+botocore==1.17.7
 cached-property==1.5.1
 cachetools==4.1.0
 cassandra-driver==3.20.2
@@ -73,7 +73,7 @@ celery==4.4.5
 certifi==2020.4.5.2
 cffi==1.14.0
 cfgv==3.1.0
-cfn-lint==0.33.0
+cfn-lint==0.33.1
 cgroupspy==0.1.6
 chardet==3.0.4
 click==6.7
@@ -82,13 +82,13 @@ colorama==0.4.3
 colorlog==4.0.2
 configparser==3.5.3
 coverage==5.1
-croniter==0.3.32
+croniter==0.3.34
 cryptography==2.9.2
 cx-Oracle==7.3.0
 datadog==0.36.0
 decorator==4.4.2
 defusedxml==0.6.0
-dill==0.3.1.1
+dill==0.3.2
 distlib==0.3.0
 dnspython==1.16.0
 docker-pycreds==0.4.0
@@ -114,24 +114,24 @@ funcsigs==1.0.2
 future-fstrings==1.2.0
 future==0.18.2
 gcsfs==0.6.2
-google-api-core==1.19.1
-google-api-python-client==1.9.1
+google-api-core==1.21.0
+google-api-python-client==1.9.3
 google-auth-httplib2==0.0.3
 google-auth-oauthlib==0.4.1
-google-auth==1.16.1
+google-auth==1.18.0
 google-cloud-bigquery==1.25.0
 google-cloud-bigtable==1.2.1
-google-cloud-container==0.5.0
+google-cloud-container==1.0.1
 google-cloud-core==1.3.0
-google-cloud-dlp==0.15.0
+google-cloud-dlp==1.0.0
 google-cloud-language==1.3.0
 google-cloud-secret-manager==1.0.0
 google-cloud-spanner==1.17.0
 google-cloud-speech==1.3.2
-google-cloud-storage==1.28.1
+google-cloud-storage==1.29.0
 google-cloud-texttospeech==1.0.1
 google-cloud-translate==2.0.1
-google-cloud-videointelligence==1.14.0
+google-cloud-videointelligence==1.15.0
 google-cloud-vision==1.0.0
 google-resumable-media==0.5.1
 googleapis-common-protos==1.52.0
@@ -144,13 +144,13 @@ hdfs==2.5.8
 hmsclient==0.1.1
 httplib2==0.18.1
 humanize==0.5.1
-hvac==0.10.3
+hvac==0.10.4
 identify==1.4.19
 idna==2.9
 ijson==2.6.1
 imagesize==1.2.0
 importlib-metadata==1.6.1
-importlib-resources==2.0.0
+importlib-resources==2.0.1
 inflection==0.5.0
 ipdb==0.13.2
 ipython-genutils==0.2.0
@@ -158,7 +158,7 @@ ipython==7.15.0
 iso8601==0.1.12
 isodate==0.6.0
 itsdangerous==1.1.0
-jedi==0.17.0
+jedi==0.17.1
 jira==2.0.0
 jmespath==0.10.0
 json-merge-patch==0.2
@@ -181,22 +181,22 @@ marshmallow==2.21.0
 mccabe==0.6.1
 mock==4.0.2
 mongomock==3.19.0
-more-itertools==8.3.0
+more-itertools==8.4.0
 moto==1.3.14
-msrest==0.6.15
+msrest==0.6.16
 msrestazure==0.6.3
 multi-key-dict==2.0.3
 mypy-extensions==0.4.3
 mypy==0.720
 mysqlclient==1.3.14
 natsort==7.0.1
-nbclient==0.3.1
-nbformat==5.0.6
+nbclient==0.4.0
+nbformat==5.0.7
 nest-asyncio==1.3.3
 networkx==2.4
 nodeenv==1.4.0
 nteract-scrapbook==0.4.1
-ntlm-auth==1.4.0
+ntlm-auth==1.5.0
 numpy==1.18.5
 oauthlib==3.1.0
 oscrypto==1.2.0
@@ -214,7 +214,7 @@ pexpect==4.8.0
 pickleshare==0.7.5
 pinotdb==0.1.1
 pluggy==0.13.1
-pre-commit==2.5.0
+pre-commit==2.5.1
 presto-python-client==0.7.0
 prison==0.1.3
 prompt-toolkit==3.0.5
@@ -222,7 +222,7 @@ protobuf==3.12.2
 psutil==5.7.0
 psycopg2-binary==2.8.5
 ptyprocess==0.6.0
-py==1.8.1
+py==1.8.2
 pyOpenSSL==19.1.0
 pyarrow==0.17.1
 pyasn1-modules==0.2.8
@@ -239,11 +239,11 @@ pymssql==2.1.4
 pyparsing==2.4.7
 pyrsistent==0.16.0
 pysftp==0.2.9
-pytest-cov==2.9.0
+pytest-cov==2.10.0
 pytest-forked==1.1.3
-pytest-instafail==0.4.1.post0
+pytest-instafail==0.4.2
 pytest-rerunfailures==9.0
-pytest-timeout==1.3.4
+pytest-timeout==1.4.1
 pytest-xdist==1.32.0
 pytest==5.4.3
 python-daemon==2.1.2
@@ -268,14 +268,14 @@ requests-mock==1.8.0
 requests-ntlm==1.1.0
 requests-oauthlib==1.3.0
 requests-toolbelt==0.9.1
-requests==2.23.0
-responses==0.10.14
-rsa==4.0
+requests==2.24.0
+responses==0.10.15
+rsa==4.6
 s3transfer==0.3.3
 sasl==0.2.1
 sendgrid==5.6.0
 sentinels==1.0.0
-sentry-sdk==0.14.4
+sentry-sdk==0.15.1
 setproctitle==1.1.10
 six==1.15.0
 slackclient==1.3.2
@@ -286,7 +286,7 @@ soupsieve==2.0.1
 sphinx-argparse==0.2.5
 sphinx-autoapi==1.0.0
 sphinx-jinja==1.1.1
-sphinx-rtd-theme==0.4.3
+sphinx-rtd-theme==0.5.0
 sphinxcontrib-applehelp==1.0.2
 sphinxcontrib-devhelp==1.0.2
 sphinxcontrib-dotnetdomain==0.4
@@ -318,7 +318,7 @@ uritemplate==3.0.1
 urllib3==1.25.9
 vertica-python==0.10.4
 vine==1.3.0
-virtualenv==20.0.21
+virtualenv==20.0.23
 wcwidth==0.2.4
 websocket-client==0.57.0
 wrapt==1.12.1
diff --git a/requirements/requirements-python3.7.txt b/requirements/requirements-python3.7.txt
index a890025..ff1137f 100644
--- a/requirements/requirements-python3.7.txt
+++ b/requirements/requirements-python3.7.txt
@@ -12,7 +12,7 @@ Flask-SQLAlchemy==2.4.3
 Flask-WTF==0.14.3
 Flask==1.1.2
 JPype1==0.7.1
-JayDeBeApi==1.2.2
+JayDeBeApi==1.2.3
 Jinja2==2.10.3
 Mako==1.1.3
 Markdown==2.6.11
@@ -26,7 +26,7 @@ Pygments==2.6.1
 SQLAlchemy-JSONField==0.9.0
 SQLAlchemy-Utils==0.36.6
 SQLAlchemy==1.3.17
-Sphinx==3.1.0
+Sphinx==3.1.1
 Unidecode==1.1.1
 WTForms==2.3.1
 Werkzeug==0.16.1
@@ -45,7 +45,7 @@ astroid==2.4.2
 async-generator==1.10
 atlasclient==1.0.0
 attrs==19.3.0
-aws-sam-translator==1.24.0
+aws-sam-translator==1.25.0
 aws-xray-sdk==2.6.0
 azure-common==1.1.25
 azure-cosmos==3.1.2
@@ -62,9 +62,9 @@ beautifulsoup4==4.7.1
 billiard==3.6.3.0
 black==19.10b0
 blinker==1.4
-boto3==1.13.25
+boto3==1.14.7
 boto==2.49.0
-botocore==1.16.25
+botocore==1.17.7
 cached-property==1.5.1
 cachetools==4.1.0
 cassandra-driver==3.20.2
@@ -73,7 +73,7 @@ celery==4.4.5
 certifi==2020.4.5.2
 cffi==1.14.0
 cfgv==3.1.0
-cfn-lint==0.33.0
+cfn-lint==0.33.1
 cgroupspy==0.1.6
 chardet==3.0.4
 click==6.7
@@ -82,13 +82,13 @@ colorama==0.4.3
 colorlog==4.0.2
 configparser==3.5.3
 coverage==5.1
-croniter==0.3.32
+croniter==0.3.34
 cryptography==2.9.2
 cx-Oracle==7.3.0
 datadog==0.36.0
 decorator==4.4.2
 defusedxml==0.6.0
-dill==0.3.1.1
+dill==0.3.2
 distlib==0.3.0
 dnspython==1.16.0
 docker-pycreds==0.4.0
@@ -114,24 +114,24 @@ funcsigs==1.0.2
 future-fstrings==1.2.0
 future==0.18.2
 gcsfs==0.6.2
-google-api-core==1.19.1
-google-api-python-client==1.9.1
+google-api-core==1.21.0
+google-api-python-client==1.9.3
 google-auth-httplib2==0.0.3
 google-auth-oauthlib==0.4.1
-google-auth==1.16.1
+google-auth==1.18.0
 google-cloud-bigquery==1.25.0
 google-cloud-bigtable==1.2.1
-google-cloud-container==0.5.0
+google-cloud-container==1.0.1
 google-cloud-core==1.3.0
-google-cloud-dlp==0.15.0
+google-cloud-dlp==1.0.0
 google-cloud-language==1.3.0
 google-cloud-secret-manager==1.0.0
 google-cloud-spanner==1.17.0
 google-cloud-speech==1.3.2
-google-cloud-storage==1.28.1
+google-cloud-storage==1.29.0
 google-cloud-texttospeech==1.0.1
 google-cloud-translate==2.0.1
-google-cloud-videointelligence==1.14.0
+google-cloud-videointelligence==1.15.0
 google-cloud-vision==1.0.0
 google-resumable-media==0.5.1
 googleapis-common-protos==1.52.0
@@ -144,7 +144,7 @@ hdfs==2.5.8
 hmsclient==0.1.1
 httplib2==0.18.1
 humanize==0.5.1
-hvac==0.10.3
+hvac==0.10.4
 identify==1.4.19
 idna==2.9
 ijson==2.6.1
@@ -157,7 +157,7 @@ ipython==7.15.0
 iso8601==0.1.12
 isodate==0.6.0
 itsdangerous==1.1.0
-jedi==0.17.0
+jedi==0.17.1
 jira==2.0.0
 jmespath==0.10.0
 json-merge-patch==0.2
@@ -180,22 +180,22 @@ marshmallow==2.21.0
 mccabe==0.6.1
 mock==4.0.2
 mongomock==3.19.0
-more-itertools==8.3.0
+more-itertools==8.4.0
 moto==1.3.14
-msrest==0.6.15
+msrest==0.6.16
 msrestazure==0.6.3
 multi-key-dict==2.0.3
 mypy-extensions==0.4.3
 mypy==0.720
 mysqlclient==1.3.14
 natsort==7.0.1
-nbclient==0.3.1
-nbformat==5.0.6
+nbclient==0.4.0
+nbformat==5.0.7
 nest-asyncio==1.3.3
 networkx==2.4
 nodeenv==1.4.0
 nteract-scrapbook==0.4.1
-ntlm-auth==1.4.0
+ntlm-auth==1.5.0
 numpy==1.18.5
 oauthlib==3.1.0
 oscrypto==1.2.0
@@ -213,7 +213,7 @@ pexpect==4.8.0
 pickleshare==0.7.5
 pinotdb==0.1.1
 pluggy==0.13.1
-pre-commit==2.5.0
+pre-commit==2.5.1
 presto-python-client==0.7.0
 prison==0.1.3
 prompt-toolkit==3.0.5
@@ -221,7 +221,7 @@ protobuf==3.12.2
 psutil==5.7.0
 psycopg2-binary==2.8.5
 ptyprocess==0.6.0
-py==1.8.1
+py==1.8.2
 pyOpenSSL==19.1.0
 pyarrow==0.17.1
 pyasn1-modules==0.2.8
@@ -238,11 +238,11 @@ pymssql==2.1.4
 pyparsing==2.4.7
 pyrsistent==0.16.0
 pysftp==0.2.9
-pytest-cov==2.9.0
+pytest-cov==2.10.0
 pytest-forked==1.1.3
-pytest-instafail==0.4.1.post0
+pytest-instafail==0.4.2
 pytest-rerunfailures==9.0
-pytest-timeout==1.3.4
+pytest-timeout==1.4.1
 pytest-xdist==1.32.0
 pytest==5.4.3
 python-daemon==2.1.2
@@ -267,14 +267,14 @@ requests-mock==1.8.0
 requests-ntlm==1.1.0
 requests-oauthlib==1.3.0
 requests-toolbelt==0.9.1
-requests==2.23.0
-responses==0.10.14
-rsa==4.0
+requests==2.24.0
+responses==0.10.15
+rsa==4.6
 s3transfer==0.3.3
 sasl==0.2.1
 sendgrid==5.6.0
 sentinels==1.0.0
-sentry-sdk==0.14.4
+sentry-sdk==0.15.1
 setproctitle==1.1.10
 six==1.15.0
 slackclient==1.3.2
@@ -285,7 +285,7 @@ soupsieve==2.0.1
 sphinx-argparse==0.2.5
 sphinx-autoapi==1.0.0
 sphinx-jinja==1.1.1
-sphinx-rtd-theme==0.4.3
+sphinx-rtd-theme==0.5.0
 sphinxcontrib-applehelp==1.0.2
 sphinxcontrib-devhelp==1.0.2
 sphinxcontrib-dotnetdomain==0.4
@@ -316,7 +316,7 @@ uritemplate==3.0.1
 urllib3==1.25.9
 vertica-python==0.10.4
 vine==1.3.0
-virtualenv==20.0.21
+virtualenv==20.0.23
 wcwidth==0.2.4
 websocket-client==0.57.0
 wrapt==1.12.1
diff --git a/requirements/setup-2.7.md5 b/requirements/setup-2.7.md5
index fa9e77d..87e77b8 100644
--- a/requirements/setup-2.7.md5
+++ b/requirements/setup-2.7.md5
@@ -1 +1 @@
-a42983dcaa7e536b9c343d1d59ae5b32  /opt/airflow/setup.py
+99a88ba9c37191240b3dd729aed29e4b  /opt/airflow/setup.py
diff --git a/requirements/setup-3.5.md5 b/requirements/setup-3.5.md5
index fa9e77d..87e77b8 100644
--- a/requirements/setup-3.5.md5
+++ b/requirements/setup-3.5.md5
@@ -1 +1 @@
-a42983dcaa7e536b9c343d1d59ae5b32  /opt/airflow/setup.py
+99a88ba9c37191240b3dd729aed29e4b  /opt/airflow/setup.py
diff --git a/requirements/setup-3.6.md5 b/requirements/setup-3.6.md5
index fa9e77d..87e77b8 100644
--- a/requirements/setup-3.6.md5
+++ b/requirements/setup-3.6.md5
@@ -1 +1 @@
-a42983dcaa7e536b9c343d1d59ae5b32  /opt/airflow/setup.py
+99a88ba9c37191240b3dd729aed29e4b  /opt/airflow/setup.py
diff --git a/requirements/setup-3.7.md5 b/requirements/setup-3.7.md5
index fa9e77d..87e77b8 100644
--- a/requirements/setup-3.7.md5
+++ b/requirements/setup-3.7.md5
@@ -1 +1 @@
-a42983dcaa7e536b9c343d1d59ae5b32  /opt/airflow/setup.py
+99a88ba9c37191240b3dd729aed29e4b  /opt/airflow/setup.py
diff --git a/setup.py b/setup.py
index 8178294..9f05da9 100644
--- a/setup.py
+++ b/setup.py
@@ -265,7 +265,6 @@ gcp = [
     'google-cloud-videointelligence>=1.7.0',
     'google-cloud-vision>=0.35.2',
     'grpcio-gcp>=0.2.2',
-    'httplib2~=0.15',
     'pandas-gbq',
 ]
 grpc = [


[airflow] 20/25: Fixed crashing webserver after /tmp is mounted from the host (#9378)

Posted by po...@apache.org.
This is an automated email from the ASF dual-hosted git repository.

potiuk pushed a commit to branch v1-10-test
in repository https://gitbox.apache.org/repos/asf/airflow.git

commit d22054f60ab383002c6f27985fae0859e23e3c17
Author: Jarek Potiuk <ja...@polidea.com>
AuthorDate: Thu Jun 18 16:58:08 2020 +0200

    Fixed crashing webserver after /tmp is mounted from the host (#9378)
    
    The bug was introduced in f17a02d33047ebbfd9f92d3d1d54d6d810f596c1
    
    Gunicorn makes heavy use of os.fchmod in the /tmp directory, which can cause
    excessive blocking in os.fchmod:
    https://docs.gunicorn.org/en/stable/faq.html#how-do-i-avoid-gunicorn-excessively-blocking-in-os-fchmod
    
    We want to switch to /dev/shm (shared memory) in the prod image to make the
    blocking go away and to become independent of the Docker filesystem used
    (osxfs has problems with os.fchmod and user permissions as well).
    
    Use case / motivation
    
    Avoiding contention might be useful, especially in the production image.
    
    This can be done with:
    
    GUNICORN_CMD_ARGS="--worker-tmp-dir /dev/shm"
    
    (cherry picked from commit 4fefaf78a291bee3826f4bd97a9518f7e31f9033)
---
 Dockerfile.ci | 3 +++
 1 file changed, 3 insertions(+)

diff --git a/Dockerfile.ci b/Dockerfile.ci
index 232711b..651810d 100644
--- a/Dockerfile.ci
+++ b/Dockerfile.ci
@@ -313,6 +313,9 @@ WORKDIR ${AIRFLOW_SOURCES}
 
 ENV PATH="${HOME}:${PATH}"
 
+# Needed to stop Gunicorn from crashing when /tmp is now mounted from host
+ENV GUNICORN_CMD_ARGS="--worker-tmp-dir /dev/shm/"
+
 EXPOSE 8080
 
 ENTRYPOINT ["/usr/bin/dumb-init", "--", "/entrypoint"]
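
For reference only (not part of the patch): gunicorn config files are plain
Python, so the same setting can be expressed there instead of through the
GUNICORN_CMD_ARGS environment variable; the file name is hypothetical.

    # gunicorn.conf.py - equivalent to GUNICORN_CMD_ARGS="--worker-tmp-dir /dev/shm"
    worker_tmp_dir = "/dev/shm"
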


[airflow] 08/25: Improved compatibility with Python 3.5+ - Convert signal.SIGTERM to int (#9207)

Posted by po...@apache.org.
This is an automated email from the ASF dual-hosted git repository.

potiuk pushed a commit to branch v1-10-test
in repository https://gitbox.apache.org/repos/asf/airflow.git

commit 936d767b10c0583c592217d6c4926828941807d5
Author: Jiening Wen <ph...@phill84.org>
AuthorDate: Wed Jun 10 12:19:18 2020 +0200

    Improved compatibility with Python 3.5+ - Convert signal.SIGTERM to int (#9207)
    
    Co-authored-by: Jiening Wen <ph...@Jienings-MacBook-Pro.local>
    (cherry picked from commit 1cf52da4ff4c58050273f7adafa787c60776c6e8)
---
 airflow/utils/helpers.py | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)

diff --git a/airflow/utils/helpers.py b/airflow/utils/helpers.py
index 05c6e4d..4913c4d 100644
--- a/airflow/utils/helpers.py
+++ b/airflow/utils/helpers.py
@@ -299,7 +299,7 @@ def reap_process_group(pgid, log, sig=signal.SIGTERM,
             # use sudo -n(--non-interactive) to kill the process
             if err.errno == errno.EPERM:
                 subprocess.check_call(
-                    ["sudo", "-n", "kill", "-" + str(sig)] + [str(p.pid) for p in children]
+                    ["sudo", "-n", "kill", "-" + str(int(sig))] + [str(p.pid) for p in children]
                 )
             else:
                 raise
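
The reason for the cast, illustrated (snippet not part of the patch): on Python
3.5+ the signal constants are enum members, so interpolating them directly no
longer produces the numeric value that the kill command expects.

    import signal

    sig = signal.SIGTERM
    print(str(sig))       # Python 3.5+: 'Signals.SIGTERM' - unusable in "kill -<sig>"
    print(str(int(sig)))  # '15' on Linux - what the sudo kill invocation needs
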


[airflow] 14/25: Update pre-commit-hooks repo version (#9195)

Posted by po...@apache.org.
This is an automated email from the ASF dual-hosted git repository.

potiuk pushed a commit to branch v1-10-test
in repository https://gitbox.apache.org/repos/asf/airflow.git

commit fcdb35ca20c8a283e33e97e13e15508f53566956
Author: Felix Uellendall <fe...@users.noreply.github.com>
AuthorDate: Sun Jun 14 02:37:30 2020 +0200

    Update pre-commit-hooks repo version (#9195)
    
    - use official yamllint pre-commit-hook
    - run isort pre-commit-hook on all python files instead of files ending with py
    
    (cherry picked from commit 1698db4ac103e27b0af132b6002d2e0473c9344b)
---
 .pre-commit-config.yaml | 59 +++++++++++++++++++++++++------------------------
 dev/airflow-jira        |  6 ++---
 2 files changed, 33 insertions(+), 32 deletions(-)

diff --git a/.pre-commit-config.yaml b/.pre-commit-config.yaml
index b1852b7..ed27b09 100644
--- a/.pre-commit-config.yaml
+++ b/.pre-commit-config.yaml
@@ -140,7 +140,7 @@ repos:
     hooks:
       - id: check-hooks-apply
   - repo: https://github.com/pre-commit/pre-commit-hooks
-    rev: v2.5.0
+    rev: v3.1.0
     hooks:
       - id: check-merge-conflict
       - id: debug-statements
@@ -155,45 +155,46 @@ repos:
     hooks:
       - id: rst-backticks
       - id: python-no-log-warn
-  - repo: local
+  - repo: https://github.com/adrienverge/yamllint
+    rev: v1.23.0
     hooks:
       - id: yamllint
         name: Check yaml files with yamllint
         entry: yamllint -c yamllint-config.yml
-        language: python
-        additional_dependencies: ['yamllint']
         types: [yaml]
         exclude: ^.*init_git_sync\.template\.yaml$|^.*airflow\.template\.yaml$
+    ##
+    ## Dear committer.
+    ##
+    ## If you ever come here to add the missing isort step here - hear a little warning.
+    ##
+    ## Initially isort will cause surprising duplicates of urlparse and other urllib related methods.
+    ## The urllib imports seem broken for python 2 but they are actually fine due to future
+    ## backport aliases installed elsewhere in the code (implicitly) - in 6 places.
+    ##
+    ## When you decide how to fix it (likely talking to other people in community) and you push
+    ## build to CI you will find terrible truth that in Airflow 1.10 modules are so much
+    ## cross-dependent, that imports in a number of places have to be done in specific order and
+    ## if this is not followed properly, circular imports kick-in and you are doomed.
+    ##
+    ## Running isort breaks the import House of Cards and there is no easy way to fix it short of
+    ## splitting a number of files and probably breaking compatibility.
+    ##
+    ## Luckily this has been fixed in Airflow 2.0 by proper untangling of the cross-dependencies and
+    ## 1.10.* branch is really in maintenance mode, so do not really waste your time here.
+    ##
+    ## Unless you really want of course. But then either delete this comment or increase the counter
+    ## below after you give up.
+    ##
+    ## Total hours wasted here = 3
+    ##
+  - repo: local
+    hooks:
       - id: shellcheck
         name: Check Shell scripts syntax correctness
         language: docker_image
         entry: koalaman/shellcheck:stable -x -a
         files: ^breeze$|^breeze-complete$|\.sh$|^hooks/build$|^hooks/push$|\.bash$|\.bats$
-        ##
-        ## Dear committer.
-        ##
-        ## If you ever come here to add the missing isort step here - hear a little warning.
-        ##
-        ## Initially isort will cause surprising duplicates of urlparse and other urllib related methods.
-        ## The urllib imports seem broken for python 2 but they are actually fine due to future
-        ## backport aliases installed elsewhere in the code (implicitly) - in 6 places.
-        ##
-        ## When you decide how to fix it (likely talking to other people in community) and you push
-        ## build to CI you will find terrible truth that in Airflow 1.10 modules are so much
-        ## cross-dependent, that imports in a number of places have to be done in specific order and
-        ## if this is not followed properly, circular imports kick-in and you are doomed.
-        ##
-        ## Running isort breaks the import House of Cards and there is no easy way to fix it short of
-        ## splitting a number of files and probably breaking compatibility.
-        ##
-        ## Luckily this has been fixed in Airflow 2.0 by proper untangling of the cross-dependencies and
-        ## 1.10.* branch is really in maintenance mode, so do not really waste your time here.
-        ##
-        ## Unless you really want of course. But then either delete this comment or increase the counter
-        ## below after you give up.
-        ##
-        ## Total hours wasted here = 3
-        ##
       - id: lint-dockerfile
         name: Lint dockerfile
         language: system
diff --git a/dev/airflow-jira b/dev/airflow-jira
index dc006a8..ef9910a 100755
--- a/dev/airflow-jira
+++ b/dev/airflow-jira
@@ -20,11 +20,11 @@
 # This tool is based on the Spark merge_spark_pr script:
 # https://github.com/apache/spark/blob/master/dev/merge_spark_pr.py
 
-from collections import defaultdict, Counter
-
-import jira
 import re
 import sys
+from collections import Counter, defaultdict
+
+import jira
 
 PROJECT = "AIRFLOW"
 


[airflow] 12/25: Add missing variable in run_cli_tool.sh (#9239)

Posted by po...@apache.org.
This is an automated email from the ASF dual-hosted git repository.

potiuk pushed a commit to branch v1-10-test
in repository https://gitbox.apache.org/repos/asf/airflow.git

commit 42bfb2eae7794fb4832259d611dab7ed624d597b
Author: Kamil Breguła <mi...@users.noreply.github.com>
AuthorDate: Thu Jun 11 23:30:43 2020 +0200

    Add missing variable in run_cli_tool.sh (#9239)
    
    
    (cherry picked from commit 5a68f54e5c18687ed434e981fb667ad3326754dc)
---
 scripts/ci/run_cli_tool.sh | 1 +
 1 file changed, 1 insertion(+)

diff --git a/scripts/ci/run_cli_tool.sh b/scripts/ci/run_cli_tool.sh
index cf840bc..8a57c35 100755
--- a/scripts/ci/run_cli_tool.sh
+++ b/scripts/ci/run_cli_tool.sh
@@ -43,6 +43,7 @@ SUPPORTED_TOOL_NAMES=("aws" "az" "gcloud" "bq" "gsutil" "terraform" "java")
 
 if [ ! -L "${BASH_SOURCE[0]}" ]
 then
+    SCRIPT_PATH=$(readlink -e "${BASH_SOURCE[0]}")
     # Direct execution - return installation script
     >&2 echo "# CLI tool wrappers"
     >&2 echo "#"


[airflow] 09/25: Correctly restore colour in logs after format arg (#9222)

Posted by po...@apache.org.
This is an automated email from the ASF dual-hosted git repository.

potiuk pushed a commit to branch v1-10-test
in repository https://gitbox.apache.org/repos/asf/airflow.git

commit 7658c1867c8a0ef01de76c1d17a3a4ce090881c2
Author: Ash Berlin-Taylor <as...@firemirror.com>
AuthorDate: Thu Jun 11 11:50:33 2020 +0100

    Correctly restore colour in logs after format arg (#9222)
    
    The "\e[22m" escape sequence has been tested on Konsole, iTerm2 and
    Terminal.app
    
    (cherry picked from commit bfe1d6b1aafc757f821ceb077e6b882ff1363357)
---
 airflow/utils/log/colored_log.py        | 13 ++++++++-----
 requirements/requirements-python2.7.txt |  2 +-
 requirements/requirements-python3.5.txt |  2 +-
 requirements/requirements-python3.6.txt |  2 +-
 requirements/requirements-python3.7.txt |  2 +-
 requirements/setup-2.7.md5              |  2 +-
 requirements/setup-3.5.md5              |  2 +-
 requirements/setup-3.6.md5              |  2 +-
 requirements/setup-3.7.md5              |  2 +-
 setup.py                                |  1 -
 10 files changed, 16 insertions(+), 14 deletions(-)

diff --git a/airflow/utils/log/colored_log.py b/airflow/utils/log/colored_log.py
index a89e779..8f92d80 100644
--- a/airflow/utils/log/colored_log.py
+++ b/airflow/utils/log/colored_log.py
@@ -23,9 +23,7 @@ import re
 import sys
 
 from colorlog import TTYColoredFormatter
-from termcolor import colored
-
-ARGS = {"attrs": ["bold"]}
+from colorlog.escape_codes import esc, escape_codes
 
 DEFAULT_COLORS = {
     "DEBUG": "red",
@@ -35,6 +33,9 @@ DEFAULT_COLORS = {
     "CRITICAL": "red",
 }
 
+BOLD_ON = escape_codes['bold']
+BOLD_OFF = esc('22')
+
 
 class CustomTTYColoredFormatter(TTYColoredFormatter):
     """
@@ -52,7 +53,7 @@ class CustomTTYColoredFormatter(TTYColoredFormatter):
         if isinstance(arg, (int, float)):
             # In case of %d or %f formatting
             return arg
-        return colored(str(arg), **ARGS)  # type: ignore
+        return BOLD_ON + str(arg) + BOLD_OFF
 
     @staticmethod
     def _count_number_of_arguments_in_message(record):
@@ -83,7 +84,9 @@ class CustomTTYColoredFormatter(TTYColoredFormatter):
                 record.exc_text = self.formatException(record.exc_info)
 
             if record.exc_text:
-                record.exc_text = colored(record.exc_text, DEFAULT_COLORS["ERROR"])
+                record.exc_text = self.color(self.log_colors, record.levelname) + \
+                    record.exc_text + escape_codes['reset']
+
         return record
 
     def format(self, record):
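
To see what the new constants expand to (illustration only, not part of the
patch): colorlog's escape_codes map and esc() helper return raw ANSI sequences,
and "\e[22m" switches bold off without resetting the colour the formatter chose
for the log level. The message text below is made up.

    from colorlog.escape_codes import esc, escape_codes

    BOLD_ON = escape_codes['bold']   # '\x1b[01m'
    BOLD_OFF = esc('22')             # '\x1b[22m' - bold off, current colour kept

    # A formatted argument wrapped in bold; the surrounding colour survives BOLD_OFF.
    print(escape_codes['red'] + "error in " + BOLD_ON + "my_task" + BOLD_OFF +
          " at runtime" + escape_codes['reset'])
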
diff --git a/requirements/requirements-python2.7.txt b/requirements/requirements-python2.7.txt
index bcce040..6fbc2c0 100644
--- a/requirements/requirements-python2.7.txt
+++ b/requirements/requirements-python2.7.txt
@@ -77,7 +77,7 @@ cachetools==3.1.1
 cassandra-driver==3.20.2
 cattrs==0.9.2
 celery==4.4.5
-certifi==2020.4.5.2
+certifi==2020.6.20
 cffi==1.14.0
 cfgv==2.0.1
 cfn-lint==0.33.1
diff --git a/requirements/requirements-python3.5.txt b/requirements/requirements-python3.5.txt
index 725fdd8..7bd9464 100644
--- a/requirements/requirements-python3.5.txt
+++ b/requirements/requirements-python3.5.txt
@@ -68,7 +68,7 @@ cachetools==4.1.0
 cassandra-driver==3.20.2
 cattrs==0.9.2
 celery==4.4.5
-certifi==2020.4.5.2
+certifi==2020.6.20
 cffi==1.14.0
 cfgv==2.0.1
 cfn-lint==0.33.1
diff --git a/requirements/requirements-python3.6.txt b/requirements/requirements-python3.6.txt
index 18a1b61..a7cf3c4 100644
--- a/requirements/requirements-python3.6.txt
+++ b/requirements/requirements-python3.6.txt
@@ -70,7 +70,7 @@ cachetools==4.1.0
 cassandra-driver==3.20.2
 cattrs==0.9.2
 celery==4.4.5
-certifi==2020.4.5.2
+certifi==2020.6.20
 cffi==1.14.0
 cfgv==3.1.0
 cfn-lint==0.33.1
diff --git a/requirements/requirements-python3.7.txt b/requirements/requirements-python3.7.txt
index ff1137f..3e29ecb 100644
--- a/requirements/requirements-python3.7.txt
+++ b/requirements/requirements-python3.7.txt
@@ -70,7 +70,7 @@ cachetools==4.1.0
 cassandra-driver==3.20.2
 cattrs==0.9.2
 celery==4.4.5
-certifi==2020.4.5.2
+certifi==2020.6.20
 cffi==1.14.0
 cfgv==3.1.0
 cfn-lint==0.33.1
diff --git a/requirements/setup-2.7.md5 b/requirements/setup-2.7.md5
index 87e77b8..5734329 100644
--- a/requirements/setup-2.7.md5
+++ b/requirements/setup-2.7.md5
@@ -1 +1 @@
-99a88ba9c37191240b3dd729aed29e4b  /opt/airflow/setup.py
+068295c74403ac4827ec95c6639154f8  /opt/airflow/setup.py
diff --git a/requirements/setup-3.5.md5 b/requirements/setup-3.5.md5
index 87e77b8..5734329 100644
--- a/requirements/setup-3.5.md5
+++ b/requirements/setup-3.5.md5
@@ -1 +1 @@
-99a88ba9c37191240b3dd729aed29e4b  /opt/airflow/setup.py
+068295c74403ac4827ec95c6639154f8  /opt/airflow/setup.py
diff --git a/requirements/setup-3.6.md5 b/requirements/setup-3.6.md5
index 87e77b8..5734329 100644
--- a/requirements/setup-3.6.md5
+++ b/requirements/setup-3.6.md5
@@ -1 +1 @@
-99a88ba9c37191240b3dd729aed29e4b  /opt/airflow/setup.py
+068295c74403ac4827ec95c6639154f8  /opt/airflow/setup.py
diff --git a/requirements/setup-3.7.md5 b/requirements/setup-3.7.md5
index 87e77b8..5734329 100644
--- a/requirements/setup-3.7.md5
+++ b/requirements/setup-3.7.md5
@@ -1 +1 @@
-99a88ba9c37191240b3dd729aed29e4b  /opt/airflow/setup.py
+068295c74403ac4827ec95c6639154f8  /opt/airflow/setup.py
diff --git a/setup.py b/setup.py
index 9f05da9..61339e7 100644
--- a/setup.py
+++ b/setup.py
@@ -586,7 +586,6 @@ INSTALL_REQUIREMENTS = [
     'sqlalchemy_jsonfield~=0.9;python_version>="3.5"',
     'tabulate>=0.7.5, <0.9',
     'tenacity==4.12.0',
-    'termcolor==1.1.0',
     'thrift>=0.9.2',
     'typing;python_version<"3.5"',
     'typing-extensions>=3.7.4;python_version<"3.8"',
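
For context on the escape codes used in colored_log.py above: SGR 1 ("\e[1m") switches bold on and SGR 22 ("\e[22m") switches it back off without touching the foreground colour, which is why the formatter can embolden only the interpolated argument while the rest of the line keeps its level colour; a full reset ("\e[0m") would have cleared the colour as well. A purely illustrative check in any ANSI-capable terminal:

    # Illustrative only: bold is toggled off with SGR 22 while the red
    # foreground (SGR 31) stays in effect until the final reset (SGR 0).
    printf '\e[31mred \e[1mred and bold\e[22m red again\e[0m plain\n'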


[airflow] 15/25: Fix broken CI image optimisation (#9313)

Posted by po...@apache.org.
This is an automated email from the ASF dual-hosted git repository.

potiuk pushed a commit to branch v1-10-test
in repository https://gitbox.apache.org/repos/asf/airflow.git

commit 51355683b934cd7eaa7ed5dc51abe44f038b239d
Author: Jarek Potiuk <ja...@polidea.com>
AuthorDate: Tue Jun 16 01:38:55 2020 +0200

    Fix broken CI image optimisation (#9313)
    
    The commit 5918efc86a2217caa641a6ada289eee1c21407f8 broke
    optimisation of the CI image - using the Apache Airflow
    master branch as the base source of packages installed from PyPI.

    This commit restores it, including removal of the obsolete
    CI_OPTIMISED arg - now that we have separate production and
    CI images, the CI image is CI-optimised by default.
    
    (cherry picked from commit 696e74594f26ec67c1f1330af725db537e97f18e)
---
 Dockerfile.ci                         | 17 +++++------------
 IMAGES.rst                            | 10 ++--------
 scripts/ci/libraries/_build_images.sh |  2 --
 3 files changed, 7 insertions(+), 22 deletions(-)

diff --git a/Dockerfile.ci b/Dockerfile.ci
index 24ee87d..4c9741b 100644
--- a/Dockerfile.ci
+++ b/Dockerfile.ci
@@ -214,25 +214,18 @@ ENV AIRFLOW_EXTRAS=${AIRFLOW_EXTRAS}${ADDITIONAL_AIRFLOW_EXTRAS:+,}${ADDITIONAL_
 
 RUN echo "Installing with extras: ${AIRFLOW_EXTRAS}."
 
-ARG AIRFLOW_CONTAINER_CI_OPTIMISED_BUILD="true"
-ENV AIRFLOW_CONTAINER_CI_OPTIMISED_BUILD=${AIRFLOW_CONTAINER_CI_OPTIMISED_BUILD}
-
 # By changing the CI build epoch we can force reinstalling Airflow from the current master
 # It can also be overwritten manually by setting the AIRFLOW_CI_BUILD_EPOCH environment variable.
 ARG AIRFLOW_CI_BUILD_EPOCH="1"
 ENV AIRFLOW_CI_BUILD_EPOCH=${AIRFLOW_CI_BUILD_EPOCH}
 
-# In case of CI-optimised builds we want to pre-install master version of airflow dependencies so that
+# In case of CI builds we want to pre-install the master version of airflow dependencies so that
+# we do not have to always reinstall them from scratch.
 # This can be reinstalled from latest master by increasing PIP_DEPENDENCIES_EPOCH_NUMBER.
-# And is automatically reinstalled from the scratch every month
-RUN \
-    if [[ "${AIRFLOW_CONTAINER_CI_OPTIMISED_BUILD}" == "true" ]]; then \
-        pip install \
-        "https://github.com/${AIRFLOW_REPO}/archive/${AIRFLOW_BRANCH}.tar.gz#egg=apache-airflow[${AIRFLOW_EXTRAS}]" \
-            --constraint "https://raw.githubusercontent.com/${AIRFLOW_REPO}/${AIRFLOW_BRANCH}/requirements/requirements-python${PYTHON_MAJOR_MINOR_VERSION}.txt" \
-        && pip uninstall --yes apache-airflow; \
-    fi
+# And is automatically reinstalled from scratch with every Python patch-level release
+RUN pip install "https://github.com/${AIRFLOW_REPO}/archive/${AIRFLOW_BRANCH}.tar.gz#egg=apache-airflow[${AIRFLOW_EXTRAS}]" \
+        --constraint "https://raw.githubusercontent.com/${AIRFLOW_REPO}/${AIRFLOW_BRANCH}/requirements/requirements-python${PYTHON_MAJOR_MINOR_VERSION}.txt" \
+    && pip uninstall --yes apache-airflow
 
 # Link dumb-init for backwards compatibility (so that older images also work)
 RUN ln -sf /usr/bin/dumb-init /usr/local/bin/dumb-init
diff --git a/IMAGES.rst b/IMAGES.rst
index 0d4bd8c..3add528 100644
--- a/IMAGES.rst
+++ b/IMAGES.rst
@@ -123,8 +123,8 @@ Technical details of Airflow images
 
 The CI image is used by Breeze as the shell image but it is also used during CI builds.
 The image is a single-segment image that contains the Airflow installation with "all" dependencies installed.
-It is optimised for rebuild speed (``AIRFLOW_CONTAINER_CI_OPTIMISED_BUILD`` flag set to "true").
-It installs PIP dependencies from the current branch first - so that any changes in setup.py do not trigger
+It is optimised for rebuild speed. It installs PIP dependencies from the current branch first -
+so that any changes in setup.py do not trigger
 reinstalling of all dependencies. There is a second step of installation that re-installs the dependencies
 from the latest sources so that we are sure that latest dependencies are installed.
 
@@ -179,12 +179,6 @@ The following build arguments (``--build-arg`` in docker build command) can be u
 | ``CASS_DRIVER_NO_CYTHON``                | ``1``                                    | if set to 1 no CYTHON compilation is     |
 |                                          |                                          | done for cassandra driver (much faster)  |
 +------------------------------------------+------------------------------------------+------------------------------------------+
-| ``AIRFLOW_CONTAINER_CI_OPTIMISED_BUILD`` | ``true``                                 | if set then PIP dependencies are         |
-|                                          |                                          | installed from repo first before they    |
-|                                          |                                          | are reinstalled from local sources. This |
-|                                          |                                          | allows for incremental faster builds     |
-|                                          |                                          | when requirements change                 |
-+------------------------------------------+------------------------------------------+------------------------------------------+
 | ``AIRFLOW_REPO``                         | ``apache/airflow``                       | the repository from which PIP            |
 |                                          |                                          | dependencies are installed (CI           |
 |                                          |                                          | optimised)                               |
diff --git a/scripts/ci/libraries/_build_images.sh b/scripts/ci/libraries/_build_images.sh
index 9d2b748..a54d6d8 100644
--- a/scripts/ci/libraries/_build_images.sh
+++ b/scripts/ci/libraries/_build_images.sh
@@ -344,7 +344,6 @@ function prepare_ci_build() {
     fi
     export THE_IMAGE_TYPE="CI"
     export IMAGE_DESCRIPTION="Airflow CI"
-    export AIRFLOW_CONTAINER_CI_OPTIMISED_BUILD="true"
     export AIRFLOW_EXTRAS="${AIRFLOW_EXTRAS:="${DEFAULT_CI_EXTRAS}"}"
     export ADDITIONAL_AIRFLOW_EXTRAS="${ADDITIONAL_AIRFLOW_EXTRAS:=""}"
     export ADDITIONAL_PYTHON_DEPS="${ADDITIONAL_PYTHON_DEPS:=""}"
@@ -551,7 +550,6 @@ Docker building ${AIRFLOW_CI_IMAGE}.
         --build-arg ADDITIONAL_PYTHON_DEPS="${ADDITIONAL_PYTHON_DEPS}" \
         --build-arg ADDITIONAL_DEV_DEPS="${ADDITIONAL_DEV_DEPS}" \
         --build-arg ADDITIONAL_RUNTIME_DEPS="${ADDITIONAL_RUNTIME_DEPS}" \
-        --build-arg AIRFLOW_CONTAINER_CI_OPTIMISED_BUILD="${AIRFLOW_CONTAINER_CI_OPTIMISED_BUILD}" \
         --build-arg UPGRADE_TO_LATEST_REQUIREMENTS="${UPGRADE_TO_LATEST_REQUIREMENTS}" \
         "${DOCKER_CACHE_CI_DIRECTIVE[@]}" \
         -t "${AIRFLOW_CI_IMAGE}" \


[airflow] 22/25: Fix in-breeze CLI tools to work also on Linux (#9376)

Posted by po...@apache.org.
This is an automated email from the ASF dual-hosted git repository.

potiuk pushed a commit to branch v1-10-test
in repository https://gitbox.apache.org/repos/asf/airflow.git

commit a6f3d3ec08148839b09d9db75c13c6d0344e5e00
Author: Jarek Potiuk <ja...@polidea.com>
AuthorDate: Fri Jun 19 08:58:32 2020 +0200

    Fix in-breeze CLI tools to work also on Linux (#9376)
    
    Instead of creating the links in the image (which did not work),
    the links are now created on entry to the Breeze image.
    Previously, the wrappers were not installed via the Dockerfile
    and the ownership fixing did not work on Linux.
    
    (cherry picked from commit ca8815188755866ae708c968df786c42043656c9)
---
 Dockerfile.ci                                  |  2 --
 scripts/ci/in_container/_in_container_utils.sh | 25 +++++++++++++------
 scripts/ci/in_container/entrypoint_ci.sh       |  4 +++
 scripts/ci/run_cli_tool.sh                     | 34 ++++++++------------------
 4 files changed, 32 insertions(+), 33 deletions(-)

diff --git a/Dockerfile.ci b/Dockerfile.ci
index 651810d..f4ba142 100644
--- a/Dockerfile.ci
+++ b/Dockerfile.ci
@@ -307,8 +307,6 @@ RUN if [[ -n "${ADDITIONAL_PYTHON_DEPS}" ]]; then \
         pip install ${ADDITIONAL_PYTHON_DEPS}; \
     fi
 
-RUN source <(bash scripts/ci/run_cli_tool.sh)
-
 WORKDIR ${AIRFLOW_SOURCES}
 
 ENV PATH="${HOME}:${PATH}"
diff --git a/scripts/ci/in_container/_in_container_utils.sh b/scripts/ci/in_container/_in_container_utils.sh
index 5d9fbb6..0eb3a8a 100644
--- a/scripts/ci/in_container/_in_container_utils.sh
+++ b/scripts/ci/in_container/_in_container_utils.sh
@@ -94,13 +94,24 @@ function in_container_cleanup_pycache() {
 #
 function in_container_fix_ownership() {
     if [[ ${HOST_OS:=} == "Linux" ]]; then
-        set +o pipefail
-        echo "Fixing ownership of mounted files"
-        sudo find "${AIRFLOW_SOURCES}" -print0 -user root \
-        | sudo xargs --null chown "${HOST_USER_ID}.${HOST_GROUP_ID}" --no-dereference >/dev/null 2>&1
-        sudo find "/root/.aws" "/root/.azure" "/root/.config" "/root/.docker" -print0 -user root \
-        | sudo xargs --null chown "${HOST_USER_ID}.${HOST_GROUP_ID}" --no-dereference || true >/dev/null 2>&1
-        set -o pipefail
+        DIRECTORIES_TO_FIX=(
+            "/tmp"
+            "/files"
+            "/root/.aws"
+            "/root/.azure"
+            "/root/.config/gcloud"
+            "/root/.docker"
+            "${AIRFLOW_SOURCES}"
+        )
+        if [[ ${VERBOSE} == "true" ]]; then
+            echo "Fixing ownership of mounted files"
+        fi
+        sudo find "${DIRECTORIES_TO_FIX[@]}" -print0 -user root 2>/dev/null \
+            | sudo xargs --null chown "${HOST_USER_ID}.${HOST_GROUP_ID}" --no-dereference ||
+                true >/dev/null 2>&1
+        if [[ ${VERBOSE} == "true" ]]; then
+            echo "Fixed ownership of mounted files"
+        fi
     fi
 }
 
diff --git a/scripts/ci/in_container/entrypoint_ci.sh b/scripts/ci/in_container/entrypoint_ci.sh
index 2ea5a14..349b092 100755
--- a/scripts/ci/in_container/entrypoint_ci.sh
+++ b/scripts/ci/in_container/entrypoint_ci.sh
@@ -45,6 +45,10 @@ RUN_TESTS=${RUN_TESTS:="false"}
 CI=${CI:="false"}
 INSTALL_AIRFLOW_VERSION="${INSTALL_AIRFLOW_VERSION:=""}"
 
+# Create links for useful CLI tools
+# shellcheck source=scripts/ci/run_cli_tool.sh
+source <(bash scripts/ci/run_cli_tool.sh)
+
 if [[ ${AIRFLOW_VERSION} == *1.10* || ${INSTALL_AIRFLOW_VERSION} == *1.10* ]]; then
     export RUN_AIRFLOW_1_10="true"
 else
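
The new "source <(bash scripts/ci/run_cli_tool.sh)" line relies on process substitution: executed directly (rather than via one of its symlinks), the wrapper script prints shell code that installs the per-tool links, and sourcing that output applies it to the running container. The general shape of the pattern, with a made-up generator script and variable standing in for the real ones:

    # Generic pattern only: a script that prints shell code on stdout,
    # applied to the current shell via process substitution.
    printf 'echo "FOO=bar"\necho "export FOO"\n' > /tmp/print_setup.sh
    source <(bash /tmp/print_setup.sh)
    echo "${FOO}"   # prints: bar
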
diff --git a/scripts/ci/run_cli_tool.sh b/scripts/ci/run_cli_tool.sh
index 8a57c35..3989601 100755
--- a/scripts/ci/run_cli_tool.sh
+++ b/scripts/ci/run_cli_tool.sh
@@ -45,12 +45,12 @@ if [ ! -L "${BASH_SOURCE[0]}" ]
 then
     SCRIPT_PATH=$(readlink -e "${BASH_SOURCE[0]}")
     # Direct execution - return installation script
-    >&2 echo "# CLI tool wrappers"
-    >&2 echo "#"
-    >&2 echo "# To install, run the following command:"
-    >&2 echo "#     source <(bash ${SCRIPT_PATH@Q})"
-    >&2 echo "#"
-    >&2 echo ""
+    echo "# CLI tool wrappers"
+    echo "#"
+    echo "# To install, run the following command:"
+    echo "#     source <(bash ${SCRIPT_PATH@Q})"
+    echo "#"
+    echo ""
     # Print installation script
     for NAME in "${SUPPORTED_TOOL_NAMES[@]}"
     do
@@ -87,25 +87,20 @@ AWS_CREDENTIALS_DOCKER_ARGS=(-v "${HOST_HOME}/.aws:/root/.aws")
 AZURE_CREDENTIALS_DOCKER_ARGS=(-v "${HOST_HOME}/.azure:/root/.azure")
 GOOGLE_CREDENTIALS_DOCKER_ARGS=(-v "${HOST_HOME}/.config/gcloud:/root/.config/gcloud")
 
-DIRECTORIES_TO_FIX=('/tmp/' '/files/')
-
 COMMAND=("${@}")
 
 # Configure selected tool
 case "${TOOL_NAME}" in
     aws )
         COMMON_DOCKER_ARGS+=("${AWS_CREDENTIALS_DOCKER_ARGS[@]}")
-        DIRECTORIES_TO_FIX+=("/root/.aws")
         IMAGE_NAME="amazon/aws-cli:latest"
         ;;
     az )
         COMMON_DOCKER_ARGS+=("${AZURE_CREDENTIALS_DOCKER_ARGS[@]}")
-        DIRECTORIES_TO_FIX+=("/root/.azure")
         IMAGE_NAME="mcr.microsoft.com/azure-cli:latest"
         ;;
     gcloud | bq | gsutil )
         COMMON_DOCKER_ARGS+=("${GOOGLE_CREDENTIALS_DOCKER_ARGS[@]}")
-        DIRECTORIES_TO_FIX+=("/root/.config/gcloud")
         IMAGE_NAME="gcr.io/google.com/cloudsdktool/cloud-sdk:latest"
         COMMAND=("$TOOL_NAME" "${@}")
         ;;
@@ -115,17 +110,11 @@ case "${TOOL_NAME}" in
             "${AZURE_CREDENTIALS_DOCKER_ARGS[@]}"
             "${AWS_CREDENTIALS_DOCKER_ARGS[@]}"
         )
-        DIRECTORIES_TO_FIX+=(
-            "/root/.config/gcloud"
-            "/root/.aws"
-            "/root/.azure"
-        )
         IMAGE_NAME="hashicorp/terraform:latest"
         ;;
     java )
         # TODO: Should we add other credentials?
         COMMON_DOCKER_ARGS+=("${GOOGLE_CREDENTIALS_DOCKER_ARGS[@]}")
-        DIRECTORIES_TO_FIX+=("/root/.config/gcloud")
         IMAGE_NAME="openjdk:8-jre-slim"
         COMMAND=("/usr/local/openjdk-8/bin/java" "${@}")
         ;;
@@ -150,19 +139,16 @@ if [ -t 0 ] ; then
         --tty
     )
 fi
-
+set +e
 docker run "${TOOL_DOCKER_ARGS[@]}" "${IMAGE_NAME}" "${COMMAND[@]}"
 
 RES=$?
 
 # Set file permissions to the host user
 if [[ "${HOST_OS}" == "Linux" ]]; then
-    FIX_DOCKER_ARGS=(--rm)
-    FIX_DOCKER_ARGS+=("${COMMON_DOCKER_ARGS[@]}")
-    FIX_COMMAND=(bash -c
-        "find ${DIRECTORIES_TO_FIX[@]@Q} -user root -print0 | xargs --null chown '${HOST_USER_ID}.${HOST_GROUP_ID}' --no-dereference")
-
-    docker run "${FIX_DOCKER_ARGS[@]}" "${AIRFLOW_CI_IMAGE}" "${FIX_COMMAND[@]}" >/dev/null 2>&1
+    docker run --rm "${COMMON_DOCKER_ARGS[@]}" \
+        --entrypoint /opt/airflow/scripts/ci/in_container/run_fix_ownership.sh \
+            "${AIRFLOW_CI_IMAGE}"
 fi
 
 exit ${RES}
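
On Linux the post-run permission fix is now delegated to run_fix_ownership.sh inside the CI image instead of a find command assembled inline in the wrapper. As the in_container_fix_ownership() change earlier in this patch shows, the fix amounts to re-owning a fixed set of mounted directories from root back to the host user; a condensed sketch, assuming the variables are supplied by the Breeze environment:

    # Condensed sketch of the in-container ownership fix; HOST_USER_ID,
    # HOST_GROUP_ID and AIRFLOW_SOURCES come from the Breeze environment.
    DIRECTORIES_TO_FIX=("/tmp" "/files" "/root/.aws" "/root/.azure"
                        "/root/.config/gcloud" "/root/.docker" "${AIRFLOW_SOURCES}")
    sudo find "${DIRECTORIES_TO_FIX[@]}" -user root -print0 2>/dev/null \
        | sudo xargs --null chown "${HOST_USER_ID}.${HOST_GROUP_ID}" --no-dereference || true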


[airflow] 24/25: Fixed rendering of IMAGES.rst (#9433)

Posted by po...@apache.org.
This is an automated email from the ASF dual-hosted git repository.

potiuk pushed a commit to branch v1-10-test
in repository https://gitbox.apache.org/repos/asf/airflow.git

commit e1675c3572010d309feddc9731fb37ebbdc1a18b
Author: Jarek Potiuk <ja...@polidea.com>
AuthorDate: Sat Jun 20 11:10:46 2020 +0200

    Fixed rendering of IMAGES.rst (#9433)
    
    
    (cherry picked from commit 07f12e5f9f1347c4493649e4c32e7757f2c84ddb)
---
 IMAGES.rst | 8 ++++----
 1 file changed, 4 insertions(+), 4 deletions(-)

diff --git a/IMAGES.rst b/IMAGES.rst
index 4cdf86d..8b4b8df 100644
--- a/IMAGES.rst
+++ b/IMAGES.rst
@@ -291,10 +291,10 @@ The following build arguments (``--build-arg`` in docker build command) can be u
 | ``ADDITIONAL_PYTHON_DEPS``               |                                          | additional python dependencies to        |
 |                                          |                                          | install                                  |
 +------------------------------------------+------------------------------------------+------------------------------------------+
-| ``ADDITIONAL_DEV_DEPS``                  | ````                                     | additional apt dev dependencies to       |
+| ``ADDITIONAL_DEV_DEPS``                  |                                          | additional apt dev dependencies to       |
 |                                          |                                          | install                                  |
 +------------------------------------------+------------------------------------------+------------------------------------------+
-| ``ADDITIONAL_RUNTIME_DEPS``              | ````                                     | additional apt runtime dependencies to   |
+| ``ADDITIONAL_RUNTIME_DEPS``              |                                          | additional apt runtime dependencies to   |
 |                                          |                                          | install                                  |
 +------------------------------------------+------------------------------------------+------------------------------------------+
 
@@ -388,10 +388,10 @@ The following build arguments (``--build-arg`` in docker build command) can be u
 | ``ADDITIONAL_PYTHON_DEPS``               |                                          | Optional python packages to extend       |
 |                                          |                                          | the image with some extra dependencies   |
 +------------------------------------------+------------------------------------------+------------------------------------------+
-| ``ADDITIONAL_DEV_DEPS``                  | ````                                     | additional apt dev dependencies to       |
+| ``ADDITIONAL_DEV_DEPS``                  |                                          | additional apt dev dependencies to       |
 |                                          |                                          | install                                  |
 +------------------------------------------+------------------------------------------+------------------------------------------+
-| ``ADDITIONAL_RUNTIME_DEPS``              | ````                                     | additional apt runtime dependencies to   |
+| ``ADDITIONAL_RUNTIME_DEPS``              |                                          | additional apt runtime dependencies to   |
 |                                          |                                          | install                                  |
 +------------------------------------------+------------------------------------------+------------------------------------------+
 | ``AIRFLOW_HOME``                         | ``/opt/airflow``                         | Airflow’s HOME (that’s where logs and    |