Posted to commits@airflow.apache.org by GitBox <gi...@apache.org> on 2019/10/15 08:55:21 UTC

[GitHub] [airflow] feluelle commented on a change in pull request #6285: [AIRFLOW-XXX] Updates to Breeze documentation from GSOD

feluelle commented on a change in pull request #6285: [AIRFLOW-XXX] Updates to Breeze documentation from GSOD
URL: https://github.com/apache/airflow/pull/6285#discussion_r334823975
 
 

 ##########
 File path: BREEZE.rst
 ##########
 @@ -474,214 +617,213 @@ Run pylint checks for all files:
      ./breeze --static-check-all-files pylint
 
 
-The ``license`` check is also run via separate script and separate docker image containing
 +The ``license`` check is run via a separate script and a separate Docker image containing the
 Apache RAT verification tool that checks for Apache-compatibility of licences within the codebase.
-It does not take pre-commit parameters as extra args.
+It does not take pre-commit parameters as extra arguments.
 
 .. code-block:: bash
 
      ./breeze --static-check-all-files licenses
 
-Building the documentation
---------------------------
+Running Static Code Checks from the Host
+----------------------------------------
 
-The documentation is build using ``-O``, ``--build-docs`` command:
 +You can trigger the static checks from the host environment, without entering the Docker container. To do this, run the following scripts (the same scripts are run in Travis CI):
 
-.. code-block:: bash
+* `<scripts/ci/ci_check_license.sh>`_ - checks the licenses.
+* `<scripts/ci/ci_docs.sh>`_ - checks that documentation can be built without warnings.
 +* `<scripts/ci/ci_flake8.sh>`_ - runs the Flake8 source code style enforcement tool.
 +* `<scripts/ci/ci_lint_dockerfile.sh>`_ - runs a lint checker for the Dockerfile.
 +* `<scripts/ci/ci_mypy.sh>`_ - runs a check for mypy type annotation consistency.
 +* `<scripts/ci/ci_pylint_main.sh>`_ - runs the pylint static code checker for the main files.
 +* `<scripts/ci/ci_pylint_tests.sh>`_ - runs the pylint static code checker for the tests.
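 +
 +For example, to run the Flake8 checks for the whole codebase from the host, you can call the corresponding script from the Airflow source root without any arguments (a minimal invocation; the other scripts listed above work the same way):
 +
 +.. code-block:: bash
 +
 +  ./scripts/ci/ci_flake8.sh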
 
-     ./breeze --build-docs
+The scripts may ask you to rebuild the images, if needed.
 
-Results of the build can be found in ``docs/_build`` folder. Often errors during documentation generation
-come from the docstrings of auto-api generated classes. During the docs building auto-api generated
-files are stored in ``docs/_api`` folder - so that in case of problems with documentation you can
-find where the problems with documentation originated from.
 +You can force rebuilding the images by deleting the `<.build>`_ directory. This directory keeps cached
 +information about the images already built and you can safely delete it if you want to start from scratch.
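 +
 +For example, a minimal way to force a rebuild from the Airflow source root is to remove the cache directory:
 +
 +.. code-block:: bash
 +
 +  rm -rf .build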
 
-Running tests directly from host
---------------------------------
 +After the documentation is built, the HTML results are available in the `<docs/_build/html>`_ folder.
 +This folder is mounted from the host, so you can access those files on your host as well.
 
-If you wish to run tests only and not drop into shell, you can run them by providing
--t, --test-target flag. You can add extra nosetest flags after -- in the commandline.
 +Running Static Code Checks in the Docker Container
 +--------------------------------------------------
 
-.. code-block:: bash
+If you are already in the Breeze Docker environment (by running the ``./breeze`` command), you can also run the same static checks from the container:
 
-     ./breeze --test-target tests/hooks/test_druid_hook.py -- --logging-level=DEBUG
+* Mypy: ``./scripts/ci/in_container/run_mypy.sh airflow tests``
+* Pylint for main files: ``./scripts/ci/in_container/run_pylint_main.sh``
+* Pylint for test files: ``./scripts/ci/in_container/run_pylint_tests.sh``
+* Flake8: ``./scripts/ci/in_container/run_flake8.sh``
+* Licence check: ``./scripts/ci/in_container/run_check_licence.sh``
+* Documentation: ``./scripts/ci/in_container/run_docs_build.sh``
 
-You can run the whole test suite with special '.' test target:
+Running Static Code Analysis for Selected Files
+-----------------------------------------------
 
-.. code-block:: bash
 +In all static check scripts, both the in-container and host versions, you can also pass a module or file path as
 +a parameter to check only the selected modules or files. For example:
 
-    ./breeze --test-target .
+In the Docker container:
 
-You can also specify individual tests or group of tests:
 +.. code-block:: bash
 
-.. code-block:: bash
+  ./scripts/ci/in_container/run_pylint.sh ./airflow/example_dags/
 
-    ./breeze --test-target tests.core:TestCore
+or
 
-Pulling the latest images
--------------------------
 +.. code-block:: bash
 
-Sometimes the image on DockerHub is rebuilt from the scratch. This happens for example when there is a
-security update of the python version that all the images are based on.
-In this case it is usually faster to pull latest images rather than rebuild them
-from the scratch.
+  ./scripts/ci/in_container/run_pylint.sh ./airflow/example_dags/test_utils.py
 
-You can do it via ``--force-pull-images`` flag to force pull latest images from DockerHub.
+On the host:
 
-In the future Breeze will warn you when you are advised to force pull images.
 +.. code-block:: bash
 
-Running commands inside Docker
-------------------------------
+  ./scripts/ci/ci_pylint.sh ./airflow/example_dags/
 
-If you wish to run other commands/executables inside of Docker environment you can do it via
-``-x``, ``--execute-command`` flag. Note that if you want to add arguments you should specify them
-together with the command surrounded with " or ' or pass them after -- as extra arguments.
 
-.. code-block:: bash
 +.. code-block:: bash
 
-     ./breeze --execute-command "ls -la"
+  ./scripts/ci/ci_pylint.sh ./airflow/example_dags/test_utils.py
 
-.. code-block:: bash
+Running Test Suites via Scripts
 +-------------------------------
 
-     ./breeze --execute-command ls -- --la
 +To run all tests with the default settings (Python 3.6, SQLite backend, Docker environment), enter:
 
 +.. code-block:: bash
 
-Running Docker Compose commands
--------------------------------
+  ./scripts/ci/local_ci_run_airflow_testing.sh
 
-If you wish to run docker-compose command (such as help/pull etc. ) you can do it via
-``-d``, ``--docker-compose`` flag. Note that if you want to add extra arguments you should specify them
-after -- as extra arguments.
 
-.. code-block:: bash
 +To select a different Python version, backend, or environment, set the corresponding variables. For example:
 
-     ./breeze --docker-compose pull -- --ignore-pull-failures
 +.. code-block:: bash
 
-Using your host IDE
-===================
+  PYTHON_VERSION=3.5 BACKEND=postgres ENV=docker ./scripts/ci/local_ci_run_airflow_testing.sh
+
 +To run Kubernetes tests, enter:
+
 +.. code-block:: bash
+
 +  KUBERNETES_VERSION=v1.13.0 KUBERNETES_MODE=persistent_mode BACKEND=postgres ENV=kubernetes \
+    ./scripts/ci/local_ci_run_airflow_testing.sh
+
+* PYTHON_VERSION is one of 3.5/3.6/3.7
+* BACKEND is one of postgres/sqlite/mysql
+* ENV is one of docker/kubernetes/bare
+* KUBERNETES_VERSION is required for Kubernetes tests. Currently, it is KUBERNETES_VERSION=v1.13.0.
 +* KUBERNETES_MODE is the Kubernetes mode: either persistent_mode or git_mode.
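 +
 +For example, to run the test suite against MySQL with Python 3.7 in the Docker environment (a variation of the examples above, using only the values listed in this section):
 +
 +.. code-block:: bash
 +
 +  PYTHON_VERSION=3.7 BACKEND=mysql ENV=docker ./scripts/ci/local_ci_run_airflow_testing.sh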
+
+Using Your Host IDE with Breeze
+===============================
 
 Configuring local virtualenv
 ----------------------------
 
-In order to use your host IDE (for example IntelliJ's PyCharm/Idea) you need to have virtual environments
-setup. Ideally you should have virtualenvs for all python versions that Airflow supports (3.5, 3.6, 3.7).
-You can create the virtualenv using ``virtualenvwrapper`` - that will allow you to easily switch between
-virtualenvs using workon command and mange your virtual environments more easily.
+To use your host IDE (for example, IntelliJ's PyCharm/Idea), you need to set up virtual environments. Ideally, you should have virtualenvs for all Python versions supported by Airflow (3.5, 3.6, 3.7).
+You can create a virtualenv using ``virtualenvwrapper``. This allows you to easily switch between
+virtualenvs using the ``workon`` command and manage your virtual environments more easily.
 
 Typically creating the environment can be done by:
 
 .. code-block:: bash
 
   mkvirtualenv <ENV_NAME> --python=python<VERSION>
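 +
 +For example, to create a Python 3.6 environment (the environment name ``airflow36`` below is just an illustrative choice), you could run:
 +
 +.. code-block:: bash
 +
 +  mkvirtualenv airflow36 --python=python3.6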
 
-After the virtualenv is created, you must initialize it. Simply enter the environment
-(using workon) and once you are in it run:
+After the virtualenv is created, you need to initialize it. Simply enter the environment by 
+using ``workon`` and, once you are in it, run:
 
 .. code-block:: bash
 
   ./breeze --initialize-local-virtualenv
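 +
 +For example, continuing with the illustrative ``airflow36`` environment from above, the two steps together look like this:
 +
 +.. code-block:: bash
 +
 +  workon airflow36
 +  ./breeze --initialize-local-virtualenv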
 
-Once initialization is done, you should select the virtualenv you initialized as the project's default
 +Once initialization is done, select the virtualenv you initialized as the default project
 virtualenv in your IDE.
 
-Running unit tests via IDE
+Running Unit Tests via IDE
 --------------------------
 
-After setting it up - you can use the usual "Run Test" option of the IDE and have all the
 +When the setup is done, you can use the usual **Run Test** option of the IDE and have all the
 autocomplete and documentation support from IDE as well as you can debug and click-through
-the sources of Airflow - which is very helpful during development. Usually you also can run most
-of the unit tests (those that do not require prerequisites) directly from the IDE:
+the sources of Airflow, which is very helpful during development. Usually you can also run most
 +of the unit tests (those that do not require external dependencies) directly from the IDE:
 
 Running unit tests from IDE is as simple as:
 
 .. image:: images/running_unittests.png
     :align: center
     :alt: Running unit tests
 
-Some of the core tests use dags defined in ``tests/dags`` folder - those tests should have
 +Some of the core tests use dags defined in the ``tests/dags`` folder. Those tests should have
 ``AIRFLOW__CORE__UNIT_TEST_MODE`` set to True. You can set it up in your test configuration:
 
 .. image:: images/airflow_unit_test_mode.png
     :align: center
     :alt: Airflow Unit test mode
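 +
 +If you prefer to set the variable outside the IDE, a minimal alternative is to export it in the shell from which you launch your IDE or tests (this mirrors the setting shown in the screenshot above):
 +
 +.. code-block:: bash
 +
 +  export AIRFLOW__CORE__UNIT_TEST_MODE=True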
 
 
-You cannot run all the tests this way - only unit tests that do not require external dependencies
-such as postgres/mysql/hadoop etc. You should use
-`Running tests in Airflow Breeze <#running-tests-in-airflow-breeze>`_ in order to run those tests. You can
-still use your IDE to debug those tests as explained in the next chapter.
 +You cannot run all the tests this way; you can run only the unit tests that do not require external dependencies
 +such as Postgres, MySQL, or Hadoop. You should use the
+`run-tests <#running-tests-in-airflow-breeze>`_ command for these tests. You can
+still use your IDE to debug those tests as explained in the next section.
 
 Debugging Airflow Breeze Tests in IDE
 -------------------------------------
 
-When you run example DAGs, even if you run them using UnitTests from within IDE, they are run in a separate
 +When you run example DAGs, even if you run them as unit tests from within the IDE, they are run in a separate
 container. This makes it a little harder to use with IDE built-in debuggers.
-Fortunately for IntelliJ/PyCharm it is fairly easy using remote debugging feature (note that remote
-debugging is only available in paid versions of IntelliJ/PyCharm).
 +Fortunately, IntelliJ/PyCharm provides an effective remote debugging feature (available only in the paid versions). See additional details on `remote debugging <https://www.jetbrains.com/help/pycharm/remote-debugging-with-product.html>`_.
 
-You can read general description `about remote debugging
-<https://www.jetbrains.com/help/pycharm/remote-debugging-with-product.html>`_
-
-You can setup your remote debug session as follows:
+You can set up your remote debugging session as follows:
 
 .. image:: images/setup_remote_debugging.png
     :align: center
     :alt: Setup remote debugging
 
-Not that if you are on ``MacOS`` you have to use the real IP address of your host rather than default
-localhost because on MacOS container runs in a virtual machine with different IP address.
 +Note that on macOS, you have to use the real IP address of your host rather than the default
+localhost because on macOS the container runs in a virtual machine with a different IP address.
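 +
 +For example, one way to look up the host IP address on macOS (assuming your active network interface is ``en0``) is:
 +
 +.. code-block:: bash
 +
 +  ipconfig getifaddr en0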
 
-You also have to remember about configuring source code mapping in remote debugging configuration to map
-your local sources into the ``/opt/airflow`` location of the sources within the container.
+Make sure to configure source code mapping in the remote debugging configuration to map
+your local sources to the ``/opt/airflow`` location of the sources within the container:
 
 .. image:: images/source_code_mapping_ide.png
     :align: center
     :alt: Source code mapping
 
+Breeze Command-Line Interface Reference
+=======================================
 
-Airflow Breeze flags
-====================
+Airflow Breeze Syntax
+---------------------
 
-These are the current flags of the `./breeze <./breeze>`_ script
 +This is the current syntax for `./breeze <./breeze>`_:
 
 .. code-block:: text
 
     Usage: breeze [FLAGS] \
       [-k]|[-S <STATIC_CHECK>]|[-F <STATIC_CHECK>]|[-O]|[-e]|[-a]|[-b]|[-t <TARGET>]|[-x <COMMAND>]|[-d <COMMAND>] \
       -- <EXTRA_ARGS>
 
-    The swiss-knife-army tool for Airflow testings. It allows to perform various test tasks:
+    Commands
 
-      * Enter interactive environment when no command flags are specified (default behaviour)
-      * Stop the interactive environment with -k, --stop-environment command
-      * Run static checks - either for currently staged change or for all files with
-        -S, --static-check or -F, --static-check-all-files commanbd
-      * Build documentation with -O, --build-docs command
-      * Setup local virtualenv with -e, --setup-virtualenv command
-      * Setup autocomplete for itself with -a, --setup-autocomplete command
-      * Build docker image with -b, --build-only command
-      * Run test target specified with -t, --test-target connad
-      * Execute arbitrary command in the test environmenrt with -x, --execute-command command
-      * Execute arbitrary docker-compose command with -d, --docker-compose command
-
-    ** Commands
-
-      By default the script enters IT environment and drops you to bash shell,
-      but you can also choose one of the commands to run specific actions instead:
+      By default, the breeze script enters an IT environment and drops you to a bash shell,
 
 Review comment:
   ```suggestion
         By default, the Breeze script enters an IT environment and drops you to a bash shell,
   ```

----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
users@infra.apache.org


With regards,
Apache Git Services