Posted to commits@airflow.apache.org by GitBox <gi...@apache.org> on 2019/01/05 14:05:27 UTC

[GitHub] kaxil closed pull request #4419: [AIRFLOW-3612] Remove incubation/incubator mention

URL: https://github.com/apache/airflow/pull/4419
This is a PR merged from a forked repository. As GitHub hides the original
diff on merge, it is reproduced below for the sake of provenance:

diff --git a/.rat-excludes b/.rat-excludes
index 96e59ee43c..786920076d 100644
--- a/.rat-excludes
+++ b/.rat-excludes
@@ -57,5 +57,5 @@ flake8_diff.sh
 coverage.xml
 
 rat-results.txt
-apache-airflow-.*\+incubating-source.tar.gz.*
-apache-airflow-.*\+incubating-bin.tar.gz.*
+apache-airflow-.*\+source.tar.gz.*
+apache-airflow-.*\+bin.tar.gz.*
diff --git a/CHANGELOG.txt b/CHANGELOG.txt
index 98a1103792..bfb071f5f1 100644
--- a/CHANGELOG.txt
+++ b/CHANGELOG.txt
@@ -535,7 +535,7 @@ AIRFLOW 1.10.0, 2018-08-03
 [AIRFLOW-2097] tz referenced before assignment
 [AIRFLOW-2152] Add Multiply to list of companies using Airflow
 [AIRFLOW-1551] Add operator to trigger Jenkins job
-[AIRFLOW-2034] Fix mixup between %s and {} when using str.format Convention is to use .format for string formating oustide logging, else use lazy format See comment in related issue https://github.com/apache/incubator-airflow/pull/2823/files Identified problematic case using following command line .git/COMMIT_EDITMSG:`grep -r '%s'./* | grep '\.format('`
+[AIRFLOW-2034] Fix mixup between %s and {} when using str.format Convention is to use .format for string formating oustide logging, else use lazy format See comment in related issue https://github.com/apache/airflow/pull/2823/files Identified problematic case using following command line .git/COMMIT_EDITMSG:`grep -r '%s'./* | grep '\.format('`
 [AIRFLOW-2102] Add custom_args to Sendgrid personalizations
 [AIRFLOW-1035][AIRFLOW-1053] import unicode_literals to parse Unicode in HQL
 [AIRFLOW-2127] Keep loggers during DB migrations
@@ -1829,7 +1829,7 @@ AIRFLOW 1.7.1, 2016-05-19
 -------------------------
 
 - Fix : Don't treat premature tasks as could_not_run tasks
-- AIRFLOW-92 Avoid unneeded upstream_failed session closes apache/incubator-airflow#1485
+- AIRFLOW-92 Avoid unneeded upstream_failed session closes apache/airflow#1485
 - Add logic to lock DB and avoid race condition
 - Handle queued tasks from multiple jobs/executors
 - AIRFLOW-52 Warn about overwriting tasks in a DAG
diff --git a/CONTRIBUTING.md b/CONTRIBUTING.md
index d8dab9c281..4e88eb6f53 100644
--- a/CONTRIBUTING.md
+++ b/CONTRIBUTING.md
@@ -79,7 +79,7 @@ If you are proposing a feature:
 ## Documentation
 
 The latest API documentation is usually available
-[here](https://airflow.incubator.apache.org/). To generate a local version,
+[here](https://airflow.apache.org/). To generate a local version,
 you need to have set up an Airflow development environment (see below). Also
 install the `doc` extra.
 
@@ -107,7 +107,7 @@ There are three ways to setup an Apache Airflow development environment.
 1. Using tools and libraries installed directly on your system.
 
   Install Python (2.7.x or 3.5.x), MySQL, and libxml by using system-level package
-  managers like yum, apt-get for Linux, or Homebrew for Mac OS at first. Refer to the [base CI Dockerfile](https://github.com/apache/incubator-airflow-ci/blob/master/Dockerfile) for
+  managers like yum, apt-get for Linux, or Homebrew for Mac OS at first. Refer to the [base CI Dockerfile](https://github.com/apache/airflow-ci/blob/master/Dockerfile) for
   a comprehensive list of required packages.
 
   Then install python development requirements. It is usually best to work in a virtualenv:
@@ -261,14 +261,14 @@ Feel free to customize based on the extras available in [setup.py](./setup.py)
 Before you submit a pull request from your forked repo, check that it
 meets these guidelines:
 
-1. The pull request should include tests, either as doctests, unit tests, or both. The airflow repo uses [Travis CI](https://travis-ci.org/apache/incubator-airflow) to run the tests and [codecov](https://codecov.io/gh/apache/incubator-airflow) to track coverage. You can set up both for free on your fork (see the "Testing on Travis CI" section below). It will help you making sure you do not break the build with your PR and that you help increase coverage.
+1. The pull request should include tests, either as doctests, unit tests, or both. The airflow repo uses [Travis CI](https://travis-ci.org/apache/airflow) to run the tests and [codecov](https://codecov.io/gh/apache/airflow) to track coverage. You can set up both for free on your fork (see the "Testing on Travis CI" section below). It will help you making sure you do not break the build with your PR and that you help increase coverage.
 1. Please [rebase your fork](http://stackoverflow.com/a/7244456/1110993), squash commits, and resolve all conflicts.
 1. Every pull request should have an associated [JIRA](https://issues.apache.org/jira/browse/AIRFLOW/?selectedTab=com.atlassian.jira.jira-projects-plugin:summary-panel). The JIRA link should also be contained in the PR description.
 1. Preface your commit's subject & PR's title with **[AIRFLOW-XXX]** where *XXX* is the JIRA number. We compose release notes (i.e. for Airflow releases) from all commit titles in a release. By placing the JIRA number in the commit title and hence in the release notes, Airflow users can look into JIRA and Github PRs for more details about a particular change.
 1. Add an [Apache License](http://www.apache.org/legal/src-headers.html) header to all new files
 1. If the pull request adds functionality, the docs should be updated as part of the same PR. Doc string are often sufficient.  Make sure to follow the Sphinx compatible standards.
 1. The pull request should work for Python 2.7 and 3.5. If you need help writing code that works in both Python 2 and 3, see the documentation at the [Python-Future project](http://python-future.org) (the future package is an Airflow requirement and should be used where possible).
-1. As Airflow grows as a project, we try to enforce a more consistent style and try to follow the Python community guidelines. We track this using [landscape.io](https://landscape.io/github/apache/incubator-airflow/), which you can setup on your fork as well to check before you submit your PR. We currently enforce most [PEP8](https://www.python.org/dev/peps/pep-0008/) and a few other linting rules. It is usually a good idea to lint locally as well using [flake8](https://flake8.readthedocs.org/en/latest/) using `flake8 airflow tests`. `git diff upstream/master -u -- "*.py" | flake8 --diff` will return any changed files in your branch that require linting.
+1. As Airflow grows as a project, we try to enforce a more consistent style and try to follow the Python community guidelines. We currently enforce most [PEP8](https://www.python.org/dev/peps/pep-0008/) and a few other linting rules. It is usually a good idea to lint locally as well using [flake8](https://flake8.readthedocs.org/en/latest/) using `flake8 airflow tests`. `git diff upstream/master -u -- "*.py" | flake8 --diff` will return any changed files in your branch that require linting.
 1. Please read this excellent [article](http://chris.beams.io/posts/git-commit/) on commit messages and adhere to them. It makes the lives of those who come after you a lot easier.
 
 ### Testing on Travis CI
@@ -291,15 +291,15 @@ https://github.com/settings/installations -> Configure Travis CI.
 
 1. For the Travis CI GitHub App, you can set repository access to either "All
 repositories" for convenience, or "Only select repositories" and choose
-`<username>/incubator-airflow` in the dropdown.
+`<username>/airflow` in the dropdown.
 
 1. You can access Travis CI for your fork at
-`https://travis-ci.com/<username>/incubator-airflow`.
+`https://travis-ci.com/<username>/airflow`.
 
 #### Travis CI GitHub Services (legacy version)
 
 The Travis CI GitHub Services versions uses an Authorized OAuth App.  Note
-that `apache/incubator-airflow` is currently still using the legacy version.
+that `apache/airflow` is currently still using the legacy version.
 
 1. Once installed, you can configure the Travis CI Authorized OAuth App at
 https://github.com/settings/connections/applications/88c5b97de2dbfc50f3ac.
@@ -308,10 +308,10 @@ https://github.com/settings/connections/applications/88c5b97de2dbfc50f3ac.
 organization; otherwise, click the "Request" button.
 
 1. For the Travis CI Authorized OAuth App, you may have to grant access to the
-forked `<organization>/incubator-airflow` repo even though it is public.
+forked `<organization>/airflow` repo even though it is public.
 
 1. You can access Travis CI for your fork at
-`https://travis-ci.org/<organization>/incubator-airflow`.
+`https://travis-ci.org/<organization>/airflow`.
 
 #### Prefer travis-ci.com over travis-ci.org
 
diff --git a/DISCLAIMER b/DISCLAIMER
deleted file mode 100644
index 2758508789..0000000000
--- a/DISCLAIMER
+++ /dev/null
@@ -1,6 +0,0 @@
-Apache Airflow is an effort undergoing incubation at The Apache Software Foundation (ASF),
-sponsored by the Apache Incubator. Incubation is required of all newly accepted projects
-until a further review indicates that the infrastructure, communications, and decision
-making process have stabilized in a manner consistent with other successful ASF projects.
-While incubation status is not necessarily a reflection of the completeness or stability
-of the code, it does indicate that the project has yet to be fully endorsed by the ASF.
diff --git a/INSTALL b/INSTALL
index b018839ab1..00d458f450 100644
--- a/INSTALL
+++ b/INSTALL
@@ -1,4 +1,4 @@
-# INSTALL / BUILD instructions for Apache Airflow (incubating)
+# INSTALL / BUILD instructions for Apache Airflow
 
 # [required] fetch the tarball and untar the source
 # change into the directory that was untarred.
diff --git a/MANIFEST.in b/MANIFEST.in
index ec99c1f6b2..2ad56c61db 100644
--- a/MANIFEST.in
+++ b/MANIFEST.in
@@ -18,7 +18,6 @@
 
 include NOTICE
 include LICENSE
-include DISCLAIMER
 include CHANGELOG.txt
 include README.md
 graft licenses/
diff --git a/README.md b/README.md
index 0f7c36fdc7..225c5d210b 100644
--- a/README.md
+++ b/README.md
@@ -17,11 +17,11 @@ specific language governing permissions and limitations
 under the License.
 -->
 
-# Apache Airflow (Incubating)
+# Apache Airflow
 
 [![PyPI version](https://badge.fury.io/py/apache-airflow.svg)](https://badge.fury.io/py/apache-airflow)
-[![Build Status](https://travis-ci.org/apache/incubator-airflow.svg?branch=master)](https://travis-ci.org/apache/incubator-airflow)
-[![Coverage Status](https://img.shields.io/codecov/c/github/apache/incubator-airflow/master.svg)](https://codecov.io/github/apache/incubator-airflow?branch=master)
+[![Build Status](https://travis-ci.org/apache/airflow.svg?branch=master)](https://travis-ci.org/apache/airflow)
+[![Coverage Status](https://img.shields.io/codecov/c/github/apache/airflow/master.svg)](https://codecov.io/github/apache/airflow?branch=master)
 [![Documentation Status](https://readthedocs.org/projects/airflow/badge/?version=latest)](https://airflow.readthedocs.io/en/latest/?badge=latest)
 [![License](http://img.shields.io/:license-Apache%202-blue.svg)](http://www.apache.org/licenses/LICENSE-2.0.txt)
 [![PyPI - Python Version](https://img.shields.io/pypi/pyversions/apache-airflow.svg)](https://pypi.org/project/apache-airflow/)
@@ -36,7 +36,7 @@ Use Airflow to author workflows as directed acyclic graphs (DAGs) of tasks. The
 
 ## Getting started
 
-Please visit the Airflow Platform documentation (latest **stable** release) for help with [installing Airflow](https://airflow.incubator.apache.org/installation.html), getting a [quick start](https://airflow.incubator.apache.org/start.html), or a more complete [tutorial](https://airflow.incubator.apache.org/tutorial.html).
+Please visit the Airflow Platform documentation (latest **stable** release) for help with [installing Airflow](https://airflow.apache.org/installation.html), getting a [quick start](https://airflow.apache.org/start.html), or a more complete [tutorial](https://airflow.apache.org/tutorial.html).
 
 Documentation of GitHub master (latest development branch): [ReadTheDocs Documentation](https://airflow.readthedocs.io/en/latest/)
 
@@ -93,7 +93,7 @@ unit of work and continuity.
 
 ## Contributing
 
-Want to help build Apache Airflow? Check out our [contributing documentation](https://github.com/apache/incubator-airflow/blob/master/CONTRIBUTING.md).
+Want to help build Apache Airflow? Check out our [contributing documentation](https://github.com/apache/airflow/blob/master/CONTRIBUTING.md).
 
 
 ## Who uses Apache Airflow?
@@ -339,7 +339,7 @@ Currently **officially** using Airflow:
 
 ## Who Maintains Apache Airflow?
 
-Airflow is the work of the [community](https://github.com/apache/incubator-airflow/graphs/contributors),
+Airflow is the work of the [community](https://github.com/apache/airflow/graphs/contributors),
 but the [core committers/maintainers](https://people.apache.org/committers-by-project.html#airflow)
 are responsible for reviewing and merging PRs as well as steering conversation around new feature requests.
 If you would like to become a maintainer, please review the Apache Airflow
@@ -347,7 +347,6 @@ If you would like to become a maintainer, please review the Apache Airflow
 
 ## Links
 
-- [Documentation](https://airflow.incubator.apache.org/)
+- [Documentation](https://airflow.apache.org/)
 - [Chat](https://apache-airflow-slack.herokuapp.com/)
-- [Apache Airflow Incubation Status](http://incubator.apache.org/projects/airflow.html)
 - [More](https://cwiki.apache.org/confluence/display/AIRFLOW/Airflow+Links)
diff --git a/UPDATING.md b/UPDATING.md
index 7409a35b5f..5a5fce681e 100644
--- a/UPDATING.md
+++ b/UPDATING.md
@@ -154,7 +154,7 @@ config file.
 
 If you want to use LDAP auth backend without TLS then you will have to create a
 custom-auth backend based on
-https://github.com/apache/incubator-airflow/blob/1.10.0/airflow/contrib/auth/backends/ldap_auth.py
+https://github.com/apache/airflow/blob/1.10.0/airflow/contrib/auth/backends/ldap_auth.py
 
 ## Airflow 1.10
 
@@ -470,11 +470,11 @@ The `file_task_handler` logger has been made more flexible. The default format c
 
 #### I'm using S3Log or GCSLogs, what do I do!?
 
-If you are logging to Google cloud storage, please see the [Google cloud platform documentation](https://airflow.incubator.apache.org/integration.html#gcp-google-cloud-platform) for logging instructions.
+If you are logging to Google cloud storage, please see the [Google cloud platform documentation](https://airflow.apache.org/integration.html#gcp-google-cloud-platform) for logging instructions.
 
 If you are using S3, the instructions should be largely the same as the Google cloud platform instructions above. You will need a custom logging config. The `REMOTE_BASE_LOG_FOLDER` configuration key in your airflow config has been removed, therefore you will need to take the following steps:
 
-- Copy the logging configuration from [`airflow/config_templates/airflow_logging_settings.py`](https://github.com/apache/incubator-airflow/blob/master/airflow/config_templates/airflow_local_settings.py).
+- Copy the logging configuration from [`airflow/config_templates/airflow_logging_settings.py`](https://github.com/apache/airflow/blob/master/airflow/config_templates/airflow_local_settings.py).
 - Place it in a directory inside the Python import path `PYTHONPATH`. If you are using Python 2.7, ensuring that any `__init__.py` files exist so that it is importable.
 - Update the config by setting the path of `REMOTE_BASE_LOG_FOLDER` explicitly in the config. The `REMOTE_BASE_LOG_FOLDER` key is not used anymore.
 - Set the `logging_config_class` to the filename and dict. For example, if you place `custom_logging_config.py` on the base of your `PYTHONPATH`, you will need to set `logging_config_class = custom_logging_config.LOGGING_CONFIG` in your config as Airflow 1.8.
@@ -628,7 +628,7 @@ supported and will be removed entirely in Airflow 2.0
 - Operators no longer accept arbitrary arguments
 
   Previously, `Operator.__init__()` accepted any arguments (either positional `*args` or keyword `**kwargs`) without
-  complaint. Now, invalid arguments will be rejected. (https://github.com/apache/incubator-airflow/pull/1285)
+  complaint. Now, invalid arguments will be rejected. (https://github.com/apache/airflow/pull/1285)
 
 - The config value secure_mode will default to True which will disable some insecure endpoints/features
 
diff --git a/airflow/config_templates/default_airflow.cfg b/airflow/config_templates/default_airflow.cfg
index 536f0061e4..da6ff1a43a 100644
--- a/airflow/config_templates/default_airflow.cfg
+++ b/airflow/config_templates/default_airflow.cfg
@@ -277,7 +277,7 @@ error_logfile = -
 expose_config = False
 
 # Set to true to turn on authentication:
-# https://airflow.incubator.apache.org/security.html#web-authentication
+# https://airflow.apache.org/security.html#web-authentication
 authenticate = False
 
 # Filter the list of dags by owner name (requires authentication to be enabled)
diff --git a/airflow/contrib/example_dags/example_twitter_README.md b/airflow/contrib/example_dags/example_twitter_README.md
index 67e95581d7..bef3737f34 100644
--- a/airflow/contrib/example_dags/example_twitter_README.md
+++ b/airflow/contrib/example_dags/example_twitter_README.md
@@ -52,6 +52,6 @@ CREATE TABLE toTwitter_A(id BIGINT, id_str STRING
 
 When you review the code for the DAG, you will notice that these tasks are generated using for loop. These two for loops could be combined into one loop. However, in most cases, you will be running different analysis on your incoming incoming and outgoing tweets, and hence they are kept separated in this example.
 Final step is a running the broker script, brokerapi.py, which will run queries in Hive and store the summarized data to MySQL in our case. To connect to Hive, pyhs2 library is extremely useful and easy to use. To insert data into MySQL from Python, sqlalchemy is also a good one to use.
-I hope you find this tutorial useful. If you have question feel free to ask me on [Twitter](https://twitter.com/EkhtiarSyed) or via the live Airflow chatroom room in [Gitter](https://gitter.im/apache/incubator-airflow).<p>
+I hope you find this tutorial useful. If you have question feel free to ask me on [Twitter](https://twitter.com/EkhtiarSyed) or via the live Airflow chatroom room in [Gitter](https://gitter.im/apache/airflow).<p>
 -Ekhtiar Syed
 Last Update: 8-April-2016
diff --git a/airflow/contrib/sensors/qubole_sensor.py b/airflow/contrib/sensors/qubole_sensor.py
index c0291b1521..08e4290570 100644
--- a/airflow/contrib/sensors/qubole_sensor.py
+++ b/airflow/contrib/sensors/qubole_sensor.py
@@ -74,7 +74,7 @@ class QuboleFileSensor(QuboleSensor):
     :param qubole_conn_id: Connection id which consists of qds auth_token
     :type qubole_conn_id: str
     :param data: a JSON object containing payload, whose presence needs to be checked
-        Check this `example <https://github.com/apache/incubator-airflow/blob/master\
+        Check this `example <https://github.com/apache/airflow/blob/master\
         /airflow/contrib/example_dags/example_qubole_sensor.py>`_ for sample payload
         structure.
     :type data: a JSON object
@@ -97,7 +97,7 @@ class QubolePartitionSensor(QuboleSensor):
     :param qubole_conn_id: Connection id which consists of qds auth_token
     :type qubole_conn_id: str
     :param data: a JSON object containing payload, whose presence needs to be checked.
-        Check this `example <https://github.com/apache/incubator-airflow/blob/master\
+        Check this `example <https://github.com/apache/airflow/blob/master\
         /airflow/contrib/example_dags/example_qubole_sensor.py>`_ for sample payload
         structure.
     :type data: a JSON object
diff --git a/airflow/example_dags/tutorial.py b/airflow/example_dags/tutorial.py
index ccf2e6e2ee..3ab107da83 100644
--- a/airflow/example_dags/tutorial.py
+++ b/airflow/example_dags/tutorial.py
@@ -20,7 +20,7 @@
 """
 ### Tutorial Documentation
 Documentation that goes along with the Airflow tutorial located
-[here](https://airflow.incubator.apache.org/tutorial.html)
+[here](https://airflow.apache.org/tutorial.html)
 """
 from datetime import timedelta
 
diff --git a/airflow/operators/slack_operator.py b/airflow/operators/slack_operator.py
index ddf8788e9b..b6fb75a236 100644
--- a/airflow/operators/slack_operator.py
+++ b/airflow/operators/slack_operator.py
@@ -116,7 +116,7 @@ def __init__(self,
                       'Here is a cat video instead\n'
                       'https://www.youtube.com/watch?v=J---aiyznGQ',
                  icon_url='https://raw.githubusercontent.com/apache/'
-                          'incubator-airflow/master/airflow/www/static/pin_100.jpg',
+                          'airflow/master/airflow/www/static/pin_100.jpg',
                  attachments=None,
                  *args, **kwargs):
         self.method = 'chat.postMessage'
diff --git a/airflow/version.py b/airflow/version.py
index 6b892acce7..c9c2bf2b95 100644
--- a/airflow/version.py
+++ b/airflow/version.py
@@ -18,4 +18,4 @@
 # under the License.
 #
 
-version = '2.0.0.dev0+incubating'
+version = '2.0.0.dev0+'
diff --git a/airflow/www/app.py b/airflow/www/app.py
index d7f102249b..d700f87dd2 100644
--- a/airflow/www/app.py
+++ b/airflow/www/app.py
@@ -116,11 +116,11 @@ def create_app(config=None, testing=False):
 
         admin.add_link(base.MenuLink(
             category='Docs', name='Documentation',
-            url='https://airflow.incubator.apache.org/'))
+            url='https://airflow.apache.org/'))
         admin.add_link(
             base.MenuLink(category='Docs',
                           name='Github',
-                          url='https://github.com/apache/incubator-airflow'))
+                          url='https://github.com/apache/airflow'))
 
         av(vs.VersionView(name='Version', category="About"))
 
diff --git a/airflow/www_rbac/app.py b/airflow/www_rbac/app.py
index 69b5acbd2b..50f823b50d 100644
--- a/airflow/www_rbac/app.py
+++ b/airflow/www_rbac/app.py
@@ -131,7 +131,7 @@ def init_views(appbuilder):
                                 category="Docs",
                                 category_icon="fa-cube")
             appbuilder.add_link("Github",
-                                href='https://github.com/apache/incubator-airflow',
+                                href='https://github.com/apache/airflow',
                                 category="Docs")
             appbuilder.add_link('Version',
                                 href='/version',
diff --git a/airflow/www_rbac/package.json b/airflow/www_rbac/package.json
index b6e0e05915..0e4d021447 100644
--- a/airflow/www_rbac/package.json
+++ b/airflow/www_rbac/package.json
@@ -12,7 +12,7 @@
   "license": "Apache-2.0",
   "repository": {
     "type": "git",
-    "url": "git+https://github.com/apache/incubator-airflow.git"
+    "url": "git+https://github.com/apache/airflow.git"
   },
   "homepage": "http://airflow.apache.org/",
   "keywords": [
diff --git a/dev/README.md b/dev/README.md
index b9e9138468..b2e93e641d 100755
--- a/dev/README.md
+++ b/dev/README.md
@@ -86,8 +86,8 @@ Users can configure this automatically by running `airflow-pr setup_git_remotes`
 
 ```bash
 $ git remote -v
-github	https://github.com/apache/incubator-airflow.git (fetch)
-github	https://github.com/apache/incubator-airflow.git (push)
+github	https://github.com/apache/airflow.git (fetch)
+github	https://github.com/apache/airflow.git (push)
 origin	https://github.com/<USER>/airflow (fetch)
 origin	https://github.com/<USER>/airflow (push)
 ```
diff --git a/dev/airflow-pr b/dev/airflow-pr
index c7da677a05..42e01cc9e8 100755
--- a/dev/airflow-pr
+++ b/dev/airflow-pr
@@ -73,8 +73,8 @@ GITHUB_REMOTE_NAME = os.environ.get("GITHUB_REMOTE_NAME", "github")
 # scope.
 GITHUB_OAUTH_KEY = os.environ.get("GITHUB_OAUTH_KEY")
 
-GITHUB_BASE = "https://github.com/apache/incubator-airflow/pull"
-GITHUB_API_BASE = "https://api.github.com/repos/apache/incubator-airflow"
+GITHUB_BASE = "https://github.com/apache/airflow/pull"
+GITHUB_API_BASE = "https://api.github.com/repos/apache/airflow"
 GITHUB_USER = 'asfgit'
 
 JIRA_BASE = "https://issues.apache.org/jira/browse"
@@ -1013,8 +1013,8 @@ def setup_git_remotes():
         GITHUB_REMOTE_NAME environment variable:
 
           git remote -v
-          github https://github.com/apache/incubator-airflow.git (fetch)
-          github https://github.com/apache/incubator-airflow.git (push)
+          github https://github.com/apache/airflow.git (fetch)
+          github https://github.com/apache/airflow.git (push)
 
         If these remotes already exist, the tool will display an error.
         """))
@@ -1022,7 +1022,7 @@ def setup_git_remotes():
 
     error = False
     try:
-        run_cmd('git remote add github git@github.com:apache/incubator-airflow.git')
+        run_cmd('git remote add github git@github.com:apache/airflow.git')
     except:
         click.echo(click.style(reflow(
             '>>ERROR: Could not create github remote. If it already exists, '
diff --git a/docs/img/incubator.jpg b/docs/img/incubator.jpg
deleted file mode 100644
index 6f34a85e81..0000000000
Binary files a/docs/img/incubator.jpg and /dev/null differ
diff --git a/docs/index.rst b/docs/index.rst
index efd0a8b78d..b0054ad670 100644
--- a/docs/index.rst
+++ b/docs/index.rst
@@ -18,25 +18,10 @@
 
 .. image:: img/pin_large.png
     :width: 100
-.. image:: img/incubator.jpg
-    :width: 150
 
-Apache Airflow (incubating) Documentation
+Apache Airflow Documentation
 =========================================
 
-.. important::
-
-    **Disclaimer**: Apache Airflow is an effort undergoing incubation at The
-    Apache Software Foundation (ASF), sponsored by the Apache Incubator.
-    Incubation is required of all newly accepted projects until a further
-    review indicates that the infrastructure, communications, and
-    decision making process have stabilized in a manner consistent with
-    other successful ASF projects. While incubation status is not
-    necessarily a reflection of the completeness or stability of
-    the code, it does indicate that the project has yet to be fully
-    endorsed by the ASF.
-
-
 Airflow is a platform to programmatically author, schedule and monitor
 workflows.
 
diff --git a/docs/plugins.rst b/docs/plugins.rst
index 4bd55cead9..1429c81e18 100644
--- a/docs/plugins.rst
+++ b/docs/plugins.rst
@@ -179,7 +179,7 @@ definitions in Airflow.
     ml = MenuLink(
         category='Test Plugin',
         name='Test Menu Link',
-        url='https://airflow.incubator.apache.org/')
+        url='https://airflow.apache.org/')
 
     # Creating a flask appbuilder BaseView
     class TestAppBuilderBaseView(AppBuilderBaseView):
diff --git a/docs/project.rst b/docs/project.rst
index 61e68a6984..85dfd40262 100644
--- a/docs/project.rst
+++ b/docs/project.rst
@@ -51,7 +51,7 @@ Committers
 
 For the full list of contributors, take a look at `Airflow's Github
 Contributor page:
-<https://github.com/apache/incubator-airflow/graphs/contributors>`_
+<https://github.com/apache/airflow/graphs/contributors>`_
 
 
 Resources & links
@@ -59,8 +59,8 @@ Resources & links
 
 * `Airflow's official documentation <http://airflow.apache.org/>`_
 * Mailing list (send emails to
-  ``dev-subscribe@airflow.incubator.apache.org`` and/or
-  ``commits-subscribe@airflow.incubator.apache.org``
+  ``dev-subscribe@airflow.apache.org`` and/or
+  ``commits-subscribe@airflow.apache.org``
   to subscribe to each)
 * `Issues on Apache's Jira <https://issues.apache.org/jira/browse/AIRFLOW>`_
 * `Slack (chat) Channel <https://apache-airflow-slack.herokuapp.com/>`_
diff --git a/docs/scheduler.rst b/docs/scheduler.rst
index 7c72d92107..4b8fb2c38d 100644
--- a/docs/scheduler.rst
+++ b/docs/scheduler.rst
@@ -114,7 +114,7 @@ interval series.
 
     """
     Code that goes along with the Airflow tutorial located at:
-    https://github.com/apache/incubator-airflow/blob/master/airflow/example_dags/tutorial.py
+    https://github.com/apache/airflow/blob/master/airflow/example_dags/tutorial.py
     """
     from airflow import DAG
     from airflow.operators.bash_operator import BashOperator
diff --git a/docs/tutorial.rst b/docs/tutorial.rst
index 1b8847c4d6..6bd6cc7ba6 100644
--- a/docs/tutorial.rst
+++ b/docs/tutorial.rst
@@ -32,7 +32,7 @@ complicated, a line by line explanation follows below.
 
     """
     Code that goes along with the Airflow tutorial located at:
-    https://github.com/apache/incubator-airflow/blob/master/airflow/example_dags/tutorial.py
+    https://github.com/apache/airflow/blob/master/airflow/example_dags/tutorial.py
     """
     from airflow import DAG
     from airflow.operators.bash_operator import BashOperator
@@ -308,7 +308,7 @@ something like this:
 
     """
     Code that goes along with the Airflow tutorial located at:
-    https://github.com/apache/incubator-airflow/blob/master/airflow/example_dags/tutorial.py
+    https://github.com/apache/airflow/blob/master/airflow/example_dags/tutorial.py
     """
     from airflow import DAG
     from airflow.operators.bash_operator import BashOperator
diff --git a/scripts/ci/kubernetes/kube/templates/configmaps.template.yaml b/scripts/ci/kubernetes/kube/templates/configmaps.template.yaml
index 0ca6d423e4..01776dcee0 100644
--- a/scripts/ci/kubernetes/kube/templates/configmaps.template.yaml
+++ b/scripts/ci/kubernetes/kube/templates/configmaps.template.yaml
@@ -122,7 +122,7 @@ data:
     expose_config = False
 
     # Set to true to turn on authentication:
-    # https://airflow.incubator.apache.org/security.html#web-authentication
+    # https://airflow.apache.org/security.html#web-authentication
     authenticate = False
 
     # Filter the list of dags by owner name (requires authentication to be enabled)
diff --git a/setup.cfg b/setup.cfg
index 881fe0107d..f29267e269 100644
--- a/setup.cfg
+++ b/setup.cfg
@@ -20,7 +20,7 @@ name = Airflow
 summary = Airflow is a system to programmatically author, schedule and monitor data pipelines.
 description-file = README.md
 author = Apache Airflow PMC
-author-email = dev@airflow.incubator.apache.org
+author-email = dev@airflow.apache.org
 license = Apache License, Version 2.0
 
 [files]
diff --git a/setup.py b/setup.py
index 6dc452302f..6ee7a93bc8 100644
--- a/setup.py
+++ b/setup.py
@@ -410,10 +410,10 @@ def do_setup():
             'Topic :: System :: Monitoring',
         ],
         author='Apache Software Foundation',
-        author_email='dev@airflow.incubator.apache.org',
-        url='http://airflow.incubator.apache.org/',
+        author_email='dev@airflow.apache.org',
+        url='http://airflow.apache.org/',
         download_url=(
-            'https://dist.apache.org/repos/dist/release/incubator/airflow/' + version),
+            'https://dist.apache.org/repos/dist/release/airflow/' + version),
         cmdclass={
             'test': Tox,
             'extra_clean': CleanCommand,
diff --git a/tests/plugins/test_plugin.py b/tests/plugins/test_plugin.py
index aaf1796e68..058a2bb227 100644
--- a/tests/plugins/test_plugin.py
+++ b/tests/plugins/test_plugin.py
@@ -101,7 +101,7 @@ def test(self):
 ml = MenuLink(
     category='Test Plugin',
     name="Test Menu Link",
-    url="https://airflow.incubator.apache.org/")
+    url="https://airflow.apache.org/")
 
 
 # Defining the plugin class
diff --git a/tests/sensors/test_http_sensor.py b/tests/sensors/test_http_sensor.py
index 6b6b4603d6..cfa135e204 100644
--- a/tests/sensors/test_http_sensor.py
+++ b/tests/sensors/test_http_sensor.py
@@ -140,7 +140,7 @@ class FakeSession(object):
     def __init__(self):
         self.response = requests.Response()
         self.response.status_code = 200
-        self.response._content = 'apache/incubator-airflow'.encode('ascii', 'ignore')
+        self.response._content = 'apache/airflow'.encode('ascii', 'ignore')
 
     def send(self, request, **kwargs):
         return self.response
@@ -178,7 +178,7 @@ def test_get_response_check(self):
             method='GET',
             endpoint='/search',
             data={"client": "ubuntu", "q": "airflow"},
-            response_check=lambda response: ("apache/incubator-airflow" in response.text),
+            response_check=lambda response: ("apache/airflow" in response.text),
             headers={},
             dag=self.dag)
         t.run(start_date=DEFAULT_DATE, end_date=DEFAULT_DATE, ignore_ti_state=True)
@@ -192,7 +192,7 @@ def test_sensor(self):
             request_params={"client": "ubuntu", "q": "airflow", 'date': '{{ds}}'},
             headers={},
             response_check=lambda response: (
-                "apache/incubator-airflow/" + DEFAULT_DATE.strftime('%Y-%m-%d')
+                "apache/airflow/" + DEFAULT_DATE.strftime('%Y-%m-%d')
                 in response.text),
             poke_interval=5,
             timeout=15,


 

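Since the whole point of this PR is scrubbing "incubator" references after graduation, a reviewer applying it locally might spot-check for leftovers with a sketch like the one below (a hypothetical check, not part of the PR; CHANGELOG.txt is excluded because it preserves historical commit messages verbatim):

```shell
# Hypothetical spot-check: after applying this PR, no file except
# CHANGELOG.txt (historical commit messages are kept as-is) should
# still mention "incubator". grep exits non-zero when nothing
# matches, so the else branch reports a clean tree.
if grep -rn --exclude=CHANGELOG.txt 'incubator' .; then
    echo "stale references found"
else
    echo "clean"
fi
```

Run from the repository root; any hit prints the file and line so the stray reference can be fixed in a follow-up.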
----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
users@infra.apache.org


With regards,
Apache Git Services