Posted to commits@airflow.apache.org by ka...@apache.org on 2021/09/10 21:54:10 UTC
[airflow] 03/06: Update version added fields in airflow/config_templates/config.yml (#18128)
This is an automated email from the ASF dual-hosted git repository.
kaxilnaik pushed a commit to branch v2-1-test
in repository https://gitbox.apache.org/repos/asf/airflow.git
commit 0ee20fff9f32cdcfe07fc1ab545d9b491c4374ac
Author: Kamil Breguła <mi...@users.noreply.github.com>
AuthorDate: Fri Sep 10 01:21:06 2021 +0200
Update version added fields in airflow/config_templates/config.yml (#18128)
(cherry picked from commit 2767781b880b0fb03d46950c06e1e44902c25a7c)
---
airflow/config_templates/config.yml | 128 ++++++++++++++++++------------------
1 file changed, 64 insertions(+), 64 deletions(-)
diff --git a/airflow/config_templates/config.yml b/airflow/config_templates/config.yml
index 7abcb06..38be813 100644
--- a/airflow/config_templates/config.yml
+++ b/airflow/config_templates/config.yml
@@ -231,7 +231,7 @@
but means plugin changes picked up by tasks straight away)
default: "False"
example: ~
- version_added: "2.0.0"
+ version_added: 2.0.0
see_also: ":ref:`plugins:loading`"
type: boolean
- name: fernet_key
@@ -382,7 +382,7 @@
All the template_fields for each of Task Instance are stored in the Database.
Keeping this number small may cause an error when you try to view ``Rendered`` tab in
TaskInstance view for older tasks.
- version_added: 2.0.0
+ version_added: 1.10.10
type: integer
example: ~
default: "30"
@@ -422,7 +422,7 @@
Number of times the code should be retried in case of DB Operational Errors.
Not all transactions will be retried as it can cause undesired state.
Currently it is only used in ``DagFileProcessor.process_file`` to retry ``dagbag.sync_to_db``.
- version_added: ~
+ version_added: 2.0.0
type: integer
example: ~
default: "3"
@@ -431,7 +431,7 @@
Hide sensitive Variables or Connection extra json keys from UI and task logs when set to True
(Connection passwords are always hidden in logs)
- version_added: ~
+ version_added: 2.1.0
type: boolean
example: ~
default: "True"
@@ -439,7 +439,7 @@
description: |
A comma-separated list of extra sensitive keywords to look for in variables names or connection's
extra JSON.
- version_added: ~
+ version_added: 2.1.0
type: string
example: ~
default: ""
@@ -451,7 +451,7 @@
description: |
The folder where airflow should store its log files
This path must be absolute
- version_added: ~
+ version_added: 2.0.0
type: string
example: ~
default: "{AIRFLOW_HOME}/logs"
@@ -459,7 +459,7 @@
description: |
Airflow can store logs remotely in AWS S3, Google Cloud Storage or Elastic Search.
Set this to True if you want to enable remote logging.
- version_added: ~
+ version_added: 2.0.0
type: string
example: ~
default: "False"
@@ -467,7 +467,7 @@
description: |
Users must supply an Airflow connection id that provides access to the storage
location.
- version_added: ~
+ version_added: 2.0.0
type: string
example: ~
default: ""
@@ -477,7 +477,7 @@
Credentials
<https://cloud.google.com/docs/authentication/production#finding_credentials_automatically>`__ will
be used.
- version_added: ~
+ version_added: 2.0.0
type: string
example: ~
default: ""
@@ -489,14 +489,14 @@
GCS buckets should start with "gs://"
WASB buckets should start with "wasb" just to help Airflow select correct handler
Stackdriver logs should start with "stackdriver://"
- version_added: ~
+ version_added: 2.0.0
type: string
example: ~
default: ""
- name: encrypt_s3_logs
description: |
Use server-side encryption for logs stored in S3
- version_added: ~
+ version_added: 2.0.0
type: string
example: ~
default: "False"
@@ -505,7 +505,7 @@
Logging level.
Supported values: ``CRITICAL``, ``ERROR``, ``WARNING``, ``INFO``, ``DEBUG``.
- version_added: ~
+ version_added: 2.0.0
type: string
example: ~
default: "INFO"
@@ -514,7 +514,7 @@
Logging level for Flask-appbuilder UI.
Supported values: ``CRITICAL``, ``ERROR``, ``WARNING``, ``INFO``, ``DEBUG``.
- version_added: ~
+ version_added: 2.0.0
type: string
example: ~
default: "WARN"
@@ -523,7 +523,7 @@
Logging class
Specify the class that will specify the logging configuration
This class has to be on the python classpath
- version_added: ~
+ version_added: 2.0.0
type: string
example: "my.path.default_local_settings.LOGGING_CONFIG"
default: ""
@@ -531,14 +531,14 @@
description: |
Flag to enable/disable Colored logs in Console
Colour the logs when the controlling terminal is a TTY.
- version_added: 1.10.4
+ version_added: 2.0.0
type: string
example: ~
default: "True"
- name: colored_log_format
description: |
Log format for when Colored logs is enabled
- version_added: 1.10.4
+ version_added: 2.0.0
type: string
example: ~
default: >-
@@ -546,48 +546,48 @@
%%(log_color)s%%(levelname)s%%(reset)s - %%(log_color)s%%(message)s%%(reset)s
- name: colored_formatter_class
description: ~
- version_added: 1.10.4
+ version_added: 2.0.0
type: string
example: ~
default: "airflow.utils.log.colored_log.CustomTTYColoredFormatter"
- name: log_format
description: |
Format of Log line
- version_added: ~
+ version_added: 2.0.0
type: string
example: ~
default: "[%%(asctime)s] {{%%(filename)s:%%(lineno)d}} %%(levelname)s - %%(message)s"
- name: simple_log_format
description: ~
- version_added: ~
+ version_added: 2.0.0
type: string
example: ~
default: "%%(asctime)s %%(levelname)s - %%(message)s"
- name: task_log_prefix_template
description: |
Specify prefix pattern like mentioned below with stream handler TaskHandlerWithCustomFormatter
- version_added: ~
+ version_added: 2.0.0
type: string
example: "{{ti.dag_id}}-{{ti.task_id}}-{{execution_date}}-{{try_number}}"
default: ""
- name: log_filename_template
description: |
Formatting for how airflow generates file names/paths for each task run.
- version_added: ~
+ version_added: 2.0.0
type: string
example: ~
default: "{{{{ ti.dag_id }}}}/{{{{ ti.task_id }}}}/{{{{ ts }}}}/{{{{ try_number }}}}.log"
- name: log_processor_filename_template
description: |
Formatting for how airflow generates file names for log
- version_added: ~
+ version_added: 2.0.0
type: string
example: ~
default: "{{{{ filename }}}}.log"
- name: dag_processor_manager_log_location
description: |
full path of dag_processor_manager logfile
- version_added: 1.10.2
+ version_added: 2.0.0
type: string
example: ~
default: "{AIRFLOW_HOME}/logs/dag_processor_manager/dag_processor_manager.log"
@@ -595,7 +595,7 @@
description: |
Name of handler to read task instance logs.
Defaults to use ``task`` handler.
- version_added: ~
+ version_added: 2.0.0
type: string
example: ~
default: "task"
@@ -603,7 +603,7 @@
description: |
A comma\-separated list of third-party logger names that will be configured to print messages to
consoles\.
- version_added: ~
+ version_added: 2.0.0
type: string
example: "connexion,sqlalchemy"
default: ""
@@ -614,25 +614,25 @@
- name: statsd_on
description: |
Enables sending metrics to StatsD.
- version_added: ~
+ version_added: 2.0.0
type: string
example: ~
default: "False"
- name: statsd_host
description: ~
- version_added: ~
+ version_added: 2.0.0
type: string
example: ~
default: "localhost"
- name: statsd_port
description: ~
- version_added: ~
+ version_added: 2.0.0
type: string
example: ~
default: "8125"
- name: statsd_prefix
description: ~
- version_added: ~
+ version_added: 2.0.0
type: string
example: ~
default: "airflow"
@@ -641,7 +641,7 @@
If you want to avoid sending all the available metrics to StatsD,
you can configure an allow list of prefixes (comma separated) to send only the metrics that
start with the elements of the list (e.g: "scheduler,executor,dagrun")
- version_added: 1.10.6
+ version_added: 2.0.0
type: string
example: ~
default: ""
@@ -652,21 +652,21 @@
The function should have the following signature:
def func_name(stat_name: str) -> str:
- version_added: ~
+ version_added: 2.0.0
type: string
example: ~
default: ""
- name: statsd_datadog_enabled
description: |
To enable datadog integration to send airflow metrics.
- version_added: ~
+ version_added: 2.0.0
type: string
example: ~
default: "False"
- name: statsd_datadog_tags
description: |
List of datadog tags attached to all metrics(e.g: key1:value1,key2:value2)
- version_added: ~
+ version_added: 2.0.0
type: string
example: ~
default: ""
@@ -675,7 +675,7 @@
If you want to utilise your own custom Statsd client set the relevant
module path below.
Note: The module path must exist on your PYTHONPATH for Airflow to pick it up
- version_added: ~
+ version_added: 2.0.0
type: string
example: ~
default: ~
@@ -762,7 +762,7 @@
- name: maximum_page_limit
description: |
Used to set the maximum page limit for API requests
- version_added: ~
+ version_added: 2.0.0
type: integer
example: ~
default: "100"
@@ -774,14 +774,14 @@
If no limit is supplied, the OpenApi spec default is used.
type: integer
example: ~
- version_added: ~
+ version_added: 2.0.0
default: "100"
- name: google_oauth2_audience
description: The intended audience for JWT token credentials used for authorization.
This value must match on the client and server sides.
If empty, audience will not be tested.
type: string
- version_added: ~
+ version_added: 2.0.0
example: project-id-random-value.apps.googleusercontent.com
default: ""
- name: google_key_path
@@ -791,7 +791,7 @@
<https://cloud.google.com/docs/authentication/production#finding_credentials_automatically>`__ will
be used.
type: string
- version_added: ~
+ version_added: 2.0.0
example: /files/service-account-json
default: ""
- name: access_control_allow_headers
@@ -801,21 +801,21 @@
the server side response to the browser's
Access-Control-Request-Headers header.
type: string
- version_added: ~
+ version_added: 2.1.0
example: ~
default: ""
- name: access_control_allow_methods
description: |
Specifies the method or methods allowed when accessing the resource.
type: string
- version_added: ~
+ version_added: 2.1.0
example: ~
default: ""
- name: access_control_allow_origin
description: |
Indicates whether the response can be shared with requesting code from the given origin.
type: string
- version_added: ~
+ version_added: 2.2.0
example: ~
default: ""
- name: lineage
@@ -900,7 +900,7 @@
- name: default_queue
description: |
Default queue that tasks get assigned to and that worker listen on.
- version_added: ~
+ version_added: 2.1.0
type: string
example: ~
default: "default"
@@ -908,7 +908,7 @@
description: |
Is allowed to pass additional/unused arguments (args, kwargs) to the BaseOperator operator.
If set to False, an exception will be thrown, otherwise only the console message will be displayed.
- version_added: ~
+ version_added: 2.0.0
type: string
example: ~
default: "False"
@@ -926,7 +926,7 @@
description: |
Template for mapred_job_name in HiveOperator, supports the following named parameters
hostname, dag_id, task_id, execution_date
- version_added: ~
+ version_added: 2.0.0
type: string
example: ~
default: ~
@@ -1064,7 +1064,7 @@
Access log format for gunicorn webserver.
default format is %%(h)s %%(l)s %%(u)s %%(t)s "%%(r)s" %%(s)s %%(b)s "%%(f)s" "%%(a)s"
documentation - https://docs.gunicorn.org/en/stable/settings.html#access-log-format
- version_added: ~
+ version_added: 2.0.0
type: string
example: ~
default: ""
@@ -1251,7 +1251,7 @@
- name: show_recent_stats_for_completed_runs
description: |
'Recent Tasks' stats will show for old DagRuns if set
- version_added: ~
+ version_added: 2.0.0
type: boolean
example: ~
default: "True"
@@ -1292,21 +1292,21 @@
default: "airflow.utils.email.send_email_smtp"
- name: email_conn_id
description: Email connection to use
- version_added: ~
+ version_added: 2.1.0
type: string
example: ~
default: "smtp_default"
- name: default_email_on_retry
description: |
Whether email alerts should be sent when a task is retried
- version_added: ~
+ version_added: 2.0.0
type: boolean
example: ~
default: "True"
- name: default_email_on_failure
description: |
Whether email alerts should be sent when a task failed
- version_added: ~
+ version_added: 2.0.0
type: boolean
example: ~
default: "True"
@@ -1314,7 +1314,7 @@
description: |
File that will be used as the template for Email subject (which will be rendered using Jinja2).
If not set, Airflow uses a base template.
- version_added: ~
+ version_added: 2.0.1
type: string
example: "/path/to/my_subject_template_file"
default: ~
@@ -1323,7 +1323,7 @@
description: |
File that will be used as the template for Email content (which will be rendered using Jinja2).
If not set, Airflow uses a base template.
- version_added: ~
+ version_added: 2.0.1
type: string
example: "/path/to/my_html_content_template_file"
default: ~
@@ -1380,13 +1380,13 @@
default: "airflow@example.com"
- name: smtp_timeout
description: ~
- version_added: ~
+ version_added: 2.0.0
type: integer
example: ~
default: "30"
- name: smtp_retry_limit
description: ~
- version_added: ~
+ version_added: 2.0.0
type: integer
example: ~
default: "5"
@@ -1400,7 +1400,7 @@
options:
- name: sentry_on
description: Enable error reporting to Sentry
- version_added: ~
+ version_added: 2.0.0
type: string
example: ~
default: "false"
@@ -1468,7 +1468,7 @@
running tasks while another worker has unutilized processes that are unable to process the already
claimed blocked tasks.
https://docs.celeryproject.org/en/stable/userguide/optimizing.html#prefetch-limits
- version_added: ~
+ version_added: 2.0.0
type: integer
example: "1"
default: ~
@@ -1488,7 +1488,7 @@
Umask that will be used when starting workers with the ``airflow celery worker``
in daemon mode. This control the file-creation mode mask which determines the initial
value of file permission bits for newly created files.
- version_added: ~
+ version_added: 2.0.0
type: string
example: ~
default: "0o077"
@@ -1631,7 +1631,7 @@
- name: worker_precheck
description: |
Worker initialisation check to validate Metadata Database connection
- version_added: 1.10.1
+ version_added: 2.0.0
type: string
example: ~
default: "False"
@@ -1893,7 +1893,7 @@
default: "False"
- name: dependency_detector
description: DAG dependency detector class to use
- version_added: ~
+ version_added: 2.1.0
type: string
example: ~
default: "airflow.serialization.serialized_objects.DependencyDetector"
@@ -2140,7 +2140,7 @@
description: |
Enables TCP keepalive mechanism. This prevents Kubernetes API requests to hang indefinitely
when idle connection is time-outed on services like cloud load balancers or firewalls.
- version_added: ~
+ version_added: 2.0.0
type: boolean
example: ~
default: "True"
@@ -2148,7 +2148,7 @@
description: |
When the `enable_tcp_keepalive` option is enabled, TCP probes a connection that has
been idle for `tcp_keep_idle` seconds.
- version_added: ~
+ version_added: 2.0.0
type: integer
example: ~
default: "120"
@@ -2156,7 +2156,7 @@
description: |
When the `enable_tcp_keepalive` option is enabled, if Kubernetes API does not respond
to a keepalive probe, TCP retransmits the probe after `tcp_keep_intvl` seconds.
- version_added: ~
+ version_added: 2.0.0
type: integer
example: ~
default: "30"
@@ -2165,14 +2165,14 @@
When the `enable_tcp_keepalive` option is enabled, if Kubernetes API does not respond
to a keepalive probe, TCP retransmits the probe `tcp_keep_cnt number` of times before
a connection is considered to be broken.
- version_added: ~
+ version_added: 2.0.0
type: integer
example: ~
default: "6"
- name: verify_ssl
description: |
Set this to false to skip verifying SSL certificate of Kubernetes python client.
- version_added: ~
+ version_added: 2.1.0
type: boolean
example: ~
default: "True"
@@ -2220,7 +2220,7 @@
- name: shards
description: |
The number of running smart sensor processes for each service.
- version_added: ~
+ version_added: 2.0.0
type: integer
example: ~
default: "5"
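For context, every option touched by this commit follows the same per-option schema in airflow/config_templates/config.yml, and the diff above only corrects the ``version_added`` field. A minimal sketch of one entry (the option name and values here are illustrative, not taken from the file):

```yaml
# Hypothetical entry showing the per-option schema used throughout config.yml.
# "version_added" records the Airflow release that introduced the option;
# "~" means the value is unset (YAML null), which this commit replaces with
# the actual release, e.g. 2.0.0 or 2.1.0.
- name: some_option
  description: |
    Human-readable description rendered into the configuration reference docs.
  version_added: 2.0.0
  type: string
  example: ~
  default: "some-default"
```

Note that the commit also normalizes quoting (e.g. ``"2.0.0"`` to ``2.0.0`` at the top of the diff); both parse as the same scalar here, since the field is consumed as a plain version string.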