Posted to commits@airflow.apache.org by ka...@apache.org on 2020/08/15 16:16:37 UTC
[airflow] branch master updated: CI: Fix failing docs-build (#10342)
This is an automated email from the ASF dual-hosted git repository.
kaxilnaik pushed a commit to branch master
in repository https://gitbox.apache.org/repos/asf/airflow.git
The following commit(s) were added to refs/heads/master by this push:
new bfa5a8d CI: Fix failing docs-build (#10342)
bfa5a8d is described below
commit bfa5a8d5f10458c14d380c4042ecfbac627d0639
Author: Kaxil Naik <ka...@gmail.com>
AuthorDate: Sat Aug 15 17:16:06 2020 +0100
CI: Fix failing docs-build (#10342)
CI is failing because of the incorrect spelling "everytime"; it should be "every time".
---
airflow/providers/google/cloud/operators/bigquery.py | 2 +-
1 file changed, 1 insertion(+), 1 deletion(-)
diff --git a/airflow/providers/google/cloud/operators/bigquery.py b/airflow/providers/google/cloud/operators/bigquery.py
index 1296724..e08a09d 100644
--- a/airflow/providers/google/cloud/operators/bigquery.py
+++ b/airflow/providers/google/cloud/operators/bigquery.py
@@ -1644,7 +1644,7 @@ class BigQueryInsertJobOperator(BaseOperator):
- if job with given id already exists then it tries to reattach to the job if its not done and its
state is in ``reattach_states``. If the job is done the operator will raise ``AirflowException``.
- Using ``force_rerun`` will submit a new job everytime without attaching to already existing ones.
+ Using ``force_rerun`` will submit a new job every time without attaching to already existing ones.
For job definition see here:
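The docstring being patched describes how ``BigQueryInsertJobOperator`` decides between reattaching to an existing job, failing, or submitting a new one. As a rough illustration (not Airflow's actual implementation; the function and state names below are invented for the sketch, and Airflow raises ``AirflowException`` rather than ``RuntimeError``), the decision logic reads roughly as:

```python
# Illustrative sketch of the reattach/force_rerun decision described in the
# docstring above. All names here are hypothetical, not the provider's API.

DEFAULT_REATTACH_STATES = {"PENDING", "RUNNING"}  # states eligible for reattach

def decide_action(existing_state, force_rerun=False,
                  reattach_states=DEFAULT_REATTACH_STATES):
    """Return 'submit' or 'reattach' for a job id, or raise if it is done.

    existing_state: state of a prior job with the same id, or None if absent.
    """
    if force_rerun or existing_state is None:
        # force_rerun always submits a new job, ignoring any existing one
        return "submit"
    if existing_state in reattach_states:
        # an unfinished job in a reattach-eligible state is picked up again
        return "reattach"
    # the job already finished; Airflow raises AirflowException at this point
    raise RuntimeError("Job is done; use force_rerun to submit it again")
```

For example, ``decide_action("RUNNING")`` would reattach, while ``decide_action("RUNNING", force_rerun=True)`` would submit a fresh job regardless.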