Posted to dev@airflow.apache.org by Chris Riccomini <cr...@apache.org> on 2017/11/06 21:22:31 UTC

[VOTE] Airflow 1.9.0rc1

Hey all,

I have cut Airflow 1.9.0 RC1. This email is calling a vote on the release,
which will last for 72 hours. Consider this my (binding) +1.

Airflow 1.9.0 RC1 is available at:

https://dist.apache.org/repos/dist/dev/incubator/airflow/1.9.0rc1/

apache-airflow-1.9.0rc1+incubating-source.tar.gz is a source release that
comes with INSTALL instructions.
apache-airflow-1.9.0rc1+incubating-bin.tar.gz is the binary Python "sdist"
release.

Public keys are available at:

https://dist.apache.org/repos/dist/release/incubator/airflow/

The release contains the following JIRAs:

ISSUE ID    |DESCRIPTION                                       |PR
|COMMIT
AIRFLOW-1779|Add keepalive packets to ssh hook                 |#2749
|d2f9d1
AIRFLOW-1776|stdout/stderr logging not captured                |#2745
|590d9f
AIRFLOW-1771|Change heartbeat text from boom to heartbeat      |-     |-

AIRFLOW-1767|Airflow Scheduler no longer schedules DAGs        |-     |-

AIRFLOW-1765|Default API auth backend should deny all.         |#2737
|6ecdac
AIRFLOW-1764|Web Interface should not use experimental api     |#2738
|6bed1d
AIRFLOW-1757|Contrib.SparkSubmitOperator should allow --package|#2725
|4e06ee
AIRFLOW-1745|BashOperator ignores SIGPIPE in subprocess        |#2714
|e021c9
AIRFLOW-1744|task.retries can be False                         |#2713
|6144c6
AIRFLOW-1743|Default config template should not contain ldap fi|#2712
|270684
AIRFLOW-1741|Task Duration shows two charts on first page load.|#2711
|974b49
AIRFLOW-1734|Sqoop Operator contains logic errors & needs optio|#2703
|f6810c
AIRFLOW-1731|Import custom config on PYTHONPATH                |#2721
|f07eb3
AIRFLOW-1726|Copy Expert command for Postgres Hook             |#2698
|8a4ad3
AIRFLOW-1719|Fix small typo - your vs you                      |-     |-

AIRFLOW-1712|Log SSHOperator output                            |-     |-

AIRFLOW-1711|Ldap Attributes not always a "list" part 2        |#2731
|40a936
AIRFLOW-1706|Scheduler fails on startup with MS SQL Server     |#2733
|9e209b
AIRFLOW-1698|Remove confusing SCHEDULER_RUNS env var from syste|#2677
|00dd06
AIRFLOW-1695|Redshift Hook using boto3 & AWS Hook              |#2717
|bfddae
AIRFLOW-1694|Hive Hooks: Python 3 does not have an `itertools.i|#2674
|c6e5ae
AIRFLOW-1692|Master cannot be checked out on windows           |#2673
|31805e
AIRFLOW-1691|Add better documentation for Google cloud storage |#2671
|ace2b1
AIRFLOW-1690|Error messages regarding gcs log commits are spars|#2670
|5fb5cd
AIRFLOW-1682|S3 task handler never writes to S3                |#2664
|0080f0
AIRFLOW-1678|Fix docstring errors for `set_upstream` and `set_d|-     |-

AIRFLOW-1677|Fix typo in example_qubole_operator               |-     |-

AIRFLOW-1676|GCS task handler never writes to GCS              |#2659
|781fa4
AIRFLOW-1675|Fix API docstrings to be properly rendered        |#2667
|f12381
AIRFLOW-1671|Missing @apply_defaults annotation for gcs downloa|#2655
|97666b
AIRFLOW-1669|Fix Docker import in Master                       |#na
 |f7f2a8
AIRFLOW-1668|Redshift requires a keep alive of < 300s          |#2650
|f2bb77
AIRFLOW-1664|Make MySqlToGoogleCloudStorageOperator support bin|#2649
|95813d
AIRFLOW-1660|Change webpage width to full-width                |#2646
|8ee3d9
AIRFLOW-1659|Fix invalid attribute bug in FileTaskHandler      |#2645
|bee823
AIRFLOW-1658|Kill (possibly) still running Druid indexing job a|#2644
|cbf7ad
AIRFLOW-1657|Handle failure of Qubole Operator for s3distcp had|-     |-

AIRFLOW-1654|Show tooltips for link icons in DAGs view         |#2642
|ada7b2
AIRFLOW-1647|Fix Spark-sql hook                                |#2637
|b1e5c6
AIRFLOW-1641|Task gets stuck in queued state                   |#2715
|735497
AIRFLOW-1640|Add Qubole default connection in connection table |-     |-

AIRFLOW-1639|ValueError does not have .message attribute       |#2629
|87df67
AIRFLOW-1637|readme not tracking master branch for travis      |-     |-

AIRFLOW-1636|aws and emr connection types get cleared          |#2626
|540e04
AIRFLOW-1635|Allow creating Google Cloud Platform connection wi|#2640
|6dec7a
AIRFLOW-1629|make extra a textarea in edit connections form    |#2623
|f5d46f
AIRFLOW-1628|Docstring of sqlsensor is incorrect               |#2621
|9ba73d
AIRFLOW-1627|SubDagOperator initialization should only query po|#2620
|516ace
AIRFLOW-1621|Add tests for logic added on server side dag list |#2614
|8de9fd
AIRFLOW-1614|Improve performance of DAG parsing when there are |#2610
|a95adb
AIRFLOW-1611|Customize logging in Airflow                      |#2631
|8b4a50
AIRFLOW-1609|Ignore all venvs in gitignore                     |#2608
|f1f9b4
AIRFLOW-1608|GCP Dataflow hook missing pending job state       |#2607
|653562
AIRFLOW-1606|DAG.sync_to_db is static, but takes a DAG as first|#2606
|6ac296
AIRFLOW-1605|Fix log source of local loggers                   |-     |-

AIRFLOW-1604|Rename the logger to log                          |#2604
|af4050
AIRFLOW-1602|Use LoggingMixin for the DAG class                |#2602
|956699
AIRFLOW-1601|Add configurable time between SIGTERM and SIGKILL |#2601
|48a95e
AIRFLOW-1600|Uncaught exceptions in get_fernet if cryptography |#2600
|ad963e
AIRFLOW-1597|Add GameWisp as Airflow user                      |#2599
|26b747
AIRFLOW-1594|Installing via pip copies test files into python l|#2597
|a6b23a
AIRFLOW-1593|Expose load_string in WasbHook                    |#2596
|7ece95
AIRFLOW-1591|Exception: 'TaskInstance' object has no attribute |#2578
|f4653e
AIRFLOW-1590|Small fix for dates util                          |#2652
|31946e
AIRFLOW-1587|fix `ImportError: cannot import name 'CeleryExecut|#2590
|34c73b
AIRFLOW-1586|MySQL to GCS to BigQuery fails for tables with dat|#2589
|e83012
AIRFLOW-1584|Remove the insecure /headers endpoints            |#2588
|17ac07
AIRFLOW-1582|Improve logging structure of Airflow              |#2592
|a7a518
AIRFLOW-1580|Error in string formatter when throwing an excepti|#2583
|ea9ab9
AIRFLOW-1579|Allow jagged rows in BQ Hook.                     |#2582
|5b978b
AIRFLOW-1577|Add token support to DatabricksHook               |#2579
|c2c515
AIRFLOW-1573|Remove `thrift < 0.10.0` requirement              |#2574
|aa95f2
AIRFLOW-1571|Add AWS Lambda Hook for invoking Lambda Function  |#2718
|017f18
AIRFLOW-1568|Add datastore import/export operator              |#2568
|86063b
AIRFLOW-1567|Clean up ML Engine operators                      |#2567
|af91e2
AIRFLOW-1564|Default logging filename contains a colon         |#2565
|4c674c
AIRFLOW-1560|Add AWS DynamoDB hook for inserting batch items   |#2587
|71400b
AIRFLOW-1556|BigQueryBaseCursor should support SQL parameters  |#2557
|9df0ac
AIRFLOW-1546|Add Zymergen to org list in README                |#2512
|7cc346
AIRFLOW-1535|Add support for Dataproc serviceAccountScopes in D|#2546
|b1f902
AIRFLOW-1529|Support quoted newlines in Google BigQuery load jo|#2545
|4a4b02
AIRFLOW-1527|Refactor celery config to make use of template    |#2542
|f4437b
AIRFLOW-1522|Increase size of val column for variable table in |#2535
|8a2d24
AIRFLOW-1521|Template fields definition for bigquery_table_dele|#2534
|f1a7c0
AIRFLOW-1520|S3Hook uses boto2                                 |#2532
|386583
AIRFLOW-1519|Main DAG list page does not scale using client sid|#2531
|d7d7ce
AIRFLOW-1512|Add operator for running Python functions in a vir|#2446
|14e6d7
AIRFLOW-1507|Make src, dst and bucket parameters as templated i|#2516
|d295cf
AIRFLOW-1505|Document when Jinja substitution occurs           |#2523
|984a87
AIRFLOW-1504|Log Cluster Name on Dataproc Operator When Execute|#2517
|1cd6c4
AIRFLOW-1499|Eliminate duplicate and unneeded code             |-     |-

AIRFLOW-1497|Hidden fields in connection form aren't reset when|#2507
|d8da8b
AIRFLOW-1493|Fix race condition with airflow run               |#2505
|b2e175
AIRFLOW-1492|Add metric for task success/failure               |#2504
|fa84d4
AIRFLOW-1489|Docs: Typo in BigQueryCheckOperator               |#2501
|111ce5
AIRFLOW-1483|Page size on model views is to large to render qui|#2497
|04bfba
AIRFLOW-1478|Chart -> Owner column should be sortable          |#2493
|651e60
AIRFLOW-1476|Add INSTALL file for source releases              |#2492
|da76ac
AIRFLOW-1474|Add dag_id regex for 'airflow clear' CLI command  |#2486
|18f849
AIRFLOW-1470|BashSensor Implementation                         |-     |-

AIRFLOW-1459|integration rst doc is broken in github view      |#2481
|322ec9
AIRFLOW-1438|Scheduler batch queries should have a limit       |#2462
|3547cb
AIRFLOW-1437|BigQueryTableDeleteOperator should define deletion|#2459
|b87903
AIRFLOW-1432|NVD3 Charts do not have labeled axes and units cha|#2710
|70ffa4
AIRFLOW-1402|Cleanup SafeConfigParser DeprecationWarning       |#2435
|38c86b
AIRFLOW-1401|Standardize GCP project, region, and zone argument|#2439
|b6d363
AIRFLOW-1397|Airflow 1.8.1 - No data displays in Last Run Colum|-     |-

AIRFLOW-1394|Add quote_character parameter to GoogleCloudStorag|#2428
|9fd0be
AIRFLOW-1389|BigQueryOperator should support `createDisposition|#2470
|6e2640
AIRFLOW-1384|Add ARGO/CaDC                                     |#2434
|715947
AIRFLOW-1368|Automatically remove the container when it exits  |#2653
|d42d23
AIRFLOW-1359|Provide GoogleCloudML operator for model evaluatio|#2407
|194d1d
AIRFLOW-1356|add `--celery_hostname` to `airflow worker`       |#2405
|b9d7d1
AIRFLOW-1352|Revert bad logging Handler                        |-     |-

AIRFLOW-1350|Add "query_uri" parameter for Google DataProc oper|#2402
|d32c72
AIRFLOW-1348|Paginated UI has broken toggles after first page  |-     |-

AIRFLOW-1345|Don't commit on each loop                         |#2397
|0dd002
AIRFLOW-1344|Builds failing on Python 3.5 with AttributeError  |#2394
|2a5883
AIRFLOW-1343|Add airflow default label to the dataproc operator|#2396
|e4b240
AIRFLOW-1338|gcp_dataflow_hook is incompatible with the recent |#2388
|cf2605
AIRFLOW-1337|Customize log format via config file              |#2392
|4841e3
AIRFLOW-1335|Use buffered logger                               |#2386
|0d23d3
AIRFLOW-1333|Enable copy function for Google Cloud Storage Hook|#2385
|e2c383
AIRFLOW-1331|Contrib.SparkSubmitOperator should allow --package|#2622
|fbca8f
AIRFLOW-1330|Connection.parse_from_uri doesn't work for google_|#2525
|6e5e9d
AIRFLOW-1324|Make the Druid operator/hook more general         |#2378
|de99aa
AIRFLOW-1323|Operators related to Dataproc should keep some par|#2636
|ed248d
AIRFLOW-1315|Add Qubole File and Partition Sensors             |-     |-

AIRFLOW-1309|Add optional hive_tblproperties in HiveToDruidTran|-     |-

AIRFLOW-1301|Add New Relic to Airflow user list                |#2359
|355fc9
AIRFLOW-1299|Google Dataproc cluster creation operator should s|#2358
|c2b80e
AIRFLOW-1289|Don't restrict scheduler threads to CPU cores     |#2353
|8e23d2
AIRFLOW-1286|BaseTaskRunner - Exception TypeError: a bytes-like|#2363
|d8891d
AIRFLOW-1277|Forbid creation of a known event with empty fields|#na
 |65184a
AIRFLOW-1276|Forbid event creation with end_date earlier than s|#na
 |d5d02f
AIRFLOW-1275|Fix `airflow pool` command exception              |#2346
|9958aa
AIRFLOW-1273|Google Cloud ML Version and Model CRUD Operator   |#2379
|534a0e
AIRFLOW-1272|Google Cloud ML Batch Prediction Operator         |#2390
|e92d6b
AIRFLOW-1271|Google Cloud ML Training Operator                 |#2408
|0fc450
AIRFLOW-1256|Add United Airlines as Airflow user               |#2332
|d3484a
AIRFLOW-1251|Add eRevalue as an Airflow user                   |#2331
|8d5160
AIRFLOW-1248|Fix inconsistent configuration name for worker tim|#2328
|92314f
AIRFLOW-1247|CLI: ignore all dependencies argument ignored     |#2441
|e88ecf
AIRFLOW-1245|Fix random failure of test_trigger_dag_for_date un|#2325
|cef01b
AIRFLOW-1244|Forbid creation of a pool with empty name         |#2324
|df9a10
AIRFLOW-1242|BigQueryHook assumes that a valid project_id can't|#2335
|ffe616
AIRFLOW-1237|Fix IN-predicate sqlalchemy warning               |#2320
|a1f422
AIRFLOW-1234|Cover utils.operator_helpers with unit tests      |#2317
|d16537
AIRFLOW-1233|Cover utils.json with unit tests                  |#2316
|502410
AIRFLOW-1232|Remove deprecated readfp warning                  |#2315
|6ffaaf
AIRFLOW-1231|Use flask_wtf.CSRFProtect instead of flask_wtf.Csr|#2313
|cac49e
AIRFLOW-1221|Fix DatabricksSubmitRunOperator Templating        |#2308
|0fa104
AIRFLOW-1217|Enable logging in Sqoop hook                      |#2307
|4f459b
AIRFLOW-1213|Add hcatalog parameters to the sqoop operator/hook|#2305
|857850
AIRFLOW-1208|Speed-up cli tests                                |#2301
|21c142
AIRFLOW-1207|Enable utils.helpers unit tests                   |#2300
|8ac87b
AIRFLOW-1203|Tests failing after oauth upgrade                 |#2296
|3e9c66
AIRFLOW-1201|Update deprecated 'nose-parameterized' library to |#2298
|d2d3e4
AIRFLOW-1193|Add Checkr to Airflow user list                   |#2276
|707238
AIRFLOW-1189|Get pandas DataFrame using BigQueryHook fails     |#2287
|93666f
AIRFLOW-1188|Add max_bad_records param to GoogleCloudStorageToB|#2286
|443e6b
AIRFLOW-1187|Obsolete package names in documentation           |-     |-

AIRFLOW-1185|Incorrect url to PyPi                             |#2283
|829755
AIRFLOW-1181|Enable delete and list function for Google Cloud S|#2281
|24f73c
AIRFLOW-1179|Pandas 0.20 broke Google BigQuery hook            |#2279
|ac9ccb
AIRFLOW-1177|variable json deserialize does not work at set def|#2540
|65319a
AIRFLOW-1175|Add Pronto Tools to Airflow user list             |#2277
|86aafa
AIRFLOW-1173|Add Robinhood to list of Airflow users            |#2271
|379115
AIRFLOW-1165|airflow webservice crashes on ubuntu16 - python3  |-     |-

AIRFLOW-1160|Update SparkSubmitOperator parameters             |#2265
|2e3f07
AIRFLOW-1155|Add Tails.com to community                        |#2261
|2fa690
AIRFLOW-1149|Allow custom filters to be added to jinja2        |#2258
|48135a
AIRFLOW-1141|Remove DAG.crawl_for_tasks method                 |#2275
|a30fee
AIRFLOW-1140|DatabricksSubmitRunOperator should template the "j|#2255
|e6d316
AIRFLOW-1136|Invalid parameters are not captured for Sqoop oper|#2252
|2ef4db
AIRFLOW-1125|Clarify documentation regarding fernet_key        |#2251
|831f8d
AIRFLOW-1122|Node strokes are too thin for people with color vi|#2246
|a08761
AIRFLOW-1121|airflow webserver --pid no longer write out pid fi|-     |-

AIRFLOW-1118|Add evo.company to Airflow users                  |#2243
|f16914
AIRFLOW-1112|Log which pool is full in scheduler when pool slot|#2242
|74c1ce
AIRFLOW-1107|Add support for ftps non-default port             |#2240
|4d0c2f
AIRFLOW-1106|Add Groupalia/Letsbonus                           |#2239
|945b42
AIRFLOW-1095|ldap_auth memberOf should come from configuration |#2232
|6b1c32
AIRFLOW-1094|Invalid unit tests under `contrib/`               |#2234
|219c50
AIRFLOW-1091|As a release manager I want to be able to compare |#2231
|bfae42
AIRFLOW-1090|Add HBO                                           |#2230
|177d34
AIRFLOW-1089|Add Spark application arguments to SparkSubmitOper|#2229
|e5b914
AIRFLOW-1081|Task duration page is slow                        |#2226
|0da512
AIRFLOW-1075|Cleanup security docs                             |#2222
|5a6f18
AIRFLOW-1065|Add functionality for Azure Blob Storage          |#2216
|f1bc5f
AIRFLOW-1059|Reset_state_for_orphaned_task should operate in ba|#2205
|e05d3b
AIRFLOW-1058|Improvements for SparkSubmitOperator              |-     |-

AIRFLOW-1051|Add a test for resetdb to CliTests                |#2198
|15aee0
AIRFLOW-1047|Airflow logs vulnerable to XSS                    |#2193
|fe9ebe
AIRFLOW-1045|Make log level configurable via airflow.cfg       |#2191
|e739a5
AIRFLOW-1043|Documentation issues for operators                |#2188
|b55f41
AIRFLOW-1041|DockerOperator replaces its xcom_push method with |#2274
|03704c
AIRFLOW-1040|Fix typos in comments/docstrings in models.py     |#2174
|d8c0f5
AIRFLOW-1036|Exponential backoff should use randomization      |#2262
|66168e
AIRFLOW-1035|Exponential backoff retry logic should use 2 as ba|#2196
|4ec932
AIRFLOW-1034|Make it possible to connect to S3 in sigv4 regions|#2181
|4c0905
AIRFLOW-1031|'scheduled__' may replace with DagRun.ID_PREFIX in|#2613
|aa3844
AIRFLOW-1030|HttpHook error when creating HttpSensor           |-     |-

AIRFLOW-1028|Databricks Operator for Airflow                   |#2202
|53ca50
AIRFLOW-1024|Handle CeleryExecutor errors gracefully           |#2355
|7af20f
AIRFLOW-1018|Scheduler DAG processes can not log to stdout     |#2728
|ef775d
AIRFLOW-1016|Allow HTTP HEAD request method on HTTPSensor      |#2175
|4c41f6
AIRFLOW-1010|Add a convenience script for signing              |#2169
|a2b65a
AIRFLOW-1009|Remove SQLOperator from Concepts page             |#2168
|7d1144
AIRFLOW-1007|Jinja sandbox is vulnerable to RCE                |#2184
|daa281
AIRFLOW-1005|Speed up Airflow startup time                     |#na
 |996dd3
AIRFLOW-999 |Support for Redis database                        |#2165
|8de850
AIRFLOW-997 |Change setup.cfg to point to Apache instead of Max|#na
 |75cd46
AIRFLOW-995 |Update Github PR template                         |#2163
|b62485
AIRFLOW-994 |Add MiNODES to the AIRFLOW Active Users List      |#2159
|ca1623
AIRFLOW-991 |Mark_success while a task is running leads to fail|-     |-

AIRFLOW-990 |DockerOperator fails when logging unicode string  |#2155
|6bbf54
AIRFLOW-988 |SLA Miss Callbacks Are Repeated if Email is Not be|#2415
|6e74d4
AIRFLOW-985 |Extend the sqoop operator/hook with additional par|#2177
|82eb20
AIRFLOW-984 |Subdags unrecognized when subclassing SubDagOperat|#2152
|a8bd16
AIRFLOW-979 |Add GovTech GDS                                   |#2149
|b17bd3
AIRFLOW-976 |Mark success running task causes it to fail       |-     |-

AIRFLOW-969 |Catch bad python_callable argument at DAG construc|#2142
|12901d
AIRFLOW-963 |Some code examples are not rendered in the airflow|#2139
|f69c1b
AIRFLOW-960 |Add support for .editorconfig                     |#na
 |f5cacc
AIRFLOW-959 |.gitignore file is disorganized and incomplete    |#na
 |3d3c14
AIRFLOW-958 |Improve tooltip readability                       |#2134
|b3c3eb
AIRFLOW-950 |Missing AWS integrations on documentation::integra|#2552
|01be02
AIRFLOW-947 |Make PrestoHook surface better messages when the P|#na
 |6dd4b3
AIRFLOW-945 |Revert psycopg2 workaround when psycopg2 2.7.1 is |-     |-

AIRFLOW-943 |Add Digital First Media to the Airflow users list |#2115
|2cfe28
AIRFLOW-942 |Add mytaxi to Airflow Users                       |#2111
|d579e6
AIRFLOW-935 |Impossible to use plugin executors                |#2120
|08a784
AIRFLOW-926 |jdbc connector is broken due to jaydebeapi api upd|#2651
|07ed29
AIRFLOW-917 |Incorrectly formatted failure status message      |#2109
|b8164c
AIRFLOW-916 |Fix ConfigParser deprecation warning              |#2108
|ef6dd1
AIRFLOW-911 |Add colouring and profiling info on tests         |#2106
|4f52db
AIRFLOW-903 |Add configuration setting for default DAG view.   |#2103
|cadfae
AIRFLOW-896 |BigQueryOperator fails to execute with certain inp|#2097
|2bceee
AIRFLOW-891 |Webserver Clock Should Include Day                |-     |-

AIRFLOW-889 |Minor error in the docstrings for BaseOperator.   |#2084
|50702d
AIRFLOW-887 |Add compatibility with future v0.16               |#na
 |50902d
AIRFLOW-886 |Pass Operator result to post_execute hook         |#na
 |4da361
AIRFLOW-885 |Add Change.org to the list of Airflow users       |#2089
|a279be
AIRFLOW-882 |Code example in docs has unnecessary DAG>>Operator|#2088
|baa4cd
AIRFLOW-881 |Create SubDagOperator within DAG context manager w|#2087
|0ed608
AIRFLOW-880 |Fix remote log functionality inconsistencies for W|#2086
|974b75
AIRFLOW-877 |GoogleCloudStorageDownloadOperator: template_ext c|#2083
|debc69
AIRFLOW-875 |Allow HttpSensor params to be templated           |#2080
|62f503
AIRFLOW-871 |multiple places use logging.warn() instead of warn|#2082
|21d775
AIRFLOW-866 |Add FTPSensor                                     |#2070
|5f87f8
AIRFLOW-863 |Example DAG start dates should be recent to avoid |#2068
|bbfd43
AIRFLOW-862 |Add DaskExecutor                                  |#2067
|6e2210
AIRFLOW-860 |Circular module dependency prevents loading of cus|-     |-

AIRFLOW-854 |Add Open Knowledge International to Airflow users |#2061
|51a311
AIRFLOW-842 |scheduler.clean_dirty raises warning: SAWarning: T|#2072
|485280
AIRFLOW-840 |Python3 encoding issue in Kerberos                |#2158
|639336
AIRFLOW-836 |The paused and queryview endpoints are vulnerable |#2054
|6aca2c
AIRFLOW-831 |Fix broken unit tests                             |#2050
|b86194
AIRFLOW-830 |Plugin manager should log to debug, not info      |-     |-

AIRFLOW-829 |Reduce verbosity of successful Travis unit tests  |-     |-

AIRFLOW-826 |Add Zendesk Hook                                  |#2066
|a09762
AIRFLOW-823 |Make task instance details available via API      |#2045
|3f546e
AIRFLOW-822 |Close the connection before throwing exception in |#2038
|4b6c38
AIRFLOW-821 |Scheduler dagbag importing not Py3 compatible     |#2039
|fbb59b
AIRFLOW-809 |SqlAlchemy is_ ColumnOperator Causing Errors in MS|-     |-

AIRFLOW-802 |Integration of spark-submit                       |-     |-

AIRFLOW-781 |Allow DataFlowJavaOperator to accept jar file stor|#2037
|259c86
AIRFLOW-770 |HDFS hooks should support alternative ways of gett|#2056
|261b65
AIRFLOW-756 |Refactor ssh_hook and ssh_operator                |-     |-

AIRFLOW-751 |SFTP file transfer functionality                  |#1999
|fe0ede
AIRFLOW-725 |Make merge tool use OS' keyring for password stora|#1966
|8c1695
AIRFLOW-706 |Configuration shell commands are not split properl|#2053
|0bb6f2
AIRFLOW-705 |airflow.configuration.run_command output does not |-     |-

AIRFLOW-681 |homepage doc link should pointing to apache's repo|#2164
|a8027a
AIRFLOW-654 |SSL for AMQP w/ Celery(Executor)                  |#2333
|868bfe
AIRFLOW-645 |HttpHook ignores https                            |#2311
|fd381a
AIRFLOW-365 |Code view in subdag trigger exception             |#2043
|cf102c
AIRFLOW-300 |Add Google Pubsub hook and operator               |#2036
|d231dc
AIRFLOW-289 |Use datetime.utcnow() to keep airflow system indep|#2618
|20c83e
AIRFLOW-71  |docker_operator - pulling from private repositorie|#na
 |d4406c

Cheers,
Chris

Re: [VOTE] Airflow 1.9.0rc1

Posted by "Driesprong, Fokko" <fo...@driesprong.frl>.
Hi all,

+1 (binding)

I got Airflow 1.9.0rc1 deployed today, after some delay. So far it works
great, except for one thing: I've got some weird output in my logs, but I'm
not sure whether it comes from Airflow or from a misbehaving process on my
side:
[2017-11-08 16:31:05,842] {logging_mixin.py:91} WARNING - Traceback (most
recent call last):
[2017-11-08 16:31:05,842] {logging_mixin.py:91} WARNING -   File
"/usr/lib/python2.7/logging/__init__.py", line 885, in emit
[2017-11-08 16:31:05,842] {logging_mixin.py:91} WARNING -     self.flush()
[2017-11-08 16:31:05,842] {logging_mixin.py:91} WARNING -   File
"/usr/lib/python2.7/logging/__init__.py", line 845, in flush
[2017-11-08 16:31:05,842] {logging_mixin.py:91} WARNING -
self.stream.flush()
[2017-11-08 16:31:05,843] {logging_mixin.py:91} WARNING - IOError: [Errno
32] Broken pipe
[2017-11-08 16:31:05,843] {logging_mixin.py:91} WARNING - Logged from file
base_task_runner.py, line 98

I need some more time to figure this out.
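The broken-pipe warning above can be reproduced outside Airflow. This is a minimal sketch (not Airflow code, and the pipe setup is a hypothetical stand-in for the task runner's stdout): CPython ignores SIGPIPE at startup, so a write to a pipe whose reader has gone away surfaces as an exception carrying errno 32 (EPIPE) instead of killing the process, which is why logging's flush() can catch it and report it as a WARNING.

```python
import os

# Minimal sketch of how "IOError: [Errno 32] Broken pipe" arises:
# a write to a pipe whose read end has been closed.
read_fd, write_fd = os.pipe()
os.close(read_fd)              # simulate the log consumer going away

errno_seen = None
try:
    os.write(write_fd, b"log line\n")
except OSError as exc:         # BrokenPipeError in Python 3
    errno_seen = exc.errno     # 32 == EPIPE, matching the log above
finally:
    os.close(write_fd)

print("write failed with errno", errno_seen)
```

If that matches what is happening here, the warning points at whichever process holds the other end of the task's output pipe, not at the logging configuration itself.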


Apart from that, it would be nice to include this in 1.9:

https://github.com/apache/incubator-airflow/commit/3eb2dd86b9cdb5d83767d5969011a83c6521370d

It is only a small change, but otherwise I get warnings in my scheduler
logs every time the DAG is scanned, which is not that nice.


Also, the error reported by Ash Berlin-Taylor needs to be addressed before
releasing.

Cheers, Fokko


2017-11-08 19:00 GMT+01:00 Ash Berlin-Taylor <ash_airflowlist@firemirror.com
>:

> -1 (for now; non-binding. Is that how this process works?)
>
> We've built a test env for this RC and are testing, but have run into an
> issue reading task logs. (See below)
>
> We haven't gotten very far with this yet; we will dig more tomorrow (it's
> the end of the UK work day now). I suspect this might be down to how we've
> misconfigured our logging. We will see tomorrow.
>
> -ash
>
>
>
>
> File "/usr/local/lib/python3.5/dist-packages/airflow/www/views.py", line
> 712, in log
>     logs = handler.read(ti)
> AttributeError: 'NoneType' object has no attribute 'read'
>
> During handling of the above exception, another exception occurred:
>
> Traceback (most recent call last):
>   File "/usr/local/lib/python3.5/dist-packages/flask/app.py", line 1988,
> in wsgi_app
>     response = self.full_dispatch_request()
>   File "/usr/local/lib/python3.5/dist-packages/flask/app.py", line 1641,
> in full_dispatch_request
>     rv = self.handle_user_exception(e)
>   File "/usr/local/lib/python3.5/dist-packages/flask/app.py", line 1544,
> in handle_user_exception
>     reraise(exc_type, exc_value, tb)
>   File "/usr/local/lib/python3.5/dist-packages/flask/_compat.py", line
> 33, in reraise
>     raise value
>   File "/usr/local/lib/python3.5/dist-packages/flask/app.py", line 1639,
> in full_dispatch_request
>     rv = self.dispatch_request()
>   File "/usr/local/lib/python3.5/dist-packages/flask/app.py", line 1625,
> in dispatch_request
>     return self.view_functions[rule.endpoint](**req.view_args)
>   File "/usr/local/lib/python3.5/dist-packages/flask_admin/base.py", line
> 69, in inner
>     return self._run_view(f, *args, **kwargs)
>   File "/usr/local/lib/python3.5/dist-packages/flask_admin/base.py", line
> 368, in _run_view
>     return fn(self, *args, **kwargs)
>   File "/usr/local/lib/python3.5/dist-packages/flask_login.py", line 758,
> in decorated_view
>     return func(*args, **kwargs)
>   File "/usr/local/lib/python3.5/dist-packages/airflow/www/utils.py",
> line 262, in wrapper
>     return f(*args, **kwargs)
>   File "/usr/local/lib/python3.5/dist-packages/airflow/www/views.py",
> line 715, in log
>     .format(task_log_reader, e.message)]
> AttributeError: 'AttributeError' object has no attribute 'message'
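The secondary failure at the bottom of the traceback can be sketched outside Airflow (the names below are hypothetical stand-ins, not the actual views.py code): the error handler calls `e.message`, but Python 3 exceptions no longer carry a `.message` attribute (removed in 3.0), so handling the first AttributeError raises a second one. Portable code uses `str(e)` or `e.args` instead.

```python
# Sketch of the double AttributeError: `handler` is None (the configured
# task_log_reader was not found), and the except block then trips over
# the Python-2-only `.message` attribute.
handler = None                     # stand-in for a missing log handler
try:
    logs = handler.read(None)      # like `logs = handler.read(ti)` above
except AttributeError as exc:
    had_message = hasattr(exc, "message")  # False: exc.message would raise
    text = str(exc)                # the portable way to get the message
    print(text)
```

So there are two things to chase here: why `handler` is None (likely the logging configuration), and the `e.message` usage in views.py, which breaks on any exception under Python 3.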
>
>
> > On 8 Nov 2017, at 17:46, Chris Riccomini <cr...@apache.org> wrote:
> >
> > Anyone? :/
> >> |71400b
> >> AIRFLOW-1556|BigQueryBaseCursor should support SQL parameters  |#2557
> >> |9df0ac
> >> AIRFLOW-1546| add Zymergen to org list in README               |#2512
> >> |7cc346
> >> AIRFLOW-1535|Add support for Dataproc serviceAccountScopes in D|#2546
> >> |b1f902
> >> AIRFLOW-1529|Support quoted newlines in Google BigQuery load jo|#2545
> >> |4a4b02
> >> AIRFLOW-1527|Refactor celery config to make use of template    |#2542
> >> |f4437b
> >> AIRFLOW-1522|Increase size of val column for variable table in |#2535
> >> |8a2d24
> >> AIRFLOW-1521|Template fields definition for bigquery_table_dele|#2534
> >> |f1a7c0
> >> AIRFLOW-1520|S3Hook uses boto2                                 |#2532
> >> |386583
> >> AIRFLOW-1519|Main DAG list page does not scale using client sid|#2531
> >> |d7d7ce
> >> AIRFLOW-1512|Add operator for running Python functions in a vir|#2446
> >> |14e6d7
> >> AIRFLOW-1507|Make src, dst and bucket parameters as templated i|#2516
> >> |d295cf
> >> AIRFLOW-1505|Document when Jinja substitution occurs           |#2523
> >> |984a87
> >> AIRFLOW-1504|Log Cluster Name on Dataproc Operator When Execute|#2517
> >> |1cd6c4
> >> AIRFLOW-1499|Eliminate duplicate and unneeded code             |-     |-
> >>
> >> AIRFLOW-1497|Hidden fields in connection form aren't reset when|#2507
> >> |d8da8b
> >> AIRFLOW-1493|Fix race condition with airflow run               |#2505
> >> |b2e175
> >> AIRFLOW-1492|Add metric for task success/failure               |#2504
> >> |fa84d4
> >> AIRFLOW-1489|Docs: Typo in BigQueryCheckOperator               |#2501
> >> |111ce5
> >> AIRFLOW-1483|Page size on model views is to large to render qui|#2497
> >> |04bfba
> >> AIRFLOW-1478|Chart -> Owner column should be sortable          |#2493
> >> |651e60
> >> AIRFLOW-1476|Add INSTALL file for source releases              |#2492
> >> |da76ac
> >> AIRFLOW-1474|Add dag_id regex for 'airflow clear' CLI command  |#2486
> >> |18f849
> >> AIRFLOW-1470|BashSensor Implementation                         |-     |-
> >>
> >> AIRFLOW-1459|integration rst doc is broken in github view      |#2481
> >> |322ec9
> >> AIRFLOW-1438|Scheduler batch queries should have a limit       |#2462
> >> |3547cb
> >> AIRFLOW-1437|BigQueryTableDeleteOperator should define deletion|#2459
> >> |b87903
> >> AIRFLOW-1432|NVD3 Charts do not have labeled axes and units cha|#2710
> >> |70ffa4
> >> AIRFLOW-1402|Cleanup SafeConfigParser DeprecationWarning       |#2435
> >> |38c86b
> >> AIRFLOW-1401|Standardize GCP project, region, and zone argument|#2439
> >> |b6d363
> >> AIRFLOW-1397|Airflow 1.8.1 - No data displays in Last Run Colum|-     |-
> >>
> >> AIRFLOW-1394|Add quote_character parameter to GoogleCloudStorag|#2428
> >> |9fd0be
> >> AIRFLOW-1389|BigQueryOperator should support `createDisposition|#2470
> >> |6e2640
> >> AIRFLOW-1384|Add ARGO/CaDC                                     |#2434
> >> |715947
> >> AIRFLOW-1368|Automatically remove the container when it exits  |#2653
> >> |d42d23
> >> AIRFLOW-1359|Provide GoogleCloudML operator for model evaluatio|#2407
> >> |194d1d
> >> AIRFLOW-1356|add `--celery_hostname` to `airflow worker`       |#2405
> >> |b9d7d1
> >> AIRFLOW-1352|Revert bad logging Handler                        |-     |-
> >>
> >> AIRFLOW-1350|Add "query_uri" parameter for Google DataProc oper|#2402
> >> |d32c72
> >> AIRFLOW-1348|Paginated UI has broken toggles after first page  |-     |-
> >>
> >> AIRFLOW-1345|Don't commit on each loop                         |#2397
> >> |0dd002
> >> AIRFLOW-1344|Builds failing on Python 3.5 with AttributeError  |#2394
> >> |2a5883
> >> AIRFLOW-1343|Add airflow default label to the dataproc operator|#2396
> >> |e4b240
> >> AIRFLOW-1338|gcp_dataflow_hook is incompatible with the recent |#2388
> >> |cf2605
> >> AIRFLOW-1337|Customize log format via config file              |#2392
> >> |4841e3
> >> AIRFLOW-1335|Use buffered logger                               |#2386
> >> |0d23d3
> >> AIRFLOW-1333|Enable copy function for Google Cloud Storage Hook|#2385
> >> |e2c383
> >> AIRFLOW-1331|Contrib.SparkSubmitOperator should allow --package|#2622
> >> |fbca8f
> >> AIRFLOW-1330|Connection.parse_from_uri doesn't work for google_|#2525
> >> |6e5e9d
> >> AIRFLOW-1324|Make the Druid operator/hook more general         |#2378
> >> |de99aa
> >> AIRFLOW-1323|Operators related to Dataproc should keep some par|#2636
> >> |ed248d
> >> AIRFLOW-1315|Add Qubole File and Partition Sensors             |-     |-
> >>
> >> AIRFLOW-1309|Add optional hive_tblproperties in HiveToDruidTran|-     |-
> >>
> >> AIRFLOW-1301|Add New Relic to Airflow user list                |#2359
> >> |355fc9
> >> AIRFLOW-1299|Google Dataproc cluster creation operator should s|#2358
> >> |c2b80e
> >> AIRFLOW-1289|Don't restrict scheduler threads to CPU cores     |#2353
> >> |8e23d2
> >> AIRFLOW-1286|BaseTaskRunner - Exception TypeError: a bytes-like|#2363
> >> |d8891d
> >> AIRFLOW-1277|Forbid creation of a known event with empty fields|#na
> >> |65184a
> >> AIRFLOW-1276|Forbid event creation with end_data earlier than s|#na
> >> |d5d02f
> >> AIRFLOW-1275|Fix `airflow pool` command exception              |#2346
> >> |9958aa
> >> AIRFLOW-1273|Google Cloud ML Version and Model CRUD Operator   |#2379
> >> |534a0e
> >> AIRFLOW-1272|Google Cloud ML Batch Prediction Operator         |#2390
> >> |e92d6b
> >> AIRFLOW-1271|Google Cloud ML Training Operator                 |#2408
> >> |0fc450
> >> AIRFLOW-1256|Add United Airlines as Airflow user               |#2332
> >> |d3484a
> >> AIRFLOW-1251|Add eRevalue as an Airflow user                   |#2331
> >> |8d5160
> >> AIRFLOW-1248|Fix inconsistent configuration name for worker tim|#2328
> >> |92314f
> >> AIRFLOW-1247|CLI: ignore all dependencies argument ignored     |#2441
> >> |e88ecf
> >> AIRFLOW-1245|Fix random failure of test_trigger_dag_for_date un|#2325
> >> |cef01b
> >> AIRFLOW-1244|Forbid creation of a pool with empty name         |#2324
> >> |df9a10
> >> AIRFLOW-1242|BigQueryHook assumes that a valid project_id can't|#2335
> >> |ffe616
> >> AIRFLOW-1237|Fix IN-predicate sqlalchemy warning               |#2320
> >> |a1f422
> >> AIRFLOW-1234|Cover utils.operator_helpers with unit tests      |#2317
> >> |d16537
> >> AIRFLOW-1233|Cover utils.json with unit tests                  |#2316
> >> |502410
> >> AIRFLOW-1232|Remove deprecated readfp warning                  |#2315
> >> |6ffaaf
> >> AIRFLOW-1231|Use flask_wtf.CSRFProtect instead of flask_wtf.Csr|#2313
> >> |cac49e
> >> AIRFLOW-1221|Fix DatabricksSubmitRunOperator Templating        |#2308
> >> |0fa104
> >> AIRFLOW-1217|Enable logging in Sqoop hook                      |#2307
> >> |4f459b
> >> AIRFLOW-1213|Add hcatalog parameters to the sqoop operator/hook|#2305
> >> |857850
> >> AIRFLOW-1208|Speed-up cli tests                                |#2301
> >> |21c142
> >> AIRFLOW-1207|Enable utils.helpers unit tests                   |#2300
> >> |8ac87b
> >> AIRFLOW-1203|Tests failing after oauth upgrade                 |#2296
> >> |3e9c66
> >> AIRFLOW-1201|Update deprecated 'nose-parameterized' library to |#2298
> >> |d2d3e4
> >> AIRFLOW-1193|Add Checkr to Airflow user list                   |#2276
> >> |707238
> >> AIRFLOW-1189|Get pandas DataFrame using BigQueryHook fails     |#2287
> >> |93666f
> >> AIRFLOW-1188|Add max_bad_records param to GoogleCloudStorageToB|#2286
> >> |443e6b
> >> AIRFLOW-1187|Obsolete package names in documentation           |-     |-
> >>
> >> AIRFLOW-1185|Incorrect url to PyPi                             |#2283
> >> |829755
> >> AIRFLOW-1181|Enable delete and list function for Google Cloud S|#2281
> >> |24f73c
> >> AIRFLOW-1179|Pandas 0.20 broke Google BigQuery hook            |#2279
> >> |ac9ccb
> >> AIRFLOW-1177|variable json deserialize does not work at set def|#2540
> >> |65319a
> >> AIRFLOW-1175|Add Pronto Tools to Airflow user list             |#2277
> >> |86aafa
> >> AIRFLOW-1173|Add Robinhood to list of Airflow users            |#2271
> >> |379115
> >> AIRFLOW-1165|airflow webservice crashes on ubuntu16 - python3  |-     |-
> >>
> >> AIRFLOW-1160|Update SparkSubmitOperator parameters             |#2265
> >> |2e3f07
> >> AIRFLOW-1155|Add Tails.com to community                        |#2261
> >> |2fa690
> >> AIRFLOW-1149|Allow custom filters to be added to jinja2        |#2258
> >> |48135a
> >> AIRFLOW-1141|Remove DAG.crawl_for_tasks method                 |#2275
> >> |a30fee
> >> AIRFLOW-1140|DatabricksSubmitRunOperator should template the "j|#2255
> >> |e6d316
> >> AIRFLOW-1136|Invalid parameters are not captured for Sqoop oper|#2252
> >> |2ef4db
> >> AIRFLOW-1125|Clarify documentation regarding fernet_key        |#2251
> >> |831f8d
> >> AIRFLOW-1122|Node strokes are too thin for people with color vi|#2246
> >> |a08761
> >> AIRFLOW-1121|airflow webserver --pid no longer write out pid fi|-     |-
> >>
> >> AIRFLOW-1118|Add evo.company to Airflow users                  |#2243
> >> |f16914
> >> AIRFLOW-1112|Log which pool is full in scheduler when pool slot|#2242
> >> |74c1ce
> >> AIRFLOW-1107|Add support for ftps non-default port             |#2240
> >> |4d0c2f
> >> AIRFLOW-1106|Add Groupalia/Letsbonus                           |#2239
> >> |945b42
> >> AIRFLOW-1095|ldap_auth memberOf should come from configuration |#2232
> >> |6b1c32
> >> AIRFLOW-1094|Invalid unit tests under `contrib/`               |#2234
> >> |219c50
> >> AIRFLOW-1091|As a release manager I want to be able to compare |#2231
> >> |bfae42
> >> AIRFLOW-1090|Add HBO                                           |#2230
> >> |177d34
> >> AIRFLOW-1089|Add Spark application arguments to SparkSubmitOper|#2229
> >> |e5b914
> >> AIRFLOW-1081|Task duration page is slow                        |#2226
> >> |0da512
> >> AIRFLOW-1075|Cleanup security docs                             |#2222
> >> |5a6f18
> >> AIRFLOW-1065|Add functionality for Azure Blob Storage          |#2216
> >> |f1bc5f
> >> AIRFLOW-1059|Reset_state_for_orphaned_task should operate in ba|#2205
> >> |e05d3b
> >> AIRFLOW-1058|Improvements for SparkSubmitOperator              |-     |-
> >>
> >> AIRFLOW-1051|Add a test for resetdb to CliTests                |#2198
> >> |15aee0
> >> AIRFLOW-1047|Airflow logs vulnerable to XSS                    |#2193
> >> |fe9ebe
> >> AIRFLOW-1045|Make log level configurable via airflow.cfg       |#2191
> >> |e739a5
> >> AIRFLOW-1043|Documentation issues for operators                |#2188
> >> |b55f41
> >> AIRFLOW-1041|DockerOperator replaces its xcom_push method with |#2274
> >> |03704c
> >> AIRFLOW-1040|Fix typos in comments/docstrings in models.py     |#2174
> >> |d8c0f5
> >> AIRFLOW-1036|Exponential backoff should use randomization      |#2262
> >> |66168e
> >> AIRFLOW-1035|Exponential backoff retry logic should use 2 as ba|#2196
> >> |4ec932
> >> AIRFLOW-1034|Make it possible to connect to S3 in sigv4 regions|#2181
> >> |4c0905
> >> AIRFLOW-1031|'scheduled__' may replace with DagRun.ID_PREFIX in|#2613
> >> |aa3844
> >> AIRFLOW-1030|HttpHook error when creating HttpSensor           |-     |-
> >>
> >> AIRFLOW-1028|Databricks Operator for Airflow                   |#2202
> >> |53ca50
> >> AIRFLOW-1024|Handle CeleryExecutor errors gracefully           |#2355
> >> |7af20f
> >> AIRFLOW-1018|Scheduler DAG processes can not log to stdout     |#2728
> >> |ef775d
> >> AIRFLOW-1016|Allow HTTP HEAD request method on HTTPSensor      |#2175
> >> |4c41f6
> >> AIRFLOW-1010|Add a convenience script for signing              |#2169
> >> |a2b65a
> >> AIRFLOW-1009|Remove SQLOperator from Concepts page             |#2168
> >> |7d1144
> >> AIRFLOW-1007|Jinja sandbox is vulnerable to RCE                |#2184
> >> |daa281
> >> AIRFLOW-1005|Speed up Airflow startup time                     |#na
> >> |996dd3
> >> AIRFLOW-999 |Support for Redis database                        |#2165
> >> |8de850
> >> AIRFLOW-997 |Change setup.cfg to point to Apache instead of Max|#na
> >> |75cd46
> >> AIRFLOW-995 |Update Github PR template                         |#2163
> >> |b62485
> >> AIRFLOW-994 |Add MiNODES to the AIRFLOW Active Users List      |#2159
> >> |ca1623
> >> AIRFLOW-991 |Mark_success while a task is running leads to fail|-     |-
> >>
> >> AIRFLOW-990 |DockerOperator fails when logging unicode string  |#2155
> >> |6bbf54
> >> AIRFLOW-988 |SLA Miss Callbacks Are Repeated if Email is Not be|#2415
> >> |6e74d4
> >> AIRFLOW-985 |Extend the sqoop operator/hook with additional par|#2177
> >> |82eb20
> >> AIRFLOW-984 |Subdags unrecognized when subclassing SubDagOperat|#2152
> >> |a8bd16
> >> AIRFLOW-979 |Add GovTech GDS                                   |#2149
> >> |b17bd3
> >> AIRFLOW-976 |Mark success running task causes it to fail       |-     |-
> >>
> >> AIRFLOW-969 |Catch bad python_callable argument at DAG construc|#2142
> >> |12901d
> >> AIRFLOW-963 |Some code examples are not rendered in the airflow|#2139
> >> |f69c1b
> >> AIRFLOW-960 |Add support for .editorconfig                     |#na
> >> |f5cacc
> >> AIRFLOW-959 |.gitignore file is disorganized and incomplete    |#na
> >> |3d3c14
> >> AIRFLOW-958 |Improve tooltip readability                       |#2134
> >> |b3c3eb
> >> AIRFLOW-950 |Missing AWS integrations on documentation::integra|#2552
> >> |01be02
> >> AIRFLOW-947 |Make PrestoHook surface better messages when the P|#na
> >> |6dd4b3
> >> AIRFLOW-945 |Revert psycopg2 workaround when psycopg2 2.7.1 is |-     |-
> >>
> >> AIRFLOW-943 |Add Digital First Media to the Airflow users list |#2115
> >> |2cfe28
> >> AIRFLOW-942 |Add mytaxi to Airflow Users                       |#2111
> >> |d579e6
> >> AIRFLOW-935 |Impossible to use plugin executors                |#2120
> >> |08a784
> >> AIRFLOW-926 |jdbc connector is broken due to jaydebeapi api upd|#2651
> >> |07ed29
> >> AIRFLOW-917 |Incorrectly formatted failure status message      |#2109
> >> |b8164c
> >> AIRFLOW-916 |Fix ConfigParser deprecation warning              |#2108
> >> |ef6dd1
> >> AIRFLOW-911 |Add colouring and profiling info on tests         |#2106
> >> |4f52db
> >> AIRFLOW-903 |Add configuration setting for default DAG view.   |#2103
> >> |cadfae
> >> AIRFLOW-896 |BigQueryOperator fails to execute with certain inp|#2097
> >> |2bceee
> >> AIRFLOW-891 |Webserver Clock Should Include Day                |-     |-
> >>
> >> AIRFLOW-889 |Minor error in the docstrings for BaseOperator.   |#2084
> >> |50702d
> >> AIRFLOW-887 |Add compatibility with future v0.16               |#na
> >> |50902d
> >> AIRFLOW-886 |Pass Operator result to post_execute hook         |#na
> >> |4da361
> >> AIRFLOW-885 |Add Change.org to the list of Airflow users       |#2089
> >> |a279be
> >> AIRFLOW-882 |Code example in docs has unnecessary DAG>>Operator|#2088
> >> |baa4cd
> >> AIRFLOW-881 |Create SubDagOperator within DAG context manager w|#2087
> >> |0ed608
> >> AIRFLOW-880 |Fix remote log functionality inconsistencies for W|#2086
> >> |974b75
> >> AIRFLOW-877 |GoogleCloudStorageDownloadOperator: template_ext c|#2083
> >> |debc69
> >> AIRFLOW-875 |Allow HttpSensor params to be templated           |#2080
> >> |62f503
> >> AIRFLOW-871 |multiple places use logging.warn() instead of warn|#2082
> >> |21d775
> >> AIRFLOW-866 |Add FTPSensor                                     |#2070
> >> |5f87f8
> >> AIRFLOW-863 |Example DAG start dates should be recent to avoid |#2068
> >> |bbfd43
> >> AIRFLOW-862 |Add DaskExecutor                                  |#2067
> >> |6e2210
> >> AIRFLOW-860 |Circular module dependency prevents loading of cus|-     |-
> >>
> >> AIRFLOW-854 |Add Open Knowledge International to Airflow users |#2061
> >> |51a311
> >> AIRFLOW-842 |scheduler.clean_dirty raises warning: SAWarning: T|#2072
> >> |485280
> >> AIRFLOW-840 |Python3 encoding issue in Kerberos                |#2158
> >> |639336
> >> AIRFLOW-836 |The paused and queryview endpoints are vulnerable |#2054
> >> |6aca2c
> >> AIRFLOW-831 |Fix broken unit tests                             |#2050
> >> |b86194
> >> AIRFLOW-830 |Plugin manager should log to debug, not info      |-     |-
> >>
> >> AIRFLOW-829 |Reduce verbosity of successful Travis unit tests  |-     |-
> >>
> >> AIRFLOW-826 |Add Zendesk Hook                                  |#2066
> >> |a09762
> >> AIRFLOW-823 |Make task instance details available via API      |#2045
> >> |3f546e
> >> AIRFLOW-822 |Close the connection before throwing exception in |#2038
> >> |4b6c38
> >> AIRFLOW-821 |Scheduler dagbag importing not Py3 compatible     |#2039
> >> |fbb59b
> >> AIRFLOW-809 |SqlAlchemy is_ ColumnOperator Causing Errors in MS|-     |-
> >>
> >> AIRFLOW-802 |Integration of spark-submit                       |-     |-
> >>
> >> AIRFLOW-781 |Allow DataFlowJavaOperator to accept jar file stor|#2037
> >> |259c86
> >> AIRFLOW-770 |HDFS hooks should support alternative ways of gett|#2056
> >> |261b65
> >> AIRFLOW-756 |Refactor ssh_hook and ssh_operator                |-     |-
> >>
> >> AIRFLOW-751 |SFTP file transfer functionality                  |#1999
> >> |fe0ede
> >> AIRFLOW-725 |Make merge tool use OS' keyring for password stora|#1966
> >> |8c1695
> >> AIRFLOW-706 |Configuration shell commands are not split properl|#2053
> >> |0bb6f2
> >> AIRFLOW-705 |airflow.configuration.run_command output does not |-     |-
> >>
> >> AIRFLOW-681 |homepage doc link should pointing to apache's repo|#2164
> >> |a8027a
> >> AIRFLOW-654 |SSL for AMQP w/ Celery(Executor)                  |#2333
> >> |868bfe
> >> AIRFLOW-645 |HttpHook ignores https                            |#2311
> >> |fd381a
> >> AIRFLOW-365 |Code view in subdag trigger exception             |#2043
> >> |cf102c
> >> AIRFLOW-300 |Add Google Pubsub hook and operator               |#2036
> >> |d231dc
> >> AIRFLOW-289 |Use datetime.utcnow() to keep airflow system indep|#2618
> >> |20c83e
> >> AIRFLOW-71  |docker_operator - pulling from private repositorie|#na
> >> |d4406c
> >>
> >> Cheers,
> >> Chris
> >>
>
>

Re: [VOTE] Airflow 1.9.0rc1

Posted by Arthur Wiedmer <ar...@gmail.com>.
Ash,

There seem to be some solutions, but they are pretty hacky and poorly
documented:

https://stackoverflow.com/questions/18026980/python-setuptools-how-can-i-list-a-private-repository-under-install-requires
https://github.com/pypa/pip/issues/2124

That said, we should be able to figure out a path :)

Maybe we can get the incubator to reconsider for this particular case.

Best,
Arthur

On Fri, Nov 10, 2017 at 1:25 AM, Ash Berlin-Taylor <
ash_airflowlist@firemirror.com> wrote:

> The other difference is that if you depend upon airflow in a module where
> you want to put it in the install_requires section of a setup.py (not an
> application which has a requirements.txt, say) you can't use a git tag. Or
> at least I couldn't get it working.
>
> It doesn't make a difference a lot of the time, but it is occasionally useful.
>
> -ash
>
> > On 9 Nov 2017, at 23:08, Alek Storm <al...@gmail.com> wrote:
> >
> > It’s not a major difference, but installing from a git repo via pip
> > requires a completely different syntax, which complicates our tooling,
> e.g.:
> >
> > $ pip install 'apache-airflow[postgres,celery,rabbitmq]=={{version}}'
> >
> > $ pip install 'git+git://github.com/apache/
> incubator-airflow@{{version}}#egg=apache-airflow[postgres,celery,rabbitmq]
> '
> >
> > Alek
> > ​
> >
> > On Thu, Nov 9, 2017 at 3:53 PM, Arthur Wiedmer <arthur.wiedmer@gmail.com
> >
> > wrote:
> >
> >> I agree with Bolke that it would be better to provide dev releases in
> PyPI,
> >> but my understanding was that, while not an official release channel, it
> >> still has the apache branding and we should be careful nonetheless.
> >>
> >> I am still confused as to why installing from a git tag or the like is
> not
> >> OK for testing, provided our release artifact creation process is
> >> consistent.
> >>
> >> Best,
> >> Arthur
> >>
> >> On Thu, Nov 9, 2017 at 12:09 PM, Daniel Huang <dx...@gmail.com>
> wrote:
> >>
> >>> This is how pip handles RC/beta versions:
> >>>
> >>>
> >>>> Pre-release Versions
> >>>> Starting with v1.4, pip will only install stable versions as specified
> >> by
> >>>> PEP426 by default. If a version cannot be parsed as a compliant PEP426
> >>>> version then it is assumed to be a pre-release.
> >>>> If a Requirement specifier includes a pre-release or development
> >> version
> >>>> (e.g. >=0.0.dev0) then pip will allow pre-release and development
> >>> versions
> >>>> for that requirement. This does not include the != flag.
> >>>> The pip install command also supports a --pre flag that will enable
> >>>> installing pre-releases and development releases.
> >>>
> >>>
> >>> Source:
> >>> https://pip.pypa.io/en/stable/reference/pip_install/#pre-
> >> release-versions
> >>> <https://pip.pypa.io/en/stable/reference/pip_install/#
> >> pre-release-versions
> >>>>
> >>>
> >>> On Thu, Nov 9, 2017 at 11:54 AM, Bolke de Bruin <bd...@gmail.com>
> >> wrote:
> >>>
> >>>> I think we should put this up for discussion. PyPi is not an official
> >>>> apache channel, so in theory we could put anything on PyPI. I also
> >> think
> >>>> (didn’t confirm) pip doesn’t upgrade to RC/beta etc.
> >>>>
> >>>> Any thoughts?
> >>>>
> >>>> Bolke.
> >>>>
> >>>>> On 9 Nov 2017, at 15:53, Arthur Wiedmer <ar...@apache.org> wrote:
> >>>>>
> >>>>> Hi Alek,
> >>>>>
> >>>>> Technically, we cannot release a distribution on PyPI until we have
> >>> voted
> >>>>> on a release. And here usually a release artifact. It is a little
> >>>>> convoluted in the case of Python, but we are getting the hang of it.
> >>>>>
> >>>>> That said, installing from a git reference is a possibility too if
> >> you
> >>>> want
> >>>>> the fastest path to install.
> >>>>>
> >>>>> Best,
> >>>>> Arthur
> >>>>>
> >>>>> On Nov 9, 2017 06:34, "Alek Storm" <al...@gmail.com> wrote:
> >>>>>
> >>>>> I think this has been mentioned before, but it would be much easier
> >> for
> >>>> us
> >>>>> (my team) to test RCs if they were published to PyPI. Or is that
> >>> against
> >>>>> Apache guidelines?
> >>>>>
> >>>>> Alek
> >>>>>
> >>>>> On Thu, Nov 9, 2017 at 8:29 AM, Michael Crawford <
> >>>>> michael.crawford@modernizingmedicine.com> wrote:
> >>>>>
> >>>>>> Thanks.  Yes I understand it isn’t released yet.
> >>>>>>
> >>>>>>
> >>>>>>> On Nov 9, 2017, at 9:09 AM, Driesprong, Fokko <fokko@driesprong.frl
> >>>
> >>>>>> wrote:
> >>>>>>>
> >>>>>>> Hi Michael,
> >>>>>>>
> >>>>>>> You have to install it from the tar.gz:
> >>>>>>>
> >>>>>>> wget
> >>>>>>> https://dist.apache.org/repos/dist/dev/incubator/airflow/1.
> >>>>>> 9.0rc1/apache-airflow-1.9.0rc1+incubating-bin.tar.gz
> >>>>>>> pip install /tmp/apache-airflow.tar.gz
> >>>>>>>
> >>>>>>> The steps of updating, are in the UPDATING.md:
> >>>>>>> https://github.com/apache/incubator-airflow/blob/master/
> >> UPDATING.md
> >>>>>>>
> >>>>>>> Please note that 1.9 is not released yet, but you are welcome to
> >> try
> >>>> out
> >>>>>>> RC1.
> >>>>>>>
> >>>>>>> Cheers, Fokko
> >>>>>>
> >>>>>>
> >>>>
> >>>>
> >>>
> >>
>
>
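[Editorial note] The pip pre-release rule quoted from the pip documentation in the thread above can be illustrated with a toy ordering check. This is a deliberately simplified PEP 440-style comparison written for this note, not pip's actual resolver, but it shows why a plain `==1.9.0` pin would never pull in 1.9.0rc1 without `--pre` or an explicit `==1.9.0rc1`:

```python
# Simplified sketch of pip's pre-release handling (quoted above).
# NOT pip's real resolver: a toy PEP 440-style key showing that
# "1.9.0rc1" sorts before the final "1.9.0" but after "1.8.2".
import re

def parse(version):
    """Split '1.9.0rc1' into ((1, 9, 0), (2, 1)); final releases sort last."""
    m = re.match(r"^(\d+(?:\.\d+)*)(?:(a|b|rc)(\d+))?$", version)
    release = tuple(int(p) for p in m.group(1).split("."))
    if m.group(2):
        # Pre-release segment sorts before the final release of the same version.
        pre = ({"a": 0, "b": 1, "rc": 2}[m.group(2)], int(m.group(3)))
    else:
        pre = (3, 0)  # final release: greater than any pre-release tag
    return release, pre

assert parse("1.9.0rc1") < parse("1.9.0")  # rc precedes the final release
assert parse("1.8.2") < parse("1.9.0rc1")  # but is still newer than 1.8.2
```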

Re: [VOTE] Airflow 1.9.0rc1

Posted by Ash Berlin-Taylor <as...@firemirror.com>.
The other difference is that if you depend upon airflow in a module where you want to put it in the install_requires section of a setup.py (not an application which has a requirements.txt, say) you can't use a git tag. Or at least I couldn't get it working.

It doesn't make a difference a lot of the time, but it is occasionally useful.

-ash

> On 9 Nov 2017, at 23:08, Alek Storm <al...@gmail.com> wrote:
> 
> It’s not a major difference, but installing from a git repo via pip
> requires a completely different syntax, which complicates our tooling, e.g.:
> 
> $ pip install 'apache-airflow[postgres,celery,rabbitmq]=={{version}}'
> 
> $ pip install 'git+git://github.com/apache/incubator-airflow@{{version}}#egg=apache-airflow[postgres,celery,rabbitmq]'
> 
> Alek
> ​
> 
> On Thu, Nov 9, 2017 at 3:53 PM, Arthur Wiedmer <ar...@gmail.com>
> wrote:
> 
>> I agree with Bolke that it would be better to provide dev releases in PyPI,
>> but my understanding was that, while not an official release channel, it
>> still has the apache branding and we should be careful nonetheless.
>> 
>> I am still confused as to why installing from a git tag or the like is not
>> OK for testing, provided our release artifact creation process is
>> consistent.
>> 
>> Best,
>> Arthur
>> 
>> On Thu, Nov 9, 2017 at 12:09 PM, Daniel Huang <dx...@gmail.com> wrote:
>> 
>>> This is how pip handles RC/beta versions:
>>> 
>>> 
>>>> Pre-release Versions
>>>> Starting with v1.4, pip will only install stable versions as specified
>> by
>>>> PEP426 by default. If a version cannot be parsed as a compliant PEP426
>>>> version then it is assumed to be a pre-release.
>>>> If a Requirement specifier includes a pre-release or development
>> version
>>>> (e.g. >=0.0.dev0) then pip will allow pre-release and development
>>> versions
>>>> for that requirement. This does not include the != flag.
>>>> The pip install command also supports a --pre flag that will enable
>>>> installing pre-releases and development releases.
>>> 
>>> 
>>> Source:
>>> https://pip.pypa.io/en/stable/reference/pip_install/#pre-
>> release-versions
>>> <https://pip.pypa.io/en/stable/reference/pip_install/#
>> pre-release-versions
>>>> 
>>> 
>>> On Thu, Nov 9, 2017 at 11:54 AM, Bolke de Bruin <bd...@gmail.com>
>> wrote:
>>> 
>>>> I think we should put this up for discussion. PyPi is not an official
>>>> apache channel, so in theory we could put anything on PyPI. I also
>> think
>>>> (didn’t confirm) pip doesn’t upgrade to RC/beta etc.
>>>> 
>>>> Any thoughts?
>>>> 
>>>> Bolke.
>>>> 
>>>>> On 9 Nov 2017, at 15:53, Arthur Wiedmer <ar...@apache.org> wrote:
>>>>> 
>>>>> Hi Alek,
>>>>> 
>>>>> Technically, we cannot release a distribution on PyPI until we have
>>> voted
>>>>> on a release. And here usually a release artifact. It is a little
>>>>> convoluted in the case of Python, but we are getting the hang of it.
>>>>> 
>>>>> That said, installing from a git reference is a possibility too if
>> you
>>>> want
>>>>> the fastest path to install.
>>>>> 
>>>>> Best,
>>>>> Arthur
>>>>> 
>>>>> On Nov 9, 2017 06:34, "Alek Storm" <al...@gmail.com> wrote:
>>>>> 
>>>>> I think this has been mentioned before, but it would be much easier
>> for
>>>> us
>>>>> (my team) to test RCs if they were published to PyPI. Or is that
>>> against
>>>>> Apache guidelines?
>>>>> 
>>>>> Alek
>>>>> 
>>>>> On Thu, Nov 9, 2017 at 8:29 AM, Michael Crawford <
>>>>> michael.crawford@modernizingmedicine.com> wrote:
>>>>> 
>>>>>> Thanks.  Yes I understand it isn’t released yet.
>>>>>> 
>>>>>> 
>>>>>>> On Nov 9, 2017, at 9:09 AM, Driesprong, Fokko <fokko@driesprong.frl
>>> 
>>>>>> wrote:
>>>>>>> 
>>>>>>> Hi Michael,
>>>>>>> 
>>>>>>> You have to install it from the tar.gz:
>>>>>>> 
>>>>>>> wget
>>>>>>> https://dist.apache.org/repos/dist/dev/incubator/airflow/1.
>>>>>> 9.0rc1/apache-airflow-1.9.0rc1+incubating-bin.tar.gz
>>>>>>> pip install /tmp/apache-airflow.tar.gz
>>>>>>> 
>>>>>>> The steps of updating, are in the UPDATING.md:
>>>>>>> https://github.com/apache/incubator-airflow/blob/master/
>> UPDATING.md
>>>>>>> 
>>>>>>> Please note that 1.9 is not released yet, but you are welcome to
>> try
>>>> out
>>>>>>> RC1.
>>>>>>> 
>>>>>>> Cheers, Fokko
>>>>>> 
>>>>>> 
>>>> 
>>>> 
>>> 
>> 
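[Editorial note] Ash's limitation above can be sketched concretely. The regex below is a deliberately simplified stand-in for the real requirement grammar (it is not setuptools code): a pinned pre-release parses as a valid `install_requires` entry, while the pip-style git URL form does not fit that grammar at all.

```python
# Toy sketch: install_requires entries are plain requirement strings
# (name, optional extras, version specifier), so a pinned pre-release
# fits but the pip command-line git URL form does not.
# This regex is a simplified stand-in for the real PEP 508 grammar.
import re

REQ = re.compile(
    r"^[A-Za-z0-9][A-Za-z0-9._-]*"   # distribution name
    r"(\[[A-Za-z0-9_,.-]+\])?"       # optional extras, e.g. [postgres,celery]
    r"(==|>=|<=|>|<|!=|~=)\S+$"      # version specifier
)

assert REQ.match("apache-airflow[postgres,celery,rabbitmq]==1.9.0rc1")
assert not REQ.match(
    "git+git://github.com/apache/incubator-airflow@1.9.0rc1#egg=apache-airflow"
)
```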


Re: [VOTE] Airflow 1.9.0rc1

Posted by Alek Storm <al...@gmail.com>.
It’s not a major difference, but installing from a git repo via pip
requires a completely different syntax, which complicates our tooling, e.g.:

$ pip install 'apache-airflow[postgres,celery,rabbitmq]=={{version}}'

$ pip install 'git+git://github.com/apache/incubator-airflow@{{version}}#egg=apache-airflow[postgres,celery,rabbitmq]'

Alek
​

On Thu, Nov 9, 2017 at 3:53 PM, Arthur Wiedmer <ar...@gmail.com>
wrote:

> I agree with Bolke that it would be better to provide dev releases in PyPI,
> but my understanding was that, while not an official release channel, it
> still has the apache branding and we should be careful nonetheless.
>
> I am still confused as to why installing from a git tag or the like is not
> OK for testing, provided our release artifact creation process is
> consistent.
>
> Best,
> Arthur
>
> On Thu, Nov 9, 2017 at 12:09 PM, Daniel Huang <dx...@gmail.com> wrote:
>
> > This is how pip handles RC/beta versions:
> >
> >
> > > Pre-release Versions
> > > Starting with v1.4, pip will only install stable versions as specified
> by
> > > PEP426 by default. If a version cannot be parsed as a compliant PEP426
> > > version then it is assumed to be a pre-release.
> > > If a Requirement specifier includes a pre-release or development
> version
> > > (e.g. >=0.0.dev0) then pip will allow pre-release and development
> > versions
> > > for that requirement. This does not include the != flag.
> > > The pip install command also supports a --pre flag that will enable
> > > installing pre-releases and development releases.
> >
> >
> > Source:
> > https://pip.pypa.io/en/stable/reference/pip_install/#pre-
> release-versions
> > <https://pip.pypa.io/en/stable/reference/pip_install/#
> pre-release-versions
> > >
> >
> > On Thu, Nov 9, 2017 at 11:54 AM, Bolke de Bruin <bd...@gmail.com>
> wrote:
> >
> > > I think we should put this up for discussion. PyPi is not an official
> > > apache channel, so in theory we could put anything on PyPI. I also
> think
> > > (didn’t confirm) pip doesn’t upgrade to RC/beta etc.
> > >
> > > Any thoughts?
> > >
> > > Bolke.
> > >
> > > > On 9 Nov 2017, at 15:53, Arthur Wiedmer <ar...@apache.org> wrote:
> > > >
> > > > Hi Alek,
> > > >
> > > > Technically, we cannot release a distribution on PyPI until we have
> > voted
> > > > on a release. And here usually a release artifact. It is a little
> > > > convoluted in the case of Python, but we are getting the hang of it.
> > > >
> > > > That said, installing from a git reference is a possibility too if
> you
> > > want
> > > > the fastest path to install.
> > > >
> > > > Best,
> > > > Arthur
> > > >
> > > > On Nov 9, 2017 06:34, "Alek Storm" <al...@gmail.com> wrote:
> > > >
> > > > I think this has been mentioned before, but it would be much easier
> for
> > > us
> > > > (my team) to test RCs if they were published to PyPI. Or is that
> > against
> > > > Apache guidelines?
> > > >
> > > > Alek
> > > >
> > > > On Thu, Nov 9, 2017 at 8:29 AM, Michael Crawford <
> > > > michael.crawford@modernizingmedicine.com> wrote:
> > > >
> > > >> Thanks.  Yes I understand it isn’t released yet.
> > > >>
> > > >>
> > > >>> On Nov 9, 2017, at 9:09 AM, Driesprong, Fokko <fokko@driesprong.frl
> >
> > > >> wrote:
> > > >>>
> > > >>> Hi Michael,
> > > >>>
> > > >>> You have to install it from the tar.gz:
> > > >>>
> > > >>> wget
> > > >>> https://dist.apache.org/repos/dist/dev/incubator/airflow/1.9.0rc1/apache-airflow-1.9.0rc1+incubating-bin.tar.gz
> > > >>> pip install /tmp/apache-airflow.tar.gz
> > > >>>
> > > >>> The steps of updating, are in the UPDATING.md:
> > > >>> https://github.com/apache/incubator-airflow/blob/master/UPDATING.md
> > > >>>
> > > >>> Please note that 1.9 is not released yet, but you are welcome to
> try
> > > out
> > > >>> RC1.
> > > >>>
> > > >>> Cheers, Fokko
> > > >>
> > > >>
> > >
> > >
> >
>

Re: [VOTE] Airflow 1.9.0rc1

Posted by Arthur Wiedmer <ar...@gmail.com>.
I agree with Bolke that it would be better to provide dev releases in PyPI,
but my understanding was that, while not an official release channel, it
still has the apache branding and we should be careful nonetheless.

I am still confused as to why installing from a git tag or the like is not
OK for testing, provided our release artifact creation process is
consistent.

Best,
Arthur

On Thu, Nov 9, 2017 at 12:09 PM, Daniel Huang <dx...@gmail.com> wrote:

> This is how pip handles RC/beta versions:
>
>
> > Pre-release Versions
> > Starting with v1.4, pip will only install stable versions as specified by
> > PEP426 by default. If a version cannot be parsed as a compliant PEP426
> > version then it is assumed to be a pre-release.
> > If a Requirement specifier includes a pre-release or development version
> > (e.g. >=0.0.dev0) then pip will allow pre-release and development
> versions
> > for that requirement. This does not include the != flag.
> > The pip install command also supports a --pre flag that will enable
> > installing pre-releases and development releases.
>
>
> Source:
> https://pip.pypa.io/en/stable/reference/pip_install/#pre-release-versions
>
> On Thu, Nov 9, 2017 at 11:54 AM, Bolke de Bruin <bd...@gmail.com> wrote:
>
> > I think we should put this up for discussion. PyPi is not an official
> > apache channel, so in theory we could put anything on PyPI. I also think
> > (didn’t confirm) pip doesn’t upgrade to RC/beta etc.
> >
> > Any thoughts?
> >
> > Bolke.
> >
> > > On 9 Nov 2017, at 15:53, Arthur Wiedmer <ar...@apache.org> wrote:
> > >
> > > Hi Alek,
> > >
> > > Technically, we cannot release a distribution on PyPI until we have
> voted
> > > on a release. And here usually a release artifact. It is a little
> > > convoluted in the case of Python, but we are getting the hang of it.
> > >
> > > That said, installing from a git reference is a possibility too if you
> > want
> > > the fastest path to install.
> > >
> > > Best,
> > > Arthur
> > >
> > > On Nov 9, 2017 06:34, "Alek Storm" <al...@gmail.com> wrote:
> > >
> > > I think this has been mentioned before, but it would be much easier for
> > us
> > > (my team) to test RCs if they were published to PyPI. Or is that
> against
> > > Apache guidelines?
> > >
> > > Alek
> > >
> > > On Thu, Nov 9, 2017 at 8:29 AM, Michael Crawford <
> > > michael.crawford@modernizingmedicine.com> wrote:
> > >
> > >> Thanks.  Yes I understand it isn’t released yet.
> > >>
> > >>
> > >>> On Nov 9, 2017, at 9:09 AM, Driesprong, Fokko <fo...@driesprong.frl>
> > >> wrote:
> > >>>
> > >>> Hi Michael,
> > >>>
> > >>> You have to install it from the tar.gz:
> > >>>
> > >>> wget
> > >>> https://dist.apache.org/repos/dist/dev/incubator/airflow/1.9.0rc1/apache-airflow-1.9.0rc1+incubating-bin.tar.gz
> > >>> pip install /tmp/apache-airflow.tar.gz
> > >>>
> > >>> The steps of updating, are in the UPDATING.md:
> > >>> https://github.com/apache/incubator-airflow/blob/master/UPDATING.md
> > >>>
> > >>> Please note that 1.9 is not released yet, but you are welcome to try
> > out
> > >>> RC1.
> > >>>
> > >>> Cheers, Fokko
> > >>
> > >>
> >
> >
>

Re: [VOTE] Airflow 1.9.0rc1

Posted by Daniel Huang <dx...@gmail.com>.
This is how pip handles RC/beta versions:


> Pre-release Versions
> Starting with v1.4, pip will only install stable versions as specified by
> PEP426 by default. If a version cannot be parsed as a compliant PEP426
> version then it is assumed to be a pre-release.
> If a Requirement specifier includes a pre-release or development version
> (e.g. >=0.0.dev0) then pip will allow pre-release and development versions
> for that requirement. This does not include the != flag.
> The pip install command also supports a --pre flag that will enable
> installing pre-releases and development releases.


Source:
https://pip.pypa.io/en/stable/reference/pip_install/#pre-release-versions

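Daniel's quoted pip documentation can be condensed into concrete command lines. A hedged sketch — the `apache-airflow==1.9.0rc1` pin is hypothetical, since the RC is not actually on PyPI:

```shell
# Per pip's pre-release rules quoted above (illustrative only; the RC is not on PyPI):
#   pip install apache-airflow              # default: skips pre-releases such as 1.9.0rc1
#   pip install --pre apache-airflow        # --pre opts in to pre-releases
#   pip install "apache-airflow==1.9.0rc1"  # an explicit pre-release pin is also honoured
PIN="apache-airflow==1.9.0rc1"  # hypothetical pin for demonstration
echo "pip install --pre ${PIN}"
```

So publishing RCs to PyPI would not, by itself, upgrade anyone who runs a plain `pip install apache-airflow`.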
On Thu, Nov 9, 2017 at 11:54 AM, Bolke de Bruin <bd...@gmail.com> wrote:

> I think we should put this up for discussion. PyPi is not an official
> apache channel, so in theory we could put anything on PyPI. I also think
> (didn’t confirm) pip doesn’t upgrade to RC/beta etc.
>
> Any thoughts?
>
> Bolke.
>
> > On 9 Nov 2017, at 15:53, Arthur Wiedmer <ar...@apache.org> wrote:
> >
> > Hi Alek,
> >
> > Technically, we cannot release a distribution on PyPI until we have voted
> > on a release. And here usually a release artifact. It is a little
> > convoluted in the case of Python, but we are getting the hang of it.
> >
> > That said, installing from a git reference is a possibility too if you
> want
> > the fastest path to install.
> >
> > Best,
> > Arthur
> >
> > On Nov 9, 2017 06:34, "Alek Storm" <al...@gmail.com> wrote:
> >
> > I think this has been mentioned before, but it would be much easier for
> us
> > (my team) to test RCs if they were published to PyPI. Or is that against
> > Apache guidelines?
> >
> > Alek
> >
> > On Thu, Nov 9, 2017 at 8:29 AM, Michael Crawford <
> > michael.crawford@modernizingmedicine.com> wrote:
> >
> >> Thanks.  Yes I understand it isn’t released yet.
> >>
> >>
> >>> On Nov 9, 2017, at 9:09 AM, Driesprong, Fokko <fo...@driesprong.frl>
> >> wrote:
> >>>
> >>> Hi Michael,
> >>>
> >>> You have to install it from the tar.gz:
> >>>
> >>> wget
> >>> https://dist.apache.org/repos/dist/dev/incubator/airflow/1.9.0rc1/apache-airflow-1.9.0rc1+incubating-bin.tar.gz
> >>> pip install /tmp/apache-airflow.tar.gz
> >>>
> >>> The steps of updating, are in the UPDATING.md:
> >>> https://github.com/apache/incubator-airflow/blob/master/UPDATING.md
> >>>
> >>> Please note that 1.9 is not released yet, but you are welcome to try
> out
> >>> RC1.
> >>>
> >>> Cheers, Fokko
> >>
> >>
>
>

Re: [VOTE] Airflow 1.9.0rc1

Posted by Ash Berlin-Taylor <as...@firemirror.com>.
I'd be in favour of this for similar reasons to Alek.

I think the "mentioned before" is in reference to my post from October 20 http://mail-archives.apache.org/mod_mbox/incubator-airflow-dev/201710.mbox/%3CD008C556-C67C-42EB-88C0-CFE440C93656%40firemirror.com%3E

To confirm: pip doesn't upgrade to RC/beta unless the user asks for it -- more detail is available in the linked post.

-ash

> On 9 Nov 2017, at 19:54, Bolke de Bruin <bd...@gmail.com> wrote:
> 
> I think we should put this up for discussion. PyPi is not an official apache channel, so in theory we could put anything on PyPI. I also think (didn’t confirm) pip doesn’t upgrade to RC/beta etc.
> 
> Any thoughts?
> 
> Bolke.
> 
>> On 9 Nov 2017, at 15:53, Arthur Wiedmer <ar...@apache.org> wrote:
>> 
>> Hi Alek,
>> 
>> Technically, we cannot release a distribution on PyPI until we have voted
>> on a release. And here usually a release artifact. It is a little
>> convoluted in the case of Python, but we are getting the hang of it.
>> 
>> That said, installing from a git reference is a possibility too if you want
>> the fastest path to install.
>> 
>> Best,
>> Arthur
>> 
>> On Nov 9, 2017 06:34, "Alek Storm" <al...@gmail.com> wrote:
>> 
>> I think this has been mentioned before, but it would be much easier for us
>> (my team) to test RCs if they were published to PyPI. Or is that against
>> Apache guidelines?
>> 
>> Alek
>> 
>> On Thu, Nov 9, 2017 at 8:29 AM, Michael Crawford <
>> michael.crawford@modernizingmedicine.com> wrote:
>> 
>>> Thanks.  Yes I understand it isn’t released yet.
>>> 
>>> 
>>>> On Nov 9, 2017, at 9:09 AM, Driesprong, Fokko <fo...@driesprong.frl>
>>> wrote:
>>>> 
>>>> Hi Michael,
>>>> 
>>>> You have to install it from the tar.gz:
>>>> 
>>>> wget
>>>> https://dist.apache.org/repos/dist/dev/incubator/airflow/1.9.0rc1/apache-airflow-1.9.0rc1+incubating-bin.tar.gz
>>>> pip install /tmp/apache-airflow.tar.gz
>>>> 
>>>> The steps of updating, are in the UPDATING.md:
>>>> https://github.com/apache/incubator-airflow/blob/master/UPDATING.md
>>>> 
>>>> Please note that 1.9 is not released yet, but you are welcome to try out
>>>> RC1.
>>>> 
>>>> Cheers, Fokko
>>> 
>>> 
> 


Re: [VOTE] Airflow 1.9.0rc1

Posted by Bolke de Bruin <bd...@gmail.com>.
I think we should put this up for discussion. PyPI is not an official Apache channel, so in theory we could put anything on PyPI. I also think (didn’t confirm) pip doesn’t upgrade to RC/beta etc.

Any thoughts?

Bolke.

> On 9 Nov 2017, at 15:53, Arthur Wiedmer <ar...@apache.org> wrote:
> 
> Hi Alek,
> 
> Technically, we cannot release a distribution on PyPI until we have voted
> on a release. And here usually a release artifact. It is a little
> convoluted in the case of Python, but we are getting the hang of it.
> 
> That said, installing from a git reference is a possibility too if you want
> the fastest path to install.
> 
> Best,
> Arthur
> 
> On Nov 9, 2017 06:34, "Alek Storm" <al...@gmail.com> wrote:
> 
> I think this has been mentioned before, but it would be much easier for us
> (my team) to test RCs if they were published to PyPI. Or is that against
> Apache guidelines?
> 
> Alek
> 
> On Thu, Nov 9, 2017 at 8:29 AM, Michael Crawford <
> michael.crawford@modernizingmedicine.com> wrote:
> 
>> Thanks.  Yes I understand it isn’t released yet.
>> 
>> 
>>> On Nov 9, 2017, at 9:09 AM, Driesprong, Fokko <fo...@driesprong.frl>
>> wrote:
>>> 
>>> Hi Michael,
>>> 
>>> You have to install it from the tar.gz:
>>> 
>>> wget
>>> https://dist.apache.org/repos/dist/dev/incubator/airflow/1.9.0rc1/apache-airflow-1.9.0rc1+incubating-bin.tar.gz
>>> pip install /tmp/apache-airflow.tar.gz
>>> 
>>> The steps of updating, are in the UPDATING.md:
>>> https://github.com/apache/incubator-airflow/blob/master/UPDATING.md
>>> 
>>> Please note that 1.9 is not released yet, but you are welcome to try out
>>> RC1.
>>> 
>>> Cheers, Fokko
>> 
>> 


Re: [VOTE] Airflow 1.9.0rc1

Posted by Arthur Wiedmer <ar...@apache.org>.
Hi Alek,

Technically, we cannot release a distribution on PyPI until we have voted
on a release. And here usually a release artifact. It is a little
convoluted in the case of Python, but we are getting the hang of it.

That said, installing from a git reference is a possibility too if you want
the fastest path to install.

Best,
Arthur

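Arthur's git-reference route can be sketched as a pip VCS install. The tag name below is an assumption — verify it against the repository's tags before using it:

```shell
# Sketch: install straight from a git reference instead of PyPI.
TAG="1.9.0rc1"  # assumed tag name; check the incubator-airflow repo first
SPEC="git+https://github.com/apache/incubator-airflow.git@${TAG}#egg=apache-airflow"
# pip install "$SPEC"   # commented out: would actually fetch and build from the repo
echo "$SPEC"
```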
On Nov 9, 2017 06:34, "Alek Storm" <al...@gmail.com> wrote:

I think this has been mentioned before, but it would be much easier for us
(my team) to test RCs if they were published to PyPI. Or is that against
Apache guidelines?

Alek

On Thu, Nov 9, 2017 at 8:29 AM, Michael Crawford <
michael.crawford@modernizingmedicine.com> wrote:

> Thanks.  Yes I understand it isn’t released yet.
>
>
> > On Nov 9, 2017, at 9:09 AM, Driesprong, Fokko <fo...@driesprong.frl>
> wrote:
> >
> > Hi Michael,
> >
> > You have to install it from the tar.gz:
> >
> > wget
> > https://dist.apache.org/repos/dist/dev/incubator/airflow/1.9.0rc1/apache-airflow-1.9.0rc1+incubating-bin.tar.gz
> > pip install /tmp/apache-airflow.tar.gz
> >
> > The steps of updating, are in the UPDATING.md:
> > https://github.com/apache/incubator-airflow/blob/master/UPDATING.md
> >
> > Please note that 1.9 is not released yet, but you are welcome to try out
> > RC1.
> >
> > Cheers, Fokko
>
>

Re: [VOTE] Airflow 1.9.0rc1

Posted by Alek Storm <al...@gmail.com>.
I think this has been mentioned before, but it would be much easier for us
(my team) to test RCs if they were published to PyPI. Or is that against
Apache guidelines?

Alek

On Thu, Nov 9, 2017 at 8:29 AM, Michael Crawford <
michael.crawford@modernizingmedicine.com> wrote:

> Thanks.  Yes I understand it isn’t released yet.
>
>
> > On Nov 9, 2017, at 9:09 AM, Driesprong, Fokko <fo...@driesprong.frl>
> wrote:
> >
> > Hi Michael,
> >
> > You have to install it from the tar.gz:
> >
> > wget
> > https://dist.apache.org/repos/dist/dev/incubator/airflow/1.
> 9.0rc1/apache-airflow-1.9.0rc1+incubating-bin.tar.gz
> > pip install /tmp/apache-airflow.tar.gz
> >
> > The steps of updating, are in the UPDATING.md:
> > https://github.com/apache/incubator-airflow/blob/master/UPDATING.md
> >
> > Please note that 1.9 is not released yet, but you are welcome to try out
> > RC1.
> >
> > Cheers, Fokko
>
>

Re: [VOTE] Airflow 1.9.0rc1

Posted by Michael Crawford <mi...@modernizingmedicine.com>.
Thanks.  Yes I understand it isn’t released yet.  


> On Nov 9, 2017, at 9:09 AM, Driesprong, Fokko <fo...@driesprong.frl> wrote:
> 
> Hi Michael,
> 
> You have to install it from the tar.gz:
> 
> wget
> https://dist.apache.org/repos/dist/dev/incubator/airflow/1.9.0rc1/apache-airflow-1.9.0rc1+incubating-bin.tar.gz
> pip install /tmp/apache-airflow.tar.gz
> 
> The steps of updating, are in the UPDATING.md:
> https://github.com/apache/incubator-airflow/blob/master/UPDATING.md
> 
> Please note that 1.9 is not released yet, but you are welcome to try out
> RC1.
> 
> Cheers, Fokko


Re: [VOTE] Airflow 1.9.0rc1

Posted by "Driesprong, Fokko" <fo...@driesprong.frl>.
Hi Michael,

You have to install it from the tar.gz:

wget -O /tmp/apache-airflow.tar.gz \
  https://dist.apache.org/repos/dist/dev/incubator/airflow/1.9.0rc1/apache-airflow-1.9.0rc1+incubating-bin.tar.gz
pip install /tmp/apache-airflow.tar.gz

The steps for updating are in UPDATING.md:
https://github.com/apache/incubator-airflow/blob/master/UPDATING.md

Please note that 1.9 is not released yet, but you are welcome to try out
RC1.

Cheers, Fokko

Re: [VOTE] Airflow 1.9.0rc1

Posted by Michael Crawford <mi...@modernizingmedicine.com>.
Curious how you guys are installing 1.9.0rc1 in your environments.   

Are you upgrading 1.8 environments or setting up a new test environment?

Do we have any official documentation yet as part of the 1.9 release that documents how to upgrade from 1.8 to 1.9?

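One answer to Michael's question, sketched under stated assumptions: the paths are made up, `airflow upgradedb` is the 1.8-era migration command, and UPDATING.md remains the authority for the actual upgrade steps.

```shell
# Hedged sketch: try the RC in a throwaway virtualenv, never against the live 1.8 install.
ENV_DIR="/tmp/airflow-1.9-rc1-test"   # assumed scratch location
TARBALL="/tmp/apache-airflow.tar.gz"  # the RC sdist downloaded from dist.apache.org
# virtualenv "$ENV_DIR" && . "$ENV_DIR/bin/activate"
# pip install "$TARBALL"
# airflow upgradedb   # run schema migrations against a COPY of the metadata DB
echo "test env: $ENV_DIR"
```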
> On Nov 9, 2017, at 7:44 AM, Ash Berlin-Taylor <as...@firemirror.com> wrote:
> 
> That final URL should have been https://issues.apache.org/jira/browse/AIRFLOW-1797 <https://issues.apache.org/jira/browse/AIRFLOW-1797>, which now has a PR for it https://github.com/apache/incubator-airflow/pull/2771 <https://github.com/apache/incubator-airflow/pull/2771>
> 
> There's going to be some more fixes coming around S3 logs.
> 
> One thing I have noticed is the switch to per-try logs (which is awesome! So much easier to view) means I can't read old logs anymore because it isn't split by try, it's just all in one file.
> 
> Is it worth making a change to the log task handlers to try loading under the old pattern if none are found with the new style? (The other option is that I just run a migration script to move the old logs into the new place. That sort of only helps me though.)
> 
> -ash
> 
>> On 9 Nov 2017, at 11:04, Ash Berlin-Taylor <as...@firemirror.com> wrote:
>> 
>> And one more - on Python3 we can't use S3Hook.load_string due to bytes vs string issue: https://issues.apache.org/jira/browse/AIRFLOW-1796 <https://issues.apache.org/jira/browse/AIRFLOW-1796>
>> 
>> I'll try and work on fixes for some/all of these today, including adding and expanding on the tests for S3Hook which it looks like was kind of lacking.
>> 
>> -ash
>> 
>>> On 9 Nov 2017, at 10:54, Ash Berlin-Taylor <as...@firemirror.com> wrote:
>>> 
>>> Thanks for picking this up. Your fix should stop the 500 error, but there's another problem (which is ultimately about user misconfiguration) https://issues.apache.org/jira/browse/AIRFLOW-1796 - the fix for that is to update a doc somewhere, and probably validate this setting is correct at start time.
>>> 
>>> 
>>> I've found another issue related to arg names of S3Hook. In 1.8.2 it was `s3_conn_id` but the move to boto3/basing off AWSHook now expects `aws_conn_id`, and various places in Airflow code base (and a few places in our dags/operators code base) still pass it as s3_conn_id. I've created https://issues.apache.org/jira/browse/AIRFLOW-1795 for that issue.
>>> 
>>> -ash
>>> 
>>> 
>>>> On 8 Nov 2017, at 18:54, Daniel Huang <dx...@gmail.com> wrote:
>>>> 
>>>> Still testing this out.
>>>> 
>>>> Put up a small fix for Ash's second exception
>>>> https://github.com/apache/incubator-airflow/pull/2766
>>>> 
>>>> On Wed, Nov 8, 2017 at 10:48 AM, Bolke de Bruin <bd...@gmail.com> wrote:
>>>> 
>>>>> Hi Chris,
>>>>> 
>>>>> Actively testing here: we found an issue in the SSHOperator introduced in
>>>>> 1.9.0 (fix already merged for RC2, but blocking I as it stops us from
>>>>> running SSH properly), some minor fixes by Airbnb should also be in RC2.
>>>>> There is some logging “weirdness”, that might warrant a small patch here in
>>>>> there and could be squeezed into RC2, but I don’t consider them blocking.
>>>>> 
>>>>> So almost there, but we need an RC2 imho.
>>>>> 
>>>>> -1, binding.
>>>>> 
>>>>> Bolke
>>>>> 
>>>>>> On 8 Nov 2017, at 19:00, Ash Berlin-Taylor <ash_airflowlist@firemirror.
>>>>> com> wrote:
>>>>>> 
>>>>>> -1 (for now. Non binding. Is that how this process works?)
>>>>>> 
>>>>>> We've built a test env for this RC and are testing, but have run into an
>>>>> issue reading task logs. (See below)
>>>>>> 
>>>>>> We haven't gotten very far with this yet, we will dig more tomorrow
>>>>> (it's the end of the UK work day now). I suspect this might be how we've
>>>>> misconfigured our logging. We will see tomorrow.
>>>>>> 
>>>>>> -ash
>>>>>> 
>>>>>> 
>>>>>> 
>>>>>> 
>>>>>> File "/usr/local/lib/python3.5/dist-packages/airflow/www/views.py",
>>>>> line 712, in log
>>>>>> logs = handler.read(ti)
>>>>>> AttributeError: 'NoneType' object has no attribute 'read'
>>>>>> 
>>>>>> During handling of the above exception, another exception occurred:
>>>>>> 
>>>>>> Traceback (most recent call last):
>>>>>> File "/usr/local/lib/python3.5/dist-packages/flask/app.py", line 1988,
>>>>> in wsgi_app
>>>>>> response = self.full_dispatch_request()
>>>>>> File "/usr/local/lib/python3.5/dist-packages/flask/app.py", line 1641,
>>>>> in full_dispatch_request
>>>>>> rv = self.handle_user_exception(e)
>>>>>> File "/usr/local/lib/python3.5/dist-packages/flask/app.py", line 1544,
>>>>> in handle_user_exception
>>>>>> reraise(exc_type, exc_value, tb)
>>>>>> File "/usr/local/lib/python3.5/dist-packages/flask/_compat.py", line
>>>>> 33, in reraise
>>>>>> raise value
>>>>>> File "/usr/local/lib/python3.5/dist-packages/flask/app.py", line 1639,
>>>>> in full_dispatch_request
>>>>>> rv = self.dispatch_request()
>>>>>> File "/usr/local/lib/python3.5/dist-packages/flask/app.py", line 1625,
>>>>> in dispatch_request
>>>>>> return self.view_functions[rule.endpoint](**req.view_args)
>>>>>> File "/usr/local/lib/python3.5/dist-packages/flask_admin/base.py",
>>>>> line 69, in inner
>>>>>> return self._run_view(f, *args, **kwargs)
>>>>>> File "/usr/local/lib/python3.5/dist-packages/flask_admin/base.py",
>>>>> line 368, in _run_view
>>>>>> return fn(self, *args, **kwargs)
>>>>>> File "/usr/local/lib/python3.5/dist-packages/flask_login.py", line
>>>>> 758, in decorated_view
>>>>>> return func(*args, **kwargs)
>>>>>> File "/usr/local/lib/python3.5/dist-packages/airflow/www/utils.py",
>>>>> line 262, in wrapper
>>>>>> return f(*args, **kwargs)
>>>>>> File "/usr/local/lib/python3.5/dist-packages/airflow/www/views.py",
>>>>> line 715, in log
>>>>>> .format(task_log_reader, e.message)]
>>>>>> AttributeError: 'AttributeError' object has no attribute 'message'
>>>>>> 
>>>>>> 
>>>>>>> On 8 Nov 2017, at 17:46, Chris Riccomini <cr...@apache.org> wrote:
>>>>>>> 
>>>>>>> Anyone? :/
>>>>>>> 
>>>>>>> On Mon, Nov 6, 2017 at 1:22 PM, Chris Riccomini <cr...@apache.org>
>>>>>>> wrote:
>>>>>>> 
>>>>>>>> Hey all,
>>>>>>>> 
>>>>>>>> I have cut Airflow 1.9.0 RC1. This email is calling a vote on the
>>>>> release,
>>>>>>>> which will last for 72 hours. Consider this my (binding) +1.
>>>>>>>> 
>>>>>>>> Airflow 1.9.0 RC1 is available at:
>>>>>>>> 
>>>>>>>> https://dist.apache.org/repos/dist/dev/incubator/airflow/1.9.0rc1/
>>>>>>>> 
>>>>>>>> apache-airflow-1.9.0rc1+incubating-source.tar.gz is a source release
>>>>> that
>>>>>>>> comes with INSTALL instructions.
>>>>>>>> apache-airflow-1.9.0rc1+incubating-bin.tar.gz is the binary Python
>>>>>>>> "sdist" release.
>>>>>>>> 
>>>>>>>> Public keys are available at:
>>>>>>>> 
>>>>>>>> https://dist.apache.org/repos/dist/release/incubator/airflow/
>>>>>>>> 
>>>>>>>> The release contains the following JIRAs:
>>>>>>>> 
>>>>>>>> ISSUE ID    |DESCRIPTION                                       |PR
>>>>>>>> |COMMIT
>>>>>>>> AIRFLOW-1779|Add keepalive packets to ssh hook                 |#2749
>>>>>>>> |d2f9d1
>>>>>>>> AIRFLOW-1776|stdout/stderr logging not captured                |#2745
>>>>>>>> |590d9f
>>>>>>>> AIRFLOW-1771|Change heartbeat text from boom to heartbeat      |-
>>>>> |-
>>>>>>>> 
>>>>>>>> AIRFLOW-1767|Airflow Scheduler no longer schedules DAGs        |-
>>>>> |-
>>>>>>>> 
>>>>>>>> AIRFLOW-1765|Default API auth backed should deny all.          |#2737
>>>>>>>> |6ecdac
>>>>>>>> AIRFLOW-1764|Web Interface should not use experimental api     |#2738
>>>>>>>> |6bed1d
>>>>>>>> AIRFLOW-1757|Contrib.SparkSubmitOperator should allow --package|#2725
>>>>>>>> |4e06ee
>>>>>>>> AIRFLOW-1745|BashOperator ignores SIGPIPE in subprocess        |#2714
>>>>>>>> |e021c9
>>>>>>>> AIRFLOW-1744|task.retries can be False                         |#2713
>>>>>>>> |6144c6
>>>>>>>> AIRFLOW-1743|Default config template should not contain ldap fi|#2712
>>>>>>>> |270684
>>>>>>>> AIRFLOW-1741|Task Duration shows two charts on first page load.|#2711
>>>>>>>> |974b49
>>>>>>>> AIRFLOW-1734|Sqoop Operator contains logic errors & needs optio|#2703
>>>>>>>> |f6810c
>>>>>>>> AIRFLOW-1731|Import custom config on PYTHONPATH                |#2721
>>>>>>>> |f07eb3
>>>>>>>> AIRFLOW-1726|Copy Expert command for Postgres Hook             |#2698
>>>>>>>> |8a4ad3
>>>>>>>> AIRFLOW-1719|Fix small typo - your vs you                      |-
>>>>> |-
>>>>>>>> 
>>>>>>>> AIRFLOW-1712|Log SSHOperator output                            |-
>>>>> |-
>>>>>>>> 
>>>>>>>> AIRFLOW-1711|Ldap Attributes not always a "list" part 2        |#2731
>>>>>>>> |40a936
>>>>>>>> AIRFLOW-1706|Scheduler is failed on startup with MS SQL Server |#2733
>>>>>>>> |9e209b
>>>>>>>> AIRFLOW-1698|Remove confusing SCHEDULER_RUNS env var from syste|#2677
>>>>>>>> |00dd06
>>>>>>>> AIRFLOW-1695|Redshift Hook using boto3 & AWS Hook              |#2717
>>>>>>>> |bfddae
>>>>>>>> AIRFLOW-1694|Hive Hooks: Python 3 does not have an `itertools.i|#2674
>>>>>>>> |c6e5ae
>>>>>>>> AIRFLOW-1692|Master cannot be checked out on windows           |#2673
>>>>>>>> |31805e
>>>>>>>> AIRFLOW-1691|Add better documentation for Google cloud storage |#2671
>>>>>>>> |ace2b1
>>>>>>>> AIRFLOW-1690|Error messages regarding gcs log commits are spars|#2670
>>>>>>>> |5fb5cd
>>>>>>>> AIRFLOW-1682|S3 task handler never writes to S3                |#2664
>>>>>>>> |0080f0
>>>>>>>> AIRFLOW-1678|Fix docstring errors for `set_upstream` and `set_d|-
>>>>> |-
>>>>>>>> 
>>>>>>>> AIRFLOW-1677|Fix typo in example_qubole_operator               |-
>>>>> |-
>>>>>>>> 
>>>>>>>> AIRFLOW-1676|GCS task handler never writes to GCS              |#2659
>>>>>>>> |781fa4
>>>>>>>> AIRFLOW-1675|Fix API docstrings to be properly rendered        |#2667
>>>>>>>> |f12381
>>>>>>>> AIRFLOW-1671|Missing @apply_defaults annotation for gcs downloa|#2655
>>>>>>>> |97666b
>>>>>>>> AIRFLOW-1669|Fix Docker import in Master                       |#na
>>>>>>>> |f7f2a8
>>>>>>>> AIRFLOW-1668|Redhsift requires a keep alive of < 300s          |#2650
>>>>>>>> |f2bb77
>>>>>>>> AIRFLOW-1664|Make MySqlToGoogleCloudStorageOperator support bin|#2649
>>>>>>>> |95813d
>>>>>>>> AIRFLOW-1660|Change webpage width to full-width                |#2646
>>>>>>>> |8ee3d9
>>>>>>>> AIRFLOW-1659|Fix invalid attribute bug in FileTaskHandler      |#2645
>>>>>>>> |bee823
>>>>>>>> AIRFLOW-1658|Kill (possibly) still running Druid indexing job a|#2644
>>>>>>>> |cbf7ad
>>>>>>>> AIRFLOW-1657|Handle failure of Qubole Operator for s3distcp had|-
>>>>> |-
>>>>>>>> 
>>>>>>>> AIRFLOW-1654|Show tooltips for link icons in DAGs view         |#2642
>>>>>>>> |ada7b2
>>>>>>>> AIRFLOW-1647|Fix Spark-sql hook                                |#2637
>>>>>>>> |b1e5c6
>>>>>>>> AIRFLOW-1641|Task gets stuck in queued state                   |#2715
>>>>>>>> |735497
>>>>>>>> AIRFLOW-1640|Add Qubole default connection in connection table |-
>>>>> |-
>>>>>>>> 
>>>>>>>> AIRFLOW-1639|ValueError does not have .message attribute       |#2629
>>>>>>>> |87df67
>>>>>>>> AIRFLOW-1637|readme not tracking master branch for travis      |-
>>>>> |-
>>>>>>>> 
>>>>>>>> AIRFLOW-1636|aws and emr connection types get cleared          |#2626
>>>>>>>> |540e04
>>>>>>>> AIRFLOW-1635|Allow creating Google Cloud Platform connection wi|#2640
>>>>>>>> |6dec7a
>>>>>>>> AIRFLOW-1629|make extra a textarea in edit connections form    |#2623
>>>>>>>> |f5d46f
>>>>>>>> AIRFLOW-1628|Docstring of sqlsensor is incorrect               |#2621
>>>>>>>> |9ba73d
>>>>>>>> AIRFLOW-1627|SubDagOperator initialization should only query po|#2620
>>>>>>>> |516ace
>>>>>>>> AIRFLOW-1621|Add tests for logic added on server side dag list |#2614
>>>>>>>> |8de9fd
>>>>>>>> AIRFLOW-1614|Improve performance of DAG parsing when there are |#2610
>>>>>>>> |a95adb
>>>>>>>> AIRFLOW-1611|Customize logging in Airflow                      |#2631
>>>>>>>> |8b4a50
>>>>>>>> AIRFLOW-1609|Ignore all venvs in gitignore                     |#2608
>>>>>>>> |f1f9b4
>>>>>>>> AIRFLOW-1608|GCP Dataflow hook missing pending job state       |#2607
>>>>>>>> |653562
>>>>>>>> AIRFLOW-1606|DAG.sync_to_db is static, but takes a DAG as first|#2606
>>>>>>>> |6ac296
>>>>>>>> AIRFLOW-1605|Fix log source of local loggers                   |-
>>>>> |-
>>>>>>>> 
>>>>>>>> AIRFLOW-1604|Rename the logger to log                          |#2604
>>>>>>>> |af4050
>>>>>>>> AIRFLOW-1602|Use LoggingMixin for the DAG class                |#2602
>>>>>>>> |956699
>>>>>>>> AIRFLOW-1601|Add configurable time between SIGTERM and SIGKILL |#2601
>>>>>>>> |48a95e
>>>>>>>> AIRFLOW-1600|Uncaught exceptions in get_fernet if cryptography |#2600
>>>>>>>> |ad963e
>>>>>>>> AIRFLOW-1597|Add GameWisp as Airflow user                      |#2599
>>>>>>>> |26b747
>>>>>>>> AIRFLOW-1594|Installing via pip copies test files into python l|#2597
>>>>>>>> |a6b23a
>>>>>>>> AIRFLOW-1593|Expose load_string in WasbHook                    |#2596
>>>>>>>> |7ece95
>>>>>>>> AIRFLOW-1591|Exception: 'TaskInstance' object has no attribute |#2578
>>>>>>>> |f4653e
>>>>>>>> AIRFLOW-1590|Small fix for dates util                          |#2652
>>>>>>>> |31946e
>>>>>>>> AIRFLOW-1587|fix `ImportError: cannot import name 'CeleryExecut|#2590
>>>>>>>> |34c73b
>>>>>>>> AIRFLOW-1586|MySQL to GCS to BigQuery fails for tables with dat|#2589
>>>>>>>> |e83012
>>>>>>>> AIRFLOW-1584|Remove the insecure /headers endpoints            |#2588
>>>>>>>> |17ac07
>>>>>>>> AIRFLOW-1582|Improve logging structure of Airflow              |#2592
>>>>>>>> |a7a518
>>>>>>>> AIRFLOW-1580|Error in string formatter when throwing an excepti|#2583
>>>>>>>> |ea9ab9
>>>>>>>> AIRFLOW-1579|Allow jagged rows in BQ Hook.                     |#2582
>>>>>>>> |5b978b
>>>>>>>> AIRFLOW-1577|Add token support to DatabricksHook               |#2579
>>>>>>>> |c2c515
>>>>>>>> AIRFLOW-1573|Remove `thrift < 0.10.0` requirement              |#2574
>>>>>>>> |aa95f2
>>>>>>>> AIRFLOW-1571|Add AWS Lambda Hook for invoking Lambda Function  |#2718
>>>>>>>> |017f18
>>>>>>>> AIRFLOW-1568|Add datastore import/export operator              |#2568
>>>>>>>> |86063b
>>>>>>>> AIRFLOW-1567|Clean up ML Engine operators                      |#2567
>>>>>>>> |af91e2
>>>>>>>> AIRFLOW-1564|Default logging filename contains a colon         |#2565
>>>>>>>> |4c674c
>>>>>>>> AIRFLOW-1560|Add AWS DynamoDB hook for inserting batch items   |#2587
>>>>>>>> |71400b
>>>>>>>> AIRFLOW-1556|BigQueryBaseCursor should support SQL parameters  |#2557
>>>>>>>> |9df0ac
>>>>>>>> AIRFLOW-1546| add Zymergen to org list in README               |#2512
>>>>>>>> |7cc346
>>>>>>>> AIRFLOW-1535|Add support for Dataproc serviceAccountScopes in D|#2546
>>>>>>>> |b1f902
>>>>>>>> AIRFLOW-1529|Support quoted newlines in Google BigQuery load jo|#2545
>>>>>>>> |4a4b02
>>>>>>>> AIRFLOW-1527|Refactor celery config to make use of template    |#2542
>>>>>>>> |f4437b
>>>>>>>> AIRFLOW-1522|Increase size of val column for variable table in |#2535
>>>>>>>> |8a2d24
>>>>>>>> AIRFLOW-1521|Template fields definition for bigquery_table_dele|#2534
>>>>>>>> |f1a7c0
>>>>>>>> AIRFLOW-1520|S3Hook uses boto2                                 |#2532
>>>>>>>> |386583
>>>>>>>> AIRFLOW-1519|Main DAG list page does not scale using client sid|#2531
>>>>>>>> |d7d7ce
>>>>>>>> AIRFLOW-1512|Add operator for running Python functions in a vir|#2446
>>>>>>>> |14e6d7
>>>>>>>> AIRFLOW-1507|Make src, dst and bucket parameters as templated i|#2516
>>>>>>>> |d295cf
>>>>>>>> AIRFLOW-1505|Document when Jinja substitution occurs           |#2523
>>>>>>>> |984a87
>>>>>>>> AIRFLOW-1504|Log Cluster Name on Dataproc Operator When Execute|#2517
>>>>>>>> |1cd6c4
>>>>>>>> AIRFLOW-1499s|Eliminate duplicate and unneeded code             |-
>>>>> |-
>>>>>>>> 
>>>>>>>> AIRFLOW-1497|Hidden fields in connection form aren't reset when|#2507
>>>>>>>> |d8da8b
>>>>>>>> AIRFLOW-1493|Fix race condition with airflow run               |#2505
>>>>>>>> |b2e175
>>>>>>>> AIRFLOW-1492|Add metric for task success/failure               |#2504
>>>>>>>> |fa84d4
>>>>>>>> AIRFLOW-1489|Docs: Typo in BigQueryCheckOperator               |#2501
>>>>>>>> |111ce5
>>>>>>>> AIRFLOW-1483|Page size on model views is to large to render qui|#2497
>>>>>>>> |04bfba
>>>>>>>> AIRFLOW-1478|Chart -> Owner column should be sortable          |#2493
>>>>>>>> |651e60
>>>>>>>> AIRFLOW-1476|Add INSTALL file for source releases              |#2492
>>>>>>>> |da76ac
>>>>>>>> AIRFLOW-1474|Add dag_id regex for 'airflow clear' CLI command  |#2486
>>>>>>>> |18f849
>>>>>>>> AIRFLOW-1470s|BashSensor Implementation                         |-
>>>>> |-
>>>>>>>> 
>>>>>>>> AIRFLOW-1459|integration rst doc is broken in github view      |#2481
>>>>>>>> |322ec9
>>>>>>>> AIRFLOW-1438|Scheduler batch queries should have a limit       |#2462
>>>>>>>> |3547cb
>>>>>>>> AIRFLOW-1437|BigQueryTableDeleteOperator should define deletion|#2459
>>>>>>>> |b87903
>>>>>>>> AIRFLOW-1432|NVD3 Charts do not have labeled axes and units cha|#2710
>>>>>>>> |70ffa4
>>>>>>>> AIRFLOW-1402|Cleanup SafeConfigParser DeprecationWarning       |#2435
>>>>>>>> |38c86b
>>>>>>>> AIRFLOW-1401|Standardize GCP project, region, and zone argument|#2439
>>>>>>>> |b6d363
>>>>>>>> AIRFLOW-1397|Airflow 1.8.1 - No data displays in Last Run Colum|-
>>>>> |-
>>>>>>>> 
>>>>>>>> AIRFLOW-1394|Add quote_character parameter to GoogleCloudStorag|#2428
>>>>>>>> |9fd0be
>>>>>>>> AIRFLOW-1389|BigQueryOperator should support `createDisposition|#2470
>>>>>>>> |6e2640
>>>>>>>> AIRFLOW-1384|Add ARGO/CaDC                                     |#2434
>>>>>>>> |715947
>>>>>>>> AIRFLOW-1368|Automatically remove the container when it exits  |#2653
>>>>>>>> |d42d23
>>>>>>>> AIRFLOW-1359|Provide GoogleCloudML operator for model evaluatio|#2407
>>>>>>>> |194d1d
>>>>>>>> AIRFLOW-1356|add `--celery_hostname` to `airflow worker`       |#2405
>>>>>>>> |b9d7d1
>>>>>>>> AIRFLOW-1352|Revert bad logging Handler                        |-
>>>>> |-
>>>>>>>> 
>>>>>>>> AIRFLOW-1350|Add "query_uri" parameter for Google DataProc oper|#2402
>>>>>>>> |d32c72
>>>>>>>> AIRFLOW-1348|Paginated UI has broken toggles after first page  |-
>>>>> |-
>>>>>>>> 
>>>>>>>> AIRFLOW-1345|Don't commit on each loop                         |#2397
>>>>>>>> |0dd002
>>>>>>>> AIRFLOW-1344|Builds failing on Python 3.5 with AttributeError  |#2394
>>>>>>>> |2a5883
>>>>>>>> AIRFLOW-1343|Add airflow default label to the dataproc operator|#2396
>>>>>>>> |e4b240
>>>>>>>> AIRFLOW-1338|gcp_dataflow_hook is incompatible with the recent |#2388
>>>>>>>> |cf2605
>>>>>>>> AIRFLOW-1337|Customize log format via config file              |#2392
>>>>>>>> |4841e3
>>>>>>>> AIRFLOW-1335|Use buffered logger                               |#2386
>>>>>>>> |0d23d3
>>>>>>>> AIRFLOW-1333|Enable copy function for Google Cloud Storage Hook|#2385
>>>>>>>> |e2c383
>>>>>>>> AIRFLOW-1331|Contrib.SparkSubmitOperator should allow --package|#2622
>>>>>>>> |fbca8f
>>>>>>>> AIRFLOW-1330|Connection.parse_from_uri doesn't work for google_|#2525
>>>>>>>> |6e5e9d
>>>>>>>> AIRFLOW-1324|Make the Druid operator/hook more general         |#2378
>>>>>>>> |de99aa
>>>>>>>> AIRFLOW-1323|Operators related to Dataproc should keep some par|#2636
>>>>>>>> |ed248d
>>>>>>>> AIRFLOW-1315|Add Qubole File and Partition Sensors             |-
>>>>> |-
>>>>>>>> 
>>>>>>>> AIRFLOW-1309|Add optional hive_tblproperties in HiveToDruidTran|-
>>>>> |-
>>>>>>>> 
>>>>>>>> AIRFLOW-1301|Add New Relic to Airflow user list                |#2359
>>>>>>>> |355fc9
>>>>>>>> AIRFLOW-1299|Google Dataproc cluster creation operator should s|#2358
>>>>>>>> |c2b80e
>>>>>>>> AIRFLOW-1289|Don't restrict scheduler threads to CPU cores     |#2353
>>>>>>>> |8e23d2
>>>>>>>> AIRFLOW-1286|BaseTaskRunner - Exception TypeError: a bytes-like|#2363
>>>>>>>> |d8891d
>>>>>>>> AIRFLOW-1277|Forbid creation of a known event with empty fields|#na
>>>>>>>> |65184a
>>>>>>>> AIRFLOW-1276|Forbid event creation with end_data earlier than s|#na
>>>>>>>> |d5d02f
>>>>>>>> AIRFLOW-1275|Fix `airflow pool` command exception              |#2346
>>>>>>>> |9958aa
>>>>>>>> AIRFLOW-1273|Google Cloud ML Version and Model CRUD Operator   |#2379
>>>>>>>> |534a0e
>>>>>>>> AIRFLOW-1272|Google Cloud ML Batch Prediction Operator         |#2390
>>>>>>>> |e92d6b
>>>>>>>> AIRFLOW-1271|Google Cloud ML Training Operator                 |#2408
>>>>>>>> |0fc450
>>>>>>>> AIRFLOW-1256|Add United Airlines as Airflow user               |#2332
>>>>>>>> |d3484a
>>>>>>>> AIRFLOW-1251|Add eRevalue as an Airflow user                   |#2331
>>>>>>>> |8d5160
>>>>>>>> AIRFLOW-1248|Fix inconsistent configuration name for worker tim|#2328
>>>>>>>> |92314f
>>>>>>>> AIRFLOW-1247|CLI: ignore all dependencies argument ignored     |#2441
>>>>>>>> |e88ecf
>>>>>>>> AIRFLOW-1245|Fix random failure of test_trigger_dag_for_date un|#2325
>>>>>>>> |cef01b
>>>>>>>> AIRFLOW-1244|Forbid creation of a pool with empty name         |#2324
>>>>>>>> |df9a10
>>>>>>>> AIRFLOW-1242|BigQueryHook assumes that a valid project_id can't|#2335
>>>>>>>> |ffe616
>>>>>>>> AIRFLOW-1237|Fix IN-predicate sqlalchemy warning               |#2320
>>>>>>>> |a1f422
>>>>>>>> AIRFLOW-1234|Cover utils.operator_helpers with unit tests      |#2317
>>>>>>>> |d16537
>>>>>>>> AIRFLOW-1233|Cover utils.json with unit tests                  |#2316
>>>>>>>> |502410
>>>>>>>> AIRFLOW-1232|Remove deprecated readfp warning                  |#2315
>>>>>>>> |6ffaaf
>>>>>>>> AIRFLOW-1231|Use flask_wtf.CSRFProtect instead of flask_wtf.Csr|#2313
>>>>>>>> |cac49e
>>>>>>>> AIRFLOW-1221|Fix DatabricksSubmitRunOperator Templating        |#2308
>>>>>>>> |0fa104
>>>>>>>> AIRFLOW-1217|Enable logging in Sqoop hook                      |#2307
>>>>>>>> |4f459b
>>>>>>>> AIRFLOW-1213|Add hcatalog parameters to the sqoop operator/hook|#2305
>>>>>>>> |857850
>>>>>>>> AIRFLOW-1208|Speed-up cli tests                                |#2301
>>>>>>>> |21c142
>>>>>>>> AIRFLOW-1207|Enable utils.helpers unit tests                   |#2300
>>>>>>>> |8ac87b
>>>>>>>> AIRFLOW-1203|Tests failing after oauth upgrade                 |#2296
>>>>>>>> |3e9c66
>>>>>>>> AIRFLOW-1201|Update deprecated 'nose-parameterized' library to |#2298
>>>>>>>> |d2d3e4
>>>>>>>> AIRFLOW-1193|Add Checkr to Airflow user list                   |#2276
>>>>>>>> |707238
>>>>>>>> AIRFLOW-1189|Get pandas DataFrame using BigQueryHook fails     |#2287
>>>>>>>> |93666f
>>>>>>>> AIRFLOW-1188|Add max_bad_records param to GoogleCloudStorageToB|#2286
>>>>>>>> |443e6b
>>>>>>>> AIRFLOW-1187|Obsolete package names in documentation           |-
>>>>> |-
>>>>>>>> 
>>>>>>>> AIRFLOW-1185|Incorrect url to PyPi                             |#2283
>>>>>>>> |829755
>>>>>>>> AIRFLOW-1181|Enable delete and list function for Google Cloud S|#2281
>>>>>>>> |24f73c
>>>>>>>> AIRFLOW-1179|Pandas 0.20 broke Google BigQuery hook            |#2279
>>>>>>>> |ac9ccb
>>>>>>>> AIRFLOW-1177|variable json deserialize does not work at set def|#2540
>>>>>>>> |65319a
>>>>>>>> AIRFLOW-1175|Add Pronto Tools to Airflow user list             |#2277
>>>>>>>> |86aafa
>>>>>>>> AIRFLOW-1173|Add Robinhood to list of Airflow users            |#2271
>>>>>>>> |379115
>>>>>>>> AIRFLOW-1165|airflow webservice crashes on ubuntu16 - python3  |-
>>>>> |-
>>>>>>>> 
>>>>>>>> AIRFLOW-1160|Update SparkSubmitOperator parameters             |#2265
>>>>>>>> |2e3f07
>>>>>>>> AIRFLOW-1155|Add Tails.com to community                        |#2261
>>>>>>>> |2fa690
>>>>>>>> AIRFLOW-1149|Allow custom filters to be added to jinja2        |#2258
>>>>>>>> |48135a
>>>>>>>> AIRFLOW-1141|Remove DAG.crawl_for_tasks method                 |#2275
>>>>>>>> |a30fee
>>>>>>>> AIRFLOW-1140|DatabricksSubmitRunOperator should template the "j|#2255
>>>>>>>> |e6d316
>>>>>>>> AIRFLOW-1136|Invalid parameters are not captured for Sqoop oper|#2252
>>>>>>>> |2ef4db
>>>>>>>> AIRFLOW-1125|Clarify documentation regarding fernet_key        |#2251
>>>>>>>> |831f8d
>>>>>>>> AIRFLOW-1122|Node strokes are too thin for people with color vi|#2246
>>>>>>>> |a08761
>>>>>>>> AIRFLOW-1121|airflow webserver --pid no longer write out pid fi|-
>>>>> |-
>>>>>>>> 
>>>>>>>> AIRFLOW-1118|Add evo.company to Airflow users                  |#2243
>>>>>>>> |f16914
>>>>>>>> AIRFLOW-1112|Log which pool is full in scheduler when pool slot|#2242
>>>>>>>> |74c1ce
>>>>>>>> AIRFLOW-1107|Add support for ftps non-default port             |#2240
>>>>>>>> |4d0c2f
>>>>>>>> AIRFLOW-1106|Add Groupalia/Letsbonus                           |#2239
>>>>>>>> |945b42
>>>>>>>> AIRFLOW-1095|ldap_auth memberOf should come from configuration |#2232
>>>>>>>> |6b1c32
>>>>>>>> AIRFLOW-1094|Invalid unit tests under `contrib/`               |#2234
>>>>>>>> |219c50
>>>>>>>> AIRFLOW-1091|As a release manager I want to be able to compare |#2231
>>>>>>>> |bfae42
>>>>>>>> AIRFLOW-1090|Add HBO                                           |#2230
>>>>>>>> |177d34
>>>>>>>> AIRFLOW-1089|Add Spark application arguments to SparkSubmitOper|#2229
>>>>>>>> |e5b914
>>>>>>>> AIRFLOW-1081|Task duration page is slow                        |#2226
>>>>>>>> |0da512
>>>>>>>> AIRFLOW-1075|Cleanup security docs                             |#2222
>>>>>>>> |5a6f18
>>>>>>>> AIRFLOW-1065|Add functionality for Azure Blob Storage          |#2216
>>>>>>>> |f1bc5f
>>>>>>>> AIRFLOW-1059|Reset_state_for_orphaned_task should operate in ba|#2205
>>>>>>>> |e05d3b
>>>>>>>> AIRFLOW-1058|Improvements for SparkSubmitOperator              |-
>>>>> |-
>>>>>>>> 
>>>>>>>> AIRFLOW-1051|Add a test for resetdb to CliTests                |#2198
>>>>>>>> |15aee0
>>>>>>>> AIRFLOW-1047|Airflow logs vulnerable to XSS                    |#2193
>>>>>>>> |fe9ebe
>>>>>>>> AIRFLOW-1045|Make log level configurable via airflow.cfg       |#2191
>>>>>>>> |e739a5
>>>>>>>> AIRFLOW-1043|Documentation issues for operators                |#2188
>>>>>>>> |b55f41
>>>>>>>> AIRFLOW-1041|DockerOperator replaces its xcom_push method with |#2274
>>>>>>>> |03704c
>>>>>>>> AIRFLOW-1040|Fix typos in comments/docstrings in models.py     |#2174
>>>>>>>> |d8c0f5
>>>>>>>> AIRFLOW-1036|Exponential backoff should use randomization      |#2262
>>>>>>>> |66168e
>>>>>>>> AIRFLOW-1035|Exponential backoff retry logic should use 2 as ba|#2196
>>>>>>>> |4ec932
>>>>>>>> AIRFLOW-1034|Make it possible to connect to S3 in sigv4 regions|#2181
>>>>>>>> |4c0905
>>>>>>>> AIRFLOW-1031|'scheduled__' may replace with DagRun.ID_PREFIX in|#2613
>>>>>>>> |aa3844
>>>>>>>> AIRFLOW-1030|HttpHook error when creating HttpSensor           |-
>>>>> |-
>>>>>>>> 
>>>>>>>> AIRFLOW-1028|Databricks Operator for Airflow                   |#2202
>>>>>>>> |53ca50
>>>>>>>> AIRFLOW-1024|Handle CeleryExecutor errors gracefully           |#2355
>>>>>>>> |7af20f
>>>>>>>> AIRFLOW-1018|Scheduler DAG processes can not log to stdout     |#2728
>>>>>>>> |ef775d
>>>>>>>> AIRFLOW-1016|Allow HTTP HEAD request method on HTTPSensor      |#2175
>>>>>>>> |4c41f6
>>>>>>>> AIRFLOW-1010|Add a convenience script for signing              |#2169
>>>>>>>> |a2b65a
>>>>>>>> AIRFLOW-1009|Remove SQLOperator from Concepts page             |#2168
>>>>>>>> |7d1144
>>>>>>>> AIRFLOW-1007|Jinja sandbox is vulnerable to RCE                |#2184
>>>>>>>> |daa281
>>>>>>>> AIRFLOW-1005|Speed up Airflow startup time                     |#na
>>>>>>>> |996dd3
>>>>>>>> AIRFLOW-999 |Support for Redis database                        |#2165
>>>>>>>> |8de850
>>>>>>>> AIRFLOW-997 |Change setup.cfg to point to Apache instead of Max|#na
>>>>>>>> |75cd46
>>>>>>>> AIRFLOW-995 |Update Github PR template                         |#2163
>>>>>>>> |b62485
>>>>>>>> AIRFLOW-994 |Add MiNODES to the AIRFLOW Active Users List      |#2159
>>>>>>>> |ca1623
>>>>>>>> AIRFLOW-991 |Mark_success while a task is running leads to fail|-
>>>>> |-
>>>>>>>> 
>>>>>>>> AIRFLOW-990 |DockerOperator fails when logging unicode string  |#2155
>>>>>>>> |6bbf54
>>>>>>>> AIRFLOW-988 |SLA Miss Callbacks Are Repeated if Email is Not be|#2415
>>>>>>>> |6e74d4
>>>>>>>> AIRFLOW-985 |Extend the sqoop operator/hook with additional par|#2177
>>>>>>>> |82eb20
>>>>>>>> AIRFLOW-984 |Subdags unrecognized when subclassing SubDagOperat|#2152
>>>>>>>> |a8bd16
>>>>>>>> AIRFLOW-979 |Add GovTech GDS                                   |#2149
>>>>>>>> |b17bd3
>>>>>>>> AIRFLOW-976 |Mark success running task causes it to fail       |-
>>>>> |-
>>>>>>>> 
>>>>>>>> AIRFLOW-969 |Catch bad python_callable argument at DAG construc|#2142
>>>>>>>> |12901d
>>>>>>>> AIRFLOW-963 |Some code examples are not rendered in the airflow|#2139
>>>>>>>> |f69c1b
>>>>>>>> AIRFLOW-960 |Add support for .editorconfig                     |#na
>>>>>>>> |f5cacc
>>>>>>>> AIRFLOW-959 |.gitignore file is disorganized and incomplete    |#na
>>>>>>>> |3d3c14
>>>>>>>> AIRFLOW-958 |Improve tooltip readability                       |#2134
>>>>>>>> |b3c3eb
>>>>>>>> AIRFLOW-950 |Missing AWS integrations on documentation::integra|#2552
>>>>>>>> |01be02
>>>>>>>> AIRFLOW-947 |Make PrestoHook surface better messages when the P|#na
>>>>>>>> |6dd4b3
>>>>>>>> AIRFLOW-945 |Revert psycopg2 workaround when psycopg2 2.7.1 is |-
>>>>> |-
>>>>>>>> 
>>>>>>>> AIRFLOW-943 |Add Digital First Media to the Airflow users list |#2115
>>>>>>>> |2cfe28
>>>>>>>> AIRFLOW-942 |Add mytaxi to Airflow Users                       |#2111
>>>>>>>> |d579e6
>>>>>>>> AIRFLOW-935 |Impossible to use plugin executors                |#2120
>>>>>>>> |08a784
>>>>>>>> AIRFLOW-926 |jdbc connector is broken due to jaydebeapi api upd|#2651
>>>>>>>> |07ed29
>>>>>>>> AIRFLOW-917 |Incorrectly formatted failure status message      |#2109
>>>>>>>> |b8164c
>>>>>>>> AIRFLOW-916 |Fix ConfigParser deprecation warning              |#2108
>>>>>>>> |ef6dd1
>>>>>>>> AIRFLOW-911 |Add colouring and profiling info on tests         |#2106
>>>>>>>> |4f52db
>>>>>>>> AIRFLOW-903 |Add configuration setting for default DAG view.   |#2103
>>>>>>>> |cadfae
>>>>>>>> AIRFLOW-896 |BigQueryOperator fails to execute with certain inp|#2097
>>>>>>>> |2bceee
>>>>>>>> AIRFLOW-891 |Webserver Clock Should Include Day                |-
>>>>> |-
>>>>>>>> 
>>>>>>>> AIRFLOW-889 |Minor error in the docstrings for BaseOperator.   |#2084
>>>>>>>> |50702d
>>>>>>>> AIRFLOW-887 |Add compatibility with future v0.16               |#na
>>>>>>>> |50902d
>>>>>>>> AIRFLOW-886 |Pass Operator result to post_execute hook         |#na
>>>>>>>> |4da361
>>>>>>>> AIRFLOW-885 |Add Change.org to the list of Airflow users       |#2089
>>>>>>>> |a279be
>>>>>>>> AIRFLOW-882 |Code example in docs has unnecessary DAG>>Operator|#2088
>>>>>>>> |baa4cd
>>>>>>>> AIRFLOW-881 |Create SubDagOperator within DAG context manager w|#2087
>>>>>>>> |0ed608
>>>>>>>> AIRFLOW-880 |Fix remote log functionality inconsistencies for W|#2086
>>>>>>>> |974b75
>>>>>>>> AIRFLOW-877 |GoogleCloudStorageDownloadOperator: template_ext c|#2083
>>>>>>>> |debc69
>>>>>>>> AIRFLOW-875 |Allow HttpSensor params to be templated           |#2080
>>>>>>>> |62f503
>>>>>>>> AIRFLOW-871 |multiple places use logging.warn() instead of warn|#2082
>>>>>>>> |21d775
>>>>>>>> AIRFLOW-866 |Add FTPSensor                                     |#2070
>>>>>>>> |5f87f8
>>>>>>>> AIRFLOW-863 |Example DAG start dates should be recent to avoid |#2068
>>>>>>>> |bbfd43
>>>>>>>> AIRFLOW-862 |Add DaskExecutor                                  |#2067
>>>>>>>> |6e2210
>>>>>>>> AIRFLOW-860 |Circular module dependency prevents loading of cus|-
>>>>> |-
>>>>>>>> 
>>>>>>>> AIRFLOW-854 |Add Open Knowledge International to Airflow users |#2061
>>>>>>>> |51a311
>>>>>>>> AIRFLOW-842 |scheduler.clean_dirty raises warning: SAWarning: T|#2072
>>>>>>>> |485280
>>>>>>>> AIRFLOW-840 |Python3 encoding issue in Kerberos                |#2158
>>>>>>>> |639336
>>>>>>>> AIRFLOW-836 |The paused and queryview endpoints are vulnerable |#2054
>>>>>>>> |6aca2c
>>>>>>>> AIRFLOW-831 |Fix broken unit tests                             |#2050
>>>>>>>> |b86194
>>>>>>>> AIRFLOW-830 |Plugin manager should log to debug, not info      |-
>>>>> |-
>>>>>>>> 
>>>>>>>> AIRFLOW-829 |Reduce verbosity of successful Travis unit tests  |-
>>>>> |-
>>>>>>>> 
>>>>>>>> AIRFLOW-826 |Add Zendesk Hook                                  |#2066
>>>>>>>> |a09762
>>>>>>>> AIRFLOW-823 |Make task instance details available via API      |#2045
>>>>>>>> |3f546e
>>>>>>>> AIRFLOW-822 |Close the connection before throwing exception in |#2038
>>>>>>>> |4b6c38
>>>>>>>> AIRFLOW-821 |Scheduler dagbag importing not Py3 compatible     |#2039
>>>>>>>> |fbb59b
>>>>>>>> AIRFLOW-809 |SqlAlchemy is_ ColumnOperator Causing Errors in MS|-
>>>>> |-
>>>>>>>> 
>>>>>>>> AIRFLOW-802 |Integration of spark-submit                       |-
>>>>> |-
>>>>>>>> 
>>>>>>>> AIRFLOW-781 |Allow DataFlowJavaOperator to accept jar file stor|#2037
>>>>>>>> |259c86
>>>>>>>> AIRFLOW-770 |HDFS hooks should support alternative ways of gett|#2056
>>>>>>>> |261b65
>>>>>>>> AIRFLOW-756 |Refactor ssh_hook and ssh_operator                |-
>>>>> |-
>>>>>>>> 
>>>>>>>> AIRFLOW-751 |SFTP file transfer functionality                  |#1999
>>>>>>>> |fe0ede
>>>>>>>> AIRFLOW-725 |Make merge tool use OS' keyring for password stora|#1966
>>>>>>>> |8c1695
>>>>>>>> AIRFLOW-706 |Configuration shell commands are not split properl|#2053
>>>>>>>> |0bb6f2
>>>>>>>> AIRFLOW-705 |airflow.configuration.run_command output does not |-
>>>>> |-
>>>>>>>> 
>>>>>>>> AIRFLOW-681 |homepage doc link should pointing to apache's repo|#2164
>>>>>>>> |a8027a
>>>>>>>> AIRFLOW-654 |SSL for AMQP w/ Celery(Executor)                  |#2333
>>>>>>>> |868bfe
>>>>>>>> AIRFLOW-645 |HttpHook ignores https                            |#2311
>>>>>>>> |fd381a
>>>>>>>> AIRFLOW-365 |Code view in subdag trigger exception             |#2043
>>>>>>>> |cf102c
>>>>>>>> AIRFLOW-300 |Add Google Pubsub hook and operator               |#2036
>>>>>>>> |d231dc
>>>>>>>> AIRFLOW-289 |Use datetime.utcnow() to keep airflow system indep|#2618
>>>>>>>> |20c83e
>>>>>>>> AIRFLOW-71  |docker_operator - pulling from private repositorie|#na
>>>>>>>> |d4406c
>>>>>>>> 
>>>>>>>> Cheers,
>>>>>>>> Chris
>>>>>>>> 
>>>>>> 
>>>>> 
>>>>> 
>>> 
>> 
> 



Re: [VOTE] Airflow 1.9.0rc1

Posted by Ash Berlin-Taylor <as...@firemirror.com>.
That final URL should have been https://issues.apache.org/jira/browse/AIRFLOW-1797, which now has a PR for it: https://github.com/apache/incubator-airflow/pull/2771

There's going to be some more fixes coming around S3 logs.

One thing I have noticed is that the switch to per-try logs (which is awesome! So much easier to view) means I can't read old logs anymore, because they aren't split by try; each task instance's output is just all in one file.

Is it worth making a change to the log task handlers to try loading under the old pattern if none are found with the new style? (The other option is that I just run a migration script to move the old logs into the new place. That sort of only helps me though.)
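A minimal sketch of that fallback idea (the filename layouts below are illustrative assumptions, not Airflow's actual log path templates):

```python
from pathlib import Path


def read_task_log(base: Path, dag_id: str, task_id: str,
                  execution_date: str, try_number: int) -> str:
    """Return log text, preferring the new per-try layout."""
    # New (1.9-style) layout: one file per try.
    new_style = base / dag_id / task_id / execution_date / f"{try_number}.log"
    if new_style.exists():
        return new_style.read_text()
    # Old layout: everything for the task instance in a single file.
    old_style = base / dag_id / task_id / f"{execution_date}.log"
    if old_style.exists():
        return old_style.read_text()
    return ""
```

A handler that reads this way serves pre-upgrade logs without any migration script, at the cost of one extra `exists()` check per read.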

-ash

> On 9 Nov 2017, at 11:04, Ash Berlin-Taylor <as...@firemirror.com> wrote:
> 
> And one more - on Python3 we can't use S3Hook.load_string due to a bytes vs string issue: https://issues.apache.org/jira/browse/AIRFLOW-1796
> 
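The bytes-vs-string issue above comes down to boto3 expecting bytes for an object body on Python 3. A minimal sketch of the kind of normalization a fix could do (the function name and signature here are hypothetical, not the actual S3Hook API):

```python
def to_s3_body(data, encoding="utf-8"):
    """Normalize str or bytes to the bytes boto3's Body parameter expects.

    Hypothetical helper mirroring the shape of a load_string fix on
    Python 3; not the real S3Hook interface.
    """
    if isinstance(data, str):
        return data.encode(encoding)
    return data
```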
> I'll try and work on fixes for some/all of these today, including adding to and expanding the tests for S3Hook, which it looks like were somewhat lacking.
> 
> -ash
> 
>> On 9 Nov 2017, at 10:54, Ash Berlin-Taylor <as...@firemirror.com> wrote:
>> 
>> Thanks for picking this up. Your fix should stop the 500 error, but there's another problem (which is ultimately about user misconfiguration): https://issues.apache.org/jira/browse/AIRFLOW-1796 - the fix for that is to update a doc somewhere, and probably to validate this setting is correct at start time.
>> 
>> 
>> I've found another issue, related to the arg names of S3Hook. In 1.8.2 it was `s3_conn_id`, but the move to boto3/basing off AWSHook now expects `aws_conn_id`, and various places in the Airflow code base (and a few places in our dags/operators code base) still pass it as `s3_conn_id`. I've created https://issues.apache.org/jira/browse/AIRFLOW-1795 for that issue.
>> 
>> -ash
>> 
>> 
>>> On 8 Nov 2017, at 18:54, Daniel Huang <dx...@gmail.com> wrote:
>>> 
>>> Still testing this out.
>>> 
>>> Put up a small fix for Ash's second exception
>>> https://github.com/apache/incubator-airflow/pull/2766
>>> 
>>> On Wed, Nov 8, 2017 at 10:48 AM, Bolke de Bruin <bd...@gmail.com> wrote:
>>> 
>>>> Hi Chris,
>>>> 
>>>> Actively testing here: we found an issue in the SSHOperator introduced in
>>>> 1.9.0 (fix already merged for RC2, but blocking as it stops us from
>>>> running SSH properly); some minor fixes by Airbnb should also be in RC2.
>>>> There is some logging “weirdness” that might warrant a small patch here
>>>> and there and could be squeezed into RC2, but I don’t consider them blocking.
>>>> 
>>>> So almost there, but we need an RC2 imho.
>>>> 
>>>> -1, binding.
>>>> 
>>>> Bolke
>>>> 
>>>>> On 8 Nov 2017, at 19:00, Ash Berlin-Taylor <ash_airflowlist@firemirror.
>>>> com> wrote:
>>>>> 
>>>>> -1 (for now. Non binding. Is that how this process works?)
>>>>> 
>>>>> We've built a test env for this RC and are testing, but have run into an
>>>> issue reading task logs. (See below)
>>>>> 
>>>>> We haven't gotten very far with this yet, we will dig more tomorrow
>>>> (it's the end of the UK work day now). I suspect this might be how we've
>>>> misconfigured our logging. We will see tomorrow.
>>>>> 
>>>>> -ash
>>>>> 
>>>>> 
>>>>> 
>>>>> 
>>>>> File "/usr/local/lib/python3.5/dist-packages/airflow/www/views.py",
>>>> line 712, in log
>>>>> logs = handler.read(ti)
>>>>> AttributeError: 'NoneType' object has no attribute 'read'
>>>>> 
>>>>> During handling of the above exception, another exception occurred:
>>>>> 
>>>>> Traceback (most recent call last):
>>>>> File "/usr/local/lib/python3.5/dist-packages/flask/app.py", line 1988,
>>>> in wsgi_app
>>>>> response = self.full_dispatch_request()
>>>>> File "/usr/local/lib/python3.5/dist-packages/flask/app.py", line 1641,
>>>> in full_dispatch_request
>>>>> rv = self.handle_user_exception(e)
>>>>> File "/usr/local/lib/python3.5/dist-packages/flask/app.py", line 1544,
>>>> in handle_user_exception
>>>>> reraise(exc_type, exc_value, tb)
>>>>> File "/usr/local/lib/python3.5/dist-packages/flask/_compat.py", line
>>>> 33, in reraise
>>>>> raise value
>>>>> File "/usr/local/lib/python3.5/dist-packages/flask/app.py", line 1639,
>>>> in full_dispatch_request
>>>>> rv = self.dispatch_request()
>>>>> File "/usr/local/lib/python3.5/dist-packages/flask/app.py", line 1625,
>>>> in dispatch_request
>>>>> return self.view_functions[rule.endpoint](**req.view_args)
>>>>> File "/usr/local/lib/python3.5/dist-packages/flask_admin/base.py",
>>>> line 69, in inner
>>>>> return self._run_view(f, *args, **kwargs)
>>>>> File "/usr/local/lib/python3.5/dist-packages/flask_admin/base.py",
>>>> line 368, in _run_view
>>>>> return fn(self, *args, **kwargs)
>>>>> File "/usr/local/lib/python3.5/dist-packages/flask_login.py", line
>>>> 758, in decorated_view
>>>>> return func(*args, **kwargs)
>>>>> File "/usr/local/lib/python3.5/dist-packages/airflow/www/utils.py",
>>>> line 262, in wrapper
>>>>> return f(*args, **kwargs)
>>>>> File "/usr/local/lib/python3.5/dist-packages/airflow/www/views.py",
>>>> line 715, in log
>>>>> .format(task_log_reader, e.message)]
>>>>> AttributeError: 'AttributeError' object has no attribute 'message'
>>>>> 
>>>>> 
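The second traceback is a Python 2 idiom surviving into Python 3: exceptions no longer have a `.message` attribute, so the error handler in views.py raises its own AttributeError while formatting the first one. A py3-safe sketch of that formatting line (the helper name is hypothetical, not the actual views.py code):

```python
def format_log_error(task_log_reader, exc):
    # str(exc) works on both Python 2 and 3; exc.message does not exist on 3.
    return "Failed to load log with {}: {}".format(task_log_reader, exc)


try:
    raise AttributeError("'NoneType' object has no attribute 'read'")
except AttributeError as e:
    message = format_log_error("file.task", e)
```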
>>>>>> On 8 Nov 2017, at 17:46, Chris Riccomini <cr...@apache.org> wrote:
>>>>>> 
>>>>>> Anyone? :/
>>>>>> 
>>>>>> On Mon, Nov 6, 2017 at 1:22 PM, Chris Riccomini <cr...@apache.org>
>>>>>> wrote:
>>>>>> 
>>>>>>> Hey all,
>>>>>>> 
>>>>>>> I have cut Airflow 1.9.0 RC1. This email is calling a vote on the
>>>> release,
>>>>>>> which will last for 72 hours. Consider this my (binding) +1.
>>>>>>> 
>>>>>>> Airflow 1.9.0 RC1 is available at:
>>>>>>> 
>>>>>>> https://dist.apache.org/repos/dist/dev/incubator/airflow/1.9.0rc1/
>>>>>>> 
>>>>>>> apache-airflow-1.9.0rc1+incubating-source.tar.gz is a source release
>>>> that
>>>>>>> comes with INSTALL instructions.
>>>>>>> apache-airflow-1.9.0rc1+incubating-bin.tar.gz is the binary Python
>>>>>>> "sdist" release.
>>>>>>> 
>>>>>>> Public keys are available at:
>>>>>>> 
>>>>>>> https://dist.apache.org/repos/dist/release/incubator/airflow/
>>>>>>> 
>>>>>>> The release contains the following JIRAs:
>>>>>>> 
>>>>>>> ISSUE ID    |DESCRIPTION                                       |PR
>>>>>>> |COMMIT
>>>>>>> AIRFLOW-1779|Add keepalive packets to ssh hook                 |#2749
>>>>>>> |d2f9d1
>>>>>>> AIRFLOW-1776|stdout/stderr logging not captured                |#2745
>>>>>>> |590d9f
>>>>>>> AIRFLOW-1771|Change heartbeat text from boom to heartbeat      |-
>>>> |-
>>>>>>> 
>>>>>>> AIRFLOW-1767|Airflow Scheduler no longer schedules DAGs        |-
>>>> |-
>>>>>>> 
>>>>>>> AIRFLOW-1765|Default API auth backed should deny all.          |#2737
>>>>>>> |6ecdac
>>>>>>> AIRFLOW-1764|Web Interface should not use experimental api     |#2738
>>>>>>> |6bed1d
>>>>>>> AIRFLOW-1757|Contrib.SparkSubmitOperator should allow --package|#2725
>>>>>>> |4e06ee
>>>>>>> AIRFLOW-1745|BashOperator ignores SIGPIPE in subprocess        |#2714
>>>>>>> |e021c9
>>>>>>> AIRFLOW-1744|task.retries can be False                         |#2713
>>>>>>> |6144c6
>>>>>>> AIRFLOW-1743|Default config template should not contain ldap fi|#2712
>>>>>>> |270684
>>>>>>> AIRFLOW-1741|Task Duration shows two charts on first page load.|#2711
>>>>>>> |974b49
>>>>>>> AIRFLOW-1734|Sqoop Operator contains logic errors & needs optio|#2703
>>>>>>> |f6810c
>>>>>>> AIRFLOW-1731|Import custom config on PYTHONPATH                |#2721
>>>>>>> |f07eb3
>>>>>>> AIRFLOW-1726|Copy Expert command for Postgres Hook             |#2698
>>>>>>> |8a4ad3
>>>>>>> AIRFLOW-1719|Fix small typo - your vs you                      |-
>>>> |-
>>>>>>> 
>>>>>>> AIRFLOW-1712|Log SSHOperator output                            |-
>>>> |-
>>>>>>> 
>>>>>>> AIRFLOW-1711|Ldap Attributes not always a "list" part 2        |#2731
>>>>>>> |40a936
>>>>>>> AIRFLOW-1706|Scheduler is failed on startup with MS SQL Server |#2733
>>>>>>> |9e209b
>>>>>>> AIRFLOW-1698|Remove confusing SCHEDULER_RUNS env var from syste|#2677
>>>>>>> |00dd06
>>>>>>> AIRFLOW-1695|Redshift Hook using boto3 & AWS Hook              |#2717
>>>>>>> |bfddae
>>>>>>> AIRFLOW-1694|Hive Hooks: Python 3 does not have an `itertools.i|#2674
>>>>>>> |c6e5ae
>>>>>>> AIRFLOW-1692|Master cannot be checked out on windows           |#2673
>>>>>>> |31805e
>>>>>>> AIRFLOW-1691|Add better documentation for Google cloud storage |#2671
>>>>>>> |ace2b1
>>>>>>> AIRFLOW-1690|Error messages regarding gcs log commits are spars|#2670
>>>>>>> |5fb5cd
>>>>>>> AIRFLOW-1682|S3 task handler never writes to S3                |#2664
>>>>>>> |0080f0
>>>>>>> AIRFLOW-1678|Fix docstring errors for `set_upstream` and `set_d|-
>>>> |-
>>>>>>> 
>>>>>>> AIRFLOW-1677|Fix typo in example_qubole_operator               |-
>>>> |-
>>>>>>> 
>>>>>>> AIRFLOW-1676|GCS task handler never writes to GCS              |#2659
>>>>>>> |781fa4
>>>>>>> AIRFLOW-1675|Fix API docstrings to be properly rendered        |#2667
>>>>>>> |f12381
>>>>>>> AIRFLOW-1671|Missing @apply_defaults annotation for gcs downloa|#2655
>>>>>>> |97666b
>>>>>>> AIRFLOW-1669|Fix Docker import in Master                       |#na
>>>>>>> |f7f2a8
>>>>>>> AIRFLOW-1668|Redshift requires a keep alive of < 300s          |#2650
>>>>>>> |f2bb77
>>>>>>> AIRFLOW-1664|Make MySqlToGoogleCloudStorageOperator support bin|#2649
>>>>>>> |95813d
>>>>>>> AIRFLOW-1660|Change webpage width to full-width                |#2646
>>>>>>> |8ee3d9
>>>>>>> AIRFLOW-1659|Fix invalid attribute bug in FileTaskHandler      |#2645
>>>>>>> |bee823
>>>>>>> AIRFLOW-1658|Kill (possibly) still running Druid indexing job a|#2644
>>>>>>> |cbf7ad
>>>>>>> AIRFLOW-1657|Handle failure of Qubole Operator for s3distcp had|-
>>>> |-
>>>>>>> 
>>>>>>> AIRFLOW-1654|Show tooltips for link icons in DAGs view         |#2642
>>>>>>> |ada7b2
>>>>>>> AIRFLOW-1647|Fix Spark-sql hook                                |#2637
>>>>>>> |b1e5c6
>>>>>>> AIRFLOW-1641|Task gets stuck in queued state                   |#2715
>>>>>>> |735497
>>>>>>> AIRFLOW-1640|Add Qubole default connection in connection table |-
>>>> |-
>>>>>>> 
>>>>>>> AIRFLOW-1639|ValueError does not have .message attribute       |#2629
>>>>>>> |87df67
>>>>>>> AIRFLOW-1637|readme not tracking master branch for travis      |-
>>>> |-
>>>>>>> 
>>>>>>> AIRFLOW-1636|aws and emr connection types get cleared          |#2626
>>>>>>> |540e04
>>>>>>> AIRFLOW-1635|Allow creating Google Cloud Platform connection wi|#2640
>>>>>>> |6dec7a
>>>>>>> AIRFLOW-1629|make extra a textarea in edit connections form    |#2623
>>>>>>> |f5d46f
>>>>>>> AIRFLOW-1628|Docstring of sqlsensor is incorrect               |#2621
>>>>>>> |9ba73d
>>>>>>> AIRFLOW-1627|SubDagOperator initialization should only query po|#2620
>>>>>>> |516ace
>>>>>>> AIRFLOW-1621|Add tests for logic added on server side dag list |#2614
>>>>>>> |8de9fd
>>>>>>> AIRFLOW-1614|Improve performance of DAG parsing when there are |#2610
>>>>>>> |a95adb
>>>>>>> AIRFLOW-1611|Customize logging in Airflow                      |#2631
>>>>>>> |8b4a50
>>>>>>> AIRFLOW-1609|Ignore all venvs in gitignore                     |#2608
>>>>>>> |f1f9b4
>>>>>>> AIRFLOW-1608|GCP Dataflow hook missing pending job state       |#2607
>>>>>>> |653562
>>>>>>> AIRFLOW-1606|DAG.sync_to_db is static, but takes a DAG as first|#2606
>>>>>>> |6ac296
>>>>>>> AIRFLOW-1605|Fix log source of local loggers                   |-
>>>> |-
>>>>>>> 
>>>>>>> AIRFLOW-1604|Rename the logger to log                          |#2604
>>>>>>> |af4050
>>>>>>> AIRFLOW-1602|Use LoggingMixin for the DAG class                |#2602
>>>>>>> |956699
>>>>>>> AIRFLOW-1601|Add configurable time between SIGTERM and SIGKILL |#2601
>>>>>>> |48a95e
>>>>>>> AIRFLOW-1600|Uncaught exceptions in get_fernet if cryptography |#2600
>>>>>>> |ad963e
>>>>>>> AIRFLOW-1597|Add GameWisp as Airflow user                      |#2599
>>>>>>> |26b747
>>>>>>> AIRFLOW-1594|Installing via pip copies test files into python l|#2597
>>>>>>> |a6b23a
>>>>>>> AIRFLOW-1593|Expose load_string in WasbHook                    |#2596
>>>>>>> |7ece95
>>>>>>> AIRFLOW-1591|Exception: 'TaskInstance' object has no attribute |#2578
>>>>>>> |f4653e
>>>>>>> AIRFLOW-1590|Small fix for dates util                          |#2652
>>>>>>> |31946e
>>>>>>> AIRFLOW-1587|fix `ImportError: cannot import name 'CeleryExecut|#2590
>>>>>>> |34c73b
>>>>>>> AIRFLOW-1586|MySQL to GCS to BigQuery fails for tables with dat|#2589
>>>>>>> |e83012
>>>>>>> AIRFLOW-1584|Remove the insecure /headers endpoints            |#2588
>>>>>>> |17ac07
>>>>>>> AIRFLOW-1582|Improve logging structure of Airflow              |#2592
>>>>>>> |a7a518
>>>>>>> AIRFLOW-1580|Error in string formatter when throwing an excepti|#2583
>>>>>>> |ea9ab9
>>>>>>> AIRFLOW-1579|Allow jagged rows in BQ Hook.                     |#2582
>>>>>>> |5b978b
>>>>>>> AIRFLOW-1577|Add token support to DatabricksHook               |#2579
>>>>>>> |c2c515
>>>>>>> AIRFLOW-1573|Remove `thrift < 0.10.0` requirement              |#2574
>>>>>>> |aa95f2
>>>>>>> AIRFLOW-1571|Add AWS Lambda Hook for invoking Lambda Function  |#2718
>>>>>>> |017f18
>>>>>>> AIRFLOW-1568|Add datastore import/export operator              |#2568
>>>>>>> |86063b
>>>>>>> AIRFLOW-1567|Clean up ML Engine operators                      |#2567
>>>>>>> |af91e2
>>>>>>> AIRFLOW-1564|Default logging filename contains a colon         |#2565
>>>>>>> |4c674c
>>>>>>> AIRFLOW-1560|Add AWS DynamoDB hook for inserting batch items   |#2587
>>>>>>> |71400b
>>>>>>> AIRFLOW-1556|BigQueryBaseCursor should support SQL parameters  |#2557
>>>>>>> |9df0ac
>>>>>>> AIRFLOW-1546| add Zymergen to org list in README               |#2512
>>>>>>> |7cc346
>>>>>>> AIRFLOW-1535|Add support for Dataproc serviceAccountScopes in D|#2546
>>>>>>> |b1f902
>>>>>>> AIRFLOW-1529|Support quoted newlines in Google BigQuery load jo|#2545
>>>>>>> |4a4b02
>>>>>>> AIRFLOW-1527|Refactor celery config to make use of template    |#2542
>>>>>>> |f4437b
>>>>>>> AIRFLOW-1522|Increase size of val column for variable table in |#2535
>>>>>>> |8a2d24
>>>>>>> AIRFLOW-1521|Template fields definition for bigquery_table_dele|#2534
>>>>>>> |f1a7c0
>>>>>>> AIRFLOW-1520|S3Hook uses boto2                                 |#2532
>>>>>>> |386583
>>>>>>> AIRFLOW-1519|Main DAG list page does not scale using client sid|#2531
>>>>>>> |d7d7ce
>>>>>>> AIRFLOW-1512|Add operator for running Python functions in a vir|#2446
>>>>>>> |14e6d7
>>>>>>> AIRFLOW-1507|Make src, dst and bucket parameters as templated i|#2516
>>>>>>> |d295cf
>>>>>>> AIRFLOW-1505|Document when Jinja substitution occurs           |#2523
>>>>>>> |984a87
>>>>>>> AIRFLOW-1504|Log Cluster Name on Dataproc Operator When Execute|#2517
>>>>>>> |1cd6c4
>>>>>>> AIRFLOW-1499|Eliminate duplicate and unneeded code              |-
>>>> |-
>>>>>>> 
>>>>>>> AIRFLOW-1497|Hidden fields in connection form aren't reset when|#2507
>>>>>>> |d8da8b
>>>>>>> AIRFLOW-1493|Fix race condition with airflow run               |#2505
>>>>>>> |b2e175
>>>>>>> AIRFLOW-1492|Add metric for task success/failure               |#2504
>>>>>>> |fa84d4
>>>>>>> AIRFLOW-1489|Docs: Typo in BigQueryCheckOperator               |#2501
>>>>>>> |111ce5
>>>>>>> AIRFLOW-1483|Page size on model views is to large to render qui|#2497
>>>>>>> |04bfba
>>>>>>> AIRFLOW-1478|Chart -> Owner column should be sortable          |#2493
>>>>>>> |651e60
>>>>>>> AIRFLOW-1476|Add INSTALL file for source releases              |#2492
>>>>>>> |da76ac
>>>>>>> AIRFLOW-1474|Add dag_id regex for 'airflow clear' CLI command  |#2486
>>>>>>> |18f849
>>>>>>> AIRFLOW-1470|BashSensor Implementation                         |-
>>>> |-
>>>>>>> 
>>>>>>> AIRFLOW-1459|integration rst doc is broken in github view      |#2481
>>>>>>> |322ec9
>>>>>>> AIRFLOW-1438|Scheduler batch queries should have a limit       |#2462
>>>>>>> |3547cb
>>>>>>> AIRFLOW-1437|BigQueryTableDeleteOperator should define deletion|#2459
>>>>>>> |b87903
>>>>>>> AIRFLOW-1432|NVD3 Charts do not have labeled axes and units cha|#2710
>>>>>>> |70ffa4
>>>>>>> AIRFLOW-1402|Cleanup SafeConfigParser DeprecationWarning       |#2435
>>>>>>> |38c86b
>>>>>>> AIRFLOW-1401|Standardize GCP project, region, and zone argument|#2439
>>>>>>> |b6d363
>>>>>>> AIRFLOW-1397|Airflow 1.8.1 - No data displays in Last Run Colum|-
>>>> |-
>>>>>>> 
>>>>>>> AIRFLOW-1394|Add quote_character parameter to GoogleCloudStorag|#2428
>>>>>>> |9fd0be
>>>>>>> AIRFLOW-1389|BigQueryOperator should support `createDisposition|#2470
>>>>>>> |6e2640
>>>>>>> AIRFLOW-1384|Add ARGO/CaDC                                     |#2434
>>>>>>> |715947
>>>>>>> AIRFLOW-1368|Automatically remove the container when it exits  |#2653
>>>>>>> |d42d23
>>>>>>> AIRFLOW-1359|Provide GoogleCloudML operator for model evaluatio|#2407
>>>>>>> |194d1d
>>>>>>> AIRFLOW-1356|add `--celery_hostname` to `airflow worker`       |#2405
>>>>>>> |b9d7d1
>>>>>>> AIRFLOW-1352|Revert bad logging Handler                        |-
>>>> |-
>>>>>>> 
>>>>>>> AIRFLOW-1350|Add "query_uri" parameter for Google DataProc oper|#2402
>>>>>>> |d32c72
>>>>>>> AIRFLOW-1348|Paginated UI has broken toggles after first page  |-
>>>> |-
>>>>>>> 
>>>>>>> AIRFLOW-1345|Don't commit on each loop                         |#2397
>>>>>>> |0dd002
>>>>>>> AIRFLOW-1344|Builds failing on Python 3.5 with AttributeError  |#2394
>>>>>>> |2a5883
>>>>>>> AIRFLOW-1343|Add airflow default label to the dataproc operator|#2396
>>>>>>> |e4b240
>>>>>>> AIRFLOW-1338|gcp_dataflow_hook is incompatible with the recent |#2388
>>>>>>> |cf2605
>>>>>>> AIRFLOW-1337|Customize log format via config file              |#2392
>>>>>>> |4841e3
>>>>>>> AIRFLOW-1335|Use buffered logger                               |#2386
>>>>>>> |0d23d3
>>>>>>> AIRFLOW-1333|Enable copy function for Google Cloud Storage Hook|#2385
>>>>>>> |e2c383
>>>>>>> AIRFLOW-1331|Contrib.SparkSubmitOperator should allow --package|#2622
>>>>>>> |fbca8f
>>>>>>> AIRFLOW-1330|Connection.parse_from_uri doesn't work for google_|#2525
>>>>>>> |6e5e9d
>>>>>>> AIRFLOW-1324|Make the Druid operator/hook more general         |#2378
>>>>>>> |de99aa
>>>>>>> AIRFLOW-1323|Operators related to Dataproc should keep some par|#2636
>>>>>>> |ed248d
>>>>>>> AIRFLOW-1315|Add Qubole File and Partition Sensors             |-
>>>> |-
>>>>>>> 
>>>>>>> AIRFLOW-1309|Add optional hive_tblproperties in HiveToDruidTran|-
>>>> |-
>>>>>>> 
>>>>>>> AIRFLOW-1301|Add New Relic to Airflow user list                |#2359
>>>>>>> |355fc9
>>>>>>> AIRFLOW-1299|Google Dataproc cluster creation operator should s|#2358
>>>>>>> |c2b80e
>>>>>>> AIRFLOW-1289|Don't restrict scheduler threads to CPU cores     |#2353
>>>>>>> |8e23d2
>>>>>>> AIRFLOW-1286|BaseTaskRunner - Exception TypeError: a bytes-like|#2363
>>>>>>> |d8891d
>>>>>>> AIRFLOW-1277|Forbid creation of a known event with empty fields|#na
>>>>>>> |65184a
>>>>>>> AIRFLOW-1276|Forbid event creation with end_data earlier than s|#na
>>>>>>> |d5d02f
>>>>>>> AIRFLOW-1275|Fix `airflow pool` command exception              |#2346
>>>>>>> |9958aa
>>>>>>> AIRFLOW-1273|Google Cloud ML Version and Model CRUD Operator   |#2379
>>>>>>> |534a0e
>>>>>>> AIRFLOW-1272|Google Cloud ML Batch Prediction Operator         |#2390
>>>>>>> |e92d6b
>>>>>>> AIRFLOW-1271|Google Cloud ML Training Operator                 |#2408
>>>>>>> |0fc450
>>>>>>> AIRFLOW-1256|Add United Airlines as Airflow user               |#2332
>>>>>>> |d3484a
>>>>>>> AIRFLOW-1251|Add eRevalue as an Airflow user                   |#2331
>>>>>>> |8d5160
>>>>>>> AIRFLOW-1248|Fix inconsistent configuration name for worker tim|#2328
>>>>>>> |92314f
>>>>>>> AIRFLOW-1247|CLI: ignore all dependencies argument ignored     |#2441
>>>>>>> |e88ecf
>>>>>>> AIRFLOW-1245|Fix random failure of test_trigger_dag_for_date un|#2325
>>>>>>> |cef01b
>>>>>>> AIRFLOW-1244|Forbid creation of a pool with empty name         |#2324
>>>>>>> |df9a10
>>>>>>> AIRFLOW-1242|BigQueryHook assumes that a valid project_id can't|#2335
>>>>>>> |ffe616
>>>>>>> AIRFLOW-1237|Fix IN-predicate sqlalchemy warning               |#2320
>>>>>>> |a1f422
>>>>>>> AIRFLOW-1234|Cover utils.operator_helpers with unit tests      |#2317
>>>>>>> |d16537
>>>>>>> AIRFLOW-1233|Cover utils.json with unit tests                  |#2316
>>>>>>> |502410
>>>>>>> AIRFLOW-1232|Remove deprecated readfp warning                  |#2315
>>>>>>> |6ffaaf
>>>>>>> AIRFLOW-1231|Use flask_wtf.CSRFProtect instead of flask_wtf.Csr|#2313
>>>>>>> |cac49e
>>>>>>> AIRFLOW-1221|Fix DatabricksSubmitRunOperator Templating        |#2308
>>>>>>> |0fa104
>>>>>>> AIRFLOW-1217|Enable logging in Sqoop hook                      |#2307
>>>>>>> |4f459b
>>>>>>> AIRFLOW-1213|Add hcatalog parameters to the sqoop operator/hook|#2305
>>>>>>> |857850
>>>>>>> AIRFLOW-1208|Speed-up cli tests                                |#2301
>>>>>>> |21c142
>>>>>>> AIRFLOW-1207|Enable utils.helpers unit tests                   |#2300
>>>>>>> |8ac87b
>>>>>>> AIRFLOW-1203|Tests failing after oauth upgrade                 |#2296
>>>>>>> |3e9c66
>>>>>>> AIRFLOW-1201|Update deprecated 'nose-parameterized' library to |#2298
>>>>>>> |d2d3e4
>>>>>>> AIRFLOW-1193|Add Checkr to Airflow user list                   |#2276
>>>>>>> |707238
>>>>>>> AIRFLOW-1189|Get pandas DataFrame using BigQueryHook fails     |#2287
>>>>>>> |93666f
>>>>>>> AIRFLOW-1188|Add max_bad_records param to GoogleCloudStorageToB|#2286
>>>>>>> |443e6b
>>>>>>> AIRFLOW-1187|Obsolete package names in documentation           |-
>>>> |-
>>>>>>> 
>>>>>>> AIRFLOW-1185|Incorrect url to PyPi                             |#2283
>>>>>>> |829755
>>>>>>> AIRFLOW-1181|Enable delete and list function for Google Cloud S|#2281
>>>>>>> |24f73c
>>>>>>> AIRFLOW-1179|Pandas 0.20 broke Google BigQuery hook            |#2279
>>>>>>> |ac9ccb
>>>>>>> AIRFLOW-1177|variable json deserialize does not work at set def|#2540
>>>>>>> |65319a
>>>>>>> AIRFLOW-1175|Add Pronto Tools to Airflow user list             |#2277
>>>>>>> |86aafa
>>>>>>> AIRFLOW-1173|Add Robinhood to list of Airflow users            |#2271
>>>>>>> |379115
>>>>>>> AIRFLOW-1165|airflow webservice crashes on ubuntu16 - python3  |-
>>>> |-
>>>>>>> 
>>>>>>> AIRFLOW-1160|Update SparkSubmitOperator parameters             |#2265
>>>>>>> |2e3f07
>>>>>>> AIRFLOW-1155|Add Tails.com to community                        |#2261
>>>>>>> |2fa690
>>>>>>> AIRFLOW-1149|Allow custom filters to be added to jinja2        |#2258
>>>>>>> |48135a
>>>>>>> AIRFLOW-1141|Remove DAG.crawl_for_tasks method                 |#2275
>>>>>>> |a30fee
>>>>>>> AIRFLOW-1140|DatabricksSubmitRunOperator should template the "j|#2255
>>>>>>> |e6d316
>>>>>>> AIRFLOW-1136|Invalid parameters are not captured for Sqoop oper|#2252
>>>>>>> |2ef4db
>>>>>>> AIRFLOW-1125|Clarify documentation regarding fernet_key        |#2251
>>>>>>> |831f8d
>>>>>>> AIRFLOW-1122|Node strokes are too thin for people with color vi|#2246
>>>>>>> |a08761
>>>>>>> AIRFLOW-1121|airflow webserver --pid no longer write out pid fi|-
>>>> |-
>>>>>>> 
>>>>>>> AIRFLOW-1118|Add evo.company to Airflow users                  |#2243
>>>>>>> |f16914
>>>>>>> AIRFLOW-1112|Log which pool is full in scheduler when pool slot|#2242
>>>>>>> |74c1ce
>>>>>>> AIRFLOW-1107|Add support for ftps non-default port             |#2240
>>>>>>> |4d0c2f
>>>>>>> AIRFLOW-1106|Add Groupalia/Letsbonus                           |#2239
>>>>>>> |945b42
>>>>>>> AIRFLOW-1095|ldap_auth memberOf should come from configuration |#2232
>>>>>>> |6b1c32
>>>>>>> AIRFLOW-1094|Invalid unit tests under `contrib/`               |#2234
>>>>>>> |219c50
>>>>>>> AIRFLOW-1091|As a release manager I want to be able to compare |#2231
>>>>>>> |bfae42
>>>>>>> AIRFLOW-1090|Add HBO                                           |#2230
>>>>>>> |177d34
>>>>>>> AIRFLOW-1089|Add Spark application arguments to SparkSubmitOper|#2229
>>>>>>> |e5b914
>>>>>>> AIRFLOW-1081|Task duration page is slow                        |#2226
>>>>>>> |0da512
>>>>>>> AIRFLOW-1075|Cleanup security docs                             |#2222
>>>>>>> |5a6f18
>>>>>>> AIRFLOW-1065|Add functionality for Azure Blob Storage          |#2216
>>>>>>> |f1bc5f
>>>>>>> AIRFLOW-1059|Reset_state_for_orphaned_task should operate in ba|#2205
>>>>>>> |e05d3b
>>>>>>> AIRFLOW-1058|Improvements for SparkSubmitOperator              |-
>>>> |-
>>>>>>> 
>>>>>>> AIRFLOW-1051|Add a test for resetdb to CliTests                |#2198
>>>>>>> |15aee0
>>>>>>> AIRFLOW-1047|Airflow logs vulnerable to XSS                    |#2193
>>>>>>> |fe9ebe
>>>>>>> AIRFLOW-1045|Make log level configurable via airflow.cfg       |#2191
>>>>>>> |e739a5
>>>>>>> AIRFLOW-1043|Documentation issues for operators                |#2188
>>>>>>> |b55f41
>>>>>>> AIRFLOW-1041|DockerOperator replaces its xcom_push method with |#2274
>>>>>>> |03704c
>>>>>>> AIRFLOW-1040|Fix typos in comments/docstrings in models.py     |#2174
>>>>>>> |d8c0f5
>>>>>>> AIRFLOW-1036|Exponential backoff should use randomization      |#2262
>>>>>>> |66168e
>>>>>>> AIRFLOW-1035|Exponential backoff retry logic should use 2 as ba|#2196
>>>>>>> |4ec932
>>>>>>> AIRFLOW-1034|Make it possible to connect to S3 in sigv4 regions|#2181
>>>>>>> |4c0905
>>>>>>> AIRFLOW-1031|'scheduled__' may replace with DagRun.ID_PREFIX in|#2613
>>>>>>> |aa3844
>>>>>>> AIRFLOW-1030|HttpHook error when creating HttpSensor           |-
>>>> |-
>>>>>>> 
>>>>>>> AIRFLOW-1028|Databricks Operator for Airflow                   |#2202
>>>>>>> |53ca50
>>>>>>> AIRFLOW-1024|Handle CeleryExecutor errors gracefully           |#2355
>>>>>>> |7af20f
>>>>>>> AIRFLOW-1018|Scheduler DAG processes can not log to stdout     |#2728
>>>>>>> |ef775d
>>>>>>> AIRFLOW-1016|Allow HTTP HEAD request method on HTTPSensor      |#2175
>>>>>>> |4c41f6
>>>>>>> AIRFLOW-1010|Add a convenience script for signing              |#2169
>>>>>>> |a2b65a
>>>>>>> AIRFLOW-1009|Remove SQLOperator from Concepts page             |#2168
>>>>>>> |7d1144
>>>>>>> AIRFLOW-1007|Jinja sandbox is vulnerable to RCE                |#2184
>>>>>>> |daa281
>>>>>>> AIRFLOW-1005|Speed up Airflow startup time                     |#na
>>>>>>> |996dd3
>>>>>>> AIRFLOW-999 |Support for Redis database                        |#2165
>>>>>>> |8de850
>>>>>>> AIRFLOW-997 |Change setup.cfg to point to Apache instead of Max|#na
>>>>>>> |75cd46
>>>>>>> AIRFLOW-995 |Update Github PR template                         |#2163
>>>>>>> |b62485
>>>>>>> AIRFLOW-994 |Add MiNODES to the AIRFLOW Active Users List      |#2159
>>>>>>> |ca1623
>>>>>>> AIRFLOW-991 |Mark_success while a task is running leads to fail|-
>>>> |-
>>>>>>> 
>>>>>>> AIRFLOW-990 |DockerOperator fails when logging unicode string  |#2155
>>>>>>> |6bbf54
>>>>>>> AIRFLOW-988 |SLA Miss Callbacks Are Repeated if Email is Not be|#2415
>>>>>>> |6e74d4
>>>>>>> AIRFLOW-985 |Extend the sqoop operator/hook with additional par|#2177
>>>>>>> |82eb20
>>>>>>> AIRFLOW-984 |Subdags unrecognized when subclassing SubDagOperat|#2152
>>>>>>> |a8bd16
>>>>>>> AIRFLOW-979 |Add GovTech GDS                                   |#2149
>>>>>>> |b17bd3
>>>>>>> AIRFLOW-976 |Mark success running task causes it to fail       |-
>>>> |-
>>>>>>> 
>>>>>>> AIRFLOW-969 |Catch bad python_callable argument at DAG construc|#2142
>>>>>>> |12901d
>>>>>>> AIRFLOW-963 |Some code examples are not rendered in the airflow|#2139
>>>>>>> |f69c1b
>>>>>>> AIRFLOW-960 |Add support for .editorconfig                     |#na
>>>>>>> |f5cacc
>>>>>>> AIRFLOW-959 |.gitignore file is disorganized and incomplete    |#na
>>>>>>> |3d3c14
>>>>>>> AIRFLOW-958 |Improve tooltip readability                       |#2134
>>>>>>> |b3c3eb
>>>>>>> AIRFLOW-950 |Missing AWS integrations on documentation::integra|#2552
>>>>>>> |01be02
>>>>>>> AIRFLOW-947 |Make PrestoHook surface better messages when the P|#na
>>>>>>> |6dd4b3
>>>>>>> AIRFLOW-945 |Revert psycopg2 workaround when psycopg2 2.7.1 is |-
>>>> |-
>>>>>>> 
>>>>>>> AIRFLOW-943 |Add Digital First Media to the Airflow users list |#2115
>>>>>>> |2cfe28
>>>>>>> AIRFLOW-942 |Add mytaxi to Airflow Users                       |#2111
>>>>>>> |d579e6
>>>>>>> AIRFLOW-935 |Impossible to use plugin executors                |#2120
>>>>>>> |08a784
>>>>>>> AIRFLOW-926 |jdbc connector is broken due to jaydebeapi api upd|#2651
>>>>>>> |07ed29
>>>>>>> AIRFLOW-917 |Incorrectly formatted failure status message      |#2109
>>>>>>> |b8164c
>>>>>>> AIRFLOW-916 |Fix ConfigParser deprecation warning              |#2108
>>>>>>> |ef6dd1
>>>>>>> AIRFLOW-911 |Add colouring and profiling info on tests         |#2106
>>>>>>> |4f52db
>>>>>>> AIRFLOW-903 |Add configuration setting for default DAG view.   |#2103
>>>>>>> |cadfae
>>>>>>> AIRFLOW-896 |BigQueryOperator fails to execute with certain inp|#2097
>>>>>>> |2bceee
>>>>>>> AIRFLOW-891 |Webserver Clock Should Include Day                |-
>>>> |-
>>>>>>> 
>>>>>>> AIRFLOW-889 |Minor error in the docstrings for BaseOperator.   |#2084
>>>>>>> |50702d
>>>>>>> AIRFLOW-887 |Add compatibility with future v0.16               |#na
>>>>>>> |50902d
>>>>>>> AIRFLOW-886 |Pass Operator result to post_execute hook         |#na
>>>>>>> |4da361
>>>>>>> AIRFLOW-885 |Add Change.org to the list of Airflow users       |#2089
>>>>>>> |a279be
>>>>>>> AIRFLOW-882 |Code example in docs has unnecessary DAG>>Operator|#2088
>>>>>>> |baa4cd
>>>>>>> AIRFLOW-881 |Create SubDagOperator within DAG context manager w|#2087
>>>>>>> |0ed608
>>>>>>> AIRFLOW-880 |Fix remote log functionality inconsistencies for W|#2086
>>>>>>> |974b75
>>>>>>> AIRFLOW-877 |GoogleCloudStorageDownloadOperator: template_ext c|#2083
>>>>>>> |debc69
>>>>>>> AIRFLOW-875 |Allow HttpSensor params to be templated           |#2080
>>>>>>> |62f503
>>>>>>> AIRFLOW-871 |multiple places use logging.warn() instead of warn|#2082
>>>>>>> |21d775
>>>>>>> AIRFLOW-866 |Add FTPSensor                                     |#2070
>>>>>>> |5f87f8
>>>>>>> AIRFLOW-863 |Example DAG start dates should be recent to avoid |#2068
>>>>>>> |bbfd43
>>>>>>> AIRFLOW-862 |Add DaskExecutor                                  |#2067
>>>>>>> |6e2210
>>>>>>> AIRFLOW-860 |Circular module dependency prevents loading of cus|-
>>>> |-
>>>>>>> 
>>>>>>> AIRFLOW-854 |Add Open Knowledge International to Airflow users |#2061
>>>>>>> |51a311
>>>>>>> AIRFLOW-842 |scheduler.clean_dirty raises warning: SAWarning: T|#2072
>>>>>>> |485280
>>>>>>> AIRFLOW-840 |Python3 encoding issue in Kerberos                |#2158
>>>>>>> |639336
>>>>>>> AIRFLOW-836 |The paused and queryview endpoints are vulnerable |#2054
>>>>>>> |6aca2c
>>>>>>> AIRFLOW-831 |Fix broken unit tests                             |#2050
>>>>>>> |b86194
>>>>>>> AIRFLOW-830 |Plugin manager should log to debug, not info      |-
>>>> |-
>>>>>>> 
>>>>>>> AIRFLOW-829 |Reduce verbosity of successful Travis unit tests  |-
>>>> |-
>>>>>>> 
>>>>>>> AIRFLOW-826 |Add Zendesk Hook                                  |#2066
>>>>>>> |a09762
>>>>>>> AIRFLOW-823 |Make task instance details available via API      |#2045
>>>>>>> |3f546e
>>>>>>> AIRFLOW-822 |Close the connection before throwing exception in |#2038
>>>>>>> |4b6c38
>>>>>>> AIRFLOW-821 |Scheduler dagbag importing not Py3 compatible     |#2039
>>>>>>> |fbb59b
>>>>>>> AIRFLOW-809 |SqlAlchemy is_ ColumnOperator Causing Errors in MS|-
>>>> |-
>>>>>>> 
>>>>>>> AIRFLOW-802 |Integration of spark-submit                       |-
>>>> |-
>>>>>>> 
>>>>>>> AIRFLOW-781 |Allow DataFlowJavaOperator to accept jar file stor|#2037
>>>>>>> |259c86
>>>>>>> AIRFLOW-770 |HDFS hooks should support alternative ways of gett|#2056
>>>>>>> |261b65
>>>>>>> AIRFLOW-756 |Refactor ssh_hook and ssh_operator                |-
>>>> |-
>>>>>>> 
>>>>>>> AIRFLOW-751 |SFTP file transfer functionality                  |#1999
>>>>>>> |fe0ede
>>>>>>> AIRFLOW-725 |Make merge tool use OS' keyring for password stora|#1966
>>>>>>> |8c1695
>>>>>>> AIRFLOW-706 |Configuration shell commands are not split properl|#2053
>>>>>>> |0bb6f2
>>>>>>> AIRFLOW-705 |airflow.configuration.run_command output does not |-
>>>> |-
>>>>>>> 
>>>>>>> AIRFLOW-681 |homepage doc link should point to apache's repo   |#2164
>>>>>>> |a8027a
>>>>>>> AIRFLOW-654 |SSL for AMQP w/ Celery(Executor)                  |#2333
>>>>>>> |868bfe
>>>>>>> AIRFLOW-645 |HttpHook ignores https                            |#2311
>>>>>>> |fd381a
>>>>>>> AIRFLOW-365 |Code view in subdag trigger exception             |#2043
>>>>>>> |cf102c
>>>>>>> AIRFLOW-300 |Add Google Pubsub hook and operator               |#2036
>>>>>>> |d231dc
>>>>>>> AIRFLOW-289 |Use datetime.utcnow() to keep airflow system indep|#2618
>>>>>>> |20c83e
>>>>>>> AIRFLOW-71  |docker_operator - pulling from private repositorie|#na
>>>>>>> |d4406c
>>>>>>> 
>>>>>>> Cheers,
>>>>>>> Chris
>>>>>>> 
>>>>> 
>>>> 
>>>> 
>> 
> 


Re: [VOTE] Airflow 1.9.0rc1

Posted by Ash Berlin-Taylor <as...@firemirror.com>.
And one more - on Python 3 we can't use S3Hook.load_string due to a bytes vs string issue: https://issues.apache.org/jira/browse/AIRFLOW-1796

I'll try and work on fixes for some/all of these today, including adding to and expanding the tests for S3Hook, which it looks like were somewhat lacking.
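A minimal sketch of the bytes-vs-str pitfall behind AIRFLOW-1796, assuming an upload call that accepts only bytes (as boto3 does); the function names here are hypothetical illustrations, not Airflow's actual API:

```python
# Hypothetical sketch (not Airflow's real code): on Python 3, a bytes-only
# upload API rejects a plain str, so a load_string-style helper must
# encode before handing the data off.

def load_bytes(data):
    """Stand-in for an upload call that accepts only bytes."""
    if not isinstance(data, bytes):
        raise TypeError("expected bytes, got %s" % type(data).__name__)
    return len(data)  # pretend "upload", returning the byte count

def load_string(string_data, encoding="utf-8"):
    """A str-accepting wrapper: encode first, then delegate."""
    return load_bytes(string_data.encode(encoding))
```

On Python 2 the str/bytes distinction was blurred, which is why this class of bug only surfaces on Python 3.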

-ash

> On 9 Nov 2017, at 10:54, Ash Berlin-Taylor <as...@firemirror.com> wrote:
> 
> Thanks for picking this up. Your fix should stop the 500 error, but there's another problem (which is ultimately about user misconfiguration): https://issues.apache.org/jira/browse/AIRFLOW-1796 - the fix for that is to update a doc somewhere, and probably to validate this setting is correct at start time.
> 
> 
> I've found another issue related to the arg names of S3Hook. In 1.8.2 it was `s3_conn_id`, but the move to boto3/basing off AWSHook now expects `aws_conn_id`, and various places in the Airflow code base (and a few places in our dags/operators code base) still pass it as `s3_conn_id`. I've created https://issues.apache.org/jira/browse/AIRFLOW-1795 for that issue.
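To illustrate the kind of backward-compatible shim such a rename usually needs (purely a sketch; `s3_hook_kwargs` is a hypothetical helper, not Airflow's API):

```python
import warnings

def s3_hook_kwargs(aws_conn_id="aws_default", **kwargs):
    """Hypothetical compatibility shim: accept the old s3_conn_id kwarg,
    map it to aws_conn_id, and warn instead of failing silently."""
    if "s3_conn_id" in kwargs:
        warnings.warn("s3_conn_id is deprecated; use aws_conn_id instead",
                      DeprecationWarning)
        aws_conn_id = kwargs.pop("s3_conn_id")
    if kwargs:
        raise TypeError("unexpected kwargs: %s" % sorted(kwargs))
    return {"aws_conn_id": aws_conn_id}
```

A shim like this lets existing dags that still pass `s3_conn_id` keep working through one release cycle while emitting a deprecation warning.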
> 
> -ash
> 
> 
>> On 8 Nov 2017, at 18:54, Daniel Huang <dx...@gmail.com> wrote:
>> 
>> Still testing this out.
>> 
>> Put up a small fix for Ash's second exception
>> https://github.com/apache/incubator-airflow/pull/2766
>> 
>> On Wed, Nov 8, 2017 at 10:48 AM, Bolke de Bruin <bd...@gmail.com> wrote:
>> 
>>> Hi Chris,
>>> 
>>> Actively testing here: we found an issue in the SSHOperator introduced in
>>> 1.9.0 (fix already merged for RC2, but blocking, as it stops us from
>>> running SSH properly), some minor fixes by Airbnb should also be in RC2.
>>> There is some logging “weirdness” that might warrant a small patch here and
>>> there and could be squeezed into RC2, but I don’t consider them blocking.
>>> 
>>> So almost there, but we need an RC2 imho.
>>> 
>>> -1, binding.
>>> 
>>> Bolke
>>> 
>>>> On 8 Nov 2017, at 19:00, Ash Berlin-Taylor <ash_airflowlist@firemirror.com> wrote:
>>>> 
>>>> -1 (for now. Non binding. Is that how this process works?)
>>>> 
>>>> We've built a test env for this RC and are testing, but have run into an
>>> issue reading task logs. (See below)
>>>> 
>>>> We haven't gotten very far with this yet; we will dig more tomorrow
>>> (it's the end of the UK work day now). I suspect this might be due to how
>>> we've misconfigured our logging. We will see tomorrow.
>>>> 
>>>> -ash
>>>> 
>>>> 
>>>> 
>>>> 
>>>> File "/usr/local/lib/python3.5/dist-packages/airflow/www/views.py",
>>> line 712, in log
>>>>  logs = handler.read(ti)
>>>> AttributeError: 'NoneType' object has no attribute 'read'
>>>> 
>>>> During handling of the above exception, another exception occurred:
>>>> 
>>>> Traceback (most recent call last):
>>>> File "/usr/local/lib/python3.5/dist-packages/flask/app.py", line 1988,
>>> in wsgi_app
>>>>  response = self.full_dispatch_request()
>>>> File "/usr/local/lib/python3.5/dist-packages/flask/app.py", line 1641,
>>> in full_dispatch_request
>>>>  rv = self.handle_user_exception(e)
>>>> File "/usr/local/lib/python3.5/dist-packages/flask/app.py", line 1544,
>>> in handle_user_exception
>>>>  reraise(exc_type, exc_value, tb)
>>>> File "/usr/local/lib/python3.5/dist-packages/flask/_compat.py", line
>>> 33, in reraise
>>>>  raise value
>>>> File "/usr/local/lib/python3.5/dist-packages/flask/app.py", line 1639,
>>> in full_dispatch_request
>>>>  rv = self.dispatch_request()
>>>> File "/usr/local/lib/python3.5/dist-packages/flask/app.py", line 1625,
>>> in dispatch_request
>>>>  return self.view_functions[rule.endpoint](**req.view_args)
>>>> File "/usr/local/lib/python3.5/dist-packages/flask_admin/base.py",
>>> line 69, in inner
>>>>  return self._run_view(f, *args, **kwargs)
>>>> File "/usr/local/lib/python3.5/dist-packages/flask_admin/base.py",
>>> line 368, in _run_view
>>>>  return fn(self, *args, **kwargs)
>>>> File "/usr/local/lib/python3.5/dist-packages/flask_login.py", line
>>> 758, in decorated_view
>>>>  return func(*args, **kwargs)
>>>> File "/usr/local/lib/python3.5/dist-packages/airflow/www/utils.py",
>>> line 262, in wrapper
>>>>  return f(*args, **kwargs)
>>>> File "/usr/local/lib/python3.5/dist-packages/airflow/www/views.py",
>>> line 715, in log
>>>>  .format(task_log_reader, e.message)]
>>>> AttributeError: 'AttributeError' object has no attribute 'message'
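The second traceback is itself a Python 3 incompatibility: exceptions there no longer carry a `.message` attribute, so the except-handler in views.py that formatted `e.message` raised a second `AttributeError`. A small illustration of the failure and the portable `str(e)` pattern (a sketch of the general fix, not the actual patch):

```python
# Python 3 exceptions have no .message attribute; formatting e.message in
# an error handler (as views.py did above) raises a second AttributeError.
# str(e) is the portable way to get the message text.
try:
    raise AttributeError("'NoneType' object has no attribute 'read'")
except AttributeError as e:
    assert not hasattr(e, "message")  # accessing e.message would raise here
    detail = str(e)                   # safe on both Python 2 and 3
```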
>>>> 
>>>> 
>>>>> On 8 Nov 2017, at 17:46, Chris Riccomini <cr...@apache.org> wrote:
>>>>> 
>>>>> Anyone? :/
>>>>> 
>>>>> On Mon, Nov 6, 2017 at 1:22 PM, Chris Riccomini <cr...@apache.org>
>>>>> wrote:
>>>>> 
>>>>>> Hey all,
>>>>>> 
>>>>>> I have cut Airflow 1.9.0 RC1. This email is calling a vote on the
>>> release,
>>>>>> which will last for 72 hours. Consider this my (binding) +1.
>>>>>> 
>>>>>> Airflow 1.9.0 RC1 is available at:
>>>>>> 
>>>>>> https://dist.apache.org/repos/dist/dev/incubator/airflow/1.9.0rc1/
>>>>>> 
>>>>>> apache-airflow-1.9.0rc1+incubating-source.tar.gz is a source release
>>> that
>>>>>> comes with INSTALL instructions.
>>>>>> apache-airflow-1.9.0rc1+incubating-bin.tar.gz is the binary Python
>>>>>> "sdist" release.
>>>>>> 
>>>>>> Public keys are available at:
>>>>>> 
>>>>>> https://dist.apache.org/repos/dist/release/incubator/airflow/
>>>>>> 
>>>>>> The release contains the following JIRAs:
>>>>>> 
>>>>>> ISSUE ID    |DESCRIPTION                                       |PR
>>>>>> |COMMIT
>>>>>> AIRFLOW-1779|Add keepalive packets to ssh hook                 |#2749
>>>>>> |d2f9d1
>>>>>> AIRFLOW-1776|stdout/stderr logging not captured                |#2745
>>>>>> |590d9f
>>>>>> AIRFLOW-1771|Change heartbeat text from boom to heartbeat      |-
>>> |-
>>>>>> 
>>>>>> AIRFLOW-1767|Airflow Scheduler no longer schedules DAGs        |-
>>> |-
>>>>>> 
>>>>>> AIRFLOW-1765|Default API auth backed should deny all.          |#2737
>>>>>> |6ecdac
>>>>>> AIRFLOW-1764|Web Interface should not use experimental api     |#2738
>>>>>> |6bed1d
>>>>>> AIRFLOW-1757|Contrib.SparkSubmitOperator should allow --package|#2725
>>>>>> |4e06ee
>>>>>> AIRFLOW-1745|BashOperator ignores SIGPIPE in subprocess        |#2714
>>>>>> |e021c9
>>>>>> AIRFLOW-1744|task.retries can be False                         |#2713
>>>>>> |6144c6
>>>>>> AIRFLOW-1743|Default config template should not contain ldap fi|#2712
>>>>>> |270684
>>>>>> AIRFLOW-1741|Task Duration shows two charts on first page load.|#2711
>>>>>> |974b49
>>>>>> AIRFLOW-1734|Sqoop Operator contains logic errors & needs optio|#2703
>>>>>> |f6810c
>>>>>> AIRFLOW-1731|Import custom config on PYTHONPATH                |#2721
>>>>>> |f07eb3
>>>>>> AIRFLOW-1726|Copy Expert command for Postgres Hook             |#2698
>>>>>> |8a4ad3
>>>>>> AIRFLOW-1719|Fix small typo - your vs you                      |-
>>> |-
>>>>>> 
>>>>>> AIRFLOW-1712|Log SSHOperator output                            |-
>>> |-
>>>>>> 
>>>>>> AIRFLOW-1711|Ldap Attributes not always a "list" part 2        |#2731
>>>>>> |40a936
>>>>>> AIRFLOW-1706|Scheduler is failed on startup with MS SQL Server |#2733
>>>>>> |9e209b
>>>>>> AIRFLOW-1698|Remove confusing SCHEDULER_RUNS env var from syste|#2677
>>>>>> |00dd06
>>>>>> AIRFLOW-1695|Redshift Hook using boto3 & AWS Hook              |#2717
>>>>>> |bfddae
>>>>>> AIRFLOW-1694|Hive Hooks: Python 3 does not have an `itertools.i|#2674
>>>>>> |c6e5ae
>>>>>> AIRFLOW-1692|Master cannot be checked out on windows           |#2673
>>>>>> |31805e
>>>>>> AIRFLOW-1691|Add better documentation for Google cloud storage |#2671
>>>>>> |ace2b1
>>>>>> AIRFLOW-1690|Error messages regarding gcs log commits are spars|#2670
>>>>>> |5fb5cd
>>>>>> AIRFLOW-1682|S3 task handler never writes to S3                |#2664
>>>>>> |0080f0
>>>>>> AIRFLOW-1678|Fix docstring errors for `set_upstream` and `set_d|-
>>> |-
>>>>>> 
>>>>>> AIRFLOW-1677|Fix typo in example_qubole_operator               |-
>>> |-
>>>>>> 
>>>>>> AIRFLOW-1676|GCS task handler never writes to GCS              |#2659
>>>>>> |781fa4
>>>>>> AIRFLOW-1675|Fix API docstrings to be properly rendered        |#2667
>>>>>> |f12381
>>>>>> AIRFLOW-1671|Missing @apply_defaults annotation for gcs downloa|#2655
>>>>>> |97666b
>>>>>> AIRFLOW-1669|Fix Docker import in Master                       |#na
>>>>>> |f7f2a8
>>>>>> AIRFLOW-1668|Redshift requires a keep alive of < 300s          |#2650
>>>>>> |f2bb77
>>>>>> AIRFLOW-1664|Make MySqlToGoogleCloudStorageOperator support bin|#2649
>>>>>> |95813d
>>>>>> AIRFLOW-1660|Change webpage width to full-width                |#2646
>>>>>> |8ee3d9
>>>>>> AIRFLOW-1659|Fix invalid attribute bug in FileTaskHandler      |#2645
>>>>>> |bee823
>>>>>> AIRFLOW-1658|Kill (possibly) still running Druid indexing job a|#2644
>>>>>> |cbf7ad
>>>>>> AIRFLOW-1657|Handle failure of Qubole Operator for s3distcp had|-
>>> |-
>>>>>> 
>>>>>> AIRFLOW-1654|Show tooltips for link icons in DAGs view         |#2642
>>>>>> |ada7b2
>>>>>> AIRFLOW-1647|Fix Spark-sql hook                                |#2637
>>>>>> |b1e5c6
>>>>>> AIRFLOW-1641|Task gets stuck in queued state                   |#2715
>>>>>> |735497
>>>>>> AIRFLOW-1640|Add Qubole default connection in connection table |-
>>> |-
>>>>>> 
>>>>>> AIRFLOW-1639|ValueError does not have .message attribute       |#2629
>>>>>> |87df67
>>>>>> AIRFLOW-1637|readme not tracking master branch for travis      |-
>>> |-
>>>>>> 
>>>>>> AIRFLOW-1636|aws and emr connection types get cleared          |#2626
>>>>>> |540e04
>>>>>> AIRFLOW-1635|Allow creating Google Cloud Platform connection wi|#2640
>>>>>> |6dec7a
>>>>>> AIRFLOW-1629|make extra a textarea in edit connections form    |#2623
>>>>>> |f5d46f
>>>>>> AIRFLOW-1628|Docstring of sqlsensor is incorrect               |#2621
>>>>>> |9ba73d
>>>>>> AIRFLOW-1627|SubDagOperator initialization should only query po|#2620
>>>>>> |516ace
>>>>>> AIRFLOW-1621|Add tests for logic added on server side dag list |#2614
>>>>>> |8de9fd
>>>>>> AIRFLOW-1614|Improve performance of DAG parsing when there are |#2610
>>>>>> |a95adb
>>>>>> AIRFLOW-1611|Customize logging in Airflow                      |#2631
>>>>>> |8b4a50
>>>>>> AIRFLOW-1609|Ignore all venvs in gitignore                     |#2608
>>>>>> |f1f9b4
>>>>>> AIRFLOW-1608|GCP Dataflow hook missing pending job state       |#2607
>>>>>> |653562
>>>>>> AIRFLOW-1606|DAG.sync_to_db is static, but takes a DAG as first|#2606
>>>>>> |6ac296
>>>>>> AIRFLOW-1605|Fix log source of local loggers                   |-
>>> |-
>>>>>> 
>>>>>> AIRFLOW-1604|Rename the logger to log                          |#2604
>>>>>> |af4050
>>>>>> AIRFLOW-1602|Use LoggingMixin for the DAG class                |#2602
>>>>>> |956699
>>>>>> AIRFLOW-1601|Add configurable time between SIGTERM and SIGKILL |#2601
>>>>>> |48a95e
>>>>>> AIRFLOW-1600|Uncaught exceptions in get_fernet if cryptography |#2600
>>>>>> |ad963e
>>>>>> AIRFLOW-1597|Add GameWisp as Airflow user                      |#2599
>>>>>> |26b747
>>>>>> AIRFLOW-1594|Installing via pip copies test files into python l|#2597
>>>>>> |a6b23a
>>>>>> AIRFLOW-1593|Expose load_string in WasbHook                    |#2596
>>>>>> |7ece95
>>>>>> AIRFLOW-1591|Exception: 'TaskInstance' object has no attribute |#2578
>>>>>> |f4653e
>>>>>> AIRFLOW-1590|Small fix for dates util                          |#2652
>>>>>> |31946e
>>>>>> AIRFLOW-1587|fix `ImportError: cannot import name 'CeleryExecut|#2590
>>>>>> |34c73b
>>>>>> AIRFLOW-1586|MySQL to GCS to BigQuery fails for tables with dat|#2589
>>>>>> |e83012
>>>>>> AIRFLOW-1584|Remove the insecure /headers endpoints            |#2588
>>>>>> |17ac07
>>>>>> AIRFLOW-1582|Improve logging structure of Airflow              |#2592
>>>>>> |a7a518
>>>>>> AIRFLOW-1580|Error in string formatter when throwing an excepti|#2583
>>>>>> |ea9ab9
>>>>>> AIRFLOW-1579|Allow jagged rows in BQ Hook.                     |#2582
>>>>>> |5b978b
>>>>>> AIRFLOW-1577|Add token support to DatabricksHook               |#2579
>>>>>> |c2c515
>>>>>> AIRFLOW-1573|Remove `thrift < 0.10.0` requirement              |#2574
>>>>>> |aa95f2
>>>>>> AIRFLOW-1571|Add AWS Lambda Hook for invoking Lambda Function  |#2718
>>>>>> |017f18
>>>>>> AIRFLOW-1568|Add datastore import/export operator              |#2568
>>>>>> |86063b
>>>>>> AIRFLOW-1567|Clean up ML Engine operators                      |#2567
>>>>>> |af91e2
>>>>>> AIRFLOW-1564|Default logging filename contains a colon         |#2565
>>>>>> |4c674c
>>>>>> AIRFLOW-1560|Add AWS DynamoDB hook for inserting batch items   |#2587
>>>>>> |71400b
>>>>>> AIRFLOW-1556|BigQueryBaseCursor should support SQL parameters  |#2557
>>>>>> |9df0ac
>>>>>> AIRFLOW-1546| add Zymergen to org list in README               |#2512
>>>>>> |7cc346
>>>>>> AIRFLOW-1535|Add support for Dataproc serviceAccountScopes in D|#2546
>>>>>> |b1f902
>>>>>> AIRFLOW-1529|Support quoted newlines in Google BigQuery load jo|#2545
>>>>>> |4a4b02
>>>>>> AIRFLOW-1527|Refactor celery config to make use of template    |#2542
>>>>>> |f4437b
>>>>>> AIRFLOW-1522|Increase size of val column for variable table in |#2535
>>>>>> |8a2d24
>>>>>> AIRFLOW-1521|Template fields definition for bigquery_table_dele|#2534
>>>>>> |f1a7c0
>>>>>> AIRFLOW-1520|S3Hook uses boto2                                 |#2532
>>>>>> |386583
>>>>>> AIRFLOW-1519|Main DAG list page does not scale using client sid|#2531
>>>>>> |d7d7ce
>>>>>> AIRFLOW-1512|Add operator for running Python functions in a vir|#2446
>>>>>> |14e6d7
>>>>>> AIRFLOW-1507|Make src, dst and bucket parameters as templated i|#2516
>>>>>> |d295cf
>>>>>> AIRFLOW-1505|Document when Jinja substitution occurs           |#2523
>>>>>> |984a87
>>>>>> AIRFLOW-1504|Log Cluster Name on Dataproc Operator When Execute|#2517
>>>>>> |1cd6c4
>>>>>> AIRFLOW-1499|Eliminate duplicate and unneeded code             |-
>>> |-
>>>>>> 
>>>>>> AIRFLOW-1497|Hidden fields in connection form aren't reset when|#2507
>>>>>> |d8da8b
>>>>>> AIRFLOW-1493|Fix race condition with airflow run               |#2505
>>>>>> |b2e175
>>>>>> AIRFLOW-1492|Add metric for task success/failure               |#2504
>>>>>> |fa84d4
>>>>>> AIRFLOW-1489|Docs: Typo in BigQueryCheckOperator               |#2501
>>>>>> |111ce5
>>>>>> AIRFLOW-1483|Page size on model views is to large to render qui|#2497
>>>>>> |04bfba
>>>>>> AIRFLOW-1478|Chart -> Owner column should be sortable          |#2493
>>>>>> |651e60
>>>>>> AIRFLOW-1476|Add INSTALL file for source releases              |#2492
>>>>>> |da76ac
>>>>>> AIRFLOW-1474|Add dag_id regex for 'airflow clear' CLI command  |#2486
>>>>>> |18f849
>>>>>> AIRFLOW-1470|BashSensor Implementation                         |-
>>> |-
>>>>>> 
>>>>>> AIRFLOW-1459|integration rst doc is broken in github view      |#2481
>>>>>> |322ec9
>>>>>> AIRFLOW-1438|Scheduler batch queries should have a limit       |#2462
>>>>>> |3547cb
>>>>>> AIRFLOW-1437|BigQueryTableDeleteOperator should define deletion|#2459
>>>>>> |b87903
>>>>>> AIRFLOW-1432|NVD3 Charts do not have labeled axes and units cha|#2710
>>>>>> |70ffa4
>>>>>> AIRFLOW-1402|Cleanup SafeConfigParser DeprecationWarning       |#2435
>>>>>> |38c86b
>>>>>> AIRFLOW-1401|Standardize GCP project, region, and zone argument|#2439
>>>>>> |b6d363
>>>>>> AIRFLOW-1397|Airflow 1.8.1 - No data displays in Last Run Colum|-
>>> |-
>>>>>> 
>>>>>> AIRFLOW-1394|Add quote_character parameter to GoogleCloudStorag|#2428
>>>>>> |9fd0be
>>>>>> AIRFLOW-1389|BigQueryOperator should support `createDisposition|#2470
>>>>>> |6e2640
>>>>>> AIRFLOW-1384|Add ARGO/CaDC                                     |#2434
>>>>>> |715947
>>>>>> AIRFLOW-1368|Automatically remove the container when it exits  |#2653
>>>>>> |d42d23
>>>>>> AIRFLOW-1359|Provide GoogleCloudML operator for model evaluatio|#2407
>>>>>> |194d1d
>>>>>> AIRFLOW-1356|add `--celery_hostname` to `airflow worker`       |#2405
>>>>>> |b9d7d1
>>>>>> AIRFLOW-1352|Revert bad logging Handler                        |-
>>> |-
>>>>>> 
>>>>>> AIRFLOW-1350|Add "query_uri" parameter for Google DataProc oper|#2402
>>>>>> |d32c72
>>>>>> AIRFLOW-1348|Paginated UI has broken toggles after first page  |-
>>> |-
>>>>>> 
>>>>>> AIRFLOW-1345|Don't commit on each loop                         |#2397
>>>>>> |0dd002
>>>>>> AIRFLOW-1344|Builds failing on Python 3.5 with AttributeError  |#2394
>>>>>> |2a5883
>>>>>> AIRFLOW-1343|Add airflow default label to the dataproc operator|#2396
>>>>>> |e4b240
>>>>>> AIRFLOW-1338|gcp_dataflow_hook is incompatible with the recent |#2388
>>>>>> |cf2605
>>>>>> AIRFLOW-1337|Customize log format via config file              |#2392
>>>>>> |4841e3
>>>>>> AIRFLOW-1335|Use buffered logger                               |#2386
>>>>>> |0d23d3
>>>>>> AIRFLOW-1333|Enable copy function for Google Cloud Storage Hook|#2385
>>>>>> |e2c383
>>>>>> AIRFLOW-1331|Contrib.SparkSubmitOperator should allow --package|#2622
>>>>>> |fbca8f
>>>>>> AIRFLOW-1330|Connection.parse_from_uri doesn't work for google_|#2525
>>>>>> |6e5e9d
>>>>>> AIRFLOW-1324|Make the Druid operator/hook more general         |#2378
>>>>>> |de99aa
>>>>>> AIRFLOW-1323|Operators related to Dataproc should keep some par|#2636
>>>>>> |ed248d
>>>>>> AIRFLOW-1315|Add Qubole File and Partition Sensors             |-
>>> |-
>>>>>> 
>>>>>> AIRFLOW-1309|Add optional hive_tblproperties in HiveToDruidTran|-
>>> |-
>>>>>> 
>>>>>> AIRFLOW-1301|Add New Relic to Airflow user list                |#2359
>>>>>> |355fc9
>>>>>> AIRFLOW-1299|Google Dataproc cluster creation operator should s|#2358
>>>>>> |c2b80e
>>>>>> AIRFLOW-1289|Don't restrict scheduler threads to CPU cores     |#2353
>>>>>> |8e23d2
>>>>>> AIRFLOW-1286|BaseTaskRunner - Exception TypeError: a bytes-like|#2363
>>>>>> |d8891d
>>>>>> AIRFLOW-1277|Forbid creation of a known event with empty fields|#na
>>>>>> |65184a
>>>>>> AIRFLOW-1276|Forbid event creation with end_date earlier than s|#na
>>>>>> |d5d02f
>>>>>> AIRFLOW-1275|Fix `airflow pool` command exception              |#2346
>>>>>> |9958aa
>>>>>> AIRFLOW-1273|Google Cloud ML Version and Model CRUD Operator   |#2379
>>>>>> |534a0e
>>>>>> AIRFLOW-1272|Google Cloud ML Batch Prediction Operator         |#2390
>>>>>> |e92d6b
>>>>>> AIRFLOW-1271|Google Cloud ML Training Operator                 |#2408
>>>>>> |0fc450
>>>>>> AIRFLOW-1256|Add United Airlines as Airflow user               |#2332
>>>>>> |d3484a
>>>>>> AIRFLOW-1251|Add eRevalue as an Airflow user                   |#2331
>>>>>> |8d5160
>>>>>> AIRFLOW-1248|Fix inconsistent configuration name for worker tim|#2328
>>>>>> |92314f
>>>>>> AIRFLOW-1247|CLI: ignore all dependencies argument ignored     |#2441
>>>>>> |e88ecf
>>>>>> AIRFLOW-1245|Fix random failure of test_trigger_dag_for_date un|#2325
>>>>>> |cef01b
>>>>>> AIRFLOW-1244|Forbid creation of a pool with empty name         |#2324
>>>>>> |df9a10
>>>>>> AIRFLOW-1242|BigQueryHook assumes that a valid project_id can't|#2335
>>>>>> |ffe616
>>>>>> AIRFLOW-1237|Fix IN-predicate sqlalchemy warning               |#2320
>>>>>> |a1f422
>>>>>> AIRFLOW-1234|Cover utils.operator_helpers with unit tests      |#2317
>>>>>> |d16537
>>>>>> AIRFLOW-1233|Cover utils.json with unit tests                  |#2316
>>>>>> |502410
>>>>>> AIRFLOW-1232|Remove deprecated readfp warning                  |#2315
>>>>>> |6ffaaf
>>>>>> AIRFLOW-1231|Use flask_wtf.CSRFProtect instead of flask_wtf.Csr|#2313
>>>>>> |cac49e
>>>>>> AIRFLOW-1221|Fix DatabricksSubmitRunOperator Templating        |#2308
>>>>>> |0fa104
>>>>>> AIRFLOW-1217|Enable logging in Sqoop hook                      |#2307
>>>>>> |4f459b
>>>>>> AIRFLOW-1213|Add hcatalog parameters to the sqoop operator/hook|#2305
>>>>>> |857850
>>>>>> AIRFLOW-1208|Speed-up cli tests                                |#2301
>>>>>> |21c142
>>>>>> AIRFLOW-1207|Enable utils.helpers unit tests                   |#2300
>>>>>> |8ac87b
>>>>>> AIRFLOW-1203|Tests failing after oauth upgrade                 |#2296
>>>>>> |3e9c66
>>>>>> AIRFLOW-1201|Update deprecated 'nose-parameterized' library to |#2298
>>>>>> |d2d3e4
>>>>>> AIRFLOW-1193|Add Checkr to Airflow user list                   |#2276
>>>>>> |707238
>>>>>> AIRFLOW-1189|Get pandas DataFrame using BigQueryHook fails     |#2287
>>>>>> |93666f
>>>>>> AIRFLOW-1188|Add max_bad_records param to GoogleCloudStorageToB|#2286
>>>>>> |443e6b
>>>>>> AIRFLOW-1187|Obsolete package names in documentation           |-
>>> |-
>>>>>> 
>>>>>> AIRFLOW-1185|Incorrect url to PyPi                             |#2283
>>>>>> |829755
>>>>>> AIRFLOW-1181|Enable delete and list function for Google Cloud S|#2281
>>>>>> |24f73c
>>>>>> AIRFLOW-1179|Pandas 0.20 broke Google BigQuery hook            |#2279
>>>>>> |ac9ccb
>>>>>> AIRFLOW-1177|variable json deserialize does not work at set def|#2540
>>>>>> |65319a
>>>>>> AIRFLOW-1175|Add Pronto Tools to Airflow user list             |#2277
>>>>>> |86aafa
>>>>>> AIRFLOW-1173|Add Robinhood to list of Airflow users            |#2271
>>>>>> |379115
>>>>>> AIRFLOW-1165|airflow webservice crashes on ubuntu16 - python3  |-
>>> |-
>>>>>> 
>>>>>> AIRFLOW-1160|Update SparkSubmitOperator parameters             |#2265
>>>>>> |2e3f07
>>>>>> AIRFLOW-1155|Add Tails.com to community                        |#2261
>>>>>> |2fa690
>>>>>> AIRFLOW-1149|Allow custom filters to be added to jinja2        |#2258
>>>>>> |48135a
>>>>>> AIRFLOW-1141|Remove DAG.crawl_for_tasks method                 |#2275
>>>>>> |a30fee
>>>>>> AIRFLOW-1140|DatabricksSubmitRunOperator should template the "j|#2255
>>>>>> |e6d316
>>>>>> AIRFLOW-1136|Invalid parameters are not captured for Sqoop oper|#2252
>>>>>> |2ef4db
>>>>>> AIRFLOW-1125|Clarify documentation regarding fernet_key        |#2251
>>>>>> |831f8d
>>>>>> AIRFLOW-1122|Node strokes are too thin for people with color vi|#2246
>>>>>> |a08761
>>>>>> AIRFLOW-1121|airflow webserver --pid no longer write out pid fi|-
>>> |-
>>>>>> 
>>>>>> AIRFLOW-1118|Add evo.company to Airflow users                  |#2243
>>>>>> |f16914
>>>>>> AIRFLOW-1112|Log which pool is full in scheduler when pool slot|#2242
>>>>>> |74c1ce
>>>>>> AIRFLOW-1107|Add support for ftps non-default port             |#2240
>>>>>> |4d0c2f
>>>>>> AIRFLOW-1106|Add Groupalia/Letsbonus                           |#2239
>>>>>> |945b42
>>>>>> AIRFLOW-1095|ldap_auth memberOf should come from configuration |#2232
>>>>>> |6b1c32
>>>>>> AIRFLOW-1094|Invalid unit tests under `contrib/`               |#2234
>>>>>> |219c50
>>>>>> AIRFLOW-1091|As a release manager I want to be able to compare |#2231
>>>>>> |bfae42
>>>>>> AIRFLOW-1090|Add HBO                                           |#2230
>>>>>> |177d34
>>>>>> AIRFLOW-1089|Add Spark application arguments to SparkSubmitOper|#2229
>>>>>> |e5b914
>>>>>> AIRFLOW-1081|Task duration page is slow                        |#2226
>>>>>> |0da512
>>>>>> AIRFLOW-1075|Cleanup security docs                             |#2222
>>>>>> |5a6f18
>>>>>> AIRFLOW-1065|Add functionality for Azure Blob Storage          |#2216
>>>>>> |f1bc5f
>>>>>> AIRFLOW-1059|Reset_state_for_orphaned_task should operate in ba|#2205
>>>>>> |e05d3b
>>>>>> AIRFLOW-1058|Improvements for SparkSubmitOperator              |-
>>> |-
>>>>>> 
>>>>>> AIRFLOW-1051|Add a test for resetdb to CliTests                |#2198
>>>>>> |15aee0
>>>>>> AIRFLOW-1047|Airflow logs vulnerable to XSS                    |#2193
>>>>>> |fe9ebe
>>>>>> AIRFLOW-1045|Make log level configurable via airflow.cfg       |#2191
>>>>>> |e739a5
>>>>>> AIRFLOW-1043|Documentation issues for operators                |#2188
>>>>>> |b55f41
>>>>>> AIRFLOW-1041|DockerOperator replaces its xcom_push method with |#2274
>>>>>> |03704c
>>>>>> AIRFLOW-1040|Fix typos in comments/docstrings in models.py     |#2174
>>>>>> |d8c0f5
>>>>>> AIRFLOW-1036|Exponential backoff should use randomization      |#2262
>>>>>> |66168e
>>>>>> AIRFLOW-1035|Exponential backoff retry logic should use 2 as ba|#2196
>>>>>> |4ec932
>>>>>> AIRFLOW-1034|Make it possible to connect to S3 in sigv4 regions|#2181
>>>>>> |4c0905
>>>>>> AIRFLOW-1031|'scheduled__' may replace with DagRun.ID_PREFIX in|#2613
>>>>>> |aa3844
>>>>>> AIRFLOW-1030|HttpHook error when creating HttpSensor           |-
>>> |-
>>>>>> 
>>>>>> AIRFLOW-1028|Databricks Operator for Airflow                   |#2202
>>>>>> |53ca50
>>>>>> AIRFLOW-1024|Handle CeleryExecutor errors gracefully           |#2355
>>>>>> |7af20f
>>>>>> AIRFLOW-1018|Scheduler DAG processes can not log to stdout     |#2728
>>>>>> |ef775d
>>>>>> AIRFLOW-1016|Allow HTTP HEAD request method on HTTPSensor      |#2175
>>>>>> |4c41f6
>>>>>> AIRFLOW-1010|Add a convenience script for signing              |#2169
>>>>>> |a2b65a
>>>>>> AIRFLOW-1009|Remove SQLOperator from Concepts page             |#2168
>>>>>> |7d1144
>>>>>> AIRFLOW-1007|Jinja sandbox is vulnerable to RCE                |#2184
>>>>>> |daa281
>>>>>> AIRFLOW-1005|Speed up Airflow startup time                     |#na
>>>>>> |996dd3
>>>>>> AIRFLOW-999 |Support for Redis database                        |#2165
>>>>>> |8de850
>>>>>> AIRFLOW-997 |Change setup.cfg to point to Apache instead of Max|#na
>>>>>> |75cd46
>>>>>> AIRFLOW-995 |Update Github PR template                         |#2163
>>>>>> |b62485
>>>>>> AIRFLOW-994 |Add MiNODES to the AIRFLOW Active Users List      |#2159
>>>>>> |ca1623
>>>>>> AIRFLOW-991 |Mark_success while a task is running leads to fail|-
>>> |-
>>>>>> 
>>>>>> AIRFLOW-990 |DockerOperator fails when logging unicode string  |#2155
>>>>>> |6bbf54
>>>>>> AIRFLOW-988 |SLA Miss Callbacks Are Repeated if Email is Not be|#2415
>>>>>> |6e74d4
>>>>>> AIRFLOW-985 |Extend the sqoop operator/hook with additional par|#2177
>>>>>> |82eb20
>>>>>> AIRFLOW-984 |Subdags unrecognized when subclassing SubDagOperat|#2152
>>>>>> |a8bd16
>>>>>> AIRFLOW-979 |Add GovTech GDS                                   |#2149
>>>>>> |b17bd3
>>>>>> AIRFLOW-976 |Mark success running task causes it to fail       |-
>>> |-
>>>>>> 
>>>>>> AIRFLOW-969 |Catch bad python_callable argument at DAG construc|#2142
>>>>>> |12901d
>>>>>> AIRFLOW-963 |Some code examples are not rendered in the airflow|#2139
>>>>>> |f69c1b
>>>>>> AIRFLOW-960 |Add support for .editorconfig                     |#na
>>>>>> |f5cacc
>>>>>> AIRFLOW-959 |.gitignore file is disorganized and incomplete    |#na
>>>>>> |3d3c14
>>>>>> AIRFLOW-958 |Improve tooltip readability                       |#2134
>>>>>> |b3c3eb
>>>>>> AIRFLOW-950 |Missing AWS integrations on documentation::integra|#2552
>>>>>> |01be02
>>>>>> AIRFLOW-947 |Make PrestoHook surface better messages when the P|#na
>>>>>> |6dd4b3
>>>>>> AIRFLOW-945 |Revert psycopg2 workaround when psycopg2 2.7.1 is |-
>>> |-
>>>>>> 
>>>>>> AIRFLOW-943 |Add Digital First Media to the Airflow users list |#2115
>>>>>> |2cfe28
>>>>>> AIRFLOW-942 |Add mytaxi to Airflow Users                       |#2111
>>>>>> |d579e6
>>>>>> AIRFLOW-935 |Impossible to use plugin executors                |#2120
>>>>>> |08a784
>>>>>> AIRFLOW-926 |jdbc connector is broken due to jaydebeapi api upd|#2651
>>>>>> |07ed29
>>>>>> AIRFLOW-917 |Incorrectly formatted failure status message      |#2109
>>>>>> |b8164c
>>>>>> AIRFLOW-916 |Fix ConfigParser deprecation warning              |#2108
>>>>>> |ef6dd1
>>>>>> AIRFLOW-911 |Add colouring and profiling info on tests         |#2106
>>>>>> |4f52db
>>>>>> AIRFLOW-903 |Add configuration setting for default DAG view.   |#2103
>>>>>> |cadfae
>>>>>> AIRFLOW-896 |BigQueryOperator fails to execute with certain inp|#2097
>>>>>> |2bceee
>>>>>> AIRFLOW-891 |Webserver Clock Should Include Day                |-
>>> |-
>>>>>> 
>>>>>> AIRFLOW-889 |Minor error in the docstrings for BaseOperator.   |#2084
>>>>>> |50702d
>>>>>> AIRFLOW-887 |Add compatibility with future v0.16               |#na
>>>>>> |50902d
>>>>>> AIRFLOW-886 |Pass Operator result to post_execute hook         |#na
>>>>>> |4da361
>>>>>> AIRFLOW-885 |Add Change.org to the list of Airflow users       |#2089
>>>>>> |a279be
>>>>>> AIRFLOW-882 |Code example in docs has unnecessary DAG>>Operator|#2088
>>>>>> |baa4cd
>>>>>> AIRFLOW-881 |Create SubDagOperator within DAG context manager w|#2087
>>>>>> |0ed608
>>>>>> AIRFLOW-880 |Fix remote log functionality inconsistencies for W|#2086
>>>>>> |974b75
>>>>>> AIRFLOW-877 |GoogleCloudStorageDownloadOperator: template_ext c|#2083
>>>>>> |debc69
>>>>>> AIRFLOW-875 |Allow HttpSensor params to be templated           |#2080
>>>>>> |62f503
>>>>>> AIRFLOW-871 |multiple places use logging.warn() instead of warn|#2082
>>>>>> |21d775
>>>>>> AIRFLOW-866 |Add FTPSensor                                     |#2070
>>>>>> |5f87f8
>>>>>> AIRFLOW-863 |Example DAG start dates should be recent to avoid |#2068
>>>>>> |bbfd43
>>>>>> AIRFLOW-862 |Add DaskExecutor                                  |#2067
>>>>>> |6e2210
>>>>>> AIRFLOW-860 |Circular module dependency prevents loading of cus|-
>>> |-
>>>>>> 
>>>>>> AIRFLOW-854 |Add Open Knowledge International to Airflow users |#2061
>>>>>> |51a311
>>>>>> AIRFLOW-842 |scheduler.clean_dirty raises warning: SAWarning: T|#2072
>>>>>> |485280
>>>>>> AIRFLOW-840 |Python3 encoding issue in Kerberos                |#2158
>>>>>> |639336
>>>>>> AIRFLOW-836 |The paused and queryview endpoints are vulnerable |#2054
>>>>>> |6aca2c
>>>>>> AIRFLOW-831 |Fix broken unit tests                             |#2050
>>>>>> |b86194
>>>>>> AIRFLOW-830 |Plugin manager should log to debug, not info      |-
>>> |-
>>>>>> 
>>>>>> AIRFLOW-829 |Reduce verbosity of successful Travis unit tests  |-
>>> |-
>>>>>> 
>>>>>> AIRFLOW-826 |Add Zendesk Hook                                  |#2066
>>>>>> |a09762
>>>>>> AIRFLOW-823 |Make task instance details available via API      |#2045
>>>>>> |3f546e
>>>>>> AIRFLOW-822 |Close the connection before throwing exception in |#2038
>>>>>> |4b6c38
>>>>>> AIRFLOW-821 |Scheduler dagbag importing not Py3 compatible     |#2039
>>>>>> |fbb59b
>>>>>> AIRFLOW-809 |SqlAlchemy is_ ColumnOperator Causing Errors in MS|-
>>> |-
>>>>>> 
>>>>>> AIRFLOW-802 |Integration of spark-submit                       |-
>>> |-
>>>>>> 
>>>>>> AIRFLOW-781 |Allow DataFlowJavaOperator to accept jar file stor|#2037
>>>>>> |259c86
>>>>>> AIRFLOW-770 |HDFS hooks should support alternative ways of gett|#2056
>>>>>> |261b65
>>>>>> AIRFLOW-756 |Refactor ssh_hook and ssh_operator                |-
>>> |-
>>>>>> 
>>>>>> AIRFLOW-751 |SFTP file transfer functionality                  |#1999
>>>>>> |fe0ede
>>>>>> AIRFLOW-725 |Make merge tool use OS' keyring for password stora|#1966
>>>>>> |8c1695
>>>>>> AIRFLOW-706 |Configuration shell commands are not split properl|#2053
>>>>>> |0bb6f2
>>>>>> AIRFLOW-705 |airflow.configuration.run_command output does not |-
>>> |-
>>>>>> 
>>>>>> AIRFLOW-681 |homepage doc link should point to apache's repo   |#2164
>>>>>> |a8027a
>>>>>> AIRFLOW-654 |SSL for AMQP w/ Celery(Executor)                  |#2333
>>>>>> |868bfe
>>>>>> AIRFLOW-645 |HttpHook ignores https                            |#2311
>>>>>> |fd381a
>>>>>> AIRFLOW-365 |Code view in subdag trigger exception             |#2043
>>>>>> |cf102c
>>>>>> AIRFLOW-300 |Add Google Pubsub hook and operator               |#2036
>>>>>> |d231dc
>>>>>> AIRFLOW-289 |Use datetime.utcnow() to keep airflow system indep|#2618
>>>>>> |20c83e
>>>>>> AIRFLOW-71  |docker_operator - pulling from private repositorie|#na
>>>>>> |d4406c
>>>>>> 
>>>>>> Cheers,
>>>>>> Chris
>>>>>> 
>>>> 
>>> 
>>> 
> 


Re: [VOTE] Airflow 1.9.0rc1

Posted by Ash Berlin-Taylor <as...@firemirror.com>.
Thanks for picking this up. Your fix should stop the 500 error, but there's another problem (which is ultimately down to user misconfiguration): https://issues.apache.org/jira/browse/AIRFLOW-1796 - the fix for that is to update a doc somewhere, and probably to validate that this setting is correct at start time.
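A start-time check along those lines could be sketched as below. This is a hypothetical illustration, not the actual Airflow code; the function name and the exact logger wiring are assumptions. The idea is simply to fail fast if the configured task_log_reader doesn't name any handler attached to the "airflow.task" logger, rather than 500-ing later when the webserver tries to read task logs.

```python
# Hypothetical start-time validation for AIRFLOW-1796: reject a
# task_log_reader that matches no registered task-log handler.
import logging

def validate_task_log_reader(task_log_reader):
    handlers = logging.getLogger("airflow.task").handlers
    names = {h.name for h in handlers}
    if task_log_reader not in names:
        raise ValueError(
            "task_log_reader %r matches no handler on 'airflow.task' "
            "(found: %s)" % (task_log_reader, sorted(n for n in names if n)))

# Simulate a logging config that registers a task handler named "file.task".
handler = logging.StreamHandler()
handler.name = "file.task"
logging.getLogger("airflow.task").addHandler(handler)

validate_task_log_reader("file.task")  # passes: a matching handler exists
```

Running this once at webserver start-up would turn the misconfiguration into an immediate, descriptive error instead of a runtime 500.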


I've found another issue related to the arg names of S3Hook. In 1.8.2 it was `s3_conn_id`, but the move to boto3/basing it off AWSHook means it now expects `aws_conn_id`, and various places in the Airflow code base (and a few places in our dags/operators code base) still pass it as `s3_conn_id`. I've created https://issues.apache.org/jira/browse/AIRFLOW-1795 for that issue.
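The breakage can be illustrated with simplified stand-in classes (these are not the real Airflow hooks, just a sketch of the keyword rename): in 1.8.2 the argument was `s3_conn_id`; the boto3-based S3Hook inherits from the AWS hook and takes `aws_conn_id`, so 1.8-era call sites now raise TypeError.

```python
# Stand-in classes illustrating the s3_conn_id -> aws_conn_id rename.
class AwsHook:
    def __init__(self, aws_conn_id="aws_default"):
        self.aws_conn_id = aws_conn_id

class S3Hook(AwsHook):
    """Stand-in for the 1.9.0 boto3-based S3Hook."""

S3Hook(aws_conn_id="my_s3_conn")        # 1.9.0-style call: fine
try:
    S3Hook(s3_conn_id="my_s3_conn")     # 1.8.2-style call site
except TypeError as err:
    print("rejected old kwarg:", err)   # unexpected keyword argument
```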

-ash


> On 8 Nov 2017, at 18:54, Daniel Huang <dx...@gmail.com> wrote:
> 
> Still testing this out.
> 
> Put up a small fix for Ash's second exception
> https://github.com/apache/incubator-airflow/pull/2766
> 
> On Wed, Nov 8, 2017 at 10:48 AM, Bolke de Bruin <bd...@gmail.com> wrote:
> 
>> Hi Chris,
>> 
>> Actively testing here: we found an issue in the SSHOperator introduced in
>> 1.9.0 (fix already merged for RC2, but blocking IMO as it stops us from
>> running SSH properly); some minor fixes by Airbnb should also be in RC2.
>> There is some logging “weirdness” that might warrant a small patch here and
>> there and could be squeezed into RC2, but I don’t consider them blocking.
>> 
>> So almost there, but we need an RC2 imho.
>> 
>> -1, binding.
>> 
>> Bolke
>> 
>>> On 8 Nov 2017, at 19:00, Ash Berlin-Taylor <ash_airflowlist@firemirror.
>> com> wrote:
>>> 
>>> -1 (for now. Non binding. Is that how this process works?)
>>> 
>>> We've built a test env for this RC and are testing, but have run into an
>> issue reading task logs. (See below)
>>> 
>>> We haven't gotten very far with this yet; we will dig more tomorrow
>> (it's the end of the UK work day now). I suspect this might be down to how
>> we've misconfigured our logging. We will see tomorrow.
>>> 
>>> -ash
>>> 
>>> 
>>> 
>>> 
>>> File "/usr/local/lib/python3.5/dist-packages/airflow/www/views.py",
>> line 712, in log
>>>   logs = handler.read(ti)
>>> AttributeError: 'NoneType' object has no attribute 'read'
>>> 
>>> During handling of the above exception, another exception occurred:
>>> 
>>> Traceback (most recent call last):
>>> File "/usr/local/lib/python3.5/dist-packages/flask/app.py", line 1988,
>> in wsgi_app
>>>   response = self.full_dispatch_request()
>>> File "/usr/local/lib/python3.5/dist-packages/flask/app.py", line 1641,
>> in full_dispatch_request
>>>   rv = self.handle_user_exception(e)
>>> File "/usr/local/lib/python3.5/dist-packages/flask/app.py", line 1544,
>> in handle_user_exception
>>>   reraise(exc_type, exc_value, tb)
>>> File "/usr/local/lib/python3.5/dist-packages/flask/_compat.py", line
>> 33, in reraise
>>>   raise value
>>> File "/usr/local/lib/python3.5/dist-packages/flask/app.py", line 1639,
>> in full_dispatch_request
>>>   rv = self.dispatch_request()
>>> File "/usr/local/lib/python3.5/dist-packages/flask/app.py", line 1625,
>> in dispatch_request
>>>   return self.view_functions[rule.endpoint](**req.view_args)
>>> File "/usr/local/lib/python3.5/dist-packages/flask_admin/base.py",
>> line 69, in inner
>>>   return self._run_view(f, *args, **kwargs)
>>> File "/usr/local/lib/python3.5/dist-packages/flask_admin/base.py",
>> line 368, in _run_view
>>>   return fn(self, *args, **kwargs)
>>> File "/usr/local/lib/python3.5/dist-packages/flask_login.py", line
>> 758, in decorated_view
>>>   return func(*args, **kwargs)
>>> File "/usr/local/lib/python3.5/dist-packages/airflow/www/utils.py",
>> line 262, in wrapper
>>>   return f(*args, **kwargs)
>>> File "/usr/local/lib/python3.5/dist-packages/airflow/www/views.py",
>> line 715, in log
>>>   .format(task_log_reader, e.message)]
>>> AttributeError: 'AttributeError' object has no attribute 'message'
>>> 
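The second traceback above boils down to Python 3 exceptions having no `.message` attribute, so the except-block that tried to format the original error raised its own AttributeError. A minimal reproduction and fix (the function name and message format here are illustrative, not the actual views.py code):

```python
# Python 2-ism: exc.message raises AttributeError on Python 3, which is
# why the error handler itself blew up while reporting the real error.
def format_log_error(task_log_reader, exc):
    # Broken: "...".format(task_log_reader, exc.message)
    # Portable: str(exc) works on both Python 2 and 3
    return "*** Cannot load log: {}, {}".format(task_log_reader, str(exc))

exc = AttributeError("'NoneType' object has no attribute 'read'")
print(format_log_error("file.task", exc))
```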
>>> 
>>>> On 8 Nov 2017, at 17:46, Chris Riccomini <cr...@apache.org> wrote:
>>>> 
>>>> Anyone? :/
>>>> 
>>>> On Mon, Nov 6, 2017 at 1:22 PM, Chris Riccomini <cr...@apache.org>
>>>> wrote:
>>>> 
>>>>> Hey all,
>>>>> 
>>>>> I have cut Airflow 1.9.0 RC1. This email is calling a vote on the
>> release,
>>>>> which will last for 72 hours. Consider this my (binding) +1.
>>>>> 
>>>>> Airflow 1.9.0 RC1 is available at:
>>>>> 
>>>>> https://dist.apache.org/repos/dist/dev/incubator/airflow/1.9.0rc1/
>>>>> 
>>>>> apache-airflow-1.9.0rc1+incubating-source.tar.gz is a source release
>> that
>>>>> comes with INSTALL instructions.
>>>>> apache-airflow-1.9.0rc1+incubating-bin.tar.gz is the binary Python
>>>>> "sdist" release.
>>>>> 
>>>>> Public keys are available at:
>>>>> 
>>>>> https://dist.apache.org/repos/dist/release/incubator/airflow/
>>>>> 
>>>>> |8d5160
>>>>> AIRFLOW-1248|Fix inconsistent configuration name for worker tim|#2328
>>>>> |92314f
>>>>> AIRFLOW-1247|CLI: ignore all dependencies argument ignored     |#2441
>>>>> |e88ecf
>>>>> AIRFLOW-1245|Fix random failure of test_trigger_dag_for_date un|#2325
>>>>> |cef01b
>>>>> AIRFLOW-1244|Forbid creation of a pool with empty name         |#2324
>>>>> |df9a10
>>>>> AIRFLOW-1242|BigQueryHook assumes that a valid project_id can't|#2335
>>>>> |ffe616
>>>>> AIRFLOW-1237|Fix IN-predicate sqlalchemy warning               |#2320
>>>>> |a1f422
>>>>> AIRFLOW-1234|Cover utils.operator_helpers with unit tests      |#2317
>>>>> |d16537
>>>>> AIRFLOW-1233|Cover utils.json with unit tests                  |#2316
>>>>> |502410
>>>>> AIRFLOW-1232|Remove deprecated readfp warning                  |#2315
>>>>> |6ffaaf
>>>>> AIRFLOW-1231|Use flask_wtf.CSRFProtect instead of flask_wtf.Csr|#2313
>>>>> |cac49e
>>>>> AIRFLOW-1221|Fix DatabricksSubmitRunOperator Templating        |#2308
>>>>> |0fa104
>>>>> AIRFLOW-1217|Enable logging in Sqoop hook                      |#2307
>>>>> |4f459b
>>>>> AIRFLOW-1213|Add hcatalog parameters to the sqoop operator/hook|#2305
>>>>> |857850
>>>>> AIRFLOW-1208|Speed-up cli tests                                |#2301
>>>>> |21c142
>>>>> AIRFLOW-1207|Enable utils.helpers unit tests                   |#2300
>>>>> |8ac87b
>>>>> AIRFLOW-1203|Tests failing after oauth upgrade                 |#2296
>>>>> |3e9c66
>>>>> AIRFLOW-1201|Update deprecated 'nose-parameterized' library to |#2298
>>>>> |d2d3e4
>>>>> AIRFLOW-1193|Add Checkr to Airflow user list                   |#2276
>>>>> |707238
>>>>> AIRFLOW-1189|Get pandas DataFrame using BigQueryHook fails     |#2287
>>>>> |93666f
>>>>> AIRFLOW-1188|Add max_bad_records param to GoogleCloudStorageToB|#2286
>>>>> |443e6b
>>>>> AIRFLOW-1187|Obsolete package names in documentation           |-
>> |-
>>>>> 
>>>>> AIRFLOW-1185|Incorrect url to PyPi                             |#2283
>>>>> |829755
>>>>> AIRFLOW-1181|Enable delete and list function for Google Cloud S|#2281
>>>>> |24f73c
>>>>> AIRFLOW-1179|Pandas 0.20 broke Google BigQuery hook            |#2279
>>>>> |ac9ccb
>>>>> AIRFLOW-1177|variable json deserialize does not work at set def|#2540
>>>>> |65319a
>>>>> AIRFLOW-1175|Add Pronto Tools to Airflow user list             |#2277
>>>>> |86aafa
>>>>> AIRFLOW-1173|Add Robinhood to list of Airflow users            |#2271
>>>>> |379115
>>>>> AIRFLOW-1165|airflow webservice crashes on ubuntu16 - python3  |-
>> |-
>>>>> 
>>>>> AIRFLOW-1160|Upadte SparkSubmitOperator parameters             |#2265
>>>>> |2e3f07
>>>>> AIRFLOW-1155|Add Tails.com to community                        |#2261
>>>>> |2fa690
>>>>> AIRFLOW-1149|Allow custom filters to be added to jinja2        |#2258
>>>>> |48135a
>>>>> AIRFLOW-1141|Remove DAG.crawl_for_tasks method                 |#2275
>>>>> |a30fee
>>>>> AIRFLOW-1140|DatabricksSubmitRunOperator should template the "j|#2255
>>>>> |e6d316
>>>>> AIRFLOW-1136|Invalid parameters are not captured for Sqoop oper|#2252
>>>>> |2ef4db
>>>>> AIRFLOW-1125|Clarify documentation regarding fernet_key        |#2251
>>>>> |831f8d
>>>>> AIRFLOW-1122|Node strokes are too thin for people with color vi|#2246
>>>>> |a08761
>>>>> AIRFLOW-1121|airflow webserver --pid no longer write out pid fi|-
>> |-
>>>>> 
>>>>> AIRFLOW-1118|Add evo.company to Airflow users                  |#2243
>>>>> |f16914
>>>>> AIRFLOW-1112|Log which pool is full in scheduler when pool slot|#2242
>>>>> |74c1ce
>>>>> AIRFLOW-1107|Add support for ftps non-default port             |#2240
>>>>> |4d0c2f
>>>>> AIRFLOW-1106|Add Groupalia/Letsbonus                           |#2239
>>>>> |945b42
>>>>> AIRFLOW-1095|ldap_auth memberOf should come from configuration |#2232
>>>>> |6b1c32
>>>>> AIRFLOW-1094|Invalid unit tests under `contrib/`               |#2234
>>>>> |219c50
>>>>> AIRFLOW-1091|As a release manager I want to be able to compare |#2231
>>>>> |bfae42
>>>>> AIRFLOW-1090|Add HBO                                           |#2230
>>>>> |177d34
>>>>> AIRFLOW-1089|Add Spark application arguments to SparkSubmitOper|#2229
>>>>> |e5b914
>>>>> AIRFLOW-1081|Task duration page is slow                        |#2226
>>>>> |0da512
>>>>> AIRFLOW-1075|Cleanup security docs                             |#2222
>>>>> |5a6f18
>>>>> AIRFLOW-1065|Add functionality for Azure Blob Storage          |#2216
>>>>> |f1bc5f
>>>>> AIRFLOW-1059|Reset_state_for_orphaned_task should operate in ba|#2205
>>>>> |e05d3b
>>>>> AIRFLOW-1058|Improvements for SparkSubmitOperator              |-
>> |-
>>>>> 
>>>>> AIRFLOW-1051|Add a test for resetdb to CliTests                |#2198
>>>>> |15aee0
>>>>> AIRFLOW-1047|Airflow logs vulnerable to XSS                    |#2193
>>>>> |fe9ebe
>>>>> AIRFLOW-1045|Make log level configurable via airflow.cfg       |#2191
>>>>> |e739a5
>>>>> AIRFLOW-1043|Documentation issues for operators                |#2188
>>>>> |b55f41
>>>>> AIRFLOW-1041|DockerOperator replaces its xcom_push method with |#2274
>>>>> |03704c
>>>>> AIRFLOW-1040|Fix typos in comments/docstrings in models.py     |#2174
>>>>> |d8c0f5
>>>>> AIRFLOW-1036|Exponential backoff should use randomization      |#2262
>>>>> |66168e
>>>>> AIRFLOW-1035|Exponential backoff retry logic should use 2 as ba|#2196
>>>>> |4ec932
>>>>> AIRFLOW-1034|Make it possible to connect to S3 in sigv4 regions|#2181
>>>>> |4c0905
>>>>> AIRFLOW-1031|'scheduled__' may replace with DagRun.ID_PREFIX in|#2613
>>>>> |aa3844
>>>>> AIRFLOW-1030|HttpHook error when creating HttpSensor           |-
>> |-
>>>>> 
>>>>> AIRFLOW-1028|Databricks Operator for Airflow                   |#2202
>>>>> |53ca50
>>>>> AIRFLOW-1024|Handle CeleryExecutor errors gracefully           |#2355
>>>>> |7af20f
>>>>> AIRFLOW-1018|Scheduler DAG processes can not log to stdout     |#2728
>>>>> |ef775d
>>>>> AIRFLOW-1016|Allow HTTP HEAD request method on HTTPSensor      |#2175
>>>>> |4c41f6
>>>>> AIRFLOW-1010|Add a convenience script for signing              |#2169
>>>>> |a2b65a
>>>>> AIRFLOW-1009|Remove SQLOperator from Concepts page             |#2168
>>>>> |7d1144
>>>>> AIRFLOW-1007|Jinja sandbox is vulnerable to RCE                |#2184
>>>>> |daa281
>>>>> AIRFLOW-1005|Speed up Airflow startup time                     |#na
>>>>> |996dd3
>>>>> AIRFLOW-999 |Support for Redis database                        |#2165
>>>>> |8de850
>>>>> AIRFLOW-997 |Change setup.cfg to point to Apache instead of Max|#na
>>>>> |75cd46
>>>>> AIRFLOW-995 |Update Github PR template                         |#2163
>>>>> |b62485
>>>>> AIRFLOW-994 |Add MiNODES to the AIRFLOW Active Users List      |#2159
>>>>> |ca1623
>>>>> AIRFLOW-991 |Mark_success while a task is running leads to fail|-
>> |-
>>>>> 
>>>>> AIRFLOW-990 |DockerOperator fails when logging unicode string  |#2155
>>>>> |6bbf54
>>>>> AIRFLOW-988 |SLA Miss Callbacks Are Repeated if Email is Not be|#2415
>>>>> |6e74d4
>>>>> AIRFLOW-985 |Extend the sqoop operator/hook with additional par|#2177
>>>>> |82eb20
>>>>> AIRFLOW-984 |Subdags unrecognized when subclassing SubDagOperat|#2152
>>>>> |a8bd16
>>>>> AIRFLOW-979 |Add GovTech GDS                                   |#2149
>>>>> |b17bd3
>>>>> AIRFLOW-976 |Mark success running task causes it to fail       |-
>> |-
>>>>> 
>>>>> AIRFLOW-969 |Catch bad python_callable argument at DAG construc|#2142
>>>>> |12901d
>>>>> AIRFLOW-963 |Some code examples are not rendered in the airflow|#2139
>>>>> |f69c1b
>>>>> AIRFLOW-960 |Add support for .editorconfig                     |#na
>>>>> |f5cacc
>>>>> AIRFLOW-959 |.gitignore file is disorganized and incomplete    |#na
>>>>> |3d3c14
>>>>> AIRFLOW-958 |Improve tooltip readability                       |#2134
>>>>> |b3c3eb
>>>>> AIRFLOW-950 |Missing AWS integrations on documentation::integra|#2552
>>>>> |01be02
>>>>> AIRFLOW-947 |Make PrestoHook surface better messages when the P|#na
>>>>> |6dd4b3
>>>>> AIRFLOW-945 |Revert psycopg2 workaround when psycopg2 2.7.1 is |-
>> |-
>>>>> 
>>>>> AIRFLOW-943 |Add Digital First Media to the Airflow users list |#2115
>>>>> |2cfe28
>>>>> AIRFLOW-942 |Add mytaxi to Airflow Users                       |#2111
>>>>> |d579e6
>>>>> AIRFLOW-935 |Impossible to use plugin executors                |#2120
>>>>> |08a784
>>>>> AIRFLOW-926 |jdbc connector is broken due to jaydebeapi api upd|#2651
>>>>> |07ed29
>>>>> AIRFLOW-917 |Incorrectly formatted failure status message      |#2109
>>>>> |b8164c
>>>>> AIRFLOW-916 |Fix ConfigParser deprecation warning              |#2108
>>>>> |ef6dd1
>>>>> AIRFLOW-911 |Add colouring and profiling info on tests         |#2106
>>>>> |4f52db
>>>>> AIRFLOW-903 |Add configuration setting for default DAG view.   |#2103
>>>>> |cadfae
>>>>> AIRFLOW-896 |BigQueryOperator fails to execute with certain inp|#2097
>>>>> |2bceee
>>>>> AIRFLOW-891 |Webserver Clock Should Include Day                |-
>> |-
>>>>> 
>>>>> AIRFLOW-889 |Minor error in the docstrings for BaseOperator.   |#2084
>>>>> |50702d
>>>>> AIRFLOW-887 |Add compatibility with future v0.16               |#na
>>>>> |50902d
>>>>> AIRFLOW-886 |Pass Operator result to post_execute hook         |#na
>>>>> |4da361
>>>>> AIRFLOW-885 |Add Change.org to the list of Airflow users       |#2089
>>>>> |a279be
>>>>> AIRFLOW-882 |Code example in docs has unnecessary DAG>>Operator|#2088
>>>>> |baa4cd
>>>>> AIRFLOW-881 |Create SubDagOperator within DAG context manager w|#2087
>>>>> |0ed608
>>>>> AIRFLOW-880 |Fix remote log functionality inconsistencies for W|#2086
>>>>> |974b75
>>>>> AIRFLOW-877 |GoogleCloudStorageDownloadOperator: template_ext c|#2083
>>>>> |debc69
>>>>> AIRFLOW-875 |Allow HttpSensor params to be templated           |#2080
>>>>> |62f503
>>>>> AIRFLOW-871 |multiple places use logging.warn() instead of warn|#2082
>>>>> |21d775
>>>>> AIRFLOW-866 |Add FTPSensor                                     |#2070
>>>>> |5f87f8
>>>>> AIRFLOW-863 |Example DAG start dates should be recent to avoid |#2068
>>>>> |bbfd43
>>>>> AIRFLOW-862 |Add DaskExecutor                                  |#2067
>>>>> |6e2210
>>>>> AIRFLOW-860 |Circular module dependency prevents loading of cus|-
>> |-
>>>>> 
>>>>> AIRFLOW-854 |Add Open Knowledge International to Airflow users |#2061
>>>>> |51a311
>>>>> AIRFLOW-842 |scheduler.clean_dirty raises warning: SAWarning: T|#2072
>>>>> |485280
>>>>> AIRFLOW-840 |Python3 encoding issue in Kerberos                |#2158
>>>>> |639336
>>>>> AIRFLOW-836 |The paused and queryview endpoints are vulnerable |#2054
>>>>> |6aca2c
>>>>> AIRFLOW-831 |Fix broken unit tests                             |#2050
>>>>> |b86194
>>>>> AIRFLOW-830 |Plugin manager should log to debug, not info      |-
>> |-
>>>>> 
>>>>> AIRFLOW-829 |Reduce verbosity of successful Travis unit tests  |-
>> |-
>>>>> 
>>>>> AIRFLOW-826 |Add Zendesk Hook                                  |#2066
>>>>> |a09762
>>>>> AIRFLOW-823 |Make task instance details available via API      |#2045
>>>>> |3f546e
>>>>> AIRFLOW-822 |Close the connection before throwing exception in |#2038
>>>>> |4b6c38
>>>>> AIRFLOW-821 |Scheduler dagbag importing not Py3 compatible     |#2039
>>>>> |fbb59b
>>>>> AIRFLOW-809 |SqlAlchemy is_ ColumnOperator Causing Errors in MS|-
>> |-
>>>>> 
>>>>> AIRFLOW-802 |Integration of spark-submit                       |-
>> |-
>>>>> 
>>>>> AIRFLOW-781 |Allow DataFlowJavaOperator to accept jar file stor|#2037
>>>>> |259c86
>>>>> AIRFLOW-770 |HDFS hooks should support alternative ways of gett|#2056
>>>>> |261b65
>>>>> AIRFLOW-756 |Refactor ssh_hook and ssh_operator                |-
>> |-
>>>>> 
>>>>> AIRFLOW-751 |SFTP file transfer functionality                  |#1999
>>>>> |fe0ede
>>>>> AIRFLOW-725 |Make merge tool use OS' keyring for password stora|#1966
>>>>> |8c1695
>>>>> AIRFLOW-706 |Configuration shell commands are not split properl|#2053
>>>>> |0bb6f2
>>>>> AIRFLOW-705 |airflow.configuration.run_command output does not |-
>> |-
>>>>> 
>>>>> AIRFLOW-681 |homepage doc link should pointing to apache's repo|#2164
>>>>> |a8027a
>>>>> AIRFLOW-654 |SSL for AMQP w/ Celery(Executor)                  |#2333
>>>>> |868bfe
>>>>> AIRFLOW-645 |HttpHook ignores https                            |#2311
>>>>> |fd381a
>>>>> AIRFLOW-365 |Code view in subdag trigger exception             |#2043
>>>>> |cf102c
>>>>> AIRFLOW-300 |Add Google Pubsub hook and operator               |#2036
>>>>> |d231dc
>>>>> AIRFLOW-289 |Use datetime.utcnow() to keep airflow system indep|#2618
>>>>> |20c83e
>>>>> AIRFLOW-71  |docker_operator - pulling from private repositorie|#na
>>>>> |d4406c
>>>>> 
>>>>> Cheers,
>>>>> Chris
>>>>> 
>>> 
>> 
>> 


Re: [VOTE] Airflow 1.9.0rc1

Posted by Daniel Huang <dx...@gmail.com>.
Still testing this out.

Put up a small fix for Ash's second exception
https://github.com/apache/incubator-airflow/pull/2766
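
For context, the second exception in Ash's traceback is a Python 3 issue:
`BaseException.message` was removed in Python 3, so when views.py formats the
original error with `e.message`, that itself raises a fresh AttributeError. A
minimal sketch of the failure mode (hypothetical names, not the actual Airflow
code), assuming Python 3:

```python
# Reproduce the two stacked exceptions from the traceback below:
# 1) calling .read() on a None handler raises the first AttributeError,
# 2) accessing e.message raises the second, because Python 3 exceptions
#    no longer carry a .message attribute.
try:
    handler = None              # stand-in for the missing log handler
    handler.read(None)          # first error: NoneType has no 'read'
except AttributeError as e:
    try:
        detail = e.message      # fails on Python 3: .message was removed
    except AttributeError:
        detail = str(e)         # portable replacement for e.message
```

`str(e)` (or `"{}".format(e)`) works on both Python 2 and 3, which is the usual
fix for handlers that still reference `.message`.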

On Wed, Nov 8, 2017 at 10:48 AM, Bolke de Bruin <bd...@gmail.com> wrote:

> Hi Chris,
>
> Actively testing here: we found an issue in the SSHOperator introduced in
> 1.9.0 (fix already merged for RC2, but blocking, as it stops us from
> running SSH properly); some minor fixes by Airbnb should also be in RC2.
> There is some logging “weirdness” that might warrant a small patch here
> and there and could be squeezed into RC2, but I don’t consider it blocking.
>
> So almost there, but we need an RC2 imho.
>
> -1, binding.
>
> Bolke
>
> > On 8 Nov 2017, at 19:00, Ash Berlin-Taylor <ash_airflowlist@firemirror.com> wrote:
> >
> > -1 (for now. Non binding. Is that how this process works?)
> >
> > We've built a test env for this RC and are testing, but have run into an
> issue reading task logs. (See below)
> >
> > We haven't gotten very far with this yet; we will dig more tomorrow
> (it's the end of the UK work day now). I suspect this might be due to how
> we've misconfigured our logging. We will see tomorrow.
> >
> > -ash
> >
> >
> >
> >
> > File "/usr/local/lib/python3.5/dist-packages/airflow/www/views.py",
> line 712, in log
> >    logs = handler.read(ti)
> > AttributeError: 'NoneType' object has no attribute 'read'
> >
> > During handling of the above exception, another exception occurred:
> >
> > Traceback (most recent call last):
> >  File "/usr/local/lib/python3.5/dist-packages/flask/app.py", line 1988,
> in wsgi_app
> >    response = self.full_dispatch_request()
> >  File "/usr/local/lib/python3.5/dist-packages/flask/app.py", line 1641,
> in full_dispatch_request
> >    rv = self.handle_user_exception(e)
> >  File "/usr/local/lib/python3.5/dist-packages/flask/app.py", line 1544,
> in handle_user_exception
> >    reraise(exc_type, exc_value, tb)
> >  File "/usr/local/lib/python3.5/dist-packages/flask/_compat.py", line
> 33, in reraise
> >    raise value
> >  File "/usr/local/lib/python3.5/dist-packages/flask/app.py", line 1639,
> in full_dispatch_request
> >    rv = self.dispatch_request()
> >  File "/usr/local/lib/python3.5/dist-packages/flask/app.py", line 1625,
> in dispatch_request
> >    return self.view_functions[rule.endpoint](**req.view_args)
> >  File "/usr/local/lib/python3.5/dist-packages/flask_admin/base.py",
> line 69, in inner
> >    return self._run_view(f, *args, **kwargs)
> >  File "/usr/local/lib/python3.5/dist-packages/flask_admin/base.py",
> line 368, in _run_view
> >    return fn(self, *args, **kwargs)
> >  File "/usr/local/lib/python3.5/dist-packages/flask_login.py", line
> 758, in decorated_view
> >    return func(*args, **kwargs)
> >  File "/usr/local/lib/python3.5/dist-packages/airflow/www/utils.py",
> line 262, in wrapper
> >    return f(*args, **kwargs)
> >  File "/usr/local/lib/python3.5/dist-packages/airflow/www/views.py",
> line 715, in log
> >    .format(task_log_reader, e.message)]
> > AttributeError: 'AttributeError' object has no attribute 'message'
> >
> >
> >> On 8 Nov 2017, at 17:46, Chris Riccomini <cr...@apache.org> wrote:
> >>
> >> Anyone? :/
> >>
> >> On Mon, Nov 6, 2017 at 1:22 PM, Chris Riccomini <cr...@apache.org>
> >> wrote:
> >>
> >>> Hey all,
> >>>
> >>> I have cut Airflow 1.9.0 RC1. This email is calling a vote on the
> release,
> >>> which will last for 72 hours. Consider this my (binding) +1.
> >>>
> >>> Airflow 1.9.0 RC1 is available at:
> >>>
> >>> https://dist.apache.org/repos/dist/dev/incubator/airflow/1.9.0rc1/
> >>>
> >>> apache-airflow-1.9.0rc1+incubating-source.tar.gz is a source release
> that
> >>> comes with INSTALL instructions.
> >>> apache-airflow-1.9.0rc1+incubating-bin.tar.gz is the binary Python
> >>> "sdist" release.
> >>>
> >>> Public keys are available at:
> >>>
> >>> https://dist.apache.org/repos/dist/release/incubator/airflow/
> >>>
> >>> The release contains the following JIRAs:
> >>>
> >>> ISSUE ID    |DESCRIPTION                                       |PR
> >>> |COMMIT
> >>> AIRFLOW-1779|Add keepalive packets to ssh hook                 |#2749
> >>> |d2f9d1
> >>> AIRFLOW-1776|stdout/stderr logging not captured                |#2745
> >>> |590d9f
> >>> AIRFLOW-1771|Change heartbeat text from boom to heartbeat      |-
>  |-
> >>>
> >>> AIRFLOW-1767|Airflow Scheduler no longer schedules DAGs        |-
>  |-
> >>>
> >>> AIRFLOW-1765|Default API auth backed should deny all.          |#2737
> >>> |6ecdac
> >>> AIRFLOW-1764|Web Interface should not use experimental api     |#2738
> >>> |6bed1d
> >>> AIRFLOW-1757|Contrib.SparkSubmitOperator should allow --package|#2725
> >>> |4e06ee
> >>> AIRFLOW-1745|BashOperator ignores SIGPIPE in subprocess        |#2714
> >>> |e021c9
> >>> AIRFLOW-1744|task.retries can be False                         |#2713
> >>> |6144c6
> >>> AIRFLOW-1743|Default config template should not contain ldap fi|#2712
> >>> |270684
> >>> AIRFLOW-1741|Task Duration shows two charts on first page load.|#2711
> >>> |974b49
> >>> AIRFLOW-1734|Sqoop Operator contains logic errors & needs optio|#2703
> >>> |f6810c
> >>> AIRFLOW-1731|Import custom config on PYTHONPATH                |#2721
> >>> |f07eb3
> >>> AIRFLOW-1726|Copy Expert command for Postgres Hook             |#2698
> >>> |8a4ad3
> >>> AIRFLOW-1719|Fix small typo - your vs you                      |-
>  |-
> >>>
> >>> AIRFLOW-1712|Log SSHOperator output                            |-
>  |-
> >>>
> >>> AIRFLOW-1711|Ldap Attributes not always a "list" part 2        |#2731
> >>> |40a936
> >>> AIRFLOW-1706|Scheduler is failed on startup with MS SQL Server |#2733
> >>> |9e209b
> >>> AIRFLOW-1698|Remove confusing SCHEDULER_RUNS env var from syste|#2677
> >>> |00dd06
> >>> AIRFLOW-1695|Redshift Hook using boto3 & AWS Hook              |#2717
> >>> |bfddae
> >>> AIRFLOW-1694|Hive Hooks: Python 3 does not have an `itertools.i|#2674
> >>> |c6e5ae
> >>> AIRFLOW-1692|Master cannot be checked out on windows           |#2673
> >>> |31805e
> >>> AIRFLOW-1691|Add better documentation for Google cloud storage |#2671
> >>> |ace2b1
> >>> AIRFLOW-1690|Error messages regarding gcs log commits are spars|#2670
> >>> |5fb5cd
> >>> AIRFLOW-1682|S3 task handler never writes to S3                |#2664
> >>> |0080f0
> >>> AIRFLOW-1678|Fix docstring errors for `set_upstream` and `set_d|-
>  |-
> >>>
> >>> AIRFLOW-1677|Fix typo in example_qubole_operator               |-
>  |-
> >>>
> >>> AIRFLOW-1676|GCS task handler never writes to GCS              |#2659
> >>> |781fa4
> >>> AIRFLOW-1675|Fix API docstrings to be properly rendered        |#2667
> >>> |f12381
> >>> AIRFLOW-1671|Missing @apply_defaults annotation for gcs downloa|#2655
> >>> |97666b
> >>> AIRFLOW-1669|Fix Docker import in Master                       |#na
> >>> |f7f2a8
> >>> AIRFLOW-1668|Redhsift requires a keep alive of < 300s          |#2650
> >>> |f2bb77
> >>> AIRFLOW-1664|Make MySqlToGoogleCloudStorageOperator support bin|#2649
> >>> |95813d
> >>> AIRFLOW-1660|Change webpage width to full-width                |#2646
> >>> |8ee3d9
> >>> AIRFLOW-1659|Fix invalid attribute bug in FileTaskHandler      |#2645
> >>> |bee823
> >>> AIRFLOW-1658|Kill (possibly) still running Druid indexing job a|#2644
> >>> |cbf7ad
> >>> AIRFLOW-1657|Handle failure of Qubole Operator for s3distcp had|-
>  |-
> >>>
> >>> AIRFLOW-1654|Show tooltips for link icons in DAGs view         |#2642
> >>> |ada7b2
> >>> AIRFLOW-1647|Fix Spark-sql hook                                |#2637
> >>> |b1e5c6
> >>> AIRFLOW-1641|Task gets stuck in queued state                   |#2715
> >>> |735497
> >>> AIRFLOW-1640|Add Qubole default connection in connection table |-
>  |-
> >>>
> >>> AIRFLOW-1639|ValueError does not have .message attribute       |#2629
> >>> |87df67
> >>> AIRFLOW-1637|readme not tracking master branch for travis      |-
>  |-
> >>>
> >>> AIRFLOW-1636|aws and emr connection types get cleared          |#2626
> >>> |540e04
> >>> AIRFLOW-1635|Allow creating Google Cloud Platform connection wi|#2640
> >>> |6dec7a
> >>> AIRFLOW-1629|make extra a textarea in edit connections form    |#2623
> >>> |f5d46f
> >>> AIRFLOW-1628|Docstring of sqlsensor is incorrect               |#2621
> >>> |9ba73d
> >>> AIRFLOW-1627|SubDagOperator initialization should only query po|#2620
> >>> |516ace
> >>> AIRFLOW-1621|Add tests for logic added on server side dag list |#2614
> >>> |8de9fd
> >>> AIRFLOW-1614|Improve performance of DAG parsing when there are |#2610
> >>> |a95adb
> >>> AIRFLOW-1611|Customize logging in Airflow                      |#2631
> >>> |8b4a50
> >>> AIRFLOW-1609|Ignore all venvs in gitignore                     |#2608
> >>> |f1f9b4
> >>> AIRFLOW-1608|GCP Dataflow hook missing pending job state       |#2607
> >>> |653562
> >>> AIRFLOW-1606|DAG.sync_to_db is static, but takes a DAG as first|#2606
> >>> |6ac296
> >>> AIRFLOW-1605|Fix log source of local loggers                   |-
>  |-
> >>>
> >>> AIRFLOW-1604|Rename the logger to log                          |#2604
> >>> |af4050
> >>> AIRFLOW-1602|Use LoggingMixin for the DAG class                |#2602
> >>> |956699
> >>> AIRFLOW-1601|Add configurable time between SIGTERM and SIGKILL |#2601
> >>> |48a95e
> >>> AIRFLOW-1600|Uncaught exceptions in get_fernet if cryptography |#2600
> >>> |ad963e
> >>> AIRFLOW-1597|Add GameWisp as Airflow user                      |#2599
> >>> |26b747
> >>> AIRFLOW-1594|Installing via pip copies test files into python l|#2597
> >>> |a6b23a
> >>> AIRFLOW-1593|Expose load_string in WasbHook                    |#2596
> >>> |7ece95
> >>> AIRFLOW-1591|Exception: 'TaskInstance' object has no attribute |#2578
> >>> |f4653e
> >>> AIRFLOW-1590|Small fix for dates util                          |#2652
> >>> |31946e
> >>> AIRFLOW-1587|fix `ImportError: cannot import name 'CeleryExecut|#2590
> >>> |34c73b
> >>> AIRFLOW-1586|MySQL to GCS to BigQuery fails for tables with dat|#2589
> >>> |e83012
> >>> AIRFLOW-1584|Remove the insecure /headers endpoints            |#2588
> >>> |17ac07
> >>> AIRFLOW-1582|Improve logging structure of Airflow              |#2592
> >>> |a7a518
> >>> AIRFLOW-1580|Error in string formatter when throwing an excepti|#2583
> >>> |ea9ab9
> >>> AIRFLOW-1579|Allow jagged rows in BQ Hook.                     |#2582
> >>> |5b978b
> >>> AIRFLOW-1577|Add token support to DatabricksHook               |#2579
> >>> |c2c515
> >>> AIRFLOW-1573|Remove `thrift < 0.10.0` requirement              |#2574
> >>> |aa95f2
> >>> AIRFLOW-1571|Add AWS Lambda Hook for invoking Lambda Function  |#2718
> >>> |017f18
> >>> AIRFLOW-1568|Add datastore import/export operator              |#2568
> >>> |86063b
> >>> AIRFLOW-1567|Clean up ML Engine operators                      |#2567
> >>> |af91e2
> >>> AIRFLOW-1564|Default logging filename contains a colon         |#2565
> >>> |4c674c
> >>> AIRFLOW-1560|Add AWS DynamoDB hook for inserting batch items   |#2587
> >>> |71400b
> >>> AIRFLOW-1556|BigQueryBaseCursor should support SQL parameters  |#2557
> >>> |9df0ac
> >>> AIRFLOW-1546| add Zymergen to org list in README               |#2512
> >>> |7cc346
> >>> AIRFLOW-1535|Add support for Dataproc serviceAccountScopes in D|#2546
> >>> |b1f902
> >>> AIRFLOW-1529|Support quoted newlines in Google BigQuery load jo|#2545
> >>> |4a4b02
> >>> AIRFLOW-1527|Refactor celery config to make use of template    |#2542
> >>> |f4437b
> >>> AIRFLOW-1522|Increase size of val column for variable table in |#2535
> >>> |8a2d24
> >>> AIRFLOW-1521|Template fields definition for bigquery_table_dele|#2534
> >>> |f1a7c0
> >>> AIRFLOW-1520|S3Hook uses boto2                                 |#2532
> >>> |386583
> >>> AIRFLOW-1519|Main DAG list page does not scale using client sid|#2531
> >>> |d7d7ce
> >>> AIRFLOW-1512|Add operator for running Python functions in a vir|#2446
> >>> |14e6d7
> >>> AIRFLOW-1507|Make src, dst and bucket parameters as templated i|#2516
> >>> |d295cf
> >>> AIRFLOW-1505|Document when Jinja substitution occurs           |#2523
> >>> |984a87
> >>> AIRFLOW-1504|Log Cluster Name on Dataproc Operator When Execute|#2517
> >>> |1cd6c4
> >>> AIRFLOW-1499s|Eliminate duplicate and unneeded code             |-
>  |-
> >>>
> >>> AIRFLOW-1497|Hidden fields in connection form aren't reset when|#2507
> >>> |d8da8b
> >>> AIRFLOW-1493|Fix race condition with airflow run               |#2505
> >>> |b2e175
> >>> AIRFLOW-1492|Add metric for task success/failure               |#2504
> >>> |fa84d4
> >>> AIRFLOW-1489|Docs: Typo in BigQueryCheckOperator               |#2501
> >>> |111ce5
> >>> AIRFLOW-1483|Page size on model views is to large to render qui|#2497
> >>> |04bfba
> >>> AIRFLOW-1478|Chart -> Owner column should be sortable          |#2493
> >>> |651e60
> >>> AIRFLOW-1476|Add INSTALL file for source releases              |#2492
> >>> |da76ac
> >>> AIRFLOW-1474|Add dag_id regex for 'airflow clear' CLI command  |#2486
> >>> |18f849
> >>> AIRFLOW-1470s|BashSensor Implementation                         |-
>  |-
> >>>
> >>> AIRFLOW-1459|integration rst doc is broken in github view      |#2481
> >>> |322ec9
> >>> AIRFLOW-1438|Scheduler batch queries should have a limit       |#2462
> >>> |3547cb
> >>> AIRFLOW-1437|BigQueryTableDeleteOperator should define deletion|#2459
> >>> |b87903
> >>> AIRFLOW-1432|NVD3 Charts do not have labeled axes and units cha|#2710
> >>> |70ffa4
> >>> AIRFLOW-1402|Cleanup SafeConfigParser DeprecationWarning       |#2435
> >>> |38c86b
> >>> AIRFLOW-1401|Standardize GCP project, region, and zone argument|#2439
> >>> |b6d363
> >>> AIRFLOW-1397|Airflow 1.8.1 - No data displays in Last Run Colum|-
>  |-
> >>>
> >>> AIRFLOW-1394|Add quote_character parameter to GoogleCloudStorag|#2428
> >>> |9fd0be
> >>> AIRFLOW-1389|BigQueryOperator should support `createDisposition|#2470
> >>> |6e2640
> >>> AIRFLOW-1384|Add ARGO/CaDC                                     |#2434
> >>> |715947
> >>> AIRFLOW-1368|Automatically remove the container when it exits  |#2653
> >>> |d42d23
> >>> AIRFLOW-1359|Provide GoogleCloudML operator for model evaluatio|#2407
> >>> |194d1d
> >>> AIRFLOW-1356|add `--celery_hostname` to `airflow worker`       |#2405
> >>> |b9d7d1
> >>> AIRFLOW-1352|Revert bad logging Handler                        |-
>  |-
> >>>
> >>> AIRFLOW-1350|Add "query_uri" parameter for Google DataProc oper|#2402
> >>> |d32c72
> >>> AIRFLOW-1348|Paginated UI has broken toggles after first page  |-
>  |-
> >>>
> >>> AIRFLOW-1345|Don't commit on each loop                         |#2397
> >>> |0dd002
> >>> AIRFLOW-1344|Builds failing on Python 3.5 with AttributeError  |#2394
> >>> |2a5883
> >>> AIRFLOW-1343|Add airflow default label to the dataproc operator|#2396
> >>> |e4b240
> >>> AIRFLOW-1338|gcp_dataflow_hook is incompatible with the recent |#2388
> >>> |cf2605
> >>> AIRFLOW-1337|Customize log format via config file              |#2392
> >>> |4841e3
> >>> AIRFLOW-1335|Use buffered logger                               |#2386
> >>> |0d23d3
> >>> AIRFLOW-1333|Enable copy function for Google Cloud Storage Hook|#2385
> >>> |e2c383
> >>> AIRFLOW-1331|Contrib.SparkSubmitOperator should allow --package|#2622
> >>> |fbca8f
> >>> AIRFLOW-1330|Connection.parse_from_uri doesn't work for google_|#2525
> >>> |6e5e9d
> >>> AIRFLOW-1324|Make the Druid operator/hook more general         |#2378
> >>> |de99aa
> >>> AIRFLOW-1323|Operators related to Dataproc should keep some par|#2636
> >>> |ed248d
> >>> AIRFLOW-1315|Add Qubole File and Partition Sensors             |-
>  |-
> >>>
> >>> AIRFLOW-1309|Add optional hive_tblproperties in HiveToDruidTran|-
>  |-
> >>>
> >>> AIRFLOW-1301|Add New Relic to Airflow user list                |#2359
> >>> |355fc9
> >>> AIRFLOW-1299|Google Dataproc cluster creation operator should s|#2358
> >>> |c2b80e
> >>> AIRFLOW-1289|Don't restrict scheduler threads to CPU cores     |#2353
> >>> |8e23d2
> >>> AIRFLOW-1286|BaseTaskRunner - Exception TypeError: a bytes-like|#2363
> >>> |d8891d
> >>> AIRFLOW-1277|Forbid creation of a known event with empty fields|#na
> >>> |65184a
> >>> AIRFLOW-1276|Forbid event creation with end_data earlier than s|#na
> >>> |d5d02f
> >>> AIRFLOW-1275|Fix `airflow pool` command exception              |#2346
> >>> |9958aa
> >>> AIRFLOW-1273|Google Cloud ML Version and Model CRUD Operator   |#2379
> >>> |534a0e
> >>> AIRFLOW-1272|Google Cloud ML Batch Prediction Operator         |#2390
> >>> |e92d6b
> >>> AIRFLOW-1271|Google Cloud ML Training Operator                 |#2408
> >>> |0fc450
> >>> AIRFLOW-1256|Add United Airlines as Airflow user               |#2332
> >>> |d3484a
> >>> AIRFLOW-1251|Add eRevalue as an Airflow user                   |#2331
> >>> |8d5160
> >>> AIRFLOW-1248|Fix inconsistent configuration name for worker tim|#2328
> >>> |92314f
> >>> AIRFLOW-1247|CLI: ignore all dependencies argument ignored     |#2441
> >>> |e88ecf
> >>> AIRFLOW-1245|Fix random failure of test_trigger_dag_for_date un|#2325
> >>> |cef01b
> >>> AIRFLOW-1244|Forbid creation of a pool with empty name         |#2324
> >>> |df9a10
> >>> AIRFLOW-1242|BigQueryHook assumes that a valid project_id can't|#2335
> >>> |ffe616
> >>> AIRFLOW-1237|Fix IN-predicate sqlalchemy warning               |#2320
> >>> |a1f422
> >>> AIRFLOW-1234|Cover utils.operator_helpers with unit tests      |#2317
> >>> |d16537
> >>> AIRFLOW-1233|Cover utils.json with unit tests                  |#2316
> >>> |502410
> >>> AIRFLOW-1232|Remove deprecated readfp warning                  |#2315
> >>> |6ffaaf
> >>> AIRFLOW-1231|Use flask_wtf.CSRFProtect instead of flask_wtf.Csr|#2313
> >>> |cac49e
> >>> AIRFLOW-1221|Fix DatabricksSubmitRunOperator Templating        |#2308
> >>> |0fa104
> >>> AIRFLOW-1217|Enable logging in Sqoop hook                      |#2307
> >>> |4f459b
> >>> AIRFLOW-1213|Add hcatalog parameters to the sqoop operator/hook|#2305
> >>> |857850
> >>> AIRFLOW-1208|Speed-up cli tests                                |#2301
> >>> |21c142
> >>> AIRFLOW-1207|Enable utils.helpers unit tests                   |#2300
> >>> |8ac87b
> >>> AIRFLOW-1203|Tests failing after oauth upgrade                 |#2296
> >>> |3e9c66
> >>> AIRFLOW-1201|Update deprecated 'nose-parameterized' library to |#2298
> >>> |d2d3e4
> >>> AIRFLOW-1193|Add Checkr to Airflow user list                   |#2276
> >>> |707238
> >>> AIRFLOW-1189|Get pandas DataFrame using BigQueryHook fails     |#2287
> >>> |93666f
> >>> AIRFLOW-1188|Add max_bad_records param to GoogleCloudStorageToB|#2286
> >>> |443e6b
> >>> AIRFLOW-1187|Obsolete package names in documentation           |-     |-
> >>>
> >>> AIRFLOW-1185|Incorrect url to PyPi                             |#2283
> >>> |829755
> >>> AIRFLOW-1181|Enable delete and list function for Google Cloud S|#2281
> >>> |24f73c
> >>> AIRFLOW-1179|Pandas 0.20 broke Google BigQuery hook            |#2279
> >>> |ac9ccb
> >>> AIRFLOW-1177|variable json deserialize does not work at set def|#2540
> >>> |65319a
> >>> AIRFLOW-1175|Add Pronto Tools to Airflow user list             |#2277
> >>> |86aafa
> >>> AIRFLOW-1173|Add Robinhood to list of Airflow users            |#2271
> >>> |379115
> >>> AIRFLOW-1165|airflow webservice crashes on ubuntu16 - python3  |-     |-
> >>>
> >>> AIRFLOW-1160|Upadte SparkSubmitOperator parameters             |#2265
> >>> |2e3f07
> >>> AIRFLOW-1155|Add Tails.com to community                        |#2261
> >>> |2fa690
> >>> AIRFLOW-1149|Allow custom filters to be added to jinja2        |#2258
> >>> |48135a
> >>> AIRFLOW-1141|Remove DAG.crawl_for_tasks method                 |#2275
> >>> |a30fee
> >>> AIRFLOW-1140|DatabricksSubmitRunOperator should template the "j|#2255
> >>> |e6d316
> >>> AIRFLOW-1136|Invalid parameters are not captured for Sqoop oper|#2252
> >>> |2ef4db
> >>> AIRFLOW-1125|Clarify documentation regarding fernet_key        |#2251
> >>> |831f8d
> >>> AIRFLOW-1122|Node strokes are too thin for people with color vi|#2246
> >>> |a08761
> >>> AIRFLOW-1121|airflow webserver --pid no longer write out pid fi|-     |-
> >>>
> >>> AIRFLOW-1118|Add evo.company to Airflow users                  |#2243
> >>> |f16914
> >>> AIRFLOW-1112|Log which pool is full in scheduler when pool slot|#2242
> >>> |74c1ce
> >>> AIRFLOW-1107|Add support for ftps non-default port             |#2240
> >>> |4d0c2f
> >>> AIRFLOW-1106|Add Groupalia/Letsbonus                           |#2239
> >>> |945b42
> >>> AIRFLOW-1095|ldap_auth memberOf should come from configuration |#2232
> >>> |6b1c32
> >>> AIRFLOW-1094|Invalid unit tests under `contrib/`               |#2234
> >>> |219c50
> >>> AIRFLOW-1091|As a release manager I want to be able to compare |#2231
> >>> |bfae42
> >>> AIRFLOW-1090|Add HBO                                           |#2230
> >>> |177d34
> >>> AIRFLOW-1089|Add Spark application arguments to SparkSubmitOper|#2229
> >>> |e5b914
> >>> AIRFLOW-1081|Task duration page is slow                        |#2226
> >>> |0da512
> >>> AIRFLOW-1075|Cleanup security docs                             |#2222
> >>> |5a6f18
> >>> AIRFLOW-1065|Add functionality for Azure Blob Storage          |#2216
> >>> |f1bc5f
> >>> AIRFLOW-1059|Reset_state_for_orphaned_task should operate in ba|#2205
> >>> |e05d3b
> >>> AIRFLOW-1058|Improvements for SparkSubmitOperator              |-     |-
> >>>
> >>> AIRFLOW-1051|Add a test for resetdb to CliTests                |#2198
> >>> |15aee0
> >>> AIRFLOW-1047|Airflow logs vulnerable to XSS                    |#2193
> >>> |fe9ebe
> >>> AIRFLOW-1045|Make log level configurable via airflow.cfg       |#2191
> >>> |e739a5
> >>> AIRFLOW-1043|Documentation issues for operators                |#2188
> >>> |b55f41
> >>> AIRFLOW-1041|DockerOperator replaces its xcom_push method with |#2274
> >>> |03704c
> >>> AIRFLOW-1040|Fix typos in comments/docstrings in models.py     |#2174
> >>> |d8c0f5
> >>> AIRFLOW-1036|Exponential backoff should use randomization      |#2262
> >>> |66168e
> >>> AIRFLOW-1035|Exponential backoff retry logic should use 2 as ba|#2196
> >>> |4ec932
> >>> AIRFLOW-1034|Make it possible to connect to S3 in sigv4 regions|#2181
> >>> |4c0905
> >>> AIRFLOW-1031|'scheduled__' may replace with DagRun.ID_PREFIX in|#2613
> >>> |aa3844
> >>> AIRFLOW-1030|HttpHook error when creating HttpSensor           |-     |-
> >>>
> >>> AIRFLOW-1028|Databricks Operator for Airflow                   |#2202
> >>> |53ca50
> >>> AIRFLOW-1024|Handle CeleryExecutor errors gracefully           |#2355
> >>> |7af20f
> >>> AIRFLOW-1018|Scheduler DAG processes can not log to stdout     |#2728
> >>> |ef775d
> >>> AIRFLOW-1016|Allow HTTP HEAD request method on HTTPSensor      |#2175
> >>> |4c41f6
> >>> AIRFLOW-1010|Add a convenience script for signing              |#2169
> >>> |a2b65a
> >>> AIRFLOW-1009|Remove SQLOperator from Concepts page             |#2168
> >>> |7d1144
> >>> AIRFLOW-1007|Jinja sandbox is vulnerable to RCE                |#2184
> >>> |daa281
> >>> AIRFLOW-1005|Speed up Airflow startup time                     |#na
> >>> |996dd3
> >>> AIRFLOW-999 |Support for Redis database                        |#2165
> >>> |8de850
> >>> AIRFLOW-997 |Change setup.cfg to point to Apache instead of Max|#na
> >>> |75cd46
> >>> AIRFLOW-995 |Update Github PR template                         |#2163
> >>> |b62485
> >>> AIRFLOW-994 |Add MiNODES to the AIRFLOW Active Users List      |#2159
> >>> |ca1623
> >>> AIRFLOW-991 |Mark_success while a task is running leads to fail|-     |-
> >>>
> >>> AIRFLOW-990 |DockerOperator fails when logging unicode string  |#2155
> >>> |6bbf54
> >>> AIRFLOW-988 |SLA Miss Callbacks Are Repeated if Email is Not be|#2415
> >>> |6e74d4
> >>> AIRFLOW-985 |Extend the sqoop operator/hook with additional par|#2177
> >>> |82eb20
> >>> AIRFLOW-984 |Subdags unrecognized when subclassing SubDagOperat|#2152
> >>> |a8bd16
> >>> AIRFLOW-979 |Add GovTech GDS                                   |#2149
> >>> |b17bd3
> >>> AIRFLOW-976 |Mark success running task causes it to fail       |-     |-
> >>>
> >>> AIRFLOW-969 |Catch bad python_callable argument at DAG construc|#2142
> >>> |12901d
> >>> AIRFLOW-963 |Some code examples are not rendered in the airflow|#2139
> >>> |f69c1b
> >>> AIRFLOW-960 |Add support for .editorconfig                     |#na
> >>> |f5cacc
> >>> AIRFLOW-959 |.gitignore file is disorganized and incomplete    |#na
> >>> |3d3c14
> >>> AIRFLOW-958 |Improve tooltip readability                       |#2134
> >>> |b3c3eb
> >>> AIRFLOW-950 |Missing AWS integrations on documentation::integra|#2552
> >>> |01be02
> >>> AIRFLOW-947 |Make PrestoHook surface better messages when the P|#na
> >>> |6dd4b3
> >>> AIRFLOW-945 |Revert psycopg2 workaround when psycopg2 2.7.1 is |-     |-
> >>>
> >>> AIRFLOW-943 |Add Digital First Media to the Airflow users list |#2115
> >>> |2cfe28
> >>> AIRFLOW-942 |Add mytaxi to Airflow Users                       |#2111
> >>> |d579e6
> >>> AIRFLOW-935 |Impossible to use plugin executors                |#2120
> >>> |08a784
> >>> AIRFLOW-926 |jdbc connector is broken due to jaydebeapi api upd|#2651
> >>> |07ed29
> >>> AIRFLOW-917 |Incorrectly formatted failure status message      |#2109
> >>> |b8164c
> >>> AIRFLOW-916 |Fix ConfigParser deprecation warning              |#2108
> >>> |ef6dd1
> >>> AIRFLOW-911 |Add colouring and profiling info on tests         |#2106
> >>> |4f52db
> >>> AIRFLOW-903 |Add configuration setting for default DAG view.   |#2103
> >>> |cadfae
> >>> AIRFLOW-896 |BigQueryOperator fails to execute with certain inp|#2097
> >>> |2bceee
> >>> AIRFLOW-891 |Webserver Clock Should Include Day                |-     |-
> >>>
> >>> AIRFLOW-889 |Minor error in the docstrings for BaseOperator.   |#2084
> >>> |50702d
> >>> AIRFLOW-887 |Add compatibility with future v0.16               |#na
> >>> |50902d
> >>> AIRFLOW-886 |Pass Operator result to post_execute hook         |#na
> >>> |4da361
> >>> AIRFLOW-885 |Add Change.org to the list of Airflow users       |#2089
> >>> |a279be
> >>> AIRFLOW-882 |Code example in docs has unnecessary DAG>>Operator|#2088
> >>> |baa4cd
> >>> AIRFLOW-881 |Create SubDagOperator within DAG context manager w|#2087
> >>> |0ed608
> >>> AIRFLOW-880 |Fix remote log functionality inconsistencies for W|#2086
> >>> |974b75
> >>> AIRFLOW-877 |GoogleCloudStorageDownloadOperator: template_ext c|#2083
> >>> |debc69
> >>> AIRFLOW-875 |Allow HttpSensor params to be templated           |#2080
> >>> |62f503
> >>> AIRFLOW-871 |multiple places use logging.warn() instead of warn|#2082
> >>> |21d775
> >>> AIRFLOW-866 |Add FTPSensor                                     |#2070
> >>> |5f87f8
> >>> AIRFLOW-863 |Example DAG start dates should be recent to avoid |#2068
> >>> |bbfd43
> >>> AIRFLOW-862 |Add DaskExecutor                                  |#2067
> >>> |6e2210
> >>> AIRFLOW-860 |Circular module dependency prevents loading of cus|-     |-
> >>>
> >>> AIRFLOW-854 |Add Open Knowledge International to Airflow users |#2061
> >>> |51a311
> >>> AIRFLOW-842 |scheduler.clean_dirty raises warning: SAWarning: T|#2072
> >>> |485280
> >>> AIRFLOW-840 |Python3 encoding issue in Kerberos                |#2158
> >>> |639336
> >>> AIRFLOW-836 |The paused and queryview endpoints are vulnerable |#2054
> >>> |6aca2c
> >>> AIRFLOW-831 |Fix broken unit tests                             |#2050
> >>> |b86194
> >>> AIRFLOW-830 |Plugin manager should log to debug, not info      |-     |-
> >>>
> >>> AIRFLOW-829 |Reduce verbosity of successful Travis unit tests  |-     |-
> >>>
> >>> AIRFLOW-826 |Add Zendesk Hook                                  |#2066
> >>> |a09762
> >>> AIRFLOW-823 |Make task instance details available via API      |#2045
> >>> |3f546e
> >>> AIRFLOW-822 |Close the connection before throwing exception in |#2038
> >>> |4b6c38
> >>> AIRFLOW-821 |Scheduler dagbag importing not Py3 compatible     |#2039
> >>> |fbb59b
> >>> AIRFLOW-809 |SqlAlchemy is_ ColumnOperator Causing Errors in MS|-     |-
> >>>
> >>> AIRFLOW-802 |Integration of spark-submit                       |-     |-
> >>>
> >>> AIRFLOW-781 |Allow DataFlowJavaOperator to accept jar file stor|#2037
> >>> |259c86
> >>> AIRFLOW-770 |HDFS hooks should support alternative ways of gett|#2056
> >>> |261b65
> >>> AIRFLOW-756 |Refactor ssh_hook and ssh_operator                |-     |-
> >>>
> >>> AIRFLOW-751 |SFTP file transfer functionality                  |#1999
> >>> |fe0ede
> >>> AIRFLOW-725 |Make merge tool use OS' keyring for password stora|#1966
> >>> |8c1695
> >>> AIRFLOW-706 |Configuration shell commands are not split properl|#2053
> >>> |0bb6f2
> >>> AIRFLOW-705 |airflow.configuration.run_command output does not |-     |-
> >>>
> >>> AIRFLOW-681 |homepage doc link should pointing to apache's repo|#2164
> >>> |a8027a
> >>> AIRFLOW-654 |SSL for AMQP w/ Celery(Executor)                  |#2333
> >>> |868bfe
> >>> AIRFLOW-645 |HttpHook ignores https                            |#2311
> >>> |fd381a
> >>> AIRFLOW-365 |Code view in subdag trigger exception             |#2043
> >>> |cf102c
> >>> AIRFLOW-300 |Add Google Pubsub hook and operator               |#2036
> >>> |d231dc
> >>> AIRFLOW-289 |Use datetime.utcnow() to keep airflow system indep|#2618
> >>> |20c83e
> >>> AIRFLOW-71  |docker_operator - pulling from private repositorie|#na
> >>> |d4406c
> >>>
> >>> Cheers,
> >>> Chris
> >>>
> >
>
>

Re: [VOTE] Airflow 1.9.0rc1

Posted by Bolke de Bruin <bd...@gmail.com>.
Hi Chris,

Actively testing here: we found an issue in the SSHOperator introduced in 1.9.0 (fix already merged for RC2, but blocking, as it stops us from running SSH properly), and some minor fixes by Airbnb should also be in RC2. There is some logging “weirdness” that might warrant a small patch here and there and could be squeezed into RC2, but I don’t consider them blocking.

So almost there, but we need an RC2 imho.

-1, binding.

Bolke

> On 8 Nov 2017, at 19:00, Ash Berlin-Taylor <as...@firemirror.com> wrote:
> 
> -1 (for now. Non binding. Is that how this process works?)
> 
> We've built a test env for this RC and are testing, but have run into an issue reading task logs. (See below)
> 
> We haven't gotten very far with this yet, we will dig more tomorrow (it's the end of the UK work day now). I suspect this might be how we've misconfigured our logging. We will see tomorrow.
> 
> -ash
> 
> 
> 
> 
> File "/usr/local/lib/python3.5/dist-packages/airflow/www/views.py", line 712, in log
>    logs = handler.read(ti)
> AttributeError: 'NoneType' object has no attribute 'read'
> 
> During handling of the above exception, another exception occurred:
> 
> Traceback (most recent call last):
>  File "/usr/local/lib/python3.5/dist-packages/flask/app.py", line 1988, in wsgi_app
>    response = self.full_dispatch_request()
>  File "/usr/local/lib/python3.5/dist-packages/flask/app.py", line 1641, in full_dispatch_request
>    rv = self.handle_user_exception(e)
>  File "/usr/local/lib/python3.5/dist-packages/flask/app.py", line 1544, in handle_user_exception
>    reraise(exc_type, exc_value, tb)
>  File "/usr/local/lib/python3.5/dist-packages/flask/_compat.py", line 33, in reraise
>    raise value
>  File "/usr/local/lib/python3.5/dist-packages/flask/app.py", line 1639, in full_dispatch_request
>    rv = self.dispatch_request()
>  File "/usr/local/lib/python3.5/dist-packages/flask/app.py", line 1625, in dispatch_request
>    return self.view_functions[rule.endpoint](**req.view_args)
>  File "/usr/local/lib/python3.5/dist-packages/flask_admin/base.py", line 69, in inner
>    return self._run_view(f, *args, **kwargs)
>  File "/usr/local/lib/python3.5/dist-packages/flask_admin/base.py", line 368, in _run_view
>    return fn(self, *args, **kwargs)
>  File "/usr/local/lib/python3.5/dist-packages/flask_login.py", line 758, in decorated_view
>    return func(*args, **kwargs)
>  File "/usr/local/lib/python3.5/dist-packages/airflow/www/utils.py", line 262, in wrapper
>    return f(*args, **kwargs)
>  File "/usr/local/lib/python3.5/dist-packages/airflow/www/views.py", line 715, in log
>    .format(task_log_reader, e.message)]
> AttributeError: 'AttributeError' object has no attribute 'message'
> 
> 
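The second AttributeError in the traceback above comes from the error handler itself: the code at views.py line 715 formats the message with e.message, an attribute that Python 2 exceptions carried but Python 3 exceptions do not, so reporting the original 'NoneType' error raises a fresh AttributeError. A minimal sketch of the failure mode and the portable fix (the helper name is hypothetical, not Airflow's actual code):

```python
# Python 3 removed BaseException.message, so an error handler that reads
# exc.message raises a second AttributeError while reporting the first.
# str(exc) works on both Python 2 and Python 3.

def format_log_error(task_log_reader, exc):
    # Hypothetical helper mirroring the handler in www/views.py,
    # using str(exc) instead of the Python-2-only exc.message.
    return "*** Failed to read logs via {}: {}".format(task_log_reader, str(exc))

handler = None
try:
    # Reproduces the first error: 'NoneType' object has no attribute 'read'
    handler.read(None)
except AttributeError as exc:
    print(format_log_error("file.task", exc))
```

With str(exc) the original 'NoneType' error is surfaced to the user instead of being masked by the handler's own AttributeError.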
>> On 8 Nov 2017, at 17:46, Chris Riccomini <cr...@apache.org> wrote:
>> 
>> Anyone? :/
>> 
>> On Mon, Nov 6, 2017 at 1:22 PM, Chris Riccomini <cr...@apache.org>
>> wrote:
>> 
>>> Hey all,
>>> 
>>> I have cut Airflow 1.9.0 RC1. This email is calling a vote on the release,
>>> which will last for 72 hours. Consider this my (binding) +1.
>>> 
>>> Airflow 1.9.0 RC1 is available at:
>>> 
>>> https://dist.apache.org/repos/dist/dev/incubator/airflow/1.9.0rc1/
>>> 
>>> apache-airflow-1.9.0rc1+incubating-source.tar.gz is a source release that
>>> comes with INSTALL instructions.
>>> apache-airflow-1.9.0rc1+incubating-bin.tar.gz is the binary Python
>>> "sdist" release.
>>> 
>>> Public keys are available at:
>>> 
>>> https://dist.apache.org/repos/dist/release/incubator/airflow/
>>> 
>>> The release contains the following JIRAs:
>>> 
>>> ISSUE ID    |DESCRIPTION                                       |PR
>>> |COMMIT
>>> AIRFLOW-1779|Add keepalive packets to ssh hook                 |#2749
>>> |d2f9d1
>>> AIRFLOW-1776|stdout/stderr logging not captured                |#2745
>>> |590d9f
>>> AIRFLOW-1771|Change heartbeat text from boom to heartbeat      |-     |-
>>> 
>>> AIRFLOW-1767|Airflow Scheduler no longer schedules DAGs        |-     |-
>>> 
>>> AIRFLOW-1765|Default API auth backed should deny all.          |#2737
>>> |6ecdac
>>> AIRFLOW-1764|Web Interface should not use experimental api     |#2738
>>> |6bed1d
>>> AIRFLOW-1757|Contrib.SparkSubmitOperator should allow --package|#2725
>>> |4e06ee
>>> AIRFLOW-1745|BashOperator ignores SIGPIPE in subprocess        |#2714
>>> |e021c9
>>> AIRFLOW-1744|task.retries can be False                         |#2713
>>> |6144c6
>>> AIRFLOW-1743|Default config template should not contain ldap fi|#2712
>>> |270684
>>> AIRFLOW-1741|Task Duration shows two charts on first page load.|#2711
>>> |974b49
>>> AIRFLOW-1734|Sqoop Operator contains logic errors & needs optio|#2703
>>> |f6810c
>>> AIRFLOW-1731|Import custom config on PYTHONPATH                |#2721
>>> |f07eb3
>>> AIRFLOW-1726|Copy Expert command for Postgres Hook             |#2698
>>> |8a4ad3
>>> AIRFLOW-1719|Fix small typo - your vs you                      |-     |-
>>> 
>>> AIRFLOW-1712|Log SSHOperator output                            |-     |-
>>> 
>>> AIRFLOW-1711|Ldap Attributes not always a "list" part 2        |#2731
>>> |40a936
>>> AIRFLOW-1706|Scheduler is failed on startup with MS SQL Server |#2733
>>> |9e209b
>>> AIRFLOW-1698|Remove confusing SCHEDULER_RUNS env var from syste|#2677
>>> |00dd06
>>> AIRFLOW-1695|Redshift Hook using boto3 & AWS Hook              |#2717
>>> |bfddae
>>> AIRFLOW-1694|Hive Hooks: Python 3 does not have an `itertools.i|#2674
>>> |c6e5ae
>>> AIRFLOW-1692|Master cannot be checked out on windows           |#2673
>>> |31805e
>>> AIRFLOW-1691|Add better documentation for Google cloud storage |#2671
>>> |ace2b1
>>> AIRFLOW-1690|Error messages regarding gcs log commits are spars|#2670
>>> |5fb5cd
>>> AIRFLOW-1682|S3 task handler never writes to S3                |#2664
>>> |0080f0
>>> AIRFLOW-1678|Fix docstring errors for `set_upstream` and `set_d|-     |-
>>> 
>>> AIRFLOW-1677|Fix typo in example_qubole_operator               |-     |-
>>> 
>>> AIRFLOW-1676|GCS task handler never writes to GCS              |#2659
>>> |781fa4
>>> AIRFLOW-1675|Fix API docstrings to be properly rendered        |#2667
>>> |f12381
>>> AIRFLOW-1671|Missing @apply_defaults annotation for gcs downloa|#2655
>>> |97666b
>>> AIRFLOW-1669|Fix Docker import in Master                       |#na
>>> |f7f2a8
>>> AIRFLOW-1668|Redhsift requires a keep alive of < 300s          |#2650
>>> |f2bb77
>>> AIRFLOW-1664|Make MySqlToGoogleCloudStorageOperator support bin|#2649
>>> |95813d
>>> AIRFLOW-1660|Change webpage width to full-width                |#2646
>>> |8ee3d9
>>> AIRFLOW-1659|Fix invalid attribute bug in FileTaskHandler      |#2645
>>> |bee823
>>> AIRFLOW-1658|Kill (possibly) still running Druid indexing job a|#2644
>>> |cbf7ad
>>> AIRFLOW-1657|Handle failure of Qubole Operator for s3distcp had|-     |-
>>> 
>>> AIRFLOW-1654|Show tooltips for link icons in DAGs view         |#2642
>>> |ada7b2
>>> AIRFLOW-1647|Fix Spark-sql hook                                |#2637
>>> |b1e5c6
>>> AIRFLOW-1641|Task gets stuck in queued state                   |#2715
>>> |735497
>>> AIRFLOW-1640|Add Qubole default connection in connection table |-     |-
>>> 
>>> AIRFLOW-1639|ValueError does not have .message attribute       |#2629
>>> |87df67
>>> AIRFLOW-1637|readme not tracking master branch for travis      |-     |-
>>> 
>>> AIRFLOW-1636|aws and emr connection types get cleared          |#2626
>>> |540e04
>>> AIRFLOW-1635|Allow creating Google Cloud Platform connection wi|#2640
>>> |6dec7a
>>> AIRFLOW-1629|make extra a textarea in edit connections form    |#2623
>>> |f5d46f
>>> AIRFLOW-1628|Docstring of sqlsensor is incorrect               |#2621
>>> |9ba73d
>>> AIRFLOW-1627|SubDagOperator initialization should only query po|#2620
>>> |516ace
>>> AIRFLOW-1621|Add tests for logic added on server side dag list |#2614
>>> |8de9fd
>>> AIRFLOW-1614|Improve performance of DAG parsing when there are |#2610
>>> |a95adb
>>> AIRFLOW-1611|Customize logging in Airflow                      |#2631
>>> |8b4a50
>>> AIRFLOW-1609|Ignore all venvs in gitignore                     |#2608
>>> |f1f9b4
>>> AIRFLOW-1608|GCP Dataflow hook missing pending job state       |#2607
>>> |653562
>>> AIRFLOW-1606|DAG.sync_to_db is static, but takes a DAG as first|#2606
>>> |6ac296
>>> AIRFLOW-1605|Fix log source of local loggers                   |-     |-
>>> 
>>> AIRFLOW-1604|Rename the logger to log                          |#2604
>>> |af4050
>>> AIRFLOW-1602|Use LoggingMixin for the DAG class                |#2602
>>> |956699
>>> AIRFLOW-1601|Add configurable time between SIGTERM and SIGKILL |#2601
>>> |48a95e
>>> AIRFLOW-1600|Uncaught exceptions in get_fernet if cryptography |#2600
>>> |ad963e
>>> AIRFLOW-1597|Add GameWisp as Airflow user                      |#2599
>>> |26b747
>>> AIRFLOW-1594|Installing via pip copies test files into python l|#2597
>>> |a6b23a
>>> AIRFLOW-1593|Expose load_string in WasbHook                    |#2596
>>> |7ece95
>>> AIRFLOW-1591|Exception: 'TaskInstance' object has no attribute |#2578
>>> |f4653e
>>> AIRFLOW-1590|Small fix for dates util                          |#2652
>>> |31946e
>>> AIRFLOW-1587|fix `ImportError: cannot import name 'CeleryExecut|#2590
>>> |34c73b
>>> AIRFLOW-1586|MySQL to GCS to BigQuery fails for tables with dat|#2589
>>> |e83012
>>> AIRFLOW-1584|Remove the insecure /headers endpoints            |#2588
>>> |17ac07
>>> AIRFLOW-1582|Improve logging structure of Airflow              |#2592
>>> |a7a518
>>> AIRFLOW-1580|Error in string formatter when throwing an excepti|#2583
>>> |ea9ab9
>>> AIRFLOW-1579|Allow jagged rows in BQ Hook.                     |#2582
>>> |5b978b
>>> AIRFLOW-1577|Add token support to DatabricksHook               |#2579
>>> |c2c515
>>> AIRFLOW-1573|Remove `thrift < 0.10.0` requirement              |#2574
>>> |aa95f2
>>> AIRFLOW-1571|Add AWS Lambda Hook for invoking Lambda Function  |#2718
>>> |017f18
>>> AIRFLOW-1568|Add datastore import/export operator              |#2568
>>> |86063b
>>> AIRFLOW-1567|Clean up ML Engine operators                      |#2567
>>> |af91e2
>>> AIRFLOW-1564|Default logging filename contains a colon         |#2565
>>> |4c674c
>>> AIRFLOW-1560|Add AWS DynamoDB hook for inserting batch items   |#2587
>>> |71400b
>>> AIRFLOW-1556|BigQueryBaseCursor should support SQL parameters  |#2557
>>> |9df0ac
>>> AIRFLOW-1546| add Zymergen to org list in README               |#2512
>>> |7cc346
>>> AIRFLOW-1535|Add support for Dataproc serviceAccountScopes in D|#2546
>>> |b1f902
>>> AIRFLOW-1529|Support quoted newlines in Google BigQuery load jo|#2545
>>> |4a4b02
>>> AIRFLOW-1527|Refactor celery config to make use of template    |#2542
>>> |f4437b
>>> AIRFLOW-1522|Increase size of val column for variable table in |#2535
>>> |8a2d24
>>> AIRFLOW-1521|Template fields definition for bigquery_table_dele|#2534
>>> |f1a7c0
>>> AIRFLOW-1520|S3Hook uses boto2                                 |#2532
>>> |386583
>>> AIRFLOW-1519|Main DAG list page does not scale using client sid|#2531
>>> |d7d7ce
>>> AIRFLOW-1512|Add operator for running Python functions in a vir|#2446
>>> |14e6d7
>>> AIRFLOW-1507|Make src, dst and bucket parameters as templated i|#2516
>>> |d295cf
>>> AIRFLOW-1505|Document when Jinja substitution occurs           |#2523
>>> |984a87
>>> AIRFLOW-1504|Log Cluster Name on Dataproc Operator When Execute|#2517
>>> |1cd6c4
>>> AIRFLOW-1499|Eliminate duplicate and unneeded code             |-     |-
>>> 
>>> AIRFLOW-1497|Hidden fields in connection form aren't reset when|#2507
>>> |d8da8b
>>> AIRFLOW-1493|Fix race condition with airflow run               |#2505
>>> |b2e175
>>> AIRFLOW-1492|Add metric for task success/failure               |#2504
>>> |fa84d4
>>> AIRFLOW-1489|Docs: Typo in BigQueryCheckOperator               |#2501
>>> |111ce5
>>> AIRFLOW-1483|Page size on model views is to large to render qui|#2497
>>> |04bfba
>>> AIRFLOW-1478|Chart -> Owner column should be sortable          |#2493
>>> |651e60
>>> AIRFLOW-1476|Add INSTALL file for source releases              |#2492
>>> |da76ac
>>> AIRFLOW-1474|Add dag_id regex for 'airflow clear' CLI command  |#2486
>>> |18f849
>>> AIRFLOW-1470|BashSensor Implementation                         |-     |-
>>> 
>>> AIRFLOW-1459|integration rst doc is broken in github view      |#2481
>>> |322ec9
>>> AIRFLOW-1438|Scheduler batch queries should have a limit       |#2462
>>> |3547cb
>>> AIRFLOW-1437|BigQueryTableDeleteOperator should define deletion|#2459
>>> |b87903
>>> AIRFLOW-1432|NVD3 Charts do not have labeled axes and units cha|#2710
>>> |70ffa4
>>> AIRFLOW-1402|Cleanup SafeConfigParser DeprecationWarning       |#2435
>>> |38c86b
>>> AIRFLOW-1401|Standardize GCP project, region, and zone argument|#2439
>>> |b6d363
>>> AIRFLOW-1397|Airflow 1.8.1 - No data displays in Last Run Colum|-     |-
>>> 
>>> AIRFLOW-1394|Add quote_character parameter to GoogleCloudStorag|#2428
>>> |9fd0be
>>> AIRFLOW-1389|BigQueryOperator should support `createDisposition|#2470
>>> |6e2640
>>> AIRFLOW-1384|Add ARGO/CaDC                                     |#2434
>>> |715947
>>> AIRFLOW-1368|Automatically remove the container when it exits  |#2653
>>> |d42d23
>>> AIRFLOW-1359|Provide GoogleCloudML operator for model evaluatio|#2407
>>> |194d1d
>>> AIRFLOW-1356|add `--celery_hostname` to `airflow worker`       |#2405
>>> |b9d7d1
>>> AIRFLOW-1352|Revert bad logging Handler                        |-     |-
>>> 
>>> AIRFLOW-1350|Add "query_uri" parameter for Google DataProc oper|#2402
>>> |d32c72
>>> AIRFLOW-1348|Paginated UI has broken toggles after first page  |-     |-
>>> 
>>> AIRFLOW-1345|Don't commit on each loop                         |#2397
>>> |0dd002
>>> AIRFLOW-1344|Builds failing on Python 3.5 with AttributeError  |#2394
>>> |2a5883
>>> AIRFLOW-1343|Add airflow default label to the dataproc operator|#2396
>>> |e4b240
>>> AIRFLOW-1338|gcp_dataflow_hook is incompatible with the recent |#2388
>>> |cf2605
>>> AIRFLOW-1337|Customize log format via config file              |#2392
>>> |4841e3
>>> AIRFLOW-1335|Use buffered logger                               |#2386
>>> |0d23d3
>>> AIRFLOW-1333|Enable copy function for Google Cloud Storage Hook|#2385
>>> |e2c383
>>> AIRFLOW-1331|Contrib.SparkSubmitOperator should allow --package|#2622
>>> |fbca8f
>>> AIRFLOW-1330|Connection.parse_from_uri doesn't work for google_|#2525
>>> |6e5e9d
>>> AIRFLOW-1324|Make the Druid operator/hook more general         |#2378
>>> |de99aa
>>> AIRFLOW-1323|Operators related to Dataproc should keep some par|#2636
>>> |ed248d
>>> AIRFLOW-1315|Add Qubole File and Partition Sensors             |-     |-
>>> 
>>> AIRFLOW-1309|Add optional hive_tblproperties in HiveToDruidTran|-     |-
>>> 
>>> AIRFLOW-1301|Add New Relic to Airflow user list                |#2359
>>> |355fc9
>>> AIRFLOW-1299|Google Dataproc cluster creation operator should s|#2358
>>> |c2b80e
>>> AIRFLOW-1289|Don't restrict scheduler threads to CPU cores     |#2353
>>> |8e23d2
>>> AIRFLOW-1286|BaseTaskRunner - Exception TypeError: a bytes-like|#2363
>>> |d8891d
>>> AIRFLOW-1277|Forbid creation of a known event with empty fields|#na
>>> |65184a
>>> AIRFLOW-1276|Forbid event creation with end_data earlier than s|#na
>>> |d5d02f
>>> AIRFLOW-1275|Fix `airflow pool` command exception              |#2346
>>> |9958aa
>>> AIRFLOW-1273|Google Cloud ML Version and Model CRUD Operator   |#2379
>>> |534a0e
>>> AIRFLOW-1272|Google Cloud ML Batch Prediction Operator         |#2390
>>> |e92d6b
>>> AIRFLOW-1271|Google Cloud ML Training Operator                 |#2408
>>> |0fc450
>>> AIRFLOW-1256|Add United Airlines as Airflow user               |#2332
>>> |d3484a
>>> AIRFLOW-1251|Add eRevalue as an Airflow user                   |#2331
>>> |8d5160
>>> AIRFLOW-1248|Fix inconsistent configuration name for worker tim|#2328
>>> |92314f
>>> AIRFLOW-1247|CLI: ignore all dependencies argument ignored     |#2441
>>> |e88ecf
>>> AIRFLOW-1245|Fix random failure of test_trigger_dag_for_date un|#2325
>>> |cef01b
>>> AIRFLOW-1244|Forbid creation of a pool with empty name         |#2324
>>> |df9a10
>>> AIRFLOW-1242|BigQueryHook assumes that a valid project_id can't|#2335
>>> |ffe616
>>> AIRFLOW-1237|Fix IN-predicate sqlalchemy warning               |#2320
>>> |a1f422
>>> AIRFLOW-1234|Cover utils.operator_helpers with unit tests      |#2317
>>> |d16537
>>> AIRFLOW-1233|Cover utils.json with unit tests                  |#2316
>>> |502410
>>> AIRFLOW-1232|Remove deprecated readfp warning                  |#2315
>>> |6ffaaf
>>> AIRFLOW-1231|Use flask_wtf.CSRFProtect instead of flask_wtf.Csr|#2313
>>> |cac49e
>>> AIRFLOW-1221|Fix DatabricksSubmitRunOperator Templating        |#2308
>>> |0fa104
>>> AIRFLOW-1217|Enable logging in Sqoop hook                      |#2307
>>> |4f459b
>>> AIRFLOW-1213|Add hcatalog parameters to the sqoop operator/hook|#2305
>>> |857850
>>> AIRFLOW-1208|Speed-up cli tests                                |#2301
>>> |21c142
>>> AIRFLOW-1207|Enable utils.helpers unit tests                   |#2300
>>> |8ac87b
>>> AIRFLOW-1203|Tests failing after oauth upgrade                 |#2296
>>> |3e9c66
>>> AIRFLOW-1201|Update deprecated 'nose-parameterized' library to |#2298
>>> |d2d3e4
>>> AIRFLOW-1193|Add Checkr to Airflow user list                   |#2276
>>> |707238
>>> AIRFLOW-1189|Get pandas DataFrame using BigQueryHook fails     |#2287
>>> |93666f
>>> AIRFLOW-1188|Add max_bad_records param to GoogleCloudStorageToB|#2286
>>> |443e6b
>>> AIRFLOW-1187|Obsolete package names in documentation           |-     |-
>>> 
>>> AIRFLOW-1185|Incorrect url to PyPi                             |#2283
>>> |829755
>>> AIRFLOW-1181|Enable delete and list function for Google Cloud S|#2281
>>> |24f73c
>>> AIRFLOW-1179|Pandas 0.20 broke Google BigQuery hook            |#2279
>>> |ac9ccb
>>> AIRFLOW-1177|variable json deserialize does not work at set def|#2540
>>> |65319a
>>> AIRFLOW-1175|Add Pronto Tools to Airflow user list             |#2277
>>> |86aafa
>>> AIRFLOW-1173|Add Robinhood to list of Airflow users            |#2271
>>> |379115
>>> AIRFLOW-1165|airflow webservice crashes on ubuntu16 - python3  |-     |-
>>> 
>>> AIRFLOW-1160|Upadte SparkSubmitOperator parameters             |#2265
>>> |2e3f07
>>> AIRFLOW-1155|Add Tails.com to community                        |#2261
>>> |2fa690
>>> AIRFLOW-1149|Allow custom filters to be added to jinja2        |#2258
>>> |48135a
>>> AIRFLOW-1141|Remove DAG.crawl_for_tasks method                 |#2275
>>> |a30fee
>>> AIRFLOW-1140|DatabricksSubmitRunOperator should template the "j|#2255
>>> |e6d316
>>> AIRFLOW-1136|Invalid parameters are not captured for Sqoop oper|#2252
>>> |2ef4db
>>> AIRFLOW-1125|Clarify documentation regarding fernet_key        |#2251
>>> |831f8d
>>> AIRFLOW-1122|Node strokes are too thin for people with color vi|#2246
>>> |a08761
>>> AIRFLOW-1121|airflow webserver --pid no longer write out pid fi|-     |-
>>> 
>>> AIRFLOW-1118|Add evo.company to Airflow users                  |#2243
>>> |f16914
>>> AIRFLOW-1112|Log which pool is full in scheduler when pool slot|#2242
>>> |74c1ce
>>> AIRFLOW-1107|Add support for ftps non-default port             |#2240
>>> |4d0c2f
>>> AIRFLOW-1106|Add Groupalia/Letsbonus                           |#2239
>>> |945b42
>>> AIRFLOW-1095|ldap_auth memberOf should come from configuration |#2232
>>> |6b1c32
>>> AIRFLOW-1094|Invalid unit tests under `contrib/`               |#2234
>>> |219c50
>>> AIRFLOW-1091|As a release manager I want to be able to compare |#2231
>>> |bfae42
>>> AIRFLOW-1090|Add HBO                                           |#2230
>>> |177d34
>>> AIRFLOW-1089|Add Spark application arguments to SparkSubmitOper|#2229
>>> |e5b914
>>> AIRFLOW-1081|Task duration page is slow                        |#2226
>>> |0da512
>>> AIRFLOW-1075|Cleanup security docs                             |#2222
>>> |5a6f18
>>> AIRFLOW-1065|Add functionality for Azure Blob Storage          |#2216
>>> |f1bc5f
>>> AIRFLOW-1059|Reset_state_for_orphaned_task should operate in ba|#2205
>>> |e05d3b
>>> AIRFLOW-1058|Improvements for SparkSubmitOperator              |-     |-
>>> 
>>> AIRFLOW-1051|Add a test for resetdb to CliTests                |#2198
>>> |15aee0
>>> AIRFLOW-1047|Airflow logs vulnerable to XSS                    |#2193
>>> |fe9ebe
>>> AIRFLOW-1045|Make log level configurable via airflow.cfg       |#2191
>>> |e739a5
>>> AIRFLOW-1043|Documentation issues for operators                |#2188
>>> |b55f41
>>> AIRFLOW-1041|DockerOperator replaces its xcom_push method with |#2274
>>> |03704c
>>> AIRFLOW-1040|Fix typos in comments/docstrings in models.py     |#2174
>>> |d8c0f5
>>> AIRFLOW-1036|Exponential backoff should use randomization      |#2262
>>> |66168e
>>> AIRFLOW-1035|Exponential backoff retry logic should use 2 as ba|#2196
>>> |4ec932
>>> AIRFLOW-1034|Make it possible to connect to S3 in sigv4 regions|#2181
>>> |4c0905
>>> AIRFLOW-1031|'scheduled__' may replace with DagRun.ID_PREFIX in|#2613
>>> |aa3844
>>> AIRFLOW-1030|HttpHook error when creating HttpSensor           |-     |-
>>> 
>>> AIRFLOW-1028|Databricks Operator for Airflow                   |#2202
>>> |53ca50
>>> AIRFLOW-1024|Handle CeleryExecutor errors gracefully           |#2355
>>> |7af20f
>>> AIRFLOW-1018|Scheduler DAG processes can not log to stdout     |#2728
>>> |ef775d
>>> AIRFLOW-1016|Allow HTTP HEAD request method on HTTPSensor      |#2175
>>> |4c41f6
>>> AIRFLOW-1010|Add a convenience script for signing              |#2169
>>> |a2b65a
>>> AIRFLOW-1009|Remove SQLOperator from Concepts page             |#2168
>>> |7d1144
>>> AIRFLOW-1007|Jinja sandbox is vulnerable to RCE                |#2184
>>> |daa281
>>> AIRFLOW-1005|Speed up Airflow startup time                     |#na
>>> |996dd3
>>> AIRFLOW-999 |Support for Redis database                        |#2165
>>> |8de850
>>> AIRFLOW-997 |Change setup.cfg to point to Apache instead of Max|#na
>>> |75cd46
>>> AIRFLOW-995 |Update Github PR template                         |#2163
>>> |b62485
>>> AIRFLOW-994 |Add MiNODES to the AIRFLOW Active Users List      |#2159
>>> |ca1623
>>> AIRFLOW-991 |Mark_success while a task is running leads to fail|-     |-
>>> 
>>> AIRFLOW-990 |DockerOperator fails when logging unicode string  |#2155
>>> |6bbf54
>>> AIRFLOW-988 |SLA Miss Callbacks Are Repeated if Email is Not be|#2415
>>> |6e74d4
>>> AIRFLOW-985 |Extend the sqoop operator/hook with additional par|#2177
>>> |82eb20
>>> AIRFLOW-984 |Subdags unrecognized when subclassing SubDagOperat|#2152
>>> |a8bd16
>>> AIRFLOW-979 |Add GovTech GDS                                   |#2149
>>> |b17bd3
>>> AIRFLOW-976 |Mark success running task causes it to fail       |-     |-
>>> 
>>> AIRFLOW-969 |Catch bad python_callable argument at DAG construc|#2142
>>> |12901d
>>> AIRFLOW-963 |Some code examples are not rendered in the airflow|#2139
>>> |f69c1b
>>> AIRFLOW-960 |Add support for .editorconfig                     |#na
>>> |f5cacc
>>> AIRFLOW-959 |.gitignore file is disorganized and incomplete    |#na
>>> |3d3c14
>>> AIRFLOW-958 |Improve tooltip readability                       |#2134
>>> |b3c3eb
>>> AIRFLOW-950 |Missing AWS integrations on documentation::integra|#2552
>>> |01be02
>>> AIRFLOW-947 |Make PrestoHook surface better messages when the P|#na
>>> |6dd4b3
>>> AIRFLOW-945 |Revert psycopg2 workaround when psycopg2 2.7.1 is |-     |-
>>> 
>>> AIRFLOW-943 |Add Digital First Media to the Airflow users list |#2115
>>> |2cfe28
>>> AIRFLOW-942 |Add mytaxi to Airflow Users                       |#2111
>>> |d579e6
>>> AIRFLOW-935 |Impossible to use plugin executors                |#2120
>>> |08a784
>>> AIRFLOW-926 |jdbc connector is broken due to jaydebeapi api upd|#2651
>>> |07ed29
>>> AIRFLOW-917 |Incorrectly formatted failure status message      |#2109
>>> |b8164c
>>> AIRFLOW-916 |Fix ConfigParser deprecation warning              |#2108
>>> |ef6dd1
>>> AIRFLOW-911 |Add colouring and profiling info on tests         |#2106
>>> |4f52db
>>> AIRFLOW-903 |Add configuration setting for default DAG view.   |#2103
>>> |cadfae
>>> AIRFLOW-896 |BigQueryOperator fails to execute with certain inp|#2097
>>> |2bceee
>>> AIRFLOW-891 |Webserver Clock Should Include Day                |-     |-
>>> 
>>> AIRFLOW-889 |Minor error in the docstrings for BaseOperator.   |#2084
>>> |50702d
>>> AIRFLOW-887 |Add compatibility with future v0.16               |#na
>>> |50902d
>>> AIRFLOW-886 |Pass Operator result to post_execute hook         |#na
>>> |4da361
>>> AIRFLOW-885 |Add Change.org to the list of Airflow users       |#2089
>>> |a279be
>>> AIRFLOW-882 |Code example in docs has unnecessary DAG>>Operator|#2088
>>> |baa4cd
>>> AIRFLOW-881 |Create SubDagOperator within DAG context manager w|#2087
>>> |0ed608
>>> AIRFLOW-880 |Fix remote log functionality inconsistencies for W|#2086
>>> |974b75
>>> AIRFLOW-877 |GoogleCloudStorageDownloadOperator: template_ext c|#2083
>>> |debc69
>>> AIRFLOW-875 |Allow HttpSensor params to be templated           |#2080
>>> |62f503
>>> AIRFLOW-871 |multiple places use logging.warn() instead of warn|#2082
>>> |21d775
>>> AIRFLOW-866 |Add FTPSensor                                     |#2070
>>> |5f87f8
>>> AIRFLOW-863 |Example DAG start dates should be recent to avoid |#2068
>>> |bbfd43
>>> AIRFLOW-862 |Add DaskExecutor                                  |#2067
>>> |6e2210
>>> AIRFLOW-860 |Circular module dependency prevents loading of cus|-     |-
>>> 
>>> AIRFLOW-854 |Add Open Knowledge International to Airflow users |#2061
>>> |51a311
>>> AIRFLOW-842 |scheduler.clean_dirty raises warning: SAWarning: T|#2072
>>> |485280
>>> AIRFLOW-840 |Python3 encoding issue in Kerberos                |#2158
>>> |639336
>>> AIRFLOW-836 |The paused and queryview endpoints are vulnerable |#2054
>>> |6aca2c
>>> AIRFLOW-831 |Fix broken unit tests                             |#2050
>>> |b86194
>>> AIRFLOW-830 |Plugin manager should log to debug, not info      |-     |-
>>> 
>>> AIRFLOW-829 |Reduce verbosity of successful Travis unit tests  |-     |-
>>> 
>>> AIRFLOW-826 |Add Zendesk Hook                                  |#2066
>>> |a09762
>>> AIRFLOW-823 |Make task instance details available via API      |#2045
>>> |3f546e
>>> AIRFLOW-822 |Close the connection before throwing exception in |#2038
>>> |4b6c38
>>> AIRFLOW-821 |Scheduler dagbag importing not Py3 compatible     |#2039
>>> |fbb59b
>>> AIRFLOW-809 |SqlAlchemy is_ ColumnOperator Causing Errors in MS|-     |-
>>> 
>>> AIRFLOW-802 |Integration of spark-submit                       |-     |-
>>> 
>>> AIRFLOW-781 |Allow DataFlowJavaOperator to accept jar file stor|#2037
>>> |259c86
>>> AIRFLOW-770 |HDFS hooks should support alternative ways of gett|#2056
>>> |261b65
>>> AIRFLOW-756 |Refactor ssh_hook and ssh_operator                |-     |-
>>> 
>>> AIRFLOW-751 |SFTP file transfer functionality                  |#1999
>>> |fe0ede
>>> AIRFLOW-725 |Make merge tool use OS' keyring for password stora|#1966
>>> |8c1695
>>> AIRFLOW-706 |Configuration shell commands are not split properl|#2053
>>> |0bb6f2
>>> AIRFLOW-705 |airflow.configuration.run_command output does not |-     |-
>>> 
>>> AIRFLOW-681 |homepage doc link should point to apache's repo   |#2164
>>> |a8027a
>>> AIRFLOW-654 |SSL for AMQP w/ Celery(Executor)                  |#2333
>>> |868bfe
>>> AIRFLOW-645 |HttpHook ignores https                            |#2311
>>> |fd381a
>>> AIRFLOW-365 |Code view in subdag trigger exception             |#2043
>>> |cf102c
>>> AIRFLOW-300 |Add Google Pubsub hook and operator               |#2036
>>> |d231dc
>>> AIRFLOW-289 |Use datetime.utcnow() to keep airflow system indep|#2618
>>> |20c83e
>>> AIRFLOW-71  |docker_operator - pulling from private repositorie|#na
>>> |d4406c
>>> 
>>> Cheers,
>>> Chris
>>> 
> 


Re: [VOTE] Airflow 1.9.0rc1

Posted by Ash Berlin-Taylor <as...@firemirror.com>.
-1 (for now. Non-binding. Is that how this process works?)

We've built a test env for this RC and are testing, but we have run into an issue reading task logs (see below).

We haven't gotten very far with this yet; we will dig more tomorrow (it's the end of the UK work day now). I suspect this might be down to how we've misconfigured our logging. We will see tomorrow.

-ash




File "/usr/local/lib/python3.5/dist-packages/airflow/www/views.py", line 712, in log
    logs = handler.read(ti)
AttributeError: 'NoneType' object has no attribute 'read'

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/usr/local/lib/python3.5/dist-packages/flask/app.py", line 1988, in wsgi_app
    response = self.full_dispatch_request()
  File "/usr/local/lib/python3.5/dist-packages/flask/app.py", line 1641, in full_dispatch_request
    rv = self.handle_user_exception(e)
  File "/usr/local/lib/python3.5/dist-packages/flask/app.py", line 1544, in handle_user_exception
    reraise(exc_type, exc_value, tb)
  File "/usr/local/lib/python3.5/dist-packages/flask/_compat.py", line 33, in reraise
    raise value
  File "/usr/local/lib/python3.5/dist-packages/flask/app.py", line 1639, in full_dispatch_request
    rv = self.dispatch_request()
  File "/usr/local/lib/python3.5/dist-packages/flask/app.py", line 1625, in dispatch_request
    return self.view_functions[rule.endpoint](**req.view_args)
  File "/usr/local/lib/python3.5/dist-packages/flask_admin/base.py", line 69, in inner
    return self._run_view(f, *args, **kwargs)
  File "/usr/local/lib/python3.5/dist-packages/flask_admin/base.py", line 368, in _run_view
    return fn(self, *args, **kwargs)
  File "/usr/local/lib/python3.5/dist-packages/flask_login.py", line 758, in decorated_view
    return func(*args, **kwargs)
  File "/usr/local/lib/python3.5/dist-packages/airflow/www/utils.py", line 262, in wrapper
    return f(*args, **kwargs)
  File "/usr/local/lib/python3.5/dist-packages/airflow/www/views.py", line 715, in log
    .format(task_log_reader, e.message)]
AttributeError: 'AttributeError' object has no attribute 'message'
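
The second error in that traceback looks like a Python 3 incompatibility rather than a config problem: `BaseException.message` was removed in Python 3, so the `except` handler in views.py that formats the original error with `e.message` raises its own AttributeError. A minimal sketch of the incompatibility (plain Python, nothing Airflow-specific assumed):

```python
# BaseException.message was deprecated in Python 2.6 and removed in
# Python 3, so any handler written as `e.message` fails there.
# str(e) is the portable way to get the error text.
try:
    raise AttributeError("'NoneType' object has no attribute 'read'")
except AttributeError as e:
    assert not hasattr(e, "message")  # no .message attribute on Python 3
    detail = str(e)                   # works on both Python 2 and 3

print(detail)  # 'NoneType' object has no attribute 'read'
```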


> On 8 Nov 2017, at 17:46, Chris Riccomini <cr...@apache.org> wrote:
> 
> Anyone? :/
> 
> On Mon, Nov 6, 2017 at 1:22 PM, Chris Riccomini <cr...@apache.org>
> wrote:
> 
>> Hey all,
>> 
>> I have cut Airflow 1.9.0 RC1. This email is calling a vote on the release,
>> which will last for 72 hours. Consider this my (binding) +1.
>> 
>> Airflow 1.9.0 RC1 is available at:
>> 
>> https://dist.apache.org/repos/dist/dev/incubator/airflow/1.9.0rc1/
>> 
>> apache-airflow-1.9.0rc1+incubating-source.tar.gz is a source release that
>> comes with INSTALL instructions.
>> apache-airflow-1.9.0rc1+incubating-bin.tar.gz is the binary Python
>> "sdist" release.
>> 
>> Public keys are available at:
>> 
>> https://dist.apache.org/repos/dist/release/incubator/airflow/
>> 
>> The release contains the following JIRAs:
>> 
>> ISSUE ID    |DESCRIPTION                                       |PR
>> |COMMIT
>> AIRFLOW-1779|Add keepalive packets to ssh hook                 |#2749
>> |d2f9d1
>> AIRFLOW-1776|stdout/stderr logging not captured                |#2745
>> |590d9f
>> AIRFLOW-1771|Change heartbeat text from boom to heartbeat      |-     |-
>> 
>> AIRFLOW-1767|Airflow Scheduler no longer schedules DAGs        |-     |-
>> 
>> AIRFLOW-1765|Default API auth backed should deny all.          |#2737
>> |6ecdac
>> AIRFLOW-1764|Web Interface should not use experimental api     |#2738
>> |6bed1d
>> AIRFLOW-1757|Contrib.SparkSubmitOperator should allow --package|#2725
>> |4e06ee
>> AIRFLOW-1745|BashOperator ignores SIGPIPE in subprocess        |#2714
>> |e021c9
>> AIRFLOW-1744|task.retries can be False                         |#2713
>> |6144c6
>> AIRFLOW-1743|Default config template should not contain ldap fi|#2712
>> |270684
>> AIRFLOW-1741|Task Duration shows two charts on first page load.|#2711
>> |974b49
>> AIRFLOW-1734|Sqoop Operator contains logic errors & needs optio|#2703
>> |f6810c
>> AIRFLOW-1731|Import custom config on PYTHONPATH                |#2721
>> |f07eb3
>> AIRFLOW-1726|Copy Expert command for Postgres Hook             |#2698
>> |8a4ad3
>> AIRFLOW-1719|Fix small typo - your vs you                      |-     |-
>> 
>> AIRFLOW-1712|Log SSHOperator output                            |-     |-
>> 
>> AIRFLOW-1711|Ldap Attributes not always a "list" part 2        |#2731
>> |40a936
>> AIRFLOW-1706|Scheduler is failed on startup with MS SQL Server |#2733
>> |9e209b
>> AIRFLOW-1698|Remove confusing SCHEDULER_RUNS env var from syste|#2677
>> |00dd06
>> AIRFLOW-1695|Redshift Hook using boto3 & AWS Hook              |#2717
>> |bfddae
>> AIRFLOW-1694|Hive Hooks: Python 3 does not have an `itertools.i|#2674
>> |c6e5ae
>> AIRFLOW-1692|Master cannot be checked out on windows           |#2673
>> |31805e
>> AIRFLOW-1691|Add better documentation for Google cloud storage |#2671
>> |ace2b1
>> AIRFLOW-1690|Error messages regarding gcs log commits are spars|#2670
>> |5fb5cd
>> AIRFLOW-1682|S3 task handler never writes to S3                |#2664
>> |0080f0
>> AIRFLOW-1678|Fix docstring errors for `set_upstream` and `set_d|-     |-
>> 
>> AIRFLOW-1677|Fix typo in example_qubole_operator               |-     |-
>> 
>> AIRFLOW-1676|GCS task handler never writes to GCS              |#2659
>> |781fa4
>> AIRFLOW-1675|Fix API docstrings to be properly rendered        |#2667
>> |f12381
>> AIRFLOW-1671|Missing @apply_defaults annotation for gcs downloa|#2655
>> |97666b
>> AIRFLOW-1669|Fix Docker import in Master                       |#na
>> |f7f2a8
>> AIRFLOW-1668|Redshift requires a keep alive of < 300s          |#2650
>> |f2bb77
>> AIRFLOW-1664|Make MySqlToGoogleCloudStorageOperator support bin|#2649
>> |95813d
>> AIRFLOW-1660|Change webpage width to full-width                |#2646
>> |8ee3d9
>> AIRFLOW-1659|Fix invalid attribute bug in FileTaskHandler      |#2645
>> |bee823
>> AIRFLOW-1658|Kill (possibly) still running Druid indexing job a|#2644
>> |cbf7ad
>> AIRFLOW-1657|Handle failure of Qubole Operator for s3distcp had|-     |-
>> 
>> AIRFLOW-1654|Show tooltips for link icons in DAGs view         |#2642
>> |ada7b2
>> AIRFLOW-1647|Fix Spark-sql hook                                |#2637
>> |b1e5c6
>> AIRFLOW-1641|Task gets stuck in queued state                   |#2715
>> |735497
>> AIRFLOW-1640|Add Qubole default connection in connection table |-     |-
>> 
>> AIRFLOW-1639|ValueError does not have .message attribute       |#2629
>> |87df67
>> AIRFLOW-1637|readme not tracking master branch for travis      |-     |-
>> 
>> AIRFLOW-1636|aws and emr connection types get cleared          |#2626
>> |540e04
>> AIRFLOW-1635|Allow creating Google Cloud Platform connection wi|#2640
>> |6dec7a
>> AIRFLOW-1629|make extra a textarea in edit connections form    |#2623
>> |f5d46f
>> AIRFLOW-1628|Docstring of sqlsensor is incorrect               |#2621
>> |9ba73d
>> AIRFLOW-1627|SubDagOperator initialization should only query po|#2620
>> |516ace
>> AIRFLOW-1621|Add tests for logic added on server side dag list |#2614
>> |8de9fd
>> AIRFLOW-1614|Improve performance of DAG parsing when there are |#2610
>> |a95adb
>> AIRFLOW-1611|Customize logging in Airflow                      |#2631
>> |8b4a50
>> AIRFLOW-1609|Ignore all venvs in gitignore                     |#2608
>> |f1f9b4
>> AIRFLOW-1608|GCP Dataflow hook missing pending job state       |#2607
>> |653562
>> AIRFLOW-1606|DAG.sync_to_db is static, but takes a DAG as first|#2606
>> |6ac296
>> AIRFLOW-1605|Fix log source of local loggers                   |-     |-
>> 
>> AIRFLOW-1604|Rename the logger to log                          |#2604
>> |af4050
>> AIRFLOW-1602|Use LoggingMixin for the DAG class                |#2602
>> |956699
>> AIRFLOW-1601|Add configurable time between SIGTERM and SIGKILL |#2601
>> |48a95e
>> AIRFLOW-1600|Uncaught exceptions in get_fernet if cryptography |#2600
>> |ad963e
>> AIRFLOW-1597|Add GameWisp as Airflow user                      |#2599
>> |26b747
>> AIRFLOW-1594|Installing via pip copies test files into python l|#2597
>> |a6b23a
>> AIRFLOW-1593|Expose load_string in WasbHook                    |#2596
>> |7ece95
>> AIRFLOW-1591|Exception: 'TaskInstance' object has no attribute |#2578
>> |f4653e
>> AIRFLOW-1590|Small fix for dates util                          |#2652
>> |31946e
>> AIRFLOW-1587|fix `ImportError: cannot import name 'CeleryExecut|#2590
>> |34c73b
>> AIRFLOW-1586|MySQL to GCS to BigQuery fails for tables with dat|#2589
>> |e83012
>> AIRFLOW-1584|Remove the insecure /headers endpoints            |#2588
>> |17ac07
>> AIRFLOW-1582|Improve logging structure of Airflow              |#2592
>> |a7a518
>> AIRFLOW-1580|Error in string formatter when throwing an excepti|#2583
>> |ea9ab9
>> AIRFLOW-1579|Allow jagged rows in BQ Hook.                     |#2582
>> |5b978b
>> AIRFLOW-1577|Add token support to DatabricksHook               |#2579
>> |c2c515
>> AIRFLOW-1573|Remove `thrift < 0.10.0` requirement              |#2574
>> |aa95f2
>> AIRFLOW-1571|Add AWS Lambda Hook for invoking Lambda Function  |#2718
>> |017f18
>> AIRFLOW-1568|Add datastore import/export operator              |#2568
>> |86063b
>> AIRFLOW-1567|Clean up ML Engine operators                      |#2567
>> |af91e2
>> AIRFLOW-1564|Default logging filename contains a colon         |#2565
>> |4c674c
>> AIRFLOW-1560|Add AWS DynamoDB hook for inserting batch items   |#2587
>> |71400b
>> AIRFLOW-1556|BigQueryBaseCursor should support SQL parameters  |#2557
>> |9df0ac
>> AIRFLOW-1546|add Zymergen to org list in README                |#2512
>> |7cc346
>> AIRFLOW-1535|Add support for Dataproc serviceAccountScopes in D|#2546
>> |b1f902
>> AIRFLOW-1529|Support quoted newlines in Google BigQuery load jo|#2545
>> |4a4b02
>> AIRFLOW-1527|Refactor celery config to make use of template    |#2542
>> |f4437b
>> AIRFLOW-1522|Increase size of val column for variable table in |#2535
>> |8a2d24
>> AIRFLOW-1521|Template fields definition for bigquery_table_dele|#2534
>> |f1a7c0
>> AIRFLOW-1520|S3Hook uses boto2                                 |#2532
>> |386583
>> AIRFLOW-1519|Main DAG list page does not scale using client sid|#2531
>> |d7d7ce
>> AIRFLOW-1512|Add operator for running Python functions in a vir|#2446
>> |14e6d7
>> AIRFLOW-1507|Make src, dst and bucket parameters as templated i|#2516
>> |d295cf
>> AIRFLOW-1505|Document when Jinja substitution occurs           |#2523
>> |984a87
>> AIRFLOW-1504|Log Cluster Name on Dataproc Operator When Execute|#2517
>> |1cd6c4
>> AIRFLOW-1499|Eliminate duplicate and unneeded code              |-     |-
>> 
>> AIRFLOW-1497|Hidden fields in connection form aren't reset when|#2507
>> |d8da8b
>> AIRFLOW-1493|Fix race condition with airflow run               |#2505
>> |b2e175
>> AIRFLOW-1492|Add metric for task success/failure               |#2504
>> |fa84d4
>> AIRFLOW-1489|Docs: Typo in BigQueryCheckOperator               |#2501
>> |111ce5
>> AIRFLOW-1483|Page size on model views is to large to render qui|#2497
>> |04bfba
>> AIRFLOW-1478|Chart -> Owner column should be sortable          |#2493
>> |651e60
>> AIRFLOW-1476|Add INSTALL file for source releases              |#2492
>> |da76ac
>> AIRFLOW-1474|Add dag_id regex for 'airflow clear' CLI command  |#2486
>> |18f849
>> AIRFLOW-1470|BashSensor Implementation                          |-     |-
>> 
>> AIRFLOW-1459|integration rst doc is broken in github view      |#2481
>> |322ec9
>> AIRFLOW-1438|Scheduler batch queries should have a limit       |#2462
>> |3547cb
>> AIRFLOW-1437|BigQueryTableDeleteOperator should define deletion|#2459
>> |b87903
>> AIRFLOW-1432|NVD3 Charts do not have labeled axes and units cha|#2710
>> |70ffa4
>> AIRFLOW-1402|Cleanup SafeConfigParser DeprecationWarning       |#2435
>> |38c86b
>> AIRFLOW-1401|Standardize GCP project, region, and zone argument|#2439
>> |b6d363
>> AIRFLOW-1397|Airflow 1.8.1 - No data displays in Last Run Colum|-     |-
>> 
>> AIRFLOW-1394|Add quote_character parameter to GoogleCloudStorag|#2428
>> |9fd0be
>> AIRFLOW-1389|BigQueryOperator should support `createDisposition|#2470
>> |6e2640
>> AIRFLOW-1384|Add ARGO/CaDC                                     |#2434
>> |715947
>> AIRFLOW-1368|Automatically remove the container when it exits  |#2653
>> |d42d23
>> AIRFLOW-1359|Provide GoogleCloudML operator for model evaluatio|#2407
>> |194d1d
>> AIRFLOW-1356|add `--celery_hostname` to `airflow worker`       |#2405
>> |b9d7d1
>> AIRFLOW-1352|Revert bad logging Handler                        |-     |-
>> 
>> AIRFLOW-1350|Add "query_uri" parameter for Google DataProc oper|#2402
>> |d32c72
>> AIRFLOW-1348|Paginated UI has broken toggles after first page  |-     |-
>> 
>> AIRFLOW-1345|Don't commit on each loop                         |#2397
>> |0dd002
>> AIRFLOW-1344|Builds failing on Python 3.5 with AttributeError  |#2394
>> |2a5883
>> AIRFLOW-1343|Add airflow default label to the dataproc operator|#2396
>> |e4b240
>> AIRFLOW-1338|gcp_dataflow_hook is incompatible with the recent |#2388
>> |cf2605
>> AIRFLOW-1337|Customize log format via config file              |#2392
>> |4841e3
>> AIRFLOW-1335|Use buffered logger                               |#2386
>> |0d23d3
>> AIRFLOW-1333|Enable copy function for Google Cloud Storage Hook|#2385
>> |e2c383
>> AIRFLOW-1331|Contrib.SparkSubmitOperator should allow --package|#2622
>> |fbca8f
>> AIRFLOW-1330|Connection.parse_from_uri doesn't work for google_|#2525
>> |6e5e9d
>> AIRFLOW-1324|Make the Druid operator/hook more general         |#2378
>> |de99aa
>> AIRFLOW-1323|Operators related to Dataproc should keep some par|#2636
>> |ed248d
>> AIRFLOW-1315|Add Qubole File and Partition Sensors             |-     |-
>> 
>> AIRFLOW-1309|Add optional hive_tblproperties in HiveToDruidTran|-     |-
>> 
>> AIRFLOW-1301|Add New Relic to Airflow user list                |#2359
>> |355fc9
>> AIRFLOW-1299|Google Dataproc cluster creation operator should s|#2358
>> |c2b80e
>> AIRFLOW-1289|Don't restrict scheduler threads to CPU cores     |#2353
>> |8e23d2
>> AIRFLOW-1286|BaseTaskRunner - Exception TypeError: a bytes-like|#2363
>> |d8891d
>> AIRFLOW-1277|Forbid creation of a known event with empty fields|#na
>> |65184a
>> AIRFLOW-1276|Forbid event creation with end_data earlier than s|#na
>> |d5d02f
>> AIRFLOW-1275|Fix `airflow pool` command exception              |#2346
>> |9958aa
>> AIRFLOW-1273|Google Cloud ML Version and Model CRUD Operator   |#2379
>> |534a0e
>> AIRFLOW-1272|Google Cloud ML Batch Prediction Operator         |#2390
>> |e92d6b
>> AIRFLOW-1271|Google Cloud ML Training Operator                 |#2408
>> |0fc450
>> AIRFLOW-1256|Add United Airlines as Airflow user               |#2332
>> |d3484a
>> AIRFLOW-1251|Add eRevalue as an Airflow user                   |#2331
>> |8d5160
>> AIRFLOW-1248|Fix inconsistent configuration name for worker tim|#2328
>> |92314f
>> AIRFLOW-1247|CLI: ignore all dependencies argument ignored     |#2441
>> |e88ecf
>> AIRFLOW-1245|Fix random failure of test_trigger_dag_for_date un|#2325
>> |cef01b
>> AIRFLOW-1244|Forbid creation of a pool with empty name         |#2324
>> |df9a10
>> AIRFLOW-1242|BigQueryHook assumes that a valid project_id can't|#2335
>> |ffe616
>> AIRFLOW-1237|Fix IN-predicate sqlalchemy warning               |#2320
>> |a1f422
>> AIRFLOW-1234|Cover utils.operator_helpers with unit tests      |#2317
>> |d16537
>> AIRFLOW-1233|Cover utils.json with unit tests                  |#2316
>> |502410
>> AIRFLOW-1232|Remove deprecated readfp warning                  |#2315
>> |6ffaaf
>> AIRFLOW-1231|Use flask_wtf.CSRFProtect instead of flask_wtf.Csr|#2313
>> |cac49e
>> AIRFLOW-1221|Fix DatabricksSubmitRunOperator Templating        |#2308
>> |0fa104
>> AIRFLOW-1217|Enable logging in Sqoop hook                      |#2307
>> |4f459b
>> AIRFLOW-1213|Add hcatalog parameters to the sqoop operator/hook|#2305
>> |857850
>> AIRFLOW-1208|Speed-up cli tests                                |#2301
>> |21c142
>> AIRFLOW-1207|Enable utils.helpers unit tests                   |#2300
>> |8ac87b
>> AIRFLOW-1203|Tests failing after oauth upgrade                 |#2296
>> |3e9c66
>> AIRFLOW-1201|Update deprecated 'nose-parameterized' library to |#2298
>> |d2d3e4
>> AIRFLOW-1193|Add Checkr to Airflow user list                   |#2276
>> |707238
>> AIRFLOW-1189|Get pandas DataFrame using BigQueryHook fails     |#2287
>> |93666f
>> AIRFLOW-1188|Add max_bad_records param to GoogleCloudStorageToB|#2286
>> |443e6b
>> AIRFLOW-1187|Obsolete package names in documentation           |-     |-
>> 
>> AIRFLOW-1185|Incorrect url to PyPi                             |#2283
>> |829755
>> AIRFLOW-1181|Enable delete and list function for Google Cloud S|#2281
>> |24f73c
>> AIRFLOW-1179|Pandas 0.20 broke Google BigQuery hook            |#2279
>> |ac9ccb
>> AIRFLOW-1177|variable json deserialize does not work at set def|#2540
>> |65319a
>> AIRFLOW-1175|Add Pronto Tools to Airflow user list             |#2277
>> |86aafa
>> AIRFLOW-1173|Add Robinhood to list of Airflow users            |#2271
>> |379115
>> AIRFLOW-1165|airflow webservice crashes on ubuntu16 - python3  |-     |-
>> 
>> AIRFLOW-1160|Update SparkSubmitOperator parameters             |#2265
>> |2e3f07
>> AIRFLOW-1155|Add Tails.com to community                        |#2261
>> |2fa690
>> AIRFLOW-1149|Allow custom filters to be added to jinja2        |#2258
>> |48135a
>> AIRFLOW-1141|Remove DAG.crawl_for_tasks method                 |#2275
>> |a30fee
>> AIRFLOW-1140|DatabricksSubmitRunOperator should template the "j|#2255
>> |e6d316
>> AIRFLOW-1136|Invalid parameters are not captured for Sqoop oper|#2252
>> |2ef4db
>> AIRFLOW-1125|Clarify documentation regarding fernet_key        |#2251
>> |831f8d
>> AIRFLOW-1122|Node strokes are too thin for people with color vi|#2246
>> |a08761
>> AIRFLOW-1121|airflow webserver --pid no longer write out pid fi|-     |-
>> 
>> AIRFLOW-1118|Add evo.company to Airflow users                  |#2243
>> |f16914
>> AIRFLOW-1112|Log which pool is full in scheduler when pool slot|#2242
>> |74c1ce
>> AIRFLOW-1107|Add support for ftps non-default port             |#2240
>> |4d0c2f
>> AIRFLOW-1106|Add Groupalia/Letsbonus                           |#2239
>> |945b42
>> AIRFLOW-1095|ldap_auth memberOf should come from configuration |#2232
>> |6b1c32
>> AIRFLOW-1094|Invalid unit tests under `contrib/`               |#2234
>> |219c50
>> AIRFLOW-1091|As a release manager I want to be able to compare |#2231
>> |bfae42
>> AIRFLOW-1090|Add HBO                                           |#2230
>> |177d34
>> AIRFLOW-1089|Add Spark application arguments to SparkSubmitOper|#2229
>> |e5b914
>> AIRFLOW-1081|Task duration page is slow                        |#2226
>> |0da512
>> AIRFLOW-1075|Cleanup security docs                             |#2222
>> |5a6f18
>> AIRFLOW-1065|Add functionality for Azure Blob Storage          |#2216
>> |f1bc5f
>> AIRFLOW-1059|Reset_state_for_orphaned_task should operate in ba|#2205
>> |e05d3b
>> AIRFLOW-1058|Improvements for SparkSubmitOperator              |-     |-
>> 
>> AIRFLOW-1051|Add a test for resetdb to CliTests                |#2198
>> |15aee0
>> AIRFLOW-1047|Airflow logs vulnerable to XSS                    |#2193
>> |fe9ebe
>> AIRFLOW-1045|Make log level configurable via airflow.cfg       |#2191
>> |e739a5
>> AIRFLOW-1043|Documentation issues for operators                |#2188
>> |b55f41
>> AIRFLOW-1041|DockerOperator replaces its xcom_push method with |#2274
>> |03704c
>> AIRFLOW-1040|Fix typos in comments/docstrings in models.py     |#2174
>> |d8c0f5
>> AIRFLOW-1036|Exponential backoff should use randomization      |#2262
>> |66168e
>> AIRFLOW-1035|Exponential backoff retry logic should use 2 as ba|#2196
>> |4ec932
>> AIRFLOW-1034|Make it possible to connect to S3 in sigv4 regions|#2181
>> |4c0905
>> AIRFLOW-1031|'scheduled__' may replace with DagRun.ID_PREFIX in|#2613
>> |aa3844
>> AIRFLOW-1030|HttpHook error when creating HttpSensor           |-     |-
>> 
>> AIRFLOW-1028|Databricks Operator for Airflow                   |#2202
>> |53ca50
>> AIRFLOW-1024|Handle CeleryExecutor errors gracefully           |#2355
>> |7af20f
>> AIRFLOW-1018|Scheduler DAG processes can not log to stdout     |#2728
>> |ef775d
>> AIRFLOW-1016|Allow HTTP HEAD request method on HTTPSensor      |#2175
>> |4c41f6
>> AIRFLOW-1010|Add a convenience script for signing              |#2169
>> |a2b65a
>> AIRFLOW-1009|Remove SQLOperator from Concepts page             |#2168
>> |7d1144
>> AIRFLOW-1007|Jinja sandbox is vulnerable to RCE                |#2184
>> |daa281
>> AIRFLOW-1005|Speed up Airflow startup time                     |#na
>> |996dd3
>> AIRFLOW-999 |Support for Redis database                        |#2165
>> |8de850
>> AIRFLOW-997 |Change setup.cfg to point to Apache instead of Max|#na
>> |75cd46
>> AIRFLOW-995 |Update Github PR template                         |#2163
>> |b62485
>> AIRFLOW-994 |Add MiNODES to the AIRFLOW Active Users List      |#2159
>> |ca1623
>> AIRFLOW-991 |Mark_success while a task is running leads to fail|-     |-
>> 
>> AIRFLOW-990 |DockerOperator fails when logging unicode string  |#2155
>> |6bbf54
>> AIRFLOW-988 |SLA Miss Callbacks Are Repeated if Email is Not be|#2415
>> |6e74d4
>> AIRFLOW-985 |Extend the sqoop operator/hook with additional par|#2177
>> |82eb20
>> AIRFLOW-984 |Subdags unrecognized when subclassing SubDagOperat|#2152
>> |a8bd16
>> AIRFLOW-979 |Add GovTech GDS                                   |#2149
>> |b17bd3
>> AIRFLOW-976 |Mark success running task causes it to fail       |-     |-
>> 
>> AIRFLOW-969 |Catch bad python_callable argument at DAG construc|#2142
>> |12901d
>> AIRFLOW-963 |Some code examples are not rendered in the airflow|#2139
>> |f69c1b
>> AIRFLOW-960 |Add support for .editorconfig                     |#na
>> |f5cacc
>> AIRFLOW-959 |.gitignore file is disorganized and incomplete    |#na
>> |3d3c14
>> AIRFLOW-958 |Improve tooltip readability                       |#2134
>> |b3c3eb
>> AIRFLOW-950 |Missing AWS integrations on documentation::integra|#2552
>> |01be02
>> AIRFLOW-947 |Make PrestoHook surface better messages when the P|#na
>> |6dd4b3
>> AIRFLOW-945 |Revert psycopg2 workaround when psycopg2 2.7.1 is |-     |-
>> 
>> AIRFLOW-943 |Add Digital First Media to the Airflow users list |#2115
>> |2cfe28
>> AIRFLOW-942 |Add mytaxi to Airflow Users                       |#2111
>> |d579e6
>> AIRFLOW-935 |Impossible to use plugin executors                |#2120
>> |08a784
>> AIRFLOW-926 |jdbc connector is broken due to jaydebeapi api upd|#2651
>> |07ed29
>> AIRFLOW-917 |Incorrectly formatted failure status message      |#2109
>> |b8164c
>> AIRFLOW-916 |Fix ConfigParser deprecation warning              |#2108
>> |ef6dd1
>> AIRFLOW-911 |Add colouring and profiling info on tests         |#2106
>> |4f52db
>> AIRFLOW-903 |Add configuration setting for default DAG view.   |#2103
>> |cadfae
>> AIRFLOW-896 |BigQueryOperator fails to execute with certain inp|#2097
>> |2bceee
>> AIRFLOW-891 |Webserver Clock Should Include Day                |-     |-
>> 
>> AIRFLOW-889 |Minor error in the docstrings for BaseOperator.   |#2084
>> |50702d
>> AIRFLOW-887 |Add compatibility with future v0.16               |#na
>> |50902d
>> AIRFLOW-886 |Pass Operator result to post_execute hook         |#na
>> |4da361
>> AIRFLOW-885 |Add Change.org to the list of Airflow users       |#2089
>> |a279be
>> AIRFLOW-882 |Code example in docs has unnecessary DAG>>Operator|#2088
>> |baa4cd
>> AIRFLOW-881 |Create SubDagOperator within DAG context manager w|#2087
>> |0ed608
>> AIRFLOW-880 |Fix remote log functionality inconsistencies for W|#2086
>> |974b75
>> AIRFLOW-877 |GoogleCloudStorageDownloadOperator: template_ext c|#2083
>> |debc69
>> AIRFLOW-875 |Allow HttpSensor params to be templated           |#2080
>> |62f503
>> AIRFLOW-871 |multiple places use logging.warn() instead of warn|#2082
>> |21d775
>> AIRFLOW-866 |Add FTPSensor                                     |#2070
>> |5f87f8
>> AIRFLOW-863 |Example DAG start dates should be recent to avoid |#2068
>> |bbfd43
>> AIRFLOW-862 |Add DaskExecutor                                  |#2067
>> |6e2210
>> AIRFLOW-860 |Circular module dependency prevents loading of cus|-     |-
>> 
>> AIRFLOW-854 |Add Open Knowledge International to Airflow users |#2061
>> |51a311
>> AIRFLOW-842 |scheduler.clean_dirty raises warning: SAWarning: T|#2072
>> |485280
>> AIRFLOW-840 |Python3 encoding issue in Kerberos                |#2158
>> |639336
>> AIRFLOW-836 |The paused and queryview endpoints are vulnerable |#2054
>> |6aca2c
>> AIRFLOW-831 |Fix broken unit tests                             |#2050
>> |b86194
>> AIRFLOW-830 |Plugin manager should log to debug, not info      |-     |-
>> 
>> AIRFLOW-829 |Reduce verbosity of successful Travis unit tests  |-     |-
>> 
>> AIRFLOW-826 |Add Zendesk Hook                                  |#2066
>> |a09762
>> AIRFLOW-823 |Make task instance details available via API      |#2045
>> |3f546e
>> AIRFLOW-822 |Close the connection before throwing exception in |#2038
>> |4b6c38
>> AIRFLOW-821 |Scheduler dagbag importing not Py3 compatible     |#2039
>> |fbb59b
>> AIRFLOW-809 |SqlAlchemy is_ ColumnOperator Causing Errors in MS|-     |-
>> 
>> AIRFLOW-802 |Integration of spark-submit                       |-     |-
>> 
>> AIRFLOW-781 |Allow DataFlowJavaOperator to accept jar file stor|#2037
>> |259c86
>> AIRFLOW-770 |HDFS hooks should support alternative ways of gett|#2056
>> |261b65
>> AIRFLOW-756 |Refactor ssh_hook and ssh_operator                |-     |-
>> 
>> AIRFLOW-751 |SFTP file transfer functionality                  |#1999
>> |fe0ede
>> AIRFLOW-725 |Make merge tool use OS' keyring for password stora|#1966
>> |8c1695
>> AIRFLOW-706 |Configuration shell commands are not split properl|#2053
>> |0bb6f2
>> AIRFLOW-705 |airflow.configuration.run_command output does not |-     |-
>> 
>> AIRFLOW-681 |homepage doc link should pointing to apache's repo|#2164
>> |a8027a
>> AIRFLOW-654 |SSL for AMQP w/ Celery(Executor)                  |#2333
>> |868bfe
>> AIRFLOW-645 |HttpHook ignores https                            |#2311
>> |fd381a
>> AIRFLOW-365 |Code view in subdag trigger exception             |#2043
>> |cf102c
>> AIRFLOW-300 |Add Google Pubsub hook and operator               |#2036
>> |d231dc
>> AIRFLOW-289 |Use datetime.utcnow() to keep airflow system indep|#2618
>> |20c83e
>> AIRFLOW-71  |docker_operator - pulling from private repositorie|#na
>> |d4406c
>> 
>> Cheers,
>> Chris
>> 
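[Archive note: checking an RC like this one means fetching the tarball, its detached `.asc` signature, and a checksum from the dev dist URL, importing the public keys from the release dist URL, then running `gpg --verify` and a checksum comparison. The checksum filename and KEYS path vary per release and are not stated in this thread, so the sketch below only demonstrates the local checksum step on a stand-in file.]

```shell
# Stand-in for the real artifact; an actual verification would use
# apache-airflow-1.9.0rc1+incubating-source.tar.gz downloaded from the
# dist.apache.org URL in the vote mail (plus its .asc for `gpg --verify`).
printf 'demo artifact contents' > artifact.tar.gz

# Record the checksum the way a release manager would publish it...
sha512sum artifact.tar.gz > artifact.tar.gz.sha512

# ...and verify it the way a voter would; prints "artifact.tar.gz: OK".
sha512sum -c artifact.tar.gz.sha512
```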


Re: [VOTE] Airflow 1.9.0rc1

Posted by Ruslan Dautkhanov <da...@gmail.com>.
+1



-- 
Ruslan Dautkhanov

On Wed, Nov 8, 2017 at 10:46 AM, Chris Riccomini <cr...@apache.org>
wrote:

> Anyone? :/
>
> On Mon, Nov 6, 2017 at 1:22 PM, Chris Riccomini <cr...@apache.org>
> wrote:
>
> > Hey all,
> >
> > I have cut Airflow 1.9.0 RC1. This email is calling a vote on the
> release,
> > which will last for 72 hours. Consider this my (binding) +1.
> >
> > Airflow 1.9.0 RC1 is available at:
> >
> > https://dist.apache.org/repos/dist/dev/incubator/airflow/1.9.0rc1/
> >
> > apache-airflow-1.9.0rc1+incubating-source.tar.gz is a source release
> that
> > comes with INSTALL instructions.
> > apache-airflow-1.9.0rc1+incubating-bin.tar.gz is the binary Python
> > "sdist" release.
> >
> > Public keys are available at:
> >
> > https://dist.apache.org/repos/dist/release/incubator/airflow/
> >
> > The release contains the following JIRAs:
> >
> > ISSUE ID    |DESCRIPTION                                       |PR
> > |COMMIT
> > AIRFLOW-1779|Add keepalive packets to ssh hook                 |#2749
> > |d2f9d1
> > AIRFLOW-1776|stdout/stderr logging not captured                |#2745
> > |590d9f
> > AIRFLOW-1771|Change heartbeat text from boom to heartbeat      |-     |-
> >
> > AIRFLOW-1767|Airflow Scheduler no longer schedules DAGs        |-     |-
> >
> > AIRFLOW-1765|Default API auth backend should deny all.         |#2737
> > |6ecdac
> > AIRFLOW-1764|Web Interface should not use experimental api     |#2738
> > |6bed1d
> > AIRFLOW-1757|Contrib.SparkSubmitOperator should allow --package|#2725
> > |4e06ee
> > AIRFLOW-1745|BashOperator ignores SIGPIPE in subprocess        |#2714
> > |e021c9
> > AIRFLOW-1744|task.retries can be False                         |#2713
> > |6144c6
> > AIRFLOW-1743|Default config template should not contain ldap fi|#2712
> > |270684
> > AIRFLOW-1741|Task Duration shows two charts on first page load.|#2711
> > |974b49
> > AIRFLOW-1734|Sqoop Operator contains logic errors & needs optio|#2703
> > |f6810c
> > AIRFLOW-1731|Import custom config on PYTHONPATH                |#2721
> > |f07eb3
> > AIRFLOW-1726|Copy Expert command for Postgres Hook             |#2698
> > |8a4ad3
> > AIRFLOW-1719|Fix small typo - your vs you                      |-     |-
> >
> > AIRFLOW-1712|Log SSHOperator output                            |-     |-
> >
> > AIRFLOW-1711|Ldap Attributes not always a "list" part 2        |#2731
> > |40a936
> > AIRFLOW-1706|Scheduler is failed on startup with MS SQL Server |#2733
> > |9e209b
> > AIRFLOW-1698|Remove confusing SCHEDULER_RUNS env var from syste|#2677
> > |00dd06
> > AIRFLOW-1695|Redshift Hook using boto3 & AWS Hook              |#2717
> > |bfddae
> > AIRFLOW-1694|Hive Hooks: Python 3 does not have an `itertools.i|#2674
> > |c6e5ae
> > AIRFLOW-1692|Master cannot be checked out on windows           |#2673
> > |31805e
> > AIRFLOW-1691|Add better documentation for Google cloud storage |#2671
> > |ace2b1
> > AIRFLOW-1690|Error messages regarding gcs log commits are spars|#2670
> > |5fb5cd
> > AIRFLOW-1682|S3 task handler never writes to S3                |#2664
> > |0080f0
> > AIRFLOW-1678|Fix docstring errors for `set_upstream` and `set_d|-     |-
> >
> > AIRFLOW-1677|Fix typo in example_qubole_operator               |-     |-
> >
> > AIRFLOW-1676|GCS task handler never writes to GCS              |#2659
> > |781fa4
> > AIRFLOW-1675|Fix API docstrings to be properly rendered        |#2667
> > |f12381
> > AIRFLOW-1671|Missing @apply_defaults annotation for gcs downloa|#2655
> > |97666b
> > AIRFLOW-1669|Fix Docker import in Master                       |#na
> >  |f7f2a8
> > AIRFLOW-1668|Redshift requires a keep alive of < 300s          |#2650
> > |f2bb77
> > AIRFLOW-1664|Make MySqlToGoogleCloudStorageOperator support bin|#2649
> > |95813d
> > AIRFLOW-1660|Change webpage width to full-width                |#2646
> > |8ee3d9
> > AIRFLOW-1659|Fix invalid attribute bug in FileTaskHandler      |#2645
> > |bee823
> > AIRFLOW-1658|Kill (possibly) still running Druid indexing job a|#2644
> > |cbf7ad
> > AIRFLOW-1657|Handle failure of Qubole Operator for s3distcp had|-     |-
> >
> > AIRFLOW-1654|Show tooltips for link icons in DAGs view         |#2642
> > |ada7b2
> > AIRFLOW-1647|Fix Spark-sql hook                                |#2637
> > |b1e5c6
> > AIRFLOW-1641|Task gets stuck in queued state                   |#2715
> > |735497
> > AIRFLOW-1640|Add Qubole default connection in connection table |-     |-
> >
> > AIRFLOW-1639|ValueError does not have .message attribute       |#2629
> > |87df67
> > AIRFLOW-1637|readme not tracking master branch for travis      |-     |-
> >
> > AIRFLOW-1636|aws and emr connection types get cleared          |#2626
> > |540e04
> > AIRFLOW-1635|Allow creating Google Cloud Platform connection wi|#2640
> > |6dec7a
> > AIRFLOW-1629|make extra a textarea in edit connections form    |#2623
> > |f5d46f
> > AIRFLOW-1628|Docstring of sqlsensor is incorrect               |#2621
> > |9ba73d
> > AIRFLOW-1627|SubDagOperator initialization should only query po|#2620
> > |516ace
> > AIRFLOW-1621|Add tests for logic added on server side dag list |#2614
> > |8de9fd
> > AIRFLOW-1614|Improve performance of DAG parsing when there are |#2610
> > |a95adb
> > AIRFLOW-1611|Customize logging in Airflow                      |#2631
> > |8b4a50
> > AIRFLOW-1609|Ignore all venvs in gitignore                     |#2608
> > |f1f9b4
> > AIRFLOW-1608|GCP Dataflow hook missing pending job state       |#2607
> > |653562
> > AIRFLOW-1606|DAG.sync_to_db is static, but takes a DAG as first|#2606
> > |6ac296
> > AIRFLOW-1605|Fix log source of local loggers                   |-     |-
> >
> > AIRFLOW-1604|Rename the logger to log                          |#2604
> > |af4050
> > AIRFLOW-1602|Use LoggingMixin for the DAG class                |#2602
> > |956699
> > AIRFLOW-1601|Add configurable time between SIGTERM and SIGKILL |#2601
> > |48a95e
> > AIRFLOW-1600|Uncaught exceptions in get_fernet if cryptography |#2600
> > |ad963e
> > AIRFLOW-1597|Add GameWisp as Airflow user                      |#2599
> > |26b747
> > AIRFLOW-1594|Installing via pip copies test files into python l|#2597
> > |a6b23a
> > AIRFLOW-1593|Expose load_string in WasbHook                    |#2596
> > |7ece95
> > AIRFLOW-1591|Exception: 'TaskInstance' object has no attribute |#2578
> > |f4653e
> > AIRFLOW-1590|Small fix for dates util                          |#2652
> > |31946e
> > AIRFLOW-1587|fix `ImportError: cannot import name 'CeleryExecut|#2590
> > |34c73b
> > AIRFLOW-1586|MySQL to GCS to BigQuery fails for tables with dat|#2589
> > |e83012
> > AIRFLOW-1584|Remove the insecure /headers endpoints            |#2588
> > |17ac07
> > AIRFLOW-1582|Improve logging structure of Airflow              |#2592
> > |a7a518
> > AIRFLOW-1580|Error in string formatter when throwing an excepti|#2583
> > |ea9ab9
> > AIRFLOW-1579|Allow jagged rows in BQ Hook.                     |#2582
> > |5b978b
> > AIRFLOW-1577|Add token support to DatabricksHook               |#2579
> > |c2c515
> > AIRFLOW-1573|Remove `thrift < 0.10.0` requirement              |#2574
> > |aa95f2
> > AIRFLOW-1571|Add AWS Lambda Hook for invoking Lambda Function  |#2718
> > |017f18
> > AIRFLOW-1568|Add datastore import/export operator              |#2568
> > |86063b
> > AIRFLOW-1567|Clean up ML Engine operators                      |#2567
> > |af91e2
> > AIRFLOW-1564|Default logging filename contains a colon         |#2565
> > |4c674c
> > AIRFLOW-1560|Add AWS DynamoDB hook for inserting batch items   |#2587
> > |71400b
> > AIRFLOW-1556|BigQueryBaseCursor should support SQL parameters  |#2557
> > |9df0ac
> > AIRFLOW-1546| add Zymergen to org list in README               |#2512
> > |7cc346
> > AIRFLOW-1535|Add support for Dataproc serviceAccountScopes in D|#2546
> > |b1f902
> > AIRFLOW-1529|Support quoted newlines in Google BigQuery load jo|#2545
> > |4a4b02
> > AIRFLOW-1527|Refactor celery config to make use of template    |#2542
> > |f4437b
> > AIRFLOW-1522|Increase size of val column for variable table in |#2535
> > |8a2d24
> > AIRFLOW-1521|Template fields definition for bigquery_table_dele|#2534
> > |f1a7c0
> > AIRFLOW-1520|S3Hook uses boto2                                 |#2532
> > |386583
> > AIRFLOW-1519|Main DAG list page does not scale using client sid|#2531
> > |d7d7ce
> > AIRFLOW-1512|Add operator for running Python functions in a vir|#2446
> > |14e6d7
> > AIRFLOW-1507|Make src, dst and bucket parameters as templated i|#2516
> > |d295cf
> > AIRFLOW-1505|Document when Jinja substitution occurs           |#2523
> > |984a87
> > AIRFLOW-1504|Log Cluster Name on Dataproc Operator When Execute|#2517
> > |1cd6c4
> > AIRFLOW-1499|Eliminate duplicate and unneeded code              |-     |-
> >
> > AIRFLOW-1497|Hidden fields in connection form aren't reset when|#2507
> > |d8da8b
> > AIRFLOW-1493|Fix race condition with airflow run               |#2505
> > |b2e175
> > AIRFLOW-1492|Add metric for task success/failure               |#2504
> > |fa84d4
> > AIRFLOW-1489|Docs: Typo in BigQueryCheckOperator               |#2501
> > |111ce5
> > AIRFLOW-1483|Page size on model views is to large to render qui|#2497
> > |04bfba
> > AIRFLOW-1478|Chart -> Owner column should be sortable          |#2493
> > |651e60
> > AIRFLOW-1476|Add INSTALL file for source releases              |#2492
> > |da76ac
> > AIRFLOW-1474|Add dag_id regex for 'airflow clear' CLI command  |#2486
> > |18f849
> > AIRFLOW-1470|BashSensor Implementation                          |-     |-
> >
> > AIRFLOW-1459|integration rst doc is broken in github view      |#2481
> > |322ec9
> > AIRFLOW-1438|Scheduler batch queries should have a limit       |#2462
> > |3547cb
> > AIRFLOW-1437|BigQueryTableDeleteOperator should define deletion|#2459
> > |b87903
> > AIRFLOW-1432|NVD3 Charts do not have labeled axes and units cha|#2710
> > |70ffa4
> > AIRFLOW-1402|Cleanup SafeConfigParser DeprecationWarning       |#2435
> > |38c86b
> > AIRFLOW-1401|Standardize GCP project, region, and zone argument|#2439
> > |b6d363
> > AIRFLOW-1397|Airflow 1.8.1 - No data displays in Last Run Colum|-     |-
> >
> > AIRFLOW-1394|Add quote_character parameter to GoogleCloudStorag|#2428
> > |9fd0be
> > AIRFLOW-1389|BigQueryOperator should support `createDisposition|#2470
> > |6e2640
> > AIRFLOW-1384|Add ARGO/CaDC                                     |#2434
> > |715947
> > AIRFLOW-1368|Automatically remove the container when it exits  |#2653
> > |d42d23
> > AIRFLOW-1359|Provide GoogleCloudML operator for model evaluatio|#2407
> > |194d1d
> > AIRFLOW-1356|add `--celery_hostname` to `airflow worker`       |#2405
> > |b9d7d1
> > AIRFLOW-1352|Revert bad logging Handler                        |-     |-
> >
> > AIRFLOW-1350|Add "query_uri" parameter for Google DataProc oper|#2402
> > |d32c72
> > AIRFLOW-1348|Paginated UI has broken toggles after first page  |-     |-
> >
> > AIRFLOW-1345|Don't commit on each loop                         |#2397
> > |0dd002
> > AIRFLOW-1344|Builds failing on Python 3.5 with AttributeError  |#2394
> > |2a5883
> > AIRFLOW-1343|Add airflow default label to the dataproc operator|#2396
> > |e4b240
> > AIRFLOW-1338|gcp_dataflow_hook is incompatible with the recent |#2388
> > |cf2605
> > AIRFLOW-1337|Customize log format via config file              |#2392
> > |4841e3
> > AIRFLOW-1335|Use buffered logger                               |#2386
> > |0d23d3
> > AIRFLOW-1333|Enable copy function for Google Cloud Storage Hook|#2385
> > |e2c383
> > AIRFLOW-1331|Contrib.SparkSubmitOperator should allow --package|#2622
> > |fbca8f
> > AIRFLOW-1330|Connection.parse_from_uri doesn't work for google_|#2525
> > |6e5e9d
> > AIRFLOW-1324|Make the Druid operator/hook more general         |#2378
> > |de99aa
> > AIRFLOW-1323|Operators related to Dataproc should keep some par|#2636
> > |ed248d
> > AIRFLOW-1315|Add Qubole File and Partition Sensors             |-     |-
> >
> > AIRFLOW-1309|Add optional hive_tblproperties in HiveToDruidTran|-     |-
> >
> > AIRFLOW-1301|Add New Relic to Airflow user list                |#2359
> > |355fc9
> > AIRFLOW-1299|Google Dataproc cluster creation operator should s|#2358
> > |c2b80e
> > AIRFLOW-1289|Don't restrict scheduler threads to CPU cores     |#2353
> > |8e23d2
> > AIRFLOW-1286|BaseTaskRunner - Exception TypeError: a bytes-like|#2363
> > |d8891d
> > AIRFLOW-1277|Forbid creation of a known event with empty fields|#na
> >  |65184a
> > AIRFLOW-1276|Forbid event creation with end_data earlier than s|#na
> >  |d5d02f
> > AIRFLOW-1275|Fix `airflow pool` command exception              |#2346
> > |9958aa
> > AIRFLOW-1273|Google Cloud ML Version and Model CRUD Operator   |#2379
> > |534a0e
> > AIRFLOW-1272|Google Cloud ML Batch Prediction Operator         |#2390
> > |e92d6b
> > AIRFLOW-1271|Google Cloud ML Training Operator                 |#2408
> > |0fc450
> > AIRFLOW-1256|Add United Airlines as Airflow user               |#2332
> > |d3484a
> > AIRFLOW-1251|Add eRevalue as an Airflow user                   |#2331
> > |8d5160
> > AIRFLOW-1248|Fix inconsistent configuration name for worker tim|#2328
> > |92314f
> > AIRFLOW-1247|CLI: ignore all dependencies argument ignored     |#2441
> > |e88ecf
> > AIRFLOW-1245|Fix random failure of test_trigger_dag_for_date un|#2325
> > |cef01b
> > AIRFLOW-1244|Forbid creation of a pool with empty name         |#2324
> > |df9a10
> > AIRFLOW-1242|BigQueryHook assumes that a valid project_id can't|#2335
> > |ffe616
> > AIRFLOW-1237|Fix IN-predicate sqlalchemy warning               |#2320
> > |a1f422
> > AIRFLOW-1234|Cover utils.operator_helpers with unit tests      |#2317
> > |d16537
> > AIRFLOW-1233|Cover utils.json with unit tests                  |#2316
> > |502410
> > AIRFLOW-1232|Remove deprecated readfp warning                  |#2315
> > |6ffaaf
> > AIRFLOW-1231|Use flask_wtf.CSRFProtect instead of flask_wtf.Csr|#2313
> > |cac49e
> > AIRFLOW-1221|Fix DatabricksSubmitRunOperator Templating        |#2308
> > |0fa104
> > AIRFLOW-1217|Enable logging in Sqoop hook                      |#2307
> > |4f459b
> > AIRFLOW-1213|Add hcatalog parameters to the sqoop operator/hook|#2305
> > |857850
> > AIRFLOW-1208|Speed-up cli tests                                |#2301
> > |21c142
> > AIRFLOW-1207|Enable utils.helpers unit tests                   |#2300
> > |8ac87b
> > AIRFLOW-1203|Tests failing after oauth upgrade                 |#2296
> > |3e9c66
> > AIRFLOW-1201|Update deprecated 'nose-parameterized' library to |#2298
> > |d2d3e4
> > AIRFLOW-1193|Add Checkr to Airflow user list                   |#2276
> > |707238
> > AIRFLOW-1189|Get pandas DataFrame using BigQueryHook fails     |#2287
> > |93666f
> > AIRFLOW-1188|Add max_bad_records param to GoogleCloudStorageToB|#2286
> > |443e6b
> > AIRFLOW-1187|Obsolete package names in documentation           |-     |-
> >
> > AIRFLOW-1185|Incorrect url to PyPi                             |#2283
> > |829755
> > AIRFLOW-1181|Enable delete and list function for Google Cloud S|#2281
> > |24f73c
> > AIRFLOW-1179|Pandas 0.20 broke Google BigQuery hook            |#2279
> > |ac9ccb
> > AIRFLOW-1177|variable json deserialize does not work at set def|#2540
> > |65319a
> > AIRFLOW-1175|Add Pronto Tools to Airflow user list             |#2277
> > |86aafa
> > AIRFLOW-1173|Add Robinhood to list of Airflow users            |#2271
> > |379115
> > AIRFLOW-1165|airflow webservice crashes on ubuntu16 - python3  |-     |-
> >
> > AIRFLOW-1160|Update SparkSubmitOperator parameters             |#2265
> > |2e3f07
> > AIRFLOW-1155|Add Tails.com to community                        |#2261
> > |2fa690
> > AIRFLOW-1149|Allow custom filters to be added to jinja2        |#2258
> > |48135a
> > AIRFLOW-1141|Remove DAG.crawl_for_tasks method                 |#2275
> > |a30fee
> > AIRFLOW-1140|DatabricksSubmitRunOperator should template the "j|#2255
> > |e6d316
> > AIRFLOW-1136|Invalid parameters are not captured for Sqoop oper|#2252
> > |2ef4db
> > AIRFLOW-1125|Clarify documentation regarding fernet_key        |#2251
> > |831f8d
> > AIRFLOW-1122|Node strokes are too thin for people with color vi|#2246
> > |a08761
> > AIRFLOW-1121|airflow webserver --pid no longer write out pid fi|-     |-
> >
> > AIRFLOW-1118|Add evo.company to Airflow users                  |#2243
> > |f16914
> > AIRFLOW-1112|Log which pool is full in scheduler when pool slot|#2242
> > |74c1ce
> > AIRFLOW-1107|Add support for ftps non-default port             |#2240
> > |4d0c2f
> > AIRFLOW-1106|Add Groupalia/Letsbonus                           |#2239
> > |945b42
> > AIRFLOW-1095|ldap_auth memberOf should come from configuration |#2232
> > |6b1c32
> > AIRFLOW-1094|Invalid unit tests under `contrib/`               |#2234
> > |219c50
> > AIRFLOW-1091|As a release manager I want to be able to compare |#2231
> > |bfae42
> > AIRFLOW-1090|Add HBO                                           |#2230
> > |177d34
> > AIRFLOW-1089|Add Spark application arguments to SparkSubmitOper|#2229
> > |e5b914
> > AIRFLOW-1081|Task duration page is slow                        |#2226
> > |0da512
> > AIRFLOW-1075|Cleanup security docs                             |#2222
> > |5a6f18
> > AIRFLOW-1065|Add functionality for Azure Blob Storage          |#2216
> > |f1bc5f
> > AIRFLOW-1059|Reset_state_for_orphaned_task should operate in ba|#2205
> > |e05d3b
> > AIRFLOW-1058|Improvements for SparkSubmitOperator              |-     |-
> >
> > AIRFLOW-1051|Add a test for resetdb to CliTests                |#2198
> > |15aee0
> > AIRFLOW-1047|Airflow logs vulnerable to XSS                    |#2193
> > |fe9ebe
> > AIRFLOW-1045|Make log level configurable via airflow.cfg       |#2191
> > |e739a5
> > AIRFLOW-1043|Documentation issues for operators                |#2188
> > |b55f41
> > AIRFLOW-1041|DockerOperator replaces its xcom_push method with |#2274
> > |03704c
> > AIRFLOW-1040|Fix typos in comments/docstrings in models.py     |#2174
> > |d8c0f5
> > AIRFLOW-1036|Exponential backoff should use randomization      |#2262
> > |66168e
> > AIRFLOW-1035|Exponential backoff retry logic should use 2 as ba|#2196
> > |4ec932
> > AIRFLOW-1034|Make it possible to connect to S3 in sigv4 regions|#2181
> > |4c0905
> > AIRFLOW-1031|'scheduled__' may replace with DagRun.ID_PREFIX in|#2613
> > |aa3844
> > AIRFLOW-1030|HttpHook error when creating HttpSensor           |-     |-
> >
> > AIRFLOW-1028|Databricks Operator for Airflow                   |#2202
> > |53ca50
> > AIRFLOW-1024|Handle CeleryExecutor errors gracefully           |#2355
> > |7af20f
> > AIRFLOW-1018|Scheduler DAG processes can not log to stdout     |#2728
> > |ef775d
> > AIRFLOW-1016|Allow HTTP HEAD request method on HTTPSensor      |#2175
> > |4c41f6
> > AIRFLOW-1010|Add a convenience script for signing              |#2169
> > |a2b65a
> > AIRFLOW-1009|Remove SQLOperator from Concepts page             |#2168
> > |7d1144
> > AIRFLOW-1007|Jinja sandbox is vulnerable to RCE                |#2184
> > |daa281
> > AIRFLOW-1005|Speed up Airflow startup time                     |#na
> >  |996dd3
> > AIRFLOW-999 |Support for Redis database                        |#2165
> > |8de850
> > AIRFLOW-997 |Change setup.cfg to point to Apache instead of Max|#na
> >  |75cd46
> > AIRFLOW-995 |Update Github PR template                         |#2163
> > |b62485
> > AIRFLOW-994 |Add MiNODES to the AIRFLOW Active Users List      |#2159
> > |ca1623
> > AIRFLOW-991 |Mark_success while a task is running leads to fail|-     |-
> >
> > AIRFLOW-990 |DockerOperator fails when logging unicode string  |#2155
> > |6bbf54
> > AIRFLOW-988 |SLA Miss Callbacks Are Repeated if Email is Not be|#2415
> > |6e74d4
> > AIRFLOW-985 |Extend the sqoop operator/hook with additional par|#2177
> > |82eb20
> > AIRFLOW-984 |Subdags unrecognized when subclassing SubDagOperat|#2152
> > |a8bd16
> > AIRFLOW-979 |Add GovTech GDS                                   |#2149
> > |b17bd3
> > AIRFLOW-976 |Mark success running task causes it to fail       |-     |-
> >
> > AIRFLOW-969 |Catch bad python_callable argument at DAG construc|#2142
> > |12901d
> > AIRFLOW-963 |Some code examples are not rendered in the airflow|#2139
> > |f69c1b
> > AIRFLOW-960 |Add support for .editorconfig                     |#na
> >  |f5cacc
> > AIRFLOW-959 |.gitignore file is disorganized and incomplete    |#na
> >  |3d3c14
> > AIRFLOW-958 |Improve tooltip readability                       |#2134
> > |b3c3eb
> > AIRFLOW-950 |Missing AWS integrations on documentation::integra|#2552
> > |01be02
> > AIRFLOW-947 |Make PrestoHook surface better messages when the P|#na
> >  |6dd4b3
> > AIRFLOW-945 |Revert psycopg2 workaround when psycopg2 2.7.1 is |-     |-
> >
> > AIRFLOW-943 |Add Digital First Media to the Airflow users list |#2115
> > |2cfe28
> > AIRFLOW-942 |Add mytaxi to Airflow Users                       |#2111
> > |d579e6
> > AIRFLOW-935 |Impossible to use plugin executors                |#2120
> > |08a784
> > AIRFLOW-926 |jdbc connector is broken due to jaydebeapi api upd|#2651
> > |07ed29
> > AIRFLOW-917 |Incorrectly formatted failure status message      |#2109
> > |b8164c
> > AIRFLOW-916 |Fix ConfigParser deprecation warning              |#2108
> > |ef6dd1
> > AIRFLOW-911 |Add colouring and profiling info on tests         |#2106
> > |4f52db
> > AIRFLOW-903 |Add configuration setting for default DAG view.   |#2103
> > |cadfae
> > AIRFLOW-896 |BigQueryOperator fails to execute with certain inp|#2097
> > |2bceee
> > AIRFLOW-891 |Webserver Clock Should Include Day                |-     |-
> >
> > AIRFLOW-889 |Minor error in the docstrings for BaseOperator.   |#2084
> > |50702d
> > AIRFLOW-887 |Add compatibility with future v0.16               |#na
> >  |50902d
> > AIRFLOW-886 |Pass Operator result to post_execute hook         |#na
> >  |4da361
> > AIRFLOW-885 |Add Change.org to the list of Airflow users       |#2089
> > |a279be
> > AIRFLOW-882 |Code example in docs has unnecessary DAG>>Operator|#2088
> > |baa4cd
> > AIRFLOW-881 |Create SubDagOperator within DAG context manager w|#2087
> > |0ed608
> > AIRFLOW-880 |Fix remote log functionality inconsistencies for W|#2086
> > |974b75
> > AIRFLOW-877 |GoogleCloudStorageDownloadOperator: template_ext c|#2083
> > |debc69
> > AIRFLOW-875 |Allow HttpSensor params to be templated           |#2080
> > |62f503
> > AIRFLOW-871 |multiple places use logging.warn() instead of warn|#2082
> > |21d775
> > AIRFLOW-866 |Add FTPSensor                                     |#2070
> > |5f87f8
> > AIRFLOW-863 |Example DAG start dates should be recent to avoid |#2068
> > |bbfd43
> > AIRFLOW-862 |Add DaskExecutor                                  |#2067
> > |6e2210
> > AIRFLOW-860 |Circular module dependency prevents loading of cus|-     |-
> >
> > AIRFLOW-854 |Add Open Knowledge International to Airflow users |#2061
> > |51a311
> > AIRFLOW-842 |scheduler.clean_dirty raises warning: SAWarning: T|#2072
> > |485280
> > AIRFLOW-840 |Python3 encoding issue in Kerberos                |#2158
> > |639336
> > AIRFLOW-836 |The paused and queryview endpoints are vulnerable |#2054
> > |6aca2c
> > AIRFLOW-831 |Fix broken unit tests                             |#2050
> > |b86194
> > AIRFLOW-830 |Plugin manager should log to debug, not info      |-     |-
> >
> > AIRFLOW-829 |Reduce verbosity of successful Travis unit tests  |-     |-
> >
> > AIRFLOW-826 |Add Zendesk Hook                                  |#2066
> > |a09762
> > AIRFLOW-823 |Make task instance details available via API      |#2045
> > |3f546e
> > AIRFLOW-822 |Close the connection before throwing exception in |#2038
> > |4b6c38
> > AIRFLOW-821 |Scheduler dagbag importing not Py3 compatible     |#2039
> > |fbb59b
> > AIRFLOW-809 |SqlAlchemy is_ ColumnOperator Causing Errors in MS|-     |-
> >
> > AIRFLOW-802 |Integration of spark-submit                       |-     |-
> >
> > AIRFLOW-781 |Allow DataFlowJavaOperator to accept jar file stor|#2037
> > |259c86
> > AIRFLOW-770 |HDFS hooks should support alternative ways of gett|#2056
> > |261b65
> > AIRFLOW-756 |Refactor ssh_hook and ssh_operator                |-     |-
> >
> > AIRFLOW-751 |SFTP file transfer functionality                  |#1999
> > |fe0ede
> > AIRFLOW-725 |Make merge tool use OS' keyring for password stora|#1966
> > |8c1695
> > AIRFLOW-706 |Configuration shell commands are not split properl|#2053
> > |0bb6f2
> > AIRFLOW-705 |airflow.configuration.run_command output does not |-     |-
> >
> > AIRFLOW-681 |homepage doc link should pointing to apache's repo|#2164
> > |a8027a
> > AIRFLOW-654 |SSL for AMQP w/ Celery(Executor)                  |#2333
> > |868bfe
> > AIRFLOW-645 |HttpHook ignores https                            |#2311
> > |fd381a
> > AIRFLOW-365 |Code view in subdag trigger exception             |#2043
> > |cf102c
> > AIRFLOW-300 |Add Google Pubsub hook and operator               |#2036
> > |d231dc
> > AIRFLOW-289 |Use datetime.utcnow() to keep airflow system indep|#2618
> > |20c83e
> > AIRFLOW-71  |docker_operator - pulling from private repositorie|#na
> >  |d4406c
> >
> > Cheers,
> > Chris
> >
>

Re: [VOTE] Airflow 1.9.0rc1

Posted by Chris Riccomini <cr...@apache.org>.
Anyone? :/

On Mon, Nov 6, 2017 at 1:22 PM, Chris Riccomini <cr...@apache.org>
wrote:

> Hey all,
>
> I have cut Airflow 1.9.0 RC1. This email is calling a vote on the release,
> which will last for 72 hours. Consider this my (binding) +1.
>
> Airflow 1.9.0 RC1 is available at:
>
> https://dist.apache.org/repos/dist/dev/incubator/airflow/1.9.0rc1/
>
> apache-airflow-1.9.0rc1+incubating-source.tar.gz is a source release that
> comes with INSTALL instructions.
> apache-airflow-1.9.0rc1+incubating-bin.tar.gz is the binary Python
> "sdist" release.
>
> Public keys are available at:
>
> https://dist.apache.org/repos/dist/release/incubator/airflow/
>
> The release contains the following JIRAs:
>
> |af91e2
> AIRFLOW-1564|Default logging filename contains a colon         |#2565
> |4c674c
> AIRFLOW-1560|Add AWS DynamoDB hook for inserting batch items   |#2587
> |71400b
> AIRFLOW-1556|BigQueryBaseCursor should support SQL parameters  |#2557
> |9df0ac
> AIRFLOW-1546| add Zymergen to org list in README               |#2512
> |7cc346
> AIRFLOW-1535|Add support for Dataproc serviceAccountScopes in D|#2546
> |b1f902
> AIRFLOW-1529|Support quoted newlines in Google BigQuery load jo|#2545
> |4a4b02
> AIRFLOW-1527|Refactor celery config to make use of template    |#2542
> |f4437b
> AIRFLOW-1522|Increase size of val column for variable table in |#2535
> |8a2d24
> AIRFLOW-1521|Template fields definition for bigquery_table_dele|#2534
> |f1a7c0
> AIRFLOW-1520|S3Hook uses boto2                                 |#2532
> |386583
> AIRFLOW-1519|Main DAG list page does not scale using client sid|#2531
> |d7d7ce
> AIRFLOW-1512|Add operator for running Python functions in a vir|#2446
> |14e6d7
> AIRFLOW-1507|Make src, dst and bucket parameters as templated i|#2516
> |d295cf
> AIRFLOW-1505|Document when Jinja substitution occurs           |#2523
> |984a87
> AIRFLOW-1504|Log Cluster Name on Dataproc Operator When Execute|#2517
> |1cd6c4
> AIRFLOW-1499|Eliminate duplicate and unneeded code             |-     |-
>
> AIRFLOW-1497|Hidden fields in connection form aren't reset when|#2507
> |d8da8b
> AIRFLOW-1493|Fix race condition with airflow run               |#2505
> |b2e175
> AIRFLOW-1492|Add metric for task success/failure               |#2504
> |fa84d4
> AIRFLOW-1489|Docs: Typo in BigQueryCheckOperator               |#2501
> |111ce5
> AIRFLOW-1483|Page size on model views is to large to render qui|#2497
> |04bfba
> AIRFLOW-1478|Chart -> Owner column should be sortable          |#2493
> |651e60
> AIRFLOW-1476|Add INSTALL file for source releases              |#2492
> |da76ac
> AIRFLOW-1474|Add dag_id regex for 'airflow clear' CLI command  |#2486
> |18f849
> AIRFLOW-1470|BashSensor Implementation                         |-     |-
>
> AIRFLOW-1459|integration rst doc is broken in github view      |#2481
> |322ec9
> AIRFLOW-1438|Scheduler batch queries should have a limit       |#2462
> |3547cb
> AIRFLOW-1437|BigQueryTableDeleteOperator should define deletion|#2459
> |b87903
> AIRFLOW-1432|NVD3 Charts do not have labeled axes and units cha|#2710
> |70ffa4
> AIRFLOW-1402|Cleanup SafeConfigParser DeprecationWarning       |#2435
> |38c86b
> AIRFLOW-1401|Standardize GCP project, region, and zone argument|#2439
> |b6d363
> AIRFLOW-1397|Airflow 1.8.1 - No data displays in Last Run Colum|-     |-
>
> AIRFLOW-1394|Add quote_character parameter to GoogleCloudStorag|#2428
> |9fd0be
> AIRFLOW-1389|BigQueryOperator should support `createDisposition|#2470
> |6e2640
> AIRFLOW-1384|Add ARGO/CaDC                                     |#2434
> |715947
> AIRFLOW-1368|Automatically remove the container when it exits  |#2653
> |d42d23
> AIRFLOW-1359|Provide GoogleCloudML operator for model evaluatio|#2407
> |194d1d
> AIRFLOW-1356|add `--celery_hostname` to `airflow worker`       |#2405
> |b9d7d1
> AIRFLOW-1352|Revert bad logging Handler                        |-     |-
>
> AIRFLOW-1350|Add "query_uri" parameter for Google DataProc oper|#2402
> |d32c72
> AIRFLOW-1348|Paginated UI has broken toggles after first page  |-     |-
>
> AIRFLOW-1345|Don't commit on each loop                         |#2397
> |0dd002
> AIRFLOW-1344|Builds failing on Python 3.5 with AttributeError  |#2394
> |2a5883
> AIRFLOW-1343|Add airflow default label to the dataproc operator|#2396
> |e4b240
> AIRFLOW-1338|gcp_dataflow_hook is incompatible with the recent |#2388
> |cf2605
> AIRFLOW-1337|Customize log format via config file              |#2392
> |4841e3
> AIRFLOW-1335|Use buffered logger                               |#2386
> |0d23d3
> AIRFLOW-1333|Enable copy function for Google Cloud Storage Hook|#2385
> |e2c383
> AIRFLOW-1331|Contrib.SparkSubmitOperator should allow --package|#2622
> |fbca8f
> AIRFLOW-1330|Connection.parse_from_uri doesn't work for google_|#2525
> |6e5e9d
> AIRFLOW-1324|Make the Druid operator/hook more general         |#2378
> |de99aa
> AIRFLOW-1323|Operators related to Dataproc should keep some par|#2636
> |ed248d
> AIRFLOW-1315|Add Qubole File and Partition Sensors             |-     |-
>
> AIRFLOW-1309|Add optional hive_tblproperties in HiveToDruidTran|-     |-
>
> AIRFLOW-1301|Add New Relic to Airflow user list                |#2359
> |355fc9
> AIRFLOW-1299|Google Dataproc cluster creation operator should s|#2358
> |c2b80e
> AIRFLOW-1289|Don't restrict scheduler threads to CPU cores     |#2353
> |8e23d2
> AIRFLOW-1286|BaseTaskRunner - Exception TypeError: a bytes-like|#2363
> |d8891d
> AIRFLOW-1277|Forbid creation of a known event with empty fields|#na
>  |65184a
> AIRFLOW-1276|Forbid event creation with end_date earlier than s|#na
>  |d5d02f
> AIRFLOW-1275|Fix `airflow pool` command exception              |#2346
> |9958aa
> AIRFLOW-1273|Google Cloud ML Version and Model CRUD Operator   |#2379
> |534a0e
> AIRFLOW-1272|Google Cloud ML Batch Prediction Operator         |#2390
> |e92d6b
> AIRFLOW-1271|Google Cloud ML Training Operator                 |#2408
> |0fc450
> AIRFLOW-1256|Add United Airlines as Airflow user               |#2332
> |d3484a
> AIRFLOW-1251|Add eRevalue as an Airflow user                   |#2331
> |8d5160
> AIRFLOW-1248|Fix inconsistent configuration name for worker tim|#2328
> |92314f
> AIRFLOW-1247|CLI: ignore all dependencies argument ignored     |#2441
> |e88ecf
> AIRFLOW-1245|Fix random failure of test_trigger_dag_for_date un|#2325
> |cef01b
> AIRFLOW-1244|Forbid creation of a pool with empty name         |#2324
> |df9a10
> AIRFLOW-1242|BigQueryHook assumes that a valid project_id can't|#2335
> |ffe616
> AIRFLOW-1237|Fix IN-predicate sqlalchemy warning               |#2320
> |a1f422
> AIRFLOW-1234|Cover utils.operator_helpers with unit tests      |#2317
> |d16537
> AIRFLOW-1233|Cover utils.json with unit tests                  |#2316
> |502410
> AIRFLOW-1232|Remove deprecated readfp warning                  |#2315
> |6ffaaf
> AIRFLOW-1231|Use flask_wtf.CSRFProtect instead of flask_wtf.Csr|#2313
> |cac49e
> AIRFLOW-1221|Fix DatabricksSubmitRunOperator Templating        |#2308
> |0fa104
> AIRFLOW-1217|Enable logging in Sqoop hook                      |#2307
> |4f459b
> AIRFLOW-1213|Add hcatalog parameters to the sqoop operator/hook|#2305
> |857850
> AIRFLOW-1208|Speed-up cli tests                                |#2301
> |21c142
> AIRFLOW-1207|Enable utils.helpers unit tests                   |#2300
> |8ac87b
> AIRFLOW-1203|Tests failing after oauth upgrade                 |#2296
> |3e9c66
> AIRFLOW-1201|Update deprecated 'nose-parameterized' library to |#2298
> |d2d3e4
> AIRFLOW-1193|Add Checkr to Airflow user list                   |#2276
> |707238
> AIRFLOW-1189|Get pandas DataFrame using BigQueryHook fails     |#2287
> |93666f
> AIRFLOW-1188|Add max_bad_records param to GoogleCloudStorageToB|#2286
> |443e6b
> AIRFLOW-1187|Obsolete package names in documentation           |-     |-
>
> AIRFLOW-1185|Incorrect url to PyPi                             |#2283
> |829755
> AIRFLOW-1181|Enable delete and list function for Google Cloud S|#2281
> |24f73c
> AIRFLOW-1179|Pandas 0.20 broke Google BigQuery hook            |#2279
> |ac9ccb
> AIRFLOW-1177|variable json deserialize does not work at set def|#2540
> |65319a
> AIRFLOW-1175|Add Pronto Tools to Airflow user list             |#2277
> |86aafa
> AIRFLOW-1173|Add Robinhood to list of Airflow users            |#2271
> |379115
> AIRFLOW-1165|airflow webservice crashes on ubuntu16 - python3  |-     |-
>
> AIRFLOW-1160|Update SparkSubmitOperator parameters             |#2265
> |2e3f07
> AIRFLOW-1155|Add Tails.com to community                        |#2261
> |2fa690
> AIRFLOW-1149|Allow custom filters to be added to jinja2        |#2258
> |48135a
> AIRFLOW-1141|Remove DAG.crawl_for_tasks method                 |#2275
> |a30fee
> AIRFLOW-1140|DatabricksSubmitRunOperator should template the "j|#2255
> |e6d316
> AIRFLOW-1136|Invalid parameters are not captured for Sqoop oper|#2252
> |2ef4db
> AIRFLOW-1125|Clarify documentation regarding fernet_key        |#2251
> |831f8d
> AIRFLOW-1122|Node strokes are too thin for people with color vi|#2246
> |a08761
> AIRFLOW-1121|airflow webserver --pid no longer write out pid fi|-     |-
>
> AIRFLOW-1118|Add evo.company to Airflow users                  |#2243
> |f16914
> AIRFLOW-1112|Log which pool is full in scheduler when pool slot|#2242
> |74c1ce
> AIRFLOW-1107|Add support for ftps non-default port             |#2240
> |4d0c2f
> AIRFLOW-1106|Add Groupalia/Letsbonus                           |#2239
> |945b42
> AIRFLOW-1095|ldap_auth memberOf should come from configuration |#2232
> |6b1c32
> AIRFLOW-1094|Invalid unit tests under `contrib/`               |#2234
> |219c50
> AIRFLOW-1091|As a release manager I want to be able to compare |#2231
> |bfae42
> AIRFLOW-1090|Add HBO                                           |#2230
> |177d34
> AIRFLOW-1089|Add Spark application arguments to SparkSubmitOper|#2229
> |e5b914
> AIRFLOW-1081|Task duration page is slow                        |#2226
> |0da512
> AIRFLOW-1075|Cleanup security docs                             |#2222
> |5a6f18
> AIRFLOW-1065|Add functionality for Azure Blob Storage          |#2216
> |f1bc5f
> AIRFLOW-1059|Reset_state_for_orphaned_task should operate in ba|#2205
> |e05d3b
> AIRFLOW-1058|Improvements for SparkSubmitOperator              |-     |-
>
> AIRFLOW-1051|Add a test for resetdb to CliTests                |#2198
> |15aee0
> AIRFLOW-1047|Airflow logs vulnerable to XSS                    |#2193
> |fe9ebe
> AIRFLOW-1045|Make log level configurable via airflow.cfg       |#2191
> |e739a5
> AIRFLOW-1043|Documentation issues for operators                |#2188
> |b55f41
> AIRFLOW-1041|DockerOperator replaces its xcom_push method with |#2274
> |03704c
> AIRFLOW-1040|Fix typos in comments/docstrings in models.py     |#2174
> |d8c0f5
> AIRFLOW-1036|Exponential backoff should use randomization      |#2262
> |66168e
> AIRFLOW-1035|Exponential backoff retry logic should use 2 as ba|#2196
> |4ec932
> AIRFLOW-1034|Make it possible to connect to S3 in sigv4 regions|#2181
> |4c0905
> AIRFLOW-1031|'scheduled__' may replace with DagRun.ID_PREFIX in|#2613
> |aa3844
> AIRFLOW-1030|HttpHook error when creating HttpSensor           |-     |-
>
> AIRFLOW-1028|Databricks Operator for Airflow                   |#2202
> |53ca50
> AIRFLOW-1024|Handle CeleryExecutor errors gracefully           |#2355
> |7af20f
> AIRFLOW-1018|Scheduler DAG processes can not log to stdout     |#2728
> |ef775d
> AIRFLOW-1016|Allow HTTP HEAD request method on HTTPSensor      |#2175
> |4c41f6
> AIRFLOW-1010|Add a convenience script for signing              |#2169
> |a2b65a
> AIRFLOW-1009|Remove SQLOperator from Concepts page             |#2168
> |7d1144
> AIRFLOW-1007|Jinja sandbox is vulnerable to RCE                |#2184
> |daa281
> AIRFLOW-1005|Speed up Airflow startup time                     |#na
>  |996dd3
> AIRFLOW-999 |Support for Redis database                        |#2165
> |8de850
> AIRFLOW-997 |Change setup.cfg to point to Apache instead of Max|#na
>  |75cd46
> AIRFLOW-995 |Update Github PR template                         |#2163
> |b62485
> AIRFLOW-994 |Add MiNODES to the AIRFLOW Active Users List      |#2159
> |ca1623
> AIRFLOW-991 |Mark_success while a task is running leads to fail|-     |-
>
> AIRFLOW-990 |DockerOperator fails when logging unicode string  |#2155
> |6bbf54
> AIRFLOW-988 |SLA Miss Callbacks Are Repeated if Email is Not be|#2415
> |6e74d4
> AIRFLOW-985 |Extend the sqoop operator/hook with additional par|#2177
> |82eb20
> AIRFLOW-984 |Subdags unrecognized when subclassing SubDagOperat|#2152
> |a8bd16
> AIRFLOW-979 |Add GovTech GDS                                   |#2149
> |b17bd3
> AIRFLOW-976 |Mark success running task causes it to fail       |-     |-
>
> AIRFLOW-969 |Catch bad python_callable argument at DAG construc|#2142
> |12901d
> AIRFLOW-963 |Some code examples are not rendered in the airflow|#2139
> |f69c1b
> AIRFLOW-960 |Add support for .editorconfig                     |#na
>  |f5cacc
> AIRFLOW-959 |.gitignore file is disorganized and incomplete    |#na
>  |3d3c14
> AIRFLOW-958 |Improve tooltip readability                       |#2134
> |b3c3eb
> AIRFLOW-950 |Missing AWS integrations on documentation::integra|#2552
> |01be02
> AIRFLOW-947 |Make PrestoHook surface better messages when the P|#na
>  |6dd4b3
> AIRFLOW-945 |Revert psycopg2 workaround when psycopg2 2.7.1 is |-     |-
>
> AIRFLOW-943 |Add Digital First Media to the Airflow users list |#2115
> |2cfe28
> AIRFLOW-942 |Add mytaxi to Airflow Users                       |#2111
> |d579e6
> AIRFLOW-935 |Impossible to use plugin executors                |#2120
> |08a784
> AIRFLOW-926 |jdbc connector is broken due to jaydebeapi api upd|#2651
> |07ed29
> AIRFLOW-917 |Incorrectly formatted failure status message      |#2109
> |b8164c
> AIRFLOW-916 |Fix ConfigParser deprecation warning              |#2108
> |ef6dd1
> AIRFLOW-911 |Add colouring and profiling info on tests         |#2106
> |4f52db
> AIRFLOW-903 |Add configuration setting for default DAG view.   |#2103
> |cadfae
> AIRFLOW-896 |BigQueryOperator fails to execute with certain inp|#2097
> |2bceee
> AIRFLOW-891 |Webserver Clock Should Include Day                |-     |-
>
> AIRFLOW-889 |Minor error in the docstrings for BaseOperator.   |#2084
> |50702d
> AIRFLOW-887 |Add compatibility with future v0.16               |#na
>  |50902d
> AIRFLOW-886 |Pass Operator result to post_execute hook         |#na
>  |4da361
> AIRFLOW-885 |Add Change.org to the list of Airflow users       |#2089
> |a279be
> AIRFLOW-882 |Code example in docs has unnecessary DAG>>Operator|#2088
> |baa4cd
> AIRFLOW-881 |Create SubDagOperator within DAG context manager w|#2087
> |0ed608
> AIRFLOW-880 |Fix remote log functionality inconsistencies for W|#2086
> |974b75
> AIRFLOW-877 |GoogleCloudStorageDownloadOperator: template_ext c|#2083
> |debc69
> AIRFLOW-875 |Allow HttpSensor params to be templated           |#2080
> |62f503
> AIRFLOW-871 |multiple places use logging.warn() instead of warn|#2082
> |21d775
> AIRFLOW-866 |Add FTPSensor                                     |#2070
> |5f87f8
> AIRFLOW-863 |Example DAG start dates should be recent to avoid |#2068
> |bbfd43
> AIRFLOW-862 |Add DaskExecutor                                  |#2067
> |6e2210
> AIRFLOW-860 |Circular module dependency prevents loading of cus|-     |-
>
> AIRFLOW-854 |Add Open Knowledge International to Airflow users |#2061
> |51a311
> AIRFLOW-842 |scheduler.clean_dirty raises warning: SAWarning: T|#2072
> |485280
> AIRFLOW-840 |Python3 encoding issue in Kerberos                |#2158
> |639336
> AIRFLOW-836 |The paused and queryview endpoints are vulnerable |#2054
> |6aca2c
> AIRFLOW-831 |Fix broken unit tests                             |#2050
> |b86194
> AIRFLOW-830 |Plugin manager should log to debug, not info      |-     |-
>
> AIRFLOW-829 |Reduce verbosity of successful Travis unit tests  |-     |-
>
> AIRFLOW-826 |Add Zendesk Hook                                  |#2066
> |a09762
> AIRFLOW-823 |Make task instance details available via API      |#2045
> |3f546e
> AIRFLOW-822 |Close the connection before throwing exception in |#2038
> |4b6c38
> AIRFLOW-821 |Scheduler dagbag importing not Py3 compatible     |#2039
> |fbb59b
> AIRFLOW-809 |SqlAlchemy is_ ColumnOperator Causing Errors in MS|-     |-
>
> AIRFLOW-802 |Integration of spark-submit                       |-     |-
>
> AIRFLOW-781 |Allow DataFlowJavaOperator to accept jar file stor|#2037
> |259c86
> AIRFLOW-770 |HDFS hooks should support alternative ways of gett|#2056
> |261b65
> AIRFLOW-756 |Refactor ssh_hook and ssh_operator                |-     |-
>
> AIRFLOW-751 |SFTP file transfer functionality                  |#1999
> |fe0ede
> AIRFLOW-725 |Make merge tool use OS' keyring for password stora|#1966
> |8c1695
> AIRFLOW-706 |Configuration shell commands are not split properl|#2053
> |0bb6f2
> AIRFLOW-705 |airflow.configuration.run_command output does not |-     |-
>
> AIRFLOW-681 |homepage doc link should point to apache's repo   |#2164
> |a8027a
> AIRFLOW-654 |SSL for AMQP w/ Celery(Executor)                  |#2333
> |868bfe
> AIRFLOW-645 |HttpHook ignores https                            |#2311
> |fd381a
> AIRFLOW-365 |Code view in subdag trigger exception             |#2043
> |cf102c
> AIRFLOW-300 |Add Google Pubsub hook and operator               |#2036
> |d231dc
> AIRFLOW-289 |Use datetime.utcnow() to keep airflow system indep|#2618
> |20c83e
> AIRFLOW-71  |docker_operator - pulling from private repositorie|#na
>  |d4406c
>
> Cheers,
> Chris
>