Posted to commits@airflow.apache.org by "ASF GitHub Bot (Jira)" <ji...@apache.org> on 2020/04/07 11:04:00 UTC

[jira] [Commented] (AIRFLOW-6914) Add a default robots.txt to deny all search engines

    [ https://issues.apache.org/jira/browse/AIRFLOW-6914?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17077118#comment-17077118 ] 

ASF GitHub Bot commented on AIRFLOW-6914:
-----------------------------------------

kaxil commented on pull request #7653: [AIRFLOW-6914] Add a default robots.txt
URL: https://github.com/apache/airflow/pull/7653
----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
users@infra.apache.org


> Add a default robots.txt to deny all search engines
> ---------------------------------------------------
>
>                 Key: AIRFLOW-6914
>                 URL: https://issues.apache.org/jira/browse/AIRFLOW-6914
>             Project: Apache Airflow
>          Issue Type: Improvement
>          Components: security, ui
>    Affects Versions: 1.10.6, 1.10.7, 1.10.8, 1.10.9
>            Reporter: Kaxil Naik
>            Priority: Major
>              Labels: gsoc
>
> If the Airflow UI is publicly accessible, search engines such as Google can index it; if authentication has not been enabled, this is a serious security threat for a production cluster.
> Something like this should probably work:
> {code:python}
> import os
> from flask import send_from_directory
> @app.route('/robots.txt', methods=['GET'])
> def robotstxt():
>     # Serve the static robots.txt that tells crawlers not to index the UI
>     return send_from_directory(os.path.join(app.root_path, 'static', 'txt'),
>                                'robots.txt')
> {code}
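
For context, the file served by a route like the one above would be the standard deny-all robots.txt (the static/txt/ location is simply whatever directory the snippet points at). A minimal sketch of its contents:
{code}
User-agent: *
Disallow: /
{code}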



--
This message was sent by Atlassian Jira
(v8.3.4#803005)