Posted to commits@airflow.apache.org by "Felix Uellendall (Jira)" <ji...@apache.org> on 2019/11/14 09:51:00 UTC
[jira] [Comment Edited] (AIRFLOW-2999) Add S3DownloadOperator
[ https://issues.apache.org/jira/browse/AIRFLOW-2999?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16974097#comment-16974097 ]
Felix Uellendall edited comment on AIRFLOW-2999 at 11/14/19 9:50 AM:
---------------------------------------------------------------------
If I split this into "S3DownloadOperator" >> "MySqlUploadOperator", I would have state persisted between the tasks, which I think should not exist.
was (Author: feluelle):
If I would split this into "S3DownloadOperator" >> "MySqlUploadOperator" I would have a state in between tasks that I think should not be.
> Add S3DownloadOperator
> ----------------------
>
> Key: AIRFLOW-2999
> URL: https://issues.apache.org/jira/browse/AIRFLOW-2999
> Project: Apache Airflow
> Issue Type: Task
> Affects Versions: 1.10.0
> Reporter: jack
> Priority: Major
>
> The [S3_hook|https://github.com/apache/incubator-airflow/blob/master/airflow/hooks/S3_hook.py#L177] has a get_key method that returns a boto3.s3.Object. It also has a load_file method, which uploads a file from the local file system to S3.
>
> What it doesn't have is a method to download a file from S3 to the local file system.
> Basically it should be something very simple: an extension of the get_key method with a parameter for the destination on the local file system, plus code that takes the boto3.s3.Object and saves it to disk. Note that it can be more than one file if the user chooses a folder in S3.
>
> +*Update:*+
> As discussed in the comments, instead of adding this functionality to the hook it is better to mirror the GoogleCloudStorageDownloadOperator and add an S3DownloadOperator.
>
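The proposed operator could look roughly like the following minimal sketch. The class name S3DownloadOperator comes from the issue; the parameter names (bucket, key, local_path) and the injectable s3_hook are assumptions made here so the sketch is self-contained and testable without an Airflow installation. A real implementation would subclass airflow.models.BaseOperator and construct an S3Hook from an aws_conn_id instead.

```python
import os


class S3DownloadOperator:
    """Sketch of the proposed operator, mirroring the
    GoogleCloudStorageDownloadOperator pattern.

    In Airflow this would subclass BaseOperator; the hook is
    injected here (a hypothetical convenience) purely so the
    sketch can be exercised without AWS credentials.
    """

    def __init__(self, bucket, key, local_path, s3_hook=None):
        self.bucket = bucket
        self.key = key
        self.local_path = local_path
        # The real operator would instead take aws_conn_id and
        # build the hook lazily inside execute().
        self.s3_hook = s3_hook

    def execute(self, context=None):
        # get_key returns a boto3.s3.Object (as the issue notes);
        # download_file is the boto3 Object method that streams
        # the object's contents to the local filesystem.
        obj = self.s3_hook.get_key(self.key, bucket_name=self.bucket)
        os.makedirs(os.path.dirname(self.local_path) or ".", exist_ok=True)
        obj.download_file(self.local_path)
        return self.local_path
```

Downloading an S3 "folder" (a key prefix) would then be a loop over list_keys with one download per key, which is why the issue notes that more than one file may be involved.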
--
This message was sent by Atlassian Jira
(v8.3.4#803005)