Posted to commits@airflow.apache.org by GitBox <gi...@apache.org> on 2021/06/24 08:53:57 UTC

[GitHub] [airflow] alexInhert opened a new issue #16627: add more filter options to list_keys of S3Hook

alexInhert opened a new issue #16627:
URL: https://github.com/apache/airflow/issues/16627


   The hook has [list_keys](https://github.com/apache/airflow/blob/c8a628abf484f0bd9805f44dd37e284d2b5ee7db/airflow/providers/amazon/aws/hooks/s3.py#L265), which allows filtering by prefix. It would be nice to also be able to filter by a file's creation date or last-modified date. Ideally, the function would support any kind of filter that boto3 allows.
   
   The use case: at the moment, if you want to get all files that were modified after date X, you have to list all the files and then fetch them one by one to check their last-modified date. This is not efficient.
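   For illustration, the current workaround looks something like this (a sketch; 'my-bucket' and 'data/' are placeholders; `list_keys` and `get_key` are existing S3Hook methods, and `get_key` returns a boto3 object whose `last_modified` attribute holds the timestamp):

   ```python
   from datetime import datetime, timezone

   from airflow.providers.amazon.aws.hooks.s3 import S3Hook

   hook = S3Hook()
   cutoff = datetime(2021, 6, 1, tzinfo=timezone.utc)

   # Today you must list every key under the prefix, then fetch each object's
   # metadata individually just to read its LastModified timestamp.
   keys = hook.list_keys(bucket_name='my-bucket', prefix='data/')
   recent = [
       key for key in keys
       if hook.get_key(key, bucket_name='my-bucket').last_modified >= cutoff
   ]
   ```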


[GitHub] [airflow] sunank200 edited a comment on issue #16627: add more filter options to list_keys of S3Hook

sunank200 edited a comment on issue #16627:
URL: https://github.com/apache/airflow/issues/16627#issuecomment-873503661


   @eladkal, @alexInhert, @potiuk I would love to add this feature and take this as my first issue on Airflow. Can I take this up?
   
   I can think of the following approach to implement this feature. The class [S3Hook](https://github.com/apache/airflow/blob/c8a628abf484f0bd9805f44dd37e284d2b5ee7db/airflow/providers/amazon/aws/hooks/s3.py#L96) interacts with AWS S3 using the boto3 library. The hook has [list_keys](https://github.com/apache/airflow/blob/c8a628abf484f0bd9805f44dd37e284d2b5ee7db/airflow/providers/amazon/aws/hooks/s3.py#L265), which uses boto3's [S3.Client.list_objects_v2](https://boto3.amazonaws.com/v1/documentation/api/latest/reference/services/s3.html#S3.Client.list_objects_v2) to fetch the list of keys. The list_objects_v2 documentation doesn't specify any argument for filtering keys by creation date or last-modified date, but the response does contain the last-modified date.

   The current implementation of list_keys in the S3Hook uses the paginate method of a [Paginator](https://boto3.amazonaws.com/v1/documentation/api/latest/guide/paginators.html) to iterate over the pages of API operation results. Hence, the approach I propose is to filter the keys by last-modified date using JMESPath, a query language for JSON that can be applied directly to paginated results: each page of results can be filtered with a JMESPath expression via the search method of the Paginator's PageIterator. The snippet below lists the keys whose last-modified datetime falls between `from_datetime` and `to_datetime`, both of which default to None.
   
   ```python
   paginator = self.get_conn().get_paginator('list_objects_v2')
   response = paginator.paginate(
       Bucket=bucket_name, Prefix=prefix, Delimiter=delimiter, PaginationConfig=config
   )

   # JMESPath expression, applied to each page of results: keep only the keys
   # whose LastModified timestamp lies between from_datetime and to_datetime.
   # to_string() is needed because LastModified is a datetime in the parsed
   # response, and the escaped quotes embed a JSON string literal to compare
   # against. (The None defaults would need handling before building this.)
   filtered_response = response.search(
       "Contents[?to_string(LastModified)>='\"{}\"' && "
       "to_string(LastModified)<='\"{}\"'].Key".format(from_datetime, to_datetime)
   )
   keys = list(filtered_response)
   ```
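   For concreteness, with `from_datetime = 2021-06-01 00:00:00+00:00` and `to_datetime = 2021-07-01 00:00:00+00:00`, the rendered expression would look like `Contents[?to_string(LastModified)>='"2021-06-01 00:00:00+00:00"' && to_string(LastModified)<='"2021-07-01 00:00:00+00:00"'].Key`. The string comparison should be sound here because the timestamps share a fixed format and timezone, so lexicographic order matches chronological order.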
   
   This change wouldn't affect dependents such as the `S3DeleteObjectsOperator` and `S3ListOperator` operators, the S3Hook methods `get_wildcard_key` and `delete_bucket`, or the `S3KeysUnchangedSensor`.
   
   Corresponding unit tests can be modified and added in [test_s3.py](https://github.com/apache/airflow/blob/5399f9124a4e75c7bb89e47c267d89b5280060ad/tests/providers/amazon/aws/hooks/test_s3.py#L146) and [test_gcs_to_s3.py](https://github.com/apache/airflow/blob/main/tests/providers/amazon/aws/transfers/test_gcs_to_s3.py). Once all the tests pass, the documentation can be updated at [docs](https://github.com/apache/airflow/tree/main/docs/apache-airflow-providers-amazon).
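   A test for the new filter could look roughly like this (a hypothetical sketch: it assumes `list_keys` gains `from_datetime`/`to_datetime` parameters, and it uses moto the way the existing S3Hook tests do):

   ```python
   from datetime import datetime, timezone

   import boto3
   from moto import mock_s3

   from airflow.providers.amazon.aws.hooks.s3 import S3Hook


   @mock_s3
   def test_list_keys_filtered_by_last_modified():
       boto3.client('s3', region_name='us-east-1').create_bucket(Bucket='mybucket')
       hook = S3Hook()
       hook.load_string('data', key='a', bucket_name='mybucket')

       past = datetime(2000, 1, 1, tzinfo=timezone.utc)
       future = datetime(2100, 1, 1, tzinfo=timezone.utc)
       # moto stamps the object with the current time, so the key falls inside
       # [past, future] but outside a window that ends in the past.
       assert hook.list_keys('mybucket', from_datetime=past, to_datetime=future) == ['a']
       assert hook.list_keys('mybucket', from_datetime=past, to_datetime=past) == []
   ```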


[GitHub] [airflow] LukeHong commented on issue #16627: add more filter options to list_keys of S3Hook

LukeHong commented on issue #16627:
URL: https://github.com/apache/airflow/issues/16627#issuecomment-869457030


   Could you provide some examples of filters that boto3 allows?


[GitHub] [airflow] sunank200 edited a comment on issue #16627: add more filter options to list_keys of S3Hook

sunank200 edited a comment on issue #16627:
URL: https://github.com/apache/airflow/issues/16627#issuecomment-874194007


   @potiuk I have raised the PR. Please review it whenever you have time. Would be happy to work on the feedback.


[GitHub] [airflow] kaxil closed issue #16627: add more filter options to list_keys of S3Hook

kaxil closed issue #16627:
URL: https://github.com/apache/airflow/issues/16627


   


[GitHub] [airflow] sunank200 commented on issue #16627: add more filter options to list_keys of S3Hook

sunank200 commented on issue #16627:
URL: https://github.com/apache/airflow/issues/16627#issuecomment-870383059


   Should the response format change as well? For example, adding the timestamp or other metadata along with the S3 key.


[GitHub] [airflow] dstandish commented on issue #16627: add more filter options to list_keys of S3Hook

dstandish commented on issue #16627:
URL: https://github.com/apache/airflow/issues/16627#issuecomment-873521269


   I did this for our internal repo; what I did **was refactor list_keys to call a list_objects method** so you could get the full objects and filter afterwards:
   
   ```python
       @provide_bucket_name
       def list_objects(
           self,
           bucket_name: Optional[str] = None,
           prefix: Optional[str] = None,
           delimiter: Optional[str] = None,
           page_size: Optional[int] = None,
           max_items: Optional[int] = None,
           start_after_key: Optional[str] = None,
           start_after_time: Optional['DateTime'] = None,
       ) -> List[S3Object]:
           """
           Lists keys in a bucket under prefix and not containing delimiter
   
           Args:
               bucket_name: the name of the bucket
               prefix: a key prefix
               delimiter: the delimiter marks key hierarchy.
               page_size: pagination size
               max_items: maximum items to return
               start_after_key: should return only keys greater than this key
            start_after_time: should return only keys with LastModified attr greater than this time
        """
        ...  # implementation elided in the original comment
    ```
   
   This lets you use either `start_after_key` (which is supported natively by list_objects_v2) or `start_after_time` (which is what you're after, and which requires listing out every object under the prefix).
   
   And if people want to use other object info for filtering, then they have the list_objects method to use.
   
   I think that might not be a bad way to go here.  
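   For reference, the body of such a list_objects could do something like the following (a sketch under the assumptions above, not the actual internal implementation; it returns the raw `Contents` dicts rather than an `S3Object` wrapper, in which case list_keys below would use `o['Key']`; it pushes `start_after_key` down to the API as `StartAfter` and applies the time filter client-side):

    ```python
        @provide_bucket_name
        def list_objects(self, bucket_name=None, prefix=None, delimiter=None,
                         page_size=None, max_items=None, start_after_key=None,
                         start_after_time=None):
            config = {'PageSize': page_size, 'MaxItems': max_items}
            paginator = self.get_conn().get_paginator('list_objects_v2')
            response = paginator.paginate(
                Bucket=bucket_name,
                Prefix=prefix or '',
                Delimiter=delimiter or '',
                StartAfter=start_after_key or '',  # filtered server-side by S3
                PaginationConfig=config,
            )
            objects = []
            for page in response:
                for obj in page.get('Contents', []):
                    # LastModified cannot be filtered server-side, so check each object
                    if start_after_time is None or obj['LastModified'] > start_after_time:
                        objects.append(obj)
            return objects
    ```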
   
   Then list_keys would be something like this:
   
   ```python
       @provide_bucket_name
       def list_keys(
           self,
           bucket_name: Optional[str] = None,
           prefix: Optional[str] = None,
           delimiter: Optional[str] = None,
           page_size: Optional[int] = None,
           max_items: Optional[int] = None,
           start_after_key: Optional[str] = None,
           start_after_time: Optional['DateTime'] = None,
       ) -> list:
           objects = self.list_objects(
               bucket_name=bucket_name,
               prefix=prefix,
               delimiter=delimiter,
               page_size=page_size,
               max_items=max_items,
               start_after_key=start_after_key,
               start_after_time=start_after_time,
           )
           return [o.Key for o in objects]
   ```
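   Example usage (hypothetical bucket and prefix; the `'DateTime'` hint above suggests a pendulum datetime, which is what's assumed here):

    ```python
    import pendulum

    from airflow.providers.amazon.aws.hooks.s3 import S3Hook  # or the subclass providing list_objects

    hook = S3Hook()
    # keys under data/ whose LastModified is after 2021-06-01 00:00 UTC
    new_keys = hook.list_keys(
        bucket_name='my-bucket',
        prefix='data/',
        start_after_time=pendulum.datetime(2021, 6, 1),
    )
    ```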


[GitHub] [airflow] ankit-wadhwaniai commented on issue #16627: add more filter options to list_keys of S3Hook

ankit-wadhwaniai commented on issue #16627:
URL: https://github.com/apache/airflow/issues/16627#issuecomment-922977373


   Yes @eladkal. Will finish it this week.


[GitHub] [airflow] eladkal commented on issue #16627: add more filter options to list_keys of S3Hook

eladkal commented on issue #16627:
URL: https://github.com/apache/airflow/issues/16627#issuecomment-922957709


   @sunank200 it seems your PR was almost done. Would you like to finish it?


[GitHub] [airflow] potiuk commented on issue #16627: add more filter options to list_keys of S3Hook

potiuk commented on issue #16627:
URL: https://github.com/apache/airflow/issues/16627#issuecomment-873542801


   Assigned it to you, @sunank200!

