Posted to commits@airflow.apache.org by GitBox <gi...@apache.org> on 2022/01/09 06:40:15 UTC

[GitHub] [airflow] rsg17 opened a new pull request #20769: [wip] [Part 2] Calendar to GCS Operator

rsg17 opened a new pull request #20769:
URL: https://github.com/apache/airflow/pull/20769


   related: #8471
   
   This PR adds a Google Calendar to GCS Operator to write Google Calendar Events to GCS.
   
   ---
   **^ Add meaningful description above**
   
   Read the **[Pull Request Guidelines](https://github.com/apache/airflow/blob/main/CONTRIBUTING.rst#pull-request-guidelines)** for more information.
   In case of fundamental code change, Airflow Improvement Proposal ([AIP](https://cwiki.apache.org/confluence/display/AIRFLOW/Airflow+Improvements+Proposals)) is needed.
   In case of a new dependency, check compliance with the [ASF 3rd Party License Policy](https://www.apache.org/legal/resolved.html#category-x).
   In case of backwards incompatible changes please leave a note in [UPDATING.md](https://github.com/apache/airflow/blob/main/UPDATING.md).
   


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: commits-unsubscribe@airflow.apache.org

For queries about this service, please contact Infrastructure at:
users@infra.apache.org



[GitHub] [airflow] rsg17 commented on a change in pull request #20769: [wip] [Part 2] Calendar to GCS Operator

Posted by GitBox <gi...@apache.org>.
rsg17 commented on a change in pull request #20769:
URL: https://github.com/apache/airflow/pull/20769#discussion_r785534928



##########
File path: airflow/providers/google/cloud/transfers/calendar_to_gcs.py
##########
@@ -0,0 +1,178 @@
+# Licensed to the Apache Software Foundation (ASF) under one
+# or more contributor license agreements.  See the NOTICE file
+# distributed with this work for additional information
+# regarding copyright ownership.  The ASF licenses this file
+# to you under the Apache License, Version 2.0 (the
+# "License"); you may not use this file except in compliance
+# with the License.  You may obtain a copy of the License at
+#
+#   http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing,
+# software distributed under the License is distributed on an
+# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+# KIND, either express or implied.  See the License for the
+# specific language governing permissions and limitations
+# under the License.
+
+import json
+from datetime import datetime
+from tempfile import NamedTemporaryFile
+from typing import Any, List, Optional, Sequence, Union
+
+from airflow.models import BaseOperator
+from airflow.providers.google.cloud.hooks.gcs import GCSHook
+from airflow.providers.google.suite.hooks.calendar import GoogleCalendarHook
+
+
+class GoogleCalendarToGCSOperator(BaseOperator):

Review comment:
       Added







[GitHub] [airflow] eladkal merged pull request #20769: [Part 2] Calendar to GCS Operator

Posted by GitBox <gi...@apache.org>.
eladkal merged pull request #20769:
URL: https://github.com/apache/airflow/pull/20769


   





[GitHub] [airflow] rsg17 commented on pull request #20769: [wip] [Part 2] Calendar to GCS Operator

Posted by GitBox <gi...@apache.org>.
rsg17 commented on pull request #20769:
URL: https://github.com/apache/airflow/pull/20769#issuecomment-1013972440


   > > I read through this [guide](https://github.com/apache/airflow/blob/main/TESTING.rst#airflow-system-tests). I think I need to make some changes to the variables.env file, but I am not sure what those changes should be.
   > 
   > From your example DAG it looks like you need to set up two variables:
   > 
   > * GCP_GCS_BUCKET
   > * CALENDAR_ID
   > 
   > Apart from that you will need to add a credentials key because you are using it in test:
   > 
   > ```python
   > @pytest.mark.credential_file(GCP_GCS_KEY)
   > ```
   > 
   > The file should be named `gcp_gcs.json` and put according to the guide you linked:
   > 
   > > If your system tests need some credential files to be available for an authentication with external systems, make sure to keep these credentials in the files/airflow-breeze-config/keys directory. Mark your tests with @pytest.mark.credential_file() so that they are skipped if such a credential file is not there. The tests should read the right credentials and authenticate them on their own. The credentials are read in Breeze from the /files directory. The local "files" folder is mounted to the "/files" folder in Breeze.
   > 
   > Regarding this question:
   > 
   > > Also, I am not sure about the --forward-credentials option for [running system tests](https://github.com/apache/airflow/blob/main/TESTING.rst#the-typical-system-test-session). Which credentials will it forward? Do I need to put the credentials somewhere?
   > 
   > You need to use credentials to some GCP project. As far as I know we don't have airflow project that can be used for this purpose (cc @potiuk to confirm).
   
   Thank you for all your help @turbaszek. 
   I am running into an authentication error. I don't see this error when I connect to GCS directly. Could you take a look?
   
   This gives the error location:

   ```
   ___________________________________ GoogleCalendarToGCSExampleDagsSystemTest.test_run_example_dag_function ____________________________________
   /usr/local/lib/python3.7/contextlib.py:73: in inner
       with self._recreate_cm():
   /usr/local/lib/python3.7/contextlib.py:112: in __enter__
       return next(self.gen)
   tests/test_utils/gcp_system_helpers.py:108: in provide_gcp_context
       f"--key-file={key_file_path}",
   tests/test_utils/logging_command_executor.py:92: in execute_cmd
       env=env,
   /usr/local/lib/python3.7/subprocess.py:800: in __init__
       restore_signals, start_new_session)
   _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

   self = <subprocess.Popen object at 0x4065c8b250>
   args = ['gcloud', 'auth', 'activate-service-account', '--key-file=/files/airflow-breeze-config/keys/gcp_gcs.json'], executable = b'gcloud'
   preexec_fn = None, close_fds = True, pass_fds = (), cwd = None, env = None, startupinfo = None, creationflags = 0, shell = False, p2cread = -1
   p2cwrite = -1, c2pread = 18, c2pwrite = 19, errread = 20, errwrite = 21, restore_signals = True, start_new_session = False
   ```
   
   Here is the error:
   `E               FileNotFoundError: [Errno 2] No such file or directory: 'gcloud': 'gcloud'`
   
   I tried running `gcloud auth activate-service-account` with the same key and same service account from my laptop - that worked. I was also able to upload a file to GCS using the same key: https://gist.github.com/rsg17/b582414f630c2ac38323b87213b270d5
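   [Editor's note] The `FileNotFoundError` above means Python's `subprocess` could not locate the `gcloud` binary on `PATH` inside the Breeze container, rather than an authentication failure. A minimal diagnostic sketch (illustrative only, not part of the thread):

   ```python
   import shutil

   # subprocess.Popen raises FileNotFoundError when the executable is not on PATH.
   # shutil.which performs the same PATH lookup, so it can confirm the cause
   # before running the full system test.
   gcloud_path = shutil.which("gcloud")
   if gcloud_path is None:
       print("gcloud is not on PATH in this environment (matches the error above)")
   else:
       print(f"gcloud found at {gcloud_path}")
   ```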





[GitHub] [airflow] rsg17 commented on a change in pull request #20769: [wip] [Part 2] Calendar to GCS Operator

Posted by GitBox <gi...@apache.org>.
rsg17 commented on a change in pull request #20769:
URL: https://github.com/apache/airflow/pull/20769#discussion_r785535294



##########
File path: airflow/providers/google/cloud/transfers/calendar_to_gcs.py
##########
@@ -0,0 +1,178 @@
+# Licensed to the Apache Software Foundation (ASF) under one
+# or more contributor license agreements.  See the NOTICE file
+# distributed with this work for additional information
+# regarding copyright ownership.  The ASF licenses this file
+# to you under the Apache License, Version 2.0 (the
+# "License"); you may not use this file except in compliance
+# with the License.  You may obtain a copy of the License at
+#
+#   http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing,
+# software distributed under the License is distributed on an
+# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+# KIND, either express or implied.  See the License for the
+# specific language governing permissions and limitations
+# under the License.
+
+import json
+from datetime import datetime
+from tempfile import NamedTemporaryFile
+from typing import Any, List, Optional, Sequence, Union
+
+from airflow.models import BaseOperator
+from airflow.providers.google.cloud.hooks.gcs import GCSHook
+from airflow.providers.google.suite.hooks.calendar import GoogleCalendarHook
+
+
+class GoogleCalendarToGCSOperator(BaseOperator):
+    """
+    Writes Google Calendar data into Google Cloud Storage.
+
+    .. seealso::
+        For more information on how to use this operator, take a look at the guide:
+        :ref:TODO `howto/operator:GoogleCalendarToGCSOperator`

Review comment:
       Removed `TODO`







[GitHub] [airflow] rsg17 commented on a change in pull request #20769: [wip] [Part 2] Calendar to GCS Operator

Posted by GitBox <gi...@apache.org>.
rsg17 commented on a change in pull request #20769:
URL: https://github.com/apache/airflow/pull/20769#discussion_r784365399



##########
File path: airflow/providers/google/cloud/transfers/calendar_to_gcs.py
##########
@@ -0,0 +1,178 @@
+# Licensed to the Apache Software Foundation (ASF) under one
+# or more contributor license agreements.  See the NOTICE file
+# distributed with this work for additional information
+# regarding copyright ownership.  The ASF licenses this file
+# to you under the Apache License, Version 2.0 (the
+# "License"); you may not use this file except in compliance
+# with the License.  You may obtain a copy of the License at
+#
+#   http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing,
+# software distributed under the License is distributed on an
+# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+# KIND, either express or implied.  See the License for the
+# specific language governing permissions and limitations
+# under the License.
+
+import json
+from datetime import datetime
+from tempfile import NamedTemporaryFile
+from typing import Any, List, Optional, Sequence, Union
+
+from airflow.models import BaseOperator
+from airflow.providers.google.cloud.hooks.gcs import GCSHook
+from airflow.providers.google.suite.hooks.calendar import GoogleCalendarHook
+
+
+class GoogleCalendarToGCSOperator(BaseOperator):
+    """
+    Writes Google Calendar data into Google Cloud Storage.
+
+    .. seealso::
+        For more information on how to use this operator, take a look at the guide:
+        :ref:TODO `howto/operator:GoogleCalendarToGCSOperator`
+
+    :param calendar_id: The Google Calendar ID to interact with.
+    :type calendar_id: str
+    :param destination_bucket: The destination Google cloud storage bucket where the
+        report should be written to. (templated)
+    :type destination_bucket: str
+    :param destination_path: The Google cloud storage URI array for the object created by the operator.
+        For example: ``path/to/my/files``.
+    :type destination_path: str
+    :param gcp_conn_id: The connection ID to use when fetching connection info.
+    :type gcp_conn_id: str
+    :param delegate_to: The account to impersonate using domain-wide delegation of authority,
+        if any. For this to work, the service account making the request must have
+        domain-wide delegation enabled.
+    :type delegate_to: str
+    :param impersonation_chain: Optional service account to impersonate using short-term
+        credentials, or chained list of accounts required to get the access_token
+        of the last account in the list, which will be impersonated in the request.
+        If set as a string, the account must grant the originating account
+        the Service Account Token Creator IAM role.
+        If set as a sequence, the identities from the list must grant
+        Service Account Token Creator IAM role to the directly preceding identity, with first
+        account from the list granting this role to the originating account (templated).
+    :type impersonation_chain: Union[str, Sequence[str]]
+    """
+
+    template_fields = [
+        "calendar_id",
+        "destination_bucket",
+        "destination_path",
+        "impersonation_chain",
+    ]
+
+    def __init__(
+        self,
+        *,
+        destination_bucket: str,
+        calendar_id: str = "primary",
+        i_cal_uid: Optional[str] = None,
+        max_attendees: Optional[int] = None,
+        max_results: Optional[int] = None,
+        order_by: Optional[str] = None,
+        private_extended_property: Optional[str] = None,
+        q: Optional[str] = None,

Review comment:
       Yes - will rename it to `free_text_query` or something similar based on https://developers.google.com/calendar/api/v3/reference/events/list







[GitHub] [airflow] rsg17 edited a comment on pull request #20769: [wip] [Part 2] Calendar to GCS Operator

Posted by GitBox <gi...@apache.org>.
rsg17 edited a comment on pull request #20769:
URL: https://github.com/apache/airflow/pull/20769#issuecomment-1013972440


   > > I read through this [guide](https://github.com/apache/airflow/blob/main/TESTING.rst#airflow-system-tests). I think I need to make some changes to the variables.env file, but I am not sure what those changes should be.
   > 
   > From your example DAG it looks like you need to set up two variables:
   > 
   > * GCP_GCS_BUCKET
   > * CALENDAR_ID
   > 
   > Apart from that you will need to add a credentials key because you are using it in test:
   > 
   > ```python
   > @pytest.mark.credential_file(GCP_GCS_KEY)
   > ```
   > 
   > The file should be named `gcp_gcs.json` and put according to the guide you linked:
   > 
   > > If your system tests need some credential files to be available for an authentication with external systems, make sure to keep these credentials in the files/airflow-breeze-config/keys directory. Mark your tests with @pytest.mark.credential_file() so that they are skipped if such a credential file is not there. The tests should read the right credentials and authenticate them on their own. The credentials are read in Breeze from the /files directory. The local "files" folder is mounted to the "/files" folder in Breeze.
   > 
   > Regarding this question:
   > 
   > > Also, I am not sure about the --forward-credentials option for [running system tests](https://github.com/apache/airflow/blob/main/TESTING.rst#the-typical-system-test-session). Which credentials will it forward? Do I need to put the credentials somewhere?
   > 
   > You need to use credentials to some GCP project. As far as I know we don't have airflow project that can be used for this purpose (cc @potiuk to confirm).
   
   Thank you for all your help @turbaszek. 
   I am running into an authentication error through Breeze. I don't see this error when I run from my laptop (outside Breeze). I have added the key file, added the corresponding variables to variables.env, and I am forwarding credentials to Breeze.
   
   Could you help take a look?
   
   This gives the error location:

   ```
   ___________________________________ GoogleCalendarToGCSExampleDagsSystemTest.test_run_example_dag_function ____________________________________
   /usr/local/lib/python3.7/contextlib.py:73: in inner
       with self._recreate_cm():
   /usr/local/lib/python3.7/contextlib.py:112: in __enter__
       return next(self.gen)
   tests/test_utils/gcp_system_helpers.py:108: in provide_gcp_context
       f"--key-file={key_file_path}",
   tests/test_utils/logging_command_executor.py:92: in execute_cmd
       env=env,
   /usr/local/lib/python3.7/subprocess.py:800: in __init__
       restore_signals, start_new_session)
   _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

   self = <subprocess.Popen object at 0x4065c8b250>
   args = ['gcloud', 'auth', 'activate-service-account', '--key-file=/files/airflow-breeze-config/keys/gcp_gcs.json'], executable = b'gcloud'
   preexec_fn = None, close_fds = True, pass_fds = (), cwd = None, env = None, startupinfo = None, creationflags = 0, shell = False, p2cread = -1
   p2cwrite = -1, c2pread = 18, c2pwrite = 19, errread = 20, errwrite = 21, restore_signals = True, start_new_session = False
   ```
   
   Here is the error:
   `E               FileNotFoundError: [Errno 2] No such file or directory: 'gcloud': 'gcloud'`
   
   I tried running `gcloud auth activate-service-account` with the same key and same service account from my laptop - that worked. I was also able to upload a file to GCS using the same key: https://gist.github.com/rsg17/b582414f630c2ac38323b87213b270d5





[GitHub] [airflow] josh-fell commented on a change in pull request #20769: [wip] [Part 2] Calendar to GCS Operator

Posted by GitBox <gi...@apache.org>.
josh-fell commented on a change in pull request #20769:
URL: https://github.com/apache/airflow/pull/20769#discussion_r780880018



##########
File path: airflow/providers/google/cloud/transfers/calendar_to_gcs.py
##########
@@ -0,0 +1,178 @@
+# Licensed to the Apache Software Foundation (ASF) under one
+# or more contributor license agreements.  See the NOTICE file
+# distributed with this work for additional information
+# regarding copyright ownership.  The ASF licenses this file
+# to you under the Apache License, Version 2.0 (the
+# "License"); you may not use this file except in compliance
+# with the License.  You may obtain a copy of the License at
+#
+#   http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing,
+# software distributed under the License is distributed on an
+# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+# KIND, either express or implied.  See the License for the
+# specific language governing permissions and limitations
+# under the License.
+
+import json
+from datetime import datetime
+from tempfile import NamedTemporaryFile
+from typing import Any, List, Optional, Sequence, Union
+
+from airflow.models import BaseOperator
+from airflow.providers.google.cloud.hooks.gcs import GCSHook
+from airflow.providers.google.suite.hooks.calendar import GoogleCalendarHook
+
+
+class GoogleCalendarToGCSOperator(BaseOperator):
+    """
+    Writes Google Calendar data into Google Cloud Storage.
+
+    .. seealso::
+        For more information on how to use this operator, take a look at the guide:
+        :ref:TODO `howto/operator:GoogleCalendarToGCSOperator`
+
+    :param calendar_id: The Google Calendar ID to interact with.
+    :type calendar_id: str
+    :param destination_bucket: The destination Google cloud storage bucket where the
+        report should be written to. (templated)
+    :type destination_bucket: str
+    :param destination_path: The Google cloud storage URI array for the object created by the operator.
+        For example: ``path/to/my/files``.
+    :type destination_path: str
+    :param gcp_conn_id: The connection ID to use when fetching connection info.
+    :type gcp_conn_id: str
+    :param delegate_to: The account to impersonate using domain-wide delegation of authority,
+        if any. For this to work, the service account making the request must have
+        domain-wide delegation enabled.
+    :type delegate_to: str
+    :param impersonation_chain: Optional service account to impersonate using short-term
+        credentials, or chained list of accounts required to get the access_token
+        of the last account in the list, which will be impersonated in the request.
+        If set as a string, the account must grant the originating account
+        the Service Account Token Creator IAM role.
+        If set as a sequence, the identities from the list must grant
+        Service Account Token Creator IAM role to the directly preceding identity, with first
+        account from the list granting this role to the originating account (templated).
+    :type impersonation_chain: Union[str, Sequence[str]]
+    """
+
+    template_fields = [
+        "calendar_id",
+        "destination_bucket",
+        "destination_path",
+        "impersonation_chain",
+    ]
+
+    def __init__(
+        self,
+        *,
+        destination_bucket: str,
+        calendar_id: str = "primary",
+        i_cal_uid: Optional[str] = None,
+        max_attendees: Optional[int] = None,
+        max_results: Optional[int] = None,
+        order_by: Optional[str] = None,
+        private_extended_property: Optional[str] = None,
+        q: Optional[str] = None,

Review comment:
       It would be great if this parameter name was something more verbose. I don't think users would have a practical idea of what this is related to by its name alone.

##########
File path: docs/apache-airflow-providers-google/operators/transfer/calendar_to_gcs.rst
##########
@@ -0,0 +1,47 @@
+ .. Licensed to the Apache Software Foundation (ASF) under one
+    or more contributor license agreements.  See the NOTICE file
+    distributed with this work for additional information
+    regarding copyright ownership.  The ASF licenses this file
+    to you under the Apache License, Version 2.0 (the
+    "License"); you may not use this file except in compliance
+    with the License.  You may obtain a copy of the License at
+
+ ..   http://www.apache.org/licenses/LICENSE-2.0
+
+ .. Unless required by applicable law or agreed to in writing,
+    software distributed under the License is distributed on an
+    "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+    KIND, either express or implied.  See the License for the
+    specific language governing permissions and limitations
+    under the License.
+
+Google Calendar to Google Cloud Storage Transfer Operators
+========================================================

Review comment:
       ```suggestion
   Google Calendar to Google Cloud Storage Transfer Operators
   ==========================================================
   ```
   The underline needs to be at least as long as that which is being underlined.

##########
File path: airflow/providers/google/cloud/transfers/calendar_to_gcs.py
##########
@@ -0,0 +1,178 @@
+# Licensed to the Apache Software Foundation (ASF) under one
+# or more contributor license agreements.  See the NOTICE file
+# distributed with this work for additional information
+# regarding copyright ownership.  The ASF licenses this file
+# to you under the Apache License, Version 2.0 (the
+# "License"); you may not use this file except in compliance
+# with the License.  You may obtain a copy of the License at
+#
+#   http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing,
+# software distributed under the License is distributed on an
+# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+# KIND, either express or implied.  See the License for the
+# specific language governing permissions and limitations
+# under the License.
+
+import json
+from datetime import datetime
+from tempfile import NamedTemporaryFile
+from typing import Any, List, Optional, Sequence, Union
+
+from airflow.models import BaseOperator
+from airflow.providers.google.cloud.hooks.gcs import GCSHook
+from airflow.providers.google.suite.hooks.calendar import GoogleCalendarHook
+
+
+class GoogleCalendarToGCSOperator(BaseOperator):
+    """
+    Writes Google Calendar data into Google Cloud Storage.
+
+    .. seealso::
+        For more information on how to use this operator, take a look at the guide:
+        :ref:TODO `howto/operator:GoogleCalendarToGCSOperator`
+
+    :param calendar_id: The Google Calendar ID to interact with.
+    :type calendar_id: str
+    :param destination_bucket: The destination Google cloud storage bucket where the
+        report should be written to. (templated)
+    :type destination_bucket: str
+    :param destination_path: The Google cloud storage URI array for the object created by the operator.
+        For example: ``path/to/my/files``.
+    :type destination_path: str
+    :param gcp_conn_id: The connection ID to use when fetching connection info.
+    :type gcp_conn_id: str
+    :param delegate_to: The account to impersonate using domain-wide delegation of authority,
+        if any. For this to work, the service account making the request must have
+        domain-wide delegation enabled.
+    :type delegate_to: str
+    :param impersonation_chain: Optional service account to impersonate using short-term
+        credentials, or chained list of accounts required to get the access_token
+        of the last account in the list, which will be impersonated in the request.
+        If set as a string, the account must grant the originating account
+        the Service Account Token Creator IAM role.
+        If set as a sequence, the identities from the list must grant
+        Service Account Token Creator IAM role to the directly preceding identity, with first
+        account from the list granting this role to the originating account (templated).
+    :type impersonation_chain: Union[str, Sequence[str]]
+    """
+
+    template_fields = [
+        "calendar_id",
+        "destination_bucket",
+        "destination_path",
+        "impersonation_chain",
+    ]
+
+    def __init__(
+        self,
+        *,
+        destination_bucket: str,
+        calendar_id: str = "primary",
+        i_cal_uid: Optional[str] = None,
+        max_attendees: Optional[int] = None,
+        max_results: Optional[int] = None,
+        order_by: Optional[str] = None,
+        private_extended_property: Optional[str] = None,
+        q: Optional[str] = None,
+        shared_extended_property: Optional[str] = None,
+        show_deleted: Optional[bool] = False,
+        show_hidden_invitation: Optional[bool] = False,

Review comment:
       ```suggestion
           show_deleted: bool = False,
           show_hidden_invitation: bool = False,
   ```
   The `Optional` typing implies that this value _could_ be `None`. It doesn't seem like this is the case.
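   [Editor's note] The reviewer's point can be shown with a minimal sketch (function names are hypothetical, not from the PR):

   ```python
   from typing import Optional

   # Optional[bool] tells readers and type checkers that None is a valid,
   # meaningful value that callers might pass and the code must handle.
   def fetch_events_optional(show_deleted: Optional[bool] = None) -> bool:
       # None has to be interpreted somehow -- here it falls back to False.
       return bool(show_deleted)

   # A plain bool with a default documents that None is never expected,
   # which matches how the operator actually uses these flags.
   def fetch_events_plain(show_deleted: bool = False) -> bool:
       return show_deleted
   ```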

##########
File path: airflow/providers/google/cloud/transfers/calendar_to_gcs.py
##########
@@ -0,0 +1,178 @@
+# Licensed to the Apache Software Foundation (ASF) under one
+# or more contributor license agreements.  See the NOTICE file
+# distributed with this work for additional information
+# regarding copyright ownership.  The ASF licenses this file
+# to you under the Apache License, Version 2.0 (the
+# "License"); you may not use this file except in compliance
+# with the License.  You may obtain a copy of the License at
+#
+#   http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing,
+# software distributed under the License is distributed on an
+# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+# KIND, either express or implied.  See the License for the
+# specific language governing permissions and limitations
+# under the License.
+
+import json
+from datetime import datetime
+from tempfile import NamedTemporaryFile
+from typing import Any, List, Optional, Sequence, Union
+
+from airflow.models import BaseOperator
+from airflow.providers.google.cloud.hooks.gcs import GCSHook
+from airflow.providers.google.suite.hooks.calendar import GoogleCalendarHook
+
+
+class GoogleCalendarToGCSOperator(BaseOperator):

Review comment:
       There are several input parameters to this operator that are missing from the docstring. Would you mind adding them? The docstring is directly used to build the operator's API documentation in Airflow.

##########
File path: airflow/providers/google/cloud/transfers/calendar_to_gcs.py
##########
@@ -0,0 +1,178 @@
+# Licensed to the Apache Software Foundation (ASF) under one
+# or more contributor license agreements.  See the NOTICE file
+# distributed with this work for additional information
+# regarding copyright ownership.  The ASF licenses this file
+# to you under the Apache License, Version 2.0 (the
+# "License"); you may not use this file except in compliance
+# with the License.  You may obtain a copy of the License at
+#
+#   http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing,
+# software distributed under the License is distributed on an
+# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+# KIND, either express or implied.  See the License for the
+# specific language governing permissions and limitations
+# under the License.
+
+import json
+from datetime import datetime
+from tempfile import NamedTemporaryFile
+from typing import Any, List, Optional, Sequence, Union
+
+from airflow.models import BaseOperator
+from airflow.providers.google.cloud.hooks.gcs import GCSHook
+from airflow.providers.google.suite.hooks.calendar import GoogleCalendarHook
+
+
+class GoogleCalendarToGCSOperator(BaseOperator):
+    """
+    Writes Google Calendar data into Google Cloud Storage.
+
+    .. seealso::
+        For more information on how to use this operator, take a look at the guide:
+        :ref:TODO `howto/operator:GoogleCalendarToGCSOperator`
+
+    :param calendar_id: The Google Calendar ID to interact with.
+    :type calendar_id: str
+    :param destination_bucket: The destination Google cloud storage bucket where the
+        report should be written to. (templated)
+    :type destination_bucket: str
+    :param destination_path: The Google cloud storage URI array for the object created by the operator.
+        For example: ``path/to/my/files``.
+    :type destination_path: str
+    :param gcp_conn_id: The connection ID to use when fetching connection info.
+    :type gcp_conn_id: str
+    :param delegate_to: The account to impersonate using domain-wide delegation of authority,
+        if any. For this to work, the service account making the request must have
+        domain-wide delegation enabled.
+    :type delegate_to: str
+    :param impersonation_chain: Optional service account to impersonate using short-term
+        credentials, or chained list of accounts required to get the access_token
+        of the last account in the list, which will be impersonated in the request.
+        If set as a string, the account must grant the originating account
+        the Service Account Token Creator IAM role.
+        If set as a sequence, the identities from the list must grant
+        Service Account Token Creator IAM role to the directly preceding identity, with first
+        account from the list granting this role to the originating account (templated).
+    :type impersonation_chain: Union[str, Sequence[str]]
+    """
+
+    template_fields = [
+        "calendar_id",
+        "destination_bucket",
+        "destination_path",
+        "impersonation_chain",
+    ]
+
+    def __init__(
+        self,
+        *,
+        destination_bucket: str,
+        calendar_id: str = "primary",
+        i_cal_uid: Optional[str] = None,
+        max_attendees: Optional[int] = None,
+        max_results: Optional[int] = None,
+        order_by: Optional[str] = None,
+        private_extended_property: Optional[str] = None,
+        q: Optional[str] = None,
+        shared_extended_property: Optional[str] = None,
+        show_deleted: Optional[bool] = False,
+        show_hidden_invitation: Optional[bool] = False,
+        single_events: Optional[bool] = False,
+        sync_token: Optional[str] = None,
+        time_max: Optional[datetime] = None,
+        time_min: Optional[datetime] = None,
+        time_zone: Optional[str] = None,
+        updated_min: Optional[datetime] = None,
+        destination_path: Optional[str] = None,
+        gcp_conn_id: str = "google_cloud_default",
+        delegate_to: Optional[str] = None,
+        impersonation_chain: Optional[Union[str, Sequence[str]]] = None,
+        **kwargs,
+    ) -> None:
+        super().__init__(**kwargs)
+        self.gcp_conn_id = gcp_conn_id
+        self.calendar_id = calendar_id
+        self.i_cal_uid = i_cal_uid
+        self.max_attendees = max_attendees
+        self.max_results = max_results
+        self.order_by = order_by
+        self.private_extended_property = private_extended_property
+        self.q = q
+        self.shared_extended_property = shared_extended_property
+        self.show_deleted = show_deleted
+        self.show_hidden_invitation = show_hidden_invitation
+        self.single_events = single_events
+        self.sync_token = sync_token
+        self.time_max = time_max
+        self.time_min = time_min
+        self.time_zone = time_zone
+        self.updated_min = updated_min
+        self.destination_bucket = destination_bucket
+        self.destination_path = destination_path
+        self.delegate_to = delegate_to
+        self.impersonation_chain = impersonation_chain
+
+    def _upload_data(
+        self,
+        gcs_hook: GCSHook,
+        hook: GoogleCalendarHook,
+        events: List[Any],
+    ) -> str:
+        # Construct destination file path
+        file_name = f"{self.calendar_id}.json".replace(" ", "_")
+        dest_file_name = (
+            f"{self.destination_path.strip('/')}/{file_name}" if self.destination_path else file_name
+        )
+
+        with NamedTemporaryFile("w+") as temp_file:
+            # Write data
+            json.dump(events, temp_file)
+            temp_file.flush()
+
+            # Upload to GCS
+            gcs_hook.upload(
+                bucket_name=self.destination_bucket,
+                object_name=dest_file_name,
+                filename=temp_file.name,
+            )
+        return dest_file_name
+
+    def execute(self, context):
+        calendar_hook = GoogleCalendarHook(
+            gcp_conn_id=self.gcp_conn_id,
+            delegate_to=self.delegate_to,
+            impersonation_chain=self.impersonation_chain,
+        )
+        gcs_hook = GCSHook(
+            gcp_conn_id=self.gcp_conn_id,
+            delegate_to=self.delegate_to,
+            impersonation_chain=self.impersonation_chain,
+        )
+
+        # Pull data and upload
+        destination_array: List[str] = []
+        events = calendar_hook.get_events(
+            calendar_id=self.calendar_id,
+            i_cal_uid=self.i_cal_uid,
+            max_attendees=self.max_attendees,
+            max_results=self.max_results,
+            order_by=self.order_by,
+            private_extended_property=self.private_extended_property,
+            q=self.q,
+            shared_extended_property=self.shared_extended_property,
+            show_deleted=self.show_deleted,
+            show_hidden_invitation=self.show_hidden_invitation,
+            single_events=self.single_events,
+            sync_token=self.sync_token,
+            time_max=self.time_max,
+            time_min=self.time_min,
+            time_zone=self.time_zone,
+            updated_min=self.updated_min,
+        )
+        gcs_path_to_file = self._upload_data(gcs_hook, calendar_hook, events)
+        destination_array.append(gcs_path_to_file)
+
+        self.xcom_push(context, "destination_objects", destination_array)
+        return destination_array

Review comment:
       This feels redundant. The operator is effectively pushing the same value twice as an `XCom` but with different keys: the direct `xcom_push()` will push with an `XCom` key of "destination_objects" while the return would push with a key of "return_value" (assuming users don't set `do_xcom_push=False`).
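In simplified form (a pure-Python stand-in for Airflow's XCom storage, not the real API), the two pushes end up storing the same list under two different keys:

```python
# Simplified model of the XCom table: one dict per task instance, keyed
# by the XCom key. Illustration only -- not Airflow's real storage.
xcom_store = {}

def xcom_push(key, value):
    xcom_store[key] = value

destination_array = ["bucket/path/primary.json"]

# The explicit push inside execute():
xcom_push("destination_objects", destination_array)
# The implicit push of the return value (with the default do_xcom_push=True):
xcom_push("return_value", destination_array)

# The same list is now stored twice under two different keys.
print(xcom_store["destination_objects"] == xcom_store["return_value"])  # True
```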

##########
File path: airflow/providers/google/cloud/transfers/calendar_to_gcs.py
##########
@@ -0,0 +1,178 @@
+# Licensed to the Apache Software Foundation (ASF) under one
+# or more contributor license agreements.  See the NOTICE file
+# distributed with this work for additional information
+# regarding copyright ownership.  The ASF licenses this file
+# to you under the Apache License, Version 2.0 (the
+# "License"); you may not use this file except in compliance
+# with the License.  You may obtain a copy of the License at
+#
+#   http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing,
+# software distributed under the License is distributed on an
+# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+# KIND, either express or implied.  See the License for the
+# specific language governing permissions and limitations
+# under the License.
+
+import json
+from datetime import datetime
+from tempfile import NamedTemporaryFile
+from typing import Any, List, Optional, Sequence, Union
+
+from airflow.models import BaseOperator
+from airflow.providers.google.cloud.hooks.gcs import GCSHook
+from airflow.providers.google.suite.hooks.calendar import GoogleCalendarHook
+
+
+class GoogleCalendarToGCSOperator(BaseOperator):
+    """
+    Writes Google Calendar data into Google Cloud Storage.
+
+    .. seealso::
+        For more information on how to use this operator, take a look at the guide:
+        :ref:TODO `howto/operator:GoogleCalendarToGCSOperator`
+
+    :param calendar_id: The Google Calendar ID to interact with.
+    :type calendar_id: str
+    :param destination_bucket: The destination Google cloud storage bucket where the
+        report should be written to. (templated)
+    :type destination_bucket: str
+    :param destination_path: The Google cloud storage URI array for the object created by the operator.
+        For example: ``path/to/my/files``.
+    :type destination_path: str
+    :param gcp_conn_id: The connection ID to use when fetching connection info.
+    :type gcp_conn_id: str
+    :param delegate_to: The account to impersonate using domain-wide delegation of authority,
+        if any. For this to work, the service account making the request must have
+        domain-wide delegation enabled.
+    :type delegate_to: str
+    :param impersonation_chain: Optional service account to impersonate using short-term
+        credentials, or chained list of accounts required to get the access_token
+        of the last account in the list, which will be impersonated in the request.
+        If set as a string, the account must grant the originating account
+        the Service Account Token Creator IAM role.
+        If set as a sequence, the identities from the list must grant
+        Service Account Token Creator IAM role to the directly preceding identity, with first
+        account from the list granting this role to the originating account (templated).
+    :type impersonation_chain: Union[str, Sequence[str]]
+    """
+
+    template_fields = [
+        "calendar_id",
+        "destination_bucket",
+        "destination_path",
+        "impersonation_chain",
+    ]
+
+    def __init__(
+        self,
+        *,
+        destination_bucket: str,
+        calendar_id: str = "primary",
+        i_cal_uid: Optional[str] = None,
+        max_attendees: Optional[int] = None,
+        max_results: Optional[int] = None,
+        order_by: Optional[str] = None,
+        private_extended_property: Optional[str] = None,
+        q: Optional[str] = None,
+        shared_extended_property: Optional[str] = None,
+        show_deleted: Optional[bool] = False,
+        show_hidden_invitation: Optional[bool] = False,
+        single_events: Optional[bool] = False,
+        sync_token: Optional[str] = None,
+        time_max: Optional[datetime] = None,
+        time_min: Optional[datetime] = None,
+        time_zone: Optional[str] = None,
+        updated_min: Optional[datetime] = None,
+        destination_path: Optional[str] = None,
+        gcp_conn_id: str = "google_cloud_default",
+        delegate_to: Optional[str] = None,
+        impersonation_chain: Optional[Union[str, Sequence[str]]] = None,
+        **kwargs,
+    ) -> None:
+        super().__init__(**kwargs)
+        self.gcp_conn_id = gcp_conn_id
+        self.calendar_id = calendar_id
+        self.i_cal_uid = i_cal_uid
+        self.max_attendees = max_attendees
+        self.max_results = max_results
+        self.order_by = order_by
+        self.private_extended_property = private_extended_property
+        self.q = q
+        self.shared_extended_property = shared_extended_property
+        self.show_deleted = show_deleted
+        self.show_hidden_invitation = show_hidden_invitation
+        self.single_events = single_events
+        self.sync_token = sync_token
+        self.time_max = time_max
+        self.time_min = time_min
+        self.time_zone = time_zone
+        self.updated_min = updated_min
+        self.destination_bucket = destination_bucket
+        self.destination_path = destination_path
+        self.delegate_to = delegate_to
+        self.impersonation_chain = impersonation_chain
+
+    def _upload_data(
+        self,
+        gcs_hook: GCSHook,
+        hook: GoogleCalendarHook,
+        events: List[Any],
+    ) -> str:
+        # Construct destination file path
+        file_name = f"{self.calendar_id}.json".replace(" ", "_")
+        dest_file_name = (
+            f"{self.destination_path.strip('/')}/{file_name}" if self.destination_path else file_name
+        )
+
+        with NamedTemporaryFile("w+") as temp_file:
+            # Write data
+            json.dump(events, temp_file)
+            temp_file.flush()
+
+            # Upload to GCS
+            gcs_hook.upload(
+                bucket_name=self.destination_bucket,
+                object_name=dest_file_name,
+                filename=temp_file.name,
+            )
+        return dest_file_name
+
+    def execute(self, context):
+        calendar_hook = GoogleCalendarHook(
+            gcp_conn_id=self.gcp_conn_id,
+            delegate_to=self.delegate_to,
+            impersonation_chain=self.impersonation_chain,
+        )
+        gcs_hook = GCSHook(
+            gcp_conn_id=self.gcp_conn_id,
+            delegate_to=self.delegate_to,
+            impersonation_chain=self.impersonation_chain,
+        )

Review comment:
       This looks like it could be moved to the `_upload_data()` method where it is solely used rather than instantiating it in `execute()` and passing the object around (or making it an instance attribute).
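A hedged sketch of that refactor, using stand-in classes rather than the real Airflow hooks (`StubGCSHook` and `SketchOperator` are illustrative names, not part of the PR): the hook is constructed inside the method that consumes it instead of being passed around from `execute()`.

```python
class StubGCSHook:
    """Stand-in for GCSHook; records uploads instead of calling GCS."""

    def __init__(self, gcp_conn_id):
        self.gcp_conn_id = gcp_conn_id
        self.uploads = []

    def upload(self, bucket_name, object_name, filename):
        self.uploads.append((bucket_name, object_name))


class SketchOperator:
    def __init__(self, destination_bucket, gcp_conn_id="google_cloud_default"):
        self.destination_bucket = destination_bucket
        self.gcp_conn_id = gcp_conn_id

    def _upload_data(self, events):
        # Hook created where it is used -- no need to build it in execute()
        # and thread it through as a parameter.
        gcs_hook = StubGCSHook(gcp_conn_id=self.gcp_conn_id)
        self._last_hook = gcs_hook
        gcs_hook.upload(self.destination_bucket, "primary.json", "/tmp/events.json")
        return "primary.json"


op = SketchOperator(destination_bucket="my-bucket")
print(op._upload_data(events=[]))  # primary.json
```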

##########
File path: airflow/providers/google/cloud/transfers/calendar_to_gcs.py
##########
@@ -0,0 +1,178 @@
+# Licensed to the Apache Software Foundation (ASF) under one
+# or more contributor license agreements.  See the NOTICE file
+# distributed with this work for additional information
+# regarding copyright ownership.  The ASF licenses this file
+# to you under the Apache License, Version 2.0 (the
+# "License"); you may not use this file except in compliance
+# with the License.  You may obtain a copy of the License at
+#
+#   http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing,
+# software distributed under the License is distributed on an
+# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+# KIND, either express or implied.  See the License for the
+# specific language governing permissions and limitations
+# under the License.
+
+import json
+from datetime import datetime
+from tempfile import NamedTemporaryFile
+from typing import Any, List, Optional, Sequence, Union
+
+from airflow.models import BaseOperator
+from airflow.providers.google.cloud.hooks.gcs import GCSHook
+from airflow.providers.google.suite.hooks.calendar import GoogleCalendarHook
+
+
+class GoogleCalendarToGCSOperator(BaseOperator):
+    """
+    Writes Google Calendar data into Google Cloud Storage.
+
+    .. seealso::
+        For more information on how to use this operator, take a look at the guide:
+        :ref:TODO `howto/operator:GoogleCalendarToGCSOperator`
+
+    :param calendar_id: The Google Calendar ID to interact with.
+    :type calendar_id: str
+    :param destination_bucket: The destination Google cloud storage bucket where the
+        report should be written to. (templated)
+    :type destination_bucket: str
+    :param destination_path: The Google cloud storage URI array for the object created by the operator.
+        For example: ``path/to/my/files``.
+    :type destination_path: str
+    :param gcp_conn_id: The connection ID to use when fetching connection info.
+    :type gcp_conn_id: str
+    :param delegate_to: The account to impersonate using domain-wide delegation of authority,
+        if any. For this to work, the service account making the request must have
+        domain-wide delegation enabled.
+    :type delegate_to: str
+    :param impersonation_chain: Optional service account to impersonate using short-term
+        credentials, or chained list of accounts required to get the access_token
+        of the last account in the list, which will be impersonated in the request.
+        If set as a string, the account must grant the originating account
+        the Service Account Token Creator IAM role.
+        If set as a sequence, the identities from the list must grant
+        Service Account Token Creator IAM role to the directly preceding identity, with first
+        account from the list granting this role to the originating account (templated).
+    :type impersonation_chain: Union[str, Sequence[str]]
+    """
+
+    template_fields = [
+        "calendar_id",
+        "destination_bucket",
+        "destination_path",
+        "impersonation_chain",
+    ]
+
+    def __init__(
+        self,
+        *,
+        destination_bucket: str,
+        calendar_id: str = "primary",
+        i_cal_uid: Optional[str] = None,
+        max_attendees: Optional[int] = None,
+        max_results: Optional[int] = None,
+        order_by: Optional[str] = None,
+        private_extended_property: Optional[str] = None,
+        q: Optional[str] = None,
+        shared_extended_property: Optional[str] = None,
+        show_deleted: Optional[bool] = False,
+        show_hidden_invitation: Optional[bool] = False,
+        single_events: Optional[bool] = False,
+        sync_token: Optional[str] = None,
+        time_max: Optional[datetime] = None,
+        time_min: Optional[datetime] = None,
+        time_zone: Optional[str] = None,
+        updated_min: Optional[datetime] = None,
+        destination_path: Optional[str] = None,
+        gcp_conn_id: str = "google_cloud_default",
+        delegate_to: Optional[str] = None,
+        impersonation_chain: Optional[Union[str, Sequence[str]]] = None,
+        **kwargs,
+    ) -> None:
+        super().__init__(**kwargs)
+        self.gcp_conn_id = gcp_conn_id
+        self.calendar_id = calendar_id
+        self.i_cal_uid = i_cal_uid
+        self.max_attendees = max_attendees
+        self.max_results = max_results
+        self.order_by = order_by
+        self.private_extended_property = private_extended_property
+        self.q = q
+        self.shared_extended_property = shared_extended_property
+        self.show_deleted = show_deleted
+        self.show_hidden_invitation = show_hidden_invitation
+        self.single_events = single_events
+        self.sync_token = sync_token
+        self.time_max = time_max
+        self.time_min = time_min
+        self.time_zone = time_zone
+        self.updated_min = updated_min
+        self.destination_bucket = destination_bucket
+        self.destination_path = destination_path
+        self.delegate_to = delegate_to
+        self.impersonation_chain = impersonation_chain
+
+    def _upload_data(
+        self,
+        gcs_hook: GCSHook,
+        hook: GoogleCalendarHook,

Review comment:
       This looks to be unused in the method.

##########
File path: airflow/providers/google/cloud/transfers/calendar_to_gcs.py
##########
@@ -0,0 +1,178 @@
+# Licensed to the Apache Software Foundation (ASF) under one
+# or more contributor license agreements.  See the NOTICE file
+# distributed with this work for additional information
+# regarding copyright ownership.  The ASF licenses this file
+# to you under the Apache License, Version 2.0 (the
+# "License"); you may not use this file except in compliance
+# with the License.  You may obtain a copy of the License at
+#
+#   http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing,
+# software distributed under the License is distributed on an
+# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+# KIND, either express or implied.  See the License for the
+# specific language governing permissions and limitations
+# under the License.
+
+import json
+from datetime import datetime
+from tempfile import NamedTemporaryFile
+from typing import Any, List, Optional, Sequence, Union
+
+from airflow.models import BaseOperator
+from airflow.providers.google.cloud.hooks.gcs import GCSHook
+from airflow.providers.google.suite.hooks.calendar import GoogleCalendarHook
+
+
+class GoogleCalendarToGCSOperator(BaseOperator):
+    """
+    Writes Google Calendar data into Google Cloud Storage.
+
+    .. seealso::
+        For more information on how to use this operator, take a look at the guide:
+        :ref:TODO `howto/operator:GoogleCalendarToGCSOperator`

Review comment:
       Is this `TODO` still needed? The associated doc looks relatively complete.
   
   You can always mark PRs as a draft first if you still have some WIP actions you'd like to do before you'd like a more detailed review.
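Once the guide is in place, the ``seealso`` block would presumably read (standard Sphinx cross-reference syntax, with the ``TODO`` dropped):

```rst
.. seealso::
    For more information on how to use this operator, take a look at the guide:
    :ref:`howto/operator:GoogleCalendarToGCSOperator`
```

The anchor matches the ``.. _howto/operator:GoogleCalendarToGCSOperator:`` label already defined in the docs page added by this PR.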

##########
File path: airflow/providers/google/cloud/transfers/calendar_to_gcs.py
##########
@@ -0,0 +1,178 @@
+# Licensed to the Apache Software Foundation (ASF) under one
+# or more contributor license agreements.  See the NOTICE file
+# distributed with this work for additional information
+# regarding copyright ownership.  The ASF licenses this file
+# to you under the Apache License, Version 2.0 (the
+# "License"); you may not use this file except in compliance
+# with the License.  You may obtain a copy of the License at
+#
+#   http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing,
+# software distributed under the License is distributed on an
+# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+# KIND, either express or implied.  See the License for the
+# specific language governing permissions and limitations
+# under the License.
+
+import json
+from datetime import datetime
+from tempfile import NamedTemporaryFile
+from typing import Any, List, Optional, Sequence, Union
+
+from airflow.models import BaseOperator
+from airflow.providers.google.cloud.hooks.gcs import GCSHook
+from airflow.providers.google.suite.hooks.calendar import GoogleCalendarHook
+
+
+class GoogleCalendarToGCSOperator(BaseOperator):
+    """
+    Writes Google Calendar data into Google Cloud Storage.
+
+    .. seealso::
+        For more information on how to use this operator, take a look at the guide:
+        :ref:TODO `howto/operator:GoogleCalendarToGCSOperator`
+
+    :param calendar_id: The Google Calendar ID to interact with.
+    :type calendar_id: str
+    :param destination_bucket: The destination Google cloud storage bucket where the
+        report should be written to. (templated)
+    :type destination_bucket: str
+    :param destination_path: The Google cloud storage URI array for the object created by the operator.
+        For example: ``path/to/my/files``.

Review comment:
       ```suggestion
       :param destination_bucket: The destination Google Cloud Storage bucket where the
           report should be written to. (templated)
       :type destination_bucket: str
       :param destination_path: The Google Cloud Storage URI array for the object created by the operator.
           For example: ``path/to/my/files``.
   ```
   Small nit for consistency.

##########
File path: airflow/providers/google/cloud/transfers/calendar_to_gcs.py
##########
@@ -0,0 +1,178 @@
+# Licensed to the Apache Software Foundation (ASF) under one
+# or more contributor license agreements.  See the NOTICE file
+# distributed with this work for additional information
+# regarding copyright ownership.  The ASF licenses this file
+# to you under the Apache License, Version 2.0 (the
+# "License"); you may not use this file except in compliance
+# with the License.  You may obtain a copy of the License at
+#
+#   http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing,
+# software distributed under the License is distributed on an
+# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+# KIND, either express or implied.  See the License for the
+# specific language governing permissions and limitations
+# under the License.
+
+import json
+from datetime import datetime
+from tempfile import NamedTemporaryFile
+from typing import Any, List, Optional, Sequence, Union
+
+from airflow.models import BaseOperator
+from airflow.providers.google.cloud.hooks.gcs import GCSHook
+from airflow.providers.google.suite.hooks.calendar import GoogleCalendarHook
+
+
+class GoogleCalendarToGCSOperator(BaseOperator):
+    """
+    Writes Google Calendar data into Google Cloud Storage.
+
+    .. seealso::
+        For more information on how to use this operator, take a look at the guide:
+        :ref:TODO `howto/operator:GoogleCalendarToGCSOperator`
+
+    :param calendar_id: The Google Calendar ID to interact with.
+    :type calendar_id: str
+    :param destination_bucket: The destination Google cloud storage bucket where the
+        report should be written to. (templated)
+    :type destination_bucket: str
+    :param destination_path: The Google cloud storage URI array for the object created by the operator.
+        For example: ``path/to/my/files``.
+    :type destination_path: str
+    :param gcp_conn_id: The connection ID to use when fetching connection info.
+    :type gcp_conn_id: str
+    :param delegate_to: The account to impersonate using domain-wide delegation of authority,
+        if any. For this to work, the service account making the request must have
+        domain-wide delegation enabled.
+    :type delegate_to: str
+    :param impersonation_chain: Optional service account to impersonate using short-term
+        credentials, or chained list of accounts required to get the access_token
+        of the last account in the list, which will be impersonated in the request.
+        If set as a string, the account must grant the originating account
+        the Service Account Token Creator IAM role.
+        If set as a sequence, the identities from the list must grant
+        Service Account Token Creator IAM role to the directly preceding identity, with first
+        account from the list granting this role to the originating account (templated).
+    :type impersonation_chain: Union[str, Sequence[str]]
+    """
+
+    template_fields = [
+        "calendar_id",
+        "destination_bucket",
+        "destination_path",
+        "impersonation_chain",
+    ]
+
+    def __init__(
+        self,
+        *,
+        destination_bucket: str,
+        calendar_id: str = "primary",
+        i_cal_uid: Optional[str] = None,
+        max_attendees: Optional[int] = None,
+        max_results: Optional[int] = None,
+        order_by: Optional[str] = None,
+        private_extended_property: Optional[str] = None,
+        q: Optional[str] = None,
+        shared_extended_property: Optional[str] = None,
+        show_deleted: Optional[bool] = False,
+        show_hidden_invitation: Optional[bool] = False,
+        single_events: Optional[bool] = False,
+        sync_token: Optional[str] = None,
+        time_max: Optional[datetime] = None,
+        time_min: Optional[datetime] = None,
+        time_zone: Optional[str] = None,
+        updated_min: Optional[datetime] = None,
+        destination_path: Optional[str] = None,
+        gcp_conn_id: str = "google_cloud_default",
+        delegate_to: Optional[str] = None,
+        impersonation_chain: Optional[Union[str, Sequence[str]]] = None,
+        **kwargs,
+    ) -> None:
+        super().__init__(**kwargs)
+        self.gcp_conn_id = gcp_conn_id
+        self.calendar_id = calendar_id
+        self.i_cal_uid = i_cal_uid
+        self.max_attendees = max_attendees
+        self.max_results = max_results
+        self.order_by = order_by
+        self.private_extended_property = private_extended_property
+        self.q = q
+        self.shared_extended_property = shared_extended_property
+        self.show_deleted = show_deleted
+        self.show_hidden_invitation = show_hidden_invitation
+        self.single_events = single_events
+        self.sync_token = sync_token
+        self.time_max = time_max
+        self.time_min = time_min
+        self.time_zone = time_zone
+        self.updated_min = updated_min
+        self.destination_bucket = destination_bucket
+        self.destination_path = destination_path
+        self.delegate_to = delegate_to
+        self.impersonation_chain = impersonation_chain
+
+    def _upload_data(
+        self,
+        gcs_hook: GCSHook,
+        hook: GoogleCalendarHook,
+        events: List[Any],
+    ) -> str:
+        # Construct destination file path
+        file_name = f"{self.calendar_id}.json".replace(" ", "_")
+        dest_file_name = (
+            f"{self.destination_path.strip('/')}/{file_name}" if self.destination_path else file_name
+        )
+
+        with NamedTemporaryFile("w+") as temp_file:
+            # Write data
+            json.dump(events, temp_file)
+            temp_file.flush()
+
+            # Upload to GCS
+            gcs_hook.upload(
+                bucket_name=self.destination_bucket,
+                object_name=dest_file_name,
+                filename=temp_file.name,
+            )
+        return dest_file_name
+
+    def execute(self, context):
+        calendar_hook = GoogleCalendarHook(
+            gcp_conn_id=self.gcp_conn_id,
+            delegate_to=self.delegate_to,
+            impersonation_chain=self.impersonation_chain,
+        )
+        gcs_hook = GCSHook(
+            gcp_conn_id=self.gcp_conn_id,
+            delegate_to=self.delegate_to,
+            impersonation_chain=self.impersonation_chain,
+        )
+
+        # Pull data and upload
+        destination_array: List[str] = []

Review comment:
       Does this need to be a list? The `_upload_data()` method only returns a single file name and the method isn't called multiple times when the operator is executed.
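Since `_upload_data()` runs once per execution, `execute()` could simply return the single string it produces. The file-name logic it wraps is plain Python and easy to exercise in isolation (the calendar IDs and path below are illustrative values only):

```python
def dest_file_name(calendar_id, destination_path=None):
    # Mirrors the path construction in the operator's _upload_data().
    file_name = f"{calendar_id}.json".replace(" ", "_")
    if destination_path:
        return f"{destination_path.strip('/')}/{file_name}"
    return file_name

print(dest_file_name("primary"))                      # primary.json
print(dest_file_name("My Calendar", "/path/to/my/"))  # path/to/my/My_Calendar.json
```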

##########
File path: docs/apache-airflow-providers-google/operators/transfer/calendar_to_gcs.rst
##########
@@ -0,0 +1,47 @@
+ .. Licensed to the Apache Software Foundation (ASF) under one
+    or more contributor license agreements.  See the NOTICE file
+    distributed with this work for additional information
+    regarding copyright ownership.  The ASF licenses this file
+    to you under the Apache License, Version 2.0 (the
+    "License"); you may not use this file except in compliance
+    with the License.  You may obtain a copy of the License at
+
+ ..   http://www.apache.org/licenses/LICENSE-2.0
+
+ .. Unless required by applicable law or agreed to in writing,
+    software distributed under the License is distributed on an
+    "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+    KIND, either express or implied.  See the License for the
+    specific language governing permissions and limitations
+    under the License.
+
+Google Calendar to Google Cloud Storage Transfer Operators
+========================================================
+
+Google has a service `Google Cloud Storage <https://cloud.google.com/storage/>`__. This service is
+used to store large data from various applications.
+
+With `Google Calendar <https://www.google.com/calendar/about/>`__, you can quickly schedule
+meetings and events and get reminders about upcoming activities, so you always know what's next.
+
+Prerequisite Tasks
+^^^^^^^^^^^^^^^^^^
+
+.. include::/operators/_partials/prerequisite_tasks.rst
+
+.. _howto/operator:GoogleCalendarToGCSOperator:
+
+Upload data from Google Calendar to GCS
+^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

Review comment:
       ```suggestion
   Upload data from Google Calendar to GCS
   ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
   ```




-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: commits-unsubscribe@airflow.apache.org

For queries about this service, please contact Infrastructure at:
users@infra.apache.org



[GitHub] [airflow] rsg17 commented on pull request #20769: [wip] [Part 2] Calendar to GCS Operator

Posted by GitBox <gi...@apache.org>.
rsg17 commented on pull request #20769:
URL: https://github.com/apache/airflow/pull/20769#issuecomment-1008241192


   I would like some help on how to run the system tests. I have created the example DAG and system test file, but I am not sure how to test it.
   
   I read through this [guide](https://github.com/apache/airflow/blob/main/TESTING.rst#airflow-system-tests). I think I need to make some changes to the variables.env file, but I am not sure what those changes should be.
   
   Also, I am not sure about the --forward-credentials option for [running system tests](https://github.com/apache/airflow/blob/main/TESTING.rst#the-typical-system-test-session). Which credentials will it forward? Do I need to put the credentials somewhere?





[GitHub] [airflow] rsg17 commented on pull request #20769: [wip] [Part 2] Calendar to GCS Operator

Posted by GitBox <gi...@apache.org>.
rsg17 commented on pull request #20769:
URL: https://github.com/apache/airflow/pull/20769#issuecomment-1026523383


   @turbaszek 
   I was able to resolve the issues with the system tests and could verify that the events I created on a test calendar were available as a JSON file on GCS.
   <img width="1440" alt="Screen Shot 2022-01-31 at 10 32 38 PM" src="https://user-images.githubusercontent.com/5105282/151923765-90a2b262-4704-46ad-9885-f6de5d05ddcc.png">
   
   





[GitHub] [airflow] github-actions[bot] commented on pull request #20769: [Part 2] Calendar to GCS Operator

Posted by GitBox <gi...@apache.org>.
github-actions[bot] commented on pull request #20769:
URL: https://github.com/apache/airflow/pull/20769#issuecomment-1040765170


   The PR is likely OK to be merged with just subset of tests for default Python and Database versions without running the full matrix of tests, because it does not modify the core of Airflow. If the committers decide that the full tests matrix is needed, they will add the label 'full tests needed'. Then you should rebase to the latest main or amend the last commit of the PR, and push it with --force-with-lease.





[GitHub] [airflow] rsg17 commented on a change in pull request #20769: [wip] [Part 2] Calendar to GCS Operator

Posted by GitBox <gi...@apache.org>.
rsg17 commented on a change in pull request #20769:
URL: https://github.com/apache/airflow/pull/20769#discussion_r790442587



##########
File path: airflow/providers/google/cloud/transfers/calendar_to_gcs.py
##########
@@ -0,0 +1,215 @@
+# Licensed to the Apache Software Foundation (ASF) under one
+# or more contributor license agreements.  See the NOTICE file
+# distributed with this work for additional information
+# regarding copyright ownership.  The ASF licenses this file
+# to you under the Apache License, Version 2.0 (the
+# "License"); you may not use this file except in compliance
+# with the License.  You may obtain a copy of the License at
+#
+#   http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing,
+# software distributed under the License is distributed on an
+# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+# KIND, either express or implied.  See the License for the
+# specific language governing permissions and limitations
+# under the License.
+
+import json
+from datetime import datetime
+from tempfile import NamedTemporaryFile
+from typing import Any, List, Optional, Sequence, Union
+
+from airflow.models import BaseOperator
+from airflow.providers.google.cloud.hooks.gcs import GCSHook
+from airflow.providers.google.suite.hooks.calendar import GoogleCalendarHook
+
+
+class GoogleCalendarToGCSOperator(BaseOperator):
+    """
+    Writes Google Calendar data into Google Cloud Storage.
+
+    .. seealso::
+        For more information on how to use this operator, take a look at the guide:
+        :ref: `howto/operator:GoogleCalendarToGCSOperator`

Review comment:
       Thank you!
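
The imports shown in the diff above (`json`, `NamedTemporaryFile`, `GCSHook`) suggest the usual transfer pattern: fetch the events, serialize them to a temporary JSON file, then upload that file to GCS. A stdlib-only sketch of the serialization step follows; the function name and sample event are illustrative, not the operator's actual code:

```python
import json
from tempfile import NamedTemporaryFile

def events_to_temp_json(events):
    """Serialize a list of calendar events to a temporary JSON file.

    Returns the path of the file, which a GCS hook could then upload.
    """
    with NamedTemporaryFile("w", suffix=".json", delete=False) as tmp:
        json.dump(events, tmp)
        return tmp.name

# Illustrative event payload; real Calendar API events carry many more fields.
path = events_to_temp_json([{"id": "abc123", "summary": "standup"}])
with open(path) as f:
    print(json.load(f)[0]["summary"])  # -> standup
```

Writing to a named temporary file (rather than streaming) matches how most Google transfer operators hand a local path to `GCSHook.upload`.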







[GitHub] [airflow] rsg17 commented on pull request #20769: [wip] [Part 2] Calendar to GCS Operator

Posted by GitBox <gi...@apache.org>.
rsg17 commented on pull request #20769:
URL: https://github.com/apache/airflow/pull/20769#issuecomment-1008340696


   Thank you!





[GitHub] [airflow] turbaszek edited a comment on pull request #20769: [wip] [Part 2] Calendar to GCS Operator

Posted by GitBox <gi...@apache.org>.
turbaszek edited a comment on pull request #20769:
URL: https://github.com/apache/airflow/pull/20769#issuecomment-1008326166


   > 
   > I read through this [guide](https://github.com/apache/airflow/blob/main/TESTING.rst#airflow-system-tests). I think I need to make some changes to the variables.env file, but I am not sure what those changes should be.
   
   From your example DAG it looks like you need to set up two variables:
   - GCP_GCS_BUCKET
   - CALENDAR_ID
   
   Apart from that you will need to add a credentials key because you are using it in test:
   ```py
   @pytest.mark.credential_file(GCP_GCS_KEY)
   ```
   The file should be named `gcp_gcs.json` and placed according to the guide you linked:
   
   > If your system tests need some credential files to be available for an authentication with external systems, make sure to keep these credentials in the files/airflow-breeze-config/keys directory. Mark your tests with @pytest.mark.credential_file(<FILE>) so that they are skipped if such a credential file is not there. The tests should read the right credentials and authenticate them on their own. The credentials are read in Breeze from the /files directory. The local "files" folder is mounted to the "/files" folder in Breeze.
   
   
   Regarding this question:
   
   > Also, I am not sure about the --forward-credentials option for [running system tests](https://github.com/apache/airflow/blob/main/TESTING.rst#the-typical-system-test-session). Which credentials will it forward? Do I need to put the credentials somewhere?
   
   You need to use credentials to some GCP project. As far as I know we don't have airflow project that can be used for this purpose (cc @potiuk to confirm).
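
For reference, the corresponding variables.env entries could look like the fragment below. The values are placeholders, and the exact file format is an assumption based on the Breeze guide linked above (the file is sourced as shell):

```shell
# files/airflow-breeze-config/variables.env (placeholder values)
export GCP_GCS_BUCKET="my-test-bucket"
export CALENDAR_ID="primary"
```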
   
   





[GitHub] [airflow] josh-fell commented on a change in pull request #20769: [wip] [Part 2] Calendar to GCS Operator

Posted by GitBox <gi...@apache.org>.
josh-fell commented on a change in pull request #20769:
URL: https://github.com/apache/airflow/pull/20769#discussion_r790376898



##########
File path: airflow/providers/google/cloud/transfers/calendar_to_gcs.py
##########
@@ -0,0 +1,215 @@
+# Licensed to the Apache Software Foundation (ASF) under one
+# or more contributor license agreements.  See the NOTICE file
+# distributed with this work for additional information
+# regarding copyright ownership.  The ASF licenses this file
+# to you under the Apache License, Version 2.0 (the
+# "License"); you may not use this file except in compliance
+# with the License.  You may obtain a copy of the License at
+#
+#   http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing,
+# software distributed under the License is distributed on an
+# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+# KIND, either express or implied.  See the License for the
+# specific language governing permissions and limitations
+# under the License.
+
+import json
+from datetime import datetime
+from tempfile import NamedTemporaryFile
+from typing import Any, List, Optional, Sequence, Union
+
+from airflow.models import BaseOperator
+from airflow.providers.google.cloud.hooks.gcs import GCSHook
+from airflow.providers.google.suite.hooks.calendar import GoogleCalendarHook
+
+
+class GoogleCalendarToGCSOperator(BaseOperator):
+    """
+    Writes Google Calendar data into Google Cloud Storage.
+
+    .. seealso::
+        For more information on how to use this operator, take a look at the guide:
+        :ref: `howto/operator:GoogleCalendarToGCSOperator`

Review comment:
       ```suggestion
           :ref:`howto/operator:GoogleCalendarToGCSOperator`
   ```
   
   This should help with the Build Docs errors.

##########
File path: airflow/providers/google/cloud/transfers/calendar_to_gcs.py
##########
@@ -0,0 +1,178 @@
+# Licensed to the Apache Software Foundation (ASF) under one
+# or more contributor license agreements.  See the NOTICE file
+# distributed with this work for additional information
+# regarding copyright ownership.  The ASF licenses this file
+# to you under the Apache License, Version 2.0 (the
+# "License"); you may not use this file except in compliance
+# with the License.  You may obtain a copy of the License at
+#
+#   http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing,
+# software distributed under the License is distributed on an
+# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+# KIND, either express or implied.  See the License for the
+# specific language governing permissions and limitations
+# under the License.
+
+import json
+from datetime import datetime
+from tempfile import NamedTemporaryFile
+from typing import Any, List, Optional, Sequence, Union
+
+from airflow.models import BaseOperator
+from airflow.providers.google.cloud.hooks.gcs import GCSHook
+from airflow.providers.google.suite.hooks.calendar import GoogleCalendarHook
+
+
+class GoogleCalendarToGCSOperator(BaseOperator):

Review comment:
       Based on the newly merged #20951, can you remove the `:type:` Sphinx directives from the operator docstring? The API documentation is able to infer from the type annotations of parameters now rather than relying on the docstring.







[GitHub] [airflow] turbaszek commented on pull request #20769: [wip] [Part 2] Calendar to GCS Operator

Posted by GitBox <gi...@apache.org>.
turbaszek commented on pull request #20769:
URL: https://github.com/apache/airflow/pull/20769#issuecomment-1008326166


   > 
   > I read through this [guide](https://github.com/apache/airflow/blob/main/TESTING.rst#airflow-system-tests). I think I need to make some changes to the variables.env file, but I am not sure what those changes should be.
   
   From your example DAG it looks like you need to set up two variables:
   - GCP_GCS_BUCKET
   - CALENDAR_ID
   
   Apart from that you will need to add a credentials key because you are using it in test:
   ```py
   @pytest.mark.credential_file(GCP_GCS_KEY)
   ```
   The file should be named `gcp_gcs.json` and placed according to the guide you linked:
   
   > If your system tests need some credential files to be available for an authentication with external systems, make sure to keep these credentials in the files/airflow-breeze-config/keys directory. Mark your tests with @pytest.mark.credential_file(<FILE>) so that they are skipped if such a credential file is not there. The tests should read the right credentials and authenticate them on their own. The credentials are read in Breeze from the /files directory. The local "files" folder is mounted to the "/files" folder in Breeze.
   
   
   > Also, I am not sure about the --forward-credentials option for [running system tests](https://github.com/apache/airflow/blob/main/TESTING.rst#the-typical-system-test-session). Which credentials will it forward? Do I need to put the credentials somewhere?
   
   You need to use credentials to some GCP project. As far as I know we don't have airflow project that can be used for this purpose (cc @potiuk to confirm).
   
   





[GitHub] [airflow] rsg17 edited a comment on pull request #20769: [wip] [Part 2] Calendar to GCS Operator

Posted by GitBox <gi...@apache.org>.
rsg17 edited a comment on pull request #20769:
URL: https://github.com/apache/airflow/pull/20769#issuecomment-1013972440


   > > I read through this [guide](https://github.com/apache/airflow/blob/main/TESTING.rst#airflow-system-tests). I think I need to make some changes to the variables.env file, but I am not sure what those changes should be.
   > 
   > From your example DAG it looks like you need to set up two variables:
   > 
   > * GCP_GCS_BUCKET
   > * CALENDAR_ID
   > 
   > Apart from that you will need to add a credentials key because you are using it in test:
   > 
   > ```python
   > @pytest.mark.credential_file(GCP_GCS_KEY)
   > ```
   > 
   > The file should be named `gcp_gcs.json` and put according to the guide you linked:
   > 
   > > If your system tests need some credential files to be available for an authentication with external systems, make sure to keep these credentials in the files/airflow-breeze-config/keys directory. Mark your tests with @pytest.mark.credential_file(<FILE>) so that they are skipped if such a credential file is not there. The tests should read the right credentials and authenticate them on their own. The credentials are read in Breeze from the /files directory. The local "files" folder is mounted to the "/files" folder in Breeze.
   > 
   > Regarding this question:
   > 
   > > Also, I am not sure about the --forward-credentials option for [running system tests](https://github.com/apache/airflow/blob/main/TESTING.rst#the-typical-system-test-session). Which credentials will it forward? Do I need to put the credentials somewhere?
   > 
   > You need to use credentials to some GCP project. As far as I know we don't have airflow project that can be used for this purpose (cc @potiuk to confirm).
   
   Thank you for all your help @turbaszek. 
   I am running into an authentication error through breeze. I don't see this error when I try from my laptop (not through breeze). Could you help take a look?
   
   This gives the error location:
   ```
   ___________________________________ GoogleCalendarToGCSExampleDagsSystemTest.test_run_example_dag_function ____________________________________
   /usr/local/lib/python3.7/contextlib.py:73: in inner
       with self._recreate_cm():
   /usr/local/lib/python3.7/contextlib.py:112: in __enter__
       return next(self.gen)
   tests/test_utils/gcp_system_helpers.py:108: in provide_gcp_context
       f"--key-file={key_file_path}",
   tests/test_utils/logging_command_executor.py:92: in execute_cmd
       env=env,
   /usr/local/lib/python3.7/subprocess.py:800: in __init__
       restore_signals, start_new_session)
   _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

   self = <subprocess.Popen object at 0x4065c8b250>
   args = ['gcloud', 'auth', 'activate-service-account', '--key-file=/files/airflow-breeze-config/keys/gcp_gcs.json'], executable = b'gcloud'
   preexec_fn = None, close_fds = True, pass_fds = (), cwd = None, env = None, startupinfo = None, creationflags = 0, shell = False, p2cread = -1
   p2cwrite = -1, c2pread = 18, c2pwrite = 19, errread = 20, errwrite = 21, restore_signals = True, start_new_session = False
   ```
   
   Here is the error:
   ```
   E               FileNotFoundError: [Errno 2] No such file or directory: 'gcloud': 'gcloud'
   ```
   
   I tried running `gcloud auth activate-service-account` with the same key and same service account from my laptop - that worked. I was also able to upload a file to GCS using the same key: https://gist.github.com/rsg17/b582414f630c2ac38323b87213b270d5
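
Since `subprocess` raises `FileNotFoundError` when the executable itself is absent, the traceback above indicates the `gcloud` binary is simply not on `PATH` inside the Breeze container, not an authentication failure. A quick shell sketch to confirm this before running the system test (the remedy depends on the environment):

```shell
# Check whether the Google Cloud SDK is available where the test runs;
# subprocess raises FileNotFoundError when the binary cannot be found.
if command -v gcloud >/dev/null 2>&1; then
  echo "gcloud found: $(command -v gcloud)"
else
  echo "gcloud missing - install the Google Cloud SDK in this environment"
fi
```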





[GitHub] [airflow] rsg17 commented on a change in pull request #20769: [wip] [Part 2] Calendar to GCS Operator

Posted by GitBox <gi...@apache.org>.
rsg17 commented on a change in pull request #20769:
URL: https://github.com/apache/airflow/pull/20769#discussion_r784364041



##########
File path: airflow/providers/google/cloud/transfers/calendar_to_gcs.py
##########
@@ -0,0 +1,178 @@
+# Licensed to the Apache Software Foundation (ASF) under one
+# or more contributor license agreements.  See the NOTICE file
+# distributed with this work for additional information
+# regarding copyright ownership.  The ASF licenses this file
+# to you under the Apache License, Version 2.0 (the
+# "License"); you may not use this file except in compliance
+# with the License.  You may obtain a copy of the License at
+#
+#   http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing,
+# software distributed under the License is distributed on an
+# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+# KIND, either express or implied.  See the License for the
+# specific language governing permissions and limitations
+# under the License.
+
+import json
+from datetime import datetime
+from tempfile import NamedTemporaryFile
+from typing import Any, List, Optional, Sequence, Union
+
+from airflow.models import BaseOperator
+from airflow.providers.google.cloud.hooks.gcs import GCSHook
+from airflow.providers.google.suite.hooks.calendar import GoogleCalendarHook
+
+
+class GoogleCalendarToGCSOperator(BaseOperator):
+    """
+    Writes Google Calendar data into Google Cloud Storage.
+
+    .. seealso::
+        For more information on how to use this operator, take a look at the guide:
+        :ref:TODO `howto/operator:GoogleCalendarToGCSOperator`
+
+    :param calendar_id: The Google Calendar ID to interact with.
+    :type calendar_id: str
+    :param destination_bucket: The destination Google cloud storage bucket where the
+        report should be written to. (templated)
+    :type destination_bucket: str
+    :param destination_path: The Google cloud storage URI array for the object created by the operator.
+        For example: ``path/to/my/files``.
+    :type destination_path: str
+    :param gcp_conn_id: The connection ID to use when fetching connection info.
+    :type gcp_conn_id: str
+    :param delegate_to: The account to impersonate using domain-wide delegation of authority,
+        if any. For this to work, the service account making the request must have
+        domain-wide delegation enabled.
+    :type delegate_to: str
+    :param impersonation_chain: Optional service account to impersonate using short-term
+        credentials, or chained list of accounts required to get the access_token
+        of the last account in the list, which will be impersonated in the request.
+        If set as a string, the account must grant the originating account
+        the Service Account Token Creator IAM role.
+        If set as a sequence, the identities from the list must grant
+        Service Account Token Creator IAM role to the directly preceding identity, with first
+        account from the list granting this role to the originating account (templated).
+    :type impersonation_chain: Union[str, Sequence[str]]
+    """
+
+    template_fields = [
+        "calendar_id",
+        "destination_bucket",
+        "destination_path",
+        "impersonation_chain",
+    ]
+
+    def __init__(
+        self,
+        *,
+        destination_bucket: str,
+        calendar_id: str = "primary",
+        i_cal_uid: Optional[str] = None,
+        max_attendees: Optional[int] = None,
+        max_results: Optional[int] = None,
+        order_by: Optional[str] = None,
+        private_extended_property: Optional[str] = None,
+        q: Optional[str] = None,
+        shared_extended_property: Optional[str] = None,
+        show_deleted: Optional[bool] = False,
+        show_hidden_invitation: Optional[bool] = False,

Review comment:
       The parameter is Optional and the default value is False.
   
   I think either suggestion (making the default `None`, or dropping `Optional`) will work. I will test it out.
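
The distinction under discussion can be illustrated in isolation. This is a minimal sketch with hypothetical function names, not the operator's actual signature:

```python
from typing import Optional

# Optional[bool] = False mixes two conventions: Optional signals that None
# is a meaningful value, while False is already a concrete default.
def fetch_ambiguous(show_deleted: Optional[bool] = False) -> bool:
    return bool(show_deleted)

# Alternative 1: a plain bool with a concrete default.
def fetch_plain(show_deleted: bool = False) -> bool:
    return show_deleted

# Alternative 2: Optional[bool] = None, deferring the default to the API.
def fetch_deferred(show_deleted: Optional[bool] = None) -> bool:
    return False if show_deleted is None else show_deleted

print(fetch_plain())         # -> False
print(fetch_deferred(True))  # -> True
```

Deferring to `None` is often preferred for API-backed parameters, since it lets the remote service apply its own default instead of always sending an explicit value.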







[GitHub] [airflow] rsg17 commented on a change in pull request #20769: [wip] [Part 2] Calendar to GCS Operator

Posted by GitBox <gi...@apache.org>.
rsg17 commented on a change in pull request #20769:
URL: https://github.com/apache/airflow/pull/20769#discussion_r784366033



##########
File path: airflow/providers/google/cloud/transfers/calendar_to_gcs.py
##########
@@ -0,0 +1,178 @@
+# Licensed to the Apache Software Foundation (ASF) under one
+# or more contributor license agreements.  See the NOTICE file
+# distributed with this work for additional information
+# regarding copyright ownership.  The ASF licenses this file
+# to you under the Apache License, Version 2.0 (the
+# "License"); you may not use this file except in compliance
+# with the License.  You may obtain a copy of the License at
+#
+#   http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing,
+# software distributed under the License is distributed on an
+# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+# KIND, either express or implied.  See the License for the
+# specific language governing permissions and limitations
+# under the License.
+
+import json
+from datetime import datetime
+from tempfile import NamedTemporaryFile
+from typing import Any, List, Optional, Sequence, Union
+
+from airflow.models import BaseOperator
+from airflow.providers.google.cloud.hooks.gcs import GCSHook
+from airflow.providers.google.suite.hooks.calendar import GoogleCalendarHook
+
+
+class GoogleCalendarToGCSOperator(BaseOperator):
+    """
+    Writes Google Calendar data into Google Cloud Storage.
+
+    .. seealso::
+        For more information on how to use this operator, take a look at the guide:
+        :ref:TODO `howto/operator:GoogleCalendarToGCSOperator`

Review comment:
       Actually, I marked it as `TODO` because I think it needs the doc URL, which I assumed would be created after this PR is merged.







[GitHub] [airflow] rsg17 commented on a change in pull request #20769: [wip] [Part 2] Calendar to GCS Operator

Posted by GitBox <gi...@apache.org>.
rsg17 commented on a change in pull request #20769:
URL: https://github.com/apache/airflow/pull/20769#discussion_r784362389



##########
File path: airflow/providers/google/cloud/transfers/calendar_to_gcs.py
##########
@@ -0,0 +1,178 @@
+# Licensed to the Apache Software Foundation (ASF) under one
+# or more contributor license agreements.  See the NOTICE file
+# distributed with this work for additional information
+# regarding copyright ownership.  The ASF licenses this file
+# to you under the Apache License, Version 2.0 (the
+# "License"); you may not use this file except in compliance
+# with the License.  You may obtain a copy of the License at
+#
+#   http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing,
+# software distributed under the License is distributed on an
+# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+# KIND, either express or implied.  See the License for the
+# specific language governing permissions and limitations
+# under the License.
+
+import json
+from datetime import datetime
+from tempfile import NamedTemporaryFile
+from typing import Any, List, Optional, Sequence, Union
+
+from airflow.models import BaseOperator
+from airflow.providers.google.cloud.hooks.gcs import GCSHook
+from airflow.providers.google.suite.hooks.calendar import GoogleCalendarHook
+
+
+class GoogleCalendarToGCSOperator(BaseOperator):

Review comment:
       Thank you for your review @josh-fell!
   
   Will add it.







[GitHub] [airflow] rsg17 commented on pull request #20769: [Part 2] Calendar to GCS Operator

Posted by GitBox <gi...@apache.org>.
rsg17 commented on pull request #20769:
URL: https://github.com/apache/airflow/pull/20769#issuecomment-1040886865


   > LGTM @rsg17 can you please rebase
   > 
   > If no further comments i'll merge after
   
   Rebased. Thank you!

