Posted to commits@airflow.apache.org by GitBox <gi...@apache.org> on 2020/03/14 15:11:18 UTC

[GitHub] [airflow] mik-laj opened a new pull request #7725: [AIRFLOW-7064][WIP] Add CloudFirestoreExportDatabaseOperator

mik-laj opened a new pull request #7725: [AIRFLOW-7064][WIP] Add CloudFirestoreExportDatabaseOperator
URL: https://github.com/apache/airflow/pull/7725
 
 
   ---
   Issue link: WILL BE INSERTED BY [boring-cyborg](https://github.com/kaxil/boring-cyborg)
   
   Make sure to mark the boxes below before creating the PR: [x]
   
   - [X] Description above provides context of the change
   - [X] Commit message/PR title starts with `[AIRFLOW-NNNN]`. AIRFLOW-NNNN = JIRA ID<sup>*</sup>
   - [X] Unit tests coverage for changes (not needed for documentation changes)
   - [X] Commits follow "[How to write a good git commit message](http://chris.beams.io/posts/git-commit/)"
   - [X] Relevant documentation is updated including usage instructions.
   - [X] I will engage committers as explained in [Contribution Workflow Example](https://github.com/apache/airflow/blob/master/CONTRIBUTING.rst#contribution-workflow-example).
   
   <sup>*</sup> For document-only changes, the commit message can start with `[AIRFLOW-XXXX]`.
   
   ---
   In case of fundamental code change, Airflow Improvement Proposal ([AIP](https://cwiki.apache.org/confluence/display/AIRFLOW/Airflow+Improvements+Proposals)) is needed.
   In case of a new dependency, check compliance with the [ASF 3rd Party License Policy](https://www.apache.org/legal/resolved.html#category-x).
   In case of backwards incompatible changes please leave a note in [UPDATING.md](https://github.com/apache/airflow/blob/master/UPDATING.md).
   Read the [Pull Request Guidelines](https://github.com/apache/airflow/blob/master/CONTRIBUTING.rst#pull-request-guidelines) for more information.
   

----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
users@infra.apache.org


With regards,
Apache Git Services

[GitHub] [airflow] nuclearpinguin commented on a change in pull request #7725: [AIRFLOW-7064] Add CloudFirestoreExportDatabaseOperator

Posted by GitBox <gi...@apache.org>.
nuclearpinguin commented on a change in pull request #7725: [AIRFLOW-7064] Add CloudFirestoreExportDatabaseOperator
URL: https://github.com/apache/airflow/pull/7725#discussion_r394153794
 
 

 ##########
 File path: tests/providers/google/cloud/utils/gcp_authenticator.py
 ##########
 @@ -47,6 +47,7 @@
 GCP_SPANNER_KEY = 'gcp_spanner.json'
 GCP_TASKS_KEY = 'gcp_tasks.json'
 GMP_KEY = 'gmp.json'
+G_FIREBASE_KEY = 'g_firebase.json'
 
 Review comment:
   ```suggestion
   G_FIREBASE_KEY = 'gcp_firebase.json'
   ```
   Let's keep the naming consistent with the other GCP keys :)


[GitHub] [airflow] potiuk commented on a change in pull request #7725: [AIRFLOW-7064] Add CloudFirestoreExportDatabaseOperator

Posted by GitBox <gi...@apache.org>.
potiuk commented on a change in pull request #7725: [AIRFLOW-7064] Add CloudFirestoreExportDatabaseOperator
URL: https://github.com/apache/airflow/pull/7725#discussion_r394135643
 
 

 ##########
 File path: airflow/providers/google/firebase/hooks/firestore.py
 ##########
 @@ -0,0 +1,142 @@
+#
+# Licensed to the Apache Software Foundation (ASF) under one
+# or more contributor license agreements.  See the NOTICE file
+# distributed with this work for additional information
+# regarding copyright ownership.  The ASF licenses this file
+# to you under the Apache License, Version 2.0 (the
+# "License"); you may not use this file except in compliance
+# with the License.  You may obtain a copy of the License at
+#
+#   http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing,
+# software distributed under the License is distributed on an
+# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+# KIND, either express or implied.  See the License for the
+# specific language governing permissions and limitations
+# under the License.
+"""Hook for Google Cloud Build service"""
+
+import time
+from typing import Any, Dict, Optional
+
+from googleapiclient.discovery import build, build_from_document
+
+from airflow.exceptions import AirflowException
+from airflow.providers.google.cloud.hooks.base import CloudBaseHook
+
+# Time to sleep between active checks of the operation results
+TIME_TO_SLEEP_IN_SECONDS = 5
+
+
+# noinspection PyAbstractClass
+class CloudFirestoreHook(CloudBaseHook):
+    """
+    Hook for the Google Firestore APIs.
+
+    All the methods in the hook where project_id is used must be called with
+    keyword arguments rather than positional.
+
+    :param api_version: API version used (for example v1 or v1beta1).
+    :type api_version: str
+    :param gcp_conn_id: The connection ID to use when fetching connection info.
+    :type gcp_conn_id: str
+    :param delegate_to: The account to impersonate, if any.
+        For this to work, the service account making the request must have
+        domain-wide delegation enabled.
+    :type delegate_to: str
+    """
+
+    _conn = None  # type: Optional[Any]
+
+    def __init__(
+        self,
+        api_version: str = "v1",
+        gcp_conn_id: str = "google_cloud_default",
+        delegate_to: Optional[str] = None,
+    ) -> None:
+        super().__init__(gcp_conn_id, delegate_to)
+        self.api_version = api_version
+
+    def get_conn(self):
+        """
+        Retrieves the connection to Cloud Build.
+
+        :return: Google Cloud Build services object.
+        """
+        if not self._conn:
+            http_authorized = self._authorize()
+            # We cannot use an Authorized Client to retrieve discovery document due to an error in the API.
+            # When the authorized customer will send a request to the address below
+            # https://www.googleapis.com/discovery/v1/apis/firestore/v1/rest
+            # then it will get the message below:
+            # > Request contains an invalid argument.
+            # At the same time, the Non-Authorized Client has no problems.
+            non_authorized_conn = build("firestore", self.api_version, cache_discovery=False)
+            self._conn = build_from_document(
+                non_authorized_conn._rootDesc,  # pylint: disable=protected-access
+                http=http_authorized
+            )
+        return self._conn
+
+    @CloudBaseHook.fallback_to_default_project_id
+    def export_documents(
+        self, body: Dict, database_id: str = "(default)", project_id: Optional[str] = None
+    ) -> None:
+        """
+        Starts a build with the specified configuration.
 
 Review comment:
   ```suggestion
           Starts export with the specified configuration.
   ```


[GitHub] [airflow] potiuk commented on a change in pull request #7725: [AIRFLOW-7064] Add CloudFirestoreExportDatabaseOperator

Posted by GitBox <gi...@apache.org>.
potiuk commented on a change in pull request #7725: [AIRFLOW-7064] Add CloudFirestoreExportDatabaseOperator
URL: https://github.com/apache/airflow/pull/7725#discussion_r394135089
 
 

 ##########
 File path: airflow/providers/google/firebase/hooks/firestore.py
 ##########
 @@ -0,0 +1,142 @@
+#
+# Licensed to the Apache Software Foundation (ASF) under one
+# or more contributor license agreements.  See the NOTICE file
+# distributed with this work for additional information
+# regarding copyright ownership.  The ASF licenses this file
+# to you under the Apache License, Version 2.0 (the
+# "License"); you may not use this file except in compliance
+# with the License.  You may obtain a copy of the License at
+#
+#   http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing,
+# software distributed under the License is distributed on an
+# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+# KIND, either express or implied.  See the License for the
+# specific language governing permissions and limitations
+# under the License.
+"""Hook for Google Cloud Build service"""
+
+import time
+from typing import Any, Dict, Optional
+
+from googleapiclient.discovery import build, build_from_document
+
+from airflow.exceptions import AirflowException
+from airflow.providers.google.cloud.hooks.base import CloudBaseHook
+
+# Time to sleep between active checks of the operation results
+TIME_TO_SLEEP_IN_SECONDS = 5
+
+
+# noinspection PyAbstractClass
+class CloudFirestoreHook(CloudBaseHook):
+    """
+    Hook for the Google Firestore APIs.
+
+    All the methods in the hook where project_id is used must be called with
+    keyword arguments rather than positional.
+
+    :param api_version: API version used (for example v1 or v1beta1).
+    :type api_version: str
+    :param gcp_conn_id: The connection ID to use when fetching connection info.
+    :type gcp_conn_id: str
+    :param delegate_to: The account to impersonate, if any.
+        For this to work, the service account making the request must have
+        domain-wide delegation enabled.
+    :type delegate_to: str
+    """
+
+    _conn = None  # type: Optional[Any]
+
+    def __init__(
+        self,
+        api_version: str = "v1",
+        gcp_conn_id: str = "google_cloud_default",
+        delegate_to: Optional[str] = None,
+    ) -> None:
+        super().__init__(gcp_conn_id, delegate_to)
+        self.api_version = api_version
+
+    def get_conn(self):
+        """
+        Retrieves the connection to Cloud Build.
+
+        :return: Google Cloud Build services object.
 
 Review comment:
   ```suggestion
           :return: Google Cloud Firestore services object.
   ```


[GitHub] [airflow] potiuk commented on a change in pull request #7725: [AIRFLOW-7064] Add CloudFirestoreExportDatabaseOperator

Posted by GitBox <gi...@apache.org>.
potiuk commented on a change in pull request #7725: [AIRFLOW-7064] Add CloudFirestoreExportDatabaseOperator
URL: https://github.com/apache/airflow/pull/7725#discussion_r394135303
 
 

 ##########
 File path: airflow/providers/google/firebase/hooks/firestore.py
 ##########
 @@ -0,0 +1,142 @@
+#
+# Licensed to the Apache Software Foundation (ASF) under one
+# or more contributor license agreements.  See the NOTICE file
+# distributed with this work for additional information
+# regarding copyright ownership.  The ASF licenses this file
+# to you under the Apache License, Version 2.0 (the
+# "License"); you may not use this file except in compliance
+# with the License.  You may obtain a copy of the License at
+#
+#   http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing,
+# software distributed under the License is distributed on an
+# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+# KIND, either express or implied.  See the License for the
+# specific language governing permissions and limitations
+# under the License.
+"""Hook for Google Cloud Build service"""
+
+import time
+from typing import Any, Dict, Optional
+
+from googleapiclient.discovery import build, build_from_document
+
+from airflow.exceptions import AirflowException
+from airflow.providers.google.cloud.hooks.base import CloudBaseHook
+
+# Time to sleep between active checks of the operation results
+TIME_TO_SLEEP_IN_SECONDS = 5
+
+
+# noinspection PyAbstractClass
+class CloudFirestoreHook(CloudBaseHook):
+    """
+    Hook for the Google Firestore APIs.
+
+    All the methods in the hook where project_id is used must be called with
+    keyword arguments rather than positional.
+
+    :param api_version: API version used (for example v1 or v1beta1).
+    :type api_version: str
+    :param gcp_conn_id: The connection ID to use when fetching connection info.
+    :type gcp_conn_id: str
+    :param delegate_to: The account to impersonate, if any.
+        For this to work, the service account making the request must have
+        domain-wide delegation enabled.
+    :type delegate_to: str
+    """
+
+    _conn = None  # type: Optional[Any]
+
+    def __init__(
+        self,
+        api_version: str = "v1",
+        gcp_conn_id: str = "google_cloud_default",
+        delegate_to: Optional[str] = None,
+    ) -> None:
+        super().__init__(gcp_conn_id, delegate_to)
+        self.api_version = api_version
+
+    def get_conn(self):
+        """
+        Retrieves the connection to Cloud Build.
+
+        :return: Google Cloud Build services object.
+        """
+        if not self._conn:
+            http_authorized = self._authorize()
+            # We cannot use an Authorized Client to retrieve discovery document due to an error in the API.
+            # When the authorized customer will send a request to the address below
+            # https://www.googleapis.com/discovery/v1/apis/firestore/v1/rest
+            # then it will get the message below:
+            # > Request contains an invalid argument.
+            # At the same time, the Non-Authorized Client has no problems.
+            non_authorized_conn = build("firestore", self.api_version, cache_discovery=False)
 
 Review comment:
   :(


[GitHub] [airflow] mik-laj commented on a change in pull request #7725: [AIRFLOW-7064] Add CloudFirestoreExportDatabaseOperator

Posted by GitBox <gi...@apache.org>.
mik-laj commented on a change in pull request #7725: [AIRFLOW-7064] Add CloudFirestoreExportDatabaseOperator
URL: https://github.com/apache/airflow/pull/7725#discussion_r394232169
 
 

 ##########
 File path: airflow/providers/google/firebase/hooks/firestore.py
 ##########
 @@ -0,0 +1,142 @@
+#
+# Licensed to the Apache Software Foundation (ASF) under one
+# or more contributor license agreements.  See the NOTICE file
+# distributed with this work for additional information
+# regarding copyright ownership.  The ASF licenses this file
+# to you under the Apache License, Version 2.0 (the
+# "License"); you may not use this file except in compliance
+# with the License.  You may obtain a copy of the License at
+#
+#   http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing,
+# software distributed under the License is distributed on an
+# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+# KIND, either express or implied.  See the License for the
+# specific language governing permissions and limitations
+# under the License.
+"""Hook for Google Cloud Build service"""
+
+import time
+from typing import Any, Dict, Optional
+
+from googleapiclient.discovery import build, build_from_document
+
+from airflow.exceptions import AirflowException
+from airflow.providers.google.cloud.hooks.base import CloudBaseHook
+
+# Time to sleep between active checks of the operation results
+TIME_TO_SLEEP_IN_SECONDS = 5
+
+
+# noinspection PyAbstractClass
+class CloudFirestoreHook(CloudBaseHook):
+    """
+    Hook for the Google Firestore APIs.
+
+    All the methods in the hook where project_id is used must be called with
+    keyword arguments rather than positional.
+
+    :param api_version: API version used (for example v1 or v1beta1).
+    :type api_version: str
+    :param gcp_conn_id: The connection ID to use when fetching connection info.
+    :type gcp_conn_id: str
+    :param delegate_to: The account to impersonate, if any.
+        For this to work, the service account making the request must have
+        domain-wide delegation enabled.
+    :type delegate_to: str
+    """
+
+    _conn = None  # type: Optional[Any]
+
+    def __init__(
+        self,
+        api_version: str = "v1",
+        gcp_conn_id: str = "google_cloud_default",
+        delegate_to: Optional[str] = None,
+    ) -> None:
+        super().__init__(gcp_conn_id, delegate_to)
+        self.api_version = api_version
+
+    def get_conn(self):
+        """
+        Retrieves the connection to Cloud Build.
+
+        :return: Google Cloud Build services object.
+        """
+        if not self._conn:
+            http_authorized = self._authorize()
+            # We cannot use an Authorized Client to retrieve discovery document due to an error in the API.
+            # When the authorized customer will send a request to the address below
+            # https://www.googleapis.com/discovery/v1/apis/firestore/v1/rest
+            # then it will get the message below:
+            # > Request contains an invalid argument.
+            # At the same time, the Non-Authorized Client has no problems.
+            non_authorized_conn = build("firestore", self.api_version, cache_discovery=False)
 
 Review comment:
   I created a ticket in `google-api-python-client`.
   https://github.com/googleapis/google-api-python-client/issues/845
   I am not sure whether this is the right place to report the bug, so I am not adding a comment in the code. The error is in the service, not in the library, but I don't know of a better place to report it.
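
   For context, a condensed sketch of the workaround pattern the hook uses (it restates the diff above rather than adding anything new; `_rootDesc` is private API, hence the pylint pragma): fetch the discovery document with a non-authorized client, then build the authorized service from that document.

   ```python
   from googleapiclient.discovery import build, build_from_document


   def build_firestore_service(http_authorized, api_version="v1"):
       """Build an authorized Firestore client without calling the broken
       authorized discovery endpoint."""
       # Non-authorized client, used only to obtain the discovery document.
       non_authorized_conn = build("firestore", api_version, cache_discovery=False)
       # Build the real, authorized client from that document.
       return build_from_document(
           non_authorized_conn._rootDesc,  # pylint: disable=protected-access
           http=http_authorized,
       )
   ```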


[GitHub] [airflow] mik-laj merged pull request #7725: [AIRFLOW-7064] Add CloudFirestoreExportDatabaseOperator

Posted by GitBox <gi...@apache.org>.
mik-laj merged pull request #7725: [AIRFLOW-7064] Add CloudFirestoreExportDatabaseOperator
URL: https://github.com/apache/airflow/pull/7725
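
   For readers landing here from the merge notification, a minimal usage sketch of the hook added by this PR, assuming a configured `google_cloud_default` connection; the bucket, collection, and project names are placeholders modeled on the shapes used in the unit tests:

   ```python
   from airflow.providers.google.firebase.hooks.firestore import CloudFirestoreHook

   # Placeholder request body; the shape follows the Firestore REST API's
   # projects.databases/exportDocuments method.
   EXPORT_BODY = {
       "outputUriPrefix": "gs://example-export-bucket/namespace/",
       "collectionIds": ["example-collection"],
   }

   hook = CloudFirestoreHook(api_version="v1", gcp_conn_id="google_cloud_default")
   # Blocks until the export operation reports done=True and raises
   # AirflowException if the operation returns an error.
   hook.export_documents(body=EXPORT_BODY, project_id="example-project")
   ```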
 
 
   


[GitHub] [airflow] nuclearpinguin commented on a change in pull request #7725: [AIRFLOW-7064] Add CloudFirestoreExportDatabaseOperator

Posted by GitBox <gi...@apache.org>.
nuclearpinguin commented on a change in pull request #7725: [AIRFLOW-7064] Add CloudFirestoreExportDatabaseOperator
URL: https://github.com/apache/airflow/pull/7725#discussion_r394154372
 
 

 ##########
 File path: tests/providers/google/firebase/hooks/test_firestore.py
 ##########
 @@ -0,0 +1,267 @@
+# Licensed to the Apache Software Foundation (ASF) under one
+# or more contributor license agreements.  See the NOTICE file
+# distributed with this work for additional information
+# regarding copyright ownership.  The ASF licenses this file
+# to you under the Apache License, Version 2.0 (the
+# "License"); you may not use this file except in compliance
+# with the License.  You may obtain a copy of the License at
+#
+#   http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing,
+# software distributed under the License is distributed on an
+# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+# KIND, either express or implied.  See the License for the
+# specific language governing permissions and limitations
+# under the License.
+
+"""
+Tests for Google Cloud Firestore
+"""
+import unittest
+from typing import Optional
+from unittest import mock
+
+from mock import PropertyMock
+
+from airflow.exceptions import AirflowException
+from airflow.providers.google.firebase.hooks.firestore import CloudFirestoreHook
+from tests.providers.google.cloud.utils.base_gcp_mock import (
+    GCP_PROJECT_ID_HOOK_UNIT_TEST, mock_base_gcp_hook_default_project_id,
+    mock_base_gcp_hook_no_default_project_id,
+)
+
+EXPORT_DOCUMENT_BODY = {
+    "outputUriPrefix": "gs://test-bucket/test-naamespace/",
+    "collectionIds": ["test-collection"],
+}
+
+TEST_OPERATION = {"name": "operation-name", }
+TEST_WAITING_OPERATION = {"done": False, "response": "response"}
+TEST_DONE_OPERATION = {"done": True, "response": "response"}
+TEST_ERROR_OPERATION = {"done": True, "response": "response", "error": "error"}
+TEST_PROJECT_ID = "firestore--project-id"
+
+
+class TestCloudBuildHookWithPassedProjectId(unittest.TestCase):
+    hook = None  # type: Optional[CloudFirestoreHook]
+
+    def setUp(self):
+        with mock.patch(
+            "airflow.providers.google.cloud.hooks.base.CloudBaseHook.__init__",
+            new=mock_base_gcp_hook_default_project_id,
+        ):
+            self.hook = CloudFirestoreHook(gcp_conn_id="test")
+
+    @mock.patch("airflow.providers.google.firebase.hooks.firestore.CloudFirestoreHook._authorize")
+    @mock.patch("airflow.providers.google.firebase.hooks.firestore.build")
+    @mock.patch("airflow.providers.google.firebase.hooks.firestore.build_from_document")
+    def test_client_creation(self, mock_build_from_document, mock_build, mock_authorize):
+        result = self.hook.get_conn()
+        mock_build.assert_called_once_with(
+            'firestore', 'v1', cache_discovery=False
+        )
+        mock_build_from_document.assert_called_once_with(
+            mock_build.return_value._rootDesc, http=mock_authorize.return_value
+        )
+        self.assertEqual(mock_build_from_document.return_value, result)
+        self.assertEqual(self.hook._conn, result)
+
+    @mock.patch("airflow.providers.google.firebase.hooks.firestore.CloudFirestoreHook.get_conn")
+    def test_mmediately_complete(self, get_conn_mock):
+        service_mock = get_conn_mock.return_value
+
+        mock_export_documents = service_mock.projects.return_value.databases.return_value.exportDocuments
+        mock_operation_get = (
+            service_mock.projects.return_value.databases.return_value.operations.return_value.get
+        )
+        (
+            mock_export_documents.return_value
+            .execute.return_value
+        ) = TEST_OPERATION
+
+        (
+            mock_operation_get.return_value.execute.return_value
+        ) = TEST_DONE_OPERATION
+
+        self.hook.export_documents(body=EXPORT_DOCUMENT_BODY, project_id=TEST_PROJECT_ID)
+
+        mock_export_documents.assert_called_once_with(
+            body=EXPORT_DOCUMENT_BODY, name='projects/firestore--project-id/databases/(default)'
+        )
+
+    @mock.patch("airflow.providers.google.firebase.hooks.firestore.CloudFirestoreHook.get_conn")
+    @mock.patch("airflow.providers.google.firebase.hooks.firestore.time.sleep")
+    def test_waiting_operation(self, _, get_conn_mock):
+        service_mock = get_conn_mock.return_value
+
+        mock_export_documents = service_mock.projects.return_value.databases.return_value.exportDocuments
+        mock_operation_get = (
+            service_mock.projects.return_value.databases.return_value.operations.return_value.get
+        )
+        (
+            mock_export_documents.return_value
+            .execute.return_value
+        ) = TEST_OPERATION
+
+        execute_mock = mock.Mock(
+            **{"side_effect": [TEST_WAITING_OPERATION, TEST_DONE_OPERATION, TEST_DONE_OPERATION]}
+        )
+        mock_operation_get.return_value.execute = execute_mock
+
+        self.hook.export_documents(body=EXPORT_DOCUMENT_BODY, project_id=TEST_PROJECT_ID)
+
+        mock_export_documents.assert_called_once_with(
+            body=EXPORT_DOCUMENT_BODY, name='projects/firestore--project-id/databases/(default)'
+        )
+
+    @mock.patch("airflow.providers.google.firebase.hooks.firestore.CloudFirestoreHook.get_conn")
+    @mock.patch("airflow.providers.google.firebase.hooks.firestore.time.sleep")
+    def test_error_operation(self, _, get_conn_mock):
+        service_mock = get_conn_mock.return_value
+
+        mock_export_documents = service_mock.projects.return_value.databases.return_value.exportDocuments
+        mock_operation_get = (
+            service_mock.projects.return_value.databases.return_value.operations.return_value.get
+        )
+        (
+            mock_export_documents.return_value
+            .execute.return_value
+        ) = TEST_OPERATION
+
+        execute_mock = mock.Mock(**{"side_effect": [TEST_WAITING_OPERATION, TEST_ERROR_OPERATION]})
+        mock_operation_get.return_value.execute = execute_mock
+        with self.assertRaisesRegex(AirflowException, "error"):
+            self.hook.export_documents(body=EXPORT_DOCUMENT_BODY, project_id=TEST_PROJECT_ID)
+
+
+class TestGcpComputeHookWithDefaultProjectIdFromConnection(unittest.TestCase):
+    hook = None  # type: Optional[CloudFirestoreHook]
+
+    def setUp(self):
+        with mock.patch(
+            "airflow.providers.google.cloud.hooks.base.CloudBaseHook.__init__",
+            new=mock_base_gcp_hook_default_project_id,
+        ):
+            self.hook = CloudFirestoreHook(gcp_conn_id="test")
+
+    @mock.patch("airflow.providers.google.firebase.hooks.firestore.CloudFirestoreHook._authorize")
+    @mock.patch("airflow.providers.google.firebase.hooks.firestore.build")
+    @mock.patch("airflow.providers.google.firebase.hooks.firestore.build_from_document")
+    def test_client_creation(self, mock_build_from_document, mock_build, mock_authorize):
+        result = self.hook.get_conn()
+        mock_build.assert_called_once_with(
+            'firestore', 'v1', cache_discovery=False
+        )
+        mock_build_from_document.assert_called_once_with(
+            mock_build.return_value._rootDesc, http=mock_authorize.return_value
+        )
+        self.assertEqual(mock_build_from_document.return_value, result)
+        self.assertEqual(self.hook._conn, result)
+
+    @mock.patch(
+        'airflow.providers.google.cloud.hooks.base.CloudBaseHook.project_id',
+        new_callable=PropertyMock,
+        return_value=GCP_PROJECT_ID_HOOK_UNIT_TEST
+    )
+    @mock.patch("airflow.providers.google.firebase.hooks.firestore.CloudFirestoreHook.get_conn")
+    def test_immediately_complete(self, get_conn_mock, mock_project_id):
+        service_mock = get_conn_mock.return_value
+
+        mock_export_documents = service_mock.projects.return_value.databases.return_value.exportDocuments
+        mock_operation_get = (
+            service_mock.projects.return_value.databases.return_value.operations.return_value.get
+        )
+        (
+            mock_export_documents.return_value
+            .execute.return_value
+        ) = TEST_OPERATION
+
+        mock_operation_get.return_value.execute.return_value = TEST_DONE_OPERATION
+
+        self.hook.export_documents(body=EXPORT_DOCUMENT_BODY)
+
+        mock_export_documents.assert_called_once_with(
+            body=EXPORT_DOCUMENT_BODY, name='projects/example-project/databases/(default)'
+        )
+
+    @mock.patch(
+        'airflow.providers.google.cloud.hooks.base.CloudBaseHook.project_id',
+        new_callable=PropertyMock,
+        return_value=GCP_PROJECT_ID_HOOK_UNIT_TEST
+    )
+    @mock.patch("airflow.providers.google.firebase.hooks.firestore.CloudFirestoreHook.get_conn")
+    @mock.patch("airflow.providers.google.firebase.hooks.firestore.time.sleep")
+    def test_waiting_operation(self, _, get_conn_mock, mock_project_id):
+        service_mock = get_conn_mock.return_value
+
+        mock_export_documents = service_mock.projects.return_value.databases.return_value.exportDocuments
+        mock_operation_get = (
+            service_mock.projects.return_value.databases.return_value.operations.return_value.get
+        )
+        (
+            mock_export_documents.return_value
+            .execute.return_value
+        ) = TEST_OPERATION
+
+        execute_mock = mock.Mock(
+            **{"side_effect": [TEST_WAITING_OPERATION, TEST_DONE_OPERATION, TEST_DONE_OPERATION]}
+        )
+        mock_operation_get.return_value.execute = execute_mock
+
+        self.hook.export_documents(body=EXPORT_DOCUMENT_BODY)
+
+        mock_export_documents.assert_called_once_with(
+            body=EXPORT_DOCUMENT_BODY, name='projects/example-project/databases/(default)'
+        )
+
+    @mock.patch(
+        'airflow.providers.google.cloud.hooks.base.CloudBaseHook.project_id',
+        new_callable=PropertyMock,
+        return_value=GCP_PROJECT_ID_HOOK_UNIT_TEST
+    )
+    @mock.patch("airflow.providers.google.firebase.hooks.firestore.CloudFirestoreHook.get_conn")
+    @mock.patch("airflow.providers.google.firebase.hooks.firestore.time.sleep")
+    def test_error_operation(self, _, get_conn_mock, mock_project_id):
+        service_mock = get_conn_mock.return_value
+
+        mock_export_documents = service_mock.projects.return_value.databases.return_value.exportDocuments
+        mock_operation_get = (
+            service_mock.projects.return_value.databases.return_value.operations.return_value.get
+        )
+        (
+            mock_export_documents.return_value
+            .execute.return_value
+        ) = TEST_OPERATION
+
+        execute_mock = mock.Mock(**{"side_effect": [TEST_WAITING_OPERATION, TEST_ERROR_OPERATION]})
+        mock_operation_get.return_value.execute = execute_mock
+        with self.assertRaisesRegex(AirflowException, "error"):
+            self.hook.export_documents(body=EXPORT_DOCUMENT_BODY)
+
+
+class TestCloudBuildHookWithoutProjectId(unittest.TestCase):
 
 Review comment:
   ```suggestion
   class TestCloudFirestoreHookWithoutProjectId(unittest.TestCase):
   ```


[GitHub] [airflow] nuclearpinguin commented on a change in pull request #7725: [AIRFLOW-7064] Add CloudFirestoreExportDatabaseOperator

Posted by GitBox <gi...@apache.org>.
nuclearpinguin commented on a change in pull request #7725: [AIRFLOW-7064] Add CloudFirestoreExportDatabaseOperator
URL: https://github.com/apache/airflow/pull/7725#discussion_r394153266
 
 

 ##########
 File path: airflow/providers/google/firebase/hooks/firestore.py
 ##########
 @@ -0,0 +1,142 @@
+#
+# Licensed to the Apache Software Foundation (ASF) under one
+# or more contributor license agreements.  See the NOTICE file
+# distributed with this work for additional information
+# regarding copyright ownership.  The ASF licenses this file
+# to you under the Apache License, Version 2.0 (the
+# "License"); you may not use this file except in compliance
+# with the License.  You may obtain a copy of the License at
+#
+#   http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing,
+# software distributed under the License is distributed on an
+# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+# KIND, either express or implied.  See the License for the
+# specific language governing permissions and limitations
+# under the License.
+"""Hook for Google Cloud Build service"""
+
+import time
+from typing import Any, Dict, Optional
+
+from googleapiclient.discovery import build, build_from_document
+
+from airflow.exceptions import AirflowException
+from airflow.providers.google.cloud.hooks.base import CloudBaseHook
+
+# Time to sleep between active checks of the operation results
+TIME_TO_SLEEP_IN_SECONDS = 5
+
+
+# noinspection PyAbstractClass
+class CloudFirestoreHook(CloudBaseHook):
+    """
+    Hook for the Google Firestore APIs.
+
+    All the methods in the hook where project_id is used must be called with
+    keyword arguments rather than positional.
+
+    :param api_version: API version used (for example v1 or v1beta1).
+    :type api_version: str
+    :param gcp_conn_id: The connection ID to use when fetching connection info.
+    :type gcp_conn_id: str
+    :param delegate_to: The account to impersonate, if any.
+        For this to work, the service account making the request must have
+        domain-wide delegation enabled.
+    :type delegate_to: str
+    """
+
+    _conn = None  # type: Optional[Any]
+
+    def __init__(
+        self,
+        api_version: str = "v1",
+        gcp_conn_id: str = "google_cloud_default",
+        delegate_to: Optional[str] = None,
+    ) -> None:
+        super().__init__(gcp_conn_id, delegate_to)
+        self.api_version = api_version
+
+    def get_conn(self):
+        """
+        Retrieves the connection to Cloud Build.
+
+        :return: Google Cloud Build services object.
+        """
+        if not self._conn:
+            http_authorized = self._authorize()
+            # We cannot use an Authorized Client to retrieve discovery document due to an error in the API.
+            # When the authorized customer will send a request to the address below
+            # https://www.googleapis.com/discovery/v1/apis/firestore/v1/rest
+            # then it will get the message below:
+            # > Request contains an invalid argument.
+            # At the same time, the Non-Authorized Client has no problems.
+            non_authorized_conn = build("firestore", self.api_version, cache_discovery=False)
+            self._conn = build_from_document(
+                non_authorized_conn._rootDesc,  # pylint: disable=protected-access
+                http=http_authorized
+            )
+        return self._conn
+
+    @CloudBaseHook.fallback_to_default_project_id
+    def export_documents(
+        self, body: Dict, database_id: str = "(default)", project_id: Optional[str] = None
+    ) -> None:
+        """
+        Starts a build with the specified configuration.
+
+        :param database_id: The Database ID.
+        :type database_id: str
+        :param body: The request body.
+            See:
+            https://firebase.google.com/docs/firestore/reference/rest/v1beta1/projects.databases/exportDocuments
+        :type body: dict
+        :param project_id: Optional, Google Cloud Project project_id where the function belongs.
+            If set to None or missing, the default project_id from the GCP connection is used.
+        :type project_id: str
+        """
+        if not project_id:
+            raise ValueError("The project_id should be set")
+        service = self.get_conn()
+
+        name = f"projects/{project_id}/databases/{database_id}"
+
+        operation = (
+            service.projects()  # pylint: disable=no-member
+            .databases()
+            .exportDocuments(name=name, body=body)
+            .execute(num_retries=self.num_retries)
+        )
+
+        self._wait_for_operation_to_complete(operation["name"])
+
+    def _wait_for_operation_to_complete(self, operation_name: str) -> None:
+        """
+        Waits for the named operation to complete - checks status of the
+        asynchronous call.
+
+        :param operation_name: The name of the operation.
+        :type operation_name: str
+        :return: The response returned by the operation.
+        :rtype: dict
+        :exception: AirflowException in case error is returned.
+        """
+        service = self.get_conn()
+        while True:
+            operation_response = (
+                service.projects()  # pylint: disable=no-member
+                .databases()
+                .operations()
+                .get(name=operation_name)
+                .execute(num_retries=self.num_retries)
+            )
+            if operation_response.get("done"):
 
 Review comment:
   This is OK; however, I wonder whether `if "done" in operation_response` wouldn't be more Pythonic?


[GitHub] [airflow] mik-laj commented on a change in pull request #7725: [AIRFLOW-7064] Add CloudFirestoreExportDatabaseOperator

Posted by GitBox <gi...@apache.org>.
mik-laj commented on a change in pull request #7725: [AIRFLOW-7064] Add CloudFirestoreExportDatabaseOperator
URL: https://github.com/apache/airflow/pull/7725#discussion_r394188979
 
 

 ##########
 File path: airflow/providers/google/firebase/hooks/firestore.py
 ##########
 @@ -0,0 +1,142 @@
+#
+# Licensed to the Apache Software Foundation (ASF) under one
+# or more contributor license agreements.  See the NOTICE file
+# distributed with this work for additional information
+# regarding copyright ownership.  The ASF licenses this file
+# to you under the Apache License, Version 2.0 (the
+# "License"); you may not use this file except in compliance
+# with the License.  You may obtain a copy of the License at
+#
+#   http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing,
+# software distributed under the License is distributed on an
+# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+# KIND, either express or implied.  See the License for the
+# specific language governing permissions and limitations
+# under the License.
+"""Hook for Google Cloud Build service"""
+
+import time
+from typing import Any, Dict, Optional
+
+from googleapiclient.discovery import build, build_from_document
+
+from airflow.exceptions import AirflowException
+from airflow.providers.google.cloud.hooks.base import CloudBaseHook
+
+# Time to sleep between active checks of the operation results
+TIME_TO_SLEEP_IN_SECONDS = 5
+
+
+# noinspection PyAbstractClass
+class CloudFirestoreHook(CloudBaseHook):
+    """
+    Hook for the Google Firestore APIs.
+
+    All the methods in the hook where project_id is used must be called with
+    keyword arguments rather than positional.
+
+    :param api_version: API version used (for example v1 or v1beta1).
+    :type api_version: str
+    :param gcp_conn_id: The connection ID to use when fetching connection info.
+    :type gcp_conn_id: str
+    :param delegate_to: The account to impersonate, if any.
+        For this to work, the service account making the request must have
+        domain-wide delegation enabled.
+    :type delegate_to: str
+    """
+
+    _conn = None  # type: Optional[Any]
+
+    def __init__(
+        self,
+        api_version: str = "v1",
+        gcp_conn_id: str = "google_cloud_default",
+        delegate_to: Optional[str] = None,
+    ) -> None:
+        super().__init__(gcp_conn_id, delegate_to)
+        self.api_version = api_version
+
+    def get_conn(self):
+        """
+        Retrieves the connection to Cloud Build.
+
+        :return: Google Cloud Build services object.
+        """
+        if not self._conn:
+            http_authorized = self._authorize()
+            # We cannot use an Authorized Client to retrieve discovery document due to an error in the API.
+            # When the authorized customer will send a request to the address below
+            # https://www.googleapis.com/discovery/v1/apis/firestore/v1/rest
+            # then it will get the message below:
+            # > Request contains an invalid argument.
+            # At the same time, the Non-Authorized Client has no problems.
+            non_authorized_conn = build("firestore", self.api_version, cache_discovery=False)
+            self._conn = build_from_document(
+                non_authorized_conn._rootDesc,  # pylint: disable=protected-access
+                http=http_authorized
+            )
+        return self._conn
+
+    @CloudBaseHook.fallback_to_default_project_id
+    def export_documents(
+        self, body: Dict, database_id: str = "(default)", project_id: Optional[str] = None
+    ) -> None:
+        """
+        Starts a build with the specified configuration.
+
+        :param database_id: The Database ID.
+        :type database_id: str
+        :param body: The request body.
+            See:
+            https://firebase.google.com/docs/firestore/reference/rest/v1beta1/projects.databases/exportDocuments
+        :type body: dict
+        :param project_id: Optional, Google Cloud Project project_id where the function belongs.
+            If set to None or missing, the default project_id from the GCP connection is used.
+        :type project_id: str
+        """
+        if not project_id:
+            raise ValueError("The project_id should be set")
+        service = self.get_conn()
+
+        name = f"projects/{project_id}/databases/{database_id}"
+
+        operation = (
+            service.projects()  # pylint: disable=no-member
+            .databases()
+            .exportDocuments(name=name, body=body)
+            .execute(num_retries=self.num_retries)
+        )
+
+        self._wait_for_operation_to_complete(operation["name"])
+
+    def _wait_for_operation_to_complete(self, operation_name: str) -> None:
+        """
+        Waits for the named operation to complete - checks status of the
+        asynchronous call.
+
+        :param operation_name: The name of the operation.
+        :type operation_name: str
+        :return: The response returned by the operation.
+        :rtype: dict
+        :exception: AirflowException in case error is returned.
+        """
+        service = self.get_conn()
+        while True:
+            operation_response = (
+                service.projects()  # pylint: disable=no-member
+                .databases()
+                .operations()
+                .get(name=operation_name)
+                .execute(num_retries=self.num_retries)
+            )
+            if operation_response.get("done"):
 
 Review comment:
   We want to check whether the `done` value is `True`, not just whether the key is present; an operation that is still running can carry `done: False` and would wrongly pass a membership check.
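
   A minimal sketch, reusing the operation shapes mocked in the tests above, of why the value check and the membership check behave differently:

   ```python
   # Operation shapes as mocked in test_firestore.py.
   TEST_WAITING_OPERATION = {"done": False, "response": "response"}
   TEST_DONE_OPERATION = {"done": True, "response": "response"}

   # Membership check: True for both, so a still-running operation would be
   # treated as finished.
   assert ("done" in TEST_WAITING_OPERATION) is True
   assert ("done" in TEST_DONE_OPERATION) is True

   # Value check (what the hook does): only truthy once the operation is done.
   assert not TEST_WAITING_OPERATION.get("done")
   assert TEST_DONE_OPERATION.get("done")
   ```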


[GitHub] [airflow] potiuk commented on a change in pull request #7725: [AIRFLOW-7064] Add CloudFirestoreExportDatabaseOperator

Posted by GitBox <gi...@apache.org>.
potiuk commented on a change in pull request #7725: [AIRFLOW-7064] Add CloudFirestoreExportDatabaseOperator
URL: https://github.com/apache/airflow/pull/7725#discussion_r394135856
 
 

 ##########
 File path: airflow/providers/google/firebase/hooks/firestore.py
 ##########
 @@ -0,0 +1,142 @@
+#
+# Licensed to the Apache Software Foundation (ASF) under one
+# or more contributor license agreements.  See the NOTICE file
+# distributed with this work for additional information
+# regarding copyright ownership.  The ASF licenses this file
+# to you under the Apache License, Version 2.0 (the
+# "License"); you may not use this file except in compliance
+# with the License.  You may obtain a copy of the License at
+#
+#   http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing,
+# software distributed under the License is distributed on an
+# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+# KIND, either express or implied.  See the License for the
+# specific language governing permissions and limitations
+# under the License.
+"""Hook for Google Cloud Build service"""
+
+import time
+from typing import Any, Dict, Optional
+
+from googleapiclient.discovery import build, build_from_document
+
+from airflow.exceptions import AirflowException
+from airflow.providers.google.cloud.hooks.base import CloudBaseHook
+
+# Time to sleep between active checks of the operation results
+TIME_TO_SLEEP_IN_SECONDS = 5
+
+
+# noinspection PyAbstractClass
+class CloudFirestoreHook(CloudBaseHook):
+    """
+    Hook for the Google Firestore APIs.
+
+    All the methods in the hook where project_id is used must be called with
+    keyword arguments rather than positional.
+
+    :param api_version: API version used (for example v1 or v1beta1).
+    :type api_version: str
+    :param gcp_conn_id: The connection ID to use when fetching connection info.
+    :type gcp_conn_id: str
+    :param delegate_to: The account to impersonate, if any.
+        For this to work, the service account making the request must have
+        domain-wide delegation enabled.
+    :type delegate_to: str
+    """
+
+    _conn = None  # type: Optional[Any]
+
+    def __init__(
+        self,
+        api_version: str = "v1",
+        gcp_conn_id: str = "google_cloud_default",
+        delegate_to: Optional[str] = None,
+    ) -> None:
+        super().__init__(gcp_conn_id, delegate_to)
+        self.api_version = api_version
+
+    def get_conn(self):
+        """
+        Retrieves the connection to Cloud Build.
+
+        :return: Google Cloud Build services object.
+        """
+        if not self._conn:
+            http_authorized = self._authorize()
+            # We cannot use an Authorized Client to retrieve discovery document due to an error in the API.
+            # When the authorized customer will send a request to the address below
+            # https://www.googleapis.com/discovery/v1/apis/firestore/v1/rest
+            # then it will get the message below:
+            # > Request contains an invalid argument.
+            # At the same time, the Non-Authorized Client has no problems.
+            non_authorized_conn = build("firestore", self.api_version, cache_discovery=False)
+            self._conn = build_from_document(
+                non_authorized_conn._rootDesc,  # pylint: disable=protected-access
+                http=http_authorized
+            )
+        return self._conn
+
+    @CloudBaseHook.fallback_to_default_project_id
+    def export_documents(
+        self, body: Dict, database_id: str = "(default)", project_id: Optional[str] = None
+    ) -> None:
+        """
+        Starts a build with the specified configuration.
+
+        :param database_id: The Database ID.
+        :type database_id: str
+        :param body: The request body.
+            See:
+            https://firebase.google.com/docs/firestore/reference/rest/v1beta1/projects.databases/exportDocuments
+        :type body: dict
+        :param project_id: Optional, Google Cloud Project project_id where the function belongs.
 
 Review comment:
   ```suggestion
           :param project_id: Optional, Google Cloud Project project_id where the database belongs.
   ```


[GitHub] [airflow] mik-laj commented on a change in pull request #7725: [AIRFLOW-7064] Add CloudFirestoreExportDatabaseOperator

Posted by GitBox <gi...@apache.org>.
mik-laj commented on a change in pull request #7725: [AIRFLOW-7064] Add CloudFirestoreExportDatabaseOperator
URL: https://github.com/apache/airflow/pull/7725#discussion_r394192928
 
 

 ##########
 File path: tests/providers/google/cloud/utils/gcp_authenticator.py
 ##########
 @@ -47,6 +47,7 @@
 GCP_SPANNER_KEY = 'gcp_spanner.json'
 GCP_TASKS_KEY = 'gcp_tasks.json'
 GMP_KEY = 'gmp.json'
+G_FIREBASE_KEY = 'g_firebase.json'
 
 Review comment:
   It's not a GCP service, but a Firebase service. This database is available in the Pantheon, but it is most often used by Firebase and was created as part of Firebase. It also shares the **Fire** name prefix with **Fire**base.


[GitHub] [airflow] nuclearpinguin commented on a change in pull request #7725: [AIRFLOW-7064] Add CloudFirestoreExportDatabaseOperator

Posted by GitBox <gi...@apache.org>.
nuclearpinguin commented on a change in pull request #7725: [AIRFLOW-7064] Add CloudFirestoreExportDatabaseOperator
URL: https://github.com/apache/airflow/pull/7725#discussion_r394154595
 
 

 ##########
 File path: tests/providers/google/firebase/operators/test_firestore_system.py
 ##########
 @@ -0,0 +1,47 @@
+#
+# Licensed to the Apache Software Foundation (ASF) under one
+# or more contributor license agreements.  See the NOTICE file
+# distributed with this work for additional information
+# regarding copyright ownership.  The ASF licenses this file
+# to you under the Apache License, Version 2.0 (the
+# "License"); you may not use this file except in compliance
+# with the License.  You may obtain a copy of the License at
+#
+#   http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing,
+# software distributed under the License is distributed on an
+# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+# KIND, either express or implied.  See the License for the
+# specific language governing permissions and limitations
+# under the License.
+import pytest
+
+from airflow.providers.google.firebase.example_dags.example_firestore import (
+    DATASET_NAME, EXPORT_DESTINATION_URL,
+)
+from tests.providers.google.cloud.utils.gcp_authenticator import G_FIREBASE_KEY
+from tests.test_utils.gcp_system_helpers import FIREBASE_DAG_FOLDER, GoogleSystemTest, provide_gcp_context
+
+
+@pytest.mark.system("google.firebase")
+@pytest.mark.system("google.cloud")
+@pytest.mark.credential_file(G_FIREBASE_KEY)
+class CampaignManagerSystemTest(GoogleSystemTest):
 
 Review comment:
   ```suggestion
   class CloudFirestoreSystemTest(GoogleSystemTest):
   ```


[GitHub] [airflow] potiuk commented on a change in pull request #7725: [AIRFLOW-7064] Add CloudFirestoreExportDatabaseOperator

Posted by GitBox <gi...@apache.org>.
potiuk commented on a change in pull request #7725: [AIRFLOW-7064] Add CloudFirestoreExportDatabaseOperator
URL: https://github.com/apache/airflow/pull/7725#discussion_r394134993
 
 

 ##########
 File path: airflow/providers/google/firebase/hooks/firestore.py
 ##########
 @@ -0,0 +1,142 @@
+#
+# Licensed to the Apache Software Foundation (ASF) under one
+# or more contributor license agreements.  See the NOTICE file
+# distributed with this work for additional information
+# regarding copyright ownership.  The ASF licenses this file
+# to you under the Apache License, Version 2.0 (the
+# "License"); you may not use this file except in compliance
+# with the License.  You may obtain a copy of the License at
+#
+#   http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing,
+# software distributed under the License is distributed on an
+# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+# KIND, either express or implied.  See the License for the
+# specific language governing permissions and limitations
+# under the License.
+"""Hook for Google Cloud Build service"""
+
+import time
+from typing import Any, Dict, Optional
+
+from googleapiclient.discovery import build, build_from_document
+
+from airflow.exceptions import AirflowException
+from airflow.providers.google.cloud.hooks.base import CloudBaseHook
+
+# Time to sleep between active checks of the operation results
+TIME_TO_SLEEP_IN_SECONDS = 5
+
+
+# noinspection PyAbstractClass
+class CloudFirestoreHook(CloudBaseHook):
+    """
+    Hook for the Google Firestore APIs.
+
+    All the methods in the hook where project_id is used must be called with
+    keyword arguments rather than positional.
+
+    :param api_version: API version used (for example v1 or v1beta1).
+    :type api_version: str
+    :param gcp_conn_id: The connection ID to use when fetching connection info.
+    :type gcp_conn_id: str
+    :param delegate_to: The account to impersonate, if any.
+        For this to work, the service account making the request must have
+        domain-wide delegation enabled.
+    :type delegate_to: str
+    """
+
+    _conn = None  # type: Optional[Any]
+
+    def __init__(
+        self,
+        api_version: str = "v1",
+        gcp_conn_id: str = "google_cloud_default",
+        delegate_to: Optional[str] = None,
+    ) -> None:
+        super().__init__(gcp_conn_id, delegate_to)
+        self.api_version = api_version
+
+    def get_conn(self):
+        """
+        Retrieves the connection to Cloud Build.
 
 Review comment:
   ```suggestion
           Retrieves the connection to Cloud Firestore.
   ```


[GitHub] [airflow] mik-laj commented on a change in pull request #7725: [AIRFLOW-7064] Add CloudFirestoreExportDatabaseOperator

Posted by GitBox <gi...@apache.org>.
mik-laj commented on a change in pull request #7725: [AIRFLOW-7064] Add CloudFirestoreExportDatabaseOperator
URL: https://github.com/apache/airflow/pull/7725#discussion_r394191506
 
 

 ##########
 File path: tests/providers/google/firebase/operators/test_firestore_system.py
 ##########
 @@ -0,0 +1,47 @@
+#
+# Licensed to the Apache Software Foundation (ASF) under one
+# or more contributor license agreements.  See the NOTICE file
+# distributed with this work for additional information
+# regarding copyright ownership.  The ASF licenses this file
+# to you under the Apache License, Version 2.0 (the
+# "License"); you may not use this file except in compliance
+# with the License.  You may obtain a copy of the License at
+#
+#   http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing,
+# software distributed under the License is distributed on an
+# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+# KIND, either express or implied.  See the License for the
+# specific language governing permissions and limitations
+# under the License.
+import pytest
+
+from airflow.providers.google.firebase.example_dags.example_firestore import (
+    DATASET_NAME, EXPORT_DESTINATION_URL,
+)
+from tests.providers.google.cloud.utils.gcp_authenticator import G_FIREBASE_KEY
+from tests.test_utils.gcp_system_helpers import FIREBASE_DAG_FOLDER, GoogleSystemTest, provide_gcp_context
+
+
+@pytest.mark.system("google.firebase")
+@pytest.mark.system("google.cloud")
+@pytest.mark.credential_file(G_FIREBASE_KEY)
+class CampaignManagerSystemTest(GoogleSystemTest):
+    def setUp(self):
+        super().setUp()
+        self.clean_up()
+
+    def tearDown(self):
+        self.clean_up()
+        super().tearDown()
+
+    def clean_up(self):
+        self.execute_with_ctx(["gsutil", "rm", "-r", f"{EXPORT_DESTINATION_URL}"], G_FIREBASE_KEY)
 
 Review comment:
   I do not want to delete the entire bucket, only its contents, because this test has complex requirements for the bucket's location and permissions. The details are documented in example_firestore.py.
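   For illustration only (this is not code from the PR), a content-only cleanup done with the Python client instead of `gsutil` could look roughly like the sketch below. It assumes the `google-cloud-storage` library is available and that the `G_FIREBASE_KEY` credentials can delete objects in the export bucket; the bucket itself, with its special location and permission setup, is left untouched.

   ```python
   from google.cloud import storage


   def delete_objects_under_prefix(bucket_name: str, prefix: str) -> None:
       """Delete only the objects under ``prefix``; the bucket itself is kept."""
       client = storage.Client()
       bucket = client.bucket(bucket_name)
       # list_blobs() with a prefix yields only the export artifacts, so the
       # bucket (and its location/permission configuration) is not removed.
       for blob in client.list_blobs(bucket, prefix=prefix):
           blob.delete()
   ```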


[GitHub] [airflow] nuclearpinguin commented on a change in pull request #7725: [AIRFLOW-7064] Add CloudFirestoreExportDatabaseOperator

Posted by GitBox <gi...@apache.org>.
nuclearpinguin commented on a change in pull request #7725: [AIRFLOW-7064] Add CloudFirestoreExportDatabaseOperator
URL: https://github.com/apache/airflow/pull/7725#discussion_r394155102
 
 

 ##########
 File path: tests/providers/google/firebase/operators/test_firestore_system.py
 ##########
 @@ -0,0 +1,47 @@
+#
+# Licensed to the Apache Software Foundation (ASF) under one
+# or more contributor license agreements.  See the NOTICE file
+# distributed with this work for additional information
+# regarding copyright ownership.  The ASF licenses this file
+# to you under the Apache License, Version 2.0 (the
+# "License"); you may not use this file except in compliance
+# with the License.  You may obtain a copy of the License at
+#
+#   http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing,
+# software distributed under the License is distributed on an
+# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+# KIND, either express or implied.  See the License for the
+# specific language governing permissions and limitations
+# under the License.
+import pytest
+
+from airflow.providers.google.firebase.example_dags.example_firestore import (
+    DATASET_NAME, EXPORT_DESTINATION_URL,
+)
+from tests.providers.google.cloud.utils.gcp_authenticator import G_FIREBASE_KEY
+from tests.test_utils.gcp_system_helpers import FIREBASE_DAG_FOLDER, GoogleSystemTest, provide_gcp_context
+
+
+@pytest.mark.system("google.firebase")
+@pytest.mark.system("google.cloud")
+@pytest.mark.credential_file(G_FIREBASE_KEY)
+class CampaignManagerSystemTest(GoogleSystemTest):
+    def setUp(self):
+        super().setUp()
+        self.clean_up()
+
+    def tearDown(self):
+        self.clean_up()
+        super().tearDown()
+
+    def clean_up(self):
+        self.execute_with_ctx(["gsutil", "rm", "-r", f"{EXPORT_DESTINATION_URL}"], G_FIREBASE_KEY)
 
 Review comment:
   Can't we use the `delete_gcs_bucket` method of `GoogleSystemTest`?


[GitHub] [airflow] mik-laj commented on a change in pull request #7725: [AIRFLOW-7064] Add CloudFirestoreExportDatabaseOperator

Posted by GitBox <gi...@apache.org>.
mik-laj commented on a change in pull request #7725: [AIRFLOW-7064] Add CloudFirestoreExportDatabaseOperator
URL: https://github.com/apache/airflow/pull/7725#discussion_r394188979
 
 

 ##########
 File path: airflow/providers/google/firebase/hooks/firestore.py
 ##########
 @@ -0,0 +1,142 @@
+#
+# Licensed to the Apache Software Foundation (ASF) under one
+# or more contributor license agreements.  See the NOTICE file
+# distributed with this work for additional information
+# regarding copyright ownership.  The ASF licenses this file
+# to you under the Apache License, Version 2.0 (the
+# "License"); you may not use this file except in compliance
+# with the License.  You may obtain a copy of the License at
+#
+#   http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing,
+# software distributed under the License is distributed on an
+# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+# KIND, either express or implied.  See the License for the
+# specific language governing permissions and limitations
+# under the License.
+"""Hook for Google Cloud Build service"""
+
+import time
+from typing import Any, Dict, Optional
+
+from googleapiclient.discovery import build, build_from_document
+
+from airflow.exceptions import AirflowException
+from airflow.providers.google.cloud.hooks.base import CloudBaseHook
+
+# Time to sleep between active checks of the operation results
+TIME_TO_SLEEP_IN_SECONDS = 5
+
+
+# noinspection PyAbstractClass
+class CloudFirestoreHook(CloudBaseHook):
+    """
+    Hook for the Google Firestore APIs.
+
+    All the methods in the hook where project_id is used must be called with
+    keyword arguments rather than positional.
+
+    :param api_version: API version used (for example v1 or v1beta1).
+    :type api_version: str
+    :param gcp_conn_id: The connection ID to use when fetching connection info.
+    :type gcp_conn_id: str
+    :param delegate_to: The account to impersonate, if any.
+        For this to work, the service account making the request must have
+        domain-wide delegation enabled.
+    :type delegate_to: str
+    """
+
+    _conn = None  # type: Optional[Any]
+
+    def __init__(
+        self,
+        api_version: str = "v1",
+        gcp_conn_id: str = "google_cloud_default",
+        delegate_to: Optional[str] = None,
+    ) -> None:
+        super().__init__(gcp_conn_id, delegate_to)
+        self.api_version = api_version
+
+    def get_conn(self):
+        """
+        Retrieves the connection to Cloud Build.
+
+        :return: Google Cloud Build services object.
+        """
+        if not self._conn:
+            http_authorized = self._authorize()
+            # We cannot use an Authorized Client to retrieve discovery document due to an error in the API.
+            # When the authorized customer will send a request to the address below
+            # https://www.googleapis.com/discovery/v1/apis/firestore/v1/rest
+            # then it will get the message below:
+            # > Request contains an invalid argument.
+            # At the same time, the Non-Authorized Client has no problems.
+            non_authorized_conn = build("firestore", self.api_version, cache_discovery=False)
+            self._conn = build_from_document(
+                non_authorized_conn._rootDesc,  # pylint: disable=protected-access
+                http=http_authorized
+            )
+        return self._conn
+
+    @CloudBaseHook.fallback_to_default_project_id
+    def export_documents(
+        self, body: Dict, database_id: str = "(default)", project_id: Optional[str] = None
+    ) -> None:
+        """
+        Starts a build with the specified configuration.
+
+        :param database_id: The Database ID.
+        :type database_id: str
+        :param body: The request body.
+            See:
+            https://firebase.google.com/docs/firestore/reference/rest/v1beta1/projects.databases/exportDocuments
+        :type body: dict
+        :param project_id: Optional, Google Cloud Project project_id where the function belongs.
+            If set to None or missing, the default project_id from the GCP connection is used.
+        :type project_id: str
+        """
+        if not project_id:
+            raise ValueError("The project_id should be set")
+        service = self.get_conn()
+
+        name = f"projects/{project_id}/databases/{database_id}"
+
+        operation = (
+            service.projects()  # pylint: disable=no-member
+            .databases()
+            .exportDocuments(name=name, body=body)
+            .execute(num_retries=self.num_retries)
+        )
+
+        self._wait_for_operation_to_complete(operation["name"])
+
+    def _wait_for_operation_to_complete(self, operation_name: str) -> None:
+        """
+        Waits for the named operation to complete - checks status of the
+        asynchronous call.
+
+        :param operation_name: The name of the operation.
+        :type operation_name: str
+        :return: The response returned by the operation.
+        :rtype: dict
+        :exception: AirflowException in case error is returned.
+        """
+        service = self.get_conn()
+        while True:
+            operation_response = (
+                service.projects()  # pylint: disable=no-member
+                .databases()
+                .operations()
+                .get(name=operation_name)
+                .execute(num_retries=self.num_retries)
+            )
+            if operation_response.get("done"):
 
 Review comment:
   We want to check whether the `done` value is `true`; while the operation is still in progress the field is `false` or absent.
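   To make that intent explicit, here is a minimal sketch of the polling check. It assumes the response follows the standard long-running Operation shape (a JSON object with `done`, `error` and `response` fields) and is not the PR's final code:

   ```python
   import time

   from airflow.exceptions import AirflowException

   TIME_TO_SLEEP_IN_SECONDS = 5


   def wait_for_operation(get_operation) -> dict:
       """Poll ``get_operation()`` until the operation reports done=true."""
       while True:
           operation = get_operation()
           # "done" is false or absent while the export is still running and
           # only becomes true once the operation has finished.
           if operation.get("done"):
               if "error" in operation:
                   raise AirflowException(str(operation["error"]))
               return operation.get("response", {})
           time.sleep(TIME_TO_SLEEP_IN_SECONDS)
   ```

   A successful export returns the `response` field; a failed one carries an `error`, which the sketch surfaces as an `AirflowException`.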


[GitHub] [airflow] nuclearpinguin commented on a change in pull request #7725: [AIRFLOW-7064] Add CloudFirestoreExportDatabaseOperator

Posted by GitBox <gi...@apache.org>.
nuclearpinguin commented on a change in pull request #7725: [AIRFLOW-7064] Add CloudFirestoreExportDatabaseOperator
URL: https://github.com/apache/airflow/pull/7725#discussion_r394154051
 
 

 ##########
 File path: tests/providers/google/firebase/hooks/test_firestore.py
 ##########
 @@ -0,0 +1,267 @@
+# Licensed to the Apache Software Foundation (ASF) under one
+# or more contributor license agreements.  See the NOTICE file
+# distributed with this work for additional information
+# regarding copyright ownership.  The ASF licenses this file
+# to you under the Apache License, Version 2.0 (the
+# "License"); you may not use this file except in compliance
+# with the License.  You may obtain a copy of the License at
+#
+#   http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing,
+# software distributed under the License is distributed on an
+# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+# KIND, either express or implied.  See the License for the
+# specific language governing permissions and limitations
+# under the License.
+
+"""
+Tests for Google Cloud Firestore
+"""
+import unittest
+from typing import Optional
+from unittest import mock
+
+from mock import PropertyMock
+
+from airflow.exceptions import AirflowException
+from airflow.providers.google.firebase.hooks.firestore import CloudFirestoreHook
+from tests.providers.google.cloud.utils.base_gcp_mock import (
+    GCP_PROJECT_ID_HOOK_UNIT_TEST, mock_base_gcp_hook_default_project_id,
+    mock_base_gcp_hook_no_default_project_id,
+)
+
+EXPORT_DOCUMENT_BODY = {
+    "outputUriPrefix": "gs://test-bucket/test-naamespace/",
+    "collectionIds": ["test-collection"],
+}
+
+TEST_OPERATION = {"name": "operation-name", }
+TEST_WAITING_OPERATION = {"done": False, "response": "response"}
+TEST_DONE_OPERATION = {"done": True, "response": "response"}
+TEST_ERROR_OPERATION = {"done": True, "response": "response", "error": "error"}
+TEST_PROJECT_ID = "firestore--project-id"
+
+
+class TestCloudBuildHookWithPassedProjectId(unittest.TestCase):
 
 Review comment:
   ```suggestion
   class TestCloudFirestoreHookWithPassedProjectId(unittest.TestCase):
   ```


[GitHub] [airflow] nuclearpinguin commented on a change in pull request #7725: [AIRFLOW-7064] Add CloudFirestoreExportDatabaseOperator

Posted by GitBox <gi...@apache.org>.
nuclearpinguin commented on a change in pull request #7725: [AIRFLOW-7064] Add CloudFirestoreExportDatabaseOperator
URL: https://github.com/apache/airflow/pull/7725#discussion_r394154227
 
 

 ##########
 File path: tests/providers/google/firebase/hooks/test_firestore.py
 ##########
 @@ -0,0 +1,267 @@
+# Licensed to the Apache Software Foundation (ASF) under one
+# or more contributor license agreements.  See the NOTICE file
+# distributed with this work for additional information
+# regarding copyright ownership.  The ASF licenses this file
+# to you under the Apache License, Version 2.0 (the
+# "License"); you may not use this file except in compliance
+# with the License.  You may obtain a copy of the License at
+#
+#   http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing,
+# software distributed under the License is distributed on an
+# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+# KIND, either express or implied.  See the License for the
+# specific language governing permissions and limitations
+# under the License.
+
+"""
+Tests for Google Cloud Firestore
+"""
+import unittest
+from typing import Optional
+from unittest import mock
+
+from mock import PropertyMock
+
+from airflow.exceptions import AirflowException
+from airflow.providers.google.firebase.hooks.firestore import CloudFirestoreHook
+from tests.providers.google.cloud.utils.base_gcp_mock import (
+    GCP_PROJECT_ID_HOOK_UNIT_TEST, mock_base_gcp_hook_default_project_id,
+    mock_base_gcp_hook_no_default_project_id,
+)
+
+EXPORT_DOCUMENT_BODY = {
+    "outputUriPrefix": "gs://test-bucket/test-naamespace/",
+    "collectionIds": ["test-collection"],
+}
+
+TEST_OPERATION = {"name": "operation-name", }
+TEST_WAITING_OPERATION = {"done": False, "response": "response"}
+TEST_DONE_OPERATION = {"done": True, "response": "response"}
+TEST_ERROR_OPERATION = {"done": True, "response": "response", "error": "error"}
+TEST_PROJECT_ID = "firestore--project-id"
+
+
+class TestCloudBuildHookWithPassedProjectId(unittest.TestCase):
+    hook = None  # type: Optional[CloudFirestoreHook]
+
+    def setUp(self):
+        with mock.patch(
+            "airflow.providers.google.cloud.hooks.base.CloudBaseHook.__init__",
+            new=mock_base_gcp_hook_default_project_id,
+        ):
+            self.hook = CloudFirestoreHook(gcp_conn_id="test")
+
+    @mock.patch("airflow.providers.google.firebase.hooks.firestore.CloudFirestoreHook._authorize")
+    @mock.patch("airflow.providers.google.firebase.hooks.firestore.build")
+    @mock.patch("airflow.providers.google.firebase.hooks.firestore.build_from_document")
+    def test_client_creation(self, mock_build_from_document, mock_build, mock_authorize):
+        result = self.hook.get_conn()
+        mock_build.assert_called_once_with(
+            'firestore', 'v1', cache_discovery=False
+        )
+        mock_build_from_document.assert_called_once_with(
+            mock_build.return_value._rootDesc, http=mock_authorize.return_value
+        )
+        self.assertEqual(mock_build_from_document.return_value, result)
+        self.assertEqual(self.hook._conn, result)
+
+    @mock.patch("airflow.providers.google.firebase.hooks.firestore.CloudFirestoreHook.get_conn")
+    def test_immediately_complete(self, get_conn_mock):
+        service_mock = get_conn_mock.return_value
+
+        mock_export_documents = service_mock.projects.return_value.databases.return_value.exportDocuments
+        mock_operation_get = (
+            service_mock.projects.return_value.databases.return_value.operations.return_value.get
+        )
+        (
+            mock_export_documents.return_value
+            .execute.return_value
+        ) = TEST_OPERATION
+
+        (
+            mock_operation_get.return_value.execute.return_value
+        ) = TEST_DONE_OPERATION
+
+        self.hook.export_documents(body=EXPORT_DOCUMENT_BODY, project_id=TEST_PROJECT_ID)
+
+        mock_export_documents.assert_called_once_with(
+            body=EXPORT_DOCUMENT_BODY, name='projects/firestore--project-id/databases/(default)'
+        )
+
+    @mock.patch("airflow.providers.google.firebase.hooks.firestore.CloudFirestoreHook.get_conn")
+    @mock.patch("airflow.providers.google.firebase.hooks.firestore.time.sleep")
+    def test_waiting_operation(self, _, get_conn_mock):
+        service_mock = get_conn_mock.return_value
+
+        mock_export_documents = service_mock.projects.return_value.databases.return_value.exportDocuments
+        mock_operation_get = (
+            service_mock.projects.return_value.databases.return_value.operations.return_value.get
+        )
+        (
+            mock_export_documents.return_value
+            .execute.return_value
+        ) = TEST_OPERATION
+
+        execute_mock = mock.Mock(
+            **{"side_effect": [TEST_WAITING_OPERATION, TEST_DONE_OPERATION, TEST_DONE_OPERATION]}
+        )
+        mock_operation_get.return_value.execute = execute_mock
+
+        self.hook.export_documents(body=EXPORT_DOCUMENT_BODY, project_id=TEST_PROJECT_ID)
+
+        mock_export_documents.assert_called_once_with(
+            body=EXPORT_DOCUMENT_BODY, name='projects/firestore--project-id/databases/(default)'
+        )
+
+    @mock.patch("airflow.providers.google.firebase.hooks.firestore.CloudFirestoreHook.get_conn")
+    @mock.patch("airflow.providers.google.firebase.hooks.firestore.time.sleep")
+    def test_error_operation(self, _, get_conn_mock):
+        service_mock = get_conn_mock.return_value
+
+        mock_export_documents = service_mock.projects.return_value.databases.return_value.exportDocuments
+        mock_operation_get = (
+            service_mock.projects.return_value.databases.return_value.operations.return_value.get
+        )
+        (
+            mock_export_documents.return_value
+            .execute.return_value
+        ) = TEST_OPERATION
+
+        execute_mock = mock.Mock(**{"side_effect": [TEST_WAITING_OPERATION, TEST_ERROR_OPERATION]})
+        mock_operation_get.return_value.execute = execute_mock
+        with self.assertRaisesRegex(AirflowException, "error"):
+            self.hook.export_documents(body=EXPORT_DOCUMENT_BODY, project_id=TEST_PROJECT_ID)
+
+
+class TestGcpComputeHookWithDefaultProjectIdFromConnection(unittest.TestCase):
 
 Review comment:
   ```suggestion
   class TestCloudFirestoreHookWithDefaultProjectIdFromConnection(unittest.TestCase):
   ```
