Posted to commits@airflow.apache.org by "shaniyaclement (via GitHub)" <gi...@apache.org> on 2023/07/18 18:47:10 UTC

[GitHub] [airflow] shaniyaclement opened a new pull request, #32678: For Google Host Review

shaniyaclement opened a new pull request, #32678:
URL: https://github.com/apache/airflow/pull/32678

   <!--
    Licensed to the Apache Software Foundation (ASF) under one
    or more contributor license agreements.  See the NOTICE file
    distributed with this work for additional information
    regarding copyright ownership.  The ASF licenses this file
    to you under the Apache License, Version 2.0 (the
    "License"); you may not use this file except in compliance
    with the License.  You may obtain a copy of the License at
   
      http://www.apache.org/licenses/LICENSE-2.0
   
    Unless required by applicable law or agreed to in writing,
    software distributed under the License is distributed on an
    "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
    KIND, either express or implied.  See the License for the
    specific language governing permissions and limitations
    under the License.
    -->
   
   <!--
   Thank you for contributing! Please make sure that your code changes
   are covered with tests. And in case of new features or big changes
   remember to adjust the documentation.
   
   Feel free to ping committers for the review!
   
   In case of an existing issue, reference it using one of the following:
   
   closes: #ISSUE
   related: #ISSUE
   
   How to write a good git commit message:
   http://chris.beams.io/posts/git-commit/
   -->
   
   ---
   **^ Add meaningful description above**
   
   Read the **[Pull Request Guidelines](https://github.com/apache/airflow/blob/main/CONTRIBUTING.rst#pull-request-guidelines)** for more information.
   In case of fundamental code changes, an Airflow Improvement Proposal ([AIP](https://cwiki.apache.org/confluence/display/AIRFLOW/Airflow+Improvement+Proposals)) is needed.
   In case of a new dependency, check compliance with the [ASF 3rd Party License Policy](https://www.apache.org/legal/resolved.html#category-x).
   In case of backwards incompatible changes please leave a note in a newsfragment file, named `{pr_number}.significant.rst` or `{issue_number}.significant.rst`, in [newsfragments](https://github.com/apache/airflow/tree/main/newsfragments).
   


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: commits-unsubscribe@airflow.apache.org

For queries about this service, please contact Infrastructure at:
users@infra.apache.org


[GitHub] [airflow] shaniyaclement closed pull request #32678: For Google Host Review

Posted by "shaniyaclement (via GitHub)" <gi...@apache.org>.
shaniyaclement closed pull request #32678: For Google Host Review
URL: https://github.com/apache/airflow/pull/32678




[GitHub] [airflow] xianhualiu commented on a diff in pull request #32678: For Google Host Review

Posted by "xianhualiu (via GitHub)" <gi...@apache.org>.
xianhualiu commented on code in PR #32678:
URL: https://github.com/apache/airflow/pull/32678#discussion_r1267253020


##########
tests/providers/google/cloud/operators/test_datapipeline.py:
##########
@@ -0,0 +1,72 @@
+#
+# Licensed to the Apache Software Foundation (ASF) under one
+# or more contributor license agreements.  See the NOTICE file
+# distributed with this work for additional information
+# regarding copyright ownership.  The ASF licenses this file
+# to you under the Apache License, Version 2.0 (the
+# "License"); you may not use this file except in compliance
+# with the License.  You may obtain a copy of the License at
+#
+#   http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing,
+# software distributed under the License is distributed on an
+# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+# KIND, either express or implied.  See the License for the
+# specific language governing permissions and limitations
+# under the License.
+from __future__ import annotations
+
+from unittest import mock
+from unittest.mock import MagicMock
+
+import pytest as pytest
+
+import airflow
+from airflow.exceptions import AirflowException, AirflowProviderDeprecationWarning
+from airflow.providers.google.cloud.operators.datapipeline import (
+    CreateDataPipelineOperator,
+    RunDataPipelineOperator,
+)
+from airflow.providers.google.cloud.hooks.datapipeline import DataPipelineHook
+from airflow.version import version
+
+TASK_ID = "test-datapipeline-operators"
+TEST_BODY = {
+    "name": "projects/dataflow-interns/locations/us-central1/pipelines/dp-create-1642676351302-mp--1675461000",
+            "type": "PIPELINE_TYPE_BATCH",
+            "workload": {
+                "dataflowFlexTemplateRequest": {
+                "launchParameter": {
+                    "containerSpecGcsPath": "gs://intern-bucket-1/templates/word-count.json",
+                    "jobName": "word-count-test-intern1",
+                    "environment": {
+                    "tempLocation": "gs://intern-bucket-1/temp"
+                    },
+                    "parameters": {
+                    "inputFile": "gs://intern-bucket-1/examples/kinglear.txt",

Review Comment:
   please use more permanent gs:// paths like https://github.com/apache/airflow/blob/375d2fabcc2cef4b17d2129b557ac6fcb6c65067/tests/providers/google/cloud/operators/test_dataflow.py#L42C3-L42C3
   
   For example:
       "containerSpecGcsPath": "gs://dataflow-templates-us-central1/latest/Word_Count_metadata",
       "inputFile": "gs://dataflow-samples/shakespeare/kinglear.txt",
       "output": "gs://test/output/my_output",





[GitHub] [airflow] shaniyaclement commented on a diff in pull request #32678: For Google Host Review

Posted by "shaniyaclement (via GitHub)" <gi...@apache.org>.
shaniyaclement commented on code in PR #32678:
URL: https://github.com/apache/airflow/pull/32678#discussion_r1267351081


##########
airflow/providers/google/cloud/operators/datapipeline.py:
##########
@@ -0,0 +1,139 @@
+
+#
+# Licensed to the Apache Software Foundation (ASF) under one
+# or more contributor license agreements.  See the NOTICE file
+# distributed with this work for additional information
+# regarding copyright ownership.  The ASF licenses this file
+# to you under the Apache License, Version 2.0 (the
+# "License"); you may not use this file except in compliance
+# with the License.  You may obtain a copy of the License at
+#
+#   http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing,
+# software distributed under the License is distributed on an
+# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+# KIND, either express or implied.  See the License for the
+# specific language governing permissions and limitations
+# under the License.
+"""This module contains Google DataPipeline operators."""
+from __future__ import annotations
+
+import copy
+import re
+import uuid
+import warnings
+from contextlib import ExitStack
+from enum import Enum
+from functools import cached_property
+from typing import TYPE_CHECKING, Any, Sequence
+
+from airflow import AirflowException
+from airflow.exceptions import AirflowProviderDeprecationWarning
+from airflow.providers.google.cloud.hooks.datapipeline import (
+    DEFAULT_DATAPIPELINE_LOCATION,
+    DataPipelineHook
+)
+from airflow.providers.google.cloud.hooks.gcs import GCSHook
+from airflow.providers.google.cloud.operators.cloud_base import GoogleCloudBaseOperator
+from airflow.version import version
+
+
+class CreateDataPipelineOperator(GoogleCloudBaseOperator):
+    """ 
+    Creates a new Data Pipeline instance from the Data Pipeline API.
+
+    :param body: The request body (contains instance of Pipeline). See:
+        https://cloud.google.com/dataflow/docs/reference/data-pipelines/rest/v1/projects.locations.pipelines/create#request-body
+    :param project_id: The ID of the GCP project that owns the job.
+    :param location: The location to direct the Data Pipeline instance to (example_dags uses uscentral-1).
+    :param gcp_conn_id: The connection ID to connect to the Google Cloud
+        Platform.
+
+    Returns the created Pipeline instance in JSON representation.
+    """
+    def __init__(
+        self,
+        *,
+        body: dict,
+        project_id: str | None = None,
+        location: str = DEFAULT_DATAPIPELINE_LOCATION,
+        gcp_conn_id: str = "google_cloud_default",
+        **kwargs,
+    ) -> None:
+        super().__init__(**kwargs)
+
+        self.body = body
+        self.project_id = project_id
+        self.location = location
+        self.gcp_conn_id = gcp_conn_id
+        self.datapipeline_hook : DataPipelineHook | None = None
+        self.body["pipelineSources"] = {"airflow":"airflow"}
+
+    def execute(self, context: Context):
+        self.datapipeline_hook = DataPipelineHook(
+            gcp_conn_id=self.gcp_conn_id
+        )
+
+        self.data_pipeline = self.datapipeline_hook.create_data_pipeline(
+            project_id = self.project_id,
+            body = self.body,
+            location = self.location,
+        )
+        self.log.info("Response Body: ", self.data_pipeline)
+

Review Comment:
   done





[GitHub] [airflow] xianhualiu commented on a diff in pull request #32678: For Google Host Review

Posted by "xianhualiu (via GitHub)" <gi...@apache.org>.
xianhualiu commented on code in PR #32678:
URL: https://github.com/apache/airflow/pull/32678#discussion_r1267236839


##########
airflow/providers/google/cloud/hooks/datapipeline.py:
##########
@@ -0,0 +1,129 @@
+#
+# Licensed to the Apache Software Foundation (ASF) under one
+# or more contributor license agreements.  See the NOTICE file
+# distributed with this work for additional information
+# regarding copyright ownership.  The ASF licenses this file
+# to you under the Apache License, Version 2.0 (the
+# "License"); you may not use this file except in compliance
+# with the License.  You may obtain a copy of the License at
+#
+#   http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing,
+# software distributed under the License is distributed on an
+# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+# KIND, either express or implied.  See the License for the
+# specific language governing permissions and limitations
+# under the License.
+"""This module contains a Google DataPipeline Hook."""
+from __future__ import annotations
+
+import functools
+import json
+import re
+import shlex
+import subprocess
+import time
+import uuid
+import warnings
+import urllib.parse
+from copy import deepcopy
+
+from googleapiclient.discovery import build
+
+from airflow.providers.google.common.hooks.base_google import (
+    GoogleBaseHook,
+)
+from airflow.utils.log.logging_mixin import LoggingMixin
+from airflow.utils.timeout import timeout
+
+# This is the default location
+# https://cloud.google.com/dataflow/pipelines/specifying-exec-params
+DEFAULT_DATAPIPELINE_LOCATION = "us-central1"
+
+
+class DataPipelineHook(GoogleBaseHook):
+    """
+    Hook for Google DataPipeline.
+    All the methods in the hook where project_id is used must be called with
+    keyword arguments rather than positional.
+    """
+
+    def __init__(
+        self,
+        gcp_conn_id: str = "google_cloud_default",
+        impersonation_chain: str | Sequence[str] | None = None,
+        **kwargs,
+    ) -> None:
+        super().__init__(
+            gcp_conn_id=gcp_conn_id,
+            impersonation_chain=impersonation_chain,
+        )
+
+    def get_conn(self) -> build:
+        """Returns a Google Cloud DataPipeline service object."""
+        http_authorized = self._authorize()
+        return build("datapipelines", "v1", http=http_authorized, cache_discovery=False)
+
+    @GoogleBaseHook.fallback_to_default_project_id
+    def create_data_pipeline(
+        self,
+        body: dict,
+        project_id: str,
+        location: str = DEFAULT_DATAPIPELINE_LOCATION,
+    ) -> None:
+        """
+        Creates a new Data Pipeline instance from the Data Pipeline API.
+
+        :param body: The request body (contains instance of Pipeline). See:
+            https://cloud.google.com/dataflow/docs/reference/data-pipelines/rest/v1/projects.locations.pipelines/create#request-body
+        :param project_id: The ID of the GCP project that owns the job.
+        :param location: The location to direct the Data Pipeline instance to (example_dags uses uscentral-1).
+        
+        Returns the created Pipeline instance in JSON representation.
+        """
+        
+        parent = self.build_parent_name(project_id, location)
+        service = self.get_conn()
+        print(dir(service.projects().locations()))
+        request = (
+            service.projects().locations().pipelines().create(
+                parent = parent,
+                body = body,
+            )
+        )
+        response = request.execute(num_retries=self.num_retries)
+        return response
+
+    @staticmethod
+    def build_parent_name(project_id: str, location: str):
+        return f"projects/{project_id}/locations/{location}"
+    
+    @GoogleBaseHook.fallback_to_default_project_id
+    def run_data_pipeline(
+        self,
+        data_pipeline_name: str,
+        project_id: str,
+        location: str = DEFAULT_DATAPIPELINE_LOCATION,
+    ) -> None:
+        """
+        Runs a Data Pipeline Instance using the Data Pipeline API 
+
+        :param data_pipeline_name:  The display name of the pipeline. In example 
+            projects/PROJECT_ID/locations/LOCATION_ID/pipelines/PIPELINE_ID it would be the PIPELINE_ID.
+        :param project_id: The ID of the GCP project that owns the job.
+        :param location: The location of the Data Pipeline instance to (example_dags uses uscentral-1).
+        
+        Returns the created Job in JSON representation.
+        """
+        parent = self.build_parent_name(project_id, location)
+        service = self.get_conn()
+        print(dir(service.projects().locations()))

Review Comment:
   replace print with self.log.info
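   For example, a minimal sketch of what the replacement could look like (self.log is already available on the hook through its logging mixin, so no extra import should be needed):
   
       self.log.info("Available pipeline methods: %s", dir(service.projects().locations()))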





[GitHub] [airflow] xianhualiu commented on a diff in pull request #32678: For Google Host Review

Posted by "xianhualiu (via GitHub)" <gi...@apache.org>.
xianhualiu commented on code in PR #32678:
URL: https://github.com/apache/airflow/pull/32678#discussion_r1267235377


##########
airflow/providers/google/cloud/hooks/datapipeline.py:
##########
@@ -0,0 +1,129 @@
+#
+# Licensed to the Apache Software Foundation (ASF) under one
+# or more contributor license agreements.  See the NOTICE file
+# distributed with this work for additional information
+# regarding copyright ownership.  The ASF licenses this file
+# to you under the Apache License, Version 2.0 (the
+# "License"); you may not use this file except in compliance
+# with the License.  You may obtain a copy of the License at
+#
+#   http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing,
+# software distributed under the License is distributed on an
+# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+# KIND, either express or implied.  See the License for the
+# specific language governing permissions and limitations
+# under the License.
+"""This module contains a Google DataPipeline Hook."""
+from __future__ import annotations
+
+import functools
+import json
+import re
+import shlex
+import subprocess
+import time
+import uuid
+import warnings
+import urllib.parse
+from copy import deepcopy
+
+from googleapiclient.discovery import build
+
+from airflow.providers.google.common.hooks.base_google import (
+    GoogleBaseHook,
+)
+from airflow.utils.log.logging_mixin import LoggingMixin
+from airflow.utils.timeout import timeout
+
+# This is the default location
+# https://cloud.google.com/dataflow/pipelines/specifying-exec-params
+DEFAULT_DATAPIPELINE_LOCATION = "us-central1"
+
+
+class DataPipelineHook(GoogleBaseHook):
+    """
+    Hook for Google DataPipeline.
+    All the methods in the hook where project_id is used must be called with
+    keyword arguments rather than positional.
+    """
+
+    def __init__(
+        self,
+        gcp_conn_id: str = "google_cloud_default",
+        impersonation_chain: str | Sequence[str] | None = None,
+        **kwargs,
+    ) -> None:
+        super().__init__(
+            gcp_conn_id=gcp_conn_id,
+            impersonation_chain=impersonation_chain,
+        )
+
+    def get_conn(self) -> build:
+        """Returns a Google Cloud DataPipeline service object."""
+        http_authorized = self._authorize()
+        return build("datapipelines", "v1", http=http_authorized, cache_discovery=False)
+
+    @GoogleBaseHook.fallback_to_default_project_id
+    def create_data_pipeline(
+        self,
+        body: dict,
+        project_id: str,
+        location: str = DEFAULT_DATAPIPELINE_LOCATION,
+    ) -> None:
+        """
+        Creates a new Data Pipeline instance from the Data Pipeline API.
+
+        :param body: The request body (contains instance of Pipeline). See:
+            https://cloud.google.com/dataflow/docs/reference/data-pipelines/rest/v1/projects.locations.pipelines/create#request-body
+        :param project_id: The ID of the GCP project that owns the job.
+        :param location: The location to direct the Data Pipeline instance to (example_dags uses uscentral-1).
+        
+        Returns the created Pipeline instance in JSON representation.
+        """
+        
+        parent = self.build_parent_name(project_id, location)
+        service = self.get_conn()
+        print(dir(service.projects().locations()))

Review Comment:
   replace print with self.log.info





[GitHub] [airflow] xianhualiu commented on a diff in pull request #32678: For Google Host Review

Posted by "xianhualiu (via GitHub)" <gi...@apache.org>.
xianhualiu commented on code in PR #32678:
URL: https://github.com/apache/airflow/pull/32678#discussion_r1267245584


##########
airflow/providers/google/cloud/operators/datapipeline.py:
##########
@@ -0,0 +1,139 @@
+
+#
+# Licensed to the Apache Software Foundation (ASF) under one
+# or more contributor license agreements.  See the NOTICE file
+# distributed with this work for additional information
+# regarding copyright ownership.  The ASF licenses this file
+# to you under the Apache License, Version 2.0 (the
+# "License"); you may not use this file except in compliance
+# with the License.  You may obtain a copy of the License at
+#
+#   http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing,
+# software distributed under the License is distributed on an
+# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+# KIND, either express or implied.  See the License for the
+# specific language governing permissions and limitations
+# under the License.
+"""This module contains Google DataPipeline operators."""
+from __future__ import annotations
+
+import copy
+import re
+import uuid
+import warnings
+from contextlib import ExitStack
+from enum import Enum
+from functools import cached_property
+from typing import TYPE_CHECKING, Any, Sequence
+
+from airflow import AirflowException
+from airflow.exceptions import AirflowProviderDeprecationWarning
+from airflow.providers.google.cloud.hooks.datapipeline import (
+    DEFAULT_DATAPIPELINE_LOCATION,
+    DataPipelineHook
+)
+from airflow.providers.google.cloud.hooks.gcs import GCSHook
+from airflow.providers.google.cloud.operators.cloud_base import GoogleCloudBaseOperator
+from airflow.version import version
+
+
+class CreateDataPipelineOperator(GoogleCloudBaseOperator):
+    """ 
+    Creates a new Data Pipeline instance from the Data Pipeline API.
+
+    :param body: The request body (contains instance of Pipeline). See:
+        https://cloud.google.com/dataflow/docs/reference/data-pipelines/rest/v1/projects.locations.pipelines/create#request-body
+    :param project_id: The ID of the GCP project that owns the job.
+    :param location: The location to direct the Data Pipeline instance to (example_dags uses uscentral-1).
+    :param gcp_conn_id: The connection ID to connect to the Google Cloud
+        Platform.
+
+    Returns the created Pipeline instance in JSON representation.
+    """
+    def __init__(
+        self,
+        *,
+        body: dict,
+        project_id: str | None = None,
+        location: str = DEFAULT_DATAPIPELINE_LOCATION,
+        gcp_conn_id: str = "google_cloud_default",
+        **kwargs,
+    ) -> None:
+        super().__init__(**kwargs)
+
+        self.body = body
+        self.project_id = project_id
+        self.location = location
+        self.gcp_conn_id = gcp_conn_id
+        self.datapipeline_hook : DataPipelineHook | None = None
+        self.body["pipelineSources"] = {"airflow":"airflow"}
+
+    def execute(self, context: Context):
+        self.datapipeline_hook = DataPipelineHook(
+            gcp_conn_id=self.gcp_conn_id
+        )
+
+        self.data_pipeline = self.datapipeline_hook.create_data_pipeline(
+            project_id = self.project_id,
+            body = self.body,
+            location = self.location,
+        )
+        self.log.info("Response Body: ", self.data_pipeline)
+

Review Comment:
   please add failure handling logic based on the response. If the response contains an "error" field, an AirflowException needs to be raised with the "message" of that "error".
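   One possible shape for that check, assuming create_data_pipeline returns the raw JSON response as a dict (a sketch only, the exact field handling may differ):
   
       if self.data_pipeline is None:
           raise AirflowException("Data Pipeline creation returned no response.")
       if "error" in self.data_pipeline:
           raise AirflowException(self.data_pipeline["error"].get("message", "Unknown error"))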





[GitHub] [airflow] manavgarg commented on pull request #32678: For Google Host Review

Posted by "manavgarg (via GitHub)" <gi...@apache.org>.
manavgarg commented on PR #32678:
URL: https://github.com/apache/airflow/pull/32678#issuecomment-1657231232

   I believe this is now superseded by #32843 and #32846?




[GitHub] [airflow] xianhualiu commented on a diff in pull request #32678: For Google Host Review

Posted by "xianhualiu (via GitHub)" <gi...@apache.org>.
xianhualiu commented on code in PR #32678:
URL: https://github.com/apache/airflow/pull/32678#discussion_r1267246541


##########
airflow/providers/google/cloud/operators/datapipeline.py:
##########
@@ -0,0 +1,139 @@
+
+#
+# Licensed to the Apache Software Foundation (ASF) under one
+# or more contributor license agreements.  See the NOTICE file
+# distributed with this work for additional information
+# regarding copyright ownership.  The ASF licenses this file
+# to you under the Apache License, Version 2.0 (the
+# "License"); you may not use this file except in compliance
+# with the License.  You may obtain a copy of the License at
+#
+#   http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing,
+# software distributed under the License is distributed on an
+# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+# KIND, either express or implied.  See the License for the
+# specific language governing permissions and limitations
+# under the License.
+"""This module contains Google DataPipeline operators."""
+from __future__ import annotations
+
+import copy
+import re
+import uuid
+import warnings
+from contextlib import ExitStack
+from enum import Enum
+from functools import cached_property
+from typing import TYPE_CHECKING, Any, Sequence
+
+from airflow import AirflowException
+from airflow.exceptions import AirflowProviderDeprecationWarning
+from airflow.providers.google.cloud.hooks.datapipeline import (
+    DEFAULT_DATAPIPELINE_LOCATION,
+    DataPipelineHook
+)
+from airflow.providers.google.cloud.hooks.gcs import GCSHook
+from airflow.providers.google.cloud.operators.cloud_base import GoogleCloudBaseOperator
+from airflow.version import version
+
+
+class CreateDataPipelineOperator(GoogleCloudBaseOperator):
+    """ 
+    Creates a new Data Pipeline instance from the Data Pipeline API.
+
+    :param body: The request body (contains instance of Pipeline). See:
+        https://cloud.google.com/dataflow/docs/reference/data-pipelines/rest/v1/projects.locations.pipelines/create#request-body
+    :param project_id: The ID of the GCP project that owns the job.
+    :param location: The location to direct the Data Pipeline instance to (example_dags uses uscentral-1).
+    :param gcp_conn_id: The connection ID to connect to the Google Cloud
+        Platform.
+
+    Returns the created Pipeline instance in JSON representation.
+    """
+    def __init__(
+        self,
+        *,
+        body: dict,
+        project_id: str | None = None,
+        location: str = DEFAULT_DATAPIPELINE_LOCATION,
+        gcp_conn_id: str = "google_cloud_default",
+        **kwargs,
+    ) -> None:
+        super().__init__(**kwargs)
+
+        self.body = body
+        self.project_id = project_id
+        self.location = location
+        self.gcp_conn_id = gcp_conn_id
+        self.datapipeline_hook : DataPipelineHook | None = None
+        self.body["pipelineSources"] = {"airflow":"airflow"}
+
+    def execute(self, context: Context):
+        self.datapipeline_hook = DataPipelineHook(
+            gcp_conn_id=self.gcp_conn_id
+        )
+
+        self.data_pipeline = self.datapipeline_hook.create_data_pipeline(
+            project_id = self.project_id,
+            body = self.body,
+            location = self.location,
+        )
+        self.log.info("Response Body: ", self.data_pipeline)
+
+        # returns the full response body
+        return self.data_pipeline
+
+
+class RunDataPipelineOperator(GoogleCloudBaseOperator):
+    """ 
+    Runs a Data Pipeline Instance using the Data Pipeline API 
+
+    :param data_pipeline_name:  The display name of the pipeline. In example 
+        projects/PROJECT_ID/locations/LOCATION_ID/pipelines/PIPELINE_ID it would be the PIPELINE_ID.
+    :param project_id: The ID of the GCP project that owns the job.
+    :param location: The location of the Data Pipeline instance to (example_dags uses uscentral-1).
+    :param gcp_conn_id: The connection ID to connect to the Google Cloud
+        Platform.
+
+    Returns the created Job in JSON representation.
+    """
+    def __init__(
+            self,
+            data_pipeline_name: str,
+            project_id: str | None = None,
+            location: str = DEFAULT_DATAPIPELINE_LOCATION,
+            gcp_conn_id: str = "google_cloud_default",
+            **kwargs
+    ) -> None:
+        super().__init__(**kwargs)
+
+        self.data_pipeline_name = data_pipeline_name
+        self.project_id = project_id
+        self.location = location
+        self.gcp_conn_id =  gcp_conn_id
+
+    def execute(self, context: Context):
+        self.data_pipeline_hook = DataPipelineHook(gcp_conn_id=self.gcp_conn_id)
+
+        if self.data_pipeline_name is None:
+            raise AirflowException(
+                "Data Pipeline name not given; cannot run unspecified pipeline."
+            )
+        if self.project_id is None:
+            raise AirflowException(
+                "Project ID not given; cannot run pipeline."
+            )
+        if self.location is None:
+            raise AirflowException(
+                "Pipeline location not given; cannot run pipeline."
+            )
+
+        self.response = self.data_pipeline_hook.run_data_pipeline(
+            data_pipeline_name = self.data_pipeline_name,
+            project_id = self.project_id,
+            location = self.location,
+        )
+

Review Comment:
   please add failure handling logic based on the response. If the response contains an "error" field, an AirflowException needs to be raised with the "message" of that "error".
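   One possible shape for that check here as well, assuming run_data_pipeline returns the raw JSON response as a dict (a sketch only, the exact field handling may differ):
   
       if self.response is None:
           raise AirflowException("Data Pipeline run returned no response.")
       if "error" in self.response:
           raise AirflowException(self.response["error"].get("message", "Unknown error"))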





[GitHub] [airflow] manavgarg commented on a diff in pull request #32678: For Google Host Review

Posted by "manavgarg (via GitHub)" <gi...@apache.org>.
manavgarg commented on code in PR #32678:
URL: https://github.com/apache/airflow/pull/32678#discussion_r1267366681


##########
airflow/providers/google/cloud/hooks/datapipeline.py:
##########
@@ -0,0 +1,129 @@
+#
+# Licensed to the Apache Software Foundation (ASF) under one
+# or more contributor license agreements.  See the NOTICE file
+# distributed with this work for additional information
+# regarding copyright ownership.  The ASF licenses this file
+# to you under the Apache License, Version 2.0 (the
+# "License"); you may not use this file except in compliance
+# with the License.  You may obtain a copy of the License at
+#
+#   http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing,
+# software distributed under the License is distributed on an
+# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+# KIND, either express or implied.  See the License for the
+# specific language governing permissions and limitations
+# under the License.
+"""This module contains a Google DataPipeline Hook."""
+from __future__ import annotations
+
+import functools
+import json
+import re
+import shlex
+import subprocess
+import time
+import uuid
+import warnings
+import urllib.parse
+from copy import deepcopy
+
+from googleapiclient.discovery import build
+
+from airflow.providers.google.common.hooks.base_google import (
+    GoogleBaseHook,
+)
+from airflow.utils.log.logging_mixin import LoggingMixin
+from airflow.utils.timeout import timeout
+
+# This is the default location
+# https://cloud.google.com/dataflow/pipelines/specifying-exec-params
+DEFAULT_DATAPIPELINE_LOCATION = "us-central1"
+
+
+class DataPipelineHook(GoogleBaseHook):
+    """
+    Hook for Google DataPipeline.
+    All the methods in the hook where project_id is used must be called with
+    keyword arguments rather than positional.
+    """
+
+    def __init__(
+        self,
+        gcp_conn_id: str = "google_cloud_default",
+        impersonation_chain: str | Sequence[str] | None = None,
+        **kwargs,
+    ) -> None:
+        super().__init__(
+            gcp_conn_id=gcp_conn_id,
+            impersonation_chain=impersonation_chain,
+        )
+
+    def get_conn(self) -> build:
+        """Returns a Google Cloud DataPipeline service object."""
+        http_authorized = self._authorize()
+        return build("datapipelines", "v1", http=http_authorized, cache_discovery=False)
+
+    @GoogleBaseHook.fallback_to_default_project_id
+    def create_data_pipeline(
+        self,
+        body: dict,
+        project_id: str,
+        location: str = DEFAULT_DATAPIPELINE_LOCATION,
+    ) -> None:
+        """
+        Creates a new Data Pipeline instance from the Data Pipeline API.
+
+        :param body: The request body (contains instance of Pipeline). See:
+            https://cloud.google.com/dataflow/docs/reference/data-pipelines/rest/v1/projects.locations.pipelines/create#request-body
+        :param project_id: The ID of the GCP project that owns the job.
+        :param location: The location to direct the Data Pipeline instance to (example_dags uses uscentral-1).

Review Comment:
   nit: Pipeline -> Pipelines



##########
airflow/providers/google/cloud/hooks/datapipeline.py:
##########
@@ -0,0 +1,129 @@
+#
+# Licensed to the Apache Software Foundation (ASF) under one
+# or more contributor license agreements.  See the NOTICE file
+# distributed with this work for additional information
+# regarding copyright ownership.  The ASF licenses this file
+# to you under the Apache License, Version 2.0 (the
+# "License"); you may not use this file except in compliance
+# with the License.  You may obtain a copy of the License at
+#
+#   http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing,
+# software distributed under the License is distributed on an
+# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+# KIND, either express or implied.  See the License for the
+# specific language governing permissions and limitations
+# under the License.
+"""This module contains a Google DataPipeline Hook."""
+from __future__ import annotations
+
+import functools
+import json
+import re
+import shlex
+import subprocess
+import time
+import uuid
+import warnings
+import urllib.parse
+from copy import deepcopy
+
+from googleapiclient.discovery import build
+
+from airflow.providers.google.common.hooks.base_google import (
+    GoogleBaseHook,
+)
+from airflow.utils.log.logging_mixin import LoggingMixin
+from airflow.utils.timeout import timeout
+
+# This is the default location
+# https://cloud.google.com/dataflow/pipelines/specifying-exec-params
+DEFAULT_DATAPIPELINE_LOCATION = "us-central1"
+
+
+class DataPipelineHook(GoogleBaseHook):
+    """
+    Hook for Google DataPipeline.

Review Comment:
   nit (reword): Hook for Google Cloud Data Pipelines.



##########
airflow/providers/google/cloud/operators/datapipeline.py:
##########
@@ -0,0 +1,139 @@
+
+#
+# Licensed to the Apache Software Foundation (ASF) under one
+# or more contributor license agreements.  See the NOTICE file
+# distributed with this work for additional information
+# regarding copyright ownership.  The ASF licenses this file
+# to you under the Apache License, Version 2.0 (the
+# "License"); you may not use this file except in compliance
+# with the License.  You may obtain a copy of the License at
+#
+#   http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing,
+# software distributed under the License is distributed on an
+# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+# KIND, either express or implied.  See the License for the
+# specific language governing permissions and limitations
+# under the License.
+"""This module contains Google DataPipeline operators."""
+from __future__ import annotations
+
+import copy
+import re
+import uuid
+import warnings
+from contextlib import ExitStack
+from enum import Enum
+from functools import cached_property
+from typing import TYPE_CHECKING, Any, Sequence
+
+from airflow import AirflowException
+from airflow.exceptions import AirflowProviderDeprecationWarning
+from airflow.providers.google.cloud.hooks.datapipeline import (
+    DEFAULT_DATAPIPELINE_LOCATION,
+    DataPipelineHook
+)
+from airflow.providers.google.cloud.hooks.gcs import GCSHook
+from airflow.providers.google.cloud.operators.cloud_base import GoogleCloudBaseOperator
+from airflow.version import version
+
+
+class CreateDataPipelineOperator(GoogleCloudBaseOperator):
+    """ 
+    Creates a new Data Pipeline instance from the Data Pipeline API.
+
+    :param body: The request body (contains instance of Pipeline). See:
+        https://cloud.google.com/dataflow/docs/reference/data-pipelines/rest/v1/projects.locations.pipelines/create#request-body
+    :param project_id: The ID of the GCP project that owns the job.
+    :param location: The location to direct the Data Pipeline instance to (example_dags uses uscentral-1).
+    :param gcp_conn_id: The connection ID to connect to the Google Cloud
+        Platform.
+
+    Returns the created Pipeline instance in JSON representation.
+    """
+    def __init__(
+        self,
+        *,
+        body: dict,
+        project_id: str | None = None,
+        location: str = DEFAULT_DATAPIPELINE_LOCATION,
+        gcp_conn_id: str = "google_cloud_default",
+        **kwargs,
+    ) -> None:
+        super().__init__(**kwargs)
+
+        self.body = body
+        self.project_id = project_id
+        self.location = location
+        self.gcp_conn_id = gcp_conn_id
+        self.datapipeline_hook : DataPipelineHook | None = None
+        self.body["pipelineSources"] = {"airflow":"airflow"}
+
+    def execute(self, context: Context):
+        self.datapipeline_hook = DataPipelineHook(
+            gcp_conn_id=self.gcp_conn_id
+        )
+
+        self.data_pipeline = self.datapipeline_hook.create_data_pipeline(
+            project_id = self.project_id,
+            body = self.body,
+            location = self.location,
+        )
+        self.log.info("Response Body: ", self.data_pipeline)
+
+        # returns the full response body

Review Comment:
   remove the comment.



##########
airflow/providers/google/cloud/hooks/datapipeline.py:
##########
@@ -0,0 +1,129 @@
+#
+# Licensed to the Apache Software Foundation (ASF) under one
+# or more contributor license agreements.  See the NOTICE file
+# distributed with this work for additional information
+# regarding copyright ownership.  The ASF licenses this file
+# to you under the Apache License, Version 2.0 (the
+# "License"); you may not use this file except in compliance
+# with the License.  You may obtain a copy of the License at
+#
+#   http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing,
+# software distributed under the License is distributed on an
+# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+# KIND, either express or implied.  See the License for the
+# specific language governing permissions and limitations
+# under the License.
+"""This module contains a Google DataPipeline Hook."""
+from __future__ import annotations
+
+import functools
+import json
+import re
+import shlex
+import subprocess
+import time
+import uuid
+import warnings
+import urllib.parse
+from copy import deepcopy
+
+from googleapiclient.discovery import build
+
+from airflow.providers.google.common.hooks.base_google import (
+    GoogleBaseHook,
+)
+from airflow.utils.log.logging_mixin import LoggingMixin
+from airflow.utils.timeout import timeout
+
+# This is the default location
+# https://cloud.google.com/dataflow/pipelines/specifying-exec-params
+DEFAULT_DATAPIPELINE_LOCATION = "us-central1"
+
+
+class DataPipelineHook(GoogleBaseHook):
+    """
+    Hook for Google DataPipeline.
+    All the methods in the hook where project_id is used must be called with
+    keyword arguments rather than positional.
+    """
+
+    def __init__(
+        self,
+        gcp_conn_id: str = "google_cloud_default",
+        impersonation_chain: str | Sequence[str] | None = None,
+        **kwargs,
+    ) -> None:
+        super().__init__(
+            gcp_conn_id=gcp_conn_id,
+            impersonation_chain=impersonation_chain,
+        )
+
+    def get_conn(self) -> build:
+        """Returns a Google Cloud DataPipeline service object."""
+        http_authorized = self._authorize()
+        return build("datapipelines", "v1", http=http_authorized, cache_discovery=False)
+
+    @GoogleBaseHook.fallback_to_default_project_id
+    def create_data_pipeline(
+        self,
+        body: dict,
+        project_id: str,
+        location: str = DEFAULT_DATAPIPELINE_LOCATION,
+    ) -> None:
+        """
+        Creates a new Data Pipeline instance from the Data Pipeline API.
+
+        :param body: The request body (contains instance of Pipeline). See:
+            https://cloud.google.com/dataflow/docs/reference/data-pipelines/rest/v1/projects.locations.pipelines/create#request-body
+        :param project_id: The ID of the GCP project that owns the job.
+        :param location: The location to direct the Data Pipeline instance to (example_dags uses uscentral-1).
+        
+        Returns the created Pipeline instance in JSON representation.
+        """
+        
+        parent = self.build_parent_name(project_id, location)
+        service = self.get_conn()
+        print(dir(service.projects().locations()))
+        request = (
+            service.projects().locations().pipelines().create(
+                parent = parent,
+                body = body,
+            )
+        )
+        response = request.execute(num_retries=self.num_retries)
+        return response
+
+    @staticmethod
+    def build_parent_name(project_id: str, location: str):
+        return f"projects/{project_id}/locations/{location}"
+    
+    @GoogleBaseHook.fallback_to_default_project_id
+    def run_data_pipeline(
+        self,
+        data_pipeline_name: str,
+        project_id: str,
+        location: str = DEFAULT_DATAPIPELINE_LOCATION,
+    ) -> None:
+        """
+        Runs a Data Pipeline Instance using the Data Pipeline API 

Review Comment:
   nit: Pipeline -> Pipelines



##########
airflow/providers/google/cloud/operators/datapipeline.py:
##########
@@ -0,0 +1,139 @@
+
+#
+# Licensed to the Apache Software Foundation (ASF) under one
+# or more contributor license agreements.  See the NOTICE file
+# distributed with this work for additional information
+# regarding copyright ownership.  The ASF licenses this file
+# to you under the Apache License, Version 2.0 (the
+# "License"); you may not use this file except in compliance
+# with the License.  You may obtain a copy of the License at
+#
+#   http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing,
+# software distributed under the License is distributed on an
+# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+# KIND, either express or implied.  See the License for the
+# specific language governing permissions and limitations
+# under the License.
+"""This module contains Google DataPipeline operators."""
+from __future__ import annotations
+
+import copy
+import re
+import uuid
+import warnings
+from contextlib import ExitStack
+from enum import Enum
+from functools import cached_property
+from typing import TYPE_CHECKING, Any, Sequence
+
+from airflow import AirflowException
+from airflow.exceptions import AirflowProviderDeprecationWarning
+from airflow.providers.google.cloud.hooks.datapipeline import (
+    DEFAULT_DATAPIPELINE_LOCATION,
+    DataPipelineHook
+)
+from airflow.providers.google.cloud.hooks.gcs import GCSHook
+from airflow.providers.google.cloud.operators.cloud_base import GoogleCloudBaseOperator
+from airflow.version import version
+
+
+class CreateDataPipelineOperator(GoogleCloudBaseOperator):
+    """ 
+    Creates a new Data Pipeline instance from the Data Pipeline API.
+
+    :param body: The request body (contains instance of Pipeline). See:
+        https://cloud.google.com/dataflow/docs/reference/data-pipelines/rest/v1/projects.locations.pipelines/create#request-body
+    :param project_id: The ID of the GCP project that owns the job.
+    :param location: The location to direct the Data Pipeline instance to (example_dags uses uscentral-1).
+    :param gcp_conn_id: The connection ID to connect to the Google Cloud
+        Platform.
+
+    Returns the created Pipeline instance in JSON representation.
+    """
+    def __init__(
+        self,
+        *,
+        body: dict,
+        project_id: str | None = None,
+        location: str = DEFAULT_DATAPIPELINE_LOCATION,
+        gcp_conn_id: str = "google_cloud_default",
+        **kwargs,
+    ) -> None:
+        super().__init__(**kwargs)
+
+        self.body = body
+        self.project_id = project_id
+        self.location = location
+        self.gcp_conn_id = gcp_conn_id
+        self.datapipeline_hook : DataPipelineHook | None = None
+        self.body["pipelineSources"] = {"airflow":"airflow"}
+
+    def execute(self, context: Context):
+        self.datapipeline_hook = DataPipelineHook(
+            gcp_conn_id=self.gcp_conn_id
+        )
+
+        self.data_pipeline = self.datapipeline_hook.create_data_pipeline(
+            project_id = self.project_id,
+            body = self.body,
+            location = self.location,
+        )
+        self.log.info("Response Body: ", self.data_pipeline)

Review Comment:
   do we need to actually log the response body? 



##########
airflow/providers/google/cloud/operators/datapipeline.py:
##########
@@ -0,0 +1,139 @@
+
+#
+# Licensed to the Apache Software Foundation (ASF) under one
+# or more contributor license agreements.  See the NOTICE file
+# distributed with this work for additional information
+# regarding copyright ownership.  The ASF licenses this file
+# to you under the Apache License, Version 2.0 (the
+# "License"); you may not use this file except in compliance
+# with the License.  You may obtain a copy of the License at
+#
+#   http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing,
+# software distributed under the License is distributed on an
+# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+# KIND, either express or implied.  See the License for the
+# specific language governing permissions and limitations
+# under the License.
+"""This module contains Google DataPipeline operators."""
+from __future__ import annotations
+
+import copy
+import re
+import uuid
+import warnings
+from contextlib import ExitStack
+from enum import Enum
+from functools import cached_property
+from typing import TYPE_CHECKING, Any, Sequence
+
+from airflow import AirflowException
+from airflow.exceptions import AirflowProviderDeprecationWarning
+from airflow.providers.google.cloud.hooks.datapipeline import (
+    DEFAULT_DATAPIPELINE_LOCATION,
+    DataPipelineHook
+)
+from airflow.providers.google.cloud.hooks.gcs import GCSHook
+from airflow.providers.google.cloud.operators.cloud_base import GoogleCloudBaseOperator
+from airflow.version import version
+
+
+class CreateDataPipelineOperator(GoogleCloudBaseOperator):
+    """ 
+    Creates a new Data Pipeline instance from the Data Pipeline API.

Review Comment:
   change this elsewhere in the file where applicable.



##########
airflow/providers/google/cloud/hooks/datapipeline.py:
##########
@@ -0,0 +1,129 @@
+#
+# Licensed to the Apache Software Foundation (ASF) under one
+# or more contributor license agreements.  See the NOTICE file
+# distributed with this work for additional information
+# regarding copyright ownership.  The ASF licenses this file
+# to you under the Apache License, Version 2.0 (the
+# "License"); you may not use this file except in compliance
+# with the License.  You may obtain a copy of the License at
+#
+#   http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing,
+# software distributed under the License is distributed on an
+# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+# KIND, either express or implied.  See the License for the
+# specific language governing permissions and limitations
+# under the License.
+"""This module contains a Google DataPipeline Hook."""
+from __future__ import annotations
+
+import functools
+import json
+import re
+import shlex
+import subprocess
+import time
+import uuid
+import warnings
+import urllib.parse
+from copy import deepcopy
+
+from googleapiclient.discovery import build
+
+from airflow.providers.google.common.hooks.base_google import (
+    GoogleBaseHook,
+)
+from airflow.utils.log.logging_mixin import LoggingMixin
+from airflow.utils.timeout import timeout
+
+# This is the default location
+# https://cloud.google.com/dataflow/pipelines/specifying-exec-params
+DEFAULT_DATAPIPELINE_LOCATION = "us-central1"
+
+
+class DataPipelineHook(GoogleBaseHook):
+    """
+    Hook for Google DataPipeline.
+    All the methods in the hook where project_id is used must be called with
+    keyword arguments rather than positional.
+    """
+
+    def __init__(
+        self,
+        gcp_conn_id: str = "google_cloud_default",
+        impersonation_chain: str | Sequence[str] | None = None,
+        **kwargs,
+    ) -> None:
+        super().__init__(
+            gcp_conn_id=gcp_conn_id,
+            impersonation_chain=impersonation_chain,
+        )
+
+    def get_conn(self) -> build:
+        """Returns a Google Cloud DataPipeline service object."""
+        http_authorized = self._authorize()
+        return build("datapipelines", "v1", http=http_authorized, cache_discovery=False)
+
+    @GoogleBaseHook.fallback_to_default_project_id
+    def create_data_pipeline(
+        self,
+        body: dict,
+        project_id: str,
+        location: str = DEFAULT_DATAPIPELINE_LOCATION,
+    ) -> None:
+        """
+        Creates a new Data Pipeline instance from the Data Pipeline API.

Review Comment:
   nit: Pipeline -> Pipelines



##########
airflow/providers/google/cloud/operators/datapipeline.py:
##########
@@ -0,0 +1,139 @@
+
+#
+# Licensed to the Apache Software Foundation (ASF) under one
+# or more contributor license agreements.  See the NOTICE file
+# distributed with this work for additional information
+# regarding copyright ownership.  The ASF licenses this file
+# to you under the Apache License, Version 2.0 (the
+# "License"); you may not use this file except in compliance
+# with the License.  You may obtain a copy of the License at
+#
+#   http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing,
+# software distributed under the License is distributed on an
+# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+# KIND, either express or implied.  See the License for the
+# specific language governing permissions and limitations
+# under the License.
+"""This module contains Google DataPipeline operators."""
+from __future__ import annotations
+
+import copy
+import re
+import uuid
+import warnings
+from contextlib import ExitStack
+from enum import Enum
+from functools import cached_property
+from typing import TYPE_CHECKING, Any, Sequence
+
+from airflow import AirflowException
+from airflow.exceptions import AirflowProviderDeprecationWarning
+from airflow.providers.google.cloud.hooks.datapipeline import (
+    DEFAULT_DATAPIPELINE_LOCATION,
+    DataPipelineHook
+)
+from airflow.providers.google.cloud.hooks.gcs import GCSHook
+from airflow.providers.google.cloud.operators.cloud_base import GoogleCloudBaseOperator
+from airflow.version import version
+
+
+class CreateDataPipelineOperator(GoogleCloudBaseOperator):
+    """ 
+    Creates a new Data Pipeline instance from the Data Pipeline API.
+
+    :param body: The request body (contains instance of Pipeline). See:
+        https://cloud.google.com/dataflow/docs/reference/data-pipelines/rest/v1/projects.locations.pipelines/create#request-body
+    :param project_id: The ID of the GCP project that owns the job.
+    :param location: The location to direct the Data Pipeline instance to (example_dags uses uscentral-1).
+    :param gcp_conn_id: The connection ID to connect to the Google Cloud
+        Platform.
+
+    Returns the created Pipeline instance in JSON representation.
+    """
+    def __init__(
+        self,
+        *,
+        body: dict,
+        project_id: str | None = None,
+        location: str = DEFAULT_DATAPIPELINE_LOCATION,
+        gcp_conn_id: str = "google_cloud_default",
+        **kwargs,
+    ) -> None:
+        super().__init__(**kwargs)
+
+        self.body = body
+        self.project_id = project_id
+        self.location = location
+        self.gcp_conn_id = gcp_conn_id
+        self.datapipeline_hook : DataPipelineHook | None = None
+        self.body["pipelineSources"] = {"airflow":"airflow"}
+
+    def execute(self, context: Context):
+        self.datapipeline_hook = DataPipelineHook(
+            gcp_conn_id=self.gcp_conn_id
+        )
+
+        self.data_pipeline = self.datapipeline_hook.create_data_pipeline(
+            project_id = self.project_id,
+            body = self.body,
+            location = self.location,
+        )
+        self.log.info("Response Body: ", self.data_pipeline)
+
+        # returns the full response body
+        return self.data_pipeline
+
+
+class RunDataPipelineOperator(GoogleCloudBaseOperator):
+    """ 
+    Runs a Data Pipeline Instance using the Data Pipeline API 

Review Comment:
   nit: Pipeline -> Pipelines



##########
airflow/providers/google/cloud/hooks/datapipeline.py:
##########
@@ -0,0 +1,129 @@
+#
+# Licensed to the Apache Software Foundation (ASF) under one
+# or more contributor license agreements.  See the NOTICE file
+# distributed with this work for additional information
+# regarding copyright ownership.  The ASF licenses this file
+# to you under the Apache License, Version 2.0 (the
+# "License"); you may not use this file except in compliance
+# with the License.  You may obtain a copy of the License at
+#
+#   http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing,
+# software distributed under the License is distributed on an
+# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+# KIND, either express or implied.  See the License for the
+# specific language governing permissions and limitations
+# under the License.
+"""This module contains a Google DataPipeline Hook."""
+from __future__ import annotations
+
+import functools
+import json
+import re
+import shlex
+import subprocess
+import time
+import uuid
+import warnings
+import urllib.parse
+from copy import deepcopy
+
+from googleapiclient.discovery import build
+
+from airflow.providers.google.common.hooks.base_google import (
+    GoogleBaseHook,
+)
+from airflow.utils.log.logging_mixin import LoggingMixin
+from airflow.utils.timeout import timeout
+
+# This is the default location
+# https://cloud.google.com/dataflow/pipelines/specifying-exec-params
+DEFAULT_DATAPIPELINE_LOCATION = "us-central1"
+
+
+class DataPipelineHook(GoogleBaseHook):
+    """
+    Hook for Google DataPipeline.
+    All the methods in the hook where project_id is used must be called with
+    keyword arguments rather than positional.
+    """
+
+    def __init__(
+        self,
+        gcp_conn_id: str = "google_cloud_default",
+        impersonation_chain: str | Sequence[str] | None = None,
+        **kwargs,
+    ) -> None:
+        super().__init__(
+            gcp_conn_id=gcp_conn_id,
+            impersonation_chain=impersonation_chain,
+        )
+
+    def get_conn(self) -> build:
+        """Returns a Google Cloud DataPipeline service object."""
+        http_authorized = self._authorize()
+        return build("datapipelines", "v1", http=http_authorized, cache_discovery=False)
+
+    @GoogleBaseHook.fallback_to_default_project_id
+    def create_data_pipeline(
+        self,
+        body: dict,
+        project_id: str,
+        location: str = DEFAULT_DATAPIPELINE_LOCATION,
+    ) -> None:
+        """
+        Creates a new Data Pipeline instance from the Data Pipeline API.
+
+        :param body: The request body (contains instance of Pipeline). See:
+            https://cloud.google.com/dataflow/docs/reference/data-pipelines/rest/v1/projects.locations.pipelines/create#request-body
+        :param project_id: The ID of the GCP project that owns the job.
+        :param location: The location to direct the Data Pipeline instance to (example_dags uses us-central1).
+        
+        Returns the created Pipeline instance in JSON representation.
+        """
+        
+        parent = self.build_parent_name(project_id, location)
+        service = self.get_conn()
+        print(dir(service.projects().locations()))
+        request = (
+            service.projects().locations().pipelines().create(
+                parent = parent,
+                body = body,
+            )
+        )
+        response = request.execute(num_retries=self.num_retries)
+        return response
+
+    @staticmethod
+    def build_parent_name(project_id: str, location: str):
+        return f"projects/{project_id}/locations/{location}"
+    
+    @GoogleBaseHook.fallback_to_default_project_id
+    def run_data_pipeline(
+        self,
+        data_pipeline_name: str,
+        project_id: str,
+        location: str = DEFAULT_DATAPIPELINE_LOCATION,
+    ) -> None:
+        """
+        Runs a Data Pipeline Instance using the Data Pipeline API.
+
+        :param data_pipeline_name: The display name of the pipeline. For example, in
+            projects/PROJECT_ID/locations/LOCATION_ID/pipelines/PIPELINE_ID it would be the PIPELINE_ID.
+        :param project_id: The ID of the GCP project that owns the job.
+        :param location: The location of the Data Pipeline instance (example_dags uses us-central1).

Review Comment:
   nit: Pipeline -> Pipelines
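
As background for the name parameter, here is a small self-contained sketch of the resource strings involved; the project, location, and pipeline id are placeholders, and the assumption that the run call targets the fully qualified pipeline name follows the REST-style naming quoted in the docstring:

PROJECT_ID = "my-project"      # placeholder
LOCATION = "us-central1"
PIPELINE_ID = "my-pipeline"    # placeholder

# What build_parent_name produces for these inputs:
parent = f"projects/{PROJECT_ID}/locations/{LOCATION}"
# -> "projects/my-project/locations/us-central1"

# The run call is presumably issued against the fully qualified pipeline name,
# i.e. the parent plus the pipelines collection and the PIPELINE_ID:
pipeline_name = f"{parent}/pipelines/{PIPELINE_ID}"
# -> "projects/my-project/locations/us-central1/pipelines/my-pipeline"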



##########
airflow/providers/google/cloud/operators/datapipeline.py:
##########
@@ -0,0 +1,139 @@
+
+#
+# Licensed to the Apache Software Foundation (ASF) under one
+# or more contributor license agreements.  See the NOTICE file
+# distributed with this work for additional information
+# regarding copyright ownership.  The ASF licenses this file
+# to you under the Apache License, Version 2.0 (the
+# "License"); you may not use this file except in compliance
+# with the License.  You may obtain a copy of the License at
+#
+#   http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing,
+# software distributed under the License is distributed on an
+# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+# KIND, either express or implied.  See the License for the
+# specific language governing permissions and limitations
+# under the License.
+"""This module contains Google DataPipeline operators."""
+from __future__ import annotations
+
+import copy
+import re
+import uuid
+import warnings
+from contextlib import ExitStack
+from enum import Enum
+from functools import cached_property
+from typing import TYPE_CHECKING, Any, Sequence
+
+from airflow import AirflowException
+from airflow.exceptions import AirflowProviderDeprecationWarning
+from airflow.providers.google.cloud.hooks.datapipeline import (
+    DEFAULT_DATAPIPELINE_LOCATION,
+    DataPipelineHook
+)
+from airflow.providers.google.cloud.hooks.gcs import GCSHook
+from airflow.providers.google.cloud.operators.cloud_base import GoogleCloudBaseOperator
+from airflow.version import version
+
+
+class CreateDataPipelineOperator(GoogleCloudBaseOperator):
+    """ 
+    Creates a new Data Pipeline instance from the Data Pipeline API.
+
+    :param body: The request body (contains instance of Pipeline). See:
+        https://cloud.google.com/dataflow/docs/reference/data-pipelines/rest/v1/projects.locations.pipelines/create#request-body
+    :param project_id: The ID of the GCP project that owns the job.
+    :param location: The location to direct the Data Pipeline instance to (example_dags uses us-central1).
+    :param gcp_conn_id: The connection ID to connect to the Google Cloud
+        Platform.
+
+    Returns the created Pipeline instance in JSON representation.

Review Comment:
   nit: Pipeline -> Data Pipelines



##########
airflow/providers/google/cloud/operators/datapipeline.py:
##########
@@ -0,0 +1,139 @@
+
+#
+# Licensed to the Apache Software Foundation (ASF) under one
+# or more contributor license agreements.  See the NOTICE file
+# distributed with this work for additional information
+# regarding copyright ownership.  The ASF licenses this file
+# to you under the Apache License, Version 2.0 (the
+# "License"); you may not use this file except in compliance
+# with the License.  You may obtain a copy of the License at
+#
+#   http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing,
+# software distributed under the License is distributed on an
+# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+# KIND, either express or implied.  See the License for the
+# specific language governing permissions and limitations
+# under the License.
+"""This module contains Google DataPipeline operators."""
+from __future__ import annotations
+
+import copy
+import re
+import uuid
+import warnings
+from contextlib import ExitStack
+from enum import Enum
+from functools import cached_property
+from typing import TYPE_CHECKING, Any, Sequence
+
+from airflow import AirflowException
+from airflow.exceptions import AirflowProviderDeprecationWarning
+from airflow.providers.google.cloud.hooks.datapipeline import (
+    DEFAULT_DATAPIPELINE_LOCATION,
+    DataPipelineHook
+)
+from airflow.providers.google.cloud.hooks.gcs import GCSHook
+from airflow.providers.google.cloud.operators.cloud_base import GoogleCloudBaseOperator
+from airflow.version import version
+
+
+class CreateDataPipelineOperator(GoogleCloudBaseOperator):
+    """ 
+    Creates a new Data Pipeline instance from the Data Pipeline API.

Review Comment:
   nit: Pipeline -> Pipelines
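
Since the operator's execute() returns the full create response, downstream tasks can read the created pipeline from XCom; a hedged sketch assuming a create task with the placeholder id "create_data_pipeline" defined in the same DAG:

from airflow.operators.python import PythonOperator


def report_created_pipeline(ti):
    # Pull the Pipeline JSON returned (and therefore pushed to XCom) by the create task.
    response = ti.xcom_pull(task_ids="create_data_pipeline")
    print(response["name"])


report_pipeline = PythonOperator(
    task_id="report_created_pipeline",
    python_callable=report_created_pipeline,
)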



##########
airflow/providers/google/cloud/hooks/datapipeline.py:
##########
@@ -0,0 +1,129 @@
+#
+# Licensed to the Apache Software Foundation (ASF) under one
+# or more contributor license agreements.  See the NOTICE file
+# distributed with this work for additional information
+# regarding copyright ownership.  The ASF licenses this file
+# to you under the Apache License, Version 2.0 (the
+# "License"); you may not use this file except in compliance
+# with the License.  You may obtain a copy of the License at
+#
+#   http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing,
+# software distributed under the License is distributed on an
+# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+# KIND, either express or implied.  See the License for the
+# specific language governing permissions and limitations
+# under the License.
+"""This module contains a Google DataPipeline Hook."""
+from __future__ import annotations
+
+import functools
+import json
+import re
+import shlex
+import subprocess
+import time
+import uuid
+import warnings
+import urllib.parse
+from copy import deepcopy
+
+from googleapiclient.discovery import build
+
+from airflow.providers.google.common.hooks.base_google import (
+    GoogleBaseHook,
+)
+from airflow.utils.log.logging_mixin import LoggingMixin
+from airflow.utils.timeout import timeout
+
+# This is the default location
+# https://cloud.google.com/dataflow/pipelines/specifying-exec-params
+DEFAULT_DATAPIPELINE_LOCATION = "us-central1"
+
+
+class DataPipelineHook(GoogleBaseHook):
+    """
+    Hook for Google DataPipeline.
+    All the methods in the hook where project_id is used must be called with
+    keyword arguments rather than positional.
+    """
+
+    def __init__(
+        self,
+        gcp_conn_id: str = "google_cloud_default",
+        impersonation_chain: str | Sequence[str] | None = None,
+        **kwargs,
+    ) -> None:
+        super().__init__(
+            gcp_conn_id=gcp_conn_id,
+            impersonation_chain=impersonation_chain,
+        )
+
+    def get_conn(self) -> build:
+        """Returns a Google Cloud DataPipeline service object."""

Review Comment:
   nit: change DataPipeline to Data Pipelines
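
Because get_conn builds a googleapiclient discovery resource at call time, hook-level tests usually patch it out and assert on the request chain; a hedged sketch of that pattern, assuming the module path used in this PR and placeholder project/pipeline values:

from unittest import mock

from airflow.providers.google.cloud.hooks.datapipeline import DataPipelineHook


@mock.patch.object(DataPipelineHook, "num_retries", new_callable=mock.PropertyMock)
@mock.patch.object(DataPipelineHook, "get_conn")
def test_create_data_pipeline_builds_expected_request(mock_get_conn, mock_num_retries):
    mock_num_retries.return_value = 5
    hook = DataPipelineHook(gcp_conn_id="google_cloud_default")
    hook.create_data_pipeline(
        body={"name": "projects/my-project/locations/us-central1/pipelines/my-pipeline"},
        project_id="my-project",
        location="us-central1",
    )
    # The discovery client chain should be projects() -> locations() -> pipelines() -> create(...).
    pipelines = mock_get_conn.return_value.projects.return_value.locations.return_value.pipelines
    pipelines.return_value.create.assert_called_once_with(
        parent="projects/my-project/locations/us-central1",
        body={"name": "projects/my-project/locations/us-central1/pipelines/my-pipeline"},
    )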



-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: commits-unsubscribe@airflow.apache.org

For queries about this service, please contact Infrastructure at:
users@infra.apache.org


[GitHub] [airflow] shaniyaclement commented on a diff in pull request #32678: For Google Host Review

Posted by "shaniyaclement (via GitHub)" <gi...@apache.org>.
shaniyaclement commented on code in PR #32678:
URL: https://github.com/apache/airflow/pull/32678#discussion_r1267350897


##########
airflow/providers/google/cloud/hooks/datapipeline.py:
##########
@@ -0,0 +1,129 @@
+#
+# Licensed to the Apache Software Foundation (ASF) under one
+# or more contributor license agreements.  See the NOTICE file
+# distributed with this work for additional information
+# regarding copyright ownership.  The ASF licenses this file
+# to you under the Apache License, Version 2.0 (the
+# "License"); you may not use this file except in compliance
+# with the License.  You may obtain a copy of the License at
+#
+#   http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing,
+# software distributed under the License is distributed on an
+# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+# KIND, either express or implied.  See the License for the
+# specific language governing permissions and limitations
+# under the License.
+"""This module contains a Google DataPipeline Hook."""
+from __future__ import annotations
+
+import functools
+import json
+import re
+import shlex
+import subprocess
+import time
+import uuid
+import warnings
+import urllib.parse
+from copy import deepcopy
+
+from googleapiclient.discovery import build
+
+from airflow.providers.google.common.hooks.base_google import (
+    GoogleBaseHook,
+)
+from airflow.utils.log.logging_mixin import LoggingMixin
+from airflow.utils.timeout import timeout
+
+# This is the default location
+# https://cloud.google.com/dataflow/pipelines/specifying-exec-params
+DEFAULT_DATAPIPELINE_LOCATION = "us-central1"
+
+
+class DataPipelineHook(GoogleBaseHook):
+    """
+    Hook for Google DataPipeline.
+    All the methods in the hook where project_id is used must be called with
+    keyword arguments rather than positional.
+    """
+
+    def __init__(
+        self,
+        gcp_conn_id: str = "google_cloud_default",
+        impersonation_chain: str | Sequence[str] | None = None,
+        **kwargs,
+    ) -> None:
+        super().__init__(
+            gcp_conn_id=gcp_conn_id,
+            impersonation_chain=impersonation_chain,
+        )
+
+    def get_conn(self) -> build:
+        """Returns a Google Cloud DataPipeline service object."""
+        http_authorized = self._authorize()
+        return build("datapipelines", "v1", http=http_authorized, cache_discovery=False)
+
+    @GoogleBaseHook.fallback_to_default_project_id
+    def create_data_pipeline(
+        self,
+        body: dict,
+        project_id: str,
+        location: str = DEFAULT_DATAPIPELINE_LOCATION,
+    ) -> None:
+        """
+        Creates a new Data Pipeline instance from the Data Pipeline API.
+
+        :param body: The request body (contains instance of Pipeline). See:
+            https://cloud.google.com/dataflow/docs/reference/data-pipelines/rest/v1/projects.locations.pipelines/create#request-body
+        :param project_id: The ID of the GCP project that owns the job.
+        :param location: The location to direct the Data Pipeline instance to (example_dags uses us-central1).
+        
+        Returns the created Pipeline instance in JSON representation.
+        """
+        
+        parent = self.build_parent_name(project_id, location)
+        service = self.get_conn()
+        print(dir(service.projects().locations()))

Review Comment:
   done



##########
airflow/providers/google/cloud/hooks/datapipeline.py:
##########
@@ -0,0 +1,129 @@
+#
+# Licensed to the Apache Software Foundation (ASF) under one
+# or more contributor license agreements.  See the NOTICE file
+# distributed with this work for additional information
+# regarding copyright ownership.  The ASF licenses this file
+# to you under the Apache License, Version 2.0 (the
+# "License"); you may not use this file except in compliance
+# with the License.  You may obtain a copy of the License at
+#
+#   http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing,
+# software distributed under the License is distributed on an
+# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+# KIND, either express or implied.  See the License for the
+# specific language governing permissions and limitations
+# under the License.
+"""This module contains a Google DataPipeline Hook."""
+from __future__ import annotations
+
+import functools
+import json
+import re
+import shlex
+import subprocess
+import time
+import uuid
+import warnings
+import urllib.parse
+from copy import deepcopy
+
+from googleapiclient.discovery import build
+
+from airflow.providers.google.common.hooks.base_google import (
+    GoogleBaseHook,
+)
+from airflow.utils.log.logging_mixin import LoggingMixin
+from airflow.utils.timeout import timeout
+
+# This is the default location
+# https://cloud.google.com/dataflow/pipelines/specifying-exec-params
+DEFAULT_DATAPIPELINE_LOCATION = "us-central1"
+
+
+class DataPipelineHook(GoogleBaseHook):
+    """
+    Hook for Google DataPipeline.
+    All the methods in the hook where project_id is used must be called with
+    keyword arguments rather than positional.
+    """
+
+    def __init__(
+        self,
+        gcp_conn_id: str = "google_cloud_default",
+        impersonation_chain: str | Sequence[str] | None = None,
+        **kwargs,
+    ) -> None:
+        super().__init__(
+            gcp_conn_id=gcp_conn_id,
+            impersonation_chain=impersonation_chain,
+        )
+
+    def get_conn(self) -> build:
+        """Returns a Google Cloud DataPipeline service object."""
+        http_authorized = self._authorize()
+        return build("datapipelines", "v1", http=http_authorized, cache_discovery=False)
+
+    @GoogleBaseHook.fallback_to_default_project_id
+    def create_data_pipeline(
+        self,
+        body: dict,
+        project_id: str,
+        location: str = DEFAULT_DATAPIPELINE_LOCATION,
+    ) -> None:
+        """
+        Creates a new Data Pipeline instance from the Data Pipeline API.
+
+        :param body: The request body (contains instance of Pipeline). See:
+            https://cloud.google.com/dataflow/docs/reference/data-pipelines/rest/v1/projects.locations.pipelines/create#request-body
+        :param project_id: The ID of the GCP project that owns the job.
+        :param location: The location to direct the Data Pipeline instance to (example_dags uses us-central1).
+        
+        Returns the created Pipeline instance in JSON representation.
+        """
+        
+        parent = self.build_parent_name(project_id, location)
+        service = self.get_conn()
+        print(dir(service.projects().locations()))
+        request = (
+            service.projects().locations().pipelines().create(
+                parent = parent,
+                body = body,
+            )
+        )
+        response = request.execute(num_retries=self.num_retries)
+        return response
+
+    @staticmethod
+    def build_parent_name(project_id: str, location: str):
+        return f"projects/{project_id}/locations/{location}"
+    
+    @GoogleBaseHook.fallback_to_default_project_id
+    def run_data_pipeline(
+        self,
+        data_pipeline_name: str,
+        project_id: str,
+        location: str = DEFAULT_DATAPIPELINE_LOCATION,
+    ) -> None:
+        """
+        Runs a Data Pipeline Instance using the Data Pipeline API.
+
+        :param data_pipeline_name: The display name of the pipeline. For example, in
+            projects/PROJECT_ID/locations/LOCATION_ID/pipelines/PIPELINE_ID it would be the PIPELINE_ID.
+        :param project_id: The ID of the GCP project that owns the job.
+        :param location: The location of the Data Pipeline instance (example_dags uses us-central1).
+        
+        Returns the created Job in JSON representation.
+        """
+        parent = self.build_parent_name(project_id, location)
+        service = self.get_conn()
+        print(dir(service.projects().locations()))

Review Comment:
   done



-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: commits-unsubscribe@airflow.apache.org

For queries about this service, please contact Infrastructure at:
users@infra.apache.org


[GitHub] [airflow] shaniyaclement commented on a diff in pull request #32678: For Google Host Review

Posted by "shaniyaclement (via GitHub)" <gi...@apache.org>.
shaniyaclement commented on code in PR #32678:
URL: https://github.com/apache/airflow/pull/32678#discussion_r1267327025


##########
tests/providers/google/cloud/operators/test_datapipeline.py:
##########
@@ -0,0 +1,72 @@
+#
+# Licensed to the Apache Software Foundation (ASF) under one
+# or more contributor license agreements.  See the NOTICE file
+# distributed with this work for additional information
+# regarding copyright ownership.  The ASF licenses this file
+# to you under the Apache License, Version 2.0 (the
+# "License"); you may not use this file except in compliance
+# with the License.  You may obtain a copy of the License at
+#
+#   http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing,
+# software distributed under the License is distributed on an
+# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+# KIND, either express or implied.  See the License for the
+# specific language governing permissions and limitations
+# under the License.
+from __future__ import annotations
+
+from unittest import mock
+from unittest.mock import MagicMock
+
+import pytest
+
+import airflow
+from airflow.exceptions import AirflowException, AirflowProviderDeprecationWarning
+from airflow.providers.google.cloud.operators.datapipeline import (
+    CreateDataPipelineOperator,
+    RunDataPipelineOperator,
+)
+from airflow.providers.google.cloud.hooks.datapipeline import DataPipelineHook
+from airflow.version import version
+
+TASK_ID = "test-datapipeline-operators"
+TEST_BODY = {
+    "name": "projects/dataflow-interns/locations/us-central1/pipelines/dp-create-1642676351302-mp--1675461000",
+    "type": "PIPELINE_TYPE_BATCH",
+    "workload": {
+        "dataflowFlexTemplateRequest": {
+            "launchParameter": {
+                "containerSpecGcsPath": "gs://intern-bucket-1/templates/word-count.json",
+                "jobName": "word-count-test-intern1",
+                "environment": {
+                    "tempLocation": "gs://intern-bucket-1/temp"
+                },
+                "parameters": {
+                    "inputFile": "gs://intern-bucket-1/examples/kinglear.txt",

Review Comment:
   I was required to push this code to create the PR, but the testing file is not complete. I included a TODO noting that the body paths still need to be changed. Sorry, I wasn't aware you were going to review the testing file, since you said the review would focus on the functions.
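
Until the real test assets are in place, one way to keep the test body free of personal bucket paths is to build it from placeholder constants and assert only on the operator-to-hook interaction; a hedged sketch in which every name, path, and the hook patch target are illustrative assumptions, not the PR's final tests:

from unittest import mock

from airflow.providers.google.cloud.operators.datapipeline import CreateDataPipelineOperator

TEST_PROJECT_ID = "test-project"    # placeholder
TEST_LOCATION = "us-central1"
TEST_GCS_PATH = "gs://test-bucket"  # placeholder
TEST_BODY = {
    "name": f"projects/{TEST_PROJECT_ID}/locations/{TEST_LOCATION}/pipelines/test-pipeline",
    "type": "PIPELINE_TYPE_BATCH",
    "workload": {
        "dataflowFlexTemplateRequest": {
            "launchParameter": {
                "containerSpecGcsPath": f"{TEST_GCS_PATH}/templates/word-count.json",
                "jobName": "test-job",
                "environment": {"tempLocation": f"{TEST_GCS_PATH}/temp"},
                "parameters": {"inputFile": f"{TEST_GCS_PATH}/examples/kinglear.txt"},
            }
        }
    },
}


@mock.patch("airflow.providers.google.cloud.operators.datapipeline.DataPipelineHook")
def test_create_data_pipeline_delegates_to_hook(mock_hook):
    operator = CreateDataPipelineOperator(
        task_id="test_create_datapipeline",
        body=TEST_BODY,
        project_id=TEST_PROJECT_ID,
        location=TEST_LOCATION,
        gcp_conn_id="google_cloud_default",
    )
    result = operator.execute(context=mock.MagicMock())

    # The operator should hand the body straight to the hook and return its response.
    mock_hook.return_value.create_data_pipeline.assert_called_once_with(
        project_id=TEST_PROJECT_ID,
        body=TEST_BODY,
        location=TEST_LOCATION,
    )
    assert result == mock_hook.return_value.create_data_pipeline.return_value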



-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: commits-unsubscribe@airflow.apache.org

For queries about this service, please contact Infrastructure at:
users@infra.apache.org