Posted to commits@airflow.apache.org by GitBox <gi...@apache.org> on 2020/05/27 14:25:53 UTC

[GitHub] [airflow] joppevos opened a new pull request #9037: Create guide for Dataproc Operators

joppevos opened a new pull request #9037:
URL: https://github.com/apache/airflow/pull/9037


   A simple guide based on the example_dataproc file, addressing this [issue](https://github.com/apache/airflow/issues/8203)
   
   ##### Questions
   - @mik-laj All the job-configuration examples we have are included in the guide. It may be too much, but I assume people will mostly just search through the page.
   - Is an explanation of the arguments needed? The operators themselves are already richly documented, so I left it out for now.
   ---
   Make sure to mark the boxes below before creating PR: [x]
   
   - [x] Description above provides context of the change
   - [x] Unit tests coverage for changes (not needed for documentation changes)
   - [x] Target Github ISSUE in description if exists
   - [x] Commits follow "[How to write a good git commit message](http://chris.beams.io/posts/git-commit/)"
   - [x] Relevant documentation is updated including usage instructions.
   - [x] I will engage committers as explained in [Contribution Workflow Example](https://github.com/apache/airflow/blob/master/CONTRIBUTING.rst#contribution-workflow-example).
   
   ---
   In case of fundamental code change, Airflow Improvement Proposal ([AIP](https://cwiki.apache.org/confluence/display/AIRFLOW/Airflow+Improvements+Proposals)) is needed.
   In case of a new dependency, check compliance with the [ASF 3rd Party License Policy](https://www.apache.org/legal/resolved.html#category-x).
   In case of backwards incompatible changes please leave a note in [UPDATING.md](https://github.com/apache/airflow/blob/master/UPDATING.md).
   Read the [Pull Request Guidelines](https://github.com/apache/airflow/blob/master/CONTRIBUTING.rst#pull-request-guidelines) for more information.
   


----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
users@infra.apache.org



[GitHub] [airflow] turbaszek merged pull request #9037: Create guide for Dataproc Operators

Posted by GitBox <gi...@apache.org>.
turbaszek merged pull request #9037:
URL: https://github.com/apache/airflow/pull/9037


   





[GitHub] [airflow] turbaszek commented on a change in pull request #9037: Create guide for Dataproc Operators

Posted by GitBox <gi...@apache.org>.
turbaszek commented on a change in pull request #9037:
URL: https://github.com/apache/airflow/pull/9037#discussion_r431310411



##########
File path: docs/howto/operator/gcp/dataproc.rst
##########
@@ -0,0 +1,186 @@
+ .. Licensed to the Apache Software Foundation (ASF) under one
+    or more contributor license agreements.  See the NOTICE file
+    distributed with this work for additional information
+    regarding copyright ownership.  The ASF licenses this file
+    to you under the Apache License, Version 2.0 (the
+    "License"); you may not use this file except in compliance
+    with the License.  You may obtain a copy of the License at
+
+ ..   http://www.apache.org/licenses/LICENSE-2.0
+
+ .. Unless required by applicable law or agreed to in writing,
+    software distributed under the License is distributed on an
+    "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+    KIND, either express or implied.  See the License for the
+    specific language governing permissions and limitations
+    under the License.
+
+Google Cloud Dataproc Operators
+===============================
+
+Dataproc is a managed Apache Spark and Apache Hadoop service that lets you
+take advantage of open source data tools for batch processing, querying, streaming and machine learning.
+Dataproc automation helps you create clusters quickly, manage them easily, and
+save money by turning clusters off when you don't need them.

Review comment:
       Oh I see it's below in references, but personally I would put it in both places 







[GitHub] [airflow] turbaszek commented on a change in pull request #9037: Create guide for Dataproc Operators

Posted by GitBox <gi...@apache.org>.
turbaszek commented on a change in pull request #9037:
URL: https://github.com/apache/airflow/pull/9037#discussion_r432834080



##########
File path: docs/howto/operator/gcp/dataproc.rst
##########
@@ -0,0 +1,188 @@
+ .. Licensed to the Apache Software Foundation (ASF) under one
+    or more contributor license agreements.  See the NOTICE file
+    distributed with this work for additional information
+    regarding copyright ownership.  The ASF licenses this file
+    to you under the Apache License, Version 2.0 (the
+    "License"); you may not use this file except in compliance
+    with the License.  You may obtain a copy of the License at
+
+ ..   http://www.apache.org/licenses/LICENSE-2.0
+
+ .. Unless required by applicable law or agreed to in writing,
+    software distributed under the License is distributed on an
+    "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+    KIND, either express or implied.  See the License for the
+    specific language governing permissions and limitations
+    under the License.
+
+Google Cloud Dataproc Operators
+===============================
+
+Dataproc is a managed Apache Spark and Apache Hadoop service that lets you
+take advantage of open source data tools for batch processing, querying, streaming and machine learning.
+Dataproc automation helps you create clusters quickly, manage them easily, and
+save money by turning clusters off when you don't need them.
+
+For more information about the service visit `Dataproc production documentation <Product documentation <https://cloud.google.com/dataproc/docs/reference>`__
+
+.. contents::
+  :depth: 1
+  :local:
+
+Prerequisite Tasks
+------------------
+
+.. include:: _partials/prerequisite_tasks.rst
+
+
+.. _howto/operator:DataprocCreateClusterOperator:
+
+Create a Cluster
+----------------
+
+Before you create a dataproc cluster you need to define the cluster.
+It describes the identifying information, config, and status of a cluster of Compute Engine instances.
+For more information about the available fields to pass when creating a cluster, visit `Dataproc create cluster API. <https://cloud.google.com/dataproc/docs/reference/rest/v1/projects.regions.clusters#Cluster>`__
+
+A cluster configuration can look as followed:
+
+.. exampleinclude:: ../../../../airflow/providers/google/cloud/example_dags/example_dataproc.py
+    :language: python
+    :dedent: 4

Review comment:
       ```suggestion
       :dedent: 0
   ```
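For readers following the thread: the cluster configuration that the quoted ``exampleinclude`` pulls in is a plain Python dict passed through to the Dataproc v1 Cluster API. A minimal sketch of its shape (the machine types, disk sizes, and instance counts here are illustrative assumptions, not the values in ``example_dataproc.py``):

```python
# Hypothetical Dataproc cluster config; the nested field names follow
# the v1 Cluster API, but the concrete values are assumptions.
CLUSTER_CONFIG = {
    "master_config": {
        "num_instances": 1,  # single master node (illustrative)
        "machine_type_uri": "n1-standard-4",
        "disk_config": {"boot_disk_type": "pd-standard", "boot_disk_size_gb": 1024},
    },
    "worker_config": {
        "num_instances": 2,  # two workers (illustrative)
        "machine_type_uri": "n1-standard-4",
        "disk_config": {"boot_disk_type": "pd-standard", "boot_disk_size_gb": 1024},
    },
}
```

This dict is what would be handed to ``DataprocCreateClusterOperator`` via its ``cluster_config``-style argument in the guide's create-cluster snippet.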







[GitHub] [airflow] turbaszek commented on pull request #9037: Create guide for Dataproc Operators

Posted by GitBox <gi...@apache.org>.
turbaszek commented on pull request #9037:
URL: https://github.com/apache/airflow/pull/9037#issuecomment-636454231


   @joppevos can you adjust the tests? We've got a test that checks that guides are not missing...
   ```
   tests/test_project_structure.py:218: AssertionError
   ```
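The failing check is roughly of this shape: it collects the howto guides and asserts every operator has a matching anchor. A loose sketch, assuming a hypothetical helper and anchor convention (this is not the real ``tests/test_project_structure.py``):

```python
# Hypothetical sketch of a guide-coverage check: every operator name
# must appear as a howto anchor somewhere in the collected guide text.
OPERATORS = ["DataprocCreateClusterOperator", "DataprocDeleteClusterOperator"]
GUIDE_TEXT = """
.. _howto/operator:DataprocCreateClusterOperator:
.. _howto/operator:DataprocDeleteClusterOperator:
"""

def missing_guides(operators, guide_text):
    """Return the operators that lack a howto anchor in the guide text."""
    return [op for op in operators if f"howto/operator:{op}" not in guide_text]

# A new guide like dataproc.rst fixes the failure by adding the anchors.
assert missing_guides(OPERATORS, GUIDE_TEXT) == []
```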
   





[GitHub] [airflow] turbaszek commented on pull request #9037: Create guide for Dataproc Operators

Posted by GitBox <gi...@apache.org>.
turbaszek commented on pull request #9037:
URL: https://github.com/apache/airflow/pull/9037#issuecomment-636471714


   Thanks @joppevos !





[GitHub] [airflow] turbaszek commented on a change in pull request #9037: Create guide for Dataproc Operators

Posted by GitBox <gi...@apache.org>.
turbaszek commented on a change in pull request #9037:
URL: https://github.com/apache/airflow/pull/9037#discussion_r431309657



##########
File path: docs/howto/operator/gcp/dataproc.rst
##########
@@ -0,0 +1,186 @@
+ .. Licensed to the Apache Software Foundation (ASF) under one
+    or more contributor license agreements.  See the NOTICE file
+    distributed with this work for additional information
+    regarding copyright ownership.  The ASF licenses this file
+    to you under the Apache License, Version 2.0 (the
+    "License"); you may not use this file except in compliance
+    with the License.  You may obtain a copy of the License at
+
+ ..   http://www.apache.org/licenses/LICENSE-2.0
+
+ .. Unless required by applicable law or agreed to in writing,
+    software distributed under the License is distributed on an
+    "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+    KIND, either express or implied.  See the License for the
+    specific language governing permissions and limitations
+    under the License.
+
+Google Cloud Dataproc Operators
+===============================
+
+Dataproc is a managed Apache Spark and Apache Hadoop service that lets you
+take advantage of open source data tools for batch processing, querying, streaming and machine learning.
+Dataproc automation helps you create clusters quickly, manage them easily, and
+save money by turning clusters off when you don't need them.
+
+
+.. contents::
+  :depth: 1
+  :local:
+
+Prerequisite Tasks
+------------------
+
+.. include:: _partials/prerequisite_tasks.rst
+
+
+.. _howto/operator:DataprocCreateClusterOperator:
+
+Create a Cluster
+----------------
+
+Before you create a dataproc cluster you need to define the cluster.
+It describes the identifying information, config, and status of a cluster of Compute Engine instances.
+
+A cluster configuration can look as followed:
+
+.. exampleinclude:: ../../../../airflow/providers/google/cloud/example_dags/example_dataproc.py
+    :language: python
+    :dedent: 4
+    :start-after: [START how_to_cloud_dataproc_create_cluster]
+    :end-before: [END how_to_cloud_dataproc_create_cluster]
+
+With this configuration we can create the cluster:
+:class:`~airflow.providers.google.cloud.operators.dataproc.DataprocCreateClusterOperator`
+
+.. exampleinclude:: ../../../../airflow/providers/google/cloud/example_dags/example_dataproc.py
+    :language: python
+    :dedent: 4
+    :start-after: [START how_to_cloud_dataproc_create_cluster_operator]
+    :end-before: [END how_to_cloud_dataproc_create_cluster_operator]
+
+Update a cluster
+----------------
+You can scale the cluster up or down by providing a cluster config and a updateMask.
+In the updateMask argument you specifies the path, relative to Cluster, of the field to update.
+For more information on updateMask and other parameters take a look at `Dataproc update cluster API. <https://cloud.google.com/dataproc/docs/reference/rest/v1/projects.regions.clusters/patch>`__

Review comment:
       Can you add a link to the API docs with the cluster object definition in the create cluster section?
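To make the updateMask mechanics from the quoted section concrete: the mask lists field paths relative to the Cluster object, and only those paths are patched from the new config. A hedged sketch (the paths and instance counts are illustrative, not taken from the PR):

```python
# Illustrative scale-up request: only the paths named in the mask are
# applied from CLUSTER_UPDATE; other cluster fields are left untouched.
CLUSTER_UPDATE = {
    "config": {
        "worker_config": {"num_instances": 3},
        "secondary_worker_config": {"num_instances": 3},
    }
}
UPDATE_MASK = {
    "paths": [
        "config.worker_config.num_instances",
        "config.secondary_worker_config.num_instances",
    ]
}
```

These two dicts are what ``DataprocUpdateClusterOperator`` would receive as its cluster and update-mask arguments in the guide's update snippet.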







[GitHub] [airflow] joppevos commented on a change in pull request #9037: Create guide for Dataproc Operators

Posted by GitBox <gi...@apache.org>.
joppevos commented on a change in pull request #9037:
URL: https://github.com/apache/airflow/pull/9037#discussion_r432930675



##########
File path: docs/howto/operator/gcp/dataproc.rst
##########
@@ -0,0 +1,188 @@
+ .. Licensed to the Apache Software Foundation (ASF) under one
+    or more contributor license agreements.  See the NOTICE file
+    distributed with this work for additional information
+    regarding copyright ownership.  The ASF licenses this file
+    to you under the Apache License, Version 2.0 (the
+    "License"); you may not use this file except in compliance
+    with the License.  You may obtain a copy of the License at
+
+ ..   http://www.apache.org/licenses/LICENSE-2.0
+
+ .. Unless required by applicable law or agreed to in writing,
+    software distributed under the License is distributed on an
+    "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+    KIND, either express or implied.  See the License for the
+    specific language governing permissions and limitations
+    under the License.
+
+Google Cloud Dataproc Operators
+===============================
+
+Dataproc is a managed Apache Spark and Apache Hadoop service that lets you
+take advantage of open source data tools for batch processing, querying, streaming and machine learning.
+Dataproc automation helps you create clusters quickly, manage them easily, and
+save money by turning clusters off when you don't need them.
+
+For more information about the service visit `Dataproc production documentation <Product documentation <https://cloud.google.com/dataproc/docs/reference>`__
+
+.. contents::
+  :depth: 1
+  :local:
+
+Prerequisite Tasks
+------------------
+
+.. include:: _partials/prerequisite_tasks.rst
+
+
+.. _howto/operator:DataprocCreateClusterOperator:
+
+Create a Cluster
+----------------
+
+Before you create a dataproc cluster you need to define the cluster.
+It describes the identifying information, config, and status of a cluster of Compute Engine instances.
+For more information about the available fields to pass when creating a cluster, visit `Dataproc create cluster API. <https://cloud.google.com/dataproc/docs/reference/rest/v1/projects.regions.clusters#Cluster>`__
+
+A cluster configuration can look as followed:
+
+.. exampleinclude:: ../../../../airflow/providers/google/cloud/example_dags/example_dataproc.py
+    :language: python
+    :dedent: 4
+    :start-after: [START how_to_cloud_dataproc_create_cluster]
+    :end-before: [END how_to_cloud_dataproc_create_cluster]
+
+With this configuration we can create the cluster:
+:class:`~airflow.providers.google.cloud.operators.dataproc.DataprocCreateClusterOperator`
+
+.. exampleinclude:: ../../../../airflow/providers/google/cloud/example_dags/example_dataproc.py
+    :language: python
+    :dedent: 4
+    :start-after: [START how_to_cloud_dataproc_create_cluster_operator]
+    :end-before: [END how_to_cloud_dataproc_create_cluster_operator]
+
+Update a cluster
+----------------
+You can scale the cluster up or down by providing a cluster config and a updateMask.
+In the updateMask argument you specifies the path, relative to Cluster, of the field to update.
+For more information on updateMask and other parameters take a look at `Dataproc update cluster API. <https://cloud.google.com/dataproc/docs/reference/rest/v1/projects.regions.clusters/patch>`__
+
+An example of a new cluster config and the updateMask:
+
+.. exampleinclude:: ../../../../airflow/providers/google/cloud/example_dags/example_dataproc.py
+    :language: python
+    :dedent: 4
+    :start-after: [START how_to_cloud_dataproc_updatemask_cluster_operator]
+    :end-before: [END how_to_cloud_dataproc_updatemask_cluster_operator]
+
+To update a cluster you can use:
+:class:`~airflow.providers.google.cloud.operators.dataproc.DataprocUpdateClusterOperator`
+
+.. exampleinclude:: ../../../../airflow/providers/google/cloud/example_dags/example_dataproc.py
+    :language: python
+    :dedent: 4
+    :start-after: [START how_to_cloud_dataproc_update_cluster_operator]
+    :end-before: [END how_to_cloud_dataproc_update_cluster_operator]
+
+Deleting a cluster
+------------------
+
+To delete a cluster you can use:
+
+:class:`~airflow.providers.google.cloud.operators.dataproc.DataprocDeleteClusterOperator`.
+
+.. exampleinclude:: ../../../../airflow/providers/google/cloud/example_dags/example_dataproc.py
+    :language: python
+    :dedent: 4
+    :start-after: [START how_to_cloud_dataproc_delete_cluster_operator]
+    :end-before: [END how_to_cloud_dataproc_delete_cluster_operator]
+
+Submit a job to a cluster
+-------------------------
+
+Dataproc supports submitting jobs of different big data components.
+The list currently includes Spark, Hadoop, Pig and Hive.
+For more information on versions and images take a look at `Cloud Dataproc Image version list <https://cloud.google.com/dataproc/docs/concepts/versioning/dataproc-versions>`__
+
+To submit a job to the cluster you need a provide a job source file. The job source file can be on GCS, the cluster or on your local
+file system. You can specify a file:/// path to refer to a local file on a cluster's master node.
+
+The job configuration can be submitted by using:
+:class:`~airflow.providers.google.cloud.operators.dataproc.DataprocSubmitJobOperator`.
+
+.. exampleinclude:: ../../../../airflow/providers/google/cloud/example_dags/example_dataproc.py
+    :language: python
+    :dedent: 4
+    :start-after: [START how_to_cloud_dataproc_submit_job_to_cluster_operator]
+    :end-before: [END how_to_cloud_dataproc_submit_job_to_cluster_operator]
+
+Examples of job configurations to submit
+----------------------------------------
+
+We have provided an example for every framework below.
+There are more arguments to provide in the jobs than the examples show. For the complete list of arguments take a look at
+`DataProc Job arguments <https://cloud.google.com/dataproc/docs/reference/rest/v1/projects.regions.jobs>`__
+
+Example of the configuration for a PySpark Job:
+
+.. exampleinclude:: ../../../../airflow/providers/google/cloud/example_dags/example_dataproc.py
+    :language: python
+    :dedent: 4
+    :start-after: [START how_to_cloud_dataproc_pyspark_config]
+    :end-before: [END how_to_cloud_dataproc_pyspark_config]
+
+Example of the configuration for a SparkSQl Job:
+
+.. exampleinclude:: ../../../../airflow/providers/google/cloud/example_dags/example_dataproc.py
+    :language: python
+    :dedent: 4
+    :start-after: [START how_to_cloud_dataproc_sparksql_config]
+    :end-before: [END how_to_cloud_dataproc_sparksql_config]
+
+Example of the configuration for a Spark Job:
+
+.. exampleinclude:: ../../../../airflow/providers/google/cloud/example_dags/example_dataproc.py
+    :language: python
+    :dedent: 4
+    :start-after: [START how_to_cloud_dataproc_spark_config]
+    :end-before: [END how_to_cloud_dataproc_spark_config]
+
+Example of the configuration for a Hive Job:
+
+.. exampleinclude:: ../../../../airflow/providers/google/cloud/example_dags/example_dataproc.py
+    :language: python
+    :dedent: 4
+    :start-after: [START how_to_cloud_dataproc_hive_config]
+    :end-before: [END how_to_cloud_dataproc_hive_config]
+
+Example of the configuration for a Hadoop Job:
+
+.. exampleinclude:: ../../../../airflow/providers/google/cloud/example_dags/example_dataproc.py
+    :language: python
+    :dedent: 4
+    :start-after: [START how_to_cloud_dataproc_hadoop_config]
+    :end-before: [END how_to_cloud_dataproc_hadoop_config]
+
+Example of the configuration for a Pig Job:
+
+.. exampleinclude:: ../../../../airflow/providers/google/cloud/example_dags/example_dataproc.py
+    :language: python
+    :dedent: 4
+    :start-after: [START how_to_cloud_dataproc_pig_config]
+    :end-before: [END how_to_cloud_dataproc_pig_config]
+
+
+Example of the configuration for a SparkR:
+
+.. exampleinclude:: ../../../../airflow/providers/google/cloud/example_dags/example_dataproc.py
+    :language: python
+    :dedent: 4

Review comment:
       @turbaszek Thanks for the explanations! That clears it up.
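For reference, the job configurations quoted in the diff above are plain dicts keyed by job type, following the Dataproc v1 Job resource. A hedged sketch of a PySpark submission config (the project, cluster, and GCS URI are placeholders, not values from ``example_dataproc.py``):

```python
# Illustrative PySpark job config for DataprocSubmitJobOperator; the
# keys follow the Dataproc v1 Job resource, the values are placeholders.
PYSPARK_JOB = {
    "reference": {"project_id": "my-project"},         # placeholder project
    "placement": {"cluster_name": "example-cluster"},  # placeholder cluster
    "pyspark_job": {"main_python_file_uri": "gs://my-bucket/job.py"},
}
```

Swapping ``pyspark_job`` for ``spark_job``, ``hive_job``, ``pig_job``, etc. gives the other framework variants the guide lists.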







[GitHub] [airflow] turbaszek commented on a change in pull request #9037: Create guide for Dataproc Operators

Posted by GitBox <gi...@apache.org>.
turbaszek commented on a change in pull request #9037:
URL: https://github.com/apache/airflow/pull/9037#discussion_r432439593



##########
File path: docs/operators-and-hooks-ref.rst
##########
@@ -696,7 +696,7 @@ These integrations allow you to perform various operations within the Google Clo
      -
 
    * - `Dataproc <https://cloud.google.com/dataproc/>`__
-     -
+     - :doc:`howto/operator/gcp/dataproc`

Review comment:
       ```suggestion
        - :doc:`How to use <howto/operator/gcp/dataproc>`
   ```







[GitHub] [airflow] turbaszek commented on a change in pull request #9037: Create guide for Dataproc Operators

Posted by GitBox <gi...@apache.org>.
turbaszek commented on a change in pull request #9037:
URL: https://github.com/apache/airflow/pull/9037#discussion_r432834134



##########
File path: docs/howto/operator/gcp/dataproc.rst
##########
@@ -0,0 +1,188 @@
+ .. Licensed to the Apache Software Foundation (ASF) under one
+    or more contributor license agreements.  See the NOTICE file
+    distributed with this work for additional information
+    regarding copyright ownership.  The ASF licenses this file
+    to you under the Apache License, Version 2.0 (the
+    "License"); you may not use this file except in compliance
+    with the License.  You may obtain a copy of the License at
+
+ ..   http://www.apache.org/licenses/LICENSE-2.0
+
+ .. Unless required by applicable law or agreed to in writing,
+    software distributed under the License is distributed on an
+    "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+    KIND, either express or implied.  See the License for the
+    specific language governing permissions and limitations
+    under the License.
+
+Google Cloud Dataproc Operators
+===============================
+
+Dataproc is a managed Apache Spark and Apache Hadoop service that lets you
+take advantage of open source data tools for batch processing, querying, streaming and machine learning.
+Dataproc automation helps you create clusters quickly, manage them easily, and
+save money by turning clusters off when you don't need them.
+
+For more information about the service visit `Dataproc production documentation <Product documentation <https://cloud.google.com/dataproc/docs/reference>`__
+
+.. contents::
+  :depth: 1
+  :local:
+
+Prerequisite Tasks
+------------------
+
+.. include:: _partials/prerequisite_tasks.rst
+
+
+.. _howto/operator:DataprocCreateClusterOperator:
+
+Create a Cluster
+----------------
+
+Before you create a dataproc cluster you need to define the cluster.
+It describes the identifying information, config, and status of a cluster of Compute Engine instances.
+For more information about the available fields to pass when creating a cluster, visit `Dataproc create cluster API. <https://cloud.google.com/dataproc/docs/reference/rest/v1/projects.regions.clusters#Cluster>`__
+
+A cluster configuration can look as followed:
+
+.. exampleinclude:: ../../../../airflow/providers/google/cloud/example_dags/example_dataproc.py
+    :language: python
+    :dedent: 4
+    :start-after: [START how_to_cloud_dataproc_create_cluster]
+    :end-before: [END how_to_cloud_dataproc_create_cluster]
+
+With this configuration we can create the cluster:
+:class:`~airflow.providers.google.cloud.operators.dataproc.DataprocCreateClusterOperator`
+
+.. exampleinclude:: ../../../../airflow/providers/google/cloud/example_dags/example_dataproc.py
+    :language: python
+    :dedent: 4
+    :start-after: [START how_to_cloud_dataproc_create_cluster_operator]
+    :end-before: [END how_to_cloud_dataproc_create_cluster_operator]
+
+Update a cluster
+----------------
+You can scale the cluster up or down by providing a cluster config and a updateMask.
+In the updateMask argument you specifies the path, relative to Cluster, of the field to update.
+For more information on updateMask and other parameters take a look at `Dataproc update cluster API. <https://cloud.google.com/dataproc/docs/reference/rest/v1/projects.regions.clusters/patch>`__
+
+An example of a new cluster config and the updateMask:
+
+.. exampleinclude:: ../../../../airflow/providers/google/cloud/example_dags/example_dataproc.py
+    :language: python
+    :dedent: 4

Review comment:
       ```suggestion
       :dedent: 0
   ```







[GitHub] [airflow] turbaszek commented on a change in pull request #9037: Create guide for Dataproc Operators

Posted by GitBox <gi...@apache.org>.
turbaszek commented on a change in pull request #9037:
URL: https://github.com/apache/airflow/pull/9037#discussion_r431308938



##########
File path: docs/howto/operator/gcp/dataproc.rst
##########
@@ -0,0 +1,186 @@
+ .. Licensed to the Apache Software Foundation (ASF) under one
+    or more contributor license agreements.  See the NOTICE file
+    distributed with this work for additional information
+    regarding copyright ownership.  The ASF licenses this file
+    to you under the Apache License, Version 2.0 (the
+    "License"); you may not use this file except in compliance
+    with the License.  You may obtain a copy of the License at
+
+ ..   http://www.apache.org/licenses/LICENSE-2.0
+
+ .. Unless required by applicable law or agreed to in writing,
+    software distributed under the License is distributed on an
+    "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+    KIND, either express or implied.  See the License for the
+    specific language governing permissions and limitations
+    under the License.
+
+Google Cloud Dataproc Operators
+===============================
+
+Dataproc is a managed Apache Spark and Apache Hadoop service that lets you
+take advantage of open source data tools for batch processing, querying, streaming and machine learning.
+Dataproc automation helps you create clusters quickly, manage them easily, and
+save money by turning clusters off when you don't need them.

Review comment:
       Would you mind adding a link to the Dataproc website / API docs?







[GitHub] [airflow] joppevos commented on a change in pull request #9037: Create guide for Dataproc Operators

Posted by GitBox <gi...@apache.org>.
joppevos commented on a change in pull request #9037:
URL: https://github.com/apache/airflow/pull/9037#discussion_r432872432



##########
File path: docs/howto/operator/gcp/dataproc.rst
##########
@@ -0,0 +1,188 @@
+ .. Licensed to the Apache Software Foundation (ASF) under one
+    or more contributor license agreements.  See the NOTICE file
+    distributed with this work for additional information
+    regarding copyright ownership.  The ASF licenses this file
+    to you under the Apache License, Version 2.0 (the
+    "License"); you may not use this file except in compliance
+    with the License.  You may obtain a copy of the License at
+
+ ..   http://www.apache.org/licenses/LICENSE-2.0
+
+ .. Unless required by applicable law or agreed to in writing,
+    software distributed under the License is distributed on an
+    "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+    KIND, either express or implied.  See the License for the
+    specific language governing permissions and limitations
+    under the License.
+
+Google Cloud Dataproc Operators
+===============================
+
+Dataproc is a managed Apache Spark and Apache Hadoop service that lets you
+take advantage of open source data tools for batch processing, querying, streaming and machine learning.
+Dataproc automation helps you create clusters quickly, manage them easily, and
+save money by turning clusters off when you don't need them.
+
+For more information about the service visit `Dataproc production documentation <Product documentation <https://cloud.google.com/dataproc/docs/reference>`__
+
+.. contents::
+  :depth: 1
+  :local:
+
+Prerequisite Tasks
+------------------
+
+.. include:: _partials/prerequisite_tasks.rst
+
+
+.. _howto/operator:DataprocCreateClusterOperator:
+
+Create a Cluster
+----------------
+
+Before you create a dataproc cluster you need to define the cluster.
+It describes the identifying information, config, and status of a cluster of Compute Engine instances.
+For more information about the available fields to pass when creating a cluster, visit `Dataproc create cluster API. <https://cloud.google.com/dataproc/docs/reference/rest/v1/projects.regions.clusters#Cluster>`__
+
+A cluster configuration can look as followed:
+
+.. exampleinclude:: ../../../../airflow/providers/google/cloud/example_dags/example_dataproc.py
+    :language: python
+    :dedent: 4
+    :start-after: [START how_to_cloud_dataproc_create_cluster]
+    :end-before: [END how_to_cloud_dataproc_create_cluster]
+
+With this configuration we can create the cluster:
+:class:`~airflow.providers.google.cloud.operators.dataproc.DataprocCreateClusterOperator`
+
+.. exampleinclude:: ../../../../airflow/providers/google/cloud/example_dags/example_dataproc.py
+    :language: python
+    :dedent: 4
+    :start-after: [START how_to_cloud_dataproc_create_cluster_operator]
+    :end-before: [END how_to_cloud_dataproc_create_cluster_operator]
+
+Update a cluster
+----------------
+You can scale the cluster up or down by providing a cluster config and a updateMask.
+In the updateMask argument you specifies the path, relative to Cluster, of the field to update.
+For more information on updateMask and other parameters take a look at `Dataproc update cluster API. <https://cloud.google.com/dataproc/docs/reference/rest/v1/projects.regions.clusters/patch>`__
+
+An example of a new cluster config and the updateMask:
+
+.. exampleinclude:: ../../../../airflow/providers/google/cloud/example_dags/example_dataproc.py
+    :language: python
+    :dedent: 4
+    :start-after: [START how_to_cloud_dataproc_updatemask_cluster_operator]
+    :end-before: [END how_to_cloud_dataproc_updatemask_cluster_operator]
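As a standalone sketch of the same idea (the field paths follow the Dataproc v1 ``Cluster`` resource; the instance counts are made up), a config-plus-mask pair might look like:

```python
# Illustrative only: grow both worker pools to 3 instances.
CLUSTER_UPDATE = {
    "config": {
        "worker_config": {"num_instances": 3},
        "secondary_worker_config": {"num_instances": 3},
    }
}

# The mask lists which fields of the Cluster resource the update touches;
# any field not named here is left unchanged.
UPDATE_MASK = {
    "paths": [
        "config.worker_config.num_instances",
        "config.secondary_worker_config.num_instances",
    ]
}
```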
+
+To update a cluster you can use:
+:class:`~airflow.providers.google.cloud.operators.dataproc.DataprocUpdateClusterOperator`
+
+.. exampleinclude:: ../../../../airflow/providers/google/cloud/example_dags/example_dataproc.py
+    :language: python
+    :dedent: 4
+    :start-after: [START how_to_cloud_dataproc_update_cluster_operator]
+    :end-before: [END how_to_cloud_dataproc_update_cluster_operator]
+
+Deleting a cluster
+------------------
+
+To delete a cluster you can use:
+
+:class:`~airflow.providers.google.cloud.operators.dataproc.DataprocDeleteClusterOperator`.
+
+.. exampleinclude:: ../../../../airflow/providers/google/cloud/example_dags/example_dataproc.py
+    :language: python
+    :dedent: 4
+    :start-after: [START how_to_cloud_dataproc_delete_cluster_operator]
+    :end-before: [END how_to_cloud_dataproc_delete_cluster_operator]
+
+Submit a job to a cluster
+-------------------------
+
+Dataproc supports submitting jobs of different big data components.
+The list currently includes Spark, PySpark, SparkSQL, SparkR, Hadoop, Pig and Hive.
+For more information on versions and images take a look at `Cloud Dataproc Image version list <https://cloud.google.com/dataproc/docs/concepts/versioning/dataproc-versions>`__
+
+To submit a job to the cluster you need to provide a job source file. The job source file can be on GCS, on the cluster or on your local
+file system. You can specify a ``file:///`` path to refer to a local file on a cluster's master node.
+
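A minimal sketch of a job definition (here PySpark) shows where the source file URI goes; the three-part shape (``reference`` / ``placement`` / ``<type>_job``) mirrors the Dataproc v1 ``Job`` resource, and all names and URIs below are invented:

```python
# Hypothetical PySpark job definition; values are placeholders.
PYSPARK_JOB = {
    "reference": {"project_id": "my-gcp-project"},
    "placement": {"cluster_name": "example-cluster"},
    "pyspark_job": {
        # a source file on GCS ...
        "main_python_file_uri": "gs://my-bucket/jobs/wordcount.py",
        # ... or, alternatively, a file already on the cluster's master node:
        # "main_python_file_uri": "file:///usr/lib/spark/examples/wordcount.py",
    },
}
```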
+The job configuration can be submitted by using:
+:class:`~airflow.providers.google.cloud.operators.dataproc.DataprocSubmitJobOperator`.
+
+.. exampleinclude:: ../../../../airflow/providers/google/cloud/example_dags/example_dataproc.py
+    :language: python
+    :dedent: 4
+    :start-after: [START how_to_cloud_dataproc_submit_job_to_cluster_operator]
+    :end-before: [END how_to_cloud_dataproc_submit_job_to_cluster_operator]
+
+Examples of job configurations to submit
+----------------------------------------
+
+We have provided an example for every framework below.
+There are more arguments to provide in the jobs than the examples show. For the complete list of arguments, take a look at the
+`Dataproc Job arguments <https://cloud.google.com/dataproc/docs/reference/rest/v1/projects.regions.jobs>`__.
+
+Example of the configuration for a PySpark Job:
+
+.. exampleinclude:: ../../../../airflow/providers/google/cloud/example_dags/example_dataproc.py
+    :language: python
+    :dedent: 4
+    :start-after: [START how_to_cloud_dataproc_pyspark_config]
+    :end-before: [END how_to_cloud_dataproc_pyspark_config]
+
+Example of the configuration for a SparkSQL Job:
+
+.. exampleinclude:: ../../../../airflow/providers/google/cloud/example_dags/example_dataproc.py
+    :language: python
+    :dedent: 4
+    :start-after: [START how_to_cloud_dataproc_sparksql_config]
+    :end-before: [END how_to_cloud_dataproc_sparksql_config]
+
+Example of the configuration for a Spark Job:
+
+.. exampleinclude:: ../../../../airflow/providers/google/cloud/example_dags/example_dataproc.py
+    :language: python
+    :dedent: 4
+    :start-after: [START how_to_cloud_dataproc_spark_config]
+    :end-before: [END how_to_cloud_dataproc_spark_config]
+
+Example of the configuration for a Hive Job:
+
+.. exampleinclude:: ../../../../airflow/providers/google/cloud/example_dags/example_dataproc.py
+    :language: python
+    :dedent: 4
+    :start-after: [START how_to_cloud_dataproc_hive_config]
+    :end-before: [END how_to_cloud_dataproc_hive_config]
+
+Example of the configuration for a Hadoop Job:
+
+.. exampleinclude:: ../../../../airflow/providers/google/cloud/example_dags/example_dataproc.py
+    :language: python
+    :dedent: 4
+    :start-after: [START how_to_cloud_dataproc_hadoop_config]
+    :end-before: [END how_to_cloud_dataproc_hadoop_config]
+
+Example of the configuration for a Pig Job:
+
+.. exampleinclude:: ../../../../airflow/providers/google/cloud/example_dags/example_dataproc.py
+    :language: python
+    :dedent: 4
+    :start-after: [START how_to_cloud_dataproc_pig_config]
+    :end-before: [END how_to_cloud_dataproc_pig_config]
+
+
+Example of the configuration for a SparkR Job:
+
+.. exampleinclude:: ../../../../airflow/providers/google/cloud/example_dags/example_dataproc.py
+    :language: python
+    :dedent: 4

Review comment:
       My bad, copied it straight from the other guides and was not aware of its function. Will adjust it 

##########
File path: docs/howto/operator/gcp/dataproc.rst
##########
@@ -0,0 +1,188 @@
+ .. Licensed to the Apache Software Foundation (ASF) under one
+    or more contributor license agreements.  See the NOTICE file
+    distributed with this work for additional information
+    regarding copyright ownership.  The ASF licenses this file
+    to you under the Apache License, Version 2.0 (the
+    "License"); you may not use this file except in compliance
+    with the License.  You may obtain a copy of the License at
+
+ ..   http://www.apache.org/licenses/LICENSE-2.0
+
+ .. Unless required by applicable law or agreed to in writing,
+    software distributed under the License is distributed on an
+    "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+    KIND, either express or implied.  See the License for the
+    specific language governing permissions and limitations
+    under the License.
+
+Google Cloud Dataproc Operators
+===============================
+
+Dataproc is a managed Apache Spark and Apache Hadoop service that lets you
+take advantage of open source data tools for batch processing, querying, streaming and machine learning.
+Dataproc automation helps you create clusters quickly, manage them easily, and
+save money by turning clusters off when you don't need them.
+
+For more information about the service, visit the `Dataproc product documentation <https://cloud.google.com/dataproc/docs/reference>`__.
+
+.. contents::
+  :depth: 1
+  :local:
+
+Prerequisite Tasks
+------------------
+
+.. include:: _partials/prerequisite_tasks.rst
+
+
+.. _howto/operator:DataprocCreateClusterOperator:
+
+Create a Cluster
+----------------
+
+Before you create a Dataproc cluster, you need to define its configuration.
+A cluster definition describes the identifying information, config, and status of a cluster of Compute Engine instances.
+For more information about the available fields to pass when creating a cluster, visit the `Dataproc create cluster API <https://cloud.google.com/dataproc/docs/reference/rest/v1/projects.regions.clusters#Cluster>`__.
+
+A cluster configuration can look as follows:
+
+.. exampleinclude:: ../../../../airflow/providers/google/cloud/example_dags/example_dataproc.py
+    :language: python
+    :dedent: 4
+    :start-after: [START how_to_cloud_dataproc_create_cluster]
+    :end-before: [END how_to_cloud_dataproc_create_cluster]
+
+With this configuration we can create the cluster:
+:class:`~airflow.providers.google.cloud.operators.dataproc.DataprocCreateClusterOperator`
+
+.. exampleinclude:: ../../../../airflow/providers/google/cloud/example_dags/example_dataproc.py
+    :language: python
+    :dedent: 4
+    :start-after: [START how_to_cloud_dataproc_create_cluster_operator]
+    :end-before: [END how_to_cloud_dataproc_create_cluster_operator]
+
+Update a cluster
+----------------
+You can scale the cluster up or down by providing a cluster config and an ``updateMask``.
+In the ``updateMask`` argument you specify the path, relative to ``Cluster``, of the field to update.
+For more information on ``updateMask`` and other parameters, take a look at the `Dataproc update cluster API <https://cloud.google.com/dataproc/docs/reference/rest/v1/projects.regions.clusters/patch>`__.
+
+An example of a new cluster config and the updateMask:
+
+.. exampleinclude:: ../../../../airflow/providers/google/cloud/example_dags/example_dataproc.py
+    :language: python
+    :dedent: 4
+    :start-after: [START how_to_cloud_dataproc_updatemask_cluster_operator]
+    :end-before: [END how_to_cloud_dataproc_updatemask_cluster_operator]
+
+To update a cluster you can use:
+:class:`~airflow.providers.google.cloud.operators.dataproc.DataprocUpdateClusterOperator`
+
+.. exampleinclude:: ../../../../airflow/providers/google/cloud/example_dags/example_dataproc.py
+    :language: python
+    :dedent: 4
+    :start-after: [START how_to_cloud_dataproc_update_cluster_operator]
+    :end-before: [END how_to_cloud_dataproc_update_cluster_operator]
+
+Deleting a cluster
+------------------
+
+To delete a cluster you can use:
+
+:class:`~airflow.providers.google.cloud.operators.dataproc.DataprocDeleteClusterOperator`.
+
+.. exampleinclude:: ../../../../airflow/providers/google/cloud/example_dags/example_dataproc.py
+    :language: python
+    :dedent: 4
+    :start-after: [START how_to_cloud_dataproc_delete_cluster_operator]
+    :end-before: [END how_to_cloud_dataproc_delete_cluster_operator]
+
+Submit a job to a cluster
+-------------------------
+
+Dataproc supports submitting jobs of different big data components.
+The list currently includes Spark, PySpark, SparkSQL, SparkR, Hadoop, Pig and Hive.
+For more information on versions and images take a look at `Cloud Dataproc Image version list <https://cloud.google.com/dataproc/docs/concepts/versioning/dataproc-versions>`__
+
+To submit a job to the cluster you need to provide a job source file. The job source file can be on GCS, on the cluster or on your local
+file system. You can specify a ``file:///`` path to refer to a local file on a cluster's master node.
+
+The job configuration can be submitted by using:
+:class:`~airflow.providers.google.cloud.operators.dataproc.DataprocSubmitJobOperator`.
+
+.. exampleinclude:: ../../../../airflow/providers/google/cloud/example_dags/example_dataproc.py
+    :language: python
+    :dedent: 4
+    :start-after: [START how_to_cloud_dataproc_submit_job_to_cluster_operator]
+    :end-before: [END how_to_cloud_dataproc_submit_job_to_cluster_operator]
+
+Examples of job configurations to submit
+----------------------------------------
+
+We have provided an example for every framework below.
+There are more arguments to provide in the jobs than the examples show. For the complete list of arguments, take a look at the
+`Dataproc Job arguments <https://cloud.google.com/dataproc/docs/reference/rest/v1/projects.regions.jobs>`__.
+
+Example of the configuration for a PySpark Job:
+
+.. exampleinclude:: ../../../../airflow/providers/google/cloud/example_dags/example_dataproc.py
+    :language: python
+    :dedent: 4
+    :start-after: [START how_to_cloud_dataproc_pyspark_config]
+    :end-before: [END how_to_cloud_dataproc_pyspark_config]
+
+Example of the configuration for a SparkSQL Job:
+
+.. exampleinclude:: ../../../../airflow/providers/google/cloud/example_dags/example_dataproc.py
+    :language: python
+    :dedent: 4
+    :start-after: [START how_to_cloud_dataproc_sparksql_config]
+    :end-before: [END how_to_cloud_dataproc_sparksql_config]
+
+Example of the configuration for a Spark Job:
+
+.. exampleinclude:: ../../../../airflow/providers/google/cloud/example_dags/example_dataproc.py
+    :language: python
+    :dedent: 4
+    :start-after: [START how_to_cloud_dataproc_spark_config]
+    :end-before: [END how_to_cloud_dataproc_spark_config]
+
+Example of the configuration for a Hive Job:
+
+.. exampleinclude:: ../../../../airflow/providers/google/cloud/example_dags/example_dataproc.py
+    :language: python
+    :dedent: 4
+    :start-after: [START how_to_cloud_dataproc_hive_config]
+    :end-before: [END how_to_cloud_dataproc_hive_config]
+
+Example of the configuration for a Hadoop Job:
+
+.. exampleinclude:: ../../../../airflow/providers/google/cloud/example_dags/example_dataproc.py
+    :language: python
+    :dedent: 4
+    :start-after: [START how_to_cloud_dataproc_hadoop_config]
+    :end-before: [END how_to_cloud_dataproc_hadoop_config]
+
+Example of the configuration for a Pig Job:
+
+.. exampleinclude:: ../../../../airflow/providers/google/cloud/example_dags/example_dataproc.py
+    :language: python
+    :dedent: 4
+    :start-after: [START how_to_cloud_dataproc_pig_config]
+    :end-before: [END how_to_cloud_dataproc_pig_config]
+
+
+Example of the configuration for a SparkR Job:
+
+.. exampleinclude:: ../../../../airflow/providers/google/cloud/example_dags/example_dataproc.py
+    :language: python
+    :dedent: 4

Review comment:
       @turbaszek My bad, copied it straight from the other guides and was not aware of its function. Will adjust it 




----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
users@infra.apache.org



[GitHub] [airflow] turbaszek commented on a change in pull request #9037: Create guide for Dataproc Operators

Posted by GitBox <gi...@apache.org>.
turbaszek commented on a change in pull request #9037:
URL: https://github.com/apache/airflow/pull/9037#discussion_r432873891



##########
File path: docs/howto/operator/gcp/dataproc.rst
##########
@@ -0,0 +1,188 @@
Review comment:
       @joppevos we don't have to set all dedents to 0. Only where it is necessary.
   
   For example:
   ```python
   # [START anhor]
   zero_indent_dict = {"a": 1}
   # [END anhor]
   
   # [START anhor2]
   with DAG():
       op = MyOperator()
   # [END anhor2]
   ```
   then in docs:
   ```rst
   .. exampleinclude:: ../../../../airflow/providers/google/cloud/example_dags/example_dataproc.py
       :language: python
       :dedent: 0
       :start-after: [START anhor]
       :end-before: [END anhor]
   ```
   because the first anhor has no indent in linked python file. But the second code snipped is indented by 4 spaces so to make code look better in docs we do:
   
   ```rst
   .. exampleinclude:: ../../../../airflow/providers/google/cloud/example_dags/example_dataproc.py
       :language: python
       :dedent: 4
       :start-after: [START anhor2]
       :end-before: [END anhor2]
   ```
   in this way it will render to:
   ```
   op = MyOperator()
   ```
   not to
   ```
       op = MyOperator()
   ```
   
   




----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
users@infra.apache.org



[GitHub] [airflow] turbaszek commented on a change in pull request #9037: Create guide for Dataproc Operators

Posted by GitBox <gi...@apache.org>.
turbaszek commented on a change in pull request #9037:
URL: https://github.com/apache/airflow/pull/9037#discussion_r432873891



##########
File path: docs/howto/operator/gcp/dataproc.rst
##########
@@ -0,0 +1,188 @@
Review comment:
       @joppevos we don't have to set all dedents to 0. Only where it is necessary.
   
   For example:
   ```python
   # [START anhor]
   zero_indent_dict = {"a": 1}  # indent = 0
   # [END anhor]
   
   # [START anhor2]
   with DAG():
       op = MyOperator()  # indent = 4
   # [END anhor2]
   ```
   then in docs:
   ```rst
   .. exampleinclude:: ../../../../airflow/providers/google/cloud/example_dags/example_dataproc.py
       :language: python
       :dedent: 0
       :start-after: [START anhor]
       :end-before: [END anhor]
   ```
   because the first anhor has no indent in linked python file. But the second code snipped is indented by 4 spaces so to make code look better in docs we do:
   
   ```rst
   .. exampleinclude:: ../../../../airflow/providers/google/cloud/example_dags/example_dataproc.py
       :language: python
       :dedent: 4
       :start-after: [START anhor2]
       :end-before: [END anhor2]
   ```
   in this way it will render to:
   ```
   op = MyOperator()
   ```
   not to
   ```
       op = MyOperator()
   ```
   
   




----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
users@infra.apache.org



[GitHub] [airflow] turbaszek commented on a change in pull request #9037: Create guide for Dataproc Operators

Posted by GitBox <gi...@apache.org>.
turbaszek commented on a change in pull request #9037:
URL: https://github.com/apache/airflow/pull/9037#discussion_r432834157



##########
File path: docs/howto/operator/gcp/dataproc.rst
##########
@@ -0,0 +1,188 @@
+ .. Licensed to the Apache Software Foundation (ASF) under one
+    or more contributor license agreements.  See the NOTICE file
+    distributed with this work for additional information
+    regarding copyright ownership.  The ASF licenses this file
+    to you under the Apache License, Version 2.0 (the
+    "License"); you may not use this file except in compliance
+    with the License.  You may obtain a copy of the License at
+
+ ..   http://www.apache.org/licenses/LICENSE-2.0
+
+ .. Unless required by applicable law or agreed to in writing,
+    software distributed under the License is distributed on an
+    "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+    KIND, either express or implied.  See the License for the
+    specific language governing permissions and limitations
+    under the License.
+
+Google Cloud Dataproc Operators
+===============================
+
+Dataproc is a managed Apache Spark and Apache Hadoop service that lets you
+take advantage of open source data tools for batch processing, querying, streaming and machine learning.
+Dataproc automation helps you create clusters quickly, manage them easily, and
+save money by turning clusters off when you don't need them.
+
+For more information about the service, visit the `Dataproc product documentation <https://cloud.google.com/dataproc/docs/reference>`__.
+
+.. contents::
+  :depth: 1
+  :local:
+
+Prerequisite Tasks
+------------------
+
+.. include:: _partials/prerequisite_tasks.rst
+
+
+.. _howto/operator:DataprocCreateClusterOperator:
+
+Create a Cluster
+----------------
+
+Before you create a Dataproc cluster you need to define the cluster configuration.
+It describes the identifying information, config, and status of a cluster of Compute Engine instances.
+For more information about the available fields to pass when creating a cluster, visit the `Dataproc create cluster API <https://cloud.google.com/dataproc/docs/reference/rest/v1/projects.regions.clusters#Cluster>`__.
+
+A cluster configuration can look as follows:
+
+.. exampleinclude:: ../../../../airflow/providers/google/cloud/example_dags/example_dataproc.py
+    :language: python
+    :dedent: 4
+    :start-after: [START how_to_cloud_dataproc_create_cluster]
+    :end-before: [END how_to_cloud_dataproc_create_cluster]
+
+With this configuration we can create the cluster:
+:class:`~airflow.providers.google.cloud.operators.dataproc.DataprocCreateClusterOperator`
+
+.. exampleinclude:: ../../../../airflow/providers/google/cloud/example_dags/example_dataproc.py
+    :language: python
+    :dedent: 4
+    :start-after: [START how_to_cloud_dataproc_create_cluster_operator]
+    :end-before: [END how_to_cloud_dataproc_create_cluster_operator]
+
+Update a cluster
+----------------
+You can scale the cluster up or down by providing a cluster config and an updateMask.
+In the updateMask argument you specify the path, relative to Cluster, of the field to update.
+For more information on updateMask and other parameters take a look at the `Dataproc update cluster API <https://cloud.google.com/dataproc/docs/reference/rest/v1/projects.regions.clusters/patch>`__.
+
+An example of a new cluster config and the updateMask:
+
+.. exampleinclude:: ../../../../airflow/providers/google/cloud/example_dags/example_dataproc.py
+    :language: python
+    :dedent: 4
+    :start-after: [START how_to_cloud_dataproc_updatemask_cluster_operator]
+    :end-before: [END how_to_cloud_dataproc_updatemask_cluster_operator]
+
+To update a cluster you can use:
+:class:`~airflow.providers.google.cloud.operators.dataproc.DataprocUpdateClusterOperator`
+
+.. exampleinclude:: ../../../../airflow/providers/google/cloud/example_dags/example_dataproc.py
+    :language: python
+    :dedent: 4
+    :start-after: [START how_to_cloud_dataproc_update_cluster_operator]
+    :end-before: [END how_to_cloud_dataproc_update_cluster_operator]
+
+Deleting a cluster
+------------------
+
+To delete a cluster you can use:
+
+:class:`~airflow.providers.google.cloud.operators.dataproc.DataprocDeleteClusterOperator`.
+
+.. exampleinclude:: ../../../../airflow/providers/google/cloud/example_dags/example_dataproc.py
+    :language: python
+    :dedent: 4
+    :start-after: [START how_to_cloud_dataproc_delete_cluster_operator]
+    :end-before: [END how_to_cloud_dataproc_delete_cluster_operator]
+
+Submit a job to a cluster
+-------------------------
+
+Dataproc supports submitting jobs of different big data components.
+The list currently includes Spark, Hadoop, Pig and Hive.
+For more information on versions and images take a look at `Cloud Dataproc Image version list <https://cloud.google.com/dataproc/docs/concepts/versioning/dataproc-versions>`__
+
+To submit a job to the cluster you need to provide a job source file. The job source file can be on GCS, on the cluster or on your local
+file system. You can specify a ``file:///`` path to refer to a local file on a cluster's master node.
+
+The job configuration can be submitted by using:
+:class:`~airflow.providers.google.cloud.operators.dataproc.DataprocSubmitJobOperator`.
+
+.. exampleinclude:: ../../../../airflow/providers/google/cloud/example_dags/example_dataproc.py
+    :language: python
+    :dedent: 4
+    :start-after: [START how_to_cloud_dataproc_submit_job_to_cluster_operator]
+    :end-before: [END how_to_cloud_dataproc_submit_job_to_cluster_operator]
+
+Examples of job configurations to submit
+----------------------------------------
+
+We have provided an example for every framework below.
+Each job type accepts more arguments than the examples show. For the complete list of arguments take a look at
+`Dataproc Job arguments <https://cloud.google.com/dataproc/docs/reference/rest/v1/projects.regions.jobs>`__.
+
+Example of the configuration for a PySpark Job:
+
+.. exampleinclude:: ../../../../airflow/providers/google/cloud/example_dags/example_dataproc.py
+    :language: python
+    :dedent: 4

Review comment:
       ```suggestion
       :dedent: 0
   ```

##########
File path: docs/howto/operator/gcp/dataproc.rst
##########
@@ -0,0 +1,188 @@
+ .. Licensed to the Apache Software Foundation (ASF) under one
+    or more contributor license agreements.  See the NOTICE file
+    distributed with this work for additional information
+    regarding copyright ownership.  The ASF licenses this file
+    to you under the Apache License, Version 2.0 (the
+    "License"); you may not use this file except in compliance
+    with the License.  You may obtain a copy of the License at
+
+ ..   http://www.apache.org/licenses/LICENSE-2.0
+
+ .. Unless required by applicable law or agreed to in writing,
+    software distributed under the License is distributed on an
+    "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+    KIND, either express or implied.  See the License for the
+    specific language governing permissions and limitations
+    under the License.
+
+Google Cloud Dataproc Operators
+===============================
+
+Dataproc is a managed Apache Spark and Apache Hadoop service that lets you
+take advantage of open source data tools for batch processing, querying, streaming and machine learning.
+Dataproc automation helps you create clusters quickly, manage them easily, and
+save money by turning clusters off when you don't need them.
+
+For more information about the service, visit the `Dataproc product documentation <https://cloud.google.com/dataproc/docs/reference>`__.
+
+.. contents::
+  :depth: 1
+  :local:
+
+Prerequisite Tasks
+------------------
+
+.. include:: _partials/prerequisite_tasks.rst
+
+
+.. _howto/operator:DataprocCreateClusterOperator:
+
+Create a Cluster
+----------------
+
+Before you create a Dataproc cluster you need to define the cluster configuration.
+It describes the identifying information, config, and status of a cluster of Compute Engine instances.
+For more information about the available fields to pass when creating a cluster, visit the `Dataproc create cluster API <https://cloud.google.com/dataproc/docs/reference/rest/v1/projects.regions.clusters#Cluster>`__.
+
+A cluster configuration can look as follows:
+
+.. exampleinclude:: ../../../../airflow/providers/google/cloud/example_dags/example_dataproc.py
+    :language: python
+    :dedent: 4
+    :start-after: [START how_to_cloud_dataproc_create_cluster]
+    :end-before: [END how_to_cloud_dataproc_create_cluster]
+
+With this configuration we can create the cluster:
+:class:`~airflow.providers.google.cloud.operators.dataproc.DataprocCreateClusterOperator`
+
+.. exampleinclude:: ../../../../airflow/providers/google/cloud/example_dags/example_dataproc.py
+    :language: python
+    :dedent: 4
+    :start-after: [START how_to_cloud_dataproc_create_cluster_operator]
+    :end-before: [END how_to_cloud_dataproc_create_cluster_operator]
+
+Update a cluster
+----------------
+You can scale the cluster up or down by providing a cluster config and an updateMask.
+In the updateMask argument you specify the path, relative to Cluster, of the field to update.
+For more information on updateMask and other parameters take a look at the `Dataproc update cluster API <https://cloud.google.com/dataproc/docs/reference/rest/v1/projects.regions.clusters/patch>`__.
+
+An example of a new cluster config and the updateMask:
+
+.. exampleinclude:: ../../../../airflow/providers/google/cloud/example_dags/example_dataproc.py
+    :language: python
+    :dedent: 4
+    :start-after: [START how_to_cloud_dataproc_updatemask_cluster_operator]
+    :end-before: [END how_to_cloud_dataproc_updatemask_cluster_operator]
+
+To update a cluster you can use:
+:class:`~airflow.providers.google.cloud.operators.dataproc.DataprocUpdateClusterOperator`
+
+.. exampleinclude:: ../../../../airflow/providers/google/cloud/example_dags/example_dataproc.py
+    :language: python
+    :dedent: 4
+    :start-after: [START how_to_cloud_dataproc_update_cluster_operator]
+    :end-before: [END how_to_cloud_dataproc_update_cluster_operator]
+
+Deleting a cluster
+------------------
+
+To delete a cluster you can use:
+
+:class:`~airflow.providers.google.cloud.operators.dataproc.DataprocDeleteClusterOperator`.
+
+.. exampleinclude:: ../../../../airflow/providers/google/cloud/example_dags/example_dataproc.py
+    :language: python
+    :dedent: 4
+    :start-after: [START how_to_cloud_dataproc_delete_cluster_operator]
+    :end-before: [END how_to_cloud_dataproc_delete_cluster_operator]
+
+Submit a job to a cluster
+-------------------------
+
+Dataproc supports submitting jobs of different big data components.
+The list currently includes Spark, Hadoop, Pig and Hive.
+For more information on versions and images take a look at `Cloud Dataproc Image version list <https://cloud.google.com/dataproc/docs/concepts/versioning/dataproc-versions>`__
+
+To submit a job to the cluster you need to provide a job source file. The job source file can be on GCS, on the cluster or on your local
+file system. You can specify a ``file:///`` path to refer to a local file on a cluster's master node.
+
+The job configuration can be submitted by using:
+:class:`~airflow.providers.google.cloud.operators.dataproc.DataprocSubmitJobOperator`.
+
+.. exampleinclude:: ../../../../airflow/providers/google/cloud/example_dags/example_dataproc.py
+    :language: python
+    :dedent: 4
+    :start-after: [START how_to_cloud_dataproc_submit_job_to_cluster_operator]
+    :end-before: [END how_to_cloud_dataproc_submit_job_to_cluster_operator]
+
+Examples of job configurations to submit
+----------------------------------------
+
+We have provided an example for every framework below.
+Each job type accepts more arguments than the examples show. For the complete list of arguments take a look at
+`Dataproc Job arguments <https://cloud.google.com/dataproc/docs/reference/rest/v1/projects.regions.jobs>`__.
+
+Example of the configuration for a PySpark Job:
+
+.. exampleinclude:: ../../../../airflow/providers/google/cloud/example_dags/example_dataproc.py
+    :language: python
+    :dedent: 4
+    :start-after: [START how_to_cloud_dataproc_pyspark_config]
+    :end-before: [END how_to_cloud_dataproc_pyspark_config]
+
+Example of the configuration for a SparkSQL Job:
+
+.. exampleinclude:: ../../../../airflow/providers/google/cloud/example_dags/example_dataproc.py
+    :language: python
+    :dedent: 4

Review comment:
       ```suggestion
       :dedent: 0
   ```

##########
File path: docs/howto/operator/gcp/dataproc.rst
##########
@@ -0,0 +1,188 @@
+ .. Licensed to the Apache Software Foundation (ASF) under one
+    or more contributor license agreements.  See the NOTICE file
+    distributed with this work for additional information
+    regarding copyright ownership.  The ASF licenses this file
+    to you under the Apache License, Version 2.0 (the
+    "License"); you may not use this file except in compliance
+    with the License.  You may obtain a copy of the License at
+
+ ..   http://www.apache.org/licenses/LICENSE-2.0
+
+ .. Unless required by applicable law or agreed to in writing,
+    software distributed under the License is distributed on an
+    "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+    KIND, either express or implied.  See the License for the
+    specific language governing permissions and limitations
+    under the License.
+
+Google Cloud Dataproc Operators
+===============================
+
+Dataproc is a managed Apache Spark and Apache Hadoop service that lets you
+take advantage of open source data tools for batch processing, querying, streaming and machine learning.
+Dataproc automation helps you create clusters quickly, manage them easily, and
+save money by turning clusters off when you don't need them.
+
+For more information about the service, visit the `Dataproc product documentation <https://cloud.google.com/dataproc/docs/reference>`__.
+
+.. contents::
+  :depth: 1
+  :local:
+
+Prerequisite Tasks
+------------------
+
+.. include:: _partials/prerequisite_tasks.rst
+
+
+.. _howto/operator:DataprocCreateClusterOperator:
+
+Create a Cluster
+----------------
+
+Before you create a Dataproc cluster you need to define the cluster configuration.
+It describes the identifying information, config, and status of a cluster of Compute Engine instances.
+For more information about the available fields to pass when creating a cluster, visit the `Dataproc create cluster API <https://cloud.google.com/dataproc/docs/reference/rest/v1/projects.regions.clusters#Cluster>`__.
+
+A cluster configuration can look as follows:
+
+.. exampleinclude:: ../../../../airflow/providers/google/cloud/example_dags/example_dataproc.py
+    :language: python
+    :dedent: 4
+    :start-after: [START how_to_cloud_dataproc_create_cluster]
+    :end-before: [END how_to_cloud_dataproc_create_cluster]
+
+With this configuration we can create the cluster:
+:class:`~airflow.providers.google.cloud.operators.dataproc.DataprocCreateClusterOperator`
+
+.. exampleinclude:: ../../../../airflow/providers/google/cloud/example_dags/example_dataproc.py
+    :language: python
+    :dedent: 4
+    :start-after: [START how_to_cloud_dataproc_create_cluster_operator]
+    :end-before: [END how_to_cloud_dataproc_create_cluster_operator]
+
+Update a cluster
+----------------
+You can scale the cluster up or down by providing a cluster config and an updateMask.
+In the updateMask argument you specify the path, relative to Cluster, of the field to update.
+For more information on updateMask and other parameters take a look at the `Dataproc update cluster API <https://cloud.google.com/dataproc/docs/reference/rest/v1/projects.regions.clusters/patch>`__.
+
+An example of a new cluster config and the updateMask:
+
+.. exampleinclude:: ../../../../airflow/providers/google/cloud/example_dags/example_dataproc.py
+    :language: python
+    :dedent: 4
+    :start-after: [START how_to_cloud_dataproc_updatemask_cluster_operator]
+    :end-before: [END how_to_cloud_dataproc_updatemask_cluster_operator]
+
+To update a cluster you can use:
+:class:`~airflow.providers.google.cloud.operators.dataproc.DataprocUpdateClusterOperator`
+
+.. exampleinclude:: ../../../../airflow/providers/google/cloud/example_dags/example_dataproc.py
+    :language: python
+    :dedent: 4
+    :start-after: [START how_to_cloud_dataproc_update_cluster_operator]
+    :end-before: [END how_to_cloud_dataproc_update_cluster_operator]
+
+Deleting a cluster
+------------------
+
+To delete a cluster you can use:
+
+:class:`~airflow.providers.google.cloud.operators.dataproc.DataprocDeleteClusterOperator`.
+
+.. exampleinclude:: ../../../../airflow/providers/google/cloud/example_dags/example_dataproc.py
+    :language: python
+    :dedent: 4
+    :start-after: [START how_to_cloud_dataproc_delete_cluster_operator]
+    :end-before: [END how_to_cloud_dataproc_delete_cluster_operator]
+
+Submit a job to a cluster
+-------------------------
+
+Dataproc supports submitting jobs of different big data components.
+The list currently includes Spark, Hadoop, Pig and Hive.
+For more information on versions and images take a look at `Cloud Dataproc Image version list <https://cloud.google.com/dataproc/docs/concepts/versioning/dataproc-versions>`__
+
+To submit a job to the cluster you need to provide a job source file. The job source file can be on GCS, on the cluster or on your local
+file system. You can specify a ``file:///`` path to refer to a local file on a cluster's master node.
+
+The job configuration can be submitted by using:
+:class:`~airflow.providers.google.cloud.operators.dataproc.DataprocSubmitJobOperator`.
+
+.. exampleinclude:: ../../../../airflow/providers/google/cloud/example_dags/example_dataproc.py
+    :language: python
+    :dedent: 4
+    :start-after: [START how_to_cloud_dataproc_submit_job_to_cluster_operator]
+    :end-before: [END how_to_cloud_dataproc_submit_job_to_cluster_operator]
+
+Examples of job configurations to submit
+----------------------------------------
+
+We have provided an example for every framework below.
+Each job type accepts more arguments than the examples show. For the complete list of arguments take a look at
+`Dataproc Job arguments <https://cloud.google.com/dataproc/docs/reference/rest/v1/projects.regions.jobs>`__.
+
+Example of the configuration for a PySpark Job:
+
+.. exampleinclude:: ../../../../airflow/providers/google/cloud/example_dags/example_dataproc.py
+    :language: python
+    :dedent: 4
+    :start-after: [START how_to_cloud_dataproc_pyspark_config]
+    :end-before: [END how_to_cloud_dataproc_pyspark_config]
+
+Example of the configuration for a SparkSQL Job:
+
+.. exampleinclude:: ../../../../airflow/providers/google/cloud/example_dags/example_dataproc.py
+    :language: python
+    :dedent: 4
+    :start-after: [START how_to_cloud_dataproc_sparksql_config]
+    :end-before: [END how_to_cloud_dataproc_sparksql_config]
+
+Example of the configuration for a Spark Job:
+
+.. exampleinclude:: ../../../../airflow/providers/google/cloud/example_dags/example_dataproc.py
+    :language: python
+    :dedent: 4
+    :start-after: [START how_to_cloud_dataproc_spark_config]
+    :end-before: [END how_to_cloud_dataproc_spark_config]
+
+Example of the configuration for a Hive Job:
+
+.. exampleinclude:: ../../../../airflow/providers/google/cloud/example_dags/example_dataproc.py
+    :language: python
+    :dedent: 4
+    :start-after: [START how_to_cloud_dataproc_hive_config]
+    :end-before: [END how_to_cloud_dataproc_hive_config]
+
+Example of the configuration for a Hadoop Job:
+
+.. exampleinclude:: ../../../../airflow/providers/google/cloud/example_dags/example_dataproc.py
+    :language: python
+    :dedent: 4
+    :start-after: [START how_to_cloud_dataproc_hadoop_config]
+    :end-before: [END how_to_cloud_dataproc_hadoop_config]
+
+Example of the configuration for a Pig Job:
+
+.. exampleinclude:: ../../../../airflow/providers/google/cloud/example_dags/example_dataproc.py
+    :language: python
+    :dedent: 4
+    :start-after: [START how_to_cloud_dataproc_pig_config]
+    :end-before: [END how_to_cloud_dataproc_pig_config]
+
+
+Example of the configuration for a SparkR Job:
+
+.. exampleinclude:: ../../../../airflow/providers/google/cloud/example_dags/example_dataproc.py
+    :language: python
+    :dedent: 4

Review comment:
       ```suggestion
       :dedent: 0
   ```

##########
File path: docs/howto/operator/gcp/dataproc.rst
##########
@@ -0,0 +1,188 @@
+ .. Licensed to the Apache Software Foundation (ASF) under one
+    or more contributor license agreements.  See the NOTICE file
+    distributed with this work for additional information
+    regarding copyright ownership.  The ASF licenses this file
+    to you under the Apache License, Version 2.0 (the
+    "License"); you may not use this file except in compliance
+    with the License.  You may obtain a copy of the License at
+
+ ..   http://www.apache.org/licenses/LICENSE-2.0
+
+ .. Unless required by applicable law or agreed to in writing,
+    software distributed under the License is distributed on an
+    "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+    KIND, either express or implied.  See the License for the
+    specific language governing permissions and limitations
+    under the License.
+
+Google Cloud Dataproc Operators
+===============================
+
+Dataproc is a managed Apache Spark and Apache Hadoop service that lets you
+take advantage of open source data tools for batch processing, querying, streaming and machine learning.
+Dataproc automation helps you create clusters quickly, manage them easily, and
+save money by turning clusters off when you don't need them.
+
+For more information about the service, visit the `Dataproc product documentation <https://cloud.google.com/dataproc/docs/reference>`__.
+
+.. contents::
+  :depth: 1
+  :local:
+
+Prerequisite Tasks
+------------------
+
+.. include:: _partials/prerequisite_tasks.rst
+
+
+.. _howto/operator:DataprocCreateClusterOperator:
+
+Create a Cluster
+----------------
+
+Before you create a Dataproc cluster you need to define the cluster configuration.
+It describes the identifying information, config, and status of a cluster of Compute Engine instances.
+For more information about the available fields to pass when creating a cluster, visit the `Dataproc create cluster API <https://cloud.google.com/dataproc/docs/reference/rest/v1/projects.regions.clusters#Cluster>`__.
+
+A cluster configuration can look as follows:
+
+.. exampleinclude:: ../../../../airflow/providers/google/cloud/example_dags/example_dataproc.py
+    :language: python
+    :dedent: 4
+    :start-after: [START how_to_cloud_dataproc_create_cluster]
+    :end-before: [END how_to_cloud_dataproc_create_cluster]
+
+With this configuration we can create the cluster:
+:class:`~airflow.providers.google.cloud.operators.dataproc.DataprocCreateClusterOperator`
+
+.. exampleinclude:: ../../../../airflow/providers/google/cloud/example_dags/example_dataproc.py
+    :language: python
+    :dedent: 4
+    :start-after: [START how_to_cloud_dataproc_create_cluster_operator]
+    :end-before: [END how_to_cloud_dataproc_create_cluster_operator]
+
+Update a cluster
+----------------
+You can scale the cluster up or down by providing a cluster config and an updateMask.
+In the updateMask argument you specify the path, relative to Cluster, of the field to update.
+For more information on updateMask and other parameters take a look at the `Dataproc update cluster API <https://cloud.google.com/dataproc/docs/reference/rest/v1/projects.regions.clusters/patch>`__.
+
+An example of a new cluster config and the updateMask:
+
+.. exampleinclude:: ../../../../airflow/providers/google/cloud/example_dags/example_dataproc.py
+    :language: python
+    :dedent: 4
+    :start-after: [START how_to_cloud_dataproc_updatemask_cluster_operator]
+    :end-before: [END how_to_cloud_dataproc_updatemask_cluster_operator]
+
+To update a cluster you can use:
+:class:`~airflow.providers.google.cloud.operators.dataproc.DataprocUpdateClusterOperator`
+
+.. exampleinclude:: ../../../../airflow/providers/google/cloud/example_dags/example_dataproc.py
+    :language: python
+    :dedent: 4
+    :start-after: [START how_to_cloud_dataproc_update_cluster_operator]
+    :end-before: [END how_to_cloud_dataproc_update_cluster_operator]
+
+Deleting a cluster
+------------------
+
+To delete a cluster you can use:
+
+:class:`~airflow.providers.google.cloud.operators.dataproc.DataprocDeleteClusterOperator`.
+
+.. exampleinclude:: ../../../../airflow/providers/google/cloud/example_dags/example_dataproc.py
+    :language: python
+    :dedent: 4
+    :start-after: [START how_to_cloud_dataproc_delete_cluster_operator]
+    :end-before: [END how_to_cloud_dataproc_delete_cluster_operator]
+
+Submit a job to a cluster
+-------------------------
+
+Dataproc supports submitting jobs of different big data components.
+The list currently includes Spark, Hadoop, Pig and Hive.
+For more information on versions and images take a look at `Cloud Dataproc Image version list <https://cloud.google.com/dataproc/docs/concepts/versioning/dataproc-versions>`__
+
+To submit a job to the cluster you need to provide a job source file. The job source file can be on GCS, on the cluster or on your local
+file system. You can specify a ``file:///`` path to refer to a local file on a cluster's master node.
+
+The job configuration can be submitted by using:
+:class:`~airflow.providers.google.cloud.operators.dataproc.DataprocSubmitJobOperator`.
+
+.. exampleinclude:: ../../../../airflow/providers/google/cloud/example_dags/example_dataproc.py
+    :language: python
+    :dedent: 4
+    :start-after: [START how_to_cloud_dataproc_submit_job_to_cluster_operator]
+    :end-before: [END how_to_cloud_dataproc_submit_job_to_cluster_operator]
+
+Examples of job configurations to submit
+----------------------------------------
+
+We have provided an example for every framework below.
+Each job type accepts more arguments than the examples show. For the complete list of arguments take a look at
+`Dataproc Job arguments <https://cloud.google.com/dataproc/docs/reference/rest/v1/projects.regions.jobs>`__.
+
+Example of the configuration for a PySpark Job:
+
+.. exampleinclude:: ../../../../airflow/providers/google/cloud/example_dags/example_dataproc.py
+    :language: python
+    :dedent: 4
+    :start-after: [START how_to_cloud_dataproc_pyspark_config]
+    :end-before: [END how_to_cloud_dataproc_pyspark_config]
+
+Example of the configuration for a SparkSQL Job:
+
+.. exampleinclude:: ../../../../airflow/providers/google/cloud/example_dags/example_dataproc.py
+    :language: python
+    :dedent: 4
+    :start-after: [START how_to_cloud_dataproc_sparksql_config]
+    :end-before: [END how_to_cloud_dataproc_sparksql_config]
+
+Example of the configuration for a Spark Job:
+
+.. exampleinclude:: ../../../../airflow/providers/google/cloud/example_dags/example_dataproc.py
+    :language: python
+    :dedent: 4
+    :start-after: [START how_to_cloud_dataproc_spark_config]
+    :end-before: [END how_to_cloud_dataproc_spark_config]
+
+Example of the configuration for a Hive Job:
+
+.. exampleinclude:: ../../../../airflow/providers/google/cloud/example_dags/example_dataproc.py
+    :language: python
+    :dedent: 4

Review comment:
       ```suggestion
       :dedent: 0
   ```

##########
File path: docs/howto/operator/gcp/dataproc.rst
##########
@@ -0,0 +1,188 @@
+ .. Licensed to the Apache Software Foundation (ASF) under one
+    or more contributor license agreements.  See the NOTICE file
+    distributed with this work for additional information
+    regarding copyright ownership.  The ASF licenses this file
+    to you under the Apache License, Version 2.0 (the
+    "License"); you may not use this file except in compliance
+    with the License.  You may obtain a copy of the License at
+
+ ..   http://www.apache.org/licenses/LICENSE-2.0
+
+ .. Unless required by applicable law or agreed to in writing,
+    software distributed under the License is distributed on an
+    "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+    KIND, either express or implied.  See the License for the
+    specific language governing permissions and limitations
+    under the License.
+
+Google Cloud Dataproc Operators
+===============================
+
+Dataproc is a managed Apache Spark and Apache Hadoop service that lets you
+take advantage of open source data tools for batch processing, querying, streaming and machine learning.
+Dataproc automation helps you create clusters quickly, manage them easily, and
+save money by turning clusters off when you don't need them.
+
+For more information about the service, visit the `Dataproc product documentation <https://cloud.google.com/dataproc/docs/reference>`__.
+
+.. contents::
+  :depth: 1
+  :local:
+
+Prerequisite Tasks
+------------------
+
+.. include:: _partials/prerequisite_tasks.rst
+
+
+.. _howto/operator:DataprocCreateClusterOperator:
+
+Create a Cluster
+----------------
+
+Before you create a Dataproc cluster you need to define the cluster configuration.
+It describes the identifying information, config, and status of a cluster of Compute Engine instances.
+For more information about the available fields to pass when creating a cluster, visit the `Dataproc create cluster API <https://cloud.google.com/dataproc/docs/reference/rest/v1/projects.regions.clusters#Cluster>`__.
+
+A cluster configuration can look as follows:
+
+.. exampleinclude:: ../../../../airflow/providers/google/cloud/example_dags/example_dataproc.py
+    :language: python
+    :dedent: 4
+    :start-after: [START how_to_cloud_dataproc_create_cluster]
+    :end-before: [END how_to_cloud_dataproc_create_cluster]
+
+With this configuration we can create the cluster:
+:class:`~airflow.providers.google.cloud.operators.dataproc.DataprocCreateClusterOperator`
+
+.. exampleinclude:: ../../../../airflow/providers/google/cloud/example_dags/example_dataproc.py
+    :language: python
+    :dedent: 4
+    :start-after: [START how_to_cloud_dataproc_create_cluster_operator]
+    :end-before: [END how_to_cloud_dataproc_create_cluster_operator]
+
+Update a cluster
+----------------
+You can scale the cluster up or down by providing a cluster config and an updateMask.
+In the updateMask argument you specify the path, relative to Cluster, of the field to update.
+For more information on updateMask and other parameters take a look at the `Dataproc update cluster API <https://cloud.google.com/dataproc/docs/reference/rest/v1/projects.regions.clusters/patch>`__.
+
+An example of a new cluster config and the updateMask:
+
+.. exampleinclude:: ../../../../airflow/providers/google/cloud/example_dags/example_dataproc.py
+    :language: python
+    :dedent: 4
+    :start-after: [START how_to_cloud_dataproc_updatemask_cluster_operator]
+    :end-before: [END how_to_cloud_dataproc_updatemask_cluster_operator]
+
+To update a cluster you can use:
+:class:`~airflow.providers.google.cloud.operators.dataproc.DataprocUpdateClusterOperator`
+
+.. exampleinclude:: ../../../../airflow/providers/google/cloud/example_dags/example_dataproc.py
+    :language: python
+    :dedent: 4
+    :start-after: [START how_to_cloud_dataproc_update_cluster_operator]
+    :end-before: [END how_to_cloud_dataproc_update_cluster_operator]
+
+Delete a cluster
+------------------
+
+To delete a cluster you can use:
+
+:class:`~airflow.providers.google.cloud.operators.dataproc.DataprocDeleteClusterOperator`.
+
+.. exampleinclude:: ../../../../airflow/providers/google/cloud/example_dags/example_dataproc.py
+    :language: python
+    :dedent: 4
+    :start-after: [START how_to_cloud_dataproc_delete_cluster_operator]
+    :end-before: [END how_to_cloud_dataproc_delete_cluster_operator]
+
+Submit a job to a cluster
+-------------------------
+
+Dataproc supports submitting jobs for several big data components.
+The list currently includes Spark, Hadoop, Pig and Hive.
+For more information on versions and images take a look at `Cloud Dataproc Image version list <https://cloud.google.com/dataproc/docs/concepts/versioning/dataproc-versions>`__.
+
+To submit a job to the cluster you need to provide a job source file. The job source file can be on GCS, on the cluster or on your local
+file system. You can specify a ``file:///`` path to refer to a local file on a cluster's master node.
+
+The job configuration can be submitted by using:
+:class:`~airflow.providers.google.cloud.operators.dataproc.DataprocSubmitJobOperator`.
+
+.. exampleinclude:: ../../../../airflow/providers/google/cloud/example_dags/example_dataproc.py
+    :language: python
+    :dedent: 4
+    :start-after: [START how_to_cloud_dataproc_submit_job_to_cluster_operator]
+    :end-before: [END how_to_cloud_dataproc_submit_job_to_cluster_operator]
+
+Examples of job configurations to submit
+----------------------------------------
+
+We have provided an example for every framework below.
+The jobs accept more arguments than the examples show. For the complete list of arguments take a look at
+`Dataproc job arguments <https://cloud.google.com/dataproc/docs/reference/rest/v1/projects.regions.jobs>`__.
+
+Example of the configuration for a PySpark Job:
+
+.. exampleinclude:: ../../../../airflow/providers/google/cloud/example_dags/example_dataproc.py
+    :language: python
+    :dedent: 4
+    :start-after: [START how_to_cloud_dataproc_pyspark_config]
+    :end-before: [END how_to_cloud_dataproc_pyspark_config]
+
+Example of the configuration for a SparkSQL Job:
+
+.. exampleinclude:: ../../../../airflow/providers/google/cloud/example_dags/example_dataproc.py
+    :language: python
+    :dedent: 4
+    :start-after: [START how_to_cloud_dataproc_sparksql_config]
+    :end-before: [END how_to_cloud_dataproc_sparksql_config]
+
+Example of the configuration for a Spark Job:
+
+.. exampleinclude:: ../../../../airflow/providers/google/cloud/example_dags/example_dataproc.py
+    :language: python
+    :dedent: 4
+    :start-after: [START how_to_cloud_dataproc_spark_config]
+    :end-before: [END how_to_cloud_dataproc_spark_config]
+
+Example of the configuration for a Hive Job:
+
+.. exampleinclude:: ../../../../airflow/providers/google/cloud/example_dags/example_dataproc.py
+    :language: python
+    :dedent: 4
+    :start-after: [START how_to_cloud_dataproc_hive_config]
+    :end-before: [END how_to_cloud_dataproc_hive_config]
+
+Example of the configuration for a Hadoop Job:
+
+.. exampleinclude:: ../../../../airflow/providers/google/cloud/example_dags/example_dataproc.py
+    :language: python
+    :dedent: 4
+    :start-after: [START how_to_cloud_dataproc_hadoop_config]
+    :end-before: [END how_to_cloud_dataproc_hadoop_config]
+
+Example of the configuration for a Pig Job:
+
+.. exampleinclude:: ../../../../airflow/providers/google/cloud/example_dags/example_dataproc.py
+    :language: python
+    :dedent: 4

Review comment:
       ```suggestion
       :dedent: 0
   ```
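For readers of the quoted submit-job section above, here is a minimal, self-contained sketch (not part of the PR) of the job dict shape that `DataprocSubmitJobOperator`'s ``job`` argument expects for a PySpark job. The field names follow the Dataproc Jobs REST API; the project, cluster and bucket names are hypothetical:

```python
# Hedged sketch: assemble a PySpark job configuration dict for the
# Dataproc Jobs API (reference / placement / pyspark_job sections).
# All concrete names below are made-up placeholders.

def build_pyspark_job(project_id: str, cluster_name: str, main_python_file_uri: str) -> dict:
    """Return a PySpark job config in the shape the Dataproc Jobs API expects."""
    return {
        "reference": {"project_id": project_id},
        "placement": {"cluster_name": cluster_name},
        "pyspark_job": {"main_python_file_uri": main_python_file_uri},
    }

job = build_pyspark_job("my-project", "my-cluster", "gs://my-bucket/hello_world.py")
print(job["placement"]["cluster_name"])  # my-cluster
```

The resulting dict is what the guide's ``DataprocSubmitJobOperator`` examples pass via the ``job`` parameter.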

##########
File path: docs/howto/operator/gcp/dataproc.rst
##########
@@ -0,0 +1,188 @@
+ .. Licensed to the Apache Software Foundation (ASF) under one
+    or more contributor license agreements.  See the NOTICE file
+    distributed with this work for additional information
+    regarding copyright ownership.  The ASF licenses this file
+    to you under the Apache License, Version 2.0 (the
+    "License"); you may not use this file except in compliance
+    with the License.  You may obtain a copy of the License at
+
+ ..   http://www.apache.org/licenses/LICENSE-2.0
+
+ .. Unless required by applicable law or agreed to in writing,
+    software distributed under the License is distributed on an
+    "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+    KIND, either express or implied.  See the License for the
+    specific language governing permissions and limitations
+    under the License.
+
+Google Cloud Dataproc Operators
+===============================
+
+Dataproc is a managed Apache Spark and Apache Hadoop service that lets you
+take advantage of open source data tools for batch processing, querying, streaming and machine learning.
+Dataproc automation helps you create clusters quickly, manage them easily, and
+save money by turning clusters off when you don't need them.
+
+For more information about the service visit `Dataproc product documentation <https://cloud.google.com/dataproc/docs/reference>`__.
+
+.. contents::
+  :depth: 1
+  :local:
+
+Prerequisite Tasks
+------------------
+
+.. include:: _partials/prerequisite_tasks.rst
+
+
+.. _howto/operator:DataprocCreateClusterOperator:
+
+Create a Cluster
+----------------
+
+Before you create a Dataproc cluster you need to define the cluster.
+It describes the identifying information, config, and status of a cluster of Compute Engine instances.
+For more information about the available fields to pass when creating a cluster, visit `Dataproc create cluster API <https://cloud.google.com/dataproc/docs/reference/rest/v1/projects.regions.clusters#Cluster>`__.
+
+A cluster configuration can look as follows:
+
+.. exampleinclude:: ../../../../airflow/providers/google/cloud/example_dags/example_dataproc.py
+    :language: python
+    :dedent: 4
+    :start-after: [START how_to_cloud_dataproc_create_cluster]
+    :end-before: [END how_to_cloud_dataproc_create_cluster]
+
+With this configuration we can create the cluster with
+:class:`~airflow.providers.google.cloud.operators.dataproc.DataprocCreateClusterOperator`:
+
+.. exampleinclude:: ../../../../airflow/providers/google/cloud/example_dags/example_dataproc.py
+    :language: python
+    :dedent: 4
+    :start-after: [START how_to_cloud_dataproc_create_cluster_operator]
+    :end-before: [END how_to_cloud_dataproc_create_cluster_operator]
+
+Update a cluster
+----------------
+You can scale the cluster up or down by providing a cluster config and an ``updateMask``.
+In the ``updateMask`` argument you specify the path, relative to ``Cluster``, of the field to update.
+For more information on ``updateMask`` and other parameters take a look at `Dataproc update cluster API <https://cloud.google.com/dataproc/docs/reference/rest/v1/projects.regions.clusters/patch>`__.
+
+An example of a new cluster config and the updateMask:
+
+.. exampleinclude:: ../../../../airflow/providers/google/cloud/example_dags/example_dataproc.py
+    :language: python
+    :dedent: 4
+    :start-after: [START how_to_cloud_dataproc_updatemask_cluster_operator]
+    :end-before: [END how_to_cloud_dataproc_updatemask_cluster_operator]
+
+To update a cluster you can use:
+:class:`~airflow.providers.google.cloud.operators.dataproc.DataprocUpdateClusterOperator`
+
+.. exampleinclude:: ../../../../airflow/providers/google/cloud/example_dags/example_dataproc.py
+    :language: python
+    :dedent: 4
+    :start-after: [START how_to_cloud_dataproc_update_cluster_operator]
+    :end-before: [END how_to_cloud_dataproc_update_cluster_operator]
+
+Delete a cluster
+------------------
+
+To delete a cluster you can use:
+
+:class:`~airflow.providers.google.cloud.operators.dataproc.DataprocDeleteClusterOperator`.
+
+.. exampleinclude:: ../../../../airflow/providers/google/cloud/example_dags/example_dataproc.py
+    :language: python
+    :dedent: 4
+    :start-after: [START how_to_cloud_dataproc_delete_cluster_operator]
+    :end-before: [END how_to_cloud_dataproc_delete_cluster_operator]
+
+Submit a job to a cluster
+-------------------------
+
+Dataproc supports submitting jobs for several big data components.
+The list currently includes Spark, Hadoop, Pig and Hive.
+For more information on versions and images take a look at `Cloud Dataproc Image version list <https://cloud.google.com/dataproc/docs/concepts/versioning/dataproc-versions>`__.
+
+To submit a job to the cluster you need to provide a job source file. The job source file can be on GCS, on the cluster or on your local
+file system. You can specify a ``file:///`` path to refer to a local file on a cluster's master node.
+
+The job configuration can be submitted by using:
+:class:`~airflow.providers.google.cloud.operators.dataproc.DataprocSubmitJobOperator`.
+
+.. exampleinclude:: ../../../../airflow/providers/google/cloud/example_dags/example_dataproc.py
+    :language: python
+    :dedent: 4
+    :start-after: [START how_to_cloud_dataproc_submit_job_to_cluster_operator]
+    :end-before: [END how_to_cloud_dataproc_submit_job_to_cluster_operator]
+
+Examples of job configurations to submit
+----------------------------------------
+
+We have provided an example for every framework below.
+The jobs accept more arguments than the examples show. For the complete list of arguments take a look at
+`Dataproc job arguments <https://cloud.google.com/dataproc/docs/reference/rest/v1/projects.regions.jobs>`__.
+
+Example of the configuration for a PySpark Job:
+
+.. exampleinclude:: ../../../../airflow/providers/google/cloud/example_dags/example_dataproc.py
+    :language: python
+    :dedent: 4
+    :start-after: [START how_to_cloud_dataproc_pyspark_config]
+    :end-before: [END how_to_cloud_dataproc_pyspark_config]
+
+Example of the configuration for a SparkSQL Job:
+
+.. exampleinclude:: ../../../../airflow/providers/google/cloud/example_dags/example_dataproc.py
+    :language: python
+    :dedent: 4
+    :start-after: [START how_to_cloud_dataproc_sparksql_config]
+    :end-before: [END how_to_cloud_dataproc_sparksql_config]
+
+Example of the configuration for a Spark Job:
+
+.. exampleinclude:: ../../../../airflow/providers/google/cloud/example_dags/example_dataproc.py
+    :language: python
+    :dedent: 4
+    :start-after: [START how_to_cloud_dataproc_spark_config]
+    :end-before: [END how_to_cloud_dataproc_spark_config]
+
+Example of the configuration for a Hive Job:
+
+.. exampleinclude:: ../../../../airflow/providers/google/cloud/example_dags/example_dataproc.py
+    :language: python
+    :dedent: 4
+    :start-after: [START how_to_cloud_dataproc_hive_config]
+    :end-before: [END how_to_cloud_dataproc_hive_config]
+
+Example of the configuration for a Hadoop Job:
+
+.. exampleinclude:: ../../../../airflow/providers/google/cloud/example_dags/example_dataproc.py
+    :language: python
+    :dedent: 4

Review comment:
       ```suggestion
       :dedent: 0
   ```
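The ``updateMask`` behaviour described in the quoted update-cluster section can be illustrated with a small, self-contained sketch (an assumption for illustration, not Airflow or Google client code): the mask is a dotted path, relative to ``Cluster``, naming the single field that the new cluster config replaces:

```python
# Hedged sketch: apply one updateMask path, e.g.
# "config.worker_config.num_instances", by copying only that field
# from the new cluster dict into the existing one.

def apply_update_mask(cluster: dict, mask_path: str, new_cluster: dict) -> dict:
    """Copy the single field named by mask_path from new_cluster into cluster."""
    keys = mask_path.split(".")
    src, dst = new_cluster, cluster
    for key in keys[:-1]:
        src = src[key]
        dst = dst.setdefault(key, {})
    dst[keys[-1]] = src[keys[-1]]  # only the masked field is touched
    return cluster

cluster = {"config": {"worker_config": {"num_instances": 2}}}
new_cluster = {"config": {"worker_config": {"num_instances": 5}}}
updated = apply_update_mask(cluster, "config.worker_config.num_instances", new_cluster)
print(updated["config"]["worker_config"]["num_instances"])  # 5
```

This mirrors why the guide's scale-up example pairs a new cluster config with an ``updateMask``: fields outside the mask are left unchanged by the patch.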

##########
File path: docs/howto/operator/gcp/dataproc.rst
##########
@@ -0,0 +1,188 @@
+ .. Licensed to the Apache Software Foundation (ASF) under one
+    or more contributor license agreements.  See the NOTICE file
+    distributed with this work for additional information
+    regarding copyright ownership.  The ASF licenses this file
+    to you under the Apache License, Version 2.0 (the
+    "License"); you may not use this file except in compliance
+    with the License.  You may obtain a copy of the License at
+
+ ..   http://www.apache.org/licenses/LICENSE-2.0
+
+ .. Unless required by applicable law or agreed to in writing,
+    software distributed under the License is distributed on an
+    "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+    KIND, either express or implied.  See the License for the
+    specific language governing permissions and limitations
+    under the License.
+
+Google Cloud Dataproc Operators
+===============================
+
+Dataproc is a managed Apache Spark and Apache Hadoop service that lets you
+take advantage of open source data tools for batch processing, querying, streaming and machine learning.
+Dataproc automation helps you create clusters quickly, manage them easily, and
+save money by turning clusters off when you don't need them.
+
+For more information about the service visit `Dataproc product documentation <https://cloud.google.com/dataproc/docs/reference>`__.
+
+.. contents::
+  :depth: 1
+  :local:
+
+Prerequisite Tasks
+------------------
+
+.. include:: _partials/prerequisite_tasks.rst
+
+
+.. _howto/operator:DataprocCreateClusterOperator:
+
+Create a Cluster
+----------------
+
+Before you create a Dataproc cluster you need to define the cluster.
+It describes the identifying information, config, and status of a cluster of Compute Engine instances.
+For more information about the available fields to pass when creating a cluster, visit `Dataproc create cluster API <https://cloud.google.com/dataproc/docs/reference/rest/v1/projects.regions.clusters#Cluster>`__.
+
+A cluster configuration can look as follows:
+
+.. exampleinclude:: ../../../../airflow/providers/google/cloud/example_dags/example_dataproc.py
+    :language: python
+    :dedent: 4
+    :start-after: [START how_to_cloud_dataproc_create_cluster]
+    :end-before: [END how_to_cloud_dataproc_create_cluster]
+
+With this configuration we can create the cluster with
+:class:`~airflow.providers.google.cloud.operators.dataproc.DataprocCreateClusterOperator`:
+
+.. exampleinclude:: ../../../../airflow/providers/google/cloud/example_dags/example_dataproc.py
+    :language: python
+    :dedent: 4
+    :start-after: [START how_to_cloud_dataproc_create_cluster_operator]
+    :end-before: [END how_to_cloud_dataproc_create_cluster_operator]
+
+Update a cluster
+----------------
+You can scale the cluster up or down by providing a cluster config and an ``updateMask``.
+In the ``updateMask`` argument you specify the path, relative to ``Cluster``, of the field to update.
+For more information on ``updateMask`` and other parameters take a look at `Dataproc update cluster API <https://cloud.google.com/dataproc/docs/reference/rest/v1/projects.regions.clusters/patch>`__.
+
+An example of a new cluster config and the updateMask:
+
+.. exampleinclude:: ../../../../airflow/providers/google/cloud/example_dags/example_dataproc.py
+    :language: python
+    :dedent: 4
+    :start-after: [START how_to_cloud_dataproc_updatemask_cluster_operator]
+    :end-before: [END how_to_cloud_dataproc_updatemask_cluster_operator]
+
+To update a cluster you can use:
+:class:`~airflow.providers.google.cloud.operators.dataproc.DataprocUpdateClusterOperator`
+
+.. exampleinclude:: ../../../../airflow/providers/google/cloud/example_dags/example_dataproc.py
+    :language: python
+    :dedent: 4
+    :start-after: [START how_to_cloud_dataproc_update_cluster_operator]
+    :end-before: [END how_to_cloud_dataproc_update_cluster_operator]
+
+Delete a cluster
+------------------
+
+To delete a cluster you can use:
+
+:class:`~airflow.providers.google.cloud.operators.dataproc.DataprocDeleteClusterOperator`.
+
+.. exampleinclude:: ../../../../airflow/providers/google/cloud/example_dags/example_dataproc.py
+    :language: python
+    :dedent: 4
+    :start-after: [START how_to_cloud_dataproc_delete_cluster_operator]
+    :end-before: [END how_to_cloud_dataproc_delete_cluster_operator]
+
+Submit a job to a cluster
+-------------------------
+
+Dataproc supports submitting jobs for several big data components.
+The list currently includes Spark, Hadoop, Pig and Hive.
+For more information on versions and images take a look at `Cloud Dataproc Image version list <https://cloud.google.com/dataproc/docs/concepts/versioning/dataproc-versions>`__.
+
+To submit a job to the cluster you need to provide a job source file. The job source file can be on GCS, on the cluster or on your local
+file system. You can specify a ``file:///`` path to refer to a local file on a cluster's master node.
+
+The job configuration can be submitted by using:
+:class:`~airflow.providers.google.cloud.operators.dataproc.DataprocSubmitJobOperator`.
+
+.. exampleinclude:: ../../../../airflow/providers/google/cloud/example_dags/example_dataproc.py
+    :language: python
+    :dedent: 4
+    :start-after: [START how_to_cloud_dataproc_submit_job_to_cluster_operator]
+    :end-before: [END how_to_cloud_dataproc_submit_job_to_cluster_operator]
+
+Examples of job configurations to submit
+----------------------------------------
+
+We have provided an example for every framework below.
+The jobs accept more arguments than the examples show. For the complete list of arguments take a look at
+`Dataproc job arguments <https://cloud.google.com/dataproc/docs/reference/rest/v1/projects.regions.jobs>`__.
+
+Example of the configuration for a PySpark Job:
+
+.. exampleinclude:: ../../../../airflow/providers/google/cloud/example_dags/example_dataproc.py
+    :language: python
+    :dedent: 4
+    :start-after: [START how_to_cloud_dataproc_pyspark_config]
+    :end-before: [END how_to_cloud_dataproc_pyspark_config]
+
+Example of the configuration for a SparkSQL Job:
+
+.. exampleinclude:: ../../../../airflow/providers/google/cloud/example_dags/example_dataproc.py
+    :language: python
+    :dedent: 4
+    :start-after: [START how_to_cloud_dataproc_sparksql_config]
+    :end-before: [END how_to_cloud_dataproc_sparksql_config]
+
+Example of the configuration for a Spark Job:
+
+.. exampleinclude:: ../../../../airflow/providers/google/cloud/example_dags/example_dataproc.py
+    :language: python
+    :dedent: 4

Review comment:
       ```suggestion
       :dedent: 0
   ```
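The quoted guide notes that a job source file can live in GCS or on the cluster's master node via a ``file:///`` path. A small, self-contained sketch (illustrative only; the helper name and URIs are made up) of how those two URI schemes differ:

```python
# Hedged sketch: classify where a Dataproc job source file lives,
# based on the URI schemes mentioned in the guide:
#   gs://    -> Cloud Storage
#   file:/// -> local file on the cluster's master node
from urllib.parse import urlparse

def source_file_location(uri: str) -> str:
    """Return a label for where the job source file referenced by uri lives."""
    scheme = urlparse(uri).scheme
    if scheme == "gs":
        return "cloud-storage"
    if scheme == "file":
        return "cluster-master-node"
    raise ValueError(f"unsupported job source scheme: {scheme!r}")

print(source_file_location("gs://my-bucket/job.py"))   # cloud-storage
print(source_file_location("file:///tmp/job.pig"))     # cluster-master-node
```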




----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
users@infra.apache.org