Posted to commits@airflow.apache.org by GitBox <gi...@apache.org> on 2021/11/06 01:18:20 UTC

[GitHub] [airflow] josh-fell commented on a change in pull request #19437: Add Databricks PySpark job example DAG task, update cluster config

josh-fell commented on a change in pull request #19437:
URL: https://github.com/apache/airflow/pull/19437#discussion_r744040837



##########
File path: airflow/providers/databricks/example_dags/example_databricks.py
##########
@@ -71,4 +72,17 @@
         libraries=[{'jar': 'dbfs:/lib/etl-0.1.jar'}],
     )
     # [END howto_operator_databricks_named]
-    notebook_task >> spark_jar_task
+
+    # [START howto_operator_databricks_pyspark]

Review comment:
      These `# [START/END howto_operator_...]` tags are referenced in the operator documentation so the docs can render the snippet of the related example DAG that sits between the `START` and `END` tags. Using these tags keeps the example DAGs and the operator docs in sync. Would you be willing to add some context around this new task and how it relates to `DatabricksSubmitRunOperator`?
   
   You can check out how similar tags are used for the existing operators in the Databricks operator documentation [here](https://github.com/apache/airflow/edit/main/docs/apache-airflow-providers-databricks/operators.rst).
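   
   As an illustration, here is a minimal sketch of what such a tagged task could look like. The DAG id, cluster config, and DBFS path below are hypothetical placeholders, not the actual values from this PR:
   
   ```python
   from datetime import datetime
   
   from airflow import DAG
   from airflow.providers.databricks.operators.databricks import DatabricksSubmitRunOperator
   
   with DAG(
       dag_id='example_databricks_operator',  # hypothetical DAG id
       start_date=datetime(2021, 1, 1),
       schedule_interval=None,
   ) as dag:
       # Hypothetical job cluster spec; the real example DAG defines its own.
       new_cluster = {
           'spark_version': '9.1.x-scala2.12',
           'node_type_id': 'i3.xlarge',
           'num_workers': 2,
       }
   
       # [START howto_operator_databricks_pyspark]
       # Submit a PySpark job: the operator's spark_python_task parameter
       # points at a Python file stored in DBFS.
       pyspark_task = DatabricksSubmitRunOperator(
           task_id='pyspark_task',
           new_cluster=new_cluster,
           spark_python_task={'python_file': 'dbfs:/path/to/python_job.py'},
       )
       # [END howto_operator_databricks_pyspark]
   ```
   
   The operator docs can then pull exactly that span with the `exampleinclude` directive's `:start-after:`/`:end-before:` options, so the rendered snippet always matches the example DAG.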
   
   




-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: commits-unsubscribe@airflow.apache.org

For queries about this service, please contact Infrastructure at:
users@infra.apache.org