Posted to commits@airflow.apache.org by "nivangio (via GitHub)" <gi...@apache.org> on 2023/03/02 16:38:09 UTC

[GitHub] [airflow] nivangio commented on a diff in pull request #29840: DatabricksSubmitRunOperator to support taskflow

nivangio commented on code in PR #29840:
URL: https://github.com/apache/airflow/pull/29840#discussion_r1123399094


##########
airflow/providers/databricks/operators/databricks.py:
##########
@@ -285,12 +285,12 @@ def __init__(
         *,
         json: Any | None = None,
         tasks: list[object] | None = None,
-        spark_jar_task: dict[str, str] | None = None,
-        notebook_task: dict[str, str] | None = None,
-        spark_python_task: dict[str, str | list[str]] | None = None,
-        spark_submit_task: dict[str, list[str]] | None = None,
-        pipeline_task: dict[str, str] | None = None,
-        dbt_task: dict[str, str | list[str]] | None = None,

Review Comment:
   Ok, this might be unrelated to the specific issue we are discussing here, but passing a `dict[str, dict[str, str]]` is also accepted and produces the expected outcome under the `notebook_task` parameter, for example.
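
   For a concrete case, something like the following works at runtime today but would trip mypy under a `dict[str, str]` annotation (the path and parameter values here are made up for illustration; `base_parameters` is the Databricks Jobs API key for notebook parameters, and its value is a dict):

   ```python
   # Illustrative notebook_task payload: "base_parameters" carries a nested
   # dict, so the values of notebook_task are not uniformly str.
   notebook_task = {
       "notebook_path": "/Users/someone@example.com/my_notebook",  # made-up path
       "base_parameters": {"param1": "value1"},  # dict value, not str
   }
   ```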
   
   However, it fails when running the mypy checks. I agree that typing these as `object` might be too vague and will need some more thought but, at the very least, the value type for the `notebook_task` param (and probably for the rest too) should accept `dict`, `str`, and `int` to stay in line with what is accepted so far (i.e., without this PR) and, eventually, `XComArg` and `PlainXComArg` to support the changes in this PR.
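
   To make that concrete, here is a minimal sketch of what I have in mind (the `TaskValue` alias and the stub class are made up for illustration; the real change would go on the actual operator signature, and, if I recall correctly, `PlainXComArg` subclasses `XComArg`, so accepting `XComArg` should cover both):

   ```python
   from __future__ import annotations

   from typing import Union

   from airflow.models.xcom_arg import XComArg

   # Made-up alias: the value types a task dict should accept, per the
   # suggestion above.
   TaskValue = Union[str, int, dict, XComArg]


   class OperatorSketch:
       """Illustrative stub only, not the real DatabricksSubmitRunOperator."""

       def __init__(
           self,
           *,
           notebook_task: dict[str, TaskValue] | None = None,
           spark_jar_task: dict[str, TaskValue] | None = None,
       ) -> None:
           self.notebook_task = notebook_task
           self.spark_jar_task = spark_jar_task
   ```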



-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: commits-unsubscribe@airflow.apache.org

For queries about this service, please contact Infrastructure at:
users@infra.apache.org