Posted to commits@airflow.apache.org by GitBox <gi...@apache.org> on 2022/03/17 14:36:10 UTC

[GitHub] [airflow] MaksYermak removed a comment on pull request #19248: Create dataproc serverless spark batches operator

MaksYermak removed a comment on pull request #19248:
URL: https://github.com/apache/airflow/pull/19248#issuecomment-1070964190


   @aoelvp94 I have checked your configuration. For it to work correctly, you should pass a dictionary instead of a Batch() object, because for some reason Airflow cannot template an object's properties.
   
   `{
       "pyspark_batch": {
           "main_python_file_uri": main_python_file_uri,
           "python_file_uris": files,
           "args": [
               "--start-date={{ get_week_ago(data_interval_start) }}",
               "--end-date={{ ds }}",
               (...)
           ],
       },
       (...)
   }`
   One more thing: in recent Airflow versions the 'execution_date' template variable is deprecated. Please use 'data_interval_start' or 'logical_date' instead.
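
   For context, here is a minimal sketch of how a dict-based batch config could be wired into the operator this PR adds (DataprocCreateBatchOperator from the google provider). The project id, region, batch id, and gs:// URIs are hypothetical, and the DAG skeleton is mine, not from the PR; it only illustrates that the Jinja expressions nested in the dict get rendered:

       # A minimal sketch, not taken from the PR: hypothetical project, region,
       # batch id, and bucket URIs; shown only to illustrate templating a dict.
       import pendulum

       from airflow import DAG
       from airflow.providers.google.cloud.operators.dataproc import DataprocCreateBatchOperator

       with DAG(
           dag_id="dataproc_serverless_batch_example",
           start_date=pendulum.datetime(2022, 1, 1, tz="UTC"),
           schedule_interval="@daily",
           catchup=False,
       ) as dag:
           create_batch = DataprocCreateBatchOperator(
               task_id="create_pyspark_batch",
               project_id="my-gcp-project",        # hypothetical project id
               region="us-central1",               # hypothetical region
               batch_id="example-pyspark-batch",   # hypothetical batch id
               # Passing a plain dict (not a Batch() object) lets Airflow render the
               # Jinja expressions nested inside it, e.g. the date arguments below.
               batch={
                   "pyspark_batch": {
                       "main_python_file_uri": "gs://my-bucket/jobs/main.py",   # hypothetical URI
                       "python_file_uris": ["gs://my-bucket/jobs/deps.zip"],    # hypothetical URI
                       "args": [
                           "--start-date={{ data_interval_start | ds }}",
                           "--end-date={{ ds }}",
                       ],
                   },
               },
           )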


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: commits-unsubscribe@airflow.apache.org

For queries about this service, please contact Infrastructure at:
users@infra.apache.org