Posted to commits@airflow.apache.org by "gabFirmaway (via GitHub)" <gi...@apache.org> on 2023/02/08 16:52:05 UTC

[GitHub] [airflow] gabFirmaway commented on issue #29423: GlueJobOperator throws error after migration to newest version of Airflow

gabFirmaway commented on issue #29423:
URL: https://github.com/apache/airflow/issues/29423#issuecomment-1422932899

   Oh! I was stuck on this error today too!
   I tested triggering existing Glue jobs on my development machine with an older version of Airflow and everything worked flawlessly.
   Today, when I created an EC2 instance, built a custom Airflow Docker container, and triggered the job, it kept asking for s3_bucket; and if you provide one, it overrides the configuration you created in the Glue editor (for example, for Python-only scripts, it changes the job to a Glue Spark 3 script with no language defined).
   
   A temporary workaround is to add the following to the task definition:
   
   create_job_kwargs={"GlueVersion": "3.0", "DefaultArguments": {"--job-language": "python"}}
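   
   A minimal sketch of how that workaround could look in a DAG. The job name, bucket, and IAM role below are assumptions for illustration; only create_job_kwargs comes from the workaround above, and the operator usage assumes the apache-airflow-providers-amazon package:
   
   ```python
   # Pin the Glue job's version and language so the operator does not
   # overwrite the configuration made in the Glue editor.
   create_job_kwargs = {
       "GlueVersion": "3.0",
       "DefaultArguments": {"--job-language": "python"},
   }
   
   # Hypothetical task definition (names are placeholders, not from the issue):
   #
   # from airflow.providers.amazon.aws.operators.glue import GlueJobOperator
   #
   # run_glue_job = GlueJobOperator(
   #     task_id="run_glue_job",
   #     job_name="my_existing_glue_job",   # assumed name of the existing job
   #     s3_bucket="my-artifacts-bucket",   # assumed; the operator asks for it
   #     iam_role_name="my-glue-role",      # assumed IAM role for the job
   #     create_job_kwargs=create_job_kwargs,
   # )
   ```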


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: commits-unsubscribe@airflow.apache.org

For queries about this service, please contact Infrastructure at:
users@infra.apache.org