Posted to commits@airflow.apache.org by GitBox <gi...@apache.org> on 2021/03/26 19:16:07 UTC

[GitHub] [airflow] ashb commented on a change in pull request #15037: Batch send to not overload multiprocessing pipe

ashb commented on a change in pull request #15037:
URL: https://github.com/apache/airflow/pull/15037#discussion_r602534318



##########
File path: airflow/jobs/scheduler_job.py
##########
@@ -1245,6 +1252,9 @@ def _process_executor_events(self, session: Session = None) -> int:
 
                 self.processor_agent.send_callback_to_execute(request)
 
+            if i % CALLBACK_SEND_BATCH_SIZE == 0:

Review comment:
       Hmmm, I don't _think_ this will help actually.
   
    This block/line 1253 doesn't get called that often, so I suspect what is happening is that the processor_manager side of the pipe is full. Trying to send _even a single byte_ then blocks until the other end reads, but the other end can't read because it is also trying to write.
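    To make that failure mode concrete, here is a minimal sketch (plain `multiprocessing`, not Airflow code) of both ends of a duplex `Pipe` only writing: once the OS buffers in both directions are full, each `send()` blocks waiting for a `recv()` that never happens.
    
```python
# Minimal sketch, NOT Airflow code: a duplex Pipe where both processes
# only send() and never recv(). Running this will hang by design --
# that is the deadlock being described above.
import multiprocessing


def flood(conn):
    # Stand-in for the manager side, which keeps writing results
    # without draining what the scheduler side is sending it.
    payload = b"x" * 65536  # roughly one OS pipe/socket buffer per send
    while True:
        conn.send(payload)  # blocks once the buffer towards the parent is full


if __name__ == "__main__":
    parent_conn, child_conn = multiprocessing.Pipe()
    multiprocessing.Process(target=flood, args=(child_conn,), daemon=True).start()

    payload = b"y" * 65536
    while True:
        # The scheduler side also only writes. Once both directions are
        # full, both send() calls block forever.
        parent_conn.send(payload)
```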
   
    And trying to heartbeat every time before writing would be slow!
   
   🤔  
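    For scale, here is a rough, hypothetical sketch of the trade-off: the agent below is a counting stand-in, not the real `DagFileProcessorAgent` API, and `CALLBACK_SEND_BATCH_SIZE` is just the illustrative value 100. With the batched check from the diff hunk above, 10,000 callbacks cost 100 heartbeats; heartbeating before every send would cost 10,000.
    
```python
# Hypothetical sketch only: CALLBACK_SEND_BATCH_SIZE and the agent below
# are stand-ins, not the real scheduler_job / DagFileProcessorAgent code.
CALLBACK_SEND_BATCH_SIZE = 100


class CountingAgent:
    """Counts calls so the overhead of each strategy is visible."""

    def __init__(self):
        self.heartbeats = 0
        self.sends = 0

    def heartbeat(self):
        self.heartbeats += 1  # the real heartbeat also services the manager/pipe

    def send_callback_to_execute(self, request):
        self.sends += 1  # the real send can block if the pipe is full


agent = CountingAgent()
for i, request in enumerate(range(10_000), start=1):
    agent.send_callback_to_execute(request)
    # Batched variant, as in the diff hunk above. Heartbeating before
    # every send instead would mean 10,000 heartbeats rather than 100.
    if i % CALLBACK_SEND_BATCH_SIZE == 0:
        agent.heartbeat()

print(agent.sends, agent.heartbeats)  # -> 10000 100
```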




-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
users@infra.apache.org