Posted to commits@airflow.apache.org by GitBox <gi...@apache.org> on 2021/03/15 21:03:14 UTC

[GitHub] [airflow] rbankston opened a new issue #14813: Forked processes aren't logged by tasks using the Python Operator

rbankston opened a new issue #14813:
URL: https://github.com/apache/airflow/issues/14813


   **Apache Airflow version**: 2.0.0
   
   **Environment**:
   
   - **Cloud provider or hardware configuration**: 2 GHz Quad-Core i5
   - **OS** (e.g. from /etc/os-release): macOS Big Sur 11.2.3
   - **Kernel** (e.g. `uname -a`): Darwin Kernel Version 20.3.0
   - **Container OS**: Debian 10.8 4.19.121-linuxkit
   - **Install tools**: Docker compose
   
   **What happened**: When the PythonOperator runs a subprocess, the output of the forked process is never written to the task logs.
   
   ```
   [2021-03-15 20:18:12,844] {taskinstance.py:1063} INFO - Executing <Task(PythonOperator): sub> on 2021-03-15T20:13:45.710628+00:00
   [2021-03-15 20:18:12,849] {standard_task_runner.py:52} INFO - Started process 400 to run task
   [2021-03-15 20:18:12,853] {standard_task_runner.py:76} INFO - Running: ['airflow', 'tasks', 'run', 'sub_processing', 'sub', '2021-03-15T20:13:45.710628+00:00', '--job-id', '20', '--pool', 'default_pool', '--raw', '--subdir', 'DAGS_FOLDER/sub-process-2.py', '--cfg-path', '/tmp/tmpmspwuqh1', '--error-file', '/tmp/tmp75f6t8bp']
   [2021-03-15 20:18:12,854] {standard_task_runner.py:77} INFO - Job 20: Subtask sub
   [2021-03-15 20:18:12,898] {logging_mixin.py:103} INFO - Running <TaskInstance: sub_processing.sub 2021-03-15T20:13:45.710628+00:00 [running]> on host 5af90102472a
   [2021-03-15 20:18:12,937] {taskinstance.py:1256} INFO - Exporting the following env vars:
   AIRFLOW_CTX_DAG_OWNER=airflow
   AIRFLOW_CTX_DAG_ID=sub_processing
   AIRFLOW_CTX_TASK_ID=sub
   AIRFLOW_CTX_EXECUTION_DATE=2021-03-15T20:13:45.710628+00:00
   AIRFLOW_CTX_DAG_RUN_ID=manual__2021-03-15T20:13:45.710628+00:00
   [2021-03-15 20:18:23,003] {logging_mixin.py:103} INFO - Task Completed With Success
   [2021-03-15 20:18:23,004] {python.py:118} INFO - Done. Returned value was: None
   [2021-03-15 20:18:23,015] {taskinstance.py:1166} INFO - Marking task as SUCCESS. dag_id=sub_processing, task_id=sub, execution_date=20210315T201345, start_date=20210315T201812, end_date=20210315T201823
   [2021-03-15 20:18:23,055] {taskinstance.py:1219} INFO - 0 downstream tasks scheduled from follow-on schedule check
   [2021-03-15 20:18:23,069] {local_task_job.py:142} INFO - Task exited with return code 0
   ```
   
   **What you expected to happen**: The subprocess output is logged as part of the task log:
   
   ```
   [2021-03-15 20:43:28,520] {taskinstance.py:1063} INFO - Executing <Task(PythonOperator): sub> on 2021-03-15T20:13:45.710628+00:00
   [2021-03-15 20:43:28,520] {base_task_runner.py:133} INFO - Running on host: 49fb1f13cbae
   [2021-03-15 20:43:28,521] {base_task_runner.py:134} INFO - Running: ['airflow', 'tasks', 'run', 'sub_processing', 'sub', '2021-03-15T20:13:45.710628+00:00', '--job-id', '28', '--pool', 'default_pool', '--raw', '--subdir', 'DAGS_FOLDER/sub-process-2.py', '--cfg-path', '/tmp/tmpfukw5nds', '--error-file', '/tmp/tmp18xkzu_9']
   [2021-03-15 20:43:30,035] {base_task_runner.py:118} INFO - Job 28: Subtask sub [2021-03-15 20:43:30,035] {plugins_manager.py:286} INFO - Loading 2 plugin(s) took 1.00 seconds
   [2021-03-15 20:43:30,140] {base_task_runner.py:118} INFO - Job 28: Subtask sub [2021-03-15 20:43:30,139] {dagbag.py:440} INFO - Filling up the DagBag from /usr/local/airflow/dags/sub-process-2.py
   [2021-03-15 20:43:30,196] {base_task_runner.py:118} INFO - Job 28: Subtask sub Running <TaskInstance: sub_processing.sub 2021-03-15T20:13:45.710628+00:00 [running]> on host 49fb1f13cbae
   [2021-03-15 20:43:30,233] {taskinstance.py:1256} INFO - Exporting the following env vars:
   AIRFLOW_CTX_DAG_OWNER=airflow
   AIRFLOW_CTX_DAG_ID=sub_processing
   AIRFLOW_CTX_TASK_ID=sub
   AIRFLOW_CTX_EXECUTION_DATE=2021-03-15T20:13:45.710628+00:00
   AIRFLOW_CTX_DAG_RUN_ID=manual__2021-03-15T20:13:45.710628+00:00
   [2021-03-15 20:43:40,289] {base_task_runner.py:118} INFO - Job 28: Subtask sub Mon Mar 15 20:43:30 2021
   [2021-03-15 20:43:40,289] {base_task_runner.py:118} INFO - Job 28: Subtask sub Mon Mar 15 20:43:35 2021
   [2021-03-15 20:43:40,294] {logging_mixin.py:103} INFO - Task Completed With Success
   [2021-03-15 20:43:40,295] {python.py:118} INFO - Done. Returned value was: None
   [2021-03-15 20:43:40,312] {taskinstance.py:1166} INFO - Marking task as SUCCESS. dag_id=sub_processing, task_id=sub, execution_date=20210315T201345, start_date=20210315T204328, end_date=20210315T204340
   [2021-03-15 20:43:40,339] {taskinstance.py:1219} INFO - 0 downstream tasks scheduled from follow-on schedule check
   [2021-03-15 20:43:40,605] {local_task_job.py:142} INFO - Task exited with return code 0
   ```
   
   **How to reproduce it**:
   1. Create a DAG that uses the PythonOperator to call another Python script, for example:
   
   ```python
   from airflow.models import DAG
   from airflow.operators.python import PythonOperator
   from datetime import datetime
   import subprocess
   import shlex
   
   default_args = {
    'start_date': datetime(2021, 1, 1),
    'catchup': False
   }
   
   def _sub_process():
       cmd = "python /usr/local/airflow/easy.py"
       ret = subprocess.run(shlex.split(cmd)).returncode
       if ret == 0:
           print("Task Completed With Success")
       else:
           print("Error")
   
   with DAG('sub_processing', 
       schedule_interval='@hourly', 
       default_args=default_args) as dag:
       #Define tasks/operators
   
       sub = PythonOperator(
           task_id='sub',
           python_callable=_sub_process
       )
   ```
   
   2. Create a script that is named easy.py that prints time:
   
   ```python
   import time
   import sys
   
   maybe_fail = round(time.time() * 1000)
   
   if maybe_fail % 5 == 0:
       for count in range(30):
           print(time.ctime())
           time.sleep(5)
   else:
       for count in range(2):
           print(time.ctime())
           time.sleep(5)
   ```
   
   3. Run the DAG and view the output of the tasks.
   
   How often does this problem occur? Every time, unless `CAN_FORK = False` is set in `standard_task_runner.py`.
   



[GitHub] [airflow] xinbinhuang commented on issue #14813: Forked processes aren't logged by tasks using the Python Operator

Posted by GitBox <gi...@apache.org>.
xinbinhuang commented on issue #14813:
URL: https://github.com/apache/airflow/issues/14813#issuecomment-834511702


   > @xinbinhuang By default subprocess inherits stdout/err, so it should get printed:
   > 
   > ```
   > airflow ❯ python -c 'import subprocess; print(subprocess.run(["date"]).returncode)'
   > Fri  7 May 12:58:47 BST 2021
   > 0
   > ```
   
   Yup, it does get printed to stdout/stderr, but it doesn't go into the Airflow log file or show up in the task log view in the webserver, because writing to the log file is routed through the logger.
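   
   To illustrate the mechanism (my reading of it, not a quote of Airflow's code): the task process redirects the Python-level `sys.stdout` into the logger, but a forked subprocess inherits the OS-level file descriptor, which that redirection never touches. A standalone sketch:
   
   ```python
   # Minimal illustration (plain stdlib, not Airflow code): a Python-level stdout
   # redirection captures print() from the parent, but a child process writes to
   # the inherited OS-level file descriptor and bypasses the redirection.
   import contextlib
   import io
   import subprocess
   
   buf = io.StringIO()
   with contextlib.redirect_stdout(buf):
       print("parent print")                     # captured by the redirection
       subprocess.run(["echo", "child output"])  # goes straight to the real fd 1
   
   print("captured:", repr(buf.getvalue()))  # only "parent print" was captured
   ```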



[GitHub] [airflow] xinbinhuang edited a comment on issue #14813: Forked processes aren't logged by tasks using the Python Operator

Posted by GitBox <gi...@apache.org>.
xinbinhuang edited a comment on issue #14813:
URL: https://github.com/apache/airflow/issues/14813#issuecomment-820876982


   Hi @rbankston, I don't think this is an Airflow bug, but just how the `subprocess` and `logging` modules work. In short, you need to propagate the `stdout` of your subprocess back into the task process in order for it to be logged in Airflow. There are two options:
   
   1. propagate the `stdout` yourself, i.e.:
   ```python
       ...
       p = subprocess.run(shlex.split(cmd), capture_output=True)
       print(p.stdout)
       ret = p.returncode
       ...
   ```
   In this case, `p.stdout` is raw bytes (or a string if you pass `text=True`), and you may need to parse it with some extra logic.
   
   2. Use the `SubprocessHook` (recommended). This will automatically capture the stdout and log it in Airflow for you.
   
   ```python
   sub_result:  SubprocessResult = SubprocessHook().run_command(shlex.split(cmd))
   
   print(sub_result.exit_code)
   
   # this is only necessary if you want to do extra stuff with the log other than logging in airflow
   print(sub_result.output)
   ```
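   
   For completeness, a fuller sketch of option 2 wired into the callable from the issue (this assumes Airflow 2.1+, where `SubprocessHook` is importable from `airflow.hooks.subprocess`):
   
   ```python
   # Sketch only: option 2 applied to the _sub_process callable from the issue.
   # Assumes Airflow 2.1+, where SubprocessHook lives in airflow.hooks.subprocess.
   import shlex
   
   from airflow.hooks.subprocess import SubprocessHook, SubprocessResult
   
   
   def _sub_process():
       cmd = "python /usr/local/airflow/easy.py"
       # run_command streams each line of the child's merged stdout/stderr through
       # the hook's logger, so the output shows up in the task log view.
       result: SubprocessResult = SubprocessHook().run_command(shlex.split(cmd))
       if result.exit_code == 0:
           print("Task Completed With Success")
       else:
           print("Error")
   ```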
   
   cc: @dstandish 
   



[GitHub] [airflow] raphaelauv commented on issue #14813: Forked processes aren't logged by tasks using the Python Operator

Posted by GitBox <gi...@apache.org>.
raphaelauv commented on issue #14813:
URL: https://github.com/apache/airflow/issues/14813#issuecomment-976046463


   @kaxil you can apparently remove this issue from the 2.3.0 milestone: https://github.com/apache/airflow/milestone/36?closed=1



[GitHub] [airflow] potiuk edited a comment on issue #14813: Forked processes aren't logged by tasks using the Python Operator

Posted by GitBox <gi...@apache.org>.
potiuk edited a comment on issue #14813:
URL: https://github.com/apache/airflow/issues/14813#issuecomment-972693383


   Agreed, it's not easy, but I think it's just a matter of careful implementation (and making some compromises), rather than pulling in yet another dependency.
   
   Isn't it the case that stderr is also sent to stdout in SubprocessHook, @hterik? I think someone (not me :) ) decided that in this case we do not care about the stderr/stdout difference (see the `stderr=STDOUT` line):
   
   ```python
               self.sub_process = Popen(
                   command,
                   stdout=PIPE,
                   stderr=STDOUT,
                   cwd=cwd,
                   env=env if env or env == {} else os.environ,
                   preexec_fn=pre_exec,
               )
   
               self.log.info('Output:')
               line = ''
               for raw_line in iter(self.sub_process.stdout.readline, b''):
                   line = raw_line.decode(output_encoding).rstrip()
                   self.log.info("%s", line)
   ```



[GitHub] [airflow] potiuk commented on issue #14813: Forked processes aren't logged by tasks using the Python Operator

Posted by GitBox <gi...@apache.org>.
potiuk commented on issue #14813:
URL: https://github.com/apache/airflow/issues/14813#issuecomment-972697748


   That's how it works:
   
   > subprocess.STDOUT
   > Special value that can be used as the stderr argument to Popen and indicates that standard error should go into the same handle as standard output.
   
   And I think this is precisely why @xinbinhuang was right - we already have mechanisms implemented in Airflow that handle this nicely and make logging work, so we should close the issue IMHO.
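   
   A quick stdlib illustration of that `stderr=STDOUT` behaviour (a sketch, not Airflow code): both streams arrive on the single stdout pipe, so one reader sees everything.
   
   ```python
   # With stderr=subprocess.STDOUT, the child's stdout and stderr are merged
   # into one pipe, so a single reader (like SubprocessHook's readline loop)
   # sees all of the output.
   import subprocess
   
   p = subprocess.run(
       ["python", "-c", "import sys; print('to stdout'); print('to stderr', file=sys.stderr)"],
       stdout=subprocess.PIPE,
       stderr=subprocess.STDOUT,
   )
   print(p.stdout.decode())  # contains both 'to stdout' and 'to stderr'
   ```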



[GitHub] [airflow] potiuk closed issue #14813: Forked processes aren't logged by tasks using the Python Operator

Posted by GitBox <gi...@apache.org>.
potiuk closed issue #14813:
URL: https://github.com/apache/airflow/issues/14813


   




[GitHub] [airflow] raphaelauv commented on issue #14813: Forked processes aren't logged by tasks using the Python Operator

Posted by GitBox <gi...@apache.org>.
raphaelauv commented on issue #14813:
URL: https://github.com/apache/airflow/issues/14813#issuecomment-952963256


   Maybe we can close this issue; the solution from @xinbinhuang is good.




[GitHub] [airflow] hterik commented on issue #14813: Forked processes aren't logged by tasks using the Python Operator

Posted by GitBox <gi...@apache.org>.
hterik commented on issue #14813:
URL: https://github.com/apache/airflow/issues/14813#issuecomment-972665718


   Note that SubprocessHook still sends the stderr of the subprocess straight to stdout without going through the logger. It also differs a lot from stdlib subprocess behaviour in ways that can be unexpected, like changing the cwd to a tempdir.
   
   Having written a similar class myself before, I have to warn that it is not as easy as one might think to simultaneously capture and stream both stdout and stderr to a logger. You will end up more or less re-implementing Popen.communicate() so that neither stream fills up and blocks reads of the other; see the sketch after this comment.
   
   I would recommend looking for a subprocess wrapper outside of Airflow that can handle this. It's a general Python logging + subprocess issue that affects more than Airflow. Maybe try https://github.com/amoffat/sh; I haven't used it much, but it looks like it might be possible to tweak it for this. Right now it puts the subprocess output in DEBUG logs, but those logs also contain a lot of other verbose data.
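   
   For reference, a minimal sketch (plain stdlib, not an Airflow API; names are illustrative) of what streaming both pipes into a logger ends up looking like - one reader thread per pipe, so neither pipe fills up and blocks the child:
   
   ```python
   # Illustrative only: stream a child's stdout and stderr into a logger
   # concurrently, using one reader thread per pipe to avoid the deadlock
   # you get when one pipe's buffer fills while you block reading the other.
   import logging
   import subprocess
   import threading
   
   log = logging.getLogger(__name__)
   
   
   def _pump(pipe, level):
       # Read the pipe line by line until EOF and forward each line to the logger.
       with pipe:
           for raw_line in iter(pipe.readline, b""):
               log.log(level, raw_line.decode(errors="replace").rstrip())
   
   
   def run_and_log(cmd):
       proc = subprocess.Popen(cmd, stdout=subprocess.PIPE, stderr=subprocess.PIPE)
       threads = [
           threading.Thread(target=_pump, args=(proc.stdout, logging.INFO)),
           threading.Thread(target=_pump, args=(proc.stderr, logging.WARNING)),
       ]
       for t in threads:
           t.start()
       for t in threads:
           t.join()
       return proc.wait()
   ```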






[GitHub] [airflow] boring-cyborg[bot] commented on issue #14813: Forked processes aren't logged by tasks using the Python Operator

Posted by GitBox <gi...@apache.org>.
boring-cyborg[bot] commented on issue #14813:
URL: https://github.com/apache/airflow/issues/14813#issuecomment-799750444


   Thanks for opening your first issue here! Be sure to follow the issue template!
   





[GitHub] [airflow] potiuk commented on issue #14813: Forked processes aren't logged by tasks using the Python Operator

Posted by GitBox <gi...@apache.org>.
potiuk commented on issue #14813:
URL: https://github.com/apache/airflow/issues/14813#issuecomment-972698908


   Closing - but if there are other voices, opinions, or arguments here, I am happy to reconsider and reopen.





[GitHub] [airflow] ashb commented on issue #14813: Forked processes aren't logged by tasks using the Python Operator

Posted by GitBox <gi...@apache.org>.
ashb commented on issue #14813:
URL: https://github.com/apache/airflow/issues/14813#issuecomment-834308360


   @kaxil Didn't we fix this already? 🤔 



[GitHub] [airflow] ashb commented on issue #14813: Forked processes aren't logged by tasks using the Python Operator

Posted by GitBox <gi...@apache.org>.
ashb commented on issue #14813:
URL: https://github.com/apache/airflow/issues/14813#issuecomment-834310139


   @xinbinhuang By default subprocess inherits stdout/err, so it should get printed:
   
   ```
   airflow ❯ python -c 'import subprocess; print(subprocess.run(["date"]).returncode)'
   Fri  7 May 12:58:47 BST 2021
   0
   ```

