Relevant Logs
[2024-02-23T00:48:09.267+0000] {triggerer_job_runner.py:573} INFO - Triggerer's async thread was blocked for 0.23 seconds, likely by a badly-written trigger. Set PYTHONASYNCIODEBUG=1 to get more information on overrunning coroutines.
[2024-02-23T08:00:44.325+0000] {triggerer_job_runner.py:573} INFO - Triggerer's async thread was blocked for 0.38 seconds, likely by a badly-written trigger. Set PYTHONASYNCIODEBUG=1 to get more information on overrunning coroutines.
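As the log message suggests, asyncio's debug mode (which the triggerer picks up from the PYTHONASYNCIODEBUG=1 environment variable) makes the event loop name the exact coroutine that overran. A minimal, Airflow-free sketch of that mechanism, assuming hypothetical function names; `debug=True` in `asyncio.run()` is equivalent to setting PYTHONASYNCIODEBUG=1:

```python
import asyncio
import logging
import time

# Debug-mode "took N seconds" reports go to the "asyncio" logger;
# capture them so we can inspect the offending coroutine's name.
records = []

class _Capture(logging.Handler):
    def emit(self, record):
        records.append(record.getMessage())

logging.getLogger("asyncio").addHandler(_Capture())
logging.getLogger("asyncio").setLevel(logging.WARNING)

async def slow_step():
    # A synchronous call inside a coroutine: this is what "blocked the
    # async thread" means in the triggerer's warning.
    time.sleep(0.2)

async def main():
    # In debug mode, any loop step longer than slow_callback_duration
    # (0.1 s by default) is logged with the coroutine that caused it.
    await slow_step()

# Equivalent to running the process with PYTHONASYNCIODEBUG=1.
asyncio.run(main(), debug=True)
print(records)
```

With debug mode on, the captured output identifies the task and coroutine that held the loop, which is how you find the misbehaving trigger class in the triggerer's logs.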
Checks
User-Community Airflow Helm Chart
Chart Version
8.8.0
Kubernetes Version
Helm Version
Description
The Airflow triggerer pod keeps logging the following warning, and it is slowing down Airflow processes:
Triggerer's async thread was blocked for 0.23 seconds, likely by a badly-written trigger. Set PYTHONASYNCIODEBUG=1 to get more information on overrunning coroutines.
I tried resolving this by increasing resources, but even after removing all limits and giving the pod 10 GB of RAM and plenty of CPU headroom, it still raises this error. I also checked the response times of the Postgres database and couldn't find anything that could slow down the async process and cause this error. Please let me know what other steps I can take to resolve this.
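For context on why more RAM and CPU do not help: all triggers share one asyncio event loop in the triggerer, so a single synchronous call inside any trigger's coroutine freezes every trigger at once, regardless of resources. Below is a minimal, Airflow-free sketch (the trigger names and the 0.3 s delay are illustrative) that measures how long the loop is stalled by a blocking call versus the same work offloaded with `asyncio.to_thread`:

```python
import asyncio
import time

async def badly_written_trigger():
    # Synchronous sleep: freezes the shared event loop for 0.3 s,
    # exactly the condition the triggerer's warning reports.
    time.sleep(0.3)

async def well_behaved_trigger():
    # Offloading the blocking work to a worker thread keeps the loop free.
    await asyncio.to_thread(time.sleep, 0.3)

async def max_loop_stall(trigger_coro):
    """Longest gap during which the loop could not service a 10 ms heartbeat."""
    stall = 0.0
    done = asyncio.Event()

    async def heartbeat():
        nonlocal stall
        last = time.monotonic()
        while not done.is_set():
            await asyncio.sleep(0.01)
            now = time.monotonic()
            stall = max(stall, now - last)
            last = now

    hb = asyncio.ensure_future(heartbeat())
    await asyncio.sleep(0.05)   # let the heartbeat start ticking
    await trigger_coro
    done.set()
    await hb
    return stall

async def main():
    bad = await max_loop_stall(badly_written_trigger())
    good = await max_loop_stall(well_behaved_trigger())
    print(f"blocking call stalled the loop for {bad:.2f}s")
    print(f"to_thread call stalled the loop for {good:.2f}s")
    return bad, good

if __name__ == "__main__":
    asyncio.run(main())
```

The practical implication: the fix is usually in a deferrable operator's trigger code (or a provider's), not in pod resources, so the next step is identifying which trigger classes are running when the warning appears.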
Custom Helm Values
No response