Celery Executor is not working with redis-py 5.0.0 #33744
I am not able to reproduce it a second time once I implemented the workaround I mentioned, downgrading the Python redis package to 4.x.
I tried with Breeze and I also can't reproduce the issue I was having. It was a recurring issue after upgrading to 2.7.0, even after multiple restarts of my Docker Compose environment, and it only got resolved after downgrading redis-py to 4.x. The issue was strange given the error message it produced. If nobody else can figure out what might have happened, maybe close this issue (since the Celery package will soon restrict redis-py to <5.0.0 anyway).
Redis 5, released last week, breaks Celery. Celery is limiting it for now and will resolve it later; we should similarly limit redis on our side, for users who will not upgrade to the Celery release coming shortly. Fixes: apache#33744
@potiuk just an FYI that Celery has now released v5.3.3, which includes the change to limit the Redis client to 4.x, and Kombu's PR is now open and approved, so it will likely be merged soon.
Hello,

Python 3.8.6
Name: kombu
Name: apache-airflow
Name: redis
Traceback (most recent call last):
[airflow@xxxxx-scheduler-1 airflow]$ ls /home/airflow/.local/lib/python3.8/site-packages/redis
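The package-version checks in the comment above can be reproduced from the standard library without importing the packages themselves; a minimal sketch, assuming any of the three may be absent from the environment:

```python
from importlib.metadata import version, PackageNotFoundError

# Report the installed version of each package involved in the report,
# or note that it is missing, without importing the packages themselves.
for pkg in ("kombu", "apache-airflow", "redis"):
    try:
        print(f"Name: {pkg}  Version: {version(pkg)}")
    except PackageNotFoundError:
        print(f"Name: {pkg}  (not installed)")
```

`importlib.metadata` is available from Python 3.8 onward, matching the Python 3.8.6 shown in the comment.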
I think you have something messed up in your environment. I recommend you look at it more closely, and if you need help, open a new discussion or a Slack troubleshooting query with more details of your environment and your findings. There is no problem with importing redis.client in 4.6.0.
Also, maybe somewhere on PYTHONPATH you have another one. Anyhow, this is not the same issue; start a new discussion on it.
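A quick way to test the PYTHONPATH theory above is to ask Python where an `import redis` would resolve, without actually importing it; a minimal sketch:

```python
import importlib.util
import sys

# find_spec reports where "import redis" would resolve, or None if it
# cannot be found at all -- useful for spotting a shadowing module.
spec = importlib.util.find_spec("redis")
print("redis resolves to:", (spec.origin or "unknown origin") if spec else "not found")

# The search path itself, in resolution order:
for entry in sys.path:
    print(entry)
```

If the reported path is not under `site-packages`, some other directory on `sys.path` is shadowing the installed redis-py package.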
Thanks a lot for you answer. |
That's what I mean: something is messed up in your env (but I have no idea why, and only you can investigate it). Our CI and development environment already does the "whole stack". When you install Airflow in a given version, you are supposed to use constraints - see the docs: https://airflow.apache.org/docs/apache-airflow/stable/installation/installing-from-pypi.html - to achieve a reproducible installation with a "known good" set of dependencies. All those are tested and verified, and they are known to work. Only that set is "guaranteed" to work. You can downgrade or upgrade dependencies within the requirement limits that Airflow has (and in most cases you will be fine), but then some dependencies might cause problems (it's impossible to test the matrix of combinations of 670 dependencies). For Airflow 2.7.2 with the Python 3.8 constraints, for example, we know that Airflow works with the dependencies described in our constraints: https://github.com/apache/airflow/blob/constraints-2.7.2/constraints-3.8.txt For example there:
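The constraint files mentioned above follow a predictable URL pattern per Airflow release and Python minor version; a minimal sketch (the `constraints_url` helper name is my own) that builds the URL to pass to `pip install --constraint`:

```python
def constraints_url(airflow_version: str, python_version: str) -> str:
    # Build the "known good" constraints URL published for each
    # Airflow release / Python minor version pair.
    return (
        "https://raw.githubusercontent.com/apache/airflow/"
        f"constraints-{airflow_version}/constraints-{python_version}.txt"
    )

print(constraints_url("2.7.2", "3.8"))
# Used roughly as:
#   pip install "apache-airflow[celery]==2.7.2" --constraint <that URL>
```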
I just ran Airflow 2.7.2 with that set of constraints and there is no problem - everything imports fine in Celery, everything works. You can check it yourself (and our CI does it automatically) by running the development tool called
There - you have running Airflow with Celery and Redis and everything works fine, no import problem. That's why I think something is broken in your environment. What it is, hard to say, but I recommend you start with the constraint list, compare your installed dependency versions with those on the constraint list, and work it out from there. Maybe some other dependency you have installed is breaking things. What we can do is provide you with "known good" constraints; if you diverge from them, you have all the means to investigate what's causing a problem like this (or you can stick to our constraints, which are tested, verified, and provide a reproducible installation). Unless you have a reason to diverge from them, I recommend you follow the installation process we have, as this is the only one that provides a reproducible, working installation that we had a chance to test.
This issue is also showing up in
@ybryan I suggest you open a separate discussion where you describe your problem with all the details. Describe what you do and how you modify the image, or what you do there. If you think you do not modify the image, double-check it. The apache/airflow:2.7.3-python3.8 image has both celery and redis in
So, if you need help, please start a new discussion where you describe your circumstances and refer to this one as a "similar issue". Commenting on a closed issue is almost always a bad idea because it is, well, closed. Instead, creating a new issue (or a discussion, if you're not sure) where you describe your problem usually has a much better chance that something happens.
Thanks Jared. New discussion here
Apache Airflow version
2.7.0
What happened
After upgrading to Airflow 2.7.0 in my local environment, my Airflow DAGs won't run with Celery Executor using Redis, even after changing the `celery_app_name` configuration in the `[celery]` section from `airflow.executors.celery_executor` to `airflow.providers.celery.executors.celery_executor`.

I see the error actually is unrelated to the recent Airflow Celery provider changes, but is related to Celery's Redis support. What is happening is that Airflow fails to send jobs to the worker, as the Kombu module is not compatible with Redis 5.0.0 (released last week). It gives this error (I will update this to the full traceback once I can reproduce this error one more time):
Celery actually is limiting redis-py to 4.x in an upcoming version of Celery 5.3.x (it is merged to main as of August 17, 2023, but it is not yet released: celery/celery#8442. The latest version is v5.3.1, released on June 18, 2023).
Kombu is also going to match Celery and limit redis-py to 4.x in an upcoming version (the PR is a draft; I assume they are waiting for the Celery change to be released: celery/kombu#1776).
For now there is not really a way to fix this, unless we can add a redis constraint to avoid 5.x. Or maybe, once the next Celery 5.3.x release limits redis-py to 4.x, we can limit the Celery provider to that version of Celery?
What you think should happen instead
Airflow should be able to send jobs to workers when using Celery Executor with Redis
How to reproduce
Workaround:
redis>=4.5.2,<5.0.0,!=4.5.5
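To sanity-check whether an installed redis-py version satisfies the workaround pin above, here is a small pure-stdlib sketch (the `satisfies_pin` helper is my own and only handles plain `X.Y.Z` version strings):

```python
def satisfies_pin(version: str) -> bool:
    # Evaluate the workaround pin redis>=4.5.2,<5.0.0,!=4.5.5
    # against a plain X.Y.Z version string.
    v = tuple(int(part) for part in version.split("."))
    return (4, 5, 2) <= v < (5, 0, 0) and v != (4, 5, 5)

print(satisfies_pin("4.6.0"))  # True  - inside the allowed range
print(satisfies_pin("5.0.0"))  # False - the breaking major release
print(satisfies_pin("4.5.5"))  # False - explicitly excluded
```

For real requirement specifiers the `packaging` library's `SpecifierSet` is the robust choice; this sketch just makes the pin's intent concrete.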
Operating System
Debian GNU/Linux 11 (bullseye)
Versions of Apache Airflow Providers
apache-airflow-providers-celery==3.3.2
Deployment
Docker-Compose
Deployment details
I am using the `bitnami/airflow:2.7.0` image in Docker Compose, where I first encountered this issue, but I will test with Breeze as well shortly and then update this issue.

Anything else
No response
Are you willing to submit PR?
Code of Conduct