
Pin redis py to 4.3.5 #349

Merged: 9 commits into django:main, Mar 9, 2023

Conversation

bbrowning918 (Contributor)

In practice this seems to be the last/most stable version of redis-py for the test suite. GitHub's default job timeout is 6 hours, which is excessive; if tox hangs and takes longer than 10 minutes, it never comes back.

This would be a temporary fix until proper support/a fix for #348 lands.
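
For context, the change amounts to a dependency pin plus a job timeout (GitHub Actions has a per-job `timeout-minutes` setting that can cap the tox run at 10 minutes instead of the 6-hour default). Below is a minimal sketch of the pin, assuming it sits in the project's setup.py; the exact file and surrounding entries are assumptions, not this PR's diff:

```python
# Hypothetical sketch of the pin, not the actual diff from this PR; whether it
# belongs in install_requires or a test extra is an assumption here.
from setuptools import setup, find_packages

setup(
    name="channels_redis",
    packages=find_packages(exclude=["tests"]),
    install_requires=[
        "redis==4.3.5",  # last redis-py release the test suite appears stable on
        "msgpack~=1.0",  # illustrative placeholder for the other dependencies
    ],
)
```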

bbrowning918 (Contributor, Author)

Stuck again on our old friend test_receive_hang.

bbrowning918 (Contributor, Author) commented Mar 1, 2023

The tests hang consistently in the same spot, but not on the same Python version. Sister runs in my fork will fail on one or more different Python versions than here and, even more puzzlingly, can succeed from time to time.

tests/test_pubsub_sentinel.py::test_receive_hang is where the tests get caught up. I am trying my usual prodding to figure out where/why.

bbrowning918 marked this pull request as draft March 1, 2023 04:14
carltongibson (Member)

Thanks for your work on this @bbrowning918! 🎁

It's #319 again/still 😅

test_receive_hang… Q: Do you think it's worth skipping this and adding it to #348, to allow a bit of time to resolve it? 🤔
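
If skipping is the route taken, one minimal sketch of what that could look like in the test module (the decorator and reason text below are illustrative, not code from this PR):

```python
# Illustrative only: park the flaky test while #348 is investigated.
import pytest

@pytest.mark.skip(reason="Intermittent hang on redis-py >= 4.4.0; tracked in #348")
async def test_receive_hang(channel_layer):
    ...
```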

bbrowning918 (Contributor, Author)

I had considered that but am unsure, which is why I moved this to a draft.

#319 was certainly improved upon and lessened. That the regression test added for that issue is itself hanging hints that something still lurks.

Should #348 be resolved (it shows a failure and a hang in a different spot, and I believe earlier in the suite), then skipping straight to redis-py >= 4.4.0, where the suite passes every time, would be satisfactory. My goal here is to reduce false positives that block or cloud the merging of others' excellent contributions and improvements.

I do not know how much noise my aggressively retrying the actions creates for you and other maintainers. I do not want to be intrusive as I stumble about experimenting with stability. I have no issue closing this and playing around in a fork until (or unless) there is a meaningful result.

carltongibson (Member)

Hey @bbrowning918 — no noise, please do experiment away! 😀

bbrowning918 (Contributor, Author)

OK, 3 for 3 in both fork and main is something. The tests themselves pass, but the pytest fixture for the channel layer hangs on its call to await channel_layer.flush() for pubsub, both with and without a sentinel.

I am as yet unable to reproduce the flush hang consistently locally, so I cannot write a test against it.
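
One way to make that failure mode visible rather than letting the job stall is to bound the teardown flush with a timeout. A rough sketch, assuming a pytest-asyncio fixture around the pub/sub layer; this is not the suite's actual fixture:

```python
# Rough sketch, not the suite's real fixture: bound the teardown flush so a
# hang surfaces as a TimeoutError instead of stalling the job for hours.
import async_timeout
import pytest_asyncio

from channels_redis.pubsub import RedisPubSubChannelLayer


@pytest_asyncio.fixture
async def channel_layer():
    layer = RedisPubSubChannelLayer(hosts=["redis://localhost:6379"])
    yield layer
    async with async_timeout.timeout(10):  # give flush 10 seconds, then raise
        await layer.flush()
```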

carltongibson marked this pull request as ready for review March 9, 2023 10:43
carltongibson (Member) left a comment

OK, let's take this for now.

Using redis 4.3.5 I'm not seeing the hang locally, whereas as soon as I move to 4.4.x or above I do.

@carltongibson carltongibson merged commit 89b29ad into django:main Mar 9, 2023
fredfsh pushed a commit to cobalt-robotics/channels_redis that referenced this pull request Jun 20, 2023
* tox tests longer than 10 minutes are likely hung so let github bail
* pin redis 4.3.5
* let the async_timeout occur
* Added change note.

---------

Co-authored-by: Carlton Gibson <carlton.gibson@noumenal.es>