
Strange redis key growth, at least in Heroku #2083

Closed · mariusandra opened this issue Oct 28, 2020 · 1 comment
Labels: bug (Something isn't working right)

Comments

@mariusandra (Collaborator) commented:

Bug description

The number of keys in Redis starts to grow quite quickly once an app is launched: 2.6k keys in 40 minutes. They take up just ~1MB of memory, but what are those keys?
[screenshot taken 2020-10-28 21:02: Redis key count growth]

It levels out at 73k keys and 25MB memory in each case.
[screenshot: keys leveling out around 73k keys / 25MB]

This screenshot is from the session recording PR's heroku instance.
[screenshot: Redis metrics from the session recording PR's Heroku instance]

While working on the plugins, I noticed that, in an undeployed state with no app running, all the keys expired within 24h.

Expected behavior

There shouldn't be a leak.

How to reproduce

Launch an app on Heroku and have a look, or check the Redis usage of any of the deployed preview apps. The sketch below shows one way to inspect the keyspace.
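To confirm what is actually accumulating, a minimal sketch with redis-py (REDIS_URL is an assumption for however your Heroku Redis add-on exposes its connection string):

import os
import redis

# Connect to the same Redis instance the app uses (REDIS_URL is how
# Heroku add-ons typically expose it; adjust to your setup).
r = redis.Redis.from_url(os.environ["REDIS_URL"])

# Use SCAN rather than KEYS, so a 73k-key instance isn't blocked.
total = 0
for _key in r.scan_iter(match="celery-task-meta-*", count=1000):
    total += 1
print("celery-task-meta keys:", total)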

Environment

  • At least any PostHog instance running on Heroku; it could be more widespread

Additional context

  • I think Celery saves some "ack" / "success" messages in the result backend after a task is done, to notify of task completion (e.g. so that await task() can resolve); see the sketch below.
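For context, this is stock Celery behaviour rather than anything PostHog-specific: with a result backend configured, every task invocation writes a celery-task-meta-<task_id> key unless the task opts out. A minimal sketch (the task name here is made up):

import os
from celery import Celery

app = Celery(
    "posthog",
    broker=os.environ["REDIS_URL"],
    backend=os.environ["REDIS_URL"],  # result backend -> celery-task-meta-* keys
)

# Even with no return value, finishing this task stores
# {"status": "SUCCESS", "result": null, ...} under
# celery-task-meta-<task_id> in the backend.
@app.task
def process_event(event_id):
    pass

The backend keys carry a TTL of result_expires, which defaults to one day in Celery; that would explain why the keys all expired within 24h once nothing new was being enqueued.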

Thank you for your bug report – we love squashing them!

@mariusandra added the bug label Oct 28, 2020
@mariusandra (Collaborator, Author) commented:

This is what has piled up in ~40 minutes:

[screenshot taken 2020-10-28 21:12: Redis keyspace after ~40 minutes]

ec2-3-210-254-187.compute-1.amazonaws.com:14519> get celery-task-meta-94f158ef-c2f7-4011-9a42-4c227e0d7b8f
{"status": "SUCCESS", "result": null, "traceback": null, "children": [], "date_done": "2020-10-28T19:49:31.704216", "task_id": "94f158ef-c2f7-4011-9a42-4c227e0d7b8f"}
ec2-3-210-254-187.compute-1.amazonaws.com:14519> get celery-task-meta-3b166121-f9dc-4303-8344-0b03cd311590
{"status": "SUCCESS", "result": null, "traceback": null, "children": [], "date_done": "2020-10-28T19:53:38.840494", "task_id": "3b166121-f9dc-4303-8344-0b03cd311590"}
ec2-3-210-254-187.compute-1.amazonaws.com:14519> get celery-task-meta-c2c4d606-6f21-4bc1-8090-405e6ecbabaf
{"status": "SUCCESS", "result": null, "traceback": null, "children": [], "date_done": "2020-10-28T19:09:54.184765", "task_id": "c2c4d606-6f21-4bc1-8090-405e6ecbabaf"}

timgl added a commit that referenced this issue Oct 28, 2020
@mariusandra mentioned this issue Oct 28, 2020
timgl added a commit that referenced this issue Oct 28, 2020
* #2083 ignore result

* Add to task
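
For anyone landing here later: "ignore result" in Celery terms means opting tasks out of result storage entirely, either per task or app-wide. A sketch of both forms (how the linked commits actually apply it is in their diffs, not here):

import os
from celery import Celery

app = Celery("posthog", broker=os.environ["REDIS_URL"],
             backend=os.environ["REDIS_URL"])

# Per task: no celery-task-meta-* key is ever written for this one.
@app.task(ignore_result=True)
def process_event(event_id):
    pass

# Or globally, for every task registered on the app:
app.conf.task_ignore_result = True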