Caching design problem for precomputed social feeds
I am learning Redis and caching in order to integrate them into a social app I am building, so designing a scalable caching system is a bit overwhelming.
I am using Redis (via Node.js) to precompute each user's newsfeed: when a post is created, I find all followers of the posting user and push a reference to the post onto each follower's newsfeed list, as an object like:

```
{ postId: "id", userId: "id" }
```
I also store the whole post object in Redis as a key/value pair with an expiration, so when a user loads their precomputed list, each reference can be resolved to the full post object from Redis, or from MongoDB if the cached post has expired or been evicted by the LRU policy.
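To make the write path concrete, here is a minimal sketch of the fan-out described above. It uses a plain in-memory `Map` as a stand-in for Redis (a real app would use a client like node-redis), and the names `fanOutPost` and `lpush` are illustrative, not from any library:

```javascript
// In-memory stand-ins for Redis structures (hypothetical, for illustration only)
const lists = new Map(); // stands in for Redis lists (LPUSH/LRANGE)
const kv = new Map();    // stands in for Redis key/value (SET ... EX <ttl>)

function lpush(key, value) {
  // Prepend, mimicking Redis LPUSH (newest post first)
  if (!lists.has(key)) lists.set(key, []);
  lists.get(key).unshift(value);
}

function fanOutPost(post, followerIds) {
  // Cache the full post object under its own key
  // (with Redis this would carry an expiration, e.g. SET post:<id> ... EX 3600)
  kv.set(`post:${post.postId}`, JSON.stringify(post));
  // Push a lightweight reference onto each follower's precomputed feed list
  for (const followerId of followerIds) {
    lpush(`feed:${followerId}`, { postId: post.postId, userId: post.userId });
  }
}

const post = { postId: "p1", userId: "u1", text: "hello" };
fanOutPost(post, ["u2", "u3"]);
console.log(lists.get("feed:u2")); // each follower's list now holds the reference
```

The important property is that the per-follower lists hold only small references, while the (larger) post body is stored once and resolved at read time.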
My question is about removing a post from the precomputed newsfeed lists of potentially hundreds or thousands of users when a user DELETES a post.
Currently, when this happens, I get a lot of null values for deleted posts. But I need to return 10 posts to the client; otherwise, if the post count is less than 10, the client assumes it has reached the end of the newsfeed and will not request the next page of posts.
It seems like it would use A LOT of resources to look inside the newsfeed of every follower of the user who deleted the post and remove the postId from EVERY list, at any scale.
I thought about dealing with null values (meaning the post has been deleted) by counting the null values returned (x) and then performing additional queries to Redis/MongoDB to get x more posts, until 10 posts have filled the page OR the end of the newsfeed is reached.
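The backfill idea above can be sketched as a small pure function. This is only an illustration under simplifying assumptions: `feedRefs` is one user's precomputed list already in memory, and `getPost` is a hypothetical synchronous lookup that returns the cached post or `null` when it was deleted or evicted (in a real app both would be async Redis/MongoDB calls):

```javascript
// Fill a page of `pageSize` posts, skipping refs whose posts resolve to null
// (deleted or evicted), and report whether the end of the feed was reached.
function getFeedPage(feedRefs, getPost, offset, pageSize) {
  const posts = [];
  let cursor = offset;
  // Keep reading references until the page is full or the list is exhausted
  while (posts.length < pageSize && cursor < feedRefs.length) {
    // Only fetch as many refs as are still needed to fill the page
    const batch = feedRefs.slice(cursor, cursor + (pageSize - posts.length));
    cursor += batch.length;
    for (const ref of batch) {
      const post = getPost(ref.postId); // null => deleted or evicted
      if (post !== null) posts.push(post);
    }
  }
  // `nextOffset` is where the client should resume; `endReached` tells it to stop paging
  return { posts, nextOffset: cursor, endReached: cursor >= feedRefs.length };
}

// Example: p2 and p4 were deleted, so the page is backfilled past them
const refs = ["p1", "p2", "p3", "p4", "p5"].map(id => ({ postId: id, userId: "u1" }));
const store = { p1: { id: "p1" }, p3: { id: "p3" }, p5: { id: "p5" } };
const page = getFeedPage(refs, id => store[id] ?? null, 0, 3);
console.log(page.posts.map(p => p.id)); // [ 'p1', 'p3', 'p5' ]
```

Returning an explicit `endReached` flag (rather than letting the client infer it from `posts.length < pageSize`) is what avoids the problem described above, where a short page caused by deleted posts is mistaken for the end of the feed.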
Could someone PLEASE point me to a scalable approach for this situation? I feel like the logic is getting too complicated, OR, since I am new to Redis, I am missing a proper approach to solving this issue.
Thank you
Sources
This article follows the attribution requirements of Stack Overflow and is licensed under CC BY-SA 3.0.
Source: Stack Overflow
