How do you scale a system based on read and write bandwidth estimates?
I understand the need to compute bandwidth estimates, but it's not clear to me how those estimates are then used to scale the design.
For example, if we have a read bandwidth of 200 MB/s and a write bandwidth of 35 GB/s, how do you determine how many servers you will need to load balance? Or whether you will need a load balancer at all, or a cluster of load balancers? I get that 35 GB/s is a ton of data, but how much data can a single server handle for reads and writes?
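One way to make this tangible is the back-of-envelope division that capacity planning usually comes down to: total bandwidth divided by an assumed per-server throughput, with some headroom. A minimal sketch, assuming a hypothetical per-server sustained throughput of 100 MB/s (real figures vary widely with hardware, replication factor, and workload, so this number is purely illustrative):

```python
import math

def servers_needed(total_mb_s: float, per_server_mb_s: float,
                   headroom: float = 0.7) -> int:
    """Round up the server count, running each server at only
    `headroom` of its rated throughput to absorb spikes."""
    usable = per_server_mb_s * headroom
    return math.ceil(total_mb_s / usable)

# Using the numbers from the question and an assumed 100 MB/s per server:
writes = servers_needed(35_000, 100)  # 35 GB/s of writes
reads = servers_needed(200, 100)      # 200 MB/s of reads
print(writes, reads)  # 500 servers for writes, 3 for reads
```

The point of the estimate is exactly this comparison: once the required server count is clearly greater than one, you know you need to partition traffic across machines, which is what motivates a load balancer (and, if a single balancer cannot itself handle the aggregate bandwidth, a cluster of them behind DNS or anycast).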
Sources
This article follows the attribution requirements of Stack Overflow and is licensed under CC BY-SA 3.0.
Source: Stack Overflow
