Dependence of data transfer latency on number of target nodes
Suppose some fixed-size data is to be transferred between machines. Let us consider two scenarios:
- 1 target: Data is transferred from machine A to machine B. Suppose this takes x seconds.
- 2 targets: Data is transferred from machine A to machines B and C. Suppose this takes y seconds.
What can be said about the relationship between x and y? I think y should usually be greater than x. Am I right? Is there any work/paper/blog that gives such an insight?
I also think the relationship between x and y should depend on the topology of the network connecting the machines.
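To see why topology matters, here is a minimal back-of-the-envelope sketch comparing a few broadcast strategies under a bandwidth-only model (latency, congestion, and protocol overhead are ignored). All names and numbers here (`S`, `B`, `chunk_fraction`, and the strategy functions) are illustrative assumptions, not anything from the question:

```python
# Bandwidth-only model: every machine has the same uplink capacity B,
# and transfer time is determined solely by bytes sent over the
# bottleneck uplink. These are assumptions for illustration.

S = 1.0  # data size (e.g., GB) -- illustrative value
B = 1.0  # per-machine uplink bandwidth (e.g., GB/s) -- illustrative value

def one_target(S, B):
    # x: A sends the whole payload to B once.
    return S / B

def two_targets_sequential(S, B):
    # y (worst case): A sends the full payload to B, then again to C.
    return 2 * S / B

def two_targets_parallel_shared_uplink(S, B):
    # y (shared uplink): A streams to B and C simultaneously, so each
    # stream gets B/2; the total time equals the sequential case.
    return S / (B / 2)

def two_targets_chain(S, B, chunk_fraction=0.01):
    # y (pipelined chain): A sends chunks to B while B forwards earlier
    # chunks to C. With small chunks this approaches S / B, i.e. y -> x.
    chunk = chunk_fraction * S
    n_chunks = S / chunk
    # A's uplink is busy for n_chunks sends; C receives the final chunk
    # one forwarding step later.
    return (n_chunks + 1) * chunk / B

print(f"x (1 target):              {one_target(S, B):.3f}")
print(f"y (sequential):            {two_targets_sequential(S, B):.3f}")
print(f"y (parallel, shared link): {two_targets_parallel_shared_uplink(S, B):.3f}")
print(f"y (pipelined chain):       {two_targets_chain(S, B):.3f}")
```

Under this simplified model y >= x for every strategy, but pipelining through an intermediate node pushes y toward x, and a network with true multicast support could make y roughly equal to x. This is only a sketch; real networks add round-trip latency, contention, and protocol effects, which is exactly why the answer depends on topology.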
Sources
Source: Stack Overflow, licensed under CC BY-SA 3.0 per its attribution requirements.
