Distributing high-resolution rendering across many computers via a server
I have a grid of monitors (12 × 2 = 24 monitors) running as one big, wall-sized display.
I built a visualization with three.js containing more than 10,000,000 data points at a resolution of 12690 × 3840. At this scale the GPU is fully saturated and interaction with the visualization becomes slow.
I would like to distribute the rendering across 12 computers through a server, to get better performance, be able to add more data points, and keep interaction smooth.
How can I implement that?
Solution 1:[1]
There are examples here and here.
And this answer: display three.js scene across multiple screens
There's also this non-three.js example, which nonetheless demonstrates a solution, and this three.js one that runs across machines.
Basically, you set up a server to relay WebSocket messages between the machines. That might sound intimidating, but with node.js it's not much code (I'm sure other languages have simple solutions too).
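A minimal relay of this kind can be sketched as follows. The client objects here are plain stubs standing in for WebSocket connections, so the relay logic is self-contained; in a real setup you would wire the same `send`/message-handler pair to a WebSocket library such as the `ws` npm package. All names (`createRelay`, `stubClient`, the message shape) are illustrative assumptions, not part of any library:

```javascript
// Relay pattern: a message arriving from one connected display machine is
// forwarded to every other machine, so all of them see the same inputs.
function createRelay() {
  const clients = new Set();
  return {
    addClient(client) {
      clients.add(client);
      // When this client sends the server a message, rebroadcast it
      // to all the OTHER clients (the sender already has the data).
      client.onMessage = (msg) => {
        for (const other of clients) {
          if (other !== client) other.send(msg);
        }
      };
    },
    removeClient(client) {
      clients.delete(client);
    },
  };
}

// Demo with stub clients standing in for real WebSocket connections.
function stubClient(name, log) {
  return { name, send: (msg) => log.push(`${name} <- ${msg}`) };
}

const log = [];
const relay = createRelay();
const a = stubClient('machine-A', log);
const b = stubClient('machine-B', log);
const c = stubClient('machine-C', log);
[a, b, c].forEach((cl) => relay.addClient(cl));

// machine-A reports a camera move; B and C receive it, A does not.
a.onMessage(JSON.stringify({ type: 'camera', pos: [0, 5, 20] }));
console.log(log.length); // 2
```

With `ws`, the `addClient` call would live inside the server's `connection` handler, and `onMessage`/`send` would map onto the socket's `message` event and `send()` method.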
Each machine needs to know which portion of the scene to display, and each needs the same scene (or at least the portion that machine will display). Animations on each machine should be driven directly by a clock; as a start you can use Date.now(), and once that works you can use the WebSockets to keep the clocks synchronized across machines. If there are interactive camera controls or other settings, then as long as they are global you can broadcast them to all machines over WebSockets.
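Picking "which portion of the scene to display" can be sketched like this. The layout assumed below (each of the 12 machines drives one column of the 12 × 2 monitor grid) and the helper `tileForMachine` are illustrative assumptions; `camera.setViewOffset` is a real three.js API for exactly this sub-frustum rendering:

```javascript
// Compute the rectangle of the full wall that one machine is responsible for,
// assuming the wall is split into equal-width vertical columns.
function tileForMachine(index, cols, fullWidth, fullHeight) {
  const tileWidth = fullWidth / cols;
  return { x: index * tileWidth, y: 0, width: tileWidth, height: fullHeight };
}

// In three.js the tile maps directly onto the camera's view offset:
//   camera.setViewOffset(fullWidth, fullHeight,
//                        tile.x, tile.y, tile.width, tile.height);
// so every machine shares one scene and one logical camera, but rasterizes
// only its own region of the overall frustum.

const tile = tileForMachine(3, 12, 12690, 3840);
console.log(tile); // { x: 3172.5, y: 0, width: 1057.5, height: 3840 }
```

Each machine only needs to know its own index; everything else (full resolution, column count) is shared configuration that can be broadcast once over the WebSocket.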
If there is more interaction, you can pass inputs or other data between machines, but ideally you should minimize what needs to be synced and make the simulation deterministic, so that given the same state (time + settings) every machine generates the same display state.
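The deterministic-simulation idea boils down to computing the display state as a pure function of the shared clock and settings, so machines never need per-frame synchronization. A tiny sketch (the rotation formula and names are illustrative assumptions):

```javascript
// Display state as a pure function of (time, settings): any two machines
// evaluating this with the same inputs produce identical results, so only
// the clock and the settings object ever need to travel over the network.
function sceneStateAt(timeMs, settings) {
  return {
    rotationY: (timeMs * 0.001 * settings.speed) % (2 * Math.PI),
  };
}

// Two "machines" with the same synchronized clock and broadcast settings:
const s1 = sceneStateAt(5000, { speed: 1 });
const s2 = sceneStateAt(5000, { speed: 1 });
console.log(s1.rotationY === s2.rotationY); // true
```

Anything stateful (physics, random motion) would instead need a seeded, deterministic update, or explicit state broadcasts over the WebSocket.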
Sources
This article follows the attribution requirements of Stack Overflow and is licensed under CC BY-SA 3.0.
Source: Stack Overflow
| Solution | Source |
|---|---|
| Solution 1 | Stack Overflow |
