Performance difference between one websocket connection vs multiple

The architecture I have is that clients receive events from a service through websockets. We are building another service that also needs to communicate its events to the clients. We face a design decision: whether to change the existing service or just add another websocket connection from the client to the new service.

It is technically complex and expensive to change the other service, as it is not maintained by us, but at the end of the day it is possible if necessary. Adding another connection to the new service would be cheap and would require no redesign, but it would mean two connections running in parallel.

Other questions I've looked at often get answers from the server perspective, where TCP connections are limited and reducing the number of connections matters, but that obviously doesn't apply here, since these are two separate services. What I'm asking is: from a client perspective, how big a difference does it make to have multiple websocket connections instead of a single one? My view has been that our goal should be to use the same connection, since each additional connection reduces performance, but to be honest that is just an opinion, not a proven fact.

Since this is an enterprise setting, backwards compatibility and performance are highly important, but at the same time the development cost is much higher if we choose to go with one connection instead of two. So: is there a real difference, or is this something I should not worry about?



Solution 1:[1]

From a client perspective, on any modern platform, the difference in cost of managing one vs two sockets in an application or browser should be negligible. It sounds to me like simplifying development and maintenance effort should dominate your decision.
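To make the client-side picture concrete, here is a minimal TypeScript sketch of a browser client consuming events from two independent websocket connections behind a single handler. The service URLs and the JSON event shape are hypothetical placeholders, not part of the question; reconnection logic is only hinted at.

```typescript
type EventHandler = (event: unknown) => void;

// Open a websocket to one service and route its messages into a shared handler.
function subscribe(url: string, onEvent: EventHandler): WebSocket {
  const ws = new WebSocket(url);
  ws.onmessage = (msg: MessageEvent) => {
    // Assumes both services send JSON-encoded event payloads.
    onEvent(JSON.parse(msg.data));
  };
  ws.onclose = () => {
    // In production you would add reconnect/backoff logic here.
    console.warn(`connection to ${url} closed`);
  };
  return ws;
}

// One handler, two connections: the extra client-side cost is essentially a
// second socket object plus a second TCP/TLS session kept alive.
const handleEvent: EventHandler = (event) => {
  console.log("event received:", event);
};

const existing = subscribe("wss://existing-service.example.com/events", handleEvent);
const added = subscribe("wss://new-service.example.com/events", handleEvent);
```

As the sketch suggests, the application code barely changes between one connection and two; the second connection adds one more idle socket and its keepalive traffic, which a modern browser or client runtime handles without measurable impact.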

Sources

This article follows the attribution requirements of Stack Overflow and is licensed under CC BY-SA 3.0.

Source: Stack Overflow

Solution Source
Solution 1: Kelly Burkhart