Server 2019 Datacenter failover cluster role not working as expected when port forwarding public traffic from firewall/router
I have a 3-node Windows failover cluster, and each node runs Windows Server 2019 Datacenter. Each node has a mail server (hMailServer) installed, and each node's mail server service connects to a central MySQL database and uses cluster shared storage that is visible to every node.
The IP addresses in my network are:
- Firewall/Router: 192.168.1.1
- Node 1: 192.168.1.41
- Node 2: 192.168.1.42
- Node 3: 192.168.1.43
- Mail server role: 192.168.1.51 (virtual IP)
I added port forwards in my firewall/router for all the standard mail server ports, pointed them at the physical IP of node 1, and confirmed that the public could reach the mail server properly. I then repointed the forwards at node 2 and then node 3 to verify that each node had a working mail server, and all three worked perfectly.
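The per-node test described above (repointing the forwards at each physical IP and checking reachability) can be scripted with a simple TCP probe. This is just an illustrative sketch: `tcp_open`, `probe_node`, and the `MAIL_PORTS` subset are my own names and assumptions, not part of the original setup.

```python
import socket

def tcp_open(host: str, port: int, timeout: float = 2.0) -> bool:
    """Return True if a TCP connection to host:port succeeds within timeout."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

# A subset of the standard mail server ports (adjust to match your forwards).
MAIL_PORTS = [25, 110, 143, 587]

def probe_node(host: str) -> dict:
    """Probe every mail port on one host and report which are reachable."""
    return {port: tcp_open(host, port) for port in MAIL_PORTS}

# Usage, with the node IPs from the question (run from outside the LAN
# after pointing the forwards at each node in turn):
#   for ip in ("192.168.1.41", "192.168.1.42", "192.168.1.43"):
#       print(ip, probe_node(ip))
```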
Then I set up a cluster role using a Generic Service, pointed it at the mail server service, and gave the role the IP 192.168.1.51. Initially, the role came up with node 3 as its host node.
I changed the port forwards in the router to point at this virtual IP. While I was connected inside the same LAN, I could fail the role over and the mail server kept working as it moved from node to node. But when I connected through the public internet again, the mail server was only reachable while the cluster role was hosted on node 3; if it moved to node 1 or node 2, the connection broke.
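One way to pin down exactly when public connectivity dies is to poll the role's virtual IP from an outside host while moving the role between nodes. A minimal polling sketch, assuming a reachable hostname or IP on the public side (the function names here are mine, purely for illustration):

```python
import socket
import time

def tcp_open(host: str, port: int, timeout: float = 2.0) -> bool:
    """Return True if a TCP connection to host:port succeeds within timeout."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

def watch(host: str, port: int, checks: int, interval: float = 1.0) -> list:
    """Poll host:port `checks` times, printing whenever reachability flips."""
    results = []
    last = None
    for _ in range(checks):
        up = tcp_open(host, port)
        if up != last:
            print(f"{host}:{port} is now {'UP' if up else 'DOWN'}")
            last = up
        results.append(up)
        time.sleep(interval)
    return results

# Run from the public side while failing the role over, e.g.:
#   watch("mail.example.com", 25, checks=120)   # hypothetical public name
```

If the timestamps of the DOWN transitions line up with the role moving off node 3, that confirms the break is tied to which node hosts the virtual IP rather than to the mail service itself.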
I tried turning off, disabling, and resetting Windows Firewall on each node, and ensured the firewall settings were identical on every node, but this had no effect on the problem. What am I missing?
Source: Stack Overflow (licensed under CC BY-SA 3.0).
