PyTorch DDP find_unused_parameters setting when using multiple forwards/backwards

Related issue: https://github.com/pytorch/pytorch/issues/69031. My training task needs multiple forward steps, which means some of my epochs need find_unused_parameters=True while other epochs need find_unused_parameters=False.

PyTorch still can't track used and unused parameters automatically. Are there any alternative solutions now?
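One workaround I've seen suggested is to keep find_unused_parameters=False and instead add a zero-weighted term over all parameters to the loss, so DDP's reducer sees a gradient for every parameter even on epochs where a branch is skipped. A minimal single-process sketch (the TwoBranch model and use_b flag are hypothetical stand-ins for a model where some forwards skip a branch):

```python
import torch
import torch.nn as nn

# Hypothetical two-branch model: on some epochs only branch_a is used,
# which would leave branch_b's parameters unused under DDP.
class TwoBranch(nn.Module):
    def __init__(self):
        super().__init__()
        self.branch_a = nn.Linear(4, 1)
        self.branch_b = nn.Linear(4, 1)

    def forward(self, x, use_b):
        out = self.branch_a(x)
        if use_b:
            out = out + self.branch_b(x)
        return out

model = TwoBranch()
x = torch.randn(8, 4)

# Epoch that skips branch_b: add a zero-weighted sum touching every
# parameter so all of them receive a (zero) gradient and DDP with
# find_unused_parameters=False would not complain.
loss = model(x, use_b=False).sum()
loss = loss + 0.0 * sum(p.sum() for p in model.parameters())
loss.backward()

# Every parameter now has a gradient; the skipped branch's grads are zero.
assert all(p.grad is not None for p in model.parameters())
```

The same trick works when the model is wrapped in DistributedDataParallel, since the reducer only needs each parameter to participate in the autograd graph; the zero-weighted term costs one extra reduction over parameter sums per step.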

Thanks!



Sources

Source: Stack Overflow, licensed under CC BY-SA 3.0.