Multi-output neural network combining regression and classification

If you have both a classification and regression problem that are related and rely on the same input data, is it possible to successfully architect a neural network that gives both classification and regression outputs?

If so, how might the loss function be constructed?



Solution 1:[1]

Usually, for such cases the loss is simply a weighted sum of the classification loss and the regression loss. In other words, your network has two independent output parts: one responsible for regression, to which you apply a regression loss L_reg (such as MSE), and another responsible for classification, to which you apply a classification loss L_class (such as cross entropy). The final optimization criterion is then simply (alpha)*L_reg + (1-alpha)*L_class for some predefined alpha. This allows easy computation of gradients (and overall easy analysis). A sketch of this setup follows below.
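As a minimal sketch of the idea, here is one possible PyTorch implementation (the layer sizes, alpha value, and dummy data are illustrative assumptions, not part of the original answer): a shared trunk feeds two heads, and the training loss is the weighted sum described above.

    import torch
    import torch.nn as nn

    class MultiTaskNet(nn.Module):
        """Shared trunk with two heads: one regression output, one set of class logits."""
        def __init__(self, in_features, num_classes):
            super().__init__()
            self.trunk = nn.Sequential(
                nn.Linear(in_features, 64),
                nn.ReLU(),
            )
            self.reg_head = nn.Linear(64, 1)            # regression output
            self.cls_head = nn.Linear(64, num_classes)  # classification logits

        def forward(self, x):
            h = self.trunk(x)
            return self.reg_head(h).squeeze(-1), self.cls_head(h)

    # Weighted sum of the two losses, as described above.
    alpha = 0.5  # hypothetical weighting; tune for your problem
    mse = nn.MSELoss()
    ce = nn.CrossEntropyLoss()

    model = MultiTaskNet(in_features=10, num_classes=3)
    optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)

    # Dummy batch purely for illustration.
    x = torch.randn(32, 10)
    y_reg = torch.randn(32)
    y_cls = torch.randint(0, 3, (32,))

    reg_out, cls_logits = model(x)
    loss = alpha * mse(reg_out, y_reg) + (1 - alpha) * ce(cls_logits, y_cls)

    optimizer.zero_grad()
    loss.backward()
    optimizer.step()

Because the combined loss is a plain weighted sum, backpropagation distributes gradients through the shared trunk from both heads automatically; the only extra design choice is how to pick (or schedule) alpha.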

Sources

This article follows the attribution requirements of Stack Overflow and is licensed under CC BY-SA 3.0.

Source: Stack Overflow

Solution Source
Solution 1: lejlot