Is neural network training done with one sample at a time or with an average of multiple samples?
I recently watched this video (https://youtu.be/Ilg3gGewQ5U), where the presenter says that the weights in a neural network are adjusted based on an AVERAGE of the errors across multiple samples (at 8:37, if you don't want to watch the whole video). Is this always the case, or are there situations where it is better to update the weights after every sample?
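To make the distinction concrete, here is a minimal sketch of the two update styles on a toy linear-regression problem (the data, model, and learning rate are all illustrative assumptions, not taken from the video): updating after every sample (stochastic/online gradient descent) versus averaging the gradients over the whole batch before one update (batch gradient descent).

```python
import numpy as np

# Toy data: 32 samples, 3 features, labels from a known weight vector.
rng = np.random.default_rng(0)
X = rng.normal(size=(32, 3))
true_w = np.array([1.0, -2.0, 0.5])
y = X @ true_w + 0.01 * rng.normal(size=32)

lr = 0.1

def grad(w, x, t):
    """Gradient of the squared error 0.5 * (x·w - t)^2 for ONE sample."""
    return (x @ w - t) * x

# Style 1: stochastic (online) updates -- adjust weights after EVERY sample.
w_sgd = np.zeros(3)
for epoch in range(50):
    for x, t in zip(X, y):
        w_sgd -= lr * grad(w_sgd, x, t)

# Style 2: batch updates -- AVERAGE the gradients of all samples, then take one step.
w_batch = np.zeros(3)
for epoch in range(200):
    avg_g = np.mean([grad(w_batch, x, t) for x, t in zip(X, y)], axis=0)
    w_batch -= lr * avg_g

print(w_sgd)    # both estimates should land near true_w = [1.0, -2.0, 0.5]
print(w_batch)
```

In practice most modern training uses a middle ground, mini-batches: average the gradient over a small subset of samples per update, which keeps the averaging (less noisy steps) while still updating many times per pass over the data.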
Sources
This article follows the attribution requirements of Stack Overflow and is licensed under CC BY-SA 3.0.
Source: Stack Overflow
