Why is a TensorFlow model much bigger when loaded into RAM?
I have a TensorFlow SavedModel that is about 20 MB on disk. When I load it for inference, it occupies far more RAM (~400 MB) than its on-disk size. The model runs on a CPU-only machine. What explains this?
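One way to quantify the gap yourself is to snapshot the process's peak resident set size before and after the load. The sketch below is an illustration, not taken from the question: it uses only the standard-library `resource` module (Unix-only), and it substitutes a plain in-memory buffer for the real `tf.saved_model.load("path/to/model")` call, which is what you would time in practice.

```python
import resource
import sys

def peak_rss_mb():
    """Peak resident set size of this process, in MB."""
    peak = resource.getrusage(resource.RUSAGE_SELF).ru_maxrss
    # ru_maxrss is reported in bytes on macOS but in kilobytes on Linux.
    return peak / (1024 * 1024) if sys.platform == "darwin" else peak / 1024

before = peak_rss_mb()
# Stand-in for the real load, e.g.:
#   model = tf.saved_model.load("path/to/model")   # hypothetical path
buf = b"x" * (200 * 1024 * 1024)  # allocate and touch ~200 MB
after = peak_rss_mb()
print(f"peak RSS grew by ~{after - before:.0f} MB")
```

Comparing this delta against the file size on disk makes the overhead concrete; with a real model, the difference also includes the TensorFlow runtime itself, allocator arenas, and any graph/kernel structures built at load time.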
Sources
This article follows the attribution requirements of Stack Overflow and is licensed under CC BY-SA 3.0.
Source: Stack Overflow
