LevelDB: Put becomes slow after database grows large (1 GB+)
Actually, I'm using the goleveldb implementation.
I am inserting data continuously. For the first few seconds the inserts are fast, but once the database reaches about 1 GB, the Put operations stall.
Each key is a number, and keys are inserted in random order. Each value is the contents of a file of a few KB. The storage directory is on an HDD, and the disk sits at 100% utilization the whole time.
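
For concreteness, here is a minimal sketch of the write pattern (simplified; the path, key width, and value size are illustrative, not my exact code):

```go
package main

import (
	"crypto/rand"
	"encoding/binary"
	"log"
	mrand "math/rand"

	"github.com/syndtr/goleveldb/leveldb"
)

func main() {
	// Open with nil options, i.e. goleveldb's defaults.
	db, err := leveldb.OpenFile("/data/leveldb", nil)
	if err != nil {
		log.Fatal(err)
	}
	defer db.Close()

	// Stand-in for the file contents: a few KB of random bytes.
	value := make([]byte, 4<<10)
	if _, err := rand.Read(value); err != nil {
		log.Fatal(err)
	}

	key := make([]byte, 8)
	for {
		// Numeric key, inserted in random order.
		binary.BigEndian.PutUint64(key, mrand.Uint64())
		if err := db.Put(key, value, nil); err != nil {
			log.Fatal(err)
		}
	}
}
```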
Should I switch to an SSD, or is this simply not a LevelDB use case? Or is my configuration wrong? I am using the default configuration exactly.
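
If configuration is the issue, this is what I would try, assuming the write buffer, block cache, and compaction table size are the relevant knobs for a sustained random-insert load. I have not verified that raising any of them avoids the stall on an HDD:

```go
package main

import (
	"log"

	"github.com/syndtr/goleveldb/leveldb"
	"github.com/syndtr/goleveldb/leveldb/opt"
)

func main() {
	// Assumption: larger memtable/cache/compaction tables reduce write
	// stalls under continuous random inserts. The values are guesses.
	o := &opt.Options{
		WriteBuffer:         64 * opt.MiB, // default is 4 MiB
		BlockCacheCapacity:  32 * opt.MiB, // default is 8 MiB
		CompactionTableSize: 8 * opt.MiB,  // default is 2 MiB
	}
	db, err := leveldb.OpenFile("/data/leveldb", o)
	if err != nil {
		log.Fatal(err)
	}
	defer db.Close()
}
```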
I found someone who encountered the same issue (https://github.com/google/leveldb/issues/828), but there is no solution there.
Sources
Source: Stack Overflow, licensed under CC BY-SA 3.0 in accordance with Stack Overflow's attribution requirements.
