A2DP and detecting packet loss so PLC insertion can be done
I am writing code that renders A2DP media packets. Each packet contains multiple frames, and sniffer captures show that, at regular intervals, a packet carries one frame fewer than normal. This means that when a packet is lost, it is not simple to know how many frames to inject.
Each packet has a sequence number, which lets a packet loss be detected so that the correct number of frames can be inserted by the PLC algorithm.
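For context, the loss check itself can be sketched like this. A2DP media packets carry an RTP-style header, so the sequence number is 16-bit and wraps around; the helper below (names are mine, not from any stack API) returns how many packets were lost, with a negative result indicating a late/out-of-order packet:

```python
def lost_packets(last_seq: int, new_seq: int) -> int:
    """How many packets were lost between last_seq and new_seq.

    Sequence numbers are 16-bit (RTP-style) and wrap at 65536.
    0  -> new_seq is the expected next packet, nothing lost.
    >0 -> that many packets are missing in between.
    <0 -> the packet is late / out of order (we already moved past it).
    """
    # Interpret the difference as a signed 16-bit value to handle wraparound.
    diff = (new_seq - last_seq) & 0xFFFF
    if diff >= 0x8000:
        diff -= 0x10000
    return diff - 1  # gap of 1 => 0 lost, gap of 2 => 1 lost, ...
```

For example, `lost_packets(65535, 0)` correctly reports no loss across the wrap, while `lost_packets(10, 13)` reports two lost packets.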
I am using a long-term average mechanism to decide how many frames to insert when a packet loss is detected.
What is the best strategy for actually detecting a packet loss, given that a packet could merely be delayed, so the decision has to be deferred?
The only certainty is that if I receive a packet with a sequence number greater than (Last_seq_num + 1), then I know for certain I lost one or more packets. But that means waiting a long time before deciding, which forces me to increase latency.
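One way to frame this trade-off is a deadline-driven decision: instead of waiting for a higher sequence number to prove the loss, treat the packet as lost the moment its audio is due for playout. A sketch, assuming a fixed jitter-buffer depth (all names here are hypothetical):

```python
def decide_at_deadline(expected_seq: int, arrived_seqs: set,
                       now_ms: float, playout_deadline_ms: float) -> str:
    """Decide what to do about the next expected packet.

    'play'    -> the packet arrived in time, render it normally.
    'wait'    -> it is missing but its playout deadline has not passed,
                 so it may still arrive (bounded extra latency).
    'conceal' -> the deadline passed; declare it lost and insert PLC
                 frames now, without waiting for a later sequence number.
    """
    if expected_seq in arrived_seqs:
        return "play"
    if now_ms < playout_deadline_ms:
        return "wait"
    return "conceal"
```

The jitter-buffer depth then caps latency: a packet arriving after its deadline is handled as a late packet rather than delaying playback.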
A more complex scheme would be to start inserting PLC frames and, when a delayed packet eventually arrives, queue up only some frames from that late packet, based on how many PLC frames have already been inserted.
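That reconciliation step can be sketched as follows (a simplification of my own: it assumes the PLC frames stand in one-for-one for the frames at the start of the late packet):

```python
def frames_to_queue(late_packet_frames: list, plc_frames_inserted: int) -> list:
    """Reconcile a late packet with PLC frames already played in its place.

    We have already rendered plc_frames_inserted concealment frames for
    this packet's slot, so queue only the remaining tail of the late
    packet; otherwise latency would grow by the concealed duration.
    """
    if plc_frames_inserted >= len(late_packet_frames):
        return []  # the whole packet was already covered by PLC
    return late_packet_frames[plc_frames_inserted:]
```

A possible refinement would be to cross-fade the last PLC frame into the first queued real frame to hide the splice, but the slicing above is the core of the bookkeeping.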
I thank you in advance for your suggestions and thoughts.
Sources
This article follows the attribution requirements of Stack Overflow and is licensed under CC BY-SA 3.0.
Source: Stack Overflow
