macOS: How to capture audio input levels in Swift?
On macOS, I am looking for the best way to capture audio input levels from a microphone using Swift.
This article is along the lines of what I am thinking, but I am not sure how to go about processing the sample buffers to get the volume levels:
macOS/swift Capture Audio with AVCaptureSession
Am I on the right track, or looking at this the wrong way?
Solution 1:[1]
I was closer than I realized.
The AVCaptureAudioDataOutputSampleBufferDelegate captureOutput callback receives an AVCaptureConnection parameter. That connection has an audioChannels property, an array of AVCaptureAudioChannel objects, each of which exposes the peakHoldLevel and averagePowerLevel properties.
Problem solved.
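Below is a minimal sketch of how this might look, assuming the default audio capture device and a bare-bones AVCaptureSession setup; permission checks, device selection, and error handling are omitted for brevity, and the class and queue names are just placeholders.

```swift
import AVFoundation

// Sketch: monitor microphone input levels via AVCaptureAudioDataOutput.
final class AudioLevelMonitor: NSObject, AVCaptureAudioDataOutputSampleBufferDelegate {
    private let session = AVCaptureSession()
    private let queue = DispatchQueue(label: "audio.level.monitor")

    func start() throws {
        // Use the default audio input device (assumption: the built-in microphone).
        guard let device = AVCaptureDevice.default(for: .audio) else { return }
        let input = try AVCaptureDeviceInput(device: device)
        if session.canAddInput(input) { session.addInput(input) }

        let output = AVCaptureAudioDataOutput()
        output.setSampleBufferDelegate(self, queue: queue)
        if session.canAddOutput(output) { session.addOutput(output) }

        session.startRunning()
    }

    // The connection parameter exposes audioChannels; each channel carries
    // averagePowerLevel and peakHoldLevel, both reported in decibels.
    func captureOutput(_ output: AVCaptureOutput,
                       didOutput sampleBuffer: CMSampleBuffer,
                       from connection: AVCaptureConnection) {
        for channel in connection.audioChannels {
            print("average: \(channel.averagePowerLevel) dB, peak hold: \(channel.peakHoldLevel) dB")
        }
    }
}
```

With this approach there is no need to decode the sample buffers yourself; the levels can be read directly from the connection on every callback.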
Sources
This article follows the attribution requirements of Stack Overflow and is licensed under CC BY-SA 3.0.
Source: Stack Overflow
| Solution | Source |
|---|---|
| Solution 1 | dmetzler |
