How to get audio stream from cordova-plugin-audioinput for realtime visualizer
I am using cordova-plugin-audioinput for recording audio in my Cordova-based app. The documentation can be found here: https://www.npmjs.com/package/cordova-plugin-audioinput
I was previously using the browser's MediaRecorder to record audio, but I switched to the plugin due to audio quality issues. My problem is that I have a realtime visualizer of the volume during recording. My function used to work with the input stream from the media recorder:
function wave(stream) {
    audioContext = new AudioContext();
    analyser = audioContext.createAnalyser();
    // Feed the MediaStream into the graph and tap it with an analyser
    microphone = audioContext.createMediaStreamSource(stream);
    javascriptNode = audioContext.createScriptProcessor(2048, 1, 1);
    analyser.smoothingTimeConstant = 0.8;
    analyser.fftSize = 1024;
    microphone.connect(analyser);
    analyser.connect(javascriptNode);
    javascriptNode.connect(audioContext.destination);
    javascriptNode.onaudioprocess = function () {
        var array = new Uint8Array(analyser.frequencyBinCount);
        analyser.getByteFrequencyData(array);
        // Average the frequency bins to get a single volume level
        var values = 0;
        var length = array.length;
        for (var i = 0; i < length; i++) {
            values += array[i];
        }
        var average = values / length;
        // use average for visualization
    }
}
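For what it's worth, the averaging step at the end is independent of where the samples come from, so it can be pulled out into a small pure helper and tested on its own (the sample frame below is made up for illustration):

```javascript
// Average level (0-255) of one frame of frequency data, as produced by
// AnalyserNode.getByteFrequencyData(). Being a pure function, it works the
// same whether the data comes from MediaRecorder or cordova-plugin-audioinput.
function averageLevel(array) {
    if (array.length === 0) return 0;
    var sum = 0;
    for (var i = 0; i < array.length; i++) {
        sum += array[i];
    }
    return sum / array.length;
}

// Hypothetical frame of frequency-bin values:
var frame = new Uint8Array([0, 64, 128, 192]);
console.log(averageLevel(frame)); // → 96
```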
Now that I use cordova-plugin-audioinput, I can't find a way to retrieve the stream from the microphone. Even though the documentation mentions a "streamToWebAudio" option, I can't figure out how to make it work.
Any insight on this? Thank you in advance!
Solution 1:[1]
As someone who stumbled upon this a few years later and wondered why an extra destination was being created in the other answer, I now realise it's because Eric needed to get the input stream into the same AudioContext as the analyser.
Now, setting aside the fact that the AnalyserNode spec has changed since that answer, and just focusing on getting the input stream into something useful: you can pass the AudioContext into the audioinput config like so and save yourself a few steps.
function wave() {
    var audioContext = new AudioContext();
    var analyser = audioContext.createAnalyser();
    // Optional: also route the audio on to the speakers
    analyser.connect(audioContext.destination);

    // Let the plugin feed its capture directly into our AudioContext
    audioinput.start({
        streamToWebAudio: true,
        audioContext: audioContext
    });
    audioinput.connect(analyser);

    // AnalyserNode has no onaudioprocess event; poll it on each frame instead
    function draw() {
        var array = new Uint8Array(analyser.frequencyBinCount);
        analyser.getByteFrequencyData(array);
        // ... compute and render the level ...
        requestAnimationFrame(draw);
    }
    draw();
}
Sources
This article follows the attribution requirements of Stack Overflow and is licensed under CC BY-SA 3.0.
Source: Stack Overflow
| Solution | Source |
|---|---|
| Solution 1 | Ryk Waters |
