Development guide: Signal metering

Before we begin

If you've followed our Getting started and Applying effects guides, you'll have an application which receives audio from the user's microphone and passes it through a reverb and filter before looping it back to the audio output.

Again, we'll continue this guide from where Applying effects ended to prevent repetition. You can find the code for each of the guides here.

If you'd like to pick up from this point, you can clone the guide code from here:

splice/superpowered-guides — https://github.com/splice/superpowered-guides

Download the code examples for all the Superpowered guides in both Native and JS.


What we'll set up

We'll keep the existing audio pipeline from the user's microphone, through the reverb and filter effects, to the speakers, but we'll demonstrate how you can perform metering at different points in the audio graph and get the resulting data back to the UI for display.

We'll be using Superpowered's Peak function to extract the peak volume levels from an audio stream pre and post processing. The two level meters will be drawn via HTML canvas elements, with the frame rate determined by the browser's native requestAnimationFrame callback.

We'll be using Superpowered's Peak function to extract the peak volume levels from an audio stream pre and post processing. The two level meters will be rendered in our application with two instances of the AppKit NSLevelIndicator class.

Performing the analysis

Before we dive into rendering the peak meters on screen, we must first calculate the values. Luckily, this is incredibly simple with Superpowered.

First, let's create some properties which will hold the current peak values of the input and output of the AudioWorkletProcessor. We'll create these in the onReady function.

...
onReady() {
  this.reverb = new this.Superpowered.Reverb(
    this.samplerate,
    this.samplerate
  );
  this.reverb.enabled = true;
  this.reverb.mix = 0.5;
  this.filter = new this.Superpowered.Filter(
    this.Superpowered.Filter.Resonant_Lowpass,
    this.samplerate
  );
  this.filter.resonance = 0.2;
  this.filter.frequency = 2000;
  this.filter.enabled = true;
  this.inputGain = 0.2;
  this.previousInputGain = 0;
+ this.inputPeakValue = 0;
+ this.outputPeakValue = 0;
  // Notify the main scope that we're prepared.
  this.sendMessageToMainScope({ event: "ready" });
}
...

Then, in our processAudio function, simply update these values:

processAudio(inputBuffer, outputBuffer, buffersize, parameters) {
  // Ensure the samplerate is in sync on every audio processing callback.
  this.filter.samplerate = this.samplerate;
  this.reverb.samplerate = this.samplerate;
  // The second argument of the Peak function is the number of values, not a number of frames.
  // Because inputBuffer is stereo, the number of frames is multiplied by two (channels).
+ this.inputPeakValue = this.Superpowered.Peak(
+   inputBuffer.pointer,
+   buffersize * 2 // Number of values (stereo: two per frame).
+ );
  // Apply volume while copying the input buffer to the output buffer.
  // Gain is smoothed, starting from "previousInputGain" to "inputGain".
  this.Superpowered.Volume(
    inputBuffer.pointer,
    outputBuffer.pointer,
    this.previousInputGain,
    this.inputGain,
    buffersize
  );
  this.previousInputGain = this.inputGain; // Save the gain for the next round.
  // Apply reverb to output (in-place).
  this.reverb.process(outputBuffer.pointer, outputBuffer.pointer, buffersize);
  // Apply the filter (in-place).
  this.filter.process(outputBuffer.pointer, outputBuffer.pointer, buffersize);
+ this.outputPeakValue = this.Superpowered.Peak(
+   outputBuffer.pointer,
+   buffersize * 2
+ );
}
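Note that Peak returns a linear value between 0 and 1 (relative to full scale), not decibels. If you'd prefer to display dBFS in your UI, a minimal conversion sketch (the linearToDb helper is our own illustration, not part of the Superpowered API):

// Hypothetical helper (not part of Superpowered): convert a linear
// peak value (0..1) to dBFS for display purposes.
function linearToDb(peak) {
  if (peak <= 0) return -Infinity; // Silence has no finite dB value.
  return 20 * Math.log10(peak); // e.g. 0.5 -> roughly -6 dBFS
}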

First, let's declare two instance variables to hold the current input and output peak values:

@implementation ViewController {
    SuperpoweredOSXAudioIO *audioIO;
    Superpowered::Filter *filter;
    Superpowered::Reverb *reverb;
    float previousInputGain;
+   float inputPeak, outputPeak;
}

Then, in our audioProcessingCallback, add the Peak calls:

- (bool)audioProcessingCallback:(float *)inputBuffer outputBuffer:(float *)outputBuffer numberOfFrames:(unsigned int)numberOfFrames samplerate:(unsigned int)samplerate hostTime:(unsigned long long int)hostTime {
    // Ensure the sample rate is in sync on every audio processing callback.
    reverb->samplerate = samplerate;
    filter->samplerate = samplerate;
    // The second argument of the Peak function is the number of values, not a number of frames.
    // Because inputBuffer is stereo, the number of frames is multiplied by two (channels).
+   inputPeak = Superpowered::Peak(inputBuffer, numberOfFrames * 2);
    // Apply volume while copying the input buffer to the output buffer.
    // Gain is smoothed, starting from "previousInputGain" to "inputGain".
    float inputGain = self.inputGainSlider.floatValue;
    Superpowered::Volume(inputBuffer, outputBuffer, previousInputGain, inputGain, numberOfFrames);
    previousInputGain = inputGain; // Save the gain for the next round.
    // Apply reverb to output (in-place).
    reverb->process(outputBuffer, outputBuffer, numberOfFrames);
    // Apply the filter (in-place).
    filter->process(outputBuffer, outputBuffer, numberOfFrames);
+   outputPeak = Superpowered::Peak(outputBuffer, numberOfFrames * 2);
    return true;
}

We now have the peak values being calculated and stored on every single audio processing callback.


Setting up the UI

We'll be using HTML canvas elements to render the meters. First, add two canvas elements at the top and bottom of the existing index.html:

<div id="bootedControls">
+ <span>Input peak meter</span>
+ <canvas
+   width="100"
+   height="10"
+   class="peakMeter"
+   id="inputPeakMeter"
+ ></canvas>
  <label>Input gain</label>
  <input
    type="range"
    min="0"
    max="1"
    step="0.01"
    value="0.4"
    oninput="onParamChange('inputGain', this.value)"
  />
  <label>Reverb mix</label>
  <input
    type="range"
    min="0"
    max="1"
    step="0.01"
    value="0.5"
    oninput="onParamChange('reverbMix', this.value)"
  />
  <label>Filter frequency</label>
  <input
    type="range"
    min="100"
    max="10000"
    step="0.1"
    value="1000"
    oninput="onParamChange('filterFrequency', this.value)"
  />
+ <span>Output peak meter</span>
+ <canvas
+   width="100"
+   height="10"
+   class="peakMeter"
+   id="outputPeakMeter"
+ ></canvas>
</div>
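For reference, the sliders above call onParamChange, which was wired up in the Applying effects guide. Inferred from the parameterChange handling we'll see in the processor shortly, it presumably looks something like this (a sketch, not the guide's verbatim code):

// Main scope: forward slider changes to the AudioWorkletProcessor.
// The same message port carries the metering requests we add below;
// the processor branches on message.type.
onParamChange(id, value) {
  this.processorNode.sendMessageToAudioScope({
    type: "parameterChange",
    payload: { id, value: parseFloat(value) }
  });
}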

Add two NSLevelIndicator elements to the main storyboard and Ctrl-drag them over to your ViewController to bind them as properties. We'll also need an NSTimer, which we'll use later to schedule metering updates.

@interface ViewController ()
@property (weak) IBOutlet NSSlider *inputGainSlider;
@property (weak) IBOutlet NSSlider *reverbMixSlider;
@property (weak) IBOutlet NSSlider *filterFrequencySlider;
+ @property (weak) IBOutlet NSLevelIndicator *inputLevelMeter;
+ @property (weak) IBOutlet NSLevelIndicator *outputLevelMeter;
@end

@implementation ViewController {
    SuperpoweredOSXAudioIO *audioIO;
    Superpowered::Filter *filter;
    Superpowered::Reverb *reverb;
    float previousInputGain, inputPeak, outputPeak;
+   NSTimer *timer;
}

Reading data and rendering

We'll need to query the values stored by our audio processing loop to draw our meters at a much slower rate than they are written: roughly 60 fps for the UI versus (samplerate / buffersize) updates per second on the audio thread. This communication also crosses threads, and we need to ensure the audio thread is never blocked while the application runs.
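To put numbers on that, here's a quick back-of-the-envelope sketch; the 48000 Hz sample rate and 128-frame buffer size are assumptions for illustration, and your device may differ:

// Rough numbers, assuming 48000 Hz and 128-frame buffers:
const audioWritesPerSecond = 48000 / 128; // = 375 peak values written per second
const uiReadsPerSecond = 60;              // typical requestAnimationFrame rate
// The UI samples roughly one in every six peak values; the rest are
// overwritten unread, which is perfectly fine for metering.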

Now that we have our canvases on screen, we must schedule the rendering and fetch data from our AudioWorkletProcessor.

In your application, create references to the canvas elements on screen so we can draw onto them with JavaScript, and also set up a recursive function call via the browser's native requestAnimationFrame callback.

// as your application is initialized
...
this.inputCanvas = document.getElementById("inputPeakMeter");
this.inputCanvasContext = this.inputCanvas.getContext("2d");
this.outputCanvas = document.getElementById("outputPeakMeter");
this.outputCanvasContext = this.outputCanvas.getContext("2d");
this.inputCanvasContext.fillStyle = "#1254fe";
this.outputCanvasContext.fillStyle = "#1254fe";
// then right at the end of your applications initialization
window.requestAnimationFrame(this.requestData.bind(this));
...

requestAnimationFrame will be called by the browser when it's ready to draw its next frame, typically at around 60 fps, though this can vary. This ensures we don't clog up the UI thread when it's unable to draw.
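If you're curious what rate your browser is actually delivering, requestAnimationFrame passes a high-resolution timestamp to its callback which you can inspect. A throwaway diagnostic sketch, not part of the guide code:

// Log the interval between animation frames as a quick sanity check.
let lastTimestamp = 0;
function probeFrameRate(timestamp) {
  if (lastTimestamp) console.log(`frame interval: ${(timestamp - lastTimestamp).toFixed(1)} ms`);
  lastTimestamp = timestamp;
  window.requestAnimationFrame(probeFrameRate);
}
window.requestAnimationFrame(probeFrameRate);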

We'll need this.requestData to request the peak values from the audio processing scope (thread) and then recursively schedule the next call to requestAnimationFrame. Also remember that when using Superpowered in JavaScript, all communication between threads must occur over the Worklet message port. The port is available in the main scope via the methods sendMessageToAudioScope and onMessageFromAudioScope. Within the AudioWorkletProcessor script, it's available via sendMessageToMainScope and onMessageFromMainScope.

requestData() {
  // First we check that the processorNode has been created before we try to send messages to it.
  if (this.processorNode?.sendMessageToAudioScope) {
    // We then send a message which tells the processorNode to send back its locally held peak values.
    this.processorNode.sendMessageToAudioScope({
      type: "dataAnalyzerRequest" // We parse this message.type within the processorNode; the same port is used to send slider value changes.
    });
  }
  // Here we schedule the next request, tied to the browser's screen refresh rate.
  window.requestAnimationFrame(this.requestData.bind(this));
}

Next, we must make our AudioWorkletProcessor return the data back to the main thread, so within our AudioWorkletProcessor add the following to the onMessageFromMainScope method.

...
onMessageFromMainScope(message) {
  if (message.type === "parameterChange") {
    if (message.payload?.id === "inputGain") this.inputGain = message.payload.value;
    else if (message.payload?.id === "reverbMix") this.reverb.mix = message.payload.value;
    else if (message.payload?.id === "filterFrequency") this.filter.frequency = message.payload.value;
+ } else if (message.type === "dataAnalyzerRequest") {
+   this.sendMessageToMainScope({
+     event: "dataAnalyzerData", // Matched in the main scope's message handler below.
+     data: {
+       analyzerData: {
+         inputPeakDb: this.inputPeakValue,
+         outputPeakDb: this.outputPeakValue
+       }
+     }
+   });
  }
}
...

Now that we have data being returned from the Worklet, we need to draw the results into the browser DOM. Back in the main scope, we handle the incoming data from the audio thread.

onMessageProcessorAudioScope(message) {
  if (message.event === "ready") {
    console.log(message);
  }
  if (message.event === "dataAnalyzerData") {
    this.drawInputMeterCanvas(message.data.analyzerData.inputPeakDb);
    this.drawOutputMeterCanvas(message.data.analyzerData.outputPeakDb);
  }
}

The two functions called above use the references to the two canvas contexts we created earlier to draw a filled rectangle whose width is proportional to the peak value.

drawInputMeterCanvas(peakValue) {
  this.inputCanvasContext.clearRect(0, 0, 100, 10); // First clear the current drawing,
  this.inputCanvasContext.fillRect(0, 0, 100 * peakValue, 10); // then draw the updated value, scaled to the width of the canvas in the DOM.
}

drawOutputMeterCanvas(peakValue) {
  this.outputCanvasContext.clearRect(0, 0, 100, 10);
  this.outputCanvasContext.fillRect(0, 0, 100 * peakValue, 10);
}
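As an optional refinement (not part of the guide code), you might let the meter fall back gradually rather than snapping to each new value. A minimal sketch, assuming a per-frame decay factor of 0.95 and a displayedInputPeak property we introduce here for illustration:

drawInputMeterCanvas(peakValue) {
  // Display the larger of the new peak and the decayed previous display value.
  this.displayedInputPeak = Math.max(peakValue, (this.displayedInputPeak || 0) * 0.95);
  this.inputCanvasContext.clearRect(0, 0, 100, 10);
  this.inputCanvasContext.fillRect(0, 0, 100 * this.displayedInputPeak, 10);
}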

There are many ways to go about rendering things on screen, but we've gone with the simplest to keep the guide clean and focused. We'll schedule AppKit to render the meters by using an NSTimer firing 100 times per second to call a function which updates our NSLevelIndicator values. AppKit will then manage rendering them on screen.

Right at the bottom of our viewDidLoad function, let's set up that timer right after the Superpowered audio I/O starts running. It will call a new method, animate.

- (void)viewDidLoad {
    [super viewDidLoad];
    ...
    audioIO = [[SuperpoweredOSXAudioIO alloc] initWithDelegate:(id<SuperpoweredOSXAudioIODelegate>)self preferredBufferSizeMs:12 numberOfChannels:2 enableInput:true enableOutput:true];
    [audioIO start];
    // Start the timer to update the meters.
+   timer = [NSTimer timerWithTimeInterval:1.0/100.0 target:self selector:@selector(animate) userInfo:nil repeats:YES];
+   [[NSRunLoop mainRunLoop] addTimer:timer forMode:NSRunLoopCommonModes];
}

animate simply sets the values of the two NSLevelIndicator instances we created earlier.

- (void)animate {
    [self.inputLevelMeter setDoubleValue:inputPeak * 10];
    [self.outputLevelMeter setDoubleValue:outputPeak * 10];
}

Our peak values are on a scale from 0 to 1, so we multiply by 10 to scale them up for the meters. This makes the color stops easier to configure in Interface Builder: here we set the warning level at 7 and the critical level at 10.


End result

We created an ES6 JavaScript sandbox that applies all of the steps above for you to try out and experiment with.

If it has all gone well, you should end up with the following application.


You can find the example code for this guide and all the others in both JS and native in one repository over at GitHub.

splice/superpowered-guides — https://github.com/splice/superpowered-guides

Download the code examples for all the Superpowered guides in both Native and JS.


v1.0.31