Documentation
Tools for embedded systems
Tapped Delay Line in O(1)

The class qTDL is an implementation of the Tapped Delay Line (TDL) structure. A TDL is a discrete element in digital filter theory that delays a signal by a number of samples and provides an output for each delay tap. Here, the TDL is implemented as a circular buffer, so the enqueue and dequeue operations have constant O(1) complexity and integer delays can be computed very efficiently.
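As an illustration of the idea (a generic sketch, not the qlibs internals), a circular-buffer TDL only advances a head index on each insertion instead of shifting samples:

#include <stddef.h>

typedef struct
{
    float *buf;    /* tap storage, n samples long */
    size_t n;      /* number of taps (maximum delay) */
    size_t head;   /* index of the most recent sample */
} tdl_sketch_t;

/* O(1): overwrite the oldest slot and advance the head index. */
void tdl_sketch_insert( tdl_sketch_t *t, const float x )
{
    t->head = ( t->head + 1u ) % t->n;
    t->buf[ t->head ] = x;
}

/* O(1): read the tap z^-d, i.e. the sample inserted d steps ago. */
float tdl_sketch_tap( const tdl_sketch_t *t, const size_t d )
{
    return t->buf[ ( t->head + t->n - ( d % t->n ) ) % t->n ];
}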

A delay of one sample is denoted \(z^{-1}\) and a delay of \(N\) samples is denoted \(z^{-N}\), a notation motivated by the role the z-transform plays in describing digital filter structures.
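For example, delaying a signal \(x[k]\) by \(N\) samples corresponds to multiplying its z-transform by \(z^{-N}\):

\[
y[k] = x[k-N] \quad\Longleftrightarrow\quad Y(z) = z^{-N}\,X(z)
\]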

To create a TDL, you just need to define an instance of type qTDL_t and then configure it by using qTDL_Setup(), where you define the number of delays and the initial value for all of them. Then, you can start operating over this structure by inserting samples of the input signal with qTDL_InsertSample(). You can also get any specific delay from it by using qTDL_GetOldest(), which returns the sample at the last tap \(z^{-N}\), or qTDL_GetAtIndex(), which returns the delayed sample at a given tap.
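A minimal sketch of this sequence, assuming the qlibs API names above (the tap-indexing convention of qTDL_GetAtIndex() should be confirmed against the reference):

#include <stddef.h>
#include "qtdl.h"

#define N_DELAYS ( 4 )

float delay_storage[ N_DELAYS ];
qTDL_t delayLine;

void example( void )
{
    const float x[] = { 1.0f, 2.0f, 3.0f };

    /* all N_DELAYS taps start holding 0.0f */
    qTDL_Setup( &delayLine, delay_storage, N_DELAYS, 0.0f );
    for ( size_t i = 0u ; i < ( sizeof(x)/sizeof(x[0]) ) ; ++i ) {
        qTDL_InsertSample( &delayLine, x[ i ] ); /* push the input signal */
    }
    const float zN = qTDL_GetOldest( &delayLine );      /* sample at the last tap z^-N */
    const float zi = qTDL_GetAtIndex( &delayLine, 1u ); /* sample at an intermediate tap */
    (void)zN; (void)zi;
}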

Given its applications, the qTDL class is used as a base component in parts of qSSMoother and qLTISys.

Example: Code snippet to instantiate a TDL to hold up to 256 delays.

#include "qtdl.h"

#define MAX_DELAYS ( 256 )
float tdl_storage[ MAX_DELAYS ];
qTDL_t tdl;

qTDL_Setup( &tdl, tdl_storage, MAX_DELAYS, 0.0f );
void qTDL_Setup( qTDL_t *const q, float *const area, const size_t n, const float initVal )
Setup and initialize a Tapped Delay Line (TDL) instance by setting the default optimal parameters. (Defined in qtdl.c.)

qTDL_t
A Tapped Delay Line (TDL) object. (Defined in qtdl.h.)
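As a sketch of a typical application (illustrative, not taken from the library), the TDL above can implement a feedforward comb/echo filter \(y[k] = x[k] + g\,x[k-N]\); reading the oldest tap before inserting the new sample is assumed here to give exactly \(N\) samples of delay:

/* One filter step: y[k] = x[k] + g * x[k-N], with N = MAX_DELAYS taps. */
float comb_step( qTDL_t *const line, const float x, const float g )
{
    const float xOld = qTDL_GetOldest( line ); /* x[k-N], read before inserting */
    qTDL_InsertSample( line, x );              /* push x[k], dropping the oldest */
    return x + ( g * xOld );                   /* y[k] */
}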