Documentation
Tools for embedded systems
The qlibs::tdl class implements a Tapped Delay Line (TDL), a fundamental structure in digital filter theory. A TDL delays an input signal by a given number of samples and exposes each delayed version as a separate output.
In this implementation, the TDL is backed by a circular buffer. As a result, both enqueue and dequeue operations run in constant time, O(1), making integer delays highly efficient.
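The constant-time behavior follows directly from the circular-buffer layout: inserting a sample only overwrites the oldest slot and moves a head index, and reading a tap is a single offset lookup. The self-contained sketch below illustrates the idea; the class and member names are illustrative stand-ins, not the actual qlibs implementation.

```cpp
#include <cstddef>

// Illustrative sketch of a TDL backed by a circular buffer (not the qlibs code).
// "head" always points at the most recent sample; older samples follow it.
class DelayLineSketch {
public:
    DelayLineSketch( float *storage, std::size_t n, float initVal = 0.0f )
        : buf( storage ), size( n ), head( 0U )
    {
        for ( std::size_t i = 0U; i < n; ++i ) {
            buf[ i ] = initVal; // preload every tap with the initial value
        }
    }
    // Enqueue: O(1). Step the head back and overwrite the oldest sample in place.
    void insertSample( float x )
    {
        head = ( 0U == head ) ? ( size - 1U ) : ( head - 1U );
        buf[ head ] = x;
    }
    // Read a tap: O(1). Index i yields the input delayed by i samples, i.e. z^-i.
    float getAtIndex( std::size_t i ) const
    {
        return buf[ ( head + i ) % size ];
    }
private:
    float *buf;
    std::size_t size;
    std::size_t head;
};
```

Note that no data is ever shifted: a naive array-shifting delay line would cost O(N) per inserted sample, while the circular buffer keeps every operation O(1) regardless of the delay length.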
A delay of one sample is denoted \(z^{-1}\), while a delay of \(N\) samples is denoted \(z^{-N}\), consistent with z-transform notation in digital signal processing.
To use a TDL, create an instance of qlibs::tdl and configure it using either the constructor or qlibs::tdl::setup(). During setup you specify both the number of delay taps and their initial values. Once configured, samples of the input signal can be inserted using qlibs::tdl::insertSample() (or the function-call operator).
You can access a specific delayed sample with the index operator, delay[i], which is equivalent to calling qlibs::tdl::getAtIndex(i).
Because of its versatility, the qlibs::tdl class is a core component in higher-level modules such as qlibs::smoother and qlibs::ltisys.
Typical usage therefore involves two operations: inserting new samples, and retrieving delayed samples.
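The insert and retrieve operations described above can be sketched together. Since this page does not give the exact qlibs++ signatures, the snippet below uses a hypothetical stand-in class that only mirrors the method names mentioned here (insertSample, the function-call operator, getAtIndex, and the index operator); the real qlibs::tdl interface may differ in its types and setup details.

```cpp
#include <cstddef>
#include <cstdio>

// Hypothetical stand-in mirroring the method names described in this page;
// the real qlibs::tdl signatures may differ.
struct Tdl {
    float buf[ 8 ] = {};          // taps, preloaded with zeros
    std::size_t n = 8U;
    std::size_t head = 0U;        // index of the most recent sample

    void insertSample( float x )
    {
        head = ( 0U == head ) ? ( n - 1U ) : ( head - 1U );
        buf[ head ] = x;
    }
    void operator()( float x ) { insertSample( x ); }   // function-call form of insertion
    float getAtIndex( std::size_t i ) const { return buf[ ( head + i ) % n ]; }
    float operator[]( std::size_t i ) const { return getAtIndex( i ); } // delay[i]
};

int tdlDemo( void )
{
    Tdl delay;
    delay( 1.0f );                 // insert via the function-call operator
    delay( 2.0f );
    delay.insertSample( 3.0f );    // or via insertSample()
    // delay[ 0 ] is the newest sample; delay[ i ] is the input delayed i steps (z^-i)
    std::printf( "z^-1 tap = %g\n", static_cast<double>( delay[ 1 ] ) );
    return 0;
}
```

Here delay[0] is the most recent sample and delay[i] the input delayed by i steps, matching the \(z^{-i}\) notation used earlier.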