Joel A. Tropp, Member, IEEE, Jason N. Laska, Student Member, IEEE, Marco F. Duarte, Member, IEEE, Justin K. Romberg, Member, IEEE, and Richard G. Baraniuk, Fellow, IEEE
This paper introduces a novel data acquisition system called the random demodulator, designed to efficiently sample sparse bandlimited signals. Unlike traditional acquisition, which must sample at the Nyquist rate of $W$ Hz, the random demodulator operates at a much lower rate of $O(K \log(W/K))$ samples per second, where $K$ is the number of significant frequencies and $W$ is the bandlimit. This approach exploits the signal's sparsity, the fact that only a small number of frequencies are present, to reduce the sampling rate substantially.
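To give a rough sense of the scaling, the back-of-the-envelope computation below compares the Nyquist rate with the $K \log(W/K)$ term for illustrative values of $K$ and $W$; the numbers are not taken from the paper, the logarithm base is arbitrary, and the big-O constant is omitted.

```python
import numpy as np

# Illustrative parameters (not from the paper): a 1 GHz bandlimit
# occupied by only K = 100 significant tones.
W = 1e9   # Nyquist rate, samples per second
K = 100   # number of significant frequencies

nyquist_rate = W
sub_nyquist_scaling = K * np.log2(W / K)   # K log(W/K), constant factor omitted

print(f"Nyquist rate:        {nyquist_rate:.2e} samples/s")
print(f"K log2(W/K) scaling: {sub_nyquist_scaling:.2e} samples/s")
# ~2.3e3 vs 1e9: sparsity admits a reduction of more than five orders of magnitude.
```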
The random demodulator works by multiplying the input signal by a high-rate pseudonoise sequence, which smears the tones across the entire spectrum. A lowpass anti-aliasing filter is then applied, followed by sampling at a low rate. This process gives each frequency a distinct signature within the filter's passband, allowing the tones to be identified from the low-rate samples.
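This signal chain can be mimicked in a few lines of NumPy. The sketch below is a toy, discrete-time model under assumed parameters (the window length, tone count, and sample budget are illustrative, not the paper's), with an integrate-and-dump accumulator standing in for the lowpass filter and low-rate sampler.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy, discrete-time stand-in (all sizes illustrative): W Nyquist-rate
# samples in the observation window, K active tones, R low-rate samples.
W, K, R = 512, 5, 64   # R must divide W in this simple integrate-and-dump model

# Sparse multitone input: K randomly placed tones with random amplitudes.
freqs = rng.choice(W, size=K, replace=False)
amps = rng.standard_normal(K) + 1j * rng.standard_normal(K)
n = np.arange(W)
x = (amps[:, None] * np.exp(2j * np.pi * freqs[:, None] * n / W)).sum(axis=0)

# Random demodulation: multiply by a +/-1 chipping sequence at the Nyquist
# rate, then lowpass and sample at the low rate (modeled as integrate-and-dump).
chips = rng.choice([-1.0, 1.0], size=W)
y = (x * chips).reshape(R, W // R).sum(axis=1)

print(y.shape)   # (R,) low-rate measurements, with R << W
```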
The paper supports the effectiveness of the random demodulator with both detailed theoretical analysis and empirical results. The theoretical guarantees show that a sampling rate of $O(K \log W + \log^3 W)$ suffices for successful reconstruction, while empirical results indicate that the system can reconstruct signals at a sampling rate of $R \approx 1.7 K \log(W/K + 1)$. The system is also shown to be robust against noise and quantization errors, making it suitable for a wide range of applications, including ultrawideband and radar systems.
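Recovery from the low-rate samples can be prototyped with any standard sparse solver. The sketch below applies a plain orthogonal matching pursuit loop, used here only as a stand-in for the convex and greedy algorithms discussed in the paper, to the same toy acquisition model as above; all sizes remain illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
W, K, R = 512, 5, 64   # toy sizes; R must divide W in this simple model

# Toy acquisition (same model as the sketch above): sparse multitone signal,
# +/-1 chipping sequence, integrate-and-dump to R low-rate samples.
freqs = rng.choice(W, size=K, replace=False)
amps = rng.standard_normal(K) + 1j * rng.standard_normal(K)
n = np.arange(W)
x = (amps[:, None] * np.exp(2j * np.pi * freqs[:, None] * n / W)).sum(axis=0)
chips = rng.choice([-1.0, 1.0], size=W)
y = (x * chips).reshape(R, W // R).sum(axis=1)

# Explicit measurement matrix Phi = H D F: Fourier synthesis (F), chipping (D),
# then summation over consecutive blocks of Nyquist-rate samples (H).
F = np.exp(2j * np.pi * np.outer(n, n) / W)
Phi = (chips[:, None] * F).reshape(R, W // R, W).sum(axis=1)

# Orthogonal matching pursuit: greedily pick the tone most correlated with the
# residual, then re-fit all selected tones by least squares.
support, residual = [], y.copy()
for _ in range(K):
    idx = int(np.argmax(np.abs(Phi.conj().T @ residual)))
    support.append(idx)
    coef, *_ = np.linalg.lstsq(Phi[:, support], y, rcond=None)
    residual = y - Phi[:, support] @ coef

print(sorted(support) == sorted(freqs.tolist()))   # True when the tones are recovered
```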
The random demodulator is constructed from robust, readily available components, making it a practical solution for efficient sampling of sparse bandlimited signals. The paper discusses the implementation details, nonidealities, and the mathematical model of the signals, providing a comprehensive framework for understanding and applying the random demodulator.