
Speech dereverberation is an important component of effective far-field voice interfaces in many applications. Algorithms based on multichannel linear prediction (MCLP) have proven especially effective for blind speech dereverberation, and numerous variants have been introduced in the literature. Most of these approaches can be derived from a common framework in which the MCLP problem for speech dereverberation is formulated as a weighted least squares problem that can be solved analytically. Since conventional batch MCLP-based dereverberation algorithms are not suitable for low-latency applications, a number of online variants based on the recursive least squares (RLS) algorithm have been proposed. However, RLS-based approaches often suffer from numerical instability, and their use in online systems can be further limited by high computational complexity when the number of channels or filter taps is large. In this paper, we address these issues of numerical robustness and computational complexity. More specifically, we derive alternative online weighted least squares algorithms through Householder RLS and the Householder least squares lattice (HLSL), which are numerically stable and retain the fast convergence of the RLS algorithm. Furthermore, we derive an angle-normalized variant of the HLSL algorithm and show that it is robust to speech cancellation over a wide range of forgetting factors and filter taps. Finally, we support our findings with experimental results demonstrating numerical and algorithmic robustness, long-term stability, linear complexity in the number of filter taps, low memory footprint, and effectiveness in speech recognition applications.
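To make the batch formulation concrete, the sketch below (illustrative only; function names, parameter defaults, and structure are assumptions, not the paper's implementation) performs WPE-style MCLP dereverberation for a single STFT frequency bin. Each iteration solves the weighted least squares problem analytically via the normal equations, with per-frame weights set to the inverse of the current power estimate of the desired signal:

```python
import numpy as np

def mclp_dereverb(X, taps=10, delay=3, iters=3, eps=1e-8):
    """WPE-style batch MCLP dereverberation sketch for one frequency bin.

    X: complex array of shape (channels M, frames T) of STFT coefficients.
    Returns the dereverberated signal, same shape as X.
    """
    M, T = X.shape
    K = M * taps
    # Stacked delayed regressor: column t holds `taps` past frames of all
    # channels, delayed by `delay` frames to preserve early reflections.
    Xt = np.zeros((K, T), dtype=complex)
    for k in range(taps):
        shift = delay + k
        Xt[k * M:(k + 1) * M, shift:] = X[:, :T - shift]
    D = X.copy()
    for _ in range(iters):
        # Time-varying weights: inverse of the current desired-signal power.
        lam = np.maximum(np.mean(np.abs(D) ** 2, axis=0), eps)
        Xw = Xt / lam                            # weight each frame
        R = Xw @ Xt.conj().T                     # weighted covariance (K, K)
        p = Xw @ X.conj().T                      # weighted cross-corr. (K, M)
        G = np.linalg.solve(R + eps * np.eye(K), p)  # prediction filters
        D = X - G.conj().T @ Xt                  # subtract late reverberation
    return D
```

The online algorithms discussed in the abstract replace the explicit solve of these normal equations with recursive (RLS-type) updates per frame.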

Related readings and updates.

Double-talk Robust Multichannel Acoustic Echo Cancellation Using Least Squares MIMO Adaptive Filtering: Transversal, Array, and Lattice Forms

In this paper, we address the problem of noise-robust multiple-input multiple-output (MIMO) adaptive filtering that is optimal in the least-squares sense, with application to multichannel acoustic echo cancellation. We formulate the problem as minimization of a multichannel least squares cost function that incorporates near-end speech and noise statistics, resulting in a novel noise-robust framework for MIMO adaptive filtering. Although the issue of…
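For readers unfamiliar with the transversal form mentioned in the title, the following minimal single-channel sketch (a generic textbook exponentially-weighted RLS filter, not the paper's MIMO method; class name and defaults are illustrative) shows the recursive update that the least-squares family of adaptive filters builds on:

```python
import numpy as np

class RLS:
    """Minimal exponentially-weighted RLS transversal filter (sketch)."""

    def __init__(self, n, lam=0.99, delta=1e2):
        self.w = np.zeros(n)          # filter coefficients
        self.P = delta * np.eye(n)    # inverse correlation matrix estimate
        self.lam = lam                # forgetting factor

    def update(self, x, d):
        """One recursion: input vector x (n,), desired sample d."""
        Px = self.P @ x
        k = Px / (self.lam + x @ Px)          # gain vector
        e = d - self.w @ x                    # a priori error
        self.w = self.w + k * e               # coefficient update
        self.P = (self.P - np.outer(k, Px)) / self.lam
        return e
```

The direct propagation of `P` in this form is a known source of the numerical instability that array (e.g. Householder) and lattice forms are designed to avoid.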
See paper details

Least Squares Binary Quantization of Neural Networks

Quantizing weights and activations of deep neural networks results in significant improvement in inference efficiency at the cost of lower accuracy. A source of the accuracy gap between full-precision and quantized models is the quantization error. In this work, we focus on binary quantization, in which values are mapped to -1 and 1. We provide a unified framework to analyze different scaling strategies. Inspired by the Pareto-optimality of…
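As background for the scaling strategies the teaser mentions, the sketch below (illustrative; not the paper's framework) shows the classical single-scale case: minimizing ||w - a*sign(w)||^2 over the scale a gives a = mean(|w|), a standard least-squares result for binary quantization:

```python
import numpy as np

def binarize_ls(w):
    """Least-squares optimal single-scale binary quantization (sketch).

    Setting d/da ||w - a*sign(w)||^2 = 0 yields a = mean(|w|)
    (assuming no exact zeros in w, since sign(0) = 0).
    """
    a = np.mean(np.abs(w))
    return a * np.sign(w), a
```

Richer strategies (e.g. per-channel or multi-term scales) generalize this same least-squares criterion.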
See paper details