Dynamic Gated Recurrent Neural Network for Compute-efficient Speech Enhancement
September 2024 • Longbiao Cheng, Ashutosh Pandey, Buye Xu, Tobi Delbrück, Shih‐Chii Liu
This paper introduces a new Dynamic Gated Recurrent Neural Network (DG-RNN) for compute-efficient speech enhancement models running on resource-constrained hardware platforms. It exploits the slow evolution of RNN hidden states over time steps and updates only a selected set of neurons at each step by adding a newly proposed select gate to the RNN model. This select gate reduces the computation cost of the conventional RNN during network inference. As a realization of the DG-RNN, we …
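The abstract stops short of the full gate formulation, so the sketch below shows only one plausible reading of the idea: a GRU-style cell in which a hypothetical select gate scores the hidden neurons and only the top-k highest-scoring ones are updated at each step, while the rest carry their previous values forward. The class name `SelectGateGRUCell`, the linear scoring layer, and the parameter `k` are illustrative assumptions, not the paper's exact design.

```python
# Minimal sketch (assumed formulation, not the paper's exact DG-RNN):
# a GRU-style cell with a hypothetical "select gate" that picks the top-k
# hidden neurons to update each step; unselected neurons keep their old values.
import torch
import torch.nn as nn


class SelectGateGRUCell(nn.Module):
    def __init__(self, input_size: int, hidden_size: int, k: int):
        super().__init__()
        self.k = k                                   # number of neurons updated per step (assumed)
        self.gru = nn.GRUCell(input_size, hidden_size)
        self.select = nn.Linear(input_size + hidden_size, hidden_size)  # hypothetical select gate

    def forward(self, x: torch.Tensor, h: torch.Tensor) -> torch.Tensor:
        # Score each hidden neuron from the current input and previous state,
        # then build a 0/1 mask that keeps only the k highest-scoring neurons.
        scores = self.select(torch.cat([x, h], dim=-1))
        topk = scores.topk(self.k, dim=-1).indices
        mask = torch.zeros_like(h).scatter_(-1, topk, 1.0)

        # Full GRU update computed here for clarity; a compute-efficient
        # implementation would evaluate only the selected neurons.
        h_new = self.gru(x, h)
        return mask * h_new + (1.0 - mask) * h       # unselected neurons are carried over
```

Note that this mask-and-blend form is written for readability only; any real compute saving would come from skipping the recurrent-weight rows of the unselected neurons rather than computing and discarding them.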