A gated recurrent unit (GRU) is a [[recurrent neural network (RNN)]] cell similar to the [[long short-term memory (LSTM)]] cell but with fewer parameters. The GRU cell was introduced by Cho et al. in 2014 and has since become popular in natural language processing, speech recognition, and other sequence-modelling applications.
The GRU cell has two gates, a reset gate and an update gate, which control the flow of information through the cell. The reset gate determines how much of the previous hidden state is used when computing the candidate hidden state (in effect, how much of the past to forget), while the update gate decides how much of the previous hidden state is carried over versus replaced by the candidate state.
![[Gated_Recurrent_Unit,_base_type.svg]]
Source: [Wikimedia Commons](https://commons.wikimedia.org/wiki/File:Gated_Recurrent_Unit,_base_type.svg)
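To make the gating explicit, here is a minimal sketch of a single GRU time step in NumPy. The parameter names (`W_z`, `U_z`, `b_z`, etc.) and the `params` dictionary layout are illustrative assumptions, not a reference implementation; the update convention `h_t = (1 - z_t) * h_prev + z_t * h_tilde` is one of the two equivalent sign conventions found in the literature.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def gru_step(x_t, h_prev, params):
    """One GRU time step: returns the new hidden state h_t."""
    W_z, U_z, b_z = params["z"]  # update-gate weights
    W_r, U_r, b_r = params["r"]  # reset-gate weights
    W_h, U_h, b_h = params["h"]  # candidate-state weights

    z_t = sigmoid(W_z @ x_t + U_z @ h_prev + b_z)               # update gate
    r_t = sigmoid(W_r @ x_t + U_r @ h_prev + b_r)               # reset gate
    h_tilde = np.tanh(W_h @ x_t + U_h @ (r_t * h_prev) + b_h)   # candidate state
    h_t = (1.0 - z_t) * h_prev + z_t * h_tilde                  # blend old and new
    return h_t
```

The reset gate `r_t` scales the previous hidden state before it enters the candidate computation, and the update gate `z_t` interpolates between keeping `h_prev` and adopting the candidate `h_tilde`.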
The advantage of using a GRU over a traditional RNN is that it can better handle long-term dependencies in sequential data, while requiring fewer parameters than an LSTM. This makes it more computationally efficient to train and run, often with comparable accuracy.
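The parameter difference is easy to check empirically. A quick sketch using PyTorch (assumed here purely for illustration): a GRU has three gate/weight groups versus the LSTM's four, so for the same hidden size its parameter count is roughly three quarters of the LSTM's.

```python
import torch.nn as nn

input_size, hidden_size = 128, 256
gru = nn.GRU(input_size=input_size, hidden_size=hidden_size)
lstm = nn.LSTM(input_size=input_size, hidden_size=hidden_size)

count = lambda m: sum(p.numel() for p in m.parameters())
print("GRU parameters :", count(gru))   # 3 gate groups
print("LSTM parameters:", count(lstm))  # 4 gate groups, ~4/3 of the GRU
```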