Bidirectional Recurrent Neural Network
A Bidirectional Recurrent Neural Network (BRNN) connects two hidden layers running in opposite directions to the same output. The output layer can therefore use information from both past states (via the forward direction) and future states (via the backward direction).
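To make this concrete, here is one common way to write the BRNN computation at each time step t. The notation below is my own generic sketch, not taken from any specific paper: the forward states read the sequence left to right, the backward states read it right to left, and both feed the shared output.

```latex
% Forward states carry past context, backward states carry future context.
\begin{aligned}
\overrightarrow{h}_t &= f\left(W_x x_t + W_h \overrightarrow{h}_{t-1} + b\right)
  && \text{(forward state)} \\
\overleftarrow{h}_t  &= f\left(W'_x x_t + W'_h \overleftarrow{h}_{t+1} + b'\right)
  && \text{(backward state)} \\
y_t &= g\left(U\,[\overrightarrow{h}_t ; \overleftarrow{h}_t] + c\right)
  && \text{(shared output layer)}
\end{aligned}
```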
BRNNs were introduced to increase the amount of input information available to the network. For example, multilayer perceptrons (MLPs) and time delay neural networks (TDNNs) limit input flexibility because they require their input to be of fixed size. Standard recurrent neural networks (RNNs) are also restricted, since future input information cannot be reached from the current state. BRNNs, in contrast, do not require fixed-size input, and future input information is reachable from the current state.
In simple words: why do we need a BRNN when we already have the RNN? Because if we want to predict a word in a sentence, the prediction may depend not only on the past words but also on the upcoming future words.

Take email as an example. When we write an email, the editor suggests what the next word of the sentence could be. This is where a Bidirectional Recurrent Neural Network comes in, because information flows through the sequence in both directions.
Bidirectional Recurrent Neural Networks are also used extensively in handwriting recognition, where performance can be enhanced by knowledge of the letters located before and after the current letter.
BRNN Architecture


This is what a Bidirectional Recurrent Neural Network looks like. Don't worry if the diagram seems complicated; it's quite simple. I will walk through each step in this diagram and explain how it works, so let's jump into it.
If you followed my Recurrent Neural Network blog, you saw that an RNN has a hidden layer with some number of neurons, and that at each time step t it processes one word and passes the result, together with the previous output, to the next step. Here we take one such RNN strip running in one direction and add another RNN strip running in the opposite direction with respect to the first.
These two RNNs are connected to each other: they share the same input and feed the same output layer. The principle of a BRNN is to split the neurons of a regular RNN into two directions, one for the positive time direction (forward states) and one for the negative time direction (backward states). By using two time directions, input information from both the past and the future of the current time frame can be used, unlike in a standard RNN, which requires delays to include future information.
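To make the architecture concrete, here is a minimal sketch in PyTorch. This is my own illustrative example, not code from the original paper; the layer sizes are arbitrary assumptions, and `bidirectional=True` is PyTorch's built-in way of creating the two opposite-direction strips.

```python
# A minimal BRNN sketch in PyTorch (illustrative sizes, not prescriptive).
import torch
import torch.nn as nn

class BRNN(nn.Module):
    def __init__(self, input_size, hidden_size, num_classes):
        super().__init__()
        # bidirectional=True creates the two RNN strips: one reads the
        # sequence left-to-right (forward states), the other right-to-left
        # (backward states).
        self.rnn = nn.RNN(input_size, hidden_size,
                          batch_first=True, bidirectional=True)
        # Both directions feed the same output layer, so it receives
        # 2 * hidden_size features per time step.
        self.out = nn.Linear(2 * hidden_size, num_classes)

    def forward(self, x):
        # x: (batch, seq_len, input_size)
        states, _ = self.rnn(x)   # (batch, seq_len, 2 * hidden_size)
        return self.out(states)   # one prediction per time step

model = BRNN(input_size=10, hidden_size=32, num_classes=5)
x = torch.randn(4, 7, 10)         # batch of 4 sequences, each of length 7
y = model(x)
print(y.shape)                    # torch.Size([4, 7, 5])
```

Note how the output layer sees the concatenation of the forward and backward states at each time step, which is exactly the "two directions, one output" idea described above.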
Training a BRNN -
BRNNs can be trained with algorithms similar to those used for RNNs, because the two sets of directional neurons do not interact with each other. However, when back-propagation through time is applied, additional steps are needed, because the input and output layers cannot all be updated at once. The general training procedure is as follows. In the forward pass, the forward states and the backward states are computed first, and then the output neurons. In the backward pass, the output neurons are processed first, and then the forward and backward states. After the forward and backward passes are done, the weights are updated.
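As a rough sketch, here is how training the BRNN defined above could look in PyTorch, where autograd carries out back-propagation through time in the order just described. The loss function, optimizer, and dummy data are illustrative assumptions on my part, not part of the original procedure.

```python
# A rough training sketch for the BRNN defined above (assumed hyperparameters
# and dummy data; PyTorch's autograd performs back-propagation through time).
import torch
import torch.nn as nn

model = BRNN(input_size=10, hidden_size=32, num_classes=5)
criterion = nn.CrossEntropyLoss()
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)

# Dummy data: 4 sequences of length 7, with a class label at every time step.
x = torch.randn(4, 7, 10)
targets = torch.randint(0, 5, (4, 7))

for epoch in range(10):
    optimizer.zero_grad()
    logits = model(x)                        # forward pass: forward and backward
                                             # states first, then the output layer
    loss = criterion(logits.reshape(-1, 5),  # one prediction per time step
                     targets.reshape(-1))
    loss.backward()                          # backward pass through time
    optimizer.step()                         # update the weights
```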
Applications -
1. Speech Recognition (combined with Long Short-Term Memory)
2. Handwriting Recognition
3. Translation
For in-depth intuition, you can read the original research paper (https://papers.nips.cc/).
Your feedback is appreciated!
Did you find this blog helpful? Any suggestions for improvement? Please let me know by filling out the contact form. Thanks!