Unidirectional is *not* the only way...
Recurrent Neural Networks, Long Short-Term Memory networks, and Gated Recurrent Units all share one trait: they are unidirectional. That means context is built by considering only the past. However, some use cases benefit from having context of both the past and the future to estimate the present. Machine translation, for instance, is one NLP use case that benefits heavily from context built from both the past and the future. Neural networks like this are called Bidirectional.
While we are discussing use cases, you might want to pause and think of one case where you would never use a Bidirectional Neural Network, and comment it below.
How a BiRNN works (or essentially how any bidirectional network works) is simple: two separate RNN layers process the data, one from each direction, and their outputs are then merged to form the final output.
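To make that concrete, here is a minimal sketch in PyTorch. The layer sizes and the choice of concatenation as the merge step are my assumptions for illustration, not fixed requirements.

```python
import torch
import torch.nn as nn

# A minimal bidirectional RNN sketch: two RNN layers, one per direction,
# with their outputs concatenated at each time step.
class BiRNN(nn.Module):
    def __init__(self, input_size=16, hidden_size=32):
        super().__init__()
        # One RNN reads the sequence left-to-right, the other right-to-left.
        self.forward_rnn = nn.RNN(input_size, hidden_size, batch_first=True)
        self.backward_rnn = nn.RNN(input_size, hidden_size, batch_first=True)

    def forward(self, x):
        # x: (batch, seq_len, input_size)
        out_fwd, _ = self.forward_rnn(x)
        # Reverse the time axis, run the second RNN, then flip its output
        # back so both directions align on the same time steps.
        out_bwd, _ = self.backward_rnn(torch.flip(x, dims=[1]))
        out_bwd = torch.flip(out_bwd, dims=[1])
        # Merge the two directions; concatenation is the most common choice.
        return torch.cat([out_fwd, out_bwd], dim=-1)

x = torch.randn(4, 10, 16)   # 4 sequences, 10 time steps, 16 features
print(BiRNN()(x).shape)      # torch.Size([4, 10, 64])
```

In practice you rarely wire this up by hand; passing `bidirectional=True` to PyTorch's `nn.RNN`, `nn.LSTM`, or `nn.GRU` builds the same two-direction structure for you.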
BiRNNs and other bidirectional networks are computationally expensive. This makes them harder to train: they require more memory and take considerably more time. However, they usually outperform unidirectional networks in terms of accuracy and can also handle variable-length sequences better.
Written by
Japkeerat Singh
Hi, I am Japkeerat. I have been working as a Machine Learning Engineer since January 2020, straight out of college. During this period, I've worked on extremely challenging projects: Security Vulnerability Detection using Graph Neural Networks, User Segmentation for better click-through rate of notifications, and MLOps Infrastructure development for startups, to name a few. I keep my articles precise, with a maximum of 4 minutes of reading time. I'm currently actively writing 2 series: one for beginners in Machine Learning and another on more advanced concepts. The newsletter, if you subscribe, will send 1 article every Thursday on the advanced concepts.