Transformer

Posted on Sun 04 December 2022 in Models

Transformers were first introduced by Google in 2017 as an improvement on the then-state-of-the-art models used for language tasks: Recurrent Neural Networks. Unlike RNNs, which process tokens one at a time, Transformers use self-attention to process an entire sequence in parallel and to draw on context from both directions at once, which lets them train efficiently on much larger amounts of data.

Because every token can attend to every other token in the sequence, Transformers are good at interpreting context, for example disambiguating words with many meanings, such as "bank" in "river bank" versus "bank account".
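The context-weighing described above comes from scaled dot-product attention. Below is a minimal NumPy sketch, not the full Transformer: it omits learned projection matrices, multiple heads, and masking, and the function name and toy data are illustrative.

```python
import numpy as np

def scaled_dot_product_attention(Q, K, V):
    """Each query scores every key; the scores (after softmax) weight
    a mix of the values, so each output token is a context-dependent
    blend of the whole sequence."""
    d_k = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)  # similarity of each query to each key
    # Numerically stable softmax over the key dimension
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)
    return weights @ V, weights

# Toy self-attention: 3 tokens with 4-dimensional embeddings, Q = K = V.
rng = np.random.default_rng(0)
x = rng.normal(size=(3, 4))
out, w = scaled_dot_product_attention(x, x, x)
print(out.shape, w.shape)  # each row of w sums to 1
```

In a real model, Q, K, and V are learned linear projections of the input embeddings, and several such attention "heads" run in parallel; the sketch only shows the core weighting step.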