[Articles] Attention is All You Need

The foundational paper behind modern NLP models: it introduces the Transformer architecture, replacing recurrent networks (RNNs) with a self-attention mechanism.
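The core operation the paper introduces is scaled dot-product attention. A minimal sketch in NumPy (single head, no masking; the function name is illustrative, not from the paper):

```python
import numpy as np

def scaled_dot_product_attention(Q, K, V):
    """Q, K: (seq_len, d_k); V: (seq_len, d_v)."""
    d_k = Q.shape[-1]
    # Similarity of each query to every key, scaled by sqrt(d_k)
    # to keep the softmax in a well-behaved range.
    scores = Q @ K.T / np.sqrt(d_k)
    # Row-wise softmax turns scores into attention weights.
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)
    # Each output row is a weighted average of the value vectors.
    return weights @ V
```

Because every position attends to every other position directly, the sequence is processed in parallel rather than step by step as in an RNN.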