In recent years, we’ve witnessed a significant boom in generative AI, much of it traceable to a groundbreaking paper titled “Attention Is All You Need”. Published in 2017, this paper introduced the Transformer architecture, which underpins most of today’s successful large language models and set the stage for…