Abstract: The self-attention mechanism was first used to develop the Transformer architecture for natural language processing. The groundbreaking work “Attention Is All You Need” (Vaswani et al., 2017) for Natural Language ...