
Glossary

Explore key terms and concepts in the "Ethics by Design" course. This glossary provides clear definitions to help you better understand the material and enhance your learning experience.

You are welcome to add new entries and edit or comment on existing ones.




Transformer

by Valentin Weber - Sunday, 25 August 2024, 2:26 PM

The Transformer is a neural network architecture that relies on attention mechanisms to process all tokens in a sequence simultaneously, allowing the model to capture the context and relationships between words regardless of how far apart they are. Because it does not read text token by token, it can be trained more efficiently than traditional sequential models such as recurrent neural networks, and it improves performance on tasks like machine translation and text generation. Self-attention lets the model weigh how relevant each word is to every other word, while positional encoding preserves the word-order information that parallel processing would otherwise discard.
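A minimal NumPy sketch of the two ideas named above, scaled dot-product self-attention and sinusoidal positional encoding. The function names, array shapes, and the toy example are illustrative assumptions, not part of the original entry; a full Transformer additionally uses learned projection matrices, multiple attention heads, and feed-forward layers.

import numpy as np

def scaled_dot_product_attention(Q, K, V):
    # Compare every token with every other token (scores), normalise the
    # scores with a softmax, and return a context-aware mix of the values.
    d_k = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights = weights / weights.sum(axis=-1, keepdims=True)
    return weights @ V

def positional_encoding(seq_len, d_model):
    # Sinusoidal encodings: each position gets a unique pattern of
    # sines and cosines, so word order survives parallel processing.
    pos = np.arange(seq_len)[:, None]
    i = np.arange(d_model)[None, :]
    angles = pos / np.power(10000.0, (2 * (i // 2)) / d_model)
    pe = np.zeros((seq_len, d_model))
    pe[:, 0::2] = np.sin(angles[:, 0::2])
    pe[:, 1::2] = np.cos(angles[:, 1::2])
    return pe

# Toy example (hypothetical data): 4 token embeddings of dimension 8.
rng = np.random.default_rng(0)
x = rng.normal(size=(4, 8)) + positional_encoding(4, 8)  # add order information
out = scaled_dot_product_attention(x, x, x)  # self-attention: Q = K = V = x
print(out.shape)  # (4, 8): one context-aware vector per token

All four output vectors are computed at once, which is what makes the Transformer parallelisable, in contrast to a recurrent model that must step through the sequence one token at a time.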

Vaswani et al., 2017, "Attention Is All You Need"
