Embeddings
Numerical representations of text that capture semantic meaning, enabling AI to measure similarity between concepts.
An embedding converts text (a word, sentence, or document) into a list of numbers (a vector). Texts with similar meanings get similar vectors, even if they use different words. For example, "happy" and "joyful" would have similar embeddings, while "happy" and "refrigerator" would not.
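A minimal sketch of the idea: embeddings are compared with cosine similarity, which measures the angle between two vectors. The vectors below are toy values invented for illustration, not output from a real embedding model (real embeddings have hundreds or thousands of dimensions).

```python
import math

def cosine_similarity(a, b):
    # dot(a, b) / (||a|| * ||b||): 1.0 means identical direction, 0 means unrelated
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

# Toy 4-dimensional "embeddings" (hypothetical values for illustration)
happy = [0.9, 0.1, 0.3, 0.2]
joyful = [0.85, 0.15, 0.35, 0.25]
refrigerator = [0.05, 0.9, 0.1, 0.8]

print(cosine_similarity(happy, joyful))        # close to 1.0 (similar meaning)
print(cosine_similarity(happy, refrigerator))  # much lower (unrelated)
```

Because meaning is encoded as direction in the vector space, "similar meaning" becomes a simple geometric question.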
Embeddings are the backbone of semantic search, RAG systems, recommendation engines, and clustering. When an AI-powered search over a knowledge base finds relevant documents even though your query uses different words from the documents themselves, that is embeddings at work. They are essential infrastructure for any production AI application.
OpenAI: Embeddings guide - https://platform.openai.com/docs/guides/embeddings
Jay Alammar: "The Illustrated Word2Vec" - https://jalammar.github.io/illustrated-word2vec/