Latent Space
The compressed, abstract representation that AI models use internally to encode and manipulate concepts.
AI-generated
Latent space is the internal representation a model builds of its inputs. When a language model processes text, it maps each token to a vector in a high-dimensional latent space where semantically similar concepts sit near each other. Image generation models like Stable Diffusion work in a latent space where each point decodes to a possible image.
Latent spaces are how AI models organize and manipulate information. Operations that are complex in the real world, like blending two image styles, become simple in latent space: interpolating between two points. Understanding latent space also helps explain why AI can generalize: models operate on abstract representations, not raw data.
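As a toy sketch of the interpolation idea (the 3-dimensional vectors below are made up for illustration; real models use embeddings with hundreds or thousands of dimensions):

```python
import numpy as np

# Toy "latent vectors" -- in a real model these would come from an
# encoder; the values here are invented for illustration only.
cat = np.array([0.9, 0.1, 0.3])
dog = np.array([0.8, 0.2, 0.4])

def interpolate(a, b, t):
    """Linearly blend two latent vectors: t=0 gives a, t=1 gives b."""
    return (1 - t) * a + t * b

def cosine_similarity(a, b):
    """Nearby directions in latent space correspond to similar concepts."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

# A point halfway between the two concepts.
midpoint = interpolate(cat, dog, 0.5)
print(midpoint)
print(cosine_similarity(cat, dog))
```

In an image model, decoding `midpoint` back through the decoder would produce an image blending the two styles; here it is just the element-wise average of the two vectors.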
Jay Alammar: "The Illustrated Stable Diffusion" - https://jalammar.github.io/illustrated-stable-diffusion/
Wikipedia: Latent space - https://en.wikipedia.org/wiki/Latent_space