Embedding models and vector databases – semantic long-term memory
In addition to the reasoning capabilities provided by LLMs, intelligent applications require semantic long-term memory for storing and retrieving information.
Semantic memory typically consists of two core components: vector embedding models and vector databases. Embedding models represent the semantic meaning of unstructured data, such as text or images, as large arrays of numbers called vectors. Vector databases efficiently store and retrieve these vectors to support semantic search and context retrieval. These components work together, enabling the reasoning engine to access relevant context and information as needed.
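To make this concrete, here is a minimal sketch of the two components working together. It assumes the open-source sentence-transformers library as the embedding model and uses a flat FAISS index as a stand-in for a vector database; the model name, example documents, and query are illustrative choices, not prescriptions.

```python
# Assumed dependencies: pip install sentence-transformers faiss-cpu
from sentence_transformers import SentenceTransformer
import faiss

# Embedding model: maps text to dense vectors (illustrative model choice).
model = SentenceTransformer("all-MiniLM-L6-v2")

documents = [
    "The cat sat on the mat.",
    "Quarterly revenue grew by 12 percent.",
    "A kitten rested on the rug.",
]

# Encode documents; normalize so inner product equals cosine similarity.
doc_vectors = model.encode(documents, normalize_embeddings=True)

# Vector store: a flat inner-product index stands in for a vector database.
index = faiss.IndexFlatIP(doc_vectors.shape[1])
index.add(doc_vectors)

# Semantic retrieval: embed the query and fetch the closest documents.
query_vector = model.encode(["Where is the cat?"], normalize_embeddings=True)
scores, ids = index.search(query_vector, 2)
for score, i in zip(scores[0], ids[0]):
    print(f"{score:.3f}  {documents[i]}")
```

A production system would swap the in-memory index for a dedicated vector database, but the store-and-retrieve flow stays the same: embed, index, then search by vector similarity.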
Embedding models
Embedding models are AI models that map text and other data types, such as images and audio, into high-dimensional vector representations. These vectors capture the semantic meaning of the input data, enabling efficient similarity comparisons and semantic search.
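The following sketch illustrates this property: semantically related sentences end up with more similar vectors than unrelated ones. Again, the sentence-transformers library, the model name, and the example sentences are assumptions for illustration.

```python
from sentence_transformers import SentenceTransformer, util

# Illustrative open-source embedding model; any embedding model works similarly.
model = SentenceTransformer("all-MiniLM-L6-v2")

sentences = [
    "A dog is playing in the park.",    # anchor
    "A puppy runs across the grass.",   # semantically close
    "The stock market closed higher.",  # semantically distant
]
vectors = model.encode(sentences)

# Cosine similarity: related sentences score higher than unrelated ones.
print(util.cos_sim(vectors[0], vectors[1]).item())  # high similarity
print(util.cos_sim(vectors[0], vectors[2]).item())  # low similarity
```

Note that the first two sentences share almost no words, yet their vectors are close: the comparison operates on meaning, not on surface tokens.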