LangChain’s chat history
LangChain’s chat history is a memory component that stores previous exchanges between the user and the assistant so that the Large Language Model (LLM) can draw on past context in multi-turn conversations. It is defined by the BaseChatMessageHistory interface, whose simplest implementations keep an in-memory buffer. Each turn is serialized, and optional metadata (timestamp, user ID, channel) enables per-session retrieval.

When a new request arrives, LangChain inserts the most recent messages, trimmed by a token window or relevance score, into the prompt before invoking the LLM. This continuity lets chatbots remember preferences, follow multi-step flows, and avoid asking repetitive questions, all without retraining the model. Developers configure memory size, summarization frequency, and eviction policies to balance cost against conversational consistency.

Because every memory class follows the same API, teams can swap storage tiers or add encryption for GDPR compliance without affecting downstream chains or agents.
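The pattern described above (a per-session message buffer plus a token-window trim applied just before the LLM call) can be sketched in plain Python. The class and method names below only mirror LangChain's conventions conceptually; this is a standalone illustration under assumed names, not the library's actual API, and the whitespace-based token counter stands in for a real tokenizer.

```python
from dataclasses import dataclass, field

@dataclass
class Message:
    role: str      # "user" or "assistant"
    content: str

@dataclass
class InMemoryHistory:
    """Per-session buffer of prior turns (simplified stand-in for a
    BaseChatMessageHistory-style implementation)."""
    messages: list = field(default_factory=list)

    def add_user_message(self, text: str) -> None:
        self.messages.append(Message("user", text))

    def add_ai_message(self, text: str) -> None:
        self.messages.append(Message("assistant", text))

def trim_to_token_window(messages, max_tokens,
                         count_tokens=lambda m: len(m.content.split())):
    """Keep only the most recent messages whose combined size fits the
    token budget, preserving chronological order."""
    kept, total = [], 0
    for msg in reversed(messages):          # walk newest -> oldest
        total += count_tokens(msg)
        if total > max_tokens:
            break                           # budget exhausted; drop older turns
        kept.append(msg)
    return list(reversed(kept))             # restore oldest -> newest order

# Usage: histories are keyed by a session/user ID for per-session retrieval.
sessions = {}
history = sessions.setdefault("user-42", InMemoryHistory())
history.add_user_message("My name is Ada and I prefer metric units.")
history.add_ai_message("Noted: metric units it is.")
history.add_user_message("How tall is Mont Blanc?")

# Trim to the window, then render the surviving turns into the prompt.
window = trim_to_token_window(history.messages, max_tokens=20)
prompt = "\n".join(f"{m.role}: {m.content}" for m in window)
```

Trimming from the newest message backwards is what makes this a sliding window: when the budget shrinks, the oldest turns are the first to be evicted, which matches the eviction behavior described above.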