Hindsight gives AI agents an advanced long-term memory system, addressing core challenges: context lost between sessions, the inability to reason about time, difficulty connecting fragmented knowledge, the lack of personalized "opinion" memory, and shallow contextual understanding of information.
Hindsight supports complex temporal queries, automatically connects and reasons over scattered facts to answer indirect questions, lets memory banks form and apply user preferences ("opinions") that influence recommendations, and adapts its understanding of information to different memory banks or agent "personalities." It offers a feature-rich API and a visual control plane, and supports lightweight embedded deployment. Core capabilities include memory storage, retrieval driven by complex logic such as temporal reasoning, and the generation of comprehensive insights.
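To picture what "temporal reasoning" over memories means, here is a toy, self-contained sketch (not Hindsight's implementation): each fact is stored with the date it was learned, and a question about the present is answered by preferring the newest fact.

```python
from datetime import date

# Toy illustration only -- not Hindsight's internals.
# Each "memory" pairs a fact with the date it was learned.
memories = [
    (date(2023, 3, 1), "Alice works at Acme"),
    (date(2024, 6, 15), "Alice works at Google"),
]

def most_recent_fact(memories):
    """Answer 'what is true now?' by preferring the newest fact."""
    return max(memories, key=lambda m: m[0])[1]

print(most_recent_fact(memories))  # -> Alice works at Google
```

A real memory system must also handle conflicting sources and fuzzy time references, but the core idea is the same: facts carry time, and retrieval respects it.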
Full Hindsight documentation: vectorize-io.github.io/hindsight.
The Docker deployment delivers the full API and control plane UI experience:
export OPENAI_API_KEY=your-key
docker run -p 8888:8888 -p 9999:9999 \
-e HINDSIGHT_API_LLM_PROVIDER=openai \
-e HINDSIGHT_API_LLM_API_KEY=$OPENAI_API_KEY \
-e HINDSIGHT_API_LLM_MODEL=gpt-4o-mini \
-v $HOME/.hindsight-docker:/home/hindsight/.pg0 \
ghcr.io/vectorize-io/hindsight
API endpoint: http://localhost:8888
Control plane UI: http://localhost:9999
After deployment, use the Python client:
pip install hindsight-client
from hindsight import HindsightClient
client = HindsightClient(base_url="http://localhost:8888")
# Store memories
client.retain(bank_id="my-agent", content="Alice works as a software engineer at Google.")
client.retain(bank_id="my-agent", content="Alice mentioned she enjoys hiking in the mountains.")
# Recall memories relevant to a query
results = client.recall(bank_id="my-agent", query="What is Alice's job?")
# Get a synthesized perspective response
response = client.reflect(bank_id="my-agent", query="Tell me about Alice.")
print(response.text)
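The reflect call synthesizes an answer from the two separate facts retained earlier. Conceptually, that requires connecting fragments that mention the same entity; a toy sketch of that idea (not Hindsight's actual retrieval logic) looks like this:

```python
# Toy sketch of connecting fragmented facts about one entity.
# Not Hindsight's retrieval logic -- just the concept.
memories = [
    "Alice works as a software engineer at Google.",
    "Alice mentioned she enjoys hiking in the mountains.",
    "Bob prefers cycling.",
]

def gather(entity, memories):
    """Collect every stored fragment that mentions the entity."""
    return [m for m in memories if entity in m]

summary = " ".join(gather("Alice", memories))
print(summary)
```

Hindsight does this with semantic retrieval and an LLM rather than substring matching, so indirect questions ("Tell me about Alice.") can draw on fragments that never repeat the question's wording.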
For quick prototyping, run it in-process:
pip install hindsight-all
export OPENAI_API_KEY=your-key
import os
from hindsight import HindsightServer, HindsightClient
with HindsightServer(llm_provider="openai", llm_model="gpt-4o-mini", llm_api_key=os.environ["OPENAI_API_KEY"]) as server:
    client = HindsightClient(base_url=server.url)
    client.retain(bank_id="my-user", content="The user prefers functional programming.")
    response = client.reflect(bank_id="my-user", query="What programming style should I use?")
    print(response.text)
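The embedded example above shows an "opinion" (a stored preference) shaping an answer. As a toy, self-contained sketch of the concept (again, not Hindsight's implementation), a retained opinion can bias a choice among otherwise equal options:

```python
# Toy sketch: a retained "opinion" biasing a recommendation.
# Not Hindsight's implementation -- just the concept.
opinions = {"paradigm": "functional programming"}

def recommend(options, opinions):
    """Prefer an option that matches a stored opinion; else take the first."""
    preferred = [o for o in options if o in opinions.values()]
    return preferred[0] if preferred else options[0]

styles = ["object-oriented programming", "functional programming"]
print(recommend(styles, opinions))  # -> functional programming
```

Because opinions live in a memory bank, different banks (or agent "personalities") can hold different opinions and produce different recommendations from the same query.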