
getmem.ai Memory

by @nimblev2023

Persistent memory for AI agents via getmem.ai. Call mem.get() before each LLM call to inject context, and mem.ingest() after each turn to save the conversation.

Version v1.0.2
💡 Examples

import os
import getmem_ai as getmem

mem = getmem.init(os.environ["GETMEM_API_KEY"])

# Before each LLM call: get relevant memory context
result = mem.get(user_id, query=user_message)
context = result["context"]  # inject into the system prompt

# After each turn: save both user and assistant messages
mem.ingest(user_id, messages=[
    {"role": "user", "content": user_message},
    {"role": "assistant", "content": reply},
])
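Putting the two calls together, one chat turn looks like the sketch below. This is illustrative, not official getmem.ai usage: the `{"context": ...}` result shape follows the snippet above, and `call_llm` is a hypothetical placeholder for your own LLM client.

```python
def chat_turn(mem, call_llm, user_id, user_message):
    """One memory-backed chat turn (sketch; call_llm is a placeholder)."""
    # 1. Retrieve memory relevant to this user and message.
    result = mem.get(user_id, query=user_message)
    context = result["context"]

    # 2. Inject the retrieved context into the system prompt.
    system_prompt = f"Known facts about this user:\n{context}"
    reply = call_llm(system_prompt, user_message)

    # 3. Persist both sides of the turn for future recall.
    mem.ingest(user_id, messages=[
        {"role": "user", "content": user_message},
        {"role": "assistant", "content": reply},
    ])
    return reply
```

Keeping the turn in a single helper like this makes it hard to forget the `ingest` step, which is what keeps memory current across sessions.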

βš™οΈ Configuration

Set your API key in the environment:

export GETMEM_API_KEY=gm_live_YOUR_KEY_HERE

Get your key at https://platform.getmem.ai ($20 free credit on signup).
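Since `getmem.init()` takes the key directly (per the example above), it helps to fail fast with a clear message when the variable is missing. A minimal sketch; the helper name and error text are illustrative, not part of the SDK:

```python
import os

def load_api_key():
    """Return GETMEM_API_KEY from the environment, or fail with a clear error."""
    key = os.environ.get("GETMEM_API_KEY")
    if not key:
        raise RuntimeError(
            "GETMEM_API_KEY is not set; export it before calling getmem.init()"
        )
    return key
```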

Install via ClawHub:

clawhub install getmem

🧪 Use this skill with your agent

If you already have an agent, pick your environment, install the skill or copy the workflow into it, then run a quick smoke-test prompt to verify memory is being saved and recalled.

πŸ” Can't find the right skill?

Search 60,000+ AI agent skills β€” free, no login needed.

Search Skills β†’