
Prompt Cache

by @nissan

SHA-256 prompt deduplication for LLM and TTS calls — hash normalized prompts, check the cache before calling APIs, and store results for instant replay. Use when maki...

Version: v1.0.0
Installs: 2
💡 Examples
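The listing describes the core trick as hashing normalized prompts with SHA-256 so that trivially different prompts deduplicate to the same cache entry. A minimal sketch of how such a cache key might be derived — the normalization rules (lowercase, collapsed whitespace) and field names here are assumptions, not the skill's actual implementation:

```python
import hashlib
import json

def cache_key(prompt: str, child_name: str, language: str) -> str:
    """Build a deterministic SHA-256 key from normalized inputs (illustrative)."""
    # Assumed normalization: lowercase and collapse whitespace, so
    # "Tell me a story" and "  tell  me a STORY " hash identically.
    normalized = " ".join(prompt.lower().split())
    # Serialize all key fields with sorted keys for a stable byte layout.
    payload = json.dumps(
        {"prompt": normalized, "child_name": child_name, "language": language},
        sort_keys=True,
    )
    return hashlib.sha256(payload.encode("utf-8")).hexdigest()
```

Because the key covers every parameter that changes the output (prompt, child name, language), a hit guarantees the cached result is the one this exact call would have produced.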

```python
import prompt_cache

async def get_story(prompt: str, child_name: str, language: str) -> str:
    # Check the cache before calling the expensive API
    cached = await prompt_cache.get_cached(
        prompt=prompt,
        child_name=child_name,
        language=language,
    )
    if cached:
        return cached  # Free! No API call needed.

    # Cache miss — call the API
    result = await generate_story(prompt, child_name, language)

    # Store for next time
    await prompt_cache.set_cached(prompt, child_name, language, result)
    return result
```

View on ClawHub
TERMINAL
clawhub install prompt-cache

🧪 Use this skill with your agent

Most visitors already have an agent. Pick your environment, install or copy the workflow, then run the smoke-test prompt above.

🔍 Can't find the right skill?

Search 60,000+ AI agent skills — free, no login needed.

Search Skills →