ClawHub
Nirvana Plugin
by @shivaclaw
Local-first, privacy-first inference. Your OpenClaw agent thinks locally and asks the cloud intelligently. Saves 85%+ tokens, protects privacy, agent learns f...
When to Use

Configuration
Basic Setup
```json
{
  "nirvana": {
    "mode": "local-first",
    "local_model": {
      "provider": "ollama",
      "endpoint": "http://ollama:11434",
      "model": "qwen2.5:7b",
      "timeout_ms": 180000
    },
    "routing": {
      "local_threshold": 0.75,
      "max_local_tokens": 8000,
      "cloud_fallback": true
    },
    "privacy": {
      "strip_soul": true,
      "strip_user": true,
      "strip_memory": true,
      "audit_logging": true
    }
  }
}
```
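The routing settings above can be read as a simple decision rule: answer locally when the local model is confident enough and the prompt fits, otherwise escalate to the cloud. Here is a minimal sketch of that rule; the function name and the idea of a numeric confidence score are illustrative assumptions, not the plugin's actual implementation.

```python
# Hypothetical sketch of how the "routing" settings might be applied.
# `route_request` and the confidence score are assumptions for
# illustration; only the config keys come from the example above.

def route_request(confidence: float, prompt_tokens: int, config: dict) -> str:
    """Return "local" or "cloud" for a single request."""
    routing = config["routing"]
    fits_locally = prompt_tokens <= routing["max_local_tokens"]
    confident = confidence >= routing["local_threshold"]
    if fits_locally and confident:
        return "local"
    # Escalate only if cloud fallback is enabled; otherwise stay local.
    return "cloud" if routing["cloud_fallback"] else "local"

config = {
    "routing": {
        "local_threshold": 0.75,
        "max_local_tokens": 8000,
        "cloud_fallback": True,
    }
}
```

With these defaults, a 500-token prompt scored at 0.9 stays local, while the same prompt scored at 0.5 (or any prompt over 8000 tokens) goes to the cloud.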
Custom Local LLM (Non-Ollama)
```json
{
  "nirvana": {
    "local_model": {
      "provider": "custom",
      "endpoint": "http://your-llm-server:5000",
      "api_format": "openai-compatible",
      "model": "your-model-name",
      "timeout_ms": 120000
    }
  }
}
```
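With `"api_format": "openai-compatible"`, the plugin presumably talks to the server using the standard OpenAI chat-completions wire format. The sketch below builds (but does not send) such a request from the config above; the `/v1/chat/completions` path and payload shape are the conventional OpenAI-compatible format, assumed rather than confirmed for this plugin.

```python
import json
from urllib.request import Request

# Build an OpenAI-compatible chat request for the custom endpoint
# configured above. The /v1/chat/completions route and payload shape
# are assumptions based on the "openai-compatible" api_format setting.

def build_chat_request(config: dict, prompt: str) -> Request:
    local = config["local_model"]
    payload = {
        "model": local["model"],
        "messages": [{"role": "user", "content": prompt}],
    }
    return Request(
        url=local["endpoint"].rstrip("/") + "/v1/chat/completions",
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
        method="POST",
    )

config = {
    "local_model": {
        "endpoint": "http://your-llm-server:5000",
        "model": "your-model-name",
    }
}
req = build_chat_request(config, "Summarize this file.")
```

Any server that accepts this request shape (llama.cpp's server, vLLM, LM Studio, and similar) should work as a drop-in for the `custom` provider.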
Installation

```shell
clawhub install project-nirvana-plugin
```