ClawHub
RamaLama CLI
by @ieaves
Run and interact with AI agents.
Examples
Start with top-level discovery:
ramalama --help
ramalama version
Apply global options before the subcommand when needed:
ramalama [--debug|--quiet] [--dryrun] [--engine podman|docker] [--nocontainer] [--runtime llama.cpp|vllm|mlx] [--store ] ...
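A minimal sketch of that ordering, assuming ramalama is installed; "tinyllama" is an illustrative model name, not taken from the listing. --dryrun prints the container command that would run instead of executing it:

```shell
# Guarded so the sketch degrades gracefully where ramalama is absent.
if command -v ramalama >/dev/null 2>&1; then
  # Global options (--engine, --runtime, --dryrun) come before the subcommand.
  ramalama --engine podman --runtime llama.cpp --dryrun run tinyllama
else
  echo "ramalama not installed"
fi
```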
Use command-level help before invoking unknown flags:
ramalama <subcommand> --help
Tips & Best Practices
- serve exposes an OpenAI-compatible endpoint for external clients.
- Prefer JSON output (list --json, inspect --json) for robust parsing in automation.
- Use ramalama chat --url when the model is already served elsewhere.
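The JSON output is what makes scripting practical. A minimal parsing sketch, assuming list --json emits an array of objects with a "name" field; the sample data and field names are illustrative assumptions, not taken from the RamaLama docs:

```python
import json

# Hypothetical capture of `ramalama list --json`; in a real script you
# would obtain it with, e.g.:
#   raw = subprocess.check_output(["ramalama", "list", "--json"], text=True)
raw = '[{"name": "ollama://tinyllama", "size": "2.2 GB"}]'

# Parse once, then work with plain Python structures.
models = json.loads(raw)
names = [m["name"] for m in models]
print(names)
```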
clawhub install ramalama-cli