
RamaLama CLI

by @ieaves

Run and interact with AI models.

Version v1.0.0
💡 Examples

Start with top-level discovery:

ramalama --help
ramalama version

Apply global options before the subcommand when needed:

ramalama [--debug|--quiet] [--dryrun] [--engine podman|docker] [--nocontainer] [--runtime llama.cpp|vllm|mlx] [--store PATH] <subcommand> ...
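For instance, a minimal sketch combining global options with a subcommand. The model name `smollm` is a placeholder for illustration; substitute one from your own store:

```shell
# Preview the generated container invocation without executing it
# ("smollm" is an assumed model name, not part of the skill page).
ramalama --dryrun --engine podman --runtime llama.cpp run smollm

# Same model, served quietly via Docker instead of Podman:
ramalama --quiet --engine docker serve smollm
```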

Use command-level help before invoking unknown flags:

ramalama <subcommand> --help

📋 Tips & Best Practices

  • serve exposes an OpenAI-compatible endpoint for external clients.
  • Prefer JSON output flags where available (list --json, inspect --json) for robust parsing in automation.
  • Use ramalama chat --url when the model is already served elsewhere.
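Once `ramalama serve` is running, any OpenAI-style client can talk to it. A minimal sketch using only the Python standard library; the port (8080, RamaLama's default) and the model name "smollm" are assumptions for illustration:

```python
import json

# Assumed base URL: ramalama serve listens on port 8080 by default.
BASE_URL = "http://localhost:8080/v1"

def build_chat_request(model, prompt):
    """Build an OpenAI-style chat-completions request body."""
    return {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }

payload = build_chat_request("smollm", "Say hello in one word.")
print(json.dumps(payload))

# To actually send it (requires a running `ramalama serve`):
#   import urllib.request
#   req = urllib.request.Request(
#       f"{BASE_URL}/chat/completions",
#       data=json.dumps(payload).encode(),
#       headers={"Content-Type": "application/json"},
#   )
#   print(urllib.request.urlopen(req).read().decode())
```

The same endpoint also works with `ramalama chat --url http://localhost:8080/v1` or any OpenAI SDK pointed at that base URL.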
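When scripting around `ramalama list --json`, parse the JSON rather than scraping table text. A sketch under the assumption that each entry carries a `name` field; verify the actual schema on your install first:

```python
import json

# Stand-in for the real command output; on a machine with ramalama
# installed you would capture it instead, e.g. with subprocess:
#   raw = subprocess.run(["ramalama", "list", "--json"],
#                        capture_output=True, text=True, check=True).stdout
raw = '[{"name": "ollama://smollm:135m", "size": "92.0 MB"}]'

models = json.loads(raw)
names = [m["name"] for m in models]  # assumed "name" field
print(names)
```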

    πŸ” Can't find the right skill?

    Search 60,000+ AI agent skills β€” free, no login needed.

    Search Skills β†’