
Local AI Stack

by @jaysclawd-cloud

Transform your Mac into an offline AI workstation with Ollama and OpenCode, running curated local models for coding and reasoning without internet or API costs.
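The whole setup boils down to installing Ollama and pulling a model. A minimal sketch for macOS, assuming Homebrew is available (each step is guarded so it no-ops on machines without the tools; if the server is not already running, start it with `ollama serve`):

```shell
# One-time setup sketch for macOS (assumes Homebrew).
MODEL="qwen2.5-coder"

# Install Ollama if it is not already present
command -v ollama >/dev/null 2>&1 || brew install ollama

# Pull the model and run a quick offline smoke test
if command -v ollama >/dev/null 2>&1; then
  ollama pull "$MODEL"
  ollama run "$MODEL" "Write a Python function that reverses a string"
else
  echo "Install Ollama first: https://ollama.com/download"
fi
```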

Version v1.0.0
πŸ’‘ Examples

Ollama Commands

# Run a local model
ollama run qwen2.5-coder "Write a Python function..."

# List installed models
ollama list

# Pull the latest model version
ollama pull qwen2.5-coder

# Remove a model
ollama rm mistral
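Everything `ollama run` does is also available over Ollama's local REST API on port 11434, which is handy for scripting. A sketch using the documented `/api/generate` endpoint, guarded so the request only fires when the server is actually up:

```shell
# Query a local model over Ollama's REST API (no internet required).
PROMPT="Write a haiku about Rust"
BODY=$(printf '{"model":"qwen2.5-coder","prompt":"%s","stream":false}' "$PROMPT")

# /api/version is a cheap liveness check for the local server
if curl -sf http://localhost:11434/api/version >/dev/null 2>&1; then
  curl -s http://localhost:11434/api/generate -d "$BODY"
else
  echo "Ollama server not running (start it with: ollama serve)"
fi
```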

OpenCode Commands

# Interactive coding session
opencode

# Single command
opencode run "Write a React component" --model opencode/big-pickle

# List available models
opencode models

# Help
opencode --help
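OpenCode can also be pointed at the local Ollama server instead of a hosted model, since Ollama exposes an OpenAI-compatible endpoint at `http://localhost:11434/v1`. The exact config location and schema vary by OpenCode version, so treat the following as a hypothetical sketch: the file is written to a demo path, and the `provider` keys are assumptions to verify against the OpenCode docs.

```shell
# Hypothetical opencode.json pointing at Ollama's OpenAI-compatible endpoint.
# Demo path only -- the real config location depends on your OpenCode version.
CONFIG_DIR="${TMPDIR:-/tmp}/opencode-demo"
mkdir -p "$CONFIG_DIR"

cat > "$CONFIG_DIR/opencode.json" <<'EOF'
{
  "provider": {
    "ollama": {
      "options": { "baseURL": "http://localhost:11434/v1" },
      "models": { "qwen2.5-coder": {} }
    }
  }
}
EOF
```

With a provider configured along these lines, the model would be addressed as something like `ollama/qwen2.5-coder` in `opencode run --model` — again, verify the identifier format against your version's docs.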

πŸ“‹ Tips & Best Practices

  • Models load into RAM on first use and unload after sitting idle
  • Only one model runs at a time by default
  • For best performance, use an Apple Silicon Mac with 24GB+ RAM
  • View on ClawHub, or install directly: clawhub install local-ai-stack
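The RAM behavior in the tips above can be inspected and controlled from the CLI: `ollama ps` shows which models are currently loaded (and when they will unload), and `ollama stop` evicts one immediately instead of waiting for the idle timeout. A guarded sketch:

```shell
# Show loaded models and their unload timers, then free one up.
if command -v ollama >/dev/null 2>&1; then
  ollama ps                  # loaded models, size, time until unload
  ollama stop qwen2.5-coder  # unload the model from RAM right away
  STATUS="checked"
else
  STATUS="ollama not installed"
fi
echo "$STATUS"
```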


    πŸ” Can't find the right skill?

    Search 60,000+ AI agent skills β€” free, no login needed.

    Search Skills β†’