📦 ClawHub
Hallucination Detector
by @charlie-morrison
Detect and flag hallucinations in LLM outputs by cross-referencing claims against source documents, code, and verifiable data. Essential for RAG pipelines an...
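The core idea of cross-referencing can be illustrated with a minimal sketch (this is not the skill's actual implementation, and the function names and overlap heuristic here are illustrative assumptions): split generated text into claims, then flag claims whose content words have little support in the source document.

```python
# Minimal sketch of cross-referencing (illustrative only, not the skill's
# real implementation): flag claims whose content words are poorly
# supported by the source text.
import re


def content_words(text: str) -> set[str]:
    """Lowercased words of 4+ letters, a crude proxy for content terms."""
    return {w for w in re.findall(r"[a-z]+", text.lower()) if len(w) >= 4}


def flag_unsupported(summary: str, source: str, threshold: float = 0.5) -> list[str]:
    """Return claims whose content-word overlap with the source is below threshold."""
    source_words = content_words(source)
    flagged = []
    for claim in re.split(r"(?<=[.!?])\s+", summary.strip()):
        words = content_words(claim)
        if not words:
            continue
        support = len(words & source_words) / len(words)
        if support < threshold:
            flagged.append(claim)
    return flagged


source = "The parser reads JSON config files and validates required fields."
summary = ("The parser reads JSON config files. "
           "It also uploads telemetry to a remote server.")
print(flag_unsupported(summary, source))
# → ['It also uploads telemetry to a remote server.']
```

A real detector would use entailment models or retrieval rather than word overlap, but the shape is the same: decompose output into claims, then score each claim against verifiable evidence.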
💡 Examples
"Check this AI-generated summary for hallucinations against the source docs"
"Verify the code documentation matches the actual implementation"
"Detect hallucinations in this RAG pipeline output"
"Fact-check this AI-generated API reference"
"Review these generated release notes for accuracy"
TERMINAL
clawhub install cm-hallucination-detector