Data Quality Validator
by @charlie-morrison
Validate data quality in pipelines by checking completeness, consistency, freshness, accuracy, and distribution anomalies. Define expectations, profile data...
Invoke this skill when you need to validate data quality, define quality expectations, detect anomalies, or audit pipeline data.
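To make "define quality expectations" concrete, here is a minimal sketch of what an expectation and a completeness check could look like. The `Expectation` dataclass and `check_completeness` function are hypothetical illustrations, not the skill's actual interface.

```python
# Hypothetical sketch of a quality expectation; the skill's actual
# expectation format is not documented here.
from dataclasses import dataclass

@dataclass
class Expectation:
    column: str
    check: str          # e.g. "not_null", "unique", "in_range"
    threshold: float    # minimum fraction of rows that must pass

def check_completeness(rows, expectation):
    """Return True if the non-null fraction of a column meets the threshold."""
    values = [row.get(expectation.column) for row in rows]
    non_null = sum(v is not None for v in values)
    return (non_null / len(values)) >= expectation.threshold

orders = [
    {"order_id": 1, "amount": 19.99},
    {"order_id": 2, "amount": None},
    {"order_id": 3, "amount": 42.50},
]
exp = Expectation(column="amount", check="not_null", threshold=0.5)
print(check_completeness(orders, exp))  # 2/3 rows non-null, meets 0.5
```

In practice the threshold lets you distinguish hard requirements (1.0 for a primary key) from tolerances (e.g. 0.95 for an optional field).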
Basic invocation:
> Validate data quality for the orders pipeline
> Set up data quality checks for /path/to/data/
> Audit this dataset for completeness and consistency
Focused analysis:
> Check data freshness across all pipeline outputs
> Profile distributions for anomaly detection
> Detect schema drift between pipeline runs
> Generate a data quality scorecard for stakeholders
The agent reads data source definitions, pipeline code, schema files, and sample data, then produces a comprehensive data quality assessment with actionable expectations.
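Two of the checks such an assessment typically covers, freshness and distribution anomalies, can be sketched as follows. This is an illustrative example under assumed semantics, not the skill's actual implementation.

```python
# Illustrative freshness and distribution-anomaly checks (assumed semantics).
from datetime import datetime, timedelta
from statistics import mean, stdev

def is_fresh(last_updated, max_age, now=None):
    """Freshness: the newest record must be no older than max_age."""
    now = now or datetime.now()
    return (now - last_updated) <= max_age

def zscore_anomalies(values, threshold=3.0):
    """Distribution check: flag values more than `threshold` std devs from the mean."""
    mu, sigma = mean(values), stdev(values)
    if sigma == 0:
        return []
    return [v for v in values if abs(v - mu) / sigma > threshold]

now = datetime(2024, 1, 2)
print(is_fresh(datetime(2024, 1, 1), timedelta(days=2), now=now))  # within 2 days
amounts = [10.0, 11.0, 9.5, 10.5, 500.0]
print(zscore_anomalies(amounts, threshold=1.5))  # the 500.0 outlier is flagged
```

A real pipeline would read thresholds from the defined expectations and roll the pass/fail results up into the quality scorecard.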
clawhub install data-quality-validator