The research problem
- Rapid expansion of AI education pilots across classrooms, districts, libraries, and after-school settings.
- Evidence is fragmented and hard to compare across sites.
- Implementation details and “failures” are rarely preserved.
AAB fills the "implementation layer" gap: a structured public memory that complements peer-reviewed publication.
What researchers gain
- Structured, comparable case data across contexts
- Cross-institutional visibility beyond one’s own site
- Access to early-stage pilots (before publication)
- Longitudinal evidence preservation
- Research signal detection via recurring patterns
- Documentation entries that can be cited
Core documentation infrastructure
Case Registry
- Real-world implementations
- Context, constraints, process, outcomes
- Successes and failures
Pilot Registry
- Early experimentation
- Evaluation design and risk mitigation
- Observed impact (documented, not marketed)
Emerging Consensus Layer
- Identify recurring patterns (enabling conditions, safety practices, assessment signals)
- Extract patterns, not certifications
Evidence Maturity Index (EMI)
EMI helps researchers interpret documentation rigor and filter by maturity—clarity, not certification.
- Level 1: Exploratory framework
- Level 2: Emerging consensus draft
- Level 3: Validated provisional standard
- Level 4: Mature normative standard
EMI supports meta-analysis readiness and shared vocabulary across disciplines.
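Because the four EMI levels are ordered, a researcher could filter entries by maturity with a simple ordered enum. A minimal sketch, assuming hypothetical entry names and tags; only the level names and numbers come from the list above.

```python
from enum import IntEnum

class EMI(IntEnum):
    """Evidence Maturity Index levels as listed in the document."""
    EXPLORATORY = 1            # Level 1: Exploratory framework
    EMERGING_CONSENSUS = 2     # Level 2: Emerging consensus draft
    VALIDATED_PROVISIONAL = 3  # Level 3: Validated provisional standard
    MATURE_NORMATIVE = 4       # Level 4: Mature normative standard

# Hypothetical entries tagged with a maturity level
entries = [
    ("chatbot-pilot", EMI.EXPLORATORY),
    ("consent-template", EMI.VALIDATED_PROVISIONAL),
    ("safety-checklist", EMI.MATURE_NORMATIVE),
]

# Filter for meta-analysis readiness: keep Level 3 and above
mature = [name for name, level in entries if level >= EMI.VALIDATED_PROVISIONAL]
# mature == ["consent-template", "safety-checklist"]
```

An `IntEnum` makes the "filter by maturity" idea concrete: levels compare numerically, so a maturity threshold is a single comparison.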
IRB, data governance & privacy alignment
- AAB does not replace institutional IRB review.
- Entries can indicate IRB status, consent procedures, and governance alignment.
- AAB documents metadata—not raw student data.
Transparency strengthens public trust.
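The governance points above amount to a metadata record attached to each entry. A minimal sketch, with hypothetical keys and values (AAB's actual fields are not specified here); the one grounded constraint is that no raw student data appears.

```python
# Governance metadata an entry might carry (keys are illustrative,
# not AAB's actual schema). Only metadata is stored, never raw student data.
governance_metadata = {
    "irb_status": "approved",        # e.g., "exempt", "pending", "not_required"
    "consent_procedure": "opt-in parental consent",
    "data_governance": "district data-protection agreement in place",
    "raw_student_data_included": False,  # always False by design
}
```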
How to contribute
Submit documented cases
- From funded research projects
- From classroom pilots
- From collaborative field trials
Join working groups
- Documentation standards refinement
- Ethical reporting guidance
- Safety signal tracking