PyOPPE · Home

A plain-language guide to what coding exam data reveals

This project studies anonymized Python exam activity from 2025. It helps non-technical readers understand where students get stuck, which kinds of mistakes repeat, and which exam improvements could help real learners.

What this repository does

Think of this as a research notebook plus a public report site: it starts from raw logs, rebuilds metrics with scripts, and publishes easy-to-read findings for educators and program teams.

1) Collects evidence

Uses anonymous event data like test runs and submissions from coding exams.
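To make "event data" concrete, here is a minimal sketch of what one anonymized record and a simple count over records might look like. The field names (student_id, kind, passed, and so on) are illustrative assumptions, not the repo's actual schema.

```python
# Hypothetical shape of one anonymized exam event.
# All field names here are assumptions for illustration only.
event = {
    "student_id": "anon-001",            # anonymized identifier, not a real ID
    "kind": "test_run",                  # e.g. "test_run" or "submission"
    "question": "q3",
    "passed": False,
    "timestamp": "2025-03-14T10:22:05Z",
}

def count_events(events, kind):
    """Count how many events of a given kind appear in a log."""
    return sum(1 for e in events if e["kind"] == kind)

events = [
    event,
    {**event, "kind": "submission", "passed": True},
]
print(count_events(events, "test_run"))   # -> 1
print(count_events(events, "submission")) # -> 1
```

Even this small shape is enough to answer landing-page questions such as "how many test runs happen before a passing submission."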

2) Rebuilds results reproducibly

All major numbers and tables come from scripts in this repo, so they can be rerun and audited.

3) Turns data into decisions

Publishes stories and practical recommendations: teaching support, exam design fixes, and intervention targeting.

Explore pages

Start with the core stories. Then browse alternate drafts and other resources.

Built for future additions

This home page is data-driven: to add new cards or new sections (analysis or non-analysis), update a single list in the page script.

Edit HOME_SECTIONS in this file to add more destinations.
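As a sketch of that idea, the list below shows what a HOME_SECTIONS entry might contain. The keys ("title", "blurb", "href", "kind") and the example destinations are assumptions for illustration; the actual list lives in the page script, which may be JavaScript and may use different names.

```python
# Illustrative sketch only: the real HOME_SECTIONS in the page script
# may use different keys and values.
HOME_SECTIONS = [
    {
        "title": "Where students get stuck",        # hypothetical card
        "blurb": "Core story on repeated failing test runs.",
        "href": "stuck.html",
        "kind": "analysis",
    },
    {
        "title": "About this project",              # hypothetical card
        "blurb": "Methods, anonymization, and reproducibility notes.",
        "href": "about.html",
        "kind": "non-analysis",
    },
]

def render_cards(sections):
    """Turn each entry into a plain-text card; the real page renders HTML."""
    return [f"[{s['kind']}] {s['title']} -> {s['href']}" for s in sections]

for card in render_cards(HOME_SECTIONS):
    print(card)
```

The point of the single-list design is that adding a destination is one new dictionary entry, with no changes to the rendering code.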