The Problem
Students can memorize, pass tests, and still not really understand.
Confidence ≠ Comprehension
Students often feel confident about concepts they don't truly understand. Surface familiarity masks deeper gaps.
Name-Dropping vs Understanding
Mentioning terminology doesn't prove understanding. True comprehension reveals itself in causal explanations.
Existing Tools Fail
Tutors check answers. Chatbots provide answers. Neither diagnoses whether you understand the underlying structure.
How It Works
We combine multiple signals to assess whether you truly understand.
Logical Consistency
Do your claims contradict each other? Are there circular definitions?
Uses Natural Language Inference models to detect contradictions and logical flaws.
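A minimal sketch of pairwise consistency checking. The `nli_label` function is a hypothetical stand-in for a real NLI model (e.g. one fine-tuned on MNLI); here a tiny lookup keeps the example self-contained, and the claim texts are illustrative only.

```python
from itertools import combinations

def nli_label(premise: str, hypothesis: str) -> str:
    """Stand-in for an NLI model: returns 'contradiction' or 'neutral'.
    A production system would run a trained model over the pair."""
    known_contradictions = {
        ("Entropy always increases in a closed system.",
         "Entropy can decrease in a closed system."),
    }
    if (premise, hypothesis) in known_contradictions or \
       (hypothesis, premise) in known_contradictions:
        return "contradiction"
    return "neutral"

def find_contradictions(claims: list[str]) -> list[tuple[str, str]]:
    """Run NLI over every unordered pair of claims in an explanation."""
    return [(a, b) for a, b in combinations(claims, 2)
            if nli_label(a, b) == "contradiction"]

claims = [
    "Entropy always increases in a closed system.",
    "Heat flows from hot to cold.",
    "Entropy can decrease in a closed system.",
]
print(find_contradictions(claims))  # flags the entropy pair
```

Checking all pairs is quadratic in the number of claims, which is acceptable because a single explanation rarely contains more than a few dozen claims.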
Concept Dependency Coverage
Are prerequisite concepts mentioned? Do you cover the core components?
Maps your explanation to a canonical concept graph to identify missing links.
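A sketch of coverage checking against a canonical graph. The concept names, the prerequisite edges, and the naive substring matcher are all assumptions of this example; a real system would match concepts with embeddings rather than exact strings.

```python
# Hypothetical canonical concept graph for one topic:
# each concept maps to its prerequisite concepts.
CONCEPT_GRAPH = {
    "gradient descent": ["derivative", "loss function"],
    "loss function": ["prediction", "label"],
    "derivative": [],
    "prediction": [],
    "label": [],
}

def coverage(explanation: str, graph: dict[str, list[str]]) -> dict[str, list[str]]:
    """Classify each concept in the graph as covered or missing,
    using a naive lowercase substring match."""
    text = explanation.lower()
    covered = [c for c in graph if c in text]
    missing = [c for c in graph if c not in text]
    return {"covered": covered, "missing": missing}

explanation = ("Gradient descent updates parameters using the derivative "
               "of the loss function.")
print(coverage(explanation, CONCEPT_GRAPH))
```

The missing list ("prediction", "label" here) is exactly what the concept-graph visualization highlights as gaps.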
Assumption Completeness
Are hidden assumptions made explicit? Do you state boundary conditions?
Identifies implicit assumptions and evaluates whether they're acknowledged.
Explanation Stability
Does your understanding hold when explained differently?
Tests whether explanations remain consistent under reformulation.
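One way to quantify stability, sketched here as an assumption of this example: extract the concepts mentioned in each reformulation and measure their Jaccard overlap. The vocabulary and sample sentences are illustrative.

```python
def concept_set(explanation: str, vocabulary: set[str]) -> set[str]:
    """Concepts from a known vocabulary mentioned in an explanation."""
    text = explanation.lower()
    return {c for c in vocabulary if c in text}

def stability(expl_a: str, expl_b: str, vocabulary: set[str]) -> float:
    """Jaccard overlap of concepts across two reformulations:
    1.0 means the same conceptual content survived rephrasing."""
    a, b = concept_set(expl_a, vocabulary), concept_set(expl_b, vocabulary)
    if not (a | b):
        return 1.0  # nothing to compare; treat as trivially stable
    return len(a & b) / len(a | b)

vocab = {"entropy", "energy", "temperature", "disorder"}
v1 = "Entropy measures disorder and rises with temperature."
v2 = "As temperature increases, so does entropy, i.e. disorder."
print(stability(v1, v2, vocab))  # 1.0: both name the same three concepts
```

A low overlap suggests the first phrasing leaned on memorized wording rather than a stable underlying model.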
What You'll See
Interactive visualizations reveal gaps in understanding
Understanding Score Breakdown
See exactly where your explanation is strong and where it needs work
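The breakdown can be thought of as per-dimension scores rolled into one weighted total. The dimension names, weights, and sample scores below are illustrative assumptions, not the system's actual weighting.

```python
# Hypothetical weights over the four diagnostic dimensions (sum to 1).
WEIGHTS = {"consistency": 0.3, "coverage": 0.3,
           "assumptions": 0.2, "stability": 0.2}

def overall_score(dimension_scores: dict[str, float]) -> float:
    """Weighted average of per-dimension scores in [0, 1]."""
    return round(sum(WEIGHTS[d] * s for d, s in dimension_scores.items()), 3)

scores = {"consistency": 0.9, "coverage": 0.6,
          "assumptions": 0.5, "stability": 0.8}
print(overall_score(scores))  # 0.3*0.9 + 0.3*0.6 + 0.2*0.5 + 0.2*0.8 = 0.71
```

Showing the per-dimension scores alongside the total is what makes a weak spot (coverage here) visible rather than buried in a single number.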
Interactive Concept Graph
Legend: Covered · Weak · Missing
Ethics & Limitations
Transparency is fundamental to our design
What This System IS
- ✓ A diagnostic tool for revealing gaps
- ✓ A self-assessment aid for learners
- ✓ An explanation structure analyzer
What This System IS NOT
- ✗ A judge of intelligence or capability
- ✗ A replacement for teachers
- ✗ An arbiter of absolute truth
Key Limitations
- Scores are probabilistic, not absolute judgments
- Cannot detect understanding in non-verbal or culturally specific contexts
- Requires clear written/spoken explanations to function
- Every score has visible reasoning; we show our work