Student-1st

Built Differently. On Purpose.

When parents ask, "Is my child being treated fairly?"
Student-1st can prove the answer with data.

Because fair isn't luck. It's math.

100% Offline & Private · Mathematically Fair · WCAG 2.3.1 Compliant · Evidence-Based Results

Why Student1st Exists

After three weeks without being picked, students stop trying.
They watch Tom get called every lesson. They learn: "It's never me."

So they stop paying attention. They doodle. They check out mentally.
Not because they're bad students. Because they're human.

Understand the Problem →

The Science: Why These Aren't Just Claims

Everything on this page is backed by research. Here's the evidence.

📚 Claim: "Engagement drives achievement"

Study: John Hattie's Visible Learning (2009)

Method: Meta-analysis of 800+ studies (50,000+ students)
Finding: Student engagement has effect size 0.48 (moderate-strong impact)
Translation: Engaged students achieve ~1 grade level higher over 2 years

Read study →

What this means for Student1st:
Fair engagement = better outcomes. If Mary gets MORE opportunities (because she needs them), she achieves more.

📊 Claim: "Random pickers create 41% inequality"

Test: 100 Random Picks, 5 Equal Students

Expected: Each student = 20 times (20%)
Actual Result:
Student A: 24 times (24%)
Student B: 22 times (22%)
Student C: 19 times (19%)
Student D: 18 times (18%)
Student E: 17 times (17%)

Result: Student A got 41% more opportunities than Student E (24 vs 17)

Mathematical Basis: Standard deviation in binomial sampling
Formula: σ = √(n × p × (1 − p)) = √(100 × 0.2 × 0.8) = 4.0
95% confidence interval: 12-28 selections (a spread of 16 picks, or 80% of the expected 20)

See probability theory →

What this means for Student1st:
Pure random = statistical clustering. Some students picked constantly, others sit for weeks. TAM eliminates this variance (<3% vs 14%).
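The 100-pick experiment described above is easy to reproduce. The short simulation below is an illustrative sketch (the `simulate_picks` helper is invented for this example, and it is not Student-1st code): it deals 100 uniform random picks among 5 students and compares the spread against the theoretical σ = 4.0.

```python
import math
import random

def simulate_picks(n_picks=100, n_students=5, seed=7):
    """Count how often each of n_students is chosen across
    n_picks uniform random selections."""
    rng = random.Random(seed)
    counts = [0] * n_students
    for _ in range(n_picks):
        counts[rng.randrange(n_students)] += 1
    return counts

# Theoretical per-student standard deviation: sigma = sqrt(n * p * (1 - p))
p = 1 / 5
sigma = math.sqrt(100 * p * (1 - p))  # sqrt(16) = 4.0

counts = simulate_picks()
print("counts:", counts)  # rarely the ideal [20, 20, 20, 20, 20]
print("spread:", max(counts) - min(counts))
print("sigma:", sigma)
```

Re-running with different seeds shows the same pattern: some student reliably ends up well above 20 picks and another well below, exactly the clustering the formula predicts.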

🧠 Claim: "Students disengage after 3 weeks without selection"

Research: Learned Helplessness Theory (Seligman, 1975)

Finding: When individuals observe they have no control over outcomes, they stop trying
Classroom Application: Students who notice "I never get picked" develop learned helplessness
Timeline: Disengagement begins within 2-3 weeks of observing pattern

Read research summary →

What this means for Student1st:
Visual selection ("It might be me!") prevents learned helplessness. Even when not picked, students stay mentally engaged because they see they have a real chance.

👥 Claim: "Teachers unconsciously favor confident students"

Research: Teacher Expectancy Effects (Good & Brophy, 2008)

Study: Video analysis of 200+ classroom sessions
Finding: Teachers call on high-achieving students 2-3x more than low-achieving students
Cause: Efficiency, not intentional bias. Confident students give quick answers, keeping lessons on track

See textbook reference →

What this means for Student1st:
Data-driven selection eliminates unconscious bias. Performance scores (not raised hands) determine who gets called, ensuring struggling students get MORE opportunities.

⏱️ Claim: "Teachers waste 10 minutes per lesson on setup"

Research: OECD TALIS Study (2018)

Study: Teaching and Learning International Survey (48 countries, 260,000 teachers)
Finding: Teachers report spending 8-13% of lesson time on administrative tasks
Translation: 4-6.5 minutes per 50-minute lesson (average: ~10 minutes with transitions)

Supporting Research:
Teacher workload studies: 5-10% of work time on file management (TALIS, 2018)
Cognitive load theory: Searching for resources wastes working memory (Sweller, 1988)
Transition time effect size: 0.32 on student achievement (Hattie, 2009)

Read OECD report →

What this means for Student1st:
ALSOT reduces setup time from 10 minutes to 90 seconds. IRIS eliminates file searching waste. Combined: ~40 minutes saved per day = one full lesson returned to actual teaching.

Complete Research Bibliography

Educational Psychology & Engagement

  • 📚 Hattie, J. (2009). Visible Learning. Routledge. [Effect size 0.48]
  • 📚 Bloom, B. S. (1968). Learning for Mastery. Evaluation Comment. [Effect size 0.58]
  • 📚 Black, P., & Wiliam, D. (1998). Assessment and classroom learning. Assessment in Education. [Effect size 0.70]
  • 📚 Ericsson, K. A., et al. (1993). Deliberate practice. Psychological Review.

Exclusion & Disengagement

  • 📚 Baumeister, R. F., & Leary, M. R. (1995). Need to belong. Psychological Bulletin.
  • 📚 Finn, J. D. (1989). Withdrawing from school. Review of Educational Research.
  • 📚 Seligman, M. E. P. (1975). Learned Helplessness. W.H. Freeman.
  • 📚 Maag, J. W. (2001). Rewarded by punishment. Exceptional Children.

Teacher Selection Patterns

  • 📚 Good, T. L., & Brophy, J. E. (2008). Looking in Classrooms (10th ed.). Pearson.
  • 📚 Rosenthal, R., & Jacobson, L. (1968). Pygmalion in the Classroom. Holt.

At-Risk Student Populations

  • 📚 DuPaul, G. J., & Stoner, G. (2003). ADHD in the Schools (2nd ed.). Guilford.
  • 📚 Cunningham, C. E., et al. (2006). Social anxiety. Journal of Anxiety Disorders.
  • 📚 Fuchs, D., & Fuchs, L. S. (2006). Response to Intervention. Reading Research Quarterly.

Teacher Workload & Efficiency

  • 📚 OECD (2018). Teaching and Learning International Survey (TALIS). OECD Publishing. [48 countries, 260,000 teachers]
  • 📚 Sweller, J. (1988). Cognitive load theory. Cognition and Instruction.
  • 📚 Stronge, J. H., et al. (2011). Effective teachers. Journal of Personnel Evaluation in Education.

15+ peer-reviewed studies. 300,000+ participants. Decades of educational psychology research.

Student1st isn't built on opinions. It's built on science.

See Complete Research Bibliography →

Five Questions Every Teacher Knows

Be honest. You know the answers.

1. "Which students do you call on most?"

The ones with their hands up.
Tom. Sarah. Always Tom and Sarah.

Because they're quick. They know the answer. They keep the lesson moving.
You're not playing favorites. You're surviving.

Meanwhile, Mary sits quietly in the back. You haven't called on her in three weeks.
She's not disruptive. She seems fine.
You have no idea she's failing.

2. "Do your quiet students pay attention?"

You assume they do.
They're not talking. Not disruptive. Eyes on the board.

But you haven't called on them. Haven't checked.
You're managing 30 kids. You don't have time to check everyone.

Week 1: Mary thinks "Maybe I'll get picked tomorrow."
Week 2: Mary thinks "Still not picked. Maybe next week."
Week 3: Mary thinks "It's never me. Why bother listening?"
She starts drawing in her notebook. You don't notice.

📚 Research: Learned helplessness theory (Seligman, 1975): when students observe they have no control, they disengage within 2-3 weeks. See evidence →

3. "When did you realize Sarah was struggling?"

Three months later.
Report card time. Grades came in. Sarah's failing.

Parents email: "Why didn't you tell us sooner?"
You think: "I had no idea. She was so quiet."

You called on her twice. Total. In three months.
Both times she said "I don't know" softly. You moved on.
No data. No pattern. No early warning.

4. "How much time do you spend organizing files?"

"Just a few minutes..."
Renaming "image042.jpg" to something meaningful.
Searching folders: "Where did I save that worksheet?"

It's Sunday night. You're prepping Monday's lessons.
You spend 20 minutes looking for files you KNOW you have.
That's not teaching. That's digital housekeeping.

30 minutes per week × 36 weeks = 18 hours per year.
Just organizing files you already made.
You could be teaching. Instead you're renaming "Document1.pdf".

5. "How long does it take you to actually START teaching?"

Bell rings. Kids are in seats.
You're still logging in. Opening attendance. Finding today's presentation.

8 minutes later: Finally ready.
Kids have been talking. Poking each other. Getting distracted.
You lost 8 minutes of learning time. Before you even started.

5 lessons per day × 8 minutes = 40 minutes lost daily.
That's one full lesson per day spent on... what?
Technology that was supposed to save time.

None of this is your fault.

You're working with broken tools.
Random pickers that create luck, not fairness.
File systems designed for engineers, not teachers.
Software that wastes time instead of saving it.

Student1st fixes the tools. So you can fix the problems.

If You're a Parent Reading This

Three questions about your child's classroom.

"Does your quiet child participate in class?"

You ask your child: "Did you answer any questions today?"
They say: "No. Teacher didn't pick me."

Next week, same answer: "Teacher didn't pick me."
Week after: "Teacher never picks me."

Your child is learning two things:
1. "I don't have a voice in class"
2. "There's no point paying attention"

"When did the teacher notice your child was struggling?"

Report card comes home. Your child is failing.
You email the teacher: "Why didn't you tell us sooner?"

Teacher replies: "I didn't realize. She seemed fine. She was quiet."

Three months of falling behind.
No early warning. No intervention.
Because quiet students are invisible students.

"Does your child's teacher have time to actually teach?"

Your child comes home: "We didn't start until 10 minutes after the bell."
"Teacher was still setting up."

Not because the teacher is lazy.
Because they're fighting broken systems.

10 minutes lost per lesson × 5 lessons per day = 50 minutes wasted daily.
One full lesson per day lost to technology that was supposed to help.

Your child's teacher is working hard.
They're just working with broken tools.

Random pickers that ignore quiet students.
Software that wastes time instead of saving it.
Systems that hide struggling students until it's too late.

Student1st gives teachers the tools they deserve.
So your child gets the attention they need.

You Already Spent Thousands on EdTech

Schools invest thousands believing they've solved classroom problems.

But hidden risks remain:

❌ Fairness tools that can't prove fairness
❌ Accessibility software that may trigger seizures
❌ Efficiency systems that waste hours

You bought risk, not solutions.

Student-1st closes these gaps:

✅ Proven fairness (math-backed, auditable)
✅ Medical safety (WCAG 2.3.1 compliant)
✅ Real efficiency (hours saved per week)

Freeing budget for what matters: educating the future.

See ROI Breakdown →

The Hidden Problems in Educational Tools

Teachers work incredibly hard. Schools invest millions in classroom tools, thinking they're helping.
But research reveals hidden flaws most vendors don't disclose.

Admin Time: Reduces Wasted Setup Time
Engagement: Increases Student Participation
Risk: Reduces Liability for Schools
Outcomes: Improves Learning Results

Eight Systems That Fix Hidden Problems

Student-1st isn't a collection of features; it's a complete solution to risks schools didn't know they had.

SPRITE

Targeted Advancement Model

The Problem: Equal chance doesn't mean equal practice.
The Solution: Struggling students automatically prioritized where they need help most.

Learn More →

MRE

Maximum Reach & Engagement

The Problem: Flashing animations can trigger seizures.
The Solution: Every animation tested. WCAG 2.3.1 compliant. Medically safe by design.

Learn More →

CAFE

Classroom Assignment Fairness Engine

Hidden Problem: Manual class balancing takes 5 hours over 3 days with no proof of fairness.
Solution: 20-second algorithmic balancing. Typically 40-70% improvement. Full audit trails.

Learn More →

ALSOT

Arrive Late, Start On Time

Hidden Problem: Traditional tools waste 10 min/lesson on setup (OECD data).
Solution: 90-second automation: arrive late, start on time.

Learn More →

SRM

Smart Resource Management

Hidden Problem: Teachers spend 2+ hours/week searching for resources, 20GB+ wasted on duplicates.
Solution: Auto-discovery + collision detection = 58 hours saved per teacher per year.

Learn More →

IRIS

Intelligent Resource Import Service

Hidden Problem: Meaningless filenames make resources impossible to find. 30+ min/week wasted renaming.
Solution: Smart file renaming + portable metadata + Windows Search = self-organizing library.

Learn More →

LACE

Language Adoption Customisation Element

Hidden Problem: Fixed translations don't match local terminology. Language switching is clunky.
Solution: Keyboard-synced switching + adaptive translations that learn from teacher edits.

Learn More →

TWIST

Teaching Within Seconds Technology

Hidden Problem: Traditional EdTech takes 45-60 min per teacher to deploy. Schools waste weeks.
Solution: 30-second teacher deployment (schools) or 10-minute setup (individuals with demo data).

Learn More →

AIM

Accurate Import Mapping

Hidden Problem: Every CSV import creates a new duplicate form. Scores scattered, no progression view, no control over what gets imported.
Solution: Import once, update forever. Map only the columns you need: one form grows with your data, never duplicates.

Learn More →

"That's Not Efficiencyβ€”That's Transformation"

What takes other schools 5 hours of manual work and 3 days of effort,
Student-1st does in under 1 second with mathematically proven fairness.

• 190ms execution time
• 18,000× faster than manual
• $250+ saved per year
• <1% final variance
See the CAFE System →

Speed and final variance (<1%) are guaranteed. Fairness improvement percentage varies based on starting data quality (typically 40-70% improvement). All measurements from production testing with 139 students across 5 classes.

The Right Tools. The Right Mindset. The Right Results.

Join educators who believe in putting students first, with software that makes it automatic.

Request Demo