AI Vanguard · est. 2024
Student voice on AI in education
Brief no. 01 · Spring 2026 · aivanguard.org

Six asks for schools and districts, grounded in primary research from AI Vanguard's student representatives across Southern California.

Students surveyed: 447
Campuses: 6
Teachers surveyed: 10
Brief date: April 2026

Executive Summary

AI Vanguard is a student-led 501(c)(3) nonprofit founded in 2024 to ensure that students have a direct voice in how AI is used in their classrooms. This brief summarizes findings from three primary research projects — a 447-response student policy survey, a 10-educator teacher pilot, and a qualitative perception study — and translates those findings into six concrete policy positions.

The pattern across all three studies is consistent. Students want guidance over bans. Teachers feel pressure to adopt AI but lack the tools to evaluate it — in a blind detection test, educators averaged 30% accuracy, below the 33% expected from guessing, and zero of ten correctly identified the paragraph actually written by a student. Schools that build AI policy on detection alone are building on unstable ground.

I. Preamble

The decisions being made right now about AI in classrooms will shape the next decade of education. The students most affected are almost never in the room when those decisions are made.

AI Vanguard exists to change that. We organize students across Southern California into a serious policy voice — with research, with recommendations, and with standing relationships to the schools and districts we work with. This brief is our summary of what students are telling us a good AI policy looks like, and an invitation to bring those students to the policy table.

II. Six Asks

  1. Teach responsible use — don't default to bans.

    Prohibition policies produce hidden use, not compliance. The students most familiar with AI are the ones most insistent that schools teach — not ban — its use. Schools should adopt explicit curricula on responsible AI use: when to ask for help, how to verify outputs, when to cite, and what counts as original work.

    Evidence — 74% of 447 surveyed students named "teach students how to use AI responsibly" as what schools should do — the single most-selected response. Only 4% called AI use outright cheating.

  2. Give students a seat at the policy table.

    Student representatives should participate in drafting, reviewing, and revising any school- or district-level AI policy — as a standing voice with the ability to flag implementation problems before they become enforcement problems.

    Evidence — 35% of students explicitly asked to be involved in shaping AI rules and policies — the second-most common policy preference in our survey and the clearest mandate we have seen for direct student input on any technology policy.

  3. Differentiate by task, not by tool.

    Policies should distinguish categories of academic work. AI assistance on studying, brainstorming, concept explanation, and work-checking should be broadly permitted. AI assistance on graded written submissions and creative work should have clear, task-specific rules with labelling conventions.

    Evidence — 41% of students say AI is "acceptable only for certain tasks," and 50% say it is "acceptable if used responsibly" — qualified stances far outnumber absolute ones. A binary allow/ban policy contradicts the actual distribution of student views.

  4. Protect creative and identity-bearing work.

    Creative work deserves stricter protection. Student voice, artistic expression, and identity-bearing writing should be explicitly scoped as human-only work — both to preserve what makes the work educational and to prevent the erosion of creative skill-building.

    Evidence — Multiple student comments surfaced unprompted concern about AI replacing creative work. Our qualitative perception study further found that teachers' grades on AI-generated work drop once the AI source is revealed — subjective assessment of creative work is unstable in the presence of AI.

  5. Close the access gap.

    Schools that permit AI use should ensure equal access to AI tools across the student body. Reliance on personal devices and paid subscriptions creates an invisible divide that tracks existing socioeconomic lines. District-sanctioned tools with equal access are the floor.

    Evidence — 18% of students say not all peers have the same access to AI tools at their school, with another 15% unsure. Unequal access in a setting where AI shapes academic outcomes is a straightforward equity failure.

  6. Invest in teacher development, not just detection tools.

    Schools should fund professional development on AI-present pedagogy — designing assignments that are AI-resilient, grading in an AI-present world, teaching critical evaluation of AI output. Detection software is a secondary tool, not the primary strategy.

    Evidence — 80% of teachers in our pilot feel pressure to integrate AI and 80% suspect frequent unauthorized use. But on a blind detection quiz, teachers averaged 30% accuracy (below the 33% chance baseline) and zero of ten correctly identified the paragraph actually written by a student. Detection is neither high-confidence nor stable.

III. What AI Vanguard Offers

Any school or district considering these positions should know what AI Vanguard provides in return:

  • A standing student representative on your campus, trained to run research and convene peer feedback.
  • Access to our ongoing survey instruments — policy, teacher, and creative-work perception — rerun each cycle.
  • Structured student forums your administration can attend or observe, not just read about.
  • Drafting help on AI-use guidelines, honor-code language, and detection-policy language, co-authored with students.

IV. In Closing

A policy students helped write is a policy students can live with. The asks above are what the evidence says a good policy looks like. The rest is a conversation we are ready to have.

To bring AI Vanguard to your school or district, or to receive the full research dataset behind this brief, write to info@aivanguard.org.


AI Vanguard · 501(c)(3) nonprofit · Student-led
aivanguard.org · info@aivanguard.org