Generative AI in Education - Have Your Say Report

Tim Davies

Between 1st November and 15th December 2025, over 1000 students shared their views through a distributed dialogue on how schools, government and AI firms should be approaching the use of generative AI in education.

The full report provides a summary of student views.

An accompanying insights video draws on interviews with a small group of students to reflect key themes from the report.

Executive Summary

Method

The dialogue involved:

  • 23 settings across England, spanning Year 6 to Year 13 and representing a range of urban and rural schools and school types, with over 50 individual workshop sessions.
  • 1- or 2-hour deliberative workshop sessions: learning about AI in education, discussing issues and providing feedback.
  • Balanced workshop materials exploring benefits and problems of AI in education, supporting students to develop their own views.

Students were asked to:

  • respond to a set of statements about the use of generative AI in education, through a pol.is platform, starting with 10 statements developed with the UK Department for Education (DfE), with new statements from students added over time
  • discuss benefits and problems of different AI in education tools and vote on their top and bottom tools
  • generate creative feedback reflecting their hopes and fears about the use of generative AI in education

Headline findings are based on patterns of responses to the statements alongside themes arising from creative feedback and session observations.

Supporting images in the report come from worksheets and creative feedback activities, sharing student feedback in their own words.

Findings

Students expressed detailed, nuanced and critical views on how generative AI could or should be used in education, emphasising:

  • the importance of the existing student-teacher relationship
  • specific uses of AI to enhance learning by:
    • making lessons more engaging
    • providing more personalised pathways
    • supporting self-directed learning
    • preparing students to navigate a future where AI is ubiquitous

Students want AI tools to be checked for safety, accuracy and fairness before they are used in the classroom.

“AI tools for education should be officially checked and approved as safe, accurate and unbiased before they can be used, even if that means it takes longer for the latest models to be used in schools.” (Near unanimous)

Students raised concerns about AI tools:

  • being inaccurate and giving false information
  • being too quick to offer easy answers, rather than encouraging a process of learning

Students strongly value personal attention from their teachers and social interaction in the classroom with peers. They are concerned that AI could disrupt this.

Students prefer human teachers and human markers, even if AI might be faster or fairer.

Some felt AI could:

  • support factual learning in subjects with right and wrong answers
  • help struggling students when teachers do not have capacity to provide personalised help

Some younger students (Year 6) suggested AI tools might be smarter than their teachers, or expressed that “AI can’t get mad at you”, suggesting some may prefer feedback delivered through AI.

Students are split on whether AI use should be mandatory. 50% of groups disagreed (39% agreed, 10% unsure) with the statement:

“Sometimes students will have to use AI in the classroom, whether they want to or not” (Mixed response)

Groups in favour of a requirement to use AI want it to be accurate, and want AI tools to help students clearly tell their teachers when they have completed work using AI.

Some students expressed concerns about the environmental impacts of AI, although this was not widely considered to be a barrier to exploring the use of AI in education.

A number of students expressed strong opposition to the use of AI for creative work.

Students are concerned about the mental health impacts of AI. They do not want AI tools to be involved in personal, emotional and social support. A clear majority of groups agreed that:

“AI tutors should never be able to have conversations with pupils about personal issues, like friendships or family problems.” (Majority support)

They opposed AI tools pretending to have feelings, though some suggested it could help if AI tools offered light encouragement.

Students were generally in favour of the statement that:

“Children from a young age should be educated on how to use AI most effectively and safely in their learning” (Generally supported)

Further work is needed to understand student views on the use of their data within AI.

A majority opposed the statement:

“We should not place too many limits on the data that AI for education can use, so that AI can be as effective as possible” (Majority opposed)

However, there was a degree of uncertainty on the statement:

“AI needs a lot of data to be effective. We should wait until all the privacy and data protection issues are sorted out before we start to use it in education” (Uncertain response)

Students might not fully understand the kinds of data that personalised learning through AI uses. We observed students interpreting 'personal data' as details such as name and address, rather than as examples of past work and marks.

Some students also expressed concerns that personalised learning would lead to feeling judged by AI, or to a loss of opportunities for peer-support when everyone is given different tasks.

Students suggested that AI for education needs a ‘school mode’, adequate restrictions, and tightly defined scope of use. Students strongly preferred a focus on safe educational AI tools that can support independent learning or personalised consolidation of learning.

A notable minority called for a ban or very limited use of AI in education, due to ethical concerns and a feeling that AI is bad for learning. Others suggested ways of mitigating risks, such as:

  • AI tools not giving direct answers but encouraging step-by-step learning
  • AI tools that express uncertainty
  • restricting the use of AI for homework
  • daily time-limits or quotas on AI use

Others highlighted customising the tone of voice in AI for education; co-designing applications with student inputs; and involving students in the assurance of educational AI tools.

Recommendations

For generative AI to play a positive role in education, it is vital that AI developers, policy makers, school leaders and individual teachers engage in ongoing dialogue with students, and listen to their ideas and concerns.

  • Take the time to get generative AI in education right. Students want assurance that AI is accurate, and that bias and safety risks of AI are well addressed before it is deployed in the classroom. Focus on limited classroom and homework uses of AI, rather than AI everywhere.
  • Embed a role for students in decision-making on AI. AI for education should be co-designed with student input. Support should be available for teachers and education leaders to have ongoing and informed dialogue with students about shared expectations around AI in education. The ‘workshop in a box’ demonstrates the feasibility and value of a deliberative approach.
  • Build student agency in a world of AI. Young people hold a diversity of attitudes towards AI and its future use. Support students to learn about the pros and cons of particular uses of AI, and to make informed individual and collective choices about when and how they engage with it.

Read the full report

Read the full report (PDF) here.
