How to harness AI for student support: a human-centred approach

By Laura Duckett, 24 September, 2025
Drawing on the creation of a virtual assistant for her MBA students, Zemin Chen shares lessons for educators seeking to introduce GenAI into the classroom
Even in a sector buzzing with technological innovation, such as higher education, the most impactful uses of AI come from a clear purpose and a human touch. My own experience developing a custom AI virtual assistant for MBA students – the “research method adviser” – highlights how purposeful design and collaboration can turn AI into a true partner in teaching and learning.

This AI solution targeted real pain points: repetitive student queries, overflowing inboxes and disengagement in online environments. By focusing on concrete problems and involving the people who would use the tool, we achieved tangible improvements, including a 30 per cent drop in email traffic and a 100 per cent recommendation rate in student surveys. Here, I share insights and actionable advice that other educators can apply.

Use AI intentionally 

AI should never be introduced simply for innovation’s sake. Start with a specific problem or goal. In our case, students needed quick, accurate guidance on research methods and academic writing, and staff were stretched thin answering the same queries repeatedly. A purpose-driven approach ensured that every feature of the assistant was tied to an outcome: saving time and improving responsiveness and learning.

Before adopting AI, ask: “What exact challenge am I trying to address?” Then define success metrics (such as reduced queries, faster response times, higher satisfaction and other relevant measures). This clarity ensures that AI delivers genuine value.

Make AI tools human-centred by design

AI should adapt to students and educators, not the other way around. We customised the assistant’s knowledge base with programme materials, university policies and academic writing guidance so that its outputs were authentic and relevant.

As educators, we effectively acted as designers of the interaction, shaping the chatbot’s responses, tone and limitations. The assistant was built to complement human support, offering help at scale while ensuring the personal touch remained in mentoring and complex discussions.

Co-design with students and colleagues

Involve end users from the outset. We piloted the chatbot with students and colleagues, gathering feedback on tone, accuracy and usefulness. This helped confirm that the AI was addressing the right problems, and it built trust among users.

By the time of launch, students already felt ownership of the tool, while staff had confidence that it aligned with university policies and academic integrity standards. Treating both students and colleagues as co-creators makes the system more resilient and relevant.

Test rigorously and refine continuously

The initial build is the easy part. The real work lies in testing, testing and testing again. We spent months refining responses, checking tone and stress-testing the tool with complex queries.

Human oversight is crucial. It ensures the assistant can escalate sensitive or academic judgement questions back to staff rather than giving misleading or insensitive replies. Continuous refinement and updating should be seen as essential maintenance, not optional extras.

Measure impact and keep evolving

Assess impact through meaningful data. For example:

  • Comparing August 2023 to August 2024, email traffic fell by 30 per cent, reducing workload and improving responsiveness.
  • Surveys of 182 MBA students across three cohorts showed 100 per cent would recommend the Research Method Adviser to peers.

These outcomes demonstrate tangible value while also pointing to areas for refinement (for example, expanding the examples within explanations). Define clear success metrics, measure outcomes rather than vanity statistics and keep iterating.

AI as a collaborative partner

When introduced with purpose, empathy and continuous improvement, AI becomes more than a novelty. It can be a collaborative partner in education, handling routine support so that staff can focus on what humans do best: mentoring, inspiring and guiding learners.

The key lesson is that AI works best when shaped by human goals and tested by its users. Done well, it enhances rather than replaces the human dimensions of education.

Zemin Chen is a senior lecturer at the University of Lancashire. 
