University lecturers often struggle to provide timely feedback when teaching large classes. In psychology programmes where hundreds of students may attend a single lecture, giving meaningful formative feedback can be difficult without placing heavy demands on teaching staff.
A reflection, published in the Psychreg Journal of Psychology, suggests that digital polling platforms such as Vevox could help address this challenge. The reflection explores how the tool can support formative assessment, improve participation in lectures, and give students immediate feedback on their understanding of course material.
Formative feedback refers to information given to learners that helps them adjust their thinking or behaviour in order to improve learning. Unlike graded assessments, formative activities are designed to guide progress rather than contribute directly to final marks.
Research in educational psychology has long linked formative feedback with improved academic performance and motivation. The difficulty arises when lecturers attempt to provide that feedback in large cohorts, where writing individual responses to each student is impractical.
Psychology programmes frequently include lecture groups ranging from several dozen students to more than 400. In such settings, traditional feedback methods such as written comments on draft work or extended classroom discussions can become difficult to manage within limited teaching time.
The reflection examines how Vevox, an internet-based audience response system, was used across several undergraduate psychology modules. The platform allows students to respond to questions during lectures using their own devices, while results appear instantly on screen.
The lecturer first introduced the tool in 2018 in a foundation level psychology module with around 60 students. It was later used in larger undergraduate lectures with more than 300 students and occasionally approaching 400.
Several types of activities were used to encourage participation. Multiple choice questions allowed students to test their understanding of lecture content, while word cloud responses helped generate discussion around more complex topics.
A question-and-answer feature also enabled students to submit longer responses during class. Because submissions are anonymous, students may feel more comfortable participating than in traditional hands-up discussions.
Anonymity appears to be an important factor in participation. Previous classroom activities often relied on volunteers answering questions aloud, which can discourage some students who worry about giving the wrong answer in front of peers.
Using live polling meant that students could see how their responses compared with the wider group immediately. The lecturer could then explain why certain answers were correct or incorrect, turning the activity into a moment of instant feedback.
The reflection reports that students generally participated willingly in the activities even though they were not graded. This supports wider research suggesting that students engage in formative tasks when they see clear learning value.
However, the approach is not without limitations. Participation tended to decline if too many activities were included in a single lecture, suggesting that digital engagement tools must be used carefully rather than excessively.
Different formats also worked better in different class sizes. Simple response types such as multiple choice questions were more suitable for large cohorts, while open-ended formats such as word clouds and question-and-answer submissions were more effective in smaller classes.
Despite these limitations, the reflection suggests that digital polling tools can make feedback more immediate and interactive. For lecturers facing increasing class sizes, such systems may offer a practical way to maintain student engagement while still supporting learning through formative feedback.