Moving to Online Teaching – some considerations and potential pitfalls

With the global impact of COVID-19, online teaching will continue into the next academic year. The two main options for delivery are interactive synchronous live sessions and asynchronous recorded lecture videos. I thought I would share my plan for moving to online teaching, in case it is useful to others facing the same situation. I am considering three phases of interaction: Pre-Activity, Teaching Activity and Post-Activity, each discussed further below. Figure 1 shows an overview of the plan.

Students typically experience a lecture as a 50-minute classroom-based activity in which a PowerPoint presentation supports the delivery of important topics aligned to learning outcomes. I do not plan to record 50-minute lecture videos for online teaching. Instead, I prefer TED-style talks that share one key idea (or learning objective), chunking the 50 minutes into smaller units. This decision is based on how learners process information and on the additional distractions that online delivery presents. These issues are discussed further in the Teaching Activity section.

Pre-recording asynchronous lecture videos will allow me to manage my workload, and it also mitigates the technological risks of live sessions: if internet connectivity fails, it won't be a great experience for my students or for me. Recorded lecture videos give students the flexibility to manage their time and external responsibilities (e.g. childcare, school closures, health issues). Students are familiar with asynchronous online teaching; they already have access to many lecture capture recordings on their course. My first- and second-year students frequently report using YouTube videos and other online resources such as the Khan Academy.

The lack of direct social interactions during COVID-19 caused some students to feel isolated. Students missed dropping in to speak with their lecturers. Some students found studying at home more challenging as a quiet dedicated study space was not guaranteed. Homes were busy environments with high demands on technology use. Synchronous live sessions can provide structure and an opportunity to connect within a session. I will be using live sessions for tutorials and Q&As.


Figure 1. An online teaching plan that involves three phases of interaction between the lecturer and learners.

Pre-Activity


The use of pre-learning has been shown to support student learning. Pre-learning exercises and activities are purposefully designed to bring previous ideas and knowledge to the surface, which helps information processing work more efficiently. In science education, pre-learning activities have been found to improve students' grades (Sirhan et al., 1999), which presupposes that the relevant prior knowledge was acquired in the first place.

Cognitive load theory suggests that learning occurs best under conditions that are aligned with the workings of the human brain. There have been many books and publications applying this theory to teaching since it was first researched in the late 1980s (Sweller, 1988; Sweller and Chandler, 1991; Kalyuga, 2007; Paas and Ayres, 2014; Sweller et al., 2019). The theory provides empirical support for models of instruction in which instructors design their teaching to optimise the load on students' working memories. Information is received through the senses, and the first important step is attention (Figure 2). Learners must first attend to information before it can be processed and stored in working memory. Pre-activities can help emphasise the importance of the information that learners must attend to.

Figure 2. A model of information processing

The working memory has a limited capacity, which causes displacement of information. The number of elements or "chunks" of information that can be held simultaneously in working memory is limited (Miller, 1956). Scaffolding information in pre-activities allows learners to engage with smaller chunks of new information and helps to avoid overwhelming their working memory (Paas et al., 2003). Simple pre-lecture activities include recap questions or quiz questions that surface prior knowledge related to the teaching activity. The pre-activity is also an opportunity to establish expectations and prepare students for the teaching activity.

Teaching Activity

Active learning underpins good-quality teaching and is recognised to enhance student engagement over passive approaches. Active learning is a broad term that includes different models of instruction and places learners as responsible partners in the learning process (Bonwell and Eison, 1991). Interactivity is important in teaching, and particularly so for online teaching activities.

One of the challenges with video lectures is 'mind wandering', where thoughts shift from the task at hand to unrelated matters. The extent of mind wandering among university students has been found to be 40-45% (Kane et al., 2017; Szpunar et al., 2013). Interactivity is essential to help counter mind wandering; breaking up content with a question is a simple form of interactivity. In one study, students using an interactive e-classroom achieved significantly better learning performance and a higher level of satisfaction than those using a non-interactive video or no video at all (Zhang et al., 2006).

It's difficult for learners to passively watch long online videos. Based on an analysis of 6.9 million MOOC video viewing episodes, the median engagement time was six minutes, so shorter video segments are recommended for online delivery (Lagerstrom et al., 2015). Shorter videos are not always suitable for more complex content, so where online media must run longer, including layers of interactivity is essential. During synchronous live lectures, interactivity can be facilitated using small group discussions, quizzes, self-assessment and peer assessment.

Learning requires a change in the schematic structures of long-term memory, and this advancement is experienced as a progression from error-prone, slow and difficult to error-free, smooth and effortless. A novice learner will not have acquired the same level of schemas as an expert. As the novice becomes more familiar with the content, the associated cognitive characteristics change and the working memory can manage the material more effectively. The goal of good teaching and instructional design is to consider "the load" and the level of complexity of the content. Using video to present information through dynamic visuals and auditory content also creates a cognitive load. An introduction with signposting at the beginning can help learners navigate the teaching content.

In cognitive load theory there are three types of load: intrinsic, extraneous and germane. Intrinsic cognitive load is demanding on working memory because of the complexity of the material. It is not always possible simply to reduce the number of essential, interacting elements, particularly for more complex content where that sophistication is central to learning within the discipline. Staggering the introduction of essential elements may be possible, ultimately allowing all elements to be processed together by the end.

Scaffolding of learning can be structured in two ways: a 'part-whole approach' and a 'whole-part approach'. The part-whole approach breaks a complex task down into a series of smaller sub-tasks that build up the necessary skills in sequence. The whole-part approach introduces the whole task initially, with attention then directed to each sub-task in turn; this approach can support understanding of the interactions and connections between the sub-tasks.

Extraneous cognitive load is unnecessary load that interferes with schema acquisition. It can result from using cognitive capacity to search for information or details in an explanation instead of solving the problem, and its importance increases when the intrinsic load is already high. With video as an instructional medium, it is important to direct learners' attention to relevant information, because irrelevant details consume processing resources. This was shown in a science topic on how a cold virus infects the human body (Mayer et al., 2008): high-interest details concerned the role of viruses in sex or death, whereas low-interest details consisted of facts and health tips about viruses. As the level of interesting details increased, student understanding decreased, as measured by problem-solving transfer. This was not the result I would have predicted, particularly as I have found that contextualising content helps engage students and increase their motivation (Fergus et al., 2015). It is the use and timing of high-interest details that is critical.

During a lecture, particularly with larger groups and cohorts, it can be difficult to ascertain whether students have fully understood a teaching point. Asking a question often results in silence or a small number of responses; most students are not confident contributing an answer, and some, referred to as "the quiet learner", prefer to reflect and answer questions internally (Akinbode, 2015).

Think-Pair-Share is a collaborative discussion strategy that provides additional time for students to consider, reflect and think in order to improve the quality of their responses. It gives students the opportunity to work within a group where they can discuss their understanding and ideas in a safe environment. Because of the structure of the task, students are encouraged to have something to share, keeping them engaged, and it reduces the pressure on any student who is reluctant (quiet, unsure). Research shows that the quality of student responses improves with the additional discussion (Smith et al., 2009; Wood et al., 2014), and Think-Pair-Share has been used effectively to foster critical thinking skills in nursing students (Kaddoura, 2013). Breakout rooms during a synchronous live lecture provide a useful format for small group discussions.

Team-based learning (TBL), a structured instructional strategy, has enhanced active learning and critical thinking in medical and science courses (Parmelee and Michaelsen, 2010; Parmelee et al., 2009). The emphasis in TBL shifts from knowledge transmission to knowledge application. Individual and team quizzes can be designed into the TBL approach to capture and provide feedback on learning performance.



Post-Activity
With recorded lecture videos, it is problematic if students postpone their engagement and don't set aside viewing time in their weekly schedules. Binge-watching a box set is a familiar experience today, but in relation to learning and memory there is robust evidence that distributed practice produces better long-term performance than blocked practice (Cepeda et al., 2006). Since viewing recorded lecture videos aims to support learning, a similar spacing out of viewing would be preferred.

Activities that consolidate learning and provide feedback are recommended for the post-activity. Setting a quiz or collaborative discussions on problems sets expectations for timely engagement and highlights any misconceptions. I used to run office hours close to assessment submissions, scheduled at times when my students were also available and on campus. This was effective, so I will trial weekly online office hours to give students an opportunity to participate live and ask questions. My experience with discussion boards is that students don't engage much because their names are visible; although I reinforce that "there is no such thing as a silly question, it's silly not to ask", students are reluctant to participate. Mentimeter has been more successful for engagement because participants have anonymity. So another option is to poll questions (using Mentimeter or Padlet) and answer them in a recorded session. This would be inclusive for students who cannot participate in live Q&As and provide flexibility in delivery.



The migration of traditional classroom-based lectures to online delivery platforms requires thoughtful consideration of learner engagement and information processing. For online teaching, integrating a pre-activity, teaching activity and post-activity will help to address some of the challenges (online distractions, study environment and social community) that learners experienced during COVID-19 alternative teaching arrangements. Interactivity in online teaching is critical for effective student engagement and supports effective learning, and it can be distributed across the three phases in a variety of ways.


AKINBODE, A. 2015. The quiet learner and the quiet teacher. University of Hertfordshire Link, 1(2).

BONWELL, C. C. & EISON, J. A. 1991. Active Learning: Creating Excitement in the Classroom. 1991 ASHE-ERIC Higher Education Reports, ERIC.

CEPEDA, N. J., PASHLER, H., VUL, E., WIXTED, J. T. & ROHRER, D. 2006. Distributed practice in verbal recall tasks: A review and quantitative synthesis. Psychological bulletin, 132, 354.

FERGUS, S., KELLETT, K. & GERHARD, U. 2015. The khat and meow meow tale: Teaching the relevance of chemistry through novel recreational drugs. Journal of Chemical Education, 92, 843-848.

KADDOURA, M. 2013. Think pair share: A teaching learning strategy to enhance students’ critical thinking. Educational Research Quarterly, 36, 3-24.

KALYUGA, S. 2007. Enhancing instructional efficiency of interactive e-learning environments: A cognitive load perspective. Educational Psychology Review, 19, 387-399.

KANE, M. J., SMEEKENS, B. A., VON BASTIAN, C. C., LURQUIN, J. H., CARRUTH, N. P. & MIYAKE, A. 2017. A combined experimental and individual-differences investigation into mind wandering during a video lecture. Journal of Experimental Psychology: General, 146, 1649.

LAGERSTROM, L., JOHANES, P. & PONSUKCHAROEN, M. U. 2015. The myth of the six-minute rule: Student engagement with online videos. Proceedings of the American Society for Engineering Education, 14-17.

MAYER, R. E., GRIFFITH, E., JURKOWITZ, I. T. N. & ROTHMAN, D. 2008. Increased Interestingness of Extraneous Details in a Multimedia Science Presentation Leads to Decreased Learning. Journal of Experimental Psychology: Applied, 14, 329-339.

MILLER, G. A. 1956. The magical number seven, plus or minus two: Some limits on our capacity for processing information. Psychological review, 63, 81.

PAAS, F. & AYRES, P. 2014. Cognitive Load Theory: A Broader View on the Role of Memory in Learning and Education. Educational Psychology Review, 26, 191-195.

PAAS, F., RENKL, A. & SWELLER, J. 2003. Cognitive load theory and instructional design: Recent developments. Educational psychologist, 38, 1-4.

PARMELEE, D. X., DESTEPHEN, D. & BORGES, N. J. 2009. Medical students’ attitudes about team-based learning in a pre-clinical curriculum. Medical education online, 14, 4503.

PARMELEE, D. X. & MICHAELSEN, L. K. 2010. Twelve tips for doing effective team-based learning (TBL). Medical teacher, 32, 118-122.

SIRHAN, G., GRAY, C., JOHNSTONE, A. H. & REID, N. 1999. Preparing the mind of the learner. University Chemistry Education, 3, 43-46.

SMITH, M. K., WOOD, W. B., ADAMS, W. K., WIEMAN, C., KNIGHT, J. K., GUILD, N. & SU, T. T. 2009. Why peer discussion improves student performance on in-class concept questions. Science, 323, 122-124.

SWELLER, J. 1988. Cognitive load during problem solving: Effects on learning. Cognitive science, 12, 257-285.

SWELLER, J. & CHANDLER, P. 1991. Evidence for Cognitive Load Theory. Cognition and Instruction, 8, 351-362.

SWELLER, J., VAN MERRIËNBOER, J. J. G. & PAAS, F. 2019. Cognitive Architecture and Instructional Design: 20 Years Later. Educational Psychology Review.

SZPUNAR, K. K., KHAN, N. Y. & SCHACTER, D. L. 2013. Interpolated memory tests reduce mind wandering and improve learning of online lectures. Proceedings of the National Academy of Sciences, 110, 6313-6317.

WOOD, A. K., GALLOWAY, R. K., HARDY, J. & SINCLAIR, C. M. 2014. Analyzing learning during Peer Instruction dialogues: A resource activation framework. Physical Review Special Topics-Physics Education Research, 10, 020107.

ZHANG, D., ZHOU, L., BRIGGS, R. O. & NUNAMAKER JR, J. F. 2006. Instructional video in e-learning: Assessing the impact of interactive video on learning effectiveness. Information & management, 43, 15-27.


My 10-month hiatus away from the world of Twitter

This wasn’t a planned experiment, but my last tweet was August 2019. A lot has happened since then and removing myself from Twitter has been an interesting experience.

So did I miss Twitter? The answer is well, Yes and No.

I left Twitter for my mental health. Something happened at a conference I attended and, without getting into details, let's just say someone I used to hold in very high regard wasn't kind. I needed to step away.

Twitter, and any form of social media, can result in both positive and negative experiences. Thankfully I never experienced trolls or abuse, so my experiences are general. I use Twitter in an academic professional capacity. It is great for keeping up to date on the latest developments, new publications, ideas, hints and tips. Previously, when I shared advice on publishing education research, I included Twitter as a positive networking tool. That's the value it had, and still has, for me. I have to admit that I missed Twitter as a resource and missed that sense of being in the hive mind of activity.

What I didn't miss were the cliques that can get magnified online. Yes, there are cliques everywhere, that happens, but online they can become an exclusive club.

As an introvert, I tend not to shout about what I’m doing to others. Twitter is a great way to share updates on some professional developments. It helps to connect what you are doing with the outside world and that can be sharing and connecting to anyone, both home and away.

Overall the positives outweigh the negatives, so I am reconnecting with Twitter. I remind myself that we are in control of managing notifications and the times we engage online. It's those damn notifications that invite you to open the app, especially in the evenings and at weekends. That's how it starts! So it will be a conscious choice now, and I am looking forward to my return to Twitter in 2020 (it's a strange year anyway).

“I am not happy with my mark” – Tough!

A lecturer releases coursework grades for a recent assessment, and emails begin to flow into their inbox from students who are not at all happy with their results. One student, for example, did OK on the Multiple Choice Questions (MCQs) but scored below a pass grade on the Short Answer Question (SAQ). The email, brief and very much to the point, stated: "I am not happy with my mark and would like to arrange a meeting to discuss this with you". The lecturer begins to feel overwhelmed by the emails and thinks, "So you don't agree with the mark, tough!"

This scenario came from a conversation I had recently with a colleague about the emotional responses on both sides: students reacting to their feedback, and the lecturer, wanting to do the best they can, feeling pressure from students and feeling they have to do more and more. This creates and contributes to a sense of resentment and the belief that students are not taking responsibility for their learning and are simply complaining about their grades. A barrier to effective feedback dialogue is created.

Dr Naomi Winstone’s research on learners’ proactive recipience within the feedback process has been a catalyst for me professionally, prompting our investigations into approaches that support feedback dialogue in our Programmes. A funded project on “Demystifying Academic Expectations” has enabled some further insights into approaches that will impact positively on our students, empowering them to learn more effectively.

So what happened with the student who emailed?

After the initial TOUGH response, acknowledging those reactive emotions that surfaced, the lecturer responded by email and arranged a meeting with the student.

The lecturer made sure that they were in a calm state ahead of the meeting so that they could have an adult-to-adult conversation with the student. They had in the back of their mind that initial reaction, that this student might just want to argue the mark, but also that they could be wrong: the student might be genuinely concerned and looking to understand how to improve. The important point is not to jump to conclusions ahead of the meeting.

After exchanging pleasantries, the lecturer asked them, “how well do you think you did on this assignment?”

The student explained their concerns: they were performing better on other aspects of their studies, and this grade worried them because it was the SAQ that brought down the overall mark. They would be writing further SAQs in other modules and in exams, so understanding what was not working for them was very important.

The lecturer explained the criteria and standard against which the work was judged, then examined the student's work against this to highlight the feedback and discuss it.

From the conversation with my colleague, the feedback dialogue process that took place between the lecturer and student had a very positive outcome. Creating the right environment to have that conversation was key.

The lecturer then addressed answering SAQs in a subsequent workshop with all the students, supporting the aim of demystifying academic expectations.

  • The first part of the workshop was an activity to help students focus on the verbs used in questions and the differences between them. Students wrote in their own words what they understood was meant by "define", "discuss", "explain" etc.


  • Then the lecturer briefly introduced Bloom's Taxonomy and the development of intellectual thinking skills. For a first-year student, remembering and understanding are the important skills, with perhaps some applying; in second year, the skills of applying and analysing develop, distinguishing the expectations between those levels of study. The students were then asked to consider the verbs on the activity sheet and which levels of Bloom's Taxonomy they address.
  • Following the discussions, a summary of frequently used verbs was presented so that students could compare it with their own interpretations.


  • The students then examined some questions from different topics so that they could apply this approach within a specific context.
  • The final part of the workshop involved exemplars that students could evaluate. The exemplars, produced by the lecturer, were created to illustrate an outstanding example, a good example and a poor example. This is something that Prof Phil Race mentioned to me in an insightful conversation about exemplars. Thank you, Phil. The lecturer deliberately included some errors that students had made in previous assessments, so this was also feeding back to the students on their errors. The students evaluated each exemplar and then, using a think-pair-share approach, discussed their evaluations with each other.
  • The lecturer then gave the students a behind-the-scenes look at how they would mark the exemplars. This is the demystifying aspect: helping students to understand what goes on from the academic perspective. Students could ask questions to clarify the differences between the exemplars and how the lecturer awarded and deducted marks.

Student feedback on this has been extremely positive; when you hear students use "amazing", it must have had a positive impact on their learning experience, which ultimately is the reason we devise and pilot new approaches in our teaching. I think unpacking the 'black box' and demystifying academic expectations can only help and support our students.


WINSTONE, N. E., NASH, R. A., PARKER, M. & ROWNTREE, J. 2017. Supporting learners' agentic engagement with feedback: A systematic review and a taxonomy of recipience processes. Educational Psychologist, 52, 17-37.

Lecture Capture: will my students show up?

One of the recent game changers in UK Higher Education is the introduction of lecture capture. Across the sector, recording live lectures is now commonly adopted, with 86% of UK universities using lecture capture. It is now being introduced more widely at the university where I work, so naturally there has been much debate and discussion about the potential impact of this initiative on student attendance and student learning. Much of the discussion has focused on a deficit model, particularly that attendance will drop and students will rely on recordings for cramming.

I wanted to better understand what research tells us about these issues and what the implications are for academic practice. That's what this blog post is about.


Before exploring lecture capture specifically, it is worth looking at the evidence relating student attendance to student performance: the literature shows a positive link between the two. In a meta-analytic review, attendance was found to have a strong relationship with final course grade and, importantly, was a better predictor of academic performance than, for example, study habits, study skills or standardized admission tests. Interestingly, research in a medical education context, where the teaching sessions were highly interactive, showed a mean attendance rate of 89%. A positive correlation was found between attendance and overall examination score (r = 0.59 [95% CI, 0.44-0.70]; P < .001), and distinction grades (above 60%) were obtained by students with attendance rates of 80% or higher.


Figure 1. Correlation Between Total Attendance Rate and Overall Examination Score with the line of best fit (Deane and Murphy, 2013)

Further research using a randomized experimental approach suggests that, on average, attending lectures corresponds to a 9.4 to 18.0 percent improvement in exam performance. It is no surprise that student engagement and attendance have a positive impact on learning success. So what about lecture capture, where recordings of lectures are available to students? Will the face-to-face lecture be replaced by a recording, as many students might assume?

Attendance and performance with lecture capture

A study on attendance and lecture capture was carried out with 396 first-year students attending a mandatory course on biological psychology. All the lectures were of a traditional university style, with a teacher in front of the class covering content. Four different groups were identified based on attendance and use of recorded lectures: non-users, viewers, visitors and supplementers.


Student membership of these categories was not fixed and changed between semesters. A large number of students used the recorded lectures as a substitute for lecture attendance (viewers). In the first assessment, which focused on developing a knowledge base, the students who used the recorded lectures as a supplement scored significantly higher. In the second assessment, which focused on higher-order thinking skills, no significant differences were found between using recorded lectures and attending lectures. A correlation study of time spent against exam score for the first (knowledge base) assessment found that only watching online lectures (viewers) had the lowest correlation and only attending lectures (visitors) had the highest. The supplementers' increased time on task did not translate into a higher exam score. Again, as in other research, attendance at face-to-face lectures is a strong predictor of academic performance.

Recent research within a UK context investigated the relationship between attendance, lecture recording use and student attainment across four years of an undergraduate psychology programme. For first-year students, supplemental use was beneficial, but only stronger students could overcome the impact of low attendance; weaker students did not see grade improvements from additional use of recordings if their attendance was poor. The predictive effect on student attainment was found only for first-year students.

The benefits of lecture capture

The positive appreciation by students of having recorded lectures available will not come as a surprise to my colleagues. The ability to review recordings during a course, particularly where students have difficulty understanding new content or terminology, makes them a beneficial support tool. I know from student feedback where I work that some students don't like gaps deliberately placed in lecture slide handouts. There are a number of issues at play here: some students are anxious that they have missed important points; others over-rely on studying just from the lecture slides, which incomplete versions don't support; and obviously those who don't attend won't have heard the lecture unless they have a recording. Research within a health science context with a high content load examined lecture recordings where a podcast and the lecture slides were made available. Some students listened to the podcast during other activities (e.g. commuting), but most accessed the resources and viewed the lecture slides on a computer. Students reported that the flexibility of this approach and being able to hear the lecture again were the most useful aspects.

Implications for practice

The goal in higher education is to support the development of independent adult learners, equipped with the skills and competencies to thrive in a competitive global market. The student population in Higher Education is diverse; with widening participation, it includes more mature students, international students, students with a disability, students from under-represented groups (e.g. Black and Minority Ethnic) and students who are first in their family to attend university. Students may choose not to attend for a variety of reasons, such as the time of the lecture or the timetable for that day. I know that some students will actively decide not to attend a lecture because of commuting time and costs, especially if few activities are scheduled for that day.

Do learners understand the importance of lecture attendance?

I know that we tell them it is important, but actually sharing the evidence on the relationship between lecture attendance and student grades would be more persuasive. In other research I am doing, I have interviewed successful students, and attendance is mentioned a lot, so sharing student quotes could help hammer home the point. An informed choice for adult learners is what I am getting at here. It also reminds me of feedback, and students being aware of what feedback is and noticing it within their educational experiences. We have to support students in navigating this new university world they find themselves in.

How can we advise students on using lecture capture?

A recent pre-print of a publication by Emily Nordmann et al. provides practical recommendations on lecture capture for staff and students, along with helpful infographics that are freely available to download and use (highly recommended). Local guides are essential for communicating expectations to students, for example "7 ways to get the most out of Lecture Capture: A Guide for Students". It's about messaging and involving students in different ways.

Will lecture capture change teaching approaches?

The cons – The concern is that lecturers may feel inhibited from being creative with their teaching, trying out new initiatives or telling jokes. Common lecture capture policies include guidance that staff pause recordings for any interactive discussions with students, and that confidential topics are not included in the recording. There are tensions around recordings being available to view and how they might be used. It would be a shame if lectures became less interactive, lacked energy and were didactic in delivery. Actually, I am not sure I could watch such recordings anyway; I would probably just fast forward to the parts I am interested in hearing and leave the rest.

The pros – I believe there’s an educational opportunity here, to critically discuss the purpose of the lecture and how it is delivered. If a lecturer is the only person talking during a lecture, then I do think a recording can suitably replace the lecture. Would I attend if I could listen to it all online at a time that is more flexible and suits me better? Interactivity within lectures, using technologies such as mentimeter for large class sizes to promote inclusive activities and ensuring that learners are not passive participants, will engage students better and also promote meaningful learning that isn’t so easily captured by a recording, especially if the interactive aspects cannot be recorded for privacy reasons.

There’s a lot to think about with lecture capture; if you have any comments, please tweet @suzannefergus or leave a comment below.

Student Marking Exemplars – flipping a revision session for answering a SAQ

How do you answer a Short Answer Question? This is the most frequent request I get from students when they are asked what they would like to cover in a study revision session. Their exams consist of a combination of Multiple Choice Questions (MCQs) and Short Answer Questions (SAQs). Typically, students’ performance on the SAQs is lower than on the MCQs. This is not a surprise: it’s much harder to synthesise an answer on a blank sheet of paper than to recognise or work out the correct answer from a list of plausible options.

My 1st year chemistry students experience answering a formative SAQ under exam conditions ahead of a summative mid-term test to help familiarise them with what to expect. Feedback during this session helps to clarify what a good SAQ answer consists of using the formative SAQ as a specific example.

The key message I emphasise is to show both knowledge and understanding in their answers. If the question uses “list” or “identify”, then it requires knowledge of the correct answer; it is not asking you to explain your answer or justify your reasoning. If the question asks for an explanation or a discussion of the answer, then understanding is required. The weighting of question parts within an SAQ can also indicate this, with knowledge-only parts weighted less than knowledge + understanding parts.


The students were also provided with written guidance on answering SAQs, which included the main types of questions for 1st year chemistry.

Top Tips for answering a SAQ

A short answer question means that the answer is relatively short compared to an essay. The answer may involve working out a calculation or may consist of several sentences, and could include chemical structures/diagrams.

An answer that’s longer than necessary won’t cause you to lose marks, as long as everything you write is correct; but if something is incorrect, you will lose marks. In addition, writing more than necessary wastes time that could be spent on other questions. So you should only write more if you think it’s correct and crucial to answering the question.

Despite this guidance and further feedback from all staff within the module during smaller workshops, the students still lacked confidence with SAQs.

So I decided to flip a revision activity and provide the students with two exemplar answers to a SAQ on the sample exam paper. I created the two answers, both with deliberate errors, and the activity required the students to mark them. I explained that the students should mark both answers according to a 40:60 ratio for the knowledge and understanding demonstrated in each answer. I wanted to revisit the initial guidance from the formative workshop on knowledge and understanding. Answer A was almost a top answer but included an error. Answer B was incorrect, but aspects of the answer demonstrated correct understanding of the key concepts asked in the question. The aim here was to illustrate how we look to reward correct knowledge and understanding, and that chemistry SAQs are not necessarily text only, as they may include chemical structures or figures. I marked Answer A as 90% and Answer B as 40%.
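To make the 40:60 weighting concrete, here is a minimal sketch in Python. The function name and the component scores are my own invention for illustration, not part of the actual marking scheme:

```python
def weighted_saq_mark(knowledge: float, understanding: float) -> float:
    """Combine two component scores (each out of 100) using a
    40:60 knowledge:understanding weighting."""
    return 0.4 * knowledge + 0.6 * understanding

# Hypothetical example: strong knowledge but weaker understanding
# still caps the overall mark, because understanding carries more weight.
print(weighted_saq_mark(90, 40))  # 60.0
print(weighted_saq_mark(40, 90))  # 70.0
```

The point of the weighting is visible in the two calls: the same pair of scores produces a higher mark when it is understanding, rather than knowledge, that is strong.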

It was clear early on that this activity challenged the students to think; they had to reflect on their own level of knowledge and understanding of the topic whilst marking the answers.

What was most interesting were the marks the students gave to Answer A (my mark 90%) and Answer B (my mark 40%). I used mentimeter to capture their marks, and it caused amusement when the marks were polled, as the range was indeed quite large!

The average student mark for Answer A was 56% and the average for Answer B was 54%. The highest mark for Answer A was 80% and the lowest was 20%. A similar pattern was seen for Answer B: highest mark 82%, lowest 30%. The students were challenged by the process of evaluation; in terms of Bloom’s Taxonomy, unless good knowledge and understanding exist as a strong foundation, the intellectual skills of evaluating and analysing will be difficult to achieve. For Answer B, a third of the students gave a grade above 60%. Student feedback on the session was very positive; it got them thinking about SAQs from a new perspective and gave them useful feedback on what they needed to revisit in their revision. I used this activity as a pilot and will definitely repeat it with other topics; the critical aspect is to include some common errors or misconceptions in the exemplar answers.
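The summary figures above (averages, highs and lows, and the share of marks above 60%) are simple descriptive statistics. As a sketch, with made-up poll marks standing in for the real mentimeter export, they could be computed like this:

```python
from statistics import mean

# Hypothetical polled marks for one exemplar answer (not the real class data)
marks = [20, 35, 48, 55, 56, 60, 62, 70, 75, 80]

print(f"average: {mean(marks):.0f}%")                 # mean of all polled marks
print(f"highest: {max(marks)}%, lowest: {min(marks)}%")
share_above_60 = sum(m > 60 for m in marks) / len(marks)
print(f"share above 60%: {share_above_60:.0%}")       # fraction of generous markers
```

The same few lines work on any list of polled marks, so the exercise could be summarised live in the session straight after the poll closes.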


Bloom’s Taxonomy (which I no longer represent using a pyramid)

Apple Logo, Amide bond…

One of my first steps and explorations into chemistry education research involved creating a diagnostic test to use with 1st year students in the chemistry module on both the Pharmacy and Pharmaceutical Science programmes. This approach was previously published in New Directions (see here) and consisted of 40 questions on fundamental chemistry principles pertaining to organic chemistry and basic chemistry knowledge that 1st year students on our programmes should have acquired, based on the entry requirements for both programmes. One of the questions yielded unexpected outcomes with both 1st year and 3rd year students when the diagnostic test was piloted, and this outcome has been consistently replicated annually with 1st year students!

The question I am referring to is

“Draw the chemical structure of an amide bond”

In the published pilot of the diagnostic test, this question was poorly answered, with 80% of the answers incorrect. A similar result has been observed each year the diagnostic test is administered: students do attempt to answer the question and recognise that there is a nitrogen, but are unable to correctly recall the chemical structure of an amide bond. Many draw an amine bond. Other functional groups feature in the diagnostic test with questions that focus on the recognition of functional groups, not on drawing them. These questions have higher percentages of correct answers. This year I included a question that required recognition of an amide bond from a selection of four chemical structures. Not surprisingly, a higher proportion of answers were correct for this question – 58%.


Figure 1. A question on the recognition of the amide functional group

There is something about correctly drawing the structure from visual memory that is causing difficulties. My hunch is that the problem lies in the input approach used by students for this type of information and the manner in which it is studied. Visual memory in humans is exceptionally good; however, mere exposure does not lead to enhanced memory, as demonstrated by the classic penny experiment, by studies asking people to locate the nearest fire extinguisher and, more recently (my favourite), by the Apple logo study (see article here). This explicit memory, or intentional retrieval of encountered events, is poor for daily interactions with objects. Constant exposure and interaction do not lead to accurate recall.

The Apple logo experiment is a repeat of the classic penny experiment: participants were asked to recall the logo from memory and to recognise the logo from a set of alternatives. The researchers also measured metamemory judgements to assess confidence either before or after drawing the logo from memory, with participants rating their level of confidence on a 10-point scale.


Figure 2. An example of some stimuli used in the recognition task (the correct logo is not shown here!)

In general, the participants (85 undergraduate students) were not able to draw the logo from memory. One participant drew it perfectly, and seven participants made 3 or fewer errors. In terms of recognition, less than half (47%) correctly identified the correct logo. Apple users showed an advantage over non-Apple users, but the difference was not significant. Not surprisingly, the confidence of participants (26 students) was higher prior to recalling the Apple logo (M = 8.58) and dropped (to M = 5.54) after the retrieval process.

OK, so recalling the amide bond requires conscious thought, and for many students it has associations with a nitrogen, but they cannot correctly recall the chemical structure. In my diagnostic test feedback I emphasise noticing that the word amide contains a D, for double bond, as a cue to help trigger this recall. How many students test themselves when studying, or are they rereading their lecture notes, with repetition breeding familiarity and hence misplaced confidence?


Teaching with Mentimeter

I discovered mentimeter over a year ago and was eager to implement this technology within my teaching and, obviously, to evaluate its use and effectiveness. So in this blog post I will share my thoughts, be open about the lessons I have learned along the way and suggest useful ways to include mentimeter within presentations or classroom activities.

Mentimeter is an online interactive presentation tool, with no plug-in required, and it’s free (an essential factor for many educators when considering using or piloting a new technology). There is no limit to the audience size, and each presentation can include two different question types or 5 quiz questions. I have used the open-ended comments, word cloud and multiple choice question formats. Other options are scales where participants can rate a statement, 100 points for prioritising items and a gamification option to find a winner. The range of options and ease of use make it my favourite new technology tool. There are paid plans that allow greater functionality, particularly branding and exporting data to Excel, but for teaching and interactivity the free version is ideal.

I first used mentimeter to better understand how my 1st year students were finding their chemistry module. Have you, like me, experienced that moment when you ask students for feedback or pose a question, only to be met with a sea of silence? I can understand this: students are reluctant to speak up in a large group (particularly in 1st year at university), and if they do, it can be to tell you what they believe you want to hear. Therefore, I introduced the open-ended comment function with my 1st year Pharmacy and Pharmaceutical Science students. It is the online version of writing your thoughts on a post-it note in response to the question “What are your biggest challenges with learning chemistry?” The great thing with this mentimeter option is that the comments are anonymous, so students can freely post their replies. From a teaching perspective I want to hear all comments, positive and negative, so removing the identity aspect allows students the freedom to respond honestly. It also allows a more inclusive approach, providing a mechanism for the ‘quiet learner’ to participate. I have also used this function during induction activities to ask students “How will studying at university differ from your previous school experience?” and engage them in this discussion.

Learning at university

The students really enjoy watching the comments appear on the screen. There is a profanity filter too, although I have never had any issues with the comments written; just once a student posted that there would be a party and gave the details. We had to have a group conversation about professionalism. It didn’t happen again!

I have used multiple choice questions during lectures, which is a great way to test understanding. There is the possibility of using peer instruction with this approach, and after reading “Make it Stick: The Science of Successful Learning” earlier this year, I incorporated questions prior to instruction to help maximise learning. I invited students to answer a question on chemical bonding at the start of the lecture, with 8% of students selecting the correct answer. At the end of the lecture, this increased to 61%. I was very excited to see this in action.

My third mentimeter use was a word cloud to understand how my 1st year students were feeling about chemistry at the very start of the module. I allowed students to respond more than once, which was the lesson learned. They got a bit giddy seeing the words appear as the word cloud was created live, and some students started to include inappropriate words and slang that I had to look up to understand! So next time they won’t have the option to post more than once.

Figure: word cloud of student responses

Some of my colleagues have expressed concern about students using their phones during lectures and the potential distractions, but I haven’t found this problematic. Setting expectations and managing unwanted classroom behaviours is always important. Mentimeter has been very much a positive addition to my teaching practice. I have understood my students much better than previously, and from my initial pilots I have been impressed with its potential, from instant interaction at the start of a presentation and inclusive student engagement to more structured learning developments.

PS I have no involvement with mentimeter, just sharing my experiences.


Learning Styles and Education Myths – an academic journey

This past year I had a great opportunity to work in our Learning and Teaching Innovation Centre, and part of my role was teaching on the PGCert in HE, more commonly known in my institution as CPAD (Continued Professional Academic Development).

I was module lead for Linking Pedagogic Theory to Practice, with a focus on learning through critical reflection. Those details are purely for context and lead into what this blog post is really all about… learning styles and urban myths in education.

When I started my academic career, over 10 years ago, I was introduced to learning styles during my own CPAD journey. We spent time completing a purchased questionnaire and associated booklet within our multidisciplinary groups, followed by a buzz around the table: “Oh, I’m a reflector”, “I’m a pragmatist”, “I’m an activist”. Now, I was a complete novice to learning theory but eager to understand anything that might help my teaching, and learning styles sounded like something tangible to use and apply. The key message from that session: students are not the same and learn differently. However, there was no “what next”, no critical discussion or evaluation from practice. I could hardly hand out a questionnaire to all my students, and even if I did, what would I change in my teaching? I was somewhat confused…


I needed clarity, and thankfully more recent publications help to critically review learning styles. So fast forward to the present CPAD, where I have the opportunity to facilitate a session on learning styles and we get to grips with the concept and, more importantly, the lack of evidence to support it.

Learning styles refers to the belief that different people learn information in different ways.

There are at least 71 different learning styles described by Coffield et al. (2004), so this has been a vast area of interest among professional educators over the years, not to mention the commercial activities associated with the instruments and resources.

The premise behind learning styles is that if the teaching approach matches the preferred learning style, this will lead to improved performance. So for a “visual learner”, information should be presented visually to match, or mesh with, the learning style.

Pashler et al. (2009) provide an excellent critical review, including their methodology to test the learning style hypothesis.

Any research study must:

1. Divide learners into two or more groups (e.g. visual and auditory)

2. Within each learning-style group, learners must be randomly assigned to one of at least two different instructional methods

3. The same test must be used with all learners

4. The results must demonstrate that the learning method providing optimal test performance for one learning-style group is different from the learning method that optimizes the performance of a second learning-style group.

Pashler et al. illustrate this with crossover interactions as acceptable evidence, where the learning method with the highest mean test score for one category of learners is different from the learning method producing the highest mean test score for the other category. In other words, if the same learning method optimizes the mean test score of both groups, the result does not provide evidence to support the learning style hypothesis.
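The crossover criterion can be expressed as a simple check: find the best-scoring method for each learning-style group, and require that these best methods differ. Here is a minimal Python sketch, with invented mean scores purely for illustration (the function name and numbers are mine, not from Pashler et al.):

```python
def shows_crossover(means):
    """means maps learning-style group -> {instruction method: mean test score}.
    Returns True only if different groups are optimised by different methods."""
    best_methods = {
        group: max(scores, key=scores.get)  # method with highest mean score
        for group, scores in means.items()
    }
    return len(set(best_methods.values())) > 1

# Invented example of a crossover: each group does best with "its own" method.
crossover = {
    "visual":   {"visual_method": 78, "auditory_method": 65},
    "auditory": {"visual_method": 60, "auditory_method": 74},
}
# Invented example of no crossover: one method optimises both groups,
# so the result provides no support for the learning style hypothesis.
no_crossover = {
    "visual":   {"visual_method": 78, "auditory_method": 65},
    "auditory": {"visual_method": 72, "auditory_method": 64},
}
print(shows_crossover(crossover))     # True
print(shows_crossover(no_crossover))  # False
```

Of course, a real study would also need the random assignment, common test and significance testing described in the criteria above; this sketch only captures the logical shape of the crossover requirement.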

Figure: crossover interactions, adapted from Pashler et al. (2009)

They found only one study potentially meeting the criteria, but with questionable evidence and methodological issues, including the removal of outliers and no reporting of mean scores for each final assessment.

Learning styles: now I only ever hear them mentioned with disapproval. There is, however, another urban myth, “the learning pyramid”, that I have seen at recent conferences used as a supporting model to demonstrate the effectiveness of learning using a particular mode of instruction. So we discussed (well, tore apart!) this too in the CPAD module.


This intuitive model is attributed to the National Training Laboratories (NTL) in Bethel, Maine, USA. The laboratory responds to any requests for further information on the research with a standard email:

[The research was carried out] at our Bethel, Maine campus in the early sixties, when we were still part of the National Education Association’s Adult Education Division. Yes, we believe it to be accurate but no, we no longer have – nor can we find – the original research that supports the numbers. We get many enquiries about this every month – and many, many people have searched for the original research and have come up empty handed. We know that in 1954 a similar pyramid with slightly different numbers appeared on p. 43 of a book called Audio-Visual Methods in Teaching by Edgar Dale, published by Dryden Press.

So, as for the origins, the above speaks for itself. The percentages are too nicely rounded off, almost impossible in scientific research, and the absence of any specified method of measurement, learner population or learning task casts further doubt. The pyramid also contradicts learning styles: a visual learner has a lower retention rate despite matching their preferred learning mode!

Is it a surprise, then, that when I expressed an interest in chemistry education research early in my academic career, the response I got was “but what about your real research?”

This was said with a positive intention, I might add, by a supportive Prof who later became a critical friend when I wrote up a context-based case study. He was simply concerned that I would lose my academic currency. The education urban myths certainly haven’t helped our cause: fluffy, non-rigorous research is how many of my colleagues view education-focused research.



Feedback and the Ryanair effect

What has Ryanair got to do with feedback in education? If you have ever had the pleasure of flying with Ryanair, then you will have met the Ryanair effect! Knock airport in the west of Ireland is frequently my destination, only 1 hour 10 minutes from Luton. When the plane touches down on the runway, which was built over boggy terrain, a familiar sound rings out: the Ryanair trumpet fanfare! The recording begins and a voice announces

“Thank you for flying with Ryanair” followed by

“Last year, over 90 per cent of our flights arrived on time. We hope you enjoyed yours and we look forward to welcoming you on board again soon.”

Yet another on-time flight! I have heard that jingle so many times, I think the association has become hard-wired into my brain! Now, if Ryanair asked me in a survey, “Do our flights arrive on time?”, I would find it difficult not to agree and answer positively. Why? Because they have reminded me so many times that their flights arrive on time.

So let’s transfer this to feedback. Colleagues from my Department were feeling exasperated by student dissatisfaction with feedback in the post-module questionnaire. The teaching team were giving so much quality feedback: feedback prior to activities, feedback immediately after tasks, feedback a week after some assessments, and always delivered in a timely manner according to the university regulations. There was no way to increase the feedback even if they tried!

And so they decided to have some fun with a serious intent, as I like to frame it. They ordered bright pink caps with the word FEEDBACK clearly visible in white to use the following academic year.

The pink caps were worn any time feedback was provided to the students. The students found it amusing, and the staff had a giggle too; you really couldn’t get overly serious wearing a bright pink cap! But the serious intent came to light in the student ratings at the end of the module. There was a 36% increase in the rating of feedback being timely and a 26% increase in the rating of feedback being helpful.

Remember, nothing changed from the previous year in relation to feedback other than highlighting when it was happening by wearing the bright pink caps. A very happy teaching team felt justly recognised for their efforts. For me, this is exactly the Ryanair effect with their annoying yet memorable jingle.

So when it comes to feedback, do our students actually recognise it? Probably not, unless we help them to identify it. Most students would say feedback is the comments they receive on their work, but feedback is more than that. How can we help students to identify feedback? Small quick wins: this is definitely one of those.