Abstract
Providing formative assessment opportunities has been recognised as a significant benefit to student learning. The outcome of any formative assessment should be one that ultimately helps improve student learning through familiarising students with the levels of learning required, informing them about gaps in their learning and providing feedback to guide the direction of learning. This article provides an example of how formative assessments can be developed into a formative assessment journey in which a number of different assessments are offered to students during the course of a module of teaching, thus utilising a spaced-education approach. As well as incorporating the specific drivers of formative assessment, we demonstrate how approaches deemed to be stimulating, interactive and entertaining can be incorporated with the aim of maximising enthusiasm and engagement. We provide an example of a mixed approach to evaluating elements of the assessment journey that focuses on student reaction, appraisal of qualitative and quantitative feedback from student questionnaires, focus group analysis and teacher observations. Whilst it is not possible to determine a quantifiable effect of the assessment journey on student learning, usage data and student feedback show that formative assessment can achieve high engagement and a positive response to different assessments. Those assessments incorporating an active learning element and a quiz-based approach appear to be particularly popular. A spaced-education format encourages a building-block approach to learning that is continuous in nature rather than focussed on an intense period of study prior to summative examinations.
Keywords: anatomy, feedback, formative assessment, quiz, spaced education, student learning
Introduction
Formative assessment is designed to aid learning by generating feedback information that benefits students during the learning process and leads to enhanced learning outcomes. The provision of opportunities for formative assessment has been recognised as a significant benefit to student learning (Rolfe & McPherson, 1995; Black & Wiliam, 1998; Bierer et al. 2008; Carrillo-de-la-Pena et al. 2009). Formative assessments are usually systematic in approach, and are designed to be available to students during a particular period of study to provide motivation for learning. Whilst it is generally agreed that the outcome of any formative assessment should be one that ultimately helps improve learning, it has been suggested that there should be focus on three specific drivers when designing any formative assessment: using a method to inform students of gaps in their learning; familiarising students with the expectations of summative assessments; and providing feedback that guides the direction of student learning (Rolfe & McPherson, 1995; Krasne et al. 2006). Ideally and unlike summative assessments, formative assessment should occur in a non-threatening environment, be offered at a time that is applicable to the students' learning journey and be one where the student takes an active part in the process (Rolfe & McPherson, 1995; Harlen & James, 1997; Krasne et al. 2006). The progress of students will only be enhanced by formative assessment if they are able to use the opportunities effectively and recognise where they need to develop their learning or skills.
The retention of knowledge after the learning period is essential so that the ‘graduate’ is able to rely on a full understanding of that knowledge in their chosen career (Sugand et al. 2010). This deeper learning is usually achieved when knowledge is not restricted to a collection of isolated facts, but instead results from an active learning process in which knowledge is fully understood and retained in context by the learner. Cognitive psychology research demonstrates that understanding involves creating links, and this is accomplished through active participation of the learner and familiarity with the material in question (Harlen & James, 1997). It has been suggested that the incorporation of formative assessment into the process will encourage adoption of an active learning approach and therefore may help achieve deeper learning (Rolfe & McPherson, 1995). In addition, it has been recognised that retention of gained knowledge is improved when educational encounters are spaced and repeated over a defined period and students are encouraged to apply ongoing learning rather than a cramming technique just before the final examinations (Krasne et al. 2006; Kerfoot et al. 2007). Greater use of formative assessment throughout the module or course therefore provides an ideal opportunity for encouraging a spaced approach to learning.
Incorporating an active experience into the formative assessment format is important, and research suggests that students prefer approaches that are stimulating, motivating and entertaining because they encourage their involvement (Harlen & James, 1997; Hudson & Bristow, 2006). Games can be used as an educational intervention to achieve these goals, and such strategies may help promote longer-term knowledge retention and the learning of key cognitive skills (Akl et al. 2010). A wide variety of games has been used for formative assessment in subjects such as physiology, biochemistry and pharmacology, and approaches have included board games, puzzles and activities based on television game shows (Moy et al. 2000; Willmott, 2001; Howard et al. 2002; Zakaryan et al. 2005; Hudson & Bristow, 2006; Shah et al. 2010). In each case students appear to have reacted positively to the intervention, and results suggest that the students' learning process might be enhanced using such approaches.
Formative assessment opportunities have been used in many areas of medical training, and are seen by some as an expected provision by medical schools to ensure students are able to track progress and therefore reach specified competencies (Bierer et al. 2008). For the opportunity of formative assessment to be maximised, it must therefore feature as a built-in component of a planned curriculum (Rushton, 2005). In the anatomy arena formative assessment has been used for many years, although it has increasingly become a defined and integrated part of the approach used in gross anatomy, histology and embryology (McBride & Prayson, 2008; Rizzolo et al. 2010; Evans, 2011). Here we present a formative assessment journey where a number of different assessments can be offered to students during the course of a module or unit of teaching, thus utilising a spaced-education approach. In this example from Brighton and Sussex Medical School (BSMS), assessments included those of differing content, format and release, and were made available to all students taking the module. A specific focus was given to activity- and quiz-based approaches with the aim of maximising enthusiasm and engagement. We demonstrate how a mixed approach to evaluation that focusses on aspects including evaluation of the server logs from the Learning Management System (LMS), appraisal of qualitative and quantitative feedback from student questionnaires, focus group analysis and teacher observations can be used to examine the use of and student reaction to formative assessment.
Creating an assessment journey
A collection of nine formative assessments in anatomy was designed and introduced into each of the system-based modules at BSMS. The system-based modules make up the basis of the first 2 years of the medical programme, and are designed to combine both scientific and clinical knowledge and skills in an integrated manner. Anatomy features as an interwoven element throughout each of the system-based modules, and is delivered using a multi-faceted approach that includes dissection (Evans & Watt, 2005; Evans & Cuffe, 2009). Individual elements of assessment were designed to have a different focus in terms of both content and approach, using an interactive element where possible. Whilst content was directed towards anatomical knowledge, each assessment included functional and clinical relevance and reference to other elements of the module to enhance an integrated approach. In each case the assessments were targeted at different stages during the running of the modules to create a formative assessment journey, whereby specific assessments were released at defined times during the 10 weeks of each module. All students were introduced to the concept of formative vs. summative assessments in order to reduce any confusion (Anziani et al. 2008).
Table 1 provides a summary of the formative assessments released in each module, and includes the time of release, the mode of delivery, and details of when students receive answers and feedback. The assessments were not all produced at the same time, and it took a period of 3 years to put the full assessment journey together. Analysis was made, however, only for students who had access to all assessments.
Table 1.
Details of each assessment type used in the formative assessment journey
Assessment | Release time (Module is 10 weeks) | Mode/location of delivery | Answers/feedback |
---|---|---|---|
‘Who Wants to be an Anatomist?’ | During each lecture (weeks 1–7) | Interactive coloured card quiz, lecture theatre | Immediate |
Anatomical wordsearch | At the end of each lecture (weeks 1–7) | Puzzle, managed learning environment | One week after release |
Dissection session checklist | At the end of each dissection session (weeks 1–7) | Checklist and questions, anatomy laboratory | Immediate |
‘Anatomy Quiz of the Week’ | At the end of each 2-week period of anatomy teaching (weeks 2, 4, 6, 8) | PowerPoint-based quiz, managed learning environment | One week after release |
Anatomy self-assessment quiz | Week 6 | PowerPoint-based quiz, lecture theatre and tutorial rooms | Immediate |
Anatomy spotter test | Week 7 | Anatomical specimens/models, anatomy laboratory | One day after dissection session |
‘A Question of Anatomy’ Revision quiz | Week 8 | Screencast quiz, managed learning environment | Immediate |
Anatomy viva | Week 8/9 | Oral assessment, anatomy laboratory | Immediate and result 1 day later |
Online picture quiz | Week 7–10 (new questions added each week) | Interactive online quiz, managed learning environment | Immediate (plus league table) |
A ‘Who Wants to be an Anatomist?’ quiz approach was used in each lecture in order to break up the session, to take account of students' concentration spans and to see how they were responding to the material being presented. A question would appear at different times during the presentation with four possible answers, to each of which a colour was assigned. Because an automated audience response system (ARS) was not available, students were each provided with a collection of cards of the same four colours and, when prompted, were asked to show the lecturer the colour of their chosen answer. The lecturer was able to gain an immediate indication of student performance and provide feedback to students.
A wordsearch puzzle was released on the LMS after each lecture, and contained 10 words that the lecturer felt were important to the lecture topic and with which the students should become familiar and be able to define.
The dissection checklists appeared in the dissection notes along with a number of clinical and observational questions. The checklists covered the main learning outcomes of the session, and were used by each table to check student knowledge in a group environment and provide immediate feedback.

Simple PowerPoint quizzes containing a selection of different question formats were released to students via the LMS at the end of every 2-week teaching period. The quizzes included image annotation, multiple choice and short-answer questions. Answers were time-released the following week.
A self-assessment quiz was released in the second half of each module and was held in a presentation format for all students in the lecture theatre. Students were asked to write down their individual answers to a range of questions. Following the quiz, students went through their answers in a group tutorial environment with demonstrators.
Spotter tests using specimens, models and images were incorporated into the dissection review sessions in the anatomy laboratory. The definitive answers were released 1 day after the session on the LMS.
The ‘A Question of Anatomy’ revision/review quiz was developed as a screencast and delivered online. The quiz was loosely based on a UK television programme called ‘A Question of Sport’ and used a number of different quiz rounds with varying formats. The screencast lasted for approximately 15 min without breaks, but could be paused at any point. The screencast approach allowed students to make multiple attempts, and allowed the lecturer to provide immediate feedback and include supplementary questions.
The anatomy viva was delivered as a group activity where the students from each dissecting table were asked a series of oral questions using specimens and models. Students were each asked individual questions to test their knowledge and understanding. Unlike all the other assessments, an element of a summative approach was included in the vivas, with students required to reach a satisfactory standard before being ‘signed off’ by the examiner.
The final element of the assessment journey was the online picture quiz, which students could access on an individual basis at any time. On entering the quiz, students were presented with a set of six randomly chosen anatomical images from a large database of multiple choice and true/false answer image-based questions. Whilst students would get a single attempt at each question, they could access a new set of six randomly chosen questions as many times as they wished. Ongoing total scores were made available to the student and also presented as part of a league table to all students.
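In computational terms, the picture quiz amounts to repeated random sampling from an image-question bank, a single attempt per question, and a running total that feeds a league table. The following minimal sketch illustrates that logic only; it is not the implementation used at BSMS, and all names (Question, draw_set, play_round, league_table, the toy bank) are our own assumptions for illustration.

```python
# Hypothetical sketch of the picture-quiz mechanics; not the BSMS implementation.
import random
from collections import defaultdict
from dataclasses import dataclass

@dataclass
class Question:
    image: str          # anatomical image shown to the student
    prompt: str
    options: list[str]  # multiple choice or true/false options
    answer: str

def draw_set(bank: list[Question], size: int = 6) -> list[Question]:
    """Return a set of randomly chosen questions from the image bank."""
    return random.sample(bank, size)

def play_round(bank, scores, student_id, answer_fn):
    """One pass through a six-question set; each question allows a single attempt."""
    for q in draw_set(bank):
        given = answer_fn(q)                       # the student's single attempt
        scores[student_id] += int(given == q.answer)
    return scores[student_id]

def league_table(scores):
    """Ongoing total scores presented to all students, highest first."""
    return sorted(scores.items(), key=lambda kv: kv[1], reverse=True)

# Toy usage: a 20-question bank and a student who always picks the first option.
bank = [Question(f"img_{i}.png", "Identify the structure", ["a", "b"], "a") for i in range(20)]
scores = defaultdict(int)
play_round(bank, scores, "student_001", lambda q: q.options[0])
print(league_table(scores))
```

Because each set is drawn independently, a student may meet a question more than once across sets, which matches the described behaviour of accessing a new set of six randomly chosen questions as many times as they wish.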
Student reaction to a formative assessment journey
A mixed analysis approach was adopted to assess student usage and reaction to the implementation of the formative assessment journey, and focussed on both qualitative and quantitative data retrieval for one first-year and one second-year module. Data were collected from the LMS server logs, teacher observations, a focus group and feedback questionnaires.
Student usage
Server log data from the LMS were retrieved for each of those assessments that had an online element, for example ‘Anatomy Quiz of the Week’ or the downloading of results of the spotter test. Data for each assessment were focussed on the total number of downloads; the number of students within a cohort downloading or attempting the assessment; and the maximum number of downloads made by an individual student. Four separate items of analysis were chosen, with each focussing on a different aspect of student usage (Table 2). The number of downloads for each element differed substantially, with some assessments such as the wordsearches receiving relatively few downloads (e.g. 23 for one wordsearch in the first-year module), whilst others received many more (e.g. 270 for the second-year spotter test results). This was largely reflected in the total number of students downloading each assessment, which ranged from 13% to 77% of the total cohort. Although the number of downloads per participating student averaged only 1 in each case, some students downloaded a particular assessment up to 11 times. For the online picture quiz, student usage data were recorded based on the number of questions answered in the quiz collectively by the student cohort and by individual students. The quizzes were released for 3 weeks prior to the summative examination, and a total of 39 704 and 12 795 questions were answered by first- and second-year students, respectively. First-year students recorded a total of 85% correct answers in their quiz, and second-year students recorded 74% correct answers. The maximum number of questions attempted by an individual student in each quiz was 2765 for the first-year module and 689 for the second-year module, with an average of 272 and 127 attempts per student in each respective year group. The proportion of students attempting the quiz was 94% for first-year students and 75% for second-year students.
Table 2.
Student usage data retrieved from LMS server logs for online formative assessments. Figures in brackets represent the number of students as a percentage of the year cohort
Assessment | Average downloads per assessment (1st-year module) | Average downloads per assessment (2nd-year module) | Average number of students downloading (1st-year module) | Average number of students downloading (2nd-year module) | Average downloads per student (1st-year module) | Average downloads per student (2nd-year module) | Maximum downloads for one student (1st-year module) | Maximum downloads for one student (2nd-year module) |
---|---|---|---|---|---|---|---|---|
Anatomical wordsearch | 23 | 41 | 20 (13%) | 33 (25%) | 1 | 1 | 1 | 1 |
‘Anatomy Quiz of the Week’ | 133 | 122 | 74 (50%) | 69 (52%) | 1 | 1 | 7 | 6 |
Anatomy spotter test | 144 | 270 | 77 (52%) | 97 (73%) | 1 | 1 | 4 | 11 |
‘A Question of Anatomy’ Revision quiz | 177 | 112 | 114 (77%) | 68 (52%) | 1 | 1 | 4 | 6 |
For the wordsearch and quiz of the week assessments, the results presented are a typical example of each of these types of assessments as there were multiple wordsearches and quizzes of the week released during the module.
LMS, learning management system.
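The figures in Table 2 are straightforward aggregations over download events: total downloads per assessment, the number of distinct students, downloads per participating student and the maximum recorded for any one student. As a rough illustration of how such a summary can be derived from server logs, the sketch below assumes a flat CSV export with student_id and assessment columns; the actual LMS log format is not described in this article, so these field names are assumptions.

```python
# Illustrative sketch only: LMS export formats vary, and the column names used
# here (student_id, assessment) are assumed rather than the actual log schema.
import csv
from collections import Counter

def usage_summary(log_path: str, cohort_size: int):
    """Summarise download events from a CSV log with header columns
    'student_id,assessment' and one row per download event."""
    per_assessment = Counter()   # total downloads per assessment
    per_student = Counter()      # downloads per (assessment, student) pair
    with open(log_path, newline="") as f:
        for row in csv.DictReader(f):
            per_assessment[row["assessment"]] += 1
            per_student[(row["assessment"], row["student_id"])] += 1

    summary = {}
    for assessment, total in per_assessment.items():
        students = {s for (a, s) in per_student if a == assessment}
        counts = [per_student[(assessment, s)] for s in students]
        summary[assessment] = {
            "total_downloads": total,
            "students": len(students),
            "percent_of_cohort": round(100 * len(students) / cohort_size),
            "avg_downloads_per_student": round(total / len(students), 1),
            "max_downloads_one_student": max(counts),
        }
    return summary
```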
Student reaction
First- and second-year student feedback was collected for the formative assessments via questions contained in the formal end-of-module anonymous questionnaires. Any open comments made about anatomy, and about the assessments in particular, were noted and categorised into common themes. One hundred and forty-four first-year students (97% of the student cohort) and 127 second-year students (96% of the student cohort) responded to the questionnaire. All students were given the opportunity to contribute open-ended comments under two categories: best aspect of the module; and aspects requiring improvement. A total of 134 (first year) and 102 (second year) separate comments attributed to the anatomy component were made in the ‘best aspect’ section, and included remarks related to the overall teaching quality, the subject area, lecture content and staff. Of the anatomy comments, 37 (first year) and 17 (second year) were made specifically about the formative assessments offered and are categorised in Table 3. There were no first-year comments about anatomy in the ‘aspects requiring improvement’ section of the questionnaire, but nine separate comments were made about the anatomy component by second-year students, and of these two were attributed to the formative assessments (Table 3).
Table 3.
Details of qualitative student comments attributed to the formative assessments obtained from the end-of-module questionnaire
Categorization of student comments | First year | Second year |
---|---|---|
Best aspects | | |
Having formative assessment in anatomy | 11 | 8 |
Aided in lecture interaction | 10 | – |
Needed for other disciplines | 9 | 6 |
Variety of assessment types | 6 | 3 |
Viva | 1 | – |
Aspects requiring improvement | | |
Questions were not always representative of summative assessments | – | 1 |
Viva should be placed earlier in the cycle | – | 1 |
Quantitative evaluation using a five-point Likert scale was also made of whether first-year students found the ‘A Question of Anatomy’ quiz to be of high quality and a useful element in their learning. A total of 140 students (94% of the student cohort) completed this section of the questionnaire, rating the usefulness of the material covered at an average of 4.4 and the difficulty and challenge of the material tested at 4.1. Students also gave a high rating for the interest and enjoyment of the quiz, with an average score of 4.5.
Student reaction was also assessed using a small focus group, to which four volunteer students from year 1 were recruited in response to an open e-mail invitation. The focus group methodology was included as we wanted to explore the question of formative assessment in relation to learning styles and the approach used to teach anatomy, and to capitalise on group interactions. A student facilitator was used for the focus group to help ensure no bias was introduced in the questions and discussion topics raised and the responses recorded. The format of the focus group was semi-structured, with prompts ensuring that student views were appropriately explored. This approach was used to help keep the discussion reasonably focussed, whilst allowing views to be freely expressed. Notes (agreed by the student group) were taken by the facilitator during the discussion and used for further analysis.

The focus group highlighted that formative assessment opportunities helped ‘consolidate their knowledge’ by linking particular concepts and information learned in a teaching session. Students commented that they ‘like to know what you need to know and to what level’, and formative assessments helped them gauge this more clearly. When asked, not all students in the group were aware that they had been through a formative assessment journey, but they recognised the benefit of releasing assessments at different times. They felt such an approach helped reinforce their learning in a continuous way and not just during the revision/review period. Students felt it was important that formative assessments were ‘not an absolute requirement’, and that they could choose what to use and when. When asked about individual resources, the students commented that ‘different students find different formative assessments more useful than others’. Students were aware that they have different learning styles and used the example that whilst for some a wordsearch puzzle might be effective, for others it was less so as it was not particularly active in approach.

The ‘Who Wants to be an Anatomist?’ quiz was highlighted as a way of maintaining interest and attention during lectures. The group members felt it gave them ‘more of an incentive to listen’, allowed them to follow the thread of the lecture and was done in a non-threatening way so as not to embarrass particular students who did not know the answer to a question. The immediate feedback allowed them to re-examine a concept they may have been struggling to grasp or identify gaps in their knowledge. This was also true of the dissection checklists, which the students in the group valued as these allowed them to test each other and identify and correct areas they had not been clear on before leaving the laboratory. The quizzes of the week, whilst popular, were not always accessed by this group of students as they felt they needed time to direct towards other non-anatomical areas of learning in the module and not spend all their time on one subject. Some commented that the quizzes could have covered other material delivered in that week of learning, not just anatomy, and been delivered as a ‘unified quiz’. The group all used the revision/review quiz and online picture quiz as they felt these were a good way to review material, with one student commenting that it helped ‘check my progress and highlight what I didn't know’. Students thought it was a fun and interactive approach that gave variation to their learning. In each case students felt that, whilst the feedback given for each question was useful, it could have been expanded.
Teacher observation
For those assessments that were not delivered online, and where no LMS data could therefore be retrieved, such as the in-lecture ‘Who Wants to be an Anatomist?’ quiz and the anatomy self-assessment quiz, qualitative teacher observations on student engagement (including attendance, discussion and interaction) were recorded through informal interviews. In the case of the ‘Who Wants to be an Anatomist?’ quiz, the lead teacher reported that all students appeared to participate by raising the coloured cards enthusiastically, suggesting positive engagement. A total of two to four questions were delivered in the space of a 50-min lecture. It was noted that a minority of students were initially rather hesitant in providing answers to the questions; however, this appeared to improve as the module progressed and students became used to the quiz approach.

The dissection checklists were used by both tutors and students to check knowledge and understanding of the main learning outcomes at the end of each dissection session. These appeared to work well, with most students reported to have responded and interacted well. Tutors found that the checklists were particularly useful in identifying those areas requiring clarification or more cadaveric dissection. Many students also used the checklists as an aide-mémoire during their dissection sessions to ensure they were covering the objectives of the session. This was even more notable in the review/revision sessions held towards the end of the module, where students reviewed all the material covered in the sessions and had access to the checklists.

The self-assessment quiz was well attended by the students in both the lecture element and the subsequent feedback tutorial (approximately 95%). Tutors reported that the feedback sessions were interactive and students appeared positively engaged, with most students regularly proffering answers to questions. Informal feedback from the students suggested they approached the quiz in one of two ways, with some students actively revising their knowledge beforehand and other students testing their current level of knowledge and determining how much extra learning was necessary to achieve a suitable standard.

The viva was a compulsory element and delivered as a group activity. Students demonstrated professional behaviour during the assessment, and the assessors gauged the level of performance as very high for most students. A total of seven second-year students (5%) were judged not to have achieved the standard of knowledge and understanding required; they were given a further opportunity at a later date, at which all were able to be ‘signed off’. All first-year students were ‘signed off’ after their first viva attempt.
The effect of an assessment journey on student learning activity
A formative assessment journey was designed and introduced in anatomy to examine the effect on student learning activity of a variety of assessment modes released and made available to students at defined stages during the course of the module. Overall, the analysis of the data collected demonstrated that the formative assessments were well used by students, although the reasons for utilising each assessment and the numbers of students using each assessment differed.
A spaced-education approach was used for introducing each of the formative assessments, with each released at a different stage during the module to support a building-block approach to learning; each assessment therefore included only the material that had already been covered or alluded to at that stage of the module, thus not alienating students by testing unseen material. It has been argued that knowledge retention is improved when educational encounters are spaced and repeated over time (Krasne et al. 2006; Kerfoot et al. 2007). Interestingly, students in the focus group did not appear to recognise that they had been through a formative assessment journey, although they thought such an approach was beneficial and would help reinforce their learning in a continuous way, which is in line with student feedback from other studies (Bierer et al. 2008). To improve the perception of the journey, all students are now given an overview of the assessment journey at the start of the module and provided with an illustrative map showing the types of assessment and when they are available.
An active approach was built into the design of many of the assessments to encourage participation and to enhance the way students experiment with their learning. Data from the LMS server logs show that some assessments were used more than others, with those requiring a more active approach, such as the ‘A Question of Anatomy’ revision quiz, being used the most. The results of the online picture quiz, which combined the most active elements, demonstrate this trend further, with extremely high participation rates recorded, especially by first-year students. In contrast, the wordsearch puzzles were used less widely, and this might be because this was a less active assessment type or because this approach does not reflect the types of questions used in the summative assessment and is therefore perceived as less useful. Rolfe & McPherson (1995) suggest that when an active learning approach is used in assessment design, deeper learning is encouraged. It is unclear whether the introduction of a formative assessment journey using active formats has helped develop a deeper approach to learning in our students, and this can only be judged at a later date when students are required to use the knowledge and understanding gained in subsequent modules and when on particular clinical rotations.
It has been reported that students use a range of different resources in their learning and in preparation for their examinations (Ozolins et al. 2008). The formative assessment journey was designed to incorporate a variety of different assessment types so that students would not become overly used to just one type of assessment, and so that all students would be able to find an assessment that they felt was useful in their learning. Usage data demonstrated that a large percentage of students used many if not all of the assessments on offer, although they utilised some assessment types more than others. Students in the focus group commented that it was important that they could choose what to use and when, and that this suited the various learning styles of different students. Some students suggested that if similar assessments could be made available in other module themes, or combined into the anatomy assessments, this would provide a more rounded progress report for them and could utilise their time more effectively. Student feedback demonstrated that the online revision quiz and picture quiz were the most popular assessments, and this was attributed in part to the fun nature of the quizzes. Other studies also suggest that students prefer approaches that are stimulating, motivating and entertaining because they encourage their involvement (Harlen & James, 1997; Hudson & Bristow, 2006). In addition, incorporation of fun into the learning process can help reduce the stress and anxiety some students have about assessment and help them to deal with the overwhelming feeling that there is too much to learn (Allery, 2004; Ballon & Silver, 2004; Zhang et al. 2011). One reason might be that activities such as quizzes and puzzles are often regarded as recreational (Shah et al. 2010).
Teacher observations demonstrated that the ‘Who Wants to be an Anatomist?’ quiz approach worked well in the lecture sessions, with high levels of participation. Students reported that this quiz approach maintained interest and attention during the lectures. This lecture-based quiz approach has been used for many years: Harden reported the use of audience response cards in lectures several decades ago and showed that they helped maintain student interest and enabled students to self-assess their understanding (Harden, 1968). The use of coloured cards and similar approaches has been largely superseded by the introduction of automated ARS, which have also been shown to enhance attention and enthusiasm in learners (Miller et al. 2003; Latessa & Mouw, 2005; Alexander et al. 2009). The current study demonstrates, however, that the coloured card system is still an appropriate method to use and one that is highly valued by students. Whilst it is recognised that the coloured card approach is more limited than using ARS clickers, it does provide an opportunity for spontaneous questions to be introduced if gaps in understanding are identified during the session, and unlike ARS it is not subject to technical difficulties, is fully transportable and costs very little.
Immediacy of feedback appears to be appreciated by students and was noted as an attribute of some of the assessments used in the formative assessment journey, such as the self-assessment quiz, the anatomy viva and the dissection checklists. Students noted the checklists as particularly valuable because they allowed them to identify and correct areas immediately (through their own investigation or with help from a demonstrator) and before leaving the anatomy laboratory. Checklists have also been shown to increase practical examination scores and dissection quality (Hofer et al. 2011). It is also recognised that students benefit from not always being provided with answers immediately, and a number of assessments can be used that encourage students to find out answers for themselves after the assessment.
Concluding remarks
The development of a formative assessment journey in anatomy provides a programme of different assessments that can help ensure students find a format to engage with, and act as an effective learning support resource. Assessments are designed to incorporate the drivers of highlighting gaps in student learning, giving an indication of the level of knowledge required and providing useful feedback (Rolfe & McPherson, 1995; Krasne et al. 2006). Presenting formative assessment using a temporal journey format encourages a continuous and staged approach to learning rather than an intense period of study just prior to summative assessments. Whilst it was not possible to determine a quantifiable effect of the assessment journey on student learning, the overall engagement with the elements of the assessment journey suggests that students reacted positively to the inclusion of each resource. To ensure that active participation continues in the future, however, it will be important to enhance assessment journeys by involving students in the development of the resources.
Acknowledgments
The authors would like to thank Tim Vincent for helping with the technical aspects of some of the online materials, Tracy Cuffe for some of the quiz design, and Alison Bryson for help with the text. The authors are also grateful to the students who took part in the focus group and completed anonymous questionnaires. Ethical approval was not required for this study, and the anonymity of the participants in data collection through informal questioning, questionnaires and the focus group was guaranteed.
Author contributions
DJRE: concept and design of the formative assessments, the construction of the assessment journey, literature search, data collection and analysis, and manuscript preparation. PZ: literature search, focus group facilitation and analysis, manuscript preparation. RAS: technical construction of the online picture quiz, data collection and analysis, and manuscript preparation.
References
- Akl EA, Pretorius RW, Sackett K, et al. The effects of educational games on medical students' learning outcomes: a systematic review. Med Teach. 2010;32:16–27. doi: 10.3109/01421590903473969.
- Alexander CJ, Crescini WM, Juskewitch JE, et al. Assessing the integration of audience response system technology in teaching of anatomical sciences. Anat Sci Educ. 2009;2:160–166. doi: 10.1002/ase.99.
- Allery LA. Educational games and structured experiences. Med Teach. 2004;26:504–505. doi: 10.1080/01421590412331285423.
- Anziani H, Durham J, Moore U. The relationship between formative and summative assessment of undergraduates in oral surgery. Eur J Dent Educ. 2008;12:233–238. doi: 10.1111/j.1600-0579.2008.00524.x.
- Ballon B, Silver I. Context is key: an interactive experiential and content frame game. Med Teach. 2004;26:525–528. doi: 10.1080/01421590412331282282.
- Bierer SB, Dannefer EF, Taylor C, et al. Methods to assess students' acquisition, application and integration of basic science knowledge in an innovative competency-based curriculum. Med Teach. 2008;30:e171–e177. doi: 10.1080/01421590802139740.
- Black P, Wiliam D. Assessment and classroom learning. Assess Educ. 1998;5:7–74.
- Carrillo-de-la-Pena MT, Bailles E, Caseras X, et al. Formative assessment and academic achievement in pre-graduate students of health sciences. Adv Health Sci Educ Theory Pract. 2009;14:61–67. doi: 10.1007/s10459-007-9086-y.
- Evans DJ. Using embryology screencasts: a useful addition to the student learning experience? Anat Sci Educ. 2011;4:57–63. doi: 10.1002/ase.209.
- Evans DJ, Cuffe T. Near-peer teaching in anatomy: an approach for deeper learning. Anat Sci Educ. 2009;2:227–233. doi: 10.1002/ase.110.
- Evans DJ, Watt DJ. Provision of anatomical teaching in a new British medical school: getting the right mix. Anat Rec. 2005;284B:22–27. doi: 10.1002/ar.b.20065.
- Harden RM. An audience response card. Br J Med Educ. 1968;2:220–222. doi: 10.1111/j.1365-2923.1968.tb01774.x.
- Harlen W, James M. Assessment and learning: differences and relationships between formative and summative assessment. Assess Educ. 1997;4:365–379.
- Hofer RE, Nikolaus OB, Pawlina W. Using checklists in a gross anatomy laboratory improves learning outcomes and dissection quality. Anat Sci Educ. 2011;4:249–255. doi: 10.1002/ase.243.
- Howard MG, Collins HL, DiCarlo SE. “Survivor” torches “Who wants to be a physician?” in the educational games ratings war. Adv Physiol Educ. 2002;26:30–36. doi: 10.1152/advan.00014.2001.
- Hudson JN, Bristow DR. Formative assessment can be fun as well as educational. Adv Physiol Educ. 2006;30:33–37. doi: 10.1152/advan.00040.2005.
- Kerfoot BP, Baker HE, Koch MO, et al. Randomized, controlled trial of spaced education to urology residents in the United States and Canada. J Urol. 2007;177:1481–1487. doi: 10.1016/j.juro.2006.11.074.
- Krasne S, Wimmers PF, Relan A, et al. Differential effects of two types of formative assessment in predicting performance of first-year medical students. Adv Health Sci Educ Theory Pract. 2006;11:155–171. doi: 10.1007/s10459-005-5290-9.
- Latessa R, Mouw D. Use of an audience response system to augment interactive learning. Fam Med. 2005;37:12–14.
- McBride JM, Prayson RA. Development of synergistic case-based microanatomy curriculum. Anat Sci Educ. 2008;1:102–105. doi: 10.1002/ase.21.
- Miller RG, Ashar BH, Getz KJ. Evaluation of an audience response system for the continuing education of health professionals. J Contin Educ Health Prof. 2003;23:109–115. doi: 10.1002/chp.1340230208.
- Moy JR, Rodenbaugh DW, Collins HL, et al. Who wants to be a physician? An educational tool for reviewing pulmonary physiology. Adv Physiol Educ. 2000;24:30–37. doi: 10.1152/advances.2000.24.1.30.
- Ozolins I, Hall H, Peterson R. The student voice: recognising the hidden and informal curriculum in medicine. Med Teach. 2008;30:606–611. doi: 10.1080/01421590801949933.
- Rizzolo LJ, Rando WC, O'Brien MK, et al. Design, implementation and evaluation of an innovative anatomy course. Anat Sci Educ. 2010;3:109–120. doi: 10.1002/ase.152.
- Rolfe I, McPherson J. Formative assessment: how am I doing? Lancet. 1995;345:837–839. doi: 10.1016/s0140-6736(95)92968-1.
- Rushton A. Formative assessment: a key to deep learning? Med Teach. 2005;27:509–513. doi: 10.1080/01421590500129159.
- Shah S, Lynch LMJ, Macias-Moriarity LZ. Crossword puzzles as a tool to enhance learning about anti-ulcer agents. Am J Pharm Educ. 2010;74:117. doi: 10.5688/aj7407117.
- Sugand K, Abrahams P, Khurana A. The anatomy of anatomy: a review for its modernization. Anat Sci Educ. 2010;3:83–93. doi: 10.1002/ase.139.
- Willmott CJR. Revision bingo. Biochem Mol Biol Educ. 2001;29:193–195.
- Zakaryan V, Bliss R, Sarvazyan N. Non-trivial pursuit of physiology. Adv Physiol Educ. 2005;29:11–14. doi: 10.1152/advan.00031.2004.
- Zhang J, Peterson RF, Ozolins LZ. Student approaches for learning in medicine: what does it tell us about the informal curriculum? BMC Med Educ. 2011;11:87. doi: 10.1186/1472-6920-11-87.