References

Astin F, Long A Characteristics of qualitative research and its application. British Journal of Cardiac Nursing. 2014; 9:(2)93-8 https://doi.org/10.12968/bjca.2014.9.2.93

Anderson M, Stickley T Finding reality: the use of objective structured clinical examination (OSCE) in the assessment of mental health nursing students interpersonal skills. Nurse Educ Pract. 2002; 2:(3)160-8 https://doi.org/10.1054/nepr.2002.0067

Barry M, Bradshaw C, Noonan M Improving the content and face validity of OSCE assessment marking criteria on an undergraduate midwifery programme: a quality initiative. Nurse Educ Pract. 2013; 13:(5)477-80 https://doi.org/10.1016/j.nepr.2012.11.006

Brosnan M, Evans W, Brosnan E, Brown G Implementing objective structured clinical skills evaluation (OSCE) in nurse registration programmes in a centre in Ireland: a utilisation focused evaluation. Nurse Educ Today. 2006; 26:(2)115-22

Bujack L, McMillan M, Dwyer J, Hazelton M Assessing comprehensive nursing performance: the Objective Structural Clinical Assessment (OSCA). Part 1--Development of the assessment strategy. Nurse Educ Today. 1991; 11:(3)179-84

Butler MM, Fraser DM, Murphy R What are the essential competencies required of a midwife at the point of registration? Midwifery. 2008; 24:(3)260-9

Chickering AW, Gamson ZF Seven principles for good practice in undergraduate education. 1987. http://www.lonestar.edu/multimedia/SevenPrinciples.pdf (accessed 15 April 2016)

Clynes MP, Raftery SE Feedback: an essential element of student learning in clinical practice. Nurse Educ Pract. 2008; 8:(6)405-11 https://doi.org/10.1016/j.nepr.2008.02.003

Corbett AT, Anderson JR Locus of feedback control in computer-based tutoring: Impact on learning rate, achievement and attitudes. In: Jacko J, Sears A, Beaudouin-Lafon M, Jacob R New York: ACM Press; 2001

Denzin NK, Lincoln YS (eds), 3rd edn. Thousand Oaks, CA: SAGE Publications; 2005

Duffield KE, Spencer JA A survey of medical students' views about the purposes and fairness of assessment. Med Educ. 2002; 36:(9)879-86

Fern EF Focus Groups: a Review of Some Contradictory Evidence, Implications, and Suggestions For Future Research. In: Bagozzi RP, Tybout AM Duluth, MN: Association for Consumer Research; 1983

Fluckiger J, Tixier y Vigil Y, Pasco R, Danielson K Formative Feedback: Involving Students as Partners in Assessment to Enhance Learning. College Teaching. 2010; 58:(4)136-40 https://doi.org/10.1080/87567555.2010.484031

Franklin P OSCEs as a means of assessment for the practice of nurse prescribing. Nurse Prescribing. 2005; 3:(1)14-23 https://doi.org/10.12968/npre.2005.3.1.17509

Fraser DM, Avis M, Mallik M The MINT project--an evaluation of the impact of midwife teachers on the outcomes of pre-registration midwifery education in the UK. Midwifery. 2013; 29:(1)86-94 https://doi.org/10.1016/j.midw.2011.07.010

Fry H, Ketteridge S, Marshall S (eds), 3rd edn. New York: Routledge; 2009

Gaberson KB, Oermann MH, 3rd edn. New York: Springer Publishing Company; 2010

Gibbs G, Simpson C Conditions under which assessment supports students' learning. Learning and Teaching in Higher Education. 2004; 1:3-31

Goodman JS, Wood RE, Hendrickx M Feedback specificity, exploration, and learning. J Appl Psychol. 2004; 89:(2)248-62

Harding J. London: SAGE Publications; 2013

International Confederation of Midwives. Essential Competencies for Basic Midwifery Practice. 2010. http://tinyurl.com/q9vspz9 (accessed 18 April 2016)

James A Ethnography in the study of children and childhood. In: Atkinson P, Coffey A, Delamont S, Lofland J, Lofland L London: SAGE Publications; 2007

Jay A Students' perceptions of the OSCE: a valid assessment tool? British Journal of Midwifery. 2007; 15:(1)32-7 https://doi.org/10.12968/bjom.2007.15.1.22677

Engaging research: Focus group based public engagement. 2008. http://www.open.ac.uk/blogs/per/?p=550 (accessed 18 April 2016)

A guide to interview guides. 2006. http://tinyurl.com/z5q68xu (accessed 18 April 2016)

Lange G, Kennedy HP Student perceptions of ideal and actual midwifery practice. J Midwifery Womens Health. 2006; 51:(2)71-7

Lindlof TR, Taylor BC, 3rd edn. Thousand Oaks, CA: SAGE Publications; 2010

McTighe J, O'Connor K Seven Practices for Effective Learning. Educational Leadership. 2005; 63:(3)10-17

Mitchell ML, Henderson A, Groves M, Dalton M, Nulty D The objective structured clinical examination (OSCE): optimising its value in the undergraduate nursing curriculum. Nurse Educ Today. 2009; 29:(4)398-404 https://doi.org/10.1016/j.nedt.2008.10.007

Moorhead R, Maguire P, Thoo SL Giving feedback to learners in the practice. Aust Fam Physician. 2004; 33:(9)691-5

Muldoon K, Biesty L, Smith V ‘I found the OSCE very stressful’: student midwives' attitudes towards an objective structured clinical examination (OSCE). Nurse Educ Today. 2014; 34:(3)468-73 https://doi.org/10.1016/j.nedt.2013.04.022

Nicol DJ, Macfarlane-Dick D Formative assessment and self-regulated learning: A model and seven principles of good feedback practice. Studies in Higher Education. 2006; 31:(2)199-218

London: NMC; 2009

London: NMC; 2012

Offredy M, Vickers P. Chichester: John Wiley and Sons; 2010

Orsmond P, Merry S, Reiling K Biology students' utilization of tutors' formative feedback: a qualitative interview study. Assessment & Evaluation in Higher Education. 2005; 30:(4)369-86 https://doi.org/10.1080/02602930500099177

Pearce J, Mulder R, Bai C Involving students in peer review: case studies and practical strategies for university teaching. Centre for the Study of Higher Education, University of Melbourne; 2009

Pintrich PR, Schunk DL, 4th edn. Upper Saddle River, NJ: Prentice Hall; 2013

Race P, Pickford R. London: SAGE Publications; 2007

Reed M, Walker R Leading by Example: an Examination of Early Education Foundation Degree Students Completing Research Dissertations. Journal of Early Childhood Education Research. 2014; 3:(1)51-64

Robson S. Oxford: Blackwell; 1993

Schroth ML The effects of delay of feedback on a delayed concept formation transfer task. Contemporary Educational Psychology. 1992; 17:(1)78-82 https://doi.org/10.1016/0361-476X(92)90048-4

Selby C, Osman L, Davis M, Lee M Set up and run an objective structured clinical exam. BMJ. 1995; 310:(6988)1187-90

Shute VJ Focus on formative feedback. Princeton, NJ: Educational Testing Service; 2007

Sloan DA, Donnelly MB, Schwartz RW, Strodel WE The Objective Structured Clinical Examination. The new gold standard for evaluating postgraduate clinical performance. Ann Surg. 1995; 222:(6)735-42

Smith GH. New York: McGraw-Hill; 1954

Walker R, Solvason C. London: SAGE Publications; 2014

Waugh A, Grant A, 12th edn. London: Churchill Livingstone; 2014

Wees D The role of immediacy of feedback in student learning. 2010. http://davidwees.com/content/role-immediacy-feedback-student-learning (accessed 18 April 2016)

Wilson J Bridging the theory practice gap. Aust Nurs J. 2008; 16:(4)

Worth-Butler MM, Fraser DM, Murphy RJ Eliciting the views of experienced midwives about the assessment of competence in midwifery. Midwifery. 1996; 12:(4)182-90

Yorke M Formative assessment and its relevance to retention. Higher Education Research & Development. 2001; 20:(2)115-26 https://doi.org/10.1080/758483462

Student midwives' perspectives on efficacy of feedback after objective structured clinical examination

02 June 2016
Volume 24 · Issue 5

Abstract

Students' experience of feedback is considered an indicator of the efficacy of the assessment process. Negative experiences of feedback reduce the likelihood that students will act on and learn from assessment. To understand the impact of feedback on learning, this study explored the experiences of student midwives after receiving feedback following an objective structured clinical examination (OSCE). Data were collected via a focus group with second-year undergraduate student midwives who had recently completed an OSCE. Students reported raised stress levels, concerns around the legitimacy of feedback, and inconsistencies in the manner in which feedback was articulated. Assessment feedback in higher education should be used to empower students to become self-regulated learners. This is important for student midwives, for whom a considerable amount of learning takes place in practice. The study has implications for midwifery academics concerned with modes of assessment and the quality of assessment feedback in midwifery education.

Assessment of learning and subsequent feedback are important in both theoretical and practice domains, to ensure students understand the theory underpinning midwifery and are able to practise competently to the standard required by the Nursing and Midwifery Council (NMC). It is incumbent on both the midwifery lecturer and the practice mentor to understand the key role of feedback in ensuring student midwives learn from, and are empowered by, the assessment process (Nicol and Macfarlane-Dick, 2006). Feedback following assessment should be both formative and diagnostic, providing information about student achievement to teachers and learners alike. Assessment and feedback contribute to student learning at university, assist the development of evaluative skills and reflection, and are essential for employment and lifelong learning (Gaberson and Oermann, 2010).

Background

Students are exposed to feedback in various forms, and their perception of feedback as ‘useful’ depends on whether the feedback is relevant to the assessment and whether it is perceived as positive or negative. Both positive and negative feedback may be deemed useful in certain situations (Orsmond et al, 2005), for example after an objective structured clinical examination (OSCE), where the purpose is to assess competency to carry out a particular midwifery skill. In this situation, feedback on performance should be unambiguous, clearly articulated and delivered in a manner that makes absolutely clear to the student those areas requiring immediate improvement. In reality, feedback can produce unclear results, requiring further investigation on the part of students who seek to learn from feedback and improve their performance (Fry et al, 2009). When feedback is not clearly articulated, students are unable to self-regulate their learning, improve future performance and thus gain the necessary skills to practise competently.

In order to ensure feedback following assessment is ‘fit for purpose’, and to improve feedback performance, this study sought to explore the experiences of student midwives after receiving feedback following OSCE. The rationale for choosing to focus on feedback following OSCE is twofold. First, students are likely to feel a degree of performance anxiety due to the conditions in which OSCE occurs, i.e. a simulated work environment, and this may affect how feedback is perceived. Second, the purpose of OSCE is to improve technical skills and the underlying knowledge required for safe and accurate application (Mitchell et al, 2009). Consequently, the feedback element of OSCE is key if students are to gain maximum benefit from what has been considered a contentious mode of assessment (Jay, 2007).

Objective structured clinical examination

OSCE is a widely used test of clinical skills in which skills are assessed at a series of stations by an examiner using previously determined objective marking criteria (Selby et al, 1995). Sloan et al (1995) suggest OSCE is the gold standard for evaluating clinical performance. As skill transference and competence at the point of registration are of paramount importance (Bujack et al, 1991; Worth-Butler et al, 1996; Butler et al, 2008; International Confederation of Midwives, 2010; NMC, 2012; Fraser et al, 2013), testing midwifery skills through the use of OSCE would seem a legitimate method for evaluating clinical midwifery performance.

While OSCE is widely used in medicine, with comprehensive evaluation of the evidence to prove its worth (Barry et al, 2013), little evidence is available on the use of OSCE in midwifery education. Student stress and anxiety are suggested as reasons to avoid OSCE in midwifery programmes (Franklin, 2005; Brosnan et al, 2006). However, performance anxiety aroused by undertaking OSCE may prove beneficial in preparing students to deal with and perform well in emergency situations in clinical practice (Duffield and Spencer, 2002).

Assessment feedback

Feedback is an important component of learning, yet it is not well understood (Yorke, 2001). Students usually expect assessment feedback to be provided in the form of written comments on assignments, with little understanding that feedback may be provided in different formats (Fry et al, 2009). Explaining different feedback formats can be problematic as students may be conditioned through prior experience to receive feedback in a particular way. Clarity of feedback—in whatever form it takes—is therefore essential. Feedback which is variously too long, too short or poorly articulated may result in the student simply paying no attention to it (Shute, 2007). When students are provided with clear feedback, there is an increased likelihood of self-regulation of learning, which in turn has an impact on the transference of learning from theory to practice.

Timeliness of feedback is integral to good feedback practice (Chickering and Gamson, 1987; Nicol and Macfarlane-Dick, 2006). Timeliness in relation to feedback means ‘close to the act of learning production’ (Wees, 2010). However, there is debate as to whether feedback should be delayed or immediate (McTighe and O'Connor, 2005; Fluckiger et al, 2010). Timeliness is arguably the most important criterion for ‘good’ feedback from the midwifery student perspective in relation to learning a clinical skill, in that midwifery skills need to be learnt fairly quickly in the university environment, prior to enactment in the clinical setting (Schroth, 1992; Corbett and Anderson, 2001).

Specificity of feedback is defined as the level of information presented in feedback messages (Goodman et al, 2004). Specific feedback is thought to produce benefits in the short term. However, those benefits may not endure over time or with modification of the task (Goodman et al, 2004). Specific versus generalised feedback is an important consideration, not least in midwifery education where competency in the performance of clinical skills is essential. Immediate specific feedback has the potential to modify the learner's thinking and/or behaviour for the purpose of improving learning in practice (Shute, 2007).

In light of the literature concerned with assessment feedback, this study sought to understand, from the perspectives of student midwives, the experience of receiving feedback following OSCE.

Study aims

The study aimed to:

  • Understand how feedback is experienced by student midwives following OSCE
  • Inform strategies to enable students to self-regulate learning
  • Maximise the potential of feedback to affect performance in practice settings
  • Facilitate best practice in giving feedback.
Methodology

Ethical permission for the study was granted by the university's ethics committee. Particular ethical challenges related to the duality of the researcher as academic teacher and midwifery practitioner. It was important to be mindful of the potential for this study to affect the current and ongoing student/teacher relationship, the so-called insider/outsider phenomenon (Reed and Walker, 2014). James (2007) has suggested that, in order to counteract insider/outsider conflict, the researcher should take a personal approach by working ‘with’ the participants and not ‘on’ them. This requires a delicate balance between the roles of practitioner and researcher (Walker and Solvason, 2014).

It was important in this study to understand student midwives' perceptions of, and subsequent behaviours after, receiving feedback—specifically whether or not feedback was ‘fit for purpose’ in relation to enabling students to improve performance and competency in midwifery skills. A qualitative approach to research design was chosen for its ability to reveal a target audience's range of behaviours and the perceptions that drive those behaviours in relation to a particular topic or issue (Denzin and Lincoln, 2005). Data collection via a focus group allowed for insights that would be less accessible without the interaction found in a group setting (Lindlof and Taylor, 2010; Astin and Long, 2014).

Sample

Midwifery students in the second year of study were purposively selected to ensure the respondents were able to provide data of relevance to the topic of enquiry (Offredy and Vickers, 2010). Criteria for inclusion in the focus group required that students should:

  • Have undertaken OSCE assessment in the first year of the programme
  • Have received feedback
  • Be willing to take part in a focus group discussion.
Thirty-six student midwives met the criteria and were invited to take part in the study by email. Six students expressed interest in the study and were subsequently invited to participate in a focus group discussion centred on experiences of receiving feedback following OSCE. Students were provided with detailed information about the study aims and duly consented to the study. It is possible the small number of student midwives who showed interest in the study was reflective of dissatisfaction with the feedback process. While this may or may not be the case, it was nevertheless important to listen to the voices of the student midwives who were willing to share experiences of receiving assessment feedback, while acknowledging these student midwives may not be representative of the cohort.

The focus group

Ideal focus group size is said to be no more than 8–12 members (Fern, 1983). Irrespective of ideal numbers, successful focus groups depend on location, seating arrangements, availability of participants willing to engage in the process, and availability of a moderator (Lindlof and Taylor, 2010). Smith (1954: 59) justified relatively small numbers in a focus group when arguing that participation ‘be limited to permit genuine discussion among all members'. It was important in this focus group to ensure all participants had equal opportunity to share experiences of feedback following OSCE. The interview guide was deemed particularly important in this respect, in that testimony is generally considered a weak source of evidence (Kennedy, 2006). Attention was paid to the interview guide to ensure other sources of evidence were used to inform the line of questioning, for example: previous student evaluations whereby feedback following assessment had been flagged as an issue; anecdotal accounts of poor assessment feedback from student midwives; and mentor accounts of student midwives' apparent inability to transfer theoretical learning to practice settings.

A key role in the focus group method is the ‘moderator’, who should be able to ask specific questions for the discussion (Jensen, 2008). In this case, the researcher, as a ‘knowledgeable’ person, acted as moderator. The focus group lasted for 45 minutes, after which no new information was forthcoming. Participants were subsequently thanked and the audio equipment was switched off. The focus group recording was listened to and transcribed verbatim.

Data analysis

Focus groups generally produce large amounts of data; thus, a general aim is to reduce the data (Robson, 1993). As only one focus group was conducted, it was not necessary to reduce the data; rather, it was important to ensure data were examined, categorised and tabulated in order to address the goal of the study, i.e. to understand from the perspective of student midwives the experience of receiving feedback following OSCE. Framework analysis involves a number of steps for analysing focus group data (Harding, 2013). First, the recorded focus group was listened to. Second, the transcript was read in its entirety. Third, notes were taken from the transcripts in order to get a sense of the focus group as a whole. Emergent concepts allowed categories to be identified, which in turn enabled identification of descriptive statements. Quotations were then lifted from the transcripts to illustrate thematic content. Student midwives were numbered 1–6 for purposes of anonymity.

Findings

Three main themes were identified from the data:

  • Students needed to be in a state of ‘readiness’ to receive feedback. The fact that OSCE heightened stress among students impinged on their readiness to receive feedback and act on it
  • Students were sceptical as to the location of OSCE, i.e. in the theoretical setting as opposed to the practice setting. The fact that OSCE is designed to assess clinical skills suggested to students that OSCE and subsequent feedback should be located within clinical settings. This appeared to have an impact on students' perception of the ‘legitimacy’ of feedback
  • Students were susceptible to the manner in which feedback was given. When feedback was perceived as overly critical, students' confidence in performing clinical midwifery skills was affected, which in turn had an impact on transference of learning to practice.
Readiness for feedback

When asked to talk about their experiences of the OSCE process, students reported high levels of stress. In contrast to Muldoon et al (2014), who found student midwives to be neutral towards OSCE, students in this study expressed the view that OSCE raised stress levels, which affected performance:

‘We get so nervous [about OSCE], it's a horrible experience. The process is so stressful.’ (Student midwife (SM) 1)

‘It is so stressful, though it isn't helpful to be that nervous.’ (SM 3)

Student midwives appreciated that an increase in stress during a simulated scenario for assessment may be similar to stress occurring in real emergency situations. The relationship between stress and performance under pressure was acknowledged, albeit in hindsight:

‘It is a positive experience in hindsight, as the event forces you to deal with the stress needed for adrenaline and emergency situations on labour ward.’ (SM 1)

‘In a way it helps us prepare ourselves for coping in a stressful emergency like shoulder dystocia.’ (SM 4)

Student midwives noted that OSCE does not readily replicate practice. The implications for the usefulness of feedback from a simulated assessment are apparent:

‘The OSCE is all about performance and parrot-fashion learning, so the feedback you may get may not be applicable to real life.’ (SM 5)

‘It is so unnatural, and it's like you are acting, so if you are good at acting then you do OK, if you are quite sensitive to this kind of thing then it is not for you and you could become inhibited by the act of performing.’ (SM 6)

For these students, at least, OSCE was too far removed from a ‘real life’ situation to be an indicator of performance in practice. This type of assessment may not be a true reflection of a student's practical skills (Anderson and Stickley, 2002).

Legitimacy of feedback

When asked to talk specifically about feedback following OSCE, student midwives expressed the view that feedback was generally received at a given point in time, for example, after a written assessment. Conversely, feedback in the clinical context was ad hoc, freely given at any time. Feedback provided by practising midwives, as opposed to academic staff, was seen as a more credible indicator of clinical performance. The fact that OSCE occurs in the academic setting and is assessed by academic staff raises questions about the legitimacy of OSCE as a means of assessing competency in clinical midwifery skills.

‘Yes, OSCE feedback is fit for purpose, but doesn't help us in practice as you are giving the feedback and not the clinician.’ (SM 2)

‘We would prefer it [OSCE] to happen in the clinical arena, with women, doing clinical skills and being observed with women.’ (SM 5)

‘If lecturers would come out to the clinical areas and make their judgements and assessments based on what they saw in clinical practice, it would reduce stress and feedback would be taken better.’ (SM 3)

Articulation of feedback

Regardless of the mode and location of OSCE and subsequent feedback, students required reassurance regarding their performance in order to develop confidence. Students discussed how feedback was variously provided by midwifery lecturers and practising midwives. A consensus existed regarding the manner in which feedback was given and its relationship to confidence when performing practical midwifery skills.

‘I didn't listen the minute she started being mean. I had been judged, and this discredited my whole practice. My confidence shot to an all-time low. I discussed it at a later date with my mentor in practice as I couldn't move on; it was affecting all my skills in practice. I felt useless, all because she was mean. Don't get me wrong, I am an adult. I can take constructive criticism and I wanted to learn about how I could improve, but the way she said it made me feel I wasn't good enough to be here.’ (SM 6)

‘During the feedback the mentor had given me a few hints about checking a baby top to toe, and I remembered them when I was doing the skill again. It's helpful, as I could pass the information to my mothers too. I will never forget that now.’ (SM 2)

Overly critical feedback can directly affect students' confidence to perform the competencies they are expected to develop. This effect is counterproductive, in that confident students are thought to apply more effort and persistence and, therefore, perform better (Pintrich and Schunk, 2013).

Discussion

Providing feedback is a vital aspect of supporting student midwives and is integral to the role of midwifery lecturer and practising midwife. Feedback, if perceived by students as unhelpful, untimely or inappropriate, can have a significant impact on students' ability to self-regulate and transfer learning from theory to practice, i.e. to bridge the so-called theory–practice gap (Wilson, 2008). Feedback should be constructive, rather than destructive, if students are to learn from and be empowered by the assessment process (Clynes and Raftery, 2008).

OSCE is a contentious method of testing student midwives' competency to perform clinical midwifery skills, because the format can create unacceptable levels of stress (Jay, 2007). When stress levels are raised during the assessment process, students' readiness to receive feedback is adversely affected; they are unlikely to benefit from feedback and may, therefore, be unable to self-regulate learning and improve performance in practice. It is important for midwifery lecturers to understand the impact of stress on learning, and to mitigate anxiety and promote deep learning through careful preparation of students prior to this type of assessment (Race and Pickford, 2007).

Nerves are an expected element of any important situation, especially an exam environment, as the autonomic nervous system triggers the release of catecholamines, namely adrenaline, which is primarily present for protection in the fight-or-flight response (Waugh and Grant, 2014). It is important for students to understand this physiology if assessment is to support rather than hinder learning. Stress levels are elevated in response to perceived danger and may result in mental and physical manifestations such as feeling nervous, dry mouth, increased heart rate, and avoidance or fidgeting. Students in this study recognised that raised stress levels were likely to resemble those experienced during emergency situations, which might prove beneficial in relation to performing under pressure (Duffield and Spencer, 2002).

Student midwives believed feedback to be unhelpful if the person giving feedback relating to competency in clinical midwifery skills was not a practising midwife. A hierarchy appeared to exist in how student midwives viewed the midwifery programme, whereby the practical element of midwifery education was deemed more important than the theoretical element. Midwifery lecturers may well dispute students' view that practice is more important than theory, and this argument is valid in that the standards for midwifery education hold that programmes should be equally weighted between theory and practice (NMC, 2009). Students are more likely to be disappointed in how feedback is given than in a lack of feedback (Brosnan et al, 2006). For feedback to be effective, it must be given in a supportive environment and must be specific to the assessment, with the giver of feedback mindful of how the learner receives the feedback (Moorhead et al, 2004).

When perceived as unfair, unjust or poorly articulated, feedback is unlikely to have a positive impact on transference of theoretical learning to the practice setting, thereby reinforcing the theory–practice gap (Lange and Kennedy, 2006). This problem is compounded when learners have little confidence in the giver of feedback. In order to motivate the student and improve learning, feedback must be timely, effective and appropriate (Pearce et al, 2009). Feedback that focuses on growth of the learner makes sense to students and is far more likely to advance their learning (Gibbs and Simpson, 2004).

Limitations

It is important to note certain limitations to the arguments presented here. First, the paper draws on a small number of student midwives taking part in one focus group. Student midwives in this study may not be representative of the student body as a whole, nor of the cohort from which the sample was drawn. Nevertheless, it is important to recognise and ‘give a voice’ to the student midwives who agreed to share their views about feedback on this occasion. For this reason, a focus group was an appropriate design for this study. Second, student midwives talked about the experience of receiving feedback following OSCE, which is recognised as a stressful event and is therefore not used in all midwifery programmes. In the absence of literature concerned with evaluation of OSCE within midwifery education, it was accepted that stress may have had an impact on students' experiences of receiving feedback.

Conclusion

Despite its limitations, this study raises important considerations for midwifery educators engaged in assessing and giving feedback to student midwives, in particular following OSCE. Attention should be paid to the consequences of feedback. Midwifery educators should ensure feedback is relevant to the student, specific to the assessment, timely, and appropriate. This will help to ensure that feedback has an impact on self-regulation of learning and transference of learning into practice for student midwives.

Key Points

  • Feedback has an impact on student learning and has the ability to empower students and promote lifelong learning
  • Educators need to understand the impact that stress has on the learning environment
  • Feedback should be given in a supportive environment, specific to the assessment
  • Feedback must be timely, effective and appropriate
Conflict of interest: The authors have declared no conflict of interest.