Latest Research on Professions Education: Mar 2022

Reflection and reflective practice in health professions education: a systematic review

The importance of reflection and reflective practice is frequently noted in the literature; indeed, reflective capacity is regarded by many as an essential characteristic of professional competence. Educators assert that the emergence of reflective practice is part of a change that acknowledges the need for students to act and to think professionally as an integral part of learning throughout their courses of study, integrating theory and practice from the outset. Activities to promote reflection are now being incorporated into undergraduate, postgraduate and continuing medical education, and across a variety of health professions. The evidence to support and inform these curricular interventions and innovations remains largely theoretical. Further, the literature is dispersed across several fields, and it is unclear which approaches may have efficacy or impact. We therefore designed a literature review to evaluate the existing evidence about reflection and reflective practice and their utility in health professional education. Our aim was to understand the key variables influencing this educational process, to identify gaps in the evidence, and to explore any implications for educational practice and research.[1]


Technology-Enhanced Simulation for Health Professions Education

Context Although technology-enhanced simulation has widespread appeal, its effectiveness remains uncertain. A comprehensive synthesis of evidence may inform the use of simulation in health professions education.

Objective To summarize the outcomes of technology-enhanced simulation training for health professions learners in comparison with no intervention.

Data Sources Systematic search of MEDLINE, EMBASE, CINAHL, ERIC, PsycINFO, Scopus, key journals, and previous review bibliographies through May 2011.

Study Selection Original research in any language evaluating simulation compared with no intervention for training practicing and student physicians, nurses, dentists, and other health care professionals.

Data Extraction Reviewers working in duplicate evaluated quality and abstracted information on learners, instructional design (curricular integration, distributing training over multiple days, feedback, mastery learning, and repetitive practice), and outcomes. We coded skills (performance in a test setting) separately for time, process, and product measures, and similarly classified patient care behaviors.

Data Synthesis From a pool of 10,903 articles, we identified 609 eligible studies enrolling 35,226 trainees. Of these, 137 were randomized studies, 67 were nonrandomized studies with 2 or more groups, and 405 used a single-group pretest-posttest design. We pooled effect sizes using random effects. Heterogeneity was large (I² > 50%) in all main analyses. In comparison with no intervention, pooled effect sizes were 1.20 (95% CI, 1.04-1.35) for knowledge outcomes (n = 118 studies), 1.14 (95% CI, 1.03-1.25) for time skills (n = 210), 1.09 (95% CI, 1.03-1.16) for process skills (n = 426), 1.18 (95% CI, 0.98-1.37) for product skills (n = 54), 0.79 (95% CI, 0.47-1.10) for time behaviors (n = 20), 0.81 (95% CI, 0.66-0.96) for other behaviors (n = 50), and 0.50 (95% CI, 0.34-0.66) for direct effects on patients (n = 32). Subgroup analyses revealed no consistent statistically significant interactions between simulation training and instructional design features or study quality.
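The review pools per-study effect sizes with a random-effects model, but the abstract does not name the estimator. As a minimal sketch, the Python snippet below uses the common DerSimonian-Laird approach on invented study data to show how a pooled effect size, its 95% CI, and the I² heterogeneity statistic are obtained; none of the numbers come from the review itself.

import math

# Hypothetical per-study effect sizes (SMDs) and variances; illustrative only.
effects = [1.6, 0.4, 1.3, 0.7, 1.5]
variances = [0.04, 0.09, 0.06, 0.05, 0.08]

# Fixed-effect (inverse-variance) weights and Cochran's Q statistic.
w = [1.0 / v for v in variances]
fixed = sum(wi * yi for wi, yi in zip(w, effects)) / sum(w)
q = sum(wi * (yi - fixed) ** 2 for wi, yi in zip(w, effects))
df = len(effects) - 1

# DerSimonian-Laird estimate of the between-study variance tau^2.
c = sum(w) - sum(wi ** 2 for wi in w) / sum(w)
tau2 = max(0.0, (q - df) / c)

# Random-effects weights, pooled effect, and 95% CI.
w_re = [1.0 / (v + tau2) for v in variances]
pooled = sum(wi * yi for wi, yi in zip(w_re, effects)) / sum(w_re)
se = math.sqrt(1.0 / sum(w_re))
lo, hi = pooled - 1.96 * se, pooled + 1.96 * se

# I^2: share of total variability in effect sizes due to heterogeneity.
i2 = max(0.0, (q - df) / q) * 100.0 if q > 0 else 0.0

print(f"pooled SMD = {pooled:.2f}, 95% CI ({lo:.2f}, {hi:.2f}), I^2 = {i2:.0f}%")

With these invented inputs the script reports I² near 77%, the kind of large heterogeneity (I² > 50%) the review flags in all of its main analyses.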

Conclusion In comparison with no intervention, technology-enhanced simulation training in health professions education is consistently associated with large effects for outcomes of knowledge, skills, and behaviors and moderate effects for patient-related outcomes.[2]


Rethinking programme evaluation in health professions education: beyond ‘did it work?’

Context For nearly 40 years, outcome-based models have dominated programme evaluation in health professions education. However, there is increasing recognition that these models cannot address the complexities of the health professions context, and studies employing alternative evaluation approaches are appearing in the literature. A similar paradigm shift occurred over 50 years ago in the broader discipline of programme evaluation. Understanding the development of contemporary paradigms within this field provides important insights to support the evolution of programme evaluation in the health professions.

Methods In this discussion paper, we review the historical roots of programme evaluation as a discipline, demonstrating parallels with the dominant approach to evaluation in the health professions. In tracing the evolution of contemporary paradigms within this field, we demonstrate how their aim is not only to judge a programme’s merit or worth, but also to generate information for curriculum designers seeking to adapt programmes to evolving contexts, and researchers seeking to generate knowledge to inform the work of others.

Discussion From this evolution, we distil seven essential elements of educational programmes that should be evaluated to achieve the stated goals. Our formulation is not a prescriptive method for conducting programme evaluation; rather, we use these elements as a guide for the development of a holistic ‘programme of evaluation’ that involves multiple stakeholders, uses a combination of available models and methods, and occurs throughout the life of a programme. Thus, these elements provide a roadmap for the programme evaluation process, which allows evaluators to move beyond asking whether a programme worked, to establishing how it worked, why it worked and what else happened. By engaging in this process, evaluators will generate a sound understanding of the relationships among programmes, the contexts in which they operate, and the outcomes that result from them.[3]


Flipped classroom improves student learning in health professions education: a meta-analysis

Background
The use of the flipped classroom approach has become increasingly popular in health professions education. However, no meta-analysis has been published that specifically examines the effect of the flipped classroom versus the traditional classroom on student learning. This study examined the findings of comparative articles through a meta-analysis in order to summarize the overall effects of teaching with the flipped classroom approach. We focused specifically on a set of flipped classroom studies in which pre-recorded videos were provided before face-to-face class meetings. These comparative articles focused on health care professionals, including medical students, residents, doctors, nurses, and learners in other health care professions and disciplines (e.g., dental, pharmacy, environmental or occupational health).

Method
Using predefined study eligibility criteria, seven electronic databases were searched in mid-April 2017 for relevant articles. Methodological quality was graded using the Medical Education Research Study Quality Instrument (MERSQI). Effect sizes, heterogeneity estimates, analyses of possible moderators, and publication bias were computed using the Comprehensive Meta-Analysis software.
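The effect sizes referred to here are standardized mean differences, commonly computed as Hedges' g (Cohen's d with a small-sample correction); whether this study applied the corrected form is an assumption, and the exam scores and sample sizes below are invented purely for illustration.

import math

def hedges_g(m1, sd1, n1, m2, sd2, n2):
    """Hedges' g and its variance from two groups' summary statistics."""
    # Pooled standard deviation across the two groups.
    sp = math.sqrt(((n1 - 1) * sd1 ** 2 + (n2 - 1) * sd2 ** 2) / (n1 + n2 - 2))
    d = (m1 - m2) / sp                       # Cohen's d
    j = 1.0 - 3.0 / (4.0 * (n1 + n2) - 9.0)  # small-sample correction factor
    g = j * d
    # Approximate variance of g, used as the inverse-variance weight.
    var_g = j ** 2 * ((n1 + n2) / (n1 * n2) + d ** 2 / (2.0 * (n1 + n2)))
    return g, var_g

# Invented exam scores: flipped vs. traditional classroom.
g, var_g = hedges_g(m1=78.0, sd1=10.0, n1=60, m2=74.5, sd2=10.5, n2=60)
print(f"g = {g:.2f}, SE = {math.sqrt(var_g):.2f}")

One such (g, variance) pair per study is what a meta-analysis package pools into the overall estimate reported in the Results below.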

Results
A meta-analysis of 28 eligible comparative studies (between-subject design) showed an overall significant effect in favor of flipped classrooms over traditional classrooms for health professions education (standardized mean difference, SMD = 0.33, 95% confidence interval, CI = 0.21–0.46, p < 0.001), with no evidence of publication bias. In addition, the flipped classroom approach was more effective when instructors used quizzes at the start of each in-class session. More respondents reported that they preferred flipped classrooms to traditional ones.

Conclusions
Current evidence suggests that the flipped classroom approach in health professions education yields a significant improvement in student learning compared with traditional teaching methods.[4]


The effectiveness of self-directed learning in health professions education: a systematic review

Objectives Given the continuous advances in the biomedical sciences, health care professionals need to develop the skills necessary for life-long learning. Self-directed learning (SDL) is suggested as the methodology of choice in this context. The purpose of this systematic review is to determine the effectiveness of SDL in improving learning outcomes in health professionals.

Methods We searched MEDLINE, EMBASE, ERIC and PsycINFO through to August 2009. Eligible studies were comparative and evaluated the effect of SDL interventions on learning outcomes in the domains of knowledge, skills and attitudes. Two reviewers working independently selected studies and extracted data. Standardised mean difference (SMD) and 95% confidence intervals (95% CIs) were estimated from each study and pooled using random-effects meta-analysis.
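A reported interval such as the knowledge-domain result below (SMD 0.45, 95% CI 0.23–0.67) can be sanity-checked by recovering the standard error from the CI width and forming a Wald z statistic. This is a standard back-calculation under a normal approximation, not a method the authors describe.

import math

def se_from_ci(lower, upper, z=1.96):
    # A normal-approximation 95% CI spans 2 * 1.96 standard errors.
    return (upper - lower) / (2.0 * z)

# Knowledge-domain result reported in this review's Results section.
smd, lower, upper = 0.45, 0.23, 0.67
se = se_from_ci(lower, upper)
z = smd / se  # |z| > 1.96 implies p < 0.05 (two-sided)
print(f"SE = {se:.3f}, z = {z:.2f}")

Here z is about 4.0, consistent with the authors calling the knowledge-domain increase statistically significant, while the skills and attitudes intervals, which cross zero, are not.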

Results The final analysis included 59 studies that enrolled 8011 learners. Twenty-five studies (42%) were randomised. The overall methodological quality of the studies was moderate. Compared with traditional teaching methods, SDL was associated with a moderate increase in the knowledge domain (SMD 0.45, 95% CI 0.23–0.67), a trivial and non-statistically significant increase in the skills domain (SMD 0.05, 95% CI −0.05 to 0.22), and a non-significant increase in the attitudes domain (SMD 0.39, 95% CI −0.03 to 0.81). Heterogeneity was significant in all analyses. SDL was more effective when learners were involved in choosing their own learning resources, and advanced learners seemed to benefit more from SDL.

Conclusions Moderate-quality evidence suggests that SDL in health professions education is associated with moderate improvement in the knowledge domain compared with traditional teaching methods and may be as effective in the skills and attitudes domains.[5]


References

[1] Mann, K., Gordon, J. and MacLeod, A., 2009. Reflection and reflective practice in health professions education: a systematic review. Advances in Health Sciences Education, 14(4), pp.595-621.

[2] Cook, D.A., Hatala, R., Brydges, R., Zendejas, B., Szostek, J.H., Wang, A.T., Erwin, P.J. and Hamstra, S.J., 2011. Technology-enhanced simulation for health professions education: a systematic review and meta-analysis. JAMA, 306(9), pp.978-988.

[3] Haji, F., Morin, M.P. and Parker, K., 2013. Rethinking programme evaluation in health professions education: beyond ‘did it work?’. Medical Education, 47(4), pp.342-351.

[4] Hew, K.F. and Lo, C.K., 2018. Flipped classroom improves student learning in health professions education: a meta-analysis. BMC Medical Education, 18(1), pp.1-12.

[5] Murad, M.H., Coto-Yglesias, F., Varkey, P., Prokop, L.J. and Murad, A.L., 2010. The effectiveness of self-directed learning in health professions education: a systematic review. Medical Education, 44(11), pp.1057-1068.
