
GMS Journal for Medical Education

Gesellschaft für Medizinische Ausbildung (GMA)

2366-5017


This is the English version of the article. A German version is also available.
research article
simulation

In-house designed simulation courses versus society-accredited designs by international societies: A comparative analysis

Igor Abramovich 1,2,3
Jakob Beilstein 1,2
Eva Kornemann 1,2
Joana Berger-Estilita 4,5
Torsten Schröder 1,2

1 Charité – Universitätsmedizin Berlin, Department for Anaesthesiology and Intensive Care Medicine (CCM/CVK), Berlin, Germany
2 Berlin Simulation & Training Center (BeST), Berlin, Germany
3 University of California San Francisco, Department of Anesthesia and Perioperative Care, San Francisco, CA, USA
4 University of Bern, Institute for Medical Education, Bern, Switzerland
5 Hirslanden Medical Group, Salemspital, Institute of Anaesthesiology and Intensive Care, Bern, Switzerland

Abstract

Background: Simulation-based medical education is increasingly important in postgraduate training, yet the comparative merits of in-house vs. society-accredited courses are still not well understood. This study examined these two approaches in three emergency medicine domains – prehospital, pediatric, and adult – to identify their respective strengths and potential limitations.

Methods: In a retrospective analysis, 1,263 participants from 57 sessions (2019–2023) evaluated six emergency medicine courses (three society-accredited, three in-house). A 25-item Likert-scale survey assessed aspects of course content, delivery, organization, and overall recommendation, alongside demographic questions and free-text comments. Mann-Whitney U tests and Cliff’s Delta were used for statistical comparisons.

Results: Society-accredited courses generally scored higher on guideline adherence, presenter competence, and practical relevance, whereas in-house formats excelled in areas like content scope and communication. Participant specialty, workplace, and training stage influenced ratings. Free-text feedback praised hands-on learning and small-group design but called for earlier material distribution, better logistics, and clearer guidelines.

Conclusions: Both in-house and society-accredited SBME courses exhibit distinct strengths. Adopting best practices from both models may guide a hybrid approach that optimizes SBME outcomes. However, reliance on self-reported data and a lack of controls for instructor competence or teaching style limit generalizability. Future research should include a broader sample, more rigorous content analysis, longitudinal follow-up, and detailed participant experience data to enhance the depth and applicability of findings.


Keywords

simulation-based medical education, in-house design, emergency medicine simulation

1. Introduction

Simulation-Based Medical Education (SBME) has emerged as a transformative force in medical training, offering a dynamic and immersive learning environment that bridges the gap between theoretical knowledge and clinical practice [1]. Over the past few decades, SBME has gained significant traction, becoming an integral component of postgraduate medical education across various specialities worldwide [2], [3], [4]. Medical educators, policymakers, and healthcare professionals have widely recognised its potential to enhance clinical skills, improve patient outcomes, and mitigate medical errors [5].

SBME encompasses various simulation modalities, ranging from high-fidelity simulations to standardised patients and procedural part-task trainers [6]. These simulations replicate real-life clinical scenarios, allowing learners to practice clinical skills, decision-making, and teamwork in a safe and controlled environment [7]. As such, SBME has permeated various medical disciplines, including emergency medicine [8], surgery [9], internal medicine [10], and beyond [11]. It has become a cornerstone of postgraduate medical training and extended its reach to pre-graduate medical students and other healthcare practitioners, enriching their educational experiences and preparing them for clinical practice [12].

In recent years, the proliferation of SBME programs on a global scale has been accompanied by a concerted effort to standardise training frameworks and enhance educational outcomes. Medical societies, such as the European Resuscitation Council (ERC) [13], American Heart Association (AHA) [https://cpr.heart.org/en/resources/history-of-cpr] and Resuscitation Council UK [14] have played a pivotal role in meticulously devising SBME guidelines tailored to specific clinical contexts. These guidelines outline curriculum content, format architecture, and faculty preparation, providing a blueprint for high-quality SBME training. Yet, the relevance of comparing different SBME formats – particularly in high-stakes fields such as prehospital, pediatric, and adult emergency medicine – remains paramount for identifying optimal practices and improving patient care.

Despite the prevalence of certified SBME formats, in-house courses – such as those at the Berlin Simulation & Training Center (BeST) – remain distinct. They tailor content to specific specialties and competency levels, emphasizing flexibility and learner-focused design. These courses often employ innovative simulations, customized scenarios, and specialized debriefing methods, reflecting the expertise and resources of the hosting institution [15], [16], [17], [18].

In contrast, society-accredited courses adhere to standardised guidelines and accreditation criteria, providing a structured framework with prescribed curriculum content and instructional materials [19]. These courses undergo rigorous review processes to ensure consistency, quality, and adherence to best practices in SBME [20]. While this approach offers the advantage of recognized standards and uniform delivery across institutions, it may also limit opportunities for individualised learning and local innovation.

This study aims to comparatively analyze in-house designed and society-accredited SBME courses within three key areas of emergency medicine – prehospital, pediatric, and adult – focusing on participant evaluations and course design features. By analyzing participant ratings of course content quality, delivery, organization, and overall satisfaction, we aim to clarify each approach’s respective strengths and potential limitations. Our findings intend to inform educators, policymakers, and healthcare stakeholders about how different SBME programs may best serve diverse learner needs, ultimately guiding more effective training initiatives and contributing to improved clinical outcomes.

2. Methods

2.1. Ethics

Ethics approval was waived by the Charité – Universitätsmedizin Berlin Ethics Committee (EA1/101/24) on 10 May 2024. No identifying data were collected, and all survey information was stored securely with restricted access. The study adhered to the Declaration of Helsinki and relevant Data Protection Acts.

2.2. Study design

This retrospective study compared post-course evaluations and free-text responses from in-house (S) vs. society-accredited (A) emergency medicine courses in prehospital (S-PHEM vs. A-PHEM), pediatric (S-PED vs. A-PED), and adult (S-ALS vs. A-ALS) settings (see table 1 [Tab. 1]). The primary focus was to identify differences in content quality, delivery, organization, and overall satisfaction. Detailed learning objectives and key goals for each course are provided in attachment 1 [Att. 1].

Table 1: This table compares the scope, financial considerations, and subject matter of in-house courses* vs. those certified by external accrediting societies

All courses were taught by experienced physicians (anesthesiology, intensive care, emergency medicine, or pediatrics). Prehospital formats also included Emergency Medical Services providers. In-house instructors completed a structured training program – observational shadowing placements and supervised teaching – before becoming full instructors. Society-accredited instructors followed their society’s formal pathway, typically involving high participant performance, an instructor training course, and additional shadowing.

2.3. Study outcomes

We compared participant evaluations from in-house (S) vs. society-accredited (A) courses. The primary outcome was the difference in average post-course scores for each category, calculated separately for the courses. The secondary outcome examined the impact of demographic factors (gender, location, specialty, training stage, workplace) on these ratings. Finally, open-ended questions provided additional insights into participants’ perspectives.

2.4. Data collection

Data were collected between January 2019 and December 2023 from participants immediately following each of the six course formats under study. The Berlin Medical Association Course Evaluation Form (English translation in attachment 2 [Att. 2]) was used, consisting of:

  • 25 Likert-like scale items, rated from 1 (strongly agree) to 5 (not at all).
  • Two free-text questions, prompting participants to describe beneficial aspects of the course and areas needing improvement.
  • Demographic questions (gender, practice location, medical specialty, training stage, workplace type).

No incentives were offered. However, completing the evaluation form was part of the mandatory procedure required by the Berlin Medical Chamber for accrediting Continuing Medical Education credits. It typically took about five minutes to finish the form.

The 25 Likert-like items were grouped into four domains:

  • Course content (items 1-14): Content relevance, instructor competence, presentation quality, adherence to guidelines, content scope, personal gain/feasibility, and disclosure of conflicts of interest.
  • Delivery of content (items 15-19): Methods of content delivery, learning objectives (individual/group work), quality of materials, and discussion opportunities.
  • Organisation (items 20-24): Course registration, service/support, moderation, timing, and participant numbers.
  • Possible recommendation (item 25)

Additionally, participants were invited to provide free-text remarks (items 26-27).
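The item-to-domain grouping above can be sketched as a small scoring routine; a minimal, hypothetical example (the domain names and function are ours, not part of the published evaluation form):

```python
# Illustrative grouping of the 25 Likert items into the four domains
# described in the text (items 1-14, 15-19, 20-24, 25). This is a
# sketch; the actual scoring procedure may differ.
DOMAINS = {
    "course_content": range(1, 15),    # items 1-14
    "delivery": range(15, 20),         # items 15-19
    "organisation": range(20, 25),     # items 20-24
    "recommendation": range(25, 26),   # item 25
}

def domain_means(responses):
    """Mean rating per domain for one evaluation form.

    `responses` maps item number (1-25) to a 1-5 Likert rating
    (1 = strongly agree, 5 = not at all); missing items are skipped.
    """
    means = {}
    for name, items in DOMAINS.items():
        vals = [responses[i] for i in items if i in responses]
        means[name] = sum(vals) / len(vals) if vals else None
    return means

# A hypothetical fully completed form with every item rated "agree" (2)
form = {i: 2 for i in range(1, 26)}
print(domain_means(form))
```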

2.5. Data analysis

All evaluation data were securely stored to protect anonymity, and each of the six course formats underwent both within- and between-group comparisons. Because the Shapiro–Wilk test showed a non-normal distribution, data were analyzed non-parametrically. Mann-Whitney U tests (p<0.05) compared in-house (S) vs. society-accredited (A) courses and demographic factors, with Cliff’s Delta (Δ) quantifying effect sizes. Qualitative feedback was analyzed inductively by three authors: JBE identified recurring themes, IA refined them, and TS validated the final categories.
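The statistical pipeline above (Shapiro–Wilk normality check, Mann-Whitney U comparison, Cliff’s Delta effect size) can be sketched as follows. This is an illustrative reconstruction on synthetic ratings, not the authors’ analysis code; SciPy has no built-in Cliff’s Delta, so it is computed pairwise here:

```python
import numpy as np
from scipy.stats import shapiro, mannwhitneyu

rng = np.random.default_rng(0)
# Hypothetical Likert ratings (1 = strongly agree ... 5 = not at all)
in_house = rng.integers(1, 6, size=120)      # "S" course ratings
accredited = rng.integers(1, 6, size=150)    # "A" course ratings

# Normality check motivating the non-parametric approach
_, p_norm = shapiro(in_house)

# Two-sided Mann-Whitney U comparison (significance threshold p < 0.05)
u_stat, p_val = mannwhitneyu(in_house, accredited, alternative="two-sided")

def cliffs_delta(x, y):
    """Cliff's Delta: P(x > y) - P(x < y) over all cross-group pairs."""
    x, y = np.asarray(x), np.asarray(y)
    diffs = x[:, None] - y[None, :]
    return (np.sum(diffs > 0) - np.sum(diffs < 0)) / diffs.size

delta = cliffs_delta(in_house, accredited)
print(f"U={u_stat:.0f}, p={p_val:.3f}, delta={delta:.3f}")
```

Delta ranges from -1 to 1, with 0 indicating complete overlap between the two rating distributions, matching the sign conventions of the results reported below.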

3. Results

3.1. Respondents’ characteristics

A total of 1,263 individuals participated in the six course formats across 57 sessions (January 2019-December 2023). Of these, 868 evaluation forms were returned (68.7% response rate). Table 2 [Tab. 2] illustrates the respondent characteristics and distribution by course type, while table 3 [Tab. 3] provides additional demographic details, including gender, location of practice, medical specialty, role, and workplace of participants.

Table 2: Number of course participants per course type and demographic data

Table 3: Detailed breakdown of demographics for participants who provided information in the above categories; “not provided” answers are excluded

3.2. Comparison of prehospital emergency medicine courses (A-PHEM vs. S-PHEM)

S-PHEM received lower ratings than A-PHEM on “current guidelines” (p=0.038, Δ=-0.123), “selection of contributions” (p=0.039, Δ=-0.176), “scope of content” (p=0.023, Δ=-0.214), and “own level of competence” (p=0.043, Δ=-0.234). It also scored less favorably on “presentation of conflicts of interest” (p=0.001, Δ=-0.351), “processing/mentioning of learning objectives” (p=0.044, Δ=-0.184), “time frame” (p=0.038, Δ=-0.123), “communication/social skills” (p=0.031, Δ=0.059) and “registration process” (p=0.04, Δ=-0.029).

Within A-PHEM, males rated “critically reflective presentation” higher than females (p=0.029, Δ=-0.196). In S-PHEM, participants outside Berlin viewed “number of participants” more positively (p=0.047, Δ=-0.141).

Specialty differences were notable in A-PHEM: non-anesthesiology participants gave higher ratings to “current guidelines,” “communication/social skills,” “content of contributions,” “competence of presenters,” “own knowledge gain,” and “learning objective development alone or in groups” (all p<0.05, Δ range=-0.25 to -1.0). Similar trends appeared in S-PHEM for “interdisciplinary knowledge” (p=0.026, Δ=-0.341), “competence of presenters” (p=0.026, Δ=-0.222), and “learning objective development” (p<0.05, Δ up to -1.0). No significant differences emerged between residents and specialists (p>0.05). Finally, workplace influenced perceptions in A-PHEM, with university participants providing more favorable ratings across multiple aspects (p<0.05, Δ=0.333) (see figure 1 [Fig. 1]).

Figure 1: Stacked bar chart showing the percentage distribution of ratings by category for both the accredited prehospital EM course (A-PHEM) and the in-house PHEM course (S-PHEM). Asterisks (*) mark significant differences based on the U test (p<0.05). Full percentage data is provided in attachment 4.

3.3 Comparison of pediatric emergency medicine courses (A-PED vs. S-PED)

A-PED scored higher than S-PED on “critically reflective presentation” (p=0.037, Δ=-0.132), “interdisciplinary knowledge” (p=0.027, Δ=-0.126), “scope of content” (p=0.025, Δ=-0.144), “competence of presenters” (p=0.016, Δ=-0.125), “presentation of conflicts of interest” (p=0.004, Δ=-0.190), and “clinical practical skills” (p=0.002, Δ=-0.148). Within A-PED, females rated “processing/mentioning of learning objectives” higher than males (p=0.033, Δ=0.230). In S-PED, males gave marginally higher scores for “recommendation of the event” (p=0.017, Δ=-0.091). Location influenced A-PED (“communication/social skills”, p=0.034, Δ=-0.257), while specialty affected S-PED self-assessments (e.g., anesthesia vs. trauma, p=0.037, Δ=-0.33). Residents in A-PED rated “selection of contributions” less favorably (p=0.037, Δ=0.514), whereas specialists in S-PED tended to give consistently favorable evaluations (p<0.05, Δ up to 0.176). Workplace differences emerged as well, with university affiliations affecting ratings in both A-PED and S-PED (p<0.05, Δ=±0.333) (see figure 2 [Fig. 2]).

Figure 2: Stacked bar chart showing the percentage distribution of ratings by category for both the accredited pediatric emergency medicine course (A-PED) and the in-house course (S-PED). Asterisks (*) mark significant differences based on the U test (p<0.05). Full percentage data is provided in attachment 4.

3.4. Comparison of adult emergency medicine courses (A-ALS vs. S-ALS)

S-ALS was rated more favorably than A-ALS in “current guidelines” (p=0.001, Δ=0.14), “communication/social skills” (p=0.006, Δ=0.14), “selection of contributions” (p=0.00659, Δ=0.18), “scope of content” (p=0.001, Δ=0.21), “critically reflective presentation” (p=0.00788, Δ=0.14), “own level of competence” (p=0.001, Δ=0.24), “presentation of conflicts of interest” (p=0.019, Δ=0.17), “processing/mentioning of learning objectives” (p=0.004, Δ=0.15) and “quality of working materials” (p=0.028, Δ=0.13). In contrast, A-ALS scored better on “event moderation” (p=0.038, Δ=-0.13).

Location influenced “competence of presenters” in A-ALS (p=0.040, Δ=±0.197) and “opportunities for discussions and questions” (p=0.003, Δ=±0.198) and “registration process” (p=0.012, Δ=±0.256) in S-ALS. Specialty and training stage also mattered in S-ALS, with residents rating some aspects lower (Δ=-0.1 to -0.18) and specialists/non-physicians rating others higher (Δ=0.13 to 0.23, p<0.05). Workplace settings in S-ALS influenced ratings of “communication/social skills,” “learning objective development,” and “quality of working materials” (p<0.05, Δ=-0.15 to -0.19), as well as perceptions of presenter competence and discussion opportunities (p<0.05, Δ=±0.09) (see figure 3 [Fig. 3]).

Figure 3: Stacked bar chart showing the percentage distribution of ratings by category for both the accredited adult emergency medicine course (A-ALS) and the in-house course (S-ALS). Asterisks (*) mark significant differences based on the U test (p<0.05). Full percentage data is provided in attachment 4.

3.5. Free text responses

Across 163 free-text responses, participants frequently praised the courses for their practical orientation, capable instructors, and small group sizes (see attachment 3 [Att. 3]). A-PHEM excelled in practical skill training, scenario diversity, and clear instruction but required earlier material distribution, more specialized skills, and improved logistics. S-PHEM stood out for realistic scenarios, engaging debriefings, and a supportive atmosphere yet needed better technical setups, standardized structures, and enhanced e-learning. In pediatrics, A-PED offered effective debriefing, practical relevance, and stable team structures, though alignment with current guidelines, smaller groups, and clearer scenarios were suggested. S-PED was recognized for its high practicality, small group sizes, and interdisciplinary approach but called for a more balanced theory-to-practice ratio, larger break areas, and clearer preparation materials. Finally, A-ALS combined a structured design with positive instructor engagement, while S-ALS provided hands-on training and small groups – both required refined logistics, earlier materials, and extended or more diverse scenarios. Detailed summaries of these recurring themes and feedback can be found in the supplementary material (see attachment 4 [Att. 4]).

4. Discussion

Comparisons of in-house (S-PHEM, S-PED, S-ALS) and society-accredited (A-PHEM, A-PED, A-ALS) SBME courses generally showed that accredited formats excelled in guideline adherence, organizational structure, and critical reflection – especially in prehospital and pediatric contexts. However, in the ALS domain, the in-house course was rated more favorably on “current guidelines” and several other aspects, while the accredited course received higher scores for event moderation. Demographic analyses revealed variations linked to gender, specialty, and workplace, hinting at specific needs or barriers in knowledge transfer; for example, gender-specific differences in communication and moderation styles may require targeted teaching strategies. In their free-text responses, participants praised the courses for their practical orientation, knowledgeable instructors, and small group sizes, yet recommended longer course durations, earlier material distribution, closer guideline alignment, and logistical refinements.

Does it really matter if the course is certified or not?

Whether a course is certified can significantly impact its quality, standardisation, and recognition within the medical community [21], [22]. Although the accredited courses in this study – long recognized in the literature for their adherence to guidelines and organizational consistency – exhibited notable strengths, they also showed drawbacks [20], [23], [24], [25]. Participant feedback noted rigidity in accredited course structures, limited adaptability to learners’ needs, and occasionally outdated content. In contrast, in-house formats offered flexible, interactive learning with tailored content and dynamic discussions. This adaptability – combined with guideline adherence – suggests refining both formats to capitalize on their strengths. In-house designs enable scenario customization, active engagement, and practical reflections, which participants especially valued for interdisciplinary collaboration, event moderation, and instructor involvement.

Impact of curriculum and structure vs other factors like instructor competency

The interplay between curriculum/structure and instructor competency is crucial in SBME effectiveness. While curriculum and structure lay the foundation for learning objectives and content, the instructor is essential for engagement, guiding discussions, and feedback [26]. A well-designed curriculum aligns with educational goals and supports active learning through simulations and hands-on activities [27]. However, its effectiveness can be enhanced or limited by the instructor’s ability to adapt teaching methods, foster a supportive environment, and offer individualized guidance. Therefore, even though curriculum and structure provide the framework for learning outcomes, the instructor’s expertise, communication skills, and teaching approach fundamentally shape the educational experience and learner engagement [13], [14], [28].

The role and competence of instructors seem to represent a significant factor influencing the success of SBME courses. While instructors in certified courses undergo rigorous training programs and formal certification processes, in-house courses often provide a more flexible structure, allowing experienced local experts to deliver content tailored to specific practical needs. Our findings indicate that the perception of instructor competence significantly contributes to participant satisfaction, particularly in moderation, discussion, and hands-on guidance. Future studies should further investigate the impact of instructor qualifications and experience on learning quality to identify additional opportunities for optimization. Ultimately, a synergistic relationship between curriculum design and instructor competency is essential for maximising the impact of SBME courses on knowledge acquisition, skill development, and clinical practice readiness.

Implications for medical education

Our findings highlight the importance of a differentiated approach to designing SBME courses. While certified courses offer advantages in standardized content and structure, in-house courses excel through flexibility and practice-oriented customisation. These insights can serve as a foundation for developing a hybrid course model that integrates the strengths of both formats and specifically addresses learners’ needs. This approach can ensure that medical training adheres to high standards and remains adaptive and participant-centred, fostering a more robust learning environment [29]. For healthcare professionals, choosing a mix of both accredited and in-house designed courses could maximise learning outcomes, preparing them better for clinical practice.

Study limitations

Our study is limited by its single-institution scope, which may reduce generalizability. Because we primarily rely on self-reported data, response or social desirability bias cannot be ruled out. Although we compare course structures, we do not control for variations in instructor competence or teaching style, which can significantly influence learning outcomes. While our analysis highlighted these themes effectively, a more systematic content analysis, such as Mayring's methodology, could offer a deeper understanding of qualitative patterns and is recommended for future research [30]. We also do not include longitudinal data to assess the long-term retention of skills and knowledge or the impact on clinical practice. Detailed data on participants' expertise, such as years of clinical experience or prior simulation training, were not systematically collected in our study. We acknowledge these limitations and recommend addressing these aspects in future research to enable a more nuanced analysis of group outcomes.

5. Conclusion

Both in-house and society-accredited SBME courses exhibit distinct strengths and areas requiring further refinement. In prehospital (PHEM) and pediatric (PED) courses, society-accredited formats generally demonstrated stronger guideline adherence and organizational structure, whereas in the adult (ALS) domain, the in-house course achieved higher ratings in several key dimensions, despite being shorter (10h vs 20h). These findings emphasize the potential value of integrating best practices from both approaches to enhance medical education and training outcomes. Future research should explore the long-term impact of instructor qualifications, examine demographic factors (e.g., gender) in shaping course evaluations, and conduct longitudinal studies on the sustainability of learning gains. Ultimately, developing a hybrid course format that leverages the advantages of both in-house and accredited designs may further optimize SBME programs.

Key points

  • Society-accredited SBME courses, like those from the European Resuscitation Council and American Heart Association, provide consistency, quality, and international recognition, but may limit opportunities for individualized learning.
  • In-house designed formats excel in areas like content scope and communication.
  • Both in-house designed and society-accredited courses have distinct strengths and areas for improvement.

Abbreviations

  • AHA: American Heart Association
  • A-ALS: ERC Advanced Life Support simulation format
  • S-PED: In-house paediatric emergency course: Bärenkind – Berliner Ärzte retten Kinder – Emergency Netzwerk für das Kind
  • BeST: Berlin Simulation & Training Center
  • CAPCE: Commission on Accreditation for Prehospital Continuing Education
  • CME: Continuing Medical Education
  • A-PED: ERC European Paediatric Advanced Life Support simulation format
  • ERC: European Resuscitation Council
  • S-PHEM: In-house prehospital emergency medicine simulation course - Notarztsimulation
  • A-PHEM: Prehospital Trauma Life Support accredited by Commission on Accreditation for Prehospital Continuing Education
  • S-ALS: In-house Advanced Life Support format
  • SBME: Simulation-Based Medical Education

Notes

Conference presentation

Data were partially presented at ERC Resuscitation 2023 in Barcelona, Spain.

Authors’ contributions:

  • IA: Conception of work, drafting of the manuscript, and data analysis.
  • JB: Data acquisition and substantive revision of the manuscript.
  • EK: Data acquisition and substantive revision of the manuscript.
  • JBE: Data analysis, data interpretation, and substantive revision of the manuscript.
  • TS: Conception of work and substantive revision of the manuscript.
  • The authors JBE and TS share last authorship.


Acknowledgements

We thank all past and present members of the Berlin Training and Simulation Center (BeST) who supported this study.

Competing interests

The authors declare that they have no competing interests.


References

[1] Kohn LT, Corrigan J, Donaldson MS. To err is human: building a safer health system. Washington (DC): National Academy Press; 2000. p.287.
[2] Savoldelli GL, Burlacu CL, Lazarovici M, Matos FM, Ostergaard D, Utstein Simulation Study Group. Integration of simulation-based education in anaesthesiology specialist training: Synthesis of results from an Utstein Meeting. Eur J Anaesthesiol. 2024;41(1):43-54. DOI: 10.1097/EJA.0000000000001913
[3] Mesko S, Chapman BV, Tang C, Kudchadker RJ, Bruno TL, Sanders J, Das P, Pinnix CC, Thaker NG, Frank SJ. Development, implementation, and outcomes of a simulation-based medical education (SBME) prostate brachytherapy workshop for radiation oncology residents. Brachytherapy. 2020;19(6):738-745. DOI: 10.1016/j.brachy.2020.08.009
[4] Huber L, Good R, Bone MF, Flood SM, Fredericks R, Overly F, Tofil NM, Wing R, Walsh K. A Modified Delphi Study for Curricular Content of Simulation-Based Medical Education for Pediatric Residency Programs. Acad Pediatr. 2024;24(5):856-865. DOI: 10.1016/j.acap.2024.04.008
[5] Frenk J, Chen LC, Chandran L, Groff EO, King R, Meleis A, Fineberg HV. Challenges and opportunities for educating health professionals after the COVID-19 pandemic. Lancet. 2022;400(10362):1539-1556. DOI: 10.1016/S0140-6736(22)02092-X
[6] Pilote B, Chiniara G. Chapter 2 - The Many Faces of Simulation. In: Chiniara G, editor. Clinical Simulation. Second Edition. Cambridge (MA): Academic Press; 2019. p.17-32.
[7] Rodgers DL, Securro Jr S, Pauley RD. The effect of high-fidelity simulation on educational outcomes in an advanced cardiovascular life support course. Simul Healthc. 2009;4(4):200-206. DOI: 10.1097/SIH.0b013e3181b1b877
[8] Sahi N, Humphrey-Murto S, Brennan EE, O'Brien M, Hall AK. Current use of simulation for EPA assessment in emergency medicine. CJEM. 2024;26(3):179-187. DOI: 10.1007/s43678-024-00649-9
[9] Shah AP, Cleland J, Hawick L, Walker KA, Walker KG. Integrating simulation into surgical training: a qualitative case study of a national programme. Adv Simul (Lond). 2023;8(1):20. DOI: 10.1186/s41077-023-00259-y
[10] Fadous M, Chen-Tournoux AA, Eppich W. Current Use of Simulation in Canadian Cardiology Residency Programs: Painting the Landscape to Better Visualize the Future. Can J Cardiol. 2024:S0828-282X(29)00199-5. DOI: 10.1016/j.cjca.2024.03.002
[11] Abramovich I, Crisan I, Dow O, Morais D, De Hert S, Østergaard D, Berger-Estilita J, Blank A. Simulation-based education in anaesthesiology residency training in Europe: A survey-based cross-sectional study. Trend Anaesth Crit Care. 2023;53:101310. DOI: 10.1016/j.tacc.2023.101310
[12] McGaghie WC, Issenberg SB, Petrusa ER, Scalese RJ. A critical review of simulation-based medical education research: 2003-2009. Med Educ. 2010;44(1):50-63. DOI: 10.1111/j.1365-2923.2009.03547.x
[13] Bossaert L, Chamberlain D. The European Resuscitation Council: its history and development. Resuscitation. 2013;84(10):1291-1294. DOI: 10.1016/j.resuscitation.2013.07.025
[14] Lockey A. Advanced Life Support (ALS) Course – the past to the present. London: Resuscitation Council UK; 2020.
[15] Dieckmann P, Gaba D, Rall M. Deepening the theoretical foundations of patient simulation as social practice. Simul Healthc. 2007;2(3):183-193. DOI: 10.1097/SIH.0b013e3180f637f5
[16] Mercer SJ, Moneypenny MJ, Fredy O, Guha A. What should be included in a simulation course for anaesthetists? The Merseyside trainee perspective. Eur J Anaesthesiol. 2012;29(3):137-142. DOI: 10.1097/EJA.0b013e32834d945a
[17] Chauvin SW. Applying Educational Theory to Simulation-Based Training and Assessment in Surgery. Surg Clin North Am. 2015;95(4):695-715. DOI: 10.1016/j.suc.2015.04.006
[18] Fischer H, Strunk G, Neuhold S, Kiblbock D, Trimmel H, Baubin M, Domanovits H, Maurer C, Greif R. The effectiveness of ERC advanced life support (ALS) provider courses for the retention of ALS knowledge. Resuscitation. 2012;83(2):227-231. DOI: 10.1016/j.resuscitation.2011.09.014
[19] Ko YC, Hsieh MJ, Cheng A, Lauridsen KG, Sawyer TL, Bhanji F, Greif R; International Liaison Committee on Resuscitation Education, Implementation. Teams (EIT) Task Force. Faculty Development Approaches for Life Support Courses: A Scoping Review. J Am Heart Assoc. 2022;11(11):e025661. DOI: 10.1161/JAHA.122.025661
[20] Redazione S. Are certification and accreditation a real need? Sim Zine. 2022:6. Available from: https://simzine.news/focus-en/are-certification-and-accreditation-a-real-need
[21] Heitmiller ES, Nelson KL, Hunt EA, Schwartz JM, Yaster M, Shaffner DH. A survey of anesthesiologists' knowledge of American Heart Association Pediatric Advanced Life Support Resuscitation Guidelines. Resuscitation. 2008;79(3):499-505. DOI: 10.1016/j.resuscitation.2008.07.018
[22] Frank JR, Taber S, van Zanten M, Scheele F, Blouin D; International Health Professions Accreditation Outcomes Consortium. The role of accreditation in 21st century health professions education: report of an International Consensus Group. BMC Med Educ. 2020;20(Suppl 1):305. DOI: 10.1186/s12909-020-02121-5
[23] Bullock I, Davis M, Lockey A, Mackway-Jones K. Pocket Guide to Teaching for Clinical Instructors. Hoboken (NJ): Wiley John; 2015. DOI: 10.1002/9781119088769
[24] Greif R, Lockey A, Breckwoldt J, Carmona F, Conaghan P, Kuzovlev A, Pflanzl-Knizacek L, Sari F, Shammet S, Scapigliati A, Turner N, Yeung J, Monsieurs KG. European Resuscitation Council Guidelines 2021: Education for resuscitation. Resuscitation. 2021;161:388-407. DOI: 10.1016/j.resuscitation.2021.02.016
[25] Kaye W, Rallis SF, Mancini ME, Linhares KC, Angell ML, Donovan DS, Zajano NC, Finger JA. The problem of poor retention of cardiopulmonary resuscitation skills may lie with the instructor, not the learner or the curriculum. Resuscitation. 1991;21(1):67-87. DOI: 10.1016/0300-9572(91)90080-i
[26] Ali L. The Design of Curriculum, Assessment and Evaluation in Higher Education with Constructive Alignment. J Educa e-Learn Res. 2019;5(1):72-78. DOI: 10.20448/journal.509.2018.51.72.78
[27] Guerriero S. Teachers' pedagogical knowledge: What it is and how it functions. In: Guerriero S, editor. Pedagogical Knowledge and the Changing Nature of the Teaching Profession. Paris: OECD iLibrary; 2017. p.99-118. DOI: 10.1787/9789264270695-en
[28] Teunissen PW. Experience, trajectories, and reifications: an emerging framework of practice-based learning in healthcare workplaces. Adv Health Sci Educ Theory Pract. 2015;20(4):843-856. DOI: 10.1007/s10459-014-9556-y
[29] Ayaz O, Ismail FW. Healthcare Simulation: A Key to the Future of Medical Education - A Review. Adv Med Educ Pract. 2022;13:301-308. DOI: 10.2147/AMEP.S353777
[30] Mayring P. Qualitative Content Analysis. Forum Qual Sozialforsch. 2000;1(2). DOI: 10.17169/fqs-1.2.1089


Attachments

Attachment 1: Translation of the evaluation form of the Berlin Medical Association (Attachment_1.pdf, application/pdf, 190.55 KBytes)
Attachment 2: Overview of primary learning objectives, focus and key goals of each course in the study (Attachment_2.pdf, application/pdf, 121.96 KBytes)
Attachment 3: Free-text comments (Attachment_3.pdf, application/pdf, 113.84 KBytes)
Attachment 4: Percentage distribution (Attachment_4.pdf, application/pdf, 193.29 KBytes)