Impact Assessment Report activity held online on 19 February at 14:00 (Amsterdam time).

CDHVR partners held the Impact Assessment Report activity as an online meeting on 19 February at 14:00 (Amsterdam time). The session was dedicated to reviewing current progress and strengthening how the project captures evidence of impact across curriculum development, training materials, and digital delivery. The meeting was highly productive and provided clear direction for the next implementation steps, particularly regarding documentation quality, measurable outcomes, and partner responsibilities.

A major focus of the meeting was the evaluation of the CDHVR Curriculum Development and Implementation Document. Partners reviewed the overall structure of the document, confirming that it provides a solid foundation for a coherent learning pathway that can be adopted across institutions. The consortium discussed whether the curriculum narrative is sufficiently aligned with project goals, whether the modules are clearly described, and whether the learning outcomes and competency framing are transparent for educators, students, and external stakeholders.

During the discussion, partners assessed the curriculum document across four key dimensions:

  1. Clarity and completeness of the curriculum pathway
    The consortium reviewed whether the curriculum is presented as a logical sequence from introductory digital foundations to applied clinical or pre-clinical simulation practice. Partners highlighted the need to ensure that each module includes a clear purpose, target learner profile, prerequisites, expected workload, and defined outputs. This is essential for implementation and for later evidence collection in pilots.
  2. Learning outcomes, competencies, and assessment alignment
    Partners evaluated whether learning outcomes are measurable and mapped consistently to competency statements. The group emphasized the importance of rubrics and assessment criteria that can be applied across partner sites. Particular attention was given to ensuring that assessment is not only knowledge-based, but also performance-based, including skills, decision-making, and readiness for clinical transition where relevant.
  3. Digital delivery, evidence, and certification logic
    Because impact assessment depends on traceable evidence, partners reviewed how curriculum completion will be documented. The meeting underlined the need for completion rules, learner evidence requirements, and certificate criteria that can be verified through the learning platform or structured documentation. Partners also discussed the importance of standardizing evidence types such as screenshots, activity logs, rubric results, reflective tasks, and supervisor verification where applicable.
  4. Impact indicators and reporting readiness
    The group aligned on the types of indicators needed for the Impact Assessment Report, including participation metrics, completion and engagement rates, learning outcome achievement, user satisfaction, educator feedback, and feasibility indicators (technical readiness, time requirements, staffing, and barriers). Partners agreed that the curriculum document should clearly support these indicators by making outputs and assessment points visible and consistent.

In addition to the curriculum review, the consortium discussed practical next steps to ensure that improvements are implemented quickly. Partners agreed to continue refining the curriculum document through shared comments and targeted edits, focusing on improving module descriptions, strengthening learning outcomes and rubric mapping, and clarifying the evidence collection process that will feed directly into the impact assessment reporting.

The meeting also served as an important coordination milestone for upcoming activities. Partners confirmed continued online collaboration and reiterated the value of shared documentation practices for ensuring consistent implementation across sites. The consortium thanked all partners for their contributions and emphasized that the discussion significantly strengthened the project’s readiness to demonstrate measurable, credible impact, both academically and in real-world educational settings.