Optimising STEM examinations for digital and remote delivery: a case study.

The rapid transition to remote digital teaching and learning following the outbreak of the COVID-19 pandemic has accelerated the transformation of the assessment landscape. What was once considered unorthodox has become desirable, if not unavoidable: assessing students through essays or exams is steadily giving way to new forms of learning measurement that embrace the notions of authentic assessment and assessment of and as learning (Sambell & Brown, 2020b; Ashford-Rowe et al., 2014; Dann, 2014). Yet, despite a general appetite for this change, moving away from ‘business-as-usual conservatism’ (Sambell & Brown, 2020a: 2) has proven difficult, particularly in STEM subjects.

The alternatives to STEM examinations used at Kaplan Pathways at the start of the “digital pivot” raised reliability and validity concerns, proved resource-heavy, and were not sustainable in the long term. For STEM examinations, the design question should not have been how to move away from exams, but rather how to optimise such exams for remote delivery.

Our presentation evaluates the approach taken by Kaplan Pathways to creating scalable, valid and reliable digitised examinations for STEM modules.

Having developed a digital item-banking model (Banerjee et al., 2016; Currier, 2007), we have leveraged its affordances to realise the assessment transformation needed to address the challenges experienced with alternative assessments for STEM subjects.

In this model, validity is ensured through close alignment of items with learning outcomes and learning content (Biggs & Tang, 2011), which also increases the relevance of assessment to students.
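
As an illustration, an item-bank record can tie each item explicitly to the outcomes it assesses; the following minimal Python sketch uses invented field names and outcome codes rather than our production schema:

    from dataclasses import dataclass

    # Illustrative item-bank record: every item carries the learning
    # outcomes it assesses, so alignment can be checked programmatically.
    @dataclass
    class Item:
        item_id: str
        stem: str                     # question text
        options: list[str]            # key plus distractors
        key: int                      # index of the correct option
        learning_outcomes: list[str]  # e.g. ["MATH101-LO2"]
        difficulty: str = "medium"    # used by the blueprint later

    def items_for_outcome(bank: list[Item], outcome: str) -> list[Item]:
        # Retrieve every item aligned to a given outcome, making
        # syllabus coverage auditable.
        return [item for item in bank if outcome in item.learning_outcomes]

Because the item-outcome mapping is explicit, coverage of each outcome can be audited before a paper is ever generated.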

Reliability is enhanced through automated marking processes.

These also help to achieve resource efficiencies and to create instant feedback loops with students (Evans, 2016), and ultimately lead to long-term sustainability (JISC, 2020).
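
Under the same illustrative Item record, marking a selected-response item reduces to a key comparison; the sketch below shows how the same step can also generate instant, outcome-referenced feedback (function names and the response format are assumptions for illustration):

    def mark_response(item: Item, chosen: int) -> dict:
        # Marking a selected-response item is a key comparison, which is
        # what makes instant, outcome-referenced feedback cheap to produce.
        correct = chosen == item.key
        feedback = ("Correct." if correct else
                    "Revisit the material for: " + ", ".join(item.learning_outcomes))
        return {"item_id": item.item_id, "correct": correct, "feedback": feedback}

    def mark_exam(form: list[Item], responses: dict[str, int]) -> dict:
        # responses maps item_id to the index of the chosen option.
        detail = [mark_response(item, responses[item.item_id]) for item in form]
        return {"score": sum(m["correct"] for m in detail), "detail": detail}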

Security is improved through item randomisation, lockdown browsers and online proctoring; randomisation of items also minimises opportunities for academic misconduct.
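
A minimal sketch of how such randomisation might work, reusing the illustrative Item record above; seeding the generator from a student identifier is an assumption made here so that each paper differs between students yet remains reproducible for audit:

    import random

    def shuffle_options(item: Item, rng: random.Random) -> Item:
        # Reorder the options and track where the key moves.
        order = list(range(len(item.options)))
        rng.shuffle(order)
        return Item(item.item_id, item.stem,
                    [item.options[i] for i in order],
                    order.index(item.key),
                    item.learning_outcomes, item.difficulty)

    def randomised_form(items: list[Item], student_id: str) -> list[Item]:
        # A deterministic per-student seed gives each student a different
        # paper while keeping every paper reproducible.
        rng = random.Random(student_id)
        form = [shuffle_options(item, rng) for item in items]
        rng.shuffle(form)
        return form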

Learning support is also built into the model through the provision of continuous formative tests that mimic summative examinations (Gibbs & Simpson, 2005).

Introducing this model has also necessitated a radical redesign of examination questions: assessment blueprints allow us to control and assure the quality of each item, which in turn ensures a stable examination structure and an equivalent examination experience for every student.
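
A blueprint can be sketched as a constraint table that every generated paper must satisfy; the outcome codes and item counts below are invented for illustration, and assemble_form reuses the Item record and random generator from the earlier sketches:

    # Hypothetical blueprint: how many items to draw per learning outcome
    # and difficulty band. Every paper satisfies the same constraints, so
    # the examination structure is identical for all students.
    BLUEPRINT = {
        ("MATH101-LO1", "easy"): 4,
        ("MATH101-LO2", "medium"): 5,
        ("MATH101-LO3", "hard"): 3,
    }

    def assemble_form(bank: list[Item], rng: random.Random) -> list[Item]:
        form = []
        for (outcome, difficulty), n in BLUEPRINT.items():
            pool = [item for item in bank
                    if outcome in item.learning_outcomes
                    and item.difficulty == difficulty]
            if len(pool) < n:
                raise ValueError(f"Item bank too thin for {outcome} ({difficulty})")
            form.extend(rng.sample(pool, n))
        return form

Because every paper is drawn cell by cell from the same table, randomisation never disturbs the structure of the examination.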

We will also discuss the lessons we have learned along this journey: from the importance of continuously enhancing staff digital literacy, through dealing with the change-management issues that accompany a project of this nature, to ensuring strong collaboration with the IT department.

Key References

  • Ashford-Rowe, K., Herrington, J., & Brown, C. (2014) Establishing the critical elements that determine authentic assessment. Assessment & Evaluation in Higher Education, 39(2), pp. 205–222.
  • Banerjee, S., Rao, N. J., & Ramanathan, C. (2016) Designing item banks in alignment with course outcomes for engineering courses. IEEE Eighth International Conference on Technology for Education (T4E), pp. 152–155. DOI: 10.1109/T4E.2016.039.
  • Biggs, J., & Tang, C. (2011) Teaching for quality learning at university. McGraw-Hill Education (UK).
  • Currier, S. (2007) Assessment item banks and repositories. JISC CETIS. Available from: https://www.academia.edu/2274270/Assessment_item_banks_and_repositories?auto=citations&from=cover_page [Accessed 20 January 2022].
  • Dann, R. (2014) Assessment as learning: blurring the boundaries of assessment and learning for theory, policy and practice. Assessment in Education: Principles, Policy & Practice, 21(2), pp. 149–166.
  • Evans, C. (2016) Enhancing assessment feedback practice in higher education: The EAT framework. University of Southampton.
  • Gibbs, G., & Simpson, C. (2005) Conditions under which assessment supports students’ learning. Learning and Teaching in Higher Education, 1, pp. 3–31. Available from: https://eprints.glos.ac.uk/3609/ [Accessed 20 January 2022].
  • JISC (2020) The future of assessment: five principles, five targets for 2025. JISC. Available from: https://repository.jisc.ac.uk/7733/1/the-future-of-assessment-report.pdf [Accessed 20 January 2022].
  • Sambell, K., & Brown, S. (2020a) Changing assessment for good: a major opportunity for educational developers. Available from: https://sally-brown.net/kay-sambell-and-sally-brown-covid-19-assessment-collection/ [Accessed 20 January 2022].
  • Sambell, K., & Brown, S. (2020b) Fifty tips for replacements for time-constrained, invigilated on-site exams. Available from: https://sally-brown.net/kay-sambell-and-sally-brown-covid-19-assessment-collection/ [Accessed 20 January 2022].