Objective structured clinical examinations (OSCEs) are used for high-stakes clinical skills exams in medicine, nursing, dentistry, veterinary medicine and other health-science professions, and also by law schools and professional bodies. The high costs and challenging logistics of these serial exams prompted Dr Thomas Kropmans, senior lecturer in Medical Informatics & Medical Education at NUI Galway, to automate them.
“Apart from the costs and logistics, some 30 per cent of the paper-based solution contained ‘errors’,” he said. “Students were passing or failing for the wrong reasons. In the limited time available (five minutes in general), examiners needed to mark performance and add up results manually. Incomplete forms were submitted, and marks appeared to be incorrect.”
A prototype written in open-source technology was launched in December 2008. Together with co-founder and software engineer David Cunningham, Kropmans benchmarked OMIS (OSCE Management Information System) against other existing systems. “We couldn’t imagine that other schools of medicine hadn’t tackled these issues before,” he said. “The paper-based exam has been used worldwide for 40 years, but OMIS appeared to be unique. While marking the final-year exams in medicine, we now have access to a track record of electronically retrieved, stored and analysed exam results.”
Objective and structured assessments
[caption id="attachment_30912" align="alignright" width="300"]
Fig 1: Students pass through a series of stations and at each station perform a different task. The task was marked using a paper-based checklist, without instant feedback for quality assurance[/caption]
This kind of assessment is known as ‘objective’ and ‘structured’, whereas an individual examiner’s mark is highly subjective, shaped by the interaction between examiner and student. However, the larger the number of stations, the more robust the outcome. Qpercom assists in this type of examination, which might equally suit engineering projects, with students at each station required to discuss, design and work out a ‘workable solution’.
The system offers instant psychometrics on the assessment forms: how well are they designed, and are they internally consistent? Do the stations and forms actually discriminate between ‘good-’ and ‘bad-performing’ students? How much do examiners and stations vary across different circuits and sessions of the examinations?
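These checks correspond to standard psychometric statistics. The following is a minimal sketch, not Qpercom's actual implementation, of how internal consistency (Cronbach's alpha) and item discrimination (corrected item-total correlation) can be computed from a station's checklist marks; all data shown are made up:

```python
import numpy as np

def cronbach_alpha(scores: np.ndarray) -> float:
    """Internal consistency of a station's checklist.

    scores: 2-D array, rows = students, columns = checklist items.
    """
    n_items = scores.shape[1]
    item_variances = scores.var(axis=0, ddof=1)
    total_variance = scores.sum(axis=1).var(ddof=1)
    return (n_items / (n_items - 1)) * (1 - item_variances.sum() / total_variance)

def item_discrimination(scores: np.ndarray) -> np.ndarray:
    """Corrected item-total correlation: does each item separate
    high-scoring from low-scoring students?"""
    totals = scores.sum(axis=1)
    return np.array([
        np.corrcoef(scores[:, i], totals - scores[:, i])[0, 1]
        for i in range(scores.shape[1])
    ])

# Example: five students marked on a four-item checklist (illustrative data).
marks = np.array([
    [1, 1, 1, 1],
    [1, 0, 1, 1],
    [0, 1, 0, 1],
    [0, 0, 1, 0],
    [0, 0, 0, 0],
])
print(cronbach_alpha(marks))       # about 0.69 for this made-up data
print(item_discrimination(marks))  # low or negative values flag weak items
```

Run on real checklist data, the same two statistics answer the questions above: a low alpha signals an inconsistent form, and a low item-total correlation flags an item that fails to discriminate.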
The examiner-assessment tool itself comes in two ‘flavours’: a cross-platform-compatible Mobile Examiner APP and a cross-browser-compatible web application. The OMIS application is deployed as an on-demand SaaS (software as a service) using cloud services such as Amazon AWS and Rackspace across multiple regions in order to serve all clients.
OMIS is packaged with a fully fledged RESTful API (application programming interface), making it possible to integrate with learning management systems such as Moodle and Blackboard. The same API is used by the application front end (web APP) and the Examiner APP to communicate directly with the backend (database), following an API-centric architecture rather than a traditional web architecture.
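To illustrate what an API-centric submission of marks might look like, here is a hedged sketch; the base URL, endpoint path, payload fields and token handling are all hypothetical illustrations, not Qpercom's published API:

```python
import requests

BASE_URL = "https://omis.example.com/api/v1"  # hypothetical endpoint

# Hypothetical payload: one examiner's checklist marks for one station.
payload = {
    "exam_id": 42,
    "station_id": 7,
    "student_id": 1234,
    "items": [{"item_id": 1, "score": 2}, {"item_id": 2, "score": 1}],
    "feedback": "Good history-taking; structure the summary better.",
}

response = requests.post(
    f"{BASE_URL}/results",
    json=payload,
    headers={"Authorization": "Bearer <token>"},  # token obtained at login
    timeout=10,
)
response.raise_for_status()
print(response.json())  # server echoes the stored result
```

The point of such an architecture is that the web APP, the Examiner APP and any integrated learning management system all speak to the same backend through one contract.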
[caption id="attachment_30915" align="alignright" width="300"]
Fig 2: Qpercom's OMIS produces a 'radar plot' highlighting how well students performed in each station compared with the overall performance of the group. Data from Qpercom partner Minho University, Portugal[/caption]
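A chart of this kind can be reproduced with standard plotting tools. The sketch below, using matplotlib with made-up station scores, shows the general idea of a per-station radar plot rather than OMIS's own rendering:

```python
import numpy as np
import matplotlib.pyplot as plt

stations = ["History", "Examination", "Diagnosis", "Communication", "Procedure"]
student = [72, 65, 80, 58, 70]   # one student's % score per station (made up)
cohort = [68, 70, 75, 66, 64]    # cohort mean per station (made up)

# One angle per station; the polygon is closed by repeating the first point.
angles = np.linspace(0, 2 * np.pi, len(stations), endpoint=False).tolist()
angles += angles[:1]

fig, ax = plt.subplots(subplot_kw={"projection": "polar"})
for scores, label in [(student, "Student"), (cohort, "Cohort mean")]:
    values = scores + scores[:1]
    ax.plot(angles, values, label=label)
    ax.fill(angles, values, alpha=0.1)

ax.set_xticks(angles[:-1])
ax.set_xticklabels(stations)
ax.legend(loc="upper right")
plt.show()
```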
“The system is designed to be highly interactive. It’s dynamic, with importance given to ‘customisability’ using the latest web technologies. We relentlessly pursue better usability and an improved user experience based on our users’ feedback, and we closely monitor their usage habits,” according to COO David Cunningham.
Users have reported being very happy with the electronic retrieval, storage and analysis of their exam results. Prof Andrew Murphy of NUI Galway said: “Since 2009, assessors in General Practice OSCE stations have submitted marks, judgments and feedback electronically through Qpercom’s bespoke OSCE Management Information System. Real-time data entry reduces time-consuming work and errors, while speeding up the analysis and publication of results. We assess examiners’ scores in real time, and our data are researched and published by researchers associated with this team of innovators, software engineers and learning technicians. We’re proud to have them on board at our School of Medicine.”
Qpercom international client list
[caption id="attachment_30918" align="alignright" width="300"]
Fig 3: The new situation, in which examiners use an Android or iOS tablet or laptop while observing students performing a different task at each station[/caption]
Currently, some 25 prestigious universities and professional bodies, such as Nobel University (California, USA), Karolinska Institutet and Umeå University (Sweden), the University of Dundee and the University of St Andrews (Scotland) and the University of Sheffield (England), are using Qpercom’s assessment solution. All of these universities are highly ranked in the QS World University Rankings. The National University of Singapore, ranked eighth in the QS system, has been using Qpercom since 2012.
St George’s, University of London, which has sites in Chicago, Philadelphia, Puerto Rico, Cyprus and Israel, is the latest to come on board. All of these sites use the same OSCE format and stations, across different continents and time zones. Previously, a staff member from St George’s had to be on site to deliver and collect the highly confidential paper assessment forms. Staff can now stay at home, administer the exams remotely, and retrieve, store and analyse the results automatically.
“Due to these savings in expenses, the error reduction and the reduced administration time, the return on investment takes only six to eight months. By contrast, paper-based assessment is estimated to cost about €25,000 per year for an average school of medicine hosting 1,000 students in a five-year programme,” according to Thomas Kropmans, CEO of Qpercom Ltd.
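As a rough worked example of that payback claim: the €25,000 annual paper-based cost is the estimate quoted above, while the one-off implementation cost below is a made-up figure for illustration only:

```python
# Illustrative payback calculation. The EUR 25,000 per-year paper cost is the
# estimate quoted above; the one-off setup cost is hypothetical.
paper_cost_per_year = 25_000   # EUR, quoted estimate for an average school
setup_cost = 15_000            # EUR, hypothetical one-off implementation cost

monthly_savings = paper_cost_per_year / 12
payback_months = setup_cost / monthly_savings
print(f"Payback in about {payback_months:.1f} months")  # about 7.2 months
```

Under those assumptions the payback lands inside the quoted six-to-eight-month window; a different implementation cost would shift it proportionally.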
Last but not least, thanks to the Student Feedback Email System, students can be informed about their exam results instantly; according to the literature, feedback is the most effective learning tool in practice-based education. In the paper-based approach, students had to wait three to four weeks for their results, and even then results came without feedback on how to improve their clinical performance. Qpercom does not sell software; it sells expertise in clinical-skills assessment, as outlined in its published research.
The unique position of a spin-out company associated with medical and health-science institutions affords Qpercom the opportunity to perform research in clinical skills assessment. Recently, PhD student Winny Setyonugroho graduated with a thesis on the assessment of communication skills, introducing a gold standard for streamlining the 17 different domains of clinical communication skills. Currently, PhD student Markus Fischer is studying the ‘situational awareness’ of undergraduate medical students. Are students prepared for their postgraduate professional lives on the ward? Are they aware of all the facts they need to take into account to make a sound medical decision?
“When we formed the company seven years ago, we could never have envisaged the impact our software solution would have on the 25 universities and professional bodies we work with now. Since then, we have co-published various scientific papers with some of our clients,” said Cunningham.