Experts are calling for action on medical devices that are prone to unfair biases, including blood oxygen monitors and certain artificial intelligence (AI) enabled tools, to prevent harm to ethnic minorities and women.
A report details the findings of the Independent Review of Equity in Medical Devices which looked at the extent and impact of ethnic and other unfair biases in the performance of equipment commonly used in the NHS.
It focused on optical devices such as pulse oximeters, AI-enabled devices and certain genomics applications, where evidence suggested there was substantial potential for harm.
The panel found evidence that pulse oximeters (blood oxygen monitors) – widely used during the Covid-19 pandemic – can overestimate the amount of oxygen in the blood of people with darker skin tones.
This could lead to delay in treatment if dangerously low oxygen levels were missed.
The experts say they did not specifically look at the use of these devices during the pandemic, but because there was an overwhelming number of people with very low oxygen levels “the likelihood is that that inaccuracy was large at that time”.
Daniel Martin, professor of perioperative and intensive care medicine, Peninsula Medical School, University of Plymouth, said: “We can only say that there’s association between the harm and the inaccuracy and not causation.
“But I think it’s a reasonably strong signal that there’s a potential of harm there, particularly during Covid when oxygen levels are so very low.”
The review makes a number of recommendations in relation to the devices, including patients being advised to look out for other symptoms such as shortness of breath, chest pain and fast heart rate.
It also suggests researchers and manufacturers should work to produce devices that are not biased by skin tone.
On AI-enabled devices, the review found evidence of potential biases against women, ethnic minorities and people from disadvantaged socioeconomic groups.
It highlights potential underdiagnosis of skin cancers for people with darker skin when using AI-enabled devices.
The report suggests this is a result of the machines being trained mainly on images of lighter skin tones.
There is also a long-standing problem of underdiagnosis of heart conditions in women, which AI algorithms in medical devices could make worse, the panel suggests.
The University of Liverpool’s Professor Dame Margaret Whitehead, chairwoman of the review, said: “The advance of AI in medical devices could bring great benefits, but it could also bring harm through inherent bias against certain groups in the population, notably women, people from ethnic minorities and disadvantaged socio-economic groups.
“Our review reveals how existing biases and injustices in society can unwittingly be incorporated at every stage of the lifecycle of AI-enabled medical devices, and then magnified in algorithm development and machine learning.
“Our recommendations, therefore, call for system-wide action by many stakeholders and now need to be implemented as a matter of priority with full Government support.”
Among its recommendations, the report suggests there should be renewed efforts to increase skin tone diversity in medical imaging databanks used for developing and testing optical devices for dermatology, including in clinical trials, and to improve the tools for measuring skin tone incorporated into optical devices.
Enitan Carrol, professor of paediatric infection, University of Liverpool, said: “The NHS has a responsibility to maintain the highest standards of safety and effectiveness of medical devices in use for patients.
“We found no evidence of actual harm in the NHS, but only the potential for racial and ethnic bias in the performance of some medical devices commonly used in the NHS.”
Panel member Professor Chris Holmes warned that the Government needs to understand how AI, including programmes such as ChatGPT, will disrupt clinical and public health practices.
He said: “We are calling on the Government to appoint an expert panel including clinical, technology and healthcare leaders, patient and public representatives and industry to assess the potential unintended consequences arising from the AI revolution in healthcare.
“Now is the time to seize the opportunity to incorporate action on equity in medical devices into the overarching global strategies on AI safety.”
The review was set up in 2022 by then-secretary of state for health and social care Sir Sajid Javid.
He said: “The colour of someone’s skin or where they are from should not impact health outcomes yet the pandemic highlighted how too many of these inequalities remain.
“I hope this review and its important recommendations will help deliver much-needed change.”
In response to the report, health minister Andrew Stephenson said: "I am hugely grateful to Professor Dame Margaret Whitehead for carrying out this important review.
“Making sure the healthcare system works for everyone, regardless of ethnicity, is paramount to our values as a nation. It supports our wider work to create a fairer and simpler NHS.”
The Department of Health and Social Care said significant action is already being taken to overcome potential disparities in the performance of medical devices.
This includes the Medicines and Healthcare products Regulatory Agency (MHRA) now requesting that approval applications for new medical devices describe how they will address bias.
NHS guidance has been updated to highlight potential limitations of pulse oximeter devices on patients with darker skin tone.
The government will also work with the MHRA to ensure regulations for medical devices are safe for patients, regardless of their background, while allowing more innovative products to be placed on the UK market.
Professor Bola Owolabi, NHS England’s director of healthcare inequalities, said: “Ensuring all patients get equitable access to high-quality healthcare remains crucial to reducing health inequalities and a priority for the NHS.
“I welcome the report’s findings and the NHS will work alongside Government and the MHRA to implement them and ensure NHS staff have the resources and training they need to tackle racial bias.”