A firm which reviews healthcare apps for several NHS trusts says 80% of the apps it has assessed do not meet its standards.
Failings include poor information, lack of security updates and insufficient awareness of regulatory requirements, said Orcha chief executive Liz Ashall-Payne.
The firm’s reviews help determine whether an app should be recommended to patients by NHS staff.
There are about 370,000 health-related apps available online, Orcha said.
App developers can categorise their apps themselves and the ones reviewed by the firm include those tagged health, fitness and medical.
So far, the firm has reviewed nearly 5,000 apps and found many poor examples, including:
- A diabetes management app offering complex medical support without any back-up from experts
- A physiotherapy app offering exercise plans without any visible input from professionals
- An app to help smokers quit, which had not had security updates in more than two years
One of the criteria on which many apps fail is regulation, Orcha (Organisation for the Review of Care and Health Apps) says – although this can be unintentional, if developers don’t realise what is required.
“Innovators can get a bad reputation and that can be unfair,” says Liz Ashall-Payne.
“Imagine if you have experienced a challenge with your health, or that of a loved one, and you just want to help others. You’re coming at it with good intentions, but you wouldn’t necessarily know which regulation your product needs.”
And it is not necessarily straightforward.
Any app which offers to calculate medicine doses or timings, or to diagnose injuries or conditions, is defined as a medical device. Such apps require a CE quality mark, according to the government regulator, the MHRA (Medicines and Healthcare products Regulatory Agency).
But even if an app does not fall into that category, its developer may, depending on the service offered, still need to contact a national regulator: the Care Quality Commission in England, Healthcare Inspectorate Wales, Healthcare Improvement Scotland, or the Regulation and Quality Improvement Authority in Northern Ireland.
For example, an app offering access to a virtual doctor would need to be registered with the CQC in England, but AI or machine learning elements would not fall under its remit.
And busy healthcare professionals might not be able to check this either.
“As a healthcare professional, you just want to get good health apps to your patients,” said Ms Ashall-Payne, a former NHS speech and language therapist.
“But it’s difficult to know which tools to recommend.”
The NHS also has its own public-facing app library, containing apps vetted by its digital team NHSX, which carried out some work with Orcha.
Apple and Google rules
Apple and Google have their own review process for allowing apps on their stores in the first place.
Google said in a statement that it reviewed all apps on a “case-by-case basis”, while Apple’s developer guidelines say medical apps “may be reviewed with greater scrutiny”.
Apple also says apps which claim to take X-rays, or to measure things like blood sugar levels using the device’s sensors, are banned.
Dr Jermaine Ravalier, from Bath Spa University, worked on an app aimed at helping NHS workers tackle mental health issues.
“Lots of apps are put together that are either poorly designed or not researched thoroughly enough,” he said.
Ideally, an app needs input from a large sample of people living with the condition it targets, rather than a select few or a single individual, he said.
“One issue is the other side – once it’s been designed, rather than rolling it out, check whether it’s actually going to be effective.”