A leading med-tech expert has issued a stark warning about the risks to patient safety and data privacy posed by the AI tools now in widespread use by GPs during NHS consultations, tools that were the subject of NHS guidance shared this week.
Dr Andrew Whiteley – a former GP and founder of Lexacom, one of the UK’s longest-established developers of next-generation speech-powered products – is calling for urgent action to ensure Ambient Voice Technology (AVT) tools meet NHS compliance standards before being deployed in patient care.
“Harnessing efficiency gains in primary care via AI is important, but this should not come at the cost of patient safety or data security,” says Dr Whiteley. “The reality is that GPs are increasingly under pressure and time-poor. Despite this, there is a clear hunger to interrogate emerging solutions and embrace new systems that could be transformative for clinicians and patients alike. So the task at hand is to offer them guidance to make choices that are compliant with NHS regulations.
“In particular, any solution that processes patient consultations through AI must, at minimum, automatically redact personal data before processing – otherwise the risk of breaches is simply too great. And crucially, we believe data must be stored in the United Kingdom, where it is protected under UK law – providing greater clarity and reassurance around accountability and legal safeguards.”
Figures highlighted by Lexacom, including a recent YouGov survey it commissioned, show:
- 73% of UK adults do not trust big tech companies with their personal health data (YouGov)
- Over 8,000 data breaches have been reported across the UK health sector since 2019
“There’s a tension we must address,” adds Dr Whiteley. “The public is open to AI in healthcare – but only if it’s used responsibly. Trust is fragile. We owe it to patients to show that innovation can go hand-in-hand with robust protections.”