Eye scans from 150,000 NHS patients in the United Kingdom will be used to test off-the-shelf artificial intelligence tools that can detect warning signs of vision loss caused by diabetes. The researchers aim to avoid repeating previous NHS data-sharing scandals by anonymizing the records and allowing the AI tools to run only on servers approved by the NHS itself.
In the UK, 2 million people with diabetes are scanned each year for signs of retinopathy, the leading cause of blindness in the country's working-age population. Each screening involves one or more photographs of each eye, depending on the signs of underlying disease, and the images are checked manually by ophthalmologists, optometrists and other trained graders, of whom there are around 1,000, each screening up to 10,000 photos a year.
Last year, Alicja Rudnicka and her colleagues at St George's, University of London, ran an experiment in which an AI system examined scans from 30,000 patients in London and Gloucester. The AI halved the number of human checks required, with no false positives. Rudnicka's team is now working on a larger trial, which she calls a "real game-changer": several companies will be invited to run their AI tools on NHS data to assess their potential for future deployment in the UK.
"The idea behind screening is to prevent vision loss, and it has proved very effective, but it's just a lot of work," says Rudnicka. "Halving the workload is a huge deal."
Because the team's previous study drew its data mostly from white patients, the next trial will gather more data from people of other ethnicities, with the aim of ensuring the system is not racially biased. Pigmentation at the back of the eye varies with skin pigmentation, so scans from different ethnic groups can look significantly different, and an AI trained mainly on images from white patients may be less effective for people with other skin tones.
Approximately 150,000 people screened at NHS foundation trusts, including University College London Hospitals NHS Foundation Trust and Moorfields Eye Hospital NHS Foundation Trust, will be included in the trial. Rudnicka told New Scientist that patients will be able to opt out if they wish, but it is unclear whether they will be routinely notified about the trial during screening.
Previous AI trials involving NHS data have proved controversial. DeepMind, a UK artificial intelligence company owned by Google's parent company Alphabet, announced in 2016 that it was working with the NHS to predict and diagnose acute kidney injury. However, its data-sharing agreement with the Royal Free London NHS Foundation Trust gave DeepMind access to comprehensive records on 1.6 million people, including sensitive information such as whether they had been diagnosed with HIV or had had an abortion. The agreement was later found to breach data protection law.
The new trial will anonymize the data by stripping out personal information and will store the resulting dataset on servers inside the NHS. Companies will be able to install their code on these servers and test it against the data, but, Rudnicka says, they will not be able to extract any part of the dataset. The AI tools will only assess the images for warning signs; they will not be able to learn from the dataset. The final list of participating companies has not yet been decided, but there could be as many as six.
Sam Smith, who runs the UK medical data privacy group medConfidential, wants guarantees about who will benefit from the trial's results, and for patients to be told that their data is being used. "Will everything they find be open knowledge, not private gain?" he says. "Or will patients' data be fed into companies' AIs without them having any choice, or even being notified?"