Medical personnel at the Aravind Eye Hospital perform an eye exam on a patient in Madurai. Image Credit: NYT

The Aravind Eye Hospital will treat anyone who comes through the door, with or without money. Each day, more than 2,000 people arrive from across India and sometimes other parts of the world, crowding into the hallways and waiting rooms of this 43-year-old hospital at the southern end of the country. On a recent morning, Vt Muthusamy Ramalingamm, a local resident, walked into a room on the second floor, sat down and rested his chin on a small desktop device that pointed a camera into his eyes.

A technician tapped on a screen at the back of an eye scanner, and within seconds a diagnosis appeared on a computer against the wall. Both eyes showed signs of diabetic retinopathy, a condition that can cause blindness if untreated.

In most hospitals and clinics around the world, trained physicians make this diagnosis, examining a patient’s eyes and identifying the tiny lesions, haemorrhages and discoloration that precede diabetic blindness. But Aravind is trying to automate the process. Working with a team of Google artificial intelligence researchers based in California, the hospital is testing a system that can recognise the condition on its own.

Google and its sister company Verily targeted this type of blindness because of its prevalence and because it is the sort of illness that an AI system can detect early. Google is not charging the hospital while it tests the technology.

Researchers hope this AI system will help doctors screen more patients in a country where diabetic retinopathy is increasingly prevalent.

Doctors at the Aravind Eye Hospital in Madurai, India, use computers to communicate with one of the small vision centres that the hospital runs. Image Credit: NYT

Nearly 70 million Indians are diabetic, according to the World Health Organisation, and all are at risk of blindness. But the country does not train enough doctors to properly screen them all. For every 1 million people in India, there are only 11 eye doctors, according to the International Council of Ophthalmology.

The project is part of a widespread effort to build and deploy systems that can automatically detect signs of illness and disease in medical scans. Hospitals in the United States, Britain and Singapore have also run clinical trials with systems that detect signs of diabetic blindness. Researchers across the globe are exploring technologies that detect cancer, stroke, heart disease and other conditions in X-rays and in MRI and CT scans.

Recently, regulators certified the eye system for use in Europe under the Verily name. And the Food and Drug Administration recently approved a similar system in the United States. But hospitals are treading lightly as they consider deploying systems that are vastly different from technology traditionally used for health care.

Aravind’s founder, Govindappa Venkataswamy, an iconic figure in India who was known as “Dr V” and died in 2006, envisioned a network of hospitals and vision centres that operate like McDonald’s franchises, systematically reproducing inexpensive forms of eye care for people across the country. There are now more than 40 of these vision centres around India.

In addition to screening patients in Madurai — one of the largest cities in southern India — the hospital plans to install Google’s technology in surrounding villages where few if any eye doctors are available. The new AI system could radically expand the number of people who can be screened.

“Right now, there is a bottleneck when it comes to just screening patients,” said Dr R. Kim, a nephew of Dr V’s who serves as chief medical officer at Aravind.

Behind the new screening methods are neural networks, complex mathematical systems that can learn tasks by analysing vast amounts of data. By analysing millions of retinal scans showing signs of diabetic blindness, a neural network can learn to identify the condition on its own.
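
To make that concrete, here is a minimal sketch, in PyTorch, of how such a classifier can be trained. It is a generic illustration rather than Google's actual system: the folder layout, the two-way labels and the ResNet-18 backbone are all assumptions made for this example.

```python
# Minimal sketch of training a neural network to flag diabetic
# retinopathy in retinal photographs. Illustrative only -- not
# Google's system. The folder layout, labels and ResNet-18 backbone
# are assumptions made for this example.
import torch
import torch.nn as nn
from torch.utils.data import DataLoader
from torchvision import datasets, models, transforms

# Hypothetical dataset: retinal photos sorted into folders by label,
# e.g. retina_scans/train/no_dr/ and retina_scans/train/dr/.
transform = transforms.Compose([
    transforms.Resize((299, 299)),
    transforms.ToTensor(),
])
train_data = datasets.ImageFolder("retina_scans/train", transform=transform)
loader = DataLoader(train_data, batch_size=32, shuffle=True)

# Start from a standard pretrained image network and replace its final
# layer with a two-way classifier: retinopathy vs no retinopathy.
model = models.resnet18(weights="IMAGENET1K_V1")
model.fc = nn.Linear(model.fc.in_features, 2)

optimizer = torch.optim.Adam(model.parameters(), lr=1e-4)
loss_fn = nn.CrossEntropyLoss()

model.train()
for epoch in range(5):  # a production system trains far longer, on far more data
    for images, labels in loader:
        optimizer.zero_grad()
        loss = loss_fn(model(images), labels)  # compare predictions to labels
        loss.backward()                        # adjust weights from each error
        optimizer.step()
```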

Neural networks are the same technology that is rapidly improving face recognition services, talking digital assistants, driverless cars and instant translation services such as Google Translate.

Because these systems learn from enormous amounts of information, researchers are still struggling to completely understand how they work — and how they will ultimately behave. But some experts believe that once they are honed, tested and properly deployed, they can fundamentally improve health care.

At Aravind, computer screens mounted on the walls of the waiting rooms translate information into the myriad languages spoken in the hospital.

During his exam, Ramalingamm, 60, spoke Tamil, the ancient language of southern India and Sri Lanka. He said he was comfortable with a machine diagnosing his eye condition, in part because it happened so quickly. After the initial screening by the AI system, doctors could treat the eyes, perhaps with laser surgery, to stave off blindness.

The system performs on a par with trained ophthalmologists, according to a study published in The Journal of the American Medical Association. But it is far from replacing a doctor entirely.

Earlier in the day, Pambaiyan Balusamy, 55, sat in the same room. The Google system diagnosed “proliferative” retinopathy in his left eye — the most serious form of the condition — but it could not read the scan of his right eye, most likely because the eye had developed a cataract.

Doctors can sometimes make a diagnosis when faced with cataracts and blurry eye scans. The Google system still struggles to do this. It is trained largely on clear, unobstructed images of the retina, though Google is exploring the use of lower-quality images.
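
That behaviour, declining to grade a poor scan rather than guessing, can be expressed as a simple two-stage pipeline. The sketch below is an assumed design for illustration, not a description of Google's pipeline: the quality_model, grading_model and 0.8 threshold are hypothetical, though the five severity grades follow the standard clinical scale.

```python
# Hypothetical two-stage screening pipeline: check image quality first,
# and return "ungradable" for scans the grader was not trained on
# (e.g. images blurred by a cataract) instead of guessing.
import torch

# Standard clinical severity scale for diabetic retinopathy.
GRADES = ["none", "mild", "moderate", "severe", "proliferative"]

def screen(image: torch.Tensor, quality_model, grading_model,
           min_quality: float = 0.8) -> str:
    """Return a retinopathy grade, or 'ungradable' for poor scans."""
    with torch.no_grad():
        # Stage 1: estimate whether the retina is clearly visible.
        quality = torch.sigmoid(quality_model(image)).item()
        if quality < min_quality:
            return "ungradable"  # cataract, blur, poor lighting, etc.
        # Stage 2: assign one of the standard severity grades.
        grade_index = grading_model(image).argmax(dim=-1).item()
        return GRADES[grade_index]
```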

Even with this limitation, Kim said, the system can augment what doctors can do on their own. Aravind already operates small vision centres in many of the cities and villages surrounding Madurai. The hope is that the Google system can make eye screening easier in these facilities and perhaps other locations across southern India.

Today, in these vision centres, technicians take eye scans and send them to doctors in Madurai for review. Automated diagnosis can streamline and expand the process, reaching more people in more places — the kind of “McDonaldisation” espoused by Dr V.

The technology still faces regulatory hurdles in India, in part because of the difficulty of navigating the country’s bureaucracy. And though Google’s eye system is certified for use in Europe, it is still awaiting approval in the United States.

People wait in line at the Aravind Eye Hospital in Madurai, India. Image Credit: NYT

Luke Oakden-Rayner, director of medical imaging research at the Royal Adelaide Hospital in Australia, said these systems might even need new regulatory frameworks because existing rules are not always sufficient.

“I am not convinced that people care enough about the safety of these systems,” he said.

Though these deep-learning systems are new, they are hardly the first effort to aid diagnosis through computer technology.

As Oakden-Rayner pointed out, software called breast CAD — approved by the Food and Drug Administration in 1998 — has been widely adopted in the United States to help with the detection of breast cancer, in part because Medicaid provides a rebate when the technology is used. But studies have shown that patient outcomes did not improve and in some cases declined.

“On paper, the Google system performs very well,” Oakden-Rayner said. “But when you roll it out to a huge population, there can be problems that do not show up for years.”

–New York Times News Service