Ramesh Raskar is an Associate Professor at the MIT Media Lab and director of the Camera Culture group, which has developed the NETRA (Near Eye Tool for Refractive Assessment) technology.
Other recent inventions include a camera that can look around corners, a next-generation CAT scan machine, imperceptible markers for motion capture, long-distance barcodes (Bokode), and touch-and-hover 3D interaction displays (the BiDi screen). He holds 40 US patents and has received four Mitsubishi Electric Invention Awards.
Raskar, who received his PhD from the University of North Carolina at Chapel Hill, speaks to Nandini Krishnan about his pet project NETRA and how he plans to take it forward, so that billions of people in remote areas can access eye care.
How did the idea of using mobile phones to measure refractive error strike you?
Last year, our group had an invention called Bokode as part of another project. It's basically a microdot, and when you take a picture of it, you actually see a barcode. It's a clever way of hiding a lot of information, like you would see on the barcode of a product, on a very small dot. And the interesting thing was that, although that microdot is supposed to be photographed by a camera, you can also hold it close to your eye, and you will see the barcode. So it was a fun thing to do, and I showed it to many people. But when I brought it home and showed it to my wife, she couldn't see the barcode very clearly with her right eye, but she could with her left. And then we remembered she has differential power: no power in her left eye, but some in the right eye. The other people I'd shown it to were at the office, so they had their glasses or contacts on. So that was a revelation that Bokode could also be used to check whether someone has a refractive error.
What stage of development is it in? You say you’ve performed human subject tests, but not field tests. Could you explain that?
To do any human subject test, we need approval from a regulatory body. We've been able to get approval from MIT, so we've done human subject tests on about 20 people now. But to do field tests, we are waiting for approval, and we will be entering into a collaboration with the LV Prasad Eye Institute in Hyderabad in India, and other institutions.
You’ve tested it on 20 subjects. Isn’t that a smaller sample size than usual?
This is the subjective test. We also did an objective test, where we put our device through a very sophisticated measurement procedure, using cameras and lenses to simulate the eye. So you take a high-quality SLR camera and adjust it such that the camera has a refractive error. And it worked really well there. For the subjective test, 20 is a large number because each of the subjects had a different kind of error, not just nearsightedness and farsightedness.
Can NETRA produce an accurate reading to diagnose astigmatism?
That's the key. A lot of methods out there weren't getting astigmatism right. Because our method is software-based, we can show these lines at different orientations without a person ever having to remove the clip-on device. We do the test in eight different directions, eight different orientations, and from that, we can also measure astigmatism.
Is NETRA more accurate than the current methods of diagnosing refractive error?
Right now, the most common method is still the reading chart. It is very inaccurate because it depends on many factors. For example, if the room is bright, your pupil size is small and you can actually see better than what your prescription will say. Even if you want to do it in the right setting, you need a couple of trial lenses, which will cost a lot of money. Then, the other test is one where a device shines a laser into the eye and a camera basically looks at what comes back out of the eye. As you can imagine, shining a laser into the eye is only allowed with sophisticated devices. They're expensive, bulky, and you need a trained person to use them. The basic idea with NETRA is that by involving the user, we can empower people.
Does NETRA mean you don’t have to go to the optometrist? Optometrists don’t just check for refractive error, do they?
That's an extremely good point. We do advise that this is not a replacement for an optometrist, because today's optometrist will do a lot more. They also look at your retina and the blood vessels there, to see if you have a diabetes-related problem. But the other point is that these are things a sophisticated ophthalmologist will do. At the same time, what we're saying is that there are millions of people who have no access to an ophthalmologist. Think about a remote village. It's a lot easier for them to access a cell phone. Forty percent of the world population, about 2 billion people, have refractive error, and many of them don't find out. What's very sad is that two percent of the world population is blind because of uncorrected refractive error, simply because they don't have access to an optometrist. So it's just a first line of defence.
Are you planning to incorporate more parameters, so that eye diseases can be diagnosed?
Right now, we're only aiming at refractive error. But based on initial tests, we have a solution for some other important problems too, like cataract, where the eye gets cloudy. We have not published it yet.
Will it help in diagnosing glaucoma, given that the progression of that disease is very slow?
For cataract and glaucoma, which are major causes of vision loss, we’re still working on models. We’re in the initial stage of testing for diagnosing cataract.
Who is your target audience? International Centre for Eye Health says refractive error is a lot more common among people who are below the poverty line, and can’t afford eye care. Are these the only people you’re targeting?
We are using the NGO model and partnering with a reputed institution. So there are two parts to this – one is measuring the problem, and the second is providing eye glasses. For the longest time, providing eyeglasses was the biggest challenge. They were very expensive. But now, they’re affordable, and the problem is access. You must have some expert person who will examine the eyes, and there is a shortage in the demographic that we’re targeting. This includes people who are too remote or too poor. There’s also a third audience – people who are not poor, but who have never realised they must have an eye test because their sight seems fine. I myself had never gone to an optometrist until I started working on this project. During our research at MIT, we found that a lot of people found out they had power, which they hadn’t realised before.
But your main target audience wouldn’t be familiar with technology either. So how will you market it to them?
There are several models. If you work with an NGO, they'll have their own team. We're thinking this is not for profit, but it must also be self-sustaining. For example, the same shopkeeper who sells SIM cards can also stock the clip-on device. And the people who use it can tell the shopkeeper their reading. When he next goes to town, he can buy the right glasses for them and bring them back. So there is an incentive for the shopkeeper.
Can it be used for children, and measured by their parents? What is the age range?
The reason we say it will be difficult to measure in kids is that, unlike a sophisticated wavefront aberrometer, NETRA is not completely automatic. With an aberrometer, you just look into the device and get your refractive error immediately. Our method is more interactive, based on clicking a few buttons, aligning the patterns or lines and so on. It's really a matter of how old a child is before he or she can follow the instructions. So it depends on how well they communicate; it's subjective.
Is it hygienic to use the same device on multiple patients, or is it disposable?
That's important. When you use it on more than one person, it's advisable to use a wipe between uses, so you don't transfer anything from one person to the next. But since you're only using it for a few seconds and it won't be in contact with the eye, it's not critical.
You say the device is used on mobiles because you need an LCD screen for programmable high-resolution display. Can it work on other LCD monitors such as a TV or laptop too?
The main reason we're able to do this is that on the new mobile phones, like the Google Nexus phone or the next iPhone, the LCD screen is at an extremely high resolution. What we're doing now would not have been possible even two years ago, because screens simply were not high-resolution enough then. A TV, laptop, or bigger monitor is not good enough for what we want to do, but we actually have a solution in the pipeline that may allow you to use those too.
Do you have enough support for the initiative?
There is increasing awareness about it. When the United Nations and other bodies speak about world challenges, they have so far focused on things like malaria and TB. But refractive error has now become one of the major goals of the UN Development Programme. It affects a large population, and their daily livelihood, because if you can't see, you can't work. It didn't get as much attention so far because it isn't as dangerous as the others. However, it is a key issue now.