How does a computer know that you’re really you? A few years ago, our phones began to check fingerprints and faces instead of passcodes—the type of biometric scans you used to see only in movies like Mission: Impossible. These methods are now commonplace, but they are also more fallible than you might think.
Naser Al Madi, an assistant professor in Colby’s Computer Science Department, is working with students to refine a potentially more secure way to verify a person’s identity: eye movement. With Ricky Peng ’24, he is exploring how artificial intelligence could help boost the accuracy of this promising technique.
It’s hard to imagine anything more distinctively yours than your fingerprint or your face. But because these traits are relatively static, they are hackable: if an image can be copied, after all, it can be used to trick a sensor into accepting it as the real thing. And since the coronavirus pandemic, we also need contact-free ways to access public machines like ATMs and ticketing kiosks.
Beyond security and germ concerns, traditional scans won’t work as we become even more seamlessly integrated with our gadgets.
“Electronics are expected to move from cell phones to wearable technology like Google Glass,” Al Madi said, referencing Google’s computer embedded within eyeglasses. “You’re going to interact with a device through vision and through your eyes, and you won’t have a keyboard.” What, in that case, becomes your login signature?
Your moving eye is one option. “We don’t have a motor today that can replicate the movement of the human eye just yet,” said Al Madi. Each person has variations in the way they see, and the six muscles that control the eye can act almost like a password with six digits. But current research based on physical eye movement alone has only been able to identify people with 60-percent accuracy, he added.
Al Madi and Peng are combining physical movement with cognitive cues to create a reliable and more accurate individual “eyeprint.” Instead of simply gauging how your eye moves as it tracks a dot, for example, machine learning can help deconstruct how a person processes a sentence. So far, they have been able to boost the accuracy of eye movement scans to 73 percent.
As a kid in elementary school during science class, Peng watched a Google video about smart home systems that could be controlled with voice and eye commands. The video made a lasting impression, sparking his interest in automation and robotics. Al Madi’s lab was a natural fit with his interests, so he applied to join. His first project was contributing to an open-source library of code that anyone could use as a toolkit to analyze and process eye movement.
Noticing Peng’s talent for writing clear code, Al Madi suggested he take on the eye movement authentication project, which they cleverly titled “Eyes, Window to the Soul or Key to Your Computer?” and presented at Colby’s Undergraduate Summer Research Retreat in August 2021. Peng spent the summer processing an eye movement dataset called GazeBase, which was collected from more than 300 participants at Facebook Reality Labs. The dataset recorded the participants’ eye movements during activities such as reading poetry and watching videos.
Peng’s task was to extract, analyze, and model cognitive and linguistic features for the language-based data, such as how long readers lingered on a first word, whether the reader was likely to skip words, and how long the reading took overall.
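To make the kinds of features described above concrete, here is a minimal, hypothetical sketch in Python. It assumes reading data has already been reduced to a list of fixations, each a `(word_index, duration_ms)` pair, and computes three of the measures mentioned: dwell time on the first word, the rate of skipped words, and total reading time. The data structure and function name are illustrative assumptions, not taken from the actual GazeBase pipeline.

```python
def reading_features(fixations, num_words):
    """Compute simple reading-behavior features from fixation data.

    fixations: list of (word_index, duration_ms) tuples in reading order
               (a hypothetical simplified format, not GazeBase's schema).
    num_words: number of words in the sentence that was read.
    """
    # Total time spent reading the sentence
    total_time = sum(duration for _, duration in fixations)

    # How long the reader lingered on the first word (index 0)
    first_word_time = sum(d for w, d in fixations if w == 0)

    # Words that were never fixated count as skipped
    fixated_words = {w for w, _ in fixations}
    skip_rate = 1 - len(fixated_words) / num_words

    return {
        "first_word_ms": first_word_time,
        "skip_rate": skip_rate,
        "total_ms": total_time,
    }


# Example: a six-word sentence where words 2 and 4 were skipped
features = reading_features(
    [(0, 200), (1, 180), (3, 150), (5, 220)], num_words=6
)
```

In this toy example, the reader fixated four of six words, so the skip rate is 1/3; a real pipeline would compute many more such features per sentence and feed them to a machine-learning model.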
During the summer, Peng met with Al Madi three or four days a week and also for lunch weekly. In addition to that guidance, it helped that both are night owls.
“Many times, one of us would send an email to the other after 9:00 p.m. and would get a very quick response,” Peng said. “Then we would know we were both working and would start to shoot more emails back and forth to discuss any bugs, progress, or ideas.”
Experienced readers, Peng noted, are likely to skip over common and expected words. He gave an example: “The doctor said excessive drinking could hurt your…” Many readers won’t need to actually read the word “liver” to know that it completes the sentence.
Some eye movements can take as little as 15 milliseconds, Al Madi said, and a sophisticated eye tracker will take 1,000 samples per second. Given enough time and the right task, in theory it’s possible to identify someone based on how they read. But most everyday devices can’t accomplish this on their own: an average smartphone camera takes just 30 to 60 samples per second. One area of research, Al Madi said, focuses on training software to fill in the frames that simpler cameras miss, which could eventually make eye-movement tracking feasible on mainstream devices.
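The research described above uses trained software to fill in missing frames; as a much simpler stand-in, the idea of densifying a low-rate gaze trace can be sketched with plain linear interpolation. This is an illustrative assumption only, not the actual method from Al Madi’s lab: it inserts evenly spaced points between consecutive `(x, y)` gaze samples.

```python
def upsample_gaze(samples, factor):
    """Densify a gaze trace by linear interpolation (a naive stand-in
    for learned frame-filling).

    samples: list of (x, y) gaze positions recorded at a low frame rate.
    factor:  number of output points per original sampling interval,
             e.g. factor=2 roughly doubles the effective sample rate.
    """
    dense = []
    for (x0, y0), (x1, y1) in zip(samples, samples[1:]):
        # Insert `factor` evenly spaced points along each segment
        for i in range(factor):
            t = i / factor
            dense.append((x0 + t * (x1 - x0), y0 + t * (y1 - y0)))
    dense.append(samples[-1])  # keep the final recorded sample
    return dense


# A 30-samples-per-second trace upsampled toward 60 samples per second
trace = upsample_gaze([(0, 0), (2, 2), (4, 0)], factor=2)
```

A learned model would do far better than straight lines, since real saccades are fast and nonlinear, which is precisely why this is an active research problem rather than a solved one.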
Having tested the linguistics-processing theory using Facebook’s dataset, Peng and other students are now using an eye tracker assembled in Al Madi’s lab to collect and analyze their own data. The goal is to get even better accuracy to make eye movement a more realistic option for authenticating users.
Even if an algorithm succeeds at identifying someone based on how they read a sentence, the same person won’t necessarily be consistent. If you’ve had your morning cup of coffee, Al Madi noted, or you’re low on sleep, that will affect your eye movement. This variation isn’t necessarily a bad thing, since such small changes can act like a password that changes itself regularly. But any successful system will need to account for such individual variations, making it a remarkably complex problem.
Still, the promise of being able to ditch passwords and scans makes the effort worthwhile. After all, Al Madi said, “nobody can steal your eye movement.”