Driving Innovation: U of T Collaborators Study ‘Inattentional Blindness’ on Canada’s Roads

Have you ever been driving when your car gently pulled you back into your lane? Been warned to grip the steering wheel more firmly? Or been alerted that the car in front of you is pulling away?
Advanced Driver Assistance Systems (ADAS) are designed to support the driver, helping you stay in your lane, or notifying you if there’s something in the environment to which you should pay attention.
But to make those systems work – and to make Canada’s roads safer for everyone – car designers and engineers need to first understand the limits: What do humans actually need? And when?
It all starts in the lab.
A new study by Professor Benjamin Wolfe, a faculty member in the Department of Psychology at the University of Toronto Mississauga (UTM), and Professor Birsen Donmez, from U of T’s Department of Mechanical and Industrial Engineering, is looking at driver distraction and exploring what intervening systems could be built to make drivers less vulnerable.
Together, they’re looking at a fundamental set of questions: Why do we fail to see things we think we should see? How do we combine this limitation of human vision and cognition – which psychologists call “inattentional blindness” – with automation in cars? And how can we build features that keep us safer?
Their project, titled “A Foundation for Next-Generation Driver Distraction Detection and Advanced Driver Assistance Systems (ADAS),” was recently awarded more than $500K over a 16-month period from Transport Canada’s Enhanced Road Safety Transfer Payment Program (ERSTPP).
“I’m interested in how you gather information in real-world tasks, and driving is one of those tasks where you do this all the time – and we do it incredibly well,” says Wolfe. “It lets us drive safely, but we don’t often think about how complicated that process really is.”
The ERSTPP was introduced in 2019 to provide funding to support initiatives that contribute to a safe and secure transportation system. From 2024 to 2026, the program is funding 35 projects focusing on everything from impaired driving to excessive speed, Connected and Automated Vehicles, enhancing ADAS technologies, and more.
By combining research conducted in each of their labs over the coming months, Wolfe and Donmez hope to provide Transport Canada with data on distracted driving that could be used to inform public policy, making cars better for drivers and roads safer for everyone.
Unfamiliar with inattentional blindness? You may have heard of the famous ‘invisible gorilla’ experiment, in which a person dressed in a gorilla suit wanders casually between two teams passing a basketball back and forth. If you’re busy counting how many times they pass the ball, you’ll probably miss the gorilla.
Or, for an example that may hit closer to home, think of the moment you hit send on an email – and only then see the typo. In fact, it’s so common in our lives that Wolfe and his collaborators call it ‘Normal Blindness’ to emphasize just how, well, normal it is.
Driving research describes this phenomenon as ‘looked but failed to see’: people can look right at something and still not report it. For this to happen, they typically need to be distracted or focused on another task.

Wolfe, who co-runs the Applied Perception Psychophysics Laboratory (APPLY) Group with fellow psychology professor Anna Kosovicheva, has long been interested in how we gather visual information from the world.
He does what he calls ‘use-inspired vision science,’ drawing on real-world problems such as digital readability or what drivers miss on the road. He brings these problems into the Wolfe Lab to understand their core mechanisms – along with how they fail – and then conveys the data he collects to those doing the work in implementation, engineering, and policy.
“I sit between psychology as an experimental discipline and Human Factors, an area of engineering that focuses on the user when designing solutions and systems,” says Wolfe. “Human Factors is where engineering meets the human, and I’m one of the rare people who works with vision and cognitive scientists on the one side, and with engineers and practitioners on the other.”
This isn’t Wolfe and Donmez’s first collaboration; two of the four grant proposals they’ve worked on together have been successful, including one for the XSeed interdivisional research funding program in 2023, which gave them the push and the initial results to apply for this much larger award from Transport Canada.
Wolfe's portion of the grant opens a number of doors. It has enabled him to update some of his lab’s equipment, including purchasing a high-speed eye tracker – essential for knowing whether people looked where they thought they looked in a road scene – and to support a postdoctoral fellow and graduate students in his lab.
With the December 2025 deadline around the corner, the research is well underway. Donmez’s team runs a driving simulator in the Human Factors & Applied Statistics Lab, which lets them create highly controlled conditions that can be replicated from one participant to the next and observe how drivers respond to unanticipated events, drawing on scenarios studied in the Wolfe Lab. Meanwhile, the Wolfe Lab is running desktop experiments that pair its database of road videos – where drivers can be distracted safely – with state-of-the-art eye tracking to see what they looked at and what they didn’t.
Fun fact: there are large communities of people who share their dashcam videos on YouTube and Reddit. Over time, Wolfe and his postdoctoral fellow, Dr. Jiali Song, have drawn on these videos to build the Road Hazard Stimuli, a publicly available database of dashcam clips for studying driver behaviour. They currently have more than 700 video clips – and of those, two-thirds capture near-collision situations. (You can learn more about the database in Song’s recent article.)
Each participant who comes into the lab is required to have a driver’s license – though Wolfe has plans to study people who have never been licensed – and sessions run anywhere from 90 minutes to two hours. To study distraction and inattentional blindness in the lab, participants are asked to do two tasks at once: they monitor a gauge, such as speed or engine temperature, and report when they see it in a certain position. At the same time, they’re asked to report anything dangerous that shows up on the screen – and they do all of this while Wolfe’s team tracks where they look.
The convenience and safety of the desktop setup mean that a participant can experience many more ‘events’ in a given amount of time without being in any danger.
“In a simulator, you can have one thing that you need to respond to every few minutes,” Wolfe explains. “But in our lab, we’re going to do that to participants every few seconds.”
By watching where a participant’s eyes go, Wolfe and his team can tell when they look right at something but don’t report seeing it.
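To make that idea concrete, here is a minimal, hypothetical sketch of the kind of analysis this implies – not the team’s actual code – in which gaze fixations and report keypresses are combined to label each hazard event. The data structures, field names, and time thresholds below are illustrative assumptions only.

```python
from dataclasses import dataclass

@dataclass
class Fixation:
    t: float   # time of the fixation (seconds into the clip)
    x: float   # gaze position on screen (pixels)
    y: float

@dataclass
class Hazard:
    onset: float    # when the hazard appears on screen (seconds)
    offset: float   # when it leaves the scene
    x_min: float    # bounding box of the hazard on screen (pixels)
    x_max: float
    y_min: float
    y_max: float

def classify_hazard(fixations, hazard, report_times, report_window=2.0):
    """Label one hazard event as 'detected', 'looked but failed to see',
    or 'missed', based on whether the participant fixated the hazard
    and whether a report keypress followed soon enough.

    The 2-second report window is a placeholder, not a value from the study.
    """
    # Did any fixation land inside the hazard's bounding box while it was visible?
    looked = any(
        hazard.onset <= f.t <= hazard.offset
        and hazard.x_min <= f.x <= hazard.x_max
        and hazard.y_min <= f.y <= hazard.y_max
        for f in fixations
    )

    # Did the participant press the report key within the allowed window?
    reported = any(
        hazard.onset <= r <= hazard.offset + report_window
        for r in report_times
    )

    if reported:
        return "detected"
    return "looked but failed to see" if looked else "missed"
```

In practice, the hazard’s on-screen location would come from frame-by-frame annotations of each clip and the fixations from the eye tracker; the trials of interest are the ones labelled ‘looked but failed to see,’ where the gaze lands on the hazard but no report follows.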
“If we can make this happen in the lab, where we can control the environment, as a follow up, we’re going to see if we can intervene,” says Wolfe. “Can we give them a cue or an alert at the right time that makes people less vulnerable to this effect?”
This collaboration and funding have allowed Wolfe’s team to ask a range of questions beyond how to make roads safer for everyone, and are helping to push his research towards more attentional questions – why don’t we see what we think we do?
“What I work on is in a different corner of psychology – I'm a bit different than my colleagues,” he says. “I want to understand how vision and perception work, but I also want to work with collaborators who can take what we learn and bring it out of the lab to designers, engineers and policymakers.”