It Started With Drunk Driving and Mobile Phones

Human Factors · Career · Research

Not the combination you'd expect. But that's where 25 years of watching technology change human behaviour began — in a research lab, with a driving simulator, and two questions that have followed me ever since: How do we make this better? And: Are we moving in the right direction?

School was always hard for me

My father died when I was 15. I became a people pleaser and tried to get through it. I had 100% attendance, but my grades were nothing to write home about. I shut down in ways I didn't fully understand at the time.

My mom didn't let me quit. She believed — fiercely, non-negotiably — that the biggest failure was never trying at all. So when I finished high school, she made it clear: she would help pay for university if I went right away. I enrolled at the University of Calgary as an undecided major, sampling Psychology, Sociology, Communications. I took a year-long English class that I despised by the second session — it was on Chaucer, and no matter how much work I put in, I got a C. So I stopped going and got a D. It destroyed my overall GPA.

I never took another English class. I refocused on Psychology, minored in Sociology, and kept going.

Then in my final semester, I took Human Factors 101 with Dr. Jeff Caird. It changed my life. I think he and that course saved my life.

Everything I had been studying suddenly connected: human behaviour, systems, technology, limitations, unintended consequences. The light came on. I pulled up my GPA. And I knew, for the first time, what I wanted to do.

I went to Jeff and asked to work in his lab. He told me to keep bugging him. So I did — I showed up, worked for free, made myself useful until he took me on. It was one of the best periods of my life.

CERL research lab

The CERL lab — where it all started.

The world was just beginning to change

To understand why Human Factors gripped me, you have to understand the time. I grew up with one landline phone in the kitchen. A television with bunny ears and three channels. Cassette tapes. Walkmans. Atari. Fax machines.

When I took Human Factors 101, email was still a novelty. Facebook didn't exist. Most people accessed the internet through slow dial-up connections. Technology was beginning to reshape daily life — but we were still early enough to see the seams. Early enough to ask what it was doing to us.

I worked in the lab for a year studying the impact of alcohol and cognitive distraction on driving. We used a driving simulator to understand how mobile phones — even basic ones — changed behaviour behind the wheel. Not in the obvious ways, but in quieter ones: how attention shifted, how decisions changed, how people overestimated their ability to manage both.

Technology changes behaviour faster than we realize. And humans adapt — sometimes in dangerous ways.

From research to real-world systems

I took extra classes so I could apply to graduate programmes in the USA. I went to the Driving Assessment conference in Salt Lake City on a road trip with my labmates — met researchers from different programmes, made connections, started to see where this could go. I took my GREs and applied across disciplines — Human Factors sits across Psychology, Engineering, and Computer Science programmes. I was accepted to several. I chose the University of Iowa — Industrial Engineering, Human Factors programme.

My mom drove me across the border and across the country. She helped me open a bank account and find my first apartment, then went back home to Canada. I started my Master's, finished in a year and a half, and stayed for my PhD.

My focus was on collision warning systems — early, faulty, inconsistent technology that gave off what felt like random alerts. I studied how people interacted with a system when it wasn't completely reliable. People would trust it when it worked most of the time, then fail to intervene in rare critical moments. Or they'd lose trust entirely after too many false alarms and switch it off — losing the genuine safety benefits.

The goal

It wasn't maximizing trust. It was calibrating it — understanding enough about how a system works and where it fails to navigate it well.

That idea has followed me through every role since.

Dissertation research on warning types

Dissertation research on collision warning systems and how different warning types affected driver response.

From the lab to Google — and across the world

By the time I was finishing my dissertation, the light had gone out a little. I was dragging. My advisor had gone on sabbatical. I'd been at it for years. I was going to be an academic — or something. And then an ad appeared in the HFES bulletin for an internship at Google. My mom convinced me to apply. I did.

After several stressful rounds of interviews, they offered me the position. I quit smoking. Wrapped up my research. We rented a place on Craigslist. My husband and I — both still finishing our PhDs — packed everything into our new car and drove from Iowa City to San Francisco with no plan beyond the next few months. I submitted my dissertation to the committee. We just took the leap.

I arrived and found out I was working on the autonomous vehicle project at Google X. It blew my mind.

Larry and Sergey visited the office. We used prototypes of Google Glass and an autonomous golf cart. I took a test drive from Mountain View to Carmel with engineers who would go on to found Waymo and Aurora. The sensors weren't perfect. The algorithms weren't perfect. And people are very bad at monitoring automation — the same patterns I'd studied in my dissertation were playing out in real time, at a scale and speed I hadn't imagined.

My mentor told me something I needed to hear: it doesn't have to be perfect to be useful. I've been trying to hold onto that ever since.

Google self-driving Caddy
Monica with the Chauffeur prototype
Google driving simulator

Testing in the simulator — the same patterns from my dissertation, playing out at scale.

YouTube, Maps, and learning what research could really do

I defended my dissertation, converted to a full-time role, and joined YouTube as a UX researcher. I was surrounded by some of the first UX researchers at Google — people doing work I found genuinely inspiring. For a long time I felt like I didn't belong, and it pushed me to be better. I worked on expanding the platform — personalization, content discovery, and the expansion into shows, music, and paid content. We ran immersion and diary studies to understand why and how people watched, what they needed, what would make the platform genuinely better. It was inspiring work, surrounded by smart and talented people who cared deeply about getting it right.

Then I moved to Google Maps and Android Auto — back to my roots in driving, this time in a product environment that moved at a completely different pace. We ran immersion studies bringing executives into different countries to sit with real users. I built a driving simulator from scratch. I introduced structured testing into a culture built around fast launches. I built frameworks for understanding how people actually navigated — walking, driving, motorcycling across different countries — that still shape how those products are developed today.

What I loved most was when research changed the direction of a product — not just the details, but the strategy. When a team heard something from a real user that reframed what they were building and why. We were not just generating insights. We were building knowledge and empathy that informed what got prioritised, what got cut, and what actually made it into people's hands. That shift — from research as validation to research as strategy — is what good human factors work looks like. It never got old.

The leap across the world

It started with what looked like a phishing email. It said it was from the head of product at Agoda — a Booking Holdings company based in Thailand. I almost ignored it. I took a call instead. That call led to a flight to Bangkok for a few days to interview.

It had the energy of early YouTube — expanding fast, hungry for insights, full of possibility. I knew immediately I wanted to be part of it. But it was not really that simple.

There were months of conversations, back and forth, negotiation and doubt. My husband thought I would make more money. I did not. We debated it endlessly — the logistics, the kids, the risk, what we would be giving up. In the end we committed fully. We sold the house. Sold the cars. My husband quit his job. We packed up our two- and four-year-olds and flew across the world with no plan beyond landing. We lived in temporary housing for four weeks waiting for our things to arrive.

My husband eventually got a full-time role at Agoda too — his desk a few feet from mine. Bangkok became home in a way neither of us planned. It has been the best move we have ever made.

Some bets aren't really about the money. They're about the life on the other side of the leap.

Agoda Bali field research

Field immersion research with hosts and travellers, Bali, Indonesia.

At Agoda I joined a small team on a fast-growing platform. My early focus was on Agoda Homes — building out the traveller and host experience, running large-scale studies across Japan, China, and Southeast Asia that informed the direction of the product and helped define the strategy. We worked iteratively, cross-functionally, and the research shaped what actually got built.

When my manager left, I stepped into the lead. We moved from UX research to product insights — managing a team of qualitative researchers and quantitative analysts, building programmes for partner, traveller, and customer research. We piloted new approaches to make our insights more valuable and to shape strategy rather than just validate decisions. I was learning to lead — and getting a lot of it wrong. Making the case for research in an organisation that trusted experimentation and data above everything else was hard. Convincing people to slow down long enough to understand something was harder.

Then COVID happened. The business grew tighter. Several reorgs. Some layoffs. Manager changes. Reduced budgets. Working from home with kids doing school at the kitchen table. I was trying to be a good leader and a good parent at the same time — and I had failed to put my own oxygen mask on first.

My body gave out before I did. It was the first time I really listened to it. I stepped back. Helped the kids with online learning. Nursed myself back to health. And started to ask, honestly, what I actually wanted next.

Looking back, I learned something that took me too long to understand: you need to hold strong and stay true — to what you know, to what you're doing, to why it matters. To your values and who you are. That's not a small thing to learn. It cost me a lot to get there.

Electrolux

Leading content systems across APAC and Europe for Electrolux.

After a year off I joined Electrolux, eventually leading website content across APAC and Europe — 150+ domains, cross-regional teams. On paper it looked like a different kind of work. In practice, it was Human Factors applied to an entirely different kind of system.

The questions were the same ones I'd always asked: why is this breaking down? Why are people redoing work that should already be done? Why isn't the output good enough — and where in the process does it go wrong? How do you design a system that holds together under real pressure, across different teams, cultures, and languages, and still produces something useful at the end?

It turned out I was still studying how humans interact with complex systems. The system had just changed from a car or a product to an organisation itself. And the same principles applied — understand the real problem, design for how people actually work, reduce unnecessary friction, build something that people can actually use.

I also learned something new: I like building things, not just understanding them. Not just generating insights but bringing them to life, owning their execution, seeing what happens when the idea meets reality. That matters for what comes next.

Dam and Flow is that next thing.

The industries changed. The work didn't. It has always been about how humans interact with complex systems — and how small design decisions create massive ripple effects.

Why I'm writing now

My mom died last year. She knew it was coming — two years of COPD, her life no longer being what it was. In those two years she became incredibly intentional: choosing what to keep, what to let go, repairing relationships, reducing the clutter that no one else would value. She wanted to live with dignity. And she did.

Her death made me realise that I had not been.

After she died, I spent time sifting through grief and also through everything she'd left behind. And I noticed how much clutter exists in my own life — digitally, physically. The purchases and the excess that technology enables. The way I'd drifted from the original question I started with.

That question was always: How do we make this better?

Over the years it slowly became: How do we grow the business? The drift was gradual. I barely noticed it happening.

Part of that drift was structural — not just personal. The field itself had been evolving and expanding. Human Factors merged with cognitive psychology and then with behavioural economics — the work of researchers like Daniel Kahneman and Dan Ariely revealing just how predictably irrational human decision-making actually is, and how reliably it can be shaped by the right design. UX research absorbed all of it and grew into something far more powerful than any of its predecessors alone. Together they became the toolkit for designing products and services that work with human psychology rather than against it. The science of how people think, decide, and behave became extraordinarily sophisticated. And that sophistication was increasingly deployed in the hands of businesses whose primary measure of success was growth — not safety, not wellbeing, not the original question the field was built to answer.

The knowledge kept expanding. The guardrails did not keep pace.

My kids have devices now. I watch them navigate systems designed to capture attention, not protect it. And I see myself in them — I still find myself scrolling when I meant to check just one thing. I helped build some of these systems. I know exactly how they work. And I still end up somewhere I didn't intend to be.

That's not a personal failing. It's a design problem. And it's exactly what the next post is about.

This blog is where I think out loud about that — about what these systems actually do, why it keeps happening, and what we can do about it. Not alarm. Not restriction. Calibration. The kind that comes from understanding how these systems work, what they're designed to do, and what we want instead.

My mom spent her last two years choosing what to keep. That feels like exactly the right place to start.

Further reading
Foundational books
  • The Design of Everyday Things — Don Norman
  • An Introduction to Human Factors Engineering — Christopher D. Wickens et al.
  • Thinking, Fast and Slow — Daniel Kahneman
  • Predictably Irrational: The Hidden Forces That Shape Our Decisions — Dan Ariely
Field operational testing
  • Field Testing Begins on Crash‑Warning System — Automotive Fleet
  • IVBSS Field Operational Test Final Program Report — DOT/NHTSA / UMTRI
Google's self-driving car project

This is an ongoing conversation — about what these systems do, why it keeps happening, and what we can do about it together. Whether you're a parent, a school, or someone trying to figure out the map — I'd love to hear what you're navigating.

Start a conversation →

Stay in the conversation

New posts, workshop announcements, and things worth thinking about.

No noise. Unsubscribe any time.

Send me a note to sign up →