
[Conversations] ‘Does this đŸ«  emoji mean flirty or embarrassed? 😭’ and other questions to help make computer science more inclusive

Portrait of Katharina Reinecke.

Transcript

Monika Sengul-Jones

You’re a computer scientist leading the LabintheWild virtual lab, which runs quizzes to help make computer science more inclusive. What led you to this work? 

Katharina Reinecke

For my master’s thesis in Rwanda, I built software for agriculture advisors that I thought was intuitive, but everyone hated it. The colors, the design. People are different, right? And differences influence what people find usable, what they find appealing—what they like. 

That experience taught me that design isn’t one-size-fits-all. It’s important to develop technology that is inclusive. We can’t just assume what we, or let’s say, a bunch of Silicon Valley developers, decide on is intuitive to everyone.

Monika Sengul-Jones

I love this story, and of course, that makes sense. How did the experience change your work in computer science?

Katharina Reinecke

Initially, I thought, OK, I know people need different user interfaces. So, my PhD thesis was on designing culturally adaptive user interfaces (UIs).

Monika Sengul-Jones

How did you figure out what different people need so that the UI would be relevant?

Katharina Reinecke

That’s a great question—and honestly, one I’m still exploring. In my PhD, I tried to predict what people might like based on their cultural backgrounds. It kind of worked—people performed better when they liked the design—but the model wasn’t great because we lacked real data on preferences. So during my postdoc at Harvard, I built LabintheWild, a website with fun experiments like “Compare your visual preferences to others.” One study went viral and got 40,000 participants from around 180 countries. This input gave me real insight into what people find visually appealing. Now, LabintheWild can use that data to help designers customize websites for different audiences.

Monika Sengul-Jones

Very cool. I tried the tests on LabintheWild—and I recommend this to everyone, because of the kind of partnership you have with your research subjects. They get to learn a little bit about themselves, too.

Screengrab of LabintheWild‘s homepage, which offers a range of quizzes that provide insight into user preferences and cultural differences, according to Reinecke.

The questions remind me of the internet in its earlier days, when BuzzFeed was famous for its self-help quizzes, right? Like, what's your spirit animal, or what kind of partygoer are you? It's fun; there's an emotional satisfaction. It seems like your work is fostering a partnership with your research subjects.

Katharina Reinecke

Yeah, I think people find these quizzes satisfying. We’re trying to be more scientific than BuzzFeed; we’re not predicting your spirit animal [laughter], but we do try and make the results pages informative.

Monika Sengul-Jones

Totally. Who participates? Do you recruit?

Katharina Reinecke

Over the years, we’ve found people come from all over the internet. A newspaper will report on the lab. At one point, we got a lot of participants from bodybuilder.com. Reddit and Bored Panda have directed people to us. We actually did some studies looking into why people come and what motivates them.

It’s diverse, just as people are. Some come because they just want to have fun and, you know, test themselves. Some want to help science, especially the older population.

Some want to compare themselves to others. That is in line with some psychological theory; people love social comparison. Comparison helps us understand ourselves. 

Computer scientist Katharina Reinecke believes non-inclusive user interfaces cause harm. Her new book Digital Culture Shock (Princeton University Press, to be published in Aug. 2025) helps engineers apply her research findings to design more inclusively. Credit: Russell Hugo

Monika Sengul-Jones

How do your research findings make computer science technologies more inclusive?

Katharina Reinecke

Really good question.

We often build software that others can use to reach particular groups, taking their differences into account. We aim to make our data and findings publicly available on the LabintheWild website. Like most academics, we also write, give talks and conference presentations, work with students, and consult with companies and organizations.

Monika Sengul-Jones

What happens when technologies aren’t inclusive?

Katharina Reinecke

People are left behind. Some people won’t be able to use the software at all. One of my students studies dyslexia; if you don’t design a user interface with that group in mind, they might not be able to read it. People who use screen readers are another example. Exclusion has real-world consequences.

Bias in a dataset can affect who gets a loan, who gets a job, and who receives medical treatment. When we don’t verify that our models are trained on the full range of human variability, the consequences of falling short can vary widely in severity.


Monika Sengul-Jones

I’m struck by what you said about people being left behind, or simply not trusting the user interface. What happens when we still must use systems that don’t work for us? 

Katharina Reinecke

I’ve thought about this question for a long time. We use websites constantly throughout our lives; in a single day, we might visit a hundred different websites. This adds up. If the software systems you use don’t feel intuitive, or they make you feel like an outsider, and you still use them, then either you change yourself to adhere to the values presented in those systems, or you’re slowed down, experiencing friction like repetitive paper cuts.

Monika Sengul-Jones

You have a new book coming out. Tell us more?

Katharina Reinecke

The book is about digital culture shock: technology systems that are designed in a Western context and transplanted to other parts of the world. Often, technologists release a software system and assume the need is ubiquitous, that it should look the same everywhere. The book is for software developers and the general public to learn more about how to design technology differently.

Katharina Reinecke’s new book is about the mismatch of design and cultural differences, and how software engineers can do better.

Monika Sengul-Jones

Thank you for this conversation and for your work with LabintheWild! I am heading over to take a quiz now to see if my understanding of emojis is shared by others. I mean, for some time now, I’ve used the melting face emoji đŸ«  to signal overwhelm. Then someone told me it’s flirtatious embarrassment! Oops. Given I might be an outlier, I want to contribute my important insights on meaning to the research program—🙃 [laughter]. Do you want to help, too? Here’s the link!

Learn more

Preorder Digital Culture Shock, published by Princeton University Press, and get it delivered to your mailbox this summer. Katharina Reinecke’s book officially hits the shelves on August 5, 2025.

Image Credit: Portrait of Katharina Reinecke (2024) by Russell Hugo of the Language Learning Center (LLC).
This transcript was edited for clarity by Monika Sengul-Jones.
In-kind support for this interview was provided by the University of Washington’s Language Learning Center and the UW Tech Policy Lab.