Research basics every designer should know

Leveraging design research fundamentals to improve product experiences

A digital illustration in a surrealist style. A hand holding a magnifying glass breaks through a tear in a royal blue cloud-filled sky. Surrounding it are spheres and cones and trailing ribbons alongside a Monarch butterfly, a speech bubble, a question mark, and an eye peering down through the lens of the magnifying glass.

Illustration by Patricia Doria

At the start of every user interview I conduct, before I even ask participants to tell me about their days and how photography fits into them, I tell them who I am and what my role is on the team. I can’t think of a better way to sum up why I love what I do and why research is so integral to the design process here at Adobe:

I’m Lirra, a researcher focusing mostly on Adobe Lightroom. My job is to ask real customers, like you, questions about how they work and what they think about ideas we have in development. It’s important to know that I’m not a designer, engineer, or product manager—you don’t have to worry about my feelings; I’m here for your honest and open feedback. It’s only through conversations like this that we have any hope of achieving our goal of making Lightroom as good as it can possibly be. Thank you so much for your time; we could not do what we do or make what we make without you.

To build better products for our customers, we must first understand (through research) who we’re creating for and how we can best meet, and exceed, their needs and expectations. When research is conducted in an interesting and creative way, our product design work becomes exponentially better.

In addition to my day job helping the team bring new features to life, I also teach a course on research methods to design and marketing students. Although few of my students will end up doing research for a living, one of the first things I teach them is that research is not just delivering a pile of information and moving on to the next project; it’s about helping teams understand what the information means and what they can do with it.

My course is centered around tools and tricks my students can employ in their future work to make even scrappy, informal research efforts highly impactful. For designers who don’t have ten weeks to spend learning the basics, three focus areas—the building blocks of research design, the art and science of asking questions, and uncovering and presenting insights—will start them down the path toward research success.

The building blocks of research design

The first step in any research project is understanding your overall goals—research objectives. A typical UI study may have an objective like “explore the overall usability of Feature X, including aspects of delight and friction.” Whatever your objectives are, crystallizing them will light the path for everything you’ll need to decide later, including what you need to learn (key questions), who you need to collect data from (target audience), and how you’re going to conduct your research (methodology).

A digital illustration in a surrealist style of a Monarch butterfly being illustrated pixel-by-pixel. The right side of the butterfly is intact against a royal blue cloud-filled sky. The left side is pixelated against a black background and next to it, a hand is reaching into the illustration to build it with glowing colored pixels in shades of orange, black, green, red, and yellow.

Determining how specialized your research objectives are will help determine which branch of research to employ. As questions and audiences become broader, so, too, can your information sources. In industry, we tend to encounter four main types of research:

We have many options and sometimes more than one “right way” to do research. To determine the best path forward, think of it like a decision tree, with the first question being, “How unique are my objectives?” followed by, “How much time, money, and expertise do I have at my disposal?” Each of these factors can point you in the right direction for accomplishing your goals within a set of constraints.

Let’s say I’m trying to answer a question that’s on the niche end, like “What items do Lightroom Mobile users want to remove from their family portraits?” Since there might be some outside sources (photo retouching trends, blogs, forums, tutorials, etc.) relevant to the category or types of behaviors I’m interested in, I’d want to conduct a combination of primary and secondary research. Starting with secondary research, which is less specialized, helps get the general questions out of the way so you can focus on primary research (more on that soon).

Research is not just delivering a pile of information and moving on to the next project; it’s about helping teams understand what the information means and what they can do with it.

With so much of the world’s information at our fingertips, the sheer volume of what’s available can make secondary research overwhelming. Where should you start? The world around us, the things we’re readily seeing and passively consuming daily—the category and cultural trawl, if you will. There’s a strategy for organizing secondary research often employed by marketers called The 4 Cs: Consumer, Category, Culture, and Client. Using this framework helps bucket pieces of information as you come across them, making it easier to recognize themes and larger trends. I also recommend always starting with a short list of favorite sources—places with trustworthy, fact-checked information, as well as nuanced explanations of cultural phenomena. (Yes, I looked up “brat summer” in the New York Times.)
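
If it helps to picture the bucketing, here is a minimal sketch of filing secondary-research notes under the 4 Cs; the sources and takeaways are hypothetical placeholders, not real research notes or any tool we actually use:

```python
# A minimal sketch (hypothetical sources and notes) of bucketing
# secondary-research findings under the 4 Cs as you come across them.
from collections import defaultdict

BUCKETS = {"Consumer", "Category", "Culture", "Client"}
notes = defaultdict(list)

def add_note(bucket: str, source: str, takeaway: str) -> None:
    """File one piece of secondary research under a single C."""
    if bucket not in BUCKETS:
        raise ValueError(f"Unknown bucket: {bucket}")
    notes[bucket].append({"source": source, "takeaway": takeaway})

add_note("Culture", "newspaper explainer", "context on a trending photo aesthetic")
add_note("Category", "retouching tutorial roundup", "object removal is a frequent request")

# Reviewing counts per bucket makes thin spots and emerging themes easier to see.
for bucket, items in sorted(notes.items()):
    print(f"{bucket}: {len(items)} note(s)")
```

However you store them, the point of the framework is the same: every new piece of information gets a home, so themes surface as the buckets fill.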

Let’s say I’ve done my secondary scour, but now I need to think about what my primary research is going to look like. Primary research has two big arms: qualitative and quantitative (or qual and quant, as most of us say):

Qualitative research is exploratory in nature and best employed when you’re trying to discover something you didn’t know before (“How do new users approach editing their first image in Lightroom?”). It centers on gathering data through “conversations,” which can mean anything from a live interview, to written or video diaries, to simple surveys (yes, surveys can be qualitative!), with many things in between. The sample sizes tend to be smaller and more time is spent with each participant, gathering deep and rich inputs, but you don’t always know how prevalent these behaviors or opinions are.

Quantitative research, on the other hand, is about measuring the scale of something you know is happening (like the frequency of the top five friction points for new users). You gather data en masse through surveys and product analytics, where you have hundreds, even thousands, of participants. With quant, we can make what are called statistical inferences to help us better understand a group of people by analyzing the data of a subset of that population (e.g., looking at the data from 1,000 people to apply that thinking to a group of 100,000). But sometimes your understanding of why those data spikes or trends are happening is limited.
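
To make the idea of statistical inference a little more concrete, here is a minimal sketch of estimating a margin of error for a sample proportion; the 1,000-person survey and its 32 percent friction rate are hypothetical numbers for illustration, not figures from an actual Lightroom study:

```python
import math

def margin_of_error(p_hat: float, n: int, z: float = 1.96) -> float:
    """Approximate margin of error for a sample proportion at ~95% confidence."""
    return z * math.sqrt(p_hat * (1 - p_hat) / n)

# Hypothetical survey: 320 of 1,000 respondents report a given friction point.
p_hat = 320 / 1000
moe = margin_of_error(p_hat, n=1000)
print(f"Estimated rate: {p_hat:.1%} +/- {moe:.1%}")
# Inference: roughly 29-35% of the larger population of 100,000 users
# likely encounters the same friction point.
```

Because the margin shrinks roughly with the square root of the sample size, a well-drawn sample of 1,000 can speak, within stated bounds, for a much larger population.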

Some of the richest insight comes from the combination of qual and quant data, so you have a deep understanding of the reasons behind repeated trends and behaviors. I’m a qualitative specialist—my skill set is not in running detailed statistical analysis. Although I work regularly with this kind of data, my heart (and focus) is on getting to know small numbers of our customers and talking to them deeply about certain design stimuli or topics through a few common methodologies:

Common qualitative studies

Moderated

Unmoderated

Common quantitative studies

Each methodology has superpowers and trade-offs; determining which to employ on a particular project often comes down to whether you need self-reported or observational information and how many data points you and your team need to feel confident in your decision-making. Once you determine the best methodology or combination of methodologies for your project, you can determine who you’re going to talk to (your target audience) and begin defining them by certain demographic (who they are), psychographic (what they believe), or behavioral (what they do) factors.

Then comes one of the most fun and creative parts of conducting research: talking to humans.

The art and science of asking questions

At the very start of my career, one of my mentors told me: “I don’t expect you to have all the answers, but I want you to be asking the right questions.” It was extremely liberating to hear so early on in my professional development, and it’s something that has rung true in a variety of contexts, none more so than research. At the risk of stating the obvious, if you’re not asking the right questions, your research will be doomed to the irrelevant pile of unread emails and too-long decks (the deliverable equivalent of “remind me tomorrow”).

A digital illustration in a surrealist style. A finger extends from the top edge of the illustration to push a three-dimensional gold button with a large question mark on it. Surrounding it, on a charcoal grey-to-purple gradient background, are spheres, trailing ribbons, and a watchful eye.

It’s worth remembering here that research is so much more than a conversation; as my colleague Sharma Hendel explains in an article on the subject, asking direct questions doesn’t always result in accurate answers. Whether you’re conducting an interview or focus group or writing a detailed survey, having natural curiosity about the topic at hand certainly gives you a leg up, but remembering a few simple guidelines can help you along the way.

Guideline 1: Get comfortable with the basics

One of the most common mistakes my students make when first crafting questions is misusing the basic question types:

Most qualitative interviews should largely employ open-ended questions. The conversation will be quite dull and lacking in texture if every single question elicits a yes or no answer. Most quantitative surveys should emphasize close-ended questions. My students will sometimes rely too heavily on open-ended questions in surveys, making them difficult and time-consuming to analyze and even more difficult for respondents to answer (depending on the length and purpose of your survey, as well as your sample size, two or three open ends should suffice).

Leading questions are quicksand that many non-researchers accidentally fall into, and one of the biggest areas to watch out for. As humans, we carry biases that can be ingrained, implicit, and difficult to extricate from our natural speaking patterns. For example, when conducting UX research, you must be very aware of how you frame even the simplest of requests. Something as seemingly innocuous as “How easy was it for you to complete this task?” is a leading question. Instead, try the open-ended “Describe your experience completing this task” or a close-ended “On a scale of 1–5, where 1 is very easy and 5 is very difficult, how would you rate your experience completing this task?”

Guideline 2: Start with an end in mind

Remember those research objectives that helped you define your methodology? Start there. Every question you ask should have a reason for being—a direct tie to one of your core objectives. Think about each objective, break it into components, and write down questions that capture each angle or way in. Don’t worry about being too thorough and having too many questions at the start; it’s much easier to edit down once you’ve got all your objectives covered.

Speaking of editing, often when we work with cross-functional teams, stakeholders get so excited to have feedback that they come to the table with too many questions across a variety of detailed topics. If left unchecked, discussion guides (test plans, questionnaires, etc.) will bloat, respondents will fatigue, and findings will muddy. Referencing objectives to whittle down the team’s disparate queries to a streamlined, reasonable set of core questions will make your analysis easier and your recommendations even more impactful.

Guideline 3: Broad to narrow

When you’re meeting someone for the first time, do you start with specific, personal questions, like “What is your biggest regret?” or do you start with something easy and neutral, like “Where are you from?” or “What do you do for work?” (there are some cultural differences here, but that’s a topic for a different article). I imagine most of us would start with something neutral, easy, and broad. The same goes for research.

I start almost every Lightroom interview with “Please share a little about how you spend your days and how photography fits into your life.” Sure, I could ask people what their favorite food is, but I work on a photography app, so that’s not relevant to the task at hand. Broad but pertinent starting points also provide critical context for our specific topics as we move through the interview. The same goes for unmoderated and written tasks: before asking someone for their hot takes on the last season of Ted Lasso, you’d want to find out about their TV viewing habits and whether they even watched the series.

At the risk of stating the obvious, if you’re not asking the right questions, your research will be doomed to the irrelevant pile of unread emails and too-long decks (the deliverable equivalent of “remind me tomorrow”).

Guideline 4: Add just a little whimsy

At the start of this article, I mentioned that research conducted in an interesting and creative way makes final outputs even more innovative. Creativity absolutely plays a role at each stage of research design and synthesis, but it’s perhaps most evident when framing and writing questions. There are so many techniques you can use to elicit engaging answers, but some of my favorites are:

There are myriad possibilities within these techniques and the richness of feedback will more than make up for the extra time it takes to think of new ways to construct and frame these types of questions. However, be mindful that these are “special” questions and should be employed strategically and sparingly to maintain the integrity of responses. Creative questions require creative answers, so if you’re exhausted from thinking through how to ask all these questions, just think of how fatigued your participants will be when answering them!

Guideline 5: Last, but most important... Don’t forget to be a human

You are a human talking to and learning from other humans. Write surveys that your grandmother could understand. Don’t try to cram 90 minutes of questions into a 60-minute interview. Don’t use so much jargon that only an expert could unpack what you’re really asking. One of the best and easiest ways to keep yourself in check is to answer your own questions. If you can’t do it reasonably and thoughtfully, then you certainly have no business forcing someone else to try.

Good research is about uncovering (and presenting) good insights

Uncovering insights is, simultaneously, the hardest and most important work researchers do. In my classes, I introduce insights early. Then, week after week, I reinforce them to give my students maximum opportunities to practice and hone the muscles of uncovering and crafting the things we find to be true but that no one has yet managed to articulate.

A digital illustration in a surrealist style. Two sets of hands hold three of the four ends of clear plastic tubing (with spheres of red, teal, and yellow floating inside it) tied in a square knot around a blue metal pole. A single hand extends from the left side of the illustration to hold only one end of the tubing, seemingly untying the knot, while a pair of hands extending from the right side of the illustration are not only holding each end of the tubing but have twisted it to make it more difficult to untie.

The definition of an insight is almost as nebulous as the art of identifying one. Ask three researchers and you will get three different answers (basically what amounts to shades of the same color separated by a few color swatches): One colleague says an insight is “seeing something that everyone sees, but no one has thought.” Another says you should think of it “like the punchline of a joke—it resonates in a way that feels intuitive yet unfamiliar.” For me, an insight is a truth that unlocks and articulates the reasons behind specific behaviors, often taking multiple facts, observations, and data points into account.

As with many things, seeing is believing when it comes to insights. But they are often confused with data (facts and statistics, usually but not always quantitative in nature), findings (direct results without a point of view), and observations (the intangibles of interacting with humans, the things they do and how they do them)—all of which have an important place in the research and reporting process, but none of which are truly insights. Some examples that illustrate how data, findings, observations, and insights differ:

There is no secret recipe or template for uncovering great insights, but one of the simplest tactics for exposing them is to keep asking yourself “Why?” and not stop until you’ve found something new, but still true (a logical leap is fine, great even, but flat-out fabrication won’t help you or your team).

In my course, I also talk to students about the importance of using verbal and visual storytelling to bring their insights to life. It’s one thing to diligently design and conduct studies, ask the right questions, and augment takeaways with keen observations, but your impact will be severely diminished if the eyes of even your most invested stakeholders glaze over as you’re presenting hundreds of slides filled with tables, percentages, and tiny customer quotes.

For me, an insight is a truth that unlocks and articulates the reasons behind specific behaviors, often taking multiple facts, observations, and data points into account.

The Lightroom team will tell you that I’m constantly thinking of new ways to both involve them in the research process and share my findings along the way: I send Slack updates with top takeaways from individual interviews so team members who couldn’t listen live can keep up with the study as it’s progressing. I also spend time thinking about seasonal color palettes for my note-taking boards, so even my notes and “behind the scenes” efforts are visually pleasing.

When it comes to the final synthesis and share-out of insights, takeaways, and data, presentations (slide decks) reign, though I also experiment with infographics and shortened formats that can facilitate conversation and interaction. But my main goal, when sharing my work with the Lightroom team, is to explain complex findings in the simplest ways possible—by minimizing words and maximizing visuals. To get your point across, sometimes a big bold sentence, in white type on a black slide, is all you need. Other times, a simple graphic best explains a person’s emotional journey. And certainly, a representative video clip of a customer’s struggles with a UI can say it better than any of us.

Good research is about interesting insights, mindful methodologies, and creative curiosity (apparently even annoying alliteration). It’s also true that research can help teams drive engagement, create new revenue streams, embody a user-first mindset, and make confident decisions. But at the end of the day, the best research comes from caring about the best research. By merely showing up to class for ten weeks straight, my students will be stronger, smarter leaders (within their project teams and beyond). Set yourself up for success by simply stopping to ask yourself and your team whether it’s the right time for research, what you’re hoping to learn, and how it will help you make decisions.

Special thanks to Sean Vidal, who was my collaborator on a Research Basics presentation given to product marketing manager interns this summer during which some of the above concepts and guidelines were crystallized.
