The concept of continuous learning can be daunting to people with a lot of other stuff to do.
What do you mean we’re never done asking questions?
Won’t we get analysis paralysis?
Living with uncertainty is uncomfortable. Having a clear, unambiguous answer is very comfortable. The key is not to optimize for your comfort, but to rethink how research integrates with the rest of your product work or other business decisions. And when I say research, I don’t mean conducting formal studies or producing reports (although those are often a part of it). I mean taking a systematic, goal-directed approach to generating useful insights and learning.
Continuous learning is no different from continuous shipping. Big releases take longer and big questions do too. We’re just used to thinking of the endpoint of research as a report rather than a decision—an artifact rather than an action. If you adjust your perspective to think about the different types of questions and scope of various decisions, everything gets much easier. Every time you make a product decision (or an organizational decision or a strategic decision, etc.) you are placing a bet. You will never have complete certainty, but the more you learn the more confidence you will have. This is an important point. In the same way that the perfect is the enemy of the good, an illusion of attainable certainty blinds people to the level of confidence they need to move forward.
The stages of research
It can be helpful to think about research questions in terms of how far along you are in solving the problem. Are you defining the problem space itself and looking for ideas, or do you have a solution at hand and need to understand how well it’s working to solve a known problem or achieve the outcome you’re hoping for?
The four types or stages of research are:
Generative: What problem might we solve?
Descriptive: What is happening currently/happened historically?
Evaluative: How well is our solution working?
Causal: Why is [x] happening/Why did [x] happen?
Generative research is typically the earliest stage and includes the broadest questions. However, even when you are deep in solutions, it’s good to step back from time to time and ask yourself whether you are seeing the whole picture. You may be missing out on the best problem definition. And your ability to solve a problem depends on how well you define it.
Descriptive research helps you understand the real world better so your naive optimism doesn’t bite you. (It comes for us all.) Whatever you are designing or building has to fit into the real world of concepts and things and processes and infrastructure that already exists. I don’t care how innovative your work is, everything else around it is going to stay stubbornly unchanged, so you’d better understand what you’re getting into. Good descriptive research yields a detailed understanding of the characteristics of a population or phenomenon being studied, such as public high school teachers or first-time home buying. (How decisions get made in one’s own organization is a tremendous topic too often neglected.)
My favorite thought experiment is using Google Street View to check in on the neighborhood you grew up in. Then, enumerate the visible changes since you graduated high school. How much has remained basically the same? And why?
Below is a shot of glamorous Canoga Park, CA (hometown of yours truly and Bryan Cranston before me) circa 2019. Yup—that’s definitely Los Angeles exactly as Blade Runner envisioned. So future!
Juicero has come and gone. Cupid’s Hot Dogs abides.
Evaluative research is what so many technologists—and entrepreneurs in a hurry—run to. Test! Validate! Sure you can run some experiments to learn useful things about how well your solution is working. But if you’ve set up shop in the wrong problem space, it is all too easy to mistake the polish of a prototype for the quality of an idea. A lot of start-ups go down like this, still.
And finally, there is causal research. Something happened and you want to know why it happened. You can do some causal research any time you notice an event or a pattern and want to explain it. If you notice an effect in your analytics, you can turn to descriptive questions to figure out what precipitated it. You will not find causes in analytics unless something is actually functionally broken in your system. You can speculate about other causes, sure, but you won’t find them.
Integrating research into your ongoing work
There is no one right way to learn. There are a lot of wrong ways. The wrongest way to work is not to seek out new information at all. The second wrongest way is to pick an activity, like surveys or usability tests, and just do the activity in order to tick a box, rather than thinking about what you really need to know.
With a bit of planning, it is possible to fit any research question into or around the structure and cadence of your work, whether you use sprints, cycles, quarters, or years. I’ll call these work periods. The best relationship between learning and work period depends on the type of question.
In order to get the whole team thinking about this relationship in the same way, visualizations are helpful. The following model is just a suggestion. Feel free to take this as a starting point for defining how research works at your own company and modify it to meet your needs, capacity, and capabilities.
Strategic research — intended to help refine plans, priorities, or approach at a high level — could span one work period or multiple depending on the scope and breadth of inquiry. Strategic research might just answer one overarching question, such as “What is our core value proposition?” Or it might be a collection of smaller questions about internal workflow that merit in-depth consideration.
Research that guides strategy might be generative, descriptive, or causal. The output is a depth of understanding suitable to guide major decisions and yield durable insights that will continue to pay off into the future.
Tactical research projects are discrete studies that fit into a single work period, or perhaps span a couple if you are working in two-week sprints (because that is a ridiculously short amount of time). Or, sure, take a fraction if you honestly operate on a yearly plan. It may be a study that stands on its own, outside of any specific product work, yielding a report or a set of generally applicable findings.
The question or question set is of sufficiently high priority and relevance to have resources dedicated to it apart from other work. The study might include a variety of activities or mixed methods—such as a quantitative analysis followed by a set of ethnographic interviews. The resulting output could be insights that relate to multiple ongoing projects or recommendations for a new project that supports current priorities and approach.
Tactical research might inform product enhancements, marketing initiatives, support content, or a combination of these. The goal is to better solve a customer need and deliver business value within the existing strategic direction.
Targeted inquiry is conducted by the team working together on a project. The output of targeted research is better decisions within a work period — either learning more about customer context and needs or testing a potential solution.
Descriptive research to learn more about the customer or their context should happen towards the beginning of a work period. Evaluative research to check how well a proposed solution is working comes towards the end, but should still allow enough time to respond to the findings and make changes.
Effective targeted research requires a well-defined and nimble process. And it should be part of the planning for the given work period. While you are working on a new feature, for example, allow yourself time to make sure you understand the lay of the land, and to test what you’ve built.
The lower the overhead of identifying research questions, planning the study, and recruiting participants (if necessary) the more realistic it will be to accommodate interviews, competitive research, or usability testing within a development cycle. Develop good habits and document the steps.
Remember, not all research needs to be primary research. Given a clear question, it’s possible to look back at transcripts from past interviews, historical usage data, or the published studies from other organizations to generate some insights.
You can try timeboxing small research projects. Say for example “What can we learn about [x] by the end of the day?” We do this all the time in our daily lives when planning vacations or making major purchases. It’s the exact same process with a bit more rigor and collaboration.
Celebrate your successes. Whenever you learn something useful or interesting, think of ways to share it that are memorable and fun. There is no rule that says research findings need to be dry (this isn’t academia), but a lot of teams act as though there were. If main points don’t stick, it’s like the work never happened. So, put on a show. And note anything you did that made a particular research project effective. You need to learn about learning, too!
Always approach learning with intention and joy
Learning new things is one of the most satisfying and joyful things about being human. It is also an essential ingredient for making your work meaningful, interesting, and innovative. Unfortunately, the overwhelming drive to deliver—to be productive in some visible, predetermined way—often crowds out opportunities and makes learning seem like a luxury.
Research gets a negative reputation when it is perceived as something additional and idiosyncratic that doesn’t fit with the comfortable rhythms of “regular” work. That mismatch alone makes it seem extra effortful and sometimes even impossible.
Go with the flow and you can reclaim the space you need. Every organization has cycles, whether it’s the school year, the fundraising calendar, quarterly reporting, or a continuous series of iterative product development sprints. You can’t fight time, so work with it. When you think a bit ahead and map your questions onto your calendar, you’ll soon hit your stride of continuous learning.