
Researching and Designing for Actionability

5 min read
Jordan Lum
LINCS UX Co-op


The UX team has been conducting user research on LINCS tools to get them ready to move from development to production. So far, we have completed card sorts, usability tests, surveys, and interviews. This research has provided us with a wealth of information. However, if we want to translate what we’ve learned into meaningful, productive changes to the tools’ designs, we must keep actionability at the centre of our research practice.

During my UX studies, I took a course on user research techniques. Going in, I thought that UX research methods were fairly straightforward: speak to your users, synthesize the data you gathered from these conversations into personas and journey maps, and voilà—you’ve completed your user research. Sadly, I was mistaken. In one of the course’s assignments, my classmates and I were asked to observe participants as they completed a shopping task. When we noted that they were struggling, our professor simply asked, “Ok, but how?” In our first attempts at user research, we only looked for the problem; we had not yet grasped that we needed to understand exactly how and why participants had trouble with their task. Without digging deeper to understand our users’ goals and behaviours, our findings were not actionable, meaning that solutions could not be derived from our research.


This lesson taught me the difference between research and good research. New designers often conduct research to tick it off a checklist and show that users have been consulted. Experienced designers conduct research to understand the user, and they turn that understanding into action. When user needs are not clear, the research does not serve the design, resulting in solutions that are not actually user-centred.

During my time at LINCS, I’ve found it essential to think deeply about actionability while working on one of its tools, ResearchSpace, as it has moved through the design process. Site-wide usability tests for ResearchSpace, led by Robin Bergart (UX librarian) and Evan Rees (UX designer), had just begun when I started my position. I had the privilege of facilitating a few of the tests, which were designed to determine how users understood LINCS ResearchSpace’s functionalities. Throughout the tests and as we subsequently analyzed our findings, I thought about actionability at both a low and a high level.

From a low-level perspective, I tried to turn each observation into an actionable piece of data. During the tests, I wrote down detailed observations to connect participants’ actions with the thought processes they described. As participants completed a task, I noted their specific actions, from the elements they clicked to the features they said stood out most to them. For example, we asked participants to search for an entity in ResearchSpace. Many participants found entities successfully by accessing the contents of a dataset; in doing so, they overlooked the search box in the navbar, which they also could have used. By observing both what users did and did not do, and by capturing how they described their actions in the moment, we uncovered rich data that allowed us to synthesize our results into meaningful patterns and themes.

Once we had these findings, we needed to find a way to extend actionability to others outside the UX team. For instance, we needed to share our findings with the developers we were collaborating with to generate meaningful and feasible improvements to the tools. In practice, this meant converting our messy virtual UX whiteboard, scattered with ideas on digital sticky notes, into a format that the broader LINCS team would be able to read and understand. To do this, we wrote a report that defined each finding by highlighting pain points and our related key observations. This process helped us get our ideas across, but in hindsight I think we could have been even more effective if, instead of just presenting our findings at the end of the synthesis process, we had invited a broader range of stakeholders to participate in the whiteboarding process. This strategy would have broadened the team’s understanding of the findings and encouraged more people to brainstorm solutions.

Even with some bumps along the way, our findings have uncovered pathways to solutions. For example, I designed a navigation improvement that gives users access to ResearchSpace features right from the home screen. User needs informed the layout: the research showed that users needed to be able to quickly find and use ResearchSpace’s myriad resources.


Thus, with actionability in mind, our user research directly fed into our design solutions and helped address real user pain points. It has been exciting to put into real-world practice the user research methods I had previously only learned in theory. I cannot wait to see what is in store as we continue to research and design!