Scaling UX Research


Continuous Discovery
UXDX Europe 2020

How to deliver research and scale research capability across a large agile organization of hundreds of product and feature teams.
In this talk, Jennifer and Kelsey from Fidelity Investments will focus on three key components:

  • Research framework
  • Organisational design
  • Research democratisation
Jennifer Cardello

Jennifer Cardello, VP and Head of UX Research & Insights, Fidelity Investments

Kelsey Kingman

Kelsey Kingman, Principal User Researcher, Fidelity Investments

**Kelsey Kingman:** I am Kelsey Kingman, and I am a Principal User Researcher at Fidelity Investments. Today, I'm going to be talking to you about the democratization of research. And my colleague, Jen Cardello, is going to be talking to you about a lot more.
**Jen Cardello:** Hi, I'm Jen Cardello. I'm the Head of UX Research at Fidelity Investments. And we're going to walk you through some of the things that we've done over the past couple of years to transform UXR at Fidelity in terms of scaling.
So, first things first, Kelsey and I are two members of a 35-person research team at Fidelity. We are stretched across four very large business units. And so, we're just two voices, but many people have contributed to the initiatives that we're going to talk about today.
And some context around the playbook that we're putting in front of you. First of all, Fidelity is a company that's been going through an Agile transformation since 2017. It's a rolling transformation that's been moving through multiple business units, testing and learning as it goes and improving our understanding of what Agile is, what it means to be Agile, and how to conduct research. The second thing that's important here is that there are hundreds of scrum teams, both feature teams and product teams, that we support as designers and researchers. The next thing is that we have a very large and diverse product landscape. We serve many different types of customers, from plan sponsors within companies, to employees with 401(k)s, to professional and personal investing clients, to advisors who sit on our platform. So, we have many different types of things that we build for many different types of users. And the last thing, the big number, is that we have a one-to-15 ratio of UX researchers to scrum teams, and that's a big reason why we have had to focus on scaling over the past two years.
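As a sanity check on those numbers, a one-to-15 ratio with 35 researchers implies coverage on the order of 500-plus scrum teams, consistent with the "hundreds" mentioned above. A minimal sketch of that arithmetic (the exact team count is an inference from the ratio, not a figure stated in the talk):

```python
# Coverage arithmetic implied by the talk's numbers (illustrative only).
# The exact number of scrum teams is not stated in the talk; it is
# inferred here from the 1:15 researcher-to-team ratio and the
# 35-person research team.
researchers = 35
teams_per_researcher = 15

implied_teams = researchers * teams_per_researcher
print(implied_teams)  # 525 -- "hundreds of scrum teams", as described
```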
Fidelity was founded in 1946; it's a 74-year-old company. The reason I'm telling you this is because it's really important to the mindset of all the associates who work at Fidelity. They care a lot about the customer experience. It's really on everyone's mind. In fact, it's part of our core competencies that everyone at Fidelity is exposed to user experience and understands its underlying principles. That's how we've been able to stay in business for 74 years and create so many great experiences for our customers. So, there's a lot of pressure on our teams to create great digital experiences as well, and to think about those end to end and cross-channel.
I wanted to talk to you for a little bit about what it's like to come into a company like this, a 50,000-person company with hundreds of scrum teams, and try to make heads and tails of what's happening with UX research. So, this is the first 90 days in a really neat summary of what I did to make this work. The first 30 days was a listening tour. My manager and my stakeholders put together a list of people for me to go and meet and listen to very carefully on what it was they were being challenged to do and how they thought we could partner better with them. The other important piece of this was also spending a lot of time with the UX research team to understand their perspective on what they had been doing and how they thought they needed to transform given the Agile transformation that was happening. Next was articulating the organization's needs: what does Fidelity need now that we're moving through a different stage of design, discovery, and development? Then I wrote a plan. So, understanding all the things that I knew now, getting it down into a memo, and presenting that to my management team and to my stakeholders to get buy-in. Once that was socialized, the next 30 days was all about iterating that plan and then finding a way to get experiments in the water and start executing on this. One of the programs Kelsey's going to talk about, democratization, is one of those things we started almost immediately after these 90 days. Getting it in the water, figuring out how to test and learn, and so this is an ongoing exploration. It's not just those first 90 days that count; it's all the things that happen after it. So, test, learn, iterate, and we're still doing that two years later.
When we think about a memo that gets put out to your stakeholders, this is the basic, simple structure. There's an assessment of where we are and how we work, going through an analysis of the research studies that we could get access to. Now, Fidelity's UX research group has been around for well over 20 years, so what Kelsey and I looked at was research studies over the past six months in 2018 to see where they were sitting in our framework, and I'll show that to you later. We did a product organization needs assessment and talked about the capabilities of the team: what talents do we currently have, and what skills do we need to grow? The plan laid out a strategic intent, a North Star for where we can get to in three to five years, complete with a vision, a mission, and OKRs to define how we would know we're being successful and getting there. We proposed some key initiatives that we would go after, an org structure that I'll talk about, and team span and coverage: how many groups, how many domains and tribes will people be able to take care of? And then, very importantly, the asks that we have of the organization. First of all, do you approve this plan? Second, we need headcount to make this happen; can we get those people? Third, can we get recruiting support to get the very best people on our team? And then, who can we count on as our allies as we move through this transformation?
So, the listening tour was very interesting. It's a great time. The first 30 days at a company can be super stressful, but if you think about just opening your ears and being a really good researcher, you learn a whole bunch of very interesting things. So, I went in there basically understanding what was happening. Okay: researchers conduct studies for teams, these hundreds and hundreds of squads, as we call them, and we conduct mostly usability testing, which is what Kelsey and I realized through our analysis. And we're trying to do studies faster. But what I wanted to learn from listening was why. Why is this happening? A lot of these things seem really obvious, but it's really important to get at the root cause. So, researchers conduct studies for teams. Why do we do that? Really, why we do it is because teams are being told: we are Fidelity, we co-create. We involve our customers in creating great products, services, and experiences for them, but the teams didn't have access to our customers. They're seeking customer connection, and they're doing it through us as researchers. So, it's not necessarily that they're looking for studies. What they're looking for is building connection and building empathy. We conduct mostly usability testing. Why? Because teams are moving really fast and falling in love with the solutions they're creating. They're asking for research to validate those solutions and designs versus utilizing research to aid in discovery. A lot of the reason for that is there's only 35 of us, so we're not embedded on every project. We're not there naturally throughout the process. They come to us when they think they're supposed to, which is usually too late. We're trying to do studies faster. Why? Because that's Agile, right? You do things really fast. But what the teams really need is to learn fast, not to get studies from us fast. So, how can we open up our minds and use divergent thinking to come up with other solutions that increase learning velocity, besides researchers simply trying to condense study timelines from weeks down to days?
So, instead of defining UX research by what we do which is conduct usability studies, we turned it around and we tried to define UXR by the impact we can have. We can ensure useful and usable experiences. So, we're not getting rid of usability. It's still a core and very important piece of what we do and the value we deliver. But by starting earlier in the design and process, we can actually help teams define what's useful and what isn't and the reason why that's so important is because they're making huge investments in design and engineering. And if they're not building things that people need that actually respond to unmet needs in the marketplace then they're wasting their time. We can make great things usable but it's not great if they're not useful, if no one needed them in the first place. The second thing that we can have impact on is accelerating the learning velocity of the teams, helping them iterate faster and get to better value faster. And the third thing is connecting product teams to customers, get them that sense of knowing these humans that they're actually building things for a sense of empathy and understanding of root causes.
So, this is the playbook that we've put in place. This started in 2018. There are many initiatives that we have put together; this is just the three things that we had to do, and there will be many follow-on presentations talking about other efforts. So, the first thing we did was design our org to meet the needs of the larger organization. The second thing was establishing a framework to work within, which would scale our capability because we were all speaking the same language and sharing a vocabulary with the organization. The next thing is building capability across and outside of ourselves. So, the first area is designing the org: how might we design and position the team to deliver the most impact? I like to think about this in terms of three A's of org design. First is altitude: how high do you fly? Where are you sitting? What types of projects are you taking? What types of initiatives are you attaching yourself to? Aperture is how big or how narrow the focus is, the question that you're answering. Are you just validating a feature, or are you actually gaining understanding of the root causes underneath unmet needs? And the third is arrangement: how we arrange ourselves in teams to actually address the needs of the organization. So, with altitude, we like to use this diagram because it explains things using the terms the organization uses to describe the groupings. We have squads, which are scrum teams; a group of squads lives in a tribe; a group of tribes lives in a domain; a group of domains lives in a business unit. When you think about the thinking that's happening at a business unit level, by leaders at that level, it's really about 30,000 feet. And you go all the way down to the squads, who are owning APIs, systems, services, end-to-end experiences, slivers of experiences, and we call that street view. It's super valuable and super detailed, but it's often very, very focused on specific feature functionality.
So, what we realized is that with only 35 of us, we have to fly at the domain and tribe level. The next area is aperture. So, how do we set our focus, and how wide or narrow should our studies be? Again, from 30,000 feet down to street view, we use Agile language to talk about initiatives, sub-initiatives, epics, and user stories. This is also where we needed to move ourselves to focus on several very large, very mission-critical initiatives and sub-initiatives versus chasing after user stories and doing very, very tiny studies that could be completed in a two-week sprint. The third thing is the arrangement. In our team composition, we're diverse in background, education, and experience, which aids us in developing great methods, sharing perspectives, and having healthy debate around what we're understanding from the things that we're seeing and hearing. We have many people on the team who have backgrounds in psychology. We also have design strategists, engineers, human factors specialists, of course, and people who are educated in sociology. I myself am an architect. I may be the only architect on the team, but there are lots of very interesting backgrounds. We were all operating independently. We were kind of like lone wolves out in this very large organization, working with the people who knew what we could do for them and what value we could deliver, but that can get lonely, first of all. And it leads to a lot of disconnected work, where you may have a high volume of work ("Oh, we did 350 usability studies last year") but can't tell a cohesive story or build a strong narrative around impact.
So, what we did was form what we call pods. Each has a leader and two or three researchers, all working together against particular initiatives within a business unit and often aligned to a domain. So, this is what our org looks like now. We have, of course, a very important research ops team, and then for each business unit we have a series of pods that sit in there and focus on core initiatives. The other important piece here is that we are not alone. Our research and insights team is actually much broader than just UX research. We have strategic market research, behavioral economics, brand and advertising research, and customer loyalty as our peers, and we partner with them on many projects, which lends a lot of credibility to our practice, allows us to go deep on qualitative work while partnering for the quantitative work, and allows us to mix disciplines as we go after big initiatives.
So, some watch-outs here when you're designing an org. Don't start executing without getting feedback on your plan. You want to make sure you have partners, stakeholders, and team leaders reviewing the plan, providing candid feedback, and poking at it as much as possible. It's a really important process to go through. The next thing with a plan is: don't set it and forget it. Things are going to change, and you're going to learn more. So, do evaluate the progress and pivot as you need. It's just like being a product manager. So, that's designing the org.
Next, I'm going to talk about the framework, which was important for us to establish a shared vocabulary. So, how might we help product teams and researchers talk about discovery and design and identify when and how UX research can help? We start with a basic framework of "build the right thing" and "build the thing right." On top of that, we overlaid three basic phases. First is "right problem": let's quantify and truly understand what the unmet needs of our users are. It's a heck of a lot easier to get a team aligned and moving with great momentum and purpose if they all really understand the problem and fall in love with that problem. So, we're capturing and quantifying customer problems, and we're currently building a database of user needs that's going to help teams do this at scale. The next thing is moving into a stage we call "right solution," which is where we practice the diverge-and-converge ideation pattern. We go broad to come up with many ideas, use resonance testing to narrow those down, and then do it again and again until we get to a place where we have really crisp, very resonant ideas or concepts that users want, that they find useful. Once we're at that stage, we can move into "done right." We can start doing really detailed design work, start prototyping, and make sure we're measuring and then optimizing the experience once it's in market. What we did with that framework is add some qualitative and quantitative methods throughout. This obviously is not the full catalog of everything a researcher can do; we didn't include things like card sorts and tree tests and every type of testing. What we did was create a very skeletal set of basics to get you through the process of product discovery. We also added in UX auditing to make sure that we were calling out where we can help with existing experiences and find problems within those that are worth solving.
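To make the shared vocabulary concrete, the three phases can be sketched as a simple phase-to-methods mapping. This is only an illustration using phases and methods named in the talk; Fidelity's actual catalog and its exact method-to-phase assignments are not fully specified here:

```python
# Illustrative sketch of the three-phase discovery framework described
# above. Only activities mentioned in the talk are included; the real
# catalog (card sorts, tree tests, etc.) is deliberately much larger.
FRAMEWORK = {
    "right problem": ["capturing and quantifying customer problems"],
    "right solution": ["diverge-and-converge ideation", "resonance testing"],
    "done right": ["detailed design and prototyping", "usability testing",
                   "in-market measurement and optimization", "UX auditing"],
}

def methods_for(phase: str) -> list[str]:
    """Return the baseline methods for a given phase of product discovery."""
    return FRAMEWORK.get(phase.strip().lower(), [])

print(methods_for("Done Right"))
```

A lookup like this is one way a team could keep the shared vocabulary and its associated methods in a single, queryable place.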
Some watch-outs when you're creating a framework. Don't underestimate the resistance to change. People really love the way they do things now. Oftentimes they're not looking for improvements on that; they think it's just going to add to the chaos. So, you need to build resilience in your team, and you also need to teach them how to have conversations where people are feeling discomfort with the ways we're talking about evaluating their concepts or moving them through the discovery process. The other thing is you need to continually have conversations with stakeholders and your allies, because being a change agent in an organization requires exceptional support, lots of cheerleading, and lots of pep talks; it's hard to play that role. And don't be quiet. Don't let the change-averse attitude quiet you. Last year alone, we went on 45 road shows. We were on a continuous road show, and we continue to do this, talking to teams about how we can partner with them and how we can help them create useful and usable experiences that drive value. So, that's designing the org and establishing a framework, and one of the most important things that we've done over the past couple of years to scale ourselves and scale capability is the next program that Kelsey is going to talk to you about.
**Kelsey Kingman:** Great. Thanks, Jen. So, the third part of this was: how might we accelerate product teams' learning velocity and shift the altitude of user research?
Now, as we've been talking about, we had one researcher for every 15 scrum teams, which was really affecting the teams' learning velocity. It was taking 14 days or more between teams asking for research and then receiving it. And a hundred percent of user research at Fidelity was conducted by the user research team. Additionally, the vast majority, 85%, of the studies our team was doing were at that street view we reviewed, at the squad level, versus 15% at the solution phase. We had a hypothesis that if we teach product team members how to conduct research, then we would see an overall increase in learning velocity. We went in to change that 14-day turnaround and decrease the amount of time it takes teams to obtain user feedback.
We decided to create the democratization of research program. Democratization is making something available to all people, and making it something all people can understand. There were three main components to this program. First, education: we have two training programs, on unmoderated usability and pre-market A/B, which we'll go over in just a little bit. The second element is enablement: giving people in this program access to the tools and panels, the participants they need in order to conduct research. And third, support: one-on-one guidance from a user researcher throughout the entire process.
So, education. We started with remote unmoderated usability, where you have about six participants per study, and the point of this method is to gauge reactions to a design and to identify pain points. Since we started this program, we have 144 graduates and 325 studies launched by graduates of this program. Additionally, we have pre-market A/B, which is a quantitative method where, rather than waiting until everything is done and in market, we help teams pick a winner between designs before the detailed design phase. So, we're helping teams decrease the amount of time it takes to decide between two designs. Then we have support. This is incredibly important, and a way of thinking about it is also quality assurance. Throughout the entire process, people in this program have one-on-one guidance from a researcher. We are looking at their study plans. We are making sure that the research questions they're trying to answer are appropriate for the method. We also have premade study templates, so people aren't starting from scratch; they can take the best principles and just move forward faster. Third, all studies are reviewed by user researchers before launch, so we're making sure that anything going in front of users is high quality. Additionally, one of the things to focus on here is that we're really building the capability and skills that people have. And the point of this is really empowering teams to do their own research and do it right.
In one and a half years of this program, we have been able to cut the turnaround for remote unmoderated studies from 14 days to five days, a 64% reduction in time and a corresponding increase in learning velocity. Second, for the advanced method, which normally takes between 20 and 60 days because you have to wait for everything to be built and launched in production and then wait for the sample size needed to calculate statistical significance, we've gone from 20-60 days to nine days, a 55% to 85% reduction. This is preliminary; we're still very young in this program and only have a few graduates, but it's looking promising so far. Additionally, in 2018, as we covered before, we were mostly at the street view level: 85% of our research was in that evaluative phase, and all the user research at Fidelity was done by one of our team members.
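The turnaround-time arithmetic behind those percentages can be checked with a quick sketch, using only the figures quoted in the talk:

```python
# Fractional reduction in study turnaround time, from the talk's figures.
def time_reduction(before_days: float, after_days: float) -> float:
    """Fraction of turnaround time saved going from before to after."""
    return (before_days - after_days) / before_days

# Remote unmoderated usability: 14 days down to 5 days -> ~64% reduction.
unmoderated = time_reduction(14, 5)

# Pre-market A/B: 20-60 days down to 9 days -> 55% to 85% reduction.
ab_fast = time_reduction(20, 9)
ab_slow = time_reduction(60, 9)

print(round(unmoderated, 2), round(ab_fast, 2), round(ab_slow, 2))
# 0.64 0.55 0.85
```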
In 2020, here we are: now 68% of our research is in that right problem/right solution phase, and we're really making our way up in altitude. Maybe we're not a hundred percent at 30,000 feet, but we're definitely increasing our altitude. Additionally, 38% of the "done right" evaluative research that we've done in 2020 has been run by one of the graduates of our program. So, squads are increasing their learning velocity and have absolutely participated in and embraced this program. Now, our watch-outs for democratization. First, don't forget to schedule time to implement feedback. People will provide that feedback because they're very excited about this program. So, do adapt and iterate. The pre-market A/B method we have is in direct response to people banging down our doors to do their own quantitative research. Second, do not forget to search for rogue studies. Some people are so excited to do research and get feedback from users that sometimes they try to skip a few steps. So, do set guidelines and procedures to ensure quality, because sometimes people want to move fast, and you need to make sure that every study has that rigor and that value, and that you're getting what you need out of the research. So, check who's naughty and who's nice, and follow up with them.
**Jen Cardello:** Awesome. Thank you so much, Kelsey. So, that was the democratization program, and we went through designing the org and establishing the framework as well.
So, here are a few key takeaways from us. First, listen to the org: figure out what they're trying to accomplish and what their challenges are, and figure out how the skill sets of your team, whether it's design, research, or engineering, can make those dreams come true and make those challenges more achievable. Second, focus on impact, not output. It's not enough to say we did 350 usability studies this year. We want to be able to tell a great story about the impact that the team is having, and that is a narrative. What it means, oftentimes, is aligning ourselves against an initiative and seeing it through from right problem to right solution to done right. Those are great stories, not just for individuals and for your portfolio, but also for an organization, about how it drives and delivers value for its customers. Third, create a shared vocabulary. It's hard to introduce new language into an organization, but rolling out the framework has had amazing advantages for us, and we hear third-party reports of people being in a meeting and hearing a leader talk about right problem, right solution, and done right. So, we know that, as annoying as we've been with all of our road shows, it has gotten through, and people like the language and have adopted it. Fourth, build capability beyond yourself. Being a multiplier is extremely valuable to an organization. Having people with that systems mindset is important, because they can actually see beyond the work they currently do and think, "Wow, if more people had this skill, think of all the great things we could achieve as an organization." So, don't hoard your talents; try to build skill in other groups. And the fifth is: be kind to yourself. Change is really hard, especially in large, established organizations.
So, you need to find good cheerleaders and allies to prop you up when it gets tough and one of the most important pieces of that puzzle is having a great team to surround yourself with. So, I want to thank you all for tuning in and listening to our story and Kelsey.
**Kelsey Kingman:** Thank you, everyone, for listening in and participating in this program.
**Jen Cardello:** We'd love to hear from you. Thanks.
