Blitz Research: More Insights in Less Time
In this talk you’ll learn about the top 3 ways to get better insights faster:
- How to structure user interviews for maximum value
- How to conduct effective competitive research
- How to get fast usability testing results
So up next, we have Kevin Philpott, who will be talking about Blitz Research: more insights in less time. Kevin is the head of UX at Pi Insurance. So welcome, Kevin.

Hey, everyone. And Jim, that last presentation, I couldn't have planned it better. It was such a perfect setup for this, because what you did is talk through the macro process of how you can speed up and de-scale your experiments. And I'm going to go deep in the weeds now, as a practical UX person: how do I cut down some of this time when I don't have as much as I would sometimes like to do my research, right? So this is an absolutely ideal setup. Thank you for that.

So if you can just share your screen, I'll publish it now and we'll kick it off. Let's do this. All right, let's make sure you're getting that. Can you see that? Great, I can. So I'm going to leave you now, and I'll publish this. See you shortly. Love it.

Cool. Hey, folks. So today we're going to talk about Blitz Research, which is really just a fancy phrase for how you can do research faster, right? Which, of course, we always want to do everything faster. And I'm going to go through the five areas that I've found helpful in terms of cutting down a research timeline.
As a bit of background, my name is Kevin Philpott. I'm head of UX at Pi Insurance, and I've worked in design leadership at various other organizations like Geico Gnomes and IBM. I'm on a number of tech and UX councils, I'm regularly published on Forbes, and I'm on advisory groups at George Washington University and Richmond University. I'm not teaching at Berkeley; I did read a book that mentioned Berkeley once, but that's about as close as I get to Jim there, unfortunately. And you might have heard me doing a couple of talks in various different places, or maybe I've annoyed you with a post on LinkedIn that got shared with you, right? But let's get right into it: what are we actually going to learn? What are you going to get out of this, right?
I'm going to talk about having an alignment meeting when you're doing your research. We're going to talk about how you can look at existing data to speed up your research: do we have some information on this already? We're going to talk about unmoderated research methods. We're going to talk about spending less time designing. And I'm going to share a couple of what I would call project management power phrases, which sounds very fancy altogether as I read that out loud, but basically some things that you can say in meetings that can help move things along a little bit, right? Caveat: there's no silver bullet. I'm not going to stand up here today and say that if you have a project that actually requires two days' worth of research, you can cut that down to one hour. You probably cannot, right?
But this is a little bit of a discussion thinking about the context that we talked about earlier where you have all of these different actors in your organization, engineers, product, multiple projects and juggling different timelines, right? So sometimes you just don't have the exact amount of time that you would like to do your research, right? And some research is better than none. So sometimes you have to get creative. And these are some things that I've used myself over time that have helped me cut down the time that I spend on research, right? Sometimes when somebody asks you to do some research in an amount of time that is impossible, you will have to say no to that, right? There is not a one size fits all solution for this, of course, right? But with that caveat, let's go into our first item, which is having an alignment meeting.
Okay. So this might seem like more time, but I can tell you from personal experience that 30 minutes of an alignment meeting can save you 30 hours later on, or more, right? What are you doing in an alignment meeting? You're setting expectations with all the various stakeholders that will be interested in this piece of research: what are you trying to achieve, and how much time do you actually have to do it? That way you don't get a bunch of scope creep, those additional questions that come in at the end; you set the expectation up front that maybe we've only got two or three days to do this, right? You can even set the expectation that folks are not going to get a super nice, well-designed PowerPoint presentation with lots of videos spliced together, right? It can be a verbal readout if you need to do it faster. It can be just bullet points. So long as your group knows that's coming, that's often quite acceptable, right? And I have found you can waste a lot of time documenting your research instead of moving towards the outcomes for the business that you're actually trying to reach. You want to bring in a representative from each stakeholder group that will be viewing or is interested in your research. Part of that is just that it makes the research better, right?
From an integrative thinking perspective, what that really means is that you are, for example, bringing in sales and seeing what they really want to get out of this. You're bringing in service and seeing what they want to get out of it. You're thinking about your research from the customer's perspective and seeing what they want to get out of it, right? And when you go through that exercise, sometimes you find these really nice synergies, and you're really trying to maximize the value of the piece of research you're doing across all those different stakeholders. You're not going to be able to please everyone all of the time, but if you go through that exercise, you often get out in front of the "this person wanted something different from the research" conversation, instead of having it at the end when you're presenting the research, or halfway through, right? It's helpful to share the goal of the research. You can send it in writing afterwards, with a summary of the alignment meeting, to make sure that everyone remembers what was agreed to once you get to the end. It's very helpful to have a business case.
The reason it's helpful to have a business case, even if you are a UX designer, is you don't want your research to get reprioritized or cancelled in the middle of it, right? That is of course going to be detrimental to your timeline, right? And even if you're in UX and you're sitting there and you're like, oh, man, how am I going to put together a business case? Maybe you're in engineering, you're thinking the same thing, right? Just work with somebody else. You don't have to do it yourself, right? Somebody is requesting the type of work related to this research, right? Work with them. See if you can put together a business case about why you are interested in this area. And it really helps to try and quantify the value whenever you can so that if it comes up you have that. And many, many projects and many, many organizations don't actually have that quantified value. So you're just increasing the chances that it's not going to get canceled, right?
In this alignment meeting, you will probably have a lot of folks adding different things that they want, and if you have a short amount of time, you probably can't do them all. So in certain cases, you can just say: yep, that's a great idea, let's add that to our research backlog for the next iteration, right? Because on a short timeline you simply can't add in everything else. And you don't necessarily want to tell people no, because you may be able to do it in the future. But you want to tuck it away as something that we'll look at later on, assess the value of, and see if we can prioritize, right? All right. You've had your alignment meeting. Do you even really need to go and recruit people for your research? Maybe you already have the data that you need. Recruitment can take time, right? So you can check, and get creative about the data sources you might consider that could answer your research question, right?
It's surprising how many folks really don't know what data is available in other departments in the organization. Or maybe not surprising; these are big behemoths of organizations in some places, right? And it's worth asking about things like call recordings, notes in customer relationship management systems like Salesforce, or an inbox with customer emails. One of the really nice things about these sources is that when you set up, say, a user interview, you're really artificially creating that situation, right? You're coming up with questions, you're trying to get people to imagine what they were doing. Whereas these are in-the-moment pieces of data, especially recorded calls; I have found you can get some really great insights from those. Social media, right? If you're looking for opinions, you can use tools like Social Searcher or UVRX, and you can search to see what's being said about your company, if that's something you're interested in, or your company plus a keyword if you're interested in perspectives about a certain thing that your company does, right? You can also look at session recordings. If you have tools set up like Mouseflow or Datadog (there's a whole host of them), you can record what users do online, right? And so you may just want to filter down from those recordings into key areas that you're interested in.
And you might not need to do usability tests; you might be able to observe those recordings and see from them what the problem area is, right? Review websites are a good one as well. If you're trying to see, for example in a competitive analysis, not just what your competitor has but what people think about what your competitor has, you can go to review websites, right? Trustpilot, Yelp, g2.com. And you can look at your own company as well. What are you finding? Are there recurring patterns and themes coming up that show where you could improve your experience, right? Now, a lot of the things that I'm talking about here you can just do, right? The information is either there or it's not. But you can also be pretty proactive here if you want to set yourself up for success in the future, right? If you have a research archive, use it. Look at it. Search through it. If you don't have one, set one up. I've found Dovetail to be a great tool for this. And it really is amazing how much research gets repeated because people weren't able to find a previous test, or it got lost in some set of folders somewhere.
So being able to store that information so you don't have to repeat research can actually be quite helpful. Feedback forms, if you can set those up on your website: you're then proactively collecting data around different parts of your digital experience that you can go to afterwards, instead of needing to start from scratch when a need arises, right? Quantitative data sources: marketing surveys, which your company may have, or you can Google for third-party surveys related to the topic you're interested in that might give you some helpful information, right? And Jim talked about this as well: focus, right? You might not need to research everything. A really good example of this: you want to improve your digital experience. You might just go to your web analytics tool and look for drop-off in your funnel, in your experience flow, right? Using the same example of a shopping cart or a shopping flow, maybe you find that a certain page accounts for the majority of the drop-off, right? So you're not researching everything; you may be able to do a much more focused piece of research. You're always trying to de-scale the amount of research that you are doing to be just enough to make the decision that you want to make. Again, that fits very tightly with some of the stuff that Jim was talking about.
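To make that funnel idea concrete, here's a minimal sketch of how you might rank funnel steps by drop-off once you've exported step counts from your analytics tool. Every step name and count below is made up for illustration; the point is just that the step with the worst drop-off is where you focus your research.

```python
# Hypothetical funnel step counts pulled from a web analytics export.
# The step names and numbers are illustrative, not real data.
funnel = [
    ("cart", 10000),
    ("shipping", 7200),
    ("payment", 6800),
    ("confirm", 3100),
]

def dropoff_by_step(steps):
    """Return (transition, fraction_lost) for each step transition, worst-first."""
    losses = []
    for (name_a, count_a), (name_b, count_b) in zip(steps, steps[1:]):
        lost = (count_a - count_b) / count_a
        losses.append((f"{name_a} -> {name_b}", lost))
    return sorted(losses, key=lambda x: x[1], reverse=True)

for transition, lost in dropoff_by_step(funnel):
    print(f"{transition}: {lost:.0%} drop-off")
```

With these made-up numbers, the payment-to-confirm step loses over half its users, so that single page is where a focused piece of research would start.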
All right. We talked about recruitment of research participants sometimes taking time, and now we're going to talk a little bit more about unmoderated methods. Unmoderated usability tests are basically usability tests where you typically buy a platform like usertesting.com, or any number of platforms out there; there are tons of them. They often have a pool of participants. You set up a test, someone picks up that test, and because you have written the test script in advance, they go through it themselves. There are a number of advantages to this. First off, with moderated tests you have to talk the participant through, you have to do the calendar scheduling and figure it all out. With remote unmoderated usability testing, this kind of happens on its own once you set it up. Tests can run in parallel. Depending on the type of participants and the type of testing that you're doing, if you've got a very generic audience, you could easily have multiple tests going on at the same time, uploaded in a matter of a couple of minutes. Now, usually I would say, for most tests, at least in my experience, if you've got a general audience, you want to give it a day or two, right?
So I'd also say, if you're under time pressure, over-recruit by two users in these tests. There are a lot of professional testers out there at the moment, right? People that are just faking the tests to get paid, right? And if you're really under time pressure, you want to account for that a little bit, and not have to get to the end and then redo the tests and launch them again, right? So if you really care about speed, you might want to over-recruit a little bit, right? Keep it simple, right? One of my favorite usability tests is to start people on Google or on a blank page. A blank page is better, because you're not leading folks as to what search method to use or where to start. And you just ask them: go ahead and do X. Now, one of the benefits of an unmoderated remote usability test is that you can't interfere with that test. You can't ask questions, like you could in a moderated session, that might give you different behaviors than if people just went and did it themselves, right? Another really great thing about this type of test is that often people focus on the experience of their product only in the context of their own product; they'll start people in their own quote flow, for example, right? But really, some of the best information I've found comes when you start people earlier and see where they shop and who they look at even before they get to your experience. What are the things they're thinking about in the broader context before they even get to you, right? So this is a super simple usability test, and it's super, super valuable, right?
So it's a good example of something that you can set up, run, and potentially get feedback on within a day or two. And that, of course, is helpful and easier than taking potentially a week or two to recruit people to come in and do some sort of moderated testing. What I've also found with unmoderated tests, if I want to be really, really efficient, is that maybe I'm looking for a certain type of experience people are having, a certain problem, right? You can add a post-test question that says something like: briefly describe what you liked least about your experience, or what you liked most. And if you're running a bunch of these tests, that can help you identify which ones you want to look at without necessarily watching all of the videos. Now, there's a big difference between what people say and what people do, so obviously the ideal scenario is that you actually watch the videos that you are putting out there, right? But if you're under extreme time pressure, this might at least help you prioritize which ones to look at first, right? Unmoderated interviews, right? With a lot of these usability testing platforms, you can change things up a little bit and basically get unmoderated interviews, right? So here's an example from Try My UI; you could do this with a bunch of other tools, though. I might start it out with the scenario of: we will begin this session on a blank page, and you will be asked a series of questions about (insert your experience here), right?
Then: think about a recent time when you did this experience; please describe that process from start to finish in detail. What did you like about that? What did you dislike? What would you like to have changed about that experience? Is there anything else you might want to add, right? So that again is a super simple test around an experience that you might want to explore, right? And you can launch that, again unmoderated, and get those results back potentially very quickly. The other thing that I'll just call out here is that I've used userinterviews.com for recruitment when I've needed to go for a more niche audience. Some of these platforms like usertesting.com or Try My UI tend to have quite generic users, right? It's a general population. If you're trying to go after, I don't know, doctors or small business owners, you might need a more targeted approach, and I've found userinterviews.com to be good for that. You can set up your screener, and they're really pretty quick; you might get results back in a day or two. So that can be an option if your need is a little bit more niche. If you have very niche needs, that does tend to take longer to get that recruit set up, and it can cost you more, right?
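Going back to the post-test question trick for a moment: if you export the free-text answers, even a rough keyword pass can suggest which session videos to watch first. This is only a sketch; the session IDs, answer text, and keyword list below are all made-up assumptions, and it's a triage aid, not a substitute for actually watching the videos.

```python
# Hypothetical post-test answers keyed by session ID; all text is invented.
answers = {
    "s1": "Checkout was fine, nothing to report.",
    "s2": "I got confused on the payment page and almost gave up.",
    "s3": "Couldn't find the discount code field, very frustrating.",
}

# Words that tend to signal a problem; tune this list for your own product.
PROBLEM_WORDS = {"confused", "frustrating", "error", "stuck", "gave up", "couldn't"}

def triage(answer_map, keywords=PROBLEM_WORDS):
    """Score each session by how many problem keywords its answer mentions,
    returning (session_id, hit_count) pairs sorted worst-first."""
    scored = []
    for session, text in answer_map.items():
        lower = text.lower()
        hits = sum(1 for kw in keywords if kw in lower)
        scored.append((session, hits))
    return sorted(scored, key=lambda x: x[1], reverse=True)

for session, hits in triage(answers):
    print(f"{session}: {hits} problem keyword(s)")
```

Sessions with zero hits sink to the bottom of the watch list; the ones mentioning confusion or frustration float to the top.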
There are, of course, all sorts of ways that you can try to get around that, like going to conferences and trying to test people at those conferences, right? But it is certainly a more difficult task if you have a very niche need. One thing I'll also say: if you are exploring different partners for recruitment, a really good question to ask them is whether they have recruited this type of user before. It is not ideal to enter a relationship with a recruitment provider where it's their first time trying to figure out how to get these people; that can, of course, extend your timeline. All right. So that's unmoderated interviews. Next up: spend less time designing. You can test certain concepts with low-fidelity wireframes. You don't need to have everything built out to the nth degree of fidelity, right? Sometimes you can get some really good feedback with that, right?
Another thing that you can do, and again this requires you to be a little more proactive, is having design systems, right? You might not have to design everything out in high fidelity, because you already have the components on your website, right? So having a design system, either coded, which is what I was just talking about, or even just a whole bunch of templates in Figma or whatever tool you are using, means you can copy and paste some of these bits and pieces and just edit them as you go. Again, that can save you a lot of time. You have to build that up, of course, but it's much better than designing from scratch. Last one here: I'm going to talk about some phrases you can use to help you move things along, right? Instead of saying, "when can you, stakeholder, get me this piece of information that I need for my research, or make a decision," ask "how soon can you?" Because that puts the emphasis on: this is actually urgent and I need this; give me the earliest possible time that I can get it.
Sometimes you'll also be in a meeting trying to gain consensus, right? And a phrase that I've found super helpful is: "Based on what we have observed" (you state what those observations are), "I hypothesize we should trial X" (you make the suggestion on how to move forward), "unless there are any violent objections." There are actually a lot of things going on here that make this an effective way to communicate. First, you're starting with "based on what we have observed": the evidence, right? You're not necessarily introducing your own opinion into this. "I hypothesize": you're not saying, hey, this is definitely the best way to go; it's just a hypothesis to test. Sometimes when you're very definitive, people can get defensive about that, and you can start a whole conversation. The next thing is that you're putting forward a path of action, right? And then others can chime in on that. You're not waiting for the group to come up with something.
And the last thing is unless there are any violent objections, there will always be like little things that could be done better, right? But you're kind of saying, hey, group, this is what we're going with unless there are any big objections. And that's often a good way to kind of move the group forward. Another one is just making sure you kind of set those expectations and timeframes, right? As a next step, we'll meet with whoever and whoever by Y date to determine Z, right? Make sure you're setting that expectation for the next step. Too often people leave meetings and it's like, oh, man, like actually we should have said what we're going to do next when we're in that meeting. And you can just keep up a nice cadence each time of always setting the next deadline to keep the project moving forward.
Last one: you will all at some point run into the scenario where you're looking for some confirmation or feedback online, right? It could be through Slack, it could be through your emails, and there's a group of different people involved, right? When you ask for that feedback, obviously you want to put in a date. Sometimes if you don't put in a date, it will just drag on and drag on, right? And even when you do put in a date, if you're not really assertive about the timeline, it can still drag on; people do other things and they don't respond, right? If you're in a group chat, you can say: please review this by this date, and let me know if you object before we proceed, right? That means if you get a non-response by that date, you can move forward. Are there going to be people that get upset with that? Yes, of course. Again, there's no silver bullet here, but it's often much more effective than letting decisions sit out there and not being able to move your project forward, right?
And so that is all the stuff that I wanted to run through with you guys today. I'm on LinkedIn. Again, you might have been annoyed by some of my posts in the past, right? If you search Kevin Philpott, which is a fairly unique name, you will find me for sure. And feel free to check me out if you have questions. Feel free to connect with me and ask them. And this has been absolutely delightful. I am at the end and I'm going to hand back over.
Great. Thank you very much, Kevin. Another great presentation for tonight. And yeah, don't worry, I think LinkedIn is fair game for everybody to send out their posts, so I wouldn't worry about annoying anybody.
Oh, I'm not. I'm not. Just to be super clear about that, absolutely not worried.
Great. A couple of things came up as you were going through your presentation that I thought would be good to ask. So the first thing that you started with, and I thought it was a good point, is the alignment meeting. You mentioned one thing there that I thought was interesting: you're going to get people giving their pet ideas, or wanting to get in whatever they can. So at that point, do you have a very clear objective, or is this more of a discovery of what you're going to research? And if it's the latter, how do you effectively say no to people? Or are you taking everything on board?
So ideally, you have some sense of what you want to do, or what the business value of the request is, going in. I would say one thing that is super helpful as a designer in general is to ask that anytime somebody comes with a request, the business value accompanies it. That will filter out a lot of requests. And then when you have that, when you go into the meeting, you've got a very specific thing that you would like to do. It is, of course, more challenging if you get into that meeting and it's super exploratory and you have to direct the conversation. I've done those as well.
I think what you have to do in those types of situations is you listen to the opinions and then you suggest based on that what you think the research should be and try and corral that conversation that this is what we would like to achieve. If there are other things that come up, you might decide to prioritize those, stack rank them and say, cool, first what we're going to do is we're going to look at how we might understand the most important thing. And then you can kind of deflect some of those questions to be like, OK, that's good. Let's get to that after we figured out how we want to do this or let's put this in our backlog. But it's certainly more difficult and it's an art that you practice over the years.
Great. I'm just hearing backlog prioritization, stack ranking. I wonder, do we get Jim back in here to suggest some quick experiments?
Moving on to data: one thing I'm always a bit fearful of with just relying on, for example, people calling into a call center, or maybe some CRM notes, is that the negative feedback tends to outweigh the positive. When people are happy, they feel less of a need to reach out. Some people do, and it's fantastic when you receive that positive feedback, but the majority of the time it's people who are negative. Do you feel that maybe you're getting biased opinions, because 99 percent of people are quite happy and then you've got this vocal one percent?
So, I mean, I think bigger picture, you're always trying to triangulate your research methods, right? And if you have information from surveys that can suggest what the overall level of satisfaction is, then you can go into the research understanding and knowing that, right?
The other thing is that often you're kind of going in and you want to understand more about a phenomenon or you want to understand more about opportunities to improve. Right. And you've done a bit of research, hopefully. Right. You can kind of after a little bit of time tell, is this an outlier or is this something that kind of comes up pretty frequently. Right. And it can be helpful to just understand in depth that single case, see if you see patterns across as opposed to just one particular instance of it. Right.
And even if it's not like a massive problem for other users, it could still end up being a benefit or a delighter for them. Right. So I think you just have to go into the research with the right frame of mind in terms of how you're actually going to interpret or communicate that data afterwards to be kind of be responsible about that.
Great. There was a whole heap of different tools that you mentioned there. Is there something that you would recommend for people who don't have access to those tools, in terms of getting budget or getting sign-off and approval? Is there a business case there? How would you suggest people move forward?
Yeah. I mean, those tools can be incredibly expensive, usertesting.com for example. There are cheaper versions of those tools, right? So that's one way you can approach it. If you need to build the business case for why those tools are important, sometimes what you unfortunately have to do is start without them and show how long it takes to do this on your own. You can calculate the cost over time of how much it's taking you as a designer to do this type of stuff, right?
And also from a business perspective, it's deeply unsatisfying to have to wait potentially two to three weeks each time to get insights on your stuff, and that's if you're really planning it out, if you're doing that type of recruitment manually, right? So one thing to do is take what you're currently doing, record the cost of it, record the time it takes, record your time as a cost, and try to make the business case that way, right? Because some of these tools, again, are not all super expensive, and you can sometimes put it down on paper: hey, look, we could pay for this, or I can keep doing it manually, and that kind of makes the decision for you.
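As a back-of-the-envelope version of that business case, something like the following sketch works. Every number here is a placeholder assumption (hourly rate, hours per study, study volume, subscription price) that you'd replace with your own figures:

```python
# All of these figures are illustrative assumptions, not real prices.
DESIGNER_HOURLY_RATE = 75      # fully loaded hourly cost of your time
HOURS_MANUAL_RECRUITING = 12   # per study, doing scheduling/screening by hand
HOURS_WITH_PLATFORM = 2        # per study, using a testing platform
STUDIES_PER_YEAR = 20
PLATFORM_ANNUAL_COST = 10000   # hypothetical subscription price

# Annual cost of doing recruitment manually vs. with a platform subscription.
manual_cost = DESIGNER_HOURLY_RATE * HOURS_MANUAL_RECRUITING * STUDIES_PER_YEAR
platform_cost = (DESIGNER_HOURLY_RATE * HOURS_WITH_PLATFORM * STUDIES_PER_YEAR
                 + PLATFORM_ANNUAL_COST)

print(f"Manual: ${manual_cost:,} per year")
print(f"With platform: ${platform_cost:,} per year")
```

Even before counting the faster turnaround, putting your own time down as a cost like this often makes the comparison obvious on paper.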
Brilliant. Well, thank you very much. We've unfortunately run out of time, but again, I really enjoyed your presentation, and I hope everybody watching took a lot of value from it. So thank you very much. Love it. Thanks very much. Appreciate it.

Okay, so that brings us to the end of our session today, but I just wanted to do a shameless plug at the end. We are doing our main conference in the USA in May, and we'll be physically present in New York City at the New World Stages, for those of you who are in that part of the world. Tickets are on sale now. So if you enjoyed the conversations that we had today and you want to experience that in person, at a much larger scale, it covers all four of our core areas of product, UX, design, and dev, and focuses on how teams can actively move from working in that project world to a more product team world. Thank you very much for joining in; I look forward to seeing you again soon. And thanks again once more to Jim and Kevin for sharing their knowledge today. So thank you very much.