Dual Track Development: Involving The Whole Team In Discovery And Delivery



Enabling the Team
UXDX Europe 2020

Development work focuses on predictability and quality while discovery work focuses on fast learning and validation. Discovery and development are visualised in two tracks because it’s two kinds of work and two kinds of thinking. It is vital to involve the whole team in discovery tasks wherever possible and keep discovery work and progress visible to the whole team.
In this session, Agile Expert Jeff Patton will discuss how to involve the whole team in discovery and delivery. He will also discuss:

  • How to keep measuring and learning even after you ship
  • Two tracks, not two teams
Jeff Patton, Founder, JP Associates

Jeff Patton: It's early this morning. I haven't had nearly enough coffee. This could go horribly wrong. I've also got to talk about a whole bunch of things, and anybody who's ever seen me speak before knows I don't have a gift for brevity. Also, I present a little bit weird. Here's what I mean. Let me see if I can share a screen. All right. You can see my hand, hopefully. Now, let's start from the beginning. I'm here to talk about Dual Track Development. I need to talk about this first: I need to explain my terms, I need to say what I mean by Dual Track. And then I've got some specific lessons learned from people doing this for years, so that you can make this better, so that you can actually involve your team, so that it actually works, so that it isn't just a handoff from one group to another group. I need to talk about core strength, and no, it's not a yoga thing; I'll explain that in a second. It's important you keep discovery visible. It's important you involve the whole team. It's important to change your process so that you can. And it's important that you keep paying attention to what you've shipped, that you keep those outcomes visible. I'm going to start by talking about the first of these things. First, I want to give you a brief history lesson. I've had people tell me that I came up with the term Dual Track Development. I know that my friend Marty Cagan has spoken a lot about Dual Track Development, and people give him credit for that. But the person who really deserves credit, let me put her up here, is Desiree Sy. She used to work for a company called Alias, which is now Autodesk; Desiree doesn't work for them anymore. I first met her in about 2004, and she explained to me the way that she and her product team worked. They were early adopters of Agile Development. And I said, you've got to write this down.
This is common practice, but the way you're describing it is better than the way everybody's doing it, and I think if you wrote it down, people would get it. Finally, in 2007, she wrote this particular paper, easy to find online. If you look through it, you'll find she drew this particular model. Now, embedded in this paper was this short phrase: "Although the dual tracks depicted in figure three seem separate," blah, blah, blah, you can read the rest. She goes on to explain how these tracks really coordinate and how people work together, and she talks an awful lot about how they visualize things. I remember sitting with my friend Marty Cagan, and we were talking about this model, and he said, "Well, where did you get it?" And I said, "Well, I got it from here." I pulled up this paper and pointed to the words "dual tracks," and Marty said, "Okay," and started referring to it that way from then on. That's why we call it that. Now, let me explain what's going on in there, since I'm pretty sure most of you haven't read that paper. It's a term derived from working with Agile Development. Let me twist my camera as I start drawing pictures over to the left here. If you're working with Agile Development, you know the basic model. We start with a backlog, and we're going to pull things out of that backlog and work in these short development cycles. If you're talking about Scrum, that's a sprint, and that's a couple of weeks. You start every couple of weeks with a planning session where you plan how much work you can do, you do a bunch of work, and at the end you stop and reflect on what you've built. The goal here is to be predictable. You start every couple of weeks with planning, saying, this is how much we can get done, and you end every couple of weeks by actually measuring whether you got that stuff done. Predictability is important.
And the done stuff we're talking about is working software, stuff you could ship to your customers. Ideally you stop and review this stuff, and you look at whether the quality is good from a UX perspective, a functional perspective, a technology perspective. But this isn't discovery work. If your team is responsible for more than just building stuff fast, if your team is actually responsible for whether you get some benefit from it, you've got to use a different process, and that's a discovery process. In a discovery process, you acknowledge that we're starting with ideas, solution ideas, and you start by building a backlog, but it isn't the same thing as that product backlog. This backlog is, well, all the reasons your ideas might suck or might not work out. If your idea is going to be successful, it has to really solve a problem for people. People have to really want your solution. They have to be able to see it, try it, use it easily, and keep using it. And enough of them have to do that so your company gets some ROI or benefit. All those beliefs you have are your assumptions, and if those beliefs aren't true, they become risks. Sometimes, when you try to think through the design of this thing, you've just got questions that need to be answered. This backlog is not a product backlog; it is a learning backlog. That's what drives this discovery work. Like all backlogs, the thing at the top is the most important. That's our riskiest assumption, and given our riskiest assumption, we turn it into a question we can answer with a test or with some learning activity. Do people want the solution? Well, let's create just enough of a prototype to show them and evaluate their interest. Can they easily learn to use it? Let's create a prototype that allows us to do usability testing. Can we actually build this stuff using our technology?
Let's create what Agile people call a spike to look deeper at the technology. You get to work and you create a test, and then you get out into the world and actually run it, actually put it in front of people, and what you get back out of that is data. The data changes your idea; it changes your risks and assumptions. This is how discovery works, and you can tell it's really different from that delivery stuff. First off, predictability is out the window. I can't predict what I'm going to learn; if I already knew what I was going to learn, it wouldn't be called learning. The best we can do over here is to time box. We can time box our learning. We can say, look, I'm going to budget this much time to do this type of experiment. Usually we expect these things to take, in the best case, hours, a day, or days; sometimes long runs around this cycle might take a week or weeks, but they're faster. This stuff is also not predictable in another way: you can't predict one experiment after another, because what you learned from the first experiment will change what you do on the next experiment, and so on. So look, we time box this stuff, and if we focus on velocity here, it's learning velocity. Now, if you're working with a product team that is responsible for getting stuff built and building the right stuff, you've got to do both of these things in the same process. And when we twist this and draw it as one model, this is where we get the model that looks like this. Again, start with those ideas, and given those ideas, I might do some experiments that take a few hours, a few days, or a week. After enough experiments, every time around this loop, what I'm reassessing is my confidence that I'm building the right thing. Now, when I am confident, I can drop it into that product backlog and into these comparatively longer loops where I'm focusing on building working software. At some point in time, I'll actually ship that feature.
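[Editor's note: the learning backlog Jeff describes, ordered by riskiest assumption and worked through in time-boxed experiments, can be sketched as a small risk-ordered queue. This is purely an illustrative model; the class names, fields, and example beliefs below are invented for the sketch, not taken from the talk or any tool it mentions.]

```python
from dataclasses import dataclass

@dataclass
class Assumption:
    belief: str   # e.g. "people want this solution"
    risk: float   # 0.0 (safe) .. 1.0 (riskiest)
    test: str     # the learning activity that would answer it

class LearningBacklog:
    """Like all backlogs, the top item matters most; here the
    ordering is by risk, not by delivery value."""
    def __init__(self):
        self.items = []

    def add(self, a):
        # Keep the riskiest assumption on top.
        self.items.append(a)
        self.items.sort(key=lambda x: x.risk, reverse=True)

    def next_experiment(self):
        # Pop the riskiest assumption and frame it as a testable question.
        a = self.items.pop(0)
        return f"Is it true that {a.belief}? Time-box a test: {a.test}"

backlog = LearningBacklog()
backlog.add(Assumption("users can learn the UI easily", 0.4,
                       "usability-test a clickable prototype"))
backlog.add(Assumption("people want this solution", 0.9,
                       "show a rough prototype to five customers"))
print(backlog.next_experiment())  # the riskiest belief comes out first
```

The point of the sketch is only the ordering rule: whatever you learn from the first experiment changes the risks, so the queue gets re-sorted every cycle rather than planned out in advance.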
So, I can really evaluate it. We draw the model this way and it looks an awful lot like a waterfall. Yes, things drop from the top to the bottom, from the discovery track to the delivery track. But it's messier than that. Look, I might start with an idea and start doing discovery work on it while I'm doing delivery work on the idea that just finished. I might validate that we're solving a problem, that people want our solution, that we can build it. At this point, I'm really sure I'm going to build it, so I'm going to do an interim drop. I'm going to drop stuff that I know I need to build, but I'm not done with discovery yet. I've got to build better UI prototypes to do more usability testing and a little bit more research, and I'm going to keep dropping as I go. And this gets super messy. I'm dropping work as I'm doing discovery work, and yes, it changes things, and it's chaos. If you go back to Desiree's paper, she'll say, "Yeah, this is chaos. This is the way it works, but it's so much better than the way we used to work." Overlapping these two models is what gives us this term Dual Track. But it is two tracks; it is not two teams. So, let's talk about the team part. Let's see if I can get back on track so I can actually finish in time for you to ask some questions. All right. Let me tell you what I mean by this core strength thing. If I've got a product, I've got a team that's responsible for it, and I've got a bunch of different people on that team. It takes enough people to actually make good decisions. The team isn't being fed work by some other group or some other team; they really are deciding what to build and building it. They've got to have enough people, the right people, to make good decisions, but they've also got to be able to execute, to actually build the stuff. They don't make the decisions and hand them off to other people. That's what makes this a real product team.
So, teams swell; they get fairly big. Seven plus or minus two is the standard, but I see teams that are 10 or 12, things like that. Now, this is a product team: it's decision makers and executors that all work together. I want to unpack that decision-making thing, because it helps us understand what we mean by a cross-functional team. I'm going to leverage a mantra from my friend Marty Cagan, whose talk earlier in the week I did not see because I was working, and I'm ticked off about that. I hope it was good. It's an old mantra from him. He says, look, if you're going to build a successful product, it's an intersection of these three concerns. These days he'll say four, but I'm going to merge them together. The product we build has to be valuable. To make a good decision on what's valuable, you need to understand your business. That means you need to know how your business makes money. You need to understand your business's vision and strategy, because what's valuable is what's aligned with your vision and strategy and what makes your company money. If you don't understand those things, you can't make good decisions, and teams that don't, fail. If you're building a product you're selling to other people, you've got to understand your customers, the people that choose or buy the product, and the value proposition they're going to get. Part of understanding those customers is understanding your market, and inside your market are competitors. Competitors are the alternatives your customers could use instead of your product. There are explicit competitors, other products they could buy, but there are also hidden or ghost competitors: workarounds, cobbling together a solution out of productivity tools, or just sticking with a manual process. That's what you're competing against. Your solution has to be more valuable than your competitors' solutions, even if they're just workarounds.
The next concern is that what we design has to be usable. That means you need to understand who the users are and how they work, how they'll use your solution to actually accomplish anything. And I'm drawing a distinction between users and customers; if you're building software for a living, you've already got this down, I think. Customers choose a product; users use it. So, these are users and choosers. For consumer products, users and choosers are the same people, but for a B2B product, or something we built for internal use, the choosers aren't the same as the users. Users have to use it, and it's only through their using it that your customers actually get that value proposition. Now, usable doesn't mean just understanding users and how they work. It means we have to identify problems and identify solutions, and not just identify solutions but actually design them so they are usable. If we're going to make good product decisions, we've got to be able to design solutions, to make tactical decisions on how the solution design works. Finally, any idiot can come up with super cool ideas that we can't afford to build; I do it all the time. The right solution has to be feasible to build given the time and tools that we've got. That means if you're building a technology product, you have to understand how to build technology products. If it's software specifically, that means you need to know how to write code if you're going to make good technology decisions. Now, I could ask how many of you have piles of legacy code in your organization, and most people's eyes would roll. Everybody has legacy code. Everybody has existing stuff that they're adding to, and if you don't know how much it costs to make a change in your code, that's not going to help. Yes, if we built from scratch, it would cost this much. But given where we are right now, today, how feasible is it to change our product?
Other aspects of feasibility are those hidden technical aspects: things that if you screw up, you're in trouble; things your customers expect. Things like scale and performance. Things like security: look, I assume you're going to keep my data safe; I don't want to have to specify that. And finally, the last big thing to pay attention to in feasibility is technology trends. If we're building new tech products out of today's technology only, there's a high likelihood we're getting behind. Where the real potency, the real innovation, comes from is applying what's just becoming possible to the problems that users have today. For instance, a current technology trend is AI and machine learning. If we can figure out how to apply what's becoming possible with machine learning to the problems people have today, that's how we really get ahead and create solutions that our competitors are going to end up copying. So look, if we're going to make these decisions, you can see it isn't going to work to have one product manager, one person, doing all of that. That's a lot of stuff, a lot of skills, a lot of expertise, a lot of experience to have. So we look for a collaborative group of people: people that understand things from a business perspective, that's a product manager, or in Agile terms a product owner, and those terms ideally should be synonymous. When we talk about usable, we're talking about someone who is a user experience person; product designer is another term we use for people designing things that are technical tools, and DX, for developer experience, is the sexy new term I hear a lot. We've got to understand how our users work. And finally, if we're going to make good technical decisions, we need someone who is a senior technologist, not a junior one. We look for these kinds of people working together to make good decisions. Now, let me draw some pictures.
And well, let me do the technical concerns. Look, I know you've got a technologist on your team, or ideally you should. I know that you ideally have a business person, a product manager, someone who understands the business, on your team. And ideally, you've got someone that understands your users on your team. That's just enough to make good decisions, but for you to execute, it's going to take probably a lot more: technologists, testers, DevOps people, others. A product team is composed of all those people so that we can move fast. And there's an old mantra that crowds don't collaborate. One of the things that allows us to move fast, especially through discovery work, is to identify what we refer to as a core product team. A lot of you have already figured this out. If you're a product manager or a UX person, you know that if you can collaborate with an engineer, things go better. I see strong partnerships between UX and engineering, strong partnerships between product management and UX, and ideally all three. So, sometimes you've got an implicit core team: the people that talk and work together most often. In other organizations, this is made explicit. One company I've worked with over the years is Atlassian; some of you may be using JIRA every day, and that's one of the products they make. When I walk around at Atlassian, they'll show me where teams sit, and they'll point to a group of desks that are together and say, "This is where the triad sits." Triad is jargon at Atlassian for a core product team. I hear the word triad, I hear the word trio, I hear the abbreviation TO3 used by some companies I've worked with. The weird thing is that a team of three can have two people in it, or three, or four, even five people in it. It varies; it refers to these three concerns. The strength of your product team is only as strong as its core.
And when we talk about leading discovery and facilitating discovery, I'm looking at the core to do that; I rely on them. Now, we're going to talk about involving the whole team, but let me give you a few other strategies first. Let's talk about keeping this stuff visible. I'm not going to draw all these pictures; I want to show and tell just a little bit. The primary strategy we use for keeping work visible is the same strategy you've seen in any crime show you've ever watched: it's the evidence board. I've watched lots and lots of detective shows and crime shows, and never, not once, have I seen the detectives rush in and say, "Quick, we need to get this stuff into PowerPoint." That's not the way it works. You spray it out on the wall, and as you get new evidence, as things change, you update the board. Things change fast, and we want to see it all together. When I see product teams working, I see them build these fairly messy, fairly elaborate evidence boards. I took this picture; I was in front of Beth and Archie, and these are stakeholders, at a US company called CarMax. If you're from the US, you probably know who they are: they're the planet's largest used car dealership. Now, everybody asks why Beth is looking so grumpy there. She's looking grumpy because the hypotheses they're working on are not panning out; they're not testing well; their beliefs aren't panning out. On that board are those hypotheses; that's what they're talking to stakeholders about. They've got concept sketches; they've got metrics from running experiments. They've got simple personas and persona sketches. They've got sticky-note evidence, things they've gathered from interviews. And when I walk around their organization, every team has whiteboard space with these evidence boards in various degrees of completion, depending on where they are in their process, which they update and change constantly.
There are lots and lots of variations of these, but they are pretty consistent about having the hypotheses, having ideas for solution designs, and having what the current experiments are, and in post-COVID days, those things live on tools like Mural or Miro. I mentioned Atlassian earlier; this is the guy I know best there, and his name is Sharif. For evidence that he really exists, look him up on YouTube; there are several videos and talks from him out there. He knows what he's doing. I met him when he was a product manager for Confluence years ago, and now he manages a large group of product managers there. As I walk around their organization, it's covered with these variations of evidence boards: stuff that shows what they're currently working on and where it is in status, and lots and lots of evidence of discussions they're having. I've got lots of pictures of people doing this. These people were at the Australian Football League; there's a lot of stuff going on on their tiny wall. This is the group that builds the mobile app and the website for the Australian Football League. I've wandered around and talked to people inside Spotify and worked with them a little bit before. One of their problems is they have these nice, beautiful walls, and so they end up using strategies that don't rely on their walls: things like rolls of butcher paper, or, with this group working together, you can see these big foam boards that they're using to keep things visible. Now, I'm watching time running short; I've got to make this work. A couple more things I want to point out. That's part of visibility: it's not just your hypotheses, it's not just your ideas and how they change, it's the work you're doing. It's common to have tasks for discovery; we've got to figure out what we're doing to support that next best experiment, building a task board of to-dos, doings, and dones. That's common, and this particular board does have discovery tasks running.
It has the hypothesis, the most important thing, or the thing we're trying to learn next. At the bottom of the board are what we call deltas. Deltas are, well, what we learned; that's how we make the learning velocity visible. I'll always ask people, at the end of every cycle of discovery: What did you confirm? What did you think was true that you validated really is true? What were you wrong about? Because look, if you're always right, then you're either super smart or you're fooling yourself; you were wrong about something. What's brand new information, the unknown-unknowns, things you couldn't have known but that came out? We set up for our next round of discovery by asking what questions we want answered next; those things go back into our learning backlog. And then we always come up with new ideas or opportunities. Look, if you're going to involve the whole team (I wasn't going to draw all this crap while we were talking; I wanted to show you pictures), keep your hypotheses and your learning backlog visible. Keep your problems and context visible, things like personas and journey maps. Keep your solutions visible; those are UI designs and solution maps, even technical designs. And keep your plans visible and keep your learning visible. Those are the kinds of things that you put on an evidence board, and those are the kinds of things that are sometimes way too invisible. Third thing. I want to get through five, so I've got to really go fast here. Look, you're going to need to involve the whole team in what you're doing. Or, you don't need to, but you should. Let's talk about the easiest ways to do that, leveraging the build-measure-learn mantra that you've probably seen in Lean Startup or Lean UX thinking, where measure means that's how we actually get out there and do those tests. I've got lots of examples. So, let me back up; I didn't want to skip this quote.
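[Editor's note: the four delta questions Jeff asks at the end of each discovery cycle (what was confirmed, what we were wrong about, what new information surfaced, and what questions come next) can be captured in a tiny record. This is an illustrative sketch only; the class, field, and example strings are invented, not from the talk's materials.]

```python
from dataclasses import dataclass, field

@dataclass
class DiscoveryDelta:
    """The output of one time-boxed discovery cycle."""
    confirmed: list = field(default_factory=list)       # beliefs validated as true
    wrong: list = field(default_factory=list)           # beliefs that didn't pan out
    new_info: list = field(default_factory=list)        # unknown-unknowns that surfaced
    next_questions: list = field(default_factory=list)  # feed the learning backlog

def close_cycle(delta, learning_backlog):
    """Send next questions back into the learning backlog, and flag a
    cycle with no misses: always being right means you may be fooling
    yourself."""
    learning_backlog.extend(delta.next_questions)
    return len(delta.wrong) > 0

learning_backlog = []
delta = DiscoveryDelta(
    confirmed=["users want in-app checkout"],
    wrong=["users would pay extra for it"],
    next_questions=["would a mobile-first flow convert better?"])
was_honest = close_cycle(delta, learning_backlog)
```

The loop-back in `close_cycle` is the whole point: each cycle's unanswered questions become the next cycle's backlog, which is what makes learning velocity visible.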
This lady's name is Leah Buley; I've known her for a long time. I don't know exactly where this quote is written down, because she said it to me in a bar over a drink. Hopefully it's in her book, or some variation of it is, but: "Design isn't a product that designers produce. It's a process that designers facilitate." A lot of discovery activities look like UX design activities and research activities, and it's common for designers to now orchestrate design, to figure out how to involve groups of people. These people are on site observing customers using something: the guy on the right is a product manager; the person on the left is a UX person, a researcher. The guy behind the guy on the right is an engineer. He's been trained to take notes and listen; it's super important for him to get that empathy. Now, I've got lots of pictures of people out there talking with customers and observing users directly, and it's always something I'm encouraging. Here, let me pull back this quote from my friend Sharif. Again, apologies to him every time I use this picture, because he doesn't like that picture of himself. He's said this and keeps repeating it: they don't do customer interviews without having a developer in the room. This is critical. Look, I mentioned CarMax earlier. This is my friend Archie Miller, and this is one of his evidence boards, or one of his team's evidence boards. There's a whole bunch of these simple two-by-twos; they bastardize a bit of an empathy map and use these to take notes during interviews. They've taught everybody how to do this, not just engineers and testers on teams but marketing people and business people and customer service people. Anybody is welcome to sit down and watch an interview; you just have to know how to listen and take notes. They create these simple rooms where people can observe remote interviews, sit down and take notes while the interview is underway, and then synthesize those together.
Everybody works together to observe experiments and take notes. Once we've got notes, we use them to update or make changes to design artifacts: things like personas, things like simple journey maps. And lastly, I want to make sure I cover two simple ways that we involve teams. There's a common practice that's referred to as design studio or design sketching, or at a lot of companies it has the nickname crazy eights, because we don't delegate the design of the UI to just the UX people; we ask everybody to weigh in on the design. That crazy eights practice gets its name because you take a sheet of paper, A3 paper specifically, fold it three times, and open it up; it now has eight boxes in it. We want people to sit down and draw eight variations, or a flow that has eight panels, or some number of panels. Everybody draws, everybody comes back and synthesizes and shares, and people work together to build simple prototypes and, I'm not going to talk through all of this, to test simple prototypes. These are people testing a simple paper prototype. Now look, I want to bookend Leah's quote with another quote from another old friend: we are involving everybody, but design by community is not design by committee. Just because everybody participates doesn't make it a democracy. We don't want engineers voting on the best UI design any more than we want product managers and UX people voting on the best code. It's important for everybody to weigh in, but that's the way this works when we involve people. I'm not going to update this; I want to move on. But look, when we're building, we're going to need team help to participate in design in things like design studios, and we'll need teams to build more technical prototypes.
We're going to need team time to observe interviews, to take notes, and to synthesize those notes and make sense of what we've learned. There are ways to involve the whole team, but not all at once. Let's talk about how you change your Agile process in order to involve the team. If you're using a simple Agile process like Scrum, you've got a sprint planning session, and in that sprint planning session you talk about your delivery work: the stories that you intend to build this sprint and how much time they're going to take, and you work hard to predict it. Now, I'm going to tell you, from now on, in your sprint planning, you need to start by talking about your discovery work. What are the hypotheses that we're working on? What are the most important things to learn next? What kinds of tests or experiments might we be running, and what kinds of activities are we planning? Out of talking about this, we can talk a little bit about a time box. UX people and product managers generally spend most of their time doing this, but the rest of the team, maybe not. I see teams using budgets of anywhere between four hours or a half day and two full days to participate in things like observing interviews, participating in a design studio, helping to synthesize notes, or working on more sophisticated prototypes. Once we have that time box, that time budget, we carry it into our delivery planning. We use the rest of the time, oftentimes the bulk of it, on delivery. And look, in review, yes, you're going to review the delivery, the software you build, but it's important that your reviews now also review the discovery, and that's where that little delta thing comes in handy. We talk about what we validated, what we were wrong about, and what's new information; that's the real product of discovery. We can also show prototypes, things like that; show what we're working on, not just what we're delivering.
Now, this plan-and-review cycle is carried out every day in processes like Scrum. Look, we have a daily standup, and you know how that goes: you review what you did yesterday and you plan what you're going to do today. The first tip I've got for you is to stop going around in a circle asking people what they did yesterday and what they plan on doing today. It focuses on who's busiest, and it feels like a status report because it is one. In your stand-up meetings, start by talking about delivery, but don't go person by person; go story by story, or backlog item by backlog item. Talk about who worked on it yesterday, where we are in it, and how we move it forward today. It turns into a discussion, a real planning discussion: how do we get more work done faster, and how do we work together to do it? Then split your standup meeting: talk about delivery, and then talk about discovery. Same thing: we talk about the hypothesis or the current idea we're working on and what we're doing today. And if everybody's there, someone can say, "Hey, if there are interviews going on this afternoon, I can step in and observe and take notes. I know how to do that and it fits in my day." Now look, I gave you the high points of that. It's easy to find an article online on Dual Track Development; find that. At the bottom of that article are links to recipes, for if you're practicing Agile or Scrum by the book (this is the new book). There are four recipes there that describe the way planning and daily stand-ups and team reviews and stakeholder reviews go in Dual Track Agile Development. Last thing, and I think I can say this in just a minute, so I have a chance to exhale. Sorry if I'm going so fast that you're not catching any of this stuff; I hope you are. Look, your team is going to work together. They're going to do discovery work.
They're going to drop things to a delivery track, and people are going to start to use those things. The actual outcomes happen when things come out, and despite all of our best efforts, the discovery work doesn't always go perfectly. Look, every time you release a whole feature, I'll ask teams to stop and reflect, first on, let's call it, the actual effort. I'll ask them to arrange the things they've released, and I don't mean the individual stories; I mean the whole features, from small to large. Things that are similar will start to stack up. You get things that take days on the left, weeks towards the middle, months, and some things, whole features, that take quarters to deliver. Every time something's delivered, stop and say, "Okay, we got that whole thing done. How long did that take?" And then one of the big questions we ask is how long it took relative to what we expected. If it took a lot longer, mark those: they were challenged, they were late, they took a lot longer than expected. Now, everything we ship, if we talk about the actual outcome, starts in the "I don't know" category, because we don't know until people start to use it. But above that, we've got a continuum that goes from awesome down to awful, and in the middle is what I'll call the thud zone, where it's not awesome or awful. If in every sprint review you revisit this and say, "Hey, we shipped this last sprint, or shipped this a little while before; where is it now? Do we still not know what the outcome is, or did people complain immediately and it was awful, or do people love this and it's awesome?", these things start to bubble up and move around. This keeps things visible, and it lets the team start to say, "Gosh, this was a thud and we wanted it to be awesome. What do we need to do to improve this thing?" Everybody always notices, from building this type of thing, how few things end up in awesome and how much is in the middle zone.
And people also start to notice that whether something was late or not has nothing to do with its outcome or its awesomeness: late things can be awesome, on-time things can be awful. There's no correlation between them. It moves the team's focus back to the outcome. That's it.
Rory Madden: Well done. Stand back, take a breath for a few seconds. That was really good; I really, really enjoyed that, really visual. It seems a lot of the people who are watching today did too, and we have a few questions in. So, the first question was: how do you recommend teams build an evidence board that acts as an information radiator, but in a virtual environment?
Jeff Patton: I showed one at very lightning speed, a virtual board. Everything's a virtual environment now, and we're figuring that out. It used to be easy for me to walk around environments and take pictures of teams' boards, and I noticed that evidence board as a common pattern. I do see teams using Mural and Miro and making sure everybody has access to them. And so, that's all I've got: try using those tools that give everybody control and that are spatial, that work like virtual whiteboards.
Rory Madden: And they're improving a lot as well; they keep moving along with the updates. Someone else asked as well: if different people from the team are getting involved at different times, how do you ensure continuity of knowledge across the team for the people who weren't present?
Jeff Patton: So, two things. We're using those ceremonies, like sprint planning and daily stand-ups and sprint reviews, to make sure everybody hears about what's in progress. And we're using things like those evidence boards, so people can see it. It's also important that everybody on your team are adults who like each other and talk to each other, so they can lean over and ask somebody what's going on with something, or catch up if they weren't there by talking with people. And then finally, or did I already say this, those routine meetings are how we socialize things; that's what that review is for, to keep people up to speed. But yeah, people are coming in and out: lean on your team to help communicate that, lean on those artifacts, lean on those ceremonies to do it.
Rory Madden: And just one final question that came in, and we just barely have enough time to cover it. Someone was asking: how do you recommend getting stakeholders who aren't on the team to buy into this approach? This comes up all the time. How do we get someone to be a partner?
Jeff Patton: Two things. First, there's a recipe in there for the stakeholder review, and I've found that when the stakeholder review isn't just about showing what you built, but also shows the discovery evidence, stakeholders get really excited by that, especially when they see busted myths, or things we were wrong about, or really hear what customers said. So, that stakeholder review is a key part of keeping the work visible to them. And the only other thing I've got to say: if you're in an organization where you have to get permission to do things, there's an old friend of mine that said, just do it; they don't know what you're doing anyway. Ideally, you shouldn't have to ask permission to do your job well. And if you really are responsible for the success of these things, this is important: do it, show them the evidence, and keep that visible.