How We Think About Product Testing At Drift

By Molly Sloan

Today’s episode of Growth is all about testing. Matt Bilotti is joined by Drift’s growth tech lead Vignesh Mohankumar. Together Matt and Vig discuss the ins and outs of testing products. How should you test the products you’re building? When should you test? And equally important – when shouldn’t you test? Matt and Vig dig into the considerations you need to keep in mind when testing products and they share real-life examples from building and testing products at Drift. Don’t miss out.

You can get Growth on Apple Podcasts, SoundCloud, Spotify, Stitcher, or wherever you get your podcasts. Or listen to the full audio version below.

Like this episode? Be sure to leave a ⭐️⭐️⭐️⭐️⭐️ review and share the pod with your friends! You can connect with Matt and Vig on Twitter @MattBilotti @_vig.


Full Transcript

Matt Bilotti: Hello and welcome to another episode of #Growth. I'm your host Matt Bilotti, and today I'm super excited to be joined by Vig Mohankumar, who is the tech lead on the Growth team here at Drift. I've realized that he and I have what could be podcast episodes basically every single day.

Vig Mohankumar: They should be recording us. [crosstalk 00:00:20] They should just put these mics next to our desk, and then have it live streamed and that would be-

Matt: I actually think that would be pretty good.

Vig: I think people would watch.

Matt: Yeah. I figured why not just have Vig on the show and we’ll have one of these conversations about a topic that we’ve been going back and forth on a lot lately.

Vig: Right. This is my first podcast, so I thought it would be good to have a piece of paper and a pen to make it look like I’m a veteran of the podcast world, but there’s no notes on here. [crosstalk 00:00:47] I won’t be using these at all.

Matt: Especially for all the audio listeners out there, he does have a pad-

Vig: I really have a pen.

Matt: There’s no notes.

Vig: No notes.

Matt: Probably not gonna write anything.

Vig: No.

Matt: That’s fine.

Vig: Handwriting’s real bad.

Matt: Okay. To go ahead and jump into the topic: we've been having a lot of discussions, call them arguments if you will, around should we test this thing or should we not. If we're gonna test it, it's gonna take a week, but we're pretty sure that it's gonna work, so why are we gonna test it in the first place, right? I just want to toss that out there as a starting point.

Vig: Yeah. This is definitely the hardest part of getting started with growth. Especially if you come from a product background, and especially at an early-stage company, which is what we were for a good amount of time, you're usually measured on: okay, let's just get this thing out. We have a channel called Shipyard where every engineer, once they put something out on production, posts in the channel saying here's what I shipped, here's how it works, and here's where it is. Especially for me, I would just measure myself on, okay, I got five ships this week, six ships this week. That would be my high score counter kind of thing [crosstalk 00:02:00]-

Matt: That’s it. I’m the best engineer. Look at all this value. It’s all working. Yeah.

Vig: Yeah. Some of my friends, I'd be like, "Yo, I got five more ships than you this week." Very odd dis, but yeah [crosstalk 00:02:13]-

Matt: As the company grew and we started moving into this Growth stuff-

Vig: Yeah, we started working on this about a year and a half ago now? [crosstalk 00:02:19] A year-ish now. It's really tricky 'cause you get to this place where you're like, are the things I'm putting out there actually gonna … are they doing anything? Are they actually working? That's definitely the reason to AB test, right? Fundamentally, that's why. But yeah, it's kind of a big question around when you should do it and when you should not. It's good that we're talking about it.

Matt: Yeah. Okay, so let's dig into that. When should we test, and when should we not? Should we not test when we sit around the room and say we're 90% sure? Before we even run anything, our intuition tells us we're so certain that this thing is right. [crosstalk 00:03:02] Do you test it?

Vig: Intuition’s good. You have to start with the hypothesis, right? Scientific method. You have to start with, okay, why am I doing this, and what do I think it’ll actually do? If you ship a new button, you have to have some reason for what impact it’s gonna have. Any good experiment starts with that. That’s good. The thing you’re talking about after this is your intuition of, okay, I believe that this thing is probably gonna work because I’m a smart person and I’m not gonna be dumb enough to put something out there that’s not gonna work.

Matt: Yeah, yeah. How could we be that dumb?

Vig: How stupid are we? There's some level of ego here because it's like, I've been doing this for a while. I'm probably not gonna make a mistake. But I think AB testing has been a tool for humility for me.

Matt: That’s funny.

Vig: [crosstalk 00:03:52] from Slack told me this once, he was like, it's a tool for humility, to make sure that you actually realize what you're doing, and whether those are the types of things that are gonna work. Basically, you can think about it that way: sometimes you need to make sure that what you're thinking, your intuition, your hypothesis, are fundamentally right, so that it affects what you're thinking about going forward, too.
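One lightweight way to build that discipline in is to write the hypothesis down as a small spec before any code ships. Here's a minimal sketch in Python; the fields and values are invented for illustration, not a template Drift actually uses:

```python
# Hypothetical experiment spec: forces you to state the hypothesis,
# the metric, and the expected impact before the test ever runs.
experiment = {
    "name": "onboarding_stack_options_vertically",
    "hypothesis": "Stacking the two options vertically makes the step "
                  "easier to scan, so more people complete it",
    "primary_metric": "step_completion_rate",
    "expected_lift": 0.10,  # 10% relative improvement (a guess, on purpose)
    "decision_if_flat": "ship anyway: better experience, same numbers",
    "decision_if_worse": "roll back and dig into why",
}
```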

Matt: For me, there's a point at which you have to start AB testing, which is when you obviously have product [inaudible 00:04:23], when you have enough volume that you can get results in a reasonable amount of time. There are parts of our funnel that we would love to test stuff on.

Vig: Yeah. It'd be awesome.

Matt: We would love to. But if we ran something, it would take us two and a half months to actually get significance. [crosstalk 00:04:38]

Vig: That is a scenario … And that's if it worked.

Matt: Right, so if it’s even working. That’s a scenario where we could probably look at this and say it’s gonna take us three and a half weeks to get a result on this. Is the potential result actually gonna impact the bottom line enough that that thing is worth testing, or do we have enough background and understanding of the customer that this is clearly better customer value?

Vig: It's worth explaining that. I think if you're so far down the funnel, or you don't have enough volume, that an AB test isn't gonna get significance, it probably means you shouldn't be working much in that part of the funnel, because people aren't getting there. If people aren't getting that far, is that really where you should be working, in a place where only five people see it a month?

Vig: I think it's sometimes easy to get trapped in, oh, I really need to ship stuff here, when no one's getting there because they're getting stuck at your sign-up flow, or at activation at the very top. For us, that's installing Drift, right? Putting Drift on your website. Having a conversation. There's no point in fixing the settings page to make it clearer how to change your color if no one's even getting to the point of seeing the settings page. That's the first thing to look at: how many people are getting here? What's the volume here? I think that's definitely the first question to ask.
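To put rough numbers on "it would take two and a half months to actually get significance," here's a back-of-the-envelope sample-size sketch using the standard two-proportion z-test approximation. Every figure in it (baseline conversion, lift, daily traffic) is made up for illustration:

```python
from statistics import NormalDist

def sample_size_per_variant(p_base: float, relative_lift: float,
                            alpha: float = 0.05, power: float = 0.80) -> float:
    """Approximate users needed per variant to detect a relative lift in
    conversion rate (two-proportion z-test, normal approximation)."""
    p_new = p_base * (1 + relative_lift)
    z_alpha = NormalDist().inv_cdf(1 - alpha / 2)  # two-sided significance
    z_beta = NormalDist().inv_cdf(power)           # desired statistical power
    variance = p_base * (1 - p_base) + p_new * (1 - p_new)
    return (z_alpha + z_beta) ** 2 * variance / (p_new - p_base) ** 2

# 30% baseline conversion, hoping to detect a 10% relative lift, with
# 50 visitors a day reaching this step per variant (all invented numbers).
n = sample_size_per_variant(0.30, 0.10)
print(f"~{n:.0f} users per variant, ~{n / 50:.0f} days to significance")
```

With those invented numbers it works out to roughly 3,800 users per variant and about 75 days, which is exactly the two-and-a-half-month territory where it's fair to ask whether the test is worth running at that depth of the funnel.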

Matt: Yep. That's for making the decision of testing or not. You still might have a core product team that's gonna fix the settings page, 'cause if you're still a young company, your 20 customers are like, "I can't change the color."

Vig: This is more for a growth team. If you’re on a team that’s trying to really work on distribution or getting volume through the funnel, yeah you gotta think about how many people are getting there in the first place. That’s the first question.

Matt: Okay, so let's take an example that we'd recently been discussing. We're doing some experimentation on the onboarding right now. The designer on our team was looking at one of the steps and said, all right, this step could be much easier. For simplicity's sake, let's say there were two options of things to do, and they were side by side. She said, why don't we put these on top of each other, because that'll just make it easier, and clearly people work from top to bottom when they're filling out these [crosstalk 00:06:57]-

Vig: I won’t lie, I looked at the thing, I was like this is so much better. Amanda’s a great designer. I looked there, I was like how did we ever not have it this way in the first place.

Matt: Yeah. Do we test that?

Vig: That's a tricky one, right? In some cases, I would say no, because how could this possibly be worse? But I think for us, we needed to test it, because we didn't know if there was actually a bottleneck in that first step. Even if the new step was simpler and better, that's totally fine and good, but it may not actually move the needle. It may not get people further through your sign-up flow. For us, it was important to test it to try to understand, okay, is this better at all, and if it is better, how much better is it? That's the reason we did it. Then we can use that learning going forward.

Vig: I think a good way to think about this is to ask what you would do in different scenarios. Let's say we learn that after two weeks the results are the same. What would we do there? In this case, we would probably still ship it regardless, because even though it means we're not getting more people through to the end, it's still a better experience. People are gonna be less annoyed. That's fine. If it's better, if it's 10% better-

Matt: Great.

Vig: Sick.

Matt: Great.

Vig: We’re doing our job.

Matt: Nice job.

Vig: Awesome. But then there's, if it's 10% worse, what are we learning? What's the learning there?

Matt: Yup. Then the question becomes: was the 10% drop worth that better experience for the people who do get through?

Vig: Right.

Matt: Right? To me, it kind of comes down to how much volume there is at that part of the funnel. This was step two of the onboarding, so most of the people giving us their email and their website were seeing that step, where a very small change has a pretty big impact down the line. Compare that to a 10% change when there are two people a day seeing something. What, that's like a handful of people per month?

Vig: Yeah.

Matt: Right. That's where maybe it's worth the 10% hit to give a better experience to the people that are going through.
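Spelling out the back-of-the-envelope arithmetic here (the higher-volume figure is hypothetical, purely for contrast):

```python
# A 10% lift means very different things at different funnel depths.
deep_funnel_views = 2 * 30     # 2 people/day seeing a deep-funnel step
onboarding_views = 500 * 30    # hypothetical volume at onboarding step two

print(deep_funnel_views * 0.10)  # ~6 extra people a month: a handful
print(onboarding_views * 0.10)   # ~1,500 extra people a month: real impact
```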

Vig: There's the other side of the learning, too, which is if we learn that it's the same. I think the learning there is, even if it's a better experience, we only have so many people that can … we're not like Facebook. We don't have a million engineers to do whatever we want. We really have to think about: is this the best type of experiment we should run going forward? If we learn that merging steps together isn't getting people further in the flow, we probably shouldn't do another merged step for a different part of the funnel later on. That's a good learning to have.

Depending on how many types of experiments you're doing, and what types of experiments you're running, if you're gonna try to do similar things across your funnels, it's worth experimenting to get the learning out of it. That's why I think about, okay: what if it's worse, what if it's the same, what if it's better? Just generally, what would you do? But there are cases where, even if it's worse or better or the same, you don't care and you're gonna ship it anyways. Let's say you have a bug.
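Before getting to the bug case: one way to make that worse/same/better pre-mortem concrete is to write the decision rule down before the test starts. A rough sketch, where the 2% threshold and the wording of each action are invented for illustration:

```python
# Decide the response to each outcome up front, so the test result
# maps straight to an action (and a learning).
def decide(relative_lift: float, threshold: float = 0.02) -> str:
    if relative_lift <= -threshold:
        return "worse: weigh the drop against the improved experience"
    if relative_lift >= threshold:
        return "better: ship it, and reuse the pattern elsewhere"
    return "same: ship for the UX, but stop investing in similar merges"

print(decide(0.10))  # "better: ship it, and reuse the pattern elsewhere"
```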

Matt: Yeah, yeah. Great, so do you AB test fixing the bug?

Vig: Please don’t do that. Please don’t AB test. It just ruins the name of AB testing if we get to the point where it’s like I have to AB test this bug fix.

Matt: Right, right. The thing was broken, but maybe because it was broken it made it harder, and it qualified the people better, yeah.

Vig: Right, yeah. Yeah, or it made them mad, and then by making them mad they really rushed through the rest of the … I don’t know.

Matt: This is where you’ve gone too far.

Vig: Gone too far.

Matt: You’ve taken the concept and just-

Vig: You’re in like fifth order of thinking. You’re in this meta world. Yeah.

Matt: Right. At the end of the day, don’t make a bad experience for your customers.

Vig: Yeah, your product still sucks if your thing’s broken. You know what I mean?

Matt: Yeah, yeah.

Vig: I can't even … what about an example where you sign up with an email, and it gets you through the funnel, but it actually changes your email in the middle? Say Matt at Drift signs up, and then someone else is like, I'm also Matt at Drift or something, and then it changes your email. You get through the rest of the funnel, but you don't realize that your email wasn't the one you signed up with for some reason.

Matt: Now, it’s a different … then, the next time you try and sign up-

Vig: Yeah, I'm just trying to explain that's a case where people are getting through the funnel, but it's a broken experience. Regardless of what the end result is, it's wrong. You should not AB test that. You should just fix the problem. Maybe not the best example. It's hard to think of stuff off the top of your head.

Matt: Yeah, yeah. Metaphors are tough.

Vig: Yeah, metaphors are tricky.

Matt: It's interesting. Then it raises the question of what we were talking about at the very beginning of the podcast: when you're early on and you're building product, and maybe you've been building products for 10, 15 years, you build an intuition. Now, what we're saying is you can be running a lot of tests, and you learn from those tests, and then you learn that, okay, maybe merging the steps isn't helpful. Now, do you have an intuition, or should you go test that again the next time?

Vig: I think it depends on your scenario, right?

Matt: Yeah. Yeah, yeah.

Vig: If it's a case where … you probably should just test it again, unless you have a situation where … I guess this goes back to what we were talking about at the beginning. If it's gonna take two weeks to test this, and you're pretty certain about what it's gonna do in your funnel, you can use the learning you had last time and say, okay, merging two steps together is probably not gonna move it. Let's not make that change, because it's not gonna make a difference. You can use the learning that you had in the higher part of the funnel at a later stage. I would say that's how I would think about it, but sometimes it's not completely transferable.

That's kind of the problem. You have to use some amount of common sense to figure out, okay, the learning I had here, does it apply somewhere else? An example: let's say you have a product that has an onboarding flow, and your company also has a different product with its own onboarding flow. That's the kind of case where, if you have similar traffic to both, you can kind of use the learnings from one for the other, I would say. That's the example I would use, at least.

Matt: Makes sense. I want to change gears and talk a little bit about this one. One of the other things that we've run into is, let's say we're working on the onboarding, and we want to help people get through, more people get through. Do we test a bunch of changes at once or-

Vig: Oh yeah.

Matt: Yeah, this is another one that we talk about a lot. Let's say we want to change the color of this button. We're gonna do a full pass on onboarding: we're gonna change the color of this button, we're gonna merge these steps, we're gonna move this option here, we're gonna change the way the fields look. Do you test changing all of these things at once? Or do you change each individual thing and wait for results on each of those?

Vig: For context, our old onboarding was built in July 2016.

Matt: Long time ago.

Vig: Long time ago. No one had worked on the onboarding for a while. Then we were like, wow, we should really make this thing … We should just redo it, honestly. There's no way it made sense to copy and paste it, and AB test each part out. It would've taken us two years. Then we would've been fired, and then we wouldn't have jobs, and we wouldn't be talking about this here.

Matt: Nope.

Vig: I think that's the first step: if you AB test this, will you get fired? Maybe that could be the first thing you think about. But yeah, no, it's a fair point, because we had a designer on the team, and her idea was basically: we have consistency in design across the product. We have certain buttons that we use. We have certain colors, and themes, and font sizes, and inputs. Should we AB test every single one of these things out to [crosstalk 00:14:42]-

Matt: Individually.

Vig: Right.

Matt: Yeah.

Vig: To make sure it doesn't have a negative impact. The answer there was no. We ended up just saying, let's just do it. Let's try to do a one-to-one transfer. Let's do the best we can, but let's make the upgrades that we need to make. If we were using an old input style, an [inaudible 00:15:02] input style, let's update it. If we were using a green that doesn't exist in our color palette anymore, let's change the green. But we tried not to do anything that would affect the flow. We didn't say, okay, let's just get rid of these two steps. Let's [crosstalk 00:15:17]-

Matt: We didn’t change the text on the CTAs.

Vig: Right. We kept the CTA text the same, because otherwise you get into a really tricky situation, which is: is this going up or down because I changed the style? Or because I changed the text?

Matt: Or is it the way that the page is structured?

Vig: Right.

Matt: Yeah.

Vig: You gotta do your best to try to keep that. When you have situations where you can't AB test every single thing, do your best to do everything as a [inaudible 00:15:47] and then AB test the whole thing, just to make sure nothing's gotten worse. Then, yeah, after that the question comes up: if I have one step where I change the CTA, and then after that I want to merge two steps together, does me changing the CTA have an impact on what they do after that? It becomes a problem.

We at Drift have kind of avoided, we've tried to avoid this, basically. We split the funnel into parts: okay, people sign up. All the website experience, we would isolate that, I would say. Anything that happens on the website, isolated. Then they come to the onboarding. Anything from onboarding to finishing it, installing, and then getting to a dashboard, that's a second layer. We try not to do two tests there at once. Then, once they get into the dashboard, the dashboard experience would be a third layer. We can usually have three experiments at once, basically. And you can have lots of different website pages, so we usually have at least four or five experiments running at once.

Matt: But they’re on very distinct [crosstalk 00:16:56] parts of the funnel. Yeah.
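Here's a minimal sketch of what that kind of layering can look like in code, using deterministic hash-based assignment so a user always lands in the same bucket. The layer names, experiment names, and 50/50 split are invented; this illustrates the idea, not Drift's actual system:

```python
import hashlib

# At most one active experiment per funnel layer, so tests in the
# same layer can never interact with each other.
ACTIVE_EXPERIMENTS = {
    "website": "homepage_hero_copy",
    "onboarding": "stack_options_vertically",
    "dashboard": "empty_state_checklist",
}

def assign_variant(user_id: str, layer: str) -> str:
    """Stable control/treatment assignment for the experiment that
    owns this layer; the same user always gets the same answer."""
    experiment = ACTIVE_EXPERIMENTS[layer]
    digest = hashlib.sha256(f"{experiment}:{user_id}".encode()).hexdigest()
    bucket = int(digest[:8], 16) % 100  # 0-99, uniform enough for a 50/50 split
    return "treatment" if bucket < 50 else "control"

print(assign_variant("user_42", "onboarding"))  # deterministic per user
```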

Vig: Yeah. But once you get to much more scale, you can randomize it. Then you can [crosstalk 00:17:02]-

Matt: Like a Facebook can do or a Pinterest can do. Yeah.

Vig: Yeah, and then you can, it's called normalizing it. You would just try to understand what impact this improvement has on the other AB tests that are running. It gets a little complicated, so we're just trying to avoid it, because we're not at the stage where we're changing this copy and this copy. We're trying to do big bets anyways, usually, so it hasn't really been a problem for us, at least.

Matt: Yeah. Yeah, and it comes back to the whole concept of bite-size changes versus big swings, which we talked about a couple episodes ago. When we haven't worked on onboarding in a while, do you make a big change to it and see if that's the new normal? Or do you make the small incremental changes and see how those add up over time? Cool. I think that's it for this episode.

Vig: Sounds good.

Matt: Yeah. How many stars should the listeners and viewers rate this episode?

Vig: Is it out of 10?

Matt: No, it’s out of five.

Vig: Out of five?

Matt: Yeah.

Vig: Oh, probably like a 10 still, right?

Matt: 10 stars?

Vig: Yeah. Just rate it twice, right?

Matt: Yeah, yeah.

Vig: Yeah.

Matt: Yeah. That’s good. Probably not allowed, but at least five stars.

Vig: At least five, yeah.

Matt: Cool. All right. Vig, thanks for joining today, and thank you for listening. If you have any feedback, thoughts, ideas, whatever it might be, send me an email at Matt@drift.com. We’d love to hear it, and-

Vig: You respond to those, right?

Matt: Yeah, I respond to those.

Vig: Nice.

Matt: Yeah.

Vig: You don’t have a hired person that responds [crosstalk 00:18:31]-

Matt: No, no, no, no.

Vig: Oh, nice. Cool.

Matt: I’m not that important. Yeah. All right. Thanks for listening. See you on the next episode.