On today’s episode of Growth, Matt talks about all things activation with Thibault Imbert, Director of Growth for Adobe Creative Cloud. Thibault explains what activation looks like at Adobe – how they define it, set goals around it, and measure it (hint: it’s not just about the quantitative).
Then Matt and Thibault take it a layer deeper and discuss a topic near and dear to Matt’s heart – experimentation. Thibault shares his advice on how to come up with ideas to experiment and test against the activation metric, how his team thinks about leading vs. lagging indicators, and how they make sure they’re not doing something that will hurt retention later down the funnel. Want to find out what Thibault says is his team’s north star for activation? Listen to the full episode.
Matt Bilotti: Hello and welcome to another episode of #Growth. I am your host, Matt Bilotti, and I am really excited to dive deep into activation with a guest I am thrilled to have: Thibault Imbert from Adobe. He is the director of growth at Creative Cloud. Thibault, thanks for joining today.
Thibault Imbert: Hey Matt, this is awesome to be here. Thanks so much for the invite.
Matt: Absolutely. Today we're going to dig into activation, which Thibault is certainly an expert on, and talk all about how they think about it at Adobe, which I'm really excited to learn about myself. Thibault, do you want to give the audience a quick background on yourself?
Thibault: Yeah, absolutely. I started in the early 2000s as a web developer in agencies, then became a programming teacher. So I come from the programming world, and then I moved to design and became a PM by accident. I've been at Adobe for more than 10 years now, which is kind of crazy in the valley. But every two or three years, the company has given me the opportunity to take a job where I'm like, "You guys are insane to give me this job, so I will definitely take this opportunity." And it's been amazing.
Matt: That is awesome. And it has led you to quite a cool opportunity leading growth across Creative Cloud. That’s quite a suite of products.
Thibault: Yeah. We started with growth inside Spark, which is a product that we incubated a few years back with a different approach to building products. Back then, the general manager, Aubrey Cattell, was a big sponsor of growth, and he gave me the opportunity to drive growth for Spark. And then later, the leadership from Creative Cloud was like, "Hey, maybe we can try what the Spark team is doing locally, at scale, on this multi-billion-dollar business." And I was like, "Oh my god, that's amazing. Yes, I want to do that." And so here I am.
Matt: That's so cool to hear that it kind of began as this organic thing happening on one specific product and then was scaled out. I think that is a very successful model, rather than the other approach of just, "All right, let's spin up a growth team and apply them to all these things." Right? Because you had a pretty good base layer for how growth looks at your company. So that's really cool.
Matt: Let’s go ahead and jump into the activation topic, which is what we’re going to go through today. And I want to start at a really high level, because I think a lot of people have different ideas of what this word means. How would you define activation?
Thibault: Yeah, so activation, if you think about it, in the product world the closest proxy is onboarding. And I think onboarding makes sense, but generally it's not precise enough in terms of what you want people to do to form a habit around the core value prop. And so for us, activation is really that bridge.
How do you get a user from signup to habit formation, to using the product consistently? For us, activation spans 28 days, from the moment the person signs up through habit formation, and all of the steps involved in that journey: downloading the product, doing the first actions that are going to be leading indicators of retention. That, for us, is activation. That's 28 days. And after 28 days, there's another team in charge of really deepening engagement. So activation is really that critical first impression that gets you hooked into the product.
Matt: Got it. And the 28 days, is that consistent for each of the different Adobe products that you work on? Is 28 days the standard for the activation metric across all the different products?
Thibault: So right now, we've started with 28 days because we've looked at all of our products, and if you think about the product ecosystem we have, from Photoshop to InDesign and even new products, these are products where we think a habit needs to span more than one or two weeks to really prove that you're forming it. Now, one other thing we're discovering is that in some mobile products, and some products that are more consumer oriented, for instance more geared towards social, the habits and the frequency can be a little bit higher.
You can actually argue that you can form a habit around using Lightroom if you're using it consistently for maybe one or two weeks. But in general, we like to use 28 days, because if you get a user to use the product consistently across 28 days, even with a different natural frequency, we believe that user is really set up for success and will come back to the product the next month.
Matt: Interesting. That's really cool that you have that standard across products. For some reason, I assumed you would have a different timeframe for different products, based on what you're talking about, which is habit formation. Can you explain a little more about, one, how you think about habit formation, and two, how you define it for each product?
Thibault: Yeah, absolutely. Habit formation, the way we like to talk about it on the team, is kind of like if I asked you to do squats or pushups right now, and you did 20 pushups, and I said, "Yeah, do another 20," and you did them. Have you formed a habit of being healthy and working out? No. But if you do 20 pushups every day for four weeks, you're on your way to actually forming that habit.
And when we say habit, Matt, we really look at the core retention metric, the core metric that proves you're getting value from the product. So for us, what does activation mean? It means you've done the action that we believe is fundamental to getting value from the product. I'm not part of those teams, but for WhatsApp, it's probably messages sent, right? That's probably what shows the habit. For Instagram, it's probably posts. But for us, it's really about getting something out of the product.
Now, there are some differences between products. For some products, it's maybe continuous edits, but at the end of the day, you always need to get something out of our products. So if you look at the activation metrics in most of our products, it's really about getting people to do that action a specific number of times, which shows that the user is now successful with the product.
So let's take, for instance, Rush, this new video product that we launched targeting YouTubers. What is the core retention metric for a product that creates videos? Well, you can argue it's basically exporting a video that you're going to share. If you're not doing that, you're not getting value from the product. So for us, activation is really built around the core retention metric, the metric that shows the user is getting the value from the product that you designed.
Matt: Got it. And how do you get to, you said, take an action X number of times? For listeners out there, I'm sure some are wondering, "All right, that's cool. How do I pick the number? Do I just pick one that makes sense? Do I have to go hire a data analyst? How do I think about that?"
Thibault: So what we do is a combination, and it's funny, because we literally just had a meeting about this before this one. There are two components. One is qualitative research. We talk to actual customers. We basically look at the users that are still engaged, the people that survived activation.
And we talk to them and ask, "Hey, what were the moments of delight? What are the things that really felt like magical moments in your experience?" And what you'll find is that some people will say, "Do you know what? For me, it was the moment I used the audio feature, when I did audio ducking. That was awesome."
So you do that qualitative research, and honestly, you don't need to talk to a million customers to get signals. In addition to that, we run quantitative analysis, doing correlation using machine learning. We look at the users that are still engaged in the product over a longer period of time, and we kind of reverse engineer their journey back to the first day. And we ask, "Okay, what are the things that these people, who are successful now and retained, have done that the others, who are not successful, have not done?"
And often, to find the right metrics, there needs to be a strong correlation between your qualitative and quantitative research. What that means is you end up with metrics that are correlated, not causal. So basically, you're saying the people that are still engaged in the product in month two have been using the audio feature. Does using the audio feature cause you to be retained? Probably not, but it gives you a signal that these users are getting value from this feature. So you probably want to highlight these things as part of the onboarding, and then do A/B testing to get to causality.
So the North Star for us is to get to causality, where we can pick these metrics and say, "Okay, it turns out that for the people we tested and got to do that action, there's really stronger engagement in week four and month two. Therefore, these metrics are probably causal." And once they're causal, you can really anchor your team to be obsessed with them. For a lot of our products, we're still iterating; we're not at causality yet, but we have correlation. And now the teams are doing experimentation to get to causality.
Matt: That's really cool. For some reason, my assumption before talking to you was that Adobe is huge and must have so much data that they could probably make all their decisions on data alone. So the fact that you bring up the qualitative side, the whole "Well, we also have to talk to customers," is a little bit of a surprise. But it makes so much sense. That's a really good point, that you can't do just the data; you have to merge these things together.
Thibault: Yeah. What we like to say on the team is that quantitative tells you the what, but qualitative tells you the why. You can look at cohorts of users and say, "You know what? Most of these users are doing this." But if you ask why, no query in the world will tell you why that's happening. You have to actually pick up your phone, put out a survey, get on Slack, talk to customers.
And customers will tell you, "Oh, you know, I did this because there was this thing. The app did that and I didn't understand, so therefore..." And you're like, "Oh, okay, that makes sense." Honestly, with quant, you can manipulate the numbers; you can make the data say whatever you want. But a qualitative interview really connects you back to reality, where users are telling you, "I've been doing this because of that." So we really always go back to qualitative to make sure we're not getting the wrong signals through quant and making the wrong decisions.
Matt: Yeah, that is great advice for everyone out there thinking about activation. Okay, so let's say you've gone through some of that process. You're rolling out a new product, or you're starting to apply growth to an existing product, and you've got a pretty rough metric around what your activation is going to look like. How do you then set a goal?
There are two parts I want to talk about: setting goals and then reporting around them. So how do you think about setting the goal? And are reports this big plug-and-play thing at Adobe, or are they custom for each product? Sorry, I'm hitting you with a double-barreled question there.
Thibault: No, no, it's great, and these are so relevant to the work we do every day. James, who's with me in the room and works on the team, is smiling. For the reporting, we do it through a specific dashboard that we call the product health dashboard. This is a dashboard that is very visible to executives and leadership, and all the product teams and marketers are looking at it.
This is kind of like our source of truth, one source of truth where we plot and trend these metrics. And we've learned a lot by going through this exercise. The minute you put a metric on a dashboard, you really want to make sure everyone feels good about that metric. So that's how we do reporting, and we also send weekly updates saying, "Okay, these are the metrics and they're going up or down."
Matt: When you say feels good about the metric, is that they feel good about the fact that they know it's true? That they all agree these [crosstalk 00:12:48] are the ones to follow?
Thibault: Yeah, exactly. When you talk about activation, you really want to make sure that the product team believes these are really the right leading indicators, that they're quality metrics that lead to retention. Because the moment you put that on a dashboard that executives are looking at, obviously you want people to feel good about the thing going up. Or even going down, and have people be like, "Yeah, it's going down. These are the right metrics. It's going down because this is happening, and we're going to actually figure it out."
So it's really important that you have this trust with product. We often talk about growth sitting in the middle of marketing and product. You really want to make sure there's a really tight partnership, and this is something we've done at Adobe, where in every growth squad we've created, there's a product manager from the product team involved. And the reason we do that is because, one, they are the product managers, so it would be weird for us to do experimentation in a product that they own. We're on the lease, right? So you've got to have both the tenant and the owner as part of the conversation.
And then we make sure they feel good, and they're always connecting us back to qualitative. Sometimes the growth team might be looking at data and say, "Oh my god, we should do this because of that," and the product managers can say, "Hey, hold on. We actually did a survey. It turns out people don't do this because of that, but because of this."
"Oh, very interesting." And when it comes to goaling, we have a marketing and business operations team that works with us, and we have engagement managers on the growth team who work on goaling. We set quarterly goals, and we need to report on them. So what we do as a growth team is experiment, then report on and track these input metrics, which, if they're the right ones, will actually move the output. So we really talk about input metrics and output metrics: output being retention, and input being what we target through experimentation.
Matt: So one thing you said when we talked a couple of months ago, which we just relearned here at Drift on our own, is that oftentimes you don't want to goal yourself on an increase in a rate. You don't want to set a goal to increase an activation rate by X percent. Could you expand on that a little, if you know what I'm referring to?
Thibault: Yeah. One thing we've talked about in the past is that goals are often set through points lift or percentage lift. I'm actually fine with this, as long as you also report the absolute number of people that you're lifting. Because it's very easy to bullshit the numbers by saying we had a 16% or 16-point lift, and everyone's like, "Holy cow, this is amazing!" And then no one asks, "How many people is that?"
So you can have a 16-point lift, but you're actually lifting 200 people. You're not going to move the business. But if you have a one-point lift on a million and a half users, that's money. So in our reporting, we do set goals through points increase, but we also report the lift in absolute numbers: how many people did this move? That way, we make sure we're not celebrating wins that are super big in lift but super small in absolute numbers.
Matt: Right. Yeah. And the other part of that, which was a painful lesson for us to learn, at least, was, let's say we were working on driving sign-ups. We were able to drive sign-ups up 20%, but those sign-ups didn't actually result in any additional activations. Right? So that's the other part of it, kind of what you were saying: you have to make sure it actually drives impact down the funnel and turns into retention.
Thibault: Yeah, and it's interesting you mention that example of what you guys went through at Drift. There's this really interesting article, I'll send it to you, Matt. Hopefully you can share it, or the listeners can Google it. LinkedIn published an article from their data science team called the Science of Growth, I think. Basically, the data science team explains that at LinkedIn, they had sign-ups as a success metric for acquisition, which makes a lot of sense.
And they realized that over time, the acquisition team was pouring in a ton of users, and they were seeing this success metric go up, but a lot of them were tire kickers, and activation was not following. The conversion ratio from new users to active users was really low. So what they did is change the success metric for the acquisition teams to qualified sign-ups. And what a qualified sign-up is at LinkedIn, and maybe it has changed, but from the article, it is someone the acquisition team brings in who signs up, then creates multiple connections on the network and completes their bio.
This elevates the game in terms of who you bring in. And it forced the growth team at LinkedIn to really look at the acquisition team and say, "Not only do you need to bring sign-ups, you need to bring high-quality users, because that way activation is also going to go up." I thought that was a really smart way to work with acquisition.
Matt: Great, so I'll make sure that link to the LinkedIn article is in the description for all of our listeners. All right, let's say you now have your goals, you have your reports, you know where your activation metric is. How do you think about having your teams come up with ideas against that activation metric, and thinking about how you're going to start to move this thing?
Thibault: Yeah, it's a great question, and just before I answer, I want to explain why it's so important to have an activation metric. Because another thing that generally happens when you start in growth is that you say, "Okay, at the end of the day, we need to move that output, which is week-four retention." Retention is really the output; you need to understand the inputs.
So if you go into a meeting and you talk about how to move the output, you're going to get a lot of random stuff, like, "Oh, we should improve our emails. Why don't we do this? Why don't we do that?" What you want is to really box the team in, because constraints actually help ideation. So you want to say, "Okay, our first step in activation is getting people to do this, like importing a video by day three. What are all the things we can do to get people to do that? Let's talk to customers. Let's see what other products are doing. Let's look at all the different channels we can leverage. What are the features we could even build to move that metric up?"
And, Matt, we've learned a lot here, and maybe some folks in the audience went through this too. We started by talking about the output, and you get a lot of ideas that aren't really actionable and don't move the right metric. So now we say, "Okay, for this sprint, we're going to be moving that metric. We're going to be getting people to do this thing." And we run six-week sprints, with four weeks spent on execution and two weeks on research.
During those two weeks of research, we spend quite a lot of time brainstorming as a group. It's design-led now: the growth designer leads that conversation, we look at what other products are doing, and we really brainstorm on what we can do to move that metric up. And I really believe that good ideas come from everywhere.
So having a marketer, an engineer, even people from maybe the sales organization, this blend of people with different mindsets and cultures helps a lot. You can actually get a lot of bias from two people who have lived in Silicon Valley for 20 years and think they know what people will do. It's great to have someone from the middle of nowhere in Europe or the US who will say, "Actually, my sister or my mom would never do this."
Oh, okay. Well, it helps you recalibrate to reality. I always like to say that we, as product managers and marketers, live in San Francisco or big cities and grew up with technology. We are the worst people to come up with how normal human beings think and approach software. So keeping that beginner's mindset, to reference a previous podcast you guys did, that, how to put this? That normal-human-being thinking is a huge asset on your team. So try to have diversity in who you have in the room.
Matt: That's great. I think it's something most people don't think about as much as it relates to the ideas your team might have around moving a metric like activation or sign-ups. Okay, so my last question around activation: you're experimenting, you're trying things to move activation. How do you think about leading versus lagging indicators, and making sure you're not doing something that hurts the business downstream, or that looks really good on activation but doesn't help further down the funnel?
Thibault: It's an interesting question. The way we think about it is, when we do that correlation work on activation, we look at whether these metrics are correlated with retention. And when we at Adobe talk about retention, we talk about financial retention, basically retained subscribers. When we talk about just repeat use, we say repeat use or active use.
So when we look at these inputs, we do the pre-work to make sure they're helping the business. And at Adobe, what the growth team takes pride in is that, at the end of the day, being successful for us means people spending more time in the products being creative. We're not monetizing in some other way. For the likelihood of someone canceling, you don't even need to do correlation work: if the person is not downloading the product and not using the product, of course they're going to cancel.
So we look at whether activation metrics move week-four engagement, and whether they help drive retention at 52 weeks. If they do, then getting more people to use the product is a healthy thing. Now, what's interesting is that increased engagement doesn't necessarily always mean retention. There will always be variables in the equation: people losing their jobs, people changing careers, people going on vacation.
So it's never going to be perfect, but we believe that if people are more and more engaged, spending more time in the product, it's going to help the business. That's how we think about it. And if we ever end up in a situation where getting people less engaged in the product is what drives the business up, then I will retire, open a bakery, and do something else, because I won't understand how this thing works.
Matt: That's a good way to wrap it all up. Thibault, thank you so much for joining us. I know I learned a ton, and I hope our listeners did as well. Is there a place you could point people to learn more about activation, or how Adobe thinks about this stuff?
Thibault: I would love for us, as a team, to be more active on Medium and such. I've written some stuff in the past, but we haven't published a ton because, like everyone, we're super busy. One thing I would recommend: Brian Balfour has done a fantastic job working with Andrew Chen and Casey Winters, the former growth lead from Pinterest. They have assembled this program called Reforge, at reforge.com.
I'm not affiliated in any way; I just really like the work they've done putting together really good foundations. To me, it's probably the best growth training around. For the listeners who don't know about it, check it out. It's really good. They run it as a bootcamp, and many people from our team have gone through it. Every new employee who joins the growth team goes through Reforge, because it covers a lot of the foundational stuff we talk about every day, and they spend a lot of time talking about activation in their retention series. So I really encourage everyone to check it out.
Matt: All right, sounds good. Well, thank you again. I do want to throw out there, for the first time on #Growth, that there is a sequel podcast to this one, which you can check out on Drift Insider. Thibault is going to join again, with James, who was also in the room and works with him, to talk about what they call Project White Glove, which is all about how they collect customer issues based on user behaviors and aggregate them using machine learning to share with other teams at the company to make decisions. I'm super excited about that. If you want to check it out, go to drift.com/insider. Thibault, thank you again. Really appreciate it, and we will catch you on the next episode.
Thibault: Thanks so much, Matt, and thanks for putting this podcast together. I'm an active listener, and it's awesome to build this community. Thanks for having me, and good luck with the podcast.
Matt: Absolutely. All right. Thanks everyone. See you on the next episode.