Looking to become a data-driven organization? Well, there’s no silver bullet, but there are definitely some tricks that Dan Wolchonok, Head of Product and Analytics at Reforge, has picked up along the way. Before joining Reforge (an e-learning company that provides growth masterclasses for product and marketing pros), Dan was a founding member of HubSpot’s sales product growth team, and later went on to form the company’s product analytics team.
On today’s episode of Growth, Matt throws his toughest data questions at Dan. They talk about how to approach historical data and data tracking, how to pick reporting tools, and who should own analytics. Then Matt and Dan go deeper and talk through the thing that separates the top one percent of companies from the rest – good retention. To hear what Dan has to say about retention curves (and the best way to analyze them), listen to the full episode.
Subscribe & Tune In
Matt Bilotti: Hello and welcome to another episode of #Growth. As always, I am your host, Matt Bilotti and today I am excited to dig into a topic that we have not really touched on yet, which is all around product analytics. And with me today I have a guest who I used to work with at HubSpot and when we were there together his nickname was Data Dan, I believe it’s probably still his nickname today. I have Dan Wolchonok, who leads product and analytics at a program called Reforge. Dan, thanks for joining today.
Dan Wolchonok: Thanks for having me on, man.
Matt: Absolutely, do you want to give the audience a quick run down on who you are?
Dan: Sure, I’ve been building tech products for the past fifteen years. Most recently I was at HubSpot for five years and I was in charge of our product analytics and about a year ago I left to join Reforge and I am the head of the product and analytics team there.
Matt: Very cool. For those of you who are trying to learn more about growth Reforge is a very good resource so might wanna check that out. All right, so I wanna dive straight in and I’m going to start at a high level and then we’re going to go into some tactics a little bit further down in the conversation. So let’s say I’m a company, I’m getting started and everyone’s telling me that I need to be more data driven, be more analytical, how and when do I start? When is it time to start paying attention to data?
Dan: I think it’s time to start paying attention to data when you can’t have conversations with all of your users anymore. When you can’t talk to them, email them, watch them, watch what they’re doing. I think that’s when you want to start thinking about using behavioral analytics to keep track of what all of your users are doing.
Matt: I love that because it’s actually a very clear point. Most of the time when I talk to people about when to start analytics it’s like, “Oh, when you hit a million dollars of revenue, right?” This feels like a much more natural … Because it could happen at any point depending on the scale of your consumer app. It happens way sooner than if you were a B to B tool.
Dan: Yeah, definitely. If you’re an enterprise company and selling one deal is immaterial, just go talk to those people. I mean, oftentimes people can lie to you about what they’re doing and I think behavioral analytics can be helpful to understand what they’re truly doing, not just what they say they’re doing. But, if you’re just getting started I think it’s fine to say, “Okay, we can’t keep track of what everyone’s doing, let’s put some systems into place so that we have a good foundation going forward”.
Matt: Makes a lot of sense. So, let’s say we are ready to make that leap; we can’t talk to anyone anymore, I’ve been working on it for a year and I used to know every single customer by name, but now I don’t and I can’t reach out to them all because I have too many other things going on. Can you give maybe a quick run through of how you think about your funnel and analytics related to it? Do you start by getting one big funnel report and then you work with that for a while? How do you define the funnel? Where is a good place to start?
Dan: Yeah, so I think the first thing you want to do is bring a first principled approach to things. So you wanna ask questions like, “How are you supposed to start using your tool?”, “What are the key actions that people should take?”, “What represents value for your customers?” And how you can measure that. And then ultimately just what you want your users to do and how is it important for your business. And if you keep track of the things that are important for your customers and for you as a business, those are some of the things that … that’s literally the first thing you should do when you want to keep track of what’s happening within your application. That should drive things.
So as an example, once you go through that exercise, one of the things that I think people should do from the very get go is they should keep track of events like when people sign up for your product and then when they do certain key actions and the answer to what represents value. That should hopefully be the end of a funnel report, to say, “Of the people that signed up for your product, how many then convert for step x,” whatever it is.
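The funnel report Dan describes can be sketched in a few lines. This is a minimal, hypothetical example (the event names and data are invented for illustration, not from the episode), assuming each logged event is a (user ID, event name) pair in time order:

```python
from collections import defaultdict

# Hypothetical event log: (user_id, event_name) pairs, in time order.
events = [
    (1, "Signed Up"), (2, "Signed Up"), (3, "Signed Up"),
    (1, "Created Project"), (2, "Created Project"),
    (1, "Invited Teammate"),
]

# Ordered funnel steps, from sign-up to the action that represents value.
funnel = ["Signed Up", "Created Project", "Invited Teammate"]

def funnel_counts(events, funnel):
    """Count how many users reached each step, having completed all prior steps."""
    reached = defaultdict(set)  # step index -> set of user ids that reached it
    for user, name in events:
        if name not in funnel:
            continue
        step = funnel.index(name)
        # A user counts for a step only if they completed the previous step.
        if step == 0 or user in reached[step - 1]:
            reached[step].add(user)
    return [len(reached[i]) for i in range(len(funnel))]

print(funnel_counts(events, funnel))  # [3, 2, 1]
```

In practice a product analytics tool builds this report for you; the sketch just shows the shape of the question: “of the people who signed up, how many converted at step x.”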
Matt: Makes sense, so this is one of the things that we had a challenge with at Drift early on when we started getting to this point where we were ready to become analytical. How do you think about data accuracy? Should you, when you’re in this spot, should you strive to get all the tracking right and get it right now so that you don’t have to go back on it later? Is eighty percent correct, is that okay? How do you think about that?
Dan: So six years ago when I joined HubSpot we started using Mixpanel, and Mixpanel was like … I’m not a religious person but when I was exposed to Mixpanel it was like a religious experience for me. I was like, “Wow, this is the coolest thing, I can’t believe you can do this. This is so powerful.” And I then became the owner of Mixpanel when I was at HubSpot early on in my tenure there and I was talking with Mixpanel about how they instrumented their own application and I was blown away because when we were at HubSpot, ultimately I think we ended up having thousands of events. We may have even had tens of thousands of events over time because it was absolute chaos. But I remember being blown away. Mixpanel had something like less than ten events that they were keeping track of.
Matt: Oh wow.
Dan: Which was a really good perspective for me because it showed what good clean living looked like, and back then Mixpanel wasn’t that sophisticated. There were only a couple of reports, but they only had one event for viewing a report and there were attributes on that event that said what kind of report it was; whether it was a retention chart, or a funnel chart, or a time series chart. And so it felt like they had spent some time, maybe an hour or two, thinking through, “What are the types of features we see ourselves adding over time, and what is a sane way of representing that with names of events?” And so you could use a tool like Heap, which just automatically keeps track of everything that happens in your web application so you can retroactively go back and cherry pick events that happened, but I think it is very helpful to think through … There are going to be certain types of actions in our application, let’s pick some generic names so that later on if we rename a feature, we change something, the way we send in events doesn’t have to fundamentally change.
Matt: That is so interesting that they had so few events to start with. Because I think there’s this very clear tension and desire to say, “All right, we’re going to add tracking. We don’t know how many trends are under the hood that we haven’t seen yet so let’s go track everything,” right? And kind of the same thing at HubSpot. It’s fascinating to me that an analytics company started off very very focused.
Dan: I mean, at HubSpot we released this feature called Sidekick and we named this event “Viewed Sidekick” in Gmail. And then over time the feature was changed and then the name of the product was changed and so ultimately the name of the event didn’t represent what was happening anymore at all, and so there were only one or two people that could actually give you the answer to a question about how many people were using that feature. And it just goes to show we didn’t put any forethought into how we sent those events into our system so that we could do reporting on them later. And you’re never going to have perfect clarity about how things will evolve, but we really didn’t think about what that event represented at a base level and giving it a generic name so that it would age gracefully.
Matt: Huh, that’s pretty funny. I can think of some examples of situations that we’ve been in where I’m the product manager and I’m trying to analyze a report in Amplitude and I’m pulling up this event and it’s like, “Conversation Started,” and I’m like, “Oh cool, this must be it,” and then I build a report around it and then I’d show it to my tech lead and he’s like, “No, no, no that’s “Conversation Started” that they had with us, not “Conversation Started” that they have in their account”. It’s so true, it’s so easy to back yourself into this corner, especially when you track everything up front. You just kind of name stuff and … all right, because you’re in this mode of go, go, go.
Matt: So you had mentioned that you think maybe it’s worth an hour or so of just thinking through. Is there a particular way to think about that? Is it put it in a document and then collaborate on it with a couple other people and get agreement that this scales, is it to sit down and do a brainstorm on this? Is it a job for one person or should a team be thinking about it?
Dan: I think a lot of growth teams typically organize with a product manager, a tech lead, and then a bunch of engineers and hopefully a designer. And this may feel like a luxury but I think it’s worth having at least a thirty minute chat where someone on the team can say, “These are all the things that I think our app does today. These are the things I think it realistically will do in the future. Let’s think through how to name some of these events and come up with an event dictionary to represent: here’s the name of the event, here’s where it happens, and here’s what it represents.” And as soon as it’s completed it’ll be out of date, but hopefully you have a little bit of documentation and you put thirty minutes of thought into thinking, “This is the current state of functionality we have. This is what we will have in the future. And these are the key attributes we expect to be able to segment those events by.”
Dan: So in your example, “Started Conversation”, you’re going to want to say, “Does this happen on mobile or desktop?”, and oftentimes I see people where the event name says “Started Conversation – Mobile” or “Started Conversation – Desktop”. In a perfect world, you have agreed that the name will be “Started Conversation” and you pass an attribute that represents whether it was on mobile, whether it was on desktop, whether it’s a big company, whether it’s a small company, whether the … you know, what time it was, whether the agent was available or not. And having not just the name of the events but some of the things that you’re going to want to segment those events by, I think that can be really helpful.
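The naming convention Dan describes looks roughly like this in code. This is a hedged sketch: `track` is a hypothetical stand-in for whatever send call your analytics SDK exposes, and the property names are invented for illustration:

```python
# Instead of baking variants into the event name
# ("Started Conversation – Mobile", "Started Conversation – Desktop"),
# keep one generic event name and put the variants in properties.
def track(event_name: str, properties: dict) -> dict:
    """Hypothetical stand-in for an analytics SDK's send call."""
    payload = {"event": event_name, "properties": properties}
    # ...here a real SDK would send the payload to your analytics backend...
    return payload

payload = track("Started Conversation", {
    "platform": "mobile",      # vs "desktop" -- a property, not a name suffix
    "company_size": "small",   # segmentation attribute agreed up front
    "agent_available": True,   # another attribute you'll want to slice by
})
```

The payoff is exactly the one Dan credits Mixpanel for: when the feature gets renamed or grows a new surface, the event name survives and only the properties change.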
Matt: Okay so let’s say that missed the boat, we totally missed the boat on good tracking. Either I showed up late and it was already done and it’s a total mess or I was a part of that mess and now this is our life. How do you approach that? Is it kind of forget what’s there and just start new on top of it? Do you recommend digging around then picking out the ten things that seem like they’re the best? Where does someone go from here, because if I had to guess I think that’s where most people listening are probably at.
Dan: Yeah I think historical data is so valuable, as long as someone knows how to make sense of it I wouldn’t declare bankruptcy. A lot of these tools nowadays have functionality where you can combine events into new events and give them a new name, and then you can hide the old events. So it may be worth going through the exercise of spring cleaning, you know, you may even want to Marie Kondo it and figure out the new events you want to set for your names and then have the old events map into those new events. If you don’t know what any of the events mean, it might just make sense to start over, start a new project. Keep the old one in case someone figures it out but just start a new one. But generally I think that historical data is just so valuable when you say … when you release something new and say, “I want to compare the historical behavior with the new behavior,” so I’m a big advocate for not throwing things away just because it’s a little messy.
Matt: Right, and so as long as someone knows what that thing means, and then you should probably … if there’s like one person on your team that knows what this data table is, what these fifty events mean, then you should probably document that down before that person moves on. Okay, so I want to take a little bit of a shift here, I think that was great. We spent way more time on that than intended but I think that specifically, like the data tracking, is something that usually everyone just kind of assumes that other people have figured out, but it’s actually pretty tricky and it was hard for us to find any good content on it, so thank you.
So I want to shift over to … You’ve got all these events, you’ve got your data coming in, you’ve picked out your reporting tool, maybe. Oh actually, all right, how do you pick your reporting tool? You know, you’ve set up your events, how do you pick the tool that you’re going to run your analytics on? Because I know that you used to run some stuff in Google Spreadsheets, other times you’re using Mixpanel. How do you think about choosing the right option?
Dan: Yeah. I think for most companies, honestly, any of these tools are going to be just fine. And I think the value you’re going to get out of it is the effort you put in. So for most companies I think just using it is going to be really helpful. For us personally, we used Amplitude at the time because they made it really easy to switch between … Because we were a B to B company, we wanted to look at time series charts, funnel charts, and retention charts by users as well as by organizations. And they had a feature at the time that made it really easy to switch back and forth, to say, “I want to look at my weekly active users, and then switch one thing in that chart,” and it would show you the same criteria but it would show you weekly active companies. And so for us at the time that was a big deal because we were a B to B company.
If you’re not B to B, that feature is worthless to you. And so I would really think about just adopting these tools but then also thinking about … each tool has its strengths, right, no tool is perfect and so it’s just an exercise in saying, “What are the specific features they have and is that really going to make a difference for us?” And like I said, I don’t think for a lot of companies it’s going to be make or break.
Matt: Makes a lot of sense. So all right, you’ve got your events, you’ve got your tools, now what charts do I build? I hear Facebook uses DAUs, daily active users, and these other tools do monthly active users. Is there a standard set of analytics that I should go with from the get go, like build these five reports and then build one report for each step of my funnel? How do you think about the actual reports that you’re making?
Dan: Yeah. So I would think about what’s important for the business. Ultimately, you need new people coming in the door, and so I would care about things like sign ups and I would think that you’d have a whole bunch of charts that showed the funnel of how successful those sign ups were at using your product and hitting some key point. You may also want to go really deep on it, you may want to segment it and say, “I want to see that funnel conversion rate by source”. People that come in looking for your product hopefully have a higher conversion rate than people who are coming from a random Quora thread, and so how deep you want to go on that top of the funnel stuff is up to you, but I would expect any good team to know what their new cohorts look like coming in. That’s kind of the top of the funnel.
The next thing I would think about is things like, “What do your retention curves look like?”. The thing that separates the top one percent of companies from the rest is having good retention and so if a team wasn’t keeping track of their cohorts over time I’d be really concerned. And what does that look like? I expect to see mutually exclusive sign up cohorts trending over time and I look to see the size of those cohorts as well as how well they retain. And I look at how steep the initial drop off is in the initial part and then I look to see, “Are those retention curves flattening off?”.
And again I probably want to see those retention curves segmented by some meaningful metric. Maybe for a B to B company it’s the size of the company, and I might expect that smaller companies don’t retain as well as large companies based on whatever the dynamic is in your product. But that’s one of the key things that I would look at is, “How well are the cohorts of signed up users retaining in terms of using your product on a daily, weekly, or monthly level depending on whatever the natural cadence is for people interacting with your product?”.
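The cohort retention curves Dan describes can be sketched like this. The data and numbers are invented for illustration, assuming you know each user’s signup week and the set of weeks they were active:

```python
# Hypothetical data: users grouped into mutually exclusive signup-week cohorts,
# mapped to the set of week numbers each user was active in.
cohorts = {
    0: {"a": {0, 1, 2, 3}, "b": {0, 1}, "c": {0}},  # signed up in week 0
    1: {"d": {1, 2, 3}, "e": {1}},                  # signed up in week 1
}

def retention_curve(signup_week, users, max_offset):
    """Fraction of the cohort active k weeks after signup, for k = 0..max_offset."""
    curve = []
    for k in range(max_offset + 1):
        active = sum(1 for weeks in users.values() if signup_week + k in weeks)
        curve.append(active / len(users))
    return curve

for week, users in cohorts.items():
    print(f"cohort week {week}: {retention_curve(week, users, 2)}")
```

Plotting one curve per signup cohort gives you exactly the two things Dan looks for: how steep the initial drop-off is, and whether the curves flatten out or keep sliding toward zero.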
Matt: Got it. So you think about what the normal usage pattern is. So if you had something like a consumer app and you expect your users to open it every single day, you probably want to look at cohorts on a daily basis as opposed to a monthly basis; of course you’d probably want to have both, but one you look at all the time.
Dan: Yeah. I mean pick any of the meal companies that send you meals on whatever, on a monthly cadence or a weekly cadence. It’s very different from something like Messenger, where you’d be expecting to be chatting with your friends on a daily basis. This is another place where I would try and understand from your tools, one of the things that I look at is depth of engagement. Which is, in the past month, what percentage of your users have used it for one day? For five days? For ten days? For thirty days? And understanding what that natural frequency looks like.
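The depth-of-engagement view Dan mentions is a simple distribution over days active. A minimal sketch with invented data, assuming you can count each user’s active days in the past month:

```python
# Hypothetical data: days active in the past month, per user.
days_active = {"a": 1, "b": 1, "c": 5, "d": 12, "e": 28, "f": 3}

def depth_of_engagement(days_active, thresholds=(1, 5, 10, 30)):
    """Share of users active on at least N days in the past month, per threshold."""
    n = len(days_active)
    return {t: sum(1 for d in days_active.values() if d >= t) / n
            for t in thresholds}

print(depth_of_engagement(days_active))
```

Where the mass sits in this distribution tells you the natural cadence: if almost nobody clears ten days, a daily-active chart is probably the wrong lens for your product.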
And it depends, right? When I worked on B to B products, not everyone uses those products on the weekends, and so if you were to look at it daily you might be worried, or you may want to pull out the weekends. But for other products, maybe you’re a shave club. Maybe you want to ship razors on a monthly basis. So one analysis doesn’t always work for every company and that’s why I think you’ve got to go back to first principles to say, “What is my product, what’s the value proposition, how frequently am I delivering it for customers?”, and you need to adjust whatever your dashboards are, whatever your charts are, accordingly.
Matt: Yep, that makes a lot of sense. So one thing that I just want to touch on, you mentioned that the difference between companies that are paying attention to their retention … That really sets them apart. You have a blog post that kind of outlines the starkness of an example of different retention rates. Can you just touch on that with some numbers for the audience to kind of hold on to?
Dan: Yeah. So I’m sure everyone’s heard of some of these companies that have had these meteoric rises. There’s been a bunch of tweets from Paul Graham saying, “Oh this company is the fastest growing company to come out of Y Combinator,” and then literally a couple months later that company’s out of business. And the reason for that is because they were brand new, they were signing up lots of people, and they weren’t properly looking at their retention curves. And so they were providing incentives for people to sign up, get something for free basically or heavily subsidized, and then once the discounts ran out they never came back. And they scaled their user acquisition too quickly, they put way too much money into user acquisition and ultimately none of those people stuck around. And so all their charts looked fantastic initially because everything was up and to the right, and this is often the case with the retention problem, it’s called the silent killer for a reason.
What ends up happening is that none of those initial users end up retaining, and then you stop spending as much money on user acquisition and your daily active users, weekly active users, monthly active users, it all craters. And one of the things you can easily, easily do is model out what different retention rates look like, and it’s quite incredible the difference between the scenario where your retention curves flatten off versus if they go to zero. In the scenario where they flatten off, when you look at it from a weekly or monthly active user standpoint, your charts will continue to go up and to the right and your revenue charts look exponential.
If paying customers stick around indefinitely and you do a better job of converting those free users into paying customers over time and they ultimately keep upgrading for new offerings and new features you offer, your revenue curves look exponential up and to the right even if your weekly active user and your monthly active user charts don’t. And let’s say they’re just linear or pseudo flat, it’s mind boggling the difference that a couple of changes in your retention curves will manifest to the overall trajectory of your business.
Matt: And the thought is that you spend the time building these reports now, and then you look at them and you see this big big challenge, and then you go work on the challenge and a couple weeks later, a couple months later you come back, make new reports, and look at those. How do you think about ongoing maintenance of your analytics?
Dan: You need to constantly be asking whether these are the right events, because your product is going to change. You’re going to be releasing features that are total duds and you need to rip those out, and you’re going to be adding features that get a ton of adoption. And so you need to be constantly updating what constitutes active retention in your product. So ultimately the thing you care about and that I think you should be monitoring all the time … Like if you’re running a freemium product that hopefully keeps growing in terms of new people signing up, and you release new products you want to sell people on, especially for SaaS, you want your retention curves to flatten off, because if they don’t then it just becomes a game of how little you can acquire customers for and how much revenue you can extract out of them. Because if they’re only around for two years, or if they’re around for a year, it’s just a game between how much money you make off of them versus how much they cost.
Ultimately what’s going to happen is all of your acquisition channels are going to become … There’s going to become more and more competition on those acquisition channels so the LTV of those cohorts could potentially go down over time. And so that’s why it’s so key that your retention curves flatten off because those people stick around for a very long time if not indefinitely, and it’s not just a game to see how cost effectively you can acquire new cohorts and users.
Matt: Yeah I’m going to take a bit of a turn here outside of … and maybe this is the answer … Outside of maybe not paying attention to your retention curves and having analytics around that, is there another major pitfall that you see people make in regards to their product analytics or is that the number one thing?
Dan: To your point I think it’s really easy to take your eye off the ball, and there’s all sorts of reasons why your retention could suffer. A new competitor could come out with something that people switch to. I’m sure you’ve talked about this a million times, but there’s this balance between shipping really fast and having a reliable product, and I’ve seen it myself where teams stop looking at some of the metrics, their product quality slips because they’re pumping out a ton of new features, and people frankly get fed up with your product and they leave. And so you can see lots of different issues manifest themselves through your retention curves. You can see reduced quality, you can see your features becoming less competitive with competitors, and so I think that when people stop looking at these things, they’re ripe to be disrupted; they’re ripe to have someone else come around and eat their lunch. So frankly I think you always have to be looking at the sizes of your cohorts coming in, the conversion rate of those cohorts to healthy engaged users, the conversion rates of those users to paying customers, and then the retention of those paid and free customers over time. I think you’d be nuts to stop looking at those things.
Matt: Yeah I think that’s a really good point, especially as your company scales more and more. I think we started to cross this threshold where we didn’t have really good alerting. We had the reports and they were getting posted to this channel that people kind of stopped looking at every day, and then one day we looked at one of the charts and we were like, “Oh, that thing dropped five days ago, we probably should have known that”. And so there’s definitely a threshold where, one, you need to be looking at it every single day, and then if there is a point where you may stop looking at it every single day just because, like Dan was saying, you’re focused on new product launches or whatever it might be, whatever changes, then you should probably make the immediate leap to add some sort of triggered alerts before saying, “Oh we don’t need to look at this thing,” or “I’m too busy right now,” because that’s when you really really get yourself in a bad spot like you were saying.
All right, I have one last question here. Who on your growth or product team should own analytics? Is there an owner of analytics? Actually I’m going to ask one more question after this, so two more questions, but who is the owner of analytics?
Dan: This is such a BS answer, but as with all things, it depends. I think it comes back to what kind of organization you have and the roles and who’s responsible for what. So I could easily construct two different variations where there’s a growth team that owns it and then there’s a product team that owns it. There’s this constant tension and pull between teams that are making optimizations to hit a business objective, let’s say increase a conversion rate or increase engagement, and then teams that are building very tech heavy features. And there’s this constant push and pull between those types of teams, and so it really depends who’s in the best place to understand the reasoning for why you need this tooling, who’s going to use it, and what the output is going to be from all these reports.
And so I think there are circumstances where it makes sense for a growth team to own it and then there are circumstances where it makes sense for the product team to own it. And frankly I think it’s fine for it to bounce back and forth between teams as the organization evolves and as the teams evolve. I know that’s kind of a BS answer but it really does depend. I don’t think there’s a silver bullet answer for all organizations out there.
Matt: All right, and then I’m going to ask you one last one which might have a very similar answer. How do you get your organization to become data driven in the first place? So if I am listening to this and I am looking at my team or my company and I’m thinking, “Man, we don’t have any analytics, there’s like no reporting here, and no one’s really paying attention to the data and we’re all running off gut,” how do you move forward from here? Do I put this on my shoulders, as in I become the Data Dan of my company and I go teach everyone the new ways? Or is it a thing that will kind of happen naturally as a result of a lot of painful things that go on, where things are missed here and there? How do you get the organization around, “All right, it’s time to get our analytics in a good spot”?
Dan: This is such a hard question.
Matt: That’s why I saved it for last.
Dan: Yeah I know, it’s such a good one because technology is easy, but the hardest thing to do is to change human behavior, it’s to change how people act, right? And so I had this same experience where I was trying to change the culture and get lots of people to adopt data in making decisions and I think I … I expected there to be a whole bunch of change overnight. I thought I would put out a couple of analyses, everyone would see the light and they’d all come running. I had a team of analysts and data scientists that we got to hire, and we would hold office hours every single week and we’d publicize them, we’d let everyone know, we had email reminders, we had a shared Google calendar, all this stuff to get people to come ask questions, and the office hours were always a ghost town. No one ever came and it was super frustrating for me and the entire team because people had questions but they weren’t coming and asking them.
Dan: Again, I don’t think there’s any silver bullet way of making your organization flip on a dime and in some way become data … To use data to help make decisions, along with all of the other inputs you have whether it’s qualitative research or feedback, business feedback, any of that stuff. The thing that I would do is I would identify people that are advocating for initiatives, that are asking for resources and are looking for help in getting data to justify those initiatives. I would look for ambassadors and I would try and make them successful. I would help them run the analysis, I would help them get the data that they need, I would help them be successful and then everyone is going to want to model their behavior after the people that are successful in your organization.
So if you can point and say, “This person pitched initiatives A, B, and C, I helped them do it, and they got a whole bunch of resources to go do that, they did that with some of my help”. You’re going to get way more people coming to you with questions and asking for advice and asking to run analyses if they know that that is a path to being successful at their organization. I think that’s probably one of the things I wish people had told me: “Seek these ambassadors out, help them be successful, and you’re only going to create more of them,” rather than try and teach everybody everything at once, which I think is impossible.
Matt: Yeah, that is quite a task. All right, I think this is actually the longest #Growth episode so far on the podcast, and I think it was well worth it; we covered some really amazing stuff. So Dan, thank you again, really do appreciate it. Any parting words for the audience? Where can people go to learn more about this stuff?
Dan: Sure. As I said before, the company I work for is called Reforge, we offer a lot of programs and also free content for anyone who’s interested in taking a first principled approach to try and improve their product and their business so if you’re interested in more of an academic approach to this stuff Reforge is definitely a great option.
Matt: Wonderful. Well Dan, thank you again. I hear your family in the background, your baby making some noises, so I think it might be time to call an end to this. But thank you again so much, for everyone listening, really appreciate you tuning in. If you’ve got any feedback, any suggested topics, suggested speakers, whatever it might be, shoot me an email at email@example.com and in typical Seeking Wisdom fashion, you know, six stars only. And Dan, thanks, and we’ll catch you on the next episode.
Dan: Thanks so much man, see you.
Matt: See you.