Ep 33: How does data drive growth in practice? (w/ Abhi Sivasailam)
Demystifying what data and growth work looks like in practice, from a practitioner who's led both functions at multiple companies.
Abhi is a growth and data leader, and an excellent Twitter follow. Most recently, he was Head of Growth and Analytics at Flexport, where he helped the company grow 10x over the past 3 years. Previously, Abhi led growth and data teams at Keap, Hustle, and Honeybook.
In this conversation with Tristan and Julia, Abhi explains his methodology for setting up a new growth data organization, and how you might be falling victim to the dreaded "arbitrary uniqueness" bug.
Key points from Abhi in this episode:
Is every analysis unique? Are companies unique?
There's one big theme for me. As probably a lot of folks out there know, I'm on sabbatical for the rest of the year, trying to be helpful by advising and peeking at a lot of companies.
The theme that looms largest in my mind, the word that I find myself repeating a hundred times a week, is "arbitrary," or the phrase "arbitrary uniqueness." And I see this arbitrary uniqueness everywhere, right? And the reality is most companies are not snowflakes, even though those companies would like to think that.
And most data teams are not really snowflakes in terms of the kind of problems we need to solve, even if we'd like to think that. And I think one of the things that we need to see a lot more of is standardization. Standardization of the analysis, standardization of the metrics, standardization of the data.
And I think we'll get there. But yeah, I see this arbitrary uniqueness in a lot of places. And I think there's a natural reaction of, hey, that's a little too reductive, right? And yeah, it is reductive, but it's reductive at the 80/20 point: sure, it's reductive, but it also explains 80% of the variance, right?
And then there's that long tail, the 20%. Of course, you're a snowflake. Of course, you're special, your metrics are special, your data is special, and the analysis is special. But if we get to that 80% point, we've unlocked most of the value, right?
How do you work with growth teams to actually move the needle for the business?
Yeah. Recapping the story a little bit, this was a company several years ago. I don't mind sharing the company; it's called Keap today. It was Infusionsoft at the time. So Infusionsoft, maybe Tristan knows, was a very large MarTech company at one point, right?
A hundred-million-plus ARR, 40,000 customers. Fairly large! People don't talk about them a lot for a variety of reasons.
It was a big deal for a long time but growth was slowing.
And I was brought in as part of a wave of leaders to help do this turnaround. In the month that I joined, with a hundred-plus million in ARR and 40,000 customers, we added one net new SMB customer, right? Which is shocking. So growth had really hit a plateau. And we brought in great operators. I ran growth and data.
We brought in great leaders for marketing, sales, and customer success. And we really tried to turn this company around operationally. And we succeeded. So 12 months later, we went from one net new acquisition to 700-plus net new monthly, and revenue growth was even better than that.
So things were going great. But 12 months after that, the company was losing 200 to 300 customers monthly. It hadn't just hit a plateau; it had gone off the plateau. And the real root cause, the kernel of it, was that we no longer had product-market fit in a good market.
As Tristan said, at one point, very competitive with HubSpot. This company was founded in 2001. First dot com boom. We were sitting here in the mid-2010s, and the world had changed, customers had changed, and we were no longer skating to where the puck was going. And again, this was a company that was executing really well, right?
It was firing on all cylinders; all the functions were performing. The data teams were extremely integrated into the company. Growth teams had lots of latitude and support. But all of that exceptional quality and focus was oriented towards the here and now, where the puck is. And we ended up in a local maximum as a company, right?
Basically due to strategic mistakes. This was a bummer. So this was really impactful to me. And I spent a lot of time reflecting on: how did this happen? What could growth and data have done? What should they do moving forward? Even when I was there, thinking about how we should orient and structure growth and data to help, I came up with answers that fall into two buckets:
how do I orient these teams, and then how do I integrate them into the rest of the company? And Tristan, I think this will speak to how attached at the hip growth and data are.
Yeah, as I said, I think these learnings kind of bubble down into how I orient the teams and how I integrate them. And I think what you are addressing is the latter: how do I integrate them? At least, that's how I see it. And a big part of how you solve these strategic misses is to get data to be involved in, or own, the strategic planning process. That is the biggest thing. I can talk a little bit about how this happened at Flexport, because we did this pretty well.
Let's zoom in on tech at Flexport, which was the organization that I was a part of. So when companies do annual planning, there's the typical planning cascade, right? Mission, visions, strategy, roadmap, tactics. The company does its mission, vision, strategy, and then net cascades down into various functional groups.
Zooming into tech: the integration of data into the annual planning process for tech is absolutely essential to ensuring that we're continuing to be pointed at the right kind of big bets. So what that looks like at Flexport is that at the outset, when the CTO is coming up with the tech strategy, data is in the room.
And that's because we recognize that strategy will be blind without data. The classic example that I've seen at multiple companies now, where if data wasn't in the room the company would've made strategic mistakes, is around data network effects. Executives love to talk about data network effects.
Every time an executive says we should do X because of data network effects, somewhere, a kitten dies. Data people's job is to prevent that. Flexport could have done plenty of things to orient its technology strategy around data network effects.
But there are fundamental questions. Do we have data network effects? Do they matter? Where do we have them? If you don't really have clarity on this, but your strategy is, say, data network effects, you're screwed. And this is just one example. Really, any strategy, any strategic narrative an organization can align around, needs at the point of strategic inception to be rooted in, and validated by, data folks. So that's where we start, right? Analysts and scientists are there with leadership to define the strategy. Then, analysts set the metrics. In this case, the CTO didn't set the metrics; the CTO directs, but doesn't set, the metrics.
Analysts, product folks, and product analysts set the metrics that represent the strategy. They set the North Star metrics. They also lead the translation of those metrics into financial value. So at Flexport in the last planning cycle, we had three North Star metrics, and the product analysts set them and said: this one has X financial value, this one Y, this one Z. They then take those metrics down to the PMs and the PM teams, where the product teams now need to orient around these North Star metrics. And the product teams need to come up with their own output metrics and their own initiatives.
And who defines the output metrics on the PM teams? It's the product analysts. So product analysts actually define the metrics, and this caused plenty of consternation, because the PMs disagreed half the time with the metrics that the product analysts wanted. But the product analysts, working in partnership with the PMs and with shared context, are accountable for defining those metrics and how they relate to the North Star. So then the PMs go and plan and say: great, we have our output metrics on the product team; here they are, and here are my five initiatives. Then product analysts go in, take the initiatives and their impact on the output metrics, and they're the ones that score the impact on the tech-wide metrics, right?
They're the ones that say: okay, you are going to move this lever 5%; I have determined that will impact the North Star metric by 20%. And then the product analysts are the ones sitting in the room to draw the cut lines, right? Now, this is very uncommon. But I think it's essential, and it's an orientation I wouldn't have landed on if not for experiences like the one at Keap, right?
Because this is how you get data to represent the perspective that prevents strategic plateauing, where you end up in a local maximum. I can pause there.
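The scoring mechanics Abhi describes, where analysts translate output-metric lifts into North Star impact and then draw a cut line over the ranked initiatives, can be sketched in a few lines. This is purely a hypothetical illustration: the metric names, lifts, and elasticities below are invented for the example, not Flexport's actual model.

```python
from dataclasses import dataclass


@dataclass
class Initiative:
    name: str
    output_metric: str
    expected_lift: float  # fractional lift the PM team expects on its output metric


# Assumed elasticities: how much a 1-unit relative move in an output
# metric moves the North Star metric. These numbers are illustrative.
ELASTICITY = {
    "activation_rate": 0.6,
    "weekly_active_teams": 0.3,
    "time_to_first_value": 0.2,
}


def north_star_impact(i: Initiative) -> float:
    """Translate an output-metric lift into an estimated North Star lift."""
    return i.expected_lift * ELASTICITY[i.output_metric]


initiatives = [
    Initiative("Onboarding revamp", "activation_rate", 0.05),
    Initiative("Usage digests", "weekly_active_teams", 0.08),
    Initiative("Setup wizard", "time_to_first_value", 0.10),
]

# Rank initiatives by estimated North Star impact; the planning "cut line"
# is then drawn over this ranked list.
ranked = sorted(initiatives, key=north_star_impact, reverse=True)
for i in ranked:
    print(f"{i.name}: estimated North Star lift {north_star_impact(i):.3f}")
```

The point of the sketch is the translation step: PM teams propose lifts on their own metrics, but the analyst-owned elasticities are what make those lifts comparable company-wide.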
What does growth mean? What are its different flavors? And what is so special about marrying growth teams and data teams together?
Sure. There's a lot there. First of all, growth is a highly nebulous term. You ask five people, whether growth people or people in adjacent functions, what growth is, and you'll get 10 different answers. It's the one job family that's more nebulous than data science.
But to me, there are three types of growth teams that I've managed that, back to the 80/20, really represent most of the work that growth people do.
So the way I think about them is: there are funnel-hacking teams, there are PLG teams, and there are blue-sky-bets teams. A funnel-hacking team is a team that is absolutely founded on experimentation, right? And not in the abstract, aspirational "we want to be data-driven" sense, but in an intrinsic sense, right?
This is how the team works. It is founded on experimentation. It's founded on build-measure-learn loops. For these teams, the definition of growth is really the scientific method applied to KPIs, right? You don't have a roadmap. You have the objective to put points on the scoreboard relative to funnel metrics and to generate learnings, and that's what you're doing, right?
So that's that first bucket. Oh, and I should say about that first bucket: I've seen it work as a charter team, that is, here are your metrics; go off and just experiment, generate learnings, and work in these build-measure-learn loops as long as you hit your outcomes.
I've also seen these work as Center of Excellence teams, where they're centralized and they'll go out to different pockets of the enterprise. And I've managed both as well. The second bucket is PLG, right? So PLG stands for product-led growth, mostly common in SaaS, very hot these days.
I think it's starting to be rebranded as product-led sales. But to put it simply, it means that you are helping the product sell. So that usually looks like taking your product and externalizing parts of it to create value before you capture it.
Create value for prospective customers that help turn them into customers. And in practice, that too requires lots of full-funnel thinking and experimentation. And then the third is basically blue sky teams, which can look like grow-at-all-costs teams or look like long-term skunkworks, right?
Here's a new direction that we would like to go. It's going to require a lot of experimentation as well. This is very different from the KTLO, the core of the business. But we want to invest in it anyway for frontier work, and we're going to get a cross-functional pod of folks together to move fast, break things, and experiment.
So those are the three types of growth teams that I've seen, that I've managed, and that's how I think about them when I run these end to end. So what I typically do is run end-to-end growth and data teams. Data full stack, right? So data science, analytics, engineering, platform, all that fun stuff, as well as growth teams.
Most recently, I had all those data teams, but I also had the growth side, growth marketing and growth product, and I started each of those teams, right? And I orient them to work as closely together as feasible. And I've done this now at several companies. So: shared rituals wherever possible, shared training, shared objectives, shared resources, and people that can move from place to place. And that's a sort of unusual model, right?
I've never really heard of many companies having this mix of teams. And I think the last thing you'd asked was, what's behind this? And it's simply that when I got into tech, data science was the sexiest job of the century, and then people started talking about growth, and I couldn't have that.
I couldn't have the second sexiest job, so I had to bring it in. But I get this question a lot. One, I think there are just a lot of synergies. I like to say growth gives data context, and data gives growth focus. Growth and growth mandates are basically a way to give data and analytics teams a telos, right?
It gives them a point; it gives them a perspective. And meanwhile, data teams help growth really zoom in on their efforts, right? Because growth teams could do a million different things, experimenting and ideating in a bunch of different directions, and data, these data tools, help growth teams focus, right?
Relatedly, I think data people make the best growth people. And that's another great reason to keep these teams very close together. These build-measure-learn loops start with ideation, and one of the ways I describe ideation, which sits at the top of the funnel for experimentation, is that you're wandering around a dark cave with a dim flashlight.
And the most important thing here, the most important first-order concern, is where do you point your flashlight next? And data people, and I think UX people, UX researchers, are really good at that. That first-order concern of if I'm going to ideate in the first place, where should I point?
Where should I point the resources of this group? And if you don't treat that as a first-order concern, if it's a second-order concern, your ideation funnel will not be very strong, right? At least in my experience. And the third, and I think this is what we'll spend maybe a lot of the rest of our time talking about, is that I think growth and data teams are better at what they do when they're tightly aligned on some set of key projects, some kind of shared orientation.
And that has been clear to me, as I think I've chatted with both of you about, from prior experiences at companies that struggled to grow despite growth doing the right thing, data doing the right thing, each function doing the right thing, but not really being pointed in the same direction.
So anyway, that was a lot. So let me pause there before we go further.
What is it that you've done or that you tell people to do so that they can get to that right level of seeing the forest for the trees and make these really strategic company-level bets?
I talked about lessons from Keap as how you integrate these data and growth teams and how you orient them. Going back to how you orient them: when I take over these teams, the very first thing I point them to is two projects. These are the North Star animating projects.
The first one is what I call a dashboard tree, and the second one is what I call levers. A dashboard tree is really just a bounded, cascading set of dashboards that express the fundamental formula of the business: how does this business actually work?
When I came into Flexport, we had 2,700 Periscope dashboards for a company of 750 people, right? Actually, it was probably more than 2,700. The first thing we orient towards is: hey, let's distill all this down into 50. These 50 are bounded at 50, and they're cascading.
So they have a structure that represents how the business works. There's an L0: this is how the company works, writ large. Zooming in, there are L1s, L2s, L3s, and L4s. And it's a tree; we decompose it just like a driver tree, but applied to dashboards.
And then in each of those, for that component, we express that fundamental formula of the business. And this is all hands: growth people are doing this, data people are doing this, and this is a North Star thing that they are all oriented towards. And growth is involved because they're either the SMEs, or they're going to drive the context collection from the SMEs on what that fundamental formula looks like for a function.
And of course, it's going to be, in turn, hugely valuable for growth, right? Because it then gives them a picture of how the business works. It takes real resources to focus on: hey, if I were to distill this company down into this kind of master driver tree, what metrics actually make sense?
How do people think about all of their tacit context? How do they think about what the fundamental formula of the business is today? Doing that makes this job a lot easier. Coming back to your question, Julia: if you haven't done that legwork and you're reinventing it every planning cycle, it's always going to be a hard time.
It's always going to be a scramble. It's always going to crash. People that do this exercise realize, first of all, that there's a lot of arbitrary uniqueness here, right? It tends to be very similar company to company. And two, the relationships between the drivers may not be durable, but the drivers themselves are pretty durable, right?
If you really think about what the drivers are, you find that there are very similar sets of drivers that you don't need to decompose all that much unless the business radically changes. I think a key part of this is to do that legwork, right?
If you can orient growth and data towards that as a shared project, then you've established that groundwork for a shared sense of reality for all the executives to plan on top of and all the data folks to drive metrics from.
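The bounded, cascading structure described above can be sketched as a tiny data model: an L0 dashboard at the root, deeper levels zooming in, and a hard cap that forces distillation. The dashboard names, levels, and the 50-dashboard cap below are illustrative assumptions, not Flexport's actual tree.

```python
from dataclasses import dataclass, field


@dataclass
class Dashboard:
    name: str
    level: int  # 0 = whole-company view; higher levels zoom further in
    children: list["Dashboard"] = field(default_factory=list)

    def add(self, child: "Dashboard") -> "Dashboard":
        """Attach a child dashboard and return it, so trees build fluently."""
        self.children.append(child)
        return child

    def count(self) -> int:
        """Total dashboards in this subtree, used to enforce the bound."""
        return 1 + sum(c.count() for c in self.children)


MAX_DASHBOARDS = 50  # the bound that forces distillation instead of sprawl

# A toy cascade: L0 company view, L1 functional views, L2 zoom-ins.
l0 = Dashboard("Company overview", 0)
growth = l0.add(Dashboard("Acquisition funnel", 1))
growth.add(Dashboard("Paid channels", 2))
growth.add(Dashboard("Organic / PLG", 2))
l0.add(Dashboard("Retention & expansion", 1))

# Keeping the whole tree under the cap is the point of the exercise.
assert l0.count() <= MAX_DASHBOARDS
```

The cap is doing the real work here: without a bound, a dashboard tree degrades back into the 2,700-dashboard sprawl the exercise is meant to replace.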