On Measuring: Company Metrics, Team KPIs, and OKRs
Featured guest author: Erica Louie! Making a case for OKRs and the importance of Team KPIs.
Happy belated Thanksgiving!
Have you thanked your data team (or data tools) this holiday season?
I hope you’re all feeling uncomfortably stuffed, somewhat relaxed, and optimistically ready for the impending slew of ambitious initiatives awaiting over the horizon. In this roundup, we’re going to discuss, and make a case for, every jaded IC’s sworn enemy: OKRs.
‘Tis the season for fiscal year planning.
As most teams buckle down for a quieter Q4 and the holiday break, data and finance teams are frequently scrambling: setting fiscal year budget plans, reworking how to measure the company’s performance via Company OKRs, preparing for planned initiatives,1 and balancing the big question of “How do we measure whether a team is performing well?”
Given the macroeconomic climate,2 questions about performance and impact will only become more frequent and more important. While “Core Company Metrics” are not a controversial topic, OKRs (Objectives and Key Results) are usually met with an internal (or, for the bold, external) eye roll, a hand wave, a deep frown or scowl. Some employees will say OKRs do not apply to their teams; ask random employees to recite the OKRs and most couldn’t list all of them, some not even one.
I, too, used to be a non-believer, a doubter, a “hater” of OKRs. However, after working closely with my sworn enemy for the past two years while constantly iterating on our process, I have grown quite fond of them. Now, I could not imagine not having OKRs. The principle of OKRs, in theory, makes sense: they address areas of the business that require intentional improvement and constant cross-functional conversations, in which all executives synchronously meet and discuss every. single. month.3
But, more than we care to admit, the process behind and execution of OKRs are riddled with miscommunication, lack of vision, and unclear connections between the KRs (key results, i.e. metrics) and an IC’s day-to-day work. In theory, we dream of a utopia where OKRs stimulate cross-team collaboration, clearly measure performance improvements, and surface actionable insights across the organization. In practice, the executive team sets the objectives and data teams scramble to define the key results; teams struggle to connect their initiatives and team-level OKRs directly to a company OKR while data teams scramble to measure each team’s OKR; employees are disconnected from company OKRs and become disoriented by ever-changing roadmaps.4
OKRs aren’t the problem. Rather, it’s the confusing, interwoven maze of how we choose to relate these metrics to the entire company’s day-to-day. What if we could get a step closer to this utopia, and what if the path to such a seemingly lofty goal were much simpler and more scalable than we think?
Defining + Interconnection
Imagine three main types of metrics: Core Company metrics, Team KPIs, and OKRs.
Core Company metrics
Proposed change cadence: annual
Purpose: Reflective of the company’s health and performance
“Is our business healthy and growing while adhering to our mission and values?”
Team KPIs
Proposed change cadence: annual
Purpose: Reflective of the team’s performance, ties back to company metric(s), limited to 1-3 KPIs per team.
“Is our team contributing to the growth and health of the greater company?”
OKRs
Proposed change cadence: biannual
Purpose: Highlights the areas of the business we wish to improve for the year
“Are we on-track towards achieving the key focus areas we set for the year?”
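To make the linkage between the three tiers concrete, here’s a minimal sketch in code. All of the team, KPI, and metric names are borrowed from the hypothetical scenario later in this post; this is an illustration, not a prescribed schema.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class TeamKPI:
    team: str
    name: str
    company_metric: str  # the Core Company metric this KPI rolls up to

@dataclass(frozen=True)
class KeyResult:
    objective: str
    name: str
    company_metric: str  # the Core Company metric this KR touches

# Illustrative team KPIs, each tied to exactly one Core Company metric.
KPIS = [
    TeamKPI("Acquisition", "Monthly Account Sign-ups", "Monthly Active Accounts"),
    TeamKPI("Support", "CSAT", "NPS"),
    TeamKPI("Customer Success", "Customer Health Score", "NPS"),
]

def teams_behind_kr(kr, kpis=KPIS):
    """Teams whose KPIs feed the same Core Company metric the KR touches."""
    return {k.team for k in kpis if k.company_metric == kr.company_metric}

# When the objective shifts to customer experience (NPS), the teams that
# already hold NPS-linked KPIs surface automatically:
h2_kr = KeyResult("O1", "Improve customer experience scores", "NPS")
assert teams_behind_kr(h2_kr) == {"Support", "Customer Success"}
```

The useful property is that OKRs never link to teams directly: a changed KR only names a Core Company metric, and the teams responsible fall out of the existing KPI-to-metric mapping.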
Team MBRs: Company metrics ←→ Team KPIs
Core Company metrics and Team KPIs are stable, focused measures: they change only once per year, so teams know their KPIs, their goals, and the inputs behind them. Teams will also know which Core Company metric their KPI contributes to.
During MBRs (i.e. monthly business reviews, such as a Marketing All-Hands, where business functions or the executive team discuss where they landed against their monthly goals), teams will focus on discussing their delivered initiatives and whether those initiatives moved their KPI, which in turn ties into whether they contributed to moving a company metric. This also sets up a framework for giving initiatives a set purpose and goal. For example: “We believe that once we deliver XYZ initiative, it’ll work towards moving our KPI.”
Leadership members will note how their teams performed, what metrics they were able to move, and the upcoming initiatives that are hypothesized to move their KPI and thus the company metric it’s linked to. They will then bring these learnings and context back to the company monthly business reviews, where the executive team will review them alongside the OKRs and core company metrics.
Company MBRs: OKRs ←→ Company metrics ←→ Team KPIs
Naturally, OKRs should link to one or more core company metrics, whether through the KR itself (e.g. increasing active accounts) or through a segmented piece of it (e.g. “Improve conversion rates,” which will move Monthly Active Accounts). While objectives are typically owned by an executive leader’s business function, other leadership members can add context.
Example + Scenario:
In H1 O1, we had KR 1.0: Improve retention rate to X% MoM
This KR ties back to Monthly Active Accounts because we know improved retention directly increases the number of active accounts.
During the Marketing All-Hands, the Acquisition team would discuss how their Monthly Account Sign-ups KPI was moved after launching an improvement to the onboarding flow.
Meanwhile, at the Customer Success All-Hands, teams are discussing CSAT and NPS throughout H1. During these monthly meetings, Support can discuss how their initiatives are working to improve or maintain CSAT, which could then affect NPS scores.
For the Company Monthly Business Reviews, the CPO would report on this metric but with some input from the CMO on how the Acquisition team’s initiatives might have helped improve this metric.
Let’s say in H2, the company changes the objective and now the KR is “Improve customer experience scores to XX by EOY.”
This shift means rather than touching the Monthly Active Accounts company metric, we are focusing on the other core company metric: NPS score.
Good news! We already have prior art on NPS scores from Customer Success. And other teams whose KPIs contribute to moving the NPS score already have conversations regarding this metric from their previous Team MBRs.
For the Marketing All-Hands, the Acquisition team is still discussing their Monthly Sign-ups KPI. It’s still important, and their initiatives still move forward, because their KPI ties to the company metric Monthly Active Accounts.
For the Customer Success All-Hands, teams dive a little deeper into NPS score changes and the initiatives around improving them (e.g. Customer Health Scoring). The MBR conversation stays largely the same since the Team KPIs have not changed, but the initiatives to move them will be noted by the CCO.
For the Company-Level MBR, conversations around Objective 1 could be led by the CCO rather than the CPO.
Ultimate Goal
The beauty of this framework is its simplicity: teams get a narrower focus, while the top-level OKRs retain the flexibility to change.
Clear cross-functional connections: We’ll naturally know which team KPIs are cross-functional (e.g. a product team’s KPI may intersect with a customer success team’s KPI because both touch the same core company metric). This also builds a natural framework for working across teams when planning and launching initiatives.
Direct impact of initiatives: Teams can focus on their KPIs, the company metrics they tie back to, and potentially a KR. This also gives ICs a clearer path of impact and value to the entire organization. Monthly business function meetings (e.g. Marketing All-Hands) can focus on current initiatives, with clear signals of whether delivered initiatives had impact (i.e. whether they moved the KPIs).
Flexibility of changing OKRs: As in the example above, if the executive team decides to change an objective in the second half of the year, this framework accommodates it. Maybe the market is changing, or an area of the business needs more focus. Regardless, the changed KRs will naturally touch a core company metric, and teams whose KPIs touch that metric will already have a history of conversations and records of their delivered initiatives. No extra work is needed, just a shift in where the company focuses the conversation.
Leading indicators: If we see a group of connected, cross-functional KPIs moving up or down together (or inversely), these could become leading indicators toward building a company flywheel, or potential levers that data teams can dig into.
Less load on data resources: Meanwhile, data teams’ jobs will be significantly easier during Q4 because their main focus will be defining and measuring the OKRs for the following fiscal year. And even when initiatives change mid-year, there is a narrower set of metrics to focus on moving. Rather than brainstorming and building a laundry list of metrics per initiative, data partners can simply ask: “How will this initiative move the inputs into your KPIs?” With this simpler framework, data teams can finally get 7 hours of sleep, eat 3 meals a day, and enjoy their holiday break.
Other morning {{ warm_beverage }} readings ☕
Below, I’ve chosen a few recent blog posts as parallel readings to the soapbox session above. Enjoy!
While we’re on the topic of metrics, Robert Yi points out the tradeoff between depth and accessibility with the Semantic Layer.
Meanwhile, as data teams attempt to build out Team KPIs and assist with measuring their initiatives, Mary MacCarthy urges data teams to break out of their bubbles. If data teams want to help serve their business stakeholders, it’s just as important to understand their tech stacks, how they measure success, and the overall business context.
Between defining team KPIs and implementing metrics, what does good data quality practice mean to your team? Emilie Schario started a series called “Understanding your Data,” and in this post she writes about building a development process that works.
Just as our business counterparts have weekly dashboards on their Team KPIs, data teams should also have a weekly dashboard on their operational metrics (e.g. platform/vendor usage and spend, uptime/downtime via pipeline errors, etc.). The data team at dbt Labs is spending the first week of the New Year heads-down (“New Year, New Us” week) on performance tuning, project optimizations, and process revamps. I remember sitting with Niall Woodward discussing Snowflake performance tuning, so I was very stoked to see he wrote a blog post on Faster Queries with Clustering.
Most of which will change and fluctuate, no matter how much preparation goes into it.
This should become a game. How many times have you heard “Given the macro…” and how many of us know what this specifically calls to? And how many of us are too scared to ask at this point?
An interesting question to ask when setting OKRs is “Do we want to discuss this topic and these metrics every single month?” And will the outcomes of those conversations hold any actionable value?