
FOR A 10-PERSON COMPANY

Most 10-person companies use "data-driven" the way they use "innovative": as decoration. Here is what the practical version actually looks like, with named research and a concrete starting list.

By sixtynine.digital

The phrase "data-driven" was designed for companies with data teams. Most 10-person companies have no data team, no data warehouse, and no analyst to lean on. They still need to be data-driven. The version that fits a 10-person business is much smaller, and much more useful, than the buzzword suggests.

What we'll argue here

  • "Data-driven" at enterprise scale (BI stacks, warehouses, analyst teams) is the wrong reference point for a 10-person company. McKinsey research shows data-driven companies are 23x more likely to acquire customers and 19x more likely to be profitable (McKinsey, 2022), but the playbook behind that result was built for 10,000-person organisations.

  • Practical data-driven for a small team is three habits, not a stack: knowing your core numbers without opening a spreadsheet, removing arguments about which number is right, and writing down what you expected before you find out.

  • Most small companies waste their first year of "data effort" on dashboards nobody opens twice. Gartner estimates that, on average, 52% of organisational data is "dark data": collected, stored, never used (Gartner).

  • The tooling decision is downstream of the question of which decisions actually matter. Get the question right first.

Why "data-driven" was designed for someone else

McKinsey has been writing about "the data-driven enterprise of 2025" for the better part of a decade (McKinsey, 2022). The phrase, and the playbook behind it, was built for organisations large enough to staff a data function. The advice assumes a data warehouse, a BI platform, an analyst (or three), and a backlog of reporting requests. None of those things exist at 10 people. The buzzword scaled down without being redesigned.

That mismatch creates a strange dynamic. Small-company founders read the same enterprise content their board members read, then attempt to compress a 10,000-person playbook into 10 people. The compression usually shows up as one of two failure modes. The first is theatre: a dashboard tool gets bought, three dashboards get configured, nobody opens them after week two, and the company keeps running on Slack messages and gut feel. The second is paralysis: the founder reads enough about "data maturity" to feel behind, and stops making good gut decisions because they don't feel "rigorous" enough.

Neither one is data-driven. Both are reactions to a phrase that wasn't built for the scale.

The smaller you are, the more your "data infrastructure" is just a few people who know what they're looking at. That is not a deficiency. It is the actual shape of the problem.

What does "data-driven" actually look like at 10 people?

At small scale, data-driven is three habits. Knowing your core numbers without opening a spreadsheet. Removing the small daily arguments about which number is right. Writing down what you expected before you find out the answer. None of those require a tool. All of them require attention.

Habit one: know your numbers cold. A 10-person company has somewhere between five and twelve numbers that genuinely matter. Cash on hand. Burn rate. Active customers. Revenue this month, last month, and the same month a year ago. New customer acquisition cost. Gross margin per main product or service. If the founder cannot say those numbers out loud without checking, the company is not data-driven, regardless of how many dashboards exist. McKinsey reports that around 65% of organisations expect to transition from intuition-based to data-driven decision-making by 2026 (McKinsey, 2022). Most small companies will overshoot in the wrong direction: they will buy tools before they have memorised the numbers those tools are supposed to surface.

Habit two: agree on which number is right. A small team can usually have whatever numbers it wants. What it cannot afford is two versions of the same number. Marketing's "leads this month" and sales's "leads this month" do not match. Finance's revenue and the founder's mental revenue do not match. The cost of resolving these discrepancies live, every week, in standups, is enormous. According to a 2025 IBM Institute for Business Value report, more than a quarter of organisations estimate they lose over $5 million annually to poor data quality (IBM, 2025). The 10-person version of that loss is smaller in absolute dollars and bigger in proportion of energy.

Habit three: write down what you expected. Before launching a campaign, hiring a salesperson, or changing a price, write a sentence: "We expect this to do X by Y date." Two months later, compare. This is the cheapest analytics you will ever do, and it is the only one that compounds. It also exposes the difference between people who make calibrated decisions and people who don't, which is information you cannot get any other way.
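The expectation log above can be as plain as a shared document, but the mechanics are worth making concrete. Here is a minimal sketch in Python; the names and the example entry are illustrative, not a prescribed format:

```python
from dataclasses import dataclass
from datetime import date
from typing import List, Optional

@dataclass
class Expectation:
    decision: str        # what we are about to do
    prediction: str      # "We expect this to do X by Y date"
    due: date            # when to check
    actual: Optional[str] = None  # recorded after the fact

log: List[Expectation] = []

def expect(decision: str, prediction: str, due: date) -> Expectation:
    """Write the expectation down before acting."""
    e = Expectation(decision, prediction, due)
    log.append(e)
    return e

def overdue(today: date) -> List[Expectation]:
    """Past-due expectations with no recorded actual: time to compare."""
    return [e for e in log if e.due <= today and e.actual is None]

# Usage: log before launching, review after the date passes.
e = expect("Hire second salesperson",
           "Qualified meetings double to 10/week by end of Q2",
           date(2025, 6, 30))
```

The only discipline the sketch enforces is the one that matters: the prediction is timestamped before the outcome is known, so nobody can retrofit what they "always expected".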

The three habits together do not require Tableau, Looker, Power BI, Snowflake, dbt, or anything ending in "ware". They require somebody on the team to care enough to be the keeper of the numbers, and a founder who is willing to repeat the same five figures every week until everyone else can repeat them too.

Where most 10-person companies waste their data effort

Most small-company data waste is not technical. It is decisional. Gartner research finds that, on average, around 52% of organisational data is "dark data": collected and stored, but never used for analytics, decisions, or anything else (Gartner). At 10 people, the dark proportion tends to be even higher, because the cost of building and maintaining reports is borne by the same people who would use them, and the maintenance debt usually wins.

The waste shows up in four predictable patterns.

The first is buying a BI tool before deciding which decisions matter. Power BI, Looker Studio, and a dozen Notion-flavoured analytics products are now cheap or free at small-company volumes. The marginal cost of buying the tool is zero. The marginal cost of integrating, configuring, and maintaining it is not. According to Forrester's 2025 BI research, simpler tools see roughly 40% higher user adoption than heavier alternatives in organisations under 1,000 employees, which tells you something about the gap between "we own a BI tool" and "anyone uses it". The same pattern is worse at 10 people.

The second is confusing reporting with decision-making. A weekly report is not a weekly decision. If the report doesn't change what the team does next week, it is just commentary.

The third is mistaking dashboards for understanding. A dashboard that nobody can explain in one sentence is not a dashboard. It's wallpaper.

The fourth is over-instrumenting product analytics before the product has stable usage. Mixpanel, Amplitude, PostHog, and similar tools are excellent. They are also expensive in time before you have enough users to make the data signal louder than the noise.

The waste is not in the tools. The waste is in the decision to start with the tools.

What should a 10-person company actually instrument first?

Start with four areas, in order. Cash and runway. Acquisition cost and payback. One leading indicator per function. Two or three "are we still doing what we said we'd do" markers. Anything else is premature.

Cash and runway. This is the only number that genuinely cannot be wrong. Update it weekly, in writing, in the same place. Keep it visible to the founders and (where appropriate) the team. If you have ever made a hiring decision without updating the runway model in the same week, you are not data-driven yet, regardless of how many dashboards you own.
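The runway model behind "update it weekly" is one division. The sketch below shows the arithmetic and why a hire and the runway update belong in the same week; the figures are hypothetical:

```python
def runway_months(cash_on_hand: float, monthly_burn: float) -> float:
    """Months of runway at the current net monthly burn."""
    if monthly_burn <= 0:
        return float("inf")  # cash-flow positive: no runway constraint
    return cash_on_hand / monthly_burn

# A hire changes the burn, so re-run the model in the same week.
current = runway_months(480_000, 60_000)            # 8.0 months
with_hire = runway_months(480_000, 60_000 + 9_000)  # ~7.0 months
```

The point is not the formula; it is that the two numbers feeding it get refreshed, in writing, in the same place, every week.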

Acquisition cost and payback. What does it cost to bring a customer in, and how long until they pay that back? At 10 people, you don't need a marketing mix model to answer this. You need a quarterly back-of-envelope calculation that is consistent enough to compare across quarters. The number will be rough. Rough and consistent beats precise and yearly.
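The quarterly back-of-envelope version is two lines of arithmetic. A sketch, with illustrative numbers; the definitions (fully loaded CAC, gross-margin payback) are one reasonable convention, not the only one:

```python
def cac(marketing_spend: float, sales_cost: float, new_customers: int) -> float:
    """Fully loaded cost to acquire one customer this quarter."""
    return (marketing_spend + sales_cost) / new_customers

def payback_months(cac_value: float, monthly_revenue_per_customer: float,
                   gross_margin: float) -> float:
    """Months until gross profit from a customer repays its CAC."""
    return cac_value / (monthly_revenue_per_customer * gross_margin)

q_cac = cac(12_000, 18_000, 20)              # 1,500 per customer
q_payback = payback_months(q_cac, 250, 0.6)  # 10 months
```

Whichever convention you pick, keep it fixed: the value is in the quarter-over-quarter comparison, not in the absolute number.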

One leading indicator per function. Sales: pipeline coverage ratio, or qualified meetings per week. Marketing: weekly inbound from each meaningful channel. Product (if applicable): activation rate or first-week retention. Operations: a single delivery quality metric. The discipline is one number per function, not five. Five becomes none.

"Are we still doing what we said we'd do" markers. Each quarter, the founders and the team agree on three to five things that will change because of work being done. Write them down. Two or three weeks before the quarter ends, compare. This is not analytics. It is decisions being kept honest. It catches drift earlier than any monthly report will.

That is the entire starter list. None of it requires a BI tool. All of it requires somebody to take the keeper-of-the-numbers role seriously, and a founder who is willing to look at the same five figures every week instead of the new shiny ones.

What this list intentionally does not include: anything in the AI category. Not because AI is overhyped (it isn't, in operations), but because AI applied to bad data and weak decision habits will scale the badness. According to a 2025 SBE Council survey, 38% of small businesses cite the adoption of new technologies, including AI, as a contributor to their performance (SBE Council, 2025). That number is real. It also belongs to companies that already had the basics in place. AI is a multiplier on a coefficient, not the coefficient itself.

When does it make sense to graduate to actual tooling?

There are three reliable signals. Two people are arguing about the same number twice in a month. Manual reporting is taking more than two hours per person per week. Decisions are stalling because someone is waiting for a data extract.

The first signal is about trust. If two people in different parts of the team consistently land on different versions of the same metric, you have a tooling problem, not a discipline problem. A small "single source of truth" investment pays back fast at this point. According to Gartner, organisations using modern BI platforms make decisions roughly 5x faster than those relying only on spreadsheets (Gartner, 2025). At 10 people, the value is less about speed and more about ending the disagreement.

The second signal is about cost. If reporting is eating two hours per person per week across three or four people, that is roughly 30 person-hours a month, every month. The break-even on a small BI investment arrives quickly. Below that threshold, automation costs more than the time it saves.
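The break-even arithmetic above can be made explicit. A sketch with illustrative assumptions: the hourly cost, the fraction of reporting a tool can realistically automate, and "three or four people" averaged to 3.5 are all placeholders to replace with your own figures:

```python
def monthly_reporting_hours(hours_per_person_week: float, people: float,
                            weeks_per_month: float = 4.33) -> float:
    """Person-hours spent on manual reporting per month."""
    return hours_per_person_week * people * weeks_per_month

def monthly_saving(hours: float, loaded_hourly_cost: float,
                   automation_fraction: float = 0.7) -> float:
    """Value of the reporting hours a BI tool could plausibly absorb."""
    return hours * automation_fraction * loaded_hourly_cost

hours = monthly_reporting_hours(2, 3.5)  # ~30 person-hours/month
saving = monthly_saving(hours, 50)       # ~$1,060/month at $50/hour
# Compare against the tool's price plus your own maintenance time.
```

Below the two-hours-per-person threshold, the same arithmetic usually comes out negative, which is the point of the threshold.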

The third signal is about latency. If a decision waits for someone's calendar to pull a number, the cost is not the analyst's time. It is the decision delay. That is the most expensive form of data debt small companies carry, because it is invisible.

A trade-off worth naming: the tooling decision is also a maintenance decision. A BI tool you adopt today is a maintenance commitment for as long as you use it. Schemas change. Sources break. Reports drift. Forrester's 2025 work on BI tooling notes that adoption rates differ sharply between simpler and more complex platforms in companies under 1,000 employees. Pick the simpler tool. The boring one. The one whose monthly maintenance cost is closer to zero. At 10 people, the only thing worse than no BI tool is a BI tool that quietly stops being right.

The version that compounds

Data-driven, at 10 people, is not a stack. It is a habit. The tooling decision is downstream of the question of which decisions actually matter, who owns each number, and what the team agreed to compare against. Get those three things right and the company will out-decide most of its competitors before it owns a single dashboard.

The companies that get this right do not look impressive from the outside. They look small, careful, and slightly boring. They know their numbers. They argue about decisions, not about which spreadsheet has the right cell. They write down what they expected and check. When they finally graduate to a BI tool, the tool inherits a clean question, not a messy one.

The buzzword version is theatre. The practical version is operational hygiene applied to numbers. The first one impresses investors. The second one runs the business.

Frequently Asked Questions

What does "data-driven" actually mean for a small business?

For a small business, data-driven means three habits, not a tooling stack: knowing the five to twelve numbers that genuinely matter without checking, agreeing on which version of each number is correct, and writing down decisions and expected outcomes before checking actuals. McKinsey research finds data-driven organisations are 23x more likely to acquire customers and 19x more likely to be profitable, but the playbook behind that statistic was built for 10,000-person enterprises (McKinsey, 2022).

Does a 10-person company need a BI tool like Power BI or Looker?

Usually not in the first year. A BI tool starts paying back when two people in the team disagree about the same number twice in a month, when manual reporting takes more than two hours per person per week, or when decisions stall waiting for someone to pull data. Forrester's 2025 BI research shows simpler tools see around 40% higher user adoption than heavier alternatives in organisations under 1,000 employees, so when you do adopt, pick the simpler tool.

What metrics should a 10-person company track first?

Four areas, in order. Cash on hand and runway, updated weekly. Customer acquisition cost and payback, calculated quarterly. One leading indicator per function (sales pipeline coverage, weekly inbound, activation rate, delivery quality). Two or three "are we still doing what we said we'd do" markers per quarter. Anything beyond that list is premature for a small team.

Why do most data-driven initiatives fail at small companies?

The dominant failure pattern is buying a BI tool before deciding which decisions matter. Gartner reports that, on average, around 52% of organisational data is "dark data": collected and never used for any decision (Gartner). At 10 people, the dark proportion tends to be higher, because the same people who would use the data are the ones maintaining the reports.

Is gut feel the opposite of data-driven for a small company?

No. At small scale, calibrated gut feel from someone who knows the numbers cold is closer to data-driven than a dashboard nobody opens. The opposite of data-driven is uncalibrated gut feel: making decisions without ever writing down what you expected, so you cannot tell which of your instincts work. Writing down expectations before launching anything is the cheapest, highest-return analytics a small company can run.
