Every grants manager knows the spreadsheet. Not just one spreadsheet — the ecosystem. The master tracker. The assessment matrix. The conflict of interest register. The reporting due dates sheet. The payments log. The board summary pivot table that only Sarah knows how to update.
These spreadsheets work, in the same way that a screwdriver works as a hammer. You can do it. It leaves marks.
The cost of managing grants in spreadsheets is real, but it's almost entirely invisible because it shows up in the wrong places: in staff time, in Friday afternoons spent reconciling versions, in the quiet anxiety before a board meeting when someone has to check that the numbers are right. Nobody has ever written "spreadsheet overhead" as a line item in an organisational budget. But it's there.
Let's put some rough numbers on it.
A medium-sized grants programme — say, 200 applicants per round, two rounds per year, 60 active grants at any time — will typically involve the following spreadsheet work:
Application tracking: Logging incoming applications, assigning reviewers, tracking assessment status, recording decisions. At 15 minutes per application, 400 applications across two rounds comes to 100 hours per year.
Assessment coordination: Distributing scoring sheets, collecting completed assessments, reconciling scores, identifying borderline cases, preparing the ranked list for the panel. Per round: 20–40 hours. Per year: 40–80 hours.
Conflict of interest management: Collecting declarations, cross-referencing against the applicant list, updating the register, ensuring reviewers are appropriately recused. Per round: 8–15 hours. Per year: 16–30 hours.
Reporting tracking: Logging report due dates, sending reminders, recording receipt, flagging overdue returns, chasing non-responders. Per active grant: 2–4 hours per year. Across 60 active grants: 120–240 hours per year.
Payments management: Tracking payment schedules, reconciling against milestone completion, generating payment requests, logging disbursements. Per active grant: 3–6 hours per year. Across 60 grants: 180–360 hours per year.
Board reporting: Pulling information from multiple spreadsheets, reconciling totals, formatting, checking for errors, presenting. Per board meeting: 8–20 hours. At six meetings per year: 48–120 hours.
Total: somewhere between 500 and 930 hours of spreadsheet administration per year for a medium-sized programme. At a fully-loaded cost of $60–80 per hour for a grants coordinator, that's roughly $30,000–$74,000 per year in staff time spent on data administration — not on programme work.
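If you want to sanity-check the arithmetic, or swap in your own programme's numbers, the whole estimate fits in a short script. Every figure below is one of the article's rough assumptions, not measured data:

```python
# Back-of-envelope model of annual spreadsheet administration.
# All hour ranges are the illustrative assumptions from the text,
# not measured data. Substitute your own programme's figures.

tasks = {
    # task: (low hours/year, high hours/year)
    "application tracking":     (100, 100),  # 400 applications x 15 min
    "assessment coordination":  (40, 80),    # 20-40 h x 2 rounds
    "conflict of interest":     (16, 30),    # 8-15 h x 2 rounds
    "reporting tracking":       (120, 240),  # 60 grants x 2-4 h
    "payments management":      (180, 360),  # 60 grants x 3-6 h
    "board reporting":          (48, 120),   # 6 meetings x 8-20 h
}

low_hours = sum(low for low, _ in tasks.values())
high_hours = sum(high for _, high in tasks.values())

rate_low, rate_high = 60, 80  # fully-loaded $/hour for a coordinator

print(f"Hours: {low_hours}-{high_hours} per year")
print(f"Cost: ${low_hours * rate_low:,}-${high_hours * rate_high:,} per year")
```

Running it gives 504–930 hours, or about $30,000–$74,000 per year at the assumed rates. The interesting exercise is replacing the hour ranges with your own team's numbers.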
That's the visible part.
The staff time cost is real but it's only part of the picture. There are three other costs that don't show up in any budget line.
Spreadsheets are error-prone. This is not a criticism of the people using them — it's a structural property of the tool. The European Spreadsheet Risks Interest Group (yes, this exists) has studied spreadsheet errors in professional settings for decades. Their finding: in large, complex spreadsheets, error rates of 1–2% per cell are normal. In a 500-row assessment matrix with 15 columns, that's 75–150 errors.
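The cell arithmetic behind that estimate is simple enough to show directly. A 500-row, 15-column matrix is 7,500 cells, and at the cited per-cell error rates:

```python
# Expected error count in a 500-row x 15-column assessment matrix,
# using the 1-2% per-cell error rates cited above.
rows, cols = 500, 15
cells = rows * cols  # 7,500 cells

for rate in (0.01, 0.02):
    print(f"at {rate:.0%} per cell: ~{int(cells * rate)} expected errors")
```

Which is where the 75–150 figure comes from. The point isn't precision; it's that the expected error count in a matrix that size is never zero.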
Most errors are caught before they matter. Some aren't. The ones that aren't tend to cluster around high-stakes moments: the panel ranking, the payment schedule, the board report.
The cost of a meaningful spreadsheet error in a grants context isn't just the correction time — it's the trust damage. If an applicant discovers their score was recorded incorrectly, or a grantee is told they've missed a reporting deadline when the date in the system was wrong, the damage isn't just operational. It's to the relationship, and to your programme's reputation for running a fair, competent process.
"Which sheet is current?" is a sentence that should never need to be said, but in spreadsheet-based grants management it's said constantly.
Multiple staff members work with the same data. The master tracker lives on a shared drive. Someone downloads it to work offline, makes changes, and re-uploads. Someone else has made changes in the online version in the meantime. Now there are two versions and nobody is quite sure which one has the most recent assessment results.
The cost isn't just the time to reconcile. It's the decision-making time lost when people don't trust the data in front of them. When a team isn't confident in the accuracy of its tracker, people start making decisions from memory, from email threads, from their own notes. That's where real errors happen.
In most spreadsheet-based grants programmes, one or two people understand the system intimately. They built the pivot tables. They know what the colour coding means. They know that column P is actually the pre-weighted score and column Q is the final, despite the labels saying otherwise.
When those people leave — or go on parental leave, or get sick — the institutional knowledge walks out with them. Onboarding a new person into a complex, organically-evolved spreadsheet system takes weeks, not days. During that transition period, mistakes happen.
This is a genuine governance and continuity risk. It's also entirely invisible until it becomes a crisis.
Purpose-built grants management software isn't a luxury for large programmes. It's an operational investment that typically pays for itself in the first year.
What changes when you move from spreadsheets to a proper system:
Assessment is routed, not distributed. Applications are assigned to reviewers. The system tracks who has assessed what. Conflicts are flagged automatically against the COI register. Scoring happens in-platform. The ranked list is generated automatically. The panel sees what they need to see, not a 500-row spreadsheet.
Reporting is tracked, not chased. Due dates are set when a grant is approved. Reminders go out automatically. When a report is submitted, it's logged. Overdue returns surface in a dashboard, not from manual cross-referencing. Programme staff spend their time on late-report conversations, not on figuring out who is late.
Payment schedules are tied to milestones. When a milestone is marked complete, the corresponding payment is flagged for processing. The payments log is a live view of what's been disbursed and what's due — not a manually maintained sheet that someone updates when they remember.
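The milestone-to-payment link is simple enough to sketch in code. This is a hypothetical, minimal data model for illustration only — not any particular product's schema:

```python
# Minimal sketch of milestone-linked payments: a payment becomes due
# the moment its milestone is marked complete. Illustrative only;
# names and structure are invented for this example.
from dataclasses import dataclass, field


@dataclass
class Milestone:
    name: str
    complete: bool = False


@dataclass
class Payment:
    amount: float
    milestone: Milestone
    disbursed: bool = False

    @property
    def due(self) -> bool:
        # Due when its milestone is complete and it hasn't been paid.
        return self.milestone.complete and not self.disbursed


@dataclass
class Grant:
    grantee: str
    payments: list[Payment] = field(default_factory=list)

    def payments_due(self) -> list[Payment]:
        return [p for p in self.payments if p.due]


# Marking a milestone complete surfaces the payment automatically —
# no one has to remember to update a payments sheet.
m1 = Milestone("Interim report accepted")
grant = Grant("Example Org", [Payment(25_000, m1)])
assert grant.payments_due() == []

m1.complete = True
assert len(grant.payments_due()) == 1
```

The contrast with a spreadsheet is the point: here the "payments log" is a live query over the data, not a second sheet that can drift out of sync with the milestone tracker.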
Board reporting is a report, not a project. The data is in the system. The report is generated. The board sees accurate, current information without someone spending two days pulling it together.
The most common objection to moving away from spreadsheets is: "We've always done it this way and it works fine."
The honest response is: it probably works better than you think in some ways, and worse than you think in others. The parts that work fine are visible — the assessments happen, the grants get paid, the reports come in. The parts that don't work are invisible — the hours spent on coordination, the errors that don't get caught, the knowledge risk when someone leaves.
The question isn't "does the spreadsheet work?" The question is "what could your team do with 500 hours a year if they weren't spending it on spreadsheet administration?"
For most programmes, that's the equivalent of a part-time programme role. That's the money you're spending on the spreadsheet.
You don't have to do everything at once. If you're not ready to move your whole programme off spreadsheets, start with the part that causes the most pain. For most teams, that's either reporting tracking or payments management, the two biggest time sinks in the estimates above.
Moving one workstream into a proper system gives your team a proof of concept and builds the case for doing the rest. It also gives you a direct comparison: what did this take us before, versus now?
The hidden cost of spreadsheets is real. You're paying it every year. The question is whether you want to keep paying it.
The biggest hidden costs are staff time spent on administration — data entry, error correction, and reporting preparation — plus risk exposure from data integrity failures, and compounding overhead as your programme portfolio grows. Most teams underestimate these costs because they appear as ordinary working hours rather than a visible budget line.
When you're running three or more concurrent programmes, spending more than a few hours a week on spreadsheet administration, or have experienced a data integrity incident, it's time to evaluate alternatives. The tipping point for most teams is a combination of volume and risk — not a single trigger.
Is the switch worth the cost? For most teams running multiple programmes, yes. The ROI calculation typically shows positive returns within 12–18 months once you account for staff time savings, reduced error risk, and improved reporting efficiency. The switching cost is a one-time investment; the cost of staying on spreadsheets is ongoing.