How to Present Grant Programme Impact to Your Board

Most grant programme impact reports for boards are too long, too dense, and not designed for the way boards actually make decisions. They contain everything the programme team thinks is relevant — activity summaries, financial tables, grantee lists, outcome statistics, case studies — without a clear narrative thread or a set of decisions the board needs to make.

The result is board members who skim the report before the meeting, ask generic questions, and defer to management on everything. That's not effective governance.

Here's how to design an impact report that gives your board what it actually needs.

Understand what your board is trying to do

Boards govern programmes; they don't manage them. Their role is to ensure that the programme is achieving its purpose, that funds are being used appropriately, and that no material risks are going unaddressed.

That means the questions your board impact report needs to answer are:

  1. Is the programme achieving what it's supposed to achieve?
  2. Are funds being used appropriately?
  3. Are there any issues or risks the board needs to know about?
  4. Is there anything the board needs to decide?

Everything in your report should serve one of these four purposes. If it doesn't, consider whether it belongs in the report at all.

Structure for decision-making, not comprehensiveness

A board impact report that's designed for comprehensiveness — documenting everything that happened — is a compliance document. A board impact report that's designed for decision-making — surfacing what matters — is a governance tool.

A practical structure:

1. Executive summary (one page): What the programme achieved this period, against what it set out to achieve. Key numbers. Any significant issues. Any decisions required. Board members who only read one page should still have a complete picture.

2. Programme performance (two to three pages): Outcomes data against your indicators, with comparison to previous period or to target. Brief analysis of what's driving performance — what's working and what isn't. Don't just report numbers; tell the board what the numbers mean.

3. Financial summary (one page): Budget vs. actuals, commitments vs. available balance, projected year-end position. Flag any significant variances with a brief explanation.

4. Risk and issues (half a page): Any material risks or issues that have emerged since the last report, with your recommended response. Don't bury risks in the narrative — surface them explicitly.

5. Grantee portfolio summary (one page): A high-level view of the funded portfolio — number of active grants, geographic and sector distribution, any significant grantee issues. This can be largely visual (a map, a pie chart, a brief table).

6. Case study (one page): One grantee story told well. Not a list of anecdotes — one specific story that illustrates what the programme is achieving at the human level.

7. Appendices: Detailed grantee lists, full financial statements, full assessment reports — for members who want the detail. Not required reading.

Lead with the headline, not the methodology

A common reporting mistake is leading with how the data was collected before telling the board what the data shows. Board members don't need to understand your measurement methodology before they can engage with your findings.

Lead with the finding: "This round, 82% of participants reported improved confidence in managing grants — up from 74% in the previous round."

Then, if relevant: "This is based on a post-programme survey of 180 participants."

The finding first, the methodology second. The board can ask about methodology if they need to; they shouldn't have to read through it to find the result.

Be honest about what isn't working

Board reports that only present positive results train boards to distrust them. If your programme is underperforming on a key indicator, say so — and come with an explanation and a proposed response.

"Uptake among Māori organisations has been lower than expected this round (18% of funded organisations vs. a target of 25%). We believe this reflects the timing of the deadline, which fell during a busy period for iwi governance cycles. We're adjusting the deadline for next round and will report back on whether uptake improves."

That's reporting that treats the board as a genuine governance partner. Boards that get honest reporting trust the management team more, not less.

Calibrate report length to your reporting cycle

For monthly board meetings, a two-page exception report is often more useful than a comprehensive update. The board is close enough to the programme that brief monthly exception reporting, paired with a deeper quarterly review, is more productive than a shallow comprehensive report every month.

For quarterly reporting, a five to seven page report is appropriate. For annual reporting, a more comprehensive document is warranted — but still structured for decision-making, not documentation.

Ask your board chair what length and format works for your board. Different boards have different preferences, and the most technically correct report is useless if your board members don't engage with it.


Part of the Tahua grants management series

This article is part of the complete guide: What Great Grant Outcome Reporting Looks Like.