Grantmaking Annual Report: How to Report Publicly on Your Grants Programme

A grantmaking annual report serves multiple purposes: it demonstrates accountability for how funds were distributed, communicates the programme's impact to grantees and the broader public, provides transparency to applicants who were declined, and gives the board and any oversight bodies the record they need to govern. For government-funded programmes and publicly accountable foundations, it is also a legitimacy document — evidence that the programme is well run and that funds are reaching their intended purpose.

This guide covers what a grantmaking annual report should include and how grants management software supports its preparation.

Who the report is for

Different audiences need different information from an annual report:

Applicants (successful and declined). Past and potential applicants want to know how many applications were received, how many were funded, what the typical funded grant looks like, and what priorities guided selection. This information helps them assess whether to apply in future rounds and how to position their applications.

Grantees. Active and past grantees want to know how their programmes fit into the funder's broader portfolio, and what the funder has learned across the portfolio.

The public. For government funders and community trusts, the public has a legitimate interest in knowing how public or community funds were distributed — to whom, for what purpose, and in what amounts.

Governance bodies. Boards, committees, and oversight bodies need this information to discharge their governance responsibilities. The annual report provides both a record of the year's decisions and evidence of accountability.

Media and advocacy organisations. Journalists and policy advocates may use the report to assess whether the programme is serving its stated purposes, whether there are patterns in who does and doesn't receive funding, and whether the funder is living up to its equity commitments.

What to include

Programme overview. Brief description of the programme's purpose, any changes to strategic priorities in the reporting year, and notable programme developments.

Application activity. Total applications received; eligibility rate (percentage that proceeded to full assessment); total applications assessed.

Funding decisions. Total grants approved; total grant value; number of declined applications; success rate. If the programme received significantly more applications than it could fund, saying so contextualises the competitive nature of the process.
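The rates above fall out directly from application counts. A minimal sketch, assuming the definitions in this guide (eligibility rate as the share of received applications that proceeded to full assessment; success rate as the share of assessed applications that were funded) — the figures are hypothetical:

```python
# Hypothetical application counts for one reporting year (illustrative only).
received = 412   # total applications received
assessed = 301   # applications that proceeded to full assessment
approved = 87    # grants approved

# Eligibility rate: share of received applications assessed in full.
eligibility_rate = assessed / received

# Success rate: share of assessed applications that were funded.
success_rate = approved / assessed

print(f"Eligibility rate: {eligibility_rate:.0%}")
print(f"Success rate: {success_rate:.0%}")
```

Some programmes report success rate against all received applications rather than assessed ones; whichever denominator is used, the report should state it, since the two figures can differ substantially.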

Grant distribution. How grants were distributed — by geographic area, by grant purpose or category, by grantee type (organisation size, type, sector), by grant size. For equity-focused funders, this may include demographic data about grantees and beneficiaries.

Individual grants list. For government and community trust programmes, publishing a complete list of grants — organisation name, grant amount, brief grant purpose — is standard practice. This level of transparency is expected for publicly accountable programmes.

Post-award performance. How the active grant portfolio is performing: milestone completion rates, reporting compliance, any significant grantee issues or grant variations.

Outcomes and impact. What did the funded programmes achieve? Aggregate outcome data from grantee reports — participants served, activities delivered, outcomes achieved — compiled across the portfolio. This is the "so what" of the report.

Learning. What did the funder learn this year? Changes to programme design, assessment criteria, or grantmaking approach that came from this year's experience. This builds the programme's credibility as a learning organisation.

Financial summary. Total funds distributed; administration costs; comparative year-over-year data.

How grants management software supports annual reporting

Structured data. An annual report is only as good as the underlying data. Grants management software that captures grant activity in structured, consistent fields enables the aggregation and analysis that annual reports require.

Portfolio-level views. Aggregate reporting on the portfolio — distribution by category, geography, grant size — should be available as a self-service export from the platform, not a manual compilation exercise.
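The aggregation behind a portfolio-level view is a simple group-and-sum over exported grant records. A minimal sketch — the field names are assumptions for illustration, not any particular platform's schema:

```python
from collections import defaultdict

# Illustrative grant records, as a self-service export might provide them.
grants = [
    {"org": "Org A", "amount": 25_000, "category": "Youth",   "region": "North"},
    {"org": "Org B", "amount": 40_000, "category": "Housing", "region": "South"},
    {"org": "Org C", "amount": 15_000, "category": "Youth",   "region": "North"},
]

def distribution(records, key):
    """Total grant value grouped by the given field."""
    totals = defaultdict(int)
    for record in records:
        totals[record[key]] += record["amount"]
    return dict(totals)

by_category = distribution(grants, "category")
by_region = distribution(grants, "region")
```

The same helper covers distribution by grant size band or grantee type once the export carries those fields, which is why structured, consistent data capture matters more than any particular reporting tool.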

Outcome data aggregation. If outcome data is collected through the reporting module, the platform should be able to aggregate indicators across the portfolio — total participants, average scores on wellbeing measures, employment outcomes — for the impact section of the annual report.
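Aggregating outcome indicators across grantee reports is the same pattern: sum counts, average scores. A minimal sketch, assuming each grantee report yields one record with consistent indicator names (the names and figures here are hypothetical):

```python
# Illustrative grantee-report records; indicator names are assumptions.
reports = [
    {"participants": 120, "wellbeing_score": 7.2},
    {"participants": 85,  "wellbeing_score": 6.8},
    {"participants": 240, "wellbeing_score": 7.9},
]

# Counts are summed across the portfolio; scores are averaged.
total_participants = sum(r["participants"] for r in reports)
avg_wellbeing = sum(r["wellbeing_score"] for r in reports) / len(reports)
```

This only works if every grantee reports against the same indicators in the same units — a design decision made in the reporting module, long before the annual report is drafted.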

Grant list export. A clean export of all grants in the reporting period — organisation, amount, purpose, geography — is the foundation of the grants list section.

Comparison data. Year-over-year comparison requires consistent historical data. Platforms that have held grant records for multiple years provide the historical foundation for trend analysis.
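Given consistent historical totals, year-over-year change is a one-line calculation per pair of years. A minimal sketch with hypothetical annual totals:

```python
# Illustrative total funds distributed per reporting year (hypothetical).
funded_by_year = {2021: 1_800_000, 2022: 2_100_000, 2023: 2_400_000}

def yoy_change(series):
    """Fractional change between each consecutive pair of years."""
    years = sorted(series)
    return {
        year: (series[year] - series[prev]) / series[prev]
        for prev, year in zip(years, years[1:])
    }

changes = yoy_change(funded_by_year)
```

The harder part is not the arithmetic but the consistency: if categories or geographies were redefined between years, trend figures need a note explaining the break in the series.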

Common weaknesses in grantmaking annual reports

Impact claims without data. "We funded 47 organisations that improved community wellbeing" is not an impact claim — it is a count. Impact claims need to be specific: what changed, for how many people, and how was it measured.

No geographic analysis. Reports that list grants but don't analyse geographic distribution miss an accountability dimension — particularly for community trusts and government programmes where geographic equity is a legitimate expectation.

No declined applicant data. Reporting only on funded grants omits the information applicants and the public need to assess the competitive nature of the programme. Success rates are an accountability indicator.

Absence of learning. An annual report that only records what happened without acknowledging what was learned or what will change is a compliance exercise, not a genuine accountability document.


Tahua provides grants management software with portfolio reporting, outcome data aggregation, and grant list export capabilities that support annual grantmaking reporting.

Book a conversation →