The data generated through grants management — applications, assessments, awards, reports, and outcomes — is one of the most underutilised resources in philanthropy. Most funders use this data only for basic administration: tracking who has applied, who was funded, and when reports are due. The sophisticated use of grants data — to understand portfolio patterns, identify equity gaps, evaluate programme effectiveness, and inform strategic decisions — remains rare. This gap represents a significant missed opportunity for improving grantmaking.
A single grant cycle generates substantial data:
Application data:
- Applicant organisation profiles (type, size, location, demographics)
- Programme descriptions and funding requests
- Assessment scores and assessor comments
- Decision rationale
Portfolio data:
- Grant amounts by purpose, organisation type, geography, funder priority area
- Distribution of grants across the portfolio (are 20% of grantees receiving 80% of funds?)
- Repeat vs. first-time grantees
- Multi-year grant commitments
Outcome data:
- Reported outputs and outcomes from grantee reports
- Grantee self-assessment of progress
- Funder assessments of grantee performance
Applicant pool data:
- Who applied but wasn't funded
- Decline reasons
- Eligibility failures
Historical data:
- Trends in application volume and quality over time
- Changes in the grantee community
- Programme investment patterns over multiple years
The first level of analytical use of grants data is basic portfolio reporting — understanding what you've funded.
Essential portfolio reports:
- Total funding by grant round
- Funding by purpose/priority area
- Funding by geographic distribution
- Funding by organisation type (community group, registered charity, incorporated society, etc.)
- Average grant size by category
- Number of unique grantees vs. repeat grantees
- Multi-year grant commitments vs. single-year grants
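Several of these reports fall out of a few lines of aggregation once grant records are structured. A minimal sketch, assuming illustrative field names ("grantee", "category", "amount", "years") rather than any particular platform's schema:

```python
# Basic portfolio reporting over a list of grant records.
# All field names and figures are illustrative.
from collections import Counter, defaultdict

grants = [
    {"grantee": "Harbour Trust", "category": "Youth", "amount": 15000, "years": 1},
    {"grantee": "Kai Collective", "category": "Food security", "amount": 40000, "years": 3},
    {"grantee": "Harbour Trust", "category": "Youth", "amount": 20000, "years": 1},
    {"grantee": "Rural Health Hub", "category": "Health", "amount": 25000, "years": 1},
]

# Funding by purpose/priority area
total_by_category = defaultdict(int)
for g in grants:
    total_by_category[g["category"]] += g["amount"]

# Repeat vs. first-time grantees
counts = Counter(g["grantee"] for g in grants)
repeat_grantees = [name for name, n in counts.items() if n > 1]

# Average grant size and multi-year commitments
avg_grant = sum(g["amount"] for g in grants) / len(grants)
multi_year = sum(1 for g in grants if g["years"] > 1)

print(dict(total_by_category))
print(repeat_grantees, avg_grant, multi_year)
```

The same pattern extends to funding by geography or organisation type by swapping the grouping key.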
Why this matters: Portfolio reporting that only the organisation's management team sees doesn't drive accountability. Publishing basic portfolio data — on the website, in annual reports — creates transparency and enables community assessment of whether the funder's stated priorities match actual investment patterns.
Equity analysis asks: who is accessing our funding, and who is being excluded? This requires disaggregating portfolio data by:
Grantee demographics:
- Leadership demographics (CEO/executive director gender, ethnicity)
- Governance demographics (board composition)
- Staff demographics
Community demographics:
- Geographic distribution (urban vs. rural vs. remote)
- Ethnicity of communities served
- Socioeconomic characteristics of funded communities
Application and success rates:
- What proportion of Māori-led organisations that apply receive grants?
- How does the success rate of small organisations compare with that of large organisations?
- How does the geographic distribution of applications compare to the geographic distribution of awards?
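The success-rate questions above reduce to disaggregating the applicant pool by an attribute and comparing funded proportions. A hedged sketch, where "led_by" and "funded" are hypothetical fields standing in for whatever demographic data a funder actually collects:

```python
# Disaggregating application success rates by an applicant attribute.
# "led_by" and "funded" are hypothetical fields; data is illustrative.
from collections import defaultdict

applications = [
    {"led_by": "Māori", "funded": True},
    {"led_by": "Māori", "funded": False},
    {"led_by": "Māori", "funded": False},
    {"led_by": "Other", "funded": True},
    {"led_by": "Other", "funded": True},
    {"led_by": "Other", "funded": False},
]

totals, wins = defaultdict(int), defaultdict(int)
for app in applications:
    totals[app["led_by"]] += 1
    if app["funded"]:
        wins[app["led_by"]] += 1

# Success rate per group: funded applications / total applications
success_rate = {group: wins[group] / totals[group] for group in totals}
print(success_rate)
```

A gap between groups in this output is a prompt for investigation, not a conclusion in itself; small applicant pools produce noisy rates.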
This analysis frequently reveals patterns that are invisible without data:
- Māori and Pacific-led organisations may apply at lower rates and receive smaller grants
- Rural and remote communities may be systematically underrepresented
- Small organisations may have lower success rates despite serving significant community need
- Assessment scores may systematically disadvantage applicants who write in less formal English
Aggregating outcome data across the grant portfolio reveals patterns that individual grantee reports can't show:
Reach aggregation: How many people in total are reached by the grant portfolio? How does this compare to the target population?
Outcome aggregation: What are the most common outcomes being achieved across the portfolio? Are there outcomes that are frequently claimed but rarely evidenced?
Sector mapping: What sectors and sub-sectors are being funded? Where are the gaps?
Impact trajectory: Is the portfolio's reported reach and outcomes increasing or decreasing over time?
Grantee performance distribution: Are a small number of grantees responsible for most of the portfolio's reported outcomes? This might indicate concentration risk or identify high-performers worth increased investment.
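The concentration question (echoing the earlier 80/20 question about funding) can be sketched as a simple share calculation over reported reach, with illustrative numbers:

```python
# Concentration check: what share of the portfolio's reported reach
# comes from the top 20% of grantees? Figures are illustrative.
reach_by_grantee = {"A": 9000, "B": 500, "C": 300, "D": 150, "E": 50}

sorted_reach = sorted(reach_by_grantee.values(), reverse=True)
top_n = max(1, round(len(sorted_reach) * 0.2))  # top 20% of grantees
share = sum(sorted_reach[:top_n]) / sum(sorted_reach)
print(f"Top {top_n} grantee(s) account for {share:.0%} of reported reach")
```

The same calculation applied to grant amounts answers the funding-concentration question directly.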
The most analytically sophisticated funders use data in more advanced ways:
Predictive assessment. Using historical data on applications, assessment scores, and grant outcomes to understand which assessment criteria are most predictive of grant success. This allows refinement of assessment criteria to focus on what actually matters.
Network analysis. Mapping the relationships between funded organisations — who partners with whom, who refers clients to whom, who shares leadership — identifies network effects and gaps in the ecosystem.
Geospatial analysis. Overlaying grant investment with demographic data reveals geographic equity patterns that aren't visible in tabular reports. Heat maps of grant concentration relative to need distribution are powerful visual tools.
Trend analysis. Tracking portfolio metrics over multiple years identifies emerging patterns — growth in applications from specific sectors, decline in rural applicants, increasing grant sizes in specific areas — that inform strategic decisions.
Comparative benchmarking. Comparing portfolio metrics against peer funders or sector averages provides context for understanding whether your portfolio is representative. This requires data sharing with peer organisations.
Analytics is only as good as the underlying data. Common data quality problems in grants management:
Inconsistent categorisation. If different staff members apply different category labels to similar grants, portfolio analysis by category is meaningless. Consistent taxonomies, applied reliably, are essential.
Missing data. Optional fields that are inconsistently completed produce holes in the data. Where data is essential for analysis, it should be required at the point of entry.
Free-text overload. Applications with extensive free-text response fields (application narratives, assessment comments) contain rich information but are difficult to analyse systematically. Natural language processing tools can help, but require significant setup.
Reporting quality variation. If grantee reports are collected in inconsistent formats, aggregating outcome data is difficult. Structured reporting templates with defined fields produce more analysable data than open-format reports.
Historical data gaps. If previous grant cycles used different systems or processes, historical data may be incomplete or incompatible. Migrating and cleaning historical data is expensive but necessary for long-term trend analysis.
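Two of these problems, missing required fields and inconsistent categorisation, can be caught by a routine validation pass before analysis. A minimal sketch, where the required fields and the taxonomy are illustrative assumptions:

```python
# Data-quality pass: flag records with missing required fields or
# category labels outside an agreed taxonomy. The REQUIRED set and
# TAXONOMY are illustrative, not from any real system.
REQUIRED = {"grantee", "category", "amount"}
TAXONOMY = {"Youth", "Health", "Food security", "Environment"}

records = [
    {"grantee": "Harbour Trust", "category": "Youth", "amount": 15000},
    {"grantee": "Kai Collective", "category": "food", "amount": 40000},  # off-taxonomy label
    {"grantee": "Rural Health Hub", "amount": 25000},                    # missing category
]

problems = []
for i, rec in enumerate(records):
    missing = REQUIRED - rec.keys()
    if missing:
        problems.append((i, f"missing fields: {sorted(missing)}"))
    elif rec["category"] not in TAXONOMY:
        problems.append((i, f"unknown category: {rec['category']!r}"))

for idx, msg in problems:
    print(f"record {idx}: {msg}")
```

Running a check like this at the point of entry, rather than at analysis time, is what makes the "required fields" and "consistent taxonomies" advice enforceable.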
Purpose-built grants management software provides significant analytical advantages over spreadsheets:
Structured data entry. Consistent taxonomies, mandatory fields, and dropdown options produce cleaner, more analysable data.
Built-in reporting. Most grants management platforms include standard reports — portfolio summaries, funding by category, grantee lists — that require no additional analysis work.
Custom reports. More advanced platforms allow custom report configuration, filtering by multiple criteria, and export to Excel or BI tools for further analysis.
API access. Enterprise platforms provide API access that allows grant data to be combined with external data sources (population statistics, sector data) for more sophisticated analysis.
Dashboard visualisation. Real-time dashboards that show portfolio status, upcoming deadlines, and key metrics allow quick situational awareness without manual report generation.
The most transparent funders publish their grants data in open formats that allow external analysis. The 360Giving standard in the UK has enabled significant cross-funder analysis by creating a common data format. New Zealand philanthropy has less developed open data infrastructure, but the principle is the same: publishing grants data in machine-readable formats benefits the sector.
What to publish:
- Grants awarded (organisation name, amount, purpose, grant period, geographic area)
- Declined applications (aggregated, not individually identified)
- Portfolio summaries by category, geography, and outcome
- Learning reports based on portfolio analysis
Publishing this data is an act of transparency and a contribution to sector intelligence. It also creates accountability — communities and researchers can analyse whether the funder's stated priorities match its actual investment.
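Machine-readable publishing can be as simple as a flat CSV with one row per grant. A sketch follows; the column names echo the style of the 360Giving standard but have not been verified against its published schema, so check the spec before adopting them:

```python
# Export awarded grants as a machine-readable CSV. Column names are
# modelled loosely on 360Giving conventions but are assumptions here;
# consult the published 360Giving schema for the authoritative fields.
import csv
import io

grants = [
    {"Identifier": "360G-EX-001", "Recipient Org:Name": "Harbour Trust",
     "Amount Awarded": 15000, "Currency": "NZD",
     "Award Date": "2024-03-01", "Title": "Youth mentoring programme"},
]

buf = io.StringIO()
writer = csv.DictWriter(buf, fieldnames=list(grants[0].keys()))
writer.writeheader()
writer.writerows(grants)
print(buf.getvalue())
```

In practice the export would be written to a file on the funder's website rather than an in-memory buffer; the point is that a common, flat, documented format is enough to enable external analysis.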
Tahua's grants management platform provides the analytics infrastructure that data-driven funders need — from standard portfolio reporting to custom analysis, with the data quality and export capabilities that make genuine insight possible.