Running a Grant Assessment Panel: Process, Documentation, and Common Problems

The assessment panel is the governance mechanism at the heart of most competitive grants programmes. It is how funders make decisions that are defensible — where judgements are collective, documented, and traceable back to criteria. A well-run panel produces decisions that hold up to scrutiny. A poorly designed panel process produces decisions that look reasonable at the time but cannot be adequately defended when challenged.

Why panel process matters

The panel process is not primarily about making good funding decisions — though it helps with that too. It is primarily about creating a defensible record of how decisions were made.

A single assessor making a funding decision unilaterally is efficient but creates accountability risk. There is no independent check on bias, no record of reasoning against criteria, and no evidence that the process was consistent across applications.

A panel of assessors scoring against explicit criteria, declaring conflicts of interest (COIs), deliberating on borderline cases, and producing a documented recommendation creates a record that can be examined, tested, and defended. For government funders, this record is part of meeting their statutory accountability obligations. For charitable foundations, it demonstrates that the board's grant-making is governed by a fair, consistent process.

The components of a panel process

Assessor selection. Panels should be composed of people with relevant expertise and independence from applicants. The rationale for panel composition — why these people were chosen — is worth documenting, particularly for government programmes where the selection of assessors can itself be subject to scrutiny.

Briefing and training. Before assessment begins, assessors should be briefed on the programme objectives, the assessment criteria and their weighting, the scoring scale, the COI declaration requirement, and any constraints on communication between assessors during independent scoring. This briefing should be documented.

COI declaration. Every assessor should declare any conflicts of interest before reviewing applications. Declarations should be collected per application, not as a single blanket statement. See the guide on conflict of interest management for detail on how to structure this.

Independent scoring. Assessors score applications independently, against the criteria, before seeing other assessors' scores. The independence of scoring is critical — if assessors can see each other's scores before submitting their own, social influence effects undermine the validity of the individual scores.

Score collation. Once all assessors have scored all applications assigned to them, scores are collated. The collated scores reveal where assessors agreed (low variance) and where they disagreed (high variance). High-variance applications require deliberation.
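As a minimal sketch of this collation step (the field names and the variance threshold are illustrative, not any particular system's schema), per-application means and variances can be computed and high-variance applications flagged for deliberation:

```python
from statistics import mean, pvariance

# Illustrative data: application ID -> one total score per assessor.
scores = {
    "APP-001": [72, 75, 74],   # low variance: assessors broadly agree
    "APP-002": [55, 81, 60],   # high variance: flag for deliberation
}

VARIANCE_THRESHOLD = 50  # illustrative cut-off, set per programme

def collate(scores, threshold=VARIANCE_THRESHOLD):
    """Return per-application mean, variance, and a deliberation flag."""
    collated = {}
    for app_id, assessor_scores in scores.items():
        var = pvariance(assessor_scores)
        collated[app_id] = {
            "mean": round(mean(assessor_scores), 1),
            "variance": round(var, 1),
            "deliberate": var > threshold,
        }
    return collated
```

The threshold itself is a programme design choice: too low and the panel deliberates on everything, too high and genuine disagreement goes unexamined.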

Panel deliberation. The panel reviews the collated scores, focuses discussion on high-variance applications and borderline cases, and arrives at a recommendation list. The deliberation should include discussion of relative merit across applications where the funding envelope does not allow all borderline applications to be funded.

Documentation of the recommendation. The panel's final recommendation — which applications are recommended for funding, at what amount, and with what rationale — should be formally documented. For government programmes, this recommendation document supports the Ministerial decision.

Decision and notification. Funding decisions are made by the appropriate authority (board, programme manager, Minister), recorded, and communicated to applicants.

What to document at each stage

The documentation standard for a grant assessment panel should be set by the accountability requirements of the programme. At minimum:

Assessor selection: Who was on the panel and why.

COI declarations: Every declaration, every management decision on declared conflicts.

Individual scores: Every assessor's score for every criterion for every application they assessed, with any written comments. These should be recorded in the grants management system, not held in assessors' personal notes.

Variance analysis: A record of which applications had high inter-assessor variance and how that variance was resolved in deliberation.

Panel deliberation notes: A summary of the panel's discussion on each application that was subject to deliberation. This does not need to be a verbatim transcript but should capture the key points and the reasoning.

Panel recommendation: The formal recommendation document with the recommended list, funding amounts, and rationale.

Decision record: The authority that made the final decision, the date, and any variation from the panel recommendation (with reasons).

Common problems in panel management

Independent scoring not enforced. When assessors discuss applications before submitting their scores — whether in a meeting or informally — the independence of the scores is compromised. The system should enforce independent scoring by preventing access to other assessors' scores until all scores are submitted.
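One way such enforcement can work in software (a sketch; the function and field names are illustrative) is to gate read access on submission state, so no assessor sees panel scores until everyone assigned has submitted:

```python
def can_view_panel_scores(application, viewer_id):
    """A viewer may see other assessors' scores only once every
    assessor assigned to this application has submitted."""
    submissions = application["submitted"]  # assessor_id -> bool
    if not submissions.get(viewer_id, False):
        return False  # the viewer has not yet submitted their own score
    return all(submissions.values())  # everyone assigned has submitted

app = {"submitted": {"a1": True, "a2": True, "a3": False}}
can_view_panel_scores(app, "a1")  # False: a3 has not yet scored
```

The key design point is that the check is two-sided: an assessor who has not submitted sees nothing, and even one who has submitted sees nothing until the whole panel has.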

COI declarations collected in bulk, not per application. A single declaration form at the start of the panel process is not adequate. Per-application declarations catch conflicts that a blanket review misses.

Panel deliberation undocumented. "The panel discussed and agreed" is not a useful record. Deliberation notes should capture the substance of the discussion, particularly for applications where the panel reached a different conclusion from the scores alone.

Scores held in assessors' personal documents. Assessors should score inside the grants management system, not in their own spreadsheets or Word documents. Scores held outside the system cannot be included in the official record and create data management problems.

No process for managing assessor withdrawal. When an assessor withdraws mid-round (illness, disclosed conflict, unavailability), the programme needs a protocol for how to handle their partial scores and how to assign their remaining applications. This should be planned for, not improvised.
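One possible reassignment step under such a protocol (illustrative only; whether the withdrawn assessor's partial scores are kept or discarded is a separate policy decision that should also be set in advance) balances the withdrawn assessor's unscored applications across remaining, unconflicted assessors:

```python
def reassign(unscored_apps, remaining_assessors, conflicts, load):
    """Assign each unscored application to the least-loaded remaining
    assessor who has no declared conflict with it.

    conflicts: assessor_id -> set of conflicted application IDs
    load:      assessor_id -> current number of assigned applications
    """
    assignments = {}
    for app_id in unscored_apps:
        eligible = [a for a in remaining_assessors
                    if app_id not in conflicts.get(a, set())]
        if not eligible:
            raise ValueError(f"No unconflicted assessor for {app_id}")
        chosen = min(eligible, key=lambda a: load[a])
        assignments[app_id] = chosen
        load[chosen] += 1  # keep load balanced as we assign
    return assignments
```

Note that the COI exclusions carry through reassignment: a replacement assessor conflicted with an application can never inherit it, and if no unconflicted assessor remains, the programme has a problem to escalate rather than paper over.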

Panel recommendations varied without explanation. When the decision-maker varies the panel recommendation — funding less than recommended, funding a declined application, changing conditions — the reasoning should be documented. Unexplained variations create accountability risk.

The governance case for structured panel process

For government and Crown entity funders, the panel process is a governance mechanism. The funding decision is ultimately made by a person with delegated authority — a programme manager, chief executive, or Minister — and the panel process is the evidence base for that decision.

If a funding decision is challenged, the questions will include: what criteria were applied, who assessed the application, were there any conflicts of interest, what scores were given, what did the panel recommend, and why was that recommendation accepted or varied? A well-documented panel process can answer all of these questions. A poorly documented one cannot.

For charitable foundations, the panel process demonstrates to the board, to donors, and to the public that grant decisions are made on merit, consistently, and with independent scrutiny. This is not just a compliance matter — it is what makes a grants programme worth running.

How your grants management system supports panel process

A purpose-built grants management system handles the panel process as a core workflow:

  • Assessor assignment — applications distributed to assessors, with COI-checked exclusions enforced
  • Independent scoring enforced — assessors cannot see each other's scores before submitting their own
  • Score collation and variance analysis — automatic calculation of aggregate scores and variance
  • Panel deliberation documentation — notes recorded against each application in the system
  • Recommendation generation — a ranked list produced from the system, not manually compiled
  • Complete panel record — all scores, declarations, notes, and recommendations retained in the system

A process that runs through email, individual documents, and spreadsheets cannot meet the same documentation standard without significant additional effort — and the documentation will be fragmented, harder to retrieve, and more vulnerable to gaps.


For funders reviewing their assessment panel process, the government grants management solution page explains how Tahua supports compliance-heavy programmes. To see the panel assessment workflow in practice, book a conversation →