What Grant Applicants Actually Want — and Why Most Grant Portals Disappoint Them

Grant portals are built from the inside out. The funder knows what information they need, what workflow the application feeds into, and what the assessment panel will ultimately read. The portal collects that information in a sequence that suits the funder's internal process. It is logical — from the inside.

Applicants come from the outside. They do not know your workflow, your assessment criteria, or your panel process. They bring their own time constraints, their own anxiety about eligibility, and often limited experience of grant applications. When a portal is designed without accounting for that, the results are predictable: applicants abandon partway through, submit incomplete responses, or never start at all. The funder sees lower application quality, thinner pools, and more support emails. The root cause is not applicant capability. It is portal design.


The Design Disconnect: Built for Funders, Not for Applicants

The person who configures a grant application form almost always understands the programme objectives, the eligibility criteria, and the assessment framework. What they rarely do is sit down and attempt to complete that form as a first-time applicant from a small community organisation, working on a laptop in a school staffroom between 5pm and 7pm on a Tuesday.

This is a structural problem. The person who knows most about what the form needs to capture is also the person least able to experience it as a stranger. Without deliberate testing with people outside your team, the result is a form that follows the order in which the funder processes information, uses internal terminology applicants do not share, and buries eligibility conditions in programme guidelines rather than presenting them where they would actually help.


The Save-and-Return Problem

Ask any experienced grants manager about applicant complaints and the save-and-return issue will come up within the first three minutes. An applicant starts a form, gets a certain distance through it, and cannot continue that session — their child needs collecting, their shift starts, the school computer lab closes. They return the next day to find their work is gone, or the session has expired, or they cannot locate their draft in the portal.

For a volunteer treasurer completing an application on behalf of a community sports club, or a kaitiaki of a small hapū with no administrative support, losing an hour of work is enough to abandon the application entirely. Grant applications already carry anxiety — the uncertainty about eligibility, the pressure of making a strong case — and discovering that work has been lost compounds that into something closer to demoralisation. Those organisations will not apply again.

Automatic draft saving is a basic expectation for any form applicants will invest meaningful time completing. Save progress continuously, allow return across multiple sessions, surface drafts clearly. And give applicants enough information to plan before they start: if the form takes two hours and requires three documents, say so at the beginning.
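For teams specifying this behaviour, here is a minimal sketch of continuous draft saving in a web form. The endpoint path, the two-second debounce, and the field handling are illustrative assumptions, not a description of any particular portal's API.

```typescript
// Minimal autosave sketch. Assumptions: a draft endpoint of our own invention
// and a form whose fields all carry name attributes.

type Draft = Record<string, string>;

function collectDraft(form: HTMLFormElement): Draft {
  const draft: Draft = {};
  new FormData(form).forEach((value, key) => {
    if (typeof value === "string") draft[key] = value;
  });
  return draft;
}

async function saveDraft(applicationId: string, draft: Draft): Promise<void> {
  // Persist on the server so the applicant can return from a different device.
  await fetch(`/api/applications/${applicationId}/draft`, {
    method: "PUT",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify(draft),
  });
}

function enableAutosave(form: HTMLFormElement, applicationId: string): void {
  let timer: number | undefined;
  form.addEventListener("input", () => {
    // Debounce: save two seconds after the applicant stops typing,
    // rather than on every keystroke.
    window.clearTimeout(timer);
    timer = window.setTimeout(() => {
      void saveDraft(applicationId, collectDraft(form));
    }, 2000);
  });
}
```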


Guidance Versus Forms: The Difference Between a Portal That Helps and One That Simply Asks

There is a fundamental difference between a portal that presents questions and a portal that helps an applicant answer them well.

Most portals are in the first category. They present a field — "Describe the need your project addresses" — and wait. The applicant decides what "describe" means, how long the answer should be, what level of evidence is expected, and whether they are addressing the question the assessor actually wants answered. Different applicants make different decisions. The result is responses that are incomparable and often incomplete.

Better practice is contextual guidance: short explanations, embedded next to or beneath each question, that tell the applicant what the programme is looking for and what a useful answer includes. Not the answer itself, but the framing. "We are looking for evidence that the need exists in your specific community — data, anecdotal evidence, or reference to local research are all acceptable" is guidance. "What need does your project address?" is not.
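One way to keep guidance attached to the question it supports is to carry it in the question definition itself, so it cannot drift out of date in a separate guidance document. A rough sketch, using a made-up schema rather than any real platform's configuration:

```typescript
// Sketch of a question definition that carries its own guidance.
// The field names (guidance, wordLimit) are illustrative, not a schema
// from any particular grants platform.

interface Question {
  id: string;
  label: string;      // what the form shows as the question
  guidance: string;   // what a useful answer includes, shown beside the field
  wordLimit?: number; // tells the applicant how long "describe" should be
}

const needQuestion: Question = {
  id: "project-need",
  label: "What need does your project address?",
  guidance:
    "We are looking for evidence that the need exists in your specific " +
    "community. Data, anecdotal evidence, or reference to local research " +
    "are all acceptable.",
  wordLimit: 300,
};
```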

Guidance has an important secondary effect. It signals to the applicant that the funder has thought about what it is asking for, which creates confidence that the application process is fair. An applicant who understands what each question is trying to elicit is more likely to produce a response that genuinely represents their organisation's capabilities. The quality improvement is mutual.


Eligibility Screening Done Wrong and Right

Screening applicants out before they invest time in a full application is good programme design. It reduces wasted effort on both sides: the applicant who does not qualify does not spend an afternoon on a form, and the funder's team does not process an ineligible submission.

The problem is how most eligibility screening is executed.

Done wrong: a paragraph of eligibility criteria in the programme information document, which the applicant is expected to have read before they start the form. Ineligible applications arrive anyway because criteria were unclear, because the applicant hoped they were close enough, or because they found the form before they found the criteria.

Done right: a structured eligibility check presented as the first step of the application process, before any form fields appear. Each criterion is stated plainly, in active language: "Your organisation must be a registered charitable entity" rather than "Registered charitable status is required." The applicant works through each condition and receives an immediate, clear indication of whether they are eligible to proceed. If they are not, they are told specifically which condition they did not meet — not rejected with a generic message.
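In implementation terms, this is a small amount of structure rather than a large build. A sketch of how the criteria and the "which condition was not met" logic might be represented, with the criteria and wording invented for illustration:

```typescript
// Sketch of a structured eligibility check evaluated before any form fields
// appear. Criteria and messages are illustrative assumptions.

interface Criterion {
  id: string;
  statement: string; // plain, active language shown to the applicant
  ifNotMet: string;  // specific explanation shown when the answer is "no"
}

const criteria: Criterion[] = [
  {
    id: "registered-charity",
    statement: "Your organisation must be a registered charitable entity.",
    ifNotMet:
      "This programme is only open to registered charitable entities. " +
      "You can apply once your registration is confirmed.",
  },
  {
    id: "local-delivery",
    statement: "Your project must be delivered within the programme region.",
    ifNotMet: "This fund only supports projects delivered within the region.",
  },
];

// answers maps criterion id -> the applicant's yes/no response.
function checkEligibility(answers: Record<string, boolean>) {
  const failed = criteria.filter((c) => answers[c.id] !== true);
  return {
    eligible: failed.length === 0,
    // Tell the applicant exactly which conditions were not met,
    // rather than returning a generic "not eligible".
    reasons: failed.map((c) => c.ifNotMet),
  };
}
```

Keeping each criterion's "if not met" explanation alongside the criterion itself is what makes the specific, non-generic decline message possible.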

This matters because the way ineligibility is communicated affects the applicant's relationship with the funder, and with grant applications more broadly. An applicant who receives a clear explanation of why they did not qualify can adjust their situation for future rounds, apply to a more appropriate programme instead, or understand that the exclusion was about programme fit rather than their organisation's merit. An applicant who receives "sorry, you are not eligible" and nothing more walks away with a grievance and no path forward.


The Feedback Gap: What Happens After a Decline

Most applicants who are declined receive a letter or email that tells them they were unsuccessful and thanks them for their application. Most receive nothing else.

This is a missed opportunity at best and a failure of basic respect at worst.

The applicant invested time — sometimes considerable time — in good faith. They submitted something that represented their best attempt to meet the programme's requirements. They deserved to know not just that they were unsuccessful, but why. Was their project outside the programme's focus? Did their proposal not demonstrate impact clearly enough? Were there higher-priority applications in this round? Was it a competitive decision rather than an outright rejection?

These distinctions matter. An organisation that was told their application was strong but not funded in a competitive round will apply again, better prepared. An organisation that was told their project was outside programme scope can redirect their effort. An organisation that received no useful feedback will either apply again with the same proposal, or not apply again at all.

Better practice is structured feedback linked to assessment criteria: a brief summary of how the application performed against each criterion, and where it fell short relative to funded applications. This does not require pages of narrative. It requires a clear assessment framework and the discipline to communicate outcomes in terms of that framework. Some funders worry about the administrative burden of providing this feedback. The burden is lower than it appears when the assessment framework is designed to generate it, and the downstream benefits — improved application quality, stronger sector relationships, reduced repeat-submission processing — outweigh the cost.
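As a sketch of how little machinery this needs, assuming the panel already records a score per criterion: the field names, the optional panel note, and the comparison against the funded median are illustrative assumptions, not a prescribed method.

```typescript
// Sketch: assembling criterion-level feedback from scores the panel already
// records. Criterion names and the funded-median comparison are illustrative.

interface CriterionScore {
  criterion: string;    // e.g. "Evidence of community need"
  score: number;        // this application's score
  fundedMedian: number; // median score among funded applications this round
  panelNote?: string;   // one optional sentence from the assessor
}

function feedbackSummary(scores: CriterionScore[]): string[] {
  return scores.map((s) => {
    const gap =
      s.score >= s.fundedMedian
        ? "in line with funded applications"
        : "below funded applications in this round";
    const note = s.panelNote ? ` ${s.panelNote}` : "";
    return `${s.criterion}: ${gap}.${note}`;
  });
}
```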


Document Upload: The Specificity Problem

"Supporting documentation" is not a useful label.

When a grant form presents a document upload field labelled "supporting documentation" or "attachments," applicants make wildly different decisions about what to include. Some upload a financial report, a letter of support, and a photograph. Some upload a 40-page strategic plan. Some upload nothing because they are not sure what is expected. None of them are wrong, given what the form told them.

Document upload fields should be specific: what document is required, what format it should be in, what the maximum file size is, and why it is needed. "Audited financial statements for the most recent completed financial year (PDF, max 5MB)" is a useful instruction. "Financial information" is not.
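A sketch of what that specificity looks like once it reaches the form itself, pairing the requirement with a basic check of file type and size. The labels and the 5MB limit mirror the example above; adjust them to your own requirements.

```typescript
// Sketch of a specific document upload requirement and a simple client-side
// check against it. Labels, formats, and limits are illustrative.

interface DocumentRequirement {
  label: string;           // exactly what is required, and why
  acceptedTypes: string[]; // MIME types the portal will accept
  maxBytes: number;
}

const financialStatements: DocumentRequirement = {
  label:
    "Audited financial statements for the most recent completed financial " +
    "year (PDF, max 5MB). Used to confirm your organisation's financial position.",
  acceptedTypes: ["application/pdf"],
  maxBytes: 5 * 1024 * 1024,
};

function validateUpload(file: File, req: DocumentRequirement): string | null {
  if (!req.acceptedTypes.includes(file.type)) {
    return "Please upload a PDF document.";
  }
  if (file.size > req.maxBytes) {
    return "This file is larger than 5MB. Please upload a smaller file.";
  }
  return null; // null means the file passes both checks
}
```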

Beyond labelling, the list of required documents should be communicated before the applicant starts the form. Nothing is more disheartening than completing a lengthy application and discovering on the final page that you need a document you do not have to hand. Applicant-oriented portals present required documents in the eligibility check or the programme overview, so applicants can gather what they need before they begin.


Three Things a Funder Can Do Today Without New Software

Not every improvement requires a new platform. Three changes that can be made inside most existing portals without a system rebuild:

Add plain-language guidance to your top five most-abandoned questions. Pull your drop-off data (most portals have it, even if no one looks at it) and identify where applicants are stopping. Write two or three sentences of contextual guidance for each of those questions. Publish a revised form before the next round.

Move your eligibility criteria to the front of the process. If your current portal presents eligibility in a document link before the form, restructure your programme information page so the eligibility conditions are the first thing an applicant reads — in plain language, in a checklist format. You can do this in your website and your email communications without changing the form itself.

Add a submission summary to your application confirmation email. The moment after an applicant submits is when they are most anxious about whether they have done it correctly. An automated confirmation email that lists exactly what they submitted, what happens next, and when they will hear back costs almost nothing to configure and eliminates a significant proportion of the "did you receive my application?" queries that follow every round.
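As an illustration of how little this involves, a sketch of the content such a confirmation email might assemble from the submission record. The field names and wording are invented for the example, not drawn from any specific system.

```typescript
// Sketch of the content an automated confirmation email might assemble from
// the submission record. Field names and wording are illustrative.

interface Submission {
  applicantName: string;
  programme: string;
  submittedAt: Date;
  documentsReceived: string[];
  decisionDate: string; // e.g. "the week of 14 July"
}

function confirmationEmail(s: Submission): string {
  return [
    `Dear ${s.applicantName},`,
    ``,
    `We have received your application to ${s.programme}, submitted on ` +
      `${s.submittedAt.toLocaleDateString()}.`,
    ``,
    `Documents received: ${s.documentsReceived.join(", ")}.`,
    ``,
    `What happens next: your application will be checked for completeness, ` +
      `then assessed by the panel. You will hear from us by ${s.decisionDate}.`,
  ].join("\n");
}
```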


The quality of your funded portfolio is shaped by the quality of your portal long before any assessment happens. Applicants who understood what you were asking for, felt confident enough to apply, and had the tools to complete the form properly produce better applications. That is a design outcome — and it is within your control.

See what an applicant-centred grants portal looks like in practice — book a walkthrough.