A grants management software demo is a controlled environment. The vendor has practised the flow, the demo environment has clean data, and the features being shown are the ones that look best. Your job in a demo is to move beyond the polished sequence and see what the platform actually does in conditions that resemble your real-world programme.
This guide covers the questions that cut through the presentation layer — and the things you should insist on seeing, rather than accepting a description of.
The most important demo preparation is documenting your specific requirements before you start any vendor conversation. Without a requirements document, demos become guided tours of features that may or may not match your needs.
Your requirements document should cover:
- **Programme specifics.** How many grant rounds do you run per year? What is the average number of applications per round? What is the range — smallest and largest round? Do you manage multiple programmes simultaneously with different criteria and forms?
- **Assessment complexity.** Do you use a single-assessor model or a panel? Do assessors score independently before deliberating, or is scoring done collectively? Do you have COI management requirements, and how complex are they?
- **Post-award structure.** What do your milestone schedules look like? Do you use instalments tied to milestones, or lump-sum payments? Do you require verification of milestone completion by independent third parties?
- **Compliance obligations.** What documentation standards apply to your programme — OIA, ANAO, Charity Commission, Freedom of Information? Are you in a jurisdiction with specific data residency requirements?
- **Integration needs.** What systems does grants management need to connect to — financial management (Xero, MYOB, SAP), CRM, applicant database?
Walk into every demo with this document and evaluate what you see against it, rather than letting the vendor define the evaluation criteria.
"Can you show me how a grant administrator sets up a new round from scratch?"
Do not accept a demo of a pre-configured round. Ask to see the configuration process — adding questions to a form, setting up eligibility criteria, configuring the assessment template with weighted criteria. This reveals how much configuration requires IT support versus programme staff self-service.
"Show me the COI declaration and management workflow."
The presentation will describe COI management. What you want to see is the actual workflow: where the assessor declares conflicts, what happens when a conflict is declared (who sees it, what options the convenor has, how the resolution is documented), and what the final COI record looks like in the assessment documentation.
"What does the OIA/FOI response package look like for a specific grant application?"
Ask them to pull the complete decision record for a specific application in their demo environment — all scores, all assessor comments, the panel recommendation, the final decision, and all correspondence. Export it or show how it would be produced for a regulator. If this requires manual assembly from multiple screens, that is important information.
"Show me the post-award dashboard for a programme with 50 active grants."
Ask to see what the portfolio-level view looks like — which grants have outstanding milestones, which ones are overdue, total value by status. If this view does not exist, or if it requires a custom report to produce, that is the operational reality for your programme manager.
"How do I make a configuration change after a round is open?"
Ask what happens when you need to add a question to a form after the application window has opened. Can this be done by a programme coordinator, or does it require a support ticket? What happens to applications already submitted?
"What does the applicant experience look like from a mobile browser?"
Ask to see the application portal on a mobile device. Community organisations and grassroots groups often apply from phones. If the portal is desktop-only in practice, that shapes who can apply to your programme.
"Who is your NZ/AU/UK customer that runs a programme most similar to ours?"
Ask for a specific reference customer, not a case study — someone you can call and ask directly about their experience. Vendors with genuine market presence in your segment will have reference customers. If the reference customers offered are all in a different market or programme type, that is informative.
Watch the loading times. Slow page loads during a demo — when the vendor is controlling the environment — suggest performance issues that will be worse in your production environment with real data volumes.
Watch how errors are handled. Ask the vendor to demonstrate what happens when an applicant submits an incomplete form, or when an assessor attempts to score an application on which they have declared a conflict. Systems that handle error cases gracefully have been built with real-world use in mind.
Watch what requires the vendor. Notice how many times the vendor says "we would configure that for you" versus "you can do that from the admin settings." Every instance of vendor-dependency is an ongoing cost and delay point.
Watch the report builder. Ask to run a custom report — for example, all applications that scored above a threshold on a specific criterion. If this requires the vendor to build a custom report rather than the programme coordinator doing it in self-service, that is ongoing overhead.
Watch the data export. Ask to export all data for a programme — applications, scores, decisions, correspondence — in a standard format (CSV, JSON). Data portability protects you if you ever need to migrate to a different platform.
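One quick way to make the export test concrete is to check the exported file against your own list of must-have fields before you leave the demo. The sketch below does this for a CSV header row; the field names are illustrative assumptions for a decision record, not any vendor's actual schema.

```python
import csv
import io

# Hypothetical checklist: fields a complete decision-record export
# should carry. These names are assumptions for illustration only.
REQUIRED_FIELDS = {
    "application_id",
    "assessor_scores",
    "panel_recommendation",
    "final_decision",
    "coi_declarations",
}

def missing_fields(csv_text: str) -> set[str]:
    """Return the required fields absent from the export's header row."""
    reader = csv.reader(io.StringIO(csv_text))
    header = set(next(reader, []))
    return REQUIRED_FIELDS - header

# A complete export passes; one missing COI and panel columns is flagged.
complete = ("application_id,assessor_scores,panel_recommendation,"
            "final_decision,coi_declarations\n")
partial = "application_id,assessor_scores,final_decision\n"

print(missing_fields(complete))           # prints set()
print(sorted(missing_fields(partial)))
```

Ten lines of script like this, run against a real export during the demo, settles the portability question faster than any slide.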
After the demo, send a simple test: ask the vendor to provide a written answer to the three hardest questions that came up during the demo — the ones where they said "that's a great question, let me check" or redirected to a different feature.
How they respond — and how quickly — tells you something about the support experience you will have when your programme is live and something goes wrong.
When you are ready to evaluate Tahua, **book a conversation** → we will show you exactly how COI management, panel assessment, post-award tracking, and OIA/audit documentation work for a programme like yours.