MSP Solutions

How MSPs Should Structure Client UAT Projects

LogicHive Team · 9 min read
80% of UAT effort is repeatable across client projects
3x faster UAT setup with reusable test libraries
0: the number of MSPs that should be winging it

Here's a scene that plays out in MSPs every week. A senior consultant kicks off a new UAT phase using the spreadsheet template they've refined over six projects. Down the corridor, a junior consultant copies a folder from the last project they worked on and renames it. On a third project, someone starts from scratch because they couldn't find the template. The client on each project gets a completely different experience — different formats, different levels of rigour, and different quality of reporting.

This is not a people problem. It's a process problem. And if your MSP is delivering more than a handful of projects a year, it's a problem that costs you real money — in rework, in scope creep, in clients who lose confidence in your delivery capability. This guide is about fixing it.

1. The Problem with Ad-Hoc UAT

Most MSPs do not have a UAT problem — they have a consistency problem. Each consultant has their own approach, their own templates, their own way of explaining things to clients. When it works, it works because that particular consultant is experienced and diligent. When it doesn't work, the MSP has no way of understanding why, because there was no standard to deviate from in the first place.

The symptoms are predictable:

  • Inconsistent client experience — one client gets a beautifully structured UAT phase with daily status reports and clear sign-off criteria. The next client gets a shared spreadsheet and a verbal "are we happy?" at the end
  • Knowledge loss when consultants leave — their templates and processes walk out the door with them because nothing was ever standardised
  • Inability to price UAT accurately — if every project does it differently, you have no baseline data on how long UAT actually takes. So you guess, and you usually guess low
  • Quality depends on who's available — your senior consultants run tight UAT processes. Your juniors struggle. Clients notice the difference, and some get a materially worse service
  • No ability to scale — you cannot grow your delivery capacity if every project requires a consultant to design the UAT approach from scratch. As covered in our MSP UAT guide, scalability demands repeatable processes

The Hard Truth:

If you asked three of your consultants to run UAT on the same project independently, you would get three different processes, three different test plan formats, and three different client experiences. That is not a practice — it is a collection of individuals doing their own thing.

2. Phase 1: Define Your Standard UAT Framework

The first step is to define what UAT looks like at your MSP — not on a specific project, but as a standard methodology that every project follows. This does not mean removing all flexibility. It means establishing the non-negotiable structure within which consultants can apply their expertise.

Standard Phases

Every UAT project should follow the same phases, regardless of platform, client size, or consultant preference:

  • Preparation — environment readiness, test data setup, tester identification and training. This phase has clear entry criteria: the system must be stable, data migration must be complete, and the client team must understand what they are about to do
  • Test case creation — building or adapting test cases from your library, reviewed and approved by the client before execution begins. See our guide on how to write effective test cases for the detail
  • Execution — the actual testing period, with daily status tracking and structured defect logging. Every tester knows what they are testing, how to record results, and where to log issues
  • Defect resolution — triaging defects, prioritising fixes, retesting resolved issues. This phase runs in parallel with execution but has its own rhythm and governance
  • Sign-off — formal acceptance based on predefined criteria. Not a verbal agreement. Not an email. A documented sign-off that your client and your MSP can both point to six months later

Standard Roles

Define who does what on every project. The titles might vary, but the responsibilities should not:

  • Test lead (MSP side) — owns the UAT plan, manages the schedule, runs daily standups, produces status reports. This is typically the lead consultant on the project
  • Testers (client side) — execute test cases, record results, log defects with evidence. These are the business users who will use the system day-to-day
  • Client approver — the person with authority to sign off UAT completion. Ideally a project sponsor or senior stakeholder, not the same person running the tests
  • Technical support (MSP side) — available to investigate and fix defects during the execution phase. Not the same person as the test lead, if you can help it

Standard Deliverables

Every project produces the same set of documents. No surprises, no gaps:

  • UAT plan — scope, timeline, roles, entry/exit criteria, risk register
  • Test cases — structured, numbered, traceable to requirements
  • Daily status report — tests executed, pass/fail rates, open defects, blockers
  • Defect log — every issue recorded with severity, evidence, and resolution status
  • Sign-off document — formal record of acceptance, outstanding items, and any conditions

Practical Tip:

Write your standard framework down. Put it in a shared location. Make it the first thing new consultants read when they join. If it only exists in someone's head, it is not a standard — it is a preference.

3. Phase 2: Build Reusable Test Libraries

This is where most MSPs leave enormous value on the table. Every Business Central finance migration has the same core processes: general ledger posting, accounts payable, accounts receivable, bank reconciliation, VAT reporting, month-end close. Whether the client is a manufacturer, a distributor, or a professional services firm, those core processes are fundamentally the same.

Yet most MSPs write test cases from scratch on every single project. It is an extraordinary waste of time and expertise.

How to Build Your Library

  • Start with your most common platform — if you deliver mostly Business Central projects, build your BC library first. Dynamics 365 F&O, NetSuite, or SAP can come later
  • Organise by module and process — not by client. Your finance module library should contain test cases for GL posting, AP, AR, bank rec, fixed assets, and reporting. Each test case should be generic enough to reuse but detailed enough to be useful
  • Include expected results — a test case that says "post a purchase invoice" is not useful. One that says "post a purchase invoice for 1,000 GBP plus VAT, verify vendor ledger entry shows 1,200 GBP, GL entries show 1,000 GBP to expense account and 200 GBP to VAT account" actually tells the tester what to check
  • Tag for customisation points — mark which elements of each test case will need to be adapted per client (account numbers, dimension values, approval thresholds) so consultants know what to change and what to leave alone
  • Version and maintain — treat your test library like a product. Update it when platforms release new features, when you discover gaps on a project, or when a consultant improves a test case. Learn more about structuring test cases in our test case writing guide
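A reusable test case with tagged customisation points can be represented as structured data. The sketch below is hypothetical, not LogicHive's format: the field names, the account numbers, and the `instantiate` helper are all illustrative assumptions.

```python
# A hypothetical reusable test case: generic steps, with customisation
# points tagged so consultants know what to change per client.
TEST_CASE = {
    "id": "FIN-AP-003",
    "module": "Finance / Accounts Payable",
    "title": "Post a purchase invoice with VAT",
    "steps": [
        "Create a purchase invoice for {vendor} of 1,000 GBP plus VAT",
        "Post the invoice",
        "Verify vendor ledger entry shows 1,200 GBP",
        "Verify GL entries: 1,000 GBP to {expense_account}, 200 GBP to {vat_account}",
    ],
    # Customisation points: adapt these per client, leave the rest alone.
    "customise": {"vendor", "expense_account", "vat_account"},
}

def instantiate(case: dict, client_values: dict) -> list[str]:
    """Fill a template test case with client-specific values."""
    missing = case["customise"] - client_values.keys()
    if missing:
        raise ValueError(f"Missing client values: {sorted(missing)}")
    return [step.format(**client_values) for step in case["steps"]]

steps = instantiate(TEST_CASE, {
    "vendor": "Acme Supplies Ltd",
    "expense_account": "7100",
    "vat_account": "5630",
})
```

Whatever form your library takes, the point is the same: the generic eighty percent is written once, and only the tagged values change from client to client.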

A well-maintained test library means a consultant starting a new project can have eighty percent of their test cases ready within hours, not days. The remaining twenty percent — custom integrations, bespoke workflows, industry-specific processes — is where their expertise adds real value.

Built for MSPs Managing Multiple Clients

Standardise Your UAT Process with LogicHive

Reusable test libraries, consistent reporting across clients, and structured sign-off — all from a platform designed for MSPs.

Start Free Trial

4. Phase 3: Standardise Client Onboarding

The client's first experience of UAT sets the tone for the entire phase. Get it wrong and you will spend the next three weeks answering the same basic questions, chasing testers who do not understand what they are supposed to do, and untangling defect reports that say "it doesn't work" with no further detail.

Most MSPs explain UAT from scratch on every project. The lead consultant stands up in a meeting, gives their personal interpretation of what UAT is, and hopes the client understood. There is a better way.

Your Standard Kickoff Pack Should Include

  • What UAT is (and what it is not) — UAT is the client validating that the system supports their business processes. It is not the MSP demonstrating the system. It is not training. It is not an opportunity to add new requirements. Be explicit about this. Reference our complete guide to UAT for the fundamentals
  • What the client is responsible for — providing testers, making them available for the agreed period, reviewing and approving test cases, logging results honestly, and signing off when criteria are met
  • How to execute a test case — step-by-step guide with screenshots. What a pass looks like. What a fail looks like. Where to record the result. How to capture evidence
  • How to log a defect — what information is needed (steps to reproduce, expected vs actual result, screenshots), where to log it, and what happens after they log it
  • How sign-off works — what criteria must be met, who signs off, what "conditional sign-off" means, and what happens to outstanding defects after sign-off
  • The schedule — when testing starts, daily standup times, when status reports come out, when the phase is expected to end

Practical Tip:

Record a short video walkthrough of your kickoff pack. New testers who join mid-way through the UAT phase can watch it on-demand rather than needing someone to re-explain everything. It takes thirty minutes to record and saves hours across every project.

5. Phase 4: Implement Consistent Reporting

Reporting is where the gap between your best and worst consultants becomes most visible to the client. Your senior consultant produces a polished daily status report with clear metrics, trend analysis, and an honest assessment of go-live readiness. Your junior consultant sends a bullet-point email saying "testing is going well, a few issues found."

The client paying the same day rate for both consultants deserves the same quality of reporting. Here is what every client should receive, regardless of who is running their project:

Daily Status Report

  • Test execution progress — total test cases, executed, passed, failed, blocked, not started. Shown as numbers and percentages
  • Defect summary — total open defects by severity, new defects raised today, defects resolved today, defects awaiting retest
  • Blockers and risks — anything preventing testing from progressing, and any risks to the timeline
  • Plan for tomorrow — what will be tested next, any dependencies or prerequisites
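The numbers in a daily status report can be derived mechanically from raw execution records, which is one reason consistent reporting is cheap once the data is captured in one place. A minimal sketch, assuming a simple list of result records (the field names and status values are illustrative):

```python
from collections import Counter

def daily_status(results: list[dict]) -> dict:
    """Summarise test execution for a daily status report.

    Each record looks like {"test_id": ..., "status": ...} where status
    is one of: "passed", "failed", "blocked", "not_started".
    """
    counts = Counter(r["status"] for r in results)
    executed = counts["passed"] + counts["failed"]
    return {
        "total": len(results),
        "executed": executed,
        "passed": counts["passed"],
        "failed": counts["failed"],
        "blocked": counts["blocked"],
        "not_started": counts["not_started"],
        # Pass rate as a percentage of executed tests.
        "pass_rate_pct": round(100 * counts["passed"] / executed, 1) if executed else 0.0,
    }

report = daily_status([
    {"test_id": "FIN-001", "status": "passed"},
    {"test_id": "FIN-002", "status": "passed"},
    {"test_id": "FIN-003", "status": "failed"},
    {"test_id": "FIN-004", "status": "blocked"},
    {"test_id": "FIN-005", "status": "not_started"},
])
```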

Go-Live Readiness Dashboard

As you approach the end of UAT, the client needs a clear, honest view of whether the system is ready for go-live. This is not a subjective judgement call — it is a dashboard showing measurable criteria. Managing this across multiple ERP projects simultaneously makes standardised reporting even more critical.

  • Test completion rate — percentage of planned test cases executed and passed
  • Critical defect status — all P1 and P2 defects resolved and retested
  • Sign-off status — which modules or processes have been signed off, which are outstanding
  • Outstanding risks — any open items that could affect go-live

Why This Matters Commercially:

Consistent, professional reporting builds client confidence in your MSP. It differentiates you from competitors who treat UAT as an afterthought. And it protects you — if a client complains about go-live quality six months later, you have documented evidence of what was tested, what was found, and what was signed off.

6. Phase 5: Build in Quality Gates

Quality gates are the checkpoints that prevent projects from lurching forward before they are ready. Without them, UAT becomes a formality that everyone rushes through because the go-live date is fixed and no one wants to be the person who says "we are not ready."

Entry Criteria: "Ready for UAT"

UAT should not start until these conditions are met. Make them non-negotiable:

  • Environment is stable — no ongoing development or configuration changes during UAT. The system under test is the system that will go live
  • Data migration is complete — master data and opening balances have been loaded and verified. Testers are working with real data, not placeholders
  • SIT is complete — system integration testing has been signed off. The MSP is confident the system works technically before asking the client to validate it from a business perspective
  • Testers are identified and briefed — the client knows who is testing, they have been through the kickoff, and they have time allocated in their diaries
  • Test cases are reviewed and approved — the client has seen the test cases and confirmed they cover the right processes

Exit Criteria: "UAT Complete"

Equally important — define what "done" looks like before you start:

  • All planned test cases executed — not "most of them." If test cases were descoped, that decision was documented and agreed
  • Pass rate meets threshold — typically ninety-five percent or above for core business processes
  • All P1 defects resolved and retested — no critical defects remain open at sign-off
  • All P2 defects resolved or have an agreed remediation plan — high-severity defects either fixed or formally accepted with a workaround and a fix date
  • Formal sign-off obtained — the designated client approver has signed the acceptance document. Our guide on UAT best practices for MSPs covers sign-off governance in detail
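Exit criteria like these are mechanical enough to check programmatically. As a sketch, assuming the thresholds above (ninety-five percent pass rate, no open P1s, P2s resolved or on an agreed plan); the function and its inputs are hypothetical, not a real platform API:

```python
def uat_exit_check(pass_rate: float, open_defects: list[dict],
                   descoped_agreed: bool, signed_off: bool) -> list[str]:
    """Return the unmet exit criteria (an empty list means UAT is complete).

    Defect records look like {"severity": "P1".."P4", "has_agreed_plan": bool}.
    """
    failures = []
    if pass_rate < 95.0:
        failures.append(f"Pass rate {pass_rate}% below 95% threshold")
    if any(d["severity"] == "P1" for d in open_defects):
        failures.append("Open P1 defects remain")
    if any(d["severity"] == "P2" and not d["has_agreed_plan"] for d in open_defects):
        failures.append("Open P2 defects without an agreed remediation plan")
    if not descoped_agreed:
        failures.append("Descoped test cases not documented and agreed")
    if not signed_off:
        failures.append("Formal sign-off not obtained")
    return failures

unmet = uat_exit_check(
    pass_rate=96.2,
    open_defects=[{"severity": "P2", "has_agreed_plan": True}],
    descoped_agreed=True,
    signed_off=False,
)
```

A check like this keeps the gate objective: the project is done when the list is empty, not when the calendar says so.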

Non-Negotiable:

Quality gates only work if you are prepared to enforce them. If you routinely waive entry criteria because the project is behind schedule, you do not have quality gates — you have aspirations. The whole point is that they hold, especially under pressure.

7. Common Mistakes MSPs Make

Having worked with MSPs at various stages of maturity, we see the same patterns hold back even capable organisations. If you recognise yourself in this list, you are in good company — and you are also leaving money and reputation on the table.

  • Letting consultants freelance the process — "We hire experienced people and let them do their thing." That works until your experienced person leaves, or takes holiday during a critical project, or delivers a substandard UAT and the client blames your MSP, not the individual
  • Refusing to invest in templates because "every client is different" — they are not. Eighty percent of a BC finance implementation is the same whether your client makes widgets or sells consulting services. The twenty percent that differs is where consultant expertise adds value. The other eighty percent should be templated
  • Skipping client onboarding — assuming the client understands UAT because you explained it once in a steering committee meeting three months ago. They do not remember. They have day jobs. Brief them properly at the start of UAT, not before
  • Not tracking time spent on UAT — if you do not know how many hours UAT consumes across your projects, you cannot price it accurately in future proposals. Track it. You will almost certainly discover you are under-pricing it. Explore how to sell UAT as a value-added service rather than burying it in project costs
  • Treating UAT as the client's problem — yes, the client executes the tests. But the MSP owns the process, the tooling, the reporting, and the quality gates. If UAT fails, it is your go-live that slips and your reputation that suffers
  • No post-UAT retrospective — every UAT phase generates lessons. Which test cases were missing? Where did the client struggle? What took longer than expected? Capture these and feed them back into your framework. If you are managing testing across multiple clients, these insights compound rapidly

Frequently Asked Questions

How long does it take an MSP to build a standard UAT framework?

Most MSPs can build a solid initial framework in four to six weeks if they dedicate time to it. This includes defining standard phases, creating template test cases for your most common platforms, and building a client onboarding pack. The key is to start with your most frequently delivered project type rather than trying to cover everything at once. Refine it over three or four projects and you will have something robust.

Should every client get the same UAT process?

The framework should be the same, but the detail will vary. Think of it as eighty percent standard, twenty percent customised. Every client should go through the same phases, use the same reporting formats, and meet the same quality gates. But the specific test cases, the number of testers, and the timeline will be tailored to the project scope. The mistake is thinking every client is so different that you need to start from scratch each time.

How do MSPs handle UAT when the client has no testing experience?

This is exactly why a standard client onboarding pack matters. Most clients have never done formal UAT before. Your onboarding pack should explain what UAT is in plain language, what the client is responsible for, how to log test results and defects, and what sign-off means. A thirty-minute kickoff session walking through this pack will save you hours of confusion later. Do not assume the client knows what you mean by "test case," "defect," or "acceptance criteria."

What tools should MSPs use to manage client UAT?

At minimum, you need something that gives you centralised test case management, real-time progress tracking, defect logging with evidence capture, and structured sign-off. Spreadsheets work for one project but fall apart when you are managing multiple clients simultaneously. A purpose-built UAT platform lets you standardise your process, maintain test libraries across clients, and give every client the same quality of reporting regardless of which consultant is running the project.

How should MSPs price UAT in their project proposals?

You cannot price UAT properly unless you track how long it actually takes. Most MSPs underestimate UAT effort because they have never measured it. Start tracking time spent on UAT across projects — preparation, test case creation, client support during execution, defect triage, and sign-off. After three or four projects, you will have real data to inform your estimates. As a rough guide, UAT typically consumes fifteen to twenty percent of total project effort for a mid-market ERP implementation.
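The fifteen-to-twenty-percent rule of thumb translates into a simple estimate range, useful as a starting point until your own tracked data replaces it. A trivial sketch (the function name and defaults are illustrative):

```python
def uat_effort_estimate(total_project_days: float,
                        low_pct: float = 15.0,
                        high_pct: float = 20.0) -> tuple[float, float]:
    """Rough UAT effort range using the 15-20% rule of thumb
    for a mid-market ERP implementation."""
    return (total_project_days * low_pct / 100,
            total_project_days * high_pct / 100)

# A 200-day implementation suggests budgeting roughly 30 to 40 days for UAT.
low, high = uat_effort_estimate(200)
```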

Free Forever Plan Available

Build Your MSP's UAT Practice on Solid Foundations

LogicHive gives MSPs reusable test libraries, consistent client reporting, and structured sign-off across every project — so your UAT quality scales with your business.

No credit card required