NZ Government (AoG) Standards
Every public service agency in Aotearoa must meet the All-of-Government web standards. If you can test for them — in an AI-assisted workflow — you are the most employable QA in the country.
1 The Hook
It is April 2026. A Crown entity in Wellington launches a new public-facing portal for citizens to apply for a government service. Six weeks in, a complaint lands with the Human Rights Commission: the portal's CAPTCHA blocks blind users, the form labels vanish on mobile, and the PDF receipts are not tagged. The portal is technically "WCAG 2.1 AA" compliant, but the standard moved to WCAG 2.2 AA on 17 March 2025, and the agency never re-audited.
The Government Chief Digital Officer (GCDO) requires the agency to remediate. A contractor is brought in at $180/hr. The fix takes three months. The project sponsor is asked pointed questions by the Minister’s office. The internal QA team is told: "This should have been caught before go-live."
What went wrong? The testers tested for accessibility in the abstract. They never learned the specific four AoG standards that apply to NZ government web products, the 2025 transition to WCAG 2.2, or how to evidence compliance in a way that survives an audit.
This is the gap this page fills. If you want to work on government contracts — and in NZ, a huge portion of tech spend is government — you need to know these standards cold.
2 The Rule
AoG testing is compliance testing. The standard tells you what to test, what "pass" looks like, and what evidence you must produce. Your job is to run the checks, produce the evidence, and flag gaps before the Ombudsman does.
"All-of-Government" (AoG) means the standards apply uniformly across every public service department and Crown entity in the Executive branch, and, by direction of the Government Chief Digital Officer (GCDO), to many other state sector agencies as well. They are not guidelines; they are mandatory requirements set and enforced by the GCDO under its digital system leadership mandate.
3 The Analogy
AoG standards are the building code for government websites.
When you build a house in NZ, the Building Code tells you exactly what the handrails, stair treads, and fire egress must look like. A Building Warrant of Fitness proves you met the code. You do not get to argue about whether the rules are a good idea — you meet them, you document it, you move on. AoG standards work the same way. The Web Accessibility Standard is the "stairs and handrails" of your site. The Web Usability Standard is "signage and egress." The Digital Service Design Standard is "the inspector’s checklist." Miss any of them and you cannot get occupancy.
4 The Four AoG Standards You Must Know
The NZ Government Web Standards live at digital.govt.nz. Four matter to testers:
| Standard | What it mandates | In force |
|---|---|---|
| Web Accessibility Standard 1.2 | Every web page must conform to WCAG 2.2 Level AA. Replaces the 1.1 version (WCAG 2.1) in March 2025. | Effective 17 Mar 2025 |
| Web Usability Standard 1.4 | Consistent global elements: govt.nz link on home page, standard privacy policy structure, accessible copyright statement, meaningful link text. | Effective 17 Mar 2025 (replaced 1.3) |
| Digital Service Design Standard | Ten-point design principles for government digital services: user-centred, inclusive, accessible, safe, secure, outcome-focused. | Current |
| NZISM (Information Security Manual) | Baseline security controls for government systems: authentication, identity, logging, supply chain, zero trust. | Continuously updated by GCSB/NCSC |
WCAG 2.2 AA — What changed from 2.1
The 2.2 update removed one obsolete criterion (4.1.1 Parsing) and added nine new success criteria, six of them at Level A or AA. If your last a11y training was pre-2025, you are missing these:
- 2.4.11 Focus Not Obscured (Minimum, AA): When an element has keyboard focus, it cannot be entirely hidden by a sticky header, cookie banner, or floating widget.
- 2.4.12 Focus Not Obscured (Enhanced, AAA): No part of the focus indicator may be hidden.
- 2.4.13 Focus Appearance (AAA): Focus indicators must have minimum size and contrast.
- 2.5.7 Dragging Movements (AA): Any drag-and-drop interaction must have a single-pointer alternative (users with motor impairments cannot drag).
- 2.5.8 Target Size (Minimum, AA): Touch targets must be at least 24×24 CSS pixels (with exceptions for inline text links).
- 3.2.6 Consistent Help (A): If help is offered on one page (chat, contact link, help page), it must appear in the same relative location on all pages.
- 3.3.7 Redundant Entry (A): Users must not be asked to re-enter information previously entered in the same session, unless essential.
- 3.3.8 Accessible Authentication (Minimum, AA): Cognitive function tests (remember a password, solve a puzzle, identify images) must have an alternative.
- 3.3.9 Accessible Authentication (Enhanced, AAA): No cognitive function tests at all.
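The target-size criterion is the easiest of these to check programmatically. A minimal sketch: given element bounding boxes exported from the browser (for example via `getBoundingClientRect()` in the console), flag anything under 24×24 CSS pixels. The data shape here is an assumption for illustration, not a real tool's export format.

```python
MIN_TARGET = 24  # CSS pixels, per SC 2.5.8 Target Size (Minimum)

def undersized_targets(targets):
    """Return target records whose width or height is below 24 CSS px."""
    return [t for t in targets
            if t["width"] < MIN_TARGET or t["height"] < MIN_TARGET]

# Illustrative data only -- the selector/width/height shape is assumed,
# e.g. dumped from the browser console with getBoundingClientRect().
sample = [
    {"selector": "button.submit", "width": 44, "height": 44},
    {"selector": "a.icon-close", "width": 16, "height": 16},
]
failures = undersized_targets(sample)  # the 16x16 close icon fails
```

Note the exceptions in the real criterion (inline links, equivalent controls elsewhere on the page); a script like this gives you candidates to inspect, not verdicts.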
Interview gold: Nine out of ten candidates will name "WCAG 2.1 AA" when asked what the NZ standard requires. Say "WCAG 2.2 AA under Web Accessibility Standard 1.2, since 17 March 2025" — you will visibly surprise the interviewer.
5 How to Test Each Standard
Web Accessibility Standard 1.2 (WCAG 2.2 AA)
- Automated pass: Run `axe DevTools` and `Lighthouse` on every key page. Save the JSON/HTML reports as evidence.
- Keyboard-only pass: Unplug the mouse. Tab through every page. Confirm focus is never obscured by sticky headers (2.4.11).
- Screen reader pass: NVDA (Windows, free) + JAWS (enterprise, paid, dominant in NZ gov). Confirm form labels, landmarks, live regions.
- Target size pass: Inspect touch targets on mobile. Measure in CSS pixels. 24×24 minimum (2.5.8).
- Drag alternative pass: Any drag UI (file upload, slider, reorder) must work with click/keyboard (2.5.7).
- Authentication pass: Login, MFA, CAPTCHA must not require solving puzzles or recognising images without alternative (3.3.8).
- Redundant entry pass: Run a full application workflow. Are you asked to re-enter name, address, email? That fails 3.3.7.
- Consistent help pass: Is the "Contact us" or "Help" link in the same location on every page? (3.2.6)
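The automated-pass evidence does not have to be raw JSON dumps. A small sketch of condensing an axe-core report into evidence-log rows, assuming the standard axe result shape (a top-level `violations` list whose entries carry `id`, `impact`, and affected `nodes`):

```python
def summarise_axe(report):
    """Condense an axe-core JSON report into evidence-log rows.

    Assumes the standard axe-core result shape: a top-level "violations"
    list whose entries carry "id", "impact", and the affected "nodes".
    """
    return [{"rule": v["id"],
             "impact": v.get("impact") or "n/a",
             "instances": len(v.get("nodes", []))}
            for v in report.get("violations", [])]

# Minimal synthetic report in the axe result shape (not real scan output):
report = {"violations": [
    {"id": "label", "impact": "critical", "nodes": [{}, {}]},
    {"id": "color-contrast", "impact": "serious", "nodes": [{}]},
]}
rows = summarise_axe(report)
```

Keep the raw JSON as well; the summary is for the compliance matrix, the raw file is the audit artefact.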
Web Usability Standard 1.4
- Home page displays the agency name and/or logo
- Home page has a visible link to govt.nz
- Copyright statement is present and linked from the home page
- Privacy statement follows the standard NZ structure: what is collected, why, which Information Privacy Principles (IPPs) apply, how to request correction, and how to complain to the Office of the Privacy Commissioner (OPC)
- Link text is meaningful out of context ("Apply for KiwiSaver" not "click here")
- Breadcrumbs on pages more than two levels deep
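Two of these usability checks (the govt.nz link and vague link text) are mechanical enough to script. A sketch using only the standard library's `html.parser`, covering just those two checks, not the full standard:

```python
from html.parser import HTMLParser

# A few common offenders; extend to taste.
VAGUE_LINK_TEXT = {"click here", "read more", "more", "here"}

class HomePageAudit(HTMLParser):
    """Surface checks for two Web Usability Standard items on a home page:
    a visible link to govt.nz and no vague link text. A sketch only --
    the real standard has more checks than this."""

    def __init__(self):
        super().__init__()
        self.has_govtnz_link = False
        self.vague_links = []
        self._in_link = False
        self._href = ""
        self._text = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            self._in_link = True
            self._href = dict(attrs).get("href", "")
            self._text = []

    def handle_data(self, data):
        if self._in_link:
            self._text.append(data)

    def handle_endtag(self, tag):
        if tag == "a" and self._in_link:
            text = "".join(self._text).strip().lower()
            if "www.govt.nz" in self._href:
                self.has_govtnz_link = True
            if text in VAGUE_LINK_TEXT:
                self.vague_links.append(self._href)
            self._in_link = False
```

Feed it the home page HTML and read the two attributes; the rest of the checklist (copyright, privacy structure, breadcrumbs) still needs human eyes.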
Digital Service Design Standard
This is a design-process standard, not a page-level check. As a tester you verify the evidence the product team produced:
- User research was done (interviews, personas, pain points documented)
- Service was tested end-to-end with real users, including users with disabilities and users with low digital confidence
- Success metrics are defined and measurable (task completion rate, time on task)
- The service has a plain-English name and purpose statement
- Māori language support where relevant (te reo labels, bilingual content)
NZISM alignment
Full NZISM testing is the domain of the security testing page. From an AoG compliance angle, verify:
- TLS 1.2+ enforced on all pages (NZISM crypto baseline)
- Session timeout and reauthentication align with the information's classification (IN-CONFIDENCE, SENSITIVE, RESTRICTED)
- Authentication flows support RealMe or equivalent federated identity where applicable
- Audit logging captures access to personal information (required by both NZISM and Privacy Act)
- Supply chain: all third-party scripts and CDNs are documented and risk-assessed
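The supply-chain item can be partly automated: inventory every external script host on a page and diff it against the agency's documented supplier list. A sketch with the standard library; the allowlist is a made-up example:

```python
from html.parser import HTMLParser
from urllib.parse import urlparse

class ScriptInventory(HTMLParser):
    """Collect external <script src> hosts so they can be checked against
    the agency's documented, risk-assessed supplier list."""

    def __init__(self):
        super().__init__()
        self.hosts = set()

    def handle_starttag(self, tag, attrs):
        if tag == "script":
            src = dict(attrs).get("src", "")
            host = urlparse(src).netloc
            if host:  # relative srcs (same-origin) have no netloc
                self.hosts.add(host)

APPROVED = {"static.example.govt.nz"}  # hypothetical approved CDN

def unapproved_hosts(html):
    inv = ScriptInventory()
    inv.feed(html)
    return inv.hosts - APPROVED
```

Anything this turns up goes to the security team for a risk assessment, not straight into a bug report.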
6 AoG Compliance in an AI-Assisted Context
This is the part nobody else teaches. When your team uses AI to generate UI, copy, or test scripts, the AI does not know about AoG standards unless you tell it. You are the human-in-the-loop who enforces compliance.
When asking AI to generate a form, include the relevant AoG criteria in the prompt:
```
Generate a KiwiSaver enrolment form. Must comply with:
- WCAG 2.2 AA (Web Accessibility Standard 1.2)
- Touch targets >= 24x24 CSS pixels (SC 2.5.8)
- All inputs have associated <label> (not placeholder)
- Error messages linked via aria-describedby
- No drag-only interactions (SC 2.5.7)
- No cognitive-test CAPTCHA (SC 3.3.8)
- Privacy collection notice per NZ Privacy Act IPP 3
```
AI will happily produce code that looks accessible but fails 2.5.8 because its default button padding is 16×16. It will invent ARIA attributes that do not exist. It will generate "friendly" error messages with colour-only differentiation. Your job is to run the automated suite and the manual screen-reader pass on every AI-produced artifact, no exceptions.
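One of those prompt requirements, labels on every input, is cheap to verify before the manual pass. A sketch that checks generated form markup for inputs without a matching `<label for=...>`; it deliberately ignores `aria-label`, `aria-labelledby`, and wrapping labels, so treat its output as candidates to inspect:

```python
from html.parser import HTMLParser

class LabelCheck(HTMLParser):
    """Record input ids and label for= targets from generated form markup."""

    def __init__(self):
        super().__init__()
        self.input_ids = []      # ids of inputs that need labels
        self.label_fors = set()  # for= targets of labels

    def handle_starttag(self, tag, attrs):
        a = dict(attrs)
        if tag == "input" and a.get("type") != "hidden":
            self.input_ids.append(a.get("id"))
        elif tag == "label" and "for" in a:
            self.label_fors.add(a["for"])

def unlabelled_inputs(html):
    """Return ids of visible inputs with no <label for=...> (None = no id)."""
    c = LabelCheck()
    c.feed(html)
    return [i for i in c.input_ids if i is None or i not in c.label_fors]
```

A clean result here still means nothing until the screen-reader pass confirms the labels are actually announced.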
Ask AI to: summarise axe JSON output into a compliance matrix, draft WCAG conformance statements from raw test logs, generate screen-reader test scripts from a user story. The AI drafts — you verify against the actual tool output. Never paste AI-produced compliance claims into an official report without running the underlying check yourself.
The AoG + AI job-security thesis: AI can generate markup. AI cannot certify compliance with a statute. The human who knows which standard applies, what the evidence must look like, and who to file it with is the one who signs off. That is the job that does not go away.
7 Evidence & Reporting
Compliance is not proved by saying "we tested it." You need an audit trail the GCDO, an agency’s internal audit team, or the Office of the Ombudsman could ask for. At a minimum your evidence pack should include:
- Accessibility Conformance Report (ACR): A filled VPAT (Voluntary Product Accessibility Template) listing every WCAG 2.2 AA criterion as Supports / Partially Supports / Does Not Support / Not Applicable, with notes.
- Automated scan outputs: axe JSON, Lighthouse reports, WAVE screenshots. Timestamped. Kept for the product lifespan.
- Manual test log: Which tester, which assistive tech (NVDA 2024.x, JAWS 2025, VoiceOver on iOS 18), which pages, which results.
- User test evidence: Session recordings (with consent) or written summaries from real users — including users with disabilities — completing the core journeys.
- Privacy Impact Assessment (PIA): Required for any service collecting personal information. Links compliance back to the Privacy Act IPPs.
- Sign-off trail: A named individual has signed a statement that the service meets AoG standards, with a date and a scope.
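The ACR itself is usually filled in by hand, but the per-criterion matrix feeding it can be generated from your test log. A sketch rendering outcomes as ACR-style CSV rows; the short codes and example criteria are illustrative inputs, while the four conformance terms are the standard VPAT wording listed above:

```python
import csv
import io

CONFORMANCE = {"S": "Supports", "P": "Partially Supports",
               "D": "Does Not Support", "NA": "Not Applicable"}

def acr_rows(results):
    """Render per-criterion test outcomes as ACR-style CSV rows.

    results maps a criterion name to a (code, note) pair, where code is
    one of the CONFORMANCE keys above.
    """
    buf = io.StringIO()
    writer = csv.writer(buf)
    writer.writerow(["Criterion", "Conformance", "Notes"])
    for criterion, (code, note) in sorted(results.items()):
        writer.writerow([criterion, CONFORMANCE[code], note])
    return buf.getvalue()

matrix = acr_rows({
    "2.5.8 Target Size (Minimum)": ("P", "Footer icons 16x16 on mobile"),
    "3.3.7 Redundant Entry": ("S", "Address reused across steps"),
})
```

Paste the rows into the ACR template; the notes column is where your timestamped evidence files get referenced.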
For testers on contract: Keep a personal portfolio of redacted ACRs and conformance statements you have contributed to. It is the single most persuasive asset in a gov-sector interview.
8 Common Mistakes
🚫 Citing WCAG 2.1 AA in 2026
I used to think: WCAG 2.1 AA is the NZ standard.
Actually: Web Accessibility Standard 1.2 moved NZ to WCAG 2.2 AA on 17 March 2025. Anyone still quoting 2.1 is a year behind. Know the six new 2.2 criteria cold.
🚫 Assuming AoG applies only to *.govt.nz sites
I used to think: My private-sector employer does not care about AoG.
Actually: Any vendor building for a public service agency, a Crown entity, or Health New Zealand | Te Whatu Ora must deliver to AoG standards. Procurement contracts cite them directly. If you build gov software, you comply or you do not get paid.
🚫 Running axe and calling it compliance
I used to think: An axe clean report means WCAG conformance.
Actually: Automated tools catch ~30–40% of WCAG issues. The ACR requires a manual pass for the remainder. No axe run alone has ever satisfied an AoG audit.
🚫 Letting AI-generated UIs ship without a compliance pass
I used to think: Modern AI writes accessible code by default.
Actually: AI tools frequently produce non-semantic markup, invented ARIA, missing labels, and tiny touch targets. Every AI-generated artifact needs the same conformance test every hand-written one does.
🚫 Forgetting te reo Māori
I used to think: Bilingual content is a "nice to have."
Actually: The Digital Service Design Standard expects genuine consideration of te reo Māori and Te Tiriti obligations. Services aimed at Māori audiences or with general public reach should include te reo where practical. Macrons matter — "Maori" and "Māori" are different words.
9 Now You Try
Task: Pick one NZ government service page (try ird.govt.nz, msd.govt.nz, or health.govt.nz). Produce a one-page mini-audit covering all four AoG standards.
- Accessibility 1.2: Run axe. Note violations. Tab through the page — is focus ever obscured by a sticky header (SC 2.4.11)? Measure a few touch targets (SC 2.5.8).
- Usability 1.4: Home page — is there a visible link to govt.nz? Is the copyright statement linked? Does the privacy statement follow the standard structure?
- DSD: Search the site for "user research", "accessibility statement", or an equivalent — is there public evidence of the design process?
- NZISM (surface check): Is the site on TLS 1.2 or higher, ideally 1.3? (Browser padlock → Connection details.) Are login flows reasonable? Any obvious third-party script bloat?
- AI angle: Does the site disclose any AI-assisted features? If so, is there human oversight information?
10 Self-Check
Click each question to reveal the answer.
Q1. What is the current NZ Web Accessibility Standard and what WCAG version does it require?
Web Accessibility Standard 1.2, which requires WCAG 2.2 Level AA, effective 17 March 2025. Version 1.1 (WCAG 2.1) is superseded.
Q2. Name three WCAG 2.2 success criteria that did not exist in 2.1.
Any three of: 2.4.11 Focus Not Obscured, 2.5.7 Dragging Movements, 2.5.8 Target Size (Minimum), 3.2.6 Consistent Help, 3.3.7 Redundant Entry, 3.3.8 Accessible Authentication.
Q3. Who enforces AoG standards and what is the consequence of non-compliance?
The Government Chief Digital Officer (GCDO) sets and enforces the standards. Non-compliance can trigger a complaint to the Human Rights Commission or Office of the Ombudsman, mandatory remediation, procurement consequences for vendors, and reputational cost to the agency.
Q4. What is an Accessibility Conformance Report (ACR)?
A filled-in VPAT template listing every WCAG 2.2 AA success criterion against the product, marked Supports / Partially Supports / Does Not Support / Not Applicable, with tester notes. It is the central evidence artefact for AoG accessibility compliance.
Q5. When AI generates UI code, whose job is it to verify AoG compliance?
Yours, the human tester. AI does not know the specific NZ standards, does not produce audit-grade evidence, and cannot sign off on compliance with a statute. The tester primes the AI with the standards, runs the automated + manual passes on the AI output, and produces the conformance statement.
Q6. What evidence would you keep to prove WCAG 2.2 AA compliance in an audit?
A filled ACR/VPAT, timestamped axe and Lighthouse scan outputs, a manual test log (tester, date, AT version, pages, results), user-test summaries including participants with disabilities, and a dated sign-off statement with named scope.