Privacy Testing
The NZ Privacy Act 2020 is a statute, not a guideline. Breaching it can trigger notifiable-breach reporting to the OPC, compensation orders, and reputational damage your CEO will not recover from. Testers catch these problems before they become incidents.
1 The Hook
In 2023 a well-known NZ retailer shipped a new loyalty app. It collected name, email, phone, date of birth, postcode, purchase history, and — through a third-party SDK the dev team did not read closely — coarse location in the background. There was no consent screen on first launch. The privacy policy was two clicks deep and predated the app.
Six months later a researcher at a local university posted on Mastodon about the background location collection. The story ran on Stuff by Thursday. The Office of the Privacy Commissioner opened an enquiry. The retailer’s legal team sent a notifiable-breach notice to 80,000 affected customers. Engineering spent two sprints pulling the SDK out. The CX team took a month of angry calls.
The QA team had never asked "what personal information does this app collect, under which IPP, and is the collection purpose disclosed?" Nobody on the team knew the Privacy Act well enough to know those were the right questions.
Privacy testing is quality testing. If personal information is leaking, the product is broken regardless of whether the feature "works."
2 The Rule
Privacy testing verifies that personal information is collected, stored, used, disclosed, and destroyed in line with the NZ Privacy Act 2020 — specifically the 13 Information Privacy Principles (IPPs).
Unlike functional testing, privacy testing is policy-driven: you start from the IPPs and the product’s Privacy Impact Assessment (PIA), and you verify the build matches them. If the PIA says "we only collect email," your test is: prove nothing else leaves the device or browser.
3 The Analogy
Privacy is a contract between you and the user.
When a tenant signs a lease, the landlord promises to only enter the house with 24 hours’ notice. If the landlord lets themselves in to "check the curtains" without warning, the tenant has grounds to sue — not because the curtains were damaged, but because the trust contract was broken. Your privacy policy is that lease. Every piece of data your product collects, stores, shares, or retains longer than promised is a potential breach of contract with the user. Privacy testing is the tenant’s inspection: are you doing what the lease says, no more, no less?
4 The 13 Information Privacy Principles
The NZ Privacy Act 2020 defines 13 IPPs (12 original + IPP 3A, which applies in full from 1 May 2026 for indirect-collection scenarios). Each IPP is a testable rule:
| IPP | Principle | Test question |
|---|---|---|
| 1 | Purpose of collection | Is every field collected needed for a lawful, clearly defined purpose? |
| 2 | Source of information | Do we collect directly from the person, unless an exception applies? |
| 3 | Collection from the individual | Does the UI display a collection notice covering who, why, disclosures, rights? |
| 3A | Indirect collection notice (from May 2026) | When we receive data about a person from a third party, do we notify the person? |
| 4 | Manner of collection | Is the collection lawful, fair, and not unreasonably intrusive? |
| 5 | Storage and security | Is data encrypted at rest and in transit, with access controls? |
| 6 | Access by the individual | Can a user request and receive their own data within 20 working days? |
| 7 | Correction of information | Can the user correct inaccuracies, with an audit trail? |
| 8 | Accuracy before use | Do we check the data is accurate and current before we use it for decisions? |
| 9 | Retention | Is there a documented retention schedule, with automated deletion? |
| 10 | Use of information | Do we only use data for the purpose it was collected for? |
| 11 | Disclosure | Do we only share data where the user consented or a lawful basis exists? |
| 12 | Cross-border disclosure | If data leaves NZ, is the recipient bound by comparable safeguards? |
| 13 | Unique identifiers | Do we avoid creating unnecessary universal IDs and disclose ones we do use? |
IPP 3A is new and important. Since the Privacy Amendment Act, agencies that collect personal information indirectly (from a data broker, an API, a partner) must tell the individual about it. Full compliance is required from 1 May 2026. Any product that pulls user data from a third party needs a test case here.
5 Testing Techniques
Data minimisation testing (IPP 1)
Start from the PIA field list. Compare to the actual request payload.
- Open DevTools → Network. Trigger the collection point. Inspect the POST body.
- Confirm every field in the request is on the PIA’s "collect" list.
- Look for `device_id`, `advertising_id`, `user_agent`, and IP geolocation — each of these is personal information and each needs a justification.
- If the form asks for date of birth "for age verification," check whether a yes/no "18+?" would serve the same purpose. If yes, DoB fails IPP 1.
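The payload-versus-PIA comparison above is easy to automate. A minimal sketch, assuming a hypothetical PIA "collect" list and a captured sign-up POST body (field names are illustrative):

```python
def over_collection(payload: dict, pia_allowed: set) -> set:
    """Fields present in the request that the PIA does not authorise."""
    return set(payload) - pia_allowed

# from the PIA data inventory
pia_allowed = {"email", "password"}

# from DevTools -> Network -> POST body
payload = {
    "email": "test@example.com",
    "password": "hunter2",
    "device_id": "a1b2-c3d4",
    "advertising_id": "ad-9f8e",
}

print(sorted(over_collection(payload, pia_allowed)))
# -> ['advertising_id', 'device_id'] — each needs a documented IPP 1 purpose
#    or a defect ticket
```

Every flagged field is either a PIA gap or over-collection; both block release.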
Collection notice testing (IPP 3)
- Every form that collects personal information displays, at or before collection: who you are (agency), purpose, intended recipients, whether supply is mandatory, consequences of not supplying, right to access and correct
- Notice is in plain English (and te reo where relevant)
- Notice is readable before the Submit button is enabled, not buried in a "terms" checkbox
- Privacy policy is linked from the notice and is current (dated within the last 12 months)
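The checklist above can be smoke-tested with a keyword heuristic — a coarse first pass to catch obviously missing notice elements, never a substitute for reading the notice. The cue words and the `missing_notice_elements` helper are illustrative assumptions:

```python
# Cue phrases per required IPP 3 notice element (illustrative, tune per product)
REQUIRED_ELEMENTS = {
    "agency":     ["collected by", "we are"],
    "purpose":    ["purpose", "why we collect"],
    "recipients": ["shared with", "recipients", "disclose"],
    "voluntary":  ["optional", "mandatory", "required to provide"],
    "rights":     ["access", "correct"],
}

def missing_notice_elements(notice_text: str) -> list:
    """Notice elements for which no cue phrase appears in the text."""
    text = notice_text.lower()
    return [name for name, cues in REQUIRED_ELEMENTS.items()
            if not any(cue in text for cue in cues)]

sample_notice = ("Your email is collected by Example Ltd for the purpose of "
                 "account creation. It is not shared with third parties. "
                 "Supplying it is required to provide the service. "
                 "You may access and correct your information at any time.")
print(missing_notice_elements(sample_notice))   # -> []
```

Anything in the returned list means a human needs to read the notice against IPP 3 before sign-off.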
Storage & security testing (IPP 5)
- Data at rest: database column encrypted where sensitive (passwords hashed with bcrypt/argon2, PII fields encrypted or tokenised)
- Data in transit: TLS 1.2+ enforced end to end (check with `testssl.sh` or SSL Labs)
- Access logs exist and cover PII reads, not just writes
- Backups are encrypted and access-controlled
- No PII in error messages, stack traces, URL parameters, or browser console logs
- No PII leaking to third-party analytics (inspect the Network tab for requests to `google-analytics.com`, `segment.io`, etc. while entering data)
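The third-party leak check scales better against a HAR export than against eyeballing the Network tab. A sketch, assuming a seeded test account whose PII values you know, and a hypothetical first-party hostname:

```python
from urllib.parse import urlparse

PII_VALUES = ["test@example.com", "021 555 0123"]   # the seeded test account's data
FIRST_PARTY = {"www.your-domain.example"}           # hypothetical first-party host

def third_party_pii_leaks(har: dict) -> list:
    """Hosts outside FIRST_PARTY that received any seeded PII value."""
    leaks = []
    for entry in har["log"]["entries"]:
        req = entry["request"]
        host = urlparse(req["url"]).hostname or ""
        if host in FIRST_PARTY:
            continue
        # check both the POST body and the URL (query-string leaks count too)
        body = (req.get("postData") or {}).get("text", "") + req["url"]
        if any(pii in body for pii in PII_VALUES):
            leaks.append(host)
    return leaks
```

Export the sign-up flow from DevTools (Network → Save all as HAR), `json.load` it, and feed it in; every host returned needs a disclosed basis under IPP 11/12 or a defect ticket.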
Access & correction testing (IPP 6 & 7)
- A "download my data" or "view my information" path exists and returns the complete record within a reasonable time (statute: 20 working days)
- Exported data is in a common, readable format (JSON, CSV, PDF)
- Correction request has a clear user-facing path and produces an audit trail
- The exported data actually matches what the DB holds for the user (test with a seeded account)
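The "export matches the DB" check in the last bullet is a straight record comparison. A minimal sketch with hypothetical field names for a seeded account:

```python
# what the database holds for the seeded test account
db_record = {"email": "t@example.com", "name": "Test User",
             "dob": "1990-01-01", "postcode": "6011"}

# what the "download my data" endpoint returned
export = {"email": "t@example.com", "name": "Test User"}

# fields the user never sees — an IPP 6 completeness gap
missing_from_export = set(db_record) - set(export)

# fields present but wrong — an IPP 8 accuracy problem
mismatched = {k for k in export if export[k] != db_record.get(k)}

print(sorted(missing_from_export))   # -> ['dob', 'postcode']
```

Run it per table that holds PII, not just the users table — support tickets and chat transcripts are part of "the complete record" too.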
Retention testing (IPP 9)
- There is a documented retention schedule per data type
- A scheduled job deletes or anonymises records past retention
- "Soft delete" records are hard-deleted after grace period
- Backups respect retention (deleted user data is not frozen in a 7-year backup forever)
- Account deletion removes all personal info including support ticket attachments, chat transcripts, and audit rows
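A retention audit can be sketched as a query over seeded records, assuming a hypothetical schedule of retention days per data type:

```python
from datetime import date

# hypothetical retention schedule (days per data type) — take yours from the PIA
RETENTION_DAYS = {"order": 7 * 365, "chat_transcript": 90, "marketing_profile": 365}

def overdue(records: list, today: date) -> list:
    """Records past their retention period that the purge job should have removed."""
    return [r for r in records
            if (today - r["created"]).days > RETENTION_DAYS[r["type"]]]

seeded = [
    {"id": 1, "type": "chat_transcript", "created": date(2024, 1, 1)},  # 90-day limit
    {"id": 2, "type": "order",           "created": date(2024, 1, 1)},  # 7-year limit
]
print([r["id"] for r in overdue(seeded, date(2025, 1, 1))])   # -> [1]
```

Seed records with back-dated timestamps, run the production purge job against the test environment, then assert `overdue()` returns nothing — and repeat against a restored backup to cover the backup bullet.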
Cross-border testing (IPP 12)
- List every third-party service in the product (analytics, payments, email, hosting, AI APIs)
- For each, identify the data sent and the jurisdiction of the processor
- If data leaves NZ, verify a lawful basis: user consent, DPA with comparable safeguards, or one of the Privacy Act exceptions
- AI APIs (OpenAI, Anthropic, Google, Azure) send data overseas — if the product uses them, there must be a disclosed basis
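The inventory in the first two bullets can live as data the test suite asserts over. The service list and jurisdictions below are illustrative — build yours from the PIA's data-flow diagram:

```python
# one row per third-party processor: name, where it runs, what it receives
SERVICES = [
    {"name": "payments",  "jurisdiction": "NZ", "data": ["card_token"]},
    {"name": "analytics", "jurisdiction": "US", "data": ["user_id", "page_views"]},
    {"name": "llm_api",   "jurisdiction": "US", "data": ["support_ticket_text"]},
]

# every non-NZ processor needs consent, a DPA with comparable safeguards,
# or a Privacy Act exception documented against it
needs_ipp12_basis = [s["name"] for s in SERVICES if s["jurisdiction"] != "NZ"]
print(needs_ipp12_basis)   # -> ['analytics', 'llm_api']
```

When a new SDK or API lands in the build, the inventory test fails until someone records the jurisdiction and basis — which is exactly the conversation IPP 12 requires.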
Indirect-collection testing (IPP 3A, from May 2026)
- Identify every source of personal information that is not the individual (partner APIs, data brokers, social logins, referral programmes)
- For each, verify a notification is sent to the person when their data enters the system
- Notification contains: what was collected, from whom, purpose, their rights (access/correction)
- Check exception handling — IPP 3A has narrow exceptions (impracticable, serious threat). Document why each applies.
6 Breach Detection & Response
Since December 2020, the Privacy Act has required mandatory breach notification. If a privacy breach is "likely to cause serious harm" to affected individuals, the agency must notify both the Office of the Privacy Commissioner and each affected person as soon as practicable. Failing to notify is itself an offence.
As a tester, you are not the incident response team — but you validate that the detection and response capability exists.
- There is a documented breach response plan with named roles and timeframes
- The system can detect unusual access patterns (log analysis, SIEM alerts) — run a mock "bulk-read by a single user" and confirm it triggers an alert
- There is a rehearsed notification template for both the OPC and affected individuals
- The agency knows who is authorised to notify (Privacy Officer, CISO)
- Test data never flows into production alerting — but production alerts genuinely fire in staging
- The on-call documentation covers privacy breaches, not only uptime incidents
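The mock "bulk-read by a single user" from the checklist can be driven by synthetic log entries. The threshold and log shape here are assumptions — mirror whatever rule your SIEM actually implements:

```python
from collections import Counter

BULK_READ_THRESHOLD = 100   # PII reads per actor per window (assumed SIEM rule)

def alert_actors(access_log: list) -> set:
    """Actors whose PII-read count in the window exceeds the threshold."""
    reads = Counter(e["actor"] for e in access_log if e["action"] == "pii_read")
    return {actor for actor, n in reads.items() if n > BULK_READ_THRESHOLD}

# simulate one staff account reading 150 customer records, another reading 3
mock_log  = [{"actor": "staff-42", "action": "pii_read"} for _ in range(150)]
mock_log += [{"actor": "staff-7",  "action": "pii_read"} for _ in range(3)]
print(alert_actors(mock_log))
```

In the real exercise you replay this pattern against staging and confirm an alert actually reaches the on-call channel — the detection code firing locally proves nothing about the pipeline.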
What counts as "serious harm"? Physical, psychological, financial, or reputational harm. Examples: health info leaked, login credentials exposed, financial details disclosed, location history exposed to a stalker, sexual orientation outed. The OPC’s NotifyUs online form walks through the severity assessment — worth bookmarking for anyone doing privacy work.
7 The Privacy Impact Assessment (PIA)
A PIA is a document the product team produces before building a service. It lists what personal info will be collected, why, how it will flow, the IPP risks, and the mitigations. It is the source-of-truth for privacy testing.
Good PIAs include:
- Data inventory (field name, data type, source, purpose, retention, recipients)
- Data flow diagram (user → browser → API → DB → analytics → partner)
- IPP-by-IPP risk assessment with mitigations
- Cross-border transfer section if any data leaves NZ
- AI/automated-decision section if AI is involved in processing
- Named Privacy Officer sign-off
If there is no PIA, stop. Ask for one before writing tests. Every PIA assertion is a test case. Every missing entry is a risk.
Template: The OPC publishes a free PIA toolkit at privacy.org.nz.
8 Common Mistakes
🚫 Treating privacy as "legal's job"
I used to think: Privacy is the lawyers’ department.
Actually: Legal drafts the policy. QA verifies the build matches the policy. Nothing gets verified if testers do not own it. Read the PIA as carefully as the acceptance criteria.
🚫 Not inspecting third-party requests
I used to think: Analytics and ad pixels are out of scope.
Actually: Facebook Pixel, Google Analytics, Hotjar, and AI APIs frequently ship personal information overseas. Open the DevTools Network tab with the filter -your-domain.com and watch what leaves the page during a user workflow. If you do not recognise a request, file a ticket.
🚫 Using real personal data in test and staging
I used to think: Cloning prod is the fastest way to get realistic test data.
Actually: Cloning prod propagates PII into lower environments that usually have weaker controls. One laptop with a DB export is a breach. Use synthetic or tokenised data, or at minimum an irreversibly anonymised sample.
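One workable tokenisation approach is a salted, keyed hash: the same input always maps to the same token (so joins and dedup still work in test), but the original value cannot be recovered from the dataset. A minimal sketch — the salt source and field names are assumptions:

```python
import hashlib
import hmac

# assumption: fetched from a secret store, never committed with the data
SALT = b"per-environment-secret"

def tokenise(value: str) -> str:
    """Deterministic, irreversible pseudonym for a PII value."""
    return hmac.new(SALT, value.encode(), hashlib.sha256).hexdigest()[:16]

row = {"email": "jane@example.com", "phone": "021 555 0123"}
safe_row = {k: tokenise(v) for k, v in row.items()}
```

Note the limits: a keyed hash is pseudonymisation, not anonymisation — whoever holds the salt can re-identify by re-hashing candidate values, so treat the salt like a production credential and rotate it per environment.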
🚫 Confusing the Privacy Act with GDPR
I used to think: If we are GDPR-compliant, we are covered in NZ.
Actually: They overlap but are not identical. GDPR has fines up to €20M or 4% of turnover; NZ has compensation orders, compliance notices, and criminal penalties for specific offences. IPP 3A is a distinctly NZ rule. Cross-border disclosure (IPP 12) has different exceptions from those in GDPR Chapter V. Do not assume equivalence.
🚫 Forgetting AI-processed data is still personal data
I used to think: Once I send a support ticket to an LLM for summarisation, it is not "our" data anymore.
Actually: The moment personal information goes to an AI API, you are making a disclosure under IPP 11 and a cross-border transfer under IPP 12. You need a basis, a DPA with the vendor, and (from May 2026) potentially a notification under IPP 3A if AI output becomes new personal information.
9 Now You Try
Task: Pick any NZ sign-up flow (a retailer loyalty app, a news site paywall, or a gov service). Produce a short privacy report covering these steps:
- Walk the sign-up flow with DevTools Network tab open. List every domain that receives data.
- For each field the form asks for, write one sentence on its purpose under IPP 1. Flag anything that seems excessive.
- Find the privacy policy. How many clicks from the sign-up form? Does it cover the IPP 3 notice elements?
- Find the "access my data" path. Is it there? How long does it promise?
- Find the "delete my account" path. Does it say what happens to backups?
- If you see any data going to the US, EU, or Australia — is there a consent or stated basis under IPP 12?
10 Self-Check
Q1. How many Information Privacy Principles are in the NZ Privacy Act 2020?
13. The original 12 plus IPP 3A, which covers indirect collection and is fully in force from 1 May 2026.
Q2. What triggers mandatory breach notification under the Privacy Act 2020?
A privacy breach that is likely to cause serious harm to affected individuals — physical, psychological, financial, or reputational. The agency must notify both the OPC and affected individuals as soon as practicable.
Q3. What does IPP 1 (purpose of collection) mean for a form that asks for DoB?
You need a lawful purpose for collecting DoB. If "age verification over 18" is the real purpose, a yes/no "Are you 18+?" would serve and would be less intrusive. Collecting DoB anyway would fail IPP 1.
Q4. You notice the sign-up form’s POST payload includes a field the PIA does not list. What do you do?
Raise a defect. Either the PIA is incomplete (update it) or the build is over-collecting (remove the field). Either way the PIA and the build must match before release.
Q5. A product sends support ticket text to an overseas LLM for summarisation. Which IPPs are in play?
IPP 10 (use for the purpose collected), IPP 11 (disclosure to a third party), IPP 12 (cross-border transfer). You need a disclosure basis, a data-processing agreement with the vendor, and — if the AI-generated summary becomes new personal information shared with another party — potentially IPP 3A notification from May 2026.
Q6. What is a PIA and why does a tester care about it?
A Privacy Impact Assessment is the document produced by the product team listing what personal information the service will collect, why, how it flows, and how each IPP is satisfied. Every assertion in the PIA is a test case. If there is no PIA, there is no privacy testing scope.