
Compatibility Testing

Your users do not all have the latest MacBook Pro and Chrome. They have old Android phones, Safari on iPad, government-mandated Edge browsers, and 1366x768 laptops. Compatibility testing finds where your product breaks for real people.


1 The Hook

An NZ local council launched a new rates payment portal. The development team all used MacBooks with Chrome. Every test passed. The design was beautiful.

On launch day, 30% of users could not complete payment. The problem? The payment modal used a CSS Grid layout that broke in Safari 14 — still common on older iPads and iPhones. The "Pay Now" button was off-screen and unclickable. Worse, the fallback did not work because the JavaScript polyfill was not loaded for that browser version.

The council received angry emails from ratepayers who tried three times, got charged twice (because the success confirmation failed but the payment went through), and demanded refunds. The fix took two weeks. The reputation damage lasted much longer.

The team tested on 1% of their user base's configurations. Compatibility testing would have caught this on day one of QA.

2 The Rule

Compatibility testing verifies that software functions correctly across the full range of environments, configurations, and platforms that real users employ.

It is not about supporting everything. It is about knowing what you support, testing those combinations, and being honest with users about what does not work.

3 The Analogy


A car manufacturer testing their vehicle on NZ roads.

A car might perform perfectly on German autobahns. But NZ has gravel roads, steep mountain passes, single-lane bridges, frost heaves, and corrosion from salt air. A car that cannot handle these is useless here, no matter how well it performs overseas. Compatibility testing is the equivalent of road-testing your software on the actual surfaces your users drive on: old browsers, small screens, slow connections, and outdated operating systems.

4 Types of Compatibility Testing

Browser Compatibility

Test on Chrome, Firefox, Safari, Edge, and any browser your analytics show users actually have. Include at least two versions back from current. In NZ government and enterprise, Edge is often mandated. In education, Chrome dominates. In rural areas, users may be on older Safari due to older iPhones.

Operating System Compatibility

Windows 10/11, macOS, iOS, Android, and sometimes Linux. Each OS has different font rendering, file handling, security permissions, and input methods. An app that works on Windows may fail on Linux because its file systems are case-sensitive, or on macOS because certificate handling and sandboxing permissions differ.

Device Compatibility

Desktop (various resolutions), tablet (iPad, Android tablets), and mobile (dozens of screen sizes). Touch targets that work on a 6-inch phone may be impossible on a 4-inch screen. Hover-dependent features fail entirely on touch devices.
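Hover dependence can be caught at runtime rather than assumed. A minimal sketch using the standard `(hover: hover)` media query; the function name is hypothetical, and note that outside a browser (e.g. in Node) there is no `window`, so it falls back to the touch path:

```javascript
// Sketch: feature-detect hover support so menus can fall back to
// tap-to-open on touch devices. '(hover: hover)' is standard CSS
// Media Queries Level 4 and matches only on devices with a precise
// hover-capable pointer (e.g. a mouse).
function prefersHoverMenus() {
  return typeof window !== "undefined" &&
         window.matchMedia("(hover: hover)").matches;
}

// In a browser with a mouse this returns true; on most phones and
// tablets (and in non-browser environments) it returns false, so the
// UI should wire up tap handlers instead of hover handlers.
console.log(prefersHoverMenus());
```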

Backward Compatibility

Will old data files open in the new version? Will an API change break existing mobile app versions that users have not updated? Backward compatibility testing prevents forcing users to upgrade immediately.
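One common pattern for keeping old data files openable is stepwise schema migration: each saved file carries a version number, and the app upgrades it one version at a time. A sketch with hypothetical field names (`version`, `entries`, `currency`):

```javascript
// Sketch: stepwise schema migration so old saved files still open
// in the new version. Each step upgrades exactly one version.
const migrations = {
  // v1 stored entries as a comma-separated string; v2 uses an array.
  1: (doc) => ({ ...doc, version: 2, entries: doc.entries.split(",") }),
  // v2 lacked a currency field; default existing data to NZD.
  2: (doc) => ({ ...doc, version: 3, currency: "NZD" }),
};

function migrateToCurrent(doc, currentVersion = 3) {
  let out = doc;
  while (out.version < currentVersion) {
    const step = migrations[out.version];
    if (!step) throw new Error(`No migration from v${out.version}`);
    out = step(out);
  }
  return out;
}

// An old v1 file opens cleanly in the v3 application:
console.log(migrateToCurrent({ version: 1, entries: "jan,feb" }));
// → version 3, entries as an array, currency defaulted to "NZD"
```

The same idea applies to APIs: keep old endpoints (or versioned responses) alive until analytics show old clients have drained away.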

Network Compatibility

Test on fast fibre, slow rural ADSL, 3G mobile, and intermittent connections. Use Chrome DevTools Network throttling to simulate "Slow 3G" (common in parts of rural NZ) and verify the app degrades gracefully.
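Graceful degradation can also be driven from code. A sketch using the Network Information API's `effectiveType` values (`'slow-2g'`, `'2g'`, `'3g'`, `'4g'`); the tier widths are illustrative, and since the API is only available in Chromium-based browsers, the default branch must behave sensibly when it is absent:

```javascript
// Sketch: choose an image width tier from the connection's reported
// effectiveType, serving smaller assets on slow rural connections.
// The specific widths (320/768/1440) are hypothetical.
function pickImageWidth(effectiveType) {
  switch (effectiveType) {
    case "slow-2g":
    case "2g":
      return 320;   // smallest asset; text-first experience
    case "3g":
      return 768;   // mid-size asset for slow connections
    default:
      return 1440;  // full asset on 4g or when the API is unavailable
  }
}

// In the browser: pickImageWidth(navigator.connection?.effectiveType)
console.log(pickImageWidth("3g")); // 768
```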

5 Test Matrix Strategy

You cannot test every combination. A pragmatic matrix focuses on what your users actually use. Start with analytics, then apply risk-based prioritisation.

Sample NZ compatibility matrix for a web app
Browser           | Windows 11 | macOS | iOS  | Android
Chrome (latest)   | Test       | Test  | Spot | Test
Chrome (n-1)      | Test       | Spot  | -    | Spot
Firefox (latest)  | Spot       | Spot  | -    | -
Safari (latest)   | -          | Test  | Test | -
Edge (latest)     | Spot       | -     | -    | -
Samsung Internet  | -          | -     | -    | Spot

Test = full regression suite. Spot = smoke test on critical user journeys. - = not tested (insufficient user base or not applicable).

Key principle: Test deeply on your top 3-4 combinations (usually 70-80% of traffic). Spot-check the long tail. Ignore combinations with <1% usage unless they represent high-value users (e.g., a CEO on an old iPad).
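The prioritisation above is mechanical enough to sketch in code. Given analytics usage shares per browser/OS combination, classify each as Test, Spot, or Skip; the cutoffs (8% for full testing, 1% for any testing) are illustrative choices, not a standard:

```javascript
// Sketch: turn analytics usage shares into Test/Spot/Skip decisions.
// Cutoffs are illustrative; tune them to your own traffic and risk.
function classifyMatrix(usageShares, fullCutoff = 0.08, spotCutoff = 0.01) {
  return Object.fromEntries(
    Object.entries(usageShares).map(([combo, share]) => {
      const level = share >= fullCutoff ? "Test"
                  : share >= spotCutoff ? "Spot"
                  : "Skip";
      return [combo, level];
    })
  );
}

// Hypothetical shares from analytics:
const shares = {
  "Chrome/Windows 11": 0.38,
  "Safari/iOS": 0.27,
  "Chrome/Android": 0.21,
  "Firefox/macOS": 0.03,
  "Samsung Internet/Android": 0.008,
};
console.log(classifyMatrix(shares));
// Top three combos → "Test"; Firefox/macOS → "Spot";
// Samsung Internet/Android falls below 1% → "Skip"
```

Remember the caveat from the text: a sub-1% combination still earns a "Spot" if it carries high-value users.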

6 Tools in Action

BrowserStack — Cloud device lab

BrowserStack gives you instant access to 3,000+ real browsers and devices. No emulators. Real iPhones, real Samsung Galaxys, real macOS machines. Perfect for spot-checking Safari on old iOS versions or testing on devices you do not own.

NZ pricing is reasonable for small teams, and they offer free tiers for open source. Integrates with Selenium, Cypress, and Playwright for automated cross-browser testing.

Sauce Labs — Automated cross-browser

Sauce Labs focuses on CI/CD integration. Run your existing Selenium/Playwright/Cypress tests against their cloud grid with a single configuration change. Includes visual regression testing to catch unintended UI changes.
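As a concrete example of "one suite, many targets", here is a minimal Playwright configuration that runs the same tests across several browser and device projects. The project names are arbitrary; the `devices` presets (`"Desktop Chrome"`, `"iPhone 13"`, `"Pixel 5"`) are built into Playwright. Cloud grids such as Sauce Labs or BrowserStack can consume a similar project list via their own connection settings:

```javascript
// playwright.config.js — sketch of a cross-browser/device matrix.
const { defineConfig, devices } = require("@playwright/test");

module.exports = defineConfig({
  projects: [
    { name: "chromium-desktop", use: { ...devices["Desktop Chrome"] } },
    { name: "firefox-desktop",  use: { ...devices["Desktop Firefox"] } },
    { name: "webkit-desktop",   use: { ...devices["Desktop Safari"] } },
    // Device presets emulate viewport, user agent, and touch:
    { name: "iphone",  use: { ...devices["iPhone 13"] } },
    { name: "android", use: { ...devices["Pixel 5"] } },
  ],
});
```

Run with `npx playwright test` and every test file executes once per project, giving you the matrix for free.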

Chrome DevTools Device Mode

For quick responsive checks, Chrome DevTools Device Mode (F12 → Toggle device toolbar) simulates viewports, touch events, and user agents. Add custom device presets for your analytics top devices. Not a replacement for real device testing, but excellent for rapid iteration.

Virtual Machines — Legacy Windows browsers

For testing Internet Explorer or legacy Edge on Windows, Microsoft provides free VMs for VirtualBox, VMware, and Hyper-V. Essential for enterprise apps that must support older browsers.

7 Responsive Design Testing

Responsive design is not just about making things smaller. It is about adapting layout, content, and interaction to each context.

  • Breakpoints: Test at your defined breakpoints (e.g., 320px, 768px, 1024px, 1440px) and between them.
  • Touch targets: All interactive elements should be at least 44x44 CSS pixels on touch devices.
  • Font size: Text must remain readable without horizontal scrolling at any width.
  • Images: Verify srcset and sizes attributes serve appropriate image resolutions. Mobile users on limited data plans should not download 4K desktop hero images.
  • Navigation: Hamburger menus, dropdowns, and mega-menus must all work on touch. Hover-only menus are broken on mobile.

8 Common Mistakes

🚫 Testing only on developer machines

I used to think: If it works on my MacBook in Chrome, it works everywhere.
Actually: Developers represent less than 0.1% of the user base. Your users have old hardware, outdated browsers, corporate firewalls, and ad blockers. Test on what they use, not what you use.

🚫 Supporting too many browsers

I used to think: We must support every browser since Netscape.
Actually: Supporting Internet Explorer 11 in 2024 costs more than it earns. Use analytics to identify what your users actually have. Define a support policy, publish it, and stick to it. Dropping IE11 alone can eliminate a large share of the polyfill and workaround effort in frontend development.

🚫 Relying on emulators for everything

I used to think: Chrome DevTools mobile emulation is enough.
Actually: Emulators approximate layout but not behaviour. They do not replicate real touch events, OS-level font rendering, GPU acceleration bugs, or vendor-specific CSS quirks. Always verify on at least a few real devices before release.

🚫 Ignoring orientation changes

I used to think: Users keep their phones in portrait.
Actually: Many users rotate to landscape for forms, videos, and reading. If your layout breaks on rotation, or if modal dialogs get cut off, that is a real bug. Test both orientations on real devices.

9 Now You Try

🎯 Build a Compatibility Matrix

Task: Imagine you are testing an NZ online booking system for holiday homes. Based on typical NZ user patterns, build a compatibility matrix with:

  1. At least 4 browser/OS combinations to test fully
  2. At least 3 combinations to spot-check
  3. 2 combinations you will explicitly not support (with justification)
  4. 1 device-specific consideration for rural NZ users

10 Self-Check


Q1. What is the difference between cross-browser testing and responsive design testing?

Cross-browser testing verifies functionality across different browsers and versions. Responsive design testing verifies that layout and usability adapt correctly to different screen sizes and orientations. Both are needed.

Q2. Why are emulators not enough for mobile compatibility testing?

Emulators approximate but do not replicate real device behaviour. They miss touch gesture nuances, OS-level font rendering, GPU bugs, vendor CSS quirks, and real-world performance constraints like thermal throttling.

Q3. How do you decide which browser versions to support?

Use analytics data from your actual users, combined with business risk. Support the combinations that represent the majority of your traffic and revenue. Publish a support policy so users know what to expect.

Q4. What is backward compatibility and why does it matter?

Backward compatibility ensures new versions work with old data, old APIs, and old clients. It matters because users do not upgrade immediately. A mobile app update that breaks the API for old app versions strands users until they update.