Mobile Application Testing

Mobile is not just small desktop. It is a different world: touch, sensors, battery limits, spotty connectivity, and constant interruptions. Testing mobile apps requires a fundamentally different mindset.

ISTQB Mobile Application Testing (CT-MAT) · ~15 min read

1 The Hook

An NZ transport app launched with great fanfare. It showed real-time bus locations, fare calculations, and contactless payment. The team had 500 automated tests. They tested on iPhone 14 Pro and Samsung Galaxy S23. Everything passed.

Two weeks after launch, reviews tanked. Users reported the app drained their battery in 3 hours. It crashed when they switched to airplane mode on the ferry. The payment screen broke when they rotated their phone. And on older Android devices — still common among students and seniors — the app was unusably slow.

The development team had never tested battery consumption. They had never tested on a device with less than 6GB RAM. They had never tested what happens when GPS drops in a tunnel. They had treated mobile like "small desktop."

Mobile testing is not web testing on a smaller screen. It is a specialised discipline with its own risks, tools, and techniques. ISTQB has an entire syllabus dedicated to it for good reason.

2 The Rule

Mobile application testing verifies that an app functions correctly across device types, operating systems, network conditions, and real-world usage patterns — while respecting the constraints of battery, memory, storage, and sensors.

Mobile apps live in a hostile environment. They share resources with 50 other apps. They run on hardware that varies by two orders of magnitude in performance. They are interrupted constantly. They must work without network access. Testing them requires thinking like a user on a bus, not a tester at a desk.

3 The Analogy

Testing a scuba diving computer.

A desktop app is like testing a calculator on a desk: controlled environment, reliable power, no distractions. A mobile app is like testing a dive computer: it must work underwater, in the dark, with gloved hands, under pressure, with limited battery, and if it fails, the consequences are serious. You cannot test a dive computer by dipping it in a bathtub. You need depth, cold, salt, and time. Mobile apps need the real world: spotty signal, background apps, low battery, and gloved fingers.

4 Device Fragmentation: The Core Challenge

Android alone has over 24,000 distinct device models. Screen sizes range from 3.5 inches to 8 inches. RAM ranges from 1GB to 16GB. Processors range from budget MediaTek chips to flagship Snapdragon chips. OS versions range from Android 8 to Android 14.

iOS fragmentation is lower but still significant:

  • iPhone SE (small screen, Touch ID) vs iPhone 15 Pro Max (large screen, Face ID, Dynamic Island)
  • iPad (touch only) vs iPad Pro with Magic Keyboard (pointer + keyboard)
  • iOS 15, 16, 17 — each with different capabilities and permissions models

Testing strategy:

  • Tier 1 (comprehensive): Your top 3 devices by user base. Test everything.
  • Tier 2 (smoke): The next 5-7 devices. Run critical path tests.
  • Tier 3 (cloud): Spot-check on BrowserStack or Sauce Labs device cloud.
  • Tier 4 (unsupported): Devices below a defined threshold (e.g., Android < 8, iOS < 15).
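This tiering can be sketched as a small lookup. The OS thresholds and rank cut-offs below are illustrative assumptions, not a recommended matrix:

```javascript
// Illustrative sketch: assign a device to a test tier from its user-base
// rank and minimum supported OS version. Thresholds are assumptions.
const MIN_SUPPORTED = { android: 8, ios: 15 };

function assignTier(device, rankByUserBase) {
  // Tier 4: below the supported OS threshold — no testing commitment
  if (device.osVersion < MIN_SUPPORTED[device.platform]) return 4;
  if (rankByUserBase <= 3) return 1;   // Tier 1: comprehensive
  if (rankByUserBase <= 10) return 2;  // Tier 2: smoke / critical path
  return 3;                            // Tier 3: cloud spot-checks
}

console.log(assignTier({ platform: 'android', osVersion: 14 }, 2)); // 1
console.log(assignTier({ platform: 'android', osVersion: 7 }, 1));  // 4
console.log(assignTier({ platform: 'ios', osVersion: 16 }, 8));     // 2
```

In practice the ranks come from your analytics; the value of writing the rule down is that the tier decision becomes explicit and reviewable.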

5 Interruptions and App Lifecycle

Mobile apps are interrupted constantly. Your tests must verify graceful handling of every scenario.

Incoming Phone Call

Does the app pause correctly? Does audio stop? Does ending the call return the user to the correct screen with data intact? Does a video call (FaceTime, WhatsApp) behave differently from a voice call?

Push Notifications

Test notification display, deep linking to the correct screen, and behaviour when the user dismisses or taps. Test with 50+ notifications queued. Test notification permission flows (denied, allowed, later).

Backgrounding and Foregrounding

Switch to another app for 30 seconds, 5 minutes, 2 hours. Does the app resume correctly? Is unsaved data preserved? Does the session time out? On iOS, apps may be terminated by the system while backgrounded — verify state restoration.
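A minimal sketch of the save-on-background / restore-on-foreground pattern these tests exercise. The `storage` map is a hypothetical stand-in for platform persistence (UserDefaults, SharedPreferences):

```javascript
// Minimal sketch of state preservation around backgrounding.
// `storage` is an in-memory stand-in for real platform persistence.
const storage = new Map();

function onBackground(draft) {
  // Persist unsaved work before the OS may terminate the process
  storage.set('draft', JSON.stringify(draft));
}

function onForeground() {
  // Restore the draft if the app was killed while backgrounded
  const saved = storage.get('draft');
  return saved ? JSON.parse(saved) : null;
}

onBackground({ payee: 'Mere', amount: 42.5 });
console.log(onForeground().amount); // 42.5 — survives a simulated kill
```

Your test for this scenario is the same round trip on a real device: background the app, force-kill it from the app switcher, relaunch, and check the draft is still there.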

Low Battery and Power Saving Mode

iOS Low Power Mode and Android Battery Saver restrict background activity, reduce animation frame rates, and may disable location services. Test your app with these modes enabled.

Device Rotation and Foldables

Rotate during form entry, during video playback, during payment. Test foldable phones (Samsung Z Fold/Flip) where the screen dimensions change dynamically.

6 Gestures and Input Methods

Mobile users do not use mice. They tap, swipe, pinch, long-press, drag, and shake. Each gesture must be tested for responsiveness, accuracy, and accidental trigger prevention.

  • Tap targets: Minimum 44x44 points (iOS) or 48x48dp (Android). Test with actual fingers, not just a mouse pointer.
  • Swipe gestures: Verify swipe direction, velocity sensitivity, and edge cases (very short swipes, diagonal swipes).
  • Pull-to-refresh: Does it trigger when scrolling up from the middle of a list? Does it respect the user's scroll position?
  • Pinch-to-zoom: Test on images, maps, and documents. Verify that double-tap zoom also works for accessibility.
  • Long-press: Context menus, selection modes, and drag-reorder must not trigger accidentally during scrolling.
  • 3D Touch / Haptic Touch: On supported iPhones, test peek-and-pop and quick actions.
  • Stylus and external keyboards: On tablets, test with Apple Pencil, S Pen, and Bluetooth keyboards.
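The tap-target minimums above can be checked mechanically. A sketch, assuming elements report their size in points (iOS) or dp (Android):

```javascript
// Sketch: flag tap targets below the platform minimum.
// Element shapes and ids are illustrative assumptions.
const MIN_TAP = { ios: 44, android: 48 };

function undersizedTargets(elements, platform) {
  const min = MIN_TAP[platform];
  return elements
    .filter(el => el.width < min || el.height < min)
    .map(el => el.id);
}

const screen = [
  { id: 'pay', width: 48, height: 48 },
  { id: 'close', width: 24, height: 24 }, // classic mis-tap magnet
];
console.log(undersizedTargets(screen, 'ios')); // ['close']
```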

7 Offline Mode and Connectivity

Mobile networks are unreliable. Ferries, tunnels, rural roads, and basement car parks all create dead zones. Your app must handle them gracefully.

  • Offline functionality: What can the user do without network? Read cached content? Fill forms that sync later? Nothing?
  • Sync behaviour: When connectivity returns, does the app sync automatically? Does it handle conflicts (e.g., user updated a form on two devices)?
  • Queueing: Actions taken offline should queue and execute in order when online. Users must see the queue status.
  • Data usage: On mobile data (not WiFi), does the app warn before downloading large files? Can users restrict background data?
  • Network switching: Switch from WiFi to 4G mid-action. Switch from 4G to airplane mode. Does the app recover gracefully?
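The queueing behaviour described above can be sketched as a FIFO queue that flushes on reconnect. The executor and the online flag are illustrative stand-ins for real network code:

```javascript
// Sketch of an offline action queue: actions taken while offline are
// queued in order and flushed when connectivity returns.
class OfflineQueue {
  constructor(execute) {
    this.pending = [];
    this.execute = execute; // sends an action to the server when online
  }
  submit(action, online) {
    if (online) return this.execute(action);
    this.pending.push(action); // preserve order for later sync
  }
  // Called when connectivity returns: flush in FIFO order
  sync() {
    while (this.pending.length) this.execute(this.pending.shift());
  }
  status() {
    return `${this.pending.length} action(s) waiting to sync`;
  }
}

const sent = [];
const q = new OfflineQueue(a => sent.push(a));
q.submit('pay-bill', false);  // ferry dead zone: queued
q.submit('transfer', false);  // still offline: queued
console.log(q.status());      // "2 action(s) waiting to sync"
q.sync();                     // back in coverage
console.log(sent);            // ['pay-bill', 'transfer']
```

Whatever the real implementation, your tests should assert two things the sketch makes visible: execution order is preserved, and the queue length is surfaced to the user.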

8 App Store Compliance

Apple and Google have strict guidelines. Violations can get your app rejected or removed.

Apple App Store Review Guidelines

Key tester-relevant rules: Apps must not crash. Sign in with Apple is required if you offer other third-party sign-in options. In-app purchases must use Apple's system (no external payment links). Apps must request permissions only when needed and explain why. Privacy nutrition labels must be accurate.

Google Play Store Policies

Key tester-relevant rules: Target API level must be recent (currently API 33+). Apps must handle runtime permissions correctly. Deceptive behaviour (fake buttons, hidden functionality) results in immediate removal. Accessibility services must not be abused.

Testing tip: Review the store guidelines before testing begins, not after development is complete. Many rejections are for issues that testers could have caught: missing permissions explanations, broken navigation, or misleading screenshots.

9 NZ Context: Devices, Carriers & Accessibility

Every mobile market is different. What passes in London may fail in Dunedin. Here is the local picture.

Device matrix for NZ apps

As of 2026, NZ mobile usage skews slightly more iOS than the global average (iOS ~55–60%, Android ~40–45%) but Android still dominates in price-sensitive segments (students, seniors, rural users). A pragmatic NZ test matrix:

  • iOS Tier 1: Latest iPhone (15/16), one from 2-3 years ago (iPhone 13), one budget/older (iPhone SE 2nd/3rd gen)
  • iOS tablet: Current iPad — government and education sectors use these heavily
  • Android Tier 1: Current flagship (Samsung S24/Pixel 8), mid-range (Samsung A54 or similar), budget (a device around $300)
  • OS versions: Latest two iOS majors, Android 11→current (older Androids are common in NZ due to device longevity)
  • Foldables: If your users include executives or power users, Samsung Z Fold/Flip have non-trivial NZ share
  • Accessibility devices: an iPhone with VoiceOver enabled, an Android with TalkBack enabled — at least one of each in every test cycle

NZ carrier and network realities

NZ has three main mobile networks (Spark, One NZ, 2degrees) with meaningful coverage differences. You cannot assume metropolitan-grade 5G — most of the country outside main centres is 4G, and rural areas drop to 3G or no signal.

  • Coverage blackspots: Interisland ferry, Tongariro Crossing, Catlins, many West Coast roads, parts of Central Otago. Apps used by travellers, drivers, or outdoors users will hit these.
  • Data caps and plans: Many NZ plans still have meaningful data limits. An app that silently uses 500 MB/day will be uninstalled. Test your data consumption.
  • Roaming: Aussie travel is extremely common. Test behaviour when the device is roaming (push notifications, background refresh, payment verification SMS).
  • Emergency Mobile Alerts (EMA): NZ’s emergency alert system can interrupt your app with full-screen alerts. Does your app survive the interruption mid-transaction?
  • 2FA SMS delays: NZ SMS delivery is generally fast but can be delayed during outages. Test what happens if the code takes 60–120 seconds to arrive.
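The delayed-SMS scenario suggests a poll-with-timeout pattern. A sketch, where `checkInbox` is a hypothetical hook for reading the verification code:

```javascript
// Sketch: poll for a delayed 2FA code with a hard timeout, mirroring
// the 60–120 s SMS delay scenario above. `checkInbox` is a hypothetical
// stand-in for whatever reads the code on the device.
async function waitForCode(checkInbox, timeoutMs, intervalMs = 5000) {
  const deadline = Date.now() + timeoutMs;
  while (Date.now() < deadline) {
    const code = await checkInbox();
    if (code) return code; // code arrived within the window
    await new Promise(resolve => setTimeout(resolve, intervalMs));
  }
  // Timed out: the UI should offer a resend option, not hang forever
  throw new Error('2FA code did not arrive in time');
}
```

The test point is the timeout branch: a code that arrives at 90 seconds must still work, and a code that never arrives must leave the user with a way forward.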

Mobile accessibility in NZ

The Web Accessibility Standard 1.2 applies to web content on mobile, but native mobile apps fall under equivalent obligations via the Human Rights Act 1993 and public-sector procurement contracts. Targets to test:

  • iOS VoiceOver: every interactive element has a meaningful accessibility label, actions are discoverable, focus order matches visual order
  • Android TalkBack: same principles; also test the reading-order in scrollable lists
  • Dynamic Type (iOS) / Font scaling (Android): the layout must not break at the largest accessibility font size
  • Reduce Motion: animations respect the OS setting
  • Colour contrast: WCAG 2.2 AA applies — 4.5:1 body, 3:1 large text
  • Touch target size: iOS 44×44 pt, Android 48×48 dp, WCAG 2.2 SC 2.5.8 baseline 24×24 CSS px
  • Haptics and audio cues: do not rely on sound alone; do not rely on haptics alone
  • Te reo Māori: if your agency or brand commits to bilingual content, verify te reo strings render correctly including macrons (Māori, not Maori)
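The contrast targets can be verified programmatically with the WCAG relative-luminance formula. A minimal sketch:

```javascript
// Sketch: compute a WCAG contrast ratio for two sRGB colours, then
// check it against the AA thresholds listed above.
function luminance([r, g, b]) {
  const lin = c => {
    const s = c / 255;
    return s <= 0.03928 ? s / 12.92 : ((s + 0.055) / 1.055) ** 2.4;
  };
  return 0.2126 * lin(r) + 0.7152 * lin(g) + 0.0722 * lin(b);
}

function contrastRatio(fg, bg) {
  const [hi, lo] = [luminance(fg), luminance(bg)].sort((a, b) => b - a);
  return (hi + 0.05) / (lo + 0.05);
}

const ratio = contrastRatio([0, 0, 0], [255, 255, 255]);
console.log(ratio.toFixed(1)); // 21.0 — black on white
console.log(ratio >= 4.5);     // true: passes AA for body text
```

Automated contrast checks catch the obvious failures; you still need eyes on a real screen outdoors, because NZ sunlight is harsher than any emulator.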

Privacy on mobile

The NZ Privacy Act 2020 applies to mobile data collection. Mobile-specific checks beyond the web equivalents:

  • Permissions are requested only when needed, with clear rationale strings
  • Background location collection is disclosed prominently — this is the single most common NZ privacy complaint pattern for mobile apps
  • Third-party SDKs (analytics, ads, attribution) are audited for what they collect and where they send it (IPP 11 and IPP 12)
  • Deleting the app clears local caches, tokens, and cached PII
  • Biometric data (Face ID, Touch ID) stays on-device and never leaves — verify in network traces

NZ real-world testing tips

  • Take a test device on the Wellington-Picton ferry for 3 hours — the coverage chaos is a genuine stress test
  • Test on a tramp in a coverage blackspot with offline expectations documented
  • Simulate airplane mode at various points in a transaction flow
  • Test during a real Emergency Mobile Alert test (quarterly nationwide test)
  • Put the device in a fridge for 10 minutes then test — cold batteries lose capacity rapidly

10 Tools in Action

Appium — Cross-platform automation

Appium is the open-source standard for mobile test automation. It uses the WebDriver protocol to control both iOS and Android devices from a single test script.

// Appium test example (WebDriverIO)
// The '~' prefix selects by accessibility id — a locator that works
// on both iOS and Android from the same script.
describe('Login Flow', () => {
  it('should log in with valid credentials', async () => {
    await $('~email').setValue('test@example.co.nz');
    await $('~password').setValue('KiwiTest99!');
    await $('~loginButton').click();
    // Verify the post-login screen is displayed
    await expect($('~dashboard')).toBeDisplayed();
  });
});

Appium supports real devices, emulators, and cloud services. Its main downside is speed — tests can be slow due to the WebDriver protocol overhead.

Espresso (Android) and XCUITest (iOS)

Native frameworks provided by Google and Apple. Faster and more reliable than Appium because they run inside the app process. The trade-off: separate codebases for iOS and Android. Best for teams with platform-specific expertise.

Firebase Test Lab / AWS Device Farm

Cloud services that run your tests on hundreds of real devices. Firebase Test Lab is cost-effective for Android. AWS Device Farm supports both iOS and Android. Both provide logs, screenshots, and videos of failed tests.

Android Studio Profiler / Xcode Instruments

For performance testing: measure CPU, memory, battery, and network usage. Identify memory leaks, excessive wake locks, and battery-draining location requests. Essential for apps that must run efficiently on budget devices.

11 Common Mistakes

🚫 Testing only on emulators

I used to think: The iOS Simulator and Android Emulator are good enough.
Actually: Emulators do not replicate real-world performance, thermal throttling, memory pressure, or gesture accuracy. A bug that never appears on an emulator can crash every 5 minutes on a real iPhone 8. Always verify on physical devices before release.

🚫 Testing only on the latest flagship devices

I used to think: iPhone 15 Pro and Galaxy S24 represent our users.
Actually: Many users are on 3-5 year old devices with degraded batteries and limited RAM. In NZ, older iPhones and budget Android devices are common among students, seniors, and rural users. Test on a 3-year-old mid-range device at minimum.

🚫 Ignoring battery and data consumption

I used to think: Performance means fast response time.
Actually: On mobile, performance also means battery life and data usage. An app that drains 20% battery per hour or downloads 500MB on mobile data will be uninstalled quickly. Use Xcode Instruments and Android Profiler to measure.

🚫 Not testing the update path

I used to think: Fresh installs are the only thing that matters.
Actually: Most users update, not reinstall. Test upgrading from the previous version: does data migrate correctly? Do preferences persist? Does the new version crash on first launch if old cached data exists?

12 Now You Try

🎯 Mobile Test Plan Exercise

Scenario: You are testing an NZ banking app that lets users check balances, transfer money, and pay bills. Design a test plan that covers:

  1. 3 interruption scenarios and expected behaviour
  2. 2 offline scenarios
  3. 1 gesture-specific test
  4. 1 battery/performance concern
  5. 1 app store compliance check

13 Self-Check

Q1. Why is mobile testing fundamentally different from web testing?

Mobile apps run in a resource-constrained, interruption-heavy environment with unique input methods (touch, sensors) and platform-specific lifecycle rules. They must handle offline mode, battery limits, and OS-level restrictions that web apps do not face.

Q2. What is the minimum recommended tap target size for iOS and Android?

iOS: 44x44 points. Android: 48x48 density-independent pixels (dp). Smaller targets lead to mis-taps and user frustration, especially for older users or those with motor impairments.

Q3. What should happen when an app is backgrounded during a critical action?

The app should preserve state and allow resumption. For sensitive actions (payments, form submission), the app should either complete in the background or safely abort and notify the user on return. Never leave the user in an ambiguous state.

Q4. Why must you test on physical devices, not just emulators?

Emulators do not replicate real-world performance, thermal throttling, memory pressure, gesture accuracy, or hardware-specific bugs. A GPU driver bug on a real Samsung device will never appear on an emulator.