Exploratory testing on mobile: the messy checks that find real bugs
Source: Dev.to
Portfolio version (canonical, with full context and styling)
What it is
Risk‑driven exploratory sessions where design, execution, and analysis happen together.
Platform context
Mobile (Android), where interruptions and device‑state changes are normal.
Timebox
Short, focused sessions – not long wandering play‑throughs.
Typical sessions range from 20 – 45 minutes.
Approach
- Charters
- Controlled variation
- Observation‑led decisions
Outputs
- Defects
- Observations that explain behaviour, with enough context to reproduce
Exploratory testing on mobile in practice
Chartered, time‑boxed sessions with controlled variation, producing defects, context notes, bug reports, and evidence.
Exploratory testing is often summarised as “testing without scripts”. In real mobile QA work, that description is incomplete. This article explains exploratory testing on mobile as it is actually applied in a practical workflow: session structure, risk focus, interruptions and recovery, and how this approach consistently finds issues that scripted checks often miss.
Examples are drawn from a real Android mobile‑game pass, but the focus here is the method, not the case study.

In practice, exploratory testing is a way of working where test design, execution, and analysis happen together. You are not following a pre‑written script. You are observing behaviour and choosing the next action based on risk, evidence, and what the product is doing right now. That does not mean “random testing”. It means structured freedom: you keep a clear intent, and you keep your changes controlled so outcomes remain interpretable.
Why mobile is different
Mobile products rarely fail under perfect conditions. They fail when something changes unexpectedly. On Android especially, many failure modes are contextual and lifecycle‑driven:
- Alarms, calls, and notifications interrupt active flows.
- Apps are backgrounded and resumed repeatedly.
- Network quality changes during critical moments (login, purchase, reward claim).
- UI must remain usable on small screens and unusual aspect ratios.
Applied insight:
For mobile exploration, compare performance across devices where possible and probe interruptions: lock screen, phone calls, network drops, switching Wi‑Fi/data, rotation, and kill/restart recovery.
Expert Voices
Radu Posoi – Founder, AlkoTech Labs (ex‑Ubisoft QA Lead)
Exploratory sessions target these risks directly instead of assuming a clean uninterrupted journey.
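The interruptions listed above can be scripted with adb, so a session can trigger them on demand instead of waiting for them to happen. A minimal sketch, assuming a single connected device; the package name is a placeholder, and the svc network toggles may require elevated privileges on some builds:

```python
import subprocess
import time

PACKAGE = "com.example.game"  # placeholder, not a real package

def adb(*args: str) -> None:
    """Run a shell command on the connected device via adb."""
    subprocess.run(["adb", "shell", *args], check=True)

def lock_unlock_cycle(pause: float = 2.0) -> None:
    """Turn the screen off, wait, then wake it (assumes a swipe-only lock screen)."""
    adb("input", "keyevent", "KEYCODE_POWER")   # screen off
    time.sleep(pause)
    adb("input", "keyevent", "KEYCODE_WAKEUP")  # screen back on
    adb("input", "keyevent", "KEYCODE_MENU")    # dismiss a swipe-only keyguard

def drop_and_restore_network(offline_seconds: float = 5.0) -> None:
    """Cut Wi-Fi and mobile data mid-flow, then restore both."""
    adb("svc", "wifi", "disable")
    adb("svc", "data", "disable")
    time.sleep(offline_seconds)
    adb("svc", "wifi", "enable")
    adb("svc", "data", "enable")

def kill_and_restart() -> None:
    """Force-stop the app and relaunch it to probe cold-start recovery."""
    adb("am", "force-stop", PACKAGE)
    adb("monkey", "-p", PACKAGE, "-c", "android.intent.category.LAUNCHER", "1")
```

Rotation and call/notification interruptions can be driven the same way; what matters is that each helper changes exactly one thing, which keeps outcomes interpretable.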
Session structure
- Charter – a short statement of intent, e.g.:
  - “Explore reward‑claim behaviour under interruptions”
  - “Explore recovery after network loss”
- The charter defines focus, not steps.
- Timeboxing forces prioritisation and prevents unfocused wandering.
Applied insight:
Before you go deep, verify the basics first. A short daily smoke test protects the golden path, so deeper exploratory work isn’t wasted rediscovering obvious breakage.
Nathan Glatus – ex‑Senior QA / Game Integrity Analyst (Fortnite, ex‑Epic Games)
Rather than changing everything at once, alter one variable at a time: lock state, network type, lifecycle state. This keeps results interpretable and defects reproducible.
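One way to keep that discipline is to write the variation plan down before the session starts. A minimal sketch with illustrative variable names: a clean baseline run first, then one run per variation, with everything else held at baseline:

```python
BASELINE = {
    "lock_state": "unlocked",
    "network": "wifi",
    "lifecycle": "foreground",
}

VARIATIONS = {
    "lock_state": ["locked_mid_flow"],
    "network": ["cellular", "offline_then_restored"],
    "lifecycle": ["backgrounded_mid_flow", "killed_and_restarted"],
}

def single_variable_runs(baseline: dict, variations: dict) -> list[dict]:
    """One run per variation; every other variable stays at baseline."""
    runs = [dict(baseline)]  # clean baseline run for comparison
    for variable, values in variations.items():
        for value in values:
            run = dict(baseline)
            run[variable] = value
            runs.append(run)
    return runs

for i, run in enumerate(single_variable_runs(BASELINE, VARIATIONS)):
    print(f"run {i}: {run}")
```

If a run misbehaves, the plan already tells you which single variable differs from the baseline.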
Typical Session Flow
- Charter chosen (risk & focus)
- Timebox set (20 – 45 min)
- Variables defined (one at a time)
- Notes captured live
- Evidence captured when it happens
- Bug report drafted while context is fresh (see the sketch after this list)
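Captured as code, the flow above fits in one structure, so nothing has to be reconstructed from memory after the timebox ends. A sketch with illustrative field names:

```python
from dataclasses import dataclass, field
from datetime import datetime, timedelta

@dataclass
class ExploratorySession:
    charter: str                 # intent, not steps
    timebox: timedelta           # typically 20-45 minutes
    variable_under_test: str     # the one thing changed this run
    notes: list[str] = field(default_factory=list)
    evidence: list[str] = field(default_factory=list)  # recording/screenshot paths
    started: datetime = field(default_factory=datetime.now)

    def note(self, text: str) -> None:
        """Timestamped live note, written as it happens."""
        self.notes.append(f"{datetime.now():%H:%M:%S} {text}")

    def time_left(self) -> timedelta:
        """Remaining timebox; stop and draft the bug report when this runs out."""
        return self.timebox - (datetime.now() - self.started)

session = ExploratorySession(
    charter="Explore reward-claim behaviour under interruptions",
    timebox=timedelta(minutes=30),
    variable_under_test="lifecycle: backgrounded mid-animation",
)
session.note("reward animation started; pressing HOME")
```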
Common Findings
- Soft locks where the UI appears responsive but progression is blocked.
- State inconsistencies after backgrounding or relaunch.
- Audio/visual desynchronisation after OS‑level events.
- UI scaling or readability problems that only appear in specific contexts.
Scenario: Reward claim flow under interruptions (Android)
During an exploratory session, repeatedly backgrounding and resuming the app while a reward flow was mid‑animation triggered a soft lock: the UI stayed visible, but the claim state never completed, blocking progression. This did not appear during clean uninterrupted smoke testing because the trigger was lifecycle timing and state recovery.
Why this matters:
It is normal user behaviour on mobile, not a rare edge case. Exploratory sessions hit it because they are designed to.
Applied insight:
High‑impact exploratory bugs live or die by their evidence. Capture context (client & device state), include frequency (e.g., 3/3 or 10/13), and attach a clear repro so the issue is actionable.
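A sketch of how that frequency can be measured rather than estimated: drive the interruption with adb on each attempt and count reproductions. The package name, the mid‑animation delay, and the manual completion check are all placeholders that depend on the game:

```python
import subprocess
import time

PACKAGE = "com.example.game"  # placeholder, not a real package

def background_and_resume(delay: float) -> None:
    """Background the app mid-animation via HOME, then relaunch it."""
    time.sleep(delay)  # wait until the reward animation is underway
    subprocess.run(["adb", "shell", "input", "keyevent", "KEYCODE_HOME"], check=True)
    time.sleep(1.0)
    subprocess.run(["adb", "shell", "monkey", "-p", PACKAGE,
                    "-c", "android.intent.category.LAUNCHER", "1"], check=True)

attempts, reproduced = 3, 0
for attempt in range(1, attempts + 1):
    input(f"Attempt {attempt}: start the reward claim, then press Enter...")
    background_and_resume(delay=0.5)
    answer = input("Did the claim complete? [y/n] ").strip().lower()
    if answer != "y":
        reproduced += 1

print(f"Soft lock reproduced {reproduced}/{attempts}")  # the number that goes in the report
```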
Evidence‑Driven Reporting
- Screen recordings captured during the session, not recreated later.
- Notes that include context, not just actions (device state, network, lifecycle transitions).
- Bug reports that clearly separate expected behaviour from actual behaviour.
The goal is to make exploratory findings actionable, not anecdotal.
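One lightweight way to enforce that separation is a template filled in during the session rather than after it. A sketch; the field values shown are illustrative:

```python
def render_bug_report(title: str, expected: str, actual: str, steps: list[str],
                      context: dict, frequency: str, evidence: list[str]) -> str:
    """Render a report that keeps expected and actual behaviour clearly separate."""
    lines = [
        f"Title: {title}",
        f"Frequency: {frequency}",
        "Context: " + ", ".join(f"{k}={v}" for k, v in context.items()),
        "Steps to reproduce:",
        *(f"  {i}. {step}" for i, step in enumerate(steps, 1)),
        f"Expected: {expected}",
        f"Actual: {actual}",
        "Evidence: " + ", ".join(evidence),
    ]
    return "\n".join(lines)

print(render_bug_report(
    title="Reward claim soft-locks after background/resume mid-animation",
    expected="Claim completes and progression continues after resume",
    actual="UI stays visible but the claim never completes; progression blocked",
    steps=["Start reward claim", "Press HOME mid-animation", "Resume the app"],
    context={"device": "mid-range Android handset", "network": "wifi",
             "lifecycle": "background/foreground mid-animation"},
    frequency="3/3",
    evidence=["session_recording.mp4"],
))
```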
Key Take‑aways
- Risk‑based testing decisions
- Test charter creation & execution
- Defect analysis & clear bug reporting
- Reproduction step clarity under variable conditions
- Evidence‑led communication
- Mobile UI & interaction awareness
- Device & network variation testing
Exploratory testing is structured, not random. Mobile risk is contextual, not just functional. Interruptions and recovery deserve dedicated exploration. Good notes and evidence make exploratory work credible and actionable.
By using a clear charter, a strict timebox, and controlled variation, you ensure every session has purpose.
If you can’t explain what you were trying to learn in that session, the charter is too vague.
Variables that matter for reproduction:
- Device state
- Network condition
- Lifecycle transitions
- What changed between attempts
Notes should capture context, not just button presses.
Reducing the Test Scenario
- Goal: Trim the scenario to the smallest set of steps that still reproduces the issue.
- Method: Re‑run the reduced scenario while changing one variable at a time until the exact trigger conditions are identified (see the sketch below).
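The reduction step can be framed as a simple greedy loop: drop one step at a time and keep the removal only if the bug still reproduces. A sketch where each step is a callable and reproduces() re-runs a candidate scenario against the device; both are assumptions about how the harness is wired:

```python
from typing import Callable

Step = Callable[[], None]  # one scripted action, e.g. "background the app"

def reduce_scenario(steps: list[Step],
                    reproduces: Callable[[list[Step]], bool]) -> list[Step]:
    """Greedy one-at-a-time reduction to a minimal reproducing step list."""
    reduced = list(steps)
    i = 0
    while i < len(reduced):
        candidate = reduced[:i] + reduced[i + 1:]
        if reproduces(candidate):
            reduced = candidate  # step was irrelevant; index now points at the next step
        else:
            i += 1               # step is needed for the repro; keep it
    return reduced
```

Once the step list is minimal, the single-variable re-runs described above pin down the exact trigger conditions.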
Common Mobile Failure Modes
- Lifecycle & OS‑driven:
- Backgrounding / foregrounding
- Notifications
- Lock / unlock cycles
- Network & Permissions:
- Switching between Wi‑Fi / cellular
- Granting / revoking permissions
- Device Constraints:
- Battery‑saving modes
- Performance throttling
Normal user behaviour creates timing and recovery issues that clean runs often miss.
Rebel Racing – Charter‑Based Exploratory & Edge‑Case Testing
Full artefacts and evidence:
Rebel Racing Project Portfolio
QA Chronicles – Issue 2:
Rebel Racing Issue Details
This dev.to post stays focused on the workflow. The case study links out to the workbook structure, test runs, and supporting evidence.