
Marcus Thornbury

Senior QA Strategist

QA Nexus Pty Ltd

Specialises in iterative playtesting methodologies and evolutionary QA frameworks that optimise feedback loops for game development cycles.

14 Years in QA
40+ Game Titles Led
30+ Studios Using Darwin
Background

From Brisbane Testing to Framework Innovation

Marcus began his career as a QA tester at an indie studio in Brisbane in 2010, where he quickly recognised that traditional testing methodologies weren’t capturing the nuanced feedback needed for modern game design. He’s spent the last 14 years pushing QA beyond checklist compliance—toward something more thoughtful, more strategic, and honestly more useful.

After completing his Bachelor of Information Technology at RMIT University in Melbourne, he transitioned into QA management roles at several mid-sized studios. That’s where the breakthrough happened. He developed early versions of what would become the Darwin Playtesting Framework—a system designed to mimic natural selection principles in how player feedback shapes development priorities. The idea was simple: stop treating all feedback equally. Instead, let the data tell you what matters most.

His work gained real traction in 2016 when he published a white paper on iterative feedback loop optimisation that resonated throughout the Australian game development community. Studios started reaching out. Developers wanted to know how he was getting such actionable insights from playtesting sessions. That’s when he knew he was onto something worth scaling.

Since joining QA Nexus in 2018, Marcus has refined this methodology into a comprehensive approach now used by 30+ studios across the Asia-Pacific region. He combines player telemetry, qualitative feedback sessions, and evolutionary algorithms to identify critical issues faster and more accurately. Not just finding bugs—understanding how players actually interact with games and translating that understanding into development insights that matter.

He’s driven by one conviction: good QA isn’t bureaucratic overhead. It’s the bridge between what developers build and what players actually experience. That gap? That’s where the real work happens.

Specialisation

Core Areas of Focus

Deep knowledge across the QA landscape, built from real project work.

Playtesting Framework Design

Developing structured playtesting protocols that capture meaningful feedback without overwhelming participants or skewing data with leading questions.

Feedback Loop Optimisation

The Darwin Playtesting Framework uses evolutionary principles to prioritise feedback. Which issues matter most? Which can wait? Data decides, not gut feel.

QA Process Scaling

Moving from manual testing to systematic QA operations. How do you scale testing across platforms, regions, and player demographics without losing quality?

Player Telemetry Analysis

Reading the story in your player data. Heat maps, session duration, failure points—every metric tells you where the game’s breaking down.

Cross-Platform Testing Strategy

Games don’t exist in a vacuum. Mobile, console, PC—each platform has quirks. Coordinating QA across all of them without duplicating effort or missing platform-specific issues.

Team Leadership & Training

Building QA teams that think critically, not just follow checklists. How do you train testers to spot the problems before they become catastrophes?

Philosophy

How Marcus Thinks About QA

QA as Strategic Partner

Most teams still treat QA like a gate at the end of development. You build the game, throw it to QA, they report bugs, you fix them. That’s reactive. Marcus believes QA should be in the room from day one, informing design decisions as they’re made.

Why? Because by the time you’re six months into development, changing direction costs exponentially more. But if you’ve been testing and iterating from month one—gathering real player feedback, not just designer assumptions—you’ve already caught the big problems early.

“Good QA isn’t about finding bugs. It’s about understanding how players actually interact with your game and translating that understanding into actionable insights. That’s where the magic happens.”

— Marcus Thornbury

Data-Driven, Human-Centred

The Darwin Framework combines two things that shouldn’t feel like opposites but often do: hard data and human judgment. Player telemetry tells you WHAT’s happening. Qualitative feedback from testers tells you WHY. Neither works without the other.

Telemetry alone? You get numbers without context. Feedback alone? You get opinions that might not reflect actual player behaviour. Together, you get clarity. You know which issues matter, which ones can wait, and which ones nobody actually cares about despite what they said in the survey.
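The join Marcus describes can be sketched in a few lines: telemetry gives you counts (the WHAT), session notes give you reasons (the WHY), and lining them up shows which hotspots are actionable versus which need qualitative follow-up. All the location names, failure counts, and notes below are invented for illustration; they are not data from any real project or from the Darwin Framework itself.

```python
from collections import Counter

# Invented telemetry: failure counts per location (the WHAT).
telemetry_failures = Counter({
    "level_3_boss": 412,
    "final_door": 290,
    "tutorial_jump": 35,
})

# Invented qualitative session notes, tagged by location (the WHY).
session_notes = {
    "level_3_boss": ["attack pattern feels unreadable", "camera loses the boss"],
    "final_door": ["players don't realise the key is usable here"],
}

# Join the two views: a high failure count with an explanation is actionable;
# a high count with no explanation tells you where to playtest next.
for location, failures in telemetry_failures.most_common():
    notes = session_notes.get(location, [])
    status = "actionable" if notes else "needs qualitative follow-up"
    print(f"{location}: {failures} failures ({status})")
    for note in notes:
        print(f"  - {note}")
```

The design point is that neither dictionary is useful on its own: the counts rank the locations, and the notes turn a ranked location into a fix the team can actually make.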


The Darwin Process

1. Gather Signals

Combine player telemetry, playtesting sessions, bug reports, and community feedback into one data set.

2. Evaluate Impact

Score each issue on severity, scope, and player impact. Not all bugs are created equal—some break the game, others are cosmetic annoyances.

3. Prioritise Ruthlessly

The algorithm surfaces the issues that matter most. Development teams fix what's broken before adding what's fancy.

4. Test & Iterate

Fixes are tested, data is re-gathered, and the cycle repeats. You're not aiming for perfection—you're aiming for continuous improvement.
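The scoring and prioritisation steps above can be sketched as a weighted ranking pass. This is a minimal illustration, not the actual Darwin Framework: the weights, the linear scoring model, and the example issues are all assumptions made for the sake of the sketch.

```python
from dataclasses import dataclass

# Hypothetical weights for step 2; the real framework's scoring model
# would be tuned per project, and may not be linear at all.
WEIGHTS = {"severity": 0.5, "scope": 0.3, "player_impact": 0.2}

@dataclass
class Issue:
    title: str
    severity: float       # 0-1: how badly does it break the game?
    scope: float          # 0-1: how much of the game or player base is hit?
    player_impact: float  # 0-1: how strongly do players notice it?

    def score(self) -> float:
        """Weighted impact score used to rank this issue (step 2)."""
        return (WEIGHTS["severity"] * self.severity
                + WEIGHTS["scope"] * self.scope
                + WEIGHTS["player_impact"] * self.player_impact)

def prioritise(issues: list[Issue]) -> list[Issue]:
    """Step 3: surface the highest-impact issues first."""
    return sorted(issues, key=Issue.score, reverse=True)

# Invented example issues.
issues = [
    Issue("Save corruption on exit", severity=1.0, scope=0.4, player_impact=0.9),
    Issue("Clipping foliage in hub", severity=0.2, scope=0.8, player_impact=0.1),
    Issue("Boss fight soft-lock", severity=0.9, scope=0.3, player_impact=0.8),
]

for issue in prioritise(issues):
    print(f"{issue.score():.2f}  {issue.title}")
```

In this sketch the game-breaking save bug outranks the cosmetic clipping issue despite the latter affecting more of the game, which is exactly the "not all bugs are created equal" point from step 2. Step 4 is the loop around this: fix the top of the list, re-gather signals, re-score.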

Credentials & Recognition

Education

Bachelor of Information Technology, RMIT University, Melbourne (2012)

Current Role

Senior QA Strategist at QA Nexus Pty Ltd since 2018. Led QA operations for 40+ AAA and indie titles.

Published Work

White paper on iterative feedback loop optimisation (2016). Framework now used by 30+ studios across Asia-Pacific region.

Industry Impact

Pioneered the Darwin Playtesting Framework, combining player telemetry with evolutionary algorithms for smarter QA prioritisation.

Featured Work

Articles & Insights

Deep dives into playtesting, feedback loops, and QA strategy.

Setting Up Your First Playtesting Session

The difference between useful playtesting and noise comes down to structure. We’ll walk through how to design sessions that actually tell you something about your game.

Common QA Testing Frameworks Explained

There’s no one-size-fits-all approach to QA. We break down the major frameworks and help you pick the right one for your team and project type.

Explore More QA Insights

Marcus’s work spans game testing, feedback optimisation, and QA strategy. Dive into the full collection of articles and resources on QA best practices.