Analytics for PvP
In this post, we’ll explore common issues in PvP matchmaking, including balancing early-game opponents and optimising matchmaking times. We’ll also discuss strategies for overcoming these challenges using analytics to improve player experience and retention. The post aims to cover some of the basics, but much more advanced analysis is often required to help a game succeed.
Games with a PvP component are some of the most engaging. They are also known for their ability to incentivise players to pay: the competitive spirit can drive significant spending, and players often want to show off their achievements, displaying status and prestige within the community. PvP games also have higher replayability because the human element drives variety.
These games come with their own challenges, such as ensuring that players have a positive initial experience, learn the basics they need to face other human opponents, and come away with the impression that the game is fair and engaging.
Balancing Opponents in Early Games
The idea of matching a player in their first PVP game against a bot sparks plenty of debate. On the plus side, starting with bots allows one to carefully shape the learning curve, helping players grasp strategies, familiarise themselves with the game’s interface, and avoid the toxic interactions that can sour their initial experience. However, there’s a flip side—bot matches might lead to false expectations, leaving players unprepared for the real competition, or they might simply feel less thrilling and satisfying than facing real opponents.
While tracking immediate churn after each game can provide some insight, it's important to remember that early drop-off can be driven by factors beyond matchmaking. Most games see a drop-off after the first match, no matter how excellent the initial experience is or whether players have paid for the game.
Let’s consider an example:
Game A throws players straight into the action with real opponents from their first match. Despite a harsh 67% chance of losing their first game, the drop-off rate after this initial match is surprisingly low at just 6%. So, how does Game A achieve this?
This game has an optional tutorial map, which many players skip, believing the game mechanics are straightforward or that they can pick them up as they play. After the matches, players also participate in a post-game survey where they rate their experience and opponents. Interestingly, the feedback is positive across the board: players report having fun regardless of whether they won or lost, and they feel the game is fair and enjoyable even when they don't come out on top.
Success ratio of the first 5 matches of Game A
Game A is an excellent example of how a loss perceived as fair and fun doesn't lead to immediate churn. I am using the word "immediate" on purpose, because a streak of losses and a lack of any sense of progression can still lead to churn further down the road.
Despite these positives, player retention still shows a gradual drop-off after each game played. Nevertheless, Game A maintains a 10% 30-day rolling retention rate, which is considered solid within its genre.
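For illustration, here is a minimal sketch of how a 30-day rolling retention figure like Game A's could be computed from activity telemetry using Python and pandas. The table layout, column names, and the exact retention definition (returning on day 30 or later) are assumptions made for the example; definitions vary between studios.

import pandas as pd

# Hypothetical activity log: one row per player per day they were active.
activity = pd.DataFrame({
    "player_id": [1, 1, 2, 2, 3],
    "activity_date": pd.to_datetime(
        ["2024-01-01", "2024-01-20", "2024-01-01", "2024-02-05", "2024-01-02"]
    ),
})

# Treat each player's first active day as their install date.
installs = activity.groupby("player_id")["activity_date"].min().rename("install_date")
activity = activity.join(installs, on="player_id")
activity["days_since_install"] = (
    activity["activity_date"] - activity["install_date"]
).dt.days

# Rolling retention here: share of players who return on day 30 or later.
cohort = activity["player_id"].nunique()
retained = activity.loc[activity["days_since_install"] >= 30, "player_id"].nunique()
print(f"30-day rolling retention: {retained / cohort:.1%}")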
Game B, conversely, has players face a bot in their first match. In this case, the opponent is significantly weaker than the player, and the focus is entirely on learning the mechanics. The player is made aware that they are playing against a non-player opponent.
Game B - early match success ratio
This game has a significant drop-off after the first game, with 50% of players leaving by the time the tutorial game is over. This churn is not due to a matchmaking problem and would be a good subject for another post about game tutorialisation.
The difficulty ramps up significantly after the first game, when players start facing real opponents. At this point, we monitor whether losing a match leads to a more significant drop-off than winning, which would be a sign of frustration, likely stemming from a perceived lack of fairness.
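As a minimal sketch of that check (with made-up column names), the churn rates after a win and after a loss at this stage can be compared directly:

import pandas as pd

# Hypothetical log of each player's first match against real opponents,
# with the result and whether the player ever played again afterwards.
matches = pd.DataFrame({
    "player_id":    [1, 2, 3, 4, 5, 6],
    "result":       ["loss", "win", "loss", "loss", "win", "win"],
    "played_again": [False, True, False, True, True, True],
})

churn_by_result = (
    matches.assign(churned=lambda df: ~df["played_again"])
           .groupby("result")["churned"]
           .mean()
)
print(churn_by_result)
# A churn rate after losses that is much higher than after wins points to
# frustration and a perceived lack of fairness rather than general drop-off.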
It can be hard to tell why players are leaving or what gives them a negative impression of the game. To tackle this challenge, we often recommend external playtesting or A/B testing different difficulty levels in early games.
This approach helps us find the ideal balance between learning, ease, and challenge, ultimately creating a more compelling and engaging introduction for new players.
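As an illustration of what reading out such an A/B test can look like, a two-proportion z-test on next-session return rates is often enough; the variant split and numbers below are invented for the sketch.

from statsmodels.stats.proportion import proportions_ztest

# Hypothetical experiment: two first-match difficulty variants, and how many
# players in each group returned for a second session.
returned = [4210, 4480]       # variant A (harder first match), variant B (easier)
exposed = [10000, 10000]

stat, p_value = proportions_ztest(count=returned, nobs=exposed)
print(f"Return rate A: {returned[0] / exposed[0]:.1%}, "
      f"B: {returned[1] / exposed[1]:.1%}, p-value: {p_value:.4f}")
# A small p-value suggests the difficulty change genuinely moved early
# retention rather than the difference being noise.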
As you can probably tell, there isn't one recipe for how easy the first match should be or whether facing a real opponent is better than playing against a bot; success is often achieved through experimentation, trial and error, and, most importantly, measurement and analysis.
Matchmaking times
Finding a perfect opponent can take longer than players are willing to wait. Many games prioritise new players by placing them in queues that are more likely to match them quickly, sometimes even against bots or lower-skilled players, to ensure a positive first experience.
We regularly track and analyse matchmaking duration and the time players spend waiting before aborting their attempt to find an opponent. When evaluating the impact of less-than-perfect matchmaking on player retention, we often find that it's better to match players with an imperfect opponent (or a bot) than to risk them getting bored in a long queue during their first game session.
In games without a steady influx of new players, we often compare matchmaking durations experienced by new and seasoned players. For example, in Game C, matchmaking for players in their first ten matches takes longer than for those in the 11-20 match bracket, even if the difference is just a matter of seconds. As players progress, the variation in matchmaking times increases, meaning some players wait significantly longer than others. Still, the overall trend is towards decreased matchmaking time due to the high availability of suitable opponents. Since Game C is a mobile game with matches lasting only a few minutes, developers must monitor and optimise matchmaking times to ensure players spend more time playing than waiting.
Matchmaking time analysis for Game C
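A minimal sketch of this comparison, assuming matchmaking telemetry with a per-attempt wait time and the player's lifetime match count (all names and numbers are invented):

import pandas as pd

# Hypothetical matchmaking telemetry: one row per matchmaking attempt.
queue = pd.DataFrame({
    "player_id":    [1, 1, 2, 2, 3, 3, 4, 4],
    "match_order":  [3, 15, 7, 22, 2, 40, 9, 18],
    "wait_seconds": [41, 22, 38, 25, 55, 12, 47, 30],
})

# Bucket players by experience, mirroring the 1-10 / 11-20 brackets above.
queue["bracket"] = pd.cut(
    queue["match_order"],
    bins=[0, 10, 20, float("inf")],
    labels=["1-10", "11-20", "21+"],
)

# Median wait and its spread per bracket, as in the Game C chart.
print(queue.groupby("bracket", observed=True)["wait_seconds"].agg(["median", "std"]))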
In some PvP games, we can analyse early player performance and how it correlates with long-term engagement. Often, we find that players with lower win/loss or kill/death ratios on their first day or during their initial session are more likely to churn quickly. This early drop-off can be mitigated by improving onboarding, offering practice games against bots until players feel confident, or providing better starting equipment. However, many developers accept this churn, recognising that players with lower skill levels may not enjoy the game in the long run.
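A rough sketch of that kind of analysis, assuming a per-player first-day summary table (the column names are again illustrative):

import pandas as pd

# Hypothetical first-day summary: win ratio on day 0 and whether the player
# was still active a week later.
day0 = pd.DataFrame({
    "player_id":      [1, 2, 3, 4, 5, 6, 7, 8],
    "day0_win_ratio": [0.2, 0.8, 0.1, 0.5, 0.9, 0.3, 0.6, 0.0],
    "active_day7":    [0, 1, 0, 1, 1, 0, 1, 0],
})

# A simple correlation is often enough to flag the pattern before moving on
# to cohort curves or survival models.
print(day0["day0_win_ratio"].corr(day0["active_day7"]))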
Data collection to support PvP games
To effectively analyse and improve matchmaking, it's essential to collect both match-level and player-level data. Here are the key data points to consider:
Match-Level Data:
General Match Information collected once per match:
Player and bot counts at the start and end
Total match duration
Winning team
Team scores
Team make-up (roles, operators, heroes)
Some player-level information can be included in this telemetry, but most commonly, it is collected separately.
Player-Level Data:
Player-Specific Match Data - every player sends their own set of match-related data points (an example event follows the list):
Match ID
Match type
Flag indicating if the player is the match host
Match duration
Player level
Player role/operator/character/hero
Player loadout
MMR (Matchmaking Rating) or Elo
Match order (number of matches of this type the player has played)
Other game-specific variables
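To make the list above concrete, here is a sketch of what a single player-level match event might look like; the field names and values are purely illustrative, not a required schema.

import json

# Hypothetical player-level match event, sent once per player per match.
player_match_event = {
    "event":        "player_match_end",
    "match_id":     "9f3c2a17-8e4b-4d2a-b1c0-5a6f7e8d9c0b",  # shared by all participants
    "match_type":   "ranked_1v1",
    "is_host":      False,
    "duration_s":   312,
    "player_level": 14,
    "role":         "support",
    "loadout":      ["smg_basic", "smoke_grenade"],
    "mmr_start":    1040,
    "mmr_end":      1055,
    "match_order":  7,   # the player's 7th match of this type
}

print(json.dumps(player_match_event, indent=2))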
Whilst we have not touched upon it in this post, we often perform MMR/Elo analysis to help validate matchmaking throughout players' progression.
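As a small illustration of what such a validation can involve, the standard Elo expected-score formula can be compared against observed win rates; the ratings below are made up.

def elo_expected(rating_a: float, rating_b: float) -> float:
    # Standard Elo expected score for player A against player B.
    return 1.0 / (1.0 + 10 ** ((rating_b - rating_a) / 400))

# If matchmaking is well calibrated, the average expected score across many
# matches should track the actual win rate of the higher-rated side.
print(elo_expected(1100, 1000))  # ~0.64: a 100-point favourite should win ~64%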
Additional Data Points
Matchmaking Time:
Can be collected separately or included in player-level match data
Match Ordering:
Typically performed during the analysis phase
MMR / Elo:
Many games collect this both at the start and at the end of each match
Unique Match ID:
Every match should have a unique identifier, such as a matchmaking room ID on platforms like Steam. This identifier allows all players and actions related to the same match or group to be linked together. Sometimes a distinction is made between group ID, lobby ID, and match ID, often when the group persists across multiple matches.
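A small sketch of how the shared match ID ties the two levels of telemetry together (table and column names are assumptions for the example):

import pandas as pd

# Hypothetical match-level and player-level tables linked by match_id.
matches = pd.DataFrame({
    "match_id":     ["m1", "m2"],
    "duration_s":   [300, 280],
    "winning_team": ["red", "blue"],
})
players = pd.DataFrame({
    "match_id":  ["m1", "m1", "m2", "m2"],
    "player_id": [1, 2, 3, 4],
    "team":      ["red", "blue", "red", "blue"],
})

# The unique match_id lets every player row be tied back to its match.
per_player = players.merge(matches, on="match_id", how="left")
per_player["won"] = per_player["team"] == per_player["winning_team"]
print(per_player)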
Final remarks
Finding suitable opponents for players can be challenging in the early stages of a game's life or during a soft launch. To address this, many games start by matching players against bots. This approach ensures a smooth experience until the player base grows sufficiently.
Data analysis is critical when launching PvP games, even more so than in other genres. PvP games thrive on balanced competition and fair matchmaking, which hinge on deep insights derived from data. By tracking key metrics such as win/loss ratios, matchmaking times, and player progression, developers can uncover the trends that influence player satisfaction and long-term engagement. This data empowers them to make precise tweaks to game balance, refine matchmaking algorithms, and enhance onboarding, ensuring that players stay engaged and feel the competition is fair. In the fiercely competitive world of PvP gaming, where player expectations are sky-high, leveraging data effectively is the key to a successful launch, lasting player loyalty, and sustained growth.