It’s no secret that A/B testing is the key to optimizing everything in your hyper-casual game, from level design to ASO to monetization. But is the winner of an A/B test clear to the naked eye? In other words, say you don’t have access to the KPIs that reveal the results of your A/B test - no retention rate, no ARPU, no CPI. Could you tell which version is best just by looking at them side by side?

Well, that’s what we set out to find out when we posted a series of A/B tests on our LinkedIn and Instagram. We showed users side-by-side versions of an element (e.g. creatives) from one of our hyper-casual games and asked them to guess which was more successful. Below, we run through each of these tests, explore users’ choices, and discuss why there’s one right answer. With these examples, you can confirm that you’re running the right A/B tests in your own game and optimize performance at every step of publishing.

1. Sort It 3D - the icon test

We showed our followers two icons that we A/B tested for the game Sort It 3D and asked them to select which performed better. The correct answer was the one on the right, but many people thought the left-hand one was better because it was 3D and resembled gameplay more accurately. As someone put it, “I would agree that the left one [is better]. The 3D design feels like an in-game [experience] - it’s appealing [and makes] people want to try it.”

So let’s dive into why the right-hand icon was better:

  • It uses a white background instead of the blue and white of the left icon. This creates stronger contrast to stand out in the store and on phone screens
  • The colors in the winning version are lighter and more cartoonish, which are features that generally perform better for hyper-casual games
  • The perspective is a frontal view that takes up more space and makes the icon more visible and clear

As you run an A/B test on your game icon, consider trying a simpler version with brighter colors, clear contrast, and a cartoonish look.

2. Cheerleaders - comparing creative CPI

For the game Cheerleaders, we tested multiple versions of the creative - so in this challenge, we asked followers on LinkedIn which creative had the lowest CPI. Many guessed correctly, saying it was the left-hand creative. Their reasoning included “Because the characters are more visible and understandable at first glance” and “It just got my attention right away”. 

Our followers were right to identify that the creative on the left used brighter, more visible characters. You only have about 3 seconds to grab the attention of users with your creatives, so using bold colors and high contrast is key to hooking them.

This test only changed the character design in the creative, but it proves that even one element can affect your game’s success. Starting with a simple A/B test like this one for your creatives can help you confirm the game can scale affordably - and it may reveal that a feature performs so well, it’s worth integrating into your build. 
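If you want to check whether a CPI gap like this is real rather than noise, a quick significance test on install rates can help before you commit to a winner. Here’s a minimal sketch in Python - all spend, impression, and install numbers are invented for illustration, and this is one generic approach (a two-proportion z-test), not any ad network’s actual reporting:

```python
# Illustrative sketch: given spend and install counts for two creatives,
# compute CPI and check whether the difference in install rate per
# impression is statistically meaningful. All numbers below are made up.
from math import sqrt, erf

def cpi(spend, installs):
    """Cost per install: total ad spend divided by installs."""
    return spend / installs

def z_test(installs_a, impressions_a, installs_b, impressions_b):
    """Two-proportion z-test on install rate; returns (z, two-sided p)."""
    p_a = installs_a / impressions_a
    p_b = installs_b / impressions_b
    p_pool = (installs_a + installs_b) / (impressions_a + impressions_b)
    se = sqrt(p_pool * (1 - p_pool) * (1 / impressions_a + 1 / impressions_b))
    z = (p_a - p_b) / se
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))
    return z, p_value

# Hypothetical results for two creatives with equal spend and impressions
print(f"CPI A: ${cpi(500, 1600):.2f}, CPI B: ${cpi(500, 1250):.2f}")
z, p = z_test(1600, 100_000, 1250, 100_000)
print(f"z = {z:.2f}, p = {p:.4f}")  # a small p means the gap is unlikely to be noise
```

A low p-value gives you confidence that the cheaper creative really is cheaper, rather than a lucky sample.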

3. Rope Man Run - the fail scenario

Comparing creatives from Rope Man Run, our Instagram followers picked which had a lower CPI. The creative on the left showed the character running, jumping, and gliding through the course, ending in the middle of the action. On the right, the creative ended with the character running out of yarn and therefore losing the level. Responses were split 50/50, with one person saying “I think the left one has a lower CPI. The right one is constantly failing and that's not encouraging.”

But it’s precisely the fact that the right-hand creative shows a fail scenario that makes it the better-performing of the two. In hyper-casual creatives, showing a fail scenario often creates a sense of frustration that encourages users to try playing themselves to win. You can incorporate this into your own creative strategy - try a version showing a failed level and check the CPI to see how it performs.

4. Stacky Dash - every element counts

In this A/B test example, we showed two versions of Stacky Dash creatives side by side to followers on LinkedIn, then asked them to identify which element made the difference in lowering CPI - gameplay speed, background, or color contrast, for example. Most correctly pointed out that the green color in the left-hand creative made it the winner over the all-brown creative on the right, with one person citing the importance of the “contrast of play area and path from the background.”

This contrast is key. Increasing the contrast between characters and the environment in a video improves clarity and shows users how to play more easily - and that goes a long way in lowering CPI.

The creative on the left heightens the contrast by using a bright green color for the game board and tiles. This makes both gameplay and progression more clear - users can see the tiles that have already been laid down and the empty spaces that still need filling.
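If you want to put a rough number on how much a palette change boosts contrast, one common proxy is the WCAG contrast ratio between two colors. A hedged sketch - the hex values standing in for the green tiles and the brown path are invented for illustration:

```python
# Rough sketch of one way to quantify "contrast" between game elements:
# the WCAG relative-luminance contrast ratio. The hex colors below are
# hypothetical stand-ins for bright green tiles vs. a brown path.
def relative_luminance(hex_color):
    """WCAG 2.x relative luminance of an sRGB hex color like '#4be04b'."""
    def channel(c):
        c /= 255
        return c / 12.92 if c <= 0.03928 else ((c + 0.055) / 1.055) ** 2.4
    r, g, b = (int(hex_color.lstrip('#')[i:i + 2], 16) for i in (0, 2, 4))
    return 0.2126 * channel(r) + 0.7152 * channel(g) + 0.0722 * channel(b)

def contrast_ratio(color_1, color_2):
    """WCAG contrast ratio, from 1:1 (identical) up to 21:1 (black on white)."""
    l1, l2 = sorted((relative_luminance(color_1), relative_luminance(color_2)), reverse=True)
    return (l1 + 0.05) / (l2 + 0.05)

# Bright green tiles on a dark brown path pop far more than brown-on-brown
print(f"green vs. brown: {contrast_ratio('#4be04b', '#5a3a1e'):.1f}:1")
print(f"brown vs. brown: {contrast_ratio('#8a6a3c', '#5a3a1e'):.1f}:1")
```

A higher ratio between the play area and its surroundings is a quick sanity check that your winning palette really does separate gameplay from the background.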

5. Cheerleaders (revisited) - theme matters

In another A/B test for Cheerleaders, we asked our Instagram followers to identify the winner of two creatives with different themes. The creative on the right featured a spooky theme, with a graveyard background and skeleton characters, while on the left the theme was lighter - it had a sky environment and female characters. Most people thought the winner was the right-hand creative, “because Halloween is near” and “because the characters and environment are matched”. However, the lighter-looking creative on the left had a lower CPI.

Why?

A spooky theme like the one on the right appeals to some users, but isn’t as accessible as the creative on the left. The left-hand version uses bright colors and a theme that appeals to most players and therefore has a lower CPI. Since the hyper-casual audience is pretty much everybody, your creative should capture the attention of all types of users - avoid niche themes, like space or horror.

Put it to the test

Keep A/B testing all of the elements in your game, from creatives to features in the build itself. These five tests are an excellent place to start as you optimize game performance - just remember to keep testing even after launch. Ongoing A/B testing helps sustain your success and unlocks further opportunities for scale and profit.

Don’t miss out on the next challenge - follow us on Instagram, LinkedIn, and Twitter.

 

Let's put these tips to good use

Publish your game with Supersonic