
Analysis from Flaregames — How to Effectively Predict Launch Week Installs on Mobile

Sam Cheney, App Annie, with Abhimanyu Kumar and Victoria Tran, Flaregames

Flaregames alumni Abhimanyu Kumar and Victoria Tran sat down with App Annie to discuss their recently published article on using data to predict launch downloads for games.

Mobile is becoming increasingly influential as a gaming platform, and this presents an extraordinary opportunity for developers. However, success on mobile can often seem unpredictable. We regularly see surprise hits launch to the top of the charts, while games that look great on paper fall flat.

This is a huge challenge for gaming executives — in the face of this uncertainty, how can you create an accurate forecast for a new game that will truly support good business decisions?

Here at App Annie we’ve spent years building the industry’s most trusted mobile data and analytics platform to give publishers the information they need to make better, data-informed decisions. Fortunately Abhimanyu and Victoria share our passion!

The pair recently published some fascinating research that proves the hypothesis that:


“A relationship exists between a game’s launch week install volume and its Featuring Placement, Genre and Art Style”

— Abhimanyu Kumar, Mobile Games Consultant & Victoria Tran, Data Scientist at Supercell


To help you understand their methodology and how you can put it into practice, we interviewed Abhimanyu Kumar, Mobile Games Consultant, and Victoria Tran, Data Scientist at Supercell, to showcase the research they performed during their time at Flaregames.

We asked Abhimanyu and Victoria how they went about building the model, how it’s helped Flaregames, and how they’re continuing to use data to better understand the unpredictability of the mobile gaming market.

 

App Annie (AA): What was your motivation for creating the model?

Abhimanyu & Victoria (A&V): Given that we were working at Flaregames, a free-to-play (F2P) mobile games publisher, we had a large inflow of potential publishing titles that needed to be evaluated from multiple business perspectives. One such perspective was: “If this game were to be featured by the stores on launch, how would that impact the business case?” To answer this question, the first thing any Product Manager or Analyst would do is try to estimate installs for the launch week and the first month, and this is one of the hardest prediction problems in the industry. Knowing how much of a struggle this was internally at Flaregames, we imagined it was the same for many others in the industry, which made it a very exciting challenge to unravel with data science. We decided to open up our findings to the industry as we wanted to make people's lives a little easier. We believe that sharing such internal knowledge only pushes the industry further and thereby pushes us further.

 


“We decided to open up our findings to the industry as we wanted to make people's lives a little easier. We believe that sharing such internal knowledge only pushes the industry further and thereby pushes us further.”

- Abhimanyu Kumar, Mobile Games Consultant


 

(AA): Which data sets, techniques and software did you use?

(A&V): We sourced all the install data from App Annie’s Store Intelligence feature, and we automated data collection with the easy-to-use App Annie API. The qualitative classification of games in the base dataset was done through an internal tiered review process, using our own featuring placement classification and Game Refinery’s industry-standard genre and art style classifications. Next, a null hypothesis test was used to assess whether or not a given feature (e.g., placement, genre, art style) was predictive. Finally, all the coding work involved in proving the hypothesis and building the predictive model was done in Python.
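
To make that workflow a little more concrete, here is a minimal sketch of the kind of null hypothesis test described above. It is our own illustration rather than the code used at Flaregames: the Kruskal-Wallis test, the column names and the synthetic placeholder data are all assumptions, with the placeholder standing in for an install dataset pulled via the App Annie API.

```python
# Illustrative sketch only, not the authors' actual code. It shows the kind of
# null hypothesis test described above: does launch-week install volume differ
# across featuring placement tiers? The Kruskal-Wallis test, the column names
# and the synthetic data are assumptions made for demonstration.
import numpy as np
import pandas as pd
from scipy import stats

rng = np.random.default_rng(42)

# Placeholder dataset standing in for an install export pulled via the
# App Annie API: one row per launched title, with its featuring placement
# tier and its launch-week installs.
placements = ["banner", "game_of_the_day", "list_feature", "not_featured"]
df = pd.DataFrame({
    "placement": rng.choice(placements, size=400),
    "launch_week_installs": rng.lognormal(mean=11, sigma=1.2, size=400),
})

# Group launch-week installs by placement tier.
groups = [g["launch_week_installs"].values for _, g in df.groupby("placement")]

# Kruskal-Wallis H-test: the null hypothesis is that every placement tier
# draws its launch-week installs from the same distribution. A small p-value
# suggests placement is predictive and worth keeping as a model feature.
h_stat, p_value = stats.kruskal(*groups)
print(f"H = {h_stat:.2f}, p = {p_value:.4f}")
if p_value < 0.05:
    print("Reject the null: placement appears related to launch-week installs.")
else:
    print("Fail to reject the null: no evidence here that placement matters.")
```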

 

(AA): What’s been the impact? How has the model subsequently been used by Flaregames and by yourselves?

(A&V): Without sharing any confidential information, we ended up converting this into an in-house prediction tool — the idea being to facilitate better business planning for Flaregames’ upcoming titles. We were able to test it on newly released titles from other companies, and the model’s initial accuracy rates were good enough to justify further investment. The work has also been well received by many industry professionals, with the article being valuable enough for Deconstructor of Fun to publish. Since the foundational work is strong, further improvements spanning model robustness, prediction accuracy, data collection automation, self-learning algorithms and usability can be built on top of it.

 

(AA): How would you like to see the model applied by gaming companies?

(A&V): We would definitely like other gaming companies to use the foundational findings here to build more robust prediction models from datasets they trust, and to tailor those models to their business needs. The most important best practices to consider when building out a similar model would be:

      1. Choose a data source you trust
      2. Pick additional variables/features carefully
      3. Remove bias from the base dataset classification as much as possible
      4. Always test your hypothesis before model generation
      5. Try out multiple statistical models to find the one that best fits your needs (see the sketch below)
      6. Bonus: For longer-term time-cost optimisation, automate the data-pulling process and maybe even implement self-learning algorithms so the model can keep improving itself
      7. Balance investment in everything above against business goals

The eventual hope is that this effort leads to better business planning for gaming companies of all sizes, and therefore better games for players to enjoy!
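
To illustrate best practice 5, here is a minimal sketch of comparing several candidate models with cross-validation and keeping the one that predicts held-out launch-week installs best. The features, the log transform of the target and the particular scikit-learn models are assumptions for demonstration, not the model Flaregames built.

```python
# Illustrative sketch for best practice 5: compare several candidate models
# with cross-validation and keep the one that predicts held-out launch-week
# installs best. The features, the log transform and the model list are
# assumptions for demonstration, not the model Flaregames built.
import numpy as np
import pandas as pd
from sklearn.compose import ColumnTransformer
from sklearn.ensemble import GradientBoostingRegressor, RandomForestRegressor
from sklearn.linear_model import Ridge
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import Pipeline
from sklearn.preprocessing import OneHotEncoder

rng = np.random.default_rng(0)
n = 500

# Placeholder training data: categorical launch attributes plus the target.
df = pd.DataFrame({
    "placement": rng.choice(["banner", "game_of_the_day", "list_feature", "none"], n),
    "genre": rng.choice(["puzzle", "rpg", "strategy", "casual"], n),
    "art_style": rng.choice(["cartoon", "realistic", "minimalist"], n),
    "launch_week_installs": rng.lognormal(11, 1.0, n),
})

X = df[["placement", "genre", "art_style"]]
# Model the log of installs so errors are multiplicative rather than additive.
y = np.log(df["launch_week_installs"])

candidates = {
    "ridge": Ridge(alpha=1.0),
    "random_forest": RandomForestRegressor(n_estimators=200, random_state=0),
    "gradient_boosting": GradientBoostingRegressor(random_state=0),
}

# 5-fold cross-validated error (in log-install units) for each candidate.
for name, model in candidates.items():
    pipeline = Pipeline([
        ("encode", ColumnTransformer(
            [("onehot", OneHotEncoder(handle_unknown="ignore"), list(X.columns))]
        )),
        ("model", model),
    ])
    scores = cross_val_score(pipeline, X, y, cv=5, scoring="neg_mean_absolute_error")
    print(f"{name}: cross-validated MAE = {-scores.mean():.3f} (log installs)")
```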

 

(AA): In the article you say that launch downloads will always be slightly unpredictable — but data can give a directional steer on likely success. What are some examples of factors that create this unpredictability?

(A&V): Gradual evolution of the market, customer preferences and customer demographics will always contribute to some amount of unpredictability, and it would be unrealistic to say that a prediction model can solve for this. But since these are slow-moving variables, a prediction model tailored to a company’s specific business needs can definitely provide a directional steer, which should be enough for most business decision-making processes. This is also why we showcased install range graphs in the article rather than one magic number.

Other, faster-moving factors can also come into play, such as changes to store layouts or a game’s launch coinciding with that of a big title, and these can invalidate launch estimates. The best advice we could give here would be to keep one’s eyes and ears open for any major external variables that should be factored into launch period install estimates.
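
Picking up on the point about showing install ranges rather than one magic number, here is a minimal sketch of one way to produce a range: fitting separate quantile regressors for a low and a high quantile. This is our own illustration and not the method used in the article; the features, quantiles, model choice and synthetic data are all assumptions.

```python
# Sketch of one way to report an install range rather than a single number:
# fit gradient boosting models with the quantile loss at a low and a high
# quantile and treat their predictions as the band. This is our own
# illustration, not the method used in the article; data and features are
# synthetic placeholders.
import numpy as np
import pandas as pd
from sklearn.compose import ColumnTransformer
from sklearn.ensemble import GradientBoostingRegressor
from sklearn.pipeline import Pipeline
from sklearn.preprocessing import OneHotEncoder

rng = np.random.default_rng(1)
n = 500
df = pd.DataFrame({
    "placement": rng.choice(["banner", "game_of_the_day", "list_feature", "none"], n),
    "genre": rng.choice(["puzzle", "rpg", "strategy", "casual"], n),
    "launch_week_installs": rng.lognormal(11, 1.0, n),
})
X = df[["placement", "genre"]]
y = np.log(df["launch_week_installs"])

def quantile_model(q):
    # One-hot encode the categorical features, then fit gradient boosting with
    # the quantile (pinball) loss so the model targets quantile q.
    return Pipeline([
        ("encode", ColumnTransformer(
            [("onehot", OneHotEncoder(handle_unknown="ignore"), ["placement", "genre"])]
        )),
        ("model", GradientBoostingRegressor(loss="quantile", alpha=q, random_state=0)),
    ])

low = quantile_model(0.10).fit(X, y)
high = quantile_model(0.90).fit(X, y)

# Predicted 10th-90th percentile install range for a hypothetical new launch.
new_game = pd.DataFrame({"placement": ["game_of_the_day"], "genre": ["puzzle"]})
lo_installs = float(np.exp(low.predict(new_game))[0])
hi_installs = float(np.exp(high.predict(new_game))[0])
print(f"Estimated launch-week installs: {lo_installs:,.0f} to {hi_installs:,.0f}")
```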

 

(AA): At this stage your model focuses on the iOS App Store. Do you expect the factors that influence launch downloads, especially being featured, to differ on Google Play?

(A&V): This really depends on the factor being considered. We’d expect the factors considered in our model not to change across stores, as we specifically chose variables that are always key to a customer’s download decision. However, the install range estimates will most definitely differ depending on which store one looks at and which additional factors one considers.

An additional note on factor selection: we believe that customers go through a download experience journey, and this journey is slightly different for users coming in through different channels/stores. Knowing what the most likely journey looks like for a certain channel/store is a good starting point for figuring out which factors are important when building a predictive model.

 

(AA): You evaluated first week and first month downloads as part of the model — did this give any indication of how long the impact of being featured lasts?

(A&V): We think this mostly comes down to two things: store visibility and recommendation algorithms, and word-of-mouth effects (both organic and incited). Both make it difficult to put an exact time frame on how long a featuring effect lasts. From what we have seen in the data, though, we can say that most titles featured at launch realise most of their impact during the first month or so before returning to an organic download baseline.

 

(AA): Game publishers often ask us for advice when it comes to coordinating paid marketing activities with being featured. Did you uncover any insights that would help people with this?

(A&V): Unfortunately not, since we filtered out any user acquisition (UA) based effects from this analysis — an important variable to factor out when trying to predict launch installs through featuring. During the factoring-out process, though, we did realise that many companies perform paid marketing activities in lockstep with a featuring. Anecdotally, we can say that if the platforms know you’re willing to invest in paid marketing activities during launch, it definitely helps your chances of being featured.

 

(AA): What are your next steps?

(A&V): In order of priority, some next steps would be:

      1. Automating the data collection, profiling and learning process
      2. Considering additional factors to increase model robustness
      3. Testing out different prediction model designs to optimise prediction accuracy rates
      4. Once optimised on iOS, expanding the complete analysis to the Google Play Store too

If we were to shoot for the moon, we’d love to build an open-source tool that does all of this and that anyone can use to run their own launch install volume estimations!

 

(AA): Beyond this model, what do you think the broader gaming industry should be prioritizing, but isn’t, when it comes to using data?

(A&V): To be honest, there are many difficult data problems like this one that we know the industry is actively working towards solving — and we’re eagerly waiting to see the results! On a higher level, though, our parting words when it comes to using data would be: "Be player first while balancing out business results, through being data informed and not data driven."

Many companies make the mistake of trying to find the answers to their problems only in the data. Data is essential when it comes to making better business decisions, but it has to be weighed against other player first qualitative perspectives that the data cannot reveal. It is important to realise that data only results from decisions made in the absence of it. Being data informed is all about using data as one additional lens to spur the right product discussions, so as to always keep players first while making the most educated business decision possible.


“Be player first while balancing out business results, through being data informed and not data driven. Data is essential when it comes to making better business decisions, but it has to be weighed against other player first qualitative perspectives that the data cannot reveal. It is important to realise that data only results from decisions made in the absence of it.”

- Abhimanyu Kumar, Mobile Games Consultant and Victoria Tran, Data Scientist at Supercell


For more information on how Victoria and Abhimanyu are using data to better understand the gaming market, follow them on LinkedIn. If you’re interested in looking into some of the data behind the mobile market yourself, sign up for App Annie’s free product and get started today.

December 18, 2018
