As the gap between desktop and mobile devices narrows and the ways these devices are used increasingly overlap, optimization methods such as A/B testing are becoming critically important on the mobile platform as well.
Fact 1: As of December 2013, iOS and Android together accounted for approximately 90% of the mobile and tablet market.
The mobile platform is one of the least standardized environments in which to develop and deploy an app. The numerous vendors and OS versions prevalent in the mobile space add to the standardization problem. To get a picture of how diverse the landscape is, consider the variety in mobile devices: some users carry high-end devices powered by 2 GHz+ quad-core or dual-core processors running the latest iOS or Android version, while economy-level users carry devices with first-generation processors below 1 GHz, often on older versions of the OS.
Fact 2: More than 1 million new Android devices are activated worldwide each day, and 1.5 billion apps are downloaded each month.
This diversity makes it difficult for mobile app developers to get their apps working across virtually all available OS versions and device types. It also compounds the difficulty of conducting an A/B test on such a fragmented platform. But is this the only difficulty?
There are other challenges as well. For the reasons above, mobile apps face another major hurdle: irritating their users. On top of that, conducting any test is a bolder and riskier decision.
Mobile platforms and mobile apps
Almost all mobile app developers cater to two major mobile platforms: iOS and Android. BlackBerry is passé, Windows Phone is attempting a re-entry, and other mobile platforms do not have a user base anywhere near as large as iOS and Android.
[Figure: mobile operating system market share, pie chart. Source: http://www.netmarketshare.com/]
On either OS, a mobile app is based on HTML5, built as a native app, or is a hybrid of both. Configuring an A/B test on an HTML5-based app is comparatively easier than working on a native app. However, HTML5-based apps are frequently reported to be sluggish and unresponsive, which severely hampers engagement, views, and even ratings in the app store. HTML5-based apps are easier to update, whereas native apps can be updated only once a month because of the monthly code lock regulated by the App Store. This in itself is a problem for running continuous A/B testing programs. Facebook is an example of an app that faced this problem.
The testing scenario in Facebook's apps: according to a TechCrunch report, earlier in 2013 Facebook decided to port its mobile app from HTML5 to a native app to improve performance. This did improve the app's performance and engagement. But the monthly code freeze, allowing only one code update per month, impacted its testing initiatives.
Testing mobile apps
Fact 3: As of January 2014, iPhone and iPad owners had spent over USD 10 billion on apps and in-app purchases, more than USD 1 billion of it during the holiday period alone.
Whatever the mobile platform, business owners need to know what works best within their app and what doesn't; ignoring this means passing up potential income. To measure how an installed app is used, app tracking mechanisms can be integrated with the app. But to ensure that users actually keep using the app on a regular basis, testing various aspects of its usability becomes important.
How is the typical A/B Test done?
- Different variations of the app, or of certain parts of the app, are shown to a fraction of the users
- Usage of the different variations is tracked
- Actions and conversions are analyzed to find the most effective variation
- The winning variation is exposed to all users
Over and above this basic flow, there could be follow-up tests if there are many other variations to be tested.
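To make this flow concrete, here is a minimal sketch in Java of how an app might deterministically assign a user to a variation and expose only a fraction of users to the test. All names here (VariantAssigner, assignVariant, the experiment and variant labels) are illustrative assumptions, not the API of any particular A/B testing SDK.

```java
import java.nio.charset.StandardCharsets;
import java.security.MessageDigest;
import java.security.NoSuchAlgorithmException;

// Minimal sketch of deterministic A/B bucketing by user ID.
// All names are illustrative, not any specific SDK's API.
public class VariantAssigner {

    // Hash the user ID together with the experiment name so the same
    // user always sees the same variation of a given experiment.
    public static String assignVariant(String userId, String experiment,
                                       String[] variants, double exposure) {
        int bucket = bucketOf(userId + ":" + experiment, 10_000);
        // Only a fraction of users (e.g. exposure = 0.10) enters the
        // test; everyone else keeps the default experience (variant 0).
        if (bucket >= exposure * 10_000) {
            return variants[0];
        }
        return variants[bucket % variants.length];
    }

    // Map a string to a stable bucket in [0, buckets) via SHA-256.
    private static int bucketOf(String key, int buckets) {
        try {
            MessageDigest md = MessageDigest.getInstance("SHA-256");
            byte[] h = md.digest(key.getBytes(StandardCharsets.UTF_8));
            int v = ((h[0] & 0xFF) << 24) | ((h[1] & 0xFF) << 16)
                  | ((h[2] & 0xFF) << 8) | (h[3] & 0xFF);
            return Math.floorMod(v, buckets);
        } catch (NoSuchAlgorithmException e) {
            throw new IllegalStateException(e); // SHA-256 is always available
        }
    }

    public static void main(String[] args) {
        String variant = assignVariant("user-42", "checkout-button",
                new String[]{"control", "green-button"}, 0.10);
        System.out.println("Show variant: " + variant);
    }
}
```

Hashing the user ID together with the experiment name keeps assignment deterministic: the same user sees the same variation on every launch, with no server round-trip needed.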
What are the challenges?

The flow above looks simple, but it hides the complexities involved in setting up an A/B test on mobile. Here are a few areas one should be prepared for when planning a test.
Extrinsic challenges
- Smaller attention span of mobile users
- Smaller screen sizes
Intrinsic challenges
- 1) A database within the mobile device: Since the mobile app would be undergoing tests across its wide array of users, storing all the test data requires a database. If the app doesn't provide or connect to a database within the mobile device, a lot of valuable data would be lost.
- 2) App frontend/backend updates: Since apps in the App Store or other app marketplaces undergo a monthly code freeze until the next update, any modification to the frontend or backend is staggered through the update window.
- 3) Online-offline modes: If the mobile device goes offline, the app cannot communicate or transfer the data collected from testing.
- 4) The test duration: Because mobile devices switch between online and offline modes, data from A/B tests on the app is collected on a deferred basis. Additionally, app updates ship only monthly, causing further delays. This, in turn, stretches the duration of the A/B test; a rough calculation of what drives that duration follows this list.
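Test duration is easier to reason about with numbers. The sketch below applies the standard two-proportion sample-size formula to estimate how many users each variation needs, then divides by daily test traffic to get a rough duration. The formula itself is standard statistics; the traffic figures and method names are hypothetical.

```java
// Minimal sketch: estimate per-variation sample size and test duration
// for comparing two conversion rates (standard two-proportion formula,
// with 95% confidence and 80% power as illustrative defaults).
public class TestDuration {

    static long sampleSizePerVariant(double baselineRate, double expectedRate) {
        double zAlpha = 1.96; // two-sided 95% confidence
        double zBeta  = 0.84; // 80% power
        double pBar   = (baselineRate + expectedRate) / 2.0;
        double effect = expectedRate - baselineRate;
        double n = 2.0 * pBar * (1.0 - pBar)
                 * Math.pow(zAlpha + zBeta, 2) / (effect * effect);
        return (long) Math.ceil(n);
    }

    public static void main(String[] args) {
        // Hypothetical numbers: 4% baseline conversion, hoping to detect 5%.
        long perVariant = sampleSizePerVariant(0.04, 0.05);
        long dailyUsersInTest = 2_000; // assumed traffic entering the test per day
        long days = (long) Math.ceil(perVariant * 2.0 / dailyUsersInTest);
        System.out.println("Users per variation: " + perVariant);
        System.out.println("Estimated duration:  " + days + " days");
    }
}
```

On mobile, offline devices report late and updates ship monthly, so the real calendar time is usually longer than this back-of-the-envelope estimate.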
So should we conduct A/B tests on mobile apps?
The challenges highlighted above are worth being aware of so that the testing program can be set up the right way. They shouldn't, however, be a reason not to test. As continuous optimization becomes part of the culture in many different industries, A/B testing in mobile apps is a suitable method for optimizing for mobile users.
To overcome the challenges mentioned above:
- Integrate a mini database when the app is installed (see the sketch after this list)
- Define a package of tests and push it with fresh installations
- Set up a periodic data-sending schedule so that data collected while a device was offline is sent once it comes back online (also covered in the sketch below)
- Use test duration calculators to estimate how long the test should run, and evaluate the results against the desired confidence level and conversion rates (see the sample-size sketch above)
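A minimal sketch of the first and third ideas combined, assuming an Android app: A/B test events are written to a small on-device SQLite table and flushed to the server on a periodic schedule, surviving offline periods and app restarts. EventStore, Uploader, and the table layout are all illustrative assumptions, not a specific SDK.

```java
import android.content.ContentValues;
import android.content.Context;
import android.database.Cursor;
import android.database.sqlite.SQLiteDatabase;
import android.database.sqlite.SQLiteOpenHelper;

// Minimal sketch: on-device queue for A/B test events. Events
// survive offline periods and app restarts, and are flushed to
// the server when connectivity is available.
public class EventStore extends SQLiteOpenHelper {

    public EventStore(Context context) {
        super(context, "ab_events.db", null, 1);
    }

    @Override
    public void onCreate(SQLiteDatabase db) {
        db.execSQL("CREATE TABLE events (" +
                "id INTEGER PRIMARY KEY AUTOINCREMENT, " +
                "experiment TEXT, variant TEXT, action TEXT, ts INTEGER)");
    }

    @Override
    public void onUpgrade(SQLiteDatabase db, int oldV, int newV) { }

    // Record an event locally; cheap and always available offline.
    public void track(String experiment, String variant, String action) {
        ContentValues row = new ContentValues();
        row.put("experiment", experiment);
        row.put("variant", variant);
        row.put("action", action);
        row.put("ts", System.currentTimeMillis());
        getWritableDatabase().insert("events", null, row);
    }

    // Called on a periodic schedule (e.g. via AlarmManager); sends queued
    // events and deletes each one only after a successful upload.
    public void flush(Uploader uploader) {
        SQLiteDatabase db = getWritableDatabase();
        Cursor c = db.query("events", null, null, null, null, null, "id");
        try {
            while (c.moveToNext()) {
                long id = c.getLong(0);
                boolean sent = uploader.send(
                        c.getString(1), c.getString(2), c.getString(3), c.getLong(4));
                if (!sent) break; // still offline; retry on the next schedule
                db.delete("events", "id = ?", new String[]{String.valueOf(id)});
            }
        } finally {
            c.close();
        }
    }

    // Illustrative callback; a real app would POST to its analytics endpoint.
    public interface Uploader {
        boolean send(String experiment, String variant, String action, long ts);
    }
}
```

Deleting a row only after a successful upload is the key design choice here: if the device is offline, flush() simply stops, and the queued events wait for the next scheduled attempt.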
Bonus tip:
Since native mobile apps do not have cookies to rely on for visitor identification, the user's email address or phone number can be used to index the user.
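One hedged way to do this, sketched below, is to hash a normalized email or phone number into a stable pseudonymous key, so the raw identifier never needs to be stored alongside the test data. The class and method names are hypothetical; note too that hashing alone is not full anonymization, so treat the result as personal data where regulations require.

```java
import java.nio.charset.StandardCharsets;
import java.security.MessageDigest;
import java.security.NoSuchAlgorithmException;

// Minimal sketch: derive a stable user index from an email or phone
// number, so the same person keeps the same variation across sessions
// and devices without relying on cookies.
public class UserIndex {

    static String userKey(String emailOrPhone) {
        try {
            // Normalize first so "User@X.com" and "user@x.com" match.
            String normalized = emailOrPhone.trim().toLowerCase();
            MessageDigest md = MessageDigest.getInstance("SHA-256");
            byte[] hash = md.digest(normalized.getBytes(StandardCharsets.UTF_8));
            StringBuilder hex = new StringBuilder();
            for (byte b : hash) hex.append(String.format("%02x", b));
            return hex.toString();
        } catch (NoSuchAlgorithmException e) {
            throw new IllegalStateException(e); // SHA-256 is always available
        }
    }

    public static void main(String[] args) {
        // Same input always yields the same key, on any device.
        System.out.println(userKey("User@Example.com"));
        System.out.println(userKey("user@example.com"));
    }
}
```

The resulting key can feed directly into the variant-assignment sketch shown earlier, keeping a user's variation consistent across sessions and even across devices.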
Mobile users are always-on consumers; hence, testing the right view for this widespread user base is the best way to meet users' requirements. If the app is tailored well to its users, smaller screens need not be a roadblock to engagement. We encourage our customers who deal with mobile apps to conduct A/B tests to improve app usability and adoption. If you too have an app and are unsure how to improve engagement and conversion, we can help you get it right.
Article references
Fact 1: http://www.netmarketshare.com/operating-system-market-share.aspx?qprid=8&qpcustomd=1
Fact 2: http://developer.android.com/about/index.html
Fact 3: http://www.todayonline.com/tech/iphone-ipad-owners-spent-s13b-apps-2013
Photo credit 1: http://www.netmarketshare.com/
Photo credit 2: http://www.flickr.com/photos/61237118@N00/2164167813/in/photolist-4ieVGa
How Upworthy uses A/B testing for its headlines on Facebook: https://developers.facebook.com/docs/showcase/upworthy/