As the gap between desktop and mobile devices narrows and their usage increasingly overlaps, optimization methods such as A/B testing are becoming critically important on the mobile platform as well.
The mobile platform is one of the least standardized environments in which to develop and deploy an app. The numerous vendors and OS versions prevalent in the mobile space compound the standardization problem. To get a picture of how diverse the landscape is, consider the variety of mobile devices in use: some users carry high-end devices with 2GHz+ quad-core or dual-core processors running the latest iOS or Android version, while economy-level users run devices with first-generation processors clocked below 1GHz and older versions of the OS.
This diversity makes it difficult for mobile app developers to make their apps work across virtually all available OS versions and device types. It also adds to the difficulty of conducting an A/B test on such a fragmented platform. But is this the only difficulty?
There are other challenges as well. For the reasons above, mobile apps face another major hurdle: user frustration. Running any test on top of that is a bolder, riskier decision.
Almost all mobile app developers cater to two major platforms, iOS and Android. BlackBerry is passé, Windows Phone is attempting a re-entry, and other mobile platforms do not have a user base comparable to iOS and Android.
[Pie chart: mobile OS market share. Source: http://www.netmarketshare.com/]
Whatever the OS, a mobile app is built either with HTML5, as a native app, or as a hybrid of both. Configuring an A/B test on an HTML5-based app is comparatively easier than on a native app. However, HTML5-based apps are frequently reported to be sluggish and unresponsive, which severely hampers engagement, views, and even app-store ratings. HTML5-based apps are easier to update, whereas native apps can be updated only about once a month because of the monthly code-lock cycle imposed by the App Store. This in itself has become a problem for running continuous A/B testing programs. Facebook is one example of an app that faced this problem.
The testing scenario in Facebook's apps – According to a TechCrunch report, earlier in 2013 Facebook decided to move its mobile app from HTML5 to native code to improve performance. The app's performance and engagement improved, but the monthly code freeze and once-a-month update cycle hampered its testing initiatives.
Fact 3: As of January 2014, iPhone and iPad owners spent over USD 10 Billion on apps and in-app purchases. Out of this, more than USD 1 Billion was spent during the holiday period alone.
Whatever the platform, business owners need to know what works within their app and what doesn't; ignoring this means leaving potential revenue on the table. App-tracking mechanisms can be integrated into the app to measure how it is used once installed. But to ensure that users actually keep using the app, testing the various aspects of its usability becomes important.
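As a rough illustration of the kind of tracking mechanism mentioned above, the sketch below builds a minimal in-app usage event. The function name, field names, and schema are hypothetical assumptions, not any specific analytics SDK's API:

```python
import json
import time


def track_event(user_id, event_name, properties=None):
    """Build a minimal analytics event record (hypothetical schema).

    In a real app this record would be queued and sent to an
    analytics backend; here we only construct it.
    """
    return {
        "user_id": user_id,
        "event": event_name,
        "timestamp": int(time.time()),
        "properties": properties or {},
    }


# Example: record that a user viewed a (hypothetical) checkout screen.
event = track_event("user-42", "screen_view", {"screen": "checkout"})
print(json.dumps(event))
```

Events like this, aggregated per variant, are what an A/B test ultimately compares.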
Beyond this basic flow, follow-up tests can be run if there are many other variations to be tested.
What are the challenges?
The A/B testing scenario above looks simple, but it hides the complexities involved in setting up the test. Here are a few areas to be prepared for when planning one.
The challenges highlighted above are worth knowing so that the testing program can be optimized the right way; they should not, however, be a reason not to test. As continuous optimization becomes a culture across many industries, A/B testing in mobile apps is a fitting way to optimize for mobile users.
Since native mobile apps cannot rely on cookies for visitor identification, an identifier such as the user's email address or phone number can be used to index the user instead.
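One common way to use such an identifier is deterministic bucketing: hash the stable identifier so the same user always lands in the same variant across sessions, without cookies. The sketch below is a minimal illustration under that assumption; the salt, variant names, and function name are hypothetical:

```python
import hashlib


def assign_variant(user_key, variants=("A", "B"), salt="checkout_test_v1"):
    """Deterministically map a stable user identifier (e.g. an email
    address or device ID) to a test variant.

    Hashing the salted key and taking it modulo the number of variants
    gives a stable, roughly uniform assignment.
    """
    digest = hashlib.sha256((salt + ":" + user_key.lower()).encode("utf-8")).hexdigest()
    bucket = int(digest, 16) % len(variants)
    return variants[bucket]


# The same user gets the same variant on every launch.
print(assign_variant("jane@example.com"))
```

Changing the salt per experiment re-shuffles users, so one test's split does not correlate with another's.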
Mobile users are always-on consumers, so testing the right view for a widespread user base is the best way to meet their requirements. If the app serves its users well, smaller screens need not be a roadblock to engagement. We encourage our customers with mobile apps to conduct A/B tests to improve usability and adoption. If you too have an app and are unsure how to improve engagement and conversions, we can help you get it right.
Photo credit 1 – http://www.netmarketshare.com/