The client had developed a content app that had only been soft-launched as a pilot. They used Mixpanel for in-app marketing and push notifications and needed to understand which features were driving retention and how to improve it, but their existing Mixpanel implementation could not deliver these insights. Many interactions were not being tracked because doing so would have been too expensive, so the bulk of their analysis was done in Google Analytics instead. On top of that, there was a data discrepancy of more than 20% between Google Analytics, Mixpanel, and the backend database, which eroded confidence in the data and sponsorship from senior stakeholders.

For example, they could not take a highly engaged user and understand what made that user active, nor could they confirm whether the product was meeting its performance targets or whether new feature launches were succeeding. With a fast-growing user base and a rising volume of app interactions, the client needed a way to easily track, manage, access, and visualize these interactions to stay ahead of the competition.
Another complication was that the client team was planning to move from the soft launch to a high-profile product launch, backed by a major marketing campaign, within three months. The window for repairing the data measurement and moving to a concrete measurement framework was therefore very short. Their app development team had strong expertise in building apps but not in analyzing data, so they could not help reconcile the data variances or bring integrity to the reports.
Preliminary Investigation: We found that the “Users” metric was critical to measuring the performance of the app, and that it was also the metric causing the most problems when reconciling the data discrepancies.
KPI Discovery: Our findings from the investigation aligned closely with the concerns of the app’s stakeholders. We conducted a KPI workshop with the app business team, which produced a clear understanding of the target audience and of the KPIs that mattered most to the business’s app wing.
Creation of Use Cases: We started by educating the client on what each individual tool could and could not do, and helped them build a complete picture of the project’s data landscape. We then developed use cases covering multiple app screen-flow scenarios and shortlisted the critical ones.
Measurement Implementation Guide: Given the number of errors surfaced by the preliminary investigation, the client opted for a complete re-implementation. While working on the re-implementation, we ensured that every metric and dimension that had been working correctly was carried forward, so past trends were maintained for the data points that were already accurate before our engagement. For every critical use case, events and properties were identified and documented in an implementation guide for the app development team to implement. Mixpanel charges based on the number of events you send, so it is best to be strategic about what is tracked. Since the implementation involved coordination with the app development vendor, the technical document was first kept simple, which also made the content accessible to the client’s business stakeholders. After their approval, the document was elaborated with detailed instructions for the app development team.
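To make the guide concrete, each documented event mapped to a single, deliberate SDK call. The snippet below is a minimal sketch of what one such instrumented interaction could look like, assuming the Mixpanel Android SDK and Kotlin; the event name, properties, and project token are illustrative rather than the client’s actual tracking plan, and the exact getInstance signature varies by SDK version.

```kotlin
import android.content.Context
import com.mixpanel.android.mpmetrics.MixpanelAPI
import org.json.JSONObject

// Illustrative only: the event and properties mirror the style of an
// implementation-guide entry, not the client's real tracking plan.
class ContentTracker(context: Context) {

    // The third argument toggles Mixpanel's automatic events; older SDK
    // versions expose a two-argument getInstance instead.
    private val mixpanel: MixpanelAPI =
        MixpanelAPI.getInstance(context, "YOUR_PROJECT_TOKEN", false)

    fun trackArticleViewed(articleId: String, category: String, premiumUser: Boolean) {
        val props = JSONObject().apply {
            put("Article ID", articleId)
            put("Category", category)
            put("Premium User", premiumUser)
        }
        // One deliberate event per critical use case keeps event volume,
        // and therefore Mixpanel cost, under control.
        mixpanel.track("Article Viewed", props)
    }
}
```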
Testing of Measurement Implementation: After implementation, we tested the updated app for accuracy of data collection in both the development and live versions. We synchronized this work closely with the release cycles of the app development vendor, which required sustained collaboration so that the output stayed relevant to the latest app version. The implementation was tested by validating the events and their properties to check that they were populated appropriately.
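Part of this validation can be automated by checking captured event payloads against the specification in the implementation guide. The helper below is a hypothetical sketch of that idea in plain Kotlin; the spec format and property names are invented for illustration and do not come from any Mixpanel API.

```kotlin
// Hypothetical check: compares a captured event payload against the
// required properties listed for it in the implementation guide.
data class EventSpec(val name: String, val requiredProperties: Set<String>)

fun validateEvent(spec: EventSpec, captured: Map<String, Any?>): List<String> {
    val issues = mutableListOf<String>()
    for (property in spec.requiredProperties) {
        val value = captured[property]
        if (value == null || (value is String && value.isBlank())) {
            issues.add("${spec.name}: missing or empty property '$property'")
        }
    }
    return issues
}

fun main() {
    // Spec as it might appear in the implementation guide (illustrative).
    val spec = EventSpec("Article Viewed", setOf("Article ID", "Category", "Premium User"))

    // Payload captured from a development build of the app.
    val captured = mapOf<String, Any?>("Article ID" to "a-123", "Category" to "")

    // Prints the gaps a tester would raise with the app development vendor.
    validateEvent(spec, captured).forEach(::println)
}
```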
Post Validation Value Additions: After validation and fixes, we created various cohorts, pushed notifications to them, and validated the results. Based on the client’s requirements, we identified the most suitable visualization tool and released dashboards.
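Cohorts and targeted notifications depend on user profiles being populated with the right properties. The sketch below, again assuming the Mixpanel Android SDK, shows one way profile properties might be set so that cohorts can be built on them in the Mixpanel UI; the user ID and property names are illustrative.

```kotlin
import com.mixpanel.android.mpmetrics.MixpanelAPI

// Illustrative only: associates in-app behaviour with a user profile so
// that cohorts (and the notifications pushed to them) have properties
// to filter on.
fun onUserLoggedIn(mixpanel: MixpanelAPI, userId: String, favoriteCategory: String) {
    // Attribute subsequent events to this user. (Very old SDK versions
    // also required a separate people.identify call.)
    mixpanel.identify(userId)

    // Profile properties that a cohort such as "highly engaged sports
    // readers" could be defined against.
    mixpanel.people.set("Favorite Category", favoriteCategory)
    mixpanel.people.increment("Sessions", 1.0)
}
```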