In the Conversion Rate Optimization (CRO) domain, quantitative data helps us answer the question: What are users doing on my website? However, quantitative data does not answer: What are users looking for? In one such scenario with our client, Fullstack Academy, we could determine what users were doing on the page, but couldn't figure out what they were looking for.
Exploratory Data Analysis (EDA) was performed to understand a potential set of users who frequented the site. Once the user set was identified, we focused on their behavior to see what problems they faced.
Based on our EDA, we found that this set of users was accessing the site using mobile devices. Their behavior was split into three categories (a rough classification sketch follows the list):
Bounce Traffic
Engaged & Exited
Engaged & Navigated
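To make the categorization concrete, here is a minimal sketch of how sessions could be bucketed this way. It assumes a clickstream export with hypothetical column names such as device_category, pages_viewed, and max_scroll_pct; it is an illustration, not our production pipeline.

```python
import pandas as pd

# Toy clickstream sessions; column names are hypothetical.
sessions = pd.DataFrame({
    "session_id":      [1, 2, 3, 4, 5],
    "device_category": ["mobile", "mobile", "mobile", "desktop", "mobile"],
    "pages_viewed":    [1, 1, 3, 2, 1],
    "max_scroll_pct":  [5, 95, 80, 60, 100],
})

# Keep only the mobile segment identified during EDA.
mobile = sessions[sessions["device_category"] == "mobile"].copy()

def categorize(row):
    if row["pages_viewed"] == 1 and row["max_scroll_pct"] < 25:
        return "Bounce Traffic"       # landed and left without engaging
    if row["pages_viewed"] == 1:
        return "Engaged & Exited"     # consumed the content, then left the site
    return "Engaged & Navigated"      # consumed the content and moved deeper

mobile["behavior"] = mobile.apply(categorize, axis=1)
print(mobile["behavior"].value_counts(normalize=True))
```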
Based on behavioral analysis, we discovered that the bounce rate of these users was around 83%. One in four users scrolled down the entire page and consumed the content, and yet they exited. Overall, traffic to this page consumed the content, but the drop-off from this page was higher than from other pages, which resulted in fewer conversions.
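The headline numbers are simple aggregations over those behavior labels. A rough sketch with hypothetical toy data follows; on the real dataset these aggregations produced the ~83% bounce rate and the one-in-four full-scroll exits quoted above.

```python
import pandas as pd

# Toy labeled sessions for the mobile segment (hypothetical values).
mobile = pd.DataFrame({
    "behavior":       ["Bounce Traffic", "Bounce Traffic", "Engaged & Exited",
                       "Engaged & Navigated", "Bounce Traffic"],
    "max_scroll_pct": [10, 20, 100, 90, 15],
})

bounce_rate = (mobile["behavior"] == "Bounce Traffic").mean()
full_scroll_exits = ((mobile["behavior"] == "Engaged & Exited") &
                     (mobile["max_scroll_pct"] >= 95)).mean()

print(f"Bounce rate: {bounce_rate:.0%}")
print(f"Scrolled the full page but exited: {full_scroll_exits:.0%}")
```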
For the problem mentioned above, we framed a hypothesis as follows:
Hypothesis 1
The traffic to the website was of low quality (junk traffic).
Hypothesis 2
The traffic that bounced or exited the page did not find the content that they were looking for.
Hypothesis 1 was invalidated by analyzing the channel data and the marketing strategy. It was determined that the traffic was not junk, so we began to validate the second hypothesis.
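The channel check behind this invalidation can be approximated as a per-channel bounce-rate comparison. A sketch, assuming hypothetical channel and bounced columns in the sessions export:

```python
import pandas as pd

# Toy sessions tagged with acquisition channel (hypothetical values).
sessions = pd.DataFrame({
    "channel": ["organic", "paid_search", "direct", "organic", "paid_search",
                "referral", "organic", "direct"],
    "bounced": [True, True, False, True, True, False, True, True],
})

by_channel = sessions.groupby("channel")["bounced"].agg(["mean", "size"])
by_channel.columns = ["bounce_rate", "sessions"]
print(by_channel.sort_values("bounce_rate", ascending=False))

# If high bounce rates persist across well-targeted channels rather than
# clustering in one suspect source, the "junk traffic" explanation weakens.
```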
The second hypothesis was more concerned with what content was missing from the page. Using the clickstream data, we were able to understand what was happening on the website, but we couldn't identify exactly what users were looking for within the page.
At this point, data concerning what the users were looking for was missing. So, we ran a pop-up survey poll on the website to better understand what users were looking for. With that data, we could recommend the content they wanted, run a test to confirm, and deploy the better, more relevant content.
The survey poll was set up using the Lucky Orange tool. The poll information is listed in the framework below:
The pop-up survey ran for about four weeks, and the users' answers were collected. Forty-two percent of the users who participated in the poll indicated that they did not find what they were looking for and answered the follow-up question about what they were searching for.
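Tallying the poll responses is a simple aggregation. A sketch, assuming a hypothetical export with one row per respondent and hypothetical column names:

```python
import pandas as pd

# Toy poll export (hypothetical structure and values).
responses = pd.DataFrame({
    "found_what_you_were_looking_for": ["No", "Yes", "No", "Yes", "No", "Yes", "Yes"],
    "free_text": ["course syllabus?", None, "detailed curriculum", None,
                  "what topics are covered", None, None],
})

share_not_found = (responses["found_what_you_were_looking_for"] == "No").mean()
print(f"Did not find what they were looking for: {share_not_found:.0%}")

# Keep only the free-text answers from users who said "No" for theme analysis.
follow_ups = responses.loc[
    responses["found_what_you_were_looking_for"] == "No", "free_text"
].dropna()
print(follow_ups.tolist())
```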
The user comments were processed and analyzed, and we noticed that users were looking for a detailed curriculum section and had certain questions related to it.
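Our processing was a manual review and grouping of the comments; a simple keyword count is an illustrative stand-in that surfaces the same theme:

```python
from collections import Counter
import re

# Toy free-text answers (hypothetical examples).
comments = [
    "Is there a detailed curriculum for the bootcamp?",
    "What topics does the curriculum cover each week?",
    "Looking for the course syllabus and curriculum details",
]

stopwords = {"is", "there", "a", "the", "for", "does", "each", "and", "what"}
tokens = [
    word
    for comment in comments
    for word in re.findall(r"[a-z]+", comment.lower())
    if word not in stopwords
]

# "curriculum" surfaces as the dominant theme in this toy sample.
print(Counter(tokens).most_common(5))
```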
Recommendation: Providing key details for each topic in the curriculum section would give users the relevant, timely information they came to the site for in the first place.
To validate this recommendation, we followed the workflow below (a sketch of the A/B evaluation follows it):
Survey Result (42% of users did not find what they were looking for) → A/B Testing → A/B Result
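For the A/B Result step, one standard way to read the outcome is a two-proportion z-test on conversion rates for the control page versus the variant with the expanded curriculum details. The counts below are hypothetical and are shown only to illustrate the calculation:

```python
from math import sqrt
from statistics import NormalDist

# Hypothetical conversions and sessions for control (original page) and variant.
control_conv, control_n = 120, 4000
variant_conv, variant_n = 168, 4000

p1, p2 = control_conv / control_n, variant_conv / variant_n
pooled = (control_conv + variant_conv) / (control_n + variant_n)
se = sqrt(pooled * (1 - pooled) * (1 / control_n + 1 / variant_n))
z = (p2 - p1) / se
p_value = 2 * (1 - NormalDist().cdf(abs(z)))   # two-sided test

print(f"Control: {p1:.2%}, Variant: {p2:.2%}, z = {z:.2f}, p = {p_value:.4f}")
# A small p-value would support shipping the curriculum-detail variant.
```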