A customer success story

Fullstack Academy Conversion Rate Optimization

Introduction

In Conversion Rate Optimization (CRO), quantitative data helps us answer the question: What are users doing on my website? However, quantitative data does not answer: What are users looking for? In one such scenario with our client, Fullstack Academy, we could determine what users were doing on the page, but couldn't figure out what they were looking for.

Exploratory Data Analysis (EDA)

EDA was performed to understand a potential set of users who frequented the site. Once the user set was identified, we focused on their behavior to see what problems they faced.

Based on our EDA, we found that this set of users was accessing the site using mobile devices. Their behavior was split into three categories:

  • Bounce Traffic
  • Engaged & Exited
  • Engaged & Navigated

Problems Identified

Based on behavioral analysis, we discovered that the bounce rate of these users was around 83%. One in four users scrolled down the entire page and consumed the content, yet still exited. In other words, the page's traffic was consuming the content, but the drop-off from this page was high, resulting in fewer conversions.
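The exact segmentation depends on the analytics export, but a minimal sketch of how sessions might be bucketed into the three categories above, and how the headline rates fall out, could look like the following. The DataFrame columns (`pages_viewed`, `engaged`, `scrolled_full_page`) and the rows themselves are hypothetical stand-ins, not the schema of any particular clickstream tool:

```python
import pandas as pd

# Hypothetical clickstream export: one row per mobile session.
# Column names and values are illustrative only.
sessions = pd.DataFrame({
    "session_id":         [1, 2, 3, 4, 5, 6],
    "pages_viewed":       [1, 1, 1, 3, 1, 2],
    "engaged":            [False, True, True, True, False, True],
    "scrolled_full_page": [False, True, False, True, False, False],
})

def classify(row):
    if row["pages_viewed"] == 1 and not row["engaged"]:
        return "Bounce Traffic"    # left the landing page with no interaction
    if row["pages_viewed"] == 1:
        return "Engaged & Exited"  # interacted, but left from the same page
    return "Engaged & Navigated"   # moved deeper into the site

sessions["category"] = sessions.apply(classify, axis=1)
print(sessions["category"].value_counts(normalize=True))

# Share of sessions that consumed the full page yet still left it.
full_scroll = sessions[sessions["scrolled_full_page"]]
exited = full_scroll[full_scroll["pages_viewed"] == 1]
print(f"Scrolled full page but exited: {len(exited) / len(sessions):.0%}")
```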

Hypothesis Generation

For the problem mentioned above, we framed two hypotheses as follows:

Hypothesis 1

The traffic to the website was of low quality (junk traffic).

Hypothesis 2

The users who bounced or exited the page did not find the content they were looking for.

Hypothesis 1 was invalidated by analyzing the channel data and the marketing strategy. It was determined that the traffic was not junk, so we began to validate the second hypothesis.
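One straightforward way to check the junk-traffic hypothesis is to break bounce rate down by acquisition channel, since low-quality traffic usually concentrates in a few sources. A minimal sketch of that check, with made-up channel names and bounce flags:

```python
import pandas as pd

# Hypothetical per-session data with the acquisition channel attached.
sessions = pd.DataFrame({
    "channel": ["organic", "paid", "social", "organic", "paid",
                "email", "social", "organic", "paid", "email"],
    "bounced": [True, True, False, True, True,
                False, True, True, True, False],
})

# If bounce rates are uniformly high across reputable channels,
# "junk traffic" is an unlikely explanation for the problem.
by_channel = sessions.groupby("channel")["bounced"].agg(["mean", "count"])
by_channel = by_channel.rename(columns={"mean": "bounce_rate", "count": "sessions"})
print(by_channel.sort_values("bounce_rate", ascending=False))
```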

Challenges Faced

The second hypothesis concerned what content was missing from the page. Using the clickstream data, we were able to understand what was happening on the website, but we couldn't identify exactly what users were looking for within the page.

Solution Identified

At this point, data about what users were looking for was missing. So, we ran a pop-up survey poll on the website to better understand what they wanted. We could then recommend the content they were after, run a test to confirm, and deploy the better, more relevant content.


Collection of Data

The survey poll was set up using the Lucky Orange tool. The poll information is listed in the framework below:

[Framework: poll setup details]

The pop-up survey ran for about four weeks, and the users' answers were collected. Forty-two percent of the users who participated in the poll indicated that they did not find what they were looking for, and went on to answer the follow-up question about what they were searching for.
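Tallying the closed-ended poll answers is a simple aggregation; a sketch of the kind of computation involved, with invented responses chosen to mirror the reported share:

```python
from collections import Counter

# Invented answers to "Did you find what you were looking for?"
responses = ["no", "yes", "no", "yes", "yes", "no", "yes", "no",
             "yes", "yes", "no", "yes"]

counts = Counter(responses)
share_no = counts["no"] / len(responses)
print(f"Did not find what they were looking for: {share_no:.0%}")  # ~42%
```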

Test Recommendation

The user comments were processed and analyzed, and we noticed that users were looking for a detailed curriculum section and had specific questions related to it.
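Free-text survey answers can be mined for recurring themes with a simple keyword count. A minimal sketch of that processing step; the comments below are invented examples of the kind of feedback received, not actual survey responses:

```python
from collections import Counter
import re

# Invented examples of free-text survey answers.
comments = [
    "Looking for a detailed curriculum for the web dev course",
    "What topics does the curriculum cover week by week?",
    "Could not find the full curriculum or prerequisites",
    "Wanted tuition info and the curriculum schedule",
]

STOPWORDS = {"a", "the", "for", "and", "or", "by", "does", "what", "not",
             "could", "find", "looking", "wanted", "info"}

# Count remaining words across all comments to surface dominant themes.
words = Counter(
    w for c in comments
    for w in re.findall(r"[a-z]+", c.lower())
    if w not in STOPWORDS
)
print(words.most_common(5))  # "curriculum" dominates in this toy sample
```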

Recommendation: Providing key details for each topic in the curriculum section would give users the relevant, timely information they came to the site for in the first place.

Outcome

The outcome flowed from the survey result through A/B testing to the final result:

Survey Result (42% of users did not find what they were looking for) → A/B Testing → A/B Result
  • Wireframes and designs of the test were created, and an A/B test was conducted to measure the effectiveness of the variation over the original.
  • The variation was a clear winner: page engagement increased by 25%, and the conversion rate of the variation increased by around 30% compared to the original (a significance check is sketched below).
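The case study does not spell out how significance was assessed, but a standard way to confirm that a conversion-rate lift like this is real rather than noise is a two-proportion z-test. A sketch with made-up visitor and conversion counts; the real test would use the experiment's actual numbers:

```python
from math import sqrt
from statistics import NormalDist

# Made-up counts chosen to illustrate a ~30% relative lift.
conv_a, n_a = 200, 10_000   # original: 2.0% conversion
conv_b, n_b = 260, 10_000   # variation: 2.6% conversion

p_a, p_b = conv_a / n_a, conv_b / n_b
p_pool = (conv_a + conv_b) / (n_a + n_b)
se = sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))

z = (p_b - p_a) / se
p_value = 2 * (1 - NormalDist().cdf(abs(z)))
print(f"lift: {(p_b / p_a - 1):.0%}, z = {z:.2f}, p = {p_value:.4f}")
```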

Want to learn more? Let's Talk.