How to audit your website for exceptional data quality using website analytics tools

Efficient audit practices ensure exceptional data quality and accuracy.

It all begins simply enough. You want to gather more actionable data through website monitoring to make smarter decisions. So, you accelerate your data collection by adding complex configurations to your website analytics tools.

  • But are you capturing data of high enough quality to correlate the conversions on your website with your marketing activities?
  • Do you often find website analytics tags disappearing from your website, or no data being captured at all, when you open website analytics tools like AdWords, Google Analytics, etc.?

If yes, then what you need is a robust tagging infrastructure for the digital success of your firm. All the website analytics tools above use tags (snippets of JavaScript code) to record analytics data and track user behavior and information. Running an internal audit of these tags is a recurring process that ensures you have accurate, consistent, quality data.

Your organisation needs to follow certain audit best practices to ensure quality data and avoid the mistakes that lead to data discrepancies.

So, if you want to be confident about your data and spend less time on quality checks and re-tagging, we suggest you read till the end.

For your convenience, we have categorized the best practices for an internal audit during website monitoring in the following sections:

  • Setting up Tag Structure.
  • Tag Validation.
  • Evaluation and Audit process.

Best Practices to be followed in Setting up Tag Structure

1. Efficient Tag Structure

Setting global standards for your tags brings uniformity and ease of use in reports. A good way to achieve this is by using the ‘3 Con’ and ‘2 Cle’ structure.


2. Maintain Character Limit

We often create tags whose dimensions include more characters than required, as shown in the screenshot.

In doing so, the tag becomes too long to understand, and all such tags end up in the reporting tool, making it difficult for analysts to interpret them. Maintain character limits not only from a technical point of view but also to simplify things for the end user.
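As a sketch of how a character limit could be enforced at tagging time, the hypothetical helper below truncates a dimension value before it is sent. The function name and the 100-character default are assumptions for illustration; use whatever limit your analytics tool actually enforces.

```javascript
// Hypothetical helper: enforce a character limit on a tag dimension value.
// The 100-character default is illustrative; adjust it to match the
// variable limit of your analytics tool.
function truncateDimension(value, maxLength = 100) {
  if (value.length <= maxLength) return value;
  // Truncate and append an ellipsis so audits can spot over-long values.
  return value.slice(0, maxLength - 1) + '…';
}
```

Running every dimension value through a helper like this keeps over-long strings out of the reporting tool instead of relying on each analyst to write short values by hand.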


3. Assign friendly names to pages

If you don’t assign friendly names to pages, the website analytics tool automatically picks up the URL. This can cause various issues, such as:

  • URLs are often long and can exceed the variable character limit (normally 100 characters)
  • End users might not be comfortable viewing metrics reported against raw URLs
  • Query string parameters can get stripped off, causing the tool to treat multiple distinct pages as a single page and impacting page pathing.
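One minimal way to assign friendly names, assuming a simple lookup table, is sketched below. The paths and names are made up for illustration; in a real implementation the mapping might instead come from a data layer maintained by your developers.

```javascript
// Illustrative mapping from raw URL paths to friendly page names.
// The paths and names here are hypothetical examples.
const FRIENDLY_NAMES = {
  '/products/widgets': 'Products: Widgets',
  '/checkout/payment': 'Checkout: Payment',
};

function friendlyPageName(url) {
  // Strip the query string so '/checkout/payment?step=2' and
  // '/checkout/payment?step=3' resolve to the same page name.
  const path = url.split('?')[0];
  // Fall back to the bare path when no friendly name is defined,
  // which is still shorter and cleaner than the full URL.
  return FRIENDLY_NAMES[path] || path;
}
```

Because the query string is stripped before the lookup, variants of the same page report under one name, which addresses the page-pathing issue in the list above.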

4. Use Tag Management Solutions (TMS)

Tagging in the website source code affects page performance and lengthens IT development cycles. It is recommended to migrate to a Tag Management Solution (TMS) so that tags are triggered asynchronously without affecting page load time. Additionally, a TMS provides another layer of abstraction to manage the complexity of a large website analytics implementation. During validation, identify the tags that fire from the website source code and work towards migrating them to the TMS.
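The asynchronous loading that a TMS relies on follows a common pattern, sketched generically below. This is not any specific vendor's snippet, and the container URL is a placeholder; each TMS supplies its own loader code.

```javascript
// Generic sketch of the async-loader pattern most TMS snippets follow.
// The container URL passed in is a placeholder, not a real endpoint.
function loadTagContainer(doc, containerUrl) {
  const script = doc.createElement('script');
  script.async = true;       // fetch without blocking page rendering
  script.src = containerUrl;
  doc.head.appendChild(script);
  return script;
}
```

Because the script is marked `async`, the browser fetches the tag container in parallel with page rendering, which is why moving tags out of the page source and into a TMS avoids the page-load penalty.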

Best Practices to be followed in Tag Validation

1. Use Debuggers

Browser-based debuggers (developer tools) are used to view website analytics requests (tags). However, web debugging proxy tools such as Charles and Fiddler provide more options, as they can be used to debug tricky problems that go undetected when browser debuggers are used.

2. Filter Keywords

Each website analytics server call or marketing pixel can be viewed by using the appropriate filter keyword in the website analytics tag debugger. Using the wrong keyword results in viewing either no tags or unwanted ones.

Also, mobile app tags are sometimes sent in batches. In those cases, the tag can be filtered using “/batch” as the keyword.
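The effect of a filter keyword can be illustrated with the toy function below, which mimics what a debugger's filter box does over a list of captured request URLs. The URLs in the usage example are invented; only the substring match is the point.

```javascript
// Toy version of a debugger's filter box: keep only the captured
// request URLs that contain the given keyword.
function filterRequests(urls, keyword) {
  return urls.filter((url) => url.includes(keyword));
}
```

Filtering the same capture with `'/batch'` would instead surface batched mobile app hits, as noted above, so the keyword you type determines which tags you see at all.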

3. Be cautious about duplicate tags

Every CTA link on a page should be uniquely tagged so that it is easy to track. This can be achieved by using unique keywords in the tag, such as the product ID or description, which helps prevent a repetition of characters in the tag.

For example, the “Clicks” data for both the interactions above is collected in the same variable, c54. As shown in the image, both tag descriptions carry the same product ID (12345). This would lead to data being collected twice for one product, and not at all for the other.
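A duplicate check of this kind is easy to automate during an audit. The sketch below, with invented tag strings, flags any tag value that appears more than once in a capture, which is how a collision like the duplicated product ID above would surface.

```javascript
// Audit sketch: flag tag values that appear more than once in a
// captured list. The tag strings used in the test are illustrative.
function findDuplicateTags(tags) {
  const seen = new Set();
  const duplicates = new Set();
  for (const tag of tags) {
    if (seen.has(tag)) duplicates.add(tag); // second sighting = duplicate
    seen.add(tag);
  }
  return [...duplicates];
}
```

An empty result means every CTA on the page fired a unique tag; any entry in the result points at two interactions whose data would be merged in reporting.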


4. Be wary of unwanted tags

During an audit, keep an eye on all the tags that fire. There may be cases where a tag intended for some other interaction also fires in the current scenario. For example, when you view a form, a “video is viewed” tag should not fire, and vice versa.

5. Have an appropriate Tag Description

A tag should contain the correct and complete description of the interaction it has been associated with.

In the example below, if there is a ‘shop now’ interaction on the page, the tag should contain the words ‘shop now’ and not ‘get coupon’.


6. Check for HTTP status codes

When the browser sends an HTTP request to the server, the server sends back an HTTP response with a status code such as 200, 404, or 302. Generally, a successful response is denoted by 200, which means the website analytics data is entering the respective website analytics tool. Even a tag with a 302 (redirect) response reaches the analytics tool. However, when status codes are in the 4xx range (400, 404, etc.), there is a client-side error that needs to be investigated further.
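The classification above can be captured in a small helper for use in automated checks. This is a sketch under the rules just stated, not an exhaustive treatment of HTTP status codes; the function name is invented.

```javascript
// Classify an analytics request's HTTP status code per the rules above:
// 200 (OK) and 302 (redirect) mean the hit reached the analytics tool,
// while 4xx codes indicate a client-side error to investigate.
function classifyTagStatus(status) {
  if (status === 200 || status === 302) return 'collected';
  if (status >= 400 && status < 500) return 'client-error';
  return 'investigate'; // anything else warrants a closer look
}
```

Running each captured tag request through a check like this turns a manual scan of the debugger's status column into a pass/fail list.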

7. Validate across different browsers & mobile responsive modes.

It is best to validate your website analytics tags in at least two different browsers to ensure data quality. Also, if the website is responsive, it is recommended to validate using mobile or tablet device browsers. In the absence of a mobile or tablet device, validate using the browser's mobile responsive mode. This is required because HTML/CSS selectors can vary across modes, resulting in tags either not firing or misfiring.

8. Check Analytics Real-time reports

In many cases, tags accidentally send data to a different reporting profile, or the data is not as expected once it reaches the reporting tool. Check the real-time reports on a regular basis as part of tag validation to catch this issue and ensure that quality data is collected correctly.

Best Practices to be followed in Evaluation and Internal Audit Process

1. Audit Process across environments

It is very important to perform at least one round of validation across all possible environments, such as Development, QA/UAT, and Production, to ensure that all updates or changes are audited before traffic starts kicking in. This is required because the environments may not behave in the same manner, due to timing issues in the loading of various libraries, for example.

2. Live Data Check

The audit process needs to be followed up with a live data check to ensure that quality data is being passed on to the reporting tool. Large firms might outsource this internal audit process to an external firm with the relevant expertise. Once the page is live, it is also worth adding a geographical dimension to see where clicks are being generated from.

3. Handling Impact of Website/Tag Changes

All information regarding updates to the website structure needs to be passed on to the audit team to ensure that website analytics tags are not broken. Likewise, any changes to the tag structure need to be communicated to the audit team to ensure tags keep working as expected.

4. Dedicated Audit Team

One of the most important aspects of an audit is to have a team that takes ownership of their duty to ensure quality data and accuracy during website monitoring. This will help in preventing data loss and increase the efficiency of all other processes.

If you have reached this point, you are well on your way to quality data. But if you still have doubts, we are here. Contact us

Drive better results by understanding customer data