According to research by Gallup on behavioral economics, organizations that leverage customer behavioral insights outperform their peers by 85 percent in sales growth and by more than 25 percent in gross margin. Your company is now competing on customer experience, yet you are capturing only a handful of feedback responses from a few sources and tracking maybe 10 out of 2,500 interactions.
Businesses generate a lot of data about their customers, especially if they have an online presence. If you use a variety of digital systems and tools to track this data, you can use analytics to comb through, combine, and analyze it in one place.
For example, suppose you are an e-commerce company and your analytics tool captures data on a user’s entire journey up to the point an order is placed online. If the same user also visited the physical store and made a purchase, that data would be stored only in the CRM database. To view and analyze the user’s purchase journey across offline and online stores, you need the unified data presented as a single report. You know you need the right customer data to discover customer journeys, understand customer behavior, and provide your customers with a better experience. But standing in your way is a growing number of data sources, which brings challenges like data quality problems, data governance issues, data ownership barriers, customer identity matching, and data schema incompatibilities, as well as the duplication of data across sources.
When a user visits or calls the store and provides more information, such as an email ID or phone number, the userID and other data attributes get stored in the CRM database. You can then export the CRM data into an Excel sheet and import it into the analytics tool with the userID as the unique identifier. This allows you to analyze the combined data in the tool itself.
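Outside any particular analytics tool, the same join can be prototyped locally. The sketch below merges a hypothetical CRM export with a hypothetical analytics export on a shared userID column using pandas; the file names and column names are illustrative assumptions, not part of any specific product.

```python
import pandas as pd

# Hypothetical exports: file and column names are placeholders.
crm = pd.read_csv("crm_export.csv")            # e.g. userID, email, store_purchase_amount
online = pd.read_csv("analytics_export.csv")   # e.g. userID, sessions, online_revenue

# Join offline (CRM) and online (analytics) records on the shared identifier.
combined = crm.merge(online, on="userID", how="outer")

# One row per user, with offline and online attributes side by side.
combined.to_csv("combined_customer_view.csv", index=False)
```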
Importing external data into an analytics tool works differently in different tools. In Google Analytics, it is accomplished via Data Import, whereas in Adobe Analytics, either Data Sources or SAINT classifications will do the job.
Data Import in Google Analytics lets you combine the data generated by your offline business systems with the online data collected by a Website/Mobile Analytics tool.
For instance, consider that you have an online education website, and there is an option for users to upgrade or enroll in a program using a call-in method. The goal now is to analyze these two streams of data at a user level.
The steps to be followed for data import are:
The data to be uploaded to Google Analytics can be stored in custom dimensions. For this, create the custom dimensions before uploading the data, with one custom dimension for each field you are uploading.
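If you prefer to script this step, custom dimensions can also be created through the Google Analytics Management API rather than in the Admin interface. The sketch below assumes a service account with edit access to the property; the key file, account and property IDs, and dimension names are placeholders.

```python
from google.oauth2 import service_account
from googleapiclient.discovery import build

# Assumed placeholders: a service-account key with the analytics.edit scope
# and your own account / property IDs.
credentials = service_account.Credentials.from_service_account_file(
    "service-account-key.json",
    scopes=["https://www.googleapis.com/auth/analytics.edit"],
)
service = build("analytics", "v3", credentials=credentials)

# Create one custom dimension per field that will be uploaded (e.g. course and city).
for name in ["Course", "City"]:
    service.management().customDimensions().insert(
        accountId="12345678",
        webPropertyId="UA-12345678-1",
        body={"name": name, "scope": "USER", "active": True},
    ).execute()
```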
A “data set” is a container that holds the data you upload to Analytics. The Data Set Schema defines the structure that joins the data you upload with the existing Analytics data. You manage your uploaded data files using a data set. Data sets are associated with views, giving you flexibility over which data you import into your reports.
The schema has two important parts:
“Key” is the dimension on which the data you are uploading is joined with the data that already exists in Google Analytics.
“Imported data” is the set of dimensions (or metrics) the upload adds to Google Analytics, such as the metadata fields described in the next step.
To upload the data collected from the CRM and from users who called the call center and subsequently enrolled, create a Data Set. The Data Set Schema would define a key, using email ID (for instance) as a dimension, and import dimensions for metadata such as courses, city, revenue, etc. You can then upload the information as needed to the relevant Data Set.
Follow these steps to define the schema and configure options that affect how imported data is connected to your property.
Use the downloaded template to fill out the details.
There are two ways to upload data: manually, using the Analytics web interface, and programmatically, via the Management API’s ‘Uploads’ resource.
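A minimal sketch of the programmatic route, using the google-api-python-client library, is shown below. The key file, account, property, and custom data source IDs are placeholders, and the ga:dimension column names are assumptions that stand in for whatever internal names your own Data Set Schema defines.

```python
from google.oauth2 import service_account
from googleapiclient.discovery import build
from googleapiclient.http import MediaFileUpload

# Placeholders: replace with your own IDs and key file.
ACCOUNT_ID = "12345678"
WEB_PROPERTY_ID = "UA-12345678-1"
CUSTOM_DATA_SOURCE_ID = "abcdEFGHijkl"   # the ID of the Data Set you created

credentials = service_account.Credentials.from_service_account_file(
    "service-account-key.json",
    scopes=["https://www.googleapis.com/auth/analytics.edit"],
)
service = build("analytics", "v3", credentials=credentials)

# A filled-out template: the header row uses the internal names from the schema
# (here ga:dimension1 is the email-ID key, dimension2/3 hold course and city).
with open("crm_upload.csv", "w") as f:
    f.write(
        "ga:dimension1,ga:dimension2,ga:dimension3\n"
        "alice@example.com,Data Science,Mumbai\n"
        "bob@example.com,Digital Marketing,Delhi\n"
    )

# Upload the file to the Data Set via the Management API 'Uploads' resource.
service.management().uploads().uploadData(
    accountId=ACCOUNT_ID,
    webPropertyId=WEB_PROPERTY_ID,
    customDataSourceId=CUSTOM_DATA_SOURCE_ID,
    media_body=MediaFileUpload("crm_upload.csv", mimetype="application/octet-stream"),
).execute()
```

In practice, the header names and IDs come from the Data Set you created in the previous steps; the values shown here only illustrate the shape of the call.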
Data uploaded using the custom import feature can be included in Custom Reports.
Data Sources in Adobe Analytics lets you manually import additional online or offline data for reporting.
Two methods are available to submit data:
You can create and manage FTP-based data sources through the marketing reports interface; this method uses FTP file transfer to import data files into Data Sources in Adobe Analytics. After you create a data source, Adobe provides an FTP location that you can use to upload Data Sources files. Once uploaded, the files are automatically located and processed by Data Sources; once processed, the data is available in marketing reports.
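As a rough sketch of the upload step, the snippet below uses Python’s standard ftplib to push a tab-delimited data file and then an empty .fin file with the same base name, which signals to Data Sources that the file is complete and ready for processing. The host, credentials, and file names are placeholders for the values Adobe provides when you create the data source.

```python
from ftplib import FTP

# Placeholder connection details supplied by Adobe when the data source is created.
HOST = "ftp.example.com"
USER = "your_ftp_username"
PASSWORD = "your_ftp_password"

with FTP(HOST) as ftp:
    ftp.login(USER, PASSWORD)

    # Upload the tab-delimited data file first...
    with open("offline_orders.txt", "rb") as data_file:
        ftp.storbinary("STOR offline_orders.txt", data_file)

    # ...then an empty .fin file with the same base name, which tells
    # Data Sources the upload is finished and can be processed.
    with open("offline_orders.fin", "wb"):
        pass
    with open("offline_orders.fin", "rb") as fin_file:
        ftp.storbinary("STOR offline_orders.fin", fin_file)
```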
Adobe also offers a Data Sources API that lets you link your applications to Data Sources programmatically. This eliminates the need for an intermediary FTP server and transfers data over HTTP using SOAP and REST.
Please note:
Full Processing Data Sources: Use the ISO 8601 date format of YYYY-MM-DDThh:mm:ss±UTC_offset (for example, 2013-09-01T12:00:00-07:00), or Unix Time Format (the number of seconds elapsed since January 1, 1970).
Standard and Integration Data Sources: Use the following date format: MM/DD/YYYY/HH/mm/SS (for example, 01/01/2013/06/00/00)
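If you generate these timestamps in code rather than by hand, a small sketch like the following (using Python’s standard datetime module) produces all three formats; the UTC offset in the output is simply whatever the local machine reports.

```python
from datetime import datetime, timezone

now = datetime.now(timezone.utc).astimezone()  # local time with UTC offset

# Full Processing Data Sources: ISO 8601 with UTC offset, e.g. 2013-09-01T12:00:00-07:00
iso_8601 = now.isoformat(timespec="seconds")

# Alternative for Full Processing: Unix Time, seconds since January 1, 1970
unix_time = int(now.timestamp())

# Standard and Integration Data Sources: MM/DD/YYYY/HH/mm/SS, e.g. 01/01/2013/06/00/00
standard = now.strftime("%m/%d/%Y/%H/%M/%S")

print(iso_8601, unix_time, standard)
```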
Create, manage, and view the use of data sources in a report suite (Analytics > Admin > Data Sources).
The Create tab lets you configure a new data source for the currently selected report suite. When you activate a data source, the Data Sources wizard guides you through the process of creating a Data Sources template and creates an FTP location for uploading data. The selection you make on the Create tab determines the initial fields in the template.
Drive better results by understanding customer data