Traditionally, Business Intelligence (BI) tools are designed for Extract, Transform and Load (ETL) processes, data warehousing, and data visualization. Most of these cloud-based BI tools are built around labor-intensive insight-delivery operations, and their computational power can be boosted by pairing them with cloud compute engines.
Their architecture is focused on scalable data storage and easy visualization deployment, with insights and analysis produced manually. This often means the platform compromises on the computation power available.
Marketers face two key challenges due to this lack of computation power:
Modern-day decisions are made with data and insights displayed in smart, contextualized visuals driven by AI/ML. This calls for complex, scaled analytic computations embedded inside the data transformation and visualization operations, which in turn demands a high-compute environment. Such an environment relies on complex, API-based integration between the BI technology and cloud-based compute technology.
The most popular method of extension involves real-time integration between the BI tool and the cloud-based compute engine: for example, a real-time integration between Datorama and Amazon Web Services (AWS) through an API Gateway.
Here, Datorama acts as the master, sending the compute request and relevant data to AWS. As the slave, AWS processes the request and sends the output back to the original location. The master-slave exchange is conducted in real-time, or near real-time. This architecture is the most common because it allows for easy extension from the existing BI platform to augment intelligence with ML-based insights.
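The request/response exchange above can be sketched in Python. The endpoint URL, payload fields, and response shape below are illustrative assumptions, not the actual Datorama-to-AWS contract, which depends on how the API Gateway is configured:

```python
import json

# Hypothetical API Gateway endpoint; a real deployment would use the
# URL generated for the configured stage and resource.
API_URL = "https://example.execute-api.us-east-1.amazonaws.com/prod/score"

def build_compute_request(rows, metric):
    """Package campaign rows into a JSON body for the compute engine.

    The BI side (Datorama) sends this request; the compute side (AWS)
    runs the model and returns its output to the same caller.
    """
    return json.dumps({"metric": metric, "rows": rows})

def parse_compute_response(body):
    """Unpack the engine's JSON reply into (row_id, score) pairs."""
    payload = json.loads(body)
    return [(r["id"], r["score"]) for r in payload["results"]]
```

In practice the request would be POSTed to `API_URL` over HTTPS and the reply parsed on arrival, keeping the round trip within the near-real-time window the exchange requires.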
Another frequently used architecture has the BI tool share some of the data warehousing and ETL with the cloud compute engine. Here, raw data lands in the compute environment and is analyzed there; only the insights and the aggregated analytical data set are sent over to the BI layer, where they are merged into the overall data warehouse.
This architecture is typically deployed when the BI layer would otherwise need to handle a lot of data. For example, web logs or social media conversations are assessed and calculations are run in the compute environment; only after these calculations are complete are the insights and aggregates brought into the BI layer for visualization.
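A minimal sketch of that aggregation step, assuming a simplified web-log record shape (campaign, date, impressions, clicks). Only the compact output of a function like this would cross into the BI layer; the raw event rows stay in the compute environment:

```python
from collections import defaultdict

def aggregate_web_logs(events):
    """Collapse raw web-log events into per-campaign, per-day aggregates.

    Each event is a dict with "campaign", "date", "impressions", and
    "clicks" keys (an assumed schema for illustration).
    """
    totals = defaultdict(lambda: {"impressions": 0, "clicks": 0})
    for e in events:
        key = (e["campaign"], e["date"])
        totals[key]["impressions"] += e["impressions"]
        totals[key]["clicks"] += e["clicks"]
    # Emit one compact row per campaign-day, with a derived CTR metric.
    return [
        {"campaign": c, "date": d,
         "impressions": t["impressions"], "clicks": t["clicks"],
         "ctr": t["clicks"] / t["impressions"] if t["impressions"] else 0.0}
        for (c, d), t in totals.items()
    ]
```

The design point is that aggregation happens where the compute is cheap and scalable, so the BI layer only ever stores and visualizes the small, derived data set.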
A third architecture reduces the BI tool to, in effect, a visualization layer. In this instance, all data warehousing, ETL, and analytical workloads sit within the cloud compute environment. This is useful when the BI tool does not need to be highly interactive and the data and analytical workloads are refreshed on a consistent schedule.
Such an architecture removes constraints on a solution's scalability in terms of storage and compute. However, it can compromise the BI layer's adaptiveness to end-user requests, because that layer holds only a pre-calculated analytical data set.
Nabler has built two primary use cases for its clients by extending cloud compute to BI platforms like Datorama:
Scalable benchmarking using forecasting techniques:
Instead of using static or historic moving-average benchmarks at quarterly or annual levels, Nabler clients get forecasted, expected performance for each KPI across the campaign data hierarchy, refreshed at a minimum 24- to 48-hour window. Marketing managers can easily plan their next day, week, month, quarter, or year. This simplified view of critical thresholds and expected breaches enables actionable intelligence for proactive intervention, even mid-flight.
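To make the contrast concrete, here is a minimal sketch of a static moving-average benchmark versus a forecasted one, using simple exponential smoothing as a stand-in for whatever forecasting technique is actually deployed. The function names, the smoothing parameter, and the breach tolerance are all illustrative assumptions:

```python
def moving_average_benchmark(history, window=4):
    """Static benchmark: mean of the last `window` observations."""
    return sum(history[-window:]) / min(window, len(history))

def forecast_benchmark(history, alpha=0.3):
    """Forecasted benchmark via simple exponential smoothing.

    The final smoothed level serves as the expected value for the
    next period; `alpha` weights recent observations more heavily.
    """
    level = history[0]
    for y in history[1:]:
        level = alpha * y + (1 - alpha) * level
    return level

def breach(actual, benchmark, tolerance=0.10):
    """Flag a KPI that falls more than `tolerance` below its benchmark."""
    return actual < benchmark * (1 - tolerance)
```

Recomputing the forecast on each 24- to 48-hour refresh keeps the benchmark tracking recent performance rather than a fixed historical average, so a breach flag surfaces while the campaign is still in flight.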
Automatically augmented performance insights:
Nabler embeds extensive machine learning capability, which automatically analyzes each of the following aspects and bubbles up the key ones that the business needs to focus on:
This AI/ML-driven automation of insight generation helps clients save almost 70% of the time and spend required to manually produce a similar level of output.
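The "bubbling up" of key insights can be illustrated with a deliberately simple stand-in: ranking KPI readings by how far they deviate from their own history. The real pipeline would use richer ML models; the z-score threshold and input shape here are assumptions for the sketch:

```python
import statistics

def bubble_up_insights(kpis, z_threshold=2.0):
    """Return only the KPI readings worth a marketer's attention.

    `kpis` is a list of (name, history, current) tuples. A reading is
    flagged when its z-score against its own history exceeds the
    threshold; results are sorted by deviation, largest first.
    """
    flagged = []
    for name, history, current in kpis:
        mean = statistics.fmean(history)
        sd = statistics.pstdev(history)
        if sd == 0:
            continue  # flat history: no meaningful deviation to report
        z = (current - mean) / sd
        if abs(z) >= z_threshold:
            flagged.append((name, round(z, 2)))
    return sorted(flagged, key=lambda t: -abs(t[1]))
```

The value is in what the function leaves out: of all KPIs analyzed, only the statistically unusual ones reach the dashboard, which is where the manual-effort savings come from.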