As the number of customers and the volume of business they generated grew, a business intelligence (BI) strategy was needed. What started as a need for a series of automatically generated KPI reports led to the creation of a massive data analytics warehouse.
Our client invested heavily in a BI suite of technologies. For the most part, the technologies performed superbly. Digital dashboards provided just about every metric required and were used extensively across the business, available on demand.
In one particular area, however, the in-house team had been unable to produce the desired results: KPI reporting across certain parts of the business.
Data Tracking – Data Intelligence
Following our data intelligence work for the client, we were asked to apply our strategic architecture, technical experience, dexterity and cunning to develop a complete process for automatically generating KPI reports and measuring business performance against key customer metrics.
Each KPI report would need to track between 35,000 and 45,000 items across multiple locations throughout the country.
Each item could be at a different stage of its lifecycle, and that stage also needed to be reported on.
The source system, in operation for several decades, had been developed on a proprietary, non-relational database. There was no formal schema definition, no analytical capability, and no method of integration with the client's BI solution.
On top of that, the client had little knowledge of the internal data structures of their legacy operational system.
Additionally, the data in the system was created through a variety of means: entered directly by staff, generated by scan events from mobile data terminals, or produced by Electronic Data Interchange (EDI) events.
Tracing the source of event data through the original system was far from obvious, and therefore non-trivial.
To complicate things further, a constraint in the way the BI suite worked meant execution time for any of the KPI reports had to be kept to less than 100 seconds.
From Automatically Generated KPI Reports to Data Analytics Warehouse
Semantia chose to address this by creating a separate SQL Server data analytics warehouse into which event records were reconstructed, in a relational schema better suited to access by the BI suite.
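The client's actual schema is proprietary; as a minimal sketch, the reconstructed structure might resemble the following, covering items, locations and lifecycle stages, with every table and column name hypothetical:

```sql
-- Hypothetical sketch of the reconstructed relational schema: tracked items,
-- their lifecycle stages, and the events that move them between stages.
CREATE TABLE dbo.Item (
    ItemId       BIGINT      NOT NULL PRIMARY KEY,
    CustomerId   INT         NOT NULL,
    CurrentStage VARCHAR(30) NOT NULL  -- e.g. 'CREATED', 'IN_TRANSIT', 'DELIVERED'
);

CREATE TABLE dbo.ItemEvent (
    EventId    BIGINT      NOT NULL PRIMARY KEY,
    ItemId     BIGINT      NOT NULL REFERENCES dbo.Item (ItemId),
    LocationId INT         NOT NULL,
    Stage      VARCHAR(30) NOT NULL,  -- lifecycle stage recorded by this event
    Source     VARCHAR(10) NOT NULL,  -- 'STAFF', 'SCAN' or 'EDI'
    EventTime  DATETIME2   NOT NULL
);

-- KPI reports filter events by time, so index EventTime with covering columns.
CREATE INDEX IX_ItemEvent_EventTime
    ON dbo.ItemEvent (EventTime) INCLUDE (ItemId, Stage);
```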
Following a detailed analysis of the internal system's data representation, we began writing conversion scripts to transform more than 130 million event records spread across more than 180 tables and 2,000 columns.
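The conversion scripts themselves are not published. In outline, each one reshaped rows copied out of the legacy store into the relational schema; a simplified sketch, assuming a hypothetical staging table, might look like this:

```sql
-- Hypothetical conversion step: reshape rows landed from the legacy store
-- (a staging table) into the relational event table. Names are illustrative.
INSERT INTO dbo.ItemEvent (EventId, ItemId, LocationId, Stage, Source, EventTime)
SELECT s.legacy_event_id,
       s.legacy_item_id,
       s.site_code,
       CASE s.status_flag              -- legacy status codes mapped to stages
            WHEN 'C' THEN 'CREATED'
            WHEN 'T' THEN 'IN_TRANSIT'
            WHEN 'D' THEN 'DELIVERED'
            ELSE 'UNKNOWN'
       END,
       s.entry_channel,                -- staff entry, scan event or EDI
       s.event_timestamp
FROM staging.LegacyEvent AS s
WHERE NOT EXISTS (SELECT 1 FROM dbo.ItemEvent AS e
                  WHERE e.EventId = s.legacy_event_id);
```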
After the initial conversion of the database, an automated transformation process was scheduled at 15-minute intervals, loading 50,000 to 60,000 event records in a 60-second window to provide a near real-time view of the production data.
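How the scheduled loads were driven isn't described; a common SQL Server approach is a SQL Agent job that picks up only records newer than a stored high-watermark, along these lines (again with hypothetical names):

```sql
-- Hypothetical incremental load, run every 15 minutes (e.g. by a SQL Agent job).
DECLARE @watermark DATETIME2 =
    (SELECT LastLoadedAt FROM dbo.LoadControl WHERE TableName = 'ItemEvent');

INSERT INTO dbo.ItemEvent (EventId, ItemId, LocationId, Stage, Source, EventTime)
SELECT s.legacy_event_id, s.legacy_item_id, s.site_code,
       CASE s.status_flag WHEN 'C' THEN 'CREATED' WHEN 'T' THEN 'IN_TRANSIT'
            WHEN 'D' THEN 'DELIVERED' ELSE 'UNKNOWN' END,
       s.entry_channel, s.event_timestamp
FROM staging.LegacyEvent AS s
WHERE s.event_timestamp > @watermark;   -- only rows arriving since the last run

-- Advance the watermark to the newest event actually loaded, so a slow or
-- failed run never causes records to be skipped on the next pass.
UPDATE dbo.LoadControl
SET LastLoadedAt = (SELECT MAX(EventTime) FROM dbo.ItemEvent)
WHERE TableName = 'ItemEvent';
```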
Complex SQL was written and wrapped in User Defined Functions (UDFs), allowing the client to leverage the data and build additional services from it. The UDFs were then made available to the BI suite.
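The actual UDFs are not shown in the source; as an illustration, an inline table-valued function over the hypothetical event schema above would give the BI suite a single, parameterised entry point:

```sql
-- Hypothetical inline table-valued UDF exposing KPI figures to the BI suite.
-- Inline TVFs are expanded into the calling query, so they optimise well.
CREATE FUNCTION dbo.fn_CustomerKpi (@CustomerId INT, @FromDate DATE, @ToDate DATE)
RETURNS TABLE
AS
RETURN
    SELECT e.Stage,
           COUNT(*)                 AS EventCount,
           COUNT(DISTINCT e.ItemId) AS ItemCount,
           MIN(e.EventTime)         AS FirstEvent,
           MAX(e.EventTime)         AS LastEvent
    FROM dbo.ItemEvent AS e
    JOIN dbo.Item      AS i ON i.ItemId = e.ItemId
    WHERE i.CustomerId = @CustomerId
      AND e.EventTime >= @FromDate
      AND e.EventTime <  DATEADD(DAY, 1, @ToDate)
    GROUP BY e.Stage;
```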
Execution time for the automatically generated KPI reports was generally less than 100 seconds to produce a rolling 12-month view of customer metrics, meeting the BI suite's constraint.
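Using the hypothetical function above, a rolling 12-month report reduces to a single call:

```sql
-- Hypothetical report query: rolling 12-month KPI view for one customer.
SELECT *
FROM dbo.fn_CustomerKpi(
         42,                                              -- example customer id
         DATEADD(MONTH, -12, CAST(SYSUTCDATETIME() AS DATE)),
         CAST(SYSUTCDATETIME() AS DATE))
ORDER BY Stage;
```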

The KPI reports were user-triggered and could run at any time without impacting the performance of the operational systems used company-wide.
More importantly, though, what started as an attempt to automatically generate KPI reports led to the reverse engineering of a proprietary NoSQL operational datastore, the creation of the client's core data analytics warehouse, and the display of real-time metrics throughout the business.
The result was a new data analytics warehouse providing a single view of customer activities, a capability that had not previously existed.
With the development of a new ERP system underway for the client, the data warehouse has become the “database of record” for data migration.