This client, a chain of bakery cafés, was approaching the end of their contract with their existing data warehouse provider and had decided not to renew it. They needed to build a new data warehouse in Azure and recreate all of their reports and dashboards in Power BI before losing access to the existing system, but their only in-house expert on that system had left a few weeks earlier.
This client provides training courses, group coaching, and certification exams intended primarily for teams of employees, and needs to accumulate data from two different web systems into a data warehouse and provide a variety of dashboards for the students & their management, as well as for the client's own sales and customer success teams. Before building the data pipeline, we worked with the client to analyze a number of options for obtaining the data from Thinkific, either directly or via Segment, and for storing, managing, and reporting on it, including Azure SQL Data Warehouse, Amazon Redshift, Azure Functions, AWS Lambda, Klipfolio, Airtable, Looker, and others.
This client, located in Belgium, uses sensors and mobile devices to monitor fleets of vehicles for vehicle performance, driver behavior, vehicle event & alarm monitoring, telephone usage, and activity & task completion. They needed to build an analytics & reporting platform that could be embedded in their existing user portal for fleet managers. They also needed to get the project off the ground fairly quickly while limiting rework as they change database platforms and hosting models in the future. In addition, they needed to start off supporting English and French, translating both UI elements and some data components, with the plan of eventually adding another six languages with a minimum of redevelopment.
Our client for this project is an American beverage manufacturing company, located in Connecticut & Ohio, that had a poorly performing, unwieldy Birst data warehouse and needed to make significant improvements. The existing Birst environment was loaded nightly from an existing data warehouse that received data from three sales/ERP systems; it took 3-4 hours per night to process a full load of data from those systems before delivering a subset of the data to Birst, which then took hours more to process it. The data model also forced the generation of extremely complex queries that were difficult to maintain and performed very poorly: one report, generated daily, routinely took 90 minutes to render and so could only be delivered by email. On top of that, there were significant questions about data quality, because the reports frequently disagreed with accounting's figures.
The new system is so much less complex and runs so much faster that the client's new Birst team was able to rebuild all their original reports almost entirely on their own. End-to-end nightly data processing now takes less than half an hour, with the exact cycle time depending on the amount of data, and the client is considering running hourly incremental updates throughout the business day. The report that used to take 90 minutes now takes 60-70 seconds, so they are able to use it in a dashboard (we may be able to optimize that further if they decide it is worthwhile). QA has shown the data to be 100% accurate against both the source systems and the accounting information.
We also mentored the client on management of the old and new Birst spaces, report development, testing & troubleshooting, and future model enhancements.
A number of clients engage with us to provide mentoring, troubleshooting and other guidance to their staff who are involved with Birst, including:
The client is a large, US-based fleet tracking and management service provider that was using Birst Live Access to provide a logical data model for reporting against a heavily-used transactional database. Performance under this configuration was extremely poor, and they were considering dropping Birst despite their investment in it. During our initial engagement, we discussed ideas for optimizing the Live Access model and the underlying database, but quickly concluded that the best course was to load the most relevant data into a Birst data model. We provided extensive guidance on designing the Birst data model, optimizing the extraction and delivery of data, and managing the data volumes of their many customers; we also suggested a number of important improvements to Birst that were implemented for this client, and acted as a liaison to the Birst support and product teams.
Over the five years since then, we have continued to provide guidance and mentoring to the client's staff, helping them become more comfortable with Birst, reviewing their planned data model changes and additions, designing web-based ETL and deployment management software, troubleshooting data and software problems, and more. In addition, we took part in two annual business intelligence summits held by the client and some of their sister companies: one year helping the sister companies understand how best to plan fitting Birst into their own environments and, the next, providing 12 hours of custom Birst training sessions to the client and several of their sister companies.
This client was a startup that needed to implement an analytics platform and develop an analytics roadmap to include in their engineering plans. We helped them evaluate a number of options for collecting data from their various systems and delivering it to Birst in a manageable fashion, decide which data sets to prioritize and which to defer, and plan the Birst model so that new data sets could be added easily later. We also built their initial data models and dashboards in Birst.
This client is a multinational food-manufacturing company, based in Michigan, with more than 30,000 employees worldwide. They have a mandate to report on Equal Employment Opportunity compliance, for which they were already using Birst, but the existing Birst data warehouse implementation was not working well for them because of a couple of model design problems. The new Birst implementation we built substantially reduced the ETL processing time, model & query complexity, and query execution time, resulting in new reports that were more accurate, easier to validate & maintain, and faster.
When the client was migrating from one HR system to another, they needed to rebuild their Birst Equal Employment Opportunity compliance reporting system to accommodate the new data formats, and asked Birst to have us, specifically, do the work. Initially, we agreed that we would simply reformat the data extracted from the new system so that it could be loaded into the existing Birst space, but when the client realized that they also wanted to remove some components and change all the column names to match the new system, we concluded that it was best to build a new system mirroring the previous one, with the new names. The transition to the new environment went very well, with minimal impact on end users.