Situation

This community-based non-profit organization has six divisions spanning Children & Family, Home Ownership & Repair, Economic Opportunity, Transportation, Heat Energy & Fuel, and Healthcare. Each division uses a separate system to perform its functions. All of these systems are independent SaaS solutions that do not talk to each other, so when an individual is served by more than one division, a new contact has to be created and all details entered manually. The divisions operate independently on their own systems with no opportunity to collaborate. This leads to three main concerns:

  • Data quality within each system is questionable but not quantifiable – data quality issues have to be identified and fixed
  • Every system works independently, so there is no data collaboration among the divisions
  • Together, these two concerns block the accurate, detailed reporting needed for efficient decision-making

Solution

In this particular situation, no direct connectivity could be established with the systems. The solution therefore took a three-step consultative approach.

Standardize

  • Databases cannot be accessed directly – the cloud-based applications do not expose APIs or their databases
  • Reports are generated daily and uploaded to a shared folder following a specific naming convention
  • A scheduled routine extracts data from the shared folder and processes it daily
  • Alerts for specific datasets are sent to identified users (by division)
  • Users are trained to review the anomalies and fix the data in the appropriate source systems
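The Standardize step can be sketched in a few lines. This is a minimal illustration, not the actual routine: the naming convention (`<division>_<dataset>_YYYY-MM-DD.csv`) and the required-field check are assumptions standing in for the real agreed conventions and anomaly rules.

```python
import re
from datetime import date

# Hypothetical naming convention: <division>_<dataset>_YYYY-MM-DD.csv
FILENAME_RE = re.compile(
    r"^(?P<division>[a-z_]+)_(?P<dataset>[a-z_]+)_(?P<date>\d{4}-\d{2}-\d{2})\.csv$"
)

def parse_report_name(filename):
    """Return (division, dataset, report_date), or None if the file
    name does not follow the agreed convention."""
    m = FILENAME_RE.match(filename)
    if not m:
        return None
    return m["division"], m["dataset"], date.fromisoformat(m["date"])

def find_anomalies(rows, required_fields):
    """Flag rows missing a required field, so the identified division
    user can fix the record in the source system."""
    anomalies = []
    for i, row in enumerate(rows, start=1):
        missing = [f for f in required_fields if not row.get(f, "").strip()]
        if missing:
            anomalies.append((i, missing))
    return anomalies
```

Files that fail `parse_report_name` are skipped and reported, which is how the naming convention itself gets enforced over time.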

Consolidate

  • The data structures for the Staging, Master, and Transactional data must be defined
  • A daily scheduled routine extracts the files from the source systems and pushes them to the staging tables of the Data Lake, which remains the raw data repository
  • Incremental transform events merge the data and update the Master and Transactional data
  • Validation events trigger alerts when anomalies are found in the transformed data, for users to confirm and fix in the source systems

Integrate

  • Connect the data lake to the reporting tool to analyze, summarize, and slice both Master and Transactional data for continuous evaluation and decision-making
  • Encourage the source system vendors to enable technology for direct extraction and push-back of data
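To show the kind of cross-division view the reporting tool can now produce, here is a small sketch over consolidated transactional data. The record shape (`client_id`, `division`, `amount`) is an assumption for illustration.

```python
from collections import defaultdict

def spend_by_division(transactions):
    """Roll up transactional data by division for a summary report."""
    totals = defaultdict(float)
    for txn in transactions:
        totals[txn["division"]] += txn["amount"]
    return dict(totals)

def clients_served_by_multiple_divisions(transactions):
    """Count clients appearing in more than one division -- the
    collaboration view that siloed systems could not provide."""
    divisions_per_client = defaultdict(set)
    for txn in transactions:
        divisions_per_client[txn["client_id"]].add(txn["division"])
    return sum(1 for d in divisions_per_client.values() if len(d) > 1)
```

Neither query is possible when each division's data lives in its own disconnected SaaS system; both are trivial once the master and transactional data are consolidated.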

At every step, new components of the overall solution are introduced gradually to make them easier for users to adopt and execute. Once all the steps are completed, the entire solution runs with little human involvement. Data and interactions are secured, and no data is shared with other organizations.



Results

Implementation of this custom routine solution had a significant impact on overall business operations and efficiency. Three areas of immediate wins:

  • Master and transactional data consolidation: Unified master and transactional data across all divisions allows accurate segmentation, cost, and operational reporting and analysis
  • Customer Master: A single source of truth to search for existing customers before deciding to add a new one brought onboarding time down significantly
  • Mandatory reporting: State- and federal-level reporting is now simple instead of a monthly fire drill
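The "search before you add" check behind the Customer Master win can be sketched as below. Matching on normalized name similarity with `difflib` is an illustrative heuristic, not the production matching logic, and the record shape is assumed.

```python
from difflib import SequenceMatcher

def find_possible_matches(master, name, threshold=0.85):
    """Return existing customers whose names closely match the new
    entry, so staff can reuse a record instead of creating a duplicate."""
    target = name.strip().lower()
    matches = []
    for record in master:
        score = SequenceMatcher(
            None, target, record["name"].strip().lower()).ratio()
        if score >= threshold:
            matches.append((record, score))
    return sorted(matches, key=lambda m: m[1], reverse=True)
```

Surfacing near-matches (including typos and spelling variants) at entry time is what cuts onboarding time: staff confirm an existing record in seconds instead of re-keying every detail.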

Sign me up for a free demo.

I would like to add routine to my data, organize and consolidate my records now!