The trend in enterprise resource planning (ERP) and human capital management (HCM) has been to move many functions out of a single monolithic system.  Over the last 10 years, specialized products have sprung up that focus on only a portion of an HCM system.  An enterprise can easily have Workday or UltiPro (https://www.ultimatesoftware.com/) for employee management, ADP Workforce Now for payroll, Taleo for talent, and Namely for benefits management.  Or maybe you have a couple of old PeopleSoft systems and SAP from several mergers over the years, and you are in the process of migrating to SuccessFactors.

The Challenge of Data Quality

There is a system for payroll, a system for benefits management, another for compensation, another for performance reviews, and so on.  While the reason for this shift has been to build a deeper feature set by focusing on one aspect of ERP, it has also created a decentralized architecture.  This has profound effects on data quality, because there is a core set of data that all of these systems require to operate.  That core data about employees must be shared among the systems, and if the sharing is not done properly it creates data duplication and overlap errors.
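
To make the duplication problem concrete, here is a minimal Python sketch of how the same person can end up as two records once core data is copied between systems without a shared key.  Every system name, field, and record below is a hypothetical assumption for illustration, not any vendor's actual schema.

```python
# Minimal sketch of a duplication error: the same person copied into a
# talent system under a new ID.  All systems and records are hypothetical.
from collections import defaultdict

employees = [
    {"id": "HR-1001", "email": "JDoe@example.com",  "source": "hr_core"},
    {"id": "TAL-88",  "email": "jdoe@example.com ", "source": "talent"},
    {"id": "HR-1002", "email": "rroe@example.com",  "source": "hr_core"},
]

# Without a shared identifier, a normalized email is a crude match key.
by_email = defaultdict(list)
for rec in employees:
    by_email[rec["email"].strip().lower()].append(rec)

for email, recs in by_email.items():
    if len(recs) > 1:
        ids = [f'{r["source"]}:{r["id"]}' for r in recs]
        print(f"duplicate person {email}: {ids}")
```

Running this flags HR-1001 and TAL-88 as the same person; in real systems the match keys are messier, which is exactly why these errors go unnoticed.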

Redundant Systems

Data quality only gets worse when multiple ERP or HCM systems are in play.  An organization can end up with multiple systems through modernization upgrades or mergers with other companies.  On average, companies upgrade their ERP system or some human resources IT system every 7 years.  By law, companies must keep historical data for 7+ years before letting it go, yet standard upgrade procedures only convert the current employees or some limited historical data.  That leaves much of the existing data behind in the legacy system(s).  After an upgrade the legacy system will often still be running, or even worse, the organization will continue to maintain BOTH systems for some period of time!

Migrations

These upgrades have an impact on data quality because there has to be a transformation between the legacy system(s) and the new system.  Data must be extracted, transformed, and loaded into the new system, and this process is NOT easy.  On average, companies spend around 6 months migrating data from prior systems.  It is a complex, tedious task to track down all of the information, and errors can creep into the process.
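
To give a feel for where those errors creep in, here is a minimal extract-transform-load sketch, assuming a hypothetical legacy CSV export and a new system with different field names and date formats.  Every file layout, field name, and record here is an illustrative assumption.

```python
# Minimal ETL sketch for a legacy-to-new migration.  The legacy export,
# field names, and target schema are all hypothetical.
import csv
import io
from datetime import datetime

legacy_export = io.StringIO(
    "EMP_NO,NAME,HIREDATE\n"
    "1001,Jane Doe,03/01/2015\n"
    "1002,John Roe,13/01/2015\n"   # bad month: a typical migration error
)

def transform(row):
    """Map legacy fields onto the new system's schema."""
    return {
        "employee_id": f"E{row['EMP_NO']}",
        "full_name": row["NAME"].strip(),
        # Legacy stores MM/DD/YYYY; the new system wants ISO 8601.
        "hire_date": datetime.strptime(row["HIREDATE"], "%m/%d/%Y").date().isoformat(),
    }

loaded, rejected = [], []
for row in csv.DictReader(legacy_export):
    try:
        loaded.append(transform(row))      # "load" step stubbed as a list
    except ValueError as err:
        rejected.append((row, str(err)))   # quarantine rows for manual review

print(f"loaded {len(loaded)}, rejected {len(rejected)}")
```

Even this toy pipeline needs a quarantine path for bad rows; a real migration multiplies that across hundreds of fields and decades of history, which is where the 6 months go.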

Running multiple systems simultaneously means the data in each system can diverge.  It's very difficult to spot these problems in real time because overlapping systems often don't integrate with each other.  There is no single view of all of the data.  This is the big challenge of a decentralized environment: no one system can be declared the master.  The result is silos of data that are trusted by portions of the organization, but not by everyone.
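
As an illustration of why a single view matters, the sketch below merges hypothetical exports from two live systems into one report and flags where they have drifted apart.  The system names, fields, and records are invented for the example.

```python
# Minimal sketch: one combined view over two live systems, flagging
# missing records and diverged fields.  All data is hypothetical.

payroll = {
    "E1001": {"last_name": "Doe",  "cost_center": "CC-40"},
    "E1002": {"last_name": "Roe",  "cost_center": "CC-12"},
}
benefits = {
    "E1001": {"last_name": "Doe-Smith", "cost_center": "CC-40"},  # name changed
}

for emp_id in sorted(payroll.keys() | benefits.keys()):
    p, b = payroll.get(emp_id), benefits.get(emp_id)
    if p is None or b is None:
        print(f"{emp_id}: present in only one system")
        continue
    for field in ("last_name", "cost_center"):
        if p[field] != b[field]:
            print(f"{emp_id}: {field} diverged "
                  f"(payroll={p[field]!r}, benefits={b[field]!r})")
```

A report like this only exists once someone brings the data together; while the systems run side by side with no integration, the divergence is invisible.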

With multiple systems in play, modernizing and upgrading your IT environment can be a massive undertaking.  Data consolidation drives data quality because conflicts surface more quickly simply by bringing all of the data together.
Not only that, but removing redundant data stores, reducing the number of systems that need to be converted, and resolving conflicts will all improve data quality.

The simple answer to improving data quality is to minimize the footprint of your HCM systems:

  • Use a single system for each domain.
  • Consolidate data across multiple systems (a sketch of one approach follows this list).
  • Archive legacy data and consolidate legacy systems.
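
One way to act on the consolidation point is a "golden record" merge, where each field has a declared system of record.  Here is a minimal sketch under that assumption; the precedence rules, system names, and data are illustrative, not prescriptive.

```python
# Minimal sketch of consolidation with a declared system of record per
# field.  The precedence map and all records are hypothetical.

PRECEDENCE = {
    "legal_name": ["hr_core", "payroll"],   # hr_core wins for names
    "salary":     ["payroll", "hr_core"],   # payroll wins for salary
}

sources = {
    "hr_core": {"legal_name": "Jane Doe-Smith", "salary": None},
    "payroll": {"legal_name": "Jane Doe",       "salary": 82000},
}

def golden_record(sources):
    """Take each field from the first system in its precedence list
    that actually has a value."""
    merged = {}
    for field, order in PRECEDENCE.items():
        for system in order:
            value = sources.get(system, {}).get(field)
            if value is not None:
                merged[field] = value
                break
    return merged

print(golden_record(sources))
# -> {'legal_name': 'Jane Doe-Smith', 'salary': 82000}
```

Declaring precedence up front is what resolves the "no one system is the master" problem: disagreements still happen, but there is now a rule for settling them.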

These systems are the basis for decision making in any organization.  If the data can't be trusted, neither can the decisions made based on that data.  Data consolidation forces data quality to improve while making your organization simpler to manage.