The master data of Hagos was partially redundant and spread across different systems; the existing maintenance processes were time-consuming and inefficient.
New master data, including changes and updates, was circulated as lists in email attachments to the next processor. This process was error-prone and opaque, and it often led to inconsistencies in the databases and downstream systems. Data changes were difficult to trace because logging was neither standardised nor automated. Data was not distributed in real time, which delayed the maintenance process and created the risk of working with outdated data.
In order to provide the data for the different sub-systems in a suitable form, a central data model was designed.
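A central data model of this kind can be sketched as a single canonical record that is mapped into whatever layout each sub-system expects. This is a minimal illustration only; the field names (record_id, name, address) and the target systems ("erp", "crm") are assumptions, not the actual Hagos model.

```python
from dataclasses import dataclass

# Hypothetical central master data record; field names are illustrative
# assumptions, not the real Hagos data model.
@dataclass(frozen=True)
class MasterRecord:
    record_id: str       # one canonical key shared by all sub-systems
    name: str
    address: str
    source_system: str   # where the record was originally maintained

def to_subsystem_format(record: MasterRecord, system: str) -> dict:
    """Map the central record to the field layout a sub-system expects."""
    # Illustrative target schemas only; real ones would differ.
    if system == "erp":
        return {"ID": record.record_id, "NAME1": record.name, "ADDR": record.address}
    if system == "crm":
        return {"accountId": record.record_id, "displayName": record.name}
    raise ValueError(f"unknown sub-system: {system}")
```

The point of the design is that each record exists exactly once centrally, and the per-system formats are derived from it rather than maintained in parallel.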
A multi-stage supplementation and release workflow was developed to standardise the maintenance processes. Stakeholders defined key figures for the data quality criteria, and the corresponding measures were integrated directly into the maintenance process.
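A multi-stage release workflow like the one described can be modelled as a small state machine in which every record must pass through the same stages before it is distributed. The stage names and transitions below are assumptions for illustration, not the actual Hagos workflow.

```python
from enum import Enum

# Hypothetical stages of a supplementation and release workflow;
# names and transitions are illustrative assumptions.
class Stage(Enum):
    DRAFT = "draft"                 # new or changed record entered
    SUPPLEMENTED = "supplemented"   # specialist departments add their fields
    REVIEWED = "reviewed"           # quality checks passed
    RELEASED = "released"           # distributed to the sub-systems

# Allowed transitions; anything else is rejected, so every change
# follows the same auditable path.
TRANSITIONS = {
    Stage.DRAFT: {Stage.SUPPLEMENTED},
    Stage.SUPPLEMENTED: {Stage.REVIEWED, Stage.DRAFT},  # send back on errors
    Stage.REVIEWED: {Stage.RELEASED, Stage.DRAFT},
    Stage.RELEASED: set(),
}

def advance(current: Stage, target: Stage) -> Stage:
    """Move a record to the next stage, rejecting illegal shortcuts."""
    if target not in TRANSITIONS[current]:
        raise ValueError(f"illegal transition {current.value} -> {target.value}")
    return target
```

Because every transition goes through `advance`, the workflow engine can log each step automatically, which addresses the traceability gap of the earlier email-based process.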
The master data maintenance process is now workflow-supported in the system, and media breaks between systems have been eliminated. To maintain consistently high data quality, evaluation options and data cleansing functionalities have been introduced. Based on the requirements concept (specifications), a master data management tool was qualified and introduced.
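The evaluation and cleansing functionalities mentioned above might, in their simplest form, compute a completeness key figure and flag duplicate keys for cleansing. The following is a minimal sketch under those assumptions; the field names are hypothetical.

```python
# Hypothetical data quality helpers; field names are illustrative.

def completeness(records: list[dict], required: list[str]) -> float:
    """Share of records in which every required field is filled (a key figure)."""
    if not records:
        return 1.0
    ok = sum(1 for r in records if all(r.get(f) for f in required))
    return ok / len(records)

def find_duplicates(records: list[dict], key: str) -> set:
    """Values of `key` occurring more than once: candidates for cleansing."""
    seen, dupes = set(), set()
    for r in records:
        value = r.get(key)
        if value in seen:
            dupes.add(value)
        seen.add(value)
    return dupes
```

Running such checks inside the release workflow, rather than after distribution, is what keeps low-quality records from ever reaching the sub-systems.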