
Master data quality in B2B wholesale: How Hagos eG reduced the error rate by 73% with a central data model

Initial situation

Hagos eG is Germany's leading purchasing cooperative for the tile stove and warm air heating industry. Founded in Stuttgart in 1919, the company now serves around 1,400 members in Germany, Austria, France, and Italy. With annual sales of approximately €210 million, ten branches, three central warehouses, and a product range of 45,000 items, Hagos is one of the most important specialist wholesalers in the sector.

The challenge: IT structures that had evolved over decades and decentralized data collection at all locations led to significant quality problems in the master data. Supplier data, product master data, and customer data were recorded, maintained, and modified at different locations – without uniform standards, without central validation, and without workflow-driven approval processes.

Key challenge

The problem analysis identified three critical dimensions:

Dimension 1 – Data inconsistency: Identical items were managed with different names, attributes, and classifications in various branches. Duplicates and inconsistencies hampered supply chain automation.

Dimension 2 – Process risk: The planned ERP-supported workflow management required reliable master data. Automated ordering processes, scheduling, and supplier management only functioned with error-free master data. Every master data error was amplified in downstream processes.

Dimension 3 – Decentralized governance: With ten branches that could independently enter and modify master data, there was no central authority for quality assurance. Neither responsibilities nor approval processes were clearly defined.



Methodological approach

The consulting project followed a structured four-phase approach:

Phase 1 – Master Data Audit: Complete recording of all master data-relevant entities: articles, suppliers, customers, terms and conditions, storage locations. Analysis of existing data structures, identification of duplicates, inconsistencies, and gaps. Evaluation according to the quality criteria of completeness, uniqueness, timeliness, and consistency.
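
To make the audit criteria tangible, the sketch below scores completeness, uniqueness, and timeliness over an article export. The file name, column names, and the 24-month timeliness threshold are illustrative assumptions, not details of the actual Hagos system:

```python
# Minimal audit scoring sketch; file name and column names are assumptions.
import pandas as pd

df = pd.read_csv("article_master.csv")  # hypothetical export of the article master

MANDATORY = ["article_no", "name", "supplier_no", "product_class"]  # assumed mandatory fields

# Completeness: share of records with every mandatory field filled.
completeness = df[MANDATORY].notna().all(axis=1).mean()

# Uniqueness: share of records that are not duplicates on the business key.
uniqueness = 1 - df.duplicated(subset=["article_no"]).mean()

# Timeliness: share of records touched within the last 24 months (assumed threshold).
updated = pd.to_datetime(df["updated_at"], errors="coerce")
timeliness = (updated >= pd.Timestamp.now() - pd.DateOffset(months=24)).mean()

# Consistency needs a cross-system comparison and is omitted in this sketch.
print(f"completeness={completeness:.1%}  uniqueness={uniqueness:.1%}  timeliness={timeliness:.1%}")
```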

Phase 2 – Data Model Design: Development of a central master data model based on the single source of truth principle. Definition of mandatory attributes, required fields, and validation rules for each master data category. Establishment of classification standards for the 45,000-item product range.
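
Mandatory attributes and validation rules can be pictured as a simple rule table per master data category. A minimal sketch follows; the field names, key formats, and classification values are assumptions for illustration, not the actual Hagos data model:

```python
# Illustrative rule table for an article record; all names and formats are assumed.
import re

RULES = {
    "article_no":    lambda v: bool(re.fullmatch(r"\d{6}", str(v))),   # assumed 6-digit key
    "name":          lambda v: isinstance(v, str) and 3 <= len(v) <= 80,
    "supplier_no":   lambda v: bool(re.fullmatch(r"S\d{5}", str(v))),  # assumed key format
    "product_class": lambda v: v in {"stove", "flue", "accessory"},    # assumed classes
}

def validate(record: dict) -> list[str]:
    """Return all rule violations; an empty list means the record passes."""
    errors = [f"missing mandatory field: {f}" for f in RULES if f not in record]
    errors += [f"invalid value for {f}: {record[f]!r}"
               for f, rule in RULES.items() if f in record and not rule(record[f])]
    return errors

print(validate({"article_no": "123456", "name": "Stove door hinge"}))
# -> ['missing mandatory field: supplier_no', 'missing mandatory field: product_class']
```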

Phase 3 – Workflow Design: Implementation of workflow-driven quality controls. Every master data change goes through a defined approval process: the decentralized branches enter data, and the central master data office validates and approves it. Automatic duplicate and plausibility checks run before each approval.
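
In simplified form, such an approval workflow is a small state machine. The states and transitions below are an illustrative sketch under assumed naming, not the implemented system:

```python
# Sketch of the approval workflow as a state machine; states and rules are assumed.
from enum import Enum, auto

class State(Enum):
    DRAFT = auto()      # entered at a branch
    SUBMITTED = auto()  # handed over to the central master data office
    RELEASED = auto()   # approved and visible to ERP workflows
    REJECTED = auto()   # returned to the branch for correction

ALLOWED = {
    (State.DRAFT, State.SUBMITTED),
    (State.SUBMITTED, State.RELEASED),
    (State.SUBMITTED, State.REJECTED),
    (State.REJECTED, State.SUBMITTED),
}

def transition(current: State, target: State, checks_passed: bool) -> State:
    """Advance a record through the workflow; checks gate every release."""
    if (current, target) not in ALLOWED:
        raise ValueError(f"illegal transition {current.name} -> {target.name}")
    if target is State.RELEASED and not checks_passed:
        # Automatic duplicate and plausibility checks must pass before approval.
        raise ValueError("release blocked: automatic checks failed")
    return target

state = transition(State.DRAFT, State.SUBMITTED, checks_passed=True)
state = transition(state, State.RELEASED, checks_passed=True)  # -> State.RELEASED
```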

Phase 4 – Training and Implementation Concept: Creation of a cross-site training program for all 280 employees who work with master data. Definition of roles and responsibilities in accordance with data governance principles. Development of work instructions and quick reference guides for daily operations.



Implemented solution

The concept established a three-tier quality assurance system (a code sketch follows the three stages below):

Stage 1 – Recording: Decentralized input with system-based mandatory field checking and format validation. Automatic warning for potential duplicates.

Stage 2 – Validation: Centralized review by the master data office. Comparison against defined quality rules and classification standards.

Stage 3 – Release: Documented release with audit trail. Master data becomes available to ERP workflows only after release.
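
A compact sketch of how the three stages and the audit trail could interlock is shown below; all function names, field names, and the stubbed central review are assumptions:

```python
# Three quality gates with an audit trail; all names and checks are illustrative.
from datetime import datetime, timezone

MANDATORY = ("article_no", "name", "supplier_no")  # assumed mandatory fields
audit_trail: list[dict] = []

def log(stage: str, record_id: str, actor: str, outcome: str) -> None:
    audit_trail.append({"ts": datetime.now(timezone.utc).isoformat(), "stage": stage,
                        "record": record_id, "actor": actor, "outcome": outcome})

def release(record: dict, branch_user: str, mdm_user: str) -> bool:
    rid = record.get("article_no", "?")
    # Stage 1: decentralized capture with mandatory-field and format checks.
    if any(not record.get(f) for f in MANDATORY):
        log("capture", rid, branch_user, "rejected: mandatory field missing")
        return False
    log("capture", rid, branch_user, "submitted")
    # Stage 2: central review against quality and classification rules
    # (stubbed here; the real rules sit with the master data office).
    log("validation", rid, mdm_user, "approved")
    # Stage 3: documented release; only released records reach ERP workflows.
    record["released"] = True
    log("release", rid, mdm_user, "released")
    return True
```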




Quantifiable results

The implementation resulted in measurable improvements:

Key figure | Before the project | After the project | Improvement
--- | --- | --- | ---
Master data error rate | Baseline | – | –73%
Quality management effort | Baseline | – | –15%
Processing time for master data creation | Variable | Standardized | Defined
Duplicate rate | High | Controlled | Eliminated

The results create the prerequisite for the planned ERP optimization: automated supply chain workflows, system-driven scheduling, and digital ordering processes now rest on a reliable data foundation.

Lessons Learned

The key success factors of the project can be transferred to comparable organizations:

First: Master data quality is not an IT issue, but an organizational one. Without clear responsibilities and data governance, every technical concept will fail.

Second: Decentralized data collection requires centralized validation. The principle of "as decentralized as necessary, as centralized as possible" balances practical relevance with quality assurance.

Third: Training is not a one-off measure. Sustainable master data management requires continuous qualification and refresher courses.


FAQ

How can master data quality be improved in wholesale?
This is achieved through a central data model with mandatory attributes, workflow-driven approval processes, and clear data governance rules. The key is separating decentralized data entry from centralized validation.

Why is clean master data a prerequisite for ERP automation?
Automated ERP workflows such as machine-based inventory management, automated ordering, or supply chain processes amplify every master data error. Incorrect product data leads to incorrect orders; incorrect supplier data leads to delivery delays.

What is the difference between centralized and decentralized master data maintenance?
With decentralized maintenance, employees at different locations enter data directly into the system. With centralized maintenance, all data is consolidated and maintained in one place. Best practice is a hybrid model: decentralized data entry with centralized validation and approval.

How do you reduce duplicates in master data?
This is achieved through automatic duplicate checking during data entry, clear classification rules, and regular data cleansing. Crucially, the "first-time-right" principle applies: quality during initial data entry rather than subsequent correction.
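
As a rough illustration, an entry-time duplicate warning can be built from name normalization plus fuzzy matching. The normalization steps and the 0.9 similarity threshold below are assumptions:

```python
# Sketch of an entry-time duplicate warning (normalization and threshold assumed).
from difflib import SequenceMatcher

def normalize(name: str) -> str:
    # Lowercase, strip punctuation, collapse whitespace.
    return " ".join("".join(c for c in name.lower()
                            if c.isalnum() or c.isspace()).split())

def duplicate_candidates(new_name: str, existing_names: list[str],
                         threshold: float = 0.9) -> list[str]:
    """Return existing article names that look like duplicates of the new one."""
    n = normalize(new_name)
    return [e for e in existing_names
            if SequenceMatcher(None, n, normalize(e)).ratio() >= threshold]

print(duplicate_candidates("Ofenrohr 150mm, schwarz",
                           ["Ofenrohr 150 mm schwarz", "Ofentür Glas"]))
# -> ['Ofenrohr 150 mm schwarz']
```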

