The manager looks through the company design portfolio and sees design "A" in the archives. The sales team brings in an opportunity "B" that looks similar to A. As managers often do, they look at the surface and think, "I can bring in new product B on time, under budget, and with less testing because I'll leverage A." The development group gets the order to use A's data to make product B in six months. Twelve months later, B is done — after doubling the staff on the project, testing failures, and quality issues. It would have been easier to do B as a new project without reference to A. Why? The data for A was only "good enough." The depth of useful information in the data was really a facade.
The world of data continues to grow. The genesis of data is usually driven by a specific need, and in business those needs are almost always tied to a schedule. A limited number of man-hours are allotted to the task. The engineer, drafter, or designer takes the model to the "good-enough" level and stops. As long as the model "looks" right on screen, the job is done.
Now take that "good-enough" model and pass it over to the recycle bin — not the eternal land of deletion under Windows, but the green concept of recycle, reuse, repurpose. A manager knows the company already has data from another project. Why spend money on development when we can recycle existing data for a profit? Why should a flow analysis, stress analysis, weight analysis, cost analysis, or set of assembly instructions require a new data model to complete the task? Because the first model was only "good enough," and now extra effort is needed to repurpose the original data. The original schedule was kept, the original cost was contained, and the manager overseeing the original work is given a bonus for meeting performance goals. It is the next manager, inspector, or engineer who takes the penalty when the "good-enough" model cannot be used and a new data model must be built for the new task. Or worse, the "good-enough" model unknowingly provides erroneous results. "Good-enough" data has to be replaced with "better" data.
Logically, recreating or struggling with recycled data is not productive. A 20% increase in the original data-generation schedule could save 200%, 300%, even 500% downstream. Data is no longer used once and archived for the record; a single piece of data can be reused for a generation to come. Money, time, and standards invested at the beginning of data creation can save work for the entire organization later. Working for the future now does have measurable returns.
The integrity of data is directly linked to the cost reduction expected from reused data. The assertion that dollars allotted for data integrity are money well spent is not driven by model-based definition; it is good sense and a best practice. But without better data, MBD benefits are degraded and its advantages are lost. Without better data, any organization implementing model-based definition will be frustrated. That same organization may blame MBD for its woes instead of accepting that the organization's standard of "good enough" is the hidden cost driver.