Blog: William McKnight

William McKnight

Hello and welcome to my blog!

I will periodically be sharing my thoughts and observations on information management here in the blog. I am passionate about the effective creation, management and distribution of information for the benefit of company goals, and I'm thrilled to be a part of my clients' growth plans and to connect what the industry provides to those goals. I have played many roles, but the perspective I come from is benefit to the end client. I hope these entries can be of some modest benefit toward that goal. Please share your thoughts and input on the topics.

About the author

William is the president of McKnight Consulting Group, a firm focused on delivering business value and solving business challenges utilizing proven, streamlined approaches in data warehousing, master data management and business intelligence, all with a focus on data quality and scalable architectures. William functions as strategist, information architect and program manager for complex, high-volume, full life-cycle implementations worldwide. William is a Southwest Entrepreneur of the Year finalist, a frequent best-practices judge, has authored hundreds of articles and white papers, and given hundreds of international keynotes and public seminars. His team's implementations from both IT and consultant positions have won Best Practices awards. He is a former IT Vice President of a Fortune company, a former software engineer, and holds an MBA. William is author of the book 90 Days to Success in Consulting. Contact William at wmcknight@mcknightcg.com.

Editor's Note: More articles and resources are available in William's BeyeNETWORK Expert Channel. Be sure to visit today!

Recently in Master Data Management Category

And then there was the case study from my client, Commerzbank. I co-presented this one with Carolina Posada, Vice President at Commerzbank. With regard to the presentation topic of data governance: as a midsize organization (the US Branch), they combined Program Governance (program direction) and Data Governance (standards) into a single Steering Committee. They also have a data stewardship program.

I pointed out that the important thing about these committees is that all of the necessary information management functions for the organization get performed. These committees normally cover data governance, data stewardship, program governance and a business intelligence competency center (or center of excellence). I do not wish to overdo committees at my clients, but I do want to be sure all of the functions required for success are being carried out.

The benefits Carolina cited for their Analytical MDM implementation were:
1. Data management is aligned with the company strategy
2. Operational systems (by product) support reporting and compliance
3. The hub allows the single customer master to be shared to all product systems
4. Early detection of data issues
5. They know their complete exposure to clients, whereas before it was piecemeal and incomplete
6. Reconciliation of transformed data to GL metrics
7. Managers consuming information and providing constant feedback for improvements
8. A unified customer view... for all its other benefits

I generalized from many MDM implementations and presented "Top 10 Mistakes Companies Make in Forming Data Governance." They are (in no particular order):
1. Not Translating IT Investments into Business Objectives
2. Thinking of it as a Technical Function
3. Scope Creep
4. A Revolving Door of Membership and Participation
5. No Decision Maker
6. Failure to Create a Charter
7. Turning Governance into the Blame Game
8. Lack of Customization to the Culture
9. Thinking of it as "Just Meetings"
10. Hyperfocus on a Tactical Issue

Technorati tags: Master Data Management, Business Intelligence, MDM Summit


Posted October 24, 2008 9:18 AM

I made it out for a day on Monday to the MDM Summit in New York. The conference has picked up some from years past. According to the organizers, case studies are the draw, so the conference had quite a few of them. RR Donnelley (using Purisma) presented a great case study because they have followed some best practices, such as:

1. Knowing BI & MDM go hand-in-hand
2. Focusing on MDM when combining 3 large organizations to form RR Donnelley
3. They didn't pick the technology first, but grew into it
4. Somebody there had the wisdom to declare early that MDM must be minimally invasive to the source systems, and RR Donnelley followed through on that principle
5. They used the D&B DUNS number for identifying (B2B) customers
6. They built in capability for (what I call) master data query
7. Data governance and stewardship

They use the Registry model for MDM.
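For readers who haven't seen the Registry style, here is a minimal sketch of the idea in Python. This is my own illustration, not RR Donnelley's or Purisma's implementation; the class, the field names and the use of the DUNS number as the match key are assumptions made for the example. The point of the style is that only keys and cross-references are centralized, while the attribute data stays in the source systems.

```python
# Illustrative registry-style cross-reference: the registry stores only keys
# and match information; the attribute data stays in the source systems.
from collections import defaultdict

class CustomerRegistry:
    def __init__(self):
        # master key (here the D&B DUNS number) -> list of (source system, local id)
        self.xref = defaultdict(list)

    def register(self, duns, source_system, local_id):
        """Link a source-system customer record to its master (DUNS) key."""
        self.xref[duns].append((source_system, local_id))

    def lookup(self, duns):
        """Master data query: return pointers to every source record for a customer."""
        return self.xref.get(duns, [])

# Three systems from the combined organizations hold the same B2B customer.
registry = CustomerRegistry()
registry.register("123456789", "ERP_A", "CUST-0042")
registry.register("123456789", "CRM_B", "98765")
registry.register("123456789", "BILLING_C", "AC-1001")
print(registry.lookup("123456789"))
# [('ERP_A', 'CUST-0042'), ('CRM_B', '98765'), ('BILLING_C', 'AC-1001')]
```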

The "ROI" from the effort was in sales reporting, reduced manual work in reviewing customer names, and knowing their exposure to companies who were/are potentially going under in the challenging economy.

The last best practice was to use outside implementation services. I know of one that can help there.

Technorati tags: Master Data Management, merger, Business Intelligence, MDM Summit


Posted October 24, 2008 8:54 AM

Acquisitions are all around us these days, both through the natural movements of the market and through activity that is becoming increasingly necessary because of the business downturn. This can double, or more than double, the number of customer records that need to be managed. Duplicates will undoubtedly be a challenge in this process, and it would be unheard of for this situation not to create data conversion issues.

I used to advocate that IT be a part of due diligence in M&A. With the urgent nature of much of the recent M&A activity, that is not happening. IT must assess the master data issues in an M&A and take appropriate action. Just getting the application layer together (i.e., by integrating ERP) is not enough. The data layer is equally important, if not more so, to enable answers to questions like these (all for the combined entity):

Who are the customers?
Who are the most/least profitable customers?
What customers are shared by the pre-merge companies?
How do I reform my sales staff to address the customers?
What suppliers are common to the pre-merge companies and what is the total spend with them?
What is my total exposure to each customer and supplier?
How do I reform my vendor management?

Some will turn to a “neutral” source such as D&B for keying the customers at this point; other times it’s appropriate to create a new surrogate key for the customers. Either way, physical co-habitation inside a database and true integration of the customer lists are a must, and M&A is a good time for MDM.
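To make that concrete, here is a minimal sketch, in Python, of post-merger customer keying under the two approaches: use the D&B key where one is known, otherwise mint a surrogate. The field names and the crude exact-name matching rule are assumptions for illustration only, not a recommended matching algorithm.

```python
# A sketch of post-merger customer keying: prefer a "neutral" D&B DUNS number
# where one is known, otherwise assign a new surrogate key. Field names and
# the exact-name matching rule are simplifications for illustration.
from itertools import count

_surrogate = count(start=1)

def assign_master_keys(customers):
    """customers: dicts with 'name', 'source' and an optional 'duns' from both companies."""
    master = {}      # master key -> consolidated record
    name_index = {}  # crude duplicate detection on normalized name
    for cust in customers:
        name = cust["name"].strip().upper()
        key = cust.get("duns") or name_index.get(name)
        if key is None:
            key = f"MK{next(_surrogate):06d}"  # no DUNS and no match: new surrogate key
        name_index.setdefault(name, key)
        record = master.setdefault(key, {"names": set(), "sources": []})
        record["names"].add(cust["name"])
        record["sources"].append(cust["source"])
    return master

combined = assign_master_keys([
    {"name": "Acme Corp", "duns": "804735132", "source": "CompanyA.CRM"},
    {"name": "ACME CORP", "duns": "804735132", "source": "CompanyB.ERP"},
    {"name": "Beta Industries", "source": "CompanyB.ERP"},
])
print(combined)  # Acme collapses to one DUNS-keyed record; Beta gets a surrogate key
```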


Posted October 8, 2008 11:45 AM

Two concepts that must go together are Master Data Management and Data Quality. One reason why is that, no matter that you are calling it MDM, which is supposed to carry some cachet, to many people in the organization it's just another place where customer (or other subject area) master data is going to be maintained. In this "free market," where the MDM data store is about the 25th such store to hold master data, it had better be the best store for master data.

It must be up to date, able to take on syndicated data, and free of all intolerable defects across the spectrum of referential integrity, uniqueness, cardinality, subtype and supertype rules, value reasonability, consistency, formatting, data derivation, completeness, correctness and conformance to a clean set of values.
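A few of those rule types are easy to picture in code. The sketch below is my own illustration of uniqueness, completeness and conformance-to-valid-values checks against a hypothetical customer master; the field names and rules are assumptions, not a complete data quality regimen.

```python
# Minimal sketch of three of the defect checks named above, run against a
# hypothetical customer master. Field names and rules are illustrative only.
VALID_COUNTRIES = {"US", "DE", "GB"}  # conformance to a clean set of values

def profile_master(records):
    issues = []
    seen_ids = set()
    for rec in records:
        # Uniqueness: no duplicate customer identifiers
        if rec["customer_id"] in seen_ids:
            issues.append(("duplicate_id", rec["customer_id"]))
        seen_ids.add(rec["customer_id"])
        # Completeness: required attributes are populated
        if not rec.get("legal_name"):
            issues.append(("missing_legal_name", rec["customer_id"]))
        # Conformance: country code drawn from the approved value set
        if rec.get("country") not in VALID_COUNTRIES:
            issues.append(("invalid_country", rec["customer_id"]))
    return issues

print(profile_master([
    {"customer_id": "C1", "legal_name": "Acme Corp", "country": "US"},
    {"customer_id": "C1", "legal_name": "", "country": "XX"},
]))
# [('duplicate_id', 'C1'), ('missing_legal_name', 'C1'), ('invalid_country', 'C1')]
```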

Poor data quality in MDM, the most leverageable of the master data stores and the one from which master data will propagate throughout the organization, will not provide a foundation that management will support, and it will hurt the project more than just about anything else.

On the positive side, if the MDM hub can provide and propagate high-quality master data, that will almost surely deliver a unique, high-value proposition to the organization.


Posted October 6, 2008 3:53 PM

I'll be speaking on "MDM ROI and Justification" at the MDM Summit this Sunday, March 30, from 6:30 to 8:00 in the night school program at the San Francisco Hilton.

Why would you want to do such a thing? No, I don't mean come to the session; I mean MDM itself. Come to find out. The top six frameworks for MDM justification will be presented. Link.


Posted March 25, 2008 9:51 AM

Leave it to Microsoft to make technology inexpensive and easy. Quite possibly (link), Office 14 will begin to include the Stratature Master Data Management (MDM) acquisition. Microsoft has helped to demystify data warehousing and data mining and now begins to do the same with MDM.

For the MDM market, this is a partial validation and will expose MDM to a wider audience. Those who are opposed to Microsoft philosophically, or who don't believe it will scale, will go elsewhere for their tool.

Tools are a start, but they can obscure the need to address data quality, still the largest impediment to information management success.

Technorati tags: MDM, Microsoft, Office 14, Stratature


Posted September 27, 2007 1:31 PM

Link to article. I guess my entry from May 24 was timelier than I thought. This really legitimizes master data management as a force. Stratature had not made great strides, but it does have a nice complement of the requisite MDM functionality that I discuss in my full-day MDM course, including the hub, publish/subscribe, and modeling facilitation. Although Stratature hasn't made short-lists in my recent MDM strategies with Fortune clients, I expect its presence to increase now.

I expect more midsize companies to now get involved in MDM and, over time, for the price points for enterprise MDM software to settle.


Posted June 9, 2007 11:19 AM

What’s on my mind tonight is where companies are putting their enterprise master data, and I do think ‘enterprise’ is the key word here. It takes proactive planning, and a dedicated project, to build true enterprise master data in any environment – even those with a heavy ERP footprint that may have done some level of ‘mastering’ there already.

1. In the data warehouse – This is the place where most master data comes together today, due to the proliferation of data warehouses and the intense focus during their build on creating a master, shared (conformed) view of each business dimension. Usually most of this master data is actually mastered in the operational environment, but it doesn’t come together or get integrated until it reaches the data warehouse. If you were consciously trying to fix master data problems within your operational environment, you would prefer to pull the data together in the operational environment and make the data warehouse just another recipient of that data.
2. In pieces throughout the operational environment – This is leaving the master data in place throughout the operational environment, identifying it as such and arranging for access to that data from wherever it is needed. This virtualization strategy can create performance issues upon query and can also limit integration possibilities. Finally, you are adding functionality to environments that were not built for it.
3. In a new hub in the operational environment – This is creating a new hub and actually collecting the master data there. You still go through the identification activities as with virtualization, but you physically instantiate the master data separate from its origins. This approach can be taken a level further by making the hub the system of entry in addition to the system of record (a brief sketch of this option follows below).

In reality, even if you take a proactive approach, the third option could be years in the making for ALL of an organization’s master data. However, it is always worth striving for the best infrastructure and achieving it in bits over time. It may also be completely appropriate to stop at some point and tier master data throughout the various strategies.
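Here is a minimal sketch of that third option in Python: a physical hub that holds the golden record, acts as the system of entry, and publishes changes to subscribing systems. The class, subscriber interface and attribute names are my own assumptions for illustration, not any particular product's design.

```python
# A sketch of the third option: a physical hub that holds the golden record and
# pushes changes out to subscribing systems. Subscriber interface and attribute
# names are assumptions made for illustration.
class MasterDataHub:
    def __init__(self):
        self.records = {}      # master key -> current golden record
        self.subscribers = []  # downstream systems that receive changes

    def subscribe(self, callback):
        """Register a downstream system's handler for published changes."""
        self.subscribers.append(callback)

    def upsert(self, key, attributes):
        """Hub as system of entry: changes land here first, then propagate."""
        current = self.records.setdefault(key, {})
        current.update(attributes)
        for notify in self.subscribers:
            notify(key, dict(current))  # publish the updated golden record

hub = MasterDataHub()
hub.subscribe(lambda key, rec: print("ERP received", key, rec))
hub.subscribe(lambda key, rec: print("DW received", key, rec))
hub.upsert("C100", {"legal_name": "Acme Corp", "segment": "Manufacturing"})
```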

Technorati tags: Master Data Management, CDI


Posted May 19, 2007 6:07 PM

I was recently posed a good set of questions by Dan Lindstedt on MDM. I thought I’d share my quippy answers with you.

- Where do you ‘master’ the master data? Strategy: MDM hub feeding operational systems and the DW
- How do you model master data? Strategy: Like a dimensional model’s dimensions; hierarchical
- What master data do you distribute? Strategy: Break all master data into ‘subject areas’ and distribute full subject areas (changed data only); a brief sketch follows this list
- Does 3rd party data constitute master data? Strategy: Absolutely. If it’s not transactional and it helps to explain the subject, it’s master data.
- What style of MDM should we use? Strategy: It depends, but usually a hub has had the best value proposition so far.
- Use a tool or do homemade MDM? Strategy: It depends, but the decision should usually be made later in the program than it typically is
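As a small illustration of the "changed data only" distribution strategy above, the sketch below compares the current snapshot of a subject area to what was last distributed and ships only the new or changed records. The function, subject area and field names are assumptions for the example.

```python
# A sketch of distributing a subject area with changed data only: diff the
# current snapshot against what was last sent and ship just the delta.
# Subject area and attribute names are assumed for the example.
def changed_records(last_sent, snapshot):
    """Both arguments map master key -> attribute dict for one subject area."""
    delta = {}
    for key, attrs in snapshot.items():
        if last_sent.get(key) != attrs:  # new or changed since the last distribution
            delta[key] = attrs
    return delta

last_sent = {"C100": {"legal_name": "Acme Corp", "segment": "Manufacturing"}}
snapshot = {
    "C100": {"legal_name": "Acme Corporation", "segment": "Manufacturing"},
    "C200": {"legal_name": "Beta Industries", "segment": "Retail"},
}
# C100 changed and C200 is new, so both are distributed; unchanged rows are not.
print(changed_records(last_sent, snapshot))
```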

Technorati tags: Master Data Management, CDI


Posted March 30, 2007 8:42 AM

I spoke at the CDI Institute conference Monday and this question was on the minds of the attendees, many of whom were just starting off their MDM program.

Defining ownership as the entity that actually comes up with the rules for the sourcing, quality and presentation of the data, as opposed to the entity that actually builds the rules into the systems, my answer is Data Stewardship. Specifically, it's the business data stewards, who represent the business rules to the IT/consulting build team for MDM. I have written extensively elsewhere about stewardship, but it is essential to MDM success. Hopefully those stewardship programs that were built for data warehousing can carry over to the MDM extensions many are now planning for their information management environments, as evidenced by the conversations I had on Monday.

Technorati tags: Master Data Management, CDI, Data Stewardship


Posted March 29, 2007 1:24 PM