Blog: William McKnight

William McKnight

Hello and welcome to my blog!

I will periodically share my thoughts and observations on information management here in the blog. I am passionate about the effective creation, management and distribution of information in support of company goals, and I'm thrilled to be a part of my clients' growth plans and to connect what the industry provides to those goals. I have played many roles, but the perspective I come from is benefit to the end client. I hope these entries can be of some modest benefit toward that goal. Please share your thoughts and input on the topics.

About the author

William is the president of McKnight Consulting Group, a firm focused on delivering business value and solving business challenges utilizing proven, streamlined approaches in data warehousing, master data management and business intelligence, all with a focus on data quality and scalable architectures. William functions as strategist, information architect and program manager for complex, high-volume, full life-cycle implementations worldwide. William is a Southwest Entrepreneur of the Year finalist, a frequent best-practices judge, has authored hundreds of articles and white papers, and given hundreds of international keynotes and public seminars. His team's implementations from both IT and consultant positions have won Best Practices awards. He is a former IT Vice President of a Fortune company, a former software engineer, and holds an MBA. William is author of the book 90 Days to Success in Consulting. Contact William at wmcknight@mcknightcg.com.

Editor's Note: More articles and resources are available in William's BeyeNETWORK Expert Channel. Be sure to visit today!

May 2007 Archives

I found this interesting when reviewing the specifications for NeoView. The specification states: "HP is responsible for platform management and 24x7 monitoring, support and incident analysis. This support is provided remotely and securely by HP's Global Customer Support Centers using HP Instant Support Enterprise Edition (ISEE). Upgrades or repair of software and hardware, as warranted, are provided by HP." To my knowledge, this type of support, while offered for the other common data warehouse platforms, is not as prominently touted or widely adopted there, whereas with NeoView it appears to be standard.

Otherwise, I found a lot of what has come to be standard - if I can use that word already - in the appliance world, though obviously with some different underlying technology. I think data warehouse appliances have a strong future as data volumes grow out of control, business intelligence is added to operational functions, and the data warehouse needs to be optimized as a massive, historical data store.

I've written 2 white papers on data warehouse appliances, which can be found in the white paper section of my B-Eye Channel.


Posted May 31, 2007 7:13 PM

Denmark is actually becoming an interesting area for business intelligence development, led by the consulting firm Platon. I just wanted to give some visibility to a couple of vendor products I encountered while there last week at the Information Management Conference 2007 in Copenhagen.

Omikron is a data quality vendor, providing functionality not unlike Trillium or FirstLogic (Business Objects) in the States – data enrichment, postal address correction, duplicate checking, sanction-list screening and so forth. However, implementing many of these functions with US-based tools has limitations for European data, since address structures vary considerably from country to country. Omikron, based in Germany, specializes in European data quality.
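To make the duplicate-checking and country-aware address ideas concrete, here is a minimal, hypothetical sketch of how normalization rules per country can feed a fuzzy match between two customer records. It is plain Python for illustration only, not Omikron's engine or API, and the field names, abbreviation rules and threshold are my own assumptions.

```python
# Hypothetical sketch of country-aware duplicate checking; not Omikron's API.
from difflib import SequenceMatcher

# Per-country normalization rules are the point: a US-centric tool that only
# understands "123 Main St" layouts will miss German "Musterstrasse 12" forms.
ABBREVIATIONS = {
    "DE": {"str.": "strasse", "straße": "strasse"},
    "US": {"st.": "street", "ave.": "avenue"},
}

def normalize(record):
    """Lower-case and expand country-specific abbreviations in the address."""
    country = record["country"]
    address = record["address"].lower()
    for short, full in ABBREVIATIONS.get(country, {}).items():
        address = address.replace(short, full)
    return record["name"].lower().strip(), address.strip()

def likely_duplicates(a, b, threshold=0.85):
    """Flag two records as probable duplicates via simple fuzzy similarity."""
    name_a, addr_a = normalize(a)
    name_b, addr_b = normalize(b)
    score = (SequenceMatcher(None, name_a, name_b).ratio()
             + SequenceMatcher(None, addr_a, addr_b).ratio()) / 2
    return score >= threshold

r1 = {"name": "Hans Mueller", "address": "Musterstr. 12", "country": "DE"}
r2 = {"name": "Hans Müller", "address": "Musterstrasse 12", "country": "DE"}
print(likely_duplicates(r1, r2))  # True for these near-identical records
```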

I found timeXtender interesting also. It’s a framework for fast development of a SQL Server 2005 data warehouse environment. timeXtender makes the process of defining sources to SQL Server, bringing the data into a dimensional model and building basic cubes easier than native SQL Server 2005. I saw this environment built before my eyes. Their motto is “From source to cube – in no time.”
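For readers unfamiliar with what "bringing the data into a dimensional model" involves, here is a rough sketch of the kind of source-to-star-schema plumbing a tool like timeXtender automates. This is ordinary Python over in-memory rows, not timeXtender's actual interface or generated SQL, and all the table and column names are invented for illustration.

```python
# Illustrative only: a tiny source-to-dimensional-model load with invented
# names; a tool like timeXtender generates this kind of plumbing for you.
source_rows = [
    {"order_id": 1, "customer": "Jensen A/S", "city": "Copenhagen", "amount": 120.0},
    {"order_id": 2, "customer": "Jensen A/S", "city": "Copenhagen", "amount": 75.5},
    {"order_id": 3, "customer": "Nordisk Handel", "city": "Aarhus", "amount": 240.0},
]

dim_customer = {}   # natural key -> surrogate key
dim_rows = []       # the customer dimension table
fact_sales = []     # the fact table, keyed by surrogate key

for row in source_rows:
    natural_key = row["customer"]
    if natural_key not in dim_customer:
        surrogate = len(dim_customer) + 1
        dim_customer[natural_key] = surrogate
        dim_rows.append({"customer_key": surrogate,
                         "customer_name": natural_key,
                         "city": row["city"]})
    fact_sales.append({"customer_key": dim_customer[natural_key],
                       "order_id": row["order_id"],
                       "amount": row["amount"]})

# A "basic cube" is then little more than pre-aggregating facts by dimension key.
cube = {}
for fact in fact_sales:
    cube[fact["customer_key"]] = cube.get(fact["customer_key"], 0) + fact["amount"]

print(dim_rows)
print(cube)   # {1: 195.5, 2: 240.0}
```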

Technorati tags: Microsoft, SQL Server, Omikron, timextender, Data Quality


Posted May 29, 2007 5:46 AM

At the Information Management Conference 2007 in Copenhagen this week, Lars Monrad-Gylling, CEO of KMD, a consulting firm that has just completed Denmark's largest IT and data management project, shared a unique motivational idea he used on the subprojects. The status of every project was broadcast widely, including on a BIG SCREEN in the cafeteria, with the name of the responsible project leader and the appropriate green (smiley), yellow (neutral) or red (frowny) face icon next to the name. They also sent news to the press on the status of each project. Talk about incentive to stay on track.

Technorati tags: Data Warehouse, Project Management


Posted May 26, 2007 10:42 AM

At the Information Management Conference 2007 in Copenhagen this week, Marin Bezic, SQL Business Intelligence product manager, Microsoft EMEA, provided the following timetables:

PerformancePoint Server:
April 2007: CTP2
Summer 2007: CTP3
2H2007: GA

SQL Server “Katmai”:
June ’07 (at TechEd US): Public disclosure
Every 60 days thereafter: Public CTPs
2008: GA

The themes for Katmai are mission critical platform, dynamic development and end-to-end business insight.


Posted May 25, 2007 9:21 AM

At the Information Management Conference 2007 in Copenhagen yesterday, in response to the question “Will we see Master Data Repository and Data Quality tools from Microsoft?”, Marin Bezic, SQL Business Intelligence product manager for Microsoft EMEA, replied, after a pause, “Yes.”

Something to look forward to.

Technorati tags: Microsoft, SQL Server, Master Data Management, Data Quality


Posted May 24, 2007 1:26 PM

What’s on my mind tonight is where companies are putting their enterprise master data, and I do think ‘enterprise’ is the key word here. It takes proactive planning, and a dedicated project, to build true enterprise master data in any environment – even those with a heavy ERP footprint that may have done some level of ‘mastering’ there already.

1. In the data warehouse – This is where most master data comes together today, due to the proliferation of data warehouses and the intense focus, during their construction, on building a master, shared (conformed) view of each business dimension. Usually most of this master data is actually mastered in the operational environment, but it doesn't come together or get integrated until it reaches the data warehouse. If you were consciously trying to fix master data problems within your operational environment, you would prefer to pull the data together there and make the data warehouse just another recipient of that data.
2. In pieces throughout the operational environment – This means leaving the master data in place throughout the operational environment, identifying it as such and arranging for access to that data from wherever it is needed. This virtualization strategy can create performance issues at query time and can also limit integration possibilities. And finally, you are adding functionality to environments that were not built for it.

3. In a new operational hub – A third option is to create a new hub in the operational environment and actually collect the master data there. You still go through the same identification activities as with virtualization, but you physically instantiate the master data separately from its origins. This approach can be taken a level further by making the hub the system of entry in addition to the system of record; a rough sketch of this consolidation pattern appears below.
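As a sketch of what the hub in this third option actually does, the following hypothetical Python fragment consolidates customer records from two source systems into a single golden record using a simple survivorship rule (most recently updated non-empty value wins). The attribute names, systems and rule are my own assumptions for illustration, not any particular MDM product.

```python
# Hypothetical master-data hub consolidation: merge source records into one
# golden record per customer using a "latest non-empty value wins" rule.
from datetime import date

source_records = [
    {"customer_id": "C100", "system": "CRM", "updated": date(2007, 3, 1),
     "name": "Acme Corp", "phone": "555-0100", "segment": ""},
    {"customer_id": "C100", "system": "ERP", "updated": date(2007, 5, 2),
     "name": "ACME Corporation", "phone": "", "segment": "Manufacturing"},
]

def consolidate(records):
    """Build one golden record per customer_id from all source records."""
    golden = {}
    for rec in sorted(records, key=lambda r: r["updated"]):
        target = golden.setdefault(rec["customer_id"],
                                   {"customer_id": rec["customer_id"]})
        for attr in ("name", "phone", "segment"):
            if rec[attr]:                 # survivorship: latest non-empty wins
                target[attr] = rec[attr]
    return golden

print(consolidate(source_records))
# {'C100': {'customer_id': 'C100', 'name': 'ACME Corporation',
#           'phone': '555-0100', 'segment': 'Manufacturing'}}
```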

In reality, even if you take a proactive approach, the third option could be years in the making for ALL of an organization’s master data. However, it is always worth striving for the best infrastructure and achieving it in bits over time. It may also be completely appropriate to stop at some point and tier master data throughout the various strategies.

Technorati tags: Master Data Management, CDI


Posted May 19, 2007 6:07 PM

I have from time to time brought selective performance enhancements into client situations. I even wrote a white paper about one of them, RightOrder. SAP NetWeaver BI Accelerator has come in and out of my life several times. It's an analytic engine that performs its queries entirely in memory. It actually has quite a broad base of deployment, is shipped as an appliance (HP or IBM with Linux), and I think it's worth considering in selective situations. It's a BI product for SAP NetWeaver. There are features of the product that add a lot of value, like the high compression, enabled by the column-wise ordering of the data, which I'll discuss below.

There is also horizontal partitioning across multiple machines, which enables the solution to get beyond previous in-memory size limitations. It's optimized for various common data types and has different algorithms for different data types. I don't know exactly how much of the query performance improvement comes from this, but I suspect it's quite a bit. Architecturally, they also put more memory in the cache – another performance improvement technique.
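A bare-bones illustration of the horizontal partitioning idea, with invented data and no relation to the actual BI Accelerator internals: split a column's values across several in-memory partitions, aggregate each partition independently (as separate blades would, in parallel), and then combine the partial results.

```python
# Toy illustration of horizontal partitioning for in-memory aggregation;
# not how BI Accelerator is actually implemented.
revenue = list(range(1, 1_000_001))          # one "column" of a million values
num_partitions = 4                           # pretend these live on 4 blades

partitions = [revenue[i::num_partitions] for i in range(num_partitions)]

# Each partition computes its partial aggregate independently; a real system
# would do this in parallel across machines, here we just loop.
partial_sums = [sum(p) for p in partitions]
total = sum(partial_sums)                    # combine step

print(total == sum(revenue))                 # True: same answer, split workload
```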

The column-wise ordering of data allows it to achieve very high compression because all of a column's values are physically stored together. It also provides excellent performance when you select a small subset of the columns in a table, since you do not perform I/O for data that is not needed. Column orientation greatly assists a compression strategy due to the high likelihood of similar values appearing in the same column across adjacent rows of the table. It's ideal for columnar functions like SUM, COUNT, MIN, MAX and AVG.
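To show why column-wise ordering helps both compression and limited-column queries, here is a small illustrative sketch with invented data (not SAP's storage format): storing each column separately lets runs of repeated values collapse under run-length encoding, and a SUM over one column never touches the others.

```python
# Toy column store: illustrates why column-wise ordering aids compression
# and single-column aggregates; not SAP's actual implementation.
rows = [
    {"region": "EMEA", "product": "A", "units": 10},
    {"region": "EMEA", "product": "A", "units": 4},
    {"region": "EMEA", "product": "B", "units": 7},
    {"region": "APAC", "product": "B", "units": 2},
]

# A row store keeps whole rows together; a column store keeps each column together.
columns = {name: [r[name] for r in rows] for name in ("region", "product", "units")}

def run_length_encode(values):
    """Adjacent equal values compress well when a column is stored contiguously."""
    encoded = []
    for v in values:
        if encoded and encoded[-1][0] == v:
            encoded[-1][1] += 1
        else:
            encoded.append([v, 1])
    return encoded

print(run_length_encode(columns["region"]))   # [['EMEA', 3], ['APAC', 1]]

# A columnar SUM reads only the "units" column; no I/O on region or product.
print(sum(columns["units"]))                  # 23
```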

So the shoe fits when you need a specialist data mart and the performance of limited-column queries, especially columnar functions, is the overriding selection factor.

Technorati tags: SAP, Netweaver, Business Intelligence


Posted May 9, 2007 5:06 PM