Blog: William McKnight

William McKnight

Hello and welcome to my blog!

I will periodically be sharing my thoughts and observations on information management here in the blog. I am passionate about the effective creation, management and distribution of information for the benefit of company goals, and I'm thrilled to be a part of my clients' growth plans and connect what the industry provides to those goals. I have played many roles, but the perspective I come from is benefit to the end client. I hope the entries can be of some modest benefit to that goal. Please share your thoughts and input to the topics.

About the author

William is the president of McKnight Consulting Group, a firm focused on delivering business value and solving business challenges utilizing proven, streamlined approaches in data warehousing, master data management and business intelligence, all with a focus on data quality and scalable architectures. William functions as strategist, information architect and program manager for complex, high-volume, full life-cycle implementations worldwide. William is a Southwest Entrepreneur of the Year finalist, a frequent best-practices judge, has authored hundreds of articles and white papers, and given hundreds of international keynotes and public seminars. His team's implementations from both IT and consultant positions have won Best Practices awards. He is a former IT Vice President of a Fortune company, a former software engineer, and holds an MBA. William is author of the book 90 Days to Success in Consulting. Contact William at wmcknight@mcknightcg.com.

Editor's Note: More articles and resources are available in William's BeyeNETWORK Expert Channel. Be sure to visit today!

Recently in Market Category

I have received some questions on my article “Information Management and the Financial Meltdown,” so I thought I’d address them here. The article was written in September, after the meltdown of Fannie Mae and Freddie Mac and Lehman Brothers’ bankruptcy filing. AIG had suffered a liquidity crisis but had not yet received the government loans to come. Goldman Sachs and Morgan Stanley had yet to be converted to bank holding companies, Washington Mutual had yet to be seized, Wachovia had yet to be acquired, etc., etc. Now we see the crisis spreading to the auto industry, and the airline industry will probably be front and center eventually. In other words, it was, and continues to be, a moving target.

It is really difficult to tell the depth of the deleveraging and decoupling that the world economy will go through. The economy is wound up pretty tight and must let out the built-up pressure. Questions remain about the approach and the timing, but there is no avoiding the pain that has occurred, and will continue to occur.

One point I made is that financial companies were motivated to get mortgages out the door and that they sold off their toxicity. This was true, but why were they motivated as such? Some point to the Community Reinvestment Act of 1977, which required institutions to lend to the less qualified. The Riegle-Neal Act of 1994 compounded the CRA’s effect by rewarding banks with high CRA scores with the ability to bank across state lines. The behavior was compounded further in 1999 by Gramm-Leach-Bliley, which allowed banks to combine investment and commercial operations.

There was also incentive to take undue risks once the firms went public and executive accountability was diluted, the executives becoming minority interests in their own entities. This started with Salomon Brothers, important in Citi’s heritage, going public in 1981. While I’m at it, the rating agencies’ presentation of their business intelligence left something to be desired. And home equity loans at over 100% of value, combined with a real estate downturn, tossed more toxicity on the fire.

Another point is that the mortgages were put into complex packaging, which business intelligence did not keep up with. So, in the context of business intelligence, did the financial companies know what they were buying? I think business intelligence has some room to grow there, as pointed out in the article. A better question may be: did they care? In some respects they did, but in other respects business intelligence was relegated to secondary consideration, given that the institutions were not incented purely by profitability and good business. As I said, “full visibility into exposure and liquidity is going to be a must.” Visibility, and rewarding only good business, are part of the “executive sponsorship” I mention as required.

I had an MBA professor who went through some of the early lineage above with his students and predicted a dire outcome. I took his notes (early 1990s) and extrapolated the more recent events for this entry. Many probably could have seen this coming, but when times are good, nobody wants to stop the music. Executive sponsorship and business intelligence will be critical to mending the markets as painlessly as possible.

What are your thoughts?

Technorati tags: data, Business Intelligence, financial crisis, Information Management, Community Reinvestment Act, Gramm Leach Bliley


Posted November 30, 2008 5:34 PM
Permalink | 5 Comments |

I had a chance to review Lyza a few times in the last couple of weeks – both before and after its launch on Sept. 22. The biggest reason I like it is that I found immediate applicability to both a client situation and a personal situation. In other words, I’ve actually used it. Perhaps another reason, by way of disclosure, is that I know and like the team at Lyzasoft and know their goal of providing a strong value proposition to the market. The extent of the focus groups that went into the product’s development is amazing.

With dynamic connections to the underlying data sets you define to Lyza (refreshed with a click), I find that it extends the functionality of the desktop. Think of it as providing a functional way of enabling joins and analysis across file types. You will mostly use this with Excel, Word, Access and text files. You pick the cell where the connection starts, or the range of cells that defines the connection; it does not have to be the entire file. Then again, there will also be enterprise uses for its ODBC/JDBC connectivity, which is probably its ultimate destination.

My favorite feature is the ability to put all these data types on an equal footing and establish the joins.
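Lyza itself is point-and-click, but for those who think in code, here is a rough pandas analogue of the idea – a sketch only, with hypothetical file names, ranges and join key, not Lyza's actual API:

```python
# A rough pandas analogue of joining across file types the way Lyza does.
# File names, ranges and the join key are hypothetical.
import pandas as pd

# Start the "connection" at a range rather than the whole file:
# skip three header rows and read only columns B through E.
orders = pd.read_excel("orders.xlsx", sheet_name="Q3", skiprows=3, usecols="B:E")

# A tab-delimited text extract from another system.
customers = pd.read_csv("customers.txt", sep="\t")

# Put both sources on an equal footing and establish the join.
joined = orders.merge(customers, on="customer_id", how="inner")
print(joined.head())
```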

It also has the ability to store data that you may want to derive from the underlying data sources in its own (column-oriented) data store. So, in effect, Lyza itself can become one of the data stores used in the analysis. And you can publish complex worksheets that contain the logic, drawn from the underlying files, to determine sales commissions, vendor rankings, promotion effectiveness, etc. Worksheets can also effectively be a data set and connect dynamically. The metadata makes tracking your way back very easy.
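As a stand-in for Lyza's internal store (which this is not), the same idea can be sketched with Parquet, a column-oriented file format, continuing the example above with hypothetical column names:

```python
# Derive a summary from the joined data and persist it columnar-style,
# so the derived set can itself become a source for further analysis.
derived = joined.groupby("region", as_index=False)["amount"].sum()
derived = derived.rename(columns={"amount": "total_sales"})

derived.to_parquet("regional_sales.parquet")   # column-oriented storage
reloaded = pd.read_parquet("regional_sales.parquet")
```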

Though not yet at the level of a Tableau Software in terms of charts and display options, the conditional logic, rich function library and ability to subdivide a data set (e.g., the first 10,000 rows, a random 500 rows) make it pretty rich for a version one.
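For reference, those two subdivisions map to one-liners in the pandas sketch above (assuming the joined frame has at least that many rows):

```python
first_10k = joined.head(10000)                      # the first 10,000 rows
random_500 = joined.sample(n=500, random_state=42)  # a random 500 rows
```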

Unlike more complex tools that follow the gather-requirements, develop-out-of-sight, launch-to-users model, many Lyza applications can be developed in front of the user, or by the user.

Lyza doesn’t categorize easily, but I think it’s going to find a fit in the large gap between Excel capabilities and data integration – the lair of the true business analyst. With its quasi-EII capabilities to understand source data from its metadata, Lyza fits the unstructured nature of the analyst’s work in a modern, heterogeneous corporate information environment.


Technorati tags: lyza, lyzasoft, Business Intelligence


Posted September 27, 2008 2:33 PM
Permalink | No Comments |

In a very strategic move for Microsoft’s enterprise goals, they have just announced the purchase of data warehouse appliance vendor DATAllegro!

While Microsoft has significantly expanded SQL Server’s scale over the past few years, the perception has been that its limits fall somewhere below those of the “big guys,” Oracle and IBM. And wherever you believe SQL Server’s scalability has grown to, the scale of Microsoft solutions now undoubtedly goes beyond 100 terabytes. This is the scale that many, myself included, believe accessible data management capabilities need to reach in order to manage the future of telecommunications, retail, healthcare and other transactions and make them available.

Look for Microsoft, and others like myself, to publish reference architectures and guidance on the changeover point from SQL Server to DATAllegro (or should we start calling it Microsoft MPP?), as well as on integration points.

I have found Microsoft’s integration of its acquisitions to be well above average in terms of making the most of the acquired products. There are too many data appliance vendors, and DATAllegro was caught up in that frenzy. It has now found its way to a long-term appliance play.

The open source DBMS that DATAllegro was using, Ingres, will be scrapheaped over time and replaced by SQL Server. This will take some time, but Microsoft has that. Its customers can now see a plan in action, and that will hold them over for a while. Many customers have settled into the “Microsoft zone” of pricing, which is more than open source (duh), but less than its big competitors. Look for DATAllegro, likewise, to sit at the low (but not “no”) end of the cost points for its capabilities.

Congratulations to the respective teams.


Posted July 24, 2008 1:25 PM
Permalink | No Comments |

I was looking forward to this presentation. However, I must admit, with the plethora of appliance vendors who have hit the market lately and made their way onto client short-cum-long lists, I was more than happy to dismiss NeoView if this data point did not move the story forward several paces. But Greg Battas addressed NeoView's lack of market penetration and its 'soft roll out' up front. They spent a full year with customers before the announcement in 2007.

HP, as a company, was losing big deals to IBM and Oracle since those two had full suites. Back in 2004/2005, Tandem (now part of HP) had built an earlier form of NeoView but ultimately didn’t go to market with it because they didn’t want to compete with Oracle. That's not an issue now.

The first place to test NeoView was at HP itself, where they have, according to Greg, shut down 500 internal databases in a consolidation project.

HP still lacks in the data access space. Obviously, they were looking at BO and Cognos, just as SAP and Oracle did. They are working closely with Ab Initio for ETL, although their philosophy is less 'load and analyze' and more 'ingest and do things inline.' The philosophy, supposedly manifested in the architecture, is very Operational BI-centric.

NeoView is meant to be a "Teradata killer." However, as Greg pointed out, the road is littered with those who claimed to be "better than Teradata" and still, there's Teradata.

Technorati tags: Business Intelligence, Independent Analyst Platform, HP, NeoView


Posted July 14, 2008 7:44 AM
Permalink | 1 Comment |

What I liked about the Kalido presentation was that a demonstration was given (that worked) in a short amount of time. I think Bill Hewitt, CEO, has an excellent grasp of the market. As with many of the products at the IAP, I have worked with Kalido.

The modeling tool is very intuitive and graphical, a contrast to ERwin, which is entrenched in our culture. You can download it for free at www.kalido.com/bmcf. Kalido also has a community (http://groups.google.com/group/bmcf), where you can find a number of pre-built models and join the discussion about business-model-driven BI.

In addition to the modeling tool, Kalido has its Dynamic Information Warehouse (to me, the back end of the modeling tool: the implementation side), as well as the Universal Information Director and a Master Data Management tool.

Technorati tags: Business Intelligence, Independent Analyst Platform, Kalido


Posted July 10, 2008 8:06 AM
Permalink | No Comments |

Composite Software is rolling out their EII appliance, called the Composite Discovery Appliance. It's for enterprise search and utilizes indexes and discovered relationships in the data sources that you train it on. The appliance is actually a blade. Composite's software is embedded in some of the more popular large-vendor stack EII products in the market.

Composite attempts differentiation from enterprise search vendors (e.g., Fast and Endeca) through its focus on structured data versus their focus on unstructured data.

This is EII and subject to all the pros and cons of that method.

One thing I really liked about this presentation was that they shared the price! Sure, most of the vendors have good and interesting technology, but it's all only worthwhile at a price.

The Composite Discovery Appliance is $150,000 for full use or you can pay $7,500 up front plus $4,000 per month.
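For comparison's sake, a quick break-even check on those two options (a rough calculation, ignoring discounting and any contract terms beyond the figures above):

```python
# When does pay-as-you-go catch up to the full-use price?
full_use = 150000
upfront, per_month = 7500, 4000

months = (full_use - upfront) / per_month
print(months)  # 35.625 -> the subscription passes the full-use cost at ~36 months
```

So the monthly plan overtakes the one-time price somewhere around the three-year mark.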

I find it interesting that data access is now getting wider. Different methods are emerging with ample, but not obscure, access capabilities. Whereas data access had been an inch wide and a mile deep, now I see some vendors (e.g., Composite, IBI) balancing that out in the marketplace.

Technorati tags: Business Intelligence, Independent Analyst Platform, Composite Software


Posted July 8, 2008 7:53 AM
Permalink | No Comments |

Kevin Quinn gave a convincing presentation about WebFocus’ relevance-cum-dominance in the “super multi user” (my words) space. How do they do this? Part of it is the deals they craft and their focus on enterprise licensing. Part of it is the architecture of the product: it’s multi-threaded, native rather than ODBC, the WebFocus server can be put on any platform, and it has load balancing and failover capabilities. There is a white paper on why they scale available on the Information Builders web site; I have read it and recommend it. Finally, the "active reports" feature of Information Builders – highly interactive reporting delivered to Office products, with an easy method of distribution – helps enable the product's scaling.

If “pervasive BI” takes off, IBI could see glory days ahead. One wonders if they’re really well-classified as “business intelligence” (i.e., Gartner Magic Quadrant, etc.) since they really serve a different space – quantity users – and can co-exist with analytical and power users.

Also, IBM distributes WebFocus on the System i platform now, and it’s more than doubling WebFocus’ reach. Apparently a nice deal for IBI.

As for iWay, IBI’s other product line, it has 300 adapters and has always been a leader there. This is some behind-the-scenes stuff, but iWay also sells directly; WebFocus is its “biggest OEM.”

Technorati tags: Business Intelligence, Independent Analyst Platform, Information Builders


Posted July 7, 2008 12:23 PM
Permalink | No Comments |

I’ve noticed a general creep of the term “mashup” into vendor vernacular. However, I wouldn’t necessarily agree that the buzz is the result of actual tool implementations of mashups. In the “traditional” sense, a mashup is a web application that combines data from multiple sources. Some of the better products in this space are Kapow and Denodo.
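If it helps to be concrete, a mashup in that traditional sense boils down to something like the following sketch; both endpoints are hypothetical placeholders, not real services.

```python
# A minimal "traditional" mashup: combine two web data sources into one view.
# Both URLs are hypothetical, purely for illustration.
import requests

stores = requests.get("https://example.com/api/stores").json()
weather = requests.get("https://example.com/api/weather").json()

# Key the second source by zip code and annotate the first with it.
forecast_by_zip = {w["zip"]: w["forecast"] for w in weather}
for store in stores:
    store["forecast"] = forecast_by_zip.get(store["zip"], "unknown")
```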

I’m at the IAP, which Shawn Rogers has been busily blogging about, receiving numerous product walkthroughs and presentations. At this pace, “mashup” may reach its crest quicker than the other buzzwords in my blog title did. And that’s saying something. Remember when everything was CRM (until the term fell out of favor)? Or business intelligence (hey, wait, that’s still true)? Over time, these terms became, or will become, so meaningless as to really demand definitions and scrutiny at the outset of any conversation.

May I suggest that if you’re combining things, you’re “combining things”, not necessarily doing mashups?

Technorati tags: mashup


Posted July 1, 2008 5:26 PM
Permalink | No Comments |

Other than the fact that by the time I get to the word “service,” half the time I say “solution” instead, what about the viability of SaaS?

First, to level-set: SaaS refers to that set of solutions that are ‘housed’ offsite at the location of the vendor who actually developed the application. If a vendor is hosting all or most of a company’s software, well beyond those applications the vendor built, that is really an outsourcing relationship and falls under different rules. These two approaches are seldom compatible, since the cost of adding applications to an outsourcing relationship is usually additive and the efficiencies are gained from having more applications at the outsourcer.

Back to SaaS: the three hallmarks, or selling points, are (1) no IT involvement, (2) pay-as-you-go with little upfront cost and (3) the vendor takes all responsibility for infrastructure and upgrades – those invasive and non-value-added activities. The perception of SaaS is lower costs, speed to market and no IT. Interest is growing across a range of applications at my clients, and much of SaaS is designed to fit within a departmental budget.

Here are some rules of thumb for the consideration of SaaS solutions, BI or otherwise:

1. Check the value proposition of the application.

Of course, this applies to any application, SaaS or otherwise. However, it’s worth mentioning that there should be some bottom-line business benefit, at some level, to doing the application in the first place.

2. Ensure scalability

I’m not referring to the expansion of disk, etc.; that’s assumed to be taken care of by the vendor. I’m referring to the ability to expand the core functionality of the application beyond what is provided out of the box, which becomes less interesting after the first 60 days.

3. Don’t sabotage IT plans

IT may actually have a way to provide the business the functionality it needs at a lower cost and in a manner that is congruent with the direction of the core infrastructure of the company. Notice I say the functionality that the business needs - not exactly the functionality of the vendor product.

On the other hand, if IT wants the business, it should be continually performing top-down analysis of the technical environment and demonstrating progress toward the ability to support the business requirements. If there’s no plan, the business is not sabotaging the plan, now is it?

At the end of the day, the general situation where SaaS seems to make the most sense is an SMB shop where the data sources are somewhat standard (i.e., popular) and the data is not overly sensitive. I am quite optimistic that SaaS will make numerous inroads in the coming years.

Technorati tags: software as a service, SAAS


Posted May 21, 2008 8:18 AM
Permalink | 2 Comments |

Greenplum's current main strategy is its bundling with Thumper, a product line from Sun Microsystems.

Network Appliance has sued Sun Microsystems for intellectual property infringement, specifically related to ZFS, the zettabyte file system that Thumper utilizes.

Dave Hitz, Founder and Executive Vice President of Network Appliance, explains their position here, and Jonathan Schwartz, CEO of Sun Microsystems, replies in his blog here.

Triangulating all of this, the suit could be a problem for Greenplum.


Posted March 5, 2008 9:07 AM
Permalink | No Comments |
