Blog: Ronald Damhof

I have been a BI/DW practitioner for more than 15 years. In the last few years, I have become increasingly annoyed - even frustrated - by the lack of (scientific) rigor in the field of data warehousing and business intelligence. It is not uncommon for the knowledge worker to be disillusioned by the promise of business intelligence and data warehousing, because vendors and consulting organizations create their "own" frameworks, definitions, super-duper tools, etc. What the field needs is more connectedness (grounding and objectivity) to the scientific community. The scientific community, in turn, needs to realize the importance of increasing its relevance to the practice of technology. For the next few years, I have decided to attempt to build a solid bridge between science and technology practitioners. As a dissertation student at the University of Groningen in the Netherlands, I hope to discover ways to accomplish this. With this blog I hope to share some of the things I learn in my search and begin discussions on this topic within the international community. Your feedback is important to me. Please let me know what you think.

Copyright 2010

Mon, 27 Sep 2010 23:59:26 -0700
We need to be flexible - Do we? Really?

In 1967, Thompson wrote about the administrative paradox: a dichotomy in which continuity (stability) and flexibility sit at opposite ends of the spectrum. In other words: be flexible and, at the same time, try to progressively eliminate or absorb uncertainty. This paradox can also be discussed in terms of time: in the short run, an administration seeks to reduce uncertainty; in the long run, it strives for flexibility.

Nothing new I hope? Now, what about Information Systems...

In using information systems we also need to deal with this paradox. We tend to use information systems to automate tasks, formalize sequences of events, kill flexibility (;-)). An information system can be interpreted as a 'bureaucrat in an electronic version' (Checkland and Holwell, 1998).

So, what do we do? We tend to modularize information systems and integrate them via services that are, of course, strongly decoupled from each other. IT delivers and supports all kinds of business functions, and with a brilliant Service Oriented Architecture we cross the bridge between function and business process. We can now change the business processes if demand requires it.

Yee - we=happy. we=flexible again. Easy huh?

NO, it is not easy. It can be a blank check you write to your 'partners' - the System Integrators - and it may take years before you capitalize on the investment that has been made. And in the process you tend to demotivate your own personnel (or customers) big time.

My point: the balance between stability and flexibility is sometimes totally lost in organizations. Some architects and many vendors/solution providers are pushing the flexibility agenda big time nowadays, but the 'why' of flexibility has never been fundamentally discussed with(in) top management. The 'why' should be related to the industry you're in and the strategy with which you wish to approach the market. For example, I firmly believe that many government agencies should focus on stability over flexibility. But unfortunately, they do not seem to agree with me. What also needs to be considered is that stability and flexibility are interconnected: more focus on flexibility will diminish your stability, and vice versa. Accept collateral damage if your architecture is centered entirely around 'being flexible'; and if you want both, well, expect to pay a price ;-)

Even if the case for flexibility is made, the 'how' should be considered extremely carefully. Is flexibility in business processes needed (a hard question)? Or is flexibility in data sufficient (a huge difference in terms of attainability, costs and organizational impact)? The latter can at least partly overcome the Administrative Paradox....

Change always comes bearing gifts
A story.....
  • Vendor X sells its ERP to a company in Healthcare;
  • Client wishes to setup its informational environment (data sharing, BI, CPM etc..) right from the start;
  • Vendor X pushes the 'standard solution' they sell;
  • Client decides to decouple their informational environment from its source(s) for several reasons (heterogeneous sources, sustainability, compliance, adaptability etc..);
  • Vendor X deploys their ERP;
  • Client starts to design and build the informational environment;
  • Interfaces between ERP of vendor X and the informational environment are developed;
  • The ERP of vendor X does not offer functional interfaces ('X keeps pushing their standard product'), so the client needs to connect at the physical level;
  • Go-live is near, for both the ERP and the new informational environment.

And then change management of vendor X regarding the ERP kicks in.

Client: 'What's your release schedule for patches'?
X: 'Every 2 weeks' 
Client: 'Huh'?

Client thinks: 'Damn, how can I keep up with this change schedule?'

Client: 'Well, can you tell me anything regarding the average impact of these patches?'
X: 'Well, they can be very small and very big' 

Client thinks: 'Ok, what are you NOT telling me' 

Client: 'Ok, but this ERP is like 15 years old, so give me an overview of the average impact'
X: 'Basically anything can happen'

Client thinks: 'o, o'

Client: 'Ok, but the majority of these changes are of course situated in the application layer, not the data layer?'
X: 'Well..anything can happen.'

Client thinks: 'Is it warm in here?'

Client: 'Anything? Also in the data layer? Table changes, integrity changes, domain type changes, value changes?'
X: 'Aye'

Client thinks: 'Ok - I'm dead'

Client: ' least tell me that existing structures always remain intact and the data remains auditable - extend instead of replace, for example'
X: 'Huh'?

Client thinks: 'Well, at least I am healthy...'

Client: 'hmm...just a side note, we use Change Data Capture, I assume that these changes are fully logged?'
X: 'Nah - log is turned off, otherwise we can't deploy the changes' 

Client thinks: ' my resume up to date?'

My point: do not assume that your vendor (of any system) engages in professional application development, or has a change management policy that takes into account the simple fact that the data in these information systems needs to be shared with other information systems in your company.

Change management and professional application development need to be important criteria in the selection of information systems.
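To make the 'extend instead of replace' request from the dialogue concrete, here is a minimal sketch (Python with sqlite3; the patient table and column names are invented for illustration) of a patch that extends an existing structure rather than replacing it, so downstream interfaces keep working:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()

# Table as shipped with release 1 of the (hypothetical) ERP.
cur.execute("CREATE TABLE patient (id INTEGER PRIMARY KEY, name TEXT)")
cur.execute("INSERT INTO patient VALUES (1, 'Jansen')")

# 'Extend instead of replace': the patch adds a column but leaves the
# existing structure and data intact.
cur.execute("ALTER TABLE patient ADD COLUMN birth_date TEXT")

# Old consumers can still run their original query unchanged...
rows = cur.execute("SELECT id, name FROM patient").fetchall()
assert rows == [(1, 'Jansen')]

# ...while new consumers see the extended structure.
cols = [c[1] for c in cur.execute("PRAGMA table_info(patient)")]
assert cols == ['id', 'name', 'birth_date']
```

A replace-style patch (dropping and recreating the table with different columns) would have silently broken the first query, and any interface or Change Data Capture process reading that table.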

Tue, 08 Jun 2010 14:29:39 -0700
Collaboration software - fluff?
Business Intelligence vendors seem to embrace collaboration (I am still struggling with whether this software is any different from the groupware we had in the 90's). As an example, take a look at SAP StreamWork on YouTube. I am gonna be blunt here: this type of software is completely useless, unless the organization is willing to fundamentally change its decision-making process.

Let me try to make my point here with the help of giants like Galbraith, Daft, Davenport and others.

There are basically two information contingencies: uncertainty and equivocality.
  • Uncertainty can be defined as the absence of information (e.g. Shannon and Weaver) and can be overcome by simply asking the right question. The answer is out there.....
  • Equivocality is ambiguity: the existence of multiple and conflicting interpretations of an organizational situation. Participants are not even sure about the questions that need to be asked, let alone the answers they need. I think these can also be regarded as 'wicked problems'.
Now, for overcoming uncertainty, relatively blunt instruments suffice. Reporting and the ever-increasing possibilities of analytics really shine in reducing uncertainty.

Now, for overcoming equivocality, Business Intelligence instruments like reporting and even analytics are of diminishing use. You need more 'richness' in the tooling, and by tooling I don't necessarily mean software. Examples of richer tooling are group meetings, discussions, planning, creative (group) thinking, etc. Simply put: you need face-to-face contact.

Davenport wrote an article, 'Make Better Decisions', in the Harvard Business Review in 2009. He advocates a more formalized approach to decision making:

'Smart organizations can help their managers improve decision making in four steps: by identifying and prioritizing the decisions that must be made; examining the factors involved in each; designing roles, processes, systems, and behavior to improve decisions; and institutionalizing the new approach through training, refined data analysis, and outcome assessment.'

Davenport, in my opinion, is aiming at equivocality and a more formalized method of coming to an outcome. And frankly, I like it a lot. But organizations need to be really willing to change their decision-making process, and that is a major organizational and cultural change in my opinion. If organizations are really committed to making this change (Davenport names a few such companies, like Chevron and The Stanley Works), collaboration software has the potential to shine in supporting such a decision-making process.

I am, however, afraid that collaboration software from BI vendors will be sold as candy with the promise of better decisions. And that is just bullshit, and my prediction is that it will fail big time.

Tue, 30 Mar 2010 12:17:36 -0700
Outsourcing DSS is not the same as OLTP

What we all knew was true, but could not get across to management, is now more scientifically proven: the decision process regarding the outsourcing of a DSS is influenced by significantly different characteristics compared to that of OLTP. If you are interested in the details, the theory and the underlying data, please read:

Factors considered when outsourcing an IS system: an empirical examination of the impacts of organization size, strategy and the object of a decision (DSS or OLTP).

B. Berg and A. Stylianou, European Journal of Information Systems (2009) 18, 235-248.

I still encounter organizations that are stuck in the OLTP world, even when the object of the outsourcing decision is completely different on many dimensions. They tend to use the same outsourcing decision process as they always did... whether they outsource an ERP, a CRM system, a data warehouse or a more elaborate BI system.

Mon, 08 Feb 2010 07:47:39 -0700
Disruption incoming?

I have just read a very intriguing paper called 'A Common Approach for OLTP and OLAP using an In-memory Column Database', written by Hasso Plattner.

It's not a revolutionary new technical approach to Data Warehousing and Business Intelligence. It's rather a series of smaller (mostly technical, some even quite old) innovations that together could lead to a paradigm shift [1] in the area of Data Warehousing.

The paper focuses on the transactional world, because that's where the disruption will originate. In short:

  • Ever-increasing numbers of CPU cores
  • Growth of main memory
  • Column databases for transactions (!)
  • Shared-nothing approach
  • In-memory access to current data - historical data on slower devices (or not)
  • Zero-update strategies in OLTP (recognizing the importance of history as well as the importance of parallelism)
  • Not in the paper, but I see data models for newly built OLTP systems increasingly resembling the data models of the hub in the data warehouse architecture.
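To give a rough feel for why columnar, insert-only storage favours analytics, here is a small illustrative sketch in plain Python (the data and layout are made up; a real column store adds compression, vectorized execution and much more):

```python
# Row layout: one record per transaction (typical OLTP).
rows = [
    {"id": 1, "region": "EU", "amount": 100},
    {"id": 2, "region": "US", "amount": 250},
    {"id": 3, "region": "EU", "amount": 175},
]

# Column layout: one list per attribute (what a column store keeps in memory).
columns = {
    "id":     [1, 2, 3],
    "region": ["EU", "US", "EU"],
    "amount": [100, 250, 175],
}

# An analytical scan only touches the columns it needs -
# here one contiguous list instead of every row object.
total = sum(columns["amount"])
assert total == 525

# Zero-update / insert-only: a 'change' appends a new version instead of
# overwriting, preserving history and avoiding update contention.
versions = [(1, "amount", 100, "2009-01-01"),
            (1, "amount", 120, "2009-06-01")]
current = max(versions, key=lambda v: v[3])[2]
assert current == 120
```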

Sat, 12 Dec 2009 04:13:40 -0700
Tied selling......let's take a stand

Lately I have come across vendors of OLTP systems that engage heavily in tied selling. Now - if there is complete transparency to the customer - this is of course no problem. But, sadly, this transparency is usually extremely poor or even non-existent.

This tied selling, combined with a (complete) lack of transparency, has caused severe damage to the economy, as we have seen especially in the financial sector. In Europe, the European Commission is highly active in forcing vendors to stop tied selling (for example, Internet Explorer and Microsoft's OS). Vendors receive huge fines if they engage in these practices.

Unfortunately these practices are also quite mainstream in the data exploitation industry.

Tue, 20 Oct 2009 04:54:29 -0700
Data is and will always be the ultimate proprietary asset

Yep - bringing up an oldie, but still oh so true.

Financial assets: a single euro can only be spent once.....

Human capital: they can just walk out of the door and leave, or be hunted by your competitor.

Not so with your data: it is uniquely yours, closely connected to your business language, full of nuance, in the proper context, and able to procreate new data. Every organization can use its data differently to create its own niche. When data is used by anyone in your organization, it is not consumed and it does not expire (a patent will). There is simply no other asset anyone can think of that has these capabilities.

To make the case even stronger: the movement of data in your organization is your lifeblood. A company will simply cease to exist if this data is of bad quality. Orders will not get through, customers' credit information cannot be seen, and invoices cannot be sent.

Another fascinating attribute of data is that it defines the other assets: financial data, property data, employee data. Badly managed data will jeopardize the management of the other assets big time. So it's a kind of 'super', ultimate asset.

Nicholas Carr wrote 'IT Doesn't Matter' in the Harvard Business Review in 2003. Systems, infrastructure, services, hardware, software, ERP... whatever... they will become, and probably already are, commodities. They are not a differentiating factor in competitive advantage, sustainability or reliability. Data, however, is always unique; competitors cannot get hold of it (well... unless you have a security problem), and if they did, it would probably be hard to use, since data is often very closely linked to the organizational lingua franca.

Data must be nurtured, managed aggressively and - most of all - exploited to the max in order to get a good return. It is by far the ultimate proprietary asset.

I believe that businesses nowadays spend huge amounts of money on very sophisticated technology, financial controls and process management (the process taliban). The marginal returns from ever-increasing investments in these developments will be small. Investments in data are quite the opposite: the opportunities are limitless. And, more importantly, any investment you make is proprietary to your organization. This is the true definition of competitive advantage.

This post is inspired by, and sometimes quotes, T. Redman and his book 'Data Driven'.

Thu, 27 Aug 2009 07:05:59 -0700
Data Vault in the Netherlands

In July 2007 I had a first acquaintance with Dan Linstedt - via LinkedIn. I worked for a large government organisation as the architect for Data Warehousing and Business Intelligence. I seriously considered Data Vault as the standard methodology for the Enterprise Data Warehouse. I worked for over 2 years for this government organisation, and my choice for the Data Vault is still a very good one.

In November 2007 I got Dan Linstedt to the Netherlands and had the first 27 people of that same government organisation trained and certified. I had dozens of discussions with architects, management, developers, etc., but in the end the consensus on choosing Data Vault was huge.

Since November 2007, Data Vault has skyrocketed in the Netherlands. Together with the Genesee Academy and DNV, I think we certified over 200 people, and I dare to say that most new EDW projects in the Netherlands nowadays have chosen Data Vault as the prime methodology.

Before I get criticised for sanctifying one methodology despite the various (non-)functional requirements: I do believe that ANY architecture needs to be in line with the business, application and technical architecture. So choosing one methodology beforehand is just plain wrong. There has to be sound justification and alignment for any choice in your architecture, whether it's a tool, a methodology or whatever.

In 2 years we (Genesee Academy, me, DNV) certified over 200 people - most of them extremely seasoned and respected experts in our field. On average these consultants are very critical of any new 'trick in the book'. I dare to say that most of the experts who were certified were convinced that Data Vault has extreme value and potential. The proof of the pudding is in the eating: Data Vault got many implementations in the Netherlands over the last two years, and it's growing.

I am confident that Data Vault in the years to come will hit Europe as hard as it has hit the Netherlands.

I wonder...... and I am still baffled by it. The attention for Data Vault in the United States is extremely limited; why is that? Even a respected institute such as TDWI is not paying much attention. Why? Experts in the field do not mention Data Vault whatsoever. I just don't get it. Can anybody enlighten me?

For those of you not familiar with Data Vault --> read my articles.
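For readers who only need a rough feel for the model: Data Vault separates business keys (hubs), relationships between keys (links) and descriptive, historized attributes (satellites). The sketch below (Python; the entity and attribute names are invented, and this is an illustration, not a normative Data Vault specification) shows the idea:

```python
from dataclasses import dataclass
from datetime import date

# Hub: the business key only, plus load metadata.
@dataclass(frozen=True)
class HubCustomer:
    customer_bk: str        # business key
    load_date: date
    record_source: str

# Link: a relationship between hubs - again only keys plus metadata.
@dataclass(frozen=True)
class LinkCustomerOrder:
    customer_bk: str
    order_bk: str
    load_date: date
    record_source: str

# Satellite: descriptive attributes, historized by load_date.
@dataclass(frozen=True)
class SatCustomerDetails:
    customer_bk: str
    load_date: date
    record_source: str
    name: str
    city: str

hub = HubCustomer("C-042", date(2009, 8, 18), "CRM")
link = LinkCustomerOrder("C-042", "O-1001", date(2009, 8, 18), "ERP")

# History is kept by adding satellite rows, never by updating them.
history = [
    SatCustomerDetails("C-042", date(2009, 8, 18), "CRM", "Jansen", "Groningen"),
    SatCustomerDetails("C-042", date(2009, 9, 1), "CRM", "Jansen", "Utrecht"),
]
latest = max(history, key=lambda s: s.load_date)
assert latest.city == "Utrecht"
```

Because changes only ever add hub, link or satellite rows, the model absorbs new sources and new attributes without restructuring what is already loaded.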

Tue, 18 Aug 2009 09:00:03 -0700
Back to basics: 'EIS revisited'

In the 'Neverland' of decision support, Business Intelligence has completely taken over all the attractions.

Let's name a few: BI, Neuro BI, Ambient BI, Sentient BI, Process-driven BI, Process Intelligence, Pervasive BI, BI 2.0, BI as a Service, BI for SOA, SOA for BI, Mobile BI, Google BI, Personal BI, BI-Tools-where-you-do-not-need-a-DWH-and-ETL-Buy-it-now, Mashups-the-new-BI-Desktop (oh my god), BI-in-the-box, Business Analytics, BI-in-the-cloud (love this one!), BI-virtualization, Agile-BI, Operational-BI, Decision Intelligence, Decision Management, Event-driven analytics, Complex Event Processing, BAM, Collaborative Decision Making......

Do not get me wrong; I certainly do not dismiss them all.....just most of them :)

Most of these 'attractions' are highly sponsored and look super-duper on the outside, and if you don't sit in them you will lose out big time (so they say). However, when you finally decide to ride them, you feel that they are not that stable, not really safe, and there is hardly any enjoyment when you exit them (but you can tell your neighbour you dared to ride it!). To put it in other words: not really grounded in theory, and the relevance for practice is extremely hard to find.

There are, however, some attractions in this Decision Support 'Neverland' that are very dusty, spiderwebs all over the place, but still extremely solid; if we would overhaul them with new architectural insights and technology, they could be a smash hit. These attractions are named DSS, EIS, ESS....

I feel that we - as an industry - have failed miserably in continuing on the path laid out for us by people like John Dearden, John Rockart, David DeLong, Ralph Sprague, Hugh Watson, Steven Alter, Daft and Lengel, Peter Keen, Michael Scott Morton, Herbert Simon, Henry Mintzberg and many more.

Dan Power is one of those brave souls standing at the ticket box, selling tickets for his 'attraction'. I recommend reading his latest article, as well as the blog post written by Wouter van Aerle.

Finally, I wanna contribute to the Decision Support 'Neverland': BI goes Retro

Tue, 30 Jun 2009 06:54:54 -0700
Information Richness and Business Intelligence

In 1986, Daft and Lengel published an article in Management Science - a very interesting article that combines the processing of information by decision makers, the use of structural mechanisms and organizational design. The latter - organizational design - I will not discuss thoroughly in this blog. Just remember that the choice of structural mechanism to overcome uncertainty/equivocality will impact the organizational design.

Thu, 25 Jun 2009 02:16:29 -0700
A response to 'The flaws of the classic data warehouse' (3)

It is only by means of good and respectful discussion that knowledge and insight will evolve. This post should be regarded as such.

This post is a third reaction to the first article in a series of three, written by a highly respectful thought leader in the field and publisher on the B-Eye-Network: Rick van der Lans. The papers are titled 'The Flaws of the Classic Data Warehouse Architecture'.

This blog post is a reaction to the first part, which deals with the flaws of the classic data warehouse architecture (CDWA).

Rick signals five flaws, which will lead in articles two and three to a new architecture. This post addresses the third flaw.

- My reaction to flaw #1 can be read here
- My reaction to flaw #2 can be read here

Flaw 3 according to Rick
Rick signals the need to do analytics on external data and on datastores that are unstructured. I quote Rick: 'Most vendors and analysts propose to handle these two forms of data as follows: if you want to analyze external or unstructured data, copy it into the CDW to make it available for analytics and reporting'. Rick wonders why: unstructured data can be 'handled' at the source, and external data can be handled by mashup tools.

My reaction to flaw 3
Where are these vendors and analysts who propose to copy unstructured data into the CDW? I do not know them... really, I don't. And if they exist, I agree with Rick: don't do it. Especially for unstructured data, I think other architectural choices are more optimal at the moment. But where is the flaw in the CDWA? The CDWA was not meant for unstructured data and still is not. I still do not see the flaw...

But for external data, I really believe that in the years to come there is still a solid business case for getting this data into your data warehouse. Sure enough - especially for more situational BI - mashups offer a very fast time to market for new informational products. But I believe the security issue is not to be underestimated, nor is the need to perform analytics on combinations of internal and - multiple sources of - external data.

Mashups also need solid architectures.......

- I challenge the notion that vendors and analysts in data warehousing massively propose to put unstructured data in the DWH. The CDWA was not meant for that purpose. Not a flaw.
- There are solid business cases for getting your external data into your DWH. The CDWA is still a valid approach. Not a flaw.
- Mashups - like new (BI) technology in general - surely offer new features and promising functionality.

Mon, 22 Jun 2009 05:03:03 -0700
Data warehousing is failure prone......or is it not?

I often criticize vendors and others for not being thorough enough. Now it's time to criticize science a bit, as well as those consultants and analysts who are 'riding the wave' of negative sentiment surrounding data warehousing.

In several papers I am reading at the moment (ranging from MIS Quarterly, Information & Management and Decision Support Systems to many more journals), I encounter something similar.

Please read the following quotes:

"....the road to DW success has been littered with failures [43,63,80]"
"....nearly half of all DW initiatives end up as failures [38]"
"According to a 2005 press release by Gartner: through 2007, more than 50% of data warehouse projects will have limited acceptance, or will be outright failures."

The above quotes come from one paragraph of one paper in Decision Support Systems (the paper is from 2008 - very recent), which is an important journal in our field of expertise. Since I encounter these statements over and over again, I decided to follow up on the citations.

Let me begin by dismissing the quote referring to Gartner's press release. That's not really a sound scientific basis..... So we are left with two more quotes.

[43] C. Hall, Corporate Use of Data Warehousing and Enterprise Analytic Technologies, Arlington, Massachusetts, 2003.
Ronald: cannot validate the information. You need to buy a report......

[63] S. Kotler, When enterprises hit the open road: move beyond the silos and let the ideas roll, Teradata Magazine 3 (3), 2003.
Ronald: when I read this, it becomes shocking. Let me quote the article:

"According to a recent article in Information Week, "41% of all companies surveyed by the Cutter Consortium, an IT consultant and market analysis firm, have failed data warehouse projects, and only 15% call their data warehousing efforts to date a major success."

Ok.... this citation is actually referencing the first one [43] (I think... I can't access it). We've got a loop here....

[80] M. Quaddus, A. Intrapairot, Management policies and the diffusion of DWH: a case study using dynamics-based decision support systems, Decision Support Systems 31 (2), 2001, 223-240.
Ronald: ok, this is becoming complex. I quote this paper:

"..quite a few DW projects end up in failure even before full implementation owing to lack of immediate substantial economic returns on massive investment [24,25,67]"

So a citation is used in a paper that refers to a claim in another paper, but with different citations (you guys still with me here?). So we end up with three more citations, which are of course even further back in time. Let's examine them:

[24] R. Hackathorn, Data warehousing energises your enterprise, Datamation 41 (2), 1995, 38-42.
- I could not find this article, so I was not able to validate the claim being made. It sounds to me like some sort of column, though. But again - cannot validate.

[25] C. Horrock, Making the Warehouse Work, 1996, available from http://www.computerworld.comrsear . . . -htm1r9606r960624DW1SL96dw10.html.
- I could not find this article, so I was not able to validate the claim that was being made.

[67] The Siam Commercial Bank's Staff, Data Warehouse Questionnaire and Interview, (6 January-28 February 1998). Personal Communication.
- I could not find this 'article', so I was not able to validate the claim that was being made.

[38] L. Greenfield, The Data Warehousing Information Center, December 19, 2003.
Ronald: this is a link to an entire site....... oh my, how on earth can you make a reference to a whole site??

To summarize: I was not able to establish any (empirical) evidence supporting the claim made in the DSS paper (and in several other papers as well) - which is, by the way, quite a recent paper (2008). Somehow we are being made to believe that data warehousing is failure-prone. Increasingly I encounter consultants and analysts who fuel this negative sentiment surrounding data warehousing. I challenge anyone to deliver some real (empirical) evidence. As for now, I suggest we all use caution in communicating that data warehouse undertakings tend to fail a lot.

On the subject, there is one interesting paper from TDWI, written by Hugh J. Watson (I think in 2006), that seems relevant and hits the nail on the head by saying, and I quote:

"The data suggests that whether data warehouses are failure-prone depends on one's definition of "failure." Varying with the architecture implemented, there is approximately a 32-47 percent chance that a warehouse will be behind schedule, and a 30-43 percent chance that it will be over budget. However, this does not mean that the warehouse will not succeed. By a more global measure of success, only 10-20 percent of warehouses are potentially in trouble, while the others are either up-and-coming systems or runaway successes"

Ronald: And yes, Watson is using an empirical basis.

Mon, 22 Jun 2009 03:24:11 -0700
A response to 'The flaws of the classic data warehouse' (2)

It is only by means of good and respectful discussion that knowledge and insight will evolve. This post should be regarded as such.

This post is a second reaction to the first article in a series of three, written by a highly respectful thought leader in the field and publisher on the B-Eye-Network: Rick van der Lans. The papers are titled 'The Flaws of the Classic Data Warehouse Architecture'.

This blog post is a reaction to the first part, which deals with the flaws of the classic data warehouse architecture (CDWA).

Rick signals five flaws, which will lead in articles two and three to a new architecture. This post addresses the second flaw.

- My reaction to flaw #1 can be read here.

Flaw 2 according to Rick
The CDWA stores a lot of redundant data, and the more redundant the data, the less flexible the architecture. We could simplify our data warehouse architectures considerably by getting rid of most of the redundant data. Hopefully, new database technology on the market, such as data warehouse appliances and column-based database technologies, will decrease the need to store so much redundant data. Rick commented on this flaw in his closing keynote at a BI event we had last week, stating basically that DWH professionals did an extremely lousy job in recent decades building these redundancy monsters. As in his article, he strengthened this argument with research by Nigel Pendse claiming that the average BI application needs only a fraction of the stored (redundant) data.

My reaction to flaw 2
First of all, I agree that new technologies can limit the volume of redundant data considerably.

But to say that in recent decades data warehouse professionals did an extremely lousy job because of the huge redundancy they created in their data warehouses... well, that's just plain stupid, and to the people applauding this statement I would like to say: 'I bet you never actually built a data warehouse'.

BI populism..... that's what it is.

As for the flexibility argument - more redundant data kills flexibility - that's a bit of a bs argument, because flexibility is not affected by redundant data alone. If I had built my data warehouses in recent decades without redundant data, I would have ended up with hugely complex transformation rules and a big strain on processing capacity. Both issues would have killed flexibility big time, and I am leaving aside the degradation of performance, ease of use, maintainability and testability of the system. But I agree - I would not have had redundant data... I would not have had any quality of service either... but who cares.

BI populism..... that's what it is.

But is the CDWA flawed by this redundancy problem? I do not think so at all. We would still need a datastore of some kind (Rick seems to acknowledge that by advocating the use of appliances), and we would still have several layers after this datastore, preparing the data for different functionalities (reporting, mining, advanced analytics, data sharing with third parties, etc.). Take the datamart layer: will it disappear? I don't think so. The question is whether it needs to be materialized, and that's where new technology will be extremely valuable. It seems that Rick is translating 'architecture' 1:1 into 'technical architecture'.

The hub-and-spoke architecture of the CDWA model is still extremely valid. Of course, technology within this architecture will evolve and will enable us to deliver an even better quality of service.
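The materialization question can be illustrated with a toy sketch (Python with sqlite3; the table and mart names are made up): the same datamart can be virtual - a view, with no redundant storage - or materialized - a redundant copy that trades storage for scan performance. The choice is a quality-of-service trade-off, not an architectural flaw.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()

# The central datastore (hub) holds the detailed facts once.
cur.execute("CREATE TABLE sales (region TEXT, amount INTEGER)")
cur.executemany("INSERT INTO sales VALUES (?, ?)",
                [("EU", 100), ("EU", 50), ("US", 200)])

# Virtual datamart: no redundant data, computed on demand.
cur.execute("""CREATE VIEW mart_sales_by_region AS
               SELECT region, SUM(amount) AS total
               FROM sales GROUP BY region""")

# Materialized datamart: a redundant copy, but a cheap scan at query time.
cur.execute("""CREATE TABLE mart_sales_by_region_mat AS
               SELECT region, SUM(amount) AS total
               FROM sales GROUP BY region""")

virtual = cur.execute(
    "SELECT * FROM mart_sales_by_region ORDER BY region").fetchall()
materialized = cur.execute(
    "SELECT * FROM mart_sales_by_region_mat ORDER BY region").fetchall()

# Same datamart, same answers - only the storage strategy differs.
assert virtual == materialized == [("EU", 150), ("US", 200)]
```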

Sun, 14 Jun 2009 03:57:06 -0700
A response to 'The flaws of the classic data warehouse' (1)
It is only by means of good and respectful discussion that knowledge and insight will evolve. This post should be regarded as such. Furthermore, I understood from a good friend that Rick meant to be controversial with these papers.....

This post is a first reaction to the first article in a series of three, written by a highly respectful thought leader in the field and publisher on the B-Eye-Network: Rick van der Lans. The papers are titled 'The Flaws of the Classic Data Warehouse Architecture'.

This blog post is a reaction to the first part, which deals with the flaws of the classic data warehouse architecture (CDWA) according to Rick. If you want to know what exactly constitutes a CDWA, I would suggest reading this first part.

Rick signals five flaws, which in articles two and three lead to a new architecture. This post addresses the first flaw. In upcoming posts on this blog I will address the other four, and I will also respond to the solution he is proposing.

Flaw 1 according to Rick
The CDWA does not support the concept of Operational Business Intelligence. This conclusion is drawn from the fact that the CDWA cannot include 100% up-to-date information. Rick concludes that we have to remove storage layers and minimize the copy steps.

My reaction to flaw 1
A metaphor: I am driving my car and suddenly I say 'damn, I wanna fly'. Looking at my car, I cannot seem to find the 'fly' button, and I therefore conclude that my car is flawed.

Although a bit of a corny metaphor, it reflects the core of my criticism. Apparently there is a new requirement called Operational Business Intelligence* that cannot be served by the existing architecture. Is the existing architecture then flawed? I do not think so. Does the existing architecture fit the needs of the organisation? I do not think so. So flaw 1, in my opinion, is not a flaw; it might simply be a poor fit between requirement and architecture.

Let's take this corny metaphor one step further. Suppose there is a genuine need for me to fly (e.g. 100% up-to-date information for decision-like processes*). Is it then considered common sense to build wings on my car and put in a jet engine? I wouldn't... I would just buy a plane ticket and get to an airfield, or maybe I would use a substitute to achieve my objectives... the train.

To conclude: requirements are evolving and architecture needs to follow. The data warehouse architecture depicted as a hub-spoke model is still valid for its intended use (although the design is evolving). New requirements can lead to new choices in architecture (and subsequently in design).

Although I do not agree on the flaw issue, I do agree that new requirements can require a new architecture, which - in the end - is exactly what Rick is proposing (although I do not agree completely with this new architecture either, but let's keep that for a next post).

* As you can see, I am avoiding the tedious discussion regarding the term Operational Business Intelligence. I am also avoiding the so-called 'fact' that all organizations need 100% up-to-date information for decision-like processes.

]]> Wed, 10 Jun 2009 13:20:18 -0700
Disruptive innovation - to be or not to be
Vendors, but also the analysts (and I see a trend...), are increasingly using the term 'disruptive' for new products, new technologies or whatever. And lately it has started to get to me, simply because - most of the time - there is no basis at all for calling something 'disruptive'. It inflates the term... big time.

'So what', I hear you say. Well, there is of course not much of a problem when a vendor defines its own technology or product as being 'disruptive'. I know where it comes from, and I understand the vendor's wish to increase its turnover by claiming to sell a disruptive technology/product/etc.

But when analysts do it, I get more suspicious and sometimes extremely annoyed. It is the analyst who needs to be neutral, a bit restrained and of course critical. The analyst needs to put this 'disruptive' stuff in perspective for the reader.

Let's try to get some sort of definition of the word 'disruptive'. So I did some research and ended up with Kalle Lyytinen's 2003 paper in MIS Quarterly called:
"The Disruptive Nature of Information Technology Innovations: The Case of Internet Computing in Systems Development Organizations"
In my opinion a very good paper. And by the way: in his study he shows that Internet Computing has radically impacted the IT innovations of firms, both in terms of development processes and services. Maybe not at all surprising, but compare this type of innovation with (to take an arbitrary example that is often labeled disruptive*) DW appliances......

Lyytinen defines disruptive innovation as:
They radically deviate from an established trajectory of performance improvement, or redefine what performance means in a given industry (Christensen and Bower 1996). They are radical (Zaltman et al. 1977) in that they significantly depart from existing alternatives and are shaped by novel, cognitive frames that need to be deployed to make sense of the innovation (Bijker 1987). Consequently, disruptive innovations are truly transformative (Abernathy and Clark 1985). To become widely adopted, disruptive architectural innovations demand provisioning of complementary assets in the form of additional innovations that make the original innovation useful over its diffusion trajectory (Abernathy and Clark 1985; Teece 1986). By doing so, disruptive innovations destroy existing competencies (Schumpeter 1934) and break down existing rules of competition.

Are appliances or new technology for data storage and data management really disruptive? Or are they just the natural flow of continuing innovation? I think the latter.

Let's be cautious in using big words like 'disruptive'.......

* I did a quick search on 'disruptive' in the B-Eye-Network and found 128 hits, most of them appliances or other 'revolutionary' database products/technologies.

]]> Wed, 10 Jun 2009 02:11:37 -0700