Blog: Ronald Damhof

Ronald Damhof

I have been a BI/DW practitioner for more than 15 years. In the last few years, I have become increasingly annoyed - even frustrated - by the lack of (scientific) rigor in the field of data warehousing and business intelligence. It is not uncommon for the knowledge worker to be disillusioned by the promise of business intelligence and data warehousing because vendors and consulting organizations create their "own" frameworks, definitions, super-duper tools etc.

What the field needs is more connectedness (grounding and objectivity) to the scientific community. The scientific community needs to realize the importance of increasing their level of relevance to the practice of technology.

For the next few years, I have decided to attempt to build a solid bridge between science and technology practitioners. As a dissertation student at the University of Groningen in the Netherlands, I hope to discover ways to accomplish this. With this blog I hope to share some of the things I learn in my search and begin discussions on this topic within the international community.

Your feedback is important to me. Please let me know what you think. My email address is Ronald.damhof@prudenza.nl.

About the author

Ronald Damhof is an information management practitioner with more than 15 years of international experience in the field.

His areas of focus include:

  1. Data management, including data quality, data governance and data warehousing;
  2. Enterprise architectural principles;
  3. Exploiting data to its maximum potential for decision support.
Ronald is an Information Quality Certified Professional (International Association for Information and Data Quality; one of the first 20 to pass this prestigious exam), a Certified Data Vault Grandmaster (the only person in the world to hold this level of certification) and a Certified Scrum Master. He is a strong advocate of agile and lean principles and practices (e.g., Scrum). You can reach him at +31 6 269 671 84, through his website at http://www.prudenza.nl/ or via email at ronald.damhof@prudenza.nl.

In 1967, Thompson wrote about the administrative paradox: a dichotomy with continuity (stability) and flexibility positioned at opposite ends of a spectrum. In other words: be flexible, and at the same time try to progressively eliminate or absorb uncertainty. The paradox can also be framed in terms of time: in the short run, administration seeks to reduce uncertainty; in the long run, it strives for flexibility.

Nothing new, I hope? Now, what about information systems...

When using information systems we also need to deal with this paradox. We tend to use information systems to automate tasks, formalize sequences of events and, let's be honest, kill flexibility (;-)). An information system can be interpreted as 'a bureaucrat in an electronic version' (Checkland and Holwell, 1998).

So, what do we do? We tend to modularize information systems and integrate them via services that are, of course, strongly decoupled from each other. IT delivers and supports all kinds of business functions, and with a brilliant Service Oriented Architecture we cross the bridge between function and business process. We can now change the business processes whenever demand requires it.

Yee - we = happy, we = flexible again. Easy, huh?

NO, it is not easy. It can be a blank check you write to your 'partners', the system integrators; it may take years before you capitalize on the investment that has been made. And in the process you tend to demotivate your own personnel (or customers) big time.

My point: the balance between stability and flexibility is sometimes totally lost in organizations. Some architects and many vendors/solution providers are pushing the flexibility agenda big time nowadays, but the 'why' of flexibility has never been fundamentally discussed with(in) top management. The 'why' should be related to the industry you are in and the strategy with which you wish to approach the market. For example, I firmly believe that many government agencies should focus on stability over flexibility. Unfortunately, they do not seem to agree with me. What also needs to be considered is that stability and flexibility are interconnected: more focus on flexibility will diminish your stability and vice versa. Accept collateral damage if your architecture is entirely centered around 'being flexible'; if you want both, well, you cannot have both; expect to pay a price ;-)

Even if the case for flexibility is made, the 'how' should be considered extremely carefully. Is flexibility in business processes needed (a hard question)? Or is flexibility in data sufficient? The two differ hugely in terms of attainability, cost and organizational impact, but the latter can overcome the administrative paradox at least partly....




Posted September 27, 2010 11:59 PM

A story.....
  • Vendor X sells its ERP to a company in Healthcare;
  • Client wishes to set up its informational environment (data sharing, BI, CPM, etc.) right from the start;
  • Vendor X pushes the 'standard solution' they sell;
  • Client decides to decouple their informational environment from its source(s) for several reasons (heterogeneous sources, sustainability, compliance, adaptability, etc.);
  • Vendor X deploys their ERP;
  • Client starts to design and build the informational environment;
  • Interfaces between ERP of vendor X and the informational environment are developed;
  • The ERP of vendor X does not offer functional interfaces ('X keeps pushing their standard product'), so the client needs to connect at the physical level;
  • Go-live is near, for both the ERP and the new informational environment.

And then change management of vendor X regarding the ERP kicks in.

Client: 'What's your release schedule for patches?'
X: 'Every 2 weeks'
Client: 'Huh?'

Client thinks: 'Damn, how can I keep up with this change schedule?'

Client: 'Well, can you tell me anything regarding the average impact of these patches?'
X: 'Well, they can be very small and very big' 

Client thinks: 'Ok, what are you NOT telling me?'

Client: 'Ok, but this ERP is like 15 years old, so give me an overview of the average impact'
X: 'Basically anything can happen'

Client thinks: 'Oh, oh...'

Client: 'Ok, but the majority of these changes are of course situated in the application layer, not the data layer?'
X: 'Well..anything can happen.'

Client thinks: 'Is it warm in here?'

Client: 'Anything? Also in the data layer? Table changes, integrity changes, domain type changes, value changes?'
X: 'Aye'

Client thinks: 'Ok - I'm dead'

Client: '...at least tell me that existing structures always remain intact and the data remains auditable - extend instead of replace, for example'
X: 'Huh?'

Client thinks: 'Well, at least I am healthy...'

Client: 'Hmm... just a side note, we use Change Data Capture. I assume these changes are fully logged?'
X: 'Nah - log is turned off, otherwise we can't deploy the changes' 

Client thinks: '..hmm....is my resume up to date?'


My point: do not assume that your vendor (of any system) engages in professional application development and a change management policy that takes into account the simple fact that the data in these information systems needs to be shared with other information systems in your company.

Change management and professional application development need to be important criteria in the selection of information systems.
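
As an aside, the 'extend instead of replace' principle from the dialogue above is easy to illustrate. The sketch below is a hypothetical example in Python with an in-memory SQLite database; the table and column names are mine, not the vendor's.

```python
import sqlite3

# Hypothetical source table in the ERP; names are illustrative only.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE patient (id INTEGER PRIMARY KEY, name TEXT, ward TEXT)")
conn.execute("INSERT INTO patient VALUES (1, 'J. Jansen', 'A3')")

# Breaking change ('replace'): renaming or re-typing a column silently breaks
# every downstream interface that reads 'ward' - the situation the client fears.
# conn.execute("ALTER TABLE patient RENAME COLUMN ward TO ward_code")

# Additive change ('extend'): the existing structure stays intact, downstream
# consumers keep working, and the old values remain auditable.
conn.execute("ALTER TABLE patient ADD COLUMN ward_code TEXT")
conn.execute("UPDATE patient SET ward_code = ward")

print(conn.execute("SELECT id, name, ward, ward_code FROM patient").fetchall())
# [(1, 'J. Jansen', 'A3', 'A3')]
```

A vendor that commits to additive changes like this makes life downstream dramatically easier; a vendor that reserves the right to do 'anything' does not.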



Posted June 8, 2010 2:29 PM


Business Intelligence vendors seem to embrace collaboration (I am still struggling to see whether this software is any different from the groupware we had in the '90s). As an example, please take a look at SAP StreamWork on YouTube. I am going to be blunt here: this type of software is completely useless unless the organization is willing to fundamentally change its decision-making process.

Let me try to make my point here with the help of giants like Galbraith, Daft, Davenport and others.

There are basically two information contingencies: uncertainty and equivocality.

  • Uncertainty can be defined as the absence of information (e.g., Shannon and Weaver) and can be overcome by simply asking the right question. The answer is out there...
  • Equivocality is ambiguity: the existence of multiple and conflicting interpretations of an organizational situation. Participants are not even sure which questions need to be asked, let alone which answers they need. I think such situations can also be regarded as 'wicked problems'.
Now, for overcoming uncertainty, relatively blunt instruments suffice. Reporting and the ever-increasing possibilities of analytics really shine in reducing uncertainty.

For overcoming equivocality, however, Business Intelligence staples like reporting and even analytics are of diminishing use. You need more 'richness' in the tooling - and by tooling I don't necessarily mean software. Examples of richer tooling are group meetings, discussions, planning, creative (group) thinking, etc. Simply put: you need face-to-face contact.
 
Davenport wrote an article, 'Make Better Decisions', in the Harvard Business Review in 2009. He advocates a more formalized approach to decision making:

'Smart organizations can help their managers improve decision making in four steps: by identifying and prioritizing the decisions that must be made; examining the factors involved in each; designing roles, processes, systems, and behavior to improve decisions; and institutionalizing the new approach through training, refined data analysis, and outcome assessment.'

Davenport, in my opinion, is aiming at equivocality and a more formalized method of coming to an outcome. Frankly, I like it a lot. But organizations need to be truly willing to change their decision-making process, and that is a major organizational and cultural change in my opinion. If organizations are really committed to making this change (Davenport names a few such companies, like Chevron and The Stanley Works), collaboration software has the potential to shine in supporting such a decision-making process.

I am afraid, however, that collaboration software from BI vendors will be sold as candy with the promise of better decisions. That is just bullshit, and my prediction is that it will fail big time.
 


Posted March 30, 2010 12:17 PM

What we all knew to be true, but could not get across to management, now has more scientific backing: the decision process for outsourcing a DSS is influenced by significantly different characteristics than that for an OLTP system. If you are interested in the details, the theory and the underlying data, please read:

Factors considered when outsourcing an IS system: an empirical examination of the impacts of organization size, strategy and the object of a decision (DSS or OLTP).

B. Berg and A. Stylianou, European Journal of Information Systems (2009) 18, 235-248.

I still encounter organizations that are stuck in the OLTP world, even when the object of an outsourcing decision is completely different on many dimensions. They tend to use the same outsourcing decision process they always did... whether they outsource an ERP, a CRM system, a data warehouse or a more elaborate BI system.


Posted February 8, 2010 7:47 AM

I have just read a very intriguing paper called 'A Common Approach for OLTP and OLAP using an In-memory Column Database', written by Hasso Plattner.

It's not a revolutionary new technical approach to Data Warehousing and Business Intelligence. It's a series of smaller innovations (mostly technical, and some even quite old) that together could lead to a paradigm shift [1] in the area of Data Warehousing.

The paper focuses on the transactional world, because that's where the disruption will originate. In short:

  • Ever increasing multi-CPU cores
  • Growth of main memory
  • Column databases for transactions (!)
  • Shared Nothing approach
  • In-memory access to current data - historical data on slower devices (or not)
  • Zero-update strategies in OLTP (recognizing the importance of history as well as the importance of parallelism; see the sketch below)
  • Not in the paper, but I see data models for newly built OLTP systems increasingly resembling the data models of the hub in the data warehouse architecture.
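
To give a feel for the zero-update idea mentioned in the list above, here is a minimal, hypothetical sketch in plain Python (the record layout and helper names are my own, not Plattner's): every change is appended as a new version instead of overwriting the current row, so history is retained by construction.

```python
from datetime import datetime, timezone

ledger: list[dict] = []  # append-only store; rows are never updated in place

def record_change(customer_id: int, attributes: dict) -> None:
    """Append a new version of the record instead of updating it."""
    ledger.append({
        "customer_id": customer_id,
        "valid_from": datetime.now(timezone.utc),
        **attributes,
    })

def current_state(customer_id: int) -> dict | None:
    """The latest version is the current state; older versions remain as history."""
    versions = [row for row in ledger if row["customer_id"] == customer_id]
    return max(versions, key=lambda row: row["valid_from"]) if versions else None

record_change(42, {"city": "Groningen"})
record_change(42, {"city": "Amsterdam"})  # no UPDATE: the old version is kept
print(current_state(42)["city"])          # Amsterdam
print(len(ledger))                        # 2 versions retained
```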


Posted December 12, 2009 4:13 AM

Lately I have come across vendors of OLTP systems that engage heavily in tied selling. Now - if there is complete transparency towards the customer - this is of course no problem. But, the sad thing is, this transparency is usually extremely poor or even non-existent.

Tied selling combined with a (complete) lack of transparency has caused severe damage to the economy, as we have seen especially in the financial sector. In Europe, the European Commission is highly active in forcing vendors to stop tied selling (for example Internet Explorer and Microsoft's operating system), and vendors receive huge fines if they engage in these practices.

Unfortunately these practices are also quite mainstream in the data exploitation industry.


Posted October 20, 2009 4:54 AM

Yep - bringing up an oldie, but still oh so true.


Financial assets: a single euro can only be spent once...

Human capital: they can just walk out of the door and leave, or be headhunted by your competitor.

Not so with your data: it is uniquely yours, closely connected to your business language, full of nuance, in the proper context, and able to procreate new data. Every organization can use its data differently to create its own niche. When data is used by anyone in your organization, it is not consumed and it does not expire (a patent will). There is simply no other asset anyone can think of that has these capabilities.

To make the case even stronger: the movement of data through your organization is your lifeblood. Any company will simply cease to exist if this data is of bad quality. Orders will not get through, customers' credit information cannot be seen and invoices cannot be sent.

Another fascinating attribute of data is that it defines the other assets: financial data, property data, employee data. Badly managed data will jeopardize the management of those other assets big time. So it's kind of a 'super', ultimate asset.

Nicholas Carr wrote 'IT Doesn't Matter' in the Harvard Business Review in 2003. Systems, infrastructure, services, hardware, software, ERP... whatever... they will become, and probably already are, commodities. They are not a differentiating factor in competitive advantage, sustainability or reliability. Data, however, is always unique: competitors cannot get hold of it (well... unless you have a security problem), and if they did, it would probably be hard to use, since data is often very closely linked to the organizational lingua franca.

Data must be nurtured, managed aggressively and - most of all - enabled to be exploited to the max in order to get a good return. It is by far the ultimate proprietary asset.

I believe that in business nowadays huge amounts of money are spent on very sophisticated technology, financial controls and process management (the process taliban). The marginal returns from ever-increasing investments in these developments will be small. Investments in data are quite the opposite: the opportunities are limitless. And, more importantly, any investment you make is proprietary to your organization. That is the true definition of competitive advantage.

This post is inspired by, and sometimes quotes, T. Redman and his book 'Data Driven'.


Posted August 27, 2009 7:05 AM

In July 2007 I had my first acquaintance with Dan Linstedt, via LinkedIn. I worked for a large government organisation as the architect for Data Warehousing and Business Intelligence, and I was seriously considering Data Vault as the standard methodology for the Enterprise Data Warehouse. I worked for over two years for that government organisation, and my choice for Data Vault is still a very good one.

In November 2007 I brought Dan Linstedt to the Netherlands and got the first 27 people of that same government organisation trained and certified. I had dozens of discussions with architects, management, developers, etc., but in the end the consensus on choosing Data Vault was huge.

Since November 2007, Data Vault has skyrocketed in the Netherlands. Together with the Genesee Academy and DNV, I think we have certified over 200 people, and I dare say that most new EDW projects in the Netherlands nowadays have chosen Data Vault as the prime methodology.

Before I get criticised for sanctifying one methodology regardless of the various (non-)functional requirements: I do believe that ANY architecture needs to be in line with the business, application and technical architecture. Choosing one methodology beforehand is just plain wrong; there has to be sound justification and alignment for any choice in your architecture, whether it's a tool, a methodology or whatever.

Over the past two years we (Genesee Academy, DNV and I) have certified over 200 people, most of them extremely seasoned and respected experts in our field. On average these consultants are very critical of any new 'trick in the book', and I dare say that most of the experts who were certified are convinced that Data Vault has extreme value and potential. The proof of the pudding is in the eating: Data Vault has seen many implementations in the Netherlands over the last two years, and it is growing fast.

I am confident that Data Vault in the years to come will hit Europe as hard as it has hit the Netherlands.

But I wonder... and I am still baffled by it. The attention for Data Vault in the United States is extremely limited - why is that? Even a respected institute like TDWI is not paying much attention. Why? Experts in the field do not mention Data Vault whatsoever. I just don't get it. Can anybody enlighten me?

For those of you not familiar with Data Vault --> read my articles.
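
For readers who want just a rough flavour of what a Data Vault model looks like before reading further, here is a deliberately minimal sketch, expressed as SQLite DDL from Python; the table and column names are hypothetical. A hub carries only the business key, and a satellite historizes the descriptive attributes; real models add links between hubs, hash keys and much more.

```python
import sqlite3

# A deliberately tiny, hypothetical flavour of Data Vault structures.
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE hub_customer (
    customer_key  INTEGER PRIMARY KEY,
    customer_bk   TEXT NOT NULL UNIQUE,  -- business key as known in the source
    load_dts      TEXT NOT NULL,         -- when the key was first seen
    record_source TEXT NOT NULL          -- which system delivered it
);

CREATE TABLE sat_customer_details (
    customer_key  INTEGER NOT NULL REFERENCES hub_customer(customer_key),
    load_dts      TEXT NOT NULL,         -- every change becomes a new row
    name          TEXT,
    city          TEXT,
    record_source TEXT NOT NULL,
    PRIMARY KEY (customer_key, load_dts)
);
""")
```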


Posted August 18, 2009 9:00 AM

In the 'Neverland' of decision support, Business Intelligence has completely taken over all attractions.

Let's name a few: BI, Neuro BI, Ambient BI, Sentient BI, Process-driven BI, Process Intelligence, Pervasive BI, BI 2.0, BI as a Service, BI for SOA, SOA for BI, Mobile BI, Google BI, Personal BI, BI-Tools-where-you-do-not-need-a-DWH-and-ETL-Buy-it-now, Mashups-the-new-BI-Desktop (oh my god), BI-in-the-box, Business Analytics, BI-in-the-cloud (love this one!), BI-virtualization, Agile BI, Operational BI, Decision Intelligence, Decision Management, Event-driven analytics, Complex Event Processing, BAM, Collaborative Decision Making......

Do not get me wrong; I certainly do not dismiss them all.....just most of them :)

Most of these 'attractions' are highly sponsored, look super-duper on the outside, and if you don't ride them you will lose out big time (so they say). However, when you finally decide to ride them you find that they are not that stable, not really safe, and there is hardly any enjoyment when you exit them (but you can tell your neighbour you dared to ride them!). In other words: not really grounded in theory, and the relevance for practice is extremely hard to find.

There are, however, some attractions in this Decision Support 'Neverland' that are dusty, with spiderwebs all over the place, but still extremely solid; if we were to overhaul them with new architectural insights and technology, they could be smash hits. These attractions are named DSS, EIS, ESS...

I feel that we - as an industry - have failed miserably to continue on the path laid out for us by people like John Dearden, John Rockart, David DeLong, Ralph Sprague, Hugh Watson, Steven Alter, Daft and Lengel, Peter Keen, Michael Scott Morton, Herbert Simon, Henry Mintzberg and many more.

Dan Power is one of those brave souls standing at the ticket box, selling tickets for his 'attraction'. I recommend reading his latest article as well as the blog post written by Wouter van Aerle.

Finally, I want to contribute to the Decision Support 'Neverland': BI goes Retro.


Posted June 30, 2009 6:54 AM

In 1986 Daft and Lengel published an article in Management Science:

ORGANIZATIONAL INFORMATION REQUIREMENTS, MEDIA RICHNESS AND STRUCTURAL DESIGN.

A very interesting article that combines the processing of information by decision makers, the use of structural mechanisms, and organizational design. The latter - organizational design - I will not discuss thoroughly in this blog. Just remember that the choice of structural mechanism to overcome uncertainty/equivocality will impact the organizational design.


Posted June 25, 2009 2:16 AM