Blog: Rick Sherman

Rick Sherman

Welcome! In addition to data integration, my BeyeNETWORK blog will include observations on the business and technology of performance management, business intelligence and data warehousing. Most posts will be hosted on my Data Doghouse blog, so feel free to leave comments here or on the Data Doghouse. If you'd like to suggest topics or ask me any questions, please email me at

About the author >

Rick has more than 20 years of business intelligence (BI), data warehousing (DW) and data integration experience. He is the founder of Athena IT Solutions, a Boston-based consulting firm that provides DW/BI consulting, training and vendor services; prior to that he was a director/practice leader at PricewaterhouseCoopers. Sherman is a published author of more than 50 articles, an industry speaker, and has been quoted in CFO and Business Week. He also teaches data warehousing at Northeastern University's Graduate School of Engineering. You can reach him at and follow him on Twitter at

Editor's Note: More articles and resources are available on Rick's BeyeNETWORK Expert Channel.

November 2009 Archives

I'm not only concerned about hand-coding versus ETL tools;
I'm also concerned that potential buyers of ETL tools, and the market in
general, are only looking at a small number of players in the ETL market.

For many years, industry analyst research firms
have identified the top two product vendors: Informatica and IBM (via
its acquisition of Ascential Software). So, naturally, these two appear
on any evaluation shortlist. The rest of the shortlist
usually includes the bundled products (mentioned in my recent posts)
that come with the databases, BI tools or applications the
evaluating company already owns. Beyond these usual suspects, other ETL
or data integration products are pretty obscure and almost invisible,
at least from a general market perspective.

>>>Continue this post on The Data Doghouse

Posted November 17, 2009 9:46 AM

This is a continuation of an earlier post that discussed the problems of hand-coding versus using ETL tools.

What Went Wrong?

There are two aspects to effectively leveraging an ETL tool. The first is
learning the tool's mechanics, e.g., taking the tool vendor's training
either in a class or through online tutorials. Most IT people
have no problem learning a tool's syntax. Since they most likely
already know SQL, they learn the tool very quickly.

But the second aspect actually involves understanding ETL processes.
This includes knowing the data integration processes needed to gather,
conform, cleanse and transform data; understanding not only what
dimensional modeling is but why and how you deploy it; being able to
implement slowly changing dimensions (SCD) and change data capture
(CDC); understanding the data demands of business intelligence; and
being able to implement error handling and conditional processing.
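To make one of those concepts concrete, here's a minimal sketch (mine, not from any particular ETL tool) of Type 2 slowly changing dimension logic: when a tracked attribute changes, the current dimension row is expired and a new version is inserted, preserving history. The function and column names (`valid_from`, `is_current`, etc.) are illustrative assumptions, not a standard.

```python
from datetime import date

def apply_scd2(dimension, incoming, key, tracked_cols, today):
    """Type 2 SCD: expire the current row and insert a new version
    whenever a tracked attribute changes; insert new members as current."""
    current = {row[key]: row for row in dimension if row["is_current"]}
    for rec in incoming:
        old = current.get(rec[key])
        if old is None:
            # brand-new dimension member: insert as the current version
            dimension.append({**rec, "valid_from": today,
                              "valid_to": None, "is_current": True})
        elif any(old[c] != rec[c] for c in tracked_cols):
            # tracked attribute changed: expire old row, add new version
            old["valid_to"] = today
            old["is_current"] = False
            dimension.append({**rec, "valid_from": today,
                              "valid_to": None, "is_current": True})
    return dimension

# Example: a customer moves, so the old row is expired and a new one added.
dim = [{"cust_id": 1, "city": "Boston", "valid_from": date(2009, 1, 1),
        "valid_to": None, "is_current": True}]
apply_scd2(dim, [{"cust_id": 1, "city": "Newton"}],
           key="cust_id", tracked_cols=["city"], today=date(2009, 11, 10))
```

An ETL tool gives you this pattern (plus lookups, caching and restartability) as configuration rather than code, which is exactly the kind of process knowledge the second aspect is about.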

>>>continue reading this post on The Data Doghouse

Posted November 10, 2009 9:34 AM

We are creatures of habit. It's not easy to stop doing something the
way we've always done it, especially when we think we are right (but
actually we're not). Let me explain.

I have discussed (some might
say preached) in many posts, articles, webinars, podcasts, classes and
client discussions that for any recurring data integration task, IT
should use an extract, transform and load (ETL) tool.

This certainly has been the best practice for enterprise data warehousing
projects in the Fortune 1000. This is where I got my early experience
in data integration and got to use the ETL tools that annually rank at the top of Gartner's Magic Quadrant and Forrester's Wave.
These ETL tools enabled IT groups and SI (system integrator) project
teams to tackle data integration challenges too complex and extensive
for hand-coding.

>>>continue reading this post on The Data Doghouse 

Posted November 5, 2009 5:03 PM