Business Intelligence projects have become commonplace amongst industry leaders over the past several years. What never ceases to amaze me is how different the implementations are from one environment to another and how often a setup is extremely strong in one area, while lacking in another. I stumbled across a really great document about IT governance that discussed process maturity and thought that some similar concepts could be applied in the BI domain.
The tables below are a collection of central ideas from our BI team’s experiences, meant to be a guide in determining the level of maturity or completeness of a BI implementation. I’ll be the first to admit that they are not all-inclusive, as project requirements can cover miles of territory. But they certainly contain many of the core components necessary for a highly usable, secure, available, high-quality, and visible set of analytics.
What categories, components, or ideas would you add to the list?
High Level BI Maturity Assessment
- Lack of Awareness – May have heard terms related to BI, but have little or no understanding of what is involved. There is no awareness of the potential impact of data analytics.
- Aware – Have become aware of BI topics and some general benefits and features. May have minimal exposure to related technologies. Potential benefits are still abstract, but there is recognition that BI analytics would have a positive impact on the business. There may be some manually created spreadsheets or documents that are not easily shared or centrally stored.
- Entry Level – The first steps have been taken. The questions and problems that can be answered by data analysis have been defined. There is likely some one-off development that has been done using entry-level self-service products or spreadsheets. Analytical collaboration and re-use between end users is difficult. Producing new analytics is time consuming, and the results use inconsistent rules and have data quality issues.
- Centralized Data Rules – Source data for analytics resides in a single place and is reused for all data visualizations. End-user-facing visualizations are still mainly contained within spreadsheets, but some other relatively static reports may now have been created. Consistency of data rules has improved by using a central source, but there are still many data quality issues, and collaboration at the data presentation level remains difficult. Data latency is relatively high, with data refreshed at intervals of 24 hours or greater. Most analytics still look backward at “what happened”.
- Clean and Shared – Data quality issues have been addressed and processes are in place to continually improve. A platform has been chosen to deploy collaborative analytics that can be shared amongst users. Data latency has improved to hours or minutes. With improved data quality, analytics now start to reveal relationships between characteristics, showing not only what happened, but why it happened.
- Predictable – Data quality is excellent. Secondary and tertiary data components are now included, allowing immense analytical power and flexibility. End users are able to create, explore, and share analytics in only a few clicks. Security models follow best practices. Data latency is close to real time for all end users. Infrastructure personnel have insight and visibility into BI processes to monitor for issues. Analytics can now begin to show trends and help predict what will happen next.
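To make the assessment a bit more concrete, here is a minimal, hypothetical sketch of how an organization might score itself against these levels. The capability questions and the gating logic are illustrative assumptions on my part, not a formal methodology:

```python
# Hypothetical self-assessment sketch: map yes/no capability answers to
# the maturity levels described above. The questions and gating rule are
# illustrative assumptions, not a formal scoring methodology.

LEVELS = [
    "Lack of Awareness",
    "Aware",
    "Entry Level",
    "Centralized Data Rules",
    "Clean and Shared",
    "Predictable",
]

# Each question gates progression to the next level; answering "no"
# stops the climb at the current level.
QUESTIONS = [
    "Is the team aware of BI concepts and their potential benefits?",
    "Have the business questions to be answered by analytics been defined?",
    "Does all analytical source data reside in a single, reused location?",
    "Are data quality processes in place and analytics shared on a platform?",
    "Is latency near real time, with solid security and emerging predictions?",
]

def assess(answers):
    """Return the highest maturity level whose prerequisites are all met."""
    level = 0
    for answer in answers:
        if not answer:
            break
        level += 1
    return LEVELS[level]

if __name__ == "__main__":
    # Example: data is centralized, but quality and sharing work remain.
    print(assess([True, True, True, False, False]))
    # -> "Centralized Data Rules"
```

The idea is simply that each level builds on the one before it, so a single unmet prerequisite caps the assessment at that level.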
Attributes of a Successful BI Implementation
The detailed attribute tables break down into five categories:

- Self-service / Usability
- Governance / Control / Security
- Repeatability / Latency
- Data Quality
- Process Visibility
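As one illustration of the Data Quality attribute in practice, here is a minimal pandas sketch of the kind of recurring checks a BI team might run against a source table. The table, the column names ("order_id", "order_date", "amount"), and the 1% threshold are all hypothetical assumptions:

```python
# Hypothetical recurring data quality checks for a BI source table.
# Column names ("order_id", "order_date", "amount") and thresholds are
# illustrative assumptions, not drawn from any specific implementation.
import pandas as pd

def quality_report(df: pd.DataFrame) -> dict:
    """Compute a few basic quality metrics for a sales-style table."""
    return {
        "row_count": len(df),
        "duplicate_keys": int(df["order_id"].duplicated().sum()),
        "null_dates": int(df["order_date"].isna().sum()),
        "negative_amounts": int((df["amount"] < 0).sum()),
    }

def passes(report: dict, max_bad_ratio: float = 0.01) -> bool:
    """Flag the load for review if more than 1% of rows have issues."""
    bad = (report["duplicate_keys"]
           + report["null_dates"]
           + report["negative_amounts"])
    return bad <= max_bad_ratio * max(report["row_count"], 1)

if __name__ == "__main__":
    df = pd.DataFrame({
        "order_id": [1, 2, 2, 4],
        "order_date": pd.to_datetime(
            ["2023-01-01", None, "2023-01-02", "2023-01-03"]),
        "amount": [100.0, 250.0, -5.0, 80.0],
    })
    report = quality_report(df)
    print(report, "PASS" if passes(report) else "REVIEW")
```

A check like this could run after each data load, flagging the refresh for review before the analytics layer picks it up.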