Are you greeted with a chorus of yawns when you talk about data management? Try mentioning that it can save over three quarters of a billion dollars.
Like other global 100 firms, British Telecommunications plc has a complex patchwork of applications and overlapping business processes that generate vast amounts of data. As a leader in its class, however, BT instituted a far-reaching program to investigate and improve data quality that has generated savings of over $800M during the past decade. Over time, a culture of information quality has taken root and senior executives now recognize that data is a strategic asset to be carefully nurtured and jealously protected.
BT’s story offers a positive message for other organizations seeking similar results. Although $800M may sound like an impossible goal, you too can achieve comparable savings relative to the size of your organization. The prescription for success relies on a small set of fundamental principles, combined with patience and persistence. You could stop reading this article and go back to working on further cuts to your budget… or read on and learn how to make a truly significant impact on your bottom line.
An Incremental Approach
In 1997, BT began a series of modest, narrowly targeted investments in “information quality”.
Leveraging the success of these initial efforts, a focused Information Quality (IQ) team began an iterative program of technology, process and change management work that gradually extended data quality tools, practices and awareness across the entire business.
Through rigorous attention to return on investment (ROI), backed up by empirical analysis of the underlying data itself, the IQ team prioritized projects based on clear and tangible business benefits. This discipline ensured proper alignment of scope, objectives and solutions – resulting in returns from four to forty times the original investment cost. By communicating the positive business results of each project, confidence in the techniques and the potential benefits grew. Managers were willing to support more ambitious goals and invest the necessary effort to see projects to completion.
Today, a culture of awareness and attention to the value of good quality data has become institutionalized. The IQ team, now numbering over 75 professionals, currently provides consulting services not only across the entire enterprise but also to BT clients worldwide.
The Prescription for Success
According to Nigel Turner, head of ICT Customer Management for BT, the success of any data
quality improvement program depends on a few key factors. Although adhering to these factors won’t produce results overnight, it will ensure that over time your efforts enhance the bottom line.
• Recognize that successful business processes depend on having quality information.
• Data quality is not an end in itself, but rather must be tied to a business case.
• Every specific improvement effort must begin with rigorous data profiling.
• Invest in tools, processes, methods, and culture to enable information quality.
• Data quality is a continuous process – cyclical rather than linear in nature.
• Select objectives and projects appropriate to the data management maturity of your organization.
• Place responsibility for information quality with the business, not with IT.
It is not necessary to start with executive level support. Instead, it is better to limit your scope to a point where you can ensure adherence to the seven principles above. This approach will ensure that your projects, although initially modest, will be successful. As support and awareness increase, the same principles will enable you to manage larger investments and to demonstrate the importance of business responsibility for data, a focus on continuous improvement, and so on.
Successful Business Processes Depend on Having Quality Information
It is obvious yet fundamental to realize that most business processes today are heavily dependent on specific data being available to the right people at the right time. It follows that inaccurate,
incomplete, inconsistent or inaccessible data will negatively impact these business processes. Recognizing that process inefficiencies cost money and/or result in lost revenue, it becomes clear that data quality issues most certainly do affect financial results.
In the case of BT, the first symptoms of poor data quality included slow product launches (resulting from lengthy data reconciliation efforts) and an inability to effectively and efficiently market to the existing customer base (resulting from limited visibility into the full relationship with each customer). Although the problem was daunting – BT had 26M customers whose data was scattered across hundreds of independent systems – the eventual benefits of properly cleansing and integrating this data were tremendous. These benefits included better asset management, reduced revenue leakage, improved customer loyalty, reduced staff turnover, enhanced opportunities for e-business and process automation, the ability to migrate to new platforms more quickly, and closer adherence to regulations.
Tie Data Quality to a Business Case
Data quality by itself cannot drive financial benefits. In fact, without reference to a particular
business context, it is difficult even to specify what “quality” data would look like. Therefore, in any data quality initiative it is critical to understand the connections between data fitness, business processes and financial results. Be very clear about what you are trying to achieve. Will identifying duplicate supplier records result in fewer errant orders? Will more complete address information
help ensure that invoices reach the intended recipients? Will identifying relationships among a customer’s various accounts help improve retention?
Specifically, BT’s experience demonstrates the power of explicitly linking improvements in data quality to specific process improvements and financial outcomes. Throughout the information
quality program, BT kept careful track of financial metrics. Although these were modest in the first few years, by 2001 BT was saving $165M annually from data quality improvements. By 2004 this number had swollen to $388M. Such success was only possible because each of the 70-odd IQ projects was clearly linked to a business case.
Begin with Rigorous Data Profiling
One of BT’s secrets to success was detailed and comprehensive data discovery at the outset of each project. Traditionally, profiling was a manual effort, confined to the IT development team and
executed only after project planning was complete. Using a tool called TS Discovery™ from
Trillium Software®, BT placed control in the hands of business users, thus greatly facilitating the
identification of links between data anomalies and process inefficiencies. Every new idea was first vetted directly against the data itself, which helped BT avoid projects that initially seemed promising but might well have become costly quagmires.
Furthermore, BT reported that using automated profiling increased data analyst efficiency tenfold. The ability to view profile results together with actual data rows helped business users more clearly see the impact of the data issues. Within the TS Discovery™ platform, the data quality consultants helped the process owners to quickly prototype potential business rules. The users could immediately see the positive effects of correcting inaccurate values, removing duplicate records and so on. The team only pursued permanent solutions that materially improved the associated business processes and impacted the bottom line.
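To make the idea concrete, column-level profiling of the kind described above can be sketched in a few lines of Python. This is an illustrative sketch only, not the TS Discovery™ product; the sample records and field names are hypothetical.

```python
from collections import Counter

def pattern(value):
    """Reduce a value to a shape pattern: letters -> A, digits -> 9."""
    return "".join("9" if c.isdigit() else "A" if c.isalpha() else c
                   for c in str(value))

def profile(records):
    """Per-column profile: null rate, distinct count, most common value patterns."""
    columns = {key for row in records for key in row}
    report = {}
    for col in sorted(columns):
        values = [row.get(col) for row in records]
        present = [v for v in values if v not in (None, "")]
        report[col] = {
            "null_rate": 1 - len(present) / len(values),
            "distinct": len(set(present)),
            "patterns": Counter(pattern(v) for v in present).most_common(3),
        }
    return report

# Hypothetical customer rows; note the inconsistent postcode formats.
customers = [
    {"name": "Ann Lee", "postcode": "EC1A 1BB"},
    {"name": "Bob Kay", "postcode": "EC1A1BB"},   # same shape, no space
    {"name": "Cy Moss", "postcode": ""},          # missing value
]
print(profile(customers)["postcode"])
```

Even this toy report surfaces the kinds of anomalies profiling is meant to expose: a missing postcode and two different formats for what is likely the same address standard.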
Invest in Tools, Processes, Methods and Culture
From the outset, BT took pains to get the organization right. Realizing that technology cannot compensate for flawed or missing processes, the company organized champions from each line of business into an “information management” forum. The business champions prioritized data quality projects and
worked to coordinate budgets and projects for data cleansing. This helped speed the adoption of standards and yielded economies of scale. At the same time, the IT group created a data quality
center of excellence, whose head was part of the information management forum and whose consultants employed common methods across all of the IQ projects.
BT recognized that tools would be needed for diagnosis, re-engineering (e.g. cleansing of existing datasets), and consolidation (e.g. reconciliation of data across operational systems). Among other things, these tools enabled rapid prototyping and frequent use of pilot projects to vet solutions and aid adoption of new practices. Moreover, BT’s technology vendors provided a platform that enabled existing rule sets and workflows to be repeatedly re-used and re-purposed.
In addition to studying how technology could facilitate consistent information quality, the information forum identified opportunities to improve business processes including workflows, training, and so on. BT’s Turner recommends “making it as hard as possible for people to enter the wrong data.” This could involve real-time validation of new customer records to prevent the inadvertent creation of duplicates or simply to ensure all records have valid addresses.
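Turner’s advice about making it hard to enter wrong data can be illustrated with a minimal point-of-capture check: validate required fields and address format, and reject likely duplicates before a record is created. This is a hedged sketch under assumed rules, not BT’s actual implementation; the record fields and the UK-postcode pattern are simplifications.

```python
import re

def normalize(name, postcode):
    """Build a match key so trivially different entries collide as duplicates."""
    return (re.sub(r"\W+", "", name).lower(),
            re.sub(r"\s+", "", postcode).upper())

# Simplified UK postcode shape, e.g. "EC1A 1BB" (real validation is richer).
POSTCODE_RE = re.compile(r"^[A-Z]{1,2}\d[A-Z\d]?\s*\d[A-Z]{2}$", re.I)

def validate_new_customer(record, existing_keys):
    """Reject records with missing fields, bad addresses, or likely duplicates."""
    errors = []
    if not record.get("name"):
        errors.append("name is required")
    postcode = record.get("postcode", "")
    if not POSTCODE_RE.match(postcode):
        errors.append("postcode is not a valid UK format")
    if not errors:
        key = normalize(record["name"], postcode)
        if key in existing_keys:
            errors.append("likely duplicate of an existing customer")
        else:
            existing_keys.add(key)
    return errors

existing = set()
print(validate_new_customer({"name": "Ann Lee", "postcode": "EC1A 1BB"}, existing))  # -> []
print(validate_new_customer({"name": "ann lee", "postcode": "EC1A1BB"}, existing))   # flags likely duplicate
```

The normalized match key is the design choice that matters: duplicate prevention only works if superficially different entries of the same customer collapse to the same key.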
Ultimately BT was able to achieve a true cultural shift. As more departmental heads realized direct benefits from past information quality projects, they willingly advocated for additional investments. Over time, the success of the program caught the attention of senior executives who helped spread adoption to the remainder of the enterprise.
Data Quality is a Continuous Process
Because data is usually stored for future use, it is amenable to after-the-fact, batch “cleansing”.
Although in many cases this can yield positive ROI, it also breeds complacency as data owners fail to appreciate how quickly data degrades. Product configurations change, customers get married and move, suppliers merge. Modern business processes are constantly generating new data and attempting to glean new insights from data collected in the past.
Therefore, cautions Turner, do not start on any data quality process unless you know how to keep the data clean. In fact, BT’s methodology is a cyclical process consisting of five steps: problem and opportunity identification, diagnosis, proposal, re-engineering, and consolidation. Upon reaching the consolidation phase, the process begins anew with identification of new problems and opportunities.
More tangibly, this principle implies the need to get data right at the point of capture. Through rigorous attention to data creation you can avoid the cycle of haphazard data capture, after-the-fact cleansing and subsequent degradation. To assist, Turner recommends tools for data discovery, real-time data validation and cleansing, and ongoing data monitoring.
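The ongoing-monitoring idea can be sketched as a simple check of agreed quality metrics against thresholds, run on each new batch of data. This is an illustrative sketch with assumed metrics and thresholds, not a description of BT’s tooling.

```python
def completeness(records, field):
    """Fraction of records with a non-empty value for the given field."""
    filled = sum(1 for r in records if r.get(field))
    return filled / len(records) if records else 0.0

def monitor(records, rules):
    """Compare current metrics against agreed thresholds; return any breaches."""
    breaches = []
    for field, threshold in rules.items():
        score = completeness(records, field)
        if score < threshold:
            breaches.append(f"{field}: {score:.0%} complete, target {threshold:.0%}")
    return breaches

# Hypothetical batch: one of three records is missing an email address.
batch = [{"email": "a@bt.example"}, {"email": ""}, {"email": "c@bt.example"}]
print(monitor(batch, {"email": 0.95}))
```

Run periodically, checks like this turn data quality from a one-off cleanup into the continuous, cyclical process the article describes.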
Finally, it is important to leverage a technology platform that enables you to deploy consistent data quality processes wherever and whenever they are needed in the organization. From an architecture standpoint this requires a platform that can scale and that offers a high degree of flexibility for integrating with third-party applications. It is also important to partner with a vendor that provides the depth of expertise needed to execute specific projects but also the vision to help you drive information quality throughout your organization.
Understand the Maturity of Your Organization
The BT example demonstrates the importance of an incremental approach. As Turner points out, you
cannot progress from barely understanding a problem to optimal management in one giant leap. Instead, you have to go in stages: starting from initial awareness, then to departmental and reactive projects, gradually becoming more proactive, and ultimately to optimized enterprise data management.
In BT’s case, this journey took about eight years. They started with batch legacy data cleansing and later implemented a single view of customer data in the main marketing system. As confidence and support grew, the IQ team deployed unified name and address data in real-time to select operational systems. Most recently, BT implemented an enterprise-wide, centralized customer data hub feeding the primary customer relationship management (CRM) applications. More work is planned.
The lesson is simple: start small, publish your successes and take an incremental approach.
Responsibility for Information Quality Rests with the Business
Despite the best intentions, when IT attempts to manage data integrity the result will almost surely
be failure. IT has neither the authority nor the understanding to ensure business processes correctly generate and consume information. The only way to ensure that data is managed appropriately is to empower the business process owners who rely on that data every day.
A system-centric view of the enterprise leads to siloed applications stitched together with a “spaghetti bowl” of point-to-point interfaces. This is because each system considers only the data it requires and the immediate context in which that data will be used. By moving to a data-centric view of the world managers can begin to focus on how information flows into and within the organization.
Is It Possible?
The BT case demonstrates that regardless of how mature your organization is at managing data and information, you can make a substantial impact on your bottom line. The work is often tedious and painstaking, but who knew that the results could be so dramatic?
Start small but start today. Get your additional questions answered by the experts: participate in an exclusive, worldwide Q&A web seminar with Nigel Turner, founder of BT’s Information Quality team, hosted by Harte Hanks Trillium Software®. To register, visit www.trilliumsoftware.com/savingmillions.