These presentations raise awareness of the data available and suggest possible new applications. This concept of pervasive business intelligence enables employees to understand and achieve their individual and company goals from the "shop floor to the top floor." The closer a decision is made to where the data is generated, the better informed that decision will be.
Efforts to cascade the use of BI tools further into departments—the "democratization of BI"—were not immediately embraced. More than half of our respondents were business analysts, a role at the intersection of IT and the typical business user. For example, adding multiple N-of-M filters (where N is 2 or greater) to the report filter (sometimes called a page filter or page slicer) in the OLAP option of an Oracle database can improve filter performance.
What are some of the sweet-spot applications that other firms like Randy's have found for real-time BI, and specifically for decision automation? What are some of the "bumps" other firms have experienced in moving to real-time BI? Training end users on the new technology and convincing them of its benefits should never be neglected.
The right place to start is with large-scale, repeatable decisions – the kind that are made every day – that affect important KPIs. Organizations need to capture, integrate and analyze social media data, the type of customer interaction data that represents a significant part of the “big data” explosion facing businesses. The solution uses data quality deduplication to eliminate discrepancies, can offer users a choice of which attributes to use, and ultimately creates a “golden record” – the best version of the truth.
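The merge step behind a "golden record" can be sketched in a few lines. This is a minimal illustration, not any vendor's actual deduplication logic; the match key, the field names, and the "newest non-empty value wins" survivorship rule are all assumptions:

```python
from collections import defaultdict

def build_golden_records(records, key="email"):
    """Merge duplicate customer records into one 'golden record' per key,
    preferring the newest non-empty value for each attribute."""
    groups = defaultdict(list)
    for rec in records:
        groups[rec[key]].append(rec)
    golden = []
    for dupes in groups.values():
        # Newest record wins; older records fill in missing attributes.
        dupes.sort(key=lambda r: r["updated"], reverse=True)
        merged = {}
        for rec in dupes:
            for field, value in rec.items():
                if merged.get(field) in ("", None):
                    merged[field] = value
        golden.append(merged)
    return golden

rows = [
    {"email": "a@x.com", "name": "",    "phone": "555-0100", "updated": 1},
    {"email": "a@x.com", "name": "Ann", "phone": "",         "updated": 2},
]
golden = build_golden_records(rows)
# One merged record: the name from the newer row, the phone from the older one.
```

A real implementation would also let users choose which source system is authoritative per attribute, as the text notes.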
The most common strategy used to implement a BI project is popularly called "parallel execution". The approach is often justified by claiming that it lets workers use the new technology with the comfort and peace of mind that the old reports are still available. Drilling down to the transaction level is the first line of defense against critics of the new BI platform. Successful implementation of a BI project depends on the readiness of end users to test the new reporting framework.
The spark was Yen's attendance at a MicroStrategy conference—the company uses other MicroStrategy products—where he realized the BI company had a viable mobile app ready for the iPad (a device Yen admits he had previously been skeptical of). We understand that in our competitive global business world, choosing the right path depends on the quality of the data used to make decisions. Data quality is the alignment between business data requirements and the level of completeness, accuracy and availability of the data.
In projects that integrate data from different sources, the data quality of the source systems is usually seen as an uncontrollable constraint that the project must deal with.
Manage Data Quality Risk
At the beginning of the project, the data quality expert may challenge the intended use of the data source by the target processes. As the architecture and design are defined in detail, data quality risks must be addressed. By being involved early and throughout the process—including defining business requirements and architecture elements, designing business processes, and profiling data—a data quality professional can ensure that the proverbial dots stay connected from a data quality perspective.
Thus, many risks are not identified by specific project streams, but rather by the data quality project functions. A data quality risk matrix can be used to summarize the risk elements for each process. Data quality concerns and risks are raised during the data discovery workshops and data profiling activities.
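One way to summarize the risk elements for each process is a simple likelihood-times-impact score per risk, rolled up by process. The processes, data elements, and three-level scoring scale below are hypothetical, chosen only to show the shape of such a matrix:

```python
# Score each risk raised during data discovery or profiling, then roll
# the scores up by the target process they affect.
SCORES = {"low": 1, "medium": 2, "high": 3}

def risk_matrix(risks):
    """Summarize risks per process as (worst severity score, risk count)."""
    summary = {}
    for r in risks:
        severity = SCORES[r["likelihood"]] * SCORES[r["impact"]]
        proc = r["process"]
        worst, count = summary.get(proc, (0, 0))
        summary[proc] = (max(worst, severity), count + 1)
    return summary

risks = [
    {"process": "billing",   "element": "DOB",     "likelihood": "high",   "impact": "medium"},
    {"process": "billing",   "element": "address", "likelihood": "low",    "impact": "low"},
    {"process": "marketing", "element": "email",   "likelihood": "medium", "impact": "high"},
]
summary = risk_matrix(risks)
# billing: worst score 6 across 2 risks; marketing: worst score 6 across 1 risk
```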
Accepting the risk may be the right option if the data at risk represents only a small volume, if the impacts to the process are minimal, or if remediation of poor data quality would require undue effort. These improvement efforts are often outside the scope of the project, but the business case can allow the issue to be escalated to the executive level and clearly demonstrate the link between poor data quality and project benefits. The goal of data quality improvement projects can be a one-time data cleanup or cleanup over a period of time.
The cleanup usually comes with process improvements and changes to the systems themselves to strengthen data quality control. Some of the expected benefits of the marketing systems may be at risk because the customer DOBs managed in the billing system are only 60 percent complete. The identified cause may be that the DOB written in the client's subscription forms is not systematically entered into the billing system.
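A completeness figure like the 60 percent in this example comes from straightforward data profiling. A minimal sketch, with hypothetical field names:

```python
def completeness(rows, field):
    """Share of rows in which `field` is populated (non-null, non-blank)."""
    if not rows:
        return 0.0
    filled = sum(1 for r in rows if r.get(field) not in (None, ""))
    return filled / len(rows)

customers = [
    {"id": 1, "dob": "1980-04-02"},
    {"id": 2, "dob": ""},
    {"id": 3, "dob": "1975-11-30"},
    {"id": 4, "dob": None},
    {"id": 5, "dob": "1990-01-15"},
]
dob_completeness = completeness(customers, "dob")
# 3 of 5 rows populated: 0.6, i.e. 60 percent complete
```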
The recommendation may also include additional quality checks in the billing system (such as cross-validation with a driver's license) to validate the DOB. The recommendation may be to transfer the data with the current quality level and improve data quality in the target system, especially if the source system is to be decommissioned and the project timeline allows for addressing the risk at a later date. Activities in the remediation plan that are external to the project should be escalated to the enterprise data governance committee, which should act as a bridge between the projects and the various sectors or lines of business (LOBs) in the organization.
Implement Data Quality Metrics for Key Data
If the recommendations are accepted, who will pay for the effort – the business owner of the flawed data or the project sponsor? The recommendation may be to hire a temporary resource to manually review the paper files and enter the information into the billing system; to adjust the billing data entry process to include DOB; or to modify the billing system to make DOB mandatory when a new customer is added to the system. Data can be scrubbed to remove the DOB of 700-year-old customers or of customers who died before they were born.
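Scrubbing rules like these amount to simple plausibility checks. In this sketch the 120-year age cutoff, the fixed reference date, and the field names are assumptions for illustration:

```python
from datetime import date

def implausible_dob(rec, today=date(2024, 1, 1)):
    """Flag records with a DOB more than 120 years in the past
    (the '700-year-old customer') or a death date before birth."""
    dob = rec.get("dob")
    if dob is None:
        return False
    if (today.year - dob.year) > 120:
        return True
    died = rec.get("date_of_death")
    return died is not None and died < dob

customers = [
    {"id": 1, "dob": date(1985, 6, 1)},
    {"id": 2, "dob": date(1324, 6, 1)},   # a "700-year-old" customer
    {"id": 3, "dob": date(1970, 1, 1), "date_of_death": date(1965, 1, 1)},
]
clean = [c for c in customers if not implausible_dob(c)]
# Only customer 1 survives the scrub.
```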
You may need to postpone developing a business case to assess another potential risk; you may need to drop a business case early if you discover the risk is immaterial. When different data elements are incomplete or incorrect, they will have completely different consequences for the processes that use the data. Their lack of quality may go unnoticed throughout the data processes, but it does have an impact on the process results.
The number of key data elements is limited (the list is short enough for the company to take action); identifying measurable, manageable key data in this way makes the difference between simply talking about data quality and empowering users to improve it.
Implement Data Quality Metrics
Operational data quality metrics provide a window into the state of the data currently used by a process; common examples include the number of rejected records, the number of unmatched clients, and alerts raised when thresholds are crossed. Two of the main dimensions used for strategic data quality management are completeness and correctness. Other dimensions of data quality, such as availability, conformance, and security, are typically primary drivers for architecture and design rather than for strategic data quality management.
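Operational metrics like these can be computed for each data load. The record layout, the notion of a matched-client set, and the 5 percent rejection threshold below are assumptions for illustration:

```python
def load_metrics(batch, matched_ids, rejected_threshold=0.05):
    """Compute common operational data quality metrics for one load:
    rejected-record count, unmatched-client count, and a threshold alert."""
    rejected = sum(1 for rec in batch if rec.get("status") == "rejected")
    unmatched = sum(1 for rec in batch if rec.get("client_id") not in matched_ids)
    return {
        "records": len(batch),
        "rejected": rejected,
        "unmatched_clients": unmatched,
        # Raise an alert when the rejection rate crosses the threshold.
        "alert": len(batch) > 0 and rejected / len(batch) > rejected_threshold,
    }

batch = [
    {"client_id": "C1", "status": "ok"},
    {"client_id": "C2", "status": "rejected"},
    {"client_id": "C9", "status": "ok"},
]
metrics = load_metrics(batch, matched_ids={"C1", "C2"})
# 1 rejection of 3 records (33% > 5%) raises the alert; C9 is unmatched.
```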
Implementing data quality metrics is a challenge (just like all other aspects of data quality management). Publishing the data quality metrics in an enterprise data quality dashboard is the true leverage of data quality metrics. Strategic data quality metrics are not used to monitor data quality; they are used to reinforce responsibilities.
Each metric should link to an action plan (or expose the lack of one). If your organization does not have an enterprise data quality dashboard, integration projects provide a good opportunity to define and implement one. A data quality repository can be designed to store the measurements and feed the dashboard. As new projects implement new processes and new metrics, they will benefit from the data quality repository and enrich the dashboard with new processes that are critical to the organization.
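Such a repository can start as a single table of metric observations that the dashboard queries by process, metric, and date. This SQLite sketch is one possible shape; the schema and names are hypothetical:

```python
import sqlite3

# A minimal data quality repository: one table of metric observations.
conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE dq_metrics (
        process  TEXT NOT NULL,
        metric   TEXT NOT NULL,
        measured TEXT NOT NULL,   -- ISO date of the measurement
        value    REAL NOT NULL
    )
""")

def record_metric(process, metric, measured, value):
    conn.execute("INSERT INTO dq_metrics VALUES (?, ?, ?, ?)",
                 (process, metric, measured, value))

# New projects add their own processes and metrics to the same store.
record_metric("billing", "dob_completeness", "2024-01-01", 0.60)
record_metric("billing", "dob_completeness", "2024-02-01", 0.72)

# The dashboard reads the latest observation per process and metric.
latest = conn.execute(
    "SELECT value FROM dq_metrics WHERE process='billing' "
    "AND metric='dob_completeness' ORDER BY measured DESC LIMIT 1"
).fetchone()[0]
```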
Managing data quality in projects is challenging, but the following key factors will ensure successful data quality management. Responsibilities for delivering data quality will not automatically arise from the projects or the LOBs. Data quality management should be involved from the very beginning of projects to ensure that data quality requirements are part of the business requirements, to position the data quality flow, to ensure adequate funding for data quality and to identify and address data quality risks.
Define KPIs that will highlight the project's effectiveness in managing data quality risks (e.g., number of data quality risks identified, number of business cases presented, number of business cases approved). The urgency to identify and address data quality risks in the early stages of a project will be met with resistance. The most common objection is, "How can the project address data quality risks before the target processes are defined and well designed?" The main data quality risks, however, are not related to detailed architecture and design.