Organizational Agility Hampered by Data Quality
A recent Insurance Networking News article by Nathan Golia (@NathanGolia) references a West Monroe Partners study on data analytics in insurance companies that found "two-thirds of the 122 respondents to the survey said data quality and accuracy was the greatest challenge to advanced analytics."
But when done right, quality data analytics can lead to "increased engagement, improved production by existing advisors and reduced attrition."
We have heard similar complaints from customers looking for one source of operating data that everyone in the company agrees is the real, trusted number with which to plan and measure performance. Without this definitive information, people spend a lot of time arguing about the validity of numbers, the impact of a challenge or opportunity, and the cause of issues and breaks in processes.
“One version of the truth” is hard to come by because of:
- Organizational silos, each with different tools, methodologies and metrics
- Disparate systems with little real-time data
- A status quo mentality
The following two case studies demonstrate how organizations in different industries broke through these barriers to create a standardized framework and one acknowledged version of the truth, improving operational agility and performance.
An HR services outsourcer struggled with managing seasonal peaks, resorting to excessive overtime and temporary help to manage demand. It wanted the agility to move resources between the contact center and back office as customer demand fluctuated. However, in order to share resources, it needed a single source of data to get insight into the types and volume of work being handled, forecasted demand and true capacity needs.
The outsourcer implemented a Verint Enterprise Workforce Management solution across functions, which increased forecasting accuracy and enabled the company to handle a greater volume of work items for its clients' customers with the same number of people—saving one of its clients $100,000. Verint Enterprise Performance Management scorecards enabled it to measure front- and back-office productivity effectively, even when an individual switched between functions. The company could redeploy resources based on the true needs of the work—and still track key measures such as quality, adherence, productivity and customer survey scores.
The outsourcer itself saved $300,000 in overtime in the first year and, as a side benefit, was able to generate cost-to-serve reporting by customer and product line.
A large insurance company had created a shared services group which was charged with developing a global, standardized performance management framework and consistent metrics that would help it break down silos between its many companies, functions and locations. The insurer also turned to the employee desktop as the source of trusted data and implemented Verint Desktop and Process Analytics and enterprise performance management. The insurer subsequently experienced greater processing consistency and productivity.
Specifically, the insurer has seen a 36 percent increase in time spent in production applications and a dramatic decrease in time spent on non-production-related activities, saving it over US$9 million.
Ultimately, the HR outsourcer, insurance company, and numerous other organizations have chosen the employee desktop as the trusted source of data for their operational and performance management reporting. The standardized performance management framework across functions provides management and employees with the data, tools and disciplines needed to be successful in their roles, break down silos, improve operational agility, and meet organizational goals.