Built-In Quality Management Reports and Their Types

Contact centers have always been statistics-driven organizations where almost every aspect of both the agent’s and the contact center’s performance is measured against various KPIs. Quality management also generates its own statistics. Review forms are designed to produce statistical outputs.

This data:

  1. Can reveal trends over time.

  2. Allows one agent to be compared with another, with the group total, or with a specific target.

  3. Can establish correlations between two different phenomena, pointing to a possible relationship between them.

Reporting is therefore an essential part of data-driven decision making, which is at the heart of best practices in contact center management. The importance of this data cannot be overstated, as only quality management processes, and the technology used to facilitate them, can deliver insight into how agents behave when communicating with customers.

Quality Management can create various types of reports:

  1. The most basic function of contact center reporting is to give management an overview of individual and group performance over time.

    • Compare Teams: This report allows the user to select a questionnaire and look at the performance of teams or agents over a period of time to identify who is improving, who is consistent, and who is getting worse.

    • Team and Agent Skills: This report allows the user to select a team and an agent, and review the results for different questionnaires and questions over a period of time in graphical and statistical form. A team leader, for example, can use this report to identify where an agent needs to improve his or her performance and then develop appropriate coaching and training.

  2. The report listed below provides an overview of how multiple reviewers scored a single agent. It visually depicts how reviewers vary in their scoring of specific questionnaires, so that it can be quickly determined whether the review team is scoring consistently.

    • Compare Reviewers: This report allows the user to compare how two or more reviewers have reviewed an agent over a period of time. The graphical and spreadsheet data compare scores received from different reviewers at the question section level. This can be used either as a very basic calibration tool or as a way to investigate accusations of bias and prejudice. 

  3. Report outputs can be useful for both modeling and implementing bonus schemes. For example, a League Table report, which calculates and displays average scores for each agent over a certain period, can be exported and sent straight to payroll (see the sketch after this list).

    • League Table: This report allows the user to get a snapshot view of an agent's standing in terms of average scores over a user-defined period of time. 

  4. The report listed below provides an overview of review volume:

    • Review Volume: This report provides an overview of the number of reviews performed during the selected period of time. It can display the total number of reviews, as well as planned, in-progress, and finished reviews separately.
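
As an illustration of the League Table calculation described above, here is a minimal Python sketch. It assumes reviews are available as simple records with an agent, a score, and a Period To date; all names and data here are hypothetical and do not represent the product's API.

```python
from collections import defaultdict
from datetime import date
import csv

# Hypothetical review records; in practice these would come from the
# quality management system's data or exports.
reviews = [
    {"agent": "Alice", "score": 92.0, "period_to": date(2024, 3, 10)},
    {"agent": "Alice", "score": 88.0, "period_to": date(2024, 3, 24)},
    {"agent": "Bob",   "score": 75.0, "period_to": date(2024, 3, 12)},
]

def league_table(reviews, start, end):
    """Average score per agent over a user-defined period, ranked descending."""
    totals = defaultdict(lambda: [0.0, 0])  # agent -> [sum of scores, count]
    for r in reviews:
        if start <= r["period_to"] <= end:
            totals[r["agent"]][0] += r["score"]
            totals[r["agent"]][1] += 1
    table = [(agent, s / n) for agent, (s, n) in totals.items()]
    return sorted(table, key=lambda row: row[1], reverse=True)

# Export the ranking to CSV, e.g. for a payroll or bonus-scheme workflow.
with open("league_table.csv", "w", newline="") as f:
    writer = csv.writer(f)
    writer.writerow(["agent", "average_score_pct"])
    for agent, avg in league_table(reviews, date(2024, 3, 1), date(2024, 3, 31)):
        writer.writerow([agent, f"{avg:.1f}"])
```

Exporting the resulting ranking as CSV mirrors the "exported and sent straight to payroll" workflow mentioned above.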

In summary:

  • Quality reporting measures trends over time and enables comparison between agents and groups.

  • It can be used to establish correlations between multiple phenomena within the contact center; review data in reports can be correlated against other data to investigate and address operational issues (see the sketch below).

  • Dynamic reports allow users to click to “drill down” and “zoom out” to various levels.

  • Reports can be used as a basis for coaching sessions that encourage agents to think strategically.

  • The reporting system allows the reviewer's progress to be tracked against the review plan and ensures adherence to targets.

  • Review forms determine the “shape” of the data reported.

  • Reports only show data for active (not deleted) users; Reviews, however, show all reviews for deleted users as well.
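
To illustrate the kind of correlation analysis mentioned above, the following sketch computes Pearson's r between two per-agent metrics. The metrics and figures are hypothetical examples, not data produced by the product.

```python
from statistics import correlation  # available in Python 3.10+

# Hypothetical per-agent figures: average review score (%) from quality
# reports, and average handle time (seconds) from another system.
scores       = [92.0, 88.0, 75.0, 81.0, 95.0]
handle_times = [210,  240,  320,  280,  190]

# Pearson's r: values near -1 or +1 suggest a linear relationship,
# values near 0 suggest none. Correlation alone does not prove causation.
r = correlation(scores, handle_times)
print(f"Pearson r between quality score and handle time: {r:.2f}")
```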

Please note that:

  • results presented in Built-in reports are always shown as a percentage (results in points are also converted to a percentage; see the worked example after this list)

  • the scale on which results are presented has a range from 0 to 100%

  • Built-in reports do not support the calculation of NPS® (this feature is, however, supported by ZPA reports and the NPS widget on the dashboard)
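
As a worked example of the points-to-percentage conversion noted above, here is a minimal sketch; the 40-point questionnaire maximum is an assumption for illustration only.

```python
def to_percentage(earned_points: float, max_points: float) -> float:
    """Convert a points result to the 0-100% scale used by Built-in reports."""
    return 100.0 * earned_points / max_points

# E.g. a hypothetical 34 points on a 40-point questionnaire is shown as 85%.
print(to_percentage(34, 40))  # 85.0
```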

The way an agent's name is displayed in reports (except Compare Teams, Compare Reviewers and League Table) can be set from within the Web UI. Go to Administration > Preferences and select one of the options under the setting Agent Names in reports:

  • Firstname Lastname

  • Firstname Lastname (login)

  • Lastname, Firstname

  • Lastname, Firstname (login)

For more information see the Quality Management Preferences section.
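
The following sketch shows how the four display styles render a sample name; the function and field names are illustrative only, not the product's internal code.

```python
def format_agent_name(first: str, last: str, login: str, style: str) -> str:
    """Render an agent's name in one of the four display styles."""
    styles = {
        "Firstname Lastname":          f"{first} {last}",
        "Firstname Lastname (login)":  f"{first} {last} ({login})",
        "Lastname, Firstname":         f"{last}, {first}",
        "Lastname, Firstname (login)": f"{last}, {first} ({login})",
    }
    return styles[style]

print(format_agent_name("Jane", "Doe", "jdoe", "Lastname, Firstname (login)"))
# Doe, Jane (jdoe)
```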

The time used when calculating reports is not shifted according to the current user's time zone.
Period To, as defined in the review criteria, is used as the decisive date that determines when a review score is shown in a report or chart.
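
A minimal sketch of how these two rules interact, assuming each review carries a Period To timestamp; the field names and data are hypothetical.

```python
from datetime import datetime

# Timestamps are compared exactly as stored; no time zone shifting.
reviews = [
    {"agent": "Alice", "score": 92.0, "period_to": datetime(2024, 3, 31, 23, 0)},
    {"agent": "Bob",   "score": 75.0, "period_to": datetime(2024, 4, 1, 1, 0)},
]

report_from = datetime(2024, 3, 1)
report_to   = datetime(2024, 3, 31, 23, 59, 59)

# A score appears in the March report only if its Period To falls in the
# window: Alice's review is included, Bob's lands in the April report.
included = [r for r in reviews if report_from <= r["period_to"] <= report_to]
print([r["agent"] for r in included])  # ['Alice']
```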

“Net Promoter, NPS, and the NPS-related emoticons are registered U.S. trademarks, and Net Promoter Score and Net Promoter System are service marks, of Bain & Company, Inc., NICE Systems, Inc. and Fred Reichheld.”
