
4.24.7  Excise Embedded Quality Review System (EQRS) and Excise National Quality Review System (NQRS)

4.24.7.1  (08-13-2008)
Overview of Excise Quality Review Program

  1. Embedded Quality (EQ) is a process tool to assist managers in identifying opportunities to build skills and enhance strengths in their employees’ individual performance.

  2. The Embedded Quality system is an automated quality review system that allows Reviewers to rate case quality using quality standards called "attributes." The use of similar attributes for both managerial reviews, on the Embedded Quality Review System (EQRS), and national reviews, on the National Quality Review System (NQRS), provides consistency and comparability.

  3. EQRS is used by managers to document all case-related reviews of employees, including case reviews, 637 compliance reviews, technical time (4502) reviews, on-the-job visitations, and workload reviews.

  4. Attributes in EQRS are linked to Critical Job Elements (CJEs) to assist managers in performance evaluation. Excise functional guidance for EQRS may be found on the Embedded Quality web site located at http://sbse.web.irs.gov/EQ/functionalguidance/specialty/spexcise.htm.

  5. NQRS is used by Excise National Reviewers to review sample cases. Cases are selected using a statistically valid sampling method so that results will be representative of the entire Excise case universe.

  6. NQRS data is used by management to assess program performance and to identify training and educational needs to improve work processes.

4.24.7.2  (08-13-2008)
Components of Quality Reviews

  1. All of the topics referred to below apply equally to NQRS and EQRS reviews. The term "Reviewer" refers to:

    1. the manager on the EQRS system, or

    2. the Revenue Agent Reviewer on the NQRS system.

  2. The Embedded Quality review system consists of attributes, a scoring system, key elements, case selection, case review procedures, and reports.

4.24.7.2.1  (08-13-2008)
Attributes

  1. An attribute is a specific aspect of case work that the review will measure. "Attribute" is synonymous with "quality standard." Attributes compare to, and replace, the quality standards formerly used by the IRS. Refer to the Embedded Quality web site above for a listing of these attributes.

  2. Attributes are concise statements of SB/SE Excise expectations for quality examinations and compliance reviews, and are guidelines to assist examiners in fulfilling their professional responsibilities. Each attribute is defined by elements representing components that are present in quality examinations and compliance reviews.

  3. Attributes provide objective criteria against which case quality is assessed. Reviewers are responsible for reviewing Excise cases for quality based on the following four quality measurement categories:

    1. Timeliness (Efficient issue resolution using workload management and time utilization techniques),

    2. Professionalism (Effective communication techniques),

    3. Regulatory Accuracy (Adhering to statutory/regulatory process requirements), and

    4. Procedural Accuracy (Non-statutory/regulatory internal process requirements).

  4. Attributes directly link to the examiner’s Critical Job Elements (CJEs) in EQRS.

  5. Attributes are divided into five categories for ease of data input, as follows (see the example at the end of this subsection):

    1. Exam Planning (100s),

    2. Investigative/Examination Techniques (400s),

    3. Timeliness (500s),

    4. Customer Relations/Professionalism (600s), and

    5. Documentation/Reports (700s).

  6. The Embedded Quality web site http://sbse.web.irs.gov/eq contains reference materials to aid the Reviewer in rating cases and the employee in understanding the evaluation system. These reference materials include:

    1. the Excise Embedded Quality Job Aid, which contains attribute definitions, available reason codes, points to consider, case examples, IRM references, Critical Job Element cites and references to Excise Lead Sheets and Templates,

    2. CJE to Attribute Crosswalks, which match the attributes to the Critical Job Elements for Excise Revenue Agents, and

    3. the Excise manager review mapping guide.
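
    Example:

    A minimal sketch, for illustration only, of how the five input categories map to attribute number series. The category names and number series come from the list in paragraph (5) above; the Python function and variable names are assumptions, not part of the EQ system.

        # Hypothetical lookup from an attribute number to its input category,
        # based on the five series listed above (100s, 400s, 500s, 600s, 700s).
        ATTRIBUTE_CATEGORIES = {
            1: "Exam Planning",
            4: "Investigative/Examination Techniques",
            5: "Timeliness",
            6: "Customer Relations/Professionalism",
            7: "Documentation/Reports",
        }

        def category_for(attribute_number: int) -> str:
            """Return the input category for an attribute number such as 612."""
            return ATTRIBUTE_CATEGORIES.get(attribute_number // 100, "Unknown")

        # category_for(612) returns "Customer Relations/Professionalism".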

4.24.7.2.2  (08-13-2008)
Scoring System

  1. Each attribute and aspect has been worded as a question. This format minimizes the need for interpretation. Each question is scored as:

    1. a "Yes" meaning met,

    2. a "No" meaning not met, or

    3. an "N/A" meaning not applicable.

  2. This scoring system provides flexibility for Reviewers and provides equal rating for all attributes.

  3. The quality score is computed as the percentage of total "Yes" ratings divided by the total of "Yes" and "No" ratings; "N/A" ratings do not enter the computation. A total score of 100 is possible for each case. (See the example at the end of this subsection.)

  4. The overall quality score for Excise is reported as the Business Results (Quality) measure for SB/SE.
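
    Example:

    A minimal sketch of the score computation described in paragraph (3) above: the percentage of "Yes" ratings over all "Yes" and "No" ratings, with "N/A" ratings excluded. The function name and data layout are assumptions for illustration, not part of the EQ system.

        # Hypothetical computation of the quality score for one case.
        def quality_score(ratings: dict) -> float:
            """ratings maps attribute numbers to "Yes", "No", or "N/A"."""
            yes = sum(1 for r in ratings.values() if r == "Yes")
            no = sum(1 for r in ratings.values() if r == "No")
            if yes + no == 0:
                return 0.0  # nothing was rated, so there is no score to compute
            return 100 * yes / (yes + no)

        # A case rated {"409": "Yes", "508": "No", "612": "Yes", "707": "N/A"}
        # scores 100 * 2 / 3, or about 66.7; a case with no "No" ratings scores 100.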

4.24.7.2.3  (08-13-2008)
NQRS Case Selection

  1. The National Quality Manager, Specialty Programs, will work closely with Cincinnati Submission Processing Center Post Processing Operations personnel to obtain the necessary inventory of closed (status 90) Excise Tax cases for review.

4.24.7.2.4  (08-13-2008)
Case Review Procedures

  1. The purpose of the National Excise quality review process is to provide management with measures of how well work processes are carried out and to identify processes that need to be improved.

  2. It is important that Reviewers have an in-depth understanding of the EQ attributes and process measures, as well as a good understanding of Excise law, requirements, and procedures. Furthermore, it is important that the reviews be complete, accurate, and consistent to promote a true and valid sampling that management can rely upon to improve examination processes. See IRM 4.24.7.6, Consistency of Reviews.

  3. Case review results are entered into a web-based Data Collection Instrument (DCI) on the Quality Review System (NQRS & EQRS). See IRM 4.24.7.5.1 for DCI input procedures.

4.24.7.3  (08-13-2008)
Reports

  1. Once the review is completed, the results are included in the data from which the Embedded Quality system generates various reports for NQRS & EQRS. For further guidance, refer to the reports section in the EQRS training course #19054 at http://core.publish.no.irs.gov/trngpubs/pdf/20563f06.pdf.

  2. The database includes basic pro forma reports designed to assist in the analysis of data collected through case reviews. These reports include:

    • Employee Reports,

    • Organizational Reports,

    • Attribute Narrative Report, and

    • Ad Hoc Reports.

4.24.7.4  (08-13-2008)
Embedded Quality Job Aid

  1. An Embedded Quality Job Aid has been developed to be used as a guide in completing the NQRS & EQRS database input screens.

  2. The Job Aid provides:

    • operational and job aid definitions,

    • points to consider,

    • reason codes (if available),

    • rating guide explanations,

    • attribute examples,

    • examination lead sheet and template references,

    • IRM references, and

    • CJE links for each attribute in the guide.

  3. For help in reviewing cases, Reviewers may access electronic research portals and reference materials on the Embedded Quality (EQ) home page. Available reference materials include the Excise Job Aid, Attribute Reference Guides and Crosswalks, as well as links to the IRM.

4.24.7.5  (08-13-2008)
Quality Review Database

  1. Case reviews are input into the web-based system known as the Quality Review System.

  2. Each case review is input using a Data Collection Instrument (DCI). The DCI guides the review process and documents reviewer comments. This documentation can be extracted from the system through statistical and informational analysis reports.

  3. The database is accessed through the EQ home page.

  4. Reviewers must complete an On-line Form 5081 to obtain a password to access the system.

4.24.7.5.1  (08-13-2008)
DCI Input Procedures

  1. The DCI provides the principal documentation for the Reviewer's case evaluation and conclusions. A DCI will be completed for each case reviewed in the Embedded Quality system.

  2. The first input section of the DCI is the header data, which captures basic information about a closed case: the "Who, What, When, Where and Why." This information is categorized into four groupings:

    1. review information, which captures key data about the review itself,

    2. case information, which captures basic case data,

    3. process measures, which capture the case actions taken by the examiner, and

    4. special use, which captures special tracking for local or national purposes.

    Note:

    Header fields in bold are mandatory and must be filled in before the DCI can be completed.

  3. The next step in the case review process is the evaluation and coding of the attributes. Reviewers evaluate and code attributes that are specific to a Specialized Product Review Group (SPRG). Excise has three SPRGs: Excise General, Excise 637, and FCO.

  4. Attributes on the DCI are either required or optional. If the review is "Evaluative," all required attributes (in bold) must be rated "Yes" or "No" in order for the DCI to be recorded as complete.

  5. If the review is "Evaluative Targeted," the Reviewer may rate any number of attributes, from none to all. Attribute fields should be completed whenever information is available, to ensure an accurate quality case review.

  6. Once the attributes have been rated, at least one reason code must be selected for each attribute rated as not met ("No"). The reason code that best applies should be selected. If none apply, "Other" may be selected and a narrative should be entered describing the reason.

  7. In addition to selecting a reason code on all attributes rated "No," the Reviewer is required to write a narrative in the attribute narrative box specifically describing why the case failed that attribute. Positive comments may also be entered for attributes that have been met, and are encouraged in order to provide a complete and balanced case review.

  8. Reviewers need to be thorough in documenting case performance by providing clear, concise, and specific descriptions of any errors in order to identify problem areas. Positive actions should also be described in specifics. Citing IRM or other references will add validity to the narratives. Narrative remarks that are too general may not offer sufficient detail to allow either specific recommendations for improvement or recognition of good performance.

  9. Once the headers, attributes, and narratives are completed, the Reviewer should select the "Complete Review" screen. At this point, the EQ system notifies the Reviewer of any required items needed to complete the case review and/or suggests recommended items. The Reviewer will not be able to complete the case review until all required input is addressed. (These checks are illustrated in the example below.)
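
    Example:

    A minimal sketch of the completion checks described in paragraphs (2) through (9) above: mandatory header fields must be filled in, required attributes must be rated "Yes" or "No" on an "Evaluative" review, and every "No" rating needs at least one reason code and a narrative. The data layout, field names, and function are assumptions for illustration, not the NQRS/EQRS software.

        # Hypothetical validation run when the Reviewer selects "Complete Review".
        def completion_errors(dci: dict) -> list:
            errors = []
            # Mandatory (bold) header fields must be completed.
            for field in dci["mandatory_headers"]:
                if not dci["headers"].get(field):
                    errors.append("Header %s: mandatory field missing" % field)
            for attr in dci["attributes"]:
                rating = attr.get("rating")
                # Required attributes need a "Yes"/"No" rating on evaluative reviews.
                if (dci["review_type"] == "Evaluative" and attr["required"]
                        and rating not in ("Yes", "No")):
                    errors.append("Attribute %s: rating required" % attr["number"])
                # Every "No" rating needs a reason code and an explanatory narrative.
                if rating == "No":
                    if not attr.get("reason_codes"):
                        errors.append("Attribute %s: reason code required" % attr["number"])
                    if not attr.get("narrative"):
                        errors.append("Attribute %s: narrative required" % attr["number"])
            return errors  # an empty list means the review can be completed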

4.24.7.6  (08-13-2008)
Consistency of Reviews

  1. Reviewers must strive to be consistent over time in rating similar case actions on all reviews.

  2. Incomplete or inaccurate quality reviews can lead to a distortion of Business Results - Quality Performance. Furthermore, general narrative remarks may not provide sufficient detail to allow for meaningful drivers for improvement.

  3. Guidelines for consistent reviews should cover the following items:

    1. Rate all attributes that apply to the case being reviewed.

    2. Rate good performance as well as errors/opportunities for improvement.

    3. There is no need to rate actions that are not applicable to the case being reviewed; the system will automatically rate them as "N/A" .

    4. Multiple reason codes may be selected for multiple errors.

    5. Input specific narrative comments for both positive and negative actions to ensure more accurate reviews.

4.24.7.7  (08-13-2008)
Consistency Checks

  1. Consistency checks of quality reviews can be performed in several ways:

    1. Reviewers can independently evaluate the same case. Subsequent comparison of the ratings and discussion can reveal inconsistent treatment.

    2. Available management information reports can be used to evaluate consistency.

    3. To facilitate consistency, discussions can be held at meetings to discuss specific elements and case scenarios to develop consensus.

  2. Consistency checks of EQRS will be done by Territory Managers on a regular basis; the Excise Quality Coordinator can also assist in this process. Consistency checks of NQRS Reviewers will be done by the NQRS Manager. In both cases, Reviewers can independently evaluate the same case, and subsequent comparison and discussion of ratings can reveal inconsistent treatment (see the example below).
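
    Example:

    A minimal sketch of the independent-review comparison described above: two Reviewers rate the same case, and the attributes on which their ratings differ are surfaced for the consistency discussion. The function and data layout are assumptions for illustration.

        # Hypothetical comparison of two Reviewers' ratings of the same case.
        def rating_disagreements(reviewer_a: dict, reviewer_b: dict) -> dict:
            """Each argument maps attribute numbers to "Yes", "No", or "N/A"."""
            return {
                attr: (reviewer_a[attr], reviewer_b[attr])
                for attr in reviewer_a.keys() & reviewer_b.keys()
                if reviewer_a[attr] != reviewer_b[attr]
            }

        # A result of {"409": ("Yes", "No")} flags attribute 409 for discussion.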

4.24.7.8  (08-13-2008)
Management Use of Quality Data

  1. Embedded Quality data will be used by management to assess program quality performance as well as to identify training and education needs.

  2. NQRS data will not be used to evaluate individual employee performance. This is the role of the EQRS, where attributes are linked to Critical Job Elements (CJEs).

4.24.7.9  (08-13-2008)
EQRS Responsibilities

  1. The case review data is used by managers to document all case-related reviews of employees, including case reviews, compliance reviews, 4502 reviews, workload reviews, on-the-job visitations, and daily case reviews.

  2. Attributes in EQRS are tied to Critical Job Elements (CJEs) to assist managers in performance evaluation. Managers should provide the individual feedback report to their employees in a timely manner.

4.24.7.10  (08-13-2008)
NQRS Responsibilities

  1. The NQRS program is administered within the Office of the Director, Specialty Programs.

  2. Excise National Reviewers, supervised by a section supervisor, conduct reviews of both examinations and compliance review cases and provide data to Excise Headquarters at the National and Territory level on a quarterly basis.

  3. To accomplish these objectives, Excise National Reviewer responsibilities include:

    • timely completion of NQRS case reviews,

    • accurate and consistent application of the attributes,

    • accuracy of NQRS database input, and

    • timely delivery of NQRS data to Excise Headquarters.

