
4.26.18  Embedded Quality

4.26.18.1  (07-01-2007)
Introduction to Embedded Quality

  1. Embedded Quality is an automated quality review system which allows reviewers to rate case quality using quality standards called "attributes." The use of the same attributes for both managerial (EQRS) and national (NQRS) reviews provides consistency and comparability.

  2. The Embedded Quality Review System (EQRS) is used by managers to document all case related reviews of employees, including closed case reviews, 4502 reviews, workload reviews, etc. Attributes in EQRS are linked to Critical Job Elements (CJEs) to assist managers in performance evaluation. Procedures for EQRS may be found in the BSA Manager's Tool Kit and in BSA EQRS training course #19054.

  3. The National Quality Review System (NQRS) is used by BSA national reviewers to review both sample and mandatory review cases. Sample cases are collected using a statistically valid sample so that results will be representative of the entire BSA case universe.

  4. NQRS data is used by management to assess program performance and to identify training and educational needs to improve work processes.

4.26.18.2  (07-01-2007)
NQRS Responsibilities

  1. The NQRS program is administered within the Workload Identification, Selection, Delivery and Monitoring (WISDM) section of BSA Policy.

  2. BSA national reviewers, supervised by a WISDM section supervisor, conduct reviews of both Title 31 and Form 8300 cases and provide data to BSA Headquarters on at least a quarterly basis.

  3. To accomplish these objectives, BSA national reviewer responsibilities include:

    • Timely completion of NQRS case reviews

    • Accurate and consistent application of the attributes

    • Accuracy of NQRS database input, and

    • Timely delivery of NQRS data to BSA Headquarters.

4.26.18.3  (07-01-2007)
Attributes

  1. An attribute may be defined as a specific aspect of case work that the review will measure. "Attribute" is synonymous with "quality standard." Attributes compare to, and replace, the quality standards formerly used by SB/SE.

  2. Attributes are concise statements of SB/SE BSA's expectations for quality examinations/reviews and are guidelines to assist examiners in fulfilling their professional responsibilities. Each attribute is defined by elements representing components that are present in a quality examination/review.

  3. Attributes provide objective criteria against which case quality is assessed and are grouped into the following five quality measurement categories:

    • Timeliness

    • Professionalism

    • Regulatory accuracy

    • Procedural accuracy, and

    • Customer accuracy.

  4. Attributes directly link to the examiner's critical job elements (CJEs) in EQRS.

  5. Attributes are divided into six categories for ease of data input, as follows:

    • Exam planning (100s)

    • Investigative/examination techniques (400s)

    • Timeliness (500s)

    • Customer relations/professionalism (600s)

    • Documentation/reports (700s), and

    • Customer accuracy (800).

  6. The BSA website and the EQ Home page provide various reference materials to aid the reviewer in rating cases. These reference materials include:

    1. the BSA Embedded Quality Job Aid, which contains attribute definitions, available reason codes, points to consider, case examples, IRM references and Critical Job Element cites;

    2. the BSA NQRS Attribute Reference Guide, which has similar information to the Job Aid, condensed into chart format but adds references to BSA reengineering work papers; and

    3. CJE to Attribute Crosswalks, which match the attributes to the critical job elements for BSA Revenue Agent and BSA Compliance Officer positions.

4.26.18.4  (07-01-2007)
Scoring System

  1. Each attribute and aspect has been worded as a question. This format minimizes the need for interpretation and is scored as:

    • a "Yes" meaning met,

    • a "No" meaning not met, or

    • an "N/A" meaning not applicable.

  2. This scoring system provides maximum flexibility for reviewers, while emphasizing select key elements and aspects that represent priorities for a quality examination.
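The three-value scale above can be sketched in code. This is a hypothetical illustration, not the NQRS implementation; in particular, the assumption that "N/A" ratings are excluded from the denominator of a quality score is the author's, chosen because a not-applicable item should neither help nor hurt the result.

```python
from enum import Enum

class Rating(Enum):
    """Hypothetical three-value attribute rating scale."""
    YES = "Y"    # met
    NO = "N"     # not met
    NA = "N/A"   # not applicable

def quality_score(ratings):
    """Percentage of applicable attributes rated 'Yes'.

    N/A ratings are dropped from the denominator (an illustrative
    assumption), so they neither raise nor lower the score.
    Returns None if no attribute was applicable.
    """
    applicable = [r for r in ratings if r is not Rating.NA]
    if not applicable:
        return None
    met = sum(1 for r in applicable if r is Rating.YES)
    return 100.0 * met / len(applicable)
```

For example, a case rated Yes, No, N/A, Yes would score two met out of three applicable attributes.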

4.26.18.5  (07-01-2007)
Key Elements

  1. Key elements and aspects are those selected items that have been determined as a prerequisite to a quality case and are essential to IRS goals.

  2. Every attribute contains one or more key elements/aspects.

  3. If any one of the key elements/aspects of the attribute is rated "No," the entire attribute will be rated "Not Met."
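The key-element rule can be expressed as a small decision function. This is a sketch under one assumption, consistent with the definition of key elements as prerequisites: a single failed key element fails the whole attribute, while "N/A" key elements are ignored.

```python
def rate_attribute(key_element_ratings):
    """Rate one attribute from its key-element ratings.

    key_element_ratings: list of 'Y', 'N', or 'N/A' strings.
    Assumption (illustrative): key elements are prerequisites,
    so any applicable key element rated 'N' fails the attribute.
    """
    if any(r == "N" for r in key_element_ratings):
        return "Not Met"
    # 'N/A' key elements are ignored; all remaining ratings are 'Y'.
    return "Met"
```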

4.26.18.6  (07-01-2007)
Case Selection

  1. CTR Operations at the Detroit Computing Center (DCC) will select a valid sample of closures for NQRS review. The sample size is determined statistically to provide a valid sample at the National level. The sampling plan is determined from the number of closures per the BSA work plan.

  2. CTR Operations will select Title 31 cases and Form 8300 cases separately, using the statistically valid sample selection rate determined for that type of case.

  3. Survey cases will be included in the closures from which the sample is drawn.

  4. CTR Operations will send all selected cases to the NQRS Section Chief. The Section Chief will assign the cases to the reviewers.
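The separate Title 31 and Form 8300 draws described above amount to two independent simple random samples. The sketch below is illustrative only: the case IDs and sample sizes are made up, and the real sizes come from the statistical sampling plan based on work-plan closure counts.

```python
import random

def draw_nqrs_sample(closures, sample_size, seed=None):
    """Simple random sample, without replacement, from closed cases.

    A fixed seed makes the draw reproducible for this illustration;
    the actual selection method used by CTR Operations is not
    specified here.
    """
    return random.Random(seed).sample(closures, sample_size)

# Hypothetical closure populations, sampled separately by case type.
title31_closures = [f"T31-{i:04d}" for i in range(1200)]
form8300_closures = [f"F8300-{i:04d}" for i in range(800)]

title31_sample = draw_nqrs_sample(title31_closures, 60, seed=2007)
form8300_sample = draw_nqrs_sample(form8300_closures, 40, seed=2007)
```

Sampling each case type separately keeps the Title 31 and Title 26 (Form 8300) data distinct, which is also why reports are later generated separately.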

4.26.18.7  (07-01-2007)
Other Cases Subject to NQRS Review

  1. In addition to the closed case statistical sampling from CTR Operations, the following Title 31 cases are also subject to NQRS review as "Mandatory Review" cases:

    • Form 5104 penalty referral cases not accepted for referral to FinCEN,

    • Headquarters examination cases,

    • Special project cases as designated, and

    • In process cases, as necessary.

  2. In addition to the closed case statistical sampling from CTR Operations, the following Form 8300 cases are also subject to NQRS review as "Mandatory Review" cases:

    • Headquarters examination cases,

    • Special project cases as designated, and

    • In process cases, as necessary.

  3. The group manager sends mandatory review cases for review with a Special Handling Notice. The "Mandatory Review" block should be checked on the Closed Case Document.

4.26.18.8  (07-01-2007)
Case Review Procedures

  1. The first step in the Embedded Quality process is the review of the case. The purpose of the national BSA quality review process is to provide management with measures on how well work processes are carried out and how these processes can be improved.

  2. National reviewers are responsible for reviewing BSA cases for quality using five measurement categories:

    • Timeliness (Efficient issue resolution using workload management and time utilization techniques);

    • Professionalism (Effective communication techniques);

    • Regulatory accuracy (Adhering to statutory/regulatory process requirements);

    • Procedural accuracy (Non-statutory/regulatory internal process requirements); and

    • Customer accuracy (Correct and complete case resolution with no adverse impact on the customer).

  3. It is important that reviewers have an in-depth understanding of the EQ attributes and process measures, as well as a good understanding of BSA law, requirements and procedures. Furthermore, it is important that the reviews be complete, accurate and consistent to promote a true and valid sampling that management can rely upon to improve examination processes. See IRM 4.26.18.12, Consistency of Reviews.

  4. Closed case review results are entered into a web-based Data Collection Instrument (DCI) on the National Quality Review System (NQRS). See IRM 4.26.18.11 (below) for detailed input procedures.

  5. Generating Reports: Once the review is completed, the results are included in the raw data from which the EQ system generates reports. Reports will be generated separately for Title 31 and Form 8300 case reviews to ensure separation of the Title 31 and Title 26 data. See IRM 4.26.18.14 (below) for more on reports.

  6. For help in reviewing cases, reviewers may access electronic research portals and reference materials on the Embedded Quality (EQ) home page. Available reference materials include the BSA Job Aid, Attribute Reference Guides and Crosswalks, as well as links to the IRM. The EQ home page can be accessed at http://sbse.web.irs.gov/EQ/default.htm. Reviewers can also access the BSA web page for additional information.

4.26.18.9  (07-01-2007)
NQRS Reviewers Guide

  1. An Embedded Quality Job Aid has been developed to be used as a guide in completing the NQRS database input screens including headers and attributes.

  2. The Job Aid provides operational and job aid definitions, points to consider, reason codes (if available), rating guide explanations, attribute examples, IRM references and CJE links for each attribute in the guide.

4.26.18.10  (07-01-2007)
Introduction to NQRS Database

  1. Case reviews are input into the web-based system known as the National Quality Review System (NQRS).

  2. Each case review is input using a Data Collection Instrument (DCI). The DCI guides the review process and documents reviewer comments. This documentation can be extracted from the system through statistical and informational analysis reports.

  3. The NQRS database is accessed through the EQ home page at http://sbse.web.irs.gov/eq/default.htm.

  4. Reviewers must complete an On-line Form 5081 to obtain a password to access the NQRS system.

4.26.18.11  (07-01-2007)
Completion of Database Input

  1. The Data Collection Instrument (DCI) provides the principal documentation for the reviewer's case evaluation and conclusions. A DCI will be completed for each case reviewed in the NQRS system.

  2. Header input procedures: The first input section of the DCI is the entry of headers that capture basic information about a closed case, such as "Who, What, When, Where and Why." This information is categorized into four groupings:

    • Review information: considers important review data

    • Case information: considers basic case information

    • Process measures: considers case actions taken by the examiner

    • Special Use: considers special tracking for local or national purposes.

      Note:

      Header fields in bold are mandatory fields that must be completed in order to complete the DCI.

  3. Evaluating and Coding the Attributes: The next step in the case review process is the evaluation of the case by rating attributes that are specific to a Specialized Product Review Group (SPRG). (BSA is one of the SPRGs.) Attributes on the DCI are either required or optional, and all required attributes (in bold) must be rated "Yes" or "No" in order for the DCI to be recorded as complete. Other attribute fields should be completed if information is available to ensure an accurate quality case review.

  4. Attribute reason codes: Most of the attributes have reason codes. Once the attributes have been rated, at least one reason code must be selected (where reason codes are present) for each attribute rated as not met ("N"). The reason code that best applies should be selected. If none apply, "Other" may be selected and a narrative should be entered describing the reason.

  5. Reason Code Narratives: In addition to selecting a reason code on all attributes rated "N," the reviewer is required to write a narrative in the attribute narrative box specifically describing why a case failed a particular attribute. Positive comments may also be entered for attributes that have been met and are encouraged in order to provide a complete and balanced case review.

  6. Narrative Guidelines: NQRS reviewers need to be thorough in documenting case performance by providing clear, concise and specific descriptions of any errors in order to identify problem areas. Positive actions should also be described in specifics. Citing IRM or other references will add validity to the narratives. Narrative remarks that are too general may not offer sufficient detail to allow either specific recommendations for improvement or recognition of good performance.

  7. Completing the DCI: Once the headers, attributes and narratives are completed, the reviewer should select the "Complete Review" screen. At this point, the EQ system notifies the reviewer if there are any required items needed to complete the case review and/or provides suggestions of recommended items. The reviewer will not be able to complete the case review until all required input is addressed.
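The completeness checks described in paragraphs 2 through 7 can be sketched as a single validation pass. Everything here is hypothetical: the field names, the attribute dictionary shape, and the message wording are illustrative, not the actual NQRS schema; only the rules themselves (mandatory headers filled, required attributes rated "Y" or "N", and a reason code plus narrative for every "N") come from the text above.

```python
def validate_dci(headers, required_headers, attributes):
    """Return the list of problems blocking 'Complete Review' (empty if none).

    headers: dict of header field name -> entered value
    required_headers: names of the mandatory (bold) header fields
    attributes: dict of attribute id -> dict with keys
        'rating' ('Y'/'N'/'N/A'), 'required' (bool),
        'reason_codes' (list of str), 'narrative' (str)
    """
    problems = []
    # Mandatory header fields must be completed.
    for field in required_headers:
        if not headers.get(field):
            problems.append(f"mandatory header missing: {field}")
    for attr_id, attr in attributes.items():
        # Required attributes must be rated 'Yes' or 'No', never left N/A.
        if attr.get("required") and attr.get("rating") not in ("Y", "N"):
            problems.append(f"required attribute {attr_id} must be rated Y or N")
        # Every 'N' rating needs at least one reason code and a narrative.
        if attr.get("rating") == "N":
            if not attr.get("reason_codes"):
                problems.append(f"attribute {attr_id}: select at least one reason code")
            if not attr.get("narrative"):
                problems.append(f"attribute {attr_id}: narrative required for 'N' rating")
    return problems
```

A DCI with a blank mandatory header and an unexplained "N" rating would surface three problems (missing header, missing reason code, missing narrative), mirroring the prompts the EQ system gives before allowing the review to complete.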

4.26.18.12  (07-01-2007)
Consistency of Reviews

  1. Since the attribute ratings in NQRS form the basis of quality performance measurement for BSA, it is important that attribute ratings are complete and accurate. Reviewers must strive to be consistent over time in rating similar case actions on all reviews.

  2. Incomplete or inaccurate national quality reviews can lead to a distortion of Business Results - Quality Performance. Furthermore, general narrative remarks may not provide sufficient detail to allow for meaningful drivers for improvement.

  3. Guidelines for consistent reviews:

    • Rate all attributes that apply to the case being reviewed.

    • Rate good performance as well as errors/opportunities for improvement.

    • There is no need to rate actions that are not applicable to the case being reviewed; the system will automatically rate them as "N/A".

    • Multiple reason codes may be selected for multiple errors.

    • Input specific narrative comments for both positive and negative actions to ensure more accurate reviews.

4.26.18.13  (07-01-2007)
Consistency Checks

  1. Consistency checks of national reviews can be performed in several ways:

    1. Reviewers can independently evaluate the same case. Subsequent comparison of the ratings and discussion can reveal inconsistent treatment.

    2. The NQRS Section Chief can critique completed case reviews on a regular basis and provide feedback to reinforce expectations of the reviews. Also, available management information reports can be used to evaluate consistency.

    3. To facilitate consistency between NQRS and EQRS, reviewers can lead discussions at group meetings to discuss specific elements and case scenarios to develop consensus.

  2. Consistency checks of EQRS will be done by Territory Managers on a regular basis. National reviewers may be called in to assist in this process.

4.26.18.14  (07-01-2007)
NQRS Reports

  1. The NQRS database includes basic pro forma reports designed to assist in the analysis of data collected through case reviews. These reports include:

    • DCI Reports (Shows attribute scores and narratives);

    • Organizational Reports (Embedded Quality Score, Interim Quality Score, Measurement Category Score, Attribute Results, Top Ten Defects/Successes and Customer Drivers);

    • Attribute Narrative Report; and

    • Ad Hoc Reports.

  2. Reports are generated separately for Title 31 and Form 8300 case work.

  3. NQRS reports can be generated periodically, as needed. At a minimum, these reports will be generated and analyzed on a quarterly basis.

  4. The NQRS Reviewer's Guide should be used when interpreting or analyzing NQRS reports. The definitions of attributes and process measures are narrow and rules for ratings are specific to the individual elements.

4.26.18.15  (07-01-2007)
Management Use of NQRS Data

  1. NQRS data will be used by management to assess program quality performance as well as to identify training and education needs.

  2. NQRS data will not be used to evaluate individual employee performance. This is the role of the EQRS, where attributes are linked to Critical Job Elements (CJEs).

4.26.18.16  (07-01-2007)
Disposition of Cases After Review

  1. Commendatory: If the NQRS reviewer identifies a case exhibiting exceptional quality or one reflecting innovative examination techniques, the reviewer will prepare a Form 3990, Reviewer's Report (Commendatory), and forward it to the NQRS Section Chief for approval. Upon approval, the form will be forwarded to the group. The case will be closed by the Section Chief to CTR Operations (DCC). The commendatory report information will be reflected as a best practice in the quarterly NQRS reports.

  2. Correction: If the NQRS reviewer identifies a circumstance in a case review that would have an adverse impact on the BSA program (such as the failure to issue a Letter 1112 where it was appropriate), the reviewer will prepare a Form 3990 and forward it to the NQRS Section Chief for approval. Upon approval, the entire case file will be returned to the group for corrective action. After corrective actions have been taken, the case will be returned to the Section Chief for processing back to CTR Operations (DCC).

  3. All other completed review cases will be forwarded to the NQRS Section Chief for processing back to CTR Operations (DCC).

