The Office Audit Redesign Pilot Was Effective in Meeting Its Goals, but Its Implementation Needs to Be Monitored

 

January 2004

 

Reference Number:  2004-30-033

 

 

This report has cleared the Treasury Inspector General for Tax Administration disclosure review process and information determined to be restricted from public release has been redacted from this document.

 

January 14, 2004

 

 

 

MEMORANDUM FOR COMMISSIONER, SMALL BUSINESS/SELF-EMPLOYED DIVISION

 

FROM:     Gordon C. Milbourn III /s/ Gordon C. Milbourn III

                 Acting Deputy Inspector General for Audit

 

SUBJECT:     Final Audit Report - The Office Audit Redesign Pilot Was Effective in Meeting Its Goals, but Its Implementation Needs to Be Monitored  (Audit # 200330011)

 

This report presents the results of our review of the Internal Revenue Service’s (IRS) Office Audit (OA) Redesign Pilot.  The overall objective of this review was to determine whether the Small Business/Self-Employed (SB/SE) Division was effective in meeting its goals during the OA Redesign Pilot.  This review was part of our efforts to provide ongoing input during the SB/SE Division’s Examination Reengineering process.

The SB/SE Division Examination function’s responsibility is to examine tax returns to determine the correct Federal tax liabilities.  Within the Examination function, the OA function examines tax returns during face-to-face meetings with taxpayers in an IRS office.  Between Fiscal Years 1997 and 2000, the Examination function experienced a 64 percent decline in total assessed dollars and a 66 percent decline in closed cases. 

The SB/SE Division initiated an in-depth effort to reengineer its examination processes, products, and services to help address these declines.  As part of these efforts, it conducted the OA Redesign Pilot, which tested new OA procedures to determine their effectiveness in increasing Examination productivity and reducing taxpayer burden.  Based on the success of the Pilot and after final approval, the SB/SE Division plans to implement the new procedures nationwide to all OA groups.

Overall, the OA Pilot was effective in testing the redesigned tools and procedures, and the results indicate that the new tools and procedures should be implemented nationwide.  However, program results need to be monitored and manager reviews improved.

The Pilot realized three of five expected outcomes:  increased first appointment closures, increased employee satisfaction, and increased customer satisfaction.  Our review of a judgmental sample of 88 closed Pilot cases showed that 52 percent were closed at first appointment, compared to 27 percent for 12 OA groups that did not participate in the Pilot.  Employee satisfaction surveys showed that satisfaction levels for managers and clerical staff were mostly positive.  Although the satisfaction results for the Tax Compliance Officers (TCO), who conduct the examinations, were mixed, the OA Pilot team management evaluated this feedback and is planning some revisions that should have a positive impact.  Based on 225 Customer Satisfaction Surveys, 91 percent of the Pilot group customers were satisfied, compared to 62 percent for the non-Pilot group customers.  In addition, other Examination program measures showed positive trends such as increases in agreed cases, improved Examination quality, and increased dollars assessed per return.  The number of taxpayer postponements and taxpayer no-shows was reduced, and Examination no-change rates were the same. 

However, two expected outcomes were not met:  decreased Examination time and cycle time.  Both Examination time and cycle time were greater for Pilot cases than non-Pilot cases.  In addition, the measure used for cycle time was not actually representative of cycle time.  The Pilot team management recognized these problems and is planning certain revisions, which we believe should help. 

Pilot participants were generally using the new tools and procedures, such as Planning Sheets, Materiality Worksheets, Microsoft Outlook Calendars, Focused Information Document Requests, and Call Back Letters; therefore, any conclusions reached by the Pilot team on their effectiveness were appropriate.  While most of these tools and procedures improved the Examination process, Pilot team management is planning revisions that should make the tools even more effective. 

However, the managerial review process needs reevaluation.  Managers were required to review cases biweekly to allow them to assess consistency of planning, scheduling, and Examination activities.  Our review of 88 closed Pilot cases showed that biweekly case reviews were documented in only 2 cases (2 percent).  Although discussions with group managers indicated they conducted biweekly case reviews, they also said they did not document these reviews.  The intent was for the review to be nonevaluative during the Pilot.  Pilot team management is planning some revisions to the managerial reviews, including once-a-month open case reviews to discuss cases and potential issues with the TCOs.  However, these reviews still will not be required to be recorded.

We recommended that the Director, Compliance, SB/SE Division, conduct formal follow-up evaluations of new tools and procedures designed to reduce Examination time and cycle time and ensure that managerial reviews are properly documented. 

Management’s Response:  The Commissioner, SB/SE Division, agreed with our recommendations.  Compliance Policy Analysts will monitor the effectiveness of the Reengineering roll-out and monitor key program results to ensure the new tools are effectively implemented and used throughout the SB/SE Division’s Office Audit program.  In addition, language has been added to Office Audit managerial training materials that provides Office Audit group managers with guidelines for conducting and documenting open case reviews.  Compliance Policy Analysts will also review open case reviews during visitations conducted to evaluate the deployment of the Office Audit Reengineering process changes to ensure they are documented as appropriate.  Management’s complete response to the draft report is included as Appendix V.

Copies of this report are also being sent to the IRS managers who are affected by the report recommendations.  Please contact me at (202) 622-6510 if you have questions, or Philip Shropshire, Acting Assistant Inspector General for Audit (Small Business and Corporate Programs), at (215) 516-2341.

 

Table of Contents

Background

The Office Audit Redesign Pilot Was Mostly Successful

Two Measures Need to Be Closely Monitored as the New Procedures Are Implemented

Recommendation 1:

Although the New Tools and Procedures Were Effective, the Managerial Review Process Should Be Reevaluated

Recommendation 2:

Appendix I – Detailed Objective, Scope, and Methodology

Appendix II – Major Contributors to This Report

Appendix III – Report Distribution List

Appendix IV – Other Measurable Benefits From the Pilot

Appendix V – Management’s Response to the Draft Report

 

Background

The Small Business/Self-Employed (SB/SE) Division Examination function’s responsibility is to examine tax returns to determine the correct Federal tax liabilities.  Within the Examination function, the Office Audit (OA) function examines tax returns during face-to-face meetings with taxpayers in an Internal Revenue Service (IRS) office.  Between Fiscal Years 1997 and 2000, the Examination function experienced a 64 percent decline in total assessed dollars and a 66 percent decline in closed cases. 

The SB/SE Division initiated an in-depth effort to reengineer its Examination processes, products, and services to help address these declines.  As part of this Reengineering effort, the SB/SE Division conducted the OA Redesign Pilot during which it tested new OA tools and procedures to determine their effectiveness in increasing Examination productivity and reducing taxpayer burden.  These new tools and procedures included Planning Sheets, Materiality Worksheets, Microsoft Outlook Planning Calendars, Focused Information Document Requests, and Call Back Letters.

The SB/SE Division conducted the OA Pilot from September 2002 to March 2003 in nine OA Pilot groups.  Tax Compliance Officers (TCO) started and closed approximately 650 tax examinations during the Pilot. 

The expected outcomes for the Pilot were increased first appointment closures, reduced time on cases, reduced cycle time, and increased employee and customer satisfaction.  Other measurable potential benefits included increases in agreed cases and Examination quality, and reductions in taxpayer appointment postponements, taxpayer no-shows, and the number of examinations that result in no changes to the tax liability.  After final approval, the SB/SE Division plans to implement the new procedures nationwide.

This review was performed at the Santa Ana, California; Houston, Texas; and Boston, Massachusetts, SB/SE Division Territory Offices from March through September 2003.  This audit was conducted in accordance with Government Auditing Standards.  Detailed information on our audit objective, scope, and methodology is presented in Appendix I.  Major contributors to the report are listed in Appendix II.

The Office Audit Redesign Pilot Was Mostly Successful

Overall, the OA Pilot was effective in testing the redesigned tools and procedures, and the results indicate that the new tools and procedures should be implemented nationwide.  The Pilot had effective management oversight, three of five expected outcomes were realized, and other Examination program measures showed positive trends.

Management oversight was effective

The OA Pilot team management provided thorough oversight to the participants to help them use the new tools and is planning revisions to some of the tools to improve their effectiveness.  Sites and participants chosen for the Pilot were typical and representative of the OA Program overall.  For example, the Pilot included groups from metropolitan, suburban, and rural areas and groups with all TCOs in the same location or in multiple locations. 

Based on discussions with Examination function management and statistical analysis of Audit Information Management System (AIMS) reports, the cases selected for the Pilot were typical and representative of overall Examination cases worked by the OA program.  In addition, Pilot management provided adequate training on the new tools and procedures to the Pilot participants.

Three of five expected outcomes were met, and other measurable benefits showed positive trends

First appointment closures increased

A first appointment closure occurs when the TCO issues the Examination results report to the taxpayer at the time of the initial appointment or shortly thereafter with no subsequent revisions.  One of the Pilot’s expected outcomes was for the TCOs to close more cases at their first appointments to reduce taxpayer burden.  Our review of a judgmental sample of 88 closed Pilot cases showed that 46 cases (52 percent) were closed at the first appointment.  Compared to 12 OA groups that did not participate in the Pilot, this was a positive outcome since only 27 percent of their cases were closed at the first appointment.

In addition, Pilot team management analyzed the total number of cases closed by all the Pilot groups at the first appointment during the Pilot period and identified similar results.  Fifty-two percent of all closed Pilot cases were closed at the first appointment.

Customer satisfaction improved

Customer satisfaction was measured using the SB/SE Division Customer Satisfaction Surveys.  During the Pilot, the Pilot team collected Customer Satisfaction Survey results from 225 customers, consisting of taxpayers and paid professionals. 

These results were then compared to approximately 4,900 Customer Satisfaction Surveys from non-Pilot groups taken from the period July through September 2002.  The customer satisfaction results of the Pilot were good overall when compared to the baseline results.  Ninety-one percent of the Pilot group customers were satisfied with their examinations compared to 62 percent for the non-Pilot group customers.

Employee satisfaction improved

Employee satisfaction was measured using employee surveys and focus group discussions conducted at the end of the Pilot.  Forty-nine TCOs, eight group managers, and nine clerical staff participated in the surveys and focus group sessions.  Pilot participants were asked to compare their satisfaction to the period prior to the Pilot by answering a series of questions about the effectiveness of the new tools and the efficiency of conducting the various phases of the examination.  In addition, one question asked employees if, overall, they were more satisfied with their jobs.

As the chart below shows, the results from the overall satisfaction question for managers and clerical staff were mostly positive.  The results for the TCOs were mixed.  While the TCOs thought some of the new tools were helpful and enabled them to do a quality examination, some tools did not work very well.

 

Percentage of Employee Satisfaction Levels

                     TCOs      Managers     Clerical
More Satisfied       37.5        62.5         55.6
Neutral              31.3        25.0         44.4
Less Satisfied       31.3        12.5          0
Total               100.1*      100          100

*Totals more than 100 due to rounding the category numbers.

Source:  Post Pilot Surveys, March 2003.

Pilot team management evaluated participants’ comments and is making recommendations that take into consideration the feedback from employees.  For example, 83 percent of the TCOs stated that the process would be easier to use if the tools and templates were better integrated into Report Generation Software (RGS).  Management recommended improvements in overall usability and RGS integration methods for tools and templates. 

Other measurable benefits also improved

In addition to the expected outcomes, the Pilot anticipated other potential benefits, including increases in agreed cases and Examination quality and reductions in appointment postponements, taxpayer no-shows, and no-change rates.  We also analyzed dollars assessed per return to determine if there was a positive trend.  Our analysis showed that, compared to non-Pilot results, each of these measures either improved or remained the same, as follows:

·        No-change rates – virtually the same.

·        Agreed cases – higher percentage.

·        Postponements – lower percentage.

·        No-shows – lower percentage.

·        Case quality – better. 

·        Dollars assessed per return – higher.

For detailed results of the analysis and comparison sources, see Appendix IV.

While we agree the new tools and procedures should be implemented nationwide, our review showed that time spent on examinations increased and the managerial review process should be reevaluated. 

Two Measures Need to Be Closely Monitored as the New Procedures Are Implemented

Examination time is the direct time that the TCO charges to the examination of a tax return.  This includes time spent on pre-examination planning, inspecting books and records, resolving issues, and closing the case.  Cycle time is generally defined as the number of days between the first date the TCO charges time to the examination of a tax return and the closing date of the case.

These measures are standard business measures used in the SB/SE Division Examination Program.  Reductions in both measures were expected outcomes of the Pilot; however, these goals were not met.  In addition, the measure used for cycle time was not actually representative of cycle time.

Examination time was greater for Pilot cases

Our analysis of the number of hours charged to Pilot cases and to non-Pilot cases showed that Examination time was greater for Pilot cases.  The Pilot team management’s analysis also identified this difference.

 

Examination Time

 

The chart was removed due to its size.  To see the chart, please go to the Adobe PDF version of the report on the TIGTA Public Web Page.

 

As illustrated in the chart above, our review of 88 closed Pilot cases showed that the average time spent examining tax returns was 9.2 hours.  In addition, analysis of all closed Pilot cases from the AIMS as of March 31, 2003, showed an average of 10.2 hours.  Therefore, the Pilot procedures did not reduce time on cases when compared to the average of 6.8 hours for non-Pilot cases during the same period.

Although the Pilot team management analyzed different data sources, they reached the same conclusion:  the Examination time was not reduced.  Their comparison of the Examination Quality Measurement System (EQMS) results for Pilot cases as of March 31, 2003, showed the average hours spent on cases was 11.1 hours.  This was compared to 7.7 hours from EQMS results for non-Pilot cases from the prior 7-month period.

 

Cycle time was longer for Pilot cases

Analysis of Pilot cases closed as of March 2003 compared to non-Pilot cases started and closed over the same period showed that cycle time on Pilot cases was longer than on non-Pilot cases.  The following graph shows that Pilot and non-Pilot average cycle days increased for each month of the Pilot.  These averages will continue to rise as older cases are closed but will eventually level off, providing a more reliable measure.  As of March 31, 2003, the average Pilot case cycle time was 121 days, compared to 80 days for started and closed non-Pilot cases.

Average Cycle Time

The chart was removed due to its size.  To see the chart, please go to the Adobe PDF version of the report on the TIGTA Public Web Page.

 

The Pilot team management explained that one reason they did not use the standard method to compute cycle time was that all Pilot cases were reviewed by the EQMS before closing, which added 10 to 15 days compared to non-Pilot cases nationwide.  Even allowing for those 15 days, the adjusted Pilot cycle time of 106 days (121 days less 15 days) was still higher than the 80 days for non-Pilot cases.

Instead of using cycle time, Pilot team management measured the average age of the open case inventory for all Pilot sites, including Pilot and non-Pilot cases.  Management reported that there was improvement in this measure over the period of the Pilot.  Their analysis showed that average cycle days of in-process inventory (open cases) decreased in Pilot groups as time went on.  While this analysis may have provided sufficient short-term information, the traditional measure of cycle time needs to be monitored as the new procedures are implemented and results are analyzed.  If cycle time is the desired measurement, the time on cases should be measured from start to closure so the results will not be misunderstood.

Potential reasons and effects for increased time

Increased Examination and cycle time could have several causes:

·        New tools and procedures, such as Materiality Worksheets, were complex and added time to the pre-examination phase.

·        Problems integrating the new tools within the RGS added time to the pre-examination phase. 

Increases in Examination and cycle time could result in a decline in the Examination program’s business results.  For example, although total dollars assessed increased, average dollars assessed per hour was lower for closed Pilot cases than for non-Pilot cases because Pilot Examination time was higher.  Analysis of closed Pilot cases as of March 31, 2003, showed dollars assessed per hour of $161, compared to $198 for non-Pilot cases started and closed during the same period.  For the same period in the prior year, the average was $176.

Actions planned by Pilot team management

The Pilot team management recognized the problems that potentially caused increased Examination and cycle time and recommended the following revisions to new tools and procedures before nationwide implementation:

·        Integrating the RGS with Pilot tools and procedures.

·        Simplifying the materiality evaluation process by replacing the Materiality Worksheet with a narrative checklist for risk assessment.

·        Revising follow-up telephone call procedures by eliminating multiple calls to taxpayers and encouraging examination completion by the TCOs and clerks.

·        Training the TCOs on management of work schedules.

·        Establishing guidelines for the number of new starts per examiner.

We agree that these revisions should help reduce Examination and cycle time. 

Recommendation

The Director, Compliance, SB/SE Division, who is responsible for implementing policies for the Examination function, should:

1.      Conduct formal follow-up evaluations of the new tools and procedures to monitor their effectiveness in reducing Examination and cycle time on cases.

Management’s Response:  Compliance Policy Analysts will monitor the effectiveness of the Reengineering roll-out and monitor key program results to ensure the new tools are effectively implemented and used throughout the SB/SE Division’s Office Audit program.  Key statistical indicators will be reviewed monthly.  Compliance Area visitations will be conducted as necessary.  The Program Manager, Exam General Processes, will advise the Director, Reporting Compliance, of the impact the new tools and procedures are having on Examination time and cycle time.

Although the New Tools and Procedures Were Effective, the Managerial Review Process Should Be Reevaluated

Pilot participants were generally using the new Pilot tools and procedures.  Our review of 88 closed cases showed that in 79 cases (90 percent), the TCOs used all of the Pilot tools and procedures while working the cases.  In addition, a review of Pilot progress reports, meeting notes, and closed case data showed the tools were being used.  Therefore, any conclusions reached by the Pilot team on the effectiveness of the tools were appropriate. 

Feedback from the employees showed that they thought most of the tools improved the Examination process.  As the employees used the new tools, they identified some problems in using them effectively, which were discussed during focus groups and feedback sessions.  Pilot team management is planning revisions to improve some of the processes.  We reviewed these revisions and think they should adequately address some of the difficulties encountered.  

For example, some of these problems involved the use of the Microsoft Outlook Planning Calendar.  Inconsistent types and amounts of data were entered on the Calendar, and the information duplicated that available in other tools used by the TCOs.  Also, two of three group managers interviewed stated they did not use the Microsoft Outlook Planning Calendars for ordering or assigning cases because the information was already available elsewhere.  The revisions planned by SB/SE Division management will eliminate the need to maintain duplicate paper calendars and provide training on scheduling and using Microsoft Outlook to manage inventory.

Another example is the Materiality Worksheet, which the TCOs identified as not very useful.  The revisions planned will replace the materiality scoring with a narrative checklist to substantiate the risk level of an issue.  Although tools were being effectively used, the management review process could be improved.

Managerial reviews need to be improved

Managers were required to review cases biweekly to assess the consistency of planning, scheduling, and Examination activities.  One of the goals of the Examination Reengineering program was for managerial involvement to be up-front, early in the Examination process. 

IRS procedures state that nonevaluative reviews do not contain a written rating; however, some documentation is appropriate to establish that a review actually occurred.  Employees need to see the feedback and instructions for completing the examinations, and managers need a way to follow up on their suggestions.  In addition, one of the General Accounting Office’s Standards for Internal Control in the Federal Government states that internal control, such as managerial review, needs to be clearly documented, and the documentation should be readily available for examination.

Our review of 88 closed Pilot cases showed the biweekly reviews were documented in the case files in only 2 cases  (2 percent).  Three group managers interviewed stated they conducted biweekly case reviews.  However, they did not document these reviews because the intent was for the review to be nonevaluative during the Pilot.  Pilot team management observed that some managers of other Pilot groups used a separate checklist to document biweekly reviews, although it was not required.  Without proper documentation of managerial reviews where case guidance is provided, it is difficult to assess the impact that such reviews have on case development. 

Pilot team management is planning some revisions to the managerial reviews.  These reviews include once-a-month open case reviews to discuss cases and potential issues with the TCOs, in addition to biweekly analysis of inventory status and scheduling casework.  However, the reviews still will not be required to be recorded.   

Recommendation

The Director, Compliance, SB/SE Division, should:

2.      Ensure that monthly open case reviews where case guidance is provided are documented in the case activity record.  Open cases selected should include cases in various Examination phases.

Management’s Response:  Language has been added to Office Audit managerial training materials that provides Office Audit group managers with guidelines for conducting and documenting open case reviews.  Compliance Policy Analysts will also review open case reviews during visitations conducted to evaluate the deployment of the Office Audit Reengineering process changes to ensure they are documented as appropriate.    

 

Appendix I

 

Detailed Objective, Scope, and Methodology

 

Our overall objective was to determine whether the Small Business/Self-Employed (SB/SE) Division was effective in meeting its goals during the Office Audit (OA) Redesign Pilot.  Specifically, we:

I.        Determined how effectively the OA Pilot participants used the new procedures.

A.     Determined the status and effectiveness of the various new techniques and tools.

1.      Reviewed the Pilot’s final recommendations report for an overview of whether the new tools were used and how they were measured.

2.      Obtained managers’ and employees’ opinions on the new procedures by analyzing the results from the employee focus group and feedback sessions.

3.      Determined if the use of Microsoft Outlook Planning Calendars for ordering returns and scheduling appointments improved inventory management. 

4.      Reviewed a judgmental sample of 88 cases closed between January and March 2003 from 3 Pilot sites visited (we were advised they were representative of the overall Pilot) to discuss issues and perform the following tests:

a.       Reviewed the Call Back Letters and the Tax Compliance Officers’ (TCO) activity records to determine whether the Letters were effective.

b.      Determined if Focused Information Document Requests caused taxpayers to bring in all the necessary documents and whether cases were closed at first appointments.

c.       Determined if Planning Forecasts were useful to the TCOs and managers in scheduling work and for time management.

d.      Determined if the Materiality Worksheet and the Audit Planning Sheets were used and if biweekly managerial reviews were performed.

B.     Determined if cases selected by Pilot sites were representative of overall Examination cases worked by the OA.

1.      Compared the Audit Information Management System (AIMS) activity and source codes for Pilot cases closed as of March 31, 2003, with cases at the same sites closed between October 1, 2001, and March 31, 2002, and cases from non-Pilot sites closed between October 1, 2002, and March 31, 2003. 

2.      Determined the reliability of AIMS data analyzed by comparing it to an Examination function statistical report of closed cases, called Table 37, and validating certain data fields.

3.      Discussed with group managers the types of cases requested for the Pilot and the process for requesting returns.  

C.     Determined if participants in the Pilot sites were representative of most TCOs overall by discussing how the sites were selected and comparing the grade levels of the 49 Pilot participants with non-participating TCOs. 

D.     Determined the adequacy of the training provided to Pilot participants.

E.      Conducted an independent analysis of the Pilot case data from the AIMS and the Examination Quality Measurement System (EQMS) as follows:

1.      Analyzed AIMS Pilot data on closed cases as of June 30, 2003, to determine the number of closed cases, average hours worked per case, average cycle time, average dollars assessed per return, average dollars assessed per hour, and types of closings.  We compared these data to non-Pilot cases closed between October 1, 2001, and June 30, 2002 (started after September 2001), and closed between October 1, 2002, and June 30, 2003 (started after September 2002).

2.      Analyzed EQMS data for the Pilot’s closed cases and compared the prior year’s EQMS data from September 1, 2001, to March 31, 2002, to the current period’s data from September 1, 2002, to March 31, 2003.  

II.     Determined if management had adequate plans for measuring the Pilot results. 

A.     Determined if there were adequate oversight controls over the Pilot.

B.     Analyzed the tools and data used to measure the results to determine if they adequately informed SB/SE Division management whether goals were being achieved.

C.     Determined if the method for measuring employee satisfaction was sufficient.   

D.     Determined if the method for measuring customer satisfaction was sufficient and analyzed results.

 

Appendix II

 

Major Contributors to This Report

 

Philip Shropshire, Acting Assistant Inspector General for Audit (Small Business and Corporate Programs)

Parker F. Pearson, Director

Lynn Wofchuck, Audit Manager

Richard T. Hayes, Senior Auditor

Julian E. O’Neal, Senior Auditor

Phyllis E. Heald, Auditor

 

Appendix III

 

Report Distribution List

 

Commissioner  C

Office of the Commissioner – Attn:  Chief of Staff  C

Deputy Commissioner for Services and Enforcement  SE 

Acting Deputy Commissioner, Small Business/Self-Employed Division  SE:S

Acting Director, Compliance, Small Business/Self-Employed Division  SE:S:C

Chief Counsel  CC

National Taxpayer Advocate  TA

Director, Office of Legislative Affairs  CL:LA

Director, Office of Program Evaluation and Risk Analysis  RAS:O

Office of Management Controls  OS:CFO:AR:M

Audit Liaison:  Commissioner, Small Business/Self-Employed Division  SE:S

 

Appendix IV

 

Other Measurable Benefits From the Pilot

 

Treasury Inspector General for Tax Administration Analysis

                             Pilot Results Sources              Comparative Results Sources
                      Judgmental    AIMS       EQMS           AIMS          EQMS
                      review        Pilot      Pilot          Non-Pilot     National

No-Change Rates         36.4%       36.5%                     36.1%
Agreed Cases            63.6%       60.5%                     59.5%
Postponements           14.7%                  23%                          28.3%
No Shows                                        7.6%                        13%
EQMS Quality                                   77.9%                        70.1%
Dollars per Return                 $1,653                     $1,341

Sources:  Judgmental review – Review of a judgmental sample of 88 closed Pilot cases.

AIMS Pilot – Analysis of 655 closed Pilot cases as of March 31, 2003, from the AIMS database.
EQMS Pilot – Pilot EQMS results obtained from Pilot team management.
AIMS Non-Pilot – Analysis of 14,845 closed non-Pilot cases started by September 1, 2002, and closed between October 1, 2002, and March 31, 2003.
EQMS National – Analysis of all EQMS results from September 1, 2001, through March 31, 2002, from the EQMS database.

 

 

Appendix V

 

Management’s Response to the Draft Report

 

The response was removed due to its size.  To see the response, please go to the Adobe PDF version of the report on the TIGTA Public Web Page.