
U.S. DEPARTMENT OF JUSTICE

Antitrust Division

   City Center Building
1401 H Street, NW
Washington, DC 20530

March 6, 1998

Liam S. Coonan, Esq.
Senior Vice President and
Assistant General Counsel
SBC Communications, Inc.
175 E. Houston Street
San Antonio, Texas 78205

Re: SBC Performance Measures

Dear Mr. Coonan:

As part of the Department's commitment to work with all Bell companies on relevant issues in advance of their section 271 applications, the Department of Justice and SBC Communications, Inc. ("SBC") have, as you know, been spending considerable time discussing issues relating to wholesale support processes and performance measures. In that regard, you have provided us with a draft list of proposed performance measures, a list that you have supplemented as our discussions have progressed.

Attachment A is a comprehensive list of performance measures. With the qualifications set forth below, we are satisfied that the performance measures listed in Attachment A, to which SBC has agreed,(1) would, if properly implemented, be sufficient to meet the Department's need for performance measures in evaluating a section 271 application filed in the not-too-distant future.

We appreciate SBC's engagement with the Department, in advance of a filing, on matters bearing on our competitive assessment, and we look forward to working with you on additional related issues. One such issue is whether the performance measures in Attachment A have been "properly implemented": the majority of our discussions have dealt with the measures themselves, which are accordingly the focus of this letter. As you can appreciate, how the measures are implemented may have important repercussions. For example, definitional issues and other details connected with the measures themselves (such as the basis upon which due dates and start and stop times are set in particular measures) could significantly affect the meaning of the data. Because we have not yet reached agreement on issues such as data retention, presentation and reporting (e.g., disaggregation, reporting intervals, and formats), and analysis, we expect that Department staff and SBC will continue to work towards resolution of these issues. We also expect that Department staff and SBC will discuss performance standards and benchmarking, which are other important aspects of the Department's performance analysis.

Moreover, while we are satisfied at the present time that the measures set out in Attachment A would, if properly implemented, suffice for present purposes, performance measurement is a dynamic area and future developments could necessitate changes in our views of appropriate performance measures. For example, while the measures listed in Attachment A are structured to cover the provision of unbundled network elements, once it becomes clear how unbundled network elements will be provided so as to allow requesting carriers to combine such elements in order to provide a telecommunications service, we may find that other measures are necessary to assess performance in this situation. In addition, the development of new services or new methods of providing existing services could necessitate additional performance measures. Alternatively, through ongoing regulatory proceedings, our own investigation, or otherwise, we might learn of additional risks, and even occurrences, of discrimination of which we were not previously aware. Accordingly, we would expect SBC to implement additional measures or modifications to existing measures should it become apparent to the Department that they are necessary. On the other hand, developments might reveal that certain measures were no longer necessary and could be eliminated.

Our satisfaction with the performance measures set out in Attachment A must be placed in its proper context. First, it is limited to the Department's application of its competitive standard. Under section 271, the Department is to evaluate applications for Bell entry using "any standard" the Department believes is appropriate, and the FCC is required to give "substantial weight" to that evaluation. As we have explained, our standard, in addition to the specific statutory prerequisites, requires a demonstration that local markets in a state have been "fully and irreversibly opened to competition," and appropriate performance measures, standards, and benchmarks are important to the Department's application of that competitive standard.

Second, our conclusions relate only to the Department's evaluation of section 271 applications and should not be construed as an expression of the Department's views concerning the appropriate resolution of any federal or state regulatory proceeding relating to performance measures. The FCC and some state commissions have ongoing proceedings considering both performance measures and performance standards, including company-specific and state-specific issues. These proceedings may produce performance measures different from, or in addition to, those described in Attachment A.

I am hopeful that we can resolve the remaining issues expeditiously through our ongoing discussions. I appreciate your cooperation in addressing these issues and look forward to our continuing mutual efforts. If you have any questions or suggestions regarding these issues, please call.


Sincerely,

/s/

Donald J. Russell
Chief
Telecommunications Task Force



Attachment A

PERFORMANCE MEASURES

  1. PRE-ORDERING

    1. Pre-order OSS Availability: Measures the hours and days the BOC's pre-order OSSs are available to CLECs, as well as unscheduled downtime.

    2. Pre-order System Response Times: Measures, in seconds, the speed with which CLEC service representatives, with a customer on the line, receive information (including rejection and error messages) for the functions described below. These cycle-time measures assume the CLEC has mechanized access to the BOC databases and should be measured in a manner that allows appropriate comparisons to like cycle times experienced by BOC retail service representatives. Times are provided separately for the following functions:

      1. Address verification

      2. Request for telephone number

      3. Request for customer service record (CSR)

      4. Service and product availability

      5. Appointment scheduling

  2. ORDERING

    1. Firm Order Commitment (FOC) Cycle Time: Measures the average time from CLEC service order submission to BOC response, confirming receipt of a properly formatted and appointed order and committing to complete the order by a specified date. This measure may also be presented as the percentage of FOCs returned within an agreed-upon interval.

    2. Rejected Order Cycle Time: Measures the average time, from CLEC service order submission to BOC response, for rejecting an incomplete service order or one containing errors. Each submission of an order, up to and including the FOC, requires a response cycle-time result.

    3. Ordering Quality: The following performance measures are important determinants of service order processing parity or adequacy. Each is important in its own right and provides insight into a different aspect of order quality. While the entire set would not be required, Percent Flow Through and either Percent Rejected Orders or Order Submissions per Order are necessary. (A computational sketch of these three ratios follows this section.)

      1. Percent Rejected Orders: Measured at the BOC gateway, this is the number of rejected orders divided by total orders submitted, whether manually or mechanically. It is an adequacy measure because there is no equivalent BOC retail analog: BOC orders are "rejected" via automatic edits before the order leaves the service representative's position.

      2. Order Submissions per Order: Measured at the BOC gateway, it is determined by dividing total order submissions by the number of orders receiving a firm order commitment.

      3. Percent Flow Through: Measures the percentage of orders that flow from the BOC gateway to acceptance by the BOC service order processor without manual intervention. Orders rejected at the gateway are excluded.

    4. Ordering OSS Availability: Measures the hours and days the BOC's ordering OSSs are available to CLECs, as well as unscheduled downtime.

    5. Ordering Center Availability: Reports both the hours and days of operation of the BOC ordering center.

    6. Speed of Answer (Ordering Center): Measures the average time to reach a BOC service representative.
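
Note: The three Ordering Quality ratios above reduce to simple arithmetic on monthly order counts. The Python sketch below is illustrative only; the function names and sample counts are assumptions made for this sketch, not SBC data or Department-prescribed formulas.

    # Illustrative computation of the three Ordering Quality ratios.
    # All counts are hypothetical values for a single reporting month.

    def percent_rejected_orders(rejected: int, submitted: int) -> float:
        # Rejected orders divided by total orders submitted (manually or
        # mechanically), measured at the BOC gateway, as a percentage.
        return 100.0 * rejected / submitted

    def order_submissions_per_order(submissions: int, focs_issued: int) -> float:
        # Total order submissions divided by orders receiving a FOC.
        return submissions / focs_issued

    def percent_flow_through(untouched: int, gateway_accepted: int) -> float:
        # Orders passing from the gateway to the service order processor
        # without manual intervention; gateway rejects are excluded.
        return 100.0 * untouched / gateway_accepted

    # Example month: 1,000 submissions, 120 rejected at the gateway,
    # 850 FOCs issued, and 700 of the 880 accepted orders flowed through.
    print(percent_rejected_orders(120, 1000))       # 12.0
    print(order_submissions_per_order(1000, 850))   # ~1.18
    print(percent_flow_through(700, 880))           # ~79.5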

  3. PROVISIONING

    1. Service Provisioning Interval: Measures the time from customer request for service to completion when the appointment is offered by the BOC, either from a common appointment database, generally used in a resale environment, or by agreed-to appointment intervals, more commonly used in a UNE environment. Service Provisioning Interval should be measured both as a mean, or average interval, and as a percent over a standard interval. Next available appointments offered from the work schedule OSS and expedited requests should be included for measurement; customer-requested due dates longer than the offered appointment should be excluded.

      1. Average Service Provisioning Interval: Measured in days from end-user request to order completion and counted separately for dispatched and non-dispatched orders.

      2. Percent Service Provisioned Out of Interval: Measures the percentage of service orders completed in more than an agreed-upon number of days. Ideally, this is measured incrementally by day (for example, orders completed in more than 3, 4, 5, and 6 days). This performance measure depicts the tail of the interval curve; combined with the Average Service Provisioning Interval, it portrays a robust picture of provisioning cycle time. (A computational sketch of both interval measures follows this section.)

    2. Other Provisioning Measures

      1. Percent Interconnection Facilities Provisioned Out of Interval: Measures the percentage of interconnection facilities (switched trunks and dedicated circuits) provisioned in more than an agreed upon number of days.

      2. Percent Missed Appointments (Company Reasons): Order completion is measured against the original CLEC-requested due date. No due date changes may be made unless explicitly specified by the end user or explicitly agreed to by the CLEC and the BOC. Orders missed for company reasons (load, facilities, or other) are included; orders missed for customer reasons are not counted as misses for purposes of this measure.

      3. Percent New Service Failures: Measures trouble reports on newly provisioned service within an agreed number of days of order completion. Studies have shown a high correlation between provisioning errors and trouble reports occurring within 10 days, and lower correlations beyond 10 days.

      4. Completed Service Order Accuracy: Measures the extent to which orders are completed by the BOC as ordered by the CLEC.

      5. Orders Held for Facilities: Measures service orders not completed by the original due date because of a lack of network facilities (including loops and central office equipment) in terms of (a) the average time between the original due date and the final completion date, and (b) the number of pending orders, as of the report date, held beyond a specified period (usually 30 days) following the original due date.

      6. Average Completion Notice Interval: Measures the average time from order completion to notification of the CLEC for orders submitted on a mechanized basis.
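
Note: The two Service Provisioning Interval measures (average interval and percent out of interval) can be computed from the same set of completed orders. The Python sketch below is an illustration under stated assumptions: the five-day standard interval and the sample dates are invented for the example, and the exclusion of customer-requested longer due dates follows the definition above.

    from datetime import date
    from statistics import mean

    # Hypothetical completed orders: (request date, completion date,
    # did the customer request a due date beyond the offered appointment?).
    orders = [
        (date(1998, 3, 2), date(1998, 3, 5), False),
        (date(1998, 3, 2), date(1998, 3, 9), False),
        (date(1998, 3, 3), date(1998, 3, 4), False),
        (date(1998, 3, 3), date(1998, 3, 20), True),  # excluded per definition
    ]

    STANDARD_INTERVAL_DAYS = 5  # assumed agreed-upon interval

    # Exclude customer-requested due dates longer than the offered appointment.
    intervals = [(done - req).days
                 for req, done, customer_requested_later in orders
                 if not customer_requested_later]

    # Average Service Provisioning Interval, in days.
    average_interval = mean(intervals)

    # Percent Service Provisioned Out of Interval.
    percent_out = 100.0 * sum(d > STANDARD_INTERVAL_DAYS for d in intervals) / len(intervals)

    print(average_interval)  # (3 + 7 + 1) / 3, about 3.67 days
    print(percent_out)       # 1 of 3 orders over 5 days, about 33.3%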

  4. MAINTENANCE

    1. Trouble Reporting & Clearance

      1. Trouble Report Rate: Measured as the number of trouble reports per customer or access line per month.

      2. Percent Repeat Reports: Measured as the percentage of end-user troubles on the same access line within an agreed number of days of the original trouble. Studies have shown a high correlation between repair errors and repeat reports occurring within 10 days, and lower correlations beyond 10 days.

      3. Percent Out of Service Over 24 Hours: Measured as the percentage of out-of-service troubles not cleared within 24 hours. (A computational sketch of several of these maintenance measures appears at the end of this section.)

      4. Percent Missed Appointments: Measures the percentage of trouble reports cleared after the promised appointment. Requires that appointment times, once set, cannot be changed except by the end user.

      5. Mean Time to Repair: Measured as the average interval from trouble report to clearance.

      6. Interconnection Facilities Restored Out of Interval: Measures the percentage of interconnection facilities (switched trunks and dedicated circuits) reported out of service and restored after an agreed-to interval. May also be measured and reported as an average interval.

      7. Maintenance OSS Availability: Measures the hours and days the BOC's maintenance OSSs are available to CLECs, as well as unscheduled downtime.

      8. Maintenance Center Speed of Answer: Measures the average time to reach a BOC repair service representative.

    2. Network Quality

      1. Percent Blocked Calls: Measures trunking grade (quality) of service. Should be provided separately for the following types of trunks:
        1. ILEC End Office to CLEC End Office Trunk Groups
        2. ILEC Tandem to CLEC End Office Trunk Groups
        3. ILEC Tandem to and from ILEC End Office Trunk Groups
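
Note: Several of the trouble-report measures above (Trouble Report Rate, Percent Repeat Reports, Percent Out of Service Over 24 Hours, and Mean Time to Repair) can be derived from a single log of trouble tickets. The Python sketch below is illustrative; the ticket fields, the 10-day repeat window, the line count, and the sample data are all assumptions for the example.

    from datetime import datetime, timedelta
    from statistics import mean

    # Hypothetical tickets: (line id, report time, clear time, out of service?).
    tickets = [
        ("line-1", datetime(1998, 3, 1, 9, 0),  datetime(1998, 3, 1, 15, 0), True),
        ("line-2", datetime(1998, 3, 2, 8, 0),  datetime(1998, 3, 3, 12, 0), True),
        ("line-1", datetime(1998, 3, 8, 11, 0), datetime(1998, 3, 8, 14, 0), False),
        ("line-3", datetime(1998, 3, 5, 13, 0), datetime(1998, 3, 5, 16, 0), False),
    ]
    ACCESS_LINES = 200                  # lines in service this month (assumed)
    REPEAT_WINDOW = timedelta(days=10)  # agreed repeat-report window (assumed)

    # Trouble Report Rate: reports per access line per month.
    report_rate = len(tickets) / ACCESS_LINES

    # Mean Time to Repair: average report-to-clearance interval, in hours.
    mttr = mean((clr - rpt).total_seconds() / 3600 for _, rpt, clr, _ in tickets)

    # Percent Out of Service Over 24 Hours: OOS troubles uncleared past 24 hours.
    oos = [(rpt, clr) for _, rpt, clr, is_oos in tickets if is_oos]
    pct_oos = 100.0 * sum(clr - rpt > timedelta(hours=24) for rpt, clr in oos) / len(oos)

    # Percent Repeat Reports: a further report on the same line within the window.
    last_report = {}
    repeats = 0
    for line, rpt, _, _ in sorted(tickets, key=lambda t: t[1]):
        if line in last_report and rpt - last_report[line] <= REPEAT_WINDOW:
            repeats += 1
        last_report[line] = rpt
    pct_repeat = 100.0 * repeats / len(tickets)

    print(report_rate)  # 4 reports / 200 lines = 0.02
    print(mttr)         # (6 + 28 + 3 + 3) / 4 = 10.0 hours
    print(pct_oos)      # 1 of 2 OOS tickets over 24 hours = 50.0
    print(pct_repeat)   # 1 of 4 reports is a repeat = 25.0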

  5. BILLING

    1. Bill Timeliness: Measures the percentage of billing records delivered within an agreed-to interval. Should be provided for the following billing information provided to CLECs:

      1. Daily Usage File (DUF): Measures, from message creation to the availability of the usage information to the CLEC, the percentage of DUFs provided within the interval.

      2. Wholesale Bill: Measures the percentage of wholesale bills issued within an agreed-to number of days following the end of the billing cycle.

    2. Bill Completeness: Measures the percentage of complete billing records for usage charges, recurring charges, and non-recurring charges provided to CLECs. Should be measured after bills are released. Under approved conditions, sufficiently robust pre-release test and audit procedures could substitute for a post-release audit.

      1. Usage: Measures unbillable usage and usage from the current bill cycle not included on the current wholesale bill.

      2. Recurring Charges: Measures current bill cycle recurring charges not included on the current wholesale bill.

      3. Non-Recurring Charges: Measures non-recurring charges completed in the current bill period not included on the current wholesale bill.

    3. Bill Accuracy: Measures the percentage of accurate billing records for usage charges, recurring charges, and non-recurring charges provided to CLECs. Should be measured after bills are released. Under approved conditions, sufficiently robust pre-release test and audit procedures could substitute for a post-release audit.

  6. OTHER

    1. Operator Services Toll Speed of Answer: Measures the answer interval in seconds, or the percentage of calls answered within a set objective. Should be provided separately for unbranded and branded service.

    2. Directory Assistance Speed of Answer: Measures the answer interval in seconds, or the percentage of calls answered within a set objective. Should be provided separately for unbranded and branded service.

    3. 911 Database Update Timeliness and Accuracy: Measures the percentage of missed due dates of 911 database updates and the percentage of accurate updates.


FOOTNOTE

1. As we have discussed with you, the Department has agreed to certain narrow variances from Attachment A in light of particular SBC processes and procedures. Specifically, we have agreed that SBC need not provide separate operator services and directory assistance speed-of-answer measurements for branded and unbranded calls, and that SBC can limit its 911 measurements to an error-clearing interval measure that is presently under development.