Slide 1
Federal Motor Carrier Safety Administration
How Does FMCSA Evaluate the Quality of State-Reported Data?
August 2005

Slide 2
How does FMCSA evaluate the quality of State-reported data?
FMCSA looks at the completeness, timeliness, accuracy, and consistency of State-reported crash records involving large trucks and buses, and of roadside inspection data.

Slide 3
FMCSA Standards
- Completeness: State-reported records missing from MCMIS
- Timeliness: uploading State-reported records to MCMIS within the standard timeframes
- Accuracy: matching State-reported records to a motor carrier in MCMIS
- Consistency: reporting State-reported records uniformly over time

Slide 4
Evaluation Methodology
- Step 1: Define performance measures and indicators based upon FMCSA Standards
- Step 2: Develop a rating for each State (Good, Fair, Poor)
- Step 3: Generate quarterly results
Flow: FMCSA Standards -> 5 Performance Measures and 1 Overriding Indicator -> Overall State Ratings (Good, Fair, Poor) -> Quarterly Results

Slide 5
Evaluation Methodology, Step 1: Define Performance Measures
Completeness measure: State-reported records missing from MCMIS
- Crash: compare the number of fatal cases in MCMIS to the number of fatal cases in FARS
Evaluation criteria:
- One calendar year
- Large trucks only

Slide 6
Evaluation Methodology, Step 1: Define Performance Measures
Timeliness measures: number of State-reported records uploaded to MCMIS within the standard timeframes
- Crash: % of crash records reported within 90 days
- Roadside inspections: % of inspection records reported within 21 days
Evaluation criteria:
- 12-month timeframe
- Large trucks and buses only
- 'Add' records used

Slide 7
Evaluation Methodology, Step 1: Define Performance Measures
Accuracy measures: number of State-reported records that match a motor carrier in MCMIS
- Crash: % of matched fatal and non-fatal crash records in MCMIS for interstate carriers and intrastate hazardous material carriers
- Roadside inspections: % of matched inspection records in MCMIS for interstate carriers and intrastate hazardous material carriers
Evaluation criteria:
- 12-month timeframe
- Large trucks and buses

Slide 8
Evaluation Methodology, Step 1: Define Performance Measures
Overriding Consistency Indicator: State-reported crash records uniformly reported in MCMIS over time
- Crash: % of non-fatal crash records in MCMIS compared to the previous three-year reporting period
Evaluation criteria:
- 12-month timeframe
- Large trucks and buses

Slide 9
Evaluation Methodology, Step 2: Develop State Rating
- Results of 5 performance measures and 1 overriding indicator
- Measures receive ratings
- Indicator receives a flag

Slide 10
Evaluation Methodology, Step 2: Develop State Rating
- Overall State Rating: quarterly results
- A rating of "insufficient data" for an individual measure or indicator is not included in the calculation of the Overall State Rating.

Slide 11
Evaluation Methodology, Step 3: Generate Quarterly Results (June 24, 2005)

Slide 12
Evaluation Methodology, Step 3: Generate Quarterly Results
Publish results on the Analysis & Information (A&I) Online website:
- SafeStat module
- Data Quality module

Slide 13
Evaluation Methodology, Step 3: Generate Quarterly Results

Slide 14
Evaluation Methodology, Step 3: Generate Quarterly Results
[Graphic: Summary Report, View Rating Results]

Slide 15
How the Evaluation Results Improved Online: 15-Month Comparison of Overall State Ratings

        March '04    June '05
Good:   24 States    25 States
Fair:   13 States    20 States
Poor:   14 States     6 States

Slide 16
June 24, 2005 Quarterly Results
[Graphic: color-coded map]

Slide 17
The Future
[Graphic: color-coded map, all green]
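The rating logic in Steps 1 and 2 can be sketched in code: each performance measure yields a percentage, the percentage maps to Good/Fair/Poor, and the measure ratings combine into an Overall State Rating that excludes "insufficient data" results (per Slide 10). This is a minimal illustrative sketch only; the slides do not state FMCSA's actual Good/Fair/Poor cutoffs or its combination formula, so the thresholds and scoring scheme below are assumptions.

```python
# Hedged sketch of the measure-to-rating logic described in the slides.
# The 90/75 cutoffs and the score-averaging scheme are assumptions,
# not FMCSA's published criteria.

def rate(percent, good=90.0, fair=75.0):
    """Map a performance-measure percentage to a rating (thresholds assumed)."""
    if percent is None:                 # no usable data for this measure
        return "insufficient data"
    if percent >= good:
        return "Good"
    if percent >= fair:
        return "Fair"
    return "Poor"

def overall_rating(measure_ratings):
    """Combine measure ratings into an Overall State Rating.
    Per Slide 10, 'insufficient data' measures are excluded from the
    calculation; the averaging of scores here is an assumption."""
    scores = {"Good": 2, "Fair": 1, "Poor": 0}
    usable = [scores[r] for r in measure_ratings if r in scores]
    if not usable:
        return "insufficient data"
    avg = sum(usable) / len(usable)
    if avg >= 1.5:
        return "Good"
    if avg >= 0.5:
        return "Fair"
    return "Poor"

# Example: crash timeliness = % of crash records reported within 90 days,
# inspection timeliness = % of inspection records reported within 21 days.
crash_timeliness = rate(100 * 450 / 500)       # 90.0% -> "Good"
inspection_timeliness = rate(100 * 300 / 500)  # 60.0% -> "Poor"
print(overall_rating([crash_timeliness,
                      inspection_timeliness,
                      "insufficient data"]))   # prints "Fair"
```

The key design point is that an "insufficient data" result simply drops out of the average rather than counting against the State, which matches Slide 10's statement that such results are not included in the Overall State Rating calculation.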