A&I Online
Improvement: Data Quality Tools - State Analysis Descriptions

State Data Analysis Reports support the State Safety Data Quality (SSDQ) evaluation and its individual measures.
The SSDQ results provide the States with a comprehensive evaluation of the completeness, timeliness, accuracy, and consistency of State-reported crash and roadside inspection data in MCMIS. The State Data Analysis Reports were created to provide additional detail on activity within a State that may not be immediately apparent from the measure itself. These reports may support a State's efforts to identify strategies for improving its overall data quality.

Crash Timeliness: Monthly Analysis
Crash Accuracy: Records Reported by Badge Number 
Inspection Timeliness: Monthly Analysis
Inspection Timeliness: Records Reported by Inspector ID
Inspection Timeliness: Records Reported by Inspection Facility Type and Inspection Level
Inspection Accuracy: Records Reported by Inspector ID


Crash Timeliness: Monthly Analysis

The 'Monthly Analysis' report presents the number of crash records reported to FMCSA within 90 days of the crash event, broken down into one-month increments over the 12-month analysis period. Each month was analyzed for reporting crash records within the FMCSA standard of 90 days. The data for this report are based on MCMIS fatal and non-fatal crash records for interstate and intrastate carriers and include large truck and bus vehicle types.

The graph illustrates the monthly analysis results for crash timeliness. The results for each month are plotted against the FMCSA standard for reporting crash records within 90 days; 'Good' reporting is achieved at 85%. Fluctuations in reporting may be observed, and any significant changes in reporting should be evaluated further to determine if corrective actions are needed.

Crash events that occurred during the 12-month analysis period were included in this analysis. FMCSA calculates timeliness as the number of days from the crash event to the date the record was sent to MCMIS. The calculation only considers the first time the record was uploaded to MCMIS, so uploading a record more than once does not negatively impact timeliness.
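
As an illustration only, the timeliness calculation can be sketched in a few lines of Python. This is a minimal sketch, not FMCSA's processing code: the record fields (event_date, first_upload_date), the timeliness_summary() helper, and the sample records are assumptions made for the example; the 85% 'Good' threshold is taken from the description above.

    from collections import defaultdict
    from datetime import date

    def timeliness_summary(records, threshold_days, key):
        """Count records reported within/after threshold_days of the event,
        grouped by key(record). Only the first upload date is considered,
        so re-uploading a record does not hurt timeliness."""
        groups = defaultdict(lambda: {"on_time": 0, "late": 0})
        for rec in records:
            days = (rec["first_upload_date"] - rec["event_date"]).days
            groups[key(rec)]["on_time" if days <= threshold_days else "late"] += 1
        summary = {}
        for grp, counts in groups.items():
            total = counts["on_time"] + counts["late"]
            pct_on_time = 100.0 * counts["on_time"] / total
            summary[grp] = {
                "total": total,
                "pct_on_time": round(pct_on_time, 1),
                "pct_late": round(100.0 - pct_on_time, 1),
                "good": pct_on_time >= 85.0,  # 'Good' reporting threshold
            }
        return summary

    # Hypothetical crash records, grouped by month of the crash event (90-day standard).
    crash_records = [
        {"event_date": date(2011, 3, 5), "first_upload_date": date(2011, 5, 20)},
        {"event_date": date(2011, 3, 18), "first_upload_date": date(2011, 7, 1)},
    ]
    by_month = timeliness_summary(crash_records, threshold_days=90,
                                  key=lambda r: r["event_date"].strftime("%Y-%m"))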

Definitions
Event Date Range Analyzed (1 month) - One month period of analysis
# Records with Upload Date > 90 Days - Number of records reported more than 90 days after the crash event
% Records with Upload Date > 90 Days - Percentage of records reported more than 90 days after the crash event. The percentage is calculated by dividing the number of records reported more than 90 days by the number of interstate & intrastate crash records
# Records with Upload Date <= 90 Days - Number of records reported less than or equal to 90 days after the crash event
% Records with Upload Date <= 90 Days - Percentage of records reported less than or equal to 90 days after the crash event. The percentage is calculated by dividing the number of records reported less than or equal to 90 days by the number of interstate & intrastate crash records
# Interstate & Intrastate Crash Records - Total number of crash records reported to MCMIS during the 1-month period



Crash Accuracy: Records Reported by Badge Number

The 'Records Reported by Badge #' report presents the number of crash records that were matched and unmatched to a company registered in MCMIS, by officer badge number. The data for this report are based on 12 months of MCMIS fatal and non-fatal crash records for interstate carriers and for intrastate carriers transporting hazardous materials, and include large truck and bus vehicle types. The badge number analysis provides supporting information for analyzing the relationship between the number of matched crash records in MCMIS and the officer responsible for reporting the crash record information. Note: Officer badge numbers identified as 'blank' indicate that the badge number may not have been entered into SAFETYNET properly or may not have been collected on the crash report form.

Crash records entered per FMCSA’s "Procedures for Entering Crashes without Carrier Identification into SAFETYNET" were not included in this analysis.
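
As with the timeliness sketch above, the matched/unmatched tally can be illustrated with a short, hypothetical Python helper. The field names (officer_badge, matched_to_mcmis) and the match_summary() helper are assumptions for the example; how a record is actually matched to a carrier registered in MCMIS is not described here.

    from collections import defaultdict

    def match_summary(records, key_field):
        """Tally records matched/unmatched to a company registered in MCMIS,
        grouped by key_field (e.g., officer badge number or inspector ID).
        Records with a missing key are grouped under 'blank'."""
        groups = defaultdict(lambda: {"matched": 0, "unmatched": 0})
        for rec in records:
            key = rec.get(key_field) or "blank"
            groups[key]["matched" if rec["matched_to_mcmis"] else "unmatched"] += 1
        summary = {}
        for key, counts in groups.items():
            total = counts["matched"] + counts["unmatched"]
            summary[key] = {
                "total": total,
                "pct_matched": round(100.0 * counts["matched"] / total, 1),
                "pct_unmatched": round(100.0 * counts["unmatched"] / total, 1),
            }
        return summary

    # Example usage (hypothetical records with the fields noted above):
    # by_badge = match_summary(crash_records, key_field="officer_badge")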

Definitions
Officer Badge - Badge # of officer recording the crash event information
# Unmatched Records - Number of records unmatched to a company registered in MCMIS by badge #
% Unmatched Records - Percentage of records unmatched to a company registered in MCMIS by badge #. The percentage is calculated by dividing the number of unmatched records by the number of interstate & HM intrastate crash records
# Matched Records - Number of records matched to a company registered in MCMIS by badge #
% Matched Records - Percentage of records matched to a company registered in MCMIS by badge #. The percentage is calculated by dividing the number of matched records by the number of interstate & HM intrastate crash records
# Interstate & HM Intrastate Crash Records - Total number of matched and unmatched crash records



Inspection Timeliness: Monthly Analysis

The 'Monthly Analysis' report presents the number of inspection records reported to FMCSA within 21 days of the inspection event, broken down into one-month increments over the 12-month analysis period. Each month was analyzed for reporting inspection records within the FMCSA standard of 21 days. The data for this report are based on MCMIS inspection records for interstate and intrastate carriers and include large truck and bus vehicle types.

The graph illustrates the monthly analysis results for inspection timeliness. The results for each month are plotted against the FMCSA standard for reporting inspection records within 21 days; 'Good' reporting is achieved at 85%. Fluctuations in reporting may be observed, and any significant changes in reporting should be evaluated further to determine if corrective actions are needed.

Inspection events that occurred during the 12-month analysis period were included in this analysis. FMCSA calculates timeliness as the number of days from the inspection event to the date the record was sent to MCMIS. The calculation only considers the first time the record was uploaded to MCMIS, so uploading a record more than once does not negatively impact timeliness.
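
The hypothetical timeliness_summary() helper sketched under Crash Timeliness above applies here unchanged; only the reporting standard differs (21 days rather than 90). The inspection_records structure is an assumption matching the earlier example.

    # Inspection timeliness by month of the inspection event (21-day standard),
    # reusing the illustrative timeliness_summary() helper defined above.
    by_month = timeliness_summary(inspection_records, threshold_days=21,
                                  key=lambda r: r["event_date"].strftime("%Y-%m"))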

Definitions
Event Date Range Analyzed (1 month) - One month period of analysis
# Records with Upload Date > 21 Days - Number of records reported more than 21 days after the inspection event
% Records with Upload Date > 21 Days - Percentage of records reported more than 21 days after the inspection event. The percentage is calculated by dividing the number of records reported more than 21 days by the number of interstate & intrastate inspection records
# Records with Upload Date <= 21 Days - Number of records reported less than or equal to 21 days after the inspection event
% Records with Upload Date <= 21 Days - Percentage of records reported less than or equal to 21 days after the inspection event. The percentage is calculated by dividing the number of records reported less than or equal to 21 days by the number of interstate & intrastate inspection records
# Interstate & Intrastate Inspection Records - Total number of inspection records reported to MCMIS during the 1-month period



Inspection Timeliness: Records Reported by Inspector ID

The 'Records Reported by Inspector ID' report presents the number of inspection records reported to FMCSA within 21 days of the inspection event, by inspector identification number. The data for this report are based on 12 months of MCMIS inspection records for interstate and intrastate carriers and include large truck and bus vehicle types.

The inspector identification number analysis provides supporting information for analyzing the relationship between the number of inspection records reported within 21 days and the inspector responsible for reporting the inspection record information. Note: Inspector ID numbers identified as 'blank' indicate that the inspector ID number may not have been entered into SAFETYNET properly or may not have been collected on the inspection report form.

Inspection events that occurred during the 12-month analysis period were included in this analysis. FMCSA calculates timeliness as the number of days from the inspection event to the date the record was sent to MCMIS. The calculation only considers the first time the record was uploaded to MCMIS, so uploading a record more than once does not negatively impact timeliness.
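
Under the same assumptions as the earlier sketches, grouping the timeliness calculation by inspector ID (with missing IDs reported under 'blank') is a one-line change of the grouping key.

    # Inspection timeliness by inspector ID (21-day standard); blank IDs are
    # grouped together, as in the report. Reuses the illustrative helper above.
    by_inspector = timeliness_summary(inspection_records, threshold_days=21,
                                      key=lambda r: r.get("inspector_id") or "blank")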

Definitions
Inspector ID # - Identification # of inspector recording the inspection event information
# Records with Upload Date > 21 Days - Number of records reported more than 21 days after the inspection event
% Records with Upload Date > 21 Days - Percentage of records reported more than 21 days after the inspection event. The percentage is calculated by dividing the number of records reported more than 21 days by the number of interstate & intrastate inspection records
# Records with Upload Date <= 21 Days - Number of records reported less than or equal to 21 days after the inspection event
% Records with Upload Date <= 21 Days - Percentage of records reported less than or equal to 21 days after the inspection event. The percentage is calculated by dividing the number of records reported less than or equal to 21 days by the number of interstate & intrastate inspection records
# Interstate & Intrastate Inspection Records - Total number of inspection records reported to MCMIS during the 12-month period



Inspection Timeliness: Records Reported by Inspection Facility Type and Inspection Level

The 'Records Reported by Inspection Facility Type & Inspection Level' report presents the number of inspection records reported to FMCSA within 21 days of the inspection event, by facility type and by inspection level. The first table presents a summary of all inspections by facility type, and the second table summarizes the reporting of inspection records by inspection level. The data for this report are based on 12 months of MCMIS inspection records for interstate and intrastate carriers and include large truck and bus vehicle types.

The inspection facility type and inspection level analysis provides supporting information for analyzing the relationship between the number of inspection records reported within 21 days and the type and location of the inspection event.

Inspection events that occurred during the 12-month analysis period were included in this analysis. FMCSA calculates timeliness as the number of days from the inspection event to the date the record was sent to MCMIS. The calculation only considers the first time the record was uploaded to MCMIS, so uploading a record more than once does not negatively impact timeliness.
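
Continuing the hypothetical sketch, the two tables in this report correspond to two runs of the same grouping, once by facility type and once by inspection level. The field names facility_type and inspection_level are assumptions.

    # One summary per table in the report: by facility type, then by inspection level.
    by_facility = timeliness_summary(inspection_records, threshold_days=21,
                                     key=lambda r: r["facility_type"])
    by_level = timeliness_summary(inspection_records, threshold_days=21,
                                  key=lambda r: r["inspection_level"])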

Definitions
Facility Type - Type of inspection facility, either a fixed site or the roadside
Inspection Level - The North American Standard Truck Inspection procedures have identified six levels of inspections: full, walk-around, driver-only, special study, terminal and radioactive materials
# Records with Upload Date > 21 Days - Number of records reported more than 21 days after the inspection event
% Records with Upload Date > 21 Days - Percentage of records reported more than 21 days after the inspection event. The percentage is calculated by dividing the number of records reported more than 21 days by the number of interstate & intrastate inspection records
# Records with Upload Date <= 21 Days - Number of records reported less than or equal to 21 days after the inspection event
% Records with Upload Date <= 21 Days - Percentage of records reported less than or equal to 21 days after the inspection event. The percentage is calculated by dividing the number of records reported less than or equal to 21 days by the number of interstate & intrastate inspection records
# Interstate & Intrastate Inspection Records - Total number of inspection records reported to MCMIS during the 12-month period



Inspection Accuracy: Records Reported by Inspector ID

The 'Records Reported by Inspector ID' report presents the number of inspection records that were matched and unmatched to a company registered in MCMIS, by inspector identification number. The data for this report are based on 12 months of MCMIS inspection records for interstate carriers and for intrastate carriers transporting hazardous materials, and include large truck and bus vehicle types. The inspector identification number analysis provides supporting information for analyzing the relationship between the number of matched inspection records in MCMIS and the inspector responsible for reporting the inspection record information. Note: Inspector ID numbers identified as 'blank' indicate that the inspector ID number may not have been entered into SAFETYNET properly or may not have been collected on the inspection report form.
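
The hypothetical match_summary() helper sketched under Crash Accuracy above covers this report as well; only the grouping key changes from officer badge number to inspector ID.

    # Inspection accuracy by inspector ID, reusing the illustrative
    # match_summary() helper defined under Crash Accuracy above.
    by_inspector = match_summary(inspection_records, key_field="inspector_id")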

Definitions
Inspector ID # - Identification # of the inspector recording the inspection event information
# Unmatched Records - Number of records unmatched to a company registered in MCMIS by inspector ID #
% Unmatched Records - Percentage of records unmatched to a company registered in MCMIS by inspector ID #. The percentage is calculated by dividing the number of unmatched records by the number of interstate & HM intrastate inspection records
# Matched Records - Number of records matched to a company registered in MCMIS by inspector ID #
% Matched Records - Percentage of records matched to a company registered in MCMIS by inspector ID #. The percentage is calculated by dividing the number of matched records by the number of interstate & HM intrastate inspection records
# Interstate & HM Intrastate Inspection Records - Total number of matched and unmatched inspection records



Restricted to Authorized FMCSA and State Enforcement Users. A&I User Login Required.
