
2017 Traffic Incident Management National Analysis Report

Executive Summary

November 2017

United States Department of Transportation Federal Highway Administration

Office of Operations
1200 New Jersey Avenue, SE
Washington, DC 20590

 





Notice

This document is disseminated under the sponsorship of the U.S. Department of Transportation in the interest of information exchange. The U.S. Government assumes no liability for the use of the information contained in this document.

The U.S. Government does not endorse products or manufacturers. Trademarks or manufacturers' names appear in this report only because they are considered essential to the objective of the document.

The contents of this report reflect the views of the authors, who are responsible for the facts and accuracy of the data presented herein. The contents do not necessarily reflect the official policy of the U.S. Department of Transportation. This report does not constitute a standard, specification, or regulation.

Quality Assurance Statement

The Federal Highway Administration (FHWA) provides high-quality information to serve Government, industry, and the public in a manner that promotes public understanding. Standards and policies are used to ensure and maximize the quality, objectivity, utility, and integrity of its information. FHWA periodically reviews quality issues and adjusts its programs and processes to ensure continuous quality improvement.


Background

The Federal Highway Administration (FHWA) has used the Traffic Incident Management Capability Maturity Self-Assessment (TIM CM SA) for the past 15 years to evaluate the state of practice in traffic incident management across the country. Originally developed by FHWA in 2002, the TIM CM SA is used annually by State and local TIM program managers to benchmark and evaluate TIM program success and identify areas for improvement.

The TIM CM SA has undergone a number of revisions over the years to reflect changes in TIM practice. The most significant and most recent of these revisions occurred in 2015, when the TIM SA was aligned with the Capability Maturity Framework (CMF).1 Because of the scope of the 2015 revisions, the baseline scores were recalibrated that year to preserve the value of the TIM SA as a tool for measuring national TIM progress over time.

The combined impact of the numerous changes implemented in 2015 resulted in a slight decrease in the 2015 national TIM CM SA score from the 2014 score; some of that decrease was regained in the 2016 TIM CM SA. Figure 1 shows the overall national scores for the past decade, a period that included major revisions in 2007, 2011, and 2015.

2017 Traffic Incident Management Self-Assessment Results

In 2017, a total of 98 locations completed a TIM SA for inclusion in the national analysis, an increase of four over 2016. The 55 scored questions in the TIM SA are grouped into three sections: Strategic, Tactical, and Support. The initial assessments completed in 2003, 2004, and 2005 (78 in total) continue to serve as the baseline scores, although the baseline scores are recalibrated in each year that a major revision to the TIM CM SA is completed (2007, 2011, and 2015).

Chart lists annual scores as follows: 2007, 58.3 percent; 2008, 59.6 percent; 2009, 60.6 percent; 2010, 63.9 percent; 2011, 68.2 percent; 2012, 70.2 percent; 2013, 73.9 percent; 2014, 74.2 percent; 2015, 67.1 percent; and 2016, 68.6 percent.
Figure 1. Chart. Traffic Incident Management Capability Maturity Self-Assessment national scores 2007 – 2016.

Table 1 shows the average score for each of the three TIM SA sections for the Baseline and 2017, along with the percentage change from the Baseline.

Table 1. Mean score for each section (Baseline and 2017).
Section | # of Questions | Baseline Mean Score (%) | 2017 Mean Score (%) | 2017 High Score (possible) | Change from Baseline (%) | Section Weight (%)
Strategic | 28 | 42.4 | 62.4 | 36.6 (40) | 47.1 | 40
Tactical | 22 | 64.6 | 73.3 | 38.9 (40) | 13.5 | 40
Support | 5 | 39.7 | 69.9 | 20.0 (20) | 76.1 | 20
Overall | 55 | 50.7 | 68.3 | 92.9 (100) | 34.6 | 100
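The overall score combines the three section scores according to the section weights in the final column of Table 1. A minimal sketch of that arithmetic in Python (the figures are taken from Table 1; treating the published overall figures as a simple weighted sum is an assumption, and small differences from the reported values reflect rounding in the published section means):

    # Section mean scores (percent) and section weights, from Table 1.
    sections = {
        "Strategic": {"baseline": 42.4, "score_2017": 62.4, "weight": 0.40},
        "Tactical":  {"baseline": 64.6, "score_2017": 73.3, "weight": 0.40},
        "Support":   {"baseline": 39.7, "score_2017": 69.9, "weight": 0.20},
    }

    def overall(key):
        # Weighted sum of the three section scores.
        return sum(s[key] * s["weight"] for s in sections.values())

    baseline = overall("baseline")      # 50.74 -> reported as 50.7
    current = overall("score_2017")     # 68.26 -> reported as 68.3
    change = (current - baseline) / baseline * 100
    print(round(baseline, 1), round(current, 1), round(change, 1))  # 50.7 68.3 34.5

Computed from the rounded section means, the change from Baseline is 34.5 percent; the report's 34.6 percent presumably reflects unrounded underlying values.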

The 2017 overall TIM SA score was 68.3 percent (out of a possible 100 percent), representing a 34.6 percent increase over the Baseline. TIM SA mean scores tended to be higher in larger metropolitan areas than in smaller areas. Specifically, mean scores were calculated for the top 40 metropolitan areas (by population), the top 75 metropolitan areas, and non-top 75 metropolitan areas:

  • Top 40 metros: 72.3 percent
  • Top 75 metros: 70.5 percent
  • Non-top 75: 63.4 percent
  • Overall: 68.3 percent

As indicated above, the significant revisions implemented in 2015 resulted in an overall decrease in the national score from 2014 to 2015 (down 9.5 percent). The incremental change in the second year post-revision was a slight decrease in the overall national score of 0.3 percentage points (-0.5 percent), from 68.6 to 68.3 percent. As described in this report, this decrease is primarily attributable to changes made to the TIM Training questions which, while not requiring a recalibration of baseline scores, did set higher thresholds in the scoring guidance for those questions.

The TIM CM SA is intended to represent the consensus opinion of the TIM stakeholders completing an annual assessment in each TIM program area (city/region/State). TIM CM SA participants were asked for the first time this year to identify which TIM stakeholders (by stakeholder type, not specific name or agency) were involved in completing the annual assessment. Figure 2 shows the percentage involvement of TIM stakeholder groups in completing this year's assessments.

Chart breaks out self-assessment participation by stakeholder group, as follows: Transportation, 83.7 percent; law enforcement, 48.0 percent; fire and rescue, 30.6 percent; towing and recovery, 26.5 percent; emergency management, 25.5 percent; public safety communications, 19.4 percent; other, 11.2 percent; traffic information media, 10.2 percent; hazmat contractors, 2.0 percent.
Figure 2. Chart. Traffic incident management stakeholder participation in completing 2017 Traffic Incident Management Capability Maturity Self-Assessment.
A listing of all 55 TIM SA questions, their respective Baseline and 2017 scores, and the percentage of programs scoring each question 3 or higher2 can be found in Appendix A.

Strategic

The 28 questions in the Strategic section are grouped into three subsections: Formal TIM Programs; TIM Training and After Action Reports; and TIM Performance Measures. The Strategic section typically receives the lowest score of the three sections; this has traditionally been the result of low scores on the TIM Performance Measures subsection. The 2017 TIM CM SA is no exception, with the Strategic section achieving a score of 62.4 percent compared to 73.3 percent in Tactical and 69.9 percent in Support.

This year's Strategic score of 62.4 percent represents a 2.3 percent decrease from the 2016 score of 63.9 percent. Changes to the scoring guidance for some of the questions in the Strategic section account for this decrease.

Question 13 in the TIM Training and After Action Reports subsection asks about the percentage of TIM responders completing the 4-Hour SHRP 2 TIM Responder Training. The scoring guidance for this question, shown in Table 2 below, has been updated each year since 2015 to reflect increased numbers of responders nationally completing the training. According to numbers from FHWA, as of October 2, 2017, over 284,000 individuals have received the training, which represents 24.7 percent of the total responders to be trained.3

Table 2. Scoring guidance for traffic incident management training question #13.
Scoring Guidance | 2015 | 2016 | 2017
Score 1 if: | Less than 5% | Less than 10% | Less than 15%
Score 2 if: | Between 6-7% | Between 11-15% | Between 16-30%
Score 3 if: | Between 8-9% | Between 16-19% | Between 31-45%
Score 4 if: | Over 10% | Over 20% | Over 45%
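Read as a rule, Table 2 is a step function from the percentage of responders trained to a score of 1 through 4, with the steps rising each year. A minimal sketch in Python (the guidance does not state how values falling exactly on a boundary, e.g. 15 percent in 2017, are treated, so the boundary handling below is an assumption):

    # Upper bounds for scores 1, 2, and 3 by assessment year (from Table 2);
    # anything above the last bound scores 4. Boundary handling is assumed.
    THRESHOLDS = {
        2015: (5, 7, 9),
        2016: (10, 15, 19),
        2017: (15, 30, 45),
    }

    def tim_training_score(pct_trained, year=2017):
        low, mid, high = THRESHOLDS[year]
        if pct_trained < low:    # e.g. 2017: "Less than 15%"
            return 1
        if pct_trained <= mid:   # e.g. 2017: "Between 16-30%"
            return 2
        if pct_trained <= high:  # e.g. 2017: "Between 31-45%"
            return 3
        return 4                 # e.g. 2017: "Over 45%"

    # The 24.7 percent trained nationally would score a 2 under the 2017 guidance.
    print(tim_training_score(24.7))  # -> 2

Under the 2016 guidance, the same 24.7 percent would have scored a 4, which illustrates how average scores on this question could fall even as training totals rose.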

As shown in Table 3 below, the average score for that question decreased 19 percent from 2016 and is down 16.8 percent from its baseline in 2015.4


Table 3. Traffic incident management training question #13.
Question | 2015 Average Score | 2016 Average Score | 2017 Average Score
13. What percentage (estimated) of TIM responders in the region identified as needing training have received the 4-Hour SHRP2 TIM Responder Training (in-person or via Web-Based Training), or equivalent? | 2.82 | 2.90 | 2.35
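The declines quoted above are ordinary relative changes in the mean score. A quick check in Python against the means in Table 3 (the -16.7 percent computed below versus the report's -16.8 percent is presumably a rounding difference in the published means):

    def pct_change(new, old):
        # Relative change, in percent.
        return (new - old) / old * 100

    score_2015, score_2016, score_2017 = 2.82, 2.90, 2.35
    print(round(pct_change(score_2017, score_2016), 1))  # -19.0 (down 19 percent from 2016)
    print(round(pct_change(score_2017, score_2015), 1))  # -16.7 (vs. -16.8 reported)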

Among locations that submitted a TIM CM SA in both 2016 and 2017, the average decrease in the score on Question #13 was one point, and the average score for new submissions in 2017 was 2.5. Combined, these lower scores corroborate the lower percentage of programs scoring Question #13 at 3 or higher, which was 42.9 percent in 2017 versus the Baseline of 57.9 percent (see Appendix A).

Changes were also implemented to the TIM Performance Measures (TIM PM) questions in 2017. FHWA has a companion initiative underway as part of the Every Day Counts (EDC) program to increase "the amount, consistency and quality of TIM data collection" to support "performance measures for evaluating and improving traffic incident response."5 Working with FHWA's EDC team, the TIM PM questions were reworded and reordered to improve the granularity of data collected.

As an example, in the 2016 TIM CM SA, questions 19 and 20 asked:

19. How is data for Roadway/Incident Clearance Time being collected?

20. Has the TIM program established TIM performance targets for Roadway/Incident Clearance Time?

In 2017, separate questions were created asking how Roadway Clearance Time (RCT) data and Incident Clearance Time (ICT) data are collected. Deconstructing the TIM PM questions allows respondents to provide more detailed information (and an accompanying score) on each individual TIM PM.

Similarly, in the 2016 TIM CM SA there was one question (#24) on the use of TIM PM data to influence operations. In the 2017 TIM CM SA, that question was separated into three questions asking about each individual TIM PM. This change had a corresponding impact on scores as shown in Table 4 below.

Table 4. Traffic incident management performance measures (PM) questions on using PM data to influence operations.
Question | 2017 Average Score | Change from Baseline
20. How does your agency use RCT performance data to influence your operations? | 2.13 | -3.5%
24. How does your agency use ICT performance data to influence your operations? | 1.99 | -10.0%
28. How does your agency use Secondary Crash performance data to influence your TIM operations? | 1.67 | -24.3%

Using the same baseline score for each (based on the baseline for the 2016 version of the question), respondents indicated higher scores for using RCT performance data to influence operations than for the other two performance measures. In previous years' TIM CM SA, these lower scores for ICT and Secondary Crash performance data presumably would be masked by a higher score assigned based on the use of RCT performance data to influence operations. Having three separate questions mitigates the impact of that masking and contributes to the lower score for the Strategic section in 2017.

The TIM PM subsection is traditionally the lowest scoring of the TIM CM SA subsections each year, but scores have been improving over time, which corresponds with FHWA's increased leadership in this area. The evolution in the collection and use of TIM PM data is evident when looking across scores for the Top 40 metropolitan areas, the Top 75 and all other areas submitting a TIM CM SA.

As shown in Table 5 below, scores on these questions are, for the most part, higher in the larger metropolitan areas, where TIM programs are typically more advanced and have resources available for TIM PM collection and analysis.

Table 5. Top 40 major metropolitan area scores versus top 75 and non-top 75.
Question | Top 40 Metropolitan Area Average Score | Top 75 Metropolitan Area Average Score | Non-Top 75 Average Score
20. How does your agency use RCT performance data to influence your operations? | 2.4 | 2.3 | 1.7
24. How does your agency use ICT performance data to influence your operations? | 2.2 | 2.1 | 1.6
28. How does your agency use Secondary Crash performance data to influence your TIM operations? | 1.7 | 1.8 | 1.5

Scores for Question #8 in the Strategic section corroborate that the top 40 metropolitan areas typically have better resourced TIM programs capable of collecting and analyzing TIM PM data.

Table 6. Traffic incident management program funding.
Question | Top 40 Metropolitan Area Average Score | Top 75 Metropolitan Area Average Score | Non-Top 75 Average Score
8. Are funds available for TIM activities? | 3.0 | 2.8 | 2.4

The TIM programs that achieved the highest scores in the Strategic section are listed alphabetically in Table 7. Jurisdictions with low scores may wish to reach out to these locations for information on best practices.

Table 7. Highest scoring – Strategic.
Traffic Incident Management (TIM) Program
Atlanta, Georgia
Buffalo, New York
Louisville, Kentucky
Miami – Dade, Florida
Phoenix, Arizona

Tactical

The 22 questions in the Tactical section are focused on the following three areas:

  • TIM Laws
  • Policies and Procedures for Incident Response and Clearance
  • Responder and Motorist Safety

The Tactical section remains the highest scoring of the three TIM SA sections, achieving an overall score of 73.3 percent. Three of the five highest scoring questions on the 2017 TIM SA are in the Tactical section, all within the Policies and Procedures subsection (Table 8).

Table 8. Traffic incident management (TIM) policies and procedures – highest scoring in 2017.
Question | 2017 Average Score | Percent of TIM SA Scoring 3 or Higher
39. Is there a policy in place that clearly identifies reportable types and quantities, and appropriate Hazmat response? | 3.21 | 85.7%
40. Does at least one responding agency have the authority to override the decision to utilize the responsible party's Hazmat contractor and call in other resources? | 3.29 | 83.7%
44. Is there a procedure in place for removal of abandoned vehicles? | 3.36 | 80.6%

High scores in this area can be attributed, in part, to the National TIM Responder Training, which emphasizes the need for policies and procedures that provide for responder and motorist safety and quick clearance. Further evidence of where the National TIM Responder Training is making a difference is the increase in scores over the Baseline for questions #48, #49, and #50.

48. Is there a mutually understood procedure/guideline in place for safe vehicle positioning?

49. Are there mutually understood procedures/guidelines in place for use of emergency-vehicle lighting?

50. Are TIM responders following high-visibility safety apparel requirements as outlined in the MUTCD?

Combined, these three questions had an average score of 2.96 in 2017, which is a 131.2 percent increase over the Baseline. Response vehicle positioning, emergency-vehicle lighting use and high-visibility safety apparel are part of the curriculum in Lesson 4 (Safe Vehicle Positioning) and Lesson 5 (Scene Safety) in the National TIM Responder Training Course.

There are two questions in the TIM SA that query respondents on Safety Service Patrols (#32 and #33). The first asks about the existence of a Safety Service Patrol and the second asks respondents to score the Safety Service Patrol's level of coverage.

Nearly 70 percent (69.4 percent) of respondents scored both questions 3 or 4 (with 29.6 percent scoring both questions 4), meaning that a large number of Safety Service Patrols across the country operate at mid-level to full function. Services provided by these Safety Service Patrols range from motorist assistance to incident response and clearance, emergency traffic control, and scene management. The fleets themselves range from medium-sized fleets providing service on most major roadways to fleets large enough to provide ample coverage on all major roadways.

Sixty-nine percent of the 2017 TIM SA respondents provided information on levels of coverage, with the combined Safety Service Patrol coverage extending over 4,917 centerline miles and 18,532 lane miles (some programs reported centerline miles, others lane miles). The median centerline-mile coverage reported by 2017 TIM SA respondents was 110 miles, and the median lane-mile coverage was 141 miles.

The TIM programs that achieved the highest scores in the Tactical section are listed alphabetically in Table 9. Jurisdictions with low scores may wish to reach out to these locations for information on best practices.

Table 9. Highest scoring – Tactical.
Traffic Incident Management Program
Alachua – Bradford, Florida
Atlanta, Georgia
Dallas – Ft. Worth, Texas
Seattle, Washington
Virginia – Northern Virginia/Suburban Washington, DC

Support

The Support section had the second highest overall score, 69.9 percent, and the largest increase over Baseline of the three sections (76.1 percent).

The questions in Support focus on the tools and technologies enabling improved incident detection, response, and clearance. The major revision completed in 2015 removed questions on traveler information, returning the emphasis to the infrastructure and activities that enable incident information exchange between TIM program stakeholders. This allows programs to rate their progress on items over which their TIM program has control.

The five questions in the Support section all address TIM data sharing and integration among TIM stakeholders. The highest scoring question in the Support section was Question #51 (below), which scored an average of 3.34, making it the second highest scoring question on the entire 2017 TIM SA.

51. Are TIM stakeholders aware of and actively utilizing Traffic Management Center/Traffic Operations Center resources to coordinate incident detection, notification, and response?

Two questions on data and video sharing between agencies provide greater granularity on the level of sharing. While the two questions achieved similar scores, a higher percentage of TIM SA respondents scored their program 3 or 4 on the TIM data question (#52) than on the video question (#53) (Table 10).

Table 10. Traffic incident management data and video collection and use.
Question | 2017 Average Score | Percent of TIM SA Scoring 3 or Higher
52. What TIM data (i.e., number of involved vehicles, number of lanes blocked, length of queue, etc.) is captured via TMCs and/or public safety CAD systems and is it shared with other disciplines for real-time operational purposes? | 2.93 | 79.6%
53. Is TIM video captured via TMCs and/or public safety CAD systems and is it shared with other disciplines for real-time operational purposes? | 2.82 | 74.5%

The lowest scoring of the five questions in the Support section (#54) asks respondents about policies or procedures in place for signal timing changes to support traffic management during incident response. It received an average score of 2.19 in 2017, with just over a third (35.7 percent) of TIM CM SA respondents scoring the question 3 or higher. A review of the comments submitted with this question indicates that scores may increase over the next several years as more Integrated Corridor Management (ICM) plans are implemented. Signal timing changes to facilitate traffic incident response and traffic management, particularly on routes parallel to those where an incident has occurred, are identified as a key component of a successful ICM plan in FHWA's ICM and TIM primer.6

The TIM programs that achieved the highest scores in the Support section are listed alphabetically in Table 11. Jurisdictions with low scores may wish to reach out to these locations for information on best practices.

Table 11. Highest scoring – Support.
Traffic Incident Management Program
Alachua – Bradford, Florida
Cincinnati, Ohio
Columbus, Ohio
Idaho
Jacksonville, Florida
Louisville, Kentucky
Philadelphia, Pennsylvania
Phoenix, Arizona
San Bernardino, California
San Diego, California
San Francisco, California
Washington, DC

Summary

A total of 98 TIM CM SAs were completed in 2017, with an average overall score of 68.3 percent (out of a possible 100 percent), up 34.6 percent over the Baseline. TIM SA mean scores tended to be higher in larger metropolitan areas than in smaller areas. Specifically, mean scores were calculated for the top 40 metropolitan areas (by population), the top 75 metropolitan areas, and non-top 75 metropolitan areas:

  • Top 40 metros: 72.3 percent
  • Top 75 metros: 70.5 percent
  • Non-top 75: 63.4 percent
  • Overall: 68.3 percent

The highest scores were achieved in Tactical (73.3 percent), and the largest percentage increase in scores from the Baseline was in Support (76.1 percent). Low-scoring questions and those with the least improvement over Baseline indicate specific program areas where additional guidance from FHWA may be warranted.

The lowest scoring questions on this year's TIM CM SA were in the TIM Performance Measures subsection and include questions on the collection and use of secondary crash data. Data on secondary crashes, and on the larger suite of TIM Performance Measures, are a key focus of FHWA's Every Day Counts (EDC-4) initiative for 2017-2018. As such, scores in this area should increase in the coming years.

Another indicator of potential focus areas for FHWA is the set of questions that scored below their Baseline. In the 2017 TIM CM SA, six questions received an average score below Baseline (Table 12).

Table 12. Scores below Baseline.
Question | 2017 Average Score | Percent Change from Baseline
13. What percentage (estimated) of TIM responders in the region identified as needing training have received the 4-hour SHRP 2 TIM Responder Training (in-person or via web-based training), or equivalent? | 2.35 | -16.8%
20. How does your agency use RCT performance data to influence your TIM operations? | 2.13 | -3.5%
24. How does your agency use ICT performance data to influence your TIM operations? | 1.99 | -10.0%
28. How does your agency use Secondary Crash performance data to influence your TIM operations? | 1.67 | -24.3%
30. Is a Driver Removal Law in place and understood by TIM stakeholders? | 2.83 | -6.1%
44. Is there a procedure in place for removal of abandoned vehicles? | 3.36 | -3.3%

As described earlier in this report, the decrease from the Baseline on question 13 is the result of changes made to the scoring guidance on percentages of responders trained. Similarly, the below-Baseline scores on the three TIM PM questions (20, 24, and 28), discussed in the Strategic section of this report, are the result of deconstructing several questions in the TIM PM subsection, which brings more granularity to the scoring for each TIM PM.

The decrease in average score below Baseline for question 30, on Driver Removal Laws, is the result of additional non-Top 75 locations completing the TIM SA this year. Among the Top 40 metropolitan areas, the average score for question 30 was 3.1; among the Top 75 metropolitan areas, it was 3.0; and among the 31 non-Top 75 locations submitting a 2017 TIM CM SA, it was 2.4. As the National TIM Training course continues to be offered in non-Top 75 locations, its emphasis on safe, quick clearance policies and procedures should reverse this trend for questions 30 and 44.



APPENDIX A. Summary of 2017 Traffic Incident Management (TIM) Self-Assessment (SA) Results

Question | Baseline Mean Score | 2017 Mean Score | % Change from Baseline | Baseline % Scoring 3 or Higher | 2017 % Scoring 3 or Higher
Strategic
1. Is there a formal TIM program that is supported by a multidiscipline, multi-agency team or task force, which meets regularly to discuss and plan for TIM activities? | 1.9 | 3.04 | 60.0% | 28.0% | 79.1%
2. Are all disciplines and agencies participating in on-going TIM enhancement activities/efforts? | 1.9 | 3.04 | 60.0% | 28.0% | 79.1%
3. Is the importance of TIM understood by all TIM stakeholders and supported by multidiscipline, multi-agency agreements or memorandums of understanding (MOUs)? | 1.71 | 2.67 | 56.3% | 18.0% | 56.1%
4. Is agency leadership actively involved in program-level TIM decisions (i.e. policy establishment, training, funding, legislation, etc.)? | 1.71 | 2.80 | 63.5% | 18.0% | 68.4%
5. Is there a full-time position within at least one of the participating agencies with responsibility for coordinating the TIM program as their primary job function? | 2.28 | 2.90 | 27.1% | 54.0% | 57.1%
6. Are the TIM response roles and responsibilities of public and private sector TIM stakeholders mutually understood? | 1.71 | 3.11 | 82.0% | 18.0% | 86.7%
7. Is planning to support TIM activities, including regular needs assessments, done across and among participating agencies? | 1.35 | 2.84 | 110.1% | 12.0% | 68.4%
8. Are funds available for TIM activities? | 1.71 | 2.65 | 55.1% | 18.0% | 56.1%
9. Is TIM considered and incorporated into planning efforts for construction and work zones? | 2.47 | 3.19 | 29.0% | 35.0% | 79.3%
10. Is TIM considered and incorporated into planning efforts for special events such as sporting events, concerts, conventions, etc? | 2.47 | 3.18 | 29.0% | 35.0% | 79.3%
11. Is TIM considered and incorporated into planning efforts for weather-related events? | 2.47 | 3.18 | 29.0% | 35.0% | 79.3%
12. Have stakeholders in the region participated in a SHRP2 National TIM Responder Training Program, or equivalent, Train-the-Trainer (TtT) session and are they actively training others? | 1.26 | 2.79 | 121.1% | 9.0% | 65.3%
13. What percentage (estimated) of TIM responders in the region identified as needing training have received the 4-Hour SHRP2 TIM Responder Training (in-person or via Web-Based Training), or equivalent? | 2.82 | 2.35 | -16.8% | 57.9% | 42.9%
14. Is the SHRP2 TIM Responder Training being conducted in a multidiscipline setting? | 2.97 | 2.99 | 0.7% | 66.3% | 63.3%
15. Has the SHRP2 TIM Responder Training, or equivalent, been incorporated into the local academy and/or technical college curriculums? | 1.77 | 2.19 | 24.1% | 10.5% | 27.6%
16. Does the TIM program conduct multidiscipline, multi-agency after-action reviews (AARs)? | 1.62 | 2.68 | 65.7% | 18.0% | 54.1%
17. Is Roadway Clearance Time being measured utilizing FHWA's standard definition: time between first recordable awareness of an incident by a responsible agency and first confirmation that all lanes are available for traffic flow? | 0.64 | 2.52 | 293.8% | 3.0% | 51.0%
18. Which of the following data collection and analysis practices best align with your region for RCT? | 0.64 | 2.24 | 250.8% | 3.0% | 37.8%
19. Has the TIM program established performance targets for RCT? | 0.64 | 2.24 | 250.8% | 3.0% | 37.8%
20. How does your agency use RCT performance data to influence your TIM operations? | 1.16 | 2.23 | 92.6% | 4.0% | 40.8%
21. Is Incident Clearance Time (ICT) measured and used by your agency? FHWA defines ICT as the "time between the first recordable awareness of the incident and the time at which the last responder has left the scene." | 2.21 | 2.13 | -3.5% | 35.8% | 45.9%
22. Which of the following data collection and analysis practice best aligns with your region for ICT? | 0.64 | 2.11 | 230.0% | 3.0% | 33.7%
23. Has the TIM program established performance targets for ICT? | 1.16 | 1.87 | 61.0% | 4.0% | 27.6%
24. How does your agency use ICT performance data to influence your TIM operations? | 2.21 | 1.99 | -10.0% | 35.8% | 31.6%
25. Is the number of Secondary Crashes being measured and used? FHWA defines Secondary Crashes as the number of unplanned crashes beginning with the time of detection of the primary crash where a collision occurs either a) within the incident scene or b) within the queue, including the opposite direction, resulting from the original incident? | 1.03 | 1.91 | 85.3% | 8.0% | 33.7%
26. How is data for the number of Secondary Crashes collected? | 1.88 | 1.97 | 4.8% | 29.5% | 30.6%
27. Has the TIM program established performance targets for a reduction in the number of Secondary Crashes? | 1.16 | 1.31 | 12.6% | 4.0% | 7.1%
28. How does your agency use Secondary Crash performance data to influence your TIM operations? | 2.21 | 1.67 | -24.3% | 35.8% | 18.4%
Tactical
29. Is an Authority Removal Law in place and understood by TIM stakeholders? | 2.92 | 3.00 | 2.7% | 67.0% | 74.5%
30. Is a Driver Removal Law in place and understood by TIM stakeholders? | 3.01 | 2.83 | -6.1% | 71.0% | 73.5%
31. What activities are in place to outreach to and educate the public and elected officials about TIM? | 2.38 | 2.57 | 8.1% | 46.3% | 59.2%
32. Is there a Safety Service Patrol program in place for incident and emergency response? | 2.73 | 3.03 | 11.0% | 67.0% | 73.5%
33. What level of coverage does the Safety Service Patrol program provide? | 2.73 | 3.03 | 11.0% | 67.0% | 73.5%
34. Do TIM responders routinely utilize the Incident Command System (ICS), specifically Unified Command (UC), while on scene? | 2.55 | 3.18 | 24.8% | 58.0% | 85.7%
35. Are temporary traffic control (TTC) devices (e.g., cones, advanced warning signs, etc.) pre-staged in the region to facilitate timely response? | 2.21 | 2.67 | 21.0% | 41.0% | 61.2%
36. Do towing and recovery procedures/rotation list policies deploy resources based on type/severity of incident? | 3.14 | 3.15 | 0.5% | 74.7% | 79.6%
37. Do towing and recovery procedures/rotation list policies include company/operator qualifications, equipment requirements, and/or training requirements? | 2.86 | 2.97 | 3.8% | 67.0% | 71.4%
38. Do towing and recovery procedures/rotation list policies include penalties for non-compliance of response criteria? | 2.49 | 2.69 | 8.0% | 55.8% | 65.3%
39. Is there a policy in place that clearly identifies reportable types and quantities, and appropriate Hazmat response? | 2.89 | 3.21 | 11.2% | 69.0% | 85.7%
40. Does at least one responding agency have the authority to override the decision to utilize the responsible party's Hazmat contractor and call in other resources? | 3.22 | 3.29 | 2.0% | 9.0% | 83.7%
41. For incidents involving a fatality, is there a procedure in place for early notification and timely response of the Medical Examiner? | 2.53 | 2.64 | 4.3% | 55.0% | 57.7%
42. For incidents involving a fatality, is there a procedure for the removal of the deceased prior to Medical Examiner arrival? | 2.53 | 2.64 | 4.3% | 55.0% | 57.7%
43. Are there procedures in place for expedited crash investigations? | 2.59 | 2.78 | 7.2% | 72.0% | 57.1%
44. Is there a procedure in place for removal of abandoned vehicles? | 3.47 | 3.36 | -3.3% | 91.0% | 80.6%
45. Do standardized, documented TIM response procedures/guidelines exist? | 2.73 | 2.78 | 1.8% | 61.1% | 69.4%
46. Do TIM responders routinely utilize temporary traffic control devices to provide traffic control for the three incident classifications (minor, intermediate, major) in compliance with the MUTCD? | 1.93 | 2.96 | 53.3% | 27.0% | 71.4%
47. Do TIM responders routinely utilize traffic control procedures to provide back of traffic queue warning to approaching motorists? | 1.56 | 2.71 | 74.0% | 17.0% | 65.3%
48. Is there a mutually understood procedure/guideline in place for safe vehicle positioning? | 1.28 | 2.96 | 131.2% | 14.0% | 69.4%
49. Are there mutually understood procedures/guidelines in place for use of emergency-vehicle lighting? | 1.28 | 2.96 | 131.2% | 14.0% | 69.4%
50. Are TIM responders following high-visibility safety apparel requirements as outlined in the MUTCD? | 1.28 | 2.96 | 131.2% | 14.0% | 69.4%
Support
51. Are TIM stakeholders aware of and actively utilizing Traffic Management Center/Traffic Operations Center resources to coordinate incident detection, notification and response? | 1.98 | 3.34 | 68.5% | 41.0% | 91.8%
52. What TIM data (i.e., number of involved vehicles, number of lanes blocked, length of queue, etc.) is captured via TMCs and/or public safety CAD systems and is it shared with other disciplines for real-time operational purposes? | 1.43 | 2.93 | 104.8% | 10.0% | 79.6%
53. Is TIM video captured via TMCs and/or public safety CAD systems and is it shared with other disciplines for real-time operational purposes? | 1.43 | 2.82 | 96.9% | 10.0% | 74.5%
54. Are there policies or procedures in place for signal timing changes to support traffic management during incident response? | 1.55 | 2.19 | 41.5% | 18.0% | 35.7%
55. Are there pre-planned detour and/or alternate routes identified and shared between TIM stakeholders? | 1.55 | 2.70 | 74.5% | 18.0% | 61.2%

1 This revision included a renaming of the annual assessment to the TIM Capability Maturity Self-Assessment, or TIM CM SA, as referred to throughout this report.

2 Scores of 3 and 4 indicate the highest levels of progress for a particular question.

3 P. Jodoin, "National TIM Responder Training Program Implementation Progress." Unpublished presentation obtained on October 2, 2017.

4 Prior to the 2015 TIM CM SA revision, the question on percentage of responders trained was a non-scored supplemental question.

5 Federal Highway Administration, Office of Innovative Program Delivery, Center for Accelerating Innovation. EDC-4, Using Data to Improve Traffic Incident Management. Available online at: https://www.fhwa.dot.gov/innovation/everydaycounts/edc_4/timdata.cfm

6 R. Brewster, J. Bachman, R. Hurtado, and D. Newton. Integrated Corridor Management and Traffic Incident Management: A Primer. Federal Highway Administration, FHWA-HOP-16-035. January 2016.
