
2016 Traffic Incident Management National Analysis Report

Executive Summary

November 2016

United States Department of Transportation Federal Highway Administration

Office of Operations
1200 New Jersey Avenue, SE
Washington, DC 20590

Contact Information: OperationsFeedback@dot.gov


Background

For nearly 15 years, the Federal Highway Administration (FHWA) has used the Traffic Incident Management Self-Assessment (TIM SA) to evaluate the state of the practice in traffic incident management across the country. Originally developed by FHWA in 2002, the TIM SA is used by State and local TIM program managers to benchmark and evaluate TIM program success annually and to identify areas for improvement.

A number of revisions to the TIM SA have been implemented over time to reflect changes in TIM practice. The most recent of these revisions occurred in 2015 to align the TIM SA with the Capability Maturity Framework (CMF). Significant changes to the TIM SA as part of this revision included:

  • Scoring TIM program success on a scale of 1 to 4; previously, TIM SA respondents had used a 0-4 scale to rate program success in three distinct areas.
  • Providing specific scoring guidance for each question on the TIM SA to mitigate subjectivity in responses.
  • Adding new scored questions, including several on the National TIM Responder Training Course.
  • Reweighting the three TIM SA sections (Strategic, Tactical, and Support) in the overall scoring based on the number of questions included in each.
  • Recalibrating the baseline scores to preserve the value of the TIM SA as a tool for measuring national TIM progress over time.

The combined impact of these changes was a 2015 national TIM SA score slightly lower (9.5 percent) than the 2014 national TIM SA score. However, the impact of these changes should be muted in the 2016 TIM SA and subsequent years, and the year-over-year trend should return to an increasing national TIM SA score.

2016 Traffic Incident Management Self-Assessment Results

In 2016, a total of 94 locations completed a TIM SA for inclusion in the national analysis, down one from 2015. The 51 scored questions in the TIM SA are grouped into three sections: Strategic, Tactical, and Support. The initial assessments completed in 2003, 2004, and one in 2005 (78 in total) continue to serve as the baseline scores; however, the baseline was recalibrated in 2015 as a result of the significant revisions to the TIM SA described above.

Table 1 shows the average score for each of the three TIM SA sections from the Baseline and 2016, along with the percentage change from the Baseline.

Table 1. Mean score for each section (Baseline and 2016).
Section | # of Questions | Mean Score, Baseline (percent) | Mean Score, 2016 (percent) | 2016 High Score (possible) | Change in Score from Baseline (percent) | Section Weight (percent)
Strategic | 24 | 42.4% | 63.9% | 36.6 (40) | 50.6% | 40%
Tactical | 22 | 64.6% | 72.8% | 38.9 (40) | 12.7% | 40%
Support | 5 | 39.7% | 69.7% | 20.0 (20) | 75.7% | 20%
Overall | 51 | 50.7% | 68.6% | 92.9 (100) | 35.3% | 100%
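
The section weights in Table 1 determine how the three section scores roll up into the national figure. As a check on that arithmetic, the 2016 overall score can be reproduced from the Table 1 values; a minimal sketch in Python (the weights and section scores come from Table 1, everything else is illustrative):

    # Reproduce the overall 2016 TIM SA score from the section scores in Table 1.
    # Section weights (Strategic 40%, Tactical 40%, Support 20%) are from Table 1.
    SECTION_WEIGHTS = {"Strategic": 0.40, "Tactical": 0.40, "Support": 0.20}
    scores_2016 = {"Strategic": 63.9, "Tactical": 72.8, "Support": 69.7}

    overall = sum(SECTION_WEIGHTS[s] * scores_2016[s] for s in SECTION_WEIGHTS)
    print(f"Overall 2016 score: {overall:.1f} percent")  # prints 68.6 percent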

The 2016 overall TIM SA score was 68.6 percent (out of a possible 100 percent), representing a 35.3 percent increase over the Baseline. The TIM SA mean scores tended to be higher in larger metropolitan areas than in smaller areas. Specifically, mean scores were calculated for the top 40 metropolitan areas (by population), the top 75 metropolitan areas, and non-top 75 metropolitan areas:

  • Top 40 metros: 73.4 percent
  • Top 75 metros: 70.9 percent
  • Non-top 75: 63.1 percent
  • Overall: 68.6 percent

As described above, the significant revisions implemented in 2015 resulted in an overall decrease in the national score from 2014 to 2015 (down 9.5 percent). In the first year post-revision, the overall national score increased by 1.5 percentage points (2.2 percent), from 67.1 to 68.6 percent.
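
Because the report quotes both kinds of change, the arithmetic is worth making explicit: a percentage-point change is the raw difference between two scores, while a percent change is that difference relative to the earlier score. A small illustration using the 2015 and 2016 values above:

    # Percentage-point change versus relative (percent) change,
    # using the 2015 and 2016 national scores quoted above.
    score_2015, score_2016 = 67.1, 68.6

    point_change = score_2016 - score_2015             # 1.5 percentage points
    percent_change = 100 * point_change / score_2015   # about 2.2 percent
    print(f"{point_change:.1f} points, {percent_change:.1f} percent")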

A listing of all 51 TIM SA questions, their respective Baseline and 2016 scores, and the percentage of programs scoring each question 3 or higher[1] can be found in Appendix A.

Strategic

The 24 questions in the Strategic section are grouped into three subsections: Formal TIM Programs, TIM Training and After Action Reports, and TIM Performance Measures. As part of the 2015 TIM SA revisions, a number of new scored questions were added to the Strategic section on the National TIM Responder Training Course and the data used in calculating TIM Performance Measures (TIM PM).

The consistently low scores in the TIM Performance Measures subsection have resulted in the Strategic section annually receiving the lowest score in the TIM SA. This was the case once again in 2016, with the questions in the Strategic section achieving a score of 63.9 percent. However, this does represent a 50.6 percent increase over the Baseline.

The questions on the National TIM Responder Training (including three new scored questions added in 2015) all realized increased scores in 2016, reflecting the continued and expanded deployment of the training by FHWA. As shown in Table 2, Questions #12-14 received high average scores, with a high percentage of TIM SA respondents scoring each question 3 or higher.

While Question #15 achieved a lower score than the other TIM training questions, it also saw improvement in the average score from 2015.

Table 2. Traffic incident management (TIM) training questions.
Question | 2016 Average Score | Percent of TIM SA Respondents Scoring 3 or Higher
12. Have stakeholders in the region participated in a SHRP2 National TIM Responder Training Program, or equivalent, Train-the-Trainer (TtT) session and are they actively training others? | 2.57 | 51.1%
13. What percentage (estimated) of TIM responders in the region identified as needing training have received the 4-Hour SHRP2 TIM Responder Training (in-person or via Web-Based Training), or equivalent? | 2.90 | 61.7%
14. Is the SHRP2 TIM Responder Training being conducted in a multidiscipline setting? | 3.05 | 68.1%
15. Has the SHRP2 TIM Responder Training, or equivalent, been incorporated into the local academy and/or technical college curriculums? | 2.01 | 20.2%

The impact of the widespread deployment of the National TIM Responder Training by FHWA is also being realized in scores elsewhere in the TIM SA. Among the top five scoring questions overall on the 2016 TIM SA is Question #6, "Are the TIM response roles and responsibilities of public and private sector TIM stakeholders mutually understood?" The question received an average score of 3.22 and 91.5 percent of participating locations scored this question 3 or higher.

The scoring guidance for Question #6 provides the following description:

Score 4 if: TIM roles and responsibilities are mutually understood by the majority of public and private sector disciplines. Roles and responsibilities are clearly documented with multidiscipline agreements, policies, or manuals. There is strong recognition that each discipline has a job to do and that safe, quick clearance is a priority for all. Routine multidiscipline training and exercises reinforce the importance of working as a team.

Using the guidance provided, 31 percent of the TIM SA respondents scored their programs 4 on Question #6. The companion non-scored supplemental question (#6a) asks respondents to describe how the roles and responsibilities of public and private sector TIM stakeholders are communicated. A majority of those providing supplemental responses point to the Strategic Highway Research Program 2 (SHRP2) National TIM Responder Training. This is further corroborated by examining locations that scored their programs high (3 or 4) on Question #6 and on the TIM training questions; 36 percent of TIM SA respondents scored their programs 3 or 4 on Questions #6, #12, and #14.
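
The corroboration described above amounts to a simple cross-tabulation of responses. A minimal sketch of that calculation, assuming each location's scores are held as one record (the records below are illustrative, not actual TIM SA responses):

    # Share of locations scoring 3 or 4 on all of Questions #6, #12, and #14.
    # Illustrative records only; the real TIM SA data is not reproduced here.
    responses = [
        {"location": "A", "q6": 4, "q12": 3, "q14": 4},
        {"location": "B", "q6": 3, "q12": 2, "q14": 3},
        {"location": "C", "q6": 4, "q12": 4, "q14": 3},
    ]

    high_on_all = [r for r in responses
                   if all(r[q] >= 3 for q in ("q6", "q12", "q14"))]
    share = 100 * len(high_on_all) / len(responses)
    print(f"{share:.0f} percent scored 3 or 4 on all three questions")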

Significant progress has been made in the area of TIM Performance Measurement over the past decade and the scores in the TIM PM subsection reflect that progress. Scores for both Roadway Clearance (RC) and Incident Clearance (IC) indicate that an increasing number of locations around the country are measuring both TIM PMs using the FHWA definitions and that the data is being used to impact operations. However, average scores for the third TIM PM, secondary crashes, are the lowest on the 2016 TIM SA. Only three questions scored a 2 or less on the 2016 TIM SA, and all three were questions on secondary crashes (Table 3).

Table 3. Traffic incident management (TIM) performance measures – secondary crashes.
Question | 2016 Average Score | Percent of TIM SA Respondents Scoring 3 or Higher
21. Is the number of Secondary Crashes being measured utilizing FHWA's standard definition "number of unplanned crashes beginning with the time of detection of the primary crash where a collision occurs either: a) within the incident scene; or b) within the queue, including the opposite direction, resulting from the original incident?" | 1.88 | 29.8%
22. How is data for the number of Secondary Crashes collected? | 2.00 | 31.9%
23. Has the TIM program established TIM performance targets for a reduction in the number of Secondary Crashes? | 1.36 | 8.5%

Slightly less than half (47.9 percent) of TIM SA respondents scored their program a 1 on Question #21, indicating that secondary crashes are typically not measured. Of the remaining locations that scored their program a 2 or higher, only 24 provided secondary incident data. Those locations reported that, on average, secondary incidents comprised 12.1 percent of all incidents, an increase from the 8.1 percent reported in 2015. However, caution should be taken in interpreting changes in the percentage of incidents reported as secondary, given the varied levels of use of FHWA's secondary incident definition.
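
The reported figure itself is straightforward: the count of incidents flagged as secondary under FHWA's definition divided by the count of all incidents. A sketch, assuming each incident record already carries a secondary flag assigned per that definition (the field names and records are illustrative):

    # Percent of incidents that are secondary crashes, given records already
    # flagged per FHWA's definition (a crash within the scene or queue of a
    # primary crash). Illustrative records only.
    incidents = [
        {"id": 1, "is_secondary": False},
        {"id": 2, "is_secondary": True},   # crash in the queue behind incident 1
        {"id": 3, "is_secondary": False},
    ]

    share = 100 * sum(i["is_secondary"] for i in incidents) / len(incidents)
    print(f"Secondary crashes: {share:.1f} percent of all incidents")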

The comments provided by TIM SA respondents to the questions on secondary crashes indicate that while their definition of a secondary incident may match FHWA's, there is limited data collection and analysis on those incidents. Similar to last year, some TIM SA respondents indicated that their program is in the process of either developing methods for collecting secondary crash data or revising current accident reporting systems to capture it, in the hopes of including this metric in the TIM SA in future years.

Another output of the TIM SA is the TIM Performance Measures (PM) Database. This database is populated annually based on responses to the TIM SA. Information on the three key PM metrics – Roadway Clearance Time (RCT), Incident Clearance Time (ICT), and secondary crashes – is tracked annually and compared to a Baseline (2011) level.

Average RCT decreased to 47.79 minutes in 2016, down 25.1 percent from the 63.80 minutes reported in 2015. Average ICT decreased by 13.3 percent from 2015 to 2016 (61.53 minutes in 2015 versus 53.36 minutes in 2016). However, one challenge with the TIM PM Database is the lack of consistent data provision by TIM SA participants. The TIM PM data requested is part of the non-scored, supplemental data, and therefore the set of locations submitting it varies from year to year.
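
Both clearance measures reduce to differences between event timestamps under the FHWA definitions quoted in Appendix A (Questions #17 and #18). A minimal sketch, assuming the three timestamps are recorded for each incident (the field names and times are illustrative):

    from datetime import datetime

    # FHWA definitions (Appendix A, Questions #17-18):
    #   RCT: first recordable awareness -> all lanes available for traffic flow
    #   ICT: first recordable awareness -> last responder has left the scene
    incident = {
        "first_awareness":        datetime(2016, 5, 1, 14, 2),
        "all_lanes_open":         datetime(2016, 5, 1, 14, 50),
        "last_responder_departs": datetime(2016, 5, 1, 15, 8),
    }

    rct = (incident["all_lanes_open"] - incident["first_awareness"]).total_seconds() / 60
    ict = (incident["last_responder_departs"] - incident["first_awareness"]).total_seconds() / 60
    print(f"RCT: {rct:.0f} min, ICT: {ict:.0f} min")  # RCT: 48 min, ICT: 66 min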

The TIM programs that achieved the highest scores in the Strategic section are listed alphabetically in Table 4. Jurisdictions with low scores may wish to reach out to these locations for information on best practices.

Table 4. Highest scoring – strategic.
Traffic Incident Management (TIM) Program
Chattanooga, Tennessee
Cincinnati, Ohio
Columbus, Ohio
Louisville, Kentucky
Milwaukee, Wisconsin

Tactical

The 22 questions in the Tactical section are focused on the following three areas:

  • TIM Laws
  • Policies and Procedures for Incident Response and Clearance
  • Responder and Motorist Safety

The Tactical section remains the highest scoring of the three TIM SA sections, achieving an overall score of 72.8 percent. Three of the five highest scoring questions on the 2016 TIM SA are in the Tactical section, in the Policies and Procedures subsection (Table 5).

Table 5. Traffic incident management (TIM) policies and procedures – highest scoring in 2016.
Question | 2016 Average Score | Percent of TIM SA Respondents Scoring 3 or Higher
35. Is there a policy in place that clearly identifies reportable types and quantities, and appropriate Hazmat response? | 3.26 | 88.3%
36. Does at least one responding agency have the authority to override the decision to utilize the responsible party's Hazmat contractor and call in other resources? | 3.36 | 86.2%
40. Is there a procedure in place for removal of abandoned vehicles? | 3.34 | 79.8%

High scores in this area can be attributed, in part, to the National TIM Responder Training which emphasizes the need for policies and procedures that provide for responder and motorist safety and quick clearance.

Question #30 queries TIM SA respondents on the use of the Incident Command System (ICS) while on scene. With an average score of 3.16 in 2016 and 85.1 percent of locations scoring this question 3 or higher, it is evident that use of ICS is widespread. Lesson #6 of the National TIM Training Program focuses on command responsibilities, including ICS and Unified Command (UC), and the high score here may be attributable, in part, to the large number of responders participating in the national TIM training.

There are two questions in the TIM SA that query respondents on Safety Service Patrols (#28 and #29). The first asks about the existence of a Safety Service Patrol and the second asks respondents to score the Safety Service Patrol's level of coverage. Nearly 50 percent (48.9 percent) of respondents scored both questions 3 or 4 (with 31 percent scoring both questions 4), meaning that a large number of Safety Service Patrols across the country operate at a mid-level to full-function capability. Services provided by these patrols span motorist assistance, incident response and clearance, and emergency traffic control and scene management. Furthermore, these patrols range from medium fleets providing service on most major roadways to fleets large enough to provide ample coverage on all major roadways.

Sixty-seven percent of the 2016 TIM SA respondents provided information on levels of coverage, reporting combined Safety Service Patrol coverage of 10,740 centerline miles and 12,419 lane miles (some programs reported centerline miles, others lane miles). The median centerline-mile coverage reported by 2016 TIM SA respondents was 106, and the median lane-mile coverage was 369.
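
Because some programs report centerline miles and others lane miles, the two totals above are not directly additive: lane miles count each travel lane separately, so lane miles equal centerline miles multiplied by the number of through lanes. A small illustration (the segments and lane counts are hypothetical):

    # Lane miles = centerline miles x number of through lanes per segment.
    # Hypothetical segments for illustration only.
    segments = [
        {"centerline_miles": 10.0, "lanes": 4},
        {"centerline_miles": 6.5,  "lanes": 6},
    ]

    centerline = sum(s["centerline_miles"] for s in segments)               # 16.5
    lane_miles = sum(s["centerline_miles"] * s["lanes"] for s in segments)  # 79.0
    print(f"{centerline} centerline miles -> {lane_miles} lane miles")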

The TIM programs that achieved the highest scores in the Tactical section are listed alphabetically in Table 6. Jurisdictions with low scores may wish to reach out to these locations for information on best practices.

Table 6. Highest scoring – tactical.
TIM Program
Atlanta, Georgia
Cincinnati, Ohio
Dallas-Ft. Worth, Texas
Seattle, Washington
Virginia – Northern Virginia/Suburban Washington, DC

Support

The questions in the Support section focus on the tools and technologies enabling improved incident detection, response, and clearance. The 2015 TIM SA revision removed questions on Traveler Information, returning the emphasis to the infrastructure and activities that enable incident information exchange between TIM program stakeholders. This allows programs to rate their progress on items over which their TIM program has control, and it aligns the Support section with one of the three key objectives of the National Unified Goal for Traffic Incident Management: prompt, reliable, interoperable communications.

The five questions in the Support section all address TIM data sharing and integration among TIM stakeholders. The highest scoring question in the Support section was Question #47 (below), which received an average score of 3.37, the highest of any question on the 2016 TIM SA.

47. Are TIM stakeholders aware of and actively utilizing Traffic Management Center/Traffic Operations Center resources to coordinate incident detection, notification and response?

The questions on data and video sharing between agencies provide greater granularity on the level of sharing. While the two questions achieved nearly identical average scores, a higher percentage of TIM SA respondents scored their program 3 or higher on the data question (#48) than on the video question (#49) (Table 7).

Table 7. Traffic incident management (TIM) data and video collection and use.
Question | 2016 Average Score | Percent of TIM SA Respondents Scoring 3 or Higher
48. What TIM data (i.e., number of involved vehicles, number of lanes blocked, length of queue, etc.) is captured via TMCs and/or public safety CAD systems and is it shared with other disciplines for real-time operational purposes? | 2.87 | 77.7%
49. Is TIM video captured via TMCs and/or public safety CAD systems and is it shared with other disciplines for real-time operational purposes? | 2.85 | 72.3%

The Support section achieved the second highest overall score (69.7 percent) and the largest increase over Baseline of the three sections (75.7 percent).

The TIM programs that achieved the highest scores in the Support section are listed alphabetically in Table 8. Jurisdictions with low scores may wish to reach out to these locations for information on best practices.

Table 8. Highest scoring – support.
Traffic Incident Management (TIM) Program
Alachua – Bradford, Florida
Cincinnati, Ohio
Columbus, Ohio
El Paso, Texas
Louisville, Kentucky
Omaha, Nebraska
Philadelphia, Pennsylvania
Phoenix, Arizona
San Diego, California
Seattle, Washington
Washington, DC

Summary

A total of 94 TIM SAs were completed in 2016, with an average overall score of 68.6 percent (out of a possible 100 percent). Overall scores were up 35.3 percent over the Baseline. The TIM SA mean scores tended to be higher in larger metropolitan areas than in smaller areas. Specifically, mean scores were calculated for the top 40 metropolitan areas (by population), the top 75 metropolitan areas, and non-top 75 metropolitan areas:

  • Top 40 metros: 73.4 percent
  • Top 75 metros: 70.9 percent
  • Non-top 75: 63.1 percent
  • Overall: 68.6 percent

The highest scores were achieved in Tactical (72.8 percent) and the largest percentage increase in scores from the Baseline was in Support (75.7 percent). Low scoring questions and those with the least improvement over Baseline indicate specific program areas where additional guidance from FHWA may be warranted.

The lowest scoring questions on this year's TIM SA, as described above, were all on secondary crashes. Data on secondary crashes, and on the larger suite of TIM Performance Measures, is a key focus of FHWA's Every Day Counts (EDC-4) initiative for 2017-2018,[2] and as such, scores in this area should increase in the coming years.

Another indicator of potential focus areas for FHWA is questions showing little change over Baseline. Fifteen questions on this year's TIM SA changed less than 10 percent from Baseline. However, six of those questions had average scores of 3 or more, leaving less room for improvement.[3] These are areas where TIM programs have consistently scored well and continue to do so, including towing and recovery and hazmat response procedures.

Another area with lower scores (<3) and little improvement over Baseline (<10 percent) is in the Tactical section and deals with response procedures when the incident involves a fatality:

37. For incidents involving a fatality, is there a procedure in place for early notification and timely response of the Medical Examiner?

38. For incidents involving a fatality, is there a procedure for the removal of the deceased prior to Medical Examiner arrival?

Scores for Questions #37 and #38 are averaged as a composite question; in 2016 the average score was 2.59, representing a 2.2 percent increase over Baseline. Additionally, only 56.9 percent of responding locations scored this question 3 or higher. This is an area that could receive additional emphasis and instruction in the National TIM Responder Training course as part of Lesson #8 on Special Circumstances.



APPENDIX A. Summary of 2016 Traffic Incident Management (TIM) Self-Assessment (SA) Results

Question | Mean Score (Baseline) | Mean Score (2016) | % Change from Baseline | % Scoring 3 or Higher (Baseline) | % Scoring 3 or Higher (2016)
Strategic
1. Is there a formal TIM program that is supported by a multidiscipline, multi-agency team or task force, which meets regularly to discuss and plan for TIM activities? 1.9 2.94 54.5% 28.0% 76.1%
2. Are all disciplines and agencies participating in on-going TIM enhancement activities/efforts? 1.9 2.94 54.5% 28.0% 76.1%
3. Is the importance of TIM understood by all TIM stakeholders and supported by multidiscipline, multi-agency agreements or memorandums of understanding (MOUs)? 1.71 2.68 56.8% 18.0% 56.4%
4. Is agency leadership actively involved in program-level TIM decisions (i.e. policy establishment, training, funding, legislation, etc.)? 1.71 2.80 63.6% 18.0% 66.0%
5. Is there a full-time position within at least one of the participating agencies with responsibility for coordinating the TIM program as their primary job function? 2.28 2.86 25.5% 54.0% 54.3%
6. Are the TIM response roles and responsibilities of public and private sector TIM stakeholders mutually understood? 1.71 3.22 88.5% 18.0% 91.5%
7. Is planning to support TIM activities, including regular needs assessments, done across and among participating agencies? 1.35 2.82 108.8% 12.0% 66.0%
8. Are funds available for TIM activities? 1.71 2.47 44.3% 18.00% 47.9%
9. Is TIM considered and incorporated into planning efforts for construction and work zones? 2.47 3.19 29.1% 35.0% 80.1%
10. Is TIM considered and incorporated into planning efforts for special events such as sporting events, concerts, conventions, etc? 2.47 3.18 29.1% 35.0% 80.1%
11. Is TIM considered and incorporated into planning efforts for weather-related events? 2.47 3.18 29.1% 35.0% 80.1%
12. Have stakeholders in the region participated in a SHRP2 National TIM Responder Training Program, or equivalent, Train-the-Trainer (TtT) session and are they actively training others? 1.26 2.57 104.3% 9.0% 51.1%
13. What percentage (estimated) of TIM responders in the region identified as needing training have received the 4-Hour SHRP2 TIM Responder Training (in-person or via Web-Based Training), or equivalent? 2.82 2.90 2.9% 57.9% 61.7%
14. Is the SHRP2 TIM Responder Training being conducted in a multidiscipline setting? 2.97 3.05 2.9% 66.3% 68.1%
15. Has the SHRP2 TIM Responder Training, or equivalent, been incorporated into the local academy and/or technical college curriculums? 1.77 2.01 13.8% 10.5% 20.2%
16. Does the TIM program conduct multidiscipline, multi-agency after-action reviews (AARs)? 1.62 2.6 60.2% 18.0% 47.9%
17. Is Roadway Clearance Time being measured utilizing FHWA's standard definition "time between first recordable awareness of an incident by a responsible agency and first confirmation that all lanes are available for traffic flow?" 0.64 2.63 310.6% 3.0% 56.4%
18. Is Incident Clearance Time being measured utilizing FHWA's standard definition "time between the first recordable awareness of the incident and the time at which the last responder has left the scene?" 0.64 2.40 275.7% 3.0% 48.9%
19. How is data for Roadway /Incident Clearance Time being collected? 0.64 2.74 328.9% 3.0% 62.8%
20. Has the TIM program established TIM performance targets for Roadway/Incident Clearance Time? 1.16 2.21 90.8% 4.0% 34.0%
21. Is the number of Secondary Crashes being measured utilizing FHWA's standard definition "number of unplanned crashes beginning with the time of detection of the primary crash where a collision occurs either a) within the incident scene or b) within the queue, including the opposite direction, resulting from the original incident?" 1.03 1.88 82.8% 8.0% 29.8%
22. How is data for the number of Secondary Crashes collected? 1.88 2.00 6.1% 29.5% 31.9%
23. Has the TIM program established TIM performance targets for a reduction in the number of Secondary Crashes? 1.16 1.36 17.4% 4.0% 8.5%
24. Is TIM performance data used to influence and/or improve operations? 2.21 2.29 3.5% 35.8% 39.4%
Tactical
25. Is an Authority Removal Law in place and understood by TIM stakeholders? 2.92 3.02 3.5% 67.0% 75.5%
26. Is a Driver Removal Law in place and understood by TIM stakeholders? 3.01 2.91 -3.2% 71.0% 78.7%
27. What activities are in place to outreach to and educate the public and elected officials about TIM? 2.38 2.50 5.1% 46.3% 48.9%
28. Is there a Safety Service Patrol program in place for incident and emergency response? 2.73 3.04 11.3% 67.0% 75.0%
29. What level of coverage does the Safety Service Patrol program provide? 2.73 3.04 11.3% 67.0% 75.0%
30. Do TIM responders routinely utilize the Incident Command System (ICS), specifically Unified Command (UC), while on scene? 2.55 3.16 23.9% 58.0% 85.1%
31. Are temporary traffic control (TTC) devices (e.g., cones, advanced warning signs, etc.) pre-staged in the region to facilitate timely response? 2.21 2.60 17.5% 41.0% 57.4%
32. Do towing and recovery procedures/rotation list policies deploy resources based on type/severity of incident? 3.14 3.17 1.1% 74.7% 80.9%
33. Do towing and recovery procedures/rotation list policies include company/operator qualifications, equipment requirements, and/or training requirements? 2.86 2.91 1.9% 67.0% 69.1%
34. Do towing and recovery procedures/rotation list policies include penalties for non-compliance of response criteria? 2.49 2.59 3.6% 55.8% 57.4%
35. Is there a policy in place that clearly identifies reportable types and quantities, and appropriate Hazmat response? 2.89 3.26 12.6% 69.0% 88.3%
36. Does at least one responding agency have the authority to override the decision to utilize the responsible party's Hazmat contractor and call in other resources? 3.2 3.36 4.4% 9.0% 86.2%
37. For incidents involving a fatality, is there a procedure in place for early notification and timely response of the Medical Examiner? 2.53 2.59 2.2% 55.0% 56.9%
38. For incidents involving a fatality, is there a procedure for the removal of the deceased prior to Medical Examiner arrival? 2.53 2.59 2.2% 55.0% 56.9%
39. Are there procedures in place for expedited crash investigations? 2.59 2.78 7.2% 72.0% 53.2%
40. Is there a procedure in place for removal of abandoned vehicles? 3.47 3.34 -3.7% 91.0% 79.8%
41. Do standardized, documented TIM response procedures/guidelines exist? 2.73 2.73 0.3% 61.1% 64.9%
42. Do TIM responders routinely utilize temporary traffic control devices to provide traffic control for the three incident classifications (minor, intermediate, major) in compliance with the MUTCD? 1.93 2.84 47.2% 27.0% 61.7%
43. Do TIM responders routinely utilize traffic control procedures to provide back of traffic queue warning to approaching motorists? 1.56 2.67 71.2% 17.0% 56.4%
44. Is there a mutually understood procedure/guideline in place for safe vehicle positioning? 1.28 2.97 131.9% 14.0% 72.3%
45. Are there mutually understood procedures/guidelines in place for use of emergency-vehicle lighting? 1.28 2.97 131.9% 14.0% 72.3%
46. Are TIM responders following high-visibility safety apparel requirements as outlined in the MUTCD? 1.28 2.97 131.9% 14.0% 72.3%
Support
47. Are TIM stakeholders aware of and actively utilizing Traffic Management Center/Traffic Operations Center resources to coordinate incident detection, notification and response? 1.98 3.37 70.3% 41.0% 89.4%
48. What TIM data (i.e., number of involved vehicles, number of lanes blocked, length of queue, etc.) is captured via TMCs and/or public safety CAD systems and is it shared with other disciplines for real-time operational purposes? 1.43 2.87 100.9% 10.0% 77.7%
49. Is TIM video captured via TMCs and/or public safety CAD systems and is it shared with other disciplines for real-time operational purposes? 1.43 2.85 99.4% 10.0% 72.3%
50. Are there policies or procedures in place for signal timing changes to support traffic management during incident response? 1.55 2.19 41.4% 18.0% 34.0%
51. Are there pre-planned detour and/or alternate routes identified and shared between TIM stakeholders? 1.55 2.66 71.6% 18.0% 60.6%

[1] In both the previous TIM SA scoring schema and the newly revised scoring schema (implemented in 2015), scores of 3 and 4 indicate the highest levels of progress for a particular question.

[2] U.S. Department of Transportation, Federal Highway Administration. Using Data to Improve Traffic Incident Management. Available online at: https://www.fhwa.dot.gov/innovation/everydaycounts/edc_4/timdata.cfm

[3] The questions with high scores (3+) but little (<10%) increase over Baseline are Questions #14, #25, #32, #33, #34, and #36 (see Appendix A).
