Emergency Transportation Operations

2014 Traffic Incident Management National Analysis Report

October 2014

United States Department of Transportation Federal Highway Administration

Office of Operations
1200 New Jersey Avenue, SE
Washington, DC 20590

Contact Information: OperationsFeedback@dot.gov


Table of Contents

Background

Strategic

Tactical

Support

Summary

Appendix A. Summary of 2014 TIM SA Results

List of Tables

Table 1. Mean Score for Each Section (Baseline and 2014)
Table 2. Comparison of TIM SA National Average to National Median Scores
Table 3. Highest Scoring – Strategic
Table 4. Highest Scoring – Tactical
Table 5. Highest Scoring – Support


Background

The Traffic Incident Management Self-Assessment (TIM SA) was first developed by the Federal Highway Administration (FHWA) in 2002 as a benchmarking tool for evaluating TIM program components and overall TIM program success. The initial assessments were conducted in 2003, and assessments have been conducted annually since then. The TIM SA serves several functions: it gives state and local TIM program managers a tool to assess progress and identify areas for improvement at the state and local levels, and analysis of the aggregated TIM SA results allows FHWA to identify program gaps and better target TIM program resources.

In 2014, a total of 99 locations completed a TIM SA for inclusion in the national analysis. The 34 scored questions in the TIM SA are grouped into three sections: Strategic, Tactical and Support. To benchmark progress over time, the initial assessments, completed in 2003 and 2004 with one in 2005 (78 in total), have been used each year as the Baseline.

Table 1 shows the average score for each of the three TIM SA sections from the Baseline and 2014, along with the percentage change from the Baseline.

Table 1. Mean Score for Each Section (Baseline and 2014)
Section | # of Questions | Mean Score, Baseline (percent) | Mean Score, 2014 (percent) | High Score 2014 (possible) | Change from Baseline (percent) | Section Weight (percent)
Strategic | 12 | 35.0% | 61.7% | 30 (30) | 76.2% | 30%
Tactical | 16 | 64.1% | 80.5% | 40 (40) | 25.6% | 40%
Support | 6 | 39.4% | 78.4% | 30 (30) | 99.0% | 30%
Overall | 34 | 48.0% | 74.2% | 99.7 (100) | 54.8% | 100%
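
Although the report does not spell out its aggregation formula, the Table 1 figures are consistent with a simple weighted average of the three section scores. The following sketch is a hypothetical reconstruction (the weighting scheme is an assumption) that reproduces both the Baseline and 2014 overall scores:

```python
# Hypothetical reconstruction: overall TIM SA score as a weighted average
# of the three section mean scores, using the section weights from Table 1.
WEIGHTS = {"Strategic": 0.30, "Tactical": 0.40, "Support": 0.30}

def overall_score(section_scores):
    """Combine section mean scores (in percent) into an overall score."""
    return sum(WEIGHTS[s] * section_scores[s] for s in WEIGHTS)

baseline = {"Strategic": 35.0, "Tactical": 64.1, "Support": 39.4}
year_2014 = {"Strategic": 61.7, "Tactical": 80.5, "Support": 78.4}

print(round(overall_score(baseline), 1))   # 48.0 -- matches Table 1
print(round(overall_score(year_2014), 1))  # 74.2 -- matches Table 1
```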

The 2014 overall TIM SA score was 74.2 percent (out of a possible 100 percent), representing a 54.8 percent increase over the Baseline. TIM SA mean scores tended to be higher in larger metropolitan areas than in smaller areas. Specifically, mean scores were calculated for the top 40 metropolitan areas (by population), the top 75 metropolitan areas and non-top 75 metropolitan areas:

  • Top 40 metros: 80.6%
  • Top 75 metros: 76.9%
  • Non-top 75: 67.7%
  • Overall: 74.2%

There was little incremental change in the overall score in 2014 from the 2013 TIM SA, with the overall national score increasing 0.4 percent, from 73.9 to 74.2 percent. This should not, however, be construed to imply that improvement in TIM program performance is slowing or that little progress is being made to advance TIM program excellence across the country. In fact, several factors contributed to the smaller incremental change this year and may continue to do so going forward.

  • New TIM programs are completing and submitting a TIM SA as part of the national analysis. These new, emerging programs have not had the time or experience to achieve high scores on the TIM SA, and their lower scores create downward pressure on the overall TIM SA national score. However, the emergence of new TIM programs nationwide points to increased awareness of the important role of TIM in roadway safety and mobility. As such, a better indicator of overall TIM program performance nationwide may be the change in the median national TIM SA score: the median is more representative of central tendency and mitigates the impact of outlier scores at both the low and high ends (see the illustrative sketch following Table 2). The median national TIM SA score over the last several years is shown in Table 2 below.
  • Established TIM programs are routinely achieving high scores, leaving little room for improvement. As shown in Table 2, the top end of the range of scores continues to climb and is now close to a perfect score of 100.
Table 2. Comparison of TIM SA National Average to National Median Scores
Year | # of TIM SA | National Average Score | Range of Scores | National Median Score
2014 | 99 | 74.2 | 10.4 – 99.7 | 78.0
2013 | 93 | 73.9 | 24.1 – 97.4 | 76.3
2012 | 104 | 70.2 | 6.7 – 96.7 | 73.8
2011 | 93 | 68.2 | 4.5 – 95.8 | 70.8
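
The first bullet above argues that the median better reflects central tendency as new, low-scoring programs join the analysis. The short sketch below, using made-up scores, illustrates the effect: a few low outliers pull the mean down sharply while barely moving the median.

```python
# Illustrative sketch with hypothetical scores: outliers and central tendency.
from statistics import mean, median

established = [92.0, 88.5, 85.0, 81.0, 79.5, 76.0]  # mature, high-scoring programs
emerging = [24.0, 15.5]                             # newly assessed programs

scores = established + emerging
print(round(mean(scores), 1))    # 67.7 -- dragged down by the two low outliers
print(round(median(scores), 1))  # 80.2 -- stays near the established cluster
```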

A listing of all 34 TIM SA questions, their respective Baseline and 2014 scores, and the percentage of programs scoring each question 3 or higher¹ can be found in Appendix A.

Strategic

The questions in the Strategic section asked respondents to rate progress in how the TIM program is organized, resourced, supported and sustained. Key elements of this section include multi-agency coordination and TIM performance measures. While the Strategic section had the lowest score of the three sections (61.7%), the Strategic questions have realized a 76.2 percent increase compared to the Baseline, indicating improvement in this area.

Despite progress in the Strategic area, the five questions receiving the lowest mean score in the TIM SA were all in this section, with four out of five coming from the subsection on TIM Performance Measurement. The questions on TIM Performance Measurement have consistently been among the lowest scoring on the TIM SA. The TIM Performance Measurement subsection focused on three key metrics: Roadway Clearance Time (RCT), Incident Clearance Time (ICT), and reduction of secondary incidents. Of the three performance measures, reduction in secondary incidents (Question 4.1.3.5) had the lowest score (1.13), which represents only a 9.8 percent improvement over the Baseline. This question was first introduced as part of the TIM SA Revision in 2009 and it has been the lowest scoring individual question on the TIM SA each year since then.

Almost half of respondents (47.5%) stated that there was "No Activity" in this area. The comments indicate that the absence of a clear definition of what constitutes a secondary incident hinders data collection and analysis in this area. For those areas that do report progress, several of the comments point to the inclusion of secondary incident information on the crash report form. Scores on this question may advance in the future with identification of a standard definition for secondary incidents. Additionally, FHWA should continue to promote the reduction of secondary incidents as part of the overall return-on-investment for Safety Service Patrols and TIM programs in general.

Another important part of the TIM SA is the TIM Performance Measures (PM) Database. This database is populated annually based on responses to the TIM SA. Information on the three key PM metrics (RCT, ICT, and secondary incidents) is tracked annually and compared to a Baseline (2011) level. Average RCT decreased to 62.93 minutes in 2014, down 14 percent from the 73.16 minutes reported in 2013. This is also the first year since the Baseline (2011) that average RCT is below the Baseline of 65.39 minutes.

In terms of ICT, the overall average time increased by 13.2 percent from 2013 to 2014 (56.58 minutes in 2013 versus 64.07 minutes in 2014). However, looking just at the locations that submitted ICT data in 2013 and 2014, the increase in ICT was less dramatic, moving from 56.34 minutes in 2013 to 59.96 minutes in 2014 (6.4% increase). Therefore, the increase in the overall average (13.2%) may reflect the addition of new locations submitting ICT data in 2014.
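
The cohort effect described above can be made concrete with the figures in this paragraph: the repeat-location cohort moved only modestly, while the all-locations average, which mixes in new submitters, shows roughly double the increase. A minimal sketch:

```python
# Minimal sketch using the ICT averages cited above (minutes).
def pct_change(old, new):
    """Year-over-year percentage change."""
    return (new - old) / old * 100

# Locations that reported ICT in both 2013 and 2014
print(round(pct_change(56.34, 59.96), 1))  # 6.4 -- modest cohort increase

# All reporting locations (2014 includes new submitters)
print(round(pct_change(56.58, 64.07), 1))  # 13.2 -- larger apparent increase
```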

Just over 50 percent of locations indicated some activity on the tracking of reductions in secondary incidents (4.1.3.5) and of those locations, only 19 provided secondary incident data for the TIM PM database. In 2014, those locations reported that, on average, secondary incidents comprised just 2 percent of all incidents, a slight decline from 2.8 percent reported in 2013. However, caution should be taken in interpreting changes in the percentage of incidents reported as secondary given the lack of a uniform definition for secondary incidents.

In addition to the questions on performance measures, the Strategic section included other areas with room for improvement. Question 4.1.2.1, on the multi-agency agreements/MOUs used to structure TIM programs, received the second lowest individual score on the overall TIM SA (2.00), and fewer than 50 percent of the assessments scored this question 3 or higher. This question was divided into four sub-questions to query specific elements of multi-agency coordination:

  1. Is the agreement/MOU signed by top officials from participating agencies?
  2. Are incident scene roles and responsibilities for each participating agency clearly defined in the agreement and communicated to all participating agencies?
  3. Are agency roles and responsibilities for planning for and funding for the TIM program clearly defined in the agreement/MOU?
  4. Are safe, quick clearance goals stated as time goals for incident clearance (e.g. 90 minutes) in the agreement/MOU?

The lowest scoring of the four sub-questions was the third, regarding defined agency roles for planning and funding (1.44). The lack of a formal structure for multiagency collaboration continues to hinder the advancement of multiagency TIM programs beyond ad hoc activities.

The low scores on this question suggest that many TIM programs lack a formal structure for multiagency collaboration. Furthermore, even in locations that do have formal agreements, many do not have a process in place to systematically review and renew these agreements. Respondents were asked how frequently the agreements/MOUs were updated and nearly a third responded "as needed." Absent a planned, systematic review and update process for these agreements, there is a risk that they will become obsolete and eventually disregarded by the participating agencies.

As it has been for the past five years, the highest score in the Strategic section was achieved in planning for special events (4.1.1.4), with a mean score of 3.55; this was the fifth highest scoring question overall in the 2014 TIM SA. The score for this question is the composite average of individual scores for planning for four types of events: Construction and Maintenance; Sporting Events, Concerts, Conventions; Weather-related Events; and Catastrophic Events. Among these, Sporting Events, Concerts, Conventions (4.1.1.4.b) and Weather-related Events (4.1.1.4.c) achieved the highest mean scores of 3.60 and 3.56, respectively, while Catastrophic Events received the lowest score (3.49). However, with a high composite score for planned special events (3.55) and a limited delta between the individual event type scores (0.11), there is little to indicate that additional support, training or outreach from FHWA on any one event type would result in overall advancement of the scores on this question.
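
The composite arithmetic for this question can be checked with a short sketch. Assuming the composite is a simple average of the four event-type sub-scores (an assumption; the report does not state the formula), the unreported Construction and Maintenance sub-score is implied by the other three:

```python
# Illustrative sketch: inferring the unreported sub-score under the assumption
# that the 4.1.1.4 composite is a simple average of its four sub-scores.
composite = 3.55
reported = {
    "Sporting Events, Concerts, Conventions": 3.60,
    "Weather-related Events": 3.56,
    "Catastrophic Events": 3.49,
}

implied_construction = 4 * composite - sum(reported.values())
print(round(implied_construction, 2))  # 3.55 -- implied Construction and Maintenance score

delta = max(reported.values()) - min(reported.values())
print(round(delta, 2))  # 0.11 -- the limited spread cited above
```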

The TIM programs that achieved the highest scores in the Strategic section are listed alphabetically in Table 3. Jurisdictions with low scores may wish to reach out to these locations for information on best practices.

Table 3. Highest Scoring – Strategic
TIM Program
Greensboro, NC
Jacksonville, FL
Kansas City, MO/KS
Knoxville, TN
Louisville, KY

Tactical

The questions in the Tactical section focused on the policies and procedures used by TIM professionals when responding to incidents, including the policies and procedures in place to ensure motorist and responder safety. Collectively, these questions continue to score among the highest in the TIM SA and reflect many of the core competencies taught as part of the National TIM Responder Training sponsored by FHWA. In 2014 this section achieved an overall score of 80.5 percent, making it the highest scoring of the three sections. Three of the five questions achieving the highest mean score in the 2014 TIM SA were in the Tactical section.

Safe, quick clearance (SQC) laws are a key element of the Tactical section. Question 4.2.2.1 on Move Over laws received the highest mean score (3.66) in the Tactical section, indicating a high degree of success in promulgating Move Over laws. Question 4.2.1.1 on Authority Removal had a 2014 mean score of 3.28, and the third SQC law, Driver Removal (4.2.1.2), scored 3.03. Scores for all three laws continue to trend upward; however, Driver Removal had the smallest increase over the Baseline (0.7%) in the 2014 TIM SA, suggesting a need for education on the importance of enacting Driver Removal laws. Additionally, more work needs to be done on the implementation of each of the SQC laws. All three scores were composites that first asked whether the law existed and then asked whether the law was utilized, communicated, or enforced (depending on the law in question). All three laws had lower scores on the execution element of the composite. While passage of the laws is important, there will be no safety benefit if the laws are not utilized. Driver Removal laws had the lowest implementation score, which is likely one reason this question scored lowest of the three SQC law questions.
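
Because each SQC law question combines an existence element with an execution element, strong legislation with weak implementation still yields a middling composite. Below is a minimal sketch of that structure, with hypothetical sub-scores and assuming the two elements are averaged equally (the report does not state the exact formula):

```python
# Hypothetical sketch of the two-part SQC composite described above.
def sqc_composite(existence_score, execution_score):
    """Average the existence and execution elements of an SQC law question."""
    return (existence_score + execution_score) / 2

# A law that is broadly enacted (3.8) but weakly utilized (2.2)
print(sqc_composite(3.8, 2.2))  # 3.0 -- execution drags the composite down
```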

The lowest scoring question in the Tactical section dealt with equipment staging and lighting procedures that maximize traffic flow around the incident while also protecting responders (4.2.2.5). However, while it scored lowest among the Tactical questions, it achieved one of the highest year-over-year increases from 2013 to 2014 (6.7%) among all 34 questions on the TIM SA.

This question queried respondents about four specific types of procedures which received the following scores:

  1. PPE (Personal Protective Equipment) used by responders: 3.51
  2. Vehicle and equipment staging procedures: 2.90
  3. Light-shedding procedures: 2.46
  4. Pre-established, signed accident investigation sites: 1.27

In reviewing the comments submitted for the first three (PPE, staging procedures and light-shedding), nearly one-fourth of the respondents reference training, and specifically the SHRP 2 National TIM Responder Training, as being responsible for their scores on these sub-questions. Therefore, it can be expected that scores for this question, and specifically these three sub-questions, will increase in the coming years as the National TIM Training Course reaches more responders.

In response to the sub-question on pre-established, signed accident investigation sites, a number of the comments reference more informal policies to remove vehicles from the incident scene to nearby parking lots or other areas where the investigation can be more safely conducted.

The use of traffic control procedures at the end of the incident traffic queue is an effective strategy for reducing secondary incidents. However, among the questions in the Tactical section, this question (4.2.2.4) had the smallest percentage of assessments (65%) scoring 3 or higher, indicating an area where additional outreach and education by FHWA could improve scores. Additionally, its score decreased slightly (3.3%) from 2013. The use of traffic control procedures downstream of an incident may be limited by available resources (access to DMS/CMS and other means of advance notification) and, as such, may be more prevalent in metropolitan areas where equipment and technology more readily facilitate advance notification and traffic control. The difference in scores between the top 40 metro areas (average 2.93) and the non-top 75 areas (average 2.76) corroborates this.

The TIM programs that achieved the highest scores in the Tactical section are listed alphabetically in Table 4. Jurisdictions with low scores may wish to reach out to these locations for information on best practices.

Table 4. Highest Scoring – Tactical
TIM Program
Ft. Pierce, FL
Ft. Lauderdale, FL
Greensboro, NC
Knoxville, TN
Washington – Statewide

Support 

The questions in the Support section focused on the tools and technologies enabling improved incident detection, response and clearance. Without the infrastructure and back-office support for incident information exchange, detection, verification, response and clearance times are delayed, and responder and motorist safety is jeopardized. As a result, one of the three key objectives of the National Unified Goal for Traffic Incident Management is prompt, reliable, interoperable communications.

The Support section had the second highest overall score (78.4 percent) and the largest increase over the Baseline of the three sections (99.0%). The rapid increase in scores indicates that technology and data analysis are becoming increasingly prevalent in TIM operations.

The provision of real-time motorist information, to include incident-specific information (4.3.2.1), scored the highest of the questions in the Support section (3.57) and received the third highest individual score on the overall 2014 TIM SA. This score is a composite score querying the use of three different methods for providing motorist information:

  • Traveler information delivered via 511/website – 3.70
  • Traveler information delivered via mobile applications – 3.47
  • Traveler information delivered through traffic media access to TMC/TOC data/information – 3.54

Nearly a third of the locations (30%) specifically cite use of a 511 system (phone, website or both), with Trip Check and Quick Map also being cited by a number of locations.

Traveler information services have also seen a considerable increase in score compared to the Baseline as a result of technological advances.  The provision of travel time estimates to motorists (4.3.2.2) achieved one of the highest percentage increases from the Baseline (219.4%). 

While the Support section has a number of high-scoring questions, a few suggest room for improvement. Two of the questions in this section were among the bottom five in terms of year-over-year change. Both 4.3.1.2 (data/video sharing between agencies) and 4.3.1.4 (interoperable, interagency communications on-site between incident responders) experienced decreases in mean score from 2013 to 2014 (3.1% and 3.6%, respectively). The locations that scored these questions lowest (No Activity or Little Activity) tend to be more rural areas with emerging TIM programs.

The TIM programs that achieved the highest scores in the Support section are listed alphabetically in Table 5.  Jurisdictions with low scores may wish to reach out to these locations for information on best practices.

Table 5. Highest Scoring – Support
TIM Program
Brevard, FL
Greensboro, NC
Kansas City, MO/KS
Lake Sumter, FL
Marion County, FL
Orlando, FL
Salt Lake City, UT
Volusia – Flagler, FL
Washington – Statewide

Summary

A total of 99 TIM SAs were completed in 2014, with an average overall score of 74.2 percent (out of a possible 100 percent). Overall scores were up 54.8 percent compared to the Baseline. TIM SA mean scores tended to be higher in larger metropolitan areas than in smaller areas. Specifically, mean scores were calculated for the top 40 metropolitan areas (by population), the top 75 metropolitan areas and non-top 75 metropolitan areas:

  • Top 40 metros: 80.6%
  • Top 75 metros: 76.9%
  • Non-top 75: 67.7%
  • Overall: 74.2%

The highest scores were achieved in Tactical (80.5%) and the largest percentage increase in scores from the Baseline was in Support (99.0%). Low scoring questions and those with the least improvement over Baseline indicate specific program areas where additional guidance from FHWA may be warranted. Specifically, the 2014 TIM SA scores highlight a need for special attention in the following areas:

  • Collecting and analyzing data relating to performance measures, particularly secondary incidents;
  • Multi-agency agreements and MOUs; and
  • Traffic control procedures for the end of the incident queue.

Appendix A. Summary of 2014 TIM SA Results

Strategic Section
Question Number | Question | Mean Score: Baseline | Mean Score: 2014 | Percent Scoring 3 or Higher: Baseline | Percent Scoring 3 or Higher: 2014 | Percent Change in 2014 Mean Score from Baseline
(Mean scores range from 0 to 4. Each data row lists the five values in the order given above.)
4.1.1.1 Have a TIM multi-agency team or task force which meets regularly to discuss and plan for TIM activities? 1.90 3.03 28 72 59.5
4.1.1.2 Is multi-agency training held at least once a year on TIM-specific topics?
  • NIMS/ ICS 100
  • Training of mid-level managers from primary agencies on the National Unified Goal?
  • Traffic control?
  • Work zone safety?
  • Safe parking?
1.26 2.87 9 80 128.2
4.1.1.3 Conduct multi-agency post-incident debriefings? 1.62 2.63 18 59 62.1
4.1.1.4 Conduct planning for special events?
  • Construction and maintenance?
  • Sporting events, concerts, conventions, etc.?
  • Weather-related events?
  • Catastrophic events?
2.47 3.55 35 93 43.5
4.1.2.1 Is the TIM program supported by multi-agency agreements/memoranda of understanding?
  • Is the agreement/MOU signed by top officials from participating agencies?
  • Are incident scene roles and responsibilities for each participating agency clearly defined in the agreement and communicated to all participating agencies?
  • Are agency roles and responsibilities for planning for and funding for the TIM program clearly defined in the agreement/MOU?
  • Are safe, quick clearance goals stated as time goals for incident clearance (e.g. 90 minutes) in the agreement/MOU?
1.71 2.00 18 47 17.1
4.1.2.2 Is planning to support the TIM activities done across and among participating agencies? 1.35 2.60 12 60 82.3
4.1.2.3 Is there someone from at least one of the participating agencies responsible for coordinating the TIM program as their primary job function? 2.28 2.67 54 60 17.0
4.1.3.1 Have multi-agency agreement on the two performance measures being tracked?
  • Roadway clearance time?
  • Incident clearance time?
0.64 2.39 3 53 274.1
4.1.3.2 Has the TIM program established methods to collect and analyze the data necessary to measure performance in reduced roadway clearance time and reduced incident clearance time? 0.64 2.47 3 56 286.7
4.1.3.3 Have targets (e.g. time goals) for performance of the two measures? 1.16 2.15 4 45 85.5
4.1.3.4 Routinely review whether progress is made in achieving the targets? 0.74 2.11 3 47 185.3
4.1.3.5 Track performance in reducing secondary incidents? 1.03 1.13 8 16 9.8


Tactical Section
Question Number | Question | Mean Score: Baseline | Mean Score: 2014 | Percent Scoring 3 or Higher: Baseline | Percent Scoring 3 or Higher: 2014 | Percent Change in 2014 Mean Score from Baseline
(Mean scores range from 0 to 4. Each data row lists the five values in the order given above.)
4.2.1.1 Have "authority removal" laws allowing pre-designated responders to remove disabled or wrecked vehicles and spilled cargo?
  • Is there an "authority removal" law in place?
  • Is it understood and utilized by responders?
2.92 3.28 67 84 12.4
4.2.1.2 Have "driver removal" laws which require drivers involved in minor crashes (not involving injuries) to move vehicles out of the travel lanes?
  • Is there a "driver removal" law in place?
  • Is it communicated to motorists?
3.01 3.03 71 81 0.7
4.2.1.3 Use a safety service patrol for incident and emergency response? 2.73 2.82 67 75 3.2
4.2.1.4 Utilize the Incident Command System on-scene? 2.55 3.53 58 88 38.2
4.2.1.5 Have response equipment pre-staged for timely response? 2.21 2.92 41 71 32.1
4.2.1.6 Identify and type resources so that a list of towing and recovery operators (including operator capabilities and special equipment) is available for incident response and clearance? 2.86 3.38 67 82 18.3
4.2.1.7 Identify and type resources so that a list of HazMat contractors (including capabilities and equipment) is available for incident response? 2.89 3.44 69 86 19.2
4.2.1.8 Does at least one responding agency have the authority to override the decision to utilize the responsible party's HazMat contractor and call in other resources? 3.22 3.59 89 89 11.5
4.2.1.9 In incidents involving fatalities, is the Medical Examiner response clearly defined and understood? 2.53 3.23 55 79 27.8
4.2.1.10 Are there procedures in place for expedited accident reconstruction/ investigation? 2.59 3.05 72 73 17.7
4.2.1.11 Is there a policy in place for removal of abandoned vehicles? 3.47 3.57 91 88 2.7
4.2.2.1 Have "move over" laws which require drivers to slow down and if possible move over to the adjacent lane when approaching workers or responders and equipment in the roadway?
  • Is there a "move over" law in place?
  • Is it communicated to drivers?
3.2 3.66 85 96 45.7
4.2.2.2 Train all responders in traffic control following MUTCD guidelines? 1.97 3.21 28 79 63.1
4.2.2.3 Routinely utilize transportation resources to conduct traffic control procedures for various levels of incidents in compliance with the MUTCD? 1.93 3.41 27 85 76.9
4.2.2.4 Routinely utilize traffic control procedures for the end of the incident traffic queue? 1.56 2.86 17 65 83.2
4.2.2.5 Have mutually understood equipment staging and emergency lighting procedures on-site to maximize traffic flow past an incident while providing responder safety?
  • Vehicle and equipment staging procedures?
  • Light-shedding procedures?
  • PPE used by responders?
  • Pre-established, signed accident investigation sites?
1.38 2.54 14 70 83.7


Support Section
Question Number | Question | Mean Score: Baseline | Mean Score: 2014 | Percent Scoring 3 or Higher: Baseline | Percent Scoring 3 or Higher: 2014 | Percent Change in 2014 Mean Score from Baseline
(Mean scores range from 0 to 4. Each data row lists the five values in the order given above.)
4.3.1.1 Does the TIM program use a Traffic Management Center/Traffic Operations Center to coordinate incident detection, notification and response? 1.98 3.53 41 89 78.0
4.3.1.2 Is there data/video sharing between agencies? 1.43 3.30 10 81 131.0
4.3.1.3 Does the TIM program have specific policies and procedures for traffic management during incident response?
  • Signal timing changes?
  • Pre-planned detour and alternate routes identified and shared between agencies?
1.55 2.46 18 61 58.7
4.3.1.4 Does the TIM program provide for interoperable, interagency communications on-site between incident responders? 1.61 2.81 17 66 74.4
4.3.2.1 Have a real-time motorist information system providing incident-specific information?
  • Traveler information delivered via 511/ website?
  • Traveler information delivered via mobile applications?
  • Traveler information delivered through traffic media access to TMC/ TOC data/ information?
1.9 3.57 27 92 87.8
4.3.2.2 Are motorists provided with travel time estimates for route segments? 0.99 3.16 12 80 219.4

¹ TIM SA respondents are asked to rate their progress as Low, Medium or High, values which are then translated into a numeric score ranging from 0 to 4, with 4 being the highest score possible per question.