2014 Traffic Incident Management National Analysis Report
Office of Operations
Table of Contents
List of Tables
Table 1. Mean Score for Each Section (Baseline and 2014)
The Traffic Incident Management Self-Assessment (TIM SA) was first developed by the Federal Highway Administration (FHWA) in 2002 as a benchmarking tool for evaluating TIM program components and overall TIM program success. The initial assessments were conducted in 2003, and assessments have been conducted annually since then. The TIM SA serves several functions, among them providing state and local TIM program managers a tool to assess progress and identify areas for improvement at state and local levels. Similarly, analysis of the aggregated TIM SA results allows FHWA to identify program gaps and better target TIM program resources.
In 2014, a total of 99 locations completed a TIM SA for inclusion in the national analysis. The 34 scored questions contained within the TIM SA were grouped into three sections: Strategic, Tactical, and Support. To benchmark progress over time, the initial assessments completed in 2003, 2004, and one in 2005 (78 in total) have been used each year as the Baseline.
Table 1 shows the average score for each of the three TIM SA sections from the Baseline and 2014, along with the percentage change from the Baseline.
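The relationship between the section percentages in Table 1 and the underlying 0-4 question scale can be illustrated with a short Python sketch. The aggregation below is an assumption made for illustration (the report does not spell out the exact formula): a section's percentage score is taken as the mean question score expressed as a share of the 4-point maximum, and the input scores are hypothetical.

```python
# Assumed aggregation, for illustration only: each question is scored
# 0-4, and a section percentage is the mean question score divided by
# the 4-point maximum. The question scores below are hypothetical.
def section_percent(question_scores):
    mean = sum(question_scores) / len(question_scores)
    return mean / 4 * 100

# Five hypothetical question scores averaging 3.2 yield 80.0 percent.
print(section_percent([4, 3, 3, 2, 4]))
```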
The 2014 overall TIM SA score was 74.2 percent (out of a possible 100 percent), representing a 54.8 percent increase over the Baseline. The TIM SA mean scores tended to be higher in larger metropolitan areas than in smaller areas. Specifically, mean scores were calculated for the top 40 metropolitan areas (by population), the top 75 metropolitan areas and non-top 75 metropolitan areas:
There was little incremental change in the overall score in 2014 from the 2013 TIM SA, with the overall national score increasing 0.41 percent from 73.9 to 74.2 percent. This should not, however, be construed to imply that overall TIM program performance is slowing or that little progress is being made to advance TIM program excellence across the country. In fact, several factors contributed to the smaller incremental change this year, and these factors may continue to be felt going forward.
A listing of all 34 TIM SA questions, their respective Baseline and 2014 scores, and the percentage of programs scoring each question 3 or higher[1] can be found in Appendix A.
The questions in the Strategic section asked respondents to rate progress in how the TIM program is organized, resourced, supported and sustained. Key elements of this section include multi-agency coordination and TIM performance measures. While the Strategic section had the lowest score of the three sections (61.7%), the Strategic questions have realized a 76.2 percent increase compared to the Baseline, indicating improvement in this area.
Despite progress in the Strategic area, the five questions receiving the lowest mean scores in the TIM SA were all in this section, with four of the five coming from the subsection on TIM Performance Measurement. The questions on TIM Performance Measurement have consistently been among the lowest scoring on the TIM SA. The TIM Performance Measurement subsection focused on three key metrics: Roadway Clearance Time (RCT), Incident Clearance Time (ICT), and reduction of secondary incidents. Of the three performance measures, reduction in secondary incidents (Question 18.104.22.168) had the lowest score (1.13), which represents only a 9.8 percent improvement over the Baseline. This question was first introduced as part of the TIM SA Revision in 2009 and it has been the lowest scoring individual question on the TIM SA each year since then.
Almost half of respondents (47.5%) stated that there was "No Activity" in this area. The comments indicate that the absence of a clear definition of what constitutes a secondary incident hinders data collection and analysis in this area. For those areas that do report progress, several of the comments point to the inclusion of secondary incident information on the crash report form. Scores on this question may advance in the future with identification of a standard definition for secondary incidents. Additionally, FHWA should continue to promote the reduction of secondary incidents as part of the overall return-on-investment for Safety Service Patrols and TIM programs in general.
Another important part of the TIM SA is the TIM Performance Measures (PM) Database. This database is populated annually based on responses to the TIM SA. Information on the three key PM metrics (RCT, ICT, and secondary incidents) is tracked annually and compared to a Baseline (2011) level. Average RCT decreased to 62.93 minutes in 2014, down 14 percent from the 73.16 minutes reported in 2013. This is also the first year since 2011 that average RCT has fallen below the Baseline value of 65.39 minutes.
In terms of ICT, the overall average time increased by 13.2 percent from 2013 to 2014 (56.58 minutes in 2013 versus 64.07 minutes in 2014). However, looking just at the locations that submitted ICT data in 2013 and 2014, the increase in ICT was less dramatic, moving from 56.34 minutes in 2013 to 59.96 minutes in 2014 (6.4% increase). Therefore, the increase in the overall average (13.2%) may reflect the addition of new locations submitting ICT data in 2014.
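The year-over-year changes reported above for RCT and ICT are simple relative percentage changes; a minimal Python sketch (the function name is illustrative) reproduces them from the stated minute values:

```python
def pct_change(old, new):
    """Relative change from old to new, in percent."""
    return (new - old) / old * 100

# RCT: 73.16 minutes (2013) -> 62.93 minutes (2014), about a 14% decrease
rct_change = pct_change(73.16, 62.93)

# ICT, all reporting locations: 56.58 -> 64.07 minutes, about +13.2%
ict_all = pct_change(56.58, 64.07)

# ICT, only locations reporting in both years: 56.34 -> 59.96, about +6.4%
ict_matched = pct_change(56.34, 59.96)
```

The gap between the all-locations figure (+13.2%) and the matched-locations figure (+6.4%) is what suggests that new locations submitting ICT data in 2014 account for part of the overall increase.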
Just over 50 percent of locations indicated some activity on the tracking of reductions in secondary incidents (22.214.171.124) and of those locations, only 19 provided secondary incident data for the TIM PM database. In 2014, those locations reported that, on average, secondary incidents comprised just 2 percent of all incidents, a slight decline from 2.8 percent reported in 2013. However, caution should be taken in interpreting changes in the percentage of incidents reported as secondary given the lack of a uniform definition for secondary incidents.
In addition to questions on performance measures, the Strategic section included other areas with room for improvement. Question 126.96.36.199 on the multi-agency agreements/MOUs used to structure TIM programs received the second lowest individual score on the overall TIM SA (2.00) and fewer than 50 percent of the assessments scored this question 3 or higher. This question was divided into four composite questions to query specific elements of multi-agency coordination:
The lowest scoring of the four composite questions was Part C regarding defined agency roles for planning and funding (1.44). The lack of a formal structure for multiagency collaboration continues to hinder the advancement of multiagency TIM programs beyond ad hoc activities.
The low scores on this question suggest that many TIM programs lack a formal structure for multiagency collaboration. Furthermore, even in locations that do have formal agreements, many do not have a process in place to systematically review and renew these agreements. Respondents were asked how frequently the agreements/MOUs were updated and nearly a third responded "as needed." Absent a planned, systematic review and update process for these agreements, there is a risk that they will become obsolete and eventually disregarded by the participating agencies.
As it has been for the past five years, the highest score in the Strategic section was achieved in planning for special events (188.8.131.52) with a mean score of 3.55. Planning for special events was the fifth highest scoring question overall in the 2014 TIM SA. The score for this question was the composite average of individual scores in planning for the following types of events: Construction and Maintenance; Sporting Events, Concerts, Conventions; Weather-related Events; and Catastrophic Events. Among those four categories, Sporting Events, Concerts, Conventions (184.108.40.206.b) and Weather-related Events (220.127.116.11.c) achieved the highest mean scores of 3.60 and 3.56, respectively. Catastrophic events received the lowest score of the four event types (3.49). However, with a high composite score for planned special events (3.55) and a limited delta between the individual event type scores (0.11), there is little to indicate that additional support, training or outreach from FHWA on any one event type will result in overall advancement of the scores on this question.
The TIM programs that achieved the highest scores in the Strategic section are listed alphabetically in Table 3. Jurisdictions with low scores may wish to reach out to these locations for information on best practices.
The questions in the Tactical section focused on the policies and procedures used by TIM professionals when responding to incidents. This includes the policies and procedures in place to ensure motorist and responder safety. Collectively, these questions continue to score among the highest in the TIM SA and are reflective of many of the core competencies taught as part of the National TIM Responder Training sponsored by FHWA. In 2014 this section achieved an overall score of 80.5 percent, making it the highest scoring of the three sections. Three of the five questions achieving the highest mean score in the 2014 TIM SA were in the Tactical section.
Safe, quick clearance (SQC) laws are a key element of the Tactical section. Question 18.104.22.168 on Move Over laws received the highest mean score (3.66) in the Tactical section, indicating a high degree of success in promulgating Move Over laws. Question 22.214.171.124 on Authority Removal had a mean score in 2014 of 3.28. The third SQC law, Driver Removal (126.96.36.199), scored 3.03 in 2014. Scores for all three laws continue to trend upward; however, Driver Removal had the smallest increase over Baseline (0.7%) in the 2014 TIM SA, suggesting a need for education on the importance of enacting Driver Removal laws. Additionally, more work needs to be done on the implementation of each of the SQC laws. All three scores were composite scores that first asked if the law existed and then asked if the law was utilized, communicated, or enforced (depending on the law in question). All three laws had lower scores in the execution element of the composite score. While passage of the laws is important, there will be no safety benefits if the laws are not utilized. Specifically, Driver Removal laws had the lowest implementation score, which is likely one of the reasons this question had the lowest score of the three SQC law questions.
The lowest scoring question in the Tactical section dealt with equipment staging and lighting procedures that maximize traffic flow around the incident while also protecting responders (188.8.131.52). However, while scoring the lowest among the Tactical questions, it did achieve one of the highest percentage year-over-year increases (6.7%) from 2013 to 2014 among all 34 questions on the TIM SA.
This question queried respondents about four specific types of procedures, which received the following scores:
In reviewing the comments submitted for the first three (PPE, staging procedures and light-shedding), nearly one-fourth of the respondents reference training, and specifically the SHRP 2 National TIM Responder Training, as being responsible for their scores on these sub-questions. Therefore, it can be expected that scores for this question, and specifically these three sub-questions, will increase in the coming years as the National TIM Training Course reaches more responders.
In response to the sub-question on pre-established, signed accident investigation sites, a number of the comments reference more informal policies to remove vehicles from the incident scene to nearby parking lots or other areas where the investigation can be more safely conducted.
The use of traffic control procedures for the end of the incident traffic queue is an effective strategy for reducing the occurrence of secondary incidents. However, among the questions in the Tactical section, this question (184.108.40.206) had the smallest percentage of assessments (65%) scoring the question 3 or higher, indicating it as an area where additional outreach and education by FHWA could improve scores. Additionally, it experienced a slight decrease (3.3%) in score from 2013. The use of traffic control procedures downstream of the incident may be limited based on available resources (access to DMS/CMS and other means to provide advance notification) and as such, may be more prevalent in metropolitan areas where available equipment and technology more readily facilitates advance notification and traffic control. An examination of the difference in scores for this question between the top 40 metro areas and the non-top 75 areas corroborates this; the top 40 metro areas had an average score of 2.93 while the non-top 75 had an average score of 2.76.
The TIM programs that achieved the highest scores in the Tactical section are listed alphabetically in Table 4. Jurisdictions with low scores may wish to reach out to these locations for information on best practices.
The questions in the Support section focused on the tools and technologies enabling improved incident detection, response and clearance. Without the infrastructure and back office support for incident information exchange, the detection, verification, response and clearance times are delayed and responder and motorist safety is jeopardized. As a result, one of the three key objectives of the National Unified Goal for Traffic Incident Management is prompt, reliable, interoperable communications.
The Support section had the second highest overall score of 78.4 percent and, of the three sections, had the largest increase compared to the Baseline (99.0%). The rapid increase in scores indicates that technology and data analysis are becoming increasingly prevalent in TIM operations.
The provision of real-time motorist information, including incident-specific information (220.127.116.11), scored the highest of the questions in the Support section (3.57) and received the third highest individual score on the overall 2014 TIM SA. This score is a composite score querying the use of three different methods for providing motorist information:
Nearly a third of the locations (30%) specifically cite use of a 511 system (phone, website or both), with Trip Check and Quick Map also being cited by a number of locations.
Traveler information services have also seen a considerable increase in score compared to the Baseline as a result of technological advances. The provision of travel time estimates to motorists (18.104.22.168) achieved one of the highest percentage increases from the Baseline (219.4%).
While the Support section has a number of high-scoring questions, a few questions suggest room for improvement. Two of the questions in this section were among the bottom five in terms of year-over-year change. Both Question 22.214.171.124, data/video sharing between agencies, and Question 126.96.36.199, interoperable, interagency communications on-site between incident responders, experienced decreases in the mean score in 2014 from the 2013 score (3.1% and 3.6%, respectively). The locations that scored these questions the lowest (No Activity or Little Activity) tend to be more rural areas with emerging TIM programs.
The TIM programs that achieved the highest scores in the Support section are listed alphabetically in Table 5. Jurisdictions with low scores may wish to reach out to these locations for information on best practices.
A total of 99 TIM SA assessments were completed in 2014, with an average overall score of 74.2 percent (out of a possible 100 percent). Overall scores were up 54.8 percent compared to the Baseline scores. The TIM SA mean scores tended to be higher in larger metropolitan areas than in smaller areas. Specifically, mean scores were calculated for the top 40 metropolitan areas (by population), the top 75 metropolitan areas and non-top 75 metropolitan areas:
The highest scores were achieved in Tactical (80.5%) and the largest percentage increase in scores from the Baseline was in Support (99.0%). Low scoring questions and those with the least improvement over Baseline indicate specific program areas where additional guidance from FHWA may be warranted. Specifically, the 2014 TIM SA scores highlight a need for special attention in the following areas:
APPENDIX A. Summary of 2014 TIM SA Results
[1] TIM SA respondents are asked to rate their progress as Low, Medium, or High, values which are then translated into a numeric score ranging from 0 to 4, with 4 being the highest score possible per question.
United States Department of Transportation - Federal Highway Administration