
2012 Traffic Incident Management National Analysis Report
Executive Summary


Background

The Traffic Incident Management Self-Assessment (TIM SA) was developed by the Federal Highway Administration (FHWA) as a benchmarking tool for evaluating TIM program components and overall TIM program success. Development of the TIM SA was initiated in 2002, and the first assessments were conducted in 2003. The TIM SA serves several functions: it enables state and local TIM program managers to assess progress and identify areas for improvement at the state and local levels, and analysis of the aggregated results allows FHWA to identify program gaps and better target TIM program resources.

The 2012 TIM SA had a record number of assessments submitted; a total of 104 locations completed a TIM SA for inclusion in the national analysis. The 34 scored questions contained within the TIM SA were grouped into three sections: Strategic, Tactical and Support. In order to benchmark progress over time for each question and for the three sections, the initial assessments, completed in 2003 and 2004 along with one in 2005 (78 in total), have been used each year as the Baseline.

Table 1 shows the average score for each of the three TIM SA sections from the Baseline and 2012, along with the percentage change from the Baseline. The 2012 overall TIM SA score was 70.2 percent (out of a possible 100%), representing a 46.5 percent increase compared to the Baseline. The TIM SA mean scores tended to be higher in larger metropolitan areas than in smaller areas. Specifically, mean scores were calculated for the top 40 metropolitan areas (by population), the top 75 metropolitan areas and non-top 75 metropolitan areas:

  • Top 40 metros: 76.5%
  • Top 75 metros: 71.8%
  • Non-top 75: 66.7%
  • Overall: 70.2%

A listing of all 34 TIM SA questions, their respective Baseline and 2012 scores, and the percentage of programs scoring each question 3 or higher [1] can be found in Appendix A.

Table 1. Mean Score for Each Section (Baseline and 2012)

Section       | # of Questions | Baseline Mean Score | 2012 Mean Score | 2012 High Score (Possible) | % Change from Baseline | Section Weight
Strategic     | 12             | 35.0%               | 56.1%           | 29.4 (30)                  | 60.4%                  | 30%
Tactical      | 16             | 64.1%               | 77.3%           | 39.7 (40)                  | 20.6%                  | 40%
Support       | 6              | 39.4%               | 75.0%           | 30.0 (30)                  | 90.2%                  | 30%
Overall Total | 34             | 48.0%               | 70.2%           | 96.7 (100)                 | 46.5%                  | 100%
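
To make the arithmetic in Table 1 explicit, the sketch below (illustrative Python) recomputes the overall scores on the assumption, suggested by the Section Weight column, that the overall score is the weighted average of the three section scores; this summary does not spell out FHWA's exact computation, so the sketch is a plausible reconstruction rather than the official method.

    # Recompute the overall TIM SA score as a weighted average of section
    # scores, using the section weights from Table 1. Illustrative only.
    SECTION_WEIGHTS = {"Strategic": 0.30, "Tactical": 0.40, "Support": 0.30}

    baseline = {"Strategic": 35.0, "Tactical": 64.1, "Support": 39.4}   # percent
    year_2012 = {"Strategic": 56.1, "Tactical": 77.3, "Support": 75.0}  # percent

    def overall(section_scores):
        """Weighted average of section scores, in percent."""
        return sum(section_scores[s] * w for s, w in SECTION_WEIGHTS.items())

    base_total = overall(baseline)    # ~48.0
    total_2012 = overall(year_2012)   # ~70.2
    change = 100 * (total_2012 - base_total) / base_total  # ~46.5

    print(f"Baseline overall: {base_total:.1f}%")
    print(f"2012 overall: {total_2012:.1f}%")
    print(f"Change vs Baseline: {change:.1f}%")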

Strategic

The questions in the Strategic section asked respondents to rate progress in how the TIM program is organized, resourced, supported and sustained. Key elements of this section include multi-agency coordination and TIM performance measures. While the Strategic section had the lowest score of the three sections (56.1 percent), its questions have realized a 60.4 percent increase compared to the Baseline, indicating improvement in this area.

Despite progress in the Strategic area, the four questions receiving the lowest mean scores in the TIM SA were all in this section, with three of the four coming from the subsection on TIM Performance Measurement. Questions on TIM Performance Measurement have consistently been among the lowest scoring in the TIM SA. The subsection focused on three key metrics: Roadway Clearance Time, Incident Clearance Time, and reduction of secondary incidents. Of the three performance measures, reduction in secondary incidents (Question 4.1.3.5) had the lowest score (0.97). This score is 5.7 percent below the Baseline, making it one of only two questions to perform below the Baseline level and the lowest scoring individual question in the 2012 TIM SA. Exactly half of respondents stated that there was "no activity" in this area.

In 2011, many respondents commented that their TIM program had very recently started to track secondary incidents and that data would be available for the 2012 TIM SA; that recent activity appears partially responsible for the modest improvement over 2011. However, much improvement is still needed in secondary incident tracking, and there continues to be no standardized definition of "secondary incident." Education and outreach on the importance of tracking and reducing secondary incidents should continue to be a priority for FHWA.
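
Because two of the three agreed-upon measures are elapsed times, they are straightforward to compute once incident timestamps are captured. The sketch below (illustrative Python, with hypothetical field names) assumes the commonly cited FHWA definitions: roadway clearance time runs from first recordable awareness of the incident until all lanes are available for traffic flow, and incident clearance time runs until the last responder has left the scene.

    from datetime import datetime

    def clearance_times(awareness, all_lanes_open, last_responder_departs):
        """Return (roadway, incident) clearance times in minutes."""
        roadway = (all_lanes_open - awareness).total_seconds() / 60
        incident = (last_responder_departs - awareness).total_seconds() / 60
        return roadway, incident

    # Hypothetical incident: logged 07:12, lanes reopened 07:49, scene clear 08:05.
    rct, ict = clearance_times(datetime(2012, 5, 1, 7, 12),
                               datetime(2012, 5, 1, 7, 49),
                               datetime(2012, 5, 1, 8, 5))
    print(f"Roadway clearance: {rct:.0f} min; incident clearance: {ict:.0f} min")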

Another area of concern identified by low scores was multi-agency coordination and training. One such low score was on question 4.1.2.1, dealing with the multi-agency agreements/MOUs used to structure TIM programs. This question was divided into four component questions to query specific elements of multi-agency coordination:

  1. Is the agreement/MOU signed by top officials from participating agencies?
  2. Are incident scene roles and responsibilities for each participating agency clearly defined in the agreement and communicated to all participating agencies?
  3. Are agency roles and responsibilities for planning for and funding for the TIM program clearly defined in the agreement/MOU?
  4. Are safe, quick clearance goals stated as time goals for incident clearance (e.g. 90 minutes) in the agreement/MOU?

This question had the third lowest score (1.89) in the 2012 TIM SA, only a 10.8 percent increase compared to the Baseline. The third component question, regarding defined agency roles for planning and funding, scored the lowest of the four parts (1.25). While all four component questions have room for improvement, clearly defining multi-agency planning and funding roles requires the most attention.

This question also included two supplemental, non-scored questions asking how frequently the agreements/MOUs were updated and which agencies were primary signatories. "As needed" was the most frequently cited response to the update question, followed by "Has not been updated." TIM programs that lack a planned, systematic review and update process could have difficulty maintaining continuity, particularly if there is turnover in coordination contacts at participating agencies.

The highest score in the Strategic section was achieved in planning for special events (4.1.1.4), with a mean score of 3.31, the ninth highest scoring question overall in the 2012 TIM SA. The score for this question was the composite average of individual scores for planning for the following types of events: construction and maintenance; sporting events, concerts and conventions; weather-related events; and catastrophic events. Among those categories, weather-related events (4.1.1.4.c) and construction and maintenance (4.1.1.4.a) achieved the highest mean scores of 3.48 and 3.40, respectively. Catastrophic events garnered the lowest score of the four event types (3.11). While this is still a good score, catastrophic events arguably require the most preparation of the four event types due to their unplanned nature. Areas that have not incorporated planning for catastrophic events in their TIM programs should consider doing so.

Tactical

The questions in the Tactical section focused on the policies and procedures used by field personnel when responding to incidents, including those specifically targeting motorist and responder safety. Collectively, these questions consistently score among the highest in the TIM SA; in 2012 the section achieved an overall score of 77.3 percent, the highest of the three sections. Three of the five questions with the highest mean scores were in the Tactical section.

One of the key elements of the Tactical section is the presence and execution of the three core safe, quick clearance (SQC) laws. Question 4.2.2.1 on Move Over laws received the highest mean score (3.60) in the Tactical section, indicating a high degree of success in promulgating Move Over laws. Question 4.2.1.1 on Authority Removal had a 2012 mean score of 3.17, and the third SQC law, Driver Removal (4.2.1.2), scored 3.03. All three questions were scored as composites that first asked whether the law existed and then whether it was utilized, communicated, or enforced (depending on the law in question). All three laws scored lower on the execution element of the composite, which continues to hinder full utilization of these laws for safe, quick clearance.

Respondents generally reported that the laws were communicated to drivers through both static signs and dynamic message boards, as well as through public outreach campaigns. For the Move Over law question, respondents were asked if the law was enforced, to which 87.6 percent indicated it was enforced. Utilization of authority removal laws was confounded in some areas by an absence of training on when and how to apply the law. Furthermore, some areas reported that the law was not utilized due to a lack of "hold harmless" laws to limit the liability of responding agencies. These comments point to the need for more outreach by FHWA on the benefits of authority removal for SQC. The comments revealed that there was generally less outreach done on informing motorists of Driver Removal laws, which is likely one of the reasons this question had the lowest score of the three SQC law questions.

The lowest scoring question in the Responder and Motorist Safety subsection dealt with mutually understood equipment staging and lighting procedures to maximize traffic flow around the incident while protecting responders (4.2.2.5). Though the score has increased 54.7 percent compared to the Baseline, the relatively low mean of 2.13 points to continued challenges in achieving consensus on how responder equipment should be staged and how responder lights should be deployed and eventually shed as the incident moves toward clearance. This composite question comprises several sub-questions that reveal specific strengths and weaknesses in this TIM subject area. The four types of procedures queried received the following scores (a worked composite calculation follows the list):

  • PPE (Personal Protective Equipment) used by responders: 3.15
  • Vehicle and equipment staging procedures: 2.61
  • Light-shedding procedures: 2.00
  • Pre-established, signed accident investigation sites: 0.78
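
The reported 2.13 is consistent with a simple mean of the four sub-scores, as the short sketch below (illustrative Python) shows; that the composite is an unweighted mean is inferred from the numbers, not stated in the TIM SA.

    # Question 4.2.2.5 sub-scores; the composite appears to be their mean.
    sub_scores = {
        "PPE used by responders": 3.15,
        "Vehicle and equipment staging procedures": 2.61,
        "Light-shedding procedures": 2.00,
        "Pre-established, signed accident investigation sites": 0.78,
    }
    composite = sum(sub_scores.values()) / len(sub_scores)
    print(f"Composite mean: {composite:.2f}")  # prints 2.13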

The use of PPE by responders received the highest score of the four procedures analyzed. A supplemental, non-scored question asked which responder groups were regularly using PPE. The respondents indicated that, generally, all responders used PPE. Some areas indicated that PPE was less common among certain responders. However, there did not appear to be a noticeable trend of one agency using PPE less frequently than others. In terms of vehicle and equipment staging procedures, there was continued evidence that many TIM programs lack training and formal procedures on how responder vehicles should be staged. Safe equipment staging is one of the core competencies taught in the Strategic Highway Research Program 2 (SHRP 2) National TIM Responder Training Course and the widespread dissemination of that training by FHWA should improve the scores for this sub-question.

Light-shedding was the second lowest scoring procedure, indicating the need for more clarity on proper light-shedding procedures. In the comments, several respondents indicated that requests for turning off lights occurred regularly at incident scenes. Again, if responders were already trained and informed of proper procedures, these requests would not be necessary, helping to reduce clearance times. Signed accident investigation sites scored the lowest of the four procedures. Research on the use of accident investigation sites which quantifies their value for improving responder safety and reducing secondary incidents may be necessary.

In addition to questions that achieved low mean scores, some questions performed poorly compared to Baseline. One question in this section that did not perform well compared to the Baseline was question 4.2.1.3 on the use of Safety Service Patrols (SSPs) for incident response. This question had the second lowest score in the section (2.68) and was one of only two questions in the TIM SA to perform worse than the Baseline (-1.7%). This year marks the second year of decline for the SSP question score. Further analysis indicates that the overall 2012 score for this question represents a 2.9 percent decline from 2011. Mean scores were lower across metro areas of all sizes; the mean score for the top 40 metro areas declined 2.2 percent between 2011 and 2012 while the mean score for top 75 metro areas declined by 1.7 percent. A primary driver of this decline from 2011 may be the fact that 29 percent of the TIM programs that classified their Safety Service Patrol as Full Function in the 2011 TIM SA now classify their program as a Mid-Level Service Patrol, indicating some possible constriction in operations (time of day, days of week, lane miles covered, and/or services offered).

Encouraging the use of Full Function Service Patrols is a key objective of the FHWA-sponsored TIM Decision Maker Education and Outreach initiative. Given the possible reduction in operations resulting in more Mid-Level than Full Function SSPs, there is a renewed need for tools that SSP managers can use to demonstrate the necessity and benefits of Full Function SSPs. Products coming out of the TIM Decision Maker Education and Outreach initiative, including the SSP cost-benefit calculator and the public outreach campaign materials, are examples of such tools.

Support

The questions in the Support section focused on the tools and technologies enabling improved incident detection, response and clearance. Without the infrastructure and back-office support for incident information exchange, detection, verification, response and clearance are delayed and responder and motorist safety is jeopardized. As a result, one of the three key objectives of the National Unified Goal for Traffic Incident Management is prompt, reliable, interoperable communications.

The Support section had the second highest overall score (75.0 percent) and the largest increase of the three sections compared to the Baseline (90.2 percent). Significant progress in this section indicates that technology and data analysis are becoming increasingly prevalent in TIM operations.

The use of a Traffic Management Center/Traffic Operations Center (TMC/TOC) to coordinate incident detection, notification and response (4.3.1.1) again scored the highest of the questions in the Data subsection with a mean score of 3.42, representing a 72.9 percent increase compared to Baseline. This was a slight decrease compared to 2011 (-3.2%) and the score for this question should be monitored to ensure this is not a sustained downward trend. The overall decline in the score for this question is reflected in the drop in mean score for the top 40 metro areas (-3.9%), top 75 metro areas (-2.8%) and those areas not in a top 75 metro area (-3.0%). It is important to note that none of the comments indicated closure or cutbacks in a TMC/TOC. Most respondents indicated that a TMC/TOC existed and some even mentioned that expansions and upgrades were in the planning stages. Based on those responses, it is expected that this question will experience an increase in score in the 2013 TIM SA.

Another area of success in this section was data/video sharing between agencies. This question (4.3.1.2) scored well (3.26), increasing 127.9 percent compared to the Baseline. Not surprisingly, advances in technology have benefited this question's score by making data and video sharing easier. In past TIM SAs, the comments suggested that video sharing was not as prevalent as data sharing; that no longer appears to be the case. Several respondents also reported that additional data/video sharing agreements are in development, indicating that this question's score should continue to increase.

Traveler information services have also dramatically increased in score compared to the Baseline as a result of technological advances. The provision of travel time estimates to motorists (4.3.2.2) achieved one of the highest percentage increases from the Baseline (183.6%). The significant increase in score is evidence of the rapidly evolving technologies that are available for disseminating traveler information. However, this question did see a slight drop in average score compared to 2011 (-1.8%) and bears watching to ensure that this is not a trend. After isolating scores by metro area size, it appears that the drop in overall score was due to a noticeable decrease in areas outside the top 75 metros (-15.7%); for top 75 metro areas, the mean score increased 4.7 percent from 2011 to 2012. Nothing in the comments indicated cutbacks in traveler information; in fact, several areas mentioned they were working to develop travel time estimates. The decrease was likely due to several new 2012 TIM SA submissions from areas outside the top 75 metros that scored this question low.
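
The metro-size breakdown above follows a simple pattern used throughout this analysis: group submissions by metro tier, average a question's scores within each tier and year, and compare across years. The sketch below (illustrative Python) shows the shape of that calculation; the records are invented for illustration and are not actual TIM SA data.

    # Group hypothetical submissions by metro tier and compare yearly means.
    from collections import defaultdict

    submissions = [
        # (metro_tier, year, score on the travel time estimates question)
        ("top75", 2011, 2.9), ("top75", 2012, 3.0),
        ("non_top75", 2011, 2.4), ("non_top75", 2012, 2.0),
    ]

    scores = defaultdict(list)
    for tier, year, score in submissions:
        scores[(tier, year)].append(score)

    for tier in ("top75", "non_top75"):
        m11 = sum(scores[(tier, 2011)]) / len(scores[(tier, 2011)])
        m12 = sum(scores[(tier, 2012)]) / len(scores[(tier, 2012)])
        print(f"{tier}: 2011 mean {m11:.2f}, 2012 mean {m12:.2f}, "
              f"change {100 * (m12 - m11) / m11:+.1f}%")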

Question 4.3.1.3, which dealt with procedures for traffic management during incident response, had the lowest score in the Support section (2.24) and the smallest change compared to the Baseline (44.2%). The question contained two component scores, on signal timing changes and on pre-planned detour routes. Signal timing changes had the lower mean score, 1.88, compared to 2.60 for pre-planned detour routes. An analysis of the comments suggests that the ability to change signal timing remotely is spreading; however, even in locations where the capability exists, the relationship between the TMC/TOC and the local municipality (pertaining to signal timing) was often not formally defined. As for the pre-planned detour route component, respondents provided a wide variety of responses. In many instances, detour routes were in place for emergencies but not for traffic incidents, and there appeared to be a lack of cross-agency and cross-jurisdictional coordination on detour route planning.

Another area in need of improvement was interoperable, interagency communications between responders (4.3.1.4). The mean score for this question was 2.80, a 73.8 percent increase compared to the Baseline. The comments indicate that progress has been made in interoperable, interagency communications, and several respondents noted that communication improvement projects were underway. However, many TIM partner agencies are still not able to communicate with each other; over one-third (35.6%) of respondents scored below three on this question. The inability of responders to communicate on-scene is a significant obstacle to SQC.

Opportunities for TIM Stakeholders

One of the key purposes of the TIM SA is to identify TIM program areas where resources can be deployed to address gaps, both at the local level and nationally. First and foremost, a review of the questions that achieved the lowest mean scores highlights the program areas most in need of attention. However, an analysis of program areas whose mean scores did not advance from year to year, regardless of the numeric value of the score, presents additional opportunities for TIM stakeholders to address program gaps.

TIM Performance Measures

As has been the case each year, questions on TIM Performance Measures are some of the lowest scoring questions in the TIM SA. This year, three of the four questions achieving the lowest mean scores were in the TIM Performance Measurement subsection. The lowest score overall was achieved in secondary incident tracking (4.1.3.5). The secondary incident question also was one of only two questions to score lower than Baseline (-5.7%), and the only question with a mean score below 1.0. While all three performance measures had weak scores, additional attention should be given to secondary incident outreach. In particular, TIM Stakeholders should consider focusing outreach efforts on the importance of reducing secondary incidents and offer ways to define and capture secondary incident data.

Given the increased focus on performance measurement in the new transportation bill (MAP-21), it is imperative that TIM programs commit the personnel and resources to collecting and evaluating performance measures data. FHWA has led the effort to define the measures through the TIM Performance Measures Focus States Initiative and provides ongoing support through the TIM Performance Measures Knowledgebase. As a next step, FHWA will be providing benchmarking data through the TIM Performance Measures Database. The FHWA-developed database is being populated with TIM PM data from the 2011 TIM SA (as a baseline) and with the addition of the 2012 TIM SA data, new reporting will be available for TIM program managers to benchmark TIM program performance measurement. Going forward, this database will be updated on an annual basis concurrent with the annual TIM SA cycle.

Multi-agency Coordination

Multi-agency coordination is another perennial weakness identified through the TIM SA analysis; many of the lowest scoring questions contained a multi-agency element. A lack of defined incident scene roles, training procedures and multi-agency communication can inhibit SQC through confusion, inefficiencies and possibly even hazardous conditions for on-scene responders. Generally, the following key areas are in most need of attention:

  • Formalized TIM partnerships
  • " Multi-disciplinary training
  • " Multi-agency communication

In many locations, even if multi-agency coordination occurred, there was little to no formalized process behind the collaboration. It is important for TIM programs to be formally structured, have the underlying structure undergo scheduled updates, have dedicated "champions" and formally meet on a regular basis.

Multi-disciplinary training is critical for disseminating TIM best practices and for eliminating agency barriers that inhibit SQC by on-scene responders. For example, question 4.2.2.5 on equipment staging was one of the lowest scoring questions in the 2012 TIM SA. Conducting multi-disciplinary training with law enforcement, fire/rescue, DOT, towing and other partners allows responders to better understand the perspectives of responders outside their agency and promotes collaboration. It is also important to encourage the inclusion of non-traditional agencies, such as medical examiners, in TIM training.

Multi-agency communication is also extremely important for meeting SQC goals during incident response. While progress has been made in the area of interoperable, interagency communication, over one-third (35.6%) of 2012 TIM SA respondents did not report adequate progress in this area according to question 4.3.1.4.

All of these weaknesses in multi-agency coordination provide TIM stakeholders with opportunities for strengthening TIM programs through enhanced TIM partnerships. The SHRP 2 National TIM Responder Training Course and the National TIM Guidance under development by FHWA should both provide opportunities for improvement in multi-agency coordination.

Safety Service Patrols

The question on safety service patrols (4.2.1.3) was the only question in the 2012 TIM SA to decline two years in a row and was one of only two questions to score lower than Baseline. While the mean score is not particularly low (2.68), the downward trend is problematic, particularly for a TIM program element that is critical for achieving SQC goals. Furthermore, the decline in scores was greater in larger metropolitan areas compared to smaller metros. While SSP programs benefit regions of all sizes, their benefits can be particularly strong in larger metropolitan areas that deal with a greater number of traffic incidents. The work done as part of the FHWA-sponsored TIM Decision Maker Education and Outreach should be leveraged by TIM stakeholders as a way to promulgate the benefits of SSP services and provide local TIM program managers with information to help justify SSP program expense. In particular, the work by FHWA to develop and deploy a return on investment calculator for SSP programs will be an important tool for program managers to use to rationalize the continued expense of the SSP.

Leveraging Other Programs

There are several concurrent efforts underway that can and should be leveraged to improve TIM performance, and therefore, increase TIM SA scores.

National Traffic Incident Management Responder Training

The SHRP 2 National Traffic Incident Management Responder Training curriculum directly addresses many of the multi-agency collaboration weaknesses identified in the 2012 TIM SA. The curriculum has been extensively peer reviewed and pilot-tested in several states and is based on the knowledge gaps identified in past TIM SA reports. Now that pilot testing has been completed, the curriculum will be rolled out nationally throughout the remainder of 2012 and into 2013. FHWA has included this training as part of its Every Day Counts (EDC) initiative. The goal of EDC is to encourage innovations in the transportation system that increase efficiency and safety. The inclusion of the training as part of EDC further underscores how critical the training is for advancing improved TIM and increasing responder safety.

National Traffic Incident Management Coalition (NTIMC) and the TIM Network

The NTIMC comprises TIM stakeholder organizations and is designed to function as a collaborative network of government, industry and practitioners. In 2011, the NTIMC created the TIM Network [2], which connects TIM professionals from different disciplines, provides a forum to discuss developing issues of national interest, and offers a way for the NTIMC to validate suggested practices from state, regional and local TIM practitioners with national level expertise. The TIM Network holds monthly webinars on current TIM issues and best practices, maintains a Facebook page with over 1,000 followers, and produces the monthly Responder newsletter. All of these outreach efforts continue to serve as an important outlet for dissemination of TIM best practices.

Transportation and Public Safety Summit

In June 2012, a Summit of over 50 senior executives in the transportation and public safety fields was held in Washington, DC to establish a forum for top leadership to discuss critical issues related to improving roadway safety and operations. A key goal of the Summit was to identify innovations and partnerships in the areas of TIM legislation, policy, multi-disciplinary training and outreach strategies. The Summit attendees, who represented the critical TIM partner disciplines of law enforcement, fire/rescue, transportation and emergency medical services, emerged with a common set of policies and strategies to more effectively implement TIM strategies. TIM stakeholders should continue to capitalize on the momentum of this and future summits to promote the TIM SA as a valuable tool for local TIM programs to leverage strengths and address weaknesses.

Developing a Framework for Emergency Responder/Roadside Worker Struck-by/Near-miss Database

This is the first priority study to be advanced from the National Cooperative Highway Research Program (NCHRP) 20-7 (282) Research Needs Assessment for Roadside Worker and Vehicle Visibility initiative completed in early 2011. Research is currently underway, and a preliminary framework for the database is expected to be delivered in late 2012. The database is widely recognized as the first critical step in understanding the root causes of incident responder struck-by/near-miss incidents and developing training and best practices to mitigate those incidents. Given that fewer than one in five respondents to the 2012 TIM SA maintain a struck-by database, the results of this study should be leveraged to increase the number of areas collecting information on responder struck-by injuries and fatalities over the next few years. This data could then be used to populate a national struck-by database.

Technical Guidance for Traffic Incident Management Performance Measurement Implementation

This study is planned as part of the NCHRP, administered by the Transportation Research Board (TRB). Its objective is to "develop technical guidelines and related resources to assist DOTs in standardizing TIM PM terminology, data standards, data collections and data analysis." [3] Once completed, it should drive additional advances in the scores for the TIM Performance Measures.

Summary

A total of 104 TIM SA were completed in 2012, with an average overall score of 70.2 percent (out of a possible 100%). Overall scores were up 46.5 percent compared to the Baseline scores. The TIM SA mean scores tended to be higher in larger metropolitan areas than in smaller areas. Specifically, mean scores were calculated for the top 40 metropolitan areas (by population), the top 75 metropolitan areas and non-top 75 metropolitan areas:

  • Top 40 metros: 76.5%
  • Top 75 metros: 71.8%
  • Non-top 75: 66.7%
  • Overall: 70.2%

The highest scores were achieved in Tactical (77.3%) and the largest percentage increase in scores from the Baseline was in Support (90.2%). Low scoring questions and those with the least improvement over Baseline indicate specific program areas where additional guidance from FHWA may be warranted. Specifically, the 2012 TIM SA scores highlight a need for additional guidance in the following areas:

  • Collecting and analyzing data relating to performance measures, particularly secondary incidents;
  • Multi-agency coordination; and
  • Investment in Safety Service Patrols.

APPENDIX A. Summary of 2012 TIM SA Results

STRATEGIC SECTION
Each entry shows: Baseline mean / 2012 mean (score range 0 to 4) | % of assessments scoring 3 or higher, Baseline / 2012 | % change in 2012 mean score from Baseline.

4.1.1.1 Have a TIM multi-agency team or task force which meets regularly to discuss and plan for TIM activities?
1.90 / 2.66 | 28% / 59% | +40.2%

4.1.1.2 Is multi-agency training held at least once a year on TIM-specific topics?
  • NIMS/ICS 100?
  • Training of mid-level managers from primary agencies on the National Unified Goal?
  • Traffic control?
  • Work zone safety?
  • Safe parking?
1.26 / 2.48 | 9% / 62% | +96.4%

4.1.1.3 Conduct multi-agency post-incident debriefings?
1.62 / 2.58 | 18% / 60% | +59.1%

4.1.1.4 Conduct planning for special events?
  • Construction and maintenance?
  • Sporting events, concerts, conventions, etc.?
  • Weather-related events?
  • Catastrophic events?
2.47 / 3.31 | 35% / 89% | +34.2%

4.1.2.1 Is the TIM program supported by multi-agency agreements/memoranda of understanding?
  • Is the agreement/MOU signed by top officials from participating agencies?
  • Are incident scene roles and responsibilities for each participating agency clearly defined in the agreement and communicated to all participating agencies?
  • Are agency roles and responsibilities for planning for and funding for the TIM program clearly defined in the agreement/MOU?
  • Are safe, quick clearance goals stated as time goals for incident clearance (e.g. 90 minutes) in the agreement/MOU?
1.71 / 1.89 | 18% / 44% | +10.8%

4.1.2.2 Is planning to support the TIM activities done across and among participating agencies?
1.35 / 2.34 | 12% / 45% | +73.1%

4.1.2.3 Is there someone from at least one of the participating agencies responsible for coordinating the TIM program as their primary job function?
2.28 / 2.42 | 49% / 49% | +6.3%

4.1.3.1 Have multi-agency agreement on the two performance measures being tracked?
  • Roadway clearance time?
  • Incident clearance time?
0.64 / 2.19 | 3% / 47% | +241.8%

4.1.3.2 Has the TIM program established methods to collect and analyze the data necessary to measure performance in reduced roadway clearance time and reduced incident clearance time?
0.64 / 2.22 | 3% / 53% | +247.1%

4.1.3.3 Have targets (e.g. time goals) for performance of the two measures?
1.16 / 2.00 | 4% / 42% | +72.4%

4.1.3.4 Routinely review whether progress is made in achieving the targets?
0.74 / 1.88 | 3% / 42% | +153.4%

4.1.3.5 Track performance in reducing secondary incidents?
1.03 / 0.97 | 8% / 13% | -5.7%


TACTICAL SECTION
Each entry shows: Baseline mean / 2012 mean (score range 0 to 4) | % of assessments scoring 3 or higher, Baseline / 2012 | % change in 2012 mean score from Baseline.

4.2.1.1 Have "authority removal" laws allowing pre-designated responders to remove disabled or wrecked vehicles and spilled cargo?
  • Is there an "authority removal" law in place?
  • Is it understood and utilized by responders?
2.92 / 3.17 | 67% / 83% | +8.7%

4.2.1.2 Have "driver removal" laws which require drivers involved in minor crashes (not involving injuries) to move vehicles out of the travel lanes?
  • Is there a "driver removal" law in place?
  • Is it communicated to motorists?
3.01 / 3.03 | 71% / 80% | +0.6%

4.2.1.3 Use a safety service patrol for incident and emergency response?
2.73 / 2.68 | 67% / 71% | -1.7%

4.2.1.4 Utilize the Incident Command System on-scene?
2.55 / 3.40 | 58% / 83% | +33.5%

4.2.1.5 Have response equipment pre-staged for timely response?
2.21 / 3.03 | 41% / 76% | +37.1%

4.2.1.6 Identify and type resources so that a list of towing and recovery operators (including operator capabilities and special equipment) is available for incident response and clearance?
2.86 / 3.39 | 67% / 83% | +18.7%

4.2.1.7 Identify and type resources so that a list of HazMat contractors (including capabilities and equipment) is available for incident response?
2.89 / 3.30 | 69% / 81% | +14.1%

4.2.1.8 Does at least one responding agency have the authority to override the decision to utilize the responsible party's HazMat contractor and call in other resources?
3.22 / 3.36 | 84% / 84% | +4.4%

4.2.1.9 In incidents involving fatalities, is the Medical Examiner response clearly defined and understood?
2.53 / 2.99 | 55% / 70% | +18.2%

4.2.1.10 Are there procedures in place for expedited accident reconstruction/investigation?
2.59 / 2.77 | 64% / 54% | +6.9%

4.2.1.11 Is there a policy in place for removal of abandoned vehicles?
3.47 / 3.48 | 87% / 87% | +0.2%

4.2.2.1 Have "move over" laws which require drivers to slow down and if possible move over to the adjacent lane when approaching workers or responders and equipment in the roadway?
  • Is there a "move over" law in place?
  • Is it communicated to drivers?
3.20 / 3.60 | 85% / 96% | +12.5%

4.2.2.2 Train all responders in traffic control following MUTCD guidelines?
1.97 / 2.93 | 28% / 69% | +48.9%

4.2.2.3 Routinely utilize transportation resources to conduct traffic control procedures for various levels of incidents in compliance with the MUTCD?
1.93 / 3.38 | 27% / 80% | +74.9%

4.2.2.4 Routinely utilize traffic control procedures for the end of the incident traffic queue?
1.56 / 2.81 | 17% / 63% | +80.0%

4.2.2.5 Have mutually understood equipment staging and emergency lighting procedures on-site to maximize traffic flow past an incident while providing responder safety?
  • Vehicle and equipment staging procedures?
  • Light-shedding procedures?
  • PPE used by responders?
  • Pre-established, signed accident investigation sites?
1.38 / 2.13 | 14% / 56% | +54.7%


SUPPORT SECTION
Each entry shows: Baseline mean / 2012 mean (score range 0 to 4) | % of assessments scoring 3 or higher, Baseline / 2012 | % change in 2012 mean score from Baseline.

4.3.1.1 Does the TIM program use a Traffic Management Center/Traffic Operations Center to coordinate incident detection, notification and response?
1.98 / 3.42 | 41% / 86% | +72.9%

4.3.1.2 Is there data/video sharing between agencies?
1.43 / 3.26 | 10% / 78% | +127.9%

4.3.1.3 Does the TIM program have specific policies and procedures for traffic management during incident response?
  • Signal timing changes?
  • Pre-planned detour and alternate routes identified and shared between agencies?
1.55 / 2.24 | 18% / 52% | +44.2%

4.3.1.4 Does the TIM program provide for interoperable, interagency communications on-site between incident responders?
1.61 / 2.80 | 17% / 64% | +73.8%

4.3.2.1 Have a real-time motorist information system providing incident-specific information?
  • Traveler information delivered via 511/website?
  • Traveler information delivered via mobile applications?
  • Traveler information delivered through traffic media access to TMC/TOC data/information?
1.90 / 3.47 | 27% / 90% | +82.7%

4.3.2.2 Are motorists provided with travel time estimates for route segments?
0.99 / 2.81 | 12% / 67% | +183.6%


[1] TIM SA respondents are asked to rate their progress as Low, Medium or High; these ratings are then translated into a numeric score ranging from 0 to 4, with 4 being the highest possible score per question.

[2] Traffic Incident Management (TIM) Network website, http://timnetwork.org

[3] National Cooperative Highway Research Program (NCHRP) 07-20, Technical Guidance for Traffic Incident Management Performance Measurement Implementation. Project statement available online at http://apps.trb.org/cmsfeed/TRBNetProjectDisplay.asp?ProjectID=3160. Accessed 10/21/11.
