
2011 Traffic Incident Management National Analysis Report
Executive Summary

U.S. Department of Transportation
Federal Highway Administration
Office of Operations
1200 New Jersey Avenue, SE
Washington, DC 20590

December 2011



Table of Contents

Background

Strategic

Tactical

Support

Opportunities for TIM Stakeholders

Multi-agency Coordination
TIM Performance Measures
Safe, Quick Clearance Laws and Policies

Leveraging Other Programs

National Traffic Incident Management Coalition (NTIMC) and the TIM Network
Developing a Framework for Emergency Responder/Roadside Worker Struck-by/Near-miss Database
Technical Guidance for Traffic Incident Management Performance Measurement Implementation
Traffic Incident Management Responder Training

Summary

Appendix A. Summary of 2011 TIM SA Results

List of Tables

Table ES1. Mean Score for Each Section (Baseline and 2011)
Table ES2. 2011 TIM SA Results for the Strategic Section
Table ES3. 2011 TIM SA Results for the Tactical Section
Table ES4. 2011 TIM SA Results for the Support Section


Background

The Traffic Incident Management Self-Assessment (TIM SA) was developed by the Federal Highway Administration (FHWA) as a benchmarking tool for evaluating TIM program components and overall TIM program success. Development of the TIM SA was initiated in 2002, and the first assessments were conducted in 2003. The TIM SA serves several functions: through it, State and local TIM program managers can assess progress and identify areas for improvement, while analysis of the aggregated results allows FHWA to identify program gaps and better target TIM program resources.

The 2011 TIM SA had a record number of assessments submitted: a total of 93 locations completed a TIM SA for inclusion in the national analysis. A revision of the TIM SA was completed in 2011, and this year's results reflect those modifications. Among the changes, the number of questions increased from 31 to 34, some existing questions were modified to reflect current TIM practice, and clarifying instructions were added to key questions to reduce subjectivity.

The 34 questions were grouped into three sections: Strategic, Tactical, and Support. In order to benchmark progress for each question and the three sections over time, the initial assessments, completed in 2003 and 2004 plus one completed in 2005 (78 in total), have been used each year as the Baseline.

Table ES1 shows the average score for each of the three TIM SA sections from the Baseline and 2011, along with the percentage change from the Baseline. The table also shows the high score achieved in each of the three program areas. The overall mean score for the 2011 TIM SA was 68.2 percent, representing a 42.3-percent increase over the Baseline. A listing of all 34 TIM SA questions, their respective Baseline and 2011 scores, and the percentage of programs scoring each question 3 or higher [1] can be found in Appendix A.

Table ES1. Mean Score for Each Section (Baseline and 2011)
Section | No. of Questions | Mean Score: Baseline, 2011 | High Score 2011 (possible) | % Change in Scores From Baseline | Section Weight
Strategic | 12 | 35.0%, 53.4% | 30.0 (30) | 52.5% | 30%
Tactical | 16 | 64.1%, 75.3% | 39.7 (40) | 17.4% | 40%
Support | 6 | 39.4%, 73.7% | 29.6 (30) | 87.0% | 30%
Overall Total | 34 | 48.0%, 68.2% | 95.8 (100) | 42.3% | 100%
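
The overall score is a weighted combination of the three section scores, using the section weights shown in the final column of Table ES1. As a minimal sketch of that arithmetic in Python (assuming the weights are applied directly to the section mean percentages, which the report implies but does not state explicitly), the following reproduces the totals in the table:

  # Weighted overall TIM SA score: a sketch, assuming the section weights
  # in Table ES1 are applied directly to the section mean scores.
  weights = {"Strategic": 0.30, "Tactical": 0.40, "Support": 0.30}
  baseline = {"Strategic": 35.0, "Tactical": 64.1, "Support": 39.4}
  year_2011 = {"Strategic": 53.4, "Tactical": 75.3, "Support": 73.7}

  def overall(scores):
      # Weighted mean of the three section scores, in percent.
      return sum(weights[s] * scores[s] for s in weights)

  base_total = overall(baseline)    # ~48.0, matching the Overall Total row
  total_2011 = overall(year_2011)   # ~68.2
  pct_change = (total_2011 - base_total) / base_total * 100  # ~42.3
  print(f"Baseline {base_total:.1f}%, 2011 {total_2011:.1f}%, change {pct_change:.1f}%")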

Strategic

The questions in the Strategic section asked respondents to rate progress in how the TIM program is organized, resourced, supported, and sustained. Key elements of this section include multi-agency coordination and TIM performance measures. While the Strategic section had the lowest score of the three sections (53.4%), the questions in this section have realized a 52.5-percent increase over the Baseline, indicating improvement in this area.

Despite progress in the Strategic area, the five questions receiving the lowest mean score in the TIM SA were all in this section, with four out of five coming from the subsection on TIM Performance Measurement. The questions on TIM Performance Measurement have consistently been among the lowest scoring on the TIM SA. The TIM Performance Measurement subsection focused on three key metrics: roadway clearance time, incident clearance time, and reduction of secondary incidents. Of the three performance measures, reduction in secondary incidents (Question 4.1.3.5) had the lowest score (0.87). This measure also saw a significant decrease in mean score of 32 percent from 2010 to 2011. This drop puts the 2011 mean score 15.4 percent below the Baseline, one of only two questions to perform below the Baseline level. Furthermore, the 0.87 score makes it the lowest scoring individual question in the 2011 TIM SA. Nearly half of respondents (46.2%) stated that there was "no activity" in this area. The lack of activity was not unique to smaller locations, as many top ten metropolitan areas reported "no activity" as well. Despite the low score, many respondents commented that their TIM program has very recently started to track secondary incidents and that data should be available for the 2012 TIM SA. However, some respondents commented that it is difficult to define secondary incidents and even more difficult to track reductions in their occurrence. Further guidance on this subject should continue to be a priority for FHWA.

As a whole, the mean score for the TIM Performance Measurement subsection has more than doubled since the Baseline. However, the 2011 mean score is below the 2010 score of 1.84. Given that there were no significant revisions to this subsection, the drop in score is likely attributable to changes in performance measure activities. Numerous comments pointed to funding difficulties at TIM programs across the country, which could explain the drop: collecting and analyzing performance measures requires investments in technology and staff resources.

Another area of concern that was identified by low scores focused on multi-agency coordination. Question 4.1.2.1 dealt with the multi-agency agreements/MOUs used to structure TIM programs. This question was modified as part of the 2011 TIM SA, and four composite questions were created to yield a more nuanced score:

  1. Is the agreement/MOU signed by top officials from participating agencies?
  2. Are incident scene roles and responsibilities for each participating agency clearly defined in the agreement and communicated to all participating agencies?
  3. Are agency roles and responsibilities for planning for and funding for the TIM program clearly defined in the agreement/MOU?
  4. Are safe, quick clearance goals stated as time goals for incident clearance (e.g., 90 minutes) in the agreement/MOU?

This question scored 1.74 in the 2011 TIM SA, only a 1.6-percent increase over the Baseline. Part C of this question, on clearly defined agency roles and responsibilities for planning and funding, scored the lowest of the four composite questions (0.98). While all four composite questions have room for improvement, clearly defining multi-agency planning and funding roles and responsibilities in the agreement/MOU is the most problematic.

In addition to the scored questions, the TIM SA prompts respondents for additional information about their respective programs through supplemental, non-scored questions. Question 4.1.2.1 contained two such questions, asking how frequently the agreements/MOUs were updated and which agencies were primary signatories. On update frequency, "As needed" was the most frequently cited response, followed closely by "Has not been updated." TIM programs that lack a planned, systematic review and update process could experience difficulty maintaining continuity, particularly if there is turnover among the primary contacts in the participating agencies.

Continuing the theme of multi-agency coordination, question 4.1.2.2 dealt with multi-agency TIM planning. The mean score for this question was 2.12, an improvement of 56.9 percent over the Baseline and the largest increase in this subsection. While this improvement is laudable, the score is still relatively low. The comments submitted for this question generally indicated that TIM planning was taking place within individual agencies but was not being performed across agencies.

The highest score in the Strategic section was achieved by the question on planning for special events (4.1.1.4), with a mean score of 3.28. The score for this question was the composite average of individual scores in planning for the following types of events: Construction and Maintenance; Sporting Events, Concerts, Conventions; Weather-related Events; and Catastrophic Events. Among those categories, Construction and Maintenance (4.1.1.4.a) and Sporting Events, Concerts, Conventions (4.1.1.4.b) achieved the highest mean scores of 3.38 and 3.37, respectively. The advance notice of these events affords opportunities for planning, resulting in higher scores.

Tactical

The questions in the Tactical section focused on the policies and procedures used by field personnel when responding to incidents. These included the policies and procedures specifically targeting motorist and responder safety. Collectively, these questions consistently score among the highest in the TIM SA, and in 2011 this section achieved an overall score of 75.3 percent, making it the highest scoring of the three sections. Three of the five questions achieving the highest mean score were in the Tactical section.

One of the key elements of the Tactical section is the presence and execution of three core safe, quick clearance (SQC) laws. Question 4.2.2.1 on Move Over laws received the highest mean score (3.53) in the Tactical section, indicating a high degree of success in promulgating Move Over laws. Question 4.2.1.1 on Authority Removal had a mean score in 2011 of 2.99. The third SQC law, Driver Removal (4.2.1.2), scored 2.98 in 2011. The 2011 mean score for driver removal represented a 6-percent decrease compared to 2010. This drop is likely due to qualifying language added to the three SQC law questions in the 2011 TIM SA asking respondents to consider whether or not the laws are effectively communicated to responders and drivers. All three laws had lower scores in the communication element of the composite score, indicating that simply having a law is only part of the process for improving SQC.

Respondents indicated that the Driver Removal and Move Over laws were generally communicated to drivers through both static signs and dynamic message boards. For Authority Removal laws, many comments indicated confusion and hesitation among responders about invoking the authority. Better training on the benefits and process behind Authority Removal may be necessary in areas with lower scores. There may also be issues with the enforcement of Move Over and Driver Removal laws: while 90 percent of respondents indicated that Move Over laws were being enforced by police officers, there have been anecdotal reports that citations are not being upheld by the courts.

The lowest scoring question in the Tactical section dealt with mutually understood equipment staging and lighting procedures to maximize traffic flow around the incident while protecting responders (4.2.2.5). Though it has increased 53 percent over the Baseline, the relatively low mean score of 2.11 points to continued challenges in achieving consensus on how responder equipment should be staged and how responder lights should be deployed and eventually shed as the incident moves toward clearance. It is also important to note that this question was refined in 2011 by adding composite questions to better assess TIM programs in this area. These changes are likely responsible for the 4-percent drop in scores from 2010 to 2011. The four specific types of procedures queried in this question received the following scores:

  • Personal Protective Equipment (PPE) used by responders: 3.20
  • Vehicle and equipment staging procedures: 2.60
  • Light-shedding procedures: 1.82
  • Pre-established, signed accident investigation sites: 0.83

The change to composite scoring for 4.2.2.5 created the opportunity for a more nuanced analysis of the deficiencies in staging and lighting procedures. The use of PPE by responders received the highest score of the four procedures analyzed. In terms of vehicle and equipment staging procedures, there was continued evidence that fire, law enforcement, and transportation can sometimes disagree on how response equipment should be deployed to protect responders. Light-shedding procedures and accident investigation sites scored the lowest of the four procedures, clearly indicating room for improvement.
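
The reported mean of 2.11 for question 4.2.2.5 is consistent with a simple average of the four sub-scores listed above. As a minimal sketch (equal weighting of the sub-questions is an assumption; the report does not spell out the composite formula):

  # Composite score for question 4.2.2.5: a sketch assuming the four
  # sub-question scores are averaged with equal weight (an assumption;
  # the report does not state the exact composite formula).
  sub_scores = {
      "PPE used by responders": 3.20,
      "Vehicle and equipment staging procedures": 2.60,
      "Light-shedding procedures": 1.82,
      "Pre-established, signed accident investigation sites": 0.83,
  }
  composite = sum(sub_scores.values()) / len(sub_scores)
  print(f"Composite mean: {composite:.2f}")  # ~2.11, matching the reported score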

In addition to questions that achieved low mean scores, some questions experienced very little improvement over the Baseline. One such question was 4.2.1.3 on the use of Safety Service Patrols (SSPs) for incident response. While the mean score of 2.76 is fair, it represents only a 1.2-percent increase over the Baseline. One potential reason for the stagnant score could be the scoring definitions provided to better standardize responses; this year's TIM SA gave respondents a series of FHWA definitions to guide their answers. Another potential reason could be ongoing State budget deficits that may be negatively impacting SSP funding. Nothing in the comments confirmed this; however, previous focus group work with SSP practitioners has highlighted State budget challenges as a significant problem in fully funding SSP functions.

Another area in need of improvement is fatal accident procedures. Two new supplemental questions were added to the 2011 TIM SA addressing early notification of the medical examiner (ME) and removal of the deceased prior to ME arrival. Of those who answered the supplemental questions, 59 percent indicated there was some type of early notification. However, many respondents noted a shortage of medical examiners, making it difficult to get them on-scene promptly. Only 29 percent of respondents indicated there was a policy in place for removal of the deceased prior to ME arrival. Comments on these questions indicated that coordination with the medical examiner could be improved by including that office in TIM multi-agency task forces.

Responses to the questions on accident investigation and quick clearance procedures also indicated a need for improvement. Many areas lacked procedures for expedited accident reconstruction and investigation, as evidenced by the relatively low score for that question (2.59). With many TIM programs implementing clearance time goals, it is important to streamline as many elements of incident response and clearance as possible. Along those lines, a supplemental question was added to the 2011 TIM SA asking whether there was an incentive program for towing operators to expedite removal of commercial vehicle or spilled cargo incidents. Only 22.2 percent answered "Yes." Incentive programs can lead to quicker clearance of more severe incidents and should be considered by areas that do not have them in place.

Support

The questions in the Support section focused on the tools and technologies enabling improved incident detection, response, and clearance. Without the infrastructure and back office support for incident information exchange, the detection, verification, response, and clearance times are delayed and responder and motorist safety is jeopardized. As a result, one of the three key objectives of the National Unified Goal for Traffic Incident Management is rapid, reliable, interoperable communications.

The Support section had the second highest overall score of 73.7 percent and had the largest increase compared to the Baseline of the three sections (87.0 percent). Significant progress in this section indicates that technology and data analysis are becoming increasingly prevalent in TIM operations.

The use of a Traffic Management Center/Traffic Operations Center (TMC/TOC) to coordinate incident detection, notification, and response (4.3.1.1) scored the highest of any question in the overall TIM SA with a mean score of 3.54, representing a 78.7-percent increase over Baseline. Comments for this question indicated that most respondents had a TMC or TOC; however, some mid-sized metropolitan areas continued to lack this critical resource for efficiently managing incident response.

Another area of success in this section was data/video sharing between agencies. This question (4.3.1.2) scored well (3.17), increasing 121.8 percent over Baseline. Technological advances in recent years are facilitating the exchange of data and video between agencies, though the comments suggested that video sharing was not as prevalent as data sharing. The comments also indicated that this question's score should continue to improve in the coming years as several locations noted that data/video sharing agreements were in the works.

Traveler information services have also dramatically increased in score compared to the Baseline as a result of technological advances. The provision of travel time estimates to motorists (4.3.2.2) achieved one of the highest percentage increases in 2011 from the Baseline (188.9 percent). The comments suggested that use of these technologies is emerging not only in major metropolitan areas but in smaller metropolitan areas as well. Question 4.3.2.1 queried respondents on ways that traveler information is delivered, specifically 511/website, mobile application, and through traffic media access to TMC/TOC data/information. All three types of communication had a mean score above 3, with 511/website leading at 3.56, followed by media access at 3.44 and mobile application at 3.06. Considering the recent emergence of mobile applications, a score over 3.0 is an indicator of how quickly the technology is being deployed for TIM purposes.

Question 4.3.1.3, which dealt with procedures for traffic management during incident response, had the lowest score in the Support section (2.14) and the smallest change compared to the Baseline (38.1 percent). This question was slightly revised in 2011 to add composite questions on signal timing changes and pre-planned detour routes. Signal timing changes had the lower mean score (1.84, compared to 2.44 for pre-planned detour routes). Comments indicated that, in many places, the ability to remotely change signal timing does not exist; that ability often resides with local municipalities, and changes must be made on-site. Respondents noted that cross-agency and cross-jurisdictional coordination was generally the biggest barrier to the use of pre-planned detours.

That question also contained two supplemental, non-scored questions on the utilization of HOV (High Occupancy Vehicle) lane opening/closing and ramp metering for traffic management. Only 22.5 percent of respondents indicated that they used the opening/closing of HOV lanes for traffic management purposes during incident response. Likewise, only 25.9 percent utilized ramp metering in similar circumstances. This is likely the result of the limited number of TIM SA respondent locations equipped with HOV lanes and/or ramp metering.

Another area in need of some improvement dealt with interoperable, interagency communications between responders. The mean score for this question (4.3.1.4) was fair at 2.62, a 63.0-percent increase over the Baseline. Despite that improvement, there is still work to be done: 17.2 percent of respondents scored this question "Low," unchanged from the 2010 TIM SA. The comments also suggested that even in some locations where the technology was in place, the use of interoperable, interagency communications was not part of multi-agency training. TIM program managers have long understood that the inability of responders to communicate on-scene is a significant obstacle to safe, effective incident response and clearance.

Opportunities for TIM Stakeholders

One of the key purposes of the TIM SA is to identify TIM program areas where resources can be deployed to address gaps, both locally and nationally. First and foremost, a review of the questions with the lowest mean scores highlights the program areas most in need of attention. In addition, an analysis of program areas whose mean scores did not advance from year to year, regardless of the numeric value of the score, presents further opportunities for TIM stakeholders to address program gaps.

Multi-agency Coordination

The analysis of the 2011 TIM SA responses reveals a consistent deficiency relating to multi-agency coordination. Many of the lowest scoring questions contained a multi-agency element. The five lowest scoring questions, all in the Strategic section, focused on multi-agency TIM teams, formalized TIM programs, and performance measures intended to transcend agency borders. In some locations, respondents indicated that even where multi-agency coordination occurred, there was little to no formalized process behind the collaboration. The lack of defined incident scene roles, training procedures, and multi-agency communication can lead to confusion, inefficiencies, and possibly even hazardous conditions for on-scene responders.

Similarly, as part of the National Traffic Incident Management Coalition's (NTIMC) Strategic Direction Setting Webinar (held September 8, 2011), TIM practitioners identified Multi-agency TIM Teams as the top priority area where additional assistance is needed and where NTIMC resources should be focused in providing help. While there have been many successes in developing strong multi-agency relationships, there continues to be a need for significant outreach work on the benefits of multi-agency coordination and the steps for achieving a true multi-agency TIM program.

TIM Performance Measures

As is the case each year, questions on TIM Performance Measures are some of the lowest scoring questions but also tend to have the largest increase over the Baseline. This year, four of the five questions achieving the lowest mean scores in 2011 were in the TIM Performance Measurement subsection. The lowest score overall was achieved in secondary incident tracking (4.1.3.5), which was also the lowest scoring question in 2010. The secondary incident question also experienced a marked decline in mean score (-15.4 percent) compared to the Baseline.

TIM Performance Measurement is admittedly a difficult element of TIM practice to implement. Collecting and analyzing data to track multi-agency incident and clearance times is difficult, and tracking secondary incidents, which can be hard to define, can be even more challenging. That is why it is critical that FHWA continues to provide support at the national level for TIM performance measurement through programs such as the TIM Performance Measures Focus States Initiative and the TIM Performance Measures Knowledgebase. These programs should be credited for contributing to the substantial increases in scores over the Baseline. It appears likely that future surface transportation reauthorization bills will have a significant performance measures component. TIM programs that proactively move toward performance measurement not only will gain a better understanding of their TIM program operations but will be prepared for any future performance measurement requirements. However, despite the initial successes of FHWA outreach programs, there is still a need for additional guidance in the area of secondary incidents.

Safe, Quick Clearance Laws and Policies

The 2011 TIM SA added new questions on SQC laws and policies to gauge the implementation of those laws. This was done to clarify the difference between simply having SQC laws in place and their effective use by responders. Move Over laws that drivers do not know about or that are not enforced do not increase responder safety. Driver Removal laws that motorists do not understand will still result in vehicles unnecessarily blocking travel lanes. Decision makers who oppose implementation of Authority Removal due to liability concerns will contribute to spilled cargo and disabled vehicles impeding traffic flow around incidents. As expected, all three questions relating to the utilization and understanding of SQC laws scored lower than the questions on the existence of the laws.

TIM stakeholders must continue education and outreach to decision makers about the critical need to have in place and enforce all three SQC laws. Furthermore, FHWA can provide leadership on the outreach and education messages and tools for drivers to ensure compliance with those laws.

Leveraging Other Programs

There are a number of concurrent efforts underway that can and should be leveraged to improve TIM performance and, therefore, increase TIM SA scores.

National Traffic Incident Management Coalition (NTIMC) and the TIM Network

The NTIMC is composed of TIM stakeholder organizations, and the TIM Network represents practitioner-level involvement across the range of TIM disciplines. The value of both the NTIMC and the TIM Network was acknowledged by participants in the NTIMC Strategic Direction Setting Webinar (held September 8, 2011). In the post-webinar survey, respondents indicated that the NTIMC and its TIM Network should be utilized primarily for: 1) serving as a source for best practices publications; 2) providing information about available TIM training; and 3) providing guidance on TIM-related Federal initiatives. As the NTIMC continues its strategic direction-setting activity, opportunities for utilizing the NTIMC and the TIM Network for these three distinct purposes should advance TIM programs and have a positive impact on TIM SA scores going forward.

Developing a Framework for Emergency Responder/Roadside Worker Struck-by/Near-miss Database

This is the first priority study to be advanced from the National Cooperative Highway Research Program (NCHRP) 20-7 (282) Research Needs Assessment for Roadside Worker and Vehicle Visibility initiative completed in early 2011. It has been funded and will be conducted as a separate NCHRP 20-7 special study, laying the groundwork for architecting the Struck-by/Near-miss database. The database is widely recognized as the first critical step in understanding the root causes of incident responder struck-by/near-miss incidents and developing training and best practices to mitigate those incidents.

Technical Guidance for Traffic Incident Management Performance Measurement Implementation

This study is planned as part of the NCHRP, administered by the Transportation Research Board (TRB). Its objective is to "develop technical guidelines and related resources to assist DOTs in standardizing TIM PM terminology, data standards, data collections, and data analysis." [2] Once the study is completed, additional advances in the scores for the TIM Performance Measures should follow.

Traffic Incident Management Responder Training

The Strategic Highway Research Program (SHRP 2) Traffic Incident Responder Training curriculum development and pilot testing were completed in early 2011. As part of the SHRP 2 pre-implementation activities, the national curriculum will undergo additional pilot testing and refinement through mid-2012. The knowledge gaps identified in the TIM SA have informed the training messages included in the curriculum to date and should continue to do so as the training is pilot tested and finalized.

Summary

The 2011 TIM SA was revised to reflect changes in the current state of TIM practice and to reduce the subjectivity of responses. A total of 93 TIM SAs were completed in 2011, with an average overall score of 68.2 percent (out of a possible 100 percent). Overall scores were up 42.3 percent over the Baseline scores. The highest scores were achieved in Tactical (75.3 percent), and the largest percentage increase in scores from the Baseline was in Support (87.0 percent).

Low-scoring questions and those with the least improvement over Baseline indicate specific program areas where additional guidance from FHWA may be warranted. Questions dealing with multi-agency cooperation tended to receive some of the lowest scores, indicating the need for additional cross-agency communication and planning. TIM Performance Measurement is a perennial low-scoring subsection. In particular, tracking reductions in the occurrence of secondary incidents is almost non-existent at this point. The effective implementation of SQC laws was another area in need of improvement. This year, questions on the enforcement and communication of SQC laws were added to address implementation. While many areas have enacted the laws, they must also be effectively implemented in order to have the desired impacts on safety.

Appendix A. Summary of 2011 TIM SA Results


Table ES2. 2011 TIM SA Results for the Strategic Section
Columns: Question Number | Question | Mean Score (range 0 to 4): Baseline, 2011 | % of Assessments Scoring 3 or Higher: Baseline, 2011 | % Change in 2011 Mean Score From Baseline
4.1.1.1 Have a TIM multi-agency team or task force which meets regularly to discuss and plan for TIM activities? 1.90 2.66 28% 59% 39.8%
4.1.1.2

Is multi-agency training held at least once a year on TIM-specific topics?

  • NIMS/ICS 100
  • Training of mid-level managers from primary agencies on the National Unified Goal?
  • Traffic control?
  • Work zone safety?
  • Safe parking?
1.26 2.37 9% 58% 87.7%
4.1.1.3 Conduct multi-agency post-incident debriefings? 1.62 2.56 18% 55% 58.0%
4.1.1.4

Conduct planning for special events?

  • Construction and maintenance?
  • Sporting events, concerts, conventions, etc?
  • Weather-related events?
  • Catastrophic events?
2.47 3.28 35% 88% 33.0%
4.1.2.1

Is the TIM program supported by multi-agency agreements/memoranda of understanding?

  • Is the agreement/MOU signed by top officials from participating agencies?
  • Are incident scene roles and responsibilities for each participating agency clearly defined in the agreement and communicated to all participating agencies?
  • Are agency roles and responsibilities for planning for and funding for the TIM program clearly defined in the agreement/MOU?
  • Are safe, quick clearance goals stated as time goals for incident clearance (e.g., 90 minutes) in the agreement/MOU?
1.71 1.74 18% 37% 1.6%
4.1.2.2 Is planning to support the TIM activities done across and among participating agencies? 1.35 2.12 12% 40% 56.9%
4.1.2.3 Is there someone from at least one of the participating agencies responsible for coordinating the TIM program as their primary job function? *NB 2.28 *NB 45% N/A
4.1.3.1

Have multi-agency agreement on the two performance measures being tracked?

  • Roadway clearance time?
  • Incident clearance time?
0.64 1.94 3% 41% 203.3%
4.1.3.2 Has the TIM program established methods to collect and analyze the data necessary to measure performance in reduced roadway clearance time and reduced incident clearance time? 0.64 2.18 3% 47% 241.1%
4.1.3.3 Have targets (e.g. time goals) for performance of the two measures? 1.16 1.84 4% 38% 58.5%
4.1.3.4 Routinely review whether progress is made in achieving the targets? 0.74 1.78 3% 32% 141.2%
4.1.3.5 Track performance in reducing secondary incidents? 1.03 0.87 8% 6% -15.4%

*NB=New Baseline. This indicates a new question to the TIM SA. The data for this question will be used in Baseline calculations for this year's report and going forward.

Table ES3. 2011 TIM SA Results for the Tactical Section
Columns: Question Number | Question | Mean Score (range 0 to 4): Baseline, 2011 | % of Assessments Scoring 3 or Higher: Baseline, 2011 | % Change in 2011 Mean Score From Baseline
4.2.1.1

Have "authority removal" laws allowing pre-designated responders to remove disabled or wrecked vehicles and spilled cargo?

  • Is there an "authority removal" law in place?
  • Is it understood and utilized by responders?
2.92 2.99 67% 76% 2.6%
4.2.1.2

Have "driver removal" laws which require drivers involved in minor crashes (not involving injuries) to move vehicles out of the travel lanes?

  • Is there a "driver removal" law in place?
  • Is it communicated to motorists?
3.01 2.98 71% 74% -0.9%
4.2.1.3 Use a safety service patrol for incident and emergency response? 2.73 2.76 67% 73% 1.2%
4.2.1.4 Utilize the Incident Command System on-scene? 2.55 3.41 58% 84% 33.7%
4.2.1.5 Have response equipment pre-staged for timely response? 2.21 2.84 41% 68% 28.4%
4.2.1.6 Identify and type resources so that a list of towing and recovery operators (including operator capabilities and special equipment) is available for incident response and clearance? 2.86 3.28 67% 77% 14.7%
4.2.1.7 Identify and type resources so that a list of HazMat contractors (including capabilities and equipment) is available for incident response? 2.89 3.16 69% 75% 9.4%
4.2.1.8 Does at least one responding agency have the authority to override the decision to utilize the responsible party's HazMat contractor and call in other resources? *NB 3.22 *NB 81% N/A
4.2.1.9 In incidents involving fatalities, is the Medical Examiner response clearly defined and understood? 2.53 2.96 55% 69% 16.9%
4.2.1.10 Are there procedures in place for expedited accident reconstruction/ investigation? *NB 2.59 *NB 55% N/A
4.2.1.11 Is there a policy in place for removal of abandoned vehicles? *NB 3.47 *NB 87% N/A
4.2.2.1

Have "move over" laws which require drivers to slow down and if possible move over to the adjacent lane when approaching workers or responders and equipment in the roadway?

  • Is there a "move over" law in place?
  • Is it communicated to drivers?
3.20 3.53 85% 92% 10.4%
4.2.2.2 Train all responders in traffic control following MUTCD guidelines? 1.97 2.86 28% 65% 45.2%
4.2.2.3 Routinely utilize transportation resources to conduct traffic control procedures for various levels of incidents in compliance with the MUTCD? 1.93 3.32 27% 81% 72.2%
4.2.2.4 Routinely utilize traffic control procedures for the end of the incident traffic queue? 1.56 2.68 17% 54% 71.6%
4.2.2.5

Have mutually understood equipment staging and emergency lighting procedures on-site to maximize traffic flow past an incident while providing responder safety?

  • Vehicle and equipment staging procedures?
  • Light-shedding procedures?
  • PPE used by responders?
  • Pre-established, signed accident investigation sites?
1.38 2.11 14% 51% 53.1%

*NB=New Baseline. This indicates a new question to the TIM SA. The data for this question will be used in Baseline calculations for this year's report and going forward.

Table ES4. 2011 TIM SA Results for the Support Section
Columns: Question Number | Question | Mean Score (range 0 to 4): Baseline, 2011 | % of Assessments Scoring 3 or Higher: Baseline, 2011 | % Change in 2011 Mean Score From Baseline
4.3.1.1 Does the TIM program use a Traffic Management Center/Traffic Operations Center to coordinate incident detection, notification, and response? 1.98 3.54 41% 89% 78.7%
4.3.1.2 Is there data/video sharing between agencies? 1.43 3.17 10% 75% 121.8%
4.3.1.3

Does the TIM program have specific policies and procedures for traffic management during incident response?

  • Signal timing changes?
  • Pre-planned detour and alternate routes identified and shared between agencies?
1.55 2.14 18% 49% 38.1%
4.3.1.4 Does the TIM program provide for interoperable, interagency communications onsite between incident responders? 1.61 2.62 17% 59% 63.0%
4.3.2.1

Have a real-time motorist information system providing incident-specific information?

  • Traveler information delivered via 511/ website?
  • Traveler information delivered via mobile applications?
  • Traveler information delivered through traffic media access to TMC/TOC data/information?
1.90 3.35 27% 89% 76.6%
4.3.2.2 Are motorists provided with travel time estimates for route segments? 0.99 2.86 12% 66% 188.9%

  [1] TIM SA respondents are asked to rate their progress as Low, Medium, or High; these ratings are then translated into a numeric score ranging from 0 to 4, with 4 being the highest score possible per question.
  [2] National Cooperative Highway Research Program (NCHRP) 07-20, Technical Guidance for Traffic Incident Management Performance Measurement Implementation. Project statement available online at http://apps.trb.org/cmsfeed/TRBNetProjectDisplay.asp?ProjectID=3160. Accessed 10/21/11.