
Executive Summary

Background

The Traffic Incident Management Self-Assessment (TIM SA) is a benchmarking tool for evaluating TIM program components and overall TIM program success. The Federal Highway Administration (FHWA) initiated development of the TIM SA in 2002 and the first assessments were conducted in 2003. While the TIM SA is intended to provide local TIM program managers with a way to assess progress, analysis of the aggregated TIM SA results also allows FHWA to identify program gaps and better target TIM program resources.

There are 80 FHWA-defined operational areas (States, regions, localities) in the annual TIM SA process. The original plan for the TIM SA was to have 40 of the operational areas complete a re-assessment in 2004 and the remaining 40 complete one in 2005. In 2006, the decision was made to have all 80 areas conduct the TIM SA annually. Since the inaugural TIM SA in 2003, additional TIM programs beyond the original 80 have completed and submitted the TIM SA for inclusion in the national analysis. A total of 86 TIM SAs were submitted for the 2009 national analysis, the largest number submitted to date. Table ES1 shows the total number of new assessments and re-assessments completed each year.

Table ES1. TIM SA Completed
Year | New Assessments | Re-Assessments | Total Completed
2003 | 70 | -- | 70
2004 | 7 | 25 | 32
2005 | 1 | 41 | 42
2006 | -- | 70 | 70
2007 | 5 | 62 | 67
2008 | 2 | 74 | 76
2009 | 6 | 80 | 86

The TIM SA underwent a review and revision in 2007 to align it more closely with the current state of TIM practice and to better coordinate with a number of complementary Federal initiatives. The TIM SA Revision was completed in 2008. Among other changes, the Revision reduced the number of questions from 34 to 31 and renamed the three primary categories of questions as follows:

  • Program and Institutional Issues was renamed Strategic.
  • Operational Issues was renamed Tactical.
  • Communication and Technology Issues was renamed Support.

In order to benchmark progress in the three sections, the initial assessments completed in 2003 and 2004, plus one completed in 2005 (78 in total), are used as the Baseline data against which subsequent years (2006 and beyond) are evaluated. Table ES2 shows the average score for each of the three TIM SA sections from the Baseline and 2009, along with the percentage change from the Baseline. The table also shows the high score achieved in each of the three program areas.

Table ES2. Mean Score for Each Section (Baseline and 2009)
Section | # of Questions | Mean Score (Baseline) | Mean Score (2009) | High Score 2009 (possible) | % Change from Baseline | Section Weight
Strategic | 11 | 36.3% | 51.1% | 28.2 (30) | 40.9% | 30%
Tactical | 13 | 57.6% | 68.8% | 39.2 (40) | 19.5% | 40%
Support | 7 | 41.3% | 59.0% | 30.0 (30) | 42.8% | 30%
Overall Total | 31 | 45.9% | 60.6% | 96.6 (100) | 31.9% | 100%
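
The overall score appears to be a weighted combination of the three section means, using the section weights shown in Table ES2. The short Python sketch below illustrates that arithmetic; the aggregation formula is inferred from the table rather than stated in the report, and because the published figures were presumably computed from unrounded data, results here match them only to within rounding.

```python
# Sketch of the apparent Table ES2 arithmetic: a weighted combination of
# section mean scores, plus percent change relative to the Baseline.
# The formula is inferred from the table, not stated in the report.

SECTIONS = {
    # section: (Baseline mean %, 2009 mean %, weight)
    "Strategic": (36.3, 51.1, 0.30),
    "Tactical":  (57.6, 68.8, 0.40),
    "Support":   (41.3, 59.0, 0.30),
}

def weighted_overall(year_index):
    """Weighted sum of section means (0 = Baseline, 1 = 2009)."""
    return sum(v[year_index] * v[2] for v in SECTIONS.values())

def pct_change(baseline, current):
    """Percent change relative to the Baseline value."""
    return (current - baseline) / baseline * 100.0

print(weighted_overall(1))        # ~60.55; the report rounds to 60.6%
print(pct_change(36.3, 51.1))     # ~40.8; the report shows 40.9% (unrounded data)
```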

Strategic

The questions in the Strategic section ask respondents to rate progress in how the TIM program is organized, resourced, supported and sustained. The Strategic questions also cover TIM performance measures. Scores on the Strategic questions have increased 40.9 percent over the Baseline.

Despite progress in the Strategic area, the five questions receiving the lowest mean scores in the TIM SA are in this section, with four of the five coming from the subsection on TIM Performance Measurement (Table ES3). The lowest scoring question, on tracking performance in reducing secondary incidents, was added as part of the TIM SA Revision and therefore has no Baseline against which to measure progress. In 2010 and subsequent TIM SA analyses, the 2009 mean score of 1.03 will become the Baseline for this question.

Table ES3. Lowest Mean Scores (2009)
Rank in 2009/Baseline | Question Number | Section | Question | 2009 Mean Score (n=76) | % Scoring 3 or Higher (2009) | % Change in 2009 Mean Score from Baseline
31/-- | 4.1.3.5 | Strategic | Track performance in reducing secondary incidents? | 1.03 | 8% | --
30/23 | 4.1.3.4 | Strategic | Routinely review whether progress is made in achieving the targets? | 1.63 | 26% | 120.0%
29/24 | 4.1.3.1 | Strategic | Have multi-agency agreement on the two performance measures being tracked (roadway clearance time and incident clearance time)? | 1.66 | 26% | 159.8%
28/19 | 4.1.3.2 | Strategic | Is there a process in place to ensure the continuity of these agreements/memoranda of understanding through integrated planning and budgeting across and among participating agencies? | 1.79 | 28% | 32.6%
27/21 | 4.1.3.3 | Strategic | Have targets (i.e., time goals) for performance of the two measures? | 1.84 | 33% | 58.4%

The questions in TIM Performance Measurement are also among those that achieved the largest increases from the Baseline. Table ES4 shows that scores for three of the TIM Performance Measurement questions have more than doubled since the Baseline.

Table ES4. Largest Changes in Mean Score (2009 from Baseline)
Rank in 2009/Baseline | Question Number | Section | Question | 2009 Mean Score (n=86) | % Scoring 3 or Higher (2009) | % Change in 2009 Mean Score from Baseline
24/24 | 4.1.3.2 | Strategic | Has the TIM program established methods to collect and analyze the data necessary to measure performance in reduced roadway clearance time and reduced incident clearance time? | 1.97 | 33% | 207.0%
29/24 | 4.1.3.1 | Strategic | Have multi-agency agreement on the two performance measures being tracked? | 1.66 | 26% | 159.8%
30/23 | 4.1.3.4 | Strategic | Routinely review whether progress is made in achieving the targets? | 1.63 | 26% | 120.0%
22/22 | 4.3.2.2 | Support | Are motorists provided with travel time estimates for route segments? | 2.13 | 42% | 114.9%
20/20 | 4.1.2.2 | Strategic | Conduct training (NIMS training, training on the NTIMC National Unified Goal, other training)? | 2.16 | 49% | 71.7%
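
The per-question columns reported in Tables ES3 through ES6, the mean score and the share of assessments scoring a question 3 or higher, are simple aggregates over the individual assessments. The sketch below shows that computation on hypothetical response data; only the 0-4 scoring scale is taken from the report.

```python
# Sketch of the per-question statistics in Tables ES3-ES6: the mean score
# across assessments and the percentage of assessments scoring the
# question 3 or higher. The response data below are hypothetical.

from statistics import mean

def question_stats(scores):
    """Return (mean score, percent of assessments scoring 3 or higher)."""
    pct_3_plus = 100.0 * sum(1 for s in scores if s >= 3) / len(scores)
    return mean(scores), pct_3_plus

# Hypothetical 0-4 scores from ten assessments for a single question.
sample = [0, 1, 1, 2, 2, 2, 3, 3, 4, 4]
avg, pct = question_stats(sample)
print(f"mean = {avg:.2f}, scoring 3 or higher = {pct:.0f}%")  # mean = 2.20, 40%
```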

Tactical

The questions in the Tactical section focus on the policies and procedures used by field personnel when responding to incidents, including the policies and procedures specifically targeting motorist and responder safety. Collectively, these questions consistently score among the highest in the TIM SA, and in 2009 this section achieved an overall score of 68.8 percent. Four of the five questions achieving the highest mean scores are in the Tactical section (Table ES5).

The highest scoring question in the 2009 TIM SA, on "move over" laws, was added as part of the 2008 TIM SA Revision and therefore has no Baseline score. With 85 percent of assessments scoring this question 3 or higher, and with "move over" laws already in place in 47 States, this question is expected to remain among the top five scoring questions in subsequent analyses.

Table ES5. Highest Mean Scores (2009)
Rank in 2009/Baseline | Question Number | Section | Question | 2009 Mean Score (n=86) | % Scoring 3 or Higher (2009) | % Change in 2009 Mean Score from Baseline
1/-- | 4.2.2.1 | Tactical | Have "move over" laws which require drivers to slow down and if possible move over to the adjacent lane when approaching workers or responders and equipment in the roadway? | 3.20 | 85% | --
2/2 | 4.2.1.3 | Tactical | Use a safety service patrol for incident and emergency response? | 3.10 | 83% | 13.7%
3/5 | 4.1.2.4 | Strategic | Conduct planning for special events? | 3.09 | 88% | 25.0%
4/4 | 4.2.1.4 | Tactical | Utilize the Incident Command System? | 3.08 | 76% | 20.8%
4/1 | 4.2.1.6 | Tactical | Identify and type resources so that a list of towing, recovery and hazardous materials response operators (including operator capabilities and special equipment) is available for incident response and clearance? | 3.08 | 74% | 7.7%

In part because scores in the Tactical section are already high, it is also the TIM SA section whose questions show the smallest increases in mean score from the Baseline. However, as shown in Table ES6, two of the three questions with the least change over the Baseline point to a need for additional guidance in hazardous materials incident response.

Table ES6. Smallest Changes in Mean Score (2009 from Baseline)
Rank in 2009/Baseline | Question Number | Section | Question | 2009 Mean Score (n=86) | % Scoring 3 or Higher (2009) | % Change in 2009 Mean Score from Baseline
15/3 | 4.2.1.7 | Tactical | Have specific policies and procedures for hazmat and fatal accident response that also address maintaining traffic flow around the incident? | 2.50 | 56% | -7.7%
4/1 | 4.2.1.6 | Tactical | Identify and type resources so that a list of towing, recovery and hazardous materials response operators (including operator capabilities and special equipment) is available for incident response and clearance? | 3.08 | 74% | 7.7%
2/2 | 4.2.1.3 | Tactical | Use a safety service patrol for incident and emergency response? | 3.10 | 83% | 13.7%

Support

The questions in the Support section focus on the tools and technologies enabling improved incident detection, response and clearance. Collectively, the Support questions continue to show the largest increase over the Baseline, up 42.8 percent. However, the overall mean score declined slightly in 2009, from 59.4 percent in 2008 to 59.0 percent.

In the Data subsection, the highest scoring question is 4.3.1.1, on the use of a Traffic Management Center/Traffic Operations Center (TMC/TOC) to coordinate incident detection, notification and response. However, lower scores throughout this subsection indicate that the potential of TMCs/TOCs is not yet being fully realized, due to several factors including limited co-location of public safety and transportation personnel in the centers.

Summary

The 2009 TIM SA is the first conducted following the extensive review and revision completed in 2008. As a result of the revision, several key changes were made to the TIM SA:

  • The three subsections were renamed.
  • The total number of questions was reduced from 34 to 31.
  • A new scoring approach was instituted, asking respondents to rate progress as High, Medium or Low rather than on the numeric 0-4 scale (illustrated in the sketch after this list).
  • An online TIM SA was introduced to make it easier for participants to respond to the questions.
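
The report does not state how the High/Medium/Low ratings are converted into the numeric mean scores shown in the tables above. The sketch below uses an invented mapping purely to illustrate how qualitative ratings could be aggregated for year-over-year comparison; the actual conversion used by the TIM SA may differ.

```python
# Purely illustrative: the report does not specify how High/Medium/Low
# ratings map onto the numeric means in Tables ES2-ES6, so the values
# below are invented for this sketch.

RATING_TO_SCORE = {"Low": 1.0, "Medium": 2.5, "High": 4.0}  # assumed mapping

def mean_score(ratings):
    """Average a list of qualitative ratings on an assumed 0-4 scale."""
    return sum(RATING_TO_SCORE[r] for r in ratings) / len(ratings)

print(mean_score(["Low", "Medium", "High", "High"]))  # 2.875
```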

With a total of 86 TIM SAs completed in 2009, the revisions appear to have had a positive impact on participation. The 86 assessments comprise 80 re-assessments and six new locations submitting an assessment for the first time. An overall score of 60.6 percent was achieved, representing a 31.9 percent increase over the Baseline. The highest scores continue to be in the Tactical section, and the largest percentage increase over the Baseline was once again in the Support section.

Low scoring questions and those with the least improvement over the Baseline indicate specific program areas where additional guidance from FHWA is warranted. This includes TIM Performance Measurement and, in particular, additional guidance on secondary incident definitions and technical direction on tracking reductions in the occurrence of secondary incidents.
