Office of Operations
21st Century Operations Using 21st Century Technologies

Executive Summary

Background

The Traffic Incident Management Self-Assessment (TIM SA) provides a means of evaluating progress on individual TIM program components and on overall TIM program success. Now in its sixth year, the TIM SA has also allowed the Federal Highway Administration (FHWA) to identify program gaps and target resources to TIM program advancement.

There are 80 FHWA-defined operational areas (States, regions, localities) in the annual TIM SA process. The original plan was for all 80 operational areas to submit a baseline assessment in 2003, with 40 areas re-assessing in 2004 and the remaining 40 in 2005. In 2006, the decision was made to have all 80 areas conduct the TIM SA annually. Since its launch in 2003, additional operational areas beyond the original 80 have completed and submitted the TIM SA for inclusion in the national analysis. Table 1 shows the total number of new and re-assessments completed each year.

Table 1. TIM SA Completed
Year New Assessments Re-Assessments Total Completed
2003 70 -- 70
2004 7 25 32
2005 1 41 42
2006 -- 70 70
2007 5 62 67
2008 2 74 76

Each year FHWA Division Office personnel in the 80 operational areas are asked to collaborate with Regional, State, and local TIM stakeholders to come to consensus on a score for each of 34 TIM program questions, grouped in the following sections:

  • Program and Institutional Issues (Strategic).
  • Operational Issues (Tactical).
  • Communication and Technology Issues (Support).

For analysis purposes, the initial assessments completed in 2003, 2004, and one in 2005 (78 in total) are used as the Baseline data against which subsequent years (2006 and beyond) are evaluated. Table 2 shows the average score for each TIM SA section from the Baseline and 2008, along with the percentage change from the Baseline.

Table 2. Mean Score for Each Section (Baseline and 2008)
Section # of Questions Mean Score Baseline Mean Score 2008 % Change in scores from Baseline Section Weights
Program and Institutional Issues 12 36.30% 51.00% 40.50% 30%
Operational Issues 14 57.60% 66.20% 15.00% 40%
Communications and Technology Issues 8 41.30% 59.40% 43.80% 30%
Overall Total 34 45.90% 59.60% 29.90% 100%
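The overall total in Table 2 is a weighted average of the three section means, using the listed section weights, and the percent-change column follows directly from the baseline and 2008 means. A minimal Python sketch of that arithmetic, using the values published in Table 2 (small discrepancies, such as a recomputed 14.9 percent versus the published 15.0 percent for Operational Issues, reflect rounding of the published means):

```python
# Section means (percent) and weights as published in Table 2.
sections = {
    "Program and Institutional Issues":     {"baseline": 36.3, "y2008": 51.0, "weight": 0.30},
    "Operational Issues":                   {"baseline": 57.6, "y2008": 66.2, "weight": 0.40},
    "Communications and Technology Issues": {"baseline": 41.3, "y2008": 59.4, "weight": 0.30},
}

def pct_change(baseline, current):
    """Percent change of `current` relative to `baseline`."""
    return (current - baseline) / baseline * 100

# Weighted overall 2008 score: 0.30*51.0 + 0.40*66.2 + 0.30*59.4 = 59.6.
overall_2008 = sum(s["y2008"] * s["weight"] for s in sections.values())

for name, s in sections.items():
    print(f"{name}: {pct_change(s['baseline'], s['y2008']):.1f}% change from Baseline")
print(f"Overall 2008 (weighted): {overall_2008:.1f}%")
```

The same weighted-average computation on the baseline column yields approximately 46.3 percent versus the published 45.9 percent, again consistent with the published section means having been rounded.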

Program and Institutional Issues

Program and Institutional Issues, also referred to as Strategic Issues, are those that address how a program is organized; its objectives and priorities; agency roles and relationships; resource allocation; and performance measurement. The increase in Program and Institutional Issues in 2008 from the Baseline was 40.5 percent.

Within Program and Institutional Issues, the questions in the subsection on TIM Performance Measures have consistently received the lowest mean scores in the TIM SA (Table 3) while simultaneously achieving some of the largest percentage changes from the Baseline (Table 4). This was true again in 2008. Overall, the 2008 scores for the four TIM Performance Measures questions increased 84.5 percent over the Baseline scores.

Table 3. Lowest Mean Scores (2008)
Mean Score Rank in 2008/Baseline Question Number Question 2008 Mean Score (n=76) % Scoring 3 or Higher (2008) % Change in 2008 Mean Scores from Baseline
34/32 4.1.3.4 Program and Institutional Issues Conduct periodic review of whether or not progress is being made to achieve targets? 1.36 19% 83.80%
33/33 4.1.3.2 Program and Institutional Issues Have agreed upon methods to collect and analyze/track performance measures? 1.46 12% 127.60%
32/34 4.1.3.1 Program and Institutional Issues Have multi-agency agreements on what measures will be tracked and used to measure program performance? 1.52 13% 136.70%
31/30 4.1.3.3 Program and Institutional Issues Have established targets for performance (Response, Clearance)? 1.57 13% 35.60%

Table 4. Largest Changes in Mean Score (2008 from Baseline)
Mean Score Rank in 2008/Baseline Question Number Question 2008 Mean Score (n=76) % Scoring 3 or Higher (2008) % Change in 2008 Mean Scores from Baseline
32/34 4.1.3.1 Program and Institutional Issues Have multi-agency agreements on what measures will be tracked and used to measure program performance? 1.52 13% 136.70%
33/33 4.1.3.2 Program and Institutional Issues Have agreed upon methods to collect and analyze/track performance measures? 1.46 12% 127.60%
34/32 4.1.3.4 Program and Institutional Issues Conduct periodic review of whether or not progress is being made to achieve targets? 1.36 19% 83.80%
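The per-question percent changes in Tables 3 and 4 use the same relationship between the baseline and 2008 means, so a baseline mean can be back-solved from a 2008 mean and its published percent change. A small illustrative Python check, using question 4.1.3.1 from Table 4 (the recovered baseline is implied by the published figures, not itself published):

```python
def implied_baseline(mean_2008, pct_change):
    """Back-solve the baseline mean from a 2008 mean and its percent change."""
    return mean_2008 / (1 + pct_change / 100)

# Question 4.1.3.1: 2008 mean of 1.52, up 136.7% from the Baseline.
b = implied_baseline(1.52, 136.7)
print(f"Implied baseline mean for 4.1.3.1: {b:.2f}")
```

The implied baseline mean of roughly 0.64 illustrates why the TIM Performance Measures questions can post both the lowest 2008 scores and the largest percentage gains: small absolute improvements on a very low base produce large relative changes.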

Operational Issues

Operational Issues, also referred to as Tactical Issues, address the policies, procedures, and processes used in the field while responding to an incident. Year after year, questions in Operational Issues have consistently been among the highest scoring (Table 5). This holds true even among the emerging programs completing the TIM SA, for which Operational Issues is the highest scoring of the three sections.

Table 5. Highest Mean Scores (2008)
Mean Score Rank in 2008/Baseline Question Number Question 2008 Mean Score (n=76) % Scoring 3 or Higher (2008) % Change in 2008 Mean Scores from Baseline
1/4 4.2.1.3 Operational Issues Have a pre-identified (approved) contact list of resources (including special equipment) for incident clearance and hazardous materials response? 3.2 86% 11.90%
2/1 4.2.1.2 Operational Issues Identify high-ranking agency members available on 24/7 basis to respond to a major incident (Major Incident Response Team)? 3.16 78% 9.00%
3/2 4.2.3.5 Operational Issues Have a pre-qualified list of available and contracted towing and recovery operators (to include operators' capabilities)? 2.99 76% 3.30%
4/6 4.2.3.1 Operational Issues Utilize the Incident Command System? 2.92 78% 14.60%
5/8 4.1.2.5 Program and Institutional Issues Conduct planning for "special events?" 2.87 54% 16.20%

The five questions showing the least upward movement in mean score are all in Operational Issues (Table 6). This may reflect that many programs have reached a satisfactory level of performance in these operational areas and are concentrating their resources on other TIM areas.

Table 6. Smallest Changes in Mean Score (2008 from Baseline)
Mean Score Rank in 2008/Baseline Question Number Question 2008 Mean Score (n=76) % Scoring 3 or Higher (2008) % Change in 2008 Mean Scores from Baseline
8/3 4.2.3.3 Operational Issues Have specific policies and procedures for hazardous materials response that also address maintenance of traffic flow? 2.73 66% -5.6%
11/7 4.2.3.2 Operational Issues Have specific policies and procedures for fatal accident investigation that also address maintenance of traffic flow? 2.61 55% 3.1%
3/2 4.2.3.5 Operational Issues Have a pre-qualified list of available and contracted towing and recovery operators (to include operators' capabilities)? 2.99 76% 3.3%
6/5 4.2.3.6 Operational Issues Use motorist assist service patrols? 2.87 76% 5.0%
2/1 4.2.1.2 Operational Issues Identify high-ranking agency members available on 24/7 basis to respond to a major incident (Major Incident Response Team)? 3.16 78% 9.0%

Communications and Technology Issues

The questions in Communication and Technology Issues, also referred to as Support Issues, address the resources utilized in TIM and the associated policies and procedures governing their use.

Among the three primary sections in the TIM SA (Strategic, Tactical, and Support), the combined questions in Communications and Technology Issues realized the highest percentage increase in 2008 from the Baseline. With a section mean score of 59.4 percent, the 2008 scores for the eight questions represent a 43.8 percent increase over the Baseline.

For the third year in a row, the largest percentage increase in mean score (100.3 percent) in Communications and Technology Issues was for providing motorists with travel time estimates for route segments.

Summary

A total of 76 TIM SAs were completed in 2008, with an average overall score of 59.6 percent (out of a possible 100 percent). Overall scores are up nearly 30 percent (29.9 percent) over the Baseline scores. Continuing the trend from previous years, the highest scores were achieved in Operational Issues (66.2 percent) and the largest percentage increase in scores from the Baseline was in Communications and Technology Issues.

The four questions in the subsection on TIM Performance Measures continue to be the lowest scoring individual questions while achieving some of the largest percentage increases over the Baseline.

The mean scores in Program and Institutional Issues have been the lowest overall since the start of the TIM SA. This tracks the evolution of TIM programs, where the initial work has typically focused on on-scene operations conducted by an ad hoc group of stakeholders and not on building a formal program. However, progress is being made to advance TIM strategic program elements as evidenced by increasing mean scores. This could be the result of increased multi-agency cooperation in other highway safety programs and the obligations of national preparedness. It also could be a reflection of the attention paid to increasing highway congestion and the resulting mitigation solutions.

With one exception, each question in Operational Issues has a mean score higher than 2, and 36 percent of the questions score 2.75 or higher. However, some questions in Operational Issues experienced decreases in mean score, either from the Baseline or from 2007; six questions had lower mean scores in 2008 than in 2007. While the decreases were small, the fact that scores are decreasing may indicate a leveling out of effort in Operational Issues, or it may be the result of an increase in the number of emerging programs completing the TIM SA, where lower scores overall are expected.

The Communications and Technology Issues questions have experienced the largest increase over the Baseline, up 43.8 percent. Of the three TIM SA sections, Communications and Technology Issues are seemingly the most resource dependent. With current economic conditions and budgets at all levels of government experiencing shortfalls, it would be reasonable to expect that the scores in this area either level off or in some cases decrease. However, this is an area where the TIM nexus to national emergency response and preparedness goals is demonstrating the ability to advance TIM program performance and therefore increase TIM SA scores. The critical need for communications connectivity and interoperability between responders at all levels is driving federal investments through programs such as the U.S. Department of Homeland Security Urban Area Security Initiative (UASI).
