Office of Operations
21st Century Operations Using 21st Century Technologies

Chapter 2. Strategic Section

The number of questions in the Strategic section was reduced from 28 to 21, and the questions were grouped into three subsections:

  • Formal Traffic Incident Management (TIM) Programs
  • TIM Training and After-Action Reports
  • TIM Performance Measures (TIM PM)

The TIM PM subsection is the largest, with 12 questions. Consistently low TIM PM scores have typically made the Strategic section the lowest scoring of the three sections, and the 2020 Traffic Incident Management Capability Maturity Self-Assessment (TIM CM SA) is no different: the Strategic section scored 64.7 percent, compared with 75.9 percent for the Tactical section and 67.0 percent for the Support section. This year’s Strategic score, however, represents a 9.9 percent increase over the baseline of 58.9 percent. The average scores for the three subsections were:

  • Formal TIM Programs: 3.0
  • TIM Training and After-Action Reports: 2.7
  • TIM PM: 2.3
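The percent changes reported in this chapter, such as the 9.9 percent Strategic gain over the 58.9 percent baseline, appear to be relative changes from the baseline score. A minimal sketch of that arithmetic (an assumption about the report's method; the report likely computes changes from unrounded scores, so the rounded values shown here reproduce the published percentages only approximately):

```python
def pct_change(baseline: float, current: float) -> float:
    """Relative percent change of a score from its baseline value."""
    return (current - baseline) / baseline * 100.0

# Strategic section: 58.9 percent baseline to 64.7 percent in 2020.
# Rounded inputs give 9.8; the report cites 9.9, presumably from unrounded scores.
print(f"{pct_change(58.9, 64.7):.1f}")  # 9.8
```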

Regular meetings of TIM team member agencies are key to the success of a formal TIM program. Among the 2020 TIM CM SA participants, 68 percent indicated that their teams meet at least four times per year. Participants noted that in 2020 their teams met virtually, rather than in person, to continue advancing their TIM programs.

Three questions in the Strategic section addressed the Federal Highway Administration’s (FHWA) national TIM Responder Training Program, which was originally developed as part of the second Strategic Highway Research Program (SHRP2). Respondents were asked to score their participation in SHRP2 training, whether the training is conducted in a multidiscipline setting, and whether the training has been incorporated into State or local academy and/or technical college curriculums. Prior to the 2020 TIM CM SA revisions, respondents had been asked to score their programs based on the number of responders who had completed the training relative to the total number of responders in the State. The FHWA removed this question from the TIM CM SA because training participation is tracked through a separate initiative. According to FHWA data, as of December 14, 2020, more than 507,000 individuals had received the training, representing 43.8 percent of the total responders to be trained.3

Despite the challenges of delivering in-person training in 2020, TIM responders had additional training opportunities beyond SHRP2 training, including Incident Command System/National Incident Management System, livestock incidents, cable barrier incident response, hazardous materials (HazMat), push/pull/drag, and queue awareness.

Five questions in the 2020 TIM CM SA had an average score at least 20 percent higher than the baseline; four of these questions were in the Strategic section, as shown in table 2. This indicates that, despite the short timeline between the new 2015 baseline and 2020, TIM programs continue to make improvements in these four areas.

Table 2. Strategic questions with a score at least 20 percent higher than the baseline.
Question | Baseline Mean Score | 2020 Mean Score | Change from Baseline (%)
5. Are funds available for TIM activities? | 2.4 | 3.0 | 23.0
8. Has the SHRP2 TIM Responder Training, or equivalent, been incorporated into the State or local academy and/or technical college curriculums? | 1.8 | 2.4 | 34.7
18. Is the number of Secondary Crashes being measured and used? FHWA defines Secondary Crashes as the “number of unplanned crashes beginning with the time of detection of the primary crash where a collision occurs either a) within the incident scene or b) within the queue, including the opposite direction, resulting from the original incident.” | 1.9 | 2.3 | 22.8
19. How is data for the number of Secondary Crashes collected? | 1.9 | 2.3 | 22.2
Note: The numbers in this table demonstrate general patterns, and have been rounded for ease of communication.
TIM = traffic incident management. SHRP2 = second Strategic Highway Research Program.

Over the same period between the 2015 baseline and the 2020 TIM CM SA, four questions had scores that decreased from the baseline, all in the TIM PM subsection, as shown in table 3.

Table 3. Traffic Incident Management Performance Measures questions with an average score below the baseline.
Question | 2020 Average Score | Change from Baseline (%)
11. Which of the following data collection and analysis practices best align with your region for RCT? | 2.4 | -8.5
15. Which of the following data collection and analysis practices best align with your region for ICT? | 2.4 | -10.4
16. Has the TIM program established performance targets for ICT? | 2.1 | -1.8
21. How does your agency use Secondary Crash performance data to influence TIM operations? | 1.9 | -14.6
Note: The numbers in this table demonstrate general patterns, and have been rounded for ease of communication.
ICT = incident clearance time. RCT = roadway clearance time. TIM = traffic incident management.

For questions 11 and 15, concerning data collection and analysis for roadway clearance time (RCT) and incident clearance time (ICT), approximately 20 percent of respondents scored their programs with a 1. According to the TIM CM SA scoring guidance, programs should be scored with a 1 if “Data are present but not necessarily accessible or useful because they are not collected with a focus on performance measures.”4 As shown in table 4, it is primarily the non-top 75 metropolitan areas that are not yet using the available data to measure RCT and ICT.

Table 4. Data collection and analysis questions for incident clearance time and roadway clearance time.
Question | Top 40 Metropolitan Area Average Score | Top 75 Metropolitan Area Average Score | Non-Top 75 Average Score
11. Which of the following data collection and analysis practices best align with your region for RCT? | 2.6 | 2.7 | 2.1
15. Which of the following data collection and analysis practices best align with your region for ICT? | 2.6 | 2.6 | 2.1
Note: The numbers in this table demonstrate general patterns, and have been rounded for ease of communication.
ICT = incident clearance time. RCT = roadway clearance time.

Although scores for TIM PM questions have traditionally been low, progress continues in this area. As shown in table 5, only two questions in the 2020 TIM CM SA had an average score below 2.0, both concerning Secondary Crashes.

Table 5. 2020 questions with an average score below 2.0.
Question | Baseline Average Score | 2020 Average Score
20. Has the TIM program established performance targets for a reduction in the number of Secondary Crashes? | 1.4 | 1.6
21. How does your agency use Secondary Crash performance data to influence your TIM operations? | 2.2 | 1.9
Note: The numbers in this table demonstrate general patterns, and have been rounded for ease of communication.
TIM = traffic incident management.

TIM programs with the highest scores in the Strategic section are shown in table 6.

Table 6. Highest-scoring programs in the Strategic section.
Traffic Incident Management Program
Atlanta, Georgia
Buffalo, New York
Columbus, Ohio
Louisville, Kentucky
Miami-Dade County, Florida


3 Federal Highway Administration, National TIM Responder Training Program Update, Talking TIM Webinar (December 15, 2020).

4 Federal Highway Administration, Traffic Incident Management Capability Maturity Self-Assessment 2020 User Guide and Questions (September 1, 2020).