Office of Operations
21st Century Operations Using 21st Century Technologies

2015 Traffic Incident Management National Analysis Report

Executive Summary

November 2015

United States Department of Transportation Federal Highway Administration

Office of Operations
1200 New Jersey Avenue, SE
Washington, DC 20590

Contact Information: OperationsFeedback@dot.gov


Background

In 2002, the Federal Highway Administration (FHWA) initiated the development of the Traffic Incident Management Self-Assessment (TIM SA) to be used by State and local TIM program managers to benchmark and evaluate TIM program success and areas of improvement. The TIM SA was first deployed by FHWA in 2003 and assessments have been conducted annually since.

Over the years, FHWA has undertaken several revisions of the TIM SA to bring it more in line with current TIM state-of-practice. These revisions included the addition and/or removal of individual TIM SA questions and the rewording of some questions to limit the subjectivity of responses inherent in a self-assessment. The two major revisions, in 2007 and 2011, also included changes to the scoring schema and as such, required a recalibration of the baseline scores to protect the value of the TIM SA as a tool to measure national TIM progress over time.

In 2015, FHWA initiated a third major revision of the TIM SA, designed to serve several purposes. As with previous revisions, this one was intended to reflect current TIM state-of-practice. For example, given the widespread deployment of the National TIM Responder Training Course by FHWA, TIM SA respondents are now asked to score their individual program's involvement in the training and utilization of the curriculum for improving on-scene response. Previously (in 2013 and 2014), the questions on the National TIM Responder Training Course were included as non-scored supplemental questions.

One of the most significant revisions to the TIM SA in 2015 was designed to align the TIM SA with the emerging Capability Maturity Framework (CMF). The CMF allows organizations to assess their level of capability in six “dimensions” defined as:1

  • Business Processes
  • Systems and Technology
  • Performance Measurement
  • Culture
  • Organization and Staffing
  • Collaboration

Within each dimension, organizations can identify which level of maturity they believe their program has achieved from among four levels:2

  • Level 1 – Performed
  • Level 2 – Managed
  • Level 3 – Integrated
  • Level 4 – Optimizing

Merging the TIM SA into the CMF required changes to a number of the questions and a revised scoring schema that allows TIM SA respondents to score their program's level of success (or maturity) from 1-4. Previously the TIM SA asked respondents to assign a score to each question ranging from 0-4.

A hallmark of the CMF is that it provides scoring guidance for each question posed in each dimension. While the TIM SA previously provided some general scoring guidance, it was necessary for the 2015 TIM SA to develop more specific scoring guidance for each TIM SA question. The benefit of this specific guidance is that it significantly reduces the subjectivity that had previously impacted TIM SA scores.

The combination of a new scoring schema and the addition of a number of scored questions that in previous years had been non-scored questions resulted in a need to recalibrate the baseline scores in each of the three TIM SA sections – Strategic, Tactical and Support – and recalculate the overall TIM SA baseline score. For a complete description of the process used to recalculate the baseline scores, see Appendix A.

In summary, the major changes implemented in the 2015 TIM SA include the following:

  • The inclusion of new scored questions on the National TIM Responder Training Course.
  • A renumbering of the TIM SA questions sequentially to reduce confusion from the previous numbering system, which utilized 4.1.x.x for all Strategic questions, 4.2.x.x for all Tactical questions and 4.3.x.x for all Support questions.
  • A reweighting of the three TIM SA sections in the overall scoring based on the number of questions included in each.
  • Specific scoring guidance provided for all 51 questions in the TIM SA.
  • A recalibration of the baseline scores to protect the value of the TIM SA as a tool to measure national TIM progress over time.

2015 Traffic Incident Management Self-Assessment Results

In 2015 a total of 95 locations completed a TIM SA for inclusion in the national analysis. The 51 scored questions contained within the TIM SA were grouped into three sections: Strategic, Tactical and Support. The initial assessments completed in 2003, 2004, and 2005 (78 in total) continue to be used as the baseline scores; however, the scores were recalibrated this year as a result of the significant revisions to the TIM SA described above.

The most significant impact on the baseline scores from the change in the scoring schema results from the lowest possible score now being 1 rather than 0. Therefore, questions which previously had a low baseline average score due to the number of locations scoring that question 0 now have a higher baseline average score because those same locations' scores moved up to 1.

Additionally, the baseline scores were submitted as part of TIM SAs conducted more than 10 years ago, when many TIM programs were new or emerging and individual questions may therefore have been scored lower. Baseline submittals represented 34 locations answering 40 questions each year they submitted. Of those submittals, 18.7 percent of question scores were less than 1. In the 2015 TIM SA, with a decade of advancement in TIM state-of-practice, the share of individual questions receiving the lowest score has been reduced by nearly 20 percent, to 15.1 percent of questions scored 1 (the lowest score a question can receive in the 2015 TIM SA).
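
To illustrate the mechanics of the recalibration, the brief sketch below (using hypothetical scores, not actual TIM SA data) shows how raising the score floor from 0 to 1 lifts the baseline average of any question that previously received scores of 0:

    # Hypothetical baseline responses for one question under the old 0-4 schema.
    old_scores = [0, 0, 1, 2, 3]

    # Under the 2015 schema the lowest possible score is 1, so former 0s
    # move up to the new floor when the baseline is recalibrated.
    new_scores = [max(score, 1) for score in old_scores]

    print(sum(old_scores) / len(old_scores))  # 1.2 (old baseline average)
    print(sum(new_scores) / len(new_scores))  # 1.6 (recalibrated average)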

The more specific scoring guidance for each question in the 2015 TIM SA also affected the change in average scores from 2014 to 2015. Locations that may have scored a question higher in previous TIM SAs, when scoring criteria were nebulous or non-existent, now have very specific criteria in the 2015 TIM SA against which to rate their program. The net result is a slight decrease in scores from 2014 to 2015.

Additionally, some of the questions eliminated as part of the 2015 revision were those for which a large number of programs had consistently high scores. For instance, in previous years' TIM SAs, the question on Move Over laws was a scored question. With Move Over laws now in place in all 50 states, TIM SA respondents were regularly scoring that question a 4. Because there is no longer a need to measure progress in securing a Move Over law as part of an individual TIM program, the question on Move Over laws is non-scored in the 2015 TIM SA. The result is the elimination of a routinely high-scoring question, with the potential effect of lowering overall TIM scores in the Tactical section.

The TIM SA baseline scores were also impacted by the redistribution of the weights assigned to the three sections in the overall score. In previous years' TIM SAs, the section scores were weighted as follows: Strategic (30 percent); Tactical (40 percent); and Support (30 percent). In the 2015 revision, the number of questions in the Strategic section doubled (from 12 to 24), so the section weights were adjusted to account for the relative number of questions in each section. Table 1 shows the new weighting for the three sections.

Table 1. Mean Score for Each Section (Baseline and 2015)
Section | # of Questions | Mean Score, Baseline (%) | Mean Score, 2015 (%) | 2015 High Score (possible) | Change from Baseline (%) | Section Weight (%)
Strategic | 24 | 42.4 | 61.8 | 36.6 (40) | 45.7 | 40
Tactical | 22 | 64.6 | 71.9 | 38.3 (40) | 11.2 | 40
Support | 5 | 39.7 | 68.5 | 20.0 (20) | 72.6 | 20
Overall | 51 | 50.7 | 67.1 | 92.9 (100) | 32.4 | 100
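
The weighting scheme can be illustrated with a short sketch (using the 2015 section means and weights from Table 1; an illustration of the blended average, not an official FHWA computation):

    # 2015 section mean scores (percent) and section weights from Table 1.
    sections = {
        "Strategic": (61.8, 0.40),
        "Tactical":  (71.9, 0.40),
        "Support":   (68.5, 0.20),
    }

    # The overall score is the weight-blended average of the section scores.
    overall = sum(score * weight for score, weight in sections.values())
    print(f"{overall:.1f}%")  # 67.2% -- matches the reported 67.1% within rounding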

Finally, because eliminating the 0 score raised the baseline scores, the gap between the recalculated Baseline and the 2015 scores is smaller, which reduces the percentage increase of the 2015 scores over the Baseline.

Table 1 shows the average score for each of the three TIM SA sections from the Baseline and 2015, along with the percentage change from the Baseline.

The 2015 overall TIM SA score was 67.1 percent (out of a possible 100 percent), representing a 32.4 percent increase over the Baseline. The TIM SA mean scores tended to be higher in larger metropolitan areas than in smaller areas. Specifically, mean scores were calculated for the top 40 metropolitan areas (by population), the top 75 metropolitan areas and non-top 75 metropolitan areas:

  • Top 40 metros: 72.5 percent
  • Top 75 metros: 69.5 percent
  • Non-top 75: 61.5 percent
  • Overall: 67.1 percent

As described above, there was a slight impact on the national score from the re-baseline when looking at the incremental change from 2014 to 2015. The overall national score decreased 7.1 percentage points (9.5 percent) from 74.2 to 67.1 percent.

A listing of all 51 TIM SA questions, their respective Baseline and 2015 scores and the percentage of programs scoring each question 3 or higher3 can be found in Appendix B.

Strategic

The 24 questions in the Strategic section are grouped into three subsections: Formal TIM Programs; TIM Training and After Action Reports; and TIM Performance Measures. The Strategic section saw the greatest increase in the number of questions as part of the 2015 revision and includes new scored questions on the National TIM Responder Training Course and data used in calculating TIM Performance Measures (TIM PM).

The Strategic section is historically the lowest scoring in the TIM SA, primarily driven by low scores in the TIM Performance Measures subsection. The Strategic section was once again the lowest scoring section in the 2015 TIM SA, achieving a score of 61.8 percent. However, this does represent a 45.7 percent increase over the newly recalculated Baseline.

Among the new questions in 2015 on TIM training, the average scores reflect the level of deployment of the National TIM Responder Training Course by FHWA. As shown in Table 2, Questions 13 and 14 both received high average scores, with a high percentage of TIM SA respondents scoring each question 3 or higher. Room for improvement remains in incorporating the TIM training course into local academy and/or technical college curricula. However, given the relative newness of the national training program, it is not unexpected that scores on Question 15 are lower, as widespread incorporation of this training will take longer to achieve.

Table 2. New traffic incident management training questions in 2015.
Question 2015 Average Score Percent of TIM SA Scoring 3 or Higher
13. What percentage (estimated) of TIM responders in the region identified as needing training have received the 4-Hour SHRP2 TIM Responder Training (in-person or via Web-Based Training), or equivalent? 2.82 57.90%
14. Is the SHRP2 TIM Responder Training being conducted in a multidiscipline setting? 2.97 66.30%
15. Has the SHRP2 TIM Responder Training, or equivalent, been incorporated into the local academy and/or technical college curriculums? 1.77 10.50%

Significant progress has been made in the area of TIM Performance Measurement over the past decade and the scores in the TIM PM subsection reflect that progress. Scores for both Roadway Clearance (RC) and Incident Clearance (IC) indicate that an increasing number of locations around the country are measuring both TIM PMs using the FHWA definitions and that the underlying data are being collected for a significant percentage of all incidents that occur in the reporting regions. However, average scores for the third TIM PM, secondary crashes, are among the lowest on the 2015 TIM SA. A total of four questions scored less than 2 on the 2015 TIM SA: one is Question 15 on TIM training (Table 2) and the other three are the questions on secondary crashes (Table 3).

Table 3. Traffic incident management performance measures – secondary crashes.
Question 2015 Average Score Percent of TIM SA Scoring 3 or Higher
21. Is the number of Secondary Crashes being measured utilizing FHWA's standard definition "number of unplanned crashes beginning with the time of detection of the primary crash where a collision occurs either: a) within the incident scene; or b) within the queue, including the opposite direction, resulting from the original incident"? 1.87 31.60%
22. How is data for the number of Secondary Crashes collected? 1.88 29.50%
23. Has the TIM program established TIM performance targets for a reduction in the number of Secondary Crashes? 1.36 10.50%

Slightly more than half (54.7 percent) of TIM SA respondents scored their program a 1 on Question 21, indicating that secondary crashes are typically not measured. Of the remaining locations that did score their program a 2 or higher, only 22 locations provided secondary incident data. Those locations reported that, on average, secondary incidents comprised 8.1 percent of all incidents, an increase from 2.0 percent reported in 2014. However, caution should be taken in interpreting changes in the percentage of incidents reported as secondary given the lack of a uniform definition for secondary incidents.

The comments provided by TIM SA respondents to the questions on secondary crashes indicate that the absence of a clear definition of what constitutes a secondary incident hinders data collection and analysis in this area. Other TIM SA respondents indicated that their programs are currently either developing methods for collecting secondary crash data or revising current accident reporting systems to include secondary crash data, in the hopes of including this metric in the TIM SA next year. One potential influence on the scores for secondary crashes is the number of rural TIM programs submitting TIM SAs for the national analysis. One comment from a rural TIM program respondent indicated that secondary crashes are not an issue because responders are typically able to get traffic control in place before significant traffic queues build around incidents, reducing the likelihood of secondary crashes.

Another important output of the TIM SA is the TIM Performance Measures (PM) Database. This database is populated annually based on responses to the TIM SA. Information on the three key PM metrics – Roadway Clearance Time (RCT), Incident Clearance Time (ICT) and secondary crashes – is tracked annually and compared to a Baseline (2011) level.

Average RCT increased to 63.80 minutes in 2015, up 1.4 percent from the 62.93 minutes reported in 2014. While it did increase slightly, average RCT remains less than the 2011 RCT Baseline of 65.39 minutes. Average incident clearance time (ICT) decreased by 4 percent from 2014 to 2015 (64.07 minutes in 2014 versus 61.53 minutes in 2015).
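
The percentage changes cited above follow directly from the reported minute values; a minimal sketch of the calculation:

    # Reported average clearance times in minutes (TIM PM Database).
    rct_2014, rct_2015 = 62.93, 63.80
    ict_2014, ict_2015 = 64.07, 61.53

    # Year-over-year relative change for each measure.
    rct_change = (rct_2015 - rct_2014) / rct_2014 * 100  # +1.4% (RCT rose)
    ict_change = (ict_2015 - ict_2014) / ict_2014 * 100  # -4.0% (ICT fell)

    print(f"RCT: {rct_change:+.1f}%  ICT: {ict_change:+.1f}%")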

The TIM programs that achieved the highest scores in the Strategic section are listed alphabetically in Table 4. Jurisdictions with low scores may wish to reach out to these locations for information on best practices.

Table 4. Highest scoring – strategic.
TIM Program
Cincinnati, OH
Kansas City, MO/KS
Louisville, KY
Milwaukee, WI
Orlando, FL

Tactical

The 2015 TIM SA revision added a third subsection to the Tactical section, with a total of four new scored questions added across the three subsections. The 22 questions in the Tactical section are now focused on the following three areas:

  • TIM Laws
  • Policies and Procedures for Incident Response and Clearance
  • Responder and Motorist Safety

The Tactical section continues to be the highest scoring of the three TIM SA sections, achieving an overall score of 71.9 percent. Four of the five highest scoring questions on the 2015 TIM SA are in the Tactical section, all within the Policies and Procedures subsection (Table 5).

Table 5. Traffic incident management policies and procedures – highest scoring in 2015
Question 2015 Average Score Percent of TIM SA Scoring 3 or Higher
36. Does at least one responding agency have the authority to override the decision to utilize the responsible party's Hazmat contractor and call in other resources? 3.34 82.10%
40. Is there a procedure in place for removal of abandoned vehicles? 3.31 81.10%
35. Is there a policy in place that clearly identifies reportable types and quantities, and appropriate Hazmat response? 3.21 83.20%
32. Do towing and recovery procedures/rotation list policies deploy resources based on type/severity of incident? 3.14 74.70%

TIM Policies and Procedures that provide for responder and motorist safety and expedited incident clearance are central to the National TIM Responder Training Course curriculum. The high scores in this area may already reflect the level of deployment of the training and it can be expected that these scores will continue to advance as the number of responders trained increases. Another avenue for measuring use of these policies and procedures in future years will be the National TIM Responder Training Course assessment tool which is currently being developed by FHWA. Once the assessment tool is launched by FHWA, responders who have completed the training will be able to rate their own use of the policies and procedures in incident response.

The TIM programs that achieved the highest scores in the Tactical section are listed alphabetically in Table 6. Jurisdictions with low scores may wish to reach out to these locations for information on best practices.

Table 6. Highest scoring – tactical.
TIM Program
Cincinnati, OH
Ft. Lauderdale, FL
West Palm Beach, FL
Seattle, WA
Virginia – Northern VA/Suburban DC

Support

The questions in the Support section focus on the tools and technologies enabling improved incident detection, response and clearance. Without the infrastructure and back-office support for incident information exchange, detection, verification, response and clearance times are delayed and responder and motorist safety is jeopardized. As a result, one of the three key objectives of the National Unified Goal for Traffic Incident Management is prompt, reliable, interoperable communications.

As part of the 2015 TIM SA revision process, a decision was made to eliminate the Support subsection on Traveler Information. Though the provision of traveler information allows motorists an opportunity to make route and modal changes when incidents occur, the TIM SA subject matter experts involved in the 2015 revision process believed that the questions on Traveler Information were more indicative of a region's Information Technology (IT) resources and capabilities and less a function of TIM program performance.

The five questions that remain in the Support section in 2015 all address TIM data sharing and integration among TIM stakeholders. The highest scoring question in the Support section was Question 47 (below), which achieved an average score of 3.32, the second highest of any question on the 2015 TIM SA.

47. Are TIM stakeholders aware of and actively utilizing Traffic Management Center/Traffic Operations Center resources to coordinate incident detection, notification and response?

Additionally, the pre-2015 question on data and video sharing between agencies (4.3.1.2) has been broken out into two separate questions to provide greater granularity on the level of data and video sharing. While the two questions achieved nearly identical average scores, the TIM data question (Question 48) had a higher percentage of TIM SA respondents scoring their program a 3 or 4 (Table 7).

Table 7. TIM data and video collection and use.
Question 2015 Average Score Percent of TIM SA Scoring 3 or Higher
48. What TIM data (i.e., number of involved vehicles, number of lanes blocked, length of queue, etc.) is captured via TMCs and/or public safety CAD systems and is it shared with other disciplines for real-time operational purposes? 2.81 70.5%
49. Is TIM video captured via TMCs and/or public safety CAD systems and is it shared with other disciplines for real-time operational purposes? 2.80 68.4%

The Support section had the second highest overall score, 68.5 percent, and the largest increase over Baseline of the three sections (72.6 percent).

The TIM programs that achieved the highest scores in the Support section are listed alphabetically in Table 8. Jurisdictions with low scores may wish to reach out to these locations for information on best practices.

Table 8. Highest scoring – support.
TIM Program
Alachua – Bradford, FL
Cincinnati, OH
Columbus, OH
El Paso, TX
Idaho - Statewide
Louisville, KY
Orlando, FL
Philadelphia, PA
San Diego, CA
Washington, DC

Summary

In 2015 the TIM SA underwent a significant revision, the third major revision to the TIM SA since its initial deployment in 2003. As with previous revisions, this year's was intended to better reflect current TIM state-of-practice in the TIM SA subsections and individual questions. The 2015 revision process was also designed to align the TIM SA with the emerging Capability Maturity Framework. The CMF alignment resulted in scoring changes and the addition of specific scoring criteria for each question in the TIM SA. This scoring guidance mitigates the subjectivity which had previously impacted the TIM SA scores. The net result of these changes was an increase in the overall Baseline score and a slight decrease in the incremental change from 2014 to 2015.

A total of 95 TIM SA were completed in 2015, with an average overall score of 67.1 percent (out of a possible 100 percent). Overall scores were up 32.4 percent over the recalibrated Baseline scores. The TIM SA mean scores tended to be higher in larger metropolitan areas than in smaller areas. Specifically, mean scores were calculated for the top 40 metropolitan areas (by population), the top 75 metropolitan areas and non-top 75 metropolitan areas:

  • Top 40 metros: 72.5 percent
  • Top 75 metros: 69.5 percent
  • Non-top 75: 61.5 percent
  • Overall: 67.1 percent

The highest scores were achieved in Tactical (71.9 percent) and the largest percentage increase in scores from the Baseline was in Support (72.6 percent). Low scoring questions and those with the least improvement over Baseline indicate specific program areas where additional guidance from FHWA may be warranted. Specifically, the 2015 TIM SA scores highlight a need for special attention in the following areas:

  • TIM Training:  With continued deployment of the National TIM Responder Training Course, there will be an opportunity to institutionalize the training within local responder academy and technical college course curricula. To accomplish this, FHWA may want to engage its Executive Leadership Group as advocates for the training with their respective training organizations. Not only would this positively impact scores on Question 15, which specifically addresses incorporation of the training into curricula, but it would also lift scores throughout the TIM SA as gains are realized in the Tactical subsections on Policies and Procedures for Incident Response and Clearance, and Responder and Motorist Safety.
  • TIM Performance Measures:  FHWA’s initial work to develop consensus on definitions for Roadway Clearance Time and Incident Clearance Time through its TIM Performance Measures Focus States Initiative,4 along with subsequent work to build the business case for collecting and analyzing TIM PM data, have paid off in continually increasing scores in the TIM PM subsection of the TIM SA. FHWA now has an opportunity to expand data collection and analysis of data on secondary incidents.

There were five questions whose average score dropped below Baseline in the 2015 TIM SA (Table 9). The drops were not significant and may be more a function of the specific scoring guidance provided in the 2015 TIM SA than an indicator of declining performance. A more telling indicator will be the incremental change in average score for these questions from 2015 to 2016, when the specific scoring guidance is used for a second time.

Table 9. Questions scoring below Baseline in 2015.
Question | Baseline Mean Score | 2015 Average Score | Percent Change from Baseline
26. Is a Driver Removal Law in place and understood by TIM stakeholders? | 3.01 | 2.85 | -5.20%
33. Do towing and recovery procedures/rotation list policies include company/operator qualifications, equipment requirements, and/or training requirements? | 2.86 | 2.84 | -0.60%
37. For incidents involving a fatality, is there a procedure in place for early notification and timely response of the Medical Examiner? | 2.53 | 2.47 | -2.40%
38. For incidents involving a fatality, is there a procedure for the removal of the deceased prior to Medical Examiner arrival? | 2.53 | 2.47 | -2.40%
40. Is there a procedure in place for removal of abandoned vehicles? | 3.47 | 3.31 | -4.70%


APPENDIX A. RECALCULATING THE TRAFFIC INCIDENT MANAGEMENT SELF-ASSESSMENT BASELINE SCORES

The first step in the recalibration of the TIM SA Baseline scores was to map the 51 questions in the 2015 Traffic Incident Management Self-Assessment (TIM SA) to the questions comprising the Baseline scores (Table A1). The Baseline scores for existing questions were utilized in the new Baseline calculation. For questions added for the first time in the 2015 TIM SA, the average score from 2015 becomes the Baseline for that question. Questions removed from the 2015 TIM SA likewise have had the associated Baseline score removed from the new Baseline calculation.

If a question was previously scored as a composite and is now split into individual questions, the previous overall score for that question was used in the baseline calculation for each of the resulting individual questions. Finally, after each question was assigned a baseline score, the new section baseline scores were calculated, and the new overall baseline score was calculated using the new section weights.
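
The sketch below summarizes these recalculation rules in code form (hypothetical question identifiers and simplified data structures; the actual mapping covers all 51 questions in Tables A1 through A3):

    # Old baseline scores keyed by pre-2015 question number (hypothetical subset).
    old_baseline = {"4.1.2.1": 1.71}
    # 2015 average scores, used as the baseline for questions new in 2015.
    scores_2015 = {"Q13": 2.82}

    # Mapping of new questions to previous questions (None = new in 2015).
    # A former composite question passes its overall score to each of the
    # individual questions it was split into (here, Q3 and Q4).
    question_map = {"Q3": "4.1.2.1", "Q4": "4.1.2.1", "Q13": None}

    new_baseline = {
        new: old_baseline[old] if old else scores_2015[new]
        for new, old in question_map.items()
    }
    print(new_baseline)  # {'Q3': 1.71, 'Q4': 1.71, 'Q13': 2.82}

    # Section baselines are then averaged per section and combined with the
    # new section weights (40/40/20) to produce the overall baseline score.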

NOTE: Questions in Tables A1 through A3 below that are marked with an asterisk (*) are non-scored questions.

Table A1. Question mapping – strategic.
New Question (Section 1 – Strategic) Previous Question (Section 1 – 4.1 Strategic)
1. Is there a formal TIM program that is supported by a multidiscipline, multi-agency team or task force, which meets regularly to discuss and plan for TIM activities? 4.1.1.1. Have a TIM multi-agency team or task force which meets regularly to discuss and plan for TIM activities?
1a. How frequently does the team or task force meet?* 4.1.1.1.b. How frequently does the team/task force meet?*
2. Are all disciplines and agencies participating in on-going TIM enhancement activities/efforts? 4.1.1.1.a. What agencies are represented on the team/task force?*
3. Is the importance of TIM understood by all TIM stakeholders and supported by multidiscipline, multi-agency agreements or memorandums of understanding (MOUs)? 4.1.2.1. Is the TIM program supported by multi-agency agreements/memoranda of understanding? (Composite score for 4.1.2.1.a. through 4.1.2.1.d. below)
3a. How often is the document updated?* 4.1.2.1.a.1. How often is the document updated?*
3b. Which agencies are signatories on the agreement /MOU?* 4.1.2.1.a.2. Which agencies are signatories on the agreement/MOU?*
4. Is agency leadership actively involved in program-level TIM decisions (i.e. policy establishment, training, funding, legislation, etc.)? 4.1.2.1.a. Is the agreement/MOU signed by top officials from participating agencies?
5. Is there a full-time position within at least one of the participating agencies with responsibility for coordinating the TIM program as their primary job function? 4.1.2.3. Is there someone from at least one of the participating agencies responsible for coordinating the TIM program as their primary job function?
6. Are the TIM response roles and responsibilities of public and private sector TIM stakeholders mutually understood? 4.1.2.1.b. Are incident scene roles and responsibilities for each participating agency clearly defined in the agreement and communicated to all participating agencies?
6a. How are the roles and responsibilities of public and private sector TIM stakeholders communicated to participating agencies?* 4.1.2.1.b.1 How are the roles and responsibilities defined in the agreement/MOU communicated to participating agencies?*
7. Is planning to support TIM activities, including regular needs assessments, done across and among participating agencies? 4.1.2.2. Is planning to support the TIM activities done across and among participating agencies?
8. Are funds available for TIM activities? 4.1.2.1.c. Are agency roles and responsibilities for planning for and funding for the TIM program clearly defined in the agreement/MOU?
9. Is TIM considered and incorporated into planning efforts for construction and work zones? 4.1.1.4.a. Construction and maintenance?
10. Is TIM considered and incorporated into planning efforts for special events such as sporting events, concerts, conventions, etc? 4.1.1.4.b. Sporting events/concerts/conventions/etc?
11. Is TIM considered and incorporated into planning efforts for weather-related events? 4.1.1.4.c. Weather-related events?
12. Have stakeholders in the region participated in a SHRP2 National TIM Responder Training Program, or equivalent, Train-the-Trainer (TtT) session and are they actively training others? 4.1.1.2. Is multi-agency training held at least once a year on TIM-specific topics? (Composite score for 4.1.1.2.a through 4.1.1.2.e: a. NIMS/ICS 100? b. Training of mid-level managers from the primary agencies on the National Unified Goal? c. Traffic control? d. Work zone safety? e. Safe parking?)
12a. Is there any other TIM-related supplemental or topic-specific training being provided?* NEW
13. What percentage (estimated) of TIM responders in the region identified as needing training have received the 4-Hour SHRP2 TIM Responder Training (in-person or via Web-Based Training), or equivalent? NEW
14. Is the SHRP2 TIM Responder Training being conducted in a multidiscipline setting? NEW
15. Has the SHRP2 TIM Responder Training, or equivalent, been incorporated into the local academy and/or technical college curriculums? NEW
16. Does the TIM program conduct multidiscipline, multi-agency after-action reviews (AARs)? 4.1.1.3 Conduct multi-agency post-incident debriefings?
16a. How many multi-agency AARs were held in the last 12 months?* 4.1.1.3.a. Is there a defined incident level or threshold at which mandatory, multi-agency post-incident reviews are conducted? If yes, what is that level? How many post-incident reviews were held in the last 12 months?*
17. Is Roadway Clearance Time being measured utilizing FHWA's standard definition "time between first recordable awareness of an incident by a responsible agency and first confirmation that all lanes are available for traffic flow"? 4.1.3.1. Have multi-agency agreement on the two performance measures being tracked (composite score for 4.1.3.1.a and 4.1.3.1.b): 4.1.3.1.a. Roadway Clearance Time?
17a. If available, what was the average Roadway Clearance Time for the prior year?* 4.1.3.2.a. If yes, what is your locale’s average Roadway Clearance Time for the prior year (September 1, 2012 to August 31, 2013)? * ___minutes
17b. If applicable, describe the difference between your definition for Roadway Clearance Time and the standard definition?* 4.1.3.2.a. FHWA defines Roadway Clearance Time as the "time between first recordable awareness of an incident by a responsible agency and first confirmation that all lanes are available for traffic flow." Is your performance measure: consistent with FHWA's definition; measured as first recordable awareness by a DOT (start time); or other (describe)?*
18. Is Incident Clearance Time being measured utilizing FHWA's standard definition "time between the first recordable awareness of the incident and the time at which the last responder has left the scene"? 4.1.3.1.b. Incident Clearance Time?
18a. If available, what was the average Incident Clearance Time for the prior year?* 4.1.3.2.b. If yes, what is your locale’s average Incident Clearance Time for the prior year (September 1, 2012 to August 31, 2013)?  ____minutes*
18b. If applicable, describe the difference between your definition for Incident Clearance Time and the standard definition.* 4.1.3.2.b. FHWA defines Incident Clearance Time as the "time between the first recordable awareness of the incident and the time at which the last responder has left the scene." Is your performance measure: consistent with FHWA's definition; measured as first recordable awareness by a DOT (start time); measured as time that DOT leaves scene (end time); measured as time that enforcement leaves scene (end time); or other (describe)?*
19. How is data for Roadway/Incident Clearance Time being collected? 4.1.3.2. Has the TIM program established methods to collect and analyze the data necessary to measure performance in reduced roadway clearance time and reduced incident clearance time?
19a. What type of incident data are used to calculate Roadway/Incident Clearance Time? (Choose the option that best describes your data or provide your own description: all incidents; major incidents only; DOT-involved incidents only; FSP-involved incidents only; or other.)* 4.1.3.2.b. What type of incident data are used to calculate Incident Clearance Time? (Choose the option that best describes your data or provide your own description: all incidents; major incidents only; DOT-involved incidents only; FSP-involved incidents only; or other.)*
19b. What percentage of incidents is being considered when calculating Roadway/Incident Clearance Time?* NEW
20. Has the TIM program established TIM performance targets for Roadway/Incident Clearance Time? 4.1.3.3. Have targets (e.g. time goals) for performance of the two measures?
20a. How is progress measured?* 4.1.3.4.a. How is progress measured?*
21. Is the number of Secondary Crashes being measured utilizing FHWA's standard definition "number of unplanned crashes beginning with the time of detection of the primary crash where a collision occurs either a) within the incident scene or b) within the queue, including the opposite direction, resulting from the original incident"? 4.1.3.5.a. FHWA defines Secondary Incidents as "unplanned incidents beginning with the time of detection of the primary incident where a collision occurs either (a) within the incident scene or (b) within the queue, including the opposite direction, resulting from the original incident." Is your performance measure: consistent with FHWA's definition; or other (describe)?*
21a. If available, what was the estimated number of secondary crashes relative to the total number of crashes considered (total data set) for the prior year?* 4.1.3.5.a If yes, what is your locale’s estimate of the number of secondary incidents relative to total incidents for the prior year (September 1, 2012 to August 31, 2013)?*
21b. If applicable, describe the difference between your definition for number of Secondary Crashes and the standard definition?* NEW
22. How is data for the number of Secondary Crashes collected? NEW
22a. What type of data are used to calculate the number of Secondary Crashes? (Choose the option that best describes your data or provide your own description: all crashes; major crashes only; DOT-involved crashes only; FSP-involved crashes only; or other.)* 4.1.3.5.a. What type of incident data are used to calculate Secondary Incident metrics? (Choose the option that best describes your data or provide your own description: all incidents; major incidents only; DOT-involved incidents only; FSP-involved incidents only; or other.)*
22b. What percentage of crashes is being considered when calculating the number of Secondary Crashes?* 4.1.3.5.a If yes, what is your locale's estimate of the number of secondary incidents relative to total incidents for the prior year?*
23. Has the TIM program established TIM performance targets for a reduction in the number of Secondary Crashes? 4.1.3.3. Have targets (e.g. time goals) for performance of the two measures?
24. Is TIM performance data used to influence and/or improve operations? NEW
24a. Is data being collected on other performance measures by any of the following agencies (check all that apply: Law Enforcement; Fire/Rescue; MPO; DOT; or other)? If yes, please describe.* 4.1.3.1.c. Is data being collected on other performance measures by any of the following agencies (check all that apply: Law Enforcement; Fire/Rescue; MPO; DOT; or other)? If yes, please describe.*




Table A2. Question mapping – tactical.
New Question (Section 2 – Tactical) Previous Question (Section 4.2 – Tactical)
25. Is an Authority Removal Law in place and understood by TIM stakeholders? NEW
26. Is a Driver Removal Law in place and understood by TIM stakeholders? 4.2.1.2. Have "driver removal" laws which require drivers involved in minor crashes (not involving injuries) to move vehicles out of the travel lanes? (Composite score for 4.2.1.2.a and 4.2.1.2.b: a. Is there a "driver removal" law in place? b. Is it communicated to motorists?)
27. What activities are in place to outreach to and educate the public and elected officials about TIM? NEW
27a. Is the Move Over Law enforced?* NEW
27b. In addition to internal agency-specific reporting is information on responder injuries sustained during traffic incident response being recorded in a "responder struck-by database"?* 4.2.2.1.c.1. In addition to internal agency-specific reporting, is information on responder injuries sustained during traffic incident response being recorded in a "responder struck-by database?" *
27c. In addition to internal agency-specific reporting is information on responder fatalities which occur during traffic incident response being recorded in a "responder struck-by database"?* 4.2.2.1.c.2. In addition to internal agency-specific reporting, is information on responder fatalities which occur during traffic incident response being recorded in a "responder struck-by database?"*
27d. If yes to one/both questions above, who maintains the database?* 4.2.2.1.c.3. If yes to either/both questions above, who maintains the database?*
27e. If yes to one/both questions above, how is the struck-by information being reported?* 4.2.2.1.c.4. If yes to either/both questions above, how is the struck-by information being reported?*
28. Is there a Safety Service Patrol program in place for incident and emergency response? 4.2.1.3. Use a Safety Service Patrol for incident and emergency response?
29. What level of coverage does the Safety Service Patrol program provide? NEW
29a. If there is a Safety Service Patrol program, please provide details on lane miles covered, hours of operation, days of operation, services provided, number of vehicles, equipment on vehicles and any operator training.* 4.2.1.3.a. If there is a safety service patrol, please provide details: lane miles covered; hours of operation; days of operation; services provided; number of vehicles; equipment on vehicles; operator training.*
30. Do TIM responders routinely utilize the Incident Command System (ICS), specifically Unified Command (UC), while on scene? 4.2.1.4. Utilize the Incident Command System on-scene?
31. Are temporary traffic control (TTC) devices (e.g., cones, advanced warning signs, etc.) pre-staged in the region to facilitate timely response? 4.2.1.5. Have response equipment pre-staged for timely response?
31a. Are there other types of equipment or resources pre-staged (e.g., crash investigation equipment)?* NEW
32. Do towing and recovery procedures/rotation list policies deploy resources based on type/severity of incident? 4.2.1.6.a. Deployed based on incident type and severity?*
33. Do towing and recovery procedures/rotation list policies include company/operator qualifications, equipment requirements, and/or training requirements? 4.2.1.6. Identify and type resources so that a list of towing and recovery operators (including operator capabilities and special equipment) is available for incident response and clearance?
34. Do towing and recovery procedures/rotation list policies include penalties for non-compliance of response criteria? NEW
35. Is there a policy in place that clearly identifies reportable types and quantities, and appropriate Hazmat response? 4.2.1.7. Identify and type resources so that a list of HazMat contractors (including capabilities and equipment) is available for incident response?
36. Does at least one responding agency have the authority to override the decision to utilize the responsible party's Hazmat contractor and call in other resources? 4.2.1.8. Does at least one responding agency have the authority to override the decision to utilize the responsible party's HazMat contractor and call in other resources?
37. For incidents involving a fatality, is there a procedure in place for early notification and timely response of the Medical Examiner? 4.2.1.9.a. Is there a procedure for early notification of the Medical Examiner?*
38. For incidents involving a fatality, is there a procedure for the removal of the deceased prior to Medical Examiner arrival? 4.2.1.9.b. Is there a procedure for removal of the deceased prior to Medical Examiner arrival?*
39. Are there procedures in place for expedited crash investigations? 4.2.1.10. Are there procedures in place for expedited accident reconstruction/investigation?
39a. What technology is used to support crash investigations?* 4.2.1.10.a. Is the use of technology part of the reconstruction procedures? If yes, what technologies are used?*
40. Is there a procedure in place for removal of abandoned vehicles? 4.2.1.11. Is there a policy in place for removal of abandoned vehicles?
41. Do standardized, documented TIM response procedures/guidelines exist? 4.2.1.12. Is there a Policy and Procedures Manual with standard operating guidelines for responders? 
42. Do TIM responders routinely utilize temporary traffic control devices to provide traffic control for the three incident classifications (minor, intermediate, major) in compliance with the MUTCD? 4.2.2.3. Routinely utilize transportation resources to conduct traffic control procedures for various levels of incidents in compliance with the MUTCD?
43. Do TIM responders routinely utilize traffic control procedures to provide back of traffic queue warning to approaching motorists? 4.2.2.4. Routinely utilize traffic control procedures for the end of the incident traffic queue?
44. Is there a mutually understood procedure/guideline in place for safe vehicle positioning? 4.2.2.5. Have mutually understood equipment staging and emergency lighting procedures on-site to maximize traffic flow past an incident while providing responder safety? (Composite score of 4.2.2.5.a through 4.2.2.5.d; 4.2.2.5.a. Vehicle and equipment staging procedures?)
45. Are there mutually understood procedures/guidelines in place for use of emergency-vehicle lighting? 4.2.2.5.b. Light-shedding procedures?
46. Are TIM responders following high-visibility safety apparel requirements as outlined in the MUTCD? 4.2.2.5.c. PPE used by responders?
46a. Which responders are regularly wearing their high-visibility safety apparel?* 4.2.2.5.c.1. Which responders are using PPE?*



Table A3. Question mapping – support.
New Question (Section 3 – Support)  Previous Question (Section 3 – 4.3 Support)
47. Are TIM stakeholders aware of and actively utilizing Traffic Management Center/Traffic Operations Center resources to coordinate incident detection, notification and response? 4.3.1.1. Does the TIM program use a Traffic Management Center/Traffic Operations Center to coordinate incident detection, notification and response?
48. What TIM data (i.e., number of involved vehicles, number of lanes blocked, length of queue, etc.) is captured via TMCs and/or public safety CAD systems and is it shared with other disciplines for real-time operational purposes? 4.3.1.2. Is there data/video sharing between agencies?
49. Is TIM video captured via TMCs and/or public safety CAD systems and is it shared with other disciplines for real-time operational purposes? 4.3.1.2. Is there data/video sharing between agencies?
50. Are there policies or procedures in place for signal timing changes to support traffic management during incident response? 4.3.1.3. Does the TIM program have specific policies and procedures for traffic management during incident response? (Composite score of 4.3.1.3.a through 4.3.1.3.b; 4.3.1.3.a. Signal timing changes?)
51. Are there pre-planned detour and/or alternate routes identified and shared between TIM stakeholders? 4.3.1.3.b. Pre-planned detour and alternate routes identified and shared between agencies?


APPENDIX B. Summary of 2015 TIM SA Results

Question | Mean Score (Baseline) | Mean Score (2015) | % Change from Baseline | % Scoring 3 or Higher (Baseline) | % Scoring 3 or Higher (2015)
Strategic
1. Is there a formal TIM program that is supported by a multidiscipline, multi-agency team or task force, which meets regularly to discuss and plan for TIM activities? 1.9 2.86 50.40% 28.00% 66.30%
2. Are all disciplines and agencies participating in on-going TIM enhancement activities/efforts? 1.9 2.86 50.40% 28.00% 66.30%
3. Is the importance of TIM understood by all TIM stakeholders and supported by multidiscipline, multi-agency agreements or memorandums of understanding (MOUs)? 1.71 2.58 50.80% 18.00% 47.40%
4. Is agency leadership actively involved in program-level TIM decisions (i.e. policy establishment, training, funding, legislation, etc.)? 1.71 2.73 59.40% 18.00% 62.10%
5. Is there a full-time position within at least one of the participating agencies with responsibility for coordinating the TIM program as their primary job function? 2.28 2.78 21.90% 54.00% 52.60%
6. Are the TIM response roles and responsibilities of public and private sector TIM stakeholders mutually understood? 1.71 3.04 77.90% 18.00% 82.10%
7. Is planning to support TIM activities, including regular needs assessments, done across and among participating agencies? 1.35 2.66 97.30% 12.00% 55.80%
8. Are funds available for TIM activities? 1.71 2.4 41.00% 18.00% 40.00%
9. Is TIM considered and incorporated into planning efforts for construction and work zones? 2.47 3.18 28.70% 35.00% 76.80%
10. Is TIM considered and incorporated into planning efforts for special events such as sporting events, concerts, conventions, etc? 2.47 3.18 28.70% 35.00% 76.80%
11. Is TIM considered and incorporated into planning efforts for weather-related events? 2.47 3.18 28.70% 35.00% 76.80%
12. Have stakeholders in the region participated in a SHRP2 National TIM Responder Training Program, or equivalent, Train-the-Trainer (TtT) session and are they actively training others? 1.26 2.54 101.30% 9.00% 47.40%
13. What percentage (estimated) of TIM responders in the region identified as needing training have received the 4-Hour SHRP2 TIM Responder Training (in-person or via Web-Based Training), or equivalent? 2.82 2.82 - 57.90% 57.90%
14. Is the SHRP2 TIM Responder Training being conducted in a multidiscipline setting? 2.97 2.97 - 66.30% 66.30%
15. Has the SHRP2 TIM Responder Training, or equivalent, been incorporated into the local academy and/or technical college curriculums? 1.77 1.77 - 10.50% 10.50%
16. Does the TIM program conduct multidiscipline, multi-agency after-action reviews (AARs)? 1.62 2.54 57.20% 18.00% 45.30%
17. Is Roadway Clearance Time being measured utilizing FHWA's standard definition "time between first recordable awareness of an incident by a responsible agency and first confirmation that all lanes are available for traffic flow"? 0.64 2.52 293.10% 3.00% 53.70%
18. Is Incident Clearance Time being measured utilizing FHWA's standard definition "time between the first recordable awareness of the incident and the time at which the last responder has left the scene"? 0.64 2.39 273.40% 3.00% 49.50%
19. How is data for Roadway /Incident Clearance Time being collected? 0.64 2.61 307.90% 3.00% 53.70%
20. Has the TIM program established TIM performance targets for Roadway/Incident Clearance Time? 1.16 2.17 86.90% 4.00% 33.70%
21. Is the number of Secondary Crashes being measured utilizing FHWA's standard definition "number of unplanned crashes beginning with the time of detection of the primary crash where a collision occurs either a) within the incident scene or b) within the queue, including the opposite direction, resulting from the original incident"? 1.03 1.87 81.90% 8.00% 31.60%
22. How is data for the number of Secondary Crashes collected? 1.88 1.88 - 29.50% 29.50%
23. Has the TIM program established TIM performance targets for a reduction in the number of Secondary Crashes? 1.16 1.36 17.10% 4.00% 10.50%
24. Is TIM performance data used to influence and/or improve operations? 2.21 2.21 - 35.80% 35.80%
Tactical
25. Is an Authority Removal Law in place and understood by TIM stakeholders? 2.92 3.04 4.20% 67.00% 73.70%
26. Is a Driver Removal Law in place and understood by TIM stakeholders? 3.01 2.85 -5.20% 71.00% 72.60%
27. What activities are in place to outreach to and educate the public and elected officials about TIM? 2.38 2.38 - 46.30% 46.30%
28. Is there a Safety Service Patrol program in place for incident and emergency response? 2.73 3.02 10.50% 67.00% 74.70%
29. What level of coverage does the Safety Service Patrol program provide? 2.73 3.02 10.50% 67.00% 74.70%
30. Do TIM responders routinely utilize the Incident Command System (ICS), specifically Unified Command (UC), while on scene? 2.55 3.11 21.80% 58.00% 78.90%
31. Are temporary traffic control (TTC) devices (e.g., cones, advanced warning signs, etc.) pre-staged in the region to facilitate timely response? 2.21 2.6 17.60% 41.00% 52.60%
32. Do towing and recovery procedures/rotation list policies deploy resources based on type/severity of incident? 3.14 3.14 - 74.70% 74.70%
33. Do towing and recovery procedures/rotation list policies include company/operator qualifications, equipment requirements, and/or training requirements? 2.86 2.84 -0.60% 67.00% 63.20%
34. Do towing and recovery procedures/rotation list policies include penalties for non-compliance of response criteria? 2.49 2.49 - 55.80% 55.80%
35. Is there a policy in place that clearly identifies reportable types and quantities, and appropriate Hazmat response? 2.89 3.21 11.10% 69.00% 83.20%
36. Does at least one responding agency have the authority to override the decision to utilize the responsible party's Hazmat contractor and call in other resources? 3.22 3.34 3.60% 9.00% 82.10%
37. For incidents involving a fatality, is there a procedure in place for early notification and timely response of the Medical Examiner? 2.53 2.47 -2.40% 55.00% 66.30%
38. For incidents involving a fatality, is there a procedure for the removal of the deceased prior to Medical Examiner arrival? 2.53 2.47 -2.40% 55.00% 66.30%
39. Are there procedures in place for expedited crash investigations? 2.59 2.72 4.90% 72.00% 51.60%
40. Is there a procedure in place for removal of abandoned vehicles? 3.47 3.31 -4.70% 91.00% 81.10%
41. Do standardized, documented TIM response procedures/guidelines exist? 2.73 2.73 - 61.10% 61.10%
42. Do TIM responders routinely utilize temporary traffic control devices to provide traffic control for the three incident classifications (minor, intermediate, major) in compliance with the MUTCD? 1.93 2.83 46.70% 27.00% 61.10%
43. Do TIM responders routinely utilize traffic control procedures to provide back of traffic queue warning to approaching motorists? 1.56 2.74 75.40% 17.00% 63.20%
44. Is there a mutually understood procedure/guideline in place for safe vehicle positioning? 1.28 2.94 130.00% 14.00% 63.20%
45. Are there mutually understood procedures/guidelines in place for use of emergency-vehicle lighting? 1.28 2.94 130.00% 14.00% 63.20%
46. Are TIM responders following high-visibility safety apparel requirements as outlined in the MUTCD? 1.28 2.94 130.00% 14.00% 63.20%
Support
47. Are TIM stakeholders aware of and actively utilizing Traffic Management Center/Traffic Operations Center resources to coordinate incident detection, notification and response? 1.98 3.32 67.50% 41.00% 86.30%
48. What TIM data (i.e., number of involved vehicles, number of lanes blocked, length of queue, etc.) is captured via TMCs and/or public safety CAD systems and is it shared with other disciplines for real-time operational purposes? 1.43 2.81 96.50% 10.00% 70.50%
49. Is TIM video captured via TMCs and/or public safety CAD systems and is it shared with other disciplines for real-time operational purposes? 1.43 2.8 95.80% 10.00% 68.40%
50. Are there policies or procedures in place for signal timing changes to support traffic management during incident response? 1.55 2.18 40.60% 18.00% 33.70%
51. Are there pre-planned detour and/or alternate routes identified and shared between TIM stakeholders? 1.55 2.6 67.70% 18.00% 58.90%

1 U.S. Department of Transportation Federal Highway Administration. Organizing for Reliability – Capability Maturity Model Assessment and Implementation Plans Executive Summary (Washington, DC: FHWA, May 2015).

2 Ibid.

3 In both the previous TIM SA scoring schema and the newly revised 2015 scoring schema, scores of 3 and 4 indicate the highest levels of progress for a particular question.

4 U.S. Department of Transportation, Federal Highway Administration. Focus States Initiative Traffic Incident Management Performance Measures Final Report (Washington, DC: FHWA, January 2009).
