2016 Traffic Incident Management National Analysis Report

Executive Summary

November 2016
Office of Operations

Background

For nearly 15 years the Federal Highway Administration (FHWA) has used the Traffic Incident Management Self-Assessment (TIM SA) to evaluate the state of practice in traffic incident management across the country. Originally developed by FHWA in 2002, the TIM SA is used by State and local TIM program managers to benchmark and evaluate TIM program success and areas for improvement annually. A number of revisions to the TIM SA have been implemented over time to reflect changes in TIM practice. The most recent of these revisions occurred in 2015 to align the TIM SA with the Capability Maturity Framework (CMF) and included significant changes to the assessment.
The combined impact of these changes resulted in the 2015 national TIM SA score being slightly lower (9.5 percent) than the 2014 national TIM SA score. However, the impact of these changes should be muted in the 2016 TIM SA and subsequent years, and the incremental year-over-year changes should return the national TIM SA score to an upward trend.

2016 Traffic Incident Management Self-Assessment Results

In 2016 a total of 94 locations completed a TIM SA for inclusion in the national analysis, down one from 2015. The 51 scored questions contained within the TIM SA are grouped into three sections: Strategic, Tactical, and Support. The initial assessments completed in 2003, 2004, and one in 2005 (78 in total) continue to be used as the baseline scores; however, the scores were recalibrated in 2015 as a result of the significant revisions to the TIM SA described above. Table 1 shows the average score for each of the three TIM SA sections from the Baseline and 2016, along with the percentage change from the Baseline.
The 2016 overall TIM SA score was 68.6 percent (out of a possible 100 percent), representing a 35.3 percent increase over the Baseline. The TIM SA mean scores tended to be higher in larger metropolitan areas than in smaller areas. Specifically, mean scores were calculated separately for the top 40 metropolitan areas (by population), the top 75 metropolitan areas, and non-top 75 metropolitan areas.
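As a check on how the percent-increase figure relates to the underlying scores, the Baseline overall score implied by the two numbers above can be back-calculated (a sketch in Python using only the figures stated in this report):

```python
# Back-calculate the Baseline overall score implied by the 2016 figures:
# a 2016 score of 68.6 percent representing a 35.3 percent increase.
score_2016 = 68.6
increase_pct = 35.3

baseline = score_2016 / (1 + increase_pct / 100)
print(f"Implied Baseline overall score: {baseline:.1f} percent")  # 50.7
```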
As described above, the significant revisions implemented in 2015 resulted in an overall decrease in the national score from 2014 to 2015 (down 9.5 percent). The first year post-revision saw an increase in the overall national score of 1.5 percentage points (2.2 percent), from 67.1 to 68.6 percent. A listing of all 51 TIM SA questions, their respective Baseline and 2016 scores, and the percentage of programs scoring each question 3 or higher[1] can be found in Appendix A.

Strategic

The 24 questions in the Strategic section are grouped into three subsections: Formal TIM Programs, TIM Training and After Action Reports, and TIM Performance Measures. As part of the 2015 TIM SA revisions, a number of new scored questions were added to the Strategic section on the National TIM Responder Training Course and on the data used in calculating TIM Performance Measures (TIM PM). The consistently low scores in the TIM Performance Measures subsection have resulted in the Strategic section annually receiving the lowest score in the TIM SA. This was the case once again in 2016, with the questions in the Strategic section achieving a score of 63.9 percent. However, this does represent a 50.6 percent increase over the Baseline. The questions on the National TIM Responder Training (including three new scored questions added in 2015) all realized increased scores in 2016, reflecting FHWA's continued and expanded deployment of the training. As shown in Table 2, Questions #12-14 received high average scores, with a high percentage of TIM SA respondents scoring each question 3 or higher. While Question #15 achieved a lower score than the other TIM training questions, it also saw improvement in the average score from 2015.
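The distinction drawn above between percentage points and percent can be made concrete with a short calculation (a sketch; both figures come from the report):

```python
# Percentage-point vs. percent change for the national TIM SA score
# (figures from the report: 67.1 percent in 2015, 68.6 percent in 2016).
score_2015 = 67.1
score_2016 = 68.6

point_change = score_2016 - score_2015            # absolute, in points
percent_change = point_change / score_2015 * 100  # relative to 2015

print(f"{point_change:.1f} percentage points")    # 1.5 percentage points
print(f"{percent_change:.1f} percent")            # 2.2 percent
```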
The impact of the widespread deployment of the National TIM Responder Training by FHWA is also being realized in scores elsewhere in the TIM SA. Among the top five scoring questions overall on the 2016 TIM SA is Question #6, "Are the TIM response roles and responsibilities of public and private sector TIM stakeholders mutually understood?" The question received an average score of 3.22, and 91.5 percent of participating locations scored this question 3 or higher. The scoring guidance for Question #6 provides the following description:

Score 4 if: TIM roles and responsibilities are mutually understood by the majority of public and private sector disciplines. Roles and responsibilities are clearly documented with multidiscipline agreements, policies, or manuals. There is strong recognition that each discipline has a job to do and that safe, quick clearance is a priority for all. Routine multidiscipline training and exercises reinforce the importance of working as a team.

Using the guidance provided, 31 percent of the TIM SA respondents scored their programs 4 on Question #6. The companion non-scored supplemental question (#6a) asks respondents to describe how the roles and responsibilities of public and private sector TIM stakeholders are communicated. A majority of those providing supplemental responses point to the Strategic Highway Research Program 2 (SHRP2) National TIM Responder Training. This is further corroborated by examining locations that scored their programs high (3 or 4) on Question #6 and on the TIM training questions; 36 percent of TIM SA respondents scored their programs 3 or 4 on Questions #6, #12, and #14.

Significant progress has been made in the area of TIM Performance Measurement over the past decade, and the scores in the TIM PM subsection reflect that progress.
Scores for both Roadway Clearance (RC) and Incident Clearance (IC) indicate that an increasing number of locations around the country are measuring both TIM PMs using the FHWA definitions and that the data is being used to impact operations. However, average scores for the third TIM PM, secondary crashes, are the lowest on the 2016 TIM SA. Only three questions scored a 2 or less on the 2016 TIM SA, and all three were questions on secondary crashes (Table 3).
Slightly less than half (47.9 percent) of TIM SA respondents scored their program a 1 on Question #21, indicating that secondary crashes are typically not measured. Of the remaining locations that scored their program a 2 or higher, only 24 provided secondary incident data. Those locations reported that, on average, secondary incidents comprised 12.1 percent of all incidents, an increase from the 8.1 percent reported in 2015. However, caution should be taken in interpreting changes in the percentage of incidents reported as secondary, given the varied levels of use of FHWA's secondary incident definition. The comments provided by TIM SA respondents to the questions on secondary crashes indicate that while their definition of a secondary incident may match FHWA's, there is limited data collection and analysis on those incidents. Similar to last year, some TIM SA respondents indicated that their program is currently either developing methods for collecting secondary crash data or revising current accident reporting systems to include secondary crash data, in the hopes of including this metric in the TIM SA in future years.

Another output of the TIM SA is the TIM Performance Measures (PM) Database, which is populated annually based on responses to the TIM SA. Information on the three key PM metrics – Roadway Clearance Time (RCT), Incident Clearance Time (ICT), and secondary crashes – is tracked annually and compared to a Baseline (2011) level. Average RCT decreased to 47.79 minutes in 2016, down 25.1 percent from the 63.80 minutes reported in 2015. Average ICT decreased by 13.3 percent from 2015 to 2016 (61.53 minutes in 2015 versus 53.36 minutes in 2016). However, one challenge with the TIM PM Database is the lack of consistent data provision by TIM SA participants. The TIM PM data requested is part of the non-scored, supplemental data, and therefore the set of locations submitting it varies each year.
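The year-over-year decreases reported for the two clearance-time measures follow the standard percent-change formula; a minimal sketch using the minute values above:

```python
# Year-over-year percent decrease in the two clearance-time measures
# (minute values from the TIM PM Database figures in this report).
def pct_decrease(prev: float, curr: float) -> float:
    """Relative decrease from prev to curr, in percent."""
    return (prev - curr) / prev * 100

rct = pct_decrease(63.80, 47.79)  # Roadway Clearance Time, minutes
ict = pct_decrease(61.53, 53.36)  # Incident Clearance Time, minutes

print(f"RCT: down {rct:.1f} percent")  # 25.1
print(f"ICT: down {ict:.1f} percent")  # 13.3
```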
The TIM programs that achieved the highest scores in the Strategic section are listed alphabetically in Table 4. Jurisdictions with low scores may wish to reach out to these locations for information on best practices.
Tactical

The 22 questions in the Tactical section focus on three areas.
The Tactical section remains the highest scoring of the three TIM SA sections, achieving an overall score of 72.8 percent. Three of the five highest scoring questions on the 2016 TIM SA are in the Tactical section, as part of the Policies and Procedures subsection (Table 5).
High scores in this area can be attributed, in part, to the National TIM Responder Training, which emphasizes the need for policies and procedures that provide for responder and motorist safety and quick clearance. Question #30 queries TIM SA respondents on the use of the Incident Command System (ICS) while on scene. With an average score of 3.16 in 2016 and 85.1 percent of locations scoring this question 3 or higher, it is evident that use of ICS is widespread. Lesson #6 of the National TIM Training Program focuses on Command Responsibilities, including ICS and Unified Command (UC), and the high score here may be attributable, in part, to the large number of responders participating in the national TIM training.

There are two questions in the TIM SA that query respondents on Safety Service Patrols (#28 and #29). The first asks about the existence of a Safety Service Patrol, and the second asks respondents to score the Safety Service Patrol's level of coverage. Nearly half (48.9 percent) of respondents scored both questions 3 or 4 (with 31 percent scoring both questions 4), meaning that across the country there are a large number of Safety Service Patrols ranging from mid-level to full-function. The services these patrols provide range from motorist assistance to incident response and clearance, emergency traffic control, and scene management. Their fleets range from medium fleets providing service on most major roadways to fleets large enough to provide ample coverage on all major roadways. Sixty-seven percent of the 2016 TIM SA respondents provided information on levels of coverage, reporting combined Safety Service Patrol coverage of 10,740 centerline miles and 12,419 lane miles (some programs reported centerline miles, others lane miles). The median centerline-mile coverage reported by 2016 TIM SA respondents was 106, and the median lane-mile coverage was 369.
The TIM programs that achieved the highest scores in the Tactical section are listed alphabetically in Table 6. Jurisdictions with low scores may wish to reach out to these locations for information on best practices.
Support

The questions in the Support section focus on the tools and technologies enabling improved incident detection, response, and clearance. The 2015 TIM SA revision removed questions on Traveler Information, returning the emphasis to the infrastructure and activities that enable incident information exchange between TIM program stakeholders. This allows programs to rate their progress on items over which their TIM program has control, and it aligns the Support section with one of the three key objectives of the National Unified Goal for Traffic Incident Management: prompt, reliable, interoperable communications. The five questions in the Support section all address TIM data sharing and integration among TIM stakeholders. The highest scoring question in the Support section was Question #47, which received an average score of 3.37, the highest of any question on the 2016 TIM SA:

47. Are TIM stakeholders aware of and actively utilizing Traffic Management Center/Traffic Operations Center resources to coordinate incident detection, notification and response?

The questions on data and video sharing between agencies provide greater granularity on the level of data and video sharing. While the two questions achieved nearly identical average scores, the TIM data question (as opposed to video) had a higher percentage of TIM SA respondents scoring their program 3 or 4 on Question #48 (Table 7).
The Support section had the second highest overall score (69.7 percent) and the largest increase over Baseline of the three sections (75.7 percent). The TIM programs that achieved the highest scores in the Support section are listed alphabetically in Table 8. Jurisdictions with low scores may wish to reach out to these locations for information on best practices.
Summary

A total of 94 TIM SAs were completed in 2016, with an average overall score of 68.6 percent (out of a possible 100 percent). Overall scores were up 35.3 percent over the Baseline scores. The TIM SA mean scores tended to be higher in larger metropolitan areas than in smaller areas. Specifically, mean scores were calculated separately for the top 40 metropolitan areas (by population), the top 75 metropolitan areas, and non-top 75 metropolitan areas.
The highest scores were achieved in the Tactical section (72.8 percent), and the largest percentage increase in scores from the Baseline was in Support (75.7 percent). Low scoring questions and those with the least improvement over Baseline indicate specific program areas where additional guidance from FHWA may be warranted. The lowest scoring questions on this year's TIM SA, as described above, were all on secondary crashes. Data on secondary crashes, and on the larger suite of TIM Performance Measures, is a key focus of FHWA's Every Day Counts (EDC-4) initiative for 2017-2018,[2] and as such, scores in this area should increase in the coming years. Another indicator of potential focus areas for FHWA is those questions that experienced a small change over Baseline. Fifteen questions on this year's TIM SA experienced changes over Baseline of less than 10 percent. However, six of those questions had average scores of 3 or more, leaving less room for improvement.[3] These are areas where TIM programs have consistently scored well and continue to do so, including towing and recovery and hazmat response procedures. Another area with lower scores (<3) and little improvement over Baseline (<10 percent) is in the Tactical section and deals with response procedures when the incident involves a fatality:

37. For incidents involving a fatality, is there a procedure in place for early notification and timely response of the Medical Examiner?

38. For incidents involving a fatality, is there a procedure for the removal of the deceased prior to Medical Examiner arrival?

Scores for Questions #37 and #38 are averaged (composite question); in 2016, the average score was 2.59, representing a 2.2 percent increase over Baseline. Additionally, only 56.9 percent of responding locations scored this question 3 or higher. This is an area that could receive additional emphasis and instruction in the National TIM Responder Training course as part of Lesson #8 on Special Circumstances.

APPENDIX A.
Summary of 2016 Traffic Incident Management (TIM) Self-Assessment (SA) Results
Notes

1. In both the previous TIM SA scoring schema and the newly revised scoring schema (implemented in 2015), scores of 3 and 4 indicate the highest levels of progress for a particular question.

2. U.S. Department of Transportation, Federal Highway Administration. Using Data to Improve Traffic Incident Management. Available online at: https://www.fhwa.dot.gov/innovation/everydaycounts/edc_4/timdata.cfm

3. The questions with high scores (3+) but little (<10 percent) increase over Baseline are Questions #14, #25, #32, #33, #34, and #36 (see Appendix A).