Office of Operations
21st Century Operations Using 21st Century Technologies

Process for Establishing, Implementing, and Institutionalizing a Traffic Incident Management Performance Measurement Program

STEP 4: Analyze Data and Report Performance

In this step, traffic incident management (TIM) performance analysis and reporting are discussed. Information on and examples of aggregate and disaggregate analyses; the analysis of TIM performance trends; advanced analysis/visualization of TIM data; performance reports, dashboards, and scorecards; and internal use and reporting of TIM performance are provided here.

Traffic Incident Management Performance Analysis

Aggregate Analysis

An agency can use the TIM data collected in Step 3 to calculate the TIM performance measures for all incidents during a specified time period, which is a good starting point for understanding regional TIM performance at the highest level. Table 3 is a TIM performance measures summary for Florida Department of Transportation (FDOT) District 4 for the week of September 8, 2013.

Table 3. One-week performance measures summary for Florida Department of Transportation District 4.
Measure 52-Week Average Current Week Previous Week
Events included in Performance Measures 86 114 89
A. Notification Duration (min.)1 N/A N/A N/A
B. Verification Duration (min.) 1.1 1.2 1.4
C. Response Duration (min.) 3.8 4.0 4.2
D. Open Roads Duration (min.) 31.4 28.1 35.5
E. Departure Duration (min.) 18.6 17.0 21.4
Roadway Clearance Duration (min.) 36.2 33.2 41.1
Incident Clearance Duration (min.) 54.8 50.2 62.5
(Source: Florida Highway Patrol.)

1Florida Highway Patrol data is not available for Notification Duration.
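A weekly summary like Table 3 is, mechanically, a set of averages taken over all incidents logged in the reporting window. A minimal sketch of that aggregation (field names are hypothetical, not FDOT's actual schema):

```python
from datetime import datetime
from statistics import mean

def aggregate_summary(incidents, start, end):
    """Average each TIM duration (minutes) over all incidents
    detected in [start, end) -- the aggregate view of Table 3."""
    window = [i for i in incidents if start <= i["detected"] < end]
    measures = ("verification", "response", "roadway_clearance", "incident_clearance")
    return {
        "events": len(window),
        **{m: round(mean(i[m] for i in window), 1) for m in measures},
    }

# Illustrative records only (hypothetical field names, not FHP data):
incidents = [
    {"detected": datetime(2013, 9, 9), "verification": 1.0, "response": 4.1,
     "roadway_clearance": 30.0, "incident_clearance": 48.0},
    {"detected": datetime(2013, 9, 10), "verification": 1.4, "response": 3.9,
     "roadway_clearance": 36.4, "incident_clearance": 52.4},
]
summary = aggregate_summary(incidents, datetime(2013, 9, 8), datetime(2013, 9, 15))
# summary["events"] == 2; summary["roadway_clearance"] == 33.2
```

Run over a 52-week window instead of a one-week window, the same function would produce figures like the first column of Table 3.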

Disaggregate Analysis

Considering the range of incident types and incident characteristics, aggregate measures of performance may not always be informative enough. Minor incidents confined to the shoulder can generally be cleared relatively quickly as compared to major incidents that block multiple roadway lanes and/or involve injuries/fatalities. Combining these wide-ranging clearance times into one overall average value results in a loss of understanding as to how the incident characteristics impact incident response and clearance. Instead, calculating the average clearance times for minor and major incidents separately can provide more useful and revealing information about performance that can help TIM programs identify ways to improve. Furthermore, using more refined information on TIM performance can help an agency better demonstrate accountability, process efficiency, and/or program effectiveness.

While Step 1 noted data elements for characterizing TIM performance, the most commonly used include incident type (e.g., crash, noncrash); incident severity/duration/impact; injury severity (e.g., property damage only, fatality); and roadway information (e.g., name). Figure 7 is a table extracted from the November 2014 Southeast Michigan Transportation Operations Center (SEMTOC) Performance Measures Report. This table shows the total incidents, incidents per mile, and average incident clearance time (ICT) reported by freeway. The measures are compared to those from the previous month, as well as those from the same month during the previous year.

This table shows how the average ICT for each roadway can differ greatly from the overall regional average, illustrating the importance of conducting a more refined analysis of performance.

Figure 8 shows the results from an analysis of TIM performance developed from data from the Washington State Department of Transportation (WSDOT). This figure shows the number of responses and the average ICTs by incident type and injury severity for 2002 and 2004. Incident types include collisions, noncollision blocking incidents, and noncollision/nonblocking incidents. Injury severity includes fatality, injury, and noninjury incidents. While the overall average ICTs for 2002 and 2004 are not shown, the wide range of disaggregate ICTs (5 to 271 minutes), as well as the range of responses per incident type (30 to 7,172), makes clear that overall average ICT values would mask the variation presented in these graphs, once again illustrating the importance and value of breaking down TIM performance measures.
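Disaggregation of this kind is a group-by computation: bucket incidents by one or more characteristics, then count and average within each bucket. A sketch, with hypothetical field names:

```python
from collections import defaultdict
from statistics import mean

def clearance_by_group(incidents, *keys):
    """For each combination of the given keys, return
    (number of incidents, average incident clearance time in minutes)."""
    groups = defaultdict(list)
    for inc in incidents:
        groups[tuple(inc[k] for k in keys)].append(inc["ict_min"])
    return {g: (len(v), round(mean(v), 1)) for g, v in groups.items()}

# Illustrative records only (not WSDOT data):
incidents = [
    {"type": "collision", "severity": "injury", "ict_min": 40},
    {"type": "collision", "severity": "noninjury", "ict_min": 20},
    {"type": "collision", "severity": "injury", "ict_min": 60},
]
by_type_sev = clearance_by_group(incidents, "type", "severity")
# by_type_sev[("collision", "injury")] == (2, 50.0)
```

Passing a single key (e.g., just "type") yields the coarser breakdowns, while passing none reproduces the aggregate average.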

Figure 7 is a table extracted from the November 2014 Southeast Michigan Transportation Operations Center (SEMTOC) Performance Measures Report.
Figure 7. Table. Extracted from the November 2014 Michigan Department of Transportation Southeast Michigan Transportation Operations Center performance measures report—incident clearance time by freeway.3
(Source: Michigan Department of Transportation.)

Figure 8 shows the results from an analysis of Traffic Incident Management (TIM) performance developed from data from the Washington State Department of Transportation.
Figure 8. Graph. Washington State Department of Transportation traffic incident management performance by incident type and injury severity.4
(Source: Washington State Department of Transportation.)

Analyzing Trends in Traffic Incident Management Performance

Incident information from New Jersey Department of Transportation's (NJDOT) traffic operations center (TOC) is used to produce monthly reports on the incident management program. The reports provide a range of aggregate performance measures, including monthly and yearly trends in average ICT (as shown in Figure 9). The reports also provide more disaggregate trend analysis by presenting average ICT by major highway (Table 4).

Figure 9 shows monthly and yearly trends in aggregate average Incident Clearance Times (ICT) for New Jersey, calculated from data from the New Jersey Department of Transportation's traffic operations centers.
Figure 9. Graph. Trends in aggregate average incident clearance times, May 2013.
(Source: New Jersey Department of Transportation.)

Table 4. Trends in average incident clearance times by roadway.
Interstate Number of Incidents Average Duration (H:MM) Year to Date Average Duration (H:MM) Duration Monthly Trend Duration Yearly Trend
I-195 11 0:30 0:46 Down arrow Down arrow
I-280 31 0:25 0:31 Up arrow Down arrow
I-287 54 0:53 0:46 Up arrow Up arrow
I-295 97 0:42 0:42 Down arrow Down arrow
I-676 9 0:31 0:23 Up arrow Up arrow
I-76 11 0:24 0:27 Side to side arrow Down arrow
I-78 41 0:39 0:45 Down arrow Down arrow
I-80 78 0:34 0:34 Up arrow Down arrow
I-95 10 0:32 0:38 Down arrow Up arrow
NJ 24 5 0:28 0:42 Up arrow Down arrow
NJ 42 36 0:27 0:26 Up arrow Down arrow
NJ 55 6 0:32 0:41 Up arrow Up arrow
(Source: New Jersey Department of Transportation.)
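The trend-arrow columns in Table 4 can be generated mechanically. A minimal sketch, assuming a hypothetical ±5 percent "steady" band (the report does not state how NJDOT classifies a change as up, down, or steady):

```python
def trend_arrow(current, previous, tolerance=0.05):
    """Classify the change in an average duration: 'up' (worsening),
    'down' (improving), or 'steady' (within the tolerance band)."""
    if previous == 0:
        return "steady"
    change = (current - previous) / previous
    if change > tolerance:
        return "up"
    if change < -tolerance:
        return "down"
    return "steady"

# Illustrative durations in minutes (not NJDOT data):
# trend_arrow(34, 41) -> "down"; trend_arrow(53, 46) -> "up"
```

The same classifier, applied against the previous month or the same month last year, would populate the monthly and yearly trend columns respectively.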

NJDOT shares its performance measures with the State police and the transportation commissioner and staff, as well as the public via presentations, the Governor's dashboard, and the Federal Highway Administration (FHWA).

WSDOT uses the WSDOT Incident Tracking System (WITS) data archive to conduct short-term trend analyses to monitor overall program performance. WSDOT's Gray Notebook is a quarterly performance report on transportation systems, programs, and department management. Its Goals, Performance, and Trends section provides an overview of the key performance indicators for five of six policy goals, one of which is mobility. These trends show the current and previous performance mark for each measure, including average ICT for all incident response (IR) program responses, and indicate which way the program is trending. Figure 10 is an extract from the December 31, 2014 report.5 Figure 11 shows a three-year trend analysis comparing total IR team responses to average ICT before and after expansion of the IR program.6 The trends show a significant drop in average ICT after expansion of the program, as well as a continued downward trend and leveling off of ICT despite a steady and significant increase in the number of responses.

Figure 10 is an extract from the Washington State Department of Transportation's Gray Notebook report, focusing on performance measures that fall under Washington State Department of Transportation's mobility (congestion relief) policy goal.
Figure 10. Graph. Extract from Washington State Department of Transportation's Gray Notebook—Goals, performance and trends, December 31, 2014.
(Source: Washington State Department of Transportation.)

Figure 11 shows a three-year (2002, 2003, and 2004) trend analysis from the Washington State Department of Transportation.
Figure 11. Graph. Analysis of incident response trends using Washington State Department of Transportation's statewide incident tracking system data.
(Source: Washington State Department of Transportation.)

Figure 12 and Figure 13 are graphs generated from annual TIM performance measures provided by the Minnesota Department of Transportation (MnDOT). Figure 12 shows the six-year trend (2008 to 2013) for average roadway clearance time (RCT), and Figure 13 shows the same six-year trend for average ICT. In addition, the performance trends are shown separately by incident type, including crashes, injury crashes, rollovers, spinouts, blocking stalls, and blocking unoccupied stalls, as well as the overall annual performance averages (indicated by the dashed trend lines). Not only do these graphs indicate how overall TIM performance is trending, but they also show where performance stands, and how it is trending, for each incident type relative to the overall average and to the other incident types. This type of information can be useful in identifying whether a specific type of incident needs special attention. For example, while the average RCTs for most incident types have been decreasing or generally holding steady, the average RCTs for spinouts gradually increased over the six-year period. Armed with this information, the TIM partners could look for ways to improve RCT for these types of incidents.

Figure 12 shows the average roadway clearance times (RCT) from the Minnesota Department of Transportation (MnDOT) in the years 2008 through 2013 for six different types of incidents, in addition to the overall average across all incident types.
Figure 12. Graph. Minnesota Department of Transportation traffic incident management performance six-year trend—average roadway clearance time by incident type.
(Source: Minnesota Department of Transportation.)

Figure 13 shows the average incident clearance times (ICT) from the Minnesota Department of Transportation (MnDOT) in the years 2008 through 2013 for six different types of incidents, in addition to the overall average across all incident types.
Figure 13. Graph. Minnesota Department of Transportation traffic incident management performance six-year trend—average incident clearance time by incident type.
(Source: Minnesota Department of Transportation.)

Advanced Analysis/Visualization of Traffic Incident Management Data


Real-time analysis

In addition to weekly and quarterly performance reports on regional TIM, the Virginia Department of Transportation (VDOT) is developing real-time analysis tools and reporting capabilities for the TIM program. These include showing the current impacts of an incident, the consequences of extended lane closures, congestion impacts, detour options, etc. VDOT hopes that, once developed, these real-time tools and capabilities will aid in the decision-making process.

Mining of archived snapshots

The Freeway and Arterial System of Transportation (FAST) recently began generating a "30-60-90" RCT calculation for the Nevada Department of Transportation (NDOT) using the following categorization of incidents:

  • An incident meets the 30-minute roadway clearance criterion if it involves no injuries and it is removed from the travel lanes in 30 minutes or less.
  • An incident meets the 60-minute roadway clearance criterion if injuries are involved and it is removed from the travel lanes in 60 minutes or less.
  • An incident meets the 90-minute roadway clearance criterion if it involves a fatality and it is cleared in less than 90 minutes.
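FAST's three criteria translate directly into a short classification routine. A sketch that preserves the thresholds as stated in the report (note the inclusive 30- and 60-minute bounds versus the strict "less than 90 minutes"); the field names are hypothetical:

```python
def rct_criterion(injury, fatality, rct_min):
    """Return the 30-60-90 criterion an incident is judged against and
    whether it was met, per FAST's categorization for NDOT."""
    if fatality:
        # Fatality: cleared in less than 90 minutes (strict bound as stated).
        return ("90-minute", rct_min < 90)
    if injury:
        # Injury: removed from the travel lanes in 60 minutes or less.
        return ("60-minute", rct_min <= 60)
    # No injuries: removed from the travel lanes in 30 minutes or less.
    return ("30-minute", rct_min <= 30)

# A noninjury incident cleared from the travel lanes in 25 minutes:
# rct_criterion(False, False, 25) -> ("30-minute", True)
```

The injury flag here corresponds to the injury/ambulance check box that FAST added to the TMC incident screen, described below.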

To aid with these calculations, FAST added a check box on the traffic management center (TMC) incident screen for operators to indicate when an injury/ambulance is involved. In addition, FAST archives closed-circuit television (CCTV) snapshot images taken during the incident timeframe at the incident location, as well as of adjacent roadway segments, such as ramps or arterial streets (Figure 14). By reviewing these snapshots (the animation plays at 15-second intervals), analysts can examine the impacts of the incidents on the roadways (which lanes are blocked/cleared and when) to obtain additional details. FAST makes use of these snapshot archives to help generate reports to NDOT.

Screenshot of the Freeway Arterial System of Transportation (FAST) snapshot archiving function, in which closed-circuit television snapshot images taken at the incident location and adjacent roadway segments are archived in the FAST interface.
Figure 14. Screenshot. Snapshot archiving.
(Source: Freeway and Arterial System of Transportation.)

Heat Maps

A heat map is a graphical representation of data in which the individual values contained in a matrix are represented as colors. Heat maps can be used to show the impact of an incident (and the associated response and clearance times) on congestion (e.g., speeds, density, delay). An example heat map from FAST is shown in Figure 15. In this heat map, the colors represent the average speed along one corridor over a 24-hour period. The plot illustrates how heat maps could aid in determining whether a crash is a secondary crash, an approach that FAST favors over having TMC operators make the determination at the time of the crash. Another example of a heat map representing the effects of an incident is provided in Figure 16. This heat map is part of the Incident Timeline Tool developed by the University of Maryland Center for Advanced Transportation Technology (UMD-CATT).
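Underlying a speed heat map is a matrix of average speeds indexed by roadway segment and time-of-day bin. A minimal sketch of building that matrix from individual speed observations (the rendering step, e.g., with matplotlib's pcolormesh, is omitted; the data are illustrative, not FAST's):

```python
import numpy as np

def speed_heat_matrix(obs, n_segments, n_bins):
    """Average speed per (segment, time-bin) cell; NaN where no data.
    obs: iterable of (segment_index, bin_index, speed_mph) tuples."""
    total = np.zeros((n_segments, n_bins))
    count = np.zeros((n_segments, n_bins))
    for seg, t, speed in obs:
        total[seg, t] += speed
        count[seg, t] += 1
    # Divide only where observations exist; empty cells become NaN.
    return np.where(count > 0, total / np.maximum(count, 1), np.nan)

obs = [(0, 0, 60.0), (0, 0, 50.0), (1, 3, 20.0)]
m = speed_heat_matrix(obs, n_segments=2, n_bins=4)
# m[0, 0] == 55.0; m[1, 3] == 20.0; unobserved cells are NaN
```

A sharp drop in one cell relative to its neighbors in time and space is the visual signature of an incident; a second drop forming within the first one's queue is what an analyst would flag as a possible secondary crash.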

A heat map of average speeds along a freeway, in a single direction of travel showing congestion throughout the entire freeway segment during the morning rush hour period and at one end of the freeway segment during the afternoon.
Figure 15. Heat Map. Heat map of average speed.
(Source: Freeway and Arterial System of Transportation.)

A heat map showing the impact of an incident on travel speeds on a particular segment of freeway over time.
Figure 16. Heat Map. Heat map illustrating the impact of an incident.
(Source: University of Maryland Center for Advanced Transportation Technology.)

Traffic Incident Management Performance Reporting

The systematic, ongoing performance measurement process involves collecting and analyzing data to determine if organizational objectives have been met, and then using the information internally to make strategic and tactical decisions, as well as reporting the findings to stakeholders and customers.

Performance reports

Weekly, monthly, quarterly, and/or annual performance reports, developed and available for various audiences, are a common way of reporting TIM performance. Good examples include:

Michigan Department of Transportation's (MDOT) Monthly Performance Measures Reports for the West Michigan TOC and the Southeast Michigan TOC are archived and available to the public via MDOT's Web site. Figure 17 is an extract from the West Michigan TOC report for October 2014.7 The graph on the left presents the aggregate analysis (overall averages) of roadway clearance time and ICT; the graph on the right breaks down the number of incidents by incident severity/duration (to give further context to the average clearance times); and the number and percentage of secondary crashes is noted at the bottom. In addition to the monthly reports, MDOT produces an annual Performance Measures Report, which essentially rolls up the monthly reports to cover the entire year's activities and performance.

The Gray Notebook is the WSDOT's quarterly accountability report, providing the latest information on system performance and project delivery.

Through Tennessee Department of Transportation's (TDOT) Locate/IM database, each regional system has the capability to produce a quarterly report on traffic incidents and HELP Truck activity, including total incidents, events affecting traffic, clearance times, and the types of service provided by the HELP patrol in each region. The system also allows for a statewide quarterly report to be generated, combining each region's information.

Figure 17 is comprised of two related figures depicting traffic incident management (TIM) performance measures from Michigan, shown side by side.
Figure 17. Graph. Extract from West Michigan traffic operations center's October 2014 performance measures report.
(Source: Michigan Department of Transportation.)

Dashboards

Dashboards are another way of reporting TIM performance and are generally geared more toward the public. VDOT's online dashboard presents a variety of performance data, including incident durations. Figure 18 is a screenshot of the incident duration page of the dashboard (Virginia DOT, accessed April 2015). While this page presents the aggregate analysis of incident duration for all incidents statewide over the past three months, the data can be filtered for a more disaggregate analysis by district, incident severity, incident type, and various time frames. At the bottom of the page, the user can choose to view incident details, including individual ICTs, or to display trends in average clearance times over the past few months. VDOT is currently in the process of revamping its dashboard.

Figure 18 is a screenshot of the Virginia Department of Transportation's (VDOT) dashboard for reporting traffic incident management (TIM) information to the public—specifically the incident duration aspect of the dashboard.
Figure 18. Screenshot. Virginia Department of Transportation incident clearance times (3 months).
(Source: Virginia Department of Transportation.)

Scorecards

The Wisconsin Department of Transportation's (WisDOT) Performance Improvement program focuses on the core goal areas of Mobility, Accountability, Preservation, Safety, and Service (MAPSS). A quarterly MAPSS Performance Improvement Report summarizes the progress of selected performance measures to show the current state of Wisconsin's transportation system. The report highlights these measures, deemed of highest importance to WisDOT customers, on a two-page scorecard and then details the progress of each measure in the body of the report. The department also maintains interactive Web pages within each core goal area for customers interested in "drilling down" into the data (Figure 19). One of the scorecard measures is the average incident clearance time for "extended duration incidents" (EDI), defined as incidents that close one direction of an interstate for two hours or more, or both directions for 30 minutes or more.8

Figure 19 is a screenshot of the Wisconsin Department of Transportation's (WisDOT) interactive web page for its incident response goal area.
Figure 19. Screenshot. Screenshot of Wisconsin Department of Transportation's interactive Web page for incident response goal area.
(Source: Wisconsin Department of Transportation.)

Internal Use and Reporting

Reporting requirements and public transparency are only some of the reasons for collecting data and analyzing TIM performance. Another primary reason for collecting and analyzing TIM data is to make strategic and tactical, data-driven decisions that improve TIM program performance.

The TDOT maintains an internal performance goal to open travel lanes within 90 minutes for 94 percent of all incidents. TDOT collects RCT data and tracks performance to ensure that it is meeting that goal. If the goal is not being met, TDOT works to determine what needs to be done to improve performance. Past examples of improvements include training and expanded HELP coverage areas. In addition, secondary crashes are recorded by both TDOT (via the TMCs) and the State police (via the TITAN database's electronic crash reports). Having data on secondary crashes has allowed Tennessee to identify serious secondary crashes that have occurred in the queue of a primary incident. As a result, TDOT developed a "queue protection" program to minimize secondary crashes. The program involves deploying equipment (e.g., trucks, arrow boards) and trained personnel to help protect queues that develop as a result of incidents. This program has been in operation for about two years; and while TDOT does not have historical data with which to compare, preliminary data suggest a 20- to 30-percent reduction in secondary crashes over the past year.
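Tracking a goal like TDOT's reduces to computing the share of incidents whose roadway clearance time falls within the threshold. A sketch with illustrative numbers (not TDOT data):

```python
def percent_meeting_goal(rct_minutes, threshold_min=90):
    """Percent of incidents whose travel lanes were reopened
    within the threshold (TDOT's internal target is 94 percent)."""
    met = sum(1 for t in rct_minutes if t <= threshold_min)
    return 100.0 * met / len(rct_minutes)

rcts = [12, 45, 88, 90, 95, 130, 20, 15, 60, 75]
pct = percent_meeting_goal(rcts)
# pct == 80.0, which would fall short of a 94 percent target
```

Run each reporting period, a shortfall in this single number is the trigger for the kind of diagnostic work TDOT describes (training, expanded HELP coverage).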

In early 2011, there was a major policy revision in Arizona requiring police officers to move vehicles completely off the roadway (away from view) during incidents. The Arizona Department of Public Safety (AZDPS) used performance measures before and after this policy change to determine if the policy had an impact on TIM performance. Table 5 compares the average RCT and ICT, by injury severity, for crashes that occurred between October and December 2010 (prior to the policy change) and four years later between October and December 2014 (after the policy change). For noninjury and injury incidents, average clearance times decreased after implementation of the policy, suggesting that the change was effective at reducing the clearance times of these crashes, particularly the noninjury crashes. For fatal crashes, however, clearance times actually increased, suggesting that the policy change had no impact on these severe and highly sensitive crashes.

Table 5. Arizona Department of Public Safety Metropolitan Phoenix traffic incident management performance between October-December 2010 and October-December 2014.
Injury Category Performance Measure October-December 2010 Performance (min.) October-December 2014 Performance (min.) Percent Change
Non-injury RCT 45 9 -80%
Non-injury ICT 84 34 -60%
Injury RCT 54 23 -54%
Injury ICT 94 54 -43%
Fatal RCT 212 267 +26%
Fatal ICT 214 282 +32%
(Source: Arizona Department of Public Safety.)

Note: RCT is roadway clearance time, ICT is incident clearance time.
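The Percent Change column is the signed relative change between the two periods. Applied to the rounded averages shown, a simple calculation reproduces most entries (small mismatches can occur because the published averages are themselves rounded):

```python
def percent_change(before, after):
    """Signed percent change between two average durations,
    rounded to the nearest whole percent as in Table 5."""
    return round(100.0 * (after - before) / before)

# Checking a few rows of Table 5:
# noninjury RCT: percent_change(45, 9)   -> -80
# noninjury ICT: percent_change(84, 34)  -> -60
# fatal RCT:     percent_change(212, 267) -> 26
```

The sign convention matters for interpretation: negative values indicate shorter (improved) clearance times, positive values longer ones.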

The AZDPS uses secondary crash data in the agency's strategic plan. The Commander tracks the percentage of secondary crashes over time; if the numbers start increasing, it is the Commander's role to determine ways to reduce them. In addition, AZDPS is using the data to better manage its resources on the roads. For example, the agency was able to reduce or eliminate recurring crashes in one location by strategically placing officers near the site. By knowing where and when incidents tend to occur (as well as the type of incidents), AZDPS staged its resources to reduce response times (drive times, time to deploy tow trucks). AZDPS started this program to get the supervisors involved in using the data and understanding how they could influence how their officers patrol. The next step is to look at response and clearance times.

The WisDOT has set a departmental goal to reduce the length of time traffic flow is disrupted by EDIs on the interstates—those that close an interstate for more than two hours in one direction or for more than 30 minutes in both directions. The target for Extended Duration Incidents (EDI) clearance is four hours or less. WisDOT monitors and records all major incidents, and then conducts an After Action Review (AAR) to help identify strengths, weaknesses, opportunities, and threats associated with clearance activities. An EDI workgroup has been formed to analyze all facets of the process to identify areas for improvement.
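WisDOT's EDI definition and clearance target translate into two simple predicates. A sketch; the report states the closure thresholds slightly differently in two places ("two hours or more" versus "more than two hours"), so the inclusive bounds used here are an assumption:

```python
def is_extended_duration(one_dir_closure_min, both_dir_closure_min):
    """WisDOT extended duration incident (EDI): closes one direction of
    an interstate for 2+ hours, or both directions for 30+ minutes."""
    return one_dir_closure_min >= 120 or both_dir_closure_min >= 30

def edi_meets_target(clearance_min, target_min=240):
    """WisDOT's EDI clearance target is four hours or less."""
    return clearance_min <= target_min

# A 150-minute one-direction closure qualifies as an EDI:
# is_extended_duration(150, 0) -> True
# If cleared in 200 minutes, it meets the 4-hour target:
# edi_meets_target(200) -> True
```

Incidents flagged this way are the ones that feed WisDOT's After Action Reviews and the EDI workgroup's analysis.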


3Southeast Michigan Transportation Operations Center 2014 November Monthly Performance Measures, Michigan DOT, http://www.michigan.gov/documents/mdot/2014-09_Grand_Performance_Report_472027_7.pdf.

4How WSDOT Incident Tracking System Leverages Archived Data, presented at the ITS America Annual Meeting, May 2, 2005.

5The Gray Notebook, Washington State DOT, GNB Edition 56, December 31, 2014.

6How WSDOT Incident Tracking System Leverages Archived Data, presented at the ITS America Annual Meeting, May 2, 2005.

7West Michigan Transportation Operations Center 2014 September Monthly Performance Measures, Michigan DOT.

8Wisconsin DOT, accessed April 2015.