Emergency Transportation Operations

Section 3. Survey of Incident Responders

A survey instrument was developed to obtain information on how transportation, law enforcement, fire, and EMS/rescue agencies measure and report incident management performance measures in their jurisdiction. The survey instrument solicited information related to the following issues:

  • How incidents are defined by agencies in their jurisdiction;
  • How information about incidents is tracked and recorded;
  • What, if any, measures they are collecting, calculating, or recording regarding incidents;
  • What the costs of collecting, processing, and reporting the measures and source data are;
  • If agencies are not using any measures, why not;
  • If they are planning to implement measures, why, when, and how;
  • How each measure is defined and calculated or measured;
  • How the measures were decided upon and by whom;
  • How long performance measure data have been collected and calculated;
  • To whom the measures are reported, and how often;
  • With whom the measures are shared;
  • What the recipients do with the measures;
  • What decisions are made based on or are influenced by the measures;
  • How the recipients feel about the measures (i.e., are they meaningful, are they timely, and do they provide the information necessary for effective decision-making);
  • The types of data collected about incidents, and the sources of the data;
  • Whether similar data exists from other sources (especially other incident management partner agencies), whether the data from the different sources are compared to one another, and any findings from the comparison;
  • What issues exist regarding measuring incident management performance, and how they have been dealt with;
  • What the best candidate measures are, whether or not the agency is currently recording measures.

Methodology

TTI used a telephone-interview format to collect the information from the different transportation, law enforcement, fire, and EMS/rescue agencies. A series of questions was developed that represented the basic level of information to be obtained from each agency. A copy of the survey document is contained in Appendix D.

A pilot test of the survey instrument was performed prior to conducting the actual survey. The purpose of the pilot test was to verify that the wording of the questions was clear and concise, to fine-tune the data collection methodology, and to assess whether the questions provided meaningful responses. Based on the results of the pilot test, the survey document was revised slightly to clarify some of the questions.

To conduct the survey, members of the research team initially contacted each of the identified individuals by telephone to request their participation in the survey. During this initial contact, the researcher arranged a convenient day and time to conduct the survey or identified alternative contacts. The researcher also obtained a mailing address or e-mail address to which the survey questions could be sent, and forwarded the questions to the respondent before conducting the survey so that the respondent would have adequate time to prepare his or her responses.

At the scheduled day and time, the researcher contacted the survey respondent by telephone and administered the survey, documenting the respondent's answers and asking probing questions to clarify the response to each survey question where needed. The responses were then coded into a spreadsheet to aid in analysis. This spreadsheet has been provided to FHWA under a separate deliverable.

Response Rate

A total of 54 individuals from 30 locations were identified as potential respondents to the survey. These individuals were identified from the following sources:

  • The IEEE Incident Management Working Group,
  • The ITE Traffic Incident Management Committee,
  • The TRB Freeway Operations Committee,
  • Personal contacts, and
  • Internet searches of functioning traffic management centers.

A total of 23 individuals from 19 locations actually participated in the survey. The remainder of the individuals originally identified either did not reply to initial inquiries about participating in the survey, elected not to participate in the survey, or indicated that they did not have an active incident management program in their area.

TTI planned to use representatives from the transportation agencies to identify appropriate individuals to survey in the law enforcement and emergency service agencies. One problem with this approach was that respondents were often unwilling to provide contact information for representatives of other agencies responsible for incident management, either because they did not know the correct person at the appropriate level or because they did not want to increase those individuals' workload by asking them to respond to the survey. Therefore, most of the insight into the emergency services perspective was obtained through the literature and a limited number of survey responses.

Findings

Definition of Incident

Most of the transportation agencies surveyed agree with the TMDD definition of an incident. Most agencies define an incident as any unexpected event that causes a temporary reduction in capacity. The term "temporary" is an important modifier because it implies that after the agency performs some type of initial operation or response (i.e., clearing wrecked vehicles from the travel lanes, removing a spilled load, etc.) the roadway can be reopened and normal capacity can be resumed. For the most part, transportation agencies do not view highway maintenance and reconstruction projects or non-emergency events as incidents, generally because these events have planned means of accommodating traffic flow.

Most transportation agencies do not consider the long-range effects of an incident as part of the initial incident. For example, most transportation agencies would not consider the repair of a collapsed bridge deck, or the removal of spilled cargo that has been pushed beyond the shoulder area as part of an incident, even though an event that they would describe as an incident was the primary cause of the loss of capacity. This is especially true when recovery efforts extend over multiple days. Most transportation agencies tend to classify incident events as being over once the initial response to the incident event has left the scene and when more traditional traffic control (i.e., work zone type traffic control) has been established at the scene.

Interestingly, many transportation agencies also classify unexpected weather events (particularly snow and ice) as an "incident," because they typically cause temporary reductions in capacity (i.e., once the snow event is over and the roadways are cleared, the "incident" is over), increase the potential for secondary events (such as crashes and stalled vehicles), and more importantly, require a "response" from the transportation agency (dispatching of snowplows and de-icing equipment, etc.).

Some agencies also classify events involving select sensitive users, such as school buses, railroad crossings, etc., as incidents, primarily because these events may require special attention for political or public welfare reasons.

Generally, events have to be on a roadway facility itself or in the right-of-way to be considered incidents by transportation agencies. Events that occur off the right-of-way, such as a structure fire, are not routinely thought of as "incidents" by transportation agencies, although some agencies do log these events in their incident management software and may broadcast messages about them through their motorist information systems. Table 1 summarizes the types of events that the surveyed agencies log as incidents.

Classification of Incidents

One goal of incident management is to ensure that the appropriate response personnel and equipment are provided at every incident. To aid in determining the appropriate level of response, many transportation and emergency service providers have developed systems for classifying incidents. Table 2 shows how the survey respondents replied to questions concerning methods and criteria for classifying incidents in their local area. The table also shows how the severity of the incident affects each agency's response decisions.

Table 1. Definition of Incident by Survey Respondents
Key: Col = collision; OV = overturned vehicle; SL = stall in lane; AL = abandoned vehicle in lane; SS = stall on shoulder; VF = vehicle fire; HS = hazmat spill; AS = abandoned vehicle on shoulder; PE = public emergency; DR = debris in roadway. A check (✓) indicates the agency records that incident type; a blank cell indicates it does not.

Agency | Col | OV | SL | AL | SS | VF | HS | AS | PE | DR | Other
-------|-----|----|----|----|----|----|----|----|----|----|------
Kansas DOT—Kansas City | ✓ | ✓ | ✓ | ✓ | ✓ | ✓ | ✓ |   | ✓ | ✓ | Only incidents requiring police accident reports are documented. Kansas DOT is currently building a TMC, which it hopes to have operational by late this year or early next year. Currently, the state police and a police-operated service patrol are the only incident management elements in place; the police provide the DOT with copies of accident reports for incidents on DOT facilities.
New Jersey DOT | ✓ | ✓ | ✓ | ✓ | ✓ | ✓ | ✓ | ✓ | ✓ | ✓ | Downed utility pole; downed signal pole; anything blocking a lane or shoulder
Arizona DOT | ✓ | ✓ | ✓ | ✓ | ✓ | ✓ | ✓ | ✓ | ✓ | ✓ |
Ohio DOT—Columbus | ✓ | ✓ | ✓ | ✓ | ✓ | ✓ | ✓ | ✓ | ✓ | ✓ | Unexpected weather change
Tennessee DOT | ✓ | ✓ | ✓ | ✓ | ✓ | ✓ | ✓ | ✓ | ✓ | ✓ | Anything affecting traffic flow
Phoenix, AZ Fire Dept. | ✓ | ✓ | ✓ | ✓ |   | ✓ | ✓ |   | ✓ |   |
Maryland State Hwy Admin—CHART | ✓ | ✓ | ✓ | ✓ |   | ✓ | ✓ |   |   | ✓ | Anything affecting traffic flow
Texas DOT—Austin | ✓ | ✓ | ✓ | ✓ | ✓ | ✓ | ✓ | ✓ | ✓ | ✓ |
Texas DOT—San Antonio | ✓ | ✓ | ✓ | ✓ | ✓ | ✓ | ✓ | ✓ | ✓ | ✓ | Weather; construction; maintenance
Minnesota DOT—Minneapolis | ✓ | ✓ | ✓ | ✓ | ✓ | ✓ | ✓ | ✓ | ✓ | ✓ |
Caltrans—San Diego | ✓ | ✓ | ✓ | ✓ | ✓ | ✓ | ✓ |   | ✓ | ✓ |
Incident Management Services—Houston |   |   |   |   |   |   |   |   |   |   |
Southeast Michigan COG—Detroit | ✓ | ✓ | ✓ | ✓ | ✓ | ✓ | ✓ | ✓ | ✓ | ✓ |
City of Houston—Police Dept. | ✓ | ✓ | ✓ | ✓ | ✓ | ✓ | ✓ | ✓ | ✓ | ✓ | Assist TxDOT
New York DOT | ✓ | ✓ | ✓ | ✓ | ✓ | ✓ | ✓ | ✓ | ✓ | ✓ | Brush fire; pedestrian in restricted area; road work; traffic signal malfunction; non-recurring severe congestion
Colorado DOT—Lakewood | ✓ | ✓ | ✓ | ✓ | ✓ | ✓ | ✓ | ✓ | ✓ | ✓ |
Texas DOT—Houston | ✓ | ✓ | ✓ | ✓ | ✓ | ✓ | ✓ | ✓ | ✓ | ✓ |
Illinois DOT—Chicago | ✓ | ✓ | ✓ | ✓ |   | ✓ | ✓ |   | ✓ |   | Ice on pavement; water main breaks; flooding; anything that blocks one or more lanes for 30 minutes or more; school bus involvement; railroad crossing involvement; fatality
North Carolina DOT | ✓ | ✓ | ✓ | ✓ | ✓ | ✓ | ✓ | ✓ | ✓ | ✓ | Anything affecting traffic flow
Connecticut DOT | ✓ | ✓ | ✓ | ✓ | ✓ | ✓ | ✓ | ✓ | ✓ | ✓ |

A common classification scheme describing the severity of the incident and/or the urgency of the response does not exist. For the most part, transportation agencies tend to classify incidents into two or three categories based upon the degree to which traffic is likely to be impacted (severity) and/or the number of lanes blocked. Some of the criteria that transportation agencies use to classify incidents include the following:

  • Number of lanes blocked;
  • Estimated duration of blockage;
  • Severity and/or number of injuries involved;
  • Time-of-day;
  • Presence of hazardous materials;
  • Degree of damage to vehicles and/or infrastructure;
  • Type of vehicles involved (e.g., trucks, buses, etc.); and
  • Number of vehicles involved.
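Agencies often combine criteria like these into a simple major/minor decision rule. The sketch below is purely illustrative: the function name, parameters, and thresholds are assumptions chosen for demonstration, not any surveyed agency's actual policy.

```python
def classify_incident(lanes_blocked: int,
                      est_duration_min: float,
                      injuries: bool = False,
                      hazmat: bool = False,
                      heavy_vehicle: bool = False,
                      peak_period: bool = False) -> str:
    """Return 'major' or 'minor' using the kinds of criteria listed above.

    Thresholds are illustrative only (hypothetical example values).
    """
    # Injuries or hazardous materials always escalate the response.
    if hazmat or injuries:
        return "major"
    # Multi-lane blockages have the largest capacity impact.
    if lanes_blocked >= 2:
        return "major"
    # A single-lane blockage is major only if it is expected to persist
    # or occurs when traffic demand is high (time-of-day criterion).
    if lanes_blocked == 1 and (est_duration_min > 30 or peak_period):
        return "major"
    # Trucks and buses blocking a lane often require special equipment.
    if heavy_vehicle and lanes_blocked >= 1:
        return "major"
    return "minor"
```

An agency's real rule would substitute its own cutoffs, such as New Jersey's one-hour duration threshold, for the illustrative values here.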

Emergency service providers, on the other hand, typically classify events based on the potential loss of life and/or the impact to public safety. Both of the emergency service providers surveyed use standards defined by their industry as a means of classifying incidents. These standards take into account the presence of possible injuries or fatalities, and rely on dispatchers soliciting correct information from the individuals reporting the incidents.

Information Collected Per Incident

One attribute of a good performance measurement system is that the data needed to generate the performance measures be readily attainable in an economical manner. [1] This implies that in order for agencies to develop and use performance measures, the data must be readily available through their existing systems. Responders are more likely to compute performance measures if they are already collecting the data to support them. Part of this survey effort was to examine what data are currently being collected by different agencies and how.

Table 3 shows what information many of the transportation and emergency service providers are collecting about each incident event. Based on the survey responses, at a minimum, the following information is recorded by most agencies:

  • The roadway name where the incident occurred;
  • The name of a nearby cross-street or location;
  • The location of the incident in the lanes (i.e., which lanes are blocked);
  • The type of incident;
  • The time at which the incident was detected or reported;
  • The time the first response vehicle arrived on the scene; and
  • The time the incident was cleared from the scene.
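These minimum fields are enough to derive the most common incident timeline measures: response time (detection to arrival), clearance time (arrival to scene cleared), and total duration. The record layout below is a hypothetical sketch of such a log entry, not any surveyed agency's actual schema.

```python
from dataclasses import dataclass
from datetime import datetime, timedelta


@dataclass
class IncidentRecord:
    """Minimal incident log entry built from the fields most agencies record."""
    roadway: str               # roadway name where the incident occurred
    cross_street: str          # nearby cross-street or location reference
    lanes_blocked: list[int]   # which lanes are blocked (empty = shoulder only)
    incident_type: str         # e.g., "collision", "stall", "debris"
    detected: datetime         # time the incident was detected or reported
    arrived: datetime          # time the first response vehicle arrived on scene
    cleared: datetime          # time the incident was cleared from the scene

    def response_time(self) -> timedelta:
        """Detection to first arrival on scene."""
        return self.arrived - self.detected

    def clearance_time(self) -> timedelta:
        """First arrival to scene cleared."""
        return self.cleared - self.arrived

    def total_duration(self) -> timedelta:
        """Detection to scene cleared."""
        return self.cleared - self.detected
```

Because the measures are simple differences of timestamps the agency already records, they can be computed at no additional data collection cost, which is exactly the property the survey findings emphasize.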
Table 2. Criteria Used to Categorize Incidents and How It Effects Incident Response
Agency Criteria Thresholds Response Variation
New Jersey DOT Major, Minor. Major incidents defined as those lasting more than one hour while minor incidents defined as those lasting less than 1 hour. Minor incidents—use ITS (DMS/HAR) if applicable. For major incidents, review to see if need to send IM response team. Team consists of state trooper and DOT traffic operations person, get to scene and try to speed clearance of incident.
Arizona DOT Level 1, 2, 3 Level 1—fatality; unplanned closure in one or both direction affecting any state route; any incident involving HAZMAT, homicide, trains, or school buses;
Level 2—traffic flow is restricted; requiring live AzDOT presence; fences cuts, livestock on roadway, or guard rail damage presenting hazard to motorist; red indication out / stop sign knockdown; large dead animal in lanes; roadway damage (large potholes, gravel on roadway); disabled vehicle blocking flow; structural damage that does not close hwy; threat of jumper that does not close hwy
Level 3—Yellow/green indication out; debris not blocking roadway; disabled vehicle not blocking roadway; Maintenance; anything that can be handled at supervisor discretion; anything not requiring immediate ADOT response
What changes is who gets notified and how much of a hurry we are to get responses from them.
Level 1—notify Admin Major (includes ADOT Director, and State engineer, and District Engineer).
Level 2—Notify Maintenance Supervisor by pager or phone.
Level 3—notify supervisors via email, phone, radio.
Ohio DOT—Columbus Severity, time-of-day, congestion level Lane blockages of more than one minute warrants activating DMS; DMS messages updated as lane blockage changes; Service patrol will work incidents expected to be under 15 minutes to clear, otherwise call for tow trucks Incident response plan (IRM) addresses how to handle major incidents, stalled vehicles, debris, roadwork, congestion, fire/HAZMAT, freeway diversion. For minor fender benders, execute only what is helpful to motorist that doesn't cause a lot of inconvenience. For major incidents (e.g., fatality) and EMS is on the scene, execute full plan immediately.
Tennessee DOT empty cell empty cell Long term—debriefings and updates
Phoenix, AZ Fire Dept. Use universal system U.S. Fire Adm. (thru FEMA website) empty cell Response bases on Inc. Management System (IMS)—developed in California published 1985. Dispatchers—rotate
Maryland State Hwy AdmCHART Property damage: person injured/fatality; Hazmat; emergency roadwork;—15 items out of FHWA Data Dictionary empty cell If longer than 2 hrs shutdown, preplanned detour routes. Dependent on magnitude of incident, different levels of notifications is given to agencies.
Texas DOT—Austin HCM Level of Service Criteria; Reported vs. verified Compare current volume/occupancy measures to HCM thresholds. No impact on operations—simply informational. Emergency services will look at speed. Haven't needed to classify incidents (respond to all incidents). Verified vs reported—if reported, will look to verify with CCTV and then clear.
Texas DOT—San Antonio Type of incident (i.e., debris, weather, accident). Severity of lanes closed; Severity of accident Severity of lanes closed—2 or 3 lanes closed, classified as major incident. With crash scenes, major incident is one that requires EMS (get information via police). Major incident—when demand expected to exceed capacity. TransGuide software system automatically prioritizes—major incidents over minor incidents, minor incident in open lane. System uses operator inputs (i.e., description of incidents) to driver scenario process.
Minnesota DOT—Minneapolis Major, Minor. Judgment call by operator. Used past experience, type of incident, Time-of-day, expected duration of incident (i.e., any road closure or any incident during peak period, hazmat or rollover) classified as major Major incidents—place motorist information system in overdrive. Broadcast radio messages every 10 minutes. With major incident, use DMSs to direct motorist to tune to station and continuously broadcast incident information. Will also call other media outlets. May pull in other operators if many going on at same time.
Caltrans—San Diego Use California Highway patrol's radio call system (10 codes, 11 codes) empty cell Highest level codes, Caltrans will dispatch response immediately. With other codes, will wait until officer on-site. Will change response or dispatch response based on officers needs.
Incident Management Services—Houston, TX Only responds to major incidents involving 18-wheeler rollovers/lost loads. (no entry) (no entry)
Southeast Michigan COG—Detroit No defined criteria (i.e., no delay or severity thresholds). The Michigan State Police Criminal Justice Information Center has a system to capture this information called the Automated Incident Command System (AICS). There are no documented thresholds that the respondent knew of, though something may be defined by the State Police. They work by guidelines and training found in the Incident Command System (ICS). They also have a Computer Aided Dispatch (CAD) system that dispatches the appropriate personnel for a particular event. The dispatcher determines the appropriate response after assessing the call, or the person responding determines it once at the scene of the incident. Appropriate response scenarios might also be determined through the use of the ICS and CAD systems. For freeway incidents, assistance is provided by the Michigan Intelligent Transportation Systems (ITS) Center through the use of its cameras.
City of Houston, TX Police Dept. Severity—major/minor; location—moving lane of traffic (right shoulder, left shoulder, lane(s) blocked: 1, 2, 3, 4, 5, 6) Major = major freeway blockage; minor = minimal freeway blockage 90% of incidents detected by roving patrol; 6% dispatched from TranStar; clear minor incidents alone; assist with traffic control at major incidents.
New York DOT Combination of severity, anticipated duration, and time-of-day (e.g., peak or off-peak) Level 1—no lane blocked, on shoulder;
Level 2—1 lane blocked, 0-15 min (peak) or 0-30 min (off-peak);
Level 3—1 lane blocked, 15-30 min (peak) or 30-60 min (off-peak);
Level 4—1 or more lanes blocked, 30-60 min (peak) or 60-120 min (off-peak);
Level 5—road closure, or 1+ lanes blocked more than 60 min (peak) or 120 min (off-peak)
The more severe the incident, the more they "throw" at it. They have communications with Metro Traffic and local media (if after Metro Traffic hours). Co-located in the TMC with the state police—get an estimate of duration from the trooper. Levels 1-2: may or may not do anything. Higher levels: first advise Metro Traffic/media of the problem; if worse, recommend taking an alternate route (but don't specify one); if really bad, recommend a specific alternate route; for more severe incidents, use stronger DMS messages and use DMS to notify motorists to tune to HAR—they have 1 permanent HAR and 2 portable (1 portable being converted to permanent).
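The New York DOT five-level scheme above is essentially a decision rule over lanes blocked, expected duration, and time-of-day. A minimal sketch of that rule follows; the function name and the handling of exact threshold boundaries are assumptions for illustration, not NYDOT's actual implementation:

```python
def nydot_level(lanes_blocked: int, duration_min: float,
                peak: bool, road_closed: bool = False) -> int:
    """Classify an incident into NYDOT levels 1-5 (illustrative sketch;
    boundary handling at exact thresholds is an assumption)."""
    if lanes_blocked == 0 and not road_closed:
        return 1          # on shoulder, no lane blocked
    if road_closed or duration_min > (60 if peak else 120):
        return 5          # road closure, or very long blockage
    if lanes_blocked > 1 or duration_min > (30 if peak else 60):
        return 4          # 1+ lanes, 30-60 min peak / 60-120 off-peak
    if duration_min > (15 if peak else 30):
        return 3          # 1 lane, 15-30 min peak / 30-60 off-peak
    return 2              # 1 lane, 0-15 min peak / 0-30 off-peak
```

Note how time-of-day shifts every duration threshold: the same one-lane, 45-minute blockage is Level 4 in the peak but Level 3 off-peak, which matches the agency's practice of escalating the response during peak periods.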
Colorado DOT—Lakewood The Mile High Courtesy Patrol handles minor incidents. The TMC only responds to major incidents—duration is the criterion used 3-tier system for major incidents—total freeway closure or most lanes blocked
Level 1—duration less than 30 minutes;
Level 2—duration 30 minutes to 2 hours;
Level 3—duration over 2 hours
The main response is public information. They have a broadcast fax system with 300 agencies/companies signed up, including media, other public agencies, trucking firms, the US military, the US Postal Service, visitor centers, etc. They also post information on their website.
Texas DOT—Houston Follows the classification provided by law enforcement (fatality/injury = major, PDO = minor), as well as determining severity based upon lanes blocked and duration Major: one lane > 30 min (time-of-day dependent); two or more lanes > 15 min (time-of-day dependent); truck accidents, HazMat spills, bus accidents, multi-vehicle accidents
Minor: Other incidents
Different types of incidents require different levels of response. For example, HFS is not contacted for a minor incident; however, HPD may be required, and they are contacted the same as if it were a major incident. They are given all known details, and it is left to them to determine their response.
Illinois DOT—Chicago Severity—routine or incident; lane blockage 1 or more lanes closed for 30 minutes or more; total freeway closure for 15 minutes or more; Hazmat More documentation for incidents than for "routines"; more public awareness for major incidents—media alerts, notify DOT personnel, DMS

Table 3. Information Collected About Each Incident Event
Table 3 indicates, for each agency, which of the following data items are collected about each incident event: roadway name; location/cross-street name; block number; detection station #; lat/long; location of lanes blocked; incident type; incident source; current status of incident; time incident was detected (reported); time incident was verified; source of incident verification; time response vehicle arrived on scene; type of response vehicles on scene; time response vehicles left scene; time incident was cleared from scene; time traffic returned to normal flow; roadway surface condition; roadway condition; light condition; weather condition; injuries present; # of vehicles involved; type of vehicles involved; incident severity (qualitative); other. Each row below lists the items the agency reported collecting, with note numbers from the original table retained in parentheses.

Kansas DOT, Kansas City: roadway name; cross-street name/location; location of lanes blocked; incident type; time detected; time response vehicle arrived(1); roadway condition; light condition; weather condition; injuries present; # of vehicles involved; types of vehicles involved; incident severity. Other: property damage; diagram; names; vehicle makes, models, colors, plate numbers.

New Jersey DOT: roadway name; cross-street name/location; detection station #; location of lanes blocked; incident type; incident source; current status; time detected; time response vehicle arrived; types of response vehicles; time response vehicles left; # of vehicles involved; types of vehicles involved; incident severity.

Arizona DOT: roadway name; cross-street name/location; lat/long; location of lanes blocked; incident type; incident source; current status; time detected; time verified; source of verification; time response vehicle arrived(1); types of response vehicles; time response vehicles left; time cleared(2); injuries present; # of vehicles involved; types of vehicles involved; incident severity. Other: route, direction, milepost, type of incident (accident with or without injuries/death); who was called out.

Ohio DOT—Columbus: roadway name; cross-street name/location; location of lanes blocked; incident type; time detected; time cleared; # of vehicles involved; types of vehicles involved; incident severity. Other: mile marker system location.

Tennessee DOT: roadway name; location of lanes blocked; incident type; incident source; time response vehicle arrived; time response vehicles left; types of vehicles involved. Other: type of service; vehicle tag #; direction.

Phoenix, AZ Fire Dept: roadway name; cross-street name/location; block number; detection station #; lat/long; location of lanes blocked; incident type; incident source; current status; time detected; time verified; source of verification; time response vehicle arrived(3); types of response vehicles; time response vehicles left; time cleared; roadway surface condition; roadway condition; light condition; weather condition; injuries present; # of vehicles involved; types of vehicles involved; incident severity. Other: detailed info on injuries, seatbelts, child restraints; trucks have live terminals and digital cameras to collect info.

Maryland State Hwy Admin—CHART: (no items reported).

Texas DOT—Austin: roadway name; cross-street name/location; block number; detection station #; lat/long; location of lanes blocked; incident type; current status; time detected; time verified; time cleared; roadway surface condition; roadway condition; light condition; weather condition; types of vehicles involved. Other: system software records the time that changes to any fields are made, including updates to comments.

Texas DOT—San Antonio: roadway name; cross-street name/location; location of lanes blocked; current status; time detected; time verified; time cleared(4); roadway condition(5). Other: system software records time reported, time entered in system, time system executed scenario, time scenario changed, and time scenario over (when lane back open to traffic).

Minnesota DOT—Minneapolis: roadway name; cross-street name/location; location of lanes blocked; incident type; incident source; current status; time detected; source of verification; time response vehicle arrived; types of response vehicles; time response vehicles left; time cleared; roadway surface condition; roadway condition; weather condition(6); injuries present; # of vehicles involved; types of vehicles involved; incident severity.

Caltrans—San Diego: roadway name; cross-street name/location; location of lanes blocked; incident type; incident source; current status; time detected; time verified; source of verification; time response vehicle arrived; types of response vehicles; time response vehicles left; time cleared; roadway surface condition; roadway condition; light condition; weather condition; injuries present; # of vehicles involved; types of vehicles involved; incident severity. Other: # of lanes blocked.

Southeast Michigan COG—Detroit: roadway name; cross-street name/location; lat/long; location of lanes blocked; incident type; incident source; current status; time detected; time verified; source of verification; time response vehicle arrived; types of response vehicles; time response vehicles left; time cleared; roadway surface condition; roadway condition; light condition; weather condition; injuries present; # of vehicles involved; types of vehicles involved; incident severity. Other: see attachment.

Houston, TX—Motorist Assistance Patrol: roadway name; cross-street name/location; location of lanes blocked; incident type; incident source; current status; time detected; time verified; source of verification; time response vehicle arrived; types of response vehicles; time response vehicles left; injuries present; # of vehicles involved; types of vehicles involved; incident severity. Other: vehicle make, model, color, year, license plate; driver male/female; number of occupants (driver only, 2, 3, 4+); motorist use of cell phone (# called, air time); motorist name & signature.

New York DOT: roadway name; cross-street name/location; lat/long; location of lanes blocked; incident type; incident source; current status; time detected(7); time cleared; time traffic returned to normal flow(8); # of vehicles involved; types of vehicles involved. Other: other highways affected (if any); which ITS devices activated (DMS, HAR).

Colorado DOT—Lakewood: roadway name; cross-street name/location; location of lanes blocked; incident type; incident source; time detected; time verified; source of verification; time response vehicle arrived; time response vehicles left; time cleared; injuries present; # of vehicles involved. Other: information collected for service patrol response to minor incidents only; there is currently no logging of major incident data (level 1, 2, 3 incidents) that the TMC responds to.

Texas DOT—Houston: roadway name; cross-street name/location; location of lanes blocked; incident type; incident source; current status; time detected; time verified; source of verification; time response vehicle arrived; types of response vehicles; time cleared(2); roadway surface condition; roadway condition; light condition; weather condition; injuries present; # of vehicles involved; types of vehicles involved. Other: incident date; direction of travel; before/after cross street.

Illinois DOT—Chicago: roadway name; cross-street name/location; location of lanes blocked; incident type; incident source; current status; time detected; time verified; source of verification; time response vehicle arrived; types of response vehicles; time response vehicles left; time cleared; roadway surface condition; roadway condition; injuries present; # of vehicles involved; types of vehicles involved; incident severity.

City of Houston, TX Police Dept: roadway name; cross-street name/location; block number; incident type; time detected. Other: HPD staffs a single console at TranStar; while more specific information is collected by the officer in the field, HPD at TranStar logs only some general information, and only for incidents that occur on the freeway system.

North Carolina DOT: roadway name; cross-street name/location; location of lanes blocked; incident type; time detected; time response vehicles left; time cleared; roadway condition; # of vehicles involved; types of vehicles involved; incident severity. Other: information only for motorist assistance patrols.

Connecticut DOT: roadway name; cross-street name/location; location of lanes blocked; incident type; incident source; current status; time detected; time verified; source of verification; time response vehicle arrived; time response vehicles left; time cleared; time traffic returned to normal flow; # of vehicles involved; types of vehicles involved.
1. First on scene
2. removed from roadway altogether
3. Individual dispatched, on scene, and benchmark points
4. opening of lanes
5. also record under maintenance/construction
6. record weather at start of each shift as operator logs in
7. time stamp when entered into MIST
8. No fields in software for this, but they try to indicate these in an open comment field

Interestingly, only eleven agencies reported that they record the time an incident was verified. However, further discussion with the respondents revealed that, in many cases, the time an incident was detected (or reported) and the time it was verified are the same.

Thirteen agencies reported that they record the time the first incident responders arrived on the scene. Similarly, slightly more than half of the respondents indicated that they routinely record the time the incident response vehicles leave the scene and/or the time the incident was cleared from the roadway. For the most part, agencies are concerned with tracking when they implement or execute their own response and are less concerned with recording when other responders perform certain functions.

Only one agency reported that they record the time that the freeway returned to normal flow. A few common reasons cited for not recording this measure include the following:

  • It is too hard to determine when "normal" flow occurs;
  • The congestion resulting from an incident lasts so long that operators tend to forget to go back and log when normal traffic flow resumes; and
  • This time is not important to determining the effectiveness of the response.

Some respondents indicated that their software system automatically records the time (i.e., time stamps) every time the operator makes a change to the traffic control. For example, when the operator first initiates a message on a DMS, the time is logged by the system. If the operator changes the message, the time the new message is implemented by the system is logged. The advantage of this approach is that it takes the burden off the operator to log when certain changes are made.
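The system-side time-stamping approach described above can be sketched in a few lines. The sketch below is a minimal illustration, not any agency's actual software; the class and field names are hypothetical.

```python
from datetime import datetime, timezone

class DMSLog:
    """Sketch of system-side time stamping: every change an operator makes
    to a dynamic message sign (DMS) is logged with the time the system
    applied it, so the operator never has to record it manually."""

    def __init__(self):
        self.entries = []  # (timestamp, sign_id, message)

    def set_message(self, sign_id, message):
        # The system, not the operator, records when the change took effect.
        stamp = datetime.now(timezone.utc)
        self.entries.append((stamp, sign_id, message))
        return stamp

log = DMSLog()
log.set_message("DMS-12", "ACCIDENT AHEAD / LEFT LANE CLOSED")
log.set_message("DMS-12", "ALL LANES OPEN")  # message change is stamped too
```

Because every control change passes through the same call, the log doubles as a record of when traffic control actions occurred, with no extra operator effort.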

Collection and Retention of Incident Data

Table 4 summarizes how the respondents replied to questions concerning the collection and storage of incident data. Approximately equal numbers of agencies use manual (seven respondents) and automatic (eight respondents) means of collecting incident data. Four agencies reported that they use a combination of manual forms and automated systems for collecting information about incidents. In a few cases where agencies collected data manually, the forms were later entered into automated systems for further processing and storage.

Most agencies reported that their incident information either initially or eventually ended up in a database that could be queried. The survey also showed that information about specific incidents was generally kept for a long time, with most agencies retaining their incident logs for three or more years.
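The pattern most agencies described, with incident records ending up in a queriable database however they were first captured, can be illustrated with a small sketch. The table and field names below are hypothetical, not any agency's schema.

```python
import sqlite3

# Sketch of a queriable incident database (illustrative schema only).
db = sqlite3.connect(":memory:")
db.execute("""CREATE TABLE incidents (
    id INTEGER PRIMARY KEY,
    roadway TEXT, incident_type TEXT,
    detected TEXT, cleared TEXT)""")

# Records might arrive from paper forms keyed in later, or from software logs.
db.executemany(
    "INSERT INTO incidents (roadway, incident_type, detected, cleared) "
    "VALUES (?,?,?,?)",
    [("I-95 NB", "crash", "2001-06-04 07:42", "2001-06-04 08:31"),
     ("I-95 SB", "stall", "2001-06-04 09:10", "2001-06-04 09:25")])
db.commit()

# A reporting query is then straightforward:
n = db.execute(
    "SELECT COUNT(*) FROM incidents WHERE incident_type='crash'"
).fetchone()[0]
print(n)  # 1
```

Once records are in this form, retention policies and ad hoc report requests both reduce to simple queries against the table.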

Agencies were also asked if they integrated their incident reports with any of the other incident responders. The general response was "no"; however, some agencies did state they have plans to begin integrating their freeway management center systems with a 911 dispatching center so that data from other agencies could be merged with incident records. This is expected to increase both the quality and quantity of data about incidents at these locations.

Table 4. Collection and Storage Methods, Retention, and Integration Policies of Incident Information
Agency How is this information collected? What format is used to store information? How long is information retained? Is data integrated with other information?
Kansas DOT—Kansas City Manual (1) Receive paper file from state police, enter into a queriable Oracle database. No CCTV yet, highway patrol video for fatality. 5 years to Forever Highway patrol input accident data into accident report database. DOT automatically receives copy of any incident on DOT facility
New Jersey DOT Automatic Queriable database 8 years No
Arizona DOT Automatic Queriable database 3 years When the police work an incident, we are supposed to get their log number. These are not always made available to us. We usually enter these into the Road Condition report and enter the HCRS# into the documentation.
Ohio DOT—Columbus Manual (2) / Automatic Service patrol fills out paper form, later entered into queriable database—Paradox. DMS message logged manually to compare accuracy of DMS electronic file log (new) Not sure on the electronic files, permanent for database No
Tennessee DOT Manual Paper, entered into database Since start in database (June '99). Paper not kept long term after entered into database Some—major incidents w/ multiple agencies—debrief w/ police, fire, timeframe
Phoenix, AZ Fire Dept Both: all vehicles have geo ID, monitored by clock, which tracks time of arrivals, repositions, and departures; Manual—pictures; EMS data—handheld computer, downloaded later Paper, electronic Paper—3 yrs Yes—police dispatch, census
Maryland State Hwy Admin—CHART Automatic Oracle database Started Feb 2000 keeping everything; before—5 yrs on-site, then paper to warehouse In future plans: 911 centers; ability for other agencies (police, county) to access software & edit incident reports eventually
Texas DOT—Austin Automatic Sybase No deletion policy has been developed yet. Quarterly off-load and access through Excel Not yet—only one incident done so far, but not very detailed. Done to answer questions about response. Ad hoc requests—maintenance information about equipment failures
Texas DOT—San Antonio Automatic Electronic files Minimum of two years System tied directly to 911 map—don't use one system to verify the other
Minnesota DOT—Minneapolis Automatic Queriable database—Access (since 2001); prior to '01—paper logs Early '90s Recently had an FHWA intern perform a big analysis comparing police logs to system logs. Do not routinely perform comparisons—done on an as-needed basis and when staff are available. Do produce an annual volume/crash frequency report
Caltrans—San Diego Manual Paper files and electronic files Less than 14 mo When needed.
Southeast Michigan COG—Detroit Manual & Automatic Data stored in both paper and electronic formats. SEMCOG requests copies of the database and queries it using MS Access SEMCOG has only just started to gather this data (over the past 5 years). Have kept all of it so far Try to cross-reference the MSP 911 data with the Freeway Courtesy Patrol data (checking to see how long abandoned vehicles have been out on the roadway after they have been identified). Also integrate the MSP crash data (UD10 forms/database) with the incident database. Also integrate the incident information with a road attribute file, which includes fields such as lane, 85th-percentile speed, posted speed, land use, vehicle classification counts, traffic volume counts, etc.
Houston, TX Motorist Assistance Patrol Manual & Automatic Paper file, electronic files, queriable database—Access Data generated by MAP is compiled by TTI and returned to TxDOT for storing. Don't know how long they keep it Yes. TTI compiles information and breaks numbers down to percentages.
New York DOT Typed into MIST Queriable database—Sybase Current six months active in system (last week of 6 months falls off each week); burn 6 mo. data every week to CD for backup Service patrol logs to different system, but if working an incident DOT is entering into MIST, then cross-reference to service patrol record entered.
Colorado DOT—Lakewood Automatic—Service patrol calls dispatch, dispatcher enters all info into database. Oracle queriable database Indefinitely No
TxDOT—Houston Automatic Flat files—queriable database Indefinitely Not electronically. MAP files collected in same manner but different database
Illinois DOT—Chicago Manual Paper file—shared with DOT traffic, maintenance, and claims department 7 years Cross reference state police records; ETP service patrol uses fill-in the dot data cards, will soon be upgrading; the data is not routinely compared but the capability is there
City of Houston, TX Police Dept Manual (3) Other (4) Paper. The Access database is used to enter incidents during each shift (two shifts per day). At the end of the shift, the daily activity log is printed. The database only retains the totals for the shift (data on individual incidents are not saved in the database—only on the printouts). The database is then used to prepare the monthly reports. Printouts of the daily activity logs are kept for 3 years.
North Carolina DOT Manual (5) Queriable database Indefinitely (have been collecting for ~6yrs) No
Connecticut DOT — Paper and electronic Incident reports are retained for 5 years No
1. Accident Forms
2. Freeway service patrol incident log form
3. Accident reporting form filled out by officer in field, but does not go to TranStar
4. Incident data at TranStar is manually entered into an Access database
5. IMAP program—called to TMC entered into database on local PC, moving to webpage to consolidate information

Incident Management Performance Measures

Table 5 shows the general types of performance measures that are routinely computed by the agencies responding to the survey. Only half of the responding agencies indicated that they routinely compute incident-related performance measures. Not surprisingly, most of those that do reported computing the following measures:

  • Incident frequency,
  • Detection time,
  • Response time, and
  • Clearance time.
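As a rough illustration of the first measure and its normalized counterpart, the sketch below counts incident frequency by location and divides by exposure to get a rate. The incident log, traffic volume, and the per-million-vehicle-miles convention are assumptions for illustration only; the surveyed agencies' own definitions vary (see Table 6).

```python
from collections import Counter

# Hypothetical incident log: (location, date) pairs.
incidents = [
    ("I-10 @ MP 152", "2001-03-02"),
    ("I-10 @ MP 152", "2001-03-09"),
    ("I-17 @ MP 201", "2001-03-05"),
]

# Incident frequency: how often incidents occur at each location.
frequency = Counter(loc for loc, _ in incidents)

# Incident rate: frequency normalized by exposure. One common convention
# (an assumption here, not a surveyed definition) is incidents per million
# vehicle-miles traveled (MVMT) on the segment over the analysis period.
adt = 150_000        # average daily traffic (hypothetical)
segment_miles = 2.0  # segment length
days = 31            # analysis period
mvmt = adt * segment_miles * days / 1_000_000
rate = frequency["I-10 @ MP 152"] / mvmt  # incidents per MVMT
```

Frequency alone flags high-incident locations; the rate separates locations that are genuinely hazardous from those that simply carry more traffic.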

Operational Definition of Incident Management Performance Measures

Table 6 shows the operational definitions that each agency is using to compute these performance measures. Interestingly, most agencies define "detection time" as the time that they were notified of the incident (i.e., the time that the incident was reported to their control center), not as the time between when an incident actually occurred and when the agency was notified of it (whether by emergency responders, operator observation, or a direct report from a citizen).

Nearly all of the respondents indicated that they define "response time" as the elapsed time between when the agency was first notified about an incident and when the first responder appeared on the scene. The primary difference lies in whose responders count: emergency responders typically define response time as the time from when an incident was reported to their dispatcher to when their own response vehicles arrive on the scene, while transportation agencies generally measure it from when the call comes into the TMC (or service patrol dispatcher) to when the first response vehicle arrives on the scene, regardless of which agency that vehicle belongs to (i.e., it could be a fire vehicle, police vehicle, or service patrol vehicle). The problem with defining response time this way is that the transportation agency often has no control over when the emergency service providers are dispatched or the priorities assigned to different types of incidents. In many cases, the response time reported by transportation agencies is actually the time between two unrelated events (i.e., notification of the incident, and the dispatching and arrival of the response vehicles). This is especially true when the traffic management center (TMC) is not the first agency notified of the incident, which is generally the case in most metropolitan areas. Without integrating or comparing records from the dispatching agency, the reported response time may not represent the true response time of the first responder to the incident, but merely the time between unrelated events.

Clearance time is another measure that varies dramatically between freeway management operators and emergency service providers. For the most part, transportation agencies define clearance time as the time between when the first responder arrives on the scene (regardless of which agency they work for) and when the incident is cleared from the roadway. Emergency service providers typically define clearance time as the time between when the first of their units arrives on the scene and when their units leave the scene and can be deployed elsewhere.
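Under the TMC-style definitions described above, response and clearance times reduce to simple differences between logged timestamps. The timestamps and variable names below are hypothetical.

```python
from datetime import datetime

# Hypothetical timestamps from one incident record (field names assumed).
t_reported = datetime(2001, 6, 4, 7, 42)  # call reaches the TMC ("detection")
t_arrival  = datetime(2001, 6, 4, 7, 55)  # first responder on scene
t_cleared  = datetime(2001, 6, 4, 8, 31)  # incident removed from travel lanes

# TMC-style definitions described in the text:
response_time  = t_arrival - t_reported   # notification -> first arrival
clearance_time = t_cleared - t_arrival    # first arrival -> lanes clear

print(response_time)   # 0:13:00
print(clearance_time)  # 0:36:00
```

An emergency service provider would compute the same arithmetic but anchor it to its own dispatcher's notification and its own units' arrival and departure, which is why the two kinds of agencies report different numbers for the same incident.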

Table 5. Typical Performance Measures Routinely Computed by Agencies
Agency: Do you calculate performance measures? What measures do you routinely compute?
Kansas DOT: Yes. Incident Frequency (1)
Kansas DOT—Kansas City: No (2)
New Jersey DOT: Yes. Detection Time; Response Time; Clearance Time
Arizona DOT: Yes (3). Detection Time; Response Time; Clearance Time (4)
Ohio DOT—Columbus: No (5)
Tennessee DOT: No (6)
City of Phoenix Fire Dept: Yes. Incident Frequency; Incident Rate; Detection Time; Response Time; Clearance Time; Number of Secondary Incidents; Time to Normal Flow (7); Others: severity, nature of damage, injuries
Maryland State Hwy Admin—CHART: Yes (8). Incident Frequency; Incident Rate; Response Time; Clearance Time; Number of Secondary Incidents; Time to Normal Flow; Incident Delay; Others: delay hours, environmental impacts, frequency by location, # of disabled vehicles assisted
Texas DOT—Austin: No (9). Others: error logs—preventative maintenance
Texas DOT—San Antonio: No (10). Detection Time; Response Time; Clearance Time
Minnesota DOT—Minneapolis: Yes. Incident Frequency; Response Time (11)
Caltrans—San Diego: No (12)
Southeast Michigan COG—Detroit: Yes. Incident Frequency; Incident Rate; Detection Time; Response Time; Clearance Time; Number of Secondary Incidents; Incident Delay; Others: air quality—pollutants (e.g., amounts of VOC, NOx, and CO)
Houston, TX—Motorist Assistance Patrol: Yes (13). Incident Frequency; Incident Rate; Detection Time (14); Clearance Time; Number of Secondary Incidents; Incident Delay; Others: types of assists provided (used to stock supplies); location of incidents (by corridor, by segment)
New York DOT: No
Colorado DOT—Lakewood: No
Texas DOT—Houston: No (15)
Illinois DOT—Chicago: Yes. Incident Frequency; Incident Rate; Others: response time, clearance time, and detection time have been calculated before but not routinely—only periodically for program justification
City of Houston, TX Police Dept: No (16)
North Carolina DOT: No (17)
Connecticut DOT: Yes. Incident Frequency; Detection Time; Response Time; Clearance Time; Time to Normal Flow; Incident Delay
1. Use incident frequency to identify high accident locations for improvements
2. Hope more will be done once TMC is operational
3. These can be gotten by database query. We do not use this data, but the districts use them to rate district-wide response times
4. Believe this is important, but they do not track it as a general rule
5. Do not have the funding for personnel to design, implement, and update performance measures
6. Under evaluation; Early stages through contract with University (Vanderbilt)
7. Police do and offer to Fire, don't use
8. University of Maryland prepares yearly report (1997 on web)
9. Too time consuming
10. City-wide incident management project—visually seen 40% reduction in clearance times
11. By type of responder
12. Not an issue before now—can recreate times based on logs
13. Most incidents also depend on arrival of other agencies (i.e., ambulances, other police agencies, and other emergency equipment needed)
14. Data collected but not currently used
15. This is an operations staff not a research staff. There is not the time or personnel available for this function. High accident locations are identified from the information and consideration given to these areas on a routine basis. TTI puts together an Annual Report for TranStar
16. That information has not been required
17. Problem is what performance measures to look at. In process of identifying for future

Table 6. Operational Definition of Performance Measures Used to Evaluate Response Systems
Performance Measure / Agency / Operational Definition

Incident Frequency
  City of Phoenix Fire Dept.: Time based, incidents per shift; also calculated by week, month, and year and compared to last year
  Maryland State Hwy Admin—CHART: How often incidents occur at a given location (milepost)
  Connecticut DOT: Any time there is a blockage of the highway, an incident is established

Incident Rate
  City of Phoenix Fire Dept.: # of incidents per month or year; each category is calculated separately; used to shift response
  Maryland State Hwy Admin—CHART: ADT x # of incidents

Detection Time
  New Jersey DOT: When DOT finds out about the incident
  Arizona DOT: Delay from the time that an incident occurs until it is reported
  City of Phoenix Fire Dept.: 1st report to dispatch; if official (police, city), ask them when they detected it; keep track of who reported the incident (official or civilian)
  Maryland State Hwy Admin—CHART: From when the 1st person sees it to calling it in
  Texas DOT—San Antonio: System parameter (2 minutes)—uses 20-second interval data with a rolling average (6 cycles); system detection usually a minute or so after the call
  Caltrans—San Diego: "Reported time"—time when the report comes into the center
  Houston Motorist Assistance Patrol: Time of notification; also driver estimate of time of occurrence
  Connecticut DOT: The time the incident is reported to the TOC via surveillance equipment or verified phone calls

Response Time
  New Jersey DOT: Time for DOT to get there
  Arizona DOT: Starts when a live voice reports receiving the page and responding; ends when the unit reports it is on scene
  City of Phoenix Fire Dept.: Time elapsed between 1st dispatch contact and 1st vehicle on scene
  Maryland State Hwy Admin—CHART: Time call received until arrival on scene
  Texas DOT—San Antonio: System logs the time every time a change or update is made to the response scenario
  Minnesota DOT—Minneapolis: Time detected to time responders arrived on scene; camera-based; not perfect—only when the operator observes responders on scene
  Caltrans—San Diego: Time when 1st responder arrives on scene
  Houston Motorist Assistance Patrol: Dispatch time and time of arrival
  Connecticut DOT: The time responders arrive on scene. Arrival time and response time are calculated for state police only within the Bridgeport operations center coverage area. ConnDOT contacts internal ITS responders (such as bridge safety, construction, maintenance, electrical, and service patrol) only when required; the contact time and arrival time are then kept. Arrival time only is noted for emergency responders such as EMS, wrecker, fire, and environmental protection; DOT does not normally contact these responders initially

Clearance Time
  New Jersey DOT: Time between detection and incident cleared from scene
  Arizona DOT: When the unit reports it is clear or when the operator sees all units clear; this is when the ADOT vehicle leaves the scene
  City of Phoenix Fire Dept.: Time the fire department declares the incident over, usually as driving away from the scene
  Maryland State Hwy Admin—CHART: How long from notification to clear, or until delays clear/all lanes open
  Texas DOT—San Antonio: Time 1st vehicle arrives on scene until lanes open
  Caltrans—San Diego: Time when roadway opened
  Houston Motorist Assistance Patrol: Time incident ends and clearing of incident from roadway
  Connecticut DOT: The time the accident or debris is removed from the travel way

Number of Secondary Incidents
  Arizona DOT: Accidents that occur back in the queue
  City of Phoenix Fire Dept.: Count of accidents, injury, fire, hazmat—each counts as part of one incident, not a different incident #; 1 incident with multiple parts
  Maryland State Hwy Admin—CHART: Pinpointed when an incident is created by delay from a previous incident; called by the operator
  Caltrans—San Diego: Don't know how to compute
  Houston Motorist Assistance Patrol: Time of notification

Time to Normal Flow
  City of Phoenix Fire Dept.: Set by the incident commander, who waits at the scene until flow returns to normal; subjective
  Maryland State Hwy Admin—CHART: Back to operating capacity for time of day
  Houston Motorist Assistance Patrol: When the incident clears and the blockage has been removed from the freeway

Incident Delay
  Maryland State Hwy Admin—CHART: Length of distance (5-mile delay); max delay (example: 10-mile backup)
  Houston Motorist Assistance Patrol: Time of duration

Most agencies agreed that the number of secondary accidents resulting from an incident is a difficult measure to compute. In most cases, it was considered a subjective judgment by the operator. One agency, however, defined a secondary accident as any accident that occurred within a defined radius and time frame of the first incident. Both the distance and time parameters varied by time of day to reflect the different levels of congestion that form around incidents.
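The radius-and-time-window rule that one agency described can be sketched as follows. The thresholds and record fields are hypothetical, and distance here is measured along the roadway via mileposts.

```python
from datetime import datetime, timedelta

def is_secondary(primary, candidate, radius_mi, window):
    """Radius-and-time-window rule: a candidate crash counts as secondary
    if it occurs within radius_mi mileposts of the primary incident and
    within the time window after it. Thresholds would vary by time of day
    to reflect how far congestion extends."""
    close_in_space = abs(candidate["milepost"] - primary["milepost"]) <= radius_mi
    close_in_time = timedelta(0) <= (candidate["time"] - primary["time"]) <= window
    return close_in_space and close_in_time

primary = {"milepost": 152.0, "time": datetime(2001, 6, 4, 7, 40)}
crash   = {"milepost": 150.5, "time": datetime(2001, 6, 4, 7, 58)}

# Peak-period thresholds (hypothetical): 2 miles, 30 minutes.
secondary = is_secondary(primary, crash,
                         radius_mi=2.0, window=timedelta(minutes=30))
```

A rule like this replaces the operator's subjective judgment with a repeatable test, at the cost of having to pick (and periodically revisit) the thresholds.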

Maryland defines incident delays in terms of queue distance. They generally use measures such as the length of congestion (e.g., a five-mile delay or a 10-mile backup) to help define incident delays. Queue distance is a parameter that can be observed almost instantaneously via the surveillance cameras, whereas delay requires measuring the time it takes drivers to pass through the congestion.

Origins of Performance Measures

Table 7 summarizes respondents' answers about the origin of the operational definitions being used to generate the performance measures (i.e., the driving force behind the measures they are currently using). Several respondents indicated that the performance measures they are currently generating were developed by FHWA and are used by FHWA and their local administration to monitor performance over time.

Several other respondents indicated that the measures they currently use have evolved over time. As the objectives of the control center changed, or as new tasks and capabilities were added, new performance measures were added or old ones were modified to reflect the new objectives of the system.

Interestingly, both of the emergency service providers that replied to the survey indicated that they have been collecting performance measures that are standard for their industry. It appears that these performance measures are used as a resource management tool for evaluating staffing and asset allocations.

In an attempt to gain insight into other potential performance measures, each respondent was asked if there were measures not currently being generated by their system that would be desirable or helpful in analyzing the effectiveness of incident response in their area. Table 8 summarizes the responses. For the most part, agencies' responses fit into two categories: one group wants to generate more of the traditional performance measures (such as incident frequencies, incident rates, detection time, response time, etc.), while the other group wants to collect measures that relate to administrative and institutional issues (such as operator workload, camera utilization by other entities, web page hits, etc.). Most agencies, however, agree that better-quality data needs to be entered into their systems to make the performance measures more meaningful.

Table 7. Origin of Operational Definition for Performance Measures Being Used
Agency How were these operational definitions derived? By whom? What was the process for deriving them? Were other agencies involved? If so who were they and how?
New Jersey DOT Derived over time, FHWA and management of traffic operations at DOT have asked for it
Arizona DOT The software developers were in-house. They actually asked the operators what they wanted. We found out what management wanted, and told the developers how we wanted to amass the data. We kept the screens simple and eliminated the garbage as we found we didn't use or management didn't need what the screen or a button was offering. We also deleted things that would not work (Emergency notification systems). Driven by available funds.
City of Phoenix Fire Dept. Labor management committee that deals with performance measures (3 union officers; 3 fire dept. managers; shift commanders, exec. office). 1960's. Devised definitions for measures and guides, reviewed annually
Maryland State Hwy AdminCHART Work w/ FHWA over years, standard definitions
Texas DOT—Austin Developed by Traffic Operation Divisions at Headquarters
Minnesota DOT—Minneapolis Look at data recorded to see what information can be tracked over time. Looking for trends that can be addressed (e.g. Highway Helpers)
Southeast Michigan COG—Detroit By SEMCOG and the Metro Detroit Incident Coordinating Committee
Houston—Motorist Assistance Patrols We are a police agency. We follow normal police data gathering according to our Department SOP
Connecticut DOT General knowledge from other agencies through the I-95 Corridor Coalition

Table 8. Other Performance Measures Not Currently Being Collected, but Desirable
Agency Are there other performance measures that you are not collecting but think would be beneficial?
New Jersey DOT Incident frequency, rate, secondary accidents, and incident delay
Tennessee DOT Interfacing w/ police records ==> high incident rates, commuter times/speeds
Maryland State Hwy. Admin.—CHART Balance of operator workload; tow response to scene
Texas DOT—Austin Institutional issues ==> camera control (other agencies causing problems); web page hits (how many people looking at cameras)
Texas DOT—San Antonio Travel times; partial restoring of capacity (i.e., when lanes were opened)
Minnesota DOT—Minneapolis Better quality of information
Southeast Michigan COG—Detroit Haven't really given it much thought, only because we are focused on making the data better (more accurate). For example, a call may be taken and dispatched, but the officer can't locate any incident; instead of clearing the call, the record is left with no clear time or any explanation as to why the data are missing.
Houston—Motorist Assistance Patrols No
New York DOT Would like to collect response time, clearance time, resumption of normal flow, and times individual lanes were open/closed. Got an estimate of $100K to upgrade MIST for these add-ons—not being pursued right now.
Illinois DOT—Chicago Detection time—improving *999 and CCTV; Response time—collecting data to calculate response time but not aware of it being used.
City of Houston, TX Police Department Clearance time

Costs of Generating Performance Measures

One objective of this task order was to capture information about the costs associated with collecting, processing, and reporting performance measures for incident management systems around the United States. Almost all of the responding agencies indicated that it was impossible to separate the costs of producing performance measure reports from their typical operating costs. For the most part, agencies consider the cost of collecting data for producing performance measures and performance measure reports as part of their normal operations, and the costs associated with producing special performance reports (such as those requested on demand) are included as part of their normal operating budgets. Table 9 summarizes a few of the responses received from individuals when questioned about the issue of costs.

Table 9. Estimated Cost for Collecting, Processing, and Reporting Performance Measures
Agency What would you estimate the cost to be for collecting, processing, and reporting your performance measures?
Arizona DOT The cost to set up the decision, notification, data collection system that is used for this was part of the AzTech funding.
Maryland State Hwy. Admin.—CHART Contract with University for performance measures
Caltrans—San Diego Not a way to separate costs for this specific function

Incident Management Performance Reports

The respondents were also surveyed as to the type, frequency, and use of reports they produced that documented the performance of their incident management systems. These responses can be found in Tables 10 through 13.

Only eight of the responding agencies indicated that they routinely produce reports so they could monitor the performance of their incident management systems over time. Most of these agencies are reporting their performance measures on a system-wide basis. Five of the agencies also indicated that they routinely produce performance reports by roadway segment, and by facility as well. Many of the agencies reported that their software/data management systems are flexible enough to generate performance measure reports at any level.

Table 11 shows the frequency at which the responding agencies produce performance reports while Table 12 summarizes the uses of the performance reports. The frequency at which agencies produce performance reports varies greatly and seems to be a function of their use. Almost all of the transportation agencies that responded indicated that they produce performance reports on a monthly or quarterly basis. Monthly reports are generally used by the operations staff to track use of resources and include such information as the number and type of incidents, the type of responses (or assistance), the devices and/or resources used to manage the incident, the schedules of staff, and the high incident locations. Mid-level administrative staff generally use quarterly reports to assist in the coordination of incident responses across institutional and/or jurisdictional boundaries.

Both of the fire and police agencies that responded to the survey indicated that they generally produce daily reports of the "incidents" (not just those related to traffic operations) that they work. Watch commanders generally use these reports to assess the workload and readiness of the various units to respond to other types of incidents.

Table 10. Aggregation Level of Performance Reports
Agency By Facility By Segment System Wide Other
Kansas DOT—Kansas City empty cell empty cell empty cell Accident frequency can be on any of these levels
New Jersey DOT empty cell empty cell Yes empty cell
Arizona DOT empty cell empty cell Yes (1) empty cell
Ohio DOT—Columbus empty cell empty cell empty cell empty cell
Tennessee DOT empty cell empty cell empty cell empty cell
City of Phoenix, AZ Fire Department Yes Yes Yes empty cell
Maryland State Hwy. Admin.—CHART Yes Yes Yes Upon request
Texas DOT—Austin Yes Yes Yes Monthly reports on LCU failures; communications errors
Texas DOT—San Antonio empty cell empty cell empty cell Every time something is changed, the system documents the time; therefore, have complete "history" of response
Minnesota DOT—Minneapolis empty cell empty cell empty cell By responder on monthly basis; also produce annual crash/volume report, by location
Caltrans—San Diego empty cell empty cell empty cell By incident
Southeast Michigan COG—Detroit Yes Yes Yes empty cell
Houston, TX—Motorist Assistance Patrols Yes Yes Yes empty cell
New York DOT empty cell empty cell empty cell empty cell
Colorado DOT—Lakewood empty cell empty cell empty cell empty cell
Texas DOT—Houston empty cell empty cell empty cell empty cell
Illinois DOT—Chicago empty cell empty cell Yes empty cell
City of Houston, TX Police Department empty cell empty cell empty cell empty cell
North Carolina DOT empty cell empty cell empty cell empty cell
Connecticut DOT empty cell empty cell empty cell empty cell
1. Thought to be generated system-wide, but known to be grouped by Districts and ORGS (small operating units). Districts then examine the reports specific to their area.

Table 11. Frequency at Which Performance Measures Are Reported
Agency How often are they produced?
New Jersey DOT Monthly
Arizona DOT Quarterly
City of Phoenix, AZ—Fire Dept. Daily (Captain gets his last shift & last shift before he arrived)
Maryland State Hwy. Admin.—CHART Monthly—number of incidents by region; assists; use of devices (monthly meetings); Annually—big picture by University, legislature, other agencies
Texas DOT—Austin Quarterly
Texas DOT—San Antonio As Needed basis—have done 2 system wide evaluations; also use on-line survey on homepage to gauge motorist responses (subjective)
Minnesota DOT—Minneapolis Monthly and yearly—incidents by type and response; special days (e.g., snow days)
Caltrans—San Diego As needed basis—some annual (accidents); monthly—for meeting purposes
Southeast Michigan COG—Detroit Monthly (for operators); quarterly (coordinating committee); and annually (program evaluation)
Houston, TX—Motorist Assistance Patrol Quarterly
Colorado DOT—Lakewood Monthly
Illinois DOT—Chicago Annually
City of Houston, TX Police Dept Daily; monthly
Connecticut DOT As needed basis; monthly

All of the agencies indicated that they also produce annual reports for their systems. These annual reports generally provide an overall summary of the performance of the system and give a "big picture" view of the effectiveness of the system. High-level administrators typically use these annual reports to provide justification for continued operation or expansion of their incident management programs. These reports are also used to identify high incident or "hot spot" locations.

Several agencies indicated that they would occasionally produce performance measure reports on individual or specific incidents. These reports are generally produced on an "as needed" basis and are used to critique the performance of the response agencies and to address problems with the responses to specific incidents. Generally, transportation agencies use these reports as a mechanism for improving coordination between response agencies.

Table 12. Uses for Performance Measure Reports
Agency How are these measures generally used in your system?
New Jersey DOT Feds look at it, not really used by DOT though
City of Phoenix, AZ—Fire Dept. 1) Response planning; 2) Budget planning; 3) Quality Assurance (10% detailed check); 4) Internal Assessment—by command officers, mostly fire side
Maryland State Hwy. Admin.—CHART To get funding (big picture report); identify "hot spots"
Texas DOT—Austin Access queries through Sybase
Minnesota DOT—Minneapolis Generally tracking trends; in past month or two started generating reports to track operators; use w/ media for political support
Caltrans—San Diego Automatically by the system software
Southeast Michigan COG—Detroit They are provided to the Incident Management Coordinating Committee, MDOT, and the FCP operators. They are also provided to the MSP, as requested, for selective enforcement. MDOT uses the information for determining the benefit of the FCP program and to obtain additional funding for expansion.
Colorado DOT—Lakewood Statistics, program justification
Illinois DOT—Chicago Incident frequency/rate used in justification of service patrol, used to determine locations for safety improvements
City of Houston, TX Police Dept. Not sure how they are used
Connecticut DOT Can be used to evaluate staffing schedules, determine high accident locations, and evaluate effective response time and performance.
Arizona DOT We use them to prove we are achieving our goals
Texas DOT—San Antonio Justify giving less money to ITS
Houston, TX—Motorist Assistance Patrol To determine success of program and deputy performance ratings.

Respondents were also asked to indicate whether they thought these performance reports were timely, useful, and accurate. Table 13 summarizes these responses. While most of the respondents generally felt the reports were timely and provided decision-makers with the appropriate level of information they need, a few questioned the usefulness (particularly from the viewpoint of the operators) and the accuracy of the information. Several respondents indicated that they did not exactly know how the higher-level administrators in their agencies actually used the information.

Table 13. Timeliness, Usefulness, and Accuracy of Incident Management Performance Measures
Agency In general, do you think the information in these reports or the performance measures themselves to be: Timely? Useful? Accurate? Do they provide the information necessary for effective decision-making?
New Jersey DOT Yes Yes (1) Yes No (2)
Arizona DOT No (3) Yes No (4) Yes
City of Phoenix, AZ—Fire Dept. Yes Yes (5) Yes Yes
Maryland State Hwy AdminCHART Yes Yes Yes Yes
Texas DOT—Austin No Yes Yes Yes
Minnesota DOT—Minneapolis Yes Yes (6) No (7) Yes
Caltrans—San Diego Yes empty cell Yes Yes
Southeast Michigan COG—Detroit Yes Yes Yes Yes
Houston, TX—Motorist Assistance Patrol Yes Yes Yes Yes
Colorado DOT—Lakewood Yes Yes Yes Yes
Illinois DOT—Chicago Yes Yes Yes Yes
City of Houston, TX Police Dept. Yes Not Sure Yes Not Sure
Connecticut DOT Yes Yes Yes Yes
1. Somewhat—not enough “meat” to be really useful, just break down number of incidents over and under one hour, by type, monthly average incident duration, etc.
2. Don't know enough to capture enough
3. Quarterly reports are up to 3 months behind today
4. It depends on where you get the data—somehow different people can find different numbers
5. For targeted audience
6. Over time
7. Based on operators' view—not as good as could be

Integration of Incident Records and Information

Agencies were also asked about the kinds of incident information other agencies kept and their efforts to use this other information to supplement data used to develop incident management performance measures. Their responses are summarized in Tables 14 and 15.

Although many agencies are aware of other sources of incident records (such as 911 dispatching logs), relatively few agencies indicated that they routinely integrate response information about incidents with other agencies (such as fire and police). Several agencies mentioned, however, that efforts were underway in their areas to integrate police and fire computer-aided dispatching (CAD) systems with their freeway management systems. These agencies anticipated that integrating 911 CAD dispatching with their systems should greatly enhance response and record-keeping capabilities.

Several agencies indicated that they do combine information (or harmonize information) with police and/or emergency response agencies on an "as needed" basis. Generally, this involves taking information from the transportation agency's logs and matching it with information on the police or fire incident report forms. In those few cases when this is done, it is generally done as part of a debriefing effort between agencies after a major incident or as part of the preparation for litigation. Generally, when this is done, agencies find the exercise to be fruitful in helping to establish a timeline of response events for a specific incident, which, in turn, allows them to more readily identify problems or bottlenecks in the response process.
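
The timeline-building exercise described above amounts to merging timestamped entries from each agency's log into one chronological record. A minimal sketch of that merge is shown below; the log entries, field layout, and event wording are illustrative assumptions, not any agency's actual log schema.

```python
from datetime import datetime

# Hypothetical, simplified log entries (timestamp, agency, event).
# The field layout and values are assumptions for illustration only.
dot_log = [
    ("2003-05-01 07:42", "DOT", "incident detected on CCTV"),
    ("2003-05-01 08:15", "DOT", "all lanes reopened"),
]
police_log = [
    ("2003-05-01 07:45", "Police", "unit dispatched"),
    ("2003-05-01 07:53", "Police", "unit arrived on scene"),
]

def build_timeline(*logs):
    """Merge timestamped entries from several agency logs into one
    chronological timeline for post-incident debriefing."""
    return sorted(
        (entry for log in logs for entry in log),
        key=lambda entry: datetime.strptime(entry[0], "%Y-%m-%d %H:%M"),
    )

timeline = build_timeline(dot_log, police_log)
for when, agency, event in timeline:
    print(when, agency, event)
```

Once merged, gaps between consecutive events (e.g., detection to dispatch) point directly at the bottlenecks in the response process that the debriefings are meant to surface.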

Issues Involved in Establishing an Incident Management System

Table 16 shows how various agencies responded to questions concerning the issues faced when establishing an incident management system. Common issues cited include the following:

  • Bringing agencies together to work in a coordinated and integrated fashion;
  • Expanding the system to meet new objectives or added functionality with limited resources;
  • Being the "new guy on the block" and having to establish a good working relationship with other response agencies;
  • Providing consistent training for all agencies responsible for responding to incidents;
  • Working with emergency services to strike a balance between providing a safe work environment for responders and maintaining traffic flow past the incident;
  • Maintaining security of the system and confidentiality of data without affecting performance or response;
  • Getting accurate information entered into databases without overburdening operators with too many data entry screens;
  • Asking operations centers to do too much with too few resources; and
  • Involving private towing industry in development of system.
Table 14. Other Sources of Incident Information in Jurisdiction
Agency Do other agencies (such as fire, police, DOT, etc.) keep similar information about incidents in your jurisdiction?
Kansas DOT—Kansas City State Police, Service Patrol
New Jersey DOT Police and fire keep information like number of incidents, but only part of the same information that the DOT collects
Arizona DOT No. They cover different aspects of the incident
Ohio DOT—Columbus Yes—police, service patrol
Tennessee DOT 911 center log—no interaction
City of Phoenix, AZ—Fire Dept. Yes—other fire departments in valley (outside jurisdiction)
Maryland State Hwy. Admin.—CHART Police and fire keep accident reports. All police reports go to DOT to look at for traditional statistics of accidents.
Texas DOT—Austin Have project to integrate ATMS with CAD system—automatically generate reports—operator will verify incident
Texas DOT—San Antonio Police—incident report on call, keep when they arrive on scene and when cleared; Fire—own method of notification, on file at district
Minnesota DOT—Minneapolis No. Now have CAD linked to State Patrol
Caltrans—San Diego No. Others do, but haven't tried to integrate
Southeast Michigan COG—Detroit Yes, I assume so but probably not to the degree SEMCOG does (with all the integrated data).
Houston, TX.—Motorist Assistance Patrols Yes, TxDOT
New York DOT State police use incident cards. Fire, EMS keeps records of dispatch, arrival, departure times but no traffic incident information.
Colorado DOT No
Texas DOT—Houston Please contact those agencies. Three law enforcement agencies, City and County Traffic, and METRO, the local transit authority, are also housed at TranStar. They have access to the incident database as well as access to input data. To the best of our knowledge they do not do so.
Illinois DOT—Chicago State police, service patrol
City of Houston, TX Police Dept. Yes—TxDOT, MAP
North Carolina DOT Police reports
Connecticut DOT Yes

Table 15. Integration of Incident Information with Other Agencies
Agency Do you integrate or compare information with other agencies? If so, When? If so, How Often? If so, How ? What are generally your findings when this occurs?
Kansas DOT—Kansas City No empty cell empty cell empty cell empty cell
New Jersey DOT Share information with Delaware regional planning organization, DOT planning unit for congestion management program empty cell empty cell empty cell empty cell
Arizona DOT No. They cover different aspects of the incident Partnering sessions between DPS and state Quarterly Given as a presentation with report as supporting documentation Does not change the state of how things are handled.
Ohio DOT—Columbus empty cell Haven't compared yet—requested that information six months ago and just now receiving data from City of Columbus public safety and police department to compare with service patrol; hope to show reduction in accident rates due to service patrol and TMC empty cell empty cell empty cell
City of Phoenix, AZ—Fire Dept. Yes January Annually (formal); informally more often (phone) Across all 26 cities in agreement, written copies to chiefs empty cell
Maryland State Hwy AdminCHART Starting to look at this w/ police and 911 centers empty cell empty cell empty cell empty cell
Texas DOT—Austin Yes As needed As needed Hardcopy—TMT response to specific incidents Information similar—similar time stamps, when responders showed up on scene. Records state change in TCD response
Texas DOT—San Antonio Hope to integrate with Police CAD system empty cell empty cell empty cell empty cell
Minnesota DOT—Minneapolis No. Now have CAD link to State Patrol Accident reports w/ Highway Patrol; MnDOT compares to State on as-needed basis empty cell empty cell Generally good. Lots of incidents are not accidents. See crashes that don't have accident reports. Stalls are a big incident source.
Caltrans—San Diego Yes For specific reason—may debrief after major incident; serve in court case Infrequently, rare empty cell empty cell
Southeast Michigan COG—Detroit Yes Whenever we can empty cell Using GIS Still being determined.
New York DOT Yes Can find out from state police (co-located)—time incident came in can be used to enter a more accurate detection time than the time stamp from MIST when entered (for major incidents) empty cell May get CAD system in future; would be able to query other agency activities. empty cell
Texas DOT—Houston empty cell empty cell empty cell empty cell Law enforcement does not share information readily with the DOT
City of Houston, TX Police Dept. No empty cell empty cell empty cell empty cell
North Carolina DOT Yes Varies—regular meeting in areas to critique incident management Monthly Meeting of interagency Committee Depends on area. Don't want to point fingers in area. Good information for improving response.

Table 16. Issues Faced in Setting Up Incident Management System
Agency What kinds of issues were faced when setting up the system and how were they resolved?
Kansas DOT—Kansas City Current system is an incident management manual. Manual is posted on website (www.kdot1.kfdot.org/public/kdot/kcmetro/kcindex). Website also includes press releases, lane closures, etc. Before, had problems with police/fire unnecessarily blocking lanes (e.g., fire blocking 2 lanes to extinguish a brush fire, police not clearing lanes fast enough); before, multiple agencies might respond to major incidents with no way to notify media, because each agency might want to use a different diversion route. Now 30 cities, 12 counties, and 2 states cooperate, using the incident manual Juanita developed. She talked to each agency before developing the manual to get input, then again after it was created to explain the need for prompt response and clearance. Manual has planned diversions for specific locations, a list of contacts, and also describes which agencies cover what, and when to notify other agencies, including other states and federal agencies. Manual is updated 2 times/year. All agencies receive e-mail notification of manual updates.
New Jersey DOT Have problems trying to expand. Feds are behind expansion 100 percent as is the MPO, but design wants to spend money for paving, etc.
Arizona DOT We went from a Phoenix-only based operation to a statewide center. Created institutional barriers within the state DOT as local employees started to handle statewide system issues. Financial barriers were encountered in the form of communications needs. Operations were found to be non-uniform across the state. Training for the handling of incidents was found to be inconsistent. Creation of standards for training.
Ohio DOT— Columbus It is going to take some time to develop a real collaborative effort with all of us to understand that we work for the same employer—the taxpayer. City police work real well on freeway, understand the importance of quick removal of lane blocking incidents. Have problems with the fire department blocking too many lanes (e.g., blocking three lanes for a one lane blocking incident). Had a recent event where multiple units on the side of the freeway with the incident blocked extra lanes. An additional fire unit arrived on the other side of the freeway and blocked the inside lane, they were not needed but remained on scene in the vehicle. Police did not make them clear the area. Have heard fire agencies in other areas act similarly, may need Washington to act to change. Need better communication system between agencies, currently using cell phones.
Tennessee DOT They are the "new guy". Initially, had warm welcome at scene. Has greatly improved over years. Quick clearance issues w/ fire dept. Trying to add this to fire training; Memorandum of understanding with TennDOT and local
City of Phoenix, AZ—Fire Dept. System very old, built like snowball (began in 1945 with chiefs meeting and sharing; 1960 expanded kept information; 1971 began paramedics; 1977 HAZMAT); At each expansion, obstacles were City Manager asking why greater funds; labor sees this as extra added to their job—collecting was a pain—automation has minimized this.
Maryland State Hwy. Admin.—CHART Hard to get code that is user (operator) friendly from contractor (off-the-shelf)—want to create custom software
Texas DOT—Austin How do we use the system—when/how do we pull information from the system
Texas DOT—San Antonio Security (keeping the system safe so someone can't corrupt the system) and confidentiality (displaying accidents without notifying family; police need more detailed personal information than traffic)
Caltrans—San Diego Too much to do; too little resources
Houston, TX—Motorist Assistance Patrols Funding—type of vehicles to use, type of services to offer; Funding—created a public/private partnership; Vehicles—Carrying capacity and safety of vehicle; Services—determined type of incidents that might occur while driving.
Colorado DOT—Lakewood Getting accurate information to database, increased training; Response/clearance times reduced now through cooperation with police. DOT has provided police units with courtesy patrol radios, so courtesy patrol can contact police directly from the scene if police involvement needed.
Texas DOT—Houston When the integrated incident management database was developed, input was requested of all TranStar partner agencies. This included law enforcement and transit. There were features requested by law enforcement that have never been used because they choose not to get involved in inputting data. However, incorporating these features expanded the database GUI beyond what was needed by TxDOT, causing operators to have to sift through more functions than were required. However, it was deemed that too much was better than too little.
Illinois DOT—Chicago Private towing industry complaints when starting up service patrol; those issues were ironed out over time. Some opposition to using tax dollars for service patrol, but have shown that the peak periods are shorter with the patrol than without. Been in the incident management business for 40 years; none of those guys left to talk to.
North Carolina DOT Turf battles between agencies—face-to-face talks

Most Important Things To Be Measured in Incident Management Program

As a final question in the survey, respondents were asked what were the most important things to be measured in an incident management program, whether or not they were currently collecting the particular performance measures. Their responses are contained in Table 17.

Almost all of the agencies agreed that monitoring time-related performance measures was important for gauging the success of an incident management program. Important time-related performance measures to be monitored include the following:

  • Response time,
  • Duration on scene,
  • Clearance times, and
  • Detection times.
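
These time-related measures are all differences between event timestamps recorded for an incident. A minimal sketch of how they might be computed is shown below; the field names and timestamp values are illustrative assumptions, not definitions drawn from any surveyed agency.

```python
from datetime import datetime

FMT = "%Y-%m-%d %H:%M"

# Hypothetical event timestamps for a single incident; the field
# names and values are assumptions for illustration only.
incident = {
    "occurred":          "2003-05-01 07:40",
    "detected":          "2003-05-01 07:42",
    "responder_arrived": "2003-05-01 07:53",
    "lanes_cleared":     "2003-05-01 08:15",
}

def minutes_between(start, end):
    """Elapsed minutes between two timestamp strings."""
    delta = datetime.strptime(end, FMT) - datetime.strptime(start, FMT)
    return delta.total_seconds() / 60

measures = {
    # occurrence -> detection
    "detection_time_min": minutes_between(incident["occurred"],
                                          incident["detected"]),
    # detection -> responder arrival
    "response_time_min": minutes_between(incident["detected"],
                                         incident["responder_arrived"]),
    # arrival -> lanes cleared (duration on scene)
    "clearance_time_min": minutes_between(incident["responder_arrived"],
                                          incident["lanes_cleared"]),
}
```

Aggregating these per-incident values (e.g., monthly averages) yields exactly the kind of report the responding agencies described producing.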

Many also cited the need to have performance measures that relate to the quality of the service being provided, or to quantify the ability of the system to monitor and effect a change in the traffic control. Several performance measures that agencies mentioned along these lines include the following:

  • The amount of delay caused by incidents in the system;
  • The road user costs associated with congestion caused by incidents;
  • The reduction in the overall delay caused by incidents;
  • The reduction in the total duration of the incident (how long lanes were blocked); and
  • The reduction in driving time of the public through incident scenes.
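
One common way to estimate the incident-caused delay mentioned above is deterministic queuing: while the incident reduces capacity below demand, a queue grows; after clearance, it dissipates at the rate by which full capacity exceeds demand, and total delay is the area of the resulting queuing triangle. The sketch below illustrates that textbook calculation with assumed, illustrative flow rates.

```python
def incident_delay_veh_hours(arrival, reduced_cap, full_cap, duration_hr):
    """Deterministic-queuing estimate of total delay (vehicle-hours)
    caused by an incident that reduces capacity for duration_hr.

    arrival, reduced_cap, full_cap are flow rates in veh/hr; assumes
    arrival < full_cap so the queue eventually dissipates.
    """
    if arrival <= reduced_cap:           # no queue forms
        return 0.0
    growth = arrival - reduced_cap       # queue growth rate (veh/hr)
    q_max = growth * duration_hr         # queue length when lanes reopen
    dissipate_hr = q_max / (full_cap - arrival)
    # total delay = area of the queuing triangle
    return 0.5 * q_max * (duration_hr + dissipate_hr)

# Illustrative values: 4000 veh/hr demand, capacity drops from
# 6000 to 2000 veh/hr for a half-hour incident.
delay = incident_delay_veh_hours(4000, 2000, 6000, 0.5)
```

Multiplying the result by an assumed value of time converts it to the road user costs that New Jersey DOT cites in Table 17.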
Table 17. Most Important Thing to Measure in Incident Management Program
Agency In your opinion, what are the most important things to be measured, whether or not you are currently collecting?
New Jersey DOT Delay caused by incidents; road user costs, B/C—how incident duration is reduced by ITS
Arizona DOT Notification, detection, response time, on-scene time, clear time, and closing of incident
Ohio DOT—Columbus It differs from urban area to urban area. The incident managers need to define their worst enemy, e.g., Hazmat, roadway geometries, weather, etc. and collect data before and after program implemented to show reduction in performance measures for program justification.
Tennessee DOT Time of clearance—moved to shoulder or exit; # of response units—make sure there aren't people there who don't need to be
City of Phoenix, AZ—Fire Dept. Time related measures; quality (of performance) related measures; info to tie performance to specific budget expenditures
Maryland State Hwy. Admin.—CHART More data you have, better off you are
Texas DOT—Austin Response time; traffic control device changes; when response is provided, who/how many need—right now, we are more interested in did we do something, and not necessarily when we did something; finding information and making sure public has access to it.
Texas DOT—San Antonio Incident detection time; power of system that allows you to make changes in system; ability of system to monitor system and recommend changes; quality of information (data)—direct impact on response; good PR program
Minnesota DOT—Minneapolis Response time; clearance time—when they arrive, when they are out of lanes, and when total clear; on-site measures to ensure scene safety
Caltrans—San Diego What decision-makers are doing; when is significant to people and decision-makers
Southeast Michigan COG—Detroit Clear times, time it takes to return to free flow conditions, time and locations of occurrences, location of abandoned vehicles
Houston, TX. Motorist Assistance Patrols Services offered, reduction in delays in driving time for the public due to traffic incidents
New York DOT Response time; clearance time; resumption to normal flow; times individual lanes opened/closed; secondary accidents—can be reduced if word of existing incidents gets out quickly
Texas DOT—Houston Accident: location, frequency, time of day, surface conditions; Detection: time, method, frequency; Response time; Clearance time; time required to dissipate the queue. Quantitative differences in these areas by type of incident
Illinois DOT—Chicago Cause and effect of incident; Incident type vs. congestion factor; Will be upgrading computers and software—new database should improve information data collection and reporting.
City of Houston, TX Police Dept. Time incident occurred; location—street and intersection; response time; clearance time; lane closure information
North Carolina DOT Incident duration; response by agencies; effectiveness of response
