Office of Operations
21st Century Operations Using 21st Century Technologies

5. Future Research

Traffic signal timing is not a trivial task. Even when the process is applied to a single, isolated controller, the path to optimum signal timing is usually paved with problems. The process of signal timing optimization has been the focus of much academic research over the years. As a result, the practicing Traffic Engineer has many optimization models to choose from when retiming traffic signals; Transyt-7F, PASSER, and Synchro are examples of these models.

The Task-A Report considered the entire signal timing process. It defined specific areas where progress has been made and identified the interfaces between these areas. This background provided the basis for the identification of specific areas for improvement. Notably, the Task-A Report identified five distinct procedures (Optimization, Deployment, Evaluation, Data Management, and Documentation) associated with the signal timing process. Each of these procedures was examined and evaluated. One of these procedures, Optimization, is considered well developed, with several excellent tools (PASSER, Transyt-7F, and Synchro) available to the Traffic Engineer. Because optimization models are readily available, this Task B effort concentrates on the other areas to identify procedures where integration and/or automation would be beneficial to the signal timing process.

The initial work effort surfaced several opportunities where targeted improvements in specific areas would likely lead to significant improvements in the effectiveness and/or cost of the overall signal timing process. This Task B report identifies the elements in the signal timing process where improvements to existing procedures or new procedures can enhance and strengthen the signal timing process.

The Task-A effort concluded that the "Signal Timing Optimization" element was the area that had seen the greatest research success and was therefore the area least likely to benefit from additional research. The emphasis for future research, therefore, should be placed on Data Management (including Data Collection and Data Structure), Field Deployment, and Performance Evaluation.

As this effort continued, the areas where signal improvements were needed were further refined to be: Data Collection, Data Management, Data Structure, and Intersection Performance Evaluation.

Following this Introduction section, this report provides a description of the four areas of future research and development where improvements are needed to enhance the signal timing process. Each area is discussed and specific recommendations are made for potential projects that can further refine the signal timing effort.

5.1 Data Collection

To time traffic signals, the data collection need is frequently reduced to acquiring turning movement counts. Many jurisdictions have informal, and sometimes formal, requirements for 12-hour turning movement counts as a necessary prelude to any signal retiming effort. While 12-hour counts are unquestionably a desirable resource, it is possible to generate good signal timing plans with less than this ideal input. This section presents several different ways that turning movement information can be generated for signal timing purposes.

5.1.1 Project 1 – Short Count Procedures

The objective of this project is to develop and prove the optimum technique for estimating peak period traffic flows from short-term observations. The emphasis in this project is placed on turning movement counts, specifically on procedures that a single person can follow to obtain accurate estimates of all intersection movements. A critical issue is to determine how many approaches a single person can observe simultaneously. Obviously, at low volume intersections, a single observer can count all traffic movements. At high volume intersections, this is not possible. The developed procedure, therefore, must allow for a single observer to count one or more traffic movements in sequence.

Many Traffic Engineers have procedures that they follow to collect "short counts". Some count for a fixed period, like five or ten minutes. Some count for a fixed number of cycle lengths. There is no definitive methodology that describes an optimum technique to obtain estimates of peak period flows given short time observations.

Collecting turning movement counts is simple enough; it is just not inexpensive. Turning movement counts typically cost in the range of $500 to $1,000 per intersection or more. Converting the raw count data into a format that is useful for analysis can also add a substantial cost.

This is an area where significant progress has been made. For example, one vendor, Jamar Technologies Inc., makes an electronic data collection board that is easy to use, accurate, and reliable. Although an observer is still required to record the movements, once the observations are completed, the data are easily uploaded to a computer for further processing. In this case, data entry and manipulation of the data are minimal.

One way to reduce the expense of data collection is to reduce the time required to conduct the counts. Many traffic engineers use "short counts" to develop signal timing plans. Short counts are normal turning movement counts that are conducted over shorter-than-normal periods. Different agencies follow different procedures in conducting these short counts. There is a need for a defined, research-supported process to guide practitioners in conducting short counts; the process should be subjected to a sensitivity analysis.
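The expansion step common to most informal short-count procedures can be illustrated with a brief sketch. The linear scaling below is an assumption for illustration only; quantifying its error, and the count duration needed to control that error, is precisely the research question this project would address.

```python
def expand_short_count(count, observed_minutes, target_minutes=60):
    """Scale a short-duration movement count to an hourly flow rate.

    This is the simple linear expansion many practitioners use; it
    assumes the observed flow rate is representative of the full
    period, which is exactly the assumption that needs testing.
    """
    if observed_minutes <= 0:
        raise ValueError("observed_minutes must be positive")
    return count * target_minutes / observed_minutes

# Example: 42 left turns observed in a 10-minute window
hourly = expand_short_count(42, 10)   # -> 252.0 vehicles/hour
```

A sensitivity analysis would compare such expanded estimates against full-period counts over a range of count durations and volume levels.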

5.1.2 Project 2 – Adapt NCHRP 255 Procedures to Signal Timing

The National Cooperative Highway Research Program developed techniques for estimating traffic demand. These techniques are described in NCHRP 255, "Highway Traffic Data for Urbanized Area Project Planning and Design", Chapter 8. This program derives forecast turning movements using an iterative approach, which alternately balances the inflows and outflows until the results converge (up to a user-specified maximum number of row and column iterations).
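The alternating row/column balancing described above is, in essence, iterative proportional fitting. The sketch below illustrates the idea; it is a minimal illustration and not the NCHRP 255 implementation itself, and the convergence tolerance shown is an assumed value.

```python
def balance_turning_movements(seed, inflows, outflows, max_iter=50, tol=0.5):
    """Iteratively balance a turning-movement matrix so that row sums
    approach the approach (inflow) totals and column sums approach the
    departure (outflow) totals -- the alternating row and column
    scaling described in NCHRP 255, Chapter 8.

    seed[i][j] is an initial estimate of flow from approach i to exit j.
    """
    m = [row[:] for row in seed]
    rows, cols = len(inflows), len(outflows)
    for _ in range(max_iter):
        # Scale each row to match its inflow total.
        for i in range(rows):
            s = sum(m[i])
            if s > 0:
                f = inflows[i] / s
                m[i] = [v * f for v in m[i]]
        # Scale each column to match its outflow total.
        for j in range(cols):
            s = sum(m[i][j] for i in range(rows))
            if s > 0:
                f = outflows[j] / s
                for i in range(rows):
                    m[i][j] *= f
        # Converged when every row sum is within tol of its target.
        if all(abs(sum(m[i]) - inflows[i]) <= tol for i in range(rows)):
            break
    return m
```

For the iteration to converge, the inflow and outflow totals must be consistent (equal grand totals); the maximum iteration count plays the role of the user-specified limit on row and column iterations mentioned above.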

Dowling Associates, Inc., a traffic engineering and transportation planning consulting firm based in Oakland, California, developed a program, TurnsW, that forecasts turning volumes from existing turning movement volumes and forecast future approach and departure volumes. If observed turning volumes are not available, then the estimated turning percentages of the future year assigned inflows can be used. The user may "lock in" pre-determined volumes for one or more of the forecast turning movements. The program will then compute the remaining turning volumes based upon these restrictions.

While neither this program nor the procedure described in NCHRP 255 was developed with signal timing in mind, the process of estimating turning movement flows from estimates of intersection input and output flows is very useful for signal timing exercises. For near real-time traffic flow estimates, the inflows and outflows can be provided by system detectors. For off-line optimization, traffic flow demand in networks can be developed from link directional counts. It is our opinion that this is an area where significant progress can be made in the overall signal timing process.

This project would generate a program like TurnsW that could expand counts from one intersection to a network and use the iterative process defined in NCHRP 255 to estimate traffic flows for a linear network of intersections.

5.1.3 Project 3 – Estimate Turning Movements from Detectors

A research project[17] conducted by Martin developed and evaluated a model, Turning Movement Estimation in Real Time (TMERT), that infers unknown traffic flows (intersection turning movements) from measured volumes in sparsely detectorized networks.

The model has demonstrated its ability to apply the algorithm, minimizing a weighted objective function to balance nodal continuity throughout a network and accurately estimate turning movements. TMERT has also shown its repeatability on a second network, producing coefficients of determination (r²) above 0.90.

This project would expand on the work conducted by Martin et al. and determine whether the process can be simplified from a complex linear programming research model to a practical application that can be interfaced with systems typically deployed in the United States.

5.1.4 Project 4 – Timing Plan Need Determination

Most Traffic Engineers consider four plans to be the minimum required for proper signal operation: an AM Peak Plan, an Off-Peak Mid-Day Plan, a PM Peak Plan, and a Night Plan. The minimum need, therefore, is to have a turning movement count for each of these four periods; but what about weekends, special events, and emergency evacuations? Does the system need a distinct timing plan to serve Saturday shopping traffic and, if so, what hours should this plan be used?

The purpose of this proposed project would be to develop a methodology that the Signal Timing Engineer could follow to address these issues. It is likely that a new timing plan would be needed whenever the system experiences a "significant change in demand", similar to the "Traffic Responsive Mode" in a closed loop signal system. The project would address efficient ways to measure and estimate demand and would generate a standard means to identify significant changes. Notice that the word "significant" in this case is not used in the statistical sense. For low to moderate demand conditions, it is anticipated that there could be large changes in demand; but if this variable demand can all be accommodated by one signal plan, then there is no need to develop a new timing plan. The inverse is also true: once demand is close to capacity, relatively small changes in demand could require a new signal timing plan. This project would investigate these issues.
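The capacity-dependent notion of "significant" described above can be made concrete with a simple sketch. The threshold values and the v/c breakpoint below are illustrative assumptions only; establishing defensible values is one of the questions this project would investigate.

```python
def plan_change_needed(current, design, vc_ratio,
                       low_threshold=0.25, high_threshold=0.10):
    """Flag a "significant" demand change relative to a plan's design
    volumes.  Reflecting the discussion above, the tolerance tightens
    as demand approaches capacity: near capacity (high v/c) a small
    deviation matters, while at low demand a larger deviation can be
    absorbed by the existing plan.  All thresholds are illustrative.
    """
    threshold = high_threshold if vc_ratio >= 0.9 else low_threshold
    deviation = abs(current - design) / design
    return deviation > threshold

# A 15% swing is tolerable at low demand but not near capacity:
plan_change_needed(1150, 1000, vc_ratio=0.5)    # -> False
plan_change_needed(1150, 1000, vc_ratio=0.95)   # -> True
```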

5.1.5 Project 5 – Traffic Demand Network Procedure

Turning movement counts are collected at specific intersections. Before using this information, many traffic engineers plot the turning movements on a map of the network. This network map is very useful in identifying errors in data collection that otherwise would be difficult if not impossible to find. Plotting these data by hand, however, is a very time-consuming and error-prone activity.

The purposes of this Project are threefold: to prepare a computer program (possibly an Excel spreadsheet) that allows the user to efficiently define the network, to ease entering the turning movement data, and to display the results graphically.

This process would employ logic to identify network data problems (for example, an input at one location that appears lower or higher than the inputs at neighboring locations) and to suggest a remedy that would "balance the network." This step would allow the user to override particular movements and have the system adjust the remaining movements. This step would also allow the user to easily do the following:

  • Change the demand in a particular direction (e.g., southbound) by a constant or a percentage.
  • Change the demand in the entire network by a constant or a percentage.
  • Freeze the demand in a particular direction (e.g., southbound) while changing the other flows by a constant or a percentage.
  • When the user is satisfied with the network demand flows, generate an output that can be readily used by PASSER, Transyt-7F, and Synchro.
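The adjustment operations listed above could be sketched along the following lines. The data structure and function names here are hypothetical; a production tool would operate on the full network definition rather than a flat dictionary.

```python
def scale_direction(flows, direction, factor=1.0, constant=0.0, frozen=()):
    """Adjust network demand flows in place.

    flows     : dict mapping (intersection, direction) -> volume
    direction : e.g. 'SB'; use None to adjust the entire network
    frozen    : directions to leave untouched (the "freeze" option above)
    """
    for (node, d), v in flows.items():
        if d in frozen:
            continue
        if direction is None or d == direction:
            flows[(node, d)] = v * factor + constant
    return flows

flows = {(1, 'SB'): 400, (1, 'NB'): 300, (2, 'SB'): 500}
scale_direction(flows, 'SB', factor=1.10)                 # grow southbound by 10%
scale_direction(flows, None, constant=50, frozen={'SB'})  # add 50 everywhere else
```

Once the user is satisfied with the adjusted flows, the final export step would write them in the input formats expected by the optimization models.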

5.2 Data Management

Traffic signal timing parameters do not exist in a vacuum; the parameters must be installed in hardware on the street. The Signal Timing Process must produce results that can be installed in this hardware, which implies that data management issues must be evaluated within the context of traffic control systems. Two developments will drive data management in the future: the Advanced Transportation Controller and NTCIP. While these two developments will define Data Management in the future, much of the existing traffic hardware in use today will still be in use ten years from now. It is likely that 25% to 50% of the hardware deployed today will still be in service 20 years from now. Our concern, therefore, is to address Data Management not only from the perspective of the future, but also from that of legacy hardware, which will likely be in use for many more years.

This section begins with a view to the future with a discussion of the "Advanced Transportation Controller" and NTCIP, and concludes with a discussion of how Data Management can be improved with existing systems.

The Advanced Transportation Controller (ATC) is being developed to provide an open architecture hardware and software platform for a wide variety of ITS applications. In this context the words "open architecture" mean that the system will include both public and private sector developers and have modular software cooperatively running on standardized and shared modular hardware platforms. This will provide cost-effective ITS functionality for a wide variety of applications. To accomplish this goal the system must provide the maximum flexibility for many different system configurations and installations. The general concept and model for the ATC is the personal computer. However, the ATC will be a field-hardened, general-purpose computer for embedded applications which, with the appropriate software and hardware modules, could be asked to perform many different duties.

One of the largest component costs of today's systems is the development, testing, deployment and maintenance of applications software. As the current trend continues towards distributing more of the intelligence of ITS out closer to the field, there is an increasing demand for more capable field deployable devices. This hardware must run more sophisticated applications software and operate in modern networking environments. The ATC is intended to address these needs.

The ATC is intended as a next-generation "Open Systems" controller in which hardware interfaces are generically defined, standardized, and adopted by multiple manufacturers, following the "Open Systems" lineage of the ATC 2070, the California Model 170, and the New York Model 179 controllers. "Open Systems" in this context refers to the concept of separating hardware from software by standardizing the interface between the two. This allows software to be developed independent of the hardware. "Open Systems" help protect an agency's investment by guarding against premature obsolescence due to a manufacturer's discontinuance of a particular controller.

The key to the ATC software is the use of APIs. An API (Application Programming Interface) is the means by which an application program accesses operating system and other services. An API is defined at the source code level and provides a level of abstraction between the application and the kernel (or other privileged utilities) to ensure the portability of the code.

As the ATC develops, the software specifications will evolve in parallel. The software is planned to have a two-layer Application Programming Interface (API). The Layer 1 APIs provide the linkage to the hardware; they are the interface between the hardware and the Layer 2 APIs. The Layer 2 APIs, in turn, provide the linkage between Layer 1 and the application software. As applications software is developed for the ATC, the Data Management needs will become known.

The other significant development that impacts controller Data Management is the NTCIP. The National Transportation Communications for ITS Protocol (NTCIP) is intended to provide a commonality among systems that will allow devices from one vendor to readily interface with devices provided by another vendor. Recent developments with NTCIP have set the stage for the next step in the evolution of intersection traffic control. These developments will have a significant impact on the signal timing process.

NTCIP is being developed as a family of protocol components that will establish interface standards between traffic management systems and their associated field devices. Traffic signal systems were the initial inspiration for NTCIP, and also the most difficult to fully implement.

As with all standards, NTCIP seeks to define common interfaces to achieve interoperability with other kinds of devices and interchangeability with other brands of signal controllers. Interchangeability requires that the semantics of signal controller settings be fixed, so that they mean the same things across the industry. Of course, fixing those settings also fixes how they work, and on the face of it this leaves little room for new algorithms. For example, NTCIP data objects have been defined to communicate all the conventional gap-acceptance parameters, including extension times, volume-density settings, minimum and maximum green times, and so on. No objects exist, however, to define queue length or delay, though these parameters may prove central to new algorithms based on new detection capabilities.

The structure of NTCIP allows software vendors to use data objects of their own definition to provide special features not available across the industry. The goal of NTCIP is to define interface standards, not operational standards, and therefore its scope is limited to currently and widely available functionality. While NTCIP holds great promise for the future, it is important to recognize that for most users, the signal timing process must be operable with legacy equipment: the hardware that is currently deployed and is likely to remain in service for many years to come.

Many of the NTCIP standards use a Management Information Base (MIB). Many of the NTCIP standard documents contain sections of text that look like a computer program; in fact, for the standards with "Object Definitions" in their title, this "computer text" makes up the largest part of the standard. The MIB describes the organization of a database that will be created in the memory of the computers where it is installed. The MIB databases will be used to store information, which in turn will be used to control the traffic signals and other devices in a transportation management system. The MIB is a text document that can be read by a human and "compiled" by a computer; "compiled" here means converted from readable form into the special instruction language used by a computer.

The future of traffic controllers in the United States, and their Data Management needs, will be determined by the evolution of the ATC and the NTCIP, especially the development of the MIBs.

While the future of Data Management is focused on a single path, the existing legacy is anything but. Each system in current use today requires different inputs, and each input is handled in a different manner. The challenge is to identify the common factors and to support the common data needs. This Data Management structure must provide the Traffic Engineer with an efficient mechanism to input data as well as an efficient interface to output the data.

Traffic data management has both a spatial and a temporal component. The spatial component determines where the data can be used. For example, data collected between two intersections can be useful in estimating turning movement data at the two intersections. In this example, the spatial aspect involves three different locations: the location of the count and the locations of the two intersections.

The temporal dimension is important from two aspects: quantity and descriptive characteristic. The quantity is simply a byproduct from the fact that traffic demand changes significantly over the course of a day. The traffic signal timing process, whether manual or automated, requires demand estimates that are representative of periods within the day, the AM Peak Hour for example. Because these periods of relatively constant demand are different at different locations, it is necessary to collect data over significant periods of the day. In addition, to be useful, the data must be aggregated in short periods, such as 15-minute periods. The spatial and temporal requirements combined imply that the number of data elements necessary to support the signal timing process amounts to a very large database.
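An illustrative calculation suggests the scale involved. The system size and retention period below are assumptions chosen only to convey magnitude, not requirements from this report.

```python
# Illustrative sizing only -- every count below is an assumption.
intersections = 200          # signals in a mid-size system
movements = 12               # turning movements per intersection
periods_per_day = 24 * 4     # 15-minute aggregation intervals
days_retained = 365

records_per_year = intersections * movements * periods_per_day * days_retained
print(records_per_year)      # 84,096,000 data elements per year
```

Even under these modest assumptions, a single year of 15-minute movement data runs to tens of millions of elements, which is why a deliberate database design is needed.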

Traffic data exists in many forms, from turning movement counts to road-tube counts to system-generated detector counts. The challenge is to integrate these existing and on-going incoming resources into a database that can be used to time traffic signals. To this end, we have identified the four projects described below.

5.2.1 Project 6 – Data Scrubbing

Every Traffic Engineer wants timely, detailed, and accurate traffic flow data to use as a base for signal system operation. What every Traffic Engineer actually has is something less than this ideal. Turning movements are typically available, but not for every intersection; they frequently have been collected over a period of months if not years, and they may be reported in different time increments, some in 15-minute intervals, some hourly.

The challenge is to develop a procedure to update and aggregate this data into a single representation of a traffic demand condition for the network – a PM Peak Period, for example, showing all major traffic flows.

This procedure will balance the outflows from one intersection with the inflows of the adjacent intersections to assure that the modeled traffic demands are representative of the real traffic flow data. The primary inputs to the procedure are existing Turning Movement data and directional link flow data (road-tube and detector counts). The procedure would make use of the results of Project 2, the simulated turning movement program similar to the TurnsW program developed by Dowling Associates, Inc. to fill in missing intersection data. In effect, this project extends the results of Project 2 to the entire traffic signal system network.
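One elementary balancing rule can be sketched as follows: on a link with no intermediate access points, the upstream outflow and the downstream inflow should agree, so discrepant counts can be averaged and gross disagreements flagged for review. The averaging rule and the discrepancy threshold below are illustrative assumptions, not a validated procedure.

```python
def balance_link(upstream_out, downstream_in, max_discrepancy=0.10):
    """Reconcile counts at the two ends of a link with no intermediate
    access points, where the flows should in principle be equal.

    Returns a single balanced volume, or raises if the counts disagree
    so badly that one of them is probably erroneous -- the kind of data
    problem a network plot would otherwise be needed to catch.
    """
    avg = (upstream_out + downstream_in) / 2
    if avg > 0 and abs(upstream_out - downstream_in) / avg > max_discrepancy:
        raise ValueError("counts differ by more than %.0f%%; review the data"
                         % (max_discrepancy * 100))
    return avg

balance_link(980, 1020)   # -> 1000.0
```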

5.2.2 Project 7 – Data Aging and Resolution

As we noted above, every Traffic Engineer wants timely, detailed, and accurate traffic flow data. The available data, however, are typically anything but timely. At issue is: how old is too old? In some situations, where the land is developed and employment is steady, there is little change in traffic demand from year to year; in other areas traffic demand can change dramatically in a single day, when a new super store opens or a major employer shuts down, for example. One of the objectives of this project is to investigate the factors that impact traffic flows and to develop guidelines for the practicing Traffic Engineer to identify when existing data are adequate, when they can be updated, and when they are hopelessly obsolete and must be regenerated. In situations where the existing data can be updated, this Project will develop procedures that can be used to update the data.

A closely related topic, Data Resolution, is the second part of this Project. In many jurisdictions, there is an abundance of traffic data ranging from 15-minute counts to Annual Average Daily Traffic (AADT) counts. Much of these data are collected by agencies other than Traffic Engineering for various purposes, usually related to City Planning and Commercial Development. Because these data may be available and useful to the Traffic Engineer, this element of the Project will investigate the various sources of traffic data typically available in an urban jurisdiction and how these data can be used for signal timing. In many cases, this issue resolves into a problem of data resolution. That is, the data reported by other agencies may be "per day", but the data may have been generated "by hour" or "by 15 minutes"; this finer resolution would be far more useful to the Traffic Engineer. If the data are only available on an aggregated basis, however, it may be possible to develop a procedure to estimate flows on a more disaggregate basis. These are the issues that would be investigated in this Project.
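One possible disaggregation approach is sketched below: apportioning a daily total using an hourly distribution borrowed from a comparable count location. The distribution factors shown are illustrative assumptions; validating such borrowed profiles would be part of this Project.

```python
def disaggregate_daily(daily_volume, hourly_fractions):
    """Estimate period volumes from a daily total using an assumed
    distribution (e.g. factors borrowed from a nearby permanent count
    station).  The fractions must sum to 1.0.
    """
    if abs(sum(hourly_fractions) - 1.0) > 1e-6:
        raise ValueError("hourly fractions must sum to 1.0")
    return [daily_volume * f for f in hourly_fractions]

# Three-period toy distribution (a real profile would have 24 entries):
disaggregate_daily(12000, [0.5, 0.3, 0.2])   # -> [6000.0, 3600.0, 2400.0]
```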

5.2.3 Project 8 – Extended Signal Timing Manual

Most of the information available in this general area is provided by vendor user manuals. These manuals describe how each parameter functions; they do not tell the user how to use the parameter. For example, the Extension Time, Time-To-Reduce, and Minimum Gap are the three parameters that support the gap reduction feature. While all manuals tell the user how to input the three parameters and what parameter range is supported by the system, no vendor manual tells the user when to use the gap reduction feature, nor do the manuals guide the user to the optimum values for these parameters.

While all controller suppliers provide gap reduction features, some provide these features with different parameters than others. This is illustrated below in Figure 7.

Figure 7. Gap Reduction Parameters.

With some controllers, the "Time to Reduce" is a direct input; with others, this parameter is implied by setting a "Maximum Gap" parameter. This project will examine all parameters used by actuated controllers deployed in the United States, provide a lucid description of how each parameter can be used, and describe the expected impact of each parameter on traffic flows.
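Under the parameter definitions commonly found in vendor manuals, the interaction of these settings can be sketched as follows. Parameter names vary by vendor, and the linear reduction shown is the typical scheme rather than a universal one.

```python
def allowed_gap(t, passage_time, time_before_reduce, time_to_reduce,
                minimum_gap):
    """Allowable vehicle gap at time t (seconds) into the green, under
    the common gap-reduction scheme: hold the full passage (extension)
    time until the time-before-reduce period expires, then reduce
    linearly over the time-to-reduce period down to the minimum gap.
    """
    if t <= time_before_reduce:
        return passage_time
    elapsed = t - time_before_reduce
    if elapsed >= time_to_reduce:
        return minimum_gap
    slope = (passage_time - minimum_gap) / time_to_reduce
    return passage_time - slope * elapsed

# Passage 3.0 s reducing to 2.0 s over 20 s, starting 10 s into green:
allowed_gap(0,  3.0, 10, 20, 2.0)   # -> 3.0
allowed_gap(20, 3.0, 10, 20, 2.0)   # -> 2.5
allowed_gap(40, 3.0, 10, 20, 2.0)   # -> 2.0
```

For controllers that accept a "Maximum Gap" instead of a direct "Time to Reduce", the reduction interval is implied by the same relationship solved for time.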

5.2.4 Project 9 – Signal Timing Field Adjustment Techniques

Once the hardware is determined to be operating correctly, the next task is to determine if controller timing parameters have to be adjusted to respond to changes in traffic demand. Many times, a simple adjustment of one parameter may be all that is necessary. It may be possible to accommodate longer queues on the main street, for example, by simply advancing the Offset by several seconds. Other timing problems can be resolved by simple adjustments to the Minimum Green or Vehicle Extension parameters.

Typically, problems with signal settings are only visible to the trained and experienced traffic signal engineer. To most, the problem is usually attributed to too much traffic. The objective of this project is to identify and define the characteristics of traffic responses to traffic signal operation that indicate a problem that can be ameliorated by changing signal settings.

In effect, this project will provide the knowledge base that is commonly used in the development of an Expert System. The primary goal of an expert system is to make expertise available to engineers and technicians who need answers quickly. There is never enough expertise to go around – certainly it is not always available at the right place and the right time. Portable computers loaded with an in-depth knowledge of a specific topic can bring decades of knowledge to the problem.

The primary goal of this project is to develop a dataset representing an expert's responses to traffic signal timing problems. Later, this problem domain can be used with a knowledge acquisition tool and converted automatically into a knowledge base that can solve the problem within an expert system shell.
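The kind of knowledge base envisioned here can be suggested with a toy sketch. Both the symptom labels and the remedies below are illustrative assumptions, not validated engineering guidance; eliciting and validating real rules from experts is the substance of the project.

```python
# A toy symptom -> remedy rule base of the kind the project would
# elicit from experts.  All entries are illustrative assumptions.
RULES = [
    ({"queue_spillback_main", "coordinated"},
     "Advance the offset a few seconds"),
    ({"phase_gaps_out_early", "low_volume"},
     "Increase Minimum Green or Vehicle Extension"),
    ({"phase_maxes_out", "heavy_volume"},
     "Re-examine splits; consider a new timing plan"),
]

def advise(symptoms):
    """Return remedies whose required symptom sets are fully matched."""
    return [remedy for required, remedy in RULES
            if required <= set(symptoms)]

advise(["phase_gaps_out_early", "low_volume"])
# -> ['Increase Minimum Green or Vehicle Extension']
```

An expert system shell would add what this sketch omits: uncertainty handling, explanation of its reasoning, and interactive questioning of the user.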

There is no easy path to expert system development. Understanding and communicating expertise are not easy tasks. Knowledge acquisition tools are designed to support and guide the expert. Any real system development involves exploration, false starts, inadequate datasets, and so on. The expert has to learn the skills of expertise transfer through trial and error. This effort, however, could have significant benefits that could result in much better signal timing at the average intersection.

One way to begin building this knowledge base would be to gather known traffic signal timing experts at a workshop. The participants would be invited to share their experiences, problems, and solutions, and the results could be published in a "Best Practices" compendium. If the approach appears sound, then additional steps would be taken to begin building a functioning Expert System for Signal Timing.

5.3 Data Structure

While the development of NTCIP has in large part been spearheaded by the public sector, there have been other developments in the private sector that provide a common denominator among the various simulation and optimization programs. One of the most important of these is the Universal Traffic Data Format currently used by the Trafficware Corporation. This significant recent development has not only expedited data input to the models, but has also facilitated transferring the optimized results to the traffic control systems.

The Universal Traffic Data Format (UTDF) is an open standard data format specification for traffic signal and traffic related data for intersections that has been developed and promoted by Trafficware, the developers of Synchro and SimTraffic. UTDF can be used to efficiently transfer data between traffic software packages. UTDF can also be used to share data between software and traffic signal controller hardware. UTDF contains the ability to store multiple volume counts and timing plans for multiple intersections. This allows for a structured method of storing large amounts of traffic data, and a significant reduction in data entry of signal timing parameters.

UTDF allows data to be shared between otherwise incompatible software packages. It is anticipated that many software developers will support UTDF. In this scenario data is entered once and then used by all the software together. It is possible for planning departments to store traffic counts for various scenarios and use them for capacity analysis as well as other purposes. With UTDF-compatible software it could be possible for planners to completely automate traffic impact studies for future development and roadway improvements.

Text files are easy for end users to edit with any text editor, such as Windows Notepad™. The column-aligned format is provided for compatibility with Turning Movement Count (TMC) files and for easy editing with text editors. The comma-delimited text files (CSV) can also easily be viewed and edited in spreadsheets such as Microsoft Excel. The user or software developer is free to choose the most convenient format.
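Because the comma-delimited files are plain text, they are straightforward to process programmatically. The sketch below uses a hypothetical column layout loosely patterned on the UTDF style; the actual column set and record types are defined by the Trafficware specification, not by this example.

```python
import csv
import io

# Hypothetical volume-record layout (intersection ID, date, interval
# start, and northbound movement counts) -- NOT the actual UTDF schema.
sample = """INTID,DATE,TIME,NBL,NBT,NBR
101,2003-05-01,07:00,45,310,28
101,2003-05-01,07:15,52,335,31
"""

def read_volumes(text):
    """Parse comma-delimited volume records into dictionaries with
    integer movement counts."""
    rows = list(csv.DictReader(io.StringIO(text)))
    for row in rows:
        for key in ("NBL", "NBT", "NBR"):
            row[key] = int(row[key])
    return rows

records = read_volumes(sample)
records[0]["NBT"]   # -> 310
```

The same ease of parsing is what allows otherwise incompatible software packages to exchange counts and timing plans through a shared text format.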

All of these systems support multiple traffic signal plans that can be called by time of day and by traffic flow measures. All of these systems can also measure traffic flow rates from sensors installed in or over the roadway. By combining these two features with the interface to Synchro, one can claim a true "closed loop" system. It works like this: data are collected for a particular period by the system and electronically transferred to Synchro using the UTDF format. Synchro is executed and optimum timing parameters are generated. These parameters are converted to system input parameters and are electronically transferred from Synchro back to the traffic control system. This flow of information from the street, to the optimization model, and back to the system is called 1½ Generation Traffic Control. This capability is available with most systems currently deployed.

Although this capability exists, it is not often used. One reason is that few systems have enough instrumentation to actually derive new timing plan data. Another reason is that although the capability is inherent in the system design, few vendors are promoting this capability.

While there is considerable promise to improve the signal timing process in this general area of parameter conversion, the most significant advances have been made by the private sector responding to competitive pressures. This area is very difficult to address because it is basically a linkage between two packages that are in the private sector, Synchro and QuicNet/4, for example. There are other examples that we could cite that are comprised of a linkage between a public sector program (Transyt-7F or Passer II) and a private sector system, ACTRA for example. Perhaps the best contribution to be made in this area is to support training programs that encourage better use of the capabilities of systems.

5.3.1 Project 10 – UTDF

The objectives of this proposed project are twofold: to investigate the use of UTDF to support developing controllers based on NTCIP and the results of the ATC program, and to investigate the use of UTDF to support existing controllers, at least those currently deployed in significant numbers throughout the United States.

5.4 Intersection Performance Evaluation

There are no existing manual or automatic tools available for use by the Traffic Engineer to evaluate the performance of a signalized intersection in real-time. The Engineer can stand on the corner and observe, or the Engineer can estimate the performance using one of the simulation tools available. "Controller in the Loop" simulation is one approach that has emerged in recent years that helps to bridge the gap between the real-time world and the simulation world.

With this approach, the software simulation model generates vehicles, which activate simulated detector calls that are sent to the controller. The controller uses this information to decide the signal phase of the intersection and sends this information back to the software model. The software model displays the current signal indications on the screen along with the vehicles in the network, which stop and go according to the signal phase. Meanwhile, the software calculates MOEs such as vehicle delay time, queue measurements, speed, and volume. Once the real-time simulation is completed, the MOE data compiled by the simulation software can be analyzed.
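The cycle just described, detector calls out, phase decisions back, delay accumulated by the simulator, can be illustrated with a toy loop. The two-phase decision rule, the detector names, and the one-step delay accounting below are all invented for illustration; a real controller-in-the-loop setup exchanges these messages with a physical controller.

```python
# Toy controller-in-the-loop cycle. The "controller" here is a stub
# standing in for real controller hardware; all logic is illustrative.

def controller_decide(calls):
    # Serve the side street only when it has a call and the main
    # street does not; otherwise hold the main-street phase.
    if calls["side_street"] and not calls["main_street"]:
        return "side"
    return "main"

phase = "main"
delay = 0
detector_log = [
    {"main_street": True,  "side_street": False},
    {"main_street": False, "side_street": True},
    {"main_street": False, "side_street": True},
]

for calls in detector_log:
    phase = controller_decide(calls)           # controller responds
    # Any active call not on the served phase accrues one step of delay,
    # a crude stand-in for the MOEs a simulator would compute.
    delay += sum(1 for k, v in calls.items()
                 if v and not k.startswith(phase))

print(phase, delay)  # side 0
```

In a real deployment the `controller_decide` step is replaced by the physical controller's response to the simulated detector inputs, which is exactly the gap-bridging benefit the paragraph above describes.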

In recent years, microscopic traffic simulation has become an integral part of transportation and traffic planning, evaluation, and research. This technology has advanced greatly over the past decade but there remains a gap between traffic simulation and real traffic operation. Software-generated traffic simulations can never replicate real traffic conditions exactly. A clear reason for this inaccuracy is that the emulated traffic signal control logic in the simulation model in many cases is unable to replicate real traffic signal control exactly.

The concern, therefore, is to refine existing methods, or develop new methods, to evaluate intersection performance in real-time. Two projects address this issue. One considers the problem from an external perspective: an observer (or machine) would measure performance independently of the intersection controller. The second considers the problem from an internal perspective: the intersection performance would be evaluated using data that is (or could be) available to the controller.

5.4.1 Project 11 – External Intersection Performance Evaluation

The criteria used initially to diagnose the problem are arbitrary and rely on the experience of the Signal Timing Engineer to make the correct decision to rectify the problem. There is a need to better define the diagnostic process to enable more consistent performance in determining the extent of the problem. This need extends not only to the initial identification of the problem, but also to the evaluation of the adjustments made to solve the problem.

Once the adjustments are completed, the existing process still relies on the experience of the Signal Timing Engineer to judge that the adjustments are an improvement ("Looks OK"). The need is to formalize this evaluation to enable more consistent performance by non-expert personnel. One approach would be to extend the Expert System approach defined in Project 9 to include the evaluation phase.

Another approach would be to identify specific points in the signal timing process where objective criteria can be employed to reduce subjectivity to a minimum. This improvement requires clearly defined steps for the tasks that are performed manually (adjust and observe), so that new practitioners have a set of guidelines to follow. It would also focus on documentation (recording timing plan changes) and determine ways to improve this activity.

5.4.2 Project 12 – Internal Intersection Performance Evaluation

As noted above, evaluating intersection performance is more often than not very arbitrary: what looks OK to one engineer may very well not look OK to another. One feasible alternative for evaluating signal timing performance is simulation.

While most simulation models provide the same measures of effectiveness, their values and interpretation frequently differ from model to model given identical inputs. This is not an unexpected result since the models use different assumptions and different algorithms to derive the estimates. During the last few years, researchers have compared the models to each other and to ground truth to try to determine which provides the most accurate estimates.

Mystkowski and Khan[18] compared queue length estimates based on several models and field results. The models considered were CORSIM, version 4.01; Passer II-90, version 2.0; Synchro, version 3.0; SIGNAL94, version 1.22; and Transyt-7F. The paper documented the methods used to estimate queue lengths and provided clarification on the definitions used by the different models.

Seeking new measures of effectiveness that can accurately evaluate intersection performance is another goal of many researchers. Husch's Intersection Capacity Utilization[19] is one such measure. The Intersection Capacity Utilization provides a straightforward method to calculate an intersection's level of service. The method simply sums the critical movements' volume-to-saturation-flow ratios, based on the minimum green time required for each movement.
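The core of the calculation, a sum of critical-movement volume-to-saturation-flow ratios, can be sketched in a few lines. This is a simplified illustration only: the published ICU procedure also accounts for minimum green times and lost time, which are omitted here, and the volumes below are invented.

```python
# Simplified ICU-style calculation. Minimum-green and lost-time
# adjustments from the published method are omitted; data are invented.

critical_movements = {
    # movement: (volume in veh/h, saturation flow in veh/h)
    "EB through": (900, 1800),
    "NB through": (540, 1800),
}

# Sum the volume-to-saturation-flow ratio of each critical movement.
icu = sum(vol / sat for vol, sat in critical_movements.values())
print(round(icu, 2))  # 0.8
```

A result well below 1.0, as here, suggests the intersection has spare capacity; values approaching or exceeding 1.0 indicate the critical movements are consuming the available cycle.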

In general, the trend in recent years is to use simulation to evaluate intersection performance. For example, Transyt-7F can be used to generate optimum signal settings. Transyt-7F can also be used to evaluate existing signal settings. The model can be executed with the signal settings frozen and it will produce measures of effectiveness based on the existing settings. The model can be executed again and allowed to seek an optimum. The measures of effectiveness from the optimized settings can be compared to the measures of effectiveness from the original settings to get a quantified estimate of the probable improvement. This, however, requires a lot of work, generally more than the typical engineer is willing to do to retime a traffic signal.
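The before-and-after comparison described above reduces to comparing MOEs between the frozen and optimized runs. The sketch below shows the arithmetic with invented delay figures; the actual MOE values would come from the two Transyt-7F runs.

```python
# Comparing frozen vs. optimized MOEs, as described above.
# The delay figures are invented for illustration.

existing_delay = 38.5   # s/veh, settings frozen at existing values
optimized_delay = 29.2  # s/veh, after the model seeks an optimum

# Fractional improvement the optimized settings would likely provide.
improvement = (existing_delay - optimized_delay) / existing_delay
print(f"{improvement:.0%}")  # 24%
```

The arithmetic is trivial; the work the paragraph refers to is setting up, calibrating, and running the model twice, which is what deters the typical engineer.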

While simulation offers some hope, even with the controller in the loop, it still leaves a lot to be desired. This project offers a slightly different approach. The focus in this proposed project is to carefully examine the data that is available at the controller to determine if a method can be developed that could automatically and continuously evaluate the performance of an intersection using the information available at the local controller. This information includes: the duration of the signal phase (traffic movement); the demand as measured by the detector(s) for that phase; the cycle length; demand on competing phases; the time of day and day of week; and additional detector measures (occupancy, speed).
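One simple metric of the kind such a project might explore is green-time utilization per phase, computed only from quantities the paragraph above lists as available at the controller: phase durations, cycle length, and detector measures. The thresholds and field values below are assumptions made for illustration, not a proposed algorithm.

```python
# Illustrative per-phase check using only controller-available data.
# All numbers and thresholds are invented for the example.

phases = [
    # (phase name, green time served in s, detector occupancy % in green)
    ("NB/SB", 42, 88),
    ("EB/WB", 18, 35),
]
cycle = 70  # s, current cycle length

flags = []
for name, green, occupancy in phases:
    share = green / cycle
    # High occupancy on a phase already consuming a large share of the
    # cycle suggests the phase may be oversaturated.
    flag = "check split" if occupancy > 80 and share > 0.5 else "ok"
    flags.append((name, flag))

print(flags)
```

Logic this simple could plausibly run continuously in the controller itself, which is the implementation constraint Project 12 places on any candidate analysis.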

This effort extends the scope of this study into the real-time control arena; if successful, the analysis would likely be carried to fruition by a different agency. However, this project could provide the initial analysis and examination to form the foundation for the efforts that would follow. The product of this project is an analysis of how intersection performance can be objectively analyzed using data that are available to the local controller. Implicit is the need for the analysis (algorithm) to be one that could be implemented in an intersection controller. Ideally, it would be simple enough that it could be implemented in legacy controllers.

  1. Martin, P., "Turning Movement Estimates," ITS-IDEA Project 53, Final Report, June 2000.
  2. Mystkowski, C., and Khan, S., "Estimating Queue Lengths Using SIGNAL94, SYNCHRO3, TRANSYT-7F, PASSER II-90, and CORSIM," November 1998.
  3. Husch, D., "Intersection Capacity Utilization 2000: A Procedure for Evaluating Signalized Intersections," Trafficware Corporation, 2000.