Manual on performance of traffic signal systems : assessment of operations and maintenance. Report prepared for the Florida Department of Transportation.

Author(s)
Stevanovic, A. & Radivojevic, D.
Year
2017
Abstract

Investments in traffic signal equipment and staff can bring significant benefits to signal-operating agencies. However, such investments cannot easily be justified without a clear process for recording and documenting their benefits. Such a process requires an evaluation methodology based on quantifiable metrics that reflect the true effects of the executed investments. Even where annual expenditures, service areas (e.g., number of signals), and available staff are similar, operational and maintenance outcomes can vary considerably between agencies. Therefore, if the upper management of a city, county, or any other signal-operating agency wants to evaluate the performance of its signal operations and the quality of service provided to citizens, it needs a clear procedure for evaluating the strengths, weaknesses, efficiency, and reliability of its signal system. This need is further amplified if responsibility for operating and maintaining the traffic signals is awarded to private consultants; in that case, a clear grading system becomes a mandatory component of any evaluation and compensation process.

The major requirements for a successful evaluation of a traffic signal system’s performance are available data and clearly defined performance measures. Ideally, all of the data are fully available, and the performance measures can be clearly defined and based on quantitative inputs. In the real world, however, this is usually not the case. A lack of available data is the major limiting factor, and it consequently constrains the performance measures, which must be based on the data that are available, some of which may be qualitative. The objective of this research was to develop an evaluation methodology that helps agencies in Florida and across the country to consistently and comprehensively evaluate the performance and reliability of the traffic signal systems under their jurisdictions.

Evaluating a traffic signal system’s performance and reliability can be done at various (more or less frequent) time intervals, and the methodology varies with the frequency of the evaluation: metrics appropriate for weekly monitoring of traffic signals may lose their significance when aggregated over an entire year, and vice versa. This problem was recognized in the first half of the research project, and the scope of work was modified to address both long-term (annual) and short-term (weekly/monthly) evaluations of traffic signal performance. The study therefore consists of two major and quite distinct components: annual evaluation of traffic signal assets, intended mostly for an agency’s upper management and for external stakeholders (e.g., a DOT), and weekly/monthly evaluation of signal performance and reliability, which is mostly the duty of operators in Traffic Management Centers (TMCs). Similarly, the spatial aggregation of the evaluation depends on the temporal aggregation of the system’s outcomes: while it makes sense to report an annual evaluation of the performance of an entire signal-operating agency, weekly/monthly evaluations have to be constrained to specific subnetworks or corridors.
Thus, the latter (weekly/monthly) evaluations are best executed through performance (and reliability) dashboards, where TMC operators can observe historical data and derived performance measures and decide what actions (if any) to take to improve operations and maintenance of the system. The final outcome of this research therefore has two components: methodologies for annual and for weekly/monthly evaluations of traffic signals. Both methodologies are executed through MS Excel tools/spreadsheets and are accompanied by manuals that explain their use and the logical flow of information.

For the annual evaluations, the MS Excel spreadsheets prompt users to answer a set of predefined questions. This format prevents users from entering ambiguous answers and enables head-to-head comparisons between users. The grading system for the annual evaluation is divided into five distinct categories (Management, Traffic signal operations, Signal timing practices, Traffic monitoring and data collection, and Maintenance), mimicking the grading system of the 2012 National Traffic Signal Report Card. The annual evaluation methodology was tested on two pilot agencies that volunteered to provide data and answers in the relevant spreadsheets. The findings show that unbiased grading can be achieved by using more quantitative (versus qualitative) grades in the process. However, some limitations were observed; for example, many entries cannot easily be quantified because they require large amounts of data that may not be easy to acquire. To overcome the problem of missing data, the FAU researchers introduced the concept of evaluation confidence, which assigns a level of confidence to the evaluation outcome based on how many entries were supported by quantifiable data, as illustrated in the sketch below. The proposed annual evaluation methodology can significantly change the way traffic agencies evaluate their signal systems. However, it is necessary to (1) standardize the types of data that are collected and (2) calibrate the grading scale (e.g., by applying the methodology to a larger number of participating agencies). In the pilot studies, participants were graded by comparing their entries to virtual examples of the best and worst agencies; including more agencies will make the comparisons more realistic and lead to a more accurate and meaningful grading scale. Another direction for improvement is a framework that connects weekly/monthly evaluations with annual evaluations, in which data collected at shorter intervals are aggregated and summarized at the end of the year.

Weekly/monthly evaluations do not rely on the tedious process of collecting vast amounts of data from agency staff; instead, they use (when available) data that are already collected and stored by the signal system central software. In such cases, a TMC operator can use custom-built macros (provided as deliverables of this project) to transfer the data from the central software (in this case, ATMS.now) into the spreadsheets that create the Traffic Signal System Performance Dashboard and the Traffic Signal System Reliability Dashboard (also provided as deliverables of this project). These macros enable users to seamlessly, in a few steps, prepare new databases for the dashboards and visualize key performance measures based on the data from ATMS.now’s reports.
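Returning to the evaluation confidence concept introduced above: the abstract names the concept but not its formula. A minimal sketch, assuming confidence is simply the share of entries supported by quantifiable data; the Entry structure, the 0-100 score scale, and the unweighted average are hypothetical, not taken from the report:

```python
from dataclasses import dataclass

@dataclass
class Entry:
    """One answered question in the annual evaluation spreadsheet (hypothetical)."""
    score: float        # graded value of the answer, e.g. on a 0-100 scale
    quantitative: bool  # True if the answer is backed by measured data

def evaluation_confidence(entries: list[Entry]) -> float:
    """Assumed definition: share of entries backed by quantifiable data."""
    if not entries:
        return 0.0
    return sum(e.quantitative for e in entries) / len(entries)

# Example: three of four entries rest on measured data -> confidence 0.75.
entries = [Entry(80, True), Entry(65, True), Entry(90, False), Entry(70, True)]
grade = sum(e.score for e in entries) / len(entries)
print(f"category grade: {grade:.1f}, confidence: {evaluation_confidence(entries):.2f}")
```

A weighted variant (e.g., weighting entries by their importance within the five grading categories) would follow the same pattern.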
The dashboards add significant value to the project because they use, in an innovative way, data that are already available. The Performance dashboard focuses on the operational characteristics of traffic signals: cycle lengths, numbers and percentages of phase activations, minimum, average, and maximum phase durations, green time distribution, and phase terminations are all displayed in one place, allowing the user to easily observe the most important facts about signal operations. The Reliability dashboard, on the other hand, shows the number and percentage of alarm activations, the total number of alarms, etc. Both dashboards introduce new ways to observe signal operations; for example, the Reliability dashboard shows the five intersections with the highest numbers of alarm activations and the top five alarm types for the selected system. Both dashboards allow the user to filter, spatially and temporally, to the intersections and periods of interest. Both dashboards are fully ready for field implementation and testing in real TMC environments at agencies that use the ATMS.now signal system central software. (Author/publisher)
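The Reliability dashboard’s top-five rankings are, in essence, frequency counts over the alarm log. A minimal sketch, assuming the log has been exported to a CSV file with hypothetical 'intersection' and 'alarm_type' columns; the actual ATMS.now report schema is not given in this abstract:

```python
import csv
from collections import Counter

def top_five_alarms(csv_path: str) -> tuple[list, list]:
    """Rank intersections and alarm types by number of alarm activations.

    Assumes a hypothetical export with 'intersection' and 'alarm_type'
    columns; the real ATMS.now report layout may differ.
    """
    by_intersection: Counter = Counter()
    by_type: Counter = Counter()
    with open(csv_path, newline="") as f:
        for row in csv.DictReader(f):
            by_intersection[row["intersection"]] += 1
            by_type[row["alarm_type"]] += 1
    return by_intersection.most_common(5), by_type.most_common(5)

intersections, alarm_types = top_five_alarms("alarm_log.csv")
print("Top 5 intersections by alarm activations:", intersections)
print("Top 5 alarm types:", alarm_types)
```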

Publication

Library number
20170319 ST [electronic version only]
Source

Boca Raton, FL, Florida Atlantic University FAU, Civil, Environmental and Geomatics Engineering, 2017, XVII + 243 p., 46 ref.; FDOT BDV27-977-05

Our collection

This publication is one of our other publications and is part of our extensive collection of road safety literature, which also includes the SWOV publications.