An automated co-driver for advanced driver assistance systems : the next step in road safety. A thesis submitted for the degree of Doctor of Philosophy at the Australian National University.

Author(s)
Fletcher, L.S.
Year
2008
Abstract

Road vehicles offer unique challenges in human-machine interaction. Road vehicles are becoming, in effect, robotic systems that collaborate with the driver. As the automated systems become more capable, how best to manage the onboard human resources becomes an intriguing question. Combining the strengths of machines and humans while mitigating their shortcomings is the goal of this intelligent-vehicle research. Almost every driver has avoided an accident thanks to a warning from a vigilant passenger. In this work we develop the computerized equivalents of the core competencies of a vigilant passenger. The developed systems are then integrated to create a new kind of Advanced Driver Assistance System (ADAS): an Automated Co-driver. We show that the Automated Co-driver is a powerful concept that could be the next significant step in road safety. Our work has concentrated on road scene computer vision, where there is scope for improvement on two fronts. First, looking outside the vehicle, we investigated and developed road scene monitoring systems. The systems track the lane, obstacles, road signs and the "visual monotony" of the scene ahead of the vehicle. A visual-ambiguity-tolerant framework was developed to extract information about the road scene from noisy sensor data; the algorithm was used for robust lane tracking and obstacle detection. A fast and effective symbolic sign reading system was also developed, as was a road scene visual monotony and clutter estimator. Visual monotony, a likely key contributor to fatigue, was estimated by measuring the variability in the road scene over time. Second, these developed components were combined with the vehicle state, an existing pedestrian detection system and a driver eye-gaze monitoring system to form a comprehensive Advanced Driver Assistance System. In the integrated system the measured driver eye-gaze was correlated with detected road scene features, creating a new class of Advanced Driver Assistance Systems: systems with the potential to detect driver inattention by monitoring the driver's observations, not just the driver's actions. The essential combination of driver, vehicle and road scene monitoring enables us to obtain the missing driver-state information required to contextualise driver behaviour. Finally, we conducted a series of trials on the developed Automated Co-driver ADAS. Through our analysis and these trials we show that it is feasible to detect live, in-vehicle correspondences between driver eye-gaze and road scene features, to estimate the driver's observations and potentially detect driver inattention. The correlation between eye-gaze and road scene features is shown to be particularly useful in the detection of unobserved road events. (Author/publisher)
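The integrated system described in the abstract correlates measured driver eye-gaze with detected road scene features to flag possibly unobserved road events. As a rough illustration only, and not the thesis implementation, the Python sketch below checks whether the gaze ever dwelt near each detected feature while that feature was visible; the data structures, the 40-pixel gaze radius and the 0.1 s dwell threshold are all assumptions made for this example.

# Illustrative sketch only: a simple gaze / road-scene-feature correspondence
# check in the spirit of the Automated Co-driver described in the abstract.
# All names and thresholds here are assumptions, not the thesis implementation.

from dataclasses import dataclass
from typing import List
import math


@dataclass
class GazeSample:
    t: float   # timestamp (s)
    x: float   # gaze point projected into the road-scene image (pixels)
    y: float


@dataclass
class SceneFeature:
    label: str      # e.g. "speed_sign", "pedestrian"
    t_start: float  # first time the feature was visible (s)
    t_end: float    # last time the feature was visible (s)
    x: float        # feature position in the road-scene image (pixels)
    y: float


def was_observed(feature: SceneFeature, gaze: List[GazeSample],
                 radius_px: float = 40.0, min_dwell_s: float = 0.1) -> bool:
    """Return True if gaze dwelt near the feature for long enough while it
    was visible. Gaze samples are assumed to be sorted by time."""
    dwell, prev_t = 0.0, None
    for g in gaze:
        if feature.t_start <= g.t <= feature.t_end and \
                math.hypot(g.x - feature.x, g.y - feature.y) <= radius_px:
            if prev_t is not None:
                dwell += g.t - prev_t
            prev_t = g.t
        else:
            prev_t = None
    return dwell >= min_dwell_s


def unobserved_events(features: List[SceneFeature],
                      gaze: List[GazeSample]) -> List[SceneFeature]:
    """Features that were never fixated: candidates for an inattention
    warning from an Automated Co-driver."""
    return [f for f in features if not was_observed(f, gaze)]


if __name__ == "__main__":
    # Toy data: the driver stares at the image centre the whole time.
    gaze = [GazeSample(t=0.1 * i, x=320.0, y=240.0) for i in range(50)]
    features = [SceneFeature("speed_sign", 1.0, 3.0, 330.0, 245.0),
                SceneFeature("pedestrian", 2.0, 4.0, 600.0, 240.0)]
    for f in unobserved_events(features, gaze):
        print("Possibly unobserved road event:", f.label)  # -> pedestrian

In the thesis the correspondence is computed live in the vehicle from tracked features and a driver eye-gaze monitor; the sketch only shows how a gaze stream and a feature track might be matched in time and image space.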



Publication

Library number
20110309 ST [electronic version only]
Published

Canberra, Australian National University, Department of Information Engineering, Research School of Information Sciences and Engineering, 2008, XXX + 276 p., ref.

Our collection

This publication is one of the other publications that we hold in our collection alongside the SWOV publications.