A Multisensory-Driven Adaptation to Improve Driver's Comfort
Presenter
Ruzena Bajcsy - University of California, Berkeley (UC Berkeley), CITRIS
February 25, 2019
Abstract
Our basic hypothesis is that the driver in the car is exposed to different environmental stimuli coming both from the road and from inside the car. These stimuli are multimodal: visual, acoustic, and the motion of the vehicle modulated by the interaction of the car with the road surface.
There are several questions we need to resolve, in particular the interplay between visual, acoustic, and motion data and the driver's attention, in order to utilize these multimodal data properly.
The visual data from outside the car provide a spatiotemporal assessment of the environment. The visual data from inside the car, on the other hand, provide information about the occupants besides the driver, whose behavior may affect the driver's state. The same holds for the acoustic information from both the outside and inside environments. The question for us is whether these two modalities enhance or diminish the driver's state (whether positively or negatively). The same applies to the observed motion of the driver (and of the seat). These two motions, detected by a pressure sensor on the seat, need to be decoupled so that we can discriminate between motion arising from the driver's restlessness and motion caused by the roughness of the road.
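The abstract does not say how this decoupling would be performed. As a minimal sketch, assuming the driver's postural shifts occupy a lower frequency band than road-induced vibration, the separation could be approximated by band filtering of the seat-pressure trace. The sampling rate, cutoff frequencies, and all function names below are illustrative assumptions, not the speakers' method.

import numpy as np
from scipy.signal import butter, filtfilt

FS = 100.0  # assumed sampling rate of the seat pressure sensor, Hz

def bandsplit(pressure, fs=FS, driver_cutoff=1.0, road_cutoff=4.0):
    """Split a 1-D pressure trace into a slow (driver) and a fast (road) part."""
    # Low-pass: slow postural shifts attributed to the driver's restlessness.
    b_lo, a_lo = butter(4, driver_cutoff, btype="low", fs=fs)
    driver_part = filtfilt(b_lo, a_lo, pressure)
    # High-pass: vibration attributed to road-surface roughness.
    b_hi, a_hi = butter(4, road_cutoff, btype="high", fs=fs)
    road_part = filtfilt(b_hi, a_hi, pressure)
    return driver_part, road_part

# Synthetic example: a slow posture shift plus road vibration plus sensor noise.
t = np.arange(0, 10, 1 / FS)
posture = 0.5 * np.sin(2 * np.pi * 0.2 * t)     # driver shifting in the seat
vibration = 0.1 * np.sin(2 * np.pi * 12.0 * t)  # road-induced vibration
signal = posture + vibration + 0.02 * np.random.randn(t.size)

driver_est, road_est = bandsplit(signal)

A fixed frequency split is the simplest possible decomposition; it would break down where the two sources overlap in frequency, which is presumably part of what makes the discrimination problem interesting.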
In collaboration with Erickson Rangel do Nascimento, Michal Gregor, and Isabella Huang,
EECS Department, UC Berkeley.