We investigated the effect of multimodal odor cues on human smell-identification performance to inform the development of an adaptive interface for a mobile application. In a data-elicitation study (N=429), we collected people’s olfactory associations with nine sample odors. Based on these associations, we developed a multimodal interface that offered textual, image, or combined cues to augment odor perception, and 190 new participants used the interface to identify odors. Participants’ smell-identification performance improved when the interface offered visual (image and/or text) cues, and participants rated the combination of image and textual cues as the most useful and enjoyable. These results show that human smell perception can be successfully enhanced with the help of an adaptive odor-cue interface.
We used the results of this study to develop a first prototype of an intelligent interface that automatically generates cues to assist human smell identification. The prototype is based on causal models (Bayesian networks): for a set of relevant chemicals, we extracted observation models that relate the presence of a chemical to the odor reports humans give. Due to a lack of data, not all types of chemicals could be covered. Nevertheless, we have shown that it is possible to construct models that support detection and localization based on human reports.
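To illustrate the core inference step behind such an observation model, the sketch below applies Bayes’ rule to update a belief over which chemical is present, given a human odor report. The chemicals, report labels, and all probability values are illustrative assumptions for the sketch, not parameters extracted in the study.

```python
# Hypothetical prior belief over which chemical is present.
PRIOR = {"limonene": 0.5, "vanillin": 0.5}

# Hypothetical observation model P(report | chemical): how likely a
# human is to report each odor descriptor for a given chemical.
LIKELIHOOD = {
    "limonene": {"citrus": 0.7, "sweet": 0.2, "none": 0.1},
    "vanillin": {"citrus": 0.1, "sweet": 0.8, "none": 0.1},
}

def posterior(report: str) -> dict[str, float]:
    """Bayes' rule: P(chemical | report) is proportional to
    P(report | chemical) * P(chemical)."""
    unnorm = {c: PRIOR[c] * LIKELIHOOD[c][report] for c in PRIOR}
    z = sum(unnorm.values())
    return {c: p / z for c, p in unnorm.items()}

# A "sweet" report shifts the belief strongly toward vanillin.
print(posterior("sweet"))
```

A full network would add nodes for location and sensor context, but the single-report update above is the basic building block.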