
PUBLISHED RESEARCH WORK



The effect of distance on audiovisual temporal integration in an outdoor virtual environment



Victoria Fucci, Pelle Krol, Myrthe Plaisier & Raymond H. Cuijpers





In this study, we explore the influence of stimulus distance on human tolerance for (physical) asynchronies in virtual reality (VR). A repeated audiovisual (AV) stimulus with sound and light bursts was presented to participants in an outdoor virtual environment (VE) using a head-mounted display (HMD). The investigation focused on quantifying the point of subjective simultaneity (PSS) of the visual and auditory stimuli. A synchrony judgment method (SJ-3) was used for 11 stimulus onset asynchronies (SOAs) and five egocentric distances from 10 m to 50 m in 10 m increments. The data analysis showed negative PSS values that decreased with distance, resulting in a negative slope (-3 ms/m) of the regression line between PSS values and simulated distances...
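As a rough illustration of the kind of analysis described above, the sketch below estimates a PSS per distance by fitting a Gaussian to the proportion of "synchronous" responses across SOAs, and then regresses the PSS values on distance to obtain a slope in ms/m. This is not the paper's analysis pipeline: the response proportions, the Gaussian model and all numeric values are placeholders.

```python
# Minimal sketch: estimate a PSS per distance from SJ-3 "synchronous" proportions
# by fitting a Gaussian, then regress PSS against distance (illustrative data only).
import numpy as np
from scipy.optimize import curve_fit
from scipy.stats import linregress

def gaussian(soa, pss, sigma, amp):
    """Proportion of 'synchronous' responses as a function of SOA (ms)."""
    return amp * np.exp(-0.5 * ((soa - pss) / sigma) ** 2)

soas = np.linspace(-300, 300, 11)           # 11 SOAs in ms (placeholder values)
distances = np.array([10, 20, 30, 40, 50])  # egocentric distances in metres

pss_per_distance = []
for d in distances:
    # Hypothetical response proportions; real data would come from participants.
    p_sync = gaussian(soas, pss=-3 * d, sigma=120, amp=0.9)
    p_sync += np.random.default_rng(int(d)).normal(0, 0.02, soas.size)
    (pss, sigma, amp), _ = curve_fit(gaussian, soas, p_sync, p0=(0.0, 100.0, 1.0))
    pss_per_distance.append(pss)

# Slope of the regression line between PSS values and distance, in ms per metre.
fit = linregress(distances, pss_per_distance)
print(f"slope: {fit.slope:.1f} ms/m, intercept: {fit.intercept:.1f} ms")
```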



The effect of distance on audiovisual temporal integration in an indoor virtual environment



Victoria Fucci & Raymond H. Cuijpers





For several decades, it has been debated whether a distance compensation mechanism exists during audiovisual (AV) synchrony judgements, given the vast difference between the speed of sound and the speed of light. Here we aimed to investigate the effect of stimulus distance on the human tolerance for (physical) asynchronies and to extend earlier findings with a state-of-the-art head-mounted display (HMD). In this study, we measured the point of subjective simultaneity (PSS) of visual and auditory stimuli in an indoor virtual environment (VE). The synchrony judgement method was used for 11 stimulus onset asynchronies (SOAs) and six egocentric distances up to 30 m. In addition, to increase the validity of the dataset, we incorporated into our data analysis the results of our previous studies on egocentric distance perception and AV hardware latency...
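One step mentioned in this abstract is the use of earlier results on hardware latency and distance perception in the analysis. The sketch below shows one plausible way such corrections could be applied before relating PSS values to distance; the latency values, the compression gain and the SOA sign convention are assumptions, not figures from the study.

```python
# Minimal sketch: correct nominal SOAs for measured AV hardware latency and
# convert simulated distance to perceived distance before the PSS regression.
# All numbers below are placeholders, not values from the paper.
import numpy as np

VISUAL_LATENCY_MS = 45.0   # hypothetical end-to-end visual delay of the HMD
AUDIO_LATENCY_MS = 30.0    # hypothetical end-to-end audio delay
DISTANCE_GAIN = 0.8        # hypothetical compression of perceived distance in VR

def corrected_soa(nominal_soa_ms: float) -> float:
    """Physical SOA at the observer once device latencies are accounted for.

    Assumes SOA is defined as audio onset minus visual onset; flip the sign
    of the correction if the opposite convention is used.
    """
    return nominal_soa_ms + (AUDIO_LATENCY_MS - VISUAL_LATENCY_MS)

def perceived_distance(simulated_m: float) -> float:
    """Perceived egocentric distance under a simple linear compression model."""
    return DISTANCE_GAIN * simulated_m

nominal_soas = np.linspace(-250, 250, 11)   # 11 nominal SOAs in ms (placeholders)
print(corrected_soa(nominal_soas[0]), perceived_distance(30.0))
```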



Measuring audio-visual latencies in virtual reality systems



Victoria Fucci, Jinqi Liu, Yunjia You & Raymond H. Cuijpers





In virtual reality (VR) systems, delays may occur as the signal passes through hardware and software components, causing audiovisual asynchrony or even cybersickness. To better understand and control the role of delays in VR experiments, we tested an accessible method for measuring audio and visual end-to-end latency across game engines (Unity and Unreal), head-mounted displays (HMDs; Oculus Rift and Oculus Quest 2) and vertical synchronisation (V-sync) settings (on/off). The measurement setup consisted of a microcontroller, a dedicated serial port, a microphone, a light sensor and an oscilloscope. The measurements showed that Unreal Engine with the Oculus Rift had ≈ 16 ms less visual delay and ≈ 33 ms less audio delay than with the Oculus Quest 2. Unity with the Oculus Rift had ≈ 22 ms less visual delay and ≈ 39 ms less audio delay than with the Oculus Quest 2...
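To make the measurement idea concrete, here is a minimal sketch of how end-to-end latency could be derived from recorded traces: the delay between the microcontroller's trigger edge and the first response of the light sensor or microphone. The channel layout, threshold and sample rate are illustrative assumptions, and the synthetic traces stand in for real oscilloscope captures.

```python
# Minimal sketch: derive end-to-end visual and audio latency from three
# oscilloscope traces (trigger pulse, light-sensor output, microphone output).
import numpy as np

def onset_index(trace: np.ndarray, threshold: float) -> int:
    """Index of the first sample whose absolute value crosses the threshold."""
    crossings = np.flatnonzero(np.abs(trace) >= threshold)
    if crossings.size == 0:
        raise ValueError("no threshold crossing found in trace")
    return int(crossings[0])

def end_to_end_latency_ms(trigger, sensor, sample_rate_hz, threshold=0.5):
    """Delay between the trigger edge and the sensor response, in ms."""
    dt = onset_index(sensor, threshold) - onset_index(trigger, threshold)
    return 1000.0 * dt / sample_rate_hz

# Example with synthetic traces sampled at 100 kHz: trigger edge at 10 ms,
# light-sensor response at 55 ms, microphone response at 70 ms.
fs = 100_000
t = np.arange(0, 0.2, 1 / fs)
trigger = (t >= 0.010).astype(float)
light_sensor = (t >= 0.055).astype(float)
microphone = (t >= 0.070).astype(float)

print(end_to_end_latency_ms(trigger, light_sensor, fs))  # ≈ 45.0 ms visual delay
print(end_to_end_latency_ms(trigger, microphone, fs))    # ≈ 60.0 ms audio delay
```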



Quantifying egocentric distance perception in virtual reality environment



Victoria Korshunova-Fucci, Floris F. van Himbergen, Hsiao Ming Fan, Armin Kohlrausch & Raymond H. Cuijpers





In virtual reality (VR) studies where object distance serves as an independent variable, unknown egocentric distance perception can affect the interpretation of the collected data. It is known that perceived egocentric distance in VR is often underestimated, which may affect other judgments that implicitly depend on it. To prepare for later experiments on the effect of distance on audiovisual (a)synchrony perception, this study quantifies egocentric distance perception in a virtual indoor environment using two methods: verbal judgment (VJ) and position adjustment (PA). For the VJ method, participants verbally estimated the distance between their own position and a cardboard box placed at nominal distances from 5 m to 30 m, in increments of 5 m. For the PA method, participants were asked to position a cardboard box at an instructed nominal distance of 5 m to 13 m, in increments of 1 m...
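A minimal sketch of how the relationship between judged and nominal distance could be quantified is given below: a linear fit whose slope acts as a compression gain (1.0 would be veridical). The judged values are made up for illustration and are not data from the study.

```python
# Minimal sketch: relate judged to nominal distance with a linear fit to
# quantify underestimation (toy data, not results from the paper).
import numpy as np
from scipy.stats import linregress

nominal_m = np.arange(5, 31, 5)                          # 5 m .. 30 m in 5 m steps
judged_m = np.array([4.1, 8.0, 11.7, 15.9, 19.5, 23.8])  # hypothetical VJ means

fit = linregress(nominal_m, judged_m)
print(f"gain: {fit.slope:.2f} (1.0 = veridical), offset: {fit.intercept:.2f} m")
# A gain below 1.0 quantifies the compression commonly reported for VR.
```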



Surround sound spreads visual attention and increases cognitive effort in immersive media reproductions



Catarina Mendonça & Victoria Korshunova





The goal of this study was to explore the effects of different spatial sound configurations on visual attention and cognitive effort in an immersive environment. For that purpose, different groups of people were exposed to the same immersive video but with different soundtrack conditions: mono, stereo, 5.1 and 7.4.1. The sound conditions consisted of different artistic adaptations of the same soundtrack. While watching the video, participants wore an eye-tracking device and were asked to perform a counting task. Gaze direction and pupil dilation metrics were obtained as measures of attention and cognitive effort. Results demonstrate that the 5.1 and 7.4.1 conditions were associated with a broader distribution of visual attention, with subjects spending more time gazing at task-irrelevant areas of the screen...
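The sketch below illustrates two summary metrics of the kind described here: a gaze-dispersion measure (mean distance of gaze samples from their centroid) and a baseline-corrected pupil-dilation measure. The gaze and pupil samples are synthetic toy data, and the per-condition spread values are purely illustrative.

```python
# Minimal sketch: two summary metrics per sound condition, assuming gaze
# samples in normalised screen coordinates and pupil diameter per sample.
import numpy as np

def gaze_dispersion(x: np.ndarray, y: np.ndarray) -> float:
    """Spread of visual attention as the mean distance to the gaze centroid."""
    cx, cy = x.mean(), y.mean()
    return float(np.hypot(x - cx, y - cy).mean())

def mean_pupil_dilation(pupil_mm: np.ndarray, baseline_mm: float) -> float:
    """Average pupil-diameter change from baseline, a proxy for cognitive effort."""
    return float(np.mean(pupil_mm - baseline_mm))

rng = np.random.default_rng(0)
for condition, spread in [("mono", 0.05), ("stereo", 0.06),
                          ("5.1", 0.10), ("7.4.1", 0.12)]:
    gx = rng.normal(0.5, spread, 2000)   # toy gaze x coordinates
    gy = rng.normal(0.5, spread, 2000)   # toy gaze y coordinates
    pupil = rng.normal(3.4, 0.1, 2000)   # toy pupil diameters in mm
    print(condition, round(gaze_dispersion(gx, gy), 3),
          round(mean_pupil_dilation(pupil, baseline_mm=3.2), 3))
```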



The impact of sound systems on the perception of cinematic content in immersive audiovisual productions



Victoria Korshunova, Gerard B. Remijn, Synes Elischka & Catarina Mendonça





With rapid technological development, traditional perceptual environments disappear and new ones emerge. These changes require the human senses to adapt to new ways of perceptual understanding, for example regarding the perceptual integration of sound and vision. Given that hearing interacts with visual attention processes, the aim of this study is to investigate the effect of different sound design conditions on the perception of cinematic content in immersive audiovisual reproductions. Here we present the results of a visual selective attention task (counting objects) performed by participants watching a 270-degree immersive audiovisual display on which a movie ("Ego Cure") was shown. Four sound conditions were used, employing an increasing number of loudspeakers: mono, stereo, 5.1 and 7.1.4. Eye tracking was used to record the participants' eye gaze during the task...
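As an illustration of how gaze data from such a task could be summarised, the sketch below computes the proportion of gaze samples falling inside a rectangular task-relevant area of interest (AOI). The AOI bounds and the gaze samples are hypothetical and not taken from the experiment.

```python
# Minimal sketch: fraction of gaze samples that fall inside a task-relevant
# area of interest (AOI); AOI bounds and gaze data are illustrative only.
import numpy as np

def dwell_proportion(gaze_x, gaze_y, aoi):
    """Proportion of gaze samples inside a rectangular AOI (x0, y0, x1, y1)."""
    x0, y0, x1, y1 = aoi
    inside = (gaze_x >= x0) & (gaze_x <= x1) & (gaze_y >= y0) & (gaze_y <= y1)
    return float(inside.mean())

rng = np.random.default_rng(1)
gaze_x = rng.uniform(0.0, 1.0, 5000)  # toy gaze samples in normalised coordinates
gaze_y = rng.uniform(0.0, 1.0, 5000)
task_aoi = (0.30, 0.30, 0.70, 0.70)   # hypothetical region containing the counted objects

print(f"time on task-relevant area: {dwell_proportion(gaze_x, gaze_y, task_aoi):.1%}")
```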

