Dataset for "Eye movement-related eardrum oscillations do not require current visual input"
- Link:
- Author:
- Publisher/Institution:
- Universität Hamburg
- Year of publication:
- 2025
- Media type:
- Dataset
- Keywords:
- EMREOs
- Eye movements
- Eardrum oscillations
- Sensory input
- Oculomotor system
- Reference frame transformation
- Description:
This open dataset accompanies the paper "Eye movement-related eardrum oscillations do not require current visual input" by Hossein Abbasi, Stephanie Lovich, Brigitte Röder, Jennifer M. Groh, and Patrick Bruns, published in Hearing Research in 2025 (DOI: https://doi.org/10.1016/j.heares.2025.109346).
Abstract: Oculomotor signals influence the neural processing of auditory input. Recent studies have shown that this connection extends to the auditory periphery: The phase and amplitude of eardrum oscillations are systematically influenced by eye movement direction and magnitude, a phenomenon called eye movement-related eardrum oscillations (EMREOs). Previous findings have suggested that EMREOs occur independently of auditory stimulation, but it is unknown whether they depend on the presence of visual sensory input or solely reflect efference copies of the oculomotor system. To distinguish between these two alternatives, we measured eye movements and eardrum oscillations in sighted human participants who performed free saccadic eye movements in darkness. Despite the lack of any sensory stimulation during eye movements, significant EMREOs occurred in all participants. EMREO characteristics were comparable to those observed in a separate control experiment in which participants performed guided saccades to visual targets, and were robust to different types of eye tracker calibration methods. Thus, our results suggest that EMREOs are not driven by bottom-up sensory signals but rather reflect a pure influence of oculomotor signals on peripheral auditory processing. This indicates that EMREOs might play a crucial role in the reference frame transformations that are needed for audio-visual spatial integration.
The dataset Abbasi et al. 2025 - Experiment1.zip consists of data from eight participants included in Experiment 1.
The dataset Abbasi et al. 2025 - Experiment2.zip consists of data from 21 participants included in Experiment 2. This dataset has two sub-folders: one for the condition in which the eye tracker was calibrated with the standard method using the participant's left eye (Standard calibration), and one for the condition in which the eye tracker was calibrated with the stand-in method using the experimenter's left eye (Stand-in calibration).
Every *.mat file contains two structures: one with the eye position in degrees (HEPos: horizontal position; VEPos: vertical position) and one with the microphone data for each ear. The data are epoched from -200 ms to 250 ms relative to saccade onset, with a sampling rate of 1000 Hz for the eye-tracking data and 2000 Hz for the microphone data. In each structure, every row corresponds to one saccade.
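A minimal Python sketch for loading and inspecting one of the *.mat files is given below, assuming SciPy is available. The file name and the top-level variable names EyePos and MicData are hypothetical placeholders; only the fields HEPos and VEPos, the epoch window, and the sampling rates are taken from the description above.

```python
# Sketch for inspecting one epoch file; replace the placeholder names with
# the keys actually printed for the file at hand.
import numpy as np
from scipy.io import loadmat

mat = loadmat("example_participant.mat",      # hypothetical file name
              squeeze_me=True, struct_as_record=False)
print([k for k in mat if not k.startswith("__")])  # actual structure names in the file

eye = mat["EyePos"]                  # hypothetical key for the eye-position structure
hepos = np.atleast_2d(eye.HEPos)     # saccades x samples, horizontal eye position (deg)
vepos = np.atleast_2d(eye.VEPos)     # saccades x samples, vertical eye position (deg)

# Time axis implied by the description: epochs span -200 to 250 ms around
# saccade onset (1000 Hz for eye data, 2000 Hz for microphone data).
t_eye = np.linspace(-200, 250, hepos.shape[1])

mic = mat["MicData"]                 # hypothetical key for the per-ear microphone structure
print(eye._fieldnames, mic._fieldnames)  # inspect the real field names
```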
- This work was supported by the Volkswagen Foundation (Grant 97624 to P.B.) and the German Research Foundation (DFG Ro 2625/10-1 to B.R.).
- Licenses:
- https://creativecommons.org/licenses/by/4.0/legalcode
- info:eu-repo/semantics/openAccess
- Source system:
- Forschungsdatenrepositorium der UHH
Internal metadata
- Source record
- oai:fdr.uni-hamburg.de:17662