The Echobot—An automated system for stimulus presentation in studies of human echolocation.
Posted 2019-09-24T10:54:43Z (GMT).
This repository contains data and scripts related to the article “The Echobot—An automated system for stimulus presentation in studies of human echolocation” by Tirado, Lundén, & Nilsson (2019), accepted for publication in PLOS ONE. Please see the file List_of_files.txt for a short description of the content of each uploaded file.
(a) Raw data are stored in *.txt files. Descriptions of the variables (a code-book) are included at the top of these files (lines starting with #).
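As a minimal sketch of how such a file can be parsed, the snippet below separates code-book comments (lines starting with #) from data rows. The sample content and column names are hypothetical, invented only for illustration; see the actual *.txt files for the real variables.

```python
# Hypothetical example of the raw-data layout described above:
# lines beginning with "#" form the code-book, the rest are data rows.
sample = """\
# subject: participant ID
# distance: loudspeaker distance in cm
subject distance response
1 100 1
1 150 0
"""

def read_rawdata(text):
    """Split a raw-data file into code-book comments and data rows."""
    codebook, rows = [], []
    for line in text.splitlines():
        if line.startswith("#"):
            codebook.append(line.lstrip("# ").rstrip())
        elif line.strip():
            rows.append(line.split())
    return codebook, rows

codebook, rows = read_rawdata(sample)
print(len(codebook), rows[0])
```

The same split is achieved in R by `read.table(..., comment.char = "#")`, which skips the code-book lines automatically.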
(b) The R code for analyzing the perceptual data in Figs. 5 and 6 of the article is in Echobot_analysis_script.R.
(c) R code for the acoustic analysis is in the files a110_sound_analysis_fig3.R and a110_sound_analysis_fig4.R. Note that the script for Fig. 3 needs information in file_fixed.txt and in six *.wav files. The *.wav files are from measurement recordings; to save space, only those relevant to Fig. 3 of the paper are included in this repository. See file_fixed.txt for a list of all recorded sounds, and please contact the authors if you want access to the remaining files.
(d) Files for running the Echobot and collecting responses are the *.py files (Python code) and *.wav files (sounds played by Python to present the click signal, play the masking sound, and provide auditory feedback).
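For readers unfamiliar with the *.wav format used for the stimulus files, the sketch below (not the authors' actual stimuli) synthesizes a short click-like pulse with Python's standard-library wave module, writes it to disk, and reads the header back. The file name, sample rate, and pulse shape are illustrative assumptions.

```python
import math
import struct
import wave

RATE = 44100   # samples per second (assumed; check the repository files)
DUR = 0.005    # 5 ms pulse, roughly click-like

# Exponentially decaying 16-bit mono pulse.
frames = b"".join(
    struct.pack("<h", int(32767 * math.exp(-t / (0.001 * RATE))))
    for t in range(int(RATE * DUR))
)

with wave.open("click_demo.wav", "wb") as w:
    w.setnchannels(1)    # mono
    w.setsampwidth(2)    # 16-bit samples
    w.setframerate(RATE)
    w.writeframes(frames)

# Read the header back to confirm the file's format.
with wave.open("click_demo.wav", "rb") as w:
    print(w.getnchannels(), w.getframerate(), w.getnframes())
```

Playing such files back (the job of the repository's Python scripts) requires an audio library outside the standard library, e.g. a package such as sounddevice or pyaudio.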