Acoustical cues for perception of emotional vocalizations in rats
| Main Authors: | Saito, Yumi; Tachibana, Ryosuke O.; Okanoya, Kazuo |
|---|---|
| Format: | Online Article Text |
| Language: | English |
| Published: | Nature Publishing Group UK, 2019 |
| Subjects: | Article |
| Online Access: | https://www.ncbi.nlm.nih.gov/pmc/articles/PMC6646302/ https://www.ncbi.nlm.nih.gov/pubmed/31332218 http://dx.doi.org/10.1038/s41598-019-46907-0 |
_version_ | 1783437530976223232 |
author | Saito, Yumi Tachibana, Ryosuke O. Okanoya, Kazuo |
author_facet | Saito, Yumi Tachibana, Ryosuke O. Okanoya, Kazuo |
author_sort | Saito, Yumi |
collection | PubMed |
description | The ultrasonic vocalizations of rats can transmit affective states to listeners. For example, rats typically produce shorter calls in a higher frequency range in social situations (pleasant call: PC), whereas they emit longer calls with lower frequency in distress situations (distress call: DC). Knowing what acoustical features contribute to auditory discrimination between these two calls will help to better characterize auditory perception of vocalized sounds in rats. In turn, this could lead to better estimation of models for processing vocalizations in sensory systems in general. Here, using an operant discrimination procedure, we examined the impact of various acoustical features on discriminating emotional ultrasonic vocalizations. We did this by systematically swapping three features (frequency range, time duration, and residual frequency-modulation pattern) between two emotional calls. After rats were trained to discriminate between PC and DC, we presented probe stimuli that were synthesized calls with one or two acoustical features swapped, and examined if the rats judged these calls as either PC or DC. The results revealed that all features were important for discrimination between the two call types, but frequency range provided the most information for discrimination. This supports the hypothesis that while rats utilize all acoustical features to perceive emotional vocalizations, they considerably rely on frequency cues. |
format | Online Article Text |
id | pubmed-6646302 |
institution | National Center for Biotechnology Information |
language | English |
publishDate | 2019 |
publisher | Nature Publishing Group UK |
record_format | MEDLINE/PubMed |
spelling | pubmed-6646302 2019-07-29 Acoustical cues for perception of emotional vocalizations in rats Saito, Yumi; Tachibana, Ryosuke O.; Okanoya, Kazuo Sci Rep Article
Nature Publishing Group UK 2019-07-22 /pmc/articles/PMC6646302/ /pubmed/31332218 http://dx.doi.org/10.1038/s41598-019-46907-0 Text en © The Author(s) 2019 Open Access This article is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons license, and indicate if changes were made. The images or other third party material in this article are included in the article’s Creative Commons license, unless indicated otherwise in a credit line to the material. If material is not included in the article’s Creative Commons license and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this license, visit http://creativecommons.org/licenses/by/4.0/. |
spellingShingle | Article Saito, Yumi Tachibana, Ryosuke O. Okanoya, Kazuo Acoustical cues for perception of emotional vocalizations in rats |
title | Acoustical cues for perception of emotional vocalizations in rats |
title_full | Acoustical cues for perception of emotional vocalizations in rats |
title_fullStr | Acoustical cues for perception of emotional vocalizations in rats |
title_full_unstemmed | Acoustical cues for perception of emotional vocalizations in rats |
title_short | Acoustical cues for perception of emotional vocalizations in rats |
title_sort | acoustical cues for perception of emotional vocalizations in rats |
topic | Article |
url | https://www.ncbi.nlm.nih.gov/pmc/articles/PMC6646302/ https://www.ncbi.nlm.nih.gov/pubmed/31332218 http://dx.doi.org/10.1038/s41598-019-46907-0 |
work_keys_str_mv | AT saitoyumi acousticalcuesforperceptionofemotionalvocalizationsinrats AT tachibanaryosukeo acousticalcuesforperceptionofemotionalvocalizationsinrats AT okanoyakazuo acousticalcuesforperceptionofemotionalvocalizationsinrats |