Visiomode: An open-source platform for building rodent touchscreen-based behavioral assays

Bibliographic Details
Main Authors: Eleftheriou, Constantinos, Clarke, Thomas, Poon, V., Zechner, Marie, Duguid, Ian
Format: Online Article Text
Language: English
Published: Elsevier/North-Holland Biomedical Press 2023
Subjects:
Online Access: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC10375831/
https://www.ncbi.nlm.nih.gov/pubmed/36621552
http://dx.doi.org/10.1016/j.jneumeth.2022.109779
Description
Summary: BACKGROUND: Touchscreen-based behavioral assays provide a robust method for assessing cognitive behavior in rodents, offering great flexibility and translational potential. Developing touchscreen assays presents a significant programming and mechanical engineering challenge: commercial solutions can be prohibitively expensive, while open-source solutions remain underdeveloped and offer limited adaptability.

NEW METHOD: Here, we present Visiomode (www.visiomode.org), an open-source platform for building rodent touchscreen-based behavioral tasks. Visiomode leverages the inherent flexibility of touchscreens to offer a simple yet adaptable software and hardware platform. The platform is built on the Raspberry Pi computer, combining a web-based interface and a powerful plug-in system with an operant chamber that can be adapted to generate a wide range of behavioral tasks.

RESULTS: As a proof of concept, we use Visiomode to build both simple stimulus-response and more complex visual discrimination tasks, showing that mice display rapid sensorimotor learning, including switching between different motor responses (i.e., nose poke versus reaching).

COMPARISON WITH EXISTING METHODS: Commercial solutions are the ‘go-to’ for rodent touchscreen behaviors, but the associated costs can be prohibitive, limiting their uptake by the wider neuroscience community. While several open-source solutions have been developed, efforts so far have focused on reducing cost rather than promoting ease of use and adaptability. Visiomode addresses these unmet needs by providing a low-cost, extensible platform for creating touchscreen tasks.

CONCLUSIONS: Developing an open-source, rapidly scalable and low-cost platform for building touchscreen-based behavioral assays should increase uptake across the scientific community and accelerate the investigation of cognition, decision-making and sensorimotor behaviors in both health and disease.
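The stimulus-response task described in the abstract can be illustrated with a minimal sketch in Python (the language Visiomode is written in). Note that all names here (`Stimulus`, `run_trial`, the simulated touch coordinates) are invented for illustration and are not Visiomode's actual plug-in API; see www.visiomode.org for the real interface.

```python
# Hypothetical sketch of stimulus-response trial logic, NOT Visiomode's actual API.
from dataclasses import dataclass


@dataclass
class Stimulus:
    """A rectangular target region on the touchscreen (in pixels)."""
    x: int
    y: int
    width: int
    height: int

    def contains(self, tx: int, ty: int) -> bool:
        """Return True if the touch point (tx, ty) falls inside the target."""
        return (self.x <= tx < self.x + self.width
                and self.y <= ty < self.y + self.height)


def run_trial(stimulus: Stimulus, touch: tuple[int, int]) -> str:
    """Classify a single touch as a hit ('reward') or a miss ('timeout')."""
    return "reward" if stimulus.contains(*touch) else "timeout"


if __name__ == "__main__":
    stim = Stimulus(x=100, y=50, width=200, height=200)
    # Simulated touch events standing in for real touchscreen input.
    print(run_trial(stim, (150, 120)))  # touch inside the target -> reward
    print(run_trial(stim, (400, 400)))  # touch outside the target -> timeout
```

In a real task plug-in, the simulated touches would be replaced by touchscreen events, and the outcome would drive a reward spout or an inter-trial timeout; a visual discrimination task would extend this by presenting two stimuli and rewarding only the correct one.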