SilicoLabs builds tools that allow researchers to capture and decode behaviour, revealing the foundations of learning, decision-making, and action.
Built around codeless, easy-to-use, and versatile graphical user interfaces, SilicoLabs software tools are hardware-agnostic and run on all platforms, from desktop computers to mobile devices and VR/AR/XR headsets.
Their flagship software, LABO, allows anyone to quickly and easily create interactive experiences that simulate the real world. LABO captures high-fidelity behavioural data, such as hand, face, and eye tracking when using XR devices, as well as data from biosensors like EEG, unlocking new insights into how people interact with each other and their world. LABO is fully compatible with the Lab Streaming Layer (LSL) and can easily collect and synchronize data streams from a wide variety of biosensor devices, including EEG, fNIRS, EKG, EMG, eye-tracking systems, kinematics, respiration, and more.
Founded by two University of Toronto graduate students and now based in Montreal, SilicoLabs is partially supported by the Mila – Quebec AI Institute, and its software tools are used by neuroscience and AI laboratories in Canada, the United States, and Europe.