FicTrac: A visual method for tracking spherical motion and generating fictive animal paths

Research output: Contribution to journal › Article

Abstract

Studying how animals interface with a virtual reality can further our understanding of how attention, learning and memory, sensory processing, and navigation are handled by the brain, at both the neurophysiological and behavioural levels. To this end, we have developed a novel vision-based tracking system, FicTrac (Fictive path Tracking software), for estimating the path an animal makes whilst rotating an air-supported sphere, using only input from a standard camera and computer vision techniques. We have found that the accuracy and robustness of FicTrac outperform those of a low-cost implementation of a standard optical mouse-based approach for generating fictive paths. FicTrac is simple to implement for a wide variety of experimental configurations and, importantly, is fast to execute, enabling real-time sensory feedback for behaving animals. We have used FicTrac to record the behaviour of tethered honeybees, Apis mellifera, whilst presenting visual stimuli in both open-loop and closed-loop experimental paradigms. We found that FicTrac could accurately register the fictive paths of bees as they walked towards bright green vertical bars presented on an LED arena. Using FicTrac, we have demonstrated closed-loop visual fixation in both the honeybee and the fruit fly, Drosophila melanogaster, establishing the flexibility of this system. FicTrac provides the experimenter with a simple yet adaptable system that can be combined with electrophysiological recording techniques to study the neural mechanisms of behaviour in a variety of organisms, including walking vertebrates.
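The core idea of generating a fictive path from a trackball can be illustrated with a short sketch: each frame, the estimated ball rotation is split into forward, sideways, and turning components, and these body-frame steps are integrated into a world-frame trajectory. This is a minimal illustrative example, not FicTrac's actual implementation; the function name, input format, and unit ball radius are all assumptions.

```python
import math

def integrate_fictive_path(rotations, radius=1.0):
    """Integrate per-frame ball rotations into a fictive 2D path.

    Hypothetical sketch (not FicTrac's real code): `rotations` is a
    sequence of (d_pitch, d_roll, d_yaw) tuples in radians, where
    pitch rotates the ball under the animal (forward walking), roll
    corresponds to sideways slip, and yaw to turning in place.
    Returns the list of (x, y) positions, starting at the origin.
    """
    x = y = heading = 0.0
    path = [(x, y)]
    for d_pitch, d_roll, d_yaw in rotations:
        heading += d_yaw                  # turning updates the heading
        fwd = radius * d_pitch            # arc length walked forward
        side = radius * d_roll            # arc length slipped sideways
        # Rotate the body-frame step into the world frame.
        x += fwd * math.cos(heading) - side * math.sin(heading)
        y += fwd * math.sin(heading) + side * math.cos(heading)
        path.append((x, y))
    return path
```

For example, ten frames of pure forward rotation trace a straight line, while pure yaw leaves the fictive position unchanged; in a closed-loop setup the latest position and heading would drive the visual stimulus each frame.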

Details

Authors
  • Richard JD Moore
  • Gavin Taylor
  • Angelique C Paulk
  • Thomas Pearson
  • Bruno van Swinderen
  • Mandyam V Srinivasan
Organisations
External organisations
  • University of Queensland
Research areas and keywords

Subject classification (UKÄ)

  • Biological Sciences
Original language: English
Pages (from-to): 106-119
Journal: Journal of Neuroscience Methods
Volume: 225
Publication status: Published - 2014
Publication category: Research
Peer-reviewed: Yes