At every moment in time, an animal moving through its environment has to decide whether to turn left, turn right, or continue straight ahead. To do so, it must compare its current heading with its desired heading and compensate for any misalignment between these two angles with steering movements. We have developed a computational model of a brain region called the central complex that carries out this computation in the context of path integration.
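As an illustration only, and not the published model itself, the elementary steering decision described above can be sketched as a signed angular error between current and desired heading. The function name, the dead-zone threshold, and the left/right sign convention below are all hypothetical choices for this sketch.

```python
import math

def steering_command(current_heading, desired_heading, dead_zone=0.05):
    """Illustrative sketch: decide whether to turn left, turn right, or go straight.

    Headings are in radians. The returned value is the signed angular error,
    wrapped to [-pi, pi], which could drive a steering command. The dead_zone
    threshold is a hypothetical parameter, not taken from the model.
    """
    # Signed difference between desired and current heading, wrapped to [-pi, pi]
    error = math.atan2(math.sin(desired_heading - current_heading),
                       math.cos(desired_heading - current_heading))
    if error > dead_zone:
        return "turn left", error    # desired heading lies counter-clockwise of current
    elif error < -dead_zone:
        return "turn right", error   # desired heading lies clockwise of current
    return "go straight", error

# Example: current heading 10 degrees clockwise of the desired (e.g. homeward) direction
print(steering_command(math.radians(90), math.radians(100)))  # ('turn left', ~0.17)
```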
Based on this computational model of path integration in the bee central complex, we have developed hypotheses about how these elementary navigation decisions could be implemented at the level of concrete neural circuits for many distinct navigation strategies (Honkanen et al., 2019). To test the predictions generated by these hypotheses, we are comparing the neural circuitry of the central complex across many insect species with different navigational strategies.
We currently focus on hymenopteran central-place foragers, i.e., bees and ants that use different forms of navigation to return to their nest. In particular, we use the honeybee, the bumblebee, the nocturnal sweat bee Megalopta genalis, the desert ant Cataglyphis fortis, the army ant Eciton hamatum, and the Australian bull ant.
The brains of these species are imaged with block-face electron microscopy, generating 3D datasets at nanometer resolution from which the complete set of central-complex neurons and their projections (the projectome) can be delineated in each species. Individual modules of the central complex are additionally imaged at synaptic resolution, allowing us to derive local connectomics data.
Together, these data will allow us to extract the core navigation circuitry conserved across all these insects and, importantly, to reveal the adaptations that allow each species to carry out its distinct navigation strategies, all at the level of defined neural circuits.
In the long term, we will emulate the identified differences in computational models of the central complex and directly test functional predictions with electrophysiology. In addition, long-distance migratory insects, such as the Bogong moth and the Monarch butterfly, will be investigated in collaborative projects in the near future.
Methods: Block-face electron microscopy, 3D neuron tracing, large-scale image processing, CATMAID-based collaborative neuron annotation
Collaborators: Ajay Narendra (Macquarie University, Sydney, Australia); CMM (Center for Microscopy and Microanalysis), Queensland Brain Institute, Brisbane, Australia; Harald Wolf (University of Ulm, Germany)
Funding: VR, ERC
Master's and Bachelor's projects are always available in this project.