Juan Manuel Galeazzi González
BA(Hons), MSc(Res), DPhil
Postdoctoral Research Fellow
Computational Neuroscience Tutor and Lab Demonstrator
My research focuses on understanding how sensory signals are integrated and transformed into motor actions. In particular, we are interested in how visual information is combined with other signals to guide motor commands, such as reaching to a visual target.
To investigate this, I am currently using artificial neural networks to simulate some of the neuronal properties observed in different parts of the primate posterior parietal cortex and premotor areas.
We have recently modeled the visually guided development of neurons that encode the position of targets in the frame of reference of the hand. This result advances our understanding of how these neurons self-organize through visually guided learning.
We are now continuing to develop artificial neural network models that explore how different positional signals can be used to implement reference frame transformations.
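The core idea behind these models can be illustrated with a minimal sketch. This is not the published architecture (the actual models use hierarchical networks trained on visual input); it is a hypothetical toy example, with made-up layer sizes and one-hot "retinal position" and "posture" codes, showing how a Hebbian rule with competition lets an output neuron become selective for a conjunction of positional signals:

```python
import numpy as np

rng = np.random.default_rng(0)
n_in, n_out = 8, 4          # toy sizes, chosen only for illustration
eta = 0.1                   # learning rate

# Random positive initial weights, normalised per output neuron
W = rng.random((n_out, n_in))
W /= np.linalg.norm(W, axis=1, keepdims=True)

# Toy input: a conjunction of a "retinal target position" unit and a
# "posture" unit (both hypothetical one-hot codes, not the real inputs)
x = np.zeros(n_in)
x[2] = 1.0      # target at retinal position 2
x[5] = 1.0      # posture signal 1

winner = int(np.argmax(W @ x))   # competition simplified to hard winner-take-all
for _ in range(30):
    y = W @ x
    w = int(np.argmax(y))
    W[w] += eta * y[w] * x              # Hebbian: change ∝ post × pre activity
    W[w] /= np.linalg.norm(W[w])        # normalisation keeps weights bounded

# After repeated pairings, the winning neuron's weights align with the
# trained conjunction: it responds selectively to this target/posture pairing.
alignment = W[winner] @ x / np.linalg.norm(x)
```

The point of the sketch is the mechanism: correlated visual and postural inputs repeatedly co-activate the same competing neuron, and the Hebbian update makes that neuron increasingly selective for the conjunction, which is the kind of self-organization the models above explore at larger scale.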
Hebbian learning of hand-centred representations in a hierarchical neural network model of the primate visual system.
Born J. et al. (2017), PLoS One, 12
The visual development of hand-centered receptive fields in a neural network model of the primate visual system trained with experimentally recorded human gaze changes.
Galeazzi JM. et al. (2016), Network, 27, 29 - 51
The Development of Hand-Centered Visual Representations in the Primate Brain: A Computer Modeling Study Using Natural Visual Scenes.
Galeazzi JM. et al. (2015), Front Comput Neurosci, 9
Neural Network Modelling of Hierarchical Motor Function in the Brain
Galeazzi JM. and Stringer SM. (2013), Computational and Robotic Models of the Hierarchical Organization of Behavior, 293 - 317
A self-organizing model of the visual development of hand-centred representations.
Galeazzi JM. et al. (2013), PLoS One, 8