
This project - funded by the Engineering and Physical Sciences Research Council (EPSRC) - addresses the challenge of ever-present involuntary movements of our eyes by combining ideas, methods and people from theoretical physics of random motion, and from the life sciences of visual neuroscience and psychology.

Project Summary

How should a sensory system, such as vision or hearing, optimally sample the world? Too much detail takes too long, and too much resource, to acquire and process; too little risks failing to capture vital information about what is going on around us. Human vision is a fascinating example: even when the gaze is 'fixed' on a target object, or performing a specific task such as deciding which of two objects is higher, the eyes are in constant, apparently random motion that we do not understand. One might assume that such involuntary movements of the eye could only 'blur' vision, but there are reasons to believe that they might actually enhance it. One reason for suspecting this is that many aspects of vision are known to have evolved towards the best performance possible.

The project - the Physics of Fixational Eye Movements (PhysFEM) - tackles these ever-present involuntary eye movements by combining ideas, methods and people from the theoretical physics of random motion, and from the life sciences of visual neuroscience and psychology. The combination is new and potentially powerful: there is almost no realm of physics that does not think about finding paths or trajectories that optimise something. Examples arise in complex classical mechanics, quantum mechanics, and the thermal physics of random processes. These ideas from physics provide a natural but fresh way of thinking about the possibility that fixational eye movements optimise some aspects of vision. More than that, they bring new calculation methods to find such paths, leading to empirically testable predictions about how the eyes might move to maximise the information available under particular task demands. The objective measurement of human visual performance under controlled conditions - the domain of 'psychophysics' - completes the iterative cycles of model predictions and testing.
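The random-motion framing above can be made concrete with a minimal sketch: fixational drift is often modelled, to first approximation, as two-dimensional Brownian motion of gaze position. The snippet below (not project code; the function names and the diffusion constant are illustrative assumptions, and real drift statistics are richer than pure diffusion, which is exactly what such models are tested against) simulates a drift trajectory and estimates its mean squared displacement, which for 2D diffusion grows linearly with time as MSD(t) = 4Dt.

```python
import numpy as np

def simulate_drift(n_steps=10_000, dt=1e-3, D=40.0, seed=0):
    """Simulate fixational drift as 2D Brownian motion.

    D is a diffusion constant in arcmin^2/s (order of magnitude only;
    an illustrative assumption, not a project estimate). Returns an
    (n_steps + 1, 2) array of gaze positions in arcmin, starting at 0.
    """
    rng = np.random.default_rng(seed)
    # Each step is Gaussian with variance 2*D*dt per coordinate.
    steps = rng.normal(0.0, np.sqrt(2.0 * D * dt), size=(n_steps, 2))
    return np.vstack([np.zeros(2), np.cumsum(steps, axis=0)])

def mean_squared_displacement(path, lag):
    """Estimate MSD at a given lag (in steps), averaged along the path."""
    disp = path[lag:] - path[:-lag]
    return float(np.mean(np.sum(disp**2, axis=1)))
```

Comparing the estimated MSD curve against the 4Dt prediction is one simple way such a model yields an empirically checkable statement about recorded eye traces; departures from linearity (for example, persistence or self-avoidance) signal structure beyond pure diffusion.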

Physics also enters the experimental side of this project through the equipment used in the Oxford Perception Lab to measure eye movements. The team use 'adaptive optics' - first developed for large astronomical telescopes to correct the optical distortions of the atmosphere - to image the retina at the back of the eye whilst correcting for distortions in the eye's fluid-filled optical components. Such correction permits high-resolution imaging of individual cells in the human retina without any invasive procedure. The Adaptive Optics Scanning Laser Ophthalmoscope (AO-SLO) will perform three simultaneous tasks in PhysFEM: (i) projecting controlled images onto the retina; (ii) imaging the retina at the resolution of individual light-detecting ('photoreceptor') cells; (iii) measuring eye movements with unprecedented accuracy, under a series of specific visual tasks.

The findings of the project will be built into a growing, open-access computational tool for vision science and technology, ISETBio, through a collaboration with its originator and director at the University of Pennsylvania, USA. ISETBio is an open-source set of software tools that characterise the sensory processes of early vision. It provides a platform for realistic computational implementation and evaluation of models of how neural processing incorporates fixational eye movements (FEMs), and for transferring this work to the international community of researchers in biological and computer vision and in medicine.


Conference Presentations

Cui J., Villamil M., Ponting S., et al., Oral Presentation: Evaluating the Effect of Pupil Diameter Change on AOSLO Image Quality without Pupil Dilation, Optica Biophotonics Congress: Biomedical Optics, paper TTu3B.7, Fort Lauderdale US, 7th-10th April, 2024.

Cui J., Villamil M., Hexley A.C., et al., Poster: Imaging Reproducibility Using an Adaptive Optics Scanning Laser Ophthalmoscope without Pupil Dilation, ARVO Annual Meeting, poster 5931, Seattle US, 5th-9th May, 2024.

Wang, M. Poster: Vernier thresholds of a Poisson-noise-limited computational observer with and without fixational eye movements, Fall Vision Meeting, Seattle, 5th-8th October, 2023. 

Wang, M. Talk: Computational simulations of Vernier thresholds: role of fixational eye movements in visual sampling, Applied Vision Association Christmas Meeting, Royal Holloway, University of London, 18th December, 2023. 

Wang, M. Talk: Simulations of Vernier thresholds unravel the role of fixational eye movements in visual sampling, Experimental Psychology Society London Meeting, University College London, 3rd-5th January, 2024.

 

News


Jiahe ARVO

May 2024: Congratulations to Dr Jiahe Cui who presented at the annual ARVO meeting in Seattle. In the photos above Jiahe is pictured (A) with her poster at the start of the poster session; (B) presenting to fellow conference attendees; (C) sharing her poster with Professor Austin Roorda (Advisory Board Member: University of California, Berkeley); (D) sharing her poster with Dr Niamh Winn (University of Pennsylvania).


PhysFEM meeting 

April 2024: PhysFEM project team members came together for an in-person Project Team Meeting. The team met in the Department of Experimental Psychology and Pembroke College, University of Oxford. During the meeting, team members presented up-to-date results and short experiments were carried out on the Oxford AOSLO to demonstrate current imaging capabilities. It was a wonderful opportunity to welcome our newer PDRAs (Fabian and Zahra).


Mengxin award

December 2023: Congratulations to Dr Mengxin Wang (PDRA) who was awarded the Honourable Talk Prize for her presentation: Computational simulations of Vernier thresholds: role of fixational eye movements in visual sampling, Applied Vision Association Christmas Meeting, Royal Holloway, University of London, 18th December, 2023.

Our Team

Professor Hannah Smithson (Co-PI) - University of Oxford

Professor Daniel Read (Co-PI) - University of Leeds

Professor Martin Booth (Co-I) - University of Oxford

Professor David Brainard (Collaborator) - University of Pennsylvania

Dr Allie Schneider (Research Co-I) - University of Oxford

Dr Mengxin Wang (PDRA) - University of Oxford

Dr Zahra Bagheri (PDRA) - University of Oxford

Dr Fabian Coupette (PDRA) - University of Leeds

Dr Jiahe Cui (PDRA) - University of Oxford

Dr Rebekah White (Administrative PDRA) - University of Oxford

Former Team Members

We are indebted to our friend and colleague, Professor Tom McLeish. Tom was one of the original co-PIs on this project, and the insights, questions, ideas, warmth and passion that he brought to the project are a source of inspiration to the current team.

We are also very grateful to Dr Alex Houston who was a PDRA on this project and maintains collaborative links with the current team.

Advisory Board

Professor William Bialek - Princeton University

Dr Jenny Bosten - University of Sussex

Professor Philip Nelson - University of Pennsylvania

Professor Austin Roorda - University of California, Berkeley

Forthcoming Events

The next Project Team Meeting - bringing together our researchers from Pennsylvania, Leeds and Oxford - will be held in the Department of Experimental Psychology and Pembroke College, University of Oxford: 2-3 July, 2024.

 

The next Advisory Board Meeting will be held in Pembroke College, University of Oxford: 2-3 October, 2024.

 


Gallery

Jiahe Cui visiting Dr Ramkumar Sabesan's lab at the University of Washington with fellow ARVO conference attendees
