Tavis Shore


Postgraduate Research Student
B.Eng. (Hons), M.Sc.

About

My qualifications

M.Sc. Data Science
University of Surrey
B.Eng. (Hons) Electronic Engineering
University of York

Affiliations and memberships

MIET
Member of the Institution of Engineering and Technology
MIEEE
Member of the Institute of Electrical and Electronics Engineers

Research

Research projects

Publications

Tavis George Shore, Simon J. Hadfield, Oscar Mendez (2024) BEV-CV: Birds-Eye-View Transform for Cross-View Geo-Localisation, In: Proceedings of the IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS 24), Institute of Electrical and Electronics Engineers (IEEE)

Cross-view image matching for geo-localisation is a challenging problem due to the significant visual difference between aerial and ground-level viewpoints. The method provides localisation capabilities from geo-referenced images, eliminating the need for external devices or costly equipment. This enhances the capacity of agents to autonomously determine their position, navigate, and operate effectively in GNSS-denied environments. Current research employs a variety of techniques to reduce the domain gap, such as applying polar transforms to aerial images or synthesising between perspectives. However, these approaches generally rely on a 360 degree field of view, limiting real-world feasibility. We propose BEV-CV, an approach introducing two key novelties with a focus on improving the real-world viability of cross-view geo-localisation. Firstly, it brings ground-level images into a semantic Birds-Eye-View before matching embeddings, allowing for direct comparison with aerial image representations. Secondly, it adapts datasets into an application-realistic format: limited-FOV images aligned to vehicle direction. BEV-CV achieves state-of-the-art recall accuracies, improving Top-1 rates on 70 degree crops of CVUSA and CVACT by 23% and 24% respectively. It also decreases computational requirements, reducing floating point operations below those of previous works and reducing embedding dimensionality by 33%, together allowing for faster localisation.
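The retrieval step described in the abstract can be pictured with a minimal sketch: a query ground image is encoded (after its Birds-Eye-View transform) into an embedding and matched against a database of geo-referenced aerial embeddings, with the Top-1 nearest neighbour giving the predicted location. The snippet below is a conceptual illustration only; the embedding stand-ins, dimensions, and variable names are assumptions and do not reflect the BEV-CV implementation.

# Conceptual sketch of cross-view retrieval by embedding similarity.
# All names, shapes, and values are illustrative assumptions.
import numpy as np

def l2_normalise(x: np.ndarray) -> np.ndarray:
    """Normalise each row to unit length so a dot product equals cosine similarity."""
    return x / np.linalg.norm(x, axis=1, keepdims=True)

rng = np.random.default_rng(0)
embed_dim = 512      # embedding dimensionality (illustrative)
num_refs = 1000      # size of the geo-referenced aerial database (illustrative)

# Stand-ins for learned embeddings: in practice these would come from the
# aerial branch and the ground-to-BEV branch of a trained two-stream network.
aerial_db = l2_normalise(rng.standard_normal((num_refs, embed_dim)))
query_bev = l2_normalise(rng.standard_normal((1, embed_dim)))

# Top-1 retrieval: the aerial reference with the highest cosine similarity
# to the query embedding gives the predicted geo-location.
similarities = query_bev @ aerial_db.T          # shape (1, num_refs)
top1_index = int(np.argmax(similarities))
print(f"Predicted reference index: {top1_index}")

Reducing the embedding dimensionality, as the abstract notes, shrinks both the database size and the cost of this similarity search, which is where the faster localisation comes from.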