EPSRC Centre for Doctoral Training in Agri-Food Robotics: AgriFoRwArdS - Andrew Perrett


  • University of Lincoln in collaboration with Norfolk County Council

Research Interests

Sensing, Robotic/Computer Vision 

Publications

  • Addressing geographical domain shift for quantification of UK road verge biodiversity @ The Towards Autonomous Robots and Systems (TAROS) Conference 2023 / CDT Annual Conference / Joint Robotics CDT Conference (September 2023)

About me

I have a mixed background in electronic engineering and digital electronic fault finding within phototypesetters and industrial control gear. Programming embedded devices and writing component-level (sensor) device drivers has linked my interests in digital electronics and software throughout my life. I have also owned and run a small Internet Service Provider, which combined my interests in digital communication and business. Ten years prior to arriving at the University of Lincoln to study a BSc in Computer Science, and then an MSc by Research, I had a “back to basics”, semi-self-sufficient lifestyle. This saw me become a very small-scale market gardener, selling home-grown fruit and vegetables on a market stall. The AgriFoRwArdS CDT was a logical step to not only bring many of my interests together but take them to a much higher level. I have continued with the University of Lincoln because both the staff and the agri-robotics opportunities are fantastic.

MSc Project

Addressing geographical domain shift for quantification of UK road verge biodiversity

Previous work on DeepVerge demonstrated a method to survey the biodiversity of roadside verges within a single geographic locale using a convolutional neural network with a fully connected neural network (FCNN) classifier. Its known limitations were left as future work and now require addressing. Ordinal regression may exploit naturally occurring ranking information contained within the survey ground-truth data, information that is unavailable to DeepVerge’s nominal classification method, allowing the problem of domain shift to be addressed effectively. Addressing domain shift will extend this work, providing DeepVerge with a methodology to expand beyond one locale and survey biodiversity at a national level.
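As a generic illustration of the ordinal-regression idea (not code from the project itself, and the helper names are hypothetical), ranked labels such as ordered biodiversity survey scores can be recast as cumulative binary targets, one per rank threshold, so that a model is penalised less for confusing adjacent ranks than distant ones:

```python
def ordinal_encode(label: int, num_classes: int) -> list[int]:
    """Encode a 0-based rank as num_classes - 1 cumulative binary
    targets, answering "is the rank greater than threshold t?"."""
    return [1 if label > t else 0 for t in range(num_classes - 1)]


def ordinal_decode(probs: list[float], threshold: float = 0.5) -> int:
    """Decode per-threshold probabilities back to a rank by counting
    how many thresholds the prediction clears."""
    return sum(1 for p in probs if p > threshold)
```

For example, rank 2 of 5 encodes to `[1, 1, 0, 0]`; a model that outputs per-threshold probabilities `[0.9, 0.8, 0.3, 0.1]` decodes back to rank 2. A nominal classifier discards this ordering entirely.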

PhD Project

Multi-modal fusion for remote biodiversity surveys of linear green infrastructure

The agricultural sector has a huge role to play in conserving biodiversity and mitigating its impact on the climate crisis. Biodiversity is an important indicator of the overall health of a habitat, and of the environment in general. However, agricultural activities are typically focused on growing a single crop type over a large surface area and rely on a number of substances (ranging from fertilisers to pesticides and herbicides) that are known to cause biodiversity losses, not just in arable fields but also in the areas surrounding them. This project proposes to develop the means of automatically monitoring the biodiversity of habitats surrounding cultivated land, such as hedgerows, using computer vision techniques that incorporate information from several input sources, including satellites, UAVs, and mobile robots.

The project will initially focus on the continuation of previous work applied to road verges, where a computer vision approach (based on deep convolutional neural networks) will be extended as follows: 

(1) re-training of the model using data supplied by industry collaborator Gaist, 

(2) transfer of knowledge learned on road verges to other domains, such as hedgerows and additional geographic locations, 

(3) incorporation of hierarchical and ordinal relationships when classifying images. 

The second phase of the project will focus on incorporating additional imaging modalities, such as satellite imagery, UAV photographs, and images collected by mobile robots. In the third and final phase, a hierarchical classification system that effectively utilises information from each of the different modalities will be developed.

The PhD candidate will have the opportunity to study and develop computer vision techniques and contribute to the state of the art in deep learning-based methods (e.g., convolutional neural networks, vision transformers). As such, they will develop skills in image processing, machine learning, and data visualisation. The candidate will also develop skills in working with several hardware platforms and in the operation of a UAV. They will develop their oral and written communication skills through the production of journal manuscripts, conference and seminar talks, poster presentations, and engagement with collaborators from other disciplines.

Andy’s PhD project is being carried out in collaboration with Norfolk County Council, under the primary supervision of Dr Petra Bosilj.