Omar Ali
Research Interests
Fleet Mobile Robotics, Autonomous Systems, Soft Robotics/Manipulation
Publications
Zhou, L., Ali, O., Arnaud Soumo, E., Attenborough, E., Swindell, J., Davies, J., & Fox, C. (2025). 3D Printer Based Open Source Calibration Platform for Whisker Sensors. In: Huda, M.N., Wang, M., Kalganova, T. (eds) Towards Autonomous Robotic Systems. TAROS 2024. Lecture Notes in Computer Science, vol 15051. Springer, Cham. [December 2024].
Ali, O., Baxter, P., & Harman, H. (2025). Mixed Reality Visualisations for Interpretable Transparent Robot Behaviour. In: Cavalcanti, A., Foster, S., Richardson, R. (eds) Towards Autonomous Robotic Systems. TAROS 2025. Lecture Notes in Computer Science, vol 16045. Springer, Cham. [August 2025].
Presentations
“Mixed Reality for Enhanced Human-Robot Interaction” (poster) – AgriFoRwArdS CDT Annual Conference 2024: Robots in Action [July 2024] – Norwich, UK.
“SLAM (Simultaneous Localisation and Mapping)” (oral) – AgriFoRwArdS CDT Summer School: Robotic Phenotyping [July 2024] – Wageningen, The Netherlands.
“Enhancing Human-Robot Interaction with Mixed Reality” (poster) – AgriFoRwArdS CDT Summer School: Robotic Phenotyping [July 2024] – Wageningen, The Netherlands.
“3D printer based open-source calibration platform for whisker sensors” (oral) – Towards Autonomous Robotic Systems (TAROS) 2024 [August 2024] – London, UK.
“The AgriFoRwArdS CDT Summer School – Responsible adoption (Theme 3) & Phenotyping Perception beyond 2D Colour (Theme 6)” (oral) – Towards Autonomous Robotic Systems (TAROS) 2024 [August 2024] – London, UK.
“SHARP: multi-Sensor Human Activity Recognition and intention Prediction” (oral) – AgriFoRwArdS Seminar Day and Quarterly Research Progress Meeting [December 2024] – Lincoln, UK.
“Exploring activity goal and plan recognition and integrating with context and social-awareness to create natural, supportive and proactive Human-Robot Interaction” (oral) – Lincoln Centre for Autonomous Systems Human Detection and Understanding Knowledge Exchange [January 2025] – Lincoln, UK.
“SHARP: multi-Sensor Human Activity Recognition and intention Prediction” (oral) – AgriFoRwArdS CDT Annual Conference 2025 [May 2025] – Online.
“The LCASTOR Robotics Competition Team” (oral) – Lincoln Centre for Autonomous Systems Robotics and AI Showcase Event (RAISE) [June 2025] – Lincoln, UK.
“ShepSimulation” (oral) – AgriFoRwArdS CDT Summer School 2025: Going to the Dogs! [June 2025] – Lincoln, UK.
“Mixed Reality Visualisations for Interpretable Transparent Robot Behaviour” (oral) – Towards Autonomous Robotic Systems (TAROS) 2025 [August 2025] – York, UK.
Other activities
- Represented the AgriFoRwArdS CDT at the RoboCup 2024 competition as a member of the University of Lincoln team [July 2024].
- Lead for the LCASTOR outreach team [October 2024 to September 2025].
- Demonstrated the TIAGo robot at a robotics bake sale at the University of Lincoln [January 2025].
- Hosted and competed in the East Midlands Home Robotics Competition 2025 [June 2025].
- Provided planning and technical support for the LCASTOR Team entry at the International Field Robotics Event (FRE) [June 2025].
About me
I joined the AgriFoRwArdS CDT because I am deeply passionate about researching and developing robotic technologies, particularly those that enhance human-robot collaboration. I believe robotics can significantly alleviate challenges in labour-intensive industries by improving efficiency, safety, and adaptability. Agricultural robotics is especially compelling due to the complexity of tasks influenced by diverse landscapes, crop types, and weather conditions. This sector is vital to global sustenance, and technological advancements can enhance productivity and consistency while maintaining, or even improving, quality and affordability.
My research focuses on how robots can better understand and predict human actions and intentions, ultimately improving workplace conditions. This forms the foundation of my PhD, where I am exploring cognitive architectures and aspects of social robotics. I am excited about the potential of integrating intention prediction, activity recognition, and social robotics to develop more adaptive and intuitive robotic systems.
MSc Project
Mixed Reality for Enhanced Human-Robot Interaction
My MSc project explored the potential of augmented and mixed reality (collectively, extended reality or XR) to enhance human-robot interaction by providing users with additional insight into their robotic counterparts. Through XR-enabled perception, I was able to offer users a “sixth sense” that improved their intuition and interaction with robots.
One of the key objectives of the project was to integrate XR-based localisation capabilities within robotic systems, enabling seamless awareness between humans and robots, even when physically separated. This allowed users to gain a more immersive and intuitive understanding of robotic movements and decisions, ultimately leading to more effective collaboration.
The research demonstrated how mixed reality could bridge the cognitive gap between humans and robots, enhancing cooperation in various applications. The findings have the potential to contribute to the broader field of human-robot interaction by enabling more natural, transparent, and responsive robotic behaviour.
PhD Project
Multi-Sensor Human Activity Recognition and Intention Prediction (SHARP)
When humans and robots share an environment, such as a food production facility or warehouse, both must adapt to each other’s presence. While humans naturally adjust their behaviour over time, robots often lack the flexibility to do so. My PhD research focuses on advancing human-robot collaboration by recognising human activities and predicting human intentions, enabling more adaptive and natural responses from the robot.
This research aims to develop a human-centred framework for Activity, Goal, and Plan Recognition (AGPR), allowing robotic systems to understand, anticipate, and respond to human actions in structured environments such as warehouses and polytunnels. By integrating multi-sensor inputs, including Vicon motion capture, RGB, RGB-D, and LiDAR, the project seeks to create a more dynamic and adaptable approach to human-robot interaction.
The key objectives of this research are:
- Curating a Diverse Dataset – Capturing human activities, plans, and goals using multiple sensor modalities in realistic conditions.
- Developing AGPR Models – Creating integrated models that connect observed actions to inferred goals and plans (a toy illustration of this idea follows the list below).
- Implementing Adaptive Robot Behaviours – Designing proactive and reactive response mechanisms that enable robots to adjust their actions based on real-time human activity recognition.
- Establishing Bi-Directional Feedback – Ensuring AGPR systems not only predict intentions but also use these predictions to anticipate a human’s upcoming actions, improving overall adaptability and collaboration.
- Validating in Real-World Settings – Testing and refining methodologies in simulated and real-world environments.
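As a purely illustrative sketch of the goal-recognition idea mentioned in the list above (and not the SHARP implementation itself), the snippet below shows a toy Bayesian update in which a belief over possible goals is revised each time an activity is recognised. All goal names, activity labels, and probabilities here are hypothetical placeholders.

```python
# Purely illustrative: a toy Bayesian goal-recognition update, not the SHARP system.
# All goal names, activity labels, and probabilities below are hypothetical.

# Prior belief over which goal the person is pursuing.
goal_prior = {"pack_order": 0.5, "restock_shelf": 0.3, "inspect_crop": 0.2}

# P(activity | goal): how likely each recognised activity is under each goal.
# In practice these values would be learned from a multi-sensor dataset.
likelihood = {
    "pack_order":    {"pick_item": 0.6, "walk_to_shelf": 0.3, "scan_label": 0.1},
    "restock_shelf": {"pick_item": 0.2, "walk_to_shelf": 0.6, "scan_label": 0.2},
    "inspect_crop":  {"pick_item": 0.1, "walk_to_shelf": 0.3, "scan_label": 0.6},
}

def update_belief(belief, observed_activity):
    """Apply one Bayesian update of the goal belief given a recognised activity."""
    unnormalised = {
        goal: p * likelihood[goal].get(observed_activity, 1e-6)
        for goal, p in belief.items()
    }
    total = sum(unnormalised.values())
    return {goal: p / total for goal, p in unnormalised.items()}

# Example: after observing two activities, the belief concentrates on "pack_order".
belief = goal_prior
for activity in ["walk_to_shelf", "pick_item"]:
    belief = update_belief(belief, activity)
print(belief)
```

In the project itself, the recognised activity labels would come from the multi-sensor recognition models rather than being supplied by hand, and the inferred goals and plans would in turn inform the robot’s proactive and reactive behaviours.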
By developing socially aware and contextually adaptive robotic systems, this research contributes to the next generation of human-robot collaboration, bridging the gap between human intent and robotic action for safer and more efficient workplaces.
Omar’s PhD project is being carried out in collaboration with Ocado Innovation and under the supervision of Dr Helen Harman, Dr Paul Baxter, and Prof Hatice Gunes.