Omar Ali
Research Interests
Fleet Mobile Robotics, Autonomous Systems, Soft Robotics/Manipulation
Presentations
- “Mixed Reality for Enhanced Human-Robot Interaction” (poster) – AgriFoRwArdS CDT Annual Conference 2024: Robots in Action [July 2024] – Norwich, UK.
- “SLAM (Simultaneous Localisation and Mapping)” (oral) – AgriFoRwArdS CDT Summer School: Robotic Phenotyping [July 2024] – Wageningen, The Netherlands.
- “Enhancing Human-Robot Interaction with Mixed Reality” (poster) – AgriFoRwArdS CDT Summer School: Robotic Phenotyping [July 2024] – Wageningen, The Netherlands.
- “3D printer based open-source calibration platform for whisker sensors” (oral) – Towards Autonomous Robotic Systems (TAROS) 2024 [August 2024] – London, UK.
- “The AgriFoRwArdS CDT Summer School – Responsible adoption (Theme 3) & Phenotyping Perception beyond 2D Colour (Theme 6)” (oral) – Towards Autonomous Robotic Systems (TAROS) 2024 [August 2024] – London, UK.
About me
I joined the AgriFoRwArdS CDT because I am deeply passionate about researching and developing robotic technologies, particularly those that enhance human-robot collaboration. I believe robotics can significantly alleviate challenges in labour-intensive industries by improving efficiency, safety, and adaptability. Agricultural robotics is especially compelling due to the complexity of tasks influenced by diverse landscapes, crop types, and weather conditions. This sector is vital to global sustenance, and technological advancements can enhance productivity and consistency while maintaining, or even improving, quality and affordability.
My research focuses on how robots can better understand and predict human actions and intentions, ultimately improving workplace conditions. This forms the foundation of my PhD, where I am exploring cognitive architectures and aspects of social robotics. I am excited about the potential of integrating intention prediction, activity recognition, and social robotics to develop more adaptive and intuitive robotic systems.
MSc Project
Mixed Reality for Enhanced Human-Robot Interaction
My MSc project explored the potential of extended reality (XR), spanning augmented and mixed reality, to enhance human-robot interaction by providing users with additional insights into their robotic counterparts. Through XR-enabled perception, I was able to offer a “sixth sense” that improved users’ intuition and interaction with robots.
One of the key objectives of the project was to integrate XR-based localisation capabilities within robotic systems, enabling seamless awareness between humans and robots, even when physically separated. This allowed users to gain a more immersive and intuitive understanding of robotic movements and decisions, ultimately leading to more effective collaboration.
The research demonstrated how mixed reality could bridge the cognitive gap between humans and robots, enhancing cooperation in various applications. The findings have the potential to contribute to the broader field of human-robot interaction by enabling more natural, transparent, and responsive robotic behaviour.
PhD Project
Multi-Sensor Human Activity Recognition and Intention Prediction (SHARP)
When humans and robots share an environment, such as a food production facility or warehouse, both must adapt to each other’s presence. While humans naturally adjust their behaviour over time, robots often lack the flexibility to do so. My PhD research focuses on advancing human-robot collaboration by recognising human activities and predicting human intentions, enabling more adaptive and natural responses from the robot’s side.
This research aims to develop a human-centred framework for Activity, Goal, and Plan Recognition (AGPR), allowing robotic systems to understand, anticipate, and respond to human actions in structured environments such as warehouses and polytunnels. By integrating multi-sensor inputs, including Vicon motion capture, RGB and RGB-D cameras, and LiDAR, the project seeks to create a more dynamic and adaptable approach to human-robot interaction.
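To make the idea concrete, here is a minimal, hypothetical sketch (not code from the project) of how per-sensor activity estimates might be fused and mapped to an inferred goal; the names, the confidence-weighted vote, and the lookup table are all illustrative assumptions, whereas a real AGPR system would rely on learned models:

```python
from dataclasses import dataclass
from collections import Counter

@dataclass
class Observation:
    sensor: str        # e.g. "vicon", "rgb", "rgbd", "lidar" (hypothetical labels)
    activity: str      # per-sensor activity estimate
    confidence: float  # estimate confidence in [0, 1]

def fuse_activities(observations):
    """Confidence-weighted vote across sensor modalities."""
    scores = Counter()
    for obs in observations:
        scores[obs.activity] += obs.confidence
    activity, _ = scores.most_common(1)[0]
    return activity

# Toy activity-to-goal mapping; a real system would infer goals and plans
# from models trained on the curated dataset, not a lookup table.
GOALS = {"reach_shelf": "pick_item", "walk_to_station": "pack_order"}

def infer_goal(activity):
    return GOALS.get(activity, "unknown")

if __name__ == "__main__":
    observations = [
        Observation("vicon", "reach_shelf", 0.9),
        Observation("rgbd", "reach_shelf", 0.7),
        Observation("lidar", "walk_to_station", 0.4),
    ]
    activity = fuse_activities(observations)
    print(activity, "->", infer_goal(activity))  # reach_shelf -> pick_item
```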
The key objectives of this research are:
- Curating a Diverse Dataset – Capturing human activities, plans, and goals using multi-sensor modalities in realistic conditions.
- Developing AGPR Models – Creating integrated models that connect observed actions to inferred goals and plans.
- Implementing Adaptive Robot Behaviours – Designing proactive and reactive response mechanisms that enable robots to adjust their actions based on real-time human activity recognition.
- Establishing Bi-Directional Feedback – Ensuring AGPR systems not only predict intentions but also use those predictions to anticipate a human’s upcoming actions, improving overall adaptability and collaboration.
- Validating in Real-World Settings – Testing and refining methodologies in simulated and real-world environments.
By developing socially aware and contextually adaptive robotic systems, this research contributes to the next generation of human-robot collaboration, bridging the gap between human intent and robotic action for safer and more efficient workplaces.
Omar’s PhD project is being carried out in collaboration with Ocado Innovation and under the supervision of Dr Helen Harman, Dr Paul Baxter, and Prof Hatice Gunes.