Omar Ali
Research Interests
Fleet Mobile Robotics, Autonomous Systems, Soft Robotics/Manipulation
Presentations
- “Mixed Reality for Enhanced Human-Robot Interaction” (poster) – AgriFoRwArdS CDT Annual Conference 2024: Robots in Action [July 2024] – Norwich, UK.
- “SLAM (Simultaneous Localisation and Mapping)” (oral) – AgriFoRwArdS CDT Summer School: Robotic Phenotyping [July 2024] – Wageningen, The Netherlands.
- “Enhancing Human-Robot Interaction with Mixed Reality” (poster) – AgriFoRwArdS CDT Summer School: Robotic Phenotyping [July 2024] – Wageningen, The Netherlands.
- “3D printer based open-source calibration platform for whisker sensors” (oral) – Towards Autonomous Robotic Systems (TAROS) 2024 [August 2024] – London, UK.
- “The AgriFoRwArdS CDT Summer School – Responsible adoption (Theme 3) & Phenotyping Perception beyond 2D Colour (Theme 6)” (oral) – Towards Autonomous Robotic Systems (TAROS) 2024 [August 2024] – London, UK.
About me
I joined the AgriFoRwArdS CDT because I am keen to research and develop robotic technologies, which I believe can help address the problems that arise in labour-intensive roles. Agricultural technologies are particularly interesting because of the complexity of the tasks involved, given the vast landscapes, varied crops, and changing weather conditions. Agriculture is a vital field: communities depend on regular farming for their sustenance, so technologies that make it more efficient and consistent could also make food more affordable for everyone without compromising, and perhaps even improving, its quality.
I’m eager to apply my knowledge at Lincoln and engage with cutting-edge technologies. I intend for my PhD research to delve into fleet mobile robotics, aiming to enhance farm health and efficiency. In my free time, I’m an avid web-tech and archery enthusiast, a gamer, and a music lover. Plus, I’m a third-culture kid – guess where I’m from!
MSc Project
Mixed Reality for Enhanced Human-Robot Interaction
My MSc project will explore the potential of augmented and mixed reality to enhance human-robot interaction, with the ultimate aim of developing a game of tag between humans and robots. The goal is to provide both human and robot participants with a “sixth sense” through the use of extended reality (XR) capable headsets.
The long-term objective is to equip a master planner with tools to create rules and obscure certain information, producing a strategic game environment. These tools will help the robot gather additional information about the human participant via an XR headset and, similarly, allow the human to gain insights into, and exert control over, the robot. The immediate focus is on integrating the XR headset’s localisation capabilities into ROS stacks, enabling the robot and the user to be aware of each other’s location without being in the same space.
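As a rough illustration of that integration step, the sketch below publishes a headset’s pose into ROS 2 so the robot’s stack can track the user. It assumes ROS 2 with rclpy; the get_headset_pose() helper, the headset/pose topic name, and the shared map frame are all hypothetical stand-ins for whatever the headset bridge actually provides.

```python
# Minimal sketch: publish an XR headset's pose into ROS 2 so the robot's
# stack can track the user's location. get_headset_pose() is a hypothetical
# placeholder for poses streamed from the headset (e.g., over a network bridge).
import rclpy
from rclpy.node import Node
from geometry_msgs.msg import PoseStamped


def get_headset_pose():
    """Hypothetical: latest headset pose as (position, quaternion)."""
    return (0.0, 0.0, 1.7), (0.0, 0.0, 0.0, 1.0)


class HeadsetPosePublisher(Node):
    def __init__(self):
        super().__init__('headset_pose_publisher')
        self.pub = self.create_publisher(PoseStamped, 'headset/pose', 10)
        self.create_timer(0.05, self.publish_pose)  # ~20 Hz

    def publish_pose(self):
        (x, y, z), (qx, qy, qz, qw) = get_headset_pose()
        msg = PoseStamped()
        msg.header.stamp = self.get_clock().now().to_msg()
        msg.header.frame_id = 'map'  # assumes headset and robot share a map frame
        msg.pose.position.x, msg.pose.position.y, msg.pose.position.z = x, y, z
        msg.pose.orientation.x, msg.pose.orientation.y = qx, qy
        msg.pose.orientation.z, msg.pose.orientation.w = qz, qw
        self.pub.publish(msg)


def main():
    rclpy.init()
    rclpy.spin(HeadsetPosePublisher())


if __name__ == '__main__':
    main()
```

A node on the robot could then subscribe to headset/pose, and the headset could receive the robot’s estimated pose in the opposite direction, giving each participant the “sixth sense” described above.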
This research could advance the field of human-robot interaction by demonstrating practical applications of mixed reality in robotics, paving the way for more intuitive and immersive interaction and enhancing collaborative tasks across a range of fields.
PhD Project
multi-Sensor Human Activity Recognition and intention Prediction (SHARP)
When humans and robots are present within an environment (e.g., a food factory), both types of agents must adapt their behaviour to the presence of the other. Humans seem to adapt naturally over time, whereas robots’ behaviour currently tends to be more rigid. Our aim is to evaluate how humans are affected by the actions performed by a robot, leading to humans and robots co-existing and cooperating more effectively. This project focuses on the first step towards this vision: recognising humans’ activities and predicting their intentions. For example, the system could recognise that a human picked up a mop and filled a bucket with water, and thus predict that their intention is to clean the floor.
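As a toy illustration of that recognition-to-prediction step, the sketch below maps a recognised activity sequence to a predicted intention. The activity labels and cue table are invented for the example, not the project’s actual method.

```python
# Toy sketch: predict a human's intention from a sequence of recognised
# activities. Activity labels and the cue table are illustrative only.
from collections import Counter

# Hypothetical mapping from observed activities to candidate intentions.
INTENTION_CUES = {
    'pick_up_mop': ['clean_floor'],
    'fill_bucket': ['clean_floor', 'water_plants'],
    'open_fridge': ['prepare_food'],
}


def predict_intention(activities):
    """Return the intention most consistent with the observed activities."""
    votes = Counter()
    for activity in activities:
        for intention in INTENTION_CUES.get(activity, []):
            votes[intention] += 1
    return votes.most_common(1)[0][0] if votes else None


# The example from the project description: mop + bucket of water.
print(predict_intention(['pick_up_mop', 'fill_bucket']))  # clean_floor
```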
The project will involve four core technology-based stages: (1) setting up an emulated agri-food environment with a set of sensors (e.g., the Vicon system); (2) designing and performing experiments to evaluate the performance of activity recognition methods within that environment; (3) combining the best performing activity recognition approach with an intention prediction method and evaluating the performance within the test environment; and (4) investigating how the approach could be scaled and deployed within a more realistic setting (e.g., the industry partner’s site and/or the environments available at UoL’s Riseholme campus).
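For stage (2), a minimal sketch of how competing activity-recognition methods might be scored against held-out labelled data is shown below; it assumes scikit-learn, and the labels and predictions are illustrative only.

```python
# Toy sketch of stage (2): compare activity-recognition methods on held-out
# labelled data. Ground truth and predictions here are invented examples.
from sklearn.metrics import f1_score

truth = ['pick_up_mop', 'fill_bucket', 'mop_floor', 'idle']
method_a = ['pick_up_mop', 'fill_bucket', 'idle', 'idle']
method_b = ['pick_up_mop', 'fill_bucket', 'mop_floor', 'mop_floor']

for name, preds in [('method_a', method_a), ('method_b', method_b)]:
    print(f"{name}: macro-F1 = {f1_score(truth, preds, average='macro'):.2f}")
```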
Omar’s PhD project is being carried out in collaboration with Ocado Innovation and under the primary supervision of Dr Helen Harman.