Dr Amir Ghalamzan
Academic and research departments
Computer Science Research Centre, School of Computer Science and Electronic Engineering

About
Biography
Amir is an Associate Professor in Computer Science at the University of Surrey, where he leads research in Intelligent Manipulation. His primary areas of focus are robot learning, robotic grasping and manipulation, teleoperation, agri-food robotics, haptics, and tactile sensing.
Prior to his role at the University of Surrey, Amir served as an Associate Professor at the Lincoln Institute for Agri-Food Technology from 2021 to 2023, and as a Senior Lecturer at the School of Computer Science from 2018 to 2021, both at the University of Lincoln. Preceding this, he held the position of Senior Research Fellow at the University of Birmingham from 2015 to 2018.
Amir earned his Ph.D. in Robot Learning from Demonstration at Politecnico di Milano in 2015. He also holds a Second Level Specialization in Automatic Control Engineering from Politecnico di Torino (2011) and an M.Sc. in Mechanical Engineering-Systems, Vibration, and Control from Iran University of Science and Technology (2009).
His research interests lie in fundamental aspects of robot learning and data-driven control and planning, with a strong commitment to addressing real-world challenges. Amir has particularly focused on mitigating labour shortages in the agri-food sector by developing robots capable of functioning under extreme conditions, such as strawberry picking in high temperatures and humidity.
Research

Research projects
Past Research Projects
2020 - 2021 Cambridge Enterprise: CERES Agtech funded Robofruit; £310K (PI) As the sole investigator, I spearheaded the development of a ground-breaking, fully autonomous strawberry-picking robot (SPR) that meets all food safety and standards requirements. The SPR incorporates state-of-the-art perception, AI, motion planning, and control components specifically designed and integrated for optimal field performance. This work resulted in a patent-protected invention with the potential to revolutionise the agricultural industry. To validate the SPR’s capabilities, we conducted extensive testing in UoL’s strawberry polytunnel and Dyson Glasshouses, ensuring its readiness for real-world applications.
2021 - 2022 funded by IUK Fastpick; £150K (PI) As head of the UoL team, I played a vital role in developing a revolutionary perception and computing system for strawberry picking. The system was the first of its kind, utilising a private 5G network and edge server for computationally demanding processes. Collaborating with Saga Robotics and Robotfruit, we conducted real-time perception testing to validate its capabilities.
2021 - 2022 funded by IUK Bi-SENSS: £300K (PI) As the lead at UoL, I spearheaded the development of a dynamic and kinematic robot model, a grasping method using an RGB-D sensor, and motion planning and control for transforming DEXTER from a teleoperated system into an autonomous system. Our team worked closely with Veolia, the project coordinator, as well as Faculty.ai and other partners to ensure seamless integration of the high-level decision-making AI into the autonomous robotic system.
2020 - 2021 funded by Cancer Research UK ARTEMIS: £100K (PI) Under my leadership at UoL, we developed a pioneering learning-from-demonstration technique that allows a robot to learn the complex task of palpating a breast phantom directly from a human demonstration, without relying on any hand-designed features or computer vision techniques. Our approach directly maps visual information to robot action plans, resulting in highly efficient and accurate performance (a toy sketch of this image-to-plan mapping appears after this list). Collaborating with colleagues at Imperial College London and the University of Bristol, we demonstrated the potential for using robotic systems for early breast cancer detection and screening.
2021 - 2021 funded by EPSRC via NCNR CELLO: £100K (PI) Haptic-guided mobile manipulation (CELLO): my team developed a novel haptic-guided method for mobile manipulation. This makes teleoperation of autonomous cars as well as mobile manipulators much easier, as control of the system is shared between the AI and the human operator (see the blending sketch after this list).
2023 - 2027 (Co-I) funded by IUK Agri Open-Core: £1,500K In this project, we aim to develop open-source software for robotic selective harvesting. I will contribute to this project by integrating motion planning into a few harvesting robots.
2017 - 2022 (Co-I) funded by EPSRC NCNR: £2,000K National Centre for Nuclear Robotics (NCNR): As one of the project supervisors, I oversaw a team of Ph.D. and postdoctoral researchers and developed novel haptic-guided teleoperation systems and data-driven control methods for physical robot interaction. Our goal was to enable robust grasping and manipulation of objects in high-consequence environments.
2019 - 2022 (Co-I) funded by IUK Grasp-berry: £300K As the supervisor, I led the team in the development of an innovative interactive motion planning method designed for handling complex tasks, including strawberry cluster manipulation during picking. We verified the efficacy of our approach using simulation techniques.
2020 - 2020 (Co-I) funded by IUK SBRI Competition Sort and Seg: £30K
2022 - 2022 (Co-I) funded by UKSA Space Debris removal: £30K
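The ARTEMIS entry above refers to mapping camera images directly to robot action plans. The following toy sketch, which is not the project's implementation, shows the general idea under stated assumptions: a generic regressor is trained on synthetic (image, demonstrated trajectory) pairs and predicts waypoints from raw pixels, with no hand-designed features.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor

# Synthetic stand-in for demonstration data: camera images paired with the
# demonstrated end-effector trajectory (here, 20 waypoints x 3 coordinates).
rng = np.random.default_rng(3)
images = rng.random((300, 32 * 32))          # 300 demos, flattened 32x32 grayscale
trajectories = rng.random((300, 20 * 3))     # flattened waypoint sequences

# Map raw pixels directly to an action plan -- no hand-crafted features.
model = MLPRegressor(hidden_layer_sizes=(256, 128), max_iter=500, random_state=0)
model.fit(images, trajectories)

plan = model.predict(images[:1]).reshape(20, 3)  # waypoints for a new scene
print(plan[:3])   # first three predicted waypoints (x, y, z)
```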
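The CELLO entry mentions sharing control between the AI and a human operator. Below is a minimal sketch of one common arbitration scheme, linear command blending; the blending weight, command format, and function name are illustrative assumptions rather than the CELLO method.

```python
import numpy as np

def blend_commands(human_cmd, ai_cmd, alpha):
    """Shared control by linear blending: alpha = 0 gives the operator full
    authority, alpha = 1 gives the autonomy full authority (an assumption,
    not necessarily how CELLO arbitrates)."""
    alpha = float(np.clip(alpha, 0.0, 1.0))
    return alpha * np.asarray(ai_cmd) + (1.0 - alpha) * np.asarray(human_cmd)

# Example: 6-DoF velocity commands (vx, vy, vz, wx, wy, wz).
human = np.array([0.20, 0.00, 0.00, 0.0, 0.0, 0.1])   # joystick input
ai = np.array([0.15, 0.05, 0.00, 0.0, 0.0, 0.0])      # autonomy suggestion
print(blend_commands(human, ai, alpha=0.6))
```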
Publications
This paper presents a novel acoustic soft tactile (AST) skin technology operating with sound waves. In this approach, sound waves generated by a speaker travel through channels embedded in a soft membrane, are modulated by deformation of the channels when an external force is applied, and are received by a microphone at the end of each channel. The sensor leverages regression and classification methods to estimate the normal force and its contact location, and can be affixed to any robot part, e.g., end effectors or the arm. We tested several regression and classification methods to learn the relation between the sound-wave modulation and the applied force and its location, and selected the best-performing models for force and location prediction. The best skin configurations yield more than 93% of force estimates within a ±1.5 N tolerance over a 0-30 N range, and contact locations with over 96% accuracy. We also demonstrated the performance of the AST skin technology in a real-time gripping force control application.
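As a rough illustration of the learning setup this abstract describes (not the authors' implementation), the sketch below trains a regressor for normal force and a classifier for contact location from per-frequency amplitude features; the feature layout, synthetic data, and model choices are all assumptions.

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor, RandomForestClassifier
from sklearn.model_selection import train_test_split

# Hypothetical dataset: each sample is a vector of per-frequency amplitudes
# measured at the microphone (the modulation caused by pressing the membrane).
rng = np.random.default_rng(0)
X = rng.random((500, 32))            # 500 presses x 32 amplitude features (synthetic)
force = 30.0 * rng.random(500)       # normal-force labels in newtons (0-30 N)
location = rng.integers(0, 8, 500)   # discrete contact-location classes

X_tr, X_te, f_tr, f_te, l_tr, l_te = train_test_split(
    X, force, location, random_state=0)

# Regression for force and classification for location, as in the abstract.
force_model = RandomForestRegressor(random_state=0).fit(X_tr, f_tr)
loc_model = RandomForestClassifier(random_state=0).fit(X_tr, l_tr)

within_tol = np.mean(np.abs(force_model.predict(X_te) - f_te) <= 1.5)
print(f"force estimates within +/-1.5 N: {within_tol:.1%}")
print(f"location accuracy: {loc_model.score(X_te, l_te):.1%}")
```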
This paper introduces a novel Soft Acoustic Curvature (SAC) sensor. SAC incorporates integrated audio components and features an acoustic channel within a flexible structure. A reference acoustic wave, generated by a speaker at one end of the channel, propagates and is received by a microphone at the other end. Our previous study revealed that acoustic wave energy dissipation varies with acoustic channel deformation, leading us to design a novel channel capable of large deformation due to bending. We then use Machine Learning (ML) models to establish a complex mapping between channel deformations and sound modulation. Various sound frequencies and ML models were evaluated to enhance curvature detection accuracy. The sensor, constructed using soft material and 3D printing, was validated experimentally, with curvature measurement errors remaining within 3.5 m⁻¹ over a range of 0 to 60 m⁻¹. These results demonstrate the effectiveness of the proposed method for estimating curvature. With its flexible structure, the SAC sensor holds potential for applications in soft robotics, including shape measurement for continuum manipulators, soft grippers, and wearable devices.
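In the same spirit, here is a minimal sketch of the curvature pipeline the abstract outlines: frequency-band energies of the received signal serve as features for a regressor predicting curvature. The band edges, sampling rate, and synthetic calibration data are assumptions, not the SAC design.

```python
import numpy as np
from sklearn.linear_model import Ridge

def band_energies(signal, fs, bands):
    """Energy of the microphone signal in each frequency band (assumed features)."""
    spectrum = np.abs(np.fft.rfft(signal)) ** 2
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fs)
    return np.array([spectrum[(freqs >= lo) & (freqs < hi)].sum()
                     for lo, hi in bands])

fs = 48_000                                    # assumed audio sampling rate
bands = [(200, 400), (400, 800), (800, 1600), (1600, 3200)]

# Synthetic stand-in for (recording, curvature) calibration pairs.
rng = np.random.default_rng(1)
recordings = rng.standard_normal((200, 4800))  # 200 windows of 0.1 s audio
curvatures = 60.0 * rng.random(200)            # labels in 1/m, over the 0-60 range

X = np.stack([band_energies(r, fs, bands) for r in recordings])
model = Ridge().fit(X, curvatures)
print(f"mean |error|: {np.abs(model.predict(X) - curvatures).mean():.2f} 1/m")
```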
Selective harvesting by autonomous robots will be a critical enabling technology for future farming. Rising inflation and shortages of skilled labour are driving factors that can help encourage user acceptance of robotic harvesting. For example, robotic strawberry harvesting requires real-time high-precision fruit localisation, three-dimensional (3D) mapping, and path planning for 3D cluster manipulation. Whilst industry and academia have developed multiple strawberry harvesting robots, none have yet achieved human-cost parity. Achieving this goal requires increased picking speed (perception, control, and movement), increased accuracy, and low-cost robotic system designs. We propose the Edge-Server over 5G for Selective Harvesting (E5SH) system, an integration of a high-bandwidth, low-latency fifth-generation (5G) mobile network into a crop-harvesting robotic platform, which we view as an enabler for future robotic harvesting systems. We also consider processing scale and speed in conjunction with system environmental and energy costs. A system architecture is presented and evaluated with quantitative results from a series of experiments comparing the performance of the system under different architecture choices, including image segmentation models, network infrastructure (5G vs Wireless Fidelity), and messaging protocols, such as Message Queuing Telemetry Transport (MQTT) and Transmission Control Protocol Robot Operating System (TCPROS). Our results demonstrate that the E5SH system delivers a step-change in peak processing performance, with a speedup of more than 18-fold over a standalone Nvidia Jetson Xavier NX embedded computing system.
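A minimal sketch of the edge-offload pattern described above: a robot-side client publishes camera frames to an edge server over MQTT and receives segmentation masks back. The broker address, topic names, and payload format are illustrative assumptions, not the E5SH implementation (shown with the paho-mqtt 1.x API).

```python
import paho.mqtt.client as mqtt

BROKER = "edge-server.local"            # hypothetical edge server on the 5G network
FRAME_TOPIC = "robot/camera/frames"     # assumed topic names
MASK_TOPIC = "robot/camera/masks"

def on_connect(client, userdata, flags, rc):
    # Subscribe for segmentation results computed on the edge server.
    client.subscribe(MASK_TOPIC)

def on_message(client, userdata, msg):
    # msg.payload would hold an encoded segmentation mask; hand it to the
    # picking pipeline (localisation, planning) here.
    print(f"mask received: {len(msg.payload)} bytes on {msg.topic}")

client = mqtt.Client()                  # paho-mqtt 1.x-style client
client.on_connect = on_connect
client.on_message = on_message
client.connect(BROKER, 1883)

# Publish one (hypothetical) JPEG-encoded camera frame for edge inference.
with open("frame.jpg", "rb") as f:
    client.publish(FRAME_TOPIC, f.read())

client.loop_forever()                   # process incoming masks until interrupted
```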
This paper presents an innovative and cost-effective design for Acoustic Soft Tactile (AST) Skin, with the primary goal of significantly enhancing the accuracy of 2D tactile feature estimation. The existing challenge lies in achieving precise tactile feature estimation, especially concerning contact geometry characteristics, using cost-effective solutions. We hypothesise that by harnessing acoustic energy through dedicated acoustic channels in two layers beneath the sensing surface and analysing amplitude modulation, we can effectively decode interactions on the sensory surface, thereby improving tactile feature estimation. Our approach involves the distinct separation of hardware components responsible for emitting and receiving acoustic signals, resulting in a modular and highly customisable skin design. Practical tests demonstrate the effectiveness of this novel design, achieving remarkable precision in estimating contact normal forces (MAE < 0.8 N), 2D contact localisation (MAE < 0.7 mm), and contact surface diameter (MAE < 0.3 mm). In conclusion, the AST skin, with its innovative design and modular architecture, successfully addresses the challenge of tactile feature estimation. The presented results showcase its ability to precisely estimate various tactile features, making it a practical and cost-effective solution for robotic applications.
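The per-feature error figures above (force, 2D location, diameter) suggest a multi-output regression evaluation; the sketch below shows one hedged way to compute per-target MAE with scikit-learn, using synthetic data and an assumed model rather than the paper's.

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.metrics import mean_absolute_error
from sklearn.model_selection import train_test_split
from sklearn.multioutput import MultiOutputRegressor

# Synthetic stand-in: acoustic amplitude features -> four tactile targets.
rng = np.random.default_rng(2)
X = rng.random((800, 64))
y = np.column_stack([
    30.0 * rng.random(800),   # normal force (N)
    40.0 * rng.random(800),   # contact x (mm)
    40.0 * rng.random(800),   # contact y (mm)
    10.0 * rng.random(800),   # contact diameter (mm)
])

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
model = MultiOutputRegressor(RandomForestRegressor(random_state=0)).fit(X_tr, y_tr)

# Per-target MAE, matching how the abstract reports force / location / diameter.
mae = mean_absolute_error(y_te, model.predict(X_te), multioutput="raw_values")
for name, err in zip(["force (N)", "x (mm)", "y (mm)", "diameter (mm)"], mae):
    print(f"MAE {name}: {err:.2f}")
```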
Additional publications
- Modular autonomous strawberry picking robotic system, S Parsa, B Debnath, MA Khan, A Ghalamzan, Journal of Field Robotics, 2023.
- Towards autonomous selective harvesting: A review of robot perception, robot design, motion planning and control, V Rajendran, B Debnath, S Mghames, W Mandil, S Parsa, S Parsons, A Ghalamzan, Journal of Field Robotics, 2023.
- Deep Functional Predictive Control for Strawberry Cluster Manipulation using Tactile Prediction, K Nazari, G Gandolfi, Z Talebpour, V Rajendran, P Rocco, A Ghalamzan, IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), 2023.
- Proactive slip control by learned slip model and trajectory adaptation, K Nazari, W Mandil, A Ghalamzan, Conference on Robot Learning (CoRL), 2022.
- Deep Movement Primitives: toward Breast Cancer Examination Robot, O Sanni, G Bonvicini, MA Khan, PC López-Custodio, K Nazari, A Ghalamzan, AAAI Conference on Artificial Intelligence 36 (11), 12126-12134, 2022.
- Action Conditioned Tactile Prediction: a case study on slip prediction, W Mandil, K Nazari, A Ghalamzan, Robotics: Science and Systems (RSS), 2022.
- Planning maximum-manipulability cutting paths, T Pardi, V Ortenzi, C Fairbairn, T Pipe, A Ghalamzan, R Stolkin, IEEE Robotics and Automation Letters (RA-L) 5 (2), 1999-2006, 2020.
- Haptic-guided shared control for needle grasping optimization in minimally invasive robotic surgery, M Selvaggio, A Ghalamzan, R Moccia, F Ficuciello, B Siciliano, IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), 2019.
- Robot learning from demonstrations: Emulation learning in environments with moving obstacles, A Ghalamzan, M Ragaglia, Robotics and Autonomous Systems (RAS), 2018.
- Guiding Trajectory Optimization by Demonstrated Distributions, T Osa, A Ghalamzan, R Stolkin, R Lioutikov, J Peters, G Neumann, IEEE Robotics and Automation Letters (RA-L), 2017.
- An incremental approach to learning generalizable robot tasks from human demonstration, A Ghalamzan, C Paxton, GD Hager, L Bascetta, IEEE International Conference on Robotics and Automation (ICRA), 5616-5621, 2015.