
Hi, I'm Mark

Researcher and electronics, robotics, and software engineer

Feel free to get in touch: markhedleyjones@gmail.com

Current Work

Robotics Software Engineer

At SEQSENSE, I develop software for autonomous robots including SQ2, a security robot that performs scheduled patrols in office buildings, and Forro, a delivery robot developed with Kawasaki Heavy Industries.

SQ2 at Narita Airport
SQ2 performing a patrol at Narita Airport (Tokyo)

Both robots use a rotating turntable with three 2D LiDARs for navigation and wide-angle cameras for 360-degree vision. They transmit real-time data to a web platform for monitoring and can autonomously dock for charging and operate elevators for multi-floor operations.
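To give a feel for how a rotating turntable of 2D LiDARs yields 3D coverage, here is a minimal sketch that projects one 2D scan into the robot frame given the current turntable angle. The mounting tilt, angles, and frame conventions are illustrative assumptions, not the actual SQ2/Forro sensor model.

```python
import math

def scan_to_3d(ranges, angle_min, angle_step, turntable_deg, tilt_deg=30.0):
    """Project one 2D LiDAR scan into 3D using the turntable rotation.

    ranges: range readings (metres) from a tilted 2D LiDAR.
    turntable_deg: current turntable angle; tilt_deg is an assumed
    fixed mounting tilt of the scanner. Returns (x, y, z) points.
    """
    yaw = math.radians(turntable_deg)
    tilt = math.radians(tilt_deg)
    points = []
    for i, r in enumerate(ranges):
        a = angle_min + i * angle_step
        # Point in the scanner's own 2D plane
        sx, sy = r * math.cos(a), r * math.sin(a)
        # Apply the mounting tilt (rotation about the scanner's y-axis)
        px, pz = sx * math.cos(tilt), sx * math.sin(tilt)
        py = sy
        # Rotate by the turntable yaw into the robot frame
        x = px * math.cos(yaw) - py * math.sin(yaw)
        y = px * math.sin(yaw) + py * math.cos(yaw)
        points.append((x, y, pz))
    return points
```

Accumulating these points over a full turntable revolution produces a sparse 3D point cloud from purely 2D scanners.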

My work centers on 3D object detection and tracking, SLAM-based mapping, and training new detection models for people and other robots using PyTorch and OpenVINO. The video shows a prototype interface I developed for real-time SLAM mapping using ROS and C++: users could watch the map as it was built, with scan quality indicated in green. The interface was designed to run on a tablet or smartphone mounted to a PlayStation controller.
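As a rough illustration of the map display, the sketch below colorizes an occupancy grid for on-screen rendering, tinting well-scanned free space green. The cell values follow the ROS OccupancyGrid convention (-1 unknown, 0 free, 100 occupied); the per-cell quality score is an assumed metric, not the interface's actual one.

```python
def colorize_map(grid, quality):
    """Render an occupancy grid as RGB pixels for a web/tablet display.

    grid: 2D list of cell values (-1 unknown, 0 free, 100 occupied),
    mirroring the ROS OccupancyGrid convention. quality: per-cell 0..1
    scan-quality score (an assumed metric); well-scanned free space
    is tinted green.
    """
    image = []
    for row, qrow in zip(grid, quality):
        pixels = []
        for cell, q in zip(row, qrow):
            if cell == -1:
                pixels.append((128, 128, 128))   # unknown: grey
            elif cell >= 50:
                pixels.append((0, 0, 0))         # occupied: black
            else:
                rb = int(255 * (1.0 - q))        # free: white -> green
                pixels.append((rb, 255, rb))
        image.append(pixels)
    return image
```

The resulting pixel rows can be packed into an image and streamed to the browser as the map grows.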

Web-based interface for SLAM mapping

Agricultural Robotics Research

Kiwifruit Automation Project

As a post-doctoral researcher with the University of Waikato and University of Auckland, I worked on robotic systems for kiwifruit operations. The project involved collaboration between universities, Plant and Food Research, and Robotics Plus Ltd, funded by the Ministry of Business, Innovation and Employment. The 12-person team developed autonomous systems for pollination, harvesting, and platform mobility, though the technology ultimately proved economically challenging for commercial deployment.

Multi-purpose platform in kiwifruit orchard
The multi-purpose platform in a kiwifruit orchard

Autonomous Platform Development

The multi-purpose platform served as a mobile base for the harvesting and pollinating systems. This hybrid petrol-electric vehicle achieved autonomous row-following using LiDAR-based sensing of the orchard structure. Six electric motor/gearbox units provided full electric drive while supplying AC and DC power to the mounted systems. The platform supported a 1-tonne payload, with additional fruit storage between the rear wheels.
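The idea behind LiDAR row-following can be sketched as a proportional controller that keeps the vehicle centred between the canopy rows on either side. The distances, gain, and sign convention below are illustrative assumptions, not the platform's actual controller.

```python
def row_following_steer(left_dist, right_dist, gain=0.5):
    """Proportional steering to stay centred between two orchard rows.

    left_dist / right_dist: lateral distance (metres) from the LiDAR to
    the canopy structure on each side. Returns a steering command in
    radians (positive = steer left). Gain is illustrative.
    """
    # Positive offset means the vehicle sits left of the row centreline.
    offset = (right_dist - left_dist) / 2.0
    # Steer back toward the centreline.
    return -gain * offset
```

In practice the distances would come from line fits to the LiDAR returns on each side, smoothed over several scans.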

Platform autonomous navigation demonstration

Harvesting System

The harvesting system identified kiwifruit in 3D space using stereo cameras and neural networks. It determined optimal picking sequences to minimize damage to neighboring fruit before directing robotic arms to collect fruit. The integrated approach combined computer vision, machine learning, and precision robotics for orchard operations.
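One simple way to sequence picks so as to spare neighbouring fruit is a greedy ordering that always takes the fruit with the fewest remaining close neighbours next. This is an illustrative heuristic only, not the project's actual planner; positions and the damage radius are assumptions.

```python
import math

def picking_order(fruit, radius=0.05):
    """Greedy picking sequence: fewest at-risk neighbours first.

    fruit: list of (x, y, z) positions in metres. radius: distance
    within which a neighbouring fruit is considered at risk of damage.
    Returns indices into `fruit` in picking order.
    """
    remaining = set(range(len(fruit)))
    order = []
    while remaining:
        # Re-count neighbours among un-picked fruit at every step,
        # since each pick frees up its cluster. sorted() makes ties
        # resolve deterministically by index.
        nxt = min(sorted(remaining), key=lambda i: sum(
            1 for j in remaining
            if j != i and math.dist(fruit[i], fruit[j]) < radius))
        order.append(nxt)
        remaining.remove(nxt)
    return order
```

Isolated fruit get picked first; tightly clustered fruit are deferred until their neighbours are out of harm's way.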

Kiwifruit harvesting system in operation

Pollination System

The pollination system used the same camera and neural network infrastructure to identify flowers in 3D space. It timed the pollen-solution spray to compensate for vehicle motion, so flowers were targeted accurately while driving. The system enabled autonomous pollination across orchard rows.
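The motion compensation reduces to a lead calculation: the spray takes a finite time to reach the flower, and the vehicle advances during that flight. A minimal sketch, with all speeds and distances as illustrative assumptions rather than the system's real parameters:

```python
def spray_lead_offset(nozzle_to_flower_m, spray_speed_mps, vehicle_speed_mps):
    """How far the shot must lead the flower while the vehicle moves.

    The pollen solution takes t = distance / spray_speed to reach the
    flower; in that time the vehicle advances vehicle_speed * t, so the
    trigger must be offset along the direction of travel by that amount.
    """
    flight_time = nozzle_to_flower_m / spray_speed_mps
    return vehicle_speed_mps * flight_time
```

For example, a 0.5 m nozzle-to-flower distance at a 5 m/s spray speed and 1 m/s vehicle speed gives a 0.1 m lead.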

Pollination
Targeted pollination of kiwifruit flower clusters

Publications

Curriculum Vitae

CV