Jack of all trades, master of many
Associate Professor Stephen Redmond is taking the biomedical engineering world by storm
Having arrived in Australia in 2008 as a backpacker, with a PhD in biomedical signal processing in his knapsack and a penchant for solving puzzles, Dr Stephen Redmond has managed to carve out an impressive career at UNSW. Approaching the University on the off-chance of some work, he struck gold when Scientia Professor Nigel Lovell (of bionic eye fame) offered him a one-year research position. Fast forward eight years and he is now an Australian Research Council (ARC) Future Fellow and Associate Professor at the Graduate School of Biomedical Engineering.
A self-described ‘jack of all trades’, Dr Redmond has easily proven himself the master of many (although he would insist that his PhD students actually do most of the work). We caught up with him to find out about the research he is currently involved in and where it is headed in the future.
Advances in telehealth promise to change the concept, quality and cost of healthcare forever. What is telehealth and what are you working on in this field?
Telehealth involves the monitoring and management of health and wellbeing remotely. This means you can do things like biomonitoring in the home to assist people living with chronic disease. We are interested in developing telehealth technologies and the software algorithms that interpret the physiological data they collect.
We have a lot going on in this area, including a project with Sotera Wireless, a US start-up based in San Diego that has designed a really cool wearable sensor that straps onto your arm, with additional sensors worn elsewhere on the body. It aims to supplement nursing observations on a hospital ward (measuring blood pressure, heart rate and so on), but every time the wearer moves there is ‘noise’ in the signal. We’re helping them better understand how body movement affects the accuracy of the measurements and develop new algorithms for this challenging environment.
We’re also working on a “tap and go” smartphone health data collection project to assist with cardiac rehabilitation, in collaboration with the Austrian Institute of Technology in Graz and the Prince of Wales Hospital in Sydney. We use the smartphone’s movement sensors to measure exercise (walking), and collect self-measured blood pressure and weight by tapping the phone against the monitors (this uses the same contactless technology as Visa payWave at the supermarket checkout).
Can you explain some of your research on fall detection and prediction?
At the moment we are working to develop a very low-power fall detection device that can last for several years on a single small battery. It uses a barometer to measure air pressure, which is higher at floor level than at waist height, to detect when the wearer has fallen.
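As a rough illustration of the physics behind this (a sketch of the principle only, not the device’s actual detection algorithm), the hydrostatic relation ΔP = ρgh gives the pressure rise a barometer would register when dropping from waist height to the floor:

```python
# Illustrative sketch only — standard sea-level air density is assumed;
# this is not the actual fall-detection algorithm described in the article.
RHO = 1.225   # air density at sea level, kg/m^3
G = 9.81      # gravitational acceleration, m/s^2

def pressure_rise(drop_height_m: float) -> float:
    """Approximate increase in air pressure (Pa) when a sensor
    moves down by drop_height_m, using delta_P = rho * g * h."""
    return RHO * G * drop_height_m

# A fall from roughly waist height (~1 m) to the floor:
print(f"{pressure_rise(1.0):.1f} Pa")  # roughly 12 Pa
```

A change of about 12 Pa is within the resolution of modern MEMS barometers, which is what makes air pressure a plausible, very low-power cue for detecting a fall.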
For our fall prediction research, in order to better assess walking and balance in the community, we are developing a system that uses only three sensors to fully characterise the movement of the entire lower body. One application might be spotting early warning signs of deterioration in how older people move, which could trigger a preventative rehabilitation program. This could even have applications in injury prevention for elite athletes.
Your ARC Future Fellowship is looking to develop a friction sensor for a robotic finger. Why is this research significant?
This research followed an ARC Discovery project I worked on with neuroscientist Dr Ingvars Birznieks, trying to understand how humans sense friction, or how slippery something is. It inspired me to think about how we could design tactile sensors for prosthetic hands, or autonomous robotic manipulators, that could mimic this aspect of our sense of touch.
When I started to look at the literature, I found very little had been done. There’s lots of information about robotic hands and prosthetic hands, and about putting sensors into the fingers to know when they’ve made contact, but there’s no way to sense whether the object is slippery, or whether we’re about to lose our grasp.
How far have you got with the project?
We’ve built a number of prototypes, including an early-stage prototype of a friction sensor for a gripper. But there are lots of problems still to solve, such as what to do with the feedback information from the sensor. One avenue would be to deliver it to an amputee using simple vibrating motors to warn them if they are coming close to losing their grasp on the object, so they can either put the object down, stop moving it so vigorously, or increase the grip force. Another avenue would be to deliver it to a robotic arm and let the robot decide how to best use that information, although trying to tell a robot what to do with it requires some smarts. We don’t really know how to do that just yet!
In terms of robots, what do you think would be some real-life applications for this research?
Perhaps this is a bit far-fetched but as my imagination sees it, improving the dexterity and tactile sensation of robots means they could potentially help with things like surgery or chores around the home. Another application might be using robots in what we call unstructured environments, such as search and rescue, where being able to touch, feel and manipulate in a dexterous way is very important.
What is it about your research that really drives you?
I like to solve interesting problems. My background is quite general – electrical engineering and understanding systems and signals – so I’m able to apply that to many different fields. I like the ‘greater good’ aspect of solving health-related problems in biomedical engineering, but for me the biggest driver is puzzles.