AI learns how humans walk, breakthrough paves way for better support for patients
Georgia Tech researchers have created an AI-based method that can pre-train exoskeleton controllers using existing human movement data.
The team says this removes the need for long laboratory training sessions and could make mobility-assist devices faster and easier to develop.
Until now, each exoskeleton design required extensive lab work: engineers had to collect motion data from users to train the robotic suit’s controller.
The new technique changes that process.
According to the study, the researchers redesigned a deep learning model – one previously used for image transformation – to convert normal walking patterns into predicted movements as if the person were wearing an exoskeleton.
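The article does not detail the network itself, but one way to picture an image-transformation model repurposed for walking data is a small 1D encoder-decoder that maps a window of recorded joint angles to the angles expected while the person wears the device. The sketch below is purely illustrative: the class name GaitTranslator, the channel counts, and the layer sizes are assumptions, not the published architecture.

```python
# Illustrative sketch only: a 1D encoder-decoder (an "image transformation"-style
# network adapted to time series) that maps normal-walking hip/knee angles to the
# angles predicted for the same person wearing an exoskeleton.
# All layer sizes and channel counts are assumptions, not the study's model.
import torch
import torch.nn as nn

class GaitTranslator(nn.Module):
    def __init__(self, n_joints: int = 4, hidden: int = 64):
        super().__init__()
        # Encoder: compress a window of joint-angle trajectories.
        self.encoder = nn.Sequential(
            nn.Conv1d(n_joints, hidden, kernel_size=5, stride=2, padding=2),
            nn.ReLU(),
            nn.Conv1d(hidden, hidden * 2, kernel_size=5, stride=2, padding=2),
            nn.ReLU(),
        )
        # Decoder: reconstruct the exoskeleton-condition trajectories.
        self.decoder = nn.Sequential(
            nn.ConvTranspose1d(hidden * 2, hidden, kernel_size=4, stride=2, padding=1),
            nn.ReLU(),
            nn.ConvTranspose1d(hidden, n_joints, kernel_size=4, stride=2, padding=1),
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, n_joints, time) normal-walking angles -> predicted exo-walking angles
        return self.decoder(self.encoder(x))

# Example: translate a 2-second window sampled at 100 Hz for both hips and knees (4 channels).
model = GaitTranslator(n_joints=4)
normal_gait = torch.randn(1, 4, 200)       # placeholder for recorded joint angles
predicted_exo_gait = model(normal_gait)    # same shape: (1, 4, 200)
```

In this framing, "pre-training" means fitting such a network to existing human movement datasets rather than to new sessions recorded with the exoskeleton itself.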
The AI system forecasts hip and knee sensor readings and also learns how much robotic assistance is needed, yielding a pre-trained controller that performs similarly to controllers tuned through traditional lab training. In tests on a leg exoskeleton, the AI model followed the user’s joint motions and increased their effort by up to about 20%.
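To make the idea of a pre-trained controller concrete, here is a hedged sketch of how predicted joint trajectories and an assistance level might be turned into torque commands at run time. The assistance_torques function, the 100 Hz sampling rate, the 20% assist fraction, and the velocity-scaled torque rule are all placeholders invented for this example; they are not the study’s learned controller.

```python
# Placeholder controller sketch: map predicted hip/knee trajectories and an
# assistance level to per-sample torque commands. Every constant here is an
# illustrative assumption, not a value from the study.
import numpy as np

def assistance_torques(predicted_angles: np.ndarray,
                       assist_fraction: float = 0.2,
                       peak_torque_nm: float = 30.0) -> np.ndarray:
    """Turn predicted joint trajectories into assistance torque commands.

    predicted_angles: (n_joints, time) joint angles in radians, e.g. the output
    of a pre-trained model such as the hypothetical GaitTranslator sketched above.
    """
    # Estimate joint velocity by finite differences (assumed 100 Hz sampling).
    dt = 0.01
    velocity = np.gradient(predicted_angles, dt, axis=1)
    # Scale assistance with normalized velocity so torque peaks during fast motion;
    # this is a stand-in policy chosen only to keep the example self-contained.
    norm = np.max(np.abs(velocity), axis=1, keepdims=True) + 1e-8
    return assist_fraction * peak_torque_nm * velocity / norm

# Example: feed a synthetic predicted trajectory into the torque rule.
predicted = np.sin(np.linspace(0, 4 * np.pi, 200))[None, :].repeat(4, axis=0)
torques = assistance_torques(predicted)   # (4, 200) torque commands in N·m
```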
The researchers say this approach can help designers create and test prototype devices much faster. A startup could build several versions of a product without gathering new user data, Prof. Young explained.
The same type of controller could support prosthetic limbs or upper-body exoskeletons. These devices might assist amputees or people recovering from a stroke, and could also help factory workers lift heavy objects with less fatigue.
Overall, the study suggests that AI-supported training could bring wearable robotic systems – once seen as futuristic – closer to everyday use.