In the world of rehabilitation, assistive machines can play a key role once rehab has plateaued, the stage at which patients hit a limit in regaining abilities. Examples include robotic arms and exoskeletons, but the most common is the power wheelchair. In each case, the goal is to enhance mobility or manipulation. Unfortunately, many of these machines can be difficult to control, especially for patients with severe motor impairments.
This is where a mix of robotics, autonomy and artificial intelligence could play a meaningful role. “If we introduce autonomy so that the machine can partly control itself, and offload some of the burden, these machines can be more accessible to more people,” says Brenna Argall, Associate Professor at Northwestern University and Director of the Shirley Ryan AbilityLab's Assistive & Rehabilitative Robotics Lab. “Even enabling someone to feed themselves lunch can be a big reduction in caregiver burden if someone doesn't have to come home from work in order to feed their loved one lunch. The introduction of autonomy and shared control could be the solution.”
Argall is working to create partially autonomous wheelchairs that can help their users avoid obstacles, plan and navigate routes, and maneuver in tricky spaces. “We want to build a system that can pay attention and help with certain tasks,” Argall says.
She and her lab are currently working to characterize just how people drive their powered wheelchairs. They built a course in her lab at the Shirley Ryan AbilityLab—complete with doors, ramps, and sidewalks—and are developing datasets of how patients with spinal cord injuries, cerebral palsy, and ALS use different wheelchair interfaces.
“If we know how people operate these machines, we can design autonomy that knows how that operation happens and expects certain signals to create a more fluid symbiosis between human and machine,” Argall says.
Argall’s lab is focused on figuring out the right way to share control between the human and the machine’s autonomy. This includes questions such as:
- What is the right level or type of assistance for a given person?
- To what extent should autonomy be personalized?
- Should customization happen according to the interface, the type of impairment, or the actual mobility constraints?
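One common way to frame shared control in the research literature is as a blend of the human's command and the autonomy's command, weighted by an assistance level. The sketch below illustrates that idea only; the linear-blending policy, the function name, and the two-element (forward speed, turn rate) command are illustrative assumptions, not necessarily the lab's actual approach.

```python
import numpy as np

def blend_commands(u_human, u_auto, alpha):
    """Linearly blend a human command with an autonomy command.

    alpha = 0.0 gives the human full control; alpha = 1.0 gives
    the autonomy full control; values in between share control.
    """
    u_human = np.asarray(u_human, dtype=float)
    u_auto = np.asarray(u_auto, dtype=float)
    alpha = float(np.clip(alpha, 0.0, 1.0))
    return (1.0 - alpha) * u_human + alpha * u_auto

# Example: the joystick steers toward an obstacle; the planner
# issues a corrective turn, and the blend splits the difference.
u_human = [1.0, 0.5]   # (forward speed, turn rate) from the joystick
u_auto = [1.0, -0.5]   # obstacle-avoiding command from the autonomy
print(blend_commands(u_human, u_auto, 0.5))  # -> [1. 0.]
```

Tuning `alpha` per user is one concrete way the personalization questions above could be operationalized: the same blending machinery supports anything from light assistance to near-full autonomy.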
“When addressing these fairly open-ended questions, we are looking at exploratory ways to try to figure out the right way to provide assistance and to what extent the assistance should occur automatically,” says Argall.
Current Findings
Argall tells IndustryWeek one of the lab’s most interesting findings has been the huge variability in patient preferences. “Interestingly, results are telling us that the right thing to do is to provide options, which ultimately leads to the question of how to provide attractive options in a scalable way,” she says.
Through pilot tests, the lab has also found that people have some sort of intuition about how they are interacting with machines. “If we can somehow get a little bit of input from people, and pair the input with machine learning paradigms that are able to adapt things online, the opportunity exists to exceed any sort of performance that we would have been able to design on our own.”
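Adapting online from a little bit of user input could be as simple as nudging the assistance level up or down in response to a feedback signal. The following is a minimal sketch of that idea under assumptions of my own: a scalar feedback value (positive when the user accepts the assistance, negative when they override it) and a fixed learning rate, neither of which comes from the article.

```python
def update_assistance_level(alpha, feedback, rate=0.1):
    """Nudge the assistance level toward more autonomy on positive
    feedback and less on negative, clamped to the [0, 1] range."""
    return min(1.0, max(0.0, alpha + rate * feedback))

# Start at an even split; the user accepts twice, then overrides once.
alpha = 0.5
for feedback in [1, 1, -1]:
    alpha = update_assistance_level(alpha, feedback)
print(round(alpha, 2))  # -> 0.6
```

Real adaptive shared-control systems use richer learning paradigms than this, but the principle is the same: the machine adjusts its level of intervention from signals it receives from the human.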
Looking forward, the goal is to adjust to people as they change over time, whether from degeneration caused by disease or from ability regained through rehabilitation. “We want to be responsive in a way that machines can essentially learn from the signals that they're getting from the human,” she says.
Road Ahead
According to Argall, the logical path for bringing wheelchair autonomy to the mainstream market is to partner with manufacturers of assistive machines and position it as an additional offering. Growth in the autonomous driving industry could serve as a watershed moment for this sort of advanced technology, since people's comfort with and familiarity with autonomy are already increasing as current vehicles offer lane assist and emergency braking.