Enabling the great beyond.

Leia Stirling wants to make wearables invaluable to clinicians.

“A patient performs a set of activities at home, comes back and improves, doesn’t improve, or gets worse. The clinician has to figure out, ‘Okay, why?’”

Through collaboration with occupational therapists from Mass General Hospital, Hebrew SeniorLife, and Spaulding Rehabilitation Hospital, Stirling works to augment the clinical decision-making process.

“We want to synergize with clinicians. To give them the same measures indirectly that they can get directly — visually or tactilely,” says Stirling, Associate Faculty of the Institute for Medical Engineering & Science (IMES), Charles Stark Draper Professor of Aeronautics and Astronautics (AeroAstro), and Co-Director of the Man Vehicle Lab in AeroAstro. “So how do you create metrics for motion fluidity? Or compensatory mechanisms after stroke? We need more robust algorithms.”

Existing wearables depend on inertial measurement units (IMUs) — the embedded trackers in phones and Fitbits — which capture plenty of data. It’s just hard to interpret.

“Step-count is something you can get more easily, because every time you take a step, you get an acceleration spike,” she says. “Using a variety of wearable sensors, we want to record lots and lots of data in a patient’s home environment: accelerations, strains, pressures, et cetera. Then reduce that data to a measure interpretable by the clinician.”
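The step-count idea she describes — every step produces an acceleration spike — can be sketched as a simple threshold-crossing counter. This is an illustrative toy, not an algorithm from Stirling’s lab; the sampling assumptions, threshold, and refractory gap are all invented for the example.

```python
def count_steps(accel_mag, threshold=11.5, min_gap=10):
    """Count acceleration spikes as steps.

    accel_mag: uniformly sampled acceleration magnitudes (m/s^2).
    threshold: magnitude a sample must exceed to register a step
               (above the ~9.8 m/s^2 gravity baseline).
    min_gap:   minimum samples between counted steps, so one broad
               spike is not counted twice.
    All parameter values are illustrative assumptions.
    """
    steps = 0
    last_step = -min_gap  # permit a step at index 0
    for i, a in enumerate(accel_mag):
        if a > threshold and i - last_step >= min_gap:
            steps += 1
            last_step = i
    return steps

# Synthetic walk: gravity baseline with a spike every 50 samples.
signal = [9.8 + (3.0 if i % 50 == 0 else 0.0) for i in range(500)]
print(count_steps(signal))  # 10 spikes -> counts 10 steps
```

Real step counters add band-pass filtering and adaptive thresholds, but the core contrast holds: a step is a visible spike, while measures like motion fluidity have no such obvious signature in the raw signal.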

Stirling’s team is cracking the future of wearables — from the technology itself to user confidence — to support the next generation of remote care: telerehabilitation.

“I don’t view wearables as replacing a clinician. Wearables are not going to make decisions for the clinician, but can provide indirect data to help make a more informed decision,” Stirling says. “For instance, if you live in a rural community, how often can you travel to the clinic?”

And therein lies the connection to AeroAstro. “Space is telemedicine to the extreme,” Stirling says.

One of her group’s graduate students, Richard Fineman, a PhD candidate in Harvard-MIT Health Sciences and Technology (HST), is working on a space-based application. “Because what happens if an astronaut has a stroke on a mission to Mars? What happens if they have a fracture? How are they going to rehab and recover? A lot of these things from the perspective of telerehab are going to be important. Because you can’t just say, ‘Let’s turn around.’”

Concurrently, Stirling is working on monitoring and understanding muscle motion and activation with colleague Julie Shah, Associate Professor in AeroAstro, who leads the Interactive Robotics Group of the Computer Science and Artificial Intelligence Laboratory (CSAIL).

“We did a study looking at grasp and release of a pen and cup. What we found is there are anticipatory mechanisms in the muscle activity signal prior to grasping the object, but also prior to releasing the object. This is great, because you actually cannot use your kinematics — the arm motions — to detect a release. Being able to get the intent from the muscles is really important for creating an integrated exoskeleton system.”

The small circles are part of the motion capture system that measures the motions. These marker-based systems will be replaced with inertial measurement systems in the wearable application. The band on the forearm holds the surface electromyography sensors. Photo courtesy of Leia Stirling.


Combining Stirling’s biomedical insights with Shah’s artificial intelligence work, another PhD candidate, Hosea Siu, uses machine learning techniques to map and predict motion — an important step in advancing human-robot interaction.

“I started my lab here to look at how we can use wearable technology to augment decision making. So whether the decision maker is a human or a robot, my focus is: how can we use these technologies most effectively?”

As a master’s student in aerodynamics at the University of Illinois, Stirling first saw a presentation where aerospace methods were used to solve a biomedical problem. That’s when it all clicked. After completing her PhD at MIT in AeroAstro, she pursued a postdoc in human motor control, serving as a fellow at Boston Children’s Hospital and Harvard Medical School. Next, she joined the Wyss Institute for Biologically Inspired Engineering at Harvard University as a member of the Advanced Technology Team and Director of the Motion Capture Lab.

After almost three years heading up Stirling Lab at MIT, Stirling feels she’s only just begun.

“We have a lot of really fun threads starting. We’re beginning to publish. We’ve got awards from the NSF, NASA, and the US Army to pursue this work. And now we have work to do to make these systems usable outside the lab by non-experts in sensors.”

And, to do that work, Stirling feels she’s in the right place.

“The communities of IMES and AeroAstro have been really supportive of my vision for the research,” says Stirling. “And the students all bring phenomenal ideas.

“We strive to help clinicians do their job better. To help patients have less pain and enjoy life more. But also to explore and go further. So humans can go further into space, not only robots. And that exploration is really exciting.”