Robots walking in the shoes of humans

Engineers at the AMBER Lab at Caltech in California are using data to help robots walk more like humans. This research is led by Professor Aaron Ames in the Department of Mechanical and Civil Engineering. He joins Hari Sreenivasan via Google Hangout.

TRANSCRIPT

Engineers at the AMBER Lab at Caltech in California are using data to help robots walk more like humans.

This research is led by Professor Aaron Ames in the Department of Mechanical and Civil Engineering, who joins us now via Google Hangout.

So, why is it important for a robot to walk like a human?

Well, if we can understand how the dynamics of walking can be translated to robots, we can understand how to take those same basic concepts and apply them to helping people.

So, we can translate robotic technologies to things like robotic assistive devices -- prosthetics, exoskeletons.

So we can use that knowledge to make people walk better.

You had a heel-to-toe breakthrough.

Explain that.

I mean, we're told that we should walk heel to toe, but it's actually a lot harder to program a robot to do exactly that?

That's right. Yeah.

I mean, what we do when we walk is a deceptively simple thing.

You know, we come in, and we land with our heel, and then we roll on our foot, and we push off on our toe, and for the longest time, most robots walked with these very flat-footed gaits.

So that's basically a limitation of the current way that people think about walking in the robotics community -- not everyone, but a large collection of people. What we were able to do was take a different approach that allowed us to really exploit the dynamics of the robot and, as a result, get these very dynamic heel-toe behaviors.

So what that meant was much more natural walking, much more efficient walking, much more humanlike walking.
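To make that contrast concrete, here is a minimal sketch of a multi-contact gait cycle expressed as a sequence of contact phases; the phase names, contact sets, and transition events are illustrative assumptions, not the AMBER Lab's actual controller.

```python
# Sketch only: a heel-toe (multi-contact) gait modeled as a cycle of contact
# phases, each with its own set of active foot contacts. A flat-footed gait
# would use a single, unchanging contact set instead of this cycle.

from dataclasses import dataclass

@dataclass(frozen=True)
class Phase:
    name: str
    contacts: tuple        # parts of the stance foot currently on the ground
    exit_event: str        # event that ends this phase

HEEL_TOE_CYCLE = [
    Phase("heel_strike", ("heel",),       "toe_touchdown"),
    Phase("flat_foot",   ("heel", "toe"), "heel_liftoff"),
    Phase("toe_push",    ("toe",),        "swing_heel_strike"),
]

def next_phase(current: Phase, event: str) -> Phase:
    """Advance around the gait cycle when the current phase's exit event fires."""
    if event != current.exit_event:
        return current
    i = HEEL_TOE_CYCLE.index(current)
    return HEEL_TOE_CYCLE[(i + 1) % len(HEEL_TOE_CYCLE)]

if __name__ == "__main__":
    phase = HEEL_TOE_CYCLE[0]
    for event in ["toe_touchdown", "heel_liftoff", "swing_heel_strike"]:
        new = next_phase(phase, event)
        print(f"{phase.name} --{event}--> {new.name}")
        phase = new
```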

But does this work give you an appreciation for how efficient we humans have become, over evolutionary timescales, at this one act, which is to walk forward?

Absolutely.

I mean, walking is this, again, deceptively simple thing.

I always say the solution to the problem that I study mocks me every day.

It walks right outside of my window.

You know, I mean, humans are able to walk in such a natural and elegant way that we take it for granted -- we don't even have to think about it.

We can sit there and text on our phone while we're walking around, and yet it's so hard to capture that kind of simplicity and elegance in motion on robotic systems.

To do that requires us to use all of this dynamics, mathematics, algorithms, and mechanical design, and have it all come together in concert for the simple moment of seeing this elegant behavior on the back end.

So it's a wonderfully complex, but rich phenomenon that has this beautifully elegant solution that we're always trying to determine and find.

And a toddler uses all that to figure the same thing out in a matter of months, right?

Yeah, exactly.

But a toddler is also modeling the humans around it.

Is there any kind of learning that you can put into a robot to say, 'Behave more like this'?

And when you watch a toddler walk, when you watch a child learn how to walk, you sometimes see the same phases that we go through with our robots, right?

It starts out as this very clumsy thing, and you have to kind of hold the robot and guide it and sort of teach it how to go, but what you're doing is you're not really teaching it, right?

It's not actually learning from your guidance, but you're learning what the robot does.

You're starting to understand its behavior and how to change its programming.

Can a robot now reach a perfect stride -- what is sort of mathematically defined as a perfect walk?

Not yet.

We're getting closer, though.

I mean, there are lots of things I would argue go into that perfect stride, because there is a notion of a perfect stride, and, again, it's what people do every day without thinking about it.

But to get that perfect stride, we have to understand what that means from a mathematical and quantifiable point of view, and so one metric would be efficiency.

Can we make a robot walk so it looks humanlike and also is as efficient as a human when it walks?

And we have metrics of that -- how efficient humans are when they walk, and we're not there yet on humanoid robots.

We're getting closer, though.

I mean, we've come a long way.

Right now, the walking that we had on our robot -- that multi-contact walking -- was about five times less efficient than a human's.
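One standard way researchers quantify walking efficiency is the cost of transport: energy used per unit weight per unit distance. The sketch below just works through that arithmetic; the energy numbers are illustrative placeholders, not measurements from the AMBER Lab's robots.

```python
# Cost of transport (CoT): energy consumed per unit weight per unit distance.
# Dimensionless; lower means more efficient. Numbers below are placeholders.

G = 9.81  # gravitational acceleration, m/s^2

def cost_of_transport(energy_joules: float, mass_kg: float, distance_m: float) -> float:
    """CoT = E / (m * g * d)."""
    return energy_joules / (mass_kg * G * distance_m)

# Hypothetical example: a 70 kg walker covering 100 m.
human_cot = cost_of_transport(energy_joules=14_000, mass_kg=70, distance_m=100)
robot_cot = cost_of_transport(energy_joules=70_000, mass_kg=70, distance_m=100)

print(f"human-like CoT  ~ {human_cot:.2f}")   # about 0.20
print(f"robot CoT       ~ {robot_cot:.2f}")   # about 1.02
print(f"ratio           ~ {robot_cot / human_cot:.1f}x less efficient")
```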

Let's say you reach this perfect stride, or even somewhere in between -- how does the research that you're doing help you design better prosthetics for people who need limbs?

Well, I mean, we can get that natural and efficient walking on the prosthetic device because we can naturally translate this understanding -- it's a mathematical understanding.

We don't really care what the platform is in some sense, right?

We can understand its dynamics and take those ones and zeros of the algorithms and put them on a prosthetic.

So the more we understand locomotion, the more we understand how to control that prosthetic.

So we build prosthetics from scratch in my lab as well, with the same technologies that we use to build bipedal robots and humanoids, and by understanding how to make them walk better, we continually make our prosthetics function better. And what does it mean to function better?

Again, in the end, it means the person wearing it feels like a more natural movement when they walk.
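As a rough illustration of why the understanding carries over between platforms, here is a minimal sketch of a controller written against an abstract dynamics interface, so the same code could in principle drive a humanoid leg or a powered prosthetic. The class and method names are hypothetical, not the AMBER Lab's software.

```python
# Sketch of a platform-agnostic tracking controller. The control law only
# talks to an abstract interface, so the same algorithm can be reused on a
# humanoid leg or a powered prosthetic. Names here are hypothetical.

from abc import ABC, abstractmethod
from typing import List

class LegPlatform(ABC):
    """Anything we control -- humanoid or prosthetic -- exposes this interface."""

    @abstractmethod
    def desired_joint_angles(self, gait_phase: float) -> List[float]: ...

    @abstractmethod
    def measured_joint_angles(self) -> List[float]: ...

def tracking_torques(platform: LegPlatform, gait_phase: float, kp: float = 50.0) -> List[float]:
    """Proportional tracking of the desired gait; the law is the same
    regardless of which platform is plugged in."""
    desired = platform.desired_joint_angles(gait_phase)
    measured = platform.measured_joint_angles()
    return [kp * (d - m) for d, m in zip(desired, measured)]
```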

You know, let's say I needed a new left leg, but my right leg had a tendency to pronate or supinate, right?

If the robotic leg coming in is walking in this sort of almost perfect way, my body would have a tough time putting the two together, saying this still doesn't feel like me.

You raise two important points.

Can we just look at the good behavior of a leg and just map it over?

And what else is involved in doing that?

And, again, it's a deceptively complex thing to do that, right?

And so to really map it, you can't just look at those basic movements -- you have to look at the mathematics of that interaction.

So you have to understand how to make the prosthetic work with its own dynamics, which are different from a human's, right?

It doesn't have tendons and muscles like a human does, and it has to synergize with the human to create that natural gait on the back end.

So what are the use cases that are driving this kind of research forward?

I mean, I've seen some places where there are these sorts of robot dogs that could be carrying a lot of gear for the military, and I've seen, certainly, the prosthetic use case, but what do you see this technology enabling people to do 10 or 20 years from now?

Yeah, this is often the question I get -- What will this be fundamentally good for?

And you mentioned these sort of pack animals for the military.

I mean, to me, that's less appealing, given what I do.

What I'm really interested in is using robots in places where it's dangerous or hard to get humans, and the specific applications that I really find exciting are things like space exploration or disaster response.

So you can imagine Mars.

We're looking for life on Mars.

Right now we have wonderful rovers being built at JPL, five miles away from where I'm sitting, but how do we get to where the water is?

Well, the water tends to be at places that are hardest to get to, right?

So imagine now legged robots on Mars able to walk around like we do, all right?

We can explore new science.

So, to me, that's one application, and then on the other end of the spectrum is what we were talking about with helping humans, and I think that's an incredibly exciting thing.

I mean, prosthetics is a great use case, but you can imagine extending much further to helping everybody in their daily life.

So imagine now you put on a pair of pants in the morning, and it's a smart pair of pants, and it helps you walk a little better or helps correct asymmetries in your gait or gives you a workout while you're walking to work.

So you can start imagining robotic technologies spreading out through the population much like smartphones do right now, right?

Starting to become this ubiquitous thing that everybody uses without even thinking about using it, but it makes our lives better.

All right.

Aaron Ames from Caltech.

Thanks so much.

Pleasure being here.

Thanks much.