
Podcast: Talk to The Hand: Meet The Robots With a Human Touch

In this episode, we discuss how MIT researchers are making great strides in developing better robotic hands by focusing on an often overlooked component: the palm.


12 Jun, 2024. 14 min read



EPISODE NOTES

(0:50) - The Robot With a Human Touch

Become a founding reader of our newsletter: http://read.thenextbyte.com/


Transcript

Hey friends, today I want you to talk to the hand, not my hand, a robotic hand, because we're talking about a team from MIT that's reinventing robotic hands. I think it's time to hand it over, so can you lend me a hand? I gotta hand it to you. This is hands down one of the best topics we've talked about, so let's go.

I'm Daniel, and I'm Farbod. And this is the NextByte Podcast. Every week, we explore interesting and impactful tech and engineering content from Wevolver.com and deliver it to you in bite sized episodes that are easy to understand, regardless of your background. 

Daniel: What's up, friends? Today we're talking about a robot with a human touch. Essentially what we're looking at here is, the future we want is robots that can handle things as well as human hands do, and maybe even better. But the problem is most robotic hands lack palms, which limits their ability to grip and handle objects the same way human hands do. Even though we're trying to use human hands as the model for training robots how to grip, we don't give them a palm to grip with. And if you've ever cut your palm or burned your palm, you understand how much your palm is actually involved in gripping things. You feel that painful sensation when you're like, oh shoot, I cut my palm, or I have blisters on my hand, and you go to grab something and feel that shooting pain. That's because your palm is involved in gripping that object. Current robotic designs focus on finger dexterity, they're talking all about the grippers, the fingers, et cetera, but they overlook the importance of the palm in stable and delicate object manipulation.

Farbod: And that is a claim that was a little surprising to me. But then I went back. Figure 01 is the popular current robot that I think most people are talking about. And I was like, wow, they're right. A lot of the demonstrations focus on the fact that the robot's fingers can manipulate a lot of stuff. But if you look at its palm, it's just a completely static, flat piece of metal. The same applies to the Atlas robot on the Boston Dynamics side. There's nothing functional about the palm. Every piece of engineering effort seems to have been put into the fingers doing something. And just like you mentioned, I think I also just overlooked the importance of a palm until it was explicitly pointed out. And now I'm feeling around my hand, like, oh wow, yeah, my palm does kind of deform in the right way to grip something tighter, like a ball or something else. So yeah, definitely overlooked, super excited to see what these folks are doing. And yeah, I mean, do we wanna go into the problem or do we wanna start talking about what they're doing?

Daniel: Well, I mean, I think we've laid out the problem pretty well, which is that a lot of robotics effort has focused on soft robotics, generally on the fingers, right, as the main gripping feature of a robotic hand. We've got a lot of research on making things flexible, and I think the word they use is compliant, right? A material that complies, that kind of flexes and bends when it touches something. We've got a lot of research doing that with fingers. We've got a lot of research doing that with even sensors in the fingers, to be able to feel when they're touching something. But this team from MIT, I think they've got a really astute observation here, which is that the same level of focus should be applied to the palm, because in human anatomy, the palm is just as important in being able to grab things. So, I think that's a perfect segue to talk about what exactly they worked on, what they focused on. But that's the long and short of it here: if the palms on our human hands are such an important part of being able to grip things, feel that we're gripping things, and interact with the world around us, we should take that same level of focus when we're making robotic hands.

Farbod: Yeah, and I like their philosophy when it comes to the solution they were exploring, which was: you need both structural and material compliance. So, on the structure side, think about the structure of your hand. You have your bones. They're able to move back and forth to be more accommodating to whatever you want to grab, which is great. The same goes for the small bones in your palm. But on top of that, you have flesh and muscle, which you can think of as the material that's compliant. It's not super rigid. It can deform to make room and better grip onto the object you want to grab. Most robots are very good at the structural compliance portion of it, where they can articulate in a way to grab something. But there usually isn't a lot of material compliance there.

Daniel: Well, and I liked what they did here, and I think it has, at least to my understanding, filled a gap that we've seen in the robotics realm so far. They have this strong skeleton structure in there that is kind of flexible. They call it a semi-flexible skeleton. And then on top of that, they have this really squishy, soft gel. Between the two, they're able to create a structure that's strong enough to generate force to grab items, but also soft enough to absorb the different changes in geometry and the way those might interact with the hand and the palm, so that it can pick up different-shaped items without dropping them and without damaging them. 'Cause I view that as task zero, the minimum you have to achieve to call this a viable robotic gripper. On top of that, they're able to use the compliance as a method of creating feedback, to help the robot understand what it's gripping, what its geometry is, what it feels like. So, compliance was not only used as a way of mitigating damage to the item being picked up, or mitigating issues with picking up items of different shapes. It's also used as a feedback mechanism to help the robot understand what it's grabbing. Which is basically a proxy for feeling: when we pick up things, we can feel the items we're picking up, and the robot uses this compliance as a feedback mechanism, a proxy for actually feeling the things it's picking up and interacting with the world around it.

Farbod: Yeah, and just before we go further, I think it's worth noting that this objective they have, this philosophy about what a robotic hand should be, is manifested in a platform that has, I would say, two core components. You have this GelPalm, as the name implies, a gel-based palm, with a bunch of ROMEO fingers on it. ROMEO is actually an acronym. Let me see if I wrote it down. I feel like I did. Rigid modular something.

Daniel: RObotic Modular Endoskeleton Optical.

Farbod: I got one word right. Yes. And those are essentially the fingers. So, you have this gel palm. It's a component that, in their iteration, has two cantilevered beams that allow the structure to kind of deform, collapse, comply structurally, with a layer of gel on top of it so that the material can also comply as it's trying to grab something. And the nice thing about this platform is that you can then choose the type and number of fingers you want attached to it. These ROMEO fingers follow the same methodology. They also have a rigid structure with a silicone material cast onto them. And both the gel palm and these fingers have LED strips in them to aid with that sensing technology you were talking about.

Daniel: Well, and one of the things I want to mention is they have an awesome simulation, which you actually clipped and put in our notes, and it's also linked in the show notes. It shows the difference between having a compliant structure in the palm, that flexible skeleton, versus having a compliant gel, the soft skin on top of it, in terms of the amount of contact the robot's hand can make with the object it's picking up. If you have just a compliant structure, it's really, really good at conforming to the macro geometry, let's say the overall shape of the item, basically whether it's a square or a circle. But it's not very good at conforming to the micro geometry of the item and creating a lot of surface area, a lot of contact with that object. Same thing with the gel: it's really, really good at just the micro geometry. It's good at flexing and changing shape in the local area of the object, but it doesn't create enough compliance to change the structure of the hand massively in response to the overall shape of the object. But when they have both a compliant gel and a compliant structure, they show the difference in surface area this robotic palm is able to get on the items it picks up.
And whether it's a square with sharp corners or a circle that's really round and smooth, it's able to create the maximum amount of surface area to aid in gripping, which is great, again, to make sure it can grab things effectively. But this increased surface area also enables an awesome feedback method, using the LEDs and the cameras built into the hand: light reflects off the object it's touching across the surface area in contact, and cameras inside the hand collect that reflected light and try to get a picture of what it's picking up.

Farbod: Yeah. I think that's a great point to expand on now. We've been talking about sensing. You mentioned cameras earlier. This platform, the GelPalm, actually has two cameras embedded in it. They have a two-cantilever-beam setup right now, and I think it's one camera per cantilever, looking outwards towards whatever's coming into the palm through this transparent soft material. And the LEDs, these red, green, and blue LEDs, are shining on the object. And these cameras pick up an image that, when processed a little bit, can give you a black and white image where, ideally, the darker spots are pressing harder into the structure than the lighter spots. And that allows this platform to interpret that data almost like pressure sensor data. Which, think of this technology being used for prosthetics. You can use visual data to get sensor data, and you don't need tiny sensors embedded everywhere. You could just have one general visual platform picking all of this up and interpreting it to tell the user what the sensation is at a given area.
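To make the "darker means pressing harder" idea concrete, here is a minimal sketch of how a frame from a camera behind the gel could be turned into a pseudo-pressure map. This is our hypothetical illustration, not the team's actual processing pipeline: convert the RGB frame to grayscale, then invert it so darker contact regions score higher.

```python
import numpy as np

def pressure_map(rgb_frame: np.ndarray) -> np.ndarray:
    """Toy conversion of a tactile camera frame to a pseudo-pressure map.

    rgb_frame: H x W x 3 uint8 image seen through the clear gel.
    Returns an H x W float array in [0, 1], where darker pixels
    (object pressing harder into the gel) map closer to 1.
    """
    # Standard luma weights for RGB -> grayscale.
    gray = rgb_frame.astype(np.float64) @ np.array([0.299, 0.587, 0.114])
    # Invert: darker contact regions -> higher pseudo-pressure.
    return 1.0 - gray / 255.0

# Tiny synthetic frame: one dark "contact" pixel in a bright field.
frame = np.full((4, 4, 3), 255, dtype=np.uint8)
frame[2, 2] = (30, 30, 30)  # a hard press shows up dark
p = pressure_map(frame)
print(p[2, 2] > p[0, 0])  # the contact pixel reads as higher pressure
```

A real system would also need calibration (mapping darkness to actual force) and background subtraction, but the core interpretation step is this simple.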

Daniel: Well, you kind of stole my punch line there by mentioning prosthetics, because we've only been talking about robots this whole time, but I think that's an awesome point, right? They're not only using human-inspired geometry to create a robotic hand that's more effective; from an overall form factor perspective, their hand looks more like a human hand than a lot of the robotic grippers. Like you were saying, Figure 01 is an awesome robotic achievement, an awesome engineering achievement, when you look at all the things that robot can do. But if I were to think about putting that hand on the end of my arm, if I had had an amputation, versus this robotic hand from this team at MIT, theirs does look and feel more human-like, based off the pictures they include in the research and the video that's linked in the show notes.

Farbod: And its functionality.

Daniel: You should go check it out. Yeah. But yeah, I agree, man. So, prosthetics are an awesome potential impact of this. But I don't know, using LEDs as a method of getting sensory feedback for gripping was still a little bit lost on me. It kind of feels like it went over my head, but you should definitely check out the article that's linked in the show notes. They did an awesome job of showing just how much detail they're able to get using this reflected light and the cameras inside the hand. One of the things they did was pick up small M6 screws. They were able to get a clear projection of the threads as a screw pressed against this compliant gel skin. And they were also able to pick out the logo on the stud of a Lego piece. Again, just to show the level of detail they have, they're able to use this illumination system, right? There are soft lights built into the hand that shine on the object as it's gripping it, and cameras inside that collect the light reflected off the object it's touching, through this clear silicone gel, and it reconstructs an image based off that reflected light. And one of the things I like is that in a super scientific paper like this, I caught a not-super-scientific term. The caption to this image says the illumination strategies result in a “decent” 3D reconstruction, and they're right. It is decent. It's not great, it's not awful, but it's acceptable. And it is super interesting to see that they're able to clearly distinguish the detail on an M6 screw, the threads. That's something really unique to be able to say: hey, we can collect this much detail in the palm of a robot grabbing this item. Similarly, picking up the logo on a Lego piece just shows that this is a step in the right direction. And I would agree those images aren't perfect, but they are decent.
They're able to basically reconstruct the 3D geometry of the item they're picking up based off of just reflected light and cameras inside the palm of the robotic hand.
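The reconstruction being described is in the spirit of photometric stereo, which is how GelSight-style tactile sensors are commonly explained (this is our sketch of the general technique, not necessarily this paper's exact method, and the LED directions below are made up for illustration): if red, green, and blue LEDs light the gel from three known directions, each pixel's three color intensities give three equations, and under a Lambertian reflectance assumption you can solve a 3x3 linear system for the surface normal at that pixel. Integrating normals then yields a height map.

```python
import numpy as np

# Assumed unit light directions for the R, G, B LEDs.
# Hypothetical geometry, not values from the paper.
L = np.array([
    [0.6,  0.0,   0.8],  # red LED direction
    [-0.3, 0.52,  0.8],  # green LED direction
    [-0.3, -0.52, 0.8],  # blue LED direction
])

def normal_from_rgb(rgb: np.ndarray) -> np.ndarray:
    """Recover a unit surface normal from one pixel's (R, G, B) intensities.

    Lambertian model: intensity_i = L_i . n (albedo folded into scale).
    Solving L @ n = rgb gives n up to scale; normalize to unit length.
    """
    n = np.linalg.solve(L, rgb.astype(np.float64))
    return n / np.linalg.norm(n)

# Sanity check: a flat patch facing the camera (normal = +z) produces
# intensities L @ [0, 0, 1]; we should recover that same normal.
true_n = np.array([0.0, 0.0, 1.0])
rgb = L @ true_n
recovered = normal_from_rgb(rgb)
print(np.allclose(recovered, true_n))
```

In practice the sensor would be calibrated per pixel (real LEDs aren't perfectly directional through gel), and the per-pixel normals would be integrated, e.g. with a Poisson solver, to get the 3D surface.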

Farbod: And to give credit to the team on the visual platform being used here: they have two Raspberry Pis doing the image processing, and the cameras used are Raspberry Pi spy cameras. So, these are fairly inexpensive components you can get off the shelf. And they're demonstrating that you can do something pretty impressive just with that.

Daniel: And on top of all this, the compliant structure, this semi-flexible endoskeleton, might sound super fancy, but it's all 3D printed. The entire robotic hand, and all the silicone gel around it, they cast themselves, probably in 3D printed molds.

Farbod: The flexible LED strip they got is off the shelf from Adafruit, commercially available, fairly inexpensive.

Daniel: And I don't know. It's just super impressive to me to watch them achieve a world-class, MIT-level research result while also, and I think we both appreciate this as garage hackers and mechanical engineers, showing the scrappiness to pull things together and manufacture something on their own.

Farbod: And you can tell like this is a group that had scale in mind. Like if we actually make this and it's good, then would it make financial sense to create this at scale? And you can definitely see bits and pieces of that in the design.

Daniel: Well, and I think you and I both related with this, but you were saying one of the lead authors, Sandra Liu, who's a PhD student in mechanical engineering, you're like, I could just tell based off the way that this was written, that she's a mechanical engineer, because she's talking about delamination of the materials. And like oftentimes-…

Farbod: It was speaking to me.

Daniel: We talk about the fabrication of the hardware to associate with some super interesting software algorithm. The hardware is almost always an afterthought.

Farbod: Yeah, not here.

Daniel: Thank you, Sandra, for putting the hardware first, because hardware is hard. It's not easy.

Farbod: And it warms our little mechanical hearts.

Daniel: Our little tiny small mechanical hearts were warmed a little bit. Our hearts grew three sizes today.

Farbod: Yeah. The laws of thermodynamics were in play, thanks to you.

Daniel: Well, so I will say, I think it's important to kind of talk about the so what here, right? The achievements, the significance of it. I think that one of the cool things they're able to show is improved dexterity and safer interactions with objects, as opposed to a robotic hand without the same sensing mechanism inside the palm. Obviously, you don't want to crush the items that you're picking up. You want to be great at picking up the items, but I also appreciated the icing on top of the cake here, if you will, which is getting the sensory feedback from the system to be able to understand what exactly the item is that you're picking up. I think about if I were to grab my phone in my hand, using just my palm, I'm doing it right now, using just my palm, I can feel where the buttons are. I can feel specific aspects about the item that I'm picking up using the palm, which is actually a lot of surface area compared to the fingers as a way of collecting extra data when robots are picking things up. And then I love the application here in the biomedical sense that we're mentioning, this is a potential application for prosthetics as well.

Farbod: I'm with you, man, I'm with you. One thing that I think is worth noting is that this team really wanted to keep their design as lightweight as possible. They didn't want it to be super bulky because if it was, then it would limit its use cases. And they kind of encountered this problem where you have the two cameras in the GelPalm, but you still have a fairly limited view of whatever you're grabbing in case you want to look at the top or the sides of whatever is going on there. And they couldn't add more cameras without adding more bulk. So, this is an element that they're actively working on and trying to iterate on to give you the maximum amount of input data without making this a very heavy, bulky structure.

Daniel: Yeah, that makes sense.

Farbod: I think it's worth noting it.

Daniel: And I think the configuration they ended up going with was three fingers for most of the testing. It would be awesome if they're able to shrink the packaging space for all the sensors so they can add more fingers, because at least from a humanoid perspective, we're seeing a lot of these humanoid robots converge on having at least four fingers. I think it'd be awesome, again purely from a packaging and aesthetics perspective, trying to empathize with someone who needs a prosthetic hand, if they're able to get the form factor even closer to a human hand, as opposed to a kind of weird three-fingered claw looking thing.

Farbod: I can tell that that's an important issue for you. So, if the team at MIT is listening, can we please get a four-fingered hand?

Daniel: Or can we please go to MIT and look at your palms and fingers?

Farbod: It's been a couple months now. I think we're due for a visit back.

Daniel: I think we are.

Farbod: Yeah. Hey, MIT CSAIL. Our friend, Rachel, if you're listening, we'd love to come back.

Daniel: We really would. Yeah. All right, before we wrap up the episode here, I want to do a quick summary of today's topic. Talk to the hand, my friend. Robots are about to get a huge upgrade. Researchers at MIT and around the world want to make robots handle things as well as human hands can. But most robots don't have a soft, flexible palm in the middle of those robotic hands, which makes it really hard for them to grip things easily and gently, and to understand exactly what it is they're grabbing. This team at MIT developed something called GelPalm, which, you guessed it, is a gel palm with special sensors and a soft design built in. It helps robots touch and hold objects much more like humans do. This makes them better at working with people and can help them get better at grabbing items and recognizing items, and one of the cool applications for this, I think, is improving prosthetic hands in the future as well.

Farbod: Awesome, right on the money. All right, folks, thank you so much for listening. As always, we'll catch you in the next one.

Daniel: Peace.


As always, you can find these and other interesting & impactful engineering articles on Wevolver.com.

To learn more about this show, please visit our shows page. By following the page, you will get automatic email updates when a new show is published. Be sure to give us a follow and review on Apple Podcasts, Spotify, and your favorite podcast platforms!

--

The Next Byte: We're two engineers on a mission to simplify complex science & technology, making it easy to understand. In each episode of our show, we dive into world-changing tech (such as AI, robotics, 3D printing, IoT, & much more), all while keeping it entertaining & engaging along the way.


The Next Byte Newsletter

Fuel your tech-savvy curiosity with “byte” sized digests of tech breakthroughs.