
Podcast: BMW Makes Cars Faster With Augmented Reality

In this episode, we discuss how BMW is leveraging augmented reality to design, manufacture, and service cars faster using Microsoft’s HoloLens glasses and a cloud-based computer-aided design solution called Hololight.


31 Jul, 2024. 17 min read



This podcast is sponsored by Mouser Electronics


Episode Notes

(4:05) - BMW accelerates concept evaluations with industrial augmented reality and virtual reality

This episode was brought to you by Mouser, our favorite place to get electronics parts for any project, whether it be a hobby at home or a prototype for work. Click HERE to learn more about how human machine interfaces have evolved over the years and what the future holds!

Become a founding reader of our newsletter: http://read.thenextbyte.com/


Transcript

Welcome back to the podcast, folks. And let me ask you a quick question. Do you love German sports cars? Well, we do because we're doing two back-to-back episodes for it. So, we're talking about BMW. We're talking about how they're revolutionizing their manufacturing process using AR. And if that's got you excited, buckle up and let's get right into it.

I'm Daniel, and I'm Farbod. And this is the NextByte Podcast. Every week, we explore interesting and impactful tech and engineering content from Wevolver.com and deliver it to you in bite sized episodes that are easy to understand, regardless of your background. 

Farbod: All right, folks, as you heard, we're talking about another German manufacturer of beautiful automobiles today, and that's gonna be BMW. But before we get into today's episode, let's talk about today's sponsor, and that's Mouser Electronics. Now, here at this podcast, we love Mouser Electronics, and here's why. They have incredible connections with the industry, being one of the world's biggest electronics suppliers. So, they know a lot about what's coming down the pipeline and what's hot in the R&D space. But they're also very close with academia, which means they also know a lot about cutting-edge research that hasn't made its way into industry yet, and occasionally they write content about the things that they know. And the one that we're linking in today's show notes, which you can check out if you're interested, is all about human machine interfaces. And you might be wondering, what exactly does that entail? Like, what is a human machine interface? Have I ever interacted with one? And you actually have.

Daniel: The answer is almost certainly yes, especially if you're listening to this episode.

Farbod: If you're listening to this episode, you're probably listening to it on a phone, on your car's smart system, or on a computer. That is a perfect example of a human machine interface. And I guess the way we're thinking about what it could be in the future becomes a little bit more sci-fi, right? And that's what this article goes into. It gives us a little bit of background about the origins of this, how we use it in our daily lives today, and what the future can look like. It talks about brain interfaces. That's really hot right now with Neuralink and stuff. But it really tries to touch on augmented reality and virtual reality, because I feel like that's the next step of what we can do with these interfaces, right? We've had the physical touch.

Daniel: Yeah, that's what I was gonna say. They walk through the history of how humans interact with computers and machines. And it used to start like, I remember learning about this in like,

Farbod: The punch card history class.

Daniel: Yeah, like punch cards. You used to have to punch cards to program and interact with the computer. And then we were able to use a keyboard, and then along came the mouse. And now we've got touch screens, and we've already got early concept development of using your brain to interact with the computer. But one of the big things that they see a lot of people using as their way of interacting with the machine, the human machine interface, will be augmented reality and virtual reality, things like the Apple Vision Pro that we have started to see take a little bit of a foothold in our society. Mouser, again, having a great finger on the pulse of future technical trends, can say, this is something that we think is coming, and this is where we think the future of the human machine interface is gonna be: in this augmented reality, virtual reality space, where instead of sitting in front of a computer and interacting with it through a screen, the reality that you're interacting with becomes a part of the real world as well.

Farbod: And I will say, we've now revisited some of their articles a few times for different topics, and they are consistently right on the money. I think it's probably, again, due to the fact that they have such close ties to industry members that they really know what's gonna materialize over the next couple years for every concentration of the technology world that we care about. But this is a hot one. Folks, if you're interested in AR, VR, or generally just human machine interfaces, definitely check it out. And with that said, let's segue into today's article. Again, we're talking about the BMW Group, and we're focusing on AR, as you can probably tell. This is an interesting one, because when I think about AR or VR, the first thing that comes to my mind is the Oculus from Facebook, right? And then more recently, there's the Apple Vision Pro, all of which are like really consumer focused. Have you ever heard of the Microsoft HoloLens?

Daniel: Only on this podcast, when we were talking about industry applications. And I like the angle that Microsoft has taken with this, which is, maybe less so than it being a consumer toy, let's find ways where it can be an industry tool. Absolutely. And I think that's where a lot of new technology first takes a foothold, gets developed, and becomes an inseparable part of our daily lives: as a part of industry, as a part of the tools that we use to produce things, as a part of our economy, essentially. And that's one of the things that they've done here. So the BMW Group, not to give the spoiler away, but they're using AR technology here to help address a lot of the key issues that they have with the traditional automotive development and prototyping stages. They're able to use the strengths of augmented reality in a way where it's not just a gimmicky tool or a cool perk or something that's fun to play around with. It's actually a key part of how they develop and solve technical issues a lot faster than using the traditional methods.

Farbod: Yeah, you hit the nail on the head and the only thing I wanted to add is, I feel like as consumers, I for one was kind of quick to brush off something like the Apple Vision Pro because I didn't think it made sense for the use cases of like what I do with my laptop or what I do with my phone. But the reality is that behind the scenes, there are a lot of applications for tools like this. And like you said, we're now seeing BMW kind of start adopting it. So, we got the why out of the way, why they're going after something like this, because theoretically it could improve their efficiency quite a lot. And now I want to talk a little bit about the how. How have they tried to approach this problem of, can we actually implement this system in our factories? And we gave it away with the first bit. The hardware they're using is the Microsoft HoloLens 2. And fun fact, you said it, we have talked about this product before on this podcast, and we actually talked about it during our sixth episode.

Daniel: That's crazy.

Farbod: I don't know if you remember, but this was with an ETH Zurich chief surgeon who was doing spinal surgery, the first spinal surgery ever assisted with augmented reality.

Daniel: Crazy.

Farbod: And they leveraged this HoloLens to come up with the right piece of metal and then figure out where to screw into the spine. It was pretty incredible. And here we are, what, three years later talking about how that same piece of technology is being used in the manufacturing realm now.

Daniel: Yeah, and I will say I think the secret sauce here, in addition to this hardware that they're using, these HoloLens augmented reality goggles, is that they're also using software called HoloLight Space. And HoloLight is this tool that runs on the Microsoft HoloLens and allows people to visually overlay real geometries, real to-scale geometries, in 3D space from CAD. So, imagine you can take a 3D model of a part that you've designed for a car, and you and I can put on our Microsoft HoloLenses, and there's an awesome video actually attached to the article that displays this functionality. But you and I could sit in front of a model car and we could say, okay, let's bring up the design that represents the overall shape of our engine. Right. And you and I sitting here, both with our goggles on, can virtually drop this engine into the car with an empty engine cavity in front of us. And we can both visually see it, at perfect scale, watch it interact and fall directly into place in the spot that it's supposed to be in this engine cavity. And use it as an opportunity to provide feedback on the design, provide feedback on the manufacturing. This is how they're using it, but I think the HoloLight technology is what enables them to do that using the HoloLens hardware.
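To picture the kind of to-scale fit check Daniel is describing, here is a minimal sketch of how an overlaid engine model could be tested against the empty engine bay it is dropped into. This is a toy illustration only, not BMW's workflow or Hololight's API; the Box class, the dimensions, and the 1 cm clearance are assumptions made up for the example.

```python
from dataclasses import dataclass

@dataclass
class Box:
    """Axis-aligned bounding box in meters, anchored in the car's coordinate frame."""
    x: float  # position of the box's minimum corner
    y: float
    z: float
    w: float  # width, height, depth
    h: float
    d: float

    def fits_inside(self, cavity: "Box", clearance: float = 0.01) -> bool:
        """Check that this box, padded by a clearance margin, sits fully inside the cavity."""
        return (self.x - clearance >= cavity.x and
                self.y - clearance >= cavity.y and
                self.z - clearance >= cavity.z and
                self.x + self.w + clearance <= cavity.x + cavity.w and
                self.y + self.h + clearance <= cavity.y + cavity.h and
                self.z + self.d + clearance <= cavity.z + cavity.d)

# Hypothetical numbers: an engine model overlaid 1:1 into an empty engine bay.
engine_bay = Box(x=0.0, y=0.0, z=0.0, w=1.20, h=0.70, d=0.80)
engine     = Box(x=0.05, y=0.05, z=0.05, w=1.05, h=0.60, d=0.70)

print("Engine fits with 1 cm clearance:", engine.fits_inside(engine_bay))
```

The real value in the AR workflow is that this check happens visually, with the part rendered at true scale in the physical bay, rather than as numbers in a script.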

Farbod: Yeah, totally agree with you. And one thing I wanna highlight here, for folks out there familiar with CAD, or computer-aided drafting and design, is that the programs actually require a lot of computational resources to run, right? So the computers that Dan and I had to use for CAD work have typically been quite heavy because of the hardware they have on board. And some of these assemblies for cars have hundreds of pieces, so just rendering them, even on a powerful machine, takes a lot of time. The way that HoloLight tackles this problem is like Dan was saying: you get to see the rendering visually on the HoloLens, but all the difficult computation is happening in their cloud, which is capable of doing all that work, instead of offloading the computation onto a HoloLens which might not have the capacity to do it. So it kind of gives you the best of both worlds. You have a lightweight machine that you can put on your head and visualize everything, and the computation happens elsewhere.
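As a rough mental model of that split between a lightweight headset and heavy cloud-side rendering, here is a small conceptual sketch. The CloudRenderer and Headset classes and their methods are hypothetical stand-ins, not Hololight's actual streaming protocol; in the real system the rendering would run on remote GPUs and frames would be streamed back over the network.

```python
import time
from dataclasses import dataclass

@dataclass
class HeadPose:
    """Simplified head pose: position plus yaw angle, as the headset would track it."""
    x: float
    y: float
    z: float
    yaw_deg: float

class CloudRenderer:
    """Stands in for the remote service that holds the heavy CAD assembly."""
    def __init__(self, assembly_name: str, part_count: int):
        self.assembly_name = assembly_name
        self.part_count = part_count  # hundreds of parts live here, not on the headset

    def render_frame(self, pose: HeadPose) -> str:
        # In the real system this would rasterize the full assembly on server GPUs
        # and return an encoded video frame for the given viewpoint.
        return (f"frame[{self.assembly_name}, {self.part_count} parts, "
                f"pose=({pose.x:.1f},{pose.y:.1f},{pose.z:.1f},{pose.yaw_deg:.0f}deg)]")

class Headset:
    """Lightweight client: sends poses out, displays whatever frames come back."""
    def __init__(self, renderer: CloudRenderer):
        self.renderer = renderer

    def run(self, poses):
        for pose in poses:
            frame = self.renderer.render_frame(pose)  # a network round-trip in reality
            print("displaying", frame)
            time.sleep(0.01)  # placeholder for the display refresh

# A short simulated session: the wearer looks around the virtual engine assembly.
renderer = CloudRenderer("engine_assembly_v3", part_count=450)
Headset(renderer).run([HeadPose(0.0, 1.7, 2.0, yaw) for yaw in (0, 15, 30)])
```

The design point is simply that the headset only tracks poses and displays frames, so the device on your head stays light while the hundreds-of-parts assembly lives elsewhere; it also explains why the connection quality matters, as discussed later in the episode.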

Daniel: Now that's super smart. And I just want to talk about the use cases here and the motivating factors. Just trying to understand the traditional automotive prototyping methods, right? It's pretty time consuming. Any time you wanna validate something, you've either gotta use a virtual-only model to try and validate a new design. So, you're trying to simulate the manufacturing process using 3D models of other parts and seeing if the new part fits in there without any issues. Generally, that works when the situation is pretty simple, but it gets complex when you're trying to take into account the ergonomics of the operator who might be assembling this part, any tools they might be using, the workspace of another operator near them. And honestly, just trying to see if it's feasible for someone to do the motion to put this part into place becomes either super computationally intensive if you've got a really good model to do it, or super low fidelity because you don't have a great model to do it. And you end up making mistakes along the way. The other version of that is spending significant cost and time to create real physical models, either 3D prototypes or rapid prototype assemblies, to try and do physical validation. But this costs a lot, and you need to create new models for every single iteration, every single version. You need everyone to physically be in the same space. You need to ship prototypes to and from for feedback. That has a lot of friction because it's in the physical realm, and then the options that are in the virtual realm aren't high fidelity enough for you to get a great answer. And so, what this team from BMW is doing, with HoloLens and HoloLight, is taking the best strengths of the virtual realm, which is being able to make quick iterations, make changes, and view them in a way that doesn't require lots of money and time to create a physical model. But then they're also able to combine that visually with being in the real physical space. You and I both looking at a part to scale, at the original size it's supposed to be, in its orientation with the other physical parts that it's supposed to mate with, and giving direct, immediate design feedback back to the designers who are creating that part without us ever having to create a 3D print. And you can just imagine the difference in fidelity you get if you're able to physically stand there and see, oh, would I be able to place this part inside this hole, or is that hole too small? Versus trying to simulate that in a CAD program. I mean, I can't even imagine where to start in a CAD program. Being able to do that with augmented reality, for me, feels a lot more intuitive. Feels like you don't need a ton of training to do that. Feels like you can get good, actionable feedback quickly to designers to make a part that's truly manufacturable, as opposed to maybe designing parts that have flaws and then having to work those kinks out later.

Farbod: Dude, totally agree with you. And you and I are both F1 fans, right? So, what I was thinking about is, there are digital twins of these cars that the drivers get to test on a regular basis and give immediate feedback on. I'm like, this change is good, this is bad, I have oversteer, or whatever. And now you have this AR technology that's giving designers that exact same kind of digital twin, which for example they could use for the fit and finish of the interior. They can visualize what it would be like to sit in there and say, okay, you know the hinge is this way right now, but what if I tweak it? And they can actually tweak it in real time: this way, would this be good, would this be bad? It's no longer a process of just sitting at your desk, thinking about it conceptually and kind of seeing it. Now you get to feel it, now you get to experience it. Not only that, because this HoloLight software allows you to share this with all the stakeholders, it's not just you as the designer saying, yeah, this is good, and then your buddy who's also a designer saying, this is good. You can quickly be like, hey, project manager, put on the HoloLens real quick, I want to show you something. And then they get to give you instant feedback too, so that everybody's on the same page.

Daniel: Yeah, and I agree with you there, and I think the real strength comes cross-functionally as well, right? Not just in the product development silo, let's say. Bridging that over to manufacturing, bridging that to quality, bridging that to the folks in service engineering who are supposed to make sure that the design you create for a vehicle is still serviceable by technicians in the field. You're able to quickly loop in all these people, and it doesn't cost millions of dollars for you to make a prototype. It doesn't have to be a physical vehicle that everyone's sitting in the same space to look at. All it takes is everyone popping on a HoloLens, and they're able to access this simulation, even remotely, and understand, how does this part look? How does the part accessibility look? Do we need to make any design changes, et cetera? And because you're able to do this virtually and, like you said, manipulate the parts in real time, you can create rapid iterations and even instant iterations if you're able and willing to sit there and design. One of the things that I think is really cool, but could potentially be a problem, at least from my perspective, is that the way they interact with this 3D model is completely by hand. So, it's all hand controls. And you can see in the video that we're gonna link in the show notes, the entire AR application is controlled by the user's hands. They make hand gestures, which allows them to directly manipulate all these virtual components. I wonder if there's an option to also have a keyboard or a mouse or something like that, because I feel like there's only so many things you can do with hand gestures before you're like, all right, I just want to use my keyboard and mouse. I could see it becoming laborious trying to modify a 3D model using only hand gestures. But outside of that, I think it's super valuable for folks who are interacting with it, let's say in a read-only mode, to be able to use their hands, because they don't have to use a special remote control. It reduces the cost, it reduces the complexity. They're able to just use hand gestures to manipulate the part, view it from different angles, and then again, like we said, provide real-time feedback on how the part looks and whether any changes need to be made.

Farbod: Yeah, no, that's really good feedback. And to our listeners, we promise that we deliver balanced feedback on every piece of technology that we're talking about. We don't always want to be cheerleaders. We want to give you some pros and cons. So, I think that's a very fair con. And again, as someone who's used CAD quite a lot, I can't imagine myself committing to gesture-only controls when I'm so fixated on my mouse and keyboard. But another component that I saw as a potential point of friction is the cloud-based solution for computing these models. In the video they shared, you can see that there's lag already. And typically, in demo videos, that's kind of the best-case scenario. So, it makes me wonder what happens if you don't have a great connection and you're dependent on all of this happening somewhere else in the cloud. So, I think those are some pain points that can potentially be improved on in the future, but obviously this technology has a lot of immediate benefit that it's providing, specifically to the BMW Group, because they went out of their way to say leveraging AR has decreased their concept validation time by up to a year. That is incredibly valuable for a company the size of BMW that's operating on so many different fronts. Just look at their current vehicle lineup in the United States alone. Imagine if they could save a year from the five-year development cycle for, I don't know, a 3 Series or a 5 Series. That's millions of dollars saved.

Daniel: Well, and you hit the nail on the head, right? Just from my experience in the automotive realm, you've got a lot of startups, a lot of newcomers who are trying to push the pace in terms of development cycles, reducing it from four to five years down to like two years to get a vehicle program out from the ground up. And a legacy automaker like BMW has to stay competitive, not only from a technology perspective. If you think about it, everything that BMW is releasing to you today started development five years ago. That may be why things like the infotainment screen feel slow and laggy in comparison to things like your phone, because Apple is developing new iPhones almost every single year. I think it's like an 18-month development cycle for the iPhone, maybe 12 to 18 months, but cars are being developed on a five-year cycle. So that's why some of the technology kind of feels obsolete by the time you touch it. And now they've got competitors trying to do things in two years too. Obviously, they wanna stay at the forefront, they wanna feel competitive, but they know to some extent they've got people who are loyal to them from a brand perspective, who are gonna buy BMW no matter what. But they would like to, like you're saying, just save millions and millions of dollars. Imagine being able to produce products without as much time investment, without as much cost investment. That's a big deal as well.

Farbod: Totally agree with you, that's a great point. And one other thing I wanna note, and you can chime in on this because again, you're the automotive world expert on this podcast, is that we have these legacy automakers whose brand names are super respected. Think Toyota, everyone just connects them with reliability and quality. The same thing is expected from these German manufacturers; they have the best fit and finish that you could think of. So, quality in manufacturing is a big plus for them in comparison to some of the startups that might not have figured that out completely yet. So, another way that they're leveraging AR here, which I thought was super impressive, was for training the technicians that are actually working on the assembly line, right? Maybe on day one, you don't want someone to be working on a customer-facing part because they just don't have the experience yet. Well, you can replicate that entire process pretty much to a T using this AR headset. And then if a person's struggling with a specific part of the assembly line, you can start catering specific trainings for just that person. I think that's a huge win, especially since, as you've told me, if an engineer changes a part during a production run, the technicians have to come up to speed on how to handle that and make it part of their flow. Imagine having a ramp that allows you to come up to speed on that much more smoothly than just learning on the job. So, I thought that was a huge win. They didn't emphasize it as much as I thought it should be emphasized in the article, so I just wanted to call it out.

Daniel: I'm with you. And similarly, beyond assembly, I'm thinking about service technicians, right? You've got a newly developed product, and it's a really challenging learning curve to try and train people from a service perspective to be able to service pretty much every single aspect of a car. In assembly, you've got people who specialize: they sit at one station or bounce between a couple of different stations, and using AR is really valuable to train them on their task, whether that's plastics assembly or getting really good at putting seats into cars, because most of their time is spent on that one thing. But a service technician is pretty much expected, in the field, to be able to take apart an entire car and put it back together by themselves. That's an extreme amount of knowledge. You either end up paying a premium to hire these people, or you end up investing a lot in training them to have this level of expertise. I think having AR equips more people to feel like experts, the way I feel when I'm working on something with YouTube as my co-pilot to help me. Imagine if you were doing your pantry remodel this weekend but you had visual guides helping you…

Farbod: For where the studs are.

Daniel: Where the studs are, where you place the stuff. I think that's part of the selling point of the HoloLens, and also HoloLight Space, which is this 3D CAD software. I think there are a lot more applications like this from an industry perspective, in service, in construction. I mean, the possibilities aren't completely endless, but there's a lot more there than we're giving AR credit for today.

Farbod: Yeah, yeah, I feel that. You think it's a good point to start wrapping it up?

Daniel: Yeah, let's do it.

Farbod: All right, folks, hear me out. I know you might be looking at the Apple Vision Pro and thinking, well, this is absolutely useless. And honestly, you might have a point. But BMW's proving us wrong on the manufacturing side, because they're taking a Microsoft HoloLens 2 and reducing their concept car validation process by up to a year by using this technology in combination with software called HoloLight. See, HoloLight takes the HoloLens, which is augmented reality, and allows engineers and other stakeholders to overlay an augmented version of the components of a car onto a real piece of geometry out in the real world for everybody to analyze. Imagine you're designing a new car, you're sitting in the interior, and you're like, I wanna change this one piece. Is it a good fit? Is it not? Well, the engineer can agree on it, the project manager can agree on it, heck, the CEO can even agree on it. And that's why this process and this tool is going to revolutionize the automotive industry.

Daniel: I love it.

Farbod: I do what I can. All right, folks, thank you so much for listening. As always, we'll catch you in the next one.

Daniel: Peace.


As always, you can find these and other interesting & impactful engineering articles on Wevolver.com.

To learn more about this show, please visit our shows page. By following the page, you will get automatic updates by email when a new show is published. Be sure to give us a follow and review on Apple Podcasts, Spotify, and most of your favorite podcast platforms!

--

The Next Byte: We're two engineers on a mission to simplify complex science & technology, making it easy to understand. In each episode of our show, we dive into world-changing tech (such as AI, robotics, 3D printing, IoT, & much more), all while keeping it entertaining & engaging along the way.


The Next Byte Newsletter

Fuel your tech-savvy curiosity with “byte” sized digests of tech breakthroughs.