XR Brings Greater Clarity to Automotive Design
While the technology is still evolving, it already offers automakers a wealth of possibilities.
January 19, 2023
Extended reality (XR) is literally shifting the way automakers see design data. These technologies enable automotive engineers to glean greater insights into the issues that determine a design’s success. XR visualizes design data in a way that helps development teams to achieve a level of understanding that 2D drawings or 3D digital data viewed on a 2D screen simply cannot match (Fig. 1).
Using XR in automotive design offers numerous benefits, such as quick iterations and the ray-traced photorealism packaged with today’s desktop-based digital design tools, according to David Weinstein, director of virtual reality and augmented reality at NVIDIA.
“XR goes one step further and extends these advantages, making everything life-sized, immersing the designer within the model,” Weinstein says. “With photorealistic rendering and 1:1 scale, designers can intuitively assess ergonomics and aesthetics. XR also allows engineers to develop much more natural interfaces because they can interact directly with their virtual model—digitally creating models in thin air, directing virtual assistants with speech, and evaluating reflections and glare through the virtual windshield.”
Understanding the XR Toolbox
The term XR encompasses virtual, augmented and mixed reality. Understanding how the three technologies are defined, as well as the factors differentiating them from each other, will help a company choose the correct device for its business use cases. It will also determine to an extent how the design data should be prepared and optimized.
The main feature of virtual reality (VR) is that it is fully immersive. Once the user puts on the headset, there is no access to the real world. All activity occurs in the digital environment.
On the other hand, augmented reality (AR) is experienced with handheld devices, such as tablets and mobile phones (Fig. 2). This technology overlays 3D data on the real-world environment to provide a hybrid communications medium.
Mixed reality (MR) is viewed via headsets, and it combines an immersive environment with digital data. For example, Microsoft HoloLens projects a holographic representation of the design data into the real-world environment, at scale and in context (Fig. 3).
“Essentially, each technology enables the ability to visualize design data, removing the disconnect between the screen and the design, giving context and scale,” says Kevin Levy, director of marketing at Theorem Solutions. “Using XR helps to extend the use of existing design data for digital twins, layout, training, inspection, validation, maintenance, design review and visualization, all of which can be of benefit to the automotive industry—depending upon the requirement and use case.”
A Clearer View of Interior Design
As these visualization technologies mature, the dynamics change between the physical and digital worlds. Until recently, automakers had no choice but to spend significant time and money developing physical prototypes to test and evaluate interior designs and ergonomics. Even at its best, this methodology has proven problematic.
“With conventional interior design, a team of prototype fabricators would take weeks to months to shape interior components to reflect design concepts and drawings,” says Weinstein. “Before fabrication could even be completed, the design typically would already be out of date.”
Recently, however, XR has increasingly become a viable alternative to physical prototypes, streamlining automotive design in the process. These technologies allow automotive engineers to visualize design data at full scale, which can help them to better understand how the design will fit in the real world and how people will interact with the finished product.
“Ergonomics can be tested virtually at scale with AR or VR, before having a physical mockup or prototype,” says Eric Abbruzzese, augmented and virtual reality research director at ABI Research. “The interior becomes an object in a virtual world, and content developers can program different interaction elements as needed, such as common touchpoints.
“With accurate 3D spatial tracking and visualization, things like distance relationships and sight lines can become apparent that would be more difficult to visualize with traditional digital design,” he says. “And of course, there is the visual element of design, so being able to see a potential design immersively in 3D, with high-quality representations of color and material, can provide data for a more qualitative analysis.”
For example, with VR, refined models can be experienced in real time from anywhere. Interior volume studies can be experienced as quickly as the CAD surfaces are modeled. Furthermore, the level of evaluation and quality no longer depends on traditional fabrication techniques; it is limited only by the CAD model’s level of detail.
The same applies to ergonomics and human-machine interface design. With VR, space, reach, ingress and egress can be validated faster using an adjustable seating buck to represent a range of vehicle classes, and high-fidelity VR graphics can simulate materials, finishes, day-to-night lighting conditions, reflection, glare and line-of-sight conditions.
“Seeing something as if it were in front of you is very different from seeing it on a screen,” says Levy. “It allows users to better understand the design data that they’re working with.
“Furthermore, the ability to overlay digital data on a physical object as a means of validation and inspection is a very powerful application, with many different use cases,” Levy adds. “These can range from digitally checking that a manufactured component fits or that weld points are in the correct place to whether color and finish changes work. All these assets exist digitally prior to manufacture. Using them for styling, inspection, clash detection, human factors and ergonomics within XR to enhance existing processes makes sense.”
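The overlay-based inspection Levy describes boils down to comparing measured feature positions against their nominal CAD locations. The sketch below illustrates the idea for weld-point validation; the weld IDs, coordinates and 3 mm tolerance are made-up example values, not figures from the article or any specific inspection system.

```python
import math

# Illustrative sketch: the kind of pass/fail check an AR inspection
# overlay performs when validating weld points against CAD data.
# Nominal positions (meters) and the tolerance are invented examples.
NOMINAL_WELDS = {
    "W1": (0.00, 0.00, 0.00),
    "W2": (0.25, 0.10, 0.00),
}
TOLERANCE_M = 0.003  # 3 mm acceptance band

def validate_welds(measured, nominal=NOMINAL_WELDS, tol=TOLERANCE_M):
    """Return {weld_id: (deviation_m, within_tolerance)} for each weld."""
    report = {}
    for weld_id, nominal_pt in nominal.items():
        deviation = math.dist(nominal_pt, measured[weld_id])
        report[weld_id] = (deviation, deviation <= tol)
    return report

report = validate_welds({"W1": (0.001, 0.000, 0.000),
                         "W2": (0.250, 0.105, 0.000)})
# W1 deviates 1 mm (within tolerance); W2 deviates 5 mm (out of tolerance)
```

In a real deployment the measured coordinates would come from the device’s spatial tracking, and the result would be rendered as color-coded markers over the physical part rather than returned as a dictionary.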
Better Than Being There
XR visualization also enhances the collaborative design review process, a long-time logistical challenge for the automotive sector. Teams traditionally gather in specific rooms, at specific times, to review progress and make decisions. If a key stakeholder isn’t able to attend the meeting, the review will often be rescheduled. The time and expense involved in bringing together all of the stakeholders result in artificial deadlines and fewer candidate models.
But it doesn’t have to be that way. XR systems now provide the means to facilitate collaboration among remote or distributed teams, as well as enhance in-person sessions.
Using XR technologies, review participants can see, hear and interact with each other, just as they would in a physical design room (Fig. 4). Furthermore, with the introduction of speech AI, collaborative VR sessions will soon support real-time translation, enabling participants’ comments to be translated into the native languages of the other stakeholders if necessary.
“In a virtual environment, individuals can provide asynchronous feedback through virtual sticky notes or video memos, capturing spontaneous inspiration or supporting participation across global time zones,” says Weinstein. “Furthermore, entire design review sessions can be captured and replayed at a later time, allowing stakeholders who joined a project late or who were double-booked during a session to better understand the history and context of previous design decisions.”
These visualization formats enhance communication, streamline the exchange of ideas and promote greater interaction and flexibility.
“Everyone involved can view the same data at the same time from their own perspective,” says Levy. “Remote, collaborative design reviews can be as instantaneous as starting a Teams or Webex session. The difference with doing this in XR is that participants of the review will join using an XR device. These will predominantly be mixed or virtual headsets. If a user doesn’t have a headset, they could still join using a desktop machine.”
Reducing the need to travel, and eliminating the need to send entire teams to a single location to review a prototype even in the early stages, promises to shorten project timescales and cut the associated costs.
Merging XR and Digital Twin
Though XR alone offers a number of valuable benefits, eventually users ask: How will XR fit into the larger digital ecosystem? Will it complement technologies like digital twin? If so, how?
Most platforms today are capable of supporting XR, but that hasn’t always been true. For years, the XR space faced significant integration and implementation barriers. Even as recently as a couple of years ago, much of the XR ecosystem was partially or fully siloed from other infrastructure.
This is now changing, as companies work aggressively to eliminate those barriers through partnerships, standards and plug-ins. As XR technologies mature, many see a valuable opportunity presenting itself with the convergence of XR and digital twins.
“Digital twins hold the key for uniting modeling, simulation and visualization,” says Weinstein. “Increasingly, we use digital twins to model manufacturability and serviceability, cabin lighting and airflow, and even crash test safety. The future of digital twins is to unify these tools through a shared data platform, so model changes can immediately be propagated downstream for analysis. One way of accomplishing this is with a development platform like NVIDIA Omniverse, which is built on Universal Scene Description (USD), a powerful, lossless 3D interchange framework that can be read and written to by a wealth of design, simulation and analysis tools.”
According to Weinstein, because USD is compatible with numerous visualization tools (e.g., ParaView) and rendering engines (e.g., Unreal Engine), full-fidelity digital twin models can be immersively explored.
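To make the USD interchange idea concrete, here is a minimal sketch of what a plain-text USD layer (.usda) for a design asset might look like, generated with only the Python standard library. The prim names ("Car", "Body") and metadata values are illustrative assumptions; a production pipeline would author this through the OpenUSD (pxr) API rather than raw text.

```python
from pathlib import Path

def write_usda_stub(path: Path, prim_name: str = "Car") -> str:
    """Write a minimal .usda layer declaring one transformable prim.

    The layer metadata (defaultPrim, metersPerUnit, upAxis) and the
    empty Mesh prim are placeholders; geometry attributes would be
    authored by the exporting CAD or DCC tool.
    """
    layer = f"""#usda 1.0
(
    defaultPrim = "{prim_name}"
    metersPerUnit = 1.0
    upAxis = "Z"
)

def Xform "{prim_name}"
{{
    def Mesh "Body"
    {{
    }}
}}
"""
    path.write_text(layer)
    return layer

layer_text = write_usda_stub(Path("car_stub.usda"))
```

Because the format is human-readable text, layers like this can be composed, diffed and versioned, which is part of what makes USD attractive as a shared data platform for the digital-twin workflows Weinstein describes.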
Analysts can sit in the virtual driver’s seat to experience the results of a lighting and glare study. Service technicians can confirm the ergonomic feasibility of replacing an engine component. Safety inspectors can walk around a crash test simulation at full scale to understand how specific parts responded to impact. As a result, all XR benefits—life-scale, natural interactions and virtual collaboration—can be leveraged to better, more efficiently explore and understand digital twin simulations.
Obstacles and Opportunities
As the market tries to determine XR’s place in the emerging digital infrastructure, developers of the technologies work to overcome obstacles that have plagued XR from its inception.
“There is a barrier to entry in the form of buy-in to the technology from key stakeholders, which can be associated with costs,” says Levy. “VR and MR headsets can be expensive, and some companies may not be willing to purchase them without knowing the ROI.
“So, it becomes a bit of a vicious circle,” Levy continues. “You can’t know ROI until you have a device, but won’t buy a device until you know ROI. We have seen this go full circle, and augmented reality devices—phones and tablets—can alleviate some of this issue and make extended reality easier to adopt because users already have the required equipment and are familiar with how they work.”
Advances in XR technologies and the cultivation of applications are also mitigating cost issues. Price points for VR and MR headsets are coming down, and the use of XR to enhance collaboration is reducing other automotive development expenses as well.
“VR and AR can reduce the number of prototypes that must be shipped or moved to different locations,” says Timoni West, vice president of emerging tools at Unity. “Anything related to design, materials or the final finishing stages of a car can be easy to review and improve, at an acceptable fidelity for major stakeholders, in virtual reality. So, it doesn’t remove the prototyping process, but it does reduce cost and speeds up the review process.”
The greatest obstacle to XR adoption by design engineers, however, is also the greatest opportunity.
“XR is a new technology, and it’s constantly evolving,” says Weinstein. “On the one hand, this can be frustrating because new headsets and handheld XR devices are always right around the corner. On the other, it’s wonderfully encouraging because the degree of innovation is leading to ever-better solutions.”
Weinstein observes that what was impossible a short time ago is now common. A plethora of solutions is available for immersive automotive styling, such as ESI IC.IDO for ergonomic analysis and virtual simulation, and Gravity Sketch (gravitysketch.com) for immersively creating design concepts. Furthermore, a few years ago, users had to be physically tethered to their workstations to experience VR. Today, they can stream VR from the cloud to an untethered headset nearly anywhere.
Over the next few years, the market will likely see vast improvements in XR technology.
“Look for deeper support for autonomous agents to assist users, deeper integration of context understanding for AR and improvements in all aspects of XR devices,” says NVIDIA’s Weinstein. “With all of these improvements, automotive design engineers will see tremendous benefits to their craft. Design tools will become more intuitive, the immersive rendering truer to life and the convenience of virtual collaboration undeniable. And as that happens, XR will become an essential tool, just like a cellphone or a laptop.”
Tom Kevan is a freelance writer/editor specializing in engineering and communications technology. Contact him via de-editors@digitaleng.news.