August 5, 2021
With vaccinations picking up momentum in the United States, schools and universities in many states are once again reopening. But Bharanidharan Rajakumar, CEO of TRANSFR, believes many of his clients who have discovered the convenience of virtual reality (VR) may choose to continue their education in pixels and bytes.
Rajakumar founded the startup TRANSFR in 2017 with the idea of delivering training in VR. At the time, VR and its sibling, augmented reality (AR), had found audiences in the game and entertainment market, but both struggled to gain adoption in the professional sector.
He never anticipated that, in a few years, a pandemic would shut down schools and businesses and make social distancing mandatory. During the COVID-19 pandemic, remote teaching and learning were the safest options available, creating new markets for TRANSFR products and services.
“COVID-19 may have been the instigator for VR-based training, but for a lot of people, there’s no going back. VR’s value over PowerPoint-based training and classroom lectures has been established,” he says.
Cognitive Training
At the Mazda Toyota Manufacturing USA (MTMUSA) factory in Huntsville, AL, newly hired employees in the paint department who come to be trained on operating the paint robot will likely put on a VR headset before they ever step into the auto painting facility.
The VR-based training modules are the outcome of a public-private partnership involving MTMUSA, Alabama Industrial Development Training (AIDT, part of the Alabama Department of Commerce) and TRANSFR.
The program was piloted with two groups of 31 participants, made up of trainers, paint booth users and trainees. Booting up the VR program, participants first saw a digital replica of the factory site that housed the paint robot. In this virtual environment, they learned to perform safety checks and to handle, measure and calibrate the tools.
“There was a significant increase in performance for both groups overall (from 53% to 82%) … this was an indication that the trainees gained knowledge from the simulations,” TRANSFR notes in the published case study.
“The value in simulation-based training is, it helps the trainee acquire procedural knowledge. So when they get to the job site, they know where everything is, what every device does,” Rajakumar says.
The VR training boosts confidence in the new hires, but it also works to the training provider’s advantage. “It’s easier to standardize the training and scale up the capacity,” he says. “If you have good trainers in one region, but have a shortage in another, you can solve it with VR. Before, you might be training 20 students with four trainers. In VR, you could train 80 with the same trainers.”
Some VR-based training application developers, such as Sixense, incorporate physical props like replica paint cans and weld guns into the setup, allowing trainees to build muscle memory and get used to the shape and weight of the tools.
TRANSFR applications use standard game controllers, which do not match the shape of the real tools.
“For the instances we work in, the added value from the prop is not a must have,” Rajakumar says.
TRANSFR caters not only to the industrial sector but also to the education and government sectors. Of these, Rajakumar sees automotive as an area of growth.
“They are moving from traditional cars to electrical and autonomous vehicles, so there’s a lot of room for new training materials to be created,” he says.
Mixing Physical and Digital Assets
The best driving simulation mixes physical assets, which provide the basic frame of a car, with a VR-based digital view, which shows what a yet-to-be-built model looks like from the inside out. One of the best examples of this physical-digital combination comes from Varjo, an AR/VR application developer that counts Volvo among its customers.
Volvo engineers have been test-driving a real Volvo XC90 while wearing Varjo’s VR headsets.
The purpose is “to perform [automotive] UX [user experience] studies by keeping as much as possible of reality—the real road, nature, road signs and more—and only exchange the things they want to evaluate, for example, a new display or interior, during the virtual and mixed reality test drive,” according to the case study by Varjo.
Varjo found that test drivers instinctively stepped on the brake when they saw a virtual moose crossing the road, and reacted to virtual cars overtaking the test car, according to the case study.
Through a partnership between design software maker Autodesk and Varjo, users of Autodesk VRED and Alias software can view their detailed vehicle models on Varjo hardware. Such applications push the automotive design discipline to rely less on traditional clay models and more on digital models.
“Bringing the clay model experience into the virtual space is the holy grail in automotive design,” says Thomas Heermann, vice president of automotive, concept design and XR at Autodesk.
One aspect of the clay model that cannot yet be simulated is the tactile sensation. You can run your hands over a clay model to examine the surfaces and curvatures of the vehicle, but tactile or haptic feedback in AR/VR is still a long way from replicating such interactions.
“It will take a few more steps in innovation to get there. Getting rid of the controllers is the first step,” Heermann says. “Also, finger recognition needs to get granular enough to recognize the joints.”
The latest devices, such as Microsoft HoloLens 2, include finger recognition, allowing users to push a virtual button or play a virtual keyboard with their fingers, for example. However, such features are not yet the norm in AR/VR hardware.
Create, Review, Experience
During the shutdown, the use of AR/VR for collaborative design review became the norm, Heermann notes.
“In rough concept models, the shape and form are more important, but in the later design phases, you want to see shadows and reflections, so this requires high-resolution graphics,” he says. “Imagine seeing objects behind you in your [virtual car’s] side and rearview mirrors.”
This suggests that graphics processing units (GPUs) capable of real-time interactive ray tracing may become part of the AR/VR hardware architecture.
Automakers have also been using mixed reality or extended reality (XR) to boost sales, Heermann says. “So you can use your iPhone’s XR function to put your new car in your driveway, for example,” he says. While design creation in AR/VR is not yet the norm, Heermann predicts, “design review and creation will converge in VR.”
This June, Autodesk added a USDZ file export feature to its Fusion 360 software, giving users an easy way to bring 3D CAD models into AR/VR environments for viewing and interaction.
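For context on how such an exported package might be consumed downstream, the following is a minimal sketch, assuming Pixar's open-source USD Python bindings (installable as usd-core) and a hypothetical file name; it is not Autodesk's own tooling, just an illustration of reading a USDZ file before handing it to an AR/VR viewer.

# Minimal sketch: inspecting a USDZ package with Pixar's open-source USD
# Python bindings (pip install usd-core). The file name "chassis.usdz" is a
# hypothetical placeholder for a model exported from a CAD tool.
from pxr import Usd, UsdGeom

stage = Usd.Stage.Open("chassis.usdz")            # a USDZ package opens like any USD stage
print("Up axis:", UsdGeom.GetStageUpAxis(stage))  # AR viewers generally expect Y-up
print("Meters per unit:", UsdGeom.GetStageMetersPerUnit(stage))

# Walk the scene hierarchy and list the meshes an AR/VR viewer would render.
for prim in stage.Traverse():
    if prim.IsA(UsdGeom.Mesh):
        print(prim.GetPath())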
About the Author
Kenneth Wong is Digital Engineering's resident blogger and senior editor. Email him at kennethwong@digitaleng.news or share your thoughts on this article at digitaleng.news/facebook.