IDSA INNOVATION: Integrating Virtual Reality into the Design Process
This article was originally published in the Spring 2022 issue of IDSA's INNOVATION magazine, Industrial Design: Yesterday, Today, Tomorrow.
At Formation Design Group, an Atlanta-based design and innovation firm, we’ve been implementing virtual reality (VR) as a tool in the design and development of various projects since 2017. That was a turning point in the VR industry when the modality became accessible to mass markets following the release of the original HTC Vive headset. Since then, the VR industry has been developing at breakneck speed. Headsets evolved to feature higher-quality displays, improved ergonomics, and new tools such as built-in hand tracking. VR is a powerful rapid prototyping tool that allows designers and engineers to evaluate design iterations in 3D space without the time, labor, and expense of building complete prototypes in the shop.
Our Process
Most VR projects start from the same place: parametric computer-aided design (CAD) files. These are the familiar CAD file formats used in software such as SolidWorks, Fusion 360, and Siemens NX. However, these files aren’t ready to simply be dropped into a virtual environment; first, they need to be made real-time ready.
Formation uses the 3D game engine Unity to develop VR experiences. Game engines rely on real-time rendering, meaning that CAD geometry needs to be imported as a performance-friendly polygonal model. We use 3ds Max to convert the parametric geometry into those polygonal models, though other 3D modeling software is equally up to the job.
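At its core, that conversion means tessellating smooth parametric surfaces into triangles the engine can draw quickly. As a simplified illustration of one piece of that process, here is a sketch (in Python, purely for clarity; the function name and data shapes are our own invention, not any tool's API) of fan-triangulating a convex polygon face:

```python
def fan_triangulate(face):
    """Split a convex polygon face (a list of vertex indices) into
    triangles by fanning out from the first vertex -- a simplified
    stand-in for the tessellation a tool like 3ds Max performs when
    converting parametric CAD surfaces into real-time-friendly meshes."""
    if len(face) < 3:
        raise ValueError("a face needs at least 3 vertices")
    v0 = face[0]
    # Each consecutive pair of remaining vertices forms a triangle with v0.
    return [(v0, face[i], face[i + 1]) for i in range(1, len(face) - 1)]

# A quad face becomes two triangles; a pentagon becomes three.
print(fan_triangulate([0, 1, 2, 3]))        # [(0, 1, 2), (0, 2, 3)]
print(len(fan_triangulate([0, 1, 2, 3, 4])))  # 3
```

Real converters also decimate the result to hit a polygon budget, but the principle is the same: trade mathematically exact surfaces for geometry the engine can render every frame.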
The process is never perfect and requires some manual tweaking. That’s especially true for textures, which require UV unwrapping to achieve the correct orientation and placement on the model. UV unwrapping is like peeling an orange and laying the peel flat. The resulting flat map of the model’s surface gives us precise control over the placement and resolution of materials on the polygonal geometry.
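The end product of unwrapping is simply a 2D coordinate for every vertex. A minimal sketch of the idea, assuming the simplest possible case (a flat face projected straight down one axis; real unwrapping must also handle curvature, seams, and distortion):

```python
def planar_uv_project(vertices):
    """Project 3D vertices (x, y, z) down the z-axis onto the x/y plane,
    then normalize into the 0..1 UV square -- the simplest form of the
    'peel it flat' mapping that UV unwrapping produces. Each vertex ends
    up with a 2D coordinate into the texture image."""
    xs = [v[0] for v in vertices]
    ys = [v[1] for v in vertices]
    min_x, min_y = min(xs), min(ys)
    span_x = (max(xs) - min_x) or 1.0  # avoid dividing by zero
    span_y = (max(ys) - min_y) or 1.0
    return [((v[0] - min_x) / span_x, (v[1] - min_y) / span_y)
            for v in vertices]

# Four corners of a 2 x 2 panel map to the corners of UV space.
panel = [(0, 0, 5), (2, 0, 5), (2, 2, 5), (0, 2, 5)]
print(planar_uv_project(panel))
# [(0.0, 0.0), (1.0, 0.0), (1.0, 1.0), (0.0, 1.0)]
```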
After the files are imported into Unity, some projects may be nearing the finish line. They may only require static geometry and an immersive environment to effectively evaluate a design solution. However, for other projects, we can start taking advantage of the advancements made in VR in the last few years to push them to the next level.
A More In-Depth Approach
During the planning stage of a VR project, it’s important to consider how the technology can be used to achieve specific end goals. There is a lot of potential functionality on the software side through Unity, and hardware is also a consideration, depending on the needs of the project.
For example, we’re working to leverage the finger tracking built into the controllers of Valve’s Index head-mounted display (HMD) to add basic mechanical functionality to models. The Index’s intuitive controllers allow users to reach out and manipulate non-static parts of the model with their hands, just as they would a real functioning prototype. Examples include opening and closing hinged elements, interacting with a touchscreen, or pushing a cart across the room.
For even more interactivity and function, VR scenes can include user interface (UI) elements that allow for swapping materials, environments, and models without having to load new scenes. We develop these elements in Unity with C# code, which lets us achieve almost any function.
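The logic behind such a configurator is straightforward regardless of engine. Here is a sketch of the pattern in Python (our actual implementations live in Unity C#; the class and option names below are illustrative only): each category holds a list of options and an active index, so a button press swaps the option in place rather than loading a new scene.

```python
class VariantConfigurator:
    """Sketch of the state behind an in-VR configurator UI. Each
    category (material, environment, model) keeps a list of options
    and an active index; cycling a category swaps the option without
    reloading the scene."""

    def __init__(self, options):
        self.options = options                 # e.g. {"material": [...]}
        self.active = {k: 0 for k in options}  # active index per category

    def cycle(self, category):
        """Advance to the next option in a category, wrapping around."""
        opts = self.options[category]
        self.active[category] = (self.active[category] + 1) % len(opts)
        return opts[self.active[category]]

    def current(self, category):
        return self.options[category][self.active[category]]

cfg = VariantConfigurator({
    "material": ["brushed aluminum", "matte black", "oak veneer"],
    "environment": ["studio", "showroom"],
})
print(cfg.cycle("material"))       # matte black
print(cfg.cycle("material"))       # oak veneer
print(cfg.current("environment"))  # studio
```

In the engine, each swap simply reassigns the active material, environment lighting, or model on the objects already in the scene.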
One thing we’ve noticed working with UI elements in VR is that it demands a different approach than traditional flat UI. Elements can’t be positioned at the edges of the HMD viewport because they become difficult to focus on, while elements in the center of the viewport are highly obtrusive. At Formation, we’ve tried to stay ahead of the curve, exploring solutions such as anchoring UI elements within the 3D environment itself and displaying centered UI elements only when they are needed.
Mixed-reality experiences are another great way to increase immersion and provide additional value to a VR project. At Formation, we’ve created several mixed-reality experiences by developing custom controllers that provide input to Unity through an Arduino or an Xbox Adaptive Controller. Bringing physical mockups and controls into the VR environment enables users to better evaluate control layouts.
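On the software side, a physical mockup typically reaches the engine as a stream of small text or binary messages. As a sketch only, here is one hypothetical line-based message format (the `BTN:`/`AXIS:` protocol below is invented for illustration, not any actual Arduino or Formation convention) and a parser that turns each line into an event the engine can map onto the virtual model:

```python
def parse_control_message(line):
    """Parse one line from a hypothetical microcontroller serial
    protocol into a (kind, name, value) event. Two message shapes
    are assumed here:
      BTN:<id>:<0|1>       -- a button press/release
      AXIS:<name>:<float>  -- an analog control, e.g. a lever or dial
    Engine-side code (Unity C# in our case) would map these events
    onto the scene; the format itself is illustrative only."""
    kind, name, raw = line.strip().split(":")
    if kind == "BTN":
        return ("button", name, raw == "1")
    if kind == "AXIS":
        return ("axis", name, float(raw))
    raise ValueError(f"unknown message kind: {kind}")

print(parse_control_message("BTN:3:1"))             # ('button', '3', True)
print(parse_control_message("AXIS:throttle:0.42"))  # ('axis', 'throttle', 0.42)
```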
VR can also be a fun and engaging element of a trade show booth. Recent advancements in wireless HMDs simplify the process of setting up a VR experience in a limited space. The HTC Vive Focus 3 is a good example of an HMD tailored to the trade show setting. It works without needing to be tethered to a PC or laptop and includes a kiosk mode that limits functionality to just what you want to show as a demo.
In some cases, photorealism may be a more valuable goal than full 3D interactivity. For projects like these, VR can be an excellent tool for viewing static stereoscopic renders. While this approach limits the VR experience to a single viewpoint, it allows for much greater fidelity and detail because the models and textures don’t need to be as performance-friendly.
Where We’re Headed
Recent advancements in VR have been exciting, and at Formation, we’re looking forward to what’s to come. We know how designers can use VR to accomplish various goals, and we’re now working to leverage VR as an internal tool in the design process.
Specifically, we’re interested in finding ways to seamlessly integrate VR into our design workflow—even for projects that aren’t going to have VR as a deliverable. In the near future, we expect to be able to generate rudimentary VR scenes more efficiently, without the need to have designers send CAD files to a VR specialist for conversion.
Ultimately, Formation has found that VR is an excellent tool to support design work, both internally and with our clients. We’re excited for all that we’ll be able to create as developments continue to skyrocket in this space, and we’re thrilled to be on the front lines of incorporating VR into the industrial design and user interface development processes.