Steven M. Caruso is an Experiential Creative Director and Creative Technologist with a background in Industrial Design, Exhibit Design, and 3D Graphics.
Working as an industrial designer and agency creative has given Steven an appreciation for the equal importance of artistic vision and technical execution in design projects, whether physical or digital.
A native of upstate New York and graduate from Rochester Institute of Technology, he’s been living and working in Brooklyn for over a decade. Outside the studio—or in it, at odd hours—he’s a persistent tinkerer and amateur engineer. He’s continually building, experimenting, coding, restoring, studying, and creating.
Let's drink coffee together and talk about creative technology and experiential design over the internet—or, if you're in or around New York City, maybe even in person!
Steven M. Caruso
44 Turner Place, Ste. 1
Brooklyn, NY 11218
Consider this a technical and practical demo of what’s possible with interactive 3D experiences.
I’ve been designing exhibits and experiences for over ten years, and my process is the same whether it’s a physical exhibit in a museum or a virtual one in a web browser.
To model the exhibit you see under this text window, I used Blender. I’ve been using 3ds Max for years, but have been transitioning to Blender because I prefer its status as free, open-source software.
The WebGL renderer that this interactive website uses does support real-time lighting and physically-based materials, but I opted instead to “bake” the rendering from Blender into texture maps that include all of the lighting detail. This helps the site load quicker and run well on more computers.
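To make the idea of “baking” concrete, here is a toy sketch (my own illustration, not this site's actual code or the Blender workflow): lighting is computed once, ahead of time, and stored with the geometry, so the runtime renderer can skip lighting math entirely and use a cheap unlit material.

```javascript
// Lambert diffuse term for each vertex: max(0, normal · lightDirection).
// Computing this at "build time" and storing the result is the essence of
// baking — the runtime just looks the value up instead of lighting the scene.
function bakeVertexLighting(normals, lightDir) {
  const [lx, ly, lz] = lightDir;
  return normals.map(([nx, ny, nz]) => {
    const lambert = Math.max(0, nx * lx + ny * ly + nz * lz);
    return lambert; // stored per-vertex, e.g. as a vertex color or lightmap texel
  });
}

// Bake lighting for one surface facing the light and one facing away.
const baked = bakeVertexLighting(
  [[0, 0, 1], [0, 0, -1]], // unit normals
  [0, 0, 1]                // unit vector toward the light
);
// baked[0] === 1 (fully lit), baked[1] === 0 (unlit)
```

In a real pipeline the same trade-off applies: the baked result is only valid for static lighting, but in exchange the browser does almost no per-frame lighting work.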
Wide accessibility is one of my major goals for virtual experiences. I believe that anyone, anywhere, should be able to access great exhibits and experiences even if they can’t travel or otherwise visit a location in person.
I’d love to hear what you think of this site, how it works in your browser, and if you have any accessibility issues. Send me a message with your thoughts!
Sometimes, the confines of an exhibit or event are too narrow for your idea. Show visitors a bigger and more immersive experience in the limitless space of virtual reality!
Unlike mobile VR experiences, managed on-site virtual reality applications run on custom, high-performance workstation PC hardware for incredibly realistic graphics and complex interactions that make full use of motion controllers.
VR is especially useful for demonstrating immersive visualizations and simulations or providing views of things that would be too large (or small) to practically show in person.
Virtual reality applications are made using Unreal Engine, a high-performance realtime 3D rendering engine that can generate nearly photorealistic graphics.
These applications run on a custom PC, with the head-mounted display (HMD) connected to it by a cable.
While the possibilities are virtually endless, consider some of these intuitive VR interactions as a starting point for your custom virtual reality experience:
Interactive kiosks, tablets, and large-format installations are an immersive way to present information with nonlinear connections, gamify engagements with visitors, and involve multiple people in demos and other 3D experiences.
On-site interactive installations can include any kind of 2D media and user interfaces, including video and web content, and run high-performance realtime 3D graphics at their core. The applications are optimized and rigorously tested to reliably run on custom hardware.
Touchscreen displays or large-format digital displays paired with other input hardware—including hand-tracking, joysticks, or custom control panels—are connected to a PC running either a locally served WebGL application or an Unreal Engine-based 3D experience.
Tablets can also be preloaded with custom 3D applications, or connected through a shared access point to a local server that delivers browser-based experiences.
Beyond the digital application, the physical unit can be designed and built to fit the exhibit or experience it resides in, fully self-contained and needing only an electrical connection.
Interactive 3D experiences can also be made available online for visitors to revisit, or for those unable to join in person to receive an equivalent remote experience!
Emerging technologies like hand-tracking, virtual reality, or integration with motorized physical components or external lights and sensors can bring an interactive exhibit element to another level and provide a truly memorable experience.
I've been in the business of experiential design for over a decade and I've worked on and led projects of every kind: museum exhibits, trade show booths, conference stages, pop-up retail stores, custom vehicles with electronic games-of-chance built into the side of them — you name it. I'm an industrial designer first, and when I want to make something, I make sure I know how it's going to get built.
I put my experience creating environments and displays for the physical world into my virtual experiences, too. The same focus on usability and quality of craftsmanship that I demand from the exhibit properties and structures I design applies to the virtual experiences I code.
Considering the entire design of an experience (both the physical spaces that contain it and the virtual environments it extends into) creates a fully realized whole in which each component reinforces the others.
We can design and build an exhibit, visualize it with a digital twin, add interactions to the digital version and make it a remote experience, reconfigure the content and embed it into an interactive installation, and give everyone that uses it a link to try it in their VR headset.
The techniques used to turn each of these into a real thing are different, but the design process is the same.
VR headsets (Head-Mounted Displays, or HMDs) are usually connected to a gaming or workstation PC. The headset is simply a display for graphics generated by those machines' powerful graphics processors.
Mobile VR differs from those HMDs in that it runs entirely on standalone headsets or mobile devices.
At the moment, that largely means using the Oculus Quest 2. In some cases, Google Daydream or Cardboard might work.
Though I personally have some reservations about Facebook as a company, the Quest 2 is currently the best and most accessible standalone HMD for most people. It is becoming extremely popular because of its low price, high quality, and ease of use. Future offerings from Apple and Microsoft may change this in a big way. But, for now, the Quest 2 is what we’ll be working with for most projects.
My preferred method for delivering great at-home VR experiences is WebXR, a web API for presenting virtual reality experiences in a standalone headset’s integrated browser.
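A WebXR experience typically starts by asking the browser whether an immersive VR session is even possible. Here is a minimal sketch of that check using the real WebXR Device API; the helper takes the XR system object (normally `navigator.xr`) as a parameter so it can be exercised outside a browser, which is my own structuring choice, not site code.

```javascript
// Returns true only if the browser exposes WebXR and reports that an
// 'immersive-vr' session (a full headset experience) could be started.
async function supportsImmersiveVR(xrSystem) {
  if (!xrSystem) return false; // browser has no WebXR at all
  try {
    // Real WebXR API: resolves to a boolean for the requested session mode.
    return await xrSystem.isSessionSupported('immersive-vr');
  } catch {
    return false; // treat errors (e.g. permissions, no device) as unsupported
  }
}

// In a browser you would call: supportsImmersiveVR(navigator.xr)
```

A page can use the result to offer an “Enter VR” button to headset users while falling back to ordinary mouse-and-touch navigation for everyone else.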
Oculus does have a secondary store and distribution network called the App Lab for the Quest 2. This is a good choice for more complex experiences requiring more realistic rendering, complex interactions and media. Distributing through the App Lab allows us to develop an application that runs directly on the headset's Android-based hardware, allowing for much more flexibility and performance.
For even more realistic and immersive experiences, PC-based VR is the way to go. These will be accessible to the most technically-minded users with high-end gaming machines and expensive hardware, though. For that reason I recommend them primarily for managed, on-site VR experiences. For the right audience or context, though, they're unbeatable.
As you've been experiencing for yourself, interactive 3D environments are a fun and immersive way to explore non-linear information, akin to a thoughtfully designed exhibit or showroom.
As an exhibit designer and experiential creative director, I've wanted to be able to provide widely accessible virtual environments as alternatives, enhancements or parallel experiences to live events for a long time. And now, it’s possible!
Interactive experiences can be as large as a museum or as simple as a single product presented in exquisite detail.
In my experience working with clients, the reason they want an exhibit or brand experience is to organize information spatially. Exhibits are effective educational tools because the immersive environment creates an organizational structure for the information being shared.
Web-based experiences are great as a takeaway to revisit after events, or as a "digital twin" that allows us to build a virtual analogue for advanced design visualization and planning.
There are two great options for web-based interactive 3D experiences.
These apps are developed using Unreal Engine and rendered in real-time on powerful graphics servers, which then stream the output to clients' browsers as video. Still completely interactive, they greatly surpass the performance of most end users' computers while delivering nearly photorealistic graphics, along with deep interactions, animations, physics, and more.
For more technically-minded audiences, interactive 3D applications can also be distributed as direct downloads. They can even be developed as mobile apps for iPhone or Android devices, provided that the content and project timeline allow for those devices' app store approval processes.
Choosing the best method for delivering interactive 3D experiences depends largely on their purpose and what information and media they'll contain. As part of my design process, I’ll help you choose the right technology, platform, and application to get the job done right.
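To illustrate how that decision process might look, here is a hypothetical helper (the names, inputs, and thresholds are my own simplifications, not the author's actual criteria) that maps a project's needs to one of the delivery methods described above:

```javascript
// Hypothetical decision sketch: pick a delivery method from project needs.
// 'webgl'          — runs in any browser, like this site (widest reach)
// 'pixel-streaming'— server-rendered Unreal graphics streamed as video
// 'direct-download'— native Unreal build for high-end machines
function chooseDelivery({ needsPhotorealism, audienceIsTechnical, needsWidestReach }) {
  if (needsWidestReach && !needsPhotorealism) return 'webgl';
  if (needsPhotorealism && !audienceIsTechnical) return 'pixel-streaming';
  if (needsPhotorealism && audienceIsTechnical) return 'direct-download';
  return 'webgl'; // safe default: broadest compatibility
}
```

The real trade-offs are richer than three booleans, of course, but the shape of the reasoning is the same: reach, fidelity, and audience capability pull in different directions, and the platform choice balances them.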
Augmented Reality, or Mixed Reality (AR/MR), is a class of technologies that combine 3D graphics with real-world camera imagery. At the moment, that mainly means 3D objects viewed through a mobile device's camera, or through a specialized headset like Microsoft's HoloLens.
AR often takes the form of photo filters, like on Instagram or TikTok, but expect mixed-reality headsets and glasses to deliver more immersive experiences overlaid on the real world very soon.
AR, or an effect like it, can also be achieved through video compositing. In this case, the presenters are filmed on a green screen and edited onto the virtual set, which is rendered in real-time:
Augmented reality is best served by Apple's RealityKit suite of APIs, which allows for easily distributed interactive models and apps with AR features. AR can also be brought to the web through WebXR/WebAR, or packaged into app-specific photo filters.
More complex scenes, like virtual stage sets, are composited in production and distributed as finished videos.