VR Tutorial Demo Prototype

40 hours | VR (Oculus Go/Oculus Quest) | Unreal

Working in architectural visualization means many clients are inexperienced with game-like platforms. The challenge I set for myself was to create a tutorial that could take someone who has never put on a headset before and teach them to move and look around naturally, and hopefully even to interact using the controller. While the final tutorial ended up using Neoscape art I can’t show here, this prototype served as the blueprint for that final version and was my first time developing for VR.

A bridge appears, inviting forward movement


From the VERY Beginning

Before any actual blueprinting, I needed to consider two things:

  • What common features will be used in VR architectural tours?

  • How do you teach someone to use those necessary features?

You can’t assume a client knows they can look around while wearing one of these headsets, let alone how to manipulate a controller to interact with the world. You have to start teaching from the very beginning. After some back-and-forth with the Interactive lead, we came up with a list of features that would be used in most VR products we could make:

  • Looking at ‘hot spots’ (things that change or activate after staring at them long enough)

  • Pointing and using the controller pad to move

  • Pointing and using the controller trigger to interact

  • Two kinds of interactions (swapping materials and moving objects)


Nervous orbs explode when looked at too long

Move That Neck

In my first prototype, I skipped the looking-around portion of the tutorial and went straight to explaining the controller. Bad move. In testing, the most common feedback was that people got lost, not even realizing the virtual world surrounded them in 360 degrees. Before anything else, they have to learn to look. Luckily, the first and simplest solution did well in testing.

Nervous Orbs

Before the ground appears, before you get access to your controller or anything else, there are four orbs the player needs to look at for about a second each. The orbs explode in a satisfying manner to let the player know it was a good thing. While a lot of changes were made to exactly where the orbs were placed, so as not to strain the neck looking up or down, the system itself worked well.

The system behind it compares the camera’s forward vector against the vector from the player to the orb. The dot product of two normalized vectors gives a value between -1 and 1 describing how closely they align. Set a threshold on that value and run a timer while the player stays within it. The ‘pop’ effect is my first particle system made in Niagara, using some base cubes from Unreal (well, second if you include the floating ambient pink particles).
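The prototype itself was built in Blueprints, but the same check is only a few lines in Unreal C++. A minimal sketch, where AGazeOrb, GazeThreshold, GazeHoldTime, GazeTimer, and Pop are placeholders rather than anything from the actual project:

```cpp
#include "GameFramework/Actor.h"
#include "Camera/CameraComponent.h"

// Assumed: AGazeOrb derives from AActor and has float members
// GazeThreshold, GazeHoldTime, and GazeTimer.
void AGazeOrb::TickGaze(const UCameraComponent* Camera, float DeltaTime)
{
    // Direction the player is actually looking.
    const FVector Forward = Camera->GetForwardVector();

    // Direction from the player to this orb, normalized so the
    // dot product lands in [-1, 1].
    const FVector ToOrb =
        (GetActorLocation() - Camera->GetComponentLocation()).GetSafeNormal();

    // 1 = looking straight at the orb, 0 = perpendicular, -1 = facing away.
    const float Alignment = FVector::DotProduct(Forward, ToOrb);

    if (Alignment > GazeThreshold)       // e.g. 0.98f, a narrow cone
    {
        GazeTimer += DeltaTime;
        if (GazeTimer >= GazeHoldTime)   // e.g. 1.0f second
        {
            Pop();                       // trigger the Niagara burst
        }
    }
    else
    {
        GazeTimer = 0.f;                 // reset if the gaze wanders off
    }
}
```

For scale, a threshold of 0.98 keeps the orb within roughly 11 degrees of the center of view, since arccos(0.98) ≈ 11.5°.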


Buttons That Do Stuff

The highlight-and-tag system on these buttons was inspired by the Oculus Quest tutorial game. It had a floating tag attached by a wire to the corresponding button on the controller, and it was so clear and cool I just had to steal that method for my tutorial. This was my first run-in with spline meshes, and we’ve been friends ever since.
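The core of a spline-mesh wire is a single call that sets the segment’s endpoints and tangents. A hedged sketch of the idea, with AButtonTag and WireMesh as assumed stand-ins for however the real tags were wired up:

```cpp
#include "Components/SplineMeshComponent.h"

// Assumed: AButtonTag owns WireMesh, a USplineMeshComponent* whose static
// mesh is a thin cylinder or cable segment.
void AButtonTag::UpdateWire(const FVector& ButtonWorldPos,
                            const FVector& TagWorldPos)
{
    // Spline mesh endpoints are given in the component's local space.
    const FTransform ToLocal = WireMesh->GetComponentTransform().Inverse();
    const FVector Start = ToLocal.TransformPosition(ButtonWorldPos);
    const FVector End   = ToLocal.TransformPosition(TagWorldPos);

    // Tangents control the curve; pointing both along the start-to-end
    // direction gives a gentle, mostly straight wire.
    const FVector Tangent = End - Start;

    WireMesh->SetStartAndEnd(Start, Tangent, End, Tangent);
}
```

Calling this every frame keeps the wire stretched between the moving controller button and its floating label.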

This part of the tutorial is just for button familiarity. The Oculus Go controller only has three usable buttons (the fourth is home), so memorizing them is easier than on modern per-hand controllers with their many buttons. Directly after, the player will use the touchpad to start moving, followed by the trigger to interact with objects on the other side.


Moving Objects Interaction

This is one of two systems I developed for moving objects in VR. The Oculus Go doesn’t have positional tracking and only tracks the player’s rotation, which means players can’t simply walk around with an object; they would have to hold down one button while pressing and releasing another to move it. That’s probably too complex for most clientele, who likely have little to no gaming experience with a remote or controller of any kind. I opted for something similar to the translate widget in a 3D program. Because of the imprecise nature of pointing a controller at something at a distance, the axes needed to be enlarged and separated to make selection easier.

The tool would be used to make design alterations to a space, so it implements 5 cm position snapping and 5° rotation snapping, letting the user be precise even with shaky hands like mine.
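The snapping itself is simple math, and Unreal ships a helper for it. A small sketch using the values above (Unreal units are centimeters, so a 5 cm grid is a grid size of 5):

```cpp
#include "CoreMinimal.h"
#include "Math/UnrealMathUtility.h"

// Snap a raw widget-driven location to the 5 cm grid.
FVector SnapLocation(const FVector& Raw)
{
    return FVector(FMath::GridSnap(Raw.X, 5.f),
                   FMath::GridSnap(Raw.Y, 5.f),
                   FMath::GridSnap(Raw.Z, 5.f));
}

// Snap a raw rotation to 5-degree steps.
FRotator SnapRotation(const FRotator& Raw)
{
    return FRotator(FMath::GridSnap(Raw.Pitch, 5.f),
                    FMath::GridSnap(Raw.Yaw,   5.f),
                    FMath::GridSnap(Raw.Roll,  5.f));
}
```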


Material Swapping

The only interaction used more often than moving things around is swapping materials. Like all the interactions built so far, you start by clicking on the object, and up pops a menu. The menu can swap any of the materials on an object with multiple material slots, which is very common in the way traditional 3D artists build models and assign materials to them.
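Because each material on a multi-material mesh lives in an indexed slot, the swap itself boils down to a single SetMaterial call. A sketch of that idea, with the function and its arguments as illustrative stand-ins for the Blueprint logic:

```cpp
#include "Components/StaticMeshComponent.h"
#include "Materials/MaterialInterface.h"

// Swap one material slot on a mesh, leaving the other slots untouched.
void SwapMaterial(UStaticMeshComponent* MeshComp,
                  int32 SlotIndex,
                  UMaterialInterface* NewMaterial)
{
    if (MeshComp && NewMaterial &&
        SlotIndex >= 0 && SlotIndex < MeshComp->GetNumMaterials())
    {
        // Replaces only the chosen slot; a menu entry maps to a slot index
        // plus the material the user picked.
        MeshComp->SetMaterial(SlotIndex, NewMaterial);
    }
}
```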

In addition to designing and blueprinting all of the systems here, I also experimented with dynamic materials. The material-swap shader was something I built from the ground up after finding my way around Unreal’s material system. As there were no other Unreal or game specialists on the team, I was pushed out of my comfort zone quite a few times. But final products really need the bit of polish that comes from SFX, particles, and material effects like the material swapper to feel complete.
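As a rough illustration of the dynamic-material side: at runtime you wrap a slot’s material in a UMaterialInstanceDynamic and drive its parameters, so a transition parameter on a swap shader can be animated from 0 to 1. The "Blend" parameter name here is an assumption, not the shader’s actual parameter:

```cpp
#include "Components/StaticMeshComponent.h"
#include "Materials/MaterialInstanceDynamic.h"

// Prepare a slot's material for an animated swap effect.
void BeginSwapEffect(UStaticMeshComponent* MeshComp, int32 SlotIndex)
{
    // Wraps the slot's current material in a runtime-editable instance.
    UMaterialInstanceDynamic* Dynamic =
        MeshComp->CreateDynamicMaterialInstance(SlotIndex);

    if (Dynamic)
    {
        // "Blend" is an assumed scalar parameter exposed by the swap
        // shader; animating it 0 -> 1 (e.g. from a timeline) would
        // reveal the new material over the old one.
        Dynamic->SetScalarParameterValue(TEXT("Blend"), 0.f);
    }
}
```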