SHANEE NISHRY: Making virtual reality games
and applications with Unity is easy.
In this video I'll go over the concepts and implementation
of adding Cardboard support to a Unity project.
As you may know, Unity is a game engine
with an editor that allows you to easily import 3-D models
and arrange them in the scene.
You can also attach scripts to objects
to give them functionality.
Before we start, make sure to get the Cardboard Unity
plugin from this link.
Then, open your Unity project and import a package like this.
Now that you are all set, there are two things you need to do.
You need to create a stereoscopic camera
and make sure your user interface works well
for virtual reality.
Let's start with adding the camera and look into modifying
the UI at the end.
You can do so by using one of the available prefab objects
or by attaching a script to an existing camera.
The easiest way is to use the Cardboard main prefab.
This is best if you're starting a new project
or haven't made any changes to your existing camera.
Simply replace the existing camera with the Cardboard
main prefab, and you're done.
You can still add any custom scripts on top,
for controlling the camera, for example.
Press Play, and you'll have a functioning stereoscopic view.
You can rotate the camera using the Alt key
while moving the mouse.
To simulate the Cardboard trigger, press the left mouse button.
If you already have a camera, you
can use the Cardboard adapter prefab.
Add it as a child of your camera and select
Update Stereo Cameras from the Components menu.
Once again, you can press Play and have
a functioning stereoscopic view in the game window.
If you don't want to use a prefab,
then you can just use a script.
By adding the stereo controller script to your camera,
two stereoscopic cameras will be created dynamically
as you press play.
You may not want to use the stereo controller script
since it doesn't let you add any image processing on top
of the cameras because they are added dynamically.
If you want to create the cameras in the Editor,
then simply select Update Stereo Cameras from the menu
and they will be created for you.
Press Play, and you are done.
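If you prefer to set this up from code, here is a minimal sketch that attaches the plugin's stereo controller component to the main camera at startup. It assumes the component is called StereoController, as in the Cardboard Unity plugin; the wrapper script name itself is illustrative.

    using UnityEngine;

    // Illustrative helper: attaches the Cardboard plugin's StereoController
    // to the main camera so the eye cameras are created dynamically on Play.
    public class AttachStereoController : MonoBehaviour
    {
        void Awake()
        {
            if (Camera.main.GetComponent<StereoController>() == null)
            {
                Camera.main.gameObject.AddComponent<StereoController>();
            }
        }
    }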
The last thing we have to do is get the user interface working
and add support for the trigger.
Start by adding the Gaze Input Module script
to the scene's Event System object.
This script emits events for the event system
based on the user's gaze.
Next, on your UI canvas, set its Render Mode
to World Space and set the Event Camera
to a camera controlled by a Cardboard head script,
either directly or as a parent.
At this point, the system is able to respond to the user's
gaze and trigger so that UI elements, such as buttons,
can be activated.
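Here is a small sketch of that canvas setup done from a script instead of the Inspector. It uses the standard Unity Canvas API; the field names and the idea of assigning them in the Inspector are illustrative, and cardboardCamera stands for a camera controlled by a Cardboard head script.

    using UnityEngine;

    // Illustrative setup: puts the UI canvas into the 3-D scene and points
    // the event system's ray casts at the head-tracked camera.
    public class VRCanvasSetup : MonoBehaviour
    {
        public Canvas uiCanvas;          // the canvas holding your UI elements
        public Camera cardboardCamera;   // a camera controlled by a Cardboard head script

        void Start()
        {
            // Render the UI in world space rather than as a screen overlay.
            uiCanvas.renderMode = RenderMode.WorldSpace;
            // Use the head-tracked camera as the canvas's event camera.
            uiCanvas.worldCamera = cardboardCamera;
        }
    }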
If you wish to interact with 3-D objects in the scene,
add a physics ray caster component to the event camera.
Designate an in-game object as interactive
by adding a collider component to interact with the ray
caster and by adding a script to respond
to the generated events.
An event trigger is a good choice,
or you can implement some of the standard Unity event interfaces
on your own scripts.
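As an example of the second option, here is a minimal sketch of a script implementing IPointerClickHandler, one of the standard Unity event interfaces. The class name and the color change are illustrative; the object also needs a collider, and the event camera needs a Physics Raycaster, as described above.

    using UnityEngine;
    using UnityEngine.EventSystems;

    // Illustrative gaze-activated object: reacts when the user triggers
    // while gazing at it.
    public class GazeButton : MonoBehaviour, IPointerClickHandler
    {
        public void OnPointerClick(PointerEventData eventData)
        {
            // Respond to the event, here by changing the object's color.
            GetComponent<Renderer>().material.color = Color.green;
        }
    }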
If you wish to add a cursor to let the user see
the point of their gaze, set the Gaze Input Module's cursor
to the game object that will serve as the cursor.
This cursor will be moved to the exact point on whatever UI
object the user is gazing at.
If the event camera has a physics ray caster,
then this includes 3-D objects with the collider components.
If no object is hit by a ray cast, the cursor is hidden.
Now that you know how to make everything work,
it is important to keep in mind some best practices in order
to make a compelling virtual reality experience.
The three most important rules to remember
are: always keep head tracking on, keep a stable 60 frames
per second or higher, and avoid unexpected motion.
One of the things that makes virtual reality compelling
is the ability to look around.
In contrast, it would feel extremely unnatural
if the camera stopped responding to your head.
Therefore, you should always take into account
the user's orientation and never freeze the camera
or force the user to look somewhere specific.
If you want to grab the user's attention,
use cues such as light and sound to direct
them to look where you want.
You can also delay activating an event in your scene
until you know the user has turned
their head in that direction.
That way, they have time to take things in and enjoy the scene.
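One simple way to do that is to check how closely the head-tracked camera's forward direction points at the object before firing the event. This is only a sketch: the field names and the 30-degree threshold are illustrative, and head stands for the Cardboard head or camera transform.

    using UnityEngine;

    // Illustrative gate: fires once the user has turned toward this object.
    public class ActivateWhenSeen : MonoBehaviour
    {
        public Transform head;        // the Cardboard head / camera transform
        public float maxAngle = 30f;  // how directly the user must look
        bool triggered;

        void Update()
        {
            if (triggered) return;
            Vector3 toObject = (transform.position - head.position).normalized;
            if (Vector3.Angle(head.forward, toObject) < maxAngle)
            {
                triggered = true;
                // Start the event here, for example an animation or a sound.
            }
        }
    }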
You must always keep to 60 FPS or higher.
A high frame rate contributes to a good user experience in any game,
but it is even more crucial in virtual reality.
Think about it this way, the screen
is the only thing the user can see.
Rendering at 60 FPS means the user sees the same frame
for 16.6 milliseconds.
If you miss 60 FPS, vsync drops you to 30 FPS and each frame
is shown for 33 milliseconds.
That means as the user moves their head,
they're getting an incorrect image for a very long time.
This is why it is very important for virtual reality
applications to be fast and responsive.
Movement can be tricky, because the user does not
feel like they are in motion.
If the world starts moving around,
it can create an odd feeling
because of the discrepancy between the user's actual lack of movement
and what they are seeing.
There are ways to convey movement safely.
For example, by keeping motion constant
and avoiding acceleration, or by using another object, making
it move first, or creating a path for the user to see.
This signals to the user that they're about to be moved
and subconsciously prepares them.
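For the constant-motion case, here is a minimal sketch that moves the player rig toward a target at a fixed speed with no acceleration, using Unity's Vector3.MoveTowards. The field names and the speed value are illustrative.

    using UnityEngine;

    // Illustrative constant-speed mover: no acceleration, so the motion
    // feels predictable to the user.
    public class ConstantMove : MonoBehaviour
    {
        public Transform target;
        public float speed = 2f;   // meters per second, held constant

        void Update()
        {
            transform.position = Vector3.MoveTowards(
                transform.position, target.position, speed * Time.deltaTime);
        }
    }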
There are many more ways to ensure a good user experience.
I recommend checking out the Cardboard Design Lab
application to learn more about good and bad design patterns
so you can create the best user experience in your game
or application.
Good luck with making your own virtual reality experience,
and make sure to post about it in our Cardboard community.