Mixed Reality Utility Kit: Build spatially-aware apps with Meta XR SDK

  • Hi XR developers, today we're going to talk about a new tool from Meta called the Mixed Reality Utility Kit, or MRUK for short.

  • Before, when we wanted to test features like the Scene API or Anchors, we always had to build an APK and test it on our device.

  • MRUK now allows us to directly test everything in the editor when we use the Quest link.

  • It even comes with room prefabs, which allow us to test different rooms and better fit our application to the different room sizes of our users.

  • Trust me, this is a really great tool for developing mixed reality experiences, and I hope you're as excited as I am.

  • If you like this type of video and it's helpful to you, please take a second to like and subscribe to this channel.

  • It helps me a lot.

  • If you'd like to get the source code for each video, please consider subscribing to my Patreon, where you can find all of it.

  • If you have any questions, please join our Discord community.

  • We are a community of over 200 XR developers at this point and we are happy to help you with any questions.

  • And now, let's get started with MRUK.

  • To begin, let's cover the requirements for using the MRUK.

  • To use MRUK with Unity and Meta Quest, ensure you have Unity 2021.3.30 or newer and a Quest 2, Quest Pro, or Quest 3 on firmware version 60 or newer.

  • For PC, use Quest Link, but remember that the passthrough image is visible in the headset only, and set up your room scan before connecting.

  • On Mac, you will have to build and deploy an APK.

  • Keep in mind that you do not need an OVR Scene Manager in your scene; MRUK serves as a replacement for it.

  • Lastly, it is advised to familiarize yourself with Meta's Scene API first.

  • With that out of the way, let's set up a new Unity project.

  • We first want to install the Meta XR SDK from the Package Manager.

  • You can simply install it by its name, which is com.meta.xr.sdk.all.

  • Next, we install the Mixed Reality Utility Kit from the Package Manager by typing com.meta.xr.mrutilitykit.

  • After installing MRUK, also make sure to install the samples which we will look at in a second.

  • Let's make sure we create a new room scan inside our Meta Quest.

  • The more accurate the scan, the better our experience will be later.

  • Take your time to walk around your room and look at your surfaces from different angles.

  • Also, take enough time to add new anchors for your furniture and add the correct labels such as table, couch, or bed.

  • Now, in order to later test any of our scenes, let's set up our project with the Project Setup Tool from Meta.

  • Just apply all the suggested changes, which will set up everything for us, even our XR plugin for testing the scene directly on our device.

  • Do this for both Windows and Android, and lastly, switch the platform to Android in the Build Settings, in case you want to deploy the app to your headset later.

  • We are now finally ready to test out some of the samples and examine the components and functions that come with the Utility Kit.

  • Firstly, let's just look at the MRUK base scene.

  • We can see that we have a regular OVR camera rig with the Tracking Origin Mode set to Stage.

  • Under Quest Features, we can see that we don't need to enable support for Spatial Anchors, Scene Understanding, and Passthrough, as we normally would without the Utility Kit.

  • However, we still require an OVR Passthrough layer to actually be able to start our experience in Passthrough mode.

  • Keep in mind, however, that if you would like to build your app to your Quest device, you will still need to enable Anchor Support and Scene Support, as well as Passthrough.

  • Let's now look at our MRUK prefab that contains the main component MRUK, which is a singleton and therefore should only exist once within our scene.

  • Firstly, we have a Scene Loaded event, which lets us easily execute any public method from here once our scene has loaded.

  • Now, keep in mind, you don't have to use this component to reference all your methods.

  • The Utility Kit also comes with an MRUKStart component, which you can find on the Effect Mesh GameObject, for example.

  • This component is no different from MRUK itself; it simply exists for drag-and-drop ease of use, so you don't have to use the event on the MRUK component to reference all the methods you want to execute when your scene has loaded.
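
If you'd rather hook into this notification from code than wire methods up in the Inspector, a minimal sketch could look like the following. It assumes MRUK exposes a RegisterSceneLoadedCallback method, as recent versions of the kit do; check your installed version's API.

```csharp
using Meta.XR.MRUtilityKit;
using UnityEngine;

public class SceneReadyHandler : MonoBehaviour
{
    private void Start()
    {
        // Assumption: MRUK.Instance is the MRUK singleton and the callback
        // fires once the scene data has finished loading.
        MRUK.Instance.RegisterSceneLoadedCallback(OnSceneLoaded);
    }

    private void OnSceneLoaded()
    {
        Debug.Log($"MRUK scene loaded, current room: {MRUK.Instance.GetCurrentRoom()?.name}");
    }
}
```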

  • Now, let's look at the main component in detail.

  • The first checkbox we enable is called World Lock.

  • World Locking is a new feature that makes it easier for developers to keep the virtual world in sync with the real world.

  • Previously, the recommended method to keep the real and virtual world in sync was to ensure that every piece of virtual content is attached to an anchor.

  • This meant that nothing could be considered static: everything would need to cope with being moved by small amounts every frame.

  • This can lead to a number of issues with networking, physics, rendering, etc.

  • So, we definitely want to keep this box checked.

  • Next, we look at the scene settings.

  • Our data source can either be a room prefab, which the Utility Kit already provides, or the scene model that we have created inside our headset.

  • There is a third option called Device with Prefab Fallback, which means that if we haven't set up a room scan in our Quest home, our application will fall back to the room prefabs that the Utility Kit provides.

  • This is not only beneficial for when we cannot scan our own room, but also if we want to test our app in a variety of rooms that could be similar to our end-users' rooms.

  • Next, we have a room index and a list of room prefabs that can be loaded.

  • Setting the room index to minus one means that a random room prefab will be loaded.

  • Setting the index to the number zero means the first room prefab will be loaded every time and so on.

  • The list below comes already prefilled with some room prefabs.

  • If we play the scene twice, we can see that, in the editor, Unity will randomly load two different rooms.

  • Next, there is the Load Scene on Startup checkbox.

  • When enabled, the scene is automatically loaded if it exists, and the Scene Loaded event is fired without any further action needed.

  • When false, you can manually control scene initialization behavior.
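
With the checkbox disabled, you could trigger loading yourself, for example from a button. A hedged sketch, assuming a LoadSceneFromDevice method that returns an awaitable result code, as in recent MRUK versions:

```csharp
using Meta.XR.MRUtilityKit;
using UnityEngine;

public class ManualSceneLoader : MonoBehaviour
{
    // Call this (e.g., from a UI button) once you are ready to load the scene.
    public async void LoadRoom()
    {
        // Assumption: LoadSceneFromDevice pulls the scene model scanned on the
        // headset and resolves with a result code we can inspect.
        var result = await MRUK.Instance.LoadSceneFromDevice();
        Debug.Log($"Scene load result: {result}");
    }
}
```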

  • Lastly, we can specify the width of a seating area within our scene model.

  • This means that if we have set up an anchor with the Couch label, and it is at least as wide as the value specified on our MRUK component, it can be queried from code; for example, we could call TryGetClosestSeatPose, which returns the closest seat pose on any couch object.

  • Or we could call GetSeatPoses, which simply returns all seat poses in the room.

  • It will return zero poses if there are no couch objects in the scene (a code sketch follows below).
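
Here is what such a seat query could look like in code; the exact signatures vary between MRUK versions, and TryGetClosestSeatPose taking a ray plus an out pose is an assumption modeled on the scene debugger sample:

```csharp
using Meta.XR.MRUtilityKit;
using UnityEngine;

public class SeatFinder : MonoBehaviour
{
    private void Update()
    {
        // Query for the closest seat when the A button is pressed.
        if (!OVRInput.GetDown(OVRInput.Button.One)) return;

        var room = MRUK.Instance.GetCurrentRoom();
        // Assumption: the closest seat is queried with a ray from the user's head.
        var head = Camera.main.transform;
        var ray = new Ray(head.position, head.forward);
        if (room.TryGetClosestSeatPose(ray, out Pose seatPose) != null)
        {
            Debug.Log($"Closest seat at {seatPose.position}");
        }
    }
}
```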

  • Let's now look at the Effect Mesh component.

  • This component allows us to easily render our scene model with a different material, which we can assign under Mesh Material.

  • It also allows us to enable colliders, which will let us interact with physics in our scene like bouncing objects off our models.

  • We can also allow the casting of shadows on our surfaces, which lets us see the shadow of other objects when they are moving within our scene.

  • Lastly, we can decide which labels to apply this scene effect to.

  • For example, just floors and walls, or all the objects in our scene.

  • I will leave a link in the description that explains the other properties on this component, which we won't look at in more detail in this video.
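
If you prefer configuring the component from code, a rough sketch might look like this; the field names (MeshMaterial, Colliders, CastShadows, Labels) are assumptions mirroring the Inspector options above, so verify them against your MRUK version:

```csharp
using Meta.XR.MRUtilityKit;
using UnityEngine;

public class EffectMeshSetup : MonoBehaviour
{
    [SerializeField] private Material sceneMaterial;

    private void Awake()
    {
        var effectMesh = GetComponent<EffectMesh>();
        // Assumption: these fields mirror the Inspector options described above.
        effectMesh.MeshMaterial = sceneMaterial;  // material used to render the scene model
        effectMesh.Colliders = true;              // enable colliders for physics interactions
        effectMesh.CastShadows = true;            // let surfaces show shadows of moving objects
        // Apply the effect only to floors and walls, as in the example above.
        effectMesh.Labels = MRUKAnchor.SceneLabels.FLOOR | MRUKAnchor.SceneLabels.WALL_FACE;
    }
}
```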

  • Lastly, let's look at the RoomGuardian GameObject.

  • It contains the Effect Mesh and MRUKStart components, just like the Effect Mesh GameObject.

  • But this time, we also have a RoomGuardian component, where we can set a distance.

  • This is the distance at which the guardian is activated, once the player moves within it.

  • On the SceneLoaded event, we call the GetEffectMesh method from the RoomGuardian component.

  • This method will find the mesh from our Effect Mesh and fade the guardian material depending on our distance to the guardian.

  • We can see that right here in the code as well.

  • Let's take a look in our editor.

  • You can see the guardian being simulated quite accurately, as if we were wearing our headset right now and walking towards a boundary.

  • You can play around with the distance and figure out which distance is best for your game.
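
To give a feel for how such a fade can be driven, here is a hedged sketch that fades a guardian material as the player approaches the nearest surface. It assumes MRUKRoom exposes a TryGetClosestSurfacePosition query returning the distance, similar to what the scene debugger uses, and a shader with a fade property; both are assumptions to verify:

```csharp
using Meta.XR.MRUtilityKit;
using UnityEngine;

public class SimpleGuardianFade : MonoBehaviour
{
    [SerializeField] private Material guardianMaterial; // material with a fade/alpha property
    [SerializeField] private float guardianDistance = 0.5f;

    private void Update()
    {
        var room = MRUK.Instance?.GetCurrentRoom();
        if (room == null) return;

        Vector3 head = Camera.main.transform.position;
        // Assumption: returns the distance to the closest point on any scene surface.
        float distance = room.TryGetClosestSurfacePosition(head, out Vector3 _, out MRUKAnchor _);

        // Fully visible at the surface, fully faded guardianDistance away from it.
        float alpha = Mathf.Clamp01(1f - distance / guardianDistance);
        guardianMaterial.SetFloat("_Alpha", alpha); // "_Alpha" is a hypothetical shader property
    }
}
```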

  • The last thing we want to look at in this basic scene is the SceneDebugger component.

  • This component offers us a menu with a variety of tools for getting anchor and surface information.

  • It also allows us to shoot a projectile into our space, provided we have enabled colliders on our Effect Mesh.

  • Let's press play and see what kind of functions are open and available to us from the MRUK Singleton class that comes with the Mixed Reality Utility Kit.

  • We can get the KeyWall, which is the longest wall in the room that has no other room corners behind it.

  • Or we can request the largest available surface, or the closest surface position, which can be great for placing content in our own app.

  • We can also query the closest seat position, or visualize where our raycast is hitting the model, for example to get a better understanding of how users are interacting with our app.

  • Now let's look at some code, and for that we open the SceneDebugger script.

  • I want to show you how easy it is to query all this information yourself, and create your own unique gameplay with it.

  • We can use three main classes, MRUK, MRUKRoom, and MRUKAnchor.

  • They all come with a bunch of methods that provide us with a lot of information about our room.

  • I will leave a link in the description to all of these methods.

  • Now, let's just quickly look at how Meta has used some of those methods for the debugging functionality.

  • Let's check line 156, for example.

  • As you can see, the GetKeyWallDebugger method is supposed to retrieve the KeyWall, and apply the debug visuals to it.

  • To get the KeyWall from our room model, we can simply use the MRUK singleton by calling MRUK.Instance.

  • Then we need to get the current room and get the KeyWall from it by calling GetCurrentRoom().GetKeyWall().

  • Also, as you can see, our KeyWall has the type MRUKAnchor.
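
Condensed, that call chain looks something like the sketch below; the out parameter for the wall's dimensions matches how the debugger sample calls it, but double-check the exact signature in your version:

```csharp
using Meta.XR.MRUtilityKit;
using UnityEngine;

public class KeyWallExample : MonoBehaviour
{
    public void LogKeyWall()
    {
        // The key wall: the longest wall in the room with no corners behind it.
        MRUKAnchor keyWall = MRUK.Instance.GetCurrentRoom().GetKeyWall(out Vector2 wallScale);
        Debug.Log($"Key wall: {keyWall?.name}, size: {wallScale}");
    }
}
```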

  • Let's also take a look at another method.

  • In line 197, we can see that we are doing the exact same thing.

  • We declare another local MRUKAnchor variable and get the largest available surface by calling MRUK.Instance and then getting the current room we are in.

  • We can then simply call the FindLargestSurface method, and provide it with the surface type parameter, so it knows which surface we are looking for.
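
The largest-surface query can be sketched the same way; passing the label as a string like "TABLE" is an assumption, as some versions take a label enum or filter instead:

```csharp
using Meta.XR.MRUtilityKit;
using UnityEngine;

public class LargestSurfaceExample : MonoBehaviour
{
    public void LogLargestTable()
    {
        // Assumption: the parameter is the scene label of the surface we want.
        MRUKAnchor largest = MRUK.Instance.GetCurrentRoom().FindLargestSurface("TABLE");
        Debug.Log($"Largest TABLE surface: {largest?.name}");
    }
}
```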

  • Meta really made it super easy for us to query a bunch of scene data.

  • But let's look at a few more samples to cover all the most important features of this amazing utility kit.

  • We open the FindFloorZone scene and select the FindFloorZone GameObject, because the next component we take a look at is FindSpawnPositions.

  • This is an excellent tool for when we have our own prefabs, let's say a model of a small building for an architecture application, and we want to check where we are able to place it without overlapping with our furniture.

  • So, we reference our prefab in the SpawnPrefab field, and depending on its size, the component decides where and how often it can be placed on our surfaces.

  • If we look at this FloorSpot prefab, for example, we can see it is 2 units long and wide, so it will only fit in spaces of that size or bigger.

  • We can set the amount of prefabs we want to place, as well as the number of times to attempt spawning or moving an object before giving up.

  • We can also specify where on the surface we would like to spawn the objects, and for which kinds of labels, for example, just on the floor.

  • If CheckOverlap is enabled, each spawn position is checked against colliders to make sure there is no overlap.

  • Lastly, we call the StartSpawn method of the FindSpawnPositions component on the SceneLoaded event (sketched below).
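
The same wiring can be done from code instead of the Inspector; a small sketch, assuming the RegisterSceneLoadedCallback method from earlier:

```csharp
using Meta.XR.MRUtilityKit;
using UnityEngine;

public class SpawnOnSceneLoad : MonoBehaviour
{
    [SerializeField] private FindSpawnPositions spawner;

    private void Start()
    {
        // Run the spawn pass once the scene model is available.
        MRUK.Instance.RegisterSceneLoadedCallback(() => spawner.StartSpawn());
    }
}
```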

  • Let's give this scene a try.

  • We can see now how the different prefabs of different sizes are being spawned onto our floor surfaces.

  • Great guys!

  • There's one very cool component left before we close off this video that I want to show you, and that's the AnchorPrefabSpawner.

  • Let's open the virtual home scene and look at the FurnitureSpawner.

  • The AnchorPrefabSpawner allows us to effectively replace existing anchors, such as beds and tables, with virtual objects, or in other words prefabs, that we prepared beforehand.

  • So, if we open up the prefabs to spawn and then open one of the elements, we can see that we first specify the label of the anchor that we want to replace, in this case, the walls.

  • And after that, we assign the prefab that our walls should be replaced with.
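
To make the idea concrete, here is a hedged sketch of doing a similar replacement manually. It assumes MRUKRoom exposes its anchors as a list and MRUKAnchor offers a HasAnyLabel check; the real AnchorPrefabSpawner additionally handles scaling and alignment for you:

```csharp
using Meta.XR.MRUtilityKit;
using UnityEngine;

public class ManualWallSpawner : MonoBehaviour
{
    [SerializeField] private GameObject wallPrefab;

    public void ReplaceWalls()
    {
        foreach (MRUKAnchor anchor in MRUK.Instance.GetCurrentRoom().Anchors)
        {
            // Assumption: HasAnyLabel checks the anchor's scene label(s).
            if (anchor.HasAnyLabel(MRUKAnchor.SceneLabels.WALL_FACE))
            {
                // Parent the prefab to the anchor so world locking keeps it in place.
                Instantiate(wallPrefab, anchor.transform.position,
                            anchor.transform.rotation, anchor.transform);
            }
        }
    }
}
```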

  • Let's give this last scene a try and see our room with those walls we added as prefabs.

  • We can see that our real room turned into a completely virtual room.

  • This allows us to modify our users' rooms in the style of our game, which opens up a huge variety of gameplay.

  • And I can't wait to see what all of you are building with the Mixed Reality Utility Kit.

  • Alright guys, and that's it for this video.

  • I hope you learned a lot about MRUK today. If you're enjoying this content, please take a second to like and subscribe to this channel, consider subscribing to my Patreon if you want the source code for each tutorial, and feel free to join our Discord community if you have any questions; we are happy to welcome you.

  • Thank you so much for watching, and see you in the next one.
