In this guide we will show you how to use some of the prefabs & samples that come with the Unity* Toolkit available in the Intel® RealSense™ SDK R2 (v4.0). This guide assumes you have imported the toolkit into your Unity environment and are familiar with applying a RealSense script to a Unity object. When using Unity 5, be sure to add the two 64-bit libraries available at \RSSDK\bin\x64. You can learn more about these processes in the companion blogs to this one, “8 Steps to Add Intel RealSense Unity Toolkit to your Project” & “12 Steps to Apply Intel RealSense Technology to Your Unity Project”.
1. Explore the Prefabs
Start a project with only a directional light source in the scene.
In the Project tab, navigate to Assets – RSUnityToolkit – Prefabs. These are pre-assembled game objects with RealSense Actions already applied.
Drag the Face, the Left hand, and the Right hand prefabs into the scene.
Run the game with these prefab game objects. Each prefab includes a pre-defined group of objects for each item you dragged into the scene; for example, the left hand includes four fingers, a thumb, a palm, and a wrist. You can use these prefabs to jumpstart getting the actions into your game. Interact with the sample before moving on to the next example.
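You can also spawn those same prefabs from a script instead of dragging them in by hand. The sketch below uses only standard Unity C# calls; the field names and spawn positions are illustrative assumptions, and the actual tracking behavior still comes from the toolkit scripts already attached to each prefab.

```csharp
using UnityEngine;

// Hypothetical helper: spawns the toolkit prefabs at runtime instead of
// dragging them into the scene. Assign the Face / Hand prefabs from
// Assets/RSUnityToolkit/Prefabs to these fields in the Inspector.
public class PrefabSpawner : MonoBehaviour
{
    public GameObject facePrefab;      // e.g. the "Face" prefab
    public GameObject leftHandPrefab;  // e.g. the "Left hand" prefab
    public GameObject rightHandPrefab; // e.g. the "Right hand" prefab

    void Start()
    {
        // The starting positions are arbitrary; the tracking scripts on
        // each prefab take over placement once the camera starts tracking.
        Instantiate(facePrefab, new Vector3(0f, 2f, 0f), Quaternion.identity);
        Instantiate(leftHandPrefab, new Vector3(-2f, 0f, 0f), Quaternion.identity);
        Instantiate(rightHandPrefab, new Vector3(2f, 0f, 0f), Quaternion.identity);
    }
}
```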
2. Test a Sample or Two | |
Navigate to the Samples folder under Assets\RSUnityToolkit in the lower Project panel. These are complete sample scenes with more RealSense functionality built in.
Double-click on “Sample3 – AR Mirror” in the folder. You may be asked to save your current work; do so if you wish.
Run the sample. The Game panel will recognize your face and hands and superimpose animated control points onto the color camera’s image of you. The objects will track the movements of your face and hands. In the Inspector panel, the tracking can be set to either mirror your movements or track them exactly, depending on what the designer wants.
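The mirror-versus-exact distinction is essentially a sign flip on the tracked horizontal movement. The toolkit's own tracking component exposes this choice in the Inspector, and its exact property names vary by SDK version, so the sketch below is only a conceptual illustration using plain Unity calls; the class and field names are assumptions.

```csharp
using UnityEngine;

// Conceptual sketch only: follows another transform either exactly or as
// a mirror image, to illustrate the difference between the two tracking
// modes. "source" stands in for whatever object the camera is driving.
public class MirrorOrExactFollower : MonoBehaviour
{
    public Transform source;     // the tracked object to follow
    public bool mirror = true;   // true = mirror left/right, false = exact

    void LateUpdate()
    {
        if (source == null) return;

        Vector3 p = source.position;
        if (mirror)
        {
            p.x = -p.x; // mirroring flips movement across the vertical axis
        }
        transform.position = p;
    }
}
```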
3. Play a Game Sample | |
Here’s where you can learn some of the nuances of the gestures available with Intel RealSense technology. Double-click on “Sample4 – Falling Balls.” This is a functional game sample in which the player tries to guide falling balls into a box using hand gestures. In the Scene panel you will see the perspective view and a target. Press the Run button above the Scene panel to see the game rules.
Run and play the Falling Balls sample game. Just tell your boss you are working. I do hope you can beat Bryan’s score.
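If you are curious how a win condition like this can be wired up, the sketch below shows the common Unity pattern of counting objects that land inside a trigger volume. It is not the sample’s actual code; the “Ball” tag, the score field, and the log message are all assumptions for illustration.

```csharp
using UnityEngine;

// Illustrative sketch of a "catch the ball" win condition: a trigger
// collider on the box counts each ball that lands inside it.
public class CatchZone : MonoBehaviour
{
    private int score;

    void OnTriggerEnter(Collider other)
    {
        // The falling balls would need the "Ball" tag and a Rigidbody
        // for this trigger callback to fire.
        if (other.CompareTag("Ball"))
        {
            score++;
            Debug.Log("Caught a ball! Score: " + score);
            Destroy(other.gameObject); // remove the caught ball from play
        }
    }
}
```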
We hope this series has helped you get started adding Intel RealSense technology to your Unity3D game projects.