June 25, 2020

1163 words 6 mins read

microsoft/MixedRealityToolkit-Unity

Mixed Reality Toolkit (MRTK) provides a set of components and features to accelerate cross-platform MR app development in Unity.

repo name microsoft/MixedRealityToolkit-Unity
repo link https://github.com/microsoft/MixedRealityToolkit-Unity
homepage https://microsoft.github.io/MixedRealityToolkit-Unity
language C#
size (curr.) 1504508 kB
stars (curr.) 3625
created 2016-01-28
license MIT License

Mixed Reality Toolkit

What is the Mixed Reality Toolkit

MRTK-Unity is a Microsoft-driven project that provides a set of components and features used to accelerate cross-platform MR app development in Unity. Among other things, it:

  • Provides the basic building blocks for Unity development on HoloLens, Windows Mixed Reality, and OpenVR.
  • Enables rapid prototyping via in-editor simulation that allows you to see changes immediately.
  • Operates as an extensible framework that provides developers the ability to swap out core components.
  • Supports a wide range of platforms, including
    • Microsoft HoloLens
    • Microsoft HoloLens 2
    • Windows Mixed Reality headsets
    • OpenVR headsets (HTC Vive / Oculus Rift)
    • Ultraleap Hand Tracking
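As a rough sketch of the extensibility point above: MRTK's core services are resolved through the `CoreServices` accessor, so a custom implementation registered in the active profile is reached the same way as the built-in one (a minimal Unity fragment, assuming the MRTK v2 packages are present in the project):

```csharp
using Microsoft.MixedReality.Toolkit;
using Microsoft.MixedReality.Toolkit.Input;
using UnityEngine;

public class ServiceAccessExample : MonoBehaviour
{
    private void Start()
    {
        // CoreServices returns whichever implementation is registered in the
        // active MRTK profile, whether built-in or a swapped-in custom service.
        IMixedRealityInputSystem inputSystem = CoreServices.InputSystem;
        Debug.Log($"Active input system: {inputSystem?.Name}");
    }
}
```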

Getting started with MRTK

Getting Started and Documentation

  • Getting Started
  • MRTK Overview
  • Feature Guides
  • API Reference

Build status

CI and documentation build status badges for the mrtk_development branch.

Required software

  • Windows SDK 18362+: To build apps with MRTK v2, you need the Windows 10 May 2019 Update SDK. To run apps on immersive headsets, you need the Windows 10 Fall Creators Update.
  • Unity 2018.4.x: The Unity 3D engine provides support for building mixed reality projects on Windows 10.
  • Visual Studio 2019: Visual Studio is used for code editing, deploying, and building UWP app packages.
  • Emulators (optional): The emulators allow you to test your app in a simulated environment without a device.

Feature areas

  • Input System
  • Hand Tracking (HoloLens 2)
  • Eye Tracking (HoloLens 2)
  • Profiles
  • Hand Tracking (Ultraleap)
  • UI Controls
  • Solvers
  • Multi-Scene Manager
  • Spatial Awareness
  • Diagnostic Tool
  • MRTK Standard Shader
  • Speech & Dictation
  • Boundary System
  • In-Editor Simulation
  • Experimental Features
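To give a feel for the input system listed above, a script can receive pointer events by implementing `IMixedRealityPointerHandler` (a hedged sketch; the component must sit on a GameObject with a collider to receive focus-based events, or be registered as a global listener):

```csharp
using Microsoft.MixedReality.Toolkit.Input;
using UnityEngine;

// Receives pointer events from any MRTK pointer (articulated hands, gaze,
// motion controllers) while this object is focused.
public class ClickLogger : MonoBehaviour, IMixedRealityPointerHandler
{
    public void OnPointerClicked(MixedRealityPointerEventData eventData)
    {
        Debug.Log($"Clicked by {eventData.Pointer.PointerName}");
    }

    public void OnPointerDown(MixedRealityPointerEventData eventData) { }
    public void OnPointerDragged(MixedRealityPointerEventData eventData) { }
    public void OnPointerUp(MixedRealityPointerEventData eventData) { }
}
```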

UX building blocks

  • Button: A button control which supports various input methods, including HoloLens 2's articulated hand
  • Bounding Box: Standard UI for manipulating objects in 3D space
  • Object Manipulator: Script for manipulating objects with one or two hands
  • Slate: 2D style plane which supports scrolling with articulated hand input
  • System Keyboard: Example script of using the system keyboard in Unity
  • Interactable: A script for making objects interactable with visual states and theme support
  • Solver: Various object positioning behaviors such as tag-along, body-lock, constant view size, and surface magnetism
  • Object Collection: Script for laying out an array of objects in a three-dimensional shape
  • Tooltip: Annotation UI with a flexible anchor/pivot system, which can be used for labeling motion controllers and objects
  • Slider: Slider UI for adjusting values, supporting direct hand tracking interaction
  • MRTK Standard Shader: MRTK's Standard shader supports various Fluent design elements with performance
  • Hand Menu: Hand-locked UI for quick access, using the Hand Constraint Solver
  • App Bar: UI for Bounding Box's manual activation
  • Pointers: Learn about various types of pointers
  • Fingertip Visualization: Visual affordance on the fingertip which improves confidence in direct interaction
  • Near Menu: Floating menu UI for near interactions
  • Spatial Awareness: Make your holographic objects interact with the physical environment
  • Voice Command / Dictation: Scripts and examples for integrating speech input
  • Progress Indicator: Visual indicator for communicating an ongoing data process or operation
  • Dialog [Experimental]: UI for asking for the user's confirmation or acknowledgement
  • Hand Coach [Experimental]: Component that helps guide the user when the gesture has not been taught
  • Hand Physics Service [Experimental]: Enables rigid body collision events and interactions with articulated hands
  • Scrolling Collection [Experimental]: An Object Collection that natively scrolls 3D objects
  • Dock [Experimental]: Allows objects to be moved in and out of predetermined positions
  • Eye Tracking: Target Selection: Combine eyes, voice, and hand input to quickly and effortlessly select holograms across your scene
  • Eye Tracking: Navigation: Learn how to auto-scroll text or fluently zoom into focused content based on what you are looking at
  • Eye Tracking: Heat Map: Examples for logging, loading, and visualizing what users have been looking at in your app
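Many of the building blocks above are ordinary Unity components. For example, making an object grabbable with one or two hands amounts to adding `ObjectManipulator` and `NearInteractionGrabbable` (a minimal sketch, assuming MRTK 2.4+, where `ObjectManipulator` supersedes the older `ManipulationHandler`):

```csharp
using Microsoft.MixedReality.Toolkit.Input;
using Microsoft.MixedReality.Toolkit.UI;
using UnityEngine;

public class GrabbableCubeSpawner : MonoBehaviour
{
    private void Start()
    {
        // A primitive cube already has a BoxCollider, which near interaction needs.
        GameObject cube = GameObject.CreatePrimitive(PrimitiveType.Cube);
        cube.transform.localScale = Vector3.one * 0.1f;

        // Far (ray) and near (grab) manipulation with one or two hands.
        cube.AddComponent<ObjectManipulator>();
        cube.AddComponent<NearInteractionGrabbable>();
    }
}
```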

Tools

  • Optimize Window: Automate configuration of Mixed Reality projects for performance optimizations
  • Dependency Window: Analyze dependencies between assets and identify unused assets
  • Build Window: Configure and execute an end-to-end build process for Mixed Reality applications
  • Input Recording: Record and play back head movement and hand tracking data in the editor

Example scenes

Explore MRTK’s various types of interactions and UI controls in this example scene.

You can find other example scenes under Assets/MixedRealityToolkit.Examples/Demos folder.

Example Scene

MRTK examples hub

With the MRTK Examples Hub, you can try various example scenes in MRTK. You can find pre-built app packages for HoloLens (x86), HoloLens 2 (ARM), and Windows Mixed Reality immersive headsets (x64) under the Release Assets folder. Use the Windows Device Portal to install apps on HoloLens.

See the Examples Hub README page to learn how to create a multi-scene hub with MRTK's scene system and scene transition service.

Example Scene

Sample apps made with MRTK

  • Periodic Table of the Elements: An open-source sample app which demonstrates how to use MRTK's input system and building blocks to create an app experience for HoloLens and immersive headsets. Read the porting story: Bringing the Periodic Table of the Elements app to HoloLens 2 with MRTK v2.
  • Galaxy Explorer: An open-source sample app originally developed in March 2016 as part of the HoloLens 'Share Your Idea' campaign, since updated with new features for HoloLens 2 using MRTK v2. Read the story: The Making of Galaxy Explorer for HoloLens 2.

Engage with the community

This project has adopted the Microsoft Open Source Code of Conduct. For more information, see the Code of Conduct FAQ or contact opencode@microsoft.com with any additional questions or comments.

Useful resources on the Mixed Reality Dev Center

  • Discover: Learn to build mixed reality experiences for HoloLens and immersive headsets (VR).
  • Design: Get design guides. Build user interfaces. Learn interactions and input.
  • Develop: Get development guides. Learn the technology. Understand the science.
  • Distribute: Get your app ready for others and consider creating a 3D launcher.

Useful resources on Azure

  • Spatial Anchors: A cross-platform service that allows you to create mixed reality experiences using objects that persist their location across devices over time.
  • Speech Services: Discover and integrate Azure-powered speech capabilities like speech to text, speaker recognition, or speech translation into your application.
  • Vision Services: Identify and analyze your image or video content using services like computer vision, face detection, emotion recognition, or Video Indexer.

Learn more about the MRTK project

You can find our planning material on our wiki under the Project Management Section. You can always see the items the team is actively working on in the Iteration Plan issue.

How to contribute

Learn how you can contribute to MRTK at Contributing.

For details on the different branches used in the Mixed Reality Toolkit repositories, see the Branch Guide.
