
Introduction to HoloLens

10 min read • 11 Jul 2017


In this article by Abhijit Jana, Manish Sharma, and Mallikarjuna Rao, the authors of the book HoloLens Blueprints, we will cover the following points to introduce you to using HoloLens for exploratory data analysis.

  • Digital Reality - Under the Hood
  • Holograms in reality
  • Sketching the scenarios
  • 3D Modeling workflow
  • Adding Air Tap on speaker
  • Real-time visualization through HoloLens


Digital Reality - Under the Hood

Welcome to the world of Digital Reality. The purpose of Digital Reality is to bring immersive experiences: transporting you to different worlds or places, letting you interact within those immersive worlds, mixing digital experiences with reality, and ultimately opening new horizons to make you more productive. Applications of Digital Reality are advancing day by day, in fields such as gaming, education, defense, tourism, aerospace, corporate productivity, and enterprise applications.

The spectrum and scenarios of Digital Reality are huge. In order to understand them better, they are broken down into three different categories: 

  • Virtual Reality (VR): You are disconnected from the real world and experience a fully virtual one. Devices available on the market for VR include the Oculus Rift and Google VR.
  • Augmented Reality (AR): Digital data is overlaid on the real world. Pokémon GO, one of the most famous games globally, is an example of AR. A device on the market that falls under this category is Google Glass.
  • Mixed Reality (MR): It spans the boundary between the real environment and VR. Using MR, you get a seamless and immersive integration of the virtual and the real world.

This topic is mainly focused on developing MR applications using the Microsoft HoloLens device.

Although these technologies look similar in the way they are used, and the differences can be confusing at first, there are clear boundaries that distinguish them. As you can see in the following diagram, there is a very clear distinction between AR and VR. MR, however, is a spectrum that overlaps the boundaries of the real world, AR, and VR.

Figure: Digital Reality Spectrum

The following comparison describes the differences between the three:

Virtual Reality

  • Complete virtual world
  • User is completely isolated from the real world
  • Device examples: Oculus Rift and Google VR

Augmented Reality

  • Overlays data over the real world
  • Often used on mobile devices
  • Device example: Google Glass
  • Application example: Pokémon GO

Mixed Reality

  • Seamless integration of the real and virtual worlds
  • Virtual world interacts with the real world
  • Natural interactions
  • Device examples: HoloLens and Meta

Holograms in reality

So far, we have mentioned Holograms several times. It is evident that they are crucial for HoloLens and holographic apps, but what exactly is a Hologram?

Holograms are virtual objects made up of light and sound that blend with the real world to give us an immersive MR experience spanning both the real and virtual worlds. In other words, a Hologram is an object like any other real-world object; the only difference is that it is made up of light rather than matter.

The technology behind making holograms is known as Holography.

The following figure shows two holographic objects placed on top of a real table, giving the experience of placing a real object on a real surface:

Figure: Hologram objects in a real environment

Interacting with holograms

There are essentially five ways you can interact with holograms and HoloLens: Gaze, Gesture, Voice, spatial audio, and spatial mapping. Spatial mapping provides a detailed representation of the real-world surfaces in the environment around HoloLens, allowing developers to understand the digitized real environment and mix holograms into the world around you. Gaze is the most common interaction, and it is where interaction starts. At any time, HoloLens knows what you are looking at using Gaze; based on that, the device can decide which object a gesture or voice command should target. Spatial audio is the sound coming out of HoloLens, and we use it to extend the MR experience beyond the visual.

Figure: HoloLens Interaction Model
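
To make the Gaze mechanism concrete, here is a minimal sketch using plain Unity APIs rather than the book's toolkit code: it casts a ray from the user's head along the view direction every frame, which is how the device determines which hologram you are looking at.

using UnityEngine;

// A minimal gaze sketch (illustrative, not from the book): raycast from the
// camera (the user's head on HoloLens) along the view direction each frame.
public class SimpleGaze : MonoBehaviour
{
    public float maxGazeDistance = 5.0f;

    void Update()
    {
        Vector3 headPosition = Camera.main.transform.position;
        Vector3 gazeDirection = Camera.main.transform.forward;

        RaycastHit hitInfo;
        if (Physics.Raycast(headPosition, gazeDirection, out hitInfo, maxGazeDistance))
        {
            // hitInfo.collider.gameObject is the hologram being gazed at;
            // gestures and voice commands would be routed to this object.
            Debug.Log("Gazing at: " + hitInfo.collider.gameObject.name);
        }
    }
}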

Sketching the scenarios

The next step after elaborating the scenario details is to come up with sketches for the scenario. Sketching serves a twofold purpose: first, it is the input to the next phase of asset development for the 3D Artist; second, it helps validate requirements with the customer, so there are no surprises at the time of delivery.

For sketching, the designer can either take it up on their own and build the sketches, or take help from the 3D Artist. Let's start with the sketch for the primary view of the scenario, where the user is viewing the hologram of the HoloLens:

  • Roam around the hologram to view it from different angles
  • Gaze at different interactive components

    Figure: Sketch of the user viewing the hologram of the HoloLens

Sketching - interaction with speakers

While viewing the hologram, a user can gaze at different interactive components. One such component, identified earlier, is the speaker. When the user gazes at the speaker, it should be highlighted, and the user can then Air Tap on it. The Air Tap action should expand the speaker hologram, letting the user view the speaker component in detail (a rough highlight sketch follows the figure below).

Figure: Sketch of the expanded speakers
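
As a rough illustration of the highlight-on-gaze behavior described above, here is a hedged sketch; the OnGazeEnter/OnGazeExit hooks are hypothetical names that your gaze manager would call, not APIs from the book:

using UnityEngine;

// Illustrative sketch: brightens the speaker's material while it is gazed at.
// OnGazeEnter/OnGazeExit are hypothetical hooks a gaze manager would call
// (assumption: not the book's exact mechanism).
public class GazeHighlighter : MonoBehaviour
{
    public Color highlightColor = Color.yellow;
    private Color originalColor;
    private Renderer rend;

    void Start()
    {
        rend = GetComponent<Renderer>();
        originalColor = rend.material.color;
    }

    // Called when gaze enters the object (wired up by your gaze manager).
    public void OnGazeEnter()
    {
        rend.material.color = highlightColor;
    }

    // Called when gaze leaves the object.
    public void OnGazeExit()
    {
        rend.material.color = originalColor;
    }
}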


After the speakers are expanded, the user should be able to view the speaker components in detail. Now, if the user Air Taps on the expanded speakers, the application should do the following:

  • Open the textual detail component about the speakers; the user can read the content and learn about the speakers in detail
  • Start voice narration describing the speaker details
  • Close the expanded speaker when the user Air Taps on the expanded speaker component again (a minimal code sketch of this behavior follows the figure below)

    Figure: Textual and voice narration for speaker details
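
Here is a minimal sketch of this show-and-narrate behavior, assuming a pre-recorded narration clip on an AudioSource; the field names and wiring are illustrative, not code from the book:

using UnityEngine;

// A minimal sketch of the show-and-narrate behavior (illustrative; the field
// names and wiring are assumptions, not code from the book).
public class SpeakerNarration : MonoBehaviour
{
    public GameObject detailText;      // hypothetical textual detail component
    public AudioSource narrationAudio; // AudioSource holding the narration clip

    // Call this from your Air Tap handler for the expanded speaker.
    public void ToggleDetails()
    {
        bool show = !detailText.activeSelf;
        detailText.SetActive(show);

        if (show)
        {
            narrationAudio.Play(); // start the voice narration
        }
        else
        {
            narrationAudio.Stop(); // closing the details stops the narration
        }
    }
}

Wiring this method to the Air Tap gesture follows the same pattern as the SpeakerGestureHandler script shown later in this article.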

As you did the sketching for the speakers, apply a similar approach and sketch the other components, such as the lenses, buttons, and so on.

3D Modeling workflow

Before jumping into 3D Modeling, let's understand the 3D Modeling workflow across the different tools that we are going to use during the course of this topic. The following diagram shows the workflow:

Figure: Flow of the 3D Modeling workflow

Adding Air Tap on speaker

In this project, we will use the left-side speaker for applying the Air Tap. However, you can apply the same approach to the right-side speaker as well.

Similar to the Lenses, we have two objects here that we need to identify in the object explorer:

  • Navigate to Left_speaker_geo and left_speaker_details_geo in the Object Hierarchy window
  • Tag them as LeftSpeaker and speakerDetails respectively

By default, when you are just viewing the holograms, the speaker details section is hidden. It becomes visible only when you Air Tap, and is hidden again when you Air Tap once more:

Figure: Speaker with Box Collider

  • Add a new script inside the Scripts folder and name it ShowHideBehaviour. This script will handle the show and hide behaviour of the speakerDetails game object.

Use the following script inside the ShowHideBehaviour.cs file. This script can be reused for any other object you want to show or hide.

using UnityEngine;

// Shows or hides the referenced object by toggling its MeshRenderer.
public class ShowHideBehaviour : MonoBehaviour
{
    public GameObject showHideObject;
    public bool showhide = false;

    private void Start()
    {
        try
        {
            MeshRenderer render =
                showHideObject.GetComponentInChildren<MeshRenderer>();
            if (render != null)
            {
                render.enabled = showhide;
            }
        }
        catch (System.Exception)
        {
            // Ignore a missing reference; the object is simply left as-is.
        }
    }
}

The script finds the MeshRenderer component on the game object and enables or disables it based on the showhide field. Both showHideObject and showhide are exposed as public, so you can provide the object reference and the initial state from the Unity scene itself.

Attach ShowHideBehaviour.cs as a component on the object tagged speakerDetails. Then drag and drop the object onto the showHideObject property. This takes a reference to the current speaker details object and hides it at startup.

Figure: Attaching the show-hide script to the object

By default, showhide is unchecked (set to false), so the object is hidden from view. At this point, make sure left_speaker_details_geo itself is checked (enabled), as we are now handling its visibility from code.

Now, in the Air Tap event handler, we can enable the renderer to make the object visible.

  1. Add a new script by navigating from the context menu Create | C# Script, and name it SpeakerGestureHandler.
  2. Open the script file in Visual Studio.
  3. Similar to ShowHideBehaviour, the SpeakerGestureHandler class inherits from MonoBehaviour by default. In the next step, implement the IInputClickHandler interface in the SpeakerGestureHandler class. This interface defines the OnInputClicked() method, which is invoked on click input. So, whenever you perform an Air Tap gesture, this method is invoked.
using HoloToolkit.Unity.InputModule; // GazeManager, IInputClickHandler (namespace may vary by HoloToolkit version)
using UnityEngine;

public class SpeakerGestureHandler : MonoBehaviour, IInputClickHandler
{
    RaycastHit hit;
    bool isTapped = false;

    public void OnInputClicked(InputEventData eventData)
    {
        hit = GazeManager.Instance.HitInfo;
        if (hit.transform != null)
        {
            isTapped = !isTapped;
            var lftSpeaker = GameObject.FindWithTag("LeftSpeaker");
            var lftSpeakerDetails = GameObject.FindWithTag("speakerDetails");
            MeshRenderer render =
                lftSpeakerDetails.GetComponentInChildren<MeshRenderer>();

            if (isTapped)
            {
                // Slide the speaker down and reveal the details.
                lftSpeaker.transform.Translate(0.0f, -1.0f * Time.deltaTime, 0.0f);
                render.enabled = true;
            }
            else
            {
                // Slide the speaker back up and hide the details.
                lftSpeaker.transform.Translate(0.0f, 1.0f * Time.deltaTime, 0.0f);
                render.enabled = false;
            }
        }
    }
}

When the speaker is tapped, we find the game objects for both LeftSpeaker and speakerDetails by their tag names. For the LeftSpeaker object, we apply a transformation based on whether it is tapped or not, just like we did for the lenses. For the speaker details object, we also take a reference to its MeshRenderer to set its visibility to true or false based on the Air Tap. Attach the SpeakerGestureHandler component to the game object tagged LeftSpeaker.

Air Tap on speaker – see it in action

The Air Tap action for the speaker is now done. Save the scene, then build and run the solution in the emulator once again. When you see the cursor on the speaker, perform an Air Tap.

Figure: Default view and Air Tapped view

Real-time visualization through HoloLens

We have learned about the data ingress flow, where devices connect to the IoT Hub and Stream Analytics processes the stream of data and pushes it to storage. In this section, let's discuss how this stored data is consumed for data visualization within the holographic application.

Figure: Solution to consume data through services
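
As an illustration of the consumption side, the following sketch polls a REST endpoint that fronts the storage written by Stream Analytics; the URL, polling approach, and payload handling are assumptions for demonstration, not the book's actual service:

using System.Collections;
using UnityEngine;
using UnityEngine.Networking;

// A minimal sketch (the endpoint URL and JSON shape are hypothetical): polls a
// REST service exposing the processed device data, so the holographic app can
// visualize it in near real time.
public class DeviceDataClient : MonoBehaviour
{
    // Hypothetical endpoint fronting the storage written by Stream Analytics.
    public string dataUrl = "https://example.com/api/devicedata/latest";
    public float pollIntervalSeconds = 5.0f;

    IEnumerator Start()
    {
        while (true)
        {
            using (UnityWebRequest request = UnityWebRequest.Get(dataUrl))
            {
                yield return request.SendWebRequest();
                if (!request.isNetworkError && !request.isHttpError)
                {
                    // Parse the JSON payload here and push it to the hologram
                    // (for example, updating a chart or a label).
                    Debug.Log("Latest telemetry: " + request.downloadHandler.text);
                }
            }
            yield return new WaitForSeconds(pollIntervalSeconds);
        }
    }
}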

Summary

In this article, we covered Digital Reality under the hood, holograms in reality, sketching the scenarios, the 3D Modeling workflow, adding Air Tap on the speaker, and real-time visualization through HoloLens.
