Microsoft HoloLens

BUILDING THE FUTURE OF COMPUTING

Working closely with director Nando Costa and a brilliant cross-discipline team, I helped define the look and feel of interacting with holograms in mixed reality for HoloLens v1.

As a motion designer, I created visual prototypes using After Effects, Cinema4D, and other tools to solve interaction problems and develop a cohesive visual language for Windows Holographic.

The sequences below were captured on device, showing the end result of a few of the countless interactions I prototyped through motion and visual explorations (they’re designed for an additive display and are best experienced from within a HoloLens).

Using the “bloom” gesture to summon your personal Start menu out of thin air

Launching and then choosing where to place a volumetric app within your space

Typing on a keyboard using only gaze and the air-tap gesture

Visualizing the surface reconstruction mesh so that you can see how HoloLens understands your environment

UNPRECEDENTED CHALLENGES FOR AN UNPRECEDENTED PLATFORM

Enabling a user to interact with a computer using only gaze, voice, and two gestures is a challenge, and even more so when there’s no monitor and the interface is spread out around them in the physical world. How can such a revolutionary form of computing be made to feel understandable, navigable, and confidently operable?

Answering this question meant first answering many other questions, such as:

  • How will users know what they can interact with?

  • How will they know which button they’re currently targeting?

  • How will they know when their hands are in view of HoloLens so that it will see their gestures?

  • How will they know when a gesture was successfully recognized?

  • How will they find what they’re looking for, when it could be physically anywhere?

These are some of the questions I sought to explore using visual prototypes even before the hardware and platform were ready for any trials or iteration on-device.
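To make one of these questions concrete, consider “which button are they currently targeting?” In gaze-driven UI, targeting amounts to casting a ray from the head along the gaze direction and finding the nearest interactive element it hits. The sketch below is my own toy illustration, not the shipped implementation: affordances are stood in for by named spheres, and a simple ray/sphere intersection picks the closest hit.

```python
import math

def pick_target(head, gaze, affordances):
    """Return the name of the nearest affordance hit by the gaze ray, or None.

    `affordances` is a list of (name, center, radius) spheres -- a toy
    stand-in for real UI colliders. `gaze` is assumed to be unit length.
    """
    best = None
    best_t = math.inf
    for name, center, radius in affordances:
        # Standard ray/sphere intersection in vector form.
        oc = [h - c for h, c in zip(head, center)]
        b = sum(o * g for o, g in zip(oc, gaze))
        c = sum(o * o for o in oc) - radius * radius
        disc = b * b - c
        if disc < 0:
            continue  # gaze ray misses this sphere entirely
        t = -b - math.sqrt(disc)  # distance to the near intersection
        if 0 < t < best_t:
            best_t, best = t, name
    return best
```

Whatever the ray hits nearest is the element that should light up, which is exactly the feedback problem the motion studies set out to solve.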

YOU CAN TAKE IT WITH YOU

There’s something magical about a world-locked hologram, fixed in your world like a physical object that you can walk around and inspect as if it’s really there. But fixing a hologram to the real world can have drawbacks: it would be inconvenient to accidentally leave your Start menu in the kitchen and have to walk around looking for it. That’s why some holograms are designed to carefully “tag along” with the user. I didn’t invent the tag-along behavior, but I did explore some of the questions it raised:

  • How tightly should tag-along objects follow your head movement?

  • What happens when a tag-along object collides with a physical barrier?

  • If barriers force an object to be closer to you than intended, how can it be kept comfortably interactable?

The motion study below, showing an overhead view, communicates how tag-along holograms should behave as a user walks around their space.
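The essence of the behavior in that motion study can be sketched in a few lines. This is a minimal illustration under my own assumptions (the real tuning values were arrived at through the motion explorations and are not public): each frame, the hologram eases toward a point a preferred distance along the user’s gaze, and if a physical barrier is closer, the target distance is pulled in but never closer than a comfortable minimum.

```python
# Hypothetical tuning values -- stand-ins, not the shipped parameters.
FOLLOW_DISTANCE = 2.0  # preferred metres in front of the user
MIN_DISTANCE = 0.5     # closest comfortable interaction distance
SMOOTHING = 0.1        # fraction of the gap closed each frame (lazy follow)

def tag_along_step(obj_pos, head_pos, gaze_dir, wall_distance=None):
    """Advance a tag-along hologram by one frame.

    The object drifts toward a point FOLLOW_DISTANCE along the gaze ray.
    If a barrier (wall_distance) intrudes, the target is pulled in, but
    never nearer than MIN_DISTANCE so it stays comfortably interactable.
    """
    distance = FOLLOW_DISTANCE
    if wall_distance is not None:
        distance = max(MIN_DISTANCE, min(distance, wall_distance))
    target = tuple(h + g * distance for h, g in zip(head_pos, gaze_dir))
    # Ease toward the target rather than snapping, so the hologram feels
    # loosely tethered to the user instead of glued to their head.
    return tuple(o + (t - o) * SMOOTHING for o, t in zip(obj_pos, target))
```

The lazy easing answers the first question above (not tightly), and the clamped distance answers the other two: barriers pull the object in, but only down to a floor that keeps it usable.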

THE LANGUAGE OF LIGHT

Embracing the additive nature of the HoloLens display, I used light to tell the story of the connection between the user and the holographic world. Light from your gaze cursor pools out and clings to the edges of an affordance, visually reinforcing its boundaries. Light charges up, ready to burst with power when HoloLens sees that you’re ready to make a gesture. Light ripples outward in a powerful release when an air-tap is registered.
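Read as interaction states, that light language maps onto a small feedback loop: idle, targeting an affordance, ready (hand in view, light charging), and pressed (air-tap registered, light rippling out). The state machine below is my own simplified restatement of that progression, not code from the platform.

```python
from enum import Enum, auto

class FeedbackState(Enum):
    IDLE = auto()       # cursor free in space, no affordance targeted
    TARGETING = auto()  # gaze rests on an affordance; light pools at its edges
    READY = auto()      # hand detected in view; light charges up
    PRESSED = auto()    # air-tap recognized; light ripples outward

def next_state(state, gaze_on_affordance, hand_in_view, air_tap):
    """Advance the light-feedback state for one frame of input."""
    if air_tap and state is FeedbackState.READY:
        return FeedbackState.PRESSED  # release: the ripple moment
    if hand_in_view and gaze_on_affordance:
        return FeedbackState.READY    # charge up: HoloLens can see the hand
    if gaze_on_affordance:
        return FeedbackState.TARGETING
    return FeedbackState.IDLE
```

Each transition is where a motion study lives: the pooling, the charge, and the ripple are the visual answers to “did the system see me?” at each step.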

Below are some of the motion studies I created using After Effects to explore this new visual language.

I was awarded a patent for my role in developing this light-based language of visual feedback, along with director Nando Costa and technical artist Mathew Lamb. My motion studies were included in vision decks for what became the Fluent Design System, and elements of my designs for HoloLens can now be seen across Windows.