
LF One


YEAR 2020

I have always been fascinated by the possibility of augmenting experiences by combining multiple activities together to create a new one. This interest manifested in various projects, from a prototype for Google Glass to a hologram display for a game. After working on countless headset-related projects, including one I initiated called "ReFrame," I wanted to tackle the augmented reality (AR) headset experience at its essence.

ROLE :

Experience Architect & HCI Researcher

TEAM :

Worked with two senior Principal Engineers

KEY TECHNOLOGIES

3D Rendering

Waveguide Optics

Point Cloud Rendering

Spatial Tracking

TOOLS & SOFTWARE USED:

  • 3D Printing

  • Realsense Camera SDK

  • Unity

  • Capturing Reality

  • CloudCompare

Demystifying augmented reality to build its core architecture

Methodology & Approach

An AR headset can be broken down into three fundamental layers: the Perceptual layer, the inputs it ingests, such as cameras and microphones; the Projection layer, the information it presents to the user, such as the display and sound; and the Compute layer, the bridge between the two, which governs performance, weight, and usage time. I identified and quantified the parameters of each layer through research and by studying user workflows, expectations, and human sensory perception.
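The three-layer breakdown can be sketched as a simple data model. This is a hypothetical illustration, not the project's actual architecture; all field names and figures are assumptions chosen to show how Compute-layer parameters like power, weight, and usage time relate.

```python
from dataclasses import dataclass

# Hypothetical sketch of the three layers described above.
@dataclass
class PerceptualLayer:   # inputs the headset ingests
    cameras: int
    microphones: int

@dataclass
class ProjectionLayer:   # information presented to the user
    display_fov_deg: float
    audio_channels: int

@dataclass
class ComputeLayer:      # the bridge: performance, weight, usage time
    power_draw_w: float
    weight_g: float
    battery_wh: float

    def usage_time_h(self) -> float:
        # Usage time falls out of battery capacity over power draw.
        return self.battery_wh / self.power_draw_w

# Illustrative numbers only.
compute = ComputeLayer(power_draw_w=5.0, weight_g=120.0, battery_wh=10.0)
print(f"estimated usage time: {compute.usage_time_h():.1f} h")
```

Framing each layer as a typed record makes the trade-offs explicit: any change to the Perceptual or Projection layers shows up as a cost in the Compute layer's power, weight, or usage-time budget.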

LAYERS OF AR - An Overview of the System's Architecture 🠈

“Perception in, Projection out, and Compute is the bridge.”

— Layers of AR

Research & Prototype

I then built an end-to-end prototype incorporating components from all three layers. I measured the delta between the prototype's performance and our goals, studying the trajectory of hardware components and new algorithms to see how the landscape was evolving. I also analyzed whether current microprocessors could deliver the required KPIs and KEIs. This process sat at the intersection of design, architecture, and finance, which helped keep the Bill of Materials (BOM) cost in check.
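The delta measurement described above can be sketched as a small gap analysis. The metric names, targets, and measurements here are illustrative placeholders, not the project's real figures.

```python
# Minimal sketch of a prototype-vs-goal gap analysis.
# All targets and measurements below are illustrative assumptions.
targets  = {"render_fps": 60.0, "weight_g": 150.0, "power_w": 4.0}
measured = {"render_fps": 42.0, "weight_g": 310.0, "power_w": 7.5}

# Express each measurement as a fraction of its target so all metrics
# sit on one scale: for FPS higher is better, for the rest lower is better.
higher_is_better = {"render_fps"}

def attainment(metric: str) -> float:
    if metric in higher_is_better:
        return measured[metric] / targets[metric]
    return targets[metric] / measured[metric]

for metric in targets:
    print(f"{metric}: {attainment(metric):.0%} of goal")
```

Tracking attainment per metric over successive prototype iterations is one simple way to see whether the hardware trajectory is closing the gap fast enough.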

DEEP DIVE - A list of all the modules and the various types of sub-components in each layer 🠋

FEATURE SUITE - A family of interconnected features that work together 🠉

FIRST PROTO - Render of a Point Cloud and 3D Model 🠉

“A passthrough from a very early prototype using a testbed with waveguide optics and a rendering pipeline.”

— First Prototype

My initial prototyping process was crude, involving a lot of duct tape and 3D-printed modules. The idea was to build the entire end-to-end pipeline. The device needed to boot, run the rendering pipeline, have tracking cameras, pass-through waveguide optics, and the necessary compute to do so.

The primary objective was to quantify the consumption and delivery of specific metrics, such as weight, power input, rendering FPS, and tracking accuracy, for each module, system, or algorithm.
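Quantifying consumption per module naturally leads to a budget roll-up: each module reports what it consumes, and the totals are checked against the device-level budget. The module names and numbers below are illustrative assumptions, not the project's actual bill of materials.

```python
# Illustrative per-module budget roll-up; all values are assumptions.
modules = {
    "tracking_cameras": {"weight_g": 18, "power_w": 0.9},
    "waveguide_optics": {"weight_g": 30, "power_w": 1.2},
    "compute_board":    {"weight_g": 45, "power_w": 3.5},
    "battery":          {"weight_g": 60, "power_w": 0.0},
}
budget = {"weight_g": 180, "power_w": 6.0}

# Sum each metric across modules and flag any budget overruns.
totals = {k: sum(m[k] for m in modules.values()) for k in budget}
for key, total in totals.items():
    status = "OK" if total <= budget[key] else "OVER"
    print(f"{key}: {total} / {budget[key]} -> {status}")
```

Keeping the roll-up per module makes it immediately clear which component to redesign when a device-level target, such as total weight or power draw, is blown.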

EINRIG - Testing the second-generation prototype 🠊

DESIGN - More detailed reference designs 🠋

MICROPROCESSOR - Dedicated Compute 🠊

I also considered a standalone microprocessor to run the workload while assessing the compute requirements. This approach was aimed at producing a leaner design and reducing the device's bounding box.

Final Outcome

By the end of the project, we had a clear understanding of this space, down to each pixel. Our prototypes evolved through several iterations, and we developed many functional scenarios. The experience demystified the alchemy behind augmented reality glasses for me. After thoroughly researching the trajectory of silicon manufacturing, display, and battery technology, I could clearly visualize when this next big leap in computing would become accessible for universal adoption, which I believe will come around 2026/27.

FINAL DESIGN - The final Leaner AR HMD Design 🠋

Summary

“This project helped me demystify the alchemy of augmented reality, establishing a framework to build, measure, and analyze its core components and its trajectory.”
