Reframe
YEAR: 2019
One project at Intel asked me to truly venture into the unknown. My team was called "Next Generation Pathfinding," and our charter was to map terra incognita. I had carefully assembled a team of some of the smartest people I knew at Intel to work on a project that had been brewing in my mind: the Reality Extension Framework, or Reframe for short. It was a framework that encapsulated the baseline functionality needed to support a plethora of augmented reality experiences.
ROLE:
Experience Architect & HCI Researcher
TEAM:
Led a team of two AI engineers, one graphics engineer, and two software engineers
KEY TECHNOLOGIES:
Augmented Reality
Spatial AI Analytics
3D Interface
5G
TOOLS & SOFTWARE USED:
Unity
Adobe Creative Suite
Capturing Reality
CloudCompare
BLK360
“An AI spatial framework for augmented reality that extends reality itself”
Methodology & Approach
The way to pull it off was to bring together multiple cutting-edge technologies that existed only in labs at the time and make them work together like the finely crafted gears of a watch. The technologies at play included an AR headset, 5G, edge computing, AI, spatial 3D scene reconstruction, and software virtualization, to name a few. The idea was to have an AR headset work alongside an edge computing server over a 5G network to run a custom suite of spatial analytics features.
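In broad strokes, the split works like this: the headset captures its pose and camera frames, ships them to the edge server over the 5G link, and renders whatever analytics come back as overlays. The sketch below illustrates that round-trip in Python; every name in it (FramePacket, EdgeAnalytics, hmd_loop) is a hypothetical stand-in, not Reframe's actual code.

```python
# A minimal sketch (not Reframe's real implementation) of the headset-to-edge split:
# the HMD streams pose and frame data, the edge server runs the heavier spatial
# analytics, and results come back to be rendered on-device.

from dataclasses import dataclass

@dataclass
class FramePacket:
    timestamp: float   # capture time on the HMD
    pose: tuple        # 6DoF pose: (x, y, z, roll, pitch, yaw)
    rgb_frame: bytes   # compressed camera frame to analyze

@dataclass
class AnalyticsResult:
    timestamp: float
    labels: list       # e.g. objects detected in the user's view
    anchors: list      # world-space anchors to render as overlays

class EdgeAnalytics:
    """Stands in for the AI/analytics suite running on the edge server."""
    def process(self, packet: FramePacket) -> AnalyticsResult:
        # The real system would run neural inference and spatial queries against
        # the reconstructed scene; here we just echo a placeholder result.
        return AnalyticsResult(packet.timestamp,
                               labels=["object"],
                               anchors=[packet.pose[:3]])

def hmd_loop(edge: EdgeAnalytics, packets):
    """Simulates the HMD side: send each frame to the edge, render the reply."""
    for packet in packets:
        result = edge.process(packet)  # would be a 5G network round-trip
        print(f"t={result.timestamp:.2f}s: overlay {result.labels} at {result.anchors}")

if __name__ == "__main__":
    frames = [FramePacket(t * 0.033, (0.0, 1.6, t * 0.1, 0, 0, 0), b"") for t in range(3)]
    hmd_loop(EdgeAnalytics(), frames)
```

Keeping the heavy compute on the edge is what lets a lightweight headset feel responsive: the device only captures and renders, while inference and scene queries happen server-side within the latency budget the 5G link allows.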
PROTOTYPE - The First Prototype 🠉
“The first prototype tracked the user's position within a sparse point cloud as they moved around wearing a head-mounted display (HMD).”
— First Prototype
CORE - Core Modules of the Project 🠉
Research & Prototype
The architecture had three primary features: VNS (Visual Navigation System), which gave millimeter-level 6DoF positioning in real time without GPS; VAS (Visual Analytic System), which ran AI algorithms over what the user was seeing; and SLS (Shared Location System), which synchronized two or more instances of the same scenario. The entire software suite was encapsulated in a 3D application built with a game engine, so every bit of it could be visualized on the server, while a companion app on the headset gave the local view. That server-side view was a display of pure capability, a testament to all the gears of the "clockwork architecture" working in sync.
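To make the module split concrete, here is a minimal sketch of how VNS, VAS, and SLS could be composed for a single headset session, assuming a simple per-frame pipeline. The class names match the modules above, but the methods, data, and structure are illustrative, not the project's real API.

```python
# Hypothetical composition of the three Reframe modules for one headset session.

class VNS:
    """Visual Navigation System: GPS-free 6DoF positioning."""
    def locate(self, sensor_frame):
        # Would fuse visual features against the point-cloud map; stubbed here.
        return (0.0, 1.6, 0.0, 0.0, 0.0, 0.0)

class VAS:
    """Visual Analytic System: AI analytics over the user's current view."""
    def analyze(self, sensor_frame, pose):
        return {"pose": pose, "detections": []}

class SLS:
    """Shared Location System: keeps multiple sessions in the same scenario."""
    def __init__(self):
        self.sessions = {}
    def sync(self, session_id, state):
        self.sessions[session_id] = state
        return self.sessions  # every client can see every other client's state

class ReframeRuntime:
    """Ties the modules together for one tick of one headset session."""
    def __init__(self):
        self.vns, self.vas, self.sls = VNS(), VAS(), SLS()
    def tick(self, session_id, sensor_frame):
        pose = self.vns.locate(sensor_frame)
        analytics = self.vas.analyze(sensor_frame, pose)
        return self.sls.sync(session_id, analytics)

if __name__ == "__main__":
    runtime = ReframeRuntime()
    print(runtime.tick("hmd-1", sensor_frame=None))
```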
3D MAPPING - Hyper-dense point cloud scans of the environment captured with LiDAR scanners 🠋
HMD UI - The HMD (Head-Mounted Display) user interface 🠊
ARCHITECTURE - Semi-detailed Architecture of the Modules and Their Functionalities 🠋
Final Outcome
This project was truly remarkable, as it not only pushed the boundary of what could be achieved technologically but also gave me a mental framework for how to architect all these systems together. It showed me how to model them from day one and gave me the ability to think architecturally about any subject, be it research, HCI, or AI. Reframe genuinely made me start thinking like an architect.
FINAL INTERFACE - The final user interface on the server displays a 3D scene, paired with the active, tracked HMDs 🠋
“The framework is adaptable to diverse environments and a variety of user scenarios.”
— Scenario #2
SCENE #2 - A Different Environment 🠊
Below is a live capture of a demo presented in Santa Clara to a limited audience, including Intel's CEO and board members. The purpose was to showcase the framework's AI, analytics, and multi-user features running on a pre-release 5G network.

