Emerging tech · Looking Glass · 2023–2024

Designing for a display
that shows depth
not just pixels.

The Looking Glass Go is a holographic display that sits on your desk. My job was to design the iOS companion app that bridges a phone camera to a 3D lenticular screen — a problem with no existing playbook.

0→1
Mobile app built from scratch
2D→3D
AI-powered depth conversion
AI
Depth map generation in cloud
v1
Shipped to hardware customers

Design constraints that
didn't exist in any playbook.

This wasn't a standard mobile app. Every design decision had to account for four constraints simultaneously — hardware, network, AI processing, and a completely unfamiliar output format.

📡

Wi-Fi only architecture

The display connects over local Wi-Fi — not Bluetooth, not USB. The app had to make an invisible connection feel reliable. Any latency in the setup flow would read as broken hardware.

🧠

AI processing latency

Converting a 2D photo into a 3D holographic depth map requires cloud inference. That takes 3–8 seconds. The design had to make that wait feel productive, not frustrating.

📺

Unfamiliar output

Users had never seen their photos as holograms before. The first "wow" moment was also the moment they needed to understand how to control the display — an onboarding challenge unlike any I'd faced.

Hardware state complexity

The device had multiple distinct states — searching, connecting, syncing, firmware updating, ready — each requiring explicit design treatment. One vague "loading" state would destroy trust.

The AI wait problem:
hide it or explain it?

The most consequential design decision was how to handle 3–8 seconds of cloud AI processing. Two fundamentally different philosophies — and only one that actually built user trust.

✕ Direction 1 — Rejected

Hidden processing

A spinner with "Processing your photo…" — no progress indication, no preview of what's being created.

  + Simple to build
  + Less UI surface area
  − Users assumed it had crashed after ~3 seconds
  − No sense of what they'd receive
  − Support tickets spiked in testing

✓ Direction 2 — Shipped

Transparent progress + preview

A progress indicator with a real-time 2D depth preview — showing the AI's work before the hologram renders on the display.

  + Users understood what was happening
  + Preview set correct expectations for depth
  + Wait felt like value being created
  − More complex state management
  − Preview could mislead if depth was wrong
"The wait isn't dead time — it's the moment users form their mental model of how holographic conversion works. Design for the understanding, not just the outcome."

Every state named.
Every state designed.

Hardware products punish vague loading states. I defined four explicit device states, each with distinct visual treatment and clear user guidance.

🔍

Connecting

Animated Wi-Fi scan with device illustration. Step-by-step guidance visible — not hidden behind a loading indicator.

Setup flow

Syncing

Progress bar with file count. User knows exactly how long and why — content is being transferred to the display's local storage.

Active
🔄

Firmware update

Full-screen interrupt with critical messaging: don't unplug the display. High-stakes state treated with appropriate visual weight.

Critical

Ready

Clean confirmation with display preview. Transition into the main experience is intentional — celebrate the connection, then get out of the way.

Complete
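The four states above amount to a small explicit state machine: every transition is named, and an unrecognized event leaves the device where it is instead of falling into a generic "loading" state. A minimal sketch, assuming hypothetical state and event names (not Looking Glass's actual API):

```typescript
// Hypothetical sketch of the device state machine described above.
// State and event names are illustrative assumptions.
type DeviceState =
  | { kind: "connecting" }                           // animated Wi-Fi scan
  | { kind: "syncing"; sent: number; total: number } // progress bar with file count
  | { kind: "firmwareUpdating" }                     // full-screen interrupt
  | { kind: "ready" };                               // clean confirmation

type DeviceEvent =
  | { kind: "connected"; filesToSync: number }
  | { kind: "firmwareOutdated" }
  | { kind: "firmwareUpdated" }
  | { kind: "fileSynced" };

function next(state: DeviceState, event: DeviceEvent): DeviceState {
  switch (state.kind) {
    case "connecting":
      if (event.kind === "connected") {
        return { kind: "syncing", sent: 0, total: event.filesToSync };
      }
      break;
    case "syncing":
      if (event.kind === "firmwareOutdated") return { kind: "firmwareUpdating" };
      if (event.kind === "fileSynced") {
        const sent = state.sent + 1;
        return sent < state.total
          ? { kind: "syncing", sent, total: state.total }
          : { kind: "ready" };
      }
      break;
    case "firmwareUpdating":
      if (event.kind === "firmwareUpdated") return { kind: "ready" };
      break;
  }
  return state; // unknown event for this state: stay put, never a vague "loading"
}
```

Because each state carries its own data (like the file count during syncing), the UI can always render something specific, which is what the "every state named, every state designed" principle demands.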

The app,
in detail.

A look at the shipped iOS interface for the Looking Glass Go companion app.

Looking Glass screen 1
Looking Glass screen 2
Looking Glass screen 3

What shipped. What I
carried forward.

0→1
Mobile app designed and shipped with no prior design system or iOS pattern library to reference
v1
Shipped to hardware customers alongside Looking Glass Go launch
4
Explicit hardware states designed — no ambiguous loading screens
Setup completion rate improved significantly after shifting from hidden to transparent AI progress

What worked

Naming every hardware state explicitly and designing each one deliberately. Hardware users are already anxious — specificity calms them.

What I'd do differently

I'd involve hardware engineers earlier in defining the state machine. Some states I designed were never reachable in practice; others I'd missed entirely surfaced in QA.

Biggest surprise

The "wow" moment wasn't the hologram — it was the depth preview during AI processing. Users were more engaged watching their photo get parsed than watching it render on the display.

What I'm taking with me

Designing for physical hardware teaches you that the app is only half the product. The connection experience, the power cable, the physical placement — it all shapes the UX.