DAZAI CHEN

Spatial Memory: NYU IDM Thesis

A VR spatial memory album: browse and step into 3DGS-captured personal spaces using hand-tracked microgestures on Meta Quest 3.

3D Gaussian Splatting · Unreal Engine · Meta Quest 3 · PuerTS · PostShot · RealityScan · ElevenLabs

Overview

My thesis explores a new format for preserving and revisiting spatial memory: a personal album of 3DGS-captured places, experienced in VR on Meta Quest 3. Each entry is a real space from my life, preserved as a navigable 3D environment. The viewer puts on a headset, learns the gesture controls through a guided onboarding, then browses an archive of photo cards organized by category. Select a card, and the flat preview dissolves into a full 3D space around you. That transition, from looking at a memory to standing inside it, is the core moment.


Why This Project

I’m a stage lighting designer. The way I perceive the world is through space: how light fills a room, how a place feels when you’re standing in it.

I’ve noticed this about myself: when I remember someone, I don’t always remember what we talked about, but I remember the scene, the space, the light, the feeling of being there. Photos and videos capture what a place looked like, but they’ve always felt like they’re missing something: the sense of actually being surrounded by a space.

3D Gaussian Splatting can preserve space itself. VR lets you step inside it. One preserves, the other lets you return. This project started from that gap, and from a personal need to hold onto the spatial quality of memory that flat media can’t carry.


Core Inquiry

One captured space is a tech demo. Multiple captured spaces, held together by a browsing and transition system, become a medium.

The question shifts from “can you revisit a memory?” to “can spatial memory be collected, archived, and revisited the way we flip through a photo album, except each page is a place you walk through?”


The Archive

The album currently holds 19 spaces across three categories:

Travel (15 spaces)

| Space | Location |
| --- | --- |
| City Square | Amsterdam |
| Beach | Okinawa |
| Beach | Santa Monica |
| Liberty Square | Taipei |
| Marina Bay Sands | Singapore |
| Mirror Sphere | Amsterdam |
| River Gorge | New Taipei |
| National Concert Hall | Taipei |
| NTU Campus | Taipei |
| Sensoji Temple | Tokyo |
| Salt Fields | Tainan |
| Boulevard | NTU |
| Canal | Wuzhen |
| High Line | New York |
| Nankunshen Temple | Tainan |

Personal (3 spaces)

| Space | Location |
| --- | --- |
| Apartment | Brooklyn |
| Rooftop | Brooklyn |
| Home | Taipei |

Motion (1 entry)

| Space | Type |
| --- | --- |
| Camel | 4DGS sequence |

The Experience

  1. Onboarding: a guided 4-step walkthrough with video tutorials and voiceover introduces the gesture controls (palm menu, swipe to browse, swipe to switch categories).
  2. Carousel: photo cards appear in a curved arc. Browse with thumb swipes left/right, switch categories with up/down. The 3DGS scene dims when the carousel is open so cards are easy to read.
  3. Transition: select a card. The carousel fades, the scene transitions, and the new 3DGS space loads around you.
  4. Exploration: look around, walk through the space. The spatial memory surrounds you.


Technical Approach

Capture: iPhone / 360 camera → RealityScan (point cloud) → PostShot (3DGS training)

Runtime: Unreal Engine 5 + PuerTS (TypeScript) + Meta ISDK (hand tracking) → Meta Quest 3

Interaction: Hand-tracked microgestures via Meta Interaction SDK. Palm menu toggle, thumb swipe navigation, ray-based card selection. All interaction logic written in TypeScript via PuerTS; Blueprint handles gesture detection.
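The division of labor described above (Blueprint detects gestures, TypeScript owns the interaction logic) can be sketched as a small router that Blueprint calls into through a PuerTS-exposed function. The gesture names, router shape, and handler registration here are illustrative assumptions, not the thesis's actual bindings.

```typescript
// Hypothetical sketch: Blueprint's gesture recognizer reports a gesture
// name, and TypeScript dispatches it to a registered handler. Gesture
// names and the router API are assumptions for illustration.
type GestureName =
  | "PalmMenuToggle"
  | "SwipeLeft"
  | "SwipeRight"
  | "SwipeUp"
  | "SwipeDown"
  | "Select";

type GestureHandler = () => void;

class GestureRouter {
  private handlers = new Map<GestureName, GestureHandler>();

  // Interaction code registers what each gesture should do.
  on(gesture: GestureName, handler: GestureHandler): void {
    this.handlers.set(gesture, handler);
  }

  // Blueprint would call this (via a PuerTS-exposed function) each time
  // its recognizer fires. Unknown gesture strings are safely ignored.
  dispatch(gesture: string): boolean {
    const handler = this.handlers.get(gesture as GestureName);
    if (!handler) return false;
    handler();
    return true;
  }
}
```

One benefit of this split is iteration speed: the gesture-to-action mapping lives entirely in TypeScript, so it can change without touching the Blueprint graph that does the low-level hand-tracking detection.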


NYU IDM MS Thesis, Spring 2026
