DAZAI CHEN

Spatial Memory: NYU IDM Thesis

Exploring human-memory interaction when memories shift from 2D images to 3D spaces.

3D Gaussian Splatting · Unreal Engine · VR · PostShot · RealityScan

Overview

My thesis explores how human-memory interaction transforms when memories shift from 2D images to 3D spaces. I am creating a VR experience that uses 3D Gaussian Splatting to let users return to, and potentially reshape, their spatial memories.


Core Inquiry

Photos only show what a place looked like — not what it felt like.

But what if we could step into a spatial memory, not just look at photos? And how would returning to this space change our relationship with memory?

3D Gaussian Splatting preserves space with high fidelity. VR lets us step inside and be enveloped. Together, they let us actually return to spatial memories.


What is “Feeling” to Me?

I’m a stage lighting designer. For me, feeling is about space.

Space is how I perceive the world and communicate feelings. But how can we save this feeling? How can we preserve this spatial experience?


Documentation


Technical Approach

Pipeline: 360° camera capture → RealityScan (point cloud) → PostShot (3DGS training) → Unreal Engine → VR headset

Current exploration: body interaction in VR. Can hand tracking or body movement change the brightness or atmosphere of a spatial memory?
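As a minimal sketch of that interaction idea (all names and ranges here are illustrative assumptions, not the actual implementation): a per-frame update could map the tracked hand's vertical position to a 0–1 brightness scalar, which the engine would then write to the splat material.

```python
def brightness_from_hand_height(hand_y: float,
                                floor_y: float = 0.0,
                                ceiling_y: float = 2.0) -> float:
    """Map a tracked hand's vertical position to a 0..1 brightness scalar.

    hand_y, floor_y, and ceiling_y are in meters (world space); values
    outside the floor..ceiling range are clamped. In a real build, hand_y
    would come from the headset's hand-tracking API each frame, and the
    result would drive a material parameter on the Gaussian splat.
    """
    t = (hand_y - floor_y) / (ceiling_y - floor_y)
    return max(0.0, min(1.0, t))

# Raising the hand brightens the remembered space; lowering it dims it.
print(brightness_from_hand_height(1.0))   # mid-height → 0.5
print(brightness_from_hand_height(-0.5))  # below floor, clamped → 0.0
```

The linear mapping is just a starting point; an eased or smoothed curve would likely feel more natural for an atmosphere control.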


Already Started

I’ve captured Nankunshen Temple using 3DGS during winter break. The final project may use this space, or a different one.

View Nankunshen 3DGS Project


NYU IDM MS Thesis, Spring 2026
