DAZAI CHEN

Week 5: From Space to Album

Reframing the project as a spatial memory archive, building the interaction model for browsing and entering captured spaces.

March 2, 2026

Overview

This week marked a fundamental shift in how I think about this project. It’s no longer about building a single VR experience inside one space. It’s about demonstrating what becomes possible when spatial memory can be captured, archived, and revisited, and building the interaction model that makes this feel natural.


Goals This Week

  • Reframe thesis direction: from single space to spatial memory archive
  • Design and build archive interface (carousel + palm menu)
  • Implement full interaction flow: summon, browse, select, enter
  • Implement level streaming with a transition effect
  • Expand to 5+ captured spaces
  • Build the return-to-archive flow

The Shift: From Experience to Medium

Why This Changes Everything

Up until now, the project was framed as: “build a VR experience inside a 3DGS-captured Brooklyn apartment.” That’s a valid project, but it limits the thesis to one story in one space.

The reframe: this project is demonstrating a new format for preserving spatial memory. The Brooklyn apartment is one entry in what could be an infinite archive. The thesis argument shifts from “I made an experience” to “I’m proposing a medium and its interaction model.”

This matters because:

  • Showing multiple spaces (even briefly) makes the scalability immediately obvious
  • Viewers don’t need me to explain the concept; they see the archive and understand “any space could be here”
  • The thesis becomes about the medium’s possibility, not just one narrative

Spatial Memory Album

The viewer enters VR and encounters an archive of captured spaces. Each entry represents a different place, time, and type of memory. You browse, you choose, you step in. The moment a flat preview card expands into a full 3D space around you is the core experience.

The archive currently holds spaces from different parts of my life:

  • NYU Apartment (Day) — afternoon sunlight, the everyday rhythm of a desk and a window
  • NYU Apartment (Night) — the same room after dark, captured in a different season. Same place, different memory.
  • Taipei Home — the apartment I grew up in, a different country, a different scale of familiarity
  • High Line Pigeon Sculpture — a public landmark in Manhattan, a moment captured while passing through
  • Nankunshen Temple (南鯤鯓廟) — a temple in southern Taiwan, cultural memory rooted in community
  • Previous Apartment Rooftop — an old living space, a vantage point that no longer belongs to me

The point isn’t that these spaces need to contrast in scale or significance. They’re simply the places I’ve been, the spaces that hold something for me. The archive is a personal memory collection. Any captured space can be an entry. That’s the whole idea.


Interaction Design: Making It Feel Natural

The Palm Menu

One of the core design questions: how do you summon a UI in VR without breaking immersion?

The answer I landed on: look at your left palm. When your palm faces your face, a small menu floats above it. Tap to open the archive. This is inspired by how Meta’s own system menu works, but custom-built to avoid conflicting with it.

Why this works:

  • It’s intentional — you have to deliberately open your hand and look at it. No accidental triggers.
  • It’s embodied — the menu lives on your body, not floating in space
  • It’s extensible — the palm menu can grow to include more options (settings, back button, etc.)
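
The palm check itself can be sketched as a simple cone test: the menu appears when the left palm normal points toward the head within some angle, which is what makes the gesture deliberate. This is a minimal standalone sketch, not the actual Unreal or Meta SDK code; the vector type, function names, and the 35° threshold are all placeholders I chose for illustration.

```cpp
#include <cmath>

// Hypothetical stand-in for Unreal's FVector; the real project would
// read palm and head poses from the hand-tracking subsystem.
struct Vec3 {
    float x, y, z;
};

float Dot(const Vec3& a, const Vec3& b) {
    return a.x * b.x + a.y * b.y + a.z * b.z;
}

Vec3 Normalized(const Vec3& v) {
    float len = std::sqrt(Dot(v, v));
    return {v.x / len, v.y / len, v.z / len};
}

// True when the left palm normal points toward the head within
// maxAngleDeg. A narrow cone is what prevents accidental triggers:
// the menu only appears when you deliberately turn your palm to face you.
bool IsPalmFacingHead(const Vec3& palmNormal, const Vec3& palmPos,
                      const Vec3& headPos, float maxAngleDeg = 35.0f) {
    Vec3 toHead = Normalized({headPos.x - palmPos.x,
                              headPos.y - palmPos.y,
                              headPos.z - palmPos.z});
    float cosAngle = Dot(Normalized(palmNormal), toHead);
    return cosAngle >= std::cos(maxAngleDeg * 3.14159265f / 180.0f);
}
```

In practice this test would also be debounced over a fraction of a second before the menu fades in, so a passing hand motion never flashes the UI.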

Microgesture Navigation

Browsing the archive uses microgestures: subtle thumb movements against the index finger that the Quest 3’s hand tracking can detect. Swipe your thumb left or right along your index finger to spin the carousel; thumb-tap to select.

This is significant because microgestures are a relatively new interaction paradigm. They’re almost invisible to an observer, they don’t require large arm movements, and they feel surprisingly intuitive once you try them. For an archive that you might browse often, this low-effort interaction is essential.

The Transition Moment

Selecting a space triggers the transition: the view fades to black, the 3DGS level streams in, and the space materializes from particles as the fade clears. This “grow from nothing” moment reinforces the idea that these spaces are being reconstructed from captured data, not just teleported to.


Documentation

Niagara Particle System Test (2/25)

Early test of particle-to-mesh conversion behavior in Niagara.

Archive Interaction Demo (3/2)

Full interaction flow with 5 spaces: palm menu, carousel navigation, card selection, fade transition, and 3DGS grow effect.


Open Questions

  • How much interaction does each space need beyond navigation? Is pure spatial presence enough for some?
  • What does the “archive room” look like? Right now it’s a void. Should it have its own atmosphere?
  • For Demo Day: should I focus on polishing the flow, or on deepening the content of individual spaces?
  • What happens when a space is dynamic instead of static? 4DGS (dynamic Gaussian Splatting) could capture movement, people, weather. How does that change the experience of “revisiting”?

What’s Next

Near Term (Demo Day, March 11)

  • Return-to-archive flow: palm gesture in a space → decay effect → back to the carousel
  • Polish: card visuals, transition timing, audio
  • Begin thesis paper (8-12 pages due Week 8)

Future Direction: From Static to Dynamic

All current spaces are static 3DGS captures: frozen moments. The next step is exploring 4DGS (dynamic Gaussian Splatting), which captures movement and change over time. A space where people walk through, where weather shifts, where time passes. This would push the archive from a collection of frozen moments to a collection of living memories.


Week 5 | 2026-02-24 ~ 2026-03-02