DAZAI CHEN

Week 1: Technical Experiments

Testing 3DGS integration with VR, relighting, and interaction systems

February 2, 2026

Overview

This week focused on technical experiments verifying the 3DGS → Unreal Engine → VR pipeline and exploring potential interaction mechanisms, using Nankunshen Temple as the test scene.


Progress

1. 3DGS Relighting Test (XVerse Plugin)

Tested dynamic relighting on 3DGS using Unreal Engine 5.5 + XVerse Plugin.

Key findings:

  • PostShot’s PLY output isn’t directly compatible with XVerse → built a converter to handle format conversion
  • Relighting enables mood/atmosphere changes without re-capturing
  • Fog and volumetric effects work well with 3DGS
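As a rough sketch of what the PLY converter does, the snippet below rewrites property names in a PLY header while leaving the binary/ASCII body untouched. The rename map is purely illustrative: the actual attribute names PostShot emits and XVerse expects are assumptions here, not documented behavior.

```python
# Minimal PLY header rewriter: renames vertex properties so a splat file
# trained in one tool can be read by another. RENAME is a hypothetical
# mapping -- substitute the names your target plugin actually expects.

RENAME = {
    b"f_dc_0": b"red",    # assumed mapping, for illustration only
    b"f_dc_1": b"green",
    b"f_dc_2": b"blue",
}

def convert_ply(data: bytes) -> bytes:
    """Rewrite property names in a PLY header; the body passes through unchanged."""
    header_end = data.index(b"end_header\n") + len(b"end_header\n")
    header, body = data[:header_end], data[header_end:]
    lines = []
    for line in header.split(b"\n"):
        parts = line.split(b" ")
        # property lines look like: b"property float f_dc_0"
        if parts and parts[0] == b"property" and parts[-1] in RENAME:
            parts[-1] = RENAME[parts[-1]]
        lines.append(b" ".join(parts))
    return b"\n".join(lines) + body
```

Because only the header is touched, the trained splat data itself is preserved byte-for-byte, which matters when the quality loss comes from re-conversion rather than renaming.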

Issues encountered:

  • The XVerse plugin converts the format again, losing quality gained during PostShot training
  • The model breaks when the relight function is enabled
  • Performance is unstable
  • Trade-off: relight capability vs. visual quality

Comparison images: Original vs. Relight + Postprocess

Potential for thesis:

  • Could relighting represent different emotional states of memory?
  • Time of day changes (morning light vs. evening) to evoke different feelings?

2. VR Integration with Micro Gesture Interaction

Built a VR experience for Meta Quest 3 with hand tracking interaction.

Implementation:

  • Unreal Engine 5 + Meta Quest 3
  • Hand tracking enabled (no controllers)
  • Micro gesture: pinch left thumb to teleport to preset locations
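The pinch-to-teleport logic can be sketched as two small functions: detect a pinch as fingertips coming within a threshold distance, then snap to the nearest preset location. This is a hedged illustration in plain Python, not the actual UE5/Meta Quest implementation; the threshold value and the joint positions are assumptions about what the hand-tracking API provides.

```python
import math

PINCH_THRESHOLD = 0.02  # metres; tuning value, assumed for illustration

def is_pinching(thumb_tip, index_tip):
    """Detect a pinch as thumb and index fingertips within a small distance."""
    return math.dist(thumb_tip, index_tip) < PINCH_THRESHOLD

def nearest_preset(player_pos, presets):
    """Choose the closest preset teleport target to the player's position."""
    return min(presets, key=lambda p: math.dist(player_pos, p))
```

Using preset targets (rather than free teleportation) keeps the logic this simple: a pinch event just triggers `nearest_preset` or steps through a curated sequence of viewpoints.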

Key findings:

  • Teleportation feels natural for exploring a captured space
  • Hand tracking adds presence without breaking immersion
  • Preset locations work better than free teleportation for guided experiences

Questions raised:

  • Should navigation be fully free or guided?
  • How does the interaction method affect the feeling of “returning” to a space?

Inspiration: This experiment directly inspired the “Memory Editor & Playback System” idea — if micro gestures can control navigation, could they also control time?


3. Third-Person Walkthrough

Tested third-person camera perspective for exploring the 3DGS scene.

Observations:

  • Third-person creates a sense of observing yourself in the space
  • Different emotional quality than first-person VR
  • Could be useful for documentation or sharing experiences

Next to explore:

  • Compare with first-person perspective in VR
  • Which perspective feels more like “returning to a memory”?

4. Particle Effects Test

Experimented with adding particle effects (dust, light rays) to enhance atmosphere.

Initial thoughts:

  • Particles can emphasize the “memory” quality — imperfect, dreamlike
  • Need to balance between enhancement and distraction

Technical Pipeline Verified

Photos/Video → PostShot (3DGS Training) → PLY Export
                        ↓
                 Format Converter
                        ↓
     Unreal Engine 5 ← XVerse Plugin (Relight)
                        ↓
                 Meta Quest 3 (VR)

Key Insights

On Relighting

Relighting isn’t just a visual effect — it could be a way to represent how the same space feels different at different moments in our memory.

On Interaction

The simplest interaction (teleportation) might be the most appropriate. Too much interaction could distract from the experience of simply “being” in a space.

On Perspective

First-person VR = “I am here again”
Third-person = “I am watching myself return”
Both have value, but they evoke very different feelings.


Emerging Idea: Memory Editor & Playback System

Based on this week’s experiments, a new direction is taking shape:

Core Concept: What if we could interact with spatial memory like a video — play, pause, rewind, fast-forward?

Research Questions

1. Interaction within Memory Space

  • How do people naturally want to interact inside their own spatial memories?
  • Micro gesture for play/pause — what does it mean to “pause” a memory?
  • Does pausing create a moment of reflection? Or break the immersion?

2. 4D Gaussian Splatting (4DGS) Possibilities

  • 4DGS captures not just space, but space over time
  • Could micro gestures control the timeline? (slide finger to scrub through time)
  • Fast-forward and rewind memory — what does this feel like experientially?
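The finger-slide scrubbing idea above can be reduced to a single mapping: horizontal fingertip displacement scaled into timeline seconds, clamped to the clip. A minimal sketch, assuming a hypothetical sensitivity constant and a per-frame displacement value from hand tracking:

```python
SCRUB_SENSITIVITY = 4.0  # seconds of timeline per metre of finger travel (assumption)

def scrub(current_time, finger_dx, clip_length):
    """Map a horizontal fingertip displacement (metres) to a new playback time,
    clamped to [0, clip_length] so the 4DGS playback never runs off the clip."""
    t = current_time + finger_dx * SCRUB_SENSITIVITY
    return max(0.0, min(clip_length, t))
```

Fast-forward and rewind then fall out naturally: positive displacement scrubs forward, negative scrubs back, and the sensitivity constant controls how “heavy” time feels under the hand.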

Multi-Sensory Integration

Beyond visual, how can other senses enhance the experience of spatial memory?

Haptics (Heat)

  • Memories often have thermal qualities — warmth of sunlight, heat from incense
  • Could thermal feedback trigger deeper emotional recall?
  • Hardware: thermal haptic devices, heated hand controllers?

Sound

  • Ambient soundscape of a space is part of the memory
  • Spatial audio in VR — sounds positioned where they were in real life
  • Could sound be “scrubbed” along with 4DGS timeline?
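Keeping sound locked to the 4DGS timeline mostly reduces to converting the timeline position into a sample range in the soundscape recording. A hedged sketch, assuming a 48 kHz capture rate:

```python
SAMPLE_RATE = 48_000  # Hz; assumed capture rate of the soundscape recording

def audio_window(timeline_seconds, window_seconds=0.25):
    """Return the (start, end) sample range to play while scrubbing,
    so the audio stays synchronized with the 4DGS timeline position."""
    center = int(timeline_seconds * SAMPLE_RATE)
    half = int(window_seconds * SAMPLE_RATE / 2)
    return max(0, center - half), center + half
```

During a scrub, playing a short window around the current position (rather than the full stream) avoids the chipmunk effect of naively speeding audio up and down with the timeline.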

Technical Research Needed

Topic            | Question
-----------------|----------------------------------------------------------
4DGS Pipeline    | How to capture and train 4DGS? What tools exist?
Timeline Control | Can 4DGS playback be controlled in real time in UE5?
Gesture Mapping  | Which gestures feel intuitive for temporal navigation?
Emotional Impact | Does controlling time change how we relate to memory?
Thermal Haptics  | What devices exist? How to sync with the visual experience?
Spatial Audio    | How to capture and play back 3D soundscapes?

Conceptual Framework

Traditional Memory: Fixed, passive, we receive it

Memory Editor: Active, controllable, we navigate it

Questions: Does control enhance connection or distance us?
           Is memory meant to be edited?

Next Steps

  • Research 4DGS tools and pipelines (Nerfies, D-NeRF, 4DGS papers)
  • Prototype timeline scrubbing gesture in VR
  • Test with a more personal space (not just a temple)
  • Experiment with audio/soundscape integration
  • Document the emotional difference between passive viewing and active control


Week 1 | 2026-01-27 ~ 2026-02-02