User Navigation Trace in 3DGS Scenes

A preview of the Playroom scene showcasing a recorded user navigation path within the 3D environment.
Abstract
3D Gaussian Splatting (3DGS) is an emerging media representation that reconstructs real-world 3D scenes in high fidelity, enabling 6-degrees-of-freedom (6-DoF) navigation in virtual reality (VR). However, developing and evaluating 3DGS-enabled applications and optimizing their rendering performance require realistic user navigation data. Such data is currently unavailable for photorealistic 3DGS reconstructions of real-world scenes. This paper introduces 👁️NavGS (EyeNavGS), the first publicly available 6-DoF navigation dataset featuring traces from 46 participants exploring twelve diverse, real-world 3DGS scenes. The dataset was collected at two sites using Meta Quest Pro headsets, recording head pose and eye gaze data for each rendered frame during free 6-DoF navigation while standing. For each of the twelve scenes, we performed careful scene initialization to correct for scene tilt and scale, ensuring a perceptually comfortable VR experience. We also release our open-source SIBR viewer software fork with record-and-replay functionalities and a suite of utility tools for data processing, conversion, and visualization. The 👁️NavGS dataset and its accompanying software tools provide valuable resources for advancing research in 6-DoF viewport prediction, adaptive streaming, 3D saliency, and foveated rendering for 3DGS scenes.
Download 👁️NavGS Dataset
The 👁️NavGS dataset includes two distinct collections, recorded independently at Rutgers University and National Tsing Hua University.
👁️NavGS Record-n-Replay Software
The 👁️NavGS Record-n-Replay Software is a fork of the SIBR viewer for 3DGS, enhanced with record-and-replay functionalities. It supports recording user navigation traces and replaying them frame by frame for visualization, video generation, and detailed offline analysis.
👁️NavGS Dataset Description
The 👁️NavGS (EyeNavGS) dataset provides 6-Degrees-of-Freedom (6-DoF) navigation traces from 46 participants exploring twelve diverse, real-world 3D Gaussian Splatting (3DGS) scenes. Data was collected using Meta Quest Pro headsets, recording per-frame head pose and eye gaze.
Dataset Overview:
- Participants: 46 individuals across two collection sites (Rutgers University and National Tsing Hua University).
- Scenes: Twelve diverse real-world 3DGS scenes (8 from the original 3DGS paper, 4 newly trained from the ZipNeRF dataset), including both indoor and outdoor environments.
- Data Collected: Per-frame head position, head orientation (quaternion), eye gaze position, eye gaze orientation (quaternion), and field-of-view (FOV) for both the left and right eyes. Timestamps are recorded in milliseconds. (A short quaternion-to-direction sketch follows this list.)
- Hardware: Meta Quest Pro headsets with eye tracking enabled. Navigation within a 3m x 3m physical play area.
- Software: Custom fork of the SIBR viewer with Record-and-Replay functionalities. OpenXR was used for VR rendering.
- Scene Initialization: Careful per-scene initialization for tilt correction, metric scale establishment, and starting viewpoint selection to ensure perceptually comfortable VR experiences.
- Data Format: Traces are stored in structured CSV files. Utility tools are provided for conversion to other formats (e.g., JSON for NeRFstudio compatibility) and visualization.
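Head and eye orientations are stored as quaternions. The minimal sketch below (an illustration, not part of the released tools) shows how a recorded quaternion could be turned into a forward direction vector for gaze or viewport analysis; the X/Y/Z/W component order and the -Z "look" axis (the OpenXR/OpenGL convention) are assumptions to verify against the dataset documentation.

import numpy as np
from scipy.spatial.transform import Rotation as R

def forward_direction(qx, qy, qz, qw):
    # SciPy expects quaternions in (x, y, z, w) order.
    rotation = R.from_quat([qx, qy, qz, qw])
    # Assumption: the head/gaze frame looks down its local -Z axis (OpenXR convention).
    return rotation.apply(np.array([0.0, 0.0, -1.0]))

# Identity orientation should yield a straight-ahead (-Z) direction.
print(forward_direction(0.0, 0.0, 0.0, 1.0))  # [ 0.  0. -1.]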
Dataset Structure:
The dataset is organized hierarchically. Each top-level folder corresponds to a specific 3DGS scene. Within each scene folder, individual user traces are stored as CSV files:
dataset/
├── truck/
│   ├── user1_truck.csv
│   ├── user2_truck.csv
│   └── ...
├── alameda/
│   ├── user1_alameda.csv
│   └── ...
└── (other scenes)
Each CSV file contains the columns described in the paper (ViewIndex, FOV1-4, Position X/Y/Z, Quaternion X/Y/Z/W, GazePos X/Y/Z, GazeQ X/Y/Z/W, Timestamp).
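As a minimal loading sketch (not one of the released utilities), the snippet below reads one trace with pandas and extracts the head trajectory in the XZ-plane, matching the trajectory previews further down. The exact column header strings are assumptions based on the field list above; adjust them to the header of the downloaded files.

import pandas as pd
import matplotlib.pyplot as plt

# Path follows the dataset layout shown above.
trace = pd.read_csv("dataset/truck/user1_truck.csv")

# Column names are assumed from the field list above; adjust to the actual CSV header.
x = trace["Position X"]
z = trace["Position Z"]

# Top-down (XZ-plane) view of the head trajectory.
plt.plot(x, z, linewidth=0.8)
plt.axis("equal")
plt.xlabel("X")
plt.ylabel("Z")
plt.title("Head trajectory, truck scene (XZ-plane)")
plt.show()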
Utility Tools:
- Conversion from virtual world coordinates to physical stage coordinates.
- Conversion tools (csv2json, json2csv) for compatibility with frameworks like NeRFstudio (see the pose-to-matrix sketch after this list).
- Eye gaze visualization scripts (e.g., projecting gaze onto video frames).
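For context on the csv2json conversion mentioned above, the core step is turning each recorded position and orientation quaternion into a 4x4 camera-to-world matrix of the kind NeRFstudio's transforms.json expects. The sketch below illustrates only that step; it is not the shipped converter, and any axis flips needed to match NeRFstudio's camera convention are left as assumptions to check against the released scripts.

import json
import numpy as np
from scipy.spatial.transform import Rotation as R

def pose_to_c2w(px, py, pz, qx, qy, qz, qw):
    # 4x4 camera-to-world matrix from a position and an (x, y, z, w) quaternion.
    c2w = np.eye(4)
    c2w[:3, :3] = R.from_quat([qx, qy, qz, qw]).as_matrix()
    c2w[:3, 3] = [px, py, pz]
    return c2w

# Hypothetical single frame; a real converter would iterate over all CSV rows
# and may need to re-orient axes for NeRFstudio's camera convention.
frame = {"transform_matrix": pose_to_c2w(0.0, 1.6, 0.0, 0.0, 0.0, 0.0, 1.0).tolist()}
print(json.dumps({"frames": [frame]}, indent=2))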
Preview of the 12 3DGS-Reconstructed Scenes
Preview of User Movement Trajectories (3 Users, XZ-plane)
Eye Gaze Video
Example replay of a recorded user navigation trace exploring the Bicycle scene, with eye gaze visualization, using the 👁️NavGS software and utility tools.
👁️NavGS BibTex
If you use the 👁️NavGS dataset or software in your research, please cite:
@article{ding2025eyenavgs,
  title={EyeNavGS: A 6-DoF Navigation Dataset and Record-n-Replay Software for Real-World 3DGS Scenes in VR},
  author={Ding, Zihao and Lee, Cheng-Tse and Zhu, Mufeng and Guan, Tao and Sun, Yuan-Chun and Hsu, Cheng-Hsin and Liu, Yao},
  journal={arXiv preprint arXiv:2506.02380},
  year={2025}
}