
A VR desktop app for Meta Quest devices designed to support AI-assisted moodboarding in immersive environments. It features multimodal interaction—gestures, voice, gaze, and AI—to enable intuitive creative workflows and a strong UX.

PietroUras/VRMoodboarding

VRMoodboarding

This repository contains the codebase for my master's thesis:

Designing for Spatial Computing: Exploring Natural and Traditional Interfaces in a VR Moodboarding Tool

This thesis was submitted in partial fulfillment of the requirements for the Master’s Degree in Cinema and Media Engineering at the Politecnico di Torino.

The full document will be available at: https://webthesis.biblio.polito.it/


👀 Key Elements

| In-Headset View | External View |
| --- | --- |
| VR Moodboarding Interface | User Wearing Headset Using the App |
| Image manipulation inside the virtual moodboard | Gesture-based interaction with free hands |

🧩 Project Overview

This project presents a VR desktop application developed for Meta Quest 2, with a strong focus on user experience (UX) and usability within immersive environments.
A custom prototyping system was designed and implemented in Figma, taking the headset's field of view (FoV) into account and emphasizing a modular component structure. This approach aimed to simplify interaction without limiting creative possibilities, supporting both accessibility and expressiveness in the interface design.

This study investigates the effectiveness of multimodal interaction techniques within a spatial computing context, applied to a moodboarding system based on AI-generated images.

Three interaction modes are tested:

  • 🎛️ A traditional interface relying on virtual buttons
  • ✋ A gesture-driven configuration using hand tracking for key operations such as triggering image generation, initiating speech recognition, and navigating the UI
  • 🔁 A hybrid interface combining both modalities

In addition to evaluating task accuracy, speed, and user engagement, the study examines user behavior within the hybrid setup to identify which interaction method users prefer for each type of task, offering insight into the most suitable input strategies for creative spatial workflows.
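The app itself is a Unity project, but the core idea behind the hybrid mode — every modality (button, gesture, voice) routing to the same shared command set, while logging which modality the user actually chose — can be illustrated with a small language-agnostic sketch. This is an illustrative Python sketch only; all names (`HybridDispatcher`, the event and command names) are hypothetical and do not come from the project's codebase:

```python
from dataclasses import dataclass
from enum import Enum, auto
from typing import Callable, Dict, Tuple

class Modality(Enum):
    BUTTON = auto()
    GESTURE = auto()
    VOICE = auto()

@dataclass(frozen=True)
class InputEvent:
    modality: Modality
    name: str  # e.g. "pinch", "generate_button" (hypothetical names)

class HybridDispatcher:
    """Routes events from any modality to a shared command set and
    records which modality triggered each command — the kind of usage
    data a hybrid-mode study can analyze per task type."""

    def __init__(self) -> None:
        self._bindings: Dict[Tuple[Modality, str], str] = {}
        self._commands: Dict[str, Callable[[], None]] = {}
        self.usage: Dict[Tuple[str, Modality], int] = {}

    def register_command(self, command: str, handler: Callable[[], None]) -> None:
        self._commands[command] = handler

    def bind(self, modality: Modality, name: str, command: str) -> None:
        # Several bindings may point at the same command (hybrid mode).
        self._bindings[(modality, name)] = command

    def dispatch(self, event: InputEvent) -> bool:
        command = self._bindings.get((event.modality, event.name))
        if command is None:
            return False  # unbound input: ignore
        key = (command, event.modality)
        self.usage[key] = self.usage.get(key, 0) + 1
        self._commands[command]()
        return True
```

For example, binding both a virtual button press and a pinch gesture to the same `generate_image` command lets the `usage` counters reveal which modality participants reach for when generating images.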


🛠️ Setup and Installation

For detailed instructions on how to install and run the project locally, please refer to the Installation Guide.


🧠 Project Details

For an in-depth explanation of the system architecture and software components, please refer to the Architecture and Interaction Design Overview.

To learn more about the user experience and design workflow, see the User Experience for Immersive Moodboarding document.


📊 Study Results

A user study involving 21 participants was conducted to evaluate which interaction mode was the most efficient and preferred. The study included a detailed analysis of the custom gesture set designed specifically for this application, as well as emerging patterns in combining multiple input types in hybrid interaction modes.

For more information about the experiment and its outcomes, see the Test Results.

🙏 Acknowledgements

I would like to express my gratitude to the Politecnico di Torino for supporting this work and providing access to the Azure Speech Recognition service, which played a fundamental role in the implementation of voice-based interaction.

Special thanks also go to the following contributors and open platforms:


📄 License

This work is licensed under the Creative Commons Attribution-NonCommercial-ShareAlike 4.0 International License (CC BY-NC-SA 4.0).
This means you are free to:

  • 🔄 Share and adapt the material
  • 🧪 Use it for personal, research, or educational purposes

As long as you:

  • ✍️ Give appropriate credit
  • 🚫 Do not use it for commercial purposes
  • 📎 Distribute any derivative works under the same license

License: CC BY-NC-SA 4.0
