Project Title: Venus Flytrap – Mixed Reality Game Experience
Platform: Meta Quest 3
Software: Unity (with Meta XR SDK), Arduino (for physical computing integration)
OVERVIEW
This project was developed in response to a design brief focused on creating an engaging experience through animation and storytelling, with a strong emphasis on user interaction and immersive feedback. The brief encouraged experimentation with multiverse concepts—blending physical and virtual worlds—to create a compelling and thought-provoking experience. I approached this challenge by designing a mixed reality (MR) game for the Meta Quest 3, developed in Unity and combined with physical computing using Arduino. The project centers on user agency, responsive visual feedback, and the seamless integration of real-world components that dynamically respond to actions within the virtual environment.
Concept & Gameplay Summary
The game is built around a Venus Flytrap theme and designed as a seated, single-player mixed reality experience. The player is surrounded by hostile “flytrap monsters” and must defeat them to rescue a trapped fly. This fly exists not only in the virtual world but is also physically represented through a mechanical flytrap installation in the real world.
At its core, the experience explores the idea that what happens in VR doesn’t necessarily stay in VR. By linking in-game success to physical outcomes, the project questions the boundaries between virtual action and real-world consequence.

THE DETAILS
Core Mechanics
The player shoots flytrap monsters, each requiring three hits to defeat
Defeating all enemies triggers a real-world reward: the physical flytrap opens, releasing the fly
Fail condition: if monsters reach the player, health depletes and the game ends
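The three-hit rule above can be sketched as a small Unity component. This is an illustrative sketch only; the class and field names (MonsterHealth, hitsToDefeat) are assumptions, not taken from the project's actual source.

```csharp
using UnityEngine;

// Hypothetical sketch of the "three hits to defeat" rule.
public class MonsterHealth : MonoBehaviour
{
    [SerializeField] private int hitsToDefeat = 3;
    private int hitsTaken;

    // Called by the projectile system when a valid hit lands.
    public void RegisterHit()
    {
        hitsTaken++;
        if (hitsTaken >= hitsToDefeat)
        {
            // A game manager would be notified here so it can check the
            // win condition (all monsters defeated -> open the flytrap).
            Destroy(gameObject);
        }
    }
}
```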

Animation & Visual Feedback
A key focus of the project was using animation as a primary feedback system to reinforce player agency and immersion.
State-based animations: Each monster responds visually to damage using Unity’s Animator Controller, transitioning through three distinct states that represent remaining health
Colour feedback: Monsters shift from vibrant green to desaturated red as they take damage
Stylised death effects: Upon defeat, stop-motion-inspired animated flies burst from the monster’s abdomen, acting as a visual reward
Audio cues: Sound design complements animation states, strengthening the connection between player action and feedback
Together, these elements transform simple interactions into sensory-rich, meaningful moments.
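The state-based animation and colour feedback described above might look like the following sketch. It assumes an Animator Controller with an integer parameter named "Health" driving transitions between the three damage states; the parameter name and colour values are assumptions for illustration.

```csharp
using UnityEngine;

// Sketch of state-based damage feedback via Unity's Animator Controller.
public class MonsterDamageFeedback : MonoBehaviour
{
    [SerializeField] private Animator animator;
    [SerializeField] private Renderer bodyRenderer;

    private static readonly Color Healthy = Color.green;
    private static readonly Color Dying = new Color(0.7f, 0.2f, 0.2f); // desaturated red

    public void ApplyDamageState(int remainingHits, int maxHits)
    {
        // Drive the Animator into the state matching remaining health;
        // transitions between the three states key off this parameter.
        animator.SetInteger("Health", remainingHits);

        // Shift from vibrant green toward desaturated red as damage accumulates.
        float t = 1f - (float)remainingHits / maxHits;
        bodyRenderer.material.color = Color.Lerp(Healthy, Dying, t);
    }
}
```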
Technical Implementation
Unity + Meta XR SDK
The core game logic, animation systems, and mixed reality features were developed in Unity using the Meta XR SDK, enabling passthrough, spatial anchoring, and controller/hand input.
Arduino Integration
The physical flytrap installation communicates with Unity via serial communication. In-game events—such as defeating all monsters—trigger real-world servo movements.
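The Unity side of that serial link can be sketched as below. The port name, baud rate, and the "OPEN" command string are assumptions; the matching Arduino sketch would read the incoming line and drive the flytrap's servo.

```csharp
using System.IO.Ports;
using UnityEngine;

// Sketch of the Unity-to-Arduino serial bridge.
public class FlytrapSerialLink : MonoBehaviour
{
    [SerializeField] private string portName = "COM3"; // assumed port
    [SerializeField] private int baudRate = 9600;
    private SerialPort port;

    private void Awake()
    {
        port = new SerialPort(portName, baudRate);
        port.Open();
    }

    // Called when the last monster is defeated: signals the Arduino
    // to open the physical flytrap and release the fly.
    public void ReleaseFly()
    {
        if (port != null && port.IsOpen)
            port.WriteLine("OPEN");
    }

    private void OnDestroy()
    {
        if (port != null && port.IsOpen)
            port.Close();
    }
}
```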
Pathfinding with NavMesh Agents
Monsters navigate the environment using Unity's NavMesh system, steering toward the player while avoiding obstacles; contact with the player's collider triggers the fail condition.
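A minimal sketch of that chase behaviour with a NavMeshAgent follows; the component and field names are illustrative.

```csharp
using UnityEngine;
using UnityEngine.AI;

// Sketch: each monster re-targets the player every frame, with the
// baked NavMesh handling pathing and obstacle avoidance.
[RequireComponent(typeof(NavMeshAgent))]
public class MonsterChase : MonoBehaviour
{
    [SerializeField] private Transform player;
    private NavMeshAgent agent;

    private void Awake()
    {
        agent = GetComponent<NavMeshAgent>();
    }

    private void Update()
    {
        agent.SetDestination(player.position);
    }
}
```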
Projectile & Hit Detection System
Shooting mechanics were implemented in C#, instantiating projectile prefabs with physics-based force and collision detection. Damage is registered only when bullets hit the monster’s abdominal area.
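The firing and abdomen-only hit test might be sketched like this. The prefab, tag name ("MonsterAbdomen"), force value, and the RegisterHit message are all assumptions for illustration, not the project's actual identifiers.

```csharp
using UnityEngine;

// Sketch: spawn a physics-driven projectile prefab from the muzzle.
public class ProjectileGun : MonoBehaviour
{
    [SerializeField] private Rigidbody bulletPrefab;
    [SerializeField] private Transform muzzle;
    [SerializeField] private float launchForce = 20f;

    public void Fire()
    {
        Rigidbody bullet = Instantiate(bulletPrefab, muzzle.position, muzzle.rotation);
        bullet.AddForce(muzzle.forward * launchForce, ForceMode.Impulse);
    }
}

// Attached to the bullet prefab: damage counts only when the collider
// struck is tagged as the monster's abdominal area.
public class Bullet : MonoBehaviour
{
    private void OnCollisionEnter(Collision collision)
    {
        if (collision.collider.CompareTag("MonsterAbdomen"))
        {
            // Forward the hit to the monster's health script
            // (assumed to expose a RegisterHit method).
            collision.collider.SendMessageUpwards("RegisterHit",
                SendMessageOptions.DontRequireReceiver);
        }
        Destroy(gameObject);
    }
}
```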
Game State Management
Player health, enemy count, scoring, and progression are managed through a custom game state system, ensuring clear feedback and pacing.
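A custom game-state script tracking those quantities could look like the sketch below; names, starting values, and the win/lose handling are illustrative assumptions.

```csharp
using UnityEngine;

// Sketch of central game state: player health, enemy count, score,
// and the win/fail flow described above.
public class GameStateManager : MonoBehaviour
{
    [SerializeField] private int playerHealth = 3;
    private int enemiesRemaining;
    private int score;

    public void RegisterEnemy() => enemiesRemaining++;

    public void OnEnemyDefeated(int points)
    {
        score += points;
        enemiesRemaining--;
        if (enemiesRemaining <= 0)
        {
            // Win: trigger the real-world reward (open the flytrap).
            Debug.Log("All monsters defeated - releasing the fly.");
        }
    }

    public void OnPlayerHit()
    {
        playerHealth--;
        if (playerHealth <= 0)
        {
            // Fail condition: monsters reached the player.
            Debug.Log("Game over.");
        }
    }
}
```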