UNREAL ENGINE + ARDUINO UNO
REAL-TIME ANIMATION INSTALLATION
INTERACTION DESIGN, UX/UI FOR THE METAVERSE ERA
THE VIDEO OVERVIEW
THE MAKING OF
OVERVIEW

When Worlds Collide is an interactive installation that spans two interconnected realms:
the physical world, navigated through an Arduino UNO microcontroller mounted beneath a table and connected to a custom circuit, and the virtual world, built in Unreal Engine and running in real time. The transition between these worlds is enabled by sending data from the physical interface into the virtual environment via a serial communication plugin.
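On the wire, a serial link is just a byte stream, so the Arduino side typically frames each batch of readings as one delimited line that the plugin can split into complete packets. The sketch below shows that framing step in plain C++; the function name, the comma-separated format, and the value order are illustrative assumptions, not taken from the project's actual code.

```cpp
#include <sstream>
#include <string>
#include <vector>

// Frame one set of raw sensor readings as a comma-separated line ending
// in '\n', so the receiver can split the serial byte stream back into
// complete packets. Format and value order are illustrative assumptions.
std::string frameReadings(const std::vector<int>& readings) {
    std::ostringstream out;
    for (std::size_t i = 0; i < readings.size(); ++i) {
        if (i > 0) out << ',';
        out << readings[i];
    }
    out << '\n';  // newline marks the end of one packet
    return out.str();
}
```

On the Arduino itself the same framing is usually done with `Serial.print` calls separated by commas and closed with `Serial.println`.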
The piece offers a gamified experience in which the audience interacts with capacitive touch sensors, proximity sensors, a stretch sensor, and a grip sensor, bringing back the sense of physicality often missing in immersive media accessed through HMDs (head-mounted displays). As Slater argues in his work on Place Illusion and Plausibility, virtual interactions can feel invalid when users touch virtual objects yet receive no tactile feedback. This installation reintroduces physical engagement to address that gap.
In the virtual world, characters inhabit a theatre-like scene. They begin in an idle, subtle breathing state and remain so until user input is detected from the sensors on the table. Once activated, the characters move according to the game logic: Unreal Engine retrieves data from Arduino, checks predefined conditions, and calls the corresponding animation blueprints—specifically, the relevant states within each character’s animation state machine.
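That check-conditions-then-pick-a-state step can be sketched as a small selection function. The state names, the two inputs, and the threshold below are placeholders for illustration only; in the actual project this logic lives in Unreal's blueprint graphs, not in C++.

```cpp
// Possible animation states. Names and the threshold are placeholders,
// not the project's actual blueprint state names.
enum class CharState { Idle, Walking, Running };

// Rough analogue of the blueprint condition check: remain in the idle
// breathing state until sensor input is detected, then choose a state
// from the cleaned readings.
CharState selectState(bool toggleOn, int proximity) {
    if (!toggleOn) return CharState::Idle;        // no input: keep breathing
    return proximity > 700 ? CharState::Running   // hand close to the sensor
                           : CharState::Walking;  // toggle on, hand far away
}
```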
The final work is not a rendered video but a packaged Unreal Engine application for Windows. It requires a live Arduino connection using the exact COM port and baud rate specified in the Unreal plugin blueprint. These settings must match the Arduino code uploaded to the microcontroller and the wiring of the custom circuit, both of which are explained in the making-of video.
This is a data-driven, real-time application, where character movement responds directly to user interaction with physical sensors. While the animation states are predefined, Unreal's blend spaces allow for real-time interpolation based on incoming data. One example, shown on the poster, is a toggle sensor that makes two central characters begin walking. While the toggle remains on, a neighbouring proximity sensor controls the interpolation between walking in place, walking forward, running, and finally speed-running. As the user's hand gradually approaches the cushion, the character accelerates until full contact triggers a "Naruto-like" sprint.
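The proximity-to-speed mapping can be thought of as two steps: normalize the raw reading into the 0..1 input axis of a blend space, then interpolate across the four locomotion stages. A minimal sketch, assuming a 10-bit analog range and illustrative stage speeds (neither is taken from the project files):

```cpp
#include <algorithm>

// Normalize a raw proximity reading into a 0..1 blend alpha, like the
// input axis of an Unreal blend space. minRaw/maxRaw come from the
// per-sensor calibration; a 0..1023 10-bit range is assumed below.
float proximityToAlpha(int reading, int minRaw, int maxRaw) {
    float t = float(reading - minRaw) / float(maxRaw - minRaw);
    return std::clamp(t, 0.0f, 1.0f);
}

// Piecewise-linear interpolation across the four stages described above:
// walk in place -> walk forward -> run -> sprint. Speeds are illustrative.
float alphaToSpeed(float alpha) {
    const float stages[4] = {0.0f, 150.0f, 450.0f, 900.0f};
    float x = alpha * 3.0f;       // position along the three segments
    int i = std::min(int(x), 2);  // segment index, 0..2
    float f = x - float(i);       // fraction within that segment
    return stages[i] + f * (stages[i + 1] - stages[i]);
}
```

Inside Unreal, the second step is what the blend space itself performs; here it is spelled out only to show the shape of the interpolation.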
To achieve this, each sensor had to be calibrated. I measured their output ranges, mapped those ranges in Arduino, cleaned the data, sent it to Unreal, stored it globally, and then retrieved it locally within the blueprints.
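The map-and-clean step on the Arduino side usually amounts to a linear rescale (in the spirit of Arduino's `map()`) plus a smoothing filter. The helpers below are a minimal sketch under assumed ranges; the actual calibration endpoints and smoothing factor would come from the measurements described above.

```cpp
// Rescale a raw reading from its measured range to an output range,
// matching the integer arithmetic of Arduino's map(). The endpoints
// would come from per-sensor calibration.
long mapRange(long x, long inMin, long inMax, long outMin, long outMax) {
    return (x - inMin) * (outMax - outMin) / (inMax - inMin) + outMin;
}

// Exponential smoothing to clean jitter from analog readings before
// they are sent over serial. alpha in (0,1]; smaller means smoother.
float smoothReading(float previous, float raw, float alpha) {
    return previous + alpha * (raw - previous);
}
```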
In this sense, the installation functions as data visualisation through digital body motion.
The project explores the role of animation in user experience, focusing on the visual feedback loop generated by user actions. Each action triggers a corresponding animated reaction, reinforcing the user’s sense of agency.
The interaction design is grounded in the motility of the user’s physical body, which is fundamental in VR and essential for enhancing avatar fidelity.
Ultimately, this work proposes a novel interaction model for the metaverse era—one that moves beyond keyboards and mice and instead relies on the user’s body, while still maintaining physical contact with an interface, either directly (touch) or indirectly (proximity).