Started on 07/22/2024
32 hours in total
An IoT input/output device equipped with RGB LEDs and various sensors, enabling advanced gesture recognition.
The Cube is a small but mighty customizable control and feedback device: it can notify a user by lighting up its LEDs in any color, and it can be interacted with to perform an action.
The Cube can be controlled via the web interface, allowing the user to change the color of the LEDs. It can also detect the amount of light in the room. In the future I would like to use this to adjust the brightness of the LEDs based on the ambient brightness.
The Cube is able to recognize customizable gestures that can perform actions like sending a text or controlling a light. At the moment customization requires new code to be uploaded to the Cube. However, in the future I would like to add a simple dashboard that allows adding, editing, and removing gestures.
Physically, the Cube is an old Minecraft Redstone Ore desk toy that I gutted, replacing the internals with a microcontroller and a myriad of sensors.
The Cube has a lot going on under the hood. When it's powered on it goes through a series of setup steps: it connects to the WiFi network specified in the .env file (this can be a phone hotspot), waits for an NTP time sync, reads the settings and certificates from the .env file that let it securely connect to HiveMQ, and calibrates its motion sensor. Or, from the Cube's point of view:
Hello! :)
The Cube: A project by Daniel Stoiber.
https://danielstoiber.com/project/cube
Starting setup...
Connecting to [NETWORK NAME]
..........
Connected to [NETWORK NAME]
IP address: x.x.x.x
Waiting for NTP time sync: ....
PST Mon Jul 22 08:35:00 2024
Number of CA certs read: 171
Starting MPU6050 Calibration in 3 seconds...
Calibrating MPU6050...
Calibration offsets: aX: #.##, aY: #.##, aZ: #.##, gX: #.##, gY: #.##, gZ: #.##
Calibration complete.
Setup complete!
Connected!
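The "Calibration offsets" line in the log comes from averaging sensor readings while the Cube sits still. A minimal sketch of that idea, assuming a simple averaging approach (the function and type names here are mine, not the actual firmware's):

```cpp
#include <array>
#include <vector>

// One raw IMU sample from the MPU6050: {aX, aY, aZ, gX, gY, gZ}.
using Sample = std::array<float, 6>;

// Average a batch of samples taken while the Cube sits still; the result is
// the per-axis offset to subtract from future readings. (Sketch only: a real
// MPU6050 routine would also account for gravity, i.e. the 1 g on aZ.)
Sample computeOffsets(const std::vector<Sample>& restSamples) {
    Sample offsets{};  // zero-initialized accumulators
    for (const Sample& s : restSamples)
        for (int axis = 0; axis < 6; ++axis) offsets[axis] += s[axis];
    for (int axis = 0; axis < 6; ++axis)
        offsets[axis] /= static_cast<float>(restSamples.size());
    return offsets;
}
```

On the real device, the offsets would then be subtracted from every subsequent reading before any gesture matching happens.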
Once the setup is complete the Cube starts listening for gestures. I went through many iterations of the gesture recognition algorithm before I settled on the current one. I tried everything from simple if statements with some thresholds to an attempt at TinyML. In the end I settled on something somewhere in the middle. In an attempt to demystify the gesture recognition system, I have sketched out how the program works in the diagram below. The current algorithm works as follows:
{aX, aY, aZ, gX, gY, gZ}
Figure 1: Cube gesture recognition program design diagram.
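Since only the sample format {aX, aY, aZ, gX, gY, gZ} survives in this writeup, here is a hedged sketch of what a "middle ground" recognizer could look like: summarize a window of IMU samples into per-axis features, then score them against stored gesture templates with tolerances. All names, structures, and thresholds below are illustrative guesses, not the Cube's actual code:

```cpp
#include <array>
#include <cmath>
#include <string>
#include <vector>

using Sample = std::array<float, 6>;  // {aX, aY, aZ, gX, gY, gZ}

// A gesture template: the expected per-axis mean over a window of samples,
// plus a per-axis tolerance. (Illustrative; real templates could be richer.)
struct GestureTemplate {
    std::string name;
    Sample expectedMean;
    Sample tolerance;
};

// Per-axis mean of a window of samples.
Sample windowMean(const std::vector<Sample>& window) {
    Sample mean{};
    for (const Sample& s : window)
        for (int a = 0; a < 6; ++a) mean[a] += s[a];
    for (int a = 0; a < 6; ++a) mean[a] /= static_cast<float>(window.size());
    return mean;
}

// Return the name of the first template whose tolerances all hold, or "" if
// nothing matches -- a middle ground between raw if-statement thresholds and
// a trained model: the data drives the match, but no ML is involved.
std::string recognize(const std::vector<Sample>& window,
                      const std::vector<GestureTemplate>& templates) {
    Sample mean = windowMean(window);
    for (const GestureTemplate& t : templates) {
        bool match = true;
        for (int a = 0; a < 6; ++a)
            if (std::fabs(mean[a] - t.expectedMean[a]) > t.tolerance[a])
                match = false;
        if (match) return t.name;
    }
    return "";
}
```

A dashboard like the one described earlier could then add or remove gestures just by editing the template list, without reflashing any matching logic.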
This project started off as a fun side project to see if I could modernize the original desk toy. Somewhere along the way it turned into so much more. I identified a real problem, and realized that this might be my solution.
Use case: You lift the cube and place it back down to enter brightness control mode. Moving the cube forward/right then increases the brightness of another light in the room, while moving it backward/left decreases the brightness.
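That use case reads naturally as a tiny state machine: a lift-and-replace gesture toggles brightness mode, and directional gestures then nudge the brightness. A sketch of the idea, where the gesture names, 10% step size, and starting brightness are all my own assumptions:

```cpp
#include <algorithm>
#include <string>

// Sketch of the brightness-control mode from the use case above.
class BrightnessMode {
public:
    // Feed in each recognized gesture; returns the brightness (0-100)
    // that would be sent to the other light.
    int onGesture(const std::string& gesture) {
        if (gesture == "lift-and-replace") {
            active_ = !active_;  // enter or leave brightness control mode
        } else if (active_ && (gesture == "forward" || gesture == "right")) {
            brightness_ = std::min(100, brightness_ + 10);
        } else if (active_ && (gesture == "backward" || gesture == "left")) {
            brightness_ = std::max(0, brightness_ - 10);
        }
        // Gestures outside brightness mode are left for other actions.
        return brightness_;
    }
    bool active() const { return active_; }

private:
    bool active_ = false;
    int brightness_ = 50;  // assumed starting point
};
```

Keeping the mode toggle separate from the directional gestures means the same forward/backward motions stay free for other actions when brightness mode is off.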
© 2026 Daniel Stoiber