Check out these Arduino-powered research projects from CHI 2024
The Association for Computing Machinery (ACM) held its annual Conference on Human Factors in Computing Systems (CHI) in Hawaii this year, focusing on the latest developments in human-computer interaction. Students from universities all across the world attended the event and showcased how their devices and control systems could revolutionize how we interact with technology in both real-world and virtual environments. These 12 projects presented at CHI 2024 feature Arduino at their core and demonstrate just how versatile the hardware can be.
MouseRing
First on the list is MouseRing, from students at Tsinghua University in Beijing, which aims to give users the ability to precisely control a mouse cursor with only one or two inertial measurement units (IMUs). Worn as a ring on the index finger, the MouseRing provides motion data, collected via an Arduino UNO Rev3, that was used both to train a classification neural network and to model the finger's kinematics for fine-grained cursor manipulation.
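The team's exact sensing pipeline isn't detailed here, but as a rough illustration of the data-collection side, here is a minimal sketch that streams raw accelerometer and gyroscope samples over serial for a host-side classifier to consume. The specific IMU (an MPU-6050 on I2C) and the CSV format are assumptions for this example, not the MouseRing hardware.

```cpp
// Illustrative only: stream raw samples from a hypothetical MPU-6050 IMU
// over serial so a host-side classifier can log and consume them.
#include <Wire.h>

const uint8_t IMU_ADDR = 0x68;  // default MPU-6050 I2C address (assumption)

void setup() {
  Serial.begin(115200);
  Wire.begin();
  Wire.beginTransmission(IMU_ADDR);
  Wire.write(0x6B);  // PWR_MGMT_1 register
  Wire.write(0);     // wake the sensor from sleep
  Wire.endTransmission(true);
}

void loop() {
  Wire.beginTransmission(IMU_ADDR);
  Wire.write(0x3B);  // first of 14 accel/temp/gyro data registers
  Wire.endTransmission(false);
  Wire.requestFrom(IMU_ADDR, (uint8_t)14);

  uint8_t raw[14];
  for (int i = 0; i < 14; i++) raw[i] = Wire.read();

  int16_t ax = (raw[0] << 8) | raw[1];
  int16_t ay = (raw[2] << 8) | raw[3];
  int16_t az = (raw[4] << 8) | raw[5];
  int16_t gx = (raw[8] << 8) | raw[9];    // bytes 6-7 are temperature, skipped
  int16_t gy = (raw[10] << 8) | raw[11];
  int16_t gz = (raw[12] << 8) | raw[13];

  // One CSV line per sample; the host logs these for training and inference
  Serial.print(ax); Serial.print(',');
  Serial.print(ay); Serial.print(',');
  Serial.print(az); Serial.print(',');
  Serial.print(gx); Serial.print(',');
  Serial.print(gy); Serial.print(',');
  Serial.println(gz);

  delay(10);  // roughly 100 Hz sampling
}
```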
MobileGravity
Because objects in virtual reality are only as heavy as the controller, simulating weight has always presented a challenge, which is why five students from the University of Regensburg in Germany devised their MobileGravity concept. With it, the user places a tracked object onto a base station, where an Arduino Micro quickly pumps water into, or extracts it from, the object to change its weight.
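As a sense of how a microcontroller might handle that water transfer, here is a minimal sketch that drives a small reversible pump through an H-bridge for a commanded duration. The pin choices, the serial command format, and the idea that pump time maps to transferred mass are all assumptions for this example rather than MobileGravity's actual design.

```cpp
// Illustrative only: run a reversible pump to add or remove water on command.
const int PUMP_IN1 = 5;   // H-bridge input A (hypothetical)
const int PUMP_IN2 = 6;   // H-bridge input B (hypothetical)

void setup() {
  Serial.begin(9600);
  pinMode(PUMP_IN1, OUTPUT);
  pinMode(PUMP_IN2, OUTPUT);
}

void pumpFor(long ms, bool fill) {
  digitalWrite(PUMP_IN1, fill ? HIGH : LOW);
  digitalWrite(PUMP_IN2, fill ? LOW : HIGH);
  delay(ms);
  digitalWrite(PUMP_IN1, LOW);  // stop the pump
  digitalWrite(PUMP_IN2, LOW);
}

void loop() {
  // Commands from the VR host, e.g. "F1500" = fill 1500 ms, "D800" = drain 800 ms
  if (Serial.available()) {
    char cmd = Serial.read();
    long ms = Serial.parseInt();
    if (cmd == 'F') pumpFor(ms, true);
    if (cmd == 'D') pumpFor(ms, false);
  }
}
```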
AirPush
Another virtual reality device, AirPush, is a fingertip-worn haptic actuator that gives wearers force feedback in up to eight directions and at five different levels of intensity. Through its system of an Arduino UNO, an air compressor, and dual DC motors, this apparatus from students at the Southern University of Science and Technology in Shenzhen can apply precise pressure to specific areas around the finger for use in games or training.
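One way to picture the control problem is a firmware loop that turns a direction index (0-7) and an intensity level (1-5) into drive signals. The sketch below is purely illustrative: the pins, the serial command format, and the idea of aiming with two PWM outputs while throttling the compressor are assumptions, not how AirPush actually steers its airflow.

```cpp
// Illustrative only: map an 8-way direction and 5-level intensity command
// onto motor and compressor drive signals.
const int MOTOR_X_PWM = 9;      // hypothetical motor aiming the nozzle in X
const int MOTOR_Y_PWM = 10;     // hypothetical motor aiming the nozzle in Y
const int COMPRESSOR_PWM = 11;  // hypothetical compressor drive pin

void setup() {
  Serial.begin(9600);
  pinMode(MOTOR_X_PWM, OUTPUT);
  pinMode(MOTOR_Y_PWM, OUTPUT);
  pinMode(COMPRESSOR_PWM, OUTPUT);
}

void loop() {
  // Expected command: "<direction 0-7> <level 1-5>", e.g. "3 5"
  if (Serial.available()) {
    int dir = Serial.parseInt();
    int level = Serial.parseInt();

    // Eight compass-style directions, 45 degrees apart
    float angle = dir * 45.0 * DEG_TO_RAD;
    analogWrite(MOTOR_X_PWM, (int)(127 + 127 * cos(angle)));
    analogWrite(MOTOR_Y_PWM, (int)(127 + 127 * sin(angle)));

    // Five intensity levels spread across the compressor's PWM range
    analogWrite(COMPRESSOR_PWM, map(level, 1, 5, 50, 255));
  }
}
```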
Robotic Metamaterial
A Robotic Metamaterial, as described by students at Carnegie Mellon University, is a structure built from repeating cells that, on their own, cannot accomplish much, but when combined in specific configurations are able to carry out very complex tasks. Some of the Arduino Mega 2560-powered cells are able to actuate, sense angles, or enable capacitive touch interactions, thus letting a lattice of cells become a capable robot.
MagPixel
Instead of using pneumatics to bend materials, this team of students from Zhejiang and Tongji universities in China has designed a modular, flexible, magnet-based material they call MagPixel. In one demonstration, a digital clock, an Arduino UNO energizes electromagnets within a ring to move the hour “hand” around the clock face.
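To make the clock idea concrete, here is a minimal sketch that steps a magnetic “hand” around a ring of 12 electromagnets, energizing one coil at a time. The driver pins, the one-coil-per-hour layout, and the settling delay are assumptions for this example; the real MagPixel clock may be wired quite differently.

```cpp
// Illustrative only: pull an hour "hand" around a ring of 12 electromagnets.
const int COIL_PINS[12] = {2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12, 13};
int currentHour = 0;

void setup() {
  for (int i = 0; i < 12; i++) {
    pinMode(COIL_PINS[i], OUTPUT);
    digitalWrite(COIL_PINS[i], LOW);
  }
}

void moveHandTo(int hour) {
  // Energize coils one by one between the old and new positions so the
  // magnetic hand is pulled smoothly around the face.
  int pos = currentHour;
  while (pos != hour) {
    digitalWrite(COIL_PINS[pos], LOW);
    pos = (pos + 1) % 12;
    digitalWrite(COIL_PINS[pos], HIGH);
    delay(300);  // give the hand time to settle on the next coil
  }
  currentHour = hour;
}

void loop() {
  // Demo: advance one hour position every few seconds
  moveHandTo((currentHour + 1) % 12);
  delay(3000);
}
```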
ArmDeformation
Proprioception, the ability to inherently sense where our limbs are in 3D space, is vital to how we navigate the world, but VR can limit this ability. The ArmDeformation project from a group of Southern University of Science and Technology students in Shenzhen rests on the wearer’s forearm and moves the skin beneath it to simulate an external force, thanks to an Arduino Mega and several DC motors.
VRScroll
Grasping and moving objects is already quite the task in VR, but sketching a picture takes it to a whole other level of difficulty. Three students from the University of Virginia have therefore developed a shape-changing device that attempts to match the forms present in a 3D world for the purpose of sketching. Once a piece of paper is attached to its surface, the VRScroll bends into the correct shape using its two Arduino UNO WiFi Rev2 boards and six motors.
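A simplified way to think about the shape-matching step is six actuators each moving to a target angle sent by the VR application. The sketch below treats them as hobby servos on assumed pins and uses an assumed six-angles-per-line serial command; the real VRScroll's motors and control scheme may differ.

```cpp
// Illustrative only: set six surface-bending actuators to host-supplied angles.
#include <Servo.h>

const int SERVO_PINS[6] = {3, 5, 6, 9, 10, 11};  // hypothetical pin choices
Servo joints[6];

void setup() {
  Serial.begin(115200);
  for (int i = 0; i < 6; i++) {
    joints[i].attach(SERVO_PINS[i]);
    joints[i].write(90);  // start with the surface flat
  }
}

void loop() {
  // Expected command: six angles (0-180) per line, e.g. "90 120 140 140 120 90"
  if (Serial.available()) {
    for (int i = 0; i < 6; i++) {
      int angle = Serial.parseInt();
      joints[i].write(constrain(angle, 0, 180));
    }
  }
}
```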
Desktop Biofibers
As an alternative to plastic-based fibers for smart textile prototyping and production, four University of Colorado Boulder students built an open-source machine capable of spinning gelatin-based fibers in a compact footprint. Leveraging an Arduino Mega, the machine spins biofibers through its heated syringe under G-code control, creating a strong thread into which wearable sensors could potentially be integrated.
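One piece of that firmware is keeping the syringe at extrusion temperature. Here is a minimal hysteresis-control sketch as an illustration; the thermistor constants, pins, and 65°C setpoint are assumptions, and the real machine also parses G-code to coordinate its extrusion and spooling moves.

```cpp
// Illustrative only: hold a syringe heater near a target temperature.
const int THERMISTOR_PIN = A0;   // hypothetical 10k NTC thermistor divider
const int HEATER_PIN = 8;        // hypothetical MOSFET driving the heater
const float TARGET_C = 65.0;     // example gelatin extrusion temperature
const float HYSTERESIS_C = 2.0;

float readTempC() {
  // Beta-equation approximation for a 10k NTC with a 10k series resistor (assumed)
  float r = 10000.0 / (1023.0 / analogRead(THERMISTOR_PIN) - 1.0);
  float t = 1.0 / (log(r / 10000.0) / 3950.0 + 1.0 / 298.15);
  return t - 273.15;
}

void setup() {
  pinMode(HEATER_PIN, OUTPUT);
  Serial.begin(115200);
}

void loop() {
  float temp = readTempC();
  if (temp < TARGET_C - HYSTERESIS_C) digitalWrite(HEATER_PIN, HIGH);
  if (temp > TARGET_C + HYSTERESIS_C) digitalWrite(HEATER_PIN, LOW);
  Serial.println(temp);  // report back for logging or plotting
  delay(250);
}
```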
ExBreath
The art of communication relies on many signals beyond speech, and ExBreath, from students at Tsinghua University in Beijing, harnesses the wearer’s breathing pattern to help them communicate. An Arduino Nano continuously monitors the wearer’s breathing via a bend sensor and translates it into signals for a micro air pump, inflating small, externally worn air sacs to reflect the sensed breathing pattern.
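A stripped-down version of that sense-then-inflate loop might look like the sketch below, which maps a bend sensor reading onto the pump's drive level and vents the sacs when the reading falls. The pins, calibration range, and the presence of a vent valve are assumptions for this illustration.

```cpp
// Illustrative only: inflate external air sacs in time with a bend-sensed breath.
const int BEND_PIN = A0;   // bend sensor in a voltage divider (hypothetical)
const int PUMP_PIN = 5;    // PWM pin driving the pump's MOSFET (hypothetical)
const int VALVE_PIN = 4;   // vent valve so the sacs can deflate (hypothetical)

void setup() {
  pinMode(PUMP_PIN, OUTPUT);
  pinMode(VALVE_PIN, OUTPUT);
}

void loop() {
  int bend = analogRead(BEND_PIN);          // larger value = deeper inhale (assumed)
  int drive = map(bend, 300, 700, 0, 255);  // calibration range is a guess
  drive = constrain(drive, 0, 255);

  if (drive > 20) {
    digitalWrite(VALVE_PIN, LOW);   // close the vent while inflating
    analogWrite(PUMP_PIN, drive);
  } else {
    analogWrite(PUMP_PIN, 0);
    digitalWrite(VALVE_PIN, HIGH);  // vent so the sacs deflate with the exhale
  }
  delay(20);
}
```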
ConeAct
This smart material, called ConeAct by its creators at Carnegie Mellon University, is a modular system of small cones joined together, each driven by four shape memory alloy (SMA) actuators that either flex or become rigid at certain temperatures. An Arduino Nano coordinates the actions of every cone, and when one needs to bend, its onboard ATtiny1616 activates its MOSFETs to begin heating the corresponding SMA wires.
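As a rough picture of what a per-cone controller might do, the sketch below pulses one of four MOSFET channels to heat the selected SMA wire, with a hard timeout so no wire is heated indefinitely. The pins, the serial stand-in for whatever bus links the Nano to each cone, and the timing are assumptions for this example.

```cpp
// Illustrative only: heat one of four SMA channels on command, with a cutoff.
const int SMA_PINS[4] = {3, 5, 9, 10};   // hypothetical PWM-capable pins
const unsigned long MAX_HEAT_MS = 1500;  // safety cutoff (assumed)

void setup() {
  Serial.begin(9600);  // stand-in for the bus between the Nano and each cone
  for (int i = 0; i < 4; i++) pinMode(SMA_PINS[i], OUTPUT);
}

void heatWire(int channel, int duty) {
  analogWrite(SMA_PINS[channel], constrain(duty, 0, 255));
  delay(MAX_HEAT_MS);                 // a real controller would watch feedback instead
  analogWrite(SMA_PINS[channel], 0);  // let the wire cool and relax
}

void loop() {
  // Expected command: "<channel 0-3> <duty 0-255>", e.g. "2 180"
  if (Serial.available()) {
    int channel = Serial.parseInt();
    int duty = Serial.parseInt();
    if (channel >= 0 && channel < 4) heatWire(channel, duty);
  }
}
```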
Tangible Stats
Designed for people who are blind or have low vision, the Tangible Stats project from a group of students at Stanford University allows them to more easily explore statistical data by interacting with physical objects. The Arduino Mega-driven platform senses the number of stackable tokens placed into a column and provides quick feedback. It can also tilt the row of tokens to represent a sloping line.
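One plausible (though not necessarily the Stanford team's) sensing approach is a resistor ladder in each column, so that every stacked token shifts the column's analog reading by a fixed step. The sketch below estimates the count that way and beeps once per token as feedback; the pins, per-token step, and buzzer are all assumptions.

```cpp
// Illustrative only: estimate tokens in a column and beep the count back.
const int COLUMN_PIN = A0;      // analog input for the column (hypothetical)
const int BUZZER_PIN = 8;       // piezo for audio feedback (hypothetical)
const int STEP_PER_TOKEN = 90;  // ADC counts added per stacked token (assumed)

int lastCount = 0;

void setup() {
  Serial.begin(9600);
  pinMode(BUZZER_PIN, OUTPUT);
}

void loop() {
  int count = analogRead(COLUMN_PIN) / STEP_PER_TOKEN;
  if (count != lastCount) {
    Serial.print("Tokens in column: ");
    Serial.println(count);
    for (int i = 0; i < count; i++) {  // one short beep per token
      tone(BUZZER_PIN, 880, 100);
      delay(200);
    }
    lastCount = count;
  }
  delay(100);
}
```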
ActuAir
Everyone needs access to fresh, clean air, but quickly gauging the indoor air quality of somewhere like an office meeting room or lobby is difficult. ActuAir, constructed by students at Newcastle University, is a wall-sized soft robotics display powered by several Arduino UNO R4 WiFi boards, each of which can adjust the shape and color of a wall-mounted pouch to indicate current CO2, temperature, or humidity levels, all configurable from an external web application.
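To give a flavor of how a single node could behave, here is a minimal sketch that maps a CO2 reading onto one pouch's inflation level and an indicator color that shifts from green to red. The analog CO2 sensor, pin choices, and ppm thresholds are assumptions, and the actual installation coordinates its nodes from the web application over WiFi rather than running standalone like this.

```cpp
// Illustrative only: inflate a pouch and shift its color as CO2 rises.
const int CO2_PIN = A0;    // hypothetical analog CO2 sensor
const int PUMP_PIN = 9;    // PWM pin inflating the pouch (hypothetical)
const int RED_PIN = 5;     // RGB LED channels (hypothetical)
const int GREEN_PIN = 6;

void setup() {
  pinMode(PUMP_PIN, OUTPUT);
  pinMode(RED_PIN, OUTPUT);
  pinMode(GREEN_PIN, OUTPUT);
}

void loop() {
  int raw = analogRead(CO2_PIN);
  // Assume roughly 400-2000 ppm spans the sensor's useful analog range
  int ppm = map(raw, 0, 1023, 400, 2000);

  // Higher CO2 = a more inflated pouch and a redder indicator
  int level = constrain(map(ppm, 400, 1500, 0, 255), 0, 255);
  analogWrite(PUMP_PIN, level);
  analogWrite(RED_PIN, level);
  analogWrite(GREEN_PIN, 255 - level);

  delay(1000);
}
```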