Researchers Develop Wearable Sensor System that Traverses Your Body


Wearables like fitness trackers and smartwatches are widely used for health monitoring and digital interactivity. Yet most of these devices are worn in a single spot, even though different body locations are best suited to different types of sensors. The upper body is optimal for monitoring breathing, for example, while the wrist is ideal for tracking typing and writing.

To place wearable sensors where they can do their best work, researchers at the University of Maryland have developed a miniature robotics system that can assist with a wide range of tasks by moving to different locations on the human body.

Their device, called Calico, mimics a toy train by traveling on a cloth track worn around the user, operating independently of external guidance through the use of magnets, sensors, and connectors.

“Our device is a fast, reliable, and precise personal assistant that lays the groundwork for future systems,” says Anup Sathya, who led Calico’s development for his master’s thesis in human-computer interaction. Sathya graduated from UMD last year and is currently a first-year Ph.D. student in computer science at the University of Chicago.

Most wearable workout devices are limited in the types of exercise they can monitor; Calico is not. It can track running from a user's arm, move to the elbow to count push-ups, to the back for planks, and then to the knee to count squats.

And unlike other devices, Calico moves quickly and accurately without getting stuck on clothing or at awkward angles. “For the first time, a wearable can traverse the user’s clothing with no restrictions to their movement,” says Huaishu Peng, an assistant professor of computer science who was Sathya’s adviser at UMD.

Peng, who also has an appointment in the University of Maryland Institute for Advanced Computer Studies (UMIACS), sees a future in which mini wearable devices like Calico will seamlessly integrate with humans for interaction, actuation and sensing.

Among the project's many potential directions, Peng recently took Calico down a creative route in a new collaboration with Jonathan David Martin, a lecturer in Immersive Media Design, and Adriane Fang, an associate professor in the School of Theatre, Dance and Performance Studies (TDPS).

The interdisciplinary team is combining dance, music, immersive media, robotics, and wearable technology into a novel and compelling series of interactive dance performances choreographed in real time through Calico.

First, Peng’s research group programmed Calico to use motion and light to instruct a dancer to execute specific movements. Then, using their smartphones, audience members collectively vote on how Calico should direct the dancer next, making them a key component of the performance.
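
As a rough illustration of that interaction loop (not the team's actual implementation), the sketch below tallies hypothetical audience votes and maps the winning choice to a movement-and-light cue for the on-body robot. The cue names, vote options, and the send_cue function are all assumptions made for illustration.

    from collections import Counter

    # Hypothetical cue table: each audience choice maps to a movement pattern
    # and a light color that the on-body robot would use to prompt the dancer.
    CUES = {
        "spin":  {"movement": "oscillate_wrist", "light": "blue"},
        "jump":  {"movement": "dash_to_shoulder", "light": "red"},
        "pause": {"movement": "hold_position", "light": "white"},
    }

    def tally_votes(votes):
        """Return the audience's winning choice from a list of smartphone votes."""
        counts = Counter(v for v in votes if v in CUES)
        if not counts:
            return "pause"  # default cue when no valid votes arrive
        return counts.most_common(1)[0][0]

    def send_cue(choice):
        """Placeholder for transmitting the chosen cue to the wearable robot."""
        cue = CUES[choice]
        print(f"Cue -> move: {cue['movement']}, light: {cue['light']}")

    if __name__ == "__main__":
        audience_votes = ["spin", "jump", "spin", "pause", "spin"]  # one example voting round
        send_cue(tally_votes(audience_votes))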

The project is being funded with a $15K award from the Arts for All initiative, which leverages the combined power of the arts, technology and social justice to make the University of Maryland a leader in addressing grand challenges. It is one of three funded projects that will be demonstrated at the Arts Amplification Symposium on Friday, October 14. Following the performance, the audience will have the opportunity to try on Calico and explore its possibilities.

“The idea is to explore the dynamics and connections between human+robot and performer+audience,” says Peng. “In this instance, Calico will act as the 'mediator' to broaden art and tech participation and understanding.”

Calico’s original creators include Jiasheng Li, a second-year Ph.D. student in computer science; Ge Gao, an assistant professor in the College of Information Studies (iSchool) with an appointment in UMIACS; and Tauhidur Rahman, an assistant professor in data science at the University of California, San Diego.

Their paper, “Calico: Relocatable On-cloth Wearables with Fast, Reliable, and Precise Locomotion,” was recently published in the ACM Journal on Interactive, Mobile, Wearable and Ubiquitous Technologies (IMWUT), the top-tier journal for human-computer interaction, and presented at its corresponding conference, UBICOMP, the premier venue for ubiquitous computing.

—Story by Maria Herd, UMIACS communications group
