The objective of the workshop series on Earable Computing (EarComp) is to provide an academic forum and bring together researchers, practitioners, and design experts to discuss how sensory earable technologies complement human sensing research. It also aims to provide a launchpad for bold and visionary new ideas and to serve as a catalyst for advancements in the emerging earable computing space.
Earable computing devices can be an important platform for mobile health (mHealth) applications and digital phenotyping, since they allow for the collection of detailed sensory data while also providing a platform for the contextual delivery of interventions. In this paper, we describe how the eSense earable computing platform has been integrated with a programming framework and runtime platform for the design of mHealth applications. The paper details how this programming framework can be used in the design of custom mHealth technologies. It also provides data and insights from an initial study in which this framework was used to collect real-life contextual data, including sensory data from the eSense device.
Head tracking is a fundamental component in visual attention detection, which, in turn, can improve the state of the art of hearing aid devices. A multitude of wearable devices for the ear (so-called earables) exist. Current devices lack a magnetometer, which, as we will show, poses a major challenge for accurate head tracking.
In this work, we evaluate the performance of eSense, a representative earable device, in tracking head rotations. By leveraging two different streams (one per earbud) of inertial data (from the accelerometer and the gyroscope), we achieve an accuracy within a few degrees. We further investigate the interference generated by a magnetometer in an earable to understand the barriers to its use in these types of devices.
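The idea of combining the two per-earbud inertial streams can be illustrated with a minimal sketch: integrate each earbud's gyroscope yaw rate and average the two estimates. The sampling rate, axis alignment, and plain averaging here are illustrative assumptions, not the paper's actual fusion pipeline:

```python
import numpy as np

def head_yaw_from_earbuds(gyro_left, gyro_right, fs=100.0):
    """Estimate head yaw (degrees) by integrating the yaw-rate axis of
    both earbuds' gyroscopes and averaging the two streams.

    gyro_left, gyro_right: yaw-rate samples in deg/s (assumed aligned axes)
    fs: sampling rate in Hz (assumed)
    """
    dt = 1.0 / fs
    yaw_left = np.cumsum(gyro_left) * dt    # integrate left-ear yaw rate
    yaw_right = np.cumsum(gyro_right) * dt  # integrate right-ear yaw rate
    # Averaging two independent streams attenuates uncorrelated drift/noise
    return 0.5 * (yaw_left + yaw_right)
```

A constant rotation of 10 deg/s sustained for one second, for example, yields a final yaw estimate of 10 degrees.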
Wearable activity recognition research needs benchmark data to validate activity classifiers, which relies heavily on synchronized and annotated inertial sensor data. Such validation studies become challenging when recording outside the lab, over longer stretches of time. This paper presents a method that uses an inconspicuous, ear-worn device that allows the wearer to annotate their activities as the recording takes place. Since the ear-worn device has integrated inertial sensors, we use cross-correlation over all wearable inertial signals to propagate the annotations across all sensor streams. In a feasibility study with 7 participants performing 6 different physical activities, we show that our algorithm is able to synchronize signals between sensors worn on the body using cross-correlation, typically within a second. A comfort rating scale study has shown that attachment is critical. Button presses can thus define markers in synchronized activity data, resulting in a fast, comfortable, and reliable annotation method.
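The cross-correlation step above can be sketched as follows: estimate the time offset between two body-worn inertial signals as the lag that maximizes their cross-correlation. The sampling rate and mean-removal preprocessing are illustrative assumptions; the paper's actual synchronization procedure may differ:

```python
import numpy as np

def estimate_offset(sig_a, sig_b, fs=50.0):
    """Estimate the time offset (in seconds) between two inertial signals
    as the cross-correlation peak lag. A negative result means sig_b's
    events occur later than sig_a's."""
    a = sig_a - np.mean(sig_a)   # remove DC so gravity bias doesn't dominate
    b = sig_b - np.mean(sig_b)
    xcorr = np.correlate(a, b, mode="full")
    lag = np.argmax(xcorr) - (len(b) - 1)   # convert index to signed lag
    return lag / fs
```

For instance, if an impulse appears at sample 50 in one stream and sample 60 in the other (50 Hz sampling), the estimated offset is -0.2 s.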
This paper seeks to explore the possibilities in earable computing with a case study of acoustical manipulation in a blindfolded walking scenario. In human locomotion, veering often occurs while walking, especially in the absence of visual cues. We investigated the effect of acoustical manipulation with eSense under both Subtle and Overt conditions by conducting a series of experiments. The results showed that our acoustical manipulation reduced deviations from walking straight under both conditions. We highlight one future direction for earable computing.
Head motion-based interfaces for controlling robot arms in real time have been presented in both medical-oriented research and human-robot interaction. We present an especially minimal and low-cost solution that uses the eSense ear-worn prototype as a small head-worn controller, enabling direct control of an inexpensive robot arm in the environment. We report on the hardware and software setup, as well as the experiment design and early results.
Ear-worn wearable devices, or earables, are a rapidly emerging sensor platform, with unique opportunities to collect a wide variety of sensor data and build systems with novel human-computer interaction components. At this point in the development of the field, with projects such as eSense putting hardware in researchers' hands but remaining limited in reach, the sharing of datasets collected by researchers with the wider community would bring a number of benefits. A central data sharing platform would enable wider participation in earables research and improve the quality of projects, as well as being a vehicle for better data quality and data protection practices. We discuss the considerations behind building such a platform, and propose an architecture that would achieve better privacy-utility trade-offs than many existing data sharing efforts.
This paper shows that inertial measurement units (IMUs) inside earphones offer a clear advantage in counting the number of steps a user has walked. While step-count has been extensively studied in the mobile computing community, there is wide consensus that false positives are common. False positives arise mainly from limb and device motions that produce the same periodic bounce as the human walk. However, when IMUs are at the ear, we find that many of the lower-body motions are naturally "filtered out", i.e., these noisy motions do not propagate all the way up to the ear. Hence, the earphone IMU detects a bounce produced only from walking. While head movements can still pollute this bouncing signal, we develop methods to alleviate the problem. Results show 95% step-count accuracy even in the most difficult test case -- very slow walk -- where smartphone and Fitbit-like systems falter. Importantly, our system STEAR is robust to changes in walking patterns and scales well across different users. Additionally, we demonstrate how STEAR also brings opportunities for effective jump analysis, often important for exercises and injury-related rehabilitation.
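The core intuition above -- once lower-body noise is filtered out at the ear, each step leaves one clean bounce -- suggests a simple peak-counting baseline. The threshold, minimum step interval, and sampling rate below are hypothetical parameters for illustration; they are not STEAR's actual method or values:

```python
import numpy as np

def count_steps(acc_vertical, fs=100.0, min_step_interval=0.3, thresh=0.5):
    """Count steps as local maxima of the (gravity-removed) vertical
    acceleration at the ear. Parameters are illustrative assumptions:
      min_step_interval: shortest plausible gap between steps (s)
      thresh: minimum peak height above the signal mean
    """
    sig = acc_vertical - np.mean(acc_vertical)   # crude gravity/DC removal
    min_gap = int(min_step_interval * fs)
    steps, last_peak = 0, -min_gap
    for i in range(1, len(sig) - 1):
        # A step bounce: a local maximum above threshold, not too close
        # to the previously detected step
        if sig[i] > thresh and sig[i] >= sig[i - 1] and sig[i] > sig[i + 1]:
            if i - last_peak >= min_gap:
                steps += 1
                last_peak = i
    return steps
```

On a synthetic 1.5 Hz bounce signal lasting 10 seconds, this counts 15 steps.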
We explore the use of personal 'earable' devices (widely used by gym-goers) in providing personalized, quantified insights and feedback to users performing gym exercises. As in-ear sensing by itself is often too weak to pick up exercise-driven motion dynamics, we propose a novel, low-cost system that can monitor multiple concurrent users by fusing data from (a) wireless earphones, equipped with inertial and physiological sensors and (b) inertial sensors attached to exercise equipment. We share preliminary findings from a small-scale study to demonstrate the promise of this approach, as well as identify open challenges.
State-of-the-art respiration tracking devices require specialized equipment, making them impractical for everyday at-home respiration sensing. In this paper, we present the first system for sensing respiratory rates using in-ear headphone inertial measurement units (IMUs). The approach is based on technology already available in commodity devices: the eSense headphones. Our processing pipeline combines several existing approaches to clean noisy data and calculate respiratory rates on 20-second windows. In a study with twelve participants, we compare accelerometer- and gyroscope-based sensing and employ pressure-based measurement with nasal cannulas as ground truth. Our results indicate a mean absolute error of 2.62 CPM (accelerometer) and 2.55 CPM (gyroscope). This overall accuracy is comparable to previous approaches using accelerometer-based sensing, but we observe a higher relative error for the gyroscope. In contrast to related work using other sensor positions, we cannot report significant differences between the two modalities or between the three postures: standing, sitting, and lying on the back (supine). However, in general, performance varies drastically across participants.
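One common way to turn a cleaned 20-second IMU window into a rate, sketched below, is to pick the dominant frequency in a plausible respiration band and convert it to cycles per minute. The band limits (0.1-0.5 Hz, i.e. 6-30 CPM) and sampling rate are assumptions for illustration; the paper's pipeline combines several existing approaches and may differ:

```python
import numpy as np

def respiratory_rate_cpm(window, fs=50.0):
    """Estimate respiratory rate (cycles per minute) from one IMU window
    as the dominant spectral frequency within an assumed respiration band."""
    x = window - np.mean(window)                    # remove DC offset
    spec = np.abs(np.fft.rfft(x))                   # magnitude spectrum
    freqs = np.fft.rfftfreq(len(x), d=1.0 / fs)
    band = (freqs >= 0.1) & (freqs <= 0.5)          # 6-30 CPM (assumed band)
    f_dom = freqs[band][np.argmax(spec[band])]      # strongest in-band peak
    return f_dom * 60.0
```

A 20-second window containing a clean 0.25 Hz breathing oscillation, for example, maps to 15 CPM.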
Bruxism is a jaw-muscle condition characterized by repetitive clenching or grinding of teeth. Existing methods of detecting jaw clenching towards diagnosing bruxism are either invasive or not very reliable. As a first step towards building a reliable, non-invasive, and lightweight bruxism detector, we propose an eSense-based in-ear inertial jaw clenching detection technique that detects peaks/dips in gyroscope vector magnitude. We also present results from preliminary experiments that show an equal error rate of 1% when the person is stationary and 4% when moving.
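The peak/dip idea above can be sketched as flagging excursions of the gyroscope vector magnitude away from its baseline. The threshold, baseline estimate (median), and refractory period below are illustrative assumptions, not the proposed technique's actual parameters:

```python
import numpy as np

def detect_clenches(gyro_xyz, fs=100.0, thresh=15.0, refractory=0.5):
    """Flag candidate jaw-clench events where the gyroscope vector
    magnitude deviates from its median baseline by more than `thresh`
    (deg/s, assumed). Using the absolute deviation catches both peaks
    and dips. `refractory` suppresses duplicate detections (s, assumed).
    gyro_xyz: array of shape (n_samples, 3)."""
    mag = np.linalg.norm(gyro_xyz, axis=1)   # vector magnitude per sample
    dev = np.abs(mag - np.median(mag))       # deviation from baseline
    min_gap = int(refractory * fs)
    events, last = [], -min_gap
    for i, d in enumerate(dev):
        if d > thresh and i - last >= min_gap:
            events.append(i)                 # record event onset index
            last = i
    return events
```

On a quiet signal with two isolated bursts, this returns the two burst onsets.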