Alzheimer's disease and related dementias (ADRD) are debilitating neurodegenerative disorders affecting one in every ten older Americans. Worldwide, roughly 35.6 million people have dementia, and in the United States approximately 5.7 million have Alzheimer's; by 2050, the U.S. count is expected to increase to 13.2 million older Americans. Alzheimer's is the sixth leading cause of death in the U.S. People with ADRD face substantial challenges adapting to their physical, psychological, and social environments. Like cognitively intact older adults, persons with ADRD prefer to age in place in a familiar environment. However, aging in one's own habitat carries the potential for loneliness, social isolation, and the consequences of impaired mobility, which threaten to displace persons with ADRD to non-preferred living arrangements. The decline in multi-generational living and the rise of nuclear-family households call for innovative solutions that promote safe mobility, independence, decision making, and communication for persons with ADRD. Assistive technologies (ATs), if user-friendly and properly engineered for persons with ADRD, hold the potential to enhance daily functioning and improve quality of life in living habitats of their choice. This review paper aims to discuss existing and emerging AT megatrends in the Internet of Things (IoT) era. Five technological megatrends are examined: assistive robots (e.g., assistance in daily activities, social robots for communication, telepresence robots for social connectedness), biometric sensors and movement sensor technologies, such as inertial measurement units (IMUs) and non-wearable sensors, for gait and walkability, multimodal interaction systems for early disease detection, augmented reality systems that
According to recent statistics, depression and suicide are on the rise in the United States and elsewhere. To help address this issue, and complementary to various current approaches, we propose a multi-modal robot interaction framework that extends current Human-Robot Interaction (HRI) systems to identify established signs of depression from multi-modal data, including acoustic features, images, video, speech, and text. One of the recent technologies we introduce in our solution is the use of a social humanoid robot (Pepper by SoftBank) to detect early signs of depression through natural language and multi-modal interactions. Rather than relying solely on interactions between professionals and patients, the proposed HRI framework lowers the entry barrier to mental health diagnosis and brings medical treatment within convenient reach. To assure the psychological safety of the conversation, a "psychological safety module" provides professional assistance for episodic cognitive behavioral therapy. Our Multi-modal Robot Interaction (MRI) architecture consists of five modules: multi-modal data, social robot and dialogue system, psycho-linguistic feature extraction, machine learning and NLP methods, and psychological safety feedback/suggestions to end users and experts.
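To make the five-module decomposition concrete, the following is a minimal Python sketch of such a pipeline. The class names, toy word lists, and thresholds are illustrative assumptions only and do not reproduce the authors' implementation.

```python
# Minimal sketch of the five-module MRI pipeline described above.
# All names, lexica, and thresholds are illustrative assumptions.
from dataclasses import dataclass
from typing import List, Dict

@dataclass
class MultiModalSample:
    text: str                      # dialogue transcript from the robot
    audio_features: List[float]    # e.g., prosodic/acoustic descriptors
    video_features: List[float]    # e.g., facial action unit intensities

def extract_psycholinguistic_features(sample: MultiModalSample) -> Dict[str, float]:
    """Toy psycho-linguistic features; a real system would use validated lexica."""
    words = sample.text.lower().split()
    first_person = sum(w in {"i", "me", "my"} for w in words)
    negative = sum(w in {"sad", "tired", "alone", "hopeless"} for w in words)
    return {
        "first_person_ratio": first_person / max(len(words), 1),
        "negative_ratio": negative / max(len(words), 1),
        "mean_audio": sum(sample.audio_features) / max(len(sample.audio_features), 1),
    }

def classify_risk(features: Dict[str, float]) -> str:
    """Placeholder for the ML/NLP module; threshold is an arbitrary assumption."""
    score = features["first_person_ratio"] + 2 * features["negative_ratio"]
    return "elevated" if score > 0.3 else "low"

def psychological_safety_feedback(risk: str) -> str:
    """Psychological-safety module: route elevated risk to a professional."""
    return ("Flag session for clinician review" if risk == "elevated"
            else "Continue supportive dialogue")

sample = MultiModalSample("I feel tired and alone lately", [0.2, 0.4], [0.1])
print(psychological_safety_feedback(classify_risk(extract_psycholinguistic_features(sample))))
```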
Knowledge of the effect of emotional stimuli on physiological indicators is helpful in the treatment of conditions such as stroke and post-traumatic stress disorder (PTSD). Such knowledge can also help in understanding the efficacy of rehabilitation procedures after serious debilitating medical conditions. With recent advances in Internet of Medical Things (IoMT) technology, solutions can be built to correlate psychophysiological signals. Furthermore, data mining and artificial intelligence applied to these signals can support early intervention in cases where direct correlation is inconclusive. When combined with carefully designed and customized feedback mechanisms, augmented reality (AR) can be of high value to the rehabilitation process. In this paper we present a platform built using IoMT sensors and AR technology. The system is capable of creating ambient environments in AR that simulate quantifiable emotional stimuli while measuring physiological variability. The CRADLE (Correlational Research Application Development Linking Emotions) platform can capture subject personalities, procure psychophysiological data from large-scale studies, and, in the future, perform data mining tasks to make recommendations about environments conducive to psychological wellness. Using CRADLE, this paper shows in a limited study of five subjects that heart rate variability (HRV) is affected when AR is used to simulate emotional states. The effectiveness of rehabilitation tasks can hence be actively measured and adjusted without explicit feedback from subjects using this system.
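As a point of reference for the HRV measure mentioned above, a short sketch of two standard time-domain HRV metrics (SDNN and RMSSD) computed from RR intervals is shown below. The interval values are fabricated placeholders, not CRADLE study data.

```python
# Illustrative computation of two common HRV metrics from RR intervals,
# the kind of measure CRADLE correlates with AR stimuli. Values are synthetic.
import math

def sdnn(rr_ms):
    """Standard deviation of RR (NN) intervals, in milliseconds."""
    mean = sum(rr_ms) / len(rr_ms)
    return math.sqrt(sum((x - mean) ** 2 for x in rr_ms) / len(rr_ms))

def rmssd(rr_ms):
    """Root mean square of successive RR-interval differences, in milliseconds."""
    diffs = [b - a for a, b in zip(rr_ms, rr_ms[1:])]
    return math.sqrt(sum(d ** 2 for d in diffs) / len(diffs))

baseline = [812, 805, 820, 818, 799, 830, 815]   # resting segment (ms)
stimulus = [640, 655, 648, 660, 642, 651, 645]   # during an arousing AR scene (ms)
print(f"baseline: SDNN={sdnn(baseline):.1f} ms, RMSSD={rmssd(baseline):.1f} ms")
print(f"stimulus: SDNN={sdnn(stimulus):.1f} ms, RMSSD={rmssd(stimulus):.1f} ms")
```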
Immersive technologies offer the potential to drive engagement and create exciting experiences. A better understanding of the emotional state of the user within immersive experiences can assist in healthcare interventions and the evaluation of entertainment technologies. This work describes a feasibility study exploring the effect of affective video content on heart-rate recordings for Virtual Reality applications. A low-cost reflected-mode photoplethysmographic sensor and an electrocardiographic chest-belt sensor were attached to a novel non-invasive wearable interface specially designed for this study. Responses from 11 participants were analysed, and heart-rate metrics were used for arousal classification. The reported results demonstrate that the fusion of physiological signals yields a significant performance improvement, and hence the feasibility of our new approach.
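The fusion step referred to above can be illustrated with a small feature-level fusion sketch: PPG- and ECG-derived heart-rate features are concatenated and fed to a simple classifier. The data, feature names, and classifier choice are assumptions for illustration and are not the study's pipeline.

```python
# Sketch of feature-level fusion of PPG- and ECG-derived heart-rate metrics
# for binary arousal classification. All data here are synthetic placeholders.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
n = 60
ppg_features = rng.normal(size=(n, 2))          # e.g., mean HR, HR slope from PPG
ecg_features = rng.normal(size=(n, 2))          # e.g., SDNN, RMSSD from ECG belt
# synthetic labels loosely tied to both modalities (low vs. high arousal)
labels = (ppg_features[:, 0] + ecg_features[:, 0]
          + rng.normal(scale=0.5, size=n) > 0).astype(int)

fused = np.hstack([ppg_features, ecg_features])  # simple feature-level fusion
clf = LogisticRegression()
print("fused-feature CV accuracy:", cross_val_score(clf, fused, labels, cv=5).mean())
print("PPG-only CV accuracy:", cross_val_score(clf, ppg_features, labels, cv=5).mean())
```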
Commuting by car can be stressful; in particular, unexpected traffic jams may result in feelings of loss of control and social disconnectedness. In this paper, we present Traeddy, a teddy bear augmented with embedded technology, which serves as a wellbeing companion for car commuters in case of traffic jams. Traeddy can help, for example, by notifying relevant contacts about traffic jams and potential delays. We describe the design process in detail, including 20 contextual inquiries, and report the evaluation of Traeddy through an online survey with 102 participants and a field study with three commuters and two contacts. The results of the field study indicate that Traeddy has a positive impact on the relationship between the commuter and the notified contact. Furthermore, the majority of the online participants anticipated Traeddy to be useful and to support their wellbeing in traffic jams.
We report on the design process and evaluation of Pen-Pen, a design combining a neck cushion, a mobile app, and a multi-modal feedback loop to help commuters relax and rest during commuting hours. The design process of Pen-Pen includes a series of inquiries, which identified "support for relaxation" and "location-based arrival notification" as desires of commuters, and "mindfulness" and feelings of "autonomy" as relevant determinants of commuters' wellbeing. We evaluated Pen-Pen in the field with five commuters and through an online survey with 68 participants. Our results indicate that using Pen-Pen has the potential to increase feelings of rest and autonomy, and to foster mindfulness through the feedback loop, which plays back spatial audio based on user location and finger touch. In particular, commuters who reported being less mindful and easily stressed anticipated Pen-Pen to be useful for them.
The prevalence of IoT-enabled devices in the home has manifested in buildings as voice- and gesture-enabled devices supporting functions such as optimizing energy usage and controlling electrical fixtures. However, most of these are fundamentally dissociated from the spatial nature of buildings, as they exist in isolation from the building system and from other smart devices in the same space. This situation results in user interaction experiences that are convoluted and non-intuitive, negatively affecting their adoption. Meanwhile, Building Information Models (BIM), while used extensively in a building's design and construction phases, remain underutilized in its operation phase. This paper proposes BIM as an "operating system" for smart homes, serving as the foundational platform for smart device applications. Much like an operating system for computers, BIM provides IoT-enabled devices with access to information about, and control over, various components of the built environment. It is expected that this smart home architecture would enable seamless integration of smart home functionality across devices and buildings, providing residents with contextualized information for optimized security, comfort, and efficiency. Methods for personalizing building performance to users can also be accomplished through the proposed methodology.
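One way to picture the "BIM as operating system" idea is a registry layer in which smart devices are bound to BIM elements (rooms, fixtures) and query the model for spatial context. The sketch below is a conceptual illustration; the class and method names are assumptions, not an API proposed by the paper.

```python
# Conceptual sketch: devices are registered against BIM spaces and query the
# model for the fixtures they may sense or control. Names are illustrative.
from dataclasses import dataclass, field
from typing import Dict, List

@dataclass
class BimSpace:
    name: str
    fixtures: List[str] = field(default_factory=list)   # e.g., lights, HVAC zones

class BimOperatingSystem:
    """Registry mapping smart devices to the BIM elements they can access."""
    def __init__(self, spaces: Dict[str, BimSpace]):
        self.spaces = spaces
        self.device_bindings: Dict[str, str] = {}

    def bind_device(self, device_id: str, space_name: str) -> None:
        if space_name not in self.spaces:
            raise KeyError(f"Unknown BIM space: {space_name}")
        self.device_bindings[device_id] = space_name

    def context_for(self, device_id: str) -> BimSpace:
        """Spatial context a device can use, e.g., which fixtures it may control."""
        return self.spaces[self.device_bindings[device_id]]

model = BimOperatingSystem({"living_room": BimSpace("living_room", ["light_1", "hvac_zone_2"])})
model.bind_device("voice_assistant_01", "living_room")
print(model.context_for("voice_assistant_01").fixtures)
```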
Timely crowd evacuation in life-threatening situations such as a fire emergency or chemical attack is a significant concern for authorities and first responders. An individual's fate in such situations depends on several factors, namely (i) decision dynamics: how an egress strategy is selected and executed, (ii) hazard dynamics: how hazards propagate and impair the surrounding environment over time, and (iii) environment layout: how the space layout affects occupants' lives during emergency evacuation. This paper presents the implementation of EVAQ, a simulation tool developed by the authors in Python for evaluating emergency mapping strategies. Two person-specific egress simulations of fire emergencies (an airport terminal and a shopping mall) are presented, and the results are discussed. Findings confirm that EVAQ can successfully simulate large-crowd evacuation by modeling evacuees' personal (i.e., age, gender, disability) and interpersonal (i.e., group interactions) attributes, as well as situational awareness in a deteriorating environment. Results also show the effectiveness of EVAQ in simulating the impact of the space design (e.g., shape and size of rooms and obstacles, number and width of exits) on crowd evacuation. As personalized sensing and information delivery platforms become more ubiquitous, the findings of this work are sought to assist in developing and executing more robust and adaptive emergency mapping and evacuation plans, ultimately protecting people's lives and promoting their wellbeing.
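To give a flavor of the three factors EVAQ models (decision dynamics, hazard dynamics, and layout), the following is a highly simplified agent-based egress step on a grid with naive hazard spread. This is not the authors' EVAQ code; every structure and parameter here is an illustrative assumption.

```python
# Simplified egress step inspired by the factors EVAQ models: agents with
# different speeds move toward an exit while a hazard spreads. Illustrative only.
import random
from dataclasses import dataclass

@dataclass
class Evacuee:
    x: int
    y: int
    speed: int          # cells per step; reduced for mobility impairments
    evacuated: bool = False

def step(evacuees, hazard_cells, exit_xy, grid_size):
    """Move each agent toward the exit while hazard cells spread randomly."""
    ex, ey = exit_xy
    for a in evacuees:
        if a.evacuated:
            continue
        for _ in range(a.speed):
            a.x += (ex > a.x) - (ex < a.x)
            a.y += (ey > a.y) - (ey < a.y)
        if (a.x, a.y) == exit_xy:
            a.evacuated = True
    # naive hazard propagation: each hazard cell ignites one random neighbor
    for (hx, hy) in list(hazard_cells):
        nx, ny = hx + random.choice([-1, 0, 1]), hy + random.choice([-1, 0, 1])
        if 0 <= nx < grid_size and 0 <= ny < grid_size:
            hazard_cells.add((nx, ny))

agents = [Evacuee(0, 0, 2), Evacuee(3, 9, 1)]
hazards = {(5, 5)}
for _ in range(12):
    step(agents, hazards, exit_xy=(9, 9), grid_size=10)
print([a.evacuated for a in agents], "hazard cells:", len(hazards))
```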
Big data captured through sensor-integrated devices present a unique opportunity for emerging applications in urban research towards understanding, quantifying, and promoting aspects of the urban environment based on residents' needs. We examine the effect of various built environmental features of the urban environment on individuals' emotional distress, as captured through self-assessment indices and physiological measures. Through statistical analysis, we study potential associations between signal-based electrodermal activity (EDA) measures and self-reports. We further propose a novel saliency detection mechanism for physiological signals, according to which the prominence of an EDA segment is calculated from its distance to the remaining signal segments. Results indicate that the signal-based and self-reported indices of emotional distress, as depicted in relation to various built environmental features, are weakly but significantly correlated. Visual inspection of the saliency-based measures further suggests that the hypothesized "emotional distress hotspots" of the urban environment, such as discontinuous sidewalks, threatening noise, and housing with broken features, elicit distinct physiological responses for the majority of participants in the experiments. These findings suggest the feasibility of physiological measures for quantifying emotional distress caused by the built environment and provide a foundation for the development of signal-based indices of the human felt-sense in urban settings.
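The segment-saliency idea can be sketched as follows: each EDA segment is summarized by a few descriptors and its prominence is scored as its average distance to all other segments. The specific features (mean level, slope), the Euclidean distance, and the synthetic data are assumptions; the paper's exact distance metrics are not reproduced.

```python
# Sketch of the segment-saliency mechanism: an EDA segment's prominence is
# its mean distance to the other segments. Features and data are assumptions.
import numpy as np

def segment_features(segment):
    """Represent a segment by simple descriptors: mean level and overall slope."""
    seg = np.asarray(segment, dtype=float)
    slope = (seg[-1] - seg[0]) / max(len(seg) - 1, 1)
    return np.array([seg.mean(), slope])

def saliency_scores(segments):
    """Saliency of each segment = mean Euclidean distance to the other segments."""
    feats = np.array([segment_features(s) for s in segments])
    dists = np.linalg.norm(feats[:, None, :] - feats[None, :, :], axis=-1)
    return dists.sum(axis=1) / (len(segments) - 1)

# Synthetic EDA segments (microsiemens); the third spikes like a distress response.
segments = [
    [0.31, 0.32, 0.33, 0.32],
    [0.30, 0.31, 0.31, 0.30],
    [0.35, 0.80, 1.10, 0.95],
    [0.29, 0.30, 0.32, 0.31],
]
print(np.round(saliency_scores(segments), 3))
```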