
Academia

Research Publications


2024

Traditional games like “Tag” rely on shared control via inter-body interactions (IBIs) – touching, pushing, and pulling – that foster emotional and social connection. Digital games largely limit IBIs, with players using their bodies as input to control virtual avatars instead. Our “Shared Bodily Fusion” approach addresses this by fusing players’ bodies through a mediating computer, creating a shared input and output system. We demonstrate this approach with “Hidden Touch”, a game where a novel social electrical muscle stimulation system transforms touch (input) into muscle actuations (output), facilitating IBIs. Through a study (n=27), we identified three player experience themes. Informed by these findings and our design process, we mapped their trajectories across our three experiential spaces – threshold, tolerance, and precision – which collectively form our design framework. This framework facilitates the creation of future digital games where IBIs are intrinsic, ultimately promoting the many benefits of social play.

DOI: https://dl.acm.org/doi/abs/10.1145/3643834.3660723 (Open Access)

Prior research around the design of interactive systems has highlighted the benefits of supporting embodiment in everyday life. This has resulted in the creation of body-centric systems that leverage movement. However, these advances, while aligning with embodiment theory, have so far focused on sensing movement rather than facilitating it. We present PneuMa, a novel wearable system that facilitates movement in everyday life through pneumatic-based bodily extensions. We showcase the system through three examples: “Pardon?”, moving the ear forward; “Greetings”, moving a hand towards the “Bye-bye” gesture; and “Take a break”, moving the hands away from the keyboard, demonstrating bodily extensions that support movement in everyday life. From the thematic analysis of a field study with 12 participants, we identified three themes: bodily awareness, perception of the scenarios, and anticipating movement. We discuss our findings in relation to prior research on bodily extensions and embodied interaction to provide strategies for designing bodily extensions that support movement in everyday life. Ultimately, we hope that our work helps more people profit from the benefits of everyday movement support.

DOI: https://doi.org/10.1145/3623509.3633349

Underlying humanity’s social abilities is the brain’s capacity to interpersonally synchronize. Experimental, lab-based neuropsychological studies have demonstrated that inter-brain synchrony can be technologically mediated. However, knowledge of deploying these technologies in-the-wild and studying their user experience, an area in which HCI excels, is lacking. With advances in mobile brain sensing and stimulation, we identify an opportunity for HCI to investigate the in-the-wild augmentation of inter-brain synchrony. We designed “PsiNet”, the first wearable brain-to-brain system aimed at augmenting inter-brain synchrony in-the-wild. Participant interviews illustrated three themes that describe the user experience of modulated inter-brain synchrony: hyper-awareness; relational interaction; and the dissolution of self. We contribute these three themes to assist HCI theorists’ discussions of inter-brain synchrony experiences. We also present three practical design tactics for HCI practitioners designing for inter-brain synchrony, and hope that our work guides an HCI future of brain-to-brain experiences that foster human connection.

DOI: https://doi.org/10.1145/3613904.3641983

Gaze aversion is embedded in our behaviour: we look at a blank area to support remembering and creative thinking, and as a social cue that we are thinking. We hypothesise that a person’s gaze aversion experience can be mediated through technology, in turn supporting embodied cognition. In this design exploration we present six ideas for interactive technologies that mediate the gaze aversion experience. One of these ideas we developed into “GazeAway”: a prototype that swings a screen into the wearer’s field of vision when they perform gaze aversion. Six participants experienced the prototype, and based on their interviews, we found that GazeAway changed their gaze aversion experience in three ways: increased awareness of gaze aversion behaviour, novel cross-modal perception of gaze aversion behaviour, and changes in gaze aversion behaviour to suit social interaction. We hope that, ultimately, our design exploration offers a starting point for the design of gaze aversion experiences.

DOI: https://doi.org/10.1145/3613905.3650771

The field of Sports Human-Computer Interaction (SportsHCI) investigates interaction design to support a physically active human being. Despite growing interest in and dissemination of SportsHCI literature over the past years, many publications still focus on solving specific problems in a given sport. We believe in the benefit of generating fundamental knowledge for SportsHCI more broadly to advance the field as a whole. To achieve this, we aim to identify the grand challenges in SportsHCI, which can help researchers and practitioners in developing a future research agenda. Hence, this paper presents a set of grand challenges identified in a five-day workshop with 22 experts who have previously researched, designed, and deployed SportsHCI systems. Addressing these challenges will drive transformative advancements in SportsHCI, fostering better athlete performance, athlete-coach relationships, and spectator engagement, as well as immersive experiences for recreational sport and exercise motivation, ultimately improving human well-being.

DOI: https://doi.org/10.1145/3613904.3642050

2023

Prior research has offered a plethora of wearables centred around sensing bodily actions, ranging from more explicit data, such as movement and physiological response, to implicit information, such as ocular and brain activity. Bodily augmentations that physically extend the user’s body while altering body schema and image have recently been proposed as well, owing to factors such as accessibility and improving communication. However, these attempts have usually consisted of uncomfortable interfaces that either restrict the user’s movement or are intrusive in nature. In this work, we present Pneunocchio, a playful nose augmentation based on the lore of Pinocchio. Pneunocchio consists of a pneumatic-based inflatable that a user wears on their nose to play a game of two truths and a lie. With our work, we aim to explore expressive bodily augmentations that respond to a player’s physiological state and can alter the perception of their body while serving as an expressive match for a part of the body.

DOI: https://doi.org/10.1145/3586182.3616651

Most digital bodily games focus on the body as they use movement as input. However, they also draw the player’s focus away from the body, as the output occurs on visual displays, creating a divide between the physical body and the virtual world. We propose a novel approach – the “Body as a Play Material” – where a player uses their body as both input and output to unify the physical body and the virtual world. To showcase this approach, we designed three games where a player uses one of their hands (input) to play against the other hand (output) by loaning control over its movements to an Electrical Muscle Stimulation (EMS) system. We conducted a thematic analysis of the data obtained from a field study with 12 participants to articulate four player experience themes. We discuss how participants appreciated the engagement with the variety of bodily movements for play and the ambiguity of using their body as a play material. Ultimately, our work aims to unify the physical body and the virtual world.

DOI: https://doi.org/10.1145/3611054

Spectating digital games can be exciting. However, due to its vicarious nature, spectators often wish to engage in the gameplay beyond just watching and cheering. To blur the boundaries between spectators and players, we propose a novel approach called “Fused Spectatorship”, where spectators watch their hands play games by loaning bodily control to a computational Electrical Muscle Stimulation (EMS) system. To showcase this concept, we designed three games where spectators loan control over both their hands to the EMS system and watch them play these competitive and collaborative games. A study with 12 participants suggested that participants could not distinguish whether they were watching their hands play or playing the games themselves. We used our results to articulate four spectator experience themes and four fused spectator types, describe the behaviours they elicited, and offer one design consideration to support each of these behaviours. We also discuss the ethical design considerations of our approach to help game designers create future fused spectatorship experiences.

DOI: https://doi.org/10.1145/3611049

Water’s pleasant nature and associated health benefits have captivated the interest of HCI researchers. Prior WaterHCI work mainly focused on advancing instrumental applications, such as improving swimming performance, and less on designing systems that support interacting with technology in water in more playful contexts. In this regard, we propose floatation tanks as research vehicles to investigate the design of playful interactive water experiences. Employing somaesthetic design, we developed a playful extended reality floatation tank experience: “Fluito”. We conducted a 13-participant study to understand how specific design features amplified participants’ water experiences. We used a postphenomenological lens to articulate eight strategies useful for designers aiming to develop digital playful experiences in water, such as designing to call attention to the water and designing to encourage breathing and body awareness in water experiences. Ultimately, we hope that our work supports people to be playful and benefit from the many advantages of being in water.

DOI: https://doi.org/10.1145/3611056

Our bodies play an important part in our remembering practices, for example when we can remember passwords by typing them, even if we cannot verbalise them. An increasing number of technologies are being developed to support remembering. However, so far they seem not to have taken the opportunity to support remembering through bodily movements. To better understand how to design for such embodied remembering, we conducted a diary study with 12 participants who recorded their embodied remembering experiences in everyday life over a three-week period. Our thematic analysis of the diaries and interviews led to the creation of a framework that helps understand embodied remembering experiences (ERXs) based on the level of skilled and conscious movements used. We describe how this ERX framework could help with the design of technologies to support embodied remembering.

DOI: https://dl.acm.org/doi/abs/10.1145/3563657.3595999

With technologies becoming increasingly intelligent, the interaction paradigm of Human-Computer Integration, where computers and humans form a partnership, has emerged. Most of these works have considered computers as separate from users’ embodiment. However, in recent years, technologies have become increasingly close to, and even interwoven with, the human body. Our work asks whether computers can be incorporated into one’s embodiment and form a partnership. We call this integrated embodiment. Such a paradigm might facilitate a more direct and intimate partnership between humans and computers. To exemplify the paradigm, we present AI-in-the-Shell, an exoskeleton-based system that enables users to experience having an AI residing in their body. The AI-powered system can make independent decisions and actuate the user’s body to better support their daily tasks and experiences, e.g., to enhance their game performance. We hope this work extends the current understanding of Human-Computer Integration and steps towards a more complete understanding of integrated embodiment.

DOI: https://dl.acm.org/doi/abs/10.1145/3505270.3558367

Video: https://www.youtube.com/watch?v=JZj_COcg9Vw

2022

Bodily games often use players’ physiology as input to provide output via screen-based modalities. Game design researchers could extend the use of the body as input and output (I/O) by using body-actuating technologies such as Electrical Muscle Stimulation (EMS). EMS works by passing a small amount of electricity via electrodes attached to the player’s body, contracting their muscles to actuate involuntary body movements. Our work explores this bodily I/O by creating three “body-actuated play” systems ranging from single-player to social game experiences. Ultimately, by studying the associated user experiences of these systems, we will deduce a prescriptive design framework for designing bodily games in which humans can use their bodies as input and output.

DOI: https://dl.acm.org/doi/abs/10.1145/3505270.3558367
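To make the input/output idea described in the abstract above more concrete, here is a minimal, hypothetical sketch (not taken from the paper): one hand provides input while the other hand would be actuated through an EMS channel. The EMSChannel class, its pulse method, and the intensity values are illustrative assumptions only; a real system would use calibrated, medically safe EMS hardware.

```python
# Hypothetical sketch of a "body as input/output" game loop; the EMSChannel
# class is a stand-in for real EMS hardware and simply logs the commands.
import random
import time


class EMSChannel:
    """Placeholder driver for one EMS channel (electrode pair on a forearm)."""

    def __init__(self, name: str, max_intensity: int = 30):
        self.name = name
        self.max_intensity = max_intensity  # calibrated per player, below discomfort

    def pulse(self, intensity: int, duration_s: float) -> None:
        # Clamp to the calibrated maximum, then "actuate" (here: just log and wait).
        intensity = min(intensity, self.max_intensity)
        print(f"[{self.name}] pulse at {intensity}% for {duration_s:.1f}s")
        time.sleep(duration_s)


def play_round(input_hand: str, output_channel: EMSChannel) -> None:
    """One round: the input hand plays a move, the system actuates the other hand."""
    move = random.choice(["rock", "paper", "scissors"])
    print(f"{input_hand} hand plays: {move}")
    # The computer shares control by pulsing the channel attached to the other hand.
    output_channel.pulse(intensity=25, duration_s=0.5)


if __name__ == "__main__":
    left_forearm = EMSChannel("left-forearm")
    for _ in range(3):
        play_round("right", left_forearm)
```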

Human-Computer Interaction (HCI) researchers are increasingly captivated by water interactions and have hence explored interactive devices to support aquatic activities in different settings (e.g., mixed realities in water parks). However, our understanding of the user experience of interacting with water and technology is still underdeveloped. To begin closing this gap, we explore flotation tanks as a water setting for playful interactive experiences. The goal of the associated somaesthetic design approach was to sensitize the body of the designer (the first author) by engaging with her experiences of interacting with water, and to create meaningful interactions. This preliminary work presents four different user experiences that can facilitate play, derived through a defamiliarization analysis of technology-mediated water interaction with the body. We offer HCI insights for design researchers interested in creating playful experiences in water settings, while also providing industry with initial strategies on how to enrich flotation tank sessions.

DOI: https://dl.acm.org/doi/abs/10.1145/3505270.3558324

Body-actuating technologies such as Electrical Muscle Stimulation (EMS) can actuate multiple players simultaneously via physical touch. To investigate this opportunity, we designed a game called “Touchmate”. Here, one guesser and two suspects sit across from each other with their legs hidden under a table. The guesser attaches a ground electrode from one EMS channel, and each suspect attaches one active electrode from the same channel to their forearm. When a suspect touches the guesser’s leg, their bodies complete the electrical circuit, actuating both of their hands involuntarily via the EMS. The guesser’s goal is to determine who touched their leg. In this paper, we present the results from our initial study and articulate three player experience themes. Ultimately, we hope our work inspires game designers to create physical touch games using body-actuating technologies.

DOI: https://dl.acm.org/doi/abs/10.1145/3505270.3558332

Our gastrointestinal health is influenced by complex interactions between our gut bacteria and multiple external factors. A wider understanding of these concepts is vital to help make gut-friendly decisions in everyday life; however, its complexity can challenge public understanding if not approached systematically. Research suggests that board games can help to playfully navigate complex subjects. We present Gooey Gut Trail (GGT), a board game to help players understand the multifactorial interactions that influence and sustain gut microbial diversity. Through the embodied enactment of in-game activities, players learn how their habits surrounding diet, physical activity, emotions, and lifestyle influence the gut microbial population. A qualitative field study with 15 participants revealed important facets of our game design that increased participants’ awareness, causing them to reflect upon their habits that influence gut health. Drawing upon the study insights, we present five design considerations to aid future playful explorations on nurturing human-microbial relationships.

DOI: https://dl.acm.org/doi/abs/10.1145/3549502

Applying the theory of Embodied Cognition through design allows us to create computational interactions that engage our bodies by modifying our body schema. However, in HCI, most of these interactive experiences have centred around creating sensing-based systems that leverage our body’s position and movement to offer an experience, such as games using the Nintendo Wii and Xbox Kinect. In this work, we created two pneumatic-inflatable-based prototypes that actuate our body to support embodied cognition in two scenarios by altering the user’s body schema. We call these “SomaFlatables” and demonstrate the design and implementation of these inflatable-based prototypes, which can move and even extend our bodies, allowing for novel bodily experiences. Furthermore, we discuss future work and the limitations of the current implementation.

DOI: https://dl.acm.org/doi/abs/10.1145/3526114.3558705

Aquatic recreation encompasses a variety of water-based activities from which participants gain physical, mental, and social benefits. Although interactive technologies for supporting aquatic recreation activities have increased in recent years, the HCI community does not yet have a structured understanding of approaches to interaction design for aquatic recreation. To contribute towards such an understanding, we present the results of a systematic review of 48 papers on the design of interactive technology for aquatic recreation, drawn from the ACM, IEEE, and SPORTDiscus libraries. This review presents an aquatic recreation user experience framework that details problems and opportunities concerning water and HCI. Our framework brings us closer to understanding how technology can interact with users and the aquatic environment to enhance the existing recreational experiences that connect us to aquatic environments. We found that designers can elicit delight, enablement, challenge, and synergy in aquatic recreation experiences.

DOI: https://dl.acm.org/doi/abs/10.1145/3532106.3533543

2021

Motor movements are performed while playing hand-games such as Rock-paper-scissors or Thumb-war. These games are believed to benefit both physical and mental health and are considered cultural assets. Electrical Muscle Stimulation (EMS) is a technology that can actuate muscles, triggering motor movements, and hence offers an opportunity for novel play experiences based on these traditional hand-games. However, there is only limited understanding of the design of EMS games. We present the design and evaluation of two games inspired by traditional hand-games, “Slap-me-if-you-can” and “3-4-5”, which incorporate EMS and, unlike traditional games, can be played alone. A thematic analysis of the data collected revealed three themes: 1) Gameplay experiences and influence of EMS hardware; 2) Interaction with EMS and the calibration process; and 3) Shared control and its effect on playing EMS games. We hope that an enhanced understanding of the potential of EMS to support hand-games can aid the advancement of movement-based games as a whole.

DOI: https://dl.acm.org/doi/abs/10.1145/3450337.3483464

Myopia is an eye condition that makes it difficult to focus on objects in the distance. It has become one of the most serious eye conditions worldwide and negatively impacts quality of life. Although myopia is prevalent, many non-myopic people have misconceptions about it and find it challenging to empathize with myopic situations. In this work, we developed two virtual reality (VR) games, Myopic Bike and Say Hi, to provide a means for the non-myopic population to experience the frustrations and difficulties of myopic people, i.e., riding a bicycle and greeting someone on a street. Our games simulate two inconvenient daily life scenarios that myopic people encounter when not wearing glasses. We evaluated four participants’ game experiences through questionnaires and semi-structured interviews. We propose that our two VR games can create an engaging and non-judgemental experience for the non-myopic population to better understand and empathize with those who suffer from myopia.

DOI: https://dl.acm.org/doi/abs/10.1145/3450337.3483505

Human-Computer Integration (HInt) is a growing paradigm within HCI which seeks to understand how humans can merge, and already are merging, with computational machines. HInt’s recent inception and evolution have seen much discussion in a variety of symposia, workshops, and publications in HCI. This has enabled a democratized and decentralised emergence of its core concepts. While this has allowed for rapid growth in our understanding of HInt, there is some discrepancy in how the proponents of this movement might describe its principles, motivations, definitions, and ultimate goals, with many offshoot concepts of HInt beginning to emerge. SIGHint aims to provide a platform to facilitate high-level discussion and the collation of information between researchers and designers seeking to learn from and contribute to the development of Human-Computer Integration. It is our intention that through this SIG we may better understand how new, emerging, and diverging ideas and perspectives within Human-Computer Integration relate to each other, ultimately facilitating a mapping of the paradigm and a synthesis of its concepts.

DOI: https://dl.acm.org/doi/10.1145/3411763.3450400

With the growing popularity of online access on virtual reality (VR) devices, it will become important to investigate interactive CAPTCHA (Completely Automated Public Turing test to tell Computers and Humans Apart) designs exclusive to VR devices. In this paper, we first present four traditional two-dimensional (2D) CAPTCHAs (i.e., text-based, image-rotated, image-puzzled, and image-selected CAPTCHAs) in VR. Then, based on the three-dimensional (3D) interaction characteristics of VR devices, we propose two vrCAPTCHA design prototypes (i.e., task-driven and bodily motion-based CAPTCHAs). We conducted a user study with six participants to explore the feasibility of our two vrCAPTCHAs and traditional CAPTCHAs in VR. We believe that our two vrCAPTCHAs can serve as inspiration for the further design of CAPTCHAs in VR.

DOI: https://dl.acm.org/doi/abs/10.1145/3411763.3451985

Interest in combining interactive play and the human body, using “bodily play” systems, is increasing. While these systems primarily prioritize a player’s control over their bodily actions, we see intriguing possibilities in the pursuit of “limited control over the body” as a design resource for bodily play systems. In this paper, we use three of our bodily play systems to illustrate how designers can engage with limited control over the body by varying the player’s degree of indirect control (for instance, via other bodily activity and external triggers). We also propose four strategies for employing limited control over the body: Exploration, Reflection, Learning and Embracement. We hope our own work and the strategies developed from it will assist designers to employ limited control over the body, ultimately helping people benefit from engaging their bodies through play.

DOI: https://dl.acm.org/doi/10.1145/3411764.3445744

2020

Somaesthetics – motivated by improving life quality via appreciation for bodily and sensory experiences – is increasingly influencing HCI designs. Investigating the potential of drones as a material for somaesthetic HCI, we designed Drone Chi: a Tai Chi-inspired close-range human-drone interaction experience. The design process for Drone Chi has been informed by the soma design approach and the Somaesthetic Appreciation concept from HCI literature. The artifact expands somaesthetic HCI by exemplifying dynamic and intimate somaesthetic interactions with a robotic design material, and body movements in expansive 3D space. To characterize the Drone Chi experience, we conducted an empirical study with 32 participants. Analysis of participant accounts revealed 4 themes that articulate different aspects of the experience: Looping Mental States, Environment, Agency vs. Control, and Physical Narratives. From these accounts and our craft knowledge, we derive 5 design implications to guide the development of movement-based close-range drone interactions.

DOI: https://dl.acm.org/doi/abs/10.1145/3313831.3376786

Bodily play systems are becoming increasingly prevalent, with research aiming to understand the associated player experience. We argue that a more nuanced lexicon describing “bodily play experience” can be beneficial to drive the field forward. We provide game designers with two German words to communicate two different aspects of experience: “Erfahrung”, referring to an experience that one is actively engaged in and gains knowledge from; and “Erlebnis”, referring to a tacit experience often translated as “lived experience”. We use these words to articulate a suite of design strategies for bodily play experiences by referring to past design work. We conclude by discussing these two aspects of experience in conjunction with two previously established perspectives on the human body. We believe this more nuanced lexicon can provide designers with a clearer understanding of bodily play, allowing them to guide players in gaining the many benefits from such experiences.

DOI: https://dl.acm.org/doi/abs/10.1145/3374920.3374926

There is an increasing trend in utilising interactive technology for bodily integrations, such as additional limbs and ingestibles. Prior work on bodily integrated systems has mostly examined them from a productivity perspective. In this article, we suggest examining this trend also from an experiential, playful perspective, as we believe that these systems offer novel opportunities to engage the human body through play. Hence, we propose that there is an opportunity to design “bodily integrated play”. By relating to our own and others’ work, we present an initial set of design strategies for bodily integrated play, aiming to inform designers on how they can engage with such systems to facilitate playful experiences, so that ultimately, people will profit from bodily play’s many physical and mental wellbeing benefits even in a future where machine and human converge.

DOI: https://dl.acm.org/doi/abs/10.1145/3374920.3374931

2018

Ingestible sensors, such as capsule endoscopy and medication monitoring pills, are becoming increasingly popular in the medical domain, yet few studies have considered what experiences may be designed around ingestible sensors. We believe such sensors may create novel bodily experiences for players when it comes to digital games. To explore the potential of ingestible sensors for game designers, we designed a two-player game – the “Guts Game” – where the players play against each other by completing a variety of tasks. Each task requires the players to change their own body temperature, measured by an ingestible sensor. Through a study of the Guts Game (N=14) in which we interviewed players about their experience, we derived four design themes: 1) Bodily Awareness, 2) Human-Computer Integration, 3) Agency, and 4) Uncomfortableness. We used the four themes to articulate a set of design strategies that designers can consider when aiming to develop engaging ingestible games.

DOI: https://dl.acm.org/doi/abs/10.1145/3242671.3242681

There is an increasing trend in HCI of studying human-food interaction; however, we find that most work so far seems to focus on what happens to the food before and during eating, i.e. the preparation and consumption stages. In contrast, there is limited understanding and exploration around using interactive technology to support the embodied plate-to-mouth movement of food during consumption, which we aim to explore through a playful design in a social eating context. We present Arm-A-Dine, an augmented social eating system that uses wearable robotic arms attached to diners’ bodies for eating and feeding food. Extending the work to a social setting, Arm-A-Dine is networked so that a person’s third arm is controlled by the affective responses of their dining partner. From the study of Arm-A-Dine with 12 players, we articulate three design themes – reduce bodily control during eating, encourage savouring by drawing attention to sensory aspects during eating, and encourage crossmodal sharing during eating – to assist game designers and food practitioners in creating playful social eating experiences. We hope that our work inspires further explorations around food and play that consider all eating stages, ultimately contributing to our understanding of playful human-food interaction.

DOI: https://dl.acm.org/doi/abs/10.1145/3242671.3242710

Games research in HCI is continually interested in the human body. However, recent work suggests that the field has only begun to understand how to design bodily games. We propose that the games research field is advancing from playing with digital content using a keyboard, to using bodies to play with digital content, towards a future where we experience our bodies as digital play. To guide designers interested in supporting players to experience their bodies as play, we present two phenomenological perspectives on the human body (Körper and Leib) and articulate a suite of design tactics using our own and other people’s work. We hope with this paper, we are able to help designers embrace the point that we both “have” a body and “are” a body, thereby aiding the facilitation of the many benefits of engaging the human body through games and play, and ultimately contributing to a more humanized technological future.

DOI: https://dl.acm.org/doi/abs/10.1145/3173574.3173784

2017

Regular breathing exercises can be a beneficial part of leading a healthy life. Digital games may have the potential to help people practice breathing exercises in an engaging way; however, designing breathing exercise games is not well understood. To contribute to such an understanding, we created Life Tree as the culmination of three prototypal breathing games. Life Tree is a virtual reality (VR) game in which a player controls the growth of a tree by practising pursed-lip breathing. We selected VR head-mounted display technology because it allows players to focus and limit external distractions, which is beneficial for breathing exercises. 32 participants played Life Tree, and analysis of the collected data identified four key themes: 1) Designing Breathing Feedback; 2) Increasing Self-Awareness of Breathing and Body; 3) Facilitating Focused Immersion; and 4) Engagement with Breathing Hardware. We used these themes to articulate a set of breathing exercise game design strategies that future game designers may consider when developing engaging breathing exercise games.

DOI: https://dl.acm.org/doi/abs/10.1145/3116595.3116621

2016

In recent years, attention to digital breathing games has increased, driven by new technology that allows interaction between breathing and video games. While some breathing games use breath as a fun form of interaction, others use breathing to improve a player’s mental health, for example by reducing stress and anxiety. So far, little research has been devoted to understanding breathing game design. To develop an understanding of the design of breathing games, we begin by proposing a taxonomy based on the factors of game genre, game design analysis based on the human body senses involved, the breathing technique used, the aim of the breathing technique, the technology used to experience the game world, and the technology used to measure breathing. To demonstrate the strength of our taxonomy, we analyze example games and discuss how the novel taxonomy could help game designers create breathing games.

DOI: https://exertiongameslab.org/wp-content/uploads/2016/05/breathsenses_multisensory_chi2016.pdf


Journals


2020

Theme park visits can be very playful events for families; however, waiting in ride queues can often be a cause of great frustration. We developed a novel augmented reality game to be played in the theme park’s queue, and an in-the-wild study with X participants using log data and interviews demonstrated that each minute of playing was perceived to the same extent as about 5 minutes of not playing the game. We articulate a design space for researchers and strategies for game designers aiming to reduce perceived waiting time in queues. With our work, we hope to extend how we use games in everyday life to make our lives more playful.

DOI: https://dl.acm.org/doi/abs/10.1145/3361524


Workshops


2024

The human-computer interaction community has evolved from using body-sensing to body-actuating technologies, transforming the body’s role from a mere input to an input-output medium. With body-sensing, the separation between the human and the computer is clear, allowing for an easy understanding of who is in control. However, with body-actuating technologies, this separation diminishes. These technologies integrate more closely with our bodies, where both the user and the technology can share control over their bodily interactions. In this workshop, we will explore this notion of sharing control, specifically focusing on experiences where users interact with their own bodies (intra-corporeal experiences), and interact with others using technology (inter-corporeal experiences). Our discussions and group activities will focus on brainstorming and designing within human augmentation, examining how this shared control can lead to innovative applications.

DOI: https://doi.org/10.1145/3652920.3653037

2023

Assistive Augmentation, the intersection of human-computer interaction, assistive technologies and human augmentation, was broadly discussed at the CHI’14 workshop and subsequently published as an edited volume in the Springer Cognitive Science and Technology series. In this workshop, the aim is to propose a more structured way to design Assistive Augmentations. In addition, we aim to discuss the challenges and opportunities for Assistive Augmentations in light of current trends in research and technology. Participants of the workshop need to submit a short position paper or interactive system demonstration, which will be peer-reviewed. The selected position papers and demos will kick off a face-to-face discussion at the workshop. Participants will also be invited to extend the workshop discussion into a journal submission to a venue such as Foundations and Trends in Human-Computer Interaction.

DOI: https://dl.acm.org/doi/abs/10.1145/3582700.3582729

2021

People engage in sportive activities for reasons beyond improving their athletic performance. They also seek experiences like fun, adventure, a feeling of oneness, clearing their heads, and flow. Since sport is a highly bodily experience, we argue that taking an embodied interaction perspective to inspire the interaction design of sports systems is a promising direction in HCI research and practice. This workshop will address the challenges of designing interactive systems in the realm of sports from an embodied interaction perspective, focusing on athletes’ experience rather than performance. We will explore how interactive systems can enhance the sports experience without distracting from the actual goal of the athlete, such as freeing the mind. We will focus on several topics of interest such as sensory augmentation, augmented experience, multi-modal interaction, and motor learning in sports.

DOI: https://dl.acm.org/doi/10.1145/3411763.3441329

Web: https://sports-hci.com/

While many systems have successfully demonstrated the functional integration of humans and technology, little attention has been paid to how technologies might experientially integrate, so that they feel as if they were part of us. Our aim is to shed light on the importance of experiential integration and provide researchers with a scientifically driven foundation for future designs and investigations. The workshop will consist of hands-on experiments with novel body-illusions, discussions on experiential integration, and instructor-guided sessions on psychological concepts related to the design and evaluation of experiential integration.

DOI: https://dl.acm.org/doi/10.1145/3411763.3441355

Web: https://cyborgdreams.media.mit.edu/

2020

Inbodied interaction is an emerging area in HCI that aligns how the body performs internally with our designs, to support and optimise human performance. Inbodied Interaction therefore relies on knowledge of our physiology, neurology, kinesiology, etc., blended with HCI methodology. Recent Inbodied Interaction workshops and summer schools have been designed to share models of these processes to accelerate access to these areas of specialisation for HCI researchers. As such, this one-day hands-on studio presents an extension of this work – an Inbodied Interaction framework – to make the inbodied sciences (1) accessible and (2) usable for HCI practitioners when it comes to crafting experiences, whether for health, performance or play. Our framework also offers a design alternative to cyborging futures: whereas these seek to augment human performance, Inbodied Interaction seeks to help discover and optimise human potential. In this studio, we will explore where inbodied interaction fits in the narrative of our future bodies.

DOI: https://dl.acm.org/doi/abs/10.1145/3374920.3374969

There is mounting evidence acknowledging that embodiment is foundational to cognition. In HCI, this understanding has been incorporated in concepts like embodied interaction, bodily play, and natural user-interfaces. However, while embodied cognition suggests a strong connection between motor activity and memory, we find the design of technological systems that target this connection to be largely overlooked. Considering this, we are provided with an opportunity to extend human capabilities through augmenting motor memory. Augmentation of motor memory is now possible with the advent of new and emerging technologies including neuromodulation, electric stimulation, brain-computer interfaces, and adaptive intelligent systems. This workshop aims to explore the possibility of augmenting motor memory using these and other technologies. In doing so, we stand to benefit not only from new technologies and interactions but also a means to further study cognition.

DOI: https://dl.acm.org/doi/abs/10.1145/3334480.3375163

Web: https://motorhci.com/


Books


2022

Human-Computer Integration (HInt) is an emerging paradigm in the human-computer interaction (HCI) field. Its goal is to integrate the human body and the computational machine. This monograph presents two key dimensions of Human-Computer Integration (bodily agency and bodily ownership) and proposes a set of challenges that we believe need to be resolved in order to bring the paradigm forward. Ultimately, our work aims to facilitate a more structured investigation into the integration of the human body and computational machines.

DOI: https://www.nowpublishers.com/article/Details/HCI-086


Master Thesis


Regular breathing exercises can be beneficial for leading a healthy life. Digital games may have the potential to help people practice breathing exercises in an engaging way. However, little is understood about how to design such breathing exercise games. To contribute to such an understanding, I created three Virtual Reality (VR) prototype games that used breathing as the primary control mechanism. I selected virtual reality head-mounted display technology because it allows players to focus on the virtual environment and limit external distractions, which is beneficial for breathing exercises. I conducted a formal analysis of gameplay on my three prototypes to develop my final game, Life Tree. Life Tree is a single-player VR game in which a player controls the growth of a virtual tree by practicing the pursed-lip breathing technique. Thirty-two participants were interviewed and filled out a questionnaire after playing Life Tree. Analysis of the data identified four key themes affecting the experience of participants: 1) Designing Breathing Feedback; 2) Increasing Self-awareness of Breathing and Body; 3) Facilitating Focused Immersion; and 4) Engagement with Breathing Hardware. I used these themes to articulate a set of breathing exercise game design strategies. Game designers may consider these design strategies to develop engaging breathing exercise games.

Master’s Thesis [PDF, 3.55 MB]
