TechX papers accepted for the EMAC proceedings

Several papers by the TechX lab team have been accepted for presentation at the EMAC 2020 conference.

Good Buzz, Bad Buzz: Mobile Vibrations as Rewards and Modifiers of Consumer Choice

Author: William Hampton

Many people spend a large portion of their day interacting with vibrating mobile devices, yet how we perceive the vibrotactile sensations these devices emit, and how those sensations affect consumer choice, remains largely unknown. Building on haptic sensory processing research, we examine the functional relationship between vibration duration and perception, and the role of vibrational stimuli as rewards and modifiers of choice. We find that mobile vibrations of an intermediate duration are consistently perceived as rewarding and can boost purchasing in an ecological online shopping environment, whereas short or long durations are perceived as neutral or punishing, respectively. We further show that these effects are amplified for impulsive consumers and relate to a range of demographic and psychological trait variables. Our findings have important implications for the effective design of haptic human-machine interfaces in marketing and for the role of vibrotactile stimuli as a novel form of reward.

Morphing Vulnerable Machines: Paralinguistic Cues in Digital Voice Assistants Shape Perceptions of Physicality, Vulnerability, and Trust

Author: Fotios Efthymiou

Paralinguistic features in the human voice are reliable cues for detecting discrete momentary emotions and for making personality inferences about other humans. By attending to different paralinguistic features in the human voice, people infer differences in personality (Mohammadi, Origlia, Filippone, & Vinciarelli, 2012). The current work provides evidence from a series of tightly controlled experiments that altering digital voice assistants along the vibrato dimension (i.e., systematic changes in the pitch of a synthesized voice) causes systematic changes in personality perception and trust, while holding message content, syntax, and other paralinguistic cues constant. Our results demonstrate that humans attribute greater submissiveness, lower dominance, and reduced perceptions of power to a digital voice assistant as vibrato increases. We further show that these perceptions are explained by altered perceptions of physicality, such that greater vibrato leads people to perceive the voice assistant as older and smaller. Moreover, we show that these changes in perception drive subsequent attributions of trust and are robust across a broad range of consumer demographics and psychological trait measures.

Machine Talk: How Conversational Interfaces Promote Brand Intimacy and Influence Consumer Choice

Author: Anouk Bergner

This work examines the effects of conversational interfaces on consumers’ brand perceptions and purchase decisions. We introduce a conceptual model of technology-mediated communication that builds on insights from prior work on human-to-human conversations and mind perception, and that incorporates both structural aspects of conversations (turn-taking and system autonomy) and design features of the interface (the extent of linguistic formality and anthropomorphic appearance), to advance our understanding of how conversational interfaces transform consumer-firm interactions. Based on a series of five studies conducted in the field and the lab, we show that consumers perceive conversational interfaces as substantially more human-like than comparable non-conversational interfaces, and that this greater perception of human-like characteristics results in more intimate consumer-firm relationships, leads to higher prices paid for target products, and renders consumers more likely to accept recommended options.

Black-Box Emotion Detection: On the Variability and Predictive Accuracy of Automated Emotion Detection Algorithms

Author: Francesc Busquet

The ubiquitous availability of image data, advances in cloud computing, and recent developments in classification algorithms have given rise to a new class of automated emotion detection systems that claim to perform accurate emotion detection from faces at scale. In this research, we provide a tightly controlled validation study using pretrained emotion detection algorithms from the Google ML, Microsoft Cognitive Service, GfK EmoScan, and other platforms to test robustness and consistency across and within current emotion detection systems. Our results demonstrate considerable variability in predictive validity across emotion detection systems and high variability across different types of discrete emotions: strong positive emotions (such as an open-mouth smile) are easier to classify than negative emotions such as anger or fear, and we detect sizable positive correlations between theoretically opposite emotions (such as surprise and fear). We provide two modeling strategies to improve prediction accuracy, either by combining feature sets or by averaging across emotion detection systems using ensemble methods.
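The ensemble-averaging strategy mentioned in the abstract can be illustrated with a minimal sketch. This is not the paper's actual pipeline; the system names and confidence scores below are hypothetical, and the sketch only shows the core idea of averaging per-emotion scores across several detection systems.

```python
def ensemble_average(predictions):
    """Average per-emotion confidence scores across detection systems.

    predictions: dict mapping system name -> dict of emotion -> score.
    Returns a dict mapping each emotion to its mean score across systems;
    a system that reports no score for an emotion contributes 0.0.
    """
    emotions = set()
    for scores in predictions.values():
        emotions.update(scores)
    n_systems = len(predictions)
    return {
        emotion: sum(scores.get(emotion, 0.0) for scores in predictions.values())
        / n_systems
        for emotion in sorted(emotions)
    }


# Hypothetical per-system outputs for a single face image.
predictions = {
    "system_a": {"joy": 0.92, "anger": 0.03, "fear": 0.05},
    "system_b": {"joy": 0.80, "anger": 0.15, "fear": 0.05},
    "system_c": {"joy": 0.86, "anger": 0.04, "fear": 0.10},
}

averaged = ensemble_average(predictions)
top_emotion = max(averaged, key=averaged.get)
```

Averaging across systems smooths out the idiosyncratic errors of any single classifier, which is why the abstract reports it as one way to improve prediction accuracy.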
