catlab

PUBLISHED: Factors that affect younger and older adults’ causal attributions of robot behaviour

Pak, R., Crumley-Branyon, J. J., de Visser, E. J., & Rovira, E. (2020). Factors that affect younger and older adults’ causal attributions of robot behavior. Ergonomics, (just-accepted), 1-49.

Abstract

Stereotypes are cognitive shortcuts that facilitate efficient social judgments about others. Just as causal attributions affect perceptions of people, they may similarly affect perceptions of technology, particularly anthropomorphic technology such as robots. In a scenario-based study, younger and older adults judged the performance and capability of an anthropomorphised robot that appeared young or old. In some cases, the robot successfully performed a task while at other times it failed. Results showed that older adult participants were more susceptible to aging stereotypes as indicated by trust. In addition, both younger and older adult participants succumbed to aging stereotypes when rating the perceived capability of the robots. Finally, a summary of causal reasoning results showed that our participants may have applied aging stereotypes to older-appearing robots: they were most likely to give credit to a properly functioning robot when it appeared young and performed a cognitive task. Our results tentatively suggest that human theories of social cognition do not wholly translate to technology-based contexts and that future work may elaborate on these findings.

Practitioner summary: Perception and expectations of the capabilities of robots may influence whether users accept and use them, especially older users. The current results suggest that care must be taken in the design of these robots as users may stereotype them.

 

Robots and elder care — NYTimes

I was quoted in a recent story on the ethical dilemma of robots and elder care by Maggie Jackson:

Many in the field see the tensions and dilemmas in robot care, yet believe the benefits can outweigh the risks. The technology is “intended to help older adults carry out their daily lives,” says Richard Pak, a Clemson University scientist who studies the intersection of human psychology and technology design, including robots. “If the cost is sort of tricking people in a sense, I think, without knowing what the future holds, that might be a worthy trade-off.” Still he wonders, “Is this the right thing to do?”

Liberty Mutual Medal

Our recent paper was awarded the 2019 Liberty Mutual Medal.

The IEA/Liberty Mutual Medal was instituted in 1998 and consists of a plaque (certificate), medal, and monetary award ($10,000US). This medal recognizes outstanding original research leading to the reduction or mitigation of work-related injuries and/or to the advancement of theory, understanding, and development of occupational safety research.

PUBLISHED: Detecting Automation Failures in a Simulated Supervisory Control Environment

Foroughi, C. K., Sibley, C., Brown, N. L., Rovira, E., Pak, R., & Coyne, J. T. (2019). Detecting Automation Failures in a Simulated Supervisory Control Environment. Ergonomics, 1–22. https://doi.org/10.1080/00140139.2019.1629639

The goal for this research was to determine how individuals perform and allocate their visual attention when monitoring multiple automated displays that differ in automation reliability. Ninety-six participants completed a simulated supervisory control task where each automated display had a different level of reliability (viz., 70%, 85%, and 95%). In addition, participants completed a high and a low workload condition. The performance data revealed that 1) participants failed to detect automation misses approximately 2.5 times more often than automation false alarms, 2) participants had worse automation failure detection in the high workload condition, and 3) participants' automation failure detection remained mostly static across reliability levels. The eye tracking data revealed that participants spread their attention relatively equally across all three of the automated displays for the duration of the experiment. Together, these data support a system-wide trust approach as the default position of an individual monitoring multiple automated displays.

Practitioner Summary: Given the rapid growth of automation throughout the workforce, there is an immediate need to better understand how humans monitor multiple automated displays concurrently. The data in this experiment support a system-wide trust approach as the default position of an individual monitoring multiple automated displays.

Keywords: automation; automation failures; human-automation interaction; supervisory control; attention allocation; system-wide trust; eye-tracking

PUBLISHED: Looking for age differences in self-driving vehicles: Examining the effects of automation reliability, driving risk, and physical impairment on trust

Full text link

Self-driving cars represent an extremely high level of autonomous technology and a promising means of helping older adults safely maintain their independence. However, human behavior with automation is complex and not straightforward (Parasuraman & Riley, 1997; Parasuraman, 2000; Parasuraman & Wickens, 2008; Parasuraman & Manzey, 2010; Parasuraman, de Visser, Lin, & Greenwood, 2012; Rovira, McGarry, & Parasuraman, 2007). In addition, because no fully self-driving vehicles are yet available to the public, most research has been limited to subjective survey-based assessments that depend on respondents' limited knowledge, drawn from second-hand reports, and do not reflect the complex situational and dispositional factors known to affect trust and technology adoption.

33 Questions Psychology Must Answer…

The American Psychological Association recently asked 33 psychologists to identify critical questions yet to be answered in their specific areas of psychology. I had the honor of answering for Engineering Psychology (the human factors division):

Leaps in technological evolution will turn simple tools into autonomous teammates that have the ability to communicate with us in ways that are even more personal and accessible. A diverse range of new users will collaborate with these entities in new settings. The goal of engineering psychology has always been to enhance the safety, performance, and satisfaction of human-machine interaction. We must adapt to the idea that these machines are quickly changing and becoming less tool-like and more human-like. How will this new human/machine paradigm affect human safety, satisfaction, and performance?

Check out the interesting questions from the other areas; AI is mentioned a few times too!

Available Now: Aging, Technology, and Health Book

Our new edited book, Aging, Technology, and Health, is now available!  The book is a collection of contributions from researchers on the leading edge of using technology to maintain older adults' health and well-being.  The topics range from current to future uses and applications of technology.  Here is an excerpt:

Technology has always been used by people of all ages to help manage chronic conditions (e.g., diabetes). However, it is increasingly being used by older adults for a wide range of health-related purposes, from maintaining fitness to leading a more engaged life (e.g., maintaining communication). This book takes a problem-centered approach to understanding how different human factors knowledge and methods are used to examine older adults' use of technology for health.
Pak, R., & McLaughlin, A. C. (Eds.). (2018). Aging, Technology and Health. Elsevier.

Technology has been used by people of all ages to help manage their health, from the first branch crafted into a toothbrush to a home glucose monitor for people with diabetes. People over 65 now have many technologies to track everything from activity to diet to medication. Most of these include a social dimension for sharing health information, progress, and goals with loved ones or healthcare providers. However, as is often true with a boom in new technology, these health technologies do not always take into account the ease of use for older adults. For example, smartphones are often the gateway and controller for many health technologies. But the smartphone form factor may not be the optimal choice, merely the most convenient. Such usability issues become even more apparent when the user has age-related challenges in vision, hearing, cognition, and movement control.

PUBLISHED: From “automation” to “autonomy”: The importance of trust repair in human-machine interaction

Modern interactions with technology are increasingly moving away from simple human use of computers as tools to the establishment of human relationships with autonomous entities that carry out actions on our behalf. In a recent commentary, Peter Hancock (Hancock, 2017) issued a stark warning to the field of human factors that attention must be focused on the appropriate design of a new class of technology: highly autonomous systems. In this article, we heed the warning and propose a human-centered approach directly aimed at ensuring that future human-autonomy interactions remain focused on the user's needs and preferences. By adapting literature from industrial psychology, we propose a framework to infuse a unique human-like ability, building and actively repairing trust, into autonomous systems. We conclude by proposing a model to guide the design of future autonomy and a research agenda to explore current challenges in repairing trust between humans and autonomous systems.

Practitioner summary

This paper is a call to practitioners to re-cast our connection to technology as akin to a relationship between two humans rather than between a human and their tools. To that end, designing autonomy with trust repair abilities will help ensure that future technology maintains and repairs relationships with its human partners.

Link: https://www.tandfonline.com/doi/abs/10.1080/00140139.2018.1457725