Here comes the bad news: Doctor robot taking over

Johan F. Hoorn, Sonja D. Winter

Research output: Contribution to Journal › Article › Academic › peer-review

Abstract

To test to what extent the Media Equation and Computers Are Social Actors (CASA) validly explain user responses to social robots, we manipulated how a bad health message was framed and the language that was used. Following Experiment 2 of Burgers et al. (Patient Educ Couns 89(2):267–273, 2012. https://doi.org/10.1016/j.pec.2012.08.008), a human versus a robot doctor delivered health messages framed positively or negatively, using affirmations or negations. Using frequentist (robots are different from humans) and Bayesian (robots are the same) analyses, we found that participants liked the robot doctor and the robot’s message better than the human’s. The robot also elicited more compliance with the medical treatment. On the level of expected quality of life, the human and robot doctor tied. The robot was not seen as affectively distant but rather as involving, ethical, and skilled, and people wanted to consult her again. Note that the doctor robot was not a serious-looking physician but a little girl with the voice of a young woman. We conclude that both the Media Equation and CASA need to be revised when it comes to robot communication. We argue that if certain negative qualities are filtered out (e.g., strong emotional expression), credibility will increase, which lowers the affective distance to the messenger. Robots sometimes outperform humans on emotional tasks, which may relieve physicians of the highly demanding duty of disclosing unfavorable information to a patient.
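
The dual analysis strategy mentioned above follows a complementary logic: a frequentist test can only detect that the robot and human conditions differ, whereas a Bayes factor can also quantify evidence that they are equivalent. The sketch below (Python, not the authors' analysis code) illustrates that logic on simulated liking ratings: an independent-samples t-test for the difference hypothesis, plus a BIC-based approximation of the Bayes factor in favour of the null, BF01 (after Wagenmakers, 2007). All group sizes, means, and thresholds are hypothetical.

import numpy as np
from scipy import stats

rng = np.random.default_rng(42)
# Simulated 7-point liking ratings; group sizes, means, and SDs are invented.
human_doctor = rng.normal(loc=4.2, scale=1.0, size=60)
robot_doctor = rng.normal(loc=4.8, scale=1.0, size=60)

# Frequentist analysis: are the robot and human messengers rated differently?
t, p = stats.ttest_ind(robot_doctor, human_doctor)

# Bayesian analysis: BIC approximation of the Bayes factor for the null
# ("robots are the same") over the alternative ("robots are different").
n = len(human_doctor) + len(robot_doctor)  # total sample size
df = n - 2                                 # degrees of freedom of the t-test
bf01 = np.sqrt(n) * (1.0 + t**2 / df) ** (-n / 2.0)

print(f"t({df}) = {t:.2f}, p = {p:.3f}")   # small p: evidence for a difference
print(f"BF01 = {bf01:.2f}")                # BF01 > 3: evidence for equivalence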

Original language: English
Pages (from-to): 519-535
Number of pages: 17
Journal: International Journal of Social Robotics
Volume: 10
Issue number: 4
Early online date: 13 Dec 2017
DOI: 10.1007/s12369-017-0455-2
Publication status: Published - Sep 2018


Keywords

  • CASA
  • Communication
  • Framing
  • Healthcare
  • Language
  • Media Equation

Cite this

@article{1c556478593a466b8cb9364c1fec71df,
  title = "Here comes the bad news: Doctor robot taking over",
  author = "Hoorn, {Johan F.} and Winter, {Sonja D.}",
  keywords = "CASA, Communication, Framing, Healthcare, Language, Media Equation",
  year = "2018",
  month = "9",
  doi = "10.1007/s12369-017-0455-2",
  language = "English",
  journal = "International Journal of Social Robotics",
  issn = "1875-4791",
  publisher = "Springer Verlag",
  volume = "10",
  number = "4",
  pages = "519--535",
}

Here comes the bad news: Doctor robot taking over. / Hoorn, Johan F.; Winter, Sonja D.

In: International Journal of Social Robotics, Vol. 10, No. 4, 09.2018, p. 519-535.
