Psychological Mechanisms and Ethical Dilemmas of Ambient Persuasive Technology

ECIS 

Workshop "Persuasive Technology for Health and Wellness"
 
TU Eindhoven (TU/e), November 28 and 29, 2011
 
Organizing Committee: Felicitas Kraemer, Jaap Ham, Philip Nickel, and Andreas Spahn
 
Keynote Speakers: Reinder Haakma (Philips); Arie Dijkstra (University of Groningen, Social Psychology); Søren Holm (University of Manchester, Bioethics)
 
Workshop Language: Dutch (main language) & English
 
The aim of the workshop is to bring together stakeholders and researchers in the field of emerging persuasive technologies for healthcare and well-being. How can technology support physicians and patients in restoring health and treating diseases? How can it stimulate adherence to a healthy lifestyle? What are the different design principles, ethical issues and psychological mechanisms surrounding the use of these technologies for hospital and home care, e-health, diagnosis and prevention? What are the institutional, legal and technical issues linked to these technologies? The workshop is meant to explore the most pressing issues surrounding these technologies and to set the agenda for future research in this field. The focus will be on how research in ethics and HTI (Human-Technology Interaction) can help address these issues, bringing stakeholders and scientists together.
 
Background:
 
The physician-patient relationship is special from an ethical point of view: it involves relations of trust, care, diligence and professionalism. But this age-old interpersonal relationship is nowadays increasingly mediated by technology. This technology can no longer be seen as a neutral instrument in the hands of the professional, because it is being used to take over or assist some of the physician's tasks that in the past only humans could do, such as diagnosis, communication (e.g., telemedicine), and the storage, retrieval and search of medical records.
 
A striking example of a technology that takes over human tasks is persuasive healthcare technology. Persuasive technology in the healthcare domain motivates people to lead a healthier lifestyle by mediating prevention and treatment. Such technology is extremely promising in cases where behavior is a key part of maintaining or improving one's health. But it also raises many issues for ethics, psychology, medicine and medical engineering. Our two-day workshop will bring together stakeholders and researchers to explore the ethical, psychological and technological issues surrounding persuasive technology in this domain.
 
The workshop will foster contacts between stakeholders, industry and researchers from different disciplines and lay the groundwork for a funding initiative in the field. Active participation and input is requested from all participants.
 
Registration is closed, but if you wish to participate, please send an email to Felicitas Kraemer f [dot] kraemer [at] tue [dot] nl.
 
 
General description of the project:
 
The Secret Persuader: Psychological Mechanisms and Ethical Dilemmas of Ambient Persuasive Technology
 
1. General Aim
It is commonly recognized that one of the important challenges of contemporary technological development is the need for a transition towards a sustainable society. At the same time it is recognized that technology alone cannot bring about this transition: individuals also need to change their behavior. How do we motivate agents to strive for sustainable behavior, even when this might conflict with other interests they have?
This is where persuasive technology comes in. It aims at persuading human agents to behave in socially valued ways by giving information, providing feedback, and taking over actions. Persuasive technologies (PT) are technologies that are "intentionally designed to change a person's attitude or behavior or both" (IJsselsteijn et al., 2006; Fogg, 2003). Examples are instruments that give car drivers feedback about their fuel consumption, and robots that take on the role of social actors and praise or criticize users depending on their performance (e.g., Ham, Midden, & Tak, 2008).
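To make the idea of such feedback concrete, a rule of this general kind can be sketched in a few lines. The function name, the thresholds and the messages below are hypothetical illustrations, not part of any system described in the cited studies:

```python
# Hypothetical sketch of a persuasive feedback rule: map observed fuel
# consumption to positive or negative social feedback for the driver.
# Thresholds and messages are invented placeholders.

def social_feedback(consumption_l_per_100km: float, target: float = 6.0) -> str:
    """Return a praise or criticism message depending on performance vs. target."""
    if consumption_l_per_100km <= target:
        return "Well done! You are driving economically."
    elif consumption_l_per_100km <= target * 1.2:
        return "Almost there - slightly gentler acceleration would help."
    else:
        return "You are using a lot of fuel. Try anticipating traffic more."

print(social_feedback(5.4))
print(social_feedback(7.8))
```

The point of the sketch is only that the persuasive element lives in the mapping from measured behavior to evaluative (social) feedback, the mechanism studied in Ham, Midden, & Tak (2008).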
However, in daily life, people probably do not have the focal attention available that PT requires. Therefore, research that studies traditional PT might suggest promising results, but PT’s effectiveness might decrease when focal attention is lost. For example, PT that gives car drivers information about their fuel consumption might be ignored within weeks after installation and thereby stop being effective.
In this research proposal, we propose investigating a solution to this issue: Ambient Persuasive Technology. This new line of persuasive technologies can influence user behavior and attitudes at a low level of conscious attention. Researchers have recently started to investigate PT that can be integrated unobtrusively into the environment. This allows new forms of influencing and offers some important advantages over more focal persuasive technologies. One of these advantages is the ability to deploy influence attempts at exactly the right time and place. Ambient displays, for example, provide users with information by making it available in an environment through "subtle changes in form, movement, sound, color, smell, temperature or light" (Wisneski et al., 1998).
Intriguingly, recent research by Ham and colleagues indicates that ambient persuasive technology is indeed able to unconsciously influence a person. That is, even without directing conscious attention to the influencing (ambient) persuasive technology, this technology can be effective in changing attitudes (Ham et al., 2009).
This attempt to direct user behavior through subliminal or barely conscious means raises important new questions that demand to be addressed from psychological and ethical perspectives.
Our key research question is: Under which conditions could ambient persuasive technology contribute to the realization of our social values (e.g. sustainability)?
We aim at investigating both the psychological conditions under which ambient persuasion might be effective and the ethical conditions under which the use of these technologies might be morally permitted. We suggest that the two questions are closely linked.
1.1 Psychology and Ethics
We argue that in understanding the mechanisms of ambient persuasive technologies, ethical and psychological issues are closely intertwined. For example, there is reason to assume that the ethical beliefs we hold influence our actions and have an impact on the way we react when a person or a technology tries to persuade us. Therefore, it must be investigated whether solid ethical convictions make ambient persuasion attempts more effective on people who share these values. For example, we would measure people's ethical convictions about persuasive technology and investigate their relationship to susceptibility to ambient persuasive technology. Also, we could manipulate ethical convictions, or assess people's reactions to (ambient) persuasive attempts in spite of their moral convictions. The relation between consciously held ethical beliefs and persuasion attempts is an interesting field in which little research has been done so far.
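The first analysis step of such a study could be sketched as follows. The variable names, the rating scales and all numbers are invented placeholders for illustration, not data or results:

```python
# Hypothetical sketch: relate self-reported ethical convictions about
# persuasive technology to susceptibility to an ambient persuasion attempt.
# All numbers below are invented placeholders, not empirical data.

def pearson_r(xs, ys):
    """Pearson correlation coefficient between two equal-length samples."""
    n = len(xs)
    mx = sum(xs) / n
    my = sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    var_x = sum((x - mx) ** 2 for x in xs)
    var_y = sum((y - my) ** 2 for y in ys)
    return cov / (var_x ** 0.5 * var_y ** 0.5)

# Per-participant scores (e.g., 1-7 Likert ratings): how acceptable the
# participant finds persuasive technology, and how strongly the ambient
# feedback shifted their attitude in the experiment.
ethical_approval = [2, 3, 3, 4, 5, 5, 6, 7]
attitude_shift   = [1, 2, 3, 3, 4, 5, 5, 6]

r = pearson_r(ethical_approval, attitude_shift)
print(f"correlation between approval and susceptibility: r = {r:.2f}")
```

A positive correlation in such a design would be a first indication that shared ethical convictions make ambient persuasion attempts more effective; experimental manipulation of convictions would then be needed to establish direction of causation.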
1.2 Psychological Issues
From a psychological perspective we will analyze whether an unconscious influence on user behavior and attitude is possible and under which conditions it is likely to be effective. Initial research, as mentioned, has already shown that a change in user behavior and attitudes might be possible (Ham et al., 2009). We argue that from a psychological perspective, understanding the psychological mechanisms of ambient persuasive technology is fundamental for effective human-technology interaction. Moreover, research on (ambient) persuasive technology opens up a new world of psychological science. That is, questions about the workings of (unconscious) human persuasion can now be investigated outside of the restrictions of human-to-human settings.
Based on earlier work (among others, Cvetkovich & Lofstedt, 1999) we conjecture that the level of trust (e.g., in the supposed persuading agent, or in the situation in which the persuasion takes place) might be an important predictor of the effectiveness of ambient persuasive technology. We argue so because trust gains relevance especially in situations in which little cognitive processing is possible, and it is in these situations that ambient persuasive technology can be most effective.
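This conjecture could later be examined in a simple 2x2 design crossing trust with cognitive load. The following is a minimal sketch with invented placeholder scores, not empirical results:

```python
# Hypothetical sketch of the conjecture that trust predicts the effectiveness
# of ambient persuasion especially when little cognitive processing is
# possible. All cell values are invented placeholders, not empirical data.

from statistics import mean

# Persuasion-effect scores per condition: (trust level, cognitive load).
effects = {
    ("low",  "low_load"):  [2.1, 2.4, 2.0],
    ("high", "low_load"):  [2.6, 2.8, 2.5],
    ("low",  "high_load"): [1.2, 1.5, 1.1],
    ("high", "high_load"): [3.4, 3.6, 3.1],
}

def trust_gain(load):
    """Mean effect difference (high trust minus low trust) under a given load."""
    return mean(effects[("high", load)]) - mean(effects[("low", load)])

print("trust gain, low load: ", round(trust_gain("low_load"), 2))
print("trust gain, high load:", round(trust_gain("high_load"), 2))
# The conjecture predicts a larger trust gain under high cognitive load,
# i.e. exactly when ambient (low-attention) persuasion is most relevant.
```
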
1.3 Ethical Issues
From an ethical perspective an obvious question is whether such attempts to influence our behavior through ambient persuasive technologies are morally allowed at all, or under which conditions they might be permissible. If I am persuaded in a subconscious way, I have no way to avoid the persuasion attempt. One can thus argue that these attempts should be regarded as a subtle form of manipulation. Indeed it is questionable whether these forms of influence still fall under 'persuasion'. Many countries have therefore passed legislation that forbids the use of subliminal messages in the context of advertising (e.g., Shapiro, 1999).
A further interesting ethical question is whether the use of ambient persuasive technology would require a form of informed consent. It might be morally legitimate to use subliminal persuasion on oneself, while it might not be permitted to use it without restriction on people who did not give their consent. It is also worthwhile to analyze the notion of 'trust' in technology, a concept that closely links ethical and psychological research. We might be more willing to be 'persuaded' by somebody we trust. In a similar vein, we might need to trust the designers of persuasive technology and share some basic values with them in order to consider subjecting ourselves to a subconscious form of persuasion.
The envisioned project will probably profit from attempts to develop an ethical framework for persuasive technologies (Berdichevsky, 1999; Baker, 2001; Fogg, 2003, 211-239; Verbeek, 2009). It might also draw conclusions from philosophical analyses of the ways technology mediates our perceptions and actions (Verbeek, 2005). Work done on ethical issues of persuasion in fields previously not linked to technology, such as rhetoric (Christensen & Hasle, 2007), might also serve as a valuable source of insights.
2. Researchers involved and expected ‘synergy'
The research question inherently demands cooperation between an ethical/philosophical and a psychological perspective. Jaap Ham’s research investigates persuasive technology, human-technology interaction, and social justice. For example, he studied the persuasive effects of social feedback from a robot on energy conservation behavior (Ham, et al., 2008), and the unconscious mechanisms of ambient persuasive technology (Ham et al., 2009). His research is core to the PT Lab Group and the Human-Technology Interaction section.
Andreas Spahn’s current research lines include research about ethics of sustainability (Spahn & Verkerk, 2009) and ethical issues of hermeneutics and communication (Spahn, 2008). Philip Nickel’s research explores the connections between ethics of belief and the ethics of technology. He is the author of papers on voluntary control of belief (Nickel, 2009) and philosophical accounts of trust (Nickel, 2007). The research of Spahn and Nickel is embedded in the Program Artifacts, Agents and Normativity and in the research activities of the 3TU Centre of Ethics.
The research lines of these researchers converge precisely on the domain of the current research question. By combining our strengths, we argue that new, undiscovered questions arise: never before has ambient persuasive technology been investigated from both an ethical and a psychological perspective. We expect strong synergy effects for two reasons. First, the two perspectives are complementary: psychological research on PT, and especially on ambient persuasive technology, is strongly in need of an ethical analysis. Second, an ethical analysis of 'persuasion' needs thorough psychological methods and insights. Furthermore, we think that previous research on the notion of trust (Ham, Nickel) could prove to be a valuable resource.
Leonie Geerdinck will cooperate closely in developing and writing the research proposal. She will also be a candidate for one of the PhD student positions. In both activities she will bring in her knowledge as an HTI master's student and her expertise as an applied researcher of user-driven innovation, gained at, and in cooperation with, Philips Lighting. Thereby, we argue, we have a strong and involved candidate to do (part of) the grant writing and research, and additionally a close connection to industry.
3. University Setting and additional funding activities
Recently the Human-Technology Interaction and Philosophy sections joined forces to apply for the NWO program 'Socially Responsible Innovation' with a proposal entitled Persuasive Technology, Allocation of Control, and Social Values (Meijers, Midden, Steinbuch, Hofman, Spahn, Ham). This proposal is currently under evaluation in the final round. Its core idea is to investigate the psychological and ethical issues of persuasive technologies based on an empirical case of energy management and vehicle safety.
This submitted NWO research proposal, however, focuses on classical feedback systems, and thus omits the powerful and ubiquitous possibilities of ambient persuasive technology. The ECIS research suggested here investigates ambient persuasive technology and would thus complement the NWO research very well (should we be awarded that funding). Independently of the outcome of NWO's decision, we aim in either case at investigating possibilities to seek funding for this new proposal, since this field is of great joint interest.
4. Envisioned Activities
4.1. Writing Grant Proposal (Budget € 5000)
We will write a research proposal to fund one or two Ph.D. project(s). We consider the following sources for funding:
- NWO
- EOS Long Term (Subpart of “Energy Saving through Innovation”, Dutch Ministry of Economics)
- EU-FP7 program (ERC, Marie Curie Fellowship, part of an IP)
- HTAS (program for automotive innovations)
Writing the research proposal will be done by Ham, Spahn, Nickel, and Geerdinck. We consider hiring Leonie Geerdinck for this task, and/or a student-assistant to do literature research.
4.2 International ECIS workshop
To help us prepare a successful grant application, we will organize a workshop to discuss our ideas with leading scientists in the field. We aim at inviting:
- Prof. dr. Sven Ove Hansson, professor of philosophy, Royal Institute of Technology (KTH), Stockholm
- Dr. B.J. Fogg, director of the Persuasive Technology Lab, Stanford University, USA
- Prof. dr. Ap Dijksterhuis, professor of unconscious cognition, Radboud University Nijmegen
- Dr. Peter-Paul Verbeek, associate professor of philosophy of technology, University of Twente
- Prof. dr. Henk Aarts, professor of social cognition, Utrecht University
In addition, we will invite various colleagues from the TU/e who have contributed to research on persuasive technology (e.g., IJsselsteijn, De Kort, Midden). The workshop will consist of a presentation of the research proposal, extensive comments by experts, and discussion. The workshop will also contain presentations and discussion about grant application policy, for which we will invite participants from EG-Liaison and the TU/e ASC-Bureau (grant services). In addition, the workshop will discuss valorization of the research project, for which we will invite the TU/e Innovation Lab.
Costs consist of expenses for travel, accommodation (mainly for international guests), and lunch and dinner. For this we request a subvention of € 5000.
4.3 Program page at the ECIS website
We welcome the possibility to present our research on the ECIS website. Not only would this help promote our research ideas, it would also help us organize meetings.
4.4 Joint journal papers
Ham, Nickel and Spahn aim at writing at least one joint paper during the process of preparing a grant proposal. The preliminary title is: ‘…and lead us (not) into persuasion’ – ethics and psychology of ambient persuasive technology.
5. References
Baker, S., & Martinson, D. L. (2001). The TARES Test: Five Principles for Ethical Persuasion. Journal of Mass Media Ethics, 16(2-3), 148-175.
Barney, R. D. (ed.) (2001). Ethics and professional persuasion: A special double issue of the journal of mass media ethics: Lawrence Erlbaum.
Berdichevsky, D.  & Neuenschwander, E. (1999). Toward an ethics of persuasive technology.  Communications of the ACM, 42, 51-58.
Christensen, A.-K., & Hasle, P. (2007). Classical rhetoric and a limit of persuasion. In Y. de Kort, W. IJsselsteijn, C. Midden, B. Eggen, & B. J. Fogg (Eds.), Persuasive Technology: Second International Conference on Persuasive Technology, Palo Alto, CA, USA, April 26-27, 2007, Revised Selected Papers (pp. 307-310).
Fogg, B. J. (2003). Persuasive Technology: Using Computers to Change What We Think and Do. San Francisco: Morgan Kaufmann Publishers.
Ham, J., Midden, C., & Tak, S. (2008). The persuasive effects of positive and negative social feedback from an embodied agent on energy conservation behavior. Proceedings of Persuasive 2008, Oulu, Finland, June 4-6, 2008. Oulu, Finland: University of Oulu.
Ham, J., & Midden, C. (2009). A Robot That Says "Bad!": Using Negative and Positive Social Feedback From a Robotic Agent to Save Energy. Paper submitted for Human-Robot Interaction 2009.
Ham, J., Van den Bos, K., & Van Doorn, E. A. (in press). Lady Justice Thinks Unconsciously: Unconscious Thought can Lead to More Accurate Justice Judgments. Social Cognition.
Ham, J., & Van den Bos, K. (2008). Not fair for me! The influence of personal relevance on social justice inferences. Journal of Experimental Social Psychology, 44, 699-705.
Ham, J., Midden, C., & Beute, F. (2009). Can Ambient Persuasive Technology Persuade Unconsciously? Using Subliminal Feedback to Influence Energy Consumption Ratings of Household Appliances. Paper submitted for Persuasive 2009.
Midden, C., McCalley, T., Ham, J., & Zaalberg, R. (2008). Using persuasive technology to encourage sustainable behavior. Proceedings of Pervasive 2008, Sydney, May 19-22 2008. Sydney: Conference organization.
Midden, C., McCalley, T., Ham, J., & Zaalberg, R. (2008). Using persuasive technology to encourage sustainable behavior. Proceedings of the (British) Society for the Study of Artificial Intelligence and the Simulation of Behaviour, Aberdeen, April 1-4 2008. Aberdeen: Conference organization.
Midden, C., & Ham, J. (2009). Using Negative and Positive Social Feedback From a Robotic Agent to Save Energy. Paper submitted for Persuasive 2009.
Nickel, P. (2007). "Trust and Obligation-Ascription." Ethical Theory and Moral Practice 10: 309-319.
Nickel, P. (2009). Forthcoming. "Voluntary Belief on a Reasonable Basis." Philosophy and Phenomenological Research.
Roubroeks, M., Midden, C., & Ham, J. (2009). Does It Make a Difference Who Says It? Exploring the Role of a Social Agent for Psychological Reactance. Paper submitted to Persuasive 2009.
Shapiro, S. (1999). When an ad's influence is beyond our conscious control: Perceptual and conceptual fluency effects caused by incidental ad exposure. Journal of Consumer Research, 26, 16-36.
Spahn, A. (2008): Hermeneutik zwischen Traditionalismus und Rationalismus. Würzburg: Königshausen & Neumann.
Spahn, A., & Verkerk, M. (2009). Virtue Theory and Sustainability: Only a third of the truth? Paper to be submitted to Environmental Ethics (in preparation).
Verbeek, P. P. (2005). What things do. Pennsylvania State University Press.
Verbeek, P. P. (2009). Persuasive Technology and Moral Responsibility: Towards an Ethical Framework for Persuasive Technologies. Forthcoming in Behaviour and Information Technology, 30.
Vossen, S., Ham, J., & Midden, C. (2009). Social Influence of a Persuasive Agent: the Role of Agent Embodiment and Evaluative Feedback. Paper submitted for Persuasive 2009.
Wisneski, C., Ishii, H., Dahley, A., Gorbet, M.G., Brave, S., Ullmer, B., Yarin, P. (1998). Ambient Displays. In: Streitz, N.A., Konomi, S., Burkhardt, H.-J. (eds.) CoBuild 1998. LNCS, vol. 1370. Springer, Heidelberg.