Robots Offering Encouragement Drive People To Take Risks, Study Suggests

A photo of SoftBank Robotics' Pepper robot encouraging the participant to take risks. Credit: University of Southampton

Peer pressure can drive people to do wild things, things they might never do of their own volition. When surrounded by others, people – particularly younger individuals – can be driven into positive, negative, and even risky behaviors just by getting encouragement from onlookers.

So, if a crowd can egg you on, could a robot do the same thing? Scientists from the University of Southampton suggest it can.

According to new research published in Cyberpsychology, Behavior, and Social Networking, humans can be pushed into taking greater risks while gambling by the presence of a robot that encourages them to do so. The findings may have large implications for the regulation of future robotic systems in settings where gambling occurs.

“We know that peer pressure can lead to higher risk-taking behaviour. With the ever-increasing scale of interaction between humans and technology, both online and physically, it is crucial that we understand more about whether machines can have a similar impact,” said Dr Yaniv Hanoch, an associate professor in risk management at Southampton, in a statement.

To try and understand the effect encouraging robots could have on gambling humans, Hanoch and colleagues set up an experiment using the Balloon Analogue Risk Task (BART). BART is an online measurement of risk-taking in which the participant is asked to pump up a digital balloon, with each pump netting them some money. However, blow the balloon up too much and it will burst, taking all the accrued money with it. The participants can cash in their score at any point to secure their current balance and move on to the next balloon, so BART is a useful way to measure how many risks participants will take before the prospect of losing it all becomes too overwhelming.
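The core trade-off in BART, as described above, is that each pump adds money but also risks bursting the balloon and losing everything banked on that balloon. A minimal sketch of that mechanic, with purely hypothetical values for the payout per pump and the hidden burst threshold (the real task's parameters are not given in this article), might look like:

```python
import random

# Hypothetical parameters for illustration only; the actual BART
# payout and balloon capacity are not specified in the article.
PAY_PER_PUMP = 0.05   # money earned per pump
MAX_CAPACITY = 128    # maximum possible pumps before a guaranteed burst

def play_balloon(pumps_attempted, rng=random):
    """Return the money banked for one balloon (0 if it bursts first)."""
    # Each balloon has a hidden burst point drawn at random.
    burst_point = rng.randint(1, MAX_CAPACITY)
    if pumps_attempted >= burst_point:
        return 0.0  # balloon burst: all money accrued on it is lost
    # Cashed out before bursting: bank the accrued money.
    return pumps_attempted * PAY_PER_PUMP

# More pumps mean more potential money, but a higher chance of
# losing the lot -- which is why pump counts index risk-taking.
total_banked = sum(play_balloon(pumps_attempted=30) for _ in range(100))
```

Under this sketch, a participant's average number of pumps per balloon directly measures their appetite for risk, which is the quantity the study compares across conditions.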

Three groups, adding up to a total of 180 undergraduates, took the test – one with a friendly and encouraging robot that asked questions like "why did you stop pumping?", one with an ominously silent robot simply standing there, and one where the participant was completely alone.

After the scores were compared, the researchers found a significant increase in the amount of money gained – and therefore in the risks taken – by the group with the encouraging robot. There was a large difference in the number of pumps between the experimental and control groups, with an average of around 1,100 pumps performed with the robot present and just 900 without it.

Furthermore, after a balloon burst on them, participants would typically take fewer risks – but with the robot pushing them to pump more, those in the experimental group did not slow down following a burst.

"We saw participants in the control condition scale back their risk-taking behaviour following a balloon explosion, whereas those in the experimental condition continued to take as much risk as before. So, receiving direct encouragement from a risk-promoting robot seemed to override participants' direct experiences and instincts," said Dr Hanoch.

There was no difference between having a silent robot present and having no robot at all, suggesting it was likely the words of encouragement that made the difference.

The authors do acknowledge some limitations of the study, most notably that the sample consisted mainly of females. However, with previous studies suggesting males take more risks than females, it is possible the results would be even more pronounced in a balanced sample. It is also important to note that, whilst BART is one of the best ethical measures of risk-taking, its decisions carry no real consequences for the participant, and results might differ if genuine risk were involved.

These results have large implications for the development of future AI and robotic technologies, with careful consideration needed for the potentially harmful impact of encouragement from any source when it comes to gambling and other risky behaviors. However, the researchers also note that the effect could be harnessed for more positive campaigns.

"On the one hand, our results might raise alarms about the prospect of robots causing harm by increasing risky behavior. On the other hand, our data points to the possibility of using robots and AI in preventive programs, such as anti-smoking campaigns in schools, and with hard to reach populations, such as addicts," said Dr Hanoch.
