Kelly Fisher started using a robo-advisor a year and a half ago because she thought it would be more convenient and easier than investing through a human advisor. What she didn’t anticipate, though, was just how much more truthful she would be with an automaton rather than a living, breathing person sitting across the desk.
The San Francisco-based retail executive has about $8,000 invested in accounts with robo-advisors – sites that ask a series of questions and then match the investor with a fund suited to their risk tolerance and lifestyle. Fisher doesn’t have anything to hide by using these websites, but she says it’s human nature to judge, and that has made it uncomfortable for her to open up to a human financial expert. With a robo-advisor, Fisher can be more honest about where she stands financially.
“When someone starts asking me about my net worth, that’s when I get uncomfortable,” Fisher said. “With a robo-advisor there’s definitely no judgment. There’s no stigma over having… debt or deciding that you want to spend money on a vacation or about coming into a windfall.”
Fisher’s comfort with a computer isn’t unusual – there’s a growing body of research showing that people trust robots and automated online forms more than humans. That’s one big reason, beyond lower costs, why robo-advice – which is expected to have a staggering $2.2 trillion in assets under management by 2020, up from $50bn today – may become the preferred way for individuals to invest.
Gale Lucas, director of research at Dallas, Texas-based Organizational Wellness and Learning Systems, has been studying robot-human trust for years. She’s found that people really are more truthful when they disclose information to a computer. That goes for both general questions as well as more intimate details.
These effects on “honest responding… are especially strong when the information is illegal, unethical, or culturally stigmatised,” she wrote in a 2014 paper.
Finances would fall into that latter category, Lucas explained. Debt, for instance, has a negative connotation attached to it. People feel uncomfortable discussing it and may even hide how much they really owe from a human financial advisor.
“They’re very embarrassed and don’t want to admit how much credit card debt they have,” Lucas said. “It’s anxiety producing, so to have someone you can talk to where it’s safe to say that you’re worried that you’ll never get out from all of that debt is important.”
People are more open with automated tools because they believe computers don’t judge and that they’re more ethical, studies show.
A study released in March by Vancouver, Canada-based market research firm Intensions Consulting found that 26% of Canadian adults believed an unbiased computer program would be more trustworthy and ethical than their workplace leaders and managers. That number was even higher among younger adults – those aged 20 to 39 – at 31%.
The study also found that 26% of Canadians would rather be screened, hired and have their job performance assessed by an unbiased computer program. Nick Badminton, a futurist and a co-author of the study, said in a release that “people are losing faith in human management, and rightly so. Who would you trust, a human with personal biases and opinions or a rational and balanced (artificial intelligence)?”
Lucas’s research asks that same question, but in a mental health context. She’s examined whether or not people with post-traumatic stress disorder disclose more personal information to a robot – in her case a virtual bot that looks human – than an actual doctor.
The “virtual agent” that she helped create asks personal and probing questions, such as “Do you have any regrets?” and “Is there anything you wish you didn’t remember?” In her study, participants were told that they would either be talking to a computer or a human behind the scenes.
In nearly every case, the people who were told that they were talking to a computer revealed more details than the people who thought they were talking to a human, Lucas said. She then asked the participants if they felt judged or if they were afraid that the human or the computer was going to negatively evaluate them. Those who thought they were talking to a computer said no; the others said yes.
“That’s the crux of why we think this happens,” she says. “People who talk to a virtual agent know their data is anonymous and safe and that no one is going to judge them.”
Millennials driving the market
At this point in the robo-advisor cycle the appeal isn’t the anonymity, said Kendra Thompson, a Toronto, Canada-based managing director at Accenture Wealth & Capital Markets. Companies don’t yet offer sophisticated advice through these sites. Convenience and cost – some charge as little as 0.15% annually on assets invested, while advisor fees range between 1% and 2% of assets – are the attraction now.
However, that is likely to change, she said. In Asia, the demand for digital investment tools is growing exponentially. Elsewhere, the demand for more unbiased automated long-term advice is expanding, but it’s mostly coming from younger savers.
A 2014 survey from Fidelity Investments found that one in four people born between 1980 and 1989 trust “no one” for money-related information, while a Bank of America report said that affluent millennials are more likely to place a “great deal” of faith in technology compared to other generations “and this is no different in financial advisory services”.
People who have a good relationship with an advisor will open up, Thompson said, but it’s still hard for people to not feel judged.
“There are people who might say ‘I don’t get where the recommendations are coming from’ or ‘I don’t know why the advisor is asking me these questions’,” she said. “That’s the powerful thing about these tools – you can play around with them without feeling like you’re exposing yourself.”
A robot is still a robot
While automated devices may seem more trustworthy than humans, it’s important to keep in mind that robots are still machines and they can be manipulated by the end user.
Alan Wagner, a social robots researcher at Georgia Tech Research Institute in Atlanta, Georgia, ran a study in which he simulated a fire in a building and asked people to follow a robot to safety. The robot, though, took them into the wrong rooms, to a back door instead of the correct exit, and (by design) it broke down in the middle of the evacuation.
Yet, through all of that, people still followed the robot around the building, hoping it would lead them outside. The study demonstrated to Wagner that people have an “automation bias” – a tendency to believe an automated system even when they shouldn’t.
“People think the system knows better than they do,” Wagner said. Why? Because robots have been presented as all-knowing. Previous interactions with automated systems have also worked properly, so we assume that every system will do the right thing.
As well, since robots don’t react or judge what someone says, our own biases get projected onto these automated beings and we assume they’re rooting for us no matter what, he said.
However, Wagner says it’s important to remember that someone – a mutual fund company, an advisor – is controlling the bot in the background and they want to achieve certain outcomes. That doesn’t mean people shouldn’t be truthful with a robot, but these systems are fallible.
“You have to be able to say, ‘Right now I shouldn’t trust you,’ but that’s extremely difficult,” Wagner said.
To comment on this story or anything else you have seen on BBC Capital, please head over to our Facebook page or message us on Twitter.