By Spela Majcen Marusic, Communications Manager
“Almost every generation feels that younger generations are less honest”, explains Prof. Jeff Hancock, Founding Director of Stanford Social Media Lab, during the latest lecture of the Aleph Executive Speaker Series. And while many generations throughout history have felt this way, could they all be wrong?
Changing the way we trust
People rely heavily on trust to mitigate risk and uncertainty. We move through an imperfect world, often unsure of exactly what the consequences of our actions will be and how they will affect our surroundings. We have limited cues for predicting how others around us will behave, and no reliable way of telling on the spot whether somebody is lying.
So we have learned to trust. First we trusted the people we knew; then, as communities grew bigger, we began to trust networks we were secondarily connected to. Most recently, we have put our trust in institutions, such as governments, courts, financial corporations and banks.
While trust levels vary immensely by culture, all forms of trust rest on the same premise: a promise, and the question of whether it will be kept.
We could argue that the latest major shift in the way we trust was brought on by the rapid development of technology. By making it simple to share information, digitize data and manage it with the support of artificial intelligence (AI), technology has undoubtedly contributed to a rise in trust in our networks and a decline in trust in our institutions.
So, how does all this manifest?
Digital literacy to combat fake news
One of the most evident, as well as dangerous phenomena is fake news. Limited to selected tabloids and gossip papers in the past, fake news nowadays can spread like wildfire on social media.
Today, fake news is used for various purposes, including political ones. However, clever, targeted interventions could help bring its spread under control in the future.
In fact, researchers found that during the 2016 US presidential election campaigns, the large majority of fake news was shared by people in the 60–70 age bracket. This was partly because the content was targeted at them, but in large part because their digital literacy levels generally rank much lower than those of younger generations.
“Our latest research showed that with just a little bit of digital literacy, we were able to improve how older people detected fake news.”
Prof. Jeff Hancock
However, a much larger societal movement is required to curb the spread of fake news online. A return to normalcy will only occur when lying online (as well as offline) becomes so socially unacceptable that political leaders who use fake news to fight what they call “political wars” receive the ultimate punishment from the people: being voted out of office.
Embedded trust is a winning formula for tech companies
On social media, the “known” and “stranger” networks begin to overlap, and it can be difficult to quickly assess how to deal with strangers online. For many older adults in particular, it is hard to distinguish between known and unknown people. While we know very well how to react to abusive behaviour from a stranger on the street, we often feel far more confused when encountering the same behaviour on the internet.
“If StarBoy36249 says something angry I should be able to very easily ignore it in the same way that we can turn our bodies and ignore somebody that we have nothing to do with offline.”
Prof. Jeff Hancock
Because trusting people online is less intuitive than the face-to-face judgement we have evolved to make throughout history, we look to authorities such as academics, journalists or financial industry experts for reassurance.
In fact, a modern technology-based company can thrive by building on network trust, driven by social media, big data and AI, while embedding a strong element of institutional trust into its services. Companies like Airbnb, for example, build on their network through reviews and comments, while ensuring that legal support is available, cooperation with government institutions is maintained, insurance is in place, and transactions are completed securely.
Psychology beats technology
It takes time to build trust, but once it’s violated, trust erodes extremely quickly. In fact, when we are assessing whether or not to trust a person, what we’re actually thinking about is whether they are ready to deceive us.
While many of us might intuitively doubt technology and characterize it as the ultimate tool of deception, research has shown that when it comes to known networks, technology can actually reduce deception. People tend to lie most when talking on the phone, followed by face-to-face interactions, and least of all in e-mail communications.
Studies have also demonstrated that most people do not lie on social media, and that a person’s personality can be assessed from their Facebook profile as accurately as it could be in person. In fact, people do not lie about their attributes on online dating platforms any more than they would in an offline setting.
“It’s not the technology that is driving deception, especially when it comes to known networks. Instead it’s our psychological situation that would lead us to need deception to accomplish some psychological goal.”
Prof. Jeff Hancock
This is just as well, because people are typically very bad at detecting lies. Timothy Levine’s research showed that we are only 54% accurate at identifying deception, mainly because we have a truth bias: our default state is to believe that what we are being told is true, and there are no reliable cues for deception. So instead of wondering how to tell if somebody is lying, we need to understand when to stop trusting a person. When we sense a deceptive motive, witness a dishonest display, notice a lack of coherence, or get a warning from a third party, that is when we should switch to the “suspicion state”.
Authenticity trumps channel
While there are no reliable cues for deception, we do know what signals trustworthiness. And although we struggle with roughly 50/50 odds of catching out a liar, we can quickly and accurately identify a person we should trust.
Humans are driven to evaluate others based on competence (can they do what they say they can?), honesty (do they mean what they say?) and reliability (will they actually do it?).
As an extension, companies can leverage the same attributes to drive corporate trust and success. However, organizations should not forget the importance of authenticity. Understanding core values and consistently communicating them across multiple channels is the key to laying a strong foundation of trust in any organization.
“Trust and safety on platforms is not a destination but a journey.”
Dave Byrne, Global Head of Brand Safety and Industry relations, TikTok
Authenticity also trumps technology when it comes to the demands of different communication channels. At the end of the day, trying to present your company as a serious business player on LinkedIn and a fun boho influencer on Instagram will erode trust and create an identity problem for your brand.
No high-tech crystal ball
No amount of technology or foresight can tell us exactly how technology will continue to influence trust in the near and far future. In one corner we have the rapid development of AI-mediated communication through smart replies, automatic summaries and chatbots; in the other, the challenge of building a solid corporate culture in the hybrid workplace of the future. We are entering uncharted waters in how and when people will continue to trust each other.
Just like Diogenes in Ancient Greece, we might grab a virtual lamp and set off in search of that one honest person on the internet. But since we are not all cynics, we may as well recognise that the majority of people out there are telling the truth most of the time, with or without the technology filter.