Coding matters: A question of trust

[Image: an older man and a menacing robot staring at each other.]

AI is in the news all the time. I don’t have to tell you that. It’s also very controversial. Even people with a deep knowledge of the topic have opposing views about what it means for our future.

Companies are investing huge sums in AI, because they believe it will be profitable. But right now, you and your customers are not quite as keen.

We don’t want AI (yet)

Gartner found that two-thirds of customers don’t want companies to use AI in customer service. The top two concerns are that it will be too difficult to reach a real human agent, and AI will displace jobs. The third concern is that AI will be wrong.

That’s not a surprise. AI hallucinations make for great news stories. Like when Google’s AI recommended putting glue on pizza to make the cheese stick. (This was traced back to a satirical comment on Reddit.)

We don’t trust AI (yet)

We make thousands of decisions a day. Some we agonise over. Others we make automatically, or with very little thought. That is possible, and necessary, because we subconsciously store information about what is “good” or “bad.”

Information about AI already influences our decision-making.

Recently I was googling for some information on uranium glass. (Uranium glass actually contains uranium. It glows under UV light.) The first search result that came up was from Google’s AI. I ignored it and continued my search. I made the decision to skip the AI results automatically. Why?

When I thought about my decision, I realised why I had made it. I believed I wouldn’t know whether the AI results were correct. Which is true. But that is also true for the information I come across on any website. The difference is that I think I can judge information better when I know where it comes from. With AI results, I have no idea of the source.

Bear in mind that I don’t have strong views or fears about AI. I’ve read stories of AI hallucinations, but also amazing stories of AI successes. I’ve played with ChatGPT. And in spite of their problems, I’d prefer self-driving cars to South African taxi drivers.

It turns out my behaviour is typical. Another study found that using the term “AI” to describe a product lowers a customer’s intention to buy it. This finding was consistent across age groups and across products – from vacuum cleaners to health services. For example, one experiment gave participants identical descriptions of smart TVs. The only difference was that some descriptions contained the term “artificial intelligence”. Guess which TV people would buy.

It seems the term “AI” lowers our level of emotional trust.

The negativity effect

I’m sure there are lots of reasons for this. But one of them is the negativity effect.

The negativity effect is your brain’s tendency to pay more attention to negative events than to positive ones. Negative things have a greater impact on your behaviour and thinking than positive or neutral things of equal intensity.

I liked this description by neuropsychologist Dr. Rick Hanson:

“The mind is like Velcro for negative experiences and Teflon for positive ones.”

If you are interested, you can read the article on Wikipedia about the negativity bias.

Nobody knows what impact AI will have on our lives 5 or 10 years from now. But the negativity effect has a negative impact on your life right now, with or without AI. So do yourself a favour, and read about a few ways to counter this bias in your thinking.

What do you think? Please share your comments on the blog post.
