The scene is set for a tragicomedy. Bring out the popcorn.

AI has slipped seamlessly into our work lives. Can't spell? Shaky grammar? No problem. Your AI co-pilot smooths those rough edges as it composes your emails, summarises boring meetings, and distils verbose reports into neat bullet points, all while making your restaurant booking. But look more closely and you'll see that this emerging AI dependency may be a double-edged sword.

In April this year, ChatGPT's weekly active users hit the 800 million mark, having passed the 100 million milestone in August 2023. It currently processes one billion queries per day. That's a lot of co-piloting. But what's more interesting is what people are using AI platforms such as ChatGPT for.

In a recent survey, 80% of respondents said they now use AI in both their professional and personal lives. Hallucinations be damned. If there's a shortcut, humans will take it. We are, after all, fully subscribed to the convenience economy.

The personal AI prompts are interesting and range from "write me a loving poem to my wife which she can read after I die", from someone with a terminal disease, to "write me a eulogy for my alcoholic ex-husband, but show him in a positive light for the sake of our 10-year-old daughter". Prompts like these illustrate the positive side of AI, and a pattern is emerging.

People are using AI in three distinct categories:

  1. For better communication skills – to sound funnier, smarter, more empathetic, and so on.
  2. For therapy and emotional support.
  3. For romantic purposes.

Even though it’s in a nascent stage, it’s the final category – for romantic purposes – that’s starting to have an impact on in-person interactions, and that’s where the tragicomedy comes into play.

In modern dating – and I’m referring to dating human to human, not human to virtual avatar – flirting is conducted via text messages. In an era when a phone call is considered weird and possibly “moving too fast”, the nuances of your text messages are crucial. (Sexting is probably considered pre-marital sex today – but that’s another trend column entirely.) 

Cue the AI dating wingman or wingwoman.

If 50% of people are using AI chatbots to improve their communication skills, then it stands to reason that many are also using chatbots to make themselves sound funnier, smarter, and more desirable. The problem arises when there is an eventual in-person date and the chatbot user has to function without their wingperson.

It's becoming more common for first in-person dates to prove huge disappointments. The person who appeared to have so much "rizz" (Gen Z slang for charisma) online turns out to be a dull, tongue-tied imposter. The socio-cultural shifts we're headed for are going to be fascinating.

Call my agent 

Agentic AI, the more advanced sibling of Generative AI, is unleashing AI agents into unexpected areas of business, such as retail. Pretty soon online shopping will feel "so 2024". Shoppers looking for a fashion fix will instead rely on their personal AI stylists, armed with all their data – and therefore their preferences. The AI agent will browse multiple sites following a detailed prompt (for example: "I need a weather-appropriate outfit for a wedding in December, in Franschhoek, that matches the 'spring in Paris' theme").

The AI agent will collate a selection and once you’ve settled on something, will find a pay gate (Mastercard has already launched “agent pay”) and facilitate the sale and delivery.

What does this mean for online retailers who are not speaking to the end customer but dealing with an AI intermediary? Concepts such as SEO, which have governed our online businesses for so long, are fast becoming obsolete. GEO (generative engine optimisation) is the new buzzword. Retail is being disrupted, again.

Bot vs bot – recruitment in an AI era

If you want a glimpse of how AI dependence plays out, then look at what is already happening in the HR sector and the recruitment processes.

Not only are universities grappling with students using AI for their assignments; graduates are also turning to AI to help them with their CVs and job applications. Companies, in turn, are using AI recruitment tools to filter and shortlist the large numbers of CVs that come their way. So, essentially, in this initial stage of the job application process, bots are talking to bots.

Once on the shortlist, a candidate faces a new challenge when they secure a first-round interview. These interviews are increasingly being conducted by AI avatars and feel so realistic that it's only deep into the interview, or once it's over, that the candidate realises something is not quite as it seems.

Time spoke to Canadian job seeker Wafa Shafiq, who applied for a job in marketing through a recruitment agency. She was delighted to receive a request for a video interview with “Alex from Apriora”. As any eager candidate would do, Shafiq did a quick search of her interviewer to prepare for the interview and discovered that Alex wasn’t a person but an AI recruiter. 

After the interview, which consisted of only six or seven questions, Shafiq said that "it wasn't as bad as I thought it would be" and that the overall experience was "cold, but efficient".

According to a recent report by Resumé Now, an American CV-building platform, 96% of the 900 HR practitioners surveyed said they used AI to help sort, schedule, and screen applicants to reduce hiring costs and time.

The dark side – AI voices inside your head

Using an AI wingman or wingwoman to make you sound more eloquent, whether for business or romance, leans towards the convenience economy. But the second category of personal-use AI – therapy and emotional support – is one to watch with more concern.

There have already been several suicides attributed to AI chatbots. People are forming strong emotional bonds with their AI companions, which – after a long "relationship" – have convinced them to end their lives. In August, the parents of a teenage boy who died by suicide sued OpenAI, alleging that its chatbot helped their son "explore suicide methods".

It is the first time the AI company has been directly accused of wrongful death. In other, non-fatal, cases, people have lost jobs, ended relationships, committed criminal acts, or suffered nervous breakdowns, all attributed to AI chatbots.

What's emerging is the term "AI psychosis", a phenomenon so new that it is not yet well understood. What is clear is that AI chatbots are designed to tell you what you want to hear: they mirror a user's language and validate their assumptions or neuroses. So if a person is emotionally vulnerable and has a predilection for distorted thinking or conspiracy theories, the chatbot will simply amplify and affirm those thoughts.

So be wary of what your AI chatbot whispers in your ear.

KEY TAKEAWAYS

In the past 24 months I have read roughly equal numbers of articles for and against AI. In among the doomsday future-of-work narratives are reports that AI's LLM feeding frenzy has peaked, alongside accounts of how it is quietly revolutionising sectors such as healthcare and agriculture.

Either way, just as smartphones changed our lives, so too will AI. The shift from work co-pilot to dating wingperson signals a tipping point in its potential to slip effortlessly into our day-to-day lives. How the story unfolds is yet to be determined, but the ripple effect of AI dependency will alter everything from retail to the socio-cultural status quo.

Choose your wingman or wingwoman carefully!
