In the two years since the Cambridge Analytica exposé, data privacy has become a growing concern. Last year the dynamic shifted from an uneasy relationship between the tech giants – the FANGs – and their customers to a more ominous convergence of tech companies and governments. Does this new form of data ownership – dubbed 'authoritarian tech' – place society on a slippery slope?

Back in 1984, the American singer Rockwell released his hit song Somebody's Watching Me, which carried the chorus line, “I always feel like somebody's watching me. And I have no privacy.” Little did he know just how true those words would ring three decades later. His 1984 world was a pre-Internet world, so when he sang, “Well, can the people on TV see me. Or am I just paranoid?” it was grounded in a spy-vs-spy, Cold War era – that is, surveillance between government agencies.

Today, digitisation has created an effective and seamless way of collecting almost anyone’s personal data. It’s now not only easy to track our likes, dislikes and preferences but also where and how those preferences play out in our lives. From smartphone usage, Internet searches, social media, online shopping and app updates, we give away – every minute of the day – digital gold that can be used against us. I say “against us” because data collection has very quickly moved from a benign form of creating a customer profile for retailers and advertisers, to a subtle, but sophisticated, method of behaviour modification.

Jaron Lanier – an American computer scientist and writer on the philosophy of technology – remarked in a TED Talk that “I can’t call these things social networks anymore – I call them behaviour modification empires.” A chilling statement, made more chilling still by the realisation of what we are really up against. We now know that our digital addiction has been deliberately engineered, specifically to feed a very clever business model.

Who/what are the FANGs?

An acronym that joins together the tech giants of our digital world – Facebook, Amazon, Netflix and Google – most of whom rely on “the Silicon Valley business model” of collecting personal data, access to which is then sold to advertisers. We, the users of these platforms, are not the customers but the product being sold.

Boundless Mind is a Silicon Valley start-up made up of trained neuroscientists looking for ways to stem digital addiction. They explained to Time magazine the unique business model driven by “persuasive technology”: a potent combination of tech and neuroscience that deliberately encourages certain human behaviours (like mindless scrolling) while discouraging others (like considered, nuanced thought).

Ramsey Brown, a co-founder of Boundless Mind, elaborates on how the business model flows from persuasive technology: “The longer we are glued to an app – known as eyeball time – the more money its creators make by selling our attention and access to our personal data to advertisers and others.”

Now that we have your attention…

Mining data from a captive audience is the gift that keeps on giving, which is why this business model is so successful. Creating customer profiles from that accumulated data was the first step – framed, ostensibly, as a harmless process that lets you receive more targeted advertising rather than being bothered by ads with no relevance to your demographic. All well and good, but then the goalposts shifted. Enter nudge theory. What if companies used the same data to encourage better behaviour – a healthier lifestyle, for example, as with Discovery’s Vitality points? The more preventative behaviours you adopt – exercising more, buying healthier food at the supermarket – the less Discovery has to spend on fixing the problem, so it encourages you by rewarding (or nudging) you.

Nudge theory

Nudge theory is a combination of behavioural science and economics which uses positive reinforcement and indirect suggestions as a means to influence, and ultimately change, the behaviour and decision-making process of groups or individuals. 

Nudge theory maintains that choice-making can, and should be, based on how people actually think and decide (instinctively and perhaps irrationally), rather than how authorities traditionally believe people think and decide (logically and rationally).

Nudge theory looks like a great idea when you view it only through the lens of rewards. But what happens when the same behaviour modification technology is used to do the opposite and punish bad behaviour? For a glimpse into that future, we need only look to China – a leader in the art of the nudge, for reward but, increasingly, for punishment.

In terms of technology, facial recognition is China’s data weapon of choice. The grand plan is to capture all its citizens on a facial recognition system by 2020, which is why the technology is becoming more commonplace: from being scanned at an airport for check-in to ordering chicken wings at a “smart KFC restaurant” in Beijing, citizens are “willingly” forgoing their privacy for the sake of convenience.

With Augmented Reality (AR) technology, facial recognition makes tracking known felons faster and more effective than using fingerprints. Police officers have already been equipped with AR eyewear, and should a suspect’s face appear on the system, they are immediately alerted via a message on the eyewear. It sounds like the movie Minority Report come to life – and it is. Even banks are using “micro-expression” facial recognition to detect early signs of fraud in a customer’s expressions.

This facial recognition programme will dovetail perfectly with China’s already advanced social credit system, Zhima Credit (aka Sesame Credit), which appears as an app within the extensive Alipay mobile payments ecosystem – a system designed to make your life easier and more seamless. But because it is designed to engulf every aspect of your life, adding a social credit system – one the government has access to – requires a strategic alliance between technology companies and the state: the growing relationship between technology, business and government.

Social credit systems – the slippery slope

China’s Communist Party maintains that the social credit system is an effort to nudge people toward good social behaviours, ranging from energy conservation to charity work. Do good – pay your traffic fines, say – and your Zhima points go up; fail to pay, and points are deducted. Similar automotive demerit systems are already in place in many countries around the world, and will soon be implemented in South Africa. We should take note.

Journalist Rene Chan writes that the government’s massive data bank of all its citizens includes everything from bank account numbers to court records to Internet search histories. With 1.4 billion citizens to track, it is an ambitious project. But China has never shied away from ambitious projects.

The government has already stated that the goal of the programme is to “allow the trustworthy to roam everywhere under heaven, while making it hard for the discredited to take a single step.” It’s a utopian and dystopian dream rolled into one, and the first signs of how the system will be used, specifically to punish, are surfacing.

Travel is one of the main areas where social credit scores can effectively change people’s lives. Nine million people with low scores have already been blocked from buying tickets for domestic flights, while three million people have been barred from purchasing business class train tickets. Citizens who refused military service have also been banned from booking holidays and from certain hotels.

Jobs and access to education are other leverage points for social credits. A student was denied entry into a Chinese university because of his father’s bad social credit score, while “trust-breaking” individuals can also be banned from holding management jobs in state-owned firms and big banks.

The Internet in China is heavily monitored, so too many social credit demerits could leave a household with prohibitively slow Internet speeds. You could even lose your dog if you walk it without a leash or if it causes public disturbances.

Naming and shaming is another tactic, and one being used more frequently. Pictures of jaywalkers are now posted on giant electronic billboards at the intersection of their indiscretion and, to complete the service, the offender is sent a text notification via WeChat (which, of course, the government also has access to).

If this dystopian glimpse is a cause for concern, then we should support people like Alex Gladstein (chief strategy officer of the Human Rights Foundation), who is leading the charge against these kinds of surveillance systems and advocating the use of decentralised networks, like blockchain, to strengthen civil liberties by reclaiming ownership of our personal data.

If we don’t, we’ll just become part of the matrix, and that’s not a movie I want to live in.

Into the future 

Tech that can decode your brain activity, reveal what you’re thinking and pre-empt your feelings is on the horizon, says legal scholar and ethicist Nita Farahany. What will it mean for our already violated sense of privacy if we reach a point where (as in Minority Report) people are arrested for merely considering a crime, and our brain data becomes a commodity alongside our digital data?

