
How Trump turned up the heat on fake news, algorithms and privacy

Cape Town - A new book by Meltwater CEO Jørn Lyseggen, titled Outside Insight: Navigating a World Drowning in Data, illustrates why business leaders need to be looking at data outside their own firewalls to discover opportunities and threats in real time, and make better, more forward-looking decisions.

“Every day, competitors are leaving behind online breadcrumbs filled with valuable external data - from hiring a new employee and filing a new patent to launching a new product, online ad spend and social media activity,” explains Lyseggen.

“Leveraging insights gleaned from this outside information will allow companies to look ahead and make more informed decisions, to the benefit of boards, executives, investors, marketers, product developers and more.”

In Outside Insight, Lyseggen shows that by moving from a focus on lagging, internal data, toward an analysis that encompasses industry-wide, external data, businesses will be able to paint a more complete picture of their brand’s opportunities and threats, and uncover forward-looking insights, in real time.

“Mining external data related to your brand, your competitors and your industry for insights that can help you predict what’s coming and make more informed decisions will be an essential practice for any company looking to remain ahead of the competition in the future. However, not all data is created equal. In a world where we’re producing 2.5 quintillion bytes of data every day, there are some concerns we all need to be aware of.”   

WATCH: Interview with Meltwater CEO Jørn Lyseggen and Fin24's Matthew le Cordeur


The below is an extract from a chapter in Outside Insight titled, “The potential concerns of Outside Insight”:

Three areas of potential concern raised by the 2016 presidential election

The 2016 US presidential election brought to the surface three important areas of concern that are also broadly relevant for Outside Insight.

The first one is that of privacy. As we all leave behind us a constant trail of ‘likes’, tweets, check-ins and photos, how can we protect this data from sophisticated algorithms that psychometrically profile us and take advantage of us?

The second is the danger of the algorithms themselves. Can algorithms become too smart? Is there an ethical line that algorithms can cross?

The third area is that of fake news. During the 2016 presidential election a whirlwind of fake news was created. Examples of fake news that were widely shared online are reports of Hillary Clinton running a child sex ring out of a pizza shop, of Democrats wanting to impose Islamic law in Florida and of Trump supporters in a Manhattan rally chanting, ‘We hate Muslims, we hate blacks, we want our great country back.’ Interestingly, such fake news galvanized the beliefs of existing voter bases and eroded the credibility of traditional news sources.

1. How do we protect people’s privacy?

Many argue that we simply have to forget about privacy in this day and age. The growth of social media has opened the door to an era of radical transparency, and figures such as Google’s Eric Schmidt argue that we just have to accept that privacy is a thing of the past.

Many people don’t worry much about privacy because they are oblivious to the amount of information they’re sharing about themselves. For instance, if you’re out having dinner at a restaurant, you may find yourself tagged in someone else’s status update. Or someone may take a photo of you without your knowledge. Status updates and photos are both frequently geotagged, revealing information about your location.

Social media are littered with information about you, such as where you eat, who you socialize with, where you shop, what products you’re purchasing and a host of other details of your life. Even if you are not very active on social media yourself, Facebook, Twitter, Instagram, Pinterest and Snapchat will know a lot about you because your friends have tagged you in their social posts.

For many people this isn’t something to lose sleep over; they argue that they have nothing to hide. But by analysing all the online digital breadcrumbs we leave behind, we reveal more information about ourselves than we may realize. By analysing a person’s Facebook ‘likes’ or Twitter timeline it’s possible to determine with a high degree of accuracy a person’s salary range, education level, sexuality and political leanings. Over time, as an increasing amount of data is collected on social platforms and smart algorithms become smarter, profiling will become more accurate and therefore more invasive.
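The kind of profiling Lyseggen describes needs surprisingly little machinery in principle: a user's 'likes' become a feature vector and a standard classifier is fitted to labelled examples. The sketch below is purely illustrative - the data is synthetic, the "attribute" and its correlation with pages are invented for the demo, and it uses an off-the-shelf scikit-learn logistic regression rather than any platform's or campaign's actual method.

```python
# Illustrative sketch only: predicting a hidden attribute from 'like' data.
# Synthetic data and an invented attribute; real profiling systems are far
# more sophisticated than this minimal example.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(42)

n_users, n_pages = 2_000, 300                         # users x pages they could 'like'
likes = rng.integers(0, 2, size=(n_users, n_pages))   # 1 = user liked the page

# Pretend a hidden attribute (e.g. political leaning) correlates with a
# handful of pages; in reality the signal is spread across many weak cues.
signal_pages = rng.choice(n_pages, size=20, replace=False)
logits = likes[:, signal_pages].sum(axis=1) - 10
attribute = (logits + rng.normal(scale=2, size=n_users) > 0).astype(int)

X_train, X_test, y_train, y_test = train_test_split(
    likes, attribute, test_size=0.25, random_state=0
)

model = LogisticRegression(max_iter=1_000)
model.fit(X_train, y_train)
print(f"Accuracy on held-out users: {model.score(X_test, y_test):.2f}")
```

Even this toy model recovers the hidden attribute well above chance from nothing but binary 'like' signals, which is the point of Lyseggen's warning.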

During a job interview a potential employer in the US is not allowed to ask questions relating to a candidate’s age, religious beliefs, sexual preference or political affiliation. These laws exist to prevent people from being discriminated against. But employers can now glean most of this information from social media anyway.

The 2016 presidential election in the US brought to the surface the importance of privacy. As analytics grow in sophistication, privacy is clearly going to become an increasingly important issue.

2. When are algorithms too clever for their own good?

When it comes to algorithms, we are constantly pushing for greater sophistication and accuracy. On the surface it does seem that the better algorithms we get, the better off we are. An example of this is our analysis of customers’ social media chatter. The more accurately the algorithms understand the true sentiment of our clients, the better. But is this always the case, or are there situations where algorithms raise important ethical questions?
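To make the sentiment example concrete: in its simplest form, sentiment analysis can be as crude as counting positive and negative words in a post. The snippet below is a minimal, self-contained sketch of that idea only; the tiny lexicon and example posts are invented, and commercial systems such as Meltwater's use far richer models than this.

```python
# Minimal sketch of lexicon-based sentiment scoring for social media chatter.
# Illustration of the general idea only, not any vendor's algorithm; the
# lexicon and example posts are invented for the demo.
POSITIVE = {"love", "great", "excellent", "happy", "recommend"}
NEGATIVE = {"hate", "terrible", "awful", "broken", "refund"}

def sentiment_score(post: str) -> float:
    """Return a score in [-1, 1]: positive words add, negative words subtract."""
    words = [w.strip(".,!?").lower() for w in post.split()]
    hits = [1 for w in words if w in POSITIVE] + [-1 for w in words if w in NEGATIVE]
    return sum(hits) / len(hits) if hits else 0.0

posts = [
    "Love the new release, great update!",
    "Terrible support, still waiting for my refund.",
    "Arrived on Tuesday.",
]
for p in posts:
    print(f"{sentiment_score(p):+.2f}  {p}")
```

The gap between this word-counting toy and a model that genuinely understands a customer's intent is exactly where the accuracy - and the ethical stakes - of modern algorithms lie.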

The US retailer Target’s data science program made headline news when it was reported by Forbes magazine in 2012 that Target had sent coupons for baby clothes to a high-school girl based on her purchase history, correctly predicting that she was pregnant before she had told her parents. Some doubts about the truth of this story have since been raised, but the story still illustrates that algorithms can potentially cross an ethical line.

When it comes to deducing personal or intimate information about a person, algorithms are entering an ethically sensitive area. Skin colour, sexual orientation, political leaning, education, salary level, intelligence and religious affiliation are all examples of information that people don’t usually share directly but which algorithms can deduce from a lot of data points that in themselves may seem innocent. This can create many difficult ethical situations. In many countries, such as the US, it is illegal to discriminate against a job applicant on grounds of age, religion or sexual orientation. In some countries homosexuality is against the law. The existence of algorithms that can infer sensitive information about people can in these situations be used for discrimination or, in the worst case, prosecution.

Perhaps one of the most ethically sensitive areas for algorithms is when they are used to profile people and these profiles are used to actively develop strategies to manipulate their behaviours. If algorithms are so sophisticated that they understand which buttons to push in order to provoke a desired reaction, they are becoming dangerous psychological weapons. Many believe that Donald Trump was able to reduce the black vote on the eve of the election when his campaign targeted black voters through social media with videos of Hillary Clinton talking about ‘super-predators’. Clinton was accused of using the term to characterize young African-Americans. Black voters would generally be expected to vote more for Clinton than Trump, so the more black voters stayed at home rather than going out to vote, the better it was for Trump.

Manipulating people to vote a particular way sounds bad, but if we think about it, we are surrounded by messages trying to convince us one way or another all the time. We are constantly bombarded with advertisements and messaging that are carefully tailored to us. Some want us to buy a particular type of jeans or drink a certain soft drink. Others want us to change our job, support a good cause or start a new workout regime. Where do we draw the line between advertising and manipulation? The only thing separating the two is the strength of the algorithm, isn’t it?

3. Fake news!

The 2016 presidential election in the US saw a flood of fake news stories, often produced by propaganda websites and subsequently spread via social media.

We have always had news sites with a certain political leaning which to a greater or lesser degree coloured their coverage, but what we saw in the 2016 election was a flurry of entirely fabricated news stories created for the purpose of misinformation and to create confusion.

In the same way as fake news was created in order to fabricate an alternative reality to the one described in the traditional news sources, so Outside Insight can also expect to see its fair share of fake breadcrumbs produced by companies that want to confuse and outmanoeuvre their competitors. As Outside Insight becomes increasingly widespread, such fake breadcrumbs will become more and more commonplace and will be used by companies to hide their real intentions. This will induce an arms race between those who are producing fake breadcrumbs and those who can identify them. This arms race will be very similar to what we today see between those who create viruses and those who create anti-virus software.

It is the nature of all new technologies to solve problems that could not be solved before and at the same time inadvertently create new problems for which we have to find good solutions. Outside Insight is no exception in this respect.

The three problems that are outlined above – how we protect people’s privacy, how we make sure our algorithms are ethical, or are used ethically, and how we address the natural development of fake breadcrumbs – are all important areas of concern.

I don’t have immediate solutions. All I want to do is to raise awareness of the issues. As we scramble to implement Outside Insight solutions, I also think it is important that we are mindful of the ethical issues that will come up so that we can find ways to tackle them. Only then can we fully realise all the benefits Outside Insight has to offer.

* Visit Penguin for options on purchasing the book.
