Co-Pierre Georg, University of Cape Town; Christoph Aymanns, University of St. Gallen, and Jakob Foerster, University of Oxford

The term “fake news” has become ubiquitous over the past two years. The Cambridge English dictionary defines it as “false stories that appear to be news, spread on the internet or using other media, usually created to influence political views or as a joke”.

As part of a global push to curb the spread of deliberate misinformation, researchers are trying to understand what drives people to share fake news and how its endorsement can propagate through a social network.

But humans are complex social animals, and technology alone misses the richness of human learning and interaction.

That’s why we decided to take a different approach in our research. We used the latest techniques from artificial intelligence to study how support for – or opposition to – a piece of fake news can spread within a social network. We believe our model is more realistic than previous approaches because the individuals in it learn endogenously from their interactions with the environment, rather than simply following prescribed rules. This novel approach allowed us to learn a number of new things about how fake news spreads.

Photo Credit: How false news can spread – Noah Tavlin

The main takeaway from our research is that when it comes to preventing the spread of fake news, privacy is key. It is important to keep your personal data to yourself and to be cautious about the information you provide to large social media websites and search engines.

The most recent wave of technological innovations has brought us the data-centric web 2.0 and with it a number of fundamental challenges to user privacy and the integrity of news shared in social networks. But as our research shows, there’s reason to be optimistic that technology, paired with a healthy dose of individual activism, might also provide solutions to the scourge of fake news.

Modeling Human Behaviour

Existing literature models the spread of fake news in a social network in one of two ways.

The first approach models people who observe what their neighbours do and then use this information in a complicated calculation to optimally update their beliefs about the world.

The second approach assumes that people follow a simple majority rule: everyone does what most of their neighbours do.
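
To make the contrast concrete, here is a minimal sketch of that majority-rule dynamic in Python. The ring network, the binary encoding of beliefs (1 for endorsing the story, 0 for rejecting it) and the tie-breaking rule are illustrative assumptions of ours, not the specifics of any particular study.

# Minimal illustration of the majority rule: each agent simply adopts
# whatever position most of its neighbours currently hold.
import networkx as nx

def majority_rule_step(graph, beliefs):
    """Return updated beliefs where each node copies its neighbours' majority view."""
    updated = {}
    for node in graph.nodes:
        neighbour_views = [beliefs[n] for n in graph.neighbors(node)]
        if not neighbour_views:
            updated[node] = beliefs[node]  # isolated nodes keep their view
            continue
        endorsements = sum(neighbour_views)  # beliefs encoded as 1 (endorse) / 0 (reject)
        if 2 * endorsements > len(neighbour_views):
            updated[node] = 1
        elif 2 * endorsements < len(neighbour_views):
            updated[node] = 0
        else:
            updated[node] = beliefs[node]  # tie: keep the current view
    return updated

# Example: a small ring of six users where one initially endorses a fake story.
graph = nx.cycle_graph(6)
beliefs = {node: 0 for node in graph.nodes}
beliefs[0] = 1
beliefs = majority_rule_step(graph, beliefs)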

Photo Credit: Willett

But both approaches have their shortcomings. They cannot mimic what happens when someone’s mind is changed after several conversations or interactions.

Our research differed. We modelled humans as agents who develop their own strategies for updating their views on a piece of news, given their neighbours’ actions. We then introduced an adversary that tried to spread fake news, and compared how effective the adversary was when it knew the strength of the other agents’ beliefs with when it didn’t.

So, in a real-world example, an adversary determined to spread fake news might first read your Facebook profile to see what you believe, then tailor its disinformation to match your beliefs and increase the likelihood that you share the fake news it sends you.
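
To give a flavour of what “agents that develop their own strategies” can look like in code, here is a minimal, self-contained sketch in Python. The logistic policy, the single-feature state (the fraction of endorsing neighbours) and the reward of plus or minus one are our illustrative assumptions; they are not the exact design used in the research.

# Minimal illustration of a learning agent: it keeps a small policy mapping
# the fraction of endorsing neighbours to a probability of endorsing, and
# nudges that policy towards actions that turn out to be correct once the
# news item's true label is revealed.
import math
import random

class LearningAgent:
    def __init__(self, learning_rate=0.1):
        self.weight = 0.0  # how strongly neighbour endorsement sways the agent
        self.bias = 0.0    # baseline credulity or scepticism
        self.lr = learning_rate

    def endorse_probability(self, neighbour_fraction):
        return 1.0 / (1.0 + math.exp(-(self.weight * neighbour_fraction + self.bias)))

    def act(self, neighbour_fraction):
        return 1 if random.random() < self.endorse_probability(neighbour_fraction) else 0

    def update(self, neighbour_fraction, action, reward):
        # REINFORCE-style update: push the policy towards rewarded actions.
        p = self.endorse_probability(neighbour_fraction)
        gradient = action - p
        self.weight += self.lr * reward * gradient * neighbour_fraction
        self.bias += self.lr * reward * gradient

# One round on a news item that is in fact fake (true label 0), where an
# adversary has already persuaded a majority of each agent's neighbours.
agents = [LearningAgent() for _ in range(10)]
true_label, neighbour_fraction = 0, 0.6
for agent in agents:
    action = agent.act(neighbour_fraction)
    reward = 1.0 if action == true_label else -1.0  # feedback on whether the call was right
    agent.update(neighbour_fraction, action, reward)

Over many such rounds, agents that repeatedly receive negative feedback for endorsing fabricated stories become more sceptical of neighbour endorsements; this kind of endogenous learning is what distinguishes the approach from fixed-rule models.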

We learnt a few new things about how fake news is spread. For example, we show that providing feedback about news that has been shared makes it easier for people to detect fake news.

Our work also suggests that artificially injecting a certain amount of fake news into a social network can train users to better spot fake news.

Crucially, we can also use models like ours to come up with strategies on how to curb the spread of fake news.

There are three things we have learned from this research about what everyone can do to stop fake news.

Fighting Fake News

Because humans learn from their neighbours, who learn from their neighbours, and so on, everybody who detects and flags fake news can help prevent its spread across the network. When we modelled how the spread of fake news can be prevented, we found that the single best way was to allow users to provide feedback to their friends about a piece of news they shared.

Beyond pointing out fake news, you can also praise a friend when they share a well-researched and balanced piece of quality journalism. Importantly, this praise can happen even when you disagree with the conclusion or political point of view expressed in the article. Studies in human psychology and reinforcement learning show that people adapt their behaviour in response to negative and positive feedback – particularly when this feedback comes from within their social circle.

The second big lesson was: keep your data to yourself.

Web 2.0 was built on the premise that companies offer free services in exchange for users’ data. Billions followed the siren call, turning Facebook, Google, Twitter, and LinkedIn into multi-billion dollar behemoths. But as these companies grew, more and more data was collected. Some estimate that as much as 90% of all the world’s data was created in the past few years alone.

Do not give your personal information away easily or freely. Whenever possible, use tools that are fully encrypted and that collect as little information about you as possible. For most applications, from search engines to messaging apps, there is a more secure, more privacy-focused alternative.

Social media sites don’t yet have privacy-focused alternatives. Luckily, the emergence of blockchain technology could help solve the privacy-profitability paradox. Instead of having to trust Facebook to keep your data secure, you can now put it on a decentralised blockchain designed to operate as a trustless environment.

Co-Pierre Georg, Senior Lecturer, African Institute for Financial Markets and Risk Management, and Director, UCT Financial Innovation Lab, University of Cape Town; Christoph Aymanns, Assistant Professor, School of Finance, University of St. Gallen; and Jakob Foerster, Doctoral student, Artificial Intelligence and Machine Learning, University of Oxford

This article was originally published on The Conversation. Read the original article.

