‘Epistemic security’ in the post-truth age

A recent report published by the UK's Alan Turing Institute argues that the COVID-19 pandemic is a threat to global security in its own right, on a par with the concerns of «national security» or «cyber-security». Alongside those, we ought to pay attention to «epistemic security» too, because without it our societies will lose the ability to respond to the most severe risks we face in the future.

If home security is about keeping our possessions safe, financial security about keeping our money safe, and national security about keeping our country safe, then epistemic security is about keeping our knowledge safe.

Episteme is a Greek philosophical term for knowledge. Epistemic security therefore involves ensuring that we do in fact know what we know, that we can identify claims that are unsupported or not true, and that our information systems are robust to «epistemic threats» such as fake news.

The Alan Turing Institute report outlines potential countermeasures and areas of research that may help preserve epistemic security in democratic societies. But first, let's look at four key trends that have exacerbated the problem and made it increasingly difficult for societies to respond to pressing challenges and crises:

1. Attention scarcity

In the 13th Century, well before the invention of the printing press in Europe, scholars were already complaining about information overload. In 1255, the Dominican Vincent of Beauvais wrote of «the multitude of books, the shortness of time and the slipperiness of memory». The internet, however, has made massive quantities of hard-to-verify information more easily accessible than ever before, and it is difficult to sift out which tidbits are true and which are not. Our limited capacity for attention is simply spread too thin. This abundance of information, combined with the limits of our attention, creates a fierce «attention economy» in which governments, journalists, interest groups and others must compete for eyeballs. Unfortunately, some of the most effective attention-grabbing strategies appeal to people's emotions and existing beliefs, and the sources that use them tend to be ambivalent about the truth.

2. Filter bubbles and bounded rationality

A particularly worrisome consequence of the «attention economy» is the formation of filter bubbles, in which people are exposed primarily to views they already hold while opposing views are filtered out. When faced with information overload, people naturally prefer to pay attention to like-minded individuals in their own communities rather than to unfamiliar outsiders, and social media platforms make it easier than ever to form and join communities unified by shared beliefs and values. The epistemic consequence of filter bubbles is called «bounded rationality»: if access to information is the foundation of good reasoning and decision-making, then limiting one's access to potentially relevant information by becoming entrenched in a filter bubble will in turn limit one's ability to reason well.

3. Adversaries and blunderers

It is easier than ever to distribute and access information. The downside is that the same technologies also make it easier for people to spread false or misleading information, whether intentionally or accidentally. Actors (individuals, organisations or states) who intentionally manipulate information to mislead or deceive its recipients into false beliefs are called «adversaries». Adversaries mount «adversarial attacks» to incite people to action on the basis of misleading or false information. For example, a political campaign might use deepfake video technology to fabricate incriminating footage of rival candidates in order to manipulate election results in its own favour. Actors who spread false or poorly supported beliefs through well-intentioned or accidental means, on the other hand, are called «blunderers». For example, a vaccine researcher wary of side effects and distrustful of medical authority might make a well-meaning but slightly alarmist comment during an interview, which could then be picked up and spread on social media, instigating a widespread anti-vaccination campaign.

4. Erosion of trust

Humans have evolved natural techniques for deciding when to trust others. For example, we are more likely to trust someone if they are believed by a large number of people, and we are even more willing to believe a person who is a member of our own community. We also use body language, vocal intonation and speech patterns to judge honesty. These strategies are fallible, but in general they have served humans well. Modern information technologies, however, can undermine them. For example, filter bubbles can make otherwise minority opinions much more visible and make them seem far more widely believed than they actually are. While some minority perspectives ought to be made more visible, there is a problem when harmful, extremist narratives are made to appear much more mainstream than they really are. Some technologies also hijack our subconscious tendency to search for signs of honesty and insincerity in vocal patterns and body language: artificially generated speech and deepfake videos are not plagued by the little tics that tip us off when someone is fibbing.

The researchers at the Alan Turing Institute describe the worst-case scenario as «epistemic babble», in which the entire population loses the ability to distinguish between truth and fiction. Despite all the information available, people no longer know whether something is true or not. In the event of a pandemic, collective support for policy becomes impossible. According to the authors, we are closer to this scenario than most people suspect. To achieve epistemic security, it is necessary to act on four parts of the information process: how and by whom information is produced, how it is disseminated, who takes note of it, and who evaluates its accuracy. This requires a holistic approach in which all actors are identified.


AUTHOR: Elizabeth Seger

PUBLICATION: www.BBC.com

COMPILATION: www.AMBIDEXTRAS.org