What is the number one factor that can obstruct OSINT investigations?

The answer is probably looking back at you in the mirror.

All too many conversations regarding open-source intelligence focus on exciting tools and techniques.

A quick Google search for OSINT will reveal endless articles on using such-and-such a system to extract a bounty of data.

But any tool or tactic is only as effective as the human analyst employing it.

And unfortunately for us, that tends to be the source of many problems throughout the intelligence cycle.

Why?

We are all hard-wired with cognitive biases – a collection of mental shortcuts that help our brains make sense of large amounts of data.

Our ancestors needed these heuristics to scratch out a living on the African savanna.

But they create problems for us when navigating the modern world. And cognitive biases can lead to bad decisions in everything from investing and business to household shopping.

Or in the case of OSINT investigations specifically, these mental shortcuts can result in faulty analysis.

So how can we address this problem?

Savvy analysts have the humility to admit that they suffer from common cognitive biases.

Then they take specific steps throughout the intelligence cycle to address these possible pitfalls.

But first, it makes sense to take inventory of these psychological shortcuts and then understand how they might seep into our analysis.

Of course, a complete list and description of our cognitive biases goes beyond the scope of this article.

And, for the record, psychologists have documented almost 200 such heuristics.

So for this post, we wanted to highlight the top five cognitive biases that can obstruct OSINT investigations.

We'll also detail strategies and techniques to safeguard against these pitfalls.

 

1. Confirmation Bias

Confirmation bias describes our tendency to seek information consistent with our pre-existing beliefs. And by extension, this heuristic often leads us to disregard evidence that doesn't fit our original hypothesis or point of view.

Confirmation bias crops up often in situations where we prefer a specific outcome or when dealing with emotionally charged issues.

In OSINT investigations, this often occurs when we have a preconceived idea of where our analysis will go.

That predetermined hypothesis becomes almost a self-fulfilling prophecy, where the analyst begins selecting sources and data points that conform to their original point of view.

Many new analysts fall into the confirmation bias trap, particularly when searching for the perpetrators behind a particular incident. That said, even senior OSINT investigators can make this mistake.

While we can’t eliminate confirmation bias from our analysis, we can take steps to mitigate this problem.

“Rapid destruction of your ideas when the time is right is one of the most valuable qualities you can acquire,” Charlie Munger, the business partner of famed investor Warren Buffett, once explained.

“You must force yourself to consider arguments on the other side.”

In the case of OSINT investigations, it makes sense to take each bit of relevant intelligence you collect and assign it to one of two columns.

In column A, include any information that supports your original theory.

In column B, add every data point that might disprove your first hypothesis. Never ignore any intelligence just because it invalidates your preferred conclusion.

Afterward, review both columns.

First, check to see if you have enough evidence to support your original theory.

But more importantly, you also want to examine the strength of the data that disproves your preferred answer.

In other words, OSINT investigators have to guard against forcing the data into telling the story they want. Instead, we have to let the data tell the story on its own.
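For analysts who prefer to keep this exercise structured, here is a minimal sketch of the two-column technique in Python. The hypothesis, sources, and "supports" flags below are purely illustrative:

```python
# A minimal evidence ledger for the two-column technique described above.
# All sources and summaries here are hypothetical examples.
from dataclasses import dataclass

@dataclass
class Evidence:
    source: str
    summary: str
    supports_hypothesis: bool  # tag honestly at collection time, not at review

ledger = [
    Evidence("forum post", "actor bragged about the breach", True),
    Evidence("WHOIS record", "domain registered after the incident", False),
    Evidence("leaked log", "IP overlaps with known infrastructure", True),
    Evidence("news report", "a second group claimed responsibility", False),
]

column_a = [e for e in ledger if e.supports_hypothesis]      # supports the theory
column_b = [e for e in ledger if not e.supports_hypothesis]  # contradicts it

print(f"Supporting: {len(column_a)} | Contradicting: {len(column_b)}")
for e in column_b:  # review the disconfirming column first
    print(f"  - {e.source}: {e.summary}")
```

Reviewing column B first makes it harder to quietly dismiss the evidence that cuts against your preferred conclusion.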

 

2. Availability Bias

The availability bias refers to our tendency to overestimate the value of easily accessible information. This heuristic also refers to our propensity to rely on examples that come immediately to mind when evaluating a specific decision.

For example, many people consider air travel more dangerous than driving.

In reality, air travel is one of the safest modes of transportation; measured per mile traveled, flying is safer than driving by orders of magnitude.

So why do we make this mistake?

A plane crash is a highly publicized, mass-casualty event. A car accident, by comparison, seldom makes the news.

So because of the availability bias, people often overestimate the risks of air travel relative to other modes of transport.

But the same mistake also comes up in the world of corporate security.

Take safeguarding an executive, for instance. Threat assessments sometimes overestimate the possibility of dramatic incidents like shootings or kidnappings.

In reality, the biggest risks to the safety of many corporate executives are often simple trips, falls, and stumbles.

During an OSINT investigation, the availability bias can compromise our analysis in several ways:

  • We overweight the value of our own experiences or those of friends, colleagues, and family members. We underweight the value of raw data or subject-matter experts we may not know personally.

  • We overweight the value of easily accessible information, such as indexed results on Google searches. We underweight censored or hard-to-access information, such as dark web forums or obscure alt-tech social networks.

  • We overweight the importance of topics and information that grab headlines or trend on social networks. We underweight the value of information that generates little media buzz or engagement online.

  • We overweight the value of information written in our native language or originating from our home country. We underweight information published in foreign languages or originating from unfamiliar communities.

OSINT analysts can account for the availability bias by first taking inventory of their data sources.

Then ask yourself, “What could be missing from this list?”

From there, go out and deliberately seek information that might be hard to find.

For example, an investigator might want to expand their data sources by exploring difficult-to-access parts of the internet, such as the deep or dark web.

Or they may want to consider researching topics from unfamiliar regions and languages.
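For teams that track their sourcing in code, a rough coverage audit might look like the sketch below. The category names and source tags are illustrative, not a standard taxonomy:

```python
# Compare the sources actually consulted against a target coverage checklist.
# Categories and source names below are hypothetical examples.
CHECKLIST = {
    "surface web", "deep/dark web", "mainstream social", "alt-tech social",
    "foreign-language media", "subject-matter experts", "official records",
}

sources_used = {
    "Google results": "surface web",
    "Twitter search": "mainstream social",
    "court filings": "official records",
}

missing = CHECKLIST - set(sources_used.values())
print("Not yet consulted:", ", ".join(sorted(missing)))
```

The point isn't the code itself but the habit: make the gaps in your source list explicit instead of letting easy-to-reach data define the investigation.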

 

3. Stereotyping Bias

Stereotyping refers to our tendency to make sweeping, often incorrect, judgments about the beliefs and characteristics of a particular group of people.

In other words, we assume a group of people or a particular person has certain qualities, even without information to back up our assumptions.

In a recent presentation on bias in OSINT investigations, Heather Honey, President of Haystack Investigations, asks her audience to consider the example of heavy metal music fans.

“Perhaps you will make some assumptions about these guys based on the stereotypes that we have about people that listen to heavy metal,” she explains.

“What you think and what you believe about them is largely dependent on whether you like heavy metal. Is that your in-group right there or your out-group?”

Stereotyping can undermine OSINT investigations.

For starters, stereotypes can drive your research. They can tempt you to seek out information that will reinforce your original point of view.

Even worse, stereotypes can blind you to information or data that contradicts your original thesis.

In both cases, such biases can lead analysts towards disastrously wrong conclusions.

To curb reflexive reactions, we must consider our own potential biases.

Ask yourself: which groups or subcultures do I belong to? How might they positively or negatively impact my assessment of the group or individual in question?

From there, it helps to see people, as best we can, as individuals. And if time permits, increase your exposure to the stereotyped group.

 

4. Clustering Illusion

The clustering illusion refers to our tendency to underestimate the amount of variability in random data. And as a result, this cognitive bias can lead us to detect non-existent patterns.

One famous example of the clustering illusion at work occurred during the Nazi bombardment of London during World War Two.

At the time, some neighborhoods suffered more attacks than others. City residents devised all sorts of theories as to when and where the V-2 rockets would strike.

But decades later, psychologist Thomas Gilovich revisited the statistical distribution of the bombing locations.

He found the sites of V-2 rocket strikes followed an essentially random distribution. The patterns detected by observers during the war, he concluded, were the clustering illusion at work.
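You can reproduce this effect in a few lines of code. The sketch below, with made-up grid dimensions and strike counts, scatters points uniformly at random and shows that apparent "hot spots" and empty zones emerge anyway:

```python
# Demonstrate that uniform randomness still produces apparent clusters.
# Grid size and number of strikes are arbitrary, illustrative choices.
import random
from collections import Counter

random.seed(42)  # fixed seed so the illustration is reproducible

GRID = 20  # a 20 x 20 grid of city "neighborhoods"
strikes = [(random.randrange(GRID), random.randrange(GRID)) for _ in range(500)]

hits = Counter(strikes)
counts = [hits.get((x, y), 0) for x in range(GRID) for y in range(GRID)]

print(f"average hits per cell: {sum(counts) / len(counts):.2f}")
print(f"max hits in one cell:  {max(counts)}")
print(f"cells with zero hits:  {counts.count(0)}")
```

Despite every cell having the same expected number of hits, some cells take several times the average while many receive none, which is exactly the kind of "pattern" wartime observers latched onto.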

The clustering illusion pops up frequently in the world of finance and investing, too.

For instance, many people choose to invest in shares of public companies that have outperformed in recent months.

Economists, however, have long observed that stock price movements are almost entirely random. And an overwhelming body of evidence shows past returns serve as no indicator of future performance.

In other words, people misallocate capital because they mistakenly detect patterns in random data.

In fact, an entire cottage industry, called technical analysis, has cropped up from the clustering illusion. Proponents of this theory claim they can detect predictive patterns in stock price data, even though these same patterns appear in completely random data sets.

In the world of OSINT and corporate security, the clustering illusion can also lead us to misinterpret data.

For example, say you want to find the safest place for a VIP to stay on an upcoming trip. You might start by combing through the past few months of news reports on incidents in various neighborhoods of a city.

In your analysis, you find two boroughs have slightly more reported robberies than the others. In such a case, it might be easy to conclude that crime is concentrated in those areas.

In reality, however, it’s hard to make such a sweeping conclusion from such a limited data set.

The total amount of crime may be distributed randomly throughout the area. And there may be no statistically significant difference between the number of reported incidents in each neighborhood.

To avoid letting this bias seep into your analysis, don’t put much weight on a limited data set.

Illusory patterns can pop out when you’re working with a small number of inputs.

In such situations, we must seek out more data. Or if that’s not possible, we must refrain from making any broad conclusions.
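When the data is countable, a quick significance check can keep you honest. Here is a minimal sketch, assuming scipy is installed and using made-up robbery counts for five neighborhoods:

```python
# Test whether reported robberies differ significantly across neighborhoods.
# The counts below are hypothetical, illustrative numbers.
from scipy.stats import chisquare

counts = [14, 11, 17, 16, 12]  # monthly robbery reports per neighborhood

stat, p_value = chisquare(counts)  # null hypothesis: an even spread
print(f"chi-square = {stat:.2f}, p = {p_value:.3f}")

if p_value > 0.05:
    print("No significant difference -- the 'hot spots' may just be noise.")
else:
    print("The spread is unlikely to be random; dig deeper.")
```

With numbers like these, the test comes back nowhere near significance, a reminder that two boroughs "leading" a short list of incident reports may mean nothing at all.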

 

5. Law of the Instrument

The law of the instrument refers to a cognitive bias that involves an over-reliance on familiar tools.

This heuristic makes us inefficient. When trying to complete a task, we become fixated on the idea of using skills or tools we’re already comfortable with.

And as a result, any operation could take far longer than it would have if we had selected an alternative method.

OSINT investigators in the digital world need to pay special attention to this particular issue.

Activities we once conducted in person increasingly take place online. That requires analysts to keep updating their research toolkits.

Meanwhile, once-popular sites or platforms can disappear overnight. In such situations, users from these communities may migrate to completely different parts of the web.

For instance, millions of users boycotted the mainstream social networks following the January 2021 riot at the U.S. Capitol building.

That triggered a migration to a swath of lesser-known alt-tech platforms, like Gab, Telegram, and Raddle.me.

Since then, effective social media monitoring has required keeping tabs on far more social networks than analysts may have once considered.

So how can we reduce the impact of this bias in future OSINT investigations?

First, after defining the nature and scope of your research question, refrain from diving into the first strategy that comes to mind.

Instead, consider what resources you have available and prepare a list of people who can help you through the process.

Next, brainstorm multiple ways to tackle your research question while weighing the pros and cons of each method.

Taking these steps when approaching a problem can help you avoid reaching out for the same tool or solution time and again.
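One lightweight way to compare candidate methods, rather than defaulting to the familiar one, is a simple weighted scoring matrix. The criteria, weights, methods, and scores below are all invented for illustration:

```python
# Score each candidate research method against weighted criteria (1-5 scale).
# Criteria, weights, methods, and scores are hypothetical examples.
criteria_weights = {"speed": 0.2, "coverage": 0.5, "cost": 0.3}

methods = {
    "familiar search tool": {"speed": 5, "coverage": 2, "cost": 5},
    "dark web crawl":       {"speed": 2, "coverage": 5, "cost": 3},
    "expert interviews":    {"speed": 3, "coverage": 4, "cost": 2},
}

for name, scores in methods.items():
    total = sum(w * scores[c] for c, w in criteria_weights.items())
    print(f"{name}: {total:.1f}")
```

Even a rough exercise like this forces you to articulate why a method is the right one, rather than simply the one you already know.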

 


The Bottom Line for OSINT Investigations

Far too many conversations on OSINT focus on exciting new tools and techniques. But perhaps more attention needs to be paid to the people actually conducting these investigations.

For individual analysts, it’s important to understand how cognitive biases can obstruct your research.

Before you tackle a problem, review the list of biases above. Then consider how each heuristic could impact your thinking.

For managers or executives, the existence of such mental blind spots emphasizes the value of building a diverse team.

Organizations can curb these biases by simply allowing peers with a variety of perspectives to cross-check each other's work.

And in both cases, such strategies can go a long way towards limiting the impact these biases can have on your future OSINT investigations.

 
