How do we know if the information we are consuming is true, accurate or justified? How can we be confident that the information we use to inform our judgments and advice to decision makers is of good quality? In this blog, I’ll provide a snapshot of some of the tradecraft, tools and techniques intelligence analysts use to evaluate information and sources online, which can be neatly summarised using the following four key criteria.
The relevance of the information.
The reliability of the source.
The credibility of the information.
The corroboration of the information.
To remember this, I like to use the mnemonic (or pattern) R2C2. It’s a simple way to remember two ‘Rs’ for Relevance and Reliability and two ‘Cs’ for Credibility and Corroboration.
R2C2: Is the information RELEVANT?
The first step in verifying the information we’ve collected online is to do a quick check that it’s relevant. This might seem like an obvious point, but when we are collecting large amounts of information – sometimes quickly – it can be easy to pick up information that is only tangentially linked or, on closer examination, not relevant to our intelligence issue at all. When evaluating for relevance, we want to check that the information is closely connected to our issue or problem and appropriate to the current time period or circumstances of interest.
R2C2: Is the source RELIABLE?
Once we’ve done a quick relevance test, we can then start to look more closely at the source of the information and ask ourselves some key questions to determine if the source is reliable. That is, evaluating whether the source is consistently good in quality or performance and able to be trusted.
R2C2: Is the information CREDIBLE?
Turning now to our next criterion, credibility. Is the information provided by the source convincing and able to be believed? To better evaluate credibility, we first need to consider if the information is plausible. If not, under what conditions would the information be plausible?
R2C2: Can I CORROBORATE this information?
Which leads us to our fourth criterion, corroboration. Can we corroborate this information? Do other quality sources tell a similar story? If not, why not? Is further research required to corroborate the information we’ve found?
What does it mean if the information we have can’t be corroborated? Could the information still be accurate even if it can’t be corroborated? And, if so, how confident are we that it’s true and why?
R2C2: It Gets You Thinking
These are all questions we need to ask ourselves when determining the quality or strength of our information.
Using the R2C2 criteria should get you thinking about the nature of the information you’ve collected and the sources of that information. You will likely have identified some new information gaps, including the need to do more research on the sources of your information. You will probably also want to sort or prioritise your information in a way that helps you make better sense of what it’s telling you – so that you can come to sensible conclusions or make considered judgments about the problem or issue you’re working on. Why? Because ultimately the judgments and assessments about the issue you’re working on are going to inform your brief to your boss!
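If it helps to make this concrete, here is one way you might capture an R2C2 evaluation as a structured record. This is just a minimal sketch in Python – the field names and the ‘high/medium/low/unknown’ ratings are my own illustrative conventions, not a published standard.

```python
# A minimal, illustrative sketch of an R2C2 evaluation record.
# The field names and rating values are my own conventions, not a standard.
from dataclasses import dataclass, field


@dataclass
class R2C2Evaluation:
    source: str                     # e.g. a URL or publication name
    relevance: str = "unknown"      # connected to our issue and timeframe?
    reliability: str = "unknown"    # is the source consistently trustworthy?
    credibility: str = "unknown"    # is the information plausible and believable?
    corroboration: str = "unknown"  # do other quality sources tell a similar story?
    notes: list = field(default_factory=list)

    def gaps(self):
        """Return the criteria still marked 'unknown' - where more research is needed."""
        criteria = {
            "relevance": self.relevance,
            "reliability": self.reliability,
            "credibility": self.credibility,
            "corroboration": self.corroboration,
        }
        return [name for name, rating in criteria.items() if rating == "unknown"]


# Example: a first pass over a single article.
article = R2C2Evaluation(source="https://example.com/article")
article.relevance = "high"
article.notes.append("Directly addresses the question and the time period of interest.")
print(article.gaps())  # ['reliability', 'credibility', 'corroboration']
```

A record like this won’t do the thinking for you, but it makes the gaps in your evaluation visible at a glance – which is exactly where your next round of research should go.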
What does this look like in practice?
Let’s switch gears slightly – from the theoretical to something a little more practical – and use R2C2 to evaluate a couple of different information sources. To scope this activity, I have chosen two written examples and have deliberately avoided other types of collection, such as imagery or communications, as we will look at these in future Tools and Techniques webinars and blog posts. Also, I’m going to assume that these sources are relevant to the scenario I’m considering and will focus on one criterion per source. We will step through reliability and corroboration together, but I’ll leave you to have a go at credibility yourselves.
Source 1: BUZZFEED
Up first, I’ve chosen an article from BuzzFeed. For this source we are going to consider its reliability.
For those of you who are not familiar with BuzzFeed, it is an American internet media, news, and entertainment company with a focus on digital media. It started as an algorithm designed to pull stories from around the web that were going viral and was famous for its pop culture articles, quizzes and cat pictures.
Over the years, BuzzFeed has also invested millions into serious investigative journalism and even won a Pulitzer Prize in 2021 for its coverage of China’s campaign against the Uyghurs in Xinjiang. BuzzFeed is left-leaning and, despite its entrance into serious journalism, is still often viewed as a questionable source. Nevertheless, it is popular. And chances are, if you’re conducting research on any number of topics, you’ll probably come across something relating to your issue or problem on BuzzFeed.
In this scenario, let’s pretend that we are researching whether Australia is safer than the United States. Using our criteria for evaluating sources (R2C2), how does it stack up for reliability?
Based on our simple understanding of BuzzFeed, it’s possibly going to score lower than we’d like, as we begin to answer some of our criteria questions. But we can’t be sure until we step through and try to evaluate its reliability for ourselves.
In most cases you will need to do more research on the author of the article and consider the type of news that is being reported. Have they relied on other sources to inform their judgments? Do they reference academic studies or reporting?
At a quick glance, we can see that this article is based primarily on the views of individuals, and although the title of the article isn’t misleading, I’m starting to wonder about some of the conclusions the author may be drawing from the source material. There’s a lot of emotive language, and Reddit references also feature heavily. In some cases, the references for direct quotes have been deleted, so I’m unable to see the context in which they were written. It’s not looking super helpful, but it possibly does give us a place to start investigating – even if only to find other, higher-quality sources. Depending on your project and timeline, you will need to make a judgment about whether you will review this source further or rule it out.
Source 2: KATEHON
Our next source is an article from the Russian ‘think tank’, Katehon. We are going to review this source in the context of identifying opportunities for corroborating the information presented in the article.
Katehon is a Moscow-based quasi-think-tank that is a proliferator of anti-Western disinformation and propaganda. It is led by individuals with probable links to the Russian state and Russian intelligence services. Within Russia’s broader disinformation and propaganda ecosystem, Katehon plays the role of providing supposedly independent, analytical material aimed largely at European audiences.
So, let’s act as if we are researching the war in Ukraine and have come across this article. At its most basic level, it suggests that the Ukrainian President, Volodymyr Zelensky, is trying to start World War III. Using our criteria for evaluating sources (R2C2), how does it stack up on corroboration?
Let’s start by acknowledging that corroborating propaganda is often difficult unless we go straight to other propaganda sources – and there are certainly times we would want to do this. But, for this exercise, we are looking to dissect the information in the article and see if we can corroborate it. We are not making a judgment about whether we agree with the author's position on the war in Ukraine.
To start, we know we are going to have to do more research and look closely at the facts, rationale and arguments presented in the article, breaking them down into manageable chunks. We might try to ground-truth any number of the provocative threads – Is Volodymyr Zelensky heading a Nazi regime? Is Ukraine a proxy in a war between Russia and the US? Was it a Russian or Ukrainian missile that struck Poland in November last year? Is Ukraine developing a ‘dirty bomb’ as the author claims? Then there are all the statistics (presumably aimed at a conservative US audience!) about how much money the US is spending on supporting Ukraine, and some emotive language and assertions about Zelensky’s character. We can quickly see that, analytically, this article is a minefield – but one we need to make our way through if we want to make informed judgments about the issues, and ultimately better understand the war in Ukraine.
Importantly, what does it mean if we can’t corroborate the information? Could some of this information still be accurate even if we can’t corroborate it? And could some of the information still be accurate even if the conclusions drawn by the author are flawed? Is this a useful source? As always, it depends on how we think about the problem and the questions we are trying to answer. I probably wouldn’t recommend adopting the author’s views too quickly, but I’m prepared to acknowledge that this source does give us insight into Russian propaganda; it does highlight key events/issues that we might be looking to better understand if we are researching the war in Ukraine, and it does prompt some important questions for us about the value in corroborating information and what it means for our analysis if we can’t.
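If you like keeping structured notes while working through an article like this, the claim-by-claim breakdown lends itself to a simple running list. Here is a minimal sketch in Python, using a few of the claims from the article above – the statuses and fields are purely my own illustrative convention:

```python
# A minimal, illustrative sketch of claim-by-claim corroboration notes.
# Statuses and fields are my own convention, not a formal methodology.
claims = [
    {"claim": "Zelensky is heading a Nazi regime",
     "corroborating_sources": [], "status": "not corroborated"},
    {"claim": "Ukraine is a proxy in a war between Russia and the US",
     "corroborating_sources": [], "status": "needs research"},
    {"claim": "A Russian missile struck Poland in November",
     "corroborating_sources": [], "status": "needs research"},
    {"claim": "Ukraine is developing a 'dirty bomb'",
     "corroborating_sources": [], "status": "not corroborated"},
]

# Surface what still needs work before we draw any conclusions.
for c in claims:
    if c["status"] != "corroborated":
        print(f"Follow up: {c['claim']} ({c['status']})")
```

Breaking the article down this way keeps you honest: each provocative thread gets tracked and researched on its own merits, rather than being accepted or rejected wholesale along with the author’s conclusions.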
Are there any tools to help us evaluate sources and information?
There’s no doubt that evaluating sources and information online takes a bit of brain power and discipline. I chose the examples above for ease of explanation, but source evaluation is often complex even with the help of R2C2. So, are there any other tools that can help us on this journey, perhaps do some of the grunt work for us?
I never like to answer that question with a definitive yes. As an analyst, you’re always going to have to apply your own critical thinking to a problem. That doesn’t mean we can’t get a little help along the way. Here are a few tools I’ve chosen to get you started.
https://firstdraftnews.org/ – Ethical guidelines for evaluating information
www.factcheck.org – Umbrella site for fact checking websites
www.abc.net.au/news/factcheck/ – RMIT ABC Fact Check
https://mediabiasfactcheck.com/ – Reports on media bias
https://adfontesmedia.com/interactive-media-bias-chart/ – Reports on media bias
www.allsides.com/media-bias – Reports on media bias
www.politifact.com/ – Rates the accuracy of claims by US elected officials
https://www.reuters.com/fact-check – Reuters Fact Check
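One small way to put a list like this to work is to script site-restricted web searches, so a single claim can be checked across several fact-checking sites at once. A minimal sketch follows – the site list and the search-URL approach are my own illustration, not something these sites publish:

```python
# Minimal sketch: build site-restricted search links so one claim can be
# checked across several fact-checking sites. Illustrative only.
from urllib.parse import quote_plus

FACT_CHECK_SITES = [
    "factcheck.org",
    "politifact.com",
    "reuters.com/fact-check",
]


def search_links(claim: str) -> list:
    """Return one site-restricted search URL per fact-checking site."""
    return [
        f"https://www.google.com/search?q=site:{site}+{quote_plus(claim)}"
        for site in FACT_CHECK_SITES
    ]


for url in search_links("is Australia safer than the United States"):
    print(url)
```

It’s a convenience, not a verdict – you still have to read and weigh whatever comes back, and, as noted below, check the checker.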
Before I wrap this up, a couple of quick comments about using these tools effectively.
Tools don’t do all the work for you – they will just help inform your judgments.
They can reduce the time you spend researching.
They won’t be applicable to all your sources and problem sets – so you’ll need to go out and find the ones that work best for you.
‘Old tools’ still work – don’t dismiss journal articles or academic texts. Academic websites have greatly improved public access and provide some of the best information about staying savvy online.
Question everything, and if there’s something you’re not sure about, try the fact checking sites.
Don’t forget to check the checker! You may still have to determine whether a fact-checking website is nonpartisan and reliable, and it’s always a good idea to do a quick check on media bias.
Key takeaways
I can’t stress enough the importance of putting some formality and consistency around the evaluation of your sources and information. This won’t just improve your analysis; it will improve the advice and recommendations you make to decision makers. Source evaluation is something all OSINT practitioners should be actively thinking and learning about, simply because sophisticated or unique collection methods don’t mean much if we can’t analyse and evaluate the information we collect.
If you take away anything from this post, it should be:
Take time to learn about source evaluation and the principles for conducting good research and interpreting material online.
Source evaluation can be complex. And, if you get stuck there are some good online tools available to help you – but don’t forget to check the checker!
If you get overwhelmed, you can always go back to basics with R2C2 (Relevance, Reliability, Credibility, Corroboration) and ask yourself the questions under each heading. But it’s a good idea to have other tools in your toolbox too, such as those used by academic institutions.
Always be mindful of bias and disinformation.
I’ll conclude this post with a gentle reminder that OSINT analysts must make judgments about information to best support policy, planning and decision makers. To do this, we need to critically evaluate the sources of information we choose to use to inform those judgments. This is not easy work, nor is it always a labour of love. Nevertheless, it is a critical element of intelligence analysis. Without quality source evaluation, the time we spend collecting information is arguably wasted.
To support OSINT collection and analytical capability uplift and to delve deeper into some of the learnings above, please look at our in-person training courses, or our online, self-paced options here. Alternatively, contact us at training@osintcombine.com to learn about our bespoke training offerings.
Author: Kylie Pert
Kylie is a Senior Intelligence Specialist at OSINT Combine. She has over 15 years’ experience in the Australian Defence and Intelligence Communities and has held Senior Executive roles in Intelligence and Cyber. As an intelligence analyst, Kylie worked across multiple target-sets including China, South-East Asia, Counter-Terrorism, Counter-People Smuggling, Iraq and Afghanistan.