In this blog I would like us to look at nation-state threat actors, social manipulation, and why we all need more CRITICAL THINKING in our lives to assess the risk of what we are seeing and sharing. #FAKENEWS

Nearly everything you do online is tracked, quantified, and assessed. There is a reason products like Facebook and Google can run for free, and that's advertising. The proven reach and "click-through" effectiveness of their algorithms differentiate them from competitors.

For a lot of us, Facebook is just a platform to do a bit of social watching - you can catch up with your friends, see pictures of family you are distant from and generally socialise. It has become essential through this pandemic for social interactions and being human. But to opportunistic people, it can be a very different platform.

Big Data is a term used to describe a huge data set; at BT, we called it a data lake. It comprises several data sources, both historical and "live" (current). Companies like Facebook and Google buy these data sets from survey and research companies and use them to train algorithms, or sell them on for research purposes once compiled. Google and Facebook, however, are public companies and (although questionable at times) do not hide how they use your data. The US Congress is currently holding privacy hearings on Google, Facebook, Amazon, and Apple over how much data they hold and how they use it.

My big thing for this month is CRITICAL THINKING and #FAKENEWS. Social media is, by its very nature, built on sharing – taking content, either your own or others', and using it to create discussions with friends. With Facebook, Twitter, Reddit, YouTube, Google, and WhatsApp, this content moves FAST! It's incredible how quickly a video can be shared and reach millions of views within hours or days. The issue is when that content has inconsistencies, mistruths, or no valid data behind it. The big problem comes with algorithms – social media platforms live and die by their "engagement", and this can lead to something called Confirmation Bias.

Before we get into the crux of today's topic, bias and reinforcement, I want to state I have no direct political alignment. I am not pushing an agenda, and I like facts. As a Cyber Security Expert and a data scientist, ethics in the field of technology is crucial. I think this activity is immoral, and understanding how it works has helped me form that opinion. I will happily debate with anyone who disagrees with the below.

If we buy a magazine in a shop, it's because we chose to do so. It says something about us: who we are, what we like, and who we want to be. That purchase is private (or maybe shared with Tesco if you have a Clubcard, or Visa if you paid by card). However, if you bought a digital copy of that magazine on Facebook, they now have that information. Facebook have the knowledge and the power to push related content. An algorithm is just logical computer code trained to provide you with content you want and like. You like cars – here are car reviews; you liked the car reviews – here are car parts websites; you like the car parts websites – here are installation guides and insurance platforms that will cover your modified car. You will keep coming back to the platform for the content you like, and they get rich promoting content through algorithms. What happens if you then combine this with politics?
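To make that feedback loop concrete, here is a toy sketch in Python. This is not any platform's real algorithm, and every topic name is invented – it just shows how a few clicks on one interest spread to related topics and quickly dominate a recommended feed:

```python
from collections import Counter

# Hypothetical topic graph: clicking one topic raises interest in related ones.
RELATED = {
    "car reviews": ["car parts"],
    "car parts": ["installation guides", "car insurance"],
}

def recommend(history, catalogue):
    """Score each topic by your clicks, plus clicks on topics that lead to it."""
    clicks = Counter(history)
    scores = {}
    for topic in catalogue:
        score = clicks[topic]
        for parent, children in RELATED.items():
            if topic in children:
                score += clicks[parent]  # interest spreads to related topics
        scores[topic] = score
    # Highest-scoring topics dominate the feed, reinforcing themselves.
    return sorted(catalogue, key=lambda t: -scores[t])

catalogue = ["car reviews", "car parts", "installation guides",
             "car insurance", "gardening"]
feed = recommend(["car reviews", "car reviews", "car parts"], catalogue)
# Three car-related clicks and the whole feed is cars; "gardening" sinks
# to the bottom because nothing you did ever pointed at it.
```

Note that the user never asked for "car parts" content at the top – two clicks on reviews put it there. That is the reinforcement the rest of this post is about.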


"Facebook is not the best single source for unbiased news by design. Because Facebook tailors your News Feed based on your own behaviour, you inadvertently become a victim of your own biases. It is particularly concerning because 78% of users don't go to Facebook to read news, but somehow, they end up doing so. Keep in mind you can customise the News Feed by hiding certain types of posts, unfollowing a friend's posts while staying friends, or hiding posts from a page."
- Nelson Granados for Forbes.


This is where more and more people end up swinging left-wing or right-wing, simply by accidentally or inadvertently engaging with one news article. These algorithms will keep feeding you more content with stronger views. Currently, there is a strong suggestion in the research community that these algorithms will slowly manipulate you. If you are on a news article and you click on another article with similar themes, the algorithm will take note of this, then assume other people looking at that original article might want to make the same step that you did. These steps can be through content and news sources which are unvetted/uncontrolled, as anyone on social media can say whatever they want (free speech). However, thanks to groups like Cambridge Analytica and Aggregate IQ (shining a spotlight on this topic), other factors can tactically move you in directions they want.
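That "other people might want to make the same step" behaviour is, in essence, a next-click model. A minimal illustration in Python – entirely hypothetical data and article names, nothing like a production system:

```python
from collections import defaultdict, Counter

def build_next_click_model(sessions):
    """Count which article users clicked next after each article."""
    model = defaultdict(Counter)
    for session in sessions:
        for current, following in zip(session, session[1:]):
            model[current][following] += 1
    return model

def suggest(model, article):
    """Suggest the article most often clicked after this one."""
    if not model[article]:
        return None
    return model[article].most_common(1)[0][0]

# Three invented reading sessions: two readers escalated, one did not.
sessions = [
    ["mild-opinion-piece", "stronger-opinion-piece"],
    ["mild-opinion-piece", "stronger-opinion-piece", "extreme-opinion-piece"],
    ["mild-opinion-piece", "sports-news"],
]
model = build_next_click_model(sessions)
```

Because two of the three readers moved from the mild piece to the stronger one, the model now nudges every future reader of the mild piece the same way – and from the stronger piece, towards the extreme one. No single step looks sinister, but the chain only ever points towards stronger views.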

Looking at Cambridge Analytica: they were a "data research" and advertising agency. Some of you may have heard the name, as they have been mentioned in other blog posts before. These guys were for hire. Through their data points (over 5,000, with Facebook just being one of them), they were able to ascertain information about nearly anyone they wanted to. They would then target them with adverts based on their personality, using the message they had decided would best resonate with them to influence their opinion.

However, they had a fascinating past and profile as the "SCL Group", a military contractor specialising in psychological warfare, which had started in the UK working for the MoD (Ministry of Defence).

Facebook has big issues with this and is making progress: "We're taking down more fake accounts than ever, 2.19 billion in Q1, up from 1.2 billion in Q4 last year" – Facebook, 23/5/2019. It's all-important work, because it's fake profiles which spread the targeted content, along with these companies buying advertising space. Having said that, if money from political campaigns is coming in, then they have no real incentive to stop these practices.

These groups and tactics were deployed in both the Trump campaign and the Brexit campaign. Whether you deny their effectiveness or their involvement, they were involved, and they did run targeted advertising. Whether it worked is up for debate – see Christopher Wylie's testimony to the UK Government inquiry.

Another big thing we see now is the debate around masks and whether they are "potentially" dangerous for you. They aren't. But there are large portions of America where the advertising and the virtual confirmation bias will tell people it is dangerous to wear a mask. There was a video (now widely taken down) where a semi-professional show "demonstrated" how oxygen levels dropped underneath a child's mask. This is problematic because it just wasn't true, but the video evidence seemed conclusive, and it tricked a lot of people. This is why we now have response videos like this one: an NHS employee debunking the "evidence".

A lot of people also think the government has been indecisive about masks, and so no longer trust the advice. This is where I want to apply critical thinking. Why might the government be so undecided, and keep changing its mind? The government did not have enough PPE, and the supplies it did have were mostly unusable. There were reports of police raids on homes where PPE was being hoarded. It had to do something, but it couldn't compete with the demand from the public. Reducing the public's need and want for PPE meant there would be enough for the NHS ("stay home, you don't need PPE"). As soon as the supply chain of these goods was at a sustainable level, it was time for the public to be told – yes, you do, in fact, need them.

They had a terrible decision to make, and we have seen this kind of decision-making throughout wartime: save the important ones (NHS workers) so that they can support the people who will need them (the general public). Think of Alan Turing breaking Enigma in WW2 and having to let a certain percentage of people die so the Germans didn't get suspicious. If the masks didn't help or weren't necessary, why did the NHS need them so badly?

Critical thinking: the analysis of facts to form a judgment.
Analysis: detailed examination of the elements or structure of something.
Confirmation bias: the tendency to interpret new evidence as confirmation of one's existing beliefs or theories.

I have some advice for you:

  1. ALWAYS view the source of the content you are seeing! Is it from a government source? Who made it? What budget did they have to make it? (Big production with video and graphics vs a solo person's blog.) How recently was the website created?
  2. NEVER give out personal details to groups you do not know because if they have your information, they can use platforms like Facebook, Twitter, etc. to keep pushing the targeted adverts.
  3. CHALLENGE the data; always look in the comments – is there a war going on? I'm not saying you need to get involved, but read them, see some other opinions, and research the topic further. Worryingly, Trump constantly saying "Fake News" is so damaging because it asks society to ignore and disregard any alternative opinion/content which might exist.
  4. CHALLENGE YOURSELF; there is no way to avoid these algorithms skewing your opinions or views of the world. So mix things up: read other data sources, try a different newspaper and see what they say. It will help confuse the algorithms and stop them locking you into a "predictable model" they can take advantage of.
  5. BE RESPECTFUL of others' opinions; there is a reason and a method behind how they formed those views. I'm always interested to understand how they arrived at those opinions, and whether maybe my own data sources are wrong/biased.

I had this a few weeks ago when a local pub put out a sign (very prominently, near a main road) for "All Lives Matter". Upon a group of us writing reviews on Facebook asking for the sign to be taken down, I got into a debate with a lady. Her "sources" of information were terrifying. I wanted to find out how she had acquired such radical views, but it quickly became obvious it was Confirmation Bias on social media – she had been pushed down the rabbit hole. Again, I took her "sources" and analysed them, to which she just threw more/different sources at me rather than defending their legitimacy. She hadn't applied CRITICAL THINKING.

Unfortunately, it does appear everyone is falling for this content and NOT UNDERSTANDING THE PLATFORMS (like we were talking about in Part 3), but do consider the advice and spread the word. Facebook is not a news site, and it's no secret that the actual news sites have their own biases; we all need to take a wider sample. We know there are HUNDREDS of companies now trying to build the same success and data models as Cambridge Analytica. Social unrest was a key tool used by the CIA when they were destabilising South America and installing the leaders they wanted. The USA has a lot of social unrest now thanks to Trump, masks, immigration, racism, Covid-19, healthcare, finance, elections, China ... I wonder who might gain from a USA at civil war?

Big data companies know all, they track all, and it's all for sale to the highest bidder. Even Google admitted a while back that "incognito mode" in Chrome doesn't mean "incognito" to Google.

Until next time,