Based on research from two British universities, Google’s Jigsaw unit will start a campaign to combat misinformation about Ukrainian refugees in Poland, Slovakia, and the Czech Republic next week.

In collaboration with Google’s Jigsaw, psychologists from the universities of Cambridge and Bristol created 90-second videos designed to “inoculate” people against misinformation on social media.

The videos, which will appear in advertising slots on Google’s YouTube and on other platforms such as Twitter, TikTok, and Meta’s Facebook, are intended to help people identify psychological manipulation and demonization in news stories.

Immunization videos

Jon Roozenbeek, the lead author of a study on the campaign’s research, said in an interview that people tend to argue when you tell them what is true and what is untrue. “But what you can predict,” he added, “are the techniques that will be used to spread misinformation, as with the Ukrainian crisis.”

The study included seven trials, one of which used a sample of adult US citizens who watch political news on YouTube. Jigsaw showed an immunization video to around 5.4 million US YouTube users, with almost a million engaging for at least 30 seconds.

In collaboration with local non-governmental organizations, fact-checkers, academics, and disinformation specialists, the initiative aims to create resistance to anti-refugee propaganda.

A scalable trial

The proliferation of false and misleading content on social media networks in the US and EU has prompted several governments to call for additional regulations to combat misinformation.

In an interview, Beth Goldberg, Jigsaw’s head of research, said that the team behind the project regards it as a pilot trial. Goldberg sees no reason why the method couldn’t be scaled to other nations.

She explains that Poland was selected because it has the highest number of Ukrainian migrants, adding that the Czech Republic and Slovakia would serve as important indicators for the rest of Europe. The program will last a month.
