YouTube to limit exposure of 5G conspiracy theories but won’t remove content
YouTube has confirmed it will reduce the exposure of videos which promote 5G as some sort of cause or accelerator of the coronavirus, though its actions are somewhat limited.
April 6, 2020
Officially, video content which makes the link between 5G and COVID-19 does not actually break community guidelines, but it will be removed from recommendation engines because it has been deemed borderline content.
Fuelled by these baseless conspiracy theories, telecommunications infrastructure was set on fire in multiple locations over the course of the weekend. Three has confirmed at least five sites were attacked, while Vodafone has said six sites were damaged, some of which were shared infrastructure and not all of which were housing 5G base stations. BT and O2 had not responded at the time of writing.
“Our thoughts are with everyone affected by the coronavirus around the world,” a YouTube spokesperson said. “We’re committed to providing timely and helpful information at this critical time, including raising authoritative content, reducing the spread of harmful misinformation and showing information panels, using NHS and WHO data, to help combat misinformation.
“We have also begun reducing recommendations of borderline content such as conspiracy theories related to 5G and coronavirus, that could misinform users in harmful ways. We’ll continue to evaluate the impact of these videos on the UK community and look forward to continuing our work with the UK Government and the NHS to keep the British public safe and informed during this difficult time.”
While YouTube will remove this content from its recommendation engine, it is stopping short of deleting conspiracy theories from the platform entirely. Although these statements are quite obviously false, creating content which promotes such beliefs or theories is not actually in violation of YouTube’s rules.
According to YouTube, the conspiracy theories would be labelled as borderline content. This is a category of content which could misinform users in harmful ways, such as promoting miracle cures, claiming the earth is flat or making blatantly false claims about historic events. Such content accounts for less than 1% of the videos available on YouTube, and while it will remain on the platform, removing it from the recommendation engine will make it much more difficult to find.
This is the position YouTube is currently taking, but it may well be pushed towards a firmer stance over the coming days or weeks. Aside from Government pressure, the content has now been linked to violence, which will make YouTube’s PR team twitchy.
Following a weekend which saw ill-informed arsonists attack masts hosting critically important communications infrastructure, it was suggested Secretary of State for Digital, Culture, Media and Sport Oliver Dowden would be speaking to social media representatives about ways in which misinformation can be combatted.
Although there is no official statement from DCMS on action moving forward, Chair of the DCMS Select Committee Julian Knight has called for more stringent action.
“To hear that crackpot theories are leading to people attacking phone masts or threatening telecom workers is sickening and it’s clearly time to act,” said Knight. “Government should work with social media companies to stamp out deliberate attempts to spread fear about COVID-19.”
Considering the importance of communications infrastructure in supporting society while COVID-19 forces a state of lockdown, as well as its role in helping the economy bounce back in the future, something needs to be done. The infrastructure needs to be protected from the idiots who believe the pseudoscience and ignore statements made by those who have the qualifications to make such assertions.