YouTube to Update Sharing Features to Prevent Videos With Misinformation From Going Viral

YouTube has announced new initiatives to combat misinformation on the platform. According to chief product officer Neal Mohan, the three areas of focus are preventing misinformation from spreading, addressing misinformation in languages other than English, and reducing cross-platform sharing of misinformation.

Misinformation to be Addressed by YouTube

According to The Times, YouTube’s focus on cross-platform sharing aims to reduce views of content deemed harmful under the platform’s current misinformation policies.

The video streaming service claims that changes to its recommendation algorithm have reduced consumption of these questionable videos, but traffic from other sites embedding and linking to these videos remains a problem.

Possible solutions include deactivating the platform’s share button or breaking links to videos that have already been suppressed on YouTube.

Another possible solution is to display warnings that a video may contain inaccurate information, similar to the warnings the platform already shows on graphic and age-restricted content.

To combat misinformation, YouTube is also considering expanding its staff and expertise, as well as forming partnerships with non-governmental organizations and local experts.

The platform may also add labels to videos on emerging, rapidly evolving topics such as natural disasters.

The measures are the latest glimpse of YouTube’s effort to balance safety and misinformation suppression with freedom of expression, a tension that has been a hot topic throughout the COVID-19 pandemic.

With over 2 billion monthly users, the platform is the largest source of internet video. Mohan said the company will take steps to limit the spread of potentially damaging misinformation while still allowing discussion of and education about sensitive and controversial topics.

Critics believe that the platform isn’t doing enough to combat disinformation. According to Protocol, more than 80 fact-checking organizations addressed a letter to YouTube CEO Susan Wojcicki earlier this year seeking greater action on disinformation on the platform.


Misinformation Regarding COVID-19

According to CNET, over the last two years YouTube has removed more than 1 million videos containing dangerous coronavirus misinformation, such as false cures or hoax claims.

Putting the figure into context is challenging because of the massive scale of Google’s service, which is the internet’s largest source of video, with over 2 billion monthly users.

The total is more than triple what the company had previously disclosed: YouTube earlier said it had removed over 500,000 videos for COVID-19 misinformation.

However, YouTube does not publish how many videos are added to its library, and it has not updated its total video-removal figures for the last five months, clouding the picture of how coronavirus-related removals compare with other types. The platform removes about 10 million videos every three months.

YouTube, like other social media platforms such as Reddit, Facebook, and Twitter, provides a forum for people to post their own content and has struggled to balance freedom of expression with effective policing of the worst material on its site.

Over the years, YouTube has grappled with misinformation, discrimination, conspiracy theories, bigotry, harassment, exploitation, child abuse, and videos of mass murder, all at an unparalleled global scale.
