Monday, July 9, 2018
YouTube is fighting conspiracy theories with ‘authoritative’ context and outside links
YouTube is adding “authoritative” context to search results about conspiracy-prone topics like the Moon landing and the Oklahoma City bombing, as well as putting $25 million toward news outlets producing videos. Today, the company announced a new step in its Google News Initiative, a program it launched in March. The update is focused on reducing misinformation on YouTube, including the conspiracy theories that have flourished after events like the Parkland shooting.
This update includes new features for breaking news updates and long-standing conspiracy theories. YouTube is implementing a change it announced in March, annotating conspiracy-related pages with text from “trusted sources like Wikipedia and Encyclopedia Britannica.” And in the hours after a major news event, YouTube will supplement search results with links to news articles, reasoning that rigorous outlets often publish text before producing video. “It’s very easy to quickly produce and upload low-quality videos spreading misinformation around a developing news event,” said YouTube chief product officer Neal Mohan, but harder to make an authoritative video about a developing story.
YouTube is also funding a number of partnerships. It’s establishing a working group that will provide input on how it handles news, and it’s providing money for “sustainable” video operations in 20 markets around the world, in addition to expanding an internal support team for publishers. (Vox Media, The Verge’s parent company, is a member of the working group.) It has previously invested money in digital literacy programs for teenagers, recruiting prominent YouTube creators to promote the cause.
Will this be effective? It’s hard to say. YouTube is proposing links to text articles as a cure for misinformation, but Google Search’s featured results — including its Top Stories module — have included links to dubious sites like 4chan and outright false answers to basic questions. Unlike deliberate “fake news” operations, these errors obviously aren’t intentional, but they make it harder to believe that Google will provide truly authoritative answers. The Wikimedia Foundation was also initially ambivalent about having Wikipedia articles added to YouTube results, worrying that it would increase the burden on Wikipedia’s community of volunteers.
Labels: Conspiracy, Politics, YouTube