YouTube Will Link Directly to Wikipedia to Fight What They Call ‘Conspiracy Theories’

Google has expanded far beyond its original claim to fame as a search engine.

Google bought YouTube in November 2006 for US$1.65 billion; YouTube now operates as one of Google’s subsidiaries.

Most of you reading this will know that censorship is going on across all the social media sites, blocking people’s ability to tell the truth, or, for those who don’t believe that, purposely blocking their ability to present a different point of view.

However, when a social media outlet like YouTube steers people toward another source of information such as Wikipedia, and labels an INDEPENDENT news source offering a different narrative than the corporate news as false, that is “Covert” CENSORSHIP. They are trying to lead people into thinking that any information that differs from the corporate story is conspiratorial and untrue.

YouTube CEO Susan Wojcicki announced the new feature, which she called “information cues,” during a talk with WIRED editor-in-chief Nicholas Thompson at the South by Southwest conference in Austin, Texas.

Here’s how it will work:

If you search for and click on a conspiracy theory video about, say, chemtrails, YouTube will now link to a Wikipedia page that debunks the hoax alongside the video. Here’s another example: a video calling into question whether humans have ever landed on the moon might be accompanied by the official Wikipedia page about the 1969 Apollo Moon landing. Wojcicki says that, for now, the feature will only cover conspiracy theories that attract “significant debate” on the platform.

“Our goal is to start with a list of internet conspiracies listed on the internet where there is a lot of active discussion on YouTube,” Wojcicki said at SXSW.
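To make the mechanics concrete, here is a minimal sketch of how an “information cue” might be wired up: a curated list of flagged topics, each mapped to a debunking Wikipedia article, consulted whenever a video’s topic matches. Every name here (TOPIC_CUES, Video, attach_information_cue) is a hypothetical illustration under those assumptions, not YouTube’s actual code.

```python
# Hypothetical sketch of an "information cue": the names and data
# model are assumptions for illustration, not YouTube's implementation.
from dataclasses import dataclass
from typing import Optional

# A curated list of internet conspiracies with active discussion on the
# platform, each mapped to a Wikipedia article that debunks it.
TOPIC_CUES = {
    "chemtrails": "https://en.wikipedia.org/wiki/Chemtrail_conspiracy_theory",
    "moon_landing_hoax": "https://en.wikipedia.org/wiki/Moon_landing_conspiracy_theories",
}

@dataclass
class Video:
    video_id: str
    detected_topic: Optional[str]  # output of some upstream classifier

def attach_information_cue(video: Video) -> Optional[str]:
    """Return a Wikipedia URL to display alongside the video, if any.

    The video stays watchable either way: the cue adds context next to
    the player rather than removing or down-ranking the upload.
    """
    if video.detected_topic in TOPIC_CUES:
        return TOPIC_CUES[video.detected_topic]
    return None

# A chemtrails video gets a cue; an unflagged video gets none.
print(attach_information_cue(Video("abc123", "chemtrails")))
print(attach_information_cue(Video("xyz789", None)))
```

Note how the cue is purely additive: nothing in this sketch blocks playback or reorders search results, which matches how the feature was described at SXSW.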

The decision to include links to other websites represents a dramatic shift for YouTube, which has historically existed as a mostly contained ecosystem. It’s also notable that YouTube chose to link out to text-based sites rather than rearrange its own search algorithm to further favor content from truthful creators and video journalists. One reason for the decision might be that YouTube wants to avoid the perception that it’s rigging its platform to favor certain creators, a criticism it has faced in the past. It also spares YouTube from censoring content outright and serving as the ultimate arbiter of truth.

“People can still watch the videos, but then they have access to additional information,” said Wojcicki.

Merely placing links to factual information alongside videos won’t solve the company’s moderation problems wholesale. For one, as Zeynep Tufekci at The New York Times and others have pointed out, YouTube’s recommendation algorithm is often how users end up seeing conspiracy theories in the first place. Wikipedia in particular can also be edited by anyone, and it has had its own reliability issues with misinformation.

The problem with the recommendation algorithm is that it feeds users ever-more extreme content, sometimes straying from what they searched for in the first place. For example, if you search for a video about the Holocaust, YouTube might recommend that you then watch one about how the tragedy was a hoax. The recommendation system isn’t designed to ensure you’re informed; its main objective is to keep you consuming YouTube videos for as long as possible. What that entails has mostly been an afterthought. Even if every conspiracy video is served up with a Wikipedia article contradicting the information that it presents, there’s no guarantee that users will choose to read it over the video they’ve already clicked on.
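A toy example makes that incentive problem concrete. In the hypothetical ranker below (the field names and scoring are illustrative assumptions, not YouTube’s system), a factual-accuracy signal is available but never enters the objective, so the sensational hoax wins:

```python
# Toy watch-time-driven recommender; hypothetical, for illustration only.
from dataclasses import dataclass

@dataclass
class Candidate:
    title: str
    predicted_watch_seconds: float  # estimated engagement
    accuracy_score: float           # factual reliability, 0.0 to 1.0

def rank_for_watch_time(candidates: list[Candidate]) -> list[Candidate]:
    # The objective is engagement alone: accuracy_score is carried
    # along but never consulted, which is the critique made above.
    return sorted(candidates, key=lambda c: c.predicted_watch_seconds,
                  reverse=True)

candidates = [
    Candidate("Historical documentary on the Holocaust", 300.0, 0.95),
    Candidate("SHOCKING: the whole thing was a hoax", 900.0, 0.05),
]

for video in rank_for_watch_time(candidates):
    print(video.title)
# The hoax video ranks first because it is predicted to keep the user
# watching three times longer, regardless of its accuracy score.
```

Until accuracy (or something like it) appears in that objective, a Wikipedia link beside the player is competing against the ranking itself.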

Take, for example, what happens when you search for conspiracy theorist Alex Jones’ videos about the Parkland shooting. After watching one, YouTube recommends you then watch another of Jones’ videos, this time about how the Sandy Hook shooting was a hoax. It doesn’t suggest that you watch a factual clip about Parkland or Sandy Hook at all. YouTube’s recommendation system serves to radicalize users, and until that’s fixed, the company will likely continue to suffer from scandals related to misinformation.

YouTube also has yet to define and implement clear rules for when uploading conspiracy theory content violates its Community Guidelines. Nothing in the rules explicitly prevents creators from publishing videos featuring conspiracy theories or misleading information, but lately YouTube has been cracking down on accounts that spread hoaxes anyway.


Merely serving up factual information has also not been a cure-all for other platforms that have suffered from scandals associated with misinformation, like YouTube’s parent company Google and Facebook. Both Google News and Facebook’s trending bar have surfaced conspiracy theories during breaking news events in the past, despite having plenty of links to more reputable news sites on their platforms. It’s remarkable, too, that an enormous platform, equipped with a flow of advertising cash, has chosen to address its misinformation problem primarily using the work of a donation-funded volunteer encyclopedia.

Another obvious question here is whether Wikipedia and YouTube will be able to keep up with breaking news events that quickly fall prey to conspiracy theories. For example, the Parkland shooting survivors were accused of being actors within hours of the tragedy. It’s unclear how quickly YouTube will be able to add links to the thousands of misinformation videos that are uploaded every time a major news event occurs.

Still, YouTube should be applauded for doing something to fight conspiracy theories, especially since adding links elsewhere will do nothing to immediately aid its bottom line.


Source Article from http://feedproxy.google.com/~r/ASheepNoMore/~3/NRA33XE9PfM/
