YouTube has a Content Problem

What happens when a company allows everyone to use its website to post videos of whatever they want? Some people will post videos filled with misinformation. Others will post videos that appear intended to stoke hate or to provoke viewers into harming people who are different from them.

Bloomberg published a detailed article focusing on YouTube’s content problem, drawing on accounts from people who used to work at YouTube or Google.

The article describes several of YouTube’s missteps in content moderation: videos aimed at children that contained explicit content, videos full of misinformation (about vaccines, for example), and politically motivated videos that appear designed to provoke outrage (such as ones calling survivors of mass shootings “crisis actors”).

Overall, the Bloomberg article makes it clear that YouTube has a long way to go toward cleaning up the site. Many proposals from YouTube workers to do so were rejected, the implication being that YouTube valued growth over the quality of its content.

Motherboard published an article reporting that YouTube has not removed videos that contain “neo-Nazi and white nationalist propaganda”. It notes that other social media giants banned or shut down that type of content after the attacks in Christchurch, New Zealand.

According to Motherboard, YouTube has demonetized some of that content and placed those videos behind a content warning. But the videos are still searchable on YouTube.

YouTube has a content problem. Some of the most disturbing videos on the platform can cause real harm: misinformation about vaccines contributes to measles outbreaks, and videos that promote hate can push viewers toward physical violence against other people. YouTube needs to put more effort into removing that type of content.