ProBeat: YouTube is no better than Facebook or Twitter


What a whirlwind week of YouTube news. On Monday, the spotlight shone on YouTube’s pedophilia problem. On Tuesday, YouTube decided that homophobic slurs don’t violate its policies. Then on Wednesday, YouTube banned videos promoting groups that claim superiority to justify discrimination, along with videos that deny well-documented catastrophes. Also on Wednesday, YouTube decided that it shouldn’t monetize homophobic slurs after all. And on Thursday, we were reminded that questionable “engagement” algorithms drive millions of YouTube views. YouTube may not be a social network in the strictest sense, but when it comes to enforcing its own rules and developing recommendation algorithms, the Google-owned company is no better than Facebook and Twitter.

There are too many examples from those two to list, so I’ll give just one for each social network. Last month, Facebook refused to remove a heavily edited video that attempted to make U.S. House Speaker Nancy Pelosi look incoherent. Last year, Twitter took weeks longer than Apple, YouTube, and Facebook to conclude that Alex Jones broke its rules. Both platforms consistently fail to enforce their own rules; these are just two high-profile cases that come to mind.

YouTube is no different. The racism, homophobia, and extremist “bans” are just the latest crackdowns that happened to line up this week. Go back a few more days and you’ll find YouTube concluding that videos claiming vaccines poison babies, or that children with autism have parasites that can be “cleansed,” don’t violate its policies on medical misinformation.

Déjà vu

There are two whack-a-mole cycles happening on Facebook, Twitter, and YouTube. First, these companies fail to write clear rules. Disgusting content on their platforms is brought to light. It isn’t removed. Users protest. The companies point to their rules. More outrage follows. Finally, if there is enough blowback, apologies are issued, the rules are “clarified,” and the example content is taken down.

The second cycle happens at the same time. A given rule is already clear and specific. Disgusting content is brought to light. It isn’t removed. Users protest. The companies fail to explain why it wasn’t removed immediately, or they make up excuses. More outrage follows. Finally, if there is enough blowback, apologies are issued, the rules are “clarified,” and the example content is taken down.

In these cycles, only blatant and high-profile cases are removed. And that process can take anywhere from weeks to months after the original content was published. By then, the content has done its damage and generated its revenue.

Making policies is one thing; enforcing them is another. Even if YouTube set up rules to ban anti-vaxxers, for example, I’m not convinced those videos would disappear. Sure, with enough pressure a popular channel here or there might be demonetized or even deleted, but those types of videos would remain.

Like Facebook and Twitter, YouTube is updating its rules, announcing changes, and going through the motions. But overall, it’s not keeping up. Like Facebook and Twitter, YouTube isn’t doing enough.

ProBeat is a column in which Emil rants about whatever crosses him that week.




