Darth Vader was drawn to the dark side. YouTube has its dark side as well, only it isn’t fantasy and the results don’t end when the screen goes blank.

When a user clicks a video on YouTube, algorithms match parts of that content against other videos, picking from among the billions that have been uploaded, and present a list of suggestions for what to watch next. The assumption is that users stay longer on sites that feel familiar, that serve up more of what is already “liked.”
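YouTube’s actual ranking system is proprietary and vastly more elaborate, but a toy sketch can illustrate the like-attracts-like mechanic at issue. Everything below, the tag vectors, the sample catalog, the cosine-similarity scoring, is an illustrative assumption, not YouTube’s code:

```python
# Illustrative sketch only: YouTube's real ranking system is proprietary.
# This toy recommender scores "relatedness" by overlap of topic tags,
# which captures the funnel effect the editorial describes: whatever was
# just watched pulls similar videos to the top of "up next".

from collections import Counter
from math import sqrt

def cosine_similarity(tags_a, tags_b):
    """Cosine similarity between two bags of topic tags."""
    a, b = Counter(tags_a), Counter(tags_b)
    shared = set(a) & set(b)
    dot = sum(a[t] * b[t] for t in shared)
    norm = sqrt(sum(v * v for v in a.values())) * sqrt(sum(v * v for v in b.values()))
    return dot / norm if norm else 0.0

def up_next(just_watched, catalog, k=3):
    """Rank the rest of the catalog by similarity to the video just watched."""
    scored = [(cosine_similarity(just_watched["tags"], v["tags"]), v["title"])
              for v in catalog if v["title"] != just_watched["title"]]
    return [title for score, title in sorted(scored, reverse=True)[:k]]

catalog = [
    {"title": "Moon landing footage", "tags": ["space", "history", "nasa"]},
    {"title": "Moon landing 'hoax' exposed", "tags": ["space", "conspiracy", "nasa"]},
    {"title": "Flat earth proof", "tags": ["conspiracy", "space"]},
    {"title": "Baking sourdough", "tags": ["cooking", "bread"]},
]

# One click on a conspiracy video, and similar content dominates the list:
# ['Flat earth proof', 'Moon landing footage', 'Baking sourdough']
print(up_next(catalog[1], catalog))
```

Ranking purely by similarity is what makes the funnel: nothing in the scoring asks whether a video is true, only whether it resembles the last one watched.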

YouTube first acknowledged that a completely open platform without some editing might be creating problems when the company banned videos depicting obviously dangerous actions, like driving blindfolded. It was only a start.

Algorithms must be altered to preclude a less obvious danger. Displaying “related” videos creates a funnel that can and does lead into very dark places on the web. It also creates filter bubbles that make such places seem like the only reality.

Some of these bubbles form around obscure and false information that users may never have thought to seek out. The conspiracy theories and lies typified by Alex Jones and Infowars are just the tip of a violence-promoting iceberg.

Related-video rankings aren’t created by people who want to make a point. That would be a free speech issue. Mathematical formulas processing ones and zeros, ranking without judging, sorting without editing, are not speech. They are software.

YouTube can fix the algorithms. “Recommended for you” could be labeled “Chosen using your age, gender and past views.” Violence-promoting videos should be banned. Radical ideological or conspiracy theory-laced videos could generate an “up next” list from the opposite end of the spectrum.

Algorithms are great for some tasks but only if they work in the light.
