YouTube’s algorithm still amplifies violent videos, hateful content and misinformation despite the company’s efforts to limit the reach of such videos, according to a study published this week. ...
"If you randomly follow the algorithm, you probably would consume less radical content using YouTube as you typically do!" So says Manoel Ribeiro, co-author of a new paper on YouTube's recommendation ...
YouTube’s proprietary AI algorithm is at the heart of the company’s success, and its secrecy is key to its continued dominance of Internet video. However, a recent report from Mozilla found YouTube’s ...
YouTube tends to recommend videos that are similar to what people have already watched. New research has found that those recommendations can lead users down a rabbit hole of extremist political ...
Researchers found that clicking on YouTube’s filters didn’t stop it from recommending disturbing videos of war footage, scary movies, or Tucker Carlson’s face. ...
For years YouTube’s video-recommending algorithm has stood accused of fuelling a grab bag of societal ills by feeding users an AI-amplified diet of hate speech, political extremism and/or conspiracy ...
We’ve all seen it happen: Watch one video on YouTube and your recommendations shift, as if Google’s algorithms think the video’s subject is your life’s passion. Suddenly, all the recommended ...
YouTube’s algorithm recommends “regrettable” videos that violate the platform’s own content policies, according to a new investigation by the nonprofit Mozilla Foundation. The crowd-sourced study ...