YouTube doesn’t count when we dislike a video

It always happens: by chance you watch a video on some topic on YouTube, and the next day your recommendations at the top are full of that same topic. So you try to 'manipulate' the platform's algorithm, to untrain it by systematically marking those videos with a Dislike or 'Not interested', so that YouTube stops recommending them to you.

But, apparently, you are largely wasting your time, because YouTube mostly ignores you.


According to new research by Mozilla – yes, the makers of the Firefox browser – YouTube's internal controls, such as the "Dislike" button, "are largely ineffective as a tool to control suggested content." According to the report, these buttons "prevent less than half of unwanted algorithmic recommendations."

The Mozilla researchers used data collected through RegretsReporter, their browser extension that lets people "donate" their recommendation data for use in studies like this one. In total, the report drew on data from 22,722 participants, 2,758 survey respondents and more than 567 million videos analyzed. Mozilla tested the effectiveness of four different controls:

  • "Dislike" button
  • "Not interested" button
  • "Don't recommend channel" button
  • "Remove from watch history" button

The researchers found that the controls had varying degrees of effectiveness, but that their overall impact was "small and inadequate." Of the four, the most effective was "Don't recommend channel", which prevented 43% of unwanted recommendations, while "Not interested" was the least effective, preventing only about 11% of unwanted suggestions. The "Dislike" button performed about the same, at 12%, and "Remove from watch history" prevented about 29%.

Untraining YouTube, a complicated task

In their report, the Mozilla researchers noted that study participants said they sometimes went out of their way to avoid unwanted recommendations, such as watching videos while logged out or connected to a VPN. The researchers say the study highlights the need for YouTube to better explain its controls to users and offer them more proactive ways to define what they want to watch.

"YouTube and many other platforms operate on passive data collection to infer user preferences. But it is a somewhat paternalistic way of operating, in which decisions are made on people's behalf. You could ask people what they want to do on the platform, instead of just watching what they do," says Becca Ricks, senior researcher at Mozilla and co-author of the report.
