YouTube maintains that its systems work as intended. YouTube spokesperson Elena Hernandez said the report does not account for how the company's systems actually work, and noted that viewers have control over their recommendations, including "the ability to block a video or channel from being recommended to them in the future."
Where Mozilla and YouTube differ is in how broadly the "don't recommend" inputs should apply: to similar topics, individuals, and content, or only to the specific video or channel flagged. YouTube says these controls stop the algorithm from recommending that particular video or channel again, but that they are not meant to cut off whole topics, opinions, or speakers. "Our controls do not filter out entire topics or viewpoints, as this could have negative effects for viewers, like creating echo chambers," says Hernandez.
Jesse McCrosky, a data scientist working with Mozilla on the study, says that isn't entirely clear from YouTube's public statements and published research about its recommender systems. The small glimpses available into the black box suggest that YouTube considers two types of feedback. On the positive side, there is engagement: how long people watch videos and how often they view them. On the negative side, there is explicit feedback, including dislikes. McCrosky says the question is how YouTube balances these two signals. "What we've seen in this study is that the weight toward engagement is quite exhaustive, and other sorts of feedback are quite minimally respected."
The distinction between what YouTube says its algorithms do and what Mozilla's research found matters, says Robyn Caplan, senior researcher at Data & Society, a New York nonprofit that has previously investigated YouTube's algorithm. She says that while some of the findings do not contradict YouTube's claims, they show that users lack a clear understanding of which features exist to control their experience and which exist to provide feedback to content creators. Caplan welcomes the study: even if Mozilla's intended slam-dunk revelation is less dramatic than the researchers may have hoped, it still highlights an important problem, namely that YouTube users are confused about how much control they have over their recommendations. The research, she says, shows the need to regularly survey users about site features. "If these feedback mechanisms aren't working as intended, it may drive folks off."
Confusion over the intended functionality of user inputs is a key theme of the second part of Mozilla's study: a follow-up qualitative survey of around one-tenth of those who had installed the RegretsReporter extension and participated in the research. The people Mozilla interviewed understood that their inputs targeted specific channels and videos, but they expected those signals to inform YouTube's recommendation algorithm more broadly.
Ricks says users saw these inputs as a way to gain more control over the recommendations they would get in the future. Based on its research, Mozilla recommends that YouTube give users more control over their experience by letting them define their content preferences, and that it better explain how its recommendation systems work.
McCrosky sees a disconnect between what users believe their signals tell YouTube and the content the platform actually provides in response.