YouTube and Power Without Responsibility
It’s easy to say “You get awful recommendations because of the content you watch on YouTube.” I could say that very easily, but I don’t accept the premise, because I took a Bachelor of Arts in Media Studies and learned to think about deeper topics: media ownership, control, and the theme of Power Without Responsibility. That book is about the press and broadcasting in Britain, but its concepts apply globally.
The Notion of Power
The premise is simple. All newspapers and broadcasters, at the time the book was written, and all social media companies and social networks today, have power. They have the power to control what we see, or don’t see. They have the power to make us visible, or invisible. In doing so, they also shape what we hear and what we think.
Manipulation
Applebaum and Wylie both wrote about how Facebook data was used to profile and then target specific people, to make them think and vote one way rather than another. The results were Trump as President of the US and Brexit in the UK.
You See What You Watched
With this in mind, the idea that what I see in my YouTube timeline is entirely my fault is erroneous and simplistic. YouTube has data from billions of minutes of watched content, and it profiles all of us, all the time. In my case, YouTube has viewing data that spans back to the 9th of August 2005. Over the decades I have watched a lot of content.
During this time I have seen a big change in how the timeline presents videos. We used to get 30 to 60 video recommendations. Now it’s down to six per niche; we can flit between topics, but within each topic we get only six videos to choose from.
There was a time when we had 30 videos to choose from, and now we have six. If you pay attention, the content being recommended, unless it’s by a creator you already watch, has hundreds of thousands, if not millions, of views. A few years ago those numbers would have made a video “viral”, but now they are so high because the algorithms push the ever more popular content higher and higher in the recommendations.
The result is that recommendations are less and less personal, and more and more about the lowest common denominator. We are given suggestions, not a choice.
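To illustrate the feedback loop described above, here is a toy simulation (my own sketch, not YouTube’s actual algorithm): when a recommender ranks purely by view count and shows everyone the same six slots, the early leaders keep gaining views each round, so they become ever harder to displace.

```python
import random

# Toy model of a popularity-biased recommender (an illustration,
# not YouTube's real system). Everyone is shown the same top six
# videos by view count; recommended videos get watched and gain
# views, so the leaders pull further and further ahead.
random.seed(42)
views = {f"video_{i}": random.randint(100, 1000) for i in range(100)}

def recommend(views, k=6):
    """Rank purely by popularity: everyone sees the same top k."""
    return sorted(views, key=views.get, reverse=True)[:k]

for _ in range(50):  # 50 recommendation rounds
    for vid in recommend(views):
        views[vid] += random.randint(50, 200)  # recommended videos get watched

top = recommend(views)
share = sum(views[v] for v in top) / sum(views.values())
print(f"Top 6 videos now hold {share:.0%} of all views")
```

Because only the already-recommended videos ever gain views, the top-six set never changes after the first round: the rich get richer, and personal taste plays no part.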
Video Rentals and Book Shops
Remember, in the days of video rental stores we would browse among hundreds of VHS tapes and DVDs and have a real choice. We see the same with books: Audible and Kindle recommend a dozen titles, and if we want to browse we can’t. We see the top 50 to 100 per category, rather than random books. The result is that we are directed towards certain content and selections, rather than free to browse.
Fatigue
When I was paying for YouTube Premium I got fatigue. I knew that I wanted to watch something, but I had no idea what. When I refreshed the screen, the choices I didn’t want to watch lingered. In and of itself this wouldn’t matter, except that plenty of the choices have clickbait headlines and clickbait thumbnails. The result is that the more you search for something to watch, the more fatigue you feel. When you pay 18 CHF per month you expect a certain quality of service, and YouTube’s algorithms do not deliver it. In the end I gave up.
Low Production Values
What makes YouTube so tiring to use is that a lot of content has low production values. They edit like a bird moves its head. They all use the same 20 sounds and pieces of music. They all boast about their viewership and ask us to like, subscribe and ding the bell. Do you want that at 18 CHF per month? I don’t. I don’t want to have to sort through the chaff to get to the wheat.
Helping the Young and Inexperienced
A healthy, socially responsible social network should look at content and provide recommendations that are relevant but also of good quality. That content should not be clickbait, and the choices should be numerous rather than narrow. People should not be pigeonholed by what they have watched. The algorithm should not be based on the lowest common denominator; it should use something more advanced.
And Finally
The notion that the recommendations I get are crap because of my viewing habits is not an acceptable answer. We should be able to browse and, over time, as trends emerge, get better and better recommendations based on what we watched, enjoyed, and shared. The algorithm should learn from us as individuals, rather than using everyone’s habits to form a model.
Too many times I watch one video and then get that channel’s content recommended for days or weeks afterwards. Instead of recommending a diversity of content, the algorithms recommend the same three or four channels all the time, until you watch them because you have given up ignoring them.
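One way a recommender could avoid this channel lock-in is a simple diversity cap at ranking time. The sketch below is purely illustrative (the candidate data and the `diversify` helper are my own hypothetical names, not any real YouTube mechanism): no single channel may fill more than two of the six slots, however highly its videos score.

```python
from collections import Counter

# Hypothetical candidate pool: (video_id, channel, relevance_score).
# Without a cap, ChannelA's four high scores would dominate the feed.
candidates = [
    ("v1", "ChannelA", 0.95), ("v2", "ChannelA", 0.93),
    ("v3", "ChannelA", 0.91), ("v4", "ChannelA", 0.90),
    ("v5", "ChannelB", 0.88), ("v6", "ChannelC", 0.85),
    ("v7", "ChannelD", 0.82), ("v8", "ChannelE", 0.80),
]

def diversify(candidates, slots=6, cap=2):
    """Pick the highest-scoring videos, but at most `cap` per channel."""
    picked, per_channel = [], Counter()
    for vid, channel, score in sorted(candidates, key=lambda c: -c[2]):
        if per_channel[channel] < cap:
            picked.append(vid)
            per_channel[channel] += 1
        if len(picked) == slots:
            break
    return picked

print(diversify(candidates))  # ['v1', 'v2', 'v5', 'v6', 'v7', 'v8']
```

With the cap in place, ChannelA gets its two best videos into the feed and the remaining four slots go to four other channels, which is closer to the browsable variety I am arguing for.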