Asking for Permission on Instagram

If you want to see how unhealthy social media is, just look at this story about DMs on Instagram. Now, if you want to DM someone you don’t follow, you can send just one text message.

Imagine you’re a user of Instagram. You’re following friends, family and colleagues. Now consider that every fourth post is by someone you don’t know anything about, and that you see these influencers several times a day, every single time you refresh your feed.

Complete strangers are invading your timeline, polluting your streams and, in general, reminding you of your social isolation, reminding you to feel the Fear Of Missing Out (FOMO). You’re then told that these strangers can only be texted once. This is absurd, because Instagram is no longer a social media site. It is an advertising platform with user-generated content spread thinly.

Threads at a Fifth of Its Peak

I read that Threads is now at one fifth of its 100-million-user peak, at around 20 million users. This makes sense. Why would people want a timeline filled with strangers rather than friends? Why would people join a website/app that is part of Facebook? I know that it’s called Meta, to whitewash itself, but I call it Facebook, to show that the whitewashing effort failed.

Toxic Culture and the Need For Change

That Instagram feels the need to limit DMs tells us two things. The first is that they have made Instagram toxic. It’s because of this toxicity that everyone needs to be protected from one-eyed trouser snake pics and other forms of spam via DM. If Instagram were still a network of friends of friends, it would still be self-policing. It isn’t, so new rules need to be put in place.

The second thing it tells us is that, rather than tweak the algorithm to make suggestions and conversations healthier, they are simply adding barriers instead of tackling the core issues.

Social Media and Iconoclasm

This morning, before getting up, I read about how some people rented a villa and damaged a statue while taking photos of themselves with it. The article also mentioned at least three people adding graffiti to the Colosseum. The issue with social media is that instead of having the morality of healthy communities, it has the morality of advertisers and marketers. The result is the vandalism and iconoclasm that is becoming more and more common. Social media algorithms amplify emotions, and emotions, especially on social media, are toxic.

The Social Media Algorithm Distortion

Social Media Algorithms Distort Social Instincts and Fuel Misinformation

Key facts:

  1. Social media algorithms are designed to promote user engagement, thereby amplifying inherent human biases for learning from prestigious or in-group members.
  2. This amplification often promotes misinformation and polarization as it doesn’t discern the accuracy of the information.
  3. Researchers suggest that both users and tech companies need to take steps to mitigate these effects, including user education and algorithmic changes.

Social media algorithms are toxic. Rather than tackle the cause of toxic behaviour, companies like Facebook prefer to pretend that the problem is the user, rather than the algorithm that drives humans to behave in a toxic or trollish manner. Instead of encouraging humanism, algorithms amplify emotion, because emotion encourages people to stick around.

In contrast, algorithms usually select information that boosts user engagement in order to increase advertising revenue. This means algorithms amplify the very information humans are biased to learn from, and they can oversaturate social media feeds with what the researchers call Prestigious, Ingroup, Moral, and Emotional (PRIME) information, regardless of the content’s accuracy or representativeness of a group’s opinions. “It’s not that the algorithm is designed to disrupt cooperation,” says Brady. “It’s just that its goals are different. And in practice, when you put those functions together, you end up with some of these potentially negative effects.”

In addition, the researchers propose that social media companies could take steps to change their algorithms so they are more effective at fostering community. Instead of solely favoring PRIME information, algorithms could set a limit on how much PRIME information they amplify and prioritize presenting users with a diverse set of content.
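To make that proposal concrete, here is a minimal sketch in Python of what capping PRIME content in a ranked feed could look like. The names (`Post`, `is_prime`, `rank_feed`, the 30% cap) are my own illustrative assumptions, not anything from the research or from any real platform’s ranking code.

```python
from dataclasses import dataclass


@dataclass
class Post:
    id: int
    engagement_score: float  # what engagement-driven ranking optimises for
    is_prime: bool           # prestigious / in-group / moral-emotional content


def rank_feed(posts: list[Post], feed_size: int = 20, prime_cap: float = 0.3) -> list[Post]:
    """Rank posts by engagement, but cap the share of PRIME posts in the feed.

    A toy version of the researchers' suggestion: instead of letting
    engagement ranking saturate the feed with PRIME content, limit how much
    of it is amplified and fill the remaining slots with other posts.
    """
    ranked = sorted(posts, key=lambda p: p.engagement_score, reverse=True)
    max_prime = int(feed_size * prime_cap)

    feed: list[Post] = []
    prime_count = 0
    for post in ranked:
        if len(feed) == feed_size:
            break
        if post.is_prime:
            if prime_count >= max_prime:
                continue  # PRIME quota already used; skip this post
            prime_count += 1
        feed.append(post)
    return feed
```

A real ranking system is obviously far more complex, but the cap-and-fill idea is the same: engagement still orders the feed, it just isn’t allowed to flood the feed with one kind of content.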

Twitter, Facebook, Instagram, YouTube, Threads

All of these social media sites are driven by algorithms that amplify negative emotions rather than foster community. That’s why I think hashtags are bad, and that Twitter threads are bad. That’s why I think commenting, re-sharing and other forms of behaviour are better, especially in a chronological timeline, as we have with blogs and most of the fediverse. I won’t use Pixelfed because it uses hashtags rather than categories or healthier community-building tools.

We worry about AI, but algorithms control more of what we see and feel than AI does.

And Finally

After decades of using social networks I have almost never felt the need to send DMs, especially to strangers. I use them sparingly, either to coordinate IRL meetings or to share information that I do not want everyone to have access to. Instagram is restricting DMs not because they care about their users, but because they are deflecting from the problems posed by their algorithms, which encourage polarisation and trolling.