It was just a couple of years ago that I was writing about how social media algorithms can shape what we think by giving us more of whatever we interact with.
For example, Facebook and Google know exactly how much time you spent looking at friends' vacation photos and wishing you could take a trip that glamorous, or wishing you could have their looks, or that body. They know how often you read stories about child abuse victims and statistics on child abuse. They know exactly how many times you've been appalled by the death of a child in the news, or liked a rant about how social workers missed child abuse.
They know this about you, and they want you to keep paying attention so that you'll stay on their sites, whether that's Facebook or any other website you regularly visit. So they show you MORE of it. They keep showing it to you so that you'll stay jealous, obsessed, or upset, and keep looking at the content, until you start to believe that the entire world is nothing but other people with better lives than yours, or that everyone has mental health issues and no children are safe, and on and on.
I can’t imagine that being surrounded by this stuff day in and day out is good for our mental health.
Today, I saw a story about how much damage this can really do:
IT'S TROUBLING ENOUGH that British teenager Molly Russell sought out images of suicide and self-harm online before she took her own life in 2017. But it was later discovered that these images were also being delivered to her, recommended by her favorite social media platforms. Her Instagram feed was full of them. Even in the months after her death, Pinterest continued to send her automated emails, its algorithms automatically recommending graphic images of self-harm…
The problem here, of course, goes back to that idea of the algorithm not being tuned for an individual's best interest, but being tuned to show us more of what we show an interest in, regardless of what that interest is. The more extreme our viewing habits get, the more extreme the algorithm skews. The only thing it cares about is giving us stuff that will keep us on the site.
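To make that feedback loop concrete, here's a toy sketch in Python. This is my own illustration, not anything a real platform actually runs; the topic names and scoring are made up. The point is that a ranker which only counts engagement will hand the top of the feed to whatever you click on, no matter what it is:

```python
def recommend(engagement, k=3):
    """Return the k topics with the highest engagement scores."""
    return sorted(engagement, key=engagement.get, reverse=True)[:k]

def simulate(clicked_topic, rounds=10):
    """Simulate a user who engages with one topic, round after round."""
    # Every topic starts out equal in the user's feed.
    engagement = {"travel": 1.0, "news": 1.0, "self-harm": 1.0, "pets": 1.0}
    for _ in range(rounds):
        feed = recommend(engagement)
        # The only signal the ranker sees is the click -- not whether
        # the content is good for the person clicking.
        if clicked_topic in feed:
            engagement[clicked_topic] += 1.0
    return engagement

scores = simulate("self-harm")
print(recommend(scores))  # the clicked topic now dominates the feed
```

After a handful of rounds the clicked topic sits permanently at the top, and the feedback loop feeds itself: more exposure means more clicks means more exposure.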
Now, the article also goes on to talk about how there’s no easy answer here for the social media companies. I agree with that. For example, Facebook could simply ban all discussion of child abuse, but they would be doing a great deal of harm to the survivor communities that have built up in their groups and pages, and they’d make it much more difficult to talk about or do research about it. We don’t want that.
What we need, though, is a way out of the algorithm. For now, that may be something each of us needs to be aware of, and educate ourselves and others about. Be willing to walk away from the next "suggested" post or video. Limit the groups and people you interact with to the ones that are helpful to you. Use tech tools to go back to a chronological timeline and take back control of your news feed: use TweetDeck to view your Twitter timeline, or friend lists to see a Facebook feed of just those people, and so on. Turn off the email "suggestions" and other notifications designed to outrage you into coming back and heading down the rabbit hole again.
Take control of your social media use instead of letting the algorithm control you and send you into rabbit holes of despair and poor mental health. Be willing to look beyond the next "recommendation" at the larger picture of what you pay attention to, and where you choose to spend your time. The algorithm is watching you, and it only wants to keep feeding whatever it is that makes you look at the next thing, and the next thing. Be mindful of that.
And, maybe, just maybe, spend more time contributing in positive ways. Write positive posts, share positive stories, and introduce people to ideas, accounts, and people who can be of benefit to them. Social media isn't just there to consume. It also lets each of us contribute, to reach out and affect people in a positive way.
Give it a try. You never know who needs to see that in their feed today.