Mike Zywina, Director at Lime Green, explores the murky world of social media censorship, algorithmic trapdoors and vanity metrics - and what it means for charities and social enterprises.

Last month, I wrote by far my worst-ever performing LinkedIn post. This month, I'm writing a blog out of it.

Cast your mind back to early January. Easy enough, as it wasn't much different to now – just a little darker, a little wetter, and with Donald Trump defecating marginally more on the global news cycle. For those of us hoping for a more positive 2026, the year didn't start well. The US captured the Venezuelan President on 3rd January. Trump refused to rule out invading Greenland, touted military action in Iran, and directed threats at various other countries. NATO Chief Mark Rutte ominously said "we're not at war, but we're not at peace either."

There was frenzied speculation about what was really driving the world's most powerful and belligerent Wotsit: was it geopolitical strategy? Thirst for oil? Was he just unhinged? Or had he simply taken a look at a world map and noticed that Greenland seemed really big?

Around the same time, I listened to a podcast about the upcoming US midterm elections, how crucial they are to the MAGA agenda, and the very real risk that they won't be conducted freely and fairly. Joining some dots in my head, I wrote this on LinkedIn:

It was a departure from my usual topics. It was speculative (arguably unhelpfully so) and pessimistic. I didn't expect it to be everyone's cup of tea. But I didn't expect it to be LinkedIn Kryptonite either.

I can almost guarantee you didn't see this post. After six weeks, it's had around 50 impressions (almost all of them me), compared to my usual figure of a few thousand, and 0 reactions. It's been "seen" by 8 people, except it hasn't really. The post didn't appear in anyone's feed. It doesn't show up in my post history.
When I sent the direct post link to people, they received a message saying "This post cannot be displayed", despite being counted towards the paltry number of people who had seen it. It's like it never existed.

I couldn't tell you exactly which words triggered this – it's not like this was my first post about politics or Trump – though I suspect the phrase "dying megalomaniac dictator mode" did me no favours.

My post doesn't matter - but the bigger picture of censorship and harmful content does

Nobody is going to cry about the loss of one LinkedIn post. There's enough noise on there anyway. Those of us who post regularly tend to get hung up on how our posts perform, and which topics get engagement. I'm fully aware this will always risk sounding vain and self-absorbed. Who really cares?

Except we should care, because it's part of a much larger and more worrying picture. We know that all the big social media platforms manipulate the content we see, shaping what gets heard and what doesn't. We also know who owns these platforms, and their political orientation. In a rare moment of accidental tech bro transparency, they even lined up together at Trump's inauguration: Jeff Bezos (Amazon), Mark Zuckerberg (Meta/Facebook), Tim Cook (Apple), Sundar Pichai (Google), Elon Musk (X).

You've probably seen people using emojis and deliberate misspellings when talking about Palestine, genocide or ICE, to avoid triggering the mysterious algorithmic trapdoors. A recent report by 7amleh - The Arab Center for the Advancement of Social Media gathered testimonies from anonymous LinkedIn employees alleging that the company's internal political landscape was dogged by anti-Palestinian racism, and that moderation practices were being misused to suppress criticism of Israel.
Meanwhile, worrying research by Best For Britain showed how multiple new TikTok accounts – set up with different ages, locations, dates and IP addresses – were shown far-right, racist content within minutes, despite doing nothing to engage with it.

We live in a world where content related to women's health is routinely censored (a white paper by CensHERship found that 95% of women's health creators reported censorship on social media), while Grok persistently allowed users to digitally undress women for entertainment. Just this week, a teenage girl wrote a shocking account for The Guardian detailing her daily experience of objectification and misogyny on social media.

The overall picture is grim. Selective censorship and harmful content aren't just permitted – they're designed in. Worse, we're now trusting the same type of people – often literally the same individuals – to be the custodians of AI. It's their algorithms and safety mechanisms that are shaping the content we produce, the images we generate, and the information and advice we receive.

Why should this matter to charities?

In a sector all about connecting with people, social media is crucial in enabling us to increase participation in our services, raise money and build movements. And it's so easy to be seduced by the analytics. As we invest time in creating content, we watch our numbers of followers, impressions and reactions creep up. This creates a reassuring sense of progress and permanence. We talk about "our" channels, "our" audience, "our" income streams.

But is it really our channel if we have to routinely self-censor in order to be heard? Is it really our audience if we could be cut off overnight – either because we get banned for something we say, or because a different tech bro takes over and decides to change the rules, making the game unplayable? It might feel like we're building something on solid foundations, but it's actually a bed of sand.
And that's not to mention the obvious ethical implications of participating on – and generating income for – platforms that are contributing towards the very problems we exist to solve.

But when the creeping involvement of tech in our lives is so insidious and inevitable, what can we actually do?

It's easy to feel powerless – and it's likely that disengaging from social media entirely would harm your mission, income and service users. I'm not suggesting doing that, but I do have some alternative suggestions:

Recognise the risks: building your work on the shaky foundations of social media comes with risks – operational, reputational and financial. But my sense is that this is a common blind spot. How many organisations have risk assessments that detail myriad unlikely scenarios, but have never even considered the potential impact if TikTok were switched off overnight, or LinkedIn were bought by an intolerable tech bro? Having an honest conversation about the risks and your vulnerabilities – and the extent to which you really have control over what you rely on – will be helpful, even if there aren't any easy solutions.

More platforms, less exposure: while you can't prevent a platform becoming unviable for your organisation, you can minimise the damage this would do. Being active and building a following on a range of social platforms will reduce your vulnerability to external events. I say this in the full knowledge that, at Lime Green, we've gone pretty much all-in on LinkedIn since coming off Twitter/X in 2024. It's easy to become over-reliant on a platform that's working well for you. That doesn't make it a good idea.

Beware the meaningless vanity metrics: sites like LinkedIn and Facebook provide you with endless data, but how much is it really telling you? Does gaining hundreds of followers, or thousands of post impressions, mean you're actually achieving your objectives? Do those people actually engage with you? Are they even active, human users?
I heard a conversation between journalists recently where they talked about feeling reluctant to leave Twitter/X because they'd accumulated thousands of followers, even though the algorithm meant their content never got more than a handful of likes. They eventually left, and lost absolutely nothing. Platform data can give you a false sense of comfort, but it's not always a reliable guide to where you should focus your efforts or remain active.

Look beyond social media: because it's important, but it's not everything. It might bring us lots of fleeting connections, but it can lack the depth of other channels, as well as being at the mercy of billionaires.
At Lime Green, many of the people we interact with most frequently are email subscribers. There are people who open and click on every email but have never even engaged with us on LinkedIn. Email provides a safety net that social media could never offer – there's no single event that could disconnect us from our subscribers, and we're not reliant on a platform to be able to talk to them – and it often makes for richer communication. For your organisation, your golden channels could be email, or post, or phone calls, or SMS.

My recent experience on LinkedIn – combined with so much of what I read – is making me re-evaluate how best to use social media. Well, it's either that or just pack it all in, go off-grid and relocate to a cave in the woods. Some days it's tempting, but I'm not quite there yet.
2 Comments
Lou
26/2/2026 09:52:27 am
It never even occurred to me that LinkedIn would be censoring, although of course they would; not sure why I thought them any different.
Wigs
4/3/2026 01:52:15 pm
Always such good food for thought, Mike! Thanks for articulating. It's an uncertain world right now, and censorship and social media manipulation are never more pertinent than they are at this moment...