The launch of Highguard this week has reignited a debate around good manners. Over a hundred people worked for several years on a project they were passionate about. Instead of everyone giving it a fair shake and casually talking about what they did or didn't like about it, the conversation around the new multiplayer shooter was quickly subsumed by a torrent of knee-jerk negativity. Social media is full of people dunking on the game for clout.
This grief feels similar to what they would experience if their family member died, but in some cases, it feels even worse. Family estrangement has reached epidemic proportions. A 2022 survey found 29 percent of Americans are currently cut off from a parent, child, sibling, or grandparent, and a 2025 survey found 38 percent have experienced estrangement from a close family member at some point. These aren't just statistics. They're the tragic reality of families ripped apart.
Before I've poured my first morning coffee I've already watched the lives of strangers unfold on Instagram, checked the headlines, responded to texts, swiped through some matches on a dating app, and refreshed my emails, twice. I check Apple Maps for my quickest route to work. I've usually left it too late to get the bus, so I rent a Lime bike using the app.
As Rolling Stone reports, the theory of Nazi identification was tied to such flimsy evidence as a lightning-bolt necklace that maybe-kinda-sorta looks like an SS symbol, or fixating on an out-of-context use of the word "savage" in the song "Eldest Daughter." The campaign began in places like 4chan but quickly went mainstream, relying on a stan army as well as argumentative normies to give the theory an algorithmic boost.
A word has been adopted for this practice: doomscrolling. The term refers to getting stuck in a social media rabbit hole, consuming more and more information. It's the act of continuing to scroll and read new content, even when that content is upsetting or worrying. Doomscrolling was one of the Oxford English Dictionary's words for 2020, as the practice seemed to take hold during the pandemic, when so many people were stuck inside and using social media more than usual.
The most prominent AI systems today are built on Large Language Models (LLMs), such as ChatGPT, Claude, Grok, Perplexity, and Gemini. These systems rely on computational models loosely inspired by the human brain's structure, hence the term "neural networks." They consist of interconnected nodes that process data and learn from it, enabling the pattern recognition and decision-making that define the branch of artificial intelligence known as machine learning. LLMs are trained on massive datasets containing billions of words from books, websites, and other text sources.
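The "interconnected nodes" described above can be sketched in a few lines of code. The example below is a minimal, illustrative toy network, not a real LLM: each node computes a weighted sum of its inputs and squashes the result through a nonlinearity (here, a sigmoid). All the weights are invented for illustration; in a real model they number in the billions and are learned from training data rather than written by hand.

```python
import math

def neuron(inputs, weights, bias):
    """One artificial node: weighted sum of inputs, then a sigmoid squash."""
    total = sum(x * w for x, w in zip(inputs, weights)) + bias
    return 1 / (1 + math.exp(-total))  # sigmoid maps any number into (0, 1)

def tiny_network(inputs):
    """A toy two-layer network: 3 inputs -> 2 hidden nodes -> 1 output.
    Weights and biases are arbitrary placeholders, not learned values."""
    hidden = [
        neuron(inputs, [0.5, -0.2, 0.1], 0.0),
        neuron(inputs, [-0.3, 0.8, 0.4], 0.1),
    ]
    return neuron(hidden, [1.2, -0.7], -0.2)

# Feed in three numbers; out comes a single score between 0 and 1.
score = tiny_network([1.0, 0.0, 0.5])
print(round(score, 3))
```

Training, in essence, means nudging those weights millions of times until the network's outputs match the patterns in the data; scale that idea up enormously and point it at internet text, and you have the rough shape of an LLM.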
The teenage brain is built to help young people explore the questions, "Who am I?" and "Where do I belong?" Answering these questions isn't a solitary endeavor. It's a profoundly social one. As young people try out different versions of themselves, they watch how others respond, gathering information about what feels authentic and what doesn't. Today, many of those experiments and reflections unfold online, where algorithms and influencers play an outsized role in shaping the feedback loop.
We live in a world where it's easier than ever to surround ourselves with people who think exactly like we do. Social media bubbles, corporate cultures and even leadership teams can all become echo chambers, places where the loudest reinforcement drowns out the most valuable challenge. The problem? Echo chambers create blind spots. They emphasize what we want to hear, not what we need to hear. They boost our confidence but rarely bring clarity.
A single-second pause over a social video can dismantle your algorithm. Even at my best, I can still hesitate over a poorly shot video demonstrating the "latest tech" that I simply "can't live without" and later learn I really could have lived without it. However, by then my feed is already filled with duplicate videos trying to push the same product.
The goal is to make buttons intuitive, easy to use, and predictable. But is the disclosure about what it means to participate in social media and express approval full and revealing? I guess it all comes down to how you would define a "positive experience". As I write this, two messed up, intertwined things are happening. Both can be directly linked to how the engagement dynamics of social media, driven by technology such as "like" buttons, have negatively impacted global politics.
Too many people use the word 'but' in relation to what happened overnight. "It's extraordinarily easy to condemn violent acts against somebody with whom you share their views. It is much more important that we are consistent in terms of calling it out when it's against somebody whose work, whose views differ to us."
My social media algorithms knew I was pregnant before family, friends or my GP. Within 24 hours, they were transforming my feeds. On Instagram and TikTok, I would scroll through videos of women recording themselves as they took pregnancy tests, just as I had done. I liked, saved, and shared the content, feeding the machine, showing it that this is how it could hold my attention, compelling it to send me more.
If you've never heard the term, ragebait marketing is simple: a brand does something polarizing or controversial - sometimes accidentally but often intentionally - with the goal of going viral by wreaking havoc in the comments and inspiring think pieces and millions of dollars in free publicity. And the truth is, it works - at least on the surface, if you measure the success of a campaign in views.