Why Safety by Design Must Be the Foundation of Our Digital Future

(Photo: VisbyStar, CC BY-SA 4.0 via Wikimedia Commons)

Adele Zeynep Walton is a British-Turkish journalist reporting on the human impacts of digital technology and social media, and the author of Logging Off: The Human Cost of Our Digital World, set to be published in June 2025. She is also an online safety campaigner with Bereaved Families for Online Safety and Ctrl + Alt + Reclaim, a Europe-wide youth movement working to reclaim the digital world. In addition, she serves as a youth ambassador with People vs Big Tech and is a Connected by Data fellow. She is the co-founder of the Logging Off Club, which brings people together offline through phone-free events to promote connection, curiosity, and well-being.

In this interview, Walton discusses how addictive algorithms expose users to harm, the importance of making safety by design a foundational principle in tech development, and why structural change is essential to reclaiming our digital future.

Yash Sharma: How have your personal experiences shaped your interest and work in the field of digital safety?

Adele Zeynep Walton: As a 25-year-old member of Gen Z, I grew up online, getting my first Facebook account at 10 years old and Instagram at age 12. But until two years ago, I didn't realise how pervasive the harmful impacts of social media on our mental health could be. I had my own fair share of struggles as a result of what I consumed and saw online, like struggling with disordered eating for years after being exposed to content that promoted anorexia, and being trolled by men on Twitter for my political views. But for the most part, I thought this was something I had to tackle myself; my individual responsibility.

(Photo: Adele Zeynep Walton)

But in 2022, after losing my sister Aimee to online harms, I realized just how pervasive the negative impacts of the unregulated digital world can be, and that, in the worst circumstances, they can cause death. My vulnerable sister was drawn into a toxic forum that encourages and assists people in taking their own lives, which is illegal. I'm now an online safety campaigner with Bereaved Families for Online Safety, and as a journalist, I report on online harms and tech policy.

YS: You frame the harms of the digital world not just in terms of personal and physical safety, but also as a crisis of values and community. Could you discuss your approach on this front?

AZW: The harms that are happening as a result of social media and big tech's business model are a by-product of a crisis of our humanity. Tech companies are cashing in on the addiction, division, polarization, and radicalization of users through their products, whilst society loses out at every level. Democracy is in crisis, political apathy and division are at a high, and rates of mental health issues are soaring. Over the past decade, we've been sold a lie of connection and convenience, but we are now seeing that, at both an individual and a global level, the human cost is far too high.

YS: In your opinion, what does transparency, accountability, and redress from Big Tech platforms look like, and how can advocacy groups better organize and apply public pressure on these organizations? 

AZW: Whilst fines and regulations are vital, the sheer economic scale and power that tech companies have today means they are adept at evading accountability and lobbying decision makers to act in their interests. As users and advocates, we must continue to inform the general public of the harms that are already happening, and empower them to feel that we all have a voice and say in shaping our digital future. Translating the individual frustrations and personal grievances that we all have with the digital dictatorship we see today into organised action at a global scale would ensure that decision makers cannot afford to ignore us and our demands.

Technology needs to be built on the principle of safety by design. This isn't a big ask: if you applied these standards to any other product you use daily, be it your toaster or your car, you would expect to be able to use it safely, and if you couldn't, the companies that make it would go out of business. Social media companies should not be an exception to this rule.

YS: You also work closely with groups pushing for legislative change. Which current laws or regulatory proposals do you see as most promising, and where are the biggest gaps that still leave young people vulnerable?

AZW: In the European Union, the Digital Fairness Act could potentially ban addictive social media design. This is an example of tech policy that could nip the harms of social media at the root, rather than fix them with a temporary band-aid solution. As a young woman who grew up using social media, I've experienced the negative impact of addictive algorithms that send us down rabbit holes of harm and hold our attention hostage. Algorithms that are designed to keep us scrolling, whatever the cost, do not distinguish between what's true and what's false, or what's harmful and what's safe. This only exposes users to more risks. When it comes to risks of misinformation or harmful content like dangerous online challenges and content that glorifies self-harm, we need digital design that truly puts user safety at the core. A ban on addictive design can only be a positive change. It would free our time from hours spent mindlessly scrolling, prevent us from getting sucked into unsafe paths online, and help us to reclaim our lives from the relentless grasp of profiteering social media companies.

On the other hand, online safety legislation across the globe, be it social media bans for under-16s or legislation that removes illegal content, does not go far enough. The UK's Online Safety Act, for example, falls short. Despite meeting with bereaved families, including my own, and hearing our stories and the recommendations we gave to them, Ofcom's code fails to go far enough to tackle self-harm and suicide material online.

There is not a single targeted measure requiring platforms to tackle suicide and self-harm material that meets the criminal threshold. These rules shield platforms from real accountability: if platforms find harm that isn't covered in the written code, they don't have to do anything about it. This lack of targeted measures leaves huge loopholes in the Act. Yet again we are seeing that the platforms that profit from young people's vulnerabilities aren't being held to account, and legislation is failing to keep young people safe online.

On another note, social media bans for children, whilst potentially preventing harms in the immediate instance, would only leave young people vulnerable to harm once they turn 16. We also need an end to the surveillance-for-profit business model by banning the addictive algorithms and recommender systems that track and target us. It's time the tech industry was held to the same ethical standards as other industries and that, above all, our safety and well-being were put first.

YS: In your experience, what strategies have been most effective in empowering young people to shape digital policy, and how can advocacy organizations better support them?

AZW: Involving young people in the conversations around tech policy and online safety would give advocacy organizations the ability to learn from our perspectives and hear about the solutions that young people want. One of the most powerful spaces I've been in was a youth boot camp hosted by People vs Big Tech, which gathered young people from across Europe for three days to convene and organize around tech justice. What emerged was an intersectional movement of people across a continent with their own networks, expertise, and skills, who have now contributed to the wider tech justice movement.

YS: What are some practical and immediate actions at the individual level that can help fulfill what you call "a radical reclamation of our digital world"?

AZW: Know that you're not an impostor, and that we all have the right to shape our digital world. So far, the digital future has been gatekept by a handful of billionaires, but we collectively need to reject the impostor syndrome that's been ingrained in us, which makes us believe the myth that they're the only people equipped to speak on technology. In reality, the users are the experts.

Talk to your friends and peers about the frustrations you have with social media. This will make you realise you're far from alone and that, unlike the way social media makes us feel isolated, our experiences are actually all connected.

We all need to redevelop a balanced relationship with technology: not a complete ban, not obsession, addiction, or dependence, and not total withdrawal, but a healthy middle ground. Implementing boundaries in your own digital life, being more intentional with your screen time, and existing more offline than you do online are all steps towards this.
