Editor’s Note: Dries Buytaert is founder of Drupal and CTO of Acquia. The opinions expressed in this commentary are his own.
I still remember the feeling in the year 2000 when a group of five friends and I shared a modem connection at the University of Antwerp. I used it to create an online message board so we could chat back and forth about mostly mundane things. The modem was slow by today’s standards, but the newness of it all was an adrenaline rush. Little did I know that message board would change my life.
In time, I turned this internal message board into a public news and discussion site, where I shared my own experiences using experimental web technologies. Soon, I started hearing from people all over the world who wanted to suggest ways to improve my website, and who also wanted to use my site's technology to build their own websites and experiment with emerging web technologies.
Before long, I was connected to a network of strangers who would help me build Drupal, one of the first web content management systems that now powers more than a million websites. For me, the early web was a place to build, connect and experience the kindness of strangers. I’m afraid that experience is all but gone for my kids.
I want it back.
Today, the reality is much different for my sons, who will grow up on a web where rampant privacy violations seem to happen on a weekly basis. Nearly 90 million Facebook users had their personal data leaked during the Cambridge Analytica scandal, and the FTC is now said to be investigating YouTube for allegedly violating the Children’s Online Privacy Protection Act.
And privacy isn’t the only problem. Unlike the original web, where everyone built their own little corner of the internet universe, millions of people are concentrated within powerful social media platforms that provide a curated filter for the web, spreading bullying, hate speech and worse around the globe in seconds. Large social media platforms like Twitter and Facebook have struggled to contain this type of content, even with the most sophisticated machine-learning algorithms and extensive content curation teams.
The difference between the old web and the web today is that most of the power is now controlled by a small group of companies. These companies have exchanged free services for users' data and hidden their practices behind difficult-to-understand terms of service and privacy policies. And, while connecting massive numbers of users around the world has been positive in many ways, it has also become frighteningly simple to spread hate or misinformation. Only now are we beginning to see the ramifications of power and virality.
Our web is no longer our own.
Yet, despite abuse after abuse, consumers continue to give their attention — and data — willingly to giants like Facebook, Google, Twitter and others. According to Pew Research, the share of US adults using social media has remained largely unchanged since 2018. How much abuse are we willing to tolerate before we demand change? And what kind of consequences will be enough to make these abuses stop?
The mega-corporations responsible for privacy violations need to be held accountable for their actions. I don’t necessarily believe that a breakup of big technology companies is the answer, but I do believe in increased government regulation of algorithms, consumer freedom to delete or transfer data between services, and penalties for violations that adequately match the crimes.
First, the web needs a better consent system for data privacy. If you’ve paid attention to the latest Google and Facebook developer conferences, they’re doing their best to assure us that a new era of privacy has arrived. While Facebook’s ideas of encrypted private messaging and disappearing social posts are good ones, they are not good enough. Good enough is when users can control what data is stored about them, consent to the sharing of any data and easily delete or update that data. Facebook, Twitter or Google won’t implement ‘good enough’ without a government mandate. Why would they? Their businesses are built on the tracking and collection of data.
This consent mechanism will have to come as a combination of both regulation and technical advancement. We have to build a more transparent, user-friendly way for people to understand the data they’re sharing with companies on the web — both mega-platforms and individual companies alike. Just like federally mandated nutrition labels on food or warning labels on cigarettes and alcohol, people need clearer information about what data they’re sharing with whom.
Governments have to mandate that first. Once the mandate is in place, the technology will follow. From a technology standpoint, consent could look like a browser setting that lets users select exactly which data they’re willing to share with specific companies. Site owners must earn a person’s trust in order to gain access to more data over time, and users should be intimately aware of exactly what they’re giving up in exchange for convenience, personalization and other features.
In terms of hate speech, companies need to get better at policing the content on their own platforms. Today, this is done by machine-learning algorithms working alongside human content moderators, who spend their days scouring social posts and videos looking for graphic language and imagery, from racist comments to acts of murder. These individuals have helped block thousands of videos and posts from public exposure, but at great cost to their mental and emotional wellbeing. In time, the machine-learning systems these companies have in place will "learn" from human content moderators and become more sophisticated at identifying and removing objectionable content.
There’s a wave of entrepreneurs working on a new foundation for the web, built on blockchain-based technology. The entrepreneurs building on these foundations remind me of the people I met in the early days of Drupal. Yet, they have a new motivation to create an internet that’s safer and more equitable for its users.
Rather than a single company or entity controlling data, blockchain databases allow the user to control their own data in a tamper-proof system. While I can see how blockchain-based platforms will offer better data privacy, it’s not clear how they can deal with content quality at scale.
The hope, instead, is that a secondary group of more intimate, secure and private social networks will emerge, networks that depend more on participation than on ad revenue. Rather than a larger-scale "town square" approach, smaller, decentralized social networks will decrease the spread of bullying and hate speech, and lead to the rise of common interest groups where people feel safer to share their thoughts.
Unfortunately, mass adoption of a new, decentralized system may prove challenging. Social media platforms like Twitter and Facebook are popular not because of their great content, but because of how easy it is for content creators to get attention. Unless distributed applications make it more rewarding to share and spread information, they might not see the same adoption as Twitter and Facebook.
Ultimately, I want my kids to have a similar experience with the web that I had in college. I want the web to be a platform for publishing their thoughts, without the fear of being exploited; for connecting with others, without the fear of being bullied; and for sharing ideas and code, while building lifelong friendships with people around the world. It’s staggering to think that we built the web as we know it in only a few decades.
I’m convinced we can build it better for future generations with new, decentralized systems that will provide people with more options that echo the promise of the original web. In all likelihood, these decentralized and centralized applications will coexist as consumers make a choice about how much they value privacy and quality over attention and reach.
While we can’t bring back the original web, at least my kids will have better options to choose from.