WASHINGTON, DC - APRIL 10: Facebook co-founder, Chairman and CEO Mark Zuckerberg testifies before a combined Senate Judiciary and Commerce committee hearing in the Hart Senate Office Building on Capitol Hill April 10, 2018 in Washington, DC. Zuckerberg, 33, was called to testify after it was reported that 87 million Facebook users had their personal information harvested by Cambridge Analytica, a British political consulting firm linked to the Trump campaign. (Photo by Zach Gibson/Getty Images)

Big Tech can't be trusted. It's time for regulation

Updated 3:35 PM ET, Tue April 30, 2019


Mike Chapple is associate teaching professor of information technology, analytics, and operations at the University of Notre Dame's Mendoza College of Business. The opinions expressed in this commentary are his own.


Big Tech is under the spotlight, and for good reason.

In the past couple of years, an onslaught of incidents has shaken public confidence in major technology firms. Facebook apologized for allowing Cambridge Analytica to harvest the personal information of more than 80 million users. Google shut down its social network in the wake of reports that it failed to disclose a serious security vulnerability that could have revealed the private information of as many as 500,000 users. The company said it found "no evidence" that any data was actually misused. Then, just a couple of months later, Marriott announced a data breach that affected 500 million individuals.
It's become clear that Big Tech can't be trusted to govern itself. It's time for regulators to step in.
Michael Beckerman, President and CEO of the Internet Association, a technology industry lobbying group, spoke at the Milken Institute Global Conference this week and reminded attendees that technology is a force for good that improves our lives. That's true. There's no doubt that artificial intelligence is transforming medicine, online learning is democratizing education, and technology businesses are creating economic opportunities. But that doesn't earn the industry a free pass for bad behavior or inadequate privacy practices.
Consumers have justifiably lost confidence in Big Tech. A YouGov survey released last week showed that over 80% of Americans don't trust Facebook, Google, or Dropbox.
It seems like each time a major privacy scandal hits the news, Congress holds another series of hearings with grim-faced politicians cluelessly questioning smug technology executives. But nothing changes. It's been nearly a decade since Congress passed the last major privacy law in the United States: a 2010 update of health care privacy requirements.
The privacy laws that we do have are woefully inadequate, targeting very specific uses of very specific categories of information. The Health Insurance Portability and Accountability Act prevents your doctor from sharing your medical records without your consent, but it doesn't stop Apple from sharing health information from your Apple Watch. Apple has a good track record when it comes to privacy, but makers of iPhone apps have been accused of sharing health information with Facebook and others. The Gramm-Leach-Bliley Act prevents your bank from sharing your financial records, but it doesn't apply to Facebook or Amazon -- both of which have expressed interest in entering the financial space. The United States lacks the kind of comprehensive privacy law that would grant individuals control over their own information, wherever it resides.
Big Tech worries that regulation will be costly and interfere with growth. But for consumers, regulation is the only means of asserting control over their own data.
A balanced regulatory regime isn't an impossible fantasy. The European Union made great strides toward this goal with the implementation of the General Data Protection Regulation, or GDPR. That law applies to broad categories of personal information across all industries and offers individuals some basic protections. It requires that companies obtain consent before collecting personal information, disclose how they will use the information they do collect, and provide a mechanism for consumers to request the deletion of their personal information from corporate files. GDPR also requires that companies promptly disclose data breaches to regulators and affected individuals.
GDPR just went into effect last year, so it's still too early to gauge its full effect. But early indications are that the law has had a positive impact globally, not just within the European Union. Seeking to comply with GDPR, Apple rolled out a privacy tool that allows users to download all of the personal information the company maintains about them. This tool was initially available only to EU residents, but is now available in the US as well.
Proponents of industry self-regulation downplay the power that resides in the hands of Big Tech, saying that consumers are free to take their business elsewhere. When Gizmodo's Kashmir Hill tried to take this advice and quit Google earlier this year, she discovered that it's easier said than done. The tech giant controls everything from the login systems used by many online services to the very fonts that appear on millions of websites. It's simply not practical to live a modern life outside the reaches of Big Tech.