UK lawmakers have accused Facebook of violating data privacy and competition laws in a report on social media disinformation that also says CEO Mark Zuckerberg showed “contempt” toward Parliament by not appearing before it.
The UK Digital, Culture, Media and Sport Committee said in a report published Monday that a trove of internal Facebook emails it reviewed demonstrated that the social media platform had “intentionally and knowingly” violated both data privacy and competition laws.
The cache of documents reviewed by the committee, some of which include correspondence between Zuckerberg and company executives, stems from a lawsuit filed in California against Facebook (FB). The committee obtained the documents late last year from Six4Three, the small app company behind the suit.
According to the committee, the documents show that Facebook was “willing to override its users’ privacy settings in order to transfer data” to app developers. The lawmakers also claim the documents show the social network was able to “starve” some developers of data and force them out of business.
“Companies like Facebook should not be allowed to behave like ‘digital gangsters’ in the online world, considering themselves to be ahead of and beyond the law,” the report said.
In response to the report, Facebook said it had not breached data protection or competition laws. Karim Palant, Facebook’s UK public policy manager, said in a statement that the company “supports effective privacy legislation” and is also open to “meaningful regulation.”
Facebook said in December that the documents from the Six4Three lawsuit had been “selectively leaked” to tell “only one side of the story.” CNN and other news outlets had asked the California court to make the documents public.
The allegations are the latest headache for the social media giant, which has come under intense scrutiny from policymakers in the United States and around the world following a series of data scandals including Cambridge Analytica.
While Facebook was a major focus of the report, the Digital, Culture, Media and Sport Committee also made several broader recommendations on how to combat fake news and disinformation.
The committee said:
- Social media platforms should be subject to a compulsory code of ethics.
- An independent UK regulator should monitor tech companies, and be able to launch legal proceedings against them.
- UK antitrust regulators should conduct a “comprehensive audit” of the advertising market on social media.
- UK regulators should investigate whether Facebook has been involved in anti-competitive practices.
- The government should examine recent elections for evidence of voter manipulation.
The committee’s investigation lasted 18 months and featured nearly two dozen oral evidence sessions, including a special hearing in Washington, D.C., and an “international grand committee” attended by representatives of nine countries. The final report ran to more than 100 pages.
“The big tech companies must not be allowed to expand exponentially, without constraint or proper regulatory oversight,” the report stated. “Only governments and the law are powerful enough to contain them.”
The report harshly criticized Facebook and Zuckerberg, who repeatedly refused to appear before the committee last year despite numerous requests.
“The management structure of Facebook is opaque to those outside the business and this seemed to be designed to conceal knowledge of and responsibility for specific decisions,” the report said. “Facebook used the strategy of sending witnesses who they said were the most appropriate representatives, yet had not been properly briefed on crucial issues, and could not or chose not to answer many of our questions.”
The report’s authors said they had “no doubt that this strategy was deliberate.”
Damian Collins, the committee’s chair, said in a statement that Zuckerberg “continually fails to show the levels of leadership and personal responsibility that should be expected from someone who sits at the top of one of the world’s biggest companies.”
British authorities ruled last year that Facebook broke UK law by failing to safeguard user data, and by not telling tens of millions of people how Cambridge Analytica harvested their information for use in political campaigns.
Palant, the Facebook public policy manager, said that the company shares the committee’s “concerns about false news and election integrity” and that it had made “a significant contribution to their investigation” by answering more than 700 questions.
Palant also highlighted the “substantial changes” to political advertising standards the company has undertaken.
“No other channel for political advertising is as transparent and offers the tools that we do,” said Palant. “We have tripled the size of the team working to detect and protect users from bad content to 30,000 people and invested heavily in machine learning, artificial intelligence and computer vision technology to help prevent this type of abuse.”