WASHINGTON (Sinclair Broadcast Group) — A British parliamentary report released Monday branded Facebook and companies like it “digital gangsters” and called for new regulation of social media platforms, accusing the company of violating privacy and competition laws.
After an 18-month investigation, the Digital, Culture, Media and Sport Committee’s report on disinformation and “fake news” calls for the creation of a code of ethics and an independent regulator to oversee technology companies’ compliance with it, reforms to current electoral communications laws, and a requirement that social media companies take down harmful content.
“The big tech companies are failing in the duty of care they owe to their users to act against harmful content, and to respect their data privacy rights,” Committee Chairman Damian Collins said in a statement. “Companies like Facebook exercise massive market power which enables them to make money by bullying the smaller technology companies and developers who rely on this platform to reach their customers.”
Although the investigation was far-reaching, including interviews with 73 witnesses at 23 oral evidence sessions, the report often zeroes in on Facebook and founder Mark Zuckerberg, who repeatedly refused to testify before the committee.
“By choosing not to appear before the Committee and by choosing not to respond personally to any of our invitations, Mark Zuckerberg has shown contempt towards both the UK Parliament and the ‘International Grand Committee’, involving members from nine legislatures from around the world,” the report states, alleging that the company is unwilling to be scrutinized or regulated.
Facebook has disputed that characterization, insisting it is open to “meaningful regulation” and stiffer data privacy laws.
“We have already made substantial changes so that every political ad on Facebook has to be authorized, state who is paying for it and then is stored in a searchable archive for 7 years. No other channel for political advertising is as transparent and offers the tools that we do,” Karim Palant, Facebook’s U.K. public policy manager told NBC News in a statement.
Palant acknowledged the company has more work to do, but he stressed the progress it has made in the last year.
“We have tripled the size of the team working to detect and protect users from bad content to 30,000 people and invested heavily in machine learning, artificial intelligence and computer vision technology to help prevent this type of abuse,” he said.
The scathing report revived the debate over how to effectively regulate social media platforms without undermining their purpose. The issue has already been the subject of several congressional hearings in the United States, but experts say there is growing pressure to do something, for the sake of the public and the companies.
“What the past few years showed is that their profit motives can conflict with the public's best interest, on the one hand, and that they can be arbitrary and ad hoc when trying to deal with it, on the other,” said Drew Margolin, a professor of communication at Cornell University.
Facebook and Twitter have sparked controversy with enforcement decisions that seem inconsistent and capricious. They often find themselves issuing apologies for either censoring something benign or leaving something harmful up for too long.
“Unless they are going to be totally free spaces, which is probably too risky to be realistic, or allowed to censor in an ad hoc way, which rightly worries communities who do not have representatives in the executive suites of these companies, there have to be independent rules that they adhere to,” Margolin said.
Britain is just one country struggling to rein in the dangers of social media; Germany has already imposed a legal requirement that tech companies remove hate speech. Elizabeth Cohen, an associate professor of communication studies at West Virginia University, is hopeful that any reasonable changes platforms like Facebook are forced to make to comply with regulations overseas will lead to similar restrictions in the U.S.
“If Facebook is required to police themselves in one place, they may be more willing to police themselves voluntarily in other places,” she said.
If nothing else, Cohen said steps taken by Germany and Britain can serve as a model for future U.S. regulations. Recent hearings in which members of Congress displayed minimal comprehension of Facebook’s features and its business model have not filled her with confidence they can do it on their own.
“One of the fears I’ve had is our lawmakers don’t really know what to do with Facebook because they don’t understand what Facebook does,” she said.
As Russian interference in the 2016 U.S. presidential election and suspected foreign efforts to influence the Brexit and Scottish independence votes in the U.K. demonstrate, the inability of platforms and governments to police social media content can have enormous consequences.
“Democracy is at risk from the malicious and relentless targeting of citizens with disinformation and personalized ‘dark adverts’ from unidentifiable sources, delivered through the major social media platforms we use every day,” Collins said. “Much of this is directed from agencies working in foreign countries, including Russia.”
Beyond content concerns, the U.K. committee’s report also takes aim at Facebook for its data-sharing practices that allowed companies like Cambridge Analytica to harvest data from unknowing users for political purposes. Citing internal company documents revealed in a U.S. lawsuit, the committee alleges Facebook consistently prioritizes profits over data security.
“Companies like Facebook should not be allowed to behave like ‘digital gangsters’ in the online world, considering themselves to be ahead of and beyond the law,” the report states.
The data-harvesting is not necessarily a concern, according to Cohen. Users often want to see ads and information tailored to their interests. She is more troubled by a lack of transparency regarding what data are collected, how they are used, and whether users can easily withhold consent.
“Part of the problem is that Facebook is developing procedures to harvest user data that users aren’t necessarily familiar with or even aware of,” she said.
Although the report laments that tech companies seem to consider themselves ahead of the law, it also recognizes current U.K. electoral laws have failed to keep up with the shift from traditional print advertising to online micro-targeted campaigning.
“There needs to be: absolute transparency of online political campaigning, including clear, persistent banners on all paid-for political adverts and videos, indicating the source and the advertiser; a category introduced for digital spending on campaigns; and explicit rules surrounding designated campaigners’ role and responsibilities,” it recommends.
As more governments move closer to imposing limits on social media platforms, experts warn that overreaching could have unintended ramifications for users, companies, and the internet itself.
“My hope is that regulation would recognize that the issue is not social media per se, but the dominance of certain platforms that makes them de facto broadcasters,” Margolin said, though he quickly added, “I’m not sure how to navigate this, though.”