Ofcom should be given new powers to regulate Google and Facebook, members of the House of Lords have argued, as tech platforms have failed to tackle harms and abuse in digital media.
In its report, Regulating in a Digital World, the Lords’ communications committee condemned online platforms that host user-generated content for having "unacceptably opaque and slow" moderation processes.
Instead, the committee called for big tech to be reined in by a single regulator with complete oversight of the industry, while UGC-hosting platforms should be held to a statutory duty of care enforced by Ofcom.
While Ofcom has not lobbied for a wider remit, last July its chief executive, Sharon White, called for Facebook and Google, the digital ad industry's dominant players, to be independently regulated. In a speech that September, White described the regulation of media as a "standards lottery" because the same content can be "governed by different regulation in different places".
The digital world does not merely require more regulation, the report explains, but also "a different approach to regulation", because policy-making and legislation are too slow to keep up with the pace of change in the digital media industry.
The chairman of the committee, Lord Gilbert, said: "Self-regulation by online platforms is clearly failing and the current regulatory framework is out of date. The evidence we heard made a compelling and urgent case for a new approach to regulation.
"Without intervention, the largest tech companies are likely to gain ever more control of technologies which extract personal data and make decisions affecting people's lives."
Tech companies warn of 'one-size-fits-all' approach
The committee is now recommending a new Digital Authority, guided by 10 principles. This authority would have the remit to "continually assess regulation in the digital world and make recommendations on where additional powers are necessary to fill gaps."
It would instruct and coordinate regulators, as well as bring together non-statutory organisations with duties in this area.
Facebook and Google declined to comment, instead referring Campaign to the Internet Association, which said the report was "important" and would be looked at closely.
Daniel Dyball, UK executive director of the Internet Association, said: "Our members work hard to keep their services free of some of the most serious issues that the report mentions - from strong terms and conditions; to investment in hiring teams and improving systems for removing inappropriate content; and leadership of global bodies like the Internet Watch Foundation and Global Internet Forum to Counter Terrorism. But we also recognise that more needs to be done to address potential online harms, and internet companies are committed to making their platforms safe.
"We continue to work with the government on its forthcoming White Paper on internet safety which will be an important step forward. Creating a system of regulation for the internet is a complex task, and taking a one-size-fits-all approach could jeopardise the social and economic benefits the internet sector has produced. Last week we set out six policy principles for any future regulation of the internet."
The Lords’ report is the latest in a series of investigations into the digital media industry by UK lawmakers, following last month’s Digital, Culture, Media and Sport select committee Fake News inquiry, which called for an independent regulator.
In an apparent admission that GDPR is not sufficient when it comes to capturing user data for online advertising, the report argues for greater transparency when data is collected, greater choice to allow users to control which data are taken, and more clarity over which data-collecting algorithms are being used.
There are also antitrust concerns, given how the biggest tech companies can "buy start-up companies before they can become competitive".
"Responses based on competition law struggle to keep pace with digital markets and often take place only once irreversible damage is done. We recommend that the consumer welfare test needs to be broadened and a public interest test should be applied to data-driven mergers," the report added.
The 10 principles the committee recommends should guide the development of regulation online are:
- Parity: there should be the same level of protection online as offline
- Accountability: processes must be in place so that individuals and organisations are held to account for their actions and policies
- Transparency: powerful businesses and organisations operating in the digital world must be open to scrutiny
- Openness: the internet must remain open to innovation and competition
- Privacy: measures should be in place to protect the privacy of individuals
- Ethical design: services must act in the interests of users and society
- Recognition of childhood: the most vulnerable users of the internet should be protected
- Respect for human rights and equality: the freedoms of expression and information online should be protected
- Education and awareness-raising: people should be able to navigate the digital world safely
- Democratic accountability, proportionality and an evidence-based approach