The events of this past year have raised important questions about media in the 21st century.
For many of us in the media business, it’s been a much-needed wake-up call, reminding us that our profession has enormous power to do good and harm in equal measure. No one can doubt any longer that the media business bears a moral responsibility.
Nowhere is this more apparent than on social media, which plays such a dominant role in culture as well as marketing budgets. This year, social media has been used as a powerful weapon in the culture wars and as a divisive political vehicle.
Last summer, our response was to introduce 10 Media Responsibility Principles to guide us and our clients in approaching hate speech, misinformation, disinformation and data privacy for children and adults. They are concrete policies we can enforce and use to report on violations. We’ve used them to assess the platforms and identify best practices that set the bar across all of them. And we’re seeing many clients adapt them to match their CSR practices.
But while the platforms all take their responsibility seriously, their responses differ dramatically. That makes it difficult to compare progress and creates a patchwork we need to monitor almost daily — none of which helps anyone.
It raises the question: Is it even possible to establish common social media responsibility standards? If so, how exactly would we do it?
One option is collective action. For example, we are working with industry bodies, including the 4A’s and the Global Alliance for Responsible Media, to adapt our principles for use across the industry.
We can also rely on the platforms to collaborate on a set of standards by which they hold one another accountable. That’s possibly the best solution, albeit the most unlikely, considering the personalities involved. (To its credit, TikTok recently proposed something similar.)
A third route is to create a licensing process, similar to the way TV stations — all fiercely independent and competitive entities — must adhere to the same standards around media responsibility.
Broadcast TV standards are set by the FCC, and stations must periodically renew their licenses (currently on eight-year terms) by proving they have served the public interest; committed no serious violations of the Communications Act or the FCC’s rules; and committed no other violations that would constitute a pattern of abuse.
However imperfect, this renewal requirement forces a regular review of media responsibility against shared standards. It requires media owners to balance growth with social implications, and ensures they meet responsibilities around data protections, privacy, content moderation, anti-bullying and the protection of children.
Licensing social media platforms won’t be politically easy, but it has advantages over the other path being considered: revising or rescinding Section 230 of the Communications Decency Act, which would require bipartisan alignment in Congress. A license could keep existing protections for social media platforms in place while holding platforms to shared basic standards.
It would also be straightforward to implement: the FCC and the industry would design clear standards, and platforms would be required to meet them annually. It could follow established protocols and wouldn’t require new processes or infrastructure. And it’s easy to understand. Thresholds for the licensing requirement could be set by user numbers, revenue and scale.
Finally, fees from the licensing program could support a Digital Literacy Fund, because teaching people to distinguish authentic content from manipulation on social media is clearly crucial.
I often hear from social media platforms that they are bombarded by requests to adhere to different media responsibility paradigms from agencies and clients. They want consistency, and they have a valid point.
A license to operate is not just a simple, tried-and-true solution. It would also convey our commitment to holding powerful engines in our society responsible for our collective safety.
Daryl Lee is global CEO of IPG Mediabrands.