Google said it was working on changes to stop fake news websites from making money via its AdSense advertising network.
Facebook announced it would not serve ads in apps or sites "containing content that is illegal, misleading or deceptive, which includes fake news."
Neither company has outlined precisely how the vetting process will work, but Facebook said it would check publishers for compliance.
The announcements come as Google chief executive Sundar Pichai admitted to the BBC that fake news spread on the search engine and on social networks could have swung the election in Donald Trump's favour.
"You know, I think fake news as a whole could be an issue," he said. "From our perspective, there should just be no situation where fake news gets distributed, so we are all for doing better here.
"So, I don't think we should debate it as much as work hard to make sure we drive news to its more trusted sources, have more fact checking and make our algorithms work better, absolutely."
In one notable example, Google's top result for "final election results" at one point showed a fake site with inaccurate numbers, according to reports.
Facebook chief executive Mark Zuckerberg has separately denied that fake news spread on his social network contributed to the election's outcome, saying at the time that "99%" of posts were genuine. He nonetheless soon announced policies to help tackle the issue, and Facebook's own employees reportedly formed a task force to try to stop the spread of fake news.