Ronan Harris told Campaign that it was "wholly unacceptable" that the video site has been hosting inappropriate content and comments about young children, as new revelations emerged just as the company was hosting Brandcast, its annual "upfront" presentation to UK advertisers.
Harris insisted that identifying the problem of inappropriate comments can be more "nuanced" than dealing with extremist, jihadi content.
There might be "an innocent video" about a girl but then some users might post unpleasant or even illegal comments below the video clip on the site, he said.
Harris, the managing director of Google UK & Ireland, conceded there is "nervousness" among some advertisers and that the onus is on the global parent company to ensure YouTube is "safe" for users, advertisers and creators.
He said: "One of the areas where I have been talking to my engineering counterparts and the leaders over in California is: ‘Are we doing everything within our power, given the advances in technology, to identify the emergence of new categories of questionable content on the platform?’ And that’s a very live issue at the moment."
Harris said YouTube needs to come up with a global, not just a UK solution: "The nature of the platform is it’s not enough to do something in just one market, you’ve got to sort it out on the platform globally."
He promised YouTube will be taking further steps in the coming weeks, in addition to tighter measures announced today that will see the video site remove inappropriate comments and users and refer "illegal" behaviour to legal authorities.
Harris admitted YouTube needs to work harder at these problems and is hiring more independent experts and doubling the number of "trusted flaggers".
"I don’t think we have identified or solved all of the issues around this," he said. "There’s clearly content that should not be on the platform. There’s also content that is very nuanced.
"We don’t always have the answers to identify the content," he added, explaining how there is "content that doesn’t have any clear bad intent, that isn’t contravening the guidelines of YouTube, but where there is clearly very bad behaviour in the comments section".
Harris maintained YouTube’s machine learning tools have been "making progress" in identifying jihadi content, since that problem first arose in February when The Times exposed how ads were appearing next to extremist videos.
Of the extremist content removed from YouTube over the past month, 83% was identified by the company's technology before any human flagged it, he said.
"The problems that we’re talking about on YouTube are existing around the fringes," he said. "The vast majority of creators, of viewers and of brands are using YouTube and the power of technology and the power of community to do incredible stuff."
Harris acknowledged in his speech to the Brandcast crowd that YouTube has had a tough year, describing it as one of "highs and lows", and admitted that the site needed to tackle child safety with the same "urgency" as it tried to deal with jihadi content.
The 900-strong crowd listened in silence as he talked of his "deep sense of personal responsibility to get it right" and said he was speaking "as a parent and a leader".
Senior leaders from the ad industry and agencies in the audience said afterwards that Harris was right to confront the issue head-on.
Industry sources estimate as many as a quarter of large advertisers are not spending on YouTube because of continued worries about brand safety.
"I dispute the 25% [figure]," Harris said, while acknowledging brands have legitimate concerns.
"If they've got nervousness, they absolutely should take the action to pause [their ad spend] and make sure that they understand the facts and that they are only advertising in situations that are brand-appropriate," he said.
"It’s wholly unacceptable to us that any brand appears against undesirable content or that undesirable content is exposed to a user whether that’s in the form of content or a comment. We want to do a better job."