YouTube will age-restrict more content showing ‘graphic violence’ in video games

YouTube is updating its policy to age-restrict more content containing video game scenes with “graphic violence,” the company announced on Tuesday. The update will go into effect on November 17th and will prevent users under 18 and signed-out users from watching videos of games that focus on showing “realistic human characters” involved in scenes of “mass violence against non-combatants” or torture.

When deciding whether to restrict a video, YouTube says it will take the length of the graphic scene into account, as well as whether it’s zoomed in or the main focus of a scene. It will also evaluate whether the video shows violence targeted at a character “that looks like a real human.” YouTube’s post doesn’t clarify whether this applies to the violence seen in games like Grand Theft Auto, a Call of Duty sequence like the infamous “No Russian” mission, or other kinds of realistically rendered scenes.

However, YouTube spokesperson Boot Bullwinkle said in a statement to The Verge that “certain content may be age-restricted if it’s non-fleeting or zoomed in,” adding that “there may be ways the creator can choose to play the mission to avoid content that would lead to an age restriction.” Creators can also choose to blur or obscure the violence, according to Bullwinkle.

The update builds upon YouTube’s existing policies, which state that the platform may restrict videos featuring “dramatized violence” focused on torture, severe injuries, or violent deaths with blood, with an exception for games. The current policy says, “Generally, we do not remove dramatized violence when the content or metadata lets us know that the content is fictional, or when it’s apparent from the content itself, such as with animated content or video games.”

“YouTube’s policies are designed to adapt to the evolving digital world, and these updates reflect our ongoing commitment to protect younger users and foster a responsible platform,” Bullwinkle said.

In addition to stricter policies surrounding video games, the update will also prevent creators from directing users to online gambling content involving digital goods, like video game skins, cosmetics, or NFTs. YouTube stopped allowing creators to verbally mention or display online gambling services not approved by Google in March. The platform started blocking approved online gambling content for users under 18 at that time, and now it’s planning to age-restrict social casino content as well.

Update, October 28th: Added clarification from YouTube.