NEW YORK — The YouTube video shows two women, dressed in suits and ties. They smile; they sniffle back tears; they gaze into each other’s eyes. They are reading their wedding vows to one another.
The four-minute video titled ‘‘Her Vows’’ contains no nudity, violence, or swearing. There’s no revealing clothing. No one is engaging in activities that have a ‘‘high risk of injury or death.’’ And yet, YouTube has deemed the video unsuitable for people under 18.
Several YouTube users, many of them in the lesbian, gay, bisexual, and transgender community, have been complaining that their videos are categorized as ‘‘restricted’’ for no obvious reason. Besides the vows, targeted videos include coming out stories and one from YouTube celebrity Tyler Oakley titled ‘‘8 Black LGBTQ+ Trailblazers Who Inspire Me.’’
After several days of complaints, Google hinted Monday that it might have made a mistake and said it was investigating.
The ‘‘restricted’’ designation lets parents, schools, and libraries filter out content that isn’t appropriate for users under 18. Turning on the restriction makes such videos inaccessible. YouTube calls it ‘‘an optional feature used by a very small subset of users.’’
It’s unclear whether the types of videos in question are now being categorized as ‘‘restricted’’ for the first time, or whether this is a long-standing policy that is only now getting attention.
The complaints spawned the hashtag #YouTubeIsOverParty over the weekend. One person even made a video to voice her complaints.
YouTube said in a tweet Sunday that LGBTQ videos aren’t automatically filtered out, though some discussing ‘‘more sensitive issues’’ might be restricted. But the company, which is owned by Google, did not specify what it counts as ‘‘more sensitive issues.’’
In an e-mailed statement on Monday, YouTube said ‘‘some videos that cover subjects like health, politics and sexuality may not appear for users and institutions that choose to use this feature.’’
In the case of LGBTQ topics, which are by definition intertwined with health, politics, and sexuality, filtering out what is and isn’t appropriate can be difficult.
YouTube followed that statement with another hours later: ‘‘We recognize that some videos are incorrectly labeled by our automated system and we realize it’s very important to get this right. We’re working hard to make some improvements.’’ The statement offered no further explanation.
YouTube content creators can decide to age-restrict their videos themselves. But that’s just one of the ways sensitive content is filtered out.
YouTube says it also uses ‘‘community flagging,’’ which means users who have a problem with content in a video can flag it to YouTube for possible restrictions or removal.
But flagged videos are not automatically removed. Once a video is flagged, YouTube says, it is reviewed.
‘‘If no violations are found by our review team, no amount of flagging will change that and the video will remain on our site,’’ YouTube says on its online support page.