For the past several months, TikTok has been working on new ways to age-restrict certain types of content as part of a broader push to ramp up safety features for younger users. The app unveiled a new ratings system earlier this year, called Content Levels, to help it identify more “mature” content.

Now, the company has another update on those efforts. In a blog post, TikTok says it's launching a new version of its "borderline suggestive model," which it uses to automatically identify "sexually explicit, suggestive, or borderline content." According to a TikTok spokesperson, the new model is better at detecting so-called "borderline content": videos that don't explicitly break the app's rules but may not be suitable for younger users.

TikTok isn't the only platform to filter this type of content out of recommendations. Instagram has long attempted to weed borderline content out of its recommendations as well. But content with more "mature" themes that doesn't contain explicit nudity has historically been difficult for automated systems to detect consistently. TikTok didn't offer specifics on how much more accurate the new system is, but it shared that in the last 30 days the company has "prevented teen accounts from viewing over 1 million overtly sexually suggestive videos."

Elsewhere, the app is also rolling out the ability for creators to restrict their videos to adult viewers. This feature was previously available only for live videos, but it will now be enabled for short-form clips as well.