Instagram: Big Update: Teenagers Will No Longer Be Able to View 18+ Content, Giving Parents Control

Meta's social media platform, Instagram, has made a major change to further strengthen the safety of minor users. The company will now show users under the age of 18 only content rated PG-13 by default, shielding them from posts depicting topics such as violence, nudity, or drug abuse.
Stricter Content Filters for Teens
Instagram states that users under the age of 18 will not be able to change this setting on their own unless a parent or guardian gives permission. The company has also introduced a new content filter, "Limited Content," that prevents teens from viewing or commenting on posts containing sensitive or adult topics.
Stricter AI Chats
Instagram will also extend these content controls to AI chats in the coming year, meaning teens' interactions with AI chatbots will be limited. This change comes at a time when companies like OpenAI and Character.AI are facing allegations of influencing minors through inappropriate chats. Recently, OpenAI added new safety policies to ChatGPT for users under 18, prohibiting "flirting" conversations, and Character.AI added new parental controls to its platform.
Blocking Inappropriate Content and Accounts
Instagram will now block accounts that display or share content deemed inappropriate for teens. If a minor user follows such an account, they will no longer be able to see its posts or interact with it. The company is also removing these accounts from search and recommendations, making them harder to find.
Strictness on Sensitive Topics
Meta already bans content related to eating disorders, self-harm, alcohol, and violence. Now, the company will also ensure that teens cannot find such content even if they search for these terms using misspellings.
Supervision Tools May Soon Be Available for Parents
Instagram is testing a new feature that allows parents to directly flag content they deem inappropriate for minors through its supervision tools. Flagged posts will be sent to the company's review team, which will decide whether to remove them from the platform.
Where will this feature be implemented?
According to the company, these changes are currently rolling out in the US, UK, Australia, and Canada, with a global rollout expected by early next year.
Why is this update necessary?
The increasing addiction to social media among teenagers and the accessibility of content related to sensitive topics have sparked numerous controversies in recent years. With this move, Instagram is not only strengthening its Teen Safety Policy but also reassuring parents that their children are enjoying a safe digital experience on the platform.
Disclaimer: This content has been sourced and edited from Amar Ujala. While we have made modifications for clarity and presentation, the original content belongs to its respective authors and website. We do not claim ownership of the content.