india employmentnews

Instagram DMs Are No Longer Safe Starting Today; Your Secrets Could Be Leaked as End-to-End Encryption Is Disabled


Meta has finally taken the step it announced some time ago. Effective today, May 8, 2026, support for end-to-end encryption (E2EE) for direct messages is being discontinued on Instagram. The photo and video-sharing service has offered encrypted DMs since 2023, but starting today the feature will no longer be available. If you use Instagram DMs, your chats will no longer be as private as they were before.

Meta had informed users about this change via a blog post in March. The company stated, "If your chats are affected by this change, you will be provided with instructions on how to download any messages or media that you wish to keep." The idea is to let users download and back up any chats they want to preserve before the change takes effect. Without end-to-end encryption, messages become readable on Meta's servers, which means the company itself, and anyone who gains access to those servers, whether attackers or authorities with legal orders, could view them. Your conversations will no longer be private between you and the recipient, and there is a risk that someone could misuse them.
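To see why this matters, here is a deliberately simplified toy sketch of the idea behind end-to-end encryption: only the two endpoints hold the key, so the relaying server stores and forwards ciphertext it cannot read. This is purely illustrative (a one-time-pad XOR, not a real messaging cipher); production apps such as WhatsApp use the Signal protocol, which is far more involved.

```python
import secrets

def xor_bytes(data: bytes, key: bytes) -> bytes:
    # One-time-pad style XOR: the same operation both encrypts and decrypts.
    # This is a teaching toy, not a secure cipher for real use.
    return bytes(d ^ k for d, k in zip(data, key))

message = b"meet at 6pm"
key = secrets.token_bytes(len(message))  # shared only by the two endpoints

ciphertext = xor_bytes(message, key)     # what the server relays and stores
assert ciphertext != message             # the server sees only gibberish

decrypted = xor_bytes(ciphertext, key)   # the recipient recovers the plaintext
assert decrypted == message
```

With E2EE, the platform in the middle only ever holds `ciphertext`; removing E2EE means the plaintext itself passes through, and can be scanned on, the company's servers.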

For the time being, this change affects only Instagram users. Meta has not clarified whether it plans to implement a similar measure for its other services, such as Facebook Messenger or WhatsApp.

Why is Meta introducing this change?
The primary reason cited for this change is child safety. Earlier this year, in March, a jury in New Mexico imposed a $375 million fine on Meta. The penalty stemmed from allegations that the company misled customers about the platform's safety features and facilitated harms such as child sexual exploitation. By removing end-to-end encryption, the company will be able to scan messages when necessary to detect content related to child sexual abuse material (CSAM), grooming, or other forms of abuse. Governments worldwide, including those of the United States and the United Kingdom, along with policymakers in the European Union, are pressuring companies to identify and remove harmful content from private messaging apps. Measures such as the UK's Online Safety Act 2023 and the EU's proposed 'Chat Control' regulation could empower authorities to compel platforms to detect CSAM even within private conversations. While this step may help curb harmful content, it is likely to draw criticism from privacy experts.


Disclaimer: This content has been sourced and edited from TV9. While we have made modifications for clarity and presentation, the original content belongs to its respective authors and website. We do not claim ownership of the content.