Instagram introduces DM warnings and new transparency features for teens: What to know


Meta has introduced new safety measures for teenage users on Instagram, expanding its efforts to create a more secure experience on the platform. The update, announced via Meta’s newsroom, enhances direct messaging (DM) safety protocols and introduces protections for accounts managed by adults on behalf of children.

Among the key changes, Instagram will now display safety prompts when teens initiate conversations, even with users they follow. These prompts advise teens to review the recipient’s profile, reminding them they are under no obligation to engage if the exchange feels uncomfortable. Users will also see guidance on what personal information should be kept private while interacting online.

Another feature adds transparency: the month and year an account joined Instagram will now be displayed at the top of a new chat thread. Meta says this addition aims to give teenagers better context about who they are speaking to and could help them identify fraudulent accounts.

Changes have also been made to the platform’s block function. When teenagers attempt to block someone, they will now be offered the option to block and report the user in a single step. Meta claims this adjustment simplifies the process and may encourage more users to report inappropriate behaviour.

In addition to teen users, Meta has extended certain safety tools to accounts run by adults on behalf of children, particularly those under 13. These accounts, typically featuring a child’s image and managed by parents or child talent representatives, will now be subject to Instagram’s strictest messaging settings. Offensive comments will be filtered using Instagram’s Hidden Words feature, which will now be enabled by default.

Adult-managed accounts that fall into this category will receive notifications informing them of the updated safety settings.

Meta stated that accounts directly operated by children under 13 remain against Instagram’s policies and are removed when detected.
