Meta-owned Instagram has rolled out a new feature that blocks screenshots and recordings of ‘view once’ pictures and videos sent via private messages.

The new feature will also prevent people from opening ‘view once’ or ‘allow replay’ images and videos on Instagram web, closing a loophole that could otherwise be used to get around the screenshot protection.

The move is part of the social media conglomerate’s efforts to combat sextortion.

Earlier this week, Meta announced a range of new safety features designed to further protect people from the online blackmail method, which has become a growing issue.

Meta will also blur images containing nudity when sent or received in Instagram direct messages, warning people of the risks associated with sending sensitive images. 

The business said the feature will be enabled by default for teens under 18.

According to the social media giant, it has removed over 800 Facebook groups and 820 accounts affiliated with groups attempting to organise, recruit and train new sextortion scammers.

Meta added that it has also partnered with Crisis Text Line in the US to give free and private mental health support to people who have been victims of sextortion.

This week Meta also announced the launch of a new campaign aimed at helping protect children from sextortion scams.

The tech company developed an educational video that helps teens recognise the warning signs of a sextortion scammer’s behaviour.

The video was produced in collaboration with the National Center for Missing and Exploited Children (NCMEC) and nonprofit organisation Thorn, including Thorn’s Youth Advisory Council.

“Campaigns like this bring much-needed education to help families recognize these threats early,” said John Shehan, senior vice president at the National Center for Missing & Exploited Children. “By equipping young people with knowledge and directing them to resources like NCMEC’s CyberTipline and Take It Down, we can better protect them from falling victim to online exploitation.”

In September, the Meta-owned platform introduced “teen accounts”, aiming to further reinforce its privacy protection and parental control capabilities for younger users.

The new feature gives accounts opened by teenagers built-in protections that limit who can contact them and the type of content they see, while providing new ways for younger users to explore content that interests them.

During the past two years, a number of social media platforms have introduced stricter rules to reinforce privacy protections across their channels.

Last year, social media platform Snapchat rolled out new parental Content Controls to help prevent minors from being exposed to potentially inappropriate content while using the app.

The feature allowed parents and guardians to access new content filtering capabilities through Snapchat’s Family Centre supervision tool to block “sensitive or suggestive” content from appearing on their child’s Snapchat Stories or Spotlight feed.

Companies across a range of sectors, including banking and real estate, have been fighting the consequences of the misuse of private digital data, including violations of intellectual property rights and other interests, and the use of data for unauthorised purposes.

In August, HSBC UK partnered with the Open Property Data Association (OPDA) to enable the sharing of digital property information across the mortgage market. Several other UK financial institutions have joined the OPDA, including Nationwide, which announced its membership in July, and Lloyds Banking Group, which signed up in March.

