Apple’s Nudity-Blurring Messages Function is Now Available Internationally

The feature is rolling out in the United Kingdom, Canada, Australia, and New Zealand.

Apple’s “Communication Safety in Messages” feature, which automatically blurs photographs containing nudity sent to children through the company’s messaging service, is expanding to more countries. Following its debut in the United States last year, the feature is rolling out to users in the United Kingdom, Canada, New Zealand, and Australia in the Messages apps for iOS, iPadOS, and macOS. Although an exact date is unknown, The Guardian reports that the feature will arrive in the UK “soon.”

Scanning takes place on the device and has no effect on the message’s end-to-end encryption. The functionality, which is integrated with Apple’s current Family Sharing system, can be enabled by following the instructions here.

To protect children, the opt-in feature scans incoming and outgoing photographs for “sexually explicit” material. If such an image is detected, it is blurred, and the child is shown guidance on finding help, along with reassurance that it is okay not to view the image and to leave the conversation. “You’re not alone, and can always get help from someone you trust or with trained professionals,” the pop-up message says. “You can also block this person.” As was the case when the feature first launched in the United States, children will be able to message a trusted adult about a flagged photo.

When Apple first announced the feature in August of last year, it implied that the notification would happen automatically. Critics quickly pointed out that this initial approach risked outing queer children to their parents and could be misused in other ways.

In addition, Apple is expanding the rollout of a new feature for Spotlight, Siri, and Safari searches that will direct users to safety resources if they search for topics related to child sexual abuse.

Along with these two child safety measures, Apple announced a third initiative in August, which involves scanning photos for child sexual abuse material (CSAM) before they are uploaded to a user’s iCloud account. Privacy advocates were alarmed by this feature, however, arguing that it risked introducing a backdoor that would undermine the security of Apple’s users. The company later said that all three features would be delayed while it addressed these concerns. Although it has since launched the first two features, Apple has yet to provide an update on when the more contentious CSAM detection feature will be ready.
