Nowadays, messaging apps let people share all kinds of media online: photos, videos, and other files. But this convenience can also be misused. Some people use these platforms to send vulgar or sexually explicit content. That may be fine between two consenting adults, but what about children?
Exploiters who get hold of a child's phone number can send them explicit material. To protect children from this kind of abuse, Apple has announced a new safety feature for Messages. Using on-device machine learning, the system will scan photo attachments to detect sexually explicit content before a child sees it. If such an image is detected, it will be blurred and displayed with a warning. If the child chooses to view the image anyway, their parent can receive a notification. The same applies in the other direction: if a child attempts to send explicit content, the child is warned first, and the parent can be notified if the child sends it regardless.
This feature isn't live yet and is expected to arrive with the iOS 15 update. To enable it, the accounts involved must be set up as a family in iCloud.