Apple Puts Ashley Gjøvik on Indefinite Leave After She Spoke Out About Sexism
Engineering program manager Ashley Gjøvik had been raising concerns with Apple about employee...
At the end of 2021, Apple will launch new tools that alert parents and children when a child sends or receives sexually explicit photos via the Messages app. The feature is aimed at limiting the spread of Child Sexual Abuse Material (CSAM).
Apple says it will detect known CSAM images on iPhones, iPads, and other devices while respecting consumer privacy.
Furthermore, these warnings are designed to guide the child toward the right decision: choosing not to view the content. If the child taps on such content anyway, they will be warned that proceeding will notify their guardian, and that their parents want them to be safe.
In principle, many consider this excellent and long overdue. Child sexual exploitation (CSE) is deeply damaging, with effects that can last a lifetime, and attempts to thwart those who perpetrate such crimes are long overdue.
Catch all the Sci-Tech News, Breaking News Event and Latest News Updates on The BOL News