Apple's new FAQ unsuccessfully attempts to clarify the new child abuse scanning features

By Maria Janulis | August 10, 2021

Apple has attempted to assuage concerns about its new anti-child-abuse measures in a new FAQ. The company wrote in the FAQ, "Let us be clear, this technology is limited to detecting CSAM [child sexual abuse material] stored in iCloud and we will not accede to any government's request to expand it."

Apple announced the new tools last Thursday, including two features aimed at protecting children. The first, "communication safety," uses on-device machine learning to identify and blur sexually explicit images received by children in the Messages app, and can notify a parent if a child aged 12 or under views or sends such an image. The second is designed to detect known CSAM by scanning users' images as they are synced to iCloud. Apple is notified only when CSAM is detected, and after verifying a match it alerts the authorities.

Apple's plans were not welcomed with open arms. Digital privacy groups and campaigners argued that the features introduce a backdoor into Apple's software, one that could later be used to scan for content well beyond child sexual abuse material. The core worry is that authoritarian governments could use the system to hunt for political dissent, or that anti-LGBT regimes could use it to crack down on sexual expression.

The Electronic Frontier Foundation wrote, "Even a thoroughly documented, carefully thought-out, and narrowly scoped backdoor is still a backdoor. We've already seen this mission creep in action. One of the technologies originally built to scan and hash child sexual abuse imagery has been repurposed to create a database of 'terrorist' content that companies can contribute to and access for the purpose of banning such content."

Apple, for its part, argues that the system is safeguarded against misuse and is designed so that it cannot detect anything other than sexual abuse imagery. Its list of banned images is provided by the National Center for Missing & Exploited Children (NCMEC) and other child safety organizations, and the system, Apple says, "only works with CSAM image hashes provided by NCMEC and other child safety organizations." In other words, Apple says it will not add to this list of image hashes itself.

Apple also said it would refuse government demands to add non-CSAM images to the list: "We have faced demands to build and deploy government-mandated changes that degrade the privacy of users before, and have steadfastly refused those demands. We will continue to refuse them in the future."

Despite these assurances, Apple has made concessions to governments in the past in order to keep operating in their countries. It sells iPhones without FaceTime in countries that don't allow encrypted phone calls, and it has removed thousands of apps from its App Store in China.

The FAQ also fails to address concerns about the feature that scans Messages for sexually explicit material, so it remains to be seen what concrete steps the company takes next.
The EFF further warned, "All it would take to widen the narrow backdoor that Apple is building is an expansion of the machine learning parameters to look for additional types of content, or a tweak of the configuration flags to scan, not just children's, but anyone's accounts."
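That dispute ultimately comes down to what goes into the hash list and how the matching is configured. As a purely illustrative aid, the Swift sketch below shows the general idea of checking an image against a list of known hashes. The loadKnownHashes helper and the use of an ordinary SHA-256 digest are assumptions made for this example; they are not Apple's actual design, which the company describes as relying on perceptual hashing of images rather than an exact byte-level digest.

import Foundation
import CryptoKit

// Illustrative sketch only: exact-match hashing against a supplied blocklist.
// Whatever digests end up in the list determine what gets flagged, which is
// why critics focus on who controls the list's contents.

// Hypothetical helper: a real deployment would ship vendor-provided digests.
func loadKnownHashes() -> Set<String> {
    return []
}

// Returns true if the image data's SHA-256 digest appears in the blocklist.
func matchesKnownHash(_ imageData: Data, against knownHashes: Set<String>) -> Bool {
    let digest = SHA256.hash(data: imageData)
    let hex = digest.map { String(format: "%02x", $0) }.joined()
    return knownHashes.contains(hex)
}

In this toy version, a scanner would simply call matchesKnownHash(photoData, against: loadKnownHashes()) before upload; the point is that the behaviour of such a check is defined entirely by the list it is given.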
