Face Recognition and Face Masks: Accuracy of face recognition plummets when applied to mask-wearers; NIST report

World Privacy Forum Issue Brief

PDF Version of Brief

30 July 2020

NIST has published its first report [1] regarding face recognition algorithms and the wearing of face masks. The report quantifies how one-to-one [2] face recognition systems perform when they are used on images of diverse people wearing a variety of mask types and colors. The study found that all pre-COVID-19 algorithms give increased false non-match rates [3] when the probe images are masked. Some of the false non-match rates were quite high; for example, some of the algorithms failed 30-50 percent of the time. Some of the same algorithms that had high false non-match rates (FNMR) with masks had an FNMR of less than 0.01 without a mask. In other words, even the best algorithms became unreliable. A 50 percent false non-match rate indicates a very meaningful failure of the algorithm to perform an accurate one-to-one match.

The details of the study are important. For this study, NIST tested 89 one-to-one face recognition algorithms that were created prior to the COVID-19 pandemic, using a variety of digitally applied mask shapes. The test database for this study is the same extensive and diverse dataset that NIST has used in its Face Recognition Vendor Tests since 2018. [4] Both the shape and the color of the mask a person is wearing made a difference in the results. Rounder masks, such as many N95 masks, had lower false non-match rates, as did light blue masks. Masks that covered more of the face, and masks that were black, had higher false non-match rates.

The screenshot below shows how NIST digitally applied masks to its test database for the study.

[Screenshot: NIST, NISTIR 8311 – Ongoing FRVT Part 6A: Face recognition accuracy with face masks using pre-COVID-19 algorithms, p. 3. https://pages.nist.gov/frvt/reports/facemask/frvt_facemask_report.pdf]

The study authors indicate that the testing described in the report has limitations, which are outlined in the report’s summary. For example, newer face recognition algorithms that have been calibrated during the COVID-19 pandemic to account for mask wearing were not tested in this round. This NIST study is nevertheless an important first study on masks and face recognition: we now have quantification that masks cause very significant errors in one-to-one face recognition algorithms.

Many policy issues can flow from the problems this study has outlined. One meaningful question that will need to be addressed during the COVID-19 crisis is the issue of requests to remove masks for ID purposes. There may be a time when that is comfortable and safe for people, but for many people, especially those vulnerable to more severe forms of illness from COVID-19, that time is not now. Until the COVID-19 crisis is fully resolved, there needs to be meaningful public health input into decisions around the use of face recognition systems that largely do not work for mask wearers. No one should be asked to take off a mask and risk their health to accommodate a face recognition system.

NIST has additional forthcoming publications on one-to-many algorithms regarding the wearing of masks, including the new “COVID-19 aware” algorithms designed to work with masks.
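To make the false non-match rate figures concrete, the short Python sketch below shows how an FNMR can be computed from same-person ("mated") comparison scores at a fixed decision threshold. The function, scores, and threshold here are hypothetical illustrations for this brief, not values or code from the NIST evaluation.

```python
# Minimal illustrative sketch (not NIST's evaluation code): estimating a
# false non-match rate (FNMR) for a one-to-one verifier. Each score below is a
# hypothetical similarity score from comparing a probe image against the same
# person's enrolled reference image.
from typing import Sequence

def false_non_match_rate(mated_scores: Sequence[float], threshold: float) -> float:
    """FNMR = fraction of genuine (same-person) comparisons that fall below
    the decision threshold and are therefore wrongly rejected."""
    if not mated_scores:
        raise ValueError("need at least one mated comparison score")
    misses = sum(1 for score in mated_scores if score < threshold)
    return misses / len(mated_scores)

# Hypothetical scores: masking tends to push genuine-comparison scores lower,
# so more of them fall under the operating threshold.
unmasked_scores = [0.91, 0.88, 0.95, 0.90, 0.87, 0.93, 0.89, 0.92, 0.94, 0.90]
masked_scores   = [0.62, 0.78, 0.55, 0.81, 0.49, 0.74, 0.58, 0.83, 0.61, 0.77]
THRESHOLD = 0.70  # illustrative operating point, not a NIST-recommended value

print(f"FNMR, unmasked probes: {false_non_match_rate(unmasked_scores, THRESHOLD):.2f}")
print(f"FNMR, masked probes:   {false_non_match_rate(masked_scores, THRESHOLD):.2f}")
```

In this toy example, masking pushes half of the genuine comparisons below the threshold, which corresponds to the 50 percent failure rate described above for the worst-affected algorithms.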
To follow this work, see NIST’s dedicated page for its ongoing testing and reporting on face masks and face recognition systems: Face Recognition Accuracy with Face Masks Using Pre-COVID-19 Algorithms, https://pages.nist.gov/frvt/html/frvt_facemask.html

Notes

[1] National Institute of Standards and Technology Interagency or Internal Report (NISTIR) 8311, 58 pages (July 2020). Available at: https://doi.org/10.6028/NIST.IR.8311.

[2] Biometric systems are generally set to run in either identification (one-to-many) or verification (one-to-one) mode. One-to-one face recognition systems operate in verification or authentication mode, comparing one supplied faceprint with one stored faceprint. An example of authentication/verification in one-to-one mode is an individual’s fingerscan used to unlock their smartphone or make a mobile payment: one live fingerscan is compared to one stored fingerscan. By comparison, an example of a biometric identification system is a law enforcement system that uses a single fingerprint to search across millions of stored fingerprints for potential matches. In biometric discussions, the distinction between identification (one-to-many) and verification (one-to-one) is important to take into account. See: Dixon, Pam, Introduction, A Failure to Do No Harm: India’s Aadhaar biometric ID program and its inability to protect privacy in relation to measures in Europe and the U.S., Springer Nature, Health Technology. DOI 10.1007/s12553-017-0202-6. Available at: http://rdcu.be/tsWv. Open Access via Harvard-based Technology Science: https://techscience.org/a/2017082901/.

[3] A “false non-match rate” or FNMR is the rate at which a biometric process mismatches two signals from the same individual as being from different individuals. False non-match rates are important quantifiers of a particular type of error in face recognition systems. Note that the “false match rate” or FMR is a different kind of error measurement. See: Schuckers, M.E. (2010), False Non-Match Rate, in Computational Methods in Biometric Authentication, Information Science and Statistics, Springer, London. Available at: https://doi.org/10.1007/978-1-84996-202-5_3.

[4] NIST Face Recognition Vendor Test (FRVT) page, ongoing. National Institute of Standards and Technology. Available at: https://www.nist.gov/programs-projects/face-recognition-vendor-test-frvt-ongoing.

Related Work

World Privacy Forum, Biometrics: https://www.worldprivacyforum.org/category/biometrics/

Pam Dixon, A Failure to Do No Harm: India’s Aadhaar biometric ID program and its inability to protect privacy in relation to measures in Europe and the U.S., Springer Nature, Health Technology. DOI 10.1007/s12553-017-0202-6. http://rdcu.be/tsWv. Open Access via Harvard-based Technology Science: https://techscience.org/a/2017082901/.

Publication Information

World Privacy Forum
www.worldprivacyforum.org
Original publication date: 30 July 2020
PDF Version: https://www.worldprivacyforum.org/wp-content/uploads/2020/07/WPF_FaceRecognition_Masks_COVID19_30July2020_fs.pdf

This work is made available under the terms of the Creative Commons Attribution-NonCommercial 4.0 license.

Posted July 30, 2020 in Biometrics, COVID-19, Facial Recognition, National Institute of Standards and Technology, WPF Issue Brief
