There's also a whole category of facial-recognition-fooling imagery you can wear yourself, called adversarial examples, that works by exploiting weaknesses in how computer vision software has been trained to identify certain characteristics. Take, for instance, this pair of sunglasses with an adversarial pattern printed onto it that can make a facial recognition system think you're actress Milla Jovovich.
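The core trick behind such adversarial patterns can be sketched with a toy gradient attack in the style of the fast gradient sign method (FGSM): nudge the input in the direction that most increases the classifier's loss. The two-feature logistic "classifier" below is hypothetical, standing in for a real face-recognition model, but the mechanism is the same one real attacks scale up.

```python
import numpy as np

# Toy stand-in for a trained classifier: logistic regression on 2-D features.
# The fixed weights are hypothetical, chosen only for illustration.
w = np.array([2.0, -1.0])
b = 0.1

def predict(x):
    """Probability the model assigns to the target class for input x."""
    return 1.0 / (1.0 + np.exp(-(w @ x + b)))

def fgsm(x, label, eps):
    """Perturb x along the sign of the loss gradient (FGSM-style step).

    For logistic loss L = -log(p) with label 1, dL/dx = (p - label) * w.
    """
    p = predict(x)
    grad = (p - label) * w          # gradient of the loss w.r.t. the input
    return x + eps * np.sign(grad)  # small step that increases the loss

# A sample the model labels confidently as class 1...
x = np.array([1.5, 0.5])
# ...becomes ambiguous after a bounded adversarial perturbation.
x_adv = fgsm(x, label=1, eps=0.8)
print(predict(x), predict(x_adv))  # confidence drops after the perturbation
```

Printing a pattern on glasses amounts to constraining where such a perturbation may appear (only on the accessory) while optimizing it against the victim model.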
But that type of thwarting of facial recognition usually means altering a photograph or a still image captured from a security camera or some other source after the fact. FAIR's system, by contrast, de-identifies faces in video. That's a first for the industry, FAIR claims, and good enough to combat sophisticated facial recognition systems. You can see an example of it in action in this YouTube video, which, because it's de-listed, cannot be embedded elsewhere.
"Face recognition can lead to loss of privacy and face replacement technology may be misused to create misleading videos," reads the paper explaining the company's approach, as cited by VentureBeat. "Recent world events concerning the advances in, and abuse of face recognition technology invoke the need to understand methods that successfully deal with de-identification."
The other concern FAIR's research addresses is facial recognition, which is also unregulated and causing concern among lawmakers, academics, and activists, who fear it may violate human rights if law enforcement, governments, and corporations continue to deploy it without oversight.