Your Face is Going Places You May Not Like

This forum is for Science and Tech topics.

Note: Anyone can read this forum, but only registered users may post or reply to messages.
apollo
Posts: 83
Joined: Wed Jun 10, 2015 1:59 pm

Your Face is Going Places You May Not Like

Post by apollo » Thu Jan 03, 2019 8:31 am

While the article is recent, much of the technology and many of the public surveillance projects the author talks about have been in use for over a decade; it just was not public knowledge. These days, with camera quality improving dramatically and cameras shrinking to smaller than the end of one's little finger (consider the size and quality of the camera in most cell phones), these devices can be placed nearly anywhere and are virtually undetectable without a very careful, methodical search.

As for the face/body recognition software, simply consider the quality of the software that comes preinstalled on, or is free to download for, your cell phone. What do you think the government has available for its use?

The widespread use, and, unfortunately far more likely, the widespread misuse of this technology is why the U.N. is pushing the Global Legal ID on all countries with its 2030 agenda. This technology, and the willingness of so many to conform, makes it very easy to identify everyone and track what we are doing, where we go, what groups we belong to, even whether we hold an opinion that is not popular with the government, and so on, all by simply having inexpensive, tiny, unseen cameras scattered around our neighborhoods.


copied from:

https://hackaday.com/2019/01/02/your-fa ... -not-like/


Your Face is Going Places You May Not Like


Bob Baddeley
January 2, 2019


Many Chinese cities, among them Ningbo, are investing heavily in AI and facial recognition technology. Uses range from border control — at Shanghai’s international airport and the border crossing with Macau — to the trivial: shaming jaywalkers.

In Ningbo, cameras oversee the intersections and use facial recognition to shame offenders by putting their faces up on large displays for all to see, and presumably mutter "tsk-tsk". So it shocked Dong Mingzhu, the chairwoman of China's largest air conditioner firm, to see her own face on the wall of shame when she'd done nothing wrong. The AIs had picked up her face off an ad on a passing bus.

False positives in detecting jaywalkers are mostly harmless and maybe even amusing, for now. But the city of Shenzhen has a deal in the works with cellphone service providers to identify the offenders personally and send them a text message, and eventually a fine, directly to their cell phone. One can imagine this getting Orwellian pretty fast.

Facial recognition has been explored for decades, and it is now reaching a tipping point where the technology is starting to have real consequences for people, and not just in the ways dystopian sci-fi has portrayed. Whether it's racist, inaccurate, or easily spoofed, getting computers to pick out faces correctly has been fraught with problems from the beginning. With more and more companies and governments using it, and having increasing impact on the public, the stakes are getting higher.



How it Works

Your face is like a snowflake: delicate and unique. While some people struggle to tell faces apart, cameras can take accurate measurements of many of their dimensions, including the distance from eye to eye, from forehead to chin, and other relative measurements. Put these together and you get a signature of metrics that can identify someone. Image analysis on photos or video can then find faces, compute the metrics, compare the results against a database, and find a match.
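To make that last step concrete, here is a toy Python sketch of matching a signature against a database. The measurement vectors and the identify() helper are entirely made up for illustration; a real system would first extract landmarks from images, but the compare-to-a-threshold structure is the same.

```python
import numpy as np

# Hypothetical "signatures": each face reduced to a vector of relative
# measurements (eye-to-eye, forehead-to-chin, etc.). Values are made up.
database = {
    "alice": np.array([0.42, 0.61, 0.33, 0.57]),
    "bob":   np.array([0.39, 0.70, 0.29, 0.51]),
    "carol": np.array([0.45, 0.58, 0.36, 0.60]),
}

def identify(probe, database, threshold=0.05):
    """Return the closest enrolled identity, or None if nobody is close enough."""
    best_name, best_dist = None, float("inf")
    for name, signature in database.items():
        dist = np.linalg.norm(probe - signature)  # distance between signatures
        if dist < best_dist:
            best_name, best_dist = name, dist
    return best_name if best_dist < threshold else None

# A new photo of (probably) Carol, with slight measurement noise.
probe = np.array([0.44, 0.59, 0.35, 0.61])
print(identify(probe, database))  # -> "carol"
```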

Governments, which have access to driver's license photos, booking photos, and other large collections of images, can put together immense sets of metrics and refine their algorithms using subsets as training data. Companies like Facebook also have huge datasets at their fingertips, matching photos with the people in them, then tying your friend list to the facial recognition to get a high likelihood of accurately identifying the other faces in your photos.

In this way, Facebook may even have a leg up on governments: they have metadata about social links that increases the chances of positive identification, and their data is more accurate and more recent. But possibly most important, their training data includes not just mugshots taken from the same angles under the same lighting conditions, but real situations with varying light levels, moods, angles, obstructions, and so on, so they can train their models on much richer data. While law enforcement may have access to lots of mugshots, matching them up to video camera feeds to extract faces is a lot harder than CSI: Miami would have you believe.

The great thing is that you can go play with this yourself using OpenCV and your own camera. OpenCV ships with a face detection algorithm, and the tutorial walks you through all the math and complexity of identifying faces in photographs.
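One minimal version of that webcam experiment looks something like the following, using the Haar cascade classifier bundled with OpenCV (press q to quit):

```python
import cv2

# Load OpenCV's bundled Haar cascade for frontal faces.
cascade = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml"
)

cap = cv2.VideoCapture(0)  # default webcam
while True:
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)  # detector works on grayscale
    faces = cascade.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
    for (x, y, w, h) in faces:
        # Draw a green box around each detected face.
        cv2.rectangle(frame, (x, y), (x + w, y + h), (0, 255, 0), 2)
    cv2.imshow("faces", frame)
    if cv2.waitKey(1) & 0xFF == ord("q"):
        break

cap.release()
cv2.destroyAllWindows()
```

Note that this only detects that a face is present; identifying whose face it is requires the database-matching step described above.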

To Err is AI

Facial recognition is imperfect. In addition to picking up jaywalking bus advertisements, facial recognition made headlines recently when Microsoft's Face API couldn't recognize the faces of people of color as well as it could those of white people; it turned out the training data didn't include enough dark-skinned people. In a study by the ACLU, Amazon's Rekognition software falsely matched 28 members of Congress with mugshots in its database.

The training set isn’t the only problem for recognition. Your mood also has an impact. Your angry face standing at the back of the line for the TSA looks different from your relieved face when you are about to exit the DMV. The angle of the shot must be accounted for when calculating the metrics, and subtle variations in lighting can change a few pixels just enough to get a different value. Someone who knows they are being recorded may be able to change their facial features enough to fool the algorithm as well. Facial recognition for the purpose of access has very different parameters from facial recognition for identification without consent. Accounting for all of these variables and slight changes is extremely difficult, if not impossible, and in a situation where subtle differences are barely distinguishable from noise, the likelihood of error is high.

Increasing the size of the database can also increase the likelihood of confusion. If the database has multiple people with similar metrics, slight variations in the source image can result in slightly different metrics, which then identify the wrong person. Think of all the times an officer on a TV show has pulled out a book of mugshots and asked someone to try to identify a perp. If you aren't in the book, you are pretty safe. But if the book is compiled by an entity that has everyone's picture, you're in the book whether you like it or not.
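A quick simulation makes the point. The "signatures" here are random vectors rather than real facial metrics, but the effect carries over: the bigger the book, the closer the nearest wrong face sits to yours, and the less measurement noise it takes to flip the match.

```python
import numpy as np

rng = np.random.default_rng(1)
probe = rng.random(8)  # one person's 8-dimensional signature

# As the database grows, the closest *wrong* entry creeps toward the probe.
for size in (100, 10_000, 1_000_000):
    database = rng.random((size, 8))  # everyone else's signatures
    nearest_wrong = np.linalg.norm(database - probe, axis=1).min()
    print(f"database of {size:>9,}: nearest wrong match at distance {nearest_wrong:.3f}")
```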

Some facial recognition systems are trained to identify characteristics without identifying the person. For example, some advertising companies are looking at customizing ads for passersby based on what the system infers about their sex or age. Unfortunately, this assumption of gender can be offensive and reinforce stereotypes, in a world where it's already tough enough when people do it. A static billboard is one thing, but a billboard that judges you and offers you products based on your appearance may not last long. If we ever reach a day where sex has to be confirmed by an algorithm before an impatient person can be granted access to a bathroom, I will be among the first to foil the cameras.

The Consequences

Which bathroom to use isn't as bad as it gets, though. Imagine if self-driving cars were to use aspects of facial recognition to solve the trolley problem. Someone would have to define in code or in law what characteristics of a human are more valuable than others, possibly leading to split-second decisions by a computer that one person's life is worth 3/5 of another person's.

When facial recognition inevitably gets it wrong and misidentifies someone, or when their face is copied and used maliciously, it can have horrible and long-lasting effects in the same way that identity theft can ruin someone’s credit for years. China’s cameras are being linked to a new social credit system in which people who make small mistakes are penalized in a way that publicly shames them and affects their ability to live and work. When the system gets it wrong, and it already does, it will ruin innocent people.

Faces are being used to grant access to phones, computers, and buildings. Facial recognition is being rolled out in China as part of a social score system that penalizes people for traffic offenses. Straight out of the movie Minority Report, facial feature recognition is being used to customize advertisements in public. In an application we've all seen coming for a long time because of its use on TV, facial recognition is being used in law enforcement to identify wanted suspects. The TSA is also getting on board, and has been testing fingerprints and facial recognition to verify the identity of travelers.

As the uses of facial recognition grow, so will the number of innocent people falsely accused; it's simply a matter of percentages. If a system recognizes people correctly 99.5% of the time and analyzes 10,000 traffic incidents in a given period, 50 innocent people will receive fines in that period. 70,000,000 people flow through Shanghai's airport per year; at 99.5% accuracy, that means nearly 1,000 false identifications per day.
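The arithmetic is worth spelling out (the figures are the article's; only the throwaway script is added here):

```python
# False positives scale with volume, even at high accuracy.
error_rate = 0.005  # 99.5% correct means 0.5% wrong

incidents = 10_000                     # traffic incidents analyzed per period
print(incidents * error_rate)          # -> 50.0 innocent people fined per period

travelers_per_year = 70_000_000        # Shanghai airport throughput
travelers_per_day = travelers_per_year / 365
print(travelers_per_day * error_rate)  # -> ~959 false identifications per day
```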

Fooling Facial Recognition

Imagine that you wish to opt out of a facial recognition system. With carefully applied makeup, it's possible to obscure the face in such a way as to convince the neural network that there isn't a face at all. Other accessories like sunglasses can be effective. Failing that, wearing a point light source on your face, such as a headlamp, can saturate the camera. Even the San Francisco uniform, a hoodie, can obscure the face enough to prevent face recognition.

When access is the goal, a common trick is to use a printed photo of the intended person to be recognized. Windows 10 was vulnerable to this for a while, but has since fixed this problem. Computers are getting smarter about collecting 3D data to make spoofing more difficult, but it’s still relatively easy to do.

Of course you shouldn’t be using your face as a password anyway; using something that is public to get access to something that is private is bad security practice. Besides being easy to spoof, it can be used by law enforcement to get access to your phone or computer if it uses Face ID to unlock. With so many facial recognition databases growing in size every day, their value to hackers is increasing. The public release of these databases has the power to make everybody’s face as a password useless forever.

Conclusions

Facial recognition changes our idea of what can be considered private. While US law has long maintained that your presence in a public place is public, the reality was that publicly available information was different from publicly accessible information, and one's daily habits could generally be considered private, with the exception of celebrities. Now we are almost in an age where everyone can be scrutinized and put on display as thoroughly as a celebrity is by the paparazzi, and, just like the paparazzi, the public won't be satisfied by the mundane but will instead be looking for the freak in all of us. Someone with a camera outside a bar will be able to identify everyone who goes in and how frequently, and publish that information to people who will be critical, unforgiving, and willing to use it against them.

Thankfully, there are a lot of people who are concerned, and they are having an impact. Google has pledged to refrain from selling facial recognition products. Amazon has admitted to working with law enforcement to sell its Rekognition software, but that is under scrutiny by some members of Congress. Microsoft recently published a detailed and insightful position on facial recognition that's very much worth a read. Even the Chinese AI unicorns are concerned.

Whether you’re playing with facial recognition at home, crossing borders, or just crossing the street, cameras are watching you. What is happening with those images is impressive and intimidating, and just like all new technologies is prone to error and ripe for abuse. What we’d like to see is more transparency in implementation and the right to opt out, and these issues will doubtless play themselves out in the world’s governments and courtrooms for years to come.
