When Face Recognition Doesn’t Know Your Face Is a Face

Autumn Gardiner thought updating her driving license would be straightforward. After getting married last year, she headed to the local Department of Motor Vehicles office in Connecticut to get her name changed on her license. While she was there, Gardiner recalls, officials said she needed to update her photo. That’s when things started to go wrong.

Every time staff tried to take her photo, Gardiner says, the system would reject it. “Everyone’s watching. They’re taking more photos,” she recalls. Gardiner, who works as a grant manager for an environmental conservation charity, is one of a small number of people globally who live with Freeman-Sheldon syndrome. Sometimes known as Whistling Face syndrome, the genetic condition impacts muscles around the face and skull, which can result in an undersized mouth.

As more staff members at the DMV were called to help, Gardiner says she started to believe the rejected photos were being caused by her facial difference. The camera system didn’t seem to work for her, she says. “It was humiliating and weird. Here’s this machine telling me that I don’t have a human face,” Gardiner says.

Gardiner isn’t alone. Around half a dozen people living with a variety of facial differences—from birthmarks to craniofacial conditions—tell WIRED they are increasingly struggling to participate in modern life as identity verification software, often powered by machine learning, quickly becomes commonplace.

Courtesy of Rick Guidotti for Positive Exposure

Some of those living with facial differences tell WIRED they have undergone multiple surgeries and experienced stigma for their entire lives, which is now being echoed by the technology they are forced to interact with. They say they haven’t been able to access public services due to facial verification services failing, while others have struggled to access financial services. Social media filters and face-unlocking systems on phones often won’t work, they say.

“The facial difference community is constantly overlooked,” says Phyllida Swift, the CEO of Face Equality International (FEI), an umbrella group representing other facial difference and disfigurement charities and organizations. There are more than 100 million people worldwide who live with facial disfigurements, FEI estimates. People with facial differences have experienced problems with airport passport gates, photo apps, social media video filters, background blurring on video calls, and more, according to FEI’s research. “In many countries, facial recognition is increasingly a part of everyday life, but this technology is failing our community,” Nikki Lilly, a patron for FEI, told a United Nations meeting in March.

Access Denied

From phones to hotel rooms, your face increasingly acts as a digital key. Over the past decade, rapid machine learning and AI advancements have led to the creation of a range of face recognition technologies—meaning that more than ever before, your appearance can be used as a digital identifier. Police have widely deployed face recognition systems, which have frequently been found to be inaccurate and biased against Asian and Black people, while the wider world of face checking has seen government services, anti-fraud systems, and financial institutions using AI to complete identity checks. Most recently, social media and porn websites have adopted face scanning as part of age verification measures.

These “authentication” face checks can take multiple forms. Selfies can be automatically compared to existing ID documents, while liveness tests can require you to take a short video to show you are real—not a fraudster holding a printed photo to the camera. Broadly speaking, these biometric systems often measure your facial features—such as the distance between your eyes or the size of your jaw—to create “faceprints.” While these kinds of surveillance technologies may be effective for a lot of people, they may fail to detect people with facial differences at all. The underlying machine learning models may not be trained on datasets that include a wide variety of faces, for example.
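To make that concrete, here is a minimal sketch of how a selfie-to-ID comparison of this kind can work, written with the open-source face_recognition library. It illustrates the general faceprint approach described above, not the code any DMV or identity vendor actually runs, and the 0.6 match threshold is simply that library's common default. The detail that matters for this story is the first step: if the detector never finds a face in an image, the check fails before any comparison is attempted.

```python
# Illustrative sketch only: a selfie-to-ID comparison using the open-source
# face_recognition library. Not the system any DMV or identity vendor runs.
import face_recognition


def verify(selfie_path: str, id_photo_path: str, threshold: float = 0.6) -> bool:
    """Return True if the selfie and ID photo appear to show the same person."""
    selfie = face_recognition.load_image_file(selfie_path)
    id_photo = face_recognition.load_image_file(id_photo_path)

    # Step 1: find a face and compute a 128-number encoding (a "faceprint").
    selfie_prints = face_recognition.face_encodings(selfie)
    id_prints = face_recognition.face_encodings(id_photo)

    # If the detector finds no face at all, verification fails here, before
    # any comparison happens. This is the failure mode described in this piece.
    if not selfie_prints or not id_prints:
        return False

    # Step 2: measure the distance between the two faceprints and compare it
    # to a threshold (0.6 is the library's common default, used as an example).
    distance = face_recognition.face_distance([id_prints[0]], selfie_prints[0])[0]
    return distance < threshold
```

Commercial systems layer on liveness checks and far larger models, but the same early exit applies: a detector trained mostly on typical faces can reject an atypical one outright.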

“The case of Face Equality International—of people who have different faces—is a really important canary in the coal mine of what can go wrong when these systems don’t work,” says Greta Byrum, the founder of technology consultancy firm Present Moment Enterprises, which focuses on the social impact of technological systems and has provided some pro bono work to FEI. “We’re seeing facial recognition technologies becoming one of those hammers to which everything looks like a nail,” Byrum says.

“I would like to see people realize that not everyone looks the same, from basic skin tone to facial features to the standard facial appearance,” says Crystal Hodges, a speaker and writer who lives with Sturge-Weber syndrome and has a purple “port wine stain” across part of her face. “People’s facial symmetry may not be the same. Other people have different facial features that you don’t always see every day, but they exist.”

Courtesy of Crystal Hodges

Hodges, who lives in Arizona, says that last year she and her husband were trying to get their credit scores from a large credit reporting agency when its face verification service wouldn’t work for her. “I tried five to eight times,” she says. Toward winter, Hodges says, her birthmark gets darker. “We were trying a different ring light, we were trying calmer lighting, and it would not accept. My husband had a beard, and it accepted his.” Hodges says she recently tried the same system again, and while it worked this time, she still never got her credit score.

“If you don’t include people with disabilities or people with facial differences in the development of these processes, no one’s going to think of these issues,” says Kathleen Bogart, a psychology professor at Oregon State University who specializes in disability research and lives with a facial difference. “AI has amplified these issues, but it’s rooted in long-standing underrepresentation and prejudice towards people with facial differences that occurred long before AI was a thing.”

Too Little, Too Late

When face verification systems fail, it’s often hard to find help—piling more pressure on a stressful situation. For months, Maryland resident Noor Al-Khaled has struggled to create an online account with the Social Security Administration. Al-Khaled, who lives with the rare craniofacial condition ablepharon macrostomia syndrome, says having an online account would allow her to easily access SSA records and quickly send documents to the agency.

“I don’t drive because of my vision; I should be able to rely on the site,” Al-Khaled says. “You have to take a selfie, and the pictures have to match,” she says. “Because of the facial difference, I don’t know if it’s not recognizing the ID or the selfie, but it’s always saying images don’t match.”

Not having that access makes life harder. “On an emotional level, it just makes me feel shut out from society,” she explains. Al-Khaled says that all services should provide alternative ways for people to access online systems. “The lack of other fallback options means that sometimes people get trapped in these labyrinths of technological systems,” says Byrum from Present Moment Enterprises.

An SSA spokesperson says alternative options to face verification are available, and it is “committed” to making its services accessible to everyone. The agency, the spokesperson says, does not run facial recognition systems itself but uses Login.gov and ID.me for verification services. The General Services Administration, which runs Login.gov, did not respond to WIRED’s request for comment. “Accessibility is a core priority for ID.me,” a spokesperson for ID.me says, adding it has previously helped people with facial differences and offered to directly help Al-Khaled after WIRED was in touch.

“There are few things more dehumanizing than being told by a machine that you’re not real because of your face,” says Corey R. Taylor, a New York–based actor and motivational speaker who lives with a craniofacial anomaly. Last year, Taylor says, he was using a financial app to access a small amount of money; as he tried to complete the payment process, he found that the face verification system could not match his selfie to the image on his ID. To get the system to work, he had to move into different positions. “I had to literally raise my eyes and contort my face,” Taylor says. When he emailed the company, he got what appeared to be a boilerplate response.

“One of the main things that we are advocating for is alternative methods,” says Hannah Saunders, the head of policy and education at FEI. “There’s possibly a long-term fix in terms of changing the technology, but at the moment, we’re most concerned that people can apply for jobs, they can access their money, they can get through passport control with their family, and that negotiating those kinds of systems isn’t as stressful as it is.”

Swift, the CEO at FEI, says the organization has been “approached” multiple times to work with businesses to address some of the issues but has not yet been able to sort out any formal arrangements. Swift says FEI wants to put the “right systems and safeguards in place” to protect its members. Also, Swift adds, “we get the sense that this is quite low down on the list of priorities [of tech companies], and therefore progress is seemingly slow.”

However, the problems extend beyond emerging technologies. Many of the people WIRED spoke to explained that they often face appearance-based bullying online, and the internet’s wider pivot to short-form video content like TikTok and Instagram Reels leaves them feeling isolated. “The internet, for so long, was a safe haven for me,” Al-Khaled says. “I found my home and solace online because it was the only place where I could meet people and actually be myself without being stared at or gawked at.” Hodges says people have previously said to her that they are shocked that she can leave her house. “I feel like that’s kind of transferred now to: ‘I can’t believe you leave your house, and I can’t believe you go online,’” she says.

Courtesy of Corey Taylor

At the DMV in Connecticut last year, Gardiner says that staff were able to manually override the photography system, although she did explain her facial difference to them in the process. She received her replacement temporary license that day and her full license later in the mail. “I’ve dealt with a lot of shit when it comes to my face, but after I left, I just cried,” Gardiner recalls. “It was kind of like a reminder that yes, I’m different and I don’t fit in.”

After leaving, Gardiner messaged a state representative she knew and told her about the experience, according to messages shared with WIRED. The official got in touch with the local DMV commissioner and discussed the issue as well as staff training, they say.

“While we adhere to stringent guidelines to guarantee photo compliance, our employees approach their roles with empathy, ensuring they understand and address the needs of those they serve,” a Connecticut DMV spokesperson says. “We want our residents to have professional interactions with the DMV.” A spokesperson for Idemia, which provides the DMV with technology, says it regularly offers staff training, and its algorithms have been ranked highly in external evaluations.

Gardiner says that while there was ultimately a positive response, she has heard stories of people with facial differences running into problems at the DMV in other states. “I really want to try to pressure for better protocols: What do humans do when the AI doesn’t work?” she says. But with face recognition systems being used so broadly, the problem can seem insurmountable. “After that whole DMV experience, I then had to get my passport done,” Gardiner says, “and I ran into the same problem with my passport photo.”
