
A wrongful arrest. A “racist robot.” A call for new laws

By Ciara Cummings

ATLANTA, Georgia (WANF) — So-called “false matches” produced by facial recognition technology (FRT) have led to the wrongful arrests of Black men across the country. But new research from Georgia Tech aims to narrow those racial disparities.

Calvin Lawrence, a distinguished Atlanta engineer, has developed artificial intelligence (AI), including FRT, for government agencies across the country. As one of the few Black experts building these systems, he said even the best algorithms can be biased, which is why he wrote “Hidden in White Sight: How AI Empowers and Deepens Systemic Racism,” published this year.

“It really goes back to the training models, the data that was inputted and who inputs the data,” Lawrence said. “At its basis, it’s built to mimic human behavior, so you train it.”

Georgia Tech’s experiment, funded by federal grants, trained a robot that appeared to act out racist behavior, demonstrating that bias can exist in the software.

“Robots are kind of like a child when they’re first born; they’re not good or bad,” said Matthew Gombolay, a Georgia Tech assistant professor of interactive computing who assisted in the test.

In the experiment, a robotic arm was tasked with grabbing the box whose picture it judged to show a criminal. One box pictured a White male; the other pictured a Black male. The researchers built the algorithm on disproportionately negative data about Black people drawn from public sources on the internet, including social media, mugshots and divisive rhetoric.

“What’s the robot going to do?” Gombolay asked. “Its best guess is based upon what the internet tells it.”

When the robot was instructed to choose the box with the criminal’s image, it repeatedly selected the box with the Black male’s picture.

“The internet taught the machine there is a more likely relationship that a person would think that a Black person is a criminal and a white person would not be,” Gombolay said.

The study’s point: what goes in comes out. The algorithms behind systems like this one are similar to those used in FRT, suggesting that if FRT systems are fed disproportionate data, or the software hasn’t been “trained” enough on people of color, false matches will continue.
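To make that point concrete, here is a minimal, hypothetical sketch. It is not the Georgia Tech code, and the numbers are invented: a toy “model” that does nothing but memorize how often each group was labeled a criminal in its training data will reproduce whatever skew that data carries.

```python
# Hypothetical illustration only: a "model" that memorizes label frequencies.
# If the training examples over-associate one group with "criminal,"
# its best guess simply reproduces that skew.
from collections import Counter

# Invented, deliberately skewed training data standing in for
# "what the internet tells it."
training_labels = (
    [("Black male", "criminal")] * 80 + [("Black male", "not criminal")] * 20 +
    [("White male", "criminal")] * 30 + [("White male", "not criminal")] * 70
)

counts = Counter(training_labels)

def best_guess(group):
    # Pick whichever label the skewed data makes most common for this group.
    return max(("criminal", "not criminal"), key=lambda label: counts[(group, label)])

for group in ("Black male", "White male"):
    print(group, "->", best_guess(group))
# Prints "Black male -> criminal" and "White male -> not criminal":
# biased inputs, biased outputs.
```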

The researchers advocate for state and federal regulation, a sentiment shared by a growing list of wrongfully accused.

“You have to prove yourself innocent; that’s how that goes,” Alonzo Sawyer said. His wife, Carronne Sawyer, echoed him: “I had to prove my husband innocent before anything could happen.”

On March 26, 2022, Maryland Transit Administration (MTA) surveillance video showed a man boarding an MTA bus. A verbal back-and-forth broke out between the bus operator and the passenger; a police report described it as a dispute over a face mask, with the driver telling the passenger to wear one. After arguing, the passenger snatched the driver’s cell phone and fled.

The driver followed the suspect off the bus, trying to get her phone back. The male suspect hit her several times and fled again.

After getting back onto her bus, the operator drove to a nearby parking lot, where she was met by MTA police. Local law enforcement then issued an alert for the suspect that included a screenshot taken from surveillance video.

Five days later, U.S. marshals arrested Sawyer outside of Harford County District Court, half an hour away from where the crime happened. Agents found Sawyer at the courthouse since he was due in court on an unrelated traffic offense.

Law enforcement tracked down Sawyer because facial recognition identified him as the suspect, according to case files obtained by Atlanta News First Investigates.

According to an incident report, Sawyer’s former probation officer and a state trooper also identified him. But additional case records show both initially said they were “not 100% sure.” The probation officer “positively” identified Sawyer only after seeing surveillance footage.

Even the victim herself told police Sawyer was not her assailant. After looking at a photo lineup, she wrote, “Not the person who assaulted me,” under Sawyer’s image.

The 56-year-old father remained in jail without bond for nine days. “It was really traumatic just to be sitting there [and] being so frantic,” he said.

Carronne Sawyer began advocating on her husband’s behalf. She confronted his former probation officer, pointing out that the officer and Sawyer had met only a few times during the pandemic, when both were wearing face masks.

On April 5, 2022, the probation officer recanted his positive identification to law enforcement. Around the same time, detectives got a new lead from an officer who believed Deon Ballard was the actual suspect. New warrants were issued, and Ballard was arrested on April 22. Ballard’s mother positively identified him, and he later pleaded guilty in the incident.

Multiple experts called Sawyer’s case an example of confirmation bias: an officer gets a “match” after running an image through FRT and, convinced it is right, looks only for evidence that supports that theory.

The family wants an apology from Maryland law enforcement.

When asked why Sawyer was arrested despite witnesses being unsure, and about the department’s FRT policy, Baltimore County, Maryland, officials provided this statement:

“This office has received your request for comment with regard to a case involving Alonzo Sawyer. It is apparent that you possess the case file regarding the investigation. It is concerning that you may intend to report this case consistent with the way you summarized it in the email to us.

“You state among other things that Mr. Sawyer was identified by facial recognition and imply that the arrest was based on that “despite all possible witnesses being unsure”. That is not accurate reporting. You don’t mention that after the “parole officer” said he was unsure by looking at a BOLO which was emailed to him, the investigator took the CCTV footage to him and showed him the footage. At that time, the agent positively identified Mr. Sawyer from the footage. The charging document (which you also have) states that the facial recognition just provided an investigative lead and that the basis for the charge was the positive identification by the probation agent.

“You may also note that on the same day that the agent contacted the police and expressed doubt about his identification (about a week after the positive identification), this office filed a Motion to have him released from incarceration and later dismissed the charges against him.”

When later asked for information on FRT policy and if an apology was forthcoming, the agency did not respond.

“To wake up one day and just get accused … what if I would’ve been accused of murder?” Sawyer said. “If it hadn’t been for my wife doing the leg work for me, I would have just got lost in the system.”
