Racist moments do not arrive as headlines. They arrive mid-errand. Mid-meeting. Mid-sentence. Mid-ceremony. 

And now, mid-smartphone notification.

On Tuesday, Google apologized for sending what it called an “offensive notification” about the BAFTA Film Awards controversy — a push alert that included the N-word. Completely spelled out. Hard “er” and all.

Initial reports called it an AI error. However, the company has since clarified that although the slur’s appearance was the result of a technical failure, it was not AI-generated.

Google said its systems “recognised a euphemism for an offensive term on several web pages, and accidentally applied the offensive term to the notification text.” It added: “This system error did not involve AI. Our safety filters did not properly trigger, which is what caused this.”

If it wasn’t AI, then who exactly wrote it — and will they suffer any consequences? 

A System Error — Designed by People

If Google is to be believed, the push notification did not self-publish. It moved through human-designed, human-operated systems — coded, reviewed, and deployed by people.  

The notification linked to coverage of Sunday’s BAFTA Film Awards, which aired worldwide on BBC One and BBC iPlayer. During the ceremony, as actors Delroy Lindo and Michael B. Jordan presented an award, Tourette syndrome activist John Davidson — whose documentary, “I Swear,” was also up for an award — involuntarily shouted the N-word loudly enough for everyone to hear.

RELATED: Michael B. Jordan and the ‘Tell’ of Awards Season

The BAFTA Awards are pre-taped, but the broadcast didn’t edit the slur out, even though Warner Bros. — the studio behind “Sinners” — formally requested it. BBC leadership apologized and announced on Wednesday that it has launched a formal review.

But along the way, two conversations emerged on social media.

One centered on Tourette syndrome — explaining coprolalia, which caused Davidson’s outburst, urging empathy for him, and warning against “ableist” backlash. Davidson told Variety on Tuesday that his outburst “is literally the last thing in the world I believe.”

Davidson, sitting 40 rows back, said he didn’t know a mic was nearby and that he could be heard. He didn’t realize it until Jordan and Lindo “appeared to look up from their role as presenters, and soon after that I decided to leave the auditorium.”

Although Davidson insists Tourette syndrome is a condition and not a disability, some of his online defenders have accused Black people of overreacting, condemning them as unreasonable ableists for demanding an apology.

Richie Brave, host of a Black talk show on BBC radio, wrote on Threads that Davidson’s supporters are “[positioning] Black people as ignorant for having an emotional reaction to a slur that has horrific historical connotations for us.”

Activist Brittany Packnett, who is disabled, reflected on Threads about a former classmate who repeatedly threatened her; she later found out he had suffered a traumatic brain injury. But his disability, she wrote, did not make her safer — nor did it lessen the impact of his anti-Black threats of violence.  

“Stop making Blackness absorb every sin against us like we don’t bleed,” Packnett wrote.

Indeed, a December 2024 article in the Columbia Law Review Forum noted a grim statistical reality: “In the United States, 50 percent of people killed by law enforcement are disabled, and more than half of disabled African Americans are arrested by the age of 28.”  

And yet, the immediate response at the BAFTA ceremony was to explain the white man’s condition and ask for understanding.

Moments after the outburst, BAFTA host Alan Cumming apologized to the audience on Davidson’s behalf, “if you are offended.” The BBC later issued a broader apology. 

But bestselling author Fredrick Joseph believes the outpouring of sympathy for Davidson sends a clearer message. He wrote on Threads that if Davidson had offended or harmed white people, “there wouldn’t be all this conjecture.”

White people are “so comfortable with Black harm, Black fear, and Black death that [they] want the conversation to be about grace rather than accountability,” he wrote.

Meanwhile, the pain of two Black actors who had to keep their composure before a global audience in the face of a racial slur was barely acknowledged.

The Emotional Math of Being Black

In the aftermath of the incident, writer Ijeoma Oluo, author of the bestseller “So You Want to Talk About Race,” wrote about the emotional calculations Lindo and Jordan were forced into: Is it safe to respond, and if I do, what will this cost me?

“Every Black person I know is punished for whichever way they choose to react,” Oluo wrote. “If they are openly angry then they are dangerous, or militant, overreacting, or lacking empathy for the person who has just harmed them. If they react with kindness or a desire to educate then they are forced to carry the emotions of the person who has just harmed them. If they decide to ignore it then they are told by others that it’s obvious by their reaction that this harm is acceptable and it will continue with gusto.”

That math doesn’t disappear under stage lights.

Jordan and Lindo had to make those calculations in front of a global audience of millions. Lindo told Vanity Fair he and Jordan “did what we had to do” and kept presenting. 

And then Google amplified the slur.

Racial Controversies Are Not New for Google

Google’s excuse — that the N-word was “accidentally applied” — sounds technical and contained, as if the slur were corrupted data rather than a word steeped in violence and pain.

But Google is a repeat offender.

Back in May 2015, it was revealed that Google Maps directed searches pairing the N-word with “house” or “king” to the White House. Google issued a halfhearted “we’re sorry if you were offended”-style apology.

Five years later, a Google image-labeling service — used for automated object recognition — produced starkly different labels on nearly identical photos based on skin tone. An image of a dark-skinned person holding a thermometer was labeled “gun,” while the same image with lighter skin was labeled “electronic device.” Another lukewarm apology, this time with a system update.


When George Floyd was murdered a few weeks later, Google CEO Sundar Pichai encouraged employees to observe 8 minutes of silence because “Our Black community is hurting.” He followed up by pledging money to organizations addressing racial inequities and promising to increase Black representation in Google’s workforce.

By 2025, however, Google had abandoned its diversity initiatives, and Pichai was among the tech billionaires who attended President Donald Trump’s inauguration.

Representation alone does not prevent harm such as a push alert featuring the N-word. A 2024 study in Nature found that AI models make covertly racist decisions about people based on speech patterns associated with Black people — a form of bias that is less overt but no less harmful.

“Accidentally Applied” Isn’t Neutral Language

Google says only a “very small subset” of users received the N-word push notification. But are we supposed to shrug it off if only 1 million people globally received the alert instead of, say, 10 million?   

And if this was not AI, then it was human-engineered automation. And automation reflects choices. So again: If it wasn’t AI, who exactly wrote it? Is someone being fired for it?

When anti-Black racism is dismissed as a system error, it tells us something about the system. It tells us who it was built to protect, and who it wasn’t. 

And until preventing harm to Black people becomes a priority, we can expect more “sorry if you were offended” statements from corporate PR departments. Apologies, after all, are far easier than preventing the damage in the first place.